In an era when Canadian nonprofits face unprecedented challenges, with close to half (46.1%) reporting increased demand for services while resources dwindle (Statistics Canada), the sector stands at a critical crossroads. Understanding the scale of the challenge is crucial, but can AI and other technology really advance our missions while protecting those we serve?
The EU AI Act defines an AI system as “a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.” This definition highlights both the power and the responsibility inherent in AI adoption—particularly for organizations dedicated to social justice.
The social justice imperative
The nonprofit sector’s hesitation toward AI stems from deep-rooted concerns about perpetuating systemic inequities. As Sarah Saska, Ph.D., Co-Founder & CEO of Feminuity, explains, “The nonprofit sector’s resistance to AI is valid and deeply rooted in historical and systemic biases that these tools often amplify. However, this resistance does not mean AI should be abandoned—it highlights the need for a more intentional and responsible approach to AI integration.” Nonprofits, with missions grounded in equity and justice, are uniquely positioned to lead the charge in building ethical AI systems that align with our values.
Data as power for change
Patricia Gestoso, Ph.D., Founder of Gestoso Consulting, understands how AI was constructed and how it is currently affecting us: “Research and product development in AI currently benefits a few powerful actors, reinforces systems of power, and exacerbates and widens inequities. Furthermore, existing governance structures and regulatory processes tend to overlook the concerns and fundamental rights of those who are likely to be disproportionately negatively impacted by ubiquitous integration of AI.”
People need agency in the use of AI. Nonprofits must be part of the conversation to provide feedback on what is needed according to the service requirements of those we serve.
Charities can build their own AI, but if we don’t, existing data and systems will be used against us and against the needs of the people we support. We already see this in algorithmic bias that harms specific populations: the very people who rely on us and our missions to protect and advocate for them.
“AI development occurred without our participation, so now we must teach ourselves the skills to look under the hood,” says Aja Mason, Founder of Boreal Logic. Low-quality data leads to bad policy decisions.
Nonprofits can now participate in AI development, an opportunity that did not exist when corporations were building AI for mass consumption. As Aja Mason notes, “Now that smaller LMs (language models) can be used, we don’t have to rely on the corporate giants with their commercially driven focus.” This democratization of AI development offers the nonprofit sector a crucial opportunity to shift power dynamics and ensure that data serves community interests rather than corporate ones.
A quote by Qianqian Ye (a Chinese artist, creative technologist, and educator), found while researching the Future of Memory project, stands out: “We must creatively fight this system that expects us to be passively commodified into data.” This resistance isn’t about rejecting technology, but about actively shaping it to serve our communities’ needs. The nonprofit sector’s approach to AI must be both critical and constructive, ensuring that our engagement with these tools advances, rather than undermines, our mission for social justice.
Transforming service through innovation
Leaders like Liban Abokor, CEO of Reimagine Labs, are sensitive to the fact that this work affects people’s lives, and the nonprofit sector wants and needs to get it right. Abokor is committed to leveraging data for evidence-based policy and service solutions that address complex social challenges. Through its sector-specific AI software, Navigator, Reimagine Labs is revolutionizing how programs are developed and implemented.
“Reimagine Labs wants to create better programs and services for the sector,” Abokor explains. “What we’ve built is essentially a shortcut to success. Navigator transforms what could be a six-month research and planning process into a matter of hours. It’s like having instant access to the collective wisdom of thousands of programs and research studies.”

This innovation emerged from a common sector experience: the disappointment of well-constructed programs missing their mark. Reimagine Labs ensures that organizations can access crucial information during program development, and even draft the program itself, so that better outcomes and higher impact can be achieved for communities in need.
Democratizing access to resources
Deepa Chaudhary’s work with The Grant Orb, an AI grant-writing tool, exemplifies how AI can democratize access to funding, a critical aspect of social justice in the nonprofit sector. “AI chatbots like ChatGPT, Claude, and specialized tools like Midjourney, and Grant Orb, are helping nonprofits alleviate burnout and enhance impact in an increasingly demanding environment. These tools don’t just save time – they fundamentally transform how nonprofits operate, enabling them to pursue more opportunities, serve more people, and amplify their mission without straining their limited resources.”
Recognizing the challenges in AI adoption, Chaudhary notes, “Prompting is frustrating. Chatbots are good for information but not tasks, so AI needs to be simplified to improve the uptake by users.” Her solution? A specialized AI tool, the Grant Orb, that reduces grant writing from days or weeks to minutes, freeing staff to focus on delivering services rather than spending days writing about them.
For the sector’s adoption of AI, we still need ‘humans at the wheel’. The human-service sector cannot expand by leaving technology to work alone. Given our limited resources, we need to learn how to use AI to fulfill our missions. This is the moment when individual organizations can reclaim their time and flourish by using tools made specifically for them.
Building equity through technology and human connection
The sector’s struggles with data quality and reporting have often reflected broader systemic inequities. As Wilfreda Edward, Executive Director of the Canadian Centre for Nonprofit Digital Resilience (CCNDR), suggests, by “working with AI, we can create a model where no organization is left behind.” This vision requires embracing a sociotechnical approach to AI: understanding that technology doesn’t exist in isolation, but operates as part of a complex system involving people, processes, and social contexts. A sociotechnical approach recognizes that AI’s effectiveness depends not just on its technical capabilities, but on how it interacts with human workers, organizational cultures, and social structures.
Can using and developing AI answer many of the fundamental questions facing our sector? What can we do with our limited resources and our current capacity? Can we really do more by working less? The answer lies not just in adopting AI tools, but in developing them within a framework that considers:
- How these tools affect the workplace and staff wellbeing
- Their impact on service delivery and clients
- Improving existing organizational processes
- Their role in either challenging or reinforcing existing power structures
- The broader social and ethical implications of their use
The path forward
Nonprofits need to participate in AI development, not just to survive, but to ensure technology serves the cause of social justice. As augmented reality artist Nancy Baker Cahill asks, “No tool is neutral, so how can they be deployed or used subversively to be less extractive and more empowering?”
The future of social justice work depends on the sector’s ability to harness AI while staying true to its values. “Nonprofits need to ‘get in the game’ so that no one is left behind,” states Katie Gibson, Consultant and Senior Fellow for Responsible Digital Innovation at The Dais, Toronto Metropolitan University. By actively participating in AI development and adoption, nonprofits can ensure these tools serve the greater good while addressing systemic inequities. The time for action is now: to shape AI development in ways that advance social justice and empower the communities we serve.
Researched and written by Tina Crouse with quotes by interview subjects and editorial assistance by Claude AI.
Tina Crouse is a Grants Specialist and Nonprofit Management Consultant and the creator of the ‘Grant Gauge’, unbiased software for grant making by government and foundations. She is a tech4good innovator and has participated in several webinars and panels regarding AI for Nonprofits. She published the ebook “Conscious Coding for Equitable Funding in the Nonprofit Sector” in 2023. Tina also coaches in social enterprise and social finance and is the creator of a number of firsts in Canada.