Spotlight on AI: common concerns examined

Melanie May | 19 March 2024 | News

A robotic hand reaches up to touch a network of lines. By Tara Winstead on Pexels

AI offers powerful opportunities for charities and their fundraisers, from the automation of time-consuming admin tasks to the creation of draft copy that can then be further refined, and valuable insights on audiences that enable more relevant and engaging communications.

While some charities are already purposefully trialling or adopting it, it’s still early days for the sector, and as with any new technology there are naturally a few concerns.

We get the experts’ view on some of the biggest.

I don’t know where to begin

Everyone’s talking about AI and its pros and cons, and we’re constantly hearing about new and different applications, which can make it difficult to know where to start. In fact, it doesn’t have to be complicated, or expensive. A good place to begin is to spend some time experimenting with the free and low-cost content creation tools available, like OpenAI’s ChatGPT, Microsoft Copilot, and Google’s Gemini (previously Bard), to get an idea of what they can do and how they might help your organisation. There are also lots of other free AI tools that can help with a multitude of other tasks – check out this site for ideas.

Finding out how other people are using it is also useful.

Zoe Amar, Founder and Director, Zoe Amar Digital, says:

“With more AI tools coming to market, and AI capabilities being added into existing products, the amount of choice can be overwhelming. The answer is, as ever, communication and collaboration.


“Talk to other fundraisers, whether within your own charity or in other organisations, to understand how they are using AI. Maybe it is to draft campaign content, ideate a new fundraising product, or look for patterns in donor data so you can better target future campaigns. There is a lot of learning happening across the sector but it’s taking place in silos, so find out what others are doing.


“Once you have ideas about use cases, you could then test these out in your team and, based on the results, prioritise which AI activities meet your success criteria, e.g. improving productivity, generating innovative ideas, or growing income through better targeting.”

AI is going to take people’s jobs

Certainly it’s going to change jobs, but AI is a tool designed to help people, so it should never be left to complete whole tasks from start to finish on its own. Retaining that human oversight and insight will be essential for using it responsibly and for getting the best from it.

Paul Hayward, Director of Business Development, Engaging Networks, says:

“We should look at AI as a colleague rather than a competitor. In the short-term, it’ll help reduce the amount of administrative, data-focused and technical work fundraisers do, freeing up their time to focus on their fundraising programme and empowering them to be more responsive to the news agenda and political climate.


“In the long-term, the role of a fundraiser will change as people will be more likely to edit and amend AI-generated content rather than writing it themselves. This will allow us to focus on strategy, think about how we respond to insight, and further develop our programmes to generate more money. As AI becomes more of a part of our everyday workflow, fundraisers will need to practise and become better at writing prompts, interpreting data and interrogating information to get the best out of AI, but in doing so they will be training themselves up for the future job market.”

For those still unsure of how it can add value for their organisation, Phil Dearson, Director at Chameleon by WPNC, suggests thinking of AI in terms of automation and augmentation.

He explains:

“If there’s a set of repetitive tasks being done time and again every week, every month, then those things are ripe for being automated by generative AI. This is the automation of tasks, not roles, which is good for people because in many cases it’s a waste of time for human beings to be doing admin.


“You can also use generative AI tools to make you better at what you already do. For example, when you’re thinking how to approach a particular problem and have some ideas, they can act as a foil. You could ask one to argue against your ideas, or to consider your ideas or approach from a different point of view. Or you could upload anonymised information about the people you’re trying to engage with, and ask the AI to argue from their perspective about what the impact of your strategy might be on them, given what it knows – that can be very, very useful.”

We’ll lose the human touch

As mentioned above, humans will remain essential for ensuring that what AI creates is accurate, ethical and relevant. The pandemic reiterated the importance of human connection, with charities picking up the phone to keep in touch with supporters, and the keen post-pandemic return to in-person events underlined it again.

This will be just as true with AI – it can (and already does) help charities support simple queries and requests through chatbots, and can quickly write a letter (for example), but people will always be needed to oversee and guide it, and of course, you can’t beat the human touch for more complicated interactions with supporters and beneficiaries.

Jane Trenaman, Managing Director, The HX Consultancy, says:

“Whether you’re writing a strategy or a donor thank you letter, taking the first draft that ChatGPT spits out would be a misguided approach.


“In December, OpenAI released its guide to writing prompts, which includes six strategies for getting better results: write clear instructions; provide reference text; split complex tasks into simpler subtasks; give the model time to ‘think’; use external tools; and test changes systematically. These are all things that require human intervention to make the most of the tool. And human judgement can draw on other sources and recognise when human interaction or content generation is the most appropriate avenue.”
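To make those strategies concrete, here is a minimal sketch of how three of them – clear instructions, reference text, and splitting a complex task into subtasks – might shape a single prompt. The charity task, donor details and helper function are all hypothetical, purely for illustration; the resulting text could be pasted into any generative AI tool.

```python
# Hypothetical illustration of three prompt-writing strategies:
# clear instructions, delimited reference text, and simpler subtasks.

def build_thank_you_prompt(donor_name, donation_summary, reference_letter):
    """Assemble a structured prompt for drafting a donor thank-you letter."""
    return "\n".join([
        # Strategy 1: clear instructions - state role, audience and tone up front.
        "You are drafting a thank-you letter for a charity. "
        "Write warmly and concretely; do not invent facts.",
        # Strategy 2: reference text - delimit it so the model knows what to draw on.
        "Reference letter (match its tone):",
        '"""' + reference_letter + '"""',
        # Strategy 3: split the complex task into simpler subtasks.
        "Step 1: List the key facts below that must appear in the letter.",
        "Step 2: Draft the letter using only those facts.",
        "Step 3: Check the draft against the facts and correct any errors.",
        f"Donor: {donor_name}",
        f"Donation details: {donation_summary}",
    ])

prompt = build_thank_you_prompt(
    "A. Supporter",
    "monthly gift since January",
    "Dear friend, your generosity keeps our helpline open...",
)
print(prompt)
```

Whatever the draft, the human steps remain: reviewing the output, checking it against the facts, and deciding whether a personally written letter is more appropriate.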

Is it ethical?

Using AI to make decisions that involve real people, or to create copy or images, raises ethical concerns too. Recent research on this topic by Rogare concluded that AI does not currently have access to sufficiently sophisticated knowledge of the ethics of fundraising to be able to make ethical decisions, and that because of this, human oversight is needed to ensure any use of it in fundraising practice is done ethically and in accordance with best practice and regulatory codes. This, the report found, will require some upskilling of fundraisers.

Speaking on the findings at the launch, fundraising consultant Cherian Koshy, who led the team behind the report, said:

“As AI enters and becomes widespread in fundraising practice, we must upskill the human overseers with this knowledge and these competencies. Skilled and knowledgeable human oversight of AI in fundraising is absolutely essential.”

At the very least, fundraisers should have policies in place if using it to create imagery and copy to ensure people are always represented with dignity, and that data is used within the bounds of best practice and regulation.

Dearson says:

“It’s how you use this technology that determines whether you’re using it ethically or not. It’s not the technology itself. So if you’re fabricating versions of reality, by generating stories that aren’t true or generating visual representations of things which are not really accurate, then that’s an unethical use of AI. It’s not that AI is unethical, it’s you as a charity, so you need to set some simple guidelines for its use – people need to know what they should and shouldn’t use AI tools for.”

Amar also adds this:

“The number one piece of advice I’d give any fundraiser is to adopt these tools in line with your values as a charity. There’s a lot of pressure in AI to keep chasing after the latest shiny thing, but taking a step back and asking if the ways you are using it, and the tools you deploy, sit right with your ethics is a vital part of the adoption process.”

It’s risky from a security perspective

Another common concern centres on how to keep sensitive data safe. The key here is not to put personal data into any generative AI tool, and to ensure that everyone using AI tools is sticking to best practice and regulations, just as you would with anything else. Establishing guidelines for their use so people are clear on what they should and shouldn’t do with them – as well as which tools they can use – is essential.

Amar says:

“You need to think about the parameters for these tools. What should other fundraisers on your team know about what data they should enter into AI tools? If you have doubts about the security of sensitive data then don’t do it.”
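One simple safeguard some teams build into those parameters is an automated redaction pass before any text goes near an external tool. The sketch below is a deliberately simple illustration, not a complete solution: the patterns and placeholders are assumptions, real personal data takes many more forms (names, addresses, IDs), and as Amar notes, when in doubt the safest option is not to enter the data at all.

```python
import re

# Illustrative (deliberately simple) redaction pass: strip obvious
# personal identifiers before text is pasted into a generative AI tool.
# A real policy would cover far more than emails and phone numbers.

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\b\+?\d[\d\s-]{8,}\d\b")

def redact(text: str) -> str:
    """Replace email addresses and phone-like numbers with placeholders."""
    text = EMAIL.sub("[email removed]", text)
    text = PHONE.sub("[phone removed]", text)
    return text

note = "Donor jane@example.org called from 020 7946 0991 about her gift."
print(redact(note))
# -> Donor [email removed] called from [phone removed] about her gift.
```

Even with tooling like this in place, the guideline itself does the heavy lifting: people need to know which data should never leave the organisation in the first place.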

Moving forward

AI undoubtedly holds a lot of promise for charities and their fundraising teams, but as with anything it also comes with some potential risks and challenges. Learning to navigate these will be integral to using it successfully.

To find out more about how AI is helping fundraisers, as well as how to get the most from it, check out our feature here.
