Data might be the fundraiser’s best friend, but fundraisers can also run up against numerous issues – from data quality and consistency problems, to different teams using databases in different ways, to difficulty getting the analysis needed against KPIs.
Here are some commonly experienced data issues, along with ideas and advice for how to solve or approach them.
Issue: Duplicates and conflicting information in our database are causing multiple problems, from higher costs to unhappy supporters
Inconsistent data, with duplicates and conflicting information, can’t be used easily or efficiently. An individual may be listed across several records because they have been set up several times by different departments in slightly differing formats – a title may have been input as Mrs in one record and as Ms in another, or an initial in one vs a first name in another. It gets more complicated still when some fields are completed and others aren’t, such as a Gift Aid consent captured in one record but not in its duplicate.
Because they’re effectively ‘different’ records they’re not easy to catch in deduplication and data cleansing, and the risks can be hugely detrimental. The individual ends up being sent several mail packs (an immediate turn off!), your organisation pays the extra cost of both production and possibly donor attrition, plus data analytics can be skewed.
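One common way to catch these ‘effectively different’ records is to match on a normalised key rather than on the raw fields. The sketch below is a minimal illustration, not a production deduplication routine – the field names (first_name, surname, postcode) are assumptions, and the key deliberately ignores the title, since Mrs vs Ms is exactly the kind of variation that hides duplicates:

```python
import re

def match_key(record: dict) -> tuple:
    """Build a fuzzy match key: normalised surname + first initial + postcode.

    Title is deliberately excluded (Mrs vs Ms varies between duplicates),
    and only the first initial is used so 'J.' and 'Jane' still match.
    """
    surname = re.sub(r"[^a-z]", "", record.get("surname", "").lower())
    initial = record.get("first_name", "").strip().lower()[:1]
    postcode = record.get("postcode", "").replace(" ", "").lower()
    return (surname, initial, postcode)

def find_duplicates(records):
    """Group records sharing the same match key; return the groups of size > 1."""
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return [group for group in groups.values() if len(group) > 1]
```

In practice you would tune the key to your own data (and review candidate groups manually before merging), but even a simple normalised key like this surfaces pairs that exact-match deduplication misses.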
The trick is to get it right from the word go. When you first set up the system, involve all users in the creation of the protocols for data collection and input – the staff actually inputting the data, as well as those who need to use it, will know what information needs to be collected and saved on record.
Then document it in a clear and concise data capture protocol that is shared with staff at every collection point, preferably with training to emphasise the critical importance of accuracy, with clear dos and don’ts. Refer back to that protocol documentation on a regular basis for refreshers, especially when bringing in new staff – add it to any onboarding process. Ideally, you would also allocate a key stakeholder to be in charge of data quality and the protocols around it – somebody with a vested interest in ensuring data quality – and make sure they are part of the whole process.
Suzanne Lewis, Managing Director, Arc Data
Issue: Our organisation has a lot of siloed data, making it difficult to engage meaningfully with people
Data captured and held in various departments such as comms, volunteering, fundraising – each potentially holding records on the same people – is still common practice. Supporters may come into the charity by any combination of those routes, so it’s important that they feel the organisation knows who they are and how they are connected with the cause, across all of their touchpoints.
For this kind of insight you need a centralised view of your data – a CRM. It doesn’t need to be all-singing, all-dancing, but it does need to collect data and serve all the data users effectively.
At this point, it’s important to appreciate that each of the teams holding the data will feel they own it. They will want to use it when it suits them, for example when they need to send out a message – but this means the supporter could end up with five different messages landing in the same week or month.
Shifting a data system, and the user culture around it, from silos to a centralised model means you need the technology, as well as training, to help your staff understand the importance of a 360-degree view and the need to collaborate on data capture and usage to make it possible. You also need to control data centrally, with solid data protocols and ideally a data quality manager. The upside, of course, is the ability to combine knowledge of the supporter and their behaviour into valuable insight, enabling optimum use of the data to contact the individuals concerned with a single, unified approach.
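To make the idea of a centralised view concrete, here is a minimal sketch of merging per-team exports into one record per supporter. It is purely illustrative – the choice of email as the shared identifier, and the field names, are assumptions, and a real CRM migration would involve proper identity resolution rather than a simple key match:

```python
def unify(*silos):
    """Merge per-team record lists into one view per supporter.

    Each silo is a (team_name, records) pair; records are dicts with an
    "email" field used here as the shared identifier (an assumption).
    """
    view = {}
    for team, records in silos:
        for rec in records:
            key = rec["email"].strip().lower()
            entry = view.setdefault(key, {"email": key, "touchpoints": []})
            entry["touchpoints"].append(team)  # record which teams know them
            for field, value in rec.items():
                entry.setdefault(field, value)  # first value seen wins
    return view
```

The `touchpoints` list is the point of the exercise: once comms, volunteering and fundraising records sit on one view, you can see every way a supporter is connected with the cause before deciding who contacts them, and when.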
Suzanne Lewis, Managing Director, Arc Data
Issue: No matter how hard we try, our analysis models are never 100%
Fundraising teams can put an enormous amount of time and effort into trying to develop the perfect data analysis or statistical models, but this quest for perfection can be so time-consuming that it blocks rather than aids efforts to move the charity forward.
There is no such thing as a perfect analysis model, and there’s a trade-off between striving for the perfect data analysis or statistical model, which takes a long time to fully develop, and having a data solution that is “good enough” – an 80/20 solution, or even a 95/5 solution if you like. To explain: sometimes too much time is spent scrutinising and critiquing a model that is essentially 95% accurate and perfectly fine to use for fundraising and marketing targeting. I’ve observed far too many discussions, barriers and lengthy processes put in the way of implementing relatively straightforward models that could move organisations forward and deliver insight. Data analysis is about supporting decision-making, and on occasion a quick and useful 80/20 approach can deliver impact sooner – complexity and striving for 100% perfection is not always best.
Issue: We need more supporters but we’re not sure whether an acquisition or a reactivation campaign makes more sense.
The obvious route to building supporter numbers might be an acquisition campaign but these can be costly, so the question is – especially with budgets under pressure – what are the alternatives? Is it worth, for example, looking at how to make the CRM database work harder to reactivate supporters rather than necessarily considering a more expensive supporter recruitment programme?
Both are important, but winning back lapsed supporters through initiatives such as reactivation campaigns or repermissioning contacts is the cheaper option. Acquiring a new supporter can cost 5 to 7 times more than retaining an existing one, and yet I wonder how many charities get the balance right in their annual budgets when planning the mix of retention and acquisition. Further, studies show that existing contacts are 50% more likely to try new products and spend 30% more than new contacts. Again, it’s important to balance spend carefully across both acquisition and retention, but ultimately success is measured by supporter lifetime value. This will vary by charity, but are charities focused enough on retaining and reactivating lapsed supporters?
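A back-of-envelope calculation makes the comparison tangible. The figures below are purely illustrative assumptions (not benchmarks from the text – the source states only the 5–7x ratio): a cold-audience pack typically costs more to produce and land, and responds at a lower rate, than a message to lapsed supporters you already know:

```python
def cost_per_supporter(cost_per_contact: float, response_rate: float) -> float:
    """Back-of-envelope cost of gaining one supporter from a campaign,
    assuming each responding contact becomes (or returns as) a supporter."""
    return cost_per_contact / response_rate

# Illustrative figures only – pack costs and response rates are assumptions:
acquisition = cost_per_supporter(cost_per_contact=2.00, response_rate=0.01)   # cold audience
reactivation = cost_per_supporter(cost_per_contact=0.50, response_rate=0.04)  # lapsed supporters
```

With these assumed numbers, acquisition works out at £200 per new supporter against £12.50 per reactivated one – your own figures will differ, but running this calculation with real campaign data is a quick way to sanity-check the retention/acquisition split in an annual budget.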
Issue: Data content issues are making it difficult for us to take advantage of what automation & other tools have to offer
Data quality and content is a key area where most organisations have some kind of issue. The extent and depth of the issues can be enough to prevent efficient use of data assets, and it only takes one issue of concern to undermine trust in the wider data. This mistrust means that staff become overly cautious in its use, introducing potentially multiple manual checkpoints across processes and case-by-case fixes for any issues identified. In such a situation, an organisation will never be able to benefit from the efficiencies gained through automating tasks.
Organisations will struggle to make changes for the better unless data quality or collection issues are addressed at source or point of capture. Where teams are looking more and more to seamless data integrations, hands-off automation, and cloud platforms, which may not allow direct back-end access to the data, trust in the data is imperative. Without it, there will always be a barrier to progress. Components to consider could include:
- Validation on online data collection forms and/or at data capture that acts as a gatekeeper, spanning all personal information collected.
- Integrated data cleaning against core databases and covering elements such as ambiguous values, suspect data, expletive checks and so on. Where an integrated service cannot be employed, ensuring regular batch data processing is in place (which may be via a third-party organisation or through internal usage of validation software).
- Employment of business rules for the correction of data within existing repositories, developed from thorough data discovery and deployed as regularly scheduled tasks or on data load to correct or flag data for attention. The key here is to ensure that subsequent usage of data downstream does not need to repeat such corrections.
- Addressing historical data as well as ongoing inbound data, coupled with effective use of data retention policies to remove old data and reduce the overhead of data management – especially on aged data, which is often neglected as lapsed but could be reactivated at any time.
- Accept that data quality will never be absolutely 100% – there will always be something that isn’t dealt with, hasn’t been thought of, or appears as a new challenge. What counts is how these issues are spotted and how effectively and efficiently the outcomes are managed.
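The first component above – a gatekeeper at the point of capture – can be as simple as a rule set that every inbound record must pass before it reaches the database. This sketch is an illustrative assumption about what such rules might look like (the field names, email pattern and suspect-value list are all placeholders, and a real deployment would cover far more cases, such as the expletive checks mentioned above):

```python
import re

# Illustrative rule set – patterns and field names are assumptions.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
SUSPECT_VALUES = {"test", "asdf", "n/a", "unknown"}  # ambiguous/suspect entries

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes the gate."""
    problems = []
    if not EMAIL_RE.match(record.get("email", "")):
        problems.append("invalid email")
    for field in ("first_name", "surname"):
        value = record.get(field, "").strip().lower()
        if not value:
            problems.append(f"missing {field}")
        elif value in SUSPECT_VALUES:
            problems.append(f"suspect value in {field}")
    return problems
```

The design point is that the same rules run at every capture route – web form, import, or integration – so downstream processes can trust the data without repeating the checks manually.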
Matt Tamea, Solutions Consultant, Sagacity
Issue: We’ve got multiple users of our database, and it’s affecting data quality
Inconsistency of database use and data quality often exists across the Fundraising department with different teams using the database in different ways. It is then down to a handful of long-standing staff members who have shown a keen interest in this area to ensure their team has quality data on their supporters.
Shared accountability for quality data management is key to its success, and the role of leadership in modelling this culture and approach is critical. If teams don’t see their leader engaging with the database and data best practice, it will slip down the priority list for everyone.
Determining how you want to make the most of your database, and establishing key parameters and guidelines is a great first step. Ideally you’d make this a collaborative process, to make sure that every team is invested in how the database should be maintained and updated, as well as making sure it’s fit for purpose for each team’s needs. Once that’s agreed, implementing a mandatory internal training process for all new starters – as well as refreshers for existing staff – will maintain consistency across teams in how the database is used. It will also help to avoid knowledge being lost with staff churn. You could also consider introducing mandatory objectives for all line reports within your performance management or appraisal process around the database – this will help to ensure everyone is accountable for keeping the data high quality and consistent, and that it stays on the agenda as a key priority for effective fundraising.
Hannah Hyde, Consultant, Think
Issue: We’re struggling to draw the insight we need from our data to do any proper analysis
The data doesn’t provide the analysis needed against KPIs, and this can lead individual fundraisers to think it’s easier to use their own methods – often spreadsheets – to keep track of performance.
Data alone isn’t much use – drawing insight from that data is what’s key. These issues likely originate from the way the data is structured, or more commonly the way reporting has been set up to come off the database. Take the time to write a clear brief on what you are trying to evidence and the data you believe is needed for your analysis. Then talk with your data experts about how a report can be created that draws directly from the database. If an automated report can’t be created, then the data structure may need to be looked at – if it’s a critical KPI, your data collection and storage should ideally be able to support it.
Michelle Chambers, Managing Director, Think
Issue: Every team has its own set of data definitions, making it difficult to gain any real insight from our database
One problem we consistently see for organisations trying to drive value and insight from their data is inconsistent data definitions – what one team understands as a lapsed supporter can differ from another team’s definition. Difficulties then arise when requesting data selections or reporting dashboards: if the definitions are not consistent across the board, the metrics being reported will not align.
Putting together a team made up of representatives from multiple areas to agree and define the criteria being used can create a common language that will prevent future discrepancies and confusion. Building these definitions into a simple analytical framework will lead to faster and more robust insight, improve understanding of audiences and help optimise engagement programmes.
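Once agreed, those definitions work best when they live in one shared place that every selection and dashboard uses, rather than in each team’s head. A minimal sketch of what that might look like in code – the 24-month lapse threshold here is an illustrative assumption, not a recommendation:

```python
from datetime import date, timedelta

# One shared, documented definition that every team's selections and
# dashboards reference. The 24-month threshold is an illustrative assumption.
LAPSED_AFTER = timedelta(days=730)  # ~24 months since last gift

def supporter_status(last_gift, today: date) -> str:
    """Classify a supporter consistently across teams.

    last_gift is the date of the most recent gift, or None if they
    have never given.
    """
    if last_gift is None:
        return "prospect"
    if today - last_gift > LAPSED_AFTER:
        return "lapsed"
    return "active"
```

Whether the shared definition ends up as a documented rule, a database view or a function like this, the point is the same: one criterion, applied everywhere, so every team’s “lapsed” count is the same number.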
Laura Leach, Head of Sales and Marketing, Sequoia