Understanding the true scale of grant monitoring reports
Everyone knows monitoring reports for funders are a hassle. They take ages to complete and are often hard to understand and learn from. Because of this, grantees don't like filling in monitoring reports and funders don't like reading them.
This is a problem: we estimate they take up to 15.8 million hours of charities' time to complete every year.
This is a hidden cost imposed on grantees by funders - money and time that could be better spent elsewhere.
Why they take so much time
To work this out, we combed through the websites and publications of the country's biggest funders to establish what a typical grant monitoring report looks like. We then asked 10 people from small- to medium-sized charities - which together report to 6 different local authorities and over 50 different funders - to estimate how long it would take to fill out this "average report".
This gave us a total of about 40 hours for each grant - broken down as follows:
What surprised us when we first started talking to small charities about reporting was the amount of time it takes to know who their beneficiaries are. We thought that would be easy.
But then it was explained to us. Imagine a funder asks you for the total number of unique people who have used your service. Even if you take a register every day, you can't just add up the number of people on each register, because they probably aren't all unique. So you have to build a master register from those registers, going through each one individually and adding a name every time a new person appears.
If you then try and break that down by demographic data (information that people often don't want to give you) and where they've come from - the time really starts to add up.
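The deduplication problem described above is easy to sketch in code. Here is a minimal illustration (the names and registers are made up), assuming each day's register is simply a list of attendee names:

```python
# Hypothetical daily registers: one list of attendee names per session.
registers = [
    ["Amina", "Ben", "Carol"],
    ["Ben", "Carol", "Dev"],
    ["Amina", "Dev", "Ed"],
]

# Total attendances is the easy number: just add up the register lengths.
total_attendances = sum(len(day) for day in registers)

# Unique beneficiaries needs the "register of registers" step: go through
# every register and only count a name the first time it appears.
unique_people = set()
for day in registers:
    for name in day:
        unique_people.add(name)

print(total_attendances)   # 9 attendances in total...
print(len(unique_people))  # ...but only 5 unique people
```

On paper, or in the basic spreadsheets most small charities use, that same deduplication has to be done by hand, name by name - which is exactly where the hours go.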
Frustratingly, it isn't just us who are surprised to hear this. When we speak to funders, they are also surprised. This might be because grantees are often uncomfortable admitting to funders how long this all takes, because they think it's their fault. It isn't - this is genuinely hard, and the tools that have been available to help with it are expensive.
Most funders want the same thing
Luckily, attendance information is one of the most commonly requested items on monitoring reports. Unsurprisingly, many of the other requirements are just as widely shared between funders:
In theory this is good news. If all funders want roughly the same thing, it must be easy for grantees to do their reporting - they only need to copy and paste information between forms.
In practice, it isn't that simple. Funders all want roughly the same information, but they want it presented differently and at different times of the year. If one grant started in March, reporting for that grant will be based on a March year end, even if all of an organisation's other grants have a December year end.
Some grants require answers to very specific questions related to them - such as a breakdown of beneficiaries by ward. This is great for making grants very locally focussed, but very hard to monitor!
To add to this, charities often have a lot more funders than you might think - the number increases pretty much linearly with income.
In fact, if a charity has enough income for at least 1 FTE member of staff, it will probably have between 5 and 10 different funders. That means between 5 and 10 different reports. Even if two funders had exactly the same reporting requirements, it would still take a couple of hours to copy and paste between reports.
Even if two grants do require the same information, they will often ask for it in different ways. Grant report question writers love synonyms.
Some will ask for "differences", others for "progress towards outcomes". Some define outcomes as targets, others as changes noticed in a beneficiary.
This just adds an extra, unnecessary layer of confusion to something that is already confusing enough!
Different types of funders mean different requirements
Luckily, you can usually guess from the type of funder what sort of information they are going to be interested in.
Local authority grant monitoring reports often ask for a very detailed breakdown of who your beneficiaries are, no matter what the size of the grant - Hackney Council and Camden Council are examples. This is likely because local authorities are under much greater pressure than other funders to show the value of their grants, particularly in a climate of austerity. Having said that, other councils, e.g. Croydon, ask only for pictures, expenditure and some outcomes for larger grants. Common across almost all local authority reporting requirements is the need to use a form specific to that grant programme.
Charitable Foundations tend to be more varied in what they want. They are still interested in a breakdown of beneficiaries, but tend to be interested in outcomes and what organisations have learnt from their grants.
The format of the monitoring reports themselves varies massively between foundations. Some, such as BBC Children in Need, use an online form. Others, such as John Lyon's Charity, are happy to receive their monitoring information in any format, so long as it contains the key criteria.
Foundations are also more likely to ask very specific questions related to that grant. Comic Relief is a good example of this. They ask for "what our money can buy" as a requirement for every grant, which is a question we have not seen elsewhere.
What's worse - some funders don't even share their monitoring requirements.
But in many ways, the funders above are the good guys - at least they publish their monitoring requirements. This lets potential grantees assess how much time monitoring and reporting will take for a particular grant.
This is important because monitoring and reporting is a significant enough part of delivering a grant that it should be taken into account when preparing the budget. If you don't know what the requirements will be, you can't properly plan when bidding for funding - and that is a real factor in choosing whether to apply for grants:
"We've stopped applying for funding from [big national funder] because their reporting requirements are just way too onerous."
By not publishing monitoring requirements, funders are not giving potential grantees the whole story.
Adding it all up
So monitoring reports clearly take a lot of time. To quantify quite how long, we estimated their effect on the sector as a whole. To do this, we needed the number of grants handed out every year, which is where the NCVO Almanac 2019 came in handy. First, we worked out total income from grants by adding up income from government and foundations. We then removed grants for research and to schools, as they have different reporting requirements. This gave an income of ~£18bn a year from grants and contracts. Dividing this total by the average grant size (~£50,000) left us with approximately 391,280 grants per year. (If you want to see our working for this, it can be found here.)
Combining this with the average grant time of 40 hours, we find that charities spend 15.8m hours every single year just filling out reports for funders.
That's ~£204m of staff time. Or approximately the same amount as the lottery gives to organisations that work in Health, Social Services, the Environment, Village Halls, Housing, Employment and Training, Umbrella Bodies, and Playgroups and Nurseries.
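The back-of-the-envelope arithmetic above can be written out as a short sketch. The inputs are the approximate figures quoted in the text, so the total lands slightly under the rounded 15.8m headline; the implied hourly staff cost is our own back-calculation from the ~£204m figure, not a number from the original working:

```python
# Headline figures from the estimate above (both approximate).
grants_per_year = 391_280   # NCVO-based estimate of grants handed out per year
hours_per_report = 40       # average reporting time per grant

total_hours = grants_per_year * hours_per_report
print(f"{total_hours / 1e6:.1f}m hours a year")  # ~15.7m (rounded to 15.8m above)

# The ~£204m staff-time figure implies an hourly cost of roughly:
implied_rate = 204e6 / total_hours
print(f"£{implied_rate:.2f} per hour")
```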
Clearly, this is just an estimate. In particular, reporting requirements for local authority grants (the way many organisations are funded) are hard to find, and we have not varied the reporting requirements by size of grant. We have also assumed that organisations are using the kind of improvised paper-registers-plus-spreadsheet combination that is very common across smaller charities.
As monitoring reports take so long to complete, they get in the way of learning
Monitoring has the possibility to make a real difference. If done properly, both funders and grantees can learn what is and isn't working, which can then be used to improve future work.
In practice, though, the variety of questions asked and the time reports take get in the way of that learning:
- Because each funder asks for different things, it is impossible for grantees to compare between projects funded by different funders.
- Because of the time they take, grantees seem to see monitoring reports as a tick box exercise for the funder and not an opportunity to review what they have achieved with their grant.
But this isn't news
As highlighted in the recent IVAR/Comic Relief report, a lack of learning is a problem in the sector.
One way to cut down the reporting time, and therefore encourage learning, has been a move to allow grantees to send in their reports from other funders.
This is great - it saves grantees loads of time. The problem is that funders then receive monitoring reports in thousands of different formats. That makes it impossible for funders to learn from the reports they receive, which means they aren't learning from their grants.
Funders not learning from grants is a real issue. They have the most influence over the future of the sector and the sharing of best practice. After all, they spend the most time thinking about and discussing how to make the most of grants.
How could grant reports be done better?
We think that there should be an easy-to-use, free way for small charities to record the things that grant reports most need - attendance and outcomes.
We think that a charity's own data should be easy for them to analyse and learn from.
We think charities should be able to easily share their monitoring information so they can share their impact.
We think funders should be able to easily build aggregated reports from all of their charities so they can inform best practice and better understand the grants they give.
But rather than just talking about it, we have built something that does just that. If you would like to know more about it email us on firstname.lastname@example.org. We'd love to chat.
Contains data from GrantNav, a 360Giving application, released under the terms of the Creative Commons Attribution-ShareAlike licence (CC BY-SA).
Contains data from Charity Commission data files shared under an Open Government Licence.