Validating 360DG Submissions - Have you ever tried?

Not sure if the forum is still live. Anyway: have any of you ever tried to validate the data in 360DG by comparing it to annual accounts?

I have tried this for 10 organisations. Apart from some discrepancies which I could resolve (for two of the organisations), the differences vary wildly.

Reasons for the differences could fit into one of three categories:

Category 1: My comparisons are nonsense.
Category 2: My comparisons are not nonsense, the differences are real, and the reasons for those differences are legitimate (meaning that the 360DG data is accurate/complete and the accounts are not materially mis-stated).
Category 3: My comparisons are not nonsense, the differences are real, and either the 360DG data is inaccurate/incomplete and/or the accounts are mis-stated (although whether materially so remains to be seen).

I’ve already identified one howling example that falls into Category 3. This was a grant of >£10m (payable over a few years), which is not in 360DG (anywhere!) but which is listed in the organisation’s accounts explicitly (since it was greater than their disclosure threshold of £5m). This grant was the reason I started on this “What about all the others?” adventure.

I’m in two minds about whether some Category 2 differences are legitimate (e.g. missing out grants made to a single overseas territory when your main destination is the UK). But in other cases I think missing things out is not legitimate (e.g. including revenue grants but not capital grants).

I am, of course, not in a position to comment on the validity of Category 1 differences!!!

Any comments welcome.

Thanks!

Hi

Apologies for the delay in responding. Your e-mail landed during our ‘extended’ Christmas break.

I cannot give a definitive answer without looking into our own data (which may take me a little while), but I do know of one specific exclusion that would make a material difference between our accounts and 360Giving, albeit not a huge one.

We do not provide details of any grants awarded to individuals, for reasons of privacy. This would, however, be less than 5% of our annual grant giving.

If I get the opportunity to look into this further and find some other examples, I will come back to you.

Kind regards

Steve

Steven Mackenzie

Development and Operations Manager

Hi Steven, thanks for replying. Incidentally, who do you work for?

I work for Essex Community Foundation.

I have taken a quick look at the figures and sadly there may be a couple of contributing factors:

Ahh yes. I saw your vid about 360DG community champions on YouTube – sorry to have forgotten!

I too took a look. I used the most recent set of accounts (2019) and compared the 2018 and 2019 figures. The differences were really small for 2018, but larger (although not shocking) for 2019. If you want to see my working, I’m happy to email it to your organisation, marking it for your attention.

Otherwise, if you get to the point of identifying the reasons/categories driving the differences, do please let me know. I want to put together a generic list of “reasons why there might be differences”, plus a general discussion paper including an example of how to compare grants information in the accounts to 360DG data (using the one organisation out of the 10 whose data matched perfectly).
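For concreteness, here is roughly what I mean by a comparison – a minimal sketch in Python/pandas rather than Excel. The filename, the accounts total and the year-end dates below are made up; the column titles “Award Date” and “Amount Awarded” are the standard 360Giving ones:

```python
import pandas as pd

# Load the publisher's 360Giving-format CSV (hypothetical filename).
grants = pd.read_csv("publisher_grants_360giving.csv", parse_dates=["Award Date"])

# Keep only awards falling in the financial year covered by the accounts
# (example dates for a 2018/19 year ending 31 March).
fy = grants[(grants["Award Date"] >= "2018-04-01") &
            (grants["Award Date"] <= "2019-03-31")]

total_360dg = fy["Amount Awarded"].sum()
total_accounts = 1_234_567  # keyed in by hand from the grants note in the accounts

diff = total_accounts - total_360dg
print(f"360DG total:    £{total_360dg:,.0f}")
print(f"Accounts total: £{total_accounts:,.0f}")
print(f"Difference:     £{diff:,.0f} ({diff / total_accounts:.1%})")
```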

Many thanks for your contribution.

Hi (Do I call you C21 or Bean??)

It would save me some time if you could send over your ‘workings’. I can then just identify the differences and see if there is a definite pattern that we can identify/address. I assume that my e-mail address is hidden from the forum, so please send anything across to me at steven@essexcf.org.uk.

Cheers

Steve

Thanks for posting - it’s great to see 360Giving data being used and generating interest.

We have developed a standard, open format for voluntarily sharing data about grants made in the UK. There are lots of reasons why an organisation’s annual accounts could be different to the data being shared. 360Giving data isn’t “reporting” in the way finances are reported to regulators such as Companies House or the Charity Commission, and not all grants are necessarily included.

Some organisations opt to share data in the 360Giving format for certain programmes and not others, for example sharing details of grants awarded from open grant processes but not including proactive/strategic awards. Other reasons for differences could include waiting for the grant period to end before publishing it; unused/cancelled grant funds being returned; some grants not being shared for privacy or responsible data considerations (e.g. grants to individuals or politically sensitive topics). We encourage data publishers to briefly describe what is in the data they share in text alongside their data files, as a guide to users.

For others doing this sort of analysis, it’s also worth noting that several organisations are sharing data about grants awarded in the past 12 months, so this data is more recent than financial reports to regulators. It’s important to check that the correct time periods are being referred to (which you’ve already done).
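As an illustration, one quick way to line up time periods is to bucket the published awards into UK financial years before totalling them. This is only a sketch (it assumes pandas, a made-up filename, and the standard “Award Date” / “Amount Awarded” column titles):

```python
import pandas as pd

grants = pd.read_csv("publisher_grants_360giving.csv", parse_dates=["Award Date"])

# Label each award with the UK financial year (April-March) it falls in,
# so each bucket can be matched to the corresponding set of accounts.
def financial_year(d):
    start = d.year if d.month >= 4 else d.year - 1
    return f"{start}/{start + 1}"

grants["FY"] = grants["Award Date"].apply(financial_year)
print(grants.groupby("FY")["Amount Awarded"].sum())
```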

(I’ve moved the post to a different part of the forum as the original category was for a now closed programme)

Hi David,

Indeed, from my analysis of (now) 11 orgs and from the replies I’ve had from (some, not all!) organisations, your list doesn’t differ much from mine (although I diverge in that “waiting” wasn’t one of my options…).

Nevertheless, I can confidently add to this list the following:

  • “Capital grants not included” (which, in the case I’m thinking of, are publicly known as open government data elsewhere - albeit as payments, so it’s odd that they wouldn’t publish in 360DG as well…); and
  • Minor extraction glitch (one organisation kindly confirmed that this was the cause of gaps I had identified, and updated 360DG accordingly. Yay, a positive result!).

So I still think high-level reconciliations are well worth doing, since they are not remotely difficult to do in Excel (or in a few lines of code; see the sketch after this list). Once done, they can help publishers to:

a) Check that the missing data was intentionally omitted (and make changes if necessary);
b) Gain some comfort over their data maintenance/extraction/publication processes; and
c) Keep notes to users up to date (about what they intentionally leave out and why).
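As a concrete illustration of check (a): once the headline totals diverge, the obvious next step is to list the largest published awards and tick them off against the itemised grants note in the accounts. Another sketch, using the same made-up filename and example 2018/19 year as before (“Recipient Org:Name” is the standard 360Giving column title):

```python
import pandas as pd

grants = pd.read_csv("publisher_grants_360giving.csv", parse_dates=["Award Date"])

# Same example financial year as before (2018/19, year ending 31 March).
fy = grants[(grants["Award Date"] >= "2018-04-01") &
            (grants["Award Date"] <= "2019-03-31")]

# The ten largest awards in the year: a grant of >£10m missing from 360DG
# would be conspicuous by its absence from this list.
largest = (fy.sort_values("Amount Awarded", ascending=False)
             .head(10)[["Recipient Org:Name", "Amount Awarded"]])
print(largest.to_string(index=False))
```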

I would also say it’s worth doing for analysts interested in understanding the activities of a handful of publishers: even in the absence of confirmatory responses about where the differences come from, contemplating the £-value (or %) differences can be quite sobering. For example, a 1% difference might not be material to my research, but a 20% difference would be.

Thanks.