Jumping to judgement

Spreadsheets, data and metrics can be valuable, but they may not provide a full or reliable picture of a charity's effectiveness and impact. Each charity has a story to tell. We should listen very carefully before jumping to judgement, writes CCA CEO David Crosbie in Pro Bono News:

Jumping to judgement, Pro Bono News, 28 September 2022


I have had many failures in my life. Some were good failures that taught me valuable lessons; some were not so good, partly because they led to various levels of blame and retribution rather than learning. One of my failures was the non-completion of my PhD.

Prof Mark Lyons took me on as a PhD student many years ago. Sadly, Mark passed away in 2009 and I didn’t complete my PhD. But what I learnt through all the reading, writing, the regular discussions with Mark over our three years together in the 90s, was invaluable.

I recalled this failure from my past as I observed the growing debate in the charities sector about effectiveness. We now have an Assistant Minister for Charities in Dr Andrew Leigh, who is committed to drawing on data and developing evidence-based approaches to policy making, including policies related to charities. As the push for outcomes-based funding grows, questions around evidence of effectiveness are being raised more regularly. My experience is that we need to be very careful in this space.

My chosen PhD topic was the effectiveness of charities and how it could be measured. The thesis I developed with Mark was grounded in the key idea that charities have a clear purpose, and that the most important goal of any charity is to deliver on that purpose. If that is the case, charities that actively measure the achievement of their purpose should perform better than charities that don't measure their outcomes against their stated purpose. Having measures of whether you are achieving your purpose enables you to reward success and not perpetuate failure.

The challenge was to show that those charities that measured their activities and outcomes against their purpose were more effective than those charities that had no measures in place relating to their purpose. Of course, I failed to complete my studies, partly because the challenges presented in such a thesis are numerous and complex.

In reality, some of the best work done by charities happens almost at a peer-to-peer level, where program classification and recording of measures is the exception. As an example, I have visited many Indigenous communities and seen highly responsive programs addressing the short- and medium-term needs of people within a community, but with minimal documentation. A breakfast program run for the long grass communities around Darwin not only provided food, safety, bathrooms, laundry and clean clothes; it also took those who needed it to healthcare, legal care, family reconciliation meetings and many other interventions, all based on the triaged needs of each individual who turned up for a free breakfast. When I asked about the outcomes of the program, those responsible took some prompting before explaining to me that what they were trying to achieve was to see people begin to talk more, stand taller, walk further, drink and smoke less, and spend more time with their families and community. Not one of these measures was documented.

The Darwin long grass breakfast program would not rate well on GiveWell or GuideStar even though I found it to be one of the most effective Indigenous programs I visited. In fact, I became part of a small group of people who helped fight for the program to continue getting the small amount of money it received in grants from the NT government. It was a very vulnerable program, mostly because it lacked the kind of documentation that enables charities to show they are making a difference. But counter to my failed thesis, the lack of documentation did not mean all those volunteers and workers at the Darwin long grass breakfast program were not doing exceptionally good work.

And so it is with many community-embedded programs where volunteers play a key role, where the professional staff are almost entirely client focused, where there is little if any back office, and where displaying dignity and respect and establishing authentic caring relationships are so much more important than tracking and recording indicators of success.

This is one of the reasons I am no fan of charity ranking websites. I am especially not a fan of websites like ChangePath that publish inaccurate, ill-informed statements about charities. According to ChangePath, the Community Council for Australia (CCA) does not make our financial statements available, has made no statements about outcomes, and is not up to date with our Australian Charities and Not-for-profits Commission (ACNC) information. All of these statements or measures are wrong. Fully audited financial reports are available on the CCA website (they are published each year), along with a comprehensive listing of activities and outcomes. Our ACNC information is fully up to date, including the link to our annual reports and audited financial statements.

The issue here is not just inaccuracy, but the whole premise that you can assign a number that accurately describes the effectiveness of an organisation (ChangePath rates CCA as a three-star out of five charity). We cannot know whether a charity is achieving its mission by looking at its balance sheet or analysing its operational versus staffing costs.

I am not suggesting we should readily accept excuses for not documenting organisational performance. I often encourage charities to be more transparent and better document their outcomes. It is always good for charities to let their communities and supporters know how they make a difference. I worked with the Darwin long grass breakfast program to develop simple checklists each day that would help show how they make a difference. To me, this is an important part of good governance and good practice. It can and does often tell people outside the organisation how effective the charity is. But that does not mean that every charity failing to properly document is not doing good work.

I also encourage charities to document the enactment of values within their organisation. How welcome does someone feel when they enter a charity? How safe do people feel within your programs and services? Do people feel respected, listened to, heard? For me these values metrics are almost as important as the balance sheet. Like many people, I have learned to read the charity settings I enter. What I find important is not how expensive the furniture is, but the enactment of values between people, and I am not just talking about the staff. And yet, some of the best therapeutic settings I have visited do not demonstrate either their values or their effectiveness in their annual reports or their balance sheets.

We should all commend those charities that do invest in transparency, measurement and ensuring they are making a difference. They are the leaders, the kind of organisations we should aspire to be.

At the same time, having spent years trying to develop appropriate metrics, I have yet to discover any set of simple, comparable proxy measures that provides a clear way of judging whether a charity is doing good work and achieving its mission. We should not readily apply broad measures across all charities, or assume that one data set is going to tell us all we need to know about which charities are doing good work. Nor should we assume that the better the marketing, the better the organisation.

Each charity has a story to tell, and we should listen very carefully before jumping to judgement.

