Continuing the Conversation…

Wednesday, April 30, 2008 – 09:56

By Andre Proctor

Why and how do we measure, learn and share what we learn?

“Measurement should not be imposed on organisations, or it becomes a threatening rather than a learning opportunity,” writes Shelagh Gastrow in her highly critical response to David Bonbright’s article, ‘What do we need to know?’ In that article, Bonbright concludes with a call to:

“invest in the creation of … open communities of practice and performance measurement commons that accelerate bottom-up processes of … merging, imitating and above all learning.”

So where is the disagreement?

Both are calling for flexible, bottom-up, context- and process-sensitive approaches to measurement in social change organisations, aimed above all at learning and improving effectiveness. Yet Gastrow has clearly interpreted Bonbright’s argument as simply another way of imposing rigid, top-down and mechanistic planning and measurement models.

This exchange highlights some important differences in perception and in the way we use language that are common in our sector, and that often cause the discussion on measurement in social change to get stuck.

If we can identify and address these ‘sticking points’, then perhaps we can turn what is now an often polarised debate into a more generative dialogue through which we can come to new understanding, synergies and solutions.

Sticking point 1 – Why do we work in the sector?

Gastrow says passion for causes; Bonbright says to solve social problems and improve people’s lives. This distinction is important because it can never be enough simply to learn for our own benefit alone. He puts it this way: “Learning – and sharing what we learn – is the categorical imperative of social change”, and gives the example of how deep one-on-one learning relationships can generate wonderful information that often stays locked up in that specific relationship. Part of the challenge is to find ways of getting that learning out into the open: what works, which organisations have got it right, who we can imitate and learn from, and how we can contribute what we learn to enhance societal learning and change. Contrary to what Gastrow suggests, there is no necessary dichotomy between one-on-one learning and public reporting.

Sticking point 2 – Benchmarks and standards

Many people feel a powerful negative auto-response when they hear these words. But when Bonbright speaks of benchmarks and standards, he is not referring to standards imposed from above. He is referring to communities of like-minded organisations and their constituents (especially beneficiaries) evolving shared understandings of what kinds of strategies, relationships, attitudes, values and cultures work best for them and their field – and finding out how these can best be measured and communicated.

Closely linked to this is the question of rating agencies and how civil society should respond to them. Gastrow implies that this can only be a bad thing, and Bonbright agrees that to date it has been exactly that. But we face a choice. We can take a purist line, refuse to get involved in this game and risk leaving the entire field to ‘the industry’, which will mean that a large part of the resource pool flows to those who can play to its script. Or we can take them on and develop alternative, more meaningful rating systems that will help channel resources to organisations that reflect the values, relationships and behaviours most valued by practitioners. He gives two examples of this.

Sticking point 3 – The status quo and the nature of an ‘information market’

Gastrow states that there is no short cut for donors but to get to know deeply the organisations they fund. Of course, this kind of long-term, intimate relationship would be the ideal if it were always possible – and if this is her experience of the status quo, then she is indeed very fortunate.

But the overwhelming reality for most CSOs is that their donors have neither the capacity nor the interest to do this. Most give because a project sounds nice, and in the absence of deep mutual learning they protect themselves by insisting on short-term project grants linked to specific outputs and activities rather than the kind of long-term, complex interventions that can lead to sustainable change. Likewise, research has shown that almost all individual giving is ‘impulse giving’ based on a very superficial understanding of the organisation. The current system favours big organisations with a glossy public image rather than small community-based organisations that might have the greatest impact locally.

Here we must confront what we mean by that dreaded term ‘market’.

Gastrow, drawing a parallel with what has happened in global economic markets, is afraid that an information market will necessarily be dominated by the rich and the powerful – and that meeting the requirements of this distorted market will become the sole purpose of measuring and reporting. But do all markets have to behave in this way? Markets can be much closer in spirit and power structure to civil society itself.

The kind of bottom-up public information market that David envisages is one that will prioritise the kind of performance data that CSOs and their beneficiaries value most. This kind of market has the potential to profoundly shift resource flows towards the organisations that are truly searching for and discovering effective solutions to social problems. It also has the potential, as demonstrated in a number of nascent online giving markets, to facilitate new, direct, long-term relationships between millions of individual givers and millions of small but effective organisations in poor communities all over the world. It does not seek to replace intimate one-on-one relationships, but to grow them a millionfold. It is beginning to happen.

But this kind of market can only be realised if there is a credible and consensual information base that helps donors find organisations that are really making the kind of difference they care about – and that can spark the passion for long-term relationships based on a sense of true partnership: that together we are actually solving the really tough problems that face us.

Sticking point 4 – The importance of relationships, process and a plurality of tools and measures

It is true, as both Gastrow and Bonbright point out, that attempts to create standard reporting systems have to date ended up as burdensome, bureaucratic structures that impede rather than enable good developmental practice.

What is different about David’s ‘hundred flowers’ of collaborations and communities of practice is that relationships and process are emerging as the most important indicators of potential effectiveness.

There is indeed an emerging consensus that relationships and processes are at least as important as outputs in achieving sustainable outcomes. It is also clear that there can never be a single measurement or tool that can capture everything that matters.

This is not something to be feared, as Gastrow implies. Many of the approaches and tools that are emerging concentrate on assessing things like the quality of relationships between an organisation and its beneficiaries, its peers and its donors. Others focus on organisational capabilities. David lists some of these in his article, but there are many others.

Keystone, where Bonbright and I work, is just one of many such initiatives. We are exploring comprehensive, deeply dialogic learning and reporting methodologies based on what we call ‘eco-intelligence’ and ‘constituency voice’. Our aim is to transform the much-dreaded ‘M&E’ function into a meaningful process of planning and learning for impact that, in the words of the economist and philosopher Amartya Sen, enables the agency of the people meant to benefit from an organisation’s work. To ensure that this empowerment orientation is represented in the way an organisation reports on its work, we are developing short and practical ways of quantitatively measuring and reporting complex qualitative data – such as organisational capabilities, and the quality of relationships and impacts as perceived by those most affected by its work.
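To make the idea of quantifying qualitative feedback concrete, here is a minimal illustrative sketch in Python. The survey dimensions, ratings and the helper function are hypothetical and do not describe Keystone’s actual tools; it simply assumes that constituents rate statements about an organisation on a 1–5 scale and summarises each relationship dimension as a mean score and a share of clearly positive responses.

```python
# Illustrative sketch only: assumes a hypothetical survey in which constituents
# rate statements (e.g. "This organisation listens to us") on a 1-5 scale.
from collections import defaultdict
from statistics import mean

def summarise_feedback(responses):
    """Turn raw constituency ratings into simple quantitative summaries.

    `responses` is a list of dicts like {"dimension": "listening", "rating": 4},
    one entry per survey answer. Returns, per dimension, the mean rating and
    the share of ratings of 4 or 5 (treated here as clearly positive).
    """
    by_dimension = defaultdict(list)
    for response in responses:
        by_dimension[response["dimension"]].append(response["rating"])

    summary = {}
    for dimension, ratings in by_dimension.items():
        summary[dimension] = {
            "mean_rating": round(mean(ratings), 2),
            "percent_positive": round(100 * sum(r >= 4 for r in ratings) / len(ratings), 1),
        }
    return summary

# Example: three beneficiaries rating two relationship dimensions.
sample = [
    {"dimension": "listening", "rating": 5},
    {"dimension": "listening", "rating": 3},
    {"dimension": "responsiveness", "rating": 4},
]
print(summarise_feedback(sample))
```

The point of such a summary is not the numbers themselves but that they make qualitative perceptions comparable over time and across organisations, so they can feed a learning dialogue rather than replace it.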

We realise that this way of representing our work can feel threatening; after all, much of our best work in the sector is genuinely difficult to measure in the short term. We think that donors are well placed to lead by example on this, and we are pleased to say that many of them agree.

We are now, for example, undertaking parallel initiatives with groups of grantmakers in East and Southern Africa that will enable grantees to give anonymous feedback on the performance of their donors, how donors impact them, and how donors could support them better. The resulting reports will give each donor a detailed and nuanced understanding of the state and quality of its relationships with grantees as well as allow it to see how it compares with other grantmakers. Data that shows that you are in the bottom third of your peer group with respect to some dimension of your work – for example, how your grantees rate your technical knowledge of their subject area – gives a very clear sense as to where and how to invest in organisational improvements.
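As an illustration of how such a peer comparison might be computed – a sketch with made-up donor names and scores, not the actual reporting system used in these initiatives – the snippet below places one donor’s average grantee rating on a single dimension into the top, middle or bottom third of its peer group.

```python
# Illustrative sketch: place a donor's average grantee rating within its peer group.
# Donor names and scores are hypothetical.

def tercile_position(donor, peer_scores):
    """Return 'top', 'middle' or 'bottom' third for `donor` among `peer_scores`.

    `peer_scores` maps each donor to its average grantee rating on one
    dimension (e.g. technical knowledge of the grantees' subject area).
    """
    ranked = sorted(peer_scores, key=peer_scores.get, reverse=True)
    position = ranked.index(donor)          # 0 = highest-rated donor
    third = len(ranked) / 3
    if position < third:
        return "top"
    elif position < 2 * third:
        return "middle"
    return "bottom"

peer_scores = {
    "Donor A": 4.3, "Donor B": 3.9, "Donor C": 3.1,
    "Donor D": 2.8, "Donor E": 4.0, "Donor F": 3.5,
}
print(tercile_position("Donor D", peer_scores))  # -> 'bottom'
```

In practice the value lies less in the ranking itself than in what the donor does with it, which is where the feedback loop described next comes in.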

When this kind of data is reported back to grantees by the donor – ‘OK, this is what you said to us. This is what we think we should do in response. What do you think? Have we responded appropriately?’ – we enter the kind of learning dialogue that strengthens relationships and improves performance.

What now?

Each one of these sticking points is worthy of deep discussion and debate in its own right. It is impossible to do them all justice when they are lumped together in a single article like this.

Perhaps readers of NGO Pulse would like to deepen the discussion on specific issues that Gastrow and Bonbright have brought to light.

Andre heads the South African office of Keystone and can be contacted at andre@keystoneaccountability.org.
