
Measuring Made Easy – Acumen’s Lean Data Model

May 25, 2016

In 2015 Acumen, an international organization that raises philanthropic funds to invest in businesses whose products and services address poverty, launched Lean Data. The new evaluation model addressed a persistent problem the organization saw in the field: many of its social enterprises needed a way to measure impact but were too cash- and time-strapped to use more traditional measurement tools effectively. As part of our series on Measuring Impact, Impact Design Hub spoke with Tom Adams, Acumen's Director of Impact, to learn more about the Lean Data approach.

Impact Design Hub: Can you describe your organization and the work that you do overall?

Tom Adams: Acumen is best known as an impact investor. We raise mostly philanthropic funds to invest patient capital — where we don’t expect a short-term return — in businesses whose products and services are enabling the poor to transform their lives. Within Acumen I lead the organization’s work to better understand the end-consumers our portfolio companies serve. We do this so that we can help those companies grow and so that we can all — Acumen, our companies, and our supporters — better understand our social impact.

Impact Design Hub: How does the Lean Data process work?

Tom Adams: A Lean Data project goes through a simple five-step process:

Diagnose — agree with the entrepreneur on the metrics and questions to answer.

Design — prototype surveys.

Deliver — implement survey, validate/clean data.

Analyze — run tests, draw conclusions.

Advise — give suggestions, plan for a next iteration.

Though if I had to put it more simply than that: we ask entrepreneurs what they wish they knew about their customers that they currently don't, and we go out and get the answers. There are a bunch of case studies in our recent report, but we've collected data on everything from changes in milk yields for dairy farmers to the consumer awareness and purchasing funnel for clean cookstoves.
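To give a sense of what the Deliver and Analyze steps might involve in practice, here is a minimal, hypothetical Python sketch. It is not Acumen's actual tooling; the survey question, field names, and numbers are invented purely for illustration. It validates a small batch of SMS replies about changes in milk yield, drops unparseable answers, and prints a few summary figures an entrepreneur could act on.

```python
# Purely illustrative sketch (not Acumen's tooling): the "Deliver" and
# "Analyze" steps applied to a tiny, made-up batch of SMS survey replies.

from statistics import mean

# Hypothetical replies to: "By how many litres per day has your milk
# yield changed since you started using the product?"
raw_replies = [
    {"respondent": "A01", "yield_change": "2.5"},
    {"respondent": "A02", "yield_change": ""},    # incomplete -> dropped
    {"respondent": "A03", "yield_change": "0"},
    {"respondent": "A04", "yield_change": "-1"},
    {"respondent": "A05", "yield_change": "3"},
]

def clean(replies):
    """Validate/clean: keep only replies with a parseable numeric answer."""
    values = []
    for reply in replies:
        try:
            values.append(float(reply["yield_change"]))
        except ValueError:
            continue  # skip blank or non-numeric answers
    return values

answers = clean(raw_replies)
print(f"Valid responses: {len(answers)} of {len(raw_replies)}")
print(f"Mean reported change: {mean(answers):.1f} litres/day")
print(f"Share reporting an increase: {sum(a > 0 for a in answers) / len(answers):.0%}")
```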

Impact Design Hub: How did the Lean Data model come about?

Tom Adams: We started Lean Data for two reasons. First, because established methods of impact evaluation are not a repeatable model for the social enterprise and impact investing space. They're typically too expensive, too slow and too burdensome — if we want social measurement to become the norm, we need something different. Second, because we thought we could measure a blend of social metrics and customer insights to drive more value to companies working to create social impact.

Impact Design Hub: Beyond being less burdensome, how would you describe the core difference between how Lean Data works and other, more established approaches?

Tom Adams: At its core, Lean Data is two things. First, a shift in mindset away from reporting for compliance (i.e. because your funder told you to!) and toward creating value for a company and its customers. And second, the use of low-cost technology to get high-quality data directly from consumers as efficiently as possible. Within that we're experimenting all the time to make our processes as streamlined as possible and the data we collect as standardized as we can. We're increasingly creating plug'n'play Lean Data tools: question sets that we know work across different technologies, can be adapted for different contexts and languages, and can be rolled out rapidly.

Impact Design Hub: When looking at impact measurement, both the financial and the environmental sectors have built successful standardized metrics (money and CO2 emissions, respectively). Why do you think it's so hard for the social sector to follow this path? What are we still missing?

Tom Adams: That's a toughie. If I had to give one answer, I'd say it's because of the heterogeneity in what we think matters as social impact. Finance is lucky: only one thing matters, money. Similarly, in preventing climate change only one thing matters, CO2 emissions. Though for the environmental movement more generally it's not that simple – what's the metric for preserving the beauty of rainforests? In the social sector there are some standard metrics, e.g. DALYs (disability-adjusted life years, i.e. years lost to disease, disability or early death) in the health sector. But the variation in what matters for creating social value means that we need a large number of different metrics.

Some folks might say that if social impact can be about anything (from self-esteem to life expectancy), then anything goes and measuring it is pretty useless. However, I do think that in the future we can start to agree, in principle, on lists of what matters to us socially. These should be based on what people in society say is important to them. The Equality Measurement Framework is a good example of that, and Big Society Capital did something similar with their social outcomes matrix. It's not going to happen overnight, but over time we can start populating such matrices with metrics that matter. Eventually we might be able to convert each of these metrics into a unifying value (possibly even dollars) based on valuation techniques, but that's another story.

Impact Design Hub: In the social sector, as in science and medicine, randomized controlled trials are considered the gold standard of impact measurement. But it can often be difficult to achieve that kind of laboratory-style precision when tackling complex social problems. How does the Lean Data approach address that challenge?

Tom Adams: Randomization is needed to determine causality: that X causes Y. That's the ideal: proof that it was our action that caused a given social impact, and not things other people did or simply chance. But as you point out, it's not always feasible; moreover, just because something is the gold standard doesn't mean it should be used every time (we don't drink expensive wine with every meal). Instead, we believe you need to make a judgement about what data and methods are appropriate relative to the significance of the decision you are making. Even the most enthusiastic “randomista” would probably agree that formal evaluations aren't for everyone. And in a world where decisions are being made all the time, the majority of which aren't based on much data at all, we may need to be practical and use what is feasible and proportionate, even if it's not perfect. At Acumen we work with partners from time to time to conduct randomized trials, but most of our decisions don't warrant such involved or costly approaches.

Impact Design Hub: What have the biggest achievements of the Lean Data approach been to date? What about setbacks or unexpected bumps on the road?

Tom Adams: Most importantly, we've changed the conversation around impact measurement with our investees. Talk of impact measurement across our sector has, I think, started to become at best uninteresting and at worst a necessary evil people must endure to access capital. I think this is for three reasons: someone, somewhere decided to call it the dire-sounding “monitoring and evaluation”; it's been made overly complex (at its core this is just about listening to people; what could be simpler?); and, most importantly, the whole dynamic of impact measurement has generally been top-down: “you must collect the things I tell you to, in order to prove you're impactful to me.”

We've agreed not to collect data unless doing so also drives value for the business we're investing in, and to make it as simple as possible to get data back from end consumers. We treat our companies as clients rather than as people who owe us data. In terms of bumps in the road, we've had a number, but nothing especially out of the ordinary: surveys that didn't work as we'd hoped, young talent heading off to business school, and so on.

Impact Design Hub: If you're focused on the needs of the firms, it sounds like you're collecting pretty varied data. But don't funders and investors need standardization so they can evaluate where to devote their resources? How does Acumen, as a social investor, aggregate the data from its grantees and communicate its impact?

Tom Adams: Yes, that’s right — a lot of the data we collect is pretty tailored. But we certainly have our eye on standardization. Our hypothesis is that the best way to drive standards is mostly bottom up. Collecting data from multiple firms in the sector, we’re seeing consistencies in what people value and are steadily moving toward increased standards. That’s not to say we don’t ever try to move companies towards standard measures ourselves. We find that doing so is much easier if you can say “such and such gathered this data and they found it really useful for X and Y reasons.” I also think that in the future, once we have enough data points, people will naturally want to benchmark themselves against competitors and industry norms. It’s a natural tendency to want to see where we rank compared to our peers; there’s nothing wrong with a little social competition!

Impact Design Hub: Obviously your evaluation model derives from global development work, and is most applicable to that field. Do you think there are types of impact design work where this model wouldn’t work as well, or is it broadly applicable to any effort to create social impact?

Tom Adams: It's true that we're an international organization, but I think the principles we're adopting would apply anywhere. Collecting data by listening directly to people, using efficient tools to do so, and thinking about the value created for everyone through data collection — there's nothing unique about that, whether you're working in London or Lagos. Some of the tools we use, like SMS surveying, might not be as effective in, say, the U.S. as they are in Kenya, but there are some great polling apps and the like that could take their place.

Impact Design Hub: Lean Data seems to be in part inspired by the Lean Startup model in entrepreneurship. Do you see any limitations to applying that idea to the social impact sector? Is Lean Data equally suited to measuring impact for projects that have paying customers and for those that serve non-paying beneficiaries?

Tom Adams: Why not? Build-measure-learn seems pretty universal. I’d say a problem with some development programs is that they assume their solution works, don’t listen enough to end clients (consumers or beneficiaries), and aren’t able to pivot. Our aim is to use Lean Data to help companies get data faster and respond more effectively to their customers’ wants and needs. This should be just as applicable to folks aiming to impact non-paying beneficiaries.
