How Might We Put People at the Center of Evaluation?
As a design and innovation consultancy built on the notion of putting people at the center of the design process, taking a human-centered approach to evaluation is critical for IDEO. We saw a great example of this when we spent time with IDE in Ethiopia, where Monitoring and Evaluation specialists walk the fields with farmers, connecting their personal stories to the data about them.
When evaluating the effectiveness of a program, quantitative data alone does not convey enough meaning, and typically leaves us with many questions. Numbers are, of course, necessary, but they shouldn't be relied on alone. Statistics should be complemented by deep stories of impact on an individual, a family, or a community, and we should spend as much time thinking about how to effectively craft these stories as we do on how to present the numbers.
Putting people at the center of evaluation means connecting with them on a personal level. We do this by spending time with people in the field, observing them at home or on the job. We build trust in communities by working with local partners and even doing homestays in rural villages. As we get to know people, we gather richer and richer stories about their lives. And as we test prototypes of various innovations, we look to understand how they actually change people's lives.
When we worked with VisionSpring to design eye "camps" for children in rural India, we created a number of different procedures. We observed children as they went through the eye testing process and talked with them about how to improve it. A few children started crying as soon as they sat down to get their eyes checked, because the pressure of the equipment on their faces was too great when the test was conducted by an adult. However, when they were given the opportunity to test the eyes of their classmates, the same children were confident and excited. The feedback VisionSpring sought from children and teachers, including personal stories about children who got glasses and were now able to succeed in school, allowed it to continue conducting the eye camps with better results.
So now we put these questions to you: How have you put people at the center of evaluation? How have you reconciled raw data with human stories? What strategies have you used to measure impact beyond quantitative analysis? Can you think of any realms (education and the prison system come to mind) where evaluation could stand to be less number-centric?
Jocelyn Wyatt leads IDEO's Social Impact domain. On Friday, look for a response on this topic from Hallie Preskill, director of the Strategic Learning and Evaluation Center at FSG Social Impact Advisors.