"Right" means what is right for your organization, your project, your people, your environment, your context, your values, and your beliefs. I get frustrated with people who sell a specific tool or method without a full understanding of what an organization, project, or task is seeking to do. The answer rarely comes in an off-the-shelf prescribed package but rather in methods and tools that provide the information needed to see whether a task is doing what it intended to do (and if it is not, why not?). And crucially, the tools and methods provide information that can stimulate and generate improvements.
The combination of tools and methods that provides the vital information then becomes the framework from which relevant information and knowledge can be shared with and communicated to stakeholders. That is, information for stakeholders is a subset of the already gathered information.
Some tips for making a good decision on the "right" tools:
Really know what your organization, project, or task is about. Get clarity on what you are trying to achieve and on the definition of success. Break down and define every word you use; if you can't explain what you mean, use another word. No sloppiness of words (e.g., by "empower" we mean "a person has gained and now behaves"; by "community" we mean "these people in this area").
Predict what you believe will be critical: a collection of truly valuable qualitative data for measuring achievements and challenges. But be prepared to be wrong; the assessment setting itself could introduce bias into observation. People sometimes act differently if they know they are being assessed, even informally. It may take an independent observer, or a less invasive observation tool, to find out what is really happening.
Investigate and be open to different tool types and methods. Be critical of all tools and methods, and choose the best for your project, which may not be those an expert suggests (for example, if your project aims to change people's attitudes, you are likely to know when they have genuinely changed their attitude and whether a tool or method would capture this accurately). It could mean selecting a more resource-intensive tool, such as one involving face-to-face contact, that ultimately gives the most accurate and meaningful data critical to success, rather than a cheaper tool, such as recording attitude changes through indirect or electronic contact, that might give a lot of data that is less meaningful to your work.
Break down how you as a person measure success. What do you look for to see if someone's behavior has changed? Does the tool or method record these critical indicators?
Exercise ICT caution. Be honest when it comes to reviewing the latest technology. It is easy to get swept away by tools that look pretty and are popular but that miss collecting really meaningful information.
Learn lessons from others. It is unlikely that your combination of tools and methods will be exactly the same as another organization's; right is what is right for what you are trying to achieve in your context. Look to others for ideas about tools, but do not simply adopt an entire framework.
Break down any additional information that is required. This could mean information for financial accountability, ethical accountability, and good management practices. Ensure this information is also being collected (but is not the only information collected).
Consider separately what is required by funders of the work. A good funder will recognize your logically selected tools and methods and will request only information that has not already been proposed but that they need in order to report collectively across projects (this should be minimal).
When it comes to tools and methods, if one isn't right for accurately measuring success, providing pivotal learning, or being accountable, why entertain it? And if someone else insists you must, what are their motivations? Are they really aiming to achieve the same success? Also, are you trying to measure too much? There must always be a balance between the resources allocated to measuring, reviewing, and innovating and those allocated to actually carrying out the work. Keep it simple: prioritize what you must know and work from there.
Questions
What have been some of the most meaningful tools and methods an organization has used to evaluate its work?
Have you had experiences where the resources needed to implement a tool or method did not justify the information gained in return?
Rose Casey-Challies is the Director of Partners in Impact, working with socially driven organizations to identify what social impact they want to make and ensuring they have the means to show, learn, and move forward from their experience.