The independent evaluators have completed the evaluation of the Ushahidi Haiti Project (UHP), after many months of interviews, sorting through Skype chats, and trying to connect the dots. The final report, found here and on the evaluation website, aims to contribute two things to the CrisisMapping community: first, an understanding of UHP itself, and second, an understanding of the methodologies that can best be used to evaluate and learn from open source projects to provide useful information for future implementers. We commissioned this report not at the request of any donor but in a genuine attempt to offer a transparent account of the strengths and weaknesses of the project. As the evaluation manager, I wanted to take the opportunity to offer some of my own reflections on UHP and what I think it means to us more than one year later.
I remember the day, at least one year ago, when we sat around in the basement of The Fletcher School and discussed the ‘theory of change’ that could describe why we all believed that the UHP was important. My good friend Sabina Carlson hit it: “The louder the voices from the ground, the better the response will be.” The final version of the theory of change in the Terms of Reference for the evaluation was a bit more specific: “Access to accurate and timely information from the ground during post-crisis response periods will enable humanitarian responders to act more efficiently.”
The point is the same. This isn’t about technology. It’s not about Ushahidi. It’s about participation. We believe in open source because we believe there is a better way: we believe that individuals should have louder voices and more ability to help themselves, and that by giving people a voice we will see better results. What matters is not the achievements of individuals or organizations, but simply the fact that more people were given a chance to participate.
We all know – I hope – that the Ushahidi Haiti Project was far from perfect, and this point is reiterated in this evaluation. Yet this project is significant in that it is one small part of a paradigm shift. Lives were saved – this is true and a point that cannot be overstated. And yet many of the significant results of this project are not seen in the immediate effects, but in the long-term influence that the initiative had in proving that open-source, participatory information gathering can work.
The evaluators rightly refer to the Red Cross/Red Crescent rules of conduct in this report. On page 11 they cite Principle 7, ‘Ways shall be found to involve programme beneficiaries in the management of relief aid.’ This project was one huge step forward in making this aspiration a reality. Many more have been taken (Standby Task Force in Libya, for example) and there are many more to come.
The second contribution that I hope for from this report is to the process of thinking about how to evaluate open-source projects, which is difficult since by nature they are free, public, and open (page 12). One cannot simply ask participants if they are satisfied, since satisfaction is subjective and participants are hard to identify. This ambiguity is more than welcome – however, the difficulties it presents must be embraced and addressed in a way that still allows us to assess projects systematically in order to move the field forward. Systematic means using metrics, comparing to internationally accepted standards such as the Red Cross/Red Crescent rules of conduct, and being transparent about methods and limitations to data collection.
One last note – one weakness of this report is its focus on UHP at the expense of 4636 and the role of the Haitian Diaspora. This is partially due to the original design of the evaluation and partially due to the challenge of finding those people in order to get their responses to surveys and interviews. This is not meant in any way to downplay the tremendous role that they played. We are incredibly grateful for all of their hard work.
In addition to the Diaspora and Mission 4636, I would like to thank the evaluators for their diligence, Nona Lambert (the internal evaluation manager), each of the volunteers, and those of you that contributed to this report. The evaluation benefited greatly from the support of Tufts University, including Peter Walker and the Feinstein Center for administrative support and Professor Scharbatke-Church for her evaluation expertise. I would also like to recognize the work of Solutions, a technology company based in Haiti, on building a locally-driven mapping platform, Noula.ht, which continues to promote the crowdsourcing of data in the country. We look forward to your comments and feedback.