The first phase of the Humanitarian Data Toolkit (HDT) experimentation is over, and we are already working on improvements and refinements for the next iteration. The HDT was piloted under a Lean Startup model, starting with a relatively rough prototype as the beginning of a process of testing and iterative development. A recently released report documents the journey of the pilot, drawing on our experience as a collaborative team testing the effectiveness of conducting an information needs assessment with the HDT in Dadaab, Kenya.
KEY PRACTICAL FINDINGS:
- The system worked! The toolkit enables the quick execution of an information needs assessment, keeping user error to a minimum during the data collection.
- The HDT provides backup measures in case of equipment failure, lack of internet access, or insufficient electricity.
Reducing user error and increasing quality
- Enumerator accuracy in data collection improves rapidly, even with the challenge of learning a new tool and a new methodology in a short time.
- Working directly with data collection supervisors on quality control of the data at the end of each day of data collection was time consuming, but critical to ensuring that learning was shared throughout the team.
- Quality data collection and rapid improvement are a direct result of these consistent quality control procedures and ongoing individualized feedback from supervisors to enumerators.
Surveys by mobile vs. paper and pencil
- The HDT provides the ability to decide what data collection tool to use, on the spot, for each survey administered. The ability for enumerators to choose their tool increases both their sense of security in a volatile environment and their autonomy in decision-making processes.
- There is a notable difference in efficiency between phones and paper: as anticipated, using mobile phones for data collection decreases both the number of enumerator mistakes and the time it takes to administer a survey. Paper surveys averaged 35 minutes, while surveys on a phone averaged 20 minutes.
- When surveys were done with paper, the use of digitization software was cost effective and time efficient. For example, in a 2011 study, Internews needed two days for manual data entry of 150 surveys. By contrast, using the software Captricity, we needed only 6 hours to digitize and create a database of 400 surveys.
- Focusing on interactive learning modules enabled the trainees to learn the research methodology and the technological tools simultaneously. The training approach included a pilot test of the system that built the trainees’ research skills while allowing the Internews team to identify and address potential weak points in skills, technology, and research methodology.
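The efficiency figures above can be checked with a quick back-of-the-envelope calculation. The survey times and survey counts are taken from the findings; the 8-hour workday used to convert "two days" of manual entry into minutes is an assumption, not a figure from the report.

```python
# Back-of-the-envelope throughput comparison using the pilot's reported figures.
PAPER_MIN = 35   # average minutes per paper survey (reported)
PHONE_MIN = 20   # average minutes per phone survey (reported)

# Time saved per survey by administering it on a phone instead of paper:
saved_per_survey = PAPER_MIN - PHONE_MIN  # minutes

# Digitization of completed paper surveys: manual entry vs. Captricity.
MANUAL_SURVEYS, MANUAL_DAYS = 150, 2        # reported
CAPTRICITY_SURVEYS, CAPTRICITY_HOURS = 400, 6  # reported
WORKDAY_HOURS = 8                            # assumed workday length

manual_min_per_survey = MANUAL_DAYS * WORKDAY_HOURS * 60 / MANUAL_SURVEYS
captricity_min_per_survey = CAPTRICITY_HOURS * 60 / CAPTRICITY_SURVEYS

print(f"Phone saves {saved_per_survey} min per survey administered")
print(f"Manual entry:  {manual_min_per_survey:.1f} min per survey digitized")
print(f"Captricity:    {captricity_min_per_survey:.1f} min per survey digitized")
```

Under the assumed 8-hour workday, manual entry works out to roughly 6.4 minutes per survey against under a minute with Captricity, which is consistent with the cost-effectiveness finding above.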
Software and equipment highlights
- Using FormHub software to create surveys was technically challenging but extremely rewarding. Learning the FormHub syntax required an initial investment in technical capacity before the pilot began, greater than that required to master some other digital survey tools. Over time, however, the investment is rewarded with a high level of adaptability and flexibility in the software.
- A solar panel – a totally independent source of energy – means a team can continue to work in situations where electricity is unreliable. The upfront cost of a solar panel is steep (about US $1200) but is worth the investment in the long term.
- Even inexpensive smartphones are effective and perfectly adequate for intensive surveying. The HDT pilot used Samsung Pocket phones, costing US $100; there were no major technical problems or failures associated with the phones.
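For readers curious about the FormHub syntax mentioned above: FormHub surveys are authored as spreadsheets following the XLSForm convention, with a "survey" sheet defining the questions and a "choices" sheet defining answer lists. A minimal sketch is below; the column headings follow the XLSForm standard, but the specific question names and labels are invented for illustration and are not taken from the pilot's instruments.

```
survey sheet:
type               | name        | label
text               | location    | Where was this interview conducted?
integer            | hh_size     | How many people live in your household?
select_one yes_no  | owns_radio  | Does your household own a radio?

choices sheet:
list_name | name | label
yes_no    | yes  | Yes
yes_no    | no   | No
```

The same spreadsheet can be edited and re-uploaded as the questionnaire evolves, which is one source of the adaptability noted above.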
In keeping with the Lean Startup model, further rounds of pilot testing in different types of emergency situations are needed to understand problems or gaps and better develop the system. Refining the technology and support materials, and designing optimal communications interfaces, will all support scale-up and sustainability of the HDT. We welcome others to use our model to create their own HDT toolkit, use the guidebooks we created, and implement their own needs assessments. It is our hope that this report is the beginning of an engagement with a community of HDT users, who will do their own piloting with the system, add to the knowledge base and dialogue, and co-create the iteration and scale-up of the project.