Almost nothing makes gathering robust information harder than a natural disaster.
But new services and infrastructure emerging globally promise to revolutionize humanitarian response. The edge of the network is alive with open-source technologies, open data, mobile and social computing, cloud services, open GIS, microblogging, sensing, homemade unmanned aerial vehicles, open analytics, and visualization. These will all contribute to a revolution in humanitarian and development affairs over the next 10-15 years.
New capacities for command and control, information coordination, situational awareness, decision support, and the activation of local and remote resources represent a paradigm shift in how institutions collaborate, decide, and act. Humanitarian response in the next decade will require new roles, rules, and habits in a multi-polar international political environment.
These changes promise to diversify the tele-geography of response even as they may polarize institutions. There may be battles over whether responses to disaster and human suffering should be centralized or decentralized, technologically advanced or technologically proven, formal or informal. Like all revolutions, this one will uncover structural tensions in the conduct of humanitarian operations. Transformed technology will transform institutions and human behavior.
How do we connect the evolving richness of tools at the Net’s edge to our humanitarian institutions? I have worked for the past year on a data analysis project at the United Nations’ Office for the Coordination of Humanitarian Affairs (OCHA). It has given me several important insights into how our future world of open, social, mobile data should work:
1) Focus on what humans need to make better decisions.
Data does not interpret itself. The questions we ask generate the data we need and make sense of it once we get it. The promise of new, advanced, or real-time technology will lie not in how it replaces humans, but in how it augments decision-making by both people and machines. As technology advances we must learn to ask new questions tailored to the new capabilities. The OCHA team is working to understand the decisions people need to make so it can figure out the questions they need to ask, and what information services they will need. By imagining new questions we couldn’t answer until now, it may become easier to set an agenda for data and technology. Too often in the midst of such transitions, we believe technology is its own answer.
2) Don’t just collect the data, coordinate the actors.
The real challenge is not storing and analyzing data, but coordinating and linking data sets, providers, and users to generate a clearer situational picture. This is partly a data challenge, but mostly one of human coordination. David Megginson, chief architect for the OCHA data analysis project, says, “The keys here are simplicity and extensibility. You can define standards that are near-universal for a small number of data points, or near-comprehensive for a small number of participants, but you don’t get to have both.”
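Megginson’s simplicity-and-extensibility trade-off can be sketched in code. The example below is a hypothetical illustration only (the field names, place codes, and figures are invented, not part of any OCHA standard): two providers publish records with different local fields, but share a tiny common core, a place code and a date, which is enough to link them into one situational picture.

```python
# Hypothetical sketch: link two providers' data sets on a minimal
# shared core (place code + date). All names and figures are invented.
from collections import defaultdict

# Provider A: health reporting, with its own extra field.
health_reports = [
    {"pcode": "SO-BN", "date": "2013-10-01", "cholera_cases": 120},
    {"pcode": "SO-BK", "date": "2013-10-01", "cholera_cases": 45},
]

# Provider B: logistics reporting, with a different extra field.
logistics_reports = [
    {"pcode": "SO-BN", "date": "2013-10-01", "trucks_dispatched": 7},
]

def link_on_core(*datasets):
    """Merge records that share the minimal core (pcode, date).

    Provider-specific fields are carried along untouched: the shared
    standard stays small and near-universal, while each provider
    remains free to extend it with its own data points.
    """
    merged = defaultdict(dict)
    for dataset in datasets:
        for record in dataset:
            key = (record["pcode"], record["date"])
            merged[key].update(record)
    return list(merged.values())

picture = link_on_core(health_reports, logistics_reports)
for row in picture:
    print(row)
```

The design choice here mirrors the quote: the function never validates or enumerates provider-specific fields, so the standard is near-universal for a small number of data points rather than near-comprehensive for a small number of participants.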
3) Realize that information infrastructures have built-in communities.
Unlike computing in the past, social-mobile-sensing-cloud-open infrastructure depends on communities of developers and users. But we don’t have the institutional habits or designs for that. So tension arises about where standards come from and who is an authorized participant. Data, identity, and community are getting more bound up; information is getting more granular and real-time but also more social. This means we have to start thinking about how communities lead to certain information and vice versa. The goal is to create positive feedback loops among those communities.
4) Design data coordination to evolve and continuously improve.
Data is going to grow and technology is not going to stop advancing, so you can’t just design the system that works today. You need to design a data institution that evolves and improves along with change. This means staying open in our attitudes and our systems, trying new things, asking new questions, and being willing to connect new actors. Governance matters. There must be a way to oversee and sanction the connections and systems, not just technically create them.
Humanitarian response in what might be called the age of context, and the consolidation of the mobile/social/cloud/sensing stack, represents a bigger shift than even the advent of the web in 1993. It will change and unbundle institutions in radical ways (as we’ve seen in numerous cases, including the Arab awakening). Companies had no idea in 1993 that they would soon need chief information officers and new information cultures. Similarly, humanitarian institutions must come to grips soon with the need to redesign themselves. Data coordination will be a big part of that transformation.
Richard Tyson is founder and special project office adviser of the UN OCHA Data Analysis Project. He will speak about the future of humanitarian coordination at the Techonomy 2013 conference, Nov. 11-13. Follow conversations about the event @Techonomy and #Techonomy13.