An Ill Wind? The Role of Accessible ICT following Hurricane Katrina

While drafting a paper for ODI on Network Humanitarianism, I dredged up a copy of the note that I released on 10 September 2005 following Hurricane Katrina. It’s fair to say that nobody paid much attention to it at the time, partly because it really was a different world back then in terms of disseminating information. I’m posting a copy here not because I think it offers much insight – frankly some of it is a little embarrassing – but because it’s nice to have a record of these things. Also, I liked the phrase “first responders of the wired world”…

___

Introduction

In the wake of Hurricane Katrina, there has been an astonishing amount of activity in web-based initiatives responding to the consequences of the disaster. Examining the characteristics of the response of the technology community to Hurricane Katrina[1] tells us much about the way the web has shaped social responses to disaster, raises some interesting issues about the impact of ICT in disaster response, and points towards what might happen in future.

Disasters Old and New

The communications revolution and the growth of the mass media have already changed our perceptions of disasters and the communities they affect. Even 20 years ago, coverage of the Ethiopian famine (and the global response, including the well-documented Live Aid event) was facilitated by communications improvements in such a way as to become imprinted on the minds of a generation. It is not, strictly speaking, accurate to call this a revolution; similar events had taken place in a previous generation, in the form of coverage of the war and famine in Biafra in the late 1960s, and popular responses such as George Harrison’s Concert for Bangladesh in 1971. What had changed by the 1980s was the scope and scale of the coverage, and the subsequent level of public awareness and engagement.

It was clear following the Indian Ocean tsunami that the information revolution was in the process of similarly changing the way in which we respond to disasters. This was demonstrated by the rise of web-based fund-raising; Christian Aid raised over £700,000 online in nine days, nearly four times as much as it raised through donations over the phone. The spread of broadband, improvements in satellite telecommunications and the availability of imagery have made possible GIS and cartographic projects that could not have been attempted five years ago.[2] The rise of the open source movement has led to initiatives such as the Sahana project, an attempt to develop a suite of web-enabled applications for disaster response organisations.

What lessons can be drawn from what we have seen in the response to Hurricane Katrina? This paper is intended to generate discussion, based on conclusions that have particular relevance for the Sahana project and for future initiatives to develop dedicated platforms and applications for disaster response.

Discussion Points for Future Response

For a variety of political and logistical reasons, the government response to the hurricane was perceived by the public to be woefully slow. There will be an ongoing public dissection of those reasons for a long time, and we are interested in them only in so far as they affected the use of the US’s considerable technology capacity. However, the perception of delay is itself relevant, since it spurred into action a large number of people who might otherwise have remained nothing more than engaged observers.

Amongst these people were what we can refer to as the ‘first responders’ of the wired world: those individuals and organisations who are tightly tuned to the web, and who consequently pick up on breaking news earlier than the general population. It has already been documented how bloggers can break news more quickly than the mainstream media; much discussion has been devoted to how this will affect the collection and dissemination of information – and the traditional journalistic virtue of objectivity. During Hurricane Katrina, blogs such as Ernie the Attorney and Queer and Loathing in America reported their authors’ experiences – once those bloggers had access to the web – while others were set up to capture information more actively, as with The Interdictor or the Slidell Hurricane Damage Blog.

Outside the immediate area, first responders read their emails, spoke to friends and colleagues by phone, and later watched and listened as the broadcast media brought the news. One characteristic of these first responders as a group is that many of them are also early adopters, tending to be evangelists for new technologies, particularly where those technologies have social applications. For the first time, first responders were able to use these skills to respond to the impact of a major disaster. The result was a large-scale mobilisation of existing resources to meet a pressing and immediate need – the transmission of news from the disaster area to the ‘outside world’.

This transmission went beyond the passive reporting of events, however, a requirement already well served by the mainstream media. The distributed, interactive nature of the internet meant that these first responders were able to take an active role in two key areas of the response: immediate shelter needs and family tracing. A huge number of attempts to build web-based listings of individuals affected by the hurricane sprang up – some of them more successful than others, but all of them the result of private or charitable rather than government initiatives. Even the Red Cross – traditionally the market leader in family tracing – was caught out, its service just one of many.

By 3 September, a small group of “first responders”[3] had decided to address the problem of multiple data streams, and the PeopleFinder project was born. A massive volunteer data scraping, cleaning and entry effort was organised at short notice, and a consolidated database began to take shape outside the traditional, centralised institutions for this kind of response, such as FEMA or the Red Cross. Hugh MacLeod, author of the Gaping Void blog, commented that it was “interesting to watch how the information is self-organising.”[4] In fact, the information was far from self-organising; it was the web-based response that became self-organising. The PeopleFinder meme, supported by a tightly-linked network of bloggers, became a centre of gravity for missing-persons lists, very quickly generating its own data standard, the People Finder Information Format (PFIF) – a very basic specification, but a tremendous effort and a step forward. It seems likely that PFIF will continue to develop and may become a useful standard for general adoption.
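To make PFIF concrete, here is a minimal sketch of the kind of record the format describes. The element names below follow the PFIF 1.1 draft, but the record itself – the identifiers, names and note text – is invented for illustration.

```python
import xml.etree.ElementTree as ET

# Namespace URI from the PFIF 1.1 draft. The element names below follow
# that draft; the values are invented for illustration only.
PFIF_NS = "http://zesty.ca/pfif/1.1"
ET.register_namespace("pfif", PFIF_NS)

def tag(name):
    """Qualify an element name with the PFIF namespace (Clark notation)."""
    return f"{{{PFIF_NS}}}{name}"

person = ET.Element(tag("person"))
for field, value in [
    ("person_record_id", "example.org/person.1001"),
    ("entry_date", "2005-09-04T12:00:00Z"),
    ("first_name", "Jane"),
    ("last_name", "Doe"),
    ("home_city", "New Orleans"),
    ("home_state", "LA"),
]:
    ET.SubElement(person, tag(field)).text = value

# A note attaches a sighting or status report to the person record.
note = ET.SubElement(person, tag("note"))
ET.SubElement(note, tag("note_record_id")).text = "example.org/note.2001"
ET.SubElement(note, tag("found")).text = "yes"
ET.SubElement(note, tag("text")).text = "Evacuated to Houston; family notified."

print(ET.tostring(person, encoding="unicode"))
```

The crucial design decision is the source-prefixed record identifier, which lets records gathered from many independent sites be aggregated without colliding.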

Although many of the first responders described above are also innovators in the technology field, apart from the development of PFIF it is hard to see significant innovation in the response. What we did see was the innovative use of existing platforms (such as wikis), applications (such as Google Maps) and services (such as Craigslist), as these well-known, accessible and user-friendly tools were used to build knowledge bases to support the response. At the core of all these responses was a single resource: the relational database. This clearly demonstrated that the primary requirement for any aspect of disaster response is data.
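As an illustration, the consolidated store that the PeopleFinder volunteers were populating by hand can be sketched in a few lines. The schema below is hypothetical – my own minimal guess at the relevant fields, not PeopleFinder’s actual design – but it shows why the relational database sat at the centre of these efforts: scraped records from any source land in one queryable place.

```python
import sqlite3

# Hypothetical, minimal schema for a consolidated missing-persons database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE person (
        record_id           TEXT PRIMARY KEY,  -- unique, source-prefixed id
        source_site         TEXT NOT NULL,     -- where the record was scraped from
        full_name           TEXT NOT NULL,
        last_known_location TEXT,
        status              TEXT CHECK (status IN ('missing', 'found', 'unknown'))
    )
""")

# Records scraped from different sites land in a single table.
conn.executemany(
    "INSERT INTO person VALUES (?, ?, ?, ?, ?)",
    [
        ("craigslist/1", "craigslist.org", "Jane Doe", "New Orleans, LA", "missing"),
        ("nola/77", "nola.com", "John Roe", "Biloxi, MS", "found"),
    ],
)

# The payoff: one query across every source, instead of dozens of separate lists.
for row in conn.execute(
    "SELECT full_name, last_known_location FROM person WHERE status = 'missing'"
):
    print(row)
```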

It was unsurprising, and indeed appropriate, that there was little innovation to be seen. It was unsurprising because there simply was not time to engage with recently developed technologies, or to deploy experimental tools. It was appropriate because using a disaster response as a test site for new technologies is practically risky and ethically questionable – what is required are proven technologies that will save lives. The result was to fall back on the everyday tools that were already available; another lesson from the hurricane response was that, in one of the most technologically advanced nations on earth, the most useful tools were the ones that were easily to hand.

So what were the applications and services that were put to use? In tracking the coverage of the hurricane response, the resources online fell into a limited range of categories. Blogs gave individuals the opportunity to publish news and opinion in real time to a broad audience, independently of the mainstream media. This made them perfect for broadcasting requests for assistance, pointing people towards other resources, and providing a running commentary on the unfolding disaster. Wikis provided quick and simple content management that a large number of people could contribute to. As always, Wikipedia had excellent coverage of the hurricane; more interestingly, wikis were used to organise information about a variety of projects – the Katrina Help Wiki[5], the Hurricane Katrina Help Page, Think New Orleans – and to create ad hoc portal sites for the general public.

The most obviously useful applications, however, were the message boards provided by sites such as Craigslist or NOLA.com. These fell into two sub-categories: family tracing and shelter offers. Family tracing boards carried either requests for information about individuals, or information about those individuals themselves, in order to reassure or reunite families and friends. Shelter offers were a simple co-ordination tool, allowing needs for assistance and offers of assistance to be matched up; this was later extended on Craigslist to some basic services, such as cleanup equipment.
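The co-ordination these boards performed by hand is easy to describe in code. The sketch below is purely illustrative – the data structures and the first-fit rule are my own invention, not anything the sites implemented – but it shows the kind of matching that a dedicated application could automate.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Offer:
    city: str
    beds: int

@dataclass
class Request:
    family: str
    city: str
    people: int

def match(requests: List[Request], offers: List[Offer]) -> Dict[str, Optional[Offer]]:
    """First-fit matching: each request takes the first offer in its city with room."""
    placements: Dict[str, Optional[Offer]] = {}
    for req in requests:
        placements[req.family] = None
        for offer in offers:
            if offer.city == req.city and offer.beds >= req.people:
                offer.beds -= req.people  # reserve the beds for this family
                placements[req.family] = offer
                break
    return placements

offers = [Offer("Houston", 4), Offer("Baton Rouge", 2)]
requests = [Request("Doe family", "Houston", 3), Request("Roe family", "Houston", 3)]
print(match(requests, offers))  # second request unmatched: Houston beds exhausted
```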

Perhaps the most interesting development, however, was the wide availability of geographic information, particularly through facilities such as Google Maps and Google Earth. Geographic information is extremely powerful in shaping the direction of any response, and some of this data looked to be genuinely useful – both general information and more specialised datasets, such as damage assessments (Google Earth), shelter maps (Google Maps) and remote sensing imagery (NOAA). Users were actively encouraged to adapt this data for their own needs; it is increasingly obvious that Google will play an important role in familiarising a wider audience with the visualisation of data, and will consequently pave the way for more widespread acceptance of GIS.
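Even the basic orientation question – which shelter is nearest? – reduces to a simple great-circle calculation once shelter locations exist as data. A minimal sketch follows, using approximate coordinates for two real shelter sites; a real service would draw its list from the published shelter datasets.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean radius of the earth

# Approximate coordinates, for illustration only.
shelters = {
    "Astrodome, Houston": (29.685, -95.408),
    "River Center, Baton Rouge": (30.446, -91.190),
}

here = (30.0, -90.1)  # the enquirer's position, e.g. from an address lookup
nearest = min(shelters, key=lambda s: haversine_km(*here, *shelters[s]))
print(nearest, round(haversine_km(*here, *shelters[nearest])), "km away")
```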

There are two caveats to this optimistic view. First, the level of geographic information seen during the hurricane was very basic – orientation rather than analysis. We need to invest in developing more usable analytical tools for a general audience, using better graphical interfaces such as Google’s, and to ensure that there are data models for humanitarian GIS work. It is also worth remembering that the coverage of remote sensing and the quality of geographic information is better in the US than anywhere else on the globe, raising questions over how soon those services might be replicated in other locations. The only solution to this problem is greater investment in improving the global coverage of baseline geographic data; if we accept that geographic information is vital, not just for disaster response but for a wide range of human endeavours, we must also accept that mapping the earth requires more funding.

In addition, useful as these services were (particularly to somebody like me, who was following the response from the other side of the Atlantic), they were limited in their scope – limited by the reach of the internet. Evelyn Rodriguez, a blogger who survived the Indian Ocean tsunami, spoke from experience when she pointed out that “survivors in most immediate need… are rarely going to be in any position to get online… I just don’t see online resources as the highest priority for a survivor as it’s not likely we’re safely at our keyboards on a broadband connection when and after disaster strikes.”[6]

So who are these resources useful for? Evelyn Rodriguez again has the answer: “online information immediately in the aftermath is mostly going to be useful for other family members trying to sort out information and the general public.” To a large extent, these services were provided on the basis of assumed needs. Ethan Zuckerman, one of the movers behind the PeopleFinder project, noted himself that “[c]omputer programmers are naturally inclined to solve problems with code,”[7] and many of the services fit into the wired world view that technology can solve most (if not all) of the world’s problems.

It will be some time before we are able to evaluate the success, or even the utility, of these services. In terms of adding value to the response, were they worth the investment? Did their impact extend beyond providing an outlet for the humanitarian spirit of those involved with them? Were many families going to the web to trace their relatives in New Orleans? With no statistics available, it is impossible to answer any of these questions, but two things are clear. One is that technology has its limits; the other is that, despite those limits, it enables people to contribute where they would otherwise have no opportunity to do so, and that in itself is empowering.

On the ground, however, people’s concerns were different. On 3 September, a CBS reporter visited the area in an army helicopter delivering MREs, only to be told by residents of the town of Pass Christian, “[w]e have food. We need fuel to power our generators. We can be self-sufficient if they would just get us some fuel.” Further on, in the town of Kiln, the same message was repeated, with one resident explaining that “no one blames the Army National Guard for delivering the wrong items; it’s more a matter that there is a disconnect between the providers and those in need.”[8] The question of co-ordination, while rarely spoken about explicitly, clearly plagued the entire response to the hurricane – both governmental and non-governmental.

Such was the importance attached to communications (and such was the chaos of the early response) that a number of conspiracy theories began to circulate: that radio signals were being jammed, or that groups trying to establish a radio station in the Astrodome had been shut down by the authorities. Radio communications and internet provision became the subject of mainstream media coverage, as community-based organisations and private citizens tried to provide these facilities in adverse circumstances. Despite their efforts, there appeared to be no effective public information system operating in the affected areas and, judging from television reports, updates on the situation (or directions towards assistance) were not reaching those in need.

This failure to transmit information to those who truly needed it spoke to the weakness of all the initiatives described above. Inspiring as they are, they are also fragmented and uncoordinated – much like every other humanitarian response that I’ve been involved with, if I’m honest. The result is inefficient, and means that we don’t meet the needs of people and communities as quickly and appropriately as we should. It’s too late for the people of New Orleans – just as it was too late for those affected by the Indian Ocean tsunami – but we need to start asking the difficult questions now. It is my hope that the impact of Katrina will lend momentum to existing projects and generate new initiatives that can be used in future disasters.

On a personal note, it has been an interesting experience following some of the technical blogs – particularly those involved in PeopleFinder (mentioned above), ACT (who set up a community internet centre in the Astrodome), Recovery 2.0 (who are already thinking about readiness for the next disaster), and a multitude of others. They have been grappling with exactly the same issues that we have been dealing with in the field for the last decade – massive amounts of uncleaned data, lack of interoperability between systems, problems establishing basic infrastructure – and I can feel their frustration.

The only difference is that the US has the resources to put behind these initiatives and the logistics to make them happen, something that isn’t generally true when we deploy to locations such as Aceh or Darfur. We need the momentum generated by Hurricane Katrina to have a multiplier effect, so that the initiatives that come out of it also benefit the rest of the world. This means we have to learn (something that the humanitarian community is painfully bad at) in order to make sure that we don’t end up fighting the last war, developing tools that are wonderful if you are in a developed country with resources to spare but useless if you are anywhere else in the world.

Conclusion

Based on the points raised above, there are three key questions that we should all be asking, as new technologies – and particularly web-based services – empower us to take action.

  1. Are there ways of rapidly developing network organisations to co-ordinate these initiatives, without destroying the volunteer spirit, spontaneity and inventiveness of the decentralised approach? This is the perennial problem with co-ordination; the Katrina response may point us in useful directions that we could apply to common services in other humanitarian situations.

  2. What lessons can we learn from the Katrina response, and how can we ensure that those lessons are turned into actions? Which tools worked, and which didn’t? For example, did internet and telephone connectivity make a significant difference, or was it a diversion from the real needs of those trapped in the Astrodome? These questions should be asked in a spirit of open enquiry, and not in order to denigrate the efforts of those who worked so hard to set up those projects.

  3. What’s missing from the picture painted above? From my perspective, what’s missing are reports of how ICT was used in what we usually think of as humanitarian activities – shelter management, distribution of food or non-food items, health services, and so on. Doubtless there are applications being used in these aspects of the response, particularly by larger organisations such as the Red Cross, but it was hard to identify them from a distance. There needs to be an analysis of the gaps in the technology – what applications were needed but weren’t available? What solutions were explored for rapid connectivity needs, either static or mobile?

Despite the terrible cost, the hurricane response will be a valuable learning experience, and those involved on the ground should do their best to document it. Some writers have already started thinking about these issues. The obvious starting points are tools to manage data, no matter where that data is coming from, and the standards, protocols and operating procedures that mean systems can speak to each other. The next step is scalable, flexible and interoperable applications that can be deployed anywhere and operated easily in environments with varying levels of connectivity and computer literacy. And, finally, future developers should always bear in mind the words of tsunami survivor Evelyn Rodriguez, quoted above: “In an emergency, think: Cheap. Simple. Ubiquitous.”

[1] Specific references are given in footnotes; URLs are hyperlinked within the text itself.

[2] “Better the Devil we Know: the Opportunity Costs of Humanitarian GIS” (unpublished), Paul Currion, July 2005.

[3] David Geilhufe, Ethan Zuckerman and Jon Lebkowsky.

[4] “Lists Lists Lists”, Hugh MacLeod, 4 September 2005.

[5] Initiated by Dina Mehta and veterans of another online initiative, the South-East Asia Earthquake and Tsunami Blog.

[6] “International Blogging For Disaster Relief Day”, Evelyn Rodriguez, 2 September 2005.

[8] “Reduced to Matchsticks”, Cynthia Bowers, CBS, 3 September 2005.
