November 12th, 2012 by Rob

There has been a lot of media recently about digital humanitarian volunteers, especially those who are working remotely. Much of it is pure sensationalism, like this quote from Fast Company about Hurricane Sandy last week:

“The proof is in the tweets: More than 20 million about the storm were sent between October 27 and Nov 1.” Fast Company

If you work in information technology, even if not in disaster-response circles, this no doubt makes you shudder. How is the raw volume of communications proof of anything? What is the baseline for the same people tweeting about their lives before Sandy? The value is in what could be processed and utilized, and for this the real numbers come later in the article:

“All together, the city [of NY] has more than 200 staff doing social in English and Spanish. They sent out more than 2,000 tweets between Oct. 26 and Nov 7.”

It sounds impressive until you realize that’s only about 150 tweets per day among the 200 staff. In other words, New York City sent fewer than one tweet per day per social media staff member. How is this even news? It tells us that engaging in social media in this way was not a significantly large task. It also tells us that Fast Company cannot read their own numbers.

The truth is that despite the press and the hundreds of thousands, perhaps millions, of dollars invested, remote digital volunteerism has generally failed to have a significant impact on disaster response, with only a very few exceptions. Of the money that has gone into digital volunteer initiatives, typically less than 10% has returned as actual value.

Crowdsourced information processing is predominantly paid work, with at least 99% of crowdsourced workers (microtaskers) receiving compensation for their work. The work is valued for being cost-effective and efficient. Unless you are manually processing tens of thousands of data points per day, it will cost more to manage volunteers than to pay professional microtaskers. Even worse, well-meaning remote digital volunteerism can disrupt the burgeoning information economies within a region and ultimately result in a net deficit by limiting the potential for ongoing digital work.

This is something that I talk about regularly in humanitarian circles. This article is a summary of talks that I gave earlier this year at the World Bank’s ‘Understanding Risk’ conference in Cape Town, at the recent Wilson Center Roundtable on Connecting Grassroots to Government for Disaster Management, and at Tulane University’s Disaster Resilience Leadership Academy, where I helped establish the ‘Crisis Informatics and Analytics’ course in their Masters program. I owe thanks to discussions with a number of people, especially John Crowley and Jennifer Chan at the Harvard Humanitarian Initiative, Jeannie Stamberger and Jessica Ports of Tulane, Shadrock Roberts of USAID, Schuyler Erle and Kate Chapman of the Humanitarian OpenStreetMap Team, and Jeffrey Villaveces of the United Nations Office for the Coordination of Humanitarian Affairs.

Contrary to what media organizations like Fast Company tell you, there have not yet been any large crowdsourcing deployments in disaster-response contexts, and some of the largest to date have paid the workers. These past paid deployments give us a convenient baseline for working out the appropriate cost. Translation comes to about $0.30 per sentence, categorization to about $0.05 per report, and geolocation to about $0.20 per report. These numbers can be found in ‘Crowdsourcing and the Crisis-affected Community’ in the journal Information Retrieval, and they could be applied to processing communications/reports in any scenario.
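As a sketch, these per-unit rates make it straightforward to estimate what a paid deployment would cost. The rates are the ones quoted above; the deployment size in the example is hypothetical:

```python
# Per-unit rates from past paid deployments (quoted above).
RATES = {
    "translation": 0.30,     # per sentence
    "categorization": 0.05,  # per report
    "geolocation": 0.20,     # per report
}

def deployment_cost(volumes):
    """Estimate the cost of paying microtaskers for a deployment.

    `volumes` maps a task type to the number of units processed.
    """
    return sum(RATES[task] * n for task, n in volumes.items())

# Hypothetical deployment: 10,000 reports categorized and geolocated.
print(deployment_cost({"categorization": 10_000, "geolocation": 10_000}))
# 2500.0
```

Even at this scale, the total is small compared with the cost of a full-time volunteer coordinator.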

We can take these numbers and apply them to past remote volunteer initiatives in order to calculate the amount saved by not paying the workers. For those that also used paid crowdsourced workers, we can also estimate the economic impact of the deployment by the total amount of wages that have subsequently been paid to these workers:

Volunteer initiative       Volume    Gross Value   Economic impact
Mission 4636               80,000    $23,000       $250,000+
Ushahidi Haiti              3,400       $850
Chile Earthquake Map        1,200       $300
Pakreport                   1,500       $600       $100,000+
Alabama Recovery Map          355        $89
Oil Spill Crisis Map        3,400       $850
Saskatchewan Flood Map        240        $60
Queensland Flood Map          500       $125
Libya Crisis Map            2,500       $600
Sinsai (Japan)             11,500     $3,020

Money saved by digital volunteers / economic impact
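The total across these deployments can be checked directly by summing the Gross Value column:

```python
# Gross value of unpaid information processing per deployment (from the table).
gross_values = {
    "Mission 4636": 23_000,
    "Ushahidi Haiti": 850,
    "Chile Earthquake Map": 300,
    "Pakreport": 600,
    "Alabama Recovery Map": 89,
    "Oil Spill Crisis Map": 850,
    "Saskatchewan Flood Map": 60,
    "Queensland Flood Map": 125,
    "Libya Crisis Map": 600,
    "Sinsai (Japan)": 3_020,
}

total = sum(gross_values.values())
print(total)  # 29494 -- roughly $30,000 over three years
```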

The deployments above are listed in more or less chronological order from the beginning of 2010. In three years, this totals about $30,000 worth of unpaid information processing. These numbers should be taken independently of the less quantifiable impacts of the humanitarian work itself. For example, Mission 4636 was primarily a translation platform between Haitian Kreyol and English at a time when translators were few and far between. I remember connecting translators in Canada with relief workers inside logistics bases in Port-au-Prince, where it was easier for the relief workers to connect with a translator over the internet than to engage potential translators just a few metres away outside the gates of the base. It is hard to put a price on the importance of enabling communication in these contexts. However, the remote translators could just as easily have been paid, enabling the same communication and impact, so the comparison holds. This is ultimately what happened with Mission 4636, which transferred to paid workers within Haiti at an organization called FATEM.

As with FATEM in Haiti, Pakreport resulted in an injection of money into the information economy. Pakreport was deployed following the floods in Pakistan in 2010 and resulted in the permanent employment of workers through an organization called BrightSpyre. For FATEM and BrightSpyre, the workers who were originally employed in humanitarian contexts now work as professional microtaskers, bringing in aggregate salaries that far exceed the value of the information processing during the disasters.

Of the volunteer components of the deployments above, only Mission 4636 and possibly Sinsai created more value than they cost to run. Some were particularly expensive. For example, the Libya Crisis Map was run by the UN and relied on volunteers supervised by a full-time manager. After four weeks the volunteers were dwindling, so the UN put three people on back-to-back eight-hour shifts, a burn-out schedule, to ensure that someone was always staffing the incoming information. It is likely that the UN paid more than $5,000 to coordinate a $600 volunteer effort. Had they used paid, professional microtaskers, the UN would have had more funds to apply to other aspects of the response.


The lesson not learned

Given the numbers above, I estimate that for every $1 spent on remote digital volunteers for humanitarian work, only $0.10 goes into the actual work, with the rest going on internal expenses, primarily management (donations to the above initiatives total about $300,000). Wyclef Jean’s recently bankrupt charity, Yele, has been heavily criticized for spending 50% of its money on internal expenses and is currently being investigated for corruption. If 90% of the money spent on digital volunteerism goes on internal expenses, why aren’t digital volunteer networks being investigated for corruption? For deployments worth only a few hundred dollars, it simply doesn’t matter. It is not wrong to run a small, inefficient deployment using crowdsourcing – an information-processing strategy that would have been new to many people. It would only be unethical if the inefficiency of remote volunteerism were scaled to larger numbers with no place for paid workers.
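The 90% overhead estimate follows directly from the two figures above: about $300,000 in donations against about $30,000 worth of information processing delivered:

```python
donations = 300_000   # approximate total donations to the initiatives above
work_value = 30_000   # approximate value of the information processing delivered

efficiency = work_value / donations
print(f"{efficiency:.0%} to the work, {1 - efficiency:.0%} to internal expenses")
# 10% to the work, 90% to internal expenses
```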

Since 2010, there have not been any digital humanitarian initiatives that have resulted in paid ongoing employment for crowdsourced workers. There are worrying steps in the direction of expanding remote volunteer networks (sometimes known as the ‘volunteer and technical community’ in humanitarian circles). We have seen enough deployments now to safely say that this model of volunteer information processing does not work, and that humanitarian organizations should utilize more conventional paid crowdsourced workers.

Remote volunteers and the textile industry

So what should a remote volunteer do to avoid wasting 90% of their efforts, but still contribute? The answer is to work on someone else’s crowdsourced task.

There is an analogous situation in the textile industry. When you donate clothing to charity, it does not always end up in the less resourced parts of the world. Many charities have realized that they are having a long-term negative impact on the textile industries of countries when they dump large amounts of donated clothes on the market. Even a temporary disruption can drive people out of the industry on a permanent basis. So when you donate your clothes, the charity sells them in your own country and it is the money from the sale that makes it to the less resourced parts of the world.

The same is true for information processing. If someone within a crisis-affected region can be paid for information processing at less cost than managing a remote volunteer, then there is no need for a remote worker to disrupt the local information economy. Flooding these tasks with free remote labor can prevent organizations like FATEM and BrightSpyre from emerging.

If you are remote to a disaster and have no special local knowledge or connections, the economics come out in favor of you undertaking completely unrelated work for $5-$6 per hour on crowdsourcing/microtasking platforms like Amazon’s Mechanical Turk and simply sending this on to organizations within the region.
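The trade-off can be sketched with rough numbers. The hourly wage is the Mechanical Turk range quoted above and the per-report rate comes from the paid deployments; the volunteer throughput of 40 reports per hour is purely an illustrative assumption:

```python
# Sketch of the trade-off for a remote volunteer with no local knowledge.
mturk_wage = 5.50        # $/hour, midpoint of the $5-$6 range quoted above
reports_per_hour = 40    # assumed volunteer throughput (illustrative)
rate_per_report = 0.05   # categorization rate from past paid deployments

value_volunteering = reports_per_hour * rate_per_report  # value of an hour of volunteering
cash_donated = mturk_wage                                # cash sent on from an hour of paid microtasking

print(f"${value_volunteering:.2f}/hour of processing vs ${cash_donated:.2f}/hour donated")
```

Under these assumptions, an hour of paid microtasking donated as cash is worth more than twice an hour of volunteered processing, before even counting the cost of coordinating the volunteer.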

Willing workers

I wrote this article while completing some calculations on the volunteer response to Hurricane Sandy. In this case, the volunteers were completing damage assessments from aerial photographs. The actual report is not yet public, which is why I did not include Sandy among the examples above, but I wanted to conclude with words from the potential workforce. I surveyed 20 professional microtaskers on their willingness to perform disaster-assessment tasks for compensation. All were interested; here are a couple of the responses:

“I think $2.00 is fair for moderating 100 pictures with a single criterion. I am taking into consideration the nature of the task and the organization making the request because I would price it higher for a normal business requester.”

“You are being fair with the given price range. As long as task is easy enough to complete several in a timeframe to equal 5-6 dollars an hour … I do this work as a necessary PT job, so pay does matter!”

If we can create employment for these people and save the humanitarian community money, then I think that there is a strong argument to reverse the recent trend towards non-professional volunteers and return to systems that aim to support professional information services within the disaster-affected communities.

Rob Munro
November 2012

EDIT: added the comic, HT to George Chamales

3 Responses to “Volunteerism vs Professionalism for remote humanitarian work”

  1. Rich Says:

    I believe you have made a common mistake with this terminology.
    We should either compare Paid vs Volunteer OR Professional vs Amateur.
    I have worked with many volunteers who were, if anything, more professional than the paid members in the effort.

  2. Rob Says:

    Thank you Rich, yes the most professional people that I have worked with have also been volunteers, including among the examples I listed here. I didn’t mean to imply that the quality of volunteers is necessarily worse.
    But ‘paid’ is not quite right either, as it doesn’t necessarily produce more quality, just more reliability, and by ‘professional’ I mean professional microtasker, but potentially still an amateur humanitarian worker.
    Perhaps the most common mistake I made was to try to distill it into a binary distinction.

  3. Rob Says:

    Update. Here’s the report about how we used crowdsourcing to support FEMA during Sandy:
    http://idibon.com/crowdsourced-hurricane-sandy-response/

    While we were able to help, note that the volunteers weren’t able to complete all the assessments. Despite the media, they still mostly disappeared after a day or so.
