Crowdmapping & Verification Hanoi Workshop
This is my presentation from a talk I gave at the Vietnam Journalists Association, sponsored by the US State Department and the Broadcasting Board of Governors on Crowdmapping in Hanoi, Vietnam from April 3-5, 2013.

Notes
  • Before we really dive into this presentation, I would like to tell you a story about a scientist over one hundred years ago.
  • This scientist, Sir Francis Galton, was a bit of a polymath, as scientists were in those days. He studied everything from anthropology to meteorology and statistics. He was at the forefront of the human sciences, applying statistics to genetic traits and collecting genealogical information in a scientific manner. Through all of his research, he concluded that some people were both mentally and physically superior to others. This led him to found the field of eugenics, a science that attempts to “improve” the human race through selective breeding: taking two individuals who have desirable traits and having them reproduce to create a superior individual. It’s this line of thinking that led to some pretty inhumane and brutal policies in much of the world, including eugenics programs in the US, Nazi Germany and Japan. His core belief was that the masses are stupid and uneducated; they don’t have the brains, skills or abilities to help themselves, and must be saved by a superior class of individuals.
  • In 1906, Sir Francis Galton went to a county fair, the West of England Fat Stock and Poultry Exhibition. I’m not sure if there is anything similar in Vietnam, but this is a gathering of people from all over the region. They bring their most impressive livestock: huge hogs, fat hens and all sorts of other farm animals. It’s good fun and a spectacle for everyone. Why would a scientist like Sir Francis Galton, who at the time was 85 years old, be interested in the fair? He was most interested in seeing his science in action. The idea is that this livestock was a product of selective breeding. Farmers picked the best of the best and bred them to produce the most meat, give the most milk or lay the most eggs.
  • It was here that Galton came across a competition. A man was selling tickets to guess the weight of an ox. For a small amount, anybody could buy a slip of paper, write down how much they thought the ox weighed and enter for consideration. The person with the closest estimate would be awarded a hefty cash prize. All in all, 800 people entered the competition. These people had a wide range of backgrounds: the obvious butchers and ranchers who had some experience estimating weight, but mostly inexperienced entrants, caretakers, shop owners, regular individuals with no experience judging the weight of livestock. As it turns out, this was an interesting experiment in democracy and a prime example of crowdsourcing. Galton didn’t exactly trust democracy. If only a small group of individuals has the capacity to lead, make intelligent decisions and move society forward, how could the unintelligent group come up with the answer? Galton wanted to verify his thinking and asked the contest organizers if he could have the tickets that people had filled out after the tallying was complete. The organizers obliged and handed over the tickets. Galton was assuming that the collective guess of this group of mostly “dumb” individuals would be way off. He applied different statistical measures to the sample, and the collective guess came out to be 1,197 lbs. What was the actual weight of the ox? 1,198 lbs. The crowd produced an almost perfect estimate of the weight. This, one of the first scientific looks at crowdsourcing, showed that the power of many, regardless of intelligence, can produce valid results.
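The arithmetic behind the anecdote is easy to demonstrate. Here is a minimal simulation (with made-up numbers, not Galton’s actual tickets): many noisy individual guesses, once averaged, land far closer to the truth than almost any single guesser.

```python
import random

random.seed(42)  # reproducible demo

TRUE_WEIGHT = 1198  # lbs, the ox's actual weight

# 800 entrants, each guessing with a lot of individual error
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(800)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_WEIGHT)
worst_individual_error = max(abs(g - TRUE_WEIGHT) for g in guesses)

print(f"Crowd estimate: {crowd_estimate:.0f} lbs (off by {crowd_error:.1f})")
print(f"Worst individual guess was off by {worst_individual_error:.0f} lbs")
```

The averaging cancels individual errors: the standard error of the mean shrinks with the square root of the number of entrants, which is why 800 rough guesses can beat nearly every single guesser.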
  • Sir Francis Galton changed his tone a bit after this experiment. It showed evidence that there is something to collective wisdom, that the sum of a group may be more powerful than any one individual. He said, “The result seems more creditable to the trustworthiness of a democratic judgement than might have been expected.”
  • Now, I can’t take credit for researching this well known anecdote. If you’re interested in that story and finding out a bit more about crowdsourcing, I would recommend picking up “The Wisdom of Crowds” which is conveniently translated into Vietnamese. It really does serve as a great introduction to crowdsourcing in general and crowdmapping, which I want to cover a bit more this hour.
  • As we can see from the example of Sir Francis Galton, crowdsourcing isn’t anything new. In fact, it’s been used to solve all sorts of problems since long before there was a term for it. We have seen this in open entry contests, where brands and governments have thrown out a challenge for the masses to solve. An early example is Planters Peanuts, a peanut brand in the US, which asked the public to design its logo. Perhaps a better example is the Longitude Prize. In the 1700s, sailors could calculate their latitude (how far north/south) easily by measuring where the sun was in the sky at noon. Figuring out longitude (how far east/west) was still incredibly difficult. So the British government offered money to anyone who could help solve this challenging problem. The Oxford English Dictionary covered a breadth of material that was too great for its staff to comb through, fact-check and provide context for. They utilized an army of volunteers to go over all the books, adding rich information and historical context to help better define all the words in the English language. There were mistakes, but overall it was a huge achievement. A number of research projects in the 20th century had people collecting information about their lives to study human habits. Another amazing project that seems ahead of its time essentially used mathematicians as a distributed computing machine. Starting in 1938, people were paid to work on difficult math problems to create huge tables of solutions. This reference material materialized as the Handbook of Mathematical Functions, which invariably saved future mathematicians and engineers countless hours of mundane problem solving. Finally, something I really enjoy is the Lonely Planet guides, a series of travel guides that cover the planet. Starting in the 70s, they collected and published recommendations and travel advice from travelers all over the globe. The Zagat Survey did something similar, collecting reviews and scoring restaurants starting in the late 70s.
  • Wikipedia is the quintessential example of crowdsourcing: people from all over the world come together to write and modify articles. Websites like Threadless utilize crowdvoting to select user-submitted T-shirt designs for purchase. Amazon Mechanical Turk is the prime example of “turksourcing”, where individuals perform very small pieces of a larger puzzle to solve problems. Kickstarter kicked off a crowdfunding revolution, where anybody can contribute money to make a project happen. And Ushahidi began to popularize crowdmapping, an easy way to collect information and visualize it on a map.
  • The Ushahidi Platform is a tool that was originally developed overnight, during a major man-made crisis, to map violence. It quickly gained recognition as a valuable tool, was open sourced, and was set up to work with SMS, smartphones and social media.
  • This was all made by Africans for a problem happening in Kenya.
  • This is a screenshot of what the very first deployment of Ushahidi looked like at the height of the election violence in January 2008. The two top presidential candidates were accusing each other of election fraud, causing all sorts of problems on the ground, where members of different tribes were attacking each other. A few bloggers got together and set this up. With this map, people were able to submit a report to the system via SMS or the web, showing what they were seeing. They were actually providing their “ushahidi”, which means testimony or witness in the Swahili language. In fact, this is how I got involved and came to work with Ushahidi. I was a Peace Corps volunteer in Kenya and left the year before the violence broke out. I was able to use this tool to get the most up-to-date information on what was happening on the ground. Traditional media proved almost worthless for getting out the most up-to-date information.
  • Since Ushahidi became open source, we’ve seen it adopted all over the globe, from tracking blizzards to natural disasters like floods, giving citizens a voice in their local communities, tracking health issues and major disasters like the Haiti earthquake. I want to show a video that touches on some of the efforts of the team behind one of the more popular Ushahidi maps a few years ago when Port-au-Prince in Haiti was hit by a massive earthquake. You will see another project mentioned called Open Street Map that I will touch on later. Apologies for the lack of Vietnamese translations, it’s only a few minutes long.
  • Crowdmap is the free, hosted version of Ushahidi that virtually anyone can use to start their own map with little mapping experience. We’ll be walking through that together a bit later. Anyone can come to Crowdmap.com and register their own map, which only takes a few minutes. You do sacrifice some customization by hosting your map on our servers, but the tradeoff is that you don’t have to worry about the nuts and bolts of running a website. All Crowdmap traffic is SSL encrypted, meaning that anyone trying to intercept the data flowing between Crowdmap and you or your users will find it very difficult to decipher.
  • How do Ushahidi and Crowdmap help us track health? We do have one great example of an Ushahidi map that tracked one of the worst outbreaks of our time. On April 1st, 2010, there was the great...
  • Zombie Outbreak! The Ushahidi community quickly scrambled to form the Zombie Volunteer Task Force whose mission was to track the spread of the Zombie infection.
  • It all started at a hospital in Kampala, Uganda, where an infected patient turned into a man-eating zombie, and things quickly went downhill from there. The infection quickly spread throughout the western world, probably thanks to an unsuspecting aid worker bringing the virus to Europe and the United States.
  • I checked zombiereports.com before coming here and found that Vietnam was spared from the outbreak. It looks like there was only a single infection reported in the Philippines. In all seriousness, this was our April Fools’ joke a couple of years ago, but it’s proven to be a good example of a crowdsourced map for health. Even the CDC used the zombie example in some of their health campaigns.
  • One of our first real attempts at using Ushahidi for tracking something health related was to track the H1N1 Swine Flu outbreak in Spring, 2009. This was a mix of media monitoring and crowdsourcing information. It created a pretty rich map showing regions at a high level that were experiencing issues.
  • Stop Stockouts was another map, this time done by a community member who was tracking pharmacies and hospitals that were running out of medications. This map was to show what places were most often having these stockouts and needed more attention.
  • KANCO, the Kenya AIDS NGOs Consortium, was using the platform to map HIV and AIDS organizations around Kenya, both as a directory and to better provide support to them.
  • There are a number of other projects that have used these tools, like the Honduras Health Map, tracking health issues in Honduras; the Recortes en Sanidad map in Spain, covering public health in Barcelona; and the Qiantang River Water Map, tracking pollution, water sanitation and their impact on health in China.
  • So now that we have a little background in crowdsourcing and mapping, let’s step through the way information can flow into your map.
  • All information flows from various channels into your admin panel. These sources can be reports submitted directly on the homepage of your map, news sources, SMS, smartphones, Twitter and email. We’ll cover a lot of this in our hands-on session, so rather than showing you how to hook all of this up right now, I’ll get into some of the higher-level concepts of what reports are.
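As a rough sketch of that flow, the snippet below funnels messages from several channels into one common admin queue. The `Report` structure, `normalize` helper and channel names are illustrative assumptions, not the actual Ushahidi code:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Report:
    """One incoming report, whatever channel it arrived on."""
    channel: str      # e.g. "sms", "web", "twitter", "email"
    sender: str
    text: str
    received_at: datetime

def normalize(channel: str, sender: str, text: str) -> Report:
    """Funnel a message from any channel into the common admin queue."""
    return Report(channel=channel, sender=sender,
                  text=text.strip(), received_at=datetime.now(timezone.utc))

# Everything lands in the same queue, regardless of source.
admin_queue = [
    normalize("sms", "+84901234567", "Flooding near the market "),
    normalize("web", "anonymous", "Road blocked on Route 5"),
]

for report in admin_queue:
    print(f"[{report.channel}] {report.sender}: {report.text}")
```

The point of the design is that once a message is normalized, everything downstream (approval, verification, mapping) can treat all channels identically.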
  • When a visitor comes to your map, they will be presented with a map view. Approved reports are represented with red circles on the map. They are also categorized and can be filtered using the selector on the right-hand side.
  • Visitors also have the option to view your reports as a list, with some options to filter content on the right-hand side, like showing only reports with photos or only reports within a radius of a point.
  • An individual report will look something like this, with a map to find nearby reports, links to images, report text and commenting. Now is a good time to mention that reports on your map have two very important properties.
  • Every report carries two independent properties: approval and verification. If you have a report and you want to keep it private, either because you’re still working on it or it’s not good enough for your map, you can leave it unapproved. If it’s unapproved, only administrators will be able to access it, from the admin section of the site. Reports are also marked verified or unverified. This doesn’t impact whether or not visitors can see the report; however, the report will carry a little badge. If it’s verified, the badge will be green, saying it’s verified. If it’s not, it will be red, noting it’s unverified. Sometimes it’s important to show people unverified information, depending on the situation.
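A minimal sketch of those two properties, assuming a simplified report model (not the real Ushahidi schema): approval controls visibility, while verification only controls the badge.

```python
from dataclasses import dataclass

@dataclass
class Report:
    """Simplified report carrying the two independent flags."""
    title: str
    approved: bool = False   # unapproved reports stay admin-only
    verified: bool = False   # visible either way, but badged

    def badge(self) -> str:
        # Verification only changes the badge, never visibility.
        return "green (verified)" if self.verified else "red (unverified)"

reports = [
    Report("Clinic out of medicine", approved=True, verified=True),
    Report("Rumored road closure", approved=True, verified=False),
    Report("Draft: still checking details"),  # unapproved
]

# Only approved reports ever reach the public map.
public = [r for r in reports if r.approved]
for r in public:
    print(f"{r.title} -> {r.badge()}")
```

Keeping the two flags independent is what lets a map show urgent-but-unconfirmed reports to the public while clearly labeling them.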
  • One of the most common questions I’m asked when talking about these crowdsourced projects is how to verify that all information on a map is 100% correct. Crowdsourcing, by its very nature, isn’t going to be 100% correct. The idea is to get a bigger-picture view of what’s happening. That’s why we have the “verified/unverified” property for all reports. Ultimately, it’s up to you to decide how you want to do verification. Think of Ushahidi as a hammer: it’s a tool, and we’re not going to tell you how to hold it. Maps are used for a wide range of purposes, from mapping natural disasters to mapping cheeseburgers, so the level of intensity and the methods necessary to verify information will vary greatly.
  • I’m going to borrow heavily from my old colleague Patrick Meier and his study of the different ways social media has been verified across five different projects. We’ll be looking at four of them. If you follow the link above, you will find the study, in case you wish to dive into this a bit deeper.
  • During the Arab Spring, Andy Carvin of National Public Radio in the US gained notoriety for his approach in gathering information. He was monitoring Twitter for reports and doing his best to find the valid bits of information to be shared with his wider audience. He would find an interesting piece of information and then respond directly to the original poster asking for clarifying information. He would also get information from other trusted sources and the crowd in general by tweeting out questions like, “How unusual is this?” and seeing if anyone else could provide context. Carvin was able to use this method to verify that photos of mortars used by Gaddafi were not made and supplied by Israel when the wider media continued reporting that Israeli armaments were being used.
  • In mid-2010, areas of Kyrgyzstan were experiencing widespread violence. Rumors were being passed around about the number of dead or displaced, so it was very difficult to verify anything. A local group started a Skype chat, which is generally invite-only. The group ballooned to over 2,000 people, all linked to each other through invitations to join, so nobody there was anonymous. If a report came in, they could easily ask in the chat room whether someone in or close to a given area could verify the information. Another resource they used was the local telecom. If someone was sending reports of violence via SMS, they would simply ask the telecom whether the SMS message was coming from the area the sender claimed to be reporting from. Most of the time, the sender was in a different place entirely.
  • The BBC has an entire team devoted to verifying social media reports. They actively monitor sources to find interesting information. One of their primary sources is comment sections on their own websites. They utilize some of the same techniques as Carvin did during the Arab Spring. Also, they utilize the resources of the BBC, like contacting the Persian Service when getting reports out of Iran.
  • The Standby Volunteer Task Force is a group of trained crowdsourcers and crowdmappers who support projects in need by setting up maps, SMS, social media workflows, media monitoring, the works. Their methods utilize a two-step process. If the reporter is a trusted reporter or source, then they may mark the reports as valid, again depending on the situation. If the source isn’t trusted or known, they will use techniques to verify that the source is legitimate. The first step is checking whether the profile of that user is complete, with a profile picture, bio and historical usage. Next, verifiers search around the Internet for the same username or name to see if there are other profiles matching the same description. Then they check the user’s followers to see if they seem legitimate. Finally, the content itself is checked: does it seem odd, do attached images appear to be in the location they claim, does the weather in that region seem appropriate, and so on.
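That two-step process can be sketched as a simple checklist. The field names, thresholds and scoring below are illustrative assumptions, not the actual SBTF tooling:

```python
def source_looks_legitimate(profile: dict) -> bool:
    """Step 1: authenticate the source via profile-completeness signals."""
    checks = [
        bool(profile.get("photo")),               # has a profile picture
        bool(profile.get("bio")),                 # has a bio
        profile.get("post_history", 0) > 10,      # real historical usage
        profile.get("matching_profiles", 0) > 0,  # same identity found elsewhere
        profile.get("followers_legit", False),    # followers look real
    ]
    return sum(checks) >= 4  # most signals must pass

def content_looks_plausible(report: dict) -> bool:
    """Step 2: triangulate the content itself."""
    return (report.get("image_matches_location", False)
            and report.get("weather_consistent", False))

def triage(trusted: bool, profile: dict, report: dict) -> str:
    """Trusted sources may be marked valid directly; others pass both steps."""
    if trusted:
        return "verified"
    if source_looks_legitimate(profile) and content_looks_plausible(report):
        return "verified"
    return "unverified"

profile = {"photo": "url", "bio": "Local journalist", "post_history": 120,
           "matching_profiles": 2, "followers_legit": True}
report = {"image_matches_location": True, "weather_consistent": True}
print(triage(trusted=False, profile=profile, report=report))
```

A report that fails either step simply stays marked unverified; as noted earlier, that doesn’t necessarily hide it from the map.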
  • I would just like to reiterate that verification is not 100% perfect. The idea here is to get a bigger picture and to possibly do deeper dives on individual pieces of content or reports into your map.
  • Ushahidi isn’t alone in crowdmapping. Without the help of other open source communities, Ushahidi and Crowdmap wouldn’t be where they are today.
  • We see a lot of combined uses of different tools to create these mapping projects. What you see here is the Voice of Kibera map. Map Kibera, part of the same project, trained citizens living in Kibera, one of the larger slums in Africa, to map their community. Before this, Kibera was a blank, grey void on the map. If you didn’t know about it already, you would think the community simply didn’t exist. It didn’t exist on Google Maps; it didn’t exist on any accessible mapping service on the Internet.
  • So, the community mapped their neighborhood, creating the points of interest that were most important to the people who lived there. On their map, you’ll find churches and mosques, public toilets, schools, clinics and local businesses. Kibera was now literally on the map. With this detailed base map in place, the Voice of Kibera map was able to be born.
  • This site, run by many of the same community members, has been able to support citizen reporting on what matters most. People can submit information using an SMS shortcode about virtually anything, from emergencies to where to find the next football match. It’s really cool to see all of this come together.
  • The Humanitarian OpenStreetMap Team is a volunteer organization that works with OpenStreetMap, the same tool that was used to map Kibera. They run trainings and crowdsourced mapping projects in areas that have very little map coverage. Without their help, a lot of maps wouldn’t be very useful.
  • Thank you so much for letting me give you this introduction to crowdsourcing and some of the things we do with Ushahidi and Crowdmap.
Transcript

    • 1. Crowdmapping with Ushahidi: Reports and Verification. Brian Herbert, Director of Crowdmap, @brianherbert. Disaster Preparedness and Influenza Workshop, Hanoi, Vietnam, April 2013
    • 2. Sir Francis Galton • Elitist, believing strongly that some people were both mentally and physically superior to others. • Thought we should make “enhanced” humans by essentially speeding up evolution.
    • 3. Title• A bullet.
    • 4. Sir Francis Galton: "The result seems more creditable to the trustworthiness of a democratic judgment than might have been expected."
    • 5. Crowdsourcing Before Crowdsourcing • Open entry contests • Refining reference materials • Surveying for research • Solving math problems • Compiling reviews and guides
    • 6. Crowdsourcing Online • Wikipedia • Threadless • Amazon Mechanical Turk • Kickstarter • Ushahidi & Crowdmap
    • 7. The Ushahidi Platform • A tool developed in the heat of the moment, during a major man-made crisis • Quickly became an open source project with a developer community made up of hundreds of individuals • A tool that spans virtually all platforms from SMS, smartphone, web and social media
    • 9. CNN Video clip on Haiti with OSM and Ushahidi
    • 10. Crowdmap • Crowdmap is the free, hosted Ushahidi in the cloud. • Set up in minutes. • Get your own subdomain like mymap.crowdmap.com • SSL Encrypted
    • 11. Health
    • 12. Zombie Outbreak
    • 13. Vietnam Spared
    • 14. Reports and Verification
    • 15. Reports on the Map
    • 16. Reports in a List
    • 17. Individual Report
    • 18. Approval and Verification • Reports can be approved or unapproved • Reports can be verified or unverified
    • 19. Verification: Verification methods depend on the purpose of your map
    • 20. Dr. Meier’s Case Studies on Verifying Information http://irevolution.net/2011/11/29/information-foren
    • 21. Andy Carvin and the Arab Spring • Primarily used Twitter • Burden of proof on original source • Disproved reports of Israeli mortars being used • Used traditional journalism techniques to verify information, just publicly
    • 22. Kyrgyzstan Rumors • Skype was used to verify information • 2,000+ people using the chatroom • Connections with telecom to check geolocation of SMS
    • 23. BBC • User-Generated Content Hub • Actively scour the web • Communicate with posters directly • Verify claims with local news sources
    • 24. Standby Volunteer Task Force • First major deployment was a UN map in Libya • Two step process: • Authentication of source as valid • Triangulation of content as valid • Social media profiles should be complete and traceable across the web • Double check content
    • 25. Verification is not Perfect
    • 26. Mapping Community
    • 27. voiceofkibera.org
    • 28. voiceofkibera.org
    • 29. Humanitarian OpenStreetMap Team
    • 30. Cám Ơn Rất Nhiều! (Thank You Very Much!)