Over the past year, Internews has set up several crowdsourced mapping projects, working in different countries to match information collected through traditional media channels with information collected from the so-called crowd. These projects have been implemented in different types of environments: sometimes in humanitarian contexts, sometimes in election contexts, and sometimes in transparency and accountability projects.
Today, we are seeing these verification techniques merge with computing-based systems, mechanical turks, and automated systems, where the amount of time and human resources needed to implement verification protocols is shrinking and verification itself is becoming cheaper. These new systems are getting us closer to the truth.
Big Data for Media Development
Big Data and Data Verification
What does big data mean to Internews?
What is Big Data?
In information technology, big data is a collection of data sets so large and complex that they become difficult to process using "normal" database management tools or traditional data processing applications. The challenges of processing this data include how to capture, curate, store, search, share, analyze, and visualize it. The appeal of larger data sets is the additional information available from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to spot trends, determine the quality of research, run preventive analysis, model risks, and determine real-time impacts.
Social Media ID
Different techniques to verify
Separating trusted and untrusted sources
Trusted sources verify untrusted sources
Verification protocols and techniques
Putting the burden of the verification process
on the user
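The trusted/untrusted separation above can be sketched as a simple protocol: reports from trusted sources are accepted directly, while reports from untrusted sources are held until a trusted source confirms them. This is a minimal illustration, not Internews' actual tooling; the source names and function names are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical trusted-source list; in a real deployment this would be
# curated by the project team on the ground.
TRUSTED = {"field_reporter", "local_radio"}

@dataclass
class Report:
    content: str
    source: str
    confirmed_by: set = field(default_factory=set)

def needs_verification(report: Report) -> bool:
    # Reports from trusted sources skip the verification queue.
    return report.source not in TRUSTED

def verified(report: Report) -> bool:
    # A report counts as verified if it came from a trusted source
    # or has been confirmed by at least one trusted source.
    return report.source in TRUSTED or bool(report.confirmed_by)

def confirm(report: Report, verifier: str) -> bool:
    # Only a trusted source can confirm an untrusted report;
    # confirmations from untrusted verifiers are ignored.
    if verifier in TRUSTED:
        report.confirmed_by.add(verifier)
    return verified(report)
```

Under this sketch, an anonymous SMS report stays unpublished until, say, a local radio partner confirms it, which is one way of shifting part of the verification burden onto trusted users rather than the central team.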
Falsification is always possible…
…but so is verification!