
Crowdsourcing Carries Careful Accuracy


 

We love crowdsourcing. It is a way to quantify and analyze the real-time behaviors of human beings, because people themselves are the data points. Crowdsourcing is one part of what we call “the human sensor”: the idea that people are the best source of information for improving the conditions around them. People knowing what people want makes sense, right? That is why crowdsourcing is such a large part of the human sensor.

 

Crowdsourcing is already a massive trend. These numbers from eYeka highlight the reach and influence the practice already has:

  • 85% of the 2014 Best Global Brands have used crowdsourcing in the last 10 years.
  • The Best Global Brands are three times more likely to use crowdsourcing platforms than websites and social media for their crowdsourcing efforts.
  • Fast-Moving Consumer Goods (FMCG) companies increased their crowdsourcing investment by 48% in 2014 compared to 2013.

So crowdsourcing is a practice that can reach into any industry, but we need to ask ourselves this: how accurate is crowdsourcing? Do the numbers actually give us a true sense of the human behaviors behind them? Let’s dive in and find out.

 

Our neighbors across the pond found that crowdsourced data collection in urban areas is extremely accurate. Co-authored by researchers at University College London and the National University of Ireland (Maynooth), this study found that the only sources of “crowdsourcing error” were geographic, lexical, and amenity errors when applied to a large urban environment. Even more noteworthy: those error rates were low, and the source accuracy, that is, the people living in these environments, was very strong.
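The study reports these error categories in prose; purely as an illustration of what “geographic error” can mean in practice, here is a minimal Python sketch that measures the distance between a crowdsourced point and a surveyed reference point. The coordinates and function below are our own hypothetical examples, not data from the paper:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical pairs of (crowdsourced, reference) coordinates.
submissions = [
    ((51.5246, -0.1340), (51.5248, -0.1339)),
    ((51.5220, -0.1300), (51.5219, -0.1312)),
]
errors = [haversine_m(c[0], c[1], g[0], g[1]) for c, g in submissions]
print(f"mean geographic error: {sum(errors) / len(errors):.1f} m")
```

Averaging that distance across all submissions gives a single number for how far off the crowd is, which is the kind of metric a study like this can report as “low geographic error.”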

 

However, we need to ask ourselves the following question: what can we do to ensure that our own crowdsourcing efforts are accurate? The study above was carried out by a team of research professionals, so what can other institutions do to make sure their data is correct?

 

Lisa Beckers, a guest contributor for crowdsourcing.org, penned an article highlighting multiple ways to quality-assure your crowdsourcing efforts, including manual and algorithmic methods. In plain English, that means peer review and tech-enhanced filters that weed out fake responses, duplicate responses, and other bad data.
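Beckers describes these filters in prose rather than code. As a rough sketch of what an algorithmic filter might look like (the field names, length threshold, and sample data below are our own assumptions, not hers), here is a short Python example that drops duplicate and junk submissions, then applies a simple majority vote:

```python
from collections import Counter

def clean_responses(responses, min_len=3):
    """Drop exact duplicate submissions and suspiciously short answers,
    then keep the majority answer per question as a consensus filter."""
    seen = set()
    by_question = {}
    for worker_id, question_id, answer in responses:
        answer = answer.strip().lower()
        key = (worker_id, question_id, answer)
        if key in seen or len(answer) < min_len:
            continue  # duplicate submission, or too short to be meaningful
        seen.add(key)
        by_question.setdefault(question_id, []).append(answer)
    # Majority vote: the most common surviving answer wins per question.
    return {q: Counter(ans).most_common(1)[0][0]
            for q, ans in by_question.items()}

# Hypothetical raw submissions: (worker_id, question_id, answer).
raw = [
    ("w1", "q1", "coffee shop"),
    ("w1", "q1", "coffee shop"),  # duplicate from the same worker
    ("w2", "q1", "coffee shop"),
    ("w3", "q1", "x"),            # junk answer, filtered by length
]
print(clean_responses(raw))  # {'q1': 'coffee shop'}
```

Real platforms layer far more sophisticated checks on top of this (response-time analysis, worker reputation, gold-standard questions), but the core idea is the same: remove the obvious noise first, then let agreement among independent respondents do the rest.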

 

This is the first brief piece in our discussion of the “human sensor” and how we can identify its many components. What tools are you using to collect truly humanized data?
