Crowdsourcing Methodology

By Aliu A. Onifade

During the analysis and design phases of a design or software development project, developers must ask various questions, conduct research, and test their prototypes to create a serviceable product (Grier, 2013).

Traditional research methods such as focus group interviews or surveys can be immensely time-consuming and costly. Crowdsourcing, on the other hand, provides a swift and inexpensive alternative.

Specifically, crowdsourcing relies on the intelligence of the “crowd” to perform tasks in exchange for compensation. In other words, it is outsourcing to a crowd.

What is crowdsourcing?

The term “crowdsourcing” was coined in 2006 by Wired magazine editors Jeff Howe and Mark Robinson (Howe, 2006). They used the term to describe how a company uses the internet to “outsource work to a crowd”.

There are five major forms of crowdsourcing: crowdfunding, crowd contests, macro-tasks, self-organised crowds, and micro-tasks (Grier, 2013).

It could be said that, in the Human-Computer Interaction (HCI) context, the most relevant type of crowdsourcing is micro-tasking.

Micro-tasking is a relatively new labour-market phenomenon in which workers are openly recruited from the general public through online channels to perform, for a small monetary reward, mundane and tedious tasks that computers cannot handle (Difallah et al., 2015).

Crowdsourcing Methodology Description

At its core, crowdsourcing is a method that allows developers, researchers, and businesses to outsource their tasks to a large number of participants, generally via the internet.

As mentioned earlier, the most relevant type of crowdsourcing is micro-tasking, which we will simply refer to as “crowdsourcing” from here onwards.

Crowdsourcing customarily involves “Requesters”, “Workers” and a “Platform”. The developers, researchers, individuals, or businesses that need a task to be done are the Requesters.

The members of the crowd who complete the tasks are called Workers. The two parties connect through a crowdsourcing platform such as Amazon Mechanical Turk (MTurk) or Interpreter Connect.

These platforms act as a marketplace where Requesters offer work and Workers find it; MTurk is one of the most popular crowdsourcing platforms to date.

The process flow of crowdsourcing is simple, as shown in Figure 1. The process starts with Requesters designing a task for Workers to carry out. These tasks are often called Human Intelligence Tasks (HITs).

Figure 1: Process of the crowdsourcing methodology.

These are tasks that require human intellect to complete and that computers therefore cannot process on their own. HITs may include small tasks such as image tagging, audio transcription, translation, or data verification.

The output from HITs may then feed into larger research or system creation in the analysis phase, or serve as prototype-testing results in the design phase. Crowdsourcing platforms such as MTurk often offer best-practice guidelines for Requesters creating HITs.

Designing a HIT usually involves writing a detailed description and instructions for completing the task (i.e. what needs to be done), setting requirements for Workers’ qualifications (e.g. age, special skills, reliability score on the platform, location), and determining how many Workers should perform the task and the amount of compensation each Worker will receive (MTurk Blog, 2017).
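
To make this concrete, below is a minimal sketch of how a Requester might publish such a HIT programmatically through the MTurk Requester API, using the boto3 Python SDK. The task text, reward, Worker count, and qualification threshold are illustrative assumptions rather than recommended values, and the sandbox endpoint is used so nothing is actually paid out.

```python
# A minimal sketch (not the article's own setup) of publishing a HIT
# via the MTurk Requester API using the boto3 SDK.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",  # MTurk is only available in us-east-1
    # Sandbox endpoint: test HITs published here cost nothing.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# The task itself (what needs to be done), as an HTMLQuestion form.
question_xml = """<?xml version="1.0" encoding="UTF-8"?>
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <script src="https://assets.crowd.aws/crowd-html-elements.js"></script>
      <p>Does this image contain a cat? Answer yes or no.</p>
      <crowd-form><input name="answer" required></crowd-form>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>400</FrameHeight>
</HTMLQuestion>"""

response = mturk.create_hit(
    Title="Tag one image (about 30 seconds)",
    Description="Answer a single yes/no question about one image.",
    Keywords="image, tagging, quick",
    Reward="0.05",                    # compensation per assignment, in USD
    MaxAssignments=5,                 # how many Workers perform the task
    AssignmentDurationInSeconds=300,  # time a Worker has once they accept
    LifetimeInSeconds=86400,          # how long the HIT stays discoverable
    Question=question_xml,
    QualificationRequirements=[{
        # Only admit Workers whose approval rate (a platform reliability
        # score) is at least 95%; this ID is MTurk's built-in
        # "percentage of assignments approved" qualification.
        "QualificationTypeId": "000000000000000000L0",
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
        "ActionsGuarded": "Accept",
    }],
)
print("Published HIT:", response["HIT"]["HITId"])
```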

There are various considerations that Requesters should be aware of when designing a HIT: pricing the HIT fairly, laying out the questions clearly, breaking the task down into genuine micro-tasks (rather than letting it grow into a macro-task), and minimising the time a Worker spends on each HIT. A simple pricing sketch is shown below.
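
As a rough illustration of the pricing consideration, the helper below derives a per-assignment reward from an estimated completion time and a target hourly wage. The default wage is an assumption for illustration, not a platform rule.

```python
# A back-of-envelope helper for pricing a HIT: choose a target hourly wage
# and derive the per-assignment reward from the expected completion time.
# The default wage here is an illustrative assumption, not a platform rule.
def suggest_reward(seconds_per_hit: float, hourly_wage: float = 12.0) -> str:
    """Return a per-assignment reward as the USD string MTurk expects."""
    reward = hourly_wage * seconds_per_hit / 3600
    return f"{max(reward, 0.01):.2f}"  # MTurk's minimum reward is $0.01

print(suggest_reward(30))   # a 30-second HIT at $12/hour -> "0.10"
print(suggest_reward(120))  # a 2-minute HIT at $12/hour -> "0.40"
```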

Summary

Crowdsourcing could be remarkably beneficial to developers, businesses, individuals, and researchers. Although it is primarily conducted during the analysis phase, it can be practical during the design phase as well.

The method helps those who conduct it perform research with a larger sample size, at a faster rate, and at a lower cost, without compromising the quality of the results.

This is because Requesters can quickly access a large pool of Workers through an online platform, which also helps ensure the quality of the Workers.

The larger number of Workers allows Requesters to generalise their results, creating a more robust output. Furthermore, Requesters can also select “good results” individually, for example by aggregating the answers of several Workers, as sketched below.
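
One common way to select good results is to assign the same HIT to several Workers and keep the majority answer per item. The sketch below demonstrates this with fabricated example data; the 60% agreement threshold is an arbitrary assumption.

```python
# A minimal sketch of picking "good results" by majority vote: per item,
# keep the most common answer and flag items with low Worker agreement.
from collections import Counter

# Fabricated example data: five Workers' answers per image.
answers = {
    "img_001": ["cat", "cat", "cat", "dog", "cat"],
    "img_002": ["dog", "cat", "dog", "cat", "bird"],
}

for item, labels in answers.items():
    label, votes = Counter(labels).most_common(1)[0]
    agreement = votes / len(labels)
    verdict = "accept" if agreement >= 0.6 else "needs review"
    print(f"{item}: {label!r} ({agreement:.0%} agreement) -> {verdict}")
```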

Bibliography

Difallah, D., Catasta, M., Demartini, G., Ipeirotis, P., & Cudre-Mauroux, P. (2015). The Dynamics of Micro-Task Crowdsourcing: The Case of Amazon MTurk. 617-617. https://doi.org/10.1145/2740908.2744109

Grier, D. (2013). Understanding the five types of crowdsourcing. Retrieved 16 November 2020, from https://www.dummies.com/business/start-a-business/understanding-the-five-types-of-crowdsourcing/

Howe, J. (2006). The Rise of Crowdsourcing. Wired, 14(6).

MTurk Blog (2017). Tutorial: Understanding HITs and assignments. Retrieved 17 November 2020, from https://blog.mturk.com/tutorial-understanding-hits-and-assignments-d2be35102fbd
