A Guide to Choosing your Crowd-Sourcing Platform: An Interview with Jill Coleman, Ph.D.

By Aya Haneda

With the growing need for convenience and access to larger, more diverse populations, crowd-sourcing platforms have become a rapidly growing service among researchers, and the number of published research articles using crowdsourcing has continued to rise over the past decade. Broadly, crowdsourcing refers to the use of internet services to collect information, opinions, or work from a group of individuals (Investopedia, 2022). In behavioral science, crowdsourcing uses these services to obtain information from specific populations. Research crowdsourcing relies on platforms such as Amazon Mechanical Turk (MTurk), Prolific Academic, Qualtrics Panels, SurveyMonkey Audience, StudyResponse, and Witmart (Palmer & Strickland, 2016).

Question: Which online crowdsourcing platform is easiest to use?

While each crowdsourcing platform has its strengths and weaknesses, the easiest to navigate are the two most commonly used: MTurk and Prolific Academic. At many institutions, researchers in the social psychology community in particular seem to prefer MTurk.

Disclosure: Each platform's strengths and weaknesses are discussed here only in general terms; they may or may not apply to your own research study.

Strengths of Crowdsourcing

  1. Speed – The biggest advantage of using a crowd-sourcing platform is how quickly you can recruit a sample. For example, if you want to run a study with 200 participants, you can collect that sample in one day, or at most a couple of days, whereas on SONA this will likely take at least one year.
  2. Access to hard-to-reach populations – Crowdsourcing gives you access to populations that are historically hard to reach. These may include specific populations tailored to your research, such as groups defined by gender identity, sex, age, or ethnicity.
  3. Access to cross-cultural data – Because crowdsourcing platforms operate worldwide, researchers can obtain samples from around the globe, producing more culturally diverse data.
  4. Cost – Historically, crowdsourcing platforms have been known as an affordable way to collect data. While platforms differ in the amount of recommended compensation, weighing the strengths above makes crowdsourcing an attractive choice.

Weaknesses of Crowdsourcing

  1. Sampling Bias – While different platforms offer the ability to target certain populations, sampling bias occurs because participants must meet certain qualifications to enroll on these platforms in the first place. For example, participants must have internet access or a home address.
  2. Bots and Data Quality – Researchers using MTurk often find large numbers of bot-like responses in their data pool. This drawback is not unique to MTurk; it occurs with every online survey. The presence of bots has grown as the demand for work-from-home income has skyrocketed. Dr. Coleman noted that back in the early 2010s, data obtained from MTurk was comparable to ‘face-to-face’ studies. To reduce the number of bots, Dr. Coleman recommends adding more attention-check questions or open-ended questions. Example attention-check questions include “Type the first letter of this sentence” and “For this question, type 3”.

Question: Which platform has better quality of data and fewer bots?

Prolific is highly recommended by many researchers, particularly as the ‘better’ platform in terms of data quality and fewer bots. Prolific has its own strategy in place to control the enrollment of bots; view Prolific’s strategy for reducing bots here (https://blog.prolific.co/bots-and-data-quality-on-crowdsourcing-platforms/). Additionally, Prolific is attractive for researchers looking for a sample with specific characteristics. However, Prolific also has its flaws.

Prolific lists a pricing guide on its website that varies with the length of time it takes to complete the study, the number of participants needed, and the sample characteristics, giving researchers a ballpark range of payments. While the cost differential between MTurk and Prolific is shrinking, MTurk still tends to be a slightly cheaper option than Prolific.

Question: What about the SONA system that is used at Roosevelt?

The SONA system has been a traditional way of collecting data in psychology departments at various universities and colleges. It allows currently enrolled students to participate in studies to earn extra credit in their psychology classes. While SONA is free and offers few opportunities for traditional bots, it may take a long time to collect data; Dr. Coleman recommends that researchers plan ahead at least two years to obtain their data. SONA samples are also heavily skewed toward college students, particularly students identifying as female, though this will differ by institution. Roosevelt University has more racial diversity compared to other institutions, which may result in more diverse demographic data. However, critics have emphasized that researchers should not default to college-aged students simply because they are an ‘easy’ way to perform research.

Question: So which platform should I use?

Here are some points to consider when deciding which platform to use.

  1. Characteristics of your sample – Who do you need in your sample? Are they only Illinois residents ages 18-25, or is your target population broader? For example, if your study requires African American female participants, researchers may lean toward Prolific, since MTurk charges more for that specifier.
  2. Bots and Data Quality – Do you have time on a daily basis to review all incoming data? You must check daily and flag participants with bot-like answers. This is less of a concern with SONA than with MTurk studies, which receive a larger number of bot responses.
  3. Budget – How much can you afford to compensate participants? Remember that you will not only be compensating your participants but also paying the platform itself, which adds roughly an additional 30-40% on top of participant compensation. In addition, if you have specifiers, you will be required to pay more.
  4. Number of participants – How many participants do you need? If you only need 50 college-aged participants, then SONA is a recommended platform for you.
  5. Timeline – How long do you have for data collection? For SONA, it is recommended you build in a one-to-two-year buffer to ensure an adequate amount of data. If you need 100 participants, SONA will take one year, whereas on MTurk or Prolific you may obtain them in one day.
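The budget arithmetic above (a platform fee of roughly 30-40% on top of participant payments, plus possible surcharges for specifiers) can be sketched as a quick calculation. This is only an illustrative estimate: the 35% fee rate and the dollar figures below are assumptions, not actual platform prices.

```python
# Rough study-cost estimate. The 35% platform fee is an illustrative
# assumption within the 30-40% range mentioned above, not a quoted rate.
def estimate_study_cost(n_participants, pay_per_participant, platform_fee_rate=0.35):
    """Return (participant total, platform fee, grand total) in dollars."""
    participant_total = n_participants * pay_per_participant
    platform_fee = participant_total * platform_fee_rate
    return participant_total, platform_fee, participant_total + platform_fee

# Example: 200 participants paid $2.50 each with an assumed 35% fee.
subtotal, fee, total = estimate_study_cost(200, 2.50)
print(f"Participants: ${subtotal:.2f}, platform fee: ${fee:.2f}, total: ${total:.2f}")
```

Swapping in a higher fee rate or a per-specifier surcharge shows quickly how a modest per-participant payment can grow into a much larger total budget.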

Question: What is one important point to address as students and faculty decide which platform to use?

There are several things that students and faculty should keep in mind when deciding which platform to use. First, consider the time it takes each participant to complete the research task, and remember to compensate people ‘fairly’ for that time. This does not necessarily mean paying minimum wage (for example, $15 per hour in Chicago), as many participants do not do this as their full-time job. This can become tricky to navigate: paying a large amount may look like you are trying to entice participants into your study, while paying too little may seem disrespectful. Thus, it is important to establish and learn the “going rate” on each platform, in other words, the typical or expected payment for the time it takes to complete the study on that platform.
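One simple way to sanity-check a payment against a target hourly figure is to prorate it by the study's completion time. In this sketch, the $15/hour benchmark (the Chicago minimum wage mentioned above) and the 10-minute study length are illustrative assumptions; the actual going rate should be checked on each platform.

```python
# Prorate a target hourly rate by study length to suggest a per-study payment.
# The $15/hour benchmark and study durations are illustrative assumptions.
def suggested_payment(minutes_to_complete, hourly_rate=15.0):
    """Per-study payment (in dollars) implied by an hourly benchmark."""
    return round(hourly_rate * minutes_to_complete / 60, 2)

print(suggested_payment(10))  # a 10-minute study at $15/hour → 2.5
```

Comparing this figure against what similar studies on the platform actually pay helps you land between "too enticing" and "disrespectfully low."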

Question: Is the consent process on a crowdsourcing platform different compared to SONA?

On any chosen platform, consent may be obtained in several ways. For example, on the SONA system, you are required to provide a description of the study, explain what is expected of the participant, and state the time it takes to complete the study and the compensation offered; the participant then proceeds to your study. On a crowdsourcing platform (e.g., MTurk or Prolific), the participant clicks a link and reviews the consent form, similar to what is described above.

 

                           SONA          Amazon MTurk    Prolific Academic
Population                 College-aged  Diverse         Diverse
Bot-like responses         Rare          Frequent        Less frequent
Time for 100 participants  One year      One day         One day
Price                      Free          Less expensive  Most expensive

 
