A Guide to Choosing Your Crowd-Sourcing Platform: An Interview with Jill Coleman, Ph.D.

By Aya Haneda

With the growing need for convenient access to larger and more diverse populations, crowdsourcing platforms have become a rapidly growing service among researchers, and published research articles using crowdsourcing have continued to rise over the past decade. Broadly, crowdsourcing refers to the use of internet services to collect information, opinions, or work from a group of individuals (Investopedia, 2022). In behavioral science, crowdsourcing uses these services to obtain information from specific populations, through platforms such as Amazon Mechanical Turk (MTurk), Prolific Academic, Qualtrics Panels, SurveyMonkey Audience, StudyResponse, and Witmart (Palmer & Strickland, 2016).

Question: Which online crowdsourcing platform is easiest?

While each crowdsourcing platform has its strengths and weaknesses, the easiest to navigate are the two most commonly used: MTurk and Prolific Academic. At many institutions, researchers, at least in the social psychology community, seem to prefer MTurk.

Disclosure: There are strengths and weaknesses to each platform that will be discussed generally. These may or may not be applicable to your own research study.

Strengths of Crowdsourcing

  1. Speed – The biggest advantage of using a crowdsourcing platform is how quickly a sample can be recruited. For example, if you want to run a study with 200 participants, you can have this sample in one day, or at most a couple of days, whereas recruiting it through SONA will likely take at least one year.
  2. Access to hard-to-reach populations – Crowdsourcing gives you access to populations that are historically hard to reach, including samples tailored to your research, such as participants of a particular gender identity, sex, age, or ethnicity.
  3. Access to cross-cultural data – Because crowdsourcing platforms operate worldwide, researchers can obtain samples from around the globe, producing more culturally diverse data.
  4. Cost – Crowdsourcing platforms have historically been known as an affordable way to collect data. While platforms differ in their recommended compensation, weighing cost against the strengths above makes crowdsourcing an attractive choice.

Weaknesses of Crowdsourcing

  1. Sampling Bias – While many platforms can target specific populations, sampling bias arises because participants must meet certain qualifications simply to enroll on these platforms; for example, they must have internet access or a home address.
  2. Bots and Data Quality – Researchers using MTurk often find large numbers of bot-like responses in their data. This drawback is not unique to MTurk; it affects every online survey, and the presence of bots has grown as demand for work-from-home income has skyrocketed. Dr. Coleman noted that in the early 2010s, data obtained from MTurk was comparable to ‘face-to-face’ studies. To reduce the number of bots, she recommends adding attention check questions or open-ended questions; example attention checks include “Type the first letter of this sentence” and “For this question, type 3”. A sketch of screening responses against such checks follows this list.
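If you add attention checks, you will also need a routine for screening responses against them. Below is a minimal sketch of one way to flag failed attention checks in exported survey data; the column names and stored answers are hypothetical placeholders, not the export format of any particular platform.

```python
import pandas as pd

# Hypothetical export: one row per respondent, two attention-check columns.
df = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "attn_first_letter": ["T", "x", "T"],  # "Type the first letter of this sentence"
    "attn_type_3": ["3", "3", "7"],        # "For this question, type 3"
})

# Flag any respondent who answers either attention check incorrectly.
df["failed_attention_check"] = (
    (df["attn_first_letter"].str.strip().str.upper() != "T")
    | (df["attn_type_3"].str.strip() != "3")
)

# Respondents to review and possibly exclude:
print(df.loc[df["failed_attention_check"], "respondent_id"].tolist())  # [102, 103]
```

Flagged respondents should be reviewed by hand rather than dropped automatically, since an honest participant can occasionally misread a check.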

Question: Which platform has better quality of data and fewer bots?

Prolific is highly recommended by many researchers and is generally considered ‘better’ in terms of data quality and fewer bots. Prolific has its own strategy in place to control the enrollment of bots; you can view Prolific’s strategy for reducing bots here (https://blog.prolific.co/bots-and-data-quality-on-crowdsourcing-platforms/). Additionally, Prolific is attractive for researchers who need a sample with specific characteristics. However, Prolific also has its flaws.

Prolific lists a pricing guide on its website, which varies with the length of time it takes to complete the study, the number of participants needed, and the sample characteristics, giving researchers a ballpark range for payments. While the cost differential between MTurk and Prolific is shrinking, MTurk still tends to be the slightly cheaper option.

Question: What about the SONA system that is used at Roosevelt?

The SONA system has been a traditional way of collecting data in psychology departments at various universities and colleges. It allows currently enrolled students to participate in studies to earn extra credit in their psychology classes. Its strengths are that SONA is free and offers essentially no opportunity for traditional bots; its weakness is that it may take a long time to collect data, and Dr. Coleman recommends that researchers plan at least two years ahead. While SONA is free, its pool is heavily skewed toward college students, particularly students identifying as female, although this differs by institution; Roosevelt University has more racial diversity than many other institutions, which may yield more demographically diverse data. However, critics have emphasized that researchers should not rely on college-aged students simply because they are an ‘easy’ sample.

Question:  So which platform should I use?

Here are some points to consider when deciding which platform to use.

  1. Characteristics of your sample – Who do you need in your sample? Are they residents of Illinois only, ages 18-25, or are they broader? For example, if your study requires participants who are African American females, you would have to pay more on MTurk for that specifier, so researchers may lean toward Prolific.
  2. Bots and Data Quality – Do you have time on a daily basis to review all incoming data? You must check daily and flag participants with bot-like answers. This is less likely to be necessary in SONA than in MTurk studies, which receive far more bot participation.
  3. Budget – How much can you afford to compensate participants? Remember that you will not only be compensating your participants but also paying the platform, typically an additional 30-40% on top of participant compensation (see the worked example after this list). If you have specifiers, you will be required to pay more.
  4. Number of participants – How many participants do you need? If you only need 50 college-aged participants, then SONA is a recommended platform for you.
  5. Timeline – How long do you have for data collection? For SONA, it is recommended that you allow a one-to-two-year buffer to ensure an adequate amount of data. If you need 100 participants, SONA may take a year, whereas on MTurk or Prolific you may obtain them in one day.
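To make the budget point concrete, here is a minimal sketch of the arithmetic, assuming a platform fee of 30-40% on top of participant compensation as described above; the 35% rate and dollar amounts are hypothetical placeholders, not quotes from any platform.

```python
def estimate_cost(n_participants: int, pay_per_participant: float,
                  platform_fee_rate: float = 0.35) -> float:
    """Total cost: participant compensation plus the platform's service fee."""
    compensation = n_participants * pay_per_participant
    return compensation * (1 + platform_fee_rate)

# e.g., 200 participants paid $2.50 each, with an assumed 35% platform fee:
print(f"${estimate_cost(200, 2.50):.2f}")  # $675.00
```

Note that targeting specifiers (e.g., a narrow demographic) typically raises the per-participant price beyond what a flat fee rate captures, so treat this as a lower-bound estimate.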

Question: What is one important point to address as students and faculty decide which platform to use?

There are several things that students and faculty should keep in mind when deciding which platform to use. First, consider the time it takes each participant to complete the research task, and remember to compensate people ‘fairly’ for that time. This does not necessarily mean paying minimum wage (for example, $15 per hour in Chicago), as many participants do not do this as a full-time job. This can be tricky to navigate: paying too much may look like you are trying to entice participants into your study, while paying too little may look like you are not respecting their time. Thus, it is important to establish what the “going rate” is on each platform, in other words, the typical or expected payment for the time it takes to complete a study there.
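As a rough sketch of that calculation, a fair payment can be found by pro-rating the platform’s hourly going rate to your study’s expected length; the $9-per-hour figure below is a hypothetical placeholder, not a rate quoted in the interview.

```python
def suggested_payment(minutes: int, going_rate_per_hour: float) -> float:
    """Pro-rate an hourly 'going rate' to the study's expected duration."""
    return round(going_rate_per_hour * minutes / 60, 2)

# e.g., a 20-minute survey at an assumed going rate of $9.00 per hour:
print(suggested_payment(20, 9.00))  # 3.0, i.e., about $3 per participant
```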

Question: Is the consent process on a crowdsourcing platform different from SONA’s?

On any chosen platform, consent may be obtained in several ways. On the SONA system, for example, you are required to provide a description of the study, what will be expected of participants, the amount of time the study takes to complete, and the compensation; the participant then proceeds to your study. On a crowdsourcing platform (e.g., MTurk or Prolific), the participant clicks a link and reviews a consent form covering similar information.


|                                    | SONA         | Amazon MTurk   | Prolific Academic |
|------------------------------------|--------------|----------------|-------------------|
| Population                         | College-aged | Diverse        | Diverse           |
| Low-quality responses (e.g., bots) | Rare         | Frequent       | Less frequent     |
| Time for 100 participants          | One year     | One day        | One day           |
| Price                              | Free         | Less expensive | Most expensive    |


Planning to use crowdsourcing for your research study? Here are a few tips.

You’ve decided that you are going to recruit your participants using crowdsourcing. As a student, you may not know much about it, but you figure it shouldn’t be too difficult to pull off. You’ve considered using Amazon MTurk, but wonder whether there are other sources to consider. You have a (limited) budget and want to use it to pay eligible participants for your study. Your faculty advisor may have little to no knowledge of how crowdsourcing works. You feel this will be the best and easiest way to go about the process, and time is passing with each day you delay. What do you do?

Where do I start?
  1. Speak with your advisor about your interest and options. The IRB has found that when student researchers include their faculty advisors and mentors in their plans and gauge their level of knowledge or interest, the student has a much better chance of carrying out those plans successfully. If your faculty advisor is not knowledgeable about crowdsourcing, other faculty are; reach out to them together with your advisor so that you both learn about the process, and note that the IRB can refer you to faculty who have used this recruitment process numerous times. If for some reason you are unable to speak with your faculty advisor in earnest about this, consider another person to fill that role; waiting too long for an unresponsive advisor puts you in a situation that a more responsive one could have helped you avoid.
  2. Take the time to read more about crowdsourcing to make sure this is what you really want to do. A beginner’s guide to crowdsourcing (1) is one article that can jumpstart your research; it includes useful information, links, and references.
How do I fill out my IRB application?
  1. Once you click on the link to IRBManager, the online submission system, use the information provided in the section of this blog titled “application checklist”.
  2. When discussing your plans to recruit participants in your IRB application, keep in mind that the IRB is concerned with what the recruitment process will look like from the perspective of your participants. The IRB wants to know how you will reach them, what they will learn about the project, and whether they will be compensated for their participation; it will flag any study that does not clarify these three points. The IRB needs to make sure that your study gives everyone the same opportunity to participate (unless your research aims give you a reason to exclude people), that no one is unduly influenced by your study, and that compensation is reasonable in relation to the time and effort contributed.
    1. If you are planning to recruit participants using crowdsourcing, state this outright.
    2. Once you’ve established that crowdsourcing will be used, explain which tool will be used (Amazon MTurk, Prolific, Qualtrics, SONA). Each tool has very different processes for how end users are selected and engaged in your study; know the differences and be prepared to discuss your choice with the IRB. Will participants be paid an estimated amount based on the anticipated time it takes to fill out the survey? Are there specific guidelines from the crowdsourcing platform that inform your decision, and if so, what are they? What is the form of compensation? Will participants have to complete the survey to receive payment?
    3. DO NOT use a third-party system separate from the crowdsourcing platform to carry out any part of the study. Keep the consent form, study materials, and any additional information relevant to participants within the crowdsourcing platform. Doing otherwise may compromise the anonymity of your study and complicate the process for participants, which can amount to harm depending on what occurs; it will also create problems for you and delay the completion of your research.

Have additional questions? Contact the IRB Office at x2449 or research@roosevelt.edu.