A Guide to Choosing Your Crowd-Sourcing Platform: An Interview with Jill Coleman, Ph.D.

By Aya Haneda

With the growing need for convenient access to larger and more diverse populations, crowdsourcing platforms have become a rapidly growing service among researchers, and the number of published research articles using crowdsourcing has continued to rise over the past decade. Broadly, crowdsourcing refers to the use of internet services to collect information, opinions, or work from a group of individuals (Investopedia, 2022). In behavioral science, crowdsourcing uses these services to obtain information from specific populations. Platforms used for research crowdsourcing include Amazon Mechanical Turk (MTurk), Prolific Academic, Qualtrics Panels, SurveyMonkey Audience, StudyResponse, and Witmart (Palmer & Strickland, 2016).

Question: Which online crowdsourcing platform is easiest?

While there are strengths and weaknesses to each crowdsourcing platform, the easiest to navigate are the two most commonly used: MTurk and Prolific Academic. At many institutions, researchers in the social psychology community in particular seem to prefer MTurk.

Disclosure: The strengths and weaknesses of each platform are discussed generally below; they may or may not apply to your own research study.

Strengths of Crowdsourcing

  1. Speed – The biggest advantage of using a crowdsourcing platform is how quickly you can recruit a sample. For example, if you want to run a study with 200 participants, you can have the full sample in a day, or at most a few days, whereas recruiting the same sample through SONA will likely take at least a year.
  2. Access to hard-to-reach populations – Crowdsourcing gives you access to populations that are historically hard to reach, including samples tailored to your research by gender identity, sex, age, or ethnicity.
  3. Access to cross-cultural data – Because crowdsourcing is available worldwide, researchers can obtain samples from around the world, producing more culturally representative data.
  4. Cost – Historically, crowdsourcing platforms have been known as an affordable way to collect data. While platforms differ in the amount of recommended compensation, weighing the strengths above makes crowdsourcing an attractive choice.

Weaknesses of Crowdsourcing

  1. Sampling Bias – While different platforms have the added capability to target certain populations, sampling bias occurs because participants must meet certain qualifications just to enroll on these platforms; for example, participants must have internet access or a home address.
  2. Bots and Data Quality – Researchers using MTurk often find large numbers of bot-like responses in their data pool. This drawback is not unique to MTurk; it occurs with every online survey, and the presence of bots has grown as the need to work from home has skyrocketed. Dr. Coleman noted that back in the early 2010s, data obtained from MTurk was comparable to ‘face-to-face’ studies. To reduce the number of bots, Dr. Coleman recommends adding more attention-check questions or open-ended questions. Example attention-check questions include “Type the first letter of this sentence” and “For this question, type 3”.
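If you collect data this way, it helps to screen failed attention checks programmatically. Below is a minimal sketch in Python, assuming responses are exported to a CSV file and using hypothetical column names (attn_first_letter, attn_type_3); adapt it to your own survey's export.

    import pandas as pd

    # Hypothetical export file and column names; adapt to your survey.
    df = pd.read_csv("responses.csv")

    # Flag rows that fail either attention check described above.
    failed = (
        (df["attn_first_letter"].str.strip().str.upper() != "T")  # the sentence starts with "T"
        | (df["attn_type_3"].astype(str).str.strip() != "3")
    )

    print(failed.sum(), "responses flagged for review")
    df[~failed].to_csv("responses_clean.csv", index=False)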

Question: Which platform has better data quality and fewer bots?

Prolific seems to be highly recommended by many researchers, largely because it is considered ‘better’ in terms of data quality and has fewer bots. Prolific has its own strategy in place to control the enrollment of bots; you can view it here (https://blog.prolific.co/bots-and-data-quality-on-crowdsourcing-platforms/). Additionally, Prolific is attractive for researchers who are looking for a sample with specific characteristics. However, Prolific also has its flaws.

Prolific lists a pricing guide on its website that varies with the length of time it takes to complete the study, the number of participants needed, and sample characteristics, giving researchers a ballpark range for payments. While the cost differential between MTurk and Prolific is becoming smaller, MTurk still seems to be a slightly cheaper option than Prolific.

Question: What about the SONA system that is used at Roosevelt?

The SONA system has been the traditional way of collecting data in psychology departments at various universities and colleges. It allows currently enrolled students to participate in studies to earn extra credit in their psychology classes. Its strengths are that SONA is free and offers no opportunities for traditional bots, but it may take a long time to collect data; Dr. Coleman recommends that researchers plan at least two years ahead. While SONA is free, its samples are heavily skewed toward college students, particularly students identifying as female, though this differs by institution. Roosevelt University has more racial diversity than many other institutions, which may yield more demographically diverse data. However, critics have argued that researchers should not rely on college-aged students simply because they are an ‘easy’ way to perform research.

Question: So which platform should I use?

Here are some points to consider when finalizing the platform you will use.

  1. Characteristics of your sample – Who do you need in your sample? Are they only Illinois residents ages 18-25, or something broader? For example, if your study requires African American female participants, you would have to pay more on MTurk for that specifier, so researchers may lean toward Prolific.
  2. Bots and Data Quality – Do you have time to review all incoming data every day? You must check daily and flag participants with bot-like answers. This is much less of an issue in SONA than in MTurk studies, which receive far more bot participation.
  3. Budget – How much can you afford to compensate participants? Remember that you will not only be compensating your participants but also paying the platform, typically an additional 30-40% on top of participant compensation. If you have specifiers, you will be required to pay more (see the budget sketch after this list).
  4. Number of participants – How many participants do you need? If you only need 50 college-aged participants, then SONA is a recommended platform for you.
  5. Timeline – How long do you have for data collection? For SONA, it is recommended you allow a one-to-two-year buffer to ensure an adequate amount of data. If you need 100 participants, SONA may take a year, whereas MTurk or Prolific may deliver them in a day.
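To make the budget math concrete, here is a minimal sketch of the calculation in Python; the 35% fee and the payment amounts are illustrative assumptions, not quotes from any platform.

    # Illustrative budget estimate; the fee rate is an assumption, so check
    # your platform's current pricing before planning.
    def estimate_total_cost(n_participants, pay_per_participant, platform_fee_rate=0.35):
        compensation = n_participants * pay_per_participant
        fee = compensation * platform_fee_rate  # typically an extra 30-40%
        return compensation + fee

    # Example: 200 participants paid $2.50 each with a 35% platform fee.
    print(f"${estimate_total_cost(200, 2.50):,.2f}")  # $675.00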

Question: What is one important point to address as students and faculty decide which platform to use?

There are several things that students and faculty should keep in mind when deciding which platform to use. First, consider the time it takes each participant to complete the research task: you want to reimburse people ‘fairly’ for their time. This does not necessarily mean reimbursing at minimum wage (for example, $15 per hour in Chicago), as many participants do not do this as a full-time job. This can be tricky to navigate: paying a large amount may look like you are trying to entice participants into your study, while paying too little may look like you are not respecting their time. Thus, it is important to establish and learn the “going rate” on each platform, in other words the typical or expected payment for the time it takes to complete the study on that platform.
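As a rough illustration of working from a going rate, here is a minimal sketch; the $9-per-hour figure is a made-up placeholder, not any platform’s actual rate.

    # Convert a study's length and a going hourly rate into a per-participant
    # payment; $9/hour is a placeholder, not any platform's actual rate.
    def pay_per_participant(study_minutes, going_rate_per_hour=9.00):
        return round(study_minutes / 60 * going_rate_per_hour, 2)

    print(pay_per_participant(20))  # a 20-minute study -> 3.0, i.e., $3.00 per person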

Question: Is the consent process on a crowdsourcing platform different from SONA?

On any chosen platform, consent may be obtained in several ways. On the SONA system, for example, you are required to provide a description of the study, what is expected of the participant throughout the study, the amount of time it takes to complete the study, and the compensation; the participant will then proceed to your study. On a crowdsourcing platform (e.g., MTurk or Prolific), the participant will be asked to click on a link and review the consent form, which covers the same information described above.

 

                              SONA           Amazon MTurk    Prolific Academic
  Population                  College-aged   Diverse         Diverse
  Bot-like responses          Rare           Frequent        Less frequent
  Time for 100 participants   One year       One day         One day
  Price                       Free           Less expensive  Most expensive

 

Instructions for locating and submitting forms required for amendment or renewal of approved studies in IRBManager

You have submitted and received approval for your IRB application (yay!). Now you have to submit another form to make changes to your original study (amendment), acknowledge a reliance agreement with another institution, request a continuation of your study (renewal/termination form), or terminate your study (renewal/termination form).

All subsequent forms submitted have to be processed through the study page of the initial submission. The initial application established a study page with your study number. Your study number should be connected to every form that is submitted for that particular study thereafter.

You access your study page by clicking the hyperlinked study number on your home page. Clicking your study number will take you to a page that looks like this.

Notice that your study number is at the top of the page. In the side column, you will see the “start x form” option in the side toolbar (circled below). Push “start x form” on your study page and that will take you to the next screen.

You will see all of the forms you can use in relation to your study number. From here, you should be able to submit your form.

 

If for some reason you are not able to access your study page, please contact the IRB Office.

Here is a short video that provides instructions.

 

Not sure if your form was submitted properly? Here is how you can be absolutely sure.

If it’s been weeks since you submitted your IRB form and you are wondering why you haven’t heard back yet, there is a good chance that you haven’t completed the submission process.

Each form requires two steps for submission: sign and submit. If you haven’t seen a screen that asks you to enter your username and password, then you haven’t signed off on your form. If you haven’t seen a submit screen, then you haven’t actually submitted your form.

Here is what you should look for.

After you enter your information in the form, push “next” at the bottom of your form. On the next screen you will see a page that asks you to sign off on your form. Push “sign” and this will take you to a screen that asks for your username and password, the same as when you log on to IRBManager. Once you complete that step, you should see this screen.

This screen verifies that you have actually signed your form. You will need to push “next” again to go to the next screen.

When you push “submit” your form will be sent to the next level of review. If you are a student, your faculty advisor will receive the form. If you are a faculty or staff researcher, your form will go to the IRB Office.

Once you see the above screen, your form has been submitted.

If you have completed these steps and your faculty advisor must sign off on the form, please make sure that they have done so. If they have signed off on the form and you still haven’t heard back within two weeks, please reach out to the IRB Office to make sure that your form has been reviewed.

Planning to use crowdsourcing for your research study? Here are a few tips.

You’ve decided that you are going to recruit your participants using crowdsourcing. As a student, you may not know much about it, but you figure that it shouldn’t really be difficult to pull off. You’ve considered using Amazon MTurk, but wonder if there are other sources to consider. You have a (limited) budget and want to use it to pay eligible participants for your study. Your faculty advisor isn’t familiar with it, or has little to no knowledge about how it works. You feel this will be the best and easiest way to go about this process, and time is passing each day you delay. What do you do?

Where do I start?
  1. Speak with your advisor about your interest and options. The IRB has found that when student researchers include their faculty advisors and mentors in their plans and gauge their level of knowledge or interest, the student has a much better chance of carrying out those plans successfully. If your faculty advisor is not knowledgeable about crowdsourcing, there are other faculty who are; reach out to them with your faculty advisor so that you both learn about the process together. The IRB can refer you to faculty who have used this recruitment process numerous times. If for some reason you are unable to speak with your faculty advisor in earnest about this, then consider another person to fill that role. If you wait too long for them to get back to you, you risk putting yourself in a situation that could be avoided by working with a more responsive faculty advisor.
  2. Take the time to read more about it to make sure that this is what you really want to do. Here is an article that can jumpstart your research: A beginner’s guide to crowdsourcing (1) includes some useful information, links, and references.
How do I fill out my IRB application?
  1. Once you click on the link to IRBManager, the online submission system, use the information provided in the area of this blog titled “application checklist”.
  2. When discussing your plans to recruit your participants in your IRB application, keep in mind that the IRB is concerned about what the recruitment process will look like from the perspective of your participants. The IRB wants to know how you will reach them, what they will learn about the project, and whether they will be compensated for their participation. The IRB will flag any studies that don’t clarify these three points. The IRB needs to make sure that your study provides everyone the same opportunity to participate (unless you have some reason to exclude people based on your research aims), that no one is unduly influenced by your study, and that compensation is reasonable in relationship to the time and effort contributed to the study.
    1. If you are planning to recruit participants using crowdsourcing, state this outright.
    2. Once you’ve established that crowdsourcing will be used, explain which tool will be used (Amazon MTurk, Prolific, Qualtrics, SONA). Each tool has very different processes for how end users are selected and engaged in your study. Know the differences and be prepared to discuss your choice with the IRB. Will participants be paid an estimated amount based on the anticipated time it will take to fill out the survey? Are there specific guidelines from the crowdsourcing platform that inform your decision? If so, what are they? What is the form of compensation? Will participants have to complete the survey to receive payment?
    3. DO NOT use a third-party system outside the crowdsourcing platform to carry out any part of the study. Keep the consent form, study materials, and any additional information relevant to participants within the crowdsourcing platform. Doing otherwise can compromise the anonymity of your study and complicate the process for participants, which can rise to the level of harm depending on what occurs. Finally, it will cause problems for you and delay the time it takes to complete your research.

Have additional questions? Contact the IRB Office at x2449 or research@roosevelt.edu.

Why haven’t I heard back from the IRB about my study yet? It’s been a while…

You submitted your IRB application, renewal, or termination and may not be sure why you haven’t heard back from the IRB. Here are a few reasons why this may be the case:

  • You are a student whose faculty advisor did not sign off on your study. This is the most common reason. Every student study submitted to the IRB requires approval from your faculty advisor. If you haven’t been in touch with your faculty advisor, check in with them to find out whether they actually reviewed your study after your submission to the IRB. Faculty advisors are now responsible for agreeing to provide proper oversight of student research that requires IRB approval and for providing a one-page plan on how they will monitor student research projects. Once a faculty advisor completes that process and actually pushes the “submit” button, the project comes to the IRB for review. If your study is sent back to you during the pre-review, review, or post-review stage (for a full board study), your faculty advisor is required to review and submit it again before the IRB reviews your changes. Keep in contact with your faculty advisor to make sure that they are actually responding to emails received from IRBManager.
  • You are a PI with a different submitter for your study. The IRB application has a place to indicate whether a person is the submitter and/or the PI. If that person is one and the same, the submission should move forward without an issue. If a student or staff person submits a study on behalf of someone else, the system requires that the PI sign off on the study prior to IRB review. The IRB Office sees numerous studies stalled at this stage. Please be sure to check for emails from the IRB Office, or research@roosevelt.edu; you likely have been sent an email that requires your attention and response. Once you receive an email notification from the IRB, please go into your study, review it, and push “submit” to complete the submission process.
  • You did not use your actual Roosevelt University email address to submit your study. All study information is tied to your Roosevelt profile. The system relies on this information to give everyone single sign-on access through Roosevelt and to access your CITI training information. Any submitter who uses a personal email address while accessing IRBManager, or who provides personal contact information for the IRB Office to process, creates a situation in which the system cannot recognize them as the submitter. Applications submitted with private contact information are not processed by the system; the IRB Office must delete the second set of contact information, and the submitter must begin again with another form.
  • You didn’t actually complete the submission process by pushing “submit”. We have found that many submitters simply do not push “submit”. The submission is only complete once this step is done.
  • You received an email notification requesting changes to your application that you haven’t responded to. Please be sure to check your emails to make sure that you are being responsive to any feedback, questions, or required changes to your study.
  • If you have addressed any or all of the above concerns and still haven’t received a response within 5-10 business days of your submission, please check in IRBManager to see if there has been a status change to your submission. For exempt or expedited studies, if the status shows as “under review”, please call or email the IRB Office to inquire about the status of your study. If your level of review is full board, please know that it will be reviewed at the next full board meeting. If your level of review is exempt or expedited, it will be reviewed by one or two IRB reviewers, which should not take as long. Please contact the IRB Office (dsomerville@roosevelt.edu) if the review time has been longer than 10 business days for your exempt or expedited study.