Reliance Agreements and Cooperative Research

By La Vonne Downey, Ph.D.

So, what does the new Common Rule say about cooperative research? First, let's define cooperative research: research “involving more than one institution.” Previously, each institution's IRB would review the research, which extended and often duplicated the review process for everyone involved, especially researchers. The new Common Rule aims to avoid duplicate IRB reviews and make the process more cooperative. This can be done through an IRB Authorization/Reliance Agreement. The simplicity of the solution, a cooperative/authorizing agreement, belies the potential complexity involved. This is due in part to questions about whose IRB is providing oversight. Each institution is still responsible for safeguarding the rights and welfare of human subjects and for complying with applicable laws and regulations. Thus, certain steps have to be outlined in any agreement.

These include the following:

  • Determining when other IRBs are involved, and whose guidelines are followed
  • Determining which IRB has the expertise to be the lead; for example, a medical institution might have more experience and structures in place to oversee medical patient-based research than a university that does not conduct patient-based research
  • Determining the time frame of the agreement and defining each party's responsibilities in connection with the research, such as where data is stored
  • Communicating as appropriate with the other involved IRBs
  • Notifying investigators of any special expectations with regard to the conduct of multi-site research
  • Determining, prior to the initiation of research, how the University IRB will solicit and review reports of unanticipated problems involving risks to subjects or others

Again, the good news is that this can all be done through cooperative, intra-institutional, reliance, and authorizing agreements. You can find the current IRB Reliance Agreement here under “Sample Consent Forms.” The neutral news is that this process is often like negotiating a contract between two entities with differing layers of requirements and procedures. Thus, the process of creating an agreement should start as soon as possible, with the understanding that it might take time for both parties (IRBs) to agree on all the details. The helpful news is that the research office can help you with the process. Please use these resources so both sites work together, with their respective strengths, to ensure ethical research is being done.

Important Dates

All IRB applications must be submitted via IRBManager by one of the submission deadlines below at 5:00 p.m. No exceptions. The IRB is no longer accepting paper applications for initial applications, renewals, amendments, or terminations.

Here are the dates for the 2024-2025 Academic Year:

IRB Submission Deadline          IRB Committee Review Date
December 15, 2024 – 5:00 p.m.    January 9, 2025
January 15, 2025 – 5:00 p.m.     February 6, 2025
February 12, 2025 – 5:00 p.m.    March 6, 2025
March 13, 2025 – 5:00 p.m.       April 3, 2025
April 16, 2025 – 5:00 p.m.       May 1, 2025
May 15, 2025 – 5:00 p.m.         June 5, 2025
June 13, 2025 – 5:00 p.m.        July 10, 2025

IRB Submission Deadline          IRB Committee Review Date
January 12, 2025 – 5:00 p.m.     February 6, 2025
February 8, 2025 – 5:00 p.m.     March 6, 2025
March 15, 2025 – 5:00 p.m.       April 3, 2025
April 13, 2025 – 5:00 p.m.       May 1, 2025
May 17, 2025 – 5:00 p.m.         June 5, 2025
June 15, 2025 – 5:00 p.m.        July 3, 2025
July 12, 2025 – 5:00 p.m.        August 4, 2025
August 17, 2025 – 5:00 p.m.      September 2, 2025
September 14, 2025 – 5:00 p.m.   October 2, 2025
October 12, 2025 – 5:00 p.m.     November 6, 2025

What Information A Consent Form Should Contain

1. Title of the Study
2. Names and Affiliations of the Primary Investigator
• If a student is conducting the study, state the student’s information first.
3. Purpose of the Study
• Describe the general purpose of the study.
4. Subject Selection Criteria
• Describe how the subjects were chosen.
5. Study Procedures
• In chronological order, describe what the subject will be asked to do (an activity, completing a survey).
• Describe the total length of time for participation (how long, how often).
• If applicable, explain that the investigator will be audiotaping or videotaping, and if this is optional.
6. Potential Risks and Discomforts
• Describe any potential psychological, social, legal, or financial risks or harms to the subject, and their probability, as a direct result of participation in the research and/or from a breach of confidentiality. (Remember: there is no such thing as risk-free human subject research.) If the study topics might upset the subject, you must include a free service they can contact to speak about that issue.
7. Potential Benefits
• Describe any expected benefits to the subjects themselves (clearly state if subject will not benefit directly from the study).
• Describe any expected benefits to society and/or science.
8. Cost and Compensation
• Describe any cost to the subject (include time spent).
• Describe any compensation the subject will be offered as a result of participation in the research (if partial participation will result in partial compensation, explain). Also see section 11.
9. Future Use of Data
If working with identifiable and/or de-identified data:

• Explain that identifiable private information or de-identified data may be retained and used for additional or subsequent research, and whether this is optional. State when the data will be destroyed; the standard retention period is five years.
– OR –
• State that the data collected will not be distributed for future research, even with the identifiers removed (note that some funders and journals require de-identified data to be made available to others after research/publication).

10. Confidentiality
• Describe the level to which subject information will be kept confidential (describe procedures that will be used to safeguard data, including where it will be kept, who will have access to it and at what point it will be destroyed; note the difference between anonymous and confidential).
11. Participation and Withdrawal
• State clearly that participation is voluntary and that the subject may refuse to answer any questions or withdraw from the study at any time without penalty (including loss of benefits to which they would otherwise be entitled).
12. Contact Information
• Give the contact information of the principal investigator and supervised researcher (if applicable) for questions about the study.
• Give the contact information of the Roosevelt University IRB Chair, Dr. La Vonne Downey (ldowney@roosevelt.edu, 312-322-7112), for questions about the subject’s rights as a human subject or concerns about the research.
13. Subject Consent
Example
I have read (or had read to me) the contents of this consent form and have been encouraged to ask questions. I have received satisfactory answers to my questions. I understand that my participation is voluntary and that I may withdraw my participation at any time without penalty. I voluntarily agree to participate in this study.
__ I do __ I do not give you permission to make audio/video recordings of me during this study (if applicable).
__ I do __ I do not give you permission to retain and use my data for future research (if applicable).
• Signatures of subject and investigator. The subject must receive a copy of this form.

A Guide to Choosing Your Crowd-Sourcing Platform: An Interview with Jill Coleman, Ph.D.

By Aya Haneda

With the growing need for convenient access to diverse and larger populations, crowdsourcing platforms are a rapidly growing service utilized by researchers. Published research articles using crowdsourcing have continued to rise in the past decade. More broadly, crowdsourcing refers to the use of internet services to collect information, opinions, or work from a group of individuals (Investopedia, 2022). In behavioral science, crowdsourcing utilizes these services to obtain information from specific populations. Crowdsourcing for research uses platforms such as Amazon Mechanical Turk (MTurk), Prolific Academic, Qualtrics Panels, SurveyMonkey Audience, StudyResponse, and Witmart (Palmer & Strickland, 2016).

Question: Which online crowdsourcing platform is easiest?

While there are strengths and weaknesses to each crowdsourcing platform, the easiest to navigate are the most commonly used: MTurk and Prolific Academic. At many institutions, researchers, at least in the social psychology community, seem to prefer MTurk.

Disclosure: There are strengths and weaknesses to each platform that will be discussed generally. These may or may not be applicable to your own research study.

Strengths for Crowdsourcing

  1. Speed – The biggest advantage of using a crowdsourcing platform is how quickly you can recruit a sample. For example, if you want to run a study with 200 participants, you can have this sample in one day, or at most a couple of days, while with SONA this will likely take at least one year.
  2. Access to hard-to-reach populations – Crowdsourcing gives you access to populations that are historically hard to reach. These may include specific populations tailored to your research, including populations of a particular gender identity, sex, age, or ethnicity.
  3. Access to cross-cultural data – Because crowdsourcing is available worldwide, researchers are able to access samples throughout the world, producing more culturally diverse data.
  4. Cost – Historically, crowdsourcing platforms have been known as an affordable way to get data. While each platform differs in the amount of recommended compensation, weighing the above strengths makes crowdsourcing platforms an attractive choice.

Weaknesses for Crowdsourcing

  1. Sampling Bias – While different platforms have the capability to target certain populations, sampling bias occurs because participants must have certain qualifications to enroll on these platforms in the first place. For example, participants must have internet access or a home address.
  2. Bots and Data Quality – Researchers using MTurk often find large numbers of bot-like responses in their data pool. This drawback is not unique to MTurk but occurs with every online survey. The presence of bots has increased as the need to work from home has skyrocketed. Dr. Coleman noted that back in the early 2010s, data obtained from MTurk was comparable to ‘face-to-face’ studies. To reduce the number of bots, Dr. Coleman recommends adding more attention-check questions or open-ended questions. Example attention-check questions include “Type the first letter of this sentence” and “For this question, type 3”.
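As a rough illustration, screening for failed attention checks can be automated once the data are downloaded. The sketch below is a minimal example, not any platform's actual workflow, and the column names (`check_first_letter`, `check_type_3`) are hypothetical:

```python
# Minimal sketch: flag survey responses that fail simple attention checks.
# Column names ("check_first_letter", "check_type_3") are hypothetical
# placeholders for wherever your survey stores the check items.

def fails_attention_checks(row):
    """Return True if a response misses either embedded check item."""
    # "Type the first letter of this sentence" -> expected "T"
    first_letter_ok = str(row.get("check_first_letter", "")).strip().upper() == "T"
    # "For this question, type 3" -> expected "3"
    type_three_ok = str(row.get("check_type_3", "")).strip() == "3"
    return not (first_letter_ok and type_three_ok)

responses = [
    {"id": 1, "check_first_letter": "T", "check_type_3": "3"},
    {"id": 2, "check_first_letter": "Q", "check_type_3": "3"},  # failed check
    {"id": 3, "check_first_letter": "t", "check_type_3": "7"},  # failed check
]

flagged = [r["id"] for r in responses if fails_attention_checks(r)]
print(flagged)  # [2, 3]
```

Flagged responses would then be reviewed by hand before rejection, since a real participant can also miss a check.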

Question: Which platform has better quality of data and fewer bots?

It seems that Prolific is highly recommended by many researchers, being ‘better’ in terms of data quality and having fewer bots. Prolific has its own strategy in place to control the enrollment of bots. View Prolific’s strategy for reducing bots here (https://blog.prolific.co/bots-and-data-quality-on-crowdsourcing-platforms/). Additionally, Prolific is attractive for researchers who are looking for a sample with specific characteristics. However, Prolific also has its flaws.

Prolific has a pricing guide listed on its website, which varies with the length of time it takes to complete the study, the number of participants needed, and sample characteristics. It gives researchers a ballpark range of payments. While the cost differential between MTurk and Prolific is becoming smaller, MTurk still seems to be a slightly cheaper option than Prolific.

Question: What about the SONA system that is used at Roosevelt?

The SONA system has been a traditional way of collecting data in psychology departments at various universities and colleges. It allows currently enrolled students to participate in studies to earn extra credit in their psychology classes. The strength is that SONA is free and offers no opportunities for traditional bots; the weakness is that it may take a long time to get data. Dr. Coleman recommends that researchers plan at least two years ahead to obtain data. While SONA is free, it is heavily skewed toward college students, more so students identifying as female, although this will differ depending on the institution. Roosevelt University has more racial diversity compared to other institutions, which may result in more diverse demographic data. However, critics have emphasized that researchers should not rely on college-aged students simply because it is an ‘easy’ way to perform research.

Question:  So which platform should I use?

These are some points that you should consider when finalizing the platform to utilize.

  1. Characteristics of your sample – Who do you need in your sample? Are they residents of only Illinois and ages 18-25, or are they broader? For example, if your study requires participants who are African American females, researchers may lean toward Prolific, since you would have to pay more on MTurk for the specifier.
  2. Bots and Data Quality – Do you have time on a daily basis to review all incoming data? You must check daily and flag participants with bot-like answers. This is less likely to be necessary in SONA than in MTurk studies, which receive a larger number of bot participants.
  3. Budget – How much can you afford to compensate participants? It is important to remember that you will not only be compensating your participants but will also be paying the platform. This will range from an additional 30-40% on top of what you pay your participants. In addition, if you have specifiers, you will be required to pay more.
  4. Number of participants – How many participants do you need? If you only need 50 participants from a college-aged population, then SONA is a recommended platform for you.
  5. Timeline – How long do you have for data collection? For SONA, it is recommended that you allow a one-to-two-year buffer to ensure an adequate amount of data. If you need 100 participants, SONA may take one year, compared to MTurk and Prolific, where you may obtain them in one day.
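To make the budget point concrete, here is a minimal sketch of a cost estimate. The 35% fee rate is an assumed midpoint of the 30-40% range mentioned above, not any platform's actual rate; check the platform's current pricing before budgeting:

```python
# Rough budget estimate: participant compensation plus an assumed platform fee.
# The 0.35 fee rate is illustrative (midpoint of the 30-40% range discussed
# above), not a quoted rate from any specific platform.

def estimate_total_cost(n_participants, pay_per_participant, platform_fee_rate=0.35):
    """Return total study cost: compensation plus the platform's cut."""
    compensation = n_participants * pay_per_participant
    platform_fee = compensation * platform_fee_rate
    return compensation + platform_fee

# Example: 200 participants paid $2.50 each.
print(estimate_total_cost(200, 2.50))  # 675.0
```

Note how quickly the fee adds up: a $500 compensation budget becomes a $675 study cost, before any surcharge for sample specifiers.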

Question: What is one important point to address as students and faculty decide which platform to use?

There are several things that students and faculty should keep in mind when deciding which platform to use. First, consider the time it takes each participant to complete the research task. You want to reimburse people ‘fairly’ for their time. However, this does not necessarily mean reimbursing at minimum wage (for example, $15 per hour in Chicago), as many participants do not do this as a full-time job. This can become tricky to navigate: paying a large amount of money may make it seem that you are trying to entice participants into your study, while paying too little may make it seem that you are not respecting them. Thus, it is important to establish and learn the “going rate” on each platform, in other words, the typical or expected payment for the time it takes to complete the study on that platform.

Question: Is the consent process on a crowdsourcing platform different compared to SONA?

On any chosen platform, consent may be obtained in several ways. For example, on the SONA system, you are required to provide a description of the study, state what is expected of the participant, and review the amount of time it takes to complete the study and the compensation. The participant will then proceed to your study. On a crowdsourcing platform (e.g., MTurk or Prolific), the participant will be asked to click on a link and review the consent form, similar to what is described above.

 

                               SONA          Amazon MTurk    Prolific Academic
Population                     College-aged  Diverse         Diverse
Quality issues (e.g., bots)    Rarely        Frequent        Less frequent
Time for 100 participants      One year      One day         One day
Price                          Free          Less expensive  Most expensive

 

Deception vs. Incomplete Disclosure

Hello everyone! Welcome to the first blog post of the semester!

What is deception? Deception is simply providing false information to the participants of a study.

What is incomplete disclosure? Incomplete disclosure is a type of deception where the real purpose of a study/research is not given to the participants of said study. 

These two things are extremely important to distinguish because the distinction will make a difference in how the study is reviewed by the IRB. In some cases, deception is necessary because prior knowledge of the true aim of the study may alter responses from the participants. For example, a study examining how emotions affect decision making may not inform participants that they are being studied based on their emotions, because the researcher wants authentic reactions. The IRB reviews studies that involve deception because participants cannot fully consent to a study if they are given false information. The IRB has to determine whether the study needed to use deception, or whether the study could be completed without it. All studies using deception require a debriefing script describing the true aim of the study and making sure that the participant gives permission for their data to be used in the study.

As for incomplete disclosure, it may be necessary in some cases, but only when the researcher documents that it is justified under the criteria of federal regulations. Use of incomplete disclosure generally requires minimal risk; the safety and welfare of the participants must not be affected. However, these research techniques might raise concerns for the IRB because they involve manipulating the participant's ability to decide whether they want to participate in the study. The IRB also discourages use of these techniques if there are better alternatives to yield the researcher's desired results, if they do not protect the participants' interests, or if the risk of participation is not clear to the participant.

If you are unsure if your study involves deception or incomplete disclosure, please contact the IRB at (312) 341-2449 or dsomerville@roosevelt.edu.

Instructions for locating and submitting forms required for amendment or renewal of approved studies in IRBManager

You have submitted and received approval for your IRB application (yay!). Now you have to submit another form to make changes to your original study (amendment), acknowledge a reliance agreement with another institution, request a continuation for your study (renewal/termination form) or terminate your study (renewal/termination form).

All subsequent forms submitted have to be processed through the study page of the initial submission. The initial application established a study page with your study number. Your study number should be connected to every form that is submitted for that particular study thereafter.

You access your study page by clicking on the hyperlinked study number on your home page. Once you click on your study number, that will take you to a page that looks like this.

Notice that your study number is at the top of the page. When you look at the side column, you will notice “start x form” in the side toolbar (circled below). Push “start x form” on your study page and that will take you to the next screen.

You will see all of the forms you can use in relationship to your study number. From here, you should be able to submit your form.

 

If for some reason you are not able to access your study page, please contact the IRB Office.

Here is a short video that provides instructions.

 

Not sure if your form was submitted properly? Here is how you can be absolutely sure.

If it’s been weeks since you submitted your IRB form and you are wondering why you haven’t heard back, there is a good chance that you haven’t completed the submission process.

Each form requires two steps for submission: sign and submit. If you have seen a screen that asks for you to enter your username and password, then you haven’t signed off on your form. If you haven’t seen a submit screen, then you haven’t actually submitted your form.

Here is what you should look for.

After you enter your information in the form, push “next” at the bottom of your form. On the next screen you will see a page that asks for you to sign off on your form. Push “sign” and this will take you to a screen that asks for your username and password, same as when you log on to IRBManager. Once you complete that step, you should see this screen.

This screen verifies that you have actually signed your form. You will need to push next again to go to the next screen.

When you push “submit” your form will be sent to the next level of review. If you are a student, your faculty advisor will receive the form. If you are a faculty or staff researcher, your form will go to the IRB Office.

Once you see the above screen, your form has been submitted.

If you have completed these steps and your form requires your faculty advisor’s sign-off, please make sure that your faculty advisor has signed. If they have signed off on the form and you still haven’t heard back within two weeks, please reach out to the IRB Office to make sure that your form has been reviewed.

Planning to use crowdsourcing for your research study? Here are a few tips.

You’ve decided that you are going to recruit your participants using crowdsourcing. As a student, you may not know much about it, but you figure that it shouldn’t really be difficult to pull off. You’ve considered using Amazon MTurk, but wonder if there are other sources to consider. You have a (limited) budget and want to use it to pay eligible participants for your study. Your faculty advisor isn’t familiar with it, or has little to no knowledge about how it works. You feel this will be the best and easiest way to go about this process, and time is passing each day you delay. What do you do?

Where do I start?
  1. Speak with your advisor about your interest and options. The IRB has found that when student researchers include their faculty advisors and mentors in their plans and gauge their level of knowledge or interest, the student has a much better chance of carrying out those plans successfully. If your faculty advisor is not knowledgeable about crowdsourcing, there are other faculty who are. Reach out to them with your faculty advisor, so that you both learn about the process together. The IRB can refer you to faculty who have used this recruitment process numerous times. If for some reason you are unable to speak with your faculty advisor in earnest about this, then consider another person to fill that role. If you wait too long for them to get back to you, you risk putting yourself in a situation that could be avoided by working with a more responsive advisor.
  2. Take the time to read more about it to make sure that this is what you really want to do. Here is an article that can jumpstart your research. A beginner’s guide to crowdsourcing (1)  includes some useful information, links and references.
How do I fill out your IRB application?
  1. Once you click on the link to IRBManager, the online submission system, use the information provided in the area of this blog titled, “application checklist”.
  2. When discussing your plans to recruit participants in your IRB application, keep in mind that the IRB is concerned about what the recruitment process will look like from the perspective of your participants. The IRB wants to know how you will reach them, what they will learn about the project, and whether they will be compensated for their participation. The IRB will flag any studies that don’t clarify these three points. The IRB needs to make sure that your study provides everyone the same opportunity to participate (unless you have some reason to exclude people based on your research aims), that no one is unduly influenced by your study, and that compensation is reasonable in relation to the time and effort contributed to the study.
    1. If you are planning to recruit participants using crowdsourcing, state this outright.
    2. Once you’ve established that crowdsourcing will be used, explain which tool will be used (Amazon MTurk, Prolific, Qualtrics, SONA). Each tool has very different processes for how end users are selected and engaged in your study. Know the differences and be prepared to discuss the relevant choice to the IRB. Will they be paid an estimated amount based on the anticipated time it will take to fill out the survey? Are there specific guidelines specified by the crowdsourcing platform that inform your decision? If so, what are they? What is the form of compensation? Will they have to complete the survey to receive payment?
    3. DO NOT use a third-party system separate from the crowdsourcing platform to carry out any part of the study. Keep the consent form, study materials, and any additional information relevant to participants within the crowdsourcing platform. Doing otherwise may compromise the anonymity of your study and complicate the process for participants, which can constitute harm depending on what occurs. Finally, it will cause problems for you and delay the time it takes to complete your research.

Have additional questions? Contact the IRB Office at x2449 or research@roosevelt.edu.