Wednesday, March 25, 2015

IRB Workshop on Internet-based Research

On March 9, 2015, the Texas A&M University Division of Research Human Subjects Protection Program hosted an IRB Workshop on Internet-Based Research. Research involving different internet modalities – for example, social media and survey platforms – is becoming increasingly popular. The purpose of the workshop was to have researchers from various fields of study describe their experiences and strategies with IRB-approved internet-based research. Dr. Tracy Hammond, Director of the Sketch Recognition Lab and Associate Professor in the Department of Computer Science and Engineering, was one of five faculty to share their perspectives, along with Dr. Heidi Campbell (Department of Communication), Dr. Trey Marchbanks (Public Policy Research Institute), Dr. Rebecca Schlegel (Department of Psychology), and Dr. Debra Fowler (Center for Teaching Excellence).

While cold rain poured down in buckets outside, five speakers, chosen from a diverse range of academic areas, warmly shared their expertise on internet-based research with a small and attentive audience. The workshop was moderated by Dr. Cynthia Riccio (Department of Educational Psychology) and organized by Dr. Catherine Higgins (Human Subjects Protection Program).


Fourth in the string of five presenters was Dr. Tracy Hammond, the director of the Sketch Recognition Lab, who began her presentation with a reference to the downpour outside. “Hopefully we’ll get lucky and survive the rain,” she joked. “I don’t know how many of you have been outside in the last half-hour, but we have flooding in the streets, so be forewarned about that.” With that, she began her presentation on “Internet Studies and the IRB.” She started with some background on her relationship with the IRB process: "Currently I have 18 active studies, 9 pending studies that are actively undergoing revisions, and 5 studies in draft form. In the last week alone, I have received 91 emails from the IRB staff." Why so many? Hammond regularly teaches senior capstone design, resulting in a number of group projects, and has over 20 graduate advisees with their own personal research projects. Hammond requires all students enrolled in a project class with her to complete CITI training, even if they are not certain that they will be participating in a project that requires IRB approval.

(Dr. Tracy Hammond)

“What I decided to talk about today are the non-intuitive IRB stipulations that I tell each of my graduate students when they are submitting their IRB proposals,” she said with a smile as she introduced her topic.

In her presentation, Dr. Hammond focused on tips drawn from her experience with the Institutional Review Board (IRB), as well as sampling strategies. Dr. Hammond also took the time to stress the importance of the consent process. Even if the consent form is waived, consent must still be obtained (e.g., "Your participation in this project implies your consent. Click here to consent and continue."). A waiver of documentation of consent does not mean you no longer have to obtain consent; it just means participants do not have to sign and write their name to provide it. This is very useful if you want to collect anonymous data, in cases where the consent form would be the only personal information tying participants' identities to their data.


The IRB stamp information (IRB number, approval date, expiration date) must be included on the consent page and in all recruitment material. Any paper forms (such as fliers and consent forms) must be the exact PDF uploaded to imedris.com with the official stamp on it. Moreover, all forms, questionnaires, and recruitment scripts must be exactly the same as the versions the IRB approved. If a typo is found in any material that had previously been approved by the IRB, the updated version must be uploaded and approved. (Note that such changes can sometimes be approved in a single day, so there is no excuse for not doing it.)
(Sample IRB Stamp that needs to be located on all recruiting flyers, consent forms, questionnaire forms, and at the start of your internet study.)

When researching, there are multiple sampling strategies that can be implemented. The most common in internet surveys is opportunistic sampling, characterized by studies that allow anyone willing to participate to take the survey. In situations where you have a control and a test group, researchers traditionally divide users in one of two ways: round robin or randomized sampling. In round robin sampling, you simply assign users to alternating groups as they arrive, e.g., the first person is assigned to the control group, the next to the test group, the next to the control group, and so on. In random sampling, users are assigned to either the test or control group with 50% probability. With randomized sampling, your group sizes will not be perfectly equal, but they will be approximately equal, and they will be closer to equal with greater probability as the group sizes increase.
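To make the two schemes concrete, here is a minimal sketch in Python (an illustration only, not code from the workshop; the group labels and participant counter are hypothetical):

```python
import random

def round_robin_assign(participant_index: int) -> str:
    """Alternate arrivals between groups: even index -> control, odd index -> test."""
    return "control" if participant_index % 2 == 0 else "test"

def random_assign() -> str:
    """Assign each arrival to control or test with 50% probability."""
    return random.choice(["control", "test"])

# Round robin keeps the group sizes within one of each other;
# random assignment yields only approximately equal sizes.
round_robin_counts = {"control": 0, "test": 0}
random_counts = {"control": 0, "test": 0}
for i in range(1000):
    round_robin_counts[round_robin_assign(i)] += 1
    random_counts[random_assign()] += 1
print(round_robin_counts, random_counts)
```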


After briefly describing different platforms for conducting Internet-based research, Dr. Hammond moved on to discuss best practices for getting approved by the IRB.
She emphasized that it is crucial to be consistent and concise. Do not say that an experiment will take 20–40 minutes in one place and later state that it takes 30 minutes. While this may seem like the same thing in an informal setting, the IRB will send projects back for corrections over such inconsistencies.

To professors and advisers, Dr. Hammond suggested starting the process early and making sure to review students' work before submitting it to the IRB – make sure that it looks right and is filled out correctly. Dr. Hammond concluded her section by thanking the members of the Sketch Recognition Lab for their help and support.

One woman told Dr. Hammond after her presentation, "I just submitted my IRB last week. Your talk had everything I needed to know. Now I think I need to go back and resubmit." Four others sent emails requesting a copy of the slides, including a liaison from the Texas A&M IRB office who wanted them for "presentations periodically given to faculty or graduate classes." Dr. Hammond responded that she is willing to share the slides with anyone who requests them, and she plans to put a revised version online for public consumption.


____


Dr. Heidi Campbell, an Associate Professor in the Department of Communication, started off the workshop with a presentation titled “Reflections on Digital Methods & Ethics in Internet Studies.” In her presentation, Dr. Campbell discussed the growth of new media as well as how the internet has affected religion and digital cultural studies – a topic on which she has conducted numerous studies.

(Dr. Heidi Campbell)



Through memes and applications that can be downloaded onto cellphones, internet users are able to portray and express religious preferences – both in a satirical fashion and a serious one. As a few examples, student A may use the internet to connect with other people from his or her religious background and download mobile applications that provide access to holy texts. Student B may create a meme to poke fun at something that he sees as flawed in religion, or at an experience that he had with people of the same or a different religion than his own.

(Image taken from Dr. Campbell’s presentation as an example of a religious-based meme)

In regard to conducting religious and cultural studies online, Dr. Campbell stressed the importance of understanding the ethical and methodological considerations that must be accounted for. First, researchers should recognize that content creators online function in a "prosumer" (acting as a producer and consumer simultaneously) and viral culture. This separates them from offline creators. Second, one must manage the collected data responsibly – know where the boundary for mobile applications and text-based resources exists. Just because something is online does not make it ethically usable by everyone. Finally, researchers must consider the difference between private and public information. In doing so, they must respect their human participants in each study and in mapping digital footprints.

____


With this information set in place, Dr. Trey Marchbanks, an Associate Research Scientist at the Public Policy Research Institute, took the baton and ran with it. His presentation, simply titled “Online Research,” covered the advantages, disadvantages, and lessons learned from researching and surveying online.



(Dr. Trey Marchbanks)

Online surveying comes with many advantages – it is cost-efficient, the sample size can be as large or as small as the creator needs, and it is quick to create and send out. This instantaneous availability of online surveys allows the creator to gain much-needed research information in a timely fashion – something that was nearly impossible when surveys could only be done in person or by mail.

Furthermore, with access to the internet – and by extension, the surveys conducted online – being almost limitless, the participants in studies are able to fill out their information whenever is most convenient for them (within a time frame, of course – gaining subject information after the research is concluded is not helpful). Although these advantages are important to present and future means of research, it is imperative to be aware of the disadvantages to online surveying as well.

Being that online research is digital in nature, there is always the possibility of technical difficulties. Will the survey link actually work? Will participants be able to easily fill it out? Will there be glitches that affect the outcome of the research? All of these, and more, are important questions to keep in mind when conducting Internet-based research with human subjects.

Additionally, the response rate to surveys is not that high – many people see surveys as time wasters or spam messages. Thus, responses to questions may be biased, or made in a rush to finish or to make the user feel better about themselves – always take survey answers with a grain of salt. Furthermore, some people may begin the survey, only to either not finish or start answering questions at random out of lack of interest. Finally, there must be some means of following up with people for Internet-based research, and it can be difficult to get accurate details about individuals online.

Based on experience, Dr. Marchbanks offered the workshop valuable lessons that he has learned from online surveying. His first piece of advice was to work with the IT staff early for in-house and in-field research. Preview the survey with others before sending it to participants – pilot it to make sure that it will not be seen as spam and that the results will be comparable. Always give the people involved with the survey plenty of time to respond – just because they have instant access to it does not mean that they will instantly fill it out. Finally, include an opt-out link. If subjects do not wish to continue or complete the survey, let them have a way out of the research – after all, they are volunteers and are willingly helping the researcher. Do not take that for granted.

____


Dr. Rebecca Schlegel, an Assistant Professor in the Department of Psychology, followed, discussing various ways of surveying human subjects via online sources, and also stressed the good and the bad that come with paid surveys and research – such as those on Amazon Mechanical Turk. In some such research, possible participants first fill out a simple and straightforward questionnaire covering demographics such as marital status, name, age, education level, etc. This pre-survey allows the researcher to narrow the audience of their internet survey. If a person has all the traits that the researcher wants – for example, 55 to 60 years old, married, with some college education – that person will be sent a link (or transferred to a different page) that gives access to the full survey. On the other hand, if a person does not have the traits needed to participate in the study, they will not gain access to the survey and will not be paid.



(Dr. Rebecca Schlegel)
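To make that screening flow concrete, here is a minimal sketch in Python using the example criteria above; the field names and page identifiers are hypothetical, and this is an illustration rather than code from the presentation.

```python
def route_participant(presurvey: dict) -> str:
    """Route a respondent based on a demographic pre-survey (hypothetical fields)."""
    qualifies = (
        55 <= presurvey.get("age", 0) <= 60
        and presurvey.get("marital_status") == "married"
        and presurvey.get("education") == "some college"
    )
    # Qualifying respondents continue to the full (paid) survey; others are screened out.
    return "full_survey_page" if qualifies else "screen_out_page"

print(route_participant({"age": 57, "marital_status": "married", "education": "some college"}))
# -> full_survey_page
```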

Platforms like Amazon Mechanical Turk are outlets for paid research. The site allows individuals, researchers, and businesses to post HITs – Human Intelligence Tasks – that computers are currently unable to do, such as choosing between photographs or writing descriptions. Participants can browse these posted tasks and complete them for monetary payment.
Payment in surveys acts as a motivator for people who might otherwise not be interested in completing research for nothing, making up for the time they spend on it. However, payment is low – typically between mere pennies and a dollar. It is a system of positive reinforcement for willing volunteers – a way to show gratitude while allowing the participant to walk away with something of their own.


(Screenshot of Amazon Mechanical Turk from Dr. Hammond's presentation)
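For readers curious how such tasks are posted programmatically, here is a hedged sketch using the boto3 MTurk client; this was not part of any presentation, and the title, reward, survey URL, and counts are placeholder assumptions.

```python
import boto3

# Sketch only: posting a survey as a HIT through the MTurk API via boto3.
mturk = boto3.client("mturk", region_name="us-east-1")

# An ExternalQuestion points workers at a survey hosted elsewhere (placeholder URL).
external_question = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/my-survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Short research survey (about 10 minutes)",
    Description="Answer a brief questionnaire for a university research study.",
    Keywords="survey, research, questionnaire",
    Reward="0.50",                        # payment per assignment, in USD, as a string
    MaxAssignments=100,                   # how many workers may complete the HIT
    LifetimeInSeconds=7 * 24 * 3600,      # how long the HIT stays listed
    AssignmentDurationInSeconds=30 * 60,  # time allowed per worker
    Question=external_question,
)
print(hit["HIT"]["HITId"])
```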


One main issue with these surveys, as Dr. Schlegel described, is “super users.” Super users have made filling out Amazon Mechanical Turk tasks their full-time job. As such, they have seen many of the same classic questions over and over again on different surveys. For instance, super users have reported answering the following question on upwards of 23 surveys.

"A hammer and a nail cost $1.10, and the hammer costs one dollar more than the nail. How much does the nail cost?"

When a user has already seen a question 23 times, you are not getting the answer they would have given on a first encounter. It is important to realize that the Amazon MTurk user base is not necessarily representative of the general population.

____


To finish the workshop's presentations, Dr. Debra Fowler, the Associate Director of the Center for Teaching Excellence, shifted the focus from research based on students to research based on faculty and feedback. While she described her methods as simple compared to those of the other presentations, her expertise shed new light on the subject of internet-based research.

(Dr. Debra Fowler)

Dr. Fowler honed in on surveys taken by faculty and students that offer feedback and ratings – such as end-of-year professor reviews. The information collected from these surveys allows the Center for Teaching Excellence to better understand what changes need to be made at the university to benefit and support students and faculty alike.

While many such surveys are taken by people at Texas A&M University, Dr. Fowler also mentioned that some research is conducted from afar. Surveys are sent to outside universities, allowing professors there to rate how well graduates of Texas A&M University perform in their classes. This provides feedback that can be compared against how well students are learning their area of study while attending the university. Furthermore, collecting this data gives the Center for Teaching Excellence an idea of how well the university is serving its students and whether any changes need to be made to better the university as a whole – thus helping maintain the reputation of Texas A&M University students as excellent workers and scholars.

Dr. Fowler concluded the afternoon of presentations with a final point that applies to anyone who is part of a research team or project – the hardest part of the information and survey process is getting participants to understand its importance and necessity. People who are not directly affected by the research may think that it is just a time-consuming process that can be ignored, or that filling out a survey is not worth the trouble.

For any project to move forward – regardless of the field of study or line of work it is in – true information must be gathered and carefully analyzed. While it may be impossible to thank every participant individually, researchers will always be grateful to everyone who helps them along the way.
