How Do Gay and Bisexual Men Feel about Their Privacy When Using Apps? Not Great.

Institute for Sexual and Gender Minority Health and Wellbeing Press Release


Dr. H. Jonathon Rendina
Dr. Brian Mustanski

Media Contact: Zaina Awad

Earlier this year, it was revealed that Facebook had provided a political firm with personal data on tens of millions of users. Just a few months later, Grindr was found to have shared profile information with third-party firms hired to assist them with developing new HIV prevention features. Although both companies were acting within their rights based on the terms users agreed to when first signing up, these cases stirred up public outrage and attracted significant media attention. Months before the news of these incidents broke, Dr. Jonathon Rendina, Assistant Professor of Psychology at Hunter College, CUNY, and Dr. Brian Mustanski, Professor of Medical Social Sciences and Director of the Institute for Sexual and Gender Minority Health and Wellbeing at Northwestern University, conducted a study to better understand how gay, bisexual, and other men who have sex with men (GBMSM) felt about the privacy of their personal information when using apps.

Between April and May 2017, Dr. Rendina and Dr. Mustanski surveyed 11,032 GBMSM (including both cisgender and transgender males) around the United States, asking them to imagine a hypothetical app with features similar to today's most popular apps. Participants were asked to report their level of concern about the app developers collecting or generating 12 different types of data, including basic profile information like their age, the types of people they interact with in the app, the types of advertisements they interact with in the app, and data from their phone’s operating system about other apps installed or GPS location. The researchers also asked participants to consider the same issues with regard to the app developers selling the data anonymously to third parties and sharing the data anonymously with researchers.

The study showed that, on average, participants were concerned about several of the data points gathered, and were most wary of app developers collecting device usage information (like other apps installed on the device) and device GPS location. When the question was about app developers anonymously selling the data to third parties, the average person was concerned about 8-10 of the 12 types of data. In contrast, when asked about anonymously sharing the data with researchers, the average person was concerned about only 3-5 of the types of data, with device-generated data again topping the list of concerns.

These findings are important because the personal information people were asked about regarding the hypothetical app is collected by most of the apps people use in real life. And, in most cases, users must agree to the collection, sharing, and even potential sale of these data within the terms of use. This suggests people are giving apps access to their data despite serious concerns about how it might be used, likely because most people do not read the terms, given their length and jargon, and because they feel pressured to comply, since an app cannot be used without agreeing to its terms and conditions.

Given these findings, what can apps and websites do to regain public trust and reduce the privacy concerns of their users? Dr. Rendina notes, “One of the clearest implications is that rushing to change terms of service and privacy policies while keeping them written in an inaccessible fashion won’t work.” Just as academics struggle with the important task of sharing their research in easily understood language, app developers and companies grapple with ensuring that their consumers understand what they’re signing up for. Dr. Mustanski suggested that “some steps that can help include presenting users with a summary of what people care about the most, presenting the ‘gist’ up front, and providing a longer explanation afterwards for those who want it.” It may also be helpful to consider using multimedia, like animations or videos, to capture the user’s attention. Dr. Rendina also noted, “One of the most evident findings of the study was that people like to have control over their data, which suggests that allowing them to opt out of things that may concern them would likely be effective.”

For policymakers interested in protecting privacy, the findings point to a need for regulations that require these kinds of protections. When it comes to academics, it is important to note that users also expressed concern about trusting researchers with their personal data. To increase public trust in research, researchers need to better understand the privacy implications of the technologies they use and work with their institutions to ensure that the ways in which they collect and store data adhere to the most rigorous privacy protections.

About the study: The results of the study, entitled “Privacy, Trust, and Data Sharing in Web-Based and Mobile Research: Participant Perspectives in a Large Nationwide Sample of Men Who Have Sex With Men in the United States” (doi: 10.2196/jmir.9019), by H. Jonathon Rendina and Brian Mustanski, were published in the Journal of Medical Internet Research (JMIR). The study was conducted in part with funding from the National Institute on Drug Abuse (K01-DA039030, PI: H. Jonathon Rendina; R25-DA031608, PI: Celia B. Fisher) as well as with funding provided by Hunter College, CUNY. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health, the Fordham HIV Prevention Research Ethics Training Institute, Hunter College, CUNY, or Northwestern University.