Hiding opinions from machine learning

Marcin Waniek, Walid Magdy, Talal Rahwan

Research output: Contribution to journal › Article › peer-review


Recent breakthroughs in machine learning and big data analysis are allowing our online activities to be scrutinized at an unprecedented scale, and our private information to be inferred without our consent or knowledge. Here, we focus on algorithms designed to infer the opinions of Twitter users toward a growing number of topics, and consider the possibility of modifying the profiles of these users in the hope of hiding their opinions from such algorithms. We ran a survey to understand the extent of this privacy threat, and found evidence suggesting that a significant proportion of Twitter users wish to avoid revealing at least some of their opinions about social, political, and religious issues. Moreover, our participants were unable to reliably identify the Twitter activities that reveal one’s opinion to such algorithms. Given these findings, we consider the possibility of fighting AI with AI, i.e., instead of relying on human intuition, people may have a better chance at hiding their opinion if they modify their Twitter profiles following advice from an automated assistant. We propose a heuristic that identifies which Twitter accounts the users should follow or mention in their tweets, and show that such a heuristic can effectively hide the user’s opinions. Altogether, our study highlights the risk associated with developing machine learning algorithms that analyze people’s profiles, and demonstrates the potential to develop countermeasures that preserve the basic right of choosing which of our opinions to share with the world.

Original language: English (US)
Article number: pgac256
Journal: PNAS Nexus
Issue number: 5
State: Published - Nov 1 2022


Keywords

  • machine learning
  • privacy
  • social media
  • stance detection

ASJC Scopus subject areas

  • General


