Most Facebook News Links Are Unread by Users Sharing Them, Study Finds
What are the odds Facebook users read the news articles they end up sharing with the rest of the world?
Only one in four, according to a recent study of our news sharing behavior spanning almost a full U.S. election cycle, which was published in Nature Human Behaviour.
The study, which applied machine learning to analyze more than 35 million public Facebook posts between 2017 and 2020, revealed that 75% of news links were shared without the sharer first clicking the link to read the underlying article.
The striking pattern was reflected in about 80 billion blind shares over four years. Researchers say the findings offer a window into the widespread phenomenon of "shares without clicks" (SwoCs) on the platform, and highlight the superficial processing of political news content as a potential driver of misinformation spread online.
“As simple as it may sound, I think the sheer amount of SwoCs found on Facebook across four years is the key finding. Many may assume that users verify beforehand, or at least confirm, the political news they share on social media. This study suggests 75% don’t,” said Eugene Cho Snyder, assistant professor of Humanities and Social Sciences at NJIT, who served as the lead data wrangler and second author of the study.
The team, led by S. Shyam Sundar, James P. Jimirro Professor of Media Effects and director of Penn State’s Center for Socially Responsible Artificial Intelligence, applied computational techniques to big-data analysis to explore the political news-sharing patterns of millions of Facebook users.
To sort political from non-political content, the team trained a "machine-learning-driven political classifier" on 8,000 manually reviewed URL titles and blurbs from Facebook's extensive URL dataset, then applied it to the millions of widely circulated news stories on the platform.
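The article does not specify the model behind the classifier, but a minimal sketch of such a title-and-blurb classifier, assuming TF-IDF features and logistic regression (common choices for this kind of text task, not confirmed by the study), might look like this; `labeled_urls.csv` is a hypothetical file standing in for the 8,000 manually reviewed examples:

```python
# Sketch of a political-vs-non-political text classifier.
# Assumptions: TF-IDF bag-of-words features and logistic regression;
# the study's actual model and features are not described in this article.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data: columns title, blurb, is_political (0/1)
labeled = pd.read_csv("labeled_urls.csv")
text = (labeled["title"].fillna("") + " " + labeled["blurb"].fillna(""))

X_train, X_test, y_train, y_test = train_test_split(
    text, labeled["is_political"], test_size=0.2, random_state=42
)

clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # unigrams + bigrams
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

Once trained on the hand-labeled sample, a classifier like this can be run over the full set of URLs to flag political content at scale.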
Snyder says the analysis, covering more than 35 million posts from a dataset Meta granted in collaboration with the Harvard-based research consortium Social Science One, presented significant challenges, not least the need to protect Facebook users’ privacy.
"Simply retrieving a few months' worth of data sometimes took days," Snyder explained. "The Facebook data also included differential privacy noise to protect user privacy. Interestingly, we learned that the aggregated findings from Facebook posts that reached a certain number of shares —URLs shared over 2,395 times — were less subject to differential privacy noise. We analyzed these posts, as they offer a big picture to macro-level social media patterns without compromising individual-level privacy.”
Within Meta’s dataset, Facebook users were categorized into five political groups based on the media and political pages they followed: very liberal, moderately liberal, neutral, moderately conservative and very conservative. From this categorization, the team calculated a political affinity score at the content level for each domain and URL.
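The article does not give the study's exact formula, but one common way to derive such a score, shown here purely as an assumed sketch, is a share-weighted average of the audience's leanings, with the five groups mapped onto a -2 (very liberal) to +2 (very conservative) scale:

```python
# Hedged sketch of a content-level political affinity score: the
# share-weighted mean leaning of a URL's sharers. The -2..+2 scale and
# the weighting scheme are assumptions, not the study's stated method.
GROUP_SCORES = {
    "very_liberal": -2, "moderately_liberal": -1, "neutral": 0,
    "moderately_conservative": 1, "very_conservative": 2,
}

def political_affinity(shares_by_group: dict[str, int]) -> float:
    """Return the weighted mean leaning of a URL's sharers."""
    total = sum(shares_by_group.values())
    if total == 0:
        return 0.0
    return sum(GROUP_SCORES[g] * n for g, n in shares_by_group.items()) / total

# Example: a URL shared mostly by conservative-leaning users.
print(political_affinity({
    "very_liberal": 50, "moderately_liberal": 120, "neutral": 300,
    "moderately_conservative": 900, "very_conservative": 700,
}))  # ~1.0, i.e. moderately conservative-leaning content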
In the end, the study found that partisans on both sides engaged in SwoCs more than neutral users did, especially with political content aligned with their own stance, a pattern that held across all four years.
The analysis also revealed a notable shift in blind-sharing patterns over the four years. Moderately to very liberal groups accounted for the most SwoCs on political news links in 2017 and 2018, but the pattern began to shift in 2019, and by 2020 very conservative users had become the most frequent blind sharers.
“Another interesting finding was that conservative news domains potentially distribute more misinformation, which may lead more conservative users to share such content without clicking,” explained Snyder. “However, only a small portion of the Facebook posts we analyzed were fact-checked by a third party or had external media-bias ratings for the news domains they came from, so this finding deserves careful interpretation.”
Content verification across Meta's platforms continues to evolve — the company recently announced that it would end its third-party fact-checking program on Facebook, Instagram and Threads in favor of a "Community Notes" model.
For now, Snyder says that while the findings raise questions about how social media platforms should evolve to address widespread blind sharing, more research is needed to better understand user sharing behavior.
“More research should be done to gain deeper knowledge of this phenomenon … it is possible some Facebook posts are studied by users before sharing, via means other than clicking on them first,” said Snyder. “Suggesting practical and effective ways toward policy changes and corporate actions, such as promoting content verification on social media through UX design and digital literacy, is something we have to think about together as social media users.”