Study finds 75% of Facebook links shared without clicking or reading


Social media is reshaping how news spreads, but often without substance. A recent deep dive into 35 million Facebook posts shared between 2017 and 2020 showed that more than 75% of shared links are never even clicked, letting extreme and partisan content dominate the conversation online.

“When a friend shares a link to a news story, we tend to assume that the friend has in some way vetted its contents and has determined its relevance to us,” said S. Shyam Sundar, a researcher at Pennsylvania State University, USA, in an email. “Our study shows that this assumption is not valid, that most of the time the friend is not really gatekeeping the information, but instead passing it on without even reading the contents fully, let alone vetting it for accuracy or relevance to you.”

The rapid pace of online interaction encourages social media users to disseminate news through shares without clicks: people often hit the share button without having clicked on and actually read the news article, spreading it to everyone in their online circle.

A flashy headline and a high number of shares or likes can easily tempt users to share an article.

Missing information

News headlines and blurbs often convey little nuance or tone, limited as they are in word count. Previous research has confirmed that shares without clicks do not translate into a true understanding of the political or scientific angle of the news story. So, while users may feel like they know more, they actually know very little that is based on fact. Indiscriminate sharing of this kind can appreciably accelerate the spread of misinformation and disinformation.

Sundar and colleagues explored how political affinity influences this behavior across both conservative and liberal news sources. They sourced their data from the Facebook URLs dataset offered by Social Science One, a research consortium for social and behavioral data at Harvard University, in collaboration with Meta.

While the rate of shares without clicks was the main measure, the team also examined whether this behavior was affected by how polarizing the political content was, the user's political affinity, and whether the two aligned. The political page affinity score was established externally on the basis of which pages users followed on social media. The research team could then assign users to one of five groups: very liberal, liberal, neutral, conservative, and very conservative.
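
The article does not spell out the dataset's score range or the exact cut-offs between groups, but the grouping step itself is straightforward. The short Python sketch below is purely illustrative: the score scale, the thresholds, and the affinity_group function are assumptions chosen for demonstration, not details taken from the study.

```python
# Illustrative sketch only: map a hypothetical page-affinity score to the five
# user groups named in the article. The score range (-2 to +2) and the cut-offs
# are assumptions, not values reported by the researchers.
def affinity_group(score: float) -> str:
    """Assign a user to a political affinity group from a numeric score."""
    if score <= -1.5:
        return "very liberal"
    elif score < -0.5:
        return "liberal"
    elif score <= 0.5:
        return "neutral"
    elif score < 1.5:
        return "conservative"
    else:
        return "very conservative"

print(affinity_group(-1.8))  # very liberal
print(affinity_group(0.2))   # neutral
```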

They categorized 8,000 URLs into “political” and “non-political” to create a training dataset for a machine learning algorithm, which then picked out political URLs across the larger dataset on the basis of specific terms found in the content as well as which type of user chose to share the link.
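
The article does not name the specific model used, so the Python sketch below only illustrates the general approach: a text classifier is trained on a small hand-labeled set and then applied to the rest of the corpus. The example headlines, the TF-IDF features, and the logistic regression model are assumptions for demonstration; the actual study also drew on which types of user shared each link, which this sketch omits.

```python
# Illustrative sketch, not the authors' pipeline: train a simple classifier on a
# small set of hand-labeled headlines, then apply it to unlabeled ones.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training data (stand-ins for the 8,000 categorized URLs).
train_texts = [
    "Senate passes new election security bill",
    "Governor signs state budget after long debate",
    "10 easy weeknight dinner recipes",
    "How to keep houseplants alive in winter",
]
train_labels = ["political", "political", "non-political", "non-political"]

classifier = make_pipeline(
    TfidfVectorizer(lowercase=True, ngram_range=(1, 2)),  # bag-of-words features
    LogisticRegression(max_iter=1000),                    # simple linear model
)
classifier.fit(train_texts, train_labels)

# Apply the trained model to the (much larger) unlabeled portion of the corpus.
print(classifier.predict(["Senate committee opens election inquiry"]))
# expected output: ['political']
```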

The researchers found that shares without clicks across all domains and URLs accounted for 76.71% of the 56.4 billion recorded shares. When users were politically aligned with a particular domain, like CNN or FOX News, they were more likely to share without clicking on the link. Similarly, URLs that lined up with users’ political affinity were more often shared without being clicked.

“What we find is that the more politically extreme the content is, the more it is shared without being clicked upon first. This is true for both extreme left and extreme right. As we know, there tends to be a lot of strong opinions and biased commentary on the extremes of the political spectrum,” said Sundar. “As such, there is more scope for fake news and conspiracy theories masquerading as legitimate news in politically extreme news domains.”

Across 4,617 top domains and all five user groups, moderate liberals and very conservative users were the most prolific sharers — not readers — of links. When looking at URLs, the researchers noted that moderate to highly liberal users shared the most political content in 2017 and 2018, with very conservative users becoming more active in 2020.

A breeding ground for misinformation

Facebook networks combined with users’ tendency to endorse news that aligns with their political beliefs — often without verifying its accuracy — create the perfect conditions for extreme news to go viral. Moreover, people are less likely to encounter content that challenges their beliefs, thanks to the influence of their networks and the narrowing effect of social media algorithms.

“This tells us that social media users are simply glancing at the headline and the blurb when deciding to blast a news link to their networks,” added Sundar. “Such dissemination can have a multiplicative effect and result in rapid spread of information to millions of folks online.”

The team also assessed whether shares without clicks were more likely to occur when fake news was involved. For URLs labeled as false by third-party fact-checkers, conservative users accounted for a higher total number of shares without clicks than liberal users: of the shares without clicks on the 2,969 false news URLs, some 76.94% came from conservatives and 14.25% from liberals. Conversely, liberal users accounted for the largest absolute number of shares without clicks for true news URLs, with both sides sharing more than neutral users.

But this difference in sharing patterns was influenced by news domains, with more fake news (76% to 82% in this dataset) being published by conservative domains compared to liberal domains. “This suggests that if politically partisan users see a headline that seems aligned with their political ideology, they will readily share the story without bothering to verify if it is really true,” added Sundar.

The researchers suggest that future studies should actively observe and analyze user behavior and take into account the various devices and online platforms users access to better understand the prevalence of shares without clicks.

Presently, some social media platforms let users know that the veracity of a source has not been validated, but more can be done to limit the spread of such content.

“Social media platforms should redesign their interfaces to introduce friction in the sharing process. This runs counter to general practice of interaction design, which emphasizes ease of use and usability,” added Sundar. “Studies like ours demonstrate the downside of this practice, by highlighting that when it is easy for users to click on buttons, they think less about their actions and as a result participate unwittingly in the spread of misinformation.”

This additional friction could take the form of warnings or alerts that caution users about the pervasiveness of shares without clicks, or prompts that ask whether the user has actually read the content before sharing the link.

Reference: S. Shyam Sundar, et al., Sharing without clicking on news in social media, Nature Human Behaviour (2024). DOI: 10.1038/s41562-024-02067-4


