The algorithms powering Facebook and Instagram, which drive what billions of people see on the social networks, have been in the cross hairs of lawmakers, activists and regulators for years. Many have called for the algorithms to be abolished to stem the spread of viral misinformation and to prevent the inflammation of political divisions.

But four new studies published on Thursday — including one that examined the data of 208 million Americans who used Facebook in the 2020 presidential election — complicate that narrative.

In the papers, researchers from the University of Texas, New York University, Princeton and other institutions found that removing some key functions of the social platforms’ algorithms had “no measurable effects” on people’s political beliefs. In one experiment on Facebook’s algorithm, people’s knowledge of political news declined when their ability to re-share posts was removed, the researchers said.

At the same time, the consumption of political news on Facebook and Instagram was highly segregated by ideology, according to another study. Ninety-seven percent of the people who read links to “untrustworthy” news stories on the apps during the 2020 election identified as conservative and largely engaged with right-wing content, the research found.

The studies, which were published in the journals Science and Nature, provide a contradictory and nuanced picture of how Americans have been using — and have been affected by — two of the world’s biggest social platforms. The conflicting results suggested that untangling social media’s role in shaping discourse may take years.

The papers also stood out for the large numbers of Facebook and Instagram users who were included and because the researchers obtained data and formulated and ran experiments with collaboration from Meta, which owns the apps. The studies are the first in a series of 16 peer-reviewed papers. Previous social media studies have relied mostly on publicly available information or were based on small numbers of users with information that was “scraped,” or downloaded, from the internet.

Talia Stroud, the founder and director of the Center for Media Engagement at the University of Texas at Austin, and Joshua Tucker, a professor and co-founder of the Center for Social Media and Politics at New York University, who helped lead the project, said they “now know just how influential the algorithm is in shaping people’s on-platform experiences.”

But Ms. Stroud said in an interview that the research showed the “quite complex social issues we’re dealing with” and that there was likely “no silver bullet” for social media’s effects.

“We must be careful about what we assume is happening versus what actually is,” said Katie Harbath, a former public policy director at Meta who left the company in 2021. She added that the studies upended the “assumed impacts of social media.” People’s political preferences are influenced by many factors, she said, and “social media alone is not to blame for all our woes.”

Meta, which announced it would participate in the research in August 2020, spent $20 million on the work from the National Opinion Research Center at the University of Chicago, a nonpartisan agency that aided in collecting some of the data. The company did not pay the researchers, though some of its employees worked with the academics. Meta was able to veto data requests that violated its users’ privacy.

The work was not a model for future research since it required direct participation from Meta, which held all the data and provided researchers only with certain kinds, said Michael Wagner, a professor of mass communications at the University of Wisconsin-Madison, who was an independent auditor on the project. The researchers said they had final say over the papers’ conclusions.

Nick Clegg, Meta’s president of global affairs, said the studies showed “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have meaningful effects on these outcomes.” While the debate about social media and democracy would not be settled by the findings, he said, “we hope and expect it will advance society’s understanding of these issues.”

The papers arrive at a tumultuous time in the social media industry. This month, Meta rolled out Threads, which competes with Twitter. Elon Musk, Twitter’s owner, has changed the platform, most recently renaming it X. Other sites like Discord, YouTube, Reddit and TikTok are thriving, with new entrants such as Mastodon and Bluesky appearing to gain some traction.

In recent years, Meta has also tried shifting the focus away from its social apps to its work on the immersive digital world of the so-called metaverse. Over the past 18 months, Meta has seen more than $21 billion in operating losses from its Reality Labs division, which is responsible for building the metaverse.

Researchers have for years raised questions about the algorithms underlying Facebook and Instagram, which determine what people see in their feeds on the apps. In 2021, Frances Haugen, a former Facebook employee turned whistle-blower, further put a spotlight on them. She provided lawmakers and media with thousands of company documents and testified in Congress that Facebook’s algorithm was “causing teenagers to be exposed to more anorexia content” and was “literally fanning ethnic violence” in countries such as Ethiopia.

Lawmakers including Senator Amy Klobuchar, a Democrat of Minnesota, and Senator Cynthia Lummis, a Republican of Wyoming, later introduced bills to study or limit the algorithms. None have passed.

In three of the four studies published on Thursday, Facebook and Instagram users were asked for and gave consent to participate, with their identifying information obscured. In the fourth study, the company provided researchers with anonymized data on 208 million Facebook users.

One of the studies was titled “How do social media feed algorithms affect attitudes?” In that research, which included more than 23,000 Facebook users and 21,000 Instagram users, researchers replaced the algorithms with reverse chronological feeds, which means people saw the most recent posts first instead of posts that were largely tailored to their interests.

Yet people’s polarization and political knowledge did not change, the researchers found. In the academics’ surveys, people did not report shifting their behaviors, such as signing more online petitions or attending more political rallies, after their feeds were changed.

Worryingly, a feed in reverse chronological order increased the amount of untrustworthy content that people saw, according to the study.

The study that looked at the data from 208 million American Facebook users during the 2020 election found they were divided by political ideology, with those who identified as conservatives seeing more misinformation than those who identified as liberals.

Conservatives tended to read far more political news links that were also read almost exclusively by other conservatives, according to the research. Of the news articles marked by third-party fact checkers as false, more than 97 percent were viewed by conservatives. Facebook Pages and Groups, which let users follow topics of interest to them, shared more links to hyperpartisan articles than users’ friends did.

Facebook Pages and Groups were a “very powerful curation and dissemination machine,” the study said.

Still, the proportion of false news articles that Facebook users read was low compared with all news articles viewed, researchers said.

In another paper, researchers found that reducing the amount of content in 23,000 Facebook users’ feeds that was posted by “like-minded” connections did not measurably alter the beliefs or political polarization of those who participated.

“These findings challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy,” the study’s authors said.

In a fourth study that looked at 27,000 Facebook and Instagram users, people’s knowledge of political news fell when their ability to re-share posts was taken away in an experiment. Removing the re-share button ultimately did not change people’s beliefs or opinions, the paper concluded.

Researchers cautioned that their findings were affected by many variables. The timing of some of the experiments right before the 2020 presidential election, for instance, could have meant that users’ political attitudes had already been cemented.

Some findings may be outdated. Since the researchers embarked on the work, Meta has moved away from showcasing news content from publishers in users’ main news feeds on Facebook and Instagram. The company also regularly tweaks and adjusts its algorithms to keep users engaged.

The researchers said they nonetheless hoped the papers would lead to more work in the field, with other social media companies participating.

“We very much hope that society, through its policymakers, will take action so this kind of research can continue in the future,” said Mr. Tucker of New York University. “This should be something that society sees in its interest.”

Mike Isaac and Sheera Frenkel
