
Facebook News Consumers Are More Anti-Vaccine Than Fox Viewers



Photo: Activists with the Real Facebook Oversight Board lay body bags outside Facebook’s Washington, DC headquarters and call on the company to stop disinformation that leads to covid deaths, Wednesday, July 28, 2021. (Eric Kayne/AP Images for All the Citizens Ltd.)

Joe Biden might want to consider backpedaling his backpedal of the accusation that social media companies, Facebook in particular, are “killing people” by spreading vaccine conspiracies and bunk. A new study suggests that Facebook’s news consumers are inordinately unwilling to get the covid-19 vaccine.

Facebook fired back at President Biden’s comment earlier this month with a blog post citing a study from Carnegie Mellon University’s Delphi Group. It reported that, among the millions of U.S.-based Facebook users surveyed, 85% were vaccinated or planned to get vaccinated. “President Biden’s goal was for 70% of Americans to be vaccinated by July 4,” the company sniffed. “Facebook is not the reason this goal was missed.” Biden later clarified that he meant misinformation is killing people.

But the study didn’t account for people who consume news through Facebook, a group potentially exposed to its massive disinformation mill and targeted with the content Facebook believes will get the most engagement. A new study of that user group nudges Facebook off its high horse.

Researchers from numerous universities, specializing in various public health and political science-related fields, surveyed 20,669 people across all 50 states and D.C. between June and July 2021. They found that 25% of people who had gotten news only from Facebook in the previous 24 hours said they won’t get vaccinated, a share of vaccine refusal exceeded only by Newsmax viewers (41%) and slightly higher than Fox viewers (23%).


An alarmingly high portion of people got their news (“news”) through Facebook. About a third (31%) had consumed news from Facebook over the previous 24 hours, making Facebook the second-largest news source in the survey, behind only CNN. The researchers didn’t define Facebook “news,” which could mean anything from user-generated content to Tucker Carlson to the New York Times.


As researcher David Lazer, a professor of political science and computer science at Northeastern University, pointed out to Gizmodo, Facebook’s numbers simply align with overall population data. “The 85% figure, depending on the threshold [the Delphi Group] used, roughly matches our numbers for the general population for being ‘potentially’ willing to get vaccinated,” he wrote. “Indeed, most surveys find about 15% of the population that is really hardcore that says they will never get the vaccine.”

Facebook and Delphi’s numbers (which include people only probably willing to get vaccinated) gel with the CDC’s report that nearly 70% of the U.S. adult population has received at least one dose of the vaccine and the Kaiser Family Foundation’s finding that 16% of U.S. residents don’t plan to get the vaccine unless forced to. In other words, Facebook’s 85% figure simply mirrors the general population.

Facebook could clean up the site; activists and researchers have spent the past year telling it exactly who the culprits are. And if it really wants to place the blame on users, it could stop algorithmically recommending the most “engaging” content, be it from Ben Shapiro or Aunt Charlene. Facebook will never be able to say it’s done everything it can to fight misinformation as long as it continues recommending content as a business practice. A March 2021 report by the Center for Countering Digital Hate found that 73% of vaccine disinformation originated from just twelve people. Today, the activist group Real Facebook Oversight backed up those findings with a report that over 83% of the posts with the most engagement this quarter came from five disinformation spreaders.

That group also dropped a bunch of body bags at Facebook’s door this morning, as pictured above. Facebook’s policy communications director Andy Stone tweeted that they’re out for “cheap stunts” and linked to the same insubstantial blog post claiming that 85% of U.S. Facebook users are vaccinated or plan to be.

There’s no way to prove that people are dying specifically because of things they read on Facebook, but associating a primary source of vaccine disinformation with death is not a performative exaggeration. As covid-19 case rates double and triple, especially in states with paltry vaccination rates like Louisiana, Mississippi, and Alabama, we’re reading daily reports of sufferers who, on their deathbeds, wished they’d gotten the vaccine. Doctors are pleading with the public to reconsider.

A pastor told Dallas-Fort Worth’s Fox affiliate that, after a brush with death, he regretted believing disinformation. A 27-year-old who suffered a severe case said he’d believed he didn’t need the vaccine because he was young and fit. One mother who nearly died told ClickOrlando.com that she had let disinformation-spreaders sway her with government conspiracy theories. A grieving mother recounted her 28-year-old son’s dying words to the Washington Post: “This is not a hoax, this is real.”

Facebook has historically chosen to sweep criticism under the rug with broad statistics about the disinformation it’s removed, headcounts of moderators, pledges to change, and added labels, but none of that translates to meaningful responsibility as a leading news source.

So Facebook’s hands-off attitude has reached Executive Branch intervention time. Earlier this month, White House Press Secretary Jen Psaki told reporters that, fine, the Biden Administration will do the job. She said they’re tracking covid-19 misinformation on Facebook and are making a series of recommendations for the company, and days later, Facebook told Biden to quit “finger-pointing.”




DOJ Seizes Middle East News Sites for Allegedly Spreading Disinformation


Iran’s President-elect Ebrahim Raisi is pictured during his first press conference in the Islamic republic’s capital Tehran, on June 21, 2021.
Photo: Atta Kenare (Getty Images)

The U.S. Department of Justice seized 36 domains associated with news outlets in Iran, Yemen, and Palestine on Tuesday, according to a press release from the DOJ. The domains were seized for allegedly promoting disinformation campaigns and for violating U.S. sanctions against the Islamic Revolutionary Guard Corps and radical terrorist groups.

The U.S. government was able to seize the .com and .net domains because those registries are American-owned, despite the sites being operated from the Middle East. The news websites taken by the U.S. government include Iran’s Press TV, Iran’s Al Alam, Iraq’s Al Forat News, Palestine’s Pal Today, Yemen’s Al Masirah TV, and Iraq’s Karbala TV, among plenty of others.

The websites now all show a notice declaring that the domain has been seized by the U.S. government after obtaining a warrant:


Image: U.S. Department of Justice

The Iranian Islamic Radio and Television Union (IRTVU) was placed on a special list by the Office of Foreign Assets Control in October 2020, which means that Iranian news outlets aren’t allowed to receive website and domain services based in the U.S. without a special waiver.

The DOJ alleges that Iran’s news sites were merely “disguised as news organizations or media outlets” and “targeted the United States with disinformation campaigns and malign influence operations.”

Notably, Press TV, the Iranian state media channel, remains active on Twitter and has already moved its English-language content to presstv.ir, under Iran’s own top-level domain.

Press TV has already had on a number of guests, including politicians and academics, who question America’s commitment to free speech if it censors the websites of foreign news outlets.

“Apparently they don’t think that they can debate Press TV’s point of view, they are apparently unable to refute what Press TV is saying, they are unable to win a free and fair debate,” Kevin Barrett, an American radio host and critic of U.S. foreign policy against Iran, told Press TV. “The only way they can win the debate is preventing Press TV from presenting its viewpoints.”


Facebook, Twitter, and YouTube Execs to Testify at Senate Hearing



Photo: Chip Somodevilla (Getty Images)

Next week, policy executives from Facebook, YouTube, and Twitter will testify at a Senate Judiciary hearing on algorithmic amplification, Politico reports. Social media recommendation algorithms have come under increasing scrutiny in recent years, and Democratic lawmakers have voiced concerns about how they can fuel extremism and the spread of misinformation online.

The Senate Judiciary Subcommittee on Privacy, Technology, and the Law is hosting the hearing, which is scheduled for April 27. It will feature testimony from Monika Bickert, Facebook’s vice president of content policy; Lauren Culbertson, Twitter’s head of U.S. public policy; and Alexandra Veitch, YouTube’s director of government affairs and public policy for the Americas and emerging markets. The panel will also hear from two outside experts: Tristan Harris, president of the Center for Humane Technology, and Joan Donovan, research director at the Shorenstein Center on Media, Politics, and Public Policy.

Congressional aides who spoke with Politico said the committee may call on Big Tech CEOs like Facebook’s Mark Zuckerberg and Twitter’s Jack Dorsey for future panels. Senator Chris Coons of Delaware, who chairs the subcommittee, said he was considering that option in an interview with the outlet last month.

However, by first hauling in the platforms’ policy executives instead of their CEOs, the panel aims to focus discussions on structural issues and content moderation and avoid “the typical airing of grievances” about the platforms at large that have dominated previous hearings, according to the congressional aides. They also hope to drum up bipartisan support by focusing on these sorts of systemic issues as opposed to how platforms handle specific content, such as political speech, Politico’s sources said.

Democratic lawmakers have been increasingly pushing to hold social media platforms accountable for how their recommendation algorithms amplify harmful and extremist content. In January, House Representatives Tom Malinowski of New Jersey and Anna Eshoo of California sent a series of letters to Big Tech CEOs calling on them to rework their recommendation systems, particularly in the wake of the Capitol Hill attack on January 6. Last month, Malinowski and Eshoo reintroduced legislation to amend Section 230 so that online platforms lose liability immunity if these systems promote content that leads to real-world harms, such as acts of terrorism or civil rights violations.

On Friday, Coons reiterated his concerns about algorithmic amplification and outlined plans to make holding social media companies accountable one of his subcommittee’s top priorities.

“Social media platforms use algorithms that shape what billions of people read, watch and think every day, but we know very little about how these systems operate and how they’re affecting our society,” he told Politico. “Increasingly, we’re hearing that these algorithms are amplifying misinformation, feeding political polarization and making us more distracted and isolated.”

The hearing is slated to begin at 10 a.m. ET on Tuesday, April 27, and will be livestreamed on the Senate Judiciary Committee’s website.




How to Stop People From Sharing Misinformation, ‘Fake News’


Photo: Josh Edelson (Getty Images)

A new study in Nature suggests that shifting readers’ attention online can help combat the spread of inaccurate information. The study, published on March 17, 2021, found that while people prefer to share accurate information, it’s difficult to get the average social media user to check accuracy before sharing. Adding a roadblock in the form of a request to rate a headline’s accuracy can actually change the quality of the information they share online.

“These findings indicate that people often share misinformation because their attention is focused on factors other than accuracy—and therefore they fail to implement a strongly held preference for accurate sharing,” write the authors, suggesting that people usually want to do good but often fail in the heat of the moment. “Our results challenge the popular claim that people value partisanship over accuracy and provide evidence for scalable attention-based interventions that social media platforms could easily implement to counter misinformation online.”

The paper, co-authored by David Rand, Gordon Pennycook, Ziv Epstein, Mohsen Mosleh, Antonio Arechar, and Dean Eckles, even suggests a remedy.

First, the researchers confirmed that it was accuracy, not partisan politics of any stripe, that concerned the average social media user. While subjects showed a bias one way or the other in what they tended to share, the researchers found that most of the 1,001-subject cohort chose accuracy over inflammatory or potentially “hot” content.

The researchers went on to recommend roadblocks when it comes to sharing news and information online, thereby reducing the chance that an inaccurate story slips past the reader’s internal censor. From the study:

To test whether these results could be applied on social media, the researchers conducted a field experiment on Twitter. “We created a set of bot accounts and sent messages to 5,379 Twitter users who regularly shared links to misinformation sites,” explains Mosleh. “Just like in the survey experiments, the message asked whether a random nonpolitical headline was accurate, to get users thinking about the concept of accuracy.” The researchers found that after reading the message, the users shared news from higher-quality news sites, as judged by professional fact-checkers.

In other words, many users don’t share fake news because they want to, but because they don’t think about what they’re sharing before they press the button. Slowing down the process by asking users whether or not they actually trust a headline or news source makes them far more likely to think twice before sharing misinformation.
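To make the shape of such an intervention concrete, here’s a minimal sketch of an accuracy-prompt gate in a share flow, written in TypeScript. Everything here is a hypothetical illustration of the idea described above; the names (`NewsItem`, `promptAccuracyRating`, `shareWithNudge`) are invented for this sketch and don’t correspond to any platform’s actual code or the researchers’ implementation.

```typescript
// Hypothetical sketch of an attention-based "accuracy nudge" in a share flow.
// None of these names correspond to a real platform API.

interface NewsItem {
  headline: string;
  url: string;
}

type AccuracyRating = "accurate" | "not accurate" | "unsure";

// Ask the user to rate a headline's accuracy before their share goes through.
// Per the study, the question itself is the intervention: it shifts the
// user's attention back to accuracy rather than blocking the share outright.
async function promptAccuracyRating(item: NewsItem): Promise<AccuracyRating> {
  // A real UI would render a modal here; this stub just logs the prompt
  // and returns a placeholder answer.
  console.log("To the best of your knowledge, is this headline accurate?");
  console.log(`  "${item.headline}"`);
  return "unsure";
}

async function shareWithNudge(
  item: NewsItem,
  doShare: (i: NewsItem) => void,
): Promise<void> {
  const rating = await promptAccuracyRating(item);
  // Note that the nudge doesn't censor: the user can still share after
  // answering. The goal is only to slow the process down.
  console.log(`User rated the headline as: ${rating}`);
  doShare(item);
}

// Example usage with a made-up item.
shareWithNudge(
  { headline: "Example headline", url: "https://example.com/story" },
  (i) => console.log(`Shared: ${i.url}`),
);
```

The design choice mirrored from the study is that the prompt never prevents sharing; it only inserts a moment of reflection, which the researchers found was enough to raise the quality of what users went on to share.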

In fact, the team has worked on a number of suggested user interface changes, including simply asking the reader to be skeptical of headlines, and has implemented several of them for Google’s Jigsaw internet safety project.

Screenshot: Jigsaw

The warnings and notifications put the user at the center of the question of information accuracy, engaging us in critical thinking rather than mindless sharing. Other interfaces simply show warnings before you read further, a welcome change from the typical click-and-forget attitude most of us have while browsing the internet.

“Social media companies by design have been focusing people’s attention on engagement,” said Rand. “But they don’t have to only pay attention to engagement — you can also do proactive things to refocus users’ attention on accuracy.”
