Disinformation and Social Media Algorithms and the Case for Saving Humanity

Cris Haest • Aug 19, 2021

Hate divides and love multiplies, except when it comes to driving engagement on social media platforms. People will love, like, and comment, but what really sets the keyboards click-clacking is the anger that erupts inside all of us.

Before we get started, here are some facts:

  • Covid-19 vaccines saved nearly 280,000 lives and prevented more than a million hospitalizations in the United States
  • Face masks were 79% effective in preventing household transmission when used by all household members before symptoms occurred
  • More than 614,530 Americans have died due to Covid
  • To protect children, everyone should wear a mask in schools, vaccinated or not, per the American Academy of Pediatrics

If at this point you want to argue with me, dear reader, I must let you know that you have been a victim of disinformation. Please read on and hold all comments until the end. 


Researchers from the universities of Cambridge and New York found that tweets and Facebook posts about opposing political parties are far more likely to be shared than content containing emotional language.


Up front: Social media platforms have long been accused of algorithmically promoting divisive content, because it’s more likely to go viral and thereby attract ad revenue.

The team analyzed around 2.7 million posts from news media accounts and US congressional members.


They found that posts about political opponents were shared roughly twice as often as those about one’s own party. In addition, each extra word about an adversary — such as “Democrat” or “Leftist” in a post by a Republican — increased the odds of a share by 67%.


Notably, emotional language was far less likely to boost engagement.


Per the study paper:

Each individual term referring to the political out-group increased the odds of a social media post being shared by 67%. Out-group language consistently emerged as the strongest predictor of shares and retweets: the average effect size of out-group language was about 4.8 times as strong as that of negative affect language and about 6.7 times as strong as that of moral-emotional language.
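Taken at face value, the 67%-per-term figure compounds multiplicatively, as coefficients in a logistic regression do. Here is a minimal sketch of that arithmetic; the baseline odds below are invented for illustration and are not from the study.

```python
def share_odds_multiplier(n_outgroup_terms, per_term_increase=0.67):
    """Multiplicative boost to the odds of a share, assuming each
    out-group term independently raises the odds by 67% (a
    logistic-regression-style reading of the study's figure)."""
    return (1 + per_term_increase) ** n_outgroup_terms

def odds_to_probability(odds):
    """Convert odds (p / (1 - p)) back to a probability p."""
    return odds / (1 + odds)

# With invented baseline odds of 0.10 (about a 9% chance of a share),
# each added out-group term raises the predicted share probability:
baseline_odds = 0.10
for n_terms in range(4):
    odds = baseline_odds * share_odds_multiplier(n_terms)
    print(n_terms, round(odds_to_probability(odds), 3))
```

Three out-group terms roughly triple the predicted share probability in this toy setup, which is why a single rhetorical habit can dominate what goes viral.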

The researchers warn that amplifying this rhetoric can incite real-world violence, such as the storming of the US Capitol in January.


Quick take: It’s not surprising that dunking on opponents is a strong driver of virality. Social media platforms are acutely aware of the effect.

The Wall Street Journal reported last year that Facebook researchers had warned the company that their “algorithms exploit the human brain’s attraction to divisiveness.”


Company executives allegedly shut the research down and declined to implement changes proposed by the team.

This content was originally published here.

Facebook removes Russian network that targeted influencers to peddle anti-vax messages

Facebook said on August 9th that it had removed a network of accounts from Russia linked to a marketing firm that aimed to enlist influencers to push anti-vaccine content about the COVID-19 jabs.


The social media company said it had banned accounts connected to Fazze, a subsidiary of UK-registered marketing firm AdNow, which primarily conducted its operations from Russia, for violating its policy against foreign interference. Facebook said the campaign used its platforms primarily to target audiences in India, Latin America and, to a smaller extent, the United States.


The company’s investigators called the campaign a “disinformation laundromat”: it created misleading articles and petitions on forums like Reddit, Medium, and Change.org, and used fake accounts on platforms like Facebook and Instagram to amplify the content. Facebook said that while the majority of the campaign fell flat, its crux appeared to be engaging paid influencers, whose posts attracted “some limited attention.”

False claims and conspiracy theories about COVID-19 and its vaccines have proliferated on social media sites in recent months. Major tech firms like Facebook have been criticized by US lawmakers and President Joe Biden’s administration, who say the spread of online lies about vaccines is making it harder to fight the pandemic.


Facebook said the Russia-linked operation started with the creation of batches of fake accounts in 2020, likely originating from account farms in Bangladesh and Pakistan, which posed as being based in India. It said the network posted memes and comments on its platforms in November and December 2020 claiming the AstraZeneca COVID-19 vaccine would turn people into chimpanzees, often using scenes from the 1968 Planet of the Apes movie.

Alongside this “spammy” campaign, Facebook said a number of health and wellbeing influencers on Instagram also shared hashtags and petitions used by the campaign. It said this was likely part of the operation’s known tactics of working with influencers.


Facebook said that in May 2021, after five months of inactivity, the operation then started questioning the safety of the Pfizer vaccine by pushing an allegedly “hacked and leaked” AstraZeneca document. Facebook investigators said the two phases of activity coincided with periods when several governments were reportedly discussing emergency authorizations for the vaccines.


Facebook said it took down 65 Facebook accounts and 243 Instagram accounts as part of the Fazze-linked operation. It said 24,000 accounts followed one or more of the Instagram accounts. 

Social media is killing people with Covid misinformation

U.S. President Joe Biden has expressed disappointment in big social media platforms like Facebook for spreading misinformation about the coronavirus and vaccines.


“They’re killing people,” the president said when asked what his message was to social media platforms like Facebook on the spread of false and misleading claims about the virus and the safety of vaccines that prevent it.


“The only pandemic we have is among the unvaccinated, and that’s — they’re killing people,” he continued.

President Biden’s statement comes as health officials are voicing concern over rising cases of the coronavirus and stalling vaccination rates.


“We are seeing outbreaks of cases in parts of the country that have low vaccination coverage because unvaccinated people are at risk, and communities that are fully-vaccinated are generally faring well,” he said. “Our biggest concern is that we are going to continue to see preventable cases, hospitalizations, and sadly, deaths, among the unvaccinated.”

U.S. Surgeon General Vivek Murthy had on Thursday issued his first advisory raising alarm about the spread of false information about COVID-19 and how misinformation is working against the government’s efforts to fight the deadly virus.


White House press secretary Jen Psaki also accused Facebook on Thursday of not doing enough to stop the growing rate of false information about the coronavirus and the vaccines.


“So we’re regularly making sure social media platforms are aware of the latest narratives, dangerous to public health that we and many other Americans are seeing across all of social and traditional media,” Psaki said. “And we work to engage with them to better understand the enforcement of social media platform policies.”


She added, “As you all know information travels quite quickly. If it’s up there for days and days and days. When people see it, you know, there’s, it’s hard to put that back in a box, and of course, promoting quality information algorithms, I don’t know how they work, but they all do know how they work.”

A Facebook spokesperson said the company “removed more than 18 million pieces of COVID misinformation, removed accounts that repeatedly break these rules, and connected more than 2 billion people to reliable information about COVID-19 and COVID vaccines across our apps.”


When asked whether the White House found those actions sufficient, Psaki said, “Clearly not because we’re talking about additional steps that should be taken.”



Where Are the Covid Misinformation and Conspiracy Theories Coming From?


The vast majority of Covid-19 anti-vaccine misinformation and conspiracy theories originated from just 12 people, a report by the Center for Countering Digital Hate (CCDH) cited by the White House this week found.


CCDH, a UK/US non-profit and non-governmental organization, found in March that these 12 online personalities, whom it dubbed the “disinformation dozen,” have a combined following of 59 million people across multiple social media platforms, with Facebook having the largest impact. CCDH analyzed 812,000 Facebook posts and tweets and found 65% came from the disinformation dozen. US Surgeon General Vivek Murthy and President Joe Biden both focused this week on vaccine misinformation as a driving force of the virus’s spread.

On Facebook alone, the dozen are responsible for 73% of all anti-vaccine content, though the vaccines have been deemed safe and effective by the US government and its regulatory agencies. And 95% of the Covid misinformation reported on these platforms was not removed.


Among the dozen are physicians who have embraced pseudoscience, a bodybuilder, a wellness blogger, a religious zealot, and, most notably, Robert F. Kennedy Jr., the nephew of John F. Kennedy, who has linked vaccines to autism and 5G broadband cellular networks to the coronavirus pandemic.

Kennedy has since been removed from Instagram, which Facebook owns, but not from Facebook itself.


“Facebook, Google and Twitter have put policies into place to prevent the spread of vaccine misinformation; yet to date, all have failed to satisfactorily enforce those policies,” wrote CCDH’s CEO, Imran Ahmed, in the report. “All have been particularly ineffective at removing harmful and dangerous misinformation about coronavirus vaccines.”



Disinformation for Hire, a Shadow Industry, Is Quietly Booming

Back-alley firms meddle in elections and promote falsehoods on behalf of clients who can claim deniability, escalating our era of unreality.


Platforms have stepped up efforts to root out coordinated disinformation. Analysts especially credit Facebook, which publishes detailed reports on campaigns it disrupts.


Still, some argue that social media companies also play a role in worsening the threat. Engagement-boosting algorithms and design elements, research finds, often privilege divisive and conspiratorial content.


The disinformation network developed hundreds of accounts with elaborate personas. Each had its own profile and posting history that could seem authentic, and the accounts appeared to come from many different countries and walks of life.


Graphika traced the accounts back to a Bangladeshi content farm that created them in bulk and probably sold them to a third party.



The Federation of State Medical Boards says doctors who knowingly spread false information about vaccines could lose their license.

The federation believes this could help stop the spread of disinformation on social media.


Dr. Ward said he has noticed that the decision to get vaccinated has been a hard one for his patients to make, which he blames on the spread of false information.


“I would call it a sad thing that there is so much disinformation,” Dr. Ward said.


The Federation of State Medical Boards says physicians who generate and spread COVID-19 vaccine misinformation or disinformation are risking disciplinary action by state medical boards, including the suspension or revocation of their medical license.


Dr. Ward believes COVID-19 deaths among vaccinated people are being highlighted on social media while factual context is left out.

“It’s usually people that have a lot more issues: they are obese, they have heart disease, they have diabetes. That is an example of how people receive information and say, ‘Geez, if I get a vaccine, it’s not going to save me,’” Dr. Ward said.



Dr. Ward said those with breakthrough cases rarely end up in the hospital, and fatal outcomes occur in less than one percent of cases.

He said the best way to learn about vaccines is to talk to your doctor.


“The disinformation received by the public has literally made many people vaccine-hesitant,” Dr. Ward said.


Since the vaccines are currently under emergency FDA authorization, Dr. Ward believes full approval will help discourage false information.



This isn’t the first time Russia has spread disinformation and conspiracy theories on social media platforms.

Russia 'meddled in all big social media' around US election

The report suggests YouTube, Tumblr, Pinterest, Instagram and Google+ were affected by Russian interference, as well as Facebook and Twitter


Russia used every major social media platform to try to influence the 2016 US election, a report claims. New research says YouTube, Tumblr, Instagram and PayPal, as well as Facebook and Twitter, were leveraged to spread propaganda.

  • The report, released by the US Senate, exposes the scale of Russian disinformation efforts.
  • Its authors criticise the "belated and unco-ordinated response" by tech firms.
  • The report was put together by the University of Oxford's Computational Propaganda Project and the social network analysis firm Graphika.
  • It is the first analysis of millions of social media posts provided by Twitter, Google and Facebook to the Senate Intelligence Committee.

While Facebook and Twitter have previously disclosed Russian interference, little has been known about the use of other platforms.


The report suggests YouTube, Tumblr, PayPal and Google+ were all affected, with Russia adapting techniques from digital marketing to target audiences across multiple channels.


"It's a whole family of social media sites," says Philip N Howard, director of the Oxford Internet Institute. "We think the goal was to make the campaigns seem more legitimate."

The research details a vast campaign spearheaded by the Internet Research Agency (IRA) - a Russian company that has been described by the United States Intelligence Community as a troll farm with ties to the Russian government.

The report says Russia had a particular focus on targeting conservatives with posts on immigration, race and gun rights.


There were also efforts to undermine the voting power of left-leaning African-American citizens, by spreading misinformation about the electoral process.

Another report, also released today by the Senate, by the research firm New Knowledge, similarly highlights Russia's efforts to target African-Americans.

It explains how Russia's IRA was focused on "developing Black audiences and recruiting Black Americans as assets," which included encouraging activists to stage rallies.


One IRA campaign highlighted in the Oxford and Graphika paper, Black Matters US, existed across Twitter, Facebook, Instagram, YouTube, Google+, Tumblr and PayPal. These various accounts would promote each other's posts and events.

When Facebook suspended the group on its platform, the group's Twitter account complained about the move and accused the social network of "supporting white supremacy".


"What is clear is that all of the messaging clearly sought to benefit the Republican Party - and specifically Donald Trump," the report says.


"Trump is mentioned most in campaigns targeting conservatives and right-wing voters, where the messaging encouraged these groups to support his campaign. The main groups that could challenge Trump were then provided messaging that sought to confuse, distract and ultimately discourage members from voting."


While the data used by the researchers was provided by Facebook, Twitter and Google, their findings criticise the "belated and uncoordinated response" from these companies to Russia's disinformation campaign.


The researchers highlight details that could have led internet companies to detect interference earlier, such as the use of the Russian rouble to buy advertisements and internet signatures relating to the IRA's base of operations.


The IRA was among the three companies indicted earlier this year, as part of special counsel Robert Mueller's investigation into Russian interference in the 2016 election. Twelve of the agency's employees faced indictment charges, as well as its alleged financier, Yevgeny Prigozhin.


"Social media have gone from being the natural infrastructure for sharing collective grievances and co-ordinating civic engagement, to being a computational tool for social control, manipulated by canny political consultants and available to politicians in democracies and dictatorships alike," the report says.




Read the Full U.S. Senate Report


BOOK EXCERPT: Making the Most of Communications and Social Media In Political Campaigns 

KEEPING UP WITH THE TIMES

Social media isn’t all negative outcomes when used for politics and government communications. Bright Social Agency’s cofounder Cris Haest discusses social media and politics with author Angie Timmons.


“Social media’s potential is remarkable because it can help a candidate ‘reach around the world in minutes’ to organize volunteer activities, hold virtual town halls, and broadcast campaign events. Social media allows campaigns to ‘easily share a candidate’s stance on topics, even as those topics are breaking headlines, without the need to wait for a reporter to connect with your staff for an interview or the cameras to be rolling. It helps increase the ability to communicate with all possible voters and allows candidates the ability to see in real time how their constituents feel and are affected by all topics.’”

The Social Media Maelstrom

On social media, algorithms (automated reasoning calculations based on a particular formula) allow bots (programs or networks that operate without human oversight) to target users with information based on those users’ trends. For instance, someone who searches for news about Republican candidates will be presented with “news” about Republicans by bots that share, on a mass scale, whatever their algorithms predict will interest that particular user, without regard for the quality or source of the information being shared. Fake news can therefore be shared on an incredible scale. This is dangerous, considering that 65 percent of adult Americans get their news from social media (according to a 2016 study by the Pew Research Center), and that the public (and even some candidates) use social media as their primary way of discovering and discussing politics.
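To make the mechanism concrete, here is a deliberately simplified sketch of engagement-only ranking. The `Post` fields, weights, and scores are invented, not any platform's actual formula; the point is structural: nothing in the score rewards accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    predicted_clicks: float    # model's guess at engagement
    source_reliability: float  # known to the system, but never used below

def rank_feed(posts, topic_affinity):
    # Score = user's affinity for the topic times predicted engagement.
    # Source reliability never enters the ranking.
    def score(post):
        return topic_affinity.get(post.topic, 0.0) * post.predicted_clicks
    return sorted(posts, key=score, reverse=True)

# A user who searches for Republican-candidate news gets more of the same,
# and the viral-but-unreliable post outranks the accurate-but-duller one:
feed = rank_feed(
    [Post("gop_news", predicted_clicks=9.0, source_reliability=0.2),
     Post("gop_news", predicted_clicks=3.0, source_reliability=0.9)],
    topic_affinity={"gop_news": 1.0},
)
print([post.source_reliability for post in feed])  # → [0.2, 0.9]
```

Any quality signal the system holds is simply ignored by the objective, which is the core of the "without regard for the quality or source" problem described above.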


UPDATED STATS: In 2019, the Pew Research Center found that over half of Americans (54%) either got their news "sometimes" or "often" from social media, and Facebook was the most popular social media site where American adults got their news.

Social Media expert Cris Haest said that while social media has expanded the way politicians and candidates connect with the community, it has also “become a weapon to take down people on the opposing side via smear campaigns turned trending topics, to help enlarge and encompass echo chambers that both help with progress and help with enraging those that choose an anger- or fear-based response.”


While fake news is technically propaganda or yellow journalism reinvented for the digital era, the term itself has been used incorrectly to undermine legitimate news sources. For example, during his presidency, Donald Trump attempted to cast doubt on legitimate news organizations that reported anything negative about him by calling them “fake news” sources. He even went so far as to refuse interviews with, or banish reporters from, major mainstream news organizations that he deemed “fake news” just because they’d reported something he perceived as critical of him.

However, social media isn’t all bad news when it comes to politics. Haest says social media can also work to hold politicians accountable: devices like smartphones can be used to photograph, film, and record candidates and share their words and actions with the world. This permanent record makes it harder for politicians to deny their words and actions.


Read the full article

How long has social media been an issue in politics?


Honestly, we could point to Facebook Open Graph, when the network first allowed third-party apps (which is how Cambridge Analytica enters the picture), but I can honestly tell you the issues would have existed without Open Graph.


In researching this article, I came across a LinkedIn post by Charles Johnston, MD, a psychiatrist, author, and futurist, which says:

“In my recent book, Rethinking How We Think: Integrative Meta-perspective and the Cognitive “Growing Up” on Which Our Future Depends, I went into detail about the dangers of device addiction and what can be done to address this concern. Today attention is being given to a further digital media–related dynamic that could ultimately prove at least as dangerous: how social media algorithms—in just how they work—exacerbate social and political polarization.


In Rethinking How We Think, I described how advertising-driven digital media in combination with machine learning algorithms creates a mix that requires no malevolence of intent to put us in major danger. By offering electronic substitutes for real fulfillment, our likes and clicks mimic the mechanisms of addiction (addictive substances work by providing the feedback that accompanies feelings like pleasure or power while requiring none of the vulnerability demanded for the real thing). It has been a dirty secret of high-tech companies that they were optimizing programs for these addictive effects, but we would see the effect even without their efforts. Simple optimization depends on hijacking our attention, a necessity that is multiplied many times over by the fact that sites are advertising driven. Addiction is a much more reliable way to get attention than providing content that actually benefits us. A related consequence is that social media algorithms inherently create distortion and “fake news.” It has been well documented that the soap opera of sensationalized content (and outright lies) is much more likely to attract eyeballs, and more likely to trigger search algorithms, than real news.


Part of the reason for concern with addictive dynamics is simply that there is nothing more precious and central to meaning than our attention. Another is that there is no obvious way to counter the effect other than the advice that works with other kinds of addiction to “just say no.” George Orwell in his dystopian novel 1984 warned of Big Brother taking control of our minds. The real danger in the future may not be government manipulation, but artificial stimulation masquerading as substance and information being used in ways that ultimately disconnect us from real importance.


The further dynamic that I noted—the way social media is amplifying polarization—potentially amplifies these dangers dramatically. Very intelligent people are setting off alarm bells about this effect. Some are describing it as the primary driver of modern social divisions. (The Netflix film “The Social Dilemma” provides a good summary of this argument.) I think of it more as exacerbating more fundamental processes that were already well underway. (See my recent article: A Very Disturbing, and Dangerous, Situation—Political Polarization and Populism Run Amok.) But I find it of major concern.”

Me too, Charles. We’ll come back to more from Dr. Johnston later on. First, let’s review the timeline of the Cambridge Analytica scandal.

APRIL 21, 2010

FACEBOOK LAUNCHES OPEN GRAPH

In April 2010, Facebook announced the launch of a platform called Open Graph to third-party apps. This update allowed external developers to reach out to Facebook users and request permission to access a large chunk of their personal data — and, crucially, to access their Facebook friends’ personal data too.


2013

'THISISYOURDIGITALLIFE'

Cambridge academic Aleksandr Kogan and his company Global Science Research created an app called "thisisyourdigitallife" in 2013. The app prompted users to answer questions for a psychological profile.


2014

RULE CHANGES

In 2014, Facebook adapted its rules to limit a developer's access to user data. This change was made to ensure a third party was not able to access a user's friends' data without gaining permission first.


DECEMBER 11, 2015

TED CRUZ

In late 2015, The Guardian reported that Cambridge Analytica was helping Ted Cruz's presidential campaign. The report suggested the Republican candidate was using psychological data based on research spanning tens of millions of Facebook users in an attempt to gain an advantage over his political rivals — including Donald Trump.


2016

DONALD TRUMP

Ahead of the U.S. presidential election, Trump's campaign team began investing heavily in Facebook ads.


MARCH 17, 2018

EXPOSÉ

In an explosive exposé published in mid-March, The Guardian and The New York Times initially reported that 50 million Facebook profiles were harvested for Cambridge Analytica in a major data scandal. This number was later revised to as many as 87 million Facebook profiles.


MARCH 20, 2018

FTC LAUNCHES INQUIRY

The Federal Trade Commission (FTC) opened an investigation into whether Facebook had violated a settlement reached with the U.S. government agency in 2011 over user privacy protections.



MARCH 21, 2018

ZUCKERBERG BREAKS HIS SILENCE

In a Facebook post published several days after the initial reports, Zuckerberg eventually responded to the continued fallout over the data scandal. He said: "We have a responsibility to protect your data, and if we can't then we don't deserve to serve you."



MARCH 25, 2018

'SORRY WE DIDN'T DO MORE'

Around two weeks after the reports were published, Zuckerberg took out full-page ads in a number of British and American newspapers to apologize for a "breach of trust."


(For a more in-depth breakdown of the timeline, see the original story published here.)


APRIL 10, 2018

TIME TO TESTIFY

The United States Senate Judiciary Committee called witnesses to testify about the data breach and general data privacy. They held two hearings, one focusing on Facebook's role in the breach and privacy on social media, and the other on Cambridge Analytica's role and its impact in data privacy. The former was held on April 10, 2018, where Mark Zuckerberg testified and Senator Chuck Grassley and Senator Dianne Feinstein gave statements. The latter occurred on May 16, 2018, where Professor Eitan Hersh, Dr. Mark Jamison, and Christopher Wylie testified, while Senators Grassley and Feinstein again made statements.


AFTERMATH


Following the downfall of Cambridge Analytica, a number of related companies were established by people formerly affiliated with it, including Emerdata Limited and Auspex International. At first, Julian Wheatland, the former CEO of Cambridge Analytica and former director of many SCL-connected firms, stated that they did not plan on reestablishing the two companies. Instead, the directors and owners of Cambridge Analytica and its London-based parent, SCL Group, positioned themselves to be acquired in the face of bankruptcy proceedings and lawsuits. While employees of both companies dispersed to successor firms, Cambridge Analytica and SCL were acquired by Emerdata Limited, a data processing company. Wheatland responded to news of the acquisition by emphasizing that Emerdata would not inherit the SCL companies’ existing data or assets, which belong to the administrators in charge of the SCL companies’ bankruptcy. David Carroll, an American professor who sued Cambridge Analytica, stated that Emerdata was aiming to conceal the scandals and minimize further criticism. Carroll's lawyers argued that Cambridge Analytica's court administrators were acting unlawfully by liquidating the company's assets before a full investigation was performed. Although the proceedings ended with a criminal conviction and a $26,000 fine for SCL Group, a U.K. court denied Carroll's lawsuit, allowing SCL to dissolve without turning over his data (see full article).

Who believes disinformation when presented with it on social media?

A new study sheds light on who is most likely to believe false claims they encounter on social media.

Summary: Study reveals conservatives are less able to distinguish political truths from falsehoods, and the glut of right-wing media organizations producing misinformation is likely to blame.

Source: Ohio State University

Conservatives are less able to distinguish political truths from falsehoods than liberals, mainly because of a glut of right-leaning misinformation, a new national study conducted over six months shows.


Researchers found that liberals and conservatives in the United States both tended to believe claims that promoted their political views, but that this more often led conservatives to accept falsehoods while rejecting truths.


One of the main drivers of the findings appeared to be the American media and information environment.


“Both liberals and conservatives tend to make errors that are influenced by what is good for their side,” said Kelly Garrett, co-author of the study and professor of communication at The Ohio State University.



“But the deck is stacked against conservatives because there is so much more misinformation that supports conservative positions. As a result, conservatives are more often led astray.”


In the end, participants had evaluated as many as 240 statements on a broad range of topics, representing many different viewpoints.


A separate group of people, recruited online, was surveyed to determine whether the claims, if true, would be better for liberals or for conservatives, or whether they were neutral.


Overall, both liberals and conservatives were more likely to believe stories that favored their side – whether they were true or not.

The differences in beliefs were often stark, Garrett said.


For example, participants rated this true statement that received widespread social media engagement when it came out: “Investigators for the DHS Office of the Inspector General have identified poor conditions in several Texas migrant facilities, including extreme overcrowding and serious health risks.”


Results showed that 54% of Democrats correctly said that the statement was “definitely true” – compared to only 18% of Republicans.


Another statement – a false one – was “While serving as Sec. of State, Hillary Clinton colluded with Russia, selling 20% of the U.S. uranium supply to that country in exchange for donations to the Clinton Foundation.”


Here, only 2% of Democrats said this was “definitely true,” but 41% of Republicans did.


“These are important factual claims, yet we see vast partisan differences in belief,” Garrett said.


One of the major issues identified in the study was that these widely shared truths and falsehoods have different implications for liberals and conservatives.


Two-thirds (65%) of the high-engagement true statements were characterized as benefiting liberals, while only 10% of accurate claims were considered beneficial to conservatives. On the other side, 46% of falsehoods were rated as advantageous to conservatives, compared to 23% of false claims benefiting liberals.


“We saw that viral political falsehoods tended to benefit conservatives, while truths tended to favor liberals. That makes it a lot harder for conservatives to avoid misperceptions,” Garrett said.

Although the information environment was the primary reason conservatives were susceptible to misinformation, it may not be the only one.


Results showed that even when the information environment was taken into account, conservatives were slightly more likely to hold misperceptions than were liberals.


“It is difficult to say why that is,” Garrett said. “We can’t explain the finding with our data alone.”


Results did show further distinctions between how conservatives and liberals approached the political claims in the viral stories they evaluated.


Liberals showed greater overall sensitivity, which characterizes an individual’s ability to distinguish truths and falsehoods. Conservatives and liberals were equally good at detecting truths and falsehoods when most true stories were labeled politically neutral.


But if more of the factually accurate stories were labeled political – benefiting either liberal or conservative positions – liberals became better than conservatives at distinguishing true from false statements.

“Conservatives did not get any worse, but they did not keep up with liberals who were getting better at discerning truths and falsehoods,” Garrett said.


Conservatives also showed a stronger “truth bias,” meaning that they were more likely to say that all the claims they were asked about were true.


“That’s a problem because some of the claims were outlandish – there should have been no ambiguity about whether they were true or not,” he said.
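The “sensitivity” and “truth bias” measures Garrett describes come from signal detection theory. The sketch below is a toy illustration of those two quantities, not the paper’s actual estimation procedure, and the hit and false-alarm rates are invented for the example.

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Signal-detection measures for truth discernment.

    hit_rate: proportion of true claims a respondent rates as true.
    false_alarm_rate: proportion of false claims rated as true.
    Returns (d_prime, c): sensitivity and response bias.
    """
    z = NormalDist().inv_cdf  # convert proportions to z-scores
    d_prime = z(hit_rate) - z(false_alarm_rate)      # higher = better at telling true from false
    c = -0.5 * (z(hit_rate) + z(false_alarm_rate))   # negative = "truth bias" (calls too much true)
    return d_prime, c

# Two hypothetical respondents: both accept 80% of true claims,
# but the second also accepts 60% of false claims.
d1, c1 = sdt_measures(0.80, 0.20)  # discerning respondent
d2, c2 = sdt_measures(0.80, 0.60)  # truth-biased respondent
```

A large d′ means the respondent reliably separates true from false claims; a negative c means they lean toward calling everything true, which is the “truth bias” described above.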


Garrett said a strength of this study, compared to many previous ones, is that it analyzed a wide range of political claims, reflecting the diversity of the media environment that Americans are exposed to. It clearly confirms the point made by many media commentators that conservatives are awash in false statements that support what they want to believe.

“We show that the media environment is shaping people’s ability to do this very basic, fundamental task. Democracy depends on people being able to tell the difference between what is true and false, and it falters when people have difficulty agreeing on what’s real,” he said.

Conservatives’ susceptibility to political misperceptions

The idea that U.S. conservatives are uniquely likely to hold misperceptions is widespread but has not been systematically assessed.


Research has focused on beliefs about narrow sets of claims never intended to capture the richness of the political information environment. Furthermore, factors contributing to this performance gap remain unclear.


We generated a unique longitudinal dataset combining social media engagement data and a 12-wave panel study of Americans’ political knowledge about high-profile news over 6 months. Results confirm that conservatives have lower sensitivity than liberals, performing worse at distinguishing truths and falsehoods.


This is partially explained by the fact that the most widely shared falsehoods tend to promote conservative positions, while corresponding truths typically favor liberals. The problem is exacerbated by liberals’ tendency to experience bigger improvements in sensitivity than conservatives as the proportion of partisan news increases.


These results underscore the importance of reducing the supply of right-leaning misinformation.


This content was originally published here.

History has shown that the social media platforms will continue to do what serves their best interests. Divisive content will continue to play a huge part in our lives on social media. The platforms active today only heighten existing problems such as sexual harassment, racism, and child exploitation. Having personally walked the campuses of many of these tech companies, I can say every single one of them has the funds and the know-how to build content filters and alerts that catch disinformation, illicit porn, and hateful content before it is posted, but they don’t. Why? Because those angry views are worth more than the lives that are lost. 


We as the audience must get better at identifying and labeling misinformation, fake news, and conspiracy theories. This includes being able to tell apart an astroturfer, a scammer, an influencer, a brand, sponsored content, and actual news. Reviewing the news source, its owner, the report itself, and the reporter’s sources also helps weed out misinformation. As a brand, navigating these social platforms with an agency will ensure maximum brand growth and help you avoid potential PR crises. In future posts, we’ll define these terms, cover ways and websites to check whether you’ve spotted an astroturfer, reveal a few you may not believe (thanks, Twitter sleuths!), and share a few tools to help you vet potential influencers for a brand partnership. 

We must do our part in calling out fake news when we see it online, and in offline conversations speak up to the people in our lives who have been taken in by conspiracies and fake news. Covid has shown how misinformation can drive massive death tolls, and without a proper solution, bad actors will continue to exploit social media to spread havoc, anarchy, confusion, anger, and heartache. The hospital beds are full and the world is on fire. 


Do you know how a social media algorithm works?


Well, I do. I've been writing about it, studying it, and yes, even manipulating it myself, since before Facebook Open Graph came out in 2010.


Let me tell you something: social media algorithms run on deep machine learning, and they know you. They know what will get you to click, share, and comment. With the rise of AI bots, you wouldn't even know a post was generated by a computer and not a person.


As noted earlier, social media platforms have long been accused of algorithmically promoting divisive content, because it’s more likely to go viral and thereby attract ad revenue.
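That accusation boils down to a simple mechanism: rank the feed by predicted engagement, and whatever provokes the most reactions wins. The following toy scorer is purely illustrative, not any platform’s actual algorithm; the weights are invented, and the 1.67 multiplier merely echoes the study’s 67%-per-out-group-term finding.

```python
# Toy engagement-ranked feed -- illustrative only; real platforms use
# large ML models, and all of these weights are invented for the example.
def predicted_engagement(post):
    # Weight reaction types by how strongly they tend to drive re-sharing;
    # out-group/divisive posts get an outsized multiplier.
    base = (post["likes"] * 1.0
            + post["comments"] * 3.0
            + post["shares"] * 5.0)
    return base * (1.67 if post["mentions_outgroup"] else 1.0)

def rank_feed(posts):
    # The feed simply sorts by predicted engagement, so divisive
    # content floats to the top regardless of accuracy.
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    {"id": "calm",     "likes": 120, "comments": 10, "shares": 5,  "mentions_outgroup": False},
    {"id": "divisive", "likes": 80,  "comments": 25, "shares": 20, "mentions_outgroup": True},
]
ranked = rank_feed(posts)
```

Even though the calm post has more likes, the divisive one outranks it: the comment, share, and out-group weights dominate, which is exactly the dynamic the researchers measured.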


When your motto is "Move Fast and Break Things" (a poster I picked up way back when I walked the Facebook campus), you really need to take a step back and look at what is being broken: trust in authority, the climate, humanity. You have to ask: is our ability to communicate 24/7 worth it?


Let’s revisit Dr. Johnston. 

Angry Man Using His Laptop

“All this leaves us with the obvious question of just what can be done. The best the technology gurus have to offer as a solution is that something has to happen at a “cultural level.” On that I think they are right (though it is an observation that too easily lets technology companies off the hook when it comes to responsibility).


To get at how I think of that needed cultural level response, we can start with the addiction aspect of the dynamic. When I work with someone who is addicted to something like heroin, ultimately therapy depends on helping the person get in touch with real significance to replace the pseudo-significance the drug has provided. The person can then see clearly how the drug is hijacking their life and robbing them of the things that in fact most matter. The parallel at a cultural level must be an image of meaning, an image of advancement that can work for our time, that can be authentically compelling and sit in clear contrast to imposters. The fact that we can’t get rid of damaging effects by legislation or policy may not have the dead-end consequences we might imagine. The better way is to create enough contrast that social media of the harmful sort get regarded as having little more significance than the afternoon soaps or professional wrestling. 

Social Notifications

I think of this sort of antidote as having both personal and institutional aspects. At a personal level, the kind of psychological changes that Creative Systems Theory describes with the concept of Cultural Maturity would be expected to make being exploited in this way increasingly unpalatable. (See Cultural Maturity: A Guidebook for the Future.) There would be a growing awareness of the importance of making better choices. In working with clients around this kind of question, I point out that there is nothing more precious than our attention. Whether our interest is personal well-being or our larger human future, as individuals we simply can’t let it be exploited and manipulated. 


More institutionally, we need social structures able to engage the important questions before us collectively with the needed maturity. One of the most striking characteristics of contemporary society is the degree to which we have lost trusted agents. Decades ago we had our Walter Cronkites and Edward R. Murrows who we could count on to at least make an effort to communicate the facts. With postmodern times and new communications technologies, many people celebrated that news would now finally be democratized—we could all be experts. Today we discover that the consequences are not at all what we might have hoped. There may be no more important question as far as our future well-being than what it might mean to again have trusted agents. They must be of a new sort. We must work together to establish and support sources of information and institutional structures we can count on.” 


This content was originally published here.

IDENTIFYING CREDIBLE SOURCES

The “lying media” like Fox News, OAN, Newsmax, and conspiracy theorists on social media are guilty of spreading disinformation about masks and vaccines. If you want to know what counts as a credible source, here is a chart to help. It was created by people who study information with no agenda beyond separating truth-tellers from liars. 

https://www.adfontesmedia.com


Please get your facts about something that may kill you from actual experts at the CDC & the WHO, not some talking news head who may have already passed because they refused to believe in covid, or who is literally profiting from your refusal to acknowledge that you may be wrong. 

Being nice hasn't solved this, so it's time to be blunt. Get vaccinated, wear a mask, and test frequently; if you can't do the first two because you're genuinely waiting for FDA approval, stay home so you don't spread something that may be asymptomatic in you. It may kill me, or my child. 


Love thy neighbor. I am thy neighbor.
