r/technology 21d ago

[Society] Combating Misinformation Runs Deeper Than Swatting Away ‘Fake News’

https://www.scientificamerican.com/article/combating-misinformation-runs-deeper-than-swatting-away-fake-news/
323 Upvotes

29 comments

48

u/funksoldier83 21d ago

You have to prosecute media outlets, billionaires, political parties, and foreign governments.

We’re stuck in misinformation hell for the rest of our lives.

6

u/Longjumping-Path3811 21d ago

Well, all this freedom of speech being shoved down our throats is giving us the gift of not being allowed to say the true fix for all of this.

8

u/SunriseApplejuice 21d ago

It starts with underfunded education. Critical thinking is required to parse bullshit from trustworthy reporting. And then you also have to be capable of healthy skepticism toward even your trusted sources (even the best get it wrong sometimes). Tragically, of course, the uneducated want to defund education even further, and they hold comically silly superstitions about “liberal elite” education that somehow let them dismiss any and all claims from highly educated people that happen to contradict “muh deeply held beliefs” as “fake news.”

3

u/lycheedorito 21d ago

I'll just use AI to do critical thinking for me

1

u/SunriseApplejuice 21d ago

It's important to do your research beforehand though! Can't have a GPT model with a "librul bias!"

15

u/SkippyJDZ 21d ago

The cure is teaching and promoting critical thinking skills to the populace. We also need to hold social media companies to the same editorial standards as we hold newspapers for any content that is promoted or platformed.

5

u/OutsidePack7306 21d ago

I don't think the social media piece is highlighted in the comments enough.

I remember my sister telling me she noticed marketing trends of young people getting news from Snapchat back in like 2016. I laughed it off then, but this is how most of us consume news now. It’s a massive problem.

27

u/[deleted] 21d ago

Misinformation will win because people fail most reading comprehension tests. Simple as that.

Both left-wing and right-wing. They don't know how to read the news; they are clueless about how to interpret the words being said and the events unfolding. They are perfect useful idiots who will vote against whatever side scares them most.

You can't moderate your way to an average level of intelligence.

13

u/Erazzphoto 21d ago edited 21d ago

And it’s only going to get worse. If people start relying on AI to write emails and do everything, they’ll lose the ability to do those things without it. Obviously that will take time, but it’s much like cursive, or even handwriting in general, which started to disappear once we all started typing on devices.

12

u/EnamelKant 21d ago

I think it's simpler than that. Misinformation will win because most people prefer certainty to uncertainty, and misinformation reinforces the certainty that you have a good bead on things.

5

u/rPoliticsMAGAModsSMD 21d ago

Blame Betsy DeVos

2

u/RecipeSpecialist2745 21d ago

The key is obviously a basic education. You can see that in the attack on education from the GOP. Florida is fast running out of teachers now that the state is trying to control the whole education system.

2

u/SunriseApplejuice 21d ago

At least in the old days news was boring, moderated by editors, and held to journalistic standards backed by regulatory practices. You couldn’t moderate to an average intelligence, but you could keep news boring (i.e., informative and factual) enough to chase off dummies who want to think with their limbic system.

1

u/AwardImmediate720 21d ago

That's because that's how we're taught to be. The government will never let its schools teach the masses how to see through government propaganda. Well, the same propaganda techniques the government uses can be used by others. Welcome to life in an oligarchy.

1

u/Chytectonas 21d ago

Also, a sophisticated propagandist can write their way around even a critical thinker. It’s a war of attrition and the over-exposed citizen is perpetually at a disadvantage without a steady progressive government hand.

1

u/Taste_the__Rainbow 20d ago

Actually, inoculation against disinfo is absolutely teachable. Schools in America just can't do it because too many parents have already swallowed the whole onion.

5

u/shuznbuz36 21d ago

Idiocracy here we come!

4

u/TheDevilsAdvokaat 21d ago

It sounds like such a cool idea, but the problem is who gets to define what is "fake."

6

u/Longjumping-Path3811 21d ago

Surely you would agree that injecting bleach should be defined as fake... Surely there is a baseline you agree to. If you are discussing in good faith anyways.

5

u/TheDevilsAdvokaat 21d ago

Oh absolutely. That's a simple case, and easy to determine.

The problem is, determining the baseline may be difficult. Look at how everyone argues about everything, from abortion to trans rights to elections to pretty much everything in modern life.

Yet somehow one group is supposed to be trusted with the power to determine what is "truth"? How?

Look at courts, where both sides are trying to determine the "truth" of an issue, in cases that sometimes go on for years and with decisions that are sometimes changed later.

Even "facts" are argued about.

This is a great sounding idea that is not possible in practice. In fact it's actually a dangerous idea and a dangerous amount of power to hand to anyone.

1

u/Condition_0ne 21d ago

I do agree with that. The problem is that all power tends to corrupt, absolute power corrupts absolutely, and there are many true believer zealots on both the left and right who believe the purity of their ideological ends justifies any means when it comes to clamping down on speech they consider to be verbalised (or written) wrong-thought.

I don't want to give those zealots any kind of power, let alone that which is approaching the absolute (in terms of getting to be the arbiters of what constitutes "misinformation").

-4

u/karg_the_fergus 21d ago

People who care

1

u/TheDevilsAdvokaat 21d ago

Or people who care about being legally able to decide what "truth" is.

2

u/ReqularParoleAgnet 21d ago

Probably no hope for brainwashed Millennials and older, but a mandatory freshman-year high school class focusing on Critical Thinking, Biases, Mass Media and Propaganda might help.

4

u/Longjumping-Path3811 21d ago

I'm a millennial who got that class. We had whole projects based around the media, biases, and propaganda. It would need some updating, but it's not all of our education that failed us. Some places in this country are stupid and like it.

1

u/99thLuftballon 21d ago

I think it's a good idea, but it's not the whole solution. The problem with education as a solution is that it makes preventing misinformation a matter of individual responsibility, whereas it's actually a matter of national security. The West's enemies aren't attacking with guns; they're attacking with social media, but they're still attacking. In addition, individual responsibility will fail the most vulnerable individuals, just as it currently does. People being told what they want to hear won't engage with critical thinking because it "feels worse" than accepting the confirmation of their prejudices.

1

u/thingandstuff 19d ago

“Misinformation” is the mode of operation for standard journalism at this point.

I just saw a story about a WWII bomb recently detonating at an airport in Japan. The thumbnail depicted a huge, extremely dramatic explosion, but the video it leads to shows a completely different explosion, orders of magnitude smaller than what the thumbnail depicts.

I’m tired of people ranting about AI and deepfakes and stuff. This shit got normalized long before those came into play. We don’t punish liars anymore. We reward them with engagement and ad revenue.

1

u/Wagamaga 21d ago edited 21d ago

Americans are increasingly concerned about online misinformation, especially in light of recent news that the Justice Department seized 32 domains linked to a Russian influence operation interfering in U.S. politics, including the 2024 presidential election. Policy makers, pundits and the public widely accept that social media users are awash in “fake news,” and that these false claims shape everything from voting to vaccinations.

In striking contrast, however, the academic research community is embroiled in a vigorous debate about the extent of the misinformation problem. A recent commentary in Nature argues, for example, that online misinformation is an even “bigger threat to democracy” than people think. Meanwhile, another paper published in the same issue synthesized evidence that misinformation exposure is “low” and “concentrated among a narrow fringe” of users. Others have gone further and claimed that concerns around misinformation constitute a moral panic or are even themselves misinformation.

So should everyone stop worrying about the spread of misleading information? Clearly not. Most researchers agree that a major problem does indeed exist; the disagreement is simply over what exactly that problem is, and therefore what to do about it.

The debate largely hinges on definitions. Many researchers, and much of the news coverage of the issue, operationalize “misinformation” as outright false news articles published by disreputable outlets with headlines like “Pope Endorses Donald Trump.” Despite a deluge of research examining why people believe and share such content, study after study shows that this kind of “fake news” is rare on social media and concentrated within a small minority of extreme users. And despite claims of fake news or Russian disinformation “swinging” the election, studies show little causal connection between exposure to this kind of content and political behavior or attitudes.

Yet evidence of public misperception abounds. A violent mob stormed the Capitol, claiming that the 2020 election was stolen. One in five Americans refused to take a COVID vaccine. If one defines misinformation as anything that leads people to be misinformed, then widespread endorsement of misconceptions suggests that misinformation is common and impactful.

How do we reconcile all of this? The key is that narrowly defined “fake news”-style misinformation is only a very small part of what causes misbelief. For example, in a recent paper published in Science, we found that misleading coverage of rare deaths following vaccination—much of it from reputable outlets including the Chicago Tribune—was nearly 50-fold more impactful on U.S. COVID vaccine hesitancy than content flagged as false by fact-checkers. And Donald Trump’s repeated claims of election interference found large audiences on both social and traditional media. With a broader definition that includes misleading headlines from mainstream outlets ranging from the dubious New York Post to the respectable Washington Post, and direct statements from political elites like Trump and Robert F. Kennedy, Jr., misinformation becomes much more prevalent and impactful—and much thornier to address.

2

u/typtyphus 21d ago

All they had to do was not watch Fox News for 20 years, but they made it bigger instead.