
Tiny number of 'supersharers' spread the majority of fake news

throwaway115
123 replies
2d7h

"How do we stop people from sharing (what I deem to be) fake news?" is the wrong question. The right question is "How do we give people the tools to identify fake news?" If you give people the tools and they still spread what you deem to be fake news, then you've done what you can. Tough cookies for you.

Too many people get very angry that people share things they disagree with, and then start talking about clamping down on communication. No, sorry, but your ideas lost. People are allowed to think and share things that make you angry. If your views aren't mainstream, there's a reason for that. "Fake news" is only a tiny part of the equation. If you pick up that censorship weapon, which we all implicitly agree not to use, you will not be the last to wield it.

jasonjayr
70 replies
2d6h

There is a stark difference between "fake news" and "difference of opinion that I don't agree with".

"fake news" typically amounts to false rumors, amplifying unprovable/unsubstantiated claims. That's what needs to stop, and what is greatly damaging discourse.

I would be delighted to read about opposing opinions, provided they are presented + supported by provable facts.

belorn
23 replies
2d5h

Opinions supported by provable facts are what distinguish a scientific dissertation from a political statement. Politicians do not generally support their opinions with provable facts, and while they can consult with scientists to form opinions or support a political statement, that seems to be the exception rather than the rule.

In social science, defining provable facts is also a major problem, since very few publications can be replicated. In fact, most provable facts in social science are estimated to be provably false, a finding from meta-studies a while back. A common finding is that the further one goes from pure math, the worse the provable facts become, with social science sitting furthest away on the spectrum.

Thus I often see a different and stricter definition of "fake news". Fake news is only when an opinion has been made with the intention of misleading for political gain, with strong emphasis on the intention aspect. It thus becomes very close to the definition of propaganda, but with additional restrictions. Intentions are also very hard to prove, especially with provable facts.

Retric
18 replies
2d4h

Fake news consists of blatantly false facts and conspiracy theories. The NEWS bit is there because these stories are suggesting something specific happened or was revealed. If it says ‘Ukraine invaded Russia,’ that’s fake news. More often it’s conspiracy crap like “Deep state behind X!”

It’s not so ambitious because it’s difficult to completely avoid facts. Political discourse regularly refers to actual facts: ‘Russia invaded Ukraine’ is the kind of thing you can fact-check, and it passes.

Politics also uses facts in misleading ways. The kind of thing where a fact-checker goes: “Trump wrongly said the judge wouldn’t allow an “advice of counsel” defense. Before the trial, Trump’s attorneys chose not to seek such a defense, and Merchan held them to that decision.” isn’t what most people refer to as Fake News. From a certain point of view, ‘Darth Vader killed your father’ is true, and Social Security may tax income but it’s not an income tax because we already have a tax called Income Tax, yadda yadda.

largbae
14 replies
2d3h

Was the idea that COVID-19 leaked from a lab in Wuhan fake news? Is the idea that the vaccines may have side effects worth understanding fake news?

That is what was defined as fake news and actively censored in our very recent past.

The problem with censoring fake news is that ultimately someone has to decide what is real and what is fake, and once censored, it becomes very difficult to discuss and revise that decision.

2OEH8eoCRo0
7 replies
2d3h

> Was the idea that COVID-19 leaked from a lab in Wuhan fake news?

What facts was the assertion based on? I've always thought that, while it was possible, there wasn't supporting evidence.

arcanemachiner
3 replies
2d

INVESTIGATING THE ORIGINS OF COVID–19

Wednesday, March 8, 2023

HOUSE OF REPRESENTATIVES COMMITTEE ON OVERSIGHT AND ACCOUNTABILITY SELECT SUBCOMMITTEE ON THE CORONAVIRUS PANDEMIC

Excerpt:

"Some say the virus came from nature that, according to recent papers discussed in New York Times, the science is dispositive. Some say it’s too unique, too primed for human transmission, that there’s too much circumstantial evidence that points to COVID–19 coming from a lab. As well, in three years, there’s been no track found to prove that COVID–19 evolved naturally from an animal or a mammal or a tick to become highly infectious to humans. The truth is we don’t know the origins of COVID–19 yet for sure. We don’t have a smoking gun.

First, the science behind COVID–19: the genome of COVID–19 is inconsistent with expectations, and is unique for its group of viruses. COVID–19 has both a binding domain optimized for human cells, and a furin cleavage site, or a small part of the virus that makes it so infectious. That has never been seen before in a SARS-related virus. In other words, COVID–19 has unique characteristics that made it very infectious to humans. These have never been seen before in any other viruses of its type.

Most viral outbreaks are slow and small. CDC data shows SARS infected approximately 8,000 people worldwide, and eight in the U.S. Similar with MERS, which infected approximately 2,000 people worldwide. But COVID–19 was primed for human transmission. It has infected more than 750 million people worldwide. Dr. Redfield, one of our witnesses here today and a virologist, has even said that he believes COVID–19 had a detour from nature to be educated how to infect humans.

Second, the known research occurring in China: We know the Wuhan Institute of Virology was conducting gain-of-function research on novel bat coronaviruses by creating chimeric viruses, combining two viruses together to test infectivity and infecting mice with these viruses, work that former COVID–19 task force coordinator, Dr. Deborah Birx confirmed was, in fact, gain-of-function, contrary to statements by Dr. Fauci. We have learned that the Wuhan Institute has poor biosafety and was conducting this research at only Biosafety Level 2, described as the ‘‘Wild West’’ by Dr. Jeremy Farrar, a virologist from the U.K., now Chief Scientist for the WHO.

We have learned through a leaked DARPA grant application that with U.S. taxpayer backing, the Wuhan Institute proposed inserting furin cleavage sites into novel coronaviruses, the same unique genetic aspect of COVID–19. And we know, according to a State Department fact sheet, the multiple researchers at the Wuhan Institute were sick with COVID–19-like symptoms in the fall of 2019, before the Chinese officially announced the outbreak.

Third, concerning the actions of NIH and EcoHealth Alliance, records show that the National Institutes of Health while the U.S. was under a moratorium on gain-of-function research, exempted EcoHealth Alliance and the Wuhan Institute from this very ban. Records show that the National Institutes of Health allowed EcoHealth to conduct risky research on novel coronaviruses at the Wuhan Institute without going through the potential pandemic pathogen department level review board.

Records show that EcoHealth violated Federal grant policy, and failed to file its five-year progress report for more than two years.

Records show that EcoHealth violated the terms of its grant and failed to report an experiment that resulted in gain-of-function of a coronavirus at the Wuhan Institute.

Fourth, for some reason that we do not yet know, leaders in the scientific community took action to attempt to convince the world that they should not take the lab leak theory seriously. Dr. Francis Collins stated he was more concerned with harm to ‘‘international harmony’’ than he was with investigating the lab leak. Dr. Fauci said the lab leak theory was a ‘‘shiny object that will go away in time.’’

The president of EcoHealth, Dr. Peter Daszak orchestrated a letter in The Lancet that called the lab leak a ‘‘conspiracy theory,’’ a statement that directly benefited Dr. Daszak himself. And four scientists, after a conference call with Dr. Fauci, completely reversed their position. Dr. Kristian Andersen said he found ‘‘the genome inconsistent with evolutionary theory.’’ And Dr. Robert Garry said he ‘‘really can’t think of a possible natural scenario.’’ But a few days later, published a paper saying the exact opposite, a paper based on the new emails we released claim to be prompted by Dr. Fauci himself.

Fifth, the intelligence: FBI Director, Christopher Wray, confirmed publicly that the FBI assessed COVID–19 most likely originated from a lab incident in Wuhan. The Wall Street Journal reported the Department of Energy now also believes a lab leak is the most likely origin. These aren’t run-of-the-mill agencies. The FBI used experts in biological threats and is reportedly supported by the National Bioforensic Analysis Center and the Department of Energy used its own Z Division, experts in investigating biological threats. These are some of the facts as we know them, but there’s so much more to do."

- Hon. Brad R. Wenstrup (chairman of the subcommittee)

https://www.congress.gov/118/meeting/house/115426/documents/...

https://oversight.house.gov/hearing/investigating-the-origin...

---

Here's an info dump on the subject from Swiss Policy Research, a website described by Wikipedia as "a website that has been criticized for spreading conspiracy theories":

https://swprs.org/on-the-origin-of-sars-coronavirus-2/

I'm not presenting this as anything other than a source of information, whether true or not. The site clearly has an agenda, but information is information. The statements there should serve as a jumping-off point for further investigation.

On that note, it looks like Google's algorithm is still boosting official sources on the subject, and is conceivably deboosting sources it deems to be unreliable (I'm not sure if it's controversial to state that online censorship was rampant during the pandemic), so the search for information may be deceptively difficult. You may have better luck with alternative search engines.

2OEH8eoCRo0
1 replies
2d

COVID pandemic started in 2023?

EasyMark
0 replies
1d21h

I doubt many people (especially now) dispute that it could have been manmade. The point was that one party that shall remain nameless was saying it was 100% a Chinese conspiracy to create a biological weapon that either leaked or was released intentionally. It was stated as a fact; not as a theory or a possibility but a verified fact, and it was not and still is not. Fauci and others felt, given what they knew at the time, that it was likely natural in origin, and we’ve had similar outbreaks in the past, in particular the Spanish Flu. The “Chi-na” theory was sold hand in hand with the “5G deep state poison” vaccine theories, and that’s why most people wanted that stuff off social media. And that was okay, because social media is owned by private entities, and in the USA you are allowed to control what is on your platform (at least for now). I was on team wait-and-see where the evidence leads, but I’ve never been worried about vaccines or the FDA process for them. I was never for forced injections either, whether by government or corporate mandates. I’m not sure why people are so ready to jump to extremes rather than see where the evidence leads; I have a deep suspicion of anyone who serves up “facts” before there is any evidence.

largbae
2 replies
2d2h

That is a separate topic, but the recent congressional hearings would be a great way to research it. The point that I am making is that both of these topics were censored as fake news, and neither is provably false.

2OEH8eoCRo0
1 replies
2d1h

Recent hearings? It's fake news if it's presented as fact without evidence. If it's presented as a possibility, then it's not really news, is it?

tanseydavid
0 replies
1d

> If it's presented as a possibility then it's not really news is it?

If it is presented as a possibility then it is a theory, right?

Slap the word "conspiracy" in front of the word "theory" and then you have a theory which can no longer be discussed in a reasonable manner.

An accidental bio-lab leak does not require any sort of conspiracy.

Retric
5 replies
2d3h

Making specific claims without evidence is Fake News because you’re saying you have evidence that X is true when you don’t.

“COVID-19 COULD have leaked from a lab” is perfectly fine. Saying “COVID-19 leaked from a lab” requires evidence; otherwise it’s not news even though you claim it is, hence the label “Fake” news.

tanseydavid
1 replies
1d

> “COVID-19 COULD have leaked from a lab” is perfectly fine.

IIRC -- in 2020 any mention of such an idea was met with accusations of racism.

Retric
0 replies
21h34m

I’ve rarely seen people bring up racism when talking about the lab leak idea in general.

Distrust of the CCP doesn’t link to racism in most western eyes because they don’t differentiate between different Asian groups as distinct races. Race is basically just White, Hispanic, Black, Asian, Native American, and maybe Pacific Islanders.

largbae
1 replies
2d2h

I agree with your premise here, but both of these topics were censored to the point of removing accounts that discussed them under the assumption that they were fake news. This is the problem with censoring anything.

Retric
0 replies
2d2h

That’s fair.

I don’t have any specific solutions for fake news. My interest is more in how the internet and society is evolving rather than what to do about it.

account42
0 replies
7h54m

What about saying “COVID-19 did not leak from a lab”?

belorn
1 replies
2d2h

If we look at the claim that "Ukraine invaded Russia", we can label it as war propaganda with a pretty clear intent, especially if we can source the claim to the Russian military or government. The facts, such as which specific soldier fired the first bullet, at what GPS location, using what gun, are all facts too, but they are not important for determining fake news. War propaganda can be determined by intent alone. Actually, facts are more of a hindrance than a help, since they add noise to something that should be simple.

In addition, most of us were not there, so the determining factor rests on trust. Do we trust Russia, or the multiple sources that said Russia invaded Ukraine? We do not need the independent quality of a math proof to decide a case like that.

Outlawing war propaganda is way older than censorship of fake news, and its pros and cons are different from those around fake news.

Retric
0 replies
1d23h

Propaganda need not be false. “Remember Pearl Harbor!” was often used in the following months and years to drum up military support.

So some, though not all, Fake News is Propaganda, and a great deal of Propaganda isn’t Fake News.

belorn
1 replies
2d1h

The initial key word I would start with if someone is interested in studying this concept would be "replication crisis". Wikipedia has a good starting off point with https://en.wikipedia.org/wiki/Replication_crisis

I would not point to a specific study, primarily because I would have to dig through a rather long list to find the one that I found "best", but also because, in terms of fact finding, it's better if people do this independently. "Why Most Published Research Findings Are False" is a good one, however.

prophesi
0 replies
2d

Thanks! I missed that in the wiki article. Replication crisis put me on the right path.

leereeves
0 replies
2d5h

> A common finding is that the further one goes from pure math, the worse the provable facts become, with social science sitting furthest away on the spectrum.

I think this is true, but it's not because of the distance from pure math. It's a choice made by researchers and journals regarding the standard of evidence required.

Physics demands very strong evidence for publication: a 5-sigma result, i.e. less than a one-in-a-million chance of getting a publishable result from a random (null) experiment.

Social science is typically satisfied with a p-value of 0.05, a 1 in 20 chance of getting a publishable result from a random experiment. That means a whole lot of results are published that are nothing more than the scientific equivalent of dice rolling snake-eyes.

In fact, rolling snake-eyes is less likely (1/36) than getting a publishable result from any given social science experiment (1/20).

Fortunately, this also means it's an easy problem to solve, if the will existed. Simply requiring a much higher standard of proof would filter out a lot of the false social science results.
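
For anyone who wants to check the arithmetic, here is a rough sketch (assuming Python with scipy is available; norm.sf gives the one-sided upper-tail probability of a standard normal, which is what the sigma threshold refers to):

  # Compare the publication thresholds mentioned above.
  from scipy.stats import norm

  p_5sigma = norm.sf(5)   # 5-sigma threshold, ~2.9e-7 (well under 1 in a million)
  p_social = 0.05         # typical social-science significance threshold
  p_snake_eyes = 1 / 36   # rolling snake eyes with two dice

  print(f"5 sigma:    p ~ {p_5sigma:.1e} (about 1 in {1 / p_5sigma:,.0f})")
  print(f"p < 0.05:   p = {p_social:.2f} (1 in {1 / p_social:.0f})")
  print(f"snake eyes: p = {p_snake_eyes:.4f} (1 in 36)")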

mc32
15 replies
2d5h

Unfortunately, with Covid we saw what people are willing to do from both the pro and anti points of view. They were both wrong. Even the CDC and the like put out bad information. Internally they had dissent but dismissed it, and went so far as to misspell things so FOIA requests would not find them[1]

No one can be trusted, unfortunately, including trusted sources, platforms, or governments. All of the above will abuse their trust when things prove difficult, trust be damned.

[1]https://www.nytimes.com/2024/05/28/health/nih-officials-foia...

llamaimperative
7 replies
2d5h

No, again, getting things wrong is not the same as lying, and they do not warrant the same degradation in trust.

baryphonic
2 replies
2d5h

If people were avoiding FOIA by having candid discussions on private servers and deliberately misspelling words all while telling the public a contradictory story, that is strong evidence of lying.

If the private communications matched the public ones and there were no efforts to obfuscate, then the best conclusion would be they just called it wrong.

llamaimperative
1 replies
2d3h

I agree hiding from FOIA looks bad, degrades trust in general, and the responsible parties should be punished, but it is definitely not dispositive of lying.

I didn't see any emails where they showed agreement with a different set of facts than what they were communicating to the public. I'm open to seeing sources behind that claim, though.

tbrownaw
0 replies
2d1h

It doesn't prove (but, uh, is rather strongly suggestive) that their earlier statements were lies or contradicted by the records they're hiding, but it is itself a lie.

xanthor
1 replies
2d5h

What if they're lying about getting things wrong versus lying?

llamaimperative
0 replies
2d3h

That’s bad, but not an interpretation one should jump to in a context where it was very easy to get things wrong and there was rather little incentive to lie.

You think CDC et al didn’t know they had limited public credibility with which they could guide public behavior? These people live and breathe questions of institutional credibility all day every day. They obviously know their careers are put at risk even by being wrong never mind by lying.

wsc981
1 replies
2d5h

The whole "horse dewormer" bullshit propagated by MSM was clearly a lie (in so far that this medicine has been used for decades by humans). So, you can't even trust MSM to not be a 'supersharer' of misinformation.

Same is true for Hunter Biden's laptop story.

Many more examples could be found.

mc32
0 replies
2d4h

Heck, you even have Chris Cuomo coming out against many of the things he said about COVID and agreeing with some of the things he used to disagree with.

baryphonic
3 replies
2d5h

> Internally they had dissent but dismissed it and went so far as to misspell things so FOIAs would not find things.

I hadn't heard this yet. Unbelievable. And yet all of the sites I found it on from a quick search have at some point in the past been branded "fake news." In fact, one source, the New York Post, was falsely branded "Russian disinformation" on the eve of the 2020 election and suspended from Meta and Twitter, only for its story to be verified subsequently when it had minimal consequence.

astrange
2 replies
2d4h

The Hunter Biden laptop thing continues to be an intelligence op and not "real". It has not been "verified".

Of course it does contain some real content, since it was constructed from a hack of his iCloud account, but that doesn't validate anything else on it, and some of the earliest claims about it seemed to involve e.g. planted child porn. You only find it credible because those were suppressed so successfully that you don't even remember them.

rufus_foreman
1 replies
2d

From Wikipedia, https://en.wikipedia.org/wiki/Hunter_Biden_laptop_controvers...:

"In November 2022, CBS News published the results of a forensic analysis they commissioned of a copy of the laptop data Mac Isaac initially handed to federal investigators in 2019. The analysis, conducted by Computer Forensics Services, found data, including over 120,000 emails, "consistent with normal, everyday use of a computer", found "no evidence that the user data had been modified, fabricated or tampered with", and found no new files created on the laptop after April 2019, when Mac Isaac received the laptop. The chief technology officer of Computer Forensics Services added: "I have no doubt in my mind that this data was created by Hunter Biden, and that it came from a computer under Mr. Biden's control". Also on November 21, CBS News published the first photograph of the damaged Macbook Pro, which had been provided to them by Hunter Biden's legal team.

According to reports on January 16, 2024, new filings by the U.S. Department of Justice's special counsel, headed by David C. Weiss, appear to be the first public confirmation of the laptop's authenticity by the DOJ. The filings refer to the laptop connected to Hunter Biden stating, “the defendant’s Apple MacBook Pro, which he had left at a computer store.”"

Is that fake news?

astrange
0 replies
1d19h

Yes; notably a special counsel saying something is not "confirmation by the DOJ". A special counsel is a political entity not controlled by the DOJ and not everything they say is an official position or even true.

For instance, they said in the same case that a picture of a table saw with sawdust on it was cocaine Hunter Biden was using.

The defense in the case has said some of the material is not authentic (eg https://storage.courtlistener.com/recap/gov.uscourts.ded.827...) but it's pretty irrelevant to the case so they are probably not actually going to work this out.

sdoering
1 replies
2d5h

We need to clearly differentiate between "best known facts currently available" and "already disproven bullshit".

The first one has an expiration date in the (unknown) future. The second one has one that’s already in the past.

Thirdly there’s the category of "still useful but not true" like Newtonian physics.

belorn
0 replies
2d5h

"already disproven bullshit" is what meta studies are for. In regards to the covid pandemic there are now plenty of those, and unsurprisingly few people liked what they showed. No one likes it when facts do not conform to cultural views.

dahart
0 replies
2d3h

The CDC did put out some info on masks that they pretty quickly retracted and admitted was a mistake. They did that because they were trying to make sure enough masks were available for health care workers, and masks did in fact run out. You have to acknowledge that they admitted the mistake and take that into account when comparing it to something like the Trump campaign’s claim that the election was stolen, which everyone knows is entirely and intentionally false but which they will never admit.

The idea that you can’t trust anyone is part of the ongoing disinformation campaign. Acting like the truth of information is only binary and ignoring the intent, degree of truth, and degree of damage, is a way for politicians who are spreading misinformation intentionally to rationalize doing so by blaming and attacking everyone else just because they try to do the right thing and get it slightly wrong.

We don’t need to trust politicians, we just need to trust each other a little bit. The campaign to sow distrust and polarize voters is working. Don’t let it work on you. Yes, some people are sometimes bad. Don’t forget that people are sometimes good, and most of the country aren’t politicians. Most of the country is decent people trying to get by, and we all pretty much want the same things.

https://en.wikipedia.org/wiki/Disinformation_attack#Undermin...

somenameforme
5 replies
2d5h

On a meta level this would seem to leave discussion of any sort of revolutionary concept mostly censored. So for instance go back in time to when the idea that the Earth was the center of the universe was near universally believed. If one studies the stars this is precisely what one would tend to believe, and you can even create highly accurate predictions for things like where the stars will be, based on this assumption.

Claims that the Earth actually revolved around the Sun, including by people like Galileo, tended to have extensive initial flaws (for instance, Galileo also incorrectly assumed circular orbits, which causes lots of problems) and were completely unprovable given the technology of the time. So you have somebody saying something that many would have considered plainly offensive and/or pseudo-scientific, that had negligible public support, that had provable flaws, that contradicted centuries of expert knowledge, and was also spread by somebody who was in general somewhat anti-social - he initially had extensive support from the powerful Church of the era, but lost it largely by publishing what, for the time, were quite vile insults directed at them for not immediately jumping on board with him. His former relationship is the reason he was able to live out the remainder of his years in the relative comfort of house arrest.

Obviously any sort of 'new vision' of speech policing that would effectively censor Galileo is a terrible idea. And this isn't just an issue of the past, the person who discovered handwashing/germs faced similar issues among countless other examples that are outside the scope of this post.

quacked
4 replies
2d4h

This is a crucial point that many in the thread are missing.

A "fact" as a unit of information is itself subject to the whims of people and cultural attitudes. Is it a "fact" that a whale is a mammal? No, it's a fact that the majority of modern biologists classify a group of animals collectively referred to as whales as a member of a group of animals they call mammals. Is it a fact that "X murdered Y"? No, but it is a fact that a group of people working together to investigate agreed to formally write down that X is a murderer and retaliate accordingly. (You can't even say for sure that they all believed that X murdered Y because each may have a different understanding of what "murder" is, or that they had doubts but when along with the vote, etc.)

When people say "I only believe in statements that are supported by facts" they rarely think about the nuances of the "supporting facts". 600 years ago it was a fact that Christ died for our sins, etc.

dcow
3 replies
2d3h

Well it’s incredibly likely that someone who most people referred to as Christ did indeed die <for our sins>. I think you mean to challenge the reliability or interpretation of the other stuff that allegedly happened afterward.

marcosdumay
1 replies
2d2h

The most likely explanation is that the one person the Bible is about never existed.

There would be documentation if he existed.

somenameforme
0 replies
2d

This is such an extremely interesting comment, given the context of this conversation, because it sums up so much with so little. There is indeed overwhelming documentation (and other evidence) of Jesus' existence [1], and essentially no doubt of such. The Bible in general corresponds quite well with historic evidence on most topics*. Where it diverges from history is obviously in the divine.

But does this mean one shouldn't be able to publicly express doubt of Jesus' existence? I would say no. Because while there is both overwhelming evidence and consensus, there was also overwhelming evidence and consensus for the Earth being the center of the universe. And the greatest leaps in society's state of knowledge tends to come from the times when these 'things everybody knows' end up simply being wrong.

It's okay to say, or even believe, things that are most likely wrong. The whole point about Freedom is having the Freedom to make choices, even when those choices or views may not be what somebody else would consider appropriate. When such choices become sufficiently detrimental, like theft or murder, we prohibit them by law. But prohibiting having the 'wrong view' just seems very myopic. If you change your view after reading those articles, that's cool. If you don't (or even more likely don't even check them out), that's also cool.

[1] - https://en.wikipedia.org/wiki/Sources_for_the_historicity_of...

----

* = Interestingly this exact observation led Thomas Jefferson, who created his own sort of sect-of-1 Christianity, to compose his own Bible, the Jefferson Bible [2]. He took all the likely factual context and writings in the bible, and removed all the supernatural aspects of it - essentially turning it into a historical text with moral lessons. Quite a shame no copies remain.

[2] - https://en.wikipedia.org/wiki/Jefferson_Bible

quacked
0 replies
2d3h

It can't be considered a fact, because it's unverifiable. The "fact" is that certain parties (museums, the Vatican, etc.) claim to possess genuine historical documents describing events that occurred in the past, namely that Christ existed and was killed by the Romans.

I believe it's probably true, but it's important to realize that I'm taking the word of these historians and archivists on faith; I've never seen these documents, and I could not verify their authenticity were I allowed access to them.

It's only through careful examination of what we as individuals can personally verify that we can start to identify what kind of informational waters we swim in, and start to protect ourselves from "fake news".

thsksbd
4 replies
2d6h

" "fake news" typically amounts to false rumors, amplifying unprovable/unsubstantiated claims. "

Like the Steele dossier or the rumor of a pee-pee tape [1].

Look, I'd love to silence Rachel Maddow as much as the next guy, but there is no objective function to reliably do that without also silencing inconvenient truths (the My Lai massacre, fake WMD, Biden's corruption, etc).

[1] That's when I knew power was out to get Trump by any means. Journos seriously reporting that a hotelier is unaware that top hotels are riddled with bugs? Bullshit.

toss1
3 replies
2d4h

Or, if you want better examples, how about the massively promoted "2000 mules" movie?

It was literally all lies, just proven in court, and the producer just apologized to one of the subjects for those lies and pulled it from distribution.

When one of the primary architects of right-wing strategy advocates a key tactic of "flooding the zone with bullshit", that side has no complaint. Particularly when that tactic is derived specifically from Russian dezinformatsiya techniques, where the goal is not to get people to believe the lies (the few who do are a bonus), but to exhaust reason and get people to give up and say "we can't tell what is true". At that point, they are most manipulable.

[0] https://www.npr.org/2024/05/31/g-s1-2298/publisher-of-2000-m...

[1] https://www.cnn.com/2024/05/31/media/salem-will-stop-distrib...

Ray20
1 replies
2d3h

> Particularly when that tactic is derived specifically from Russian dezinformatsiya techniques

I think it is incorrect to call such tactics "Russian dezinformatsiya techniques". They're rather "leftist techniques". Putin just uses them because he was a member of the Communist Party and the KGB and was taught them that way.

toss1
0 replies
2d2h

Nonsense. First, "Russian dezinformatsiya techniques" is more specific, and specifically related to the current events. Second, no one owns them, they're not copyrighted or anything. Third, "leftist" is far more vague. Fourth, Russia has used these techniques for centuries; so if you must go with some kind of large group, if anything it'd be 'Royalist Techniques', predating the Russian revolution.

thsksbd
0 replies
2d

My list was of lies that the news agencies we are supposed to trust peddled; lies peddled by the respectable likes of the BBC, The Economist, and NPR.

So, unless you want "The Epoch Times" to be included with the truth censors, your donkey movie is not at all like my examples.

Btw, these lies I listed cost millions of lives in the third world. But I'm old fashioned like that. I still think Black/Brown lives matter.

throwaway115
4 replies
2d6h

> I would be delighted to read about opposing opinions, provided they are presented + supported by provable facts.

Opinion

  a view or judgment formed about something, not necessarily based on fact or knowledge.

epgui
3 replies
2d6h

Opinions don't need to be supported by facts or based on facts.

They do need to be compatible with factual reality, otherwise they're called delusions.

DavidPiper
1 replies
2d5h

Unfortunately, phantasms[1] and mass delusions are also very much possible. Increasingly, opinions don't need to be compatible with factual reality to be believed.

[1] https://www.reddit.com/r/Anarchy101/comments/pyfhzb/please_e... (Sorry for the Reddit link, it's the most succinct explanation I could find)

epgui
0 replies
2d4h

That’s true… and it’s perfectly compatible with what I wrote.

tanseydavid
0 replies
23h56m

Delusional people do not speak in terms of opinion; they present their perceptions as facts.

nradov
4 replies
2d5h

Most "facts" aren't provable in the mathematical sense. We're pretty sure that smoking increases the risk of lung cancer based on extensive research but it's never been 100% proven. So, is it a "fact" that smoking causes cancer? That depends on the level of evidence required and who does the evaluation.

Fake news and false rumors are as old as humanity. They are not a problem that needs to be solved, and we certainly don't need governments or media executives acting as arbiters of truth.

That said, the Twitter/X Community Notes feature seems to be working well. It has been refreshing to see (mostly) false claims by prominent public figures debunked.

https://www.hindustantimes.com/world-news/us-news/biden-clai...

mc32
0 replies
2d5h

“He ducked in to a 7-11 and came out three minutes later with a beer.”

No, your honor, actually, I ducked into a Quick Mart and came back out two minutes later with a perry. So that is absolutely false.

Are members of Congress clients of prostitutes (sorry, escorts)? Most likely. Have any engaged in statutory rape? Likely, given what we know about the Epsteins. Does that translate into that conspiracy? No. But the worst part is that now DC points out the ridiculousness of that conspiracy to pooh-pooh the likelihood that some members of Congress and their staff engage in below-board activities.

ordu
0 replies
2d3h

> Most "facts" aren't provable in the mathematical sense.

Mathematics is not applicable to reality by itself. Math can be a tool among others to deal with truths and lies, but it is useless by itself.

> Fake news and false rumors are as old as humanity. They are not a problem that needs to be solved

Why do you think so?

Firstly, there is a non sequitur: from "X is old and was not a problem" it doesn't follow that "X is not a problem now". Burning coal was not a problem, but it is a big problem now.

Secondly, why do you think they are not a problem to solve? There are agents that weaponize rumors and fake news. I see no obvious reason to believe that they will inevitably fail, or that we need only wait and watch them fail. But if you see a way to prove it (in any sense of proof, not necessarily the mathematical one), I'd like to hear it.

> we certainly don't need governments or media executives acting as arbiters of truth.

Are you proposing a radical change in how a democracy works? Any democracy has courts that act as arbiters of truth. Democracy cannot exist without these arbiters. Democracy has parliaments, congresses and suchlike to decide on the best way to tackle current problems. These authorities also act as arbiters of truth in some sense.

I believe you want to say that we cannot trust governments to run uncontrolled, and yes, we cannot. But limiting the power of governments is an old problem, and it has solutions in some specific cases. So now we need to figure out how to control fake news without giving governments too much power. It will not be easy, but we cannot say it is impossible before we have tried.

JKCalhoun
0 replies
2d5h

> So, is it a "fact" that smoking causes cancer?

I could be wrong, but I think we're discussing when someone posts something like, "Smoking does not cause cancer."

That's very different.

lenkite
2 replies
2d4h

We have already seen in the last few years that the "fake news" and "misinformation" of today have a reasonably high probability of becoming the "verified news" and "factual information" of tomorrow. Of course, the "fact-checkers" and "misinformation experts" rarely correct themselves or apologize for their censorship.

flyingcircus3
1 replies
2d3h

This take is akin to finding a broken clock that just happens to show the correct time at the moment you found it, and then refusing to accept that the clock is broken, even hours later. Then as the days go on, you accumulate more evidence that the clock is actually working, because you keep checking on the two minutes per day that the broken clock is "accurate", while discounting the other 1438 minutes per day when the clock is wrong.

lenkite
0 replies
2d1h

Well, if the "fact-checkers" are the broken clock, then I would agree with you.

leereeves
2 replies
2d5h

> There is a stark difference between "fake news" and "difference of opinion that I don't agree with".

There is. But there's plenty of room in the definition of "fake news" to be biased and to exclude falsehoods that you support.

Even this paper excludes falsehoods propagated by the left, like the Steele Dossier or the claim that Hunter Biden's laptop was fake.

And as expected, some people here are downvoting this, thus proving my fundamental point. Any discussion of "fake news" is inherently political, though perhaps not always consciously.

Edit: and despite all the evidence otherwise, some people still believe in the fake news.

"Data from a laptop that the lawyer for a Delaware computer repair shop owner says was left by Hunter Biden in 2019 – and which the shop owner later provided to the FBI under subpoena – shows no evidence of tampering or fabrication, according to an independent review commissioned by CBS News."

https://www.cbsnews.com/news/hunter-biden-laptop-data-analys...

astrange
1 replies
2d4h

The laptop found in the computer store of a blind guy containing a copy of Hunter Biden's iCloud was fake, yes.

The prosecutor in the case against him did recently claim it was real without providing new evidence of that. In the same document, he claimed a picture of sawdust on a table saw was a picture of Hunter Biden doing cocaine.

(Other stories about "Ashley Biden's diary" were also faked, though by different people, and faked in different ways.)

tanseydavid
0 replies
23h49m

> claimed a picture of sawdust on a table saw was a picture of Hunter Biden doing cocaine.

They also keep trying to say that Hunter's glass flute (or maybe it's a piccolo?) is a crack-pipe!

Why would he photograph himself with a crack-pipe in his mouth? He wouldn't -- he's obviously practicing his music lessons.

spydr
0 replies
14h32m

A lot of the “fake news” in circulation about Covid turned out to be true. Oh, how some of this info was shut down, from Twitter and Facebook removing it, to the way news outlets reported on it. How do you decide what is fake news and what is something we just got wrong that a small group of people are pointing out?

logifail
0 replies
2d2h

> I would be delighted to read about opposing opinions, provided they are presented + supported by provable facts

Looking back at the output of certain major media outlets during the pandemic, I'd suggest it might be an idea to hold everyone to the same standards, regardless of whether the opinion is seen as "opposing" or "mainstream"...

"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."

gustavus
0 replies
2d3h

Yes, false rumours and unsubstantiated claims such as:

- COVID came from the Wuhan virology lab

- The NIH was funding gain-of-function research in Wuhan.

- There are rich pedophile businessmen that are pimping out underage girls to be able to blackmail them.

I could go on and on, because those are all horrible unsubstantiated rumours, and I am glad we hunted down the people who shared them, persecuted and hounded them, and limited their freedom to share things like that.

api
9 replies
2d6h

We’re not talking about differing views but about flat lies.

It’s basically a DDoS attack. It takes seconds, especially with AI, to generate bullshit, but hours to debunk it.

I certainly haven’t thought of a way to address that imbalance.

thsksbd
7 replies
2d6h

Like the Steele Dossier?

Or that Hunter's laptop outlining his old man's corruption was fake?

What do you think are lies that should be censored and how quick are you to identify them?

Nursie
4 replies
2d4h

Nobody cares about Hunter Biden’s laptop. It’s the biggest nothing-burger ever. If it actually detailed any corruption, then it would have been pursued in court by now. It’s always been a complete waste of time.

justonenote
1 replies
1d20h

Regardless of the specific case of the laptop, it is beyond naive to think that every incidence of corruption that is known about is pursued in court.

Even non-political figures who are known to be corrupt will not be prosecuted or pursued because of the difficulty of securing a conviction or simply priorities and limited resources.

When it comes to people with powerful political connections, you are talking about a different level of 'difficulty pursuing this case'.

Nursie
0 replies
1d15h

And if I were making such a claim, that would be a very valid argument.

astrange
0 replies
2d3h

One of the earliest mentions of it was on fake news sites owned by Guo Wengui claiming it had child porn on it, presumably because they'd planted some on it - the suppression was so effective that the people claiming it's real don't remember this either.

IIRC the articles had very poor ideas of what Americans would find scandalous because they spent much of their time talking about how big Hunter Biden's penis was.

api
0 replies
1d5h

If I were Hunter I would have demanded that the laptop be imaged and a torrent posted to the Internet. Label it NSFW if there was a porn collection.

No mystery no story.

It’s like when Bezos responded to a tabloid threatening to post his dick pic and out him for sending one by owning up to it and threatening to post it himself. Based.

techostritch
0 replies
2d2h

I’m not sure what point you’re trying to make here. Are you trying to make the point that the Steele dossier was fake? It certainly wasn’t censored, but it would have been great if it had been desensationalized.

2four2
0 replies
2d4h

Strange agenda here. This conversation isn't about those things, and isn't about censorship.

shzhdbi09gv8ioi
0 replies
2d2h

One way to address this kind of attack is by vetting the source. If you put some effort into it, you could possibly conclude that for example nytimes.com does not auto generate nonsense stories with AI, so you could then assume their text is written by humans.

Of course this falls apart in this day and age, where many seem to rely on whatever flows thru their social media feeds / reels.

It is certainly a user problem, which we need to counter by educating ourselves and our peers to deal with the new reality of bad actors being commonplace, and by permanently solving the social chain-of-trust issue in our digital lives with new technical solutions.

dgellow
8 replies
2d6h

The majority doesn’t care. You can give people the best tools for identifying fake news, and they will either distrust them or ignore them.

throwaway115
7 replies
2d6h

You're being hyperbolic. Community Notes on X are an example of these tools, and they are not ignored or distrusted from what I've seen. If your tools are poor, make better tools. But also, as a first principle, accept that some people won't believe what you want them to believe, no matter what. That doesn't mean you get to start trying to control them, because you got frustrated.

croes
4 replies
2d6h

Would you trust Community Notes on Truth Social?

They aren't a tool to identify fake news, just an additional information layer which you either trust or you don't.

> If your tools are poor, make better tools

If you are a bad painter, buy better brushes.

Your solution doesn't give people the tools; you still just have a source which claims to show what's true.

The real tool is the user himself using his brain. But that's hard to achieve.

anileated
2 replies
2d5h

If you trust community notes on Twitter, I have news for you (https://www.taiwannews.com.tw/news/5080048). Community notes are used as just another channel to spread misinformation, but this time under the convenient guise of truth, and that is why it is a flawed design.

croes
1 replies
2d5h

I know, but I needed an example where the tool as such is roughly 100% useless, to show it's not a real tool but just an additional layer from some authority you need to trust.

anileated
0 replies
2d1h

Ah, perhaps. I thought I was objecting to you, though we were mostly in agreement.

kylebenzle
0 replies
2d6h

Exactly, Twitter tends to be liberal and so their community responses also tend to be liberal.

brentm
0 replies
2d3h

I don't think the poster is being too hyperbolic. A very large share of the most vocal on both sides of the political aisle prioritize narrative over truth. Literal truth is less important to these people than something that is maybe not 100% true but captures 100% of what they feel. Tools can help these stories spread less wide but they will not be able to prevent the most ardent from continuing to use these stories as cornerstones of their beliefs. This is probably a small fraction of the country overall but a large share of the active on political X/Twitter.

TheOtherHobbes
0 replies
2d6h

They're still the wrong tools. "This is categorically untrue" will still be distrusted and ignored by the people it's meant to reach, because they're not looking for truth.

Most social media interactions are about status plays, "buy my stuff" grifting, and - most of all - tribal identity politics and affirmations of belonging and identification.

With fake news there's also a fair amount of narcissistic contrarianism.

All of this is passed through an online culture optimised for engagement through dopamine hits, rage, fear, grudge farming, quick-hit entertainment, and other addictive mechanisms.

The real motivation in all of this is emotional self-soothing, not a desire for accurate facts.

So gluing "This isn't true" on any of that won't make a difference, because the problem is structural.

Fake News is a cultural and emotional exploit. It reflects cultural values, and it needs cultural and psychological defences.

Some options include better accountability for mainstream journalism; finding healthier income models that don't rely on addictions and ad tech; teaching media literacy in schools; and generally tidying up online media toxicity at all levels.

yosito
7 replies
2d7h

You can point out fake news to people, but unless someone from their "tribe" identifies it, they will just say that the party identifying it is fake.

throwaway115
6 replies
2d6h

So then you need to come to terms with people rejecting what you want them to believe. Some people won't believe you no matter what. Does that mean you get to reach for another tool, like force, or censorship? No, it means you need to accept that you don't control people.

drekk
2 replies
2d6h

It's not about belief or control. If you're harming the social organism by spreading verifiable falsehoods (vaccines cause autism) then you should be rooted out and treated like the cancer you're choosing to be. I can't yell fire in a theater. I'm somehow alive despite that censorship. People need to grow up.

throwaway115
0 replies
2d5h

Where can I read about the constitutional rights of "the social organism"?

justonenote
0 replies
2d5h

Do we really have to go over the history of ideas that were once considered falsehoods but are now accepted as truth?

Your argument rests on the presumption that we have some kind of Oracle that can divine true information from false.

> rooted out and treated like the cancer you're choosing to be

It's cool that you think humans have changed in nature significantly from the time of the Inquisitions, but when you speak like this about people it's pretty clear we haven't.

yosito
0 replies
1d20h

There are tools you can use when you want people to see your perspective, things like empathy and persuasion. You don't have to just fatalistically accept that people think differently than you; you can absolutely attempt to change their minds, in ethical ways.

throw0101c
0 replies
2d5h

> No, it means you need to accept that you don't control people.

Not wrong, but when others' mis- and ill-informed views of the world start affecting other people, then some kind of action needs to happen:

* https://www.cdc.gov/measles/data-research/index.html

* https://www.cdc.gov/measles/about/history.html

* https://www.cbc.ca/news/health/canada-measles-outbreak-vacci...

* https://en.wikipedia.org/wiki/Vaccines_and_autism

* https://www.google.com/search?q=vaccine+autism

justonenote
0 replies
2d6h

> it means you need to accept that you don't control people.

This is absolutely the correct take, but also an idea that is unconscionable to the majority of people in government and other institutions involved in setting policies today.

sobkas
4 replies
2d6h

"How do we stop people from sharing (what I deem to be) fake news?" is the wrong question. The right question is "How do we give people the tools to identify fake news?" If you give people the tools and they still spread what you deem to be fake news, then you've done what you can. Tough cookies for you.

When you teach people to recognise fake news, they will be able to recognise your fake news. What you want is to cut people off from the enemy's fake news (or the enemy's truth, it doesn't matter; they are the enemy) so they will believe everything you say while the enemy can't influence them.

throwaway115
1 replies
2d5h

I don't want that, but I see your point. I would rather our population have a healthy immune system where bad ideas just don't survive, than to have us live as Bubble Boy, cut off from the rest of the world forever.

atmavatar
0 replies
2d4h

Alas, I fear that even if we had a perfect utopia where it was easy to give everyone the choice between fake news that reinforces their beliefs and unopinionated, factual news, it wouldn't turn out any differently than placing a bunch of hungry children in a room with a table of sweets and a table of nutritious vegetables.

Hell, even if it were adults given some training in self-control, the vast majority of them would dip into the sweets table from time to time.

People are really good at rationalization, such that even if they knew up-front which choice was fake news and which was not, I bet you'd similarly observe a significant fraction (if not a majority or supermajority) of people choosing the fake news to feel better.

Sadly, I'm unaware of any real solutions to the fake news problem that don't make things even worse.

thriftwy
1 replies
2d6h

The Internet was created with the implicit idea that everybody brings their stuff (including news) to the table and then the best one wins. Back then the US/Free World ones were clearly the best.

If you want nobody in your country to read any news procured by China/Russia, this kind of defeats the original idea. You've now built something else entirely - like a glorified teletext or minitel.

rightbyte
0 replies
2d6h

Ye I think this is the underlying problem.

The Man® wants us to only read his fake news, to be able to get away with whatever they wanna push.

One fundamental precondition for a functional democracy is the right to be wrong, since what counts as wrong is judged arbitrarily anyway.

The problem with manipulation stems from algorithmic ad feeds anyway. E.g. chronological 'my friends only' feeds did not have that problem.

toss1
2 replies
2d4h

How is this argument different from "Whichever side shouts loudest and longest wins."?

Volume is not equal to facts or truth.

Yes, censorship is dangerous.

But the article is NOT an argument for censorship.

Everyone knows the sagacious Mark Twain quote: "A lie will fly around the whole world while the truth is getting its boots on".

Social media puts that effect on rocket fuel. When some cohorts are using that to consistently and disproportionately amplify lies, it is very reasonable to add a small amount of friction, or to require someone to prioritize what they repost. This does nothing to stop the posting of lies; it only forces them to select the highest-quality (to them) set to repost.

It is no different from the restriction on this Hacker News site; if you post too much here, you get a message like "Please slow down, you're posting too fast.", and you cannot repost for several hours. I have not seen anyone crying "CENSORSHIP!!" over that restriction. I also doubt there exists anyone who would think that the discussion on Twitter is better than HN.

So, why are you crying "CENSORSHIP!!" about a similar proposal for other social media (and be sure to address how a restriction on volume has zero restriction on any specific content)?

tbrownaw
1 replies
2d

> But the article is NOT an argument for censorship.

Really? It sure looks to me like it's advocating putting limits on people's ability to say things.

toss1
0 replies
1d17h

It is putting zero constraints on what can be said.

It only suggests constraining the rate of broadcasting.

You can go to the town square and shout all you want; you'll still be limited by how much you can enunciate in 24 hours. That is not censorship.

Similarly constraining an automated system to act like physical reality is not censorship.

If I constrain WHAT you say in your allotted time, THAT is censorship.

bluepizza
2 replies
2d6h

We provide everyone with way too many hard-to-use tools, just because we don't want to stop the problem at the source.

A normal adult needs to use so many tools and chase so many different types of education by themselves; off the top of my head: financial, real estate, health, technology, purchasing habits, utilities providers, news and misinformation, nutrition, commonly applied scams.

Maybe we should just make taking advantage of people illegal, and prosecute those that do it.

sokoloff
0 replies
2d5h

> Maybe we should just make taking advantage of people illegal

How do you define this in law?

Many of us tap-tap on keyboards and charge people thousands of dollars per week for that. Are we taking advantage of them? We then watch as the staff who works hard to clean our office buildings make 1/10th of what we make. Are we taking advantage of them? If someone comes to my house on a weekend and charges me double the normal price to unclog my toilet or get my heat working again, am I being taken advantage of? If there’s a shortage of generators in an area and I drive in a truckload of them from hundreds of miles away and make them available at a profit to myself, am I taking advantage?

Different people will have legitimately differing opinions on questions above. “Make it illegal to take advantage of people via laws” is much easier to say than to do.

TeMPOraL
0 replies
2d5h

My litmus test is what advertisers, marketers and salesmen can do, as the limit of that is where it turns into illegal scams. Yes, I claim this is a continuum: the legal line dividing shrewd business from fraud is arbitrary, and IMO it is placed way too far towards the "fraud" end, i.e. it allows too much.

biophysboy
2 replies
2d6h

I don’t disagree, but my worry is that this approach will be as effective as diet/exercise has been for obesity

DavidPiper
1 replies
2d6h

"The solution can't be personal responsibility. But also the solution is personal responsibility."

It's something I keep coming back to, and I'm really not sure how to resolve it.

probably_wrong
0 replies
2d5h

My opinion on that dilemma: some problems are so widespread that it makes more sense to deal with them collectively, aiming to cut them down at the root. We usually appoint people to deal with them because it's more effective than everyone having to act in concert. Those people are called "politicians".

Do those people do a good job? Kinda - they tend to get everything uniformly wrong and sometimes they really screw something up, but in general they land more often than not in reasonable compromises. And while your Senator may be a corrupt idiot, the people in their staff may be okay and steer them generally in a sane direction.

tw04
1 replies
2d6h

People are allowed to think and share things that make you angry. If your views aren't mainstream, there's a reason for that.

China has entered the chat. Obviously HN is a western focused site, but it always fascinates me when people just assume Western ideas of freedom are just a solved problem.

Those freedoms, historically speaking, are very much not just a thing we should all expect will continue in perpetuity. So statements like “your ideas lost” sound a bit childish from a global perspective. Do I hope freedom of speech continues to prevail? Absolutely. Is it just the inevitable conclusion you suggest? Not at all.

llamaimperative
0 replies
2d5h

And it’s absolutely possible that changes in technology have dramatically shifted the dynamics of the marketplace of ideas (of course this is the entire purpose of advancing technology).

It is feasible that a Wild West marketplace of ideas tended to select for truth under earlier communication media, from the printing press onward, and that with social media that dynamic is now weaker, nonexistent, or inverted.

RecycledEle
1 replies
1d23h

"How do we stop people from sharing (what I deem to be) fake news?" is the wrong question. The right question is "How do we give people the tools to identify fake news?"

Very true.

I feel so sorry for the people who can not use the Scientific Method to determine the credibility of news sources.

Here is what I do: I find a news story from the source that is about events close to me. Then I go over and check. I hope you try this.

I test politicians the same way. I try to find video of them discussing something from my area, and then check it. If they lied about my local area, they probably lie about everything.

RecycledEle
0 replies
1d23h

If you want to stop fake news, it only takes 1 law.

You can still publish whatever you want, but if the word "news" appears in the page then you must provide links to all original sources.

You claim a source told you that POTUS will have a press conference today? I want a video of the conversation you had with that source and full documentation of who spoke with who, where, when, and under what circumstances.

Oh, it's an anonymous source and you put it on a web page that says it's news? That's 6 months in confinement for whoever did that. Someone will serve 6 months if it means hunting down every last stockholder.

Problem solved.

techostritch
0 replies
2d3h

The narrative a lot of people seem to adopt is that the fake news debate is about limiting the types of messaging and propaganda available to two sides of a political system. And so attacking fake news is only about attacking the tools available to the other side.

I think what’s wrong about this narrative is the idea that fake news is good for the people spreading it. Like, I don’t think that obsessing about the Russia investigation was great for liberals, and I think there are a ton of examples of Republican infighting that are the result of people believing things that aren’t true, and just factually so. There’s a difference between spreading all sorts of malicious opinions about LGBT people (I’m not happy about that, but whatever) and the recent story of a woman who got elected to the school board to get the LGBT curriculum out of schools, read the whole curriculum, realized there was no LGBT curriculum in schools, told her supporters how happy she was to realize that, and then was viciously attacked for it. Or the guy who actually attacked a pizza restaurant, or the Romney campaign being surprised they lost in 2012 because they believed a conservative propaganda site performing intentionally biased polling that showed them a favorable result.

Like looking at the current American political situation, I think you could make a strong argument that fake news is worse for Republicans than it is for Democrats.

mindslight
0 replies
2d3h

It's wrong to frame this topic as being about wanting "the censorship weapon" or not, when we're mostly talking about centralized tech companies boosting this tripe to "drive engagement" - the role of the censor is already being actively played by these companies! When Aunt Mildred had to click 'forward', choose the list of people she was going to bombard with her spam, left the subject line containing FwReFwFwFw, and then the sheer majority didn't even reply, those were all natural limiting factors. Now Faceboot takes mere reading as a positive act of "engagement", shows it to people it predicts will be receptive, and then concentrates activity between them creating an illusion of social proof.

This ties right into your desire for tools, because tech companies don't actually want people using independent tools to access their systems. They're already giving us the tools they want us to use - poor ones designed around slowing us down, confusing our unadorned human brains, and creating a slow drip of partial accomplishment rather than letting us complete a task and move on. Attempts to even augment their tools with things like Adblock are met with a lot of grumbling and sometimes even outright blocking. Full-blown third-party clients (eg yt-dlp) are continually seen as a fringe thing, and churn between names/maintainers/hosts due to legal shakedowns. The nonsense of prior-restraint-based "API keys" runs roughshod over developers' thought processes. A sustainable ecosystem of independent third party clients in the mainstream app stores is essentially impossible - a wink, nod, and specious reference to trademark, DMCA, or CFAA is all it takes to get the other tech gatekeepers to collude.

Unfortunately, this all points to the only way we're fixing any of this is some kind of regulation. The centralized tech companies have set up their positions at a Schelling point of individual agency, and have heartily grown into the power vacuum. Like atoms in a gas, it's impossible for us to all just individually move in the same direction at the same time, and so the only way to get a lever big enough to move them is collective action via the government, as imperfect as that is.

Personally I'm in favor of antitrust enforcement that breaks up this bundling between hosting and client software (and hardware devices as well). They should be considered separate products, when developed at the same company be completely independent business units, with backroom communication (eg unpublished APIs) prohibited. I'd say this would get us 80% of the way there to having the choice of tools that don't themselves encourage the spreading of disinformation. Then additional functionality of automated filtering etc could grow organically on top of that, in a user-representing way. But for any of that to happen, the client ecosystems have to be freed first.

malfist
0 replies
2d7h

I don't disagree that people should learn to better recognize fake news, but when your solution is to have the whole population change their behavior, your solution will not work.

Removal of a few super spreaders is infinitely more feasible than having everyone else change their behavior

john-radio
0 replies
2d3h

If you give people the tools and they still spread what you deem to be fake news, then you've done what you can. Tough cookies for you.

Setting aside the inflammatory tone of your take on this - I think your argument would be strengthened by an example or two of where you're suggesting the line should be drawn - this just is not how running social media sites currently works at all. Companies can and do censor user content all the time, for better or worse. What "tough cookies"? More like tough cookies for people that don't like living in a country where a third of the population believes we're ruled by baby blood drinking reptile people.

AlienRobot
0 replies
2d

I think the correct thing to do would be to stop sharing news.

I just don't see how a site about the latest drama, crime, war, or tragedy happening at the moment anywhere on the planet or on the Internet can be good for anyone's mental health.

If you asked me, I'd say more social media should strive to be like Pinterest, where people don't share their wordy opinions nobody will be better off reading, they just share pictures of things that interest them (or in many cases pictures of products they're selling).

Like, if you're a gamer, which is the better site: one that keeps telling you about the latest drama among Twitch streamers, or one that just has 30 screenshots and short video clips of various games displayed on screen at the same time?

2OEH8eoCRo0
0 replies
2d5h

Too many people get very angry that people share things they disagree with

No? We are angry that the firehose of falsehood is making civil discourse between Americans impossible.

Sneaky of you to slip that in there, new throwaway acct.

foreigner
36 replies
2d11h

Personally I would prefer social media without the ability to "retweet" at all. I want actual original updates from the people I follow, not news (fake or otherwise) or chain letters.

MenhirMike
15 replies
2d10h

That makes it harder to find new accounts to follow though. I would say that the vast majority of accounts that I follow were discovered because other people that I follow retweeted (or boosted in the case of Mastodon) them.

Neil44
4 replies
2d10h

That's just about adding more people to your echo chamber though isn't it?

vasco
2 replies
2d8h

This idea that you should "escape your bubble" on the internet being better than not is confusing to me. If someone enjoys a personalized experience that's what they should have. Or are you telling someone who only reads fiction horror books that they need to escape their echo chamber and read other stuff because under your definition they are not eclectic enough in the media they consume?

mcherm
1 replies
2d7h

"Needs to escape their echo chamber and read other stuff" is a bit too strong, but if you moderated it to "would benefit from escaping their echo chamber and being exposed to other stuff" then yes, I would take that position for literature preferences as well as political news.

I sometimes read literature outside my comfort zone. Sometimes I put it down -- not for me. But occasionally I discover a new genre I enjoy.

PaulHoule
0 replies
2d7h

The trouble with political content is the style rather than the content, I think.

If you see ads for Ground News on YouTube, they make it sound like if you add up what the right-wing ideologues say and what the left-wing ideologues say, you get the truth. I get accused of being a “centrist” when I point out that people are using a toxic, hateful and othering style of communication (e.g. capitalizing Black doesn’t do anything about the difficult and long-standing problems of racism, maybe even mocks or trivializes the problem, but it sure polarizes the world into big B and little B people).

rnmmrnm
0 replies
2d9h

pretty fun though.

glandium
2 replies
2d10h

That's arguably what "the algorithm" is about.

8organicbits
1 replies
2d9h

I think there's a marked difference between seeing boosts of the people you directly follow versus an opaque algorithm that shows you content from the broader network. The former puts you in control: you choose who to follow based on if you like their content and their boosts. The latter is usually tuned towards showing you content that will keep you on the site (and looking at ads), but isn't necessarily content you want to see. As commercial social networking sites become increasingly profit driven, they'll crank the algorithm towards engagement at the cost of everything else. In contrast, a chronological feed of content created by or boosted by the people you directly follow is not "the algorithm".

On Mastodon, I discovered someone via a boost who had written a great blog post. I followed them but found my feed was now full of bird photography (their main hobby, and something I don't care for). I added a filter for "#BirdsOfMastodon" (or something), but there were still too many without that tag. So I unfollowed them and now I don't see any bird photos.

mcny
0 replies
2d8h

Also, I recently learned that different people see different comments on the same Instagram post, and that to me is something I'd never thought about with echo chambers. Like, you could send someone an Instagram post and, assuming Instagram has their history, they might see different comments than you do.

everdrive
2 replies
2d8h

That makes it harder to find new accounts to follow though.

That would be the benefit.

thejohnconway
1 replies
2d7h

What does that mean? If you don’t want to follow accounts, I would have thought that sort of social media isn’t for you (which is a pretty valid stance!)

kortilla
0 replies
2d6h

Using social media to follow people you already know about is fine.

n_plus_1_acc
1 replies
2d10h

I use Mastodon but without retweets ("boosts") in my timeline, because I find some of it only marginally interesting. I can still find new accounts to follow by reading discussions under toots.

PaulHoule
0 replies
2d7h

I feel similarly about “boosting”: a lot of people I follow who sometimes have worthwhile things to say will, in my opinion, boost stuff that is utter trash, leaving the question of whether I unfollow the booster, mute the OP, block the OP, etc.

There are a few ways a toot can get boosted massively, politically oriented outrage is by far the most common.

ben_w
1 replies
2d9h

That sounds like a good change?

I mean, if you don't actually know someone yet, is it really a good idea to hang on their every word the way social media does?

MenhirMike
0 replies
2d9h

If you follow people of certain hobbies - for example, retro computing - then it's a great way to discover others sharing the same interest. You can check their timeline to see if that's someone you want to follow or just choose not to. And if you really dislike someone, you can just block them. I think that retweets/boosts become a really good discovery mechanism here.

If your hobby is e.g., NFL Football, then yeah, the signal/noise ratio makes retweets a terrible thing that just clutters your feed.

Wowfunhappy
6 replies
2d7h

Retweets on Twitter predate the official feature, people used to prepend "RT @username" to the message they wanted to retweet. So I don't know that you could ever stop this.

sillysaurusx
5 replies
2d7h

Elon managed to stop people posting links on Twitter by basically killing any tweet that has one.

One could imagine doing the same for any tweet starting with "RT @username" or its variants.

eatonphil
4 replies
2d7h

I don't say this whatsoever to show off but just as an observer of the system: when I tweet about a new blog post I write which includes the link to the post (not in a followup tweet), the single tweet often gets a few hundred likes and tens of retweets.

While to a degree I believe people (even Twitter themselves) when they say tweets with links are downgraded, it clearly isn't crippling.

sillysaurusx
2 replies
2d7h

Sure, when you have 18k followers, you’re in the point-{oh,…}-one percent.

Meanwhile with 7k followers I barely get ten.

But I’ll try it more and see. Thanks for pointing that out.

EDIT: it’s possible you might be preempting the downweight because you’re including a screenshot with all of your links. I.e. the downweight only happens if twitter shows the expando for an external link, which doesn’t happen when screenshots are included. Either way, I’ll be implementing your technique. :)

eatonphil
1 replies
2d6h

EDIT: it’s possible you might be preempting the downweight because you’re including a screenshot with all of your links. I.e. the downweight only happens if twitter shows the expando for an external link, which doesn’t happen when screenshots are included. Either way, I’ll be implementing your technique. :)

Interesting. The reason I've always done this is just because I want to give people a meaningful preview of the first few paragraphs. And I figured that it's been effective. In more than one way, perhaps, I now see.

sillysaurusx
0 replies
2d6h

It’s possible that the screenshot is just so good at grabbing attention that it cancels out the downweight, too. It’s hard to know anything when the algorithm is a black box.

Your previews are very good, by the way. Much better than anything you’d see from a default expando. I’ll try the same.

p3rls
0 replies
2d5h

There's certainly a system behind the scenes on Twitter accounting for links -- I have many times your followers (80k), since I have abused the system (mostly through giveaways from my sponsor), yet my links often get many times fewer than hundreds of likes and tens of retweets.

You can experiment with the no-link vs. link thing for yourself by putting the link in a followup reply like some news accounts do, btw.

bjornsing
2 replies
2d9h

That’s pretty much what you have on Facebook. It’s also the reason that that platform is completely uninteresting, intellectually. What you get is famous people capitalizing on their fame / follower count, and the rest of us having no chance.

kristiandupont
1 replies
2d8h

[...] and the rest of us having no chance.

Chance for what?

bjornsing
0 replies
2d7h

Of getting any reach for our thoughts, ideas, opinions, of course…

TZubiri
2 replies
2d10h

Make a feature to reshare so that reshares are tracked

9dev
1 replies
2d10h

If I’m not mistaken, that is what WhatsApp does - you can only forward a message so many times. They implemented this to counter fake news spreading in India quickly. I didn’t follow whether that worked as intended, however.

TZubiri
0 replies
1d18h

You don't need to. Every social media platform implements this and has done so for years.

You can trust tradition, your instincts, logic, or science, whatever.

If you are stuck on constant skepticism, questioning even the most blatant truths, you will be stuck on step 1.

Ferret7446
2 replies
2d10h

That doesn't make sense. People will just copy the text and/or link the post.

kristiandupont
0 replies
2d9h

That takes more effort though, which is probably a good thing.

eloisant
0 replies
2d8h

This is what people were doing, writing "RT:" with the other post copy-pasted. That's why Twitter (and probably others) created the retweet functionality; it was created by usage first.

29athrowaway
1 replies
2d9h

Retweet was not originally a feature of Twitter. It was initially a convention where users would write "RT @user Their tweet"

raverbashing
0 replies
2d8h

Exactly

RTs were done manually before they were a thing. Hashtags, the same (so you could search for #something more easily as a topic).

graemep
0 replies
2d8h

I agree, particularly with regard to FB (which is the only social media I use regularly). Most reshared stuff is junk

However, most things FB promotes (i.e. suggests, or that just shows up in my feed for no reason) are the same, and they are all there for the same reason. To increase engagement.

XorNot
0 replies
2d7h

Retweets are arguably better than the alternative since they create a chain of association back to the original source.

The problem is people uncritically believe the original source which is just someone saying something like "I just heard X is happening!" with no actual evidence at all.

p1dda
27 replies
2d10h

What is 'fake news' for one person is truth to another.

What it boils down to is who decides what is the truth and whether they have the right to censor everything and everybody else.

In the 90's, the internet came as a salvation for free speech because now everybody got to have their voice heard. I definitely don't like this trend of censorship, it's not what our western democracy is built upon.

jemmyw
12 replies
2d10h

It's news that's untrue and made up, rather than mistaken reporting. I know we can never know real truth in all circumstances, and there'll always be situations of vagueness and ambiguity. But there's still the things that did happen, to some level of veracity, and the things that people made up. The dissemination of shit people have made up while also lying and saying it's the truth does seem to present a problem.

BoingBoomTschak
7 replies
2d9h

Unless people are caught red-handed fabricating those, how can you know it's fake? You think that cross-referencing or having "reliable sources" like Wikipedia will help you know what's going on on the other side of the world (e.g. Ukraine/Russia)? It's probable even the people in the involved countries don't truly know what's going on.

The only facts are what I can perceive with my own senses and what comes directly from people I know enough to have a judgment I consider reliable on their trustworthiness.

Fact is, entities using the expression "fake news" are often no better than the ones they accuse.

yakshaving_jgt
2 replies
2d7h

The russian federation have repeatedly been shown to produce fake news.

Massive amounts of fake news.

Suggesting that nobody can really know the difference between fantasy and reality is a misinformation tactic used by the kremlin to create more useful idiots who suggest the same.

BoingBoomTschak
1 replies
1d7h

Can you imagine that "the kremlin" says the same to its population? Why do you think your own powers that be are any different?

Reading your comments in this thread, I know beforehand that my replying will be useless, but I'm bored.

yakshaving_jgt
0 replies
1d7h

The kremlin said that Ukraine invaded russia. The rest of the world said russia invaded Ukraine.

If you can’t see which one is right and which one is wrong, then yes, there’s no point in you replying, to anyone, about anything.

kmudrick
1 replies
2d7h

The only facts are what I can perceive with my own senses and what comes directly from people I know enough to have a judgment I consider reliable on their trustworthiness.

"My Mama says that alligators are ornery because they got all them teeth and no toothbrush!" - The Waterboy

If the Waterboy trusts Mama does this make it a fact?

BoingBoomTschak
0 replies
1d7h

I don't see your point; you can judge your own mother untrustworthy. The point is that you need to have extended, real-life contact with someone before deciding to trust them.

Ultimately, the word "fact" in this context means "what someone considers a fact". You can be wrong, but if you can't trust your own senses or judgment, how can you live in this world? Well, rhetorical question, since there are no sincere solipsists.

jemmyw
0 replies
2d6h

I'm well aware that Western media aren't necessarily reporting the ground truth about that war. But it's an interesting one in that there's so much camera footage, mostly from Ukrainians, but also from Russian soldiers with smartphones. So, while current events are always skewed in reporting, osint has been able to piece together a more accurate picture afterwards.

alpaca128
0 replies
2d7h

The only facts are what I can perceive with my own senses

Our perception and memories are known to be very unreliable.

throwaway290
3 replies
2d8h

The dissemination of shit people have made up while also lying and saying it's the truth does seem to present a problem.

The problem is that what you think of as "made up shit" and "lying" can be manipulated without you knowing.

For example, Covid lab leak theory was deliberately made to look like fake news by some guys who would lose money if it were true: https://web.archive.org/web/20210709212358id_/https://www.bm...

There is no need to be a conspiracist to simply understand that with 100% likelihood at any time there is something true that you think is fake news, no matter which side you are on.

jemmyw
2 replies
2d6h

I followed that news story at the time. There was plenty of speculation in the media, but the facts were laid out - there was a lab in the city, they were studying these types of viruses, that type of virus was also present in local wildlife that was traded.

Then it became clear that the Chinese govt wasn't going to investigate or admit if it was a leak, accidental or not. At which point I file as we'll never know + who cares at this point.

And then after that it blew up as a political football in the US. It's not fake news, it's just pointless opinions.

Fake news is saying it was a lab leak, or it wasn't a lab leak. Neither is a true statement. Both are polarising and ill informed positions.

In any case, I don't disagree with your premise that we can be manipulated quite easily.

throwaway290
1 replies
1d14h

"It was a lab leak" or "it wasn't a lab leak" is just an opinion; anyone can have one.

I'm talking about when reputable media reports on people who say "it may be a lab leak" as conspiracists, stories and posts get demoted on FB and IG, etc. That's manipulation and "fake news". It was widely believed on the US left apparently, yet those who bought that angle and who talk about "fake news" like they have some magic radar are mostly the same people.

At which point I file as we'll never know + who cares at this point

We just went through a traumatic pandemic with pretty bad effects and pretty massive human life toll, so... everyone?

jemmyw
0 replies
22h18m

so... everyone?

Yes, you misunderstood, because I worded that badly and glibly. We can't know, the issue is so muddied, and we had a lot to deal with. The virus was already out; knowing wouldn't put it back. The best reaction is to reconsider all these labs anyway.

yakshaving_jgt
4 replies
2d10h

Except for when it's very obviously bullshit, like the number of vatniks who insisted that the russians weren't going to invade Ukraine, or that insisted the russians hadn't invaded Ukraine even after tanks were halfway to Kyiv.

The fact is, there are many bullshitters, and their bullshit isn't up for interpretation as you suggest.

HPsquared
3 replies
2d7h

Even in that case, the people posting those statements probably believed that at the time.

yakshaving_jgt
2 replies
2d7h

Really? You think the foreign minister of the russian federation — the longest serving since the Tsarist era — doesn't know what's happening in his own country?

HPsquared
1 replies
2d6h

Obviously political appointees are going to be unreliable, I was more talking about the hordes of plebs on social media.

yakshaving_jgt
0 replies
2d4h

Where do you think they get their misinformation?

techsupporter
3 replies
2d10h

What is 'fake news' for one person is truth to another.

"Everyone is entitled to his own opinion, but not his own facts." - Everyone from Daniel Patrick Moynihan all the way through to Mike Pence.

There are some things that are demonstrably true and untrue. Society gains nothing, and loses much, by simply throwing up our hands and saying "well, nothing to be done!" in the face of this.

who decides what is the truth and whether they have the right to censor everything and everybody else

That is one heck of a leap.

Screaming the cries of censorship when faced with people who disagree--either on your point, your presentation of your point, or the facts underlying your point--is escaping to a place where your position is no longer the one being debated, it's the "terms of the debate." Thus, one no longer needs to defend their position; they can simply stand to the side and feign being the actual aggrieved party.

I definitely don't like this trend of censorship, it's not what our western democracy is built upon.

It is not censorship. Censorship is not saying "you are incorrect about this and I am calling you out for it and encouraging others to take note of your being incorrect." Heck, censorship is not even being shouted down for one's beliefs. If you say something and other people yell at--or praise!--you for it, that's not censorship.

This article alone says there is no active censorship since people can, and do, merrily spread incorrect statements and half-baked "just asking questions" that even the Weekly World News would have found too preposterous to print.

roenxi
2 replies
2d9h

"Everyone is entitled to his own opinion, but not his own facts." - Everyone from Daniel Patrick Moynihan all the way through to Mike Pence.

Is there any evidence either of them believe that view to be a fact? Politicians routinely make up their own facts. In practice people are actually entitled to their own facts. There are mechanisms to mediate disagreements as best we can but we rely on a certain amount of goodwill rather than an ability to determine the truth.

macintux
1 replies
2d8h

Historically I would argue politicians interpreted facts in their favor more than they created their own facts.

Certain demagogues are fond of nonstop lying, but in U.S. politics that hasn’t been the dominant approach until recently. It’s dramatically harder to compromise on governance when Congress stops agreeing on facts, instead of how to interpret facts and what conclusions to draw from them.

roenxi
0 replies
2d6h

Historically I would argue politicians interpreted facts in their favor more than they created their own facts.

Different pronunciations of the same word; what they do is a juvenile ritual on top of lying. They don't observe facts neutrally or build a solid argument that they test against reality, they start with the conclusion and work out what they need to say to get other people there. And I'm not saying it is a failing on the part of politicians - the voters demand that sort of behaviour from them. But if that is acceptable, flat lying is acceptable. The intent and outcome aren't different as long as bargains are honoured.

It’s dramatically harder to compromise on governance when Congress stops agreeing on facts, instead of how to interpret facts and what conclusions to draw from them.

I don't know if there is any particular evidence that is a bad thing. The main lesson from the 2000s and 2010s was if Congress is united on a course of action it is probably going to be a disaster.

The US still hasn't managed to shake off the PATRIOT act or clean out the secret law court that was established. We seem to be well into a trend where every president will be subject to a spying campaign before entering office. The US economy has been out-capitaled by literal communists in China. It seems like an excellent time to have someone challenging the basic facts that the congress has been agreeing on.

gbnvc
2 replies
2d10h

No, these superspreaders know what they're writing is lies. I don't really understand how holding people accountable for their actions is censorship; unaccountability is just censoring the victims.

HPsquared
1 replies
2d7h

The problem is that if applied rigorously, this would also mean banning all mainstream outlets.

yakshaving_jgt
0 replies
2d7h

What a ridiculous false equivalence.

Before the russians invaded, Western mainstream media reported that the russians were preparing to invade Ukraine.

Mainstream media [kremlin propaganda] in the russian federation mocked Western mainstream media and insisted that the russians weren't about to invade.

The two are not the same.

gorbachev
0 replies
2d7h

1+1=17122

akie
0 replies
2d10h

It is not censorship to point out and correct falsehoods. Without a common understanding of the world around us, at least in terms of what is true or not, what hope can we have to come to a political consensus?

treprinum
22 replies
2d8h

It's a well-known fact that a tiny focused group can have a major impact on the society; that's how many initially fringe progressive ideas were implemented, like LGBT rights. The problem of fake news is that some of them turn out to be true, discrediting whoever opposed them, casting doubt on the "arbiters of truth", giving further credibility to supersharers ("what if this person is right when we know official media lied to us in this and that case?"). Now we have even well-respected people that spent their whole life in the power circles like Jeffrey Sachs casting doubt on many official narratives. This further weakens the "truth" signal and increases confusion. I suspect AI is going to make everything worse fairly soon as well.

sillysaurusx
15 replies
2d7h

It’s also sometimes an important phenomenon. Remember when it was impossible to talk about the lab leak theory without being labeled?

Eventually mainstream media came around, but they did a lot of damage to their credibility. Not to mention the whole Iraq having weapons of mass destruction narrative that many of us grew up seeing proven false firsthand.

At this point I just don’t trust anything. It’s fortunate to be of a scientific mindset, since you can make up your own mind with some confidence that you’re not crazy. That doesn’t really work for most people, which seems like the main problem you touch on.

rimunroe
6 replies
2d6h

Remember when it was impossible to talk about the lab leak theory without being labeled? Eventually mainstream media came around, but they did a lot of damage to their credibility.

…but they didn’t. The lab leak theory is still mostly the realm of conspiracy theorists, and is (rightly, imo) treated as such. I’ve seen stories reporting on efforts investigating the lab leak hypothesis, but they’re always about how the study turned up nothing and thus the best evidence continues to point at direct animal -> human transmission.

Occasionally conspiracy theories do turn out to be true, but that’s vanishingly rare next to the number of them which turn out to be false.

At this point I just don’t trust anything. It’s fortunate to be of a scientific mindset, since you can make up your own mind with some confidence that you’re not crazy. That doesn’t really work for most people, which seems like the main problem you touch on.

Be very careful about this. It’s crucial to think critically and evaluate sources. However, if there’s a theory among domain experts (like epidemiologists or climatologists) which is overwhelmingly believed (Covid didn’t come from a lab) and you believe something which has the reputation of being a conspiracy theory, the odds of you being right don’t look good. Your odds get worse the longer the prevailing theory has been accepted, as in the intervening time the people in the minority will have been searching for convincing support.

thsksbd
3 replies
2d6h

You know we have the government emails vis a vis the lab leak?

rimunroe
0 replies
2d6h

Which emails are you referring to? The ones with Fauci in early 2020? If so, they ruled that out as an option within a few months. I can understand being outraged about the suppression of those emails (though I understand why they’d want to do that), but the fact they were suppressed doesn’t have any bearing on the validity of the lab leak theory.

drekk
0 replies
2d5h

You know we have video evidence of the WMDs right? There were reconnaissance photos, elaborate maps and charts, and even taped phone conversations between senior members of Iraq's military.

rimunroe
0 replies
1d23h

I did say "mostly" for a reason. A few months after the FBI director made those comments the Director of National Intelligence (who oversees the FBI and the rest of the US intelligence community) released a report reiterating that an animal origin was most likely and that there wasn't evidence of a lab leak. The Wikipedia article[1] on the lab leak theory will summarize it better than I can:

In February 2023, The Wall Street Journal reported that the US Energy Department, based on new intelligence, had shifted its view from "undecided" to "low confidence" that the pandemic originated with a lab leak. In the intelligence community, "low confidence" means the information is sourced to low-quality or otherwise untrustworthy sources. In the wake of these reports, FBI Director Christopher Wray reiterated the bureau's assessment, saying that the Government of China was doing its best to thwart any investigation. White House National Security Advisor Jake Sullivan responded to the report saying "some elements of the intelligence community have reached conclusions on one side, some on the other. A number of them have said they just don't have enough information to be sure", and there was still "no definitive answer" to the pandemic origins' question. The reassessment renewed the political debate around the issue in the US.

In June 2023, the Office of the Director of National Intelligence declassified their report on the virus' origins, in compliance with an Act of Congress compelling it to do so. The report stated that while the lab leak theory could not be ruled out, the overall assessment of the National Intelligence Council and a majority of IC assets (with low confidence) was that the pandemic most likely began as a zoonotic event. No evidence was found that SARS-CoV-2 or a progenitor virus existed in a laboratory, and there was no evidence of any biosafety incident. Proponents of the lab leak hypothesis reacted by accusing the agencies of conspiring with the Chinese, or of being incompetent. Covering the story for the Sydney Morning Herald, its science reporter Liam Mannix wrote that the US report marked the end of the lab leak case, and that it had ended "not with a bang, but a whimper".

[1] https://en.wikipedia.org/wiki/COVID-19_lab_leak_theory#Intel...

staunton
2 replies
2d7h

I just don’t trust anything

Surely you trust a lot of things and people every day.

One cannot make up one's mind about almost anything without being influenced by others, nor without taking into account information received from others. The point is to always be willing to question and revise one's opinions while consciously making decisions despite uncertainty.

sillysaurusx
1 replies
2d7h

It seems like we’re saying the same thing. Trust implies that you can afford to outsource your skepticism. Being willing to question whoever you trust is at odds with that.

It’s probably more accurate to say that whenever someone says something, it’s treated as an independent data point, each of which is assigned somewhere between "probably true" and "probably false". The datapoints get updated as more observations about the world come in.

Observations are different than taking someone’s word for it.

amalcon
0 replies
2d1h

That assigned probability is a function of trust in the speaker, prior plausibility, other available information, and the consequences of being wrong in either direction. You can't avoid the trust question by talking about probability.

JKCalhoun
2 replies
2d5h

Remember when it was impossible to talk about the lab leak theory without being labeled?

Fortunately I don't think I have ever curbed my reason or logic because of possibly being "labeled".

At this point I just don’t trust anything.

That's sad to read. It suggests those that would sow disinformation have won. All my life I have been skeptical — somehow I was raised with the mind of a scientist, as likely many of us here were. But I have also learned to be skeptical of skepticism, if you know what I mean — contrarians, doubting Thomasii, etc.

EDIT: I see your comment here: "Observations are different than taking someone’s word for it." You should have led with that. Still, as we can't all be on the ground in Anytown, China or conducting a scientific test on superconductivity, we have to have a framework of what "sounds reasonable based on life experiences" and accept a certain amount of what we hear/read.

elevatedastalt
1 replies
2d3h

It's not just about being 'labeled'.

What amount of censure would break your steely resolve? Maybe you are ok with your posts being down-ranked. What if your accounts are disabled next?

JKCalhoun
0 replies
1d15h

If my accounts are disabled I don't really want to have anything to do with those sites.

As an example, I'm pretty sure there are a few subreddits that would ban me for my beliefs. I have no interest in those subreddits.

To take an extreme example, I'm also old enough to remember enjoying life before there was an internet. If all of the internet tossed me out ... might be the best thing that happened to me in many, many decades.

dahart
0 replies
2d3h

At this point I just don’t trust anything

That is what the spreaders of fake news want, it is literally the agenda to sow mistrust, and it’s working. This, I fear, is going to be the real and lasting damage of this effort, it’s breaking down our ability to find common ground and have civil discourse, it’s breaking our ability to function democratically.

acdha
0 replies
2d5h

Remember when it was impossible to talk about the lab leak theory without being labeled?

No, because I saw people talking about it continuously throughout the period. The conspiracy nuts and right-wing activists making wild evidence-free claims that COVID was a Chinese bio weapon or genetic engineering mistake were rightly dismissed, but people approaching it scientifically and trying to assess evidence for clues that it might have been a mishandled lab sample were not. That’s always been the problem with this: “lab leak” meant different things to different people, and the erstwhile martyrs tried to represent themselves as representing some broader truth when they had always been trying to work backwards from a politically-advantageous explanation.

HPsquared
4 replies
2d7h

AI text generation is going to be very strong at generating arguments for whatever position the user requests.

euroderf
2 replies
2d7h

This sounds like a way to make trial lawyers obsolete.

HPsquared
1 replies
1d21h

Lawyers (especially as a collective) are well-placed to outlaw things.

euroderf
0 replies
1d9h

I can see a sci-fi flick where a SkyNet-style A.I. world takeover is blocked by a plucky band of loquacious paralegals.

visarga
0 replies
2d6h

The same models can curate your feeds of what you don't want to see. This could be a fix for the problem - you can't actually remove unwanted content from the network, but you can hide it like ad blockers. I think we need local AI to de-garbage our internet interactions from now on.

Who decides what should be hidden? You can do it yourself, or align with some trusted source, the same way we subscribe to ad blocking feeds. Yes, there is a risk of a bubble, but you have the tools to avoid it, LLMs are very flexible in how they get prompted.
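Roughly what I mean, as a sketch only (ask_local_llm here is a stand-in for whatever local model you actually run, and the rule list is the reader's own; none of these names are real APIs):

  HIDE_RULES = [
      "outrage bait about celebrities",
      "unsourced claims presented as breaking news",
  ]

  def ask_local_llm(prompt):
      # Stand-in for whatever local model you actually run
      # (llama.cpp, Ollama, ...); wire a real call in here.
      return "NO"

  def should_hide(post_text):
      prompt = (
          "Answer YES or NO. Does this post match any of these categories "
          "the reader wants hidden: " + ", ".join(HIDE_RULES) + "?\n\n"
          "Post: " + post_text
      )
      return ask_local_llm(prompt).strip().upper().startswith("YES")

  def filtered_feed(posts):
      # Filtering happens on the reader's machine; nothing is removed from
      # the network itself, matching the ad-blocker analogy above.
      return [p for p in posts if not should_hide(p)]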

This only saves you from having to endure the garbage; if you're actually interested in stopping others from seeing it, it's no help. I think that's a social problem, not a technological one.

nemo44x
0 replies
2d5h

It's a well-known fact that a tiny focused group can have a major impact on the society;

It’s Pareto essentially. And yes it’s in everything. In any organization a well organized minority will exert power over the disorganized majority, or remainder of people.

It’s why you see ridiculous municipal codes, etc. Small but well-organized groups of people create pressure to bring them into existence. Opposition to them is disorganized and apathetic. The minority group can’t be too small, but because it’s small it’s easier to organize a larger group of outer supporters (sympathizers, useful idiots, etc.) that will march with you on a nice Saturday afternoon. Now your fringe idea looks like it is majority-held, and enough otherwise disinterested people say “sure, whatever”.

derekbreden
13 replies
2d11h

Anecdotally, I’ve paid to promote a number of tweets over the last few years, and have consistently found that there is a certain percentage of users that are apparently retweeting absolutely everything in their feed. My content is niche, and usually has a very small audience, and so these “blind retweeters” stick out like a sore thumb when I look into who does anything with my tweet.

The behavior I’ve observed is consistent with the article, in that the timings of their flood of retweets could very well indicate a real human pressing the buttons, and I suspect they are. And, it does not surprise me that when looking at the long tail of the most absurd fake news that has a small audience on an individual tweet level, that one would find the majority of the retweets for such obvious fakeness is coming from such blind retweeters.

The effect overall is that whatever bubble such users are in gets amplified a bit, absurdity and all, though I’m skeptical they play any role at all in what becomes truly popular.

The real problem is far more nuanced with the less obvious fake news that even those not tapping blindly are taken in by, the less obvious fake news that gets a large audience because a lot of people find it believable, hoaxes that persist in the collective unconscious long after their time in the spotlight has faded, regardless of any debunking or fact checking that played a role in its popularity dying out.

I think the effect of these super spreaders repeating complete nonsense is minuscule compared to the effect of organically popular almost reasonable nonsense that truly goes mainstream.

prox
12 replies
2d11h

I was recently reminded of the 1/9/90 rule :

The so-called 1/9/90 rule posits that on a social media network or review site, only 1 percent of users will actively create content. Another 9 percent, the editors, will participate by commenting, rating or sharing the content. The other 90 percent watch, look and read without responding.

This is my experience as well. So also true on HN, where the majority are just lurkers. (Hi there lurkers, good time to get an account! ;)

I think if you are lurking you are far more likely to just consume something without actively giving it a second thought. I think of the adagio that to really know something, you have to teach it (or be able to explain it at least) and that doesn’t happen in passivity. It happens in dialogue.

kreyenborgi
3 replies
2d10h

I think if you are lurking you far more likely to just consume something without actively giving it a second thought.

Or you've learnt the hard way that voicing your opinion leads to getting invested in your viewpoint (and worthless internet points) and more time, energy and emotion spent on social media, whereas "passively" consuming you can just walk away from any silly argument that you weren't even trying to start, without it staying with you the rest of the night.

prox
1 replies
2d9h

Sometimes yes. It depends on how sincere or good-faith a comment is, or the general atmosphere of a platform. While having lengthy discussions on HN is pretty much impossible, I still feel lots of people are showing a willingness to entertain “curious conversations”.

I do think you highlight a certain skill that's advantageous to have, like knowing when something is indeed a silly argument, a lost cause, or an emotional drain.

082349872349872
0 replies
2d9h

While having lengthy discussions on HN is pretty much impossible

It doesn't happen often, requiring effort from all sides, but I've found the 3rd day is about when a superannuated HN discussion starts getting interesting.

(but some email threads I'm in have lasted for years, so YMMV)

portaouflop
0 replies
2d8h

Ah so it’s about growing up then?

devjab
3 replies
2d10h

It happens in dialogue.

I think it’s a fundamental flaw to assume that anything on SoMe is actual dialogue.

Take our exchange here as an example. You may never read my reply, you may never respond to it and if you do I may never read it. On top of that, we will probably never talk again and we certainly won’t remember each other.

So what is really happening isn’t really dialogue. It’s talking into the vast nothingness for a shot of dopamine or whatever our brains use to reward voicing our opinion. Maybe it started with dialogue long ago, when online communities were smaller and you’d actually talk with the same people every day. But in 2024 I might as well have written this reply on my notes app where no one would ever read it.

Twitter especially is the modern version of standing on a box in Hyde park, screaming nonsense at passers by. Some of them will tell their friends about the idiot on the box in a pub later, but nobody will remember it. Probably not even the person doing the screaming, because they’ll be on with a new topic the next day.

rightbyte
2 replies
2d10h

So what is really happening isn’t really dialogue. It’s talking into the vast nothingness for a shot of dopamine or whatever our brains use to reward voicing our opinion.

I'd say this is a dialogue though. Twitter, however, seems mostly like one-way communication, and those who are not notable shout into the void for no good reason.

prox
1 replies
2d9h

I would like to think HN is dialogue, even though the format isn’t suited for lengthy discussions as things move on. Especially the core tenet of “curious conversation” is very helpful.

Other platforms or certain subs on reddit? Quality differs wildly.

mcmoor
0 replies
1d19h

I don't think HN is dialogue, for the sole reason that I don't get a notification whenever someone replies to me. I wouldn't even know that I'm involved in a dialogue!

spacechild1
2 replies
2d10h

I think of the adagio

I think you meant 'adage'. 'adagio' is a slow musical tempo, which left me confused for a while :)

prox
0 replies
2d10h

I meant adage, I listened to classical music recently (more than usual)

:)

Thank you for the correction!

alwa
0 replies
2d10h

I kind of like thinking about those two words set against each other, though!

Sounds like “adagio” is a contraction of “ad agio” as in “at leisure” in Italian, while the usual references suggest that “adage” might have a couple possible derivations. The main one is that it comes through French from the Latin “ad” + a form of “aio,” or “I say,” so in the straightforward sense “a saying”; another etymology posits a root in “adigo” as in “drive, force.” [0]

[0] https://www.etymonline.com/word/adage

bbarnett
0 replies
2d10h

Likely more like 0.1 or even 0.01%, same with editors.

sethammons
8 replies
2d7h

Way back, like waaaaay back, you can imagine strangers meet out in the wilderness, each wary of the other. To determine threats or potential friends they would need to establish trust. Where you are from, who you know, do we know the same people, do you lie, etc. "Ah, you know the Baker and are traveling through here to trade with them."

Early webs of trust were based on personal knowledge of others. I think a similar web of trust needs to reemerge. Vouching and relationships, and when the vouchee messes up it affects the voucher. Imagine if your relationship graph could crumble when you let someone into the circle of trust who ends up a wolf.

If you could cut out whole circles of trust when they are compromised, we could maybe get more authentic web back. See shit ads or blog spam, penalize the circle that houses it by no longer seeing its content.

Many smaller circles join to other circles whose members know people in the first circle IRL. Like a new take on web rings.

If your only portal to the internet was through your rings and related rings, we may get echo chambers but we may also gain the ability to remove bad actors from our immediate rings. Fake news propaganda? Penalize the ring and all members until that wolf is removed.
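A toy sketch of the vouching mechanic (names, numbers and decay factor all made up; just the shape of the idea): each member is vouched in by someone already inside, and flagging a bad actor also dents the chain of vouchers that let them in.

  from collections import defaultdict

  voucher_of = {}                          # member -> who vouched them in
  reputation = defaultdict(lambda: 1.0)    # everyone starts in good standing

  def vouch(voucher, newcomer):
      voucher_of[newcomer] = voucher

  def flag_bad_actor(member, penalty=1.0, decay=0.5):
      # Penalize the flagged member and, more weakly, the chain of
      # vouchers that let them in.
      while member is not None and penalty > 0.01:
          reputation[member] -= penalty
          member = voucher_of.get(member)
          penalty *= decay

  # Example: alice vouches bob, bob vouches mallory; flagging mallory
  # also dents bob's and (less so) alice's standing.
  vouch("alice", "bob")
  vouch("bob", "mallory")
  flag_bad_actor("mallory")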

#MorningThoughts.

xyst
2 replies
2d5h

Reminds me of the “Social Credit System” in China

pennomi
1 replies
2d4h

Social credit isn’t inherently flawed, everyone naturally develops a working model in their brain where we judge other people based off of a variety of factors. It’s just hugely scary when a government is the one operating a social credit system.

If there were some theoretical framework that would assist users in judging strangers on the internet, according to their own personal settings, that could be effective.

For example, one thing you see on Reddit is that when somebody starts posting disagreeable comments in a community, a concerned member might look at the offending user’s entire post history and discover “oh, the only thing this guy does is make pro-[topic] statements”. The member then shares this information with the rest of the community, hoping to discredit all statements by the offender.

I almost think there could be a personal LLM that does this for you, and preemptively “tags” trolls and spreaders of misinformation in a way that allows you to automatically filter or dismiss their comments.
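A crude non-LLM version of that check, sketched with an arbitrary threshold and illustrative names: measure how concentrated an account's post history is on one topic and tag it when the ratio looks suspicious.

  def topic_concentration(posts, topic_keywords):
      # Fraction of an account's posts that mention any keyword for a topic.
      if not posts:
          return 0.0
      hits = sum(
          1 for p in posts
          if any(k.lower() in p.lower() for k in topic_keywords)
      )
      return hits / len(posts)

  def tag_account(posts, topic_keywords, threshold=0.8):
      # If nearly every post is about one topic, surface a warning tag for
      # the reader. It's a hint to look closer, not a verdict.
      if topic_concentration(posts, topic_keywords) >= threshold:
          return "possible single-issue account"
      return None

An LLM could do the same job with much fuzzier rules, but the output is the same kind of thing: a tag the reader can act on, under their own settings.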

sethammons
0 replies
2d3h

Oh man, like an LLM-generated heuristic pop-up over other users that you can't help but see before engaging: `pennomi tends to make reasonable arguments seeking balance from both sides. Few of their comments are downvoted and there seems to be an interest in spacex`. That's very interesting.

Anotheroneagain
1 replies
2d3h

You shouldn't base your opinions on trust.

dahart
0 replies
2d3h

That’s a nice platitude, but completely impractical. Try to really count the number of things you know first-hand, like really truly know because you were there and you’ve also controlled for alternative explanations. Be honest.

Also think hard about the number of things you believe to be true but haven’t verified yourself.

My list of truths I know first-hand isn’t very big, I don’t know first-hand how an airplane or computer works (even though I design software and hardware), I don’t know first-hand that people landed on the moon, and I don’t know first hand anything about national politics. Pretty much everything we do is based on belief and things other people tell us (which includes stories we read or watch on video).

I don’t know it for a fact, but I believe that most people have about the same level of first-hand knowledge as me, that most people know very few truths, and the majority of how we function day-to-day is based on beliefs that we haven’t and won’t take the time to prove.

This is perhaps a generalization of Brandolini’s Law. [1] I briefly looked for whether there’s a broader and less negative version of this that applies to all information and not just misinformation, and didn’t find it yet. But if you think about it, it’s true that the amount of energy needed to know something is true first-hand is far, far higher (way more than an order of magnitude) than the amount of energy needed to use that information effectively. We basically don’t have the option to spend all of our time verifying the truthfulness of every opinion we have, or nothing at all would get done.

[1] https://en.wikipedia.org/wiki/Brandolini%27s_law

somenameforme
0 replies
2d4h

From my perspective, the web analogy fails immediately. I might trust somebody on one topic, but eyeroll at them on another. And that trust is also subject to change, and often does. Sometimes it's because the person changes, and sometimes because I change. So, for me at least, it's one node with a zillion flickering 1-depth connections.

And I think most people are also of a similar mindset, even if they might not necessarily realize it. It's how you get things like a democracy where everybody ostensibly votes for the candidate they want to win, and then we get a congressional approval rating that's trending to the single digits. [1] That's only possible because people are extremely dissatisfied even when their guy and his party are both in control; otherwise it'd at least hover around 50%.

[1] - https://news.gallup.com/poll/1600/congress-public.aspx

makeitdouble
0 replies
2d7h

To determine threats or potential friends they would need to establish trust.

They'd just wear nice clothes and be well groomed enough, bonus points if they're in uniform, and everybody would trust them.

Knowing who to trust has never been our forte.

carlosjobim
0 replies
2d7h

We've lived in mass societies for more than a century. Individuals do not exist anymore, since all people are mainly formed by mass media and mass indoctrination in schools. The few individuals that exist can wield oversized influence, for example as super sharers.

bell-cot
8 replies
2d4h

Way back ~1990, when internet email was first being rolled out to non-technical employees at fairly-tech-savvy orgs, I noticed that there were a very small minority of people - yes, generally non-technical older women - who suffered from "Recreational Sharing Disorder". It didn't matter if the stuff was blond jokes, or "everyone has to pray for Katie, who has cancer", or "forward this 1,000 times and there will be world peace", or what. Obviously they had neither filters for credibility, nor good filters for respecting other peoples' time budgets and expressed interests. And as the number of connected users expanded, a tiny % of RSD cases could quickly generate enough email traffic to badly impact a site's email service. (Which was usually dial-up modem or fractional T-1.)

Mathematically, it was very interesting. And with an occasional reminder from management, they could pretty well control their RSD at work.

These days, there's a toxic hot mess of politics and emotions laid over the whole situation. But it might be useful to recall that the underlying problem is mostly a network with a very large N, and the bell-curve distributions of certain human behaviors.
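
To make the "very large N plus the tail of the distribution" point concrete, here is a toy sketch. Everything in it is my own assumption (the heavy-tailed lognormal distribution, its parameters, the user count), not anything measured in the article or in the anecdote above; it only shows how a tiny, hyperactive minority can dominate total traffic:

    import random

    # Toy illustration only: per-user share counts drawn from a heavy-tailed
    # (lognormal) distribution. The distribution and parameters are assumptions,
    # not data from the study. Question: how much of the total traffic does the
    # most active 1% of users generate when N is large?
    random.seed(42)
    N = 100_000
    shares = sorted((random.lognormvariate(0.0, 2.0) for _ in range(N)), reverse=True)

    top_1_percent = shares[: N // 100]
    print(f"Top 1% of users account for {sum(top_1_percent) / sum(shares):.0%} of all shares")

With these made-up numbers the top 1% ends up with a wildly disproportionate chunk of the total; the exact figure is beside the point, which is that a skewed distribution of behavior plus a very large N is all it takes to reproduce the "few accounts, most of the traffic" effect.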

zug_zug
4 replies
2d4h

I appreciate the anecdote.

The distinction I’d draw is that email was a push model, but today it’s platforms that decide what we see. If platforms wanted, I think it’d be trivial for them to make a setting for their users to filter out politics/outrage/whatever.

For example Reddit will (unprompted) put links to videos of police brutality on the home page when I’m not part of that subreddit.

This anti-pattern reminds me of putting candy bars in the checkout lane: pitting me against my self-control on purpose.

spixy
2 replies
2d1h

For example Reddit will (unprompted) put links to videos of police brutality on the home page when I’m not part of that subreddit.

What? Yes, if you go to r/popular/ or r/all/, but on the home page it won't (I just tested to be sure, and it shows only my subscribed subreddits).

smegger001
0 replies
2d

I bet you removed all of the default subs and gp didn't.

benfortuna
0 replies
1d17h

In my experience of Reddit you don't just get your subscribed subreddits, but also "you might like this" posts from subreddits you aren't subscribed to.

So I think there's definitely an algorithm at play here.

bell-cot
0 replies
2d2h

True, to a degree.

But if you recall the 1987-ish to 2007-ish era, there was enormous growth in the anger-positive, fact-indifferent conservative talk radio and TV segment. That was not 100% opt-in... but it was still far more so than most modern social media.

Another anecdote: Back about 2002, an old friend of mine cut off all contact with a life-long, retired friend of his. In retirement, the friend-of-friend had been bitten by the Viral Conservative Anger bug - and refused to stop forwarding dozens of VCA emails to my friend every week. Similar for discussing any other topic on the phone, or ...

(Pre-retirement, the friend-of-friend had been a highly skilled engineer, and had seemed to be a wonderful person.)

itronitron
1 replies
2d4h

Reminds me of the day after 9/11. I was working at a small company located a few hours drive from NYC that was a subsidiary of a much larger company headquartered further west.

For whatever reason, an employee in one of the lead administrative groups at 'HQ' was forwarding very high-resolution photos of the previous day's events, and the group email address they were using happened to include everyone in any organization that was in some way under the management chain, so anywhere from 10,000 to 20,000 people likely received those pictures. They sent about three emails before stopping, or being stopped.

For better or worse, the people suffering from RSD have moved to Facebook.

dylan604
0 replies
2d

For better or worse, the people suffering from RSD have moved to Facebook.

Better for me. They've stopped sending emails, and I don't use FB! So win-win for me. Although that email filter that sequestered all of "that uncle's" emails is still ready to go if needed. That's one advantage of email over socials.

causality0
0 replies
2d3h

One of the best things about my parents retiring was the cessation of the endless torrent of garbage e-mails they sent me every day.

https://www.youtube.com/watch?v=KCSA7kKNu2Y

ungreased0675
6 replies
2d6h

This genre of research is disturbing to me. Researchers found accounts spreading ideas they didn’t like, tracked down their real-world identities, and then recommended ways of silencing them. Please consider the implications of this activity, especially (inevitably) when it’s used to target communities we’re a part of.

Boxxed
2 replies
2d4h

This is not about stopping "ideas I don't like," it's about stopping "This shit is made up, and it's intended to be damaging." I do not understand why so many people cannot or refuse to see the difference.

Nursie
1 replies
2d1h

It’s bizarre isn’t it. We live in a world where people are slinging lies about the place, deliberately, for monetary or political gain.

And yet a lot of contributors here see no difference between that and factual information or honest reportage. It’s distressing that so many people seem to swim in this mire of relativity, and see calls for truthfulness and adherence to reality as some sort of sin, an admission of bias and a desire to suppress the other side.

dent9876543
0 replies
1d23h

It's not surprising. Many of the "truths" that have been slung around in recent times have turned out to be noble lies or outright falsehoods, manipulating opinion for monetary or political gain.

The problem is not so much the research itself, but that it is presented and reported as if unaware of, or indifferent to, those events.

jfengel
1 replies
2d6h

That seems to deny even the existence of truth and falsehood. There's only opinion, "ideas they didn't like", and no chance that those ideas might actually be wrong.

There's also kind of an odd contradiction: this is an idea you don't like, and you're suggesting it should be stopped.

thriftwy
0 replies
2d5h

The obvious problem is that we do not know what's true and what's false until way later.

We can tell "organic" from "forced", perhaps, but then the results will not be pretty for any political camp.

whoitwas
0 replies
2d6h

The people dox themselves using social media. The study uses an established method to classify fake news. It's scientific, not stuff they don't like.

rapjr9
6 replies
2d9h

My aunt used to occasionally forward emails to me that were obviously part of viral messaging attempts. They would often be religious in nature, but they'd also sometimes have messages of racial hatred or political topics. They would say things like "If you resend this email 100 times then Bill Gates will pay you a penny per email when you have sent 1,000,000 emails! This is the gospel truth!". There would be a long list of other forwarded emails after that with the email addresses of all the other women who had forwarded that and other emails, so if you looked through that history there would be a variety of topics. "Don't break this chain or you will go to hell!" These retweet networks may be a continuation of a similar thing, playing on people's religiousness to get them to spread false messages (mixed in with true messages). Here's an example of one of them; I've tried to preserve the unusual spacing:

{start of message}

Read all of this one, it is interesting!!

  Near the bottom--the part highlighted in green--will give you GOOSEBUMPS!!!
 

 
 
You don't want to miss this!

VERY INTERESTING-

1. The Garden of Eden was in Iraq

2. Mesopotamia, which is now Iraq, was the cradle of civilization!

3. Noah built the ark in Iraq

4. The Tower of Babel was in Iraq

5. Abraham was from Ur, which is in Southern Iraq

6. Isaac's wife Rebekah is from Nahor , which is in Iraq

7. Jacob met Rachel in Iraq

8. Jonah preached in Nineveh - which is in Iraq

9.. Assyria, which is in Iraq, conquered the ten tribes of Israel

10. Amos cried out in Iraq

11. Babylon, which is in Iraq, destroyed Jerusalem

12. Daniel was in the lion's den in Iraq

13. The three Hebrew children were in the fire in Iraq (Jesus had been in Iraq also as the fourth person in the Fiery Furnace!)

14. Belshazzar, the King of Babylon saw the 'writing on the wall' in Iraq

15. Nebuchadnezzar, King of Babylon, carried the Jews captive into Iraq

16... Ezekiel preached in Iraq

17... The wise men were from Iraq

18. Peter preached in Iraq

19. The 'Empire of Man' described in Revelation is called Babylon --which was a city in Iraq

And you have probably seen this one: Israel is the nation most often mentioned in the Bible.

But do you know which nation is second?

It is Iraq !

However, that is not the name that is used in the Bible..

The names used in the Bible are Babylon , Land of Shinar , and Mesopotamia ... The word Mesopotamia means between the two rivers, more exactly between the Tigris And Euphrates Rivers ..

The name Iraq means country with deep roots.

Indeed Iraq is a country with deep roots and is a very significant country in the Bible.

No other nation, except Israel , has more history and prophecy associated

With it than Iraq

And also, This is something to think about:

Since America is

Typically represented by an eagle.

Saddam should have read up on his Muslim passages ....

The following verse is from the Koran, (the Islamic Bible)

Koran ( 9:11 ) - For it is written that a son of Arabia would awaken a fearsome Eagle.. The wrath of the Eagle would be felt throughout the lands of Allah and lo, while some of the people trembled in despair still more rejoiced; for the wrath of the Eagle cleansed the lands of Allah;

And there was peace.

(Note the verse number!) Hmmmmmmm?!

I BETTER NOT HEAR OF ANYONE BREAKING THIS ONE OR SEE IT DELETED.

This is a ribbon for soldiers fighting in Iraq ..

Pass it on to everyone and pray.

Something good will happen to you tonight at 11:11 PM

This is not a joke.

Someone will either call you or will talk to you online and say that they love you.

Do not break this chain..

Send this to 13 people in

The next 15 minutes.

Go

{End of message}

There were a variety of common topics, religion, support for soldiers, funny stories about old people, pet stories, and then the odd political or social message.

christophilus
1 replies
2d9h

My grandpa used to forward me similar emails. Any idea what the motivation is for whoever originally writes these things? There doesn’t seem to be any sort of financial angle. Is it just boomer-trolling for the sake of it?

rapjr9
0 replies
2d9h

The above research paper says one purpose of some of the emails was to collect email addresses. If you wanted to target a religious, military-supporting, older crowd, this was a way to collect their email addresses. They are also intended to encourage faith, support the military, and support caring for those less fortunate.

And perhaps to spread the occasional political message urging people to "write your congresspeople", etc. I remember getting a few that were obviously racist or which spread lies. I used to collect all the spam email I got and besides those pushing porn there were a lot of them that were twisted. Maybe those were early attempts to create "word of mouth", peer-to-peer communication channels.

brazzy
1 replies
2d9h

I'm pretty sure the "send a copy of this to X people or something bad will happen to you!" meme already existed in the paper mail days.

rapjr9
0 replies
2d9h

Yes, the research paper above says it started centuries ago.

rapjr9
0 replies
2d9h

Here's one that is more political, anti-Obama:

Subject: Lead, follow or get the hell out of the way!

                    Lead, follow or get the hell out of the way!
{picture of Lee Iococca's book}

                                Just as true today as it was when his book first came out.
                                He was, and still is, a brilliant businessman!
                                 Often we need to be reminded of Iococca's words.
                                 
                                 
                                Remember Lee Iacocca, the man who rescued Chrysler Corporation from its death throes?  He's now 82 years old and has a new book, 'Where Have All The Leaders Gone?'.

                                Lee Iacocca Says: 

                                'Am I the only guy in this country who's fed up with what's happening? Where the hell is our outrage with this so called president? We should be screaming bloody murder! We've got a gang of tax cheating clueless leftists trying to steer our ship of state right over a cliff, we've got corporate gangsters stealing us blind, and we can't even run a ridiculous cash-for-clunkers program without losing $26 billion of the taxpayers' money, much less build a hybrid car. But instead of getting mad, everyone sits around and nods their heads when the politicians say, 'trust me the economy is getting better..'

                                Better? You've got to be kidding. This is America , not the damned, 'Titanic'. I'll give you a sound bite: 'Throw all the Democrats out along with Obama!' 

                                You might think I'm getting senile, that I've gone off my rocker, and maybe I have. But someone has to speak up. I hardly recognize this country anymore..

                                The most famous business leaders are not the innovators but the guys in handcuffs.. While we're fiddling in Afghanistan , Iran is completing their nuclear bombs and missiles and nobody seems to know what to do. And the liberal press is waving 'pom-poms' instead of asking hard questions. That's not the promise of the ' America ' my parents and yours traveled across the ocean for. I've had enough. How about you? 

                                I'll go a step further. You can't call yourself a patriot if you're not outraged. This is a fight I'm ready and willing to have. The Biggest 'C' is Crisis! (Iacocca elaborates on nine C's of leadership, with crisis being the first.)

                                Leaders are made, not born. Leadership is forged in times of crisis. It's easy to sit there with thumb up your butt and talk theory. Or send someone else's kids off to war when you've never seen a battlefield yourself. It's another thing to lead when your world comes tumbling down.

                                On September 11, 2001, we needed a  strong leader more than any other time in our history. We needed a steady hand to guide us out of the ashes. A hell of a mess, so here's where we stand.

                                We're immersed in a bloody war now with no plan for winning and no plan for leaving.  But our soldiers are dying daily.

                                We're running the biggest deficit in the history of the world, and it's getting worse every day! 

                                We've lost the manufacturing edge to Asia , while our once-great companies are getting slaughtered by health care costs. 

                                Gas prices are going to skyrock again, and nobody in power has a lucid plan to open drilling to solve the problem.  This country has the largest oil reserves in the WORLD, and we cannot drill for it because the politicians have been bought by the flea-hugging environmentalists.  
                                 Our schools are in a complete disaster because of the teachers union. 

                                Our borders are like sieves and they want to give all illegals amnesty and free healthcare. 

                                The middle class is being squeezed to death every day. 

                                These are times that cry out for leadership.

                                But when you look around, you've got to ask: 'Where have all the leaders gone?' Where are the curious, creative communicators? Where are the people of character, courage, conviction, omnipotence, and common sense? I may be a sucker for alliteration, but I think you get the point.

                                Name me a leader who has a better idea for homeland security than making us take off our shoes in airports and throw away our shampoo?

                                We've spent billions of dollars building a huge new bureaucracy, and all we know how to do is react to things that have already happened.

                                Everyone's hunkering down, fingers crossed, hoping the government will make it better for them.  Now, that's just crazy.. Deal with life.

                                Name me an industry leader who is thinking creatively about how we can restore our competitive edge in manufacturing. Who would have believed that there could ever be a time when 'The Big Three' referred to Japanese car companies? How did this happen, and more important, look what Obama did about it!
                                Name me a government leader who can articulate a plan for paying down the debit, or solving theenergy crisis, or managing the health care problem. The silence is deafening. But these are the crises that are eating away at our country and milking the middle class dry. 

                                I have news for the Chicago gangsters in Congress. We didn't elect you to turn this country into a losing European Socialist state. What is everybody so afraid of? That some bonehead on NBC or CNN news will call them a name? Give me a break. Why don't you guys show some spine for a change?

                                Had Enough? Hey, I'm not trying to be the voice of gloom and doom here.  I'm trying to light a fire. I'm speaking out because I have hope - I believe in America . In my lifetime, I've had the privilege of living through some of   America 's greatest moments. I've also experienced some of our worst crises: The 'Great Depression,' 'World War  II,' the 'Korean War,' the 'Kennedy Assassination,' the 'Vietnam War,' the 1970's oil crisis, and the struggles of recent years since 9/11.

                                Make your own contribution by sending this to everyone you know and care about. It's our country, folks, and it's our future. Our future is at stake!!
                                ***********************************
                                LET'S GET THE MUSLIM ROOKIE OUT OF THE WHITEHOUSE!!!

gmerc
6 replies
2d6h

Facebook figured this out too a long time ago when building the news tab.

But unfortunately they also saw that these people were 90% aligned with the same political party and Joel Kaplan helped Zuck understand that actioning it would mean declaring war on the party.

So Zuck tucked his tail.

Proving once again that most of our problems today are not tech problems and don’t have tech solutions but are societal in nature.

We know what to do on climate change, we just don’t like the tradeoff. Or more precisely, our governmental systems cannot overcome the resistance because we forgot to incentivize long-term survival.

https://www.politico.com/news/2021/10/25/facebook-fatal-flaw...

We know what to do about misinformation.

We know what to do about microplastic pollution.

We know what to do about antibiotic resistance.

We know what to do about gun violence.

So we hold out for magical technology fixes.

The special trick in the US is that everything can be made a partisan problem, which instantly paralyzes any government intervention.

plasticchris
5 replies
2d6h

This level of certainty is a bit frightening. We know what to do? These are complex issues with many trade-offs and unintended consequences.

gmerc
4 replies
2d6h

My point: we don’t like the tradeoffs or consequences, usually relating to our entirely man-made economic system.

What? Arrest the CO2 increase in the atmosphere.

plasticchris
3 replies
2d5h

So many points. Does your thinking start with “it is obvious that…” or “the right side of history”? We as humans are just very susceptible to a combination of fallacy and social reinforcement, no matter our political stance.

gmerc
2 replies
2d4h

There’s nothing complex about human-induced CO2 in the atmosphere. We know what we have to do.

Ray20
1 replies
2d2h

No, we do not. It is just Western-centrism, and you for some reason forget that there are 5 billion more people on the planet, for whom the consequences of the fight against CO2 emissions are more dangerous than global warming. The same goes for microplastic pollution and the other things you listed.

gmerc
0 replies
2d1h

As someone living in Southeast Asia, 2m above sea level and at close to 40 degrees every day now: a hearty Fuck You.

The same from my friends in Delhi, who are dealing with 45C heat.

The same for Vietnam, Malaysia, and Indonesia, dealing with coastal erosion and drowning in Western shit plastics.

Spare me the fake bullshit about being concerned about the rest of the world. We have a word for people like you. Fascists.

Apparently we've moved from denial, to denial of human effect, to the “it’s good for you” phase of colonialism again.

gverrilla
5 replies
2d7h

EDIT: changed my mind

throwawayqqq11
0 replies
2d7h

Consider the fact that they only looked at Twitter to be a scientific limitation too.

On Twitter

is literally in the paper's title; they don't claim to represent public discourse in its entirety.

Limitations and future directions

Several limitations should be noted. First, our sample may contain systematic differences from a fully representative sample.

Your argument is nothing but a fluffy ad hoc reaction which, ironically, IMHO enables these supersharers.

kmudrick
0 replies
2d7h

I mean the subtitle at the very top of the article literally says "Less than 1% of Twitter users posted 80% of misinformation about the 2020 U.S. presidential election" so it seemed pretty clear to me. No fake news here.

epgui
0 replies
2d7h

It’s not fake news. They’re telling you exactly what they did. It’s called a caveat.

Edit: kudos for being reasonable and changing your mind! :)

bozey07
0 replies
2d7h

Even less clickbaity title:

Most Republicans didn't share misinformation about the 2020 U.S. presidential election on Twitter.

baxtr
0 replies
2d7h

It’s called X now!

the_real_cher
4 replies
2d7h

There's no way to get rid of "fake news".

Nor would you want to, since many times what people mean by "fake news" is dissenting news.

The key is to educate people to think for themselves and question everything.

dahart
2 replies
2d2h

This article is about misinformation, not dissent. I don’t know how many people think fake news means dissent rather than misinformation, but the term is defined to mean misinformation. https://en.wikipedia.org/wiki/Fake_news

It is a pretty common tactic for the spreaders of true misinformation to claim their position is dissent rather than lying, that they are the underdog and being ostracized for dissenting…

Getting you to question everything is part of what they want. The goal of misinformation is mistrust. It’s too difficult to question everything, see e.g., Brandolini’s Law.

I don’t know what the solution is, but I personally know highly educated people who are stuck in the misinformation rabbit hole. The problem with mistrust and conspiracy theory is that once it takes hold, it breaks people’s ability to evaluate trustworthy and even scientific sources. Don’t assume education is the key. Fake news is very, very effective, which is why they are doing it.

the_real_cher
1 replies
2d2h

It is a pretty common tactic for the spreaders of true misinformation to claim their

it's also a common tactic of people to label things they don't agree with as misinformation...

Getting you to question everything is part of what they want.

What is the alternative you're proposing here?

Blind belief?

If you do respond, can you try to keep it short and sweet and in a single paragraph?

dahart
0 replies
2d2h

Like I said, I don’t know what the solution is, because I think it’s a hard problem that we can’t just solve with education. I certainly didn’t propose blind belief. It’s true that doesn’t work. Blind disbelief also doesn’t work. I’m just pointing out that there’s evidence that “question everything” not only is not the solution, it may be feeding the trolls and serving their agenda, as is framing fake news as being suppressed dissent.

I agree some people label dissent as misinformation. I suppose that’s another form of fake news. Maybe part of the solution is to stop discussing politics all the time, stop listening to it, and stop taking sides. We don’t need to decide whether every thing a politician said is true or not, we have better things to talk about.

Yes it would be nice if we could teach people to have the right level of skepticism that fake news wouldn’t work, that’s a good goal to have. It would also be good to find a way to build some trust and stop the cycle of increasing polarization. We do have shared values. Figuring out how to talk about those again would be great. The trend has been in the opposite direction, maybe because gossip and drama and tribalism are easier. On the other hand, I do reserve a lot of optimism that most people are good and aren’t obsessed with politics. This article provides some evidence of that, if the majority of the talking points are spread by a tiny minority. Maybe there is a solution here that can address this tiny minority directly, maybe this isn’t a general populace education issue but one we have hope of fixing because the audience is rather small…

gnz11
0 replies
2d5h

Highly educated people get hoodwinked by "fake news" and pseudoscience all the time though. I don't have a solution and education is certainly a step forward but the issue is much more complex.

johnny99k
2 replies
2d5h

Why does this article only show right-wing news as if the left is always truthful?

mjmsmith
0 replies
2d5h

"As for sociodemographics, we found that supersharers were significantly more likely to be women, older adults, and right-leaning"

arandomusername
0 replies
2d3h

Most of the time those researchers are left-leaning, so they will focus on fake news from the right.

taciko
1 replies
2d8h

Fake news is a huge problem.

We need a creative non-profit organization combating mis- and disinformation.

Here we go: "Save the Truth"

https://savethetruth.org/

defrost
0 replies
2d8h

Looks a lot like AI-generated panhandling riding on the back of third parties, with no real substance other than pleas for $$$.

The meconium-spruiking account adds gravitas though ...

stareatgoats
1 replies
2d4h

The term "fake news" is so broad and fraught with ideological positioning, linguistic overreach, and self-righteous, naïve realism that it really ought to be banned. It does not do what it seemingly intends to do, which is to make people more critical of what they read and share; it does the opposite, just labeling sources as bad or good, which is as close to propaganda as it gets.

A more nuanced classification than the "fake news/good news" binary is sorely needed.

klabb3
0 replies
2d3h

It seems like bad science, yes, or perhaps it was editorialized by the media. And it's completely unexpected that all information proliferation follows power laws. If they were comparing different categories of information (political, rage bait, technical discussions, neutral boring news, etc.) they could say something interesting. But this is such a narrow focus on a mostly value-judgment-based class of misinformation. It would be much more interesting to know how certain types of information differ from each other in terms of topological properties.

mcmoor
1 replies
2d6h

*Tiny number of 'supersharers' spread the majority of news.

I bet that's what a further study will conclude.

Chikatorma
0 replies
5h10m

After that

Tiny number of 'supersharers' spread the majority of democracy
admissionsguy
1 replies
2d7h

An alternative solution is to abolish universal suffrage. Re-introduce a wealth requirement, since people who are able to accumulate wealth tend to see the world as it is. Then we don't have to worry about the vagaries of the poor and gullible anymore.

rsynnott
0 replies
2d6h

Re-introduce wealth requirement since people who are able to accumulate wealth tend to see the world as it is.

… Ever met any rich people? There are some seriously confused rich people out there, let me tell you.

whoitwas
0 replies
2d6h

We know social media is used to manipulate people. If you want unbiased news, look at AP, not Meta or Twitter.

tibbydudeza
0 replies
2d3h

Ignorance is strength - that guy was onto something.

smusamashah
0 replies
2d8h

Wasn't this the takeaway from the book "The Tipping Point"? It talks about how a small number of people with lots of connections are the source of making things popular/viral. This is exactly that.

slowhadoken
0 replies
2d2h

Why is the author citing studies about Twitter from 2016 to 2019 but referring to “X users” when Musk bought the company in 2022?

richrichie
0 replies
2d4h

Federal Government (e.g. CIA) is a key spreader of misinformation.

psychlops
0 replies
2d6h

Looks like the groundwork is being laid to dispute the results of the upcoming US presidential election if the wrong person gets elected.

martin82
0 replies
1d5h

I don't even need to open the linked article to know that when they say "fake news" they actually mean "facts that go against the current narrative".

jmyeet
0 replies
2d7h

I want to highlight another aspect to this that doesn't seem to be mentioned: reddit.

We all know about adding "reddit" or "site:reddit.com" as a search term. So does Google. We saw in the recent leaks how Google increasingly values Reddit as a source. Why do we do this? Because it's one of the few remaining places that isn't completely astroturfed.

But this is a temporary situation. You can see this on any controversial topic. It is incredibly easy to hijack a subreddit, or simply change what bubbles up, by brigading posts. This can be done with real people (usually coordinated via a Discord) or with bots. This is not simply a bot detection problem. It might be viewable as a voting-ring problem, but I have my doubts.

Depending on the size of the subreddit this may only take dozens of people acting in concert. Even larger subreddits may only take a few hundred. But such a group can completely change what posts make it to the top and also what comments on each post make it to the top.

My point here is that many on HN and elsewhere like to bemoan "fake news" on social media, which is a real problem to be sure. But at the same time they will extol the virtues of Reddit. The only difference is they're either unaware of Reddit manipulation or it hasn't happened to their favorite subreddits yet.

hallqv
0 replies
2d7h

E.g. MSNBC, the New York Times, CNN

graemep
0 replies
2d8h

Fake news is hard to classify (unless you limit to obvious fakes with no reasonable room to dispute), and hard to sample fairly. This study is also limited to a single event.

What is misinformation can be debatable too, and it can affect the results. For example, a study by King's College London on Covid misinformation in the UK during lockdown looked at mostly very clear-cut things (e.g. that it's caused by 5G) but also included the lab leak theory as misinformation. That shifted the results, because the lab leak theory appealed to a different group of people (older, right wing) than the others (younger, left wing).

I am pretty sure you would get different results in other countries, but my point is what you include changes the results.

My personal experience of political misinformation on social media (mostly FB) is that it is subtle (e.g. quotes out of context to change their meaning, or with misleading commentary), sometimes originates from mainstream media, and comes mostly from the "left". However, that is because my friends, at least those who are inclined to discuss politics on social media, are mostly affluent professional British people who vote Labour (the historically left wing party). A hopelessly biased sample.

emrah
0 replies
1d16h

Services like Twitter, which serve a very, very large number of people, have a responsibility to maintain a civil, clean platform.

If the troublemakers are a small minority, it should be easy to cut them off.

And for those who will cry "freedom of speech": liberty does not mean everyone gets to do whatever they want. Everyone's freedoms are bounded by the freedoms of others.

cjdaly
0 replies
2d10h

Did you see the article claiming Hunter Biden's laptop wasn't really Hunter Biden's laptop, but rather had "all the classic earmarks of a Russian information operation"? If stories like this, run by fake news sites such as NPR or NYT, popped up in your social media feed about the 2020 U.S. presidential election, they probably came from a tiny group of people with a massive impact.

cess11
0 replies
2d12h

'“I do not see a lot of benefit in allowing people to send unrestricted amounts of retweets in a day,” Grinberg says.'

I do not see a lot of benefit in allowing Grinberg to speak on policy matters.

causality0
0 replies
2d3h

Interesting data but technically an orphan statistic. Who is it that's spreading the real news? The same group of people could be responsible for 80% of all news spreading and we wouldn't know from this study.

amelius
0 replies
2d4h

I bet this is also how most religions started.

HPsquared
0 replies
2d7h

Same goes for real news, and everything else. Power law distribution.

DanielBMarkham
0 replies
2d6h

We live in a reductionist world. That's not going to change. The other commenter is right: the only way out is better tooling.

I can't tell the fake people from the real ones anymore, much less accurately judge information quality. That's the cesspool that ad-driven internet created and here we are, no exit in sight.

I've worked with many organizations in crisis and on the way out. There comes a time when everybody knows they're in the endgame, but they march on anyway. People's actions become a bit like those in a Potemkin village. The end days of the Soviet Union were like that.

To me, the tools we're going to need going forward have frack-all to do with coding. Small trust groups of real people, incremental sharing, overlapping interests and groups, devices that do one and only one thing at a time (so that you and your social group can help you monitor and optimize your usage). We know these things work for humans. We've been doing them for thousands of years.

Probably not a popular thing to say on HN, but these are the tools we need to move forward as a species. Tech can certainly help, but it can never even come close to replacing the shared social/evolutionary experience we all have baked in.

I'm positive long-term about tech and progress, but just like any other massive change in tooling, we need to cut out the bullshit and re-adapt social norms. We've done this kind of thing many times before and hopefully we'll continue to do so.

Barrin92
0 replies
2d10h

"It also points to a possible solution, he says: “Simple limits on retweets would constrain the spread of this information while having little effect on the vast majority of users.”

This is a good solution. I remember a while ago, I believe it was in India, WhatsApp limited the number of users you could share something with as a response to the platform being used for riling up ethnic violence. Putting an inhibitor on this cascading virality has to improve the sanity in any communication network.
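
Purely as an illustration of how simple such an inhibitor could be (the class, the names, and the limit of 10 reshares per day are all my assumptions, not anything WhatsApp or the researchers actually specified), a per-user daily cap might look like this:

    from collections import defaultdict
    from datetime import datetime, timezone

    class ReshareLimiter:
        """Rejects reshares once a user exceeds a fixed daily cap (hypothetical numbers)."""

        def __init__(self, daily_limit: int = 10):
            self.daily_limit = daily_limit
            self.counts = defaultdict(int)  # (user_id, UTC day) -> reshares so far

        def allow_reshare(self, user_id: str) -> bool:
            day = datetime.now(timezone.utc).strftime("%Y-%m-%d")
            key = (user_id, day)
            if self.counts[key] >= self.daily_limit:
                return False  # over the cap: this reshare is blocked
            self.counts[key] += 1
            return True

    limiter = ReshareLimiter()
    # An ordinary user never notices the cap; a supersharer hits it almost immediately.
    print(all(limiter.allow_reshare("ordinary_user") for _ in range(3)))      # True
    print(any(not limiter.allow_reshare("supersharer") for _ in range(50)))   # True

The appeal of this kind of limit is exactly what the quote above says: it only ever binds on the handful of accounts resharing dozens of times a day, so the vast majority of users never encounter it.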

Individual accounts broadcasting at the rate of major TV news stations with no scrutiny isn't a combination that was ever going to produce something approaching the truth. The entire premise of how groups produce truth is through a representative wisdom-of-the-crowds effect. 0.1% of highly correlated users swinging every discussion is extremely pathological.

The most glaring problem, of course, is that limiting your most viral users is complete anathema to the business incentives of any for-profit social media platform, so in particular I don’t see it happening on Twitter.

1vuio0pswjnm7
0 replies
1d16h

One of the most ridiculous behaviours of so-called "Big Tech" and its employees is to try to blame end users for problems.

That certain individuals had a noticeable propensity for forwarding chain letters and other garbage on the early internet, from which they received _no economic benefit_, may have informed the tactics of today's "Big Tech" intermediaries but what these so-called "tech" companies do today is to _encourage_ this behaviour for their _own_ economic benefit. Truthfully, it goes beyond merely "promoting" sharing. The www user visiting a so-called "tech" company website does not _make a choice_ to share, like someone sharing e-mail chain letters on the early internet, the www user using a so-called "tech" company website _has no choice_.

https://www.theatlantic.com/technology/archive/2011/09/the-p...

https://adage.com/article/digitalnext/facebook-s-frictionles...

It is in the economic interests of the so-called "tech" company intermediary to _promote_ what is referred to as "viral" content. The quality of the content is irrelevant. If it is "viral", shared and viewed by large audiences, then it has potential to generate revenue for the so-called "tech" company. It has sufficient audience size and therefore it has value to advertisers.

Thus, for example, one can visit the YouTube website to search for videos titles and descriptions that contain a certain keyword. As a result of so-called "tech" company tactics, the results of the search may contain videos whose titles and descriptions do not contain the keyword and in fact have no relevance at all to what is being searched for. The sharing of this content is not optional. It is being forced, without any voluntary user input, by so-called "tech" companies hoping to profit from advertising services.

In sum, the so-called "tech" companies want end users of the www to be looking at the _same_ content. These intermediaries manipulate people who use their websites so that users look at the _same_ pages. There could be millions of people all wanting to view different pages but if each page has an audience of one this is not useful for selling advertising services, hence the so-called "tech" company will manipulate users so that they look at the _same_ pages. It is not doing this for the benefit of the people using the www, or even for the benefit of the "content creators", it is doing this for its _own_ economic benefit.

127
0 replies
2d3h

There really should be a way to target and remove purposefully malicious communication from open social platforms.