YouTube says it'll bring back creators banned for Covid and election content (businessinsider.com)

softwaredoug 2 days ago

I'm very pro-vaccine, and I don't think the 2020 election was stolen. But I think we have to realize that silencing people doesn't work. It just causes the ideas to metastasize. A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

andy99 2 days ago

The more important point (and this is really like a high school civics debate) is that the government and/or a big tech company shouldn't decide what people are "allowed" to say. There's tons of dumb stuff online; the only thing dumber is the state dictating how I'm supposed to think. People seem to forget that sometimes someone they don't agree with is in power. What if they started banning accounts sceptical of the Tylenol-autism claim?

mapontosevenths 2 days ago

> the government and/or a big tech company shouldn't decide what people are "allowed" to say.

That "and/or" is doing a lot of work here. There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

Then again, Alphabet is now claiming they did want to host it and that mean old Biden pressured them into pulling it, so if we buy that, maybe it doesn't matter.

> What if they started banning tylenol-autism sceptical accounts?

What if it's pro-cannibalism or pedophilia content? Everyone has a line, we're all just arguing about where exactly we think that line should be.

int_19h 2 days ago

> There's a huge difference between government censorship and forcing private companies to host content they don't want to host on servers they own.

It really depends. I remember after the Christchurch mosque shootings, there was a scramble to block the distribution of the shooter's manifesto. In some countries, the government could declare the content illegal directly, but in others, such as Australia, they didn't have pre-existing laws sufficiently wide to cover that, and so what happened in practice is that ISPs "proactively" formed a voluntary censorship cartel, acting in lockstep to block access to all copies of the manifesto, while the government was working on the new laws. If the practical end result is the same - a complete country block on some content - does it really matter whether it's dressed up as public or private censorship?

And with large tech companies like Alphabet and Meta, it is a particularly pointed question given how much the market is monopolized.

onecommentman 1 day ago

I wonder, in the case of mass violence events that were used as advertisement for the (assumed) murderer’s POV, whether there should be an equivalent of a House of Lords for the exceptional situation of censoring what in any other context would be breaking news. You don’t want or need (or aren't able) to censor a manifesto for all time, but you would want to prevent the (assumed) murderers from gaining any momentum from their heinous acts. So a ninety-day (but only 90-day) embargo on public speech from bad actors, with the teeth of governmental enforcement, sounds pretty reasonable to me. Even cleverer to salt the ether with “leaks” that would actively suppress any political momentum for the (presumed) murderers during the embargo period, but with the true light of day shining after three months.

MostlyStable 2 days ago

It can simultaneously be legal/allowable for them to ban speech, and yet also the case that we should criticize them for doing so. The first amendment only restricts the government, but a culture of free speech will also criticize private entities for taking censorious actions. And a culture of free speech is necessary to make sure that the first amendment is not eventually eroded away to nothing.

plantwallshoe 2 days ago

Isn’t promoting/removing opinions you care about a form of speech?

If I choose to put a Kamala sign in my yard and not a Trump sign, that’s an expression of free speech.

If the marketing company I own decides to not work for causes I don’t personally support, that’s free speech.

If the video hosting platform I’m CEO of doesn’t host unfounded anti-vax content because I think it’s a bad business move, is that not also free speech?

AfterHIA 2 days ago

Bingo. This is Adam Smith's whole point in the second half of "Wealth of Nations" that nobody bothers to read in lieu of the sentiments of the Cato Institute and the various Adam Smith societies. Nations produce "kinds of people" who, based on their experience of a common liberty and prosperity, will err against tyranny. Economics and autocracy in our country are destroying our culture of "talk and liberality." Discourse has become "let's take turns attacking each other and each other's positions."

The American civilization has deep flaws but has historically worked toward "doing what was right."

https://www.adamsmithworks.org/documents/book-v-of-the-reven...

lkey 2 days ago

Or it might be the case that that 'culture' is eroding the thing it claims to be protecting. https://www.popehat.com/p/how-free-speech-culture-is-killing...

SantalBlush 2 days ago

Are you in favor of HN allowing advertisements, shilling, or spam in these threads? Because those things are free speech. Would you like to allow comments about generic ED pills?

I simply don't believe people who say they want to support a culture of free speech on a media or social media site. They haven't really thought about what that means.

asadotzler 2 days ago

Will you criticize my book publishing company for not publishing and distributing your smut short story?

briHass 2 days ago

The line should be what is illegal, which, at least in the US, is fairly permissive.

The legal process already did all the hard work of reaching consensus/compromise on where that line is, so just use that. At least with the legal system, there's some degree of visibility and influence possible by everyone. It's not some ethics department silently banning users they don't agree with.

mitthrowaway2 2 days ago

The middle ground is when a company becomes a utility. The power company can't simply disconnect your electricity because they don't feel like offering it to you, even though they own the power lines. The phone company can't disconnect your call because they disagree with what you're saying, even though they own the transmission equipment.

mc32 2 days ago

The thing is that people will tell you it wasn’t actually censorship because, for them, it was only a busybody, nosey government telling the tech corps about a select number of people violating their terms (nudge nudge, please do something)… so I think the and/or is important.

AfterHIA 2 days ago

Great post mc32 (I hope you're a Wayne Kramer fan!)

This is the private-public tyranny that's going on right now. The FCC can't directly tell Kimmel "you can't say that," but they can say "you may have violated this or that technical rule which..." This is how Project 2025 will play out in terms of people's real experience. You occupy all posts with ideologically sympathetic players, and the liberality people are used to becomes ruinous as "the watchers" are now "watching for you." The irony is that most conservatives believe this is just "what the left was doing in the 2010s, in reverse," and I don't have a counterargument for this other than "it doesn't matter; it's always bad and unethical." Real differences between Colbert and Tate get taken for granted.

AfterHIA 2 days ago

There's a literal world of literature, both contemporary and classical, which points to the idea that concentrations of power in politics and concentrations of wealth and power in industry aren't dissimilar. I think there are limits to this, as recent commentaries by guys like Zizek seem to suggest that the "strong Nation-State" is a positive legacy of the European enlightenment. I think this is true, "when it is."

Power is power. Wealth is power. Political power is power. The powerful should not control the lives or destinies of the less powerful. This is the most basic description of contemporary democracy but becomes controversial when the Randroids and Commies alike start to split hairs about how the Lenins and John Galts of the world have a right to use power to further their respective political objectives.

https://www.gutenberg.org/files/3207/3207-h/3207-h.htm (Leviathan by Hobbes)

https://www.gutenberg.org/ebooks/50922 (Perpetual Peace by Kant)

https://www.heritage-history.com/site/hclass/secret_societie...

JumpCrisscross 2 days ago

> the government and/or a big tech company shouldn't decide what people are "allowed" to say

This throws out spam and fraud filters, both of which are content-based moderation.

"Nobody moderates anything" unfortunately isn't a functional option, particularly if the company has to sell ads.

ncallaway 2 days ago

As with others, I think your "and/or" between government and "big tech" is problematic.

I think government censorship should be strictly prohibited. I think "company" censorship is just the application of the first amendment.

Where I think the problem lies with things like YouTube is the fact that we have _monopolies_, so there is no "free market" of platforms.

I think we should be addressing "big tech" censorship not by requiring tech companies to behave like a government, but rather by preventing any companies from having so much individual power that we _need_ them to behave like a government.

We should have aggressive anti-trust laws, and interoperability requirements for large platforms, such that it doesn't matter if YouTube decides to be censorious, because there are 15 other platforms that people can viably use instead.

AfterHIA 2 days ago

Another way of articulating this: "concentrations of power and wealth should not determine the speech or political sentiments of the many."

My fear is that this is incredibly uncontroversial until it's not -- when push comes to shove we start having debates about what are "legitimate" concentrations of power (wealth) and how that legitimacy in itself lets us "tolerate what we would generally condemn as intolerable." I feel we need to take a cue from the Chomskys of the world and decree:

"all unjustified concentrations of power and wealth are necessarily interested in control and as such we should aggressively and purposefully refuse to tolerate them at all as a basic condition of democratic living..."

This used to be "social democracy," whereas these days the motto of the Democratic Party in the United States is more "let us make deals with the devil because reasons and things." People have the power. We are the people. Hare fucking Krsna.

heavyset_go 2 days ago

This is just a reminder that we're both posting on one of the most heavily censored, big tech-sponsored spaces on the internet, and arguably, that's what allows you to have your civics debate in earnest.

What you are arguing for is a dissolution of HN and sites like it.

asadotzler 2 days ago

No one in Big Tech decides what you are allowed to say; they can only withhold their distribution of what you say.

As a book publisher, should I be required to publish your furry smut short stories? Of course not. Is that infringing on your freedom of speech? Of course not.

mitthrowaway2 2 days ago

No, they ban your account and exclude you from the market commons if they don't like what you say.

mulmen 2 days ago

Yes that’s how free markets work. Your idea has to be free to die in obscurity.

Compelled speech is not free speech. You have no right to an audience. The existence of a wide distribution platform does not grant you a right to it.

These arguments fall completely flat because it’s always about the right to distribute misinformation. It’s never about posting porn or war crimes or spam. That kind of curation isn’t contentious.

Google didn’t suddenly see the light and become free speech absolutists. They caved to political pressure and are selectively allowing the preferred misinformation of the current administration.

AfterHIA 2 days ago

If the furry smut people became the dominant force in literature and your company was driven out of business fairly for not producing enough furry smut would that too constitute censorship?

I want to see how steep this hill you're willing to die on is. What's that old saying-- that thing about the shoe being on the other foot?

zetazzed 2 days ago

Does Disney have a positive obligation to show animal cruelty snuff films on Disney Plus? Or are they allowed to control what people say on their network? Does Roblox have to allow XXX games showing non-consensual sex acts on their site, or are they allowed to control what people say on their network? Can WebMD decide not to present articles claiming that homeopathy is the ultimate cure-all? Does X have to share a "trending" topic about the refusal to release the Epstein files?

The reason we ban government censorship is so that a private actor can always create their own conspiracy theory + snuff film site if they want, and other platforms are not obligated to carry content they find objectionable. Get really into Rumble or Truth Social or X if you would like a very different perspective from Youtube's.

AfterHIA 2 days ago

Let's say that in the future the dominant form of entertainment is X-rated animal snuff films, for whatever reason. Would a lack of alternative content constitute an attack on your right to choose freely or speak? Given your ethical framework I'd have to say "no," but even as your discursive opponent I would have to admit that if you as a person are averse to "X-rated furry smut," I would sympathize with you as the oppressed if it meant your ability to live and communicate had been stifled or called into question. Oppression has many forms and many names. The Johnny Conservatarians want to reserve certain categories of cruelty as "necessary" or "permissible" by creating frameworks like "everything is permitted just as long as some social condition is met..."

At the crux of things the libertarians and the non-psychos are just having a debate on when it's fair game to be unethical or cruel to others in the name of extending human freedom and human dignity. We've fallen so far from the tree.

singleshot_ 1 day ago

> a big tech company shouldn't decide what people are "allowed" to say

On their platform, that’s exactly what they are entitled to do. When you type into the box in the Facebook app, that’s your speech. But unless the platform wants to add your contribution to their coherent speech product, they have every right to reject it.

Otherwise, the government is deciding what people can say, and you’d be against that, right?

Further, if I wanted to start a social media platform called thinkingtylenolcausesautismisstupid.com, wouldn’t restricting my right to craft my product defeat the whole point of my business?

Giving platforms the ability to moderate their output to craft a coherent speech product is the only reason we have multiple social networks with different rules, instead of one first-mover social network with no rules where everyone is locked in by network effects.

mulmen 2 days ago

I have some ideas I want to post on your personal webpage but you have not given me access. Why are you censoring me?

AfterHIA 2 days ago

I have a consortium of other website owners who refuse to crosslink your materials unless you put our banner on your site. Is this oppression? Oppression goes both ways, has many names, and takes many forms. Its most insidious form being the Oxford Comma.

mulmen 6 hours ago

> Is this oppression?

Are you the government? If not then it is not oppression. It is free speech. This is the point of my rhetorical device.

matthewrobertso 1 day ago

The government told me to.

mitthrowaway2 2 days ago

Is andy99's personal webpage a de-facto commons where the public congregates to share and exchange ideas?

AfterHIA 2 days ago

I know that your post is rhetorical but I'll extend your thinking into real life -- has andy99's personal webpage been created because he's an elected official representing others? Would this still give andy99 the right to distribute hate speech on his personal webpage? I think we can harmonize around "unfortunately so," and that's why I think the way forward is concentrating on the "unfortunately" and not the "so."

We have the right to do a potentially limitless amount of unbecoming, cruel, and oppressive things to our fellow man. We also have the potential for forming and proliferating societies. We invented religion and agriculture out of dirt and need. Let us choose Nazarenes, Jeffersons, and Socrates' over Neros, Alexanders, and Napoleons. This didn't use to be politically controversial!

mulmen 1 day ago

It would be if they’d stop censoring me!

ben_w 1 day ago

> There's tons of dumb stuff online, the only thing dumber is the state dictating how I'm supposed to think

  I've seen stupidity on the internet you wouldn't believe.
  Time Cube rants — four simultaneous days in one rotation — burning across static-filled CRTs.
  Ponzi pyramids stretching into forever, needing ten billion souls to stand beneath one.
  And a man proclaiming he brought peace in wars that never were, while swearing the President was born on foreign soil.
  All those moments… lost… in a rain of tweets.
But even that dumb stuff aside, there are two ways for a government to silence the truth: censorship, and propaganda.

We've got LLMs now, letting interested parties (government or not) overwhelm everyone with an endless barrage of the worst, cheapest, lowest-quality AI slop, the kind that makes even AI proponents like me go "ah, I see what you mean about it being autocomplete", because even the worst of that by quality is still able to bury any bad news story just as effectively as any censorship. Too much noise and not enough signal is already why I'm consuming far less YouTube these days, why I gave up on Twitter when it was still called that, etc.

And we have AI that's a lot better at holding a conversation than just the worst, cheapest, lowest quality AI slop. We've already seen LLMs are able to induce psychosis in some people just by talking to them, and that was, so far as we can tell, accidental. How long will it be before a developer chooses to do this on purpose, and towards a goal of their choice? Even if it's just those who are susceptible, there's a lot of people.

What's important is the freedom to share truth, no matter how uncomfortable, and especially when it's uncomfortable for those with power. Unfortunately, what we humans actually share the most is gossip, which is already a poor proxy for truth and is basically how all the witch hunts, genocides, and other moral-panic-induced horrors of history happened.

It is all a mess; it is all hard; don't mistake the proxy (free speech in general) for the territory (speak truth to power, I think?). Censorship is simultaneously bad and the only word I know for any act which may block propaganda, which is also bad.

asadotzler 2 days ago

My refusing to distribute your work is not "silencing." Silencing would be me preventing you from distributing it.

Have we all lost the ability to reason? Seriously, this isn't hard. No one owes you distribution unless you have a contract saying otherwise.

jhbadger 2 days ago

It's not that simple. For example, when libraries remove books for political reasons they often claim it isn't "censorship" because you could buy the book at a bookstore if you wanted. But if it really would have no effect on availability they wouldn't bother to remove the book, would they?

amanaplanacanal 2 days ago

Libraries are typically run by the government. Governments aren't supposed to censor speech. Private platforms are a different matter by law.

typeofhuman 2 days ago

Not OP, but we did learn the US federal government was instructing social media sites like Twitter to remove content it found displeasing. This is known as jawboning and is against the law.

SCOTUS, in Bantam Books, Inc. v. Sullivan, held that governments cannot coerce private entities into censoring speech they disfavor, even if they do not issue direct legal orders.

This was a publicly announced motivation for Elon Musk buying Twitter, and because of that purchase we know the extent of this illegal behavior.

Mark Zuckerberg has also publicly stated Meta was asked to remove content by the US government.

brookst 2 days ago

Crazy how fast we got from “please remove health misinformation during a pandemic” (bad) to “FCC chair says government will revoke broadcast licenses for showing comedians mocking the president” (arguably considerably worse).

typeofhuman 2 days ago

If you're referring to Jimmy Kimmel, you should probably consider that while the FCC member made that comment, Sinclair (the largest ABC affiliate group) and others had been demanding ABC cancel his show for its horrible ratings and awful rhetoric, which inhibited them from selling advertising. His show was bad for business. It's worth suspecting ABC let no good opportunity go to waste: save Kimmel's reputation and scapegoat the termination as political.

More here: https://sbgi.net/sinclair-says-kimmel-suspension-is-not-enou...

themaninthedark 2 days ago

>On July 20, White House Communications Director Kate Bedingfield appeared on MSNBC. Host Mika Brzezinski asked Bedingfield about Biden's efforts to counter vaccine misinformation; apparently dissatisfied with Bedingfield's response that Biden would continue to "call it out," Brzezinski raised the specter of amending Section 230—the federal statute that shields tech platforms from liability—in order to punish social media companies explicitly.

>In April 2021, White House advisers met with Twitter content moderators. The moderators believed the meeting had gone well, but noted in a private Slack discussion that they had fielded "one really tough question about why Alex Berenson hasn't been kicked off from the platform."

Is there a difference between the White House stating they are looking at Section 230 and asking why this one guy has not been banned?

pfannkuchen 2 days ago

I think the feeling of silencing comes from it being a blacklist and not a whitelist.

If you take proposals from whoever and then only approve ones you specifically like, for whatever reason, then I don’t think anyone would feel silenced by that.

If you take anything from anyone, and a huge volume of it, on any topic and you don’t care what, except for a few politically controversial areas, that feels more like silencing. Especially when there is no alternative service available due to network effects and subsidies from arguably monopolistic practices.

mock-possum 2 days ago

Also allowing it to be posted initially for a period of time before being taken down feels worse than simply preventing it from ever being published on your platform to begin with.

Of course they would never check things before allowing them to be posted because there isn’t any profit in that.

sterlind 2 days ago

I'd certainly consider an ISP refusing to route my packets as silencing. is YouTube so different? legally, sure, but practically?

michaelt 2 days ago

If we were still in the age of personal blogs and phpbb forums, where there were thousands of different venues - the fact the chess forum would ban you for discussing checkers was no problem at all.

But these days, when you can count the forums on one hand even if you're missing a few fingers, and they all have extremely similar (American-style) censorship policies? To me it's less clear than it once was.

scarface_74 2 days ago

No, because you are perfectly capable, technically, of setting up your own servers in a colo and distributing your video.

jabwd 2 days ago

yes... coz youtube is not your ISP. A literal massive difference. RE: net neutrality.

ultrarunner 2 days ago

At some level these platforms are the public square and facilitate public discussion. In fact, Google has explicitly deprioritized public forum sites (e.g. PHPbb) in preference to forums like YouTube. Surely there is a difference between declining to host and distribute adult material and enforcing a preferred viewpoint on a current topic.

Sure, Google doesn't need to host anything they don't want to; make it all Nazi apologia if they think it serves their shareholders. But doing so and silencing all other viewpoints in that particular medium is surely not a net benefit for society, independent of how it affects Google.

Scoundreller 2 days ago

“Covid” related search results were definitely hard-coded or given a hand-tuned boost. Wikipedia was landing on the 2nd or 3rd page which never happens for a general search term on Google.

I’d even search for “coronavirus” and primarily get “official” sites about Covid-19 even tho that’s just one of many coronaviruses. At least Wikipedia makes the front page again, with the Covid-19 page outranking the coronavirus page…

unyttigfjelltol 2 days ago

> My refusing to distribute your work is not "silencing."

That distinction is a relic of a world of truly public spaces used for communication— a literal town square. Then it became the malls and shopping centers, then the Internet— which runs on private pipes— and now it’s technological walled gardens. Being excluded from a walled garden now is effectively being “silenced” the same way being excluded from the town square was when whatever case law you’re thinking of was decided.

Jensson 2 days ago

> No one owes you distribution unless you have a contract saying otherwise.

Common carrier law says you have to for some things, so it makes sense to institute such a law for some parts of social media, as they are fundamental enough. It is insane that we give that much censorship power to private corporations. They shouldn't have the power to decide elections on a whim, etc.

AfterHIA 2 days ago

I 100% agree with your sentiment here, Jensson, but in Googling "common carrier law" what I get are the sets of laws governing transportation services liability:

https://en.wikipedia.org/wiki/Common_carrier

Is there perhaps another name for what you're describing? It piques my interest.

Jensson 2 days ago

Common carrier rules also apply to phones and electricity and so on; they are what prevents your phone service provider from deciding who you can call or what you can say. Imagine a world where your phone service provider could beep out all your swear words, or prevented you from calling certain people; that is what common carrier prevents.

So Google banning anyone talking about Covid is the equivalent of a phone service provider ending service for anyone mentioning Covid on their phones. Nobody but the most extreme authoritarians thinks phone providers should be allowed to do that, so why not apply this to Google as well?

timmg 2 days ago

It's interesting how much "they are a private company, they can do what they want" was the talking point around that time. And then Musk bought Twitter and people accuse him of using it to swing the election or whatever.

Even today, I was listening to NPR talk about the potential TikTok deal and the commenter was wringing their hands about having a "rich guy" like Larry Ellison control the content.

I don't know exactly what the right answer is. But given their reach -- and the fact that a lot of these companies are near monopolies -- I think we should at least do more than just shrug and say, "they can do what they want."

Ekaros 1 day ago

If you refuse to distribute some information you are making an editorial decision. Clearly you are reviewing all of the content. So you should be fully liable for all content that remains, including things like libel or copyright violation.

To me that sounds like a fair trade. You editorialize content; you are liable for all content. In every possible way.

joannanewsom 2 days ago

Jimmy Kimmel wasn't being silenced. He doesn't have a right to a late night talk show. Disney is free to end that agreement within the bounds of their contract. Being fired for social media posts isn't being silenced. Employment is for the most part at will. Getting deported for protesting the Gaza war isn't being silenced. Visas come with limitations, and the US government has the authority to revoke your visa if you break those rules. /s

You seem to think there's a bright line of "silenced" vs "not silenced". In reality there's many ways of limiting and restricting people's expressions. Some are generally considered acceptable and some are not. When huge swaths of communication are controlled by a handful of companies, their decisions have a huge impact on what speech gets suppressed. We should interrogate whether that serves the public interest.

amanaplanacanal 2 days ago

The US has pretty much given up on antitrust enforcement. That's the big problem.

scarface_74 2 days ago

The federal government was literally pressuring ABC to take Kimmel off the air. Even Ted Cruz and other prominent republicans said that was a bridge too far.

joannanewsom 1 day ago

The federal government was literally pressuring YouTube to remove certain COVID content that did not violate its policies. It's said explicitly in the story.

What I'm trying to get at is it's possible to stifle people's freedom of expression without literally blocking them from every platform. Threatening their livelihood. Threatening their home. Kicking them off these core social media networks. All of these things are "silencing". And we should be wary of doing that for things we simply disagree about.

hn_throw_250915 2 days ago

[dead]

justinhj 2 days ago

So you're saying that YouTube is a publisher and should not have section 230 protections? They can't have it both ways. Sure, remove content that violates policies, but YouTube has long set itself up as an opinion police force, choosing which ideas can be published and monetized and which cannot.

justinhj 2 days ago

Thank you. I was completely wrong about section 230.

tzs 2 days ago

Section 230 does not work like you think it does. In fact it is almost opposite of what you probably think it does. The whole point was to allow them to have it both ways.

It makes sites not count as the publisher or speaker of third party content posted to their site, even if they remove or moderate that third party content.

bee_rider 2 days ago

YouTube’s business model probably wouldn’t work if they were made to be responsible for all the content they broadcasted. It would be really interesting to see a world where social media companies were treated as publishers.

Might be a boon for federated services—smaller servers, finer-grained units of responsibility…

kypro 2 days ago

I agree. People today are far more anti-vaccine than they were a few years ago, which is kinda crazy when you consider we went through a global pandemic where one of the only things that actually worked to stop people dying was the rollout of effective vaccines.

I think if public health bodies had just laid out the data they had honestly (good and bad) and said that they think most people should probably take it, but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine and we wouldn't have allowed anti-vaccine sentiment to fester.

trollbridge 2 days ago

And the attempts at censorship have played a part in people drifting towards being more vaccine-hesitant or anti-vaccine.

It's often a lot better to just let kooks speak freely.

hypeatei 1 day ago

> It's often a lot better to just let kooks speak freely.

They have always been able to speak freely. I still see vaccine conspiracies on HN to this day. It was rampant during COVID as well.

vFunct 2 days ago

It's less about censorship and more about more people becoming middle-class and therefore thinking they're smarter than researchers.

There is nobody more confident in themselves than the middle-class.

khazhoux 2 days ago

That’s a very confident statement presented without a hint of evidence.

gm678 2 days ago

That didn't happen in a vacuum; there was also a _lot_ of money going into pushing anti vaccine propaganda, both for mundane scam reasons and for political reasons: https://x.com/robert_zubrin/status/1863572439084699918?lang=...

logicchains 2 days ago

>where one of the only things that actually worked to stop people dying was the roll out of effective vaccines.

The only reason you believe that is because all information to the contrary was systematically censored and removed from the media you consume. The actual data doesn't support that, there are even cases where it increased mortality, like https://pmc.ncbi.nlm.nih.gov/articles/PMC11278956/ and increased the chance of future covid infections, like https://pubmed.ncbi.nlm.nih.gov/39803093/ .

wvenable 2 days ago

It isn't hard to find that randomized controlled trials and large meta-analyses show that COVID vaccines are highly effective. No need to rely on media. You can point to one or two observational re-analyses that show otherwise but overall they are not particularly convincing given the large body of easily accessible other evidence.

lisbbb 1 day ago

I don't think a meta-analysis is worth anything at all, to be totally honest with you. I also don't think those gene therapy shots were at all effective, given how many people contracted covid after receiving the full course of shots. I think basic herd immunity ended covid and the hysteria lasted far beyond the timeframe in which there was truly a problem.

Furthermore, I think those shots are the cause of many cancers, including my wife's. The mechanism? The shots "programmed" the immune system to produce antibodies against covid to the detriment of all other functions, including producing the killer T-Cells that destroy cells in the early stages of becoming cancerous. That's why so many different cancers are happening, as well as other weird issues like the nasty and deadly clotting people had. I have no idea about myocarditis, but that's fine because it is a well documented side effect that has injured a lot of people. So cancer and pulmonary issues are the result of those poorly tested drugs that were given out to millions of people without informed consent and with no basic ethical controls on the whole massive experiment.

And before you gaslight me, please understand that my wife, age 49, was diagnosed with a very unusual cancer for someone of her sex and age and it's been a terrible fight since June of 2024 to try and save her life, which has nearly been lost 3x already! Of course I have no proof that the Pfizer shots caused any of this, but damn, it sure could have been that. Also, her cousin, age 41, was diagnosed with breast cancer that same year. So tell me, how incredibly low probability is it that two people in the same family got cancer in the same year? It's got to be 1 in 10 million or something like that.

Just don't gaslight me--we can agree to disagree. I'm living the worst case scenario post covid and I only hope my daughter, who also got the damn shots, never comes down with cancer.

rpiguy 2 days ago

I appreciate you.

People have become more anti-vax because the Covid vaccines were at best ineffective and, as you said, anything contra-narrative is buried or ignored.

If you push a shitty product and force people to take it to keep their jobs it’s going to turn them into skeptics of all vaccines, even the very effective ones.

More harm than good was done there. The government should have approved them for voluntary use so the fallout would not have been so bad.

OrvalWintermute 2 days ago

[flagged]

cynicalkane 2 days ago

This is typical of Covid conspiracy theorists, or conspiracy theorists of any sort: one or two papers on one side prove something, but an overwhelming mountain of evidence on the other side does not prove something. The theorist makes no explanation as to how a planetful of scientists missed the obvious truth that some random dudes found; they just assert that it happened, or make some hand-waving explanation about how an inexplicable planet-wide force of censors is silencing the few unremarkable randos who somehow have the truth.

The first paper seems to claim a very standard cohort study is subject to "immortal time bias", an effect whereby measuring outcomes can seem to change them. The typical example of sampling time bias is that slow-growing cancers are more survivable than fast-growing ones, but also more likely to be measured by a screening, giving a correlation between screening and survivability. So you get a time effect where more fast-acting cancers do not end up in the measurement, biasing the data.
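To make the screening example concrete, here's a toy simulation (made-up probabilities, nothing to do with the paper) showing how the correlation appears even when screening has no causal effect at all:

  import random

  # Toy model: slow-growing cancers are both more survivable AND more likely
  # to be caught by a routine screen before symptoms appear. Screening itself
  # changes no outcome here; the correlation is pure sampling bias.
  random.seed(0)
  cases = []
  for _ in range(100_000):
      slow = random.random() < 0.5                          # half the cancers grow slowly
      survives = random.random() < (0.8 if slow else 0.3)   # slow ones are more survivable
      screened = random.random() < (0.7 if slow else 0.2)   # and more likely to be screened first
      cases.append((screened, survives))

  def survival_rate(group):
      return sum(s for _, s in group) / len(group)

  print("survival if screened:  ", round(survival_rate([c for c in cases if c[0]]), 2))      # ~0.69
  print("survival if unscreened:", round(survival_rate([c for c in cases if not c[0]]), 2))  # ~0.44

The bias only exists because the outcome (slow vs fast growth, and hence survival) changes the odds of being sampled in the first place.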

But in measurements such that one outcome or the other does not bias the odds of that outcome being sampled, there can be no measurement time effect, which is why it's not corrected for in studies like this. The authors do not explain why measurement time effects would have anything to do with detecting or not detecting death rates in the abstract, or anywhere else in the paper, because they are quacks, who apply arbitrary math to get the outcome they want.

As another commenter pointed out, randomized controlled trials -- which cannot possibly have this made-up time effect -- often clearly show a strongly positive effect for vaccination.

I did not read the second paper.

lisbbb 1 day ago

[flagged]

kelnos 19 hours ago

Please stop posting conspiracy theory garbage.

A sibling comment read your first link and noted the problems with it. I read just the abstract of the second link, and it's clear their methodology and description of what they're measuring can't actually support their conclusions.

vkou 2 days ago

> but left it to people to decide, the vast, vast majority of people would still have gotten the vaccine but we wouldn't have allowed anti-vaccine sentiment to fester.

Nah, the same grifters who stand to make a political profit off turning everything into a wedge issue would have still hammered right into it. They've completely taken over public discourse on a wide range of subjects that go well beyond COVID vaccines.

As long as you can make a dollar by telling people that their (and your) ignorance is worth just as much - or more - than someone else's knowledge, you'll find no shortage of listeners for your sermon. And that popularity will build its own social proof. (Millions of fools can't all be wrong, after all.)

kypro 2 days ago

I agree. Again the vast majority would have gotten the vaccine.

There's always going to be people for all kinds of reasons pushing out bad ideas. That's part of the trade-off of living in a free society where there is no universal "right" opinion the public must hold.

> They've completely taken over public discourse on a wide range of subjects

Most people are not anti-vax. If "they've" "taken over public discourse" in other subjects to the point you are now holding a minority opinion you should consider whether "they" are right or wrong and why so many people believe what they do.

If you can't understand their position and disagree, you should reach out to people in a non-confrontational way, understand their position, then explain why you disagree (if you still do at that point). If we all do a better job at this we'll converge towards truth. If you think talking and debate isn't the solution to disagreements, I'd argue you don't really believe in our democratic system (which isn't a judgement).

vel0city 2 days ago

While I do agree "most people are not anti-vax", the rates of opting out of vaccines or doing delayed schedules or being very selective have gone way up.

Some of these public school districts in Texas have >10% of students objecting to vaccines. My kids are effectively surrounded by unvaccinated kids whenever they go out in public. There's a 1 in 10 chance that kid on the playground has never had a vaccine, and that rate is increasing.

A lot of the families I know actively having kids are pretty crunchy and are at least vaccine hesitant if not outright anti-vax.

https://www.dshs.texas.gov/sites/default/files/LIDS-Immuniza...

someNameIG 2 days ago

It's more that people in general* connect to personal stories far more than impersonal factual data. It's easier to connect to seeing people say they had adverse reactions to a vaccine than to statistical data showing it's safer to get vaccinated than not. It's also easier to believe conspiracies; it's easier to think bad things happen due to the intent of bad people than that the world is a complex, hard-to-understand place with no intent behind things happening.

These are just things that some of the population will be more attracted to, I don't think it has anything to do with censorship, lockdowns, or mandates. At most the blame can be at institutions for lacking in their ability to do effective scientific communication.

*And this skews more to less educated and intelligent.

nxm 1 day ago

The issue is that we weren't/aren't even allowed to question the efficacy or long-term side effects of any vaccine.

stefantalpalaru 2 days ago

> one of the only things that actually worked to stop people dying was the roll out of effective vaccines

"A total of 913 participants were included in the final analysis. The adjusted ORs for COVID-19 infection among vaccinated individuals compared to unvaccinated individuals were 1.85 (95% CI: 1.33-2.57, p < 0.001). The odds of contracting COVID-19 increased with the number of vaccine doses: one to two doses (OR: 1.63, 95% CI: 1.08-2.46, p = 0.020), three to four doses (OR: 2.04, 95% CI: 1.35-3.08, p = 0.001), and five to seven doses (OR: 2.21, 95% CI: 1.07-4.56, p = 0.033)." - ["Behavioral and Health Outcomes of mRNA COVID-19 Vaccination: A Case-Control Study in Japanese Small and Medium-Sized Enterprises" (2024)](https://www.cureus.com/articles/313843-behavioral-and-health...)

"the bivalent-vaccinated group had a slightly but statistically significantly higher infection rate than the unvaccinated group in the statewide category and the age ≥50 years category" - ["COVID-19 Infection Rates in Vaccinated and Unvaccinated Inmates: A Retrospective Cohort Study" (2023)](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10482361/)

"The risk of COVID-19 also varied by the number of COVID-19 vaccine doses previously received. The higher the number of vaccines previously received, the higher the risk of contracting COVID-19" - ["Effectiveness of the Coronavirus Disease 2019 (COVID-19) Bivalent Vaccine" (2022)](https://www.medrxiv.org/content/10.1101/2022.12.17.22283625v...)

"Confirmed infection rates increased according to time elapsed since the last immunity-conferring event in all cohorts. For unvaccinated previously infected individuals they increased from 10.5 per 100,000 risk-days for those previously infected 4-6 months ago to 30.2 for those previously infected over a year ago. For individuals receiving a single dose following prior infection they increased from 3.7 per 100,000 person days among those vaccinated in the past two months to 11.6 for those vaccinated over 6 months ago. For vaccinated previously uninfected individuals the rate per 100,000 person days increased from 21.1 for persons vaccinated within the first two months to 88.9 for those vaccinated more than 6 months ago." - ["Protection and waning of natural and hybrid COVID-19 immunity" (2021)](https://www.medrxiv.org/content/10.1101/2021.12.04.21267114v...)

dotnet00 2 days ago

[flagged]

mrcwinn 2 days ago

If that were the case, wouldn’t we see vaccine skepticism in poorly educated, racist non-Western nations?

dotnet00 2 days ago

As the other reply mentions, that's where the "in your face" part comes in. Many of the diseases that can be prevented by vaccines are in living memory for those countries.

On top of that, 'poorly educated' in those countries often means never having been to a proper school, never having finished basic schooling, being illiterate, or lacking access to information (be it the internet or social programs). That kind of skepticism is easier to help, because it stems from a place of actual ignorance, rather than believing oneself to be smarter than everyone else.

Jensson 2 days ago

You do see a lot of vaccine skepticism in such countries, this study found about half of Africans view vaccines negatively.

https://pmc.ncbi.nlm.nih.gov/articles/PMC9903367/

braiamp 2 days ago

You don't see those, because it's in their faces. Or, more accurately, in our faces. I live in such a country, and we would kill to have our kids vaccinated. We live with these diseases, so we aren't so stupid as to fall for misinformation.

xdennis 2 days ago

> I think the anti-vax thing is mostly because the average Western education level is just abysmal.

What does the West have to do with it? Non-westerners are even more into folk medicine and witch doctors.

dotnet00 2 days ago

They're into folk medicine, but their anti-vax issues generally come from people who don't have any means of knowing better (i.e. never been to school, dropped out at a very early grade, isolated, not even literate). Typically just education and having a doctor or a local elder respectfully explain to them that the Polio shot will help prevent their child from being paralyzed for life is enough to convince them.

Meanwhile the 'educated' Westerner, to whom Polio is a third-world disease, will convince themselves that the doctor is lying for some reason, will choose to take the 75% chance of an asymptomatic infection because they don't truly appreciate how bad it can otherwise be, will use their access to a vast collection of humanity's information to cherry pick data that supports their position (most likely while also claiming to seek debate despite not intending to seriously consider opposing evidence), and if their gamble fails, will probably just blame immigrants, government or 'big pharma' for doing it.

andrewmcwatters 2 days ago

And yet, SEA and others are still better educated than us.

kypro 2 days ago

Anti-vax has never really been a thing though. I don't know what the data is these days, but it used to be like 1% of the population who were anti-vax.

We have the same thing going on with racism in the West where people are convinced racism is a much bigger problem than it actually is.

And whether it's anti-vax or racist beliefs, when you start attacking people for holding these views you always end up inadvertently encouraging people to start asking why that is and they end up down rabbit holes.

No one believes peas cause cancer, for example, but I guarantee one of the best ways to make people start believing peas cause cancer is for the media to start talking about how some people believe that peas do cause cancer, then for sites like YouTube and Facebook to start banning people who talk about it. Because if they allow people to talk about UFOs and flat Earth conspiracies, why are they banning people for suggesting that peas cause cancer? Is there some kind of conspiracy going on funded by big agriculture? You can see how this type of thinking happens.

dotnet00 2 days ago

Anti-vax was enough of an issue that vaccine mandates were necessary for Covid.

It also isn't convincing to be claiming that racism isn't as big in the West given all the discourse around H1Bs, Indians (the Trump base has been pretty open on this one, with comments on JD Vance's wife, the flood of anti-Indian racism on social media, and recently the joy taken in attempting to interfere with Indians forced to fly back to the US in a hurry due to the lack of clarity on the H1B thing), how ICE is identifying illegals, a senator openly questioning the citizenship of a brown mayoral candidate and so on.

I agree that denying something is the easiest way to convince people of the opposite, but it's also understandable when social media companies decide to censor advice from well known individuals that people should do potentially harmful things like consume horse dewormer to deal with Covid. Basically, it's complicated, though I would prefer to lean towards not censoring such opinions.

logicchains 2 days ago

The anti-vax thing is because every single comparative study of vaccinated and unvaccinated children found a greater rate of developmental disorders in vaccinated children. They're also the only products for which you're not allowed to sue the manufacturers for liability, and the justification given by the manufacturers for requesting this liability protection was literally that they'd be sued out of business otherwise. If they were as safe as other treatments they wouldn't need a blanket liability immunity.

Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

Anthony R. Mawson et al., “Preterm Birth, Vaccination and Neurodevelopmental Disorders: A Cross-Sectional Study of 6- to 12-Year-Old Vaccinated and Unvaccinated Children,” Journal of Translational Science 3, no. 3 (2017): 1-8, doi:10.15761/JTS.1000187.

Brian Hooker and Neil Z. Miller, “Analysis of Health Outcomes in Vaccinated and Unvaccinated Children: Developmental Delays, Asthma, Ear Infections and Gastrointestinal Disorders,” SAGE Open Medicine 8, (2020): 2050312120925344, doi:10.1177/2050312120925344.

Brian Hooker and Neil Z. Miller, “Health Effects in Vaccinated versus Unvaccinated Children,” Journal of Translational Science 7, (2021): 1-11, doi:10.15761/JTS.1000459.

James Lyons-Weiler and Paul Thomas, “Relative Incidence of Office Visits and Cumulative Rates of Billed Diagnoses along the Axis of Vaccination,” International Journal of Environmental Research and Public Health 17, no. 22 (2020): 8674, doi:10.3390/ijerph17228674.

James Lyons-Weiler, "Revisiting Excess Diagnoses of Illnesses and Conditions in Children Whose Parents Provided Informed Permission to Vaccinate Them" September 2022 International Journal of Vaccine Theory Practice and Research 2(2):603-618 DOI:10.56098/ijvtpr.v2i2.59

NVKP, “Diseases and Vaccines: NVKP Survey Results,” Nederlandse Vereniging Kritisch Prikken, 2006, accessed July 1, 2022.

Joy Garner, “Statistical Evaluation of Health Outcomes in the Unvaccinated: Full Report,” The Control Group: Pilot Survey of Unvaccinated Americans, November 19, 2020.

Joy Garner, “Health versus Disorder, Disease, and Death: Unvaccinated Persons Are Incommensurably Healthier than Vaccinated,” International Journal of Vaccine Theory, Practice and Research 2, no. 2, (2022): 670-686, doi: 10.56098/ijvtpr.v2i2.40.

Rachel Enriquez et al., “The Relationship Between Vaccine Refusal and Self-Report of Atopic Disease in Children,” The Journal of Allergy and Clinical Immunology 115, no. 4 (2005): 737-744, doi:10.1016/j.jaci.2004.12.1128.

jawarner 2 days ago

Mawson et al. 2017 (two papers) – internet survey of homeschoolers recruited from anti-vaccine groups; non-random, self-reported, unverified health outcomes. Retracted by the publisher after criticism.

Hooker & Miller 2020/2021 – analysis of “control group” data also from self-selected surveys; same methodological problems.

Lyons-Weiler & Thomas 2020, 2022 – data from a single pediatric practice run by one of the authors; serious selection bias.

Joy Garner / NVKP surveys – activist-run online surveys with no verification.

Enriquez et al. 2005 – a small cross-sectional study about allergy self-reports, not about overall neurodevelopment.

Large, well-controlled population studies (Denmark, Finland, the U.S. Vaccine Safety Datalink, etc.) comparing vaccinated vs. unvaccinated children show no increase in autism, neurodevelopmental disorders, or overall morbidity attributable to recommended vaccines.

MSM 2 days ago

I picked one at random (NVKP, "Diseases and Vaccines: NVKP Survey Results") and, while I needed to translate it to read it, it's clear (and loud!) about not actually being a scientific study.

"We fully realize that a survey like this, even on purely scientific grounds, is flawed on all counts. The sample of children studied is far too small and unrepresentative, we didn't use control groups, and so on."

Turns out the NVKP roughly translates to "Dutch Organization for those critical towards vaccines."

I understand being skeptical about vaccines, but the skepticism needs to go both ways.

lkey 2 days ago

"If they were as safe as other treatments they wouldn't need a blanket liability immunity." Citation very much needed for this inference.

Even if I granted every single paper's premise here. I'd still much rather have a living child with a slightly higher chance of allergies or asthma or <insert survivable condition here> than a dead child. How quickly we forget how bad things once were. Do you dispute that vaccines also accounted for 40% of the decline in infant mortality over the last 50 years? And before that, TB, Flu, and Smallpox killed uncountably many people. Vaccines are a public good and one of the best things we've ever created as a species.

Do you also have theories about autism you'd like to share with the class?

TimorousBestie 2 days ago

> Anthony R. Mawson, et al., “Pilot Comparative Study on the Health of Vaccinated and Unvaccinated 6 to 12-year-old U.S. Children,” Journal of Translational Science 3, no. 3 (2017): 1-12, doi: 10.15761/JTS.1000186

Retracted: https://retractionwatch.com/2017/05/08/retracted-vaccine-aut...

If you edit down your list to journal articles that you know to be valid and unretracted, I will reconsider looking through it. However, journal access in general is too expensive for me to bother reading retracted articles.

conception 2 days ago

Here’s where the “bad ideas out in the open get corrected” idea now gets tested. There are 4 really good refutations of your evidence, outside of the unspoken “perhaps vaccines cause some measurable bad outcomes, but compare them to measles; and without herd immunity, vaccinations aren’t nearly as useful” argument.

So the important question is: Are you now going to say “well, I guess i got some bad data and i have to go back and review my beliefs” or dig in?

barbazoo 2 days ago

> If they were as safe as other treatments they wouldn't need a blanket liability immunity.

Other treatments aren’t applied preventatively to the entire population which is why the risk presumably is lower.

tnias23 2 days ago

The studies you cite are the typical ones circulated by antivaxers and are not considered credible by the medical community due to severe methodological flaws, undisclosed biases, retractions, etc.

To the contrary, high quality studies consistently show that vaccines are not linked to developmental disability or worse health outcomes.

boxerab 2 days ago

Yes! This MUST be why the VAERS adverse event tracker went through the roof right after the rollout began, and why excess death remains sky high in many countries to this day - because a product that didn't stop you from catching or spreading the virus was one of the only things preventing deaths. Couldn't have been our, you know, immune system or anything like that, or that the average age at death was 80 along with several co-morbidities.

yongjik 2 days ago

I feel like we're living in different worlds, because from what I've seen, giving people platforms clearly doesn't work either. It just lets the most stupid and incendiary ideas spread unchecked.

If you allow crazy people to "let it ride" then they don't stop until... until... hell we're still in the middle of it and I don't even know when or if they will stop.

n4r9 17 hours ago

Spot on. At least in the UK, anyone who thinks fake news will just "fizzle out" on social media hasn't been paying attention to the increasing frenzy being whipped up by the alt right on Twitter and Telegram, and consequences like the Southport riots.

atmavatar 2 days ago

I wonder how much of that is giving a platform to conspiracy theorists and how much of it is the social media algorithms' manipulation making the conspiracy theories significantly more visible and persuasive.

prawn 2 days ago

Is there any consideration of this with regard to Section 230? e.g., you're a passive conduit if you allow something to go online, but you're an active publisher if you actively employ any form of algorithm to publish and promote?

thrance 1 day ago

Look at 4chan and its derivatives: minimal algorithms and they're the shitholes of ideas on the internet.

mac-attack 2 days ago

It's poorly thought out logic. Everyone sees how messy the process is and how mistakes can be made when attempting to get to a truth backed by data + science, so somehow they conclude that allowing misinformation to flourish will solve the problem instead of leading to a slow decline of morality/civilization.

Very analogous to people who don't like how inefficient governments function and somehow conclude that the solution is to put people in power with zero experience managing government.

mitthrowaway2 2 days ago

There's a journey that every hypothesis makes on the route to becoming "information", and that journey doesn't start at top-down official recognition. Ideas have to circulate, get evaluated and rejected and accepted by different groups, and eventually grasp their way towards consensus.

I don't believe Trump's or Kennedy's ideas about COVID and medicine are the ones that deserve to win out, but I do think that top-down suppression of ideas can be very harmful to truth seeking and was harmful during the pandemic. In North America I believe this led to a delayed (and ultimately minimal) social adoption of masks, a late acceptance of the aerosol-spread vector, an over-emphasis on hand washing, and a far-too-late restriction on international travel and mass public events, well past the point when it could have contributed to containing the disease (vs Taiwan's much more effective management, for example).

Of course there's no guarantee that those ideas would have been accepted in time to matter had there been a freer market for views, and of course it would have opened the door to more incorrect ideas as well, but I'm of the view that it would have helped.

More importantly I think those heavy restrictions on pre-consensus ideas (as many of them would later become consensus) helped lead to a broader undermining of trust in institutions, the fallout of which we are observing today.

thrance 1 day ago

That journey from "fringe hypothesis" to "actual fact" doesn't start out on right-wing Facebook groups, nor has it ever. There are more efficient channels for this. Make a paper, submit it for review, tell the press maybe. But social media can't play a part in establishing the truths we hold for granted, lest we be ruled by absolute buffoons who would make vaccines and paracetamol illegal on pseudo-scientific grounds.

mac-attack 2 days ago

The issues you are bringing up don't highlight that they stuck with the wrong decision, but rather that they didn't pivot to the right decision as fast as you'd like... yet your solution is bottom-up decision-making that will undoubtedly take much much longer to reach a consensus? How do you square that circle?

Experts can study and learn from their prior mistakes. Continually doing bottom-up when we have experts is inefficient and short-sighted, no? Surely you would streamline part of the process and end up in the pre-Trump framework yet again?

Also, I'm curious why you have such a rosy picture of the bottom-up alternatives? Are you forgetting about the ivermectin overdoses? 17,000 deaths related to hydroxychloroquine? The US president suggesting people drink bleach? It is easy to cherry pick the mistakes that science makes while overlooking the noise and misinformation that worms its way into less-informed/less-educated thinkers when non-experts are given the reins.

sazylusan 2 days ago

Perhaps free speech isn't the problem, but free speech x algorithmic feeds is? As we all know the algorithm favors the dramatic, controversial, etc. That creates an uneven marketplace for free speech where the most subversive and contrarian takes essentially have a megaphone over everyone else.

cptnapalm 2 days ago

As I understand it, Twitter has something called Community Notes. So people can write things, but it can potentially have an attached refutation.

prisenco 2 days ago

Community Notes is better than nothing, but a note only relates to a single tweet. So if one tweet with misinformation gets 100k likes, a community note might show up correcting it.

But if 100 tweets each get 1000 likes, they're never singularly important enough to community note.
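
A rough sketch of the gap being described: with a per-post threshold, one viral post gets noted while the same claim scattered across many small posts never does, unless near-duplicates are clustered and their combined reach is counted. Everything below (the threshold, the fingerprinting, the numbers) is invented for illustration and is not how Community Notes actually works.

    from collections import defaultdict
    import re

    # Invented data: one viral post vs. the same claim spread across many small posts.
    viral = [("THEY don't want you to know about the miracle cure!", 100_000)]
    scattered = [(f"they don't want you to know about the miracle cure ({i})", 1_000)
                 for i in range(100)]

    NOTE_THRESHOLD = 50_000  # hypothetical per-post bar for attaching a note

    def fingerprint(text: str) -> str:
        # Crude normalisation so near-duplicate claims land in the same cluster.
        return re.sub(r"[^a-z ]", "", text.lower()).strip()

    def per_post_notes(posts):
        return [text for text, likes in posts if likes >= NOTE_THRESHOLD]

    def per_cluster_notes(posts):
        reach = defaultdict(int)
        for text, likes in posts:
            reach[fingerprint(text)] += likes
        return [claim for claim, likes in reach.items() if likes >= NOTE_THRESHOLD]

    print(len(per_post_notes(viral)), len(per_post_notes(scattered)))        # 1 0
    print(len(per_cluster_notes(viral)), len(per_cluster_notes(scattered)))  # 1 1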

cptnapalm 2 days ago

Fair enough on that. The problem I've seen (and don't have a good idea for how to fix) is on Reddit where the most terminally online are the worst offenders and they simply drown out everything else until non-crazy people just leave. It doesn't help that the subreddit mods are disproportionately also the terminally online.

hn_throwaway_99 2 days ago

Glad to see this, was going to make a similar comment.

People should be free to say what they want online. But going down "YouTube conspiracy theory" rabbit holes is a real thing, and YouTube doesn't need to make that any easier, or recommend extreme (or demonstrably false) content because it leads to more "engagement".

squigz 2 days ago

Online, sure. But online doesn't mean YouTube or Facebook.

AfterHIA 2 days ago

I feel that this is the right approach-- the liability and toxicity of the platforms isn't due to them being communication platforms; it's because in most practical or technical ways they are not: they are deliberate behavior modification schemes wherein companies willfully inflame their customers' political and social sentiments for profit in exchange for access to the addictive platform. It's like free digital weed, but the catch is that it makes you angry and politically divisive.

In this sense platforms like X need to be regulated more like gambling. In some ways X is a big roulette wheel that's being spun which will help stochastically determine where the next major school shooting will take place.

prisenco 2 days ago

Right, engagement algorithms are like giving bad takes a rocket ship.

The words of world-renowned epidemiologists who were, to be frank, boring and unentertaining could never possibly compete with crunchymom44628 yelling about how Chinese food causes covid.

Bad takes have the advantage of the engagement of both the people who vehemently agree and the people who vehemently disagree. Everyone is incentivized to be a shock jock. And the shock jocks are then molded by the algorithm to be ever more shock jockish.

Especially at a time when we were all thrown out of the streets and into our homes and online.

And here I'll end this by suggesting everyone watch Eddington.

AfterHIA 1 day ago

Just wiki'd Eddington and I'm adding it to my watch list. Thanks for the recommend prisenco.

One of the sentiments I've flirted with in posts below/above is the idea that while bad takes and their amplification are indeed a kind of societal evil, in a society which was more effectively mediated bad takes might serve a vital purpose in the discourse. Societies committed to their own felicity might treat disagreements as an opportunity to extend the public discourse. This seems to be the crux of the thing: we can talk all day about checks and balances, but unless a society is truly at some level committed to its own preservation and expansion, those checks and balances will end up becoming tools for domination and exploitation, as we see in the United States.

I don't care how well you can bake, you can't make apple pie with rotten apples. No amount of sugar will correct the rot. The trick is growing healthy apples.

sazylusan 2 days ago

Building on that, the crazy person spouting conspiracy theories in the town square, who would have been largely ignored in the past, suddenly becomes the most visible.

The first amendment was written in the 1700s...

Aloha 2 days ago

I think it made sense as a tactical choice at the moment, just like censorship during wartime - I don't think it should go on forever, because doing so is incompatible with a free society.

llm_nerd 2 days ago

It didn't even make sense at the time. It put everything under a cloud of suspicion that the official, accepted truth needed to suppress alternatives to win the battle of minds. It was disastrous, and it is astonishing seeing people (not you, but in these comments) still trying to paint it as a good choice.

It massively amplified the nuts. It brought it to the mainstream.

I'm a bit amazed seeing people still justifying it after all we've learned.

COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

And to state my position like the root guy, I'm a progressive, pro-vaccine, medical science believer. I listen to my doctor and am skeptical if not dismissive of the YouTube "wellness" grifters selling scam supplements. I believe in science and research. I thought the worm pill people were sad if not pathetic. Anyone who gets triggered by someone wearing a mask needs to reassess their entire life.

But lockdowns went on way too long. Limits on behaviour went on way too long. Vaccine compliance measures were destructive from the moment we knew the vaccine had a negligible effect on spread. When platforms run by "good intentions" people started silencing the imbeciles, it handed them a megaphone and made the problem much worse.

And now we're living in the consequences. Where we have a worm-addled halfwit directing medicine for his child-rapist pal.

LeafItAlone 2 days ago

>It massively amplified the nuts. It brought it to the mainstream.

>COVID was handled terribly after the first month or so, and hopefully we've learned from that. We're going to endure the negative consequences for years.

In theory, I agree, kind of.

But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be. Vaccine production and approval were well under way, brought to fruition in part due to the first Trump administration. The “nuts” had long been mainstream and amplified before this “silencing” began. Misinformation was rampant and people were spreading it at a quick speed. Most people I know who ultimately refused the vaccines made up their minds before Biden took office.

jasonlotito 2 days ago

> But also - we were 10+ months into COVID raging in the US before Biden’s administration, the administration that enacted the policies the article is about, came to be.

Google makes it very clear that these were choices they made, and were independent of whatever the government was asking. Suggesting these policies are anything other than Google's is lying.

llm_nerd 2 days ago

Sure, but I'm not remotely blaming Biden[1]. A lot of tech companies took this on themselves, seeing themselves as arbiters of speech for a better world. Some admin (Trump admin) people might have given them suggestions, but they didn't have to do the strong-arm stuff, and the results weren't remotely helpful.

We already had a pretty strong undercurrent of contrarianism regarding public health -- it's absolutely endemic on here, for instance, and was long before COVID -- but this mainstreamed it. Before COVID I had a neighbour who would always tell me in hushed tones that he knows what's really going on because he's been learning about it on YouTube, etc. It was sad, but he was incredibly rare. Now that's like every other dude.

And over 80% of the US public got the vaccine! If we were to do COVID again, I doubt you'd hit even 40% in the US now. The problem is dramatically worse.

[1] That infamous Zuck interview with Rogan, where Zuck licked Trump's anus to ingratiate himself with the new admin, was amazing in that he kept blaming Biden for things Meta did long before Biden's admin took office or even took shape. Things he did at the urging of the Trump admin pt 1. I still marvel that he could be so astonishingly deceptive and people don't spit in his lying face for it.

ioteg 2 days ago

[dead]

tmaly 5 hours ago

The issue for me is that kids are on YouTube, and I think there should be some degree of moderation.

Zanfa 1 day ago

IMO free speech requires moderation, but the "how" is an unsolved problem. In a completely unmoderated environment, free speech will be drowned out by propaganda from your adversaries. The decades of experience and the industrial scale at which Russian (or similar) troll factories can manufacture grassroots content or fund influencers are not something that can be combated at an individual level.

It would be a mistake to think such operations care too much about specific talking points; the goal is to drown out moderate discussion and replace it with flamewars. It's a numbers game, so they'll push in hundreds of different directions until they find something that sticks, and they'll also push both sides of the same conflict.

tonfreed 2 days ago

The best disinfectant is sunlight. I'm similarly appalled by some of the behaviour after a certain political activist was murdered, but I don't want them to get banned or deplatformed. I'm hoping what we're seeing here is a restoration of the ability to disagree with each other

LeafItAlone 2 days ago

>The best disinfectant is sunlight.

Is it? How does that work at scale?

Speech generally hasn't been restricted broadly. The same concepts and ideas removed from YouTube were still available in many places (including here).

Yet we still have so many people believing falsehoods and outright lies. Even on this very topic of COVID, both sides present their "evidence" and truly believe they are right, no matter what the other person says.

TeeMassive 2 days ago

What's your alternative? The opposite is state dictated censorship and secrecy and those have turned very wrong every single time.

LeafItAlone 2 days ago

I honestly don't know. My libertarian foundation wants me to believe that any and all ideas should be able to be spread. But with the technological and societal changes of the past 10-15 years, we've seen how much of a danger this can be too. A lie or mistrust can be spread faster than ever, to a wider audience than was previously possible. I don't have a solution, but what we have now is clearly not working.

tzs 2 days ago

> The best disinfectant is sunlight

Have you actually tried to shine sunlight on online misinformation? If you do you will quickly find it doesn't really work.

The problem is simple. It is slower to produce factually correct content. A lot slower. And when you do produce something the people producing the misinformation can quickly change their arguments.

Also, by the time you get your argument out many of the people who saw the piece you are refuting and believed it won't even see your argument. They've moved on to other topics and aren't going to revisit that old one unless it is a topic they are particularly interested in. A large number will have noted the original misinformation, such as some totally unsafe quack cure for some illness that they don't currently have, accepted it as true, and then if they ever find themselves with that illness apply the quack cure without any further thought.

The debunkers used to have a chance. The scammers and bullshitters always had the speed advantage when it came to producing content but widespread distribution used to be slow and expensive. If say a quack medical cure was spreading the mainstream press could ask the CDC or FDA about it, talk to researchers, and talk to doctors dealing with people showing up in emergency rooms from trying the quack cure, and they had the distribution networks to spread this information out much faster than the scammers and bullshitters.

Now everyone has fast and cheap distribution through social media, and a large number of people only get their information from social media and so the bullshitters and scammers now have all the advantages.

DangitBobby 2 days ago

And not letting the disease spread to begin with is better than any disinfectant.

slater- 2 days ago

>> The best disinfectant is sunlight.

Trump thought so too.

thrance 2 days ago

How's that working out? The worst ideas of the 20th century are resurfacing in plain sunlight because the Dems couldn't pull their heads out of the sand and actually fight them.

Now vaccines are getting banned and the GOP is gerrymandering the hell out of the country to ensure the end of the democratic process. Sure, let's do nothing and see where that brings us. Maybe people will magically come to their senses.

andrewmcwatters 2 days ago

Well, people literally died. So, I think we all know how it played out.

The same thing since time eternal will continue to occur: the educated and able will physically move themselves from risk and others will suffer either by their own volition, or by association, or by lot.

theshrike79 1 day ago

The problem is the algorithm.

Content that makes people angry (extreme views) brings views.

Algorithms optimise for views -> people get recommended extreme views.

You can test this with a fresh account: it doesn't take many swipes on YouTube Shorts to get some pretty heinous shit if you pretend to be a young male to the algorithm.
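
A minimal sketch of that feedback loop, with every title, watch-time number and penalty weight invented for illustration (this is not YouTube's actual ranking code): optimise purely for predicted engagement and the inflammatory item tops the feed; add even a crude penalty and it drops without being removed.

    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        predicted_watch_minutes: float  # what an engagement model expects a user to watch
        outrage_score: float            # 0..1, how inflammatory the content is

    # Hypothetical catalogue; all numbers are made up.
    catalogue = [
        Video("Calm explainer on how vaccine trials work", 2.1, 0.05),
        Video("THEY are LYING to you about vaccines!!!", 6.4, 0.95),
        Video("Local news: school board meeting recap", 1.3, 0.10),
    ]

    def engagement_only(videos):
        # Pure engagement objective: recommend whatever keeps people watching.
        return sorted(videos, key=lambda v: v.predicted_watch_minutes, reverse=True)

    def penalised(videos, penalty=5.0):
        # Keep the content available, but downweight inflammatory items in recommendations.
        return sorted(videos,
                      key=lambda v: v.predicted_watch_minutes - penalty * v.outrage_score,
                      reverse=True)

    print(engagement_only(catalogue)[0].title)  # the outrage video tops the feed
    print(penalised(catalogue)[0].title)        # the calm explainer tops the feed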

yojo 2 days ago

I think there's a difference between silencing people, and having an algorithm that railroads people down a polarization hole.

My biggest problem with YouTube isn't what it does/doesn't allow on its platform, it's that it will happily feed you a psychotic worldview if it keeps you on the site. I've had several family members go full conspiracy nut-job after engaging heavily with YouTube content.

I don't know what the answer is. I think many people would rightly argue that removing misinformation from the recommendation engine is synonymous with banning it. FWIW I'd be happy if recommendation engines generally were banned for being a societal cancer, but I'm probably in the minority here.

electriclove 2 days ago

I agree and I'm pro vaccines but want the choice on if/when to vaccinate my kids. I believe there were election discrepancies but am not sure if it was stolen. I felt the ZeroHedge article about the lab leak was a reasonable possibility. All these things were shut down by the powers that be (and this was not Trump's fault). The people shutting down discourse are the problem.

amanaplanacanal 2 days ago

You pretty much have the choice about vaccinating your kids. You might not be able to send them to public school without vaccinations though, depending on your local laws.

electriclove 1 day ago

In California, it is required for public schools and many private schools also require it, so effectively it isn't much of a choice.

dakial1 1 day ago

I used to think like you, believing that, on average, society would expunge the craziness, but the last decade and the effect of social media and the echo chambers in groups made me see that I was completely wrong.

trinsic2 2 days ago

They don't silence people to stop narratives. People are silenced to cause divisions and to exert control over the population. When people stop using tech they don't control and stop supporting people or systems that do not have their best interests at heart, only then will we see real change.

brookst 2 days ago

There is no conspiracy. It’s all emergent behavior by large groups of uncoordinated dunces who can’t keep even the most basic of secrets.

trinsic2 1 day ago

It's a known strategy used all the time by corrupt individuals in governments around the world. It's so pervasive, I'm not even going to bother posting links.

hash872 2 days ago

It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

If 'silencing people' doesn't work- so online platforms aren't allowed to remove anything? Is there any limit to this philosophy? So you think platforms can't remove:

Holocaust denial? Clothed underage content? Reddit banned r/jailbait, but you think that's impermissible? How about clothed pictures of toddlers but presented in a sexual context? It would be 'silencing' if a platform wanted to remove that from their private property? Bomb or weapons-making tutorials? Dangerous fads that idiotic kids pass around on TikTok, like the blackout game? You're saying it's not permissible for a platform to remove dangerous instructionals specifically targeted at children? How about spam? Commercial advertising is legally speech in the US. Platforms can't remove the gigantic quantities of spam they suffer from every day?

Where's the limiting principle here? Why don't we just allow companies to set their own rules on their own private property, wouldn't that be a lot simpler?

softwaredoug 2 days ago

I used to believe this. But I feel more and more we need to promote a culture of free speech that goes beyond the literal first amendment. We have to tolerate weird and dangerous ideas.

andrewmcwatters 2 days ago

Better out in the open with refutations or warnings than in the dark where concepts become physical dangers.

benjiro 2 days ago

Refuting does not work... You can throw scientific study upon study, doctor upon doctor, ... negatives run deeper than positives.

In the open, it becomes normalized, it draws in more people. Would you rather have some crazies in the corner, or 50% of a population that believes something false because it became normalized?

The only people benefiting from those dark concepts are those with financial gains. They make money from it, and push the negatives to sell their products and cures. Those that fight against it do not gain from it, and it costs them time/money. That is why it is a losing battle.

TeeMassive 2 days ago

> It's their private property, they can ban or promote any ideas that they want to. You're free to not use their property if you disagree with that.

1) They are public corporations and legal creations of the state, and benefit from certain protections of the state. They also have privileged access to some public infrastructure that other private companies do not have.

2) By acting at the behest of the government, they were agents of the government for free speech and censorship purposes

3) Being monopolies in their respective markets, they must respect certain obligations the same way public utilities do.

hash872 2 days ago

Re: 1- one certain protection of the state that they benefit from is the US Constitution, which as interpreted so far forbids the government from impairing their free speech rights. Making a private actor host content they personally disagree with violates their right of free speech! That's what the 1st Amendment is all about.

2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri

3. What market is Youtube a monopoly in?

themaninthedark 1 day ago

2. This has already been adjudicated and this argument lost https://en.wikipedia.org/wiki/Murthy_v._Missouri

The 6–3 majority determined that neither the states nor other respondents had standing under Article III, reversing the Fifth Circuit decision.

In law, standing or locus standi is a condition that a party seeking a legal remedy must show they have, by demonstrating to the court, sufficient connection to and harm from the law or action challenged to support that party's participation in the case.

Justice Amy Coney Barrett wrote the opinion, stating: "To establish standing, the plaintiffs must demonstrate a substantial risk that, in the near future, they will suffer an injury that is traceable to a government defendant and redressable by the injunction they seek. Because no plaintiff has carried that burden, none has standing to seek a preliminary injunction."

The Supreme Court did not say that what was done was legal, they only said that the people who were asking for the injunction and bringing the lawsuit could not show how they were being or going to be hurt.

TeeMassive 1 day ago

1. By accepting unique protections and benefits of and from the state, they are no longer entirely private. 2. See the comments below; it doesn't say what you think it says. 3. Google has a quasi-monopoly (it doesn't require having full control) and abused it with YouTube via its search results, even if it's not YouTube entirely by itself.

drak0n1c 2 days ago

Read the article, along with this one https://reclaimthenet.org/google-admits-biden-white-house-pr...

In this case it wasn't a purely private decision.

rahidz 2 days ago

"Where's the limiting principle here?"

How about "If the content isn't illegal, then the government shouldn't pressure private companies to censor/filter/ban ideas/speech"?

And yes, this should apply to everything from criticizing vaccines, denying election results, being woke, being not woke, or making fun of the President on a talk show.

Not saying every platform needs to become like 4chan, but if one wants to be, the feds shouldn't interfere.

krapp 2 days ago

>A lot of people will say all kinds of craziness, and you just have to let it ride so most of us can roll our eyes at it.

Except many people don't roll their eyes at it, that's exactly the problem. QAnon went from a meme on 4chan to the dominant political movement across the US and Europe. Anti-vax went from fringe to the official policy position of the American government. Every single conspiracy theory that I'm aware of has only become more mainstream, while trust in any "mainstream" source of truth has gone down. All of this in an environment of aggressive skepticism, arguing, debating and debunking. All of the sunlight is not disinfecting anything.

We're literally seeing the result of the firehose of misinformation and right-wing speech eating people's brains and you're saying we just have to "let it ride?"

Silencing people alone doesn't work, but limiting the damage misinformation and hate speech can do while pushing back against it does work. We absolutely do need to preserve the right of platforms to choose what speech they spread and what they don't.

braiamp 2 days ago

> But I think we have to realize silencing people doesn't work

It actually does work. You need to remove ways for misinformation to spread, and suppressing a couple of big agents works very well.

- https://www.nature.com/articles/s41586-024-07524-8 - https://www.tandfonline.com/doi/full/10.1080/1369118X.2021.1... - https://dl.acm.org/doi/abs/10.1145/3479525 - https://arxiv.org/pdf/2212.11864

Obviously, the best solution would be prevention, by having good education systems and arming common people with the weapons to assess and criticize information, but we are kinda weak on that front.
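
A toy model of the "suppress a couple of big agents" claim in the papers above; the follower counts and share rate are invented, but it illustrates why removing a handful of superspreader accounts can cut a large fraction of total reach.

    import random

    random.seed(0)

    # Invented follower graph: a few superspreaders plus many ordinary accounts.
    superspreaders = {f"big{i}": 500_000 for i in range(5)}
    ordinary = {f"user{i}": random.randint(10, 500) for i in range(10_000)}
    accounts = {**superspreaders, **ordinary}

    def total_reach(accts, share_rate=0.01):
        # Crude model: each account reshares a rumour to share_rate of its followers.
        return sum(int(followers * share_rate) for followers in accts.values())

    full = total_reach(accounts)
    pruned = total_reach({name: f for name, f in accounts.items() if name not in superspreaders})

    print(f"reach with superspreaders:    {full:,}")
    print(f"reach without superspreaders: {pruned:,}")
    print(f"removed by banning 5 of {len(accounts):,} accounts: {1 - pruned / full:.0%}")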

wvenable 2 days ago

> But I think we have to realize silencing people doesn't work.

Doesn't it though? I've seen this repeated like it's fact but I don't think that's true. If you disallowed some randomly chosen conspiracy from YouTube and other mainstream platforms, I think it would stop being part of the larger public consciousness pretty quickly.

Many of these things arrived out of nothing and can disappear just as easily.

It's basic human nature that simply hearing things repeated over and over embeds it into your consciousness. If you're not careful and aware of what you're consuming then that becomes a part of your world view. The most effective way to bring people back from conspiratorial thinking (like QAnon) is to unplug them from that source of information.

ants_everywhere 2 days ago

These policies were put in place because the anti-vax and election skepticism content was being promoted by military intelligence organizations that were trying to undermine democracy and public health in the US.

The US military also promoted anti-vax propaganda in the Philippines [0].

A lot of the comments here raise good points about silencing well meaning people expressing their opinion.

But information warfare is a fundamental part of modern warfare. And it's effective.

An American company or individual committing fraud can be dealt with in the court system. But we don't yet have a good remedy for how to deal with a military power flooding social media with information intentionally designed to mislead and cause harm for people who take it seriously.

So

> I think we have to realize silencing people doesn't work

it seems to have been reasonably effective at combating disinformation networks

> It just causes the ideas to metastasize

I don't think this is generally true. If you look at old disinformation campaigns like the idea that the US faked the moon landings, it's mostly confined to a small group of people who are prone to conspiracy thinking. The idea of a disinformation campaign is you make it appear like a crazy idea has broad support. Making it appear that way requires fake accounts or at least boosters who are in on the scheme. Taking that away means the ideas compete on their own merit, and the ideas are typically real stinkers.

[0] https://www.btimesonline.com/articles/167919/20240727/u-s-ad...

vkou 2 days ago

> But I think we have to realize silencing people doesn't work.

We also tried letting the propaganda machine full-blast those lies on the telly for the past 5 years.

For some reason, that didn't work either.

What is going to work? And what is your plan for getting us to that point?

_spduchamp 2 days ago

Algorithmic Accountability.

People can post all sorts of crazy stuff, but the algorithms do not need to promote it.

Countries can require Algorithmic Impact Assements and set standards of compliance to recommended guidelines.
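
One way an "Algorithmic Impact Assessment" could be made concrete is an amplification audit: compare how much exposure the ranking algorithm gives to independently flagged content versus a plain chronological baseline. The impression counts below are invented; a real assessment would use a platform's logged data.

    # Hypothetical impression logs, by content class, under two feed policies.
    algorithmic   = {"flagged": 4_200_000, "other": 95_800_000}
    chronological = {"flagged": 1_100_000, "other": 98_900_000}

    def flagged_share(impressions):
        return impressions["flagged"] / sum(impressions.values())

    # >1 means the recommender amplifies flagged content relative to a neutral feed.
    amplification = flagged_share(algorithmic) / flagged_share(chronological)
    print(f"amplification factor for flagged content: {amplification:.1f}x")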

amanaplanacanal 2 days ago

This seems unlikely to be constitutional in the US.

lkey 2 days ago

I think you are granting false neutrality to this speech. These misinfo folks are always selling a cure to go with their rejection of medicine. It's a billion dollar industry built off of spreading fear and ignorance, and YouTube doesn't have any obligation to host their content. As an example, for 'curing' autism, the new grift is to reject Tylenol and buy my folic acid supplement to 'fix' your child. Their stores are already open and ready.

lkey 2 days ago

To finish the thought, scientists at the CDC (in the before times) were not making money off of their recommendations, nor were they making youtube videos as a part of their day job. There's a deep asymmetry here that's difficult to balance if you assume the premise that 'youtube must accept every kind of video no matter what, people will sort themselves out'. Reader, they will not.

mvdtnz 2 days ago

And silencing these people only lends credence to their "they don't want you to know this" conspiracy theories. Because at that point it's not a theory, it's a proven fact.

lkey 2 days ago

These people will claim they were 'silenced' regardless. Even as they appear with their published bestseller about being silenced on every podcast and news broadcast under the sun, they will speak of the 'conspiracy' working against them at every step. The actual facts at hand almost never matter. Even at a press conference where the President is speaking on your behalf they'll speak of the 'groups' that are 'against' them, full of nefarious purpose. There is no magical set of actions that changes the incentive they have to lie, or believe lies. (except regulation of snake oil, which is not going to happen any time soon)

mvdtnz 2 days ago

And most people roll their eyes and don't believe it. Which is why it's a good idea not to make it true.

fullshark 2 days ago

It works 99% of the time and you are overindexing on the 1% of the time it doesn’t to draw your conclusion.

deegles 2 days ago

no, letting misinformation persist is counterproductive because of the illusory truth effect. the more people hear it, the more they think (consciously or not) "there must be something to this if it keeps popping up"

NullCascade 2 days ago

Elon Musk's takeover of X is already a good example of what happens with unlimited free speech and unlimited reach.

Neo-nazis and white nationalists went from their 3-4 replies per thread forums, 4chan posts, and Telegram channels, to now regularly reaching millions of people and getting tens of thousands of likes.

As a Danish person I remember how American media in the 2010s and early 2020s used to shame Denmark for being very right-wing on immigration. The average US immigration politics thread on X is worse than anything I have ever seen in Danish political discussions.

dyauspitr 1 day ago

Silencing people is the only thing that works is what I’ve learned on the internet.

aesthethiccs 1 day ago

Yes we should be allowed to bully idiots into the ground.

bencorman 2 days ago

I wish someone could have seen the eye roll I just performed reading this comment.

Silencing absolutely works! How do you think disinformation metastasized!?

benjiro 2 days ago

Funny thing, several people who counter-responded and disagreed got grayed out (aka downvoted into the negative ... as in censored).

Reality is, I have personally seen what this type of uncontrolled anti-vax stuff does. The issue is that it's harder to disprove a negative with a positive than people realize.

The moment you are into the YouTube, TikTok or whatever platform algorithm, you are fed a steady diet of this misinformation. When you then try to argue with actual factual studies, you get the typical response of "they already said that those studies are made up"... How do you fight that? Propaganda works by flooding the news, and over time people believe it.

That is the result of uncensored access, because most people do not have the time to really look up a scientific study. The number of negative channels massively outweighs positive/fact-based channels because the latter are "boring". It's the same reason why your evening news is 80% deaths, corruption, thefts, politicians and taxes or other negative world news: it has been proven that people take in negative news much more. Clickbait titles that are negative draw people in.

There is a reason why holocaust denial is illegal in countries. Because the longer some people can spew that, the more people actually start to believe it.

Yes, I am going to get roasted for this, but people are easily influenced and they are not as smart as they think they are. We have platforms that cater to people's short attention span with barely 1~3 min clips. YouTube videos longer than 30 min are horrible for a YouTuber's income, as people simply do not have the attention span, and the result is lost income.

Why do we have laws like seatbelts, speed limits, and other "control" over people? Because people, left to their own devices, can be extremely uncaring about their own family, others, even themselves.

Do I like the idea of censorship for the greater good? No. But there are so many that spew nonsense just to sell their powders and their homemade vitamin C solutions (made in China)... telling people things that may hurt or kill themselves, their family or others.

Where is the line for that unbridled free speech? Silencing people works in the sense that you're delaying the flow of shit running down a creek. Will it stop completely? No, but the delay helps people downstream. Letting it run uninterrupted, hoping that a few people downstream with a mop will do all the work, yeah ...

We only need to look at platforms like X when "censorship" (moderation) got removed. Full of free speech, no limits, and it turned into a soak pit extremely fast (well, a bigger soak pit).

Not sure why I am writing this, because this is a heated topic, but all I can say is: I have seen the damage that anti-vax did to my family. And even to this day, that damage is still present. How a person who never had an issue with vaccinations, never had a bad reaction beyond a sore arm for a day, turned so skeptical of everything vaccination-related. All because those anti-vax channels got to her.

The anti-vax movement killed people. There is scientific study upon study showing how red states in the US ended up with higher death counts in those time periods. And yet, not a single person was ever charged for this ... everyone simply accepted it and never looked back. Like it was a natural thing that people's grandparents and family members died who did not need to die.

The fact is that people have given up and now accept letting those with (often financial) interests spew nonsense as much as they like. Well, it's "normal".

I weep for the human race because we are not going to make it.

breadwinner 2 days ago

> silencing people doesn't work

I agree, but how do you combat propaganda from Putin? Do you match him dollar for dollar? I am sure YouTube would like that, but who has deep enough pockets to counter the disinformation campaigns?

Similar issue with Covid... when you are in the middle of a pandemic, and dead bodies are piling up, and hospitals are running out of room, how do you handle misinformation spreading on social media?

JumpCrisscross 2 days ago

Slow down our algorithmic hell hole. Particularly around elections.

LeafItAlone 2 days ago

>Slow down our algorithmic hell hole.

What are your suggestions on accomplishing this while also being compatible with the idea that government and big tech should not control ideas and speech?

JumpCrisscross 2 days ago

> What are your suggestions on accomplishing this while also bent compatible with the idea that government and big tech should not control ideas and speech?

Time delay. No content based restrictions. Just, like, a 2- to 24-hour delay between when a post or comment is submitted and when it becomes visible, with the user free to delete or change (in this case, the timer resets) their content.

I’d also argue for demonetising political content, but idk if that would fly.
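
A minimal sketch of that delay mechanism, assuming a 2-hour delay at the low end of the suggested range (the class and names are hypothetical, not any platform's API): posts become visible only after the delay, and editing resets the clock.

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    DELAY = timedelta(hours=2)  # anywhere in the suggested 2- to 24-hour range

    @dataclass
    class DelayedPost:
        author: str
        text: str
        submitted_at: datetime = field(default_factory=datetime.utcnow)

        def edit(self, new_text: str) -> None:
            # Editing resets the timer, as proposed above.
            self.text = new_text
            self.submitted_at = datetime.utcnow()

        def visible(self, now=None) -> bool:
            now = now or datetime.utcnow()
            return now - self.submitted_at >= DELAY

    post = DelayedPost("alice", "hot take about the election")
    print(post.visible())                                        # False right after submission
    print(post.visible(post.submitted_at + timedelta(hours=3)))  # True once the delay passes
    post.edit("slightly cooler take")
    print(post.visible(post.submitted_at + timedelta(hours=1)))  # False again: timer reset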

breadwinner 2 days ago

Easy solution: Repeal Section 230.

Allow citizens to sue social media companies for the harm caused to them by misinformation and disinformation. The government can stay out of this.

breadwinner 2 days ago

If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

This is the conundrum social media has created. In the past only the press, who were at least semi-responsible, had the ability to spread information on a massive scale. Social media changed that. Now anyone can spread information instantly on a massive scale, and often it is the conspiracy theories and incorrect information that people seek out.

"We were a bit naive: we thought the internet, with the availability of information, would make us all a lot more factual. The fact that people would seek out—kind of a niche of misinformation—we were a bit naive." -- Bill Gates to Oprah, on "AI and the Future of us".

JumpCrisscross 2 days ago

> If the government asks private companies to do that, then that's a violation of 1st amendment, isn't it?

Yes. An unfortunate conclusion I’m approaching (but have not reached, and frankly don’t want to reach) is the First Amendment doesn’t work in a country that’s increasingly illiterate and addicted to ad-powered algorithmic social media.

TeeMassive 2 days ago

Have you heard about TikTok? And you think governments' intelligence agencies are not inserting their agents into key positions at big tech companies?

altruios 2 days ago

Censorship is a tool to combat misinformation.

It's taking a sword to the surgery room where no scalpel has been invented yet.

We need better tools to combat dis/mis-information.

I wish I knew what that tool was.

Maybe 'inoculating information' that's specifically stickier than the dis/mis-info?

breadwinner 2 days ago

Easy solution: Repeal Section 230.

Social media platforms in the United States rely heavily on Section 230 of the Communications Decency Act, which provides them immunity from liability for most user-generated content.

DangitBobby 2 days ago

This would cause widespread censorship of anything remotely controversial, including the truth. We'd be in a "censor first, ask questions later" society. Somehow that doesn't seem healthy either.

homeonthemtn 2 days ago

You are on a platform that polices speech. It is evidence that policing speech helps establish civility and culture. There's nothing wrong with policing speech, but it can certainly be abused.

If you were on the early Internet, you were self-policing with the help of admins all the time. The difference was you had niche populations that had a stake in keeping the peace and culture of a given board.

We broke those boundaries down though and now pit strangers versus strangers for clicks and views, resulting in daily stochastic terrorism.

Police the damn speech.

softwaredoug 2 days ago

For inciting violence. Sure. Free speech isn’t absolute.

But along with fringe Covid ideas, we limited actual speech on legitimate areas of public discourse around Covid. Like school reopening or questioning masks and social distancing.

We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.

(And I believe those experts actually did about as best they could given the circumstances)

scuff3d 1 day ago

Try to post a meme here, see how long it stays up.

More seriously, it's just not this simple man. I know people really want it to be, but it's not.

I watched my dad get sucked down a rabbit hole of qanon, Alex Jones, anti-vax nonsense and God knows what other conspiracy theories. I showed him point blank evidence that qanon was bullshit, and he just flat out refuses to believe it. He's representative of a not insignificant part of the population. And you can say it doesn't do any damage, but those people vote, and I think we can see clearly it's done serious damage.

When bonkers ass fringe nonsense with no basis in reality gets platformed, and people end up in that echo chamber, it does significant damage to the public discourse. And a lot of it is geared specifically to funnel people in.

In more mainstream media, climate change is a perfect example. The overwhelming majority in the scientific community has known for a long time it's an issue. There was disagreement over cause or severity, but not over whether it was a problem. The media elevated dissenting opinions and gave the impression that it was somehow an even split. That the people who disagree with climate change were as numerous and as well informed, which they most certainly weren't, not by a long shot. And that's done irreparable damage to society.

Obviously these are very fine lines to be walked, but even throughout US history, a country where free speech is probably more valued than anywhere else on the planet, we have accepted certain limitations for the public good.

homeonthemtn 2 days ago

If I were trying to govern during a generational, world-stopping epochal event, I would also not waste time picking through the trash to hear opinions.

I would put my trust in the people I knew were trained for this and adjust from there.

I suspect many of these opinions are born from hindsight.

xboxnolifes 2 days ago

Letting fringe theories exist on YouTube does not stop you from accessing the WHO or CDC website.

themaninthedark 1 day ago

Luckily, it is possible for you to just listen to those you trust. No need for you to go pick through other people's opinions.

I don't see how that turns into you needing to mandate what I read and whose opinions I hear.

zmgsabst 2 days ago

Really?

Experts have a worse track record than open debate and the COVID censorship was directed at even experts who didn’t adhere to political choices — so to my eyes, you’re saying that you’d give in to authoritarian impulses and do worse.

McGlockenshire 2 days ago

The "debate" ended up doing nothing but spreading misinformation.

Society as a whole has a responsibility to not do that kind of shit. We shouldn't be encouraging the spread of lies.

epistasis 2 days ago

Really, discussion was limited? Or blatant lies were rightly excluded from discourse?

There's a big difference, and in any healthy public discourse there are severe reputational penalties for lies.

If school reopening couldn't be discussed, could you point to that?

It's very odd how as time goes on my recollection differs so much from others, and I'm not sure if it's because of actual different experiences or because of the fog of memory.

mixmastamyk 2 days ago

Blatant truths were excluded as well, and that's the main problem. See replies to: https://news.ycombinator.com/item?id=45353884

fzeroracer 1 day ago

> We needed those debates. Because the unchecked “trust the experts” makes the experts dumber. The experts need to respond to challenges.

We've had these debates for decades. The end result is stuff like Florida removing all vaccine mandates. You can't debate a conspiracy or illogical thinking into going away, you can only debate it into validity.

jader201 2 days ago

> Police the damn speech.

What happens when the “police” disagrees with and silences what you believe is true? Or when they allow the propagation of what you believe to be lies?

Who gets to decide what’s the truth vs. lies? The “police”?

palmfacehn 1 day ago

>Who gets to decide what’s the truth vs. lies? The “police”?

This keeps coming up on this site. It seems like a basic premise for a nuanced and compassionate worldview. Humility is required. Even if we assume the best intentions, the fallible nature of man places limits on what we can do.

Yet we keep seeing posters appealing to Scientism and "objective truth". I'm not sure it is possible to have a reasonable discussion where basic premises diverge. It is clear how these themes have been used in history to support some of the worst atrocities.

StanislavPetrov 2 days ago

Policing speech for civility or spam is very different than policing speech for content that you disagree with. I was on the early internet, and on the vast majority of forums policing someone's speech for content rather than vulgarity or spam was almost universally opposed and frowned upon.

frollogaston 2 days ago

Depends who is doing the policing. In this case, the White House was telling Google who to ban.

aeternum 1 day ago

I think it was even slightly worse. The White House was effectively delegating the decision of who to ban/police to the NIH/NIAID, an organization that was funding novel coronavirus research in Wuhan.

It's easy to see how at minimum there could be a conflict of interest.

frollogaston 1 day ago

Did I miss somewhere in the article or Google's statement that the NIH was involved?

nostromo 2 days ago

You've missed the point entirely.

It's not about whether Google can decide what content they want on YouTube.

The issue here is that the Biden White House was pressuring private companies to remove speech that they otherwise would host.

That's a clear violation of the First Amendment. And we now know that the previous White House got people banned from all the major platforms: Twitter, YouTube, Facebook, etc.

dotnet00 1 day ago

They claim that the Biden admin pressured them to do it, except that they had been voluntarily doing it even during Trump's initial presidency.

The current administration has been openly threatening companies over anything and everything they don't like, it isn't surprising all of the tech companies are claiming they actually support the first amendment and were forced by one of the current administration's favorite scapegoats to censor things.

homeonthemtn 2 days ago

[flagged]

nostromo 2 days ago

Thankfully the constitution explicitly forbids that in the US.

zmgsabst 2 days ago

[flagged]

z0r 2 days ago

There is no mass Marxist movement in the USA. There is a left wing crippled by worse than useless identity politics.

homeonthemtn 2 days ago

[flagged]

TacticalCoder 2 days ago

[dead]

heavyset_go 2 days ago

[flagged]

thrance 2 days ago

[flagged]

felixgallo 2 days ago

[flagged]

putzdown 2 days ago

No. This perspective is wrong in both directions: (1) it is bad medicine, and (2) the medicine doesn't treat the disease. If we could successfully ban bad ideas (assuming that "we" could agree on what they are) then perhaps we should. If the damage incurred by the banning of ideas were sufficiently small, perhaps we should. But both of these are false. Banning does not work. And it brings harm. Note that the keepers of "correct speech" doing the banning today (eg in Biden's day) can quickly become the ones being banned another day (eg Trump's).

It's true that drowning the truth through volume is a severe problem, especially in a populace that doesn't care to seek out truth, to find needles in haystacks. But again, banning doesn't resolve this problem. The real solution is to develop a populace that cares about, seeks out, and with some skill identifies the truth. That may not be an achievable solution, and in the best case it's not going to happen quickly. But it is the only solution. All of the supply-based solutions (controlling speech itself, rather than training good listeners) run afoul of this same problem: that you cannot really limit the supply, and to the extent you can, so can your opponents.

paulryanrogers 2 days ago

What do you think about measures that stop short of banning? Like down ranking, demonetizing, or even hell 'banning' that just isolates cohorts that consistently violate rules?
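
A toy version of the cohort-isolation ("hellban") idea mentioned here, purely as illustration: flagged accounts keep posting and still see each other's posts, but everyone else is quietly spared.

    # Hypothetical cohort of accounts that consistently violate the rules.
    flagged_cohort = {"spammer42", "ragebait_bob"}

    def can_see(viewer: str, author: str) -> bool:
        if author not in flagged_cohort:
            return True                                 # normal posts are visible to all
        return viewer == author or viewer in flagged_cohort

    print(can_see("alice", "ragebait_bob"))         # False: hidden from outsiders
    print(can_see("spammer42", "ragebait_bob"))     # True: the cohort still sees itself
    print(can_see("ragebait_bob", "ragebait_bob"))  # True: the author never notices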

rahidz 2 days ago

Not OP, but my opinion is that if a platform wants to do so, then I have zero issues with that, unless they hold a vast majority of market share for a certain medium and have no major competition.

But the government should stay out of it.

felixgallo 2 days ago

No. You are objectively wrong. It's great medicine that works -- for example, in Germany, in the US, and elsewhere, it has historically stemmed the flow of violent extremism and helped stop the KKK and the Nazis. You can't even become a citizen if you have been a Nazi. Even on the small scale, like Reddit: banning /r/fatpeoplehate was originally subject to much handwringing and weeping by the so-called free speech absolutists, but guess what -- it all went away, and the edgelords and bullies went back to 4chan to sulk, resulting in the bullshit not being normalized and made part of polite society.

If you want to live in a society where absolutely anything goes at all times, then could I recommend Somalia?

unclad5968 2 days ago

Can we stop with the Nazi stuff? I don't know if they stopped teaching history, but there is nothing happening in the US that is within an order of magnitude of the evil the Nazis perpetrated. Being anti-vax is not comparable to genocide.

jjk166 2 days ago

The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.

Many evil people weren't Nazis; some Nazis weren't necessarily evil. Evil is not part of the definition of Nazism. Promoting authoritarianism, exclusionary nationalism, institutional racism, autarky, anti-liberalism and anti-socialism are the hallmarks of Nazism. Anyone who holds the beliefs of the Nazis is a Nazi, regardless of what level of success they have to date achieved in carrying out their aims.

unclad5968 2 days ago

> The Nazis in 1933 hadn't done anything within an order of magnitude of the evil they would perpetrate in 1943. They nevertheless were still Nazis, and everyone who did not actively condemn them then was in part responsible for what they did later.

Only because what they did in 1943 surpassed anything imaginable. In 1933 the Nazi party immediately banned all other political parties, arrested thousands of political opponents, and started forced sterilization and forced abortions for anyone with hereditary illnesses. Evil is absolutely an identifying part of Nazis. The idea that Nazis are just anti-liberals is exactly why we cannot go around calling everyone we don't like Nazis. The Nazis were not some niche alt-right organization.

If you genuinely think there are Nazis controlling youtube or the government, and all you're doing is complaining about it on hackernews, you're just as complicit as you're claiming those people were.

epakai 2 days ago

We read the history, a lot of it rhymes. Conservatives failed, and exchanged their values for a populist outsider to maintain power (see Franz von Papen). The outsider demeans immigrants and 'sexual deviants'. The outsider champions nationalism. He pardons the people who broke the law to support him. Condemns violence against the party while ignoring the more common violence coming out of those aligned with the party. Encourages the language of enemies when discussing political opponents and protestors.

Nazi has a lot more connotations than genocide. I'm not sure it is worth nitpicking over. Even if you tone it down to Fascist or Authoritarian there will be push back.

tehjoker 2 days ago

[flagged]

unclad5968 2 days ago

[flagged]

dotnet00 2 days ago

How can you say that banning Nazis has worked well considering everything so far this year?

felixgallo 2 days ago

Europe is sliding, but has done ok so far. Crossing fingers.

miltonlost 2 days ago

Well it would if we would actually ban Nazis instead of platform them. They haven't been banned. That's the problem.

dotnet00 2 days ago

You'd have to ban them from society outright without somehow devolving into an authoritarian hellhole in the process (impossible). Trump still primarily posts on a platform specifically created to be a right wing extremist echo chamber.

cpursley 2 days ago

What is a Nazi?

knifemaster 2 days ago

[flagged]

tehjoker 2 days ago

[flagged]

indy 2 days ago

Perhaps not the wisest comment to make in light of recent events

tehjoker 2 days ago

I didn't say violence. Whatever you read into that comment is a projection. I'm not even sure violence is effective, but something more muscular than op-eds is called for. For example, labor organizing and various forms of self-defense organizations, of which there are many kinds, not only militias. For example, anti-ICE organizing which protects vulnerable people from the gestapo.

breadwinner 2 days ago

The government created this problem when they enacted Section 230. This is at the root of the misinformation and disinformation... social media companies are not responsible for the harm.

The simple solution is to repeal Section 230. When information can be transmitted instantly on a massive scale, somebody needs to be responsible for the information. The government should not police information, but citizens should be allowed to sue social media companies for the harm caused to them.

int_19h 2 days ago

The practical end result of repealing Section 230 is that companies will crack down on any even remotely controversial speech because that's the only way to avoid lawsuits.

breadwinner 2 days ago

The New York Times has published plenty of stories you could call controversial. Just this morning the top headline was that Trump lied at the UN. Trump has sued the Times for defamation, yet the paper stands by its reporting. That’s how publishing works: if you can’t defend what you publish, don’t publish it. The Section 230 debate is about whether large online platforms such as Facebook should bear similar accountability for the content they distribute. I think they should. That's the only way we can control misinformation and disinformation.

dawnerd 2 days ago

It also turns into a talking point for them. A lot of these weird conspiracies would have naturally died out if people hadn't tried to shut them down so much.

stinkbeetle 2 days ago

For that matter why is it even such a crazy wild idea for anybody to dare to question medicines and motives from pharmaceutical companies? Or question elections?

Both have always been massively shady. I'm old enough to remember the big stink around the Al Gore election loss, or the robust questioning of the 2016 election for that matter. So ridiculous for self-proclaimed defenders of democracy to want to ban the discussion and disagreement about the facts around elections. Democratic processes and institutions should be open to doubt, questioning, and discussion.

The response to covid vaccines was actually extremely rational. They were highly taken up by the elderly who were shown to have the greatest risk, despite that demographic skewing more conservative (and arguably could be most at risk of "misinformation" from social media). And they were not able to stop transmission or provide much benefit to children and younger people, so they didn't get taken up so much among those groups. So there was really no need for this massive sustained psychological campaign of fearmongering, divisiveness, censorship, and mandates. They could have just presented the data and the facts as they came to hand, and be done with it.

dotnet00 1 day ago

With medicine there's pushback because the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage done to all sorts of towers), Ivermectin, Hydroxychloroquine, and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.

With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate? This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result without any substantiated evidence.

stinkbeetle 1 day ago

> With medicine there's pushback because the vast majority of the time, someone's scamming you and you likely don't actually know what you're talking about. We had a ton of this during covid: radioactive jewelry that was supposed to protect you, cow piss (I personally know people who tried this...), 5G towers (actual damage done to all sorts of towers), Ivermectin, Hydroxychloroquine, and more. People who are sick or have a sick loved one are especially vulnerable to these sorts of things (there's an example of such a victim in the comments), and often end up making things worse by waiting too long or causing further damage.

Pushback on what? There's always been new age hippy garbage, Chinese medicine, curing cancer with berries, and that kind of thing around. I don't see that causing much damage and certainly not enough to warrant censorship. People can easily see through it and in the end they believe what they want to believe.

Far, far more dangerous, and the cause of real damage that I have seen, is what comes from the pharmaceutical industry and its captured regulators. Bribing medical professionals, unconscionable public advertising practices, conspiring to push opioids on the population, lying about the cost to produce medications, and on and on. There's a massive list of the disasters these greedy corporations and their spineless co-conspirators among government regulators have caused.

Good thing we can question them, their motives, their products.

> With questioning elections, I think Jan 6 would be a pretty good indication of why it wasn't appropriate?

I don't understand your question. Can you explain why you think Jan 6 would be a pretty good indication that discussion and disagreement about elections should be censored?

> This wasn't how questioning the results of elections goes in democracies. Instead, even after courts had investigated, the outgoing president refused to accept the result without any substantiated evidence.

I never quite followed exactly what the legal issues were around that election. Trump was alleged to have tried to illegally influence some election process and/or obstructed the legal transfer of power. Additionally, there was a riot of people who thought Trump won, and some broke into Congress and tried to intimidate lawmakers.

I mean taking the worst possible scenario, Trump knew he lost and was scheming a plan to seize power and was secretly transmitting instructions to this mob to enter the building and take lawmakers hostage or something like that. Or any other scenario you like, let your imagination go wild.

I still fail to see how that could possibly justify censorship of the people and prohibiting them from questioning the government or its democratic processes. In fact the opposite, a government official went rogue and committed a bunch of crimes so therefore... the people should not be permitted to question or discuss the government and its actions?

There are presumably laws against those actions of rioting, insurrection, etc. Why, if the guilty could be prosecuted for those crimes, should the innocent pay with the destruction of their human rights, in a way that wouldn't even solve the problem and could easily enable worse atrocities to be committed by the government in future?

Should people who question the 2024 election be censored? Should people who have concerns with the messages from the government's foremost immigration and deportation "experts" be prohibited from discussing their views or protesting the government's actions?

dotnet00 1 day ago

Robbery is a crime, so why should people take any measures to protect their things from being stolen? Murder is a crime, so why care about death threats?

New age medicine has been around forever, yes. But the effects are only known to be negligible outside of pandemics. We know from history that people did many irrational things during past pandemics due to fear and social contagion.

It's a tough problem, everyone believes themselves an expert on everything, plus trolls and disinformation campaigns. There's also a significant information asymmetry.

It's funny you mention opioids, as I just recently came across a tweet claiming that Indians were responsible for getting Americans addicted to them via prescription. In one of the buried reply chains, the poster admits they have no evidence and are just repeating a claim someone made to them sometime. But how many people will read that initial post and reinforce their racist beliefs vs see that the claim was unsubstantiated? And when that leads to drastic action by a madman, who's going to be the target of the blame? The responsibility is too diffused to pin on any specific person, the government obviously won't take it, and madmen don't act in a vacuum, so the blame falls on the platform.

Yes, no one should have the power to determine what ideas are and are not allowed to propagate. On the other hand, you can still go to other platforms and are not entitled to the reach of the major ones; then again, these platforms are extremely influential. At the same time, people partly view the platforms as responsible when they spread bad ideas, the platform operators feel some level of social responsibility, and the platform owners don't want legal responsibility.

NaN years ago

undefined

cactusplant7374 2 days ago

> From President Biden on down, administration officials “created a political atmosphere that sought to influence the actions of platforms based on their concerns regarding misinformation,” Alphabet said, claiming it “has consistently fought against those efforts on First Amendment grounds.”

This actually surprised me because I thought (and maybe still think) that it was Google employees that led the charge on this one.

softwaredoug 2 days ago

It's in their interest now to throw Biden under the bus. There may be truth to this, but I'm sure it's exaggerated for effect.

nitwit005 2 days ago

Worth noting that Trump directly threatened to put Zuckerberg in prison for life in relation to this: https://www.cnn.com/2024/08/31/politics/video/smr-trump-zuck...

I wouldn't trust any public statement from these companies once that kind of threat has been thrown around. People don't exactly want to go to prison forever.

HankStallone 2 days ago

It was. At the time, they felt like they were doing the right thing -- the heroic thing, even -- in keeping dangerous disinformation away from the public view. They weren't shy about their position that censorship in that case was good and necessary. Not the ones who said it on TV, and not the ones who said it to me across the dinner table.

For Google now to pretend Biden twisted their arm is pretty rich. They'd better have a verifiable paper trail to prove that, if they expect anyone with a memory five years long to believe it.

dotnet00 2 days ago

To be fair, even if they were being honest about Biden twisting their arm (I don't buy it), the timing makes it impossible to believe their claim.

CSMastermind 2 days ago

Why wouldn't you buy it?

The Twitter files showed direct communications from the administration asking them ban specific users like Alex Berenson, Dr. Martin Kulldorff, and Dr. Andrew Bostom: https://cbsaustin.com/news/nation-world/twitter-files-10th-i...

Meta submitted direct communications from the administration pressuring them to ban people as part of a congressional investigation: https://www.aljazeera.com/news/2024/8/27/did-bidens-white-ho...

It would be more surprising if they left Google alone.

dotnet00 2 days ago

The implication of saying they were "pressed" by the Biden admin (as they claim in the letter) is that Google was unwilling. I don't buy that. They were complicit and are now throwing the Biden admin under the bus because it is politically convenient. Just like how the Twitter files showed that Twitter was complicit in it.

NaN years ago

undefined

braiamp 2 days ago

If you read those documents, you will see that the administration was telling them that those accounts were in violation of Twitter TOS. They simply said "hey, this user is violating your TOS, what are you gonna do about it?", and Twitter simply applied their rules.

NaN years ago

undefined

tstrimple 1 day ago

They don't need a paper trail. Conservatives will believe anything damning they see about liberals. Just vague accusations or outright lies work plenty well to keep conservatives foaming at the mouth over imagined issues.

frollogaston 2 days ago

It's been known for years that the White House was pressuring Google on this. One court ordered them to cease temporarily. I wanted to link the article, but it's hard to find because of the breaking news.

diego_sandoval 2 days ago

At the time, YouTube said: “Anything that would go against World Health Organization recommendations would be a violation of our policy.” [1] which, in my opinion, is a pretty extreme stance to take, especially considering that the WHO contradicted itself many times during the pandemic.

[1] https://www.bbc.com/news/technology-52388586

danparsonson 2 days ago

> the WHO contradicted itself many times during the pandemic

Did they? I remember them revising their guidance, which seems like something one would expect during an emerging crisis, but I don't remember them directly contradicting themselves.

rogerrogerr 2 days ago

As super low hanging fruit:

June 8, 2020: WHO: Data suggests it's "very rare" for coronavirus to spread through asymptomatics [0]

June 9, 2020: WHO expert backtracks after saying asymptomatic transmission 'very rare' [1]

0: https://www.axios.com/2020/06/08/who-coronavirus-asymptomati... 1: https://www.theguardian.com/world/2020/jun/09/who-expert-bac...

Of course, if we just take the most recent thing they said as "revised guidance", I guess it's impossible for them to contradict themselves. Just rapidly re-re-re-revised guidance.

margalabargala 2 days ago

The difference between a contradiction and a revision is the difference between parallel and serial.

I'm not aware that the WHO ever claimed simultaneously contradictory things.

Obviously, rapid revisions during a period of emerging data makes YouTube's policy hard to enforce fairly. Do you remove things that were in line with the WHO when they were published? When they were made? Etc

zmgsabst 2 days ago

You’re removing people who were correct before the WHO revised their position.

NaN years ago

undefined

NaN years ago

undefined

naasking 2 days ago

A censorship policy that changes daily is a shitty policy. If people on June 8th criticized that official position before they reversed the next day, do you think it was right or a good idea for them to be censored?

NaN years ago

undefined

NaN years ago

undefined

NaN years ago

undefined

brailsafe 2 days ago

> The difference between a contradiction and a revision is the difference between parallel and serial.

Eh, kind of, but it seems more like the distinction between parallel and concurrent in this case. She doesn't appear to have been wrong in that instance, while at the same time the models might have indicated otherwise: an apparent contradiction, with both statements apparently true within the real scope of what could be said at the time.

natch 2 days ago

They would not utter the word Taiwan. That's a huge red flag that they are captured and corrupt. Are you claiming this has changed?

NaN years ago

undefined

dazilcher 1 day ago

> I'm not aware that the WHO ever claimed simultaneously contradictory things.

Whether they did or not is almost irrelevant: information doesn't reach humans instantaneously, it takes time to propagate through channels with varying latency, it gets amplified or muted depending on media bias, people generally have things going on in life other than staying glued to news sources, etc.

If you take a cross sample you're guaranteed to observe contradictory "parallel" information even if the source is serially consistent.

danparsonson 2 days ago

OK and if you said something that you later realised to be wrong, would you be contradicting yourself by correcting it? What should they have done in this situation? People do make mistakes, speak out of turn, say the wrong thing sometimes; I don't think we should criticise someone in that position who subsequently fixes their error. And within a couple of days in this case! That's a good thing. They screwed up and then fixed it. What am I missing here?

stinkbeetle 2 days ago

When you're a global organization who is pushing for the censorship of any dissent or questioning of your proclamations, it's really on you not to say one thing one day then the opposite the next day, isn't it? They could have taken some care to make sure their data and analysis was sound before making these kinds of statements.

If you posted to YouTube that it is very rare for asymptomatics to spread the disease, would you be banned? What if you posted it on the 9th in the hours between checking their latest guidance and their guidance changing? What if you posted it on the 8th but failed to remove it by the 10th?

What if you disagreed with their guidance they gave on the 8th and posted something explaining your stance? Would you still get banned if your heresy went unnoticed by YouTube's censors until the 10th at which time it now aligns with WHO's new position? Banned not for spreading misinformation, but for daring to question the secular high priests?

NaN years ago

undefined

NaN years ago

undefined

f33d5173 2 days ago

Them correcting themselves isn't a bad thing. The point is that it would be absolutely retarded to require that people never disagree with the WHO. Please try and follow the thread of the conversation and not take it down these pointless tangents.

NaN years ago

undefined

brookst 2 days ago

Is there a difference between an expert opinion in the midst of a pandemic and an organizational recommendation?

rogerrogerr 2 days ago

Sure seemed like you'd get kicked off YouTube equally fast for questioning either one.

NaN years ago

undefined

1oooqooq 2 days ago

they also changed the symptom definitions, so ...

danparsonson 2 days ago

So as researchers learned more about COVID the WHO should've just ignored any new findings and stuck to their initial guidance? This is absurd.

wdr1 1 day ago

> Did they?

They said it was a fact that COVID is NOT airborne. (It is.)

Not that they believed it wasn't airborne.

Not that the data was early but indicated it wasn't airborne.

That it was fact.

In fact, they published fact checks on social media asserting that position. Here is one example on the official WHO Facebook page:

https://www.facebook.com/WHO/posts/3019704278074935/?locale=...

danparsonson 1 day ago

None of that argues that they contradicted themselves. You and several others have just hijacked this thread to pile on the WHO.

Argue that they were incompetent in their handling of it, sure, whatever. That's not the comment you're replying to.

Manuel_D 1 day ago

Some WHO reports were suggesting that lockdowns do more harm than good as early as late 2020.

kevin_thibedeau 2 days ago

Don't forget that they ban-hammered anyone who advanced the lab leak theory because a global entity was pulling the strings at the WHO. I first heard about Wuhan in January of 2020 from multiple Chinese nationals who were talking about the leak story they were seeing in uncensored Chinese media and adamant that the state media story was BS. As soon as it blew up by March, Western media was manipulated into playing the bigotry angle to suppress any discussion of what may have happened.

zeven7 1 day ago

I believe having Trump as president exacerbated many, many things during that time, and this is one example. He was quick to start blaming the "Chinese", he tried to turn it into a reason to dislike China and Chinese people, because he doesn't like China, and he's always thinking in terms of who he likes and dislikes. This made it hard to talk about the lab leak hypothesis without sounding like you were following Trump in that. If we had had a more normal president, I don't think this and other issues would have been as polarized, and taking nuanced stances would have been more affordable.

amanaplanacanal 2 days ago

My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.

Eventually I started seeing some serious discussion about how it might have been accidentally created through gain of function research.

api 2 days ago

I’m undecided on the issue, but… if I were trying to cover up an accidental lab leak I’d spread a story that it was a giant conspiracy to create a bio weapon. For extra eye rolls I’d throw in some classic foil hat tropes like the New World Order or the International Bankers.

If it was a lab leak, by far the most likely explanation is that someone pricked themselves or caught a whiff of something.

A friend of mine who lived in China for a while and is familiar with the hustle culture there had his own hypothesis. Some low level techs who were being given these bats and other lab animals to euthanize and incinerate were like “wait… we could get some money for these over at the wet market!”

naasking 2 days ago

> My memory is that the "lab leak" stuff I saw back then was all conspiracy theories about how it was a Chinese bioweapon.

No, that was just the straw man circulated in your echo chamber to dismiss discussion. To be clear, there were absolutely people who believed that, but the decision to elevate the nonsense over the serious discussion is how partisan echo chambers work.

gusgus01 1 day ago

That was one of the main arguments by some of my coworkers and friends when COVID came up socially. I specifically remember a coworker at a FAANG saying something along the lines of "It's a bioweapon, so it's basically an act of war".

potsandpans 2 days ago

I called this out in this thread and was immediately downvoted

McGlockenshire 2 days ago

> because a global entity was pulling the strings at the WHO

excuse me I'm sorry what?

cmilton 2 days ago

Because that is a bold claim to make. There is no proof of a lab leak, and the evidence leads to the wet market as the source. There is a standing $100k debate challenge out there to prove this. Check it out.

pton_xd 2 days ago

> Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source.

A novel coronavirus outbreak happens at the exact location as a lab performing gain of function research on coronaviruses... but yeah, suggesting a lab leak is outlandish, offensive even, and you should be censored for even mentioning that as a possibility. Got it.

This line of thinking didn't make sense then and still doesn't make sense now.

aeternum 1 day ago

Yes, Jon Stewart really nailed this point, it's a great clip and worth the re-watch.

j_w 1 day ago

But the first cases were all linked to a wet market far enough from the lab that it would be highly improbable for the cases to come from the lab itself.

NaN years ago

undefined

cmilton 1 day ago

Correlation does not equal causation my friend.

Plenty of people were able to talk about lab leak conspiracies. That is why we are still debating it today.

NaN years ago

undefined

mayama 2 days ago

> There is no proof of a lab leak and evidence leads to the wet market as the source

Because the WHO worked with the CPC to bury evidence and give a clean chit to the Wuhan lab. There was some pressure building up then for international teams to visit the Wuhan lab and examine data transparently. But with the thorough ban of the lab leak theory, the WHO visited China and gave a clean chit without even visiting the Wuhan lab or having access to lab records. The only place that could prove this definitively buried all records.

potsandpans 2 days ago

The topic at hand is not whether it's a bold claim to make. The question is: should organizations that control a large portion of the world's communication channels have the ability to unilaterally define the tone and timbre of the dialog surrounding current events?

To the people zealously downvoting all of these replies: defend yourselves. What about this is not worthy of conversation?

I'm not saying that I support lab leak. The observation is that anyone that discussed the lab leak hypothesis on social media had content removed and potentially were banned. I am fundamentally against that.

If the observation more generally is that sentiments should be censored that can risk peoples lives by influencing the decisions they make, then let me ask you this:

Should Charlie Kirk have been censored? If he were, he wouldn't have been assassinated.

blooalien 1 day ago

> "Should Charlie Kirk have been censored? If he were, he wouldn't have been assassinated."

On the other hand, if he were, then whoever censored him might have just as easily become the target of some other crazy, because that appears to be the world we live in now. Something's gotta change. This whole "us vs them" situation is just agitating the most extreme folks right over the edge of sanity into "Crazy Town". Wish we could get back to bein' that whole "One Nation Under God" "Great Melting Pot" "United States" they used to blather on about in grade-school back in the day, but that ship appears to have done sailed and then promptly sunk to the bottom... :(

mensetmanusman 2 days ago

It’s not a bold claim. The Fauci emails showed he and others were discussing this as a reasonable possibility.

naasking 2 days ago

> Because that is a bold claim to make. There is no proof of a lab leak and evidence leads to the wet market as the source.

It was not a bold claim at the time. Not only was there no evidence that it was the wet market at the time, the joint probability of a bat coronavirus outbreak where there were few bat caves but where they were doing research on bat coronaviruses is pretty damning. Suppressing discussion of this very reasonable observation was beyond dumb.

tbrownaw 2 days ago

> Suppressing discussion of this very reasonable observation was beyond dumb.

I thought it wasn't so much an error as a conflict of interest.

themaninthedark 1 day ago

It is not as cut and dried as you think. Also, it is rather hard to get any evidence when you aren't allowed to visit the "scene of the crime," so to speak, and all data is being withheld.

https://www.nytimes.com/interactive/2024/06/03/opinion/covid...

Even Dr Fauci said in 2021 he was "not convinced" the virus originated naturally. That was a shift from a year earlier, when he thought it most likely Covid had spread from animals to humans.

https://www.deseret.com/coronavirus/2021/5/24/22451233/coron...

(..February 2023..) The Department of Energy, which oversees a network of 17 U.S. laboratories, concluded with “low confidence” that SARS-CoV-2 most likely arose from a laboratory incident. The Federal Bureau of Investigation said it favored the laboratory theory with “moderate” confidence. Four other agencies, along with a national intelligence panel, still judge that SARS-CoV-2 emerged from natural zoonotic spillover, while two remain undecided.

https://www.nejm.org/doi/full/10.1056/NEJMp2305081

WHO says that "While most available and accessible published scientific evidence supports hypothesis #1, zoonotic transmission from animals, possibly from bats or an intermediate host to humans, SAGO is not currently able to conclude exactly when, where and how SARS-CoV-2 first entered the human population."

However "Without information to fully assess the nature of the work on coronaviruses in Wuhan laboratories, nor information about the conditions under which this work was done, it is not possible for SAGO to assess whether the first human infection(s) may have resulted due to a research related event or breach in laboratory biosafety."

https://www.who.int/news/item/27-06-2025-who-scientific-advi...

WHO paraphrased: We have no data at all about the Wuhan laboratory, so we cannot make a conclusion on that hypothesis. Since we have data relating to natural transmission from animals, we can say that scenario was possible.

natch 2 days ago

But there is no proof of any real wet market connection, and the evidence points to the lab as a source.

hyperhopper 2 days ago

The united states also said not to buy masks and that they were ineffective during the pandemic.

Placing absolute trust in these organizations and restricting freedom of speech based on that is a very bootlicking, anti-freedom stance

anonymousiam 2 days ago

Fauci was trying to prevent a run on masks, which he believed were needed by health care workers. So he probably justified his lie to the US to himself because it was for the "greater good" ("the ends justify the means" is not my view, BTW).

It turns out that masks ARE largely ineffective at preventing CoViD infection. It's amazing how many studies have come up with vastly different results.

https://egc.yale.edu/research/largest-study-masks-and-covid-...

(Before you tell me that the story I cited above says the opposite, look at the effectiveness percentages they claim for each case.)

There's also this: https://x.com/RandPaul/status/1970565993169588579

iyn 1 day ago

Actual (N95/FFP2/FFP3) masks DO work; your comment is misleading. The study you've linked says:

> Colored masks of various construction were handed out free of charge, accompanied by a range of mask-wearing promotional activities inspired by marketing research

"of various construction" is... not very specific.

If you just try to cover your face with a piece of cloth it won't work well. But if you use a good mask (N95/FFP2/FFP3) with a proper fit [0], then you can decrease the chance of being infected (see e.g. [1]).

[0] https://www.mpg.de/17916867/coronavirus-masks-risk-protectio...

[1] https://www.cam.ac.uk/research/news/upgrading-ppe-for-staff-...

dotnet00 1 day ago

They claim a 5% reduction in spread with cloth masks and a 12% reduction with surgical masks. I think 1 less case out of every 10 or 20 is pretty acceptable?
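
(A quick sanity check of that arithmetic, assuming the study's figures are relative reductions in spread: 1/0.05 = 20, so a 5% reduction averts roughly one case in twenty, and 1/0.12 ≈ 8, so a 12% reduction averts roughly one in eight - about the "10 or 20" range above.)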

Especially at the time when many countries were having their healthcare systems overloaded by cases.

pixxel 1 day ago

[dead]

lisbbb 1 day ago

I didn't want to be the one to have to say it, but neither masks nor social distancing had any scientific backing at all. It was all made up, completely made up. The saddest thing I see all the time is the poor souls STILL wearing masks in 2025 for no reason. I don't care how immunocompromised they are, the mask isn't doing anything to prevent viral infection at all. They might help against pollen. I also can't believe how many doctors and nurses at my wife's cancer clinic wear masks all the damn time even though they are not in a surgical environment. It's all been foisted upon them by the management of those clinics, and the management is completely insane, and nobody speaks up about it because it's their job if they do, so the insanity just keeps rolling on and on, and it is utterly dehumanizing and demoralizing. If a cancer patient wants to wear a mask because it affords them some tiny comfort, then fine, but that is purely psychological. I've seen it over and over and over because I've been at numerous hospitals this past year trying to help my wife survive a cancer that I think Pfizer may be to blame for.

jbm 1 day ago

I'm sorry about your wife.

There was a scientific basis for N95 masks and similar masks. If you are talking about cloth and paper masks, I mostly agree. Even then, there were tests done using even those surgical masks with 3D-printed frames. I remember this as one example of people following that line of thinking.

https://www.concordia.ca/news/stories/2021/07/26/surgical-ma...

As for dehumanization, I used to live in Tokyo and spent years riding the train. I think blaming masks for dehumanization, when we have entire systems ragebaiting us on a daily basis, is like blaming the LED light for your electric bill.

Social Distancing having "no scientific backing" is very difficult to respond to. Do you mean in terms of long term reduction of spread, or as a temporary measure to prevent overwhelming the hospitals (which is what the concern was at the time)?

I do agree that it was fundamentally dishonest to block people from going to church and then telling other people it was OK to protest (because somehow these protests were "socially distanced" and outdoors). They could have applied the same logic to Church groups and helped them find places to congregate, but it was clearly a case of having sympathy for the in-group vs the out-group.

D-Machine 1 day ago

Basically, yes. However, if we make a distinction between respirators (e.g. N95 mask) and masks (including "surgical" masks, which don't really have a meaningfully better FFE than cloth masks), then at least respirators offer some protection to the wearer, provided they also still minimize contact. But, in keeping with this distinction, yes, masks were never seriously scientifically supported. It is incredibly disheartening to see mask mandates still in cancer wards, despite these being mandates for (objectively useless) cloth/surgical masks.

iyn 1 day ago

> I didn't want to be the one to have to say it, but neither masks nor social distancing had any scientific backing at all.

This is false. Even a quick search shows multiple papers from pre-covid times that show masks being effective [0][1]. There are many more studies post-covid that show that N95/FFP2/FFP3 masks actually work if you wear them correctly (most people don't know how to do this). Educate yourself before sharing lies.

[0] https://pubmed.ncbi.nlm.nih.gov/21477136/

[1] https://pubmed.ncbi.nlm.nih.gov/19652172/

amanaplanacanal 2 days ago

Yeah they burned a lot of trust with that, for sure.

lisbbb 1 day ago

They burned it down to the ground and beyond. And many of you on here willfully continue to trust them and argue vehemently against people who try to tell you the actual truth of the matter. RFK Jr. is a flawed human being, but he's doing some good work in unwinding some of the web of lies we live under right now.

aeternum 1 day ago

It's good RFK is more willing to question things but he seems just as guilty when it comes to spinning webs of lies.

If we think tylenol might cause autism why doesn't he run/fund a nice clean and large randomized controlled trial? Instead he spreads conjecture based on papers with extremely weak evidence.

alphabettsy 1 day ago

He’s just bringing different lies with new sponsors.

dakial1 1 day ago

I think the problem is that some people apparently discovered there is a profitable business model in spreading misinformation, so a trusted (even if not always right), non-malicious reference source of information might be needed.

But who watches the watchmen?

sterlind 2 days ago

it was an extreme time, but yes, probably the most authoritarian action I've seen social media take.

misinformation is a real and worsening problem, but censorship makes conspiracies flourish, and establishes platforms as arbiters of truth. that "truth" will shift with the political tides.

IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons.

YZF 1 day ago

This just seems incredibly difficult. Even between people who are highly intelligent, educated, and consider themselves to be critical thinkers there can be a huge divergence of what "truth" is on many topics. Most people have no tools to evaluate various claims and it's not something you can just "teach kids". Not saying education can't move the needle but the forces we're fighting need a lot more than that.

I think some accountability for platforms is an important part of this. Platforms right now have the wrong incentives, we need to fix this. It's not just about "truth" but it's also about stealing our attention and time. It's a drug and we should regulate it like the drug it is.

adiabatichottub 2 days ago

As I recall from my school days, in Social Studies class there were a set of "Critical Thinking" questions at the end of every chapter in the textbook. Never once were we assigned any of those questions.

tbrownaw 2 days ago

I'd expect questions with that label to have the sort of answers that are a pain to grade.

cultofmetatron 2 days ago

[flagged]

Jensson 2 days ago

You do realize Arabs also massacred a lot of Jews at the same time? Both sides were absolutely abhorrent at the time, it was war between quickly assembled militias and civilians fighting for survival, that is never going to end well.

Example of Arabs lynching Jews, they started killing each other before the partition happened, so everyone knew it would be all out war after the British left:

> Arab workers stormed the refinery armed with tools and metal rods, beating 39[d] Jewish workers to death and wounding 49.

https://en.wikipedia.org/wiki/Haifa_Oil_Refinery_massacre

NaN years ago

undefined

NaN years ago

undefined

tjpnz 2 days ago

Some of the worst examples of viral misinformation I've encountered were image posts on social media. They'll often include a graph, a bit of text and links to dense articles from medical journals. Most people will give up at that point and assume that it's legit because the citations point to the BMJ et al. You actually need to type those URLs into a browser by hand and, assuming they go anywhere, apply knowledge taught while studying university-level stats.

I spent several hours on one of these only to discover the author of the post had found a subtle way to misrepresent the findings and had done things to the graph to skew it further. You cannot expect a kid (let alone most adults) to come to the same conclusion through lessons on critical thinking.

blooalien 1 day ago

> "IMO we need to teach kids how to identify misinformation in school. maybe by creating fake articles, mixing them with real articles and having students track down sources and identify flaws. critical thinking lessons."

You just described a perfectly normal "Civics & Current Events" class in early grade-school back when / where I grew up. We were also taught how to "follow the facts back to the actual sources" and other such proper research skills. This was way back when you had to go to an actual library and look up archived newspapers on microfiche, and encyclopedias were large collections of paper books. Y'know... When dinosaurs still roamed the streets... ;)

Aurornis 1 day ago

> IMO we need to teach kids how to identify misinformation in school.

This is extremely difficult. Many of the people who thrive on disinformation are drawn to it because they are contrarian. They distrust anything from the establishment and automatically trust anything that appears anti-establishment. If you tell them not to trust certain sources that’s actually a cue to them to explore those sources more and assume they’re holding some valuable information that “they” don’t want you to know.

The dynamics of this are very strange. A cluster of younger guys I know can list a dozen different times medical guidance was wrong in history from memory (Thalidomide, etc), but when you fact check Joe Rogan they laugh at you because he’s a comedian so you can’t expect him to be right about everything. “Do your own research” is the key phrase, which is a dog whistle to mean find some info to discount the professionals but then take sources like Joe Rogan and his guests at face value because they’re not the establishment.

lesuorac 2 days ago

2 years is a pretty long ban for a not even illegal conduct.

Although if they got banned during the start of covid during the Trump administration then we're talking about 5 years.

asadotzler 2 days ago

No one owes them any distribution at all.

zug_zug 2 days ago

Absolutely. Especially when those election deniers become insurrectionists.

beeflet 2 days ago

that is a two-way street

Simulacra 2 days ago

They went against a government narrative. This wasn't Google/Youtube banning so much as government ordering private companies to do so.

JumpCrisscross 2 days ago

> wasn't Google/Youtube banning so much as government ordering private companies to do so

No, it was not. It's particularly silly to suggest this when we have a live example of such orders right now.

The companies were nudged. (And they were wrong to respond to public pressure.) The President, after all, has a "bully pulpit." But there were no orders, no credible threats, and plenty of companies didn't deplatform these folks.

EasyMark 1 day ago

That's what I told my MAGA friends. Biden recommended stuff, Trump threatens stuff. So far only one of them has followed through with action. Trump has threatened business deals and prosecution, and is currently sending government after his opponents with the DoJ. Yet those same people are as quiet as mice now on "government bullying"

spullara 2 days ago

They literally had access to JIRA at Twitter so they could file tickets against accounts.

JumpCrisscross 2 days ago

> literally had access to JIRA at Twitter so they could file tickets against accounts

I’m not disputing that they coördinated. I’m challenging that they were coerced.

We wouldn’t describe Fox News altering a script on account of a friendly call from Miller and friends the “government ordering private companies” around. (Or, say, Florida opening their criminal justice records to ICE the federal government ordering states around.) Twitter’s leadership and the Biden administration saw eye to eye. This is a story about a media monoculture and private censorship, not government censorship.

NaN years ago

undefined

unethical_ban 2 days ago

Do you think no nefarious nation state actors are on social media spinning disinformation?

NaN years ago

undefined

nailer 1 day ago

Zuckerberg mentioned Meta was getting calls from government employees who were absolutely furious, and when Meta didn't take down legal speech the administration did not approve of, an investigation was immediately launched into Meta, which he considers retaliatory.

https://apnews.com/article/meta-platforms-mark-zuckerberg-bi...

https://open.spotify.com/episode/3kDr0LcmqOHOz3mBHMdDuV?si=j...

starik36 2 days ago

That was certainly the case with Twitter. It came out during the congressional hearings. FBI had a direct line to the decision makers.

JumpCrisscross 2 days ago

> was certainly the case with Twitter

It was not. No threats were made, and Twitter didn’t blindly follow the FBI’s guidance.

The simple truth is the leftist elements that wanted to control the debate were there in the White House and in Twitter’s San Francisco offices. Nobody had to be coerced, they were coördinating.

brokencode 2 days ago

A direct line to threaten decision makers? Or to point out possible misinformation spreaders?

NaN years ago

undefined

LeafItAlone 2 days ago

And do you think the impetus behind this action happening now is any different? In both cases YouTube is just doing what the government wants.

stronglikedan 2 days ago

[flagged]

3cKU 2 days ago

[flagged]

jackmottatx 2 days ago

[dead]

system7rocks 2 days ago

We live in a complicated world, and we do need the freedom to get things right and wrong. Never easy though in times of crisis.

The silver lining in this is that the conversation continued and will continue. I can see governments needing to try to get accurate and helpful information out in a crisis - and needing to pressure or ask more of private companies to do that. But I also like that we can reflect back and say - maybe that didn't work like we wanted, or maybe it was heavy-handed.

In many governments, the government can do no wrong. There are no checks and balances.

The question is - should we still trust YouTube/Google? Is YouTube really some kind of champion of free speech? No. Is our current White House administration a champion of free speech? Hardly.

But hopefully we will still have a system that can have room for critique in the years to come.

electriclove 2 days ago

It is scary how close we were to not being able to continue the conversation.

doom2 1 day ago

If anything, I think we're even closer. It feels like the current administration is stifling speech more than ever. It's open season on people who don't proudly wave the flag or correctly mourn Charlie Kirk. People who dare speak against Israel are being doxxed and in some cases hounded out of their jobs. Books are being taken off library shelves on the whim of a very few community members with objections. And all of it is getting a giant stamp of approval from the White House.

type0 2 days ago

> Is our current White House administration a champion of free speech? Hardly.

So after January 22, 2026, the US leaves the WHO and YouTube users will be able to contradict WHO recommendations.

whinvik 1 day ago

It's odd. People on HN routinely complain about how Stripe or PayPal or some other entity banned them unfairly, and the overwhelming sentiment is that it was indeed unfair.

But when it comes to this thread, the sentiment mostly is banning is good and we should trust Google made the right choice.

squigz 1 day ago

Like the other commenter says, HN isn't a hive mind and doesn't always agree on things.

More than that... different situations usually require different conclusions.

seivan 1 day ago

[dead]

breadwinner 2 days ago

I think it would be wise to listen to Nobel Prize-winning journalist Maria Ressa of The Philippines, regarding unchecked social media.

"You and I, if we say a lie we are held responsible for it, so people can trust us. Well, Facebook made a system where the lies repeated so often that people can't tell."

"Both United Nations and Meta came to the same conclusion, which is that this platform Facebook actually enabled genocide that happened in Myanmar. Think about it as, when you say it a million times... it is not just the lie but also it is laced with fear, anger and hate. This is what was prioritized in the design and the distribution on Facebook. It keeps us scrolling, but in countries like Myanmar, in countries like Philippines, in countries where institutions are weak, you saw that online violence became real world violence."

"Fear, anger, hate, lies, salaciousness, this is the worst of human nature... and I think that's what Big Tech has been able to do through social media... the incentive structure is for the worst of who we are because you keep scrolling, and the longer you keep scrolling the more money the platform makes."

"Without a shared reality, without facts, how can you have a democracy that works?"

https://www.cnn.com/2025/01/12/us/video/gps0112-meta-scraps-...

themaninthedark 2 days ago

"Beware of he who would deny you access to information for in his heart he dreams himself your master." - Commissioner Pravin Lal, U.N. Declaration of Rights

ethbr1 2 days ago

Full quote: "As the Americans learned so painfully in Earth's final century, free flow of information is the only safeguard against tyranny. The once-chained people whose leaders at last lose their grip on information flow will soon burst with freedom and vitality, but the free nation gradually constricting its grip on public discourse has begun its rapid slide into despotism. Beware of he who would deny you access to information, for in his heart he deems himself your master."

(Alpha Centauri, 1999, https://civilization.fandom.com/wiki/The_Planetary_Datalinks... )

HocusLocus 1 day ago

"I sit here in my cubicle, here on the motherworld. When I die, they will put my body in a box and dispose of it in the cold ground. And in the million ages to come, I will never breathe, or laugh, or twitch again. So won't you run and play with me here among the teeming mass of humanity? The universe has spared us this moment."

~Anonymous, Datalinks.

01HNNWZ0MV43FF 1 day ago

Anyway this video about Biden drinking the blood of Christian children is brought to you by Alpha Testerone 2 Supplements, now FDA-approved

Yeul 1 day ago

[flagged]

NaN years ago

undefined

dzhiurgis 1 day ago

That's why you buy a $20,000 GPU for local inference for your AI-ad-blocker, geez.

Orrrrr you pay $20 per month to either left or right wing one on the cloud.

tensor 2 days ago

There is a difference between free flow of information and propaganda. Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.

I think you could make a reasonable argument that the algorithms that distort social media feeds actually impede the free flow of information.

AnthonyMouse 1 day ago

> Much like how monopolies can destroy free markets, unchecked propaganda can bury information by swamping it with a data monoculture.

The fundamental problem here is exactly that.

We could have social media that no central entity controls, i.e. it works like the web and RSS instead of like Facebook. There are a billion feeds, every single account is a feed, but you subscribe to thousands of them at most. And then, most importantly, those feeds you subscribe to get sorted on the client.

Which means there are no ads, because nobody really wants ads, and so their user agent doesn't show them any. And since ads are the source of the existing incentive for the monopolist in control of the feed to fill it with rage bait, that incentive goes away too.

The cost is that you either need a P2P system that actually works or people who want to post a normal amount of stuff to social media need to pay $5 for hosting (compare this to what people currently pay for phone service). But maybe that's worth it.
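
As a minimal sketch of the client-side merging described above (not anyone's actual implementation): it assumes the third-party Python feedparser library and a couple of hypothetical placeholder feed URLs, and leaves out caching, signatures, and any P2P transport. The point is only that the ranking lives in the user agent, so no feed owner has an incentive to inject rage bait or ads.

```python
# Minimal sketch: client-side aggregation of self-chosen feeds (no central ranking, no ads).
# Assumes the third-party `feedparser` library; the URLs below are hypothetical placeholders.
import time
import feedparser

SUBSCRIPTIONS = [
    "https://example.org/alice/feed.xml",   # hypothetical personal feed
    "https://example.net/bob/posts.rss",    # hypothetical personal feed
]

def fetch_all(urls):
    """Fetch every subscribed feed and flatten the entries into one list."""
    entries = []
    for url in urls:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            entries.append({
                "source": url,
                "title": entry.get("title", "(untitled)"),
                "link": entry.get("link", ""),
                # struct_time when present; epoch otherwise so undated items sort last
                "published": entry.get("published_parsed") or time.gmtime(0),
            })
    return entries

def timeline(urls, limit=50):
    """Sort on the client, newest first -- the user agent decides the order, not a platform."""
    return sorted(fetch_all(urls), key=lambda e: e["published"], reverse=True)[:limit]

if __name__ == "__main__":
    for item in timeline(SUBSCRIPTIONS):
        print(time.strftime("%Y-%m-%d", item["published"]), item["title"], item["link"])
```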

nobody9999 1 day ago

>We could have social media that no central entity controls, i.e. it works like the web and RSS instead of like Facebook. There are a billion feeds, every single account is a feed, but you subscribe to thousands of them at most. And then, most importantly, those feeds you subscribe to get sorted on the client.

The Fediverse[1] with ActivityPub[0]?

[0] https://activitypub.rocks/

[1] https://fediverse.party/

NaN years ago

undefined

nradov 2 days ago

There is no generally accepted definition of propaganda. One person's propaganda is another person's accurate information. I don't trust politicians or social media employees to make that distinction.

kelvinjps 1 day ago

There are definitely videos that are propaganda.

Like those low-quality AI videos about Trump or Biden saying things that didn't happen. Anyone with critical thinking knows those are either propaganda or engagement farming.

NaN years ago

undefined

tensor 2 days ago

What you think is propaganda is irrelevant. When you let people unnaturally amplify information by paying to have it forced into someone’s feed that is distorting the free flow of information.

Employees choose what you see every day you use most social media.

NaN years ago

undefined

NaN years ago

undefined

NaN years ago

undefined

refurb 1 day ago

And propaganda by definition isn’t false information. Propaganda can be factual as well.

fellowniusmonk 1 day ago

So many people have just given up on the very idea of coherent reality? Of correspondence? Of grounding?

Why? No one actually lives like that when you watch their behavior in the real world.

It's not even post modernism, it's straight up nihilism masquerading as whatever is trendy to say online.

These people accuse everyone of bias while ignoring that their own position comes from a place of such extreme bias that it irrationally, presuppositionally rejects the possibility of true facts in their chosen, arbitrary cut-outs. It's special pleading as a lifestyle.

It's very easy to observe, model, and simulate node-based computer networks that allow for coherent and well-formed data with high correspondence, and very easy to see networks destroyed by noise and data drift.

We have observed this empirically in real networks; it's pragmatic and it's why the internet and other complex systems run. People rely on real network systems and the observed facts of how they succeed or fail, then try to undercut those hard-won truths from a place of utter ignorance. While relying on them! It's absurd ideological parasitism: they deny the value of the things they demonstrably value just by posting! Just the silliest form of performative contradiction.

I don't get it. Facts are facts. A thing can be objectively true in what for us is a linear global frame. The log is the log.

Wikipedia and federated text content should never be banned - logs, timelines, data, etc. - but memes and other primarily emotive media are case by case; I don't see their value. I don't see the value in allowing people to present unprovable or demonstrably false data using a dogmatically, confidently true narrative.

I mean, present whatever you want, but mark it as interpretation or low confidence, versus multiple verified sources with a paper trail.

Data quality, grounding and correspondence can be measured. It takes time, though, for validation to occur; it's far easier to ignore those traits and just generate infinite untruth and ungrounded data.

Why do people prop up infinite noise generation as if it was a virtue? As if noise and signal epistemically can't be distinguished ever? I always see these arguments online by people who don't live that way at all in any pragmatic sense. Whether it's flat earthers or any other group who rejects the possibility of grounded facts.

Interpretation is different, but so is the intentional destruction of a shared meaning space by turning every little word into a shibboleth.

People are intentionally destroying the ability to even negotiate connections to establish communication channels.

Infinite noise leads to runaway network failure and, in human systems, the inevitability of violence. I for one don't like to see people die because the system has destroyed message passing via attentional DDoS.

NaN years ago

undefined

ruszki 1 day ago

There isn't. Yet everybody knows what I mean by "propaganda against immigration" (some would discredit it, some would fight for it), and nobody claims that the Hungarian government's "information campaign" about migrants is not fascist propaganda (except the government, obviously, but not even their followers deny it). So yes, the edges are blurred, yet we can clearly identify some propaganda.

Also, accurate information (like "here are 10 videos about blacks killing whites") paired with distorted statistics (when there is twice as much white-on-black murder) is still propaganda. But these are difficult to identify, since they clearly affect almost the whole population. Not many people have even tried to fight against it, especially because the propaganda's message is created by you. // The example is fiction - but the direction exists, just look at Kirk's Twitter for example - and I have no idea about the exact numbers off the top of my head.

ASalazarMX 1 day ago

Propaganda wouldn't be such a problem if content wasn't dictated by a handful of corporations, and us people weren't so unbelievably gullible.

boltzmann-brain 1 day ago

indeed, didn't YT ban a bunch of RT employees for undisclosed ties? I bet those will be coming back.

vintermann 1 day ago

Oh, but can you make an argument that the government, pressuring megacorporations with information monopolies to ban things they deem misinformation, is a good thing and makes things better?

Because that's the argument you need to be making here.

potato3732842 1 day ago

You don't even need to make the argument. Go copy-paste some top HN comments on this issue from around the time the actions YouTube is now reversing actually happened.

NaN years ago

undefined

estearum 1 day ago

Not really. You can argue that the government should have the right to request content moderation from private platforms and that private platforms should have the right to decline those requests. There are countless good reasons for both sides of that.

In fact, this is the reality we have always had, even under Biden. This stuff went to court. They found no evidence of threats against the platforms, the platforms didn't claim they were threatened, and no platform said anything other than they maintained independent discretion for their decisions. Even Twitter's lawyers testified under oath that the government never coerced action from them.

Even in the actual letter from YouTube, they affirm again that they made their decisions independently: "While the Company continued to develop and enforce its policies independently, Biden Administration officials continued to press the company to remove non-violative user-generated content."

So where does "to press" land on the spectrum between requesting action and coercion? Well, one key variable would be the presence of some type of threat. Not a single platform has argued they were threatened either implicitly or explicitly. Courts haven't found evidence of threats. Many requests were declined and none produced any sort of retaliation.

Here's a threat the government might use to coerce a platform's behavior: a constant stream of subpoenas! Well, wouldn't you know it, that's exactly what produced the memo FTA.[1]

Why hasn't Jim Jordan just released the evidence of Google being coerced into these decisions? He has dozens if not hundreds of hours of filmed testimony from decision-makers at these companies he refuses to release. Presumably because, like in every other case that has actually gone to court, the evidence doesn't exist!

[1] https://www.politico.com/live-updates/2025/03/06/congress/ji...

NaN years ago

undefined

yongjik 1 day ago

That sounds great in the context of a game, but in the years since its release, we have also learned that those who style themselves as champions of free speech also dream themselves our master.

They are usually even more brazen in their ambitions than the censors, but somehow get a free pass because, hey, they're just fighting for the oppressed.

ethbr1 1 day ago

I'd say free speech absolutism (read: early-pandemic Zuckerberg, not thumb-on-the-scales Musk) has always aged better than the alternatives.

The trick is there's a fine line between honest free speech absolutism and "pro the free speech I believe in, and silent about the freedom of that which I don't." That usually happens when ego and power get involved (see: Trump, Musk).

To which, props to folks like Ted Cruz for vocally addressing the dissonance and opposing FCC speech policing.

potato3732842 1 day ago

Anything that people uncritically see as good attracts the evil and the illegitimate, because they cannot build power on their own, so they must co-opt things people see as good.

soganess 2 days ago

Not in the original statement, but as it is referenced here, the word 'information' is doing absolutely ludicrous amounts of lifting. Hopefully it bent at the knees, because in my book it broke.

You can't call the phrase "the sky is mint chocolate chip pink with pulsating alien clouds" information.

arevno 1 day ago

While this is true, it's also important to realize that during the great disinformation hysteria, perfectly reasonable statements like "This may have originated from a lab", "These vaccines are non-sterilizing", or "There were some anomalies of Benford's Law in this specific precinct and here's the data" were lumped into the exact same bucket as "The CCP built this virus to kill us all", "The vaccine will give you blood clots and myocarditis", or "The DNC rigged the election".

The "disinformation" bucket was overly large.

There was no nuance. No critical analysis of actual statements made. If it smelled even slightly off-script, it was branded and filed.
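
For concreteness, the "Benford's Law anomaly" check mentioned above is just a comparison of observed leading-digit frequencies against the Benford distribution P(d) = log10(1 + 1/d). A minimal sketch in Python, with made-up per-precinct totals; it says nothing about whether such a test is actually meaningful for election data:

    import math
    from collections import Counter

    def benford_expected(d):
        # Benford's Law: P(leading digit = d) = log10(1 + 1/d)
        return math.log10(1 + 1 / d)

    def leading_digit(n):
        # First (most significant) digit of a positive integer.
        while n >= 10:
            n //= 10
        return n

    def benford_report(counts):
        # Compare observed leading-digit frequencies to Benford's expectation.
        digits = [leading_digit(c) for c in counts if c > 0]
        total = len(digits)
        observed = Counter(digits)
        for d in range(1, 10):
            obs = observed.get(d, 0) / total
            print(f"digit {d}: observed {obs:.3f}  expected {benford_expected(d):.3f}")

    # Hypothetical per-precinct vote totals, purely illustrative.
    benford_report([1342, 876, 2210, 453, 198, 765, 1523, 98, 3021, 654])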

nradov 1 day ago

The mRNA based COVID-19 vaccines literally did cause myocarditis as a side effect in a small subset of patients. We can argue about the prevalence and severity or risk trade-offs versus possible viral myocarditis but the basic statement about possible myocarditis should have never been lumped into the disinformation bucket.

https://www.cdc.gov/vaccines/covid-19/clinical-consideration...

BrenBarn 1 day ago

But it's because of the deluge that this happens. We can only process so much information. If the amount of "content" coming through is orders of magnitude larger, it makes sense to just reject everything that looks even slightly like nonsense, because there will still be more than enough left over.

ayntkilove 1 day ago

You can call it data and have sufficient respect for others to trust that they may process it into information. Too many have too little faith in others. If anything we need to be deluged in data, and we will probably work it out ourselves eventually.

protocolture 1 day ago

Facebook does its utmost to subject me to Tartarian, Flat Earth and Creationist content.

Yes, I block it routinely. No, the algo doesn't let up.

I don't need "faith" when I can see that a decent chunk of people disbelieve modern history, and aggressively disbelieve science.

More data doesn't help.

intended 1 day ago

This is a fear of an earlier time.

We are not controlling people by reducing information.

We are controlling people by overwhelming them in it.

And when we think of a solution, our natural inclination to “do the opposite” smacks straight into our instinct against controlling or reducing access to information.

The closest I have come to any form of light at the end of the tunnel is Taiwan's efforts to create digital consultations for policy, and the idea that facts may not compete on short time horizons, but they surely win on longer ones.

ethbr1 1 day ago

The problem is that in our collective hurry to build and support social networks, we never stopped to think about what other functions might be needed alongside them to promote a good, factual society.

People should be able to say whatever the hell they want, wherever the hell they want, whenever the hell they want. (Subject only to the imminent danger test)

But! We should also be funding robust journalism to exist in parallel with that.

Can you imagine how different today would look if the US had levied a 5% tax on social media platforms above a certain size, with the proceeds used to fund journalism?

That was a thing we could have done. We didn't. And now we're here.

probably_wrong 1 day ago

Beware of those who quote videogames and yet attribute them to "U.N. Declaration of Rights".

Starman_Jones 1 day ago

They're not wrong; the attribution is part of the quote. In-game, the source of the quote is usually important, and is always read aloud (unlike in Civ).

probably_wrong 1 day ago

I would argue that they are, if not wrong, at least misleading.

If you've never played Alpha Centauri (like me) you are guaranteed to believe this to be a real quote by a UN diplomat. It also doesn't help that searching for "U.N. Declaration of Rights" takes me (wrongly) to the (real) Universal Declaration of Human Rights. I only noticed after reading ethbr1's comment [1], and I bet I'm not the only one.

[1] https://news.ycombinator.com/item?id=45355441

BrenBarn 1 day ago

The thing is that burying information in a firehose of nonsense is just another way of denying access to it. A great way to hide a sharp needle is to dump a bunch of blunt ones on top of it.

Cheer2171 2 days ago

Beware he who would tell you that any effort at trying to clean up the post apocalyptic wasteland that is social media is automatically tyranny, for in his heart he is a pedophile murderer fraudster, and you can call him that without proof, and when the moderators say your unfounded claim shouldn't be on the platform you just say CENSORSHIP.

rixed 1 day ago

Is your point that any message is information?

Without truth there is no information.

jancsika 1 day ago

That seems to be exactly her point, no?

Imagine an interface that reveals the engagement mechanism by, say, having an additional iframe. In this iframe an LLM clicks through its own set of recommendations picked to minimize negative emotions at the expense of engagement.

After a few days you're clearly going to notice the LLM spending less time than you clicking on and consuming content. At the same time, you'll also notice its choices are part of what seems to you a more pleasurable experience than you're having in your own iframe.

Social media companies deny you the ability to inspect, understand, and remix how their recommendation algos work. They deny you the ability to remix an interface that does what I describe.

In short, your quote surely applies to social media companies, but I don't know if this is what you originally meant.

totetsu 2 days ago

Raising the noise floor of disinformation to drown out information is a way of denying access to information too.

N_Lens 2 days ago

We must dissent.

idiotsecant 1 day ago

Sure, great. Now suppose that a very effective campaign of social destabilisation propaganda exists that poses an existential risk to your society.

What do you do?

It's easy to rely on absolutes and pithy quotes that don't solve any actual problems. What would you, specifically, with all your wisdom, do?

nradov 1 day ago

Let's not waste time on idle hypotheticals and fear mongering. No propaganda campaign has ever posed an existential threat to the USA. Let us know when one arrives.

CJefferson 1 day ago

Have you seen the US recently? Just in the last couple of days, the president is standing up and broadcasting clear medical lies about autism, while a large chunk of the media goes along with him.

rixed 1 day ago

It doesn't have to be a national threat. Social media can be used by small organisations or even sufficiently motivated individuals to easily spread lies and slander against individuals or groups, and it's close to impossible to prevent (I've been fighting some trolls threatening a group of friends on Facebook lately, and I can attest how much the algorithm favors hate speech over reason).

Steltek 1 day ago

There are twin goals: total freedom of speech and holding society together (limiting polarization). I would say you need non-anonymous speech, reputation systems, traceable moderation (who did you upvote), etc. You can say whatever you want, but be ready to stand by it.

One could say the problem with freedom of speech was that there weren't enough "consequences" for antisocial behavior. The malicious actors stirred the pot with lies, the gullible and angry encouraged the hyperbole, and the whole US became polarized and divided.

And yes, this system chills speech, as one would be reluctant to voice extreme opinions. You would still have the freedom to say it, but the additional controls exert a pull back toward the average.

2OEH8eoCRo0 1 day ago

Facebook speaks through what it chooses to promote or suppress, and it is not liable for that speech because of Section 230.

Manuel_D 1 day ago

Not quite: prior to the Communications Decency Act of 1996 (which contained Section 230), companies were also not liable for the speech of their users, but lost that protection if they engaged in any moderation. The two important cases at hand are Stratton Oakmont, Inc. v. Prodigy Services Co. and Cubby, Inc. v. CompuServe Inc.

The former moderated content and was thus held liable for posted content. The latter did not moderate content and was determined not to be liable for user generated content they hosted.

Part of the motivation of section 230 was to encourage sites to engage in more moderation. If section 230 were to be removed, web platforms would probably choose to go the route of not moderating content in order to avoid liability. Removing section 230 is a great move if one wants misinformation and hateful speech to run unchecked.

ben_w 9 hours ago

You say "Not quite" but it looks to me like you're agreeing?

potato3732842 1 day ago

There's a special irony in this being the top comment on a site where everyone has a rightthink score and people routinely and flagrantly engage in "probably bad faith, but there's plausible deniability so you can't pin it on them" communication to crap on whatever the wrongthink on an issue is.

As bad as Facebook and its opaque algorithms that favor rage bait are, the kind of stuff you get by keeping score is worse.

StanislavPetrov 2 days ago

>"You and I, if we say a lie we are held responsible for it, so people can trust us."

I don't know how it works in The Philippines, but in the USA the suggestion that media outlets are held responsible for the lies that they tell is one of the most absurd statements one could possibly make.

lfpeb8b45ez 2 days ago

How about InfoWars?

StanislavPetrov 1 day ago

I was referring more to established media that people consider credible, like NBC, CBS, The Guardian, The New York Times, the Wall Street Journal, The Atlantic, etc. The fact that the only person in "media" who has been severely punished for their lies is a roundly despised figure (without any credibility among established media or the ruling class) is not a ringing endorsement for the system. While the lies of Jones no doubt caused untold hardship for the families of the victims, they pale in comparison to the much more consequential lies told by major media outlets with far greater influence.

When corporate media figures tell lies that are useful to the establishment, they are promoted, not called to account.

In 2018 Luke Harding at the Guardian lied and published a story that "Manafort held secret talks with Assange in Ecuadorian embassy" (headline later amended with "sources say" after the fake story was debunked) in order to bolster the Russiagate narrative. It was proven without a shadow of a doubt that Manafort never went to the Embassy or had any contact at all with Assange (who was under blanket surveillance), at any time. However, to this day this provably fake story remains on The Guardian website, without any sort of editor's note that it is false or that it was all a pack of lies!(1) No retraction was ever issued. Luke Harding remains an esteemed foreign correspondent for The Guardian.

In 2002, Jeffrey Goldberg told numerous lies in a completely false article in The New Yorker that sought to establish a connection between the 9/11 attacks and Saddam Hussein, called "The Great Terror".(2) This article was cited repeatedly during the run-up to the war as justification for the subsequent invasion and greatly helped contribute to an environment where a majority of Americans thought that Iraq was linked to Bin Laden and the 9/11 attackers. More than a million people were killed, in no small part because of his lies. And Goldberg? He was promoted to editor-in-chief of The Atlantic, perhaps the most prestigious and influential journal in the country. He remains in this position today.

There are hundreds, if not thousands, of similar examples. The idea suggested in the original OP that corporate/established media is somehow more credible or held to a higher standard than independent media is simply not true. Unfortunately there are a ton of lies, falsehoods and propaganda out there, and it is up to all of us to be necessarily skeptical no matter where we get our information and do our due diligence.

1. https://www.theguardian.com/us-news/2018/nov/27/manafort-hel...

2. https://www.newyorker.com/magazine/2002/03/25/the-great-terr...

anonymousiam 2 days ago

A sympathetic jury can be an enemy of justice.

I'm not an Alex Jones fan, but I don't understand how a conspiracy theory about the mass shooting could be construed as defamation against the parents of the victims. And the $1.3B judgement does seem excessive to me.

AlexandrB 2 days ago

You should read up on some details. The defamation claim is because Alex Jones accused the parents of being actors who were part of staging a false flag. The huge judgement is partly because Alex Jones failed to comply[1][2] with basic court procedure, like discovery, in a timely way, so a default judgement was entered.

Despite his resources, Alex Jones completely failed to get competent legal representation and screwed himself. He then portrayed himself as the victim of an unjust legal system.

[1] https://www.npr.org/2021/11/15/1055864452/alex-jones-found-l...

> Connecticut Superior Court Judge Barbara Bellis cited the defendants' "willful noncompliance" with the discovery process as the reasoning behind the ruling. Bellis noted that defendants failed to turn over financial and analytics data that were requested multiple times by the Sandy Hook family plaintiffs.

[2] https://lawandcrime.com/high-profile/judge-rips-alex-jones-c...

> Bellis reportedly said Jones' attorneys "failure to produce critical material information that the plaintiffs needed to prove their claims" was a "callous disregard of their obligation," the Hartford Courant reported.

protocolture 1 day ago

The specific conspiracy theory implied fraud and cover up on behalf of the parents. Lmao.

thrance 1 day ago

Ever watched Fox News?

vachina 2 days ago

This is why China bans western social media.

yupyupyups 2 days ago

Say what you will about the CCP, it's naive to let a foreign nation have this much impact on your subjects. The amount of poison and political manipulation imported from these platforms is astronomical.

scarface_74 2 days ago

Well, when the local media bends the knee and outright bribes the President (Paramount, Disney, Twitter, Facebook), why should we trust the domestic media?

nxm 1 day ago

Like the Biden administration pressured social media to take down information/accounts that went against its narrative.

ethbr1 2 days ago

Instead of implementing government information control, why not invest those resources in educating and empowering one's citizenry to recognize disinformation?

BrenBarn 1 day ago

To me this is sort of like asking why we need seat belts when we could just have people go to the gym so they're strong enough to push back an oncoming car. Well, you can't get that strong, and also you can't really educate people well enough to reliably deal with the full force of the information firehose. Even people who are good at doing it do so largely by relying on sources they've identified as trustworthy, thus offloading some of the work to those. I don't think there's anyone alive who could actually distinguish fact from fiction if they had to, say, view every Facebook/Twitter/Reddit/everything post separately in isolation (i.e., without relying on pre-screening of some sort).

And once you know you need pre-screening, the question becomes why not just provide it instead of making people hunt it down?

rixed 1 day ago

Instead of investing resources in education, why not let people discover by themselves the virtues of education?

Sarcasm aside, we tend to focus too much on the means and too little on the outcomes.

beepboopboop 2 days ago

That's hundreds of millions of people in the US, of varying ages and mostly out of school already. Seems like a good thing to try, but I'd imagine it wouldn't make a tangible impact for decades.

CJefferson 1 day ago

Because no one person can fight against a trillion-dollar industry that has decided misinformation makes the biggest profit.

How am I supposed to learn what’s going on outside my home town without trusting the media?

rgavuliak 1 day ago

Because it doesn't seem to work?

xracy 1 day ago

'An ounce of prevention is worth a pound of cure.'

It's so much easier to stop one source than it is to (checks notes) educate the entire populace?!? Gosh, did you really say that with a straight face? As if education isn't also under attack?

yupyupyups 1 day ago

I never defended the authoritarianism of the CCP. I only said it makes sense to block foreign platforms, regardless of whether the state is a tyranny or not. Framing it as if it's some kind of tactic to help keep the populace indoctrinated is a very simplistic take.

Take Reddit, for example. It's filled with blatant propaganda, from corporations and politicians. It's a disgustingly astroturfed platform run by people of questionable moral character. What's more, it also has porn. All you need is an account to access 18+ "communities". Not exactly "enlightening material" that frees the mind from tyranny.

mns 1 day ago

Because in that case you wouldn't be able to use disinformation yourself.

idiotsecant 1 day ago

Because you want to use it yourself. You can't vaccinate if you rely on the disease to maintain power. You can't tell people not to be afraid of people different than themselves if your whole party platform is being afraid of people different than yourself.

Broken_Hippo 1 day ago

Because it isn't that simple.

If we could just educate people and make sure they don't fall for scams, we'd do it. Same for disinformation.

But you just can't give that sort of broad education. If you aren't educated in medicine and can't personally verify someone's qualifications, you are going to be at a disadvantage when you are trying to tell whether that health information is sound. And if you are a doctor, it doesn't mean you know about infrastructure or have contacts to know what is actually happening in the next state or country over.

It's the same with products, actually. I can't tell if an extension cord is up to code. The best that I can realistically do is hope the one I buy isn't a fake and meets all of the necessary safety requirements. A lot of things are like this.

Education isn't enough. You can't escape misinformation and none of us have the mental energy to always know these things. We really do have to work the other way as well.

erxam 2 days ago

Sorry, 'recognizing disinformation'? You must have meant 'indoctrination'.

(They don't necessarily exclude each other. You need both positive preemptive and negative repressive actions to keep things working. Liberty is cheap talk when you've got a war on your hands.)

nradov 2 days ago

China reflexively bans anything that could potentially challenge Chairman Xi's unchecked authority and control over the information flow.

_dain_ 2 days ago

>unchecked social media

Passive voice. Who exactly is supposed to do the "checking" and why should we trust them?

breadwinner 2 days ago

Citizens. Through lawsuits. Currently we can't because of Section 230.

nradov 2 days ago

Nonsense. If social media users engage in fraud, slander, or libel then you can still hold them accountable through a civil lawsuit. Section 230 doesn't prevent this.

breadwinner 1 day ago

Will/can Facebook tell you the real identity of the user? If not, then Facebook has to take responsibility for the fraud/slander/libel. Currently Section 230 means they can't be held responsible.

trhway 2 days ago

The "editorializing" may possibly be applied i think (not a lawyer) when the platform's manipulation of what a user sees is based on content. And the Youtube's banning of specific Covid and election content may be such an "editorializing", and thus Youtube may not have Section 230 protection at least in those cases.

nradov 2 days ago

Have you even read Section 230? Editorializing is irrelevant.

trhway 2 days ago

Censorship works both ways. When I tried speaking against violence and genocide perpetrated by Russia in Ukraine I was shut down on LinkedIn.

Even here on HN, I was almost banned when I spoke about the abduction of children by Russia https://news.ycombinator.com/item?id=33005062 - the crime for which, half a year later, the ICC issued an arrest warrant against Putin.

breadwinner 2 days ago

You know how this used to work in the old days? Instead of publishing allegations yourself, you would take your story to a newspaper reporter. The reporter would then investigate and, if there was solid evidence, the story would be published in the newspaper. At that point the newspaper company was standing behind the story, and citizens knew the standing of the newspaper in their community, and how much credence to give to the story, based on that. Social media destroyed this process; now anyone can spread allegations at lightning speed on a massive scale without any evidence to back them up. This has to stop. We should return to the old way; it wasn't perfect, but it worked for hundreds of years. Repealing Section 230 will accomplish this.

themaninthedark 2 days ago

I remember a story that was investigated and then published...it was spread far and wide. The current president of the US stole the election and our biggest adversary has videos of him in compromising positions. Then debunked. (Steele dossier) https://www.thenation.com/article/politics/trump-russiagate-...

I remember a story that was investigated and then published...for some reason it was blocked everywhere and we were not allowed to discuss the story or even link to the news article. It "has the hallmarks of a Russian intelligence operation."(Hunter Biden Laptop) Only to come out that it was true: https://www.msn.com/en-us/news/politics/fbi-spent-a-year-pre...

I would rather not outsource my thinking or my ability to get information to approved sources. I have had enough experience with Gell-Mann amnesia to realize they have little to no understanding of the situation as well. I may not be an expert in all domains, but while I am still free at least I can do my best to learn.

tanjtanjtanj 1 day ago

> Russiagate

It was never “debunked”, that is far too strong a word. Is it true? Who knows! Should we operate as if it was true without it being proven? Definitely not.

> Hunter’s laptop

In what way was that story buried or hidden? It was a major news story on every news and social network for over half a year. There was only consternation about how the laptop was acquired and who or what helped with that endeavor. The “quieting” of the story is BS and only came about a long time after the fact. Biden’s people sought (unsuccessfully) to have images removed from platforms but there was never an effort to make it seem like the allegations that stemmed from the laptop were misinformation.

ModernMech 1 day ago

w.r.t. the Steele Dossier, it was always from the beginning purported to be a "raw intelligence product", which is understood by everyone involved in that process to mean it is not 100% true -- the intelligence is weighted at different levels of confidence. Steele has said he believed his sources were credible, but he did not claim the dossier was 100% accurate. He weighed it at 50/50, and expected that investigators would use it as leads to verify information, not as proof in itself.

And on that point the FBI investigations didn't even start on the basis of the Steele Dossier; they started on the basis of an Australian diplomat, Alexander Downer, who, during a meeting with top Trump campaign foreign policy advisor George Papadopoulos, became alarmed when Papadopoulos mentioned that the Russian government had "dirt" on Hillary Clinton and might release it to assist the Trump campaign. Downer alerted the Australian government, who informed the FBI. The Steele Dossier was immaterial to the investigation's genesis.

So any claim that the dossier as a whole has been "debunked" is not remarkable. Of course parts of it have been debunked, because it wasn't even purported to be 100% true by the author himself. It's not surprising things in it were proven false.

Moreover, that also doesn't mean everything in it was false. The central claim of the dossier -- that Donald Trump and his campaign had extensive ties to Russia, and that Russia sought to influence the 2016 U.S. election in Trump's favor -- was proven to be true by the Mueller Report Vols. I and II, and the Senate Select Intel Committee Report on Russian Active Measures Campaigns and Interference in the 2016 Election, Vols. I - VI.

> The current president of the US stole the election

Not a claim made in the dossier.

> and our biggest adversary has videos of him in compromising positions.

This hasn't been debunked. The claim in the dossier was that Russia has videos of Trump with prostitutes peeing on a bed Obama slept in, not peeing on Trump himself. The idea that it was golden showers is a figment of the internet. Whether or not the scenario where people peed on a bed Obama slept in happened as laid out in the dossier is still unverified, but not "debunked".

scarface_74 2 days ago

[flagged]

nradov 2 days ago

It never worked. Newspapers in the old days frequently printed lies and fake news. They usually got away with it because no one held them accountable.

itbeho 1 day ago

William Randolph Hearst and the Spanish-American war come to mind.

trhway 2 days ago

>At that point the newspaper company is standing behind the story

The newspaper company is a bottleneck that censors can easily tighten, as in, say, the USSR. Or even the FCC today with the media companies, as in the case of Kimmel.

Social media is our best tool so far against censorship. Even with all the censorship that we do have in social media, the information still finds a way due to the sheer scale of the Internet. That wasn't the case in the old days when, for example, each typewriter could be identified by unique micro-details of the shape of its characters.

>Social media destroyed this process, now anyone can spread allegations at lightning speed on a massive scale without any evidence to back it up.

Why believe anything not accompanied by evidence? The problem here is with the news consumer. We teach children not to stick their fingers into a wall socket. If a child sticks their fingers in anyway, are you going to hold the electric utility company responsible?

>This has to stop. We should return to the old way, it wasn't perfect, but it worked for 100s of years.

The same can be said about the modern high density of human population, transport connections, and the spread of infectious disease. What you suggest is to decrease the population and confine the rest, preventing any travel like in the "old days" (interesting that it took the Black Death some years to spread instead of the days it would take today, yet it still did spread around all the known world). We've just seen how that works in our times (and even if you say it worked then, why aren't we still doing it today?). You can't put the genie back into the bottle and stop progress.

>Repealing Section 230 will accomplish this.

Yes, good thing people didn't decide back then to charge the actual print houses with the lies present in the newspapers they printed.

nextaccountic 1 day ago

Social media is also a bottleneck. In places like India, Facebook will comply with censorship or it will get blocked.

pkphilip 1 day ago

What happens when the press refuses to publish anything which doesn't align with their financial or political interest?

mensetmanusman 2 days ago

There is no way to go back to this. It’s about as feasible as getting rid of vehicles.

breadwinner 2 days ago

I am not saying we should go back to physical newspapers printed on paper. News can be published online... but whoever is publishing it has to stand behind it, and be prepared to face lawsuits from citizens harmed by false stories. This is feasible, and it is the only solution to the current mess.

petermcneeley 2 days ago

> We should return to the old way, it wasn't perfect, but it worked for 100s of years

At this stage you are clearly just trolling. Are you even aware of the last 100s of years? From Luther to Marx? You are not acting in good faith. I want nothing to do with your ahistorical worldview.

EB-Barrington 2 days ago

[dead]

King-Aaron 2 days ago

I can think of another hot-potato country that will get posts nerfed from HN and many others

gchamonlive 2 days ago

That's the evil genius behind the general movement in the world to discredit democratic institutions and deflate the government.

Who would hold Meta accountable for the lies it helps spread and capitalizes upon, if not the government?

So by crippling democratic institutions and dwarfing the government to the point of virtual non-existence, all in the name of preserving freedom of speech and liberalism -- and in the process subverting both concepts -- elected leaders have managed to neutralize the only check in the way of big corps to ramp up this misinformation machine that the social networks have become.

stinkbeetle 2 days ago

I think it would be even wiser to start by holding to account the politicians, corporations, and government institutions regarding their unchecked lies, corruption, and fraud.

But no, yet again the blame is all piled on to the little people. Yes, it's us plebs lying on the internet who are the cause of all these problems and therefore we must be censored. For the greater good.

I have an alternative idea: let's first imprison or execute (with due process) politicians, CEOs, generals, heads of intelligence and other agencies and regulators, those found to have engaged in corrupt behavior, lied to the public, committed fraud or insider trading, fabricated evidence to support invading other countries, engaged in undeclared wars, ordered extrajudicial executions, colluded with foreign governments to hack elections, evaded taxes, etc. Then after we try that out for a while, if it has not improved things, we could try ratcheting up the censorship of plebs. Now one might argue that it would be a violation of the rights of those people to take such measures against them, but that is a sacrifice I'm willing to make. Since We Are All In This Together™, they would be willing to make that sacrifice too. And really, if they have nothing to hide then they have nothing to fear.

When you get people like Zuckerberg lying to congress, it's pretty difficult to swallow the propaganda claiming that it's Joe Smith the unemployed plumber from West Virginia sharing "dangerous memes" with his 12 friends on Facebook that is one of the most pressing concerns.

n4r9 1 day ago

I don't think "breadwinner" is blaming the little people.

stinkbeetle 1 day ago

No, the ruling class is. breadwinner, I guess, has bought into the propaganda but hasn't made the connection that it basically puts all the blame on the little people and proposes to put all the burden of "fixing" things onto them, with measures that will not actually fix anything except handing more power to the ruling class.

n4r9 16 hours ago

breadwinner is clearly putting the blame on Big Tech for putting perverse incentive structures in place.

refurb 1 day ago

The problem is not the content, the problem is people believing things blindly.

The idea that we need to protect people from “bad information” is a dark path to go down.

BrenBarn 1 day ago

I don't see it so much as protecting people from bad information as protecting people from bad actors, among whom entities like Facebook are prominent. If people want to disseminate quackery they can do it like in the old days by standing on a street corner and ranting. The point is that the mechanisms of content delivery amplify the bad stuff.

refurb 1 day ago

It’s a terrible idea and creates more problems than it solves.

You eliminate the good and the bad ideas. You eliminate the good ideas that are "bad" simply because they upset people with power. You eliminate the good ideas that are "bad" simply because they are deemed too far outside the Overton window.

And worst of all, it requires some benevolent force to make the call between good and bad, which attracts all sorts of psychopaths hungry for power.

thrance 1 day ago

Have you been living under a rock these past few years? The "bad" ideas outnumber the "good" ones ten to one. The current secretary of health lets internet conspiracies dictate his policies, such that vaccines are getting banned, important research is getting defunded, and now they're even going after paracetamol (!!). People will die.

Cue the quote that says it takes 30 minutes to debunk 30 seconds of lying.

vintermann 1 day ago

Exactly what are you trying to say about unbanning YouTubers here?

afavour 1 day ago

That it could be dangerous to readmit people who broadcast disinformation? The connection seemed pretty clear to me.

vintermann 1 day ago

I certainly guessed that was what you wanted to say. Funny how polarization makes everything predictable.

But what I just realized is that you don't explicitly say it, and certainly make no real argument for it. Ressa laments algorithmic promotion of inflammatory material, but didn't say "keep out anti-government subversives who spread dangerous misinformation" - which is good, because

1. We can all see how well the deplatforming worked - Trump is president again, and Kennedy is health secretary.

2. In the eyes of her government, she was very much such a person herself, so it would have been pretty bizarre thing of her to say.

Ironically, your post is very much an online "go my team!" call, and a good one too (top of the thread!). We all understand what you want and most of us, it seems, agree. But you're not actually arguing for the deplatforming you want, just holding up Ressa as a symbol for it.

n4r9 1 day ago

> We can all see how well the deplatforming worked - Trump is president again

Not a compelling argument...

Jan 2021 - Twitter bans Trump (for clear policy violations)

Oct 2022 - Musk buys Twitter

Nov 2022 - Twitter reinstates Trump's account

Nov 2024 - Trump re-elected, gives Musk cabinet position

afavour 1 day ago

You realise I didn’t make the original post, right?

Slava_Propanei 1 day ago

[dead]

topspin 2 days ago

All those words, and no mention of Section 230, which is what this is really all about. Google can see which way the wind is blowing and they know POTUS will -- for better or worse -- happily sign any anti-"Big Tech censorship" bill that gets to his desk. They hope to preempt this.

Yes, I know about the Charlie Kirk firings etc.

dang 2 days ago

Ok, we've changed the URL above to that first link from https://www.offthepress.com/youtube-will-let-users-booted-fo.... Thanks!

rustystump 2 days ago

The problem with any system like this is that, due to scale, it will be automated, which means a large swath of people will be caught up in it despite doing nothing wrong.

This is why permabans are bad. I'd rather have a strike system before a temp ban, to give some breathing room for people to navigate the inevitable incorrect automation. Even then, if the copyright issue is anything to go by, this is going to hurt more than help.

whycome 2 days ago

What exactly constituted a violation of a COVID policy?

PaulKeeble 2 days ago

A lot of channels had to avoid even saying the word Covid. I only saw it return to use at the end of last year. There were a variety of channels banned that shouldn't have been, such as some talking about Long Covid.

doom2 2 days ago

Now you see channels avoiding saying "Gaza" or "genocide". I haven't seen any proof that platforms are censoring at least some content related to Israel, but I wouldn't be surprised.

carlosjobim 2 days ago

Every opinion different from the opinion of "authorities". They documented it here:

https://blog.youtube/news-and-events/managing-harmful-vaccin...

From the two links in the post, Google fleshes it out in great detail, with many examples of forbidden thought.

miltonlost 2 days ago

[flagged]

someuser2345 2 days ago

> content that falsely alleges that approved vaccines are dangerous and cause chronic health effects

The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

> claims that vaccines do not reduce transmission or contraction of disease

Isn't that true of the covid vaccines? Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely, but later on, they changed the goal posts to "it will reduce your symptoms of covid".

joecool1029 2 days ago

> The J & J vaccine was approved at the time, but was later banned for causing chronic health effects.

That's not what happened. Authorities received rare reports of a clotting disorder and paused it for 11 days to investigate. That pause was lifted but the panic caused a crash in demand and J&J withdrew it from the market. Source: https://arstechnica.com/health/2023/06/j-fda-revokes-authori...

2muchcoffeeman 2 days ago

This highlights what’s so difficult with science communication.

Right here, on what should be a technically minded forum, people don't understand what science is or how it works. Or what risk is. And they don't challenge their own beliefs, nor are they curious about how things actually work.

If the “smart” people can’t or won’t continuously incorporate new information, what are our chances?

teamonkey 2 days ago

> Originally, the proponents claimed that getting the vaccine would stop you from getting covid entirely

Some people don’t understand how vaccines work, so may have claimed that, but efficacy rates were very clearly communicated. Anyone who listened in high school biology should know that’s not how they work.

roenxi 2 days ago

That policy catches and bans any scientist studying the negative health effects of vaccines who later turns out to be right.

1) YouTube doesn't know what is true. They will be relying on the sort of people they would ban to work out when the consensus is wrong. If I watched a YouTube video of someone spreading "vaccine misinformation" all the way through, there is a pretty good chance that the speakers have relevant PhDs or are from the medical profession - there is no way the YouTube censors are more qualified than that, and the odds are they're just random unqualified employees already working in the euphemistically named "Trust & Safety" team.

2) All vaccines are dangerous and can cause chronic health effects. That statement isn't controversial, the controversy is entirely over the magnitude. Every time I get a vaccine the standard advice is "you should probably hang around here for 5 minutes, these things are known to be dangerous in rare cases". I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.

> This would include content that falsely says that approved vaccines cause ... cancer ...

Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

3) All policies have costs and benefits. People have to be able to discuss the overall cost-benefit of a policy in YouTube videos even if they get one of the costs or benefits completely wrong.

gus_massa 2 days ago

> I think in most countries you're more likely to get polio from a polio vaccine than in the wild. On the one hand, that is a success of the polio vaccine. On the other hand, the vaccine is clearly dangerous and liable to cause chronic health problems.

In https://en.wikipedia.org/wiki/Polio_eradication#2025 I count 2 countries with the wild type and 17 with the vaccine-derived type (and like 160 without polio!)

There are two vaccines: the oral one that has attenuated ("live") virus and the injectable one that has inactivated ("dead") virus.

* The oral version is not dangerous for the person that receives it [1], but the virus can pass to other persons and, after a few hops, mutate to the dangerous version. The advantage is that the immunity is stronger and it also stops transmission.

* The injectable version is also not dangerous [1]; it doesn't stop transmission, but it also can't mutate because the virus is totally dead.

Most first world countries, and many other countries with no recent cases, use only the injectable version. (Here in Argentina, we switched to only injectable like 5 years ago :) .)

Countries with recent cases or other problems use a mix, to reduce transmission. (I think the injectable one is also cheaper and easier to store.) (Also, a few years ago they dropped globally one of the strains from the oral one, because that strain is eradicated. The injectable one has that strain just in case, but it can't escape.)

[1] Except potential allergic reactions, which are rare, but I also remember big signs with instructions for the nurse explaining in case of an emergency what to do, what to inject, where to call ... The risk is not 0, but very low. I wonder if the trip to the hospital to get the vaccine is more dangerous.

handoflixue 2 days ago

> Cancer is such a catch all that we can pretty much guarantee there will be some evidence that vaccines cause cancer. Everything causes cancer. Drinking tea is known to cause cancer.

I'm reminded of the Prop 65 signs everywhere in California warning "this might cause cancer"

TeeMassive 2 days ago

> This seems like good banning to me. Anti-vaxxer propaganda isn't forbidden thoughts. It's bad science and lies and killing people.

Any subject important enough to discuss in any public forum is potentially going to attract wrong opinions that cause harm. While some people could be wrong, and could cause harm, the state itself being wrong is far more dangerous, especially with no dissident voices there to correct its course.

Edit: I see you're getting downvoted for simply stating your honest opinion. But as a matter of principle I'm going to upvote you.

mapontosevenths 2 days ago

[flagged]

rpiguy 2 days ago

People have the right to believe things that could get them killed and the right to share their beliefs with others.

Allowing the debate to be shut down is undemocratic and unscientific (science without question is nothing more than religion).

Not allowing people to come to different conclusions from the same data is tyranny.

Bender 2 days ago

Pfizer hid a lot of the damage done, as did the others. A lot of people can die by the time books come out. [1] That's one of the many reasons I held off, and I'm glad I did.

[1] - https://www.amazon.com/Pfizer-Papers-Pfizers-Against-Humanit...

immibis 2 days ago

Shouting "fire" in a crowded theater being illegal was used to make it illegal to oppose the draft (Schenck v. United States). So actually, since opposing the draft is legal, shouting "fire" in a crowded theater is legal too.

potsandpans 2 days ago

Saying the lab leak theory was true.

perihelions 2 days ago

According to Google's censorship algorithm: Michael Osterholm's podcast (he's a famous epidemiologist and was, at the time, a member of President Biden's own gold-star COVID-19 advisory panel).

https://x.com/cidrap/status/1420482621696618496 ("Our Osterholm Update podcast episode (Jul 22) was removed for “medical misinformation.”" (2021))

Most ironic thing I've ever seen. I still recall it perfectly, though it's been four years. Never, ever trust censorship algorithms or the people who control them: they are just dumb parrots that suppress all discussion of an unwanted topic, without thought or reason.

delichon 2 days ago

My wake up moment was when they not only took down a Covid debate with a very well qualified virologist, but also removed references to it in the Google search index, not just for the YouTube link.

miltonlost 2 days ago

[flagged]

delichon 2 days ago

I am not comfortable letting Google make that decision for me. You are?

barbacoa 2 days ago

Google went so far as to scan people's private Google Drives for copies of the documentary 'Plandemic' and delete them.

potsandpans 2 days ago

Can you please provide evidence? I'm not saying I don't believe you. It's just... extraordinary claims etc

jimt1234 2 days ago

[flagged]

zobzu 2 days ago

[flagged]

woeirua 2 days ago

It seems to me that a lot of people are missing the forest for the trees on misinformation and censorship. IMO, a single YouTube channel promoting misinformation, about Covid or anything else, is not a huge problem, even if it has millions of followers.

The problem is that the recommendation algorithms push their viewers into these echo chambers that are divorced from reality where all they see are these videos promoting misinformation. Google's approach to combating that problem was to remove the channels, but the right solution was, and still is today, to fix the algorithms to prevent people from falling into echo chambers.
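
One hedged sketch of what "fixing the algorithms" could mean: a greedy re-ranker that discounts candidates from topics already shown, trading a little predicted engagement for variety. This is purely illustrative, assuming made-up Video fields and scores; it is not how YouTube's recommender actually works:

    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        topic: str         # hypothetical coarse topic label
        engagement: float  # hypothetical predicted watch-time score

    def rerank_with_diversity(candidates, k=5, penalty=0.5):
        # Greedy selection: each extra pick from an already-chosen topic has its
        # score multiplied by (1 - penalty), nudging the feed toward variety.
        picked, topic_counts, pool = [], {}, list(candidates)
        while pool and len(picked) < k:
            best = max(
                pool,
                key=lambda v: v.engagement * (1 - penalty) ** topic_counts.get(v.topic, 0),
            )
            picked.append(best)
            topic_counts[best.topic] = topic_counts.get(best.topic, 0) + 1
            pool.remove(best)
        return picked

    feed = rerank_with_diversity([
        Video("Skeptic rant #12", "covid-skeptic", 0.95),
        Video("Skeptic rant #13", "covid-skeptic", 0.93),
        Video("Immunology 101", "science", 0.60),
        Video("Woodworking basics", "hobby", 0.55),
    ], k=3)
    print([v.title for v in feed])

The design choice is simply to stop ranking on raw predicted engagement alone; the penalty parameter is the knob a platform could tune between pure engagement and forced variety.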

kypro 2 days ago

I've argued this before, but the algorithms are not the core problem here.

For whatever reason I guess I'm in that very rare group that genuinely watches everything from far-right racists, to communists, to mainstream media content, to science educational content, to conspiracy content, etc.

My YT feed is all over the place. The algorithms will serve you a very wide range of content if you want that; the issue is that most people don't. They want to hear what they already think.

So while I 100% support changing algorithms to encourage more diversity of views, I also think as a society we need to question why people don't naturally want to listen to more perspectives. Personally I get so bored when people basically echo what I think. I want to listen to people who say stuff I don't expect or haven't thought about before. But I'm in a very significant minority.

woeirua 2 days ago

I might agree that the algos making recommendations on the sidebar might not matter much, but the algos that control which videos show up when you search for videos on Google, and also in YouTube search, absolutely do matter.

theossuary 2 days ago

The problem with this is that a lot of people have already fallen into these misinformation echo chambers. No longer recommending them may prevent more from becoming unmoored from reality, but it does nothing for those currently caught up in it. Only removing the channel helps with that.

hsbauauvhabzb 2 days ago

Algorithms that reverse the damage by providing opposing opinions could be implemented.

amanaplanacanal 2 days ago

Why would Google ever do that? People are likely to leave YouTube for some other entertainment, and then they won't see more ads.

hsbauauvhabzb 1 day ago

I agree. My point was that it is possible. Google would never do it without being forced.

squigz 2 days ago

I don't think those people caught up in it are suddenly like "oop that YouTuber is banned, I guess I don't believe that anymore". They'll seek it out elsewhere.

int_19h 2 days ago

If anything, these people see the removal of their "favorite" videos as validation - if a video is removed, it must be because it was especially truthful and THEY didn't like that...

theossuary 1 day ago

It's actually been shown many times that deplatforming significantly reduces the number of followers an influencer has. Many watch out of habit or convenience, but won't follow when they move to a platform with less moderation.

CobrastanJorji 2 days ago

Yeah, there are two main things here that are being conflated.

First, there's YouTube's decision of whether or not to allow potentially dangerous misinformation to remain on their site, and whether the government can or did require them to remove it.

Second, though, there's YouTube's much stronger editorial power: whether or not to recommend, advertise, or otherwise help people discover that content. Here I think YouTube most fairly deserves criticism or accolades, and it's also where YouTube pretends that the algorithm is magic and neutral and they cannot be blamed for actively pushing videos full of dangerous medical lies.

asadotzler 2 days ago

Why? Why is Google obligated to publish your content? Should Time Magazine also give you a column because they give others space in their pages? Should Harvard Press be required to publish and distribute your book because they do so for others?

These companies owe you nothing that's not in a contract or a requirement of law. That you think they owe you hosting, distribution, and effort on their algorithm is a sign of how far off course this entire discourse has moved.

stronglikedan 2 days ago

The problem is that misinformation has now become information, and vice versa, so who was anyone to decide what was misinformation back then, or now, or ever?

I like the term disinformation better, since it can expand to the unfortunately more relevant dissenting information.

3cKU 2 days ago

[dead]

terminalshort 2 days ago

The algorithm doesn't push anyone. It just gives you what it thinks you want. If Google decided what was true and then used the algorithm to remove what isn't true, that would be pushing things. Google isn't and shouldn't be the ministry of truth.

woeirua 2 days ago

Exactly, they shouldn't be the ministry of truth. They should present balanced viewpoints on both sides of controversial subjects. But that's not what they're doing right now. If you watch too many videos on one side of a subject, it will just show you more and more videos reinforcing that viewpoint, because you're likely to watch them!

terminalshort 1 day ago

Why should Youtube try to tell me what it thinks I should want to watch instead of what I actually want to watch? I'm not particularly interested in their opinion on that matter.

woeirua 1 day ago

Because search fundamentally requires curation. An algorithm has to "decide" which videos are most relevant to you. Otherwise, you'd be flooded with irrelevant results every time you make a search query.

TremendousJudge 2 days ago

"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?

terminalshort 1 day ago

Given the number of people that describes, it's pretty clear that people do want that. It's not exactly a new and surprising thing that people want things that are bad for them.

ggm 2 days ago

Without over-doing it, as a non-American not resident in the USA, it is so very tempting to say "a problem of your making" - but in truth, we all have a slice of this, because the tendency to conduct state policy by mistruths in the media is all-pervasive.

So yes. This is a problem rooted in the USA. But it is still a problem, and it's a problem for everyone, everywhere, all the time.

frollogaston 2 days ago

Still curious if the White House made them pin those vaccine videos on the homepage, then disable dislikes.

bluedino 2 days ago

I'm banned from posting in a couple subreddits for not aligning with the COVID views of the moderators. Lame.

croes 2 days ago

I was banned because a moderator misunderstood my single-word answer to another post.

Reddit bans aren't an indicator of anything.

c-hendricks 2 days ago

Whenever someone says "I was banned from ...", take what they say with a huge grain of salt.

int_19h 2 days ago

On Reddit, you can get banned from some subreddits simply because you have posted in another completely different sub (regardless of the content of the post).

It's not even always politics, although that's certainly a major driving force. But then you have really stupid fights like two subs about the same topic banning each others' members.

pinkmuffinere 2 days ago

Everybody here is a stranger online, so I think grains of salt are reasonable all around. That said, I'm not sure that people-who-were-banned deserve above-average scrutiny. Anecdotally, a lot of the RubyGems maintainers were banned a week ago. It seems really unfair to distrust people _just_ because a person-in-control banned them.

c-hendricks 2 days ago

Oof, I'm outside my edit window and didn't make my correct point. It's when people say "I was banned from _____ for _____". When people say "for _____" I take their word with a huge grain of salt.

Not even much to do with Reddit, it's something I picked up from playing video games: https://speculosity.wordpress.com/2014/07/28/the-lyte-smite/

qingcharles 2 days ago

The problem (?) with Reddit is that the users themselves have a lot more control over bans than on other social media, where it is the platform itself that does the banning. This makes bans much more arbitrary, even compared to Facebook et al.

frollogaston 3 hours ago

It's fine, but the voting system basically makes Reddit not a place for any kind of serious discussion. Which is also fine if you don't waste time expecting more from it.

mvdtnz 2 days ago

Reddit (both admins and many subreddit moderators) are extremely trigger happy with bans. Plenty of reasonable people get banned by capricious Reddit mods.

EasyMark 1 day ago

I was banned because I was simply in a covid sub debating with the covid-deniers. The "powers-that-be" mods literally banned anyone on that particular sub from popular subs, some of which I hadn't even been in, ever. There was (is?) a cabal of mods on there that run the most popular subs like pics/memes/etc who definitely are power-hungry basement dwellers that must not have a life.

frollogaston 4 hours ago

I was banned from a subreddit and then Reddit itself for intentionally and egregiously violating several of the rules

alex1138 2 days ago

Stop excusing it. It's a very real, very serious problem with Reddit. They're very much abusive on this and many other topics

frollogaston 2 days ago

The answer is to leave Reddit and let them have their echo chamber. There's no point of posting there anyway.

incomingpain 1 day ago

That's the funny thing about reddit. You can get banned trivially on the whim of a mod. I've been banned from multiple subreddits that I've never been to, simply because I posted on another subreddit that the mod found detestable.

My favourite: I'm trans/autistic. I was posting on r/autism being helpful. OP never mentioned their pronouns, just that they have an obgyn and feminine problems. I replied being helpful, but I misgendered them and they flipped out. They permabanned me from r/asktransgender, even though I never posted on it, then left me a pretty hateful reply on r/autism. Reddit admins gave me a warning for hate toward trans people, despite me never doing any such thing and being one myself.

Right about the same time r/askreddit had a thread about it being hard not to misgender trans people. So I linked this thread, including an imgur of the reddit admin warning. It went to like 30,000 upvotes. The r/autism mods had to reply saying they don't see any hate in my post and that people should stop reporting it.

Loocid 2 days ago

Eh, I was banned from several major subreddits for simply posting in a conservative subreddit, even though my post was against the conservative sentiment.

c-hendricks 2 days ago

Same; it happened to me after replying to a comment in the JRE sub. I think I was calling something / someone dumb. Coincidentally, that sub is openly against him now.

Tried clarifying this in another comment, my point was more that people who say "I was banned from X for doing something innocuous" are often not telling the whole truth.

petermcneeley 2 days ago

Arguing online about the merits of free speech is as paradoxical as having discussions about free will.

thrance 1 day ago

I think you have a shallow understanding of both free speech and free will if you think this is the gotcha you seem to think it is. Why couldn't people have discussions about free will in a deterministic universe? They could be woven by the laws of physics into having them.

As for free speech online, do you think there should be no limit to what can be said or shared online? What about pedophilia or cannibalism? Or, more relevantly, what about election denialism, insurrectionism, or dangerous health disinformation that is bound to make people act dangerously for themselves and society as a whole? Point is, free speech is never absolute, and where the line is drawn is an important conversation that must be had. There is no easy, objective solution to it.

petermcneeley 4 hours ago

There is an evolution from Luther to the Internet. But let's not pretend to know a reversal when we see it.

I also cringed at your list.

"what about election-denialism"

I don't think I can help you.

lupusreal 2 days ago

Prediction: nobody will be unbanned, because they'll all be found to have committed other bannable offenses. YouTube gives Trump a fake win while actually doing nothing.

st-keller 1 day ago

More speech! The signal-to-noise ratio shifts, so access to information will become more difficult. More disinformation and outright nonsense will make it harder to get to the valuable stuff. OK - let's see how that works!

throwmeaway222 1 day ago

I'm shocked at how often people flip-flop their arguments when discussing private entities censoring speech. It's frustrating because it feels like the only speech allowed today is right-wing commentary. When Democrats were in power, it seemed like only left-wing commentary was permitted. It's baffling that, despite our education, we're missing the point and stuck in this polarized mess.

bromuro 2 days ago

YouTube is like old-school television - at a different scale, they have to answer to politics and society. Our videos are their lineup.

cavisne 2 days ago

They should bring back the content too. When history books are written, the current state of things is misleading.

saubeidl 2 days ago

The world is going backwards rapidly. The worst people are once again welcomed into our now-crumbling society.

keeda 2 days ago

In other news (unrelated, I'm sure):

"DOJ aims to break up Google’s ad business as antitrust case resumes"

https://arstechnica.com/gadgets/2025/09/google-back-in-court...

TwoNineFive 2 days ago

They have a desperate need for false-victimhood.

Without their claim to victimization, they can't justify their hatred.

incomingpain 1 day ago

Canada has a tyrannical-style government that has been censoring speech. I had a discussion recently with a liberal who was arguing that it's a good thing the government is censoring the speech of its political opponents - that free speech comes with consequences.

My argument: free speech is a limit on the government. Give them as many consequences as you please, but not with government power.

That's the problem here. Democrats were using government power to censor their political opponents, and they wouldn't have been able to do it without government power.

EasyMark 1 day ago

I'm not sure why they would; it's kind of a dumb move. They aren't violating anyone's freedom of speech by banning disinformation and lies. It's a public service; those people can head on over to one of the many outlets for that stuff. This is definitely a black mark on YouTube.

ironman1478 2 days ago

There isn't really a good solution here. A precedent for banning speech isn't a good one, but COVID was a real problem and misinformation did hurt people.

The issue is that there is no mechanism for punishing people who spread dangerous misinformation. It's strange that it doesn't exist though, because you're allowed to sue for libel and slander. We know that it's harmful, because people will believe lies about a person, damaging their reputation. It's not clear why it can't be generalized to things that we have a high confidence of truth in and where lying is actively harmful.

asadotzler 2 days ago

No speech was banned. Google didn't prevent anyone from speaking. They simply withheld their distribution. No one can seem to get this right. Private corporations owe you almost nothing and certainly not free distribution.

ironman1478 2 days ago

In the article it mentions that Google felt pressured by the government to take the content down. Implying that they wouldn't have if it wasn't for the government. I wasn't accusing Google of anything, but rather the government.

Maybe it's not banning, but it doesn't feel right. Google shouldn't have been forced to do that, and really what should've happened is that the people who spread genuinely harmful disinformation - the injecting-bleach stuff, the ivermectin stuff, the anti-vax stuff - should've faced legal punishment.

reop2whiskey 2 days ago

What if the government is the source of misinformation?

ironman1478 2 days ago

It's interesting you say that, because the government is saying Tylenol causes autism in infants when the mother takes it. The original report even says more verification is required and its results are inconclusive.

I wouldn't be surprised if some lawsuit is incoming from the company that manufactures it.

We have mechanisms for combatting the government through lawsuits. If the government came out with lies that actively harm people, I hope lawsuits come through or you know... people organize and vote for people who represent their interests.

EasyMark 1 day ago

It certainly happens; we're currently flooded with it from the current regime:

- Tylenol causes autism

- Vaccines cause autism

- Vaccines explode kids' hearts

- Climate change is a hoax by Big Green

- "Windmill Farms" are more dangerous for the environment than coal

- I could go on but I won't

alex1138 2 days ago

Virtually all of the supposed misinformation turned out not to be that at all. Period, the end. All the 'experts' were wrong; all those we banned off platforms (the actual experts) were right.

alex1138 2 days ago

[flagged]

flohofwoe 1 day ago

Even more misinformation, Russian propaganda and bots to sift through in the recommendations and comments, got it!

jameslk 1 day ago

Misinformation, disinformation, terrorism, cancel culture, think of the children, fake news, national security, support our troops, and on and on. These will be used to justify censorship. Those who support it today may find out it's used against them tomorrow.

serf 2 days ago

I'd like to think that if I were a YTer who got banned for saying something I believed in, I would at least have the dignity not to take my value back to the group that squelched me.

...but I'm not a YTer.

TeMPOraL 2 days ago

It's showbiz. For those making actual money there, sacrificing dignity is the price of entry.

dev1ycan 1 day ago

Social media and a lack of scientific-research literacy are eventually going to prove fatal for modern society. Even with this Tylenol thing, on one side I have people who believe a study blindly without reading that it doesn't take several important variables into consideration and that more studies are needed, and on the other side I have people who didn't read the study at all saying it's impossible Tylenol could be causing anything because it's the only pain med pregnant women can take... a clear lack of understanding of how controlled trials work.

Same thing with the UFO "Alien" video that was "shot down" by a hellfire missile (most likely a balloon), people just automatically assume that because it was said in congress it has to be true, zero analysis whatsoever of the footage or wanting to seek analysis by an expert, nope, it must be an alien.

There is so much misinformation, so much lack of understanding, and so many people, from every side that just have complete and utter lack of understanding of how seemingly basic things work, I am afraid for the future.

But yeah! Let's unban unscientific sources, oh, and people who are okay with a literal coup against a democracy.

guelo 2 days ago

The number of flagged, hidden comments here by the supposedly anti-censorship side is almost funny.

dang 2 days ago

If you (or anyone) run across a flagged comment that isn't tediously repeating ideological battle tropes, pushing discussion flameward, or otherwise breaking the site guidelines, you're welcome to bring it to our attention. So far, the flagged comments I've seen in this thread seem correctly flagged. But we don't see everything.

On this site, we're trying for discussion in which people don't just bash each other with pre-existing talking points (and unprocessed rage). Such comments quickly flood the thread on a divisive topic like this one, so flagging them is essential to having HN operate as intended. To the extent possible at least.

(oh and btw, though it ought to go without saying, this has to do with the type of comment, not the view it's expressing. People should be able to make their substantive points thoughtfully, without getting flagged.)

https://news.ycombinator.com/newsguidelines.html

croes 2 days ago

Flagging isn't the worst that can happen; you could also be rate limited, which prevents you from answering in a discussion because of "you are posting too fast".

I know what I'm talking about.

dang 2 days ago

Yes, when accounts have a pattern of posting too many unsubstantive and/or flamewar comments, we sometimes rate limit them.

We're happy to take the rate limit off once the account has built up a track record of using HN as intended.

alex1138 2 days ago

Yeah, but in practice this isn't actually the case; people flag all the time just because someone has a dissenting opinion, fitting none of the categories you mentioned.

dang 2 days ago

As mentioned, I haven't seen cases of that in the current thread. If there are any, I'd appreciate links. We don't see everything.

braiamp 2 days ago

There's one comment literally spreading misinformation, and it isn't flagged; instead it got pushback from others, critically pointing out the weaknesses of its arguments.

3cKU 2 days ago

[dead]

valentinammm 2 days ago

[dead]

boxerab 2 days ago

tl;dr: The Biden Administration has been caught using government power to force Twitter, YouTube, and Facebook to censor its political enemies.

EasyMark 1 day ago

They never forced them, and they certainly never said "that's a nice merger you got there, it would be awful if something were to happen to it" per the current policies of the US government.

boxerab 1 day ago

Leftism is truly an inversion of reality - the current govt is not outsourcing censorship to do an end run around the 1A; the Biden admin did.

EverydayBalloon 1 day ago

[dead]

cindyllm 2 days ago

[dead]

cbradford 2 days ago

So absolutely no one involved will face any repercussions, and they will all do it over again at the next opportunity.

JumpCrisscross 2 days ago

> they will all do it over again at the next opportunity

Future tense?

asadotzler 2 days ago

They are mega-corporations. They always do whatever the hell they want, certainly absent your input. Did you really believe they don't do whatever they want? Because that's pretty damned naive.

johnnyanmac 2 days ago

yeah, 2025 in a nutshell. The year of letting all the grifts thrive.

lazyeye 2 days ago

What should the punishment be for having opinions the govt disagrees with?

Supermancho 2 days ago

Promoting medical misinformation or even health misinformation should be critically judged. Alternative health companies are rubbing their hands together.

The next Drain-o chug challenge "accident" is inevitable, at this rate.

cubefox 1 day ago

What is considered "misinformation" depends on whatever the censoring authority in question (e.g. Facebook or YouTube or some news website) _believes_ to be misinformation.

For example, in 2020, the WHO(!) Twitter account literally tweeted that masks don't work. That same statement would have been considered medical misinformation by a different authority.

Another example: the theory that Covid leaked from a lab in Wuhan which was known to do gain of function experiments on coronaviruses was painted as a wacky conspiracy theory by most of the mainstream media, despite the fact that many respectable sources (e.g. the CIA) later concluded that it has a significant amount of plausibility versus the alternative Wuhan wet market hypothesis which required that the virus somehow arrived there from a bat cave more than a thousand kilometres away.

lazyeye 1 day ago

That sounds great in theory. In practice, "misinformation" ends up being defined as anything the govt finds inconvenient. Or it is selectively applied, so that when misinformation comes from all sides of the political spectrum, only people the govt doesn't like (in the more general sense) get kicked off platforms.

th0ma5 2 days ago

Notoriety

lazyeye 2 days ago

Yep... and fame, admiration, contempt, loathing, indifference, etc.

oldpersonintx2 2 days ago

[dead]

jimt1234 2 days ago

[flagged]

najarvg 2 days ago

[flagged]

ch4s3 2 days ago

Far too many people are free speech hypocrites.

eschulz 2 days ago

who doesn't get free speech?

jimt1234 2 days ago

[flagged]

cptnapalm 2 days ago

[flagged]

SV_BubbleTime 2 days ago

[flagged]

apercu 2 days ago

[flagged]

apercu 2 days ago

[flagged]

guelo 2 days ago

[flagged]

paulryanrogers 2 days ago

The steelman argument is that it's better to know what liars, bigots, and other naughty people are up to than to push them entirely underground. And someday future moderators may think you're naughty/lying/a quack/etc.

IMO we should not let private platforms become near-monopolies, and certainly not without regulation, since they become a de facto public square. But if we're going to let them eat the world, then hopefully they'll at least use good judgment and measures like de-ranking or even banning folks who encourage others to do harm. Making bans temporary is a safety valve in case of bad moderation.

immibis 2 days ago

That steelman is still a pretty bad argument, though. I don't see why giving liars, bigots and other naughty people a megaphone is required in order to know what they're saying.

jimmygrapes 2 days ago

I suppose the argument there is that it's not necessarily a megaphone for the fella with 24 followers. The concern comes from when someone amasses a following through "acceptable" means and then pivots. Not sure how to balance that.

paulryanrogers 2 days ago

Yeah, I personally still see a place for permanent bans. But I can see the other side.

brokencode 2 days ago

Who gets to decide who’s naughty? One day it’s the Biden admin, and the next it’s the Trump admin. That’s the tough part about censorship.

You can leave it up to companies, but what happens when Trump allies like Elon Musk and Larry Ellison buy up major platforms like Twitter and TikTok?

Do we really trust those guys with that much power?

hash872 2 days ago

What is YouTube a 'near monopoly' in? Online video? Do you have any idea how much video there is online that's not on YouTube? They don't meet the legal definition of a monopoly.

bawolff 2 days ago

People change/make mistakes. Permanent bans are rarely a good idea.

ryandrake 2 days ago

Earlier in 2025, the video game Fortnite announced[1] that they were giving cheaters with lifetime bans a "second chance" and let them return to the game. Lo and behold, cheating in the game spiked up this year and has returned as a huge ongoing problem. Turns out, the vast majority of the bans were probably correct, and when you let people back into something who were banned for doing X, they're going to immediately start doing X again once they're back in.

1: https://www.fortnite.com/news/fortnite-anti-cheat-update-feb...

bawolff 2 days ago

Both these things can be true.

People deserve second chances every now and then. Many people squander their second chances. Some people don't.

stefantalpalaru 2 days ago

[dead]

dotnet00 2 days ago

Admittedly, Google was very heavy-handed with Covid censorship. Sure, there was a lot of genuine misinformation that maybe deserved it, but they also tended to catch a lot of actually qualified scientists engaging in scientific debate (say, arguing in favor of masks and the airborne-transmission theory in the early days), or even discussion that wasn't opposing the official stances.

Somewhat related, it's pretty insane how even to this day YouTubers have to avoid referring by name to a global multi-year situation that everyone alive at the time went through. It's due to advertisers rather than government pressure, but still, insane.

andy99 2 days ago

Yeah, at the time I got the impression they were banning dissent, not just egregious or dangerous content (whatever that even means). I thought most places had come to their senses a long time ago and walked back that heavy-handedness; I'm surprised this just happened.

layman51 2 days ago

Your point reminded me that around the time when the pandemic first started, I saw a YouTube video on physics titled something like "Corona and Arc Discharge" and it had the contextual note that is sometimes added to videos. I think the official name YouTube gives it is: "topical context in information panel". I thought it was a funny case where the automated system thought this physics video had something to do with COVID.

IncreasePosts 2 days ago

Merriam Webster defines con man as "a person who tricks other people in order to get their money : con artist"

Even if people were straight up wrong about their COVID-19 theories, I don't think many of the banned people were trying to get viewers to send them money.

mapontosevenths 2 days ago

> trying to get viewers to send them money.

They were trying to get viewers in order to get money. It's an important distinction.

heavyset_go 2 days ago

We both know that ads and sponsorships are a significant way influencers monetize their viewers.

All they have to do is lie to attract eyeballs and they make money. E-begging isn't necessary, the platforms allow you to extract value from viewers at an incredible scale.

jmyeet 2 days ago

First, let's dispense with the idea that anybody is a free speech absolutist. Nobody is. No site is. Not even 4chan is (ie CSAM is against 4chan ToS and is policed).

Second, some ideas just aren't worth distributing or debating. There's a refrain "there's no point debating a Nazi". What that means is there is a lot of lore involved with being a modern Nazi, a labyrinth of conspiracy theories. To effectively debate a Nazi means learning all that lore so you can dismantle it. There's no point. In reality, all you end up doing is platforming those ideas.

I'm actually shocked at how ostensibly educated people fall into the anti-vax conspiracy trap. Covid definitely made this worse but it existed well before then. Certain schools in San Francisco had some of the lowest child vaccination rates in the country.

As a reminder, the whole vaccine-autism "theory" originated from one person: Andrew Wakefield. He was a doctor in the UK who was trying to sell a vaccine. The MMR vaccine was a direct competitor, so he just completely made up the MMR link to autism. He lost his medical license because of it. But of course he found a receptive audience in the US. He is and always was a complete charlatan.

Likewise, the Covid anti-vax movement was based on believing random YouTube videos from laymen and, in many cases, an intentional ignorance in the most esteemed traditions of American anti-intellectualism. People who are confidently wrong about provably wrong things who had no interest in educating themselves. Some were grifters. Some were stupid. Many were both.

We had people who didn't understand what VAERS was (and is). We had more than 10 million people die of Covid, yet people considered the vaccine "dangerous" without any evidence of side effects, let alone death. As one example, you had people yelling "J'accuse!" at hints of myocardial inflammation from the vaccine. But you know what else causes myocardial inflammation? Getting Covid.

If you're excited by this move, it just further highlights that you have no idea what's going on and zero interest in the truth. What's happening here is big tech companies capitulating to the fringe political views of the administration - a clear First Amendment violation - to curry favor, get their mergers approved, get government contracts, and so on.

Regardless of your views on this or any other issue, you should care about capitulation by social media sites in this way.

The comments on this post are just a graveyard of sadness.

int_19h 2 days ago

The problem with those "ideas that just aren't worth it" is the usual one: who decides?

In my country of origin, you get called a Nazi simply for being opposed to the war of aggression it is currently engaged in. In the US, we have a long history of "terrorist" and "extremist" being similarly abused.

jmyeet 2 days ago

Do you think it's a good idea that this administration gets to decide what is and isn't acceptable speech? That's one of my points. So regardless of your positions on Covid and the 2020 election, you shouldn't celebrate this move, because the government shouldn't have this kind of influence.

int_19h 1 day ago

Oh, absolutely, I don't think this move by Google has anything to do with them being some kind of staunch free speech supporters. It's an obvious and rather pathetic attempt to suck up to the Trump administration, which itself is cancer when it comes to rights and freedoms. I'm no COVID denialist either.

I just don't think that "there's no point debating a Nazi" is, in general, a good argument in favor of censorship, whether public or private. It's one of those things that have a good ring to it and make some superficial sense, like "fire in the crowded theater", and then you look at how it works in the real world...

rob74 1 day ago

> Google's move to reinstate previously banned channels comes just over a year after Meta CEO Mark Zuckerberg said [...] that the Biden administration had repeatedly pressured Meta in 2021 to remove content related to COVID-19. "I believe the government pressure was wrong, and I regret that we were not more outspoken about it," Zuckerberg wrote in the August 2024 letter.

I'm sure Zuckerberg will say the same thing in 2029 too if the ruling party changes again. Until then, removing fact-checking and letting conspiracy theorists have their freedom of speech while suppressing voices critical of the current administration will make that change less likely...

reop2whiskey 2 days ago

Is there any political censorship scheme at this large a scale in modern US history?

rimbo789 2 days ago

Yes: the way the US government, big business, and the media - specifically Hollywood - colluded during the Cold War.

pcdoodle 2 days ago

So great to see the censorship apparatus in full swing on HN. Lots of great comments into the dust bin.

moomoo11 2 days ago

I think hardware- and IP-level bans... should be banned.

I know that some services do this in addition to an account ban.

ocdtrekkie 2 days ago

Any service which allows user generated content and allows arbitrary IP addresses to create infinite accounts is guaranteed to be overrun with CSAM. It's practically a law of physics.

jjk166 2 days ago

If you actually cared about CSAM, you would want those posting it to self-incriminate and then face consequences in real life at the hands of actual authorities. Websites banning such posters only serves to alert them that they need to improve their tactics and gives them the opportunity to hide. Removing only the offending content and alerting the authorities is the appropriate thing for a website like YouTube to do.

Even if one does argue that CSAM should result in hardware and IP bans, there's no reason that can't be a sole exception to a wider prohibition on such bans.

JumpCrisscross 2 days ago

> If you actually cared about CSAM you would want those posting it to self incriminate and then face consequences in real life

We don’t have the resources for this, even when the FBI isn’t being purged and sent to Home Depots. Unrestricting IPs means a boom for CSAM production and distribution.

ocdtrekkie 2 days ago

Yes, we should let people "self-incriminate" with Tor and disposable email services...

alex1138 2 days ago

So the other day, I linked to something on Rumble right here on Hacker News and was told to find a better source.

First of all, you can't separate a thing's content from the platform it's hosted on? Really?

Second of all, this is why

I'll just go do this again, and if you flag me it's on you; you have no standing to do it (the internet is supposed to be democratic, remember?).

https://rumble.com/v28x6zk-sasha-latypova-msc.-nsa-team-enig...

https://rumble.com/v3zh3fh-staggering-17m-deaths-after-covid...

https://rumble.com/vt62y6-covid-19-a-second-opinion.html

https://rumble.com/v2nxfvq-international-covid-summit-iii-pa...

I could go on. Feel free if you want to see more. :)

(Was it misinformation when Fauci said you shouldn't rush a vaccine or all hell breaks loose years later? Or when he intimated that masks wouldn't work for covid?)

braiamp 2 days ago

The reason you are asked for a better source is because, and let me say this slowly, anyone can post any crap on the internet without repercussions. Let's start with the one that references "Sasha Latypova". If I search her credentials, she earned a Master of Business Administration, which she used to co-found two companies, neither of them even adjacent to pharmacology, yet she is a "global PHARMA regulation expert". I'm sure the other people there won't have those issues, right?

The_President 1 day ago

“And let me say this slowly” No point in typing this out - it is condescending to the parent poster.

1121redblackgo 1 day ago

Equal and opposite reactions: if the parent poster is falsely that confident, then it's fair to meet them with strong condescension from the other side, is it not?

1121redblackgo 2 days ago

Boo