Apparently there is no limit, Joe remarked. Anything can be said in this place and it will be true and will have to be believed.
Flann O'Brien, The Third Policeman
The deception of whole peoples is not a matter which can be lightly regarded.
Arthur Ponsonby, Falsehood in War-Time
What do you do with dissent when there's no time for it because it's an emergency, but you live in a democracy that supposedly values debate? It's complicated! Take the Covid crisis, for example, which arrived with all sorts of unknowns begging for educated opinions and all sorts of rules begging for unwavering compliance. What do you do when some of the educated opinions are at odds with the unwavering compliance? Something's got to give.
Stanford University physician Jay Bhattacharya tweets statistical data showing that Covid almost exclusively threatens the elderly.
Twitter blacklists him.[1]
Another Stanford University physician, John Ioannidis, one of the world's leading epidemiologists, argues that the Covid response should be informed by data and questions the wisdom of the lockdowns.
YouTube removes the video.[2]
At a United States Senate committee hearing, Dr. Pierre Kory, Critical Care Service Chief at University of Wisconsin School of Medicine and Public Health, presents evidence of early treatment protocols having success against Covid in large-scale programs in India, Argentina, Peru, and elsewhere.
YouTube removes the video — a video of a Senate hearing.[3]
Dr. Robert Malone, a contributor to the invention of mRNA vaccine technology, calls for caution, saying the platform he helped create came with certain dangers.
Twitter bans him.[4]
Dr. Martin Kulldorff, Professor of Medicine at Harvard University and member of the CDC Covid-19 Vaccine Safety Technical Work Group, argues in favor of the J&J vaccine, saying it's safe to continue giving it to older people, after the CDC decided to suspend it for everyone. He also argues against vaccine mandates, particularly for those with natural immunity from prior infection.
For the former opinion, he is fired from the CDC. For the latter, he is fired from Harvard.[5]
Dr. Meryl Nass treats her Covid patients with ivermectin and criticizes her state governor's Covid policies on a radio program. Her patients do well; none are harmed; none complain.
At the behest of the governor's sister, the Maine medical board suspends Dr. Nass's medical license.[6]
An analysis of 325 post-vaccine autopsy findings by a team of highly regarded experts, including Harvey Risch, Yale University's Professor Emeritus of epidemiology, is posted on The Lancet's preprint server.
The Lancet abruptly withdraws the paper before peer review.[7] The paper is subsequently peer reviewed elsewhere and published by the Elsevier journal Forensic Science International, where, according to the Observatory of International Research, it becomes the top trending research paper worldwide across all subject areas.[8] A few weeks later, following anonymous complaints, it too is withdrawn and completely wiped from the journal's server.[9]
Dr. Peter McCullough, Consultant Cardiologist and Vice Chair of Internal Medicine at Baylor University Medical Center, with over a thousand peer-reviewed publications, argues that there is no medical reason for people with natural immunity from prior Covid infection to get the jab, and accurately notes the number of deaths reported in the CDC's own vaccine safety monitoring system, which makes no claims as to causality but is intended to signal potential problems.
The American Board of Internal Medicine – which relies on Dr. McCullough's publications for some of its educational and testing content – revokes his certification.[10]
These are just a few examples of what was evidently a concerted and uncompromising effort to manage information around Covid. Countless others could be listed.
Some will feel that this information-control program was a responsible public health intervention to save lives by minimizing dangerous disinformation and misinformation. Others will feel it was inappropriate censorship that not only ran contrary to democratic and scientific principles but was ultimately counterproductive to public health goals.
A person on either side of that divide could, would, and did call those on the other side "anti-science." The nice thing about that is that science has ways of showing who's got the better argument. We'll be taking a close look at some of those ways in theory and practice.
This book will neither advocate nor vilify any particular positions, because what am I, a doctor, a scientist, an engineer? Nope, nope, and nope. This book is about methods, which we can all understand. It's about the process by which we arrive at positions.
My starting point – my bias – centers on the principle that heterodox views and the debates that they entail are the lifeblood of both science and democracy. There are undoubtedly limits to that principle, but first things first. How is truth determined? Does a position become an orthodoxy by force of evidence or by evidence of force? We can test for that — by looking at the process.
In order to function, science and democracy both need competing ideas and transparent processes. One can't simply assert and demand; one must demonstrate and persuade. Yet, as we can see in the examples above, complex new issues that would seemingly invite thoughtful discussion and informed debate are instead almost instantly triaged into required and forbidden views – an absolutist binary in the mold of the "with us or against us" declaration with which George W. Bush launched the age of the permanent emergency.[11]
We do literally live in a state of permanent emergency, by the way, at least in the United States. It's official. Proclamation 7463 was declared after September 11, 2001, because of "the continuing and immediate threat of further attacks." It has been renewed annually ever since, by Democrat and Republican alike, spilling into putative domestic threats,[12] whittling away at civil liberties,[13] and by all appearances metastasizing into a general-purpose with-us-or-against-us mindset that exists in tension with democratic principles. Along with required and forbidden opinions comes the division of society into seemingly irreconcilable factions faced off in deep mutual disdain.
In the paradigm of the absolutist binary, transgressive views are considered not just incorrect, but literally intolerable. Proving dissent wrong would require something unacceptable: intellectual engagement. Dismissing it requires only pointing and laughing. Since not being a ridiculous laughingstock ranks pretty highly among most people's life goals, turning dissent into shame — tabooing dissent, in other words — is a highly effective deterrent. The required views are thereby shielded from serious scrutiny.
Thus the absolutist binary isn't just about identifying friends and enemies, it's also about terrifying people into conformity. Views on the wrong side of that binary can be met with severe social, professional, and financial consequences.[14] Goodbye hypotheses, data, forensic techniques, and peer review. Hello "On Narrative" (with us, intelligent, good, sane) and "Off Narrative" (against us, evil, stupid, mad).
It isn't just Covid. Court cases[15] and other documents[16] have demonstrated that media companies (social and otherwise) have actively established required and forbidden views on many thorny issues, from foreign wars to election interference, often at the behest of government agencies.[17] Of course, the official response is that agency actions aren't about squashing dissent or restricting anybody's legitimate free speech rights — they're about dealing with dangerous disinformation in the midst of a crisis.
The question is: Without dissent, discussion, and debate, how do you know it's disinformation?
The lab-leak theory of Covid's origins was initially denounced in the most withering possible terms as a contemptible conspiracy theory from the lunatic fringe. A person could get kicked off social media for suggesting the idea. Then it emerged that the FBI, the Defense Intelligence Agency, and the Department of Energy each assessed it to be a credible if not likely explanation.[18] [19] [20] Hunter Biden's famous laptop with its cache of incriminating documents was pure Russian disinformation according to the consensus of 51 former intelligence officials — 51, count 'em – until it was confirmed to be authentic after all.[21]
One might be tempted to make the argument that these turnabouts indicate that there's no problem, that the system is working: "When they get things wrong, they're soon corrected." Sometimes. However, the point isn't that they got things wrong. It's that they demonized dissent. The lab-leak theory might still be wrong, but it should never have been deemed a forbidden thought.
Demonization of dissent is an anti-democratic, anti-scientific practice that delays and at times surely prevents the determination of truth. It's also the single most defining characteristic of a dystopian society. Ask any science fiction writer.
It is therefore this tabooing of dissent that draws our scrutiny. It indicates that the normal, desirable procedures of science and democracy have been suspended. It has proven, at least sometimes, to correlate with unreliable official narratives. As John Stuart Mill put it in On Liberty: "All silencing of discussion is an assumption of infallibility."[22] And as we've seen, the would-be Ministers of Truth, like everyone else, are far from infallible.
So if an official narrative is being heavily promoted and dissent around it is tabooed as disinformation, maybe it's worth a closer look on general principle. Maybe the dissent really is disinformation. Maybe it's not. Maybe the word disinformation is being used as disinformation. How can we tell?
An obvious difficulty for us as lay people is that understanding whether something is disinformation or not often requires expertise that is far beyond our capabilities. We don't know who did or didn't commit a chemical attack. We weren't there. We didn't even receive any of the soil samples. We don't know whether a cheap generic medicine is as good or better than a profitable patented new drug. We don't know how to begin to navigate the specialized terrain of clinical trials, observational studies, conflicts of interest, or corporate capture of nominal regulators.
All we know usually is that somebody is offering evidence of something. Should we believe it? Is it accurate? Does it mean what they say it means? Is there missing context? Were there unstated conflicts of interest or built-in biases? Are there valid counter-arguments?
Who the hell knows! We know we want assertions to be backed by evidence, but it turns out evidence often isn't of much use to us. So how can we navigate this minefield of science, pseudo-science, lunacy, and bullshit? Trust the experts and fact checkers? Which ones? Sponsored by whom? We desperately need a reliable way, grounded in our own thinking, to tell reality from delusion in the post-truth world. That's why we'll focus not on evidence we can't evaluate, but on methods we can.
Truth may fail, but lies leave a trail.
Nonsense is detectable. We can tell when there's a rabbit off because we know what's supposed to happen and not happen in science. The scientific method comes with certain standard expectations: empiricism, transparency, reproducibility, independent analysis, and the need to account for all of the evidence. It involves the welcoming of multiple hypotheses in order to avoid confirmation bias. It is explicitly opposed to political and religious dogma as barriers to the demonstration of truth.
There are other things the scientific method explicitly does not involve. It doesn't involve hiding, distorting, falsifying, inventing, ignoring, or cherry-picking data. It doesn't involve rigging studies to ensure a desired outcome. It doesn't involve accusing dissenters of heresy and demonetizing their YouTube channels.
These are the sorts of things we can fruitfully examine in order to get a better idea of the reliability of science-based positions on either side of a binary divide. We don't need to be subject experts. We're all capable of making judgments about a process when we know how it's generally supposed to go.
Let's practice.
There's a scientific dispute. One side says: "Your conclusion isn't consistent with the evidence we have right here. Please have a look and let's see if we can resolve the discrepancies."
If the other side comes back with "I see, I see — good heavens you're right, it's back to the drawing board!" or "Hmm, no, my friend, I'm afraid it's you who have made the mistake — you've miscalculated the Cabibbo angle!" — great. Either way. Science is happening.
But if the other side comes back with: "I'm calling the gestapo! Say goodbye to your fingernails!" — or with unsupported allegations, ad hominem attacks, or mindless pejoratives — then we're not talking science any more, are we? We still might not have the truth quite in hand, but we'll certainly have a scent to follow.
If a narrative is constructed with bad science, no matter how aggressively it is promoted, no matter how many people believe it, it will remain fundamentally flimsy.
The key to evaluating assertions, as Bertrand Russell has argued in his seminal treatise on propaganda,[23] is to develop a mental state of "critical undogmatic receptiveness," also known as an actively open mind. Before we can cultivate that elusive state, we mainly need one other thing first: We have to care whether things are true or not. I mean actually care.
Do we? It's not that easy. The usual thing is to go binary and keep to our own people: the stuff my people believe is true-good-sane, and the stuff those other people believe is false-evil-nuts. We're with us, and they're against us, and that's all we want to know. Whatever we want to believe, we know we can always find a link to some article that supports it. We know the other side can too, but their links are stupid and ours are real.
Wine and chocolate are good for me? It's true and I'm done researching. Those are my go-to studies. I'll point to them no matter what anyone else says. Are those things really true? I don't know. I don't want to know. They might be true. They're true enough for me to stop looking. They're true enough to maintain my happy place.
This is confirmation bias. Confirmation bias is trying to prove yourself right. Science – truth-seeking – is trying to prove yourself wrong.
As noted above, documents from the Twitter Files and other sources have revealed that government pressure on media plays an active role in narrative management. The hard-to-control online frontier of social media and alternative news sources seems to have driven a recent turn toward forms of censorship and propaganda more overt than had previously been necessary.
Arm-twisting isn't often necessary for the legacy media. Consent, as Noam Chomsky and Edward Herman have explained, is manufactured by culture.[24] It's the air we breathe. From early education through the mechanisms of career advancement, the culture "selects for obedience and subordination," as Chomsky told journalist Andrew Marr. When Marr objected that he himself as a BBC journalist could say anything he wanted, Chomsky famously replied: "I'm sure you believe everything you're saying. But what I'm saying is if you believed something different, you wouldn't be sitting where you're sitting."[25] No instructions, no conspiracy required.
This kind of cultural selection for obedience isn't just for journalists. It's for all of us. Narrative conformity is constantly reinforced as normal, good, intelligent, sane. Dissent is for morons and lunatics. These messages constitute what Lacan called the "symbolic order" of a culture, and (to varying degrees) we internalize them. Lacan's "big Other" is the figure that represents total internalization of all official narratives – the perfect non-transgressive consumer of PR and propaganda. Most real people diverge, at least privately, at least a little bit, and hold little shadow views that in turn hold some of the best potential for social change.
An internalized belief is one that needs no overt enforcement and inspires no rebellion. Examining narratives for truth content means allowing for the possibility that propaganda might not be something that only happens to other people. It means allowing for the possibility that some of our internalized beliefs about smart-sane and stupid-loony might be backwards. It means allowing for the possibility of deviating from one's tribe in a particular matter. It means having a bit of courage and a taste for adventure.
Why would we want to do that? We could start with President Eisenhower's unsettling farewell address, which he devoted largely to warning us about undue influence from military, corporate, scientific, and technocratic elites, urging us to "take nothing for granted" because only "an alert and knowledgeable citizenry" can see that security and liberty prosper together.[26]
Our motivation also comes from the awareness that, while we certainly don't want to end up agreeing with any nut jobs or odious ideologues, we also don't want to end up being credulous suckers for authoritarian hucksters. Surely our safest, most bulletproof path is to do our best to make sure we're not buying any pigs in a poke from anybody. How do we do that? We listen to Bertrand Russell. We employ critical undogmatic receptiveness.
This book is going to try a bit of that. We'll examine controversial case studies where the official narrative is so tightly woven into the common-sense fabric of society that dissent is only for morons and lunatics. These are the issues where the case in favor of limits to free speech should be at its most compelling, where the dismissal of dissent as disinformation should be most clearly justified. These will be issues with science at the core, so that we can compare what did happen — how the competing positions were determined — with what we know should have happened: transparency, independent review, and competing justifiable hypotheses treated dispassionately in an honest search for truth.
The question we will try to answer is not which position is correct in any given case. Again: we're not subject experts. Rather, we're interested in the question of whether either or both positions have diverged from the normal procedures of science. If we can't tell, hey: toss a coin. Go with your people. If we can tell, well, hopefully any funny business isn't on the side we'd prefer to believe. If it is, we'll find ourselves at a crossroads. A crisis of intellect, and of conscience. It'll be exciting!
The looking-at-methods method should be able to go some way towards separating the wheat from the chaff. If the nut job position is nutty, it should show exactly why. If the official position doesn't hold up, it should show that too.
The idea is this: If even these universally derided positions can produce somewhere among them a single surprisingly sound argument in contrast to all the messaging, we'll have ample reason to interrogate the messengers. If the emperor has no clothes, we're going to need to have a serious conversation with all the townspeople who told us otherwise.
No futile attempt to hide authorial biases will be made, but the conclusions that can be drawn from this exercise will of course be your own (my own "conclusion" chapter notwithstanding). You might decide that all the right things have happened; the symbolic order has structural integrity and all we have to do is manage our disinformation nut jobs properly.
You might decide that, in at least some cases, we've strayed from the path of science and democracy. If that happens often enough, in important-enough ways, it might indicate that our society isn't quite what it says on the tin, that perhaps Eisenhower wasn't just being a big old drama queen. It might prove to be a good argument for slowing down and taking time for thought, discussion, debate, and democracy before we yield to knee-jerk binary absolutism with all its tribal divisiveness and potential for violence.
Bearing in mind all of the above, as you proceed through the rest of this slim volume I have just one request:
Stop me if I say something crazy.
References
ACLU, undated. Surveillance Under the Patriot Act. American Civil Liberties Union. https://www.aclu.org/issues/national-security/privacy-and-surveillance/surveillance-under-patriot-act
Barnes, Julian E., 2023. "Lab Leak Most Likely Caused Pandemic, Energy Dept. Says." The New York Times. February 26, 2023. https://www.nytimes.com/2023/02/26/us/politics/china-lab-leak-coronavirus-pandemic.html
Bush, George W., 2001. "President Bush Addresses the Nation [full text of his public address to a Joint Session of Congress]." Washington Post. September 20, 2001. http://www.washingtonpost.com/wp-srv/nation/specials/attacked/transcripts/bushaddress_092001.html
Chomsky, Noam and Edward S. Herman, 1988. Manufacturing Consent: The Political Economy of the Mass Media. New York: Knopf Doubleday.
Eisenhower, Dwight, 1961. President Eisenhower's Farewell Address. https://www.archives.gov/milestone-documents/president-dwight-d-eisenhowers-farewell-address
Gerken, Tom, 2024. "Zuckerberg regrets bowing to Biden 'pressure' over Covid." BBC News. https://www.bbc.co.uk/news/articles/czxlpjlgdzjo
Greenwald, Glenn, 2024. SCOTUS Protects Biden Administration's Social Media Censorship Program from Review. Video Transcript. https://greenwald.locals.com/post/5804293/cnn-s-kasie-hunt-has-humiliating-meltdown-ahead-of-biden-trump-debate-scotus-protects-biden-admin.
Hulscher, Nicolas, et al., 2023. A Systematic Review of Autopsy Findings in Deaths after COVID-19 Vaccination. Preprints with The Lancet. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4496137
Hulscher, Nicolas, et al., 2024. "A systematic review of autopsy findings in deaths after Covid-19 vaccination." Forensic Science International: 112115. https://www.sciencedirect.com/science/article/pii/S0379073824001968
Johnson, Ron, 2021. "YouTube Cancels the U.S. Senate." Wall Street Journal. February 2, 2021. https://www.wsj.com/articles/youtube-cancels-the-u-s-senate-11612288061
Kessler, Glenn, 2021. "Timeline: How the Wuhan lab-leak theory suddenly became credible." Washington Post. May 25, 2021. https://www.washingtonpost.com/politics/2021/05/25/timeline-how-wuhan-lab-leak-theory-suddenly-became-credible
Kulldorff, Martin, 2024. "Harvard Tramples the Truth." City Journal. https://www.city-journal.org/article/harvard-tramples-the-truth
Levin, Sam, 2019. "Revealed: FBI Investigated Civil Rights Group As 'Terrorism' Threat And Viewed KKK As Victims." The Guardian. February 1, 2019. https://www.theguardian.com/us-news/2019/feb/01/sacramento-rally-fbi-kkk-domestic-terrorism-california
Marr, Andrew, 1996. "Interview with Noam Chomsky." The Big Idea (BBC). February 1996. https://www.youtube.com/watch?v=GjENnyQupow&t=557s
Matza, Max and Nicholas Yong, 2023. "FBI chief Christopher Wray says China lab leak most likely." BBC News. March 1, 2023. https://www.bbc.co.uk/news/world-us-canada-64806903
Mill, John Stuart, 2011 (1859). On Liberty. London: Project Gutenberg (Originally John W. Parker and Son). https://www.gutenberg.org/files/34901/34901-h/34901-h.htm
Mordock, Jeff, 2024. "FBI agent confirms authenticity of Hunter Biden's laptop." The Washington Times. June 4, 2024. https://www.washingtontimes.com/news/2024/jun/4/erika-jensen-confirms-authenticity-of-hunter-biden/
Niemiec, Emilia, 2020. "COVID-19 and misinformation." EMBO reports 21 (11): e51420. https://doi.org/10.15252/embr.202051420
OIR, 2024. "Observatory of International Research." https://ooir.org/
Othot, Seamus, 2023. "Dr. Meryl Nass, Top Critic of Mills' COVID Policies, Sees Medical License Suspension Extended." The Maine Wire. December 15, 2023. https://www.themainewire.com/2023/12/dr-meryl-nass-top-critic-of-mills-covid-policies-sees-medical-license-suspension-extended/
Russell, Bertrand, 2014 (1922). Free Thought and Official Propaganda. Project Gutenberg.
Shir-Raz, Y., et al., 2022. "Censorship and Suppression of Covid-19 Heterodoxy: Tactics and Counter-Tactics." Minerva: 1-27.
Swanson, Bret, 2023. "Covid Censorship Proved to Be Deadly." Wall Street Journal. July 7, 2023. https://www.wsj.com/articles/covid-censorship-proved-to-be-deadly-social-media-government-pandemic-health-697c32c4
Taibbi, Matt, 2023. The Censorship-Industrial Complex. The Twitter Files. March 9, 2023. https://twitterfiles.substack.com/p/the-censorship-industrial-complex.
World Council for Health, 2022. "World Council for Health Stands with Dr Peter A. McCullough, MD, MPH." https://worldcouncilforhealth.org/news/statements/peter-mccullough/