Source Feed: Walrus
Author: Timothy Caulfield
Publication Date: July 17, 2025 - 06:30
Before You Share That Health Tip, Read This

With each passing day, our information environment is becoming more chaotic. It is filled with lies and spin and rage. I’ve spent decades studying how health and science are represented in the public sphere. And it has never been as bad as it is right now. Here are my current top ten concerns keeping me up at night.
1. Health misinformation increasingly linked to political identity
This is, by far, the biggest challenge. People’s positions on everything from raw milk and miracle supplements to reproductive care, gender, and diet are often shaped by how they vote. In fact, when it comes to specific flashpoints, such as vaccines or a debunked COVID-19 therapy like ivermectin, partisan leanings are a strong predictor of belief and behaviour. And this matters. Once a topic becomes a proxy for political principles, the discourse becomes nastier, and it is more difficult to change minds.
2. A decrease in trust driven by the spread and embrace of misinformation and disinformation
Science communicators, health experts, researchers, and clinicians should remain open to criticism, learn from mistakes, and adapt in light of new evidence. But the bulk of distrust currently shaping the discourse around health isn’t the result of blunders by authorities (which, of course, happen). It stems from the spread of falsehoods, both deliberate and unintentional, that take root in the public imagination. As summarized in a recent editorial in the journal The Lancet, health disinformation has “become a deliberate instrument to attack and discredit scientists and health professionals for political gains.” And the more distrust deepens, the more difficult it becomes to find voices the public can trust to challenge the lies.
3. The rise of misinformation about misinformation
Efforts to understand or call out the harms of misinformation are being maligned, often by suggesting they are a thinly veiled attempt to erode freedom of expression. Indeed, this argument has been used to justify the cancellation of research funding and fact-checking efforts in the US. The reality, of course, is that misinformation is a fantastically complex phenomenon that demands careful study, and most efforts to counter it constructively happen within the marketplace of ideas (e.g., critical thinking, prebunks, debunks, fact checks, accuracy nudges, etc.). This work is not focused on silencing anyone. As recently noted in a Time article by misinformation researchers Sander van der Linden and Lee McIntyre, “Empowering people to spot manipulation—irrespective of the issue—is the opposite of censorship.”
4. The elevation and leveraging of bad and fake science
Our information environment has become polluted with bad science published in predatory and vanity journals. These operations can appear legitimate—and, as a result, can fool politicians, the media, the public, and, even, clinicians—but they don’t adhere to the usual thresholds required for academic outlets (such as an independent editorial board and high-quality peer review). One of the best examples of this problem was Robert F. Kennedy Jr.’s reference, during his Department of Health and Human Services confirmation hearings, to a “study” that allegedly demonstrated (it didn’t) a connection between vaccines and autism (there isn’t any). It was published in a “journal” that looked impressively scholarly-ish but, in reality, is nothing more than an antivaxx blog.
5. The embrace of false balance and the normalization of the extreme
Our information environment has become one huge false-balance machine. Fringe and disproven ideas are being elevated—by social media, politicians, and popular podcasters—and presented as having equal weight with large and robust bodies of evidence. Studies have shown this kind of bothsidesism can increase the spread of misinformation, duping the public about what the science on a topic actually says and normalizing extreme and harmful lies. (Why is the US government researching the long-debunked connection between vaccines and autism?)
6. The demonization of scientific consensus
Scientific consensus is not a metanarrative pushed by secret elites with an evil agenda. It is the collective judgment of experts, often drawn from hundreds of independent studies carried out using a variety of methods. Yes, we must always be open to contrarian views (so long as they are supported by actual science). And, yes, the scientific and health care communities are often slow to consider new findings (e.g., COVID-19 is airborne). But it would be paralyzing if we derided the accumulated evidence on a given topic, especially when that foundation is broad, rigorously tested, and robust. Ask yourself this: Why is the scientific consensus demonized only for political wedge issues (vaccines, climate change, etc.) and not in the context of bridge building, aircraft design, and household plumbing?
7. Smear campaigns
Harassing, bullying, and trying to discredit science-informed voices doesn’t simply stir up doubt. It also chills public discourse. A hostile environment can drive experts out of the conversation, making it less likely accurate perspectives will reach a wider audience—which, of course, is often the goal of trolls: silence the truth. But now, more than ever, we need contributions from a range of communities and disciplines in the fight against harmful bunk.
8. The growing dominance of fear mongering
Negativity bias—our hardwired tendency to remember and believe the enraging and scary things—drives much of our information environment. This has not only made us more anxious (I’ve argued we are now living in a feardemic) but it has also facilitated the spread of misinformation. Indeed, as noted in a summary of the relevant literature by the American Psychological Association, “people are more likely to believe false statements that appeal to emotions such as fear and outrage.” Fear works particularly well when it is coupled with an alarming anecdote. One frightening story (true or not) can overwhelm our ability to think scientifically.
9. The use of AI
AI is going to make everything on this list more challenging. The production of convincing content can be done quickly, in a targeted manner, and that content can then be rapidly shared across multiple platforms. And research tells us that while individuals think they can detect AI-generated content, they mostly can’t. I’m also worried about the use of AI to create fake but realistic-looking scientific studies. This fabricated research is already populating scientific databases and can be used to legitimize harmful health misinformation.
10. The lack of meaningful action
There is good news. As more studies explore the nature and drivers of health misinformation, we discover better ways to counter it. We are also learning about the complex and interrelated factors that can make individuals and communities vulnerable. But, in general, not enough is being done on the policy front. We’ve seen social media platforms pull back efforts to counter misinformation and the US government denigrate and defund misinformation research. We are moving in the wrong direction.
Stopping misinformation requires a coordinated push from every corner of society: researchers, policy makers, educators. No single fix will do it. But we can hold the line. We don’t have to let the lies take over.