Emergency managers have spent decades preparing for hurricanes, floods, and wildfires. But how many of you have prepared for the possibility that someone is actively working to make those disasters worse?

And I don't mean by worsening the physical impact of the disaster itself, although that does happen, but by spreading information designed to make the response fail.

The traditional view of disaster misinformation treats it as a communications problem: confused citizens share bad information, rumors spread faster than facts, and agencies scramble to correct the record. Heading into 2026, that framing is now incomplete. What happened during Hurricane Helene in late September 2024, during the Valencia floods the following month, and during the LA wildfires in early 2025 suggests something far more deliberate. Misinformation during disasters is no longer just a byproduct of chaos. It has become a weapon, and in some cases, adversarial actors are pulling the trigger and degrading your ability to respond.

What I've learned in recent weeks, through travel and workshops in different countries, is that the emergency/crisis management profession needs to stop treating this as a public affairs challenge and start recognizing it for what it is: a gray zone operation that exploits the most vulnerable moments in civilian governance.

The pattern is becoming hard to ignore

When Hurricane Helene struck the southeastern United States in September 2024, the usual post-disaster rumors appeared within hours. False claims that FEMA would seize survivor property. Reports that disaster relief funds had been diverted to support migrants. A widely circulated narrative that the agency would only provide $750 to affected families and nothing else, full stop.

These weren't organic misunderstandings. According to the Institute for Strategic Dialogue, 33 posts on X containing claims already debunked by FEMA and the White House had generated more than 160 million views by mid-October. The volume was significant. But what made Helene different was the operational impact. FEMA had to pause door-to-door outreach in parts of North Carolina after credible threats against aid workers. FEMA's acting associate administrator for response and recovery acknowledged that misinformation was "reducing the likelihood that survivors will come to FEMA with a trusted way to register for assistance". For the first time in recent memory, misinformation didn't just confuse the public. It physically stopped responders from doing their jobs.

And in 2026, that word “trust” is more important than ever.

A month later, catastrophic floods killed more than 230 people near Valencia, Spain. The Spanish fact-checking organization Maldita identified 112 separate disinformation narratives circulating about the disaster, from claims that HAARP weather weapons caused the flooding to false reports that hundreds of bodies were hidden in underground parking garages. Spain's 2024 national security report later confirmed what investigators already suspected: pro-Kremlin actors had exploited the disaster to erode trust in democratic institutions. The initial distribution of key false narratives was traced through Pravda-linked channels and websites connected to Russian propaganda operations.

The objective, according to Maldita's analysis, was not to spread confusion about the floods themselves. It was to say: "Look, your King and your democracy don't work".

Then came the LA wildfires in January 2025, where false claims about government lasers causing the fires circulated alongside misleading narratives about firefighter failures and political blame. In July 2025, the Texas floods were attributed to cloud seeding companies, resulting in death threats against those firms' executives.

Each disaster. The same playbook. Exploit the information vacuum. Amplify institutional distrust. Turn natural catastrophe into political crisis.

Now, government entities have their own problems maintaining the public's trust, and they most definitely need to improve their services and delivery to the public. But this is deliberate targeting of the one thing institutions need in order to serve the public: trust.

Why disasters are ideal gray zone terrain

The concept of the gray zone, that ambiguous space between peace and open conflict where adversarial actors apply pressure without crossing military thresholds, has typically been discussed in terms of cyber operations, election interference, and critical infrastructure sabotage. But disasters create conditions that gray zone actors can only dream of manufacturing on their own.

Consider what a major disaster delivers: populations in genuine distress, authorities stretched beyond capacity, communication infrastructure degraded, trust already fragile, and intense public demand for immediate answers that governments cannot yet provide. The information vacuum that follows a catastrophe is not a gap to be filled. It is a vulnerability to be exploited.

This is what separates adversarial disinformation from ordinary rumor. Organic misinformation tends to be local, self-correcting as facts emerge, and driven by genuine (if misguided) concern. Weaponized disinformation during disasters is designed to prevent self-correction, to embed false narratives so deeply that official corrections become evidence of cover-up rather than clarification.

The EU has recognized this dynamic at the strategic level. The European External Action Service's Foreign Information Manipulation and Interference framework explicitly identifies Russia and China as actors who exploit crises to destabilize democratic societies. France tracked nearly 80 Russian disinformation campaigns between August 2023 and March 2025. The Pravda ecosystem alone was found to operate over 190 websites targeting 45 countries with automated pro-Russian content, with particular intensity in countries of geopolitical interest to Moscow.

But recognition at the strategic level hasn't translated into preparedness at the operational level. Emergency managers are still working with communication plans designed for correcting honest mistakes, not countering deliberate information operations. The tools are wrong because the threat model is wrong.

The gap between what happened and what was planned for

Here's the operational reality. Most emergency communication protocols follow a reactive model: monitor social media, identify false claims, issue corrections. FEMA's Hurricane Rumor Response page during Helene was a good example of this approach. Professionally executed. Well sourced. And completely outpaced by the scale of what it was facing.

The problem isn't execution. It's architecture. Fact-checking works when false information is accidental and correctable. It fails when the objective of the disinformation campaign is to make people distrust the fact-checkers themselves. When corrections are framed as government propaganda, the standard playbook doesn't just underperform. It feeds the cycle it's trying to break.

This is the same structural mismatch identified in our earlier analysis of gray zone operations across European civilian infrastructure: single-domain response systems overwhelmed by multi-domain pressure. An emergency management agency trying to fight an information warfare campaign with press releases and rumor response pages is operating in the wrong domain with the wrong tools at the wrong speed.

The World Economic Forum named misinformation and disinformation as the number one global risk in its 2025 Global Risks Report, for the second consecutive year. Meta announced the end of its third-party fact-checking programs, and X rolled out its Community Notes feature, which has had some impact. And platform algorithms continue to reward engagement over accuracy, creating economic incentives for producing and distributing false content during precisely the moments when accurate information matters most.

If you're finding value in this article, there's more where that came from. We created a free toolkit to go along with it, packed with practical insights you can start applying today.
