I. Introduction: The Leak That Wasn't a Hack
In early 2025, headlines broke about a military operation leak involving a Signal group chat. National Security Advisor Mike Waltz and other senior Trump administration officials had accidentally added Jeffrey Goldberg, editor-in-chief of The Atlantic, to an encrypted Signal thread that included sensitive operational discussions about a planned U.S. strike on Houthi targets in Yemen. Goldberg, upon realizing the magnitude of what he was reading, published the contents. What might have once sparked resignations and internal purges instead prompted shrugs and spin. President Trump affirmed his support for Waltz. No one was fired.
The moment was absurd. But it wasn’t an accident. Not really.
Instead, it was the latest expression of a decades-long systemic pattern: the slow erosion of institutional discipline around digital communication and classified information. This wasn’t the first time signals leaked. And if history is any guide, it won’t be the last. In fact, the deeper you dig, the more clearly you see: Signalgate caused Signalgate.
II. 2001–2003: The Origin of Total Signal Capture
The story starts in the chaos of the post-9/11 world. After the terrorist attacks of September 11, 2001, the United States launched the "Global War on Terror." In tandem came sweeping domestic security reforms. Chief among them: the USA PATRIOT Act, passed in October 2001. It granted intelligence agencies unprecedented authority to surveil communications, both foreign and domestic.
Within this new legal context, the NSA developed and deployed covert data interception programs targeting the major arteries of the U.S. telecommunications infrastructure. The most notorious implementation was Room 641A, a secret facility built inside an AT&T switching center at 611 Folsom Street in San Francisco.
Technician Mark Klein, who worked for AT&T at the time, discovered in 2003 that a splitter had been installed on the fiber optic lines, duplicating all internet traffic and directing it into a secure room staffed by NSA-linked personnel. In 2006, Klein blew the whistle, revealing a domestic surveillance dragnet operating without warrants or public knowledge. The equipment in Room 641A, including Narus deep-packet inspection gear, wasn’t sifting for threats. It was capturing everything.
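To make the architecture concrete: what a splitter-fed capture room does is, in principle, no more exotic than the toy sketch below. This is not the Narus system, whose internals were never made public; it is a minimal Python illustration (Linux-only, root required, written for this essay) of indiscriminate capture: every frame on the wire, duplicated to disk, no filter, no threshold.

    import socket

    # ETH_P_ALL: ask the kernel for every frame on the interface,
    # regardless of protocol. The logic of Room 641A in one constant.
    ETH_P_ALL = 0x0003
    tap = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(ETH_P_ALL))

    with open("capture.bin", "ab") as store:
        while True:
            frame, _ = tap.recvfrom(65535)
            # No sifting for threats: everything is written first,
            # and questions are asked later (or never).
            store.write(len(frame).to_bytes(4, "big"))  # length-prefixed record
            store.write(frame)

The point of the sketch is the absence of any selection step: the capture loop has no notion of a target, only of volume.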
This was the first Signalgate: a structural breach, sanctioned at the highest levels, embedded in infrastructure, and exposed not by attackers, but by its own internal contradictions. The logic was simple: if you can see everything, you can prevent anything. The system mistook visibility for control.
III. 2006–2013: Simulation Becomes Doctrine – Cyber Storm and the Normalization of Collapse
Around the time Klein went public, another development was quietly reshaping the U.S. approach to cyber and digital security: Cyber Storm.
Launched in 2006 by the Department of Homeland Security (DHS), Cyber Storm was a series of national-scale cyber defense exercises. The first iteration simulated coordinated attacks on critical infrastructure: power grids, transportation systems, communication networks. Federal agencies, state governments, private sector partners, and international allies all participated.
What began as a tabletop exercise quickly evolved into a complex sociotechnical rehearsal. Cyber Storm scenarios grew to include misinformation campaigns, public panic management, zero-day exploits, and supply chain interference. But underneath the official goals was a deeper dynamic:
Cyber Storm wasn’t just simulating attacks. It was testing the system’s internal fragility.
Participants frequently failed to communicate across silos. Classification slowed response times. Jurisdictional confusion reigned. Most crucially, participants often resorted to backchannel communications when formal systems broke down.
Rather than eliminate these weaknesses, Cyber Storm institutionalized them. By 2013, it was clear that the exercises weren’t building resilience so much as habituating the system to dysfunction. The logic was: yes, failure will happen—just make sure it’s survivable.
This subtle cultural shift had a profound effect. It set the stage for a government that would routinely tolerate informal workflows, workarounds, and degraded security practices as the cost of doing business. It trained operatives not to prevent collapse, but to improvise through it. The playbook wasn’t "defend the system." It was "keep performing as it fails."
IV. 2010–2013: Snowden and the Crisis of Trust
If Room 641A exposed the architecture of surveillance, the early 2010s exposed its operational interior. Enter Edward Snowden.
In 2013, the former NSA contractor leaked an unprecedented cache of classified documents detailing global surveillance operations. Programs like PRISM, XKEYSCORE, and Upstream revealed the staggering scope of data collection. Snowden’s leaks confirmed that the logic of Room 641A had been scaled globally.
But these exposures didn’t lead to lasting reforms. While they sparked public outcry, lawsuits, and legislative debates, the core infrastructure remained intact. What changed was the system’s orientation:
After Snowden, the U.S. surveillance apparatus didn’t contract. It adapted.
It became better at insulating itself from leaks. But also more fatalistic about them. It learned to reframe exposure as a failure of loyalty, not infrastructure. More training, more clearance protocols, more private contractors under tighter NDAs. But the core assumption—that the state can and should know everything—remained untouched.
V. 2014–2024: Informal Systems, Formal Power
Following Snowden, there was a wave of security hardening across agencies. But over time, cracks reappeared. Bureaucratic friction and siloed operations led to increased reliance on personal phones, encrypted apps, and cross-platform collaboration tools.
Encrypted messaging apps like Signal and WhatsApp became unofficial backbones for sensitive coordination. From disaster response teams to national security staffers, encrypted chats became the place where real-time coordination actually happened.
These apps provided speed and plausible deniability. But they also blurred the boundary between official and unofficial conduct. The institutional message became: if it works, use it. If it leaks, we’ll survive it.
Meanwhile, Cyber Storm exercises continued to reinforce this mindset. By the late 2010s, scenarios involved layered misinformation, hybrid threats, and nation-state deception. But they also assumed constant, low-level system compromise. The drill wasn’t about stopping the breach. It was about navigating the breach without collapse.
This shift trained an entire generation of officials to operate under normalized exposure. Which brings us to 2025.
VI. Signalgate II: The Signal Thread That Proved the Simulation Had Become Real
In March 2025, a Signal group chat including National Security Advisor Mike Waltz and other senior Trump administration figures was used to coordinate a military operation. Waltz himself added Jeffrey Goldberg to the thread, apparently by accident. Goldberg read along and published what he saw.
There was no breach. No hack. No phishing. Just a contact error—and a total collapse of operational security.
Yet the reaction was muted. No mass firings. No internal shake-ups. The President stood by his team. The press cycle moved on.
Why? Because the system had already internalized the idea that leaks are survivable. The logic of Cyber Storm, the precedent of Snowden, the habits of backchannel coordination—it had all added up to a culture where even high-stakes operational exposure is processed as routine.
VII. The Recursion of Exposure
So yes: Signalgate caused Signalgate.
The first Signalgate (Room 641A) revealed the system’s architectural vulnerability.
Cyber Storm rehearsed exposure until exposure became normalized.
Snowden exposed the content and logic of the system.
The system absorbed the shock and hardened its tolerances.
Informal tools filled in the gaps left by formal processes.
The new Signalgate was not a violation of protocol. It was protocol.
This is the recursion. A feedback loop where the system simulates failure, experiences real failure, learns to survive it, and begins to expect it. Until failure is no longer the exception but the baseline.
VIII. What This Means for Those Who Oppose
If Signalgate II reveals anything, it’s this: the United States no longer treats exposure as a threat. What was once cause for scandal is now just another line in the press briefing. The system has rehearsed failure so many times—through Cyber Storm, through whistleblower leaks, through infrastructural collapse—that it has trained itself to absorb breakdown as ordinary.
For those positioned in opposition to this system, especially those operating from Indigenous, abolitionist, or insurrectionary frameworks, this reality isn’t a source of despair. It’s strategic terrain.
A. Shift from Exposure to Erosion
If exposure is no longer a risk to the system, it should no longer be a primary tactic of resistance. The state doesn’t need secrecy to function—it just needs momentum. Its operations don’t stall when secrets leak; they stall when logistics fail, when legitimacy erodes, when relationships dry up.
The appropriate tactic now is erosion: strategic subtraction. Undermining not by calling attention to the breach, but by creating forms of power and relation that can’t be patched.
This means:
Build autonomy where there is no surveillance interface to disrupt.
Move relationships offline, off-grid, out of scope.
Focus less on being seen and more on being unassimilable.
The state can tolerate exposure. It cannot tolerate untrackable autonomy.
B. Understand What the State Can’t Simulate
Cyber Storm taught the U.S. government how to manage collapse within a simulation. But simulations have constraints. They can only model what they can measure. They fail to render:
Relationships grounded in land, language, and lineage.
Collective action that doesn’t declare itself.
Movements without a central communications strategy.
Communities that don’t produce extractable data.
Insurgent power must move into those spaces.
Build movements that refuse scale.
Build alliances that prioritize trust over reach.
Act in ways that generate confusion—not visibility—within state telemetry.
C. Escaping the Feedback Loop: Refusal as Infrastructure
If state power is now organized around responding to disruptions, then disruption alone is no longer enough. We need refusal as a foundation—not just of tactics, but of infrastructure.
What does that look like?
Refusal to digitize: Organizing in analog, archiving in memory, transmitting via word-of-mouth.
Refusal to centralize: Never build a target.
Refusal to announce: Let action speak without a press release.
Refusal to produce data: Understand that even encrypted communications generate metadata, as the sketch after this list makes concrete.
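A toy illustration of that last point. Suppose an observer holds nothing but flow records: timestamps, endpoints, and byte counts, the envelope data that encryption does not conceal. The records below are invented, but a few lines of Python are enough to recover a contact graph from them; no payload is ever decrypted.

    from collections import Counter

    # Hypothetical flow records: (timestamp, source, destination, bytes).
    # No content, no keys, no plaintext: headers only.
    flows = [
        (1710000000, "10.0.0.4", "10.0.0.9", 1200),
        (1710000003, "10.0.0.4", "10.0.0.9", 800),
        (1710000050, "10.0.0.7", "10.0.0.9", 400),
        (1710000051, "10.0.0.4", "10.0.0.7", 4096),
    ]

    contacts = Counter((src, dst) for _, src, dst, _ in flows)
    volume = Counter()
    for _, src, dst, nbytes in flows:
        volume[(src, dst)] += nbytes

    # Who talks to whom, how often, and how much: a social graph
    # reconstructed without reading a single message.
    for (src, dst), n in contacts.most_common():
        print(f"{src} -> {dst}: {n} contacts, {volume[(src, dst)]} bytes")

This is why refusal to produce data is infrastructural rather than cosmetic: the metadata exists whether or not the message is ever read.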
Infrastructure isn’t just buildings or servers. It’s how we carry each other through time. Resistance infrastructure must be dense, relational, embodied, and uninterpretable.
D. The Role of Conflict in Disorienting Empire
The U.S. simulation machine thrives on legibility. It needs categories: terrorist, protestor, criminal, extremist. When our actions conform to any of these, it can respond predictably. But when conflict emerges in ways that don’t map to these identities—when it is uninterpretable—it becomes disorienting.
The goal is not peace. Nor is it endless riot. It is strategic friction:
Small, constant, material disruptions.
Delays in logistics, not slogans in the street.
Tactics that make security costly and maintenance unbearable.
Make the state question its models.
Make it burn energy trying to explain what’s happening.
Keep it tired.
E. Intelligence Is Not Information; It Is Relation
Every leak rests on the assumption that the key to action is better information. But those inside the empire are drowning in information. What they lack is intelligence—understood not as secrets, but as the ability to read context, build trust, make meaning, and move.
We flip this.
Our power lies in:
Knowing our terrains better than the state knows its simulations.
Having relationships that no map can trace.
Maintaining coherence without needing consensus.
Moving with strategic intent, without needing recognition.
F. Building the Future in the Gaps of Their Collapse
Every infrastructure the state neglects becomes an opportunity. Where healthcare fails, we organize care. Where housing fails, we reclaim shelter. Where food supply chains break, we grow, store, and redistribute. This isn’t charity. It’s territory.
Autonomous space is not symbolic. It is strategic.
We don’t build alternative systems to replace the state. We build them because the state already abandoned us. Our systems are how we survive. Their refusal to see them is how we win.
G. What Comes After Legibility?
They know everything. They understand nothing.
The empire believes that to see is to govern. Let them see us. Let them flood their databases with our movements. Let them try to simulate us.
And let them fail.
Because we don’t need them to understand.
We don’t need them to respond.
We just need to build what they cannot predict.
If they no longer distinguish signal from noise, then we are the noise that grows into structure.
We are not inside their collapse.
We are outside it, building something they forgot how to see.
IX. Conclusion: Beyond the Leak
The significance of the 2025 Signalgate isn’t in what it revealed, but in how unsurprising it was.
It showed us that after two decades of surveillance expansion, whistleblower crises, and simulation wargames, the U.S. national security apparatus has evolved not to prevent exposure, but to absorb it.
The question now isn’t how to stop the next Signalgate.
It’s whether a system that treats exposure as routine can still distinguish itself from a breach.
If every signal is eventually compromised, if every boundary is provisional, then what happens when the system stops reacting at all?
That’s the real threat.
Not that information leaks.
But that no one cares anymore when it does.