The cybersecurity industry has conscripted the military establishment as its sales force. The result is fear dressed as doctrine, complexity sold as cure, and exhibition floors full of people who couldn't heat a tin of beans — but will happily invoice you for threat intelligence.

Setting the Scene

Picture the exhibition floor at CyberUK. Hundreds of vendors. Thousands of lanyards. An entire ecosystem of people absolutely certain they are the last line of defence between civilisation and catastrophe. And presiding over much of it, projecting the calm authority of people thoroughly briefed to project calm authority: the former military men. Ex-special forces. Retired intelligence officers. Their LinkedIn profiles a quiet hymn to operations, leadership, and the kind of vague competence that sounds impressive until you ask a specific technical question and watch the eyes go blank.

The pitch is simple: I have operated in difficult environments. Threat is my native language. Trust me with your security budget.

Do not trust them with your security budget.

A Brief Survey of Military Competence

Let's be clear about something. We are antiwar. That position is not complicated, not negotiable, and not going to be softened to spare anyone's feelings. The military exists and sometimes serves a purpose. It is also, by any honest reckoning, an institution that routinely makes catastrophic, farcical, and occasionally hilarious mistakes — and has done so throughout its history.

In 2002, a party of Royal Marines on exercise stormed a beach at La Línea, in Spain, under the impression that it was Gibraltar. These are not, by any map available to civilians or military alike, the same place. Spain and Gibraltar are separated by a border that has existed since 1713. Spanish police pointed out the error. The marines withdrew. Nobody was court-martialled. Everybody moved on.

Friendly fire. Wrong targets. Intelligence failures of such magnitude they became their own literary genre. The history of military operations is not a record of infallible precision — it is a long, expensive, frequently deadly argument between what was planned and what actually happened.

Then there is The Apprentice. In an episode that deserves far more attention than it received, a former army officer — a Sandhurst graduate who had served in Basra and whose application pitched "cool, calm, and collected leadership" — was sent to buy a gas camping stove. He came back with a tin of baked beans and a petroleum jelly burner, planning to eat the beans, punch air holes in the empty can, and use it as a makeshift stove. Alan Sugar called the task "a bloody disaster" and fired him.

This is not a gotcha. It is the entire point. The skill set required to survive a gruelling selection process, follow a chain of command in high-stress environments, and brief a room with unearned confidence does not transfer — at all, in any direction — to understanding why your Kubernetes ingress controller is your largest liability, or why taking a dependency on a third-party CI plugin is an architectural decision with security implications, or what a BGP hijack actually looks like from the inside.

One set of skills. One very specific context. Beans. Burner. Cold.

Fear Is the Product

What the military worship culture has given cybersecurity is vocabulary and theatre. Threat actors. Kill chains. Cyber warfare. Defence posture. Advanced persistent threats. It is geopolitical language applied to an engineering problem, and it performs one function with extraordinary efficiency: it makes procurement feel like strategy.

The CFO who cannot tell you what a buffer overflow is will sit across a table from a former intelligence officer and feel, for the first time, that he understands the scale of what he faces. The fear is real. The solution being sold is not. Because the solution is always the same: another platform, another dashboard, another scanning tool, another integration, another vendor relationship, another seat licence.

The engineer who actually knows where the bodies are buried — who understands the actual codebase, the actual dependencies, the actual trust boundaries — is not in that boardroom. They are at their desk, running a build, quietly aware that the third-party library their organisation took a dependency on eighteen months ago hasn't been updated in a year and nobody owns it.

That engineer is the only person in the organisation who could have a useful conversation about security. They are not being consulted. The former Regimental Sergeant Major is closing the deal.

The Attack Surface Is the Scaffolding

Here is the contradiction that the entire bolt-on security industry depends on you not examining too closely: it sells you complexity as the cure for complexity.

Every tool on that exhibition floor adds code, infrastructure, third-party dependencies, integrations, API calls, agents running on your systems, and databases of vulnerabilities that are, by definition, always out of date. Each of these is a new attack surface. The vulnerability scanner sitting in your CI pipeline is itself a target — a piece of software, written by humans, with its own dependencies, its own update cadence, and its own potential for compromise.
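The growth is easy to understate. Here is a minimal sketch, using an entirely hypothetical dependency graph (none of these package names are real), of how installing a single scanning tool quietly enlarges the set of code you are forced to trust:

```python
from collections import deque

# Toy dependency graph: each bolted-on security tool pulls in its own
# libraries, which pull in theirs. All names here are hypothetical.
DEPS = {
    "vuln-scanner": ["http-client", "yaml-parser", "agent-runtime"],
    "http-client": ["tls-lib", "url-lib"],
    "yaml-parser": ["regex-lib"],
    "agent-runtime": ["http-client", "ipc-lib"],
    "tls-lib": [], "url-lib": [], "regex-lib": [], "ipc-lib": [],
}

def transitive_closure(root: str) -> set[str]:
    """Every package that lands on your systems when you install `root`."""
    seen, queue = set(), deque([root])
    while queue:
        pkg = queue.popleft()
        if pkg in seen:
            continue
        seen.add(pkg)
        queue.extend(DEPS.get(pkg, []))
    return seen - {root}

surface = transitive_closure("vuln-scanner")
print(f"One tool, {len(surface)} extra packages to trust: {sorted(surface)}")
```

One install, seven packages you did not choose and will never read — and real dependency trees run to hundreds, each with its own maintainers, update cadence, and potential for compromise.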

What the supply chain incidents actually proved

SolarWinds. XZ Utils. 3CX. The March 2026 npm supply chain attack. In each case the point of entry was not the product but the scaffolding: the build tool, the update mechanism, the dependency nobody was watching. The security industry's response to each of these incidents was to sell more scaffolding.

And against a nation-state adversary — a group operating with multi-year planning horizons, undisclosed zero-days, and the deliberate patience to wait for exactly the right moment — your CVE subscription is a library card in a gunfight. The vulnerability they're using is not in any database. It never will be until after it's been used. That is the definition of a zero-day. The scanner cannot see it. The dashboard cannot flag it. The ex-SAS CISO cannot brief against it.

The only honest answer to a growing attack surface is a smaller attack surface. Less code. Fewer dependencies. Less infrastructure. Solve today's problems, and only today's. Features nobody uses are not load-bearing — they are undocumented paths through your system that nobody has thought to audit in four years.

Who Are You Going to Sue? China?

The legal industry's contribution to cybersecurity has been exactly as useful as you'd expect from a profession that arrives after the fact, charges by the hour, and measures success by paperwork rather than outcome.

GDPR. Enormous machinery. Eye-watering fines on paper. What changed architecturally? Companies hired Data Protection Officers and lawyers, not better engineers. The incentive was to manage liability, not fix the problem. Compliance became the goal, and compliance is not security. A system can be fully compliant and completely owned before close of business on the day the auditor signs it off.

In a world driven by state actors operating through seven proxies across five jurisdictions, the lawsuit is not a tool. It is a gesture. You cannot sanction your way out of an advanced persistent threat. You cannot serve papers on a group that the government sponsoring them will deny exists. "Who are you going to sue — China?" is not a rhetorical flourish. It is an accurate summary of the legal framework's utility against the actual threat.

Defence is engineering. It is architecture. It is what you choose not to include, what you choose not to expose, what you choose to audit before it ships. You cannot litigate your way to a smaller attack surface.

The Botnet Nobody Volunteered For

We said we're antiwar. We meant it. But cyberwar is not coming — it is here, and unlike every previous model of warfare, civilians are not collateral damage. They are the target and the weapon simultaneously.

Every compromised device is a conscript in someone else's army. The botnet is not abstract. It is routers, cameras, thermostats, and televisions running default credentials or firmware that hasn't been patched since the factory. Their owners are entirely unaware. They plugged in a device and went about their lives. That device is now participating in a DDoS attack on a hospital, or a credential stuffing campaign against a bank, or infrastructure reconnaissance for an operation that will execute in eighteen months.

Anyone who builds software or connects hardware to a network carries a duty of care to the people who use it. This is not optional and it is not a compliance checkbox. A minimal codebase with secure defaults and dependencies you can actually read and reason about is not gold-plating — it is basic civic responsibility. You cannot be conscripted into a botnet army if there is nothing to grab hold of.
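What secure defaults look like in practice is not exotic. Here is a minimal sketch — hypothetical names throughout, not any real device firmware or API — of a configuration that ships with no shared default password and nothing exposed until a human opts in:

```python
import secrets

# Hypothetical device firmware sketch. There is no factory default
# password shared across units, so a fresh-out-of-the-box device has
# nothing for a credential-stuffing botnet to grab hold of.

FORBIDDEN = {"admin", "password", "12345", "root", ""}

class DeviceConfig:
    def __init__(self) -> None:
        # Each unit boots with its own random secret, never a shared default.
        self.admin_password = secrets.token_urlsafe(16)
        self.remote_admin_enabled = False  # closed until a human opts in

    def set_password(self, new_password: str) -> None:
        # Refuse the well-known defaults that botnets enumerate first.
        if new_password.lower() in FORBIDDEN or len(new_password) < 12:
            raise ValueError("refusing weak or well-known password")
        self.admin_password = new_password

cfg = DeviceConfig()
print(cfg.remote_admin_enabled)  # False: nothing listening by default
```

The design choice doing the work is the per-unit random secret: there is no universal credential for an attacker to look up, and no remote surface until the owner deliberately opens one.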

The cybersecurity industry, left to its current trajectory — consolidation, defence contractor money, government contracts, former intelligence establishment in the C-suite — will become a mirror of the defence industry it has spent twenty years cosplaying. Incentivised to sustain threat rather than eliminate it. Structured to profit from fear. Resistant to the simple, brutal, architectural answers that would actually work. We should name this now, clearly, while the industry is still young enough to go a different way.

The Kid the Prosecutor Never Asks About

While the industry is busy handing itself awards at black-tie dinners, it is also quietly cheering on the prosecution of teenagers for doing something the industry itself cannot: actually getting into systems.

Most of these kids are curious, not criminal. They found a tool, followed a thread, and ended up somewhere they shouldn't be — frequently because organised criminal infrastructure handed them a prebuilt exploit, aimed them at a target, and disappeared the moment consequences arrived. The gang gets paid. The kid gets a record. The industry gets a press release about the arrest and calls it a win.

Hiring profile, not threat profile

The instinct to probe systems at sixteen, self-taught, following nothing but curiosity — that is not a threat profile. That is a hiring profile. Credit where it's due: GCHQ runs Capture the Flag competitions specifically to find these people, and on this one question the signals intelligence establishment has seen something the legal system refuses to. The prosecutors celebrating the conviction do not talk to the GCHQ recruiters.

The security establishment, which cannot find genuine technical talent for love nor money, watches that talent get criminalised and says nothing. Some of the best potential defenders this industry will ever see are being handed criminal records by people who think cybersecurity is a branch of law enforcement rather than a branch of engineering.

The Rewrite Is Possible. You Were Lied To.

The bolt-on security model survives on one crucial and largely fictitious piece of received wisdom: the codebase is too large, too complex, too undocumented to rewrite. Better to layer tools on top of it forever. The consultancy lives. The vendor renews. The system grows.

AI changes the economics of the question that always made rewrites feel impossible: what does this system actually do? Answering it used to require months of archaeology through undocumented decisions made by people who left three years ago. It is now tractable in a fraction of that time. And once you can answer it honestly, the next question — do we actually need all of this? — almost always has the same answer. No. Half the system exists because someone asked for something once and nobody had the courage to remove it.

Strip it out. Write the replacement with security as the grain of the wood, not a layer applied afterward. Threat model before you write a line. Keep the codebase small enough that a human being can hold it in their head and reason about every path through it. That is security engineering done properly — not a product category, not a dashboard, not a framework with a three-letter acronym being sold by someone who once navigated a jungle.

The uRadical position

A disciplined small team with genuine understanding of their own system is more secure than an enterprise with a forty-tool stack that nobody fully comprehends. Write less. Depend on less. Audit everything you do ship. Solve today's problems only. Keep the surface small enough that a human being can reason about it. Security is architecture. Not acquisition. Not theatre. Not a retired colonel with a deck of slides and a firm handshake.

The exhibition floor will keep filling up. The lanyards will keep multiplying. Corporal Jones and the boys will keep fixing bayonets at threats that bayonets cannot touch, shouting "don't panic" while absolutely panicking, billing by the day for the privilege. Carry On Security, cyber edition — same cast, same incompetence, significantly higher invoice.

Like that burner in the bean tin, this was never going to be enough. Stop wasting your time watching it flicker. Build the thing that will actually make a difference.