The Threat Generation.
How We Armed a Generation With Words and Wondered Why They Pulled the Trigger.
There is a church hall in the West Midlands with chairs still set out in rows. A table at the front. A jug of water nobody poured. The kind of room that knows what it used to be for.
Judy Foster knew what it was for. She had been coming to rooms like this since 1998. Twenty-six years of Friday afternoons and constituent problems and the patience democratic representation actually requires. In November 2024 she resigned from Dudley Council. The reason given was concerns about her personal safety.
No national headlines. No prime ministerial statement. A by-election notice and a gap where a person used to be.
A generation that has never heard of Judy Foster.
Already scrolling past something else.
The people who fought the Second World War knew what an emergency was. They had felt it in their bodies — the siren, the shelter, the letter that arrived and the one that didn’t. They had sat with genuine scarcity, genuine loss, genuine threat, and learned — not as philosophy but as survival — the difference between what required immediate action and what required patient endurance.
That calibration. That internal measure of what actually constitutes danger. It was not wisdom in the abstract. It was the product of consequence.
We no longer produce it. Something else moved in.
The arming began quietly. It always does.
Crisis arrived first — through the wreckage of 2008, the banking collapse and the austerity that followed, a decade of wage stagnation the Resolution Foundation would later describe as the worst since the Napoleonic Wars. By 2015 everything was a crisis. Housing crisis. NHS crisis. Cost of living crisis. The word migrated from event to permanent condition so gradually that nobody noticed the moment it stopped meaning anything specific.
Toxic came next. Brexit primed it in 2016. By 2018 Oxford Dictionaries named it Word of the Year — no longer a chemistry term, now the standard descriptor for any political environment, any discourse, any person whose views you found unacceptable. The word that originally described something that would kill you on contact, now deployed to describe a conversation you found uncomfortable.
Then emergency. November 2018. Bristol City Council — a city this publication knows well. Green Party councillor Carla Denyer moved a motion declaring a climate emergency — the first in Europe, widely credited as the moment the language went global. By May 2019 the UK Parliament had followed. By July 2019 more than 400 local authorities and parliaments had copied the motion.
The word emergency has a specific meaning. Normal rules suspended. Extraordinary measures justified. The usual processes of deliberation, consultation and democratic consent set aside because there is no time.
That is not a neutral word. Deployed in a political environment already primed by crisis and toxic and harm and safe and unprecedented, it does something very specific.
And that something has a name.
And then came hate — not the emotion, which has always existed, but the designation. The category. The word that, once applied, ends the conversation rather than beginning it. You cannot reason with hate. You cannot negotiate with it. You can only oppose it. Both sides understood that. Both sides used it.
It goes straight to the brain stem.
And the brain stem does not negotiate.
What follows is emotional incontinence. Not a political position. Not even anger in any meaningful sense. The unmediated discharge of whatever the brain stem produces — before the rational mind has been given a chance to arrive. Before consequence has had a chance to register.
There used to be something that stood between the feeling and what came next. You remember it, don’t you. Nothing dramatic. Nothing philosophical. The cup of tea after the news. The walk to work. The conversation with someone who thought differently. The night’s sleep before the reply was sent. In that ordinary unremarkable gap, the rational mind had a chance to arrive — to weigh the feeling against what you already knew, to ask whether what you were about to do would still seem right in the morning.
Nobody voted to remove it. Nobody declared an emergency about its disappearance. It went — quietly, between 2010 and 2015, when the smartphone became the primary surface between a person and the world. The algorithm moved into the gap the moment it appeared. The gap has been occupied ever since.
Someone saw that gap and built a business in it.
The words were the kindling. Someone built the machine to light them. Someone profited from the fire. A jury in Los Angeles said so.
In March 2026 that jury found Meta and Google liable for deliberately engineering their platforms to addict children. The verdict confirmed what the internal documents had already shown — that the platforms knew what they were doing to developing minds, knew the online harm it caused, and continued because the harm was profitable.
What the verdict described was the removal of friction.
Every social environment that preceded the platform contained friction — the time it took to write the letter, the pub conversation, the face across the table at the surgery. Friction is the gap between feeling and action. It is the space where the rational mind has a chance to operate.
It is where consequence lives.
The harmful algorithms removed it. Entirely. Deliberately. Outrage could now travel from stimulus to action in seconds, at midnight, from a sofa, with no social consequence and the algorithmic reward of likes and the warm tribal feeling of being agreed with by strangers. The Southport riots of 2024 showed precisely what follows — algorithmic misinformation, a population primed to react, real-world violence within hours.
The words were already loaded. The platform handed them to anyone who felt threatened and removed everything that stood between the feeling and the trigger.
This is the Threat Generation.
Not a birth year. Not a demographic category. A generation of kidults — adults who never learned to sit with a feeling before acting on it — shaped into emotional incontinence by a machine engineered to exploit the brain stem before rational thought arrives.
You recognise this, don’t you. You’ve felt it — the flicker of outrage, the phone in your hand before the rational mind arrived. The difference between you and the person who sent the death threat is not moral superiority. It is friction. Whatever friction remained in your life at that moment. The walk. The cup of tea. The person in the next room.
Think about that person in the next room.
They are not uniquely malicious. They are not uniquely stupid. They are the product of an environment assembled, piece by piece, over fifteen years, by governments that chose not to regulate, platforms that chose to profit, and a political class that chose to use the vocabulary of emergency for everything from climate policy to parking restrictions.
The small business owner outside Oxford whose delivery van could no longer access the filter zone had a legitimate grievance. The algorithm found it, attached it to a TikTok video about climate lockdowns and globalist control, and delivered a death threat to the councillor who approved the scheme. The grievance was real. The conspiracy was manufactured. The councillor’s home address was publicly listed.
In Sheffield a councillor answered his phone and heard: I am coming for you and your family. You can get ready. You are a dead man.
The Clean Air Zone had reduced nitrogen dioxide levels by sixteen percent.
The chair in Sheffield is still there too.
The surgery is the point where representative democracy stops being a theory. The Friday afternoon when the system has a face, the constituent has a chair, and the gap between the person who makes the decision and the person who lives with it briefly closes. Notice how quietly it is disappearing.
You already sense something has changed. When did you last go to a surgery? When did you last know your councillor’s name?
That room is closing all over the country. Here is what closing looks like.
Twenty-two percent of British councillors have received a death threat or a threat of violence.
Ninety-six percent of MPs have experienced abuse.
One in three has considered not standing again.
One in six has considered resigning now.
Mike Freer wore a stab vest to public events. The man who later murdered David Amess had been watching his Finchley office. He stood down at the 2024 election. There comes a point, he said, when the threats to your personal safety become too much.
Mike Freer had a name and a platform. Most people don’t.
The single parent with the DWP letter she doesn’t understand, the phone number that rings out, the appointment cancelled and never rescheduled. The market trader whose delivery reroute is killing his margins. The family in temporary accommodation three miles from a school their child used to walk to. These are not political abstractions. They are the people the surgery was built for. They are the people who lose when it closes.
MPs’ security costs have risen fifteen thousand percent since 2010, according to data this publication obtained under Freedom of Information legislation. The money goes on stab vests and panic buttons and bombproof letterboxes.
Not one penny of it addresses the machine that made the fear necessary.
A child is on their phone tonight. Right now. Filling the gap between the end of the day and the beginning of sleep — that low-resistance state the harmful algorithm was specifically designed to find. It delivers something that produces a flicker of feeling. The child pauses. The pause is the data point. The algorithm notes it and delivers more.
The drama Adolescence ignited a national conversation in 2025 about exactly this — the radicalisation of children through online content, the speed at which a feed reshapes a developing mind. The conversation was important. The government convened a working group. The Online Safety Act exists. Neither has been enough.
The government is now trialling a social media ban for teenagers. The consultation opened on 2 March 2026. It closes on 26 May 2026. The algorithm will still be there on 27 May.
You know this child. You might live with one. You might be raising one. You might remember being one, before the algorithm knew your name.
The wartime generation’s threat threshold was set by consequence. By the weight of words that meant what they said because what they described was real.
This child’s threshold is being set right now by an algorithm found liable in a court of law for doing precisely this to children, deliberately, for profit. The platforms call it engagement. The courts called it addiction.
They are not the enemy. They are the product.
The teacher who might have built the space between stimulus and response — who might have modelled what it looks like to encounter a difficult idea without becoming dangerous — is leaving the profession. The caseload is unmanageable. The behaviour is escalating. The parents are in the WhatsApp group at midnight.
The social worker who might have reached the family before the crisis became irreversible has sixteen other families on their list and a reporting system that takes three hours to update.
The local newspaper that might have told people what the traffic filter actually did closed in 2019.
Every institution that might have stood between the child and the algorithm has been either defunded, demoralised, or driven out.
The vacancy was filled immediately.
We did not build the Threat Generation by accident. We built the conditions. We handed over the vocabulary. We licensed the machine. We removed the friction. We defunded everything that might have stood in the way. And then we expressed surprise at the body count.
The wartime generation would have looked at a death threat sent over a traffic filter and understood, in their bones, the difference between a real threat and a manufactured one. They had the calibration. They knew what emergency actually cost.
We took that calibration — that hard-won human capacity for proportionate response — and replaced it with a word.
We declared emergencies from council chambers. We named everything toxic. We expanded harm until it meant any idea that made someone uncomfortable. We made hate a category rather than a feeling — and handed it to everyone simultaneously, so that the march against hate and the person sending the death threat were both operating from the same instruction. Oppose. Eliminate. Do not engage.
We handed a generation the vocabulary of existential threat and a machine that rewarded them for using it, and built no institution capable of teaching them that a feeling is not a fact — or that the inability to tell the difference has a name. Emotional incontinence. We built the conditions for it. We removed everything that might have treated it.
Then we expressed surprise when they pulled the trigger.
The Threat Generation did not arrive from nowhere. We built the road that brought them here — word by word, policy by policy, platform by platform.
And we left every surgery door unlocked while we did it.
Judy Foster knew what the surgery was for. The Threat Generation will never see one, and cannot fix what it has never seen.
The Almighty Gob is a Bristol-based publication covering politics, power, and the gap between what institutions say and what they actually do.
Sources and citations
Judy Foster / Dudley Council resignation
Judy Foster, Labour councillor for Brockmoor and Pensnett, Dudley Council. Resigned November 2024 citing personal safety concerns. Deputy leader of the Labour group. 26 years' service from 1998. Reported across local West Midlands press, November 2024.

Resolution Foundation — wage stagnation
Resolution Foundation. A New Generational Contract. 2018. Finding: the 2008–2018 decade represented the worst period of wage stagnation since the Napoleonic Wars.

Oxford Dictionaries — Word of the Year
Oxford Languages. Word of the Year 2018: toxic. Oxford Languages. Word of the Year 2019: climate emergency. Published annually by Oxford University Press.

Bristol City Council climate emergency declaration
Bristol City Council, November 2018. Motion proposed by Green Party councillor Carla Denyer — the first climate emergency declaration by a local authority in Europe. UK Parliament followed May 2019. More than 400 local authorities and parliaments had adopted the declaration by July 2019. Reported by Climate Emergency UK and widely referenced in the parliamentary record.

Meta and Google liability verdict — Los Angeles
Kaley G.M. v. Meta Platforms Inc. and Google LLC. Jury verdict, Los Angeles, March 2026. Jury found Meta and Google liable for deliberately engineering platforms to addict children. $6 million in damages awarded. First successful case to sidestep Section 230 protections by focusing on platform design rather than content. Reported by Reuters, NBC, Rappler, March 25–26, 2026.

Southport riots — algorithmic misinformation
Southport attacks, July 2024. False claims about the attacker spread rapidly across social media platforms, driving riots between 30 July and 7 August 2024. Ofcom confirmed illegal content and disinformation spread “widely and quickly” online, and that “algorithmic recommendations” played a role in driving divisive narratives. Science, Innovation and Technology Committee inquiry into social media, misinformation and harmful algorithms, launched 2024, follow-up session 24 March 2026.

Oxford traffic filter — death threats to councillors
Oxfordshire County Council low traffic neighbourhood and traffic filter schemes, 2022–2023. Multiple documented death threats received by councillors following online misinformation campaigns linking filter schemes to climate lockdown conspiracy theories. Reported across Oxfordshire local press and referenced in LGA Debate Not Hate campaign documentation.

Sheffield Clean Air Zone — death threats
Sheffield City Council Clean Air Zone, introduced 2022. Councillors received documented death threats including the direct speech quoted in this article, reported across Sheffield press and referenced in parliamentary testimony on MP and councillor safety. Sheffield Clean Air Zone: nitrogen dioxide levels reduced by sixteen percent — Sheffield City Council monitoring data, 2023.

Councillor abuse statistics — 22 percent
Local Government Association. Debate Not Hate survey. August 2024. Finding: 22% of British councillors have received a death threat or threat of violence. 73% of councillors experienced abuse in the past year. 88% of parish and town councillors reported abuse.

MP abuse statistics — 96 percent, one in three, one in six
Speaker’s Conference on the security of candidates, MPs and elections. 2025. Finding: 96% of MPs have experienced abuse. 39% received calls for harm. 27% received death threats. One in three MPs has considered not standing for re-election. One in six has considered resigning from public office.

Mike Freer — security, David Amess connection
Mike Freer, Conservative MP for Finchley and Golders Green. Wore a stab vest to public events following sustained threats. The man who murdered Sir David Amess MP in October 2021 had previously visited Freer’s constituency office. Freer announced he would not stand at the 2024 general election, citing personal safety. Statement reported by BBC, Sky News, and national press, 2024.

MPs’ security costs — fifteen thousand percent
Security expenditure for MPs obtained under Freedom of Information legislation by this publication. Security costs rose from approximately £39,000 in 2010 to approximately £5.9 million in 2023/24 — an increase of approximately fifteen thousand percent. Parliamentary Estates Directorate FOI response.

Adolescence — Netflix drama
Adolescence. Netflix, 2025. Four-part drama examining the radicalisation of a teenage boy through online content. Prompted widespread national debate about children’s online safety, cited in parliamentary debates and Hansard, 2025.

Online Safety Act
Online Safety Act 2023. Received Royal Assent 26 October 2023. Established a duty of care for online platforms regarding illegal content and content harmful to children. Ofcom designated as regulator. Key provisions came into force from July 2025.

Social media ban trial for teenagers
UK Department for Science, Innovation and Technology. Six-week pilot trialling social media restrictions for 300 teenagers, announced 25 March 2026. Interventions include app removal, one-hour daily caps, and overnight curfews. Part of the government’s broader digital wellbeing consultation, which opened 2 March 2026 and closes 26 May 2026.

Children’s digital wellbeing consultation
UK Government. Growing Up in the Online World: A National Consultation. Launched 2 March 2026. Consultation closes 26 May 2026. Seeking views on social media minimum age restrictions, addictive design features, age verification, and AI chatbot access for children. Government committed to publishing its response in summer 2026.