The Outsourcing of the Self.
We Didn't Lose Our Minds. We Handed Them Over.
[outsourcing-of-the-self.jpeg]
You know how this works. You’ve lived it today already. Most of us have.
The alarm goes off — that part’s yours, fair enough. You’ll probably reach for the phone before you’ve said a word to anyone. The feed is already curated. Someone else decided what’s relevant this morning. You didn’t ask who. The route to wherever you’re going will be decided by a satellite. The music on the way chosen by an algorithm that knows your taste better than you do, because the choosing gradually stopped feeling necessary. The news you absorb on the way — selected, sequenced, weighted — by a platform whose interest is not your information but your attention.
By the time you’ve had breakfast, you’ve outsourced the first part of your day without realising.
And it didn’t feel like a loss.
That’s the thing. It never does.
There’s a word that runs through almost everything wrong with the way we live now. Not a complicated word. Not a new one. It’s been sitting in the business pages for decades, hiding in plain sight, doing its quiet damage.
Outsourcing.
A company outsources its call centre. Contracts the cleaning to an agency. Hands the logistics to someone with a van and a zero-hours contract. Keeps the profit. Moves the problem. It’s a tidy arrangement — if you’re the one keeping the profit.
However, outsourcing didn’t stay in the boardroom. It got into everything. Into the way institutions operate, the way governments govern, the way platforms function, and — this is the part that matters — the way you and I now relate to our own minds.
We outsourced thinking. Then feeling. Then judgement. Then responsibility. Then consequence.
And at each stage we received something back — convenience, comfort, speed, the warm sensation of not having to try quite so hard — and the transaction felt fair. Felt like progress, even.
It wasn’t progress. It was a slow, methodical removal of the self.
This is not a diagnosis. It’s an observation. Not every person, not every moment — but as a pattern, as a direction of travel, it’s hard to look away from.
This didn’t begin with the smartphone. It began earlier. And it began deliberately.
A man called Edward Bernays — Sigmund Freud’s nephew, and a pioneer of modern propaganda — worked out in the 1920s that the rational mind isn’t where decisions are made. The real target is the feeling underneath it. The anxiety. The longing. The need to belong. By the postwar consumer boom, that insight was operating at industrial scale. An entire economy built on the systematic bypassing of the pause between feeling and decision. Common sense, it turns out, is bad for sales. So they went around it.
Television arrived and changed the terms again. For the first time an emotional experience could be delivered directly into your living room without you having to do anything to receive it. No reading. No decoding. Just absorption. The muscle that critical thinking runs on — the slight friction of processing something yourself — started softening. Not dramatically. Just a little. Just enough.
Rolling news did something more specific. It abolished the natural gap between event and reaction. The morning paper had a cooling period built in — by the time you read about something it had already settled into fact. Rolling news made everything permanent emergency. Common sense needs stillness. Rolling news removed the stillness and called it keeping you informed.
The early internet handed ordinary people the ability to broadcast — which sounded democratic, and in some ways genuinely was, though it also meant the gap between feeling something and publishing it collapsed for everyone. Before, the cooling period was structural. The architecture enforced the pause.
Then the smartphone. 2010 to 2015, roughly. That’s when the last friction went. The broadcast capability was now in your pocket, always — including at 3 AM when your defences are down, in the middle of an argument, in a crowd where the brain stem is already running the show. The smartphone removed the last physical barrier between impulse and publication.
By then, the outsourcing of the mind was essentially complete.
We didn’t have agency taken from us. We donated it. Gradually. In exchange for convenience.
Here’s where it gets closer to home.
You ring the number on the letter. You’re connected to someone in another country reading from a script. They can’t resolve it. They escalate. You wait. You ring back. Different person. Same script. Nobody has the authority to fix it. Nobody owns the problem. The company kept the brand. Outsourced the human contact. And with it, outsourced the accountability.
Sound familiar? Of course it does. Because it’s the same transaction at every level.
A council contracts its highway repairs to a private company. That company subcontracts the work to another. The pothole remains. The resident asks who’s responsible. The council points to the contractor. The contractor points to the subcontract terms. The subcontractor points to the schedule of works. The road stays broken. Somewhere in that chain of passed parcels, accountability evaporated — and nobody even noticed it go.
This is not inefficiency. This is architecture. Responsibility has been distributed until it belongs to nobody.
Now look at Section 230 of the American Communications Decency Act. The legal shield that allowed social media platforms to say: we are not publishers. We are the pipe. The content belongs to the users. It meant a platform could algorithmically amplify content — actively choose it, boost it, serve it to the most susceptible people at the most vulnerable moments — and still claim no relationship with the consequences.
That’s not a pipe. That’s a hand on a shoulder, guiding someone down a corridor.
Courts in the United States have begun testing that argument. In 2023, the Supreme Court considered Gonzalez v. Google — a case asking directly whether Section 230 protected algorithmic recommendation of terrorist content. The Court declined to rule on the Section 230 question, leaving the shield intact for now. However, the argument hasn’t gone away. Cases involving algorithmic recommendation — where platforms don’t merely host harmful content but actively push it, repeatedly, to specific individuals — continue to expose the gap between what Section 230 was written to protect and what the algorithm actually does. That reckoning is arriving. Slowly. Expensively. Too late for some.
Everyone pointing sideways. Nobody home. The consequence real. The responsible party: unavailable.
And here’s what happens next.
Consequence doesn’t disappear because responsibility has been distributed into invisibility. Something still went wrong. Someone still got hurt. The pothole is still there. The city still burned. And human beings are wired — rightly, healthily — to need an explanation. To need somewhere to point. Cause and consequence. Someone should answer for this.
However, when the architecture has been deliberately designed so that nobody owns the consequence, that instinct has nowhere legitimate to land.
So it lands anywhere. Wrongly. On the immigrant, the previous government, the other tribe, the nearest available target. The blame game isn’t irrationality. It’s a rational instinct operating inside a system deliberately designed to frustrate it.
And when the blame game exhausts itself and resolves nothing — which it always does, because it was never aimed at the actual cause — what’s left is a quiet kind of stillness. Not peace. Just the absence of any expectation that things might change.
This is what learned helplessness looks like. Not despair. Just the quiet decision to stop expecting anything different.
Think about how many times today a notification pulled you away from whatever you were actually doing. You dealt with it quickly. You had to. Life doesn’t wait. And in that moment, the pause never happened. None of us are untouched by this — not really. We’re all doing it. The difference is whether we ever stop to notice. Because the pause didn’t just wander off. It was crowded out. And it can be reclaimed. Only if we recognise first that we lost it.
The platforms have a word for what they want you to be. They call it an engaged user. What they mean is something closer to a humanoid — outwardly present, inwardly operated. Not by yourself. By the algorithm you handed yourself over to. And this too can be reclaimed. Again, only if we recognise first that we lost it.
Ownership didn’t disappear. It was just passed around until nobody was holding it.
Not all outsourcing is the same. That distinction is the whole point.
The calculator does your arithmetic. The washing machine does your laundry. Functional outsourcing frees up capacity. That’s tools doing what tools are for.
Cognitive and emotional outsourcing is different. That’s when what to think, what to fear, what to want, who to blame — all of it starts arriving from elsewhere. The first kind frees up capacity. The second kind fills that freed capacity with someone else’s agenda.
The tragedy is that we can no longer feel the difference. It all arrives through the same screen, in the same feed, at the same speed, with the same frictionless ease. The algorithm that chooses your playlist and the algorithm that chooses your outrage are, functionally, the same mechanism. One feels like a gift. The other feels like a conviction. Neither one is yours.
Buried inside even the most emotionally incontinent response — the pile-on, the performative grief, the furious share — there is often a legitimate point. A real grievance. A genuine injustice underneath the noise. The cause may be entirely valid.
However, the relationship to the cause has been outsourced too. The person didn’t reason their way there. They were handed an arrival point, pre-packaged, shaped to fit their existing insecurity, and given the emotional fuel to feel certain about it. Someone else did the thinking. Someone else identified the grievance, framed it, and distributed it to people primed to receive it.
The feeling is real. The grievance may be real. The thinking was done elsewhere. By someone with an agenda. And handed over, like everything else.
The algorithm didn’t steal your mind. You left it behind somewhere along the way. You just never went back for it.
We’ve all seen the film. Someone walks into a room. A screen activates. An AI voice asks questions. Analyses responses. Makes recommendations. Decisions that were once yours — what to eat, where to go, who to trust, what to think — arrive from somewhere else entirely. We watched it and thought: that’s the future. That’s not us. Not yet.
Look around.
The screen is already in your pocket. The voice is already in your kitchen. The recommendations are already shaping your choices before you’ve consciously made them. The film wasn’t a warning. It was a schedule.
We built the robot. Then we became it. Submissive to the very thing we created. Told what to do, what to think, what to say — not by another human being, but by an algorithm with no conscience, no context, and no skin in the game. The machine got the agency. We got the notifications.
Consider the futility of it. One side blames the algorithm. The other side blames the opposite tribe. Neither stops to notice that both responses were shaped by the same feed, delivered to the same brain stem, producing the same emotional discharge — just pointed in different directions. The echo chamber doesn’t just confirm what you believe. It tells you who to blame. The outrage is real. The hypocrisy is invisible. Because you can’t see the hand that’s moving you when you’ve outsourced the looking.
The thing is, most of us never made it consciously. We arrived here one small outsourcing at a time. Without noticing. Without signing anything.
And agency, once outsourced gradually enough, takes conscious effort to reclaim.
This piece itself is an act of outsourcing. You’re reading someone else’s thinking. The question isn’t whether you receive ideas from outside yourself. You always will. The question is whether what you receive hands you a conclusion or hands you a key.
The countercultural act is to reclaim the pause. Not dramatically. Not as ideology. Just as a practice.
To sit with a question before reaching for someone else’s answer. To feel something without immediately broadcasting it. To ask, of any strong conviction that arrives fully formed: where did this actually come from? Did I think my way here, or was I delivered here?
Every small reclamation of that kind is a small restoration of the self. The muscle, used again. The gap, reopened. The author, returning.
Common sense didn’t pack its suitcase and leave. It was shown the door. Quietly. By people who found it commercially inconvenient.
The consequence always belonged to someone. It still does. It just got lost in the chain.
The outsourcing of the self was never announced. No contract was signed. No small print was read. It happened in the accumulated weight of a thousand small conveniences, each one reasonable, each one harmless, each one leaving a little less of you behind.
We didn’t lose our minds. We handed them over.
The door is still there. Opening it, though — that part was never outsourced.
The Almighty Gob is a Bristol-based publication covering politics, power, and the gap between what institutions say and what they actually do.
Sources and further reading.
Edward Bernays and the engineering of consent
Bernays, E. (1928). Propaganda. Horace Liveright. The foundational text in which Bernays sets out his theory of mass persuasion through emotional rather than rational appeal. Widely available and extensively cited in media studies, advertising history, and political theory.
Section 230 of the Communications Decency Act 1996
47 U.S.C. § 230. The full text and legislative history are available via the United States Congress at congress.gov. For a comprehensive legal overview see: Congressional Research Service (2022). Section 230: An Overview. CRS Report R46751.
Gonzalez v. Google LLC (2023)
598 U.S. 617 (2023). The Supreme Court of the United States considered whether Section 230 immunised YouTube’s algorithmic recommendation of Islamic State content. The Court declined to rule on the Section 230 question and resolved the case on separate grounds. Full opinion available via supremecourt.gov.
Learned helplessness
The concept was developed by psychologists Martin Seligman and Steven Maier in the 1960s and 1970s through a series of controlled studies on the relationship between perceived control and behavioural response. Key paper: Seligman, M.E.P. & Maier, S.F. (1967). Failure to escape traumatic shock. Journal of Experimental Psychology, 74(1), 1–9.
Algorithmic recommendation and Section 230 — current legal position
For the most current analysis of how courts are treating algorithmic recommendation under Section 230 see: Congressional Research Service (2023). Liability for Algorithmic Recommendations. CRS Report R47753. Available via congress.gov.
Cognitive and emotional outsourcing to algorithmic systems
Joseph, R. et al. (2025). The algorithmic self: how AI is reshaping human identity, introspection, and agency. Frontiers in Computer Science. Available via PubMed Central (PMC).


