On Human Dignity and the Age of Automation
Long before résumés and job titles, work was simple. You needed something, so you made it, grew it, or traded for it. Labor wasn't abstract — it was survival, and it carried an inherent dignity. What you could do with your hands and your mind was inseparable from who you were.
Then we invented currency. And with currency came a quiet but profound shift: your labor no longer spoke for itself. It was translated — into coins, into bills, into numbers on a screen. Each abstraction moved us further from the direct relationship between effort and meaning. Barter became currency. Currency left the gold standard. Money became faith — faith in institutions, in governments, in systems most people will never fully understand.
Jean Baudrillard saw this clearly. In Simulacra and Simulation, he described how symbols gradually detach from the things they represent until the symbol itself becomes the reality. Money was once a stand-in for labor, for goods, for tangible value. Over time, it became what Baudrillard might call a simulacrum — a symbol for infinite purchasing possibility, untethered from anything real. People stopped treating money as a means to an end and began treating it as the end itself. The map replaced the territory.
But something else happened along the way. As the economy abstracted, people specialized. You were no longer a generalist surviving off the land — you were a carpenter, a surgeon, a machinist, a software engineer. Specialization gave people something money alone never could: identity. Pride. The quiet dignity of being good at something.
Now we are entering a deeper layer of that same abstraction. AI and automation aren't just changing how we exchange labor — they are abstracting what it means to do things in the first place. The hyperreality that Baudrillard described around money is now extending to work itself. When a machine can write, diagnose, design, and build, the question is no longer just "what is my labor worth?" It becomes: "what does it mean to do this at all?"
That is the question this essay is trying to sit with.
We Specialized Ourselves Into a Corner
Specialization was, by most measures, a triumph. It built civilizations. It advanced medicine, engineering, art, and science at a pace that generalism never could. When you devote your life to one craft, you achieve a depth of mastery that commands respect — from others and from yourself.
But specialization came with a hidden cost. The more we tied our identities to what we do, the more fragile those identities became. A steelworker isn't just someone who shapes metal — it's who he is. A radiologist isn't just reading scans — it's decades of training, status, purpose, a life organized around a skill. When that skill becomes automatable, it's not just a job that's threatened. It's a sense of self.
My father told me about his first job. Accounting, scheduling, and payroll were all handled by rooms of clerks working abacuses. Then came Microsoft Excel. What had taken a room full of people could suddenly be done by one person who was good with a spreadsheet. The job of dozens of abacus accountants was replaced overnight — not by a machine that was smarter, but by a tool that made one person's output worth more than the rest combined.
He also grew up in a world without cars. Everyone walked, rode trains, and sent physical mail that took weeks or months to arrive. Now letters have given way to texts and emails, and the long walks to highways. Life became faster, more accessible, more connected — but those who had built entire lives around the old ways were made obsolete. Not because they failed, but because the world moved.
This is not a new phenomenon. Factory workers felt it during industrialization. Weavers felt it when the loom mechanized. But there's a difference this time. Previous waves of automation replaced physical labor and left cognitive work untouched — you could always retrain, move up the abstraction ladder, become the person who designs the machine instead of operating it. AI doesn't offer that escape hatch. It operates at every level of abstraction simultaneously. It writes. It reasons. It creates. The ladder we've been climbing has something waiting at the top.
There's a moment in The Avengers where Captain America challenges Tony Stark: "Big man in a suit of armor. Take that off, what are you?" And Stark can't answer. He doesn't know who he is underneath. It's a scene from a superhero film, but the question it asks is the same one facing millions of people right now. It's telling that this anxiety shows up in the most popular stories of our time — it means the question isn't academic. It's everywhere.
Stark's arc across the films is instructive. He had to confront himself without the armor. He had to find out who he was as a person before he could understand what the suit was for. He set the suit aside until he found his identity — and then picked it up again not as a crutch, but as a tool to protect the innocent.
That is the seriousness of the modern era. We don't know who or what we are without our jobs, our careers, our skills, our work. The thing we spend at least eight hours a day doing. The thing that keeps us proud. The thing that pays the bills and commands respect in the household and in public spaces. We are beginning to dismantle what it means to work — and we haven't begun to answer what replaces it.
And so the resentment builds — not because people are irrational or afraid of progress, but because the thing they were told to do (specialize, get good, be the best) is the very thing being made obsolete. The social contract broke, and nobody announced it.
The Meaning Crisis Behind the AI Backlash
The backlash against AI is often framed as fear of the unknown, resistance to change, or simple technophobia. That framing is convenient for the people building these systems, but it's wrong. What we are witnessing is not a rejection of technology. It is a crisis of meaning.
For most of modern history, the answer to "who are you?" has been inseparable from "what do you do?" You meet someone at a dinner party and within two minutes you know their occupation. It's the first thing we ask, the first thing we offer. Not because we're shallow — because work became the primary vehicle through which people locate themselves in the world. It tells you where you stand, what you contribute, what you're worth.
When AI threatens that, it doesn't feel like a tool being introduced. It feels like an identity being revoked.
Consider the writer who spent twenty years learning to craft prose, only to watch a language model generate passable copy in seconds. Consider the artist who trained for a decade in anatomy, color theory, and composition, now watching AI produce images from a text prompt. Consider the programmer — someone like me — watching code generation tools write in minutes what used to take hours of careful thought. The output may not be equivalent. The nuance may be missing. But the gap is closing, and everyone can feel it.
The natural response is anger. And that anger is rational. These people did exactly what society asked of them: they specialized, they practiced, they got good. They held up their end of an unspoken agreement — if you master something, you will be valued for it. Now the terms are changing, and no one asked for their consent.
But beneath the anger is something quieter and more disorienting: the realization that they may not know who they are without the work. That the identity they spent a lifetime constructing was built on a foundation that was never as solid as it seemed. The suit of armor was always borrowed.
We have been warned about this, in a sense — just not accurately. George Orwell's 1984 imagined a world where meaning was destroyed through external oppression: surveillance, propaganda, the rewriting of truth by a totalitarian state. Aldous Huxley's Brave New World imagined the opposite — meaning drowned in pleasure, distraction, and engineered contentment. Both authors understood that something essential about human experience was at risk. But neither quite described what we are actually living through.
Orwell's dystopia isn't what's happening. No one is rewriting history at gunpoint. And Huxley was closer — we are certainly drowning in distraction — but he was off in a crucial way. His vision assumed people would surrender willingly to a designed utopia, choosing comfort over freedom. What he didn't anticipate is that we would want neither. We don't want to retreat to some pre-technological "natural world," and we don't want to be sedated by a frictionless paradise. What we want — what we have always wanted — is agency. The ability to choose. The capacity to imbue our own lives with meaning, rather than having meaning assigned to us.
And that is precisely what is being eroded. Not through force, and not through pleasure — but through a thousand small abstractions. When algorithms tell us what to buy before we know we want it. When platforms decide what we see, what we hear, what we should care about. When every app, service, and feed is engineered to reduce friction — which is another way of saying engineered to reduce the need to choose. When the world is constantly working to narrow our agency for the profitability of a corporation, something existential is at stake, even if it doesn't feel dramatic in any single moment.
Social media has made this worse in ways that are hard to overstate. We now live inside a constant stream of illusory noise — curated lives, manufactured outrage, algorithmic feeds designed to keep us engaged rather than informed. It doesn't oppress us like Orwell feared. It doesn't sedate us like Huxley imagined. It fragments us. It fills every quiet moment where meaning might have taken root with someone else's content, someone else's opinion, someone else's crisis. The space where a person might sit with themselves long enough to ask what do I actually want has been colonized by infinite scroll.
This is the meaning crisis. Not unemployment — that's an economic problem with economic solutions. The deeper issue is that we built a civilization that systematically abstracted away human agency — the very thing that allows people to make meaning for themselves — and replaced it with external systems that decide on our behalf. We assigned symbolic values to things. We engineered social outcomes. We optimized for engagement, productivity, and growth. But none of it was designed to help people become better at navigating their own lives, making their own choices, or understanding who they are without the labels the system gave them.
Now, on top of all of that, we are building machines that will outperform us in nearly every domain of productive work. If dignity depends on being useful, and machines are more useful, and we've already forgotten how to locate meaning outside of utility — then where does that leave the human being?
That question isn't hypothetical anymore. It's the lived experience of millions of people right now, whether they have the language for it or not.
What Work Meant Before Machines Could Do It Better
So if the old framework is broken — if tying dignity to output was always fragile, and AI just proved it — then what do we replace it with?
Before we can answer that, we have to confront something uncomfortable: we don't actually know how to think about this. Not because we're stupid, but because even the tools we'd use to figure it out have been commodified.
We pay for therapy to have someone listen to our problems and recite Freud and Jung back at us. We buy pre-packaged meals because cooking is a hassle. We go on world trips and post them on Instagram and still feel empty when we get home. We buy solutions to every problem in our lives — including the problem of not knowing what our lives are for.
The pattern is the same one Baudrillard identified with money. Meaning has become another thing we seek externally. Another product. Another experience to consume. And when it doesn't arrive — when the trip or the purchase or the career milestone leaves us feeling the same — the system tells us we just haven't found the right one yet. Keep searching. Keep buying. Keep scrolling.
But meaning was never out there. It was never a destination or a product or a thing to acquire. It is imbued — generated internally through awareness, choice, and the willingness to sit with yourself long enough to understand what matters. We are agents in this world. Not consumers of it.
You can live a complete life and die of old age as a non-contributor. Collect food stamps. Sleep in shelters. It's possible. But none of us would choose that by default — which tells you something. Meaning isn't innate. It isn't handed to you at birth. It has to be created. And the creation of it is an internal act, not an external transaction.
Hannah Arendt understood this. In The Human Condition, she distinguished between three fundamental activities: labor — the biological cycle of survival, private and repetitive; work — the creation of durable things that outlast us; and action — the public act of inserting yourself into the human story, of showing who you are through deeds and speech. Her warning, back in 1958, was that modernity was already collapsing all three into labor. We were becoming a society that could only produce and consume, with no space left for work that endures or action that means something.
That collapse has deepened. We didn't just reduce everything to production and consumption — we automated the production. What's left is pure consumption. And consumption without creation is hollow. It's the Instagram trip that doesn't fill the void. It's the paycheck that pays the bills but doesn't answer the question. It's scrolling at 2 AM looking for something you can't name.
Here's what's worth paying attention to: the thing that makes us human is also the thing machines are worst at.
If you've ever seen a reinforcement learning model learn to fight — two stick figures flailing at each other in a simulation — you know what I mean. The model runs through millions of training episodes. It learns through pure trial and error, adjusting its weights across millions of failed attempts. Eventually, after an incomprehensible amount of computation, it discovers something resembling a stance. Then a jab. Then basic strategy. It gets there, but it gets there blind.
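That blindness is easy to demonstrate at toy scale. Here's a minimal sketch — not the stick-figure fight, just tabular Q-learning on a six-cell corridor, with every name and hyperparameter chosen by me for illustration. A person glances at the task and knows the rule is "walk right." The learner has to stumble into it across thousands of episodes:

```python
import random

# A trivial task: a six-cell corridor with the reward at the far right.
# A human sees the rule instantly; the learner discovers it blind,
# one failed trial at a time. (Illustrative sketch only.)

N_STATES = 6          # cells 0..5; cell 5 is the goal
ACTIONS = [-1, +1]    # step left, step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

def greedy(q, s, rng):
    # Pick the highest-valued action, breaking ties at random.
    best = max(q[(s, a)] for a in ACTIONS)
    return rng.choice([a for a in ACTIONS if q[(s, a)] == best])

def train(episodes, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            # Epsilon-greedy: mostly exploit, occasionally explore.
            a = rng.choice(ACTIONS) if rng.random() < EPSILON else greedy(q, s, rng)
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0
            best_next = max(q[(s2, b)] for b in ACTIONS)
            q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
            s = s2
    return q

q = train(episodes=2000)
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)  # after thousands of blind trials: step right, everywhere
```

Two thousand episodes to learn what a human infers in one look. Scale the state space up from a corridor to a fighting game and the trial count explodes accordingly.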
A human watches one boxing match and already understands stance, guard, distance, timing. Not perfectly — but meaningfully. Because a human doesn't just record data. A human understands intent. You see someone shift their weight and you know they're about to throw a punch — not because you've seen a million weight-shifts, but because you know what it feels like to have a body, to want to hit something, to protect yourself. You bring your entire lived experience into that single moment of observation.
That's not processing speed. That's meaning. The ability to imbue experience with understanding — to abstract, to relate, to grasp why and not just what — is the thing that lets a human learn in ten attempts what a machine needs ten billion for. Meaning isn't a philosophical luxury. It's a superior learning mechanism. It's our actual edge.
And it's the one thing we stopped cultivating. We optimized for output — where machines will always eventually win — and neglected the capacity for meaning-making, which is where machines can't even begin. The irony is almost unbearable: the thing that makes us irreplaceable is the thing we abandoned.
The Journey of Self, Not the Identity of Output
So where does this leave us?
Not in despair, if we're honest about it. Possibly at the beginning of something necessary.
If machines are going to take over the tasks we built our identities around — and they will — then we have two options. We can fight it, cling to the old framework, and watch it crumble anyway. Or we can take it as the forced reckoning it is and finally ask the question we've been avoiding: who are you when the output stops mattering?
Nobody is born knowing how to answer that. No one arrives in the world with the capacity to look inward and make sense of what they find. It's a skill. And like every skill, it has to be practiced.
The practice is messy. Humans don't learn the way machines do — we don't converge on optimal solutions through clean iterations. We take wrong turns. We double back. We make irrational choices and follow impulses we can't explain. The path looks like spaghetti. But oddly enough, we still end up somewhere real. Sometimes exactly where we needed to be, through a route no algorithm would have chosen. That's not a flaw in human cognition. That's what it looks like when meaning is being made in real time — not optimized, but lived.
And as you practice — as you sit with the discomfort of not knowing, as you learn to stop reaching for the next purchase or the next post or the next distraction — something starts to clarify. You begin to understand what it means to be an agent in the world. Not a consumer. Not an output machine. An agent. Someone who acts from understanding rather than from habit, from intention rather than from algorithm.
You start to see where technology fits in your life instead of the other way around. You stop asking "what am I worth?" and start asking "what am I doing here?" — not in the cosmic sense, but in the Tuesday afternoon sense. The mundane, lived, specific sense.
That has to happen individually first. There is no shortcut. You can't legislate self-knowledge. You can't automate it. You can't buy it in a course or find it in a feed. It starts with one person sitting with themselves long enough to stop performing and start noticing.
But it doesn't end there. Once you develop that capacity in yourself, you start to recognize it in others — and its absence. You see your community differently. You understand what people actually need versus what they're being sold. And from that understanding, action becomes possible. Not action as productivity. Action as Arendt meant it — the act of stepping into the shared world and doing something that matters because you know why it matters to you.
Self. Then community. Then society. That's the order. Skip a step and you get performative activism, empty policy, or another app that promises to fix your life for $9.99 a month.
The social contract needs to be rewritten. Not just economically — philosophically. Dignity can't be tied to output anymore, because the machines will always produce more. Merit can't be the sole measure of human worth, because the game is rigged when your competitor doesn't sleep. What we need is a framework where dignity is tied to something machines will never have: the understanding of human essence, developed first in the self, then extended to the people around you, then carried into the world.
That is what I'm trying to work toward with SortaLogic. Not just building software — thinking clearly about what's worth building and why. Every project, every essay, every experiment on this site comes back to the same question: how do we build things that help people become more conscious, more intentional, and more capable of authoring their own meaning?
I don't have the answer yet. I'm not sure anyone does. But I believe the work of looking for it — the messy, nonlinear, deeply human work of it — is the most important thing we can be doing right now.
The machines can handle the rest.