We Are All Impostors Now — And That's Fine
Half the workforce is hiding their AI usage from their boss. The other half is pretending they don't need it. Both are wrong — and the sooner we accept that, the sooner we can focus on what actually matters.
According to Microsoft's 2024 Work Trend Index, 52% of people who use AI at work are worried their boss will think they're less skilled. A Salesforce study from the same year found that roughly half of all employees who use generative AI don't tell their manager. A Glassdoor survey put the number higher: 70% of professionals using AI tools at work had never mentioned it to their boss.
We are, collectively, hiding the most powerful productivity tool in a generation — because we're ashamed of using it.
I think that shame is the most important signal of our time. Not because it's justified, but because it tells us exactly where we are in the adoption curve: early enough that using AI still feels like cheating. Late enough that not using it is already a competitive disadvantage.
This is the impostor moment. And I think we need to lean into it.
Here's what I see happening around me.
Developers who've never built a full-stack application are shipping production-ready products. Marketers who couldn't write a line of code are automating their own workflows. Designers are prototyping interactive experiences that, eighteen months ago, would have required a team of three.
People are punching above their weight class. Dramatically. And they feel guilty about it.
I understand the instinct. When you produce something that exceeds what you could have done alone — using a tool that does the heavy lifting — it triggers something deep. A voice that says: this isn't really mine. I didn't really earn this. Someone will figure out that I don't actually know what I'm doing.
That voice has a name. It's called impostor syndrome. And right now, it's everywhere.
But here's what I think we're forgetting.
We've always outsourced capability. Every major leap in human history has been about extending what one person can do — and every single time, it felt like cheating to the generation that lived through it.
The steam engine scaled our muscles. Before the Industrial Revolution, physical strength was a core professional asset. Blacksmiths, farmers, builders — their value was literally in their bodies. The steam engine didn't just change manufacturing. It changed what it meant to be capable. Agricultural employment in England fell from 36% of the workforce in 1800 to under 9% by 1900. Yet total employment more than doubled. The jobs that replaced farm labor were higher-skilled, better-paid, and entirely new.
The calculator scaled our arithmetic. I'm old enough to remember teachers who insisted we shouldn't use calculators because "you need to understand the math." They weren't wrong about understanding. But the students who embraced the calculator didn't become worse at math. They spent less time on computation and more time on comprehension. The tool didn't replace thinking. It freed up space for more of it.
The internet scaled our memory. Google made it unnecessary to hold every fact in your head. Wikipedia made it possible to learn anything in minutes. We stopped memorizing and started lazy-loading knowledge — pulling in what we needed, when we needed it. That felt uncomfortable at first. "You're not really learning if you're just Googling it." But what it actually did was free our cognitive bandwidth for synthesis, judgment, and creativity — the things that matter more than recall.
Now AI is scaling our cognition itself.
There's a scene in The Matrix that captures this better than any business case ever could. The team is trapped on a rooftop. Below them, agents closing in. Their only way out is a military helicopter parked on the adjacent building. Neo turns to Trinity: "Can you fly that thing?"
"Not yet."
Back in the real world, Tank slams a cartridge into the loading rig and punches the upload key. A flood of data streams directly into her cerebral cortex — rotor dynamics, avionics systems, combat flight maneuvers — years of pilot training compressed into a few seconds of raw neural transfer.
Her eyes snap open. "Let's go."
That scene was science fiction in 1999. In 2026, it's a metaphor that's getting uncomfortably close to reality.
When a developer uses AI to architect a system beyond their current expertise — and then ships it, debugs it, and maintains it — they're doing their version of Trinity's upload. When a founder uses AI to write a financial model they couldn't have built from scratch — and then uses it to raise capital and build a company — they're grabbing the cyclic and flying.
The capability is real. The output is real. The fact that a tool accelerated the acquisition doesn't make it fake.
And the pattern is the same. Every time we've outsourced a human capability to a tool, we've freed ourselves to operate at a higher level. Not lower. Higher.
Peter Diamandis captured this well: "Technology is a resource-liberating mechanism. It can make the once-scarce the now-abundant." That's not just about physical resources. It's about cognitive ones too.
The "fake it till you make it" economy has always existed.
Let's be honest about something: careers have always been built on a foundation of performing competence while developing it in real time. The junior consultant who presents a strategy they barely understand. The new manager who leads a team without ever having managed anyone. The entrepreneur who pitches a vision for a product that doesn't exist yet.
"Fake it till you make it" isn't a bug in the system. It's the system. It's how humans have always operated at the frontier of their capability — by reaching beyond what they currently know and growing into the gap.
AI didn't invent this dynamic. It amplified it. Radically.
What used to be a small stretch — presenting slightly beyond your competence — is now a quantum leap. A solo founder can build what used to require a team of twenty. A marketing manager can execute a strategy that used to need an agency. A student can produce work that competes with a decade of experience.
That's not fraud. That's progress.
The question is never whether the tool is doing part of the work. The question is whether you know what you're building and why.
But the transformation years will be uncomfortable.
I'm not going to pretend this transition is painless. We are in what I've called The Fifth Acceleration — after speech, writing, the printing press, and the internet. Each of those previous accelerations reshaped civilization. Each one created a generation that felt disoriented, displaced, and unsure of its own value.
We're in that disorientation now.
During these transformation years, millions of professionals will use AI to produce work that exceeds their unaided capability — and they'll feel like impostors for it. They'll wonder if their contributions are "real." They'll worry about being exposed. They'll hide their AI usage like a secret vice.
This is normal. This is the psychological cost of a paradigm shift. And it will pass.
It will pass because the alternative — refusing to use the tool — will become untenable. The World Economic Forum projected that AI and automation would displace 85 million jobs by 2025, but create 97 million new ones. A Deloitte study found that technology destroyed approximately 800,000 jobs in the UK over fifteen years — but created 3.5 million new ones, with average wages £10,000 per year higher than the jobs that were lost.
The pattern is consistent and it is old: technology doesn't eliminate work. It elevates it.
Two kinds of impostors. One critical difference.
Here's where the nuance matters.
As AI makes everyone's output look more polished, more professional, and more competent, it becomes genuinely harder to tell an expert from a novice. Both can produce impressive work. Both may feel impostor syndrome. But only one can give you reliable results under pressure.
The expert who uses AI has judgment. They know when the output is wrong. They know which questions to ask. They know when to override the tool and when to trust it. Their impostor syndrome is misplaced — they're not faking expertise. They're amplifying it.
The novice who uses AI has output. They can produce something that looks right. But they can't defend it, iterate on it, or know when it's leading them off a cliff. Their impostor syndrome is also misplaced — but in the other direction. They're not as competent as their output suggests.
Both will feel fake. But the gap between them is real and growing.
This is why AI literacy isn't optional anymore. It's becoming the skill that separates the two. Not whether you use AI — everyone will. But how you use it. Whether you use it as a crutch or as a force multiplier. Whether you're directing the tool or being directed by it.
We don't automate craftsmanship — we amplify it. That principle has never been more relevant.
We've been here before. Every time, we've risen.
The printing press was going to destroy knowledge because people would stop memorizing sacred texts. It didn't. It created the Renaissance, the Scientific Revolution, and universal literacy.
The loom was going to destroy livelihoods. The Luddites smashed machines in protest. Instead, the textile industry expanded fiftyfold and created entirely new supply chains.
The internet was going to make us stupid — Nicholas Carr literally wrote a book called The Shallows about it. Instead, the internet created an entire economy of roles that didn't exist before: UX designer, data scientist, social media manager, cloud architect, content creator, growth hacker, DevOps engineer. The McKinsey Global Institute found that the internet created 2.6 jobs for every single job it displaced.
Dell and the Institute for the Future estimated that 85% of the jobs that will exist in 2030 haven't been invented yet.
Eighty-five percent.
We can't see the new jobs yet. We never can, during the transition. The farmer in 1850 couldn't imagine "software engineer." The factory worker in 1990 couldn't imagine "influencer." The journalist in 2005 couldn't imagine "prompt engineer."
But they came. They always come. And they're always further up the value chain.
So here's what I'd say to anyone feeling like an impostor right now.
You're not faking it. You're doing what humans have always done — reaching for a tool that extends your capability beyond what your bare hands could achieve.
The blacksmith who used the steam hammer wasn't less of a craftsman. The mathematician who used the calculator wasn't less of a thinker. The researcher who used Google wasn't less of a scholar. And you, using AI to build something you couldn't have built alone — you're not less of a professional.
You're more of one. Because you had the judgment to pick up the tool.
The real risk isn't in using AI. It's in refusing to — and watching the world accelerate past you while you protect a version of competence that was always more performance than substance.
The horizons are opening.
Look back at the grand sweep of human history. Our entire journey as a species can be defined by a series of exponential accelerations — each one extending what a single person could do, each one feeling like the end of something, each one turning out to be the beginning of something bigger.
We scaled our muscles with machines. We scaled our arithmetic with calculators. We scaled our memory with the internet. Now we're scaling our minds with AI.
The impostor syndrome you're feeling right now? It's the growing pain of becoming something new. It's the discomfort of operating at a level you haven't fully internalized yet. It's the gap between what you can now produce and what you believe you deserve to produce.
That gap will close. Not because AI gets weaker, but because you'll grow into the capability it gives you. Just like every generation before you grew into theirs.
So use the calculator. Use the search engine. Use the AI.
And see what new horizons open up when you stop apologizing for the tools that make you extraordinary.