Your Brain on Autopilot

When was the last time you drove somewhere new without using GPS?

You probably got there just fine. But could you find your way back without it? Could you actually describe the route to a friend?

There is this strange trade-off we have all quietly made. We get perfect navigation in exchange for having no real idea where we are. We get to our destination, but we couldn't draw a map if our lives depended on it.

Now, here is the uncomfortable question. What if the same thing is happening to the way you think?

The Offloading Effect

Back in 2011, long before ChatGPT was a thing, researchers at Columbia published a study that probably should have been a warning. They found that when people expect information to be easy to find, they just don't bother remembering it. We had already started treating Google like an external hard drive for our brains.

And honestly? This wasn't laziness. It was efficiency. Why memorize something you can look up in three seconds?

But here is what has changed. AI doesn't just store information for us. It thinks for us. And new research suggests this actually matters.

A 2025 study of 666 participants found a negative link between frequent AI use and critical thinking skills. The key finding wasn't just that AI users did worse on reasoning tasks. It was why. Cognitive offloading, the act of handing off mental work, was the connecting factor. The more you delegate, the less you exercise.

This follows a pretty basic brain principle. Our brains are plastic, meaning they are constantly rewiring based on what we actually do. Pathways we use get stronger. Pathways we ignore fade away. When we stop working through problems ourselves, we stop building the circuitry for working through problems. Simple as that.

The Hollowed Mind

Here is where things get more interesting than just saying "AI makes us dumb."

Philosophers Gary Klein and Natalie Klein introduced a concept they call the "hollowed mind." Their argument goes like this. Older technologies let us offload memory. Generative AI lets us offload thinking itself. It takes away the hard, effortful process of wrestling with ideas, making connections, and sitting with uncertainty.

The result isn't that we know less. It's that we are losing the capacity to work through not knowing. We skip the productive struggle that actually builds understanding.

Think about it. When you ask ChatGPT to explain something, you get an answer. What you skip is the fifteen minutes of confusion, the three wrong guesses, and the gradual "oh... I see" that actually cements things in your head. The answer shows up, but the cognitive workout never happens.

The Kleins call this the "sovereignty trap." You feel like you are in control. You are the one asking questions, judging answers, and making decisions. But you are increasingly dependent on a system doing the real mental work underneath. You keep the illusion of control while the actual ability quietly fades.

You Don't Know What You Don't Know

Here is the part that should really give you pause. There is evidence we can't even tell this is happening.

Research from the Korn Ferry Institute points to a metacognitive gap. That is a disconnect between what people actually know and what they think they know. When AI fills in your knowledge gaps seamlessly, you lose the signal that would normally tell you something is missing. That struggle that reveals your own ignorance? Gone.

You might feel just as sharp as ever. You might even feel sharper. You are getting more done, answering questions faster, and producing more output. But if you suddenly lost access to these tools, you might find a version of yourself with fewer mental resources than you had five years ago.

This isn't a prediction. It is a possibility worth taking seriously.

The Question That Actually Matters

So is AI making us dumber? That is the wrong way to look at it. A better question is what cognitive trade-offs we are making, and whether we are making them on purpose.

GPS isn't bad. But there is a difference between using it as a tool while still paying attention to where you are, and using it as a total substitute for awareness. One leaves you capable. The other leaves you dependent.

The same goes for AI. Call it the aid-versus-crutch test. A thinking partner challenges your ideas, fills gaps you are aware of, and speeds up work you actually understand. A thinking crutch does the cognitive work you have stopped doing yourself, whether you realize it or not.

The difference comes down to attention and intention. It is choosing to struggle with a problem before asking for help. It is noticing when you are outsourcing understanding, not just outsourcing labor.

Nobody is going to stop using AI, and they shouldn't. But there is a version of the next decade where we arrive at destinations without knowing where we are, holding beliefs we didn't reason our way toward, and reciting answers to questions we never really understood.

And there is another version where we use these tools deliberately, keeping our cognitive muscles in shape and staying in charge of our own minds.

The GPS can stay on. Just maybe look out the window once in a while.