There's a persistent anxiety in educational circles right now: if AI can write code, translate languages, summarize books, and produce students' homework and papers, will we forget how to think?

Tools don't think for you. They think with you.

When you use a notebook to remember things, your memory doesn't weaken—it extends. The notebook becomes part of your cognitive system. Philosophers call this the "extended mind," and it's not a metaphor. Your intelligence has never been confined to your skull. It lives in the tools you use, the environments you create, the conversations you have.

This matters because it reframes the AI question entirely.

The issue isn't whether AI changes learning—obviously it does. The issue is whether AI-mediated learning produces something less valuable than traditional learning. And the answer, if you look at the evidence rather than the anxiety, is no.

What you actually need to know

Here's what's interesting about programming: the people who insist you must learn to code "by hand" are making an assumption about transfer—the idea that mastering low-level skills automatically gives you high-level understanding. But decades of cognitive research show this almost never happens. Learning Python syntax doesn't magically make you good at system design. Learning algorithms doesn't transfer to user experience thinking. These are different domains that share few cognitive elements.

If your goal is to build software that works, practicing software design is more useful than practicing syntax. And AI lets you engage with design-level problems immediately instead of spending years building procedural fluency first.

The expertise you need isn't typing code character by character. It's:

  • Understanding what you're trying to build and why
  • Knowing when a solution actually solves the problem
  • Recognizing when something will break at scale, or when it's maintainable, or when it's the wrong approach entirely
  • Being able to test and verify that your system does what it should

None of this requires memorizing syntax. All of it requires thinking.
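That last point—verifying that a system does what it should—can be made concrete. A minimal sketch in Python (the function and its name are hypothetical, invented for illustration): the assertions specify what "correct" means, independently of who or what wrote the function body.

```python
def normalize_scores(scores):
    """Scale a list of numbers into the range [0, 1] (hypothetical example)."""
    lo, hi = min(scores), max(scores)
    if lo == hi:
        # Edge case: all-equal inputs would otherwise divide by zero.
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

# These checks define the problem. Whether a human or an AI produced
# the implementation above, it either passes them or it doesn't.
assert normalize_scores([10, 20, 30]) == [0.0, 0.5, 1.0]
assert normalize_scores([5, 5]) == [0.0, 0.0]
```

Writing the assertions requires understanding the problem—including the edge case—while writing the arithmetic is exactly the part a tool can supply.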

The difference between using and understanding

Obviously you can use AI stupidly. You can accept suggestions without thinking, treat it like a magic box that solves problems you never bothered to define clearly. But you can also use textbooks stupidly. You can memorize formulas without understanding principles. You can follow tutorials without comprehending systems.

The tool isn't the problem. Thoughtlessness is the problem.

When you work with AI intentionally—when you use it to explore possibilities, test ideas, get rapid feedback on approaches—you're engaging in exactly the kind of cognitive apprenticeship that educational researchers have valued for decades. You're seeing expert patterns, evaluating them critically, adapting them to your context, learning through dialectical engagement. The cognitive demand doesn't disappear. It shifts from recall and execution to evaluation and orchestration. That's not easier. It's arguably harder—and definitely more aligned with what actual work looks like.

Will most people use AI to avoid thinking? Probably.

But most people avoided thinking before AI too. They memorized test answers without understanding concepts. They copied homework. They optimized for grades over learning. This isn't new. Tools amplify intention. If your intention is to learn, AI makes you smarter. If your intention is to avoid learning, AI makes that easier too. But that's a problem with intention, not with AI.

Less, but better

At DigTek, we operate on a principle: less, but better. Constraints force clarity. Simplicity reveals complexity.

AI is just another constraint—or more precisely, a tool that helps you navigate constraints more effectively. It doesn't eliminate the need to think clearly about what you're building and why. It makes that thinking more important, not less.

The real question isn't "will AI make us stupid?" The real question is: "Are we willing to think carefully about what we're doing, or will we let thoughtlessness masquerade as productivity?"

AI just makes the answer more visible.


The challenge isn't resisting AI. The challenge is cultivating the intellectual habits that make AI a learning partner rather than a crutch: curiosity about why systems work, skepticism toward easy answers, commitment to understanding over completion. These habits have always been the foundation of deep learning. AI simply makes their importance more apparent—and more urgent.