Computational Thinking is the New Literacy
Why software engineering still matters in an AI world
Welcome to Unknown Arts, where builders navigate AI's new possibilities. Ready to explore uncharted territory? Join the journey!
OpenAI's recent o3 demo sent shockwaves through the software engineering community. For many programmers, it was their first visceral encounter with AI that was clearly smarter than they were. Social media filled with engineers questioning their future, wondering if they would soon be obsolete. But watching the demo, something else struck me: the prompts themselves were dense with software engineering knowledge that no non-engineer could have written.
Take the first, seemingly simple prompt from the demo. In the span of a short paragraph, it assumed an understanding of servers, HTML forms, API requests, file operations, execution environments, and authentication systems. Years of accumulated software knowledge compressed into a few lines of text. Without software training, you wouldn't know to specify these pieces. You might not even realize they're needed.
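To make that concrete, here's a rough sketch of the kind of system such a prompt describes. This is not the actual demo code - just a hypothetical illustration of how many concepts one "simple" request quietly bundles together: a server, an HTML form, a file write, and a bare-bones auth check.

```python
# Hypothetical illustration, not the o3 demo code: a tiny web app that
# bundles a server, an HTML form, a file operation, and an auth check.
from flask import Flask, request, abort

app = Flask(__name__)
API_TOKEN = "change-me"  # assumed shared-secret token, for illustration only

FORM_PAGE = """
<form method="post" action="/submit">
  <input type="hidden" name="token" value="change-me">
  <textarea name="note"></textarea>
  <button type="submit">Save</button>
</form>
"""

@app.route("/")
def index():
    # Serve the HTML form.
    return FORM_PAGE

@app.route("/submit", methods=["POST"])
def submit():
    # Authentication: reject any request without the expected token.
    if request.form.get("token") != API_TOKEN:
        abort(401)
    # File operation: append the submitted note to local storage.
    with open("notes.txt", "a", encoding="utf-8") as f:
        f.write(request.form.get("note", "") + "\n")
    return "Saved."

if __name__ == "__main__":
    # Execution environment: a local development server on port 5000.
    app.run(port=5000)
```

Every line corresponds to a decision someone had to know to ask for - which is exactly the knowledge a prompt like that compresses.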
What we're witnessing isn't the end of software engineering - it's an unbundling of the craft (yes, I know, I'm on an unbundling kick lately). AI is separating the mechanical act of writing code from the deeper art of thinking about systems. As implementation becomes automated, the human value shifts decisively toward understanding how software should work.
Hacking it together won't scale
Before my first software development job, I had only worked on small projects - solo or with a couple of other people. It was relatively easy to keep everything in my head, and the stakes were low. It didn't matter much if an application failed for a bit - it was just a small demo. But then I started working on real business software with many engineers and complex systems. The difference was stark. I discovered that hacking together something I didn't fully understand carried serious risks: breaking critical product features, creating maintenance headaches that wasted other engineers' time, and introducing technical debt that compounded over time.
I'm certain many people will figure out how to get AI to build software without any engineering knowledge. And for simple, personal projects, that might work fine. But running a software business isn't like shipping a static piece of media - it depends on maintaining and growing systems in the wild. Just as I learned in that transition from demos to professional products, there's a vast difference between getting something to work and creating something that holds up under ongoing real-world use.
Without deep systems understanding, you'll end up with a fragile patchwork of code that becomes increasingly difficult to manage. Each change becomes a game of prompt roulette, hoping the AI has enough context to avoid breaking something else. It's like building a house with no understanding of architecture - you might get four walls and a roof, but good luck adding a second story later.
As machines think like us, we must think like machines
There's an interesting paradox emerging: the more AI learns to "think" like humans, the more valuable it becomes for humans to understand how machines think. This suggests we need to rethink what technical literacy means in an AI world. As AI handles implementation details, our focus shifts to higher-level computational thinking - understanding system architecture, recognizing patterns and anti-patterns, knowing what questions to ask, and spotting potential problems before they emerge.
The power of AI tools like o3 is remarkable, but wielding them effectively requires more than just a few simple prompts. The patterns that make software maintainable, the principles that guide good architecture, the wisdom to plan for scale - these insights emerge from experience. They come from thinking deeply about systems, from seeing patterns across projects, and from learning through iteration and failure.
For creative professionals like designers, the old question "should I learn to code?" transforms into "should I learn to think like an engineer?" The syntax becomes less important, but the mental models become more valuable than ever. Understanding how software systems work - their possibilities and constraints - becomes crucial for anyone hoping to create in this new landscape.
Final thoughts
Those initial reactions to the o3 demo reflected a natural anxiety about change. But they missed a crucial point: as AI grows more capable at implementation, computational thinking becomes more valuable, not less. It's evolving from a specialized skill into a fundamental literacy for creating in a world where understanding systems matters more than ever.
The future belongs to those who can think most clearly about how software should work.
Until next time,
Patrick
Interested in working together? Check out my portfolio.
Find this valuable?
Share it with a friend or follow me for more insights on X, Bluesky, & LinkedIn.