I've been building systems since 1998. That's over 25 years of watching technology evolve, adapt, and occasionally get completely revolutionized. But nothing—and I mean nothing—has prepared me for the shift we're experiencing with AI and agentic systems.
Looking back at where we started and where we are now feels like examining two completely different worlds. Let me take you on that journey.
The Stone Age of Web Development (1998-2005)
When I started building "web applications" in 1998, that term was generous. We didn't really have web applications—we had static sites with the occasional form submission. JavaScript barely existed in any meaningful way. Every bit of logic had to live server-side, and user interactions meant full page reloads.
But the real pain point? Data storage. Practical open-source database options simply weren't available to me. I was stuck using DBM key/value stores, which sounds simple until you realize there was no indexing, no searching of values, and no structured data support. Before JSON or YAML existed, I was manually parsing CSV and pipe-delimited strings just to store basic field data. Reading values was tedious; updating them was an absolute nightmare.
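To give a flavor of what that looked like in practice, here's a minimal sketch using Python's standard dbm module (the real thing was Perl, and the customer records here are purely illustrative): every read means splitting a raw string by hand, every update means rewriting the whole value, and every "search" means scanning every key.

```python
import dbm

# Roughly the 1998-era workflow described above, sketched with Python's dbm
# module. The original work was in Perl; this record layout is hypothetical.
FIELDS = ["name", "email", "plan"]  # the "schema" exists only by convention

with dbm.open("customers.db", "c") as db:
    # Write: serialize a record by hand as a pipe-delimited string.
    db["cust:1001"] = "|".join(["Ada Lovelace", "ada@example.com", "pro"]).encode()

    # Read: fetch raw bytes and split them back into fields yourself.
    record = dict(zip(FIELDS, db["cust:1001"].decode().split("|")))
    print(record["email"])

    # Update: no partial updates exist, so rewrite the entire value.
    record["plan"] = "enterprise"
    db["cust:1001"] = "|".join(record[f] for f in FIELDS).encode()

    # "Search": no indexes, no queries on values -- scan every key and parse again.
    enterprise_keys = [
        key for key in db.keys()
        if db[key].decode().split("|")[2] == "enterprise"
    ]
```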
Most hosting providers didn't even allow CGI scripts, let alone support web applications. The entire hosting industry was built around static HTML. That's why I started Make-Tracks Secure Hosting—someone had to solve the web app hosting problem. We were building everything with Perl and Apache using mod_perl, crafting solutions that today's developers would find archaic.
The Great Paradigm Shifts
Over the next two decades, I witnessed several fundamental shifts that reshaped how we build systems:
Data storage evolution: First came SQL databases, then local browser storage, job queues, cloud storage buckets, and countless other storage paradigms. Each solved different problems and opened new possibilities.
The JavaScript revolution: This was the complete game-changer. JavaScript enabled Single Page Applications (SPAs) and full REST interactions through Ajax. Suddenly, we could build truly dynamic web experiences. The entire web shifted from static pages to interactive applications.
Hosting explosion: From practically no hosting options for web apps to an overwhelming variety of cloud services, containers, and deployment strategies. The infrastructure problem went from scarcity to abundance.
Yet through all these changes, some principles remained constant: creating good UX has always been paramount. How we deliver that experience has changed dramatically, but putting the user first never wavers. Systems that scale remain essential—though cloud services have completely transformed how we achieve scalability. Security has always mattered, but the requirements have increased exponentially as attack vectors multiplied.
Learning from Wrong Bets
Not every technology choice was perfect. I chose the Dojo JavaScript framework over jQuery because Dojo was clearly superior—better architecture, more features, cleaner implementation. But the industry chose jQuery. My "obviously better" choice became limiting, and I eventually had to migrate projects to jQuery.
I thought the Apple Newton would be a game-changer. The concept was absolutely right, but it arrived too early; the technology wasn't ready for mass appeal. Both experiences taught me that the best technology doesn't always win. Market adoption, timing, and ecosystem matter as much as technical excellence.
The AI Awakening
My transition to AI started with ChatGPT, though I quickly switched to Claude as a far superior model. Almost immediately, I realized that trivial coding challenges could be automated and difficult problems became much easier to analyze. But the real revelation was how AI offered approaches outside my normal thinking patterns, letting me tackle problems I would previously have avoided as too complex.
AI started suggesting elegant solutions to async timing issues and race conditions—problems I would have traditionally architected around rather than through. Now I could embrace more effective approaches because AI could handle the complexity.
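To make that concrete, here's a small hypothetical example (not taken from any real project of mine) of the kind of fix that used to feel risky and now feels routine: instead of restructuring everything to avoid concurrent access, you guard the critical section and go straight through it.

```python
import asyncio

# Hypothetical race condition: concurrent coroutines read-modify-write a shared
# balance, and an await between the read and the write lets updates get lost.
class Account:
    def __init__(self) -> None:
        self.balance = 0
        self._lock = asyncio.Lock()  # the "through, not around" fix

    async def deposit(self, amount: int) -> None:
        async with self._lock:       # serialize the critical section
            current = self.balance
            await asyncio.sleep(0)   # an await point where interleaving could occur
            self.balance = current + amount

async def main() -> None:
    account = Account()
    # Without the lock, these concurrent deposits clobber each other's writes.
    await asyncio.gather(*(account.deposit(1) for _ in range(1000)))
    print(account.balance)  # reliably 1000 with the lock in place

asyncio.run(main())
```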
Building Beyond Better: A Saturday Morning Inspiration
I had been using other AI coding assistants, and while they were conceptually great, they created new problems—conversation drift, losing focus, cognitive overhead. I knew something better was possible.
One Saturday morning, I woke up with a solution swirling in my head. By Sunday night, I had a working prototype. Within a week, I had something others could try. That prototype became Beyond Better.
The key insight was avoiding the "tools explosion" problem. While MCP (Model Context Protocol) has amazing industry adoption, it suffers from overwhelming choice. When each cloud service provides a dozen tools and you're working with multiple services, the LLM faces the excessive cognitive load of managing 50+ tools.
Beyond Better takes a different approach: a small set of carefully crafted "resource" tools (load, edit, write, find, etc.) that work with a standardized data source system. Whether you're working with the filesystem, Notion, Google Drive, databases, or any cloud service, the LLM only needs to master fewer than a dozen tools. It's the difference between learning 50 specialized hammers and mastering one excellent hammer that works with any nail.
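Here's a rough sketch of the shape of that idea, in Python and for illustration only (these are not Beyond Better's actual interfaces): a handful of resource verbs, with every data source adapting itself to that same small surface.

```python
from pathlib import Path
from typing import Protocol

# Illustrative only -- not Beyond Better's actual API. The LLM learns a few
# resource verbs; each data source adapts itself to the same small interface.
class DataSource(Protocol):
    def load(self, uri: str) -> str: ...
    def write(self, uri: str, content: str) -> None: ...
    def find(self, pattern: str) -> list[str]: ...

class FilesystemSource:
    """One concrete adapter; a Notion or Google Drive adapter would mirror it."""

    def load(self, uri: str) -> str:
        return Path(uri).read_text(encoding="utf-8")

    def write(self, uri: str, content: str) -> None:
        Path(uri).write_text(content, encoding="utf-8")

    def find(self, pattern: str) -> list[str]:
        return [str(p) for p in Path(".").rglob(pattern)]

def load_resource(source: DataSource, uri: str) -> str:
    # The "tool" the model calls stays the same regardless of what sits behind it.
    return source.load(uri)
```

Whether the adapter talks to a local disk or a remote API, the model still only ever sees load, write, and find.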
The New Paradigm: Beyond Web Applications
Comparing agentic systems to traditional web development isn't really a comparison—it's a completely new paradigm. The traditional web, as impressive as it became, is becoming less relevant in this new age of AI, though that transition will be so gradual many won't recognize it for years.
The current web is a visual interface that will be supplanted by AI agents and personal assistants, most likely voice-driven and supported by dynamic visuals. Your phone won't have dozens of different apps—it will have a single AI agent that dynamically creates user interfaces based on your current task.
I've witnessed massive changes over the decades—smartphones, cloud services, smart homes. But those were incremental compared to AI. As systems migrate to AI-first design and people interact primarily through personal assistants, our current technology will become irrelevant in its present form.
What Transfers Forward
The transferable skill isn't knowing specific technologies—it's understanding how systems operate and how users interact with those systems. AI now handles the technical implementation, but deep understanding of core problems and recognizing elegant solutions remains crucial.
Learning AI has actually felt natural to me. It's the next step in a career spent watching technology change since the 1980s. Prompt and context design take practice, but that's just common sense combined with decades of experience understanding problems and recognizing good solutions.
Advice for the Next 25 Years
Developers need to understand core problems and where true value can be added; that means paying attention to the human element. Ask what humans really want, not what they've merely grown used to over the past decades.
Voice interaction has always been natural for humans. Combine that with anything that engages our senses: visuals, sound, tactile feedback. The need for keyboards, trackpads, and touchscreens will approach zero, and AI will handle the increasingly sophisticated technical work required to enable it.
Focus on the human part of the equation. Think about the least-friction solutions for humans. Learn to solve big-picture problems, not just apply technology solutions. Use critical thinking and understand human behavior.
The technology will continue evolving—that's guaranteed. But understanding people, problems, and elegant solutions? That skill remains as valuable as ever.
Twenty-five years from now, today's AI will look as quaint to us as those DBM key/value stores from 1998 look to me now. The tools will change, but the need for thoughtful problem-solvers who understand humans will remain constant.