The tools got better faster than most people predicted. Agentic coding assistants that write production-grade SQL from a prompt, pipelines that scaffold themselves, dashboards generated from a plain-English description. Genuinely impressive. Also genuinely beside the point if you don't know what you're looking at when they finish.

AI is a multiplier. That's the part the hype cycle skips. A multiplier is only as good as the thing it multiplies, and that thing is your actual competency. Someone who understands window functions and gets an AI to write them faster is operating at a different altitude than someone who doesn't know what a window function is and is hoping the output looks right. Both got an answer. Only one of them can tell the difference.
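To make the window-function case concrete, here's a minimal sketch using Python's built-in `sqlite3`. The table and columns are invented for illustration; the point is that a window function ranks rows within partitions without collapsing them the way a GROUP BY would:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")

# RANK() OVER (...) partitions by customer and orders within each
# partition -- every input row survives, unlike an aggregation.
rows = conn.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()

print(sorted(rows))
# [('a', 10, 2), ('a', 30, 1), ('b', 20, 1)]
```

If you can read that and say why the output has three rows instead of two, you can verify an AI's version of it. If you can't, you're trusting it.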


SQL isn't going anywhere because the problems it solves aren't going anywhere. Relational data is still how most organizations store the things that matter — transactions, users, events, financials. The query language sitting on top of it has been stable for forty years not because nobody tried to replace it, but because it maps cleanly to how analysts actually think about data: filter it, join it, group it, measure it. An AI that can write that query still needs you to know whether the GROUP BY was correct, whether the join condition introduced duplicates, whether the filter excluded a partition you needed.

Python doesn't go away either. Not because it's irreplaceable in some mystical sense, but because the work it does — transformation, automation, API calls, custom logic — still has to happen. Offloading the syntax doesn't offload the understanding of what you're asking for. If you hand a prompt to a model and can't read the output, you're not working faster. You're just accumulating technical debt you can't see.


The pipeline work is more subtle. Data engineering tooling moves fast, and AI tooling moves on top of it. But the underlying concepts — how data flows from source to destination, where transformations belong, what idempotency means and why it matters when something breaks at 2am — those don't change because the scaffolding got smarter. They're actually more important now. When an AI-generated pipeline fails, someone has to read the stack trace and understand what it's telling them.

The analysts and engineers who compound hardest over the next few years aren't the ones waiting to see what happens. They're the ones who built the foundation and can now move ten times faster because they understand what they're delegating. Prompting is a skill. Reading AI output critically is a skill. Both are useless without the underlying knowledge they're supposed to accelerate.

AI is a force multiplier. Zero times anything is still zero.