The narrative went like this: AI gets smarter, data jobs disappear. Fewer analysts, fewer engineers, fewer people staring at CSVs at 11pm. It was a clean story. It was also wrong. Data job postings have climbed every quarter for two years running, and the roles emerging now are more technical, not less.
The confusion comes from conflating the tool with the work. AI automates tasks — specific, bounded, repeatable tasks. It does not automate the judgment about which tasks matter, the infrastructure that makes models reliable, or the monitoring that catches them when they fail silently. Those are all still human jobs. In most organizations, they're understaffed ones.
Every production AI system has a data supply chain behind it. Someone designed the schema. Someone built the pipeline that feeds it. Someone wrote the validation logic that catches bad rows before they corrupt a training run. Someone checks model drift when predictions start going sideways six months post-launch. None of that is automated. All of it is growing faster than hiring can keep up.
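Two of those jobs, row validation and drift monitoring, are concrete enough to sketch. The following is a minimal illustration, not anyone's production code; the column names, bounds, and the three-sigma drift threshold are all hypothetical:

```python
import statistics

def validate_row(row: dict) -> list[str]:
    """Return a list of problems with a row; an empty list means it passes.
    Column names and bounds here are hypothetical placeholders."""
    problems = []
    if row.get("user_id") is None:
        problems.append("missing user_id")
    amount = row.get("amount")
    if not isinstance(amount, (int, float)) or not (0 <= amount <= 1_000_000):
        problems.append(f"amount out of range: {amount!r}")
    if row.get("event_ts") is None:
        problems.append("missing event_ts")
    return problems

def split_rows(rows):
    """Partition rows into (clean, rejected) before they reach training."""
    clean, rejected = [], []
    for row in rows:
        issues = validate_row(row)
        if issues:
            rejected.append((row, issues))
        else:
            clean.append(row)
    return clean, rejected

def mean_shift_drift(baseline, current, threshold=3.0):
    """Crude drift check: flag when the current mean of a feature moves
    more than `threshold` baseline standard deviations from the baseline
    mean. Real systems use richer statistics, but the shape is the same."""
    mu = statistics.fmean(baseline)
    sigma = statistics.pstdev(baseline) or 1e-9
    return abs(statistics.fmean(current) - mu) / sigma > threshold
```

The point of the sketch is how mundane it is: nothing here is automated away by a smarter model, and every check reflects a human decision about what "bad" means for this dataset.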
The human-in-the-loop requirement isn't a philosophical position — it's an operational one. Regulated industries can't deploy models without documented oversight. Even unregulated ones have learned, expensively, what happens when they do. The people reviewing outputs, auditing decisions, and maintaining the feedback loops between model behavior and real-world outcomes are data workers. The fancier the model, the more infrastructure it needs underneath it.
There's a second effect that gets less attention: AI tools have lowered the cost of building data products to the point where organizations that couldn't afford them before are building them now. More products means more pipelines, more monitoring, more engineering. The market expanded. It didn't contract.
This doesn't mean the work looks the same. The analyst who spent three hours writing a query now spends thirty minutes reviewing one. The freed time goes to interpretation, to framing better questions, to building the institutional knowledge that makes AI outputs trustworthy rather than merely plausible. The job got harder to do well — which is, historically, what happens when a profession matures.
AI didn't eliminate the need for people who understand data. It raised the floor on what those people need to know.