Lately I’ve been seeing more and more companies preparing to roll out AI tooling to their engineering organizations. Hundreds, sometimes thousands of programmers are about to be told: here are your new tools, here is your training, here are the metrics we’ll use to measure adoption. The usual playbook.

I came across a job posting where a company is looking for a person to head such an effort. They even used a beautiful phrase: “AI-augmented craftsmanship.” I liked it. It captures something true – that programming, done well, is a craft. But the rest of the posting was the standard transformation toolkit: golden paths, DORA metrics, enablement crews, adoption dashboards.

This is a playbook with a dismal track record. Most such transformations fail – not because the tools are wrong or the metrics are bad, but because they address the wrong problem.

They think they’re implementing new technology. They’re actually asking craftsmen to stop practicing the craft that defined them for decades.

That’s not a training problem. That’s an identity crisis.

The Craft That’s Changing

For decades, what made a good programmer? The ability to write clean, elegant, readable code. The kind where variable names mean something, where classes and methods are well designed, where another programmer can open the file six months later and understand what’s happening.

This wasn’t just pragmatism. The “Clean Code” movement was about professional pride. The craft of translating complex logic into precise, beautiful syntax – this is what separated the professional from the amateur, the senior from the junior. It is how programmers knew they were good at what they did.

When you work with tools like Cursor, this changes. You stop writing code and start directing its creation – and the rise of agents like Claude Code, where you do not even look at the code unless you have to, makes the shift even more profound. You describe intent, review output, catch errors, guide the AI when it goes astray. Precise knowledge of syntax matters less – the AI knows syntax well enough and analyzes it faster than any human. What matters more is the ability to decompose problems, to design the process of creation itself, to recognize when the generated code is wrong or dangerous or just subtly off.

This is still valuable work. Arguably more valuable. But it’s not the same craft.

The programmer who spent fifteen years (or more!) mastering the intricacies of a language, who took pride in writing code so clean it read like prose – that person is being told that what made them excellent matters less than it used to. This isn’t learning new tools. This is being asked to become someone else.

The Old Skills That Matter More, Not Less

But the new craft isn’t built from scratch. It carries over more from the old one than most people realize – just not the parts they expect.

Understanding code still counts. You need to read and evaluate what the AI produces – and the AI produces a lot, fast. If you can’t tell good code from bad code, you’re useless in this new world, perhaps even dangerous.

But what matters even more is execution discipline. Building a product as a series of small changes, each fully DONE – technically complete, tested, releasable – even if functionally incomplete. Incremental, iterative development. This was always a good practice. Now it’s essential, because the worst thing you can do is entrust a large system to an AI agent and let it run wild. The results will look impressive and be riddled with problems nobody understands because nobody reviewed the intermediate steps.

Tests matter more than ever. And since AI can generate tests too, there is even less excuse to skip them.

So the skills that carry forward are not syntax mastery but engineering judgment, architectural thinking, and the discipline of building things right in small steps. The programmers who have these skills are more valuable than ever – and most good programmers do, even if not all of them see it yet.

Why Dashboards Won’t Help

Organizations see resistance to AI adoption and think: more training, better tooling, clearer metrics. They’re solving the wrong problem with ever more precision.

You cannot train someone out of an identity crisis. The engineer who drags their feet isn’t necessarily a Luddite. They might be someone who correctly perceives that the thing which made them valuable, which made them *them*, is being declared obsolete. More training sessions won’t fix that. Green dashboards won’t fix that.

What Would Actually Work

First, stop pretending it’s primarily about technology. The tools are the easy part – they are already good and improving quickly. The hard part is the cultural and identity dimension that traditional transformation playbooks ignore.

Before rolling out anything, you need to understand the culture you’re working with. What do these engineers actually value? What makes them proud? What are they afraid of losing? Where is the resistance really coming from – is it fear of job loss, or something deeper? What tribes and informal groups exist?

This is ethnographic work, not project management. Dave Snowden’s Estuarine Mapping framework offers a useful approach here. The core insight: in any complex system, some things are changeable and some aren’t. Some constraints are like the bedrock of an estuary – immovable. Others are like sandbars – they shift if you apply the right pressure in the right place. Most transformation efforts waste enormous energy fighting bedrock while ignoring sandbars. Map the landscape first. Focus only on what can actually change.

And don’t do this mapping in a strategy room with consultants. Engage the people living in that culture. Including the skeptics. Especially the skeptics – they often see things the enthusiasts miss.

For the new craft to take hold, people need to see it as a craft they can grow in – not as a demotion from “real programming” to “supervising the machine.” That’s a narrative problem, an identity problem. It won’t be solved by tooling or training.

The Missing Qualification

Organizations attempting this transition need leaders who understand three things. Traditional programming – deep enough to have credibility, to understand what engineers are being asked to give up, to share their grief over a craft that is changing. AI tooling – hands-on, practical, not just vendor demos. And most importantly: cultural dynamics and identity change – how groups and tribes actually function, why people resist change that seems obviously beneficial from the CEO’s office.

Most job postings I see for these roles cover the first two. The third is nowhere to be found.