SDD: Months Later, the Verdict

Frederick Chapleau

Last January, we published Two Ways of Coding with AI: SDD vs Vibe Coding. The article generated thousands of impressions and dozens of comments — a sign that the question resonates across our industry. Today, after months of applying Specification Driven Development (SDD) daily on real projects, we can deliver an honest assessment: what works remarkably well, what hurts, and what it means for the future of our profession.

What SDD Does Exceptionally Well

Let's start with the good news, because there's plenty of it.

SDD has proven to be a specification tool of unexpected versatility. Not just for technical specifications — we expected that — but across the entire software development spectrum.

Technical and functional specifications. Defining expected behaviors, business rules, edge cases, performance constraints: SDD excels. Specifications become the contract between the product vision and its implementation. AI understands these specifications with remarkable precision and can transform them into coherent architectures, well-defined interfaces, and solid data models.

User interface design. This is a pleasant surprise. SDD specifications effectively guide graphic designers and UX experts in screen design. When every component, every state, every interaction is specified upfront, design work becomes formatting rather than invention. Back-and-forth decreases. Misunderstandings too.

Test definition. A well-written specification is, in essence, a test suite waiting to happen. Each expected behavior naturally translates into a test case. AI generates unit and integration tests that genuinely cover requirements — not just tests to inflate a coverage percentage.
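To make this concrete, here is a minimal sketch of one specification clause becoming a test. The Account class, the overdraft rule, and the names are invented for illustration, not taken from any real project:

```python
# Hypothetical sketch: one spec clause translated directly into a test.
# Spec clause (invented for this example): "A withdrawal that exceeds
# the current balance must be rejected and leave the balance unchanged."

class InsufficientFunds(Exception):
    """Raised when a withdrawal exceeds the available balance."""

class Account:
    def __init__(self, balance: int) -> None:
        self.balance = balance

    def withdraw(self, amount: int) -> None:
        # Enforce the spec clause: reject overdrafts, keep state intact.
        if amount > self.balance:
            raise InsufficientFunds(
                f"cannot withdraw {amount} from balance {self.balance}"
            )
        self.balance -= amount

def test_overdraft_is_rejected_and_balance_unchanged() -> None:
    account = Account(balance=100)
    try:
        account.withdraw(150)
        raise AssertionError("expected the withdrawal to be rejected")
    except InsufficientFunds:
        pass
    # The spec says the balance must be untouched after a rejection.
    assert account.balance == 100
```

The point is the traceability: each sentence of the specification maps to one assertion, which is what makes the generated tests cover requirements rather than pad a coverage number.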

Implementation patterns and practices. SDD allows describing not only what to build, but how to build it: which patterns to use, which application layers to respect, which conventions to follow. It's a living implementation guide.

So far, so good. SDD delivers on its promises and then some.

Where It Hurts: The Sandbox and the Beach

Then comes the moment of truth: implementation.

And this is where we need to be honest, because the nuance is important and changes everything.

Imagine a sandbox. It has sides — higher or lower depending on the flexibility you want to grant those playing in it. The playground is bounded. You know where the sand starts and where it ends. You can build castles, dig tunnels, play freely, but within a known perimeter.

Now imagine a beach. The possibilities are infinite. The sand stretches as far as the eye can see. But the water is dangerously close. And on a beach, you can drown.

It's a metaphor, but the reality of AI-assisted development looks strangely similar.

Asking an AI to implement a complete solution without providing a framework is sending it to the beach. It can technically do everything — the code will compile, tests might pass, the application will work on the surface. But without a sandbox, without clearly defined boundaries, AI does what any brilliant but lost developer would do: it improvises. And improvisation in a production system is technical debt disguised as velocity.

The Senior Developer on Day 1

Think of it differently. You hire a senior developer — skilled, experienced, capable. You ask them to deliver a complete feature on their first day, before they've had a chance to familiarize themselves with the organizational culture, the company's ways of working, coding conventions, in-house patterns, the existing technology foundation.

What happens? They'll deliver something that works. But it won't look like the rest. It won't follow conventions. It won't capitalize on what already exists. It will create friction at the next code review.

AI is exactly that senior developer on day 1.

It has the skills. It has the general knowledge. But it doesn't know your context. It doesn't know your patterns. It doesn't know your foundation.

The Cornerstones of the Sandbox

The good news — and this is the real breakthrough these months in the field have confirmed — is that once the cornerstones are in place, the results are striking.

What are these cornerstones?

Documented development practices and patterns. Not in an architect's head. Documented. In a way that AI can read, understand, and apply. Which patterns to use for the data access layer? How to structure a service? Which naming conventions to follow? If it's written down, AI follows it.
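As a purely illustrative sketch (the Repository naming rule and interface here are assumptions, not our actual conventions), a documented data-access pattern might live in a small reference file that both developers and AI can read and imitate:

```python
# Hypothetical convention file for the data access layer.
# Documented rules (invented for this sketch):
#   1. Services never touch storage directly; they go through a repository.
#   2. Every entity gets a class named <Entity>Repository implementing Repository.

from abc import ABC, abstractmethod
from typing import Dict, Generic, Optional, TypeVar

T = TypeVar("T")

class Repository(ABC, Generic[T]):
    """Base abstraction every concrete repository must implement."""

    @abstractmethod
    def get(self, entity_id: str) -> Optional[T]:
        ...

    @abstractmethod
    def save(self, entity_id: str, entity: T) -> None:
        ...

class UserRepository(Repository[dict]):
    """In-memory reference implementation following the convention."""

    def __init__(self) -> None:
        self._users: Dict[str, dict] = {}

    def get(self, entity_id: str) -> Optional[dict]:
        return self._users.get(entity_id)

    def save(self, entity_id: str, entity: dict) -> None:
        self._users[entity_id] = entity
```

A file like this does double duty: it is documentation a human can review, and a concrete shape an AI can pattern-match when asked to add the next repository.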

An existing and accessible technology foundation. AI needs to be able to rely on abstractions, base classes, interfaces already in place. It needs to capitalize on what exists rather than reinvent with every prompt. This is exactly the role our Foundation plays: providing a stable, documented, and coherent base.

Example projects as inference references. This is perhaps the most underestimated element. AI learns by example. If it can see how a similar module was implemented — the injected dependencies, the respected layers, the structured tests — it reproduces the same level of rigor. Without an example, it invents. With an example, it infers and applies.

When these three elements come together, the conformity rate reaches 90% or above. Dependencies are correct. Application layers are respected. Patterns are followed. Coding practices are applied. It's no longer improvisation — it's guided execution.

What Remains: Peer Review

At 90% conformity, 10% still requires human intervention. And this is where peer review comes in — not as a luxury or an optional best practice, but as the last indispensable safety net.

That 10% is sometimes a questionable implementation choice. Sometimes a missed optimization. Sometimes a subtle convention that AI didn't catch. Human code review corrects these gaps and, most importantly, helps enrich the documentation and examples so the next cycle is even better.

It's a virtuous cycle: the more we review, the more we document, the more AI improves, the less there is to review.

And it's only a matter of time before that gap shrinks further. The tools are improving at a breakneck pace.

The Achilles' Heel: User Interface

There is, however, one domain where AI remains fundamentally limited, and it must be said plainly: user interface.

AI has no taste.

It's not creative enough to design an experience that surprises and delights. It's not consistent enough to maintain visual coherence across dozens of screens. It's not empathetic enough to anticipate what a user feels when facing an interface.

AI can generate impeccable CSS. It can implement a design system to the letter. It can produce functional and accessible components. But it's incapable of judging whether a screen delivers the user-friendliness that users expect. It doesn't know if spacing looks right, if the visual hierarchy naturally guides the eye, if the interaction feels satisfying.

This is a domain where humans remain and will remain irreplaceable — at least for the foreseeable horizon.

What This Means for the Future

After these months in the field, our conviction has strengthened: SDD is likely the new way of doing software development.

Not because it's a trend. Not because we sell services around it. But because the facts are there: when AI is properly equipped — with clear specifications, a documented foundation, explicit patterns, and concrete examples — it produces production-quality code at a fraction of the traditional time.

The tools will continue to improve. They will become not only more efficient but gradually more autonomous. The mechanical facet of software development — writing code line by line — will progressively fade.

But there will always be a fundamental place for humans. For three reasons that won't disappear anytime soon:

Adding creativity where it creates value — in user experience, in product innovation, in solving novel problems.

Preventing creativity where it would be dangerous — in security, in compliance, in data integrity. You don't want a "creative" AI handling your authentication mechanisms.

Framing requirements to ensure the solution solves the real problem — not an overly creative interpretation of what the user asked for.

Conclusion: From Concept to Conviction

In January, SDD was a promising concept. Today, it's a conviction validated by practice.

Yes, SDD requires an upfront investment. Yes, you need to document, structure, and build the sandbox before playing in it. But once the framework is in place, productivity is multiplied tenfold, quality is consistent, and AI becomes what it should be: a powerful accelerator within a controlled framework.

The beach will always be tempting. The horizon is beautiful, the possibilities seem infinite. But teams that build lasting systems know that a good sandbox is worth more than a beautiful beach where you risk drowning.

Software development isn't disappearing. It's transforming. And those who master the art of specifying — of defining the sandbox — will be the ones who extract the most value from it.


To learn more about SDD and our approach, see our Foundation CSS and our previous article SDD vs Vibe Coding.

AI is transforming our profession. SDD ensures we stay in control.