The Rules Were Always Temporary

AI

Design

May 8, 2026.

AI isn't asking designers to learn new tools. It's asking us to remember the restless, irreverent instinct we had before the industry taught us to stay in our lane.

By Tom Kershaw, Group Design Director.

Images by Tom Kershaw.

There's a scene in Hackers—the 1995 film that probably introduced more people to the idea of digital culture than any actual computer—where a kid breaks into a system not because he knows exactly what he'll find, but because the act of looking is the point. The curiosity is the method. The system doesn't own you if you're willing to go somewhere it didn't expect.

I think about that energy a lot these days. Not the breaking in—but the disposition behind it. Exploratory, irreverent, genuinely thrilled by the unknown. It's the same energy that produced some of the most vital creative work of the last thirty years. And it's exactly what the design industry needs to rediscover right now.

Before the rules, there was chaos.

In the mid-nineties, while most of the industry was still figuring out what "interactive" even meant, a handful of studios and artists were already doing something else entirely.

Tomato—the collective that shaped so much of the era's most arresting visual communication, and the creative force behind Underworld's visual identity—weren't operating from a rulebook. They were treating design as a live experiment: typographic systems that felt like they were short-circuiting, imagery that refused to sit neatly between art and commerce, work that was intentionally unresolved. The Designers Republic, out of Sheffield, were building a visual language for electronic music culture that looked like corporate communication dismantled and reassembled wrong—deliberately, precisely wrong.

These weren't studios that had cracked a process. They were studios that understood something more durable: that the work happens at the edge of what you know, and you find it by going there before you're ready.

«And it points to something I keep coming back to: the most generative moments in design history happened when the tools outran the rules. We are in one of those moments right now.»

OK Computer didn't happen because someone engineered an album to define the anxiety of the late twentieth century. It happened because Radiohead deliberately abandoned conventional studios, treated uncertainty as material rather than obstacle, and let the process find the work rather than the other way around. Jonny Greenwood described it as trying to recreate brilliant records and missing. That missing—that productive failure—is the whole point. And it points to something I keep coming back to: the most generative moments in design history happened when the tools outran the rules. We are in one of those moments right now.

How design and development drifted apart.

To understand why this moment feels so charged, it helps to understand what happened in the decade that preceded it. As digital products grew more complex—microservices, component libraries, design systems, multi-platform deployment—the distance between design and engineering grew with them. Not because anyone wanted it to, but because the stack demanded specialisation. You couldn't hold it all in your head at once, so the disciplines formalised around what each role could reasonably own.

Figma became the border. Designers worked up to the handoff. Engineers worked from it. The gap between a design file and a shipped product—technically, culturally, linguistically—widened into something that required its own infrastructure to manage. Whole job functions emerged just to bridge it. 

The aesthetic consequences were real, if harder to measure. When design and engineering operate in parallel rather than together, work tends toward the generic. Component libraries optimised for consistency produced interfaces that looked like every other interface. The pressure to design within what was technically feasible—as defined by someone else, interpreted at arm's length—quietly eroded the space where genuinely surprising work could happen. Design got more rigorous and, in many places, less interesting.

The designers I came up with in the early two thousands would have found this strange. Back then, the roles hadn't yet separated so widely. Macromedia Flash was the closest thing we had to a universal creative instrument—a single tool where you could design, animate, write code, and edit sound in the same session, often in the same breath. It collapsed the distance between imagination and execution in a way that felt almost reckless. You might be building an interaction system in the morning and composing the audio that accompanied it by afternoon. The line between making something and building it was blurry in a way that felt generative—even necessary.

Nobody had agreed on where design ended and development began, which meant the space between them was open and alive.

AI is closing the gap.

What I'm watching happen with AI—and what leading this transformation at Huge has made very clear to me—is that the wall is coming down again. Not because the technical complexity has gone away, but because AI is absorbing enough of it to let designers get closer to the material again.

A designer who can prompt their way to a working prototype, who can generate and test interaction logic without waiting for an engineering sprint, who can move fluidly between the visual and the functional—that designer is operating in a fundamentally different relationship with the product. The handoff doesn't disappear, but the creative dependency on it does. And when design and development start speaking the same language again, work gets interesting again.

This is the convergence that gets underreported in conversations about AI and design. The discourse tends to focus on what AI replaces or what it threatens. What it's actually doing, at the level of creative practice, is restoring a kind of fluency that the industry spent fifteen years engineering out of the process.

«When design and development start speaking the same language again, work gets interesting again.»

The mindset is the method.

There's a harder truth here too. The same tools that are closing the gap between design and development are also raising the bar. When everyone can see what AI makes possible, expectations accelerate alongside the technology. Speed and quality are no longer in tension—both are expected, simultaneously, as a baseline. Clients move faster, briefs get more ambitious, and the window between concept and delivery keeps shrinking. That's the environment designers are working in now. Which is exactly why mindset matters more than process. A framework tells you what to do when conditions are stable. A mindset is what carries you when they aren't. 

Which brings me to the problem with how we're trying to educate designers for this shift. The instinct—understandable, well-intentioned—is to reach for process. New workflows, new frameworks, new role definitions that account for AI's presence in the pipeline. And those things have value. But they have a short half-life. The tools are moving faster than any process can absorb, and whatever framework you codify today will feel dated before it's widely adopted. 

What holds is the mindset. And trying to teach that mindset is harder than teaching a process, because it's not a set of steps—it's a disposition. Three things keep coming up for me. 

Willingness to be wrong in public. The Designers Republic didn't put out safe work. Radiohead didn't make OK Computer by refining what worked before. Good work at the frontier requires showing the rough, treating the unresolved thing as part of the process rather than evidence of failure. AI accelerates this dynamic—output is instant, which means the willingness to discard and try again has to become default, not occasional.

Identity that isn't tied to a single tool. This is the harder cultural shift, especially for designers who've built real expertise inside a specific discipline or platform. But the designers who made that era interesting weren't defined by their software. They were defined by their point of view. That's portable. It survives the next tool change, and the one after.

Genuine excitement about the unmapped. This sounds obvious, but it's rarer than it should be. The best creative instinct is fundamentally about curiosity as a value. Choosing to go somewhere the system didn't expect. Looking before you know what you'll find. The industry spent a long time rewarding designers who could operate confidently within defined constraints. AI is rewarding the ones who are comfortable without them.

Back on the frontier.

There's something I find genuinely exciting about where we are, and I don't say this to be optimistic for its own sake. The conditions that produce interesting design—ambiguity, speed, collapsing disciplines, tools that don't yet have established grammars—are all present right now, simultaneously, at a scale I haven't seen since the early web. 

The industry spent the last decade getting very good at systems, process, and measurable outcomes. That work mattered. It made design more rigorous and more credible. But it also, quietly, moved the culture away from the frontier. It made design safer than it should be.

AI is pulling the frontier back into view. The gap between design and engineering is narrowing. The generalist never stopped being the most interesting person in the room—the industry just spent fifteen years building systems that forgot to reward them. That's changing. And for designers willing to approach this moment the way those studios approached a blank screen in the mid-nineties—with irreverence, with genuine curiosity, with no particular deference to how things are supposed to look—the work ahead is going to genuinely surprise people.

That hasn't been easy to say for a while. It is now.