09

Product Design & Prototyping


A History of Constant Reinvention

The job of a digital product designer has been reinvented multiple times in the past two decades. Each wave of new tools changed not just how designers worked, but what skills mattered and what value they provided. Understanding this history helps us understand why the current AI transformation is both familiar and fundamentally different.

The Photoshop Era

For years, designers created digital interfaces in Adobe Photoshop. Think about how absurd that is. Photoshop was built for retouching photographs. The word "photo" is literally in the name. Yet an entire generation of UI designers learned their craft pushing pixels in a tool designed for an entirely different purpose.

They made it work because it was all they had. Designers created interfaces as flat images, layer upon layer of rectangles and text. They exported these images as PDFs or PSDs and handed them to developers, who would then try to recreate them in code. The handoff was painful. Developers received static images without dimensions, spacing values, or any understanding of how elements should behave responsively. They measured pixels manually, guessed at interactions, and built what they thought the designer intended.

Photoshop was expensive, complex, and overwhelming. Most designers used perhaps 20% of its features. The interface was cluttered with tools for photo manipulation that had nothing to do with UI design. But designers adapted because there was no alternative. The tool shaped the practice: design meant creating static images that represented screens.

The Sketch Revolution

In 2010, a small company in the Netherlands released Sketch, and the design world shifted.

Sketch was purpose-built for UI design. Nothing else. It stripped away the complexity of Photoshop and focused on what interface designers actually needed: vector shapes, artboards for different screen sizes, symbols for reusable components, and an interface that felt clean and intuitive.

The impact was immediate and profound. Designers could work faster because they weren't fighting a tool designed for something else. Multiple artboards let them design for different devices simultaneously, addressing the responsive design challenges that Photoshop had never anticipated. Symbols introduced the concept of reusable components, laying groundwork for what would become design systems.

Sketch proved that tools shape thinking. When designers weren't constrained by photo editing paradigms, they started thinking differently about interfaces. Design became more systematic. Components became first-class concepts. The conversation shifted from "how do I create this image?" to "how do I create this system?"

But Sketch had limitations. It only ran on Mac. Files lived on local machines. Collaboration meant emailing files back and forth, managing versions manually, and hoping nobody overwrote someone else's work. Designers worked in isolation, then synchronized through clunky handoffs.

The Figma Transformation

Figma launched in 2016, and initially people thought the founders were crazy. A professional design tool in the browser? The technical challenges seemed insurmountable. But WebGL had matured enough to make it possible, and Figma bet on a future that others couldn't see.

The browser-based approach wasn't just a technical curiosity. It enabled something genuinely new: real-time collaboration. Multiple designers could work on the same file simultaneously, seeing each other's cursors, watching changes appear instantly. It was Google Docs for design.

This changed everything about how design teams operated.

Handoffs became links instead of files. Developers could inspect designs directly, extracting exact colors, spacing, and dimensions without designers creating specification documents. Comments lived in the design itself, attached to specific elements. Version history was automatic and unlimited. The design file became a living document rather than a series of snapshots.

Design became a team sport in ways it had never been before. Product managers could leave comments. Engineers could ask questions in context. Stakeholders could view progress without scheduling review meetings. The walls between design and other disciplines started crumbling.

Figma also introduced features that accelerated design work itself. Auto Layout made responsive design intuitive. Components and variants enabled sophisticated design systems. Plugins extended functionality in countless ways. The community library meant designers could share and build on each other's work.

By the early 2020s, Figma had become dominant. Adobe, seeing the threat, tried to acquire Figma for $20 billion in 2022, though the deal eventually fell through due to regulatory concerns. The tool that started as an underdog had become the standard.

The Prototyping Layer

Alongside the evolution of design tools, a parallel ecosystem emerged for making designs interactive.

Static images couldn't show how interfaces actually behaved. How does this menu animate? What happens when you click this button? How does navigation flow between screens? These questions required tools that could simulate real interaction.

InVision pioneered clickable prototypes, turning static screens into linked experiences. Principle enabled sophisticated animations. Framer pushed further, bridging design and code with prototypes that could include real logic. Marvel, ProtoPie, and others competed for different niches in the prototyping space.

These tools addressed a real problem: the gap between what designers envisioned and what stakeholders could understand from static mockups. A prototype is worth a thousand pictures. Seeing an interaction was infinitely clearer than describing one.

But the ecosystem was fragmented. Designers worked in Sketch or Figma, exported to InVision for prototyping, used Zeplin for developer handoff, and managed assets across multiple tools. Each handoff introduced friction and potential for error.

Figma absorbed much of this functionality, adding prototyping and developer handoff directly into the design tool. The trend was toward consolidation, toward fewer tools doing more things.

The AI Disruption

Every previous transition changed how designers worked. Photoshop to Sketch made them faster. Sketch to Figma made them more collaborative. Prototyping tools made them more expressive.

The AI transition is different. Previous tools changed the how. AI changes the what.

Tools like v0, Galileo AI, and others don't just help designers work faster. They do design work directly. Describe an interface in words, get a complete design back. Upload a sketch, get polished screens. Ask for variations, get ten options in seconds.

This isn't incremental improvement. It's a category shift. The designer's job is no longer primarily to produce designs. AI can do that. The designer's job becomes something else: to judge designs, to guide AI toward better outputs, to understand users deeply enough to know what good looks like.

Each previous tool transition was difficult. Designers had to learn new software, new workflows, new mental models. But their core value, the ability to create designs, remained intact. This transition threatens that core. When anyone can generate designs by typing a description, what is the designer's unique contribution?

The answer lies not in production but in judgment, strategy, and understanding. The designers who thrive will be those who can direct AI toward outcomes that serve users, not just outcomes that look good.

Design Fundamentals That Endure

Amidst all this change, certain principles remain constant.

Users First

Good design starts with users, not aesthetics. What are they trying to accomplish? What context are they in? What do they already understand? What will confuse them?

Steve Krug's "Don't Make Me Think" remains the essential design philosophy. Every moment of confusion is friction. Every unnecessary choice is cognitive load. The best interfaces are obvious. Users shouldn't have to figure out how things work. They should just work.

AI can generate beautiful interfaces instantly. But beauty without usability is failure. The question is never "does this look good?" It's "does this serve users?"

Information Architecture

How information is organized determines whether users can find what they need. Taxonomy, hierarchy, navigation, labeling: these structural decisions matter more than visual polish.

Users have mental models about how things should be organized. When your information architecture matches their mental models, the product feels intuitive. When it doesn't, the product feels confusing, no matter how pretty it is.

AI tools don't automatically understand your users' mental models. They generate plausible structures based on patterns in their training data. Those patterns may or may not match your specific users. Information architecture still requires human understanding of human minds.

Consistency and Patterns

Users learn patterns. Once they understand how something works in one place, they expect it to work the same way everywhere. Consistency reduces cognitive load and builds confidence.

Design systems exist to enforce consistency at scale. They define patterns, components, and rules that ensure coherent experiences across features and teams. A good design system is a multiplier: it makes every designer more effective and every product more coherent.

AI can help maintain and extend design systems. But someone has to define what the patterns should be. That's still human work.
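To make the enforcement role concrete, here is a minimal sketch of a design system's tokens with a tiny lint rule on top. The token names and values are hypothetical, invented for illustration; real systems define far more, but the principle is the same: the tokens are the single source of truth, and anything that bypasses them gets flagged.

```typescript
// Hypothetical design tokens: the single source of truth for color and spacing.
const tokens = {
  color: {
    primary: "#2563eb",
    surface: "#ffffff",
    text: "#111827",
  },
  spacing: [0, 4, 8, 16, 24, 32], // allowed spacing steps, in px
} as const;

// A tiny lint rule: flag any hard-coded color that bypasses the token set.
function lintColors(usedColors: string[]): string[] {
  const allowed = new Set<string>(Object.values(tokens.color));
  return usedColors.filter((c) => !allowed.has(c.toLowerCase()));
}

// "#ff00aa" isn't a token, so it comes back flagged for review.
console.log(lintColors(["#2563eb", "#ff00aa"]));
```

Checks like this are exactly the kind of consistency work AI can automate, but only after a human has decided what belongs in the token set.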

Accessibility

Products should work for everyone, including users with visual, auditory, motor, or cognitive differences. Accessibility isn't a nice-to-have. It's a requirement for products that serve real human diversity.

Accessibility has historically been under-resourced because it required specialized knowledge and additional effort. AI is changing this equation, making accessibility testing and remediation more practical. But the commitment to accessibility has to come first. The tools just make it easier to fulfill.
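Part of what makes accessibility tooling automatable is that some of its rules are fully mechanical. The WCAG 2.x contrast-ratio check is one example: a short sketch like the one below can flag failing text colors without any human review. (This implements the published WCAG formula; the helper names are my own.)

```typescript
// Relative luminance per WCAG 2.x: linearize each sRGB channel, then weight.
function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  });
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging from 1 to 21.
function contrastRatio(fg: string, bg: string): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA requires at least 4.5:1 for normal-sized text.
const passesAA = (fg: string, bg: string) => contrastRatio(fg, bg) >= 4.5;

console.log(contrastRatio("#000000", "#ffffff")); // ≈ 21, the maximum
console.log(passesAA("#777777", "#ffffff")); // mid-gray on white narrowly fails AA
```

Judgment calls like labeling and reading order still need humans; arithmetic like this no longer does.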

The Current AI Design Toolkit

The landscape is evolving rapidly. Rather than catalog every tool, let's understand the categories of capability that now exist.

Text-to-Design Generation

Describe what you want, get a design. Tools like Galileo AI generate complete, polished interfaces from natural language descriptions. "Create a dashboard showing monthly revenue, user growth, and churn rate with a clean, minimal aesthetic." Seconds later, you have something visual to react to.

The output quality is remarkable. Designs often look like they came from senior product designers, complete with realistic images, proper spacing, and coherent visual hierarchy. These aren't wireframes. They're high-fidelity mockups.

The value is in exploration speed. Generate ten different approaches in the time it once took to create one. Explore wildly different directions without the investment of traditional design work. Align stakeholders on direction before investing in detailed refinement.

Text-to-Code Generation

Tools like Vercel's v0 take a different approach: instead of generating design files, they generate working code. Describe a component, get React code with Tailwind CSS that you can paste directly into your project.

This bridges design and development in a new way. The output isn't a mockup to be implemented later. It's the implementation itself. For teams building with modern web frameworks, this collapses the design-to-development handoff entirely.

The tradeoff is less design refinement capability. These tools optimize for functional code, not pixel-perfect design exploration. But for many use cases, functional and good-looking is exactly what's needed.
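To give a flavor of what these tools return, here is an illustrative stand-in: a stat card rendered as an HTML string with Tailwind utility classes, kept dependency-free for this sketch. This is not actual v0 output; a real tool would emit the equivalent React (JSX) component, but the shape of the result is similar.

```typescript
// Illustrative stand-in for generated output: a dashboard stat card.
// A real text-to-code tool would emit this as a React component.
interface StatCardProps {
  label: string;
  value: string;
  trend: "up" | "down";
}

function statCard({ label, value, trend }: StatCardProps): string {
  const trendColor = trend === "up" ? "text-green-600" : "text-red-600";
  return [
    `<div class="rounded-xl border border-gray-200 p-6 shadow-sm">`,
    `  <p class="text-sm font-medium text-gray-500">${label}</p>`,
    `  <p class="mt-2 text-3xl font-semibold">${value}</p>`,
    `  <p class="mt-1 text-sm ${trendColor}">${trend === "up" ? "▲" : "▼"} vs last month</p>`,
    `</div>`,
  ].join("\n");
}

console.log(statCard({ label: "Monthly revenue", value: "$48,200", trend: "up" }));
```

The point is that the output is not a picture of a component but the component itself, ready to paste into a codebase and refine there.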

Design-to-Code Translation

Upload a Figma file or screenshot, get working code. Tools such as Locofy automate the translation that developers used to do manually.

This addresses the persistent handoff problem. Designs no longer need to be interpreted by developers. The translation is automatic, reducing errors and saving time.

AI-Assisted Design Tools

Figma itself is adding AI capabilities. Generate component variants automatically. Fill designs with realistic content. Get suggestions based on your design system. The tools designers already use are becoming smarter.

This integration matters because it meets designers where they are. Rather than switching to new tools, designers can access AI capabilities within their existing workflow.

Prototyping at the Speed of Thought

The most profound change is in prototyping. What once took weeks now takes hours or less.

The New Prototyping Workflow

Traditional prototyping was sequential: concept, wireframes, review, high-fidelity mockups, review, prototype, user testing. Each step took time. Each handoff introduced delay.

AI enables compression. Describe your concept. Generate multiple variations. Pick the most promising. Refine through conversation. Generate working code. Put it in front of users. All in days, not months.

Conversational Iteration

The most powerful pattern is conversational iteration. Generate a design. React to it. Ask for changes. See them immediately. React again.

"Make the navigation more prominent." "Add a search bar in the header." "What if we used cards instead of a list?" "Show me a dark mode version." Each request yields an immediate result.

This feels different from traditional design review. You're not looking at static options and picking one. You're actively sculpting through dialogue. The design emerges from conversation.

When to Prototype

With prototyping costs near zero, the question shifts from "should we prototype?" to "why wouldn't we prototype?"

Any feature discussion benefits from making the feature visible. Any disagreement about user experience resolves faster when you can see the options rather than debate them abstractly. The default should be to prototype.

The Changing Role of Designers

AI doesn't eliminate the need for designers. It changes what designers do and what makes them valuable.

From Production to Direction

When AI can produce designs, human designers shift toward directing design. They define what good looks like. They make judgment calls that AI can't make. They understand users in ways that AI doesn't.

This is similar to how photography evolved. When cameras became accessible to everyone, professional photographers didn't disappear. They focused on vision, composition, and storytelling, the things that cameras couldn't provide on their own.

Taste and Judgment

AI generates options. Humans choose among them. That choice requires taste: the ability to recognize what's good, what serves users, what fits the brand, what will work.

Taste isn't algorithmic. It's developed through experience, exposure, and reflection. Designers with strong taste become more valuable in an AI-augmented world, not less. They're the ones who can guide AI toward good outcomes.

User Advocacy

AI doesn't know your users. It knows patterns from training data that may or may not apply. Designers who deeply understand users remain essential.

The designers who spend time with users, who observe their struggles, who internalize their mental models, bring something AI cannot. They can evaluate AI-generated designs against real user needs.

This argues for designers spending more time on research and less on production. Understanding users is the irreplaceable part.

Product Directors and Design

As a Product Director, you're not designing products yourself. But you're shaping how design happens across your organization.

Design Fluency

AI tools put design capabilities in more hands. Product Managers can generate prototypes. Engineers can explore interfaces. This democratization is mostly good, but it requires new guardrails.

Build design fluency across your product organization. Everyone should understand basic design principles. Everyone should know how to use AI design tools effectively. But also ensure everyone knows when to involve professional designers and what designers bring that AI doesn't.

Quality Standards

When anyone can generate designs, quality standards become critical. What makes a design good enough to ship? Who decides? What review processes ensure consistency?

Fast iteration shouldn't mean lower quality. It should mean more iterations to reach quality faster.

Preserving Strategic Design

Not all design work should be fast. Some design decisions are strategic and deserve deep thinking.

Brand definition. Core user flows. Design system foundations. These benefit from careful, deliberate design processes even when fast iteration is possible. Just because you can go fast doesn't mean you always should.

The Risks of AI-Powered Design

Speed and ease create new risks.

Shallow Design

When designs come easily, it's tempting to accept the first reasonable output. But the first output isn't always the best output. Guard against shallow design by pushing teams to explore alternatives even when the first option seems fine.

Homogenization

AI design tools are trained on existing designs. They excel at producing things that look like other things. This can lead to sameness, products that look like every other product.

Distinctive design requires intentional divergence from patterns. It requires taste that recognizes when conventions should be broken.

User Research Shortcuts

When prototyping is fast, there's a temptation to skip research, build things, and watch the metrics. But usage data tells you what users do, not why. Fast prototyping should accelerate learning, not replace research.

The Future of Product Design

We're in the early stages of a transformation that will continue for years. The tools will become more capable. The gap between description and implementation will shrink further.

The fundamentals will persist. Users will still have needs. Good design will still serve those needs. Taste and judgment will still matter. Understanding humans will still be essential.

The job of the designer has changed with every generation of tools. Photoshop designers became Sketch designers became Figma designers. Each transition required new skills while preserving core principles.

This transition is harder because AI does the design work itself. The value shifts from production to direction, from creation to curation, from making to judging.

Product Directors who understand this shift can help their teams navigate it. Build the judgment and taste that AI cannot provide. Invest in user understanding that grounds AI-generated designs in real needs. Maintain quality standards even as production accelerates.

Design has always been about making things better for people. AI changes how we make things. It doesn't change why.