Liquid UI

The end of the "Page" and the rise of Generative Components.

Design · January 2026 · 10 min read

The "Page" is a relic of the printing press. We divided documents into pages because paper has edges. Then we brought pages to the web because... actually, I'm not sure why. Habit, maybe.

Why do we still organize software into "Pages"? Because until now, software was static. We had to pre-bake every state. Design a page. Build the page. Ship the page. If users need something different, design another page.

Generative AI kills the Page. It introduces Liquid UI: interfaces that assemble themselves based on user intent.

What Does Liquid Mean?

Liquid means: The UI is not a fixed object. The UI is a living system that adapts to context.

Today, I design a dashboard. It has six widgets. Those are the widgets, forever. If you want different widgets, you ask me to redesign it.

Tomorrow, you say "Show me what's important for my board meeting." The dashboard reshapes itself. It pulls the metrics that matter for board meetings. It hides operational details. It adds a trend comparison because board members always ask about trends.

The dashboard didn't change. The dashboard responded.

The Atomic Theory of Liquid UI

To design Liquid UI, you must stop thinking in screens and start thinking in Intents and Components.

1. The Intent (The Trigger)

The user doesn't click a button. They express an Intent.

  • Intent: "I want to understand why my sales dropped."
  • Intent: "Help me prepare for my meeting with Acme Corp."
  • Intent: "What should I work on today?"

Intents are messier than button clicks. They have context, constraints, and implied preferences. Your job is to map the space of intents your product should handle.
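To make this concrete, here is one way an intent could be represented. This is a minimal sketch: the `Intent` shape and `parseIntent` are hypothetical names, and the keyword matcher is a stand-in for what would really be a language model.

```typescript
// A minimal sketch of a parsed Intent. The shape and names here are
// illustrative assumptions, not a real framework's API.
interface Intent {
  goal: string;                     // what the user wants to accomplish
  context: Record<string, string>;  // implied preferences, e.g. the audience
  constraints: string[];            // explicit limits, e.g. "this quarter only"
}

// Toy keyword matcher, just to show the shape of the mapping.
// A production system would use an LLM for this step.
function parseIntent(utterance: string): Intent {
  if (utterance.toLowerCase().includes("board meeting")) {
    return {
      goal: "prepare-summary",
      context: { audience: "board" },
      constraints: [],
    };
  }
  return { goal: "explore", context: {}, constraints: [] };
}

const intent = parseIntent("Show me what's important for my board meeting");
console.log(intent.goal); // "prepare-summary"
```

Notice what the structure captures that a button click never could: the goal, the implied audience, and room for constraints the user states out loud.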

2. The Component (The Atom)

You don't design a dashboard. You design a library of "Smart Components" that know when to appear.

  • The Trend Chart Component: Knows how to visualize time-series data. Knows when a trend is significant enough to highlight.
  • The Summary Card Component: Knows how to condense text. Knows when detail is needed vs. when brevity is better.
  • The Action List Component: Knows how to render tasks. Knows how to sort by urgency.
  • The Comparison Table Component: Knows how to show options side-by-side.
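The components above could be sketched as a shared contract in which each component scores its own relevance, instead of a page deciding for it. All names here (`SmartComponent`, `relevance`, the scores) are hypothetical illustrations:

```typescript
// Sketch of a "Smart Component" contract: each component declares
// when it should appear. Names and scores are assumptions for
// illustration, not an existing API.
interface SmartComponent {
  name: string;
  // Relevance for a given intent goal, from 0 (irrelevant) to 1 (essential).
  relevance(intentGoal: string): number;
}

const trendChart: SmartComponent = {
  name: "TrendChart",
  // Trends matter most when summarizing for an audience.
  relevance: (goal) => (goal === "prepare-summary" ? 0.9 : 0.3),
};

const actionList: SmartComponent = {
  name: "ActionList",
  // Task lists matter when the user is planning work.
  relevance: (goal) => (goal === "plan-day" ? 0.9 : 0.1),
};

console.log(trendChart.relevance("prepare-summary")); // 0.9
```

The key design move is that the "knows when to appear" knowledge lives inside the component, so any future layout can query it.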

3. The Generative Layout (The Physics)

The AI acts as the "Layout Engine." It hears the intent, picks the components, and arranges them.

Example: User says "Show me Q4 sales." AI fetches data, selects BarChart component, renders it full width. User says "Compare it to marketing spend." AI shrinks BarChart, adds LineChart side-by-side, adds CorrelationSummary below.

The dashboard morphed. It wasn't a different page. The physics of the interface reacted to the conversation.
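The flow above can be sketched as a score-and-arrange function. Everything here (the component library, the scores, the `layout` function) is an illustrative assumption, not a real engine:

```typescript
// Minimal sketch of the generative-layout step: score a component
// library against the stated goal, then arrange the winners.
// All names and thresholds are hypothetical.
type Width = "full" | "half";

interface Component {
  name: string;
  relevance: (goal: string) => number; // 0..1
}

interface Placement {
  component: string;
  width: Width;
}

const library: Component[] = [
  { name: "BarChart", relevance: (g) => (g.includes("sales") ? 0.9 : 0.2) },
  { name: "LineChart", relevance: (g) => (g.includes("compare") ? 0.8 : 0.2) },
  { name: "CorrelationSummary", relevance: (g) => (g.includes("compare") ? 0.7 : 0.1) },
];

function layout(rawGoal: string): Placement[] {
  const goal = rawGoal.toLowerCase();
  const chosen = library
    .filter((c) => c.relevance(goal) >= 0.5)
    .sort((a, b) => b.relevance(goal) - a.relevance(goal));
  // One relevant component gets the full width; several share rows.
  return chosen.map((c) => ({
    component: c.name,
    width: chosen.length === 1 ? "full" : "half",
  }));
}

console.log(layout("Show me Q4 sales"));
// a single full-width BarChart placement
```

Asking to "compare" then re-runs the same function with a new goal, and the BarChart shrinks to half width to make room, which is exactly the morphing described above.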

The New Design Artifacts

Your deliverables change:

  • Old: Mockup of the Dashboard Page → New: Component catalog + usage rules + 20 intent scenarios
  • Old: Figma prototype with 15 screens → New: Interactive prototype that responds to natural language
  • Old: Design spec with pixel measurements → New: Behavior spec with decision logic
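As a sketch, a "behavior spec with decision logic" might be expressed as reviewable data rather than pixels. The `UsageRule` shape and its contents are entirely hypothetical:

```typescript
// Sketch: a behavior spec as data instead of pixel measurements.
// The UsageRule shape, components, and conditions are illustrative
// assumptions, not a real spec format.
interface UsageRule {
  component: string;
  showWhen: string;    // the condition, written for the layout generator
  maxPerView: number;  // hard cap so generated views stay readable
}

const behaviorSpec: UsageRule[] = [
  {
    component: "TrendChart",
    showWhen: "time-series data is present and the change exceeds 5%",
    maxPerView: 2,
  },
  {
    component: "SummaryCard",
    showWhen: "more than three data sources appear in one view",
    maxPerView: 1,
  },
];

// A spec like this is reviewable the way a design spec used to be,
// but it constrains a generator instead of describing fixed screens.
function capFor(component: string): number {
  const rule = behaviorSpec.find((r) => r.component === component);
  return rule ? rule.maxPerView : 0;
}
```

The point is not this particular schema; it is that the deliverable becomes rules the system enforces, not pictures the system approximates.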

The Job Changes

The job changes from "Screen Architect" to "Component Librarian" and "Physics Designer."

You ensure the bricks are perfect. You write the rules for how they assemble. The AI builds the house.

This is more work upfront. But the payoff is enormous: instead of designing 50 pages, you design 20 components and 30 rules, and the system can generate thousands of relevant interfaces.
