Design systems have a reputation for sprawling complexity, even when they start out relatively small. And in my experience, they rarely stay small for long. What starts as a tidy component library quickly turns into months of iterating, realigning, and cleanup.
Recently, I developed a design system using AI connected to Figma through a Model Context Protocol integration, Figma MCP. When I began working on this system, I wanted to avoid that pattern. I wasn’t interested in producing more components. I wanted to build something that could hold its structure as the product evolved.
Instead of jumping straight into UI components, I built the system step by step. Tokens came first, components followed. Usage patterns came last. At each layer, AI helped reinforce consistency before complexity had a chance to accumulate.
The real breakthrough came from combining AI with Model Context Protocol inside Figma.
Working With Real Context, Not Abstract Prompts
Without context, AI can produce plausible output. It suggests patterns and writes guidelines that could work in many systems.
But with MCP inside Figma, AI could see my actual file.
It had visibility into:
- Component structures
- Variant properties
- Token definitions
- Naming conventions
- Style mappings
- Layer hierarchy
So I wasn’t asking AI to invent a design system. Instead, I was asking it to inspect, extend, normalize, and refactor the one already taking shape inside the file.
It could read real components and real tokens, so it could evaluate structure instead of guessing at intent.
Starting at the Foundation: Token Architecture
I began at the token layer before touching visible UI. The foundation included:
- Color scales with semantic mappings for success, warning, and destructive states
- Typography ramps with defined size, weight, and line-height ratios
- Spacing increments based on a consistent base unit
- Border radius values
- Elevation and shadow levels
- Semantic surface tokens for background and interaction states
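To make the layering concrete, here is a minimal sketch of what such a token foundation might look like in code. The names and values are hypothetical, not the actual tokens from my Figma file; the point is the structure: primitive scales at the bottom, semantic mappings referencing them, and spacing derived from a single base unit.

```typescript
// Illustrative token foundation (hypothetical names and values).
const base = 4; // spacing base unit in px

const tokens = {
  color: {
    // Primitive scale
    green600: "#16a34a",
    amber500: "#f59e0b",
    red600: "#dc2626",
    // Semantic mappings reference primitives, not raw hex values
    semantic: {
      success: "green600",
      warning: "amber500",
      destructive: "red600",
    },
  },
  // Every spacing increment is a multiple of the base unit
  spacing: {
    "space-1": base,
    "space-2": base * 2,
    "space-3": base * 3,
    "space-4": base * 4,
  },
  radius: { sm: 4, md: 8, lg: 16 },
  typography: {
    body: { size: 16, weight: 400, lineHeight: 1.5 },
    heading: { size: 24, weight: 600, lineHeight: 1.25 },
  },
};
```

Keeping semantic tokens as references rather than duplicated hex values is what lets a later change to the primitive scale propagate cleanly.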
Tokens quietly determine whether a system scales cleanly. If spacing increments drift or semantic colors overlap, the inconsistency spreads across every component that depends on them.
With MCP access, AI compared token values directly inside the file. It flagged:
- Redundant color tokens with nearly identical values
- Spacing increments that broke the established rhythm
- Radius values that did not align with the scale
- Naming inconsistencies that would later confuse implementation
Because it could trace where tokens were used, it also highlighted definitions that were rarely referenced or duplicated under different labels.
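Checks like these don’t require anything exotic; they can be sketched as a small audit pass over exported token values. The functions below are illustrative, not the actual tooling I used: one flags color tokens whose values are nearly identical, the other flags spacing values that break the base-unit rhythm.

```typescript
// Hypothetical audit pass over exported token values.
type ColorToken = { name: string; hex: string };

function hexToRgb(hex: string): [number, number, number] {
  const n = parseInt(hex.slice(1), 16);
  return [(n >> 16) & 255, (n >> 8) & 255, n & 255];
}

// Flag pairs of color tokens whose RGB values are nearly identical.
function findNearDuplicates(
  colors: ColorToken[],
  threshold = 10
): [string, string][] {
  const pairs: [string, string][] = [];
  for (let i = 0; i < colors.length; i++) {
    for (let j = i + 1; j < colors.length; j++) {
      const [r1, g1, b1] = hexToRgb(colors[i].hex);
      const [r2, g2, b2] = hexToRgb(colors[j].hex);
      const dist = Math.hypot(r1 - r2, g1 - g2, b1 - b2);
      if (dist < threshold) pairs.push([colors[i].name, colors[j].name]);
    }
  }
  return pairs;
}

// Flag spacing values that are not multiples of the base unit.
function findOffScaleSpacing(
  spacing: Record<string, number>,
  base = 4
): string[] {
  return Object.entries(spacing)
    .filter(([, value]) => value % base !== 0)
    .map(([name]) => name);
}
```

What MCP adds on top of a script like this is reach: instead of auditing an exported JSON file, the checks run against the live structure of the Figma file itself.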
That early normalization strengthened the system before I introduced additional surface complexity.
Refining Components With Structural Awareness
Once the token layer felt stable, I shifted focus to components.
Buttons, inputs, cards, and navigational elements all depended on token logic. As the library grew, variant complexity started creeping in. Small differences in state handling created overlapping configurations. Loading states varied slightly across components. Disabled styles drifted.
With contextual visibility, AI compared variant matrices across the system. It surfaced duplication and suggested consolidation where structural overlap existed.
This process changed how I treated variants. Instead of expanding options whenever a new edge case appeared, I defined clear rules for what qualified as a legitimate state.
Variants were treated more like contracts than endless options, helping reduce long-term maintenance issues.
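Treating variants as contracts can be expressed directly in types. The sketch below is a hypothetical example, not a real component from the system: the contract enumerates the legitimate states, and anything outside that enumeration is rejected rather than quietly accumulated as a new variant.

```typescript
// "Variants as contracts": the type enumerates the legitimate states
// instead of allowing free-form combinations. Names are illustrative.
type ButtonState = "default" | "hover" | "disabled" | "loading";
type ButtonVariant = "primary" | "secondary" | "destructive";

interface ButtonProps {
  variant: ButtonVariant;
  state: ButtonState;
}

// Runtime guard mirroring the contract, e.g. for validating
// values coming out of a design file rather than source code.
const VALID_STATES: ReadonlySet<string> = new Set([
  "default",
  "hover",
  "disabled",
  "loading",
]);

function isLegitimateState(state: string): state is ButtonState {
  return VALID_STATES.has(state);
}
```

A new edge case then becomes a deliberate change to the contract, reviewed once, rather than a one-off variant added in passing.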
Making Usage Predictable
Because AI could inspect actual component architecture, it supported the development of clearer usage guidance tied to real structure.
It helped generate:
- Do and do not examples based on existing variants
- Accessibility reminders connected to defined contrast values
- Notes on state transitions such as loading and disabled behavior
- Clarifications around when to use one component over another
These suggestions reflected the system as it existed, not generic best practices. Documentation aligned directly with implementation logic.
That alignment made the system easier to adopt and harder to misuse.
What Did AI Actually Contribute?
I wasn’t relying on AI to design the system independently, or to make aesthetic judgments. The real value it brought was in amplifying discipline.
Inside the file, it continuously analyzed patterns and surfaced inconsistencies before they compounded. It accelerated refactoring cycles and reduced the subjective drift I mentioned earlier.
I saw the impact in four areas:
- Faster iteration when updating or extending components
- Fewer debates around structural decisions
- Greater consistency across tokens, components, and usage guidelines
- Clearer, example-driven usage patterns that engineers could reference directly
Because AI operated within the real structure of the file, it functioned like an embedded audit layer. It continuously reinforced decisions and helped maintain alignment, so that complexity would not outrun a team’s capacity to keep up.
A Quick Note on MCP, Access and Security
Giving AI deep access to a live design file changes the shape of your workflow.
MCP allows AI to read real components, tokens, naming conventions, and system structure. That visibility is what makes the collaboration useful. It also means you are exposing parts of your product architecture. For teams working on proprietary products, that’s not trivial.
When we use MCP in active environments, we route it through a secure gateway to control access and make it observable. This doesn’t lock the system down to the point of friction, but it does prevent contextual access from becoming uncontrolled access.
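The shape of such a gateway policy can be sketched in a few lines. This is a generic illustration, not the actual Figma MCP or gateway API: capabilities are allowlisted per tool, denial is the default, and every request is logged so contextual access stays observable.

```typescript
// Hypothetical gateway policy: deny by default, allowlist specific
// capabilities, and log every decision. Tool names are illustrative.
type PolicyRule = { tool: string; allow: boolean };

const policy: PolicyRule[] = [
  { tool: "read_components", allow: true },
  { tool: "read_tokens", allow: true },
  { tool: "write_file", allow: false },
];

const auditLog: string[] = [];

function authorize(tool: string): boolean {
  const rule = policy.find((r) => r.tool === tool);
  const allowed = rule?.allow ?? false; // unknown tools are denied
  auditLog.push(`${tool} -> ${allowed ? "allow" : "deny"}`);
  return allowed;
}
```

The deny-by-default stance matters more than the specific rules: new capabilities added upstream stay blocked until someone deliberately opens them.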
AI tools are increasingly embedded in product workflows. That means security needs to be part of system design. The same care applied to tokens and component architecture should apply to how tools operate inside the environment.
Maintaining Coherence as Complexity Increases
When a team is small, informal alignment works. Designers clarify decisions directly with engineers, and you can correct small inconsistencies easily.
As soon as velocity increases, that model breaks down. Hiring additional designers and expanding to multiple surfaces adds structural pressure. Without reinforcement, inconsistency compounds.
A well-structured design system:
- Reduces cognitive load across the team
- Speeds onboarding for new contributors
- Minimizes implementation ambiguity
- Prevents large-scale refactoring later
As tools like MCP evolve, the opportunity for product, design and engineering teams extends beyond cleanup and normalization. They can begin treating structural integrity as a continuous property of their design process. AI becomes part of the system’s architecture, not just a productivity layer on top of it. That is a true product foundation that scales with intention rather than repair.