Smart Styling with AI: The Next Step in UI Design

Nicolas Matias


Why we started this project

Across UX/UI work, particularly in low-code environments like OutSystems and Mendix, we kept seeing the same issue: despite having robust design systems and reusable components, visual consistency across teams would often break down. Small styling discrepancies slipped through, leading to inconsistent user experiences, prolonged reviews and duplicated rework.

Once we understood the root of the issue, it became clear that there was an opportunity for improvement. What if we could automatically detect design inconsistencies based on design system rules? And what if smart styling guidance could be integrated right from the start of the design process?

Our approach

We decided to develop a system called UI Styling using AI, not to replace designers but to support them by bringing automation and intelligence into the styling process.


The tool analyzes user interfaces, whether from Figma files or screenshots, and compares them against predefined design rules. When inconsistencies are detected, it generates contextual suggestions, helping teams correct issues before they become problems.
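To make this concrete, the check can be thought of as comparing extracted style values against the values a design system allows, and turning any mismatch into a suggestion. The sketch below is a simplified TypeScript illustration; the node shape, token values and suggestion format are assumptions for this post, not the actual implementation.

```typescript
// Minimal sketch: checking extracted styles against design-system tokens.
// The ExtractedNode shape, token values and Suggestion format are illustrative only.

interface ExtractedNode {
  name: string;
  fontSizePx: number;
  color: string; // hex, e.g. "#1A1A2E"
}

interface Suggestion {
  node: string;
  issue: string;
  proposal: string;
}

const allowedFontSizes = [12, 14, 16, 20, 24, 32]; // assumed type scale
const allowedColors = new Set(["#1A1A2E", "#FFFFFF", "#E94560"]); // assumed palette

function checkNode(node: ExtractedNode): Suggestion[] {
  const suggestions: Suggestion[] = [];

  if (!allowedFontSizes.includes(node.fontSizePx)) {
    // Suggest the nearest size on the scale rather than an arbitrary fix.
    const nearest = allowedFontSizes.reduce((a, b) =>
      Math.abs(b - node.fontSizePx) < Math.abs(a - node.fontSizePx) ? b : a
    );
    suggestions.push({
      node: node.name,
      issue: `font-size ${node.fontSizePx}px is off the type scale`,
      proposal: `use ${nearest}px`,
    });
  }

  if (!allowedColors.has(node.color.toUpperCase())) {
    suggestions.push({
      node: node.name,
      issue: `color ${node.color} is not a brand token`,
      proposal: "map to the closest brand color token",
    });
  }

  return suggestions;
}

// Example: a heading that drifted slightly from the design system.
console.log(checkNode({ name: "Card/Title", fontSizePx: 15, color: "#1b1b2f" }));
```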

This approach accelerates styling during both the design and implementation phases, reduces manual work and strengthens adherence to brand guidelines.

The goal is not to automate creativity; it is to automate consistency.

Tools and technologies

The project is built on a modern tech stack that brings together several powerful tools:

  • React and Next.js for UI logic and frontend rendering;
  • Figma Plugin API to consume a single source of truth directly from design files (see the sketch after this list);
  • OpenAI with custom embeddings for semantically analyzing the use of naming conventions, style, and layout patterns;
  • Tailwind CSS, shadcn/ui and design tokens to translate AI outputs into implementation-ready styles;
  • Lighthouse and Storybook as alternative sources for structural and visual analysis.
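To show how the Figma Plugin API fits in, here is a brief sketch of traversing the current page and flagging text styles that fall outside an assumed type scale. Only the page traversal and node properties come from the public Plugin API; the token list and console reporting are assumptions for illustration.

```typescript
// Figma plugin sketch (runs inside the Figma plugin sandbox).
// The allowed sizes and the reporting format are illustrative assumptions.

const allowedFontSizes = [12, 14, 16, 20, 24, 32]; // assumed type scale

// Collect every text node on the current page.
const textNodes = figma.currentPage.findAll(
  (node) => node.type === "TEXT"
) as TextNode[];

for (const node of textNodes) {
  // fontSize is figma.mixed when a single text node mixes several sizes.
  if (node.fontSize === figma.mixed) {
    console.log(`${node.name}: mixed font sizes, review manually`);
    continue;
  }

  if (!allowedFontSizes.includes(node.fontSize as number)) {
    console.log(
      `${node.name}: font size ${String(node.fontSize)}px is off the type scale`
    );
  }
}

figma.closePlugin("Styling check finished");
```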

Challenges we’re addressing

One of the main challenges lies in teaching AI what consistency looks like. Since visual design is often subjective, translating abstract concepts such as balance, rhythm and hierarchy into machine-readable patterns can be particularly complex.

There is also the matter of adaptation, since each design system possesses its own nuances, and the AI needs to interpret rules within that context.
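One way to approach both problems is to express guidelines as data that the analysis step can interpret per design system, so the same engine can serve several systems with different token scales. The schema below is a hypothetical illustration of that idea, not the rule format the tool actually uses.

```typescript
// Hypothetical rule schema: abstract guidelines expressed as checkable data,
// scoped to a specific design system.

type RuleSeverity = "error" | "warning" | "hint";

interface StyleRule {
  id: string;
  description: string;      // human-readable intent, e.g. "keep vertical rhythm"
  property: string;         // CSS-like property the rule constrains
  allowedValues: string[];  // values drawn from this design system's tokens
  severity: RuleSeverity;
}

interface DesignSystemRules {
  designSystem: string;     // e.g. "Rocket UI" vs. a client-specific system
  rules: StyleRule[];
}

// Example: a "spacing rhythm" guideline tuned to one design system's scale.
const exampleRules: DesignSystemRules = {
  designSystem: "Rocket UI",
  rules: [
    {
      id: "spacing-rhythm",
      description: "Vertical spacing should follow the 8px rhythm",
      property: "margin-bottom",
      allowedValues: ["8px", "16px", "24px", "32px"],
      severity: "warning",
    },
  ],
};

console.log(exampleRules.rules[0].description);
```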

Ultimately, our goal is to maintain a clear boundary: the AI supports, but designers and developers stay in full control.

Impact and relevance

The benefits of this approach are already clear. By integrating intelligent styling validation into the workflow, we see four key advantages:

  1. Reducing time spent on visual QA;
  2. Ensuring stronger consistency across large teams and codebases;
  3. Onboarding new contributors more quickly, with AI offering styling guidance in real time;
  4. Flagging potential issues earlier, thereby avoiding rework late in the process.

This changes not only how we style interfaces, but also how we maintain design integrity throughout the lifecycle of a digital product.


Current status and what’s next

We have developed a working prototype capable of reading Figma files and UI screenshots, identifying misalignments with a design system, and generating correction suggestions.

Over the next few months, the focus will be on integrating this into design and development workflows, including the possibility of reviewing styling consistency during pull requests. We are also implementing a Sentinel Mode that actively monitors screens during their creation or editing, providing real-time feedback.
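For the pull-request integration, one plausible shape is a small script run in CI that reads a styling report produced earlier in the pipeline and fails the check when blocking issues remain. This is a speculative sketch with a hypothetical report path and format, not a description of the shipped workflow.

```typescript
// Hypothetical CI gate: read a styling report and fail the PR check on errors.
import { readFileSync } from "node:fs";

interface ReportEntry {
  screen: string;
  issue: string;
  severity: "error" | "warning" | "hint";
}

// Report path and structure are assumptions for illustration.
const report: ReportEntry[] = JSON.parse(
  readFileSync("styling-report.json", "utf8")
);

const errors = report.filter((entry) => entry.severity === "error");

for (const entry of errors) {
  console.error(`[${entry.screen}] ${entry.issue}`);
}

if (errors.length > 0) {
  // A non-zero exit code marks the pull-request check as failed.
  process.exit(1);
}

console.log(`Styling check passed (${report.length} findings reviewed)`);
```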

Once it reaches maturity, we aim to release an MVP to select clients and partners to gather early feedback.

Related projects

This initiative builds directly on our work with Rocket UI, where scalability and maintainability of the front-end are key. It also incorporates lessons learned from internal tools that were developed to support accessibility and interface QA in low-code environments.

Together, these efforts constitute a key component of our broader vision: integrating intelligence into design execution, without compromising creativity or control.