Advocating & Building a Design System
Early-stage Design System - Pie Insurance.

Pie's design system was originally created in Adobe XD. I led the transition to Figma and expanded the system from 40 components and patterns to 111, creating a baseline for the needs of five different products.
Additionally, we focused on reducing the number of hosting environments and establishing one central repository for our applications, so that changes to the design system would be reflected across all applicable products.
Problem Statement
The lack of a fully transitioned and standardized design system created inconsistencies between design and code, leading to inefficiencies for designers and developers, increased rework, and slowed implementation timelines. Without a centralized, well-maintained system, teams faced usability issues in components, inconsistent UI patterns, and difficulty maintaining alignment across product experiences.
My Role
As the Product Designer leading the refinement and transition of Pie's design system, my role was to ensure alignment between design and development, improve component usability, and increase efficiency for designers and engineers.
I was assigned as lead designer on the Design Operations team, working cross-functionally with our web platform team. During this time, my key responsibilities included:
- Built Pie's design system in Figma, achieving component alignment between the design and code sandboxes, significantly improving front-end development efficiency, and ensuring consistency across Pie's products.
- Spearheaded usability testing of the design system, validating component effectiveness. Testers highlighted its simplicity, responsiveness, and ease of use, leading to further refinements.
- Advocated for completing the sandbox and for building out components and patterns regularly used across Pie, and for Product Owners to make time for the repository transition.
🚀 The estimated savings are $75k annually, with a streamlined workflow for approximately 30 employees saving an estimated 240 hours per component compared to working without the system.

Brand Incorporation
To ensure a seamless and cohesive user experience, I collaborated with the marketing team to incorporate brand identity into the design system. Early-stage discussions focused on aligning tone, visual elements, and website consistency, ensuring our system reflected the established brand while remaining adaptable for future growth. By translating brand standards into scalable UI components, I helped create a system that not only reinforced brand recognition but also maintained design flexibility, enabling teams to build with efficiency and consistency across products.
Using Atomic Design Principles
During this process, we leveraged atomic design principles to create a scalable, modular framework that supports consistency and efficiency. Components were broken down into their smallest functional elements: atoms (buttons, inputs, icons, cards), molecules (form fields, dropdowns), and organisms (full-form sections, document upload).
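The atoms → molecules → organisms breakdown can be sketched as composed render functions. This is a minimal illustration, not Pie's actual component code; the names (`TextInput`, `FormField`, `UploadSection`) are hypothetical.

```typescript
// Atom: the smallest functional element
const TextInput = (name: string): string => `<input name="${name}" />`;

// Molecule: an atom combined with a label into a form field
const FormField = (label: string, name: string): string =>
  `<label>${label}${TextInput(name)}</label>`;

// Organism: molecules assembled into a full-form section
const UploadSection = (): string =>
  `<section>${FormField("Document name", "docName")}${FormField("Upload", "file")}</section>`;

console.log(UploadSection());
```

Because each level only composes the level below it, a fix to an atom (say, the input's padding) propagates to every molecule and organism that uses it, which is the core efficiency argument for the hierarchy.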
Process
1️⃣ Identifying Needs & Gaps
- Conducted an audit of existing components in Figma and compared them to coded implementations in the sandbox.
- Engaged designers and developers to understand usability issues, inconsistencies, and gaps in the system.
- Identified common auto-layout bugs and design layout inconsistencies (e.g., padding, spacing).
2️⃣ Aligning Design & Code
- Collaborated with the Web Platform team to determine which components needed updates, refinement, or alignment with coded versions.
- Ensured that naming conventions, properties, and interactions matched engineering expectations for smoother implementation.
3️⃣ Usability Testing & Validation
- Designed and facilitated usability tests to evaluate whether components were intuitive, easy to use, and functionally effective.
- Conducted tests with a broad user test pool to observe interactions with components in realistic scenarios.
- Gathered quantitative and qualitative feedback, focusing on areas like interaction clarity, response time, and overall ease of use.
4️⃣ Refining & Iterating the System
- Made design adjustments based on testing insights, resolving usability friction and refining component interactions.
- Worked closely with developers to implement fixes, ensuring updates were reflected in both Figma and the sandbox environment.
- Addressed designer-reported issues, including auto-layout fixes and improvements in component flexibility.

Example Process: Date Picker
While updating the date picker component, we discovered several functionality issues. Typing in dates had nuanced inconsistencies, navigating months was cumbersome, and some edge cases—like adding a 31st day to a month without one—caused minor errors.
We took this opportunity to enhance the component’s functionality, usability, and readability. Rather than reinventing the wheel, we opted for an open-source solution that we simplified and adapted to our needs.
The updated date picker was included in usability testing alongside other new and revised components.
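The "31st day in a month without one" edge case comes down to validating a day against the actual length of its month. A minimal sketch of that guard, using JavaScript's `Date` rollover behavior (day 0 of the next month resolves to the last day of the current month); function names here are illustrative, not from the open-source picker we adapted:

```typescript
// month is 1-based here (1 = January), so passing it unadjusted to the Date
// constructor (which is 0-based) already points at the *next* month, and
// day 0 rolls back to the last day of the month we care about.
function daysInMonth(year: number, month: number): number {
  return new Date(year, month, 0).getDate();
}

function isValidDate(year: number, month: number, day: number): boolean {
  return month >= 1 && month <= 12 && day >= 1 && day <= daysInMonth(year, month);
}

console.log(isValidDate(2024, 4, 31)); // false: April has 30 days
console.log(isValidDate(2024, 2, 29)); // true: 2024 is a leap year
```

Rejecting the input at this layer (rather than letting `Date` silently roll April 31 over to May 1) is what lets the component surface a clear error instead of a surprising date.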


Adobe XD Date Picker
Updated Date Picker in Figma
Example Process: Design and Code Alignment
Design Operations (myself) and the Web Platform team worked closely together to ensure that the components built in the design kit and those built in the sandbox were consistent and one-to-one.

Adobe XD Design Kit

Phase 1: Component creation in the sandbox

APIs + Design Systems
Within the design system team, we were not only looking at UI components and their general functionality but also digging deeper into patterns and, at times, the data presented on the front end via the back-end API.
The example displayed here is an issue we found with auto-generated selections as a user typed, most often in our autocomplete component.
We looked into numerous bugs, such as the different experiences a user could have when typing a word versus typing numbers.
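The word-versus-number mismatch can be illustrated with a small sketch: if the autocomplete only matches on the label text, typing digits (say, a code) returns nothing even though the data contains it. Matching on both fields resolves the inconsistency. The `Suggestion` shape and sample data below are hypothetical, not Pie's actual API:

```typescript
interface Suggestion {
  label: string; // human-readable name
  code: string;  // numeric identifier from the back end
}

// Matching against both label and code gives one consistent experience
// whether the user types a word or a number.
function matchSuggestions(query: string, items: Suggestion[]): Suggestion[] {
  const q = query.trim().toLowerCase();
  return items.filter(
    (s) => s.label.toLowerCase().includes(q) || s.code.includes(q)
  );
}

const sample: Suggestion[] = [
  { label: "Carpentry", code: "5403" },
  { label: "Plumbing", code: "5183" },
];

console.log(matchSuggestions("carp", sample)); // matched by word
console.log(matchSuggestions("54", sample));   // matched by number
```

Auditing where the front-end filter and the back-end API each applied (or skipped) this kind of normalization is what surfaced the divergent typing experiences.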
Usability Testing Each Component
We ran usability testing to evaluate whether the updates and additions to the design system met the needs of our users and were effective. The research aimed to identify usability issues and areas for improvement while aligning with the business objectives of operational efficiency and being "easy as pie".
Users
Five test users went through the process in about 5-10 minutes each. The criteria for the user pool were kept broad to reflect real-world scenarios.
The Test
Five participants completed a series of three tasks covering 10 components. One of the five participants used accessibility accommodations on their computer, such as a screen reader.
Post Test Questions:
- Which best describes your ability to complete your tasks on the website? (Very easy - Very difficult)
- How would you rate your overall experience with this website? (1 Poor - 5 Excellent)
- If you could change anything about your experience on this site, what would it be? (Open text)
- Is there anything else you would like to share about your experience using the website? (Open text)
The Results
Success Metrics for Evaluating Usability:
- Task completion rate: 80%. One participant could not complete their task, either because they did not understand what the input's label was asking for or because they confused their email address with their mailing address.
- Time on task: Average of 7 minutes.
- Satisfaction scores: Excellent/good, with an average of 4.5.
What Users Said:
- More than one user called the experience "simple" and "nice."
- All users felt their interactions with the tested components were responsive and easy to use.
- No users rated their experience as three or below.
- Users did call out the need for clearer labeling and suggested features such as a live chatbot.