Vanti.ai redesign

As lead product designer, I led collaborative brainstorming, shaped user-centric experiences through extensive UX research, and crafted low-fidelity prototypes. I also contributed to management activities, including sprint planning and weekly reviews, managed design system elements, and collaborated with development teams on product reviews and feature requirements.

🏙️ Company: Vanti.ai

🗓️ Project date: 2021

🖥️ Project type: Desktop, B2B SaaS app

👩 My role: Sole product designer

Project summary:

This project aimed to revamp the model creation process in Vanti.ai's SaaS product and address key user issues. The streamlined flow not only saves time but also prioritizes user needs, enhancing trust, clarity, and memorability in the product experience.

The challenge:

How can we enhance the model-building process?
A significant challenge was designing for the diverse needs of users, particularly in factory settings. Users struggle to find suitable AI models, to determine how best to use their data for prediction and defect detection, and to understand the improvements a model can bring to their manufacturing lines.

Redesign of Vanti's dashboard within the stations view

User tasks include integrating AI models into manufacturing processes for efficient production deployment and comparing model configurations.

Redesign of station creation flow

User needs

Users often spend considerable time transforming data into a trained AI model that can be deployed on the production line, especially when tasks such as adding dataset attributes or using data connectors come into play.

It is therefore crucial that users can quickly review and compare the various models they've built with different configuration options, ultimately contributing to a more streamlined and effective user experience.

Solution

Process: The set of steps taken to address the project goals.

RESEARCH:

  • Competitive audit
  • User interviews
  • Context study
  • Usability testing

IDEATION:

  • Feature narrative
  • Jobs To Be Done
  • Design principles
  • Journey map & UX flow

FEEDBACK:

  • Interaction analysis with FullStory
  • User testing
  • Designer critiques
  • Stakeholder reviews

The Importance of a Competitive Audit in Strategic Planning

RESEARCH

I thoroughly analyzed the current interface structure, gathered usage and engagement data, and conducted a comprehensive competitive analysis by studying successful and unsuccessful examples from various apps.

In addition, I conducted a visual process evaluation, examining how competitors present data, the type of information displayed, filters utilized, and the presentation of specific results.

Furthermore, I delved into the analysis process, exploring the range of actions users can take, how they receive alerts, which features they use, and how they navigate between different models' performance.

Visual Process

Analysis Process

Unveiling User Pain Points: Navigating Challenges in Product Interaction and Experience

Context study

Manufacturing technicians, often with engineering degrees, operate machines in automated processes. In factory settings, they work 9-10 hour shifts, ensuring safety and product quality through inspections and testing.

Manufacturing managers, typically aged 50-70, are experts in manufacturing but unfamiliar with AI. They usually hold a bachelor's degree, oversee production while relying on others for data work, and work at a factory workstation, addressing safety and technical issues during 9:00-17:00 shifts.

A flow chart showing the new process

Methods

How did we solve the problems?

Interviews:
My goal was to gain a deeper understanding of users' pain points through user interviews. Explainability and trust among our users were the main KPIs I wanted to measure. Using the People + AI Guidebook method, I divided the interview into three main stages: calibration of trust, explanation strategy, and user research.

I divided the users into two groups and asked both groups the same questions to identify their needs. Users in Group A are typically manufacturing technicians or others who interact with AI models regularly. They don't trust our AI models and keep building new ones without deploying them on production lines.

Group B consisted of non-technical users, such as manufacturing managers, who wanted to understand the production process without a deep understanding of data science; as a result, they tend to place more trust in our models than they should.

Interviewing users with the following method

Revamped configuration flow featuring both manual and automatic options.

Model creation wireframes from a redesigned process

Sitemap of the redesigned app flow

Usability testing

I continually conducted usability tests with our monitoring and analysis software, assessing user comprehension of app feedback, pinpointing decision-making challenges, and evaluating the app's overall flow. These tests revealed insights that led to the identification of new feature opportunities.

Model creation's primary flow

With the new model creation flow, users can create the following use cases and models

Users need to create a model efficiently and gain insight into its deployment performance so they can implement it reliably.

Features

Labeling Stations and Products, Streamlined Model Creation Flow, Immediate Data Verification, and Enhanced Data Attribute Preview in Aperio App.

Enhancements to the model creation flow include a progress bar, division into clear stages, and immediate data verification. Users can preview and add data attributes with real-time feedback, improving the "white box" experience. The step tracker offers insight into model creation progress and estimates completion time, supporting user retention. Users can multitask during this process.

First step in the redesigned model creation flow.

Revamped UX patterns and layout for the new site overview feature, empowering users to monitor and adjust models seamlessly within the real-time site process.

Users can label the stations and products associated with a model, offering a comprehensive factory view beyond model data. They can upload diverse data for various use cases, fostering customized model creation aligned with station goals. Additionally, we've clarified the acceptable image and tabular data formats to prevent repeated errors.

The flow before the revamp

The flow after the revamp

Results

Project Success Metrics:

As a result, users were able to create flows faster and deploy more models on the production line, giving them more time to address data quality concerns. The user experience also received praise from potential customers.

Reflection

Project Hindsight:

The project taught me the importance of analyzing existing flows and finding potential solutions before developing them. In complex flows, it is helpful to have uniform components, screens, and step names.

This allows everyone to understand the context of a task or conversation and helps the team work together more efficiently.

Employing the WHITE BOX METHOD, we elucidated the model creation process, giving users insight into each stage while they waited for the model to complete.

Enhanced Model Review Preceding Deployment

Redesigned Model Deployment Report

Testimonials

How people feel about the project:
Kudos to the company's user experience team! The revamped process for creating models is much clearer, making it accessible for anyone. The new features simplify model creation, and the estimated build time is now transparent. Great job!

- Flex's team of analysts
