AI-Powered Visual Search


An AI-powered visual search solution, built from 0 to 1, to reduce friction in creative talent hiring

Founding Product Designer @ THAT-SAUCE.COM

TEAM

Product Engineer, Creative Director, Brand Designer

TIMELINE

February – May 2025

WHAT I DID

Product Discovery, Prototyping, User Flows, Info Arch, Cross-Functional Collaboration, Handoff

IMPACT

  • AI-powered interactions doubled result accuracy

  • 90% user satisfaction during demo

  • Halved dev handoff time through a shared design system

00 context

that-sauce.com is a platform where brands search for creative talent by style and medium, using NLP-powered visual search.

As the founding product designer, I used product strategy and cross-functional collaboration to define a unique core search experience. Other contributions include onboarding and growth flows targeting brands and creative talent, and the design infrastructure for launch in Fall 2025.

01 design solution

Visual search experience that extracts style keywords from natural language & refines ambiguous terms with AI suggestions

I worked with machine learning engineers to define how people search for visual work using natural language. Through a series of AI-powered interactions, the results closely align with the input, despite the often subjective and ambiguous terms people use to describe visual work. The result is a search engine that truly "works like magic" and allows brands to quickly hire the creatives they want.
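To make the two steps concrete, here is a toy sketch, not the production system: the real pipeline uses a trained NLP model, but a small hand-written style vocabulary and fuzzy matching can illustrate the same shape — extract style keywords from a free-text query, then offer refinement suggestions for broad terms. The vocabulary and function names below are illustrative assumptions, not that-sauce's actual code.

```python
# Illustrative sketch only: a hand-written vocabulary and fuzzy matching
# stand in for the trained NLP model behind that-sauce's visual search.
import difflib

# Hypothetical style vocabulary: broad keyword -> more specific refinements.
STYLE_VOCAB = {
    "minimalist": ["clean", "whitespace", "geometric"],
    "retro": ["vintage", "grainy", "analog"],
    "surreal": ["dreamlike", "abstract", "collage"],
}

def extract_keywords(query: str) -> list[str]:
    """Step 1: pull known style keywords out of a natural-language query,
    tolerating typos via fuzzy matching."""
    hits = []
    for token in query.lower().split():
        match = difflib.get_close_matches(token, STYLE_VOCAB, n=1, cutoff=0.8)
        if match:
            hits.append(match[0])
    return hits

def suggest_refinements(keyword: str) -> list[str]:
    """Step 2: surface more specific terms for a broad/ambiguous keyword,
    mirroring the AI suggestion step in the search flow."""
    return STYLE_VOCAB.get(keyword, [])

print(extract_keywords("looking for a minimalist photographer"))
# ['minimalist']
print(suggest_refinements("minimalist"))
# ['clean', 'whitespace', 'geometric']
```

In the real product the vocabulary is learned rather than hard-coded, but the interaction contract is the same: the interface extracts structure from free text, then asks the user to confirm or refine it.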

[Gif: user browsing search results]
[Gif: user using the prompt enhancer feature]

02 research & discovery

STEP #1

How does NLP-based and visual search work? How can we make natural language processing “understand” user intent?

I had many conversations with machine learning engineers about how they were training the NLP model behind that-sauce. Understanding how the backend worked allowed me to design interfaces that let users input exactly what they are looking for. Below, I show our understanding of the model.

STEP #2

To better understand user needs, I asked: "What exactly are users searching for? What do they need to search for?"

To answer these questions, I spoke with users on the hiring side, asking questions such as:

  • “What criteria do you use to find creative talent?”

  • “Describe the last role you hired for and how”

  • “How do you assess creative work for hire?”

THE RESULT

Defining the taxonomy of search with role and style

Starting broad, I explored various entry points to incorporate flexibility and personalization at core nodes in the search journey. These experiments were intentionally open-ended, meant to raise questions about my vision.

03 rapid iteration

PROBLEM STATEMENT

How might we enable and encourage users to enter the most robust and specific search terms pertaining to role and style?

DESIGN PRINCIPLE #1

Focus on first-time accuracy and decrease “time-to-value”

DESIGN PRINCIPLE #2

Design a fluid and intelligent experience

Iteration #1

I started by experimenting with a search bar that offers keyword and style suggestions while typing. We felt it didn’t give users enough guidance and context to create search criteria with intention.

Iteration #2

A second iteration included a more complex search bar with options to manipulate, such as role, style, and color.

Iteration #3

Expanded the search box to include suggestions and filters, similar to GenAI interfaces like Perplexity and ChatGPT. The “prompt box” format drew on familiar patterns to encourage more natural, long-form input.

How might we display search results in a clear and unbiased way?
— showing creative work in its original format and dimensions

After brainstorming ways to display creative work dynamically without losing faithfulness to the original, we landed on a single row of creator work that accommodates any dimension and a mix of video and static formats.

04 pivot in direction

Due to technical constraints, my team requested a simpler search flow with limited functionality

As we searched for product-market fit and the search function grew bloated with new requirements, I led a pivot of the core search experience and information architecture.

USER FLOW: BEFORE

USER FLOW: AFTER

NEW CHALLENGE

How might we create a step-by-step yet intuitive search flow to improve the match between search input and results?

Decision #1

Information architecture

The first core decision was to move the search bar into a new page in order to accommodate more expansive search functionalities.

Decision #2

Search interface explorations

Then, I experimented with AI interfaces inspired by platforms like OpenAI and Perplexity – which provide a workspace for prompting and AI interactions.

Decision #3

Adding visuals and motion to optimize user comprehension

  • Added visual elements to roles to help users understand who / what they might be looking for

  • Animated the transition between first choosing a role and then describing it, so the flow feels intuitive rather than like a step-by-step form

THE FULL DESIGN PROCESS 🗝️

How did we make the final design decisions? I'd love to walk this through in an interview as the actual process is far more convoluted than covered here :)

05 impact & reflection

Some reflections and takeaways…

Don’t be afraid to experiment

I'm super grateful to have been a part of a team that was open to new ideas & patterns, and who trusted me to spend time experimenting and bringing our unique vision to life.

Establish early alignment with devs and branding/marketing on vision

One major achievement at the beginning of the project was aligning cross-functional collaborators on the project vision – including how core features should feel and what they should achieve. In particular, I leveraged rapid low-fidelity prototyping to quickly demonstrate and generate ideas.