A 0-to-1 visual search solution to reduce friction in creative talent hiring
that-sauce.com is a platform for brands to search for creative talent based on style and medium using NLP-powered visual search.
As the founding product designer, I combined product strategy and cross-functional collaboration to define a distinctive core search experience. I also designed onboarding and growth flows targeting brands and creative talent, and defined the design infrastructure for launch in Fall 2025.
Product Engineer, Creative Director, Brand Designer
February – May 2025
Figma, Notion, Slack, ProtoPie
Product Discovery
Prototyping
User Flows
Cross-Functional Collaboration
End-to-End Wireframes
Information Architecture
Visual search experience that extracts style keywords from natural language & refines ambiguous terms with AI suggestions
I worked with machine learning engineers to define how people search for visual work using natural language. Through a series of AI-powered interactions, results closely align with the input, despite the often subjective and ambiguous terms people use to describe visual work. The result is a search engine that truly "works like magic" and allows brands to quickly hire the creatives they want.
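The production pipeline is the ML team's work, but the interaction model I designed around can be sketched roughly: pull recognizable style terms out of a free-text brief and surface suggestions for terms that almost match. The vocabulary, threshold, and function names below are my own illustrative placeholders, not the that-sauce implementation.

```python
# Minimal sketch (not the production pipeline): extract style keywords from a
# free-text brief and suggest refinements for near-miss or ambiguous terms.
# The vocabulary and cutoff are illustrative placeholders.
from difflib import get_close_matches

STYLE_VOCAB = ["minimalist", "brutalist", "hand-drawn", "retro-futurist", "editorial", "playful"]

def extract_style_keywords(query: str) -> dict:
    tokens = [t.strip(".,").lower() for t in query.split()]
    matched, suggestions = [], {}
    for token in tokens:
        if token in STYLE_VOCAB:
            matched.append(token)                        # exact style term
        else:
            close = get_close_matches(token, STYLE_VOCAB, n=2, cutoff=0.75)
            if close:
                suggestions[token] = close               # ambiguous term -> suggested refinement
    return {"styles": matched, "refine": suggestions}

print(extract_style_keywords("Looking for a playful, minimalst illustrator"))
# {'styles': ['playful'], 'refine': {'minimalst': ['minimalist']}}
```

The same shape of interaction shows up in the UI: confirmed terms become search criteria, while near-misses are surfaced back to the user as suggestions rather than silently dropped.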


Visual design foundations, growth, and other core flows for launch in late 2025
Initiating user discovery and cross-functional conversations to define the core search taxonomy
I initiated early cross-functional conversations with the product director and machine learning engineers to define the core search criteria: role, medium, and style. Below is a mental map of how we defined and understood search inputs and outputs.
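As a rough illustration of that taxonomy (field names are mine, not the production schema), a parsed query structured around the three criteria might look like this:

```python
# Illustrative sketch of the search taxonomy we aligned on; field names are
# placeholders, not the production schema.
from dataclasses import dataclass, field

@dataclass
class SearchQuery:
    raw_text: str                        # the brand's natural-language brief
    role: str | None = None              # e.g. "illustrator", "motion designer"
    medium: str | None = None            # e.g. "3D", "print", "video"
    styles: list[str] = field(default_factory=list)  # e.g. ["playful", "retro"]

query = SearchQuery(
    raw_text="playful 3D illustrator for a snack brand",
    role="illustrator",
    medium="3D",
    styles=["playful"],
)
```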
How can natural language processing “understand” and help search a visual/creative work?
I had many conversations with the machine learning engineers about how they were training the NLP model behind that-sauce. Understanding how the backend worked allowed me to design interfaces that let users input exactly what they are looking for. Below, I show our understanding of the model.
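My working mental model of the backend was simple: embed the brand's text query and each portfolio piece's description or tags in a shared space, then rank by similarity. The sketch below uses an off-the-shelf open-source model as a stand-in for intuition only; the actual that-sauce model and training data differ.

```python
# Rough mental model only: rank portfolio pieces by semantic similarity between
# the brand's query and each piece's text description/tags. sentence-transformers
# is a stand-in here, not the that-sauce model.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

pieces = [
    "hand-drawn editorial illustrations, muted palette",
    "bold 3D product renders with neon lighting",
    "playful character animation for social campaigns",
]
query = "playful animator for a snack brand launch"

piece_emb = model.encode(pieces, convert_to_tensor=True)
query_emb = model.encode(query, convert_to_tensor=True)
scores = util.cos_sim(query_emb, piece_emb)[0]           # cosine similarity per piece

for score, text in sorted(zip(scores.tolist(), pieces), reverse=True):
    print(f"{score:.2f}  {text}")
```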
Competitive analysis of visual search in the hiring and consumer space helped us define our vision
The main pattern I observed revolved around the complexity of the search experience: some apps surface filters and categories upfront, while others do not. We aimed to create an intuitive search experience with filters that don't feel like rocket science.
Ideating a search experience that foregrounds precision and specificity without feeling like operating a complex machine
The core goal of the search experience is to get people to enter specific and precise descriptions (although the beauty of NLP is that ambiguity still works). I ideated different UIs and flows for the visual search component.
Iteration 1
I started out experimenting with a search bar that offered keyword and style suggestions while typing. We felt it didn't give users enough guidance and context to build search criteria with intention.
Iteration 2
A second iteration included a more complex search bar with options to manipulate, such as role, style, and color.
Iteration 3
I expanded the search box to include suggestions and filters, similar to GenAI interfaces like Perplexity and ChatGPT. The "prompt box" format drew on familiar patterns to encourage more natural, long-form input.
How to display search results in a clear and unbiased way – showing creative work in its original format and dimensions
After brainstorming ways to display creative work dynamically without losing faithfulness to the original, we landed on a single row of creator work that accommodates any dimension and a mix of video and static formats.
As we searched for product-market fit and the search function grew bloated with new requirements, I navigated a pivot of the core search experience and information architecture.
The ask:
Our product founder wanted to put guardrails on the search experience to create a more streamlined flow that better matched the ML algorithm powering the search.
How I pushed back:
I felt this would make our search feel more like filling out a form than an intuitive experience, so I began iterating on a search flow that better matched the new requirements.
Iterating toward a more step-by-step but still intuitive search flow to better match search input and results
Information architecture
The first core decision was to move the search bar onto a dedicated page in order to accommodate more expansive search functionality.
Search interface explorations
Then I experimented with interfaces inspired by AI platforms like OpenAI and Perplexity, which provide a workspace for prompting and AI interactions.
Animating search with visuals and motion to create an intuitive experience
Final search flow:
Added visual elements to roles to help users understand who or what they might be looking for
Animated the transition between first choosing a role and then describing it, so the flow feels intuitive rather than like a step-by-step form

Don’t be afraid to experiment
I'm grateful to have been part of a team that was open to new ideas and patterns, and that trusted me to spend time experimenting and bringing our unique vision to life.
Establish early alignment with devs and branding/marketing on vision
One major achievement at the beginning of the project was aligning cross-functional collaborators on the vision, including how core features should feel and what they should achieve. In particular, I leveraged rapid low-fidelity prototyping to quickly demonstrate and generate ideas.