COMPANY:

Gigi

ROLE:

Product designer, Product manager

TEAM:

SEO, growth manager, CTO


SCOPE:

Product development, Product design, Brand experience

TIMELINE:

2 months

Problem

Gigi was introduced as an AI wingwoman inside a dating app, with strong underlying intelligence but weak real-world adoption. While the AI was capable of learning about users, generating profile descriptions, and suggesting relevant matches, users did not clearly understand when or why to engage with it. The dedicated Gigi tab, intended to be the heart of the experience, functioned mostly as an empty notification space with no clear call to action or emotional pull. As a result, users were not building a relationship with the AI, limiting the quality of data the system could learn from and, ultimately, the quality of matchmaking over time.

Challenge

The core challenge was designing an experience that could both hook users emotionally and remain clear, lightweight, and trustworthy. A key product assumption was that better long-term matches depend on a strong bond between the user and the AI: more trust leads to more interaction, richer data, and better personalization. However, in trying to make Gigi both proactive (to encourage engagement) and on-demand (to answer user questions), the experience became ambiguous. Users struggled to distinguish what they could ask the AI versus what the AI would suggest automatically, creating confusion around control, intent, and value. Solving this required clarifying the AI’s role without breaking the conversational, human feel of the product.

Woman In The Garden

Design process

I began by analyzing early usage patterns and qualitative feedback to understand why users were not engaging with the Gigi tab. I focused on moments of hesitation, confusion, and drop-off, as well as how users naturally seek dating advice from friends. This helped reframe the problem as a mental-model and trust issue rather than a feature or UI gap. From there, I worked closely with product managers and engineers to define clear boundaries between AI-initiated and user-initiated interactions, align on data sources and privacy constraints, and iterate on conversational flows. The process involved continuously testing clarity, pacing, and cognitive load to ensure the AI felt helpful and approachable rather than overwhelming or intrusive.

Solution

The final solution positioned Gigi as a chat-first, conversational partner rather than a utility or dashboard. The AI builds its understanding of users through a conversational onboarding and optional Instagram connection, allowing it to generate profile descriptions, suggest improvements, and proactively recommend compatible matches with context and reasoning. Inside conversations, Gigi acts as a private wingwoman: users can ask for advice about the person they are chatting with, receive suggestions for activities or meetups, and explore guidance without the other person ever seeing the AI. The Gigi tab was redesigned as a compelling AI hub that clearly separates what users can ask from what the AI suggests automatically, while preserving a natural, human conversation flow and full user control over all decisions.
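The separation described above — what users can ask versus what the AI suggests on its own — amounts to an interaction contract. A minimal sketch of how such a contract could be modeled is shown below; the type and function names are illustrative assumptions, not the product's actual implementation.

```typescript
// Hypothetical sketch of the interaction contract: every message in the
// Gigi tab carries a tag for who initiated it, so the UI can render
// proactive suggestions differently from replies to user questions.
// All names here are illustrative, not taken from the real codebase.
type GigiMessage =
  | { kind: "user_query"; text: string } // user-initiated: user asks Gigi
  | {
      kind: "ai_suggestion"; // AI-initiated: Gigi proactively suggests
      text: string;
      reasoning: string; // context shown with the suggestion
      actions: Array<"accept" | "dismiss">; // user keeps full control
    };

function renderLabel(msg: GigiMessage): string {
  // User-initiated turns read as a normal chat reply; AI-initiated
  // suggestions are labeled explicitly and always dismissible.
  return msg.kind === "user_query" ? "You asked" : "Gigi suggests";
}

console.log(renderLabel({ kind: "user_query", text: "Any icebreaker ideas?" }));
```

The key design choice the sketch encodes is that AI-initiated messages always carry visible reasoning and a dismiss action, which keeps proactivity from feeling intrusive.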

Learnings

This project reinforced that designing AI products is less about adding intelligence and more about shaping relationships, trust, and clarity of control. Proactive AI can drive engagement, but without a clear interaction contract it quickly becomes confusing or intrusive. I also learned the importance of explicitly designing input versus output, especially in conversational systems, and of grounding AI experiences in familiar human behaviors rather than abstract capabilities. Finally, working at the intersection of product strategy, AI interaction design, and cross-functional collaboration highlighted how critical it is to align emotional goals, system constraints, and long-term outcomes when building AI-driven experiences.

PORTFOLIO
87% Accessibility
Low environmental impact