Tripadvisor

(2025)

Background

For decades, Tripadvisor has been trusted for its rich, in-depth reviews that help travelers make confident, informed decisions. However, shifts in user behavior and the rise of lower-friction contribution models like Google Reviews have changed how people expect to engage with platforms. This project focused on evolving Tripadvisor’s contribution ecosystem by introducing new, lower-barrier ways for travelers to share their experiences. While long-form reviews remain highly valuable, their submission requirements had increasingly become a point of friction, leading to measurable drop-off over time. The goal was to complement traditional reviews with lightweight contribution formats that reduced effort, increased participation, and preserved content quality, ultimately expanding the volume and diversity of user-generated content while meeting users where they are today.

Services

Responsive Design

Discovery

Prototyping

INITIAL WORRIES

While Traveler Tips performed strongly in initial research, discovery revealed three critical risks that needed to be addressed to ensure the concept complemented, rather than diluted, Tripadvisor’s core review ecosystem.

1. Redundancy with Existing Reviews

One concern was that Traveler Tips could duplicate information already available in long-form reviews, reducing their incremental value. Reviews often contain buried insights such as “best time to go,” “what to order,” or “where to sit,” which could make tips feel repetitive if not clearly differentiated.

This raised an important design question: What makes a tip meaningfully distinct from a review?
Insights from research suggested that users viewed tips less as opinions or narratives and more as situational advice. This pushed us to frame tips as:

  • Highly specific and context-driven (e.g., “Order X,” “Go before Y,” “Avoid Z”)

  • Structured around discrete moments of decision-making rather than overall impressions

  • Positioned as a complement to reviews, not a replacement

This distinction later informed how tips were prompted, displayed, and scoped within the product.

2. Content Freshness & Relevance Over Time

Another key concern was that tips could become outdated more quickly than reviews. Travel experiences are dynamic: menus change, policies shift, and destinations evolve. In short, traveler advice can lose accuracy faster than broader sentiment.

Users expressed hesitation about trusting tips if they lacked clear signals of recency or context. This highlighted the need for:

  • Contextual relevance tied to the place, experience, or moment

  • Temporal cues (e.g., when the tip was written or last validated)

  • Mechanisms to surface the most helpful and up-to-date tips first

Recognizing this risk early helped frame tips as living content, rather than static contributions, and influenced later thinking around ranking, relevance, and lifecycle management.
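To make the idea of tips as living content more concrete, the sketch below shows one way recency and helpfulness signals could be combined when ordering tips. It is purely illustrative: the field names, weights, and half-life value are assumptions, not the ranking Tripadvisor actually shipped.

```typescript
// Hypothetical tip record; field names are illustrative, not Tripadvisor's schema.
interface Tip {
  text: string;
  helpfulVotes: number;
  createdAt: Date;        // when the tip was written
  lastValidatedAt?: Date; // optional "still accurate" confirmation
}

// Recency-weighted score: helpfulness decays as a tip ages,
// and a recent validation resets its effective age.
function tipScore(tip: Tip, now: Date = new Date()): number {
  const halfLifeDays = 180; // assumed decay rate: weight halves every ~6 months
  const referenceDate = tip.lastValidatedAt ?? tip.createdAt;
  const ageDays = (now.getTime() - referenceDate.getTime()) / 86_400_000;
  const freshness = Math.pow(0.5, ageDays / halfLifeDays);
  return (1 + tip.helpfulVotes) * freshness;
}

// Surface the most helpful, most current tips first.
function rankTips(tips: Tip[]): Tip[] {
  return [...tips].sort((a, b) => tipScore(b) - tipScore(a));
}
```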

3. Potential for Abuse & Trust Erosion

There were concerns around misuse, specifically businesses posting self-promotional or misleading tips. Because tips are shorter and easier to submit, they could be more susceptible to manipulation than traditional reviews.

Trust is foundational to Tripadvisor’s brand, and any erosion would undermine the value of the entire platform. Discovery made it clear that tips would require:

  • Clear attribution and accountability

  • Signals that differentiate traveler-generated content from business influence

  • Guardrails to maintain authenticity and credibility at scale

This insight reframed Traveler Tips as not just a contribution feature, but a trust-sensitive system that needed to uphold the same standards users expect from reviews.


DISCOVERY

The discovery phase focused on aligning leadership and cross-functional partners around the opportunity to introduce Tips as a lightweight UGC format within Tripadvisor’s broader content ecosystem. Given the project’s high visibility, early work prioritized clarity of purpose, shared understanding, and confidence in the direction.

I partnered closely with Product, Engineering, and Research, meeting bi-weekly with the core working team and presenting weekly to leadership. These stakeholder meetings were used to frame the problem, surface tradeoffs, and align on our goals.

We began with a competitive and market analysis of short-form contribution models, evaluating patterns such as single-question prompts, pre-filled responses, free-form text, and hybrid approaches. This highlighted a consistent industry tradeoff: low-effort inputs drive higher participation, while high-effort formats deliver depth at the cost of frequency. This insight became foundational in positioning Tips as a complementary, lower-friction contribution model.

A key discovery insight was recognizing that reviews and tips serve different user mindsets. Reviews are a reflective, post-trip behavior requiring time and emotional investment, while tips let travelers quickly share a single, actionable insight with minimal commitment. This distinction helped define product principles around effort, tone, and structure, giving us clear guardrails to guide concept exploration and validation.

From this work, we aligned on a north-star goal: to provide travelers with quick, actionable insights that support fast decision-making, while lowering the effort required to contribute meaningful content.

VALIDATING IDEAS

I explored a wide range of wireframe concepts to evaluate how traveler tips could be created and consumed across the platform. Early directions focused on free-form input, guided multi-question flows, and fixed-answer formats. Each concept was assessed against key success criteria: trustworthiness of content, informational viability, and contributor effort.

Design exploration ran on two parallel tracks: the contributor experience, and the merchandising experience (how tips appear and add value within destination and POI pages). Through rapid iteration and close collaboration with Product and Engineering, I refined concepts to balance high-quality signal with low friction. Partnering closely with my Product Manager ensured designs remained aligned with both user needs and business goals. Shout out to my favorite PM Orhun :)

RESEARCH

We conducted our first round of research to pressure-test our assumptions around tips as a new UGC format and understand what would feel most useful to travelers without becoming redundant at scale.

This research focused on defining tips and their usefulness. We grounded the study in a familiar behavior, writing a review, then progressively introduced new tip-based contribution models. This allowed us to learn how travelers naturally define a “tip,” what they consider valuable advice, and where the line is between a helpful tip and a mini-review. This step was critical to avoid recreating reviews under a different label.

Evaluating submission models revealed clear tradeoffs.

  • Multiple-tip submission enabled more precise, focused advice and encouraged specificity, but initially appeared more effortful due to perceived text volume.

  • Single submission with guardrails (min/max, photos) provided helpful structure but felt restrictive when travelers had multiple, distinct tips to share.

  • Single-select submissions were fastest to complete but stripped away context, resulting in overly narrow or subjective responses that didn’t generalize well.

Travelers also helped define what makes a good tip: concise, situational, and actionable, often tied to timing, planning decisions, or first-time mistakes. This feedback directly shaped both the prompt design and the final hybrid (“best-of”) submission experience, combining the flexibility of multiple tips with the guidance and constraints needed to maintain quality and trust at scale.

RAMPING UP TO HIGH FIDELITY

With a clear direction established from the first research round, I moved the team into detailed design exploration by consolidating the strongest patterns into a multiple-tip submission model. The goal of this phase was to translate research insights into high-fidelity flows that balanced traveler flexibility with structural guardrails.

As fidelity increased, I intentionally ramped up the cadence of cross-functional reviews with Product and Engineering. We aligned weekly on scope, decision points, and deliverables, allowing us to validate feasibility early and reduce downstream rework. This tight feedback loop ensured design decisions accounted for constraints and edge cases as they surfaced.

This was the most rigorous phase of the project due to the volume of iterations and approvals required. Many design directions needed explicit sign-off, and each iteration refined not only the core experience but also its edge cases and failure states.

Alongside the primary “happy path,” I designed for all critical system and behavioral states a traveler might encounter, including:

  • Moderation and fraud detection

  • Exiting mid-flow

  • Profanity and content-quality messaging

  • Minimum and maximum character validation

  • Submission and network error states

  • Daily contribution limits
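Several of these states reduce to simple checks at the point of submission. As a rough illustration, assuming hypothetical thresholds (the real limits were defined with Product and Legal), the character-length and daily-limit guardrails might look like this:

```typescript
// Hypothetical guardrail values, used only to illustrate the states listed above.
const MIN_CHARS = 10;
const MAX_CHARS = 500;
const DAILY_TIP_LIMIT = 10;

type ValidationResult =
  | { ok: true }
  | { ok: false; reason: "too_short" | "too_long" | "daily_limit_reached" };

// Client-side checks that map to the character-validation and daily-limit messaging states.
function validateTipSubmission(text: string, tipsSubmittedToday: number): ValidationResult {
  const length = text.trim().length;
  if (length < MIN_CHARS) return { ok: false, reason: "too_short" };
  if (length > MAX_CHARS) return { ok: false, reason: "too_long" };
  if (tipsSubmittedToday >= DAILY_TIP_LIMIT) return { ok: false, reason: "daily_limit_reached" };
  return { ok: true };
}
```

Each failure reason would map to its corresponding in-flow messaging, while moderation, fraud detection, and profanity screening would typically be handled after submission.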

ROUND 2 OF RESEARCH

After several rounds of feedback and iteration, we ran a second, unmoderated research study to validate our direction and pressure-test the designs. I built interactive prototypes for both mobile and desktop to reflect realistic submission flows.

The goals of this round were to understand overall satisfaction with the new tips concept, see whether engaged travelers could complete the key submission steps, and uncover any moments of confusion or friction.

The study surfaced clear, actionable insights that informed our next set of iterations:

  • Nearly half of the participants (5 of 12) misunderstood the purpose of the “Share tip” button

  • Most travelers were unlikely to write a full review or submit another tip immediately after completing one

  • Travelers with accessibility needs responded very positively to the accessibility features

  • Overall, participants felt the tips submission flow was efficient and thoughtfully designed

These findings helped us focus our next design changes on clarity, expectations, and accessibility, while reinforcing that the core flow was working well.


Hiccups along the way

As we moved forward, we ran into several constraints that required additional design changes across user experience, legal, and engineering. To stay on schedule, I focused my efforts on specific areas where design support was most critical.

One challenge was ensuring the required legal text was always visible on mobile. In some cases, it was being covered by the sticky footer, so we needed to rethink placement. At the same time, we were trying to reduce vertical space usage to keep key content within the viewport. Complicating this further, a company-wide rebrand was rolling out, which meant updating components and design tokens in parallel.

I worked closely with engineering to identify edge cases and potential issues with the existing designs and the outdated back-end, and to review how changes would behave in real scenarios. Some of the back-end was not worth restructuring, so creative design solutions became the more feasible path. Based on these discussions, I made several targeted updates:

  • Combined the “When did you go” screen with the main contribution screen

  • Removed the progress bar to simplify the flow

  • Reduced padding to improve information density

  • Moved legal text into the sticky footer to ensure visibility

  • Updated components and tokens to align with the new rebrand

These adjustments helped us meet legal requirements and work within technical constraints without derailing timelines.

Bringing it over to our app

As we rolled out the responsive web designs in parallel, another priority was bringing Tips into the mobile app. This required adjustments for mobile-specific constraints. We initially explored a bottom sheet pattern, but due to engineering limitations at the time, we moved to a full-screen experience that mirrored the web flow, creating the feeling of a native app surface while technically rendering a web experience within the app.

CONCLUSION

Once we had alignment on the remaining critical items, we wrapped up the designs and moved toward final delivery. Design reviews slowed down and mostly shifted to async, which helped keep stakeholders informed without blocking progress. During this phase, I also put together a detailed handoff file that covered full user flows, different states, and key edge cases to support engineering through build.

In the months following development, we kept an eye on how Tips was performing. Even without any marketing push or feature announcement, we saw a strong number of travelers naturally contributing through the flow. Early analytics showed engagement trending up, with contribution rates increasing by roughly 10% overall. While this was still early data, the initial signal was promising.

At the same time, we were constrained on resources for what came next. A growing amount of tech debt and other high-priority initiatives limited our ability to continue iterating or invest in deeper measurement. While the outlook for Tips was positive, pushing it further at that moment didn’t justify the effort required, so the work paused before we could fully explore its long-term impact.