Project overview: Increasing Conversion Rate of Sales Funnel by UX Changes


Timeline: 1 month


My Role: UX Designer – Led end-to-end research-informed design, collaborated with PMs, engineers, and business leads.


Challenge: Conversion rate was lower than competitors despite similar pricing. Business was skeptical UX could move the needle.

Goal: Identify and address UX barriers in the sales funnel to increase booking conversions without major dev cost.

Methods: Google Analytics, Session Replay, Usability Testing, User Interviews, Prioritization Workshop, A/B Testing

Outcome: +60% increase in conversion rate; stakeholder buy-in for a full redesign; validated low-cost changes before major investment

MrBilit:

In Iran’s highly competitive transportation market, price is no longer the only differentiator. Since platforms typically offer the same routes, schedules, and pricing, user experience becomes the key driver of conversion. At the time, our benchmark data showed MrBilit was underperforming in funnel efficiency compared to competitors.
We noticed that even with equal pricing, users dropped off mid-funnel or switched to other booking platforms. Business stakeholders had started questioning whether UX optimizations could drive tangible growth—especially with limited engineering capacity.
This project aimed to prove that design decisions rooted in user insights could significantly improve business outcomes.

Problem

Despite functioning products and competitive pricing, MrBilit’s flight conversion rate hovered around 4%—well below the international benchmark of 5–10%. This gap was also consistent across other vehicle types.


We needed to answer two questions:
Why is the conversion rate underperforming despite a working product and equal pricing?
How can we improve it with minimal resources?

Challenges

MrBilit started as a small product built by 3 developers. Even as it grew into a mid-sized business, the team still relied heavily on benchmarking and incremental testing—without prioritizing UX.
UX was seen as “just making it pretty,” not as a driver of impact. As a result:
There was low trust in UX’s business value.
I needed to design experiments that proved ROI quickly, without major dev resources.
This project was an exercise in showing that small, insight-driven UX changes can lead to measurable improvements—and that proving value upfront is key to building stakeholder trust.

Metrics

To properly analyze the effect of our changes, I set up two levels of conversion metrics:


Macro:
Overall Sales Funnel Conversion Rate
(Homepage Visitor → Buyer)


Micro (Step-by-step Funnel Analysis):
Homepage Visitor → Search Results Visitor
Search Results Visitor → Reservation Page Visitor
Reservation Page Visitor → Buyer


These granular metrics helped us identify where drop-offs occurred, isolate design opportunities, and validate the impact of each UX change.
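The two-level metric setup above can be sketched in a few lines of Python. The stage totals below are purely illustrative placeholders, not MrBilit's actual traffic figures; only the structure (macro rate plus adjacent-step micro rates) reflects the approach described here.

```python
# Two-level funnel metrics with illustrative (hypothetical) visitor counts.
funnel = {
    "homepage": 100_000,
    "search_results": 55_000,
    "reservation": 12_000,
    "buyers": 4_300,
}

stages = list(funnel)

# Macro: overall conversion from homepage visitor to buyer.
macro = funnel["buyers"] / funnel["homepage"]

# Micro: step-by-step conversion between adjacent funnel stages --
# this is what pinpoints where the drop-offs actually happen.
micro = {
    f"{a} -> {b}": funnel[b] / funnel[a]
    for a, b in zip(stages, stages[1:])
}

print(f"macro conversion: {macro:.1%}")
for step, rate in micro.items():
    print(f"{step}: {rate:.1%}")
```

A step with a micro rate far below the others marks the stage worth investigating first.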

My Role & Team:

I led this project as a UX Designer, collaborating closely with:
Product Manager – for analytics access, KPIs, and prioritization
Frontend Developer – for feasibility checks and A/B testing
Business Manager – for aligning with growth goals and ensuring measurable impact

I was responsible for problem framing, research setup, ideation, design, testing, and validating improvements.


Design Process

To ensure our design decisions aligned with both user needs and business goals, I followed the Double Diamond framework:
Discover: Gathered as much insight as possible about user behaviors, needs, and blockers through data and interviews.
Define: Prioritized the most impactful problems that we could address with minimal effort and maximum return.
Develop: Created design solutions and prototypes, iterating on feedback from internal teams and real users.
Deliver: Launched improved designs and collected feedback post-launch to plan the next iteration.

This structured approach helped us stay focused, move fast, and gain stakeholder trust—while staying grounded in user needs.

Discovery Phase – Identifying the Problem

I followed a step-by-step, evidence-based process to identify the root causes of conversion drop-offs.

  1. Funnel Analysis with Product Team

Using Google Analytics, I reviewed each stage of the booking funnel from several angles:
User flow: mapping actual user journeys.
Page metrics: bounce rate, exit rate, time on page, and how users behaved on each page of the funnel.
Conversion rates at each step.

Google Analytics Funnel Breakdown

Page details of mobile users.

  2. Hotjar Session Recording Review

Findings based on data:

50% of mobile users dropped off on the homepage, compared to only 12% on desktop.
We observed looping behavior in the journey (Home > Search Result > Home > Search Result), with no clear reason why users bounced back and forth.
There was a high exit rate on the reservation page, which is the final step in the funnel—an unusual and concerning pattern.
Users spent an average of 5 minutes on search results pages, despite most routes having fewer than 5 options. This delay raised questions about decision fatigue or system clarity.
Many clicks were recorded on the “Next Day” button instead of on result cards, suggesting that users didn’t scroll or didn’t see visible results immediately.
Click maps revealed heavy interaction with non-goal-oriented elements like support, other vehicle types, and banners—indicating distraction or misaligned visual hierarchy.
Scroll data showed most users only viewed 1–3 cards on the results page before exiting.

  3. Usability Test + Interviews

To better understand user behavior—especially on mobile, which accounts for 80% of our traffic—we conducted 5 moderated usability test sessions, combined with in-depth interviews.

Key insights included:

  • Users sometimes encountered a message like “Sorry, the flight is fully booked, please choose another,” even though the previous screen had shown available seats—indicating a data sync issue that caused fake pageviews on the booking form.

  • On mobile, elements like “new search,” “next/previous day,” and “other vehicles” appeared before actual results and visually dominated the page, causing users to assume no results were available.

  • Users reported that booking a ticket took too long—from reviewing results to completing forms—making the experience feel cumbersome and untrustworthy.

Prioritization Strategy

These insights revealed friction points across the entire sales flow. However, given time and resource constraints, we aligned with stakeholders to focus on low-cost, high-impact opportunities for the initial iteration. Instead of working on both mobile and desktop, we decided to focus only on mobile visitors (80% of traffic).

Design Phase:

With a clear set of issues, I moved into solution design by benchmarking competitors and running a Crazy 8s workshop with the PM and developers.

We discussed the ideas and compared them with the current version (Crazy 8s workshop with the PM and developers).

Based on the workshop output, I started designing the new UI.

Search Result Improvements (First-Screen Exposure Focus)

Testing & Validation Phase

I validated the new design through a usability test (to discover users' problems with the design) and an A/B test (to confirm the impact of the design changes).

  1. Usability Test

  • 5 participants tested the prototype

  • Most of the problems were fixed in the new design.

  • All participants found the layout simpler and easier to act on

  2. A/B Testing

  • We launched a live A/B test over one week

  • Test exposed to ~40K users

  • Results measured in Google Analytics.

Results & Business Impact

Based on this test, the overall conversion rate (visitors → buyers) increased from 4.3% to 4.94%.
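A quick sanity check that a 4.3% → 4.94% lift on roughly 40K users is not noise can be done with a pooled two-proportion z-test. The sketch below assumes an even 20K/20K split between control and variant, which the write-up does not state, so treat the split as a hypothetical.

```python
from math import sqrt

# Assumed even split of the ~40K test users (not stated in the case study).
n_control, n_variant = 20_000, 20_000
p_control, p_variant = 0.043, 0.0494  # conversion rates from the A/B test

# Pooled two-proportion z-test: pool the conversions, estimate the standard
# error of the difference, then standardize the observed lift.
conversions = p_control * n_control + p_variant * n_variant
p_pool = conversions / (n_control + n_variant)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_control + 1 / n_variant))
z = (p_variant - p_control) / se

print(f"z = {z:.2f}")  # a z-score above 1.96 is significant at the 95% level
```

Under these assumed sample sizes the z-score comes out well above 1.96, consistent with treating the lift as a real effect rather than random variation.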

What's Next?

These initial changes, delivered with minimal time and resources, helped us gain the trust and support of stakeholders for further improvements.

After this success, we conducted additional iterations — introducing and testing new changes across the funnel — which ultimately led to a significant increase in overall conversion rates.

Later, as the business planned to expand transportation options and needed a more scalable design system, we initiated a full redesign of the sales flow to better align with the company’s long-term vision and market positioning.

What I learned

This project taught me the value of starting small, focused, and impact-driven.

By solving real user pain points and demonstrating measurable improvements, I was able to build stakeholder confidence and get buy-in for larger design changes.
It reinforced a key principle:

Start by proving your hypothesis — and let the results advocate for your bigger vision.


Final design after a few rounds of UX improvements

Thanks for reading


Pooria Hassanzadeh

2018


Let’s Connect