
Project overview: Increasing Conversion Rate of Sales Funnel by UX Changes
Timeline: 1 month
My Role: UX Designer – Led end-to-end research-informed design, collaborated with PMs, engineers, and business leads.
Challenge: Conversion rate was lower than competitors despite similar pricing. Business was skeptical UX could move the needle.
Goal: Identify and address UX barriers in the sales funnel to increase booking conversions without major development cost.
Methods: Google Analytics, Session Replay, Usability Testing, User Interviews, Prioritization Workshop, A/B Testing
Outcome: +60% increase in conversion rate, stakeholder buy-in for a full redesign, validated low-cost changes before major investment
In Iran’s highly competitive transportation market, price is no longer the only differentiator. Since platforms typically offer the same routes, schedules, and pricing, user experience becomes the key driver of conversion. At the time, our benchmark data showed MrBilit was underperforming in funnel efficiency compared to competitors.
We noticed that even with equal pricing, users dropped off mid-funnel or switched to other booking platforms. Business stakeholders had started questioning whether UX optimizations could drive tangible growth—especially with limited engineering capacity.
This project aimed to prove that design decisions rooted in user insights could significantly improve business outcomes.
Despite functioning products and competitive pricing, MrBilit’s flight conversion rate hovered around 4%—well below the international benchmark of 5–10%. This gap was also consistent across other vehicle types.
We needed to answer two questions:
Why is the conversion rate underperforming despite a working product and equal pricing?
How can we improve it with minimal resources?
MrBilit started as a small product built by 3 developers. Even as it grew into a mid-sized business, the team still relied heavily on benchmarking and incremental testing—without prioritizing UX.
UX was seen as “just making it pretty,” not as a driver of impact. As a result:
There was low trust in UX’s business value.
I needed to design experiments that proved ROI quickly, without major dev resources.
This project was an exercise in showing that small, insight-driven UX changes can lead to measurable improvements—and that proving value upfront is key to building stakeholder trust.
To properly analyze the effect of our changes, I set up two levels of conversion metrics:
Macro:
Overall Sales Funnel Conversion Rate
(Homepage Visitor → Buyer)
Micro (Step-by-step Funnel Analysis):
Homepage Visitor → Search Results Visitor
Search Results Visitor → Reservation Page Visitor
Reservation Page Visitor → Buyer
These granular metrics helped us identify where drop-offs occurred, isolate design opportunities, and validate the impact of each UX change.
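The two-level metric setup above can be sketched in a few lines. The visitor counts here are illustrative placeholders (only the 4.3% macro rate appears later in the case study); the stage names are assumptions matching the funnel steps listed above.

```python
# Two-level funnel metrics: macro (homepage -> buyer) and
# micro (conversion between each pair of adjacent stages).
# NOTE: these visitor counts are made-up placeholders, not real data.
funnel = {
    "homepage": 100_000,
    "search_results": 55_000,
    "reservation": 12_000,
    "buyer": 4_300,
}

steps = list(funnel.keys())

# Micro: step-by-step conversion between adjacent funnel stages
micro = {
    f"{a} -> {b}": funnel[b] / funnel[a]
    for a, b in zip(steps, steps[1:])
}

# Macro: overall homepage-visitor -> buyer conversion
macro = funnel["buyer"] / funnel["homepage"]

for step, rate in micro.items():
    print(f"{step}: {rate:.1%}")
print(f"Macro (homepage -> buyer): {macro:.1%}")
```

Tracking both levels is what lets a team see not just that the funnel leaks, but at which step.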
To ensure our design decisions aligned with both user needs and business goals, I followed the Double Diamond framework:
Discover: Gathered as much insight as possible about user behaviors, needs, and blockers through data and interviews.
Define: Prioritized the most impactful problems that we could address with minimal effort and maximum return.
Develop: Created design solutions and prototypes, iterating on feedback from internal teams and real users.
Deliver: Launched improved designs and collected feedback post-launch to plan the next iteration.
This structured approach helped us stay focused, move fast, and gain stakeholder trust—while staying grounded in user needs.
I followed a step-by-step, evidence-based process to identify the root causes of conversion drop-offs.
Using Google Analytics, I reviewed each stage of the booking funnel from several angles:
User flow: discovering user journeys.
Page metrics: bounce rate, exit rate, time on page, and how users behaved on each page of the funnel.
Conversion rates at each funnel step.


Google Analytics Funnel Breakdown

Page details of mobile users.

50% of mobile users dropped off on the homepage, compared to only 12% on desktop.
We observed looping behavior in the journey (Home > Search Result > Home > Search Result), with no clear reason why users bounced back and forth.
There was a high exit rate on the reservation page, which is the final step in the funnel—an unusual and concerning pattern.
Users spent an average of 5 minutes on search results pages, despite most routes having fewer than 5 options. This delay raised questions about decision fatigue or system clarity.
Many clicks were recorded on the “Next Day” button instead of on result cards, suggesting that users didn’t scroll or didn’t see visible results immediately.
Click maps revealed heavy interaction with non-goal-oriented elements like support, other vehicle types, and banners—indicating distraction or misaligned visual hierarchy.
Scroll data showed most users only viewed 1–3 cards on the results page before exiting.
Users sometimes encountered a message like “Sorry, the flight is fully booked, please choose another,” even though the previous screen had shown available seats—indicating a data sync issue that caused fake pageviews on the booking form.
On mobile, elements like “new search,” “next/previous day,” and “other vehicles” appeared before actual results and visually dominated the page, causing users to assume no results were available.
Users reported that booking a ticket took too long—from reviewing results to completing forms—making the experience feel cumbersome and untrustworthy.
These insights revealed friction points across the entire sales flow. However, given time and resource constraints, we aligned with stakeholders to focus on low-cost, high-impact opportunities for the initial iteration. So instead of working on both mobile and desktop, we decided to focus only on mobile visitors (80% of traffic).
With a clear set of issues, I moved into solution design by benchmarking competitors and running a workshop with the PM and developers (Crazy 8s method).

Discussing ideas and comparing them with the current version (Crazy 8s workshop with the PM and developers)

I validated the new design through usability testing (to uncover users' problems with the design) and A/B testing (to confirm the impact of the design changes).

5 participants tested the prototype.
Most of the problems were fixed in the new design.
All participants found the layout simpler and easier to act on.
We launched a live A/B test over 1 week.
The test was exposed to ~40K users.
Results were measured in Google Analytics.
Results & Business Impact
Based on this test, the conversion rate increased from 4.3% to 4.94% (measured on the macro visitor-to-buyer metric).
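A lift of this size on ~40K users can be sanity-checked with a standard two-proportion z-test. The 50/50 split and the derived converter counts below are assumptions for illustration; the case study reports only the total exposure (~40K) and the two rates (4.3% and 4.94%).

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided pooled z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Assumed even split of the ~40K-user test; converter counts are
# back-calculated from the reported rates, not reported figures.
n_a, n_b = 20_000, 20_000
conv_a = round(n_a * 0.043)   # control arm at 4.3%
conv_b = round(n_b * 0.0494)  # variant arm at 4.94%

z, p = two_proportion_z_test(conv_a, n_a, conv_b, n_b)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Under these assumptions the lift clears the conventional 5% significance threshold, which supports reading the A/B result as a real effect rather than noise.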
These initial changes, delivered with minimal time and resources, helped us gain the trust and support of stakeholders for further improvements.
After this success, we conducted additional iterations — introducing and testing new changes across the funnel — which ultimately led to a significant increase in overall conversion rates.
Later, as the business planned to expand transportation options and needed a more scalable design system, we initiated a full redesign of the sales flow to better align with the company’s long-term vision and market positioning.
Pooria Hassanzadeh
2018