Confidential · By invitation
Real problems. Hard decisions. Outcomes I can put a number on. This is the full picture — not the highlight reel.
Case Studies · Confidential
10+ years. Three industries. Design that earned its place in every product it touched.
01 — Current Role
Vendor Platform · Gamification · 26 Markets
Turned a broken badge system into a motivating rewards engine for 500K+ vendors — driving measurable performance gains across 26 markets.
Onboarding · Self-Service · Product Strategy
Every unnecessary step was a vendor who quit. Stripped the flow to its bones — and watched completion rates climb.
Design System · Multi-Brand · Figma
The system behind 36+ markets. Contributed to Cape DS — the design foundation powering every Delivery Hero brand.
Internal Tool · CRM · Sales Operations
Salesforce was the wrong tool for the job. Built a purpose-made CRM from 0→1 — putting the right context in front of reps at the right moment.
02 — Previous Work
CRM · B2B Platform · Real Estate
Hundreds of agents. Two markets. One CRM that wasn't cutting it. Rebuilt Proforce from the inside — faster leads, sharper reporting, less guesswork.
Mobile UX · Classifieds · OLX Motors
50+ research sessions. Millions of users. One broken listing flow. Rebuilt every step sellers and buyers had to take — then measured the difference.
Internal Tool · Productivity · Real Estate
Every deal required five tabs. We built one. Unified calling, lead context, and scheduling — and gave sales reps their focus back.
Wellbeing · Social Platform · Mobile
A mindful social platform for sharing personal struggles anonymously — removing engagement-driven design in favour of genuine human connection.
Senior IC and lead roles. Open to the right team — globally.
Delivery Hero SE · Vendor Platform · 26+ Markets · 2023–2024
A tiered performance program designed to motivate 50,000+ restaurant partners across 26 markets — with 70% of vendors stuck at the lowest tier and a 7% click-through rate. The incentive structure existed. The experience didn't. This is how we changed that.
02 — Context
Delivery Hero operates food delivery platforms in 50+ countries. At the centre of its marketplace model are restaurant and store partners — vendors — who fulfil every order. Their performance directly determines customer experience, platform reliability, and ultimately, revenue.
The Vendor Rewards Program is a tiered performance incentive system designed to motivate these partners to maintain high service quality and achieve operational excellence. Vendors are evaluated monthly across 7 operational metrics and placed into one of three tiers:
Needs Improvement
Below performance thresholds. Restricted access to promotional tools.
Standard Performance
Meets baseline requirements. Access to core platform features.
Top Performance
Exceeds standards. Full benefit unlock: badges, ads, placement, vouchers.
Vendors are measured monthly on: orders completed, order rejection rate, offline time, order delay rate, customer contact rate, customer ratings, and menu quality / price parity. Higher tiers unlock better in-app placement, lower commission rates, ad tools, discount vouchers, and top restaurant badges — significant business levers for vendors.
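To make the tiering mechanics concrete, here is a minimal sketch of how a monthly evaluation across the seven metrics could map to the three tiers. Every threshold, weight, and field name below is a hypothetical illustration — the program's real cut-offs are internal to Delivery Hero.

```python
from dataclasses import dataclass

@dataclass
class MonthlyMetrics:
    """One vendor's monthly evaluation inputs (field names illustrative)."""
    orders_completed: int
    rejection_rate: float      # share of orders rejected (0.0-1.0)
    offline_time_pct: float    # % of open hours spent offline
    delay_rate: float          # share of orders delivered late
    contact_rate: float        # customer-support contacts per order
    avg_rating: float          # 1.0-5.0
    menu_quality_ok: bool      # menu completeness / price parity check

def assign_tier(m: MonthlyMetrics) -> str:
    """Map one month of vendor metrics to a tier label.

    Thresholds are invented for illustration, not Delivery Hero's actual logic.
    """
    meets_baseline = (
        m.rejection_rate <= 0.05
        and m.offline_time_pct <= 10.0
        and m.delay_rate <= 0.10
        and m.avg_rating >= 4.0
    )
    exceeds = (
        meets_baseline
        and m.orders_completed >= 300
        and m.contact_rate <= 0.02
        and m.menu_quality_ok
        and m.avg_rating >= 4.5
    )
    if exceeds:
        return "Top Performance"        # full benefit unlock
    if meets_baseline:
        return "Standard Performance"   # core platform features
    return "Needs Improvement"          # restricted promotional tools
```

The point the sketch makes is the one vendors were missing in the old UI: tier placement is a deterministic function of a handful of controllable inputs, so surfacing those inputs with targets is what makes the program feel fair.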
My role & collaboration model
I was the sole designer across the full project lifecycle — from initial problem framing through to global rollout. I defined the UX strategy, conducted vendor research, took the designs from wireframes through to high-fidelity, production-ready screens, and led design QA. In the final rollout phase, a fellow designer came on board specifically to carry out region-specific UI tweaks that local market teams had requested. I defined the action items, shared the context, and guided the work throughout — ensuring everything stayed consistent with the system we'd built.
Design Manager — strategic direction, design principles alignment
Product Manager — vision, scope, experiment metrics, roadmap
Engineering (2 FE, 2 BE) — feasibility, implementation, API logic
Data Analyst — KPI definition, Eppo experiment setup, dashboarding
03 — The Problem
Before the redesign, the numbers told a story of systemic failure. The rewards program had been built with genuine business intent. But the experience was so confusing, fragmented, and hard to navigate that vendors couldn't engage with it meaningfully — even when they tried.
These weren't just UX problems — they were business problems. The Vendor Rewards Program existed to motivate better operational performance. If 70% of vendors are stuck at Tier 1 and only 7% are clicking through to understand how to improve, the entire incentive mechanism is broken. Higher-performing vendors deliver better customer experiences, generate more orders, and reduce refund costs. Low engagement with rewards meant low improvement of the very metrics DH depended on.
"The rewards are buried. I had to scroll so far to even find the benefits. I didn't know what tier I was on, and honestly I'd stopped checking."
— Vendor, usability research session, pre-redesign
The same information. A fundamentally different experience. Baseline satisfaction: 3/8 → 7/8 post-redesign.
DH invested heavily in a tiered rewards system. But low engagement meant the incentive lever wasn't pulling — vendors weren't improving performance metrics that DH needed to move.
Vendors saw a tier label but couldn't understand what it meant, which metrics controlled it, or what they should do next. The program felt arbitrary.
This wasn't a "let's make it look better" brief. It was: our performance incentive system is not working, and vendor quality metrics are suffering as a result. Leadership had set explicit goals to improve operational KPIs. Markets were asking for a unified, transparent solution. The redesign was a strategic move to re-energise vendor motivation at scale.
04 — Research & Discovery
Before touching any design tool, I needed to understand the real problem from the vendor's perspective. What did they actually see when they opened the rewards page? What were they trying to do? Why were they leaving without acting?
Research followed a continuous iterative loop — not a single discovery phase. I ran multiple rounds of vendor testing and feedback cycles that continued through ideation and prototyping, validating and course-correcting at each stage.
4 core findings from vendor interviews
Every interview surfaced the same four failure modes. The problems were structural — not surface-level.
The top banner consumed 40% of the screen but conveyed almost no useful information. Critical data like the next evaluation date and progress metrics were buried. Vendors didn't know where to look.
Vendors didn't know which metrics actually affected their tier. Multiple KPIs were shown with no prioritisation or indication of relative importance. No clear action items to act on.
The rewards and benefits section was buried below the fold — only 35% of vendors scrolled far enough to see what they were working toward. The main motivation was invisible.
No progress indicators, no tier comparisons, no sense of how close they were to leveling up. Vendors felt stuck — no forward momentum, no aspiration signal.
"Why are there three tabs — starter, advanced, and expert? Where am I at the moment? It doesn't define my current status for me."
— Vendor, usability session (direct quote from research)
"What are the benefits for the vendor? I see the tiers, but I don't understand what I actually get if I reach the top."
— Vendor, usability session (direct quote from research)
Competitive benchmarking
I benchmarked rewards and gamification experiences across industries — not just food delivery — to understand what motivates users, how progress is communicated, and where even leading products fall short.
Performance criteria are often hard to find or hidden. Benefits above the fold increase engagement. Few programs offer tier comparison tables for clarity. Gamification is common but often lacks meaningful rewards.
Made all criteria and benefits visible in one place. Added a side-by-side tier comparison table. Surfaced benefits above the fold for immediate impact. Made progress feel tangible with specific targets and dates.
05 — Synthesis & Define
Good research generates noise. Good synthesis generates signal. After clustering findings, three design principles emerged — grounded in vendor psychology, behavioral economics, and scalability constraints — that governed every decision that followed.
Vendors didn't have an information problem. They had a motivation and clarity problem. All the data was technically present — but cognitively unavailable. The redesign's job wasn't to add more — it was to reduce, surface, and sequence what was already there.
Three design principles — grounded in research and design theory
Show goals, progress, and benefits in one clear view with simple, readable components. Surface the highest-value action, not all actions. Ref: Nielsen's heuristic of minimalist design — remove anything that doesn't serve a direct purpose.
Focus on metrics vendors can control. Exclude external factors. Explain calculations in plain language. Create trust through transparency. Ref: Gamification & behavioral economics — Deci & Ryan's Self-Determination Theory (autonomy, competence, relatedness).
A flexible design system and UI that works across multiple brands, languages, and market rules — without per-brand redesigns. Ref: Systems thinking in product design — one source of truth, multiple expressions.
How might we make vendor performance feel like a growth path rather than a grading system?
How might we surface benefits first, so vendors know what they're working toward before we ask them to work harder?
06 — Ideation & Iteration
The design process was deliberately iterative and user-validated at each step. Not a waterfall — a continuous loop of create, test, learn, and refine. Four distinct rounds, each building on what the previous one revealed.
Iteration is not failure — it's the process. Each round surfaced something the previous one missed.
07 — Design Decisions & Rationale
Every significant design decision came with a tradeoff. These aren't just "what we designed" — they're the choices where we evaluated alternatives, weighed constraints, and made a call. This is where senior design thinking becomes visible.
08 — Final Solution
The redesigned Vendor Rewards hub resolved all four research-identified failure modes: visibility, metric comprehension, benefit discovery, and motivation. Mobile was the primary focus (70% of vendor users) with desktop and tablet fully supported via responsive layouts.
Same component architecture. Different brand expression. Token system in action.
Accessibility & global readiness
The design was built to WCAG-compliant color contrast, typography, and touch targets from day one. RTL support for Arabic and Hebrew was designed in (not retrofitted). Localized date, currency, and number formats were handled through flexible containers. The result: faster launches in 26+ markets with minimal rework per region — a direct reduction in design and development overhead at global scale.
09 — Validation & Experimentation
This is where most portfolios stop: "we tested with users and they liked it." We went further. We built a rigorous, data-first A/B experimentation methodology to generate statistically sound evidence that the redesign moved business metrics — not just satisfaction scores.
A common but flawed approach would be comparing old rewards in one country to new rewards in another. Every Delivery Hero market operates under vastly different conditions — maturity, vendor density, consumer behaviour, seasonal patterns, regulatory restrictions. Cross-country comparisons would be unreliable. We needed to compare old vs. new in the exact same environment.
A/B test structure — powered by Eppo
Saw the old Rewards design. Standard experience. No changes.
Experienced the new Rewards design. Benefits-first, progress bars, tier comparison, contextual targets.
Targeting configuration: Platform identifiers used to select vendors from specific country instances. Vertical types filtered to restaurants and coffee vendors only — excluding groceries and retail which have different engagement patterns and would have polluted the data signal.
Server-side experiment via Eppo feature-flagging — no performance delays, no UI glitches, and seamless interaction tracking. Each country ran its own controlled test, producing clean, statistically significant results attributable specifically to the design changes.
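The mechanics above can be sketched in a few lines. This is not the actual Eppo SDK — it is an illustration of the two pieces the experiment setup describes: deterministic server-side bucketing (the same vendor always sees the same variant, with no client-side flicker) and the targeting rules that restricted the test to one country instance and the restaurant/coffee verticals. All identifiers and field names are hypothetical.

```python
import hashlib

def assignment(vendor_id: str, experiment: str, treatment_share: float = 0.5) -> str:
    """Deterministically bucket a vendor into control or treatment.

    Hash-based bucketing is the mechanism behind most server-side
    feature-flag tools (Eppo-style SDKs included): hashing the
    experiment + subject key gives a stable, uniform assignment.
    """
    key = f"{experiment}:{vendor_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 10_000
    return "new_rewards" if bucket < treatment_share * 10_000 else "old_rewards"

def eligible(vendor: dict) -> bool:
    """Targeting rules from the experiment config (values illustrative):
    one platform instance, restaurant and coffee verticals only —
    groceries and retail excluded to keep the data signal clean."""
    return (
        vendor["platform"] == "hungerstation"
        and vendor["vertical"] in {"restaurants", "coffee"}
    )
```

Because assignment happens server-side before the page renders, the control group's experience is untouched and interaction tracking can be attributed cleanly to the variant each vendor was served.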
Metrics tracked
Fail rate — indicator of operational efficiency
Offline time % — vendor availability and responsiveness
CTR — interactions with performance comparisons and benefits
CVR — whether engagement translates to performance improvement
Tier progression — vendors moving to higher tiers
Usability testing results
Round 1 — 8–10 vendors. Key finding: vendors wanted benefits surfaced first and progress clarity. Confusion about metric hierarchy confirmed. Benefits below fold = invisible motivation.
Round 2 — Clickable prototype. Vendors understood tiers and benefits faster. Main feedback: improve readability and goal clarity. Progress bars and evaluation dates still needed.
Round 3 — With progress bars, tier comparison, and evaluation dates added. 7/8 vendors satisfied and motivated — up from 3/8 on the old design.
Round 4 — Design approved as scalable across 26+ markets. Pilot market: HungerStation. A/B test ran with full engineering implementation in production.
"I can finally see what I need to improve and when my evaluation happens. The progress bar makes it feel possible — not overwhelming."
— Vendor, Round 3 usability session
10 — Impact & Outcomes
The redesigned Rewards experience launched first in HungerStation as a controlled pilot. The A/B results gave us the data-backed mandate we needed to scale. Within months, the new design had reached 66% of the total vendor base — with a plan to reach 100% by early 2025.
Rewards became a clear motivational driver, encouraging vendors to actively improve key performance metrics. Improved vendor performance led to higher customer satisfaction, more orders, and better retention. Design changes were measured, not assumed.
Reflection
This project taught me as much as it shipped. Four learnings I carry forward as a designer, and four concrete next steps the team has planned.
What comes next — planned next steps
Expand the new Rewards design to 100% of the vendor base across all Delivery Hero markets by early 2025.
Use performance data to provide tailored suggestions for individual vendors — surfacing the metric most likely to advance their tier based on current gaps.
Implement Google Analytics 4 and deeper Eppo reporting for more granular behavioral insights — click heatmaps, drop-off points, and segment-level patterns.
Explore small-scale pilots of leaderboards, milestone badges, and limited-time challenges to sustain motivation beyond the initial engagement lift.
This case study represents what I believe senior product design should be: research-grounded, hypothesis-driven, cross-functional, data-validated, and honest about what it doesn't yet know. Not a UI refresh — a product-level problem solved with evidence.
— Farzam Anjum · Senior Product Designer · Delivery Hero SE · Berlin