Role:
Product Designer
Company:
N/A
Read Time:
8 minutes
Timeline:
June 12, 2025 - July 23, 2025
Platform:
Mobile App (Healthcare)
Focus Areas:
User Onboarding, UX Copywriting, Product Thinking, Prototyping
Context:
August is an AI-powered healthcare assistant app that helps users manage nutrition, reports, and general wellness.
I discovered its onboarding while researching healthcare UX patterns and saw an opportunity to redesign it from a more human-centered perspective.
TL;DR
—
The Problem
Onboarding in healthcare apps often fails to build early trust or reduce user anxiety. Users are greeted with abrupt personal data requests, overwhelming options, and buried privacy cues, leaving them confused and hesitant.
As a result, many either abandon the process or feel detached from the experience, missing out on the supportive entry they expect from a health app.
The Challenge
How do you help a first-time user feel safe enough to share health information with an app they've known for 90 seconds?
The goal is to create an onboarding journey where every user (no matter their background) experiences emotional reassurance, clear guidance, and a sense of progress from the start.
Meet Ankit, my research participant-
Ankit is 27, lives in Delhi, and manages early-stage diabetes while working long hours. He was exactly the participant I was looking for, and in exchange for a coffee he agreed to sit with me for an hour.
The Interview-
I told Ankit about August AI — an AI-powered health assistant built around exactly what he was looking for. I asked if he'd be willing to go through it for the first time while thinking aloud, and he agreed.
Before he opened the app, I asked him one thing:
User Psychological Journey — Original Onboarding
(Chart: Ankit's emotional state mapped across each onboarding step, from 😖 to 😍.)
Welcome / Login
"Okay...
[pause]
'get clarity on your health concerns'...
[pause]
yeah that's why I'm here actually."
"Oh wait I have to log in? I haven't even seen what this does yet. Okay fine."

OTP Verification
Still haven’t seen the value of this app. 🤔 Is this just another generic health app collecting my data?

Introduction
Okay … a bit of warmth and feels a bit more personal, but why was this not the first screen? 😅

Intent Selection
Basic question, but relevant. I appreciate being asked. 😀

Source of Medical Advice
More questions? 🫤 I just want to see what this app does. This is starting to feel like a survey.

Doctor- Patient Ratio Insight
Okay, cool stat... but what does that have to do with me right now? Are you selling guilt now? 🤨

Personal Spend Inquiry
Feels like a LinkedIn infographic. 🫠 Still no features, just endless questions.

Healthcare Spend
Another stat. 😵💫 Still no idea what this app offers.

App Walkthrough
You should’ve led with this. 😒 This is the first time I’m mildly interested, but it took way too long to get here.

Notification Request
Push notifications for what? 🙄 You haven’t shown me a single useful action or reminder yet.

Assistant Chat
I'm glad to be done with the questions and can actually see how the app works. I ❤️ the message ‘you’re in a safe space’, but it might get lost because there’s just a bit too much text.
Ankit's emotional arc through the original onboarding looked something like this:
confused → mildly irritated → briefly interested → relieved it's finally over.
That's not the arc a health app should aim for.
Ankit’s experience isn’t unique. The same friction points appear consistently in secondary research on mHealth onboarding behavior.
Research & insights
To see how common Ankit’s experience really was, I looked beyond this single journey. I reviewed healthcare onboarding studies, analyzed flows from leading health apps, and broke down the onboarding flow step by step to spot where trust dropped and friction spiked.
Risk levels are directionally derived from patterns reported across multiple onboarding and mHealth studies.
Industry-Observed Onboarding Friction (Secondary Research)
(Chart: drop-off risk by friction factor: early data request, privacy info, complex choices, lack of personalization, and others.)
Key Benchmark Insights:
Early requests for sensitive data (such as phone or health info) show high abandonment risk during onboarding.
When privacy assurances are buried or absent before data entry, moderate drop-off risk is observed.
Presenting multiple or complicated choices at the start creates high to moderate abandonment risk.
Lack of meaningful personalization contributes moderate-to-low drop-off risk, though small identity cues can improve engagement.
Remaining drop-off risks may vary by product context and user motivation.
What This Means for August AI
Users demand clarity: explaining up front why personal data is needed is essential.
Privacy promises and trust should be established before any sensitive data entry.
Streamlining early choices and offering genuine, personalized touches (e.g., asking for and using the user’s name) directly addresses key sources of user stress.
Timely feedback & progress cues, like visual step indicators, help users feel in control and reduce abandonment.
Design Goals
Based on key friction points and user research, I set four clear, actionable goals for the redesign:
1. Build trust before asking for data
2. Always let the user know their progress
3. Make health choices feel manageable, not overwhelming
4. Reduce cognitive load
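The "reduce cognitive load" goal leans on Hick's Law (see the references): expected decision time grows with the logarithm of the number of choices, which is why trimming early options matters. A minimal sketch, with a hypothetical constant rather than a measured one:

```typescript
// Hick's Law sketch: T = b * log2(n + 1), where n is the number of
// equally likely choices and b is an empirically fitted constant.
// The default b here is a placeholder, not a value measured for August AI.
function hicksLawDecisionTime(choices: number, b: number = 0.2): number {
  return b * Math.log2(choices + 1);
}
```

The point is the shape of the curve, not the exact numbers: halving the options on a screen cuts decision time by far more than a linear model would suggest.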
To prioritize the redesign, I organized potential features by their impact on user experience and implementation effort.

Impact/Effort Matrix based on the key findings and observations.
The Redesign
Based on the design goals, I rebuilt August AI’s onboarding from scratch, focusing on features from the optimal quadrant of the matrix: the ones that deliver the highest value for the lowest effort, both in design terms and for the product overall.
1. Clear view of what August can do-
The first screen now introduces August as a personal health companion and previews what the user can ask it (e.g., checking symptoms, understanding medications).
Effect: Users see value and safety before sharing data, instead of being hit with an OTP screen out of nowhere.


2. Progress transparency
Onboarding is broken into a small, predictable sequence of screens, each with a clear step indicator, making it obvious to users that they are moving forward and are close to being done.
Effect: Now users don't have to guess how many steps they have to take to complete the onboarding.
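The step cue itself is simple to derive. A minimal sketch (the function and type names are hypothetical, assuming a fixed, known total number of steps):

```typescript
// Progress cue for an onboarding screen: a text label plus the
// percentage used to fill the progress bar.
type ProgressCue = { label: string; percent: number };

function progressCue(step: number, totalSteps: number): ProgressCue {
  if (step < 1 || step > totalSteps) {
    throw new RangeError("step must be between 1 and totalSteps");
  }
  return {
    label: `Step ${step} of ${totalSteps}`,         // text cue, e.g. "Step 2 of 4"
    percent: Math.round((step / totalSteps) * 100), // progress bar fill
  };
}
```

Keeping the total fixed is the design choice that matters: a bar that grows or shifts mid-flow would undo the very predictability it is meant to signal.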
3. Trust badge before phone number
The “Your Privacy is Our Priority” screen now appears before the user is asked for their phone number.
It uses familiar cues (HIPAA / GDPR badges and a short reassurance) to explain how their data is protected and why login is needed.
Effect: Users now get context before the phone request; in the original flow, they were asked for their number with little explanation, which amplified anxiety about sharing health-related information.


4. Less text, less cognitive load
We kept the chat intro short and easy to scan, instead of repeating a long paragraph after users had already built trust during onboarding.
Prompts like “Track nutrition” or “Help with a prescription”, tailored to what the user chose during onboarding, help users get started with less mental effort.
Effect: With fewer options, each matched to the user’s stated goals, the screen is easier to scan and navigate.
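One way to wire the chosen intent to the starter prompts is a simple lookup with a safe fallback. A minimal sketch, where the intent keys and prompt strings are illustrative, not August AI's actual data:

```typescript
// Hypothetical mapping from the intent chosen during onboarding to the
// starter prompts surfaced in the assistant chat.
const starterPrompts: Record<string, string[]> = {
  nutrition: ["Track nutrition", "Log today's meals"],
  medication: ["Help with a prescription", "Explain a medication"],
  general: ["Check a symptom", "Understand a lab report"],
};

function promptsForIntent(intent: string): string[] {
  // Fall back to general prompts if the user skipped intent selection
  // or the stored intent is unrecognized.
  return starterPrompts[intent] ?? starterPrompts["general"];
}
```

The fallback matters for the "skip onboarding and explore" path discussed later: users who never pick an intent still land on a usable, non-empty chat screen.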
By focusing the redesign on the optimal area of the impact–effort matrix, August AI improves trust and clarity for first‑time users with minimal engineering effort.
The final prototype
Reflection & Next Steps
What I’d Do Differently -
Given more time or data, I would:
Test how important phone sign-up truly is:
Experiment with alternative sign-up methods like Google/Apple sign-in, or even let users start chatting immediately without an account, then prompt for sign-up later with a clear reward (e.g., one week of premium access, personalized health insights, or extended chat history).
Experiment with dynamic goal options:
Personalizing the choices based on time of day, user location, or how they found the app, to make the selection feel more relevant.
Co-design with medical professionals:
To ensure the tone and terminology align with clinical empathy, not just UX empathy.
Next Steps -
This redesign sets a strong foundation, but there is always room for improvement. Here are the questions I'd want to answer next:
Does showing trust badges earlier increase phone number verification rates?
Does a "skip onboarding and explore" path reduce drop-off among hesitant or returning users?
For users coming in with an urgent symptom, does a faster, more direct onboarding path improve their first-session experience?
Beyond completion rates: did users feel less anxious after this onboarding than the original?
That last question is the one I care about most. Completion rate is easy to measure. Whether someone felt like they landed somewhere safe is harder to capture, and more worth chasing.
References
Healthcare onboarding & trust
Healthcare onboarding best practices and drop-off behavior:
https://uxcam.com/blog/10-apps-with-great-user-onboarding/
Trust and transparency for health data and privacy cues:
https://thisisglance.com/blog/healthcare-app-psychology-building-trust-through-design
Patient onboarding and the importance of clear, digital entry points:
https://referralmd.com/the-future-of-patient-onboarding-key-lessons-from-consumer-apps/
Cognitive load, Hick’s Law, and decision making
Hick’s Law and reducing options to speed decisions:
https://blog.logrocket.com/ux-design/using-hicks-law-help-users-make-decisions/
Complete guide to Hick’s Law in UX:
https://www.uxness.in/2024/02/the-complete-guide-to-hicks-law-in-ux.html
Cognitive load in UI/UX design:
https://www.aufaitux.com/blog/cognitive-load-theory-ui-design/