Your Pain, Understood. Your Health, Connected.
PainEcho
My team conducted qualitative research with both patients and healthcare providers to understand challenges in healthcare. Based on our research, we designed a chronic pain management smart mirror, where data can be easily shared with providers.
Dates
February 2025
Skills & Tools
Shapes XR, Unity, Figma, Qualitative Research, Adobe After Effects (video editing), Meshy (3D object builder)
Deliverables
Qualitative Research Plan and Data Analysis, Problem Statement, Prototypes, Demonstration Video
Role & Team
Interaction designer, researcher, video editor in a team of four
PainEcho Demo Video
Users with chronic pain can log their pain experiences as they pass the mirror, integrating pain tracking into their daily routine through intuitive gestures. This data can be shared with their healthcare provider, who can create a personalized treatment plan and monitor the condition's progression over time.
How it works
For the purposes of the demo video, I added a dark frame behind the content so the information would be more legible. On an actual smart mirror, however, the content would appear in bright white, as it does on similar products. I created the final UI for the mirror and edited the mirror interactions for the demo video.
When a user passes by the mirror, a down arrow appears, reminding them to log their pain. To reveal past logs, they can swipe down with their hand.
Once a user swipes down, they'll see past pain logs with dates, the most recent appearing at the top. They can use an open-hand gesture to open their most recent log and swipe back up to hide the log.
While viewing a specific day, they can swipe left or right with their hand to see other days.
Data can be stored for up to a year. Users can access their data on a phone, and can send the data to their healthcare provider.
The mirror works with hand gestures and voice commands; touching the mirror is not required. A user can use a closed-hand gesture to close this screen.
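The gesture-driven navigation described above can be sketched as a simple state machine. This is a hypothetical illustration only; the state and gesture names are assumptions, not the mirror's actual implementation:

```python
# Hypothetical sketch of the mirror's gesture navigation as a state machine.
# State and gesture names are illustrative assumptions, not the real code.

TRANSITIONS = {
    # (current screen, gesture) -> next screen
    ("idle", "swipe_down"): "log_list",           # reveal past pain logs
    ("log_list", "open_hand"): "log_detail",      # open the most recent log
    ("log_list", "swipe_up"): "idle",             # hide the logs
    ("log_detail", "swipe_left"): "log_detail",   # view a previous day
    ("log_detail", "swipe_right"): "log_detail",  # view a later day
    ("log_detail", "swipe_up"): "idle",           # hide the log
    ("log_detail", "closed_hand"): "idle",        # close this screen
}

def next_state(state: str, gesture: str) -> str:
    """Return the next screen; unrecognized gestures leave the screen unchanged."""
    return TRANSITIONS.get((state, gesture), state)
```

Modeling the flow this way makes it easy to see that every screen has a gesture leading back to the idle mirror, so a user can never get stuck.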
Learnings
My biggest takeaway was the importance of referencing existing products and user behaviors when designing for novel interfaces. This approach ensures that interactions are as intuitive as possible. For instance, when designing the gestures for our mirror interface, we referenced interactions from familiar digital devices and VR products. By doing this, we were able to create gestures that were similar to those used on smartphones and in applications like Shapes XR.
This experience taught me not to assume what is intuitive, but to actively ask users what feels natural to them.
Other Considerations & Next Steps
Our team knew there were other factors to consider, such as whether a user would need to buy an entire mirror or a device they could attach to any mirror. Would insurance cover it, and how would the data appear once received by the provider? Lastly, we still need to solve how users would log pain on their back or other parts of the backside of their body.
We imagined that users could also access this data via an app and add information such as pain location, intensity on a pain scale, and other details. Next steps would include testing with target users and finalizing these considerations.
Detailed Process
Interviews
To understand challenges, we spoke to five providers and five patients, asking about their biggest frustrations, care experiences, and current processes.
Problem Definition
After analyzing our research and learning that patients often didn't feel heard, we landed on the How Might We statement above.
Ideation
We completed several rounds of Crazy 8s to develop ideas, ranging from 3D body models to VR explanations and experiences.
Prototyping & Iterating
After selecting a solution, we conducted desk research on chronic pain management and existing solutions, then prototyped many iterations. We started by each drawing paper wireframes of what the product could look like and combining the most feasible ideas given the timeframe.
We experimented with a calendar view and a slider to change dates, prototyping in Unity, Shapes XR, and even with the Xbox Kinect. I created the final iterations in Figma.
Experimenting with a calendar view and full body view.
Experimenting with the Xbox Kinect and a different calendar view.
Unity Body Tracking
We also experimented with Unity to see whether we could implement a realistic body-tracking interaction. We successfully coded the interaction and would want to use Unity in future iterations.
VR pain management
I personally wanted to explore a VR experience that let a user add shape, texture, and intensity to their pain. I used Shapes XR to create a VR space and Meshy to create a 3D version of myself. This was fun to explore, but I knew that putting a chronic pain management system in a VR headset would add too much friction for the user.
Defining the interactions
Our team brainstormed the interactions a user could use to indicate or log pain and to move between screens. We started by asking other students to pretend they were logging pain into the mirror, to understand which gestures people naturally use.
Student indicating how they would log pain in a mirror.