
As a founding designer, I led the creation of immersive learning solutions by applying my expertise in VR/AR and Mobile/Desktop UI/UX. My work focused on designing scalable and cost-effective alternatives to traditional, mannequin-based training.
In collaboration with medical universities and hospitals, I designed SaaS (B2B and B2C) enterprise software that empowered instructors to create custom multiplayer scenarios. These VR simulations emulate emergency rooms, allowing students to take on various roles and practice complex procedures and tool interactions, directly addressing a core user need: reinforcing critical learning in a safe, repeatable environment.
The Impact
The design proved to be highly effective and valuable to its target users, validated by both qualitative feedback and quantitative metrics from studies with McGill University Health Centre.
75%
Increase in knowledge retention compared to traditional methods.
93%
Of clinicians stated they would adopt the platform in their hospital.
87%
Of clinicians found the simulation to be highly useful for their training.
The Users
To build an effective solution, we had to serve two distinct user types with interconnected needs:
The Learner (Clinician)
Goal:
Build muscle memory and improve decision-making under extreme pressure.
Pain Points:
Lack of confidence, limited exposure to rare cases, and the fear of making a critical mistake on a real child.
The Instructor (Educator)
Goal:
Effectively teach, observe, and debrief trainees in a controlled environment.
Pain Points:
Difficulty standardising scenarios, inability to see from the learner's perspective, and providing objective, data-driven feedback.
Problem
Clinicians lack safe, realistic, and scalable ways to practice for high-stakes paediatric emergencies.
Paediatric trauma is the leading cause of death in children, yet the opportunities for hands-on practice are dangerously rare. Traditional mannequin-based training, while a staple, has significant limitations:
Low Fidelity: Mannequins fail to replicate the immense psychological pressure and dynamic unpredictability of a real emergency, which is critical for building resilience and improving decision-making.
Poor Scalability: These physical simulations are expensive, require significant logistical overhead, and are difficult to standardise, limiting their accessibility for healthcare professionals worldwide.
Final Outcome

CBC News Spotlight

Emergency Room layout

VR Patient Interactions

VR Object Interactions

VR User Interfaces
Design Goals
- Replicate Real-World Pressure: Design a psychologically convincing experience that mirrors the stress of a real paediatric emergency, allowing clinicians to build resilience.
- Enable Asymmetric Collaboration: Create a dual-experience platform with an intuitive space for learners and a powerful "God View" for instructors to control the simulation and guide the team.
- Deliver Actionable Feedback: Automatically track in-simulation actions and translate performance data into an objective analytics dashboard for effective debriefing.
- Design for Scale & Accessibility: Build on standalone VR headsets to remove hardware barriers, making elite medical training affordable, scalable, and accessible everywhere.
Design Deep Dive: An Asymmetric System
Immersion & Intuition
The learner's experience was engineered to make the technology feel invisible, allowing them to focus entirely on the patient.

Our goal was to balance authenticity with focus. We designed a virtual UK emergency room with just enough detail to establish presence without distracting from critical tasks. User trials validated this approach, confirming the environment felt both immersive and realistic.
Lighting Technical Doc
Ergonomics & Comfort:
We designed the simulation for comfort and realism. All interactions are ergonomic to prevent user fatigue, while the environment is realistically scaled for a 3-5 person team. Furniture is placed at authentic heights to ground the experience and support natural, collaborative movement.

Figure: interactable placement within the natural line of sight (✅) vs. outside it (❌)
To minimise physical fatigue, I aligned all interactables, particularly UI elements, with the user's natural line of sight.
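To make the placement rule concrete, here is a minimal C++ sketch of the idea: a UI position is accepted only if it falls within a comfortable cone around the user's natural gaze, which rests slightly below horizontal. The cone angle and vectors are illustrative assumptions, not values from the shipped build.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Accept a UI position only if it sits inside an assumed comfort cone
// around the user's natural gaze direction.
bool InComfortCone(Vec3 gazeDir, Vec3 toElement, float maxAngleDeg = 30.0f) {
    auto len = [](Vec3 v) { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); };
    float dot = gazeDir.x*toElement.x + gazeDir.y*toElement.y + gazeDir.z*toElement.z;
    float angle = std::acos(dot / (len(gazeDir) * len(toElement)));
    return angle <= maxAngleDeg * 3.14159265f / 180.0f;
}

int main() {
    // Element straight ahead and slightly low: comfortable, prints 1.
    std::printf("%d\n", InComfortCone({0, -0.1f, 1}, {0, -0.2f, 1}));
}
```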
Spatial UI
McGill PeTIT relies mainly on mid-field UI, which typically appears during the simulation session:
- Contextual UI
- Laptop UI
- Dosage Config UI
- Phone Request UI
Far-field UI consists of the following interfaces, which appear before and after the simulation session:
- Briefing UI
- Debriefing UI
I implemented a layered UI system based on interaction distance: near-field for immediate actions and notifications, mid-field for contextual menus, and far-field for the briefing and debriefing interfaces.
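The sketch below illustrates this layering as a simple distance check; the threshold distances are assumptions for the example, not PeTIT VR's actual tuning.

```cpp
#include <iostream>

// Illustrative only: classify a UI element into the near/mid/far layering
// described above. Thresholds are assumed values.
enum class UILayer { NearField, MidField, FarField };

UILayer ClassifyByDistance(float metresFromUser) {
    if (metresFromUser < 0.6f) return UILayer::NearField; // immediate actions, notifications
    if (metresFromUser < 2.5f) return UILayer::MidField;  // contextual menus, laptop, dosage
    return UILayer::FarField;                             // briefing / debriefing panels
}

int main() {
    std::cout << (ClassifyByDistance(1.2f) == UILayer::MidField) << "\n"; // prints 1
}
```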
Contextual UI - Design doc
The contextual menu is an essential tool for patient interaction, allowing users to perform exams and prompt actions. For readability, it’s anchored to the patient’s body and always faces the user.
Menu options change dynamically based on the selected body part. For example, selecting the head reveals communication options with pre-set questions.
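A minimal sketch of how this body-anchored, user-facing behaviour can be achieved with yaw-only billboarding, assuming a Y-up coordinate system; the shipped implementation may differ.

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Rotate the panel only around the vertical axis so the menu always faces
// the user's head while its text stays upright.
float YawTowardsUser(const Vec3& panelAnchor, const Vec3& userHead) {
    return std::atan2(userHead.x - panelAnchor.x, userHead.z - panelAnchor.z);
}

int main() {
    float yaw = YawTowardsUser({0, 1, 0}, {1, 1.6f, 1});
    std::printf("yaw = %.2f rad\n", yaw); // ~0.79 rad: panel turns 45 degrees to face the user
}
```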
Laptop UI
The laptop's object-relative UI is anchored to the laptop, creating a stable and predictable data hub. By not scaling or following the user, the interface remains grounded in the environment to enhance realism.
Key functions include:
- Patient Info: Access patient records and vitals.
- Data Entry: Fill out forms like the Glasgow Coma Scale.
- Imaging: View X-rays and other radiology.
- Medication Log: Track administered drugs in real-time.
Dosage Config UI
After selecting a medicine from the supply cart, the user is presented with the Dosage UI to confirm the dosage for the patient. This interface lets the user step the dosage value up or down, or modify it directly with text input.
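A small sketch of the underlying model, under the assumption that stepping and direct text entry share one clamped setter; the limits and step size are illustrative, not clinical values.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical dosage field: both stepping and typed entry funnel through
// the same clamp, so an out-of-range value can never be confirmed.
struct DosageField {
    float value = 0.0f, step = 0.5f, minDose = 0.0f, maxDose = 50.0f;

    void Increment() { Set(value + step); }
    void Decrement() { Set(value - step); }
    void Set(float v) { value = std::clamp(v, minDose, maxDose); } // text input lands here too
};

int main() {
    DosageField d;
    d.Set(120.0f);  // typed value is clamped to 50.0
    d.Decrement();  // 49.5
    std::printf("%.1f\n", d.value);
}
```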
Phone Request UI
This UI is a centralised hub for time-sensitive tasks, allowing users to act quickly without disrupting their workflow.
Key Actions:
- Expert Consultation: Instantly call specialists like surgeons for support.
- Report Requests: Seamlessly order radiology reports (e.g., X-rays, CT scans).
- Patient Transfers: Coordinate the transfer of stabilised patients to another department.
Briefing UI
Debriefing UI
This UI provides a structured framework for simulation sessions. Before the session, the Briefing UI allows instructors to assign participant roles and outline objectives.
After the session, the Debriefing UI presents a detailed, timestamped activity log. This enables instructors to review performance, identify key events, and assess critical mistakes.
Design Issues & Solutions
We identified and solved four key UI challenges for PeTIT VR:
- UI Clipping: Panels clipped through 3D objects, breaking immersion and hiding information.
- Layout Clutter: Disorganised, floating UI elements created a confusing experience.
- Poor Readability: Unoptimised text and contrast caused illegibility and eye strain.
- Head-Locking: Uncomfortable head-locked displays led to motion sickness and fatigue.

UI Clipping

Poor readability

Layout Clutter
- Spatially-Aware UI: The UI intelligently faded or moved to avoid intersecting with other objects (see the sketch after this list).
- Golden Zone Layout: All UI was organised in an ergonomic "golden zone" for a clean, predictable layout.
- Optimised UI Scale: We refined UI scale, contrast, and distance through iterative testing to ensure comfort and legibility.
- World-Anchored Interface: A stable, world-anchored interface was used to eliminate motion sickness.
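As referenced in the first solution above, here is an illustrative version of the spatially-aware fade: panel opacity ramps down as it approaches another surface instead of clipping through it. The fade-start distance is an assumed value.

```cpp
#include <algorithm>
#include <cstdio>

// Opacity falls off linearly as the panel nears other geometry, reaching
// zero at contact so it never visibly clips.
float PanelAlpha(float distanceToNearestSurface, float fadeStart = 0.15f) {
    return std::clamp(distanceToNearestSurface / fadeStart, 0.0f, 1.0f);
}

int main() {
    std::printf("%.2f %.2f %.2f\n",
                PanelAlpha(0.30f),   // 1.00: far from geometry, fully opaque
                PanelAlpha(0.075f),  // 0.50: approaching a surface, half faded
                PanelAlpha(0.0f));   // 0.00: touching, fully hidden
}
```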
Interaction Modality
UI & Navigation
Ray-casting is the primary interaction method for UI and navigation. It allows users to point at distant objects to summon contextual menus or move through the virtual space.
Object Interaction
I designed a hybrid model for object interaction that combines two methods:
- Ray-Casting: Allows users to select objects from a distance.
- Direct Manipulation: Uses natural hand movements for precise tasks with the selected object.
This dual system offers both long-range convenience and close-range precision.
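A compact sketch of the mode switch, with an assumed arm's-reach threshold standing in for the real tuning:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

float Distance(Vec3 a, Vec3 b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}

enum class Mode { DirectGrab, RayCast };

// Objects within arm's reach are grabbed directly; anything farther away
// falls back to ray-cast selection.
Mode SelectMode(Vec3 hand, Vec3 object, float reach = 0.7f) {
    return Distance(hand, object) <= reach ? Mode::DirectGrab : Mode::RayCast;
}

int main() {
    Vec3 hand{0, 1.2f, 0};
    std::printf("%s\n", SelectMode(hand, {0, 1.2f, 0.5f}) == Mode::DirectGrab
                            ? "direct grab" : "ray-cast"); // direct grab
}
```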

Spatially-Aware UI

Optimised UI Scale
World-Anchored UI
Golden Zone
Affordance & Signifiers
My goal was to make the system feel natural and easy to use. Consistent visual and physical cues on all equipment let users immediately recognise what is interactable, and a uniform interaction method cuts down on confusion and reduces the chance of errors.
Find out more in the detailed design docs:

UI & Navigation - Design Doc

Object Interaction
Immersion & Context:
We integrate UI to support, not shatter, the user's sense of presence, either by embedding interfaces naturally into the virtual world (diegetic design) or by making them appear contextually only when needed.

Default

Default with Index Trigger Pressed

Default, hovering on an interactable

Index trigger pressed on an interactable
Multi-Modal Feedback
To make interactions feel tangible and unambiguous, we confirm user actions through a synchronised combination of haptic, auditory, and visual feedback.

User action feedback
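The sketch below captures the principle with stub functions standing in for the actual engine calls: one confirmed action always fires all three channels together, so no single channel can be missed.

```cpp
#include <cstdio>
#include <string>

// Stubs standing in for engine calls; each prints the cue it would fire.
void TriggerHaptic(float amplitude)        { std::printf("haptic %.1f\n", amplitude); }
void PlaySound(const std::string& cue)     { std::printf("audio  %s\n", cue.c_str()); }
void FlashHighlight(const std::string& id) { std::printf("visual %s\n", id.c_str()); }

// One confirmed action dispatches all three feedback channels in sync.
void ConfirmAction(const std::string& objectId) {
    TriggerHaptic(0.6f);         // controller pulse
    PlaySound("confirm_click");  // short UI cue
    FlashHighlight(objectId);    // brief outline on the object
}

int main() { ConfirmAction("syringe_01"); }
```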
Empowering the Instructor: Control & Assessment
The instructor's "God View" dashboard provides full control over the simulation and deep insight into team performance. To navigate the virtual room, instructors can use two modes:
- Free Roam: Keyboard-based navigation for advanced users.
- Fixed Cameras: Instantly switch between preset camera angles to easily observe learners.
Instructor Spectator View
From the dashboard, the instructor can trigger dynamic complications in real-time—like a seizure or anaphylactic shock—to test the team's adaptability. The UI allows for nuanced control, such as specifying the exact physiological effects of a condition.
Dashboard Controls
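To illustrate the kind of parameterised command this implies, here is an assumed (not actual) event shape that an instructor action might produce:

```cpp
#include <cstdio>
#include <string>

// Hypothetical dashboard command: the instructor picks a complication and
// parameterises its physiological effect before sending it into the
// running simulation. Field names and values are invented for illustration.
struct ComplicationEvent {
    std::string type;     // e.g. "seizure", "anaphylactic_shock"
    float severity;       // 0..1, scales the physiological response
    float heartRateDelta; // bpm change applied to the virtual patient
};

void Trigger(const ComplicationEvent& e) {
    std::printf("trigger %s: severity %.1f, HR %+.0f bpm\n",
                e.type.c_str(), e.severity, e.heartRateDelta);
}

int main() { Trigger({"anaphylactic_shock", 0.8f, 35.0f}); }
```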
Asymmetric Information Flow
When a learner in VR requests a lab test or an X-ray, the request appears in the instructor's dashboard. The instructor can then fulfil it, sending the diagnostic asset (e.g., a virtual X-ray film) into the simulation for the learner to analyse, perfectly mimicking the real-world chain of command and information flow.
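A minimal sketch of this request/fulfilment loop, with invented names for illustration: the learner's request is queued for the instructor, who fulfils it by sending a diagnostic asset back into the simulation.

```cpp
#include <cstdio>
#include <queue>
#include <string>

// Hypothetical request record flowing from the VR learner to the dashboard.
struct DiagnosticRequest { int id; std::string kind; }; // e.g. "x_ray", "lab_test"

std::queue<DiagnosticRequest> dashboardInbox;

void LearnerRequests(int id, const std::string& kind) { dashboardInbox.push({id, kind}); }

// The instructor pops the next request and dispatches the matching asset.
void InstructorFulfils() {
    DiagnosticRequest r = dashboardInbox.front(); dashboardInbox.pop();
    std::printf("sending asset for request %d (%s) into the simulation\n",
                r.id, r.kind.c_str());
}

int main() {
    LearnerRequests(1, "x_ray");
    InstructorFulfils();
}
```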
Scan Request

Radiologist
Scan View
Accepting Request
Session Analytics
From Action to Insight
After the session, the platform provides data-rich debriefing information. It automatically logs every action and assesses performance across key areas like Technical Skills and Teamwork. This timestamped log allows instructors to lead highly detailed, objective, and data-driven reviews of the team's performance.

Session Analytics
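An assumed structure for such a timestamped log, shown purely for illustration; the entries and category names are examples, not real session data.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Each entry records who did what, when, and under which assessment category,
// which is enough to reconstruct an objective debrief timeline.
struct LogEntry {
    double tSeconds;      // time since session start
    std::string actor;    // participant role
    std::string action;   // e.g. "administered adrenaline"
    std::string category; // e.g. "Technical Skills", "Teamwork"
};

int main() {
    std::vector<LogEntry> log = {
        {12.4, "Team Lead", "assigned airway role", "Teamwork"},
        {47.9, "Nurse", "administered adrenaline", "Technical Skills"},
    };
    for (const auto& e : log)
        std::printf("[%6.1fs] %-10s %-25s (%s)\n",
                    e.tSeconds, e.actor.c_str(), e.action.c_str(), e.category.c_str());
}
```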
PeTIT VR
PeTIT VR is a VR simulation built in collaboration with McGill University, Canada, that addresses a critical gap in medical education. By recreating the high-pressure environment of a paediatric trauma bay in virtual reality, it provides clinicians with a safe, scalable, and highly effective way to train for the most critical moments of their careers.


My Role
Senior Product Designer, leading the end-to-end user experience, from foundational research and interaction design to UI systems and cross-functional strategy.
Year: 2022 | Duration: 1 Year | Tools: Figma, Unreal Engine, Spline, Cinema 4D, Shape XR
Learnings & What's Next
The biggest design challenge was striking the right balance between deep medical accuracy and an intuitive user experience that didn't overwhelm the user. This was only possible through constant iteration and a deep, collaborative partnership with clinical experts.
The future vision for PeTIT VR is to leverage the passthrough capabilities of new headsets like the Meta Quest 3. This will enable "in-situ" training, where a virtual patient (or an overlay on a physical mannequin) and virtual tools are placed in a real hospital room. This evolution from Virtual to true Mixed Reality will further dissolve the barrier between simulation and real-world practice, helping clinicians save even more lives.
The Approach: Why Virtual Reality?
We chose VR to strategically solve the core problems of traditional training.
Presence & Immersion: VR induces "presence," the feeling of being there, which is crucial for replicating the high-stress environment of a trauma bay.
Safety & Repetition: The platform provides a "safe-to-fail" environment where clinicians can repeatedly practice rare and critical procedures to achieve mastery.
Accessibility & Scale: Running on standalone headsets like the Meta Quest, our solution makes elite training accessible to any clinician, anywhere, anytime.