Neurofeedback Immersive Meditation for Hyatt Hotel
4.5★ pilot-tested immersive meditation app designed for Hyatt Hotels.

Hyatt was reimagining luxury wellness, not as a static spa experience, but as something responsive, intelligent, and deeply personal. That raised a deeper question: "What if sound could be seen? And what if that vision could calm your mind, in real time?" In partnership with Hyatt’s teams in Shanghai and Hong Kong, I created a neuro-responsive meditation powered by EEG, where guests’ brainwaves shape the space around them. As the system detects cognitive states like anxiety, focus, or fatigue, it dynamically shifts the sound, visuals, and rhythm to support the guest’s mental state. The result is a personalized meditative experience where calm isn’t prescribed but co-created, turning wellness into something that listens, adapts, and evolves with each guest.
Client & Duration

Hyatt Hotel (Shanghai & Hong Kong) | Nov 2024 - Apr 2025 (6 Months)
Scheduled expansion to Hyatt Seoul and Tokyo in 2025.

Type

Freelance work, Immersive AI-powered meditation service,
Biometric interactive experience, Interaction Design

Team

Primary Assembly (1 Director, 2 Developers, 1 Product Designer),
Room Temperature (3 Product Developers, 2 Marketers)
What I Did

  1. Led end-to-end product design for an AI-driven, brain-responsive journey.
  2. Designed interactive onboarding flows tailored to each guest’s cognitive state.
  3. Visualized brainwave data through intuitive, generative graphics.
  4. Iteratively tested and refined the interaction design across EEG hardware, AI-driven behaviors, and responsive UI.
  5. Aligned with Hyatt’s design language to ensure brand consistency and tone.

CHALLENGE

Designing "Calm" as a "Conversation"

Meditation has always been personal. But luxury wellness hadn’t yet caught up to that truth. Most experiences were beautifully designed, but static. They didn’t adjust when your mind wandered. They didn’t notice when you felt scattered, or calm, or overstimulated. Hyatt wanted to change that. And I wanted to ask, "Can we design calmness not as a preset, but as a dialogue?" The challenge was to create a meditation experience that felt alive, one that listens, senses, and responds. A system that doesn’t just play a track, but plays with the guest.

APPROACH & SOLUTION

Designed a meditation experience that thinks with the guest

Stillness is a feeling. But to design it, we needed data, rhythm, and empathy. I started by asking how the mind actually shifts during meditation. Using EEG sensors, we listened to real-time brain activity and mapped those invisible changes into visual, sonic, and environmental shifts.

The result was a meditation flow that adapts as guests go. Visuals deepen when focus rises, pacing slows when stress peaks, and AI modulates each session to meet the guest's current cognitive state. Rather than creating a single meditation sequence, I designed a responsive system, one that senses the mind and choreographs the experience accordingly.
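To make that responsiveness concrete, here is a minimal sketch of the underlying idea. The band names, thresholds, and session parameters are purely hypothetical, not the production logic: normalized band powers are classified into a coarse cognitive state, which then selects the session’s pacing and visual depth.

```python
# Illustrative sketch only (hypothetical thresholds, not the shipped system):
# EEG band powers -> coarse cognitive state -> session parameters.

def classify_state(alpha: float, beta: float, theta: float) -> str:
    """Map normalized band powers (0-1) to a coarse cognitive state."""
    if beta > 0.6 and alpha < 0.4:
        return "stressed"
    if theta > 0.6 and beta < 0.3:
        return "fatigued"
    if alpha > 0.5:
        return "calm"
    return "focused"

def session_parameters(state: str) -> dict:
    """Translate a state into pacing and visual depth for the session."""
    presets = {
        "stressed": {"pacing": "slow",   "visual_depth": 0.3},
        "fatigued": {"pacing": "gentle", "visual_depth": 0.5},
        "calm":     {"pacing": "steady", "visual_depth": 0.8},
        "focused":  {"pacing": "steady", "visual_depth": 1.0},
    }
    return presets[state]

# High beta with low alpha reads as stress, so the session slows down.
state = classify_state(alpha=0.2, beta=0.7, theta=0.3)
print(state, session_parameters(state))
```

The point of the sketch is the shape of the system, not the numbers: sensing feeds a state, and the state choreographs the experience.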
1) iPadOS Version: A premium in-room experience available in Hyatt’s suite rooms. It integrates with the EEG headset, allowing real-time brainwave input to personalize the meditation based on each guest’s cognitive state.

2) H5 Version: Designed for broader accessibility across Hyatt’s global wellness centers.
Guests scan a QR code to access the experience on any mobile device (Fully responsive for all smartphone types).
It offers curated meditation sessions selected by AI based on the guest’s chosen mindset.


IDEATION & ITERATION

1) EEG-integration flow for iPadOS

For the EEG-based version, the challenge was to deliver real-time feedback without disrupting the very stillness we aimed to design. I explored ways to translate invisible brain signals into a UI that feels ambient, not analytical. Early wireframes mapped out onboarding, signal syncing, and adaptive meditation paths, always emphasizing trust, clarity, and softness.
Early iteration: Signal syncing interface and onboarding layout
Refined iteration: Ambient visuals and confidence-based recommendation flow
2) QR-based mobile flow for H5

This version was designed for broader accessibility, with no headset required. I approached it with mobile-first UX principles, ensuring a smooth and lightweight flow for users scanning QR codes throughout Hyatt’s wellness spaces. Instead of brainwaves, users select their current mindset, and AI curates a session to match. The focus was speed, calm, and a sense of agency: no tech friction, just entry into mindfulness.
Early iteration: Wireframe for mindset selection
Refined iteration: Wireframe for AI curation based on mindset selection
3) Visualizer, Visual Language for the mind

Beyond UX, I sketched a visualizer to reflect the mindset behind each brainwave state, not with thin wave lines, but through shifting color ranges and motion. Instead of reading data, I wanted users to feel it. I mapped how color would move, change, and pulse, creating a system where mental states could be sensed intuitively, like mood lighting for the mind. It was about making the invisible visible, and designing trust through feeling.

Initial sketch for turning brainwave data into feeling with flowing color and movement
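The mapping idea behind the visualizer can be sketched as a simple function, with illustrative values only: a single relaxation score drives hue and pulse speed, so a mental state is felt as color and rhythm rather than read as a chart.

```python
# Hypothetical sketch of the visualizer mapping (values illustrative only):
# a 0-1 relaxation score drives hue and pulse rate, "mood lighting for the mind".

def state_to_visual(relaxation: float) -> dict:
    """Map a 0-1 relaxation score to a hue (degrees) and pulse rate (Hz)."""
    relaxation = max(0.0, min(1.0, relaxation))  # clamp to the valid range
    hue = relaxation * 220              # warm red (0) -> cool blue (220)
    pulse_hz = 1.2 - relaxation * 0.9   # fast flicker -> slow breathing pace
    return {"hue_deg": round(hue), "pulse_hz": round(pulse_hz, 2)}

print(state_to_visual(0.0))  # agitated: warm color, fast pulse
print(state_to_visual(1.0))  # deeply calm: cool color, slow pulse
```

Smooth interpolation between these endpoints is what lets the color "breathe" with the user instead of jumping between discrete states.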

KEY FEATURES

Real-time EEG & AI integration for personalized meditation
1) iPadOS: Real-time brainwave analysis by AI with interactive UI 

Before each meditation session, the system conducts a brainwave assessment using the Enophone EEG headset. Based on the live data, AI generates a short explanation describing the user’s current cognitive state, such as “mentally fatigued” or “deeply focused”, and recommends a personalized session to restore balance.
1. A live headset connection interface, so users can confidently begin the assessment with a proper signal
2. Real-time brainwave visualizer in the background, guiding users through the sensing process
3. Integrated explanation UI, where the AI-generated recommendation feels more like a calming insight than a system output
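The connection step above can be illustrated with a small gating sketch. The channel names and threshold are hypothetical, not Enophone’s actual API: the assessment only begins once every electrode reports adequate signal quality.

```python
# Hypothetical gating logic behind the connection screen (channel names
# and threshold are illustrative, not the headset's real interface).

def ready_to_assess(channel_quality: dict, threshold: float = 0.8) -> bool:
    """True once all EEG channels exceed the 0-1 quality threshold."""
    return all(q >= threshold for q in channel_quality.values())

# All channels clean -> the assessment can begin.
print(ready_to_assess({"ch1": 0.90, "ch2": 0.85, "ch3": 0.92, "ch4": 0.88}))
# One noisy channel -> the UI keeps guiding the user to adjust the fit.
print(ready_to_assess({"ch1": 0.90, "ch2": 0.40, "ch3": 0.92, "ch4": 0.88}))
```

Gating on signal quality is what makes the later AI insight trustworthy: the system never interprets a reading it could not hear clearly.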
2) H5: AI-curated sessions based on user-selected mindset

In this version, AI plays the role of a contextual curator, guiding users based on their selected mindset rather than biometric input. After scanning a QR code placed throughout Hyatt’s wellness spaces, users select their current mental state (e.g., anxious, unfocused, tired). The experience stays lightweight and accessible, optimized for any mobile device.
A simple yet expressive mindset selection interface, with smooth transitions between the selection and the AI’s curated response.
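As a sketch of the curation idea, with hypothetical session names and rules rather than the real catalog: each mindset maps to a session, with a gentle default when no match exists.

```python
# Illustrative only: how a selected mindset might map to a curated session
# in the QR-based flow. Session names and durations are hypothetical.

CATALOG = {
    "anxious":   {"session": "Ocean Breathing", "minutes": 10},
    "unfocused": {"session": "Anchor Point",    "minutes": 7},
    "tired":     {"session": "Soft Reset",      "minutes": 12},
}

def curate(mindset: str) -> dict:
    """Pick a session for the chosen mindset, with a gentle default."""
    return CATALOG.get(mindset, {"session": "Open Awareness", "minutes": 8})

print(curate("anxious"))
```

A lookup with a soft fallback keeps the flow friction-free: the user always lands in a session, never in an error state.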
FINAL DESIGN

Here’s how the final experience came together, merging real-time biometrics, ambient feedback, and intuitive flows across both iPadOS and H5 versions.
iPadOS Final UI - AI generated result screen based on EEG analysis
H5 Final UI - AI curated sessions based on user-selected mindset




🔍 REFLECTION

The real challenge wasn’t just using EEG data or AI. It was designing a seamless flow where both could work together to enhance the meditation experience, not interrupt it. I kept asking, "How can real-time brainwave input and AI-generated guidance feel like part of the same breath?" Early versions felt disjointed. EEG feedback appeared too suddenly. AI insights arrived too sharply. Even the visuals felt clinical. So I rethought the rhythm.

I designed a mood-based visualizer that responded to brainwave states, not as raw lines, but as flowing color and motion. It became the connective tissue between the body and the AI: a soft, ambient layer that translated mental states into emotion. By blending gentle visuals, soft transitions, and context-aware AI, the system started to feel less like technology and more like a living, breathing companion.

🔍 WHAT I LEARNED
  1. Trust is design. Even micro-copy and pacing can shift how safe users feel, especially with sensitive data like brainwaves.
  2. Clarity is calming. Simplifying uncertainty created a sense of intelligence, not just more features.
  3. AI should guide, not dominate. When AI acted more like a gentle suggestion than a verdict, users felt more empowered.