Research
I believe technology should enhance human connection, not replace it. My research explores how AI and new interaction technologies can make us more connected, empathetic, and effective in our digital lives.
Research Arc
A narrative sequence that explains how the work connects—click a chapter to highlight it on the map.
Select a chapter to see the story path through pillars, subthemes, and papers.
A horizontal “transit map” of pillars → subthemes → sub-subthemes. Select a station to preview papers.
Research Pillars
Collaborative Intelligence
Designing AI-mediated collaboration: facilitation, shared cognition, and communication support that helps teams work better together.
Affective Adaptation
Teaching systems to sense and adapt to emotion and cognitive load—designing feedback, control, and trust in real-time loops.
Sensing Interfaces
Novel sensing for interaction—building camera-free input techniques and measurement methods that unlock new interface capabilities.
Immersive XR
Creating meaningful experiences in VR and AR. My work focuses on emotional connection, memory, and empathy in virtual spaces.

CLARA: AI-Mediated Facilitation for Enhancing Group Cognition and Cohesion in Remote Collaboration
Ever feel exhausted after Zoom meetings? CLARA is an AI assistant that helps make virtual meetings less tiring and more effective by understanding how you're feeling and adapting in real time.

CoAffinity: A Multimodal Dataset for Cognitive Load and Affect Assessment in Remote Collaboration
We created a large dataset to help computers understand how stressed or emotional people are during video calls. It's like teaching AI to read the room during virtual meetings.

Re-Touch: A VR Experience for Enhancing Autobiographical Memory Recall Through Haptic and Affective Feedback
Imagine being able to physically touch and feel your cherished memories in VR. Re-Touch lets you revisit personal moments while the system adapts to your emotions, creating a deeply personal experience.

Haptic Empathy: Investigating Individual Differences in Affective Haptic Communications
Can you feel someone's emotions through touch? We studied how different people interpret emotional touch sensations and how to personalize haptic feedback to better communicate feelings.

CAEVR: Biosignals-Driven Context-Aware Empathy in Virtual Reality
We built a VR system that can make you feel more empathetic towards others by reading your body signals and adapting the experience to help you connect emotionally.

RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures
What if your smartwatch could understand gestures you make by touching different parts of your hand? RadarHand uses radar (like in cars) to detect exactly where you touch your hand, enabling new ways to control devices.

RaITIn: Radar-Based Identification for Tangible Interactions
Imagine a smart table that knows exactly which objects you place on it without using cameras or barcodes. RaITIn uses tiny radar reflectors (cheap stickers) to identify objects and even detects when you stack them together.

KinVoices: Using Voices of Friends and Family in Voice Interfaces
What if Siri sounded like your mom or your best friend? We studied how people react when voice assistants use familiar voices and found it can make interactions feel more personal and trustworthy.

Adapting Fitts' Law and N-Back to Assess Hand Proprioception
How well do you know where your hand is without looking? We created a new way to measure this "body awareness" (proprioception), which is important for designing wearables and gesture interfaces.
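For readers unfamiliar with Fitts' Law: it predicts how long a pointing movement takes from the target's distance and width. A minimal sketch of the standard (Shannon) formulation — the constants a and b below are purely illustrative, not values from the paper:

```python
import math


def fitts_index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)


def predicted_movement_time(distance: float, width: float,
                            a: float = 0.1, b: float = 0.15) -> float:
    """Fitts' Law: MT = a + b * ID, where a and b are empirically
    fitted constants (illustrative defaults here)."""
    return a + b * fitts_index_of_difficulty(distance, width)


# A target 3 units away and 1 unit wide has ID = log2(4) = 2 bits.
print(fitts_index_of_difficulty(3, 1))   # 2.0
print(predicted_movement_time(3, 1))     # 0.4
```

In practice a and b are obtained by regression over many pointing trials; the paper's adaptation replaces the visual targeting step with proprioceptive (eyes-free) targeting.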

VRTwitch: Enabling Micro-motions in VR with Radar Sensing
In VR, small finger movements are usually ignored. VRTwitch uses radar to detect even the tiniest finger twitches, enabling new ways to interact in virtual worlds without needing to make big gestures.

Heightened Empathy: A Multi-user Interactive Experience in a Bioresponsive Virtual Reality
Two strangers enter VR, and their heartbeats and emotions start shaping the virtual world around them. As they sync up emotionally, the world transforms, helping them connect on a deeper level.

SealMates: Improving Communication in Video Conferencing using a Collective Behavior-Driven Avatar
What if everyone in your video call was represented by a cute seal that moves based on how engaged the group is? SealMates replaces individual videos with a shared avatar, making meetings feel more connected and less awkward.

A User Study on Sharing Physiological Cues in VR Assembly Tasks
When you work with someone in VR, they can't see your body language. We tested showing teammates' stress levels as subtle visual cues—it helped teams work better together without feeling creepy.

Healing Horizons: Adaptive VR for Traumatic Brain Injury Rehabilitation
Can VR help people recover from brain injuries? Healing Horizons makes therapy exercises engaging by putting them in beautiful VR environments, automatically adjusting difficulty so patients stay challenged but not frustrated.

Designing Body-Centric Interactions with Radar Sensing
My master's thesis on using radar to enable body-centric interaction without cameras.

VRdoGraphy: An Empathic VR Photography Experience
A VR photography experience designed to evoke empathy through context-aware and bioresponsive cues.

The Empathic Metaverse: An Assistive Bioresponsive Platform for Emotional Experience Sharing
A platform vision for bioresponsive, empathic experience sharing in the metaverse.

SensoryScape: Context-Aware Empathic VR Photography
A context-aware VR photography experience that adapts sensory cues to evoke empathy.

The Fusion Nexus: Exploring the Confluence of Virtual and Real Worlds through a Biocognitive Audio-Verbal Interface in Immersive XR Environments
An XR interface concept blending audio-verbal interaction with biocognitive signals.

Virtual Journalist: Measuring and Inducing Cultural Empathy by Visualizing Empathic Perspectives in VR
A VR experience to measure and induce cultural empathy via perspective visualization.

Empathetic Conversational Agents: Utilizing Neural and Physiological Signals for Enhanced Empathetic Interactions
How can chatbots be more empathetic? This work explores using neural and physiological signals to make conversational agents respond more empathetically.

Living Literature: A Collaborative VR Experience for Enhancing Shared Reading Through Affective and Generative Feedback
A collaborative VR reading experience with affective and generative feedback loops.