Research.

I believe technology should enhance human connection, not replace it. My research explores how AI and new interaction technologies can make us more connected, empathetic, and effective in our digital lives.

Research Arc.

A narrative sequence that explains how the work connects.

Research Map

A horizontal “transit map” of pillars → subthemes → sub-subthemes.

Collaborative Intelligence: AI Facilitation, Group Cognition, Trust & Provenance, Communication Support, Representation & Avatars, Social Signals, Shared Experiences, Conversational Empathy, Shared Reading
Affective Adaptation: Affect & Load Inference, Cognitive Load, Physiological Cue Sharing, Adaptive XR Experiences, Haptics & Memory, Empathy & Presence, Empathic Systems, Cultural Empathy, Platform & Vision
Sensing Interfaces: Radar Interaction, On-Skin Touch, Tangible Identification, Micro-Motion Input, Measurement Methods, Proprioception
Immersive XR: Empathy & Presence, Bioresponsive XR, Perspective & Culture, Memory & Haptics, Autobiographical Recall, Health & Rehab, Adaptive Rehabilitation

Research Pillars

Collaborative Intelligence

Designing AI-mediated collaboration: facilitation, shared cognition, and communication support that helps teams work better together.

7 publications

Affective Adaptation

Teaching systems to sense and adapt to emotion and cognitive load—designing feedback, control, and trust in real-time loops.

10 publications

Sensing Interfaces

Novel sensing for interaction—building camera-free input techniques and measurement methods that unlock new interface capabilities.

5 publications

Immersive XR

Creating meaningful experiences in VR and AR. My work focuses on emotional connection, memory, and empathy in virtual spaces.

11 publications
CLARA: AI-Mediated Facilitation for Enhancing Group Cognition and Cohesion in Remote Collaboration
Featured
ACM TOCHI 2025

Ever feel exhausted after Zoom meetings? CLARA is an AI assistant that helps make virtual meetings less tiring and more effective by understanding how you're feeling and adapting in real time.

CoAffinity: A Multimodal Dataset for Cognitive Load and Affect Assessment in Remote Collaboration
Featured
IEEE TAFFC 2025

We created a large dataset to help computers understand how stressed or emotional people are during video calls. It's like teaching AI to read the room during virtual meetings.

Re-Touch: A VR Experience for Enhancing Autobiographical Memory Recall Through Haptic and Affective Feedback
Featured
SIGGRAPH Asia 2024

Imagine being able to physically touch and feel your cherished memories in VR. Re-Touch lets you revisit personal moments while the system adapts to your emotions, creating a deeply personal experience.

Haptic Empathy: Investigating Individual Differences in Affective Haptic Communications
arXiv 2025

Can you feel someone's emotions through touch? We studied how different people interpret emotional touch sensations and how to personalize haptic feedback to better communicate feelings.

CAEVR: Biosignals-Driven Context-Aware Empathy in Virtual Reality
Featured
IEEE VR 2024

We built a VR system that can make you feel more empathetic towards others by reading your body signals and adapting the experience to help you connect emotionally.

RadarHand: A Wrist-Worn Radar for On-Skin Touch-Based Proprioceptive Gestures
Featured
ACM TOCHI 2024

What if your smartwatch could understand gestures you make by touching different parts of your hand? RadarHand uses radar (like in cars) to detect exactly where you touch your hand, enabling new ways to control devices.

RaITIn: Radar-Based Identification for Tangible Interactions
Featured
CHI 2022

Imagine a smart table that knows exactly which objects you place on it without using cameras or barcodes. RaITIn uses tiny radar reflectors (cheap stickers) to identify objects and even detects when you stack them together.

KinVoices: Using Voices of Friends and Family in Voice Interfaces
Featured
CSCW 2021

What if Siri sounded like your mom or your best friend? We studied how people react when voice assistants use familiar voices and found it can make interactions feel more personal and trustworthy.

Adapting Fitts' Law and N-Back to Assess Hand Proprioception
CHI 2021

How well do you know where your hand is without looking? We created a new way to measure this "body awareness" (proprioception) which is important for designing wearables and gesture interfaces.

SIGGRAPH Asia 2021

VRTwitch: Enabling Micro-motions in VR with Radar Sensing

In VR, small finger movements are usually ignored. VRTwitch uses radar to detect even the tiniest finger twitches, enabling new ways to interact in virtual worlds without needing to make big gestures.

Heightened Empathy: A Multi-user Interactive Experience in a Bioresponsive Virtual Reality
SIGGRAPH 2023

Two strangers enter VR, and their heartbeats and emotions start shaping the virtual world around them. As they sync up emotionally, the world transforms, helping them connect on a deeper level.

SealMates: Improving Communication in Video Conferencing using a Collective Behavior-Driven Avatar
CSCW 2024

What if everyone in your video call was represented by a cute seal that moves based on how engaged the group is? SealMates replaces individual videos with a shared avatar, making meetings feel more connected and less awkward.

A User Study on Sharing Physiological Cues in VR Assembly Tasks
IEEE VR 2024

When you work with someone in VR, they can't see your body language. We tested showing teammates' stress levels as subtle visual cues—it helped teams work better together without feeling creepy.

Healing Horizons: Adaptive VR for Traumatic Brain Injury Rehabilitation
SIGGRAPH Asia 2023

Can VR help people recover from brain injuries? Healing Horizons makes therapy exercises engaging by putting them in beautiful VR environments, automatically adjusting difficulty so patients stay challenged but not frustrated.

Thesis 2022

Designing Body-Centric Interactions with Radar Sensing

My master's thesis on using radar to enable body-centric interaction without cameras.

IEEE VRW 2023

VRdoGraphy: An Empathic VR Photography Experience

A VR photography experience designed to evoke empathy through context-aware and bioresponsive cues.

arXiv 2023

The Empathic Metaverse: An Assistive Bioresponsive Platform for Emotional Experience Sharing

A platform vision for bioresponsive, empathic experience sharing in the metaverse.

SIGGRAPH Asia 2023

SensoryScape: Context-Aware Empathic VR Photography

A context-aware VR photography experience that adapts sensory cues to evoke empathy.

SIGGRAPH Asia 2023

The Fusion Nexus: Exploring the Confluence of Virtual and Real Worlds through a Biocognitive Audio-Verbal Interface in Immersive XR Environments

An XR interface concept blending audio-verbal interaction with biocognitive signals.

ISMAR 2023

Virtual Journalist: Measuring and Inducing Cultural Empathy by Visualizing Empathic Perspectives in VR

A VR experience to measure and induce cultural empathy via perspective visualization.

IJHCI 2025

Empathetic Conversational Agents: Utilizing Neural and Physiological Signals for Enhanced Empathetic Interactions

How can chatbots be more empathetic? This work explores using neural and physiological signals to help.

SIGGRAPH Asia 2025

Living Literature: A Collaborative VR Experience for Enhancing Shared Reading Through Affective and Generative Feedback

A collaborative VR reading experience with affective and generative feedback loops.

22 publications in total.