Sharing Your Heartbeat in VR Collaboration
In simple terms: When you work with someone in VR, they can't see your facial expressions or body language as clearly as in real life. We tested what happens when you share physiological signals, such as heart rate and stress indicators, as subtle visual cues. Does it help teammates understand each other, or does it just feel creepy?
🎯 Key Takeaways
- Enhanced awareness - participants better understood their partner's state when physiological cues were visible
- Improved task performance - teams completed assembly tasks more efficiently with shared cues
- Context matters - beneficial for collaborative tasks, potentially problematic for competitive ones
- Privacy trade-offs - users wanted control over what signals to share and when
- Subtle is better - obvious displays of heart rate felt intrusive; ambient cues worked well
The Problem: VR Hides Our Bodies
Virtual Reality creates immersive shared spaces, but it hides something crucial: our bodies and their signals. In real life, when your colleague is stressed, you might notice:
- Changes in their breathing
- Tension in their posture
- Micro-expressions on their face
In VR, these signals are invisible. Your avatar looks the same whether you're calm or panicking.
This matters because collaboration relies on mutual awareness. Knowing your partner is struggling lets you offer help. Sensing their confidence lets you follow their lead.
What We Built

We created a system that captures physiological signals and translates them into subtle VR visualizations:
Captured Signals
- Heart rate - from wrist-worn PPG sensor
- Heart rate variability (HRV) - indicator of stress/relaxation
- Skin conductance - reflects emotional arousal
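To make the fusion of these signals concrete, here is a minimal sketch of how heart rate, HRV (as RMSSD), and skin conductance could be combined into a single arousal index. This is not the study's actual pipeline; the baselines, spreads, and equal weighting are illustrative assumptions, and real systems would calibrate per person.

```python
# Illustrative sketch (not the study's actual pipeline): fuse heart rate,
# HRV (RMSSD), and skin conductance into one 0-1 arousal index.
# Baseline values, spreads, and weights below are hypothetical.

def normalize(value, baseline, spread):
    """Map a raw reading to roughly [0, 1] relative to a personal baseline."""
    return max(0.0, min(1.0, 0.5 + (value - baseline) / (2 * spread)))

def arousal_index(heart_rate_bpm, rmssd_ms, skin_conductance_us,
                  baseline=(70.0, 45.0, 5.0), spread=(25.0, 30.0, 5.0)):
    hr = normalize(heart_rate_bpm, baseline[0], spread[0])
    # Lower HRV generally indicates higher stress, so invert it.
    hrv = 1.0 - normalize(rmssd_ms, baseline[1], spread[1])
    eda = normalize(skin_conductance_us, baseline[2], spread[2])
    # Equal weighting is an arbitrary choice for illustration.
    return (hr + hrv + eda) / 3.0
```

At a personal baseline (70 bpm, 45 ms RMSSD, 5 µS) the index sits at 0.5; elevated heart rate, suppressed HRV, and higher skin conductance push it toward 1.0.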
Visualization Approaches
We tested several ways to display these signals:
- Aura visualization - a subtle glow around the avatar that changes color/intensity
- Particle effects - small particles that increase with arousal
- Environmental changes - ambient lighting shifts based on collective stress
- Direct numeric display - showing actual heart rate (for comparison)
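The aura idea can be sketched as a simple mapping from an arousal value to a color and glow intensity. The color endpoints (calm blue shifting toward amber) and the deliberately low intensity ceiling are our own illustrative choices, not the study's exact parameters; the point is that the cue stays ambient rather than alarming.

```python
# Hypothetical mapping from an arousal value in [0, 1] to an aura color
# and intensity. The calm-blue to amber endpoints are illustrative.

def lerp(a, b, t):
    """Linear interpolation between a and b by factor t."""
    return a + (b - a) * t

def aura_params(arousal):
    """Return (r, g, b, intensity), each roughly in [0, 1]."""
    arousal = max(0.0, min(1.0, arousal))
    calm, tense = (0.2, 0.4, 0.9), (0.95, 0.6, 0.1)  # blue -> amber
    color = tuple(lerp(c, t, arousal) for c, t in zip(calm, tense))
    # Cap intensity low so the cue reads as ambient, not as an alarm.
    intensity = 0.15 + 0.25 * arousal
    return (*color, intensity)
```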
The Study
We recruited pairs of participants to complete VR assembly tasks: building virtual objects together under time pressure. We compared:
- No cues - standard VR collaboration
- Subtle cues - aura and particle visualizations
- Explicit cues - numeric display of heart rate
What We Measured
- Task completion time and accuracy
- Subjective workload (NASA-TLX)
- Sense of presence and co-presence
- Perceived partner awareness
- Privacy comfort levels
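For readers unfamiliar with NASA-TLX: the commonly used "Raw TLX" workload score is simply the unweighted mean of six subscale ratings (mental demand, physical demand, temporal demand, performance, effort, frustration), each on a 0-100 scale. The full instrument also supports pairwise subscale weighting, omitted here for brevity.

```python
# NASA-TLX "Raw TLX" score: the unweighted mean of the six subscale
# ratings, each on a 0-100 scale. The weighted variant (pairwise
# comparisons) is not shown here.

def raw_tlx(mental, physical, temporal, performance, effort, frustration):
    ratings = (mental, physical, temporal, performance, effort, frustration)
    assert all(0 <= r <= 100 for r in ratings), "ratings must be 0-100"
    return sum(ratings) / len(ratings)
```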
What We Found
Finding 1: Subtle Cues Improved Performance
Teams with subtle physiological cue visualization completed tasks faster and with fewer errors. The effect was strongest in high-stress segments of the task.
Finding 2: Awareness Without Awkwardness
Participants reported feeling more aware of their partner's state with subtle cues, but didn't find it intrusive. The ambient nature of the visualization made it feel natural rather than surveillance-like.
Finding 3: Explicit Display Backfired
When we showed actual heart rate numbers, performance dropped. Participants found it distracting and anxiety-inducing ("I kept watching my partner's heart rate instead of doing the task").
Finding 4: Control Is Essential
In post-study interviews, participants emphasized wanting control over:
- Which signals to share
- Who can see them
- When sharing is active
Even those who benefited from the feature wanted the ability to turn it off.
Design Implications
For VR Collaboration Systems
- Provide physiological awareness features - they help collaboration
- Keep visualizations ambient and subtle - explicit displays backfire
- Give users granular control - over what, when, and with whom to share
- Consider context - high-stakes competitive scenarios may not benefit
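The granular-control implication could look something like the following preferences model: per-signal sharing, an explicit viewer allowlist, and a master toggle that can be flipped at any time. All names here are hypothetical, intended only to show the shape such a settings surface might take.

```python
# Hypothetical sharing-preferences model illustrating granular control:
# per-signal, per-viewer, and toggleable at any time. Names are ours.
from dataclasses import dataclass, field

@dataclass
class SharingPrefs:
    enabled: bool = True                  # master on/off switch
    shared_signals: set = field(default_factory=lambda: {"aura"})
    allowed_viewers: set = field(default_factory=set)

    def can_view(self, viewer_id, signal):
        """A viewer sees a signal only if all three gates are open."""
        return (self.enabled
                and signal in self.shared_signals
                and viewer_id in self.allowed_viewers)
```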
For Privacy-Preserving Design
Physiological data is intimate. Our approach of transforming signals into abstract visualizations, rather than sharing raw data, maintains privacy while preserving the awareness benefits.
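The abstraction step can be as simple as coarse banding: transmit only a rough arousal level instead of the raw reading. The thresholds below are illustrative, not the study's calibrated values.

```python
# Sketch of privacy-preserving abstraction: instead of transmitting raw
# heart rate, share only a coarse band. Thresholds are illustrative.

def abstract_signal(heart_rate_bpm):
    """Reduce a raw reading to one of three coarse bands before sharing."""
    if heart_rate_bpm < 80:
        return "calm"
    if heart_rate_bpm < 100:
        return "elevated"
    return "high"
```

The receiving client then drives the aura or particle effect from the band alone, so the partner's exact heart rate never leaves the local device.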
📚 Personal Reflections: What I Learned
The Body Talks, Even in VR
This project reinforced something I keep rediscovering: the body is central to human experience. We can't just ignore it in virtual spaces and expect full communication.
Abstraction Protects Privacy
By converting physiological signals into ambient cues, we preserved the functional benefit (awareness) while reducing privacy concerns. This pattern, delivering beneficial information through privacy-preserving abstraction, applies broadly.
More Information Isn't Always Better
The explicit heart rate display taught me that more data isn't always more useful. The right level of information, presented appropriately, beats comprehensive data dumps.
Users Want Agency
Even when a feature helps them, people want control over it. This isn't irrational; it's about autonomy and trust. Design for agency, not just optimization.
Connection to My Work
This project connects directly to my research themes:
- CoAffinity captures similar physiological signals for affect assessment
- CAEVR adapts experiences based on physiological state
- CLARA provides collaboration support (here through visibility, there through AI facilitation)
The common thread: understanding human state to support better collaboration.
