6/ Want to connect with Derek? www.linkedin.com/in/derek-orb...
#Shadowme #MotionCapture #AI #SportsTech #Boxing #Shadowboxing #AndroidApp #OpenTesting #GooglePlay #Kickstarter [6/6]
Back the campaign here: www.kickstarter.com/projects/sha...
Support helps fund the next stage of refinement, stability, and expansion. [5/6]
✨ Kickstarter is live (all-or-nothing):
NZ$3,474 pledged of NZ$5,000
38 backers • 24 days to go
Deadline: Thu 14 May 2026, 7:15 PM UTC [4/6]
Try the beta (Android): play.google.com/store/apps/d...
If you test it, feedback is hugely appreciated as it moves toward launch. [3/6]
Shadowme is Derek’s personal project — built to help athletes train smarter with:
✅ real-time form feedback
✅ shadowboxing mode [2/6]
🚀 Beta now live (Android): Shadowme by ECL postdoc Dr Derek Orbaugh is officially in open testing on Google Play. [1/6]
A few days in 🇪🇸 Talks, posters, conversations, and a bit of Barcelona in between. Here's what conference week looks like from the inside.
#CHI2026 #HCI #LabLife #EmpathicComputingLab @ABI_bioeng
Presenting Tue 14 Apr | Experiencing Virtual Worlds (XR) | #CHI2026 Barcelona
Link: programs.sigchi.org/chi/2026/pro...
#VirtualReality #XR #EyeTracking #HCI #PhysiologicalComputing #CHI2026 #EmpathicComputingLab [8/8]
Paper + session info
“The Ocular Command Center: How Eye Responses to Luminance, Color, Tunneling, and Visual Suppression Mediate Users' Physiological States in VR”
Andreia Valente, Augusto Esteves, Mark Billinghurst
[7/8]
Why this matters
Understanding these distinct pathways enables adaptive VR design that can regulate comfort, engagement, and physiological state more reliably.
[6/8]
Key finding #3
Peripheral occlusion + visual suppression changed oculomotor behaviour, but didn’t produce major cardiovascular effects.
Different manipulations → different pathways (reflexive, cognitive, perceptual).
[5/8]
Key finding #2
Colour temperature affected heart rate variability without pupillary mediation — suggesting cognitive appraisal processes at work.
Not all “visual tweaks” are equivalent.
[4/8]
Key finding #1
Luminance changes affected heart rate through pupillary reflexes.
Meaning: some comfort/physio shifts are tightly coupled to reflexive ocular pathways.
[3/8]
Study: 40 participants, controlled VR conditions.
We measured:
• eye activity (pupil size, blinks, fixations, saccades)
• cardiovascular responses
• subjective symptoms
Across:
• luminance
• colour temperature
• peripheral occlusion
• visual suppression
[2/8]
Your eyes know more about VR comfort than you might think.
When you adjust luminance, shift colour temperature, or apply peripheral occlusion in VR — what actually happens to your body? Our paper introduces the Ocular Command Center framework to find out. [1/8]
What makes it different: it doesn't just show you similar cases — it traces how changing each attribute moves the outcome, like a real-estate agent walking you through adjustments on a comparable property.
Paper: doi.org/10.1145/3772...
@ABI_bioeng #XAI #HCI [2/2]
Presenting at #CHI2026 this week: Comparables XAI 🎉
How do you explain an AI decision intuitively? We borrow from real estate: comparable examples + counterfactual traces that walk attribute-by-attribute from each comparable to the subject.
Result: the highest faithfulness and user accuracy among the explanation styles we compared [1/2]
His Google Scholar has 50,000+ citations. His ideas are in labs, devices, and research groups around the world.
📎 empathiccomputing.org/team/mark-billinghurst/
📚 scholar.google.com/citations?user=S-J_ItYAAAAJ [3/3]
#MeetTheTeam #AugmentedReality #EmpathicComputing #HCI #ResearchCommunity
Today he leads ECL at the University of Auckland, exploring:
🔬 Augmented Reality interfaces
🖐️ Multimodal input (gaze, gesture, speech)
🤝 Spatial collaboration tech [2/3]
👋 Meet the Team: Prof. Mark Billinghurst, Part II
In 2002, Mark invented the Magic Book — an AR storybook where characters leap into the real world through a headset. That early spark became a career defining the future of how humans and computers coexist. [1/3] #ECLMeetTheTeam
We're here! 🇪🇸 ECL has arrived in Barcelona for #CHI2026. Excited to connect, share ideas, and present our work this week. Come find us!
#HCI #Research #CHI2026 @aucklanduni.bsky.social
What's the most important thing to get right when communicating science outside your field?
Kunal's panel from the ABI Research Forum has some answers.
youtu.be/OdldwQTPn3I
#SciComm #Research
Absolutely. Hopefully, this will lead to some tech that can help deal with cognitive overload, in addition to other accessibility challenges.
That is such a shame. We definitely understand the feeling of realising your work is changing a user's experience for the better. We have a lab member working on haptic feedback to "hack" the biofeedback loop that amplifies anxiety, providing an offramp from it. youtu.be/Ickb2neXMg0
That is fantastic! What was your favourite part of working on it?
This is brilliant. Thank you.
Thank you! Our team leader, Mark Billinghurst, is very intent on using technology to improve access and help people. If you are interested in TBI research, we also have a PhD student working on using VR for TBI rehabilitation and retraining on everyday tasks.
AR + collaboration + HCI + mental health tech. That's the lab. That's Mark. 🧠💻
[5/5]
🔗 empathiccomputing.org/team/mark-bi...
#MeetTheTeam #AugmentedReality #HCI #WearableComputing #EmpathicComputingLab
His research vision: what happens when you put ubiquitous computing ON the body, layer virtual objects over reality, and let people interact through voice, gaze, and gesture — all at once? [4/5]
Mark leads the Empathic Computing Lab @UoAuckland, pioneering research in Augmented Reality, wearable computing, and multimodal human-computer interaction. [3/5]