CHI'24 ACM SIGCHI (2024): Honolulu, Hawaiian Kingdom/USA
Grand challenges in WaterHCI
ACM SIGCHI Premier International Conference on Human-Computer Interaction, CHI'24
Florian ‘Floyd’ Mueller, Maria F. Montoya, Sarah Jane Pell, Leif Oppermann, Mark Blythe, Paul H Dietz, Joe Marshall, Scott Bateman, Ian Smith, Swamy Ananthanarayan, Ali Mazalek, Alexander Verni, Alexander Bakogeorge, Mathieu Simonnet, Kirsten Ellis, Nathan Arthur Semertzidis, Winslow Burleson, John Quarles, Steve Mann, Chris Hill, Christal Clashing, and Don Samitha Elvitigala.
A timely articulation of the grand challenges in WaterHCI, namely: technology for water environments; users engaging with water; designing water experiences; and ethics around water. This paper contributes a systematic research agenda and aims to advance the field in combining interactive technologies with humans and water. https://doi.org/10.1145/3613904.3642052
Abstract
Recent combinations of interactive technology, humans, and water have resulted in what is now known as “WaterHCI”. WaterHCI design seeks to complement the many benefits of engagement with the aquatic domain, by offering, for example, augmented reality systems for snorkelers, virtual reality in floatation tanks, underwater musical instruments for artists, robotic systems for divers, and wearables for swimmers. We conducted a workshop with global experts in WaterHCI system design, evaluation, and analysis, with the objective of articulating the field’s grand challenges. By articulating these grand challenges, we aim to contribute to a systematic WaterHCI research agenda to ultimately advance the WaterHCI field.
Florian ‘Floyd’ Mueller, Maria F. Montoya, Sarah Jane Pell, Leif Oppermann, Mark Blythe, Paul H Dietz, Joe Marshall, Scott Bateman, Ian Smith, Swamy Ananthanarayan, Ali Mazalek, Alexander Verni, Alexander Bakogeorge, Mathieu Simonnet, Kirsten Ellis, Nathan Arthur Semertzidis, Winslow Burleson, John Quarles, Steve Mann, Chris Hill, Christal Clashing, and Don Samitha Elvitigala. 2024. Grand challenges in WaterHCI. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 30 pages.
Exploring a playful extended reality floatation tank experience
to reduce the fear of being in water.
Maria F. Montoya, Hannah Qiao, Prasanth Sasikumar, Don Samitha Elvitigala, Sarah Jane Pell, Suranga Nanayakkara, Florian 'Floyd' Mueller.
Abstract
People with a fear of being in water rarely engage in water activities and hence miss out on the associated health benefits. Prior research suggested virtual exposure to treat fears. However, when it comes to fear of being in water, virtual water might not capture water’s immersive qualities, while real water can pose safety risks. We propose extended reality to combine both advantages: We conducted a study (N=12) where participants with a fear of being in water interacted with playful water-inspired virtual reality worlds while floating inside a floatation tank. Our findings, supported quantitatively by heart rate variability and qualitatively by interviews, suggest that playful extended reality could mitigate fear responses in an entertaining way. We also present insights for the design of future systems that aim to help people with fear of being in water and other specific phobias using the best of the virtual and physical worlds.
Maria F. Montoya, Hannah Qiao, Prasanth Sasikumar, Don Samitha Elvitigala, Sarah Jane Pell, Suranga Nanayakkara, and Florian ‘Floyd’ Mueller. 2024. Exploring a playful extended reality floatation tank experience to reduce the fear of being in water. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 17 pages. https://doi.org/10.1145/3613904.3642285.
What are the depths of AI in Diving?
Leveraging AI for HFE and HCI in Undersea Operations and Recreational Diving
Getting Back Together: HCI and Human Factors Joining Forces to Meet the AI Interaction, CHI Workshop
Sarah Jane Pell
Abstract
Exploring the impact of Artificial Intelligence (AI) through the lens of Human Factors Engineering (HFE) and Human-Computer Interaction (HCI) in the subsea industries, and specifically on commercial diving operations, may help predict the design potentials and pitfalls for civil recreational diving and embodied immersive visualisation underwater. There are many examples highlighting the potential of AI to enhance safety, situational awareness, and environmental conservation in land, sea, air, cyber, and space territories. By examining the challenges and opportunities of employing or deploying AI-driven technologies in undersea activities, such as autonomous underwater vehicles (AUVs) and real-time metaverse simulations for complex tasks like asset detection and environmental monitoring, we can discuss a new proximal and spatial conundrum: when will AI position actors at a dangerously dissociative distance and yet, by simultaneously enabling a safe operational distance, forfeit all agency and true awareness of their aquabatics? Through a study of these high-stakes applications in extreme environments, we can further explore translation applications to solve HCI design challenges in recreational diving, and the impact of WaterHCI more broadly on aquatic behaviours. The goal is to illuminate the ethical, technical, and security challenges for cross-disciplinary innovation, such as dual-use concerns, and to identify more intuitive, responsive AI systems that cater to the unique demands of undersea activities. Understanding the depths of access and transferability of AI and ML between actors in these converging fields ultimately allows us to predict the implications for attuned body-aquatic diving sensibilities, for specialist applications and cultural practices, across immersive domains, wet and dry.
Sarah Jane Pell. 2024. What are the depths of dual-use AI in Diving? Leveraging AI for HFE and HCI in Undersea Operations and Recreational Diving. In Presentations on: Getting Back Together: HCI and Human Factors Joining Forces to Meet the AI Interaction. In Workshops of the CHI Conference on Human Factors in Computing Systems (CHI ’24 W28), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 4 pages. https://hcihfetogether.wordpress.com/.
Stellar Corpus: From live motion capture to immersive data visualisation in a CAVE2 uniting altered gravity-movement relationships.
Towards Best Practices for Integrating Physiological Signals in HCI, CHI Workshop
Sarah Jane Pell
Abstract
In the evolving landscape of performance art, the nexus between the performer and their biodata is being redefined by HCI. In this context, this paper presents an interactive work of art that explores the somatic relationship between gravity and movement by combining dance, philosophy, extreme performance, and immersive visualization. Stellar Corpus combines a motion capture suit to capture human movement in extreme environments, human pose estimation models to automatically analyse this data, and an immersive 330° ultra-scale visualisation system that projects the signature altered-gravity performance behaviours in 2D and 3D. The discussion consists of three parts. Firstly, to share the creative development and UX staging to deliver a preview showing new potentials for the CAVE2 (Cave audio visual experience). Secondly, the hack to technically reconfigure the CAVE2 (Cave Automatic Virtual Environment) as a live instrument for the cosmic installation. This enables multiple spectators to engage simultaneously with ‘seven muses’ in the round: celestial entities formed as clusters of light particles and gestures extracted from extreme performances for a transcendent experience. Thirdly, to introduce the possibility of an invisible CAVE (Corporeal Affective Visualization Ethics) probing ethics in PhysioHCI, the choreographed bio-signals’ integrity, and the potential as a dramaturgical tool. What are the implications of submitting embodied knowledges and somatic data for the pleasure of a live experience and digital legacy? How does replicability serve to manipulate a narrative to reflect the current Zeitgeist and, through physio rituals or HCI methodologies, affect truth-telling, where the aim is to imbue magic and awe, to move another, in mind, body, and spirit?
Sarah Jane Pell. 2024. Stellar Corpus: From live motion capture to immersive data visualisation in a CAVE2 uniting altered gravity-movement relationships. In Presentations on: PhysioCHI: Towards Best Practices for Integrating Physiological Signals in HCI. In Workshops of the CHI Conference on Human Factors in Computing Systems (CHI ’24 W46), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 7 pages. https://doi.org/10.1145/3613905.3636286.
The ACM SIGCHI Conference on Human Factors in Computing Systems is the premier international conference on human-computer interaction. CHI'24 Mahalo https://chi2024.acm.org/