ITI/LARSyS at CHI 2021

The 2021 ACM CHI Virtual Conference on Human Factors in Computing Systems is the premier international conference on Human-Computer Interaction. Considered the most prestigious venue in the field of HCI, it attracts thousands of international attendees every year. This year, for the first time, it was held fully online, from May 8 to 13, 2021.
ITI/LARSyS members had papers accepted and presented at this conference, listed below.

Watch the presentation video.

Abstract

We investigate professional greenhouse growers’ user experience (UX) when using climate-management systems in their daily work. We build on the literature on UX, in particular, UX at work, and extend it to ordinary UX at work. In a 10-day diary study, we collected data with a general UX instrument (AttrakDiff), a domain-specific instrument, and interviews. We find that AttrakDiff is valid at work; its three-factor structure of pragmatic quality (PQ), hedonic identification quality, and hedonic stimulation quality is recognizable in the growers’ responses. In this article, UX at work is understood as interactions among technology, tasks, structure, and actors. Our data support the recent proposal for the ordinariness of UX at work. We find that during continued use, UX at work is middle-of-the-scale, remains largely constant over time, and varies little across use situations. For example, the largest slope of the four AttrakDiff constructs when regressed over the 10 days was as small as 0.04. The findings contrast existing assumptions and findings in UX research, which is mainly about extraordinary and positive experiences. In this way, the present study contributes to UX research by calling attention to the mundane, unremarkable, and ordinary UXs at work.
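To illustrate the kind of trend analysis the abstract refers to (regressing AttrakDiff constructs over the 10-day diary period), here is a minimal Python sketch. It is not the authors' analysis code; the daily ratings and construct names below are hypothetical placeholders.

```python
import numpy as np

# Hypothetical diary-study data: daily mean ratings (7-point scale)
# for two AttrakDiff constructs over a 10-day period.
days = np.arange(1, 11)
ratings = {
    "pragmatic_quality": [4.1, 4.0, 4.2, 4.1, 4.0, 4.1, 4.2, 4.1, 4.0, 4.1],
    "hedonic_identification": [3.9, 4.0, 3.9, 4.0, 4.0, 3.9, 4.0, 4.0, 3.9, 4.0],
}

# Regress each construct over time; a slope near zero indicates that
# UX at work stays largely constant across the diary period.
for construct, values in ratings.items():
    slope, intercept = np.polyfit(days, values, 1)
    print(f"{construct}: slope = {slope:.3f} per day")
```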

Watch the presentation video.

Abstract

Head-Mounted Display based Virtual Reality is proliferating. However, Visually Induced Motion Sickness (VIMS), which prevents many from using VR without discomfort, bars widespread adoption. Prior work has shown that limiting the Field of View (FoV) can reduce VIMS at a cost of also reducing presence. Systems that dynamically adjust a user’s FoV may be able to balance these concerns. To explore this idea, we present a technique for standard 360° video that shrinks FoVs only during VIMS inducing scenes. It uses Visual Simultaneous Localization and Mapping and peripheral optical flow to compute camera movements and reduces FoV during rapid motion or optical flow. A user study (N=23) comparing 360° video with unrestricted-FoVs (90°), reduced fixed-FoVs (40°) and dynamic-FoVs (40°-90°) revealed that dynamic-FoVs mitigate VIMS while maintaining presence. We close by discussing the user experience of dynamic-FoVs and recommendations for how they can help make VR comfortable and immersive for all.
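The dynamic-FoV idea lends itself to a simple sketch: estimate peripheral motion between consecutive frames and shrink the field of view when motion is high. The snippet below is a minimal illustration of that control loop, not the authors' implementation; it uses OpenCV's dense optical flow, and the flow thresholds and central-mask size are assumptions, with the FoV bounds matching the study's dynamic condition (40°-90°).

```python
import cv2
import numpy as np

FOV_MIN, FOV_MAX = 40.0, 90.0      # degrees, as in the study's dynamic condition
FLOW_LOW, FLOW_HIGH = 2.0, 12.0    # hypothetical flow magnitudes (pixels/frame)

def dynamic_fov(prev_gray, curr_gray):
    """Map peripheral optical-flow magnitude to a field-of-view value."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)

    # Consider only the periphery: mask out a central region of the frame.
    h, w = magnitude.shape
    mask = np.ones_like(magnitude, dtype=bool)
    mask[h // 4: 3 * h // 4, w // 4: 3 * w // 4] = False
    peripheral_motion = magnitude[mask].mean()

    # Interpolate: low motion keeps the full 90° FoV,
    # rapid motion shrinks it toward 40°.
    t = np.clip((peripheral_motion - FLOW_LOW) / (FLOW_HIGH - FLOW_LOW), 0.0, 1.0)
    return FOV_MAX - t * (FOV_MAX - FOV_MIN)
```

In practice, a system like the one described would smooth the FoV value over time to avoid visible popping, and could combine the flow signal with camera-pose estimates from Visual SLAM as the abstract describes.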

Watch the presentation video.

Abstract

Visually impaired children (VI) face challenges in collaborative learning in classrooms. Robots have the potential to support inclusive classroom experiences by leveraging their physicality, bespoke social behaviors, sensors, and multimodal feedback. However, the design of social robots for mixed-visual abilities classrooms remains mostly unexplored. This paper presents a four-month-long community-based design process where we engaged with a school community. We provide insights into the barriers experienced by children and how social robots can address them. We also report on a participatory design activity with mixed-visual abilities children, highlighting the expected roles, attitudes, and physical characteristics of robots. Findings contextualize social robots within inclusive classroom settings as a holistic solution that can interact anywhere when needed and suggest a broader view of inclusion beyond disability. These include children’s personality traits, technology access, and mastery of school subjects. We finish by providing reflections on the community-based design process.