TurnAhead

First-Person View (FPV) drones are a recently developed category of drones designed for precision flying and for capturing exhilarating footage that could not be captured before, such as navigating through tight indoor spaces and flying extremely close to subjects of interest. FPV viewing experiences, while exhilarating, typically contain frequent rotations that can cause visually induced discomfort. We present TurnAhead, which uses 3-DoF rotational haptic cues that correspond to camera rotations to improve the comfort, immersion, and enjoyment of FPV experiences. It uses headset-mounted air jets to provide ungrounded rotational forces and is the first device to support rotation around all three axes: yaw, pitch, and roll. We conducted a series of perception and formative studies to explore the design space of haptic cue timing and intensity, followed by a user experience evaluation, with a combined total of 44 participants (n=12, 8, 6, 18). Results showed that TurnAhead significantly improved overall comfort, immersion, and enjoyment, and was preferred by 89% of participants.

(a) TurnAhead explores the design space of applying 3-DoF rotational haptic cues to the head to improve the comfort, immersion, and enjoyment of first-person viewing (FPV) experiences. In this example, a user is viewing a car chase scene from the movie Ambulance (2022), shot by FPV drones, as the camera rotates to the right; (b) Our device consists of 6 air nozzles mounted on the front of the VR headset and placed tangent to the head, and is the first wearable device capable of generating rotational forces to turn left/right (yaw axis), the axis involved in 84% of the rotations in FPV footage.
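Because the nozzles are placed tangent to the head, each jet's thrust acts perpendicular to the radius arm from the head's center, so per-jet yaw torque is simply radius times thrust. A minimal back-of-envelope sketch, with an assumed head radius (not a measurement from the paper):

```python
# Illustrative sketch: tangent-mounted air jets produce yaw torque tau = r * F
# per jet, since a tangent thrust is perpendicular to the radius arm.
# HEAD_RADIUS_M is an assumed value, not from the paper.

HEAD_RADIUS_M = 0.09  # assumed effective head radius in meters

def yaw_torque(jet_thrusts_n):
    """Net yaw torque (N*m) from tangent jets; sign of each thrust encodes turn direction."""
    return sum(HEAD_RADIUS_M * f for f in jet_thrusts_n)
```

Opposing turn directions would be rendered by firing jets whose tangent directions circle the head the other way (negative thrust sign here).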

Links

[CHI’23 Full Paper] Honorable Mention Award🏆 TurnAhead: Designing 3-DoF Rotational Haptic Cues to Improve First-person Viewing (FPV) Experiences

Bo-Cheng Ke, Min-Han Li, Yu Chen, Chia-Yu Cheng, Chiao-Ju Chang, Yun-Fang Li, Shun-Yu Wang, Chiao Fang, and Mike Y. Chen. 2023. TurnAhead: Designing 3-DoF Rotational Haptic Cues to Improve First-person Viewing (FPV) Experiences. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23). Association for Computing Machinery, New York, NY, USA, Article 401, 1–15.

DOI: https://doi.org/10.1145/3544548.3581443

AirRacket

We present AirRacket, perceptual modeling and design of ungrounded, directional force feedback for virtual racket sports. Using compressed air propulsion jets to provide directional impact forces, we iteratively designed for three popular sports that span a wide range of force magnitudes: ping-pong, badminton, and tennis. To address the limited force magnitude of ungrounded force feedback technologies, we conducted a perception study which discovered the novel illusion that users perceive larger impact force magnitudes with longer impact duration, by an average factor of 2.57x. Through a series of formative, perceptual, and user experience studies with a combined total of 72 unique participants, we explored several perceptual designs using force magnitude scaling and duration scaling methods to expand the dynamic range of perceived force magnitude. Our user experience evaluation showed that perceptual designs can significantly improve realism and preference vs. physics-based designs for ungrounded force feedback systems.

AirRacket explores perceptual force feedback design using air propulsion jets to improve the haptic experience of virtual racket sports: ping-pong, badminton, and tennis (note: white smoke added for illustrative purposes only; the actual compressed air is invisible).
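The duration-scaling idea above can be sketched as a simple controller: when a target impact exceeds the jet's thrust ceiling, hold thrust at maximum and lengthen the impulse to exploit the reported illusion that longer impulses feel stronger (by up to an average factor of 2.57x). All constants and the linear gain-to-duration mapping below are hypothetical stand-ins, not the paper's fitted psychophysical model:

```python
# Illustrative sketch (not the paper's exact model): map a target perceived impact
# force to a (thrust, duration) command for a thrust-limited air jet.

MAX_THRUST_N = 4.0       # assumed hardware thrust ceiling
BASE_DURATION_S = 0.05   # assumed baseline impulse length
MAX_GAIN = 2.57          # average perceived-magnitude gain reported in the paper
MAX_DURATION_S = 0.25    # assumed longest usable impulse

def impulse_command(target_force_n: float) -> tuple[float, float]:
    """Return (thrust_n, duration_s) approximating the target perceived force."""
    if target_force_n <= MAX_THRUST_N:
        # Within the jet's dynamic range: render directly at baseline duration.
        return target_force_n, BASE_DURATION_S
    # Beyond the ceiling: keep thrust at max and lengthen the impulse instead.
    gain = min(target_force_n / MAX_THRUST_N, MAX_GAIN)
    # Assume perceived gain grows linearly with duration (hypothetical mapping).
    duration = BASE_DURATION_S + (gain - 1.0) / (MAX_GAIN - 1.0) * (MAX_DURATION_S - BASE_DURATION_S)
    return MAX_THRUST_N, duration
```

Under this scheme a tennis-scale impact well above the jet's ceiling still renders at 4.0 N, but with an impulse stretched toward the maximum duration so it is perceived as stronger.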

Links

[CHI’22 Full paper] Best Paper Award 🏆 AirRacket: Perceptual Design of Ungrounded, Directional Force Feedback to Improve Virtual Racket Sports Experiences

Ching-Yi Tsai, I-Lun Tsai, Chao-Jung Lai, Derrek Chow, Lauren Wei, Lung-Pan Cheng, and Mike Y. Chen. 2022. AirRacket: Perceptual Design of Ungrounded, Directional Force Feedback to Improve Virtual Racket Sports Experiences. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (CHI ’22). Association for Computing Machinery, New York, NY, USA, Article 185, 1–15.

DOI: https://doi.org/10.1145/3491102.3502034

MotionRing

We present MotionRing, a vibrotactile headband that creates illusory tactile motion around the head by controlling the timing of a 1-D, 360° sparse array of vibration motors. Its unique ring shape enables symmetric and asymmetric haptic motion experiences, such as when users pass through a medium and when an object passes nearby in any direction. We first conducted a perception study to understand how factors such as vibration motor timing, spacing, duration, intensity, and head region affect the perception of apparent tactile motion. Results showed that illusory tactile motion around the head can be achieved with 12 and 16 vibration motors with angular speed between 0.5-4.9 revolutions per second. We developed a symmetric and an asymmetric tactile motion pattern to enhance the experience of teleportation in VR and dodging footballs, respectively. We conducted a user study to compare the experience of MotionRing vs. static vibration patterns and visual-only feedback. Results showed that illusory tactile motion significantly improved users’ perception of directionality and enjoyment of motion events, and was most preferred by users.

MotionRing is a vibrotactile headband which creates the sensation of 360° tactile motion around the head.
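With N motors evenly spaced around the ring, the timing control described above reduces to choosing the stimulus-onset asynchrony between neighboring motors: one revolution period divided by N. A minimal scheduling sketch (motor duration and intensity, which the study also varied, are omitted here):

```python
# Illustrative sketch of onset scheduling for illusory tactile motion around the
# head: evenly spaced motors fire in sequence with a fixed onset gap (SOA).

def onset_times(n_motors: int, rev_per_s: float, start_index: int = 0, clockwise: bool = True):
    """Return (motor_index, onset_time_s) pairs for one full revolution."""
    period = 1.0 / rev_per_s   # time for one revolution around the head
    soa = period / n_motors    # onset gap between adjacent motors
    step = 1 if clockwise else -1
    return [((start_index + step * i) % n_motors, i * soa) for i in range(n_motors)]
```

For example, 12 motors at 2 revolutions per second yields an onset gap of about 41.7 ms between neighbors, within the 0.5–4.9 rev/s range the perception study found effective.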

Links

(UIST ’21 Full Paper) MotionRing: Creating Illusory Tactile Motion around the Head using 360° Vibrotactile Headbands

Shao-Yu Chu, Yun-Ting Cheng, Shih Chin Lin, Yung-Wen Huang, Yi Chen, and Mike Y. Chen. 2021. MotionRing: Creating Illusory Tactile Motion around the Head using 360° Vibrotactile Headbands. In The 34th Annual ACM Symposium on User Interface Software and Technology (UIST ’21). Association for Computing Machinery, New York, NY, USA, 724–731.

DOI: https://doi.org/10.1145/3472749.3474781

HapticSeer

Haptic feedback significantly enhances virtual experiences. However, supporting haptics currently requires modifying the codebase, making it impractical to add haptics to popular, high-quality experiences such as best-selling games, which are typically closed-source.

We present HapticSeer, a multi-channel, black-box, platform-agnostic approach to detecting game events for real-time haptic feedback. The approach is based on two key insights:

  1. All games have three types of data streams (video, audio, and controller I/O) that can be analyzed in real-time to detect game events.
  2. A small number of user interface design patterns are reused across most games, so event detectors can be reused effectively.

We developed an open-source HapticSeer framework and implemented several real-time event detectors for commercial PC and VR games. We validated system correctness and real-time performance, and discuss feedback from several haptics developers that used the HapticSeer framework to integrate research and commercial haptic devices.

We implemented a system whose components are mostly language-independent, loosely coupled, and encapsulated by utilizing a central message broker. Our system architecture comprises three levels of components: raw data capturers, feature extractors, and event detectors.
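The three-level, broker-decoupled architecture can be sketched in miniature. The real framework runs language-independent components over a central broker; here an in-process publish/subscribe object stands in for it, and the topic names, threshold, and "gunshot" detector are hypothetical examples, not HapticSeer's actual detectors:

```python
# Illustrative sketch: capturer -> feature extractor -> event detector, decoupled
# by a minimal in-process message broker (a stand-in for the real central broker).

from collections import defaultdict

class Broker:
    """Components never reference each other directly, only topic names."""
    def __init__(self):
        self.subscribers = defaultdict(list)
    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)
    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

def build_pipeline(broker, events):
    # Level 2: feature extractor turns raw audio frames into a loudness feature.
    broker.subscribe("raw/audio",
                     lambda frame: broker.publish("feature/loudness", max(frame)))
    # Level 3: event detector fires a haptic event when the feature crosses a threshold.
    broker.subscribe("feature/loudness",
                     lambda loudness: events.append("gunshot") if loudness > 0.8 else None)

broker, events = Broker(), []
build_pipeline(broker, events)
# Level 1: a raw data capturer would publish frames grabbed from the game in real time.
broker.publish("raw/audio", [0.1, 0.95, 0.3])  # loud frame
broker.publish("raw/audio", [0.1, 0.2, 0.1])   # quiet frame
```

Because each level only publishes and subscribes to topics, detectors written for one game's UI pattern can be swapped in for another game without touching the capturers.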

(CHI ’21 Full Paper) HapticSeer: A Multi-channel, Black-box, Platform-agnostic Approach to Detecting Video Game Events for Real-time Haptic Feedback [Honorable Mention]

Yu-Hsin Lin, Yu-Wei Wang, Pin-Sung Ku, Yun-Ting Cheng, Yuan-Chih Hsu, Ching-Yi Tsai, and Mike Y. Chen. 2021. HapticSeer: A Multi-channel, Black-box, Platform-agnostic Approach to Detecting Video Game Events for Real-time Haptic Feedback. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA, 14 pages.

DOI: https://doi.org/10.1145/3411764.344

(Talk) YouTube

(Repository) hapticseer.org

JetController

JetController is a novel haptic technology capable of supporting high-speed and persistent 3-DoF ungrounded force feedback. It uses high-speed pneumatic solenoid valves to modulate compressed air to achieve 20–50 Hz of full impulses at 4.0–1.0 N, and combines multiple air propulsion jets to generate 3-DoF force feedback. Compared to propeller-based approaches, JetController supports 10–30 times faster impulse frequency, and its handheld device is significantly lighter and more compact. JetController supports a wide range of haptic events in games and VR experiences, from firing automatic weapons in games like Halo (15 Hz) to slicing fruits in Fruit Ninja (up to 45 Hz). To evaluate JetController, we integrated our prototype with two popular VR games, Half-Life: Alyx and Beat Saber, to support a variety of 3D interactions. Study results showed that JetController significantly improved realism, enjoyment, and overall experience compared to commercial vibrating controllers, and was preferred by most participants.

JetController is a novel high-speed 3-DoF ungrounded force feedback technology capable of supporting the speed of human button presses and high-speed game events.
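Combining several fixed-direction jets into a 3-DoF net force can be sketched with a simple activation rule. The jet layout, count, and projection rule below are hypothetical illustrations, not the paper's actual controller; the one physical constraint they capture is that a jet can only push, never pull:

```python
# Illustrative sketch: approximate a desired 3-D force by firing the jets whose
# thrust directions have a positive component along it (jets cannot pull).

JETS = {  # unit thrust directions of four hypothetical jets on the handle
    "up":    (0.0, 0.0, 1.0),
    "down":  (0.0, 0.0, -1.0),
    "left":  (-1.0, 0.0, 0.0),
    "right": (1.0, 0.0, 0.0),
}
MAX_THRUST_N = 4.0  # assumed per-jet thrust ceiling

def jet_activations(desired):
    """Map a desired force vector (x, y, z) to per-jet thrust commands."""
    out = {}
    for name, d in JETS.items():
        along = sum(a * b for a, b in zip(desired, d))   # component along this jet
        out[name] = min(max(along, 0.0), MAX_THRUST_N)   # clamp: push-only, limited
    return out
```

In the real device these thrust commands would then be rendered as valve duty cycles, with the solenoid valves pulsing each jet at the required impulse frequency.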

Press Coverage Links

山下裕毅 (2021, Aug 25). A VR controller that delivers impacts via "jet propulsion": National Taiwan University and Ochanomizu University develop "JetController". https://www.itmedia.co.jp/news/articles/2108/25/news047.html

爆炸哥 (2021, Aug 25). [Video] Controller air jets add gun recoil: National Taiwan University develops JetController. https://www.gameover.com.hk/news/525066

Paper Links

(CHI ’21 Full Paper) JetController: High-speed Ungrounded 3-DoF Force Feedback Controllers using Air Propulsion Jets

Yu-Wei Wang, Yu-Hsin Lin, Pin-Sung Ku, Yoko Miyatake, Yi-Hsuan Mao, Po Yu Chen, Chun-Miao Tseng, and Mike Y. Chen. 2021. JetController: High-speed Ungrounded 3-DoF Force Feedback Controllers using Air Propulsion Jets. In CHI Conference on Human Factors in Computing Systems (CHI ’21), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA.
DOI: https://doi.org/10.1145/3411764.3445549

(CHI’21 Interactivity) Demonstration of JetController: High-speed Ungrounded Force Feedback Controllers Using Air Propulsion Jets

Yu-Wei Wang, Yu-Hsin Lin, Pin-Sung Ku, Yoko Miyatake, Po-Yu Chen, Chun-Miao Tseng, Ching-Yi Tsai, and Mike Y. Chen. 2021. Demonstration of JetController: High-speed Ungrounded Force Feedback Controllers Using Air Propulsion Jets. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI ’21 Extended Abstracts), May 8–13, 2021, Yokohama, Japan. ACM, New York, NY, USA.

DOI: https://doi.org/10.1145/3411763.3451542 

(SIGGRAPH ’21 Labs Installation) JetController: High-speed Ungrounded 3-DoF Force Feedback Controllers using Air Propulsion Jets

Yu-Wei Wang, Yu-Hsin Lin, Yoko Miyatake, Ching-Yi Tsai, Pin-Sung Ku, and Mike Y. Chen. 2021. JetController: High-speed Ungrounded 3-DoF Force Feedback Controllers using Air Propulsion Jets. In Special Interest Group on Computer Graphics and Interactive Techniques Conference Labs (SIGGRAPH ’21 Labs), August 09–13, 2021. ACM, New York, NY, USA.

DOI: https://doi.org/10.1145/3450616.3464520

HeadBlaster

We present HeadBlaster, a novel wearable technology that creates motion perception by applying ungrounded force to the head to stimulate the vestibular and proprioception sensory systems. Compared to motion platforms that tilt the body, HeadBlaster more closely approximates how lateral inertial and centrifugal forces are felt during real motion to provide more persistent motion perception. In addition, because HeadBlaster only actuates the head rather than the entire body, it eliminates the mechanical motion platforms that users must be constrained to, which improves user mobility and enables room-scale VR experiences. We designed a wearable HeadBlaster system with 6 air nozzles integrated into a VR headset, using compressed air jets to provide persistent, lateral propulsion forces. By controlling multiple air jets, it is able to create the perception of lateral acceleration in 360 degrees. We conducted a series of perception and human-factor studies to quantify the head movement, the persistence of perceived acceleration, and the minimal level of detectable forces. We then explored the user experience of HeadBlaster through two VR applications: a custom surfing game, and a commercial driving simulator together with a commercial motion platform. Study results showed that HeadBlaster provided significantly longer perceived duration of acceleration than motion platforms. It also significantly improved realism and immersion, and was preferred by users compared to using VR alone. In addition, it can be used in conjunction with motion platforms to further augment the user experience.

HeadBlaster a) applies ungrounded air propulsion force to the head to stimulate the vestibular and proprioception sensory systems to create the perception of persistent self-motion (note: the white smoke is used here only for illustrative purposes; in regular usage, the compressed air is invisible), and b) our system uses 6 air nozzles mounted on VR headsets and combines multiple compressed air jets to generate lateral forces in 360 degrees.
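Generating a lateral force in an arbitrary direction with a ring of nozzles can be sketched as interpolation between the two nozzles bracketing the desired angle. The 6-nozzle, 60-degree layout and the two-jet decomposition below are illustrative assumptions, not the paper's measured geometry or control law:

```python
# Illustrative sketch: render a lateral force at angle theta by decomposing it
# onto the two neighboring nozzle directions of a 6-nozzle ring (60 deg spacing).

import math

N_NOZZLES = 6
SPACING = 2 * math.pi / N_NOZZLES  # angular gap between adjacent nozzles

def nozzle_weights(theta: float, magnitude: float = 1.0):
    """Return {nozzle_index: thrust} for a lateral force at angle theta (radians)."""
    i = int(theta // SPACING) % N_NOZZLES   # nozzle just below theta
    j = (i + 1) % N_NOZZLES                 # its neighbor just above
    a, b = i * SPACING, (i + 1) * SPACING   # bracketing nozzle angles
    # Solve w_a*dir(a) + w_b*dir(b) = magnitude*dir(theta) in the horizontal plane.
    det = math.sin(b - a)
    w_a = magnitude * math.sin(b - theta) / det
    w_b = magnitude * math.sin(theta - a) / det
    return {i: w_a, j: w_b}
```

Sweeping theta through a full circle then yields the 360-degree coverage described above, with at most two jets firing at a time under this particular decomposition.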

(To be published at SIGGRAPH 2020, August 2020)

Haptic-go-round

We present Haptic-go-round, a surrounding platform for deploying props and devices to provide haptic feedback in any direction in virtual reality experiences. The key component of Haptic-go-round is a motorized turntable that rotates the correct haptic device to the right direction at the right time to match what users are about to touch. We implemented a working platform, including plug-and-play prop cartridges and a software interface, that allows experience designers to quickly add their haptic components and use the platform in their applications. We conducted technical experiments and two user studies to evaluate Haptic-go-round's performance. We report the results and discuss our insights and limitations.

Haptic-go-round is a platform that allows a user to feel haptic feedback when interacting with objects in virtual reality in any direction.
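The core scheduling problem, rotating the right prop to the right direction in time, reduces to computing a shortest signed rotation. The slot-angle convention and function names below are hypothetical, not the platform's actual software interface:

```python
# Illustrative sketch: rotate the turntable by the shortest signed angle so the
# prop cartridge the user is about to touch arrives at the touch direction.

def shortest_rotation(current_deg: float, target_deg: float) -> float:
    """Signed shortest rotation (degrees) from current to target orientation."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def turntable_command(prop_slot_deg: float, touch_dir_deg: float, table_deg: float) -> float:
    """Rotation needed so the prop at the given slot angle faces the touch direction."""
    # The prop's world angle is the table orientation plus its slot angle on the table.
    return shortest_rotation(table_deg + prop_slot_deg, touch_dir_deg)
```

Taking the shortest signed rotation matters here: minimizing travel is what lets the turntable arrive before the user's hand does.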

Haptic-go-round: A Surrounding Platform for Encounter-type Haptics in Virtual Reality Experiences

Hsin-Yu Huang, Chih-Wei Ning, Po-Yao Wang, Jen-Hao Cheng, and Lung-Pan Cheng. 2020. Haptic-go-round: A Surrounding Platform for Encounter-type Haptics in Virtual Reality Experiences. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–10.
DOI: https://doi.org/10.1145/3313831.3376476

Miniature Haptics

We present Miniature Haptics, a new approach to providing realistic haptic experiences by applying miniaturized haptic feedback to hand-based, embodied avatars. By shrinking haptics to a much smaller scale, Miniature Haptics enables the exploration of new haptic experiences that are not practical to create at the full, human-body scale. Using Finger Walking in Place (FWIP) as an example avatar embodiment and control method, we first explored the feasibility of Miniature Haptics and then conducted a human factors study to understand how people map their full-body skeletal model to their hands. To understand the user experience of Miniature Haptics, we developed a miniature football haptic display, and results from our user study show that Miniature Haptics significantly improved the realism and enjoyment of the experience and was preferred by users (p < 0.05). In addition, we present two miniature motion platforms supporting the haptic experiences of: 1) rapidly changing ground height for platform jumping games such as Super Mario Bros and 2) changing terrain slope. Overall, Miniature Haptics makes it possible to explore novel haptic experiences that have not been practical before.

Miniature Haptics introduces the concept of shrinking haptic feedback and applying it to hand-based and embodied avatars.

Miniature Haptics: Experiencing Haptic Feedback through Hand-based and Embodied Avatars

Bo-Xiang Wang, Yu-Wei Wang, Yen-Kai Chen, Chun-Miao Tseng, Min-Chien Hsu, Cheng An Hsieh, Hsin-Ying Lee, and Mike Y. Chen. 2020. Miniature Haptics: Experiencing Haptic Feedback through Hand-based and Embodied Avatars. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–8.
DOI: https://doi.org/10.1145/3313831.3376292

WalkingVibe

Virtual Reality (VR) sickness is common, with symptoms such as headaches, nausea, and disorientation, and is a major barrier to using VR. We propose WalkingVibe, which applies unobtrusive vibrotactile feedback to VR walking experiences, reducing VR sickness and discomfort while improving realism. Feedback is delivered through two small vibration motors behind the ears at a frequency that strikes a balance between inducing vestibular response and minimizing annoyance. We conducted a 240-person study to explore how visual, audio, and various tactile feedback designs affect the locomotion experience of users walking passively in VR while seated statically in reality. Results showed that the timing and location of tactile feedback have significant effects on VR sickness and realism. With WalkingVibe, the 2-sided step-synchronized design significantly reduces VR sickness and discomfort while significantly improving realism. Furthermore, its unobtrusiveness and ease of integration make WalkingVibe a practical approach for improving VR experiences with new and existing VR headsets.

WalkingVibe prototype with 2 vibration motors behind the ears, which provide vibrotactile stimulation synchronized to footsteps in VR.
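The winning 2-sided step-synchronized design amounts to pulsing both motors together at each footstep event from the VR locomotion system. A minimal scheduling sketch; the pulse duration and the event representation are assumptions, not the study's parameters:

```python
# Illustrative sketch of the 2-sided step-synchronized design: on each footstep,
# both motors (left and right, behind the ears) pulse simultaneously.

PULSE_S = 0.08  # assumed pulse duration in seconds

def on_footstep(schedule, t_step):
    """Append synchronized pulses for both motors to a (motor, start, stop) schedule."""
    for motor in ("left", "right"):
        schedule.append((motor, t_step, t_step + PULSE_S))
    return schedule
```

One-sided or alternating designs would differ only in which motor entries are appended per step, which is what made the design space easy to sweep in the 240-person study.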

WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback

Yi-Hao Peng, Carolyn Yu, Shi-Hong Liu, Chung-Wei Wang, Paul Taele, Neng-Hao Yu, and Mike Y. Chen. 2020. WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR using Unobtrusive Head-mounted Vibrotactile Feedback. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20). Association for Computing Machinery, New York, NY, USA, 1–12.
DOI: https://doi.org/10.1145/3313831.3376847

MuscleSense

Strength training improves overall health, well-being, physical appearance, and sports performance. There are four major factors that affect training efficacy in a training session: exercise type, number of repetitions, movement velocity, and workload. Prior research has used wearable sensors to detect exercise type, number of repetitions, and movement velocity while training. However, detecting workload remains constrained to instrumented exercise equipment, such as smart exercise machines or RFID-tagged free weights. This paper presents MuscleSense, an approach that estimates exercise workload using wearable Surface Electromyography (sEMG) sensors and regression analysis. We evaluated the accuracy of several regression models and the effects of sensor placement through a 20-person user study. Results showed that MuscleSense achieved an accuracy of 0.68 kg (root mean square error, RMSE) in sensing workload using both forearm and upper-arm sensors and support vector regression (SVR).

MuscleSense senses exercise weights using wearable sEMG sensors. The chart on the right shows the signals from sEMG sensors on the upper arm, from Channel 1 to Channel 8.
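The regression setup can be sketched end to end: extract a per-channel amplitude feature from each sEMG window and fit an SVR to predict workload in kg. The RMS feature, SVR hyperparameters, and synthetic data below are illustrative assumptions; the paper's exact features, preprocessing, and recordings differ:

```python
# Illustrative sketch of sEMG-based workload regression with synthetic stand-in
# data (amplitude grows with the lifted weight), not the study's recordings.

import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def rms_features(window):
    """Root-mean-square amplitude per sEMG channel (window shape: samples x channels)."""
    return np.sqrt(np.mean(np.square(window), axis=0))

# Synthetic training set: 8-channel windows whose amplitude scales with weight.
weights_kg = rng.uniform(1, 10, size=200)
windows = [rng.normal(0, 0.1 * w, size=(500, 8)) for w in weights_kg]
X = np.array([rms_features(w) for w in windows])

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, weights_kg)
pred = model.predict(X[:5])
```

In practice the model would be evaluated on held-out participants rather than training data; the paper compared several such regression models and sensor placements before arriving at the 0.68 kg RMSE result.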

MuscleSense: Exploring Weight Sensing using Wearable Surface Electromyography (sEMG)

Chin Guan Lim, Chin Yi Tsai, and Mike Y. Chen. 2020. MuscleSense: Exploring Weight Sensing using Wearable Surface Electromyography (sEMG). In Proceedings of the Fourteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’20). Association for Computing Machinery, New York, NY, USA, 255–263.
DOI: https://doi.org/10.1145/3374920.3374943