The NTU HCI Lab explores future user interfaces at the intersection of human-computer interaction (HCI), artificial intelligence (AI), design, psychology, and mobile/wearable/VR/AR systems. Our work has been featured by Discovery Channel, Engadget, EE Times, New Scientist, and more. The lab is directed by Prof. Mike Y. Chen and our new faculty member Prof. Lung-Pan Cheng.
Expanding Hands-free Input Vocabulary using Eye Expressions
Utilizing Peripheral Vision for Reading Text on Augmented Reality Smart Glasses
Designing and Investigating AR Shooting Interfaces on Mobile Devices for Drone Videography
Enhancing Captioning Experiences for Deaf and Hard-of-Hearing People in Group Conversations
Automatic and Personalized Ergonomics using Self-actuating Furniture
Sensing and Visualizing Electric Current Flows of Breadboarded Circuits
Automatic Sensing of Physical Circuits and Generation of Virtual Circuits to Support Software Tools
Supporting Rapid Prototyping and Evolution of Electronic Circuits
Sensing Fingernail Deformation to Detect Finger Force Touch Interactions on Rigid Surfaces
User-Defined Game Input for Smart Glasses in Public Space
Sensing Hand Gestures via Back of the Hand
Using Palms as Keyboards for Smart Glasses
Grasp-based Adaptive Keyboard for Mobile Devices
Automatic Screen Rotation based on Face Orientation