The NTU HCI Lab, directed by Prof. Mike Y. Chen, explores future user interfaces at the intersection of human-computer interaction, psychology, and mobile/wearable systems. Our work has been featured by Discovery Channel, Engadget, EE Times, New Scientist, and more.
Expanding Hands-free Input Vocabulary using Eye Expressions
Utilizing Peripheral Vision for Reading Text on Augmented Reality Smart Glasses
Designing and Investigating AR Shooting Interfaces on Mobile Devices for Drone Videography
Enhancing Captioning Experiences for Deaf and Hard-of-Hearing People in Group Conversations
Automatic and Personalized Ergonomics using Self-actuating Furniture
Sensing and Visualizing Electric Current Flows of Breadboarded Circuits
Automatic Sensing of Physical Circuits and Generation of Virtual Circuits to Support Software Tools
Supporting Rapid Prototyping and Evolution of Electronic Circuits
Sensing Fingernail Deformation to Detect Finger Force Touch Interactions on Rigid Surfaces
User-Defined Game Input for Smart Glasses in Public Space
Sensing Hand Gestures via Back of the Hand
Using Palms as Keyboards for Smart Glasses
Grasp-based Adaptive Keyboard for Mobile Devices
Automatic Screen Rotation based on Face Orientation