The NTU HCI Lab, directed by Prof. Mike Y. Chen, explores future user interfaces at the intersection of human-computer interaction, psychology, and mobile/wearable systems. Our work has been featured by Discovery Channel, Engadget, EE Times, New Scientist, and more.

PeriText

Utilizing Peripheral Vision for Reading Text on Augmented Reality Smart Glasses

ARPilot

Designing and Investigating AR Shooting Interfaces on Mobile Devices for Drone Videography

SpeechBubbles

Enhancing Captioning Experiences for Deaf and Hard-of-Hearing People in Group Conversations

ActiveErgo

Automatic and Personalized Ergonomics Using Self-actuating Furniture

CurrentViz

Sensing and Visualizing Electric Current Flows of Breadboarded Circuits

CircuitSense

Automatic Sensing of Physical Circuits and Generation of Virtual Circuits to Support Software Tools

CircuitStack

Supporting Rapid Prototyping and Evolution of Electronic Circuits

Nail+

Sensing Fingernail Deformation to Detect Finger Force Touch Interactions on Rigid Surfaces

User-Defined Game Input

User-Defined Game Input for Smart Glasses in Public Space

Backhand

Sensing Hand Gestures via Back of the Hand

PalmType

Using Palms as Keyboards for Smart Glasses

iGrasp

Grasp-based Adaptive Keyboard for Mobile Devices

iRotate

Automatic Screen Rotation based on Face Orientation