
The NTU HCI Lab explores future user interfaces at the intersection of human-computer interaction (HCI), artificial intelligence (AI), design, psychology, and mobile/wearable/VR/AR systems. Our work has been featured by Discovery Channel, Engadget, EE Times, New Scientist, and more. The lab is directed by Prof. Mike Y. Chen and our new faculty member, Prof. Lung-Pan Cheng.

NodEverywhere

Handsfree input allows people to interact with the real world without occupying their hands and is especially important for augmented reality headsets. Currently, dwell time is used with eye gaze and head pointing as a handsfree selection technique. However, prior work on improving dwell time has not addressed unintended selections (i.e., the Midas Touch problem) for general, […]

DeepGesture

Elderly and motor-impaired people have difficulty interacting with touch-screen devices. Commonly used mobile systems rely on a general model for gesture recognition, but this general threshold-based model may not meet their special needs. Hence, we present DeepGesture, a two-stage, self-learning model for gesture recognition. In the first stage, three-dimensional convolution […]

MuscleSense

Exploring Weight Sensing using Wearable Surface Electromyography (sEMG)

WizardVR

Using Augmented Reality to Support Observing and Wizard-of-Oz Prototyping for Virtual Reality Experiences

Gaze+Head

Gaze Pointing with Implicitly Triggered Head Refinement

EyeExpress

Expanding Hands-free Input Vocabulary using Eye Expressions

PeriText

Utilizing Peripheral Vision for Reading Text on Augmented Reality Smart Glasses

ARPilot

Designing and Investigating AR Shooting Interfaces on Mobile Devices for Drone Videography

SpeechBubbles

Enhancing Captioning Experiences for Deaf and Hard-of-Hearing People in Group Conversations

ActiveErgo

Automatic and Personalized Ergonomics using Self-actuating Furniture

CurrentViz

Sensing and Visualizing Electric Current Flows of Breadboarded Circuits

CircuitSense

Automatic Sensing of Physical Circuits and Generation of Virtual Circuits to Support Software Tools

CircuitStack

Supporting Rapid Prototyping and Evolution of Electronic Circuits

Nail+

Sensing Fingernail Deformation to Detect Finger Force Touch Interactions on Rigid Surfaces

User-Defined Game Input

User-Defined Game Input for Smart Glasses in Public Space

Backhand

Sensing Hand Gestures via Back of the Hand

PalmType

Using Palms as Keyboards for Smart Glasses

iGrasp

Grasp-based Adaptive Keyboard for Mobile Devices

iRotate

Automatic Screen Rotation based on Face Orientation