LABORATORY OF

Next-generation Experience and Technology

Research

Embodied Interaction for Wearables & VR/AR

Our research is grounded in Embodied Interaction, emphasizing seamless integration between the human body and digital technologies. We envision interaction paradigms in which the body becomes the primary medium of input and output, enabling intuitive, expressive, and multisensory experiences. Our work centers on four themes:

1. Wearable & Tangible Interfaces: Designing low-barrier, body-centric systems that leverage the human form for natural interaction in both physical and virtual spaces.

2. Multisensory & Haptic Feedback: Integrating tactile, thermal, and auditory cues to enrich immersion and presence across XR environments.

3. Spatial & Context-aware Interaction: Creating adaptive interfaces that respond dynamically to user context, environment, and task demands.

4. Real-time Prototyping & Evaluation: Building experimental platforms to rapidly test new concepts through rigorous user studies and metrics.

High-bandwidth Input Techniques

We envision a future where input systems seamlessly capture the full bandwidth of human expression—from fine-grained finger motions to whole-body gestures—bridging the gap between physical and virtual worlds. Our research investigates:

1. High-fidelity Sensing and Tracking: Leveraging computer vision, wearable sensors, and hybrid sensing techniques to achieve accurate, low-latency motion capture in 3D space.

2. Expressive and Multimodal Interaction: Combining gestures with voice, gaze, haptics, and context-aware signals for richer user input channels.

3. Adaptive and Personalized Interfaces: Developing machine learning models that adapt input techniques to individual users, tasks, and environments.

4. Performance and Usability Evaluation: Building experimental platforms to systematically evaluate input bandwidth, accuracy, and user experience across diverse application domains.

Realistic and Immersive Experience

We envision a future where the boundary between real and virtual worlds dissolves, enabling users to experience seamless and lifelike interactions that engage multiple senses simultaneously. To achieve this, our research explores:

1. Multimodal Feedback Integration: Designing interfaces that combine haptic, auditory, visual, and thermal cues to replicate real-world sensations and enhance perceptual realism.

2. Cross-reality Interaction Design: Developing interaction paradigms that unify physical robots, tangible interfaces, and virtual agents within a single immersive ecosystem.

3. Immersion Evaluation: Creating user-centered evaluation frameworks to measure presence, realism, and cognitive engagement in both real and virtual contexts.

4. Adaptive & Context-aware Rendering: Leveraging AI-driven models to dynamically adapt sensory feedback based on user states, environmental factors, and interaction intent.

DGIST E7-213, 333 Techno Jungang-daero, Hyeonpung-eup, Dalseong-gun, Daegu 42988, Republic of Korea

Tel. 053-785-6329 | Email: hyunjae.gil@dgist.ac.kr