Research Projects

Visual-Auditory Redirection using Auditory Cues in Reality

This work examines how auditory cues occurring in reality affect redirection.

BirdViewAR

We propose BirdViewAR, a surroundings-aware remote drone-operation system that enhances pilots' spatial awareness through an augmented third-person view (TPV) from an autopiloted secondary follower drone.

HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone

We propose HandyGaze, a 6-DoF gaze-tracking technique for room-scale environments that requires only a naturally held smartphone, without installing any sensors or markers in the environment.

TetraForce: A Magnetic-Based Interface Enabling Pressure Force and Shear Force Input Applied to Front and Back of a Smartphone

We propose a novel phone-case-shaped interface named TetraForce, which enables four types of force input consisting of two force directions (i.e., pressure force and shear force) and two force-applied surfaces (i.e., touch surface and back surface) in a single device.

RedirectedDoors: Redirection While Opening Doors in Virtual Reality

RedirectedDoors is a novel space-efficient technique for redirection in VR focused on door-opening behavior.

PseudoJumpOn: Jumping onto Steps in Virtual Reality

PseudoJumpOn is a novel locomotion technique that allows the user to experience a virtual step-up jumping motion without requiring the placement of physical steps.

Emotion Embodied Avatars: Novel Remote Communication with Motion Unit AI

Nonverbal information plays an important role in rich human communication. Among its many forms, we focus on body movements, which are particularly important for enriching communication and academically challenging to study. Specifically, we built an AI that links human emotions and intentions with the "Motion Unit," an analysis unit for body movements, and we are conducting research to generate emotionally rich avatar motions through international arts-science and industry-academia collaboration.

BouncyScreen

We explore BouncyScreen, an actuated 1D display system that enriches indirect interaction with a virtual object through pseudo-haptic feedback enhanced by the screen's physical movements.

WaddleWalls

We propose a self-actuated stretchable partition whose physical height, width, and position can dynamically change to create secure workplaces without inhibiting group collaboration.

TiltChair

We propose an actuated office chair that physically manipulates the user’s posture by actively inclining the chair’s seat to address problems associated with prolonged sitting. The system controls the inclination angle and motion speed with the aim of achieving manipulative but unobtrusive posture guidance.

Shape-Shifting Wall Display

Focusing on the shape of displays in the workspace, we are exploring a wall display that automatically changes its shape and arrangement according to its contents and circumstances.

Magnetic Tracking System for Dexterous 3D Interaction and Motion

IM3D and IM6D are novel real-time magnetic motion-tracking systems using multiple identifiable, tiny, lightweight, wireless and occlusion-free markers.

Coupled-Clay: Physical-Virtual 3D Collaborative Interaction Environment

Coupled-Clay is a bi-directional 3D collaborative interaction environment that supports 3D modelling work between groups of users at remote locations.

ShearSheet

We propose ShearSheet, a low-cost, feasible, and simple DIY method that enables tangential (shear) force input on a touchscreen using a rubber-mounted slim transparent sheet.

A Viewport Control Method using Metaphor of Flexible Materials

We propose a novel technique for scroll and zoom operations that treats the displayed content as a flexible material, such as a piece of cloth.

Text Typing in VR using a Smartphone Touchscreen and HMD

This work presents a simple new text-typing method that uses a smartphone touchscreen keyboard in immersive virtual environments (IVEs) with an HMD.

Redirected Jumping: Imperceptibly Manipulating Jump Motions in Virtual Reality

Redirected Jumping is a novel redirection technique that leverages the user's jumping motion.

ZoomWalls: Dynamic Walls that Simulate Haptic Infrastructure for Room-scale VR Worlds

ZoomWalls is a novel haptic infrastructure using dynamic wall props.

Seamless interaction in multi-display environment

We propose a seamless interaction method that resolves the content distortion that occurs across multiple displays.

GyroWand

We present GyroWand, a raycasting technique for 3D interactions in self-contained head-mounted displays.

Spatial and Situational Awareness Techniques

We are exploring situational-awareness technologies that estimate a user's various activities, such as conversational engagement, spatial movements, and interactions with computers, using multiple sensor systems.

PinpointFly

PinpointFly is a novel AR-based egocentric drone interface that enhances spatial perception and manipulation accuracy by overlaying a cast shadow of the flying drone on the ground.

TransformTable

TransformTable is a shape-changing digital table whose dynamic transformation capabilities create workspaces suited to various conversational and collaborative scenarios.

SharedWell: A Display Table for Strategic Negotiations

SharedWell is a new display system that allows multiple users to deal with not only public information but also private information on a shared display.

Visibility Control using Revolving Polarizer

Visibility Control is a novel display system that presents information at different levels of visibility to multiple users by means of a revolving polarizer.

Target prediction interface “Delphian Desktop”

This research details the design and evaluation of the Delphian Desktop, a mechanism for online spatial prediction of cursor movements in a Windows-Icons-Menus-Pointers (WIMP) environment.

Third-Person Piloting

Third-Person Piloting is a novel drone interface that increases situational awareness by providing an interactive third-person perspective from a spatially coupled second drone in the sky.

Anchored navigation

We propose two novel map navigation techniques, called Anchored Zoom (AZ) and Anchored Zoom and Tilt (AZT).

CamCutter

CamCutter is a novel ad-hoc cross-device application-sharing technique using a mobile camera.

MovemenTable

To create a more comfortable and efficient workspace, MovemenTable automatically moves depending on users' spatial behavior, providing a dynamic and flexible space that adapts to users' needs.

CubeHarmonic: New Musical Interface with a Magnetic Motion Tracking

CubeHarmonic is a musical interface combining a Rubik’s cube with a magnetic 3D motion-tracking system, allowing users to play and compose music.

IllusionHole: Interactive Stereoscopic Display for Multiple Users

IllusionHole is an interactive display system that allows three or more moving observers to simultaneously observe stereoscopic image pairs from their own viewpoints.

Light-Tracing

Light-Tracing is a ray-casting movement technique for improved character control in platform-style virtual reality gaming.

VRSafariPark

VRSafariPark is a VR application that uses an intuitive 3D user interface based on the metaphor of a world tree and blocks.

Fundamental study of pointing motion characteristics

We investigate motion planning and the tradeoff between pointing time and intuitive operation.

Boundless Scroll

Our study explores Boundless Scroll, an interface that extends the motor space for finger-drag motions into off-screen space using a tracking system around the screen.

Interactive Content Visualization

We are conducting research to develop technologies for interactive content visualization.

Multimodal target selection techniques for multi displays

We compare multi-modal interaction techniques in a perspective-corrected multi-display environment (MDE).

ViBlock: Block-shaped Content Manipulation in VR

ViBlock is an interface for manipulating block-shaped digital content using a head-mounted display (HMD) in VR.

A-Blocks

A-Blocks is a novel system that measures play behavior during block play using blocks with embedded wireless acceleration sensors.

Living Wall Display

Living Wall Display is an autonomous mobile wall display that dynamically changes its position and orientation in coordination with content animation to augment the interaction experience.

AdapTable

We propose AdapTable, a concept and prototype of a flexible multi-display tabletop that can physically reconfigure its layout, allowing interaction with difficult-to-reach regions.