HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone
We propose HandyGaze, a 6-DoF gaze tracking technique for room-scale environments that requires only a naturally held smartphone, with no sensors or markers installed in the environment. Our technique employs the smartphone’s front and rear cameras simultaneously: the front camera estimates the user’s gaze vector relative to the smartphone, while the rear camera (and depth sensor, if available) performs self-localization against a pre-obtained 3D map of the environment. We implemented a prototype for iOS smartphones that runs an ARKit-based algorithm to estimate the user’s 6-DoF head pose, together with a novel calibration method that offsets the user-specific deviation between head and gaze orientations. We then conducted a user study (N=10) measuring the technique’s positional accuracy at the gaze target under four conditions: with and without the depth sensor, and with and without calibration. With the depth sensor, the mean absolute angular error of the gaze point was 8.2 degrees and the positional error was 0.53 m, suggesting the technique is accurate enough for practical applications such as gaze-based guidance in museums.
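The technique fuses two per-frame estimates: the phone’s 6-DoF pose in the room (rear-camera self-localization) and the gaze direction relative to the phone (front camera), with a per-user calibration applied to offset the head-vs-gaze deviation. A minimal NumPy sketch of that fusion, under assumptions of our own: the function names, the single-rotation calibration model, and the angular-error metric are illustrative, not the authors’ implementation.

```python
import numpy as np

def rotation_about_y(theta):
    """Rotation matrix about the y (vertical) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def world_gaze_ray(phone_R, phone_t, gaze_dir_phone, calib_R=np.eye(3)):
    """Combine the two per-frame estimates into a world-space gaze ray.

    phone_R, phone_t : smartphone pose in the world frame (hypothetically the
                       output of rear-camera self-localization on the 3D map)
    gaze_dir_phone   : unit gaze vector in the phone's frame (front camera)
    calib_R          : per-user rotation modeling the calibration offset
                       between head and gaze orientations
    """
    d = phone_R @ (calib_R @ gaze_dir_phone)
    return phone_t, d / np.linalg.norm(d)

def angular_error_deg(d, origin, target):
    """Angle (degrees) between the gaze ray and the direction to a target."""
    to_target = target - origin
    to_target = to_target / np.linalg.norm(to_target)
    cosang = np.clip(np.dot(d, to_target), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))
```

For example, with an identity phone pose and a gaze straight along the phone’s −z axis, a target on that axis yields zero angular error, and a 10-degree calibration rotation shifts the ray by exactly 10 degrees.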
Publications
International
- Takahiro Nagai, Kazuyuki Fujita, Kazuki Takashima, Yoshifumi Kitamura. HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone, Proceedings of the ACM on Human-Computer Interaction (ISS ’22), Vol. 6, Issue ISS, Article No. 562, pp. 143–160, Nov. 2022.
https://doi.org/10.1145/3567715
Domestic
- Takahiro Nagai, Kazuyuki Fujita, Kazuki Takashima, Yoshifumi Kitamura. A Study of a Gaze Input Interface for the Surrounding Environment Using Only a Smartphone, Human Interface Society Research Reports, Jun. 2021. (in Japanese)