
Gaze estimation based on head movements in virtual reality applications using deep learning

Author: Soccini A. M.
Author position: First
Publication date: 2017-01-01

Abstract

Gaze detection in Virtual Reality systems is mostly performed using eye-tracking devices. The gaze coordinates, together with other eye-related data, are used as input for the applications. While this trend is becoming increasingly popular in the interaction design of immersive systems, most visors do not come with an embedded eye-tracker, especially low-cost ones, which may be based on mobile phones. We propose introducing an innovative gaze estimation system into virtual environments as a source of information about users' intentions. Our solution combines image features with head movement as input to a deep convolutional neural network capable of inferring the 2D gaze coordinates in the imaging plane.
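The record does not include the paper's code, so the following is only a minimal, hypothetical sketch of the kind of model the abstract describes: a small convolutional network (written here in PyTorch) that fuses features extracted from an image with a head-movement vector and regresses the 2D gaze coordinates in the imaging plane. The layer sizes, the 3-DoF head-motion input, and all names are illustrative assumptions, not the architecture published in the paper.

# Hypothetical illustration (not the paper's published model): a CNN that
# combines image features with a head-movement vector to regress 2D gaze.
import torch
import torch.nn as nn

class GazeFromHeadNet(nn.Module):
    def __init__(self, head_dim: int = 3):
        super().__init__()
        # Convolutional feature extractor for the input image.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Fully connected regressor: concatenates the flattened image features
        # with the head-movement vector and outputs the (x, y) gaze coordinates.
        self.regressor = nn.Sequential(
            nn.Linear(32 * 4 * 4 + head_dim, 128), nn.ReLU(),
            nn.Linear(128, 2),
        )

    def forward(self, image: torch.Tensor, head_motion: torch.Tensor) -> torch.Tensor:
        feats = self.features(image).flatten(1)
        return self.regressor(torch.cat([feats, head_motion], dim=1))

# Example forward pass: a batch of 8 images with 3-DoF head-motion vectors.
net = GazeFromHeadNet()
gaze_xy = net(torch.randn(8, 3, 64, 64), torch.randn(8, 3))
print(gaze_xy.shape)  # torch.Size([8, 2])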
Year: 2017
Conference: 19th IEEE Virtual Reality, VR 2017
Conference location: USA
Conference year: 2017
Published in: Proceedings - IEEE Virtual Reality
Publisher: IEEE Computer Society
Pages: 413-414
ISBN: 978-1-5090-6647-6
Keywords: CNN; ConvNet; Deep Learning; Gaze; Neural Networks; Virtual Reality
Author: Soccini A.M.
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1894173
Citations
  • PMC: ND
  • Scopus: 15
  • Web of Science (ISI): 9