Directing a virtual agent based on eye behaviour of a user
Saved in:
Main authors:
Format: Patent
Language: English
Subjects:
Online access: Order full text
Abstract: In an extended reality (XR), virtual reality (VR), augmented reality (AR) or mixed reality (MR) environment, the actions of a virtual object are directed by a user based on the user's eye behaviour, with the environment displaying a user avatar that includes a representation of the user's eye(s). The virtual object is displayed (302) on a display and is associated with a first viewing frustum containing (304) the user avatar, which includes a visual representation of one or more eyes. While the virtual object is displayed, eye tracking data indicative of the user's eye behaviour is obtained (306), the visual representation of the eye(s) is updated (312) based on that behaviour, and the virtual object is directed (316) to perform an action based on the updated eye representation together with scene information associated with the device. The eye behaviour may include a movement (308) of the user's eye from a first focus position to a second focus position, and this movement may include a saccade (310) directed to an object of interest.
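The abstract describes a pipeline: obtain eye tracking data, classify the eye behaviour (e.g. detect a saccade from one focus position to another), update the avatar's eye representation, and direct the virtual object to act using that representation plus scene information. The sketch below illustrates one possible reading of that flow. It is not the patented implementation; all names (`EyeSample`, `classify_eye_behaviour`, `direct_agent`), the velocity-threshold saccade test, and the nearest-object lookup are illustrative assumptions.

```python
from dataclasses import dataclass
import math


@dataclass
class EyeSample:
    """Hypothetical eye-tracker sample: 2D gaze focus position and angular velocity."""
    focus: tuple      # (x, y) focus position in a normalised display plane
    velocity: float   # angular velocity in deg/s

# A common heuristic cutoff for saccade detection (assumption, not from the patent).
SACCADE_VELOCITY_THRESHOLD = 100.0


def classify_eye_behaviour(prev: EyeSample, curr: EyeSample) -> str:
    """Classify the movement from a first focus position to a second one."""
    if curr.velocity >= SACCADE_VELOCITY_THRESHOLD:
        return "saccade"
    if prev.focus != curr.focus:
        return "smooth_pursuit"
    return "fixation"


def nearest_object(focus, scene_objects):
    """Resolve the scene object closest to the gaze focus (stand-in for
    the 'object of interest' lookup using scene information)."""
    return min(scene_objects, key=lambda o: math.dist(focus, o["position"]))


def direct_agent(prev: EyeSample, curr: EyeSample, scene_objects):
    """Update the avatar's eye representation, then choose an agent action
    from the classified eye behaviour plus scene information."""
    behaviour = classify_eye_behaviour(prev, curr)
    avatar_eyes = {"focus": curr.focus}            # update the visual representation
    if behaviour == "saccade":
        target = nearest_object(curr.focus, scene_objects)
        action = ("attend_to", target["name"])     # direct the virtual object
    else:
        action = ("idle", None)
    return avatar_eyes, action


if __name__ == "__main__":
    prev = EyeSample(focus=(0.0, 0.0), velocity=5.0)
    curr = EyeSample(focus=(0.4, 0.2), velocity=250.0)  # fast jump: a saccade
    scene = [{"name": "lamp", "position": (0.5, 0.2)},
             {"name": "door", "position": (-0.8, 0.1)}]
    eyes, action = direct_agent(prev, curr, scene)
    print(eyes, action)
```

In this sketch the saccade's landing point, combined with scene information, selects the object of interest, which mirrors the abstract's description of directing the action from the updated eye representation and scene data.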