Module description

EDIA.Eye integrates eye tracking for multiple headsets: it parses the eye tracker's output for you at its native sampling rate and exposes it via a standardized interface.

EDIA.Eye submodules


- EDIA.Eye.Quest
- EDIA.Eye.Vive
- EDIA.Eye.Pico
- EDIA.Eye.Varjo

➡️ Module installation

Prerequisite

All EDIA modules depend on EDIA.Core; install it first (see Installing EDIA Core).

Installation of this module:

As all EDIA modules are Unity packages, they can be installed via Unity's Package Manager panel.

  1. Open the package manager panel.
  2. Click the + button and choose Install package from git URL.

Enter:

  https://github.com/edia-toolbox/edia_eye.git?path=Assets/com.edia.eye#main
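
Alternatively, the same dependency can be added by hand to your project's Packages/manifest.json. The entry below mirrors the git URL above; the package name com.edia.eye is assumed from the ?path= folder and should be checked against the package's own package.json:

```json
{
  "dependencies": {
    "com.edia.eye": "https://github.com/edia-toolbox/edia_eye.git?path=Assets/com.edia.eye#main"
  }
}
```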

Module usage

1️⃣ EDIA.Eye base configuration

Each EDIA.Eye submodule requires the base package to be available.

The Eye-DataHandler prefab must always exist in the scene. It is responsible for translating the SDK-specific eye-data stream into the EDIA format.

Add this prefab to the scene.

<aside>

💡 For debugging purposes, you can optionally add the Eye-GazeVisualiser prefab to the scene.

</aside>
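
Once the prefab is in the scene, other scripts can consume the standardized stream. The following is a minimal sketch, assuming the data handler exposes a per-sample C# event (here called OnNewEyeData) and that EyeDataPackage carries a timestamp and a combined gaze direction; these names are assumptions for illustration, so check the package source for the actual API:

```csharp
using UnityEngine;
using EDIA.Eye; // assumed namespace of the base package

// Illustrative consumer of the standardized eye data stream.
// EyeDataHandler, OnNewEyeData, Timestamp and GazeDirectionCombined are
// assumptions for this sketch; consult the EDIA.Eye source for the real API.
public class GazeLogger : MonoBehaviour
{
    [SerializeField] private EyeDataHandler eyeDataHandler; // drag in the Eye-DataHandler from the scene

    private void OnEnable()
    {
        if (eyeDataHandler != null)
            eyeDataHandler.OnNewEyeData += HandleSample;
    }

    private void OnDisable()
    {
        if (eyeDataHandler != null)
            eyeDataHandler.OnNewEyeData -= HandleSample;
    }

    // Called once per sample, at the tracker's native sampling rate.
    private void HandleSample(EyeDataPackage sample)
    {
        Debug.Log($"t={sample.Timestamp:F4}  gaze={sample.GazeDirectionCombined}");
    }
}
```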

👀 Eye data

EDIA uses a standardized data class, EyeDataPackage (inspired by the format used in LSL eye-tracking data streams), to pass along eye-tracking data.

EDIA.Eye.EyeDataPackage
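
The authoritative definition of EyeDataPackage lives in the package source linked above. As a rough orientation only, a container for LSL-style eye-tracking samples typically carries fields along these lines; the names and types below are illustrative assumptions, not the actual class:

```csharp
using UnityEngine;

// Illustrative sketch only; see EDIA.Eye.EyeDataPackage for the real definition.
public class EyeDataPackageSketch
{
    public double Timestamp;                 // sample time (device or Unity clock)
    public Vector3 GazeOriginLeft;           // per-eye gaze ray origins
    public Vector3 GazeOriginRight;
    public Vector3 GazeDirectionLeft;        // per-eye gaze ray directions
    public Vector3 GazeDirectionRight;
    public Vector3 GazeDirectionCombined;    // combined/cyclopean gaze direction
    public float PupilDiameterLeft;          // mm, if the SDK reports it
    public float PupilDiameterRight;
    public float EyeOpennessLeft;            // 0..1, if the SDK reports it
    public float EyeOpennessRight;
    public bool IsValid;                     // whether the SDK flagged the sample as valid
}
```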

ℹ️ EDIA.Eye features

Eye calibration

Eye-XRGazeInteractor