EDIA provides eye-tracking integration for multiple different headsets: it manages the parsing of the eye tracker's output for you at the original sampling rate and exposes the data via a standardized interface. 🔗 Source and readme: ‣
All EDIA modules depend on EDIA.Core; install this one first (Installing EDIA Core).
As all EDIA modules are Unity packages, they can all be installed via the Package Manager panel.
+ > Add package from git URL
Enter:
https://github.com/edia-toolbox/edia_eye.git?path=Assets/com.edia.eye#main
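Alternatively, the same Git dependency can be declared directly in the project's `Packages/manifest.json`. This is a sketch: the package name `com.edia.eye` is inferred from the `path` query parameter in the URL above, so verify it against the package's own `package.json`.

```json
{
  "dependencies": {
    "com.edia.eye": "https://github.com/edia-toolbox/edia_eye.git?path=Assets/com.edia.eye#main"
  }
}
```

Unity resolves the `?path=` query to the subfolder containing the package and pins the `#main` revision on install.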
Each EDIA.Eye submodule requires the base package to be available.
The Eye-DataHandler prefab should always exist in the scene.
This prefab is responsible for translating the SDK-specific eye-data stream into the EDIA format.
Add this prefab to the scene.
<aside>
💡 Optionally, for debugging purposes, add the Eye-GazeVisualiser to the scene.
</aside>
EDIA uses a standardized data class, EyeDataPackage (inspired by the one used in LSL eye-tracking data streams), to pass along eye-tracking data.
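For orientation, here is a minimal sketch of what such a standardized eye-data container typically holds, loosely following the fields common to LSL eye-tracking streams. The field names below are illustrative assumptions, not the actual EDIA.Eye API; consult the EyeDataPackage class in the package source for the real definition.

```csharp
using UnityEngine;

// Hypothetical sketch of a per-sample eye-data container.
// All field names are assumptions for illustration only.
public struct EyeDataSample
{
    public double Timestamp;          // device timestamp, in seconds
    public Vector3 GazeOrigin;        // combined gaze ray origin (world space)
    public Vector3 GazeDirection;     // combined gaze ray direction (normalized)
    public float LeftPupilDiameter;   // in mm; NaN if unavailable
    public float RightPupilDiameter;  // in mm; NaN if unavailable
    public bool IsValid;              // tracker-reported validity flag
}
```

A data handler in the scene would fill one such sample per tracker frame, regardless of which vendor SDK produced the raw data, which is what makes the downstream consumers SDK-agnostic.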
EDIA.Eye.EyeDataPackage
Eye-XRGazeInteractor