Unity Package

The Unity Package exposes the Beam Eye Tracker API as a device based on the Input System, and offers ready-to-use behaviours/scripts that let you implement features such as Camera Controls and an Immersive HUD with minimal effort.

This page shows how to install the package and add the relevant behaviours to your game.

Note

For foveated rendering, the Unity package provides easy access to the FoveatedRenderingState data structure, but there is not yet an out-of-the-box integration with the rendering pipeline.

Unity Package Installation

A ready-to-use package is available in the unity/beam-eye-tracker-plugin subfolder. To add it to your project:

  1. Open the Window > Package Manager menu.

  2. Click on the + button and select Add package from disk.

  3. Navigate to the unity/beam-eye-tracker-plugin subfolder and select the package.json file.

  4. Click on the Import button.

  5. [Optional] Once the package is imported, you can find the Beam Eye Tracker Plugin in the Package Manager and import the sample scene.

Note

The package depends on the Input System package. You may be prompted to enable it and restart the Unity Editor.

Beam Eye Tracker Demo Sample

The CamAndHUDControl.unity sample scene demonstrates all features below. To explore it:

  1. Open the Window > Package Manager menu.

  2. Find the Beam Eye Tracker Plugin package and click on the Samples tab.

  3. Click on the Import button.

  4. Open the CamAndHUDControl.unity scene and click on the Play button.

Beam Eye Tracker Plugin

Sample scene in play mode. The camera follows the head and eye movements, whereas the HUD elements (red blocks) fade out when not being looked at. The purple circle is the eye tracking overlay from the Beam Eye Tracker application, indicating where the user is looking.

The following sections explain how to integrate each of the core features in your own scene.

How to: Integrating the core features

This section explains how to integrate each of the core features of the Beam Eye Tracker SDK.

Controlling the in-game camera with head and eye tracking

To implement the Camera Controls feature, a ready-made script is available. To use it:

  1. Open the Beam Eye Tracker Plugin package in the Packages folder in the Unity Editor.

  2. Locate the Runtime/Scripts/CameraControlBehaviour.cs Script.

  3. Drag the CameraControlBehaviour.cs Script to the Camera in your scene.

  4. Run the scene and the camera should now follow your head and eye movements! Note that Gaming Extensions needs to be toggled ON in the Beam Eye Tracker application. See Starting the Beam Eye Tracker from the game for a convenient alternative.

Compatibility with other scripts modifying the camera pose

The CameraControlBehaviour.cs script sets the camera object Transform's localPosition and localEulerAngles properties at each Update call, as shown below:

private void Update()
{
    // The input device may not be available or initialized yet.
    if (betInputDevice == null)
    {
        return;
    }

    // Only drive the camera while controls are not paused and tracking is active.
    if (!cameraControlIsPaused && betInputDevice.trackingStatus.ReadValue() == 1)
    {
        cachedTransform.localPosition = betInputDevice.cameraPosition.ReadValue();
        cachedTransform.localEulerAngles = betInputDevice.cameraRotation.ReadValue();
    }
}

If other behaviours modify these properties, or the world pose, they will conflict with the CameraControlBehaviour and you will experience undefined behaviour. There are, however, simple ways to circumvent such conflicts:

  • Option 1: Add a parent GameObject to the camera and move all prior pose-modifying behaviours to that parent, so that the CameraControlBehaviour's pose modification is compounded with the parent object's Transform.

  • Option 2: Implement your own behaviour by inheriting from BeamEyeTrackerMonoBehaviour, reading the cameraPosition and cameraRotation outputs, and adding their values to the pose generated by your other behaviours. In this case, please also implement the Camera control hotkeys for a good user experience.
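Option 2 could be sketched as below, assuming the same betInputDevice controls shown in the earlier snippet. The class name, the base-pose fields, and the additive compounding of position and Euler angles are illustrative assumptions, not part of the package:

```csharp
using UnityEngine;

// Hypothetical sketch: compound the tracker pose with a pose computed elsewhere.
public class CompoundCameraControlBehaviour : BeamEyeTrackerMonoBehaviour
{
    // Base pose produced by your own camera logic (e.g. a follow script).
    public Vector3 basePosition;
    public Vector3 baseEulerAngles;

    private void Update()
    {
        if (betInputDevice == null)
        {
            return;
        }

        if (betInputDevice.trackingStatus.ReadValue() == 1)
        {
            // Add the tracker outputs on top of the externally computed pose.
            transform.localPosition = basePosition + betInputDevice.cameraPosition.ReadValue();
            transform.localEulerAngles = baseEulerAngles + betInputDevice.cameraRotation.ReadValue();
        }
        else
        {
            // Fall back to the base pose when tracking is unavailable.
            transform.localPosition = basePosition;
            transform.localEulerAngles = baseEulerAngles;
        }
    }
}
```

Whether simple addition of the two poses is appropriate depends on your camera rig; for large rotations you may prefer to compound them via a parent Transform as in Option 1.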

See also

Let the Beam Eye Tracker do the heavy lifting when it comes to generating the pose parameters.

Camera control hotkeys

The package includes a BeamEyeTrackerControls.inputactions file and its generated BeamEyeTrackerControls.cs class. It defines an action to recenter the camera, which is a critical operation for user experience. The binding is set to the Semicolon key by default, but you can change it using the Unity Action Editor.

There is also an action to pause the camera controls, which further improves the user experience, but it has no binding by default. Refer to CameraControlBehaviour.cs to see how it is used, in case you implement your own camera controls instead.
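Wiring the pause action into your own behaviour could look roughly like this, assuming the generated class follows Unity's standard .inputactions wrapper pattern. The action map name ("Camera") and action name ("PauseCameraControls") below are assumptions; check the generated BeamEyeTrackerControls class and the Action Editor for the actual names:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch: toggling a pause flag from the pause input action.
public class CameraPauseToggle : MonoBehaviour
{
    private BeamEyeTrackerControls controls;

    public bool cameraControlIsPaused { get; private set; }

    private void OnEnable()
    {
        controls = new BeamEyeTrackerControls();
        controls.Enable();
        // Assumed map/action names; adjust to the actual generated layout.
        controls.Camera.PauseCameraControls.performed += OnPausePerformed;
    }

    private void OnDisable()
    {
        controls.Camera.PauseCameraControls.performed -= OnPausePerformed;
        controls.Disable();
    }

    private void OnPausePerformed(InputAction.CallbackContext context)
    {
        cameraControlIsPaused = !cameraControlIsPaused;
    }
}
```

The cameraControlIsPaused flag can then be consulted by your camera logic, mirroring how CameraControlBehaviour.cs gates its pose updates.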

Making the HUD more immersive

To implement the Immersive HUD feature, a ready-made script is available. Assuming you use a Canvas for your HUD elements, such as the minimap, you can easily make them fade out when not being looked at to increase immersion. To achieve this:

  1. Open the Beam Eye Tracker Plugin package in the Packages folder in the Unity Editor.

  2. Locate the Runtime/Scripts/ImmersiveHUDPanelBehaviour.cs Script.

  3. Drag the ImmersiveHUDPanelBehaviour.cs Script to each of the Canvas children UI objects you wish to make immersive (such as the minimap).

  4. Run the scene and the UI should now react to whether you are looking at it or not! Note that Gaming Extensions needs to be toggled ON in the Beam Eye Tracker application. See Starting the Beam Eye Tracker from the game for a convenient alternative.
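The idea behind the packaged script can be illustrated with a minimal sketch: fade a CanvasGroup depending on whether the gaze falls inside the panel's rect. The gaze property's coordinate convention and the fade parameters are assumptions here; prefer the packaged ImmersiveHUDPanelBehaviour.cs in practice:

```csharp
using UnityEngine;

// Illustrative sketch of gaze-driven HUD fading, not the packaged implementation.
[RequireComponent(typeof(CanvasGroup), typeof(RectTransform))]
public class GazeFadePanel : BeamEyeTrackerMonoBehaviour
{
    public float fadedAlpha = 0.2f;  // alpha when not looked at
    public float fadeSpeed = 4f;     // alpha units per second

    private CanvasGroup canvasGroup;
    private RectTransform rectTransform;

    private void Awake()
    {
        canvasGroup = GetComponent<CanvasGroup>();
        rectTransform = GetComponent<RectTransform>();
    }

    private void Update()
    {
        if (betInputDevice == null)
        {
            return;
        }

        // Assumed: viewportGazePosition is expressed in screen pixels.
        Vector2 gaze = betInputDevice.viewportGazePosition.ReadValue();
        bool lookedAt = RectTransformUtility.RectangleContainsScreenPoint(rectTransform, gaze);

        // Ease the alpha towards fully visible or faded.
        float target = lookedAt ? 1f : fadedAlpha;
        canvasGroup.alpha = Mathf.MoveTowards(canvasGroup.alpha, target, fadeSpeed * Time.deltaTime);
    }
}
```

The RectangleContainsScreenPoint overload used here assumes a Screen Space - Overlay canvas; pass the canvas camera for other render modes.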

Starting the Beam Eye Tracker from the game

A quality-of-life behaviour is available that lets you launch the Beam Eye Tracker application and start tracking directly from the game, so the user (or the developer) doesn't need to remember to start the Beam Eye Tracker application manually. To use the related script:

  1. Open the Beam Eye Tracker Plugin package in the Packages folder in the Unity Editor.

  2. Locate the Runtime/Scripts/AutoStartBeamEyeTrackerBehaviour.cs Script.

  3. Drag the AutoStartBeamEyeTrackerBehaviour.cs Script to any GameObject in your scene.

  4. Run the scene and the Beam Eye Tracker application will be launched when the game starts.

Note

This script makes the relevant API call at the first Update call it receives. If you want finer control over when in the game's lifetime the Beam Eye Tracker is started, refer to Launching the Beam Eye Tracker from your game or application for recommendations on user experience.

Accessing foveated rendering data

If your rendering pipeline supports foveated rendering, you can poll for the latest FoveatedRenderingState data from the BeamEyeTrackerInputDevice instance, as follows:

if (betInputDevice == null)
{
    return;
}
FoveatedRenderingState? foveatedRenderingState =
    betInputDevice.GetFoveatedRenderingState();

An example is available in the Runtime/Scripts/FoveatedRenderingLoggerBehaviour.cs Script which you can attach to any GameObject in your scene.
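The polling fragment above can be wrapped in a small behaviour along the lines of the packaged logger. This is a sketch; the fields of FoveatedRenderingState are not shown, and forwarding the data to a pipeline is left as a placeholder:

```csharp
using UnityEngine;

// Sketch of polling foveated rendering data each frame, similar in spirit to
// the packaged FoveatedRenderingLoggerBehaviour.
public class FoveatedRenderingPoller : BeamEyeTrackerMonoBehaviour
{
    private void Update()
    {
        if (betInputDevice == null)
        {
            return;
        }

        // Null when no foveated rendering data is currently available.
        FoveatedRenderingState? state = betInputDevice.GetFoveatedRenderingState();
        if (state.HasValue)
        {
            // Forward state.Value to your rendering pipeline here.
            Debug.Log(state.Value);
        }
    }
}
```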

Key classes

BeamEyeTrackerInputDevice

The BeamEyeTrackerInputDevice class is the main type which wraps around the Beam Eye Tracker API and exposes its functionality as an InputDevice from Unity’s Input System.

Besides the output data used to implement all the core features described previously, it also provides access to the viewportGazePosition and unifiedScreenGazePosition properties, which can be used to implement custom behaviours.
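A custom behaviour could read these two properties each frame, for example as below. The Vector2 control type and the coordinate conventions are assumptions; the property names come from the paragraph above:

```csharp
using UnityEngine;

// Sketch: reading the gaze properties from the input device in a custom behaviour.
public class GazeReader : BeamEyeTrackerMonoBehaviour
{
    private void Update()
    {
        if (betInputDevice == null || betInputDevice.trackingStatus.ReadValue() != 1)
        {
            return;
        }

        // viewportGazePosition: gaze relative to the game viewport.
        // unifiedScreenGazePosition: gaze across the unified screen space.
        Vector2 viewportGaze = betInputDevice.viewportGazePosition.ReadValue();
        Vector2 screenGaze = betInputDevice.unifiedScreenGazePosition.ReadValue();
        Debug.Log($"Viewport gaze: {viewportGaze}, screen gaze: {screenGaze}");
    }
}
```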

Warning

The BeamEyeTrackerInputDevice retrieves and updates the viewport geometry. However, inconsistent behaviour may occur in the Unity Editor when retrieving it: in play mode, you may need to click on the "Game" tab and keep the mouse still for a moment so that the viewport geometry is retrieved correctly.

BeamEyeTrackerMonoBehaviour

The BeamEyeTrackerMonoBehaviour class is a convenient base class which inherits from MonoBehaviour and exposes static instances of the BeamEyeTrackerInputDevice and BeamEyeTrackerControls through the betInputDevice and betControls properties, respectively.

Note

The betInputDevice property can be null if the BeamEyeTrackerInputDevice is not available or not yet initialized.