API reference

C++

class eyeware::TrackerClient

Class for connecting to the tracker server and retrieving its tracking results.

It establishes communication with the tracker server (Eyeware Beam application) and provides synchronous access to the data. These synchronous calls return the latest tracking results received from the server.

A network_error_handler callback can be provided. If nullptr is used (default), network errors are ignored and the network connection is reestablished automatically, when possible, by the TrackerClient instance. If a handler is given, it is called on errors (e.g., a timeout). In that case, the TrackerClient instance becomes invalid and must be recreated to reestablish the connection.

Public Functions

TrackerClient(std::function<void(const NetworkError&)> network_error_handler = nullptr, int network_connection_timeout_ms = DEFAULT_NETWORK_TIMEOUTS_IN_MS, int tracking_info_network_timeout_ms = DEFAULT_NETWORK_TIMEOUTS_IN_MS, int base_communication_port = DEFAULT_BASE_COMMUNICATION_PORT, const char *hostname = "127.0.0.1")
Parameters
  • network_error_handler – An optional callback function for handling server connection errors.

  • network_connection_timeout_ms – The time period (in ms) for an attempt to connect to the server, after which the network connection is treated as broken.

  • tracking_info_network_timeout_ms – The time period (in ms) for an attempt to obtain tracking info from the server, after which the network connection is treated as broken.

  • base_communication_port – Base connection port to the server. The instance may use base_communication_port+1 as well.

  • hostname – The hostname of the server to obtain tracking results from. Typically the server runs on the same PC, hence “127.0.0.1”.
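
For illustration, a minimal construction sketch; the include path is an assumption and may differ in your SDK installation:

#include <eyeware/tracker_client.h> // header path assumed; adjust to your SDK layout
#include <iostream>

int main() {
    // Optional handler: once it fires, this TrackerClient instance is invalid
    // and must be recreated to reestablish the connection.
    auto on_error = [](const eyeware::NetworkError &error) {
        std::cerr << "Tracker network error" << std::endl;
    };

    // All other parameters keep their defaults (timeouts, base port, localhost).
    eyeware::TrackerClient tracker(on_error);
    // ... use tracker ...
    return 0;
}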

ScreenGazeInfo get_screen_gaze_info() const

Retrieves the most recent screen gaze tracking result.

HeadPoseInfo get_head_pose_info() const

Retrieves the most recent head pose tracking result.

bool connected() const

Whether this client is currently connected to the tracker server or not.

Since

1.1.0
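
Taken together, a hedged polling sketch using connected() and the accessors above; the 30 ms sleep is an arbitrary rate, not an SDK requirement:

#include <eyeware/tracker_client.h> // header path assumed; adjust to your SDK layout
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    eyeware::TrackerClient tracker; // defaults: localhost, automatic reconnection

    while (true) {
        if (tracker.connected()) {
            // Synchronous accessors return the last results received from the server.
            eyeware::ScreenGazeInfo gaze = tracker.get_screen_gaze_info();
            if (!gaze.is_lost) {
                std::cout << "Gaze at (" << gaze.x << ", " << gaze.y << ")" << std::endl;
            }
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(30));
    }
}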

struct eyeware::HeadPoseInfo

Represents head pose information for a given time instant.

Public Members

AffineTransform3D transform

Head pose, defined at the nose tip, with respect to the World Coordinate System (WCS).

bool is_lost = true

Indicates whether tracking of the head is lost, i.e., if true, the user is not being tracked.

uint64_t track_session_uid = 0

ID of the current session of uninterrupted, consecutive tracking.
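
As an illustrative pattern (not an SDK API), the UID can be used to detect when a new uninterrupted session begins:

// Assume a TrackerClient instance named "tracker" is available.
// Reset per-session state whenever a new tracking session starts.
uint64_t last_session_uid = 0;

eyeware::HeadPoseInfo head = tracker.get_head_pose_info();
if (!head.is_lost && head.track_session_uid != last_session_uid) {
    last_session_uid = head.track_session_uid;
    // e.g., clear statistics accumulated for the previous session here
}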

struct eyeware::ScreenGazeInfo

Represents information about a person’s gaze intersection with a single screen, for a given time instant. Screen gaze coordinates are expressed in pixels with respect to the top-left corner of the screen.

Public Members

uint32_t screen_id = 0

ID of the screen, to differentiate screens in a multiscreen setup.

uint32_t x = 0

The horizontal screen coordinate for the gaze intersection.

uint32_t y = 0

The vertical screen coordinate for the gaze intersection.

TrackingConfidence confidence = TrackingConfidence::UNRELIABLE

The confidence of the tracking result.

bool is_lost = true

Tracking status indicating whether the other values are dependable.
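
A sketch of gating the coordinates on both fields, assuming the TrackingConfidence enumerators are ordered from UNRELIABLE (lowest) to HIGH; move_cursor is a hypothetical application function:

// Assume a TrackerClient instance named "tracker" is available.
eyeware::ScreenGazeInfo gaze = tracker.get_screen_gaze_info();

// Use the coordinates only when tracking is valid and confident enough.
if (!gaze.is_lost && gaze.confidence >= eyeware::TrackingConfidence::MEDIUM) {
    move_cursor(gaze.screen_id, gaze.x, gaze.y); // hypothetical application function
}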

enum eyeware::TrackingConfidence

Reliability measure for obtained tracking results.

Values:

enumerator UNRELIABLE
enumerator LOW
enumerator MEDIUM
enumerator HIGH
struct eyeware::AffineTransform3D

Representation of a 3D affine transform, composed of a rotation matrix R and a translation vector t, as A = [R | t], where

R = [c_00, c_01, c_02;
     c_10, c_11, c_12;
     c_20, c_21, c_22],

t = [c_03, c_13, c_23].

Public Members

Matrix3x3 rotation

Rotation matrix component.

Vector3D translation

Translation vector component.
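
For example, reading the nose-tip position and a rotation coefficient from a head pose result:

// Assume a TrackerClient instance named "tracker" is available.
eyeware::HeadPoseInfo head = tracker.get_head_pose_info();
if (!head.is_lost) {
    const eyeware::AffineTransform3D &pose = head.transform;
    // t = [c_03, c_13, c_23] is the nose-tip position in the WCS.
    float depth = pose.translation.z;
    // Rotation coefficient c_12 (row 1, column 2), row-major access.
    float c_12 = pose.rotation[1][2];
}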

using eyeware::Matrix3x3 = float[3][3]

A 3x3 matrix, implemented as an array of arrays (row-major).

Matrix3x3 my_matrix; // Assume a Matrix3x3 instance is available
int row = 1;
int col = 2;
float coefficient = my_matrix[row][col];

struct eyeware::Vector3D

Representation of a 3D vector or 3D point.

Public Members

float x = 0.0f

x coordinate.

float y = 0.0f

y coordinate.

float z = 0.0f

z coordinate.

Python

class eyeware.client.TrackerClient

Class for connecting to the tracker server and retrieving its tracking results.

It establishes communication with the tracker server (Eyeware Beam application) and provides synchronous access to the data. These synchronous calls return the latest tracking results received from the server.

A network_error_handler callback can be provided. If None is used (default), network errors are ignored and the network connection is reestablished automatically, when possible, by the TrackerClient instance. If a handler is given, it is called on errors (e.g., a timeout). In that case, the TrackerClient instance becomes invalid and must be recreated to reestablish the connection.

Parameters
  • network_error_handler (Callable) – An optional callback function for handling server connection errors.

  • network_connection_timeout_ms (int) – The time period (in ms) for an attempt to connect to the server, after which the network connection is treated as broken.

  • tracking_info_network_timeout_ms (int) – The time period (in ms) for an attempt to obtain tracking info from the server, after which the network connection is treated as broken.

  • base_communication_port (int) – Base connection port to the server. The instance may use base_communication_port+1 as well.

  • hostname (str) – The hostname of the server to obtain tracking results from. Typically the server runs on the same PC, hence “127.0.0.1”.
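
For illustration, a minimal construction sketch; the handler body is illustrative, and the callback is passed positionally:

from eyeware.client import TrackerClient

def on_network_error(error):
    # Once the handler fires, this TrackerClient instance is invalid
    # and must be recreated to reestablish the connection.
    print("Tracker network error:", error)

# All other parameters keep their defaults (timeouts, base port, localhost).
tracker = TrackerClient(on_network_error)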

property connected

Whether this client is currently connected to the tracker server or not.

New in version 1.1.0.

get_head_pose_info(self: eyeware.client.TrackerClient) → eyeware::HeadPoseInfo

Retrieves the most recent head pose tracking result.

get_screen_gaze_info(self: eyeware.client.TrackerClient) → eyeware::ScreenGazeInfo

Retrieves the most recent screen gaze tracking result.
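
Taken together, a hedged polling sketch using these accessors; the 30 Hz rate is an arbitrary choice, not an SDK requirement:

import time
from eyeware.client import TrackerClient

tracker = TrackerClient()  # defaults: localhost, automatic reconnection

while True:
    if tracker.connected:
        # Synchronous accessors return the last results received from the server.
        gaze = tracker.get_screen_gaze_info()
        if not gaze.is_lost:
            print("Gaze at (%d, %d) on screen %d" % (gaze.x, gaze.y, gaze.screen_id))
    time.sleep(1 / 30)  # arbitrary polling rate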

class eyeware.client.HeadPoseInfo

Represents head pose information for a given time instant.

property is_lost

Indicates whether tracking of the head is lost, i.e., if True, the user is not being tracked.

property track_session_uid

ID of the current session of uninterrupted, consecutive tracking.

property transform

Head pose, defined at the nose tip, with respect to the World Coordinate System (WCS).

class eyeware.client.ScreenGazeInfo

Represents information about a person’s gaze intersection with a single screen, for a given time instant. Screen gaze coordinates are expressed in pixels with respect to the top-left corner of the screen.

property confidence

The confidence of the tracking result.

property is_lost

Tracking status indicating whether the other values are dependable.

property screen_id

ID of the screen, to differentiate screens in a multiscreen setup.

property x

The horizontal screen coordinate for the gaze intersection.

property y

The vertical screen coordinate for the gaze intersection.

class eyeware.client.TrackingConfidence

Reliability measure for obtained tracking results.

Members:

UNRELIABLE

LOW

MEDIUM

HIGH
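
A sketch of filtering results by confidence; the MEDIUM threshold is an arbitrary example:

from eyeware.client import TrackerClient, TrackingConfidence

tracker = TrackerClient()
gaze = tracker.get_screen_gaze_info()

# Accept only valid, sufficiently confident results.
acceptable = (TrackingConfidence.MEDIUM, TrackingConfidence.HIGH)
if not gaze.is_lost and gaze.confidence in acceptable:
    print("Confident gaze:", gaze.x, gaze.y)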

class eyeware.client.AffineTransform3D

Representation of a 3D affine transform, composed of a rotation matrix R and a translation vector t, as A = [R | t], where

R = [c_00, c_01, c_02;
     c_10, c_11, c_12;
     c_20, c_21, c_22],

t = [c_03, c_13, c_23].

property rotation

Rotation matrix component.

property translation

Translation vector component.

class eyeware.client.Vector3D

Representation of a 3D vector or 3D point.

property x

x coordinate.

property y

y coordinate.

property z

z coordinate.

Note

Matrix and vector types, such as the rotation and translation properties of AffineTransform3D, can be converted to NumPy arrays efficiently. This is useful for consuming tracking data and coordinates in your application. Example:

import numpy as np

# Receive a HeadPoseInfo instance; its transform property is an AffineTransform3D
head_pose = tracker.get_head_pose_info()
# View the tracking information as standard NumPy arrays
rotation_numpy = np.array(head_pose.transform.rotation, copy=False)
translation_numpy = np.array(head_pose.transform.translation, copy=False)
# Now we can manipulate tracking information to do several things:
# draw tracking coordinates on the screen, save them for statistics/heatmaps,
# perform arithmetic operations on them, trigger interactive behaviors based on thresholds, etc.