Vizard 7 » Reference » Stereo & Displays » Display devices » OpenXR
7.7

OpenXR

The openxr module adds support for OpenXR-based displays and input devices. OpenXR is a standard supported by many popular VR runtimes (Oculus, SteamVR, Windows Mixed Reality, ...). The OpenXR standard defines a set of core features that all runtimes support, including rendering to an HMD and receiving data from input devices. A runtime may optionally support additional features through OpenXR extensions, such as eye tracking, face tracking, hand tracking, video passthrough, and more.

 

A collection of sample scripts showing how to use various OpenXR features can be found in the \examples\openxr\ folder.

Getting Started

Before using OpenXR with Vizard, make sure the VR runtime for your device is set as the active OpenXR runtime for your system. The table below describes how to set the active OpenXR runtime for some of the popular VR runtimes:

Runtime

Instructions

Oculus

  • Go to the Settings > General tab within the Oculus app
  • In the OpenXR Runtime section, click the Set Oculus as active button
  • Some OpenXR features might be experimental and only available to developers. If you have a developer account, you can enable additional features under the Settings > Beta tab.

SteamVR

  • Open the Settings dialog from the SteamVR menu
  • Make sure Advanced Settings is set to Show
  • Click on the Developer tab
  • Click the Set SteamVR as OpenXR Runtime button
Windows Mixed Reality

  • If Windows Mixed Reality is not the active OpenXR runtime, a warning will be displayed at the top of the Mixed Reality Portal app
  • Click the Fix it button at the top

Varjo

  • Go to the System tab within the Varjo Base app
  • Under the Compatibility section, make sure OpenXR is enabled

OpenXR Module

The openxr module contains the following attributes:

Attribute

Description

openxr.getClient()

Connect to the current OpenXR runtime and return a Client object if successful. Once connected, subsequent calls to this function will return the same object. The client object provides the main interface to the connected OpenXR session.

openxr.HMD(window=viz.MainWindow)

Create and return an HMD object used for managing the rendering to the connected OpenXR HMD. The specified window will be rendered to the HMD.

openxr.getHead()

Return the OpenXR sensor for the user's head.

openxr.getLeftHand() Return the OpenXR sensor for the user's left hand.
openxr.getRightHand() Return the OpenXR sensor for the user's right hand.
openxr.ControllerModel(userPath) Create a node for displaying the OpenXR provided controller model for the specified user path (i.e. openxr.USER_HAND_LEFT or openxr.USER_HAND_RIGHT). If the runtime does not support the controller model extension, then coordinate axes will be displayed instead.
openxr.ActionBinding(path, profile) Define an action binding with the specified input path and interaction profile. The action binding object can be used when creating or binding an action with the OpenXR Client object.
openxr.enableExtension(name) Specify an OpenXR extension name to enable when initializing the client. Must be called before calling openxr.getClient(). The openxr module will automatically enable known interaction profile extensions. If new interaction profile extensions are made available before the module can be updated, then this function can be used to manually enable those extensions.
openxr.config Object for specifying OpenXR configuration settings before initializing client.

openxr.SESSION_STATE_UNKNOWN

openxr.SESSION_STATE_IDLE

openxr.SESSION_STATE_READY

openxr.SESSION_STATE_SYNCHRONIZED

openxr.SESSION_STATE_VISIBLE

openxr.SESSION_STATE_FOCUSED

openxr.SESSION_STATE_STOPPING

openxr.SESSION_STATE_LOSS_PENDING

openxr.SESSION_STATE_EXITING

Possible OpenXR session states. These values are returned by the <client>.getSessionState() command and provided by the openxr.EVENT_SESSION_STATE_CHANGE event.

openxr.USER_HEAD

openxr.USER_HAND_LEFT

openxr.USER_HAND_RIGHT

openxr.USER_GAMEPAD

openxr.USER_TREADMILL

Identifiers for common OpenXR top level user paths.

openxr.PROFILE_SIMPLE_CONTROLLER

openxr.PROFILE_HTC_VIVE_CONTROLLER

openxr.PROFILE_HTC_VIVE_PRO

openxr.PROFILE_HTC_VIVE_COSMOS_CONTROLLER

openxr.PROFILE_VALVE_INDEX_CONTROLLER

openxr.PROFILE_OCULUS_GO_CONTROLLER

openxr.PROFILE_OCULUS_TOUCH_CONTROLLER

openxr.PROFILE_MICROSOFT_MIXED_REALITY_CONTROLLER

openxr.PROFILE_MICROSOFT_XBOX_CONTROLLER

openxr.PROFILE_HP_REVERB_G2_CONTROLLER

openxr.PROFILE_META_QUEST_PRO_CONTROLLER

Identifiers for common OpenXR interaction profiles.

openxr.REFERENCE_SPACE_VIEW

openxr.REFERENCE_SPACE_LOCAL

openxr.REFERENCE_SPACE_STAGE

openxr.REFERENCE_SPACE_UNBOUNDED_MSFT

openxr.REFERENCE_SPACE_LOCAL_FLOOR

Possible OpenXR reference space values. Can be used with the <client>.setReferenceSpace() command. Not all reference spaces may be supported by the runtime. The <client>.isReferenceSpaceSupported() command can be used to check which spaces are supported.

  • openxr.REFERENCE_SPACE_VIEW: Input poses will be provided relative to the viewer. This is generally not used.
  • openxr.REFERENCE_SPACE_LOCAL: Used to render seated-scale content that is not positioned relative to the physical floor. Local space will be supported by all runtimes.
  • openxr.REFERENCE_SPACE_STAGE: Used to render standing-scale or room-scale content that is relative to the physical floor. This is the default reference space if supported.
  • openxr.REFERENCE_SPACE_UNBOUNDED_MSFT: Used to render world-scale content that spans beyond the bounds of a single stage, for example, an entire floor or multiple floors of a building.
  • openxr.REFERENCE_SPACE_LOCAL_FLOOR: Local floor space is similar to local space, but the Y position is relative to the physical floor just like stage space.
openxr.EVENT_ACTION_CHANGE

Triggered when a previously created OpenXR action state changes. Provides a single event structure e with the following attributes:

  • e.action: Name of the action.
  • e.value: New value of the action. Type depends on action (bool, float, vec2).
openxr.EVENT_INTERACTION_PROFILE_CHANGE Triggered when the assigned interaction profile for one or more top level user paths changes. Provides a single event structure e with no attributes currently. You can use <client>.getInteractionProfile() to get the new interaction profile for the relevant user paths.
openxr.EVENT_SESSION_STATE_CHANGE

Triggered when the OpenXR session state changes. Provides a single event structure e with the following attributes:

  • e.state: New session state. See list of possible session states above.
openxr.EVENT_SESSION_LOST Triggered when the OpenXR session is lost. Typically occurs when a connection to the HMD is lost. Provides a single event structure e with no attributes currently.
openxr.EVENT_SESSION_RESTART Triggered when the OpenXR session is restarted after previously being lost. Provides a single event structure e with no attributes currently.
openxr.EVENT_PROJECTION_CHANGE Triggered when the provided OpenXR projection matrix for the display changes. Provides a single event structure e with no attributes currently.
openxr.onActionChange(name, func, *args, **kw) Register a function to be called when the action with the specified name changes. The action change event will be passed to the provided function, along with the specified arguments.
openxr.onSessionStateChange(state, func, *args, **kw) Register a function to be called when the session state changes to the specified state. The session state change event will be passed to the provided function, along with the specified arguments.
openxr.waitActionChange(name) Returns a viztask.Condition that waits for the action with the specified name to change.
openxr.waitSessionStateChange(state) Returns a viztask.Condition that waits for the session state to change to the specified state.
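Putting these attributes together, a minimal Vizard script might look like the following sketch. It assumes a connected OpenXR HMD; the action name ('select') and input path are illustrative, and since this reference does not state whether openxr.HMD drives the viewpoint automatically, the head sensor is linked explicitly.

```python
import viz
import openxr

viz.go()

# Connect to the active OpenXR runtime. Subsequent calls return the same client.
client = openxr.getClient()

# Render the main window to the connected HMD and drive the viewpoint
# from the head sensor.
hmd = openxr.HMD(window=viz.MainWindow)
viz.link(openxr.getHead(), viz.MainView)

# Display the runtime-provided controller model at each hand.
for userPath, sensor in [(openxr.USER_HAND_LEFT, openxr.getLeftHand()),
                         (openxr.USER_HAND_RIGHT, openxr.getRightHand())]:
    viz.link(sensor, openxr.ControllerModel(userPath))

# Bind an illustrative boolean action and react to changes.
binding = openxr.ActionBinding('/user/hand/right/input/trigger/click',
                               openxr.PROFILE_HTC_VIVE_CONTROLLER)
client.addBoolAction('select', binding=binding)
openxr.onActionChange('select', lambda e: print('select =', e.value))
```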

OpenXR Client

The OpenXR client object contains the following methods:

Method

Description

<client>.getHead()

Return the OpenXR sensor for the user's head.

<client>.getLeftHand() Return the OpenXR sensor for the user's left hand aim pose.
<client>.getRightHand() Return the OpenXR sensor for the user's right hand aim pose.
<client>.getLeftHandGrip() Return the OpenXR sensor for the user's left hand grip pose.
<client>.getRightHandGrip() Return the OpenXR sensor for the user's right hand grip pose.
<client>.getLeftHandPalm() Return the OpenXR sensor for the user's left hand palm pose.
<client>.getRightHandPalm() Return the OpenXR sensor for the user's right hand palm pose.
<client>.getEyeTracker() Returns an OpenXR eye tracker object or None if the current runtime does not support the eye tracking extension.
<client>.getHandTracker() Returns an OpenXR hand tracker object or None if the current runtime does not support the hand tracking extension.
<client>.getBodyTrackerFB() Returns an OpenXR Facebook body tracker object or None if the current runtime does not support the Facebook body tracking extension.
<client>.getFaceTrackerFB() Returns an OpenXR Facebook face tracker object or None if the current runtime does not support the Facebook face tracking extension.
<client>.getFaceTrackerHTC() Returns an OpenXR HTC face tracker object or None if the current runtime does not support the HTC face tracking extension.
<client>.getPassthroughFB() Returns an OpenXR Facebook passthrough object or None if the current runtime does not support the Facebook passthrough extension.
<client>.getPassthroughVarjo() Returns an OpenXR Varjo passthrough object or None if the current runtime does not support the Varjo passthrough extension.

<client>.addBoolAction(name, binding=None)

Add an input action with a boolean (True/False) value. name must be a unique and valid action name. binding can be an openxr.ActionBinding object or list of openxr.ActionBinding objects. Returns True/False whether the action was successfully added.

<client>.addFloatAction(name, binding=None)

Add an input action with a single floating point value. name must be a unique and valid action name. binding can be an openxr.ActionBinding object or list of openxr.ActionBinding objects. Returns True/False whether the action was successfully added.

<client>.addVec2Action(name, binding=None)

Add an input action with a [x,y] vector value. name must be a unique and valid action name. binding can be an openxr.ActionBinding object or list of openxr.ActionBinding objects. Returns True/False whether the action was successfully added.

<client>.addPoseAction(name, binding=None)

Add an input action with a 6DOF pose value. name must be a unique and valid action name. binding can be an openxr.ActionBinding object or list of openxr.ActionBinding objects. Returns an OpenXR Sensor object if successfully added.

<client>.addVibrationAction(name, binding=None)

Add an output vibration action. name must be a unique and valid action name. binding can be an openxr.ActionBinding object or list of openxr.ActionBinding objects. Returns True/False whether the action was successfully added.

<client>.addActionBinding(name, binding) Specify bindings for the previously created action name. binding can be an openxr.ActionBinding object or list of openxr.ActionBinding objects.
<client>.isActionActive(name) Returns True/False whether the action name is actively being updated by a bound input source. If the current OpenXR session is not in a focused state, then this will always return False.
<client>.getActionState(name) Return the current state of the specified input action name. The return value type depends on the type of the action (bool, float, vec2).

<client>.setVibration(name, duration=viz.AUTO_COMPUTE, frequency=viz.AUTO_COMPUTE, amplitude=1.0)

Set the state of the previously created vibration action name.

  • duration is the length of the vibration in seconds, or the minimum supported length if viz.AUTO_COMPUTE. A duration value of 0 will stop the current vibration.
  • frequency is the frequency of the vibration in Hz, or the default runtime value if viz.AUTO_COMPUTE.
  • amplitude is the amplitude of the vibration between 0.0 and 1.0.

<client>.setVibrationLeftHand(duration=viz.AUTO_COMPUTE, frequency=viz.AUTO_COMPUTE, amplitude=1.0)

Set the vibration of the left hand controller. See <client>.setVibration above for a description of the parameters.

<client>.setVibrationRightHand(duration=viz.AUTO_COMPUTE, frequency=viz.AUTO_COMPUTE, amplitude=1.0)

Set the vibration of the right hand controller. See <client>.setVibration above for a description of the parameters.
<client>.getInteractionProfile(userPath) Get the current interaction profile for the specified top level user path. This value typically will not be available from the runtime until the OpenXR session has focus. You can use openxr.EVENT_INTERACTION_PROFILE_CHANGE to get notified when the interaction profile has been updated.
<client>.setReferenceSpace(space) Set the active reference space for input poses. See the list of possible reference spaces above. Not every reference space is guaranteed to be supported by the runtime; only openxr.REFERENCE_SPACE_LOCAL is required to be supported by all runtimes. Returns True/False whether the specified reference space was applied.
<client>.getReferenceSpace() Get the active reference space for input poses. The default reference space is openxr.REFERENCE_SPACE_STAGE.
<client>.isReferenceSpaceSupported(space) Return whether the specified reference space is supported by the runtime.
<client>.getSessionState() Get current OpenXR session state. See the list of possible session states above.
<client>.getSystemInfo()

Get information regarding the OpenXR runtime system. Returns a viz.Data object with the following attributes:

  • runtimeName: Name of runtime
  • runtimeVersion: Version of runtime as a string
  • systemName: Name of the system/device
  • vendorId: A unique identifier of the vendor of the system
<client>.getExtensionList() Get list of supported OpenXR extension names from runtime.
<client>.isExtensionSupported(extension) Return whether the specified OpenXR extension name is supported by the runtime.
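As a sketch of the action API above (the action names and input paths here are illustrative; consult the interaction profile reported by the runtime for the paths your device actually exposes):

```python
import viz
import openxr

viz.go()
client = openxr.getClient()

# A float action bound to the right trigger of an Index controller.
client.addFloatAction('grab', binding=openxr.ActionBinding(
    '/user/hand/right/input/trigger/value',
    openxr.PROFILE_VALVE_INDEX_CONTROLLER))

# A vec2 action bound to the right thumbstick.
client.addVec2Action('move', binding=openxr.ActionBinding(
    '/user/hand/right/input/thumbstick',
    openxr.PROFILE_VALVE_INDEX_CONTROLLER))

def onGrab(e):
    # Pulse the controller once the trigger passes the halfway point.
    if client.isActionActive('grab') and e.value > 0.5:
        client.setVibrationRightHand(duration=0.1, amplitude=0.8)

openxr.onActionChange('grab', onGrab)
```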

OpenXR HMD

The openxr.HMD object contains the following methods:

Method

Description

<hmd>.getSensor()

Get the head sensor associated with the HMD.

<hmd>.setMonoMirror(mode) Set whether the HMD is mirrored to the window in mono. mode can be True, False, or viz.TOGGLE. Disabled by default.
<hmd>.getMonoMirror() Get whether the HMD is mirrored to the window in mono.
<hmd>.setVisibilityMask(mode) Set whether the visibility mask is enabled for the HMD. The visibility mask is used to mask out portions of the screen that won't be visible to the user in the HMD due to distortion correction. It can help increase performance in certain cases. mode can be True, False, or viz.TOGGLE. Disabled by default. The visibility mask feature is an optional OpenXR extension, and therefore might not be supported by the runtime.
<hmd>.getVisibilityMask() Get whether the visibility mask is enabled for the HMD.
<hmd>.setWindow(window) Set the window used to render to the HMD.
<hmd>.getWindow() Get the window used to render to the HMD.

<hmd>.remove()

Remove the HMD object and resources.
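For example, to render to the HMD and mirror a mono view to the desktop window (a minimal sketch):

```python
import viz
import openxr

viz.go()
hmd = openxr.HMD(window=viz.MainWindow)

# Drive the viewpoint from the HMD's head sensor.
viz.link(hmd.getSensor(), viz.MainView)

# Mirror a mono view of the HMD to the desktop window.
hmd.setMonoMirror(True)
```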

OpenXR Sensor

The OpenXR sensor object provides the position and orientation data associated with a bound OpenXR input pose. The sensor has the following methods in addition to the standard extension sensor methods:

Method

Description

<sensor>.isPoseActive() Returns True/False whether the pose is actively being updated by a bound input source. If the current OpenXR session is not in a focused state, then this will always return False.

<sensor>.getLinearVelocity()

Get the linear velocity of the input source in meters/second. Might not be supported by certain devices or runtimes.

<sensor>.getAngularVelocity()

Get the angular velocity of the input source in radians/second. Might not be supported by certain devices or runtimes.

<sensor>.getInputState(inputId) Return the current state of the specified input Id. The return value type depends on the type of the input action (bool, float, vec2). This is only available for the default left and right hand sensors and for known controller input Ids.

<sensor>.setVibration(duration=viz.AUTO_COMPUTE, frequency=viz.AUTO_COMPUTE, amplitude=1.0)

Set the vibration of the sensor. See <client>.setVibration above for a description of the parameters. This is only available for the default left and right hand sensors.
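A sketch of polling a hand sensor each frame; the 'trigger' input Id is illustrative, and velocity data might not be available on every runtime:

```python
import viz
import vizact
import openxr

viz.go()
hand = openxr.getRightHand()

def update():
    if hand.isPoseActive():
        pos = hand.getPosition()                 # standard sensor method
        vel = hand.getLinearVelocity()           # m/s, may be unsupported
        trigger = hand.getInputState('trigger')  # illustrative input Id
        print(pos, vel, trigger)

vizact.ontimer(0, update)  # run every frame
```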

OpenXR Eye Tracker

Full sample scripts for using the OpenXR eye tracker can be found at \examples\openxr\xrEyeTracker.py and \examples\devices\openxrEyeExample.py

 

If supported by the runtime, the OpenXR eye tracker object provides both position and orientation data representing the gaze origin and direction relative to the HMD coordinate frame. By default, the combined gaze value from both eyes is returned. Some devices provide separate gaze values per eye, which can be accessed by passing the viz.LEFT_EYE or viz.RIGHT_EYE flag to the corresponding sensor function. The eye tracker object has the following methods in addition to the standard extension sensor methods:

Method

Description

<EyeTracker>.isGazeActive(eye=viz.BOTH_EYE) Return True/False whether the eye gaze is actively being tracked for the specified eye. If the current OpenXR session is not in a focused state, then this will always return False.
<EyeTracker>.getGazeConfidence(eye=viz.BOTH_EYE) Return the [0-1] confidence value for the gaze pose of the specified eye.
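Since the gaze pose is relative to the HMD coordinate frame, it must be transformed by the head pose to obtain a world-space gaze ray. A sketch, assuming Vizard's row-vector matrix convention (local * parent):

```python
import viz
import openxr

viz.go()
eyeTracker = openxr.getClient().getEyeTracker()

def getWorldGaze():
    """Return (origin, direction) of the combined gaze in world
    coordinates, or None if gaze is not currently tracked."""
    if eyeTracker is None or not eyeTracker.isGazeActive():
        return None
    gazeMat = eyeTracker.getMatrix()    # gaze pose in HMD frame
    headMat = viz.MainView.getMatrix()  # HMD pose in world frame
    m = gazeMat * headMat
    return m.getPosition(), m.getForward()
```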

OpenXR Hand Tracker

A full sample script for using the OpenXR hand tracker can be found at \examples\openxr\xrHandTracker.py

 

If supported by the runtime, the OpenXR hand tracker interface provides the following methods:

Method

Description

<HandTracker>.getLeftHand() Returns the hand sensor for the left hand, if supported.

<HandTracker>.getRightHand()

Returns the hand sensor for the right hand, if supported.

The hand sensor has the following methods in addition to the standard extension sensor methods:

Method

Description

<sensor>.JOINT_PALM

<sensor>.JOINT_WRIST

<sensor>.JOINT_THUMB_METACARPAL

<sensor>.JOINT_THUMB_PROXIMAL

<sensor>.JOINT_THUMB_DISTAL

<sensor>.JOINT_THUMB_TIP

<sensor>.JOINT_INDEX_METACARPAL

<sensor>.JOINT_INDEX_PROXIMAL

<sensor>.JOINT_INDEX_INTERMEDIATE

<sensor>.JOINT_INDEX_DISTAL

<sensor>.JOINT_INDEX_TIP

<sensor>.JOINT_MIDDLE_METACARPAL

<sensor>.JOINT_MIDDLE_PROXIMAL

<sensor>.JOINT_MIDDLE_INTERMEDIATE

<sensor>.JOINT_MIDDLE_DISTAL

<sensor>.JOINT_MIDDLE_TIP

<sensor>.JOINT_RING_METACARPAL

<sensor>.JOINT_RING_PROXIMAL

<sensor>.JOINT_RING_INTERMEDIATE

<sensor>.JOINT_RING_DISTAL

<sensor>.JOINT_RING_TIP

<sensor>.JOINT_LITTLE_METACARPAL

<sensor>.JOINT_LITTLE_PROXIMAL

<sensor>.JOINT_LITTLE_INTERMEDIATE

<sensor>.JOINT_LITTLE_DISTAL

<sensor>.JOINT_LITTLE_TIP

Joint identifiers that can be passed to the standard extension sensor methods when accessing the position or orientation. The joint position/orientation values will be relative to the wrist of the hand. See the OpenXR hand tracker extension page for the coordinate system of all the joints.

<sensor>.getJointRadius(joint)

Get radius (in meters) of the specified joint.

<sensor>.getJointParent(joint) Get parent joint of specified joint or zero if joint does not have a parent.
<sensor>.getJointLocalMatrix(joint) Get local transform matrix of specified joint relative to the joint parent.
<sensor>.getAimStateFB() Return the Facebook aim state sensor for the hand, if supported by runtime.
<sensor>.getHand()

Get the hand identifier that the sensor represents. Can be one of the following:

  • <sensor>.HAND_LEFT
  • <sensor>.HAND_RIGHT
<sensor>.isHandActive() Return True/False whether the hand is actively being tracked. If the current OpenXR session is not in a focused state, then this will always return False.
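For example, reading a joint pose each frame (a sketch; this reference states that joint identifiers are passed to the standard sensor methods, so getPosition is assumed to accept one):

```python
import viz
import vizact
import openxr

viz.go()
handTracker = openxr.getClient().getHandTracker()

def update():
    hand = handTracker.getRightHand()
    if hand is not None and hand.isHandActive():
        # Joint poses are relative to the wrist of the hand.
        tip = hand.getPosition(hand.JOINT_INDEX_TIP)
        radius = hand.getJointRadius(hand.JOINT_INDEX_TIP)
        print(tip, radius)

if handTracker is not None:
    vizact.ontimer(0, update)
```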

A full sample script for using the OpenXR Facebook hand tracker aim state can be found at \examples\openxr\xrHandTrackerAimFB.py

 

If supported, the Facebook aim state sensor provides the aim position and orientation of the hand through the standard extension sensor methods. It also provides the following additional methods:

Method

Description

<aimState>.AIM_COMPUTED

Aiming data is computed from additional sources beyond the hand data.
<aimState>.AIM_VALID Aiming data is valid.
<aimState>.AIM_INDEX_PINCHING Index finger pinch discrete signal.
<aimState>.AIM_MIDDLE_PINCHING Middle finger pinch discrete signal.
<aimState>.AIM_RING_PINCHING Ring finger pinch discrete signal.
<aimState>.AIM_LITTLE_PINCHING Little finger pinch discrete signal.
<aimState>.AIM_SYSTEM_GESTURE System gesture is active.
<aimState>.AIM_DOMINANT_HAND Hand is currently marked as dominant for the system.
<aimState>.AIM_MENU_PRESSED System menu gesture is active.

<aimState>.getStatus()

Get the aim state status bit mask. See the possible status flags above.

<aimState>.getPinchStrengthIndex() Get [0-1] pinch strength of index finger.
<aimState>.getPinchStrengthMiddle() Get [0-1] pinch strength of middle finger.
<aimState>.getPinchStrengthRing() Get [0-1] pinch strength of ring finger.
<aimState>.getPinchStrengthLittle()

Get [0-1] pinch strength of little finger.
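A sketch of reading the aim state each frame, assuming the AIM_* identifiers are the bit values of the status mask returned by getStatus():

```python
import viz
import vizact
import openxr

viz.go()
handTracker = openxr.getClient().getHandTracker()
aimState = handTracker.getRightHand().getAimStateFB() if handTracker else None

def update():
    status = aimState.getStatus()
    # Only trust the pinch signals when the aiming data is valid.
    if status & aimState.AIM_VALID and status & aimState.AIM_INDEX_PINCHING:
        print('index pinch, strength', aimState.getPinchStrengthIndex())

if aimState is not None:
    vizact.ontimer(0, update)
```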

The openxr.HandTrackerJointModel() object can be used to create a node for displaying joint data from an OpenXR hand sensor. The object provides the following methods in addition to the standard Vizard node methods:

Method

Description

<model>.__init__(tracker, pointColor=viz.CYAN, pointSize=0.002, lineColor=viz.MAGENTA, lineWidth=2.0, **kw)

Create the hand tracker joint model from the specified tracker sensor. The model will display a point at each joint using the specified pointColor and pointSize, and render lines between connected joints using the specified lineColor and lineWidth. Additional kw arguments will be passed to the base node class.

<model>.getTracker()

Get the hand tracker sensor attached to the model.

<model>.getRootNodePoints()

Get the root node for the joint points.

<model>.getRootNodeLines() Get the root node for the joint lines.

The openxr.HandTrackerMeshModel() object can be used to create a node for displaying the OpenXR runtime provided hand tracking mesh model for an OpenXR hand sensor. If the runtime does not provide a hand tracking mesh, a default hand model will be used. The object provides the following methods in addition to the standard Vizard node methods:

Method

Description

<model>.__init__(tracker, **kw)

Create the hand tracker mesh model from the specified tracker sensor. Additional kw arguments will be passed to the base node class.

<model>.getTracker()

Get the hand tracker sensor attached to the model.
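For example, to visualize both hands (a sketch assuming the runtime supports hand tracking):

```python
import viz
import openxr

viz.go()
handTracker = openxr.getClient().getHandTracker()

if handTracker is not None:
    # Debug view of the raw joint data for the left hand.
    joints = openxr.HandTrackerJointModel(handTracker.getLeftHand())
    # Runtime-provided mesh for the right hand (falls back to a
    # default hand model if the runtime provides none).
    mesh = openxr.HandTrackerMeshModel(handTracker.getRightHand())
```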

OpenXR Facebook Body Tracker

A full sample script for using the OpenXR Facebook body tracker can be found at \examples\openxr\xrBodyTrackerFB.py

 

If supported by the runtime, the OpenXR Facebook body tracker interface provides the following methods in addition to the standard extension sensor methods:

Method

Description

<tracker>.JOINT_ROOT

<tracker>.JOINT_HIPS

<tracker>.JOINT_SPINE_LOWER

<tracker>.JOINT_SPINE_MIDDLE

<tracker>.JOINT_SPINE_UPPER

<tracker>.JOINT_CHEST

<tracker>.JOINT_NECK

<tracker>.JOINT_HEAD

<tracker>.JOINT_LEFT_SHOULDER

<tracker>.JOINT_LEFT_SCAPULA

<tracker>.JOINT_LEFT_ARM_UPPER

<tracker>.JOINT_LEFT_ARM_LOWER

<tracker>.JOINT_LEFT_HAND_WRIST_TWIST

<tracker>.JOINT_RIGHT_SHOULDER

<tracker>.JOINT_RIGHT_SCAPULA

<tracker>.JOINT_RIGHT_ARM_UPPER

<tracker>.JOINT_RIGHT_ARM_LOWER

<tracker>.JOINT_RIGHT_HAND_WRIST_TWIST

<tracker>.JOINT_LEFT_HAND_PALM

<tracker>.JOINT_LEFT_HAND_WRIST

<tracker>.JOINT_LEFT_HAND_THUMB_METACARPAL

<tracker>.JOINT_LEFT_HAND_THUMB_PROXIMAL

<tracker>.JOINT_LEFT_HAND_THUMB_DISTAL

<tracker>.JOINT_LEFT_HAND_THUMB_TIP

<tracker>.JOINT_LEFT_HAND_INDEX_METACARPAL

<tracker>.JOINT_LEFT_HAND_INDEX_PROXIMAL

<tracker>.JOINT_LEFT_HAND_INDEX_INTERMEDIATE

<tracker>.JOINT_LEFT_HAND_INDEX_DISTAL

<tracker>.JOINT_LEFT_HAND_INDEX_TIP

<tracker>.JOINT_LEFT_HAND_MIDDLE_METACARPAL

<tracker>.JOINT_LEFT_HAND_MIDDLE_PROXIMAL

<tracker>.JOINT_LEFT_HAND_MIDDLE_INTERMEDIATE

<tracker>.JOINT_LEFT_HAND_MIDDLE_DISTAL

<tracker>.JOINT_LEFT_HAND_MIDDLE_TIP

<tracker>.JOINT_LEFT_HAND_RING_METACARPAL

<tracker>.JOINT_LEFT_HAND_RING_PROXIMAL

<tracker>.JOINT_LEFT_HAND_RING_INTERMEDIATE

<tracker>.JOINT_LEFT_HAND_RING_DISTAL

<tracker>.JOINT_LEFT_HAND_RING_TIP

<tracker>.JOINT_LEFT_HAND_LITTLE_METACARPAL

<tracker>.JOINT_LEFT_HAND_LITTLE_PROXIMAL

<tracker>.JOINT_LEFT_HAND_LITTLE_INTERMEDIATE

<tracker>.JOINT_LEFT_HAND_LITTLE_DISTAL

<tracker>.JOINT_LEFT_HAND_LITTLE_TIP

<tracker>.JOINT_RIGHT_HAND_PALM

<tracker>.JOINT_RIGHT_HAND_WRIST

<tracker>.JOINT_RIGHT_HAND_THUMB_METACARPAL

<tracker>.JOINT_RIGHT_HAND_THUMB_PROXIMAL

<tracker>.JOINT_RIGHT_HAND_THUMB_DISTAL

<tracker>.JOINT_RIGHT_HAND_THUMB_TIP

<tracker>.JOINT_RIGHT_HAND_INDEX_METACARPAL

<tracker>.JOINT_RIGHT_HAND_INDEX_PROXIMAL

<tracker>.JOINT_RIGHT_HAND_INDEX_INTERMEDIATE

<tracker>.JOINT_RIGHT_HAND_INDEX_DISTAL

<tracker>.JOINT_RIGHT_HAND_INDEX_TIP

<tracker>.JOINT_RIGHT_HAND_MIDDLE_METACARPAL

<tracker>.JOINT_RIGHT_HAND_MIDDLE_PROXIMAL

<tracker>.JOINT_RIGHT_HAND_MIDDLE_INTERMEDIATE

<tracker>.JOINT_RIGHT_HAND_MIDDLE_DISTAL

<tracker>.JOINT_RIGHT_HAND_MIDDLE_TIP

<tracker>.JOINT_RIGHT_HAND_RING_METACARPAL

<tracker>.JOINT_RIGHT_HAND_RING_PROXIMAL

<tracker>.JOINT_RIGHT_HAND_RING_INTERMEDIATE

<tracker>.JOINT_RIGHT_HAND_RING_DISTAL

<tracker>.JOINT_RIGHT_HAND_RING_TIP

<tracker>.JOINT_RIGHT_HAND_LITTLE_METACARPAL

<tracker>.JOINT_RIGHT_HAND_LITTLE_PROXIMAL

<tracker>.JOINT_RIGHT_HAND_LITTLE_INTERMEDIATE

<tracker>.JOINT_RIGHT_HAND_LITTLE_DISTAL

<tracker>.JOINT_RIGHT_HAND_LITTLE_TIP

Joint identifiers that can be passed to the standard extension sensor methods when accessing the position or orientation. The joint position/orientation values will be relative to the root joint.

<tracker>.getJointParent(joint)

Get parent joint of specified joint or zero if joint does not have a parent.

<tracker>.getJointLocalMatrix(joint) Get local transform matrix of specified joint relative to the joint parent.
<tracker>.getJointBindMatrix(joint) Get bind matrix (T-Pose) of specified joint relative to the root joint.
<tracker>.getBodyConfidence() Get the [0,1] confidence of the computed body pose.
<tracker>.isBodyActive() Return True/False whether the body is actively being tracked. If the current OpenXR session is not in a focused state, then this will always return False.
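A sketch of polling the body tracker (as with the hand tracker, joint identifiers are assumed to be accepted by the standard sensor methods):

```python
import viz
import vizact
import openxr

viz.go()
bodyTracker = openxr.getClient().getBodyTrackerFB()

def update():
    if bodyTracker.isBodyActive():
        # Joint poses are relative to the root joint.
        head = bodyTracker.getPosition(bodyTracker.JOINT_HEAD)
        print(head, 'confidence', bodyTracker.getBodyConfidence())

if bodyTracker is not None:
    vizact.ontimer(0, update)
```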

The openxr.BodyTrackerFBJointModel() object can be used to create a node for displaying joint data from an OpenXR Facebook body tracker. The object provides the following methods in addition to the standard Vizard node methods:

Method

Description

<model>.__init__(tracker, pointColor=viz.CYAN, pointSize=0.002, lineColor=viz.MAGENTA, lineWidth=2.0, **kw)

Create the body tracker joint model from the specified Facebook body tracker. The model will display a point at each joint using the specified pointColor and pointSize, and render lines between connected joints using the specified lineColor and lineWidth. Additional kw arguments will be passed to the base node class.

<model>.getTracker()

Get the Facebook body tracker attached to the model.

<model>.getRootNodePoints()

Get the root node for the joint points.

<model>.getRootNodeLines() Get the root node for the joint lines.

The openxr.BodyTrackerFBAvatarModel() object can be used to apply joint data from OpenXR FB body tracker to an avatar model. The object provides the following methods in addition to the standard Vizard node methods:

Method

Description

<model>.__init__(tracker, avatar, applyHipPosition=True, applyRootPosition=True, applyEyeTracker=True, **kw)

Create the body tracker avatar model from the specified Facebook body tracker.

 

avatar must be either a valid avatar model filename or existing avatar node.

 

applyHipPosition specifies whether the position of the hip joint is applied to the avatar. The hip position contains the user's vertical movement. If you have an avatar with legs and would like to keep the legs above the ground, you can disable this.

 

applyRootPosition specifies whether the position of the root joint is applied to the avatar. The root joint contains the user's movement along the ground plane. If you want to keep the avatar at a fixed location regardless of the user's physical movement, you can disable this.

 

applyEyeTracker specifies whether to apply the openxr.getEyeTracker() sensor data to the avatar's eye bones. If the runtime does not support eye tracking or the avatar does not have eye bones, then this setting will be ignored.

 

Additional kw arguments will be passed to the base node class.

<model>.getTracker()

Get the Facebook body tracker attached to the model.

<model>.getAvatar()

Get the underlying avatar node being animated.

<model>.getJointBone(joint) Get the avatar bone associated with the specified body joint or None if joint is not associated with a bone.
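For example, to drive an avatar from the body tracker (a sketch; 'vcc_male2.cfg' is one of the avatar models bundled with Vizard):

```python
import viz
import openxr

viz.go()
bodyTracker = openxr.getClient().getBodyTrackerFB()

if bodyTracker is not None:
    # Keep the avatar at a fixed location by not applying root movement.
    model = openxr.BodyTrackerFBAvatarModel(bodyTracker, 'vcc_male2.cfg',
                                            applyRootPosition=False)
```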

OpenXR Facebook Face Tracker

A full sample script for using the OpenXR Facebook face tracker can be found at \examples\openxr\xrFaceTrackerFB.py

 

If supported by the runtime, the OpenXR Facebook face tracker interface provides the following methods:

Method

Description

<tracker>.EXPRESSION_BROW_LOWERER_L

<tracker>.EXPRESSION_BROW_LOWERER_R

<tracker>.EXPRESSION_CHEEK_PUFF_L

<tracker>.EXPRESSION_CHEEK_PUFF_R

<tracker>.EXPRESSION_CHEEK_RAISER_L

<tracker>.EXPRESSION_CHEEK_RAISER_R

<tracker>.EXPRESSION_CHEEK_SUCK_L

<tracker>.EXPRESSION_CHEEK_SUCK_R

<tracker>.EXPRESSION_CHIN_RAISER_B

<tracker>.EXPRESSION_CHIN_RAISER_T

<tracker>.EXPRESSION_DIMPLER_L

<tracker>.EXPRESSION_DIMPLER_R

<tracker>.EXPRESSION_EYES_CLOSED_L

<tracker>.EXPRESSION_EYES_CLOSED_R

<tracker>.EXPRESSION_EYES_LOOK_DOWN_L

<tracker>.EXPRESSION_EYES_LOOK_DOWN_R

<tracker>.EXPRESSION_EYES_LOOK_LEFT_L

<tracker>.EXPRESSION_EYES_LOOK_LEFT_R

<tracker>.EXPRESSION_EYES_LOOK_RIGHT_L

<tracker>.EXPRESSION_EYES_LOOK_RIGHT_R

<tracker>.EXPRESSION_EYES_LOOK_UP_L

<tracker>.EXPRESSION_EYES_LOOK_UP_R

<tracker>.EXPRESSION_INNER_BROW_RAISER_L

<tracker>.EXPRESSION_INNER_BROW_RAISER_R

<tracker>.EXPRESSION_JAW_DROP

<tracker>.EXPRESSION_JAW_SIDEWAYS_LEFT

<tracker>.EXPRESSION_JAW_SIDEWAYS_RIGHT

<tracker>.EXPRESSION_JAW_THRUST

<tracker>.EXPRESSION_LID_TIGHTENER_L

<tracker>.EXPRESSION_LID_TIGHTENER_R

<tracker>.EXPRESSION_LIP_CORNER_DEPRESSOR_L

<tracker>.EXPRESSION_LIP_CORNER_DEPRESSOR_R

<tracker>.EXPRESSION_LIP_CORNER_PULLER_L

<tracker>.EXPRESSION_LIP_CORNER_PULLER_R

<tracker>.EXPRESSION_LIP_FUNNELER_LB

<tracker>.EXPRESSION_LIP_FUNNELER_LT

<tracker>.EXPRESSION_LIP_FUNNELER_RB

<tracker>.EXPRESSION_LIP_FUNNELER_RT

<tracker>.EXPRESSION_LIP_PRESSOR_L

<tracker>.EXPRESSION_LIP_PRESSOR_R

<tracker>.EXPRESSION_LIP_PUCKER_L

<tracker>.EXPRESSION_LIP_PUCKER_R

<tracker>.EXPRESSION_LIP_STRETCHER_L

<tracker>.EXPRESSION_LIP_STRETCHER_R

<tracker>.EXPRESSION_LIP_SUCK_LB

<tracker>.EXPRESSION_LIP_SUCK_LT

<tracker>.EXPRESSION_LIP_SUCK_RB

<tracker>.EXPRESSION_LIP_SUCK_RT

<tracker>.EXPRESSION_LIP_TIGHTENER_L

<tracker>.EXPRESSION_LIP_TIGHTENER_R

<tracker>.EXPRESSION_LIPS_TOWARD

<tracker>.EXPRESSION_LOWER_LIP_DEPRESSOR_L

<tracker>.EXPRESSION_LOWER_LIP_DEPRESSOR_R

<tracker>.EXPRESSION_MOUTH_LEFT

<tracker>.EXPRESSION_MOUTH_RIGHT

<tracker>.EXPRESSION_NOSE_WRINKLER_L

<tracker>.EXPRESSION_NOSE_WRINKLER_R

<tracker>.EXPRESSION_OUTER_BROW_RAISER_L

<tracker>.EXPRESSION_OUTER_BROW_RAISER_R

<tracker>.EXPRESSION_UPPER_LID_RAISER_L

<tracker>.EXPRESSION_UPPER_LID_RAISER_R

<tracker>.EXPRESSION_UPPER_LIP_RAISER_L

<tracker>.EXPRESSION_UPPER_LIP_RAISER_R

All the supported expression identifiers. These can be used as indices to access the expression weights from the list returned by <tracker>.getExpressionList().

<tracker>.CONFIDENCE_LOWER_FACE
<tracker>.CONFIDENCE_UPPER_FACE

All the supported confidence identifiers. These can be used as indices to access the confidence values from the list.

<tracker>.getExpression(expression) Get the specified expression weight, [0,1].
<tracker>.getExpressionList() Get the list of all expression weights.
<tracker>.getConfidence(confidence) Get the specified confidence value, [0,1].
<tracker>.getConfidenceList() Get the list of all confidence values.
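As a concrete sketch, the weights returned by getExpressionList() can be indexed with the expression identifiers above. The example below is illustrative only; paired_weight() is a hypothetical helper, not part of the openxr module, and the tracker calls assume a runtime that supports Facebook face tracking:

```python
# Illustrative sketch: reading Facebook face tracker expression weights.
# paired_weight() is a hypothetical helper, not part of the openxr module.

def paired_weight(weights, left_index, right_index):
    """Average a left/right expression pair, clamped to [0, 1]."""
    avg = (weights[left_index] + weights[right_index]) / 2.0
    return min(max(avg, 0.0), 1.0)

try:
    import openxr
    tracker = openxr.getClient().getFaceTrackerFB()
    weights = tracker.getExpressionList()
    # Combined smile weight from both lip corner pullers
    smile = paired_weight(weights,
                          tracker.EXPRESSION_LIP_CORNER_PULLER_L,
                          tracker.EXPRESSION_LIP_CORNER_PULLER_R)
    # How much to trust the lower face weights
    confidence = tracker.getConfidence(tracker.CONFIDENCE_LOWER_FACE)
except Exception:
    pass  # requires Vizard and a runtime with face tracking support
```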

OpenXR HTC Face Tracker

A full sample script for using the OpenXR HTC face tracker can be found at \examples\openxr\xrFaceTrackerHTC.py

 

If supported by the runtime, the OpenXR HTC face tracker interface provides the following methods:

Method

Description

<tracker>.EYE_LEFT_BLINK
<tracker>.EYE_LEFT_WIDE
<tracker>.EYE_RIGHT_BLINK
<tracker>.EYE_RIGHT_WIDE
<tracker>.EYE_LEFT_SQUEEZE
<tracker>.EYE_RIGHT_SQUEEZE
<tracker>.EYE_LEFT_DOWN
<tracker>.EYE_RIGHT_DOWN
<tracker>.EYE_LEFT_OUT
<tracker>.EYE_RIGHT_IN
<tracker>.EYE_LEFT_IN
<tracker>.EYE_RIGHT_OUT
<tracker>.EYE_LEFT_UP
<tracker>.EYE_RIGHT_UP

All the supported eye expression identifiers. These can be used as indices to access the eye expression weights from the list.

<tracker>.LIP_JAW_RIGHT
<tracker>.LIP_JAW_LEFT
<tracker>.LIP_JAW_FORWARD
<tracker>.LIP_JAW_OPEN
<tracker>.LIP_MOUTH_APE_SHAPE
<tracker>.LIP_MOUTH_UPPER_RIGHT
<tracker>.LIP_MOUTH_UPPER_LEFT
<tracker>.LIP_MOUTH_LOWER_RIGHT
<tracker>.LIP_MOUTH_LOWER_LEFT
<tracker>.LIP_MOUTH_UPPER_OVERTURN
<tracker>.LIP_MOUTH_LOWER_OVERTURN
<tracker>.LIP_MOUTH_POUT
<tracker>.LIP_MOUTH_SMILE_RIGHT
<tracker>.LIP_MOUTH_SMILE_LEFT
<tracker>.LIP_MOUTH_SAD_RIGHT
<tracker>.LIP_MOUTH_SAD_LEFT
<tracker>.LIP_CHEEK_PUFF_RIGHT
<tracker>.LIP_CHEEK_PUFF_LEFT
<tracker>.LIP_CHEEK_SUCK
<tracker>.LIP_MOUTH_UPPER_UPRIGHT
<tracker>.LIP_MOUTH_UPPER_UPLEFT
<tracker>.LIP_MOUTH_LOWER_DOWNRIGHT
<tracker>.LIP_MOUTH_LOWER_DOWNLEFT
<tracker>.LIP_MOUTH_UPPER_INSIDE
<tracker>.LIP_MOUTH_LOWER_INSIDE
<tracker>.LIP_MOUTH_LOWER_OVERLAY
<tracker>.LIP_TONGUE_LONGSTEP1
<tracker>.LIP_TONGUE_LEFT
<tracker>.LIP_TONGUE_RIGHT
<tracker>.LIP_TONGUE_UP
<tracker>.LIP_TONGUE_DOWN
<tracker>.LIP_TONGUE_ROLL
<tracker>.LIP_TONGUE_LONGSTEP2
<tracker>.LIP_TONGUE_UPRIGHT_MORPH
<tracker>.LIP_TONGUE_UPLEFT_MORPH
<tracker>.LIP_TONGUE_DOWNRIGHT_MORPH
<tracker>.LIP_TONGUE_DOWNLEFT_MORPH

All the supported lip expression identifiers. These can be used as indices to access the lip expression weights from the list.
<tracker>.isEyeExpressionSupported() Return True/False whether eye expression tracking is supported by the current OpenXR session.
<tracker>.getEyeExpression(expression) Get the specified eye expression weight, [0,1].
<tracker>.getEyeExpressionList() Get the list of all eye expression weights.
<tracker>.isLipExpressionSupported() Return True/False whether lip expression tracking is supported by the current OpenXR session.
<tracker>.getLipExpression(expression) Get the specified lip expression weight, [0,1].
<tracker>.getLipExpressionList() Get the list of all lip expression weights.
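The eye and lip queries above can be combined after checking the corresponding isSupported methods. The sketch below is illustrative; is_wink() is a hypothetical helper, not part of the openxr module:

```python
# Illustrative sketch: polling the HTC face tracker.
# is_wink() is a hypothetical helper, not part of the openxr module.

def is_wink(left_blink, right_blink, threshold=0.8):
    """A wink: exactly one eye past the blink threshold."""
    return (left_blink >= threshold) != (right_blink >= threshold)

try:
    import openxr
    tracker = openxr.getClient().getFaceTrackerHTC()
    if tracker.isEyeExpressionSupported():
        left = tracker.getEyeExpression(tracker.EYE_LEFT_BLINK)
        right = tracker.getEyeExpression(tracker.EYE_RIGHT_BLINK)
        winking = is_wink(left, right)
    if tracker.isLipExpressionSupported():
        lips = tracker.getLipExpressionList()
        jaw_open = lips[tracker.LIP_JAW_OPEN]
except Exception:
    pass  # requires Vizard and a runtime with HTC facial tracking
```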

OpenXR Facebook Passthrough

A full sample script for using the OpenXR Facebook passthrough interface can be found at \examples\openxr\xrPassthroughFB.py

 

If supported by the runtime, the OpenXR Facebook passthrough interface provides the following methods:

Method

Description

<passthrough>.setEnabled(mode) Set whether video passthrough is enabled. mode can be True, False, or viz.TOGGLE.
<passthrough>.getEnabled() Get whether video passthrough is enabled.
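A minimal usage sketch, assuming a runtime that supports the passthrough extension (e.g. a Quest over Link). next_state() is a hypothetical helper that only mirrors the documented True/False/viz.TOGGLE semantics of setEnabled():

```python
# Illustrative sketch: toggling Facebook passthrough with a key press.
# next_state() is a hypothetical helper mirroring the documented
# True/False/viz.TOGGLE semantics of setEnabled().

def next_state(current, mode):
    """Resolve the new enabled state for a True/False/'toggle' mode."""
    if mode == 'toggle':
        return not current
    return bool(mode)

try:
    import viz
    import vizact
    import openxr

    passthrough = openxr.getClient().getPassthroughFB()
    passthrough.setEnabled(True)  # start with the camera feed visible

    # Pressing 'p' flips between passthrough and the fully virtual scene
    vizact.onkeydown('p', passthrough.setEnabled, viz.TOGGLE)
except Exception:
    pass  # requires Vizard and a runtime with the passthrough extension
```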

OpenXR Varjo Passthrough

A full sample script for using the OpenXR Varjo passthrough interface can be found at \examples\openxr\xrPassthroughVarjo.py

 

If supported by the runtime, the OpenXR Varjo passthrough interface provides the following methods:

Method

Description

<passthrough>.setEnabled(mode) Set whether video passthrough is enabled. mode can be True, False, or viz.TOGGLE.
<passthrough>.getEnabled() Get whether video passthrough is enabled.
<passthrough>.setDepthTestRange(depthRange) Set the (near, far) range, in meters, that allows depth testing between real and virtual objects. All virtual objects outside the specified range will always appear over the video image.
<passthrough>.getDepthTestRange() Get the (near, far) range, in meters, that allows depth testing between real and virtual objects.
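A usage sketch, assuming Varjo Base is the active OpenXR runtime. valid_depth_range() is a hypothetical helper, and passing the range as a (near, far) tuple is an assumption based on the description above:

```python
# Illustrative sketch: Varjo passthrough with depth testing near the user.
# valid_depth_range() is a hypothetical helper; passing a (near, far)
# tuple to setDepthTestRange is an assumption based on its description.

def valid_depth_range(near, far):
    """A usable (near, far) depth-test range: non-negative and increasing."""
    return 0.0 <= near < far

try:
    import openxr
    passthrough = openxr.getClient().getPassthroughVarjo()
    passthrough.setEnabled(True)

    # Virtual objects within arm's reach depth-test against the real world;
    # everything outside this range draws over the video image.
    near, far = 0.1, 1.5
    if valid_depth_range(near, far):
        passthrough.setDepthTestRange((near, far))
except Exception:
    pass  # requires Vizard with Varjo Base as the active OpenXR runtime
```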

OpenXR Actions

A full sample script for using the OpenXR actions can be found at \examples\openxr\xrActions.py

Note: OpenXR actions must be created and bound before the session gains focus. Once the session gains focus, all calls to create and bind actions will fail.

Action names must only contain the following characters:

OpenXR Controller Inputs

By default, the openxr module will automatically create input actions for known controllers for the user's left and right hands. The state of these inputs can be accessed using <sensor>.getInputState(inputId) with either the openxr.getLeftHand() or openxr.getRightHand() sensors. Below is a list of all the input Ids for known controller types:

Generic Controller

These are generic input Ids that can be used with any known controller. Not all inputs will be supported depending on the actual controller:

Input Ids
openxr.CONTROLLER_TRIGGER_BUTTON
openxr.CONTROLLER_SQUEEZE_BUTTON
openxr.CONTROLLER_MENU_BUTTON
openxr.CONTROLLER_THUMBSTICK_BUTTON
openxr.CONTROLLER_THUMBSTICK_UP
openxr.CONTROLLER_THUMBSTICK_DOWN
openxr.CONTROLLER_THUMBSTICK_LEFT
openxr.CONTROLLER_THUMBSTICK_RIGHT
openxr.CONTROLLER_X_BUTTON
openxr.CONTROLLER_A_BUTTON
openxr.CONTROLLER_Y_BUTTON
openxr.CONTROLLER_B_BUTTON
openxr.CONTROLLER_TRIGGER_TOUCH
openxr.CONTROLLER_SQUEEZE_TOUCH
openxr.CONTROLLER_THUMBSTICK_TOUCH
openxr.CONTROLLER_X_TOUCH
openxr.CONTROLLER_A_TOUCH
openxr.CONTROLLER_Y_TOUCH
openxr.CONTROLLER_B_TOUCH
openxr.CONTROLLER_TRIGGER_VALUE
openxr.CONTROLLER_SQUEEZE_VALUE
openxr.CONTROLLER_THUMBSTICK_VALUE
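The generic input Ids above can be polled each frame with getInputState() on the left or right hand sensor. The sketch below is illustrative; apply_deadzone() is a hypothetical helper, not part of the openxr module:

```python
# Illustrative sketch: polling generic controller inputs each frame.
# apply_deadzone() is a hypothetical helper, not part of the openxr module.

def apply_deadzone(value, deadzone=0.15):
    """Zero out small analog values and rescale the rest toward [-1, 1]."""
    magnitude = abs(value)
    if magnitude < deadzone:
        return 0.0
    scaled = (magnitude - deadzone) / (1.0 - deadzone)
    return scaled if value > 0 else -scaled

try:
    import vizact
    import openxr

    right = openxr.getRightHand()

    def poll():
        # Analog trigger value in [0, 1], with a small dead zone applied
        trigger = apply_deadzone(right.getInputState(openxr.CONTROLLER_TRIGGER_VALUE))
        # Digital button state
        if right.getInputState(openxr.CONTROLLER_A_BUTTON):
            print('A pressed, trigger at', trigger)

    vizact.ontimer(0, poll)  # run every frame
except Exception:
    pass  # requires Vizard with a connected OpenXR runtime
```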
HTC Vive Controllers
Input Ids
openxr.VIVE_TRIGGER_BUTTON
openxr.VIVE_SQUEEZE_BUTTON
openxr.VIVE_MENU_BUTTON
openxr.VIVE_TRACKPAD_BUTTON
openxr.VIVE_TRACKPAD_UP
openxr.VIVE_TRACKPAD_DOWN
openxr.VIVE_TRACKPAD_LEFT
openxr.VIVE_TRACKPAD_RIGHT
openxr.VIVE_TRACKPAD_TOUCH
openxr.VIVE_SYSTEM_BUTTON
openxr.VIVE_TRIGGER_VALUE
openxr.VIVE_TRACKPAD_VALUE
HTC Vive Cosmos Controllers
Input Ids
openxr.VIVE_COSMOS_TRIGGER_BUTTON
openxr.VIVE_COSMOS_SQUEEZE_BUTTON
openxr.VIVE_COSMOS_MENU_BUTTON
openxr.VIVE_COSMOS_THUMBSTICK_BUTTON
openxr.VIVE_COSMOS_THUMBSTICK_UP
openxr.VIVE_COSMOS_THUMBSTICK_DOWN
openxr.VIVE_COSMOS_THUMBSTICK_LEFT
openxr.VIVE_COSMOS_THUMBSTICK_RIGHT
openxr.VIVE_COSMOS_X_BUTTON
openxr.VIVE_COSMOS_A_BUTTON
openxr.VIVE_COSMOS_Y_BUTTON
openxr.VIVE_COSMOS_B_BUTTON
openxr.VIVE_COSMOS_THUMBSTICK_TOUCH
openxr.VIVE_COSMOS_SHOULDER_BUTTON
openxr.VIVE_COSMOS_TRIGGER_VALUE
openxr.VIVE_COSMOS_THUMBSTICK_VALUE
HTC Vive Focus 3 Controllers
Input Ids
openxr.VIVE_FOCUS3_TRIGGER_BUTTON
openxr.VIVE_FOCUS3_SQUEEZE_BUTTON
openxr.VIVE_FOCUS3_MENU_BUTTON
openxr.VIVE_FOCUS3_THUMBSTICK_BUTTON
openxr.VIVE_FOCUS3_THUMBSTICK_UP
openxr.VIVE_FOCUS3_THUMBSTICK_DOWN
openxr.VIVE_FOCUS3_THUMBSTICK_LEFT
openxr.VIVE_FOCUS3_THUMBSTICK_RIGHT
openxr.VIVE_FOCUS3_X_BUTTON
openxr.VIVE_FOCUS3_A_BUTTON
openxr.VIVE_FOCUS3_Y_BUTTON
openxr.VIVE_FOCUS3_B_BUTTON
openxr.VIVE_FOCUS3_TRIGGER_TOUCH
openxr.VIVE_FOCUS3_SQUEEZE_TOUCH
openxr.VIVE_FOCUS3_THUMBSTICK_TOUCH
openxr.VIVE_FOCUS3_THUMBREST_TOUCH
openxr.VIVE_FOCUS3_TRIGGER_VALUE
openxr.VIVE_FOCUS3_SQUEEZE_VALUE
openxr.VIVE_FOCUS3_THUMBSTICK_VALUE
Valve Index Controllers
Input Ids
openxr.VALVE_INDEX_TRIGGER_BUTTON
openxr.VALVE_INDEX_SQUEEZE_BUTTON
openxr.VALVE_INDEX_SYSTEM_BUTTON
openxr.VALVE_INDEX_THUMBSTICK_BUTTON
openxr.VALVE_INDEX_THUMBSTICK_UP
openxr.VALVE_INDEX_THUMBSTICK_DOWN
openxr.VALVE_INDEX_THUMBSTICK_LEFT
openxr.VALVE_INDEX_THUMBSTICK_RIGHT
openxr.VALVE_INDEX_A_BUTTON
openxr.VALVE_INDEX_B_BUTTON
openxr.VALVE_INDEX_TRIGGER_TOUCH
openxr.VALVE_INDEX_THUMBSTICK_TOUCH
openxr.VALVE_INDEX_A_TOUCH
openxr.VALVE_INDEX_B_TOUCH
openxr.VALVE_INDEX_SYSTEM_TOUCH
openxr.VALVE_INDEX_TRACKPAD_TOUCH
openxr.VALVE_INDEX_TRIGGER_VALUE
openxr.VALVE_INDEX_SQUEEZE_VALUE
openxr.VALVE_INDEX_THUMBSTICK_VALUE
openxr.VALVE_INDEX_TRACKPAD_VALUE
openxr.VALVE_INDEX_SQUEEZE_FORCE
openxr.VALVE_INDEX_TRACKPAD_FORCE
Oculus Touch Controllers
Input Ids
openxr.OCULUS_TOUCH_TRIGGER_BUTTON
openxr.OCULUS_TOUCH_SQUEEZE_BUTTON
openxr.OCULUS_TOUCH_MENU_BUTTON
openxr.OCULUS_TOUCH_THUMBSTICK_BUTTON
openxr.OCULUS_TOUCH_THUMBSTICK_UP
openxr.OCULUS_TOUCH_THUMBSTICK_DOWN
openxr.OCULUS_TOUCH_THUMBSTICK_LEFT
openxr.OCULUS_TOUCH_THUMBSTICK_RIGHT
openxr.OCULUS_TOUCH_X_BUTTON
openxr.OCULUS_TOUCH_A_BUTTON
openxr.OCULUS_TOUCH_Y_BUTTON
openxr.OCULUS_TOUCH_B_BUTTON
openxr.OCULUS_TOUCH_TRIGGER_TOUCH
openxr.OCULUS_TOUCH_THUMBSTICK_TOUCH
openxr.OCULUS_TOUCH_X_TOUCH
openxr.OCULUS_TOUCH_A_TOUCH
openxr.OCULUS_TOUCH_Y_TOUCH
openxr.OCULUS_TOUCH_B_TOUCH
openxr.OCULUS_TOUCH_THUMBREST_TOUCH
openxr.OCULUS_TOUCH_TRIGGER_PROXIMITY
openxr.OCULUS_TOUCH_THUMBREST_PROXIMITY
openxr.OCULUS_TOUCH_TRIGGER_VALUE
openxr.OCULUS_TOUCH_SQUEEZE_VALUE
openxr.OCULUS_TOUCH_THUMBSTICK_VALUE
Meta Quest Pro Controllers
Input Ids
openxr.QUEST_PRO_TRIGGER_BUTTON
openxr.QUEST_PRO_SQUEEZE_BUTTON
openxr.QUEST_PRO_MENU_BUTTON
openxr.QUEST_PRO_THUMBSTICK_BUTTON
openxr.QUEST_PRO_THUMBSTICK_UP
openxr.QUEST_PRO_THUMBSTICK_DOWN
openxr.QUEST_PRO_THUMBSTICK_LEFT
openxr.QUEST_PRO_THUMBSTICK_RIGHT
openxr.QUEST_PRO_X_BUTTON
openxr.QUEST_PRO_A_BUTTON
openxr.QUEST_PRO_Y_BUTTON
openxr.QUEST_PRO_B_BUTTON
openxr.QUEST_PRO_TRIGGER_TOUCH
openxr.QUEST_PRO_THUMBSTICK_TOUCH
openxr.QUEST_PRO_X_TOUCH
openxr.QUEST_PRO_A_TOUCH
openxr.QUEST_PRO_Y_TOUCH
openxr.QUEST_PRO_B_TOUCH
openxr.QUEST_PRO_THUMBREST_TOUCH
openxr.QUEST_PRO_TRIGGER_PROXIMITY
openxr.QUEST_PRO_THUMBREST_PROXIMITY
openxr.QUEST_PRO_TRIGGER_VALUE
openxr.QUEST_PRO_SQUEEZE_VALUE
openxr.QUEST_PRO_THUMBSTICK_VALUE
openxr.QUEST_PRO_TRIGGER_CURL_VALUE
openxr.QUEST_PRO_TRIGGER_SLIDE_VALUE
openxr.QUEST_PRO_THUMBREST_FORCE
openxr.QUEST_PRO_STYLUS_FORCE
Windows Mixed Reality Controllers
Input Ids
openxr.MIXED_REALITY_TRIGGER_BUTTON
openxr.MIXED_REALITY_SQUEEZE_BUTTON
openxr.MIXED_REALITY_MENU_BUTTON
openxr.MIXED_REALITY_THUMBSTICK_BUTTON
openxr.MIXED_REALITY_THUMBSTICK_UP
openxr.MIXED_REALITY_THUMBSTICK_DOWN
openxr.MIXED_REALITY_THUMBSTICK_LEFT
openxr.MIXED_REALITY_THUMBSTICK_RIGHT
openxr.MIXED_REALITY_TRACKPAD_BUTTON
openxr.MIXED_REALITY_TRACKPAD_UP
openxr.MIXED_REALITY_TRACKPAD_DOWN
openxr.MIXED_REALITY_TRACKPAD_LEFT
openxr.MIXED_REALITY_TRACKPAD_RIGHT
openxr.MIXED_REALITY_TRACKPAD_TOUCH
openxr.MIXED_REALITY_TRIGGER_VALUE
openxr.MIXED_REALITY_THUMBSTICK_VALUE
openxr.MIXED_REALITY_TRACKPAD_VALUE
HP Reverb G2 Controllers
Input Ids
openxr.HP_REVERB_TRIGGER_BUTTON
openxr.HP_REVERB_SQUEEZE_BUTTON
openxr.HP_REVERB_MENU_BUTTON
openxr.HP_REVERB_THUMBSTICK_BUTTON
openxr.HP_REVERB_THUMBSTICK_UP
openxr.HP_REVERB_THUMBSTICK_DOWN
openxr.HP_REVERB_THUMBSTICK_LEFT
openxr.HP_REVERB_THUMBSTICK_RIGHT
openxr.HP_REVERB_X_BUTTON
openxr.HP_REVERB_A_BUTTON
openxr.HP_REVERB_Y_BUTTON
openxr.HP_REVERB_B_BUTTON
openxr.HP_REVERB_TRIGGER_VALUE
openxr.HP_REVERB_SQUEEZE_VALUE
openxr.HP_REVERB_THUMBSTICK_VALUE

OpenXR Config

The openxr.config object can be used to apply various configuration settings before initializing the client with openxr.getClient(). The openxr.config object provides the following attributes:

Attribute

Description

openxr.config.AppName Set the application name. Might be used by the runtime for display purposes. Defaults to "Vizard-Application".
openxr.config.AppVersion Set the application version. Must be an integer. Might be used by the runtime for display purposes. Defaults to 1.
openxr.config.AutoExit Specifies whether the Vizard app should automatically exit when the client session state changes to openxr.SESSION_STATE_EXITING, which means the runtime wants the app to exit, typically by user request. Defaults to True.
openxr.config.AssignDefaultInputActions Specifies whether the default controller inputs will be created for the left and right hands. If disabled, you must manually create and bind actions to receive user input. Defaults to True.
openxr.config.DepthLayer Specifies whether the scene depth map layer will be submitted to the HMD along with the color map. Most runtimes use the depth map to perform more accurate reprojections of the final displayed image. Defaults to True.
openxr.config.DebugUtils Specifies whether the OpenXR debug utils extension should be enabled. Enabling it will output debug information provided by the runtime. Defaults to False.
openxr.config.EyeTracker Specifies whether the OpenXR eye tracker extension should be enabled. Disabling will prevent the use of <client>.getEyeTracker(). Defaults to True.
openxr.config.HandTracker Specifies whether the OpenXR hand tracker extension should be enabled. Disabling will prevent the use of <client>.getHandTracker(). Defaults to True.
openxr.config.BodyTrackerFB Specifies whether the OpenXR Facebook body tracker extension should be enabled. Disabling will prevent the use of <client>.getBodyTrackerFB(). Defaults to True.
openxr.config.FaceTrackerFB Specifies whether the OpenXR Facebook face tracker extension should be enabled. Disabling will prevent the use of <client>.getFaceTrackerFB(). Defaults to True.
openxr.config.FaceTrackerHTC Specifies whether the OpenXR HTC face tracker extension should be enabled. Disabling will prevent the use of <client>.getFaceTrackerHTC(). Defaults to True.
openxr.config.PassthroughFB Specifies whether the OpenXR Facebook passthrough extension should be enabled. Disabling will prevent the use of <client>.getPassthroughFB(). Defaults to True.
openxr.config.PassthroughVarjo Specifies whether the OpenXR Varjo passthrough extension should be enabled. Disabling will prevent the use of <client>.getPassthroughVarjo(). Defaults to True.
openxr.config.VisibilityMask Specifies whether the OpenXR visibility mask extension should be enabled. Disabling will prevent the use of <hmd>.setVisibilityMask(). Defaults to True.
openxr.config.RenderModel Specifies whether the OpenXR render model extension should be enabled. Disabling will prevent the use of openxr.ControllerModel(). Defaults to True.

The following sample code will change the OpenXR app name and disable the default input actions:

import openxr

# Change some client configuration settings.
# Must be changed before initializing client.
openxr.config.AppName = 'MyApp'
openxr.config.AssignDefaultInputActions = False

# Initialize client
xr = openxr.getClient()