Unity XR Testing Guide: Unit and Integration Tests for VR/AR Apps

Unity XR projects often go untested until a physical headset is involved, but that approach is slow and unnecessary. This guide covers the Unity Test Framework, the XR Device Simulator, controller input mocking, and Play Mode tests for XR rigs — everything you need to validate VR and AR behavior automatically.

Testing immersive applications presents a unique challenge: the hardware is expensive, lab time is limited, and bugs in spatial interaction can be subtle and hard to reproduce. Unity's XR toolchain has matured significantly, and with the right approach you can cover most of your critical paths without ever strapping on a headset.

Unity Test Framework Basics for XR Projects

Unity Test Framework (UTF) is built on NUnit and supports two modes: Edit Mode tests (no play loop, fast, suitable for pure logic) and Play Mode tests (full scene lifecycle, required for XR interaction, physics, and coroutines).

For XR work, you will primarily write Play Mode tests. Add the com.unity.test-framework package via Package Manager, then create a test assembly definition in your Tests/PlayMode folder.

// Tests/PlayMode/XRRigTests.asmdef references:
// - Unity.InputSystem
// - Unity.XR.CoreUtils                  // XROrigin lives here
// - Unity.XR.Interaction.Toolkit
// - Unity.XR.Interaction.Toolkit.Tests  // test utilities
// - UnityEngine.TestRunner
// - UnityEditor.TestRunner (Edit Mode only)

A minimal Play Mode test looks like this:

using System.Collections;
using NUnit.Framework;
using UnityEngine;
using UnityEngine.TestTools;

public class XRRigPositionTests
{
    private GameObject _rig;

    [SetUp]
    public void SetUp()
    {
        _rig = new GameObject("XR Rig");
        _rig.AddComponent<Unity.XR.CoreUtils.XROrigin>(); // XROrigin lives in the XR Core Utilities package
    }

    [TearDown]
    public void TearDown()
    {
        Object.Destroy(_rig);
    }

    [UnityTest]
    public IEnumerator XROrigin_StartsAtWorldOrigin()
    {
        yield return null; // wait one frame for Awake
        Assert.AreEqual(Vector3.zero, _rig.transform.position,
            "XR rig should start at world origin");
    }
}

XR Simulation: Testing Without a Headset

The com.unity.xr.interaction.toolkit package ships the XR Device Simulator as a package sample; AR Foundation separately provides an XR Simulation environment for AR features. For automated interaction tests, the XR Device Simulator is the key piece — it stands in for a headset and controllers and accepts scripted input.

Enable it in your test setup:

using UnityEngine.XR.Interaction.Toolkit.Inputs.Simulation;

[SetUp]
public void SetUp()
{
    // Load the XR Device Simulator if not already present.
    // Assumes the simulator sample prefab has been copied into a Resources folder,
    // since package samples are not loadable via Resources.Load by default.
    if (Object.FindObjectOfType<XRDeviceSimulator>() == null)
    {
        var simPrefab = Resources.Load<GameObject>("XR Device Simulator");
        Object.Instantiate(simPrefab);
    }
}

For fully headless CI environments, you can configure simulation through its settings ScriptableObjects and load them before entering Play Mode.
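In a headless run it is also worth guarding XR-dependent tests so they skip cleanly rather than fail when no XR loader initializes. A minimal sketch, assuming com.unity.xr.management is installed; XRTestGuards and RequireXRLoader are hypothetical names:

```csharp
using NUnit.Framework;
using UnityEngine.XR.Management;

// Hypothetical guard helper: call at the top of an XR-dependent test so it
// is reported as skipped (not failed) when CI has no active XR loader.
public static class XRTestGuards
{
    public static void RequireXRLoader()
    {
        var settings = XRGeneralSettings.Instance;
        if (settings == null || settings.Manager == null ||
            settings.Manager.activeLoader == null)
        {
            // Assert.Ignore marks the test as skipped rather than failed
            Assert.Ignore("No active XR loader; skipping XR-dependent test.");
        }
    }
}
```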

Mocking XR Input Devices

Unity's Input System provides InputTestFixture, which lets you create fake input devices. For XR, use the XRController and XRHMD device layouts:

using NUnit.Framework;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

public class ControllerInputTests : InputTestFixture
{
    // Minimal layout that extends XRController with a trigger axis; the base
    // XRController layout does not guarantee a trigger control on its own.
    private const string kTestControllerLayout = @"
        {
            ""name"" : ""TestXRController"",
            ""extend"" : ""XRController"",
            ""controls"" : [
                { ""name"" : ""trigger"", ""layout"" : ""Axis"" }
            ]
        }";

    private XRController _leftController;
    private XRController _rightController;

    [SetUp]
    public override void Setup()
    {
        base.Setup();
        InputSystem.RegisterLayout(kTestControllerLayout);
        _leftController = (XRController)InputSystem.AddDevice("TestXRController");
        _rightController = (XRController)InputSystem.AddDevice("TestXRController");

        // Device usages are what the {LeftHand}/{RightHand} binding syntax matches
        InputSystem.SetDeviceUsage(_leftController, CommonUsages.LeftHand);
        InputSystem.SetDeviceUsage(_rightController, CommonUsages.RightHand);
    }

    [TearDown]
    public override void TearDown()
    {
        // Remove our devices before the fixture resets the input system
        InputSystem.RemoveDevice(_leftController);
        InputSystem.RemoveDevice(_rightController);
        base.TearDown();
    }

    [Test]
    public void TriggerPress_FiresSelectAction()
    {
        bool selectFired = false;
        var action = new InputAction(binding: "<XRController>{RightHand}/trigger");
        action.performed += _ => selectFired = true;
        action.Enable();

        // Simulate a full trigger press; Set() queues and processes the event
        Set(_rightController, "trigger", 1.0f);

        Assert.IsTrue(selectFired, "Trigger press should fire select action");
        action.Disable();
    }
}

For hand tracking, the XR Hands package (com.unity.xr.hands) exposes XRHandDevice and the XRHandSubsystem. You can query a running subsystem and feed it synthetic joint data:

using System.Collections.Generic;
using UnityEngine.XR.Hands;

// In your test setup: hand subsystems are created from registered provider
// descriptors, not constructed directly. Grab a running instance (if any):
var handSubsystems = new List<XRHandSubsystem>();
SubsystemManager.GetSubsystems(handSubsystems);
// Feed synthetic joint data via a custom provider or the test utilities
// in com.unity.xr.hands@1.4+

Testing XR Rig Locomotion

Locomotion systems (teleportation, continuous move, snap turn) are stateful and driven by input. Test them by simulating input sequences and asserting final positions or rotations.

[UnityTest]
public IEnumerator TeleportationProvider_MovesRigToTarget()
{
    // Arrange
    var rigGO = SetUpXRRig();
    var teleportArea = CreateTeleportArea(new Vector3(5f, 0f, 5f));
    var provider = rigGO.GetComponentInChildren<TeleportationProvider>();

    var request = new TeleportRequest
    {
        destinationPosition = new Vector3(5f, 0f, 5f),
        destinationRotation = Quaternion.identity,
        matchOrientation = MatchOrientation.WorldSpaceUp
    };

    // Act
    provider.QueueTeleportRequest(request);
    yield return new WaitForSeconds(0.1f); // allow locomotion to process

    // Assert
    var origin = rigGO.GetComponent<XROrigin>();
    // NUnit's delta overload (Assert.AreApproximatelyEqual is UnityEngine.Assertions, not NUnit)
    Assert.AreEqual(5f, origin.transform.position.x, 0.01f,
        "Rig X should be at teleport destination");
    Assert.AreEqual(5f, origin.transform.position.z, 0.01f,
        "Rig Z should be at teleport destination");
}

UI Interaction: Raycasting and Controller Selection

XR UI interactions go through TrackedDeviceGraphicRaycaster and XRUIInputModule. Testing them requires a Canvas with the correct components and a simulated ray origin.

[UnityTest]
public IEnumerator RayInteractor_SelectsUIButton()
{
    // Build a minimal XR UI scene
    var canvas = CreateWorldSpaceCanvas();
    var button = AddButton(canvas, "TestButton", new Vector3(0, 0, 2));
    bool clicked = false;
    button.onClick.AddListener(() => clicked = true);

    // The interactor needs an XRInteractionManager in the scene, and the canvas
    // needs an EventSystem with an XRUIInputModule (assumed created by the helpers).
    var interactorGO = new GameObject("Ray Interactor");
    var rayInteractor = interactorGO.AddComponent<XRRayInteractor>();
    interactorGO.transform.position = Vector3.zero;
    interactorGO.transform.forward = Vector3.forward;

    // Simulate select input
    Set(_rightController, "trigger", 1.0f);
    yield return new WaitForFixedUpdate();
    yield return null;

    Assert.IsTrue(clicked, "Button should be clicked via ray interactor");
}

Play Mode vs Edit Mode: When to Use Each

| Concern | Edit Mode | Play Mode |
|---|---|---|
| Pure math / utilities | Yes | Overkill |
| ScriptableObject config | Yes | Unnecessary |
| MonoBehaviour Awake/Start | No | Required |
| Physics / collision | No | Required |
| XR interaction | No | Required |
| Coroutines | No | Required |
| Input simulation | No | Required |

Keep Edit Mode tests for data models, save/load logic, inventory calculations, and anything that does not touch the scene graph. Move everything involving XROrigin, XRInteractionManager, locomotion, or UI into Play Mode.
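For contrast, an Edit Mode test is plain NUnit against pure logic, with no scene and no frame loop. A minimal sketch (InventoryStack is a hypothetical data class, not part of any package):

```csharp
using System;
using NUnit.Framework;

// Hypothetical pure-logic class under test: no MonoBehaviour, no scene graph,
// so it runs in Edit Mode in milliseconds.
public class InventoryStack
{
    public int Count { get; private set; }
    public int Capacity { get; }

    public InventoryStack(int capacity) => Capacity = capacity;

    // Adds items and returns how many actually fit (clamped to capacity)
    public int Add(int amount)
    {
        int added = Math.Min(amount, Capacity - Count);
        Count += added;
        return added;
    }
}

public class InventoryStackTests
{
    [Test]
    public void Add_ClampsToCapacity()
    {
        var stack = new InventoryStack(capacity: 10);

        Assert.AreEqual(10, stack.Add(15), "Only 10 of 15 items should fit");
        Assert.AreEqual(10, stack.Count, "Stack should be at capacity");
    }
}
```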

XR Interaction Toolkit Test Utilities

XRI ships test utilities in a com.unity.xr.interaction.toolkit.tests assembly (available when the package is added to the "testables" array in manifest.json). It exposes helpers such as the TestUtilities factory methods for creating interaction managers, interactors, and interactables, which remove the need to set up full prefabs in every test.

using UnityEngine.XR.Interaction.Toolkit.Tests;

[UnityTest]
public IEnumerator GrabInteractable_AttachesToInteractor()
{
    var manager = TestUtilities.CreateInteractionManager();
    var interactor = TestUtilities.CreateDirectInteractor();
    var interactable = TestUtilities.CreateGrabInteractable();

    interactable.transform.position = interactor.transform.position;
    yield return new WaitForFixedUpdate();
    yield return null;

    Assert.IsTrue(interactable.isSelected,
        "Interactable should be selected when interactor overlaps");
    Assert.AreEqual(interactor, interactable.firstInteractorSelecting,
        "Selecting interactor should be our direct interactor");
}

Continuous Integration Setup

Running Unity tests in CI requires a license. The recommended path for open-source projects is GameCI, which provides Docker images with Unity preinstalled.

# .github/workflows/xr-tests.yml
name: XR Tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          lfs: true

      - uses: game-ci/unity-test-runner@v4
        env:
          UNITY_LICENSE: ${{ secrets.UNITY_LICENSE }}
          UNITY_EMAIL: ${{ secrets.UNITY_EMAIL }}
          UNITY_PASSWORD: ${{ secrets.UNITY_PASSWORD }}
        with:
          projectPath: .
          testMode: PlayMode
          artifactsPath: test-results
          githubToken: ${{ secrets.GITHUB_TOKEN }}
          # XR Simulation runs headless; no GPU required for logic tests
          customParameters: -batchmode -nographics

      - uses: actions/upload-artifact@v4
        with:
          name: test-results
          path: test-results

Note: headless mode (-nographics) disables the GPU pipeline. XR Simulation still works for input and interaction, but visual assertions that require rendered output will need a GPU runner (e.g., self-hosted with a GPU, or a cloud runner with GPU passthrough).

Asserting Frame Timing in Play Mode

Performance bugs in XR are punishing: sustained frame rates below the 72 Hz refresh target on Quest cause judder and nausea. You can add rough frame timing assertions in Play Mode tests to catch regressions:

[UnityTest]
public IEnumerator HeavyScene_MaintainsTargetFrameRate()
{
    LoadHeavyTestScene();
    yield return new WaitForSeconds(2f); // warm up

    float targetFrameTime = 1f / 72f; // 72 Hz target
    float worstFrameTime = 0f;

    for (int i = 0; i < 60; i++)
    {
        yield return null;
        worstFrameTime = Mathf.Max(worstFrameTime, Time.deltaTime);
    }

    Assert.Less(worstFrameTime, targetFrameTime * 1.5f,
        $"Worst frame time {worstFrameTime * 1000:F1}ms exceeds 1.5x budget");
}

These tests are not a substitute for profiling on device, but they catch gross regressions early.

Putting It All Together

A solid Unity XR test suite has three layers: Edit Mode tests for pure logic (much of your code can be covered this way if you keep logic out of MonoBehaviours), Play Mode unit tests for individual components (interactors, locomotion providers, UI handlers), and Play Mode integration tests that load representative scenes and exercise full interaction flows.

Invest in test helpers early. An XRTestScene builder class that constructs a minimal rig, interactable, and input device in three lines pays dividends across every test file you write.
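A sketch of such a builder, assuming the XRI test utilities shown earlier are available; XRTestScene and its method names are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Tests;

// Hypothetical fluent builder: wraps the XRI TestUtilities factories so each
// test can stand up a minimal interaction scene in a few lines.
public class XRTestScene
{
    public XRInteractionManager Manager { get; private set; }
    public XRDirectInteractor Interactor { get; private set; }
    public XRGrabInteractable Interactable { get; private set; }

    public static XRTestScene Minimal()
    {
        var scene = new XRTestScene
        {
            Manager = TestUtilities.CreateInteractionManager(),
            Interactor = TestUtilities.CreateDirectInteractor(),
            Interactable = TestUtilities.CreateGrabInteractable()
        };
        return scene;
    }

    public XRTestScene WithInteractableAt(Vector3 position)
    {
        Interactable.transform.position = position;
        return this;
    }
}

// Usage in a test:
// var scene = XRTestScene.Minimal().WithInteractableAt(Vector3.zero);
```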

HelpMeTest can complement your Unity test suite by running continuous end-to-end monitoring against your deployed XR web portal or companion app, catching regressions in the non-Unity parts of your stack between releases.
