Meta Quest Testing: Automation with Appium and ADB
Meta Quest runs Android under the hood, which means your Android testing toolchain — ADB, Appium, logcat, am instrument — works on Quest with some configuration. This guide covers device setup, Appium with UiAutomator2 for Quest UI automation, OVR Metrics for performance testing, and running Unity tests directly on device.
Meta Quest is an Android device with a custom launcher and VR compositor on top. That means the entire Android testing ecosystem applies — ADB, Appium, am instrument, systrace, logcat, and the Unity test runner's adb shell am instrument workflow. The trick is knowing which parts map cleanly and which need Quest-specific workarounds.
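Because the headset enumerates like any other Android device, the same plumbing is easy to drive from scripts. Here is a minimal Python sketch (the serial shown is a placeholder) that filters `adb devices` output down to authorized serials, so automation can fail fast when a headset is still in the `unauthorized` state:

```python
import subprocess

def list_quest_serials(adb_output: str) -> list[str]:
    """Parse `adb devices` output into a list of authorized device serials.

    Skips the header line and any 'unauthorized'/'offline' entries; a Quest
    whose in-headset prompt was not accepted shows up as 'unauthorized'.
    """
    serials = []
    for line in adb_output.strip().splitlines()[1:]:
        parts = line.split("\t")
        if len(parts) == 2 and parts[1] == "device":
            serials.append(parts[0])
    return serials

if __name__ == "__main__":
    # Query a live adb server; fall back to sample output if adb is absent.
    try:
        out = subprocess.check_output(["adb", "devices"], text=True)
    except (OSError, subprocess.CalledProcessError):
        out = "List of devices attached\n1WMHHA123456789\tdevice\n"
    print(list_quest_serials(out))
```

A test harness can call `list_quest_serials` once at startup and abort with a clear message instead of letting every downstream adb call time out.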
ADB Setup for Meta Quest
Before anything else, enable Developer Mode on the Quest:
- In the Meta mobile app, go to Menu → Devices → select your headset → Developer Mode → Enable.
- Put the headset on and accept the "Allow USB debugging" prompt when you connect via USB-C.
- Verify the connection:
adb devices
# Should output something like:
# 1WMHHA123456789 device
# If you see "unauthorized", the prompt inside the headset was not accepted
adb kill-server && adb start-server
adb devices  # try again after accepting the in-headset prompt
For wireless ADB (useful when the headset is on someone's head):
# Connect once via USB to set up TCP/IP mode
adb tcpip 5555
# Find Quest's IP (in headset: Settings > Wi-Fi > (network name) > Advanced)
QUEST_IP=192.168.1.42
# Connect wirelessly
adb connect $QUEST_IP:5555
adb devices  # should show the wireless connection
Quest Link vs Standalone Testing
Quest has two distinct test environments. Standalone (the headset alone, untethered) is your production environment and the one tests must pass on. Quest Link (headset connected to a PC running the Oculus PC app) adds PC rendering capability — useful for testing PCVR builds, but not representative of standalone performance.
Set up a shell script to capture which mode is active:
#!/usr/bin/env bash
# scripts/detect-quest-mode.sh
DEVICE=${1:-$(adb devices | grep device | grep -v List | head -1 | cut -f1)}
BUILD_PRODUCT=$(adb -s "$DEVICE" shell getprop ro.build.product 2>/dev/null)
LINK_PACKAGE=$(adb -s "$DEVICE" shell pm list packages 2>/dev/null | grep com.oculus.link)
echo "Device: $DEVICE"
echo "Build: $BUILD_PRODUCT"
if [ -n "$LINK_PACKAGE" ] && adb -s "$DEVICE" shell pidof com.oculus.systemdriver > /dev/null 2>&1; then
  echo "Mode: Quest Link (PCVR)"
else
  echo "Mode: Standalone (Android)"
fi
Appium with UiAutomator2 Driver
Appium's UiAutomator2 driver works on Quest for testing the system UI, Meta's launcher, and any Android activity your app exposes (e.g., settings screens, companion activities, onboarding flows).
Install Appium and the driver:
npm install -g appium
appium driver install uiautomator2
appium &  # start the server
Write your first Quest UI test in Python using Appium-Python-Client:
# tests/test_quest_launcher.py
import pytest
from appium import webdriver
from appium.options.common import AppiumOptions
from appium.webdriver.common.appiumby import AppiumBy

QUEST_DEVICE = "1WMHHA123456789"  # from adb devices

@pytest.fixture
def driver():
    options = AppiumOptions()
    options.set_capability("platformName", "Android")
    options.set_capability("appium:deviceName", "Meta Quest 3")
    options.set_capability("appium:udid", QUEST_DEVICE)
    options.set_capability("appium:automationName", "UiAutomator2")
    options.set_capability("appium:appPackage", "com.oculus.tv")
    options.set_capability("appium:appActivity", ".MainActivity")
    options.set_capability("appium:noReset", True)
    options.set_capability("appium:newCommandTimeout", 120)
    drv = webdriver.Remote("http://localhost:4723", options=options)
    yield drv
    drv.quit()

def test_app_launches_without_crash(driver):
    """Verify the app launches and shows the main content grid."""
    # Wait for main content to appear
    content_grid = driver.find_element(
        AppiumBy.ID, "com.oculus.tv:id/content_grid"
    )
    assert content_grid.is_displayed(), "Content grid should be visible on launch"

def test_search_returns_results(driver):
    """Verify search functionality returns results."""
    search_button = driver.find_element(
        AppiumBy.ACCESSIBILITY_ID, "Search"
    )
    search_button.click()
    search_input = driver.find_element(
        AppiumBy.ID, "com.oculus.tv:id/search_input"
    )
    search_input.send_keys("Netflix")
    results = driver.find_elements(
        AppiumBy.ID, "com.oculus.tv:id/search_result_item"
    )
    assert len(results) > 0, "Search should return at least one result"
For your own VR application, expose an Android companion activity or settings screen and test it the same way. This covers onboarding flows, permission dialogs, account linking, and settings that exist outside the VR experience.
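One common case in those onboarding flows is the Android runtime-permission dialog (microphone, storage). The helper below is a sketch under assumptions: the button resource IDs come from stock Android's permission controller and may differ across Quest OS versions, and `accept_permission_dialog` is a hypothetical name, exercised here with a stub driver so the pattern is clear without a live Appium session.

```python
# Hypothetical helper: Quest permission prompts are stock Android dialogs,
# so the permission controller's "Allow" button IDs usually apply.
# IDs vary by Android version, so try each in turn.
ALLOW_BUTTON_IDS = [
    "com.android.permissioncontroller:id/permission_allow_button",
    "com.android.packageinstaller:id/permission_allow_button",  # older builds
]

def accept_permission_dialog(driver):
    """Click the first Allow button present; return its ID, or None."""
    for button_id in ALLOW_BUTTON_IDS:
        try:
            # "id" is the literal value of AppiumBy.ID
            driver.find_element("id", button_id).click()
            return button_id
        except Exception:  # NoSuchElementException in a real session
            continue
    return None

if __name__ == "__main__":
    class StubElement:
        def click(self):
            pass

    class StubDriver:  # stands in for a real Appium driver
        def find_element(self, by, value):
            if value == ALLOW_BUTTON_IDS[0]:
                return StubElement()
            raise LookupError(value)

    print(accept_permission_dialog(StubDriver()))
```

In a real session you would call `accept_permission_dialog(driver)` right after the action that triggers the prompt, and assert it returned a non-None ID whenever the permission flow is expected.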
Testing Your App's Launch and Crash Behavior
Use adb shell am start to launch your app and logcat to monitor for crashes:
#!/usr/bin/env bash
# scripts/launch-and-monitor.sh
PACKAGE="com.yourcompany.vrapp"
ACTIVITY="com.yourcompany.vrapp.MainActivity"
DEVICE=${1:-$(adb devices | grep -m1 "device$" | cut -f1)}
TIMEOUT=30
echo "Launching $PACKAGE on $DEVICE..."
# Clear previous logs
adb -s "$DEVICE" logcat -c
# Launch the app
adb -s "$DEVICE" shell am start -n "$PACKAGE/$ACTIVITY"
# Monitor for crash signals for TIMEOUT seconds
END=$((SECONDS + TIMEOUT))
CRASHED=false
while [ $SECONDS -lt $END ]; do
  if adb -s "$DEVICE" logcat -d | grep -q "FATAL EXCEPTION\|AndroidRuntime: FATAL\|Process: $PACKAGE.*died"; then
    CRASHED=true
    break
  fi
  sleep 1
done
if $CRASHED; then
  echo "FAIL: App crashed during launch"
  adb -s "$DEVICE" logcat -d | grep -A 20 "FATAL EXCEPTION" | tail -25
  exit 1
else
  echo "PASS: App ran for $TIMEOUT seconds without crash"
  PID=$(adb -s "$DEVICE" shell pidof "$PACKAGE")
  echo "App PID: $PID"
fi
OVR Metrics Tool for Performance Testing
Meta's OVR Metrics Tool captures per-frame GPU/CPU timing, ASW (Application SpaceWarp) activation, and thermal state. Automate it via adb:
#!/usr/bin/env bash
# scripts/run-perf-test.sh
PACKAGE="com.yourcompany.vrapp"
DEVICE=${1:-$(adb devices | grep -m1 "device$" | cut -f1)}
OUTPUT_DIR="./perf-results"
DURATION=60  # seconds
mkdir -p "$OUTPUT_DIR"
# Start OVR Metrics capture
adb -s "$DEVICE" shell am broadcast \
  -a com.oculus.ovrmetricstool.CAPTURE_START \
  -p com.oculus.ovrmetricstool
# Launch the app
adb -s "$DEVICE" shell am start \
  -n "$PACKAGE/com.yourcompany.vrapp.MainActivity"
echo "Capturing performance for ${DURATION}s..."
sleep "$DURATION"
# Stop capture and pull results
adb -s "$DEVICE" shell am broadcast \
  -a com.oculus.ovrmetricstool.CAPTURE_STOP \
  -p com.oculus.ovrmetricstool
sleep 2
# Pull the CSV from the device
adb -s "$DEVICE" pull \
  /sdcard/OVRMetrics/ \
  "$OUTPUT_DIR/"
echo "Performance data saved to $OUTPUT_DIR/"
Parse the CSV output in Python to assert frame time thresholds:
# scripts/analyze_perf.py
import csv
import sys

def analyze_frame_times(csv_path: str, target_fps: int = 72) -> bool:
    target_ms = 1000.0 / target_fps
    frame_times = []
    with open(csv_path) as f:
        reader = csv.DictReader(f)
        for row in reader:
            if "GPU Time (ms)" in row:
                try:
                    frame_times.append(float(row["GPU Time (ms)"]))
                except ValueError:
                    continue
    if not frame_times:
        print("ERROR: No frame time data found")
        return False
    sorted_times = sorted(frame_times)
    avg = sum(frame_times) / len(frame_times)
    p95 = sorted_times[int(len(sorted_times) * 0.95)]
    p99 = sorted_times[int(len(sorted_times) * 0.99)]
    worst = sorted_times[-1]
    over_budget = sum(1 for t in frame_times if t > target_ms)
    over_pct = (over_budget / len(frame_times)) * 100
    print(f"Frame count: {len(frame_times)}")
    print(f"Target: {target_ms:.2f}ms ({target_fps}Hz)")
    print(f"Average GPU time: {avg:.2f}ms")
    print(f"P95: {p95:.2f}ms")
    print(f"P99: {p99:.2f}ms")
    print(f"Worst: {worst:.2f}ms")
    print(f"Over budget: {over_pct:.1f}% of frames")
    passed = over_pct < 2.0  # fail if more than 2% of frames miss budget
    print(f"Result: {'PASS' if passed else 'FAIL'}")
    return passed

if __name__ == "__main__":
    csv_file = sys.argv[1] if len(sys.argv) > 1 else "perf-results/metrics.csv"
    ok = analyze_frame_times(csv_file)
    sys.exit(0 if ok else 1)
Using ovr-platform-util for Build Deployment
Meta's ovr-platform-util CLI (the Oculus Platform Utility) manages APK uploads and release-channel deployment, which makes it essential for CI:
# Download from https://developer.oculus.com/downloads/package/oculus-platform-tool/
# Place it in your PATH
# Upload a build to the ALPHA channel
ovr-platform-util upload-quest-build \
  --app-id "$META_APP_ID" \
  --app-secret "$META_APP_SECRET" \
  --apk "build/MyApp.apk" \
  --channel ALPHA \
  --notes "CI build $GITHUB_SHA"
# Install directly via ADB for device testing (faster than the store channel)
adb install -r build/MyApp.apk
Running Unity Tests on Quest via ADB
The Unity Test Runner can execute tests on a connected Android device using adb shell am instrument. Build your test APK in Unity (Edit → Project Settings → Player → Android) with "Run in Background" enabled, then:
#!/usr/bin/env bash
# scripts/run-unity-tests-on-device.sh
PACKAGE="com.yourcompany.vrapp.tests"
DEVICE=${1:-$(adb devices | grep -m1 "device$" | cut -f1)}
RESULTS_DIR="./test-results"
RESULTS_XML="$RESULTS_DIR/unity-device-tests.xml"
mkdir -p "$RESULTS_DIR"
echo "Installing test APK on $DEVICE..."
adb -s "$DEVICE" install -r build/MyApp-Tests.apk
echo "Running Unity Play Mode tests..."
adb -s "$DEVICE" shell am instrument \
  -w \
  -e class "MyVRApp.Tests.PlayMode" \
  "$PACKAGE/com.unity3d.player.UnityPlayerActivity" \
  2>&1 | tee /tmp/unity-test-output.txt
# Parse Unity's output for results
PASSED=$(grep -c "^OK" /tmp/unity-test-output.txt || true)
FAILED=$(grep -c "^FAIL" /tmp/unity-test-output.txt || true)
echo "Passed: $PASSED"
echo "Failed: $FAILED"
# Pull XML results if you configured Unity to write them
adb -s "$DEVICE" pull \
  /sdcard/Android/data/$PACKAGE/files/test-results.xml \
  "$RESULTS_XML" 2>/dev/null || true
if [ "$FAILED" -gt 0 ]; then
  echo "FAIL: $FAILED tests failed on device"
  exit 1
fi
echo "PASS: All tests passed on Quest"
Automated Thermal Testing
Quest throttles performance when it gets hot. Test thermal behavior during extended sessions:
# scripts/thermal_test.py
import subprocess
import time

DEVICE = subprocess.check_output(
    ["adb", "devices"], text=True
).strip().split("\n")[1].split("\t")[0]

def get_thermal_zone(zone_name: str) -> float:
    """Read a thermal zone temperature in Celsius (-1.0 if not found)."""
    # The device shell expands the globs; all `type` files print before
    # all `temp` files, so the output splits evenly into two halves.
    result = subprocess.run(
        ["adb", "-s", DEVICE, "shell",
         "cat /sys/class/thermal/thermal_zone*/type /sys/class/thermal/thermal_zone*/temp"],
        capture_output=True, text=True,
    )
    lines = result.stdout.strip().split("\n")
    types = lines[:len(lines)//2]
    temps = lines[len(lines)//2:]
    for t, v in zip(types, temps):
        if zone_name.lower() in t.lower():
            try:
                return float(v) / 1000.0  # sysfs reports millidegrees
            except ValueError:
                pass
    return -1.0

def test_thermal_stability(duration_seconds: int = 300):
    """Run the app for N seconds and verify thermal throttling stays manageable."""
    package = "com.yourcompany.vrapp"
    subprocess.run(["adb", "-s", DEVICE, "shell", "am", "start",
                    "-n", f"{package}/.MainActivity"])
    readings = []
    throttle_events = 0
    print(f"Monitoring thermals for {duration_seconds}s...")
    for i in range(duration_seconds // 5):
        time.sleep(5)
        gpu_temp = get_thermal_zone("gpu")
        cpu_temp = get_thermal_zone("cpu")
        readings.append({"time": i * 5, "gpu_c": gpu_temp, "cpu_c": cpu_temp})
        # Check for throttling via the OVR plugin's warning log
        log = subprocess.check_output(
            ["adb", "-s", DEVICE, "logcat", "-d", "-t", "10",
             "-s", "OVRPlugin:W"],
            text=True,
        )
        if "THROTTLING" in log or "SpaceWarp" in log:
            throttle_events += 1
        print(f"  t={i*5:3d}s  GPU={gpu_temp:.1f}°C  CPU={cpu_temp:.1f}°C"
              f"  throttles={throttle_events}")
    max_gpu = max(r["gpu_c"] for r in readings)
    print(f"\nMax GPU temp: {max_gpu:.1f}°C")
    print(f"Throttle events: {throttle_events}")
    # Quest 3 thermal limit is ~80°C for GPU
    assert max_gpu < 80.0, f"GPU reached {max_gpu:.1f}°C, thermal limit exceeded"
    assert throttle_events < 5, f"Too many throttle events: {throttle_events}"
    print("PASS: Thermal test completed within bounds")

if __name__ == "__main__":
    test_thermal_stability()
SideQuest for Sideloading in CI
SideQuest's CLI (sidequest-cli) automates APK installation and management without requiring the GUI:
npm install -g sidequest-cli
# Install APK
sidequest install build/MyApp.apk --device "$QUEST_DEVICE"
# List installed packages
sidequest packages --device "$QUEST_DEVICE"
# Uninstall
sidequest uninstall com.yourcompany.vrapp --device "$QUEST_DEVICE"
For CI pipelines where the Quest is in a test lab, combine ADB direct install with a post-install verification:
adb install -r build/MyApp.apk && \
adb shell pm list packages | grep com.yourcompany.vrapp && \
echo "Install verified"
CI Pipeline with GitHub Actions
Connecting physical Quest devices to GitHub Actions requires a self-hosted runner in your lab with the headset connected via USB. For the portions that can run without a device (unit tests, build verification), use hosted runners:
name: Quest Tests
on: [push, pull_request]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build APK
        uses: game-ci/unity-builder@v4
        with:
          targetPlatform: Android
          buildName: MyVRApp
      - name: Upload APK artifact
        uses: actions/upload-artifact@v4
        with:
          name: Build-Android
          path: build/Android/  # adjust to your builder's output path
  device-tests:
    needs: build
    runs-on: self-hosted  # your lab runner with Quest connected
    steps:
      - uses: actions/checkout@v4
      - name: Download APK
        uses: actions/download-artifact@v4
        with:
          name: Build-Android
      - name: Install on Quest
        run: adb install -r MyVRApp.apk
      - name: Run smoke tests
        run: bash scripts/launch-and-monitor.sh
      - name: Run Unity device tests
        run: bash scripts/run-unity-tests-on-device.sh
      - name: Collect performance metrics
        run: |
          bash scripts/run-perf-test.sh
          python scripts/analyze_perf.py perf-results/metrics.csv
      - name: Upload results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: quest-test-results
          path: |
            test-results/
            perf-results/
HelpMeTest can run your web-based companion app tests and API integration tests in parallel with your Quest device test suite, giving you full-stack test visibility from a single dashboard without managing multiple CI systems.
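A practical addition for a lab runner is a preflight check before the device jobs start: a headset that sat idle still drains its battery, and a dead Quest fails the whole pipeline with confusing errors. The sketch below parses `adb shell dumpsys battery` output; the 30% threshold is an arbitrary assumption to tune for your lab.

```python
import re
import subprocess

MIN_BATTERY_PCT = 30  # arbitrary lab threshold; tune for your runner

def battery_level(dumpsys_output: str) -> int:
    """Extract the battery percentage from `dumpsys battery` output."""
    match = re.search(r"^\s*level:\s*(\d+)", dumpsys_output, re.MULTILINE)
    if not match:
        raise ValueError("no battery level in dumpsys output")
    return int(match.group(1))

def preflight_ok(dumpsys_output: str) -> bool:
    return battery_level(dumpsys_output) >= MIN_BATTERY_PCT

if __name__ == "__main__":
    # Query a connected headset; fall back to sample output if adb is absent.
    try:
        out = subprocess.check_output(
            ["adb", "shell", "dumpsys", "battery"], text=True
        )
    except (OSError, subprocess.CalledProcessError):
        out = "Current Battery Service state:\n  level: 85\n"
    print("preflight", "OK" if preflight_ok(out) else "FAIL")
```

Run it as the first step of the `device-tests` job and exit nonzero on failure, so the pipeline reports "headset not ready" instead of a cascade of install and instrumentation errors.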