VoiceOver can be automated on the iOS Simulator, but no single tool or API provides an end-to-end solution. This research investigated five dimensions of the problem: enabling/disabling VoiceOver, sending navigation commands, reading focus state, existing frameworks, and React Native's accessibility APIs. The core finding is that a viable automation stack exists by combining xcrun simctl (VoiceOver lifecycle), AppleScript keystroke injection (navigation), and the macOS AXUIElement API (tree inspection), but reading VoiceOver cursor position remains the hardest unsolved problem. No existing framework (Apple, Google, Deque, or open-source) automates VoiceOver itself. Every tool in the ecosystem validates accessibility metadata, not screen reader behavior. This gap is real and unaddressed.
Recommended architecture: A three-layer system combining (1) defaults write + launchctl for VoiceOver lifecycle control, (2) osascript keystroke injection for navigation, and (3) an in-app debug server calling UIAccessibility.focusedElement(using:) for focus state reads. For React Native apps, this is the only path to automated VoiceOver integration testing.
VoiceOver can be enabled and disabled programmatically on the iOS Simulator using a two-step approach.
```sh
# Enable
xcrun simctl spawn booted defaults write com.apple.Accessibility VoiceOverTouchEnabled -bool true
xcrun simctl spawn booted launchctl start com.apple.VoiceOverTouch

# Disable
xcrun simctl spawn booted defaults write com.apple.Accessibility VoiceOverTouchEnabled -bool false
xcrun simctl spawn booted launchctl stop com.apple.VoiceOverTouch

# Verify (look for a PID field when running)
xcrun simctl spawn booted launchctl list com.apple.VoiceOverTouch
```

Both steps are required. Setting the preference alone does not start VoiceOver because VoiceOverTouch is a Mach on-demand service. On real devices, setting the preference triggers a Mach connection from AccessibilityUIServer that starts VoiceOverTouch automatically. On the simulator, this activation pathway is broken; `launchctl start` forces the daemon to launch regardless.
Eight alternative approaches were tested: preference-only (no launchctl), Darwin notification posts (notifyutil), AccessibilityUIServer restart, boot with --enabledJob flag, preference pre-set before boot, accessibility shortcut (triple-click side button via idb), Simulator.app menu/AppleScript, and URL scheme to Settings.app. None successfully toggled VoiceOver.
The VoiceOverTouch binary runs on the simulator and intercepts gestures. The VoiceOver welcome dialog appears on first activation. However: the activation pathway differs (manual launchctl start required), speech output may need separate audio configuration, and the accessibility shortcut (triple-click side button) does not work. Most online documentation (including Apple Developer Forums) incorrectly states VoiceOver is unavailable on the simulator. This is outdated as of iOS 18+.
No single clean API exists for sending VoiceOver navigation commands. Five approaches were identified, each with different tradeoffs.
When macOS VoiceOver is active (Cmd+F5), it treats the Simulator window as a native macOS view. Navigation commands are sent by injecting VoiceOver keyboard shortcuts via System Events:
```sh
# Next element (VO + Right Arrow)
osascript -e 'tell application "System Events" to key code 124 using {control down, option down}'

# Activate (VO + Space)
osascript -e 'tell application "System Events" to key code 49 using {control down, option down}'
```

Key commands: next element (VO+Right), previous element (VO+Left), activate (VO+Space), interact with group (VO+Shift+Down), stop interacting (VO+Shift+Up), read all (VO+A).
Critical limitation: Requires foreground GUI focus on the Simulator window. Non-headless.
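A test harness would typically wrap these osascript invocations in a small helper. The sketch below builds the same System Events commands shown above from a command name; the key codes for Right Arrow (124), Left Arrow (123), and Space (49) are standard macOS virtual key codes.

```python
import subprocess

# macOS virtual key codes for the VoiceOver commands listed above
VO_KEY_CODES = {
    "next": 124,      # Right Arrow -> VO+Right
    "previous": 123,  # Left Arrow  -> VO+Left
    "activate": 49,   # Space       -> VO+Space
}

def vo_osascript(command: str) -> list[str]:
    """Build the osascript invocation that injects VO+<key> via System Events."""
    key_code = VO_KEY_CODES[command]
    script = (
        f'tell application "System Events" to key code {key_code} '
        "using {control down, option down}"
    )
    return ["osascript", "-e", script]

def send_vo(command: str) -> None:
    # Requires the Simulator window to hold foreground GUI focus,
    # and Accessibility/Automation permission for the calling process.
    subprocess.run(vo_osascript(command), check=True)
```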
When iOS VoiceOver is enabled inside the simulator (not macOS VoiceOver), Connect Hardware Keyboard (Cmd+Shift+K) maps the Mac keyboard as an external keyboard. The same VO modifier (Control+Option) works. Quick Nav mode (toggle with Left+Right Arrow simultaneously) enables single-arrow navigation without the VO modifier.
Scriptable via the same osascript keystroke injection, but targeting iOS VoiceOver instead of macOS VoiceOver.
From within the app, UIAccessibility.post(notification: .screenChanged, argument: targetView) moves VoiceOver focus to a specific element. Useful for verifying focus lands correctly after actions, but does not simulate user navigation traversal. Timing is important: notifications are processed asynchronously and may be ignored if posted before UIKit finishes rendering.
Can query the accessibility tree but cannot simulate VoiceOver gestures or read focus state. performAccessibilityAudit (iOS 17+) checks static properties only. Useful for metadata validation, not navigation testing.
GUI-only tool with "Auto Navigate" that walks VoiceOver reading order. No CLI, no AppleScript dictionary, no automation API. The com.apple.private.accessibility.inspection entitlement blocks third-party use of its underlying framework.
This is the hardest problem. The simulator exposes the full iOS accessibility tree to macOS via AXUIElement, but VoiceOver cursor position is not available through any external API.
The iOS content appears under a group with AXSubrole = "iOSContentGroup" in the Simulator's AXUIElement hierarchy. Each iOS element exposes: AXDescription (label), AXValue, AXRole, AXIdentifier, AXFocused, AXSelected, AXFrame, AXEnabled, AXCustomActions, AXChildrenInNavigationOrder, and more.
AXFocused reflects keyboard/system focus, not VoiceOver cursor position. When VoiceOver focuses a button, that button's AXFocused remains false unless it also has keyboard focus. Probing for VoiceOver-specific attributes (AXAccessibilityFocused, AXVoiceOverFocused, AXAssistiveTechnologyFocused) returns error -25212 (not supported).
AXObserver can watch for kAXFocusedUIElementChangedNotification on the Simulator process. These fire for keyboard focus changes but likely not for VoiceOver cursor movement.
`xcrun simctl` has no accessibility inspection subcommand. `xcrun simctl ui` only supports appearance, contrast, and content size. The ios-simulator MCP server's `ui_describe_all` tool returns the full accessibility tree but omits focus/selected state.
| Approach | Accuracy | Requires App Modification? |
|---|---|---|
| In-app debug server calling `UIAccessibility.focusedElement(using: .notificationVoiceOver)` | Exact | Yes |
| In-app `elementFocusedNotification` listener writing to file/IPC | Exact | Yes |
| Position-based inference (count gestures, index into `AXChildrenInNavigationOrder`) | High | No |
| Visual detection of VoiceOver focus outline in screenshots | Medium | No |
| `AXUIElement` `AXFocused` property | Keyboard focus only | No |
The in-app debug server is the most reliable approach for owned apps. It calls UIAccessibility.focusedElement(using: .notificationVoiceOver) (the only API that returns the actual VoiceOver-focused element) and exposes the result via HTTP/WebSocket. This requires modifying the app under test.
For fully external observation, position-based inference using AXChildrenInNavigationOrder is the best option: read the navigation order, send N "next element" gestures, and the focused element is the (N+1)th element.
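The inference logic itself is trivial once the navigation order has been read from the AXUIElement tree. A sketch, assuming focus starts on the first element of the order and clamping at the end (wrap-around behavior at the end of the reading order is not guaranteed):

```python
def infer_focused_index(gestures_sent: int, element_count: int) -> int:
    """After N 'next element' gestures from the start of the navigation
    order, the VoiceOver cursor should sit on the (N+1)th element,
    i.e. index N. Clamp rather than wrap at the end of the order."""
    return min(gestures_sent, element_count - 1)

def infer_focused(nav_order: list[str], gestures_sent: int) -> str:
    """nav_order is the label list read from AXChildrenInNavigationOrder."""
    return nav_order[infer_focused_index(gestures_sent, len(nav_order))]
```

The main failure modes are dynamic content (the order changes mid-traversal) and gestures VoiceOver drops, which is why this ranks below the in-app server in the table above.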
Every framework validates accessibility metadata. None automate VoiceOver.
- XCUITest: Queries the accessibility tree. `performAccessibilityAudit(for:)` (iOS 17+) automates Accessibility Inspector checks (contrast, element description, hit region, dynamic type, text clipping, traits). Audits are limited to on-screen elements and static properties.
- Accessibility Inspector: GUI-only tool with inspection, audit, and Auto Navigate modes. No CLI or scripting interface. Cannot be automated.
- GTXiLib: Hooks into XCTest teardown for automated checks (label presence, trait conflicts, tap target size, contrast). Last release July 2021. Effectively unmaintained.
- EarlGrey 2: Combines EarlGrey's synchronization with XCUITest. Uses accessibility properties for element selection. Pairs with GTXiLib but does not add accessibility validation.
- idb: CLI automation for simulators. GitHub issue #792 (August 2022) requests VoiceOver toggle. Unresolved. Can write accessibility preferences but cannot start/stop launchd services.
- Deque axe DevTools: Automated WCAG scanning via XCUITest integration. Supports UIKit, SwiftUI, React Native, Flutter. Results upload to dashboard. Does not simulate VoiceOver.
- CashApp AccessibilitySnapshot: Snapshot testing of the accessibility hierarchy. Captures labels, traits, and activation points as visual images for diff comparison. Actively maintained (641 stars, last release April 2026). Best tool for accessibility regression testing.
- A11yUITests: Archived December 2025. Maintainer recommends AccessibilitySnapshot and Reveal instead.
- Maestro: E2E testing framework that interacts via the accessibility layer. Tests implicitly surface missing labels but do not test VoiceOver.
- react-native-testing-library: Accessibility queries (`*ByRole`, `*ByLabelText`, `*ByHintText`) and matchers (`toHaveAccessibilityState`, `toHaveAccessibilityValue`). Runs in the JS test environment, not on device. Cannot test iOS behavior.
- react-native-accessibility-engine: Jest `.toBeAccessible()` matcher with 11 rules. JavaScript-only validation.
- Detox: Gray-box E2E using accessibility identifiers. `toBeFocused()` tests keyboard focus, not VoiceOver focus. No VoiceOver simulation.
Two APIs exist:
- `AccessibilityInfo.setAccessibilityFocus(reactTag)` (deprecated): requires `findNodeHandle`; being removed in the New Architecture
- `AccessibilityInfo.sendAccessibilityEvent(ref, 'focus')` (current): works with Fabric, no `findNodeHandle` needed
Neither can be triggered from JS tests to simulate real VoiceOver focus. They send native notifications that are no-ops without a running screen reader.
FlatList's virtualization fundamentally conflicts with VoiceOver's scroll-into-view behavior. Items removed from the native hierarchy cannot be discovered by VoiceOver. React Native issue #23140 (filed January 2019) remains unresolved. Additional bugs: wrong reading order in horizontal/multi-column FlatLists (#48028, #28299), focus lost on horizontal scroll (#41566).
Workarounds: use ScrollView for small accessibility-critical lists, implement custom accessibilityActions with increment/decrement for manual scroll announcements.
Discord has custom native modules beyond stock React Native:
- `AccessibilityFocusView`: native component providing `onAccessibilityFocus` and `onAccessibilityBlur` callbacks
- `NativeDeviceAccessibilityModule`: Android crash workaround for `setAccessibilityFocus`
- `useAccessibilityNativeStackFocusTracking`: saves/restores focus across navigation transitions
- `experimental_accessibilityOrder` (RN 0.82+): declarative focus traversal order prop, snapshot-testable but not verifiable without a real screen reader
A three-layer automation stack:
defaults write → launchctl start/stop
Fully scriptable, no GUI required. Wrap in a CLI tool that handles the two-step enable/disable and state verification via launchctl list.
osascript → System Events → Simulator window
Requires foreground focus on Simulator. Use iOS VoiceOver + external keyboard commands (not macOS VoiceOver) for closer parity with real device behavior. Quick Nav mode simplifies navigation to arrow keys.
App-embedded HTTP endpoint → UIAccessibility.focusedElement(using: .notificationVoiceOver)
The only way to read actual VoiceOver cursor position. Embed a lightweight server in debug builds. For CI, use the elementFocusedNotification listener that writes focus changes to a log file readable by the test harness.
Fallback for unmodified apps: Position-based inference using AXChildrenInNavigationOrder from the AXUIElement tree.
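Tied together, layers 2 and 3 form a simple gesture-then-read loop. The sketch below takes the gesture sender and focus reader as injected callables (so it can be exercised against fakes before wiring in the real osascript and HTTP implementations) and records the traversal order VoiceOver actually took:

```python
from typing import Callable, Optional

def walk_and_record(
    send_next: Callable[[], None],
    read_focus: Callable[[], Optional[str]],
    steps: int,
) -> list[Optional[str]]:
    """Drive layers 2 and 3 together: record the starting focus, then
    after each injected 'next element' gesture read the focused element
    from the in-app endpoint. The result is the observed reading order,
    ready to diff against the expected traversal."""
    visited = [read_focus()]  # starting focus before any gesture
    for _ in range(steps):
        send_next()
        visited.append(read_focus())
    return visited
```

In practice each `read_focus()` call needs a settle delay or change-polling after the gesture, since VoiceOver moves the cursor asynchronously.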
| Property | Status |
|---|---|
| VoiceOver enable/disable | Solved (simctl + launchctl) |
| Navigation commands | Solved but requires GUI focus (osascript) |
| Focus state reading | Solved for owned apps (in-app server); inference-only for third-party |
| Headless execution | Not possible (keystroke injection needs foreground) |
| CI compatibility | Possible with macOS runners that have GUI sessions |
- Build a CLI wrapper around the enable/disable commands with state verification
- Prototype the in-app debug server as a React Native native module that exposes VoiceOver focus via HTTP
- Prototype keystroke-based navigation with timing calibration (how long to wait between gestures for VoiceOver to settle)
- Evaluate position-based inference accuracy against the in-app server ground truth
- Test CI feasibility on macOS runners (Buildkite) with GUI session access
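For the timing-calibration item above, a change-polling helper is more robust than a fixed sleep between gestures. A sketch, with the clock and sleep injected so the logic is testable without real delays:

```python
import time
from typing import Callable, Optional

def wait_for_focus_change(
    read_focus: Callable[[], Optional[str]],
    previous: Optional[str],
    timeout: float = 2.0,
    interval: float = 0.05,
    clock: Callable[[], float] = time.monotonic,
    sleep: Callable[[float], None] = time.sleep,
) -> Optional[str]:
    """Poll the focus reader until it reports something other than
    `previous`, or the timeout elapses (returning None). Logging the
    elapsed time here is what calibrates how long VoiceOver needs to
    settle after each injected gesture."""
    deadline = clock() + timeout
    while clock() < deadline:
        current = read_focus()
        if current != previous:
            return current
        sleep(interval)
    return None
```

A timeout return distinguishes "gesture was dropped" from "VoiceOver is just slow", which matters when deciding whether to re-send a gesture.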
- iOS 26.2 Simulator (iPhone 17, macOS Darwin 24.6.0, Apple Silicon)
- Swift AXUIElement test programs (ApplicationServices API)
- AppleScript System Events queries on Simulator process
- AXObserver notification probing on Simulator PID
- `xcrun simctl spawn` defaults/launchctl command testing
- Supporting VoiceOver in your app
- focusedElement(using:)
- UIAccessibility.Notification
- elementFocusedNotification
- UIAccessibilityFocus protocol
- AXUIElement.h
- AXObserverAddNotification
- Accessibility Inspector
- performAccessibilityAudit
- Perform accessibility audits for your app (WWDC 2023)
- XCUIElement hasFocus (tvOS)
- AssistiveTechnologyIdentifier
- VoiceOver on Simulator (Apple Forums)
- VoiceOver in iOS 15.2 Simulator (Apple Forums)
- NSHipster: simctl
- SwiftLee: VoiceOver Navigator for Simulator (RocketSim)
- Deque: Intro to iOS Accessibility Inspector
- DEV: iOS Accessibility Inspector Beyond Automation
- Mobile A11y: XCUITests for accessibility
- Accessibility and UI Testing in iOS (Jonathan Chen)
- Accessibility focus in SwiftUI (Swift with Majid)
- Preparing your App for VoiceOver: Accessibility Actions
- WWDC 2023 Accessibility Audits (Orange A11y Guidelines)
- VoiceOver Keyboard Commands for iOS (CSB-CDE)
- Mobile Screen Reader Testing (Scott Vinkle)
- Xcode 15: Automated accessibility audits
- Testing your app's accessibility with UI Tests
- Why we sometimes need Dispatch Delay for accessibility (Rahul Gurung)
- VoiceOver iOS Gesture/Keyboard Commands (pauljadam.com)
- VoiceOver Gestures on iOS (Deque University)
- GTXiLib (Google)
- EarlGrey (Google)
- AccessibilitySnapshot (CashApp)
- A11yUITests (archived)
- axe DevTools for Mobile (Deque)
- idb VoiceOver issue #792
- idb Overview
- Maestro
- DFAXUIElement
- hs._asm.axuielement (Hammerspoon)
- All xcrun simctl subcommands (GitHub Gist)
- React Native Accessibility Docs
- React Native AccessibilityInfo API
- FlatList VoiceOver scroll bug (#23140)
- Correct way to move accessibility focus (#37015)
- Accessibility focus order discussion (#389)
- VoiceOver reads FlatList items horizontally (#48028)
- Accessibility loses focus on horizontal scroll (#41566)
- react-native-a11y-focus
- react-native-a11y-order
- RNTL Accessibility State Wiki
- RNTL Accessibility Helpers
- Detox Expect API
- Callstack Accessibility Snapshot Testing
- react-native-accessibility-engine
- sendAccessibilityEvent Fabric commit