When you try to listen for a custom rotor navigation event inside an AccessibilityComponent, the RealityView no longer behaves normally. The .customRotors collection property appears to be emptied, even though there is no sign of the Entity being reconstructed. Focus is also lost, and other entities with custom rotors are reset as well. After running the custom rotor on the video, it was impossible to regain focus, and VoiceOver behaved erratically.
Name | Category
---|---
Camera | Capture
Magnifier | Capture
Scan Code | Capture
Alarm | Clock
Stopwatch | Clock
Timer | Clock
Airplane Mode | Connectivity
Bluetooth | Connectivity
Cellular Data | Connectivity
import CoreLocation
import RealityKit
import RealityKitContent
import SwiftUI

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Load the Earth model from the RealityKitContent bundle and add it to the scene.
            if let earthEntity = try? await Entity(named: "Earth", in: realityKitContentBundle) {
                content.add(earthEntity)
            }
        }
    }
}
Event | Transcript Name | Link
---|---|---
WWDC24 | 18 things from WWDC24 | https://devimages-cdn.apple.com/wwdc-services/transcripts/individual/wwdc2024/wwdc2024-111976/eng_bdd89e7893f9/wwdc2024-111976-transcript-eng.json
WWDC24 | Bring context to today’s weather | https://devimages-cdn.apple.com/wwdc-services/transcripts/individual/wwdc2024/wwdc2024-10067/eng_8d6b40e9336d/wwdc2024-10067-transcript-eng.json
WWDC24 | Bring your Live Activity to Apple Watch | https://devimages-cdn.apple.com/wwdc-services/transcripts/individual/wwdc2024/wwdc2024-10068/eng_cb5b3cef853d/wwdc2024-10068-transcript-eng.json
WWDC24 | Bring your app to Siri | https://devimages-cdn.apple.com/wwdc-services/transcripts/individual/wwdc2024/wwdc2024-10133/eng_6bfa0ce95261/wwdc2024-10133-transcript-eng.json
WWDC24 | Bring your app’s core features to users with App Intents | https://devimages-cdn.apple.com/wwdc-services/transcripts/individual/wwdc2024/wwdc2024-10210/eng_4243621a1edc/wwdc2024-10210-transcript-eng.json
WWDC24 | Bring your machine learning and AI m |
Name | Topics | Referenced In | URL
---|---|---|---
Accessibility updates | Accessibility & Inclusion | wwdc2024-10073, wwdc2023-10036 | https://developer.apple.com/documentation/Updates/Accessibility
Human Interface Guidelines: Accessibility | Accessibility & Inclusion | wwdc2024-10074, wwdc2019-802 | https://developer.apple.com/design/human-interface-guidelines/accessibility
Human Interface Guidelines: Typography | Accessibility & Inclusion | wwdc2024-10074, wwdc2020-10175 | https://developer.apple.com/design/human-interface-guidelines/typography
Performing accessibility testing for your app | Accessibility & Inclusion | wwdc2024-10073 | https://developer.apple.com/documentation/Accessibility/performing-accessibility-testing-for-your-app
UILargeContentViewerInteraction | Accessibility & Inclusion | wwdc2024-10074 | https://developer.apple.com/documentation/uikit/uilargecontentviewerinteraction
accessibilityShowsLargeContentViewer() | Accessibility & Inclusion | wwdc2024-10074 | https://developer.apple.com/documentation/SwiftUI/View/accessibilityShowsLargeContentVi
Name | Topics | Referenced In | URL
---|---|---|---
Enhancing the accessibility of your SwiftUI app | Accessibility & Inclusion | 10073, 10074 | https://developer.apple.com/documentation/Accessibility/enhancing-the-accessibility-of-your-swiftui-app
Accelerating app interactions with App Intents | App Services | 10210, 10176, 10134 | https://developer.apple.com/documentation/AppIntents/AcceleratingAppInteractionsWithAppIntents
Accessing a person’s contact data using Contacts and ContactsUI | App Services | 10121 | https://developer.apple.com/documentation/Contacts/accessing-a-person-s-contact-data-using-contacts-and-contactsui
Adopting SwiftData for a Core Data app | App Services | 10137 | https://developer.apple.com/documentation/coredata/adopting_swiftdata_for_a_core_data_app
Configuring the PencilKit tool picker | App Services | 10214 | https://developer.apple.com/documentation/pencilkit/configuring_the_pencilkit_tool_picker
Connecting to a service with passkeys | App Services | 10125 | https://developer.apple.com/documentation/authenticationservices/connecting_to
extension Measurement where UnitType == UnitTemperature {
    var color: Color {
        // Convert the input temperature to Celsius for consistency.
        let temperatureInCelsius = self.converted(to: .celsius)
        // Find the two closest color values in Celsius.
        var lowerIndex = 0
        var upperIndex = 0
        for i in 0..<colorSteps.count {
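The snippet above is cut off before the interpolation finishes and relies on a `colorSteps` table that isn't shown. As a rough, self-contained sketch of the same idea (the step values, the `approximateColor` name, and the nearest-step color choice below are assumptions, not the original data or logic):

```swift
import Foundation
import SwiftUI

// Illustrative temperature/color steps; the real colorSteps table is not shown here.
let colorSteps: [(celsius: Double, color: Color)] = [
    (-10, .blue),
    (10, .green),
    (25, .orange),
    (40, .red)
]

extension Measurement where UnitType == UnitTemperature {
    /// Maps a temperature to a color by locating the surrounding steps
    /// and picking the nearer one (a stand-in for real color interpolation).
    var approximateColor: Color {
        let celsius = converted(to: .celsius).value

        // Clamp to the ends of the scale.
        guard let first = colorSteps.first, let last = colorSteps.last else { return .gray }
        if celsius <= first.celsius { return first.color }
        if celsius >= last.celsius { return last.color }

        // Walk the steps to find the bracketing pair.
        for i in 0..<(colorSteps.count - 1) {
            let lower = colorSteps[i]
            let upper = colorSteps[i + 1]
            if celsius >= lower.celsius && celsius <= upper.celsius {
                let fraction = (celsius - lower.celsius) / (upper.celsius - lower.celsius)
                return fraction < 0.5 ? lower.color : upper.color
            }
        }
        return .gray
    }
}
```

With the assumed steps above, `Measurement(value: 13, unit: UnitTemperature.celsius).approximateColor` falls between the 10° and 25° steps and resolves to green.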
import Foundation
import SwiftUI

struct ContentView: View {
    let temperature = Measurement<UnitTemperature>(value: 13, unit: .celsius)
    var body: some View {
        List {
            Text("let temperature = Measurement<UnitTemperature>(value: 13, unit: .celsius)")
                .font(.headline)
            Section("Implicit formatting") {
                LabeledContent("temperature.formatted()", value: temperature.formatted())
            }
        }
    }
}
State machines are everywhere in interactive systems, but they're rarely defined clearly and explicitly. Given some big blob of code including implicit state machines, which transitions are possible and under what conditions? What effects take place on what transitions?
There are existing design patterns for state machines, but all the patterns I've seen complect side effects with the structure of the state machine itself. Instances of these patterns are difficult to test without mocking, and they end up with more dependencies. Worse, the classic patterns compose poorly: hierarchical state machines are typically not straightforward extensions. The functional programming world has solutions, but they don't transpose neatly enough to be broadly usable in mainstream languages.
Here I present a composable pattern for pure state machines with effects,
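To make the idea concrete, here is a minimal sketch of one way such a pattern can look in Swift: the machine is a pure value type whose transition function returns the effects to perform as plain data, so callers execute the effects and tests need no mocks. All names below (Loader, Event, Command) are illustrative, not taken from the text above.

```swift
/// A pure state machine: transitions are a function of (state, event),
/// and side effects are returned as data (`Command`) instead of being performed here.
struct Loader {
    enum State { case idle, loading, loaded(String), failed(String) }
    enum Event { case start, finished(String), errored(String), retry }
    enum Command: Equatable { case fetchData, showError(String) }

    private(set) var state: State = .idle

    /// Pure transition function: updates the value and returns the effect to run, if any.
    mutating func handle(_ event: Event) -> Command? {
        switch (state, event) {
        case (.idle, .start), (.failed, .retry):
            state = .loading
            return .fetchData
        case (.loading, .finished(let data)):
            state = .loaded(data)
            return nil
        case (.loading, .errored(let message)):
            state = .failed(message)
            return .showError(message)
        default:
            // Ignore events that don't apply in the current state.
            return nil
        }
    }
}

// Testable without mocks: feed events, assert on the returned commands.
var machine = Loader()
assert(machine.handle(.start) == .fetchData)
assert(machine.handle(.finished("payload")) == nil)
```

Because effects come back as values, composition stays straightforward: a parent machine can forward events to a child machine and merge the commands both return.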
Category | Vision Pro | Meta Quest 3
---|---|---
Vision | 57 | 3
Physical and Motor | 55 | 6
Hearing | 9 | 3
General | 10 | 0