- Relocalization, introduced in iOS 11.3, allows a session to be restored. It uses a map supplied by world tracking.
- In iOS 12 you can get a mapping of a physical 3D space
- Mutable list of named anchors, so you can add your own
- Raw feature points and extent
- World maps can be serialized (ARWorldMap conforms to NSSecureCoding rather than Codable, so archive it with NSKeyedArchiver)
- The world map can be loaded again when the user starts their AR experience.
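A minimal sketch of saving and restoring a world map, assuming an existing ARSession and a file URL of your choosing:

```swift
import ARKit

// Sketch: persisting a world map to disk and restoring it later.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

func loadWorldMap(from url: URL, into session: ARSession) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map // relocalizes against the saved map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```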
- iOS 12 adds support for many users seeing the same world map at the same time
- ARWorldMap is shared however you please, for instance via AirDrop or Multipeer Connectivity
- getCurrentWorldMap is used to obtain the current world map
- When you pass a world map to an AR session, it uses the same relocalization that was added in iOS 11.3
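A minimal sketch of sharing the map over Multipeer Connectivity, assuming an already-connected MCSession (any transport, such as AirDrop, works just as well):

```swift
import ARKit
import MultipeerConnectivity

// Sketch: sending the current world map to all connected peers.
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true),
              !mcSession.connectedPeers.isEmpty
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}
```

On the receiving side, unarchive the data and set it as initialWorldMap exactly as in the loading sketch above.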
- More points of view give a better world map
- A static well-textured environment gives the best experience
- You get a worldMappingStatus that is updated on every frame to tell you how the map is coming along. It goes from Not Available to Limited to Extending to Mapped. If you point at another place, it will go back to Limited
- There is a relocalizing state as well
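A minimal sketch of monitoring that status, assuming a class that conforms to ARSessionDelegate:

```swift
import ARKit

// Sketch: checking the mapping status on every frame.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    switch frame.worldMappingStatus {
    case .notAvailable, .limited:
        break // not enough of the environment captured yet; keep scanning
    case .extending:
        break // the visible area is mapped and the map is still growing
    case .mapped:
        break // the visible area is fully mapped; a good moment to save or share
    @unknown default:
        break
    }
}
```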
- Advanced texturing
- Scale
- Position and orientation
- Lighting
- Reflective objects now nicely reflect the scene
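The feature behind these reflections is environment texturing; a minimal sketch of enabling it, assuming an existing ARSession:

```swift
import ARKit

// Sketch: letting ARKit generate environment probe textures automatically.
func runWithEnvironmentTexturing(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.environmentTexturing = .automatic
    session.run(configuration)
}
```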
- In iOS 11.3, static, known images could be detected
- Supported by the Xcode asset catalog
- In iOS 12 images don’t need to be static; they may move through the world
- Position and orientation are estimated, so you can use images as planes and to trigger the display of content
- You can track multiple images
- ARImageTrackingConfiguration
- ARReferenceImage as a trigger
- Feed reference images to ARWorldTrackingConfiguration or ARImageTrackingConfiguration (see the sketch after this list)
- When an image is found you get an ARImageAnchor, which conforms to ARTrackable
- Good images have many features, rich texture, and good contrast
- Xcode will warn you if an image is not likely to be tracked well
- Image anchors are represented in the world coordinate system
- The image tracking configuration is independent of world tracking
- Position and orientation are estimated for every frame
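A minimal sketch of running image tracking, assuming reference images grouped in the asset catalog under "AR Resources" (a hypothetical group name) and an existing ARSession:

```swift
import ARKit

// Sketch: tracking multiple moving images.
func runImageTracking(on session: ARSession) {
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) else { return }

    let configuration = ARImageTrackingConfiguration()
    configuration.trackingImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 2 // more than one image at a time
    session.run(configuration)
}

// In ARSessionDelegate, a found image arrives as an ARImageAnchor (an ARTrackable).
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let imageAnchor as ARImageAnchor in anchors {
        // transform gives position and orientation in the world coordinate system.
        print(imageAnchor.referenceImage.name ?? "unnamed", imageAnchor.transform)
    }
}
```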
- Detects a known, static 3D object
- Needs to be scanned first
- Well-textured, non-reflective objects work best
- Built into the session
- Added as a reference object, similar to images
- Set a set of detectionObjects on the configuration (see the sketch after this list)
- To scan objects there is an ARObjectScanningConfiguration
- An example app for scanning is available
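A minimal sketch of detecting scanned objects, assuming reference objects grouped in the asset catalog under "AR Objects" (a hypothetical group name) and an existing ARSession; the scanning itself is done beforehand with ARObjectScanningConfiguration:

```swift
import ARKit

// Sketch: detecting previously scanned 3D objects.
func runObjectDetection(on session: ARSession) {
    guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Objects", bundle: nil) else { return }

    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionObjects = referenceObjects
    session.run(configuration)
}

// A detected object surfaces as an ARObjectAnchor in the delegate callbacks.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for case let objectAnchor as ARObjectAnchor in anchors {
        print(objectAnchor.referenceObject.name ?? "unnamed object")
    }
}
```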
- Face is used as a light probe
- Expressions can be tracked; 50+ facial features (blend shapes) are recognized
- You can animate a virtual character that way
- Face tracking now tracks eyes
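A minimal sketch of reading expressions and eye data from an ARFaceAnchor, assuming a TrueDepth-capable device running an ARFaceTrackingConfiguration session:

```swift
import ARKit

// Sketch: driving a character from blend shapes and reading gaze data.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        // blendShapes holds 50+ coefficients (0...1) describing the expression.
        if let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue, smile > 0.5 {
            // e.g. animate the virtual character's smile
        }
        // New in iOS 12: per-eye transforms and a look-at point.
        let leftEye = faceAnchor.leftEyeTransform
        let gaze = faceAnchor.lookAtPoint
        _ = (leftEye, gaze)
    }
}
```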