👉 Location Anchors
Place AR experiences at specific geographic locations, such as landmarks throughout a city
👾 Sample Code
developer.apple.com/documentation/…
🔎 Class ARGeoTrackingConfiguration
developer.apple.com/documentation/…
🔎 Class ARGeoAnchor
developer.apple.com/documentation/…
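A minimal sketch of the flow (assuming an existing ARSession named `session`): check geo-tracking availability, run the configuration, and add an ARGeoAnchor at a placeholder coordinate.

```swift
import ARKit
import CoreLocation

ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return }

    // Run geo tracking on the assumed `session`
    let configuration = ARGeoTrackingConfiguration()
    session.run(configuration)

    // Placeholder coordinate; use a location in a supported city
    let coordinate = CLLocationCoordinate2D(latitude: 37.7954, longitude: -122.3937)
    let geoAnchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: geoAnchor)
}
```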
👉 Depth API
Using the LiDAR Scanner, this API provides per-pixel depth information about the surrounding environment
👾 Sample Code
developer.apple.com/documentation/…
developer.apple.com/documentation/…
🔎 sceneDepth
developer.apple.com/documentation/…
developer.apple.com/documentation/…
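A minimal sketch of reading sceneDepth, assuming a hypothetical DepthReceiver class that owns the ARSession and acts as its delegate:

```swift
import ARKit

final class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Per-pixel depth requires a LiDAR-equipped device
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthData = frame.sceneDepth else { return }
        let depthMap = depthData.depthMap            // per-pixel depth in meters (CVPixelBuffer)
        let confidenceMap = depthData.confidenceMap  // optional per-pixel confidence
        _ = (depthMap, confidenceMap)                // sample or visualize the buffers here
    }
}
```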
👉 Expanded Face Tracking
Support for Face Tracking extends to the front-facing camera on any device with the A12 Bionic chip or later, including the new iPhone SE, so even more users can enjoy these AR experiences
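A quick sketch (assuming an existing ARSession named `session`): gate the experience on the standard availability check before running face tracking.

```swift
import ARKit

// isSupported is true on TrueDepth devices and on devices
// with the A12 Bionic chip or later, such as the new iPhone SE
if ARFaceTrackingConfiguration.isSupported {
    let configuration = ARFaceTrackingConfiguration()
    session.run(configuration)
}
```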
👉 Video Texture
This brings objects, surfaces, and even characters to life by applying video textures: animate a virtual TV screen to play a movie, or make a virtual character smile
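A minimal sketch using RealityKit's VideoMaterial with an AVPlayer; the file URL and plane size are placeholders.

```swift
import RealityKit
import AVFoundation

// Placeholder URL; point this at a real video asset
let player = AVPlayer(url: URL(fileURLWithPath: "movie.mp4"))
let videoMaterial = VideoMaterial(avPlayer: player)

// A 16:9 plane acting as a virtual TV screen
let screen = ModelEntity(
    mesh: .generatePlane(width: 1.6, height: 0.9),
    materials: [videoMaterial]
)
// Add `screen` to an AnchorEntity in your ARView, then start playback
player.play()
```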
👉 Improved Object Occlusion Rendering
By combining information from the LiDAR Scanner with the improved edge detection built into RealityKit, virtual objects can interact with your physical surroundings just as you’d expect (see the sketch after the list below)
• SceneUnderstandingComponent: identify whether an entity belongs to the real-world scene
• SceneUnderstanding Collision Group: filter collisions with the real world
• DebugModelComponent: verify material parameters
• Video Materials use AVPlayer and play spatialized audio
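A minimal setup sketch, assuming an existing ARView named `arView` on a LiDAR device: enable scene-understanding occlusion, collision, and physics, then restrict a virtual entity's collisions to the real-world mesh.

```swift
import RealityKit

// Occlude virtual content behind real-world geometry and let it collide with it
arView.environment.sceneUnderstanding.options.formUnion([.occlusion, .collision, .physics])

// Collision filter that only matches the reconstructed real-world mesh;
// assign it to entity.collision?.filter on a virtual entity
let realWorldOnly = CollisionFilter(group: .default, mask: .sceneUnderstanding)
```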