OK, so Apple quietly shipped some updates to its AR ecosystem at #WWDC20

ARKit 4 - Depth API, Location Anchors (like Snapchat), Expanded Face Tracking Support (both front & back cameras can do AR simultaneously), Scene Geometry (WOW), Instant AR, People Occlusion, Motion Capture, Multiple Face Tracking, Collaborative Sessions. Quick Depth API sketch below.
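For a taste, here's a minimal sketch of the new Depth API on a LiDAR device (the class name and wiring are illustrative, not from Apple's sample code):

```swift
import UIKit
import ARKit

final class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let config = ARWorldTrackingConfiguration()
        // .sceneDepth needs a LiDAR device (e.g. the 2020 iPad Pro)
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Per-pixel depth in meters, plus a per-pixel confidence map
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        let confidenceMap: CVPixelBuffer? = depth.confidenceMap
        _ = (depthMap, confidenceMap) // feed into point clouds, effects, occlusion...
    }
}
```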

"Detect up to 100 images at a time and get an automatic estimate of the physical size of the object in the image." what crazy
Scene Geometry - "Create a topological map of your space with labels identifying floors, walls, ceilings, windows, doors, and seats."
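A minimal sketch of how you'd opt into that mesh with classifications (LiDAR devices only; the delegate class here is illustrative):

```swift
import ARKit

final class SceneMeshDelegate: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        // LiDAR-only: reconstruct a mesh with per-face semantic labels
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let mesh as ARMeshAnchor in anchors {
            // Faces are classified as .wall, .floor, .ceiling, .window,
            // .door, .seat, .table, or .none
            _ = mesh.geometry.classification
        }
    }
}
```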

🤯
Location Anchors

"Place AR experiences at specific places, such as throughout cities and alongside famous landmarks. Location Anchoring allows you to anchor your AR creations...
... at specific latitude, longitude, and altitude coordinates. Users can move around virtual objects and see them from different perspectives, exactly as real objects are seen through a camera lens."
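A hedged sketch of what that looks like in code. The coordinates are made up, and geo tracking availability depends on both the device and Apple Maps coverage of the location:

```swift
import ARKit
import CoreLocation

let session = ARSession()

// Geo tracking is gated by device capability AND map coverage of the area,
// so availability has to be checked at runtime
ARGeoTrackingConfiguration.checkAvailability { available, _ in
    guard available else { return }

    session.run(ARGeoTrackingConfiguration())

    // Hypothetical coordinates; altitude is in meters
    let coordinate = CLLocationCoordinate2D(latitude: 37.7954, longitude: -122.3937)
    session.add(anchor: ARGeoAnchor(coordinate: coordinate, altitude: 11.0))
}
```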
"Because of RealityKit’s native ARKit integration, Location Anchoring, extended support for face tracking, and improved object occlusion rendering are available for apps using RealityKit automatically."

that's great
"Now, virtual objects can be placed under tables, behind walls, or around corners and you’ll see only the parts of the virtual object you’d expect to, with crisp definition of where the...
... physical object hides part of the virtual one. And, all AR experiences powered by RealityKit or AR Quick Look benefit from these improvements automatically, without having to write a single line of code"
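That said, if you're driving RealityKit's ARView yourself, the relevant switches look roughly like this (a sketch, assuming a LiDAR device for mesh-based occlusion):

```swift
import ARKit
import RealityKit

let arView = ARView(frame: .zero)

// Use the reconstructed mesh to hide virtual content behind real geometry
arView.environment.sceneUnderstanding.options.insert(.occlusion)

// People occlusion is a frame semantic on the underlying ARKit configuration
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    let config = ARWorldTrackingConfiguration()
    config.frameSemantics.insert(.personSegmentationWithDepth)
    arView.session.run(config)
}
```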

ARKit keeps improving!