What’s New in ARKit 2

Description: ARKit 2 makes it easy to develop vivid augmented reality experiences and enable apps to interact with the real world in entirely new ways. Discover how multiple iOS devices can simultaneously view an AR scene or play multiplayer AR games. Learn about new capabilities for tracking 2D images, and see how to detect known 3D objects like sculptures, toys, and furniture.

ARWorldMap persistence

After the user ends a session (say, in front of a table with objects placed on it), they can come back another day to the same table and the app can recognize it, restoring all the objects exactly where the user left them in the previous session.
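A minimal sketch of this flow: save the session's `ARWorldMap` to disk when the user leaves, then relocalize against it later. The `mapURL` location and the lack of error handling are assumptions of this example.

```swift
import ARKit

// Save the current session's map when the user leaves
// (sketch; `mapURL` is an assumed file location).
func saveWorldMap(from session: ARSession, to mapURL: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let worldMap = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: mapURL)
    }
}

// Later (even days later), run a new session with the saved map:
// ARKit relocalizes against it and restores the previous anchors.
func restoreSession(on session: ARSession, from mapURL: URL) {
    guard let data = try? Data(contentsOf: mapURL),
          let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                 from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```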

Multi-User Experiences

An ARWorldMap is shareable, which is how multiple devices can play the same AR game at the same time.
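One way to share the map, sketched below, is over Multipeer Connectivity: the host serializes its world map and sends it to the peers, who run their session with it so everyone shares one coordinate space. The already-connected `mcSession` is an assumption of this example.

```swift
import ARKit
import MultipeerConnectivity

// Host: capture and send the world map to connected peers
// (sketch; `mcSession` is assumed to be an already-connected MCSession).
func shareWorldMap(arSession: ARSession, mcSession: MCSession) {
    arSession.getCurrentWorldMap { worldMap, _ in
        guard let worldMap = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                           requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}

// Guest: run a session with the received map so both devices
// share one coordinate space and see the same anchors.
func joinSharedSession(arSession: ARSession, receivedData: Data) {
    guard let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                 from: receivedData)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    arSession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```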

More robust tracking and plane detection

Both horizontal and, since ARKit 1.5, vertical planes are detected.
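Enabling both plane types takes one line of configuration:

```swift
import ARKit

// Detect both plane types (vertical requires ARKit 1.5 / iOS 11.3+).
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
// sceneView.session.run(configuration)
```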

(Real Time) Environment Texturing

Even if ARKit hasn't seen the whole scene (because the user didn't spin 360 degrees around), ARKit will use Core ML to "hallucinate" the missing parts of the environment texture.
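Turning this on is a single configuration flag; in automatic mode ARKit adds `AREnvironmentProbeAnchor`s to the session on its own:

```swift
import ARKit

// Let ARKit generate environment textures automatically; reflective
// materials in the scene then pick up realistic reflections, with the
// unseen parts of the environment filled in by machine learning.
let configuration = ARWorldTrackingConfiguration()
configuration.environmentTexturing = .automatic
// sceneView.session.run(configuration)
```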

Real-Time Multiple 2D Image Tracking

ARKit tracks the position and orientation of each image in every frame; the pictures don't need to be static.

You need to import the reference images into Xcode.
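A sketch of the setup, assuming the images live in the asset-catalog group named "AR Resources" (the Xcode default):

```swift
import ARKit

// Load the reference images from the asset catalog
// (the group name "AR Resources" is the Xcode default, an assumption here).
guard let referenceImages = ARReferenceImage.referenceImages(
    inGroupNamed: "AR Resources", bundle: nil) else {
    fatalError("Missing reference images in the asset catalog")
}

let configuration = ARImageTrackingConfiguration()
configuration.trackingImages = referenceImages
// Track up to this many images simultaneously, updating their pose every frame.
configuration.maximumNumberOfTrackedImages = 2
// sceneView.session.run(configuration)
```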

Object Detection

ARKit tracks the positions of fixed objects (you can move around an object, but the object itself can't move). You need to import a 3D representation of the object into Xcode; Apple provides a free app to scan the object and generate that reference model.
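A sketch of the detection side, assuming the scanned `.arobject` files sit in an asset-catalog group named "AR Objects" (the group name is an assumption):

```swift
import ARKit

// Load the scanned reference objects from the asset catalog
// (the group name "AR Objects" is an assumption of this example).
guard let referenceObjects = ARReferenceObject.referenceObjects(
    inGroupNamed: "AR Objects", bundle: nil) else {
    fatalError("Missing reference objects in the asset catalog")
}

let configuration = ARWorldTrackingConfiguration()
configuration.detectionObjects = referenceObjects
// When an object is recognized, the session adds an ARObjectAnchor
// at its position, which you can decorate with virtual content.
// sceneView.session.run(configuration)
```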

Face Tracking

  • Gaze tracking
  • Tongue tracking (as in Memoji)
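Both signals are exposed on `ARFaceAnchor`; a sketch of reading them from a session-delegate callback:

```swift
import ARKit

// Read the new face-tracking data from an ARFaceAnchor, e.g. inside
// session(_:didUpdate:) of an ARSessionDelegate (sketch).
func process(faceAnchor: ARFaceAnchor) {
    // Gaze: a transform per eye, plus the point the eyes converge on.
    let leftEye = faceAnchor.leftEyeTransform
    let rightEye = faceAnchor.rightEyeTransform
    let lookAt = faceAnchor.lookAtPoint

    // Tongue: a blend-shape coefficient from 0 (in) to 1 (fully out),
    // the same signal Memoji uses.
    let tongueOut = faceAnchor.blendShapes[.tongueOut]?.floatValue ?? 0
    _ = (leftEye, rightEye, lookAt, tongueOut)
}
```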

Missing anything? Corrections? Contributions are welcome 😃


Written by

Federico Zanetello

iOS Engineer with strong passion for Swift, minimalism, and design. When he’s not busy automating things, he can be found writing at FIVE STARS and/or playing with the latest shiny toys.