ARKit allows developers to build high-detail AR experiences for iPad and iPhone. Environments captured through the device's camera can be augmented with animated 3D virtual text, objects and characters. AR scenes created by one individual can be made persistent, so that others visiting the same location later can see them.
ARKit was introduced along with iOS 11. Because ARKit requires iOS devices with an A9 or later processor, AR experiences can feature more detailed content and maintain better environmental awareness. With the iPhone X, ARKit can perform real-time face tracking and use this data to drive the facial expressions of 3D characters.
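As a rough sketch of how that face-driven animation works, the snippet below runs a face-tracking session and reads blend-shape coefficients from each tracked face; the view controller setup and the `applyJawOpen` hook into a character rig are assumptions, not part of ARKit itself.

```swift
import UIKit
import ARKit

// Minimal sketch: drive a character's expression from TrueDepth face data.
// ARFaceTrackingConfiguration requires a TrueDepth camera (iPhone X or later).
class FaceDrivenViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever tracked anchors update, typically every frame.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes maps expression features (jawOpen, eyeBlinkLeft, ...)
            // to coefficients in 0...1 that can be applied to a rigged model.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            // applyJawOpen(jawOpen) -- hypothetical hook into your character rig
            _ = jawOpen
        }
    }
}
```

The same per-frame coefficients Apple uses for Animoji are exposed here, so any rigged 3D character can mirror the user's expression.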
Using the iOS device's camera, accelerometer and gyroscope, ARKit maps the environment as the device is moved. Fusing the inertial sensor data with data from the camera (visual-inertial odometry) allows for highly accurate position tracking and mapping. The software picks out visual feature points in the environment, detects surfaces such as planes, and tracks motion in conjunction with information from the inertial sensors. The camera is also used to estimate light sources so that AR objects can be lit consistently with their surroundings. Apple's solution to the growing detail, and therefore memory usage, is a sliding map in which old data is discarded as new data arrives. Users can place anchors to mark creations they want to save.
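The pieces described above can be sketched with ARKit's public API: a world-tracking session with plane detection and light estimation enabled, plus a user-placed anchor. The hard-coded transform stands in for a real hit-test result and is purely illustrative.

```swift
import ARKit

// Minimal sketch: world tracking with horizontal plane detection,
// a read of the current light estimate, and a user-placed anchor.
let session = ARSession()

let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]   // detect floors, tabletops, etc.
config.isLightEstimationEnabled = true  // estimate scene lighting from the camera
session.run(config)

// Later, e.g. on a tap: read the lighting and drop an anchor at a pose.
// In a real app the transform would come from a hit test against a plane.
if let frame = session.currentFrame {
    if let light = frame.lightEstimate {
        // Ambient intensity in lumens; around 1000 is a well-lit scene.
        print("Ambient intensity:", light.ambientIntensity)
    }
    var transform = matrix_identity_float4x4
    transform.columns.3.z = -0.5        // half a metre in front of the camera
    session.add(anchor: ARAnchor(transform: transform))
}
```

Anchors registered this way are tracked relative to the world map, which is how marked creations stay fixed in place as the map slides.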
According to industry speculation, ARKit may build on Apple's 2015 acquisition of the AR company Metaio, which had already demonstrated solid AR technology and a well-received API.