iOS 11 introduces ARKit, a new framework that allows you to easily create unparalleled augmented reality experiences for iPhone and iPad. By blending digital objects and information with the environment around you, ARKit takes apps beyond the screen, freeing them to interact with the real world in entirely new ways.
This year at WWDC 2017, Apple announced two big frameworks that take iPhone and iPad application development to the next level: Core ML and ARKit.
I explained Core ML and machine learning in iOS in our previous blog post. Now let's look at Apple's second big achievement, in the field of augmented reality.
What is Augmented Reality?
Augmented reality is the integration of digital information with the user’s environment in real time. Unlike virtual reality, which creates a totally artificial environment, augmented reality uses the existing environment and overlays new information on top of it.
ARKit [Augmented Reality]
- ARKit is a framework that uses your device's camera and motion sensors to produce augmented reality experiences in your application.
- ARKit adds 2D or 3D content to the real-world view captured by the device camera.
- ARKit combines device motion tracking, camera scene capture, advanced scene processing and display conveniences to simplify the task of building an AR experience.
- ARKit runs only on devices with an A9 or later chip.
Requirements for ARKit
The basic requirement for any AR experience—and the defining feature of ARKit—is the ability to create and track a correspondence between the real-world space the user inhabits and a virtual space where you can model visual content.
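Creating that correspondence starts with running an AR session. Below is a minimal sketch of a view controller that drives an ARSCNView (the class and outlet names are illustrative, and this only runs on a supported iOS device):

```swift
import ARKit

class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // A world-tracking configuration maps the device's real-world
        // position and orientation into the virtual scene's coordinate space.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause tracking when the view is not on screen.
        sceneView.session.pause()
    }
}
```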
Steps for getting an augmented reality experience in your application –
Tracking – Matching the real world using visual-inertial odometry
ARKit uses the visual-inertial odometry (VIO) method to create the correspondence between real-world and virtual spaces. This method combines data from the device's motion sensors with computer vision analysis of the scene from the device camera. ARKit recognizes notable features in the scene, tracks how those features move across video frames, and fuses that information with motion sensor data to produce high-precision information about the device's position and motion.
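The tracking output is delivered with every frame. As a sketch, an ARSessionDelegate can read the device's world-space position from the camera transform (device-only code; the delegate class name is illustrative):

```swift
import ARKit

class TrackingDelegate: NSObject, ARSessionDelegate {
    // Called for every processed video frame (up to ~60 fps).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The camera transform encodes the device's position and
        // orientation in world space, as estimated by VIO.
        let position = frame.camera.transform.columns.3
        print("Device position: \(position.x), \(position.y), \(position.z)")
    }
}
```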
Scene understanding – Plane detection, Hit testing and light estimation
Plane Detection – By using ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit can detect horizontal planes such as tables and floors, and can also track and place objects on smaller feature points.
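Plane detection is switched on through the session configuration, and detected planes are reported as anchors. A minimal sketch (the delegate class name is illustrative):

```swift
import ARKit

class PlaneDetectionDelegate: NSObject, ARSCNViewDelegate {
    // ARKit adds an ARPlaneAnchor for each horizontal surface it detects.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Plane found – center: \(planeAnchor.center), extent: \(planeAnchor.extent)")
    }
}

// Detection is enabled on the configuration before running the session:
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
```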
Hit testing – Using hit testing, we can find the real-world surface corresponding to a point in the camera image.
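For example, a tap on the screen can be hit-tested against the detected planes. A sketch, assuming sceneView is the app's ARSCNView and the gesture recognizer is already wired up:

```swift
import ARKit

@objc func handleTap(_ gesture: UITapGestureRecognizer) {
    let location = gesture.location(in: sceneView)
    // Look for an existing detected plane under the touch point.
    let results = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
    if let result = results.first {
        // worldTransform holds the real-world position of the surface point.
        let position = result.worldTransform.columns.3
        print("Surface at \(position.x), \(position.y), \(position.z)")
    }
}
```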
Light estimation – ARKit also makes use of the camera sensor to estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects.
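Each frame carries a light estimate that can be applied to the virtual scene. A sketch of one common approach (the delegate class name and the divide-by-1000 scaling, where roughly 1000 lumens is treated as neutral lighting, are assumptions):

```swift
import ARKit

class LightingDelegate: NSObject, ARSessionDelegate {
    weak var sceneView: ARSCNView?

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        // ambientIntensity is in lumens; scale it to the scene's
        // lighting environment so virtual objects match the room.
        sceneView?.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000.0
    }
}
```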
Rendering – Easy integration, ARSCNView and custom rendering
After tracking and scene understanding are complete, we can place virtual elements in the real-world scene using ARKit.
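Tying the pieces together, a hit-test result can anchor a virtual object on a real surface. A sketch, assuming sceneView is the app's ARSCNView:

```swift
import ARKit

// Place a 10 cm cube on a real-world surface found by a hit test.
func placeCube(at result: ARHitTestResult) {
    let cube = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
    let node = SCNNode(geometry: cube)
    let t = result.worldTransform.columns.3
    // Offset by half the cube's height so it rests on the surface.
    node.position = SCNVector3(t.x, t.y + 0.05, t.z)
    sceneView.scene.rootNode.addChildNode(node)
}
```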
Core Motion
Core Motion is a framework in iOS that reports motion- and environment-related data from the onboard hardware of iOS devices, including the accelerometers, gyroscopes, pedometers, magnetometers, and barometers. You use this framework to access hardware-generated data in your app.
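As a sketch of the kind of sensor data ARKit builds on, here is Core Motion delivering fused device-motion updates (device-only code):

```swift
import CoreMotion

let motionManager = CMMotionManager()
if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion else { return }
        // Fused accelerometer + gyroscope data: device attitude in radians.
        let attitude = motion.attitude
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```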
AVFoundation
The AVFoundation framework combines four major technology areas that together encompass a wide range of tasks for capturing, processing, synthesizing, controlling, importing and exporting audiovisual media on Apple platforms.
Note – To make your app available only on devices that support ARKit, use the arkit key in the UIRequiredDeviceCapabilities section of your app's Info.plist. If augmented reality is a secondary feature of your app, use the isSupported property to determine whether the current device supports the session configuration you want to use.
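For the secondary-feature case, the runtime check looks like this sketch (sceneView is assumed to be the app's ARSCNView):

```swift
import ARKit

// isSupported is false on devices older than the A9 chip.
if ARWorldTrackingConfiguration.isSupported {
    let configuration = ARWorldTrackingConfiguration()
    sceneView.session.run(configuration)
} else {
    // Fall back to a non-AR experience on older devices.
}
```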
Conclusion
This was a small introduction to ARKit. I hope you now have a basic understanding of this framework. We will look in detail at the ARKit framework, visual-inertial odometry, and high-performance rendering optimizations in a future blog post.