Google on Tuesday unveiled ARCore – a new software development kit that aims to bring augmented reality to millions of existing and future Android phones. No dual cameras or depth sensors required! But wait, what about Tango?
Google launched Tango in 2014, and it remains one of the most advanced AR platforms for mobile devices. But Tango demands special hardware to work, and is thus available on only a handful of devices with little mainstream appeal. Seeing how Apple managed to bring AR to the iPhone in a much simpler way with its ARKit framework, Google, too, has decided to launch its own simplified AR software framework, one that will be available on a much, much broader range of devices than Tango.
But enough about Tango, let's see what makes ARCore special.
Much like ARKit, ARCore is made to function on devices without specialized depth-sensing sensors. As such, it is not perfect by any means, but it is widely available and easily accessible. It currently runs on the Google Pixel and the Samsung Galaxy S8, but by the end of this year, Google promises to have ARCore running on more than 100 million Android devices worldwide.
ARCore is a versatile SDK that works with Java/OpenGL, Unity, and Unreal Engine, and focuses on three things to achieve its goal of delivering the AR experience to a broad audience:
- Motion tracking: Using the phone’s camera to track feature points in the room, combined with IMU sensor data, ARCore determines both the position and orientation (pose) of the phone as it moves, so virtual objects remain accurately placed.
- Environmental understanding: It is common for AR objects to be placed on a floor or a table. ARCore can detect horizontal surfaces using the same feature points it uses for motion tracking.
- Light estimation: ARCore observes the ambient light in the environment and makes it possible for developers to light virtual objects in ways that match their surroundings, making their appearance even more realistic.
Motion tracking and environmental understanding are two of the most basic prerequisites for an adequate AR experience – objects in AR have to "stick" to surfaces (that is, stay where they are in relation to real objects) and change perspective in relation to the user's point of view. Light estimation, however, is one of the more interesting traits of ARCore: it makes objects in AR react to changes in environmental light in real time. This means that 3D objects can change exposure (i.e. become dimmer or brighter) depending on how well-lit your environment is, and also cast dynamic shadows in different directions. Pretty basic stuff, but it sure does look cool, and it adds to the overall experience.
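To give a rough idea of how developers consume this: ARCore reports its ambient light estimate as a per-frame intensity scalar (via the Java API's LightEstimate), and a renderer can scale its material colors by that value so virtual objects dim and brighten with the room. The sketch below shows only that scaling step in plain Java – the class and helper names are ours, not part of the SDK, and a real app would feed in the value ARCore returns each frame.

```java
// Minimal sketch of ambient-light-based shading, assuming the renderer
// receives a scalar intensity roughly in [0, 1] each frame (as ARCore's
// light estimation provides). Not an ARCore API; just the idea behind it.
public class LightEstimateDemo {

    // Scale an RGB color by the estimated ambient intensity,
    // clamping each channel to the valid [0, 1] range.
    static float[] applyAmbient(float[] rgb, float pixelIntensity) {
        float[] out = new float[3];
        for (int i = 0; i < 3; i++) {
            out[i] = Math.min(1f, Math.max(0f, rgb[i] * pixelIntensity));
        }
        return out;
    }

    public static void main(String[] args) {
        float[] white = {1f, 1f, 1f};
        // In a dim room the estimate is low, so the object renders darker.
        float[] dim = applyAmbient(white, 0.4f);
        System.out.printf("%.2f %.2f %.2f%n", dim[0], dim[1], dim[2]);
    }
}
```

In an actual ARCore app, the intensity would be read from the current frame's light estimate on every update and pushed to the shader as a uniform, which is what makes the exposure changes happen in real time.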
With ARCore, Google is not trying to compete with Tango, its own dedicated AR platform, but is rather aiming to bring augmented reality to the hands of millions and to make development for the new platform a breeze. ARCore launches on two popular devices, the Google Pixel and the Galaxy S8, and it doesn't require any special hardware to run. Google also says it's working with all major Android phone manufacturers to bring ARCore to existing and future devices. Will Google succeed in making this platform ubiquitous in the long run? That remains to be seen.
In the meantime, check out this sweet ARCore teaser that Google just released: