Live View takes advantage of your phone's camera and Google Maps' vast database to deliver a unique augmented reality (AR) navigation experience that layers the Maps interface over a real-time feed from the camera. It was demoed last year and has been available to Google Local Guides level 5 and up, as well as Pixel phone owners, but it's about to see a broader release later this month, TechCrunch reports.
To use Google Maps Live View, you'll need an iPhone that supports ARKit or an Android device that supports ARCore. Live View will also only work in regions where Street View is available.
To use Live View, you'll need to tap the "Start AR" button and hold up your phone like a viewfinder. This brings up a screen that's quite reminiscent of a video game: you'll see the real world through the camera, but with virtual elements (like arrows and signs) layered over your surroundings.
To pinpoint your coordinates, Maps combines your phone's rough GPS location with Street View's vast image database to determine exactly where you are. To make this possible, the app first takes a series of images with known locations and analyzes them for key visual features (like the outlines of buildings and bridges), building a large-scale index of these features. Then, to localize you, the system compares features in that index against imagery from the live camera feed. Using machine learning, it can distinguish transient features, such as dynamic lighting and construction work, from the permanent features of the scene.
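The matching step described above can be illustrated with a minimal sketch: a pre-built index maps known locations to feature vectors, and the live camera's features are compared against each entry to find the best match. Everything here (the coordinates, the feature vectors, the `similarity` and `localize` helpers) is hypothetical and purely illustrative; Google's actual visual positioning system is far more sophisticated and not public.

```python
from math import sqrt

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Offline step (illustrative): an index mapping known coordinates to the
# visual features extracted from imagery taken at those locations.
feature_index = {
    (40.7580, -73.9855): [0.9, 0.1, 0.4, 0.7],
    (40.7484, -73.9857): [0.2, 0.8, 0.6, 0.1],
    (40.7061, -73.9969): [0.5, 0.5, 0.9, 0.3],
}

def localize(camera_features, index):
    """Return the indexed location whose features best match the live feed."""
    return max(index, key=lambda loc: similarity(index[loc], camera_features))

# Online step: features from the live camera feed. In the real system, the
# ML model described above would first filter out transient features such
# as changing light or construction before this comparison happens.
live_features = [0.88, 0.12, 0.42, 0.69]
best = localize(live_features, feature_index)
print(best)  # the first entry, whose features most closely match the feed
```

In practice the rough GPS fix would narrow the search to nearby index entries first, so the expensive visual comparison only runs against a small candidate set.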
Google warns that using Live View in Maps will lead to slightly increased battery and mobile data consumption. The feature is expected to roll out in beta on Android and iOS in the coming weeks.