GDD (Google Developer Days) events don't generate as much hype as the Google I/O keynote, but they usually give us an insight into what to expect from the company in the near future. The latest GDD was held in Krakow, Poland on September 5-6, and although the event has ended, the search engine giant has shared a lot of content from the exhibition on YouTube.
If you browse the Google Developers channel on the video-sharing website, you'll find more than 30 clips from the event uploaded over the last couple of days. The hour-long day 2 keynote is what mobile fans will find most interesting, as it includes demonstrations of new features coming to Google Assistant, some through its integration with Google Lens and Google Translate.
If you have some spare time, feel free to watch the video at the top of this article; if not, the time-stamped summary we prepared below will give you an overview of the highlights.
GDD Europe 2017 Day 2 keynote highlights
Answering complex questions
- 38:23 - Behshad Behzadi, the presenter, demonstrates Google Assistant's current question-answering capabilities. The smart companion uses machine learning to parse questions phrased in natural human language, such as "Can you please tell me what's the weather going to be like in Krakow tomorrow?". The Assistant can also handle more specific queries, such as listing the rides in an amusement park together with their height requirements.
- 41:10 - Google Assistant will be able to answer even more complex inquiries in the near future. Behshad used "What is the name of the movie where Tom Cruise acts in it and he plays pool and while he plays pool he dances" as an example. After processing the query, Google Assistant returned "The Color of Money" and provided a short summary of the movie.
Integration with Google Translate
- 42:10 - Google Assistant will sport a brand-new translation mode through integration with Google Translate. Saying "Be my [insert language here] translator" will prompt the service to repeat anything you say in the language of your choice, both audibly and as text on your device's display, until you instruct the Assistant to stop.
Setting up preferences
- 46:28 - You can already store preferences in Google Assistant, such as your home address or favorite sports team, and this feature will see further improvements in 2018. The presenter demonstrated this by telling the service that he can swim in Lake Zurich if the outside temperature is above 25 °C. When he then asked whether he could swim in the lake next weekend, the Assistant replied with a "no" because the weather forecast predicted the temperature wouldn't be that high.
Putting things into context
- 49:21 - Google Assistant will also be able to better understand your questions based on context from previous queries. Behshad asked the Assistant to show him pictures of Thomas, a seemingly ambiguous request given that no prior context was provided, and the service pulled up pictures of Thomas the Tank Engine, the most popular result. He then requested the roster of FC Bayern Munich, whose key players include Thomas Müller. When Behshad repeated the initial question, the Assistant displayed pictures of the Bayern Munich player instead, based on his previous searches.
Integration with Google Lens
- 51:51 - Finally, we get a glimpse of how Google Assistant will be enhanced through its integration with Google Lens. These features were announced at Google I/O, but the demo here is quite intriguing: you can see Google Lens estimating the caloric content of different foods and performing currency conversion.