Top brands integrate Google Assistant with new tools and features for Android apps and Smart Displays
Posted by Baris Gultekin and Payam Shodjai, Directors of Product Management
Top brands turn to Google Assistant every day to help their users get things done on their phones and on Smart Displays — such as playing games, finding recipes or checking investments, just by using their voice. In fact, over the last year, the number of Actions completed by third-party developers has more than doubled.
We want to support our developer ecosystem as they continue building the best experiences for smart displays and Android phones. That’s why today at Google Assistant Developer Day, we introduced:
- New App Actions built-in intents — to help Android developers easily integrate Google Assistant with their apps,
- New discovery features, such as suggestions and shortcuts — to help users easily discover and engage with Android apps,
- New developer tools and features, such as a testing API, new voices, and frameworks for game development — to help build high quality native experiences for smart displays,
- New discovery and monetization improvements — to help users discover and engage with developers’ experiences on Assistant.
Now, all Android Developers can bring Google Assistant to their apps
Now, every Android app developer can make it easier for their users to find what they’re looking for by fast-forwarding them into the app’s key functionality using just their voice. With App Actions, top app developers such as Yahoo Mail, Fandango, and ColorNote are creating these natural and engaging experiences for users by mapping their users’ intents to specific functionality within their apps. Instead of having to navigate through each app to get tasks done, users can simply say “Hey Google” followed by the outcome they want, such as “find Motivation Mix on Spotify.”
Here are a few updates we’re introducing today to App Actions.
Quickly open and search within apps with common intents
Every day, people ask Google Assistant to open their favorite apps. Today, we are building on this functionality to open specific pages within apps and also search within apps. Starting today, you can use the GET_THING intent to search within apps and the OPEN_APP_FEATURE intent to open specific pages in apps; offering more ways to easily connect users to your app through Assistant.
Many top brands such as eBay and Kroger are already using these intents. If you have the eBay app on your Android phone, try saying “Hey Google, find baseball cards on eBay” to try the GET_THING intent.
If you have the Kroger app on your Android phone, try saying “Hey Google, open Kroger pay” to try the OPEN_APP_FEATURE intent.
It’s easy to add these common intents to your Android app: simply declare support for these capabilities in your actions.xml file to get started. For searching, you provide a deep link that allows Assistant to pass a search term into your app. For opening pages, you provide a deep link with the corresponding name for Assistant to match users’ requests.
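As a sketch of what that declaration can look like (the deep links, feature names, and identifiers below are placeholders, not values from any of the apps mentioned in this post), an actions.xml supporting both intents might be:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml, referenced from AndroidManifest.xml via a
     <meta-data android:name="com.google.android.actions" /> entry -->
<actions>
    <!-- GET_THING: Assistant passes the spoken search term into the
         {query} placeholder of the deep link -->
    <action intentName="actions.intent.GET_THING">
        <fulfillment urlTemplate="https://example.com/search{?query}">
            <parameter-mapping
                intentParameter="thing.name"
                urlParameter="query" />
        </fulfillment>
    </action>

    <!-- OPEN_APP_FEATURE: Assistant matches the requested feature name
         against an inventory of pages the app can open -->
    <action intentName="actions.intent.OPEN_APP_FEATURE">
        <fulfillment urlTemplate="example://feature{?featureId}">
            <parameter-mapping
                intentParameter="feature"
                urlParameter="featureId" />
        </fulfillment>
        <parameter name="feature">
            <entity-set-reference entitySetId="FeatureEntitySet" />
        </parameter>
    </action>
    <entity-set entitySetId="FeatureEntitySet">
        <!-- e.g. a spoken "pay" resolves to the identifier PAY -->
        <entity name="@string/feature_pay" identifier="PAY" />
    </entity-set>
</actions>
```

When Assistant matches a query to one of these intents, it fires the resulting deep link, and your app handles it like any other incoming link.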
Vertical-specific built-in intents
For a deeper integration, we offer vertical-specific built-in intents (BIIs) that let Google take care of all the Natural Language Understanding (NLU) so you don’t have to. We first piloted App Actions in some of the most popular app verticals, such as Finance, Ridesharing, Food Ordering, and Fitness. Today, we are announcing that we have grown our catalog to cover more than 60 intents across 10 verticals, adding new categories like Social, Games, Travel & Local, Productivity, Shopping, and Communications.
For example, Twitter and Wayfair have already implemented these built-in intents. So, if you have the Twitter app on your Android phone, try saying “Hey Google, post a Tweet” to see a Social vertical BII in action.
If you have the Wayfair app on your Android phone, try saying “Hey Google, buy accent chairs on Wayfair” to see a Shopping vertical BII in action.
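Declaring a vertical-specific BII works the same way as the common intents above: you pick the intent from the catalog and map its parameters to a deep link, while Google handles the NLU. A sketch using a Fitness-vertical BII (the deep link and URL parameter name here are illustrative placeholders):

```xml
<!-- res/xml/actions.xml -->
<actions>
    <!-- "Hey Google, start a run on ExampleApp" — Google parses the
         exercise name from the query and passes it into the deep link -->
    <action intentName="actions.intent.START_EXERCISE">
        <fulfillment urlTemplate="example://exercise{?exerciseType}">
            <parameter-mapping
                intentParameter="exercise.name"
                urlParameter="exerciseType" />
        </fulfillment>
    </action>
</actions>
```

The key difference from a common intent is that the BII’s parameters (here, `exercise.name`) are vertical-specific fields that Google extracts from the user’s phrasing for you.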
Custom Intents to highlight unique app experiences
Every app is unique, with its own features and capabilities that may not match the list of available App Actions built-in intents. For cases where there isn’t a built-in intent for your app functionality, you can instead create a custom intent. Like BIIs, custom intents follow the actions.xml schema and act as connection points between Assistant and your defined fulfillments.
Snapchat and Walmart use custom intents to extend their app’s functionality to Google Assistant. For example, if you have the Snapchat app on your Android phone, just say, “Hey Google, send a Snap using the cartoon face lens” to try their Custom Intent.
Or, if you have the Walmart app on your Android phone, just say, “Hey Google, reserve a time slot with Walmart” to schedule your next grocery pickup.
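As a sketch of the schema (the intent name, query patterns resource, and deep link below are hypothetical, not Snapchat’s or Walmart’s actual configuration), a custom intent declaration might look like:

```xml
<!-- res/xml/actions.xml -->
<actions>
    <!-- Custom intent names live in the custom.actions.intent.* namespace.
         Because Google has no built-in NLU model for a custom intent,
         queryPatterns points to an <array> resource of example phrasings
         the user might say to trigger it -->
    <action
        intentName="custom.actions.intent.RESERVE_SLOT"
        queryPatterns="@array/ReserveSlotQueries">
        <fulfillment urlTemplate="example://reserve" />
    </action>
</actions>
```

The referenced array (e.g. in res/values/arrays.xml) would hold phrasings such as “reserve a time slot,” which Assistant matches against the user’s query before firing the deep link.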
With more common, built-in, and custom intents available, every Android developer can now enable their app to fulfill Assistant queries tailored to exactly what their app offers. Developers can also use familiar tools such as Android Studio, and with just a few days of work, easily integrate their Android apps with the Google Assistant.
Suggestions and shortcuts to improve discoverability
We are excited about these new improvements to App Actions, but we also understand that it’s equally important that people are able to discover your App Actions. We’re designing new touch points to help users easily learn about Android apps that support App Actions. For example, we’ll recommend relevant App Actions even when you don’t mention the app name explicitly, by showing suggestions. If you say broadly “Hey Google, show me Taylor Swift”, we’ll highlight a suggestion chip that guides you to open the search result in Twitter. Google Assistant will also suggest apps proactively, depending on individual app usage patterns.
Android users will also be able to customize their experience with app shortcuts, setting up quick phrases to automate the app functions they use most. For example, you can create a MyFitnessPal shortcut to easily track your calories throughout the day and customize the query to say what you want, such as “Hey Google, check my calories.”
By simply saying “Hey Google, shortcuts”, you can set up and explore suggested shortcuts in the settings screen. We’ll also make proactive suggestions for shortcuts throughout the Assistant mobile experience, tailored to how you use your phone.
Build high quality conversational Actions for Smart Displays
Back in June, we launched new developer tools such as Actions Builder and Actions SDK, making it easier to design and build conversational Actions on Assistant, like games, for Smart Displays. Many partners, such as Cool Games and Sony, have already been building with these. We’re excited to share new updates that enable developers to build more high-quality native Assistant experiences, with new game development frameworks and better testing tools, and that make user discovery of those experiences better than ever.
New developer tools and features
We’ve heard your feedback that you need better voices to match the quality of the experiences you’re delivering on the Assistant. We’ve released two new English voices that take advantage of an improved prosody model to make Assistant sound more natural. Give it a listen.
These voices are now available and you can leverage them in your existing Actions by simply making the change in the Actions Console.
Interactive Canvas expansion
We’re expanding Interactive Canvas to Actions in the education and storytelling verticals, in addition to games. Whether you’re building an Action that teaches someone to cook, explains the phases of the moon, helps a family member with grammar, or takes users through an interactive adventure, you’ll have access to the full visual power of Interactive Canvas.
Improved testing to deliver high quality experiences
The Actions Testing API is a new programmatic way to test your critical user journeys and ensure there aren’t any broken conversation paths. Using this framework, you can run end-to-end tests in an isolated preview environment, run regression tests, and add continuous testing to your arsenal. This API will be released to general availability soon.
New Dialogflow migration tool
For those of you who built experiences using Dialogflow, we want you to enjoy the benefits of the new platform without having to build from scratch. That’s why we’re offering a migration tool inside the Actions Console that automates much of the work to move projects to the improved platform.
New site for game developers
Game developers, we built a new resource hub just for you. Boost your game design expertise with full source code to games, design best practices, interviews with game developers, tools, and everything you need to create voice-enabled games for Smart Displays.
With more incredible experiences being built, we know it can be challenging to help users discover them and drive engagement. To make it easier for people to discover and engage with your experiences, we have invested in a slew of new discovery features:
New Built-in intents and the Learning Hub
We’ll soon be opening two new sets of built-in intents (BIIs) for public registration: Education and Storytelling. Registering your Actions for these intents allows users to discover them in a simple, natural way through general requests to Google Assistant. These new BIIs cover a range of intents in the Education and Storytelling domains and join Games as principal areas of investment for the developer ecosystem.
People will then be able to say “Hey Google, teach me something new” and be presented with a Learning Hub where they can browse different education experiences. For stories, users can simply say “Hey Google, tell me a story”. Developers will soon be able to register for both new BIIs to get their experiences listed in these browsable catalogs.
Household authentication tokens and improved transactions
One of the exciting things about the Smart Display is that it’s an inherently communal device. So if you’re offering an experience that is meant to be enjoyed collaboratively, you need a way to share state between household members and between multiple devices. Let’s say you’re working on a puzzle and your roommate wants to help with a few pieces on the Smart Display. We’re introducing household authentication tokens so all users in a home can now share these types of experiences. This feature will be available soon via the Actions console.
Finally, we’re making improvements to the transaction flow on Smart Displays. We want to make it easier for you to add seamless voice-based and display-based monetization capabilities to your experience. We’ve started by supporting voice-match as an option for payment authorization. And early next year, we’ll also launch an on-display CVC entry.
Simplifying account linking and authentication
Once you build personalized and premium experiences, you need to make it as easy as possible to connect with existing accounts. To help streamline this process, we’re opening two betas: Link with Google and App Flip, for improved account linking flows to allow simple, streamlined authentication via apps.
Link with Google enables users who are already logged in to your Android or iOS app to complete the linking flow in just a few taps, without needing to re-enter credentials.
App Flip helps you build a better mobile account linking experience and decrease drop-off rates. App Flip allows your users to seamlessly link their accounts to Google without having to re-enter their credentials.
In addition to launching new channels of discovery for developer Actions, we also want to provide more control over how you and your users reach your Actions. Action links have been a way to deep link to your conversational Action, used with great success by partners like Sushiro, Caixa, and Giallo Zafferano. Now we are reintroducing this feature as Assistant links, which enable partners such as TD Ameritrade to embed rich Google Assistant experiences in their websites, as well as deep link to their Google Assistant integrations from anywhere on the web.
We are very excited about all these announcements – both across App Actions and native Assistant development. Whether you are exploring new ways to engage your users using voice via App Actions, or looking to build something new to engage users at home via Smart Displays, we hope you will leverage these new tools and features and share your feedback with us.