Personalize user journeys by Pushing Dynamic Shortcuts to Assistant

Posted by Jessica Dene Earley-Cha, Developer Relations Engineer

Like many other people who use their smartphone to make their lives easier, I’m way more likely to use an app that adapts to my behavior and is customized to fit me. Android apps already support some personalization, like the ability to long-press an app icon to see a list of common user journeys. When I long-press my Audible app (an online audiobook and podcast service), it gives me a shortcut to the book I’m currently listening to; right now that is Daring Greatly by Brené Brown.

Now, imagine if these shortcuts could also be triggered by a voice command – and, when relevant to the user, show up in Google Assistant for easy use.

Wouldn’t that be lovely?

Dynamic shortcuts on a mobile device

Well, now you can do that with App Actions by pushing dynamic shortcuts to the Google Assistant. Let’s go over what Shortcuts are, what happens when you push dynamic shortcuts to Google Assistant, and how to do just that!

Android Shortcuts

As an Android developer, you’re most likely familiar with shortcuts. Shortcuts give your users the ability to jump into a specific part of your app. For cases where the destination in your app is based on individual user behavior, you can use a dynamic shortcut to jump to a specific thing the user was previously working with. For example, let’s consider a ToDo app, where users can create and maintain their ToDo lists. Since each item in a ToDo list is unique to each user, you can use dynamic shortcuts so that each user’s shortcuts reflect the items on their own list.

Below is a snippet of an Android dynamic shortcut for the fictional ToDo app.

val shortcut = ShortcutInfoCompat.Builder(context, task.id)
    .setShortLabel(task.title)
    .setLongLabel(task.title)
    .setIcon(IconCompat.createWithResource(context, R.drawable.icon_active_task))
    .setIntent(intent)
    .build()

ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)

Dynamic Shortcuts for App Actions

If you’re pushing dynamic shortcuts, it’s a short hop to make those same shortcuts available for use by Google Assistant. You can do that by adding the Google Shortcuts Integration library and a few lines of code.

To extend a dynamic shortcut to Google Assistant through App Actions, two Jetpack modules need to be added, and the dynamic shortcut needs to include .addCapabilityBinding.
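As a concrete sketch of those dependencies, the Gradle setup might look like the following. The artifact versions shown are assumptions based on what was current around this beta release; check for newer versions before using them.

```kotlin
// build.gradle.kts — dependency sketch; versions are assumptions.
dependencies {
    // Provides ShortcutInfoCompat and ShortcutManagerCompat.pushDynamicShortcut
    implementation("androidx.core:core:1.6.0")
    // Google Shortcuts Integration library (in beta at the time of writing)
    implementation("androidx.core:core-google-shortcuts:1.0.0-beta01")
}
```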

val shortcut = ShortcutInfoCompat.Builder(context, task.id)
    .setShortLabel(task.title)
    .setLongLabel(task.title)
    .setIcon(IconCompat.createWithResource(context, R.drawable.icon_active_task))
    .addCapabilityBinding("actions.intent.GET_THING", "thing.name", listOf(task.title))
    .setIntent(intent)
    .build()

ShortcutManagerCompat.pushDynamicShortcut(context, shortcut)

The addCapabilityBinding method binds the dynamic shortcut to a capability, which is a declared way a user can launch your app to the requested section. If you don’t already have App Actions implemented, you’ll need to add capabilities to your shortcuts.xml file. A capability expresses a relevant feature of an app and contains a Built-In Intent (BII). BIIs are a language model for voice commands that Assistant already understands, and linking a BII to a shortcut allows Assistant to use the shortcut as the fulfillment for a matching command. In other words, with capabilities in place, Assistant knows what to listen for and how to launch the app.

In the example above, addCapabilityBinding binds that dynamic shortcut to the actions.intent.GET_THING BII. When a user requests one of the items in their ToDo app, Assistant will process the request and trigger the capability with the GET_THING BII that is listed in the app’s shortcuts.xml.

<shortcuts xmlns:android="http://schemas.android.com/apk/res/android">
  <capability android:name="actions.intent.GET_THING">
    <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="YOUR_UNIQUE_APPLICATION_ID"
      android:targetClass="YOUR_TARGET_CLASS">
      <!-- Eg. name = the ToDo item -->
      <parameter
        android:name="thing.name"
        android:key="name"/>
    </intent>
  </capability>
</shortcuts>
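When Assistant triggers this capability, the matched thing.name value is delivered to the target Activity under the key declared above ("name"), e.g. via intent.getStringExtra("name") on Android. As a hedged sketch of what the app might then do (the Task class and lookup below are hypothetical, kept as plain Kotlin so they run off-device):

```kotlin
data class Task(val id: String, val title: String)

// Hypothetical lookup the target Activity could run after reading the
// matched "name" parameter from the incoming intent.
fun findTask(tasks: List<Task>, spokenName: String?): Task? =
    tasks.firstOrNull { it.title.equals(spokenName, ignoreCase = true) }

fun main() {
    val tasks = listOf(Task("1", "Buy milk"), Task("2", "Walk the dog"))
    println(findTask(tasks, "buy milk")?.id) // → 1
}
```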

So in summary, the process to add dynamic shortcuts looks like this:

1. Configure App Actions by adding the two Jetpack modules (the ShortcutManagerCompat library and the Google Shortcuts Integration library).
2. Associate the shortcut with a Built-In Intent (BII) in your shortcuts.xml file.
3. Push the dynamic shortcut from your app.

Two major things happen when you push your dynamic shortcuts to Assistant:

  1. Users can open dynamic shortcuts through Google Assistant, fast-tracking them to your content.
  2. During contextually relevant times, Assistant can proactively suggest your Android dynamic shortcuts to users, displaying them on Assistant-enabled surfaces.

Not too bad. I don’t know about you, but I like to test out new functionality in a small app first. You’re in luck! We recently launched a codelab that walks you through this whole process.

Dynamic Shortcuts Codelab

Looking for more resources to help improve your understanding of App Actions? We have a new learning pathway that walks you through the product, including the dynamic shortcuts that you just read about. Let us know what you think!

Thanks for reading! To share your thoughts or questions, join us on Reddit at r/GoogleAssistantDev.

Follow @ActionsOnGoogle on Twitter for more of our team’s updates, and tweet using #AppActions to share what you’re working on. Can’t wait to see what you build!

Google Smart Home Developer Summit – New tools and features to build, innovate, and grow with Google

Posted by Michele Turner, Senior Director of Product for Google’s Smart Home Ecosystem

Google Smart Home Developer Summit

Earlier this year at Google I/O, we told you that our goal is to make Google the best place for smart home developers like you to build, innovate, and grow. Today, I’m excited to show you all the new ways we’re improving the tools and opportunities you’ll have to build your best experiences with Google, by:

  • Expanding our platform and tools to make it easier for you to learn and build devices that do more with Google.
  • Providing a site where you can preview the new tools that are coming over the next year to help you build your devices, apps, and integrations.
  • Supporting Matter & Thread across our entire ecosystem, including Nest and Android.
  • Developing more automation capabilities, including the ability to build suggested routines for your users.
  • Helping you differentiate with Google and connect to more users.

(re) Introducing “Google Home”

Our journey as an ecosystem started five years ago with the Google Home speaker and Google Assistant. It has grown into a powerful platform, with support for new smart speakers and displays, Android, Nest, and the Google Home app. It also includes an ecosystem of tens of thousands of devices made by partners and developers like you, enabling Google users to engage with over 200 million devices, and making the smart home more than the sum of its parts.

We’re bringing all of this together, and announcing a new, but familiar, name for our entire smart home platform and developer program that helps users and developers do more with Google: Google Home. By bringing our platform and tools under the same roof, we can show you more simply why and how integrating your devices with Google Home makes them more accessible and helpful across the Google ecosystem.

New Google Home Developer Center

Launching early next year, the new Google Home Developer Center will have everything you need to learn and build smart home devices, applications, and automations with Google. It’s a total redesign of our developer site and console, focused on major upgrades to navigation and new self-serve tools for both developers and their business teams.

The developer center will have tools for each step of development, deployment, and analytics, including:

  • Building Matter devices
  • Customizing setup of your devices in Android and the Google Home app
  • Creating automations and routines
  • Building Android apps with Matter
  • Testing and certification
  • New tools for analytics & performance monitoring

image of Smart Home Developer Center

Quickly build and integrate with Matter

One of the most important new capabilities we’re bringing to our developers is the ability to quickly build and integrate Matter devices. We’re continuing to collaborate with other leading and innovative companies from across the industry to develop Matter — the new, universal, open smart home application protocol that makes it easy to build, buy, and set up smart home devices with any Matter ecosystem or app. We’re also adding Matter as a powerful new way to connect your devices to Google Home and your Android apps.

To make sure users are ready for your Matter devices, we’ll update Nest and Android devices with Matter support, following the launch of the new standards. That means when you build devices with Matter, they can be easily set up and controlled by millions of users everywhere they interact with Google, including Nest speakers and displays, the Google Assistant, and of course Android devices. To make sure you’re ready to build your best Matter-enabled experiences with Google, we’re adding support for Matter in the Google Home Developer Center, and rolling out new tools for Matter development across Google Home and Android, including two new SDKs.

New Google Home Device SDK for Matter devices

The first is the Google Home Device SDK — the fastest way to develop Matter devices, enabling seamless setup, control, and interoperability.

The open source Matter specification and SDK will ensure everyone is starting from the same code base. But building innovative, quality experiences goes beyond sharing the same connectivity protocol. The Google Home Device SDK complements the open-source libraries and simplifies building Matter devices to work seamlessly with Google, including configuring your device with Assistant, improving quality with logging, and adding tools to interact and test with Google devices. This helps you build a more responsive, reliable end-to-end experience for users. We’ll also be adding new capabilities that allow you to innovate with the SDK.

To make your development even easier, we’re also delivering the Google Home IDE. For developers using Visual Studio Code to build smart home devices, installing the new Google Home IDE lets you leverage our tools in that environment, complementing your existing extensions and tools in this popular editor.

Visual studio code

Native Android Support via Google Play Services and a new Google Home Mobile SDK

Mobile devices are an important smart home tool for users, and are critical to how users set up, manage, and control the devices in their home. To make app experiences more seamless, and help your users experience the magic of your device as quickly as possible, we’re building Matter directly into Android, and announcing support for Matter through Google Play services.

One of the key benefits this enables is seamless Matter device setup flows in Android, letting users connect new Matter devices over WiFi and Thread as easily as a new pair of headphones. You’ll be able to customize that setup flow with branding and device descriptions. With just a few taps, users will be able to link your devices to your app, the Google Home app, and other Matter apps they’ve installed. Of course, when users connect your device to Google, it automatically shows up in the Google Home app, on Android controls for smart home devices, and is controllable with the Google Assistant on Android, without additional development.

We’re also creating new tools to accelerate your development with the Google Play services Matter APIs, using the new Google Home Mobile SDK. Building a Matter-native app on Android lets users link their smart home devices to your app during the setup process, or later in their journey, with a few easy taps – with no need for account linking.

We’re already well underway building Matter integrations with many of the leaders in the smart home industry, and helping their Matter devices do more with Google, with many more to follow.

Inspire engagement with Suggested Routines

Whether via Matter or existing integration paths, being able to easily and reliably connect your devices to Google helps users build their smart homes. For developers, automations allow users to do more with your devices.

We want to help you easily combine them with other devices into coordinated routines, and to use context and triggers to increase their usefulness and engagement with the help of Google’s intelligence. So in our new Developer Center, we’ll enable you to create your own suggested routines that users can easily discover directly in the Google Home app. Your routines can carry your brand, suggest new ways for users to engage with your devices, and enhance them by coordinating them with other devices and context signals in the home.

Do more with Google Home

This is just the start of new ways we’re enabling your devices and brands to do more with Google Home. We know that for device makers, compatibility with Google Home is an important way to engage your users. But you want to make sure that your brand, products, and innovations are front and center with your users, to help them get the most from the experiences you’ve built.

That’s why all of the new tools we’re building help you to go beyond just compatibility with Google Home — and empower you to build your best, most engaging experiences with Google.

  • Customizable setup flows built into Android and Google Home that let your users experience the magic of your device with just a few taps right out of the box.
  • Native Matter apps on Android your users can discover and connect to in one streamlined setup flow.
  • Suggested routines to help your users do more with your devices.
  • New ways for users to discover and use your devices’ capabilities within the Google Home app.
  • The new Google Home Developer Center that brings developer and marketing tools together in one place, to help you and your team quickly bring all this to market.

Building for Matter

Support user growth and discovery

Of course, when you’ve built those great experiences, you want to tell everyone about them! For users that haven’t discovered your devices yet, we’re leveraging the power of Google to help users learn about your devices, and bring them home.

Earlier this year, we launched our new smart home directory on web and mobile that has seen great user engagement. This new site gives consumers an easy-to-use resource for discovering smart devices compatible with Google, and the experiences they can create with them, whether with a single device or using multiple devices together with automations and routines. We’re continuing to expand the site with more use cases, addressing the needs of both beginners and more sophisticated users looking to grow their smart homes and get more out of them.

We’ll have more to share with you over the coming months! Visit developers.google.com/home to read more about our announcements today and sign up for updates. We can’t wait to see what you build!

You’re invited to the Google Smart Home Developer Summit

Posted by Toni Klopfenstein, Developer Relations Engineer

Google Smart Home Developer Summit

Today there are over 276 million smart home households globally, and the industry continues to see rapid growth every year. Users have never been more comfortable bringing home new smart home devices — but they also continue to expect more from their devices, and their smart homes. To meet and exceed these expectations, we want to make sure developers have the tools and support to build their best experience across the Google Home app, Nest, Android, and Assistant.

That’s why we’re excited to announce the return of the Google Smart Home Developer Summit on October 21, 2021! This year’s event is free to join, fully virtual, and will be hosted on our website with broadcast times available for our developer communities in the AMER, EMEA, and APAC regions.

To kick things off, Michele Turner, Senior Director of Product for Google’s Smart Home Ecosystem, will share our vision for the home and preview upcoming tools and features to build your next devices and apps using Matter and Thread — technologies transforming the industry. This will be followed by a developer keynote to dig deeper into announcements, and a round of technical sessions, workshops, and more, hosted by Google’s smart home leaders.

Building the best smart home platform means using trusted technology and intelligence to develop your integrations faster, provide tools to drive your innovation, and allow you new paths to growth. We can’t wait to engage with you and share more about how we can lead and grow the smart home together.

You can register for the Google Smart Home Developer Summit 2021 here, and follow along with the event using the tag #GoogleHomeSummit on social media. We hope to see you there!

Assistant Recap Google I/O 2021

Written by: Jessica Dene Earley-Cha, Mike Bifulco and Toni Klopfenstein, Developer Relations Engineers for Google Assistant

Now that we’ve packed up all of the virtual stages from Google I/O 2021, let’s take a look at some of the highlights and new product announcements for App Actions, Conversational Actions, and Smart Home Actions. We also held a number of amazing live events and meetups that happened during I/O – which we’ll summarize as well.

App Actions

App Actions allows developers to extend their Android app to Google Assistant. For our Android developers, we are happy to announce that App Actions is now part of the Android framework. With the introduction of the beta shortcuts.xml configuration resource and our latest Google Assistant plugin for Android Studio, App Actions is moving closer to the Android platform.

Capabilities

Capabilities is a new Android framework API that allows you to declare the types of actions users can take to launch your app and jump directly to performing a specific task. Assistant provides the first available concrete implementation of the capabilities API. You can utilize capabilities by creating shortcuts.xml resources and defining your capabilities there. Each capability specifies two things: how it’s triggered and what to do when it’s triggered. To add a capability, use Built-In Intents (BIIs), which are pre-built intents that provide all the Natural Language Understanding needed to map the user’s input to individual fields. When a BII is matched by the user’s speech, your capability will trigger an Android intent that delivers the understood BII fields to your app, so you can determine what to show in response.
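Concretely, when a capability’s intent uses a deep-link URL template, the understood BII fields typically arrive as query parameters on the intent’s data URI. A minimal, hypothetical sketch of that extraction follows; it uses java.net.URI in place of android.net.Uri so it runs off-device, and the scheme and parameter name are made up for illustration:

```kotlin
import java.net.URI
import java.net.URLDecoder

// Hypothetical deep link such as "example://tasks?name=Buy%20milk";
// on Android you would read intent.data with android.net.Uri instead.
fun queryParam(deepLink: String, key: String): String? =
    URI(deepLink).rawQuery
        ?.split("&")
        ?.map { it.split("=", limit = 2) }
        ?.firstOrNull { it[0] == key }
        ?.getOrNull(1)
        ?.let { URLDecoder.decode(it, "UTF-8") }

fun main() {
    println(queryParam("example://tasks?name=Buy%20milk", "name")) // → Buy milk
}
```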

This framework integration is in the Beta release stage, and will eventually replace the original implementation of App Actions that uses actions.xml. If your app provides both the new shortcuts.xml and old actions.xml, the latter will be disregarded.

Voice shortcuts for Discovery

Google Assistant suggests relevant shortcuts to users and has made it easier for users to discover and add shortcuts by saying “Hey Google, shortcuts.”

Image of Google Assistant voice shortcuts

You can use the Google Shortcuts Integration library, currently in beta, to push an unlimited number of dynamic shortcuts to Google, making your shortcuts visible to users as voice shortcuts. Assistant can then suggest relevant shortcuts, making it more convenient for users to interact with your Android app.

gif of In App Promo SDK

In-App Promo SDK

Not only can Assistant suggest shortcuts; with the In-App Promo SDK, now in beta, you can proactively suggest shortcuts in your app for actions the user can repeat with a voice command to Assistant. The SDK allows you to check whether the shortcut you want to suggest already exists for that user, and to prompt the user to create the suggested shortcut if it doesn’t.

Google Assistant plugin for Android Studio

To support testing capabilities, the Google Assistant plugin for Android Studio was launched. It contains an updated App Action Test Tool that creates a preview of your App Action, so you can test an integration before publishing it to the Play Store.

New App Actions resources

Learn more with new or updated content:

Conversational Actions

During the What’s New in Google Assistant keynote, Director of Product for the Google Assistant Developer Platform Rebecca Nathenson mentioned several coming updates and changes for Conversational Actions.

Updates to Interactive Canvas

Over the coming weeks, we’ll introduce new functionality to Interactive Canvas. Canvas developers will be able to manage intent fulfillment client-side, removing the need for intermediary webhooks in some cases. For use cases which require server-side fulfillment, like transactions and account linking, developers will be able to opt-in to server-side fulfillment as needed.

We’re also introducing a new function, outputTts(), which allows you to trigger Text to Speech client-side. This should help reduce latency for end users.

Additionally, there will be updates to the APIs available to get and set storage for both the home and individual users, allowing for client-side storage of user information. You’ll be able to persist user information within your web app, which was previously only available for access by webhook.

These new features for Interactive Canvas will be made available soon as part of a developer preview for Conversational Actions Developers. For more details on these new features, check out the preview page.

Updates to Transaction UX for Smart Displays

Also coming soon to Conversational Actions – we’re updating the workflow for completing transactions, allowing users to complete transactions from their smart screens, by confirming the CVC code from their chosen payment method. Watch our demo video showing new transaction features on smart devices to get a feel for these changes.

Tips on Launching your Conversational Action

Make sure to catch our technical session Driving a successful launch for Conversational Actions to learn about some strategies for putting together a marketing team and go-to-market plan for releasing your Conversational Action.

AMA: Games on Google Assistant

If you’re interested in building Games for Google Assistant with Conversational Actions, you should check out the recording of our AMA, where Googlers answered questions from I/O attendees about designing, building, and launching games.

Smart Home Actions

The What’s new in Smart Home keynote covered several updates for Smart Home Actions. Following our continued emphasis on quality smart home integrations with the updated policy launch, we added new features to help you build engaging, reliable Actions for your users.

Test Suite and Analytics

The updated Test Suite for Smart Home now supports automatic testing, without the use of TTS. Additionally, the Analytics dashboards have been expanded with more detailed logs and in-depth error reporting to help you more quickly identify any potential issues with your Action. For a deeper dive into these enhancements, try out the Debugging the Smart Home workshop. There are also two new debugging codelabs to help you get more familiar with using these tools to improve the quality of your Action.

Notifications

We expanded support for proactive notifications to include the device traits RunCycle and SensorState, so users can now be proactively notified of multiple different device events. We also announced the release of follow-up responses, which enable your smart devices to notify users asynchronously when device changes succeed or fail.

WebRTC

We added support for WebRTC to the CameraStream trait. Smart camera users can now benefit from lower latency and half-duplex talk between devices. As mentioned in the keynote, we will also be making updates to the other currently supported protocols for smart cameras.

Bluetooth Seamless Setup

To improve the on-boarding experience, developers can now enable BLE (bluetooth low energy) for device onboarding with Bluetooth Seamless Setup. Google Home and Nest devices can act as local hubs to provision and register nearby devices for any Action configured with local fulfillment.

Matter

Project CHIP has officially rebranded as Matter. Once the IP-based connectivity protocol officially launches, we will be supporting devices running the protocol. Watch the Getting started with Project CHIP tech session to learn more.

Ecosystem and Community

The women building voice AI and their role in the voice revolution

Voice AI is fundamentally changing how we interact with technology, and its future will be a product of the people who build it. Watch this session to hear about the talented women shaping the Voice AI field, including an interview with Lilian Rincon, Sr. Director of Product Management at Google. The session also covers strategies for achieving equal gender representation in Voice AI, an ambitious but essential goal.

AMA: How the Assistant Investment Program can help fund your startup

This “Ask Me Anything” session was hosted by the all-star team who runs the Google for Startups Accelerator: Voice AI. The team fielded questions from startups and investors around the world who are interested in building businesses based on voice technology. Check out the recording of this event here. The day after the AMA session, the 2021 cohort for the Voice AI accelerator had their demo day – you can catch the recording of their presentations here.

Image from the AMA titled: How the Assistant Investment Program can help fund your startup

Women in Voice Meetup

We connected with amazing women in Voice AI and discussed ways allies can help women in Voice be more successful while building a more inclusive ecosystem. The meetup was hosted by Leslie Garcia-Amaya, Jessica Dene Earley-Cha, Karina Alarcon, Mike Bifulco, Cathy Pearl, Toni Klopfenstein, Shikha Kapoor, and Walquiria Saad.

Smart home developer Meetups

One of the perks of I/O being virtual this year was the ability to connect with students, hobbyists, and developers around the globe to discuss the current state of Smart Home, as well as some of the upcoming features. We hosted 3 meetups for the APAC, Americas, and EMEA regions and gathered some great feedback from the community.

Assistant Google Developers Experts Meetup

Every year we host an Assistant Google Developer Expert meetup to connect and share knowledge. This year we were able to invite everyone who is interested in building for Google Assistant to network and connect with one another. At the end several attendees came together at the Assistant Sandbox for a virtual photo!

Image of Google I/O Assistant meetup

Thanks for reading! To share your thoughts or questions, join us on Reddit at r/GoogleAssistantDev.

Follow @ActionsOnGoogle on Twitter for more of our team’s updates, and tweet using #AoGDevs to share what you’re working on. Can’t wait to see what you build!

New for I/O: Assistant tools and features for Android apps and Smart Displays


Posted by Rebecca Nathenson, Director of Product for the Google Assistant Developer Platform

New Assistant tools at Google IO header

Today at I/O, we shared some exciting new product announcements to help you more easily bring Google Assistant to your Android apps and create more engaging content on smart displays.

Assistant development made easy with new Android APIs

App Actions helps you easily bring Google Assistant to your Android app and complete user queries of all kinds, from booking a ride to posting a message on social media. Companies such as MyFitnessPal and Twitter are already using App Actions to help their users get things done, just by using their voice. You can enable App Actions in Android Studio by mapping built-in intents to specific features and experiences within your apps. Here are new ways you can help users easily navigate your content through voice queries and proactive suggestions.

Better support for Assistant built-in intents with Capabilities

Capabilities is a new framework API available in beta today that lets you declare support for common tasks defined by built-in intents. By leveraging pre-built requests from our catalog of intents, you can offer users ways to jump to specific activities within your app.

For example, the Yahoo Finance app uses Capabilities to let users jump directly to the Verizon stock page just by saying “Hey Google, show me Verizon’s stock on Yahoo Finance.” Similarly, Snapchat users can use their voice to add filters and send them to friends: “Hey Google, send a snap with my Curry sneakers.”

Improved user discoverability with Shortcuts in Android 12

App shortcuts are already a popular way to automate common tasks on Android. Thanks to the new APIs for shortcuts in Android 12, it’s now easier to find all the Assistant queries that your app supports. If you build an Android shortcut, it will automatically show up in the Assistant Shortcuts gallery, so users can choose to set up a personal voice command for your app when they say “Hey Google, shortcuts.”

3 phones showing shortcuts from Assistant

Google Assistant can also suggest relevant shortcuts to help drive traffic to your app. For example, when using the eBay app, people will see a suggested Google Assistant Shortcut appear on the screen and have the option to create a shortcut for “show my bids.”

We also introduced the Google Shortcuts Integration library, which identifies shortcuts pushed with the Shortcuts Jetpack module and makes them available to Assistant for use in handling related voice queries. This lets Google Assistant suggest relevant shortcuts to users and helps drive traffic to your app.

Get immediate answers and updates right from Assistant using Widgets, coming soon

Improvements to Android 12 also make it easier to discover glanceable content with Widgets by mapping them to specific built-in intents using the Capabilities API. We’re also looking at how to bring driving-optimized widgets to Android Auto in the future. The Assistant integration will enable one-shot answers, quick updates, and multi-step interactions with the same widget.

For example, with Dunkin’s widget implementation, you can say “Hey Google, reorder from Dunkin’” to select from previous drinks and place the order. Strava’s widget helps a user track how many miles they ran in a week: saying “Hey Google, check my miles on Strava” will show the stats right on the lock screen.

Strava widget showing how many miles ran in a week

Build high quality Conversational Actions for smart displays

Last year, we introduced a number of improvements to the Assistant platform for smart displays, such as Actions Builder, Actions SDK and new built-in intents to improve the experience for both developers and users. Here are more improvements rolling out soon to make building conversational actions on smart displays even better.

New features to improve the developer experience

Interactive Canvas helps you build touch- and voice-controlled games and storytelling experiences for the Assistant using web technologies like HTML, CSS, and JavaScript. Companies such as CoolGames, Zynga, and GC Turbo have already used Canvas to build games for smart displays.

Since launch, we’ve gotten great feedback from developers that it would be simpler and faster to implement core logic in web code. To enable this, the Interactive Canvas API will soon provide access to text-to-speech (TTS), natural language understanding (NLU), and storage APIs that will allow developers to trigger these capabilities from client-side code. These APIs will provide experienced web developers with a familiar development flow and enable more responsive Canvas actions.

We’re also giving you a wider set of options around how to release your actions. Coming soon, in the Actions Console, you will be able to manage your releases by launching in stages. For example, you can launch to one country first and then expand to more later, or you can launch to a small percentage of users and gradually roll out over time.

Improving the user experience on smart displays

You’ll also see improvements that will enhance visual experiences on the smart display. For example, you can now remove the persistent header, which allows you to utilize the full real estate of the device and provide users with fully immersive experiences.

Before Interactive Canvas brought customized touch interfaces to the Smart Display, we provided a simple way to stop TTS from playing by tapping anywhere on the screen of the device. However, with more multi-modal experiences being released on Smart Displays, there are use cases where it is important to continue playing TTS while the user touches the display. Developers will soon have the option to enable persistent TTS for their actions.

We’ve also added support for long-form media sessions with updates to the Media API so you can start playback from a specific moment, resume where a previous session stopped, and adapt conversational responses based on media playback context.

Easier transactions for your voice experiences

We know how important it is to have the tools you need to build a successful business on our platform. In October of last year, we made a commitment to make it easier for you to add seamless voice-based and display-based monetization capabilities to your experience. On-device CVC and credit card entry will soon be available on smart displays. Both of these features make on-device transactions much easier, reducing the need to redirect users to their mobile devices.

We hope you are able to leverage all these new features to build engaging experiences and reach your users easily, both on mobile and at home. Check out our technical sessions, workshops and more from Google I/O on YouTube and get started with App Actions and Conversational Actions today!

Policy changes and certification requirement updates for Smart Home Actions


Posted by Toni Klopfenstein, Developer Advocate

Illustration of 2 animated locks and phone with Actions on Google logo on screen

As more developers onboard to the Smart Home Actions platform, we have gathered feedback about the certification process for launching an Action. Today, we are pleased to announce we have updated our Actions policy to enable developers to more quickly develop their Actions, and to help streamline the certification and launch process for developers. These updates will also help to provide a consistent, cohesive experience for smart device users.

Device quality guidelines

Ensuring each device type meets quality benchmark metrics provides end users with reliable and timely responses from their smart devices. With these policy updates, minimum latency and reliability metrics have been added to each device type guide. To ensure consistent device control and timely updates to Home Graph, all cloud-controlled smart devices need to maintain a persistent connection through a hub or the device itself, and cannot rely on mobile devices or tablets.

Along with these quality benchmarks, we have also updated our guides with required and recommended traits for each device. By implementing these within an Action, developers can ensure their end users can trigger devices in a consistent manner and access the full range of device capabilities. To assist you in ensuring your Action is compliant with the updated policy, the Test Suite testing tool will now more clearly flag any device type or trait issues.

Safety and security

Smart home users care deeply about the safety and security of the devices integrated into their homes, so we have also updated our requirements for secondary user verification. This verification step must be implemented for any Action that can set a device in an unprotected state, such as unlocking a door, regardless of whether you are building a Conversational Action or Smart Home Action. Once configured with a secondary verification method, developers can provide users a way to opt out of this flow. For any developer wishing to include an opt-out selection for their customers, we have provided a warning message template to ensure users understand the security implications of turning this feature off.

For devices that may pose heightened safety risks, such as cooking appliances, we require UL certificates or similar certification forms to be provided along with the Test Suite results before an Action can be released to production.

Works With ‘Hey Google’ badge

These policy updates also will affect the use of the Works With Hey Google badge. The badge will only be available for use on marketing materials for new Smart Home Direct Actions that have successfully integrated any device types referenced.

Any Conversational Actions currently using the badge will not be approved for use for any new marketing assets, including packaging/product refreshes. Any digital assets using the badge will need to be updated to remove the badge by the end of 2021.

Timeline

With the roll-out today, there will be a one-month grace period for developers to update new integrations to match the new policy requirements. For Actions currently deployed to production, compliance will be evaluated when the Action is recertified. Once integrations have been certified and launched to production, Actions will need to be recertified annually, or any time new devices or device functionality is added to the Action. Notifications for recertification will be shared with the developer account associated with your Action in the console.

This policy grace-period ends April 12, 2021.

Please review the updated policy, as well as our updated docs for launching your Smart Home Action. You can also check out our policy video for more information.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team’s updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Increasing our engagement with the voice technology community


Posted by Leslie Garcia-Amaya, Global Product Partnerships Lead, Google Assistant, and Ashwin Karuhatty, Head of Global Product Partnerships, Google Assistant

Google assistant image

The interest and adoption of voice technology reached an important inflection point last year with the pandemic, as we immediately saw Google Assistant play a bigger role in helping people manage more of their time at home, from juggling family activities to controlling their smart home devices.

To help brands and developers stay ahead of these trends and identify potential opportunities to create impactful voice experiences for their users, we spun up a series of virtual events to stay engaged with the community when many in-person industry events were cancelled. For example, we introduced VOICE Talks last April in partnership with Modev as a monthly series of digital events that connected Google business, engineering and product leaders directly with the voice-tech ecosystem and developer community. VOICE Talks also provided a platform for companies like Sony, Bamboo Learning, American Express, Verizon, Headspace, Vizio, iRobot, Nike, and Dunkin’ to share best practices on how they integrated voice technology into their products. You can watch past episodes here.

The ecosystem support and participation has been incredible with over 110,000 subscribers for VOICE Talks, over 40,000 hours of content consumed and active ongoing viewership on YouTube. In addition, we saw a huge demand for country/region-specific content in India, and started the VOICE Talks India series, which has also been received very well.

Thanks to all the positive feedback from the community, we’re looking to double down on those efforts this year. In addition to hosting more VOICE Talks events, we’re expanding our collaboration with industry-recognized influencers through podcasts, livestreams and more to continue growing the community.

Additionally, we’re excited to announce that Google Assistant is the first corporate sponsor of Women In Voice, a global non-profit with a mission to amplify women and diverse people in the voice technology field that has grown to 20 chapters in 15 countries since they launched in 2018. This sponsorship builds on the momentum Women In Voice established with Google Assistant at CES 2020, where they collaborated on a “Women In Tech & Allies” event. Tune in to womeninvoice.org to stay up to date on upcoming events and collaborations between Google Assistant and Women In Voice.

There are now more ways to hear from us, share your feedback, and learn about the latest trends in the space.

2020 Google Assistant developer Year in Review


Posted by Payam Shodjai, Director, Product Management, Google Assistant

With 2020 coming to a close, we wanted to reflect on everything we have launched this year to help you, our developers and partners, create powerful voice experiences with Google Assistant.

Today, many top brands and developers turn to Google Assistant to help users get things done on their phones and on Smart Displays. Over the last year, the number of Actions built by third-party developers has more than doubled. Below is a snapshot of some of our partners who’ve integrated with Google Assistant:


2020 Highlights

Below are a few highlights of what we have launched in 2020:

1. Integrate your Android mobile Apps with Google Assistant

App Actions allow your users to jump right into existing functionality in your Android app with the help of Google Assistant. It makes it easier for users to find what they’re looking for in your app in a natural way by using their voice. We take care of all the Natural Language Understanding (NLU) processing, making it easy to develop in only a few days. In 2020, we announced that App Actions are now available for all Android developers to voicify their apps and integrate with Google Assistant.

For common tasks such as opening your apps, opening specific pages in your apps or searching within apps, we introduced Common Intents. For a deeper integration, we’ve expanded our vertical-specific built-in intents (BIIs), to cover more than 60 intents across 10 verticals, adding new categories like Social, Games, Travel & Local, Productivity, Shopping and Communications.

For cases where there isn’t a built-in intent for your app functionality, you can instead create custom intents that are unique to your Android app. Like BIIs, custom intents follow the actions.xml schema and act as connection points between Assistant and your defined fulfillments.
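As a minimal sketch of what such a custom intent declaration could look like in actions.xml (the intent name, the query-patterns resource, and the deep link are all hypothetical placeholders, not a real app’s schema):

```xml
<!-- actions.xml: hedged sketch of a custom intent for a hypothetical ToDo app. -->
<actions>
  <!-- Matched against developer-supplied query patterns, e.g. "add a task" -->
  <action
    intentName="custom.actions.intent.ADD_TASK"
    queryPatterns="@array/AddTaskQueries">
    <!-- Assistant fulfills the intent by deep linking into the app -->
    <fulfillment urlTemplate="exampleapp://tasks/add" />
  </action>
</actions>
```

The queryPatterns array resource holds the example phrases that Assistant matches against users’ speech.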

Learn more about how to integrate your app with Google Assistant here.

2. Create new experiences for Smart Displays

We also announced new developer tools to help you build high quality, engaging experiences to reach users at home by building for Smart Displays.

Actions Builder is a new web-based IDE that provides a graphical interface to show the entire conversation flow. It allows you to manage Natural Language Understanding (NLU) training data and provides advanced debugging tools. And, it is fully integrated into the Actions Console so you can now build, debug, test, release, and analyze your Actions – all in one place.

Actions SDK provides a file-based representation of your Action and the ability to use a local IDE. The SDK not only enables local authoring of NLU and conversation schemas, but it also allows bulk import and export of training data to improve conversation quality. The Actions SDK is accompanied by a command line interface, so you can build and manage an Action fully in code using your favorite source control and continuous integration tools.

Interactive Canvas allows you to add visual, immersive experiences to Conversational Actions. We announced the expansion of Interactive Canvas to support Storytelling and Education verticals earlier this year.

Continuous Match Mode allows the Assistant to respond immediately to a user’s speech for more fluid experiences by recognizing defined words and phrases set by you.

We also created a central hub for you to find resources to build games on Smart Displays. This site is filled with a game design playbook, interviews with game creators, code samples, tools access, and everything you need to create awesome games for smart displays.

Actions API provides a new programmatic way to test your critical user journeys more thoroughly and effectively, to help you ensure your Action’s conversations run smoothly.

The Dialogflow migration tool inside the Actions Console automates much of the work to move projects to the new and improved Actions Builder tool.

We also worked with partners such as Voiceflow and Jovo, to launch integrations to support voice application development on the Assistant. This effort is part of our commitment to enable you to leverage your favorite development tools, while building for Google Assistant.

We launched several other new features that help you build high quality experiences for the home, such as the Media APIs, new and improved voices (available in the Actions Console), and the home storage API.

Get started building for Smart Displays here.

3. Discovery features

Once you build high quality Actions, you are ready for your users to discover them. We have designed new touch points to help your users easily learn about your Actions.

For example, on Android mobile, we’ll recommend relevant App Actions even when the user doesn’t mention the app’s name explicitly by showing suggestions. Google Assistant will also suggest apps proactively, depending on individual app usage patterns. Android mobile users will also be able to customize their experience with app shortcuts, setting up quick phrases for app functions they frequently use. By simply saying “Hey Google, shortcuts”, they can set up and explore suggested shortcuts in the settings screen. We’ll also make proactive suggestions for shortcuts throughout Google Assistant’s mobile experience, tailored to how you use your phone.

Assistant Links deep link to your conversational Action to deliver rich Google Assistant experiences to your websites, so you can send your users directly to your conversational Actions from anywhere on the web.

We also recently opened two new built-in intents (BIIs) for public registration: Education and Storytelling. Registering your Actions for these intents allows your users to discover them in a simple, natural way through general requests to Google Assistant on Smart Displays. People will then be able to say “Hey Google, teach me something new” and they will be presented with a browsable selection of different education experiences. For stories, users can simply say “Hey Google, tell me a story”.

We know you build personalized and premium experiences for your users, and need to make it easy for them to connect their accounts to your Actions. To help streamline this process we opened two betas for improved account linking flows that will allow simple, streamlined authentication via apps.

  • Link with Google enables anyone with an Android or iOS app where they are already logged in to complete the linking flow with just a few clicks, without needing to re-enter credentials.
  • App Flip helps you build a better mobile account linking experience, so your users can seamlessly link their accounts to Google without having to re-enter their credentials.

What to expect in 2021

Looking ahead, we will double down on enabling you, our developers and partners, to build great experiences for Google Assistant and help you reach your users on the go and at home. You can expect to hear more from us on how we are improving the Google Assistant experience to make it easy for Android developers to integrate their Android apps with Google Assistant, and to help developers achieve success through discovery and monetization.

We are excited to see what you will build with these new features and tools. Thank you for being a part of the Google Assistant ecosystem. We can’t wait to launch even more features and tools for Android developers and Smart Display experiences in 2021.

Want to stay in the know with announcements from the Google Assistant team? Sign up for our monthly developer newsletter here.

Top brands integrate Google Assistant with new tools and features for Android apps and Smart Displays

Posted by Baris Gultekin and Payam Shodjai, Directors of Product Management

Top brands turn to Google Assistant every day to help their users get things done on their phones and on Smart Displays — such as playing games, finding recipes or checking investments, just by using their voice. In fact, over the last year, the number of Actions completed by third-party developers has more than doubled.

We want to support our developer ecosystem as they continue building the best experiences for smart displays and Android phones. That’s why today at Google Assistant Developer Day, we introduced:

  • New App Actions built-in intents — to enable Android developers to easily integrate Google Assistant with their apps
  • New discovery features such as suggestions and shortcuts — to help users easily discover and engage with Android apps
  • New developer tools and features, such as a testing API, new voices, and frameworks for game development — to help build high quality native experiences for smart displays
  • New discovery and monetization improvements — to help users discover and engage with developers’ experiences on Assistant.

Now, all Android Developers can bring Google Assistant to their apps

Now, every Android app developer can make it easier for their users to find what they’re looking for by fast-forwarding them into the app’s key functionality using just their voice. With App Actions, top app developers such as Yahoo Mail, Fandango, and ColorNote are creating these natural and engaging experiences by mapping their users’ intents to specific functionality within their apps. Instead of having to navigate through each app to get tasks done, users can simply say “Hey Google” and the outcome they want, such as “find Motivation Mix on Spotify”.

Here are a few updates we’re introducing today to App Actions.

Quickly open and search within apps with common intents

Every day, people ask Google Assistant to open their favorite apps. Today, we are building on this functionality to open specific pages within apps and also search within apps. Starting today, you can use the GET_THING intent to search within apps and the OPEN_APP_FEATURE intent to open specific pages in apps; offering more ways to easily connect users to your app through Assistant.

Many top brands such as eBay and Kroger are already using these intents. If you have the eBay app on your Android phone, try saying “Hey Google, find baseball cards on eBay” to try the GET_THING intent.

If you have the Kroger app on your Android phone, try saying “Hey Google, open Kroger pay” to try the OPEN_APP_FEATURE intent.

It’s easy to implement all these common intents in your Android apps. You can simply declare support for these capabilities in your actions.xml file to get started. For searching, you can provide a deep link that will allow Assistant to pass a search term into your app. For opening pages, you can provide a deep link with the corresponding name for Assistant to match users’ requests.
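A minimal actions.xml sketch for these two common intents might look like the following. The deep-link URL templates and app name are hypothetical placeholders; the intent parameter names shown (thing.name, feature) are the ones we understand these BIIs to use.

```xml
<!-- actions.xml: hedged sketch; deep links are placeholders. -->
<actions>
  <!-- "Hey Google, find baseball cards on ExampleApp" -->
  <action intentName="actions.intent.GET_THING">
    <fulfillment urlTemplate="exampleapp://search{?query}">
      <!-- Pass the recognized search term into the app's deep link -->
      <parameter-mapping intentParameter="thing.name" urlParameter="query" />
    </fulfillment>
  </action>
  <!-- "Hey Google, open checkout on ExampleApp" -->
  <action intentName="actions.intent.OPEN_APP_FEATURE">
    <fulfillment urlTemplate="exampleapp://feature{?feature}">
      <parameter-mapping intentParameter="feature" urlParameter="feature" />
    </fulfillment>
  </action>
</actions>
```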

Vertical specific built-in intents

For a deeper integration, we offer vertical-specific built-in intents (BIIs) that let Google take care of all the Natural Language Understanding (NLU) so you don’t have to. We first piloted App Actions in some of the most popular app verticals such as Finance, Ridesharing, Food Ordering, and Fitness. Today, we are announcing that we have now grown our catalog to cover more than 60 intents across 10 verticals, adding new categories like Social, Games, Travel & Local, Productivity, Shopping and Communications.
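For illustration, a Food Ordering vertical BII might be declared in actions.xml roughly as follows. The app name and deep-link format are hypothetical; only the intent and parameter names come from the BII catalog.

```xml
<!-- actions.xml: hedged sketch of a Food Ordering vertical BII. -->
<actions>
  <!-- "Hey Google, order a latte from ExampleCafe" -->
  <action intentName="actions.intent.ORDER_MENU_ITEM">
    <fulfillment urlTemplate="examplecafe://order{?item}">
      <!-- Google handles the NLU; the app just receives the parsed item name -->
      <parameter-mapping intentParameter="menuItem.name" urlParameter="item" />
    </fulfillment>
  </action>
</actions>
```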

For example, Twitter and Wayfair have already implemented these vertical built in intents. So, if you have the Twitter app on your Android phone, try saying “Hey Google, post a Tweet” to see a Social vertical BII in action.

If you have the Wayfair app on your Android phone, try saying “Hey Google, buy accent chairs on Wayfair” to see a Shopping vertical BII in action.

Check out how you can get started with these built-in intents or explore creating custom intents today.

Custom Intents to highlight unique app experiences

Every app is unique with its own features and capabilities, which may not match the list of available App Actions built-in intents. For cases where there isn’t a built-in intent for your app functionality, you can instead create a custom intent. Like BIIs, custom intents follow the actions.xml schema and act as connection points between Assistant and your defined fulfillments.
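A custom intent declaration in actions.xml might be sketched as follows. The intent name, query-patterns array, and deep link are hypothetical stand-ins for illustration, not any partner’s actual implementation.

```xml
<!-- actions.xml: hedged sketch of a custom intent. -->
<actions>
  <!-- Matched against phrases you supply, e.g. "reserve a time slot" -->
  <action
    intentName="custom.actions.intent.RESERVE_SLOT"
    queryPatterns="@array/ReserveSlotQueries">
    <!-- Fulfillment deep links straight into the reservation flow -->
    <fulfillment urlTemplate="exampleapp://reserve" />
  </action>
</actions>
```

Unlike BIIs, you supply the example query phrases yourself via the queryPatterns resource, since Google’s NLU has no built-in model for your app-specific functionality.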

Snapchat and Walmart use custom intents to extend their app’s functionality to Google Assistant. For example, if you have the Snapchat app on your Android phone, just say, “Hey Google, send a Snap using the cartoon face lens” to try their Custom Intent.

Or, if you have the Walmart app on your Android phone, just say, “Hey Google, reserve a time slot with Walmart” to schedule your next grocery pickup.

With more common, built-in, and custom intents available, every Android developer can now enable their app to fulfill Assistant queries tailored to exactly what their app offers. Developers can also use familiar tools such as Android Studio, and with just a few days of work they can easily integrate their Android apps with Google Assistant.

Suggestions and Shortcuts for improving user discoverability

We are excited about these new improvements to App Actions, but we also understand that it’s equally important that people are able to discover your App Actions. We’re designing new touch points to help users easily learn about Android apps that support App Actions. For example, we’ll recommend relevant App Actions even when the user doesn’t mention the app name explicitly by showing suggestions. If you say broadly “Hey Google, show me Taylor Swift”, we’ll highlight a suggestion chip that guides you to open the search result in Twitter. Google Assistant will also suggest apps proactively, depending on individual app usage patterns.

Android users will also be able to customize their experience, creating their own way to automate their most common tasks with app shortcuts, enabling people to set up quick phrases for app functions they frequently use. For example, you can create a MyFitnessPal shortcut to easily track your calories throughout the day and customize the query to say what you want, such as “Hey Google, check my calories.”

By simply saying “Hey Google, shortcuts”, they can set up and explore suggested shortcuts in the settings screen. We’ll also make proactive suggestions for shortcuts throughout the Assistant mobile experience, tailored to how you use your phone.

Build high quality conversational Actions for Smart Displays

Back in June, we launched new developer tools such as Actions Builder and Actions SDK, making it easier to design and build conversational Actions on Assistant, like games, for Smart Displays. Many partners, such as CoolGames and Sony, have already been building with these. We’re excited to share new updates that enable developers to build more high-quality native Assistant experiences with new game development frameworks and better testing tools, and that make user discovery of those experiences better than ever.

New developer tools and features

Improved voices

We’ve heard your feedback that you need better voices to match the quality of the experiences you’re delivering on the Assistant. We’ve released two new English voices that take advantage of an improved prosody model to make Assistant sound more natural. Give it a listen.

These voices are now available and you can leverage them in your existing Actions by simply making the change in the Actions Console.

Interactive Canvas expansion

But what can you build with these new voices? Last year, we introduced Interactive Canvas, an API that lets you build custom experiences for the Assistant that can be controlled via both touch and voice using simple technologies like HTML, CSS, and JavaScript.

We’re expanding Interactive Canvas to Actions in the education and storytelling verticals, in addition to games. Whether you’re building an action that teaches someone to cook, explains the phases of the moon, helps a family member with grammar, or takes you through an interactive adventure, you’ll have access to the full visual power of Interactive Canvas.

Improved testing to deliver high quality experiences

Actions Testing API is a new programmatic way to test your critical user journeys and ensure there aren’t any broken conversation paths. Using this framework, you can run end-to-end tests in an isolated preview environment, run regression tests, and add continuous testing to your arsenal. This API is being released to general availability soon.

New Dialogflow migration tool

For those of you who built experiences using Dialogflow, we want you to enjoy the benefits of the new platform without having to build from scratch. That’s why we’re offering a migration tool inside the Actions Console that automates much of the work to move projects to the improved platform.

New site for game developers

Game developers, we built a new resource hub just for you. Boost your game design expertise with full source code to games, design best practices, interviews with game developers, tools, and everything you need to create voice-enabled games for Smart Displays.

Discovery

With more incredible experiences being built, we know it can be challenging to help users discover them and drive engagement. To make it easier for people to discover and engage with your experiences, we have invested in a slew of new discovery features:

New Built-in intents and the Learning Hub

We’ll soon be opening two new sets of Built-in intents (BIIs) for public registration: Education and Storytelling. Registering your Actions for these intents allows users to discover them in a simple, natural way through general requests to Google Assistant. These new BIIs cover a range of intents in the Education and Storytelling domains and join Games as principal areas of investment for the developer ecosystem.

People will then be able to say “Hey Google, teach me something new” and they will be presented with a Learning Hub where they can browse different education experiences. For stories, users can simply say “Hey Google, tell me a story”. Developers can soon register for both new BIIs to get their experiences listed in these browsable catalogs.

Household Authentication token and improving transactions

One of the exciting things about the Smart Display is that it’s an inherently communal device. So if you’re offering an experience that is meant to be enjoyed collaboratively, you need a way to share state between household members and between multiple devices. Let’s say you’re working on a puzzle and your roommate wants to help with a few pieces on the Smart Display. We’re introducing household authentication tokens so all users in a home can now share these types of experiences. This feature will be available soon via the Actions console.

Finally, we’re making improvements to the transaction flow on Smart Displays. We want to make it easier for you to add seamless voice-based and display-based monetization capabilities to your experience. We’ve started by supporting voice-match as an option for payment authorization. And early next year, we’ll also launch an on-display CVC entry.

Simplifying account linking and authentication

Once you build personalized and premium experiences, you need to make it as easy as possible to connect with existing accounts. To help streamline this process, we’re opening two betas: Link with Google and App Flip, for improved account linking flows to allow simple, streamlined authentication via apps.

Link with Google enables anyone with an Android or iOS app where they are already logged in to complete the linking flow with just a few clicks, without needing to re-enter credentials.

App Flip helps you build a better mobile account linking experience and decrease drop-off rates. App Flip allows your users to seamlessly link their accounts to Google without having to re-enter their credentials.

Assistant links

In addition to launching new channels of discovery for developer Actions, we also want to give you more control over how you and your users reach your Actions. Action links were a way to deep link to your conversational Action, used with great success by partners like Sushiro, Caixa, and Giallo Zafferano. Now we are reintroducing this feature as Assistant links, which enable partners such as TD Ameritrade to deliver rich Google Assistant experiences on their websites, as well as deep link to their Google Assistant integrations from anywhere on the web.

We are very excited about all these announcements – both across App Actions and native Assistant development. Whether you are exploring new ways to engage your users using voice via App Actions, or looking to build something new to engage users at home via Smart Displays, we hope you will leverage these new tools and features and share your feedback with us.

Announcing DevFest 2020

Posted by Jennifer Kohl, Program Manager, Developer Community Programs

DevFest Image

On October 16-18, thousands of developers from all over the world are coming together for DevFest 2020, the largest virtual weekend of community-led learning on Google technologies.

As people around the world continue to adapt to spending more time at home, developers yearn for community now more than ever. In years past, DevFest was a series of in-person events over a season. For 2020, the community is coming together in a whole new way – virtually – over one weekend to keep developers connected when they may want it the most.

The speakers

The magic of DevFest comes from the people who organize and speak at the events – developers with various backgrounds and skill levels, all with their own unique perspectives. In different parts of the world, you can find a DevFest session in many local languages. DevFest speakers are made up of various types of technologists, including kid developers, self-taught programmers from rural areas, and CEOs and CTOs of startups. DevFest also features a wide range of speakers from Google, Women Techmakers, Google Developer Experts, and more. Together, these friendly faces, with many different perspectives, create a unique and rich developer conference.

The sessions and their mission

Hosted by Google Developer Groups, this year’s sessions include technical talks and workshops from the community, and a keynote from Google Developers. Through these events, developers will learn how Google technologies help them develop, learn, and build together.

Sessions will cover multiple technologies, such as Android, Google Cloud Platform, Machine Learning with TensorFlow, Web.dev, Firebase, Google Assistant, and Flutter.

At our core, Google Developers believes community-led developer events like these are an integral part of the advancement of technology in the world.

For this reason, Google Developers supports the community-led efforts of Google Developer Groups and their annual tentpole event, DevFest. Google provides esteemed speakers from the company and custom technical content produced by developers at Google. The impact of DevFest is really driven by the grassroots, passionate GDG community organizers who volunteer their time. Google Developers is proud to support them.

The attendees

During DevFest 2019, 138,000+ developers participated across 500+ DevFests in 100 countries. While 2020 is a very different year for events around the world, GDG chapters are galvanizing their communities to come together virtually for this global moment. The excitement for DevFest continues as more people seek new opportunities to meet and collaborate with like-minded, community-oriented developers in their local towns and regions.

Join the conversation on social media with #DevFest.

Sign up for DevFest at goo.gle/devfest.


Still curious? Check out these popular talks from DevFest 2019 events around the world…

Join us for Google Assistant Developer Day on October 8

Posted by Baris Gultekin, Director, Product Management Google Assistant and
Payam Shodjai, Director, Product Management Google Assistant

More and more people turn to Google Assistant every day to help them get the most out of their phones and smart displays: From playing games to using their favorite app by voice, there are more opportunities than ever for developers to create new and engaging experiences for Google Assistant.

We welcome you to join us virtually at our Google Assistant Developer Day on Thursday, October 8, to learn more about the new tools and features we're building to help developers bring Google Assistant to mobile apps and Smart Displays, and to drive discoverability and engagement via voice. This will also be a great chance to chat live with Google leaders and engineers on the team and get your questions answered.

You’ll hear from our product experts and partnership leads on best practices for integrating with Google Assistant to help users more easily engage with their favorite apps by voice. Other sessions will include in-depth conversations around native development on Google Assistant, and much more.

We’ll also have guest speakers join us on stage, including Garrett Gaudini, Head of Product at Postmates; Laurens Rutten, Founder & CEO of CoolGames; and Corey Bozarth, VP of Product & Monetization at MyFitnessPal, among many others. They’ll share their stories about how voice has transformed the way people interact with their apps and services.

Whether you build for mobile or smart home, these new tools will help make your content and services available to people who want to use their voice to get things done.

Registration is FREE! Head on over to the event website to register and check out the schedule.