Unlock new use cases and increase developer velocity with the latest ARCore updates


Posted by Ian Zhang, Product Manager, AR & Zeina Oweis, Product Manager, AR

ARCore was created to provide developers with simple yet powerful tools to seamlessly blend the digital and physical worlds. Over the last few years, we’ve seen developers create apps that entertain, engage, and help people in different ways, from letting fans interact with their favorite characters to placing virtual electronics and furniture for the perfect home setup, and beyond.

At I/O this year, we’re continuing our mission of improving and building AR developer tools. With the launch of ARCore 1.24, we’re introducing the Raw Depth API and the Recording and Playback API. These new APIs will enable developers to create new types of AR experiences and speed up their development cycles.

Increase AR realism and precision with depth

When we launched the Depth API last year, hundreds of millions of Android devices gained the ability to generate depth maps in real time without needing specialized depth sensors. Data in these depth maps was smoothed, filling in any gaps that would otherwise occur due to missing visual information, making it easy for developers to create depth effects like occlusion.

The new ARCore Raw Depth API provides more detailed representations of the geometry of objects in the scene by generating “raw” depth maps with corresponding confidence images. These raw depth maps include unsmoothed data points, and the confidence images provide the confidence of the depth estimate for each pixel in the raw depth map.
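To make this concrete, here is a minimal Java sketch (assuming an already-configured Session, the current Frame, and some pixel of interest (x, y); the 128 confidence cutoff and the sampleUint16/sampleUint8 helpers are illustrative choices, not part of the SDK) that enables raw depth and keeps a depth sample only where confidence is high:

    import android.media.Image;
    import com.google.ar.core.Config;
    import com.google.ar.core.exceptions.NotYetAvailableException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    // Enable raw depth in the session configuration (ARCore 1.24+).
    Config config = session.getConfig();
    config.setDepthMode(Config.DepthMode.RAW_DEPTH_ONLY);
    session.configure(config);

    // Each frame, acquire the raw depth map and its confidence image together.
    try (Image rawDepth = frame.acquireRawDepthImage();
         Image confidence = frame.acquireRawDepthConfidenceImage()) {
      int depthMm = sampleUint16(rawDepth, x, y); // depth in millimeters
      int conf = sampleUint8(confidence, x, y);   // 0 (low) to 255 (high)
      if (conf >= 128) {
        // Keep this sample for measurement or reconstruction.
      }
    } catch (NotYetAvailableException e) {
      // Depth is not available in the first few frames; try again later.
    }

    // Reads a 16-bit depth value at pixel (x, y).
    static int sampleUint16(Image image, int x, int y) {
      Image.Plane plane = image.getPlanes()[0];
      ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
      return Short.toUnsignedInt(
          buffer.getShort(y * plane.getRowStride() + x * plane.getPixelStride()));
    }

    // Reads an 8-bit confidence value at pixel (x, y).
    static int sampleUint8(Image image, int x, int y) {
      Image.Plane plane = image.getPlanes()[0];
      return plane.getBuffer().get(y * plane.getRowStride() + x * plane.getPixelStride()) & 0xFF;
    }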

4 examples of ARCore Raw Depth API

Improved geometry from the Raw Depth API enables more accurate depth measurements and spatial awareness. In the ARConnect app, these more accurate measurements give users a deeper understanding of their physical surroundings. The AR Doodads app utilizes raw depth’s spatial awareness to allow users to build realistic virtual Rube Goldberg machines.

ARConnect by PHORIA (left) and AR Doodads by Jam3 (right) use the improved geometry from the Raw Depth API

The confidence image in the Raw Depth API allows developers to filter depth data in real time. For example, TikTok’s newest effect enables users to upload an image and wrap it onto real-world objects. The image conforms to surfaces where there is high confidence in the underlying depth estimate. The ability to filter for high-confidence depth data is also essential for 3D object and scene reconstruction. This can be seen in the 3D Live Scanner app, which enables users to scan their space and create, edit, and share 3D models.

TikTok by TikTok Pte. Ltd. (left) and 3D Live Scanner by Lubos Vonasek Programmierung (right) use confidence images from the ARCore Raw Depth API

We’re also introducing a new type of hit-test that uses the geometry from the depth map to provide more hit-test results, even in low-texture and non-planar areas. Previously, hit-test worked best on surfaces with lots of visual features.

Hit Results with Planes (left): works best on horizontal, planar surfaces with good texture. Hit Results with Depth (right): gives more results, even on non-planar or low-texture areas.
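Here is a hedged Java sketch of how such a hit-test might be consumed (assuming a depth mode is enabled and tap coordinates are in screen pixels); depth-based hits surface as DepthPoint trackables:

    import com.google.ar.core.Anchor;
    import com.google.ar.core.DepthPoint;
    import com.google.ar.core.Frame;
    import com.google.ar.core.HitResult;
    import com.google.ar.core.Plane;
    import com.google.ar.core.Point;
    import com.google.ar.core.Trackable;

    // Returns an anchor for the first usable hit. With depth enabled, taps on
    // low-texture or non-planar surfaces come back as DepthPoint trackables.
    static Anchor anchorFromTap(Frame frame, float xPx, float yPx) {
      for (HitResult hit : frame.hitTest(xPx, yPx)) {
        Trackable trackable = hit.getTrackable();
        if (trackable instanceof Plane
            || trackable instanceof Point
            || trackable instanceof DepthPoint) {
          return hit.createAnchor();
        }
      }
      return null; // no usable hit this frame
    }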

The lifeAR app uses this improved hit-test to bring AR to video calls. Users see accurate virtual annotations on real-world objects as they tap into the expertise of their social circle for instant help with everyday problems.

lifeAR by TeamViewer uses the improved depth hit-test

As with the previous Depth API, these updates leverage depth from motion, making them available on hundreds of millions of Android devices without relying on specialized sensors. Although depth sensors such as time-of-flight (ToF) sensors are not required, having them will further improve the quality of your experiences.

In addition to these apps, the ARCore Depth Lab has been updated with examples of both the Raw Depth API and the depth hit-test. You can find those and more on the Depth API documentation page and start building with Android and Unity today.

Increase developer velocity and post-capture AR

A recurring pain point for AR developers is the need to continually test in specific places and scenarios. Developers may not always have access to the location, lighting changes over time, and sensors never capture exactly the same information across live camera sessions.

The new ARCore Recording and Playback API addresses this by enabling developers to record not just video footage, but also IMU and depth sensor data. On playback, this same data can be accessed, enabling developers to duplicate the exact same scenario and test the experience from the comfort of their workspace.
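In Java, the flow looks roughly like this sketch (mp4Path is a placeholder for a writable file path in app storage; note that setPlaybackDataset must be called while the session is paused):

    import com.google.ar.core.RecordingConfig;
    import com.google.ar.core.Session;
    import com.google.ar.core.exceptions.CameraNotAvailableException;
    import com.google.ar.core.exceptions.PlaybackFailedException;
    import com.google.ar.core.exceptions.RecordingFailedException;

    // Record camera images, IMU, and depth data into an MP4 dataset.
    static void startCapture(Session session, String mp4Path)
        throws RecordingFailedException, CameraNotAvailableException {
      RecordingConfig recordingConfig =
          new RecordingConfig(session)
              .setMp4DatasetFilePath(mp4Path)
              .setAutoStopOnPause(true);
      session.startRecording(recordingConfig); // begins once the session is running
      session.resume();
    }

    // Replay the dataset so the session sees the recorded scenario
    // instead of the live camera feed.
    static void startPlayback(Session session, String mp4Path)
        throws PlaybackFailedException, CameraNotAvailableException {
      session.pause();
      session.setPlaybackDataset(mp4Path);
      session.resume();
    }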

DiDi used the Recording and Playback API to build and test AR directions in their DiDi-Rider app. They saved 25% on R&D and testing costs, 60% on travel costs, and shortened their development cycle by 6 months.

DiDi-Rider by Didi Chuxing saves on development resources with the Recording and Playback API

In addition to increasing developer velocity, recording and playback unlocks opportunities for new AR experiences, such as post-capture AR. Using videos enables asynchronous AR experiences that remove time and place constraints. For instance, when visualizing AR furniture, users no longer have to be in their home. They can instead pull up a video of their home and accurately place AR assets, enabling them to “take AR anywhere”.

Jump AR by SK Telecom uses the Recording and Playback API to transport scenes from South Korea right into users’ homes, which users can then augment with culturally relevant volumetric and 3D AR content.

JumpAR by SKT uses the Recording and Playback API to bring South Korea to your home

VoxPlop! by Nexus Studios is experimenting with the notion of Spatial Video co-creation, where users can reach in and interact with a recorded space rather than simply placing content on top of a video. The Recording and Playback API enables users to record videos, drop in 3D characters and messages, and share them with family and friends.

VoxPlop! by Nexus Studios uses the Recording and Playback API to experiment with Spatial Video co-creation

Learn more and get started with the Recording and Playback API docs.

Get started with ARCore today

These latest ARCore updates round out a robust set of powerful developer tools for creating engaging and realistic AR experiences. With over a billion lifetime installs and 850 million compatible devices, ARCore makes augmented reality accessible to nearly everyone with a smartphone. We’re looking forward to seeing how you innovate and reach more users with ARCore. To learn more and get started with the new APIs, visit the ARCore developer website.

Improving shared AR experiences with Cloud Anchors in ARCore 1.20

Posted by Eric Lai, Product Manager, Augmented Reality

Augmented reality (AR) can help you explore the world around you in new, seemingly magical ways. Whether you want to venture through the Earth’s unique habitats, explore historic cultures, or even just find the shortest path to your destination, there’s no shortage of ways that AR can help you interact with the world.

That’s why we’re constantly improving ARCore — so developers can build amazing AR experiences that help us reimagine what’s possible.

In 2018, we introduced the Cloud Anchors API in ARCore, which lets people across devices view and share the same AR content in real-world spaces. Since then, we’ve been working on new ways for developers to use Cloud Anchors to make AR content persist and more easily discoverable.

Create long-lasting AR experiences

Last year, we previewed persistent Cloud Anchors, which lets people return to shared AR experiences again and again. With ARCore 1.20, this feature is now widely available to Android, iOS, and Unity mobile developers.
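For a sense of the API shape, here is a rough Java sketch (assuming localAnchor marks the content’s pose and Cloud Anchors are authorized for the app; the 365-day TTL is just an example value):

    import com.google.ar.core.Anchor;
    import com.google.ar.core.Anchor.CloudAnchorState;
    import com.google.ar.core.Config;

    // Enable Cloud Anchors, then host with a time-to-live in days.
    Config config = session.getConfig();
    config.setCloudAnchorMode(Config.CloudAnchorMode.ENABLED);
    session.configure(config);

    Anchor hosted = session.hostCloudAnchorWithTtl(localAnchor, /* ttlDays= */ 365);

    // Poll each frame until hosting completes, then persist the ID somewhere.
    String cloudAnchorId = null;
    if (hosted.getCloudAnchorState() == CloudAnchorState.SUCCESS) {
      cloudAnchorId = hosted.getCloudAnchorId(); // save for later sessions
    }

    // In a later session, possibly months later, resolve by ID.
    Anchor resolved = session.resolveCloudAnchor(cloudAnchorId);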

Developers all over the world are already using this technology to help people learn, share and engage with the world around them in new ways.

MARK, which we highlighted last year, is a social platform that lets people leave AR messages in real-world locations for friends, family and their community to discover. MARK is now available globally and will be launching the MARK Hope Campaign in the US to help people raise funds for their favorite charities and have their donations matched for a limited time.

MARK by People Sharing Streetart Together Limited

REWILD Our Planet is an AR nature series produced by Melbourne-based studio PHORIA. The experience is based on the Netflix original documentary series Our Planet. REWILD uses ultra-high-definition video alongside AR content to let you venture into Earth’s unique habitats and interact with endangered wildlife. It originally launched in museums, but can now be enjoyed on your smartphone in your living room. As episodes of the show are released, persistent Cloud Anchors allow you to return to the same spot in your own home to see how nature is changing.

REWILD Our Planet by PHORIA

Changdeok ARirang is an AR tour guide app that combines the power of SK Telecom’s 5G with persistent Cloud Anchors. Visitors at Changdeokgung Palace in South Korea are guided by the legendary Haechi to relevant locations where they can experience high-fidelity historical and cultural AR content. Changdeok ARirang at Home was also launched so that the same experience can be enjoyed from the comfort of your couch.

Changdeok ARirang by SK Telecom

In Sweden, SJ Labs, the innovation arm of Swedish Railways, together with their tech innovation partner Bontouch, uses persistent Cloud Anchors to help passengers find their way through Stockholm’s Central Station, making it easier and faster for them to catch their trains.

SJ Labs by SJ – Swedish Railways

Coming soon, Lowe’s Persistent View will let you design your home in AR with the help of an expert. You’ll be able to add furniture and appliances to different areas of your home to see how they’d look, and return to the experience as many times as needed before making a purchase.

Lowe’s Persistent View powered by Streem

If you’re interested in building AR experiences that last over time, you can learn more about persistent Cloud Anchors in our docs.

Call for collaborators: test a new way to find AR content

As developers use Cloud Anchors to attach more AR experiences to the world, we also want to make it easier for people to discover them. That’s why we’re working on earth Cloud Anchors, a new feature that uses AR and global localization—the underlying technology that powers Live View features on Google Maps—to easily guide users to AR content. If you’re interested in early access to test this feature, you can apply here.

Some earth Cloud Anchors concepts

A new wave of AR Realism with the ARCore Depth API

Posted by Rajat Paharia, Product Lead, AR Platform

Since the launch of ARCore, our developer platform for building augmented reality (AR) experiences, we’ve been focused on providing APIs that help developers seamlessly blend the digital and physical worlds.

At the end of last year, we announced a preview of the ARCore Depth API, which uses our depth-from-motion algorithms to generate a depth map with a single RGB camera. Since then, we’ve been working with select collaborators to explore how depth can be used across a range of use cases to enhance AR realism.

Today, we’re taking a major step forward and announcing that the Depth API is available in ARCore 1.18 for Android and Unity, including AR Foundation, across hundreds of millions of compatible Android devices.

Generate a depth map without specialized hardware to unlock capabilities like occlusion

As we highlighted last year, a key capability of the Depth API is occlusion: the ability for digital objects to accurately appear behind real world objects. This makes objects feel as if they’re actually in your space, creating a more realistic AR experience.
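Under the hood, occlusion compares the virtual object’s depth against the real-world depth at each pixel. Here is a minimal Java sketch for reading that real-world distance, assuming the session’s depth mode is set to AUTOMATIC and a depth image acquired via frame.acquireDepthImage() (this mirrors the lookup pattern in the Depth API developer documentation):

    import android.media.Image;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    // Returns the distance in millimeters at pixel (x, y) of the depth image.
    static int getMillimetersDepth(Image depthImage, int x, int y) {
      // The depth image has a single plane storing 16-bit unsigned depth values.
      Image.Plane plane = depthImage.getPlanes()[0];
      ByteBuffer buffer = plane.getBuffer().order(ByteOrder.nativeOrder());
      int byteIndex = x * plane.getPixelStride() + y * plane.getRowStride();
      return Short.toUnsignedInt(buffer.getShort(byteIndex));
    }

In practice, the per-pixel comparison happens in a fragment shader so that every pixel of the virtual object can be tested against the depth map.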

Illumix, the game studio behind Five Nights at Freddy’s AR: Special Delivery, uses occlusion to deepen the realism of the experience by allowing certain characters to hide behind objects for more startling jump scares.

Play Five Nights at Freddy’s AR: Special Delivery

While occlusion is an important capability, the ARCore Depth API unlocks more ways to increase realism and enables new interaction types. The ARCore Depth Lab spurred more ideas on how depth can be used, including realistic physics, surface interactions, environmental traversal, and more. Developers can now build on these ideas through the open-source GitHub project.

Experiment with ARCore Depth Lab on the Google Play Store

The designers and engineers at Snap Inc. integrated several of these ideas into a set of Snapchat Lenses, including the Dancing Hotdog and a new Android-exclusive Undersea World Lens.

See how depth can add a layer of realism to your Snapchat experience

Snapchat Lens Creators can now download an ARCore Depth API template to create depth-based experiences for compatible Android devices. Sam Hare, Research Engineering Manager at Snap Inc., expressed his excitement: “We’re beginning to understand what kinds of depth capabilities are exciting for developers to build with. This single integration point streamlines and simplifies the development process and enables Lens Studio developers to easily take advantage of advanced depth capabilities.”

Another app that combines occlusion with other depth capabilities is Lines of Play, an Android experiment from the Google Creative Lab. Lines of Play lets users create domino art in AR, and uses depth information to showcase both occlusion and collisions. Design elaborate domino creations, topple them over and watch them collide with the furniture and walls in your room.

Watch as domino pieces topple into each other and onto your walls with Lines of Play

In addition to gaming and self-expression, depth can also be used to unlock new utility use cases. For example, the TeamViewer Pilot app, a remote assistance solution that enables AR annotations on video calls, uses depth to better understand the environment so experts around the world can more precisely apply real-time 3D AR annotations for remote support and maintenance.

3D annotations help experts accurately highlight details in the TeamViewer Pilot app

Later this year, you will be able to try more depth-enabled AR experiences, such as SKATRIX by Reality Crisis and SPLASHAAR by ForwARdgames, which use surface interactions and environmental traversal to make rich use of the environment around you.

Check out surface interactions and environmental traversal in SKATRIX and SPLASHAAR

While depth sensors, such as time-of-flight (ToF) sensors, are not required for the Depth API to work, having them will further improve the quality of experiences. Dr. Soo Wan Kim, Camera Technical Product Manager at Samsung commented on the future that the Depth API and ToF unlocks saying, “Depth will enrich user’s AR experience in many perspectives. It will reduce scanning time, and can detect planes fast, even low textured planes. These will bring seamless experiences to users who will be able to use AR apps more easily and frequently.” In the coming months, Samsung will update their Quick Measure app to use the ARCore Depth API on the Galaxy Note10+ and Galaxy S20 Ultra.

Accurately measure with Quick Measure

The ARCore Depth API will be rolling out today. Check back later for the SDK and updates to our developer site.

Blending Realities with the ARCore Depth API

Posted by Shahram Izadi, Director of Research and Engineering

ARCore, our developer platform for building augmented reality (AR) experiences, allows your devices to display content immersively in the context of the world around us, making them instantly accessible and useful.
Earlier this year, we introduced Environmental HDR, which brings real world lighting to AR objects and scenes, enhancing immersion with more realistic reflections, shadows, and lighting. Today, we’re opening a call for collaborators to try another tool that helps improve immersion with the new Depth API in ARCore, enabling experiences that are vastly more natural, interactive, and helpful.
The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.
Example depth map, with red indicating areas that are close by, and blue representing areas that are farther away.

One important application for depth is occlusion: the ability for digital objects to accurately appear in front of or behind real world objects. Occlusion helps digital objects feel as if they are actually in your space by blending them with the scene. We will begin making occlusion available in Scene Viewer, the developer tool that powers AR in Search, to an initial set of over 200 million ARCore-enabled Android devices today.

A virtual cat with occlusion off and with occlusion on.

We’ve also been working with Houzz, a company that focuses on home renovation and design, to bring the Depth API to the “View in My Room” experience in their app. “Using the ARCore Depth API, people can see a more realistic preview of the products they’re about to buy, visualizing our 3D models right next to the existing furniture in a room,” says Sally Huang, Visual Technologies Lead at Houzz. “Doing this gives our users much more confidence in their purchasing decisions.”
The Houzz app with occlusion is available today.

In addition to enabling occlusion, having a 3D understanding of the world on your device unlocks a myriad of other possibilities. Our team has been exploring some of these, playing with realistic physics, path planning, surface interaction, and more.

Physics, path planning, and surface interaction examples.

When applications of the Depth API are combined together, you can also create experiences in which objects accurately bounce and splash across surfaces and textures, as well as new interactive game mechanics that enable players to duck and hide behind real-world objects.
A demo experience we created where you have to dodge and throw food at a robot chef.

The Depth API is not dependent on specialized cameras and sensors, and it will only get better as hardware improves. For example, the addition of depth sensors, like time-of-flight (ToF) sensors, to new devices will help create more detailed depth maps to improve existing capabilities like occlusion, and unlock new capabilities such as dynamic occlusion—the ability to occlude behind moving objects.
We’ve only begun to scratch the surface of what’s possible with the Depth API and we want to see how you will innovate with this feature. If you are interested in trying the new Depth API, please fill out our call for collaborators form.

ARCore updates to Augmented Faces and Cloud Anchors enable new shared cross-platform experiences

Posted by Christina Tong, Product Manager, Augmented Reality

Two years ago, we launched ARCore, our developer platform for building augmented reality (AR) experiences. Since then, we’ve seen developers create thousands of AR apps across Android and iOS that transform the way people play, shop, learn and create together. To enable even more shared cross-platform AR experiences, we’re announcing new updates to ARCore’s Augmented Faces and Cloud Anchors APIs.

Augmented Faces on iOS

Earlier this year, we announced our Augmented Faces API, which offers a high-quality, 468-point 3D mesh that lets users attach fun effects to their faces — all without a depth sensor on their smartphone. With the addition of iOS support rolling out today, developers can now create effects for more than a billion users. We’ve also made the creation process easier for both iOS and Android developers with a new face effects template.
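On the Android side, enabling the mesh and reading its vertices looks roughly like this Java sketch (assuming a valid Android context and per-frame updates):

    import com.google.ar.core.AugmentedFace;
    import com.google.ar.core.Config;
    import com.google.ar.core.Pose;
    import com.google.ar.core.Session;
    import com.google.ar.core.TrackingState;
    import java.nio.FloatBuffer;
    import java.util.EnumSet;

    // Create a session on the front-facing camera and enable the face mesh.
    Session session = new Session(context, EnumSet.of(Session.Feature.FRONT_CAMERA));
    Config config = new Config(session);
    config.setAugmentedFaceMode(Config.AugmentedFaceMode.MESH3D);
    session.configure(config);

    // Each frame, read the 468-vertex mesh for every tracked face.
    for (AugmentedFace face : session.getAllTrackables(AugmentedFace.class)) {
      if (face.getTrackingState() == TrackingState.TRACKING) {
        FloatBuffer vertices = face.getMeshVertices(); // 468 * (x, y, z) floats
        Pose noseTip = face.getRegionPose(AugmentedFace.RegionType.NOSE_TIP);
        // Attach effects relative to the mesh or to region poses like noseTip.
      }
    }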

Improvements to Cloud Anchors

Last year, we introduced the Cloud Anchors API, which lets developers create shared AR experiences across Android and iOS. Cloud Anchors let devices create a 3D feature map from visual data onto which anchors can be placed. The anchors are hosted in the cloud so multiple people can use them to enable shared real world experiences. Cloud Anchors power a wide variety of cross-platform apps, like Just a Line, PHAROS AR and Spacecraft AR.

In our latest ARCore update, we’ve made some improvements to the Cloud Anchors API that make hosting and resolving anchors more efficient and robust. This is due to improved anchor creation and visual processing in the cloud. Now, when creating an anchor, more angles across larger areas in the scene can be captured for a more robust 3D feature map. Once the map is created, the visual data used to create the map is deleted and only anchor IDs are shared with other devices to be resolved. Moreover, multiple anchors in the scene can now be resolved simultaneously, reducing the time needed to start a shared AR experience.
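Resolving several anchors at once amounts to repeated asynchronous calls; here is a hedged Java sketch, assuming sharedAnchorIds arrived from another device:

    import com.google.ar.core.Anchor;
    import com.google.ar.core.Anchor.CloudAnchorState;
    import java.util.ArrayList;
    import java.util.List;

    // Kick off resolution for every shared ID; the calls are asynchronous,
    // so the anchors can localize in parallel.
    List<Anchor> pending = new ArrayList<>();
    for (String id : sharedAnchorIds) {
      pending.add(session.resolveCloudAnchor(id));
    }

    // Poll each frame; SUCCESS means the anchor's pose is ready to use.
    for (Anchor anchor : pending) {
      if (anchor.getCloudAnchorState() == CloudAnchorState.SUCCESS) {
        // Place shared content at anchor.getPose().
      }
    }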

These updates to Cloud Anchors are available for developers today.

Persistent Cloud Anchors and Call for Collaborators

As we look to the future, we’re taking steps to expand the scale and timeline of shared AR experiences with persistent Cloud Anchors. We see this as enabling a “save button” for AR, so that digital information overlaid on top of the real world can be experienced at anytime.

Imagine working together on a redesign of your home throughout the year, leaving AR notes for your friends around an amusement park, or hiding AR objects at specific places around the world to be discovered by others.

Persistent Cloud Anchors are powering Mark AR, a social app being developed by Sybo and iDreamSky that lets people create, discover, and share their AR art with friends and followers in real-world locations. With persistent Cloud Anchors, users can continually return to their pieces as they create and collaborate over time.

Mark AR is an app that lets people create and discover AR art in real-world locations.

Reliably anchoring AR content for every use case—regardless of surface, distance, and time—pushes the limits of computation and computer vision because the real world is diverse and always changing. By enabling a “save button” for AR, we’re taking an important step toward bridging the digital and physical worlds to expand the ways AR can be useful in our day-to-day lives.

We’re currently looking for more developers to help us explore and test persistent Cloud Anchors in real world apps at scale, before making the feature broadly available. If you’re interested in early access, you can apply here.

Updates to ARCore Help You Build More Interactive & Realistic AR Experiences

Posted by Anuj Gosalia

A little over a year ago, we introduced ARCore: a platform for building augmented reality (AR) experiences. Developers have been using it to create thousands of ARCore apps that help people with everything from fixing their dishwashers, to shopping for sunglasses, to mapping the night sky. Since last I/O, we’ve quadrupled the number of ARCore-enabled devices to an estimated 400 million.

Today at I/O, we introduced updates to Augmented Images and Light Estimation, features that let you build more interactive and realistic experiences. And to make it easier for people to experience AR, we introduced Scene Viewer, a new tool that lets users view 3D objects in AR right from your website.

Augmented Images

To make experiences appear realistic, we need to account for the fact that things in the real world don’t always stay still. That’s why we’re updating Augmented Images — our API that lets people point their camera at 2D images, like posters or packaging, to bring them to life. The updates enable you to track moving images and multiple images simultaneously. This unlocks the ability to create dynamic and interactive experiences like animated playing cards where multiple images move at the same time.
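A rough Java sketch of the moving-image flow (the image name and Bitmap are placeholder inputs; TrackingMethod distinguishes actively tracked images from ones with only a last-known pose):

    import com.google.ar.core.AugmentedImage;
    import com.google.ar.core.AugmentedImageDatabase;
    import com.google.ar.core.Config;
    import com.google.ar.core.Pose;

    // Build a database of reference images and attach it to the session.
    AugmentedImageDatabase db = new AugmentedImageDatabase(session);
    db.addImage("playing_card", cardBitmap); // placeholder name and Bitmap

    Config config = session.getConfig();
    config.setAugmentedImageDatabase(db);
    config.setFocusMode(Config.FocusMode.AUTO); // helps with moving targets
    session.configure(config);

    // Each frame, multiple images can be tracked at once; FULL_TRACKING means
    // the image is actively tracked and may be moving.
    for (AugmentedImage img : frame.getUpdatedTrackables(AugmentedImage.class)) {
      if (img.getTrackingMethod() == AugmentedImage.TrackingMethod.FULL_TRACKING) {
        Pose center = img.getCenterPose(); // follows the image as it moves
      }
    }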

An example of how the Augmented Images API can be used with moving targets by JD.com

Light Estimation

Last year, we introduced the concept of light estimation, which provides a single ambient light intensity to extend real world lighting into a digital scene. In order to provide even more realistic lighting, we’ve added a new mode, Environmental HDR, to our Light Estimation API.

Before and after Environmental HDR is applied to the digital mannequin on the left, featuring 3D printed designs from Julia Koerner

Environmental HDR uses machine learning with a single camera frame to understand high dynamic range illumination in 360°. It takes in available light data, and extends the light into a scene with accurate shadows, highlights, reflections and more. When Environmental HDR is activated, digital objects are lit just like physical objects, so the two blend seamlessly, even when light sources are moving.

Digital mannequin on left and physical mannequin on right

Environmental HDR provides developers with three APIs to replicate real world lighting:

  • Main Directional Light: helps with placing shadows in the right direction
  • Ambient Spherical Harmonics: helps model ambient illumination from all directions
  • HDR Cubemap: provides specular highlights and reflections

Rockets showing lighting changes: main directional light plus ambient spherical harmonics plus HDR cubemap equals Environmental HDR
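In the Android SDK, these surface as methods on LightEstimate; here is a minimal Java sketch, assuming a configured session and the current frame:

    import android.media.Image;
    import com.google.ar.core.Config;
    import com.google.ar.core.LightEstimate;

    // Enable Environmental HDR light estimation.
    Config config = session.getConfig();
    config.setLightEstimationMode(Config.LightEstimationMode.ENVIRONMENTAL_HDR);
    session.configure(config);

    // Each frame, feed the three components into your renderer.
    LightEstimate estimate = frame.getLightEstimate();
    if (estimate.getState() == LightEstimate.State.VALID) {
      float[] direction = estimate.getEnvironmentalHdrMainLightDirection(); // shadows
      float[] intensity = estimate.getEnvironmentalHdrMainLightIntensity(); // RGB
      float[] harmonics = estimate.getEnvironmentalHdrAmbientSphericalHarmonics();
      Image[] cubemap = estimate.acquireEnvironmentalHdrCubeMap();          // reflections
      for (Image face : cubemap) {
        face.close(); // release faces after uploading to the GPU
      }
    }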

Scene Viewer

We want to make it easier for people to jump into AR, so today we’re introducing Scene Viewer, which lets AR experiences launch right from your website without requiring a separate app download.

To make your assets accessible via Scene Viewer, first add a glTF 3D asset to your website with the <model-viewer> web component, and then add the "ar" attribute to the markup. Later this year, experiences in Scene Viewer will begin to surface in your Search results.

    <model-viewer src="path/to/YOUR_MODEL.gltf" ar
        auto-rotate camera-controls alt="TEXT ABOUT YOUR MODEL"
        background-color="#455A64">
    </model-viewer>

NASA.gov enables users to view the Curiosity Rover in their space

These are a few ways that improving real world understanding in ARCore can make AR experiences more interactive, realistic, and easier to access. Look for these features to roll out over the next two releases. To learn more and get started, check out the ARCore developer website.