You now have the option to modify instance types and weights for a running EC2 Fleet or Spot Fleet (both referred to below simply as a fleet). You can replace an entire launch template configuration, specifying new instance types, weights, and other parameters, without deleting and re-creating the fleet.
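As a rough illustration of what such an in-place replacement looks like, the sketch below builds a new launch template configuration with updated instance types and weights. The fleet ID, launch template ID, instance types, and weights are placeholder values, not values from the announcement; the actual replacement would be applied with the EC2 `ModifyFleet` API (shown in a comment).

```python
# Hypothetical sketch: swapping in a new launch template configuration for a
# running EC2 Fleet. All identifiers below are placeholders.

def build_launch_template_config(launch_template_id, overrides):
    """Build a LaunchTemplateConfigs entry with new instance types and weights."""
    return {
        "LaunchTemplateSpecification": {
            "LaunchTemplateId": launch_template_id,
            "Version": "$Latest",
        },
        # Each override pairs an instance type with its capacity weight.
        "Overrides": [
            {"InstanceType": itype, "WeightedCapacity": weight}
            for itype, weight in overrides
        ],
    }

config = build_launch_template_config(
    "lt-0123456789abcdef0",
    [("m5.large", 1.0), ("m5.xlarge", 2.0)],
)
# With boto3, the configuration would then replace the fleet's current one:
#   boto3.client("ec2").modify_fleet(
#       FleetId="fleet-...", LaunchTemplateConfigs=[config])
```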
Today, we are announcing first-class support for RxJava as of Amplify Android 1.3.0. RxJava is a Java VM implementation of Reactive Extensions. RxJava is widely used by Android developers as a means to simplify asynchronous programming. Amplify Android is part of the open source Amplify Framework. Amplify makes it easy for developers to build Android apps with AWS-powered functionality, such as auth, data modeling, storage, and analytics.
Announcing the General Availability of Bottlerocket, a new open source Linux-based operating system purpose-built to run containers
Today, Amazon Web Services (AWS) announced the General Availability of Bottlerocket, a new open source Linux-based Operating System (OS) purpose-built to run containers. Bottlerocket includes only the software needed to run containers and comes with a transactional update mechanism. These properties enable customers to use container orchestrators to manage OS updates with minimal disruption, improving security and lowering operational costs for containerized applications. AWS-provided Bottlerocket images are available for Amazon EKS (GA) and Amazon ECS (Preview). Bottlerocket is developed as an open source project on GitHub.
Serverless IoT Platform Accelerator is an AWS Solutions Consulting Offer delivered via a consulting engagement from Onica, a Rackspace Technology Company and an AWS IoT and AWS Machine Learning Competency Partner. The Serverless IoT Platform Accelerator solves common implementation problems around data ingestion, storage, and monitoring of IoT devices. With this accelerator, customers can gain intelligence from their IoT data within minutes of deploying the cloud-native platform. Customers that request this consulting offer participate in an engagement that delivers IoT device connectivity, customized app features, data visualization, notifications, and device controls in a user-friendly interface, leading to insights from device data.
EMR Notebooks is a service that provides a fully managed, Jupyter-based notebook environment to data scientists and engineers who write ad-hoc jobs and experiment with them. You can now run EMR Notebooks non-interactively, which is especially useful for orchestrating ETL workloads in production. Before this feature, executing notebooks required access to the Jupyter user interface through the AWS Management Console.
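To make the non-interactive flow concrete, here is a hedged sketch of assembling a headless notebook run for the EMR `StartNotebookExecution` API. The editor ID, cluster ID, notebook path, and service role below are illustrative placeholders, and the actual API call is shown only in a comment.

```python
# Hypothetical sketch of a non-interactive EMR Notebooks run. The editor ID,
# cluster ID, notebook path, and role are placeholder values.

def build_notebook_execution_request(editor_id, notebook_path, cluster_id):
    """Assemble the parameters for a headless EMR Notebooks execution."""
    return {
        "EditorId": editor_id,            # the EMR Notebook to run
        "RelativePath": notebook_path,    # the .ipynb file within the notebook
        "ExecutionEngine": {"Id": cluster_id, "Type": "EMR"},
        "ServiceRole": "EMR_Notebooks_DefaultRole",
    }

request = build_notebook_execution_request(
    "e-0ABCDEFGHIJKLMNOPQRSTUVWX", "etl/nightly_load.ipynb", "j-3ABCDEFGHIJKL")
# With boto3 the run would then be started without any Jupyter UI involved:
#   emr = boto3.client("emr")
#   execution_id = emr.start_notebook_execution(**request)["NotebookExecutionId"]
```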
Amazon WorkSpaces adds support for AWS Resource Groups Tag Editor. AWS Resource Groups Tag Editor allows you to add, edit, or delete AWS tags from your WorkSpaces along with your other AWS resources. AWS Resource Groups makes it easier to manage and automate tasks on large numbers of AWS resources at one time. AWS Tag Editor allows you to search for AWS resources that you want to tag across multiple services and apply common tags to those resources.
Amazon AppStream 2.0 adds support for AWS Resource Groups Tag Editor. AWS Resource Groups Tag Editor allows you to add, edit, or delete AWS tags from your image builders, fleets, and stacks along with your other AWS resources. AWS Resource Groups makes it easier to manage and automate tasks on large numbers of AWS resources at one time. AWS Tag Editor allows you to search for AWS resources that you want to tag across multiple services and apply common tags to those resources.
Posted by Vikram Sharma, Software Engineering Intern; Jianing Wei, Staff Software Engineer; Tyler Mullen, Senior Software Engineer
Today, we are excited to release the Instant Motion Tracking solution in MediaPipe. It is built upon the MediaPipe Box Tracking solution we released previously. With Instant Motion Tracking, you can easily place fun virtual 2D and 3D content on static or moving surfaces, allowing them to seamlessly interact with the real world. This technology also powered MotionStills AR. Along with the library, we are releasing an open source Android application to showcase its capabilities. In this application, a user simply taps the camera viewfinder in order to place virtual 3D objects and GIF animations, augmenting the real-world environment.
Instant Motion Tracking in MediaPipe
Instant Motion Tracking
The Instant Motion Tracking solution provides the capability to seamlessly place virtual content on static or moving surfaces in the real world. To achieve that, we provide six-degrees-of-freedom (6DoF) tracking with relative scale in the form of rotation and translation matrices. This tracking information is then used in the rendering system to overlay virtual content on camera streams to create immersive AR experiences.
The core concept behind Instant Motion Tracking is to decouple the camera’s translation and rotation estimation, treating them instead as independent optimization problems. This approach enables AR tracking across devices and platforms without initialization or calibration. We do this by first finding the 3D camera translation using only the visual signals from the camera. This involves estimating the target region’s apparent 2D translation and relative scale across frames. The process can be illustrated with a simple pinhole camera model, relating translation and scale of an object in the image plane to the final 3D translation.
By finding the change in relative size of our tracked region from view position V1 to V2, we can estimate the relative change in distance from the camera.
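The pinhole relation described above can be sketched in a few lines: under a pinhole model the apparent size of an object is inversely proportional to its distance, so the ratio of the tracked region's apparent scales between two views directly gives the relative change in depth. The values here are illustrative, not taken from the paper.

```python
# Minimal sketch of the pinhole relation: apparent size scales as 1/distance,
# so a scale change between views V1 and V2 yields the relative depth change.

def relative_depth_change(scale_v1, scale_v2):
    """Return d2/d1 given the region's apparent scales in views V1 and V2."""
    # s = f * S / d  =>  d2/d1 = s1/s2  (focal length f and true size S cancel)
    return scale_v1 / scale_v2

# A region that appears twice as large in V2 is half as far from the camera.
assert relative_depth_change(1.0, 2.0) == 0.5
```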
Next, we obtain the device’s 3D rotation from its built-in IMU (Inertial Measurement Unit) sensor. By combining this translation and rotation data, we can track a target region with six degrees of freedom at relative scale. This information allows for the placement of virtual content on any system with a camera and IMU functionality, and is calibration free. For more details on Instant Motion Tracking, please refer to our paper.
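As a schematic illustration (not MediaPipe's actual API), combining the IMU rotation with the estimated translation amounts to packing a 3x3 rotation matrix and a 3-vector translation into a single 4x4 model matrix, the standard 6DoF pose representation a renderer consumes. Plain Python lists keep the sketch self-contained.

```python
# Illustrative sketch: packing an IMU-derived 3x3 rotation and the estimated
# 3D translation into a 4x4 model matrix (a 6DoF pose at relative scale).

def make_model_matrix(rotation3x3, translation3):
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 pose."""
    m = [row[:] + [t] for row, t in zip(rotation3x3, translation3)]
    m.append([0.0, 0.0, 0.0, 1.0])  # homogeneous bottom row
    return m

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
pose = make_model_matrix(identity, [0.1, -0.2, 0.5])  # placeholder values
```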
A MediaPipe Pipeline for Instant Motion Tracking
A diagram of the Instant Motion Tracking pipeline is shown below, consisting of four major components: a Sticker Manager module, a Region Tracking module, a Matrices Manager module, and lastly a Rendering System. Each component consists of MediaPipe calculators or subgraphs.
Diagram of Instant Motion Tracking Pipeline
The Sticker Manager accepts sticker data from the application and produces initial anchors (tracked region information) based on user taps, and user gesture controls for every sticker object. Initial anchors are then sent to our Region Tracking module to generate tracked anchors. The Matrices Manager combines this data with our device’s rotation matrix to produce six degrees-of-freedom poses as model matrices. After integrating any user-specified transforms like asset scaling, our final poses are forwarded to the Rendering System to render all virtual objects overlaid on the camera frame to produce the output AR frame.
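The four-stage dataflow above can be sketched schematically. This is not MediaPipe's actual API; each function below is a hypothetical stand-in for the corresponding calculator or subgraph, showing only how data moves from user taps to the rendered AR frame.

```python
# Schematic stand-ins for the pipeline stages:
# taps -> initial anchors -> tracked anchors -> model matrices -> AR frame.

def sticker_manager(taps):
    """Turn user taps into initial anchors (tracked-region information)."""
    return [{"id": i, "position": tap} for i, tap in enumerate(taps)]

def region_tracker(anchors, frame):
    """Stand-in for region tracking: update each anchor against the frame."""
    return [dict(a, tracked=True) for a in anchors]

def matrices_manager(tracked, device_rotation):
    """Combine tracked anchors with the device rotation into 6DoF poses."""
    return [{"anchor": a["id"], "rotation": device_rotation,
             "translation": a["position"]} for a in tracked]

def render(frame, model_matrices):
    """Overlay virtual objects on the camera frame using the poses."""
    return {"frame": frame, "overlays": model_matrices}

anchors = sticker_manager([(0.5, 0.5)])           # one tap at screen center
tracked = region_tracker(anchors, "camera_frame")
poses = matrices_manager(tracked, "imu_rotation")
out = render("camera_frame", poses)               # the output AR frame
```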
Using the Instant Motion Tracking Solution
The Instant Motion Tracking solution is easy to use by leveraging the MediaPipe cross-platform framework. With camera frames, device rotation matrix, and anchor positions (screen coordinates) as input, the MediaPipe graph produces AR renderings for each frame, providing engaging experiences. If you wish to integrate this Instant Motion Tracking library with your system or application, please visit our documentation to build your own AR experiences on any device with IMU functionality and a camera sensor.
Augmenting The World with 3D Stickers and GIFs
The Instant Motion Tracking solution lets you bring both 3D stickers and GIF animations into augmented reality experiences. GIFs are rendered on flat 3D billboards placed in the world, introducing fun and immersive experiences with animated content blended into the real environment. Try it for yourself!
Demonstration of GIF placement in 3D
MediaPipe Instant Motion Tracking is already helping PixelShift.AI, a startup applying cutting-edge vision technologies to facilitate video content creation, to track virtual characters seamlessly in the viewfinder for a realistic experience. Building upon Instant Motion Tracking’s high-quality pose estimation, PixelShift.AI enables VTubers to create mixed reality experiences with web technologies. The product will be released to the broader VTuber community later this year.
Instant Motion Tracking helps PixelShift.AI create mixed reality experiences
We look forward to publishing more blog posts related to new MediaPipe pipeline examples and features. Please follow the MediaPipe label on the Google Developers Blog and the Google Developers Twitter account (@googledevs).
We would like to thank Vikram Sharma, Jianing Wei, Tyler Mullen, Chuo-Ling Chang, Ming Guang Yong, Jiuqiang Tang, Siarhei Kazakou, Genzhi Ye, Camillo Lugaresi, Buck Bourdon, and Matthias Grundmann for their contributions to this release.
Amazon RDS for Oracle now supports July 2020 Oracle Patch Set Updates (PSU) and Release Updates (RU)
Amazon RDS for Oracle now supports the July 2020 Patch Set Updates (PSU) for Oracle Database 11.2 and 12.1, and July 2020 Release Update (RU) for Oracle Database 12.2, 18c and 19c.
Today, we are excited to announce the general availability (GA) of an Amazon GameLift FleetIQ update that enables game developers to add low-cost, low-latency GameLift servers to their existing on-premises or cloud-based server capacity. GameLift is an AWS managed service for deploying, operating, and scaling dedicated servers for multiplayer games, and it is trusted by some of the most successful game companies in the world, including Ubisoft, Gameloft, and N3TWORK. With this update, developers can launch low-cost GameLift servers into their AWS accounts and register the servers with their existing game server management systems to incrementally migrate live games, burst in-game events, or deploy containerized games onto AWS. To learn more about the GameLift FleetIQ update, visit our blog.
Amazon EC2 can now hibernate EBS-backed M5a and R5a instances. You can now hibernate your newly launched instances running on M5a and R5a instance types. Hibernation provides the convenience of pausing your workloads and resuming them later from the saved state. Hibernation is just like closing and opening your laptop lid: your application starts right where it left off.