How Udacity students succeed with Google Cloud

Editor’s note: Today we hear from Udacity, which uses a variety of Google Cloud technologies for its online learning platform. Read on to learn how they built online workspaces that give students immediate access to fast, isolated compute resources and private data sets. 

At Udacity, we use advanced technologies to teach about technology. One example is our interactive “Workspaces,” which students use to gain hands-on experience with a variety of advanced topics like artificial intelligence, data science, programming and cloud. These online environments comprise everything from SQL interpreters to coding integrated development environments (IDEs), Jupyter Notebooks and even fully functional 3D graphical desktops—all accessible via an everyday browser.

Udacity’s latest IDE environment, “uLab,” where “Learning Guides” can demonstrate skills interactively.

To build these Workspaces, we relied heavily on Google Cloud Platform (GCP) in numerous interesting and novel ways. This article details our implementation and where we hope to take it in the future. 

Workspaces design goals

Udacity customers are smart, busy learners from all over the world, who access our courses remotely. To meet their needs, we designed Udacity Workspaces to:

  • Be ready to use in under 15 seconds

  • Offer advanced functionality directly inside the browser-based Udacity Classroom

  • Instantly furnish starter and example files to students in a new Workspace, and automatically save all student work and progress for the next session

  • Provide quick access to large external datasets

  • Function well with any session duration… from two minutes to four hours, or more

  • Provide reliable compute availability and GPU power wherever needed

We chose GCP for its ease of use, reliability, and cost-effectiveness. Let’s see how we used different GCP offerings to meet these goals.

Fast, personalized access to Workspaces 

Students demand immediate access to their Workspaces, but booting up a full GCP host from an image can take a while. That’s OK if a student plans on using their Workspace for an hour, but not if they’re using it for a two-minute Workspace coding challenge.

To address this, we built a custom server management tool (“Nebula”) that maintains pools of ready servers to assign to students immediately. To control costs, the pools are sized by a custom usage-pressure measurement algorithm that keeps them surge-ready while shrinking them to as little as a single instance during idle periods. Pools are maintained in multiple data centers to maximize access to GPUs.
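Nebula is our own internal tool, but the pool-sizing idea is straightforward to sketch. The snippet below is a hypothetical illustration only (the function name and the pressure formula are assumptions for this post, not the actual Nebula algorithm): the target pool size tracks recent demand with some surge headroom, and never drops below a single warm instance.

```python
# Hypothetical sketch of usage-pressure pool sizing (not the actual Nebula code).
import math

def target_pool_size(claims_last_10_min: int,
                     avg_startup_seconds: float,
                     min_pool: int = 1,
                     surge_factor: float = 1.5) -> int:
    """Estimate how many warm servers a pool should hold.

    claims_last_10_min: how many students claimed a Workspace recently.
    avg_startup_seconds: time to boot and prepare a replacement host.
    surge_factor: headroom so a burst of students doesn't drain the pool.
    """
    claims_per_second = claims_last_10_min / 600.0
    # Expected demand while a claimed host is being replaced.
    expected_demand = claims_per_second * avg_startup_seconds
    # Keep at least one instance warm even when the platform is idle.
    return max(min_pool, math.ceil(expected_demand * surge_factor))
```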

GCP’s by-the-second pricing and flexible reservations policy served us well here. Given the short usage windows of some student exercises, hourly billing or bulk billing might have proved cost-prohibitive.

Having ready-to-go server pools minimizes startup time, but we also needed to place “starter files,” or later on, the student’s own work from a previous session, onto the hosts as quickly as possible. After experimenting with several approaches, we decided to store these files as tarballs in Cloud Storage. We found that we can copy up to 3GB to and from Cloud Storage within our SLA time window, so we set a hard limit of 3GB for student drives.
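As a rough sketch of that save/restore flow, here is what the tarball handling might look like with the google-cloud-storage Python client. The bucket and object names are illustrative, not our production layout.

```python
# Illustrative sketch only; bucket and object names are placeholders.
import os
import tarfile
from google.cloud import storage

MAX_DRIVE_BYTES = 3 * 1024 ** 3  # 3 GB hard limit on student drives

def save_workspace(local_dir: str, student_id: str, bucket: str = "student-workspaces"):
    archive = f"/tmp/{student_id}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(local_dir, arcname=".")
    if os.path.getsize(archive) > MAX_DRIVE_BYTES:
        raise ValueError("student drive exceeds the 3 GB limit")
    blob = storage.Client().bucket(bucket).blob(f"{student_id}/latest.tar.gz")
    blob.upload_from_filename(archive)

def restore_workspace(local_dir: str, student_id: str, bucket: str = "student-workspaces"):
    archive = f"/tmp/{student_id}.tar.gz"
    blob = storage.Client().bucket(bucket).blob(f"{student_id}/latest.tar.gz")
    blob.download_to_filename(archive)
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(local_dir)
```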

Every time a student’s session goes idle for half an hour, we deallocate the host, compress and copy the student’s files to Cloud Storage, then delete the host. In this manner we make time-stamped backups of each session’s files, which students can opt to restore any time they need to (via the Workspaces GUI). An alternative approach could be to leverage Cloud Storage’s version control, which provides access to GCP’s lifecycle controls as well. However, at the time we built the student files storage system, this GCP feature was still in beta, so we opted for a home-grown facility.
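For comparison, the Cloud Storage alternative mentioned above could look roughly like this with the google-cloud-storage client: turn on object versioning and let a lifecycle rule expire old, noncurrent versions. The bucket name and retention window here are assumptions for illustration.

```python
# Sketch of the Cloud Storage alternative: object versioning plus a lifecycle rule.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("student-workspaces")  # placeholder bucket name

# Keep older generations of each student tarball instead of writing
# home-grown, time-stamped backups.
bucket.versioning_enabled = True

# Let lifecycle management expire noncurrent versions after 90 days (assumed window).
bucket.add_lifecycle_delete_rule(age=90, is_live=False)
bucket.patch()
```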

In addition, we take advantage of Cloud Functions to duplicate the student files in a second region to ensure against regional outages. Side note: if we were to build this feature today, we could take advantage of dual-region buckets to automatically save student files in two regions.
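A minimal sketch of that replication step, written as a Cloud Functions background function triggered when a tarball is finalized in the primary bucket (the bucket names are placeholders, not our actual resources):

```python
# Hypothetical sketch of the cross-region copy as a GCS-triggered Cloud Function.
from google.cloud import storage

REPLICA_BUCKET = "student-workspaces-replica"  # bucket in a second region (placeholder)

def replicate_workspace(event, context):
    """Background Cloud Function: copy each newly written tarball to the replica bucket."""
    client = storage.Client()
    source_bucket = client.bucket(event["bucket"])
    source_blob = source_bucket.blob(event["name"])
    destination_bucket = client.bucket(REPLICA_BUCKET)
    source_bucket.copy_blob(source_blob, destination_bucket, event["name"])
```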

Access to large datasets

From time to time, students need to access large datasets, e.g., in our machine learning courses. Rather than baking these datasets into server images, we mount read-only drives to share a single dataset across multiple student hosts. We can publish updated datasets on new shared drives, and Nebula can point new sessions at those drives without interrupting existing session mounts.
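For illustration, attaching one of these shared dataset disks to a freshly assigned student host could look like the following sketch with the google-cloud-compute client; the project, zone, and disk names are assumptions, not our actual resources.

```python
# Sketch of attaching a shared dataset disk read-only to a student host.
from google.cloud import compute_v1

def attach_dataset(project: str, zone: str, instance: str, dataset_disk: str):
    disk = compute_v1.AttachedDisk(
        source=f"projects/{project}/zones/{zone}/disks/{dataset_disk}",
        mode="READ_ONLY",      # the same disk can be mounted read-only by many hosts
        auto_delete=False,     # the dataset outlives any single student host
        device_name="dataset",
    )
    compute_v1.InstancesClient().attach_disk(
        project=project, zone=zone, instance=instance, attached_disk_resource=disk
    )
```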

To date, we’ve never run into a concurrent read-only mount limit for these drives. However, we do see a need for quick-mount read-write dataset drives. One example could be a large SQL database that a student is expected to learn to modify in bulk. Duplicating a large drive on the fly isn’t feasible, so one approach could be to manage a pool of writeable drive copies to mount just-in-time, or to leverage Google Cloud’s Filestore.

With the Filestore approach, you’d pre-create many copies of data drives in a folder tree, and mount a particular folder on the Filestore to a specific student’s container when access is needed; that copy would then never be assigned to anybody else, and asynchronously deleted/replaced with a fresh, unaltered copy when the student’s work is finished.

Consistent compute power

In a shared environment (e.g., Google Kubernetes Engine), one student’s runaway process could affect the compute performance of another student’s entire container (on the same metal). To avoid that, we decided on a “one-server-per-student” model, where each student gets access to a single Compute Engine VM, running several Docker containers—one container for the student’s server, another for an auto-grading system, and yet another for handling file backups and restores. In addition to providing consistent compute power, this approach also has a security advantage: it allows us to run containers in privileged mode, say, to use specialized tools, without risking a breach beyond the single VM allocated to any one student.
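As a purely illustrative sketch (not our actual orchestration code), here is how the per-student containers might be started on a dedicated VM with the Docker SDK for Python; the image names are placeholders.

```python
# Illustrative sketch: starting the per-student containers on a dedicated VM.
import docker

client = docker.from_env()

# The student's own server can run privileged because the VM is theirs alone.
client.containers.run("udacity/student-ide:latest", detach=True,
                      privileged=True, name="student-server")
client.containers.run("udacity/autograder:latest", detach=True, name="grader")
client.containers.run("udacity/backup-agent:latest", detach=True, name="backups")
```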

This architecture also ensures that GPU-equipped hosts aren’t shared either, so students benefit from all available performance. This is especially important as students fire up long-running, compute-intensive jobs such as image recognition.

As a cost control measure, we meter GPU host usage and display the remaining available GPU time to students, so they can switch their GPUs on and off. This “switching” actually allocates a new host from our pools to the student (either GPU-enabled or not). Because we can do the switch in under 15 seconds, it feels much like flipping a toggle switch, but some aspects of the session (such as open files) may be reset (e.g., in an IDE configuration). We encourage students to ration their GPU time and perform simpler tasks such as editing or file management in “CPU mode.”

One of our GPU host configurations provides an in-browser Ubuntu desktop with pass-through Nvidia K80 GPUs for high-performance compute and graphics. This configuration is heavily used by our Autonomous Systems students, who run graphics-intensive programs like Gazebo for robot environment simulations. You can read more about this configuration here.


Wanted: flexible and isolated images

This configuration has hit all our goals except one: true image flexibility. Our large variety of courses requires many different software installations. Normally such needs would be satisfied with containers, but the requirement for isolated compute environments eliminates that option.

In the past two years, we’ve empowered hundreds of thousands of Udacity students to advance their careers and learn new skills with powerful learning environments called Workspaces, built on top of GCP. Throughout, GCP has proven itself to be a robust platform and a supportive partner, and we look forward to future product launches on top of Google Cloud. If you’d like to learn more about the solutions we’ve built, feel free to reach out to me on Twitter, @Atlas3650.

How Sunrun sheds light on efficiency and growth with Chrome Enterprise

Editor’s note: Today’s post is by Joe Romeo, Lead Systems Administrator at Sunrun, the leading home solar, battery storage, and energy services company. Sunrun uses Chrome Enterprise to grow the business and help employees work efficiently.

The solar market is very competitive. Solar providers need to devote an enormous amount of technology and resources toward winning sales and getting ahead, and as the market leader for residential solar, Sunrun needs to continuously iterate, improve, and focus on efficiency in order to grow our business. Chrome Enterprise and Dell Chromebooks have become our go-to solutions for doing what we do better than anyone else.

How can we solve day-to-day business problems?  

We’ve standardized on Chrome Browser, and all of our employees use G Suite. That way, we can be device-agnostic, which works well in a business where people use all types of technology, including Android devices, iPhones, and every kind of computer. Ideally, we’d like everyone to use Dell Chromebooks moving forward, but first we wanted to understand who needs them and why.

We reached out to team leads in every business unit and asked what tools they used to get work done. We got some surprising feedback. For instance, our project coordinators’ workflow consisted of taking pictures of customers’ roofs before and after solar panel installations, then shrinking every picture down to thumbnail size on their Windows laptops before uploading them to Salesforce—a process that took a lot of time. 

We figured out a better way. We gave the project coordinators Chromebooks, which greatly simplified the process. We set up shared drives in Google Drive, with folders linking directly to Salesforce, eliminating several steps. It was a good example of how we can speed up work processes using better tools and communication between teams and our information technology department.

Chrome Browser also inspires us to get creative with homemade extensions. Our Salesforce team built an extension that lets people submit a trouble ticket with just one click, including adding a screenshot of the Salesforce webpage issue. The extension works much better than the old process of opening up a trouble ticket in Salesforce, which was frustrating and a barrier for people when seeking help.


How can we get devices into people’s hands faster?

We’re a lean IT team, with about 35 people serving 4,400 workers. We don’t want to be a bottleneck for employees, like busy salespeople, who need laptops. In the past, we’d collect laptops from former employees and ship them to one of our onsite IT departments. Local IT professionals spent up to two hours (in between support calls) re-imaging each laptop and installing security software before shipping them around the country to new hires, who might wait up to a week to receive devices. Employees never liked this slow and costly process. It’s frustrating to show up on day one at a new job and not have the tools you need to work.

Chromebooks completely changed device handouts for the sales team. With Chrome Enterprise Upgrade, we can wipe Chromebooks remotely in five minutes or less—no shipping delays and no time-wasting. We created organizational units for each team’s laptop in the Admin console, each with different policies. It’s a much better IT management and security experience.

How can we onboard and train employees better?

Our overall onboarding process needed work even beyond the sales team. If you’re handing out new devices, training and change management are really important. Fortunately, Chrome Enterprise and G Suite ensure that our “laptop on day one” goal is met.

Our support teams have gone through extensive Chromebook training. Most of them use a Chromebook every day, so when employees call with an issue, support techs can walk them through it using their own devices. 

Google’s training documentation was very helpful, especially the support pages with lots of pictures (we love pictures around here!). In fact, we created GIFs for our monthly “Tips and Tricks” emails, showing employees how to open and edit Excel files in Google Sheets, for example. It’s much more effective than just saying, “click this, then click that.”


When I came on board at Sunrun, our IT VP’s motto was, “Modernize and simplify.” These should be core values for any business. But in a company whose mission is to “make a planet run by the sun,” reducing wasteful processes and improving efficiency are right in line with what we do. Chrome Enterprise and Chromebooks help us solve these pain points, and every day, they sprout all kinds of new benefits.

Code to Inspire brings developer skills to girls in Afghanistan with the help of G Suite

Born in Iran as a refugee during the Soviet invasion of Afghanistan, I understand the challenges many people face there to get access to formal education. I was one of eight children in a progressive, yet financially limited, family. We left everything behind in Herat to move to a new country. To make ends meet, my mother sold handmade clothing. She invested what little she earned in my education, which made it possible for me to finish high school. 

While opportunities for education in Afghanistan have increased over the past few decades, there are still many barriers that stand in the way of education for Afghan women—familial expectations, socioeconomic circumstances, cultural stigmas, societal norms and even safety issues. These circumstances make it challenging for women to find work and explain why only 19 percent participate in the workforce, 84 percent lack formal education and are often illiterate, and just 2 percent have access to higher education.

2001 marked the fall of the Taliban in Afghanistan, and many Afghan families, including my own, found hope in their motherland again. The next year, I returned to Herat. Seeing my peers—women just like me with so much potential—I felt compelled to do something. But I knew that in order to help others, I first needed to further my own education. I earned a bachelor’s degree in Computer Science, and later a Master’s degree from the Technical University of Berlin in Germany.

When I returned to Afghanistan after school, I hoped to share my newly-minted tech skills with Herat women by teaching at the local university. At the time, few women were participating in the public workforce and there were still many extremist, conservative views in the country. I was vocal about inequalities and faced backlash from the community; because of these threats, I came to the United States as an asylum seeker in 2012. 

After arriving in the United States, I was inspired to start Code to Inspire, the first coding school in Afghanistan for girls between the ages of 15 and 25, providing free after-school education in gaming, web development, graphic design, mobile applications and full-stack development. Our goal is to empower women with the programming skills they need to drive change in their communities and gain equal access to opportunities and financial independence.

Determined to make this coding school happen, I found myself faced with an interesting obstacle: how could I build an education center in Afghanistan without ever leaving the US? 

To make this possible, I turned to technology—just as I had hoped my future students would. I took a side job teaching Farsi to pay my bills, and built the blueprint for Code to Inspire from a laptop in Brooklyn. I managed everything from my computer working from cafes in New York: fundraising, shipping equipment, recruiting mentors, registering applicants, and developing the curriculum. To move things forward day-to-day, I use G Suite to communicate and collaborate with my team. I use Hangouts Meet to connect with students remotely during weekly calls and monthly check-ins. We use Google Docs, Sheets and Slides to create essential documents and keep track of our operations. We even used Slides for our first pitch deck, which I shared with my board to get their feedback. This power and connectivity enabled a refugee to make her dream come true, and built up the digital literacy of the women I was working with back home!

Since we opened our doors, 150 girls have studied with us, and the students have created many interesting projects. One inspiring example is the popular mobile game Afghan Hero Girl, which has taken off in Afghanistan and abroad, and was even recognized by the local government. The girls developed the game to feature a female protagonist dressed in traditional Afghan garb, who overcomes obstacles that are relatable to them.

Over 50 women have graduated from Code to Inspire, and 20 have secured remote full-time employment and freelance projects. Many of these women are even out-earning their male relatives who have become huge supporters of the program.

By 2030, we hope to open two additional schools in Kabul and Mazar that can serve up to 500 girls and provide employment opportunities within six months of graduation. From the ruins of a shattered nation and shattered lives of refugees can come treasure, if we know where to find it. For me, the girls in Afghanistan are the treasure and investing in their education is the future of a peaceful Afghanistan.

How Google Cloud helped Phoenix Labs meet demand spikes with ease for its hit multiplayer game Dauntless

In the role-playing video game Dauntless, players work in groups to battle monsters and protect the city-island of Ramsgate. Commitment reaps big rewards: with every beast slain, you earn new weapons and armor made of the same materials as the Behemoth you took down, strengthening your arsenal for the next battle.

And when creating Dauntless, game studio Phoenix Labs channeled these same values of resourcefulness, teamwork, and persistence. But instead of using war pikes and swords, it wielded the power of the cloud to achieve its goals.  

Preparing for unknown battles with containers and the cloud

For the gaming industry, launches bring unique technological challenges. It’s impossible to predict if a game will go viral, and developers like Phoenix Labs need to plan for a number of scenarios without knowing exactly how many players will show up and how much server capacity will ultimately be needed. In addition, since Dauntless was the first game in the industry to launch cross-platform—available on PlayStation 4, Xbox One, and PCs—it would be critical for all the underlying cloud-based services to work together flawlessly and provide an uninterrupted, real-time and consistent experience for players around the globe.

As part of staying agile to meet player needs, Phoenix Labs runs all its game servers in containers on Google Cloud Platform (GCP). The studio has a custom Google Kubernetes Engine (GKE) cluster in each region where Dauntless is available, across five continents (North America, Australia, Europe and Asia). When a player loads the game, Dauntless matches him or her with up to three other players, forming a virtual team that is taken to a neighboring island to hunt a Behemoth monster together. Each “group hunt” runs on an ephemeral pod on GKE, lasting for about 15 minutes before the players complete their assignment and return to Ramsgate to polish their weapons and prepare for the next battle. 
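As a hypothetical sketch of that pattern, here is how an ephemeral per-hunt pod might be created with the Kubernetes Python client; the namespace, labels, image, and port are assumptions, not Phoenix Labs’ actual configuration.

```python
# Hypothetical sketch: one short-lived game-server pod per group hunt.
from kubernetes import client, config

def start_hunt_pod(match_id: str, namespace: str = "hunts"):
    config.load_kube_config()  # or load_incluster_config() when running in the cluster
    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name=f"hunt-{match_id}",
                                     labels={"app": "dauntless-hunt"}),
        spec=client.V1PodSpec(
            restart_policy="Never",  # the pod is torn down when the hunt ends
            containers=[client.V1Container(
                name="game-server",
                image="registry.example.com/dauntless/hunt-server:latest",  # placeholder
                ports=[client.V1ContainerPort(container_port=7777)],
            )],
        ),
    )
    client.CoreV1Api().create_namespaced_pod(namespace=namespace, body=pod)
```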

“Containerizing servers isn’t very common in the gaming industry, especially for larger games,” said Simon Beaumont, VP Technology at Phoenix Labs. “Google Cloud spearheaded this effort with their leadership and unique technology expertise, and their platform gave us the flexibility to use Kubernetes-as-a-service in production.”


Addressing player and customer needs at launch and beyond

When Dauntless launched out of beta earlier this year, the server capacity required turned out to be substantial. Within the first week, the player count quickly climbed to 4 million—rapid growth that was no small feat to accommodate.

Continuously addressing Reddit and Twitter feedback from players, Phoenix Labs’ lean team worked side by side with Google Cloud Professional Services to execute over 1,700 deployments to its production platform during the week of the launch alone. 

“Google Cloud’s laser focus on customers reaches a level I’ve never seen before,” said Jesse Houston, CEO and co-founder at Phoenix Labs. “They care just as much about our experience as a GCP customer as they do about our players. Without their ‘let’s go’ attitude, Dauntless would have been a giant game over.”


“Behemoth” growth, one platform at a time 

Now that Dauntless has surpassed 16 million unique players and launched on Nintendo Switch, Phoenix Labs is preparing to expand to new regions such as Russia and Poland (they recently launched in Japan) and take advantage of other capabilities across Google. For example, by leveraging Google Ads and YouTube as part of its digital strategy for Dauntless, Phoenix Labs onboarded 5 million new gamers in the first week of launch; using YouTube Masthead ads also increased exposure to its audience. Phoenix Labs has migrated to Google Cloud’s data warehouse, BigQuery, for its ease of use and speed, returning queries in seconds across trillions of rows of data. They’re even beginning to use the Google Sheets data connector for BigQuery to simplify reporting and ensure every decision is data-informed.

At Google Cloud, we’re undaunted by behemoth monsters—and the task of making our platform a great place to launch and run your multiplayer game. Learn more about how game developers of all sizes work with Google Cloud to take their games to the next level here.

A home away from home: Wayfair goes hybrid on Google Cloud with 100 Gbps Dedicated Interconnect

Whatever the requirement—everything from enterprise-readiness fundamentals like reliability, performance, and security, to innovations for enabling microservices architecture or hybrid/multi-cloud deployments—the Google Cloud networking portfolio has something to offer.  

For Wayfair Inc., an American e-commerce company and one of the world’s largest online sellers of home goods, globally resilient connectivity was the key to streamlining its move to the cloud. Formerly known as CSN Stores, the company was founded in 2002 and now sells over 14 million home furnishing and décor items from over 11,000 suppliers.

In early 2018, Wayfair was thinking about the best way to run and scale its on-prem server fleet to support its largest sales event of the year—Way Day. With existing data centers and compute capacity around the globe, Wayfair decided to explore a hybrid architecture in Google Cloud that would let them burst their capacity during events like Way Day and Black Friday.


Google Cloud’s hybrid connectivity networking products provide a fast and reliable connection between customers’ infrastructure and our cloud. Using Google Cloud’s 100 Gbps Dedicated Interconnect helped Wayfair minimize the risk of running out of network capacity during peak hours, while keeping data analysis running to deliver the products their customers need. The large pipe also allowed Wayfair to control capacity and improve troubleshooting, capacity planning, and forecasting.

“As we evolve our public cloud strategy, our customers depend on secure, high performance, and reliable connectivity from our data centers to Google Cloud. We turned to Google as our cloud provider because Google’s network has the throughput, bandwidth, and latency required for our business applications.” – Steve Crusenberry, Vice President, Infrastructure and Platform Engineering, Wayfair


Trying cloud on for size: URBN’s Nuuly builds from scratch with Google Cloud

Editor’s note: Today we’re hearing from Nuuly, the subscription-based clothing business of URBN. As a new brand within an established company, their small technology team had the opportunity to build their infrastructure from scratch. Here’s how they did it.

At URBN, we’re a retail company that includes several well-known U.S. retail brands: Urban Outfitters, Anthropologie, and Free People. When we decided to move into the subscription model space last year, our goal was to let our customers try out new styles by renting clothing. They can subscribe monthly and pick six items per month from our e-commerce site to wear as much as they like, then return them back to us. This brand-new service, called Nuuly, launched in mid-2019, just 10 months after we first conceived the idea. In those 10 months, we built an entire technology platform on Google Cloud from the ground up, choosing the right tools to power this new business—without any legacy apps.

From a technology perspective, we had different challenges and goals as a subscription provider than URBN does as an online and in-person retailer. We decided to put a team together to totally focus on this new retail rental model and its unique requirements. That meant we could make our own decisions about the platform and frameworks we’d use. We first had to think about the business model we were building, then assess what technology would best fit with that model.  

Signing up for cloud    

The overarching challenge we faced was creating the best solution for the job within a tight timeframe. Instead of just shopping for products, we went a layer deeper and looked at all aspects of this new market we were entering. The recurring revenue model is different from a typical retail revenue model, and so is managing inventory at the individual garment level, rather than the typical SKU level. But in the retail market, a lot of legacy tools were built for that model. Our existing enterprise system wasn’t able to support these new requirements, and it would have been much harder to retrofit than to build from the ground up. We knew we wanted a cloud platform so we could take advantage of provider infrastructure instead of ours for hosting.

When we started exploring the cloud technology options to power Nuuly, we had a few parameters established: We wanted a secure platform that would let us grow as the business grows, and since we’re not cloud experts, we wanted to have the option of using managed services. Because we were in a brand-new space, we wanted to be able to start small, but not be restricted when we did want to scale out. Another important aspect for us was that our technology would let us easily use data science and machine learning to do extensive personalization. 

As the engineering lead, I did a lot of up-front analysis to figure out which would be the best cloud partner for us. Google checked off a lot of boxes, and we knew the data management platform would support our needs, so we chose Google Cloud. Our 15-person team was able to build the entire new platform in the 10-month timeframe because we chose Google—we didn’t have to hire an operations team to solve the problem. Using Google Cloud services let us bootstrap quickly and build our business on top of that. That platform includes an entire warehouse management system, and the software powering the brand-new distribution center.

Powering our subscribers with data

With Google Cloud, we’ve built a data backbone that lets us integrate data from different sources and supports messaging, stream processing and ETL jobs. We chose Google Kubernetes Engine (GKE) along with multiple Google Cloud managed offerings, including BigQuery, Data Studio, Cloud Storage, TensorFlow, and more. We use an event-driven architecture with Kafka as the main event manager. We run a Confluent cluster on Google Cloud, which helps us meet our strict event data ordering requirement.

The biggest competitive advantage we get from Google Cloud is the managed services. With so many capabilities built into Google Cloud, it’s easy for us to adopt new features quickly and stay competitive. For example, we can capture all events streaming into BigQuery, then easily add a reporting dashboard for our business partners on the brand side. They can then make informed decisions on what to buy based on up-to-date user information. Using Google Cloud’s many features also means we can offer our customers more cool features so they can try new styles and new ways of expressing themselves. We can offer different channels to different customers based on age group, for example, to recommend new styles to try without having to buy them.
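As a small illustration of that event-capture pattern, the google-cloud-bigquery client supports streaming inserts along these lines; the dataset, table, and event fields below are assumptions, not Nuuly’s actual schema.

```python
# Minimal sketch of streaming events into BigQuery; table and fields are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "nuuly-analytics.events.subscription_events"  # hypothetical table

rows = [
    {"event_type": "garment_rented", "subscriber_id": "abc123", "sku": "dress-042"},
    {"event_type": "garment_returned", "subscriber_id": "abc123", "sku": "dress-042"},
]

errors = client.insert_rows_json(table_id, rows)  # streaming insert
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
```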

We also had strict scalability criteria when we set out to deploy cloud and build our new business. Our cloud platform with Google has scaled along with the business, so as we acquire new subscribers, we’re not increasing our operational costs at the same rate, as we would be with a more traditional on-prem system. With cloud, we can pay for resources as we use them, so cost savings and efficiency are also important metrics to track. And the cloud lets us scale without customer downtime, which translates into another key metric for us.

Our business is still evolving, so we’re just scratching the surface in terms of all we can offer our customers. What’s great is that with managed cloud services, we’re able to focus on the business side of things, rather than the technology side. Since we don’t have to manage the underlying resources ourselves, we can provision what we need and pay based on utilization.

Learn more about URBN and about Nuuly.

How Mynd uses G Suite to manage a flurry of acquisitions

Editor’s note: Today’s post is by Gordon Thomas, IT Architect for Mynd Property Management, based in Oakland, California and operating in 16 U.S. markets. Mynd uses G Suite Enterprise to manage acquisitions and keep its staff of 400 workers productive.

In the past six months, Mynd Property Management has doubled in size—in part by hiring, but also through acquisitions that help build our business. It’s an exciting time to be at Mynd—every week brings new people who need to be set up with email and network access, which can also be a bit hectic. To speed onboarding, we use G Suite Enterprise to get our new people working as soon as they walk through the door.

Streamlined onboarding
We started using G Suite when we launched the business because it fit our overall approach to technology—that is, deploying cloud tools that are flexible for our employees and easy for IT to manage. Good thing we picked G Suite, because given the pace of our acquisitions—13 at last count—we have our hands full with combining business systems and keeping people productive. Onboarding is literally 80 percent of our daily work! 

Some people on our team don’t have technical backgrounds, so we are careful to only introduce new software tools once people are prepared to use them. With G Suite, people need very little ramp-up time because it is an intuitive, integrated solution. In one of our in-person training sessions, we showed an accountant how to open up a Microsoft Excel file in Google Sheets, and he started working on it right away, without much help.

We’re a remote-first company, so working in the cloud is second nature to us. About 75 percent of our total workforce are remote full-timers. G Suite tools are optimized for mobile, so they’re especially useful for the 50 full-time property managers who work solely on mobile devices in the field. 

Better meetings, easier file storage
In Mynd’s early days, we used G Suite Basic, but upgraded to G Suite Enterprise after Google Cloud Premier Partner Suitebriar showed us the benefits of using tools like Hangouts Meet and shared drives in Google Drive. They’ve really helped us scale the business, so we can stay productive even in the midst of acquisitions—and of course, onboard people more quickly. 

Many G Suite tools have become things we can’t live without. Our executives got first crack at using Hangouts Meet Hardware, and loved it so much—especially the recording feature—that now all of our conference rooms have it. 

Google Drive got the same love from everyone. It’s all we need for file sharing and storage, which means we don’t have to pay for another storage solution. Drive also helps with onboarding; as soon we assign an employee a Google Account, she or he automatically gets access to the shared drives (in Drive) that they need to do their jobs. 

Customizing the tech stack
As much as we love G Suite, we still like to use some non-G Suite tools. It helps that we can tailor G Suite to fit in our technology stack any way we want. We’d never get that control and customization from other vendors in the productivity space.

For example, we love using Slack for internal chat and messaging, so we turned off Hangouts Chat. We like that we can control these channels, and that G Suite is not an all-or-nothing product. That means we’re not afraid to try out new G Suite features, since we can roll them back if they’re not right for us. 

The result of our day-to-day G Suite use is that we don’t suffer much down time even during acquisitions. We can pull relevant data from users’ old systems and easily automate user creation from start to finish. When we merge with companies that already use G Suite, we can move their data over to the business in less than a week.

When we recently moved over 80 new employees and 50 different web domains to G Suite, it was much less challenging than we expected. Since we don’t plan to slow down on acquisitions, we breathe easier knowing that G Suite keeps everyone working in the middle of so much change.

Packet Mirroring: Visualize and protect your cloud network

As networks grow in complexity, network and security administrators need to be able to analyze and monitor network traffic to respond to security breaches and attacks. However, in public cloud environments, getting access to network traffic can be challenging.

Many customers use advanced security and traffic inspection tools on-prem, and need the same tools to be available in the cloud for certain applications. Our new Packet Mirroring service is now in beta, and allows you to troubleshoot your existing Virtual Private Clouds (VPCs). With this service, you can use third-party tools to collect and inspect network traffic at scale, provide intrusion detection, application performance monitoring, and better security controls, helping you ensure the security and compliance of workloads running in Compute Engine and Google Kubernetes Engine (GKE). For more, watch this video.

For instance, Packet Mirroring lets you identify network anomalies within and across VPCs, including internal VM-to-VM traffic, traffic between VMs and endpoints on the internet, and traffic between VMs and Google services in production.

Packet Mirroring is available in all Google Cloud Platform (GCP) regions, for all machine types, for both Compute Engine instances and GKE clusters.

In short, Packet Mirroring allows you to:

  • Help ensure advanced network security by proactively detecting threats. Respond to intrusions with signature-based detection on predetermined attack patterns, and also identify previously unknown attacks with anomaly-based detection.

  • Improve application availability and performance with the capability to diagnose and analyze what’s going on over the wire instead of relying only on application logs.

  • Support regulatory and compliance requirements by logging and monitoring transactions for auditing purposes.


“Google Cloud’s new Packet Mirroring service accelerates our cloud adoption by giving us the visibility we need to secure our applications and protect our most precious asset, our customers.” – Diane Brown, Senior Director IT Risk Management, Ulta Beauty

Packet Mirroring is important for enterprise users from both a security and networking perspective. You can use Packet Mirroring in a variety of deployment setups for different network topologies, such as VPC Network Peering and Shared VPC. In Shared VPC environments, for instance, an organization may have packet mirroring policies and collector backends set up by the networking or security team in the host project, while the packet mirroring policy is enabled in the service projects where the developer teams run their applications. This centralized deployment model improves the ease of use of Packet Mirroring for security and networking teams, while keeping it transparent to the development teams.
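For illustration, creating a mirroring policy in the host project might look like the following sketch, assuming the google-cloud-compute Python client; the project, region, network, subnet, and collector forwarding rule names are placeholders.

```python
# Hypothetical sketch of creating a Packet Mirroring policy in a host project.
from google.cloud import compute_v1

policy = compute_v1.PacketMirroring(
    name="mirror-service-project-traffic",
    network=compute_v1.PacketMirroringNetworkInfo(
        url="projects/host-project/global/networks/shared-vpc"),
    collector_ilb=compute_v1.PacketMirroringForwardingRuleInfo(
        url="projects/host-project/regions/us-central1/forwardingRules/ids-collector"),
    mirrored_resources=compute_v1.PacketMirroringMirroredResourceInfo(
        subnetworks=[compute_v1.PacketMirroringMirroredResourceInfoSubnetInfo(
            url="projects/host-project/regions/us-central1/subnetworks/app-subnet")]),
)

compute_v1.PacketMirroringsClient().insert(
    project="host-project", region="us-central1", packet_mirroring_resource=policy)
```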

Packet Mirroring is natively integrated with Google’s Andromeda SDN fabric. This approach keeps Packet Mirroring’s performance and management overhead low, as the receiving software appliances running on the collector backends don’t need to perform any decapsulation operation to parse the received mirrored data.

Partnering for network security

We’ve been working with several partners to help us test and develop Packet Mirroring, and have received valuable feedback along the way. Here are our Packet Mirroring partners, and how they work with the tool:

Help ensure security and compliance in the cloud

Our goal is to give you the right advanced security solutions for connecting your business to Google Cloud. With Packet Mirroring, you can reduce risk, diagnose issues to help ensure the availability of your mission-critical applications and services, and meet compliance requirements. Click here to learn more about GCP’s cloud networking portfolio and reach out to us with feedback at [email protected].

Google Cloud Platform is now FedRAMP High authorized

At Google Cloud, we’re committed to providing public sector agencies with technology to help improve citizen services, increase operational effectiveness, and better meet their missions. We build our products with security and data protection as core design principles, and we regularly validate these products against the most rigorous regulatory requirements and standards.

To that end, we are proud to announce that Google Cloud Platform (GCP) has received FedRAMP High authorization to operate (ATO) for 17 products in five cloud regions, and we’ve expanded our existing FedRAMP Moderate authorization to 64 products in 17 cloud regions. This means that public sector agencies now have the ability to run compliant workloads at the highest level of civilian classification.

How FedRAMP certification works
FedRAMP is a U.S. government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services offered to US federal government agencies. Most federal agency cloud deployments and service models, other than certain on-premises private clouds, must meet FedRAMP requirements at the appropriate (Low, Moderate, or High) risk impact level. While Google Cloud already maintains an authorization for both GCP and G Suite at the Moderate impact level, achieving High status on GCP means we can provide greater access to technology for our most security-sensitive customers. And while the FedRAMP ATO is required for federal agencies, it is also a security benchmark for other industries, including financial services, health, and manufacturing. If you’re a GCP customer, you can enjoy the benefit of a FedRAMP High-authorized infrastructure at no additional cost and without any change in your services. 

Obtaining FedRAMP High required documenting at length how our infrastructure and platforms help our customers keep their data safe. We carefully translated the principles of our BeyondCorp model, including zero-trust networking, that we have implemented at Google into the NIST 800-53r4 security controls, which were then documented and assessed by a third-party organization. As part of this process, we also completed FIPS 140-2 L1 overall and L3 physical FIPS validation of the internal version of Google’s Titan Security Key authenticator. We worked closely with the FedRAMP Joint Authorization Board to document Google’s monitoring, patching, and vulnerability scanning infrastructure in order to meet the rigorous continuous monitoring requirements of FedRAMP High. 

Receiving a FedRAMP High ATO means we can support agency missions that require some of the highest levels of data protection for unclassified workloads. These could include health care delivery, emergency response, space operations, and many others. 

Supporting the public sector with cloud innovation
These new certifications reflect our continued investment in and support for customers in the U.S. public sector, and are another example of the momentum we’re seeing as government agencies move to the cloud. For example, we recently teamed up with researchers from NASA-FDL to help identify life beyond Earth with our machine-learning capabilities, and the Library of Congress team spoke at Google Cloud Next ‘19 on how they’re making books accessible to the visually impaired. We are also helping the U.S. Air Force modernize its modeling and simulation training infrastructure.

At the state and local level, the State of Arizona plans to migrate thousands of employees and contractors to G Suite to improve security and collaboration. It anticipates millions of dollars in cost savings over the next three years. And New York City Cyber Command is partnering with Google Cloud to automate and speed log analysis and other initiatives to protect New Yorkers from malicious cyber activity, while also safeguarding data privacy on mobile devices and across public WiFi networks. 

Welcoming new public sector leaders
Today’s news reinforces our commitment to the public sector. Earlier this year, I joined Google Cloud to lead our public sector efforts. We’ve also added Brent Mitchell to lead Google Cloud’s state and local government strategy, and Lesta Brady to head up our federal civilian sales strategy. And we recently announced a new Global Public Sector organization within Google Cloud, with a charter of engaging with public sector customers worldwide—and have welcomed new leaders in Canada, EMEA, and Latin America into this organization. Finally, I’m excited today to announce that long-time Googler and Chief Internet Evangelist Vint Cerf and his group of technology specialists will be joining my team to bring their expertise to public sector customers globally. His team will continue to evangelize the potential of the internet and the solutions it can enable, which is critically important for public sector decision-makers to understand as part of the delivery of their services. 

We look forward to continuing to help federal, state, and local government agencies innovate, and will pursue additional global certifications to meet their needs. You can learn more here about our public sector work.

How two London borough councils use Chrome Enterprise and G Suite for modern collaboration

Editor’s note: David Grasty is the Corporate Head of Digital Transformation at two councils located in Southwest London. The councils employ roughly 5,000 council workers across 114 sites to keep local services operating for a combined 400,000 residents. This is how Grasty helped his IT team operate more efficiently, while giving his workers a more modern work experience using G Suite and Chrome Enterprise.

About six years ago, our Kingston and Sutton councils began sharing IT services—a step that helped us save money and reduce our IT workload. The process also provided our employees with a more modern work experience, so they can keep local services for borough residents up and running, like libraries, hospitals, schools, sustainable transportation and environmental health programs. 

Using technology to help teams collaborate
We knew we wanted to replace our Windows 7 computers with cloud-based technology that allowed our employees to work together without interruption, including everyone from social workers to hospital staff—so we chose G Suite. We replaced our legacy email solution, which ran on each borough’s individual servers, and transitioned to Gmail with help from a tool called CloudMigrator offered by our Google partner, Cloud Technology Solutions. After our council employees got used to working in Gmail—which many already knew from personal use—they began to work in other G Suite apps. Our work habits started to change.

In one meeting, we discussed areas of responsibility for different council organizations. I created a spreadsheet in Google Sheets to collaborate on ideas, and others jumped into the document to make additions. We’d never have made progress that quickly if we’d shared documents back and forth as email attachments. Similarly, it became second nature for people to connect face-to-face in video meetings in Meet or to send messages about projects in Chat. It sounds like such a simple change, but being able to attend meetings from home or from different administrative buildings has saved people thousands of hours of commuting and walking time.

Providing flexible devices to help “sitters,” “walkers,” and “runners”
Our goal as an IT team is to be seen as more than just “wires and Wi-Fi.” Instead, we want to be integral partners for digital transformation. With G Suite technology in place, we were primed to offer flexible work options for employees, like shared desks or the ability to work from home. With that said, many employees work in historic buildings which can’t easily be renovated. The devices we chose needed to be flexible enough to be set up and used anywhere.

We rolled out 3,800 Chromebooks in 80 locations, followed by about 1,700 Chromeboxes that replaced PCs. It was simple to deploy Chrome devices—I don’t think we could have rolled out the same number of Windows laptops in only four months. We knew that Chromebooks and Chromeboxes would work right out of the box in just a few minutes, with the correct policies applied. 

To ensure success with the rollout, we carefully matched device types to each worker depending on their role and preference, classifying workers as: “sitters,” “walkers,” or “runners.” 

  • Sitters have assigned desks, so they received Chromeboxes with large displays that help them get their work done.
  • Walkers work at their desks but like the freedom to work from home, so they received Acer Chromebooks, which are more portable. 
  • Runners frequently travel throughout the boroughs and offices, so they received Acer Spin Chromebooks, which can convert into tablets for presentations. The Acer Spin devices helped people in the field connect more easily with local residents, who might not be able to visit council offices.

With Chromebooks and G Suite, we’re not tied down to particular offices and data centers, because just about every application we use is web-based or is a G Suite app. If we need to use legacy apps, such as council tax and planning systems, we can access them through Chrome’s Legacy Browser Support. Legacy Browser Support lets us open from Chrome just the legacy apps that require an older browser, limiting the time we spend in less secure browsers while not stalling what we need to get done in those legacy applications.

What I also particularly love about Chrome devices is that they require very little administration. With Chrome Enterprise Upgrade they’re secure and manageable right out of the box: we simply set policies within the Google Admin console, and from there we can track device usage, choose network settings, and even lock down devices. 

Evolving workspaces to support modern collaboration
In our two main buildings, we’ve turned very traditional offices into flexible spaces where workers can set up their Chromebook at any open space. We have fewer desks now; many people work from home and join video meetings.

When I see employees working efficiently in the cloud, instead of pushing bits of paper around, I’m confident we’ll have greater impact on Kingston and Sutton residents.

From showroom to front room: DFS delivers smarter retail experiences with Google Cloud

Delivering smarter retail journeys that marry the personal service of a store visit with the wealth of choice available online has become a key ambition in the consumer goods industry. While we’ve come to expect digitally integrated shopping experiences when buying electronics or groceries, even shoppers of made-to-order products can benefit from the use of public cloud to enhance customer touchpoints across digital and physical channels.

This is what DFS, the UK’s leading upholstery retailer, is now doing with Google Cloud’s support. A household name, with a 50-year pedigree and more than 5,500 employees, DFS is highly regarded for the quality of its handmade-to-order sofas and soft furnishings, and for the service customers receive when visiting its showrooms. The company operates its own distribution network, with the support of 20 distribution centers, nearly 300 delivery vehicles and over 600 delivery specialists.

We’re working with DFS across its sales and distribution network, helping it to prepare for the future of retail via an unparalleled combination of cloud services, collaboration tools and digital devices.

Our relationship came about through DFS’ acquisition of Sofology in 2017. The company had some decisions to make around how it should integrate IT infrastructure and applications as a group, and DFS looked at what it could learn from its new brand’s technology approach. As part of this exploration, conducted with Google Premier partner NetPremacy, it also reviewed its existing environment in light of the traffic demands during the Christmas holiday.

DFS first followed Sofology’s example by trialing G Suite for employee communications, looking to effect a broader culture change towards more seamless collaboration. The feedback was overwhelmingly positive, prompting adoption and rollout across its network over a six-month period in 2019. The success of G Suite led DFS to investigate further Google products, resulting in the decision to adopt two additional solutions.

Firstly, DFS embraced Chrome devices to improve in-store systems and augment the customer experience. The company rolled out 1,200 Chromebooks to its retail stores in late 2019, enabling salespeople to access key information in a convenient and secure way while interacting with customers. Technical colleagues found that the devices could be set up in a matter of seconds, with secure login and individual accounts for each member of staff on the showroom floor.

Secondly, DFS has made the decision to transition its web platform from a private cloud environment to Google Cloud Platform (GCP). Once implemented, this will allow the technology team to deploy a number of new applications, and will result in a series of benefits, including:

  • Improved functionality and performance of DFS’ e-commerce platform. The site will have far greater capacity, providing a more responsive user experience and scalable resources to easily handle seasonal spikes in traffic.

  • E-commerce back-end improvements, including greater flexibility and the ability to deploy major upgrades seamlessly.

  • A commercial benefit in lowering hosting costs, compared with the previous approach. 

The transition to GCP took place in November 2019 and is expected to deliver a positive revenue impact over the busy Christmas season.

Russell Harte, Group Technology Director at DFS, said, “Our business relies on a lot of moving parts all working in harmony. Whether a customer visits us digitally or in person—or both—there needs to be a congruence in the offers they see and the service they receive. The work we’re doing with Google Cloud will ensure the best possible experience for end users, while also bringing efficiencies to our logistical capabilities.”

“As we migrate to GCP and access the range of applications available, we’re gaining a better understanding of what’s possible with the platform and how we can use it to seize the opportunity to react better to new challenges.”

DFS is now looking at a range of other capabilities made possible by its new cloud environment. These include the adoption of containerization through Kubernetes for faster application development, and the use of Apigee as an integration layer to bring together supply chains from across the group and improve last-mile logistics for deliveries. The company is also putting more resources into leveraging data science capabilities within GCP infrastructure, to gain insights from large volumes of data. It’s anticipated that harnessing this data will enable DFS to predict demand and footfall more accurately, and to gain greater visibility over logistical data to improve the efficiency of customer deliveries.

“The relationship with Google Cloud has been excellent,” says Russell. “It’s also setting our business up for where the retail industry is going, enabling a smarter, more customer-centric approach. They understand our business challenges and have the unique expertise to help us drive forward. We’re excited about how the partnership will grow from here.”

Just Eat satisfies its appetite for customer insights with Google Cloud

The app economy has enabled a huge range of unique business models to flourish. One such model is online food ordering and delivery services, in which apps leverage geo-location data to aggregate local food choices and offer personalized options to consumers.

A leading company in this space is Just Eat. Launched in the UK in 2001 with a vision of ‘serving the world’s greatest menu. Brilliantly.’, the company has capitalized on the popularity of online food delivery and grown its presence across 12 markets.

Just Eat acts as an intermediary between take-out food outlets and hungry customers, giving local restaurants access to a broader base of potential diners, while providing consumers with an easy and secure way to order and pay for food from their favourite restaurants. Today the company helps 27 million customers find food from more than 112,000 restaurants—everything from homemade Italian pasta, to Chinese noodle bowls, to fish-and-chips. 

Data is the fuel of Just Eat’s rapid growth, but it wasn’t always looked at that way. In its early days, Just Eat struggled with the deluge of information and faced fragmentation across its systems. In fact, the company realized its legacy data vendor wasn’t capable of ingesting 90 percent of the data produced by its food platform. This was incredibly frustrating for Just Eat’s analysts and data scientists, who had to waste time cleaning up sources instead of leveraging the data to create a better user experience. 

Just Eat turned to Google Cloud, and now uses machine learning (ML) to power sophisticated consumer recommendations on both its app and website. It also makes heavy use of features offered by Google Cloud Platform, including BigQuery for running analytics on its customer data set and Cloud Pub/Sub for messaging app users with relevant offers in real time.

Having all of Just Eat’s data in one platform has translated into real value for its customers. With Google Cloud tools, Just Eat has created its own proprietary Customer Ontology framework, which today contains 5.5 billion features that help it better understand consumers’ behavior and food habits and provide insights into previous visits. Just Eat recently created an “Adventurous Index” to map its customers according to their ordering habits, enabling it to tailor marketing and user experiences. For example, mid-adventurous customers are shown a choice of restaurants that serve their most-ordered cuisine, while adventurous customers can choose from restaurants that serve a wider variety. This has not only prompted consumers to be more adventurous with their choices, but also led to more business at a more diverse set of restaurants.

Matt Cresswell, Director of Customer Platforms at Just Eat said that Google Cloud has become integral to its product delivery: “Consumer food choice is a hugely nuanced topic. We know that individuals have their own unique journeys when they use Just Eat. We’ve sought to create a truly one-to-one relationship with every customer. The changes we’ve made to the platform mean they can access the dishes they enjoy at the touch of a fingertip, and find inspiration to discover new dishes they’ll love. We’re grateful to Google Cloud for helping us support our customers on their culinary explorations.”