Forrester looks at the opportunity for Google Cloud partners

One reason that I joined Google Cloud’s channel team was the importance Google places on partners and the opportunity we have together to help customers succeed. In 2019, our partners responded to this opportunity and are seeing the results. Google recently commissioned a new Total Economic Impact study from Forrester, which shines some light on our partner community, how they are helping customers succeed, and the opportunities they’re seeing. 

Based on interviews with Google Cloud partners, Forrester’s research helps illustrate why so many cloud and enterprise partners are building on, supporting, and providing unique services to customers deploying Google Cloud. As we enter 2020, the opportunity ahead for partners working with Google Cloud is larger than ever. Let’s look at some highlights from the study.

Why are partners choosing to grow with Google Cloud?

Open, multi-cloud strategy. Customers overwhelmingly choose multi-cloud environments, and partners are increasingly choosing to work with Google Cloud thanks to this multi-cloud approach. But equally important is Google Cloud’s support of open source software, like Kubernetes. Partners who want openness are choosing to work with Google Cloud.

Data, analytics, and AI capabilities. Our differentiated capabilities in data and analytics—BigQuery, Dataflow, Dataproc, and TensorFlow—are also key reasons partners choose Google Cloud. A good example of this is our partner Deloitte, which in 2019 began building new solutions that leverage BigQuery, AI, and ML to help customers in clinical research make better use of enormous quantities of data.

How are partners growing their businesses on Google Cloud? 

Forrester data reveals that partners are seeing strong revenue and margin growth by building products, developing services, and selling capabilities on Google Cloud, and they’ve also differentiated themselves by earning Specializations for the platform.

Growing services offerings. As customer demand for Google Cloud capabilities grows, partners are seeing increased business from services offerings, helping customers implement GCP capabilities, including AI and ML, data center modernization, and multi-cloud deployments with Anthos. For example, the average migration services deal size for Google Cloud partners increased three to six times over the last three years, while cloud modernization and application development deals grew three to five times during the same period.

High-value projects. Partners in Forrester’s study were also able to increase their margins by adding services around higher-value analytics, AI/ML, cloud modernization, and cloud-native application development projects, as were partners who built custom IP on GCP. While Google Cloud offers strong margins for products across the board, like G Suite, the study showed that partners who moved “up the stack” to more complex technologies earned larger margins overall.

Investing in talent and skills development. In Forrester’s research, some partners reported up to 800% growth in hiring over a four-year period. This aligns with what we already know: Google Cloud skills and talent are in high demand, and hiring and building skills around Google Cloud continues to be a tremendous area of opportunity and differentiation for our partners.

Certification and training. Google Cloud partners are continuing to invest in training and certifications for their teams. For example, one partner surveyed by Forrester has a goal to have 70% of its technical staff certified on Google Cloud by the end of the year. Our partners have many opportunities to build new skills on Google Cloud, including sales and technical enablement offerings, professional development tools and incentives, and in many cases, no-cost certification training.

Expertises and Specializations in key areas. Investments in Expertises and Specializations help partners differentiate themselves and fill the technical skills gaps that customers need addressed. The number of our partners that have earned Specializations nearly tripled over the past year (based on Google Cloud data), which was reflected in the interviews conducted by Forrester as well.

What opportunities lie ahead?

Forrester also asked partners about the road ahead: where they see the most opportunity to grow and differentiate their businesses.

We know that it’s still early days for cloud migration across industries, but a few key product areas stood out as opportunities for partners—particularly, data analytics, artificial intelligence, machine learning, and building custom IP. Customer engagements in these areas are leading to repeat business and larger contract values for partners, Forrester found, so building expertise can help “further bolster partners’ value proposition and differentiation in the marketplace.” 

We’re excited about the shared opportunity we have with partners to help customers solve their trickiest challenges and address their biggest opportunities with Google Cloud. To learn more, download and read Forrester’s full TEI study, “The Google Cloud Business Opportunity for Partners,” here. Visit us here to learn about our Partner Advantage program, including more details about how you can start earning new Google Cloud Specializations and Expertises.

Build a modern workplace, stay competitive, with these 3 essential moves

The future of work is here—it’s just not evenly distributed. Indeed, many of the technologies, trends, and cultural norms that will drive tomorrow’s workplaces are already reshaping organizations around the world today. We examine this topic in a new Google Cloud report on the future of collaboration and productivity, featuring the results of original research, predictions by business and technology leaders, and recommendations from Google experts.

The report also identifies three important and impactful changes that businesses can make today to catch up with forward-thinking competitors. Here are highlights:

1. Give people tools that help them save time and work faster.
What is it about work that just isn’t working? Research shows that employees around the world spend significant chunks of their day on low-value activities that get in the way of efficiency, productivity, and innovation. These include attending unnecessary meetings, managing high volumes of email, tracking down the right versions of documents, switching between applications, and generally struggling with outdated technology.

distractions.jpg

To help their employees spend more time on work that matters, future-minded organizations are replacing legacy business software with cloud-based tools that integrate seamlessly with one another and use automation to make routine tasks less burdensome. For example, an app might surface the right information at the right time so you don’t have to go hunting for it, suggest commonly used phrases as you type, or propose available meeting times and conference rooms. Add up the few minutes you save on each task, and that’s a significant portion of the day you’ve reclaimed for the work that matters.

2. Empower people to access knowledge and share ideas.
Have you ever used Google Search to find your own company’s logo? Or recreated content that already existed because you couldn’t track something down? Employees around the world experience similar challenges when searching for what they need across siloed, ineffective information repositories within their organizations. And that’s a problem, because nearly 80% of business leaders agree that seamless, digital access to knowledge is very important to overall performance, according to a survey sponsored by Google Cloud and conducted by Harvard Business Review Analytic Services.

cant find info.jpg

The solution? Providing employees with centrally managed technology that lets them access and share knowledge more easily, curbing reliance on unsanctioned apps that may put sensitive data at risk. Fast access to relevant information can also help cultivate new opportunities for employees to contribute, whether it’s by suggesting a new product or service, providing feedback on internal processes or business initiatives, or exchanging insights with colleagues across the company. Eighty-nine percent of executives say that for a business to be successful, new ideas must come from everyone across the organization, from senior executives to front-line employees, according to another survey sponsored by Google Cloud and conducted by Harvard Business Review Analytic Services.

3. Let people work how they want: flexibly and collaboratively.
Along with time-saving tools and expanded access to information, two core values are reshaping work for forward-thinking organizations and will help determine the future success of every business: flexibility and collaboration. According to Deloitte research, only 17% of millennials are willing to stay with an employer for more than five years without the opportunity to work where they choose, while 74% of global employees believe that businesses perform better when they encourage teamwork.

work becoming flexible.jpg

Luckily, technology is evolving to support this sweeping shift toward flexible collaboration. We now have tools that let people on opposite sides of the globe work together as a team, from video meetings for face-to-face interaction to digital whiteboards for collective brainstorming. Research shows that more and more businesses are taking advantage of these cloud-powered services: 27% of global executives adopted integrated software suites for collaborative content creation and idea sharing in the last two years, while an additional 24% plan to do so by 2021, according to Harvard Business Review Analytic Services.

Read the complete report for more insights, along with customer stories and tips on everything from improving the quality of meetings to protecting your data. Download it here.

The 2019 Accelerate State of DevOps: Elite performance, productivity, and scaling

DevOps Research and Assessment (DORA), a pioneer in helping organizations achieve high DevOps and organizational performance with data-driven insights, and Google Cloud are excited to announce the launch of the 2019 Accelerate State of DevOps Report. The report provides a comprehensive view of the DevOps industry and offers actionable guidance for organizations of all sizes and in all industries to improve their software delivery performance and ultimately become elite DevOps performers. With six years of research and data from more than 31,000 professionals worldwide, the 2019 Accelerate State of DevOps Report is the largest and longest-running research of its kind.

New insights in 2019

We saw continued evidence that software speed, stability, and availability contribute to organizational performance, and this year we were able to uncover new insights into the practices and capabilities that drive high DevOps performance. Some insights include:

  • DevOps has “crossed the chasm”: Organizations across industries continue to improve their DevOps expertise, particularly among the highest performers. The proportion of elite performers has almost tripled, now at 20% of all organizations. This confirms reports from other industry analysts.
DevOps insights.jpg
  • Elite performers are more likely to use the cloud: Fast autoscaling, cost visibility, and reliability are some of the key benefits offered by cloud computing. The highest performing DevOps teams were 24 times more likely than low performers to execute on all five capabilities of cloud computing defined by the National Institute of Standards and Technology (NIST), which include on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.

  • Most cloud users aren’t using it to its full potential: Only 29% of respondents who use the cloud met all five of NIST’s above-mentioned criteria (see the sketch after this list). This underscores the fact that organizations that claim to use cloud computing haven’t necessarily adopted all the essential patterns that matter for driving elite performance, which could be holding them back from reaping the benefits of the cloud.

  • For the first time, industry matters: In this year’s report, the retail industry saw better performance in terms of both speed and stability. However, consistent with previous years, we saw continued evidence that no other industry sees meaningfully better or worse performance. This suggests that organizations of all types and sizes, including highly regulated industries such as financial services and government, as well as retail, can achieve high levels of performance by adopting DevOps practices.

  • DevOps in the enterprise — Part 1: For the first time, we found evidence that enterprise organizations (those with more than 5,000 employees) are lower performers than those with fewer than 5,000 employees. Heavyweight processes and controls, as well as tightly coupled architectures, are among the reasons for their slower speed and the associated instability.

  • DevOps in the enterprise — Part 2: Our analysis shows that the highest DevOps performers (that is, the high and elite performers) focus on structural solutions that build community, which fall into one of four patterns: Community Builders, University, Emergent, and Experimenters.

  • No “one size fits all” approach, but concurrent efforts drive success: When investing in DevOps capabilities, particularly in large organizations, focus needs to be on both team-level and organization-level efforts. Continuous integration, automated testing, and monitoring are some of the efforts that work well at the team level.  Examples of organization-level capabilities include the ability to set architectural or change approval policies that span across departments and teams. The report breaks down these capabilities and outlines the strategies to adopt so you can execute on a DevOps strategy for maximum impact. 

  • Low performers use more proprietary software than high and elite performers: The cost to maintain and support proprietary software can be prohibitive, prompting high and elite performers to use open source solutions. This is in line with results from previous reports. In fact, the 2018 Accelerate State of DevOps Report found that elite performers were 1.75 times more likely to make extensive use of open source components, libraries, and platforms. 
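
To make the “met all five of NIST’s criteria” classification above concrete, here is a minimal, hypothetical sketch of how a team might self-assess against those characteristics. The class and field names are our own shorthand for illustration; they don’t come from the report or from NIST.

```python
# Minimal, hypothetical sketch of the "meets all five NIST essential
# characteristics" check referenced above. Field names are our own shorthand,
# not taken from the report or from the NIST definition document.
from dataclasses import dataclass, fields


@dataclass
class CloudUsageAssessment:
    on_demand_self_service: bool   # provision resources without human gatekeepers
    broad_network_access: bool     # capabilities reachable over standard networks
    resource_pooling: bool         # shared, multi-tenant resource pools
    rapid_elasticity: bool         # scale out and in quickly, ideally automatically
    measured_service: bool         # usage is metered and visible (cost visibility)

    def meets_all_nist_characteristics(self) -> bool:
        """True only if every one of the five characteristics is in place."""
        return all(getattr(self, f.name) for f in fields(self))


# Example: elastic and metered, but provisioning is still ticket-driven.
team = CloudUsageAssessment(
    on_demand_self_service=False,
    broad_network_access=True,
    resource_pooling=True,
    rapid_elasticity=True,
    measured_service=True,
)
print(team.meets_all_nist_characteristics())  # False: not cloud in the full NIST sense
```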

How do you improve at DevOps?

This year’s report provides two research models to help drive DevOps improvements: performance and productivity. 

The performance research model looks at the constructs and levers you can pull to drive organizational performance, providing insights into how cloud, continuous delivery, disaster recovery testing, clear change management, and a culture of psychological safety can positively impact software delivery performance. The research also finds that heavyweight change processes don’t work.

performance research model.png

The productivity research model shows that organizations can improve engineer productivity by investing in easy-to-use tools and information search, a culture of psychological safety, and by reducing technical debt. Improved productivity also helps drive better employee work/life balance and reduces burnout.

productivity research model.png

This year’s report revalidates important findings for the sixth year in a row: First, that it’s possible to optimize for stability without sacrificing speed. Second, DevOps delivers value to customers and end users by impacting both commercial and non-commercial goals. 

Thanks to everyone who contributed to the survey. We hope this report helps organizations of all sizes, industries, and regions improve. We look forward to hearing your thoughts and feedback on the report. Here are some ways you can learn more about the 2019 Accelerate State of DevOps Report.

A CIO’s guide to the cloud: hybrid and human solutions to avoid trade-offs

What do CIOs and CTOs deliver for the company? If you said “technology,” that’s just the beginning. In its research, McKinsey found that 85% of CIOs and CTOs interviewed in the spring of 2019 said they were essential to at least two of the three most common CEO priorities—revenue acceleration, improved agility and time to market, and cost reduction.

IT modernization – including migrating to the cloud – is key to business growth and agility. Yet, according to a recent McKinsey study, 80% of CIOs report that regardless of their level of cloud migration, they still haven’t reached their projected agility and business benefits. Sometimes, this is because of issues like training and skills gaps in the IT workforce. Surprisingly often though, the barrier to reaching the goals is based on trade-offs that CIOs themselves feel they must make to strike a balance between the perfect and the possible.

IT agility index.png

But what if you could have it all without the trade-offs? As Will Grannis, Managing Director of the CTO Office at Google, and Arul Elumalai, Partner at McKinsey & Company, discussed in our recent digital conference, many of the compromises CIOs make can be avoided with new technology, modern architectures, and a transformation mindset that spans the business. In interviews, CIOs explained how they’ve leveraged the best of the cloud without compromising on security, agility, and flexibility. Here’s how these leaders avoid three of the top perceived trade-offs—both with technology and by transforming their operating model.

Trade-off #1: Developer agility vs. control and governance
Moving to the cloud offers new opportunities for speed, but 69% of organizations indicate that stringent security guidelines and code review processes can slow developers significantly. One CISO of a multinational company mentioned that cloud development was so fast that they had to institute manual checks on their developers’ code. So much for agility. 

To overcome this trade-off and maintain both speed and security, some respondents found success in adopting DevOps, hiring security-experienced talent, and introducing automation for security and quality. Building security into the CI/CD pipeline and increasing automation don’t just eliminate the trade-off; they result in higher quality and faster innovation.

At Google Cloud, we’ve also observed that customers with strong DevOps practices have increased speed-to-market and product/service quality. From our own journey, we’ve learned seven critical lessons essential to adopting a DevOps model, ranging from taking up small projects and embracing open source to building an overall DevOps culture.

Trade-off #2: Single-vendor benefits vs. freedom from lock-in
CIOs perceive benefits in using as few clouds as possible, specifically avoiding multiple systems that require their teams to develop and maintain multiple skill sets. Unfortunately, 83% of the CIOs interviewed said that while they would prefer fewer clouds, the potential financial and technical lock-in drives them to multiple providers. 

Successful CIOs said that they can avoid lock-in pitfalls not just with contractual guardrails and executive and board education, but with evolving hybrid cloud technologies that provide additional choices. Hybrid cloud platforms based on containers can further mitigate the risk of using a single cloud vendor. The key to successful hybrid architectures is the infrastructure abstraction and portability that containers provide, enabling disparate environments to work together. 

This notion has been at the heart of our strategy at Google Cloud with Anthos, which provides an abstraction layer and an application modernization platform for hybrid and multi-cloud environments. Enterprises can use Anthos to modernize how they develop, secure, and operate hybrid-cloud environments and enable consistency across cloud environments.

Trade-off #3: Best-of-breed tools vs. standardization and familiarity
Optimizing toolchains for different environments can improve productivity, but many CIOs believe this means accepting reduced functionality and fewer tools. While 77% of CIOs said they had to standardize to the lowest common denominator, some have found a better solution. Rather than giving up the languages, libraries, and frameworks that their teams prefer, effective leaders said they found success by investing in training programs to upskill talent and adopting new open, vendor-agnostic solutions. Architectures based on open-source components have been key to removing this trade-off and eliminating the notion of a lowest common denominator. 

This is why we have built Anthos on open-source components like Kubernetes, Istio and Knative. Anthos gives your business the choice you need. With the ability to create code that works in most environments using the tools, languages, and systems you prefer, you can do more without major changes to how you work.

Regardless of your current cloud adoption level, check out “Unlock business acceleration in a hybrid cloud world” to discover more about McKinsey’s findings, including how CIOs drive agility, methods to make trade-offs unnecessary, and how to prepare your team for the cloud. Then, stay tuned for subsequent posts that  take a closer look at how hybrid solutions and strategies can help CIOs drive a transformation mindset across the business—without compromising on security, agility, and flexibility.

Don’t leave frontline workers behind: The case for cloud-native devices

What do a physician’s assistant, a police records technician, a retail sales associate, and an automobile quality assurance worker have in common? They all work on the front lines of an organization. For some, this means interacting directly with customers or patients. For others, it involves building, checking, or delivering goods or services. No matter the job, they all make work happen.

Because they interact directly with customers, or with the creation of goods and services, frontline workers can be some of the most important employees in an enterprise, yet they’re frequently underserved when it comes to up-to-date tools and technology. 

Fortunately, that’s starting to change, and we shared the ways several Google Cloud customers are bringing cloud-native technologies to the front lines in our previous post, “5 ways IT leaders can invest in their frontline workers with Chromebooks and G Suite.”

But we also wanted to get a deeper understanding of what the frontline workforce wants and needs when it comes to technology adoption, as well as what gets in its way. In May, we commissioned an independent study with Forrester to explore exactly that, and below you’ll find  a snapshot of what we learned, directly from frontline workers themselves. 

The frontline workforce faces obstacles to completing core job tasks

Happy and productive employees can mean both lower workforce attrition and higher customer satisfaction, leading to more sales. But it can be hard to deliver that productivity when employees grapple with  dysfunctional technology, which can interrupt flow, slow down performance, or stop work in its tracks.

The frontline workforce in particular is slowed down by technology in a few ways, including: device operating failure or required rebooting after updates; slow or outdated systems; and a lack of immediate access to critical information.

This information gap can be crippling because frontline workers often need access to multiple sources of information to do their jobs. According to the survey, 46 percent of frontline workers reported they have to ask customers to wait while they hunt down information at least once a week, if not more. Both employee and customer experiences suffer as a result.

What the frontline workforce wants is clear from the Forrester survey results:

  • Devices that are secure from external threats

  • Easy, instant communication with teammates, managers, and leadership

  • Continuous productivity, rather than glitchy or slow-operating systems

  • Self-service information on-demand to help them do their jobs

Download the report to read more specifics on these findings. 

Filling the technology gap for the frontline workforce 

Forrester’s findings suggest that businesses can address frontline workforce needs by adopting cloud-native technologies, because they’re easy for employees to use and simple to deploy, as reflected in worker responses. 

For example, only 12 percent of the frontline workers surveyed found it difficult to use work-related applications accessed via a web browser. That means that the other 88 percent of workers felt comfortable using apps that are browser-based to do their jobs. 

Taking this a step further, an additional report that we commissioned with Forrester in 2018 reviewed the economic impact of adopting cloud-native solutions, like Chrome OS and G Suite. In that report, research suggests that businesses could stand to benefit from hidden cost savings when adopting cloud-native tools, including more than $477,000 in IT management and services savings.

Couple those findings with these frontline workforce insights and the opportunity is clear: going cloud-first can increase productivity, employee retention, engagement, and mobility, all at a lower cost.

To learn more about the perspectives of the frontline workforce, read the full study, or join J. P. Gownder, vice president and principal analyst at Forrester, on July 24 for a free webinar on the technology gap for frontline workers.

Future of cloud computing: 5 insights from new global research

Research shows that cloud computing will transform every aspect of business, from logistics to customer relationships to the way teams work together, and today’s organizations are preparing for this seismic shift. A new report from Google on the future of cloud computing combines an in-depth look at how the cloud is shaping the enterprise of tomorrow with actionable advice to help today’s leaders unlock its benefits. Along with insights from Google luminaries and leading companies, the report includes key findings from a research study that surveyed 1,100 business and IT decision-makers from around the world. Their responses shed light on the rapidly evolving technology landscape at a global level, as well as variations in cloud maturity and adoption trends across individual countries. Here are five themes that stood out to us from this brand-new research.

1. Cloud computing will move to the forefront of enterprise technology over the next decade, backed by strong executive support.

Globally, 47 percent of survey participants said that the majority of their companies’ IT infrastructures already use public or private cloud computing. When we asked about predictions for 2029, that number jumped 30 percentage points. C-suite respondents were especially confident that the cloud will reign supreme within a decade: More than half anticipate that it will meet at least three-quarters of their IT needs, while only 40 percent of their non-C-suite peers share that view. What’s the takeaway? The cloud already plays a key role in enterprise technology, but the next 10 years will see it move to the forefront—with plenty of executive support. Here’s how that data breaks down around the world.

fig1_companies adopting cloud.gif

2. The cloud is becoming a significant driver of revenue growth.

Cloud computing helps businesses focus on improving efficiency and fostering innovation, not simply maintaining systems and status quos. So it’s not surprising that 79 percent of survey respondents already consider the cloud an important driver of revenue growth, while 87 percent expect it to become one within a decade. C-suite respondents were just as likely as their non-C-suite peers to anticipate that the cloud will play an important role in driving revenue growth in 2029. This tells us that decision-makers across global organizations believe their future success will hinge on their ability to effectively apply cloud technology.

fig2_high expectations for cloud.gif

3. Businesses are combining cloud capabilities with edge computing to analyze data at its source.

Over the next decade, the cloud will continue to evolve as part of a technology stack that increasingly includes IoT devices and edge computing, in which processing occurs at or near the data’s source. Thirty-three percent of global respondents said they use edge computing for a majority of their cloud operations, while 55 percent expect to do so by 2029. The United States lags behind in this area, with only 18 percent of survey participants currently using edge computing for a majority of their cloud operations, but that figure grew by a factor of 2.5 when respondents looked ahead to 2029. As more and more businesses extend the power and intelligence of the cloud to the edge, we can expect to see better real-time predictions, faster responses, and more seamless customer experiences.

fig3_role of edge computing.gif

4. Tomorrow’s businesses will prioritize openness and interoperability.

In the best cases, cloud adoption is part of a larger transformation in which new tools and systems positively affect company culture. Our research suggests that businesses will continue to place more value on openness over the next decade. By 2029, 41 percent of global respondents expect to use open-source software (OSS) for a majority of their software platform, up 14 percentage points from today. Predicted OSS use was nearly identical between IT decision-makers and their business-oriented peers, implying that technology and business leaders alike recognize the value of interoperability, standardization, freedom from vendor lock-in, and continuous innovation.

fig4_current and expected us of open source.gif

5. On their journey to the cloud, companies are using new techniques to balance speed and quality.

To stay competitive in today’s streaming world, businesses face growing pressure to innovate faster—and the cloud is helping them keep pace. Sixty percent of respondents said their companies will update code weekly or daily by 2029, while 37 percent said they’ve already adopted this approach. This tells us that over the next 10 years, we’ll see an uptick in the use of continuous integration and delivery techniques, resulting in more frequent releases and higher developer productivity.

fig5_more frequent code updates.gif

As organizations prepare for the future, they will need to balance the need for speed with maintaining high quality. Our research suggests that they’ll do so by addressing security early in the development process and assuming constant vulnerability so they’re never surprised. More than half of respondents said they already implement security pre-development, and 72 percent plan to do so by 2029.

fig6_security shifting.gif

Cloud-based enterprises will also rely on automation to maintain quality and security as their operations become faster and more continuous. Seventy percent of respondents expect a majority of their security operations to be automated by 2029, compared to 33 percent today.

fig7_move towards a future.gif

Our Future of Cloud Computing report contains even more insights from our original research, as well as a thorough analysis of the cloud’s impact on businesses and recommended steps for unlocking its full potential. You can download it here.

Analyzing 3024 rice genomes characterized by DeepVariant

Rice is an ideal candidate for study in genomics, not only because it’s one of the world’s most important food crops, but also because centuries of agricultural cross-breeding have created unique, geographically induced differences. With the potential for global population growth and climate change to impact crop yields, the study of this genome has important social implications.

This post explores how to identify and analyze different rice genome mutations with a tool called DeepVariant. To do this, we performed a re-analysis of the Rice 3K dataset and have made the data publicly available as part of the Google Cloud Public Dataset Program, pre-publication and under the terms of the Toronto Statement.

We aim to show how AI can improve food security by accelerating genetic enhancement to increase rice crop yield. According to the Food and Agriculture Organization of the United Nations, crop improvements will reduce the negative impact of climate change and loss of arable land on rice yields, as well as support an estimated 25% increase in rice demand by 2030.

Why catalog genetic variation for rice on Google Cloud?

In March 2018, Google AI showed that deep convolutional neural networks can identify genetic variation in aligned DNA sequence data. This approach, called DeepVariant, outperforms existing methods on human data, and we showed that the same approach used to call variants on humans can also call variants on other animal species. This blog post demonstrates that DeepVariant is effective at calling variants on a plant as well, further illustrating the effectiveness of deep neural network transfer learning in genomics.

In April 2018, three research institutions—the Chinese Academy of Agricultural Sciences (CAAS), the Beijing Genomics Institute (BGI) Shenzhen, and the International Rice Research Institute (IRRI)—published the results of a collaboration to sequence and characterize the genomic variation of the Rice 3K dataset, which consists of genomes from 3,024 varieties of rice from 89 countries. Variant calls used in this publication were identified against a Nipponbare reference genome using best practices and are available from the SNP-Seek database (Mansueto et al., 2017).

We recharacterized the genomic variation of the Rice 3K dataset with DeepVariant. Preliminary results indicate a larger number of variants discovered at a similar or lower error rate than those detected by conventional best practice, i.e. GATK.

In total, the Rice 3K DeepVariant dataset contains ~12 billion variants at ~74 million genomic locations (SNPs and indels). These are available in a 1.5-terabyte (TB) table that uses the BigQuery Variants Schema.

Even at this size, you can still run interactive analyses, thanks to the scalable design of BigQuery. The queries we present below run on the order of a few seconds to a few minutes. Speed matters, because genomic data are often being interlinked with data generated by other precision agriculture technologies.
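
The specific queries behind the analyses below are linked from the original post. As a rough illustration of what an interactive query against the table can look like, here is a minimal sketch using the BigQuery Python client; the table name is a placeholder, and the column names assume the BigQuery Variants Schema mentioned above, so adjust both to match the published dataset.

```python
# Minimal sketch of an interactive query against the Rice 3K variants table.
# The table name below is a placeholder, and the columns assume the BigQuery
# Variants Schema (reference_name, repeated alternate_bases); substitute the
# actual public dataset table before running.
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project and credentials

TABLE = "your-project.rice_3k.deepvariant_variants"  # hypothetical table name

query = f"""
SELECT
  reference_name,                                        -- chromosome
  COUNT(*) AS variant_sites,                             -- variant rows per chromosome
  COUNTIF(ARRAY_LENGTH(alternate_bases) > 1) AS multiallelic_sites
FROM `{TABLE}`
GROUP BY reference_name
ORDER BY reference_name
"""

for row in client.query(query).result():
    print(row.reference_name, row.variant_sites, row.multiallelic_sites)
```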

Illustrative queries and analyses

Below, we present some example queries and visualizations of how to query and analyze the Rice 3K dataset. Our analyses focus on two topics:

  • The distribution of genome variant positions across 3024 rice varieties.
  • The distribution of allele frequencies across the rice genome.

For a step-by-step tutorial on how to work with variant data in BigQuery using the Rice 3K data or another variant dataset of your choosing, consider trying out the Analyzing variants with BigQuery codelab.

Analysis 1: Genetic variants are not uniformly distributed

Genomic locations with very high or very low levels of variation can indicate regions of the genome that are under unusually high or low selective pressure.

In the case of these rice varieties, high selective pressure (which corresponds to low genetic variation) indicates regions of the genome under high artificial selective pressure (i.e. domestication). Moreover, these regions contain genes responsible for traits that regulate important cultivational or nutritional properties of the plant.

We can measure the magnitude of the regional pressure by calculating at each position the Z statistic of each individual variety vs. all varieties. Here’s the query we used to produce the heatmap below, which shows the distribution of genetic variation across all 1Mbase-sized regions across all 12 chromosomes as columns (labeled by the top colored row), vs. all 3024 rice varieties as rows. Red indicates very low variant density relative to other samples within a particular genomic region, while pale yellow indicates very high variant density within a particular genomic region. The dendrogram below shows the similarity among samples (branch length) and groups similar rice varieties together:

rice_genomes_plot.png

A high resolution PDF of this plot is available, as well as the R script used to generate it.
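
If you would rather reproduce the underlying statistic than the full figure, here is a small sketch of the per-region Z-score calculation in pandas. It assumes you have already exported a table of per-variety, per-1Mbase-region variant counts (for example, via a BigQuery aggregation); the column names are illustrative, not the actual schema.

```python
# Illustrative sketch of the heatmap input: for each 1 Mbase region, score each
# variety's variant count against the distribution of counts across all varieties
# in that region. Column names are assumptions, not the published schema.
import pandas as pd


def per_region_z_scores(counts: pd.DataFrame) -> pd.DataFrame:
    """`counts` has columns: variety, chromosome, region_start, n_variants."""
    grouped = counts.groupby(["chromosome", "region_start"])["n_variants"]
    out = counts.copy()
    out["z"] = (counts["n_variants"] - grouped.transform("mean")) / grouped.transform("std")
    return out


# Pivot into a varieties-by-regions matrix, ready for clustering and plotting.
# z_matrix = per_region_z_scores(counts).pivot_table(
#     index="variety", columns=["chromosome", "region_start"], values="z")
```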

Some interesting details of the dataset are highlighted (in yellow) in the heatmap above:

  1. Closer inspection of chromosome 5 (cyan columns, 1Mbase blocks 9-12) shows that the distinct distribution of Z scores across samples likely occurs due to two factors:

    1. this region includes many centromeric satellites resulting in a high false-positive rate of variants detected, and

    2. a genomic introgression present in some of the rice varieties multiplies this effect (yellow rows).

  2. Nearly all of the 3024 rice varieties included in the Rice 3K dataset are from rice species Oryza sativa. However, 5 Oryza glaberrima varieties were also included. These have a high level of detected genetic variation because they are from a different species, and are revealed as a bright yellow band at the top of the heatmap.

  3. The majority of samples can be partitioned into one group with high variant density and another group with low variant density. This partition fits with previously used methods for classification by admixture. For example, the bottom rows that are mostly red correspond to rice varieties in the japonica and circum-basmati (aromatic) groups that are similar to the Nipponbare reference genome we used.

Analysis 2: Some specific regions are under selective pressure

According to the Hardy-Weinberg Principle, the expected proportion of genotype frequencies within a randomly mating population, in the absence of selective evolutionary pressure, can be calculated from the component allele frequencies. For a bi-allelic position having alleles P and Q and corresponding population frequencies p and q, the expected genotype proportions for PP, PQ, and QQ can be calculated with the formula p² + 2pq + q² = 1. However we need to modify this formula by adding an inbreeding coefficient F to reflect the population structure (see: Wahlund effect) and the self-pollination tendency of rice: PP = p² + Fpq; PQ = 2(1-F)pq; QQ = q² + Fpq, where F = 0.95.

The significance of genomic positions deviating from the expected genotype distribution follows a χ² distribution, allowing a p-value to be derived and thus the identification of positions that are either under significant selective pressure or neutral. In short, this analysis highlights the fact that rice is highly inbred.
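
As a worked illustration of the test just described, here is a small sketch that computes the inbreeding-adjusted expected genotype counts and a χ² p-value for a single bi-allelic site; the observed genotype counts in the example are invented for illustration only.

```python
# Worked sketch of the inbreeding-adjusted Hardy-Weinberg test described above:
# expected proportions PP = p^2 + Fpq, PQ = 2(1-F)pq, QQ = q^2 + Fpq with F = 0.95,
# compared against observed genotype counts with a chi-square test. The observed
# counts below are made-up numbers for illustration only.
from scipy.stats import chisquare


def hwe_inbreeding_pvalue(n_pp: int, n_pq: int, n_qq: int, f: float = 0.95) -> float:
    n = n_pp + n_pq + n_qq
    p = (2 * n_pp + n_pq) / (2 * n)        # frequency of allele P, estimated from the data
    q = 1.0 - p                            # frequency of allele Q
    expected = [
        n * (p * p + f * p * q),           # expected PP count
        n * (2 * (1 - f) * p * q),         # expected PQ (heterozygote) count
        n * (q * q + f * p * q),           # expected QQ count
    ]
    # ddof=1 because the allele frequency p was estimated from the same data.
    return chisquare([n_pp, n_pq, n_qq], f_exp=expected, ddof=1).pvalue


# Example: a site with even fewer heterozygotes than the F = 0.95 expectation
# predicts, which yields a very small p-value (a significant deviation).
print(hwe_inbreeding_pvalue(n_pp=2000, n_pq=24, n_qq=1000))
```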

Below you can find a plot of 10-kilobase genome regions from the Oryza sativa genome, colored according to the proportion of variant positions that are significantly (p<0.05) out of (inbreeding modified) Hardy-Weinberg equilibrium, with white regions corresponding to those under low selective pressure and red regions corresponding to those under high selective pressure:

Oryza sativa genome plot.png

The data shown above were retrieved using this query and plotted using this R script. The query used to make this figure was adapted to the BigQuery Variants Schema from one of a number of quality control metrics found in the Google Genomics Cookbook.

Note that selective pressure on the genome is not uniformly distributed, as indicated by the clumps of red visible in the plot. Interestingly, there is little correspondence between the prevalence of variants within a region (previous figure) and the proportion of variants within that same region that are under significant selective pressure. The bin size (10 kilobases) used in this visualization is on the order of the average Oryza sativa gene size (3 kilobases) and, given the low correlation between high selective pressure and variant density, it may be useful for guiding a gene-hunting expedition aimed at identifying genomic loci associated with phenotypes of interest (i.e., those that affect caloric areal yield, nutritive value, and drought and pest resistance).

Data availability and conclusion

Genome sequencer reads in FastQ format from Sequence Read Archive Project PRJEB6180 were aligned to the Oryza sativa Os-Nipponbare-Reference-IRGSP-1.0 reference genome using the Burrows-Wheeler Aligner (BWA), producing a set of aligned read files in BAM format.

Subsequently, the BAM files were processed with the Cloud DeepVariant Pipeline, a Cloud TPU-enabled, managed service that executes the DeepVariant open-source software. The pipeline produced a list of variants detected in the aligned reads, and these variants were written out to storage as a set of variant call files in VCF format.

Finally, all VCF files were processed with the Variant Transforms Cloud Dataflow Pipeline, which wrote records to a BigQuery Public Dataset table in the BigQuery Variants Schema format.

For additional guidance on how to use DeepVariant and BigQuery to analyze your own data on Google Cloud, please check out the following resources:

Acknowledgments

We’d like to thank our collaborators and their organizations—both within and outside Google—for making this post possible:

  • Allen Day, Google Cloud

  • Ryan Poplin, Google AI

  • Ken McNally, IRRI

  • Dmytro Chebotarov, IRRI

  • Ramil Mauleon, IRRI

Make your voice heard! Take the 2019 Accelerate State of DevOps survey

The survey for the 2019 Accelerate State of DevOps Report is now live, and we’d love to hear from you. Whether you’re just starting your DevOps journey or you adopted DevOps a while ago, please make sure your voice is heard so that the survey captures insights from everyone.

For some background, the Accelerate State of DevOps Report is the largest and longest-running research project of its kind. Since launching it six years ago, we’ve surveyed more than 30,000 technical professionals worldwide, across all industries.

By contributing to the survey, you will help shape the narrative of the rapidly growing DevOps industry. Your insights will help drive conversations on how as an industry we can develop software faster with less risk.

Last year, thanks to your contributions to the survey, we were able to get answers to critical questions around DevOps, including:

  • Does DevOps even matter?
  • What drives high-performing DevOps teams?
  • What role do cloud, open source, and culture play in DevOps?
  • Which key metrics best measure DevOps performance?

Last year’s report classified teams into elite, high, medium, and low performers and found that such classifications exist in all types of organizations and industry verticals. We saw the proportion of high performers grow year over year, while low performers struggled to keep up. You can learn more about insights from last year’s report here.

The table below highlights some of the data from the report. It showcases software development and delivery metrics across elite, high, medium, and low-performing DevOps teams.

software_devlier_performance.png

Last year, we also focused on diversifying the percentage of women and underrepresented minorities taking the survey, and saw a big improvement. We hope to improve upon last year’s work, so please share the survey with your colleagues and your network!

The 2019 survey will take approximately 25 minutes to complete. This year, we dig into topics like deployment toolchains, cloud, disaster recovery, how we work, and more! There are no right or wrong answers, and your insights will be valuable for the entire DevOps industry. The DORA research team and Google Cloud want to thank you in advance for your participation.

Shape the future of DevOps and make your voice heard: Link

New study: The state of AI in the enterprise

Editor’s note: Today we hear from one of our Premier partners, Deloitte.  Deloitte’s recent report, The State of AI in the Enterprise, 2nd Edition, examines how businesses are thinking about—and deploying—AI services.

From consumer products to financial services, AI is transforming the global business landscape. In 2017, we began our relationship with Google Cloud to help our joint customers deploy and scale AI applications for their businesses. These customers frequently tell us they’re seeing steady returns on their investments in AI, and as a result, they’re interested in more ways to increase those investments.

We regularly conduct research on the broader market trends for AI, and in November of 2018, we released our second annual “State of AI in the Enterprise” study. It showed that industry trends at large reflect what we hear from our customers: the business community remains bullish on AI’s impact.

In this blog post, we’ll examine some of the key takeaways from our survey of 1,100 IT and line-of-business executives and discuss how these findings are relevant to our customers.

Enterprises are doubling down on AI—and seeing financial benefits

More than 95 percent of respondents believe that AI will transform both their businesses and their industries. A majority of survey respondents have already made large-scale investments in AI, with 37 percent saying they have committed $5 million or more to AI-specific initiatives. Nearly two-thirds of respondents (63 percent) feel AI has completely upended the marketplace and they need to make large-scale investments to catch up with rivals—or even to open a narrow lead.

A surprising 82 percent of our respondents told us they’ve already gained a financial return from their AI investments. But that return is not equal across industries. Technology, media, and telecom companies, along with professional services firms, have made the biggest investments and realized the highest returns. In contrast, the public sector and financial services, with lower investments, lag behind. With 88 percent of surveyed companies planning to increase AI spending in the coming year, there’s a significant opportunity to increase both revenue and cost savings across all industries. However, as with past transformative technologies, selecting the right AI use cases will be key to realizing near- and long-term benefits.

Enterprises are using a broad range of AI technologies, increasingly in the cloud

Our findings show that enterprises are employing a wide variety of AI technologies. More than half of respondents say their businesses are using statistical machine learning (63 percent), robotic process automation (59 percent), or natural language processing and generation (53 percent). Just under half (49 percent) are still using expert or rule-based systems, and 34 percent are using deep learning.

When asked how they were accessing these AI capabilities, 59 percent said they relied on enterprise software with AI capabilities (much of which is available in the cloud) and 49 percent said, “AI as a service” (again, presumably in the cloud). Forty-six percent, a surprisingly high number, said they were relying upon automated machine learning—a set of capabilities that are only available in the cloud. It’s clear, then, that the cloud is already having a major effect on AI use in these large enterprises.

These trends suggest that public cloud providers can become the primary way businesses access AI services. As a result, we believe this could lower the cost of cloud services and enhance their capabilities at the same time. In fact, our research shows that AI technology companies are investing more R&D dollars into enhancing cloud-native versions of AI systems. If this trend continues, it seems likely that enterprises seeking best-of-breed AI solutions will increasingly need to access them from cloud providers.

There are still challenges to overcome

Given the enthusiasm surrounding AI technologies, it is not surprising that organizations also need to supplement their investments in talent. While 31 percent of respondents listed “lack of AI skills” as a top-three concern—below such issues as implementation, integration, and data—HR teams need to look beyond technology skills to understand their organization’s pain points and end goals. Companies should try to secure teams that bring a mix of business and technology experience to help fully realize the potential of their AI projects.

Our respondents also had concerns about AI-related risks. A little more than half are worried about cybersecurity issues around AI (51 percent), and many are concerned about “making the wrong strategic decisions based on AI recommendations” (43 percent). Companies have also begun to recognize ethical risks from AI, the most common being “using AI to manipulate information and create falsehoods” (43 percent).

In conclusion

Despite some challenges, our study suggests that enterprises are enthusiastic about AI, have already seen value from their investments, and are committed to expanding those investments. Looking forward, we expect to see substantial growth in AI and its cloud-based implementations, and that businesses will increasingly turn to public cloud providers as their primary method of accessing them.

Deloitte was proud to be named Google Cloud’s Global Services Partner of the Year for 2017, in part due to our joint investments in AI. To learn more about how we can help you accelerate your organization’s AI journey, contact [email protected].

As used in this document, “Deloitte” means Deloitte Consulting LLP, a subsidiary of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of our legal structure. Certain services may not be available to attest clients under the rules and regulations of public accounting.