Google Cloud Bolsters Data, Analytics, and AI Offerings
Today at its Data Cloud & AI Summit, Google Cloud announced a series of enhancements designed to bolster its database, analytics, and AI and machine learning offerings, including its BigQuery data warehouse and Looker BI tool, as well as AlloyDB, its Postgres-compatible database. New data clean rooms are on tap, as is a new app builder for creating generative AI products.
Google Cloud is launching new editions of BigQuery that it says will give customers more choice and flexibility. Users can mix and match among Standard, Enterprise, and Enterprise Plus editions, which cost $0.04, $0.06, and $0.10 per slot-hour, respectively, to match their analytics spending with the performance they need.
“The Standard edition is best for ad-hoc, development, and test workloads, while Enterprise has increased security, governance, machine learning, and data management features,” Gerrit Kazmaier, vice president and GM of data analytics for Google Cloud, writes in a blog. “Enterprise Plus is targeted at mission-critical workloads that demand high uptime, availability, and recovery requirements, or have complex regulatory needs.”
The new pricing mechanism is in place now. Customers with predictable workloads can purchase BigQuery editions on single-year or multi-year commitments, the company says. Customers with unpredictable workloads can select an auto-scaling package that requires them to pay only for the compute capacity they use.
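To make the edition pricing concrete, here is a minimal sketch of the slot-hour arithmetic. The per-slot-hour rates come from the article; the workload size and hours are made-up illustration values, not Google Cloud figures.

```python
# Per-slot-hour rates for the three BigQuery editions, as stated above.
RATES = {"Standard": 0.04, "Enterprise": 0.06, "Enterprise Plus": 0.10}

def slot_cost(edition: str, slots: int, hours: float) -> float:
    """Pay-as-you-go compute cost: slots x hours x per-slot-hour rate."""
    return slots * hours * RATES[edition]

# Hypothetical workload: 100 slots running 8 hours a day for a 30-day month.
for edition in RATES:
    print(f"{edition}: ${slot_cost(edition, 100, 8 * 30):,.2f}/month")
```

Under these assumed numbers, the same workload costs 2.5x more on Enterprise Plus than on Standard, which is the trade-off the tiered editions are meant to expose.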
Starting July 5, BigQuery customers will no longer be able to purchase flat-rate annual, flat-rate monthly, or flex slot commitments, and will have to select the Standard, Enterprise, or Enterprise Plus package, the company says. Also starting July 5, the company is increasing the price of “the on-demand analysis model by 25% across all regions.”
The company is also rolling out a new “compressed storage billing” model that will reduce costs for BigQuery customers, “depending on the type of structured and unstructured data that is stored,” the company says. Google Cloud says its customer Exabeam has achieved a data compression rate of 12-to-1.
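The billing change amounts to charging for physical (compressed) bytes rather than logical bytes. A rough sketch of the arithmetic, using Exabeam's reported 12-to-1 ratio and a made-up data volume (not a Google Cloud rate or figure):

```python
def billed_size(logical_gib: float, compression_ratio: float) -> float:
    """Physical (compressed) size that would be billed under the new model."""
    return logical_gib / compression_ratio

logical = 120_000.0  # hypothetical 120,000 GiB of logical data
physical = billed_size(logical, 12.0)
print(f"billed: {physical:,.0f} GiB instead of {logical:,.0f} GiB")
```

At a 12-to-1 ratio, the billable footprint shrinks by the same factor, which is why the savings depend so heavily on how compressible the stored data is.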
BigQuery ML, the company’s flagship machine learning offering launched back in 2019, is also gaining new features. Among them are the ability to import models from PyTorch, host remote models on Vertex AI, and run pre-trained models from Vertex AI, the company says.
Google Cloud also used the Data Cloud & AI Summit as an opportunity to reiterate two features that it added two weeks ago to Vertex AI, the machine learning model development and deployment platform it launched back in 2021: Generative AI Studio and Model Garden.
Google Cloud says the new Generative AI Studio will provide “a wide range of capabilities including a chat interface, prompt design, prompt tuning, and even the ability to fine-tune model weights.” Model Garden, meanwhile, will allow users to search, discover, and interact with Google’s own “foundation models.” Over time, Google Cloud plans to add “hundreds of open-source and third-party models” to the Garden.
Google Cloud is also launching a new Gen App Builder that’s designed to help developers build AI apps with AI-powered search and conversational experiences, the company says.
The goal with Gen App Builder is to allow skilled developers and “even those with limited machine learning skills [to] quickly and easily tap into the power of Google’s foundation models, search expertise, and conversational AI technologies to create enterprise-grade generative AI applications,” write Google Cloud’s Lisa O’Malley, senior director of product management for industry solutions, and Yariv Adan, director of cloud conversational AI, in a blog post.
The company also unveiled Looker Modeler, a new add-on for its flagship BI and analytics product that will help customers define metrics about their business using Looker’s semantic modeling layer. Looker Modeler serves as “the single source of truth for your metrics, which you can share with the BI tools of your choice, such as PowerBI, Tableau, ThoughtSpot, Connected Sheets, and Looker Studio, providing users with quality data for informed decisions,” write Kazmaier and Andi Gutmans, Google Cloud’s general manager and vice president of engineering for databases, in a blog post.
Starting in the third quarter, Google Cloud customers will be able to try out BigQuery data clean rooms, which will allow them to share data across organizations while respecting privacy. The clean rooms will be useful for combining first-party data with advertising campaign data or other third-party data from the Google Cloud data marketplace, Kazmaier and Gutmans write.
“This can enable your organization to unlock insights and improve campaigns, all while preserving privacy protections,” they write.
On the transactional front, Google Cloud unveiled a new version of AlloyDB, the Postgres-compatible cloud database that it initially launched last year. With AlloyDB Omni, customers now have a version of the database that they can download and run wherever they want, including on laptops, servers, or edge devices.
Google Cloud claims AlloyDB Omni is more than 2x faster than standard Postgres for transactional workloads and up to 100x faster for analytical workloads. The company is also launching a new Database Migration Assessment (DMA) tool designed to help customers move to AlloyDB Omni or Cloud SQL.
Related Items:
Google Cloud’s 2023 Data and AI Trends Report Reveals a Changing Landscape
Google Cloud Opens Up Its Data Cloud at Next ’22
Google Cloud Announces Vertex AI Tool for Demand Forecasting