
October 2, 2024

Google Cloud Bolsters GenAI with ScaNN Index, Valkey Updates

Google Cloud today unveiled a slew of database enhancements designed to improve customers’ generative AI initiatives, including the general availability of the ScaNN index, which can support up to 1 billion vectors in AlloyDB, and support for vector search in Memorystore for Valkey 7.2.

As companies build out their GenAI products and strategies, they’re looking for databases that can bring it all together. The capability to create, store, and serve vector embeddings that connect to large language models (LLMs) is a critical piece of those initiatives. To that end, Google Cloud rolled out several enhancements to its database offerings that can help companies push those efforts forward.

First up is the launch of Google’s ScaNN index with AlloyDB, the company’s Postgres-based hosted database service. First announced in April for AlloyDB Omni, the downloadable version of AlloyDB, the ScaNN index is now generally available in the hosted AlloyDB for PostgreSQL offering.

ScaNN is based on the approximate nearest-neighbor technology that Google Research developed for Google’s own search engine, for Google Ads, and for YouTube. That gives Google Cloud customers plenty of headroom for their neural search and GenAI applications, says Andi Gutmans, Google Cloud’s GM and VP of engineering for databases.

“The ScaNN index is the first PostgreSQL-compatible index that can scale to support more than one billion vectors while maintaining state-of-the-art query performance–enabling high scale workloads for every enterprise,” Gutmans said in a blog post today.

ScaNN is Google’s approximate nearest neighbor algorithm

ScaNN is compatible with pgvector, the popular vector plug-in for Postgres, but outperforms it in several ways, according to a Google white paper on ScaNN. Compared to pgvector, ScaNN can create vector indexes up to 8x faster, deliver 4x the query performance, use 3x to 4x less memory, and provide up to 10x the write throughput. You can download the Google white paper here.
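For a sense of what that pgvector compatibility looks like in practice, here is a minimal sketch of creating and querying a ScaNN index from Python with psycopg. It assumes an AlloyDB instance with pgvector enabled and an existing products table that has a 768-dimension embedding column; the alloydb_scann extension and scann index method names follow Google’s AlloyDB documentation, but should be verified against your instance’s version, and the connection string is a placeholder.

```python
# Minimal sketch (not from the article): creating and querying a ScaNN index
# on AlloyDB for PostgreSQL from Python with psycopg (v3).
# Assumptions: pgvector is enabled and a table products(id bigint,
# embedding vector(768)) is already populated. The "alloydb_scann" extension
# and "scann" index method names follow Google's AlloyDB docs; verify them
# against your AlloyDB version.
import psycopg

DSN = "host=<alloydb-ip> dbname=postgres user=postgres password=<password>"  # placeholder

with psycopg.connect(DSN) as conn, conn.cursor() as cur:
    # Enable the ScaNN index extension that ships with AlloyDB.
    cur.execute("CREATE EXTENSION IF NOT EXISTS alloydb_scann")

    # Build an approximate nearest-neighbor index on the embedding column.
    # num_leaves is a tuning knob; size it to the table's row count.
    cur.execute("""
        CREATE INDEX IF NOT EXISTS products_embedding_scann
        ON products USING scann (embedding cosine)
        WITH (num_leaves = 1000)
    """)
    conn.commit()

    # Query with the ordinary pgvector cosine-distance operator; the planner
    # can satisfy the ORDER BY ... LIMIT using the ScaNN index.
    query_embedding = "[" + ",".join(["0.0"] * 768) + "]"  # stand-in query vector
    cur.execute(
        "SELECT id FROM products ORDER BY embedding <=> %s::vector LIMIT 10",
        (query_embedding,),
    )
    print(cur.fetchall())
```

Because the index is created through standard PostgreSQL DDL and queried with the usual pgvector operators, existing pgvector-based applications should be able to adopt it without query changes.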

Another GenAI enhancement is the addition of vector search to the 7.2 versions of Memorystore for Redis and Memorystore for Valkey, a new key-value store offering that Google Cloud launched last month. Valkey is an open-source fork of Redis that is managed by the Linux Foundation and in which Google Cloud has taken an interest.

“A single Memorystore for Valkey or Memorystore for Redis Cluster instance can perform vector search at single-digit millisecond latency on over a billion vectors with greater than 99% recall,” Gutmans writes in his blog post.
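Memorystore exposes its vector search through RediSearch-style FT.* commands, so a standard client such as redis-py can drive it. The sketch below is illustrative only: the host, index name, and key prefix are placeholders, a non-clustered redis.Redis connection is assumed (a clustered instance would use redis.RedisCluster), and the supported index options should be checked against the Memorystore documentation.

```python
# Minimal sketch (not from the article): vector search against a Memorystore
# for Valkey / Memorystore for Redis Cluster instance using redis-py.
# Assumptions: the instance supports RediSearch-style FT.* vector commands,
# the host is a placeholder, and a plain redis.Redis connection is used
# (a clustered deployment would use redis.RedisCluster instead).
import numpy as np
import redis
from redis.commands.search.field import VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

r = redis.Redis(host="<memorystore-host>", port=6379)

# Define an HNSW vector index over hash keys prefixed "doc:".
r.ft("doc_idx").create_index(
    fields=[
        VectorField(
            "embedding",
            "HNSW",
            {"TYPE": "FLOAT32", "DIM": 4, "DISTANCE_METRIC": "COSINE"},
        )
    ],
    definition=IndexDefinition(prefix=["doc:"], index_type=IndexType.HASH),
)

# Store a few documents whose embeddings are raw float32 bytes.
r.hset("doc:1", mapping={"embedding": np.array([0.1, 0.2, 0.3, 0.4], np.float32).tobytes()})
r.hset("doc:2", mapping={"embedding": np.array([0.9, 0.1, 0.0, 0.2], np.float32).tobytes()})

# KNN query: the 2 nearest neighbors to the query vector, closest first.
query_vec = np.array([0.1, 0.2, 0.3, 0.4], np.float32).tobytes()
q = Query("*=>[KNN 2 @embedding $vec AS score]").sort_by("score").dialect(2)
results = r.ft("doc_idx").search(q, query_params={"vec": query_vec})
for doc in results.docs:
    print(doc.id, doc.score)
```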

The company also announced the public preview of Memorystore for Valkey 8.0, which brings major performance and reliability improvements, a new replication scheme, networking enhancements, and detailed visibility into performance and resource usage, Gutmans says. Memorystore for Valkey 8.0 pushes up to twice the queries per second compared to Memorystore for Redis Cluster, at microsecond latency, he adds.

Google Cloud announced updates to several other products, including Firebase, Spanner, and Gemini. You can read more about them here.

Related Items:

Google Revs Cloud Databases, Adds More GenAI to the Mix

Google Cloud Bolsters AI Options At Next ’24

Google Cloud Launches New Postgres-Compatible Database, AlloyDB
