
Google Cloud Introduces A Managed Memcached Service


Google today announced the beta of Memorystore for Memcached, a new service that provides a fully managed in-memory data store compatible with the open-source Memcached protocol. It joins Redis in the Memorystore family, which first launched in 2018, and gives your caching layer more flexibility and choice.

In-memory data stores are essential infrastructure for building scalable, high-performance applications. Whether you are building a highly responsive e-commerce website, running multiplayer games with thousands of users, or analyzing real-time data pipelines, an in-memory store can deliver the low latency and throughput needed for millions of transactions. Redis is a popular in-memory data store for use cases such as session stores, gaming leaderboards, stream analytics, API rate limiting, threat detection, and more. Another in-memory data store, open-source Memcached, remains the most popular choice as a caching layer in front of databases, valued for its speed and simplicity.


Main features of Memorystore for Memcached

Memcached provides a simple but powerful in-memory key-value store and is popular as a front-end cache for databases. Using Memcached as a front-end store not only provides an in-memory caching layer for faster query processing, but can also help cut costs by reducing the load on your back-end database.
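To make the front-end cache pattern concrete, here is a minimal cache-aside sketch using the third-party pymemcache client. The endpoint address, key format, and fetch_user_from_db() helper are hypothetical placeholders, not part of the Memorystore announcement; substitute your own instance address and database call.

```python
from pymemcache.client.base import Client

cache = Client(("10.0.0.3", 11211))  # hypothetical Memcached endpoint

def fetch_user_from_db(user_id):
    # Placeholder for a real (and comparatively slow) database query.
    return {"id": user_id, "name": "example"}

def get_user(user_id):
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return cached                          # cache hit: skip the database
    user = fetch_user_from_db(user_id)
    cache.set(key, str(user), expire=300)      # cache miss: populate for 5 minutes
    return user
```

On a hit, the request never touches the database, which is where both the latency win and the cost savings come from.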

  • Memorystore for Memcached is fully compatible with the open-source Memcached protocol. If you are moving applications that use self-deployed Memcached or Memcached from another cloud provider, you can migrate with zero code changes.
  • Memorystore for Memcached is fully managed. Common tasks such as deployment, scaling, managing node configuration in the client, setting up monitoring, and patching are handled for you, so you can focus on building your applications.
  • Properly sizing a cache is a common challenge with distributed caches. Memorystore for Memcached's scaling features, together with detailed open-source Memcached monitoring metrics, let you scale your instance up and down to optimize your cache hit ratio and cost. With Memorystore for Memcached, you can scale your cluster up to 5 TB per instance.
  • The auto-discovery protocol lets client applications adapt programmatically to changes in the number of nodes during scaling, drastically reducing management overhead and code complexity (see the sketch after this list).
  • You can monitor your Memorystore for Memcached instances through built-in dashboards in the Cloud Console and rich metrics in Cloud Monitoring.
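Below is a hedged sketch of what the auto-discovery flow looks like from a client: ask the instance's discovery endpoint for the current node list, then build a consistent-hashing client over those nodes with pymemcache. The endpoint address is a hypothetical placeholder, and the "config get cluster" command and response layout follow the ElastiCache-compatible auto-discovery convention; check the Memorystore documentation for the exact details of your instance.

```python
import socket
from pymemcache.client.hash import HashClient

DISCOVERY_ENDPOINT = ("10.0.0.3", 11211)  # hypothetical discovery endpoint

def discover_nodes(endpoint):
    """Return the (ip, port) pairs currently reported by the discovery endpoint."""
    with socket.create_connection(endpoint, timeout=2) as sock:
        sock.sendall(b"config get cluster\r\n")
        data = b""
        while not data.endswith(b"END\r\n"):
            data += sock.recv(4096)
    # The third line of the response is a space-separated list of
    # "hostname|ip|port" entries, one per node in the instance.
    lines = data.decode().splitlines()
    nodes = []
    for entry in lines[2].split():
        _, ip, port = entry.split("|")
        nodes.append((ip, int(port)))
    return nodes

# Rebuild the client whenever the node count changes (e.g. after scaling).
client = HashClient(discover_nodes(DISCOVERY_ENDPOINT))
client.set("greeting", "hello")
print(client.get("greeting"))
```

Because the node list comes from the service rather than from hard-coded configuration, scaling the instance up or down does not require an application redeploy, only a refresh of the discovered nodes.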

Memorystore for Memcached can be accessed from applications running on Compute Engine, Google Kubernetes Engine (GKE), App Engine Flex, App Engine Standard, and Cloud Functions.

Akarshan Narang
Covering the world of Cloud at CMI.
