Cloud Memorystore is a new addition to the Google Cloud Platform. It provides a fully managed, cloud-hosted Redis service, allowing you to cache common responses in GCP-hosted web applications and giving your users low latency and high performance.
Due to its in-memory nature, Memorystore features some of the lowest latencies on the platform, down to sub-millisecond levels. This managed Redis service runs on Google's highly scalable infrastructure, supporting instances of up to 300 GB and network throughput of 12 Gbps. Memorystore also offers an easy migration path for existing users of Redis, a technology that is fast gaining popularity, especially inside Docker containers running on Kubernetes.

In this course, Leveraging Fully Managed Redis Datastores Using Google Cloud Memorystore, you'll examine all of these aspects of working with Memorystore and learn how to get the best out of this powerful managed database service.

First, you will explore the suite of storage products available on the GCP and where exactly Memorystore fits in. You will be introduced to using Redis both to cache data for transactions and as a publisher-subscriber message delivery system, and you will learn about the LRU eviction policies that Memorystore follows.

Next, you will implement Memorystore integrations with applications that you host on Compute Engine VMs, App Engine, and Google Kubernetes Engine clusters; these are the compute options that the GCP currently supports for working with managed Redis.

Finally, you will dive into configuring Memorystore for high availability. Memorystore offers two Redis tiers: basic tier and standard tier instances. Basic tier instances do not support cross-zone replication and failover, while standard tier instances are equipped with both; the standard tier also incurs far less downtime during scaling. You'll also see how you can monitor Redis instances using Stackdriver.

When you're done with this course, you will have a good understanding of how you can use Memorystore to cache your data on the cloud, and you will know how to integrate managed Redis with applications running on the various compute options on the GCP.
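As a brief illustration of the caching pattern described above, here is a minimal sketch of connecting to a Memorystore Redis instance from a GCP-hosted Python application using the open-source redis-py client. The instance IP address, port, key names, and the fetch_product_from_database helper are hypothetical placeholders introduced for this example, not values taken from the course.

```python
# Minimal caching sketch, assuming a Memorystore Redis instance reachable from
# this application over its private IP within the same VPC network.
import json

import redis

# 10.0.0.3 and 6379 are assumed placeholder values for the instance endpoint;
# in practice you would read them from your own configuration or environment.
client = redis.Redis(host="10.0.0.3", port=6379)


def fetch_product_from_database(product_id: str) -> dict:
    """Hypothetical stand-in for the expensive lookup the cache is meant to avoid."""
    return {"id": product_id, "name": "example product"}


def get_product(product_id: str) -> dict:
    """Return a product record, serving it from the Redis cache when possible."""
    cache_key = f"product:{product_id}"
    cached = client.get(cache_key)
    if cached is not None:
        # Cache hit: deserialize and return without touching the slow backend.
        return json.loads(cached)
    record = fetch_product_from_database(product_id)
    # Cache the response for five minutes; keys are also subject to eviction
    # under the LRU maxmemory policy configured on the instance.
    client.setex(cache_key, 300, json.dumps(record))
    return record
```

The same redis-py client object also exposes publish() and pubsub() methods, which is how the publisher-subscriber messaging pattern mentioned above would be exercised against a Memorystore instance.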