Featured Service: MemCachier
Each week, we’re going to be featuring one of our partners to talk about their service, their story, and what brought them to join Manifold.
MemCachier is a memcache-as-a-service provider that manages and scales memcache servers, allowing you to focus on your app. They’ve created a custom memcache implementation with better reliability and usability, but the same low latency as the open-source version. They want to make it easier for a product to scale by not requiring any code changes to add or remove capacity, and by taking the worry of managing instances and servers off the developer’s shoulders. On top of that, they provide an analytics dashboard that helps developers get more out of their cache.
MemCachier has been a partner of Manifold since May 2017. They started in 2012 and have over 40,000 active users. We reached out to Sascha Trifunovic from MemCachier to learn more about their story.
Sitara: How has MemCachier revolutionized the way developers and teams track their cache?
Sascha: MemCachier was the first push button cache provisioning to integrate seamlessly with web frameworks and cache clients. With MemCachier, using a cache to speed up web apps becomes a no-brainer. Anyone can use it, regardless of their expertise in setting up or managing cache servers.
MemCachier gives insights into how a user’s cache is behaving through our analytics dashboard. Web developers can now see at a glance if their cache is performing as expected or if a bigger cache is needed.
MemCachier allows users to upgrade and downgrade the size of their cache without losing any data, all at the push of a button. This means that developers can change cache parameters without fear. Through the analytics dashboard, they can see the impact of their changes in real time and compare them to historical trends.
Si: How does memcaching compare to other caching methods?
Sa: Memcache is a proven technology that has been used for over 14 years. It is used by some of the largest software teams with the most demanding workloads. Following the mantra of “simplicity is good”, memcache is reliable, performant, and it gets the job done. Its architecture makes it easy to scale, and with MemCachier, setting up a 100GB cache is just as easy as setting up a 25MB cache.
Si: Stacks are changing; how’s memcache changing to support new application architectures?
Sa: Memcache has been a key component of web applications for over a decade because caching is a core infrastructure that cuts across software stacks and use cases. Multi-page sites use caching to speed up expensive page rendering, while single-page sites use caching to store API response payloads and user states. So it’s no surprise that companies like Facebook and Google continue to use memcache across their software stacks.
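Both use cases Sascha describes boil down to the same cache-aside pattern: check the cache before doing expensive work, and store the result on a miss. As a minimal sketch, a plain dict stands in for a memcache client here (with a real client, the lookup and store would be `get`/`set` calls against the server, typically with an expiry time); `render_page` is a hypothetical stand-in for any expensive rendering or API call.

```python
# Cache-aside sketch: try the cache first, fall back to the
# expensive operation, and store the result for next time.
# A plain dict stands in for a memcache client in this example.

cache = {}

def render_page(slug):
    """Hypothetical expensive step (page render, API call, query)."""
    return f"<html>{slug}</html>"

def get_page(slug):
    key = f"page:{slug}"
    html = cache.get(key)          # 1. look in the cache
    if html is None:
        html = render_page(slug)   # 2. cache miss: do the work
        cache[key] = html          # 3. store for subsequent requests
    return html
```

With a real memcache client the store would usually also set a TTL (e.g. `mc.set(key, html, time=300)`), so stale entries expire on their own.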
Si: Caching has always been a bit of a dark art; how would you suggest people learn about the various techniques and tools?
Sa: MemCachier tries to make caching as transparent as possible. There is an abundance of plug-and-play clients in a variety of languages that integrate with popular frameworks, so caching with memcache can be done in any environment the user is familiar with. On the server side, we’ve already done the necessary work so that developers can focus on actually using their caches. Our documentation is always a good starting point for learning how to set up caching for a range of clients and frameworks, and for more advanced questions we have an excellent support team made up of the engineers who built (and keep improving) our systems. Our analytics dashboard also helps remove some of the mystery surrounding cache performance. Once a cache is set up, we encourage people to play around with resizing their cache on the fly to see the immediate results of that change reflected on their dashboard, and to use those results to optimize the size of their cache.
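To illustrate the plug-and-play integration Sascha mentions, here is a sketch of what wiring a framework to a hosted memcache service can look like, using Django’s cache setting as the example. The backend path and the `MEMCACHIER_*` environment variable names follow MemCachier’s own documentation, but treat this as an illustrative configuration fragment rather than a definitive setup for your stack.

```python
# settings.py sketch for a Django app talking to a hosted memcache
# service via the django-bmemcached backend (illustrative values).
import os

CACHES = {
    "default": {
        # SASL-capable binary-protocol backend from django-bmemcached
        "BACKEND": "django_bmemcached.memcached.BMemcached",
        # e.g. "mc1.example.net:11211,mc2.example.net:11211"
        "LOCATION": os.environ.get("MEMCACHIER_SERVERS", "").split(","),
        "OPTIONS": {
            "username": os.environ.get("MEMCACHIER_USERNAME"),
            "password": os.environ.get("MEMCACHIER_PASSWORD"),
        },
    }
}
```

Once configured, application code just uses the framework’s normal cache API (`django.core.cache.cache.get`/`set`), which is what makes swapping cache sizes or providers a configuration change rather than a code change.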
Si: What might caching look like in a serverless/microservices world?
Sa: As workloads become more granular, it’s becoming increasingly important to have caches that can scale dynamically and still provide the same reliability and performance as a large cache at smaller sizes. Also, shared caching will start to play an even more important role as an expanding number of functions or microservices will need access to shared state — a cache like MemCachier is the natural (and fastest) way to share this state across a constellation of microservices.
MemCachier provides a range of memcache storage options, starting from a free Developer plan.