Redis transforms into AI real-time memory layer with Decodable, LangCache.

Powering next-gen AI: Redis acquires Decodable and unveils LangCache, streamlining real-time data and memory for intelligent agents.

September 17, 2025

In a significant move to bolster its position in the artificial intelligence sector, data platform Redis has announced a major expansion of its AI strategy, highlighted by the acquisition of real-time data platform Decodable and the launch of a new semantic caching service, LangCache.[1][2][3] These announcements, made at the Redis Released 2025 event in Bengaluru, India, signal the company's evolution from a high-speed in-memory data store to a foundational "real-time memory layer" for the burgeoning world of intelligent, agent-based software.[4] The strategic initiatives underscore a sharpened focus on the Indian market, which Redis CEO Rowan Trollope identified as a cornerstone of the company's global growth strategy, citing its massive developer community and vibrant startup ecosystem.[4][2][5] This series of moves is designed to provide developers with the essential infrastructure for building and scaling reliable AI applications, particularly by enhancing how AI agents access and utilize real-time contextual data.[2][3]
The centerpiece of the expanded strategy is the acquisition of Decodable, a serverless platform that simplifies the complex processes of real-time data ingestion, transformation, and delivery.[6][3] The purchase is intended to supercharge Redis's capabilities in managing streaming data pipelines, making it significantly easier for developers to feed fresh, context-rich information into AI systems.[4][6] By integrating Decodable's technology, Redis aims to streamline the flow of data from diverse sources into its core platform, ensuring that AI agents and applications have instant access to the most current information for making relevant and reliable decisions.[6] Trollope emphasized that as technology becomes increasingly reliant on Large Language Models (LLMs), the Decodable acquisition will simplify the creation of data pipelines, converting raw data into actionable context within Redis, where it is always fast and available.[2][3] The integration will directly enhance Redis Data Integration (RDI), broadening its connectors and capabilities to ensure data from legacy databases and other sources is synchronized in near-real time for immediate use in demanding AI workloads.[6] The move reflects a deep investment in Redis Cloud and the company's mission to be the central, fastest real-time data platform for modern applications.[6]
Alongside the acquisition, Redis has launched LangCache, a new, fully managed semantic caching service, now available in public preview.[1][2] The tool is engineered to address two of the most significant challenges in deploying LLM-powered applications: high operational costs and response latency. LangCache works by storing and retrieving semantically similar calls to LLMs, effectively creating a memory layer that prevents redundant queries.[4][7] By serving responses from its cache for repeated or similar questions, Redis claims LangCache can cut LLM API costs by up to 70 percent and accelerate response times by as much as 15 times compared to live inference.[4][2] This capability is crucial for applications like AI-powered chatbots, retrieval-augmented generation (RAG) systems, and multi-step AI agents, which often face repetitive user queries.[7][8] The service is accessible via a REST API, simplifying deployment and eliminating the need for developers to manage the underlying database infrastructure, while also offering advanced cache management features to control data access, privacy, and eviction protocols.[7][9]
Further cementing its role as a critical component of the AI development stack, Redis also unveiled deeper integrations with popular agent frameworks, including AutoGen and Cognee, along with enhancements for LangGraph.[4][1][10] These integrations are designed to simplify the development of AI agents by providing a straightforward way to equip them with persistent, scalable memory without writing complex, custom code.[4][2] By offering a robust memory layer, Redis enables AI agents to maintain context, learn from past interactions, and perform more sophisticated reasoning and planning.[4] This addresses a key challenge in the move from AI prototypes to production-scale systems, where providing the right memory and context is paramount for reliable performance. The integrations build on Redis's existing portfolio of partnerships with frameworks like LangChain and LlamaIndex, reinforcing its position as a high-performance vector database and LLM cache essential for building agentic applications.[11]
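The "persistent memory" pattern these integrations provide boils down to a simple loop: every conversational turn is appended to a store keyed by session, and recent turns are replayed into the agent's prompt on each call. The sketch below is illustrative only: a Python dict stands in for Redis (in practice a Redis list or stream per session would back the store), and `AgentMemory` and its method names are hypothetical rather than any framework's actual API.

```python
from collections import defaultdict


class AgentMemory:
    """Append-only conversation memory keyed by session ID, mimicking
    a per-session Redis list (conceptually, RPUSH to write, LRANGE to read)."""

    def __init__(self):
        # A plain dict stands in for Redis in this self-contained sketch.
        self.store: dict[str, list[str]] = defaultdict(list)

    def append(self, session_id: str, role: str, text: str) -> None:
        self.store[session_id].append(f"{role}: {text}")

    def context(self, session_id: str, last_n: int = 10) -> str:
        # Replay the most recent turns so the agent keeps context across calls.
        return "\n".join(self.store[session_id][-last_n:])


memory = AgentMemory()
memory.append("sess-1", "user", "My order number is 1042.")
memory.append("sess-1", "assistant", "Thanks, I have noted order 1042.")
memory.append("sess-1", "user", "Where is it now?")

# The next prompt includes prior turns, so "it" can resolve to order 1042.
print(memory.context("sess-1"))
```

This is what lets an agent "maintain context and learn from past interactions": without the replayed history, the follow-up question "Where is it now?" is unanswerable. The framework integrations described above aim to provide this plumbing, backed by Redis, without the custom code.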
The decision to announce these sweeping strategic updates in Bengaluru was a deliberate one, reflecting India's growing prominence in the global AI landscape.[4][2][12] Trollope noted that the attendance at the Indian developer gathering was higher than in any other market, a testament to the country's energy and scale.[4] With over 17 million developers and one of the world's largest startup ecosystems, India represents a fertile ground for the adoption and innovation of AI technologies.[2][5] Redis is positioning itself to be the foundational infrastructure for this new wave of intelligent applications being built in the region and globally.[12] By providing the tools to manage context and memory in real time, Redis is betting that its platform will be indispensable as enterprises and startups alike embrace AI at an unprecedented speed.[12] The combination of the Decodable acquisition, the launch of LangCache, and deeper framework integrations provides a powerful, unified solution for developers aiming to build the next generation of capable, responsive, and reliable AI systems.
