Redis announced the integration of Redis Enterprise Cloud’s vector database capabilities with Amazon Bedrock, a service that makes it easy to build generative AI applications with foundation models. The solution allows customers to streamline application development by capitalizing on the developer efficiency and scalability of a fully managed, high-performance database, while making it easy to use an array of leading foundation models (FMs) via API.
Redis Enterprise Cloud integration with Amazon Bedrock streamlines development of generative AI apps
Vector databases have become essential in addressing unique challenges in generative AI. Redis Enterprise Cloud offers the flexibility to store vector embeddings as hashes or JSON documents, and it provides a high-performance index and query search engine that meets the low-latency demands of generative AI applications. In addition to native vector database features, Redis Enterprise Cloud integrates with leading AI application development frameworks, libraries, and data orchestration platforms to deliver the same ease of use that developers know Redis for.
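For illustration, the sketch below shows how vector embeddings might be stored in Redis hashes and searched with a KNN query using the open-source redis-py client. The index name ("docs_idx"), key prefix, embedding dimension, and sample data are hypothetical placeholders rather than details from the announcement, and JSON documents could be used in place of hashes.

import numpy as np
import redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

# Replace host/port with your Redis Enterprise Cloud endpoint and credentials.
r = redis.Redis(host="localhost", port=6379)

DIM = 1536  # example embedding size; must match the embedding model you use

# Index hashes under the "doc:" prefix with a text field and a FLAT cosine vector field.
schema = (
    TextField("content"),
    VectorField("embedding", "FLAT",
                {"TYPE": "FLOAT32", "DIM": DIM, "DISTANCE_METRIC": "COSINE"}),
)
r.ft("docs_idx").create_index(
    schema,
    definition=IndexDefinition(prefix=["doc:"], index_type=IndexType.HASH),
)

# Store a document and its embedding (random here, standing in for a real model's output).
vec = np.random.rand(DIM).astype(np.float32)
r.hset("doc:1", mapping={"content": "Redis vector search example",
                         "embedding": vec.tobytes()})

# KNN query: return the 3 documents closest to the query vector.
query_vec = np.random.rand(DIM).astype(np.float32)
q = (
    Query("*=>[KNN 3 @embedding $vec AS score]")
    .sort_by("score")
    .return_fields("content", "score")
    .dialect(2)
)
results = r.ft("docs_idx").search(q, query_params={"vec": query_vec.tobytes()})
for doc in results.docs:
    print(doc.content, doc.score)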
Amazon Bedrock is a fully managed service that enables developers to build and scale generative AI applications using the FMs from Amazon, AI21 Labs, Anthropic, Cohere, and Stability AI that best suit their use case. For organizations implementing Retrieval Augmented Generation (RAG) architectures or large language model (LLM) caching, this integration eliminates the need to build their own models, customize existing models, or share their proprietary data with a commercial LLM provider. Used as a vector database alongside Amazon Bedrock, Redis Enterprise Cloud offers hybrid semantic search capabilities to pinpoint relevant data, and it can also be deployed as an external domain-specific knowledge base. This allows LLMs to receive the most relevant and up-to-date context, which improves result quality and reduces undesirable model hallucinations. The integration will be available in AWS Marketplace.
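As a rough sketch of what a RAG flow on top of this integration could look like, the example below retrieves context from the Redis index assumed above and passes it to a foundation model through the Amazon Bedrock runtime API via boto3. The model IDs, request payloads, and helper names are illustrative assumptions; the exact payload schema differs per foundation model.

import json
import struct

import boto3
import redis
from redis.commands.search.query import Query

# Hypothetical connection details; replace with your Redis Enterprise Cloud
# endpoint and AWS region. Assumes a "docs_idx" index like the one sketched above.
r = redis.Redis(host="localhost", port=6379)
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def embed(text: str) -> bytes:
    """Embed text with a Bedrock embedding model and pack it as FLOAT32 bytes."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",  # example model; payload varies per model
        body=json.dumps({"inputText": text}),
    )
    vector = json.loads(resp["body"].read())["embedding"]
    return struct.pack(f"{len(vector)}f", *vector)

def retrieve(question: str, k: int = 3) -> list:
    """Vector search in Redis for the k documents closest to the question."""
    q = (
        Query(f"*=>[KNN {k} @embedding $vec AS score]")
        .sort_by("score")
        .return_fields("content")
        .dialect(2)
    )
    res = r.ft("docs_idx").search(q, query_params={"vec": embed(question)})
    return [doc.content for doc in res.docs]

def answer(question: str) -> str:
    """RAG: ground a Bedrock text model with context retrieved from Redis."""
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    resp = bedrock.invoke_model(
        modelId="amazon.titan-text-express-v1",  # example model ID
        body=json.dumps({"inputText": prompt,
                         "textGenerationConfig": {"maxTokenCount": 512}}),
    )
    return json.loads(resp["body"].read())["results"][0]["outputText"]

In this pattern, the LLM never needs to be retrained on proprietary data: the domain knowledge stays in Redis, and only the retrieved snippets relevant to a given question are sent to the model at request time.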
“The integration of Redis Enterprise Cloud and Amazon Bedrock is a continuation of our commitment to deliver more efficient and high-performance solutions to the developer community and our customers,” said Tim Hall, Chief Product Officer at Redis. “The combination of these robust serverless platforms will help accelerate generative AI application development through efficient infrastructure management and seamless scalability required by these applications.”
“Generative AI presents an exciting new frontier for companies to create innovative customer experiences by using FMs for their specific use cases,” said Atul Deo, general manager, Amazon Bedrock at AWS. “Customers are keen to use techniques like RAG to ensure that FMs deliver accurate and contextualized responses. This integration of Amazon Bedrock and Redis Enterprise Cloud will help customers streamline their generative AI application development process by simplifying data ingestion, management, and RAG in a fully-managed serverless manner.”
“We chose Redis Enterprise Cloud on AWS because of its versatility and reliability,” said Sergio Prada, CTO at Metal. “It provides simple data structures for storing messages, methods for fetching slices, and fast vector similarity search and information retrieval, all of which are critical to our platform’s success. We’re thrilled to see Redis and AWS work together to further our mission of deploying LLM applications to production for the enterprise.”