Databricks Enhances Data Intelligence Platform with Investment in Mistral AI

Databricks has announced a partnership with Mistral AI and an investment in the company’s Series A funding round, underscoring a shared commitment to advancing generative AI through open-source models. Mistral AI, a prominent European provider of generative AI solutions, now has its open models natively integrated into the Databricks Data Intelligence Platform, deepening the relationship between the two companies.

This integration lets Databricks customers conveniently access Mistral AI’s models through the Databricks Marketplace. Users can also experiment with the models in the Mosaic AI Playground, query them as optimized model endpoints through Mosaic AI Model Serving, and customize them with their own data through model adaptation.

Since the beginning of the year, nearly 1,000 enterprises have used Mistral models on the Databricks platform, running millions of model inferences. With these integrations in place, enterprises can quickly apply Mistral AI’s models to their generative AI applications while upholding the security, data privacy, and governance standards built into the Databricks platform.

“We are delighted to forge this strategic alliance with Databricks, reaffirming our shared commitment to the portability, openness and accessibility of generative artificial intelligence for all. By seamlessly integrating our models into Databricks’ data intelligence platform, we are advancing our shared mission of democratizing AI. This integration marks an important step in extending our innovative solutions to Databricks’ vast customer base and continues to drive innovation and significant advances in AI. Together, we are committed to delivering accessible and transformative AI solutions to users worldwide.” - Arthur Mensch, Founder and CEO of Mistral AI

Mistral AI’s Open Models on Databricks: Mistral 7B and Mixtral 8x7B

Mistral AI has seamlessly integrated its open models into the Databricks platform, offering users access to advanced generative AI capabilities.

Mistral 7B: This compact yet robust dense transformer model, trained with an 8k context length, stands out for its efficiency. With a modest size of 7 billion parameters, Mistral 7B leverages grouped query attention (GQA) and sliding window attention (SWA) in its architecture. For further insights into Mistral 7B, readers can refer to Mistral’s dedicated blog post.

Mixtral 8x7B: This model uses a sparse mixture-of-experts (SMoE) architecture, supports a 32k context length, and handles multiple languages, including English, French, Italian, German, and Spanish. Mixtral 8x7B outperforms Llama 2 70B across various benchmarks and offers faster inference thanks to its SMoE design, activating only about 12 billion of its 45 billion total parameters during inference. To delve deeper into Mixtral 8x7B, readers can explore Mistral’s previously published blog post.

Leveraging Mistral AI’s Models within Databricks Data Intelligence Platform

Mistral AI’s models are available in the Databricks Marketplace, an open hub for data, analytics, and AI powered by the open-source Delta Sharing standard. Through the Marketplace, customers can explore Mistral AI’s models, review their capabilities, and examine practical examples showing how to use them across the Databricks ecosystem, including deploying models with Mosaic AI Model Serving, running batch inference with Spark, and executing model inference in SQL using AI Functions. For more on the Databricks Marketplace and AI model sharing, refer to Databricks’ blog post on the topic.
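
To make the deployment path concrete, here is a minimal sketch of creating a Mosaic AI Model Serving endpoint for a Marketplace-installed Mistral model through the Databricks serving-endpoints REST API. The workspace URL, token, Unity Catalog model name, and workload type are placeholders, and exact payload fields may vary by workspace; treat this as an illustration rather than the official procedure.

```python
# Hypothetical sketch: deploy a Marketplace-installed Mistral model as a
# Mosaic AI Model Serving endpoint via the Databricks REST API.
# All names below are illustrative placeholders.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # assumption: your workspace URL
TOKEN = "<personal-access-token>"                                   # assumption: a valid PAT

payload = {
    "name": "mistral-7b-marketplace",  # endpoint name of your choosing
    "config": {
        "served_entities": [
            {
                # Placeholder three-level Unity Catalog name of the model
                # registered after installing it from the Marketplace.
                "entity_name": "marketplace_models.mistral.mistral_7b_instruct",
                "entity_version": "1",
                "workload_type": "GPU_MEDIUM",   # illustrative; size to the model
                "scale_to_zero_enabled": True,
            }
        ]
    },
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/serving-endpoints",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```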

Seamless Mistral Model Inference with Mosaic AI Model Serving

Mosaic AI Foundation Model APIs, part of Model Serving, give customers immediate access to Mixtral 8x7B and other state-of-the-art models from a serving endpoint, without requiring them to create or maintain their own deployments and endpoints. Explore the Foundation Model APIs documentation for the full range of their capabilities.
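
As an illustration of how such a query might look, the minimal sketch below sends a chat request to a pay-per-token Mixtral 8x7B endpoint over the Model Serving REST interface. The workspace URL, token, and endpoint name are assumptions; check the Serving page in your workspace for the endpoints actually available to you.

```python
# Minimal sketch: query an assumed pay-per-token Mixtral 8x7B endpoint
# exposed by the Foundation Model APIs.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace URL
TOKEN = "<personal-access-token>"                                   # placeholder token
ENDPOINT = "databricks-mixtral-8x7b-instruct"                       # assumed endpoint name

response = requests.post(
    f"{DATABRICKS_HOST}/serving-endpoints/{ENDPOINT}/invocations",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "messages": [
            {"role": "user", "content": "Summarize the benefits of open LLMs in two sentences."}
        ],
        "max_tokens": 128,
        "temperature": 0.2,
    },
    timeout=60,
)
response.raise_for_status()
# Chat endpoints return an OpenAI-style response body.
print(response.json()["choices"][0]["message"]["content"])
```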

Through Databricks Mosaic AI Model Serving, customers access Mistral’s models using the same APIs employed for other Foundation Models. This facilitates seamless deployment, governance, querying, and monitoring of any Foundation Model across various cloud environments and providers, enabling experimentation and productionization of large language models.

Customers can also invoke model inference directly from Databricks SQL using the ai_query function. For a detailed guide, consult the ai_query documentation; a notebook sketch follows below.
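
The snippet below sketches that pattern as it might be run from a Databricks notebook (where spark and display are predefined). The endpoint name and table are placeholders used purely for illustration.

```python
# Sketch: calling ai_query from a Databricks notebook to run batch inference in SQL.
df = spark.sql("""
    SELECT
      ticket_id,
      ai_query(
        'databricks-mixtral-8x7b-instruct',            -- assumed endpoint name
        CONCAT('Classify the sentiment of this support ticket: ', body)
      ) AS sentiment
    FROM support.tickets_sample                         -- placeholder table
    LIMIT 10
""")
display(df)
```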

Tailoring Mistral Models with Mosaic AI Adaptation

Mosaic AI also makes it straightforward to build custom models. Customers can adapt Mistral AI’s models, along with other foundation models, using proprietary datasets to improve a model’s understanding of a specific domain or use case, for example teaching it a company’s vocabulary to boost performance on targeted tasks. Once fine-tuned, the adapted model can be deployed for inference with Mosaic AI Model Serving, offering cost-effective serving and ownership of a distinctive model IP (intellectual property).
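
As a rough illustration of what preparing data for adaptation could look like, the sketch below writes a small prompt/response dataset as JSONL to a Unity Catalog volume path. The record schema and the path are assumptions; the exact training-data format expected by Mosaic AI Foundation Model Adaptation should be taken from the Databricks documentation.

```python
# Illustrative sketch: prepare an instruction-tuning dataset for model adaptation.
# The prompt/response schema and the volume path are placeholders.
import json

examples = [
    {
        "prompt": "Customer: How do I reset my router?\nAgent:",
        "response": "Unplug the router for 30 seconds, plug it back in, and wait for the status light to turn green.",
    },
    {
        "prompt": "Customer: What is your refund window?\nAgent:",
        "response": "Refunds are available within 30 days of purchase with a valid receipt.",
    },
]

# Placeholder Unity Catalog volume path; the resulting file would be referenced
# when configuring an adaptation (fine-tuning) run against a Mistral base model.
train_path = "/Volumes/main/default/training_data/support_finetune.jsonl"
with open(train_path, "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```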

Interactive Inference in the Mosaic AI Playground

Pre-trained and fine-tuned Mistral models can also be tried interactively in the Mosaic AI Playground, accessible from the Databricks console. The Playground supports multi-turn conversations, experimentation with inference sampling parameters such as temperature and max_tokens, and side-by-side comparison of different models to evaluate response quality and performance characteristics.
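
For teams that want to reproduce this kind of comparison programmatically, the sketch below sweeps the temperature parameter against the same assumed Mixtral endpoint used earlier; the endpoint name, workspace URL, and token are placeholders.

```python
# Sketch: a programmatic analog of the Playground's side-by-side comparison,
# sweeping temperature against an assumed Mixtral chat endpoint.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                                   # placeholder
ENDPOINT = "databricks-mixtral-8x7b-instruct"                       # assumed endpoint name

prompt = [{"role": "user", "content": "Write a one-line tagline for a data platform."}]

for temperature in (0.1, 0.7, 1.0):
    r = requests.post(
        f"{DATABRICKS_HOST}/serving-endpoints/{ENDPOINT}/invocations",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"messages": prompt, "temperature": temperature, "max_tokens": 64},
        timeout=60,
    )
    r.raise_for_status()
    print(f"temperature={temperature}: {r.json()['choices'][0]['message']['content']}")
```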

Databricks and Mistral AI Forge Strategic Partnership

Databricks Ventures welcomes Mistral AI into its portfolio, marking a new stage in the two companies’ collaboration. With Mistral AI models now integrated into the Databricks ecosystem, users have access to a comprehensive set of tools for building, evaluating, and deploying end-to-end generative AI applications. Whether they start from pre-trained models or build customized solutions, Databricks offers multiple pathways to get projects started quickly.

For users seeking higher accuracy on specific use cases, customizing Mistral AI models with proprietary data through Mosaic AI Foundation Model Adaptation is a cost-effective and straightforward option.

Databricks ensures efficient and secure serverless inference, underpinned by its unified approach to governance and security. This ensures that enterprises can confidently deploy AI solutions leveraging Mistral AI models on the Databricks platform, combining top-notch foundation models with Databricks’ unwavering commitment to data privacy, transparency, and control.

Customer Success Stories: Leveraging Mistral AI’s Models on Databricks

Databricks customers are reaping the rewards of integrating Mistral AI’s models into their workflows, as evidenced by the following testimonials:

James Lin, Head of AI/ML Innovation at Experian:

“At Experian, we’re focused on developing Gen AI models with the lowest rates of hallucination while preserving core functionality. Leveraging the Mixtral 8x7b model on Databricks has significantly expedited our prototyping process, showcasing its superior performance and rapid response times.”

Luv Luhadia, Global Alliance at Celebal Technologies:

“Databricks stands at the forefront of driving innovation and adoption for generative AI in the enterprise realm. Our partnership with Mistral on Databricks has yielded remarkable results, particularly evident in our RAG-based consumer chatbot. Previously reliant on FAQ-based systems, our chatbot struggled to handle the variation in user queries. However, with the implementation of Mistral-based technology, our chatbot now adeptly addresses user queries, boosting system accuracy from 80% to an impressive 95%. Mistral’s cutting-edge technology and expertise have significantly enhanced performance for our customers, and we eagerly anticipate further collaboration with Mistral and Databricks to explore the full potential of data and AI.”

FAQs

1. What is the significance of the partnership between Databricks and Mistral AI?

The partnership between Databricks and Mistral AI signifies a strategic collaboration to enhance generative AI development. Mistral AI’s models are now fully integrated into the Databricks Data Intelligence Platform, offering users advanced capabilities for building, testing, and deploying AI applications.

2. How do users benefit from Mistral AI models integrated into the Databricks platform?

Users can leverage Mistral AI’s models within the Databricks ecosystem for various purposes, including model deployment, batch inference, and SQL-based model inference. The integration provides seamless access to advanced AI functionalities, empowering users to streamline their workflows and improve efficiency.

3. What options are available for consuming Mistral AI models on Databricks?

Users have several options for consuming Mistral AI models on Databricks, including querying pre-trained models on a pay-per-token basis and comparing them side by side in the Mosaic AI Playground. Additionally, users can customize Mistral AI models with proprietary data through Mosaic AI Foundation Model Adaptation for improved accuracy on specific use cases.

4. How does Databricks ensure efficient and secure serverless inference with Mistral AI models?

Databricks ensures efficient and secure serverless inference by employing a unified approach to governance and security. This approach enables enterprises to deploy AI solutions leveraging Mistral AI models on the Databricks platform confidently while maintaining stringent data privacy, transparency, and control standards.

5. What is Mistral AI?

Mistral AI is a prominent provider of generative AI solutions, recognized for its innovative approach to AI model development and deployment. The company specializes in developing open models for tasks such as natural language processing (NLP) and language generation.
