CIO Influence

Deci Partners with Qualcomm to Make Generative AI More Accessible

Deci’s groundbreaking new models streamline the deployment of advanced Generative Artificial Intelligence on the Qualcomm Cloud AI 100 solution, unlocking cost-effective, real-time AI processing

Deci, the deep learning company harnessing artificial intelligence (AI) to build AI, announced it is collaborating with Qualcomm Technologies, Inc. to introduce advanced Generative Artificial Intelligence (AI) models tailored for the Qualcomm Cloud AI 100, Qualcomm Technologies’ performance and cost-optimized AI inference solution designed for Generative AI and large language models (LLMs). This working relationship between the two companies is designed to make AI accessible for a wider range of AI-powered applications, resulting in the democratization of Generative AI’s transformative power for developers everywhere.

“Together with Qualcomm Technologies we are pushing the boundaries of what’s possible in AI efficiency and performance,” said Yonatan Geifman, CEO and co-founder of Deci. “Our joint efforts streamline the deployment of advanced AI models on Qualcomm Technologies’ hardware, making AI more accessible, cost-effective, and economically viable for a wider range of applications. Our work together is a testament to our vision of making the transformational power of generative AI available to all.”

Through the relationship, Deci will work with Qualcomm Technologies to launch two groundbreaking models. The first, DeciCoder-6B, is a 6 billion parameter code-generation model engineered with a focus on performance at scale. Supporting eight programming languages (C, C#, C++, Go, Rust, Python, Java, JavaScript), it outperforms established models such as CodeGen2.5-7B, StarCoder-7B, and CodeLlama-7B. In fact, in Python, DeciCoder achieves a 3-point lead over models more than twice its size, such as StarCoderBase 15.5B. The model also stands out for its remarkable memory and computational efficiency, boasting 19x higher throughput compared to similar models when running on the Qualcomm Cloud AI 100.

The second model, DeciDiffusion 2.0, is a 732 million parameter text-to-image diffusion model that sets new standards by outperforming Stable Diffusion v1.5, operating at 2.6 times the speed with on-par image quality. Both models are meticulously optimized to leverage the full potential of the Qualcomm Cloud AI 100 solution. These models are designed to enable users across various industries to experience exceptional performance from the outset at a more competitive price point.

Both DeciCoder-6B and DeciDiffusion 2.0 were developed using Deci’s proprietary, hardware-aware Neural Architecture Search technology, AutoNAC™, which democratizes the use of Neural Architecture Search for enterprises of all sizes. The distinctive architecture of both models ensures efficient batch scaling while maintaining minimal memory usage and avoiding any increase in latency. Additionally, the models were designed to handle large batches, enabling maximal utilization of the computational power of the Qualcomm Cloud AI 100 cores. DeciCoder-6B and DeciDiffusion 2.0 have been released under the Apache-2.0 and CreativeML Open RAIL++-M licenses, respectively.

