
Numenta Transforms the AI Landscape with NuPIC

New AI platform leverages neuroscience discoveries to finally enable large language models on CPUs

Numenta, maker of powerful, scalable, neuroscience-based AI solutions, announced its commercial AI product, the Numenta Platform for Intelligent Computing (NuPIC). Based on two decades of neuroscience research, NuPIC leverages Numenta’s unique architecture, data structures, and algorithms to enable the efficient deployment of Large Language Models (LLMs) on CPUs. The result is the first AI platform that delivers disruptive performance, substantial cost savings, and business-critical privacy, security, and control. Designed to allow any developer or software engineer to get up and running easily, NuPIC requires no deep learning experience.


“We recognize that people are in a wave of AI confusion. Everyone wants to reap the benefits, but not everyone knows where to start or how to achieve the performance they need to put LLMs into production,” said Subutai Ahmad, CEO of Numenta. “As the only platform based on the Thousand Brains Theory of Intelligence, NuPIC delivers performance results that elevate CPUs to be the ideal platform for running LLMs. With our optimized inference server, model library, and training module, you can select the right model for your unique business needs, fine-tune it on your data, and run it at extremely high throughput and low latency on CPUs, significantly faster than on an NVIDIA A100 GPU — all with utmost security and privacy.”

NuPIC offers many firsts that fundamentally transform the AI landscape. Namely, NuPIC:

Runs AI models efficiently on CPUs — With NuPIC, customers experience consistently high throughput and low latency in inference using only CPUs, eliminating the need for complex, expensive, and hard-to-obtain GPU infrastructures.

Provides uncompromised data privacy, security, and control — NuPIC runs entirely within the customer’s infrastructure, either on-premises or via private cloud on any major cloud provider. Unlike alternatives that require sending internal data to external SaaS services, NuPIC lets customers maintain complete control over their data and models; internal data never leaves company walls. This level of privacy ensures consistent, reliable behavior, lower costs, and enhanced data compliance.


Offers a variety of optimized, production-ready LLMs — NuPIC’s flexible model library makes it easy to choose the right tool for the right job. With a range of production-ready models from BERTs to GPTs, customers can choose to optimize for accuracy or speed. Additionally, they can use existing models and create as many customized versions as they want, when they want.

Allows customers to quickly prototype LLM-based solutions and then deploy them at scale — No large or experienced machine learning team is required, and NuPIC is backed by a dedicated team of AI experts who help provide a seamless experience in deploying LLMs in production. Because NuPIC is delivered as a Docker container, customers can rapidly prototype, iterate, and scale their AI solutions using standard MLOps tools and processes, as sketched in the illustrative example below.
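
For teams picturing what this looks like in practice, here is a minimal, purely illustrative sketch of a container-based deployment and a client call. The image name, port, endpoint path, and model identifier are hypothetical placeholders assumed for illustration; they are not details published by Numenta in this announcement.

# Illustrative sketch only: "nupic-inference-server", port 8000, the
# /v1/embeddings endpoint, and the "bert-optimized" model name are
# hypothetical placeholders, not published NuPIC details.
#
# 1) Start the containerized inference server inside your own infrastructure:
#       docker run -d -p 8000:8000 nupic-inference-server:latest
#
# 2) Call it from standard tooling, for example a small HTTP client:
import requests

def embed(texts, url="http://localhost:8000/v1/embeddings"):
    """Send a batch of texts to the (hypothetical) CPU-backed inference server."""
    response = requests.post(
        url,
        json={"model": "bert-optimized", "inputs": texts},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["embeddings"]

if __name__ == "__main__":
    vectors = embed(["What does NuPIC do?", "Run LLMs efficiently on CPUs."])
    print(f"Received {len(vectors)} embeddings")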

These unique features translate to significant business advantages. Customers can finally harness the power of LLMs on more accessible CPUs, achieving dramatic throughput and latency improvements of 10-100x on Intel 4th Gen Xeon Scalable Processors compared with other standard CPUs, and up to a 35x throughput improvement compared to NVIDIA A100 GPUs. They can choose the right LLM for their application, fine-tune it using custom data, scale easily, and run bigger models without increased budgets. Finally, NuPIC lets organizations maintain complete control over their data to ensure privacy.

Gallium Studios, co-founded by Will Wright, creator of The Sims, and Lauren Elliott, creator of Where in the World Is Carmen Sandiego?, is among the first companies using NuPIC. “Our latest game, Proxi, is an expansive interactive world populated by your personal memories and connections. We turned to Numenta because of fundamental challenges we faced in incorporating AI – not only to deliver the best experience possible to our players, but also to ensure that we never jeopardize the trust and privacy they place in us,” said Lauren Elliott, Co-Founder and CEO of Gallium Studios. “With NuPIC, we can run LLMs with incredible performance on CPUs and use both generative and non-generative models as needed. And, because everything is on-prem, we have full control of our models and data. Over time, Numenta’s cutting-edge neuroscience-driven research will enable us to build simulated AI players that continuously learn, adapt, and behave in a truly intelligent fashion. We are excited by the possibilities!”


