Nebul Chooses Lightbits to Deliver Powerful AI Cloud

Set to revolutionize sovereign AI services in the EU with an NVMe over TCP software-defined data platform

Lightbits Labs (Lightbits), the inventor of NVMe over TCP and a pioneer in modern, software-defined cloud data platforms, today announced that Nebul has chosen Lightbits as the storage foundation on which to build cutting-edge AI cloud services. Nebul’s service helps companies across the EU unify their data, deploy NVIDIA-based Private AI, and extract actionable insights through a robust and protected cloud infrastructure. Lightbits enables the AI cloud service provider to offer its customers 16x better performance at significantly lower cost than the alternatives while also supporting its sustainability mission. The high-performance, low-latency capabilities of Lightbits, combined with its compatibility with common orchestration environments (including Kubernetes, VMware, OpenShift, and OpenStack), make it an ideal choice for cloud builders and service providers supporting diverse performance-sensitive workloads at scale.

Nebul chose Lightbits for its superior performance, resilience, and comprehensive data services, which include three-way replication and QoS. “We considered the usual storage providers, but even the software-defined ones tend to have a ‘closed recipe’ now in terms of infrastructure. We prefer to build with software that’s open, innovative, and especially performant,” said Arnold Juffer, CEO and Founder of Nebul. “With Lightbits as part of our platform, we can offer our customers 16 times the performance at half the cost of AI services from the hyperscalers.” This remarkable efficiency allows Nebul to improve large language model (LLM) performance and handle latency-sensitive workloads under stress without compromising cost or reliability.

As a service provider, Nebul realized its customers would require block storage for latency-sensitive workloads common in AI, such as RAG model training and inference. These workloads are built on vector, real-time, and other NoSQL databases that demand high performance at scale. The Lightbits cloud data platform scales beyond the petabyte level and delivers up to 75 million IOPS with consistent sub-millisecond tail latency, even under heavy load. This performance profile makes it an ideal solution for vector and other AI-oriented databases, whether they manage real-time AI application data or store training parameters and tags.
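
To make this concrete, the following is a minimal, hypothetical sketch of how a latency-sensitive workload such as a vector database might request NVMe/TCP-backed block storage on Kubernetes through a CSI-provisioned StorageClass. The StorageClass name "lightbits-nvme-tcp", the namespace, and the volume size are illustrative assumptions, not details from the announcement; the example uses the official Kubernetes Python client.

```python
# Hypothetical sketch: request an NVMe/TCP-backed block volume on Kubernetes
# for a latency-sensitive workload such as a vector database.
# The StorageClass name below is an assumption, not part of the announcement.
from kubernetes import client, config


def request_block_volume(namespace: str = "ai-workloads") -> None:
    # Load credentials from ~/.kube/config (use load_incluster_config() inside a pod).
    config.load_kube_config()
    core_v1 = client.CoreV1Api()

    pvc = client.V1PersistentVolumeClaim(
        metadata=client.V1ObjectMeta(name="vector-db-data"),
        spec=client.V1PersistentVolumeClaimSpec(
            access_modes=["ReadWriteOnce"],
            storage_class_name="lightbits-nvme-tcp",  # hypothetical CSI StorageClass
            volume_mode="Block",  # raw block device for low-latency I/O
            resources=client.V1ResourceRequirements(requests={"storage": "1Ti"}),
        ),
    )
    core_v1.create_namespaced_persistent_volume_claim(namespace=namespace, body=pvc)


if __name__ == "__main__":
    request_block_volume()
```

A stateful database pod could then mount the resulting claim as a raw block device, keeping its I/O path on the NVMe over TCP fabric.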

“With Lightbits in place, Nebul can now provide EU-based organizations with a specialty AI cloud that runs on NVIDIA AI Enterprise with certified tools, frameworks, and AI apps to meet the demands of today’s most intensive applications,” commented Eran Kirzner, CEO and co-founder of Lightbits. “The data platform is incredibly flexible, running in container environments like Kubernetes and Azure Kubernetes Service (AKS) and delivering accelerated performance and efficiency for cloud-native applications at scale. We attribute the growing interest and the expansion of cloud service use cases, like AI, to this versatility combined with unparalleled speed, scalability, and cost-efficiency.”

The Lightbits cloud data platform enables enterprises to build AI clouds with extreme performance at scale to capitalize on rapidly expanding AI opportunities.

“Nebul evaluated many systems and conducted rigorous testing, seeking a solution that could handle extreme conditions and high loads without performance degradation. Lightbits emerged as the clear leader, meeting Nebul’s stringent requirements and providing a platform for continuous innovation,” added Jos Keulers, Founder of NVMestorage.com, a Lightbits Luminary Leader Partner. “With a solution attuned to the performance and scalability requirements of AI, we’re in a great position to help our customers architect a future-proof data platform able to support modern AI workloads.”
