
Petuum Unveils Enterprise MLOps Platform

Petuum CEO Aurick Qiao, PhD, and Director of Engineering Tong Wen, PhD, demonstrated the new Petuum Platform for scaling enterprise MLOps and announced that they are now accepting applications for private beta customers.

In their talk "Supercharging MLOps with Composability, Automation, and Scalability" at Open Data Science Conference (ODSC) East, Aurick Qiao, PhD, and Tong Wen, PhD, of machine learning startup Petuum unveiled their new enterprise MLOps platform for AI/ML teams, now in private beta.

Petuum helps enterprise AI/ML teams operationalize and scale their machine learning pipelines to production with the world’s first composable platform for MLOps. After years of development at CMU, Berkeley, and Stanford, as well as dozens of customer engagements in finance, healthcare, energy, and heavy industry, Petuum announced a limited release of their platform through an exclusive private beta for select customers.


“We have spent the last five years working with customers on the hard problems in MLOps, and have learned how to multiply AI team productivity through extensive research. The Petuum Platform helps AI teams do more with less.” – Aurick Qiao, CEO

Petuum’s enterprise MLOps platform is built around principles of composability, openness, and extensibility. With universal standards for data, pipelines, and infrastructure, AI applications can be built from reusable building blocks and managed as part of a repeatable, assembly-line process. Petuum’s users don’t need infrastructure or DevOps expertise, glue code, or manual tuning, and can instead focus on rapidly deploying more projects in less time, with fewer resources, and with less help from others.


“In training alone, we have seen a 3 to 8 times improvement in time to value. The infrastructure orchestration and Pythonic deployment system are easy enough for a data scientist to use.” – Tong Wen, Director of Engineering

The end-to-end platform includes the AI OS, a low/no-code layer of Kubernetes optimized for AI workloads. Universal Pipelines let users without deep infrastructure expertise compose and execute directed acyclic graphs (DAGs) built from modular DataPacks for any kind of data. The low/no-code Deployment Manager can upgrade, reuse, and reconfigure pipelines in production, with built-in observability and user management. The platform also hosts an experiment manager for amortized autotuning that optimizes pipelines of models and systems.
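Petuum’s own API is not shown here, so as a rough illustration of the general pattern, the following is a minimal, hypothetical Python sketch of composing and executing a pipeline as a DAG of reusable steps. The Step and run_pipeline names, and the ingest/clean/train example, are invented for this illustration and are not Petuum identifiers.

```python
# Hypothetical sketch of a composable DAG pipeline; names and structure are
# illustrative only and do not reflect Petuum's actual API.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List


@dataclass
class Step:
    name: str
    fn: Callable[..., Any]                         # transformation applied to upstream outputs
    deps: List[str] = field(default_factory=list)  # names of upstream steps


def run_pipeline(steps: List[Step]) -> Dict[str, Any]:
    """Execute steps in dependency order, passing upstream outputs downstream."""
    by_name = {s.name: s for s in steps}
    results: Dict[str, Any] = {}

    def run(name: str) -> Any:
        if name not in results:
            step = by_name[name]
            inputs = [run(d) for d in step.deps]   # resolve dependencies first
            results[name] = step.fn(*inputs)
        return results[name]

    for s in steps:
        run(s.name)
    return results


# Example: ingest -> clean -> train, each step a reusable building block.
pipeline = [
    Step("ingest", lambda: [1.0, 2.0, None, 4.0]),
    Step("clean", lambda rows: [r for r in rows if r is not None], deps=["ingest"]),
    Step("train", lambda rows: {"mean": sum(rows) / len(rows)}, deps=["clean"]),
]

print(run_pipeline(pipeline))
```

In a managed platform, the same composition idea would typically be expressed through versioned, reusable components rather than inline lambdas, so pipelines can be upgraded and reconfigured in production without rewriting glue code.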

Petuum’s award-winning team has grown out of the CASL open source consortium and comprises thought leaders across all categories of machine learning operations. Petuum’s customers have seen improvements of 50% or more in time to value and in the productivity of ML teams and resources, and these efficiencies increase with scale.

“This is the Petuum omniverse. With Petuum AI OS you can wrap up anything and everything, as long as it runs with Docker and normal compute systems. In that sense, you not only have this graph system, you also want to standardize all of your pipelines.” – Guowei He, Inception Institute of Artificial Intelligence


