Google Cloud and Hugging Face Announce Strategic Partnership to Accelerate Generative AI and ML Development

Developers will be able to train, tune, and serve open models quickly and cost-effectively on Google Cloud

Google Cloud and Hugging Face announced a new strategic partnership that will allow developers to utilize Google Cloud’s infrastructure for all Hugging Face services, and will enable training and serving of Hugging Face models on Google Cloud.

The partnership advances Hugging Face’s mission to democratize AI and furthers Google Cloud’s support for open source AI ecosystem development. With this partnership, Google Cloud becomes a strategic cloud partner for Hugging Face, and a preferred destination for Hugging Face training and inference workloads. Developers will be able to easily utilize Google Cloud’s AI-optimized infrastructure including compute, tensor processing units (TPUs), and graphics processing units (GPUs) to train and serve open models and build new generative AI applications.

Google Cloud and Hugging Face will partner closely to help developers train and serve large AI models more quickly and cost-effectively on Google Cloud, including:

  • Giving developers a way to train, tune, and serve Hugging Face models with Vertex AI in just a few clicks from the Hugging Face platform, so they can easily utilize Google Cloud’s purpose-built, end-to-end MLOps services to build new gen AI applications (see the illustrative sketch after this list).
  • Supporting Google Kubernetes Engine (GKE) deployments, so developers on Hugging Face can also train, tune, and serve their workloads with “do it yourself” infrastructure and scale models using Hugging Face-specific Deep Learning Containers on GKE.
  • Providing more open source developers with access to Cloud TPU v5e, which offers up to 2.5x more performance per dollar and up to 1.7x lower latency for inference compared to previous versions.
  • Adding future support for A3 VMs, powered by NVIDIA’s H100 Tensor Core GPUs, which offer 3x faster training and 10x greater networking bandwidth compared to the prior generation.
  • Utilizing Google Cloud Marketplace to provide simple management and billing for the Hugging Face managed platform, including Inference, Endpoints, Spaces, AutoTrain, and others.
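
As an illustration of the Vertex AI workflow described in the first item above, the following is a minimal sketch of serving an open model from the Hugging Face Hub on a Vertex AI endpoint using the google-cloud-aiplatform Python SDK. The project ID, region, container image URI, machine type, and environment variables are illustrative assumptions rather than details confirmed by the announcement; the partnership’s actual “few clicks” flow starts from the Hugging Face platform itself.

# Minimal sketch (all names below are assumptions): register a Hugging Face Hub
# model with Vertex AI and deploy it to a managed prediction endpoint.
from google.cloud import aiplatform

# Assumed project and region, for illustration only.
aiplatform.init(project="my-gcp-project", location="us-central1")

# The serving container URI is a placeholder; in practice you would point at a
# Hugging Face Deep Learning Container image published for Google Cloud.
model = aiplatform.Model.upload(
    display_name="distilbert-sst2",
    serving_container_image_uri="us-docker.pkg.dev/example-project/containers/hf-inference:latest",  # placeholder
    serving_container_environment_variables={
        "HF_MODEL_ID": "distilbert-base-uncased-finetuned-sst-2-english",  # model on the Hugging Face Hub
        "HF_TASK": "text-classification",  # assumed container convention
    },
)

# Deploy to a managed endpoint; the machine type is an arbitrary example.
endpoint = model.deploy(machine_type="n1-standard-8")

# Send a prediction request to the deployed model.
response = endpoint.predict(instances=[{"inputs": "Vertex AI makes serving open models straightforward."}])
print(response.predictions)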

“Google Cloud and Hugging Face share a vision for making generative AI more accessible and impactful for developers,” said Thomas Kurian, CEO at Google Cloud. “This partnership ensures that developers on Hugging Face will have access to Google Cloud’s purpose-built AI platform, Vertex AI, along with our secure infrastructure, which can accelerate the next generation of AI services and applications.”

“From the original Transformers paper to T5 and the Vision Transformer, Google has been at the forefront of AI progress and the open science movement,” said Clement Delangue, CEO of Hugging Face. “With this new partnership, we will make it easy for Hugging Face users and Google Cloud customers to leverage the latest open models together with leading optimized AI infrastructure and tools from Google Cloud, including Vertex AI and TPUs, to meaningfully advance developers’ ability to build their own AI models.”
