
Oracle Cloud Infrastructure Utilized by Microsoft for Bing Conversational Search


Oracle announced a multi-year agreement with Microsoft to support the explosive growth of AI services. Microsoft is using Oracle Cloud Infrastructure (OCI) AI infrastructure, alongside Microsoft Azure AI infrastructure, for inference on the AI models being optimized to power Microsoft Bing conversational searches daily. By leveraging the Oracle Interconnect for Microsoft Azure, Microsoft can use managed services such as Azure Kubernetes Service (AKS) to orchestrate OCI Compute at massive scale and meet the growing demand for Bing conversational search.
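To make the orchestration pattern concrete, here is a minimal, hypothetical sketch using the official Kubernetes Python client: it assumes GPU worker nodes provisioned on OCI are already joined to the cluster and reachable over the interconnect, and that they carry a node label the scheduler can target. The `gpu-pool: oci-a100` label, container image, and replica count are illustrative assumptions, not Oracle's or Microsoft's actual configuration.

```python
# Minimal sketch: scheduling a GPU inference Deployment onto labeled worker nodes.
# Assumes kubeconfig access to an AKS cluster whose GPU nodes (e.g. OCI Compute
# reachable over the interconnect) carry the hypothetical label "gpu-pool: oci-a100".
# Requires the official `kubernetes` Python client.
from kubernetes import client, config

config.load_kube_config()  # use local kubeconfig credentials

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="conversational-search-inference"),
    spec=client.V1DeploymentSpec(
        replicas=4,  # illustrative replica count
        selector=client.V1LabelSelector(match_labels={"app": "inference"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "inference"}),
            spec=client.V1PodSpec(
                node_selector={"gpu-pool": "oci-a100"},  # hypothetical node label
                containers=[
                    client.V1Container(
                        name="model-server",
                        image="registry.example.com/inference-server:latest",  # placeholder image
                        resources=client.V1ResourceRequirements(
                            limits={"nvidia.com/gpu": "8"}  # request all GPUs on an 8-GPU node
                        ),
                    )
                ],
            ),
        ),
    ),
)

# Create the Deployment; Kubernetes then places pods only on the labeled GPU nodes.
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```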



Bing conversational search requires powerful clusters of computing infrastructure to support the evaluation and analysis of search results conducted by Bing’s inference model.

“Generative AI is a monumental technological leap and Oracle is enabling Microsoft and thousands of other businesses to build and run new products with our OCI AI capabilities,” said Karan Batta, senior vice president, Oracle Cloud Infrastructure. “By furthering our collaboration with Microsoft, we are able to help bring new experiences to more people around the world.”

“Microsoft Bing is leveraging the latest advancements in AI to provide a dramatically better search experience for people across the world,” said Divya Kumar, global head of marketing for Search & AI at Microsoft. “Our collaboration with Oracle and use of Oracle Cloud Infrastructure, along with our Microsoft Azure AI infrastructure, will expand access to customers and improve the speed of many of our search results.”


Inference models require thousands of compute and storage instances and tens of thousands of GPUs that can operate in parallel as a single supercomputer over a multi-terabit network.

OCI Superclusters include OCI Compute Bare Metal instances, ultra-low latency RDMA cluster networking, and a choice of HPC storage. OCI Superclusters can scale up to 4,096 OCI Compute Bare Metal instances with 32,768 A100 GPUs or 16,384 H100 GPUs, and petabytes of high-performance clustered file system storage to efficiently process massively parallel applications.
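As a quick sanity check on those figures, the A100 ceiling is consistent with eight GPUs per bare metal instance; the short sketch below simply works through that arithmetic. The eight-GPU-per-node figure is inferred from the stated totals, not quoted from Oracle.

```python
# Back-of-the-envelope check of the quoted OCI Supercluster scale.
# Assumption: each OCI Compute Bare Metal GPU instance carries 8 GPUs
# (inferred from 32,768 / 4,096 in the announcement, not an Oracle quote).
MAX_BARE_METAL_INSTANCES = 4_096
GPUS_PER_INSTANCE = 8

a100_total = MAX_BARE_METAL_INSTANCES * GPUS_PER_INSTANCE
print(a100_total)  # 32768, matching the stated A100 ceiling

# At the same GPUs-per-node ratio, the stated H100 ceiling of 16,384
# would correspond to 2,048 eight-GPU instances.
h100_instances = 16_384 // GPUS_PER_INSTANCE
print(h100_instances)  # 2048
```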


