
Bing Chat is so power-hungry, Microsoft had to borrow Oracle's servers

Microsoft is using Oracle Cloud Infrastructure (OCI), alongside its own Microsoft Azure AI infrastructure, to run inference for the AI models behind Bing Chat.

Published By: Shubham Verma

Published: Nov 08, 2023, 11:56 AM IST

Bing Chat is apparently very hungry for GPUs.

Story Highlights

  • Microsoft has signed a multi-year deal with Oracle.
  • Oracle will provide Microsoft with GPUs for extra power.
  • Bing Chat will use these GPUs to run more smoothly.

Bing Chat's adoption has grown enormously since its launch. Demand for Microsoft's generative artificial intelligence (AI) service is now so high that the Redmond-based giant is struggling to handle it all on its own, so it is turning to Oracle for help. The cloud major has announced a multi-year agreement with Microsoft to support the growth of its AI services, in particular Bing Chat. Microsoft is using Oracle Cloud Infrastructure (OCI) AI infrastructure, along with Microsoft Azure AI infrastructure, to run inference on the AI models being optimised to power Microsoft Bing conversational searches daily.

Leveraging the Oracle Interconnect for Microsoft Azure, Microsoft is able to use managed services like Azure Kubernetes Service (AKS) to orchestrate OCI Compute at massive scale to support the growing demand for Bing conversational search, Oracle said in a statement. Because Microsoft needs more computing resources to keep pace with that demand, it has to plug in many more GPUs. Oracle happens to have tens of thousands of Nvidia A100 and H100 GPUs, which it has agreed to lease to Microsoft.

“Generative AI is a monumental technological leap and Oracle is enabling Microsoft and thousands of other businesses to build and run new products with our OCI AI capabilities,” said Karan Batta, senior vice president, Oracle Cloud Infrastructure. “By furthering our collaboration with Microsoft, we are able to help bring new experiences to more people around the world,” Batta added.

Bing conversational search requires powerful clusters of computing infrastructure to support the evaluation and analysis of search results by Bing's inference model. “Our collaboration with Oracle and use of Oracle Cloud Infrastructure, along with our Microsoft Azure AI infrastructure, will expand access to customers and improve the speed of many of our search results,” said Divya Kumar, global head of marketing for Search & AI at Microsoft.

Inference models require thousands of compute and storage instances and tens of thousands of GPUs that can operate in parallel as a single supercomputer over a multi-terabit network.
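Neither company has published details of the serving stack, but the parallel fan-out described above can be sketched in miniature. In this hypothetical Python sketch, `run_inference` stands in for a model forward pass and `serve_batch` for the dispatcher; a real deployment would route requests to GPU workers over the network, not to local threads:

```python
from concurrent.futures import ThreadPoolExecutor

def run_inference(query: str) -> str:
    # Stand-in for a model forward pass; in a real cluster each
    # call would execute on a dedicated GPU worker.
    return f"answer for {query!r}"

def serve_batch(queries, workers=4):
    # Fan queries out across parallel workers, much as an
    # inference cluster fans requests across its GPUs,
    # then collect results in the original order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_inference, queries))
```

The point of the sketch is only the shape of the system: many independent requests dispatched concurrently, with the orchestration layer (AKS, in Microsoft's case) deciding where each one runs.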


— Written with inputs from IANS


