Nvidia’s A100 is the $10,000 chip powering the race for A.I.

February 24, 2023

Nvidia CEO Jensen Huang speaks at a press conference at The MGM during CES 2018 in Las Vegas on January 7, 2018.

Mandel Ngan | AFP | Getty Images

Software that can write text and draw pictures that look like they were made by humans has sparked a gold rush in the tech industry.

Companies like Microsoft and Google are fighting to integrate cutting-edge AI into their search engines, as multibillion-dollar competitors such as OpenAI and Stable Diffusion race to release their software to the public.

Powering many of these applications is the Nvidia A100, a nearly $10,000 chip that has become one of the most important tools in the artificial intelligence industry.

The A100 has become the “workhorse” for artificial intelligence professionals, said Nathan Benaich, an investor who publishes a newsletter and report covering the AI industry, including a partial list of supercomputers using A100s. According to New Street Research, Nvidia holds 95% of the market for graphics processors that can be used for machine learning.

The A100 is ideally suited to the kind of machine learning models that power tools like ChatGPT, Bing AI, and Stable Diffusion. It can perform many simple calculations simultaneously, which is important for training and using neural network models.
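To make that parallelism concrete, here is a minimal, purely illustrative sketch (assuming PyTorch is installed; the layer and batch sizes are arbitrary) of the kind of bulk arithmetic that dominates neural-network workloads and that GPUs such as the A100 are built to accelerate:

```python
# Illustrative only: a large matrix multiply, the kind of "many simple
# calculations at once" that data-center GPUs accelerate.
# Falls back to the CPU if no GPU is present.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# A toy stand-in for one neural-network layer: 4096x4096 weights
# applied to a batch of 1,024 inputs -- billions of multiply-adds.
weights = torch.randn(4096, 4096, device=device)
batch = torch.randn(1024, 4096, device=device)

activations = batch @ weights  # executed as one massively parallel kernel
print(activations.shape, "computed on", device)
```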

The technology underlying the A100 was originally used to render sophisticated 3D graphics in games. Often referred to as a graphics processor, or GPU, Nvidia’s A100 is these days configured for machine learning tasks and runs in data centers, not inside gaming PCs.

Large companies and start-ups working on software such as chatbots and image generators require hundreds or thousands of Nvidia’s chips, and either purchase them outright or secure access to the computers from a cloud provider.


Hundreds of GPUs are required to train artificial intelligence models such as large language models. The chips need to be powerful enough to crunch terabytes of data quickly and recognize patterns. After that, GPUs like the A100 are also needed for “inference,” that is, using the model to generate text, make predictions, or identify objects inside photos.
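As a rough sketch of the two workloads just described, training, which repeatedly updates a model’s weights, and inference, which only runs the trained model, here is a minimal, hypothetical PyTorch example (the model and the random data are toy placeholders, not anything drawn from the article):

```python
# Illustrative only: a toy model and random data stand in for a real workload.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training: many passes over the data, each one updating the weights.
for step in range(100):
    x = torch.randn(256, 512, device=device)
    y = torch.randint(0, 10, (256,), device=device)
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# Inference: the trained model answers individual requests; no weight updates.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 512, device=device)).argmax(dim=1)
print("predicted class:", prediction.item())
```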

This means AI companies need access to many A100s. Some entrepreneurs in this space even see the number of A100s they have access to as a sign of progress.

“A year ago we had 32 A100s,” Stability AI CEO Emad Mostaque wrote on Twitter in January. “Dream big and stack moar GPUs kids. Brrr.” Stability AI, the company behind the image generator Stable Diffusion, is reportedly valued at over $1 billion.

Stability AI now has access to over 5,400 A100 GPUs, according to one estimate from the State of AI report, which charts and tracks the companies and universities with the largest collections of A100 GPUs, though it does not include cloud providers, which do not publish their numbers.

Nvidia is riding the AI train

Nvidia stands to benefit from the AI hype cycle. During Wednesday’s fiscal fourth-quarter earnings report, although overall sales declined 21%, investors pushed the stock up about 14%, largely because the company’s AI chip business, reported as data center, showed continued growth, recording revenue of more than $3.6 billion, up 11%.

Nvidia’s stock is up 65% so far in 2023, outperforming the S&P 500 and other semiconductor stocks.

Nvidia CEO Jensen Huang couldn’t stop talking about AI during a conference call with analysts on Wednesday, suggesting the recent boom in artificial intelligence is at the heart of the company’s strategy.

“The activity around the AI infrastructure we built, and inferencing using Hopper and Ampere to inference large language models, has just gone through the roof in the last 60 days,” Huang said. “Whatever our view was coming into this year, there is no doubt that the last 60, 90 days have resulted in a pretty dramatic shift.”

Ampere is Nvidia’s code name for the A100 generation of chips. Hopper is the code name for the new generation, which includes the recently shipped H100.

More computers needed

Nvidia A100 processor

NVIDIA

Compared with other kinds of software, such as serving a web page, which uses processing power occasionally in microsecond bursts, machine learning tasks can take up a computer’s entire processing power, sometimes for hours or days.

This means that companies with hit AI products often need to get more GPUs to handle peak periods or improve their models.

These GPUs aren’t cheap. In addition to a single A100 on a card that can be slotted into an existing server, many data centers use a system that includes eight A100 GPUs working together.

This system, Nvidia’s DGX A100, comes with the chips needed and has a suggested price of nearly $200,000. On Wednesday, Nvidia said it would sell cloud access to DGX systems directly.

It’s easy to see how the cost of A100s can add up.

For example, New Street Research estimates that the OpenAI-based ChatGPT model within Bing’s search may require eight GPUs to deliver responses to questions in less than a second.

At that rate, Microsoft would need more than 20,000 eight-GPU servers just to deploy the model in Bing to everyone, suggesting the feature could cost Microsoft $4 billion in infrastructure spending.

“If you’re Microsoft and you want to scale that at the scale of Bing, that’s maybe $4 billion. If you want to scale at the scale of Google, which serves 8 or 9 billion queries every day, you actually need to spend $80 billion on DGXs,” said Antoine Chkaiban, a technology analyst at New Street Research. “The numbers we came up with are huge. But they simply reflect the fact that anyone using such a large language model needs a massive supercomputer while they’re using it.”
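The back-of-the-envelope arithmetic behind those estimates follows directly from the roughly $200,000 DGX A100 price quoted above. A rough sketch (hardware cost only; New Street Research’s own model presumably includes far more than the list price of the systems):

```python
# Rough check of the infrastructure estimates above, using only the
# ~$200,000 DGX A100 list price quoted earlier. Real deployments also
# carry networking, power, and facility costs not modeled here.
DGX_PRICE = 200_000                   # dollars per eight-GPU system

bing_servers = 20_000                 # New Street Research's Bing estimate
print(f"Bing rollout: ${bing_servers * DGX_PRICE / 1e9:.0f} billion")   # ~$4 billion

google_budget = 80_000_000_000        # the $80 billion figure for Google scale
print(f"Google scale: {google_budget / DGX_PRICE:,.0f} DGX systems")    # ~400,000 systems
```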

The latest version of Stable Diffusion, the image generator, was trained on 256 A100 GPUs for a total of 200,000 compute hours, according to information posted online by Stability AI.

At market prices, training the model alone cost $600,000, Stability AI CEO Mostaque said on Twitter, suggesting in a tweet exchange that the price was unusually cheap compared with rivals. That doesn’t count the cost of “inference,” or deploying the model.
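Those figures imply a simple per-GPU-hour rate. A quick sketch of the arithmetic (the wall-clock estimate assumes all 256 GPUs ran concurrently, which the Stability AI post does not spell out):

```python
# Implied numbers from the Stable Diffusion training figures above.
gpu_hours = 200_000    # total A100 compute hours reported
cost = 600_000         # market-price training cost, per Mostaque
gpus = 256             # A100s used

print(f"implied rate: ${cost / gpu_hours:.2f} per A100-hour")       # ~$3.00
# Wall-clock time *if* all 256 GPUs ran in parallel (an assumption):
print(f"~{gpu_hours / gpus / 24:.0f} days of continuous training")  # ~33 days
```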

Nvidia CEO Huang said in an interview with CNBC’s Katie Tarasov that the company’s product is actually cheaper for the amount of computation this kind of model requires.

“We went from a $1 billion data center running CPUs down to a $100 million data center,” Huang said. “Today, if you put $100 million in the cloud and share it with 100 companies, it’s pretty much zero.”

Huang said Nvidia’s GPUs allow startups to train models at a much lower cost than using traditional computer processors.

“Now you could build something like a large language model, like GPT, for $10 million to $20 million,” Huang said. “It’s really, really affordable.”

New competition

Nvidia isn’t the only company making GPUs for artificial intelligence. AMD and Intel have competing graphics processors, and big cloud companies such as Google and Amazon are developing and deploying their own chips specially designed for AI workloads.

Still, “AI hardware remains strongly consolidated to NVIDIA,” according to the State of AI compute report. As of December, more than 21,000 open-source AI papers said they used Nvidia chips.

Most researchers included in the State of AI Compute Index used the V100, Nvidia’s chip that came out in 2017, but the A100 grew fast in 2022 to become the third most used Nvidia chip, just behind a consumer graphics chip originally intended for gaming.

The A100 also has the distinction of being one of the few chips to be placed under export controls for national defense reasons. Last fall, Nvidia said in an SEC filing that the U.S. government had imposed a license requirement barring the export of the A100 and the H100 to China, Hong Kong and Russia.

“The USG indicated that the new license requirement will address the risk that the covered products may be used in, or diverted to, a ‘military end use’ or ‘military end user’ in China and Russia,” Nvidia said in the filing. Nvidia has previously said it adapted some of its chips for the Chinese market to comply with U.S. export restrictions.

The A100’s fiercest competition may be its successor. The A100 was first introduced in 2020, an eternity ago in chip cycles. The H100, introduced in 2022, is starting to be produced in volume. In fact, Nvidia said Wednesday that it recorded more revenue from H100 chips than from A100 chips in its January-ending quarter, although the H100 is more expensive per unit.

According to Nvidia, the H100 is one of the first data center GPUs optimized for transformers, an increasingly important technique used by many of the top modern AI applications. Nvidia said Wednesday that it wants to speed up AI training by over a million percent. That could eventually mean AI companies wouldn’t need as many Nvidia chips.
