
Trying to Win the AI War Will Be Really, Really, Really Expensive

To kick off this year’s TED conference in Vancouver on Monday, Google’s AI boss Demis Hassabis reminded his audience of his best estimate of how much the search giant would spend to develop AI: more than $100 billion.

Hassabis, who runs the renowned DeepMind research lab within Google and is arguably the central figure in Alphabet’s AI efforts, shared the astronomical figure in response to a question about what the competition was doing.

According to a report from The Information last month, Microsoft and OpenAI have drawn up plans to create a $100 billion supercomputer called “Stargate” containing “millions of specialized server chips” to power the ChatGPT creator’s AI.

Naturally, when Hassabis was asked about the rumors surrounding his rivals’ supercomputer and its cost, he was quick to note that Google’s spending could exceed that amount: “We don’t talk about our precise numbers, but I think we invest more than that over time.”

Although the generative AI boom has already sparked a huge surge in investment (AI startups alone raised nearly $50 billion last year, according to Crunchbase data), Hassabis’s comments suggest the competition to lead the AI sector is going to get much more expensive.

This is particularly the case for companies like Google, Microsoft, and OpenAI, all of which are engaged in an intense battle to be the first to develop artificial general intelligence: an AI capable of matching human reasoning and ingenuity.

Big chips

Still, the idea that a company can spend more than $100 billion on a single technology that some think might be overrated is telling.

It’s worth thinking about where these expenses are going. For starters, a large portion of development costs will be spent on chips.

This is one of the most expensive purchases for companies invested in the race to develop smarter AI. Simply put, the more chips you have, the more computing power you have to train AI models on larger volumes of data.

Companies working on large language models, like Google’s Gemini and OpenAI’s GPT-4 Turbo, rely heavily on chips from third-party suppliers like Nvidia. But they are increasingly trying to design their own.


Jensen Huang, CEO of Nvidia, a leading supplier of chips to AI players, presents at Nvidia’s 2024 GTC conference. Justin Sullivan/Getty Images

The general business of training models is also becoming more and more expensive.

Stanford University’s annual AI Index report, released this week, notes that “the costs of training cutting-edge AI models have reached unprecedented levels.”

The report noted that OpenAI’s GPT-4 used “approximately $78 million worth of compute for training,” compared with the $4.3 million used to train GPT-3 in 2020. Google’s Gemini Ultra, meanwhile, cost $191 million to train. The original transformer model that underpins today’s AI cost around $900 to train in 2017.

The report also notes that “there is a direct correlation between the training costs of AI models and their computational requirements,” so if AGI is the end goal, those costs will only skyrocket.

Source: Business Insider
