
Just when we thought we were safe, ChatGPT is coming for our graphics cards


With the rise of ChatGPT and large language models, should gamers worry about a repeat of the cryptocurrency-driven GPU shortage of the past couple of years?
Everyone seems to be talking about ChatGPT nowadays thanks to Microsoft Bing, but given the nature of large language models (LLMs), a gamer would be forgiven if they feel a certain déjà vu.
See, even though LLMs run on huge cloud servers, they rely on specialized GPUs for the training they need before they can run at all. Training usually means feeding a downright obscene amount of data through neural networks running on arrays of GPUs equipped with sophisticated tensor cores, and not only does this consume a lot of power, it also requires a lot of actual GPUs to do at scale.
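To make the "feed data through, update weights" loop concrete, here is a minimal, illustrative sketch in plain Python with NumPy. It is nothing like a real LLM (which has billions of parameters and runs across many GPUs); it trains a single linear layer by gradient descent, which is the same basic loop scaled down to toy size. All names and values here are invented for illustration.

```python
import numpy as np

# Toy stand-in for a training run: one linear "layer" fit by gradient
# descent. Real LLM training repeats this loop with billions of
# parameters across arrays of tensor-core GPUs.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 8))        # toy "dataset" of 256 examples
true_w = rng.normal(size=(8,))       # hidden target parameters
y = X @ true_w                       # training targets

w = np.zeros(8)                      # model parameters, start at zero
lr = 0.05                            # learning rate
for step in range(200):              # the training loop
    pred = X @ w                     # forward pass
    grad = X.T @ (pred - y) / len(X) # gradient of mean squared error
    w -= lr * grad                   # parameter update

mse = float(np.mean((X @ w - y) ** 2))
print(f"final MSE: {mse:.6f}")
```

The expensive part at LLM scale is exactly the two matrix products in the loop body; tensor cores exist to accelerate those, which is why this workload, like cryptomining before it, is bottlenecked on GPU supply.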
This sounds a lot like cryptomining, but it isn't quite. Cryptomining has nothing to do with machine learning algorithms, and unlike machine learning, its only output is a highly speculative digital commodity called a token, which some people believe is worth something and are therefore willing to spend real money on.
This gave rise to a cryptobubble that drove a GPU shortage over the past two years, as cryptominers bought up all the Nvidia Ampere graphics cards from 2020 through 2022, leaving gamers out in the cold. That bubble has since popped, and GPU stock has stabilized.
But with the rise of ChatGPT, are we about to see a repeat of the past two years? It's unlikely, but it's also not out of the question either.

Your graphics card is not going to drive major LLMs
While you might assume the best graphics card you can buy is exactly what machine learning types want for their setups, you'd be wrong. Unless you're at a university researching machine learning algorithms, a consumer graphics card isn't going to be enough to drive the kind of algorithm you need.
