innumerate
no, illiterate. As in "they can't read or understand mathematical equations".
you're making out like everyone who has ever placed an order for blackwell chips has ordered 2-3x what they needed.
Yes. Ordering 50% to 100% more than they need is pretty typical, especially when they are concerned about competition. They cancel the excess later.
faster hardware is necessary to resolve this
that's not how training works. You can train LLMs just fine with slower chips; it just takes longer. faster training means you can update your model to handle new information in less time, but the vast majority of LLM queries don't involve as much new information as you might expect, hence the fastest chips only being necessary for the "tip of the spear" runs and the bulk of data center chips being slower models. WHY DO YOU THINK DEEPSEEK WAS SUCH A BIG THING?
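here's the back-of-envelope version of that argument. every number below is made up purely to show the scaling, not real Blackwell or H100 specs:

```python
# toy model: training wall-clock time scales roughly inversely with aggregate chip throughput
# all figures are hypothetical, for illustration only

def training_days(total_flops, num_chips, flops_per_chip, utilization=0.4):
    """Days to burn through a fixed training compute budget at a given fleet throughput."""
    seconds = total_flops / (num_chips * flops_per_chip * utilization)
    return seconds / 86_400  # seconds per day

BUDGET = 1e25  # hypothetical total training budget, in FLOPs

fast = training_days(BUDGET, num_chips=10_000, flops_per_chip=2e15)  # newest chips
slow = training_days(BUDGET, num_chips=10_000, flops_per_chip=1e15)  # last-gen chips

print(f"newest chips: ~{fast:.0f} days, last-gen chips: ~{slow:.0f} days")
# same model comes out either way -- the slower fleet just finishes later
```

point being: slower chips cost you calendar time, not capability.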
nice, quick little chips
they are nice, quick little chips. they're a bitch to maintain, and expensive, but they are nice, quick little chips. They're like the extra little spice in your kitchen that makes all the difference in the final dish, but they are by no means the meat-and-potatoes of the next AI revolution; buying them for everything is complete overkill. And yes, the stock has captured the minds of a bunch of brain-dead millennial gamers who remember seeing "made with NVIDIA" brand videos at the beginning of every video-game play session. That's probably their greatest strength: the level of branding they created in the minds of 10-year-olds back in the early 2000s.
But those 10-year-olds aren't the ones placing orders for nvda's AI chips right now...big companies are. The 10-year-olds are the idiots who say things like "next stop, the moon!"
fucking sad man
but glorious to watch what's coming