Commissar SFLUFAN Posted June 5

Nvidia passes Apple in market cap as second-most valuable public U.S. company
WWW.CNBC.COM
Investors are becoming more comfortable that Nvidia's huge growth in sales to a handful of cloud companies can persist.

Quote: Nvidia passed Apple in market cap on Wednesday as investors continue betting on the chipmaker behind the artificial intelligence boom. It is now the second-most valuable public company, behind Microsoft. Nvidia also hit a $3 trillion market cap milestone on Wednesday after shares rose over 5%. At market close, Nvidia had a market value of $3.019 trillion, versus Apple's $2.99 trillion. Microsoft is the most valuable publicly traded company, with a market cap of $3.15 trillion as of Wednesday. Nvidia shares have risen more than 24% since the company reported first-quarter earnings in May and have been on a tear since last year. The company has an estimated 80% market share in AI chips for data centers, which are attracting billions of dollars in spending from big cloud vendors. Investors are also becoming more comfortable that Nvidia's huge growth in sales to a handful of cloud companies can persist. For the most recent quarter, revenue in its data center business, which includes its GPU sales, rose 427% from a year earlier to $22.6 billion, about 86% of the company's overall sales.
chakoo Posted June 5

I wonder if there is a point where Nvidia makes so much from AI that they give up on gaming, because not tying up production capacity on gaming parts would make them more money.
TUFKAK Posted June 5

My taxable brokerage had a good day
Mr.Vic20 Posted June 5

4K GPUs, here we go!
Spork3245 Posted June 5

Yeah, they've had a pretty good 20-ish months. Fun fact: if you invested $10k in NVDA 10 years ago, it'd be worth about $2.4m right now
stepee Posted June 5

yeah it's been insane
Brick Posted June 5

Good?
unogueen Posted June 5

Nvidia has no reason to cut gaming GPU dies when pro dies sell for 100x more.
Massdriver Posted June 6

Nearly bought Nvidia several times, but never did. Truly remarkable, and congratulations to those that did. This momentum will not continue at this pace for much longer; I can't see them topping $7 trillion in the next 2 years. My guess is the increases will slow down.
Reputator Posted June 6

This has gotta be one of the fastest growth runs in recent tech industry history.
b_m_b_m_b_m Posted June 6

Selling shovels
Reputator Posted June 6

As a fun aside, when I signed up for Robinhood close to two years ago, they gave me a tiny portion of NVIDIA stock. As in, 0.031582 of one NVIDIA share. It's now worth $39.
legend Posted June 6

I am begging AMD to compete on AI.
stepee Posted June 6

1 minute ago, legend said: I am begging AMD to compete on AI.

so many people are!
elbobo Posted June 6

9 hours ago, chakoo said: I wonder if there is a point where Nvidia makes so much from AI that they give up on gaming, because not tying up production capacity on gaming parts would make them more money.

This is a legit concern if you are a PC gamer. Their gaming GPUs are now a very small part of their revenue and an even smaller portion of their earnings, and the ratio will likely only continue in one direction. At some point they might just wash their hands of it. The only saving grace I can see would be Jensen carrying on just because he genuinely likes them, and at this point no one is going to question him.
BloodyHell Posted June 6

The new 4080 TI I picked up last week must have put them over the top…
chakoo Posted June 6

11 hours ago, legend said: I am begging AMD to compete on AI.

You have to wonder how much Sony is kicking themselves for not keeping up with their crazy architecture on the side.
Mr.Vic20 Posted June 6

Just now, chakoo said: You have to wonder how much Sony is kicking themselves for not keeping up with their crazy architecture on the side.

Who's laughing now? Ken is laughing!
legend Posted June 6

48 minutes ago, chakoo said: You have to wonder how much Sony is kicking themselves for not keeping up with their crazy architecture on the side.

Not all that much. Crazy architectures mean more work because there are fewer standards and libraries that can be applied. The cost of that outweighs the benefits. For example, back in the day I took a class on programming for the Cell processor and did a simple ML application. I got the Cell version to scale really well with SPUs, and it blew standard CPUs out of the water. I even posted about it on the IGN boards! But the work was enormous in comparison, and game dev would only be harder given time constraints and human bandwidth. So the exotic architectures are not really worth it in that regard.

For the PlayStations, the fact that AMD isn't competing with Nvidia on AI isn't as bad either. GPUs are kind of already good out of the box for model inference. Yes, tensor cores are nice and give a bump, but the disparity for inference in terms of raw compute isn't that great in the grand scheme of things, especially since on a fixed platform you can hyper-optimize if you need to. The real disparity is in the training hardware: data center GPUs for AI and the accelerated libraries for training. This doesn't matter much for the PlayStation itself, since you're not going to use PSs for training.
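(A toy illustration of the point above: inference is dominated by dense matrix multiplies plus cheap elementwise ops, which map onto any wide parallel architecture, whether that's GPU shader cores, SPUs, or plain SIMD units. A minimal NumPy sketch, with layer sizes chosen arbitrarily for illustration:)

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Cheap elementwise op; trivially parallel.
    return np.maximum(x, 0.0)

def mlp_forward(x, w1, b1, w2, b2):
    # Two dense matrix multiplies dominate the cost. Tensor cores
    # accelerate exactly these GEMMs, but ordinary FP32 ALUs run
    # them fine too, just slower.
    return relu(x @ w1 + b1) @ w2 + b2

# Batch of 8 inputs through a 32 -> 64 -> 10 network.
x = rng.standard_normal((8, 32))
w1, b1 = rng.standard_normal((32, 64)), np.zeros(64)
w2, b2 = rng.standard_normal((64, 10)), np.zeros(10)

y = mlp_forward(x, w1, b1, w2, b2)
print(y.shape)  # (8, 10)
```

Nothing here is hardware-specific, which is the point: the same arithmetic hyper-optimized for a fixed console platform would be a straightforward port.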
chakoo Posted June 6

3 hours ago, legend said: Not all that much. Crazy architectures mean more work because there are fewer standards and libraries that can be applied. The cost of that outweighs the benefits. For example, back in the day I took a class on programming for the Cell processor and did a simple ML application. I got the Cell version to scale really well with SPUs, and it blew standard CPUs out of the water. I even posted about it on the IGN boards! But the work was enormous in comparison, and game dev would only be harder given time constraints and human bandwidth. So the exotic architectures are not really worth it in that regard. For the PlayStations, the fact that AMD isn't competing with Nvidia on AI isn't as bad either. GPUs are kind of already good out of the box for model inference. Yes, tensor cores are nice and give a bump, but the disparity for inference in terms of raw compute isn't that great in the grand scheme of things, especially since on a fixed platform you can hyper-optimize if you need to. The real disparity is in the training hardware: data center GPUs for AI and the accelerated libraries for training. This doesn't matter much for the PlayStation itself, since you're not going to use PSs for training.

From a gaming perspective, I agree, but from a HW-pushing perspective, I disagree. I think if Sony had kept pushing R&D on their crazy HW architecture, they would be a big player in AI right now. Sony was pushing teams to learn how to work with parallel processing to get the most out of the hardware slightly before CUDA came around as an alternative. I also know the joy/pain of working with the Cell processor. I built some frameworks at work for us to use the SPUs, but we ended up not doing anything with them since the game was primarily a Vita game. I actually enjoyed it more than I hated it, but I also spent most of my time working on pipelines and rarely on gameplay, so it wasn't hard for me to wrap my head around it.
legend Posted June 6

21 minutes ago, chakoo said: From a gaming perspective, I agree, but from a HW-pushing perspective, I disagree. I think if Sony had kept pushing R&D on their crazy HW architecture, they would be a big player in AI right now. Sony was pushing teams to learn how to work with parallel processing to get the most out of the hardware slightly before CUDA came around as an alternative. I also know the joy/pain of working with the Cell processor. I built some frameworks at work for us to use the SPUs, but we ended up not doing anything with them since the game was primarily a Vita game. I actually enjoyed it more than I hated it, but I also spent most of my time working on pipelines and rarely on gameplay, so it wasn't hard for me to wrap my head around it.

I'm not sure Sony could have justifiably funded it at the time. It's a huge financial effort to push novel parallel architectures, and it's complicated when you're also forced to work with various other companies (e.g., IBM for Cell). Nvidia had an existing business model for chips that were already usable in standard PCs (which was critical to capturing the AI market), and it was fortuitous that that business model worked well for accelerating AI. That allowed them to grow into it organically and sustainably. Sony's main business that would have supported it was PlayStation, but as we both agree, you couldn't have really justified it there, making it nothing more than a cost they'd have to bear. Even in retrospect, knowing that AI blew up, it's unclear there would have been a viable business model for Sony to go from where things were to being a big player in the AI hardware space. The only alternative candidates where, in retrospect, you could easily justify it are AMD and perhaps Intel. Both had clearer organic paths to it in the same way Nvidia did.
Reputator Posted June 6

Did Sony actually design it? I thought it was primarily IBM's.
Ghost_MH Posted June 7

4 hours ago, legend said: The only alternative candidates where, in retrospect, you could easily justify it are AMD and perhaps Intel. Both had clearer organic paths to it in the same way Nvidia did.

This reminds me of how there was a time not that long ago when AMD easily had the fastest ARM CPUs on the market, but gave up on ARM development because, I guess, nobody wants ARM in the datacenter? I still have an AMD ARM pizza box sitting in my server room, being babied, because there is no serious not-Ampere alternative on the market. I also have a ThunderX2 server because I thought they'd be a major player, but nope, they disappeared too.
legend Posted June 7

2 hours ago, Reputator said: Did Sony actually design it? I thought it was primarily IBM's.

It was certainly primarily IBM. I'm pretty sure Sony was involved in the design, but I honestly don't remember the details as well as I used to. Either way, that's exactly what I mean by it being more complicated with other company involvement.

EDIT: Wikipedia at least indicates that it was a joint design effort, also with Toshiba.
Cell (processor) - Wikipedia
EN.WIKIPEDIA.ORG
Quote: In mid-2000, Sony Computer Entertainment, Toshiba Corporation, and IBM formed an alliance known as "STI" to design and manufacture the processor.[8]
b_m_b_m_b_m Posted June 7

25 minutes ago, legend said: It was certainly primarily IBM. I'm pretty sure Sony was involved in the design, but I honestly don't remember the details as well as I used to. Either way, that's exactly what I mean by it being more complicated with other company involvement. EDIT: Wikipedia at least indicates that it was a joint design effort, also with Toshiba. Cell (processor) - Wikipedia EN.WIKIPEDIA.ORG

What an unfortunate acronym
chakoo Posted June 7

As mentioned, it was a joint effort, and Sony did contribute a lot to the design. There were initially plans to have dual Cell chips in the system in place of a CPU/GPU combo.

7 hours ago, legend said: I'm not sure Sony could have justifiably funded it at the time. It's a huge financial effort to push novel parallel architectures, and it's complicated when you're also forced to work with various other companies (e.g., IBM for Cell). Nvidia had an existing business model for chips that were already usable in standard PCs (which was critical to capturing the AI market), and it was fortuitous that that business model worked well for accelerating AI. That allowed them to grow into it organically and sustainably. Sony's main business that would have supported it was PlayStation, but as we both agree, you couldn't have really justified it there, making it nothing more than a cost they'd have to bear. Even in retrospect, knowing that AI blew up, it's unclear there would have been a viable business model for Sony to go from where things were to being a big player in the AI hardware space. The only alternative candidates where, in retrospect, you could easily justify it are AMD and perhaps Intel. Both had clearer organic paths to it in the same way Nvidia did.

I don't fully agree. Sony is a large company and has funded a lot of R&D that wasn't initially practical but did eventually make it into products; I feel we'll just go around in circles on this, though. I also agree that AMD is probably the best other alternative, but I would put the likes of Qualcomm and others over Intel, and to an extent IBM if they were still in chips.
unogueen Posted June 7

15 hours ago, chakoo said: You have to wonder how much Sony is kicking themselves for not keeping up with their crazy architecture on the side.

Vydea owns CUDA. I've seen this shit coming since the uni labs, like 2001.
unogueen Posted June 7

GeForce 2 GTS on ALL lab PCs??? Someone made it their home.
Reputator Posted June 7

2 hours ago, unogueen said: GeForce 2 GTS on ALL lab PCs??? Someone made it their home.

You foresaw the dominance of NVIDIA in the HPC world in 2001 because some system builder put the same GPU in every lab PC? Time to put down the bong.
SuperSpreader Posted June 7

1 hour ago, Reputator said: Time to put down the bong.

Time to double down
BloodyHell Posted June 7

19 hours ago, chakoo said: From a gaming perspective, I agree, but from a HW-pushing perspective, I disagree. I think if Sony had kept pushing R&D on their crazy HW architecture, they would be a big player in AI right now. Sony was pushing teams to learn how to work with parallel processing to get the most out of the hardware slightly before CUDA came around as an alternative. I also know the joy/pain of working with the Cell processor. I built some frameworks at work for us to use the SPUs, but we ended up not doing anything with them since the game was primarily a Vita game. I actually enjoyed it more than I hated it, but I also spent most of my time working on pipelines and rarely on gameplay, so it wasn't hard for me to wrap my head around it.

"I think" is doing a lot of heavy lifting in that paragraph.
chakoo Posted June 7

2 hours ago, Reputator said: You foresaw the dominance of NVIDIA in the HPC world in 2001 because some system builder put the same GPU in every lab PC? Time to put down the bong.

Also before CUDA existed. Yes, yes, I too saw the inevitable current state of AI back when I was making GBA games.
chakoo Posted June 7

1 minute ago, BloodyHell said: "I think" is doing a lot of heavy lifting in that paragraph.

The PS3 isn't some fluke of design. You can see where Sony was focusing on parallelized operations; it's a core part of the VU design in the PS2.
Spork3245 Posted June 7

Let's not forget that the GPU in the PS3 was an Nvidia part