AbsolutSurgen Posted December 18, 2023

1 hour ago, Remarkableriots said: How would you put a bigger case on the HP Omen Transcend 14-inch laptop?

You would make the aluminum case big enough to cover the cooling for the larger chip (and likely a larger battery). I presume the first thing is that they would make it thicker, like @stepee suggested.
Remarkableriots Posted December 20, 2023

Intel to Mass Produce 15th Gen Arrow Lake CPUs on 2nm (20A) Process in 2024 | Hardware Times
WWW.HARDWARETIMES.COM
Following the launch of the 1st Gen Core Ultra processors on the 4nm-class Intel 4 node, the chipmaker plans the next checkpoint in its race to “process leadership.” Sanjay Natarajan, SVP...

Quote: 2nm (20A) will also utilize EUV lithography to enhance yields and production capacity. It’ll also be the first node to use RibbonFET transistors, commonly known as GAA (Gate All Around), the successor to FinFET. The 15th Gen Arrow Lake processors will be the first to leverage PowerVia, Intel’s backside power delivery technology meant to optimize power and frequency. Per the chipmaker’s internal tests, PowerVia demonstrates >5% frequency improvement and >90% cell density on Intel 4. The first 20A chips are expected to launch with Arrow Lake in the latter half of 2024. The 15th Gen client family will be the sole benefactor of the cutting-edge node. Granite Rapids and Sierra Forest are slated to use Intel 3, while Clearwater Forest will utilize 18A.
Remarkableriots Posted December 23, 2023

Firm predicts it will cost $28 billion to build a 2nm fab and $30,000 per wafer, a 50 percent increase in chipmaking costs as complexity rises | Tom's Hardware
WWW.TOMSHARDWARE.COM
Chips are not getting cheaper.
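To put the quoted $30,000 wafer price in per-chip terms, here's a rough sketch using the standard dies-per-wafer approximation. The 100 mm² die size and 80% yield below are hypothetical assumptions for illustration, not figures from the article:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order approximation: usable wafer area minus edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

wafer_cost = 30_000   # projected 2nm wafer price from the article
die_area = 100        # mm^2 -- hypothetical mid-size GPU/CPU die
yield_rate = 0.8      # hypothetical fraction of dies that work

candidates = dies_per_wafer(300, die_area)   # standard 300 mm wafer
good_dies = int(candidates * yield_rate)
print(f"{candidates} candidate dies, {good_dies} good -> "
      f"${wafer_cost / good_dies:.0f} per good die")
# -> 640 candidate dies, 512 good -> $59 per good die
```

The point being: at $30,000 a wafer, even a mid-size die is pushing $60 in raw silicon before packaging, testing, memory, or margin.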
Remarkableriots Posted December 25, 2023

Intel's CEO says Moore's Law is slowing to a three-year cadence, but it's not dead yet | Tom's Hardware
WWW.TOMSHARDWARE.COM
Moore's Law is complicated.

Quote: Ever since taking the position of CEO in 2021, Gelsinger has emphatically said Moore's Law is "alive and well." In fact, he even said Intel could surpass the pace of Moore's Law at least until 2031 and has promoted "Super Moore's Law," a strategy to boost transistor count using 2.5D and 3D chip packaging technologies such as Foveros. Intel also often refers to this strategy as "Moore's Law 2.0," and AMD has also said we're entering the era of a slowed pace of Moore's Law.

In the talk at MIT, Gelsinger was asked about the potential end of Moore's Law, and he began by saying, "I think we've been declaring the death of Moore's Law for about three to four decades." However, he eventually followed this up with, "we're no longer in the golden era of Moore's Law, it's much, much harder now, so we're probably doubling effectively closer to every three years now, so we've definitely seen a slowing."

Gelsinger said that despite this apparent slowing of Moore's Law, Intel could create a 1-trillion transistor chip by 2030, when today, the biggest chip on a single package has around 100 billion transistors. The CEO said four things made this possible: new RibbonFET transistors, PowerVia power delivery, next-generation process nodes, and 3D chip stacking. He ended his answer by saying, "For all of the critics that declare we're dead... until the periodic table is exhausted, we ain't finished."

Still, Gelsinger admitted that Moore's Law's economic side is breaking down. "A modern fab seven or eight years ago would have cost about $10 billion. Now, it costs about $20 billion, so you've seen a different shift in the economics."
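Gelsinger's two numbers can be sanity-checked against each other: at a three-year doubling cadence, 100 billion transistors today doesn't reach a trillion by 2030 from transistor scaling alone, which is exactly why the answer leans on packaging and 3D stacking. A quick back-of-envelope (treating 2023 as the start year is my assumption):

```python
import math

start_year, start_count = 2023, 100e9   # ~100B transistors on today's biggest package
target = 1e12                           # Gelsinger's 1-trillion goal
cadence_years = 3                       # "doubling effectively closer to every three years"

# Doublings needed: 2**(t / cadence) = target / start  =>  t = cadence * log2(ratio)
years_needed = cadence_years * math.log2(target / start_count)
print(f"At a 3-year doubling cadence, 1T transistors arrives around "
      f"{start_year + years_needed:.0f}")
# -> At a 3-year doubling cadence, 1T transistors arrives around 2033
```

So the 2030 target implicitly assumes roughly a ten-fold jump in about seven years, faster than the stated cadence, with stacking making up the gap.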
SuperSpreader Posted December 25, 2023

Deeeeeeeez nuuuuuuttttttzzz
Remarkableriots Posted December 25, 2023

NVIDIA Increased Production of GeForce RTX 4070 Super to Compete with AMD Radeon RX 7800 XT
WWW.GURU3D.COM
According to recent reports, Nvidia is preparing for the release of its RTX 4070 Super GPU, which is set to debut on January 17, 2024. This launch is part of Nvidia's 2024 graphics card refresh...

Quote: According to recent reports, Nvidia is preparing for the release of its RTX 4070 Super GPU, which is set to debut on January 17, 2024. This launch is part of Nvidia's 2024 graphics card refresh strategy, which also includes updated versions of the RTX 4070, RTX 4070 Ti, and RTX 4080 models. The RTX 4070 Super, in particular, is reportedly being produced in larger quantities as Nvidia positions it as a key component of this refresh.

The production strategy for the RTX 4070 Super involves modifying the existing AD104 chip used in the RTX 4070 Ti, streamlining the manufacturing process. This approach aligns with the speculation that Nvidia plans to cease the production of the RTX 4070 Ti and RTX 4080 as part of the Super refresh series.

A significant aspect of Nvidia's strategy is to position the RTX 4070 Super in direct competition with AMD's Radeon RX 7800 XT. The latter has shown strong sales performance, as indicated by recent sales data from Mindfactory. To effectively challenge the Radeon RX 7800 XT, the RTX 4070 Super will need to offer superior performance or competitive pricing. This competitive landscape may prompt AMD to respond by adjusting the pricing of its RX 7800 XT model.
Keyser_Soze Posted December 26, 2023
Nokra Posted December 26, 2023

8 hours ago, Keyser_Soze said:

Ungrateful shit.
Remarkableriots Posted December 27, 2023

Nvidia CEO Jensen says, 'Our life goal is not to build CUDA GPUs' — notes the company changed its mission but never changed the name | Tom's Hardware
WWW.TOMSHARDWARE.COM
"We changed the mission," Jensen Huang says, referring to Nvidia's AI leadership, "I just never changed the name."

Quote: Jensen drew comparisons to transportation technologies, which employ various types of transportation to achieve the same goal of moving items around the globe, and how a company's mission can lead them to fundamentally different answers, even if operating in the same field. "As you know, the G in GPU originally stood for graphics," Jensen said, "And today, we do much, much more than graphics. We changed the mission. I just never changed the name!"

As Jensen says, "Our life goal is to solve computer problems that normal computers cannot." While this mission statement can certainly be applied to innovations in real-time graphics rendering like RTX and DLSS, it's also quite clear that this applies to Artificial Intelligence and Nvidia's near-uncontested leadership in that area. There's no doubt in the industry that Nvidia seized the hardware opportunities presented by AI like no one else in the industry was willing or able to prior.
Brian Posted December 27, 2023

When my daughter opened her last gift and realized there was no puppy under the tree, the disappointment and disgust in her eyes when she looked at me...
Spawn_of_Apathy Posted December 27, 2023

Next year put a taxidermy dog in a tightly sealed box and tell her “huh, it was alive before Christmas”.
Spawn_of_Apathy Posted December 27, 2023

1 hour ago, Remarkableriots said: Nvidia CEO Jensen says, 'Our life goal is not to build CUDA GPUs' — notes the company changed its mission but never changed the name | Tom's Hardware

This really makes me hope some AI bubble bursts, not unlike crypto mining, and Nvidia is left holding the bag.
AbsolutSurgen Posted December 27, 2023

On 12/26/2023 at 3:02 AM, Keyser_Soze said:

Dude ain't wrong.
Remarkableriots Posted December 28, 2023

4 hours ago, Spawn_of_Apathy said: this really makes me hope some AI bubble bursts, not unlike crypto mining, and Nvidia is left holding the bag.

https://www.amd.com/en/products/graphics/radeon-ai.html

AMD is getting on that AI gravy train just as much as Nvidia, and even Intel is doing the same.

https://www.intel.com/content/www/us/en/artificial-intelligence/hardware.html

I believe we're just at the dawn of AI, and it will keep expanding for a long time. @legend What do you think is the roadmap for AI going forward?
legend Posted December 28, 2023

2 hours ago, Remarkableriots said: I believe we're just at the dawn of AI, and it will just keep expanding for a long time. @legend What do you think is the roadmap for AI going forward?

Tech-wise, the community needs a replacement for CPython (the reference Python interpreter), or Python needs to be massively overhauled. It's a terribly slow language that creates barriers and prevents all kinds of research avenues from being pursued. It only works for things now because all the big AI libraries used in Python are actually written in C++, with bindings for specific kinds of operations exposed to Python. The current trend is to allow compiling Python code into native code at runtime, but this has numerous limitations. Something like Mojo might be one way to escape this problem, but we'll see.

Research-wise, we need to move out of the realm of static dataset fitting. Intelligent organisms don't learn by fitting nicely curated datasets. They learn interactively with an environment (sometimes with an interactive teacher to help), and there is tremendous power in being able to do so, but we haven't really cracked how to do it especially well (despite several big milestones, it remains hard). In particular, most ML methods rely on "iid" assumptions about data that don't hold when you're interacting with an environment, and the kludges we have around that are, well, kludges. Dataset fitting might still play a role in the story for pretraining well-defined things, but if it's all we can ever do, we'll never succeed at replicating the kind of intelligence found in biologically intelligent organisms.
Aside from getting out of dataset fitting, we need to improve on control methods (how to act), especially at very long time horizons (e.g., a person can plan for the next few seconds or the next few years; AI control methods... not so much). We need a way to learn causal world models (which kind of depends on learning interactively with an environment to do interventions) that facilitate planning. We also need to figure out how to efficiently store and recall previously learned skills without forgetting them, and be able to apply and reason about them compositionally.

There are other things I could list, but those are the big-ticket items. It's also kind of the same set of things that we were largely in the dark on 20 years ago. Mostly, in that time we've only learned how to do function approximation (that is, fit a function to data) well (as long as we have an iid data assumption). Doing function approximation well *is* a wildly significant and transformative advance, but it's still just one small slice of the problem.
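The iid point above can be made concrete with a toy sketch: in dataset fitting, the data exists up front and never reacts to the learner, so sample order carries no signal; in interactive learning, each observation depends on everything the agent did before. The LineWorld environment and the random stand-in policy here are purely illustrative inventions, not a real RL implementation:

```python
import random

class LineWorld:
    """Toy environment: agent walks a number line; reward peaks at position 10.
    What the agent observes next is determined by its own action -- not iid."""
    def __init__(self):
        self.pos = 0

    def step(self, action):          # action in {-1, +1}
        self.pos += action
        reward = -abs(self.pos - 10)
        return self.pos, reward

# Static dataset fitting: a fixed set of (input, label) pairs.
dataset = [(x, 2 * x + 1) for x in range(100)]
random.shuffle(dataset)              # harmless: each pair is independent of the rest

# Interactive learning: a trajectory where each datum depends on prior actions.
env = LineWorld()
trajectory = []
for t in range(20):
    action = random.choice([-1, 1])  # stand-in for a learned policy
    obs, reward = env.step(action)
    trajectory.append((obs, reward)) # correlated sequence; shuffling destroys meaning

print(trajectory[:5])
```

The shuffle line is the whole contrast: you can reorder the dataset freely, but reordering the trajectory scrambles the causal structure the agent would need to learn from.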
Remarkableriots Posted December 28, 2023

@legend Not saying it's possible now, but hypothetically, do you think one solution could be a technology-biological interface that could solve some issues? Like an implant-type device? Maybe even something like this?

‘Biocomputer’ combines lab-grown brain tissue with electronic hardware
WWW.NATURE.COM
A system that integrates brain cells into a hybrid machine can recognize voices.

I'm always dreaming about how technology will change in the future.
SuperSpreader Posted December 28, 2023

4 hours ago, legend said: (sometimes with an interactive teacher to help)

We're fucked
Remarkableriots Posted December 28, 2023

NVIDIA GeForce RTX 4080 SUPER 16 GB, 4070 Ti SUPER 16 GB, 4070 SUPER 12 GB Custom Models From MSI & Gigabyte Leak Out
WWW.GOOGLE.COM
NVIDIA GeForce RTX 4080 SUPER, 4070 Ti SUPER & 4070 SUPER GPU custom models from MSI & Gigabyte have leaked out as we approach launch.

Anybody thinking about getting the 4080 Super? I'm kinda confused by the 10 different versions of it, but ok.
Spork3245 Posted December 28, 2023

19 minutes ago, Remarkableriots said: I'm kinda confused on 10 different versions of it but ok though

It’s different coolers and different “levels” of being “pre-overclocked”; it’s all the same chip on the PCB. It’s the same as how MSI has multiple versions of the 4090, all at different price points.
legend Posted December 28, 2023

10 hours ago, Remarkableriots said: @legend Not saying it's possible now, but hypothetically, do you think one solution could be a technology-biological interface that could solve some issues? Like an implant-type device?

I’m very open to biological computers of any sort and do hope we someday augment our brains. I’m not sure how much utility we will get from pairing them until we have a better understanding of the biology, though. Maybe some things, but without a deeper understanding it seems like it will be hard to engineer with it.
Spork3245 Posted January 4
AbsolutSurgen Posted January 6
Spork3245 Posted January 6

RUMOR:
4080 Super - $999
4070 Ti Super - $799
4070 Super - $599
Spork3245 Posted January 8

On 1/6/2024 at 2:52 PM, Spork3245 said: RUMOR: 4080 Super - $999, 4070 Ti Super - $799, 4070 Super - $599

Confirmed: https://videocardz.com/newz/nvidia-launches-geforce-rtx-40-super-series-999-rtx-4080s-799-rtx-4070-tis-and-599-rtx-4070s

The 4070 Ti Super also has 16 GB of VRAM. The 4080 Super has 16 GB just like the 4080, and the 4070 Super has 12 GB just like the 4070.
Nokra Posted January 8

I think I'll wait for the 5xxx series to upgrade since I'm still pretty happy with my 3070 Ti, but that 4080 Super is sure tempting.
cusideabelincoln Posted January 8

Nvidia is now the master of bamboozling expectations. I honestly believe these "refreshed" cards could have been offered at launch, at this same performance and this same price, and Nvidia would still have made bank. Now they're going to get *some* praise for offering better value, after they brought about one of the worst-value generations ever, with the 80 and 70 class cards getting a steep price hike at the 4000 series launch compared to previous generations. Damn, does this make the OG 4080's price look even worse than it did.

It's also a damn shame AMD couldn't release a card substantially faster and cheaper than the 4080 to force some competition, but then again they probably would have priced it higher anyway.
Spork3245 Posted January 8

25 minutes ago, cusideabelincoln said: Nvidia is now the master of bamboozling expectations. ...

AMD does the same thing, they just add a 50 to their numbering scheme instead of calling it “Super”.
Spork3245 Posted January 8

G-SYNC Displays Dazzle At CES 2024: G-SYNC Pulsar Tech Unveiled, G-SYNC Comes To GeForce NOW, Plus 24 New Models
WWW.NVIDIA.COM
NVIDIA G-SYNC Pulsar is the first and only display tech to deliver flawless variable frequency strobing, variable refresh, and variable overdrive, together establishing a new gold standard for motion clarity and...

New VRR tech @stepee
cusideabelincoln Posted January 8

4 minutes ago, Spork3245 said: AMD does the same thing, they just add a 50 to their numbering scheme instead of calling it “Super”.

They're a bit less egregious because their +50 cards are barely any better than the original cards. Nvidia usually offers a bigger uplift with Super and Ti cards. But this generation Nvidia was definitely trying to make up for their "mistake" of making the 3080 almost as fast as the 3090, for half the price, at launch.
Spork3245 Posted January 8

1 minute ago, cusideabelincoln said: They're a bit less egregious because their +50 cards are barely any better than the original cards. ...

What? The only times the Ti/Super refreshes are a big uplift is when they replace the current flagship (and even then it's not always a big improvement); otherwise they typically just hit the "in between" for a 10% uplift or so (i.e., a 3070 Ti is in between a 3070 and a 3080). These are the same performance uplifts that AMD does on its refreshes.
cusideabelincoln Posted January 8

23 minutes ago, Spork3245 said: What? The only times the Ti/Super refreshes are a big uplift is when they replace the current flagship ...

AMD's 6000 refresh was very pointless; also, their cards don't respond as well to overclocking, which leads me to believe they would have had a much tougher time delivering their refreshed cards at their actual launch. Nvidia cards were power limited last gen, but if you got a card with a good PCB and massive cooling you could squeeze a lot out of it. The 3090 Ti is the de facto example. All the 3090 benchmarks you see will have the card running at 350W (same as the 3080) or maybe up to 380W; then they beefed up the 3090 Ti layout and gave it a standard TDP of 450W, with up to 600W in the benchmarks.
Spork3245 Posted January 8

9 minutes ago, cusideabelincoln said: AMD's 6000 refresh was very pointless ...

The 3090 Ti was 5-8% faster than the non-Ti. Even OCing it typically only pushed the Ti 6-7% over stock performance, and the non-Ti would typically gain 4-5% OCed. AMD wasn't the only one with a pointless refresh last gen.
cusideabelincoln Posted January 8

1 hour ago, Spork3245 said: The 3090 Ti was 5-8% faster than the non-Ti ...

It just feels too calculated this generation, especially considering they were going to call the 4070 Ti a 4080 and charge $900 for it. But then we all raised hell and they changed the name and lowered the price before release. Joke's on us, since they basically did it with all of the cards.
Spork3245 Posted January 8

4 minutes ago, cusideabelincoln said: It just feels too calculated this generation ...

All the companies are equally as calculating because they do it every generation. Maybe Intel will save us
stepee Posted January 8

2 hours ago, Spork3245 said: New VRR tech @stepee

cmonnnn 2026 8k LG oled with nvidia gsync pulsar!