stepee Posted September 29, 2022

20 hours ago, cusideabelincoln said:
In Cyberpunk the 4090 with DLSS 3 frame generation is 2.5x faster than a 3090 Ti using DLSS 2 performance mode. In Spider-Man the 4090 provides twice the performance using DLSS 3 frame generation compared to native resolution. In Portal the 4090 gives 5.5x performance using DLSS 3 frame generation, and DLSS 2 is 3.33x faster than native rendering. Latency using DLSS 3 frame gen is similar to using DLSS 2. Worst case, it's still better than latency without using DLSS. So the higher frame rate from frame generation does not give you the latency benefit you'd get if that same frame rate were rendered without frame gen. The good news about frame generation is that the image quality is top notch and it will make the game look smoother.

I'm really curious what that frame rate for Spider-Man is, whether it can hold 120 or not. They mention that it's still CPU limited, but I'm not sure to what extent. It feels weird having a CPU bottleneck again in PC gaming. Is this more of a programming issue (not taking advantage of the extra cores?), or is it just that there isn't a CPU twice as powerful as the PS5's yet?
Mr.Vic20 Posted September 29, 2022 (Author)

45 minutes ago, stepee said:
So how hard are we expecting this to be to get? I'm hoping the price, and people being upset with the price pushing bad mindshare vibes, makes this not TOO crazy. @Mr.Vic20 you just going for the FE on Nvidia.com?

Yeah, availability will be ugly, and I'm going for an FE. I don't want to climb any higher with the wattage requirements. I fully expect to be hunting for one of these for 2-3 months if I miss day one.
cusideabelincoln Posted September 29, 2022

10 minutes ago, stepee said:
I'm really curious what that frame rate for Spider-Man is, whether it can hold 120 or not. They mention that it's still CPU limited, but I'm not sure to what extent. It feels weird having a CPU bottleneck again in PC gaming. Is this more of a programming issue (not taking advantage of the extra cores?), or is it just that there isn't a CPU twice as powerful as the PS5's yet?

Spider-Man with RT does need a lot of CPU and memory speed to play at a locked 120 fps. Check these RT charts and you'll see that, no matter the resolution, anything that isn't a 12900K with high-speed DDR5 is bottlenecked: everything from a 12900K with DDR4 down to four-core CPUs shows no performance difference from 1080p to 4K.

And the new Ryzen CPUs don't quite match Intel in this game. So if you want 120 fps in Spider-Man, you will need an Intel CPU with the fastest DDR5 RAM you can get.
stepee Posted September 29, 2022

12 minutes ago, cusideabelincoln said:
Spider-Man with RT does need a lot of CPU and memory speed to play at a locked 120 fps. Check these RT charts and you'll see that, no matter the resolution, anything that isn't a 12900K with high-speed DDR5 is bottlenecked: everything from a 12900K with DDR4 down to four-core CPUs shows no performance difference from 1080p to 4K. And the new Ryzen CPUs don't quite match Intel in this game. So if you want 120 fps in Spider-Man, you will need an Intel CPU with the fastest DDR5 RAM you can get.

Almost seems like RAM speed is the biggest thing here, with that 40 fps jump on the same CPU to finally hit 120-ish. I guess that's what is going to be needed down the line. I'm not going to upgrade to a DDR5 system for 120 fps Spider-Man, but in a year or two it might be needed if you want to move towards 120 fps.
stepee Posted September 29, 2022

@cusideabelincoln However! With DLSS 3 generating a frame in between two frames, shouldn't that technically relieve 1/3 of the bandwidth from the CPU also?
Zaku3 Posted September 29, 2022

2 minutes ago, stepee said:
@cusideabelincoln However! With DLSS 3 generating a frame in between two frames, shouldn't that technically relieve 1/3 of the bandwidth from the CPU also?

People theorize that is why the FPS are so high in Flight Sim. That's something that would make me go Nvidia, tbh. Might make the game more playable in VR. Though I'd wait to see what a 7950X3D can do.
cusideabelincoln Posted September 29, 2022

3 hours ago, stepee said:
@cusideabelincoln However! With DLSS 3 generating a frame in between two frames, shouldn't that technically relieve 1/3 of the bandwidth from the CPU also?

I think that's what the Digital Foundry video is getting at when they show Spider-Man with DLSS 2 only slightly faster than native while DLSS 3 is twice as fast. DLSS 3 is not exactly relieving CPU bandwidth compared to DLSS 2, as they both require the same CPU cycles. But in a hard CPU or memory bottleneck, DLSS 3 will double the effective frame rate because every other frame is AI-generated entirely by the GPU. You can see that in the Spider-Man charts and video: pick any DDR4 CPU to compare, and at 1080p and 4K the performance is only a few frames different. DLSS 2 renders the game at 1080p, which explains why the DF video shows little difference over native. But DLSS 3 is twice as fast, since half the frames aren't constrained by the rest of the system.
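That doubling-under-a-bottleneck behavior can be put into a back-of-the-envelope model. This is my own toy sketch with made-up numbers, not NVIDIA's actual pipeline; the only assumption it encodes is the one stated above, that frame generation interleaves one GPU-generated frame per rendered frame:

```python
def output_fps(gpu_fps, cpu_fps, frame_generation=False):
    """Displayed frame rate in a simple model: rendered frames are capped by
    whichever of the GPU or the CPU/memory side is slower, and frame
    generation interleaves one GPU-generated frame per rendered frame."""
    rendered = min(gpu_fps, cpu_fps)
    return rendered * 2 if frame_generation else rendered

CPU_LIMIT = 70  # hypothetical CPU/memory-bound simulation rate

native = output_fps(gpu_fps=65, cpu_fps=CPU_LIMIT)   # GPU-bound: 65
dlss2 = output_fps(gpu_fps=140, cpu_fps=CPU_LIMIT)   # upscaling relieves the GPU,
                                                     # but the CPU caps it at 70
dlss3 = output_fps(gpu_fps=140, cpu_fps=CPU_LIMIT, frame_generation=True)  # 140
print(native, dlss2, dlss3)  # 65 70 140
```

With invented numbers like these, DLSS 2 barely beats native because the CPU is the wall, while frame generation doubles the displayed rate right past it, which is the same shape as the Spider-Man charts described above.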
stepee Posted September 29, 2022

2 minutes ago, cusideabelincoln said:
DLSS 3 is not exactly relieving CPU bandwidth compared to DLSS 2, as they both require the same CPU cycles. But in a hard CPU or memory bottleneck, DLSS 3 will double the effective frame rate because every other frame is AI-generated entirely by the GPU.

Yeah, I caught that, but it's still hard to tell exactly what that will mean. That is the game I can't wait to see some hard numbers behind! I see what you mean that it's technically not reducing the strain, but effectively it kind of works the same as far as trying to achieve 120 fps within the constraints of the CPU.
cusideabelincoln Posted September 29, 2022

Just now, stepee said:
I see what you mean that it's technically not reducing the strain, but effectively it kind of works the same as far as trying to achieve 120 fps within the constraints of the CPU.

I would put it this way: DLSS 3 bypasses system bottlenecks. It requires the same system resources as running the game at a lower resolution, but sends twice the frames to your monitor.
stepee Posted September 30, 2022

28 minutes ago, cusideabelincoln said:
I would put it this way: DLSS 3 bypasses system bottlenecks. It requires the same system resources as running the game at a lower resolution, but sends twice the frames to your monitor.

Yeah, that's fair... though wouldn't it also technically reduce CPU strain if, say, you were targeting 60 fps and could achieve that natively CPU-wise, but you use DLSS 3 because the game is heavy GPU-wise? Then it would only have to use as much CPU power as needed to render 30 or 40 fps or whatever. The actual CPU usage would still be lowered to hit the same 60 fps target.
cusideabelincoln Posted September 30, 2022

29 minutes ago, stepee said:
Yeah, that's fair... though wouldn't it also technically reduce CPU strain if, say, you were targeting 60 fps and could achieve that natively CPU-wise, but you use DLSS 3 because the game is heavy GPU-wise? Then it would only have to use as much CPU power as needed to render 30 or 40 fps or whatever.

It could do that. It depends on how you're capping the frame rate. Afaik any modern CPU can push over 60 fps in all games besides sim-type games, so there's no need to reduce load. I wonder how DLSS would handle a frame rate cap: which frames would it decide to keep and which to throw away?
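The cap arithmetic being debated here is simple enough to sketch. This is my own illustration of stepee's reasoning, under the same assumption as before (frame generation contributes every other displayed frame); how DLSS 3 actually behaves under a frame cap wasn't known at this point in the thread:

```python
def rendered_fps_needed(target_fps, frame_generation=False):
    """Frames the CPU/GPU must actually render per second to hit a display
    target, if frame generation supplies every other displayed frame."""
    return target_fps // 2 if frame_generation else target_fps

# Targeting a 60 fps cap:
print(rendered_fps_needed(60))                         # 60 without frame gen
print(rendered_fps_needed(60, frame_generation=True))  # 30 with frame gen
```

So under this model, a 60 fps cap with frame generation would leave the CPU doing only 30 fps worth of simulation work, which is the load reduction stepee describes.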
stepee Posted September 30, 2022

40 minutes ago, cusideabelincoln said:
I wonder how DLSS would handle a frame rate cap: which frames would it decide to keep and which to throw away?

Definitely will be a super interesting release to read about once benchmarks are out. I want to see the most extreme examples of this too, even if it's not perfect. Like when they did tests from 160p or whatever in Control.
crispy4000 Posted September 30, 2022

Makes me hopeful 60fps DLSS3 will actually work out.
stepee Posted September 30, 2022

1 hour ago, crispy4000 said:
Makes me hopeful 60fps DLSS3 will actually work out.

listen, I can only get so hard
Brian Posted October 5, 2022

[image]
stepee Posted October 5, 2022

lol damnnnnn
Brick Posted October 5, 2022

That's a chonke boi!
stepee Posted October 6, 2022

That actually made me concerned about my case, lol. But luckily it seems pretty spacious in there even though it's not that big of a case (regular desktop size, not a big tower or anything); looking at its dimensions vs my 3080 FTW3, it doesn't seem to be a problem.
Commissar SFLUFAN Posted October 6, 2022

This is one incredibly stupid company.
stepee Posted October 6, 2022

49 minutes ago, Commissar SFLUFAN said:
This is one incredibly stupid company.

It's my kind of stupid though!
Keyser_Soze Posted October 6, 2022

Might as well give the GPU its own case.
Spork3245 Posted October 6, 2022

5 hours ago, Keyser_Soze said:
Might as well give the GPU its own case.

Razer Core X Aluminum External GPU Enclosure (eGPU): Compatible with Windows & MacOS Thunderbolt 3 Laptops, NVIDIA/AMD PCIe Support, 650W PSU, Classic Black
Spork3245 Posted October 6, 2022

I really think all manufacturers should include a GPU support bracket for the 4090. I believe ASUS includes theirs. For the 3090 Ti, EVGA included their "e-leash", which works as a "hanger" you attach to the top of your case and then attach to the GPU.
dualhunter Posted October 6, 2022

9 hours ago, Keyser_Soze said:
Might as well give the GPU its own case.

In a generation or two, GPUs will be so big you'll just plug your peripherals into your GPU instead of having a separate motherboard.
cusideabelincoln Posted October 6, 2022

6 hours ago, dualhunter said:
In a generation or two, GPUs will be so big you'll just plug your peripherals into your GPU instead of having a separate motherboard.

Shhhh, don't give Nvidia more ideas on how to monopolize the market.
jaethos Posted October 6, 2022

Pretty much all of the board partner models are even bigger. The top-end ASUS and Gigabyte cards make the Founders Edition card look small.
cusideabelincoln Posted October 6, 2022

20 hours ago, Brian said:
[image]

4090 Strix
Brian Posted October 6, 2022

1 minute ago, cusideabelincoln said:
4090 Strix

@stepee you in trouble bud
Spork3245 Posted October 6, 2022

19 minutes ago, cusideabelincoln said:
Shhhh, don't give Nvidia more ideas on how to monopolize the market.

The nForce was such a good mobo for the time. Wonder why Nvidia bailed on mobos.
stepee Posted October 6, 2022

3 hours ago, Brian said:
@stepee you in trouble bud

son of a... I really hope I can get an FE.
Dre801 Posted October 7, 2022

That thing needs its own case.
SuperSpreader Posted October 7, 2022

Ridiculous
Spork3245 Posted October 7, 2022

10 hours ago, SuperSpreader said:
Ridiculous

Jay couldn't fit the ASUS card into a Lian Li mid-tower case.
TwinIon Posted October 7, 2022

Do we know yet if the 4080 will be equally huge?
jaethos Posted October 7, 2022

It depends on the card. Some manufacturers are using the same coolers for their 4080s as their 4090s; some are using smaller ones. You'll have to look at each card to figure it out.