Paperclyp Posted January 1, 2023
Apologies for making this thread my personal tech help page. That being said... my card should be here this week, whereas my processor will still be a few weeks out. I'm curious: is it better to wait and install both pieces together, or is there really no reason I can't install the card when it gets here, then do the processor when it arrives? It seems like I shouldn't have to do anything on the OS/driver side, just pray everything works?
Spork3245 Posted January 1, 2023
3 minutes ago, Paperclyp said:
> I'm curious - is it optimal to wait to install both pieces together, or is there really no reason I can't install the card when it gets here, then do the processor when it gets here?
There's no reason to wait. In fact, it's better to do the GPU first so you can see the difference the new CPU adds.
4 minutes ago, Paperclyp said:
> It seems like I shouldn't have to do anything on the OS / driver side, just pray everything works?
You'll want to reinstall your GPU drivers (and do a clean install).
cusideabelincoln Posted January 1, 2023
3 hours ago, Paperclyp said:
> Apologies for making this thread my personal tech help page. That being said... My card should be here this week, whereas my processor will still be a few weeks out. I'm curious - is it optimal to wait to install both pieces together, or is there really no reason I can't install the card when it gets here, then do the processor when it gets here? It seems like I shouldn't have to do anything on the OS / driver side, just pray everything works?
Use that week to get some before/after benchmarks, and if you want to post them here I'd be curious to see the difference. What games do you plan to play? You shouldn't have to make any software/driver changes when swapping the CPU. I'd suggest overclocking out of the box: it should do 4.2-4.4 GHz without any fine tweaking or voltage adjustments. If you have good cooling and want to spend more time tweaking, you could get 4.7-5 GHz max.
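The before/after comparison suggested above boils down to simple percentage math; here is a minimal Python sketch (the game names and FPS figures are placeholders, not real results):

```python
def percent_uplift(before_fps: float, after_fps: float) -> float:
    """Percentage framerate gain from an upgrade, given average FPS runs."""
    return (after_fps / before_fps - 1.0) * 100.0

# Placeholder before/after averages for a couple of benchmark runs.
runs = {
    "Game A": (62.0, 88.0),
    "Game B": (45.0, 71.0),
}

for game, (before, after) in runs.items():
    print(f"{game}: {percent_uplift(before, after):.1f}% faster")
```

Running the same scene with the same settings before and after each part swap keeps the comparison fair, and lets you attribute the uplift to the GPU and CPU separately.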
Paperclyp Posted January 1, 2023
1 hour ago, cusideabelincoln said:
> Use that week to get some before/after benchmarks, and if you want to post them here I'd be curious to see the difference. What games do you plan to play? You shouldn't have to make any software/driver changes when swapping the CPU. I'd suggest overclocking out of the box: it should do 4.2-4.4 GHz without any fine tweaking or voltage adjustments. If you have good cooling and want to spend more time tweaking, you could get 4.7-5 GHz max.
Cool. Do I just go into the BIOS to overclock? Never done it before... I'm not sure what I'll play. I'll definitely boot up the RE Village demo because it has some pretty granular settings, so I want to compare framerates with certain things turned on. Otherwise, what I've been playing recently is pretty low-end. I've been toying with starting a new Valheim game, and I might start NieR: Automata or restart Doom Eternal.
HardAct Posted January 1, 2023
Hope this keeps up. I'm hoping AMD closes the gap and makes GPUs more affordable per performance dollar, so I can buy a 5090 or 5090 Ti (most likely a 5080 Ti, but I can dream) for around $1,500 and be done with it again for the next three generations at least, maybe even four. I've gone from my 2080 Ti until now, and that's the longest EVER, so I'm excited about the NEXT big GPU from Nvidia. By that time RTX with DLSS 2/3 should be ironed out and everything will run locked at 60-120 with Vsync on. That's the dream. Though with the LG CX or C1, Vsync isn't needed!
Dexterryu Posted January 3, 2023
I think they're at an interesting point. With the "next gen" console generation just starting to get regular "current gen only" titles, consoles are going to be the limiting factor when it comes to GPU value. We're already at the point where we can play damn near photo-realistic games in UE5.x at high framerates at 4K... As a 3080 owner, I have zero interest in a 40x0 card for gaming purposes. Additionally, now that crypto is struggling, the artificial demand for gaming cards isn't there either. The biggest question is where Nvidia goes from here beyond efficiency. Better graphics has darn near hit the point of diminishing returns on photo-realism vs. animation, IMHO.
SuperSpreader Posted January 4, 2023
11 hours ago, Dexterryu said:
> I think they're at an interesting point. With the "next gen" console generation just starting to get regular "current gen only" titles, consoles are going to be the limiting factor when it comes to GPU value. We're already at the point where we can play damn near photo-realistic games in UE5.x at high framerates at 4K... As a 3080 owner, I have zero interest in a 40x0 card for gaming purposes. Additionally, now that crypto is struggling, the artificial demand for gaming cards isn't there either. The biggest question is where Nvidia goes from here beyond efficiency. Better graphics has darn near hit the point of diminishing returns on photo-realism vs. animation, IMHO.
Probably better depth rendering, so environments can be more open with fewer restrictions on sightlines, or better/more complex VFX.
Spork3245 Posted January 4, 2023
17 hours ago, Dexterryu said:
> I think they're at an interesting point. With the "next gen" console generation just starting to get regular "current gen only" titles, consoles are going to be the limiting factor when it comes to GPU value. We're already at the point where we can play damn near photo-realistic games in UE5.x at high framerates at 4K... As a 3080 owner, I have zero interest in a 40x0 card for gaming purposes. Additionally, now that crypto is struggling, the artificial demand for gaming cards isn't there either. The biggest question is where Nvidia goes from here beyond efficiency. Better graphics has darn near hit the point of diminishing returns on photo-realism vs. animation, IMHO.
We still need higher resolution (12k), better RT performance so games can be 100% ray traced via path rendering, and increases in overall performance so games can run at the aforementioned 12k at 120-360fps. There's still quite a bit of room for growth, but the main things are certainly resolution and framerate. If you're gaming below 4k, I can understand not wanting to upgrade your 3080, but for me the 3080 was nowhere near cutting it for 4k 60fps most of the time, let alone 4k 120fps.
Dexterryu Posted January 4, 2023
23 minutes ago, Spork3245 said:
> We still need higher resolution (12k), better RT performance so games can be 100% ray traced via path rendering, and increases in overall performance so games can run at the aforementioned 12k at 120-360fps. There's still quite a bit of room for growth, but the main things are certainly resolution and framerate. If you're gaming below 4k, I can understand not wanting to upgrade your 3080, but for me the 3080 was nowhere near cutting it for 4k 60fps most of the time, let alone 4k 120fps.
I do game at QHD; having had a 4k monitor in the past, I just couldn't really tell the difference versus being able to go 144Hz with 1000-nit HDR. Again, diminishing returns on higher resolutions.
6 hours ago, SuperSpreader said:
> Probably better depth rendering, so environments can be more open with fewer restrictions on sightlines, or better/more complex VFX.
And this is where it's going to be a tougher sell, because games won't necessarily "look" better, so there won't be the visual jump you'd expect. Not saying that higher framerates or more depth are bad, just that for most people they won't justify a $1000+ upgrade.
Spork3245 Posted January 4, 2023
8 minutes ago, Dexterryu said:
> Again, diminishing returns on higher resolutions.
Strong disagree. I have a 1440p ultrawide next to my 4k 144Hz 1000-nit HDR monitor, and the difference is super apparent to me. 12k is when aliasing is no longer perceptible without AA.
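The "12k" claim can be sanity-checked with some viewing-geometry math. A rough sketch in Python (the 27-inch 16:9 panel width and 24-inch viewing distance are assumptions, and ~60 pixels per degree is a commonly cited visual-acuity threshold for 20/20 vision; aliasing artifacts can remain visible well beyond it):

```python
import math

def pixels_per_degree(h_res: int, screen_width_in: float, distance_in: float) -> float:
    """Horizontal pixels per degree of visual angle at a given viewing distance."""
    # Total horizontal field of view the screen subtends, in degrees.
    h_fov = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_res / h_fov

PANEL_WIDTH = 23.5   # approx. width of a 27" 16:9 panel, in inches (assumption)
DISTANCE = 24.0      # assumed desktop viewing distance, in inches

for name, h_res in [("1440p", 2560), ("4k", 3840), ("12k", 11520)]:
    ppd = pixels_per_degree(h_res, PANEL_WIDTH, DISTANCE)
    print(f"{name}: {ppd:.0f} pixels per degree")
```

Under these assumptions 4k lands around 74 PPD and 12k around 220 PPD, which is why higher-than-acuity resolutions are still argued to matter for aliasing on high-contrast edges.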
Dexterryu Posted January 4, 2023
13 minutes ago, Spork3245 said:
> Strong disagree. I have a 1440p ultrawide next to my 4k 144Hz 1000-nit HDR monitor, and the difference is super apparent to me. 12k is when aliasing is no longer perceptible without AA.
To each their own, then. The biggest visual impact for me has been HDR, not resolution. Again, not saying they are bad things, just not worth the asking price.
crispy4000 Posted January 4, 2023
CitizenVectron Posted January 4, 2023
I'm building my new PC this week, and I'm sticking with 1440p (165Hz) for the next 5-year cycle. 4k is impressive, but I'd much rather have high, stable refresh rates. My build (assembling this weekend):
- 5800X3D
- 6950 XT
- 32GB DDR4-3600
- MSI MAG B550 Tomahawk mobo
- LG 27GP83B-B monitor (27" 1440p 165Hz)
- A bunch of M.2 and other SSDs
Spork3245 Posted January 4, 2023
8 minutes ago, crispy4000 said:
It's not particularly great, but it seems to OC very well, like the 4080. The biggest issue is the price compared to previous xx70 Ti offerings. It's a better value than the 7900 XT, IMO.
Zaku3 Posted January 4, 2023
1 minute ago, CitizenVectron said:
> I'm building my new PC this week, and I'm sticking with 1440p (165Hz) for the next 5-year cycle. 4k is impressive, but I'd much rather have high, stable refresh rates.
Woot, fellow AMD bro. I'm waiting for CES to see what new monitors come out. I'm at 120Hz 5120x1440 and would like to stay the same, but maybe with an OLED display. I work from home, though, so I'm concerned about burn-in. Lol, gonna have to turn off my monitor on breaks and make sure I move windows around a bit.
Zaku3 Posted January 4, 2023
2 minutes ago, Spork3245 said:
> It's not particularly great, but it seems to OC very well, like the 4080. The biggest issue is the price compared to previous xx70 Ti offerings. It's a better value than the 7900 XT, IMO.
Ya, the 7900 XT should cost less in and of itself. Next to the 4070 Ti it needs at least a $100 drop.
Spork3245 Posted January 4, 2023
4 minutes ago, Zaku3 said:
> Ya, the 7900 XT should cost less in and of itself. Next to the 4070 Ti it needs at least a $100 drop.
Yea, the XT is $100 more than the 4070 Ti. Without an OC and no RT, the XT is around 6-9% faster (with the 4070 Ti OC'd they're pretty much 1:1, but that's with the XT not OC'd, and I don't know how well it OCs). With RT enabled, the 4070 Ti comes out on top by around 25%.
CitizenVectron Posted January 4, 2023
6 minutes ago, Zaku3 said:
> Woot, fellow AMD bro. I'm waiting for CES to see what new monitors come out. I'm at 120Hz 5120x1440 and would like to stay the same, but maybe with an OLED display. I work from home, though, so I'm concerned about burn-in.
I work from home 60% of the time, and with this new monitor I will have:
- 1 x 27" 1440p (FreeSync)
- 1 x 27" 1440p (G-Sync)
So I'll use the new monitor for gaming, but both monitors for productivity. I considered going ultrawide, but I prefer to just have two monitors.
crispy4000 Posted January 4, 2023
16 minutes ago, Spork3245 said:
> It's not particularly great, but it seems to OC very well, like the 4080. The biggest issue is the price compared to previous xx70 Ti offerings. It's a better value than the 7900 XT, IMO.
If I were to buy a card soon (I probably won't), I wouldn't consider that labeling relevant at all. There are plenty of cards that can be ignored if you don't take the gen-on-gen comparisons too literally. I'd sooner think of this card as a better 3080 with DLSS 3. It seems like a good card for someone like me, primarily interested in 4k60 gaming, and a much better option for price vs. performance than the 4080. $800 is still way too much for me to stomach, though. It's only a "good" MSRP in a terrible time for GPU pricing.
Spork3245 Posted January 4, 2023
10 minutes ago, crispy4000 said:
> ...wouldn't consider that labeling relevant at all.
I don't disagree, but that's been the main argument people make against the 4080, despite its performance-per-dollar being better than the 4090's.
crispy4000 Posted January 4, 2023
19 minutes ago, Spork3245 said:
> I don't disagree, but that's been the main argument people make against the 4080, despite its performance-per-dollar being better than the 4090's.
I thought the argument was more that the 4080 was a shit deal compared to 30xx cards on sale, performance per dollar. I suppose now you could also say it's a shit deal compared to a 4070 Ti. There's no way a 4080 is worth $400 more.
elbobo Posted January 4, 2023
I honestly have no idea what Nvidia and AMD are doing this gen. It seems like the options are sell a kidney and buy a 4090, or say fuck off and buy a PS5; the performance bump on the 4080/4070 vs. the previous gen is not worth the cost. AMD is a bit better, but still overpriced by at least 25%.
Spork3245 Posted January 4, 2023
17 minutes ago, crispy4000 said:
> I thought the argument was more that the 4080 was a shit deal compared to 30xx cards on sale, performance per dollar.
No, it has to do with the pricing vs. previous xx80 cards. That's the main sticking point people are complaining about. The performance-per-dollar works out about 1:1 with current 3080 and 3080 Ti prices, and comes out superior to 3090/3090 Ti prices. The 7900 XT/XTX honestly aren't better values by comparison, either.
17 minutes ago, crispy4000 said:
> There's no way a 4080 is worth $400 more.
That largely depends on your use case (preferred resolution, settings, and framerate).
Dexterryu Posted January 4, 2023
I think the main thing with all of these cards is whether there's value in upgrading versus the asking price. Looking at the industry as a whole, the crypto rush for GPUs inflated prices, and the manufacturers saw this and adjusted MSRPs. That said, with crypto crashing and cards readily available, people aren't willing to overpay. There will still be enthusiasts who have to max out or have the best (I realize a lot of people posting here are in that club), but those aren't the everyday gamers who previously had to overpay to get ANY card. It'll be interesting to see what happens now that supply seems to exceed demand.
Zaku3 Posted January 4, 2023
OK, got off the phone with AMD. They're offering refunds or a replacement. I'm just going to wait for a replacement, because I don't want to jump through hoops for an AIB card or have to hunt down a 4080. Even like this, the card is a good upgrade over my 6800 XT; it's more that performance is being left on the table. The only negative is that it doesn't like Steel Division 2 and crashes every once in a while. Guess I'll just play WARNO. I feel bad joining MP games and dropping.
crispy4000 Posted January 4, 2023
16 minutes ago, Spork3245 said:
> That largely depends on your use case (preferred resolution, settings, and framerate).
I have a really hard time envisioning the sort of PC gamer whose needs a 4070 Ti wouldn't meet but a 4080 would. Maybe some years in the future? But then it might be best to save that $400 toward a new card.
Spork3245 Posted January 4, 2023
4 minutes ago, crispy4000 said:
> I have a really hard time envisioning the sort of PC gamer whose needs a 4070 Ti wouldn't meet but a 4080 would. Maybe some years in the future? But then it might be best to save that $400 toward a new card.
At 4k there's a 25-30% performance improvement on the 4080 for 33% more money. When RT is enabled, the 4070 Ti struggles at 4k, whereas the 4080 can mostly maintain 60fps. If your goal is 4k max settings but you don't want to spend beyond $1,200, the 4080 is the better choice. At 1440p, the 4070 Ti shines.
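The value math being argued here is easy to make concrete. A minimal Python sketch using the launch MSRPs and the rough 25-30% performance gap cited in the thread (the 1.28 relative-performance figure is an assumption for illustration, not a measured benchmark):

```python
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance units bought per dollar spent."""
    return relative_perf / price

# 4070 Ti as the 1.0 baseline at its $799 (rounded to $800) MSRP;
# 4080 assumed ~28% faster at 4k for $1,200.
rtx_4070_ti = perf_per_dollar(1.00, 800.0)
rtx_4080 = perf_per_dollar(1.28, 1200.0)

print(f"4070 Ti: {rtx_4070_ti:.5f} perf/$")
print(f"4080:    {rtx_4080:.5f} perf/$")
print(f"4070 Ti value advantage: {(rtx_4070_ti / rtx_4080 - 1) * 100:.0f}%")
```

Under those assumptions the 4070 Ti buys roughly 17% more performance per dollar, which is consistent with both sides of the argument: the 4080 is worse raw value, but the only option of the two if its absolute performance is what you need.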
crispy4000 Posted January 4, 2023
3 minutes ago, Spork3245 said:
> At 4k there's a 25-30% performance improvement on the 4080 for 33% more money. When RT is enabled, the 4070 Ti struggles at 4k, whereas the 4080 can mostly maintain 60fps. If your goal is 4k max settings but you don't want to spend beyond $1,200, the 4080 is the better choice. At 1440p, the 4070 Ti shines.
Are we talking DLSS 2 here? Because if not, that negates much of the point.
Spork3245 Posted January 4, 2023
4 minutes ago, crispy4000 said:
> Are we talking DLSS 2 here? Because if not, that negates much of the point.
Why would you not have DLSS or FSR enabled at 4k? Also, "negates the point"?
Zaku3 Posted January 4, 2023
1 minute ago, Spork3245 said:
> Why would you not have DLSS or FSR enabled at 4k?
I find FSR 1 and FSR 2 worse than native. It's close with FSR 2, but something still looks off to me when playing Darktide. Based on my 1600p 16-inch laptop, I'd say DLSS looks about as good as native. It feels off at times, but it generally felt as if I was playing at native, only at 100 FPS. Damn. I think Asus is going to make the Flow line Intel-only this gen. Guess a 2023 Lenovo Legion 7 is going to be my next laptop.
Spork3245 Posted January 4, 2023
Just now, Zaku3 said:
> I find FSR 1 and FSR 2 worse than native. It's close with FSR 2, but something still looks off to me when playing Darktide. Based on my 1600p 16-inch laptop, I'd say DLSS looks about as good as native.
At 4k I'd argue that DLSS Quality and Balanced are superior to native due to the removal of aliasing. With FSR (2) I'd only use the Quality setting, personally. Below 4k I have no idea how FSR fares, but DLSS still looks better than native in most cases when set to Quality mode, even down to 1080p.
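For context on what these presets actually render: each DLSS 2 mode upscales from a lower internal resolution. A sketch using the commonly cited per-axis scale factors (the Balanced factor of ~0.58 is approximate, and exact ratios can vary slightly by game and DLSS version):

```python
# Commonly cited per-axis render-scale factors for DLSS 2 presets.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,          # approximate
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: renders at {w}x{h}, upscales to 3840x2160")
```

So "4k DLSS Quality" is really a 2560x1440 render reconstructed to 2160p, which is why the framerate gains are so large relative to the visual cost.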
Zaku3 Posted January 4, 2023
1 minute ago, Spork3245 said:
> At 4k I'd argue that DLSS Quality and Balanced are superior to native due to the removal of aliasing. With FSR (2) I'd only use the Quality setting, personally.
I do use FSR at times on my ONEXPLAYER. It's just FSR 1.0 for Elden Ring. If you use the highest-res option, it doesn't look that bad.
Spork3245 Posted January 4, 2023
Just now, Zaku3 said:
> I do use FSR at times on my ONEXPLAYER. It's just FSR 1.0 for Elden Ring. If you use the highest-res option, it doesn't look that bad.
I don't know if I've ever used FSR 1, tbh. I don't think any games I played offered FSR 1 without also offering DLSS. I know Far Cry 6 only has FSR, but I'm 99% sure it's v2. FSR 1 was a bit worse than the original DLSS from what I saw.
Mr.Vic20 Posted January 4, 2023
Just now, Spork3245 said:
> I don't know if I've ever used FSR 1, tbh. FSR 1 was a bit worse than the original DLSS from what I saw.
FSR 1 made textures look soft. Version 2 is much improved, and its quality, while lower than DLSS 2's, is not off by much. Both are attractive propositions now.
CitizenVectron Posted January 4, 2023
5 minutes ago, Mr.Vic20 said:
> FSR 1 made textures look soft. Version 2 is much improved, and its quality, while lower than DLSS 2's, is not off by much. Both are attractive propositions now.
So when are FSR/DLSS most useful? All the time?