Spork3245 Posted September 6, 2020

26 minutes ago, morph89 said: How much of a bottleneck do we think a 4770K (stock) + 32GB RAM + 512GB M.2 NVMe would be to pair with a 3090 for 4K gaming? Currently on a 1080 Ti that does well at 1440p with high+ultra settings. Really struggles with 4K for the most part though. I've been on the fence about waiting for Zen 3 to build a new system, but a whole new build + 3090 might break the bank a bit harder than I'd like at the moment. Then again, it is the apocalypse so... YOLO?

I doubt there'd be much of a bottleneck at 4K. You really should OC that CPU, though... just set it to 4.4-4.5GHz and let it be.
Rev Posted September 6, 2020

4 hours ago, stepee said: @Rev said it was 6:30am but I didn't ask where he read that

Just looked again and it's 6am Pacific.
cusideabelincoln Posted September 7, 2020

5 hours ago, morph89 said: How much of a bottleneck do we think a 4770K (stock) + 32GB RAM + 512GB M.2 NVMe would be to pair with a 3090 for 4K gaming? Currently on a 1080 Ti that does well at 1440p with high+ultra settings. Really struggles with 4K for the most part though. I've been on the fence about waiting for Zen 3 to build a new system, but a whole new build + 3090 might break the bank a bit harder than I'd like at the moment. Then again, it is the apocalypse so... YOLO?

If you're doing fine with a 1080 Ti at 1440p, then you should expect similar results at 4K with a 3090. The 3090 should approach roughly twice the performance of a 1080 Ti, while bumping from 1440p to 4K means pushing 2.25x the pixels, so at least 50% more demanding on the GPU. You might lose a little performance, depending on the title, from being stuck on lower-bandwidth DDR3 memory or in games that heavily utilize 6+ cores. But again, that depends on how sensitive a particular game engine is to memory performance (Far Cry games love CPU speed and memory speed).
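That back-of-envelope math can be sketched as a quick calculation. The ~2x speedup is the rough figure from the post above, not a benchmark, and the 90fps baseline is a hypothetical example:

```python
# Rough 4K-performance estimate for a 3090, extrapolated from
# 1440p results on a 1080 Ti. The ~2x GPU speedup is the loose
# figure quoted above, not a measured benchmark.

PIXELS_1440P = 2560 * 1440   # 3,686,400 pixels
PIXELS_4K = 3840 * 2160      # 8,294,400 pixels

def estimate_4k_fps(fps_1440p, gpu_speedup=2.0):
    """Scale FPS by the GPU speedup, then divide by the pixel ratio.

    Assumes the game is purely GPU/pixel-bound, ignoring the CPU,
    memory bandwidth, and per-engine quirks mentioned above.
    """
    pixel_ratio = PIXELS_4K / PIXELS_1440P  # exactly 2.25x the pixels
    return fps_1440p * gpu_speedup / pixel_ratio

# A hypothetical game running at 90 fps at 1440p on the 1080 Ti:
print(round(estimate_4k_fps(90)))  # -> 80
```

So the rule of thumb holds: roughly the same frame rates at 4K on a 3090 as at 1440p on a 1080 Ti, before any CPU or memory bottleneck enters the picture.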
Spork3245 Posted September 9, 2020 The 3080 is 3-4x more efficient at mining crypto (Ethereum) than the 2080. Goodbye, stock.
Brian Posted September 9, 2020 Mining is still a thing?
Spork3245 Posted September 9, 2020 33 minutes ago, ManUtdRedDevils said: Mining is still a thing? It never stopped being a thing.
DPCyric Posted September 10, 2020 Strictly for VR, two 3070s would be WAY better than one 3080, right?
Spork3245 Posted September 10, 2020

5 hours ago, DPCyric said: Strictly for VR, two 3070s would be WAY better than one 3080, right?

Not necessarily: it depends on whether the game in question supports SLI, what the SLI scaling is (some games get a 70% boost over a single card, others a 20% boost), and how much microstutter the SLI causes. I ran SLI and Crossfire for years; you're better off with a single GPU, IMO.
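A quick sketch of why that scaling spread matters so much (the 60fps baseline is a made-up single-card figure, and this ignores the microstutter problem entirely):

```python
# Effective frame rate from adding a second card under SLI,
# for the good-case (~70%) and bad-case (~20%) scaling above.
# The 60 fps baseline is a hypothetical single-3070 result.

def sli_fps(single_card_fps, scaling):
    """FPS with two cards, given per-game SLI scaling in [0.0, 1.0]."""
    return single_card_fps * (1 + scaling)

base = 60  # hypothetical single-card result
print(round(sli_fps(base, 0.70)))  # good scaling -> 102
print(round(sli_fps(base, 0.20)))  # poor scaling -> 72
```

Even in the good case you're paying 2x the price for ~1.7x the performance, and in the bad case for only ~1.2x, which is why a single bigger GPU usually wins.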
DPCyric Posted September 10, 2020

9 minutes ago, Spork3245 said: Not necessarily: it depends on whether the game in question supports SLI, what the SLI scaling is (some games get a 70% boost over a single card, others a 20% boost), and how much microstutter the SLI causes. I ran SLI and Crossfire for years; you're better off with a single GPU, IMO.

Ah, I remember reading somewhere that SLI could be really good for VR but didn't know where it went. Maybe a 3080 now and a second one when Squadron 42 releases 🤔
Spork3245 Posted September 10, 2020

1 hour ago, DPCyric said: Ah, I remember reading somewhere that SLI could be really good for VR but didn't know where it went. Maybe a 3080 now and a second one when Squadron 42 releases 🤔

I doubt you'll need 2x 3080s to power Squadrons in VR. What you read was based on the idea that each card would independently render one eye in VR via DX12's multi-GPU features. It was never implemented and apparently never will be, considering it's been 4-5+ years since it was last even talked about.
Mr.Vic20 (Author) Posted September 10, 2020

4 minutes ago, Spork3245 said: I doubt you'll need 2x 3080s to power Squadrons in VR. What you read was based on the idea that each card would independently render one eye in VR via DX12's multi-GPU features. It was never implemented and apparently never will be, considering it's been 4-5+ years since it was last even talked about.

Probably because the excited engineer who breathlessly offered up that idea discovered later that even a few milliseconds of desynchronization between the eyes would result in vomiting! So much vomiting, and headaches, oh the headaches!
Mr.Vic20 Posted September 10, 2020 I ran SLI for 4 years until I finally stopped punishing myself. The handful of users that still do defend it through storytelling that is less than truthful and cherry-picked examples of solid implementation. Overall, it's a massive pain in the ass, and if you want to take advantage of all that power, you have to wait until everyone else has played the game you're excited about while the SLI support gets ironed out. Nvidia themselves do not like SLI now and will largely ignore issues for months at a time.
Spork3245 Posted September 10, 2020 I mean, if you're willing to drop $1400+ on two 3080s, just get a 3090.
Ominous Posted September 10, 2020 Yea, I think the 980 was the last SLI I did, and it was a pile of shit most of the time.
Firewithin Posted September 10, 2020 Newegg has their EVGA pages up. I'm going to be up at 6am anyway, but it will be interesting to see how quickly some of these notification emails are sent.
JPDunks4 Posted September 10, 2020 When do they go on sale?
Zaku3 Posted September 10, 2020 17 minutes ago, JPDunks4 said: When do they go on sale? Thursday the 17th.
Spork3245 Posted September 10, 2020 40 minutes ago, JPDunks4 said: When do they go on sale? 22 minutes ago, Zaku3 said: Thursday the 17th. The 3080 is the 17th, 3090 is the 24th, iirc.
Zaku3 Posted September 10, 2020

5 minutes ago, Spork3245 said: The 3080 is the 17th, 3090 is the 24th, iirc.

Crap, forgot to mention that. Also expect low supply, so even if your heart is set on a 3090 I still recommend trying to get a 3080. I'm going for the 3080; if I can get it, great, and if not I'll go for the 3090.
JPDunks4 Posted September 10, 2020 Yeah, I may just get the 3080. I'm more interested in finally having an HDMI 2.1 GPU for my LG OLED. But if I want to appreciate the push for 4K120, I'd need the 3090.
Zaku3 Posted September 10, 2020

48 minutes ago, JPDunks4 said: Yeah, I may just get the 3080. I'm more interested in finally having an HDMI 2.1 GPU for my LG OLED. But if I want to appreciate the push for 4K120, I'd need the 3090.

I think I might get the 3080 and sit on it until the 24th. I will return it if I can grab a 3090.
Mr.Vic20 Posted September 10, 2020 1 hour ago, Spork3245 said: The 3080 is the 17th, 3090 is the 24th, iirc. I "coincidentally" happen to have that day off!
Ominous Posted September 10, 2020 3 minutes ago, Mr.Vic20 said: I "coincidentally" happen to have that day off! I have the 24th off too.
Mr.Vic20 Posted September 10, 2020 5 minutes ago, Ominous said: I have the 24th off too. Let's do this!
Ominous Posted September 10, 2020 4 minutes ago, Mr.Vic20 said: Let's do this!
Spork3245 Posted September 10, 2020 @Mr.Vic20
Mr.Vic20 Posted September 10, 2020 Jensen may keep his tacky ass kitchen and deceptive spatula arrangements!
HardAct Posted September 10, 2020

Gonna be so jealous of you 3080/3090 guys out of the gate. I'm so excited to see the professionals get their little hands on these things so we see real-world numbers, so exciting. I'm set to build a 4K/120-capable PC rig and just be happy for a long time. I'm hoping this is easy for the 3090 to do across the board, but we shall see. Then I can narrow in on my new TV to pair with it: it must be HDMI 2.1 (of course), be able to hit a 120+ refresh rate at 65", and G-Sync would be a great selling point.

Really interested to know how you guys feel about there being a Ti version of the 3080/3090. I mean, wow, what if there is still a higher-end card they're keeping under wraps? There has been since the 900 series, right? The cost though, ouch. Hoping to get 500-600 for my 2080 Ti and go for it 12 months from now! Really am interested in my "Ti" question though: do you think NVIDIA is holding these back for a later announcement (and the cost, of course)? No one is really talking about this, so maybe I'm just way off here. They said the 3090 was just a rebranding of their Titan cards, so that leaves the Ti cards still to be announced, and it really makes me excited to see just what a GPU with few limitations can do given where graphics demands are now. Great shit, love watching this stuff play out, as I can't be a Day 1'er this time around...
Mr.Vic20 Posted September 10, 2020

@HardAct, a couple of things make me not really worried about Nvidia dropping a Christmas surprise in the form of a Ti variant:

1.) The 3080 and the 3090 will both offer 60fps in 4K, if not higher frame rates, for a few years.
2.) DLSS 2.0/2.1 is gaining adoption, which will further extend the life of these cards.
3.) 8K doesn't make a lot of sense for almost all use cases right now, outside of VR and those special few who somehow want to sit 3-5 feet from a 77" display. (Note: this is not advised.)

So go right ahead, Nvidia; I'm not going to need that extra power for at least another year, by which time there will inevitably be newer, faster cards to fawn over. As for TVs, LG is still in pole position with their C series, although recent reporting indicates all models up through their latest 2020 units suffer from a general "greying" when attempting to use the VRR feature of the HDMI 2.1 standard. So the quest continues for the "perfect" 4K OLED for gaming. That said, I'm definitely not sad about my C9, it's damn near perfect!
stepee Posted September 10, 2020 So is the 24th when you should actually get the GPU if you are able to secure one?
Spork3245 Posted September 10, 2020 Just now, stepee said: So is the 24th when you should actually get the GPU if you are able to secure one? If you're purchasing from a physical location, yes.
Mr.Vic20 Posted September 10, 2020 1 minute ago, stepee said: So is the 24th when you should actually get the GPU if you are able to secure one? I have hopes of having one by my birthday in October!
Spork3245 Posted September 10, 2020 I remember when the 980 came out, I was able to order one from Newegg on launch day and also pick it up at their warehouse about 90 minutes from me, so I had it on launch day. Those were the days.
AbsolutSurgen Posted September 10, 2020 The plug placement on the Founders Edition is terrible.
Ominous Posted September 10, 2020 I kinda want to hook this up to my C7, but space in the living room is already at a premium.