

Posts posted by legend

  1. I was a backer and put a bunch of time into this over the weekend. It's got some technical issues, but overall I'm enjoying it a lot.

     

    The combat isn't hard (although maybe that's because I'm used to this kind of game?). The only battles I've lost are ones listed as "impossible," which I tried anyway :p Despite being easy, though, it remains engaging and enjoyable.

     

    I'm having a hard time gauging how long it is. I've already reached tier 3 skills, but I feel like I'm not far in the game. I only just did the first main boss and left the main area.

  2. 20 minutes ago, Spork3245 said:

     

    IIRC, there were benchmarks from a short while back (too lazy to google for them), and the performance between the 2500k and 4700k, when OCed to the same clock speeds, was within the margin of error. There’s been like no improvement to CPU architecture, for gaming*, until semi-recently. :/ 

    With AC:O, maxed, I average 55-ish fps in the benchmark regardless of 1080p or 1440p. :silly:  Probably the first game that concerned me with my CPU since switching to the i7.

     

    Yep, that's pretty consistent with my results! :lol: I could get right below 60fps with basically any graphics settings, but nothing I did would budge it past that.

  3. 9 minutes ago, Spork3245 said:

     

    Going from an i5 2500k (OC 4.5GHz) to an i7 3770k (OC 4.5GHz) was a night-and-day difference in some games, most notably in frame timing, micro-stuttering, and minimum fps. In a few, the average fps went up a decent amount too (Total War games, TW3, and a couple of others). I’d say you really want an 8-threaded CPU. Currently, I think the only game that really shows a CPU bottleneck for me is AC:O, as I get the same average fps (according to the benchmark) at 1080p vs 1440p.

    I plan on finally installing the 5960x that @Mr.Vic20 gave me almost a year ago (I know, I know!) in the next 6 weeks. I’m getting a new case, cooler, and possibly new PSU for it (which is one of the reasons I kept putting it off :p), so I’m 90% sure I’ll be doing a giveaway on my current i7.

     

    My current one is better than a 2500k (like 4700 series or whatever), but yeah, it's only 4 cores. I ultimately bailed on AC:O for other reasons, but I couldn't for the life of me get the fps to a stable 60 regardless of video settings, which is consistent with your experience of it being CPU bottlenecked.

  4. 11 minutes ago, Mr.Vic20 said:

    The funny thing about CPU bottlenecking is that it typically occurs at lower resolutions like 1080p; at 1440p and 4K, frame rates drop, and with them the typical in-game CPU workload. That said, DICE has stated that if you want to do ray tracing it carries a CPU cost on top of the GPU cost, and that in their initial testing they think the current sweet spot is 12 threads, so they can "lightly" distribute the workload across as many cores as possible. So i7s or something from AMD is likely in your future! 

     

    That means I may need closer to a whole-PC upgrade, since I'll need a new mobo, and then RAM as well.

     

    Well, if I upgrade at the end of this year, ~5 years give or take isn't a bad run for a PC I originally built on a mid-level budget.

  5. 18 hours ago, Brick said:

     

    Let's hope. Not exactly sure what else I'll need. I know I need to upgrade my CPU, but I wonder if I'll need a new motherboard too, and I'll have to do the math to see if I'll need a higher-wattage PSU. 

     

    I'm kind of worried about my CPU bottlenecking things too. It's a high-end i5 from the end of 2013, which is a bit old. At the same time, Intel hasn't exactly been making huge strides, game CPU usage has been seriously held back by the consoles' pathetic CPUs, and the trend of this new line is to further offload work from the CPU to the GPU. So it might be okay?

  6. 21 hours ago, CitizenVectron said:

    I went from a 780ti to a 1080ti, so I should be safe until a 2280ti, I guess. Although with the way they keep increasing prices, I am not looking forward to the $2499 it will cost.

     

    I would expect (fuck, I hope so) that prices come back down to the still-high, but not crazy-high, Nvidia prices of yore after this line. Adding so many new features with a major architecture change, while also delivering the standard raster improvements, can't come cheap. But next round it will be more old hat.

  7. :lol: 

     

    @Brick if you're willing to drop $1500 CDN on a card, then this is probably the best time to pay the sky high Nvidia price. However, if you want to feel a bit more sane about it, maybe wait until at least one game with DLSS or some other major feature is released and people get to play with it to confirm.

     

    I'm pretty confident we know what the card is at this point, but you'll feel better with that extra bit of confirmation. That probably means waiting no longer than the end of this year, so it's not a huge gap.

  8. 2 hours ago, sblfilms said:

     

    So I just spent 10 minutes looking through his tweets at some of the responses and I feel some actually solidify the “interpretation” that Oz was giving essentially an “I don’t see color” claim. I think this is particularly clear in an exchange with one gay person where Oz says something to the effect of your sexuality isn’t your only defining characteristic.

     

    The error here is that society does largely define individuals in marginalized people groups by those characteristics! This is why people bang the drum of representation for marginalized communities. Notwithstanding his personal failures, it was a societal good for the Cosby Show to show a Black family in which both spouses were highly educated and financially successful in their fields, a Black husband who loved his wife and was faithful to her, a Black father who raised his children well.

     

    I don’t think Oz was wrong, just as the “I don’t see color” people aren’t wrong to feel the way they do, but it is a privilege not to be concerned about your race or sexuality or nation of origin or whatever else about you is an immutable characteristic that society as a whole places as less-than to some degree.

     

    I agree that telling someone they have other characteristics is much closer to the "I don't see color" analogy; I must have missed that one. But that doesn't seem to be his primary claim. He seemed receptive to the importance of gay representation, but held that it simply isn't the case for B&E and that we shouldn't feel the need to label them or know their sexuality.

     

    So maybe he is guilty of it while simultaneously pushing a more sensible position.

  9. 18 minutes ago, Mr.Vic20 said:

    I think there are a few camps on this subject, and I believe I get the view of each. I'm all about progress in tech, but I understand how the historical unwritten contract of "this costs X dollars" is causing most budget-conscious gamers to say:

     

    [animated gif: "this deal is getting worse"]

     

    However, I prefer to see it as:

     

    [animated gif: "my god, it's full of stars"]

     

    People disappointed with the 2080 might well become a fair bit more interested in the card once the DLSS support is there. Linus Tech Tips has made the point that this series seems rushed, and at least from a software perspective, I can see that. But I can also look forward to Q1 of 2019 and think: these features will get some solid buzz and pick up rapidly! 

     

     

    :lol: Yeah I totally understand people on a more sane budget for gaming balking at this price. But fuck it's exciting and those of us willing to spend that kind of cash on this kind of stuff should!

  10. 5 minutes ago, crispy4000 said:

    I'm probably jumping the gun here, but since that video said that lower resolutions work too ... what about DLSS in a Switch successor?

    I can dream.

     

    Depends, I suppose, on how fast Nvidia can miniaturize this architecture with low power demands, and how soon Nintendo might put out a successor.

     

    There is hope that miniaturization won't be too far off on Nvidia's side, at least with respect to tensor cores taking a bigger slice, because Nvidia has lots of orthogonal incentives for that. For example, there is already the Jetson Xavier, a small low-power SoC that boasts tensor core support.

     

    That product's primary role, as you might imagine, is for those devious individuals trying to get deep learning power onto robots and sensors :isee: But it also means it could be easy for them to provide something for portable devices like a Switch successor.
