
legend

Members
  • Posts

    30,120
  • Joined

  • Last visited

  • Days Won

    1

Posts posted by legend

  1. 1 hour ago, Reputator said:

     

    This statement contains a bunch of assumptions about the future you literally can't make. "Remaining top tier" isn't an argument in favor of buying a new graphics card. I don't care how much it titillates your I/O ports.

     

    I don't know that it's the future, no. But I'm pretty damn confident that staying with pure raster isn't going to lead us anywhere great, and it's quite surprising that you think the status quo is okay and headed anywhere but a dead end.

     

    It would also be surprising to me if the future of graphics goes somewhere that's neither ray tracing nor raster. In all these years, ray tracing hasn't gone away. No different approach has been embraced, unless you count hybrid approaches, which is exactly what Nvidia is doing. At a certain point, the approximation model of rasterization hits a wall.

     

    And if the community isn't willing to support a solid take on something new because there is any uncertainty at all, then we have a really long, shitty road ahead of us.

     

    Quote

    No one was disappointed at the performance of the 10 series. Prior to that, we were stuck on 28nm for three generations, so there was some stagnation, albeit out of the hands of any IHV. Aside from you, however, I've not heard anyone lament the stagnation of features.

     

    I'm not even talking about the 10 series alone. I'm talking about the overall trend of graphics. That means PC GPUs and console generations, which run on longer intervals (and which these days are closely tied to what PC GPUs can do). I honestly don't know what to tell you if you think people haven't been expressing a sense of diminishing returns. I see it rather frequently.

     

    Quote

     

    You have a very utopian concept of the role of consumers. Be realistic. Few people have the money to throw away on an investment in the future with very few short-term gains. Most of us aren't venture capitalists. We don't "have to" encourage anything if NVIDIA and the developers fail to give us incentives beyond empty promises. If you think otherwise, I have some snake oil you might be interested in.

     

    I've previously said that if people can't afford those prices, I sympathize with that entirely. And who said anything about "throwing away" the money? My very argument is that there is a long-term benefit here that people should be incorporating into their decision making.

     

    Again, you keep appealing to "have to" arguments when I'm very explicitly rejecting that entire line of thinking, so I don't know why you keep returning to it. Ignoring future consequences is, definitionally, myopic.

  2. 13 minutes ago, Reputator said:

     

    You'll be confused at any community disappointment if performance remains virtually the same, for the same cost? You GREATLY overestimate how much people care about the potential of raytracing.

     

    Let's be precise: I'll be confused at the community disappointment if performance on conventional games remains top tier while the cards provide a major advance in architecture that will yield some improvements now, and ultimately much greater ones.

     

    I don't think I'm overestimating anything. The stagnation of raster-based tech is immediately apparent. Each of the last two generations has left many people disappointed at the graphical progress being made. That's because we're running out of room for what we can do with conventional tech. We need to change it up or things are really going to stagnate.

     

    And it's not just ray tracing. Having strong DL capabilities on board is going to have ramifications for gaming beyond up-resing to compensate for limited rays (or for resolution in general). This tech is moving faster than I would have expected, but with hardware like this, there are some important ways in which DL can be useful for gaming and gaming graphics: for example, intentional, physically based animation (a tech important not just to the resulting actors in the game, but to streamlining the whole animation pipeline).

     

    Quote

    Again, purchasing a first generation of an unproven technology because it will encourage the technology to advance isn't the job of consumers. Bring the horse, then you can have the cart. It's not on us to assume the risk, which given the pricing, would indeed be the case for all but the top-end 2080 Ti.

     

    This is a next-generation product. New features are nice, but improving on the performance of last-gen cards is expected, not a bonus.

     

    "It's not on us" isn't a useful way to frame things. No one is "obligated" to do anything ever, so nothing is ever "on someone" to do. It's a simple question of "does supporting this encourage the future I want to see?"  If you do not think this line of tech is useful, that it doesn't represent a future we should want to see, then we can have that discussion, because we may disagree. But it's simply true that you have to encourage what you want to see more of independent of any "obligation" or who it's "on."

  3. 10 minutes ago, Reputator said:

     

    I did see that post and thought I had addressed it, along with others. I'll try to clarify, though.

     

    "Worse performance for the money" is an evaluation based purely on what these cards will do for games the moment they are released. I'm quite deliberately and explicitly making an argument that part of the reason we should get on board beyond the immediate value and and extra bonuses will see in near term is to (1) encourage this hardware development track further; and (2) encourage software developers to work out the kinks by providing an audience for them to develop to.

     

    Indeed, I have also explicitly stated that a more traditional raster-focused card, as you've suggested as an alternative, would be far more disappointing to me.

     

    As far as a trade-off goes, I would be more sympathetic to the argument if these cards perform substantially worse than the 10x line. In fact, if 3rd-party reviews reveal that the RTX line really takes a meaningful hit compared to the 10x line, I'll be more understanding of people having further hesitation. Depending on the hit, I might wait as well!

     

    If they end up still being top of the line raster cards though, I will remain puzzled at any community disappointment. I don't think we can ask for better than maintaining existing parity while introducing a whole new tech. Hitting that parity at all is pretty great.

  4. 3 minutes ago, CastlevaniaNut18 said:

    I don't care to get into a semantics argument, but the pricing is fucking ridiculous. 

     

    I'm not playing semantics; I'm trying to understand the root of where we disagree. If by "price gouging" you purely mean "expensive," then sure, I agree with you! If you mean it's unjust to ask that much, then I do disagree with you and want to know why you think that.

     

    2 minutes ago, Reputator said:

     

    No, consumers don't spend money purely based on an ideal. Telling people they need to buy these cards, whether or not they get their money's worth out of it, because it will encourage worthwhile products later literally makes no sense.

     

    This has nothing to do with an "ideal." I'm not using "principle" to mean aspiring to an aesthetic ideal. I mean it's a fundamental aspect of the mathematical machinery of decision making in populations of agents that affect each other. It would be a categorical mistake to pretend that this dynamic doesn't exist.
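
    To make that concrete, here's a toy sketch in Python (the model, the numbers, and the simulate function are entirely my own made-up illustration, not any real market data): each agent's adoption decision feeds the vendor's investment, and that investment feeds the value of the tech everyone gets later.

        # Toy model of the feedback loop (all numbers are made up for
        # illustration; this sketches the dynamic, not a real market).
        def simulate(adoption_rate: float, generations: int = 5) -> float:
            """Value of the new tech to consumers after several cycles."""
            value = 1.0        # arbitrary starting value of the tech
            investment = 1.0   # vendor's initial R&D commitment
            for _ in range(generations):
                # Vendor invests in proportion to last generation's adoption.
                investment *= 0.5 + adoption_rate
                # Value to consumers compounds with sustained investment.
                value *= 1.0 + 0.2 * investment
            return value

        print(simulate(adoption_rate=0.1))  # low adoption: the tech stagnates
        print(simulate(adoption_rate=0.9))  # high adoption: value compounds

    The point isn't the specific numbers; it's that when decisions feed back on each other like this, "just buy whatever is best right now" is not a complete decision rule.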

     

    2 minutes ago, Reputator said:

    If you weren't ignoring my earlier responses you'd already know.

     

    I wasn't ignoring anything. If you think you provided a salient point that I have neither read nor considered, please point it out.

  5. 7 minutes ago, CastlevaniaNut18 said:

    I love PC gaming. I can afford it. 

     

    Doesn't mean I'm willing to throw away that much money, though.

     

    There are gradations to this. If you're already on something like a 1080 Ti, I understand not wanting to drop cash all over again. What I don't understand is the lack of excitement for these cards. What do people want Nvidia to do otherwise? Making big but important changes like this isn't going to be cheap, but I can't think of anything better to encourage.

     

    A regular upgrade to a standard raster card in the same price range as the last set is not exciting, and I'd be quite disappointed to continue to be stuck in that zone.

     

    But if we don't want to be stuck there, we have to put our money on the future we want to see.

  6. Yes, absolutely. We've been on a run of diminishing returns in graphics for some time now. The only way out is to break free of old standards. Nvidia's full embrace of a combination of raster + ray tracing + deep learning is a very promising direction to take.

     

    No, it won't be perfect out of the gate, but you're still getting top of the line cards and as a community we ought to be encouraging this direction for our hobby.

  7. 5 minutes ago, Spork3245 said:

     

    Gsync is technically superior to Freesync (no clue about 2.1's VRR, but it's likely the same case) as it's a hardware solution vs. a software solution, but the difference is so small it takes a 600fps camera to capture - though that fraction of a fraction of a fraction of a millisecond will certainly be nVidia's reasoning for keeping it around, at least on monitors. :p The other issue is that we don't know if monitors will adopt 2.1 VRR, as they are primarily DisplayPort and I'm unaware of any VRR standard coming to DP.

    To your other point, I would highly doubt that nVidia has included a 2.0b-2.1 hybrid HDMI out on the 20xx series like MS did on the XboneX and Samsung did on the Q9 without making a huge deal about it (both MS and Samsung made a bunch of noise about their inclusion... and this is nVidia we’re talking about! :p )

     

    I suppose they could then make a case, for the time being, for sticking with Gsync on monitors, since there's no standard there, while supporting 2.1's VRR for TVs. It would still be ridiculous, though, because requiring such special hardware is a terrible plan and I really wish they would do away with it.

  8. 7 minutes ago, Spork3245 said:

     

     

    It'd be totally fine if they stuck with GSync for monitors and DisplayPort, especially since HDMI VRR =/= Freesync. I am highly concerned about HDMI 2.1 compliance on future cards, however, especially since these cards really should have been 2.1 compliant, given that nVidia was the first to adopt 2.0b compliance to bring 4K 60Hz HDR to PC.

     

    It's perfectly fine for Gsync to coexist with 2.1 VRR, but Gsync seems completely pointless once they support 2.1 VRR, so I'm more worried about them exclusively supporting Gsync for VRR.

     

    As we found in another discussion, you don't need to be fully 2.1 compliant to support 2.1 VRR, so lack of full 2.1 support is at least not a blocker there. Although if you want other 2.1 features, then yeah, that might be an issue.

  9. Beyond the silliness of the company's hype machine, I think we should also praise the company as a whole. They're ultimately making a large business bet to push the market forward.

     

    It comes down to this: if we want the market to embrace this change, then we should be okay with paying the cost of getting there.

     

    Here's a world that would suck, IMO: the market doesn't embrace the new cards, Nvidia goes back to only pushing on raster, and it takes far longer to get to this next generation of architecture, both because Nvidia has stopped focusing on it and because developers have no audience to justify working on the software end.

     

    All things considered, maintaining top-end raster parity on their top-end cards while introducing a whole lot more is a commendable job. I don't think we can really ask for better than that, so if we want to see more of this, we should reward the company for trying.

     

     

    This is an occasion where it really would be better if more people adopted Vic's compulsion to buy the newest, greatest hardware :p

  10. 20 minutes ago, Mr.Vic20 said:

    This is the double-edged sword of Nvidia's approach. On the one hand, they're promising what no one else is: the so-called "holy grail" of graphics, ray tracing (though it's not a 100% true solution, but rather a very convincing alternative). On the other hand, they have done what Nvidia does, namely:

     

    1.) Wasting 90 minutes of our lives explaining graphics technologies to people who already have a decent understanding of said tech.

    2.) Promising the moon!

    3.) Using charts that mean very little at best and are outright lies at worst!

    4.) And price gouging the consumer with the impunity of an arrogant Sony back in the day. 

     

    Now that the dust has settled on Nvidia's reveal, the gaming "press," which quite frankly spent the last month writing breathless pieces of hyperbole like a rapid-fire machine gun, is only now turning the corner and asking some practical questions. This is not their fault specifically, as Nvidia is handing out only PR crap, if anything. But still, the backswing on the initial excitement will be brutal in the coming weeks. With prices sky high and the consumer skeptical as all Hell, there aren't too many good narratives likely to emerge between now and mid-September when these cards launch.

     

    My unproven take is that the 2080ti is likely to perform at or around the speed of the Titan V in non-RTX games. In RTX games, I suspect I'll have to downshift from 4K to 1440p to pick up the use of this new feature. So I guess it will be time to truly put the concept of "prettier pixels" vs. resolution to the test.

     

     

    This is my take as well, and frankly, I'm okay with it, because where they have delivered, they really have. They're making an enormous shift in architecture and developers are just now starting to use it. It's not going to deliver super high performance on the new stuff immediately. As long as it's not a losing proposition for standard pure-raster methods (meaning slower than existing GTX cards), there's really no trade-off. Maybe if AMD were able to compete and were pushing out even better raster cards, there'd be a trade-off to consider. But there's not. It's still going to be the fastest raster card on the market, and it introduces a whole new world of rendering with a wildly different architecture.

     

    I think we *should* be rewarding Nvidia for this. Despite being king, they haven't rested on their laurels. They've put in the work to start the next generation of graphics and are pulling it off. People shrugging off what they've done is odd to me.

  11. 7 minutes ago, Bjomesphat said:

    Why are people still buying games on day one anyway? They're just going to get discounted in a few months, sometimes by 50%.

     

    The only games I pre-order are Nintendo games that I need to play on day one. Because I know they're not going on sale for a long time.

     

    Because they want to play sooner, and are willing to spend the money to do so? :p

  12. 2 hours ago, Kal-El814 said:

    As ever, it seems as though people discussing movies on the internet believe they live on the same figurative planet as the rest of the movie viewing public. They do not. Most people aren't sitting through 10 minutes of credit reel to see the extra scenes, fewer people than that know who the hell Captain Marvel is, fewer people than that know how many movies Chadwick's deal was for, etc.

     

    A buddy of mine watched Man of Steel for the first time the other day (pray for him) and despite being a dude on the internet who enjoys superhero movies... he didn't know it wasn't a sequel to Superman Returns. He's not an idiot. He just likes movies enough to watch them and go to the theater and aside from that spends exactly zero time or energy thinking about them. This is how most people watch movies.

     

    So to say...

     

     

    A decent number of motherfuckers who go to EVERY ONE of these movies couldn't tell you why Superman and Spider-Man won't cross over. Marvel Studios and WB are about as familiar to them as Paramount and Universal... they're logos people see before movies that mean, essentially, nothing.

     

    So... yes. Impossible as it may seem here, some people finished Infinity War thinking that was the "last one" or at the very least that everyone that died is gone forever.

     

    1 hour ago, sblfilms said:

     

    More accurately, they left the movie not thinking anything about future movies, because they never think about a next movie. I can't quantify it, but it was not at all an uncommon reaction, when I would chat with customers leaving the movie, that they were surprised to find out this was really just part 1 of a 2-part film. When the next film begins its advertising blitz, that is often the first time they think about it at all.

     

    1 hour ago, sblfilms said:

     

    Yes, there are people who leave the theater after movies like FOTR and have no conception that there is more to come and they don’t even think about it.

     

     

    Well in that case

     

    giphy.gif

     

     

    Who am I kidding? I haven't wanted to for a long time for much more significant reasons!

     

    Still...

  13. I absolutely loathe the idea that wages for some jobs are expected to come from tips.

     

    That said, it's only for that kind of job that I feel compelled to tip, unless it's change I don't care to have in my pocket and I for some reason paid with cash.
