
Spork3245

Members
  • Posts

    40,119
  • Joined

  • Last visited

  • Days Won

    98

Everything posted by Spork3245

  1. I'm talking about the number of settings that can be lowered and the amount of dynamic resolution that can intercede. But you did say that it's totally A-OK for the 2080 Ti to use lowered settings/resolution... which is my point that you're contradicting yourself regarding it not being okay for the XboneX. Hmmm, it's almost like you're just being obstinate and arguing for the sole sake of arguing. Nah, couldn't possibly be that! Oh, okay, you only have a problem with my sarcastic comments when I use your own logic to imply such a statement. Got it! Wow, it's almost like I never actually called it a 4k/30 machine and instead stated "If 4k 30fps is acceptable, might as well save money and get an Xbone X" in reply to your statement about lowering settings/resolution/framerate with the 2080 Ti
  2. I think the second revision of the PS2 slim switched to software. I remember there being major issues in games like MGS VR Missions, which were addressed in the next revision of the console, which, IIRC, used a slightly faster processor to better emulate those games, though it still had issues even then. (I think the PS2 slim with the sliding top?)
  3. You need higher than a 60fps average to maintain a solid 60fps. So, where's this arbitrary line you're drawing where the lowered settings/resolution are no longer acceptable on the X1X? Also, *cough* If you're going to call that a 4k60 GPU, as you did, this damn well should be considered a 4k30 console. All I'm asking is for you to be consistent.
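The claim above can be illustrated with a quick sketch (the frame times below are hypothetical, not measurements from any game): a 60+ fps *average* can still hide individual frames that dip well below 60.

```python
# Hypothetical frame times in milliseconds for five frames,
# including one stutter frame at the end.
frame_times_ms = [12.0, 13.0, 15.0, 16.0, 25.0]

# Average fps over the window vs. fps of the single worst frame.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)

print(round(avg_fps, 1))    # 61.7 -- comfortably above 60 on average...
print(round(worst_fps, 1))  # 40.0 -- ...yet the worst frame is nowhere near 60
```

This is why benchmarks report minimums/percentiles alongside averages: a card averaging exactly 60 fps is, by definition, spending time below 60.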
  4. Here’s the official nVidia tech demo that should eventually be available for download (I used to love running nVidia tech demos )
  5. PRINT MEDIA IS COMING BACK, BABY! Also, before you judge, keep in mind he was just fighting Mr. Freeze.
  6. Except that I haven’t, my stance on this has been consistent since the GeForce 256 era. You just think I have because you disagree with my position and would rather argue. If you think I’m contradicting myself because of the XboneX posts... jfc, the suggestion is and always has been sarcasm, used to show the folly of your diminished settings/resolution argument. I stated as much and said the argument for the XboneX being a 4k/30 machine sucks pages ago. So, it’s not lowering settings and using dynamic resolution to achieve a stable framerate? What’s it doing that’s different from your suggestion of lowering settings and using dynamic resolution to get a stable 4k/60 on everything currently out with a 2080 Ti? Gee, yet I’m the one contradicting myself. It has DisplayPort and HDMI, but comes with a DP-to-DVI adapter. The adapter you were initially talking about, in regards to my own monitor and what we once spoke with Xbob about, is not a digital-to-digital adapter, it’s an actual converter for changing digital signals to analog.
  7. I mean, you’re resorting to piggybacking on an actual response, so why not? Lowering settings and using dynamic resolution? BTW, please show me where I actually stated that the 2080 Ti was a “piece of kit” or even a bad card, as your quotation marks imply? My posts have been about my disappointment that it cannot do 4k/60 ultra in games releasing alongside it, and how I believe a $1k card should be giving me all the eye candy in current games, not that the card is a pos.
  8. Yea... that would def change my opinion of the ep... I really thought it was drawn censored.
  9. Plot-twist: these PS1 classic consoles are actually always online and just stream the games!
  10. Wait, what? Why in the world would the 2080 Ti only be HDMI? Furthermore, a digital-to-digital port conversion is simple and just a pass-through (i.e. HDMI to DVI-D, or DisplayPort to HDMI/DVI on a dual-mode DP++ output, should be just a simple connection adapter). Digital-to-analog requires an actual hardware/software-based conversion of the signal, which is where those expensive and lag-inducing converters come into play. I have a friggen CRT Sony GDM-FW900, bro. If you have a 16:9 monitor, it’s HIGHLY unlikely that you have an analog-based LCD monitor.
  11. Shhhhh, adults are speaking. Speaking of firm stances, you’re still okay with “sacrifices” for the 2080 Ti but disqualify the same sacrifices for the Xbone. Okay. I’m glad you believe in your own argument.
  12. ...because you did put words in my mouth and insinuated something regarding RT effects on older-gen cards that I never stated. I’m, uhh... sorry (?) you somehow found this to be “sassy” and reason to begin insulting me...? I’m not, though, my stance on these things has been consistent for years. I always granted these exceptions and said as much, using AMD and HairWorks as an example. *On cards that support it Which is why I wouldn’t recommend a 10xx series card be purchased at this point in time. Just like I wouldn’t have recommended a DX7 GeForce 2 after the DX8 GeForce 3 released. I do hold RT in its own performance spectrum, however, as I would likely state (again), once RT is available for use in games, that the 2080 Ti is practically/almost/close to a 4k/60 card in non-RT games and a 1080p/60 card for RT-capable games. I really fail to see what’s outlandish or a stretch in that statement. I’m not “adding exceptions”, a card cannot render hardware-based effects that it cannot render, thus how can I hold that against its capable resolution/fps? If the 1080 Ti were a new card that just released alongside the 20xx series and was from the same manufacturer (as it is), I would rip it apart as if it were the GeForce4 MX reborn, though. I never stated my opinion or stance on the matter was “absolutist” and did not have exceptions for certain effects. That’s something you assumed and then willfully dismissed my clarifications of with your insistence on your 1080 Ti example. Once I clarified, you accused me of “pleading” and now of “ad hoc” adding exceptions based on your own assumptions of what you think my stance is. Simply put, I do not hold the performance of unavailable features against what a card is actually capable of, especially if it’s a card made prior to said effect existing; however, I would also be unlikely to recommend a card for purchase that is missing said features.
If the 1080 Ti could do RT in RTX games then I would hold it against it, but, again, in regards to RT, I place it in its own category of performance simply due to its demanding nature and limited availability in games. I kinda-sorta doubt this.
  13. DVI-D is digital, DVI-I is both digital and analog and fits both DVI-D and DVI-A connectors. You’re fine.
  14. Actually, let’s just review for a moment. I legitimately thought that you were asking me to clarify my position and have been attempting to do so. You now tell me that I am “pleading” by responding to you with clarification of my posts, thoughts, reasoning and position on something I have been consistent on for the 7+ years I have posted here, even after your second insult to me, when I did no such thing to you prior to your attempted “gotcha”. Yet, now I am somehow “pleading” because your “gotcha” was just met with more clarification from me. I’ve never known you to act like an arrogant douchebag until now, but it’s good to know your actual character, I suppose. I eagerly await more of you attempting to talk down to me and a reply laden with lol and rofl emojis. Enjoy your dinner.
  15. I’m not pleading at all...? That’s exactly how I feel and have always felt. I never discredited AMD card performance for not being able to use GameWorks either. Are you just attempting to be a dick? Oh, okay, then.
  16. IIRC, he suggested a converter, and then also realized that they are terrible for gaming as they have input lag and max out at 1080p/60 Hz. My monitor supports 1440p at 80 Hz.
  17. Actually, since RT isn’t available I would not consider it part of the max settings on a card that doesn’t support it. In regards to RT performance, I would only compare it to other cards that support RT and specifically when RT is used. In which case the 2080 Ti becomes almost a 4k/60 card and a 1080p/60 RT card. This really isn’t hard to follow, especially since I previously stated that RT is a whole different story (twice). Go home, Legend, you need to reboot your OS and stop attempting BS “gotcha” moments.
  18. No, I never stated such a thing, though your attempt at changing my argument and putting words in my mouth is noted. I wouldn’t change the target performance/settings for an AMD card because it doesn’t support hairworks either.