1. A new hardware-smothering ray-tracing setting for CP2077, according to the developers; unreleased to the general public from what I can gather.

  2. That's exactly what I said. They're showing their hand. The GPU isn't better... it's just better at faking it.

  3. Could be they tried the Skyrim Aldrnari modpack or Cyberpunk 2077 OR a modern racing game that for some reason renders the entire map even when you are not looking at it.

  4. From what I’ve heard, it increased the bounce lighting from a single bounce to multiple (like Metro Exodus Enhanced Edition), added full-resolution ray-traced reflections, and now all lights emit ray-traced lighting.

  5. You're right, but at least they compare it to other current models, not like comparing an iPhone CPU to one from three years ago and saying it's 1.5x faster. Like, duh.

  6. Everything was big in that premiere. The scripted performance. The price tag. The card itself. Jensen's head. Jensen's wallet. Jensen's new superyacht.

  7. I don’t know why people other than journalists even pay attention to these things. They’re always nonsense. Just wait for honest reviews that look at things objectively. The price alone is putting me off Nvidia; I’ll wait to see what AMD is coming out with.

  8. I just got my 3080 yesterday, and the first thing I did was test Cyberpunk at 1440p, everything on ultra with ray tracing at the highest level and DLSS on, and it averaged 80 FPS, so I think I’m good.

  9. Personally, I like High settings, medium RT, and DLSS Quality/Balanced on my 3080. I’m on 1440p. I couldn’t really tell the difference between the RT levels besides analyzing puddles for reflection clarity.

  10. I don't use ray tracing because my RX 6750 XT isn't that good at it... but it runs perfectly fine with high/ultra settings at 60 fps with the card completely maxed out, haha.

  11. I suspect rasterization performance will be the 4000 series' true test on release. Sure, they can boast twice the performance in SOME titles, but let's see the overall performance. I ain't buying the 4000 series because my confidence in Nvidia is no longer there. Hope they lose enough money that they reduce pricing and become more consumer-friendly.

  12. Yep, especially as DLSS 3.0 needs to be trialed by fire first. The numbers seem like fluff framerates (as usual with the interpolation methods used on TVs) that have no practical effect on responsiveness.

  13. With the "4070 +10" being $899, and DLSS 3.0 looking like fluff-fps shenanigans, I too now feel good about my 3060 Ti.

  14. How does this DLSS stuff work? If a game already supports DLSS 2, will it also be able to utilize DLSS 3, or are those different data sets? And if it takes that much extra work to keep up with each new generation of DLSS, how good are the chances that all the studios will implement it?

  15. DLSS 3.0 uses frame interpolation to basically fake a higher fps while using AI to make it look good. It uses a new and improved type of core (forgot what it's called) to allow this, meaning it will be exclusive to RTX 4000 cards. DLSS 2.0 is a different technology, which works on any RTX card.
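
A toy sketch of that distinction, under the simplest possible model (the `upscale` and `synthesize_midframe` callables below are hypothetical placeholders for the two techniques, not NVIDIA APIs):

```python
# Toy model of the two presentation loops being compared above.
# upscale() and synthesize_midframe() are hypothetical stand-ins,
# not real NVIDIA APIs.

def dlss2_loop(render_low_res, upscale, display):
    """DLSS 2-style super resolution: every displayed frame was rendered,
    just at a lower internal resolution before the AI upscale."""
    while True:
        frame = render_low_res()      # render below native resolution
        display(upscale(frame))       # AI reconstructs a native-res image

def dlss3_loop(render, synthesize_midframe, display):
    """DLSS 3-style frame generation: every other displayed frame is
    synthesized. The real frame is held back until the in-between frame
    has been shown, which is where the input-lag worry below comes from."""
    prev = render()
    while True:
        curr = render()
        display(synthesize_midframe(prev, curr))  # AI-generated frame
        display(curr)                             # the real rendered frame
        prev = curr
```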

  16. I used DLSS when I played Icarus and it gave me like a 30% bump in fps. The only artifact I'd sometimes see was tree leaves occasionally looking weird, but you had to really look for it. I always keep it on unless it's causing problems.

  17. Turning on DLSS to improve performance is kind of cheating the benchmarks because it's not half as lossless as they say, and it certainly doesn't improve quality unless you're already rendering above native (and why would you do that if you needed more performance?). Marketing BS.

  18. This is interesting. I haven’t noticed a loss of quality turning DLSS on, but it definitely looks different; sometimes it looks more zoomed out or something, it’s hard to explain. It helps me in Warzone, for example, but I see what you mean about it not really looking better. Quality DLSS is good but usually gives much less of a performance gain than Balanced. Without DLSS I can get 140-ish fps, and with it the image looks different, almost like a grainy (not the right word) 4K picture, but I get a few more frames no matter which mode I choose.
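
For context on the Quality-vs-Balanced trade-off mentioned above, here is a small sketch using the commonly published per-axis render scales for the DLSS 2 modes (individual games can override these, so treat the factors as approximate):

```python
# Commonly published DLSS 2 internal render scales (per axis).
# Individual titles can override these, so they're approximate.
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.580,
    "Performance": 0.500,
    "Ultra Performance": 0.333,
}

def internal_resolution(native_w, native_h, mode):
    """Resolution the GPU actually renders before the AI upscale."""
    s = DLSS_SCALES[mode]
    return round(native_w * s), round(native_h * s)

for mode in DLSS_SCALES:
    w, h = internal_resolution(2560, 1440, mode)
    print(f"{mode:>17}: {w}x{h} -> upscaled to 2560x1440")
# Quality renders ~1708x960 while Balanced renders ~1485x835, which is
# why Balanced buys noticeably more fps than Quality at 1440p.
```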

  19. I'm blind as fuck in that I can't notice a properly set up 144 Hz display in games, yet I can see DLSS making things look worse, and RTX looks like a gimmick. Radeon seems like the perfect card for me: I'd rather have the bigger VRAM than this shit I won't use.

  20. DLSS 2.0 is here to stay, and so are other AI upscaling methods; in practice they work very well for single-player games.

  21. I mean, I get the need for new and improved tech every year, but mid-to-high-end RDNA 2 and Ampere have more than enough performance to dominate any game I want to play right now, and most likely any game coming in the next few years. I'm on a 2060 right now and it’s getting a bit slow (especially on my 3440x1440 display), so I’m going to upgrade to either a 3070, 6700 XT, or 6800 soon.

  22. 1070 here, and newer games sure stress my system, but I’m capable of playing them on low settings. A 4080 costs more than my entire PC, so it’s really difficult for me to justify the insane price. I'll go with a 3070 or 3080 before I jump ship to AMD, unless they offer something competitive and reasonably priced. Nvidia can suck it.

  23. My problem is: why do we need this tech to hit high fps at that fucking price? I don't want my game to be smudged; I want it to run natively.

  24. Yeah. If your game can't run native on a 600-watt GPU that costs as much as a kidney, perhaps the game needs to be optimized lol

  25. This is the old '90s technique of showing graphs and putting a sticker on something and claiming it's faster! DLSS off, RT on is slow... NO FUCKING SHIT. My concern is that they're only getting 22 fps at native; that means the core power isn't any better than the 30 series. They're just doing better upscaling.

  26. Well, take half the frames, since DLSS 3.0 generates half of them with a lot of approximations; so it's really a 22-to-48 FPS boost.

  27. It's actually more like 22 fps to 62 fps with DLSS 2.0, then 62 to 92 with DLSS 3.0's frame generation (which actually looks really good).
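
A quick back-of-the-envelope on those figures, under the simplifying assumption that frame generation inserts exactly one synthesized frame between every pair of rendered frames (the fps values are just the examples quoted above):

```python
# Under a 1:1 mix of rendered and AI-generated frames,
# displayed fps ~= 2 * rendered fps (ignoring generation overhead).
def rendered_fps(displayed_fps):
    """How many of the displayed frames per second were actually rendered."""
    return displayed_fps / 2

print(rendered_fps(92))  # 46.0 rendered fps behind a "92 fps" readout
print(rendered_fps(48))  # 24.0 rendered fps behind a "48 fps" readout

# Note: the quoted 62 -> 92 jump falls well short of a clean doubling
# (62 -> 124), which hints the generation pass itself costs render time.
```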

  28. That's the only thing I don't like, because it depends so much on what the development team implements. I can't say the interpolation is always good; look at Spider-Man (if I'm not wrong, they launched a patch to make it better), but still, you can see it can be really shit. And presenting that as actual fps is a shitty strategy aimed at people who don't know how it works, and it's lame.

  29. Last week I got my RTX 3080 Ti for good money, and I thought I might be making a stupid purchase, but seeing all the 4080 shenanigans I don't feel sorry at all.

  30. Wtf is with DLSS 3.0 creating frames between frames... That literally means the GPU is rendering a frame and then delaying it while it creates another frame to go before it. I think the input lag is gonna be baadd.
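
That hold-back can be put in rough numbers. A simplified sketch (my assumption: the real frame waits somewhere between half and one rendered-frame interval; this ignores the generation pass's own cost and any mitigation such as Reflex):

```python
# Rough latency model for interpolation-style frame generation: the newest
# rendered frame is delayed so the synthesized in-between frame can be
# shown first. Simplification: ignores the generation pass's own cost and
# any mitigation; frames_held is an assumed 0.5-1.0 frame intervals.
def added_latency_ms(rendered_fps, frames_held=0.5):
    """Extra delay in ms from holding the real frame back."""
    return frames_held * 1000 / rendered_fps

print(added_latency_ms(60))        # ~8.3 ms at 60 rendered fps (half frame)
print(added_latency_ms(60, 1.0))   # ~16.7 ms if a full frame is held
print(added_latency_ms(30, 1.0))   # ~33.3 ms -- the lower the real fps,
                                   # the worse the penalty feels
```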

  31. It's like Stadia: that demo with the huge input lag pretty much killed them overnight.

  32. I think they're purposely using CPU-bottlenecked scenarios to show how powerful their CPU-frame-bypass thing is, or whatever it is.

  33. What I really wanna know is how bypassing any CPU information is actually a good thing. What if you press a button and those 3 frames don't register it? Isn't that 3 frames of lag, even if technically input delay is unchanged?

  34. It's got about 50% more CUDA cores than a 3090. That's a massive hardware improvement, but it's going to take a while for the software to catch up.
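
For the record, the published core counts put the gap slightly above 50%:

```python
# Published CUDA core counts for the two flagships.
cores_4090 = 16384   # RTX 4090
cores_3090 = 10496   # RTX 3090
print(f"{cores_4090 / cores_3090 - 1:.0%} more CUDA cores")  # 56% more
```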

  35. If Ada's performance per watt were so much better than Ampere's, why would they push it to 450 W? Makes no goddamn sense.

  36. So much BS. They really compared performance with DLSS on; I expect real performance to be like 20% more than the previous gen at best.

  37. DLSS is a funny scenario for Nvidia: they literally have to compare themselves to themselves, and by breaking that symmetry they essentially have to make themselves look bad in one place or another.

  38. What, you don't like graphs with fewer than two scales? You don't like absolute numbers with nothing to relate them to?

  39. All of this power isn't too useful right now, but I'm hoping that in the future we get things like super ultra sampling. If you've ever used an emulator to play a Super Nintendo game, there are a few different filter and sampling modes you can use, and they literally make the game look like it was designed for modern displays instead of a CRT television. It's like magic, and in the future we could have super ultra sampling of actual polygons, vertices, etc. for current and previous generation 3D games. You could take a game like Fable 1 and make it look as good as Elden Ring or better, with all of the geometry, polygons, and lighting being subdivided, upsampled, super-antialiased, filtered, etc. in real time. Once we have GPUs with so much power we don't know what to do with it, this will be a very real possibility.

  40. VR would be a great use of GPU power, too bad Nvidia (and probably AMD as well in their announcement) are acting like it doesn’t exist because it’s not ‘hot’ anymore despite more people owning headsets than ever.

  41. They're faking performance increases by not using DLSS on last-gen cards but using it on current-gen cards. "2-4x performance boost" is only comparing with those metrics, turning RT on max and leaving DLSS off on older cards while having DLSS set to max performance on newer cards. It's sketchy at best and false advertising at worst. Nvidia being scummy is nothing new though, expect stuff like this to get worse from here on out.

  42. Team red here… come on in, the water's fine and the frames are high without needing any black-magic fuckery!

  43. The patch helped, but the game is still pretty unoptimized. I wonder why they use it; it must have the only real implementation of ray tracing.

  44. It's like we watched a guy walk on stage, drop his pants, and proceed to act like nothing was happening... It was fun to watch, if nothing else. Ignore the 4000 series.

  45. They're trying to force adoption of their technology by purposely lowering gaming performance when you're not using it.

  46. At some point, the snake oil of engineered obsolescence will hit critical mass and the jig will be up for Nvidia. We'll no longer buy the need to upgrade just because they tailor-made one for us.

  47. The 4090 is not worth buying. You can't really see any difference going from already extremely high settings to even higher settings.

  48. It's literally confusing on purpose: if you don't understand it and they have the better stats on the graph, you'll automatically assume it's better, regardless of how relevant the information even is. Stupid marketing bullshit designed to misinform the consumer.

  49. You know, I decided to upgrade from a 2070 to a 3080 and it was nice, but now I don't think I'll upgrade again for a long-ass time.

  50. Your 3080 Ti will run every new game at the highest settings for years to come. Heck, my 2080 Super barely has any problems with games at 1440p, although I do have a 10900K, which helps.

  51. They're just trying to make the 40 series look better even though no one needs the card any time soon, beyond rendering things for, say, VFX. Games aren't going to produce that level of visual detail for a few years to come, unless studios specifically want to target people with these cards over people using just a 10 or 20 series card like myself.

  52. I knew Nvidia would eventually pull something like this AGAIN: trying to make a certain feature exclusive. Just think if AMD weren't around. Nvidia would've charged extra for the FIRST version of DLSS. Well, they're doing it now anyway.

  53. Honestly, what even is a good GPU these days? I want one I won't have to replace for like 5-7 years that can run pretty much anything, and this 4000 series seems sucky.

  54. My prediction is that these new cards aren't gonna be much better than the 30 series when we're talking about rasterization performance.

  55. The team making these slides must hate their jobs, knowing they're producing absolute trash whose only purpose is to serve as noise until the first reputable reviewer gets their hands on the card.

  56. It's due to DLSS 3. From what NVIDIA states, DLSS 3 not only improves FPS, but also adds frames using AI learning.

  57. I always hate how Nvidia presents their hardware... why can't they just be like AMD, showing real benchmarks and numbers? "2-4 times faster"... wow, how precious.

  58. What has me peeved is DLSS 3 being locked to the 40 series. Don't give me that improved-architecture BS; I'm sure they could make it work on 30 series cards, even if it were less efficient and less of a boost. It's clear they want to push people toward 40 series cards that are already at scalper-level prices.

  59. They should not be able to market that lower-specced 4080 under the name "4080", especially since the only difference on the packaging is going to be the lower amount of memory. It's basically a 4070 being marketed as a 4080, and that's so frickin' wrong.

  60. Listen, I wanted to get the 4080 so badly, but then I saw the pricing and I just can’t buy into that corruption. Sorry, but I’m waiting for prices to drop, lol. I’m just gonna chill on my 3080.

  61. The shenanigans are mostly explained by the new DLSS not only upscaling rendered frames but outright fabricating new ones.

  62. So, if I’m building a new pc, say, my first one, and I want stuff like RT at maybe 2k, do I get a 3080 or 4080 12?
