Rincewind (Magister):
Wow. So much emotion.
RT is not forward, it's backwards. Ray tracing is the simplest, most basic form of rendering, just 1:1 physics simulation, but it is extremely slow. The technological advancements were all the other methods of rasterization and shading that allow decent approximations of lighting for a fraction of the cost. It will always be this way; to properly evaluate ray tracing you have to imagine what could have been done if more efficient methods were used instead. Current iterations don't even look better, because the image quality loss from all the smearing/dithering/subsampling/blurring far outweighs any benefits from slightly more accurate lighting or reflections. Even with dedicated hardware units designed specifically to accelerate ray tracing, instead of the general-purpose shaders used for everything else, it's still slow as hell.

Nothing I've said is contradictory. 2-year-old midrange stuff like the RX 6700 XT and RTX 3060, even the A770, can handle RT quite well, but there are still compromises like render resolution, FPS, and the number of rays / RT quality settings. By 2027-2028 almost everything will probably be RT-first and those compromises will be a lot more limited.
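To make the "1:1 physics simulation but extremely slow" point concrete, here is a minimal, purely illustrative sketch (not from anyone in this thread): a naive ray tracer fires one ray per pixel and tests it against the scene, so cost scales with pixels × objects × bounces before you even add shadows or reflections. The camera setup, resolution, and single-sphere scene are all made-up numbers.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest intersection, or None.
    Assumes `direction` is normalized (so the quadratic's a == 1)."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# One primary ray per pixel: even a toy 320x180 image against a single
# sphere means 57,600 intersection tests per frame, with zero bounces.
tests = 0
hits = 0
for y in range(180):
    for x in range(320):
        # Crude pinhole camera at (0,0,3) looking down -z at the sphere.
        dx, dy = (x/320 - 0.5), (y/180 - 0.5)
        norm = math.sqrt(dx*dx + dy*dy + 1)
        d = (dx/norm, dy/norm, -1/norm)
        tests += 1
        if ray_sphere_hit((0, 0, 3), d, (0, 0, 0), 0.5) is not None:
            hits += 1
print(tests, hits)  # tests == 57600
```

A rasterizer inverts this loop: it projects each triangle once and touches only the pixels it covers, which is exactly the "decent approximation for a fraction of the cost" trade-off described above.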
a lot of early implementations sucked also; a few, like CP2077, BFV and Metro Exodus, did not. only now are things starting to suck a lot less, because they're moving toward RT-first workflows like Lumen.
> RT is a scam from nVidia and because amd was too weak to capitalize during the first generation

yeah, fuck nvidia for pushing technology forward. let's just all take it up the ass from amd instead, where they refuse to implement any new features unless nvidia goes first, then scramble to get something out the door to have parity.
also, amd couldn't compete because rdna1 was a shit, buggy, unfinished architecture that had no RT hardware built in to begin with. amd's problem was that they didn't even follow directx or vulkan development to see what nvidia was up to.
Specular aliasing and moire are a result of bad subsampling, usually due to improper LOD or MIP maps / texture filtering. But making proper LODs takes effort, and devs would rather just slap a massive blur on everything. Then they get lazy and stop even trying to minimize aliasing, since they know it's going to be blurred to hell anyway. And it gives them an excuse to use all sorts of subsampling and dithering techniques.

> upscalers, spatio-temporal denoisers, antialiasing and sharpeners turning motion into a blurry smear

i presume you're talking about TAA and derivative upscalers like dlss2/xess/fsr2, and the fact is that the blur and detail loss from them is utterly negligible compared to native TAA (depending on resolution scale; the target is 1:4, which all of them do well with at 4k, but at 1440p and 1080p the results are spottier for sure). the alternative is having games be a complete disaster of aliasing and a hell of moire patterns, so taa is a requirement to not make the graphics look fucking broken.
also, would you have bitched about LOD models in 1998? tessellation/sub-division? if no, then temporal upscaling is good, and you have no right to complain.
> The problem is - that you can achieve 120 fps at 4k with good image quality using image reconstruction and frame generation - so Tensor cores / Wave MMA silicon is useful even if the game does not use Ray Tracing.

As soon as these things become standard, devs will factor them in during development and they will be required just to reach 30/60fps, ray tracing or not.
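For anyone wanting the arithmetic behind the 120 fps claim, here is a back-of-envelope sketch (my own illustration, not from the thread): at the 1:4 pixel ratio mentioned above, a 4K output is reconstructed from a half-resolution-per-axis internal render, and frame generation then doubles the presented rate. The 60 fps base rate and perfectly linear scaling are assumptions; real GPU scaling is nowhere near this clean.

```python
# 4K output reconstructed from a 1:4 internal pixel count
# (half the resolution on each axis).
target = (3840, 2160)
internal = (target[0] // 2, target[1] // 2)

native_pixels = target[0] * target[1]     # 8,294,400
shaded_pixels = internal[0] * internal[1]  # 2,073,600
print(native_pixels // shaded_pixels)      # 4x fewer shaded pixels per frame

# Frame generation presents one interpolated frame per rendered frame.
base_fps = 60                 # hypothetical rendered frame rate
presented_fps = base_fps * 2  # 2x frame generation
print(presented_fps)          # 120
```

The reply above is the counterpoint: once devs budget for that 4x shading saving, it stops being headroom and becomes the baseline.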
This is just literally false.

Let's remind everyone one clear thing: Digital Foundry doesn't play games, they measure them. And they measure with fucking framemeters so they can whine and complain about how this game plays at 60 FPS moooooost of the time, but when you enter a big city while it loads it drops to 57 or sometimes even 55 FPS for a couple of seconds. Then they do a DLSS vs FSR analysis where they zoom in on the edge of 3D models and compare stills of them being blurred, and while a difference is noticeable, it's never apparent to a normal player who, you know, plays the game. As a result of this kind of excessive scrutiny, people are complaining about horrible ports when they are, in fact, not. Jedi Survivor played nice on my GTX 980. I didn't fucking spend every minute of gameplay looking at the framerate, I just played it. The Last of Us was mostly 30 FPS on PS3. GTA 5 could hardly reach 30 FPS. And people had fun.
<lots of bullshit skipped>
I'm no fan of Digital Foundry and I don't watch their content, but your OP is nothing but misinformation.
They make the usual suspects really angry - which is a plus.

That's what you get for following hardcore game tech spergs on YT - disillusionment and disappointment.
You're literally everything that's wrong with PC gaming. Besides furries.
> The technological advancements were all the other methods of rasterization and shading that allow decent approximations of lighting for a fraction of the cost.

Key word is "decent approximation". Yes, a prebaked lightmap can look exactly as good as RT would in a cinematic set scene where you can only go forward. The only issue is that we're in an era of massive open-world games, and a random alleyway won't and can't have that same level of care and consistency. RT can make all of it have that same level of consistency. If shit like SSAO or SS reflections is a "decent approximation" to you, then I'm glad, I guess, but that's not factually true.
> It will always be this way,

Maybe in mobile games. Real-time path tracing is already possible on hardware sold today. I would personally bet on technology improving rather than not improving. As I've said before, there's going to be a buffer period where things will be carried along by software RT/GI like software Lumen, or something else using distance fields or voxels or whatever.
> Specular aliasing and moire are a result of bad subsampling usually due to improper LOD or MIP maps/texture filtering. But making proper LOD takes effort and devs would rather just slap a massive blur on everything. Then they get lazy and stop even trying to minimize aliasing since they know it's going to be blurred to hell anyway.

Then please enlighten me with some examples that have modern geometry, modern detailed normal-mapped textures and multiple active lights, and don't immediately spaz out without AA any time you move the camera.
> And it gives them an excuse to use all sorts of subsampling and dithering techniques.

Good? The developers are using TAA anyway, so they may as well use TAA to save on performance that couldn't otherwise be saved on.
> TAA anyway, so may as well use TAA to save on performance that couldn't otherwise be saved on.

Thing is, if you use something to this degree it's called abuse.
"Just use more TAA bruh aaaaaaand ackshuaallyyyy"ninjazombiemaster
•
1y ago
Looks like it may be lacking temporal anti-aliasing. Try enabling that.
Small side note, Lumen is not actually responsible for cast sha(...)
> What annoys me is that for consoles they care about rock hard 60 FPS (or whatever the target is, which I can get behind) but then when it comes to PCs they suck the raytracing cock even if it means 35 FPS DLSS performance mode. OMG the reflections, meanwhile the actual game is a blurry, unplayable mess.

The double-edged curse of being part of the PC MASTER RACE. I for one welcome it, because I enjoy seeing games I will only be able to run at 144fps in years to come.
1. You don't need RT for hard shadows.

*Re RT: run against the wall I was looking at. That kind of experience is, in my opinion, game changing.
> smearing/dithering/subsampling/blurring far outweighs any
> I've never seen SSAO I didn't immediately want to turn off

It's funny, cause even if they did "OK" per-object/character-only AO with matting (so it wouldn't mess up lightmaps), temporal would crash it. All-in temporal, lets gooo