NVIDIA in 2025 Be Like…

_silentgameplays_
47 Comments

Wasn’t this in the 40 series too? Can’t say confidently because I’m not qualified in this sphere

So what card should I get if I always turn off RT, DLSS, TAA and play on 1080p?

NVIDIA feeding users like they’re on a pixelated IV drip.

DLSS is actually one of the most advanced technologies nowadays. Don’t really get why everyone is so negative about it. It genuinely improves picture quality and frame rate significantly. And the native frames you get are also better than on the 40 series.

where is the original vid? i had it somewhere in the past…

My face when DLSS is actually a helpful use of AI that allows for cheaper, more accessible cards, letting you play modern games without having to sell your grandma to afford a 4090.

If you can’t tell the difference, you can’t claim they’re fake

People need to learn that not all AI things are bad. Some are helpful.
Gen AI is bad, but that’s a whole different thing than what those cards are doing.

The AI buzzword isn’t always bad. The output from the new DLSS seems outstanding. It’s also a very good path for us to maximise the capabilities of hardware. Might not be perfect right now but it’s on its way

This might be news for you, but all the frames are fake

Me over here with a 1650

![gif](giphy|W0c3xcZ3F1d0EYYb0f|downsized)

Unfortunately graphics are kind of cooked atm. New games rely on TAA which causes motion blur, smearing, and image instability when moving. TAA is heavily used to mask graphical artifacts, and it makes things look terrible. More recently, things like TSR, Lumen, and DLSS all create visual artifacts because they prioritize ‘performance’ over actual visual fidelity. Pixel crawl, visual inconsistencies, and pop-in are the norm now.

Is AI really going to become a buzzword for negative feedback?

You think TAA blur is bad? Get ready for AI hallucinations on your frames now. (Also with TAA blur)

I hate AI slop more than anyone else and multi frame gen would be dogshit no doubt, but DLSS 4 seems quite a lot better than DLSS 3 as they have shifted from a CNN to a Transformer model. DLSS 4 would also be available to all the models that already support DLSS, so at least something good is coming out of the presentation. I am looking forward to what AMD is cooking tbh; they said they won’t be putting out a flagship card but concentrating on mid-range. The mid-range market is so shit rn that I would buy AMD even if they don’t have any DLSS competitor, as long as they put out a card with decent raster performance and more VRAM.

wat

so do I stick with the 30 series?

But that’s what it is… DLSS has been AI since its inception. It’s in the name: Deep Learning Super Sampling, basically the definition of AI.

Tbh I don’t give a fuck if it’s “fake frames” if it runs amazing and looks amazing.

What are fake frames? Frames either exist or not.

Am I the only one wondering what the hell is going on in the picture? Anyone know the source so that I can research it myself?

I can see this resulting in large AAA game dev companies getting lazy and relying on DLSS 4.0 frame gen to smooth out the gameplay, rather than optimise the engine/game.

Can someone explain to me why AI frames are bad? This seems to me like it’s one of the few scenarios where using AI would be good.

DLSS to 1080p to higher resolutions is actually really good. DLAA is also decent.

Lossless Scaling from Steam, much? I mean, aren’t they the same, but one is just cheaper?

Is this a memes subreddit or…?

It seems the DLSS 5 frames will provide 100000% performance, as if they were frames created from nothing.

Now you get to enjoy 30 FPS input lag even though the FPS number says 200
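The latency complaint above can be shown with quick back-of-the-envelope arithmetic (a sketch; the 30 FPS base rate and 4x multiplier are illustrative assumptions, not measured values): generated frames raise the displayed frame count, but input is only sampled on natively rendered frames.

```python
# Illustrative sketch: frame generation multiplies displayed frames,
# but input responsiveness is still tied to the "real" render rate.
base_fps = 30        # natively rendered frames per second (assumed)
gen_multiplier = 4   # e.g. 4x multi frame generation (assumed)

displayed_fps = base_fps * gen_multiplier        # what the FPS counter shows
input_interval_ms = 1000 / base_fps              # time between real frames

print(displayed_fps)                  # 120 — the number on screen
print(round(input_interval_ms, 1))    # 33.3 — the latency your hands feel
```

So the counter can read 120 while the game still responds to input roughly every 33 ms, which is the "30 FPS input lag" the comment describes.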

Im sorry but I have to ask, what’s the sauce??? guys pls tell me, I need it

Nv like
Ai ai ai ai ai u need ai ai u dont need vram

I want to remind everyone that Lossless Scaling has 3x frame gen for “Everyone” right now.
Or just regular if you don’t want that much.

Gamers: ‘Just one more frame daddy NVIDIA, I promise I’ll stop after this one!’

![gif](giphy|96ciETgT883Bu)

Drink the nog cater.

Why is this an nsfw Tag????

This is what we get for letting the “AnYtHiNg LeSs ThAn 120 fRaMeS iS uNpLaYaBlE” crowd go unchecked.

So new generation of NVIDIA relies on AI even more than the 40XX, AMD makes no progress but changes naming scheme to the worst possible and Intel runs out of stock immediately. GPU market is cooked lol.

every tech company does this with AI…

😂😂😂😂

But can it run Crysis on Ultra?

I understand that NVIDIA is heavily relying on DLSS 4, but c’mon man. I’m pretty excited that I can (maybe) afford a 5070, but that excitement will go away if the actual performance numbers are not on par with even the 4070 Ti Super (big chance).

we’re crying about AI now, while so many things were literally made possible by AI years ago.

I’m a bit confused why people hate on it.. DLSS has been a blast for me, good AA, high fps, very fluent.. dunno what’s bad about it? If it works it works

I’m completely unaware of the situation, can someone fill me in please?

Can people really not tell the difference between real and fake frames? To me I can still tell the real framerate underneath all this FG AI slop. It’s like if I put a small note in the corner of my monitor with “120 fps” on it while I’m capped at 60.

Okay, where do I go to get a better performing card from a competitor for 4k ultra ray traced gaming at very high FPS?

5090 still has 30% better raw performance compared to 4090

Linus Tech Tips just uploaded a video on the 50 series; it’s only visually obvious that the frames are being AI generated when you look at bold numbers while moving around in games.
