Well I am shocked, SHOCKED I say! Well, not that shocked.
Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer + enthusiast market.
So my next card is probably gonna be an RX 9070XT.
Even the RX 9070 is running around $900 USD; I can’t fathom affording even what was state-of-the-art gaming years ago at this point. I am still using a GTX 1660, playing games from years back that I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kid’s console games.
Every year we say “gonna look into upgrading,” but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn’t also a private-equity predator), and the cost of just living and eating is insane. At this rate, a lot of us might start reading again.
It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than a GPU alone, it’ll be more tempting.
It’s just because I’m not impressed; the raster performance bump at 1440p was just not worth the price jump at all. On top of that, they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, not providing drivers to reviewers, etc. Fuuucccckk all that, man. I’m waiting until AMD gets a little better with ray tracing and then switching to team red.
Idk man, I’m rocking a 9070 and a 9800X3D running Bazzite, and this thing smokes. At $1150 combined for the pair, it was a no-brainer. I’m pulling down 90 FPS on Expedition 33 at maxed-out settings.
Yeah, it will occasionally get tripped up on ray tracing, but I just lower the settings a bit on those games and I’m still cooking at 55+. This is on a 3440x1440 monitor, mind you.
I just can’t see justifying the Nvidia tax unless you have so much disposable income it simply doesn’t matter to you. I’d rather have a pretty damn good sports car without nitrous than pay 3x for one with it. I built a PC for $1750 USD that absolutely rips.
Bought a 5700 XT on release for £400 and ran that ’til last year, when the 7900 GRE released in the UK. Can’t remember what I paid, but it was a lot less than the flagship 7900, and I foresee it lasting many years as I have no desire to go above 2K.
AMD GPUs have been pretty great value compared to Nvidia recently, as long as you’re not tying your self-worth to your average FPS figures.
I’m still surviving on my RX580 4GB. Limping along these days, but no way I can justify the price of a new GPU.
What about the used market? The Nvidia 1080/1080 Ti or AMD 6000/7000 series are not too bad.
R9 380x does more than I need it to.
If it wasn’t for video format compatibility (AV1, mostly), I would still have some R9 Fury coil whine as background noise.
The R9s were really something.
I stopped maintaining a AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.
Indies are great. I can play AAA titles but don’t really ever… It seems like that’s where the folks with the most creativity are focusing their energy anyway.
The majority, sure, but there are some gems.
Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.
You can always wait for a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.
It’s funny, because often they aren’t prettier. Well-optimized, well-made games from 5 or even 10 years ago often look on par with, or better than, the majority of AAA slop pushed out now (with obvious exceptions for some really good-looking games like Space Marine and a few others), and yet disk sizes are still 10x what they were. New releases are just unrefined and unoptimized, leaning on computationally expensive filters, lighting, sharpening, and antialiasing to make up for mediocre underlying quality.
I am tired of being treated like a fool. No more money for them.
I bought a 3070 for far more than I should’ve back when it was new, and I don’t plan to make that mistake twice. This GPU is likely staying in this PC ’til it croaks. I’ve never felt the need for anything more powerful anyway; it runs everything I need it to on high settings.
Same. I got a 3080 12GB a few months after release for $1k from EVGA, and it’s the most I’ve ever spent on a computer part. Next upgrade is def gonna be in the $600–700 range; not making that mistake again.
GTX 1060 6GB still going strong!
Runs FFXIV at 1440p.
Runs HL Alyx on my Rift.
Runs everything prior to this gen.
If I need to run a more modern game, I’ll use my PS5.
Jesus christ man. I thought I was slumming it with a 3070.
hahahahahahahaha.
RX 580
The RX 580 remains a power-efficient champ. The old hot hatch of the GPU world.
That was a beautiful card. Bought it to use with VR; my gf is still rocking that system.
Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock as LLM enthusiasts and small companies use them for AI.
Ex-fucking-actly!
Hahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know big tech went all-in on AI, with no regard for whether the average Joe wants to come along. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.
The 5090 is kinda terrible for AI, actually. It’s too expensive. It only just got support in PyTorch (see the quick check below), and if you look at ‘normie’ AI bros trying to use them online, shit doesn’t work.
The 4090 is… mediocre, because it’s expensive for 24GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.
Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s or just use APIs.
The server cards DO eat up TSMC capacity, but insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for literally being anticompetitive.
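If you want to sanity-check the “only just got support” claim on your own box, here’s a minimal sketch using PyTorch’s own introspection calls. The exact arch string is my assumption (a 5090’s Blackwell chip should report compute capability 12.0, i.e. sm_120); what your wheel was actually compiled for varies by build.

```python
import torch

# Minimal sketch: does this PyTorch build ship native kernels for this GPU?
# (Assumption: a 5090 reports compute capability 12.0, i.e. "sm_120".)
if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    arch = f"sm_{major}{minor}"
    built_for = torch.cuda.get_arch_list()  # arches compiled into this wheel
    print(f"{torch.cuda.get_device_name(0)}: {arch}")
    print(f"Build targets: {built_for}")
    if arch not in built_for:
        print("No native kernels; expect PTX JIT fallback or outright failures.")
else:
    print("No CUDA device visible to PyTorch.")
```

If your card’s arch isn’t in that list, that’s the “shit doesn’t work” experience: stable wheels without the new arch either fail outright or limp along on fallbacks until a build with proper support lands.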
I was gifted a 2080 Ti about a year or so ago, and I have no intention of upgrading anytime soon. The former owner of my card is a friend who had it in his primary gaming rig; back when SLI wasn’t dead, he had two.
So when he built a new main rig with a single 4090 a few years back, he gifted me one, and the other he left in his old system, which he started using as a spare/guest computer for impromptu LANs. It’s still a decent system, so I don’t blame him.
In any case, that upgraded my primary computer from a 1060 3GB… so it was a welcome change to have sufficient video memory again.
The cards keep getting more and more power hungry, and I don’t see any benefit in upgrading… not that I can afford it. I haven’t been in school for a long time, and lately I barely have time to enjoy YouTube videos, never mind a full-assed game. I literally walk away from a game for so long between sessions that I forget the controls. So either I can beat the game in one sitting, or the controls have to be similar enough to the defaults I’m used to (left click to fire, right click to ADS, WASD for movement, Ctrl or C for crouch, space to jump, E to interact, F for flashlight, etc.) that I don’t really need to relearn anything.
This is a big reason why I haven’t finished some titles that I really wanted to, like TLoU or Doom Eternal… too many buttons to remember. It’s especially bad with Doom, since if you don’t remember how and when to use your specials, you’ll run out of life, armor, ammo, etc. pretty fast. Remembering which special gives what and how to trigger it… uhhh… is it this button? Gets slaughtered by an imp… okay, not that button. Reloads. Let’s try this… killed by the same imp… not that either… hmmm. Goes and looks at the key mapping. Ohhhhhh. Okay. Reloads. I got it this time… dies anyway, for other reasons.
Welp. Quits. Maybe later.
Been hearing this for the past 3 years.
I’m ngl, finances had no impact on my decision to stay at a 3080; performance and support did. Everything I want to play runs at 60–180 FPS with my current loadout. I’m also afraid that once Windows 10 LTSC dies, I won’t be able to use a high-end GPU with Linux anyway.
You can always side-grade to AMD. I was using a 3070 and ditched Windows for Kubuntu, and while it was very usable, I would get the slightest input lag and had to make sure the compositor (desktop effects) was turned off when playing a game.
After some research I decided to side-grade to the 6800, and it’s a night-and-day difference. Buttery-smooth gaming. It performs better with the compositor on than Nvidia did with it off. I know the 6800 isn’t high-end, but it’s no slouch either. AMD is king on Linux.
I don’t buy every generation; I skip one if not two. I have a 40xx-series card and will probably wait until the 70xx (I’m assuming series naming here) before upgrading.
Ah, capitalism…
Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.