Well I am shocked, SHOCKED I say! Well, not that shocked.
I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll be my rig for at least 10 years.
But it was expensive.
It seemed like horrible value at the time, but in hindsight a 4090 was not the worst investment, hah.
Also built a dream machine in 2022. I have a 4090, a 7700X, 32GB of DDR5 6000, and 8TB of NVME storage. It’s got plenty of power for my needs; as long as I keep getting 90+ FPS @ 4K and programs keep opening instantly, I’m happy. And since I bought into the AM5 platform right at the beginning of it, I can still upgrade my CPU in a few years and have a brand new, high end PC again for just a few hundred bucks.
The progress is just not there.
I got an RX 6800 XT for €400 in May 2023, which was at that point an almost 3-year-old card. Fast-forward to today: the RX 9060 XT 16GB costs more and is still slower in raster. The only things going for it are FSR4, a better encoder, and a bit better RT performance, which I couldn’t care less about.
a bit better RT performance, which I couldn’t care less about.
Yeah raytracing is not really relevant on these cards, the performance hit is just too great.
The RX 9070 XT is the first AMD GPU where you can consider turning it on.
But I wouldn’t turn it on and actually play with it even if I could, because I will always take the better performance.
I’ve actually tried path tracing in CP2077 at native Steam Deck resolution, streamed to my Steam Deck OLED from my PC at max settings, and it could hold a locked 30 FPS fairly well (overclocked by 20%, though). But the game looks absolutely horrible in motion with its terrible LOD, so no amount of RT or PT can save it. It looks dope for screenshots though. But that’s PT; RT is basically almost indistinguishable. And PT is many, many years away from being viable for the majority of people.
https://www.youtube.com/watch?v=yNcYZ5l_c48
(The game reports W10, but it was actually Fedora 42)
But I wouldn’t turn it on and actually play with it even if I could, because I will always take the better performance.
Depends. In Cyberpunk I can get 90-100 fps at 1440p on ultra with raytracing on and FSR4 Quality (via OptiScaler). That is a very good experience IMO, to the point that I forget about “framerate” while playing.
That’s on Windows though; on Linux the raytracing performance is rather worse for some reason, and it slips below the threshold where I stop noticing the framerate, so I go for 1440p native.
Don’t think I’ll be moving on from my 7900XTX for a long while. Quite pleased with it.
I have a 3080 and am surviving lol. never had an issue
I have a 3080 also. It’s only just starting to show its age with some of these new UE5 games. A couple weeks ago I discovered dlssg-to-fsr3, and honestly I’ll take the little bit of latency for some smoother gameplay.
Still running a 1080, between nvidia and windows 11 I think I’ll stay where I am.
Pretty wise, that’s the generation before the 12VHPWR connectors started burning up.
Afaik the 2080 was the last FE with a regular PCIe power connector.
I’m enjoying the fact that I haven’t played video games in about 20 years now. The last one I played regularly was Quake II for Super Nintendo (and that was a few years old even then). If I ever get back into gaming, I can just pick up where I left off and play a bunch of new-to-me games on ancient, cheap technology. Like, were gamers less happy in 2005 than they are today? I don’t think so.
Shit, I still play Civilization III from time to time and it’s fine.
That you misremembered the generation of Nintendo console that Quake 2 was on makes this the perfect chef’s-kiss millennial boomer comment, lol.
Why not just buy a cheaper one? X060 or X070 series is usually fine in price and runs everything at high enough settings. Flagship is for maxed out everything on 4k+ resolutions. And in those cases, everything else is larger and more expensive as well; the monitor needs to be 4k, huge ass PSU, large case to fit the PSU and card in, even the power draw and energy… costs just start growing exponentially.
For me, with a 2080 already, I would have to spend much more than what my gpu was worth to have any significant upgrade. It’s just not worth it.
When a new gpu was 500-900 usd it was fine.
But yeah, the RTX 2070 keeps chugging on.
Oh totes. NVIDIA lying even more blatantly to their customers’ faces, drivers bricking cards on updates, GPUs shipping with missing ROPs, even more burn problems with a connector they knew was still problematic and lied about, them and their retail partners releasing very limited inventory and then serving internal scalping while being increasingly hostile to the rest of their consumers, ray-tracing performance improvements that need exclusive pushes in certain games and the newest, most expensive hardware before their cards show any benefit, false MSRP pricing, and no recourse for long-time loyal customers except a lottery in the US while the rest of the regions get screwed. Totes, it’s just that it’s “too expensive”, because when have gamers ever splurged on their hobby?
All I want is more VRAM, it can already play all the games I want.
But with our new system we can make up 10x as many fake frames to cram between your real ones, giving you 2500 FPS! Isn’t that awesome???
Bullshitted pixels per second seem to be the new currency.
It may look smooth in videos, but 30fps interpolated up to 120fps will still feel like a 30fps game.
Modern TVs do the same shit, and it both looks and feels like ass. And not good ass.
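To put rough numbers on that (a back-of-the-envelope sketch with assumed figures; the exact pipeline latency depends on the game and the frame-gen implementation):

```python
# Back-of-the-envelope sketch with assumed numbers: why interpolated
# 120 fps can still feel like 30 fps. Input is only sampled on real
# frames, and an interpolator has to buffer the next real frame before
# it can generate the in-between ones.

base_fps = 30          # real, rendered frames per second
displayed_fps = 120    # after 4x frame generation / TV motion smoothing

real_frame_ms = 1000 / base_fps        # ~33.3 ms between input samples
shown_frame_ms = 1000 / displayed_fps  # ~8.3 ms between displayed frames

# Interpolation needs the *next* real frame before it can fill the gap,
# so it adds roughly one extra real frame of buffering on top.
input_latency_floor_ms = 2 * real_frame_ms

print(f"Displayed frame pacing: {shown_frame_ms:.1f} ms (looks smooth)")
print(f"Input-to-response floor: ~{input_latency_floor_ms:.0f} ms (still feels like {base_fps} fps, or worse)")
```

So the motion looks like 120fps, but your inputs still land on a ~33 ms grid, plus the extra buffering, which is why it feels the way it does.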
I don’t mean to embarrass you, but you were also supposed to say “AI!”
Points with a finger and laughs
Look at that loser not using AI
I have a 4090. I don’t see any reason to pay $4K+ for fake frames and a few % better performance. Maybe next gen, post-Trump, and/or if prices become reasonable and cables stop melting.
fake frames
And that’s my main problem with what the industry has become. Nvidia always had sizable jumps generation to generation, in raw performance. They STILL get better raw performance, but now it’s nowhere near impressive enough and they have to add their fake frame technologies into their graphs. Don’t get me wrong, they always had questionable marketing tactics, but now it’s getting even worse.
No idea when I’m replacing my 3060ti, but it won’t be nVidia.
I don’t think the 5090 has been 4k in months in terms of average sale price. 4k was basically March. 3k is pretty common now as a listed scalp price, and completed sales on fleabay seem to be 2600-2800 commonly now.
The problem is that 2k was too much to begin with though. It should be cheaper, but they are selling ML cards at such a markup, with truly endless demand currently, that there’s zero reason to put any focus at all on the gaming segment beyond a token offering that raises the margin for them, so business-wise they are doing great I guess?
As a 9070 XT and 6800 XT owner, it feels like AMD is practically done with the GPU market. It just sucks for everyone that the GPU monopoly is here, presumably to stay. It feels like backroom deals creating a noncompetitive landscape must be prevalent, and Nvidia’s total stranglehold through an artificial monopoly on code compatibility makes the hardware itself irrelevant.
Technically Intel is also releasing some cheapo GPUs with capability similar to Nvidia’s, but they all use the same manufacturers anyways.
There are major issues with those GPUs in some commonplace use cases, and they have major scalping issues. Sure, in some use cases there are zero issues, but this ain’t like the early 2000s when there were many brands that all basically worked.
Now you’re either nvidia with every feature, amd with most features (kinda like a store brand), or intel with major compatibility flaws with specific games because it’s technically a GPU.
One issue is everyone is supply constrained by TSMC. Even Arc Battlemage is OOS at MSRP.
I bet Intel is kicking themselves for using TSMC. It kinda made sense when they decided years ago, but holy heck, they’d be swimming in market share if they used their own fabs instead (and kept the bigger die).
I feel like another is… marketing?
Like, many buyers just impulse buy, or go with what some shill recommended in a feed. Doesn’t matter how competitive anything is anymore.
Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal to not buy a GPU every year.
Ain’t nobody got time (money) for that!
As long as you make an upgrade that’s equivalent or better than the current console generation, you’re then basically good-to-go until the next generation of consoles comes.
I’m sitting on a 3060 TI and waiting for the 40-series prices to drop further. Ain’t no universe where I would pay full price for the newest gens. I don’t need to render anything for work with my PC, so a 2-3 year old GPU will do just fine
Not surprised. Many of these high-end GPUs are bought not for gaming but for bitcoin mining, and demand has driven prices beyond MSRP in some cases. Stupidly power hungry and overpriced.
My GPU, an RTX 2060, is getting a little long in the tooth, and I’ll hand it off to one of the kids for their PC, but I need to find something that is a tangible performance improvement without costing eleventy stupid dollars. Nvidia seems to be lying a lot about the performance of that 5060, so I might look at AMD or Intel next time around. Probably need to replace my PSU while I’m at it.
bitcoin mining
That’s a thing of the past, not profitable anymore unless you use ASIC miners. Some people still GPU-mine niche coins, but it’s nowhere near the scale it was during the Bitcoin and Ethereum craze a few years ago.
AI is driving up prices, or rather, it’s reducing availability, which then translates into higher prices.
Another thing is that board manufacturers, distributors and retailers have figured out that they can jack up GPU prices above MSRP and enough suckers will still buy them. They’ll sell less volume but they’ll make more profit per unit.
My kid got the 2060, I bought an RX 6400, I don’t need the hairy arms any more.
Then again I have become old and grumpy, playing old games.
Hell, I’m still rocking a GTX 950. It runs Left 4 Dead 2 and Team Fortress 2, what more do I need?
I’m still using my GTX 1070. There just aren’t enough new high-spec games that I’m interested in to justify paying the outrageous prices that NVIDIA is demanding and that AMD follows too closely behind on. Even if there were enough games, I’d refuse to upgrade on principle: I will not reward price gouging. There are so many older/lower-spec games that I haven’t played yet, and that run perfectly, that I can’t bring myself to care. So many games, in fact, that I couldn’t get through all of them in my lifetime.
Lezgooo 1070 crew reporting in (☞゚ヮ゚)☞
I remember when high-end GPUs were around €500.