Well I am shocked, SHOCKED I say! Well, not that shocked.
Been hearing this for the past 3 years
I remember when high-end GPUs were around 500 €.
Rimworld doesn’t need a new gpu
I wish I could remember how I got it to run on Apple Silicon last time, because I can't do it now.
What it needs is a new multi-threaded engine so I can actually use all these extra cores. XD
Sounds like version 1.6 is supposed to get multithreading.
Can’t wait
My new gpu was a steam deck.
Heck yes, made more sense to buy a steam deck than upgrade my PC.
I had lost all interest in games for a while. Desktop just ended up with me tinkering in the homelab. Steam deck has been so great to fall in love with gaming again.
I’m on a 2080 or 2090 (I forget which). I thought I’d upgrade to the 40xx now that 5090s are out. I looked at the prices and absolutely not. The 5090s are around 500k JPY, and ordering from the US would work out to about the same with exchange, tax, and whatever tariff exists this week. Salaries here are also much lower on average than in the West, even for those of us in software.
4070s are still around 100k, which is cheaper than the 250k-ish they were at last time I looked.
Price aggregator site in Japan if you want to play around: https://kakaku.com/pc/videocard/itemlist.aspx?pdf_Spec103=500 On the left, you’ll see the cards to select and the prices are obvious on the screen.
Don’t think they made a 2090. 2080 or 2080 ti I guess.
The PC industry has turned into a scuzzy hellscape for average joes that just want to have decent options at realistic prices. They don’t even care about gaming anymore, it’s about YouTube and BitcoinBruhzz now.
I’ve still got a pretty decent setup (5800X3D, 4070 Ti), but it’s the last stand for this guy, I’m afraid. Looking over the past decade or so, I’ve honestly had better gaming experiences on consoles for a fraction of the price of a PC build, mods and PC-master-race nonsense aside. Sure, you don’t need a subscription for online play on PC (I rarely play online), but you can barely get a processor for what a PS5 costs anymore. Let alone a video card, which is upwards of a lot of people’s monthly take-home pay, the way things are going.
Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090
Yeah no shit, what a weird fucking take
But why spend to ““eat food”” when you can have RAYTRACING!!!2
In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.
(Lowest price I can find)
… That is a GPU, with roughly the cost and power usage of an entire, quite high end, gaming PC from 5 years ago… or even just a reasonably high end PC from right now.
…
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize lighting or textures, and which has necessitated the invention of intelligent temporal upscaling and frame generation… the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.
This reality is a farce.
…
Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.
RX 9070 + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X) … so far my pretax total for the whole build is under $1500, and, while I need to double and triple check this, I think the math on the power draw works out to a 650 W power supply being all you’d need… potentially with enough room to also add some extra internal HDD storage drives, i.e., you’ve got leftover wattage headroom.
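A back-of-the-envelope version of that wattage math. All the TDP figures below are rough assumptions for illustration, not spec-sheet values, so double-check against the actual parts:

```python
# Rough PSU sizing for a hypothetical RX 9070 + BD795i SE build.
# Wattages are assumed ballpark peak draws, not measured figures.
components = {
    "RX 9070 (board power)": 220,
    "mobile Ryzen CPU":      120,
    "RAM + NVMe + fans":      40,
    "extra HDDs (2x)":        20,
}

total = sum(components.values())
headroom = 1.5  # common rule of thumb: size the PSU ~1.5x estimated peak draw

print(f"estimated peak draw: {total} W")                  # 400 W
print(f"suggested PSU size:  {total * headroom:.0f} W")   # 600 W
```

With those assumptions you land right around the 650 W class mentioned above, with a little slack left for more drives.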
If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.
That is almost half the cost of the RTX 5090 alone, and it will get you over 90 fps in almost all modern games at ultra settings in 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing at similar framerates, and realistically, probably wait another quarter or two for AMD driver support and FSR 4 to mature and get properly implemented in said games.
Or you could swap in an Nvidia card, but seeing as I’m building a Linux gaming PC, you know, for the performance boost from not running Windows, AMD’s Mesa drivers are where you wanna be.
Saved up for a couple of years and built the best (consumer-grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything that isn’t Nvidia or just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090 and on average draws less power too.
Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.
I’m guessing you could get all of that to work with a 750 W PSU, 850 W if you also want to have a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing in 4k.
Does that sound about right?
Either way… yeah… imagine an alternate timeline where marketing and industry direction aren’t bullshit, where people actually admit things like:
Consoles cannot really do what they claim to do at 4K… at actual 4K.
They use checkerboard upscaling, so they’re basically running at 2K and scaling up, and it’s actually less than 2K in demanding raytraced games, because they’re using FSR or DLSS on top. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don’t show real graphics settings menus, so console gamers never know.
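For concreteness, here's the pixel-count arithmetic behind that claim, assuming the common description of checkerboard rendering (shade roughly half the samples per frame, reconstruct the rest):

```python
# Pixel counts behind the "consoles render ~2K and upscale" claim.
native_4k    = 3840 * 2160   # 8,294,400 pixels
native_1440p = 2560 * 1440   # 3,686,400 pixels ("2K" in gamer parlance)

# Checkerboard rendering shades roughly half the samples each frame
# and reconstructs the other half from the previous frame.
checkerboard_4k = native_4k // 2   # 4,147,200 shaded pixels

print(checkerboard_4k / native_4k)     # 0.5   -> half the shading work of native 4K
print(checkerboard_4k / native_1440p)  # 1.125 -> barely above native 1440p
```

So a checkerboarded "4K" frame shades only about 12% more pixels than a native 1440p one, which is the sense in which these consoles are "actually running at 2K."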
Maybe, maybe we could have tried to focus on just perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of baffling us with marketing BS, claiming we can just leapfrog to 4K, and more recently telling people 8K displays make any goddamned sense at all, when in 95% of home setups, of any kind, they offer no physically perceptible gains.
1000W PSU for theoretical maximum draw of all components at once with a good safety margin. But even when running a render I’ve never seen it break 500W.
And then to stick it to the man further you’re running Linux of course, right?
I tried Mint and Ubuntu, but Linux dies a horrific death trying to run newly released hardware, so I ended up on Ghost Spectre.
(I also assume you’re being sarcastic, but I’m still salty about wasting a week trying various pieces of advice to make Linux goddamn work.) Level1Techs had relevant guidance:
Kernel 6.14 or greater, Mesa 25.1 or greater.
Ubuntu and Mint don’t have those yet, I think, hence your difficult time.
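A quick way to sanity-check a distro against that baseline. The sample version strings below are assumptions for illustration; on a real system you'd read `uname -r` for the kernel and `glxinfo` output for Mesa:

```python
# Check a Linux install against the RX 9070 driver baseline mentioned
# above: kernel >= 6.14 and Mesa >= 25.1.

def version_tuple(s: str) -> tuple:
    """Turn a string like '6.8.0-45-generic' into (6, 8, 0) for comparison."""
    core = s.split("-")[0]
    return tuple(int(p) for p in core.split(".") if p.isdigit())

def meets_baseline(kernel: str, mesa: str) -> bool:
    return version_tuple(kernel) >= (6, 14) and version_tuple(mesa) >= (25, 1)

# Example versions (assumed, check your own system):
print(meets_baseline("6.8.0-45-generic", "24.0.5"))  # False -- Ubuntu/Mint era, too old
print(meets_baseline("6.14.2", "25.1.0"))            # True  -- current rolling distro
```

Python compares tuples element by element, so `(6, 8, 0) >= (6, 14)` correctly comes out False even though "6.8" > "6.14" as plain strings, which is exactly the trap naive string comparison falls into.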
Try Bazzite. Easy, beginner-friendly, but very good hardware support and up to date.
The entire move to the realtime raytracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize any lighting, nor textures
You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable, older games weren’t perfect either) is DLSS and other dynamic resolution + upscaling tech
Doom: The Dark Ages is possibly what they’re referring to. id skipped traditional lighting in favour of ray tracing doing it.
Bethesda Studios also has a tendency to use HD textures on features like grass and terrain, which can safely be low-res.
There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at a problem, and not just in games. (Bonus points if you outsource the optimisation to someone else’s hardware or the modding community.)
That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…
idTech 8 seems to be the first engine that just literally requires RT for its entire render pipeline to work.
They could theoretically build another version of it on a Vulkan base to let you turn RT off… but that would likely be a massive amount of work.
On the bright side… at least the idTech engines are actually well coded, and they put a lot of time into making the engine actually very good.
I didn’t follow the marketing ecosystem for Doom Dark Ages, but it would have been really shitty if they did not include ‘you need a GPU with RT cores’.
…
On the other end of the engine spectrum:
Bethesda… yeah, they have entirely lost control of their engine. It is a mangled mess of nonsense; the latest Oblivion remaster just uses UE to render things, slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.
Compare that to oh I dunno, the Source engine.
Go play Titanfall 2. A 10-year-old game now, built on a modified version of the Portal 2 Source engine.
Still looks great, runs very efficiently, can scale down to older hardware.
Ok, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it into M+K.
Looks great, runs efficiently.
None of them use RT.
Because you don’t need to, if you take the time to actually optimize both your engine and game design.
I meant they also just don’t bother to optimize texture sizes, didn’t mean to imply they are directly related to ray tracing issues.
Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.
Sure, it’s not absolutely forced on in too many games… but TAA often is forced on, because no one can run raytracing without temporal intelligent upscaling and frame gen…
…and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS / FSR implementations, and don’t bother to recode the TAA into just giving the motion vectors as an optional API that doesn’t actually do AA…
… and they often don’t do that because they designed their entire render pipeline to only work with TAA on, and half the game’s post-processing effects would have to be recoded to work without TAA.
So if you summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.
…
That being said: ray tracing really only makes a significant visual difference in many (not all, but many) situations if you have very high-res textures.
If you don’t, older light rendering methods work almost as well, and run much, much faster.
Ray tracing involves… you know, light rays, bouncing off of models, with textures on them.
Like… if you have a car with a glossy finish that is reflecting the entire scene around it in its paint… well, if the reflect map being added to the base car texture is very low-res, if it is being generated from a world of low-res textures… you might as well just use the old cube-map method, or other methods, and not bother turning every reflective surface into a ray-traced mirror.
Or, if you’re doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene with higher-res textures on everything being lit.
…
I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.
I am still on my GTX 1060 3 GB, probably worth about $50 at this point lol
I ran vr on one of those. Not well, but well enough.
Fuck Nvidia anyways. #teamred
I wish AMD had something like CUDA that my video rendering software used so I could stop using nvidia.
#teamred
Temu Nvidia is so much better, true. Please support the “underdog” billion dollar company.
no don’t buy hardware on Temu
I support the lesser evil option, yes. It’s not like I have much other choices now, do I? Thanks to fucking Nvidia.
I’ve been on Linux since 2018 (my PC is from 2016) and my next GPUs will always be AMD, unless Intel somehow manages to produce an on par GPU
Fuck those guys too honestly. AMD is fueling this bullshit just as much as Nvidia.
Nvidia is one of the most evil companies out there, responsible for killing nearly all other GPU producers destroying the market.
So is AMD with their availability of literally three video cards in stock for all of North America at launch. Which in turn just fuels the scalpers. Downvote this all you want guys, AMD is just as complicit in all of this, they’ve fuelled this bullshit just as much.
Nvidia is singlehandedly responsible for killing all competition but AMD. They destroyed all other GPU companies with the nastiest tactics to dominate the market; only AMD has been able to survive. You can’t blame AMD for chip shortages; that’s the aftershock of the COVID pandemic. Never has there been higher demand for chips, especially thanks to the rising EV market.
You can’t say AMD is as bad as Nvidia, as Nvidia is the sole reason the market got ruined in the first place. They are the worst of the worst.
And don’t forget diaper Donny, who destroyed international trade with his fucking tariff wars.
I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll do me right for at least 10 years.
But it was expensive.
Also built a dream machine in 2022. I have a 4090, a 7700X, 32GB of DDR5 6000, and 8TB of NVME storage. It’s got plenty of power for my needs; as long as I keep getting 90+ FPS @ 4K and programs keep opening instantly, I’m happy. And since I bought into the AM5 platform right at the beginning of it, I can still upgrade my CPU in a few years and have a brand new, high end PC again for just a few hundred bucks.
It seemed like horrible value at the time, but in hindsight a 4090 was not the worst investment, hah.
I just looked up the price and I was like, “Yikes!”. You can get a PS5 Pro + optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.
Still on a 1060 over here.
Sure, I may have to limit FFXIV to 30fps in summer to stop it crashing, but it still runs.
They are talking about skipping 1 or 2 generations not taking 10 years off
Hey, it’s not 2026 just yet!
I’m running Linux for everything and my GTX 1070 is still chugging along trying to power my 1440p 144hz monitor ^^’
Well, I mostly just play strategy games and CS2 (which I do have to run on almost the lowest possible settings, without FSR: I basically turn everything to lowest except the lowest available AA setting and dynamic shadows, to not have a disadvantage, and get 110–180 fps depending on the situation).
But I’m planning on buying a used Radeon 9070 XT and just inserting it into my current build (i7 6800k based lololol) and on eventually buying a new build around it
(A 750 W 80 Plus Platinum PSU should be able to handle a new 9070 XT.)
Hey, I’m also on a 1060 still! Admittedly I hardly game anymore, although I am considering another Skyrim playthrough.
I’m ngl, finances had no impact on my decisions to stay at 3080. Performance and support did. Everything I want to play runs at least 60 to 180 fps with my current loadout. I’m also afraid once Windows 10 LTSC dies I won’t be able to use a high end GPU with Linux anyways.
You can always side-grade to AMD. I was using a 3070 and ditched Windows for Kubuntu and while it was very usable, I would get the slightest input lag and had to make sure the compositor (desktop effects) was turned off when playing a game.
After some research I decided to side-grade to the 6800 and it’s a night and day difference. Buttery smooth gaming. It performs better with compositor on than Nvidia did with it off. I know 6800 isn’t high end but it’s no slouch either. AMD is king on Linux.
I just paid $400 for a refurbished MSI Gaming Z Trio Radeon RX 6800. The most I’ve ever spent. I never want to spend that much again.