Then put 8GB in a 9060 non-XT and sell it for $200. You’re just wasting dies that could’ve been used to make more 16GB cards available (or at least a 12 GB version instead of 8).
That wouldn’t work. AMD uses a lot of cheap, lower-speed memory chips in unison to achieve high speeds. That’s why their cards have more VRAM than Nvidia’s: not because the amount matters, but because more memory chips working together can reach higher speeds.
Nvidia uses really expensive high-speed chips, so they can use fewer memory chips to get the same memory speed.
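Back-of-the-envelope version of that argument (every number below is an assumption for illustration, not the spec of any real card): total bandwidth is roughly the combined bus width times the per-pin data rate, so more chips in parallel can beat fewer, faster chips.

```python
# Sketch: aggregate memory bandwidth scales with how many chips run in parallel.
# All figures are illustrative assumptions, not specs of any actual GPU.

def bandwidth_gb_s(num_chips: int, bits_per_chip: int, gbps_per_pin: float) -> float:
    """GB/s = total bus width in bytes * per-pin data rate in Gbps."""
    total_bus_bits = num_chips * bits_per_chip
    return (total_bus_bits / 8) * gbps_per_pin

# Eight 32-bit chips at a modest 20 Gbps -> 256-bit bus
print(bandwidth_gb_s(8, 32, 20))  # 640.0 GB/s
# Four pricier 28 Gbps chips -> only a 128-bit bus
print(bandwidth_gb_s(4, 32, 28))  # 448.0 GB/s
```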
Then AMD lied to and manipulated gamers by advertising that you need 16GB of VRAM.
Memory speed > memory amount
Why would speed matter more than amount? If I have to swap from the slower system memory, it’s going to slow things down. Having more means I can keep more of what’s needed in fast RAM.
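Rough numbers to show why (both figures assumed, just for illustration): once assets spill out of VRAM, they have to come across the PCIe link, which is an order of magnitude slower than the card’s own memory.

```python
# Illustrative comparison; both numbers are assumptions, not measurements.
vram_gb_s = 448        # assumed GDDR6 bandwidth on a midrange card
pcie4_x16_gb_s = 32    # ~theoretical one-way max for PCIe 4.0 x16
print(f"spilled assets are ~{vram_gb_s / pcie4_x16_gb_s:.0f}x slower to reach")  # ~14x
```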
That’s not how it works at all. You still need to use system memory.
And honestly, I really don’t have time to explain all the details of how RAM and VRAM work.
You are definitely AMD’s target audience.
“You’re wrong bro. I can’t, er, don’t have time to explain, but you’re just wrong.” *Finishes with passive-aggressive insult*
You’re fresh from reddit, ay? Ever consider going back?
Why don’t you fuck off, ya dumb cunt?
Wow. Moi, a dumb cunt? You suuuure got me! I hope you didn’t hurt yourself coming up with such an epic zinger. True big brain material there, really shows us all how much mental horsepower you’ve got under the hood.
Fucking toxic manchild Redditors. Looking forward to seeing your account inactive after a few weeks of us mocking you.
I’ve got 16GB of VRAM and a 2K monitor, and this tracks pretty accurately. I almost never use over 8GB. The only games where I can break 10GB are ones with a setting (designed for old PCs) that loads all the textures into VRAM.
Games with lots of mods, like KSP.
KSP uses RAM, not VRAM. I play RP-1 with 8GB of VRAM no problem. 32GB of RAM isn’t enough, though.
Yep, I mixed the two up.
I don’t think I’ve ever seen a game use more RAM than KSP with mods though, holy moly.
My Rimworld with 500+ mods can be pretty fucked.
Weird. You must be playing old games. Most modern games go over 8GB at 1440p no problem. They have been for at least a few years now.
I would agree, because 8GB is entry-level for desktop gaming and most people start at entry level.
If he’d chosen his words more carefully and said “many” rather than “most” nobody would have a reason to disagree.
Lmao. AMD out here fumbling a lay up.
I mean honestly, yeah. With a simple 4GB chip they could have won the low end and not screwed over gamers.
They really seem to have forgotten their roots in the GPU market, which is a damn shame.
Seriously.
All AMD had to do here was create 12GB and 16GB versions (instead of 8 and 16), then gesture at all the reviews calling the RTX 5060 8GB DOA because of its very limiting VRAM quantity.
8GB VRAM is not enough for most people. Even 1080p gaming is pushing the limits of an 8GB card. And this is all made worse when you consider people will have these cards for years to come.
Exactly. Even if you accept their argument that 8GB is usually enough today at 1080p (and we all know that’s only true for high-performance, e-sports-focused titles), it won’t be true tomorrow. That makes buying one of those cards today a really poor investment.
Even worse when you consider the cost difference between 8GB and 16GB can’t be that high. If they ate the cost difference and marketed 16GB as the new “floor” for a quality card, they might have eaten NVIDIA’s lunch in the one segment where they actually can: the low end.
Oh, fuck you AMD. Nvidia fucked up with the 4060 already, and again with the 5060.
Tell that to my triple-screen 1440p flight simulator!
Most gamers aren’t doing that. You can get a very good idea of what they’re doing by looking at the Steam hardware surveys.
Most gamers are stuck with lower end hardware because they can’t afford anything anymore.
Exactly
Have you tried buying three graphics cards?
My 4K TV disagrees. Even upscaling from 1440p, my 10GB is barely enough in new games.
Last month’s Steam survey had 1080p as the most common primary display resolution at about 55%, while 4K was at 4.57%.
4K is a tiny part of the market. Even 1440p is a small segment (albeit rapidly growing).
“8gb ought to be enough for anybody”
I wish.
Send one of these guys by my place. I’ll show them what 8GB cannot do…
I personally think anything over 1080p is a waste of resolution, and I still use a card with 8GB of VRAM.
That being said, lots of other people want a 16GB card, so let them give you money AMD!
> anything over 1080p is a waste of resolution
For games, maybe.
But I also use my PC for work (programming). I can’t afford two, and don’t really need them.
At home I’ve got a WQHD 1440p monitor, which leaves plenty of space for code while having the solution explorer, watch window, and whatnot still open.
At work we’re just given cheap refurbished 1080p crap, which is downright painful to work with and has often made me consider buying a proper monitor and bringing it to work, just to make those ~8h/day somewhat less unbearable.
So I can’t go back to 1080p, and have to run my games at 1440p (and upscaling looks like shit, so no).
My gaming rig is also my media center, hooked up to a 4K television. I sit around 7 feet away from it. Anything less than 1440p looks grainy and blocky on my display.
I can’t game at 4K because of hardware limitations (a 3070 just can’t push it at good framerates), but I wouldn’t say it’s a waste to go above 1080p; use case is an important factor.
It looks grainy because it’s a damn TV and not a monitor. You’re not going to be able to tell the difference AT THE DISTANCE that you’re supposed to be using them at. Larger monitors are meant to be used from farther away. TVs are meant to be used from across the room.
You’re that guy with his retina plastered on the glass of his smartphone going “I CAN SEE THE PIXELS!”
Pixel density is pixel density. It doesn’t matter if it’s a TV or a monitor.
Sure, monitors typically have less input lag, and there are reasons one might choose a monitor over a TV, but the reverse is also true. I chose a 55" TV for my sim-racing setup that sits maybe a meter from my face, and there’s no problem with that setup.
TV panels have lower PPI than monitors.
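The arithmetic backs that up (the two display sizes below are assumed examples, not anyone’s actual setup): PPI is just the diagonal pixel count divided by the diagonal size in inches, so a big 4K TV can land below a much smaller 1440p monitor.

```python
# PPI = diagonal resolution in pixels / diagonal size in inches.
# Both displays are assumed examples for illustration.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal screen size."""
    return hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p monitor: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'55" 4K TV:         {ppi(3840, 2160, 55):.0f} PPI')  # ~80
```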
1440p on a 27" monitor is the best resolution for work and for gaming.
> I personally think anything over 1080p is a waste of resolution
But, but… Nvidia said at the RTX 3000 announcement that we could now have 8K gaming!
Tell that to game developers. Specifically the ones that routinely don’t optimize shit.
Or to gamers who insist on playing these unoptimized games at max settings. $80 for the game, then $1,000 on a GPU that can run it.
Oh, so it’s not that many players are FORCED to play at 1080p because AMD’s and Novideo’s “affordable” garbage can’t cope with anything more while keeping a game smooth? Or, better yet, the game detected we’re running on a calculator here, took pity on us, and set the graphics bar low.
Hey, give a little credit to our ~~public schools~~ poorly-optimized, eye-candy new games! (where 10-20GiB is now considered small)
Guess I’ll stick with my GTX 1070 Ti until next century, when GPU manufacturers have passed the bong to someone else. Prices are insane for the performance they provide these days.
Same. I’ve encountered exactly one game, ever, that I couldn’t play with that card, and that was last month with Doom: The Dark Ages, which won’t even boot without ray tracing support.
Literally never had a single other problem over the past 7 years of use. I played Cyberpunk 2077 with that card. I’m currently playing Clair Obscur with that card and it looks stupendously beautiful on it.
Greetings, fellow 1070 Ti user.
I just ditched my 8GB card because it wasn’t doing the trick well enough at 1080p, and especially not at 1440p.
So, if I get this straight, AMD agrees that games need to be optimized better.
I hate upscaling and frame gen with a passion; it never feels right and often looks messy too.
The First Descendant became a 480p mess whenever there were a bunch of enemies, even though I have a 24GB card and a pretty decent PC to go with it.
I’m now back to heavily modded Skyrim, and damn do I love the lack of upscaling and frame gen. The Oblivion stutters were a nightmare and made me ditch that game within 10 hours.