AMD Says You Don't Need More VRAM
-
I'll just keep using my 7900 XTX for 4K 240 Hz then.
-
I'll just keep using my 7900 XTX for 4K 240 Hz then.
A used 7800 XT for 500€ for >60 Hz 4K then
(>60 Hz as in theoretically more than 60 Hz, but my monitor only supports 60, lol)
-
He said most people are playing at 1080p, and last month's Steam survey had 55% of users with that as their primary display resolution, so he's right about that. Setting aside what's needed for the 4K monitor only 4.5% of users have as their primary display: is 8 GB of VRAM really a problem at 1080p?
-
This really feels like AMD's "don't you guys have phones" moment.
-
He said most people are playing at 1080p, and last month's Steam survey had 55% of users with that as their primary display resolution, so he's right about that. Setting aside what's needed for the 4K monitor only 4.5% of users have as their primary display: is 8 GB of VRAM really a problem at 1080p?
Absolutely. Why pay more if less is good enough?
They are open about it, and give the option to get more VRAM if you want it. Fine by me.
No one with a 4K monitor will buy them anyway.
-
Absolutely. Why pay more if less is good enough?
They are open about it, and give the option to get more VRAM if you want it. Fine by me.
No one with a 4K monitor will buy them anyway.
Absolutely. Why pay more if less is good enough?
Different problem, IMO.
-
AMD doesn't know I self-host generative AI models.
8 GB is barely sufficient for my needs, and I often need to use multi-GPU modifications to deploy parts of my workflow to my smaller GPU, which lacks both enough CUDA cores and enough VRAM to make an appreciable difference. I've been searching for a used K80 in my price range to solve this problem.
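For anyone curious what that kind of split looks like, here's a minimal sketch using Hugging Face transformers with accelerate to shard a model across a big and a small GPU. The model name and per-device memory caps below are placeholders, not a specific recommended setup:

```python
# Minimal sketch: sharding one model across two mismatched GPUs with
# Hugging Face transformers + accelerate. Model name and memory caps
# are illustrative placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # hypothetical model choice

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # let accelerate place layers across devices
    # Cap usage per device: GPU 0 is the 8 GB card, GPU 1 the smaller one,
    # with CPU RAM as spillover for whatever doesn't fit.
    max_memory={0: "7GiB", 1: "3GiB", "cpu": "16GiB"},
    torch_dtype="auto",
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Once you're juggling per-device memory caps like this, the clean fix is simply more VRAM on one card.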
-
AMD doesn't know I self-host generative AI models.
8 GB is barely sufficient for my needs, and I often need to use multi-GPU modifications to deploy parts of my workflow to my smaller GPU, which lacks both enough CUDA cores and enough VRAM to make an appreciable difference. I've been searching for a used K80 in my price range to solve this problem.
You aren’t the majority of gamers.
-
You aren’t the majority of gamers.
I know I'm not, but that doesn't mean that gamers wouldn't benefit from more VRAM as well.
Just an example: Nvidia's implementation of MSAA is borked if you've only got 8 gigs of VRAM, so all those new super-pretty games need to have their graphics pipelines hijacked and the antialiasing replaced with older variants.
Like, I'm not gonna go around saying my use case is normal, but I also won't delude myself into thinking that the average gamer wouldn't benefit from more VRAM as well.
-
I know I'm not, but that doesn't mean that gamers wouldn't benefit from more VRAM as well.
Just an example: Nvidia's implementation of MSAA is borked if you've only got 8 gigs of VRAM, so all those new super-pretty games need to have their graphics pipelines hijacked and the antialiasing replaced with older variants.
Like, I'm not gonna go around saying my use case is normal, but I also won't delude myself into thinking that the average gamer wouldn't benefit from more VRAM as well.
Sure, but if they want more VRAM they can just buy the 16 GB version. It doesn't hurt to have a cheaper option.
-
He said most people are playing at 1080p, and last month's Steam survey had 55% of users with that as their primary display resolution, so he's right about that. Setting aside what's needed for the 4K monitor only 4.5% of users have as their primary display: is 8 GB of VRAM really a problem at 1080p?
- Why would they be buying a new card to play how they're already playing?
- What does the long term trend line look like?
You can confidently say that this is fine for most consumers today. There really isn't a great argument that this will serve most consumers well for the next 3 to 5 years.
It's OK if well-informed consumers are fine with a compromise for their use case.
Misrepresenting the product category and misleading less informed consumers into believing it's not a second-rate product in the current generation is deeply anti-consumer.
-
He said most people are playing at 1080p, and last month's Steam survey had 55% of users with that as their primary display resolution, so he's right about that. Setting aside what's needed for the 4K monitor only 4.5% of users have as their primary display: is 8 GB of VRAM really a problem at 1080p?
8 GB of VRAM is definitely a problem even at 1080p. There are already benchmarks showing this from various outlets. Not in every game, of course, and it definitely hurts AAA titles more than others, but it will get worse with time, and people buy GPUs to last several years; a card shouldn't already have a major issue on the day you buy it!
-
- Why would they be buying a new card to play how they're already playing?
- What does the long term trend line look like?
You can confidently say that this is fine for most consumers today. There really isn't a great argument that this will serve most consumers well for the next 3 to 5 years.
It's OK if well-informed consumers are fine with a compromise for their use case.
Misrepresenting the product category and misleading less informed consumers into believing it's not a second-rate product in the current generation is deeply anti-consumer.
You can confidently say that this is fine for most consumers today. There really isn't a great argument that this will serve most consumers well for the next 3 to 5 years.
People have been saying that for years, and my 8 GB card is chugging along just fine. The race to more VRAM that people were expecting just hasn't happened. There's little reason to move on from 1080p, and the 75+ million PS5s aren't going anywhere anytime soon.
-
I agree, if that would mean those options are cheaper.
You can be a very active gamer and not need more, so why pay more?
-
Sorry, but I don't understand why this is a controversy. They have a 16 GB model; if you need more than 8 GB, then go and buy that? They aren't forcing your hand or limiting your options.
-
Sorry, but I don't understand why this is a controversy. They have a 16 GB model; if you need more than 8 GB, then go and buy that? They aren't forcing your hand or limiting your options.
It's not, it's just spun for clicks.
-
Sorry, but I don't understand why this is a controversy. They have a 16 GB model; if you need more than 8 GB, then go and buy that? They aren't forcing your hand or limiting your options.
The big deal is that the vast majority of gamers aren't techies. They don't know to check VRAM. 8 GB is insufficient nowadays, and any company that sells an 8 GB card and doesn't make it obvious that it's low-end is exploiting consumers' lack of knowledge.
-
The big deal is that the vast majority of gamers aren't techies. They don't know to check VRAM. 8 GB is insufficient nowadays, and any company that sells an 8 GB card and doesn't make it obvious that it's low-end is exploiting consumers' lack of knowledge.
I run most games just fine with my 3070 8 GB. While I would've preferred more VRAM when I bought it, it's held up just fine.
While the 9060 XT isn't released yet, everything I've seen so far has made the difference pretty clear. I have no problem with offering a lesser SKU if the difference is clear. Not like Nvidia with their 1060 3 GB and 6 GB, where they also cut the cores and memory bandwidth. If these differ on release, my stance will change.
Also, gaming isn't the only reason to have a GPU; I still use my 1060 in my server for transcoding, and it works just fine. If I needed something to replace it, or if I were building a new one from scratch, a 9060 XT 8 GB or an Arc would be a fine choice.
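As a rough illustration of that kind of server job, here's a sketch of a hardware-accelerated transcode driven from Python; it assumes an ffmpeg build with NVENC support, and the filenames and bitrate are made up:

```python
# Hypothetical example of a GPU transcoding job on an older card like a
# GTX 1060: ffmpeg with CUDA decode and the NVENC H.264 encoder.
# Assumes ffmpeg was built with NVENC enabled; paths are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-hwaccel", "cuda",    # decode on the GPU
        "-i", "input.mkv",     # placeholder source file
        "-c:v", "h264_nvenc",  # encode with NVENC
        "-b:v", "5M",          # illustrative target bitrate
        "-c:a", "copy",        # pass audio through untouched
        "output.mp4",
    ],
    check=True,  # raise if ffmpeg exits with an error
)
```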
-
Perhaps AMD could convince game developers/publishers to spend some time learning optimisation. Many of the popular games I've seen in the past 5 years have been embarrassingly bad at working with RAM and storage specs that were considered massive until relatively recently. The techniques for doing this well have existed for generations.