Anyone sick of these new games requiring so much power to run?
-
Using 1440p for 20 years doesn't mean 4k is a scam
It is a scam if you're buying 27" monitors like OP. You can only cram so much DPI into a monitor before you get diminishing returns. I've been playing at 1440p, 27" for a while, and can barely see the pixels if I put my eyes 10 cm away from the screen (and I've been playing Arma Reforger, so there's been a lot of squinting at bushes through a high-powered scope lately).
I've also used a 4k, 32" screen for a long time at work (in gamedev, so I wasn't looking at Excel files either... Well, actually I also was, but that's beside the point) and couldn't really tell the difference from my home setup on that front (though I admit 32" at 1440p doesn't look great sometimes, I also tried that for a while). Really, the most noticeable things were the HDR and the ludicrous fps I could get from having a top-of-the-line CPU and GPU (and 128 GB of RAM also helped a bit, I guess).
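For anyone who wants to check the math on that, here's a rough back-of-the-envelope pixel-density comparison (just the standard PPI formula; it says nothing about viewing distance or eyesight):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal length in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163
print(f'32" 1440p: {ppi(2560, 1440, 32):.0f} PPI')  # ~92
print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI')  # ~138
```

The 27" 4K panel packs roughly 50% more pixels per inch than 27" 1440p, which is the gap the diminishing-returns argument is about.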
-
Idk man, I bought Sol Cesto yesterday, and I'm pretty sure my toaster could run it
Edit:
RX 6600, two 4k monitors
Bruh. I have a 3080 Ti and barely feel comfortable running my games in 2k. I'm pretty sure the 6600 was made with only 1080p and lower in mind.
I know, dude. That's my whole point. Why do WE have to bear the burden of optimizing the game? Why don't the developers optimize their games? It's bonkers that the game "Hell is Us" listed the 4090 as a minimum requirement to run 4k at 30 fps. I was like, wut?
-
I'd like to know what those contradictions are, if you'll indulge me. Sorry if my comment was confusing, I added some paragraph breaks to make it easier to read and fixed a typo.
What I don't get about your setup is how you have such a powerful CPU but didn't get an equally powerful GPU?
It seems more than anything that your system just isn't balanced for the demands of modern gaming (which are very GPU-centric).
My old PC with a Ryzen 2700X, or hell, even the Ryzen 1600 I originally built it with, paired with an RTX 3090 instead of an AMD GPU, would handily outdo yours at most gaming tasks; even 4K gaming is just about doable on such a setup.
I understand that GPU prices are shit, but in light of that the best approach is to spend as little as possible on everything else.
What I don't get about your setup is how you have such a powerful CPU but didn't get an equally powerful GPU?
That's the part that cleared everything up for me (it could be on me, as English isn't my first language lol). I apologize. As for my CPU, I got it on sale for $120 and ran it for around 6 months without a dGPU, until I got an RX 580 for free from a friend, used that for around half a year, then "upgraded" to the RX 6600 I got off Facebook for $100. That's why I have a weaker GPU and a decent CPU. Hope that clears things up now.
-
Hey man, I agree with you on principle but the fact is that you're trying to run new AAA games with an older card at 4K.
Time marches on, and graphics demands have changed. Newer cards are built differently and games are (albeit poorly) designed to utilize the new hardware.
6600 is a fine card but yeah, you're going to have to compromise somewhere. A lot of good advice here to tap into older games, or you can spend $180 and buy a good 1440p monitor and see if that opens up your options as well.
You're hermit crabbing into used parts on the cheap which is great, but if you're not willing to pay a pound of flesh for a new card then you're going to have to settle for reduced performance - it's that simple. Otherwise what's the point of making better hardware, if nothing takes advantage of it?
I agree with you. That's why I moved to emulation and playing old games. I don't want to stress my budget for something I don't need. I've been emulating PS3 games and they've been running fantastic. Cheers
-
I know, dude. That's my whole point. Why do WE have to bear the burden of optimizing the game? Why don't the developers optimize their games? It's bonkers that the game "Hell is Us" listed the 4090 as a minimum requirement to run 4k at 30 fps. I was like, wut?
It's not bonkers though. Fill rate (how fast the GPU can render all the pixels your monitor is displaying) is a massive issue with increasingly photorealistic games, because you can't rely on simple tricks to optimize the rendering pipeline: there's so much detail on screen that every single pixel can potentially change completely at any given moment, and also be very different from its neighbors (hence the popularity of temporal upscalers like DLSS, because extrapolating from the previous frame(s) is really the last trick that still kind of works. Emphasis on "kind of").
If you don't want to sell a kidney to buy a good GPU for high resolutions, do yourself a favor and try to get a 1440p monitor; you'll have a much easier time running high-end games. Or run your games at a lower res, but that usually looks bad.
I personally experienced this firsthand when I upgraded from 1080p to 1440p a while ago: suddenly none of my games could run at max settings at my native resolution, even though it was perfectly fine before. I saw the same problem on a bigger scale when I replaced my 1440p monitor with a 4k one at work, back when we hadn't received the new GPUs yet.
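To put rough numbers on the fill-rate point, here's the raw per-frame pixel count at the common resolutions (ignoring overdraw and everything else a GPU does per pixel, so treat it as a floor on the extra work):

```python
# Raw pixels per frame at common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels/frame ({pixels / base:.2f}x 1080p)")
# 1080p: 2,073,600 (1.00x) | 1440p: 3,686,400 (1.78x) | 4K: 8,294,400 (4.00x)
```

So a native 4K frame is about 4x the pixels of a 1080p frame and 2.25x a 1440p frame, before any upscaler steps in.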
-
Because investors realized that gaming is lucrative and leads society. You think it's a coincidence that Valve created basically the first successful digital distribution platform, and now every entertainment medium has followed their model? Do you think microprocessors need to be that good on phones for web browsing? AI compute hardware came from gaming GPU technology. The software was originally made for and tested on games.
But it's like music publishers trying to push the newest, latest, and greatest music on us, only for you to realize "oh yeah, 80s music actually kicks ass" and "oh fuck, 50s music WAS neat", or, if and when you get there, "holy shit, classical is actually amazing and has lasted for literally hundreds of years, whereas current pop music is constantly only like 2 years old"...
...Like music in that way, old games don't just magically become irrelevant and bad, despite what pop culture may try to tell us. Compatibility may be the biggest issue, but a lot of old games are legitimately better than newer ones (a lot of old games were bad too). It's all the social and technical evolution of the medium, and once you start being able to look at it that way - once that perspective and vibe catches on a bit more - I think gaming will be healthier.
Don't spend 700 bucks. The game will still be there. Wait. Play the fun things you can run now, and if it's still fun once it's economically playable for you, then it'll still be fun. There are far too many games right now to even seriously consider FOMO for anything but the MOST socially important games, and those are few and far between, and usually very easy to run.
You're doing fine.
Don't look at how much 5090s cost in like Australia or some other countries.
Absolutely spot on. It needs a personal change. A change in mentality, in the way we think of entertainment, games in this case. We need to stop believing these profit-hungry people and stop chasing after those "shadows" and "highlights" and every single detail in games. When you enjoy a game, you won't even notice all those "details". You're not going to be running around in a game looking at trees and sunlight. I guarantee you it all becomes a blur in the background and you won't even care about it.
It's all a collective anxiety they've trained us into. Chasing after those fps numbers and details is what got us hooked on their exorbitant prices and shit performance. It was somewhat of a wake-up call for me, if you want to call it that. I just got tired of stressing about my hardware, which is totally fine and capable, not being able to play their shitty, poorly optimized games. Then come those who defend corporations and berate you for "choosing the wrong resolution". Why? I like 4k. It looks nice. Why do I have to bear the burden of their poorly optimized games? They ARE 100% more than capable of optimizing their games to run at 4k ON MY RX6600, but they don't want to. They're lazy. It won't make them enough money. It won't squeeze the last penny out of our pockets.
They're also listening to Nvidia. Of course Nvidia wants us to believe that games are "so good" nowadays that we need their top-of-the-line $3000 GPU to play them at 4k. How dare we peasants play at 4k on a non-$3000 GPU? Blasphemy!!! Fuck 'em. At least there are still folks like you out there who understand this bullshit and don't bootlick.
-
I tend to play shit 2+ years old...
This has been me since the dawn of time. I have never bought a game on release date. My son can vouch for me on this one. Whenever he asked for a PS4 game that had just come out, my answer was "we need to wait until it's $15-$20 at GameStop and then we'll get it". I've never bought a PS4 game for more than $20.
-
It's not bonkers though. Fill rate (how fast the GPU can render all the pixels your monitor is displaying) is a massive issue with increasingly photorealistic games, because you can't rely on simple tricks to optimize the rendering pipeline: there's so much detail on screen that every single pixel can potentially change completely at any given moment, and also be very different from its neighbors (hence the popularity of temporal upscalers like DLSS, because extrapolating from the previous frame(s) is really the last trick that still kind of works. Emphasis on "kind of").
If you don't want to sell a kidney to buy a good GPU for high resolutions, do yourself a favor and try to get a 1440p monitor; you'll have a much easier time running high-end games. Or run your games at a lower res, but that usually looks bad.
I personally experienced this firsthand when I upgraded from 1080p to 1440p a while ago: suddenly none of my games could run at max settings at my native resolution, even though it was perfectly fine before. I saw the same problem on a bigger scale when I replaced my 1440p monitor with a 4k one at work, back when we hadn't received the new GPUs yet.
I'll just play older games. Everything runs at 4k 60 fps, no issue, on RPCS3, and I've been having a freaking blast. Started with Uncharted 1, and man, I'd been missing out on this game. I'm going to stick with older games.
-
I can't believe how poorly most Unity games run these days. We're talking fairly basic 2D games that are struggling to run well on hardware from this decade. It's really pathetic.
Yup, this is what I'm fucking saying. I'm so sick of it. I'm so sick of this "you'll need a $3000 GPU to run a 4k game at 30 fps". Like what the actual fuck? Who asked for this?
-
I know, dude. That's my whole point. Why do WE have to bear the burden of optimizing the game? Why don't the developers optimize their games? It's bonkers that the game "Hell is Us" listed the 4090 as a minimum requirement to run 4k at 30 fps. I was like, wut?
Because running 4k is extreme. Asking it to run well at 4k is asking them to render quadruple the pixels for the same processing cost. You're saying you want the vast majority of people who don't have a 4k setup to have their games downgraded so they'll run well on your boutique monitor.
It's a balancing act: they can either make the game look like something from 2010 on all systems just to make sure it runs in 4k on older cards, or they can design it to look good in 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I'll keep running new games on my older card at 1080p and be perfectly happy with it.
-
This has been me since the dawn of time. I have never bought a game on release date. My son can vouch for me on this one. Whenever he asked for a PS4 game that had just come out, my answer was "we need to wait until it's $15-$20 at GameStop and then we'll get it". I've never bought a PS4 game for more than $20.
I stopped at the PS3/Xbox 360 gen.
Back then I'd hit GameStop hard on the "buy two get one free" used days.
Used to buy dozens of titles. Haha.
-
I stopped at the PS3/Xbox 360 gen.
Back then I'd hit GameStop hard on the "buy two get one free" used days.
Used to buy dozens of titles. Haha.
Man, those were the days. I still have over 50 discs for the PS4/PS3.
-
Because running 4k is extreme. Asking it to run well at 4k is asking them to render quadruple the pixels for the same processing cost. You're saying you want the vast majority of people who don't have a 4k setup to have their games downgraded so they'll run well on your boutique monitor.
It's a balancing act: they can either make the game look like something from 2010 on all systems just to make sure it runs in 4k on older cards, or they can design it to look good in 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I'll keep running new games on my older card at 1080p and be perfectly happy with it.
That's why I went back to the roots. I'm now playing older games at 4k 60 fps, no problem. I'll stick with emulators. I'd rather not spend the $700. I'll still complain about new games not running for me, though. That's the only thing I can do besides playing older games.
-
Man, those were the days. I still have over 50 discs for the PS4/PS3.
Ditto. Went as far as printing box art for the generic GameStop covers.
-
This is getting out of hand. The other day I saw the requirements for the game "Hell is Us" and they're ridiculous. My RX6600 can't play anything anymore. I've downloaded several PS3 ROMs and now I'm playing old games. So much better than this insanity. This is probably what I'm going to be doing from now on: playing old games.
Edit: I wanted to edit the post for more context and to vent/rant a little.
I don't want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4k monitors on Facebook Marketplace. Got both monitors for $120.
They're $800 on Amazon used. Great monitors and I love 4k. I also bought an RX6600 AMD GPU for $100 from Facebook. It was almost new. The owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks get (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).
I can't afford these high-end GPUs, but now very few games work on low settings and I get something like 20 FPS max. My friend gave me access to his Steam library and I wanted to play Indiana Jones the other day, and it was an "omfg, wtf is this horrible shit" moment.
I'm so sick of this shit! I don't regret buying any of these, but man, it sucks that the games I want to play barely even work.
So now I'm emulating and it's actually pretty awesome. I missed out on so many games in my youth, so now I'm just going to catch up on them. Everything works in 4k now, I'm getting my full 60 FPS, and I'm having so much fun.
-
Not the newest game but still newer, but one of my biggest gripes is how much you need to run the newest Ratchet and Clank. I'm lucky my Steam Deck can run it or I'd be screwed.
-
With GOTY and all the "expansions"
Love it when you can get those for a cheap price. I ended up getting a copy of Borderlands 1 with the DLC on another disc, one of those sets advertised for Xbox One and 360, for a good price a few years back. Don't remember the price, but it had to have been less than $20 before tax.
-
That's why I went back to the roots. I'm now playing older games at 4k 60 fps, no problem. I'll stick with emulators. I'd rather not spend the $700. I'll still complain about new games not running for me, though. That's the only thing I can do besides playing older games.
Or just run newer games at 1080p. Unless you're unhealthily close to the monitor you probably won't even see the difference.
If you're rubbing it on a TV across the room, you probably literally can't see the difference.
-
Or just run newer games at 1080p. Unless you're unhealthily close to the monitor you probably won't even see the difference.
If you're rubbing it on a TV across the room, you probably literally can't see the difference.
I do run them at 1080p, trust me. Here's the thing, though: running 1080p on a native 4k screen makes for a horrible-looking picture. It just looks off and very bad. Try it if you can.
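For what it's worth, part of why it looks so off is how the image gets scaled rather than the pixel ratio itself: 1080p divides evenly into a 4k panel while 1440p doesn't, but most monitors interpolate either way, which blurs everything. A quick sanity check of the scale factors (my own back-of-the-envelope, not a claim about any particular monitor):

```python
# How lower resolutions map onto a native 3840x2160 (4K) panel.
native_w, native_h = 3840, 2160
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    sx, sy = native_w / w, native_h / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name} -> 4K: {sx:g}x per axis, integer scale: {clean}")
# 1080p -> 4K: 2x per axis (clean 2x2 pixel blocks); 1440p -> 4K: 1.5x (always interpolated)
```

If your GPU driver supports integer scaling, 1080p on a 4k screen can come out looking like a sharp native 1080p panel instead of the usual blurry mess; 1440p will always get interpolated.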
It's best when the screen itself is physically 1080p. I think you fat-fingered the "b" in "running". Came out funny.
-
Not the newest game but still newer, but one of my biggest gripes is how much you need to run the newest Ratchet and Clank. I'm lucky my Steam Deck can run it or I'd be screwed.
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
-
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
I'm so used to having a toaster for a desktop that I had to look up what FSR is. I'm on too much of a college-student budget to touch FSR.
Last time I played, months ago, I recall it being fine enough with my Deck docked. FPS was stable and it didn't look too bad. Though I'm not an expert on anything graphics-related, so my word doesn't mean much.