Anyone sick of these new games requiring so much power to run?
-
I tend to play shit 2+ years old...
This has been me since the dawn of time. I have never bought a game on release day. My son can vouch for me on this one. Whenever he asked for a PS4 game that had just come out, my answer was "we need to wait until it drops to $15-$20 at GameStop and then we'll get it." I've never bought a PS4 game for more than $20.
-
It's not bonkers though. Fill rate (how fast the GPU can render all the pixels your monitor is displaying) is a massive issue with increasingly photorealistic games, because you can't rely on simple tricks to optimize the rendering pipeline: there's so much detail on screen that every single pixel can potentially change completely at any given moment, and also be very different from its neighbors (hence the popularity of temporal upscalers like DLSS, because extrapolating from the previous frame(s) is really the last trick that still kind of works. Emphasis on "kind of").
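To put rough numbers on the fill-rate point (just resolution-times-refresh napkin math, ignoring overdraw and how expensive each individual pixel is to shade):

```python
# Napkin math: raw pixel output per frame and per second at common resolutions.
# Ignores overdraw and per-pixel shading cost; it's only the output size.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

fps = 60
for name, (w, h) in resolutions.items():
    per_frame = w * h
    per_second = per_frame * fps
    print(f"{name}: {per_frame / 1e6:.1f} MP/frame, {per_second / 1e6:.0f} MP/s at {fps} fps")

# 1080p: 2.1 MP/frame, 124 MP/s at 60 fps
# 1440p: 3.7 MP/frame, 221 MP/s at 60 fps
# 4K:    8.3 MP/frame, 498 MP/s at 60 fps  (exactly 4x the pixels of 1080p)
```

So a 4K/60 target means pushing four times the pixels of 1080p/60 every second, before any of the fancy lighting even enters the picture.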
If you don't want to sell a kidney to buy a good GPU for high resolutions, do yourself a favor and try to get a 1440p monitor; you'll have a much easier time running high-end games. Or run your games at a lower res, but that usually looks bad.
I personally experienced this firsthand when I upgraded from 1080p to 1440p a while ago: suddenly none of my games could run at max settings at my native resolution, even though it was perfectly fine before. I saw the same problem on an even bigger scale when I replaced my 1440p monitor with a 4K one at work and we hadn't received the new GPUs yet.
I'll just play older games. Everything runs at 4K 60 fps no issue on RPCS3 and I've been having a freaking blast. Started with Uncharted 1, and man, I'd been missing out on this game. I'm going to stick with older games.
-
I can't believe how poorly most Unity games run these days. We're talking fairly basic 2D games that are struggling to run well on hardware from this decade. It's really pathetic.
Yup, this is what I'm fucking saying. I'm so sick of it. I'm so sick of this "you'll need a $3000 GPU to run a 4k game at 30 fps". Like what the actual fuck? Who asked for this?
-
I know, dude. That's my whole point. Why do WE have to bear the burden of optimizing the game? Why don't the developers optimize their games? It's bonkers that the game "Hell is Us" listed a 4090 as the minimum requirement to run 4K at 30 fps. I was like wut?
Because running 4K is extreme. Asking it to run well at 4K is asking them to quadruple the pixels for the same processing cost. You're saying you want the vast majority of people who don't have a 4K setup to have their games downgraded so they'll run well on your boutique monitor.
It's a balancing act: they can either make the game look like something from 2010 on all systems just to make sure it runs in 4K on older cards, or they can design it to look good at 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I'll keep running new games on my older card at 1080 and be perfectly happy with it.
-
This has been me since the dawn of time. I have never bought a game on release day. My son can vouch for me on this one. Whenever he asked for a PS4 game that had just come out, my answer was "we need to wait until it drops to $15-$20 at GameStop and then we'll get it." I've never bought a PS4 game for more than $20.
I stopped at the PS3/Xbox 360 gen.
Back then I'd hit GameStop hard on the "buy two, get one free" used days.
Used to buy dozens of titles. Haha.
-
I stopped at the PS3/Xbox 360 gen.
Back then I'd hit GameStop hard on the "buy two, get one free" used days.
Used to buy dozens of titles. Haha.
Man, those were the days. I still have over 50 PS4/PS3 discs to this day.
-
Because running 4K is extreme. Asking it to run well at 4K is asking them to quadruple the pixels for the same processing cost. You're saying you want the vast majority of people who don't have a 4K setup to have their games downgraded so they'll run well on your boutique monitor.
It's a balancing act: they can either make the game look like something from 2010 on all systems just to make sure it runs in 4K on older cards, or they can design it to look good at 1080p on older or cheaper cards, which is fine for most people.
If you want to game in 4k, you need to buy a video card and monitor to support it. Meanwhile, I'll keep running new games on my older card at 1080 and be perfectly happy with it.
That's why I went back to the roots. I'm now playing older games at 4K 60 fps no problem. I'll stick with emulators; I'd rather not spend the $700. I'll still complain about new games not running for me, though. That's the only thing I can do besides playing older games.
-
Man, those were the days. I still have over 50 PS4/PS3 discs to this day.
Ditto. Went as far as printing box art for the generic GameStop covers.
-
This is getting out of hand. The other day I saw the requirements for the game "Hell is Us" and they're ridiculous. My RX 6600 can't play anything anymore. I've downloaded several PS3 ROMs and now I'm playing old games. So much better than this insanity. This is probably what I'm going to be doing from now on: playing old games.
Edit: I wanted to edit the post for more context and to vent/rant a little.
I don't want to say I made a mistake, but I buy everything used, and I scored a good deal on two 27" 4K monitors from Facebook Marketplace. Got both monitors for $120.
They're $800 used on Amazon. Great monitors and I love 4K. I also bought an AMD RX 6600 GPU for $100 from Facebook. It was almost new; the owner upgraded and wanted to get rid of it. My whole build was very cheap compared to what I see some folks get (genuinely happy for those who can afford it. Life is too short. Enjoy it while you can).
I can't afford these high end GPUs, but now very few games work on low settings and I'd get something like 20 FPS max. My friend gave me access to his steam library and I wanted to play Indiana Jones the other day, and it was an "omfg, wtf is this horrible shit" moment.
I'm so sick of this shit! I don't regret buying any of these, but man, it sucks that the games I want to play barely even work.
So, now I'm emulating and it's actually pretty awesome. I missed out on so many games in my youth, so now I'm just going to catch up on what I missed. Everything works in 4K now, I'm getting my full 60 FPS, and I'm having so much fun.
-
Not the newest game but still newer, but one of my biggest gripes is how much you need to be able to run the newest Ratchet and Clank. I'm lucky my steam deck can run it or I'd be screwed.
-
With GOTY and all the "expansions"
Love it when you can get those for a cheap price. I ended up getting a copy of Borderlands 1 with the DLC on another disc (one of those sets advertised for Xbox One and 360) for a good price a few years back. Don't remember the price, but it had to have been less than $20 before tax.
-
That's why I went back to the roots. I'm now playing older games at 4K 60 fps no problem. I'll stick with emulators; I'd rather not spend the $700. I'll still complain about new games not running for me, though. That's the only thing I can do besides playing older games.
Or just run newer games at 1080p. Unless you're unhealthily close to the monitor you probably won't even see the difference.
If you're rubbing it on a TV across the room, you probably literally can't see the difference.
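Rough sanity check on that: what actually matters is pixels per degree of your vision, and ~60 px/degree is roughly the limit of 20/20 eyesight (one pixel per arcminute). Quick sketch with ballpark numbers I picked myself, so take the exact figures loosely:

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, distance_in, aspect=16 / 9):
    """Angular pixel density for a 16:9 screen viewed from a given distance (inches)."""
    width_in = diagonal_in / math.sqrt(1 + 1 / aspect**2)    # screen width from the diagonal
    pitch = width_in / horiz_px                               # inches per pixel
    one_degree_in = distance_in * math.tan(math.radians(1))  # inches spanned by 1 degree of view
    return one_degree_in / pitch

# ~60 px/degree is about where extra resolution stops being visible to most people
print(pixels_per_degree(27, 1920, 24))  # 27" 1080p at ~2 ft  -> ~34, pixels are visible
print(pixels_per_degree(27, 3840, 24))  # 27" 4K at ~2 ft     -> ~68, past the acuity limit
print(pixels_per_degree(55, 1920, 96))  # 55" 1080p at ~8 ft  -> ~67, already past the limit
```

The TV-across-the-room case checks out with the math; the desk case depends a lot on how big the monitor is and how close you sit.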
-
Or just run newer games at 1080p. Unless you're unhealthily close to the monitor you probably won't even see the difference.
If you're rubbing it on a TV across the room, you probably literally can't see the difference.
I do run them at 1080p, trust me. Here is the thing, though, running 1080p on a native 4k screen makes for a horrible looking picture. It just looks off and very bad. Try it if you can.
It's best when the screen itself is physically 1080p. I think you fat-fingered the "b" in "running". Came out funny
-
Not the newest game but still newer, but one of my biggest gripes is how much you need to be able to run the newest Ratchet and Clank. I'm lucky my steam deck can run it or I'd be screwed.
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
-
Lmao. That game runs so bad without FSR. With FSR on, it looks so weird.
I'm so used to having a toaster for a desktop that I had to look up what FSR is. I'm too college-student-budgeted to touch FSR.
Last I played months ago, I recall it being fine enough when I had my deck docked. FPS was stable and it didn't look too bad. Though I'm not an expert on anything graphics-related, so my word doesn't mean much.
-
Not the newest game but still newer, but one of my biggest gripes is how much you need to be able to run the newest Ratchet and Clank. I'm lucky my steam deck can run it or I'd be screwed.
It's a PS5-only game running on a portable device. Considering the state of a lot of ports (including this one at launch lol), it's a miracle that it runs this well.
-
I do run them at 1080p, trust me. Here is the thing, though, running 1080p on a native 4k screen makes for a horrible looking picture. It just looks off and very bad. Try it if you can.
It's best when the screen itself is physically 1080p. I think you fat-fingered the "b" in "running". Came out funny
You'll want to use Lossless Scaling. It'll quadruple the pixels without any filtering and make the output not look weird on a 4k display.
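For context on the "quadruple the pixels without any filtering" part: 1080p to 4K is exactly 2x in each direction, so integer (nearest-neighbor) scaling just turns each source pixel into a 2x2 block instead of blending neighbors, which is why it doesn't go blurry. Toy sketch of the idea only, not how Lossless Scaling itself is implemented:

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbor integer scaling: each source pixel becomes a factor x factor block."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# A 1080p frame (height, width, RGB) maps exactly onto a 4K panel with no interpolation.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_4k = integer_upscale(frame_1080p, 2)
print(frame_4k.shape)  # (2160, 3840, 3)
```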
-
I've been a PC gamer for almost 30 years now. This perpetual march of buying new PC upgrades to play new games is an old song.
With a new, much more expensive and faster tune.
Even back then you could play new games on a two-year-old mid-range GPU at OK fps.
-
Absolutely spot on. It needs a personal change. A change in mentality, in the way we think of entertainment, games in this case. We need to stop believing these profit-hungry people and stop chasing after those "shadows" and "highlights" and every single detail in games. When you enjoy a game, you won't even notice all those "details". You're not going to be running around in a game looking at trees and sunlight. I guarantee you that it all becomes a blur in the background and you won't even care about it.
It's all a collective anxiety they've trained us into. Chasing after those fps numbers and details is what got us hooked on their exorbitant prices and shit performance. It was somewhat of a wake-up call for me, if you wanna call it that. I just got tired of stressing about my hardware, which is totally fine and capable, not playing their shit, poorly optimized games. Then come those who defend corporations and berate you for "choosing the wrong resolution". Why? I like 4k. It looks nice. Why do I have to bear the burden of their poorly optimized games? They ARE 100% more than capable of optimizing their games to run at 4k ON MY RX6600, but they don't want to. They're lazy. It won't make them enough money. It won't squeeze the last penny out of our pockets.
They're also listening to Nvidia. Of course Nvidia wants us to believe that games are "so good" nowadays that we need their top-of-the-line $3000 GPU to play them at 4k. How dare we peasants play at 4k on a non-$3000 GPU? Blasphemy!!! Fuck'em. At least there are still folks like you out there who understand this bullshit and don't bootlick.
Lol you're cute.
Also, it's a fascinating novel perspective you've presented, in that our caring about fps and small details is a learned obsessive behavior. I'd love to hear more about how you think that works and came to be.
-
You'll want to use Lossless Scaling. It'll quadruple the pixels without any filtering and make the output not look weird on a 4k display.
Elaborate, please. What res would that be?
-
I didn't mention that in the OP, but Indiana Jones was running like shit at 1080p low settings. The fucking game forces DLSS. This is where gaming is heading: forced DLSS and forced garbage so we're forced to buy expensive shit.
Well yes, ray tracing is required. A 6600 XT is a mid-range card from several generations ago with very early-generation RT support. Not even the 7000 series does it super well.
Expecting anything good from something ray-traced on a 6600 XT is a pipe dream, let alone at 4k. Minimum requirements on Steam are usually for 1080p/30fps (sometimes 720p/30fps WITH DLSS/FSR upscaling).
Yes, GPUs are expensive, but that doesn't mean you should expect those kinds of titles to run on your hardware.
You haven't mentioned what CPU you have, as running 4k also increases that demand a considerable amount, especially with 2x 4k monitors.
If you want a GPU upgrade, the Arc Battlemage (and soon Arc Celestial) GPUs should be affordable. Battlemage is coming down to $300 new right now, so those should be able to handle newer titles better, but from a second-hand perspective, DON'T expect to be able to play RT-required games, let alone without upscaling.
I hate the prices as much as anyone else, but the only option we have is to wait for better second-hand cards over time, stick to indie/AA games (not AAA or "AAAA"), and possibly run games at 1080p instead of 4k.
Best of luck getting everything running!