When using this resource, please read the opening post. The Tech Support forum regulars have helped create countless desktop systems without any compensation. The least you can do is provide all of the information required for them to help you properly.
On June 04 2025 13:09 mantequilla wrote: For 1080p, is there a difference between the 4070 Super 12GB and the 5070 12GB? Benchmark scores seem very close, but I don't know if the 5070 has some newer technology that could make a difference in the coming years.
The 5070 is available here for $680 (new) and the 4070 Super for $555 (used).
Quad framegen is the main thing; it's much better than dual framegen (same latency cost, but 3x as many added frames) and it massively changes the balance of what you can run while keeping game performance at an acceptable level. The gaming landscape has become sharply divided between the games and cards that support quad framegen and those that don't, especially for high refresh rate users.
Under the hood, the 5000 series has several times more tensor performance to enable this and other features.
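To make the "same latency cost" point concrete, here's a minimal sketch of the arithmetic, assuming an interpolating frame generator that holds one rendered frame to interpolate towards (the exact pipeline details are NVIDIA's; this is just the rough model):

```python
# Rough model of interpolation-based frame generation: the generator holds one
# rendered frame, so the added latency is ~one base frame time for any factor.

def framegen(base_fps: float, factor: int) -> tuple[float, float]:
    """Return (output fps, approx added latency in ms) for interpolation-based FG."""
    output_fps = base_fps * factor        # 2x adds 1 generated frame per rendered frame, 4x adds 3
    added_latency_ms = 1000.0 / base_fps  # ~one held rendered frame, independent of factor
    return output_fps, added_latency_ms

for factor in (2, 4):
    fps, lat = framegen(60.0, factor)
    print(f"{factor}x: 60 fps base -> {fps:.0f} fps out, ~{lat:.1f} ms added latency")
# 2x: 60 fps base -> 120 fps out, ~16.7 ms added latency
# 4x: 60 fps base -> 240 fps out, ~16.7 ms added latency (same latency, 3x the added frames)
```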
There are also the display connectors: the 4000 series has DP1.4 while the 5000 series has DP2.1 w/ UHBR20, which more than doubles the achievable resolutions / refresh rates / color depth. That matters a lot if you want to use a high resolution and/or high refresh rate monitor.
For example, DP1.4 manages 4k 10-bit at 97hz, while DP2.1 UHBR20 can do 267hz.
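If you want to sanity-check those numbers, here's a rough sketch of the math. The effective link rates are the published ones; the flat blanking overhead is a simplification of the real CVT-RB v2 timing formulas:

```python
# Rough estimate of max refresh rate over a DisplayPort link.
# DP1.4 HBR3 = 32.4 Gbps raw with 8b/10b encoding (80% efficient),
# DP2.1 UHBR20 = 80 Gbps raw with 128b/132b encoding (~97% efficient).
EFFECTIVE_GBPS = {"DP1.4 HBR3": 25.92, "DP2.1 UHBR20": 77.37}

def max_refresh_hz(link: str, width: int, height: int, bits_per_channel: int) -> float:
    bpp = 3 * bits_per_channel                      # RGB, no chroma subsampling
    pixel_rate = EFFECTIVE_GBPS[link] * 1e9 / bpp   # pixels/second the link can carry
    h_total = width + 80                            # CVT-RB v2 horizontal blanking
    v_total = height + 62                           # crude flat estimate of vertical blanking
    return pixel_rate / (h_total * v_total)

for link in EFFECTIVE_GBPS:
    print(f"{link}: 4k 10-bit ~{max_refresh_hz(link, 3840, 2160, 10):.0f}hz")
# ~99hz and ~296hz with this flat model; the real limits (97hz / 267hz) are a
# bit lower because vertical blanking grows with refresh rate.
```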
DSC exists to stretch these limits, but it's awful. It causes features to be removed (like DSR/DLDSR) and serious usability problems, like the monitor not being detected for 5 seconds when you alt-tab, which has Windows move all of your desktop icons into a pile on your other screen(s), plus similar issues with e.g. game UIs. I used it for less than 2 weeks and I will never touch it again.
Warranty is a thing as well: used cards are at best 1-2 years into their warranty period, and at worst the warranty isn't transferable.
Personally I would be looking at a 5060 Ti 16GB (I'd rather have this than a 4070) or a 5070 Ti 16GB, depending on budget. There'll soon be an 18GB 5070 variant, but it's not here now, and until then the VRAM-to-performance and VRAM-to-price ratios are too hard a sell for the 12GB 5070 IMO.
When it comes to 1080p, how much of an issue is 12GB of VRAM? It seems people are divided between "doomed" and "non-issue."
I typically like to hold on to a card until it no longer performs well enough, by which point it has usually lost most of its resale value; honestly, I went from a GTX 750 -> 1060 -> 5700 XT.
In terms of raw performance, my 5700 XT is still "okay" for the games I care about, but it has several non-performance-related issues. For instance, the display driver crashes whenever I connect my second monitor, so I connected that monitor to the onboard HDMI (integrated graphics). However, when I move a window between monitors now, it behaves oddly, with YouTube videos freezing and similar issues.
There's a noticeable gap in raw performance between the 4070 Super and the 5060 Ti, but I'm unsure which will become a headache sooner in the long run: missing a specific feature, lacking raw power, or lacking enough VRAM.
Also, am I understanding correctly that frame gen doesn't help when you're getting, say, 20fps and bumping it 4x? It would still be unplayable, right?
On June 05 2025 03:53 mantequilla wrote: Thank you as always, Cryo!
When it comes to 1080p, how much of an issue is 12GB of VRAM? It seems people are divided between "doomed" and "non-issue."
I typically like to hold on to a card until it no longer performs well enough, by which point it has usually lost most of its resale value; honestly, I went from a GTX 750 -> 1060 -> 5700 XT.
In terms of raw performance, my 5700 XT is still "okay" for the games I care about, but it has several non-performance-related issues. For instance, the display driver crashes whenever I connect my second monitor, so I connected that monitor to the onboard HDMI (integrated graphics). However, when I move a window between monitors now, it behaves oddly, with YouTube videos freezing and similar issues.
There's a noticeable gap in raw performance between the 4070 Super and the 5060 Ti, but I'm unsure which will become a headache sooner in the long run: missing a specific feature, lacking raw power, or lacking enough VRAM.
Also, am I understanding correctly that frame gen doesn't help when you're getting, say, 20fps and bumping it 4x? It would still be unplayable, right?
I have a 4070 Super and I can game fairly comfortably at 3440x1440. I'm generally limited by what I consider to be unacceptably low fps at a given settings level before I run out of VRAM. At 1080p I'd imagine it to be a non-issue. I'd definitely take the extra native frames over a few extra generated frames.
On June 05 2025 03:53 mantequilla wrote: Thank you as always, Cryo!
When it comes to 1080p, how much of an issue is 12GB of VRAM? It seems people are divided between "doomed" and "non-issue."
I typically like to hold on to a card until it no longer performs well enough, by which point it has usually lost most of its resale value; honestly, I went from a GTX 750 -> 1060 -> 5700 XT.
In terms of raw performance, my 5700 XT is still "okay" for the games I care about, but it has several non-performance-related issues. For instance, the display driver crashes whenever I connect my second monitor, so I connected that monitor to the onboard HDMI (integrated graphics). However, when I move a window between monitors now, it behaves oddly, with YouTube videos freezing and similar issues.
There's a noticeable gap in raw performance between the 4070 Super and the 5060 Ti, but I'm unsure which will become a headache sooner in the long run: missing a specific feature, lacking raw power, or lacking enough VRAM.
Also, am I understanding correctly that frame gen doesn't help when you're getting, say, 20fps and bumping it 4x? It would still be unplayable, right?
The bar for being able to max out games currently sits between 8GB and 12GB for 1080p (one or two titles can't be maxed @ 1080p with 12GB, but the loss is minimal), and it rises each year. Historically, extra VRAM pays off very well: there is a large range of visual upgrades which cost primarily memory rather than processing time, and those run great on cards that are a little dated or lower end but have the VRAM, while not running at all on cards that don't.
Both DLSS and framegen are temporal and work better with higher framerates (better quality, fewer artifacts, and for framegen a smaller latency penalty). As a rough guide, an FPS of 30 before framegen is bad, 60 is okay, and 120 is excellent.
Personally I still prefer to have FG on even with a base framerate of 30, i.e. the result is less bad than just having 30fps, but the holes and downsides go from close to invisible at 120>480 to pretty huge and annoying at 30>120. It's just that playing a game at 30fps will be a mess of blur and stroboscopic artifacts anyway, so it's a case of bad vs less bad.
Blur is inversely proportional to framerate (i.e. half the framerate has 2x as much eye-tracking motion blur). Stroboscopic artifacting is inversely proportional as well: half the framerate has 2x the artifact size / visibility.
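As a quick illustration of those proportionalities (a toy model of a sample-and-hold display, not a measurement):

```python
# Eye-tracking motion blur on a sample-and-hold display: an object you track
# smears across roughly (speed / fps) pixels per frame; the stroboscopic step
# between frames is the same distance. Both halve every time framerate doubles.

def smear_px(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

for fps in (30, 60, 120, 240, 480):
    print(f"{fps:>3}fps: ~{smear_px(1920, fps):.0f}px of blur/step at 1920 px/s")
# 30fps: ~64px ... 480fps: ~4px, which is why 30>120 framegen still looks rough
# while 120>480 looks close to perfect.
```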
Framegen is huge for fixing both of these. People talk a lot about imperfections in interpolation, but generally don't give as much weight as they should to the image problems that come with a lower framerate, even though those often far outweigh any imperfection from interpolation.
Between DLSS & FG you are heavily incentivized to play at higher resolutions and refresh rates, because they're huge visual uplifts with minimal cost. 4k used to get 25-35% of the FPS of 1080p; now it's more like 80%, with very respectable image quality. If you can get to 60-120fps base, you can pump it to 240-480, etc. Running a 1920x1080 @60hz monitor is a huge waste of potential overall; you really need high resolution and/or high refresh rate (preferably both) to get the most out of a modern system. Without either, the monitor will be a huge bottleneck in your experience. That's also where having 2.7x the bandwidth from the newer version of DisplayPort plays a large role.
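For context on why 4k got so much cheaper: DLSS renders internally at a fixed fraction of the output resolution and upscales. A small sketch using the standard published scale factors:

```python
# Internal render resolution for each DLSS quality mode (standard scale factors).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5,
              "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"4k {mode}: renders internally at {w}x{h}")
# 4k Performance renders at 1920x1080 internally, so the GPU shades about as
# many pixels as native 1080p; upscaling overhead is why it lands near ~80%
# of the 1080p framerate rather than 100%.
```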
Those monitors are not super expensive, either. My main monitor at the moment is a 1080p 380hz fast IPS which costs ~£300 (MSRP for the 5060 Ti 16GB is £400, for reference). It has VRR with per-frame adaptive overdrive in the scaler, and it's better than a £750 monitor from 5 years ago. If that's too much, there are 280hz 1080p fast-IPS monitors at half that price (~£150), and that's kind of the minimum bar to make good use of a graphics card like this, I think. Big resolution or big refresh rate, both if possible :D
---
^ Good illustration of the state of VRAM: the bar isn't at 12GB yet, but it will be by 2027.
On June 26 2025 00:18 KwarK wrote: Looking for a laptop for running multiple EVE Online clients, for Minecraft, and for video calls. It will spend most of its time on a desk with peripherals but will also go to sleepovers or Minecraft parties from time to time. Budget is $500 or so. Thinking this? https://www.amazon.com/gp/product/B0D4WCZGNF?psc=1 like new for $550, or https://www.amazon.com/gp/product/B0CKXYYDRW?psc=1
Any better options out there or is this more or less what I’m after?
The first looks better; I don't really know much about laptops in that price range.
I am using an ultrawide monitor (2560x1080) but I cannot play games at 1920x1080. I'm sure that was possible with my GTX 1050, but not with the RTX 3050 anymore?!
On August 28 2025 17:37 Dingodile wrote: I am using an ultrawide monitor (2560x1080) but I cannot play games at 1920x1080. I'm sure that was possible with my GTX 1050, but not with the RTX 3050 anymore?!
You can make a custom 1920x1080 resolution in the NVIDIA Control Panel (Display > Change resolution > Customize > Create Custom Resolution); games should then offer it in their resolution lists.