When using this resource, please read the opening post. The Tech Support forum regulars have helped create countless desktop systems without any compensation. The least you can do is provide all of the information required for them to help you properly.
For some reason my 970 in MSI Afterburner is only getting ~100 fps at lowest settings 1080p, even with like +150 core/+500 mem.
What do I need to run 144hz stable? I am assuming a GPU upgrade - I just purchased this CPU after toasting the IMC on a delidded 4790k.
You want to first identify where the bottleneck is. If you drop the resolution to 720p or lower, does FPS move up significantly? Can you get 144hz+ stable at 720p? This just helps you understand if your CPU/memory can support the higher framerates if you get a better GPU.
The 4690k at 4.5GHz should be able to power up to a 1070-1080 level GPU before it starts really bottlenecking in most games, so I'd guess the 970 is the bottleneck. If you can bump RAM up to 1800/2133 it might help a bit with minimum framerates, even if you have to loosen timings a bit, without upgrading the GPU.
Essentially, since you're at ~100fps, you'll want something at least 50% faster on paper to reach 144hz. This is around the 1660 Ti level. Ideally, you'd get something a bit more powerful so you can increase the fidelity as well, from low to higher levels.
The 2060 Super or rx 5700 is probably the best buy in the immediate future for you. You'll be CPU bottlenecked in pretty much any game until you can upgrade the rest of your system.
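To put numbers on the "50% faster" rule of thumb, here's a quick back-of-the-envelope sketch (an illustrative helper, not from any benchmarking tool; it assumes the game would be purely GPU-bound at current settings):

```python
def required_gpu_uplift(current_fps: float, target_fps: float) -> float:
    """Minimum on-paper performance uplift (in percent) a new GPU needs
    over the current one to hit target_fps, assuming a pure GPU bottleneck."""
    return (target_fps / current_fps - 1.0) * 100.0

# ~100 fps today with a 144 fps target needs a card at least ~44% faster,
# so "at least 50% faster on paper" leaves a little safety margin.
print(f"{required_gpu_uplift(100, 144):.0f}% faster needed")
```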
So what am I looking for here? If the voltage isn't high enough you're saying it can impact performance invisibly such that I wouldn't be able to tell via benchmark software?
You may not be able to tell via monitoring software, but it would affect benchmarks. I'm not sure how to find the best voltage for manual OC's atm.
Out of the box it should do 4.2ghz single core, lower on multi - so if you can maintain your single core benches while raising the multi then it should be an improvement. Probably mainly focus on memory though as bad memory settings will kill performance and motherboards aren't very good at auto memory settings, especially the non-$700 motherboards.
Gotcha so I want to maximize all core performance without sacrificing single core. Is that a risk with locking the speed higher across all cores?
After further investigation, while the auto mobo v setting reported 1.36v, the cores were all drawing 1.1v. I forced the bios to 1.325 and performance in Cinebench improved ~100 points. I can force the cores to 4.250 and 4.750 but it becomes increasingly unstable and 4.300 results in inevitable crash (even with increased voltage).
PBO+OC - If I understand how to do this correctly, consistently clocks 3910 or less. Considering how practically every setting in the MSi bios or Ryzen master is called "Precision Boost Something" or "Something Overclock" I have no idea what things I'm supposed to "enable"/"disable" for this function.
The RAM seems to be stable @ 3333 16-18-18-36 1.35v (which I was pleased with considering I took a risk on the brand, +1 for OLOy btw). I will check out the tuning program in the video you linked, it will be interesting to see what it recommends, thanks!
For overclocking, some motherboards are better than others for options, I guess due to some limitations (i.e. HPET), but generally I would stay under 1.3v vcore / 1.9v input and see what I get there; if nothing magic, then whatever. Same with the GPU. As long as it has 4 cores - but that's a bottom-level requirement for my system. *FORTNITE* :D
CPU/memory can support the higher framerates if you get a better GPU.
I considered this. I built this system for SCII but now I use it for Fortnite and test with SCII. I was thinking a 1080 as well. But I also considered the fact that it might be the hyperthreading - I don't know if this game uses hyperthreading well or if it's a requirement, because I should be running this crap at 300+ stable considering it runs on PS4. I mean LOL, does it run at 30 hz on there, christ.
I bought the 970 for the price, but it only has 3.5GB of usable VRAM and can't use the full 4GB in some games, so I can't play some games normally - that kind of thing - so I just think buying a GTX/RTX x80 card would be best.
I have some 1866MHz memory but I use 1600 to have better mouse response. I don't think that would be the case here, but I can test with some higher memory until, you know, I get a better GPU or another 4790/60.
I'm pretty sure my net is fine (50/10), so I'm guessing it is a hardware issue. I think I underestimated this game.
I have tried running 1366 and testing now......
Nope, still 100 fps, but if I drop the 3D resolution down to 35% it runs a steady 125-135. What to do????
considering it runs on PS4. I mean LOL does it run at 30 hz on there christ.
A lot of ps4 games have dips below 30 and even below 20fps at times, ya, although some try to stay near/at 60 most of the time. It's not guaranteed at all that they'll maintain good perf.
but if I drop the 3D resolution down to 35% it runs a steady 125-135. What to do????
CPU/memory can support the higher framerates if you get a better GPU.
I considered this. I built this system for SCII but now I use it for Fortnite and test with SCII. I was thinking a 1080 as well. But I also considered the fact that it might be the hyperthreading - I don't know if this game uses hyperthreading well or if it's a requirement, because I should be running this crap at 300+ stable considering it runs on PS4. I mean LOL, does it run at 30 hz on there, christ.
- snip -
I have tried running 1366 and testing now......
Nope, still 100 fps, but if I drop the 3D resolution down to 35% it runs a steady 125-135. What to do????
Agreed with Cyro
You're not gonna gain a huge amount of frames from a GPU upgrade, you'll top out around 120 fps. You can gain a lot of detail (higher shaders, better visuals in general) with a GPU upgrade despite that with something around the 5700/2060S/5700XT level of performance. You can get a better GPU than that, but it won't really help you until you can upgrade the rest of the system. A 1080 is pretty dated now, so unless you can find one for a steep discount, you're better off buying a newer card with better all-round performance.
For RAM, you really want to be going up in frequency (higher numbers). Going down to 1333 won't help you at all. If you can't go up, leave it at 1600.
Yes, I meant 1366 resolution. I put in some 1866 memory and it did help with the lower drops, because it was bad at the beginning in the loading world, and I imagine when there's a lot of building or larger player numbers. So now it is like manageable at 1080p mid-round, but yeah, the loading world is terrible - it's not even like frame drops but tearing, lol, yuck.
The game settings say 'allow multithreaded rendering', but eh, who knows what that does - it didn't change anything for me. I don't even know why that's there.
Now, settings - I don't use any, just medium if anything for view distance. But I don't know what the hell it is. I think even if I didn't cook my 4790 the HT would not make a difference. I could still use it, but it can only do 1 memory channel, and it's channel B LOL - I cracked the side trying to vise it, plus a few razor marks.
So yeah, with 8GB of RAM I doubt that would help at all. But mid-round, when there is less going on, it's fine. I'm wondering if some of that is server lag when loading in, because I didn't go through a lot of replays to see if that contributes.
I do not want to move from DDR3 or the 4690k - I think they SHOULD do fine? But yeah, like you guys are saying, this no-HT, low-OC (won't go higher) setup with a 970 is just iffy. So yeah, I was thinking an x80 Ti card - I just feel like the higher clocks work better there; there just isn't any video benchmarking low-settings machines, and the ones that are out vary from chip to chip. Now, I don't need a 4K setup, just a steady 144 at 1080p. This setup should be capable of that, and is mid-round, but the loading world and beginning of rounds are just yuck.
but if I drop the 3D resolution down to 35% it runs a steady 125-135. What to do????
CPU limit
Now we're talking 1366 res at 3d res of 35%
I do not want to move from ddr3 or the 4690k I think they SHOULD do fine?
If you're happy with the 125-135fps you could probably maintain it at higher res with a stronger GPU, but if you had a lot of CPU headroom the FPS would fly up to like 200+ sustained when dropping the graphics load like that.
I see, so what you propose is just upgrading the system instead - for the price of a 1080 Ti, just upgrade in general, which is what I am thinking. I also considered this because a G1 Gaming 1080 Ti is $800, which is just crazy. I mean, I can probably pull together a better setup with a K CPU and mboard for that.
OK boys, I want to spend $800 on an upgrade - like I said, I would rather use a setup that will run 1080p; I only have a 1080p monitor.
What is your current build?
What is your monitor's native resolution?
1: 1920x1080
1080p Gaming 144hz non-streaming
Why do you want to upgrade? What do you want to achieve with the upgrade?
CPU + Mboard + Ram
Intel K CPU, Asus Mboard, Corsair/GSkill Ram
I would rather use generationally older hardware just to get into DDR4, but with some expandable socket compatibility, spending as little as possible. I don't need the very best, just the next preferred standard up from the 4690k/4790k.
For Intel CPU's the 9600k might be good - keep in mind that these generally require buying a cooler too if you don't have one that fits.
Ryzen 3600 is a better value CPU in general (partially because it has a decent cooler in with the price) but maybe not if you're not valuing the hyperthreading and productivity performance in favor of straight high-FPS in games that won't scale past 6c.
GPUs shouldn't be as expensive as you're thinking; 1080 Tis are last gen and have been EOL for like a year. After holding prices for years, the GPU market has taken a nosedive: FPS per dollar has increased 50%+ over a year ago, and the price of used 1080 Tis has halved.
Take a look at 1660ti's through 2070 super on Nvidia side and the 5700 / 5700xt for AMD.
On September 05 2019 14:51 Lmui wrote: A 1080 is pretty dated now, so unless you can find one for a steep discount, you're better off buying a newer card with better all-round performance.
Same performance as a 2070 (minus ray tracing), but 20% cheaper. There's a legit reason to get a 1080 if it happens to be in your price bracket.
Well, prospective components are the 5930k/8600k, a ROG RAMPAGE V EDITION 10/ASUS Prime Z370-A, and G.Skill 2133/3200 RAM.
I don't know - is the 8600k the new 4690k? Because apparently there was no difference, and the only difference was the price. Regardless of price they had little computational difference, and the 4690k was in use more in the community, so it was the better choice - and still is for SCII, but for Fortnite the 8600k is better.
The RAM confused me on the timings, and the... frequencies... are everything, but the frequency just doubles? So, like, they say 1866 is the performance limit of DDR3? Which frequency is the "1866" of DDR4?
More importantly, how does the 5930k compare to the 8600k, and is it worth investing in? They are priced similarly - I would really like to go the 2011v3 route, but I have to get another board anyway. Does this 2011v3 CPU/board support DDR4 higher than 2133?
I spent $350 on a new 970 and it's trash, and that's why I am saying a 1080 Ti - because anything less doesn't run Fortnite like it. It is stable over 200 fps, and the others are completely unreliable; from the benchmark videos I have seen there is just no comparing them. It has to be 1080 Ti performance and above, so what is at that performance point, price not considered? Intel is just better for gaming and, like I said, has more support - I do not use AMD CPUs or RX cards. The 2070 Super is ~$300 less than the 1080 Ti, but the 2070 is worse in performance. It is very similar, and I imagine it can hold 200 fps at 1080p, right? LOL. On the other hand, the 2080 Ti is ~$300 more with similar performance. The easy choice would be the 1080 Ti, to save money and be completely sure the hardware will hold; otherwise, if I get a 2070 and the frames are not optimal, I would have thrown another $300 in the trash. I will more than likely get a 1080 Ti eventually, after the CPU+board upgrade.
But from what I've been reading, Fortnite's game settings have become increasingly taxing as they update, and hyperthreading is not needed, but more cores are better, i.e. 6 rather than 4.
If we're being completely honest here, a 4690k @ 4.5GHz / 16GB 1600 RAM / 2060 Super should hold 144hz - but when I have Fortnite up, my GPU load is at 50% and CPU is at 90-95%, holy shit.
On September 06 2019 13:48 bt wrote: If we're being completely honest here, a 4690k @ 4.5GHz / 16GB 1600 RAM / 2060 Super should hold 144hz - but when I have Fortnite up, my GPU load is at 50% and CPU is at 90-95%, holy shit.
Well, that's a CPU bottleneck if I've ever seen one.
Start cranking up graphics settings til the GPU hits 85-95% under load. You might as well get some eye candy out of this as well.
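The rule of thumb in this exchange - CPU pinned while the GPU idles means a CPU bottleneck - can be written down as a toy heuristic (the utilization thresholds here are my own illustrative assumption, not from any monitoring tool):

```python
def likely_bottleneck(cpu_util: float, gpu_util: float) -> str:
    """Toy heuristic: whichever component is pinned while the other
    idles is the likely limiter. Thresholds are illustrative only."""
    if cpu_util >= 85 and gpu_util < 70:
        return "CPU"
    if gpu_util >= 85 and cpu_util < 70:
        return "GPU"
    return "unclear"

# The 50% GPU / 90-95% CPU readings from the post above:
assert likely_bottleneck(95, 50) == "CPU"
```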
I don't know is the 8600k the new 4690k? Because apparently there was no difference and the only difference was the price. Regardless of price they had little computation differences and the 4690k was in use more in the community so it was the better choice and still is for SCII but for Fortnite the 8600k is better.
More importantly how does the 5930k compare to the 8600k and is it worth investing in?
The 5930k and 4690k are the same generation, Haswell. It released in 2013. They perform very similarly core for core, 5930k probably a bit weaker due to frequency limits but it has 6 cores and hyperthreading. Neither are really relevant outside of cheap used options.
The 8600k is an architecture newer on a substantially more advanced manufacturing process, but it's no longer relevant because it was replaced with the 9600k a while back - the same thing, but marginally improved.
That's a 6c6t CPU with substantially more core performance than the 4c4t 4690k, as it gets more work done per clock due to a range of improvements; it also clocks some four or five hundred MHz higher.
The RAM confused me on the timings, and the... frequencies... are everything, but the frequency just doubles? So, like, they say 1866 is the performance limit of DDR3? Which frequency is the "1866" of DDR4?
The advertised mhz of RAM is the data rate, which is double the frequency. "1866mhz" ddr3 is actually 933mhz frequency, but since everybody advertises the data rate you don't have to pay any attention to that for your generation of CPU.
DDR3 started around a 1066-1333MHz data rate, scaled up to 2133-2400 being common, with up to around 3000MHz on the top end of daily clocks by the last generation.
DDR4 started with around 2133-2666mhz data rates but now 3000-3600mhz is standard and daily clocks in excess of 4000mhz are possible.
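Since DDR transfers data twice per clock, converting between the advertised number and the real clock is just a factor of two (a trivial sketch for illustration):

```python
def real_clock_mhz(advertised_data_rate: int) -> float:
    """DDR ("double data rate") transfers twice per clock cycle, so the
    advertised speed is double the actual I/O clock frequency."""
    return advertised_data_rate / 2

# "1866MHz" DDR3 actually runs a 933MHz clock; "3200MHz" DDR4 runs 1600MHz.
assert real_clock_mhz(1866) == 933
assert real_clock_mhz(3200) == 1600
```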
I'm running 4000c17 24/7 on 1.39v right now, as my Z370 Hero won't seem to accept any higher frequency (as far as I can tell, both the CPU and memory shouldn't have trouble with it), but current gen boards might do 4200-4400 24/7, especially more expensive ones.
Ryzens are probably more optimal at 3600-3800 with tighter timings for the same kit and voltage.
Those kinds of clocks (around a tight 3600) are achievable for both Intel and AMD, even on low-end motherboards with cheap memory kits right now, and they often impact performance more than current-gen CPU overclocks. So I'd encourage tweakers to look into memory performance and settings - even the basics, like manually setting primary/secondary/tertiary timings using a program to calculate them for you, can make a huge difference.
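One way to see why frequency and timings trade off is to convert CAS latency into nanoseconds, since cycles get shorter as the data rate climbs. A small sketch of that standard conversion (the kit numbers below are just illustrative):

```python
def cas_latency_ns(data_rate_mhz: int, cl: int) -> float:
    """True CAS latency in nanoseconds: CL clock cycles at the real
    clock, which is half the advertised DDR data rate."""
    clock_mhz = data_rate_mhz / 2          # DDR: data rate = 2x clock
    cycle_time_ns = 1000 / clock_mhz       # ns per clock cycle
    return cl * cycle_time_ns

# 3600 CL16 and 3200 CL14 land at roughly the same true latency (~8.8-8.9ns),
# while a 2133 CL15 kit is much slower (~14ns).
for rate, cl in [(3600, 16), (3200, 14), (2133, 15)]:
    print(f"{rate} CL{cl}: {cas_latency_ns(rate, cl):.2f} ns")
```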
As I previously recommended a few cases that are cheap, I would highly advise checking this YT clip:
It raises a few eyebrows in terms of case selection. Buyers beware.
Edit: I'm sorry for recommending the Q500L (I thought it was the same design as the Q300L); it has the PSU in the front and that's why the thermals are all bad:
I have a question related to FPS in SC2 (and I guess other games too). I'm currently running a G3258 CPU overclocked to 4.4GHz with an AMD HD7950 GPU on a standard 1080p monitor.
I'm planning to upgrade everything eventually, but I really want two monitors so looking to buy a Dell S2719DGF 1440p monitor.
Now the actual question. With my current setup in late game 200vs200 supply fights I'm seeing pretty noticeable FPS drops, down to 20ish at times. Will I see even worse FPS on 1440p resolution or is that all on the GPU, which I guess even the old HD7950 should be able to handle especially with the medium/low settings I'm running?