I have been testing SC2 on my new M1 Mac mini (16 GB RAM) and the FPS results are not as good as I expected.
I have tested with
- 3 different resolutions - 4K UHD (3840 x 2160), Full HD (1920 x 1080) and HD (1280 x 720) - against all 5 graphics presets, using a 30-second late-game battle sequence from MARU vs BYUN [TvT], ASUS ROG Online 2020 Grand Final.
Not sure if I am missing a particular setting that can boost FPS. Also please let me know if there is a better way to benchmark performance on Apple Silicon.
On December 27 2020 19:50 Cyro wrote: The M1 is an ARM processor and SC2 is built for x86, so it's non-native code. There can be huge performance hits from running like that.
So you need the 2020 Intel version. After that, SC2 would have to be optimised for ARM, which I doubt Blizzard will do.
Yea, unless you have some assembly language code (which would be very surprising), it's just a checkbox to recompile and build an x86/ARM fat binary. I'd be surprised if Blizzard didn't do it sooner or later for SC2.
On December 28 2020 02:11 krishpy wrote: I have captured the CPU and GPU usage in the video while I was testing. It looks like it's GPU-bound currently.
There was also plenty of free memory during the test runs.
Sorry, I did not have the time to watch the video yet. But it makes sense that it is GPU-bound. In this case, I don't think we'll get that much from an ARM version, unfortunately. If you check these benchmarks:
The BaseMark ones near the end show the difference between the Rosetta version (emulated x86, like SC2 is now) and the native version. You gain a bit, but not that much.
Yea, unless you have some assembly language code (which would be very surprising), it's just a checkbox to recompile and build an x86/ARM fat binary. I'd be surprised if Blizzard didn't do it sooner or later for SC2.
It's not quite that easy, you need to fix all of your unaligned pointer dereferences which were previously generally fine on x86.
It's not quite that easy, you need to fix all of your unaligned pointer dereferences which were previously generally fine on x86.
Could you elaborate in simpler terms if you have the time? I find this whole architecture discussion both interesting and confusing.
On December 27 2020 21:52 heqat wrote: Hopefully SC2 will be recompiled for ARM. It's not that much work really. WoW has already been ported.
WoW is still making Blizzard money while on the other hand...
Yeah... SC2 servers have been broken for 6 days now and it probably won't be fixed until the new year. If they don't dedicate resources to maintenance, I doubt they'll spend resources on recompiling SC2.
It's not quite that easy, you need to fix all of your unaligned pointer dereferences which were previously generally fine on x86.
I guess it depends on your code style. I have been working on cross-platform applications for iOS and x86 for years, and 99.99% of the code required no change at all (and the changes were mostly due to some "hacky" code). Most of the code was pure C++ with pointers and all, but in our case everything was usually already properly aligned. I guess that if WoW was ported that fast, their code base should be clean enough to be simply recompiled (with a few changes here and there).
Yeah... SC2 servers have been broken for 6 days now and it probably won't be fixed until the new year. If they don't dedicate resources to maintenance, I doubt they'll spend resources on recompiling SC2.
I wouldn't hold my breath that it will be soon. I've never seen Blizzard neglect their SC2 servers for this long since I started playing in the SC2 beta.
Only at 720/low does the CPU start coming into play, as it hits 60-75 FPS and the GPU moves a little away from 100% utilization.
Odds are it won't run well even with recompilation. 720/medium was chugging along at 28-35 FPS, which isn't really playable. A small boost up to a 40 FPS average would help, but not much.
You'd have better luck saving your money until Apple releases a higher-end Apple Silicon chip. The M1 has an 8-core GPU at the moment; it'd probably have to scale up to 24-32 cores before you could consider 1080/medium settings.
On January 24 2021 11:21 renaissanceMAN wrote: Got my M1 MBP a few days ago and booted up SC2; ~90 FPS at 1920 x 1080. I was impressed, but has the game been recompiled?
It hasn't yet, at least from what I could find. Support does appear to be paying attention to M1 Macs, because they have a note posted about vsync causing hangs on M1 Macs. The pinned note mentions the issue is being worked on, but that doesn't tell us anything. There haven't been any official announcements at all.
Gaming aside, this thing is an absolute monster and the battery life is unbelievable.
Aside from the ARM thing and the whole pooling of CPU/GPU and RAM, which I'm not as sure about as some of you tech wizards:
I was always under the impression SC2 was CPU-bound and not optimised for multicore processors, not GPU-bound.
My 2008 iMac and my old i5 Windows machine were able to run it pretty well - well, primarily the latter. This despite the latter's relative lack of GPU grunt bottlenecking me out of many a game in the following 9 years until I upgraded.
From what I remember in the early days, and that's a long time ago now, folks with higher single-core clock speeds, even on processors with fewer or generally slower cores, were getting better performance than those with lower clocks, even when the latter occasionally had more advanced processors.
Can someone let me know if I'm going crazy/senile? I'd always assumed this was the case, and if I'm wrong about this, what else do I believe that's fallacious!? Even the laptop my college gave me can run it alright, and it's got a pretty ropey integrated GPU.
Is running SC2 via Rosetta pushing a lot of the workload onto the GPU in a game that's generally CPU-bound, causing the performance issues, or does the M1 chip generally do this or something?
From a Logic (Apple's native digital audio workstation) group I'm in, the M1 machines absolutely smoke it when recording and mixing audio, which can be very resource-intensive. I hope to pick one up at some point; I've heard good things and seen benchmarks for other intensive creative applications like rendering too.
I wouldn’t expect it to be a gaming monster at all, but it does surprise me that it can’t push SC2, even if it is via the emulation layer.
I can't comment on macOS, but you're correct about high CPU frequency on 1-2 cores - that's what counts in SC2. So you can get away with an Alder Lake i5 like the i5-12600K(F) if your goal is just gaming. Before Alder Lake, the Ryzen 5000 series was the hot deal. To give you perspective, any recent CPU from the last 1-2 years at 3+ GHz should do well enough in SC2. Even my i7-5820K (years old now) was doing fine. The only improvement you see with recent CPUs is handling late game a little more easily (less lag), most noticeable in 3v3 and 4v4. Other than that, you can use any old graphics card (even a 5-year-old one) and still be fine, because SC2 doesn't rely on the GPU that much.
Edit: Also, a 3 GHz CPU released today and a 3 GHz CPU from 2 years ago aren't the same, neither in SC2 nor elsewhere, even if they somehow have the same number of cores. IPC (instructions per cycle) improves with each CPU generation, which is why we all see more and more FPS in SC2 with newer CPUs even though Blizzard doesn't do optimisations anymore. Imagine IPC as a bag you take to the grocery store: 5 years ago you could carry 5 items in it, 4 years ago you could carry 6 items before having to return home for another round of "shopping", and so on. Hopefully that explains the FPS gain without any support from the game developer.
On January 03 2022 11:05 KNUCKLEHEAD wrote: what is the ladder experience on the m1 mac mini? I would just put everything on the lowest settings and try to get the best gameplay
I play on a 27" Cinema Display at a moderate resolution (definitely not 4K) and I still average ~90 FPS in almost all situations. The ladder experience has been great; I'm Diamond 2.
FYI: still on the same MBP, not a mini, but they're comparable. I'd be really interested to see what SC2 is like on the new M1 Pro or M1 Max.
I have a standardised CPU benchmark with data from Vermeer, Comet Lake and Alder Lake if anyone wants to run it on something else. PM me your Discord name.
-------
I was always under the impression SC2 was CPU-bound and not optimised for multicore processors, not GPU-bound.
My 2008 iMac and my old i5 Windows machine were able to run it pretty well - well, primarily the latter. This despite the latter's relative lack of GPU grunt bottlenecking me out of many a game in the following 9 years until I upgraded.
From what I remember in the early days, and that's a long time ago now, folks with higher single-core clock speeds, even on processors with fewer or generally slower cores, were getting better performance than those with lower clocks, even when the latter occasionally had more advanced processors.
Can someone let me know if I'm going crazy/senile? I'd always assumed this was the case, and if I'm wrong about this, what else do I believe that's fallacious!? Even the laptop my college gave me can run it alright, and it's got a pretty ropey integrated GPU.
SC2 needs very little graphics power, but the regular M1's GPU is slow and struggles with actual games. It's more there for desktop use, while the M1 Max does a better job as a mobile gaming system.
You need 1 core to run the simulation and 1 other core to do everything else without interrupting the important stuff. The most important limiting factor has been feeding that core with data from cache and memory - SC2 scaled massively from DDR3 to DDR4 to DDR5, as well as from L3 cache improvements - but core performance and clock speed are also very important.
-------
SC2 needs very little graphics power, but the regular M1's GPU is slow and struggles with actual games. It's more there for desktop use, while the M1 Max does a better job as a mobile gaming system.
You need 1 core to run the simulation and 1 other core to do everything else without interrupting the important stuff. The most important limiting factor has been feeding that core with data from cache and memory - SC2 scaled massively from DDR3 to DDR4 to DDR5, as well as from L3 cache improvements - but core performance and clock speed are also very important.
Do you have an M1 Max?
No, I've just talked to a bunch of people running the M1 / M1 Max in the OSRS community. The M1 is struggling big-time on the GPU side there.