|
I was just wondering if I could get some more opinions on what I can do to optimize my XSplit and in-game StarCraft 2 and Dota 2 settings for streaming. If there's any way possible, I'd love to reduce the in-game lag in StarCraft 2 without compromising graphical quality.
I have an Asus P8Z77-V mobo, a 3770K @ 4.1 GHz, 8 GB of 2133 DDR3 RAM, and a 9800 GTX+ (waiting on my new video card). (Side note: CPU-Z says the RAM is running at 800 MHz when it should be running at 2133. Is there any way to fix that?)
Here are a few screenshots of my XSplit settings as well as some in-game settings. Thank you very much in advance!
![[image loading]](http://i.imgur.com/PvEoi.png)
![[image loading]](http://i.imgur.com/EpGts.png)
![[image loading]](http://i.imgur.com/3cW0q.png)
![[image loading]](http://i.imgur.com/lSd5B.png)
|
On August 28 2012 00:07 raybasto wrote: I have an Asus P8Z77-V mobo, a 3770K @ 4.1 GHz, 8 GB of 2133 DDR3 RAM, and a 9800 GTX+ (waiting on my new video card). (Side note: CPU-Z says the RAM is running at 800 MHz when it should be running at 2133. Is there any way to fix that?)
You should double the frequency that CPU-Z reports for the RAM, since it's double-data-rate (DDR) memory. For example, DDR3-2133 shows up as roughly 1066 MHz, so your 800 MHz reading means the RAM is running at 1600 MHz, which is the highest stock frequency for an Ivy Bridge memory controller. You can try to get it to run at 2133, but it's not guaranteed to work, as it's technically outside the CPU's specs. Dive into the BIOS for this.
|
Thanks for your help. I enabled XMP in my BIOS and that set my RAM to 2133 MHz, the timings are at stock in the BIOS, and I manually set the voltage to 1.6 V (which is what it's supposed to be according to the box). But when I check CPU-Z, it says the RAM runs at 800 MHz, which would be 1600 MHz effective. Anyone know why, or how to get it to 2133 MHz? Also, how do my XSplit settings look?
|
RAM being at 1600 MHz instead of 2133 isn't really a big deal. Did you check the timings to make sure they are as advertised or lower?
You have Quality set to 8.
AFAIK (and I looked into it quite a bit) the Quality setting is just a simple way of setting CRF. My internet doesn't seem to be loading anything right now, but you can find more information on the x264 wiki and elsewhere. Basically, with the Quality setting in XSplit all you are doing is limiting the quality of your stream at times: when you are inside your bandwidth limit (say you're allowed to encode at 5000 kb/s, but the stream only needs 3000 kb/s to hit that quality level at that moment), it won't use the other 2000 to improve quality past that mark and will save the bandwidth instead. So you can bump it up to 10, or bypass it by writing "Preset&ex:crf:XX" in the preset box with Quality set to "not set", which lets you push the setting to a higher quality than XSplit's 10 if you want. 10 = crf 25 IIRC, which is pretty good, but it's a noticeable quality limiter in some scenes where it wouldn't be necessary, if you don't mind using up to the bandwidth hard cap you set in the bitrate field. That hard cap won't really change no matter how high you set CRF, I think.
(correct me if I'm wrong, anyone more knowledgeable)
You are using a 3770K @ 4.2 GHz; there is no need to stop at 1280x720 at 25fps. My i7 950 (1.5 generations old) at 3.84 GHz currently runs 1280x720 @ veryfast @ 60fps at whatever CRF I set it to, without any real issues, and your CPU is far more powerful even though the difference in clock speed is small. I've got access to a 3770K downstairs but haven't put time into finalising overclocks or testing things out with XSplit yet, so I don't know exactly how it would perform, but you should be able to push 1280x720 at 60fps or 1920x1080 at 30fps on the veryfast preset without any real issues; I can do both on a far inferior CPU. My 3770K walks all over this CPU in benchmarks. (Though I have to question the voltage: a 1.312 V reading in CPU-Z? Are you using offset voltage? Mine runs HOT; I got 4.2 GHz stable at 1.08 V without using vdroop when I was experimenting. If you are running 1.312 V under full load it is serious cause for concern, mostly due to heat; Ivy Bridge is NOT good with voltage.)
When it comes to the framerates you are running XSplit at, there is usually a performance hit on games from capture. If you are playing at 1920x1080, going from 30fps to 60 might hurt your in-game performance a lot (in ways that don't show up in the CPU usage graph), so you will want to check whether you can do it smoothly. Also post results from www.speedtest.net and www.pingtest.net to a server close to you, for a better idea of your internet quality.
My XSplit isn't opening (my internet isn't running properly; I'll fix it after this post), but if you want a quick answer, try 1920x1080 @ 30fps @ veryfast, 1280x720 @ 60fps @ veryfast, and 1280x720 @ 30fps @ medium, with Quality on 10. If you want to push quality a bit higher, set Quality to "not set" and, in the preset box, instead of just "veryfast", enter "veryfast&ex:crf:20". CRF is a nonlinear scale where lower is better (the XSplit team translated it into the Quality setting for simplicity), so crf 20 will use around double the bitrate of crf 26 if you are not hitting your bitrate cap, IIRC. It's not really useful to go much further down; even in local recordings, 12-16 will give great quality without a bandwidth limiter (crf IS the limiter there).
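For reference, here's a rough command-line equivalent of the 720p60 / veryfast / crf 20 setup with a 600 kbps hard cap, just to show what those knobs mean in x264 terms. It's only a sketch on a local clip (the filenames are made up, and XSplit obviously handles the capture and streaming side itself):

```
# rough CLI equivalent of "veryfast&ex:crf:20" capped at 600 kbps
# (hypothetical filenames; just encoding a local clip, not capturing)
x264 --preset veryfast --crf 20 --vbv-maxrate 600 --vbv-bufsize 1200 --fps 60 -o out_720p60.mkv clip_720p60.y4m
```

The --vbv-maxrate/--vbv-bufsize pair is the "hard cap": crf sets the quality target, and the cap kicks in whenever hitting that target would need more bits than your upload allows.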
I personally run the game with low/medium shaders and a few other settings turned further down than yours (textures maxed; they don't cause much of a performance hit for me and are important visually), and it runs great. You might want to try the same: your excellent CPU is matched with a card that is now mediocre at best (due to ageing tech), and it could be bottlenecking you really hard if it can't keep up with the higher shader and graphics settings in SC2.
600 is, relatively speaking, an extremely low bitrate, but even if you can't go higher, a 3770K lets you push the preset a couple of notches slower than most people can, especially if you watch CPU usage and experiment a bit. I wouldn't do 1920x1080 livestreams with it (local recordings, go ahead :D), but 720p60 at veryfast or faster should be quite watchable, and thanks to the 60fps a pleasure to watch even at those bitrates.
If you have problems, watch the CPU usage graph in the Task Manager performance tab; if you are maxing out cores, move the preset a notch faster and try again, because going just 1% too far will destroy your performance. You need a buffer for when scenes get harder to encode, so either encode a replay at x2-x4 speed that involves maxed-out armies, or make sure you are not close to maxing out the CPU. Check CPU usage while actually encoding something (the SC2 login screen, a replay, a Dota 2 game), not while staring at a static scene, because the scene content affects CPU usage a lot.
Let me know if I missed anything or you have any other questions (or if I got anything wrong, anyone); I'm just thought-dumping.
|
I would look at this guide to see what your best available resolution, bitrate, and FPS are. The short version is that you should find an ISP who will give you much bigger upload speeds. A 3770 can probably deliver 1080p 30fps at the faster preset, but you would need a much better upload speed to do that realistically.
WRT crf: Cyro's right, amping up the quality is a great way to use your CPU to get better stream quality. But you will need to be able to set a higher bitrate to take advantage of the better (lower) CRF values. Also, like Cyro said, be sure to set Quality to "not set" or the crf you specify in the preset box won't take effect.
Personally I wouldn't take crf below 18, which is generally considered indistinguishable from source to the naked eye. Also, if you are downscaling from the source resolution, the quality loss from downscaling will dwarf almost everything else. If you can already render 1080p at crf 18, pushing further gives diminishing returns for your CPU; you'd be better off using a slower preset or increasing the bitrate so that fewer scenes are bitrate-constrained.
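To make the bitrate point concrete, here's the kind of comparison I mean, as a sketch on a local clip (the filenames and the cap are made-up numbers):

```
# crf 18 with no cap: near-transparent, but bitrate spikes way past anything streamable
x264 --preset medium --crf 18 -o uncapped.mkv source_1080p.y4m

# crf 18 with a cap: same quality target, but big battles slam into the VBV limit,
# so the extra quality you asked for never materialises in exactly the scenes that need it
x264 --preset medium --crf 18 --vbv-maxrate 3500 --vbv-bufsize 7000 -o capped.mkv source_1080p.y4m
```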
|
A 3770 can probably deliver 1080p 30 fps at faster preset
I'd be surprised if you couldn't deliver 50fps at faster/veryfast with a ~4.2 GHz+ overclock at this point, but any pushing of boundaries needs a lot of testing.
|
On August 28 2012 06:50 Cyro wrote: I'd be surprised if you couldn't deliver 50fps at faster/veryfast with a ~4.2 GHz+ overclock at this point, but any pushing of boundaries needs a lot of testing.
Sure, but why use that CPU on FPS instead of more bits or a slower preset? FPS > 30 is hard to see unless you're actually watching from a TV.
|
On August 28 2012 07:13 ZeroTalent wrote: Sure, but why use that CPU on FPS instead of more bits or a slower preset? FPS > 30 is hard to see unless you're actually watching from a TV.
I disagree completely. From my own experience of being greatly affected by it, and from the opinions of people who watched IPL3, the last ASUS ROG, and Reddit, 60fps seems to be widely sought after.
At the very least, if you can't match a stream that has 30fps and a 3 Mbit bitrate because you are stuck at 600 kbps, you might as well exceed it in some area. That could even make yours the "better" stream for some viewers: superior in one way instead of inferior in every way.
edit: Just saw the speedtest (it wasn't loading before because of my internet issues). With 0.98 Mbps up you can probably push the bitrate a bit further; I run 600 fine at roughly 0.75. You might squeeze out an extra 15%.
|
Cyro wrote: I disagree completely. From my own experience of being greatly affected by it, and from the opinions of people who watched IPL3, the last ASUS ROG, and Reddit, 60fps seems to be widely sought after.
I'm sure people want it. And if you're watching on an actual television, I'm sure it's worth it. But most people watch from their gaming desktop or laptop, and I'm not so sure the difference from 30fps is really visible there. Yes, the monitor refreshes at 60 Hz, and on those 120 Hz monitors you can probably see it, but on non-high-end monitors it doesn't feel much smoother.
But even if it's "sought after", the 60fps mania feels a bit like the megapixel chase to me: it's a number, and people think more is always better, but there are things that matter besides FPS. The Nikon D40 is 6 megapixels but it takes way better pictures than most 8 MP or higher point-and-shoots. Likewise, per-frame quality matters as much as, if not more than, FPS in big battles in SC2. Given two streams at the same bitrate, one at 30fps and one at 60, I'd much rather watch the 30fps one.
Take the MLG Ultimate stream from this weekend. All I did was eyeball the network tab in the Task Manager, but it looked like 4000-5000 kbps. Without bitrate constraints, a midgame 120-supply TvT battle takes 8500 kbps to encode at --preset medium --crf 18, and marines and tanks don't have that much animation. A ZvP endgame battle of Stalker/Colossus/Mothership vs Infestor/Corruptor/Broodlord takes 30,000 kbps at 30fps with --preset medium --crf 18. At any realistic streaming bitrate, the encoder is totally hosed. And now we're supposed to think things will be more pleasant to watch if the encoder has to produce twice as many frames with the same number of bits? It's also a lot of CPU to push all those frames, and you could spend that CPU on a slower preset that gets you better compression and therefore more quality per bit. Why not stick with 30 frames and amp up the preset and/or crf and/or bitrate?
I don't have the horsepower to both play and record at 60fps, but my friend and I did some tests on SC2 footage at 30fps vs 15fps (I ran them this weekend for other reasons), and at the same bitrate the per-frame quality at 15fps was way better, and the SSIM numbers bear this out. Now, 15fps is too slow; it looks like you're watching stop-motion animation. But outside of scrolling, when are you really going to notice 60fps when watching SC2? Meanwhile, basically every time someone casts psi storm the encoder can barely cope, and asking it to produce two frames instead of one just makes the situation worse. I blame Peter Jackson for this mess, but when you're filming The Hobbit you can set the encoder to the slowest preset, and Blu-rays have crazy high bitrates compared to streaming.
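If anyone wants to run a similar comparison, it's roughly this kind of thing; a sketch only, with placeholder filenames and bitrate (decimate the source to each framerate first, then encode both at the same bitrate and read the SSIM figure x264 prints at the end):

```
# make 30fps and 15fps versions of the same source clip
ffmpeg -i source_60fps.mkv -vf fps=30 -pix_fmt yuv420p -f yuv4mpegpipe clip_30.y4m
ffmpeg -i source_60fps.mkv -vf fps=15 -pix_fmt yuv420p -f yuv4mpegpipe clip_15.y4m

# encode both at the same bitrate; --ssim makes x264 report SSIM against its own input
x264 --preset medium --bitrate 1800 --ssim -o out_30.mkv clip_30.y4m
x264 --preset medium --bitrate 1800 --ssim -o out_15.mkv clip_15.y4m
```

Each encode is scored against its own decimated source, so this measures per-frame quality at the same total bitrate rather than smoothness.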
Would anyone else like to borrow the "streaming framerate isn't everything" soapbox?
|
Thanks everyone for the help. I haven't bought an XSplit license yet, so I should probably do that as soon as possible.
What do you guys think of my StarCraft and Dota in-game settings? Whenever I play a game, I feel a slight delay that I'm not used to, and it hinders my gameplay experience. Does everyone feel this delay? I was hoping to be able to stream and still play at the same quality (with no delay) as when I'm not streaming. Thanks again.
|
@ZeroTalent: While it's true that a lot of people don't mind the difference between 30 and 60fps, it IS clearly visible on any monitor running at 60 Hz. Take a look at http://boallen.com/fps-compare.html and tell me that you don't see a difference there.
Yes, each individual frame gets fewer bits at a higher framerate, but the difference between consecutive frames is also smaller because the stream is updated more frequently, so 30fps doesn't look twice as good as 60fps frame for frame.
I really enjoy fluid streams. I can hardly watch a stream under 25fps (some of my friends stream at 20 or 15; it's disgusting), and a lot of other people feel the same. In fact, I streamed Dragon Warrior 3 for the GBC at 60fps and got praise for how smooth everything looked. I tested it at 30fps and was so disappointed that I had to turn it back up for sure.
|
What should my SC2/Dota 2 graphical settings be if I want to play with very very minimal delay?
|
On August 28 2012 10:32 Nabutso wrote: @ZeroTalent: While it's true that a lot of people don't mind the difference between 30 and 60fps, it IS clearly visible on any monitor running at 60 Hz. Take a look at http://boallen.com/fps-compare.html and tell me that you don't see a difference there.
Alright, I do see a difference on my reasonably priced ASUS monitor. Point taken. And if I were watching someone play MW3 on a PC monitor, I'm sure I'd see a difference. I'm just not sure it matters so much for SC2, and I'm even less sure it's a better use of CPU compared to cranking up (well, down) the crf or using a slower preset.
I'm going to try some science to see what things actually look like.
|
On August 28 2012 10:46 raybasto wrote: What should my SC2/Dota 2 graphical settings be if I want to play with very very minimal delay?
I'm on an i7-2600K with a GTX 560, and I stream with Graphics on Medium and Shaders & Textures on High. You might have enough CPU to set Effects to High as well.
EDIT: hrm ... not sure what that graphics card can get away with. Hopefully the new one shows up soon!
|
Thanks. I will try medium graphics with high textures and shaders and tell you how it goes.
|
The delay will either be from a framerate drop caused by capture (if your FPS is far lower than you are used to) or from input lag added by capture. There are a few ways to minimise it, but it will always be there to some extent.
|
On August 28 2012 08:19 ZeroTalent wrote: [...] Given two streams at the same bitrate, one at 30fps and one at 60, I'd much rather watch the 30fps one. [...] Would anyone else like to borrow the "streaming framerate isn't everything" soapbox?
It's not about the numbers, or some stupid drama over a movie framerate, or poorly interpolated high-refresh-rate TVs; higher FPS is simply smoother and more lifelike.
You say 15fps is too slow, that "it looks like you're watching stop-motion animation", but what if your perspective were changed and 30fps looked like stop motion instead? You are used to watching 24/30fps, but if that baseline were much higher (if you were playing FPS games on a 120 Hz LCD or a CRT and only watched videos recorded at 60/120fps), it wouldn't seem that way, regardless of the "human eye limitations" stuff people say. You accept 30fps as totally normal and 15fps as unwatchably slow only because you are used to 30fps in the first place.
There is a certain pain in playing the game on lower settings with a minimum framerate well above 60 for almost the entire match, watching a good tournament stream (ASUS ROG) at 60fps, and then opening another stream afterwards and being able to tell immediately that it's 30fps; it just feels wrong. Streams can't be lossless, but even if you had to double the bitrate for the same quality per frame, running at half the bitrate per frame still feels closer to preserving the game and the sense of realism, and it is much more pleasurable to watch, particularly with scrolling or fast units, zergling packs, etc.
When IPL3 started with one of the first widely seen 60fps sc2 streams, a ton of people freaked out in the live report thread saying the IPL maps were using a game speed that was faster than normal or something.
I haven't seen anyone outside of the TV/console crowd actually complain about higher framerates, even at the cost of per-frame quality, and I've seen far too many people playing copycat, picking up on whatever drama and taking a side: the cheap wine paradox, where people will not only agree that it tastes good but actually think it does, just because somebody else said so or because it was more expensive.
I stopped taking those arguments seriously anyway when people were arguing against The Hobbit, saying "we don't want it to look realistic, we want it to take us out of our world, it's a fantasy movie" and things along those lines; it's just so stupid. Wasn't the entire overdone "HDTV" marketing campaign pitched as making TV "more realistic"?
|
With the upload speed you have, you could leave your PC at stock, as overclocking won't make much of a difference to streaming performance.
|
On August 28 2012 14:58 LgNKami wrote: With the upload speed you have, you could leave your PC at stock, as overclocking won't make much of a difference to streaming performance.
Ivy can't overclock well at all anyway. I'd be pretty confident going for 1.5x+ the overclock margin on Sandy (i5 vs i5), simply because heat limits destroy any chance of Ivy keeping up.
In my experience, unless you specifically opt for notably above-average cooling, you will actually get better benches (and more easily) with Sandy Bridge than with its successor.
|
You could try lowering CPU-bound settings such as physics, which won't decrease the graphical quality.
|