|
Cross-posted from XSplit forums, but I think it will be useful here, especially since I used SC2 as the reference footage.
Based on helping out on the XSplit forums for a couple of weeks now, I've seen that setting stage resolution and bitrate is very difficult for new streamers. A common error for first-time streamers is to pick a resolution that is too taxing for their CPU, or to pick a bitrate that is too low for their resolution (resulting in reduced quality and increased stuttering on stream) or too high for their CPU (resulting in increased CPU load to encode finer details and push bits around). There is a much longer guide devoted to getting started with XSplit that talks a little about these settings, but it doesn't seem to be sinking in. So I put together a quick guide on how to pick your resolution and bitrate.
[Lookup table image: CPU and upload speed to resolution/FPS, plus bitrate table; full-sized version at the Google Doc link below]
Google Doc link: Streaming Resolution, FPS, & Bitrate lookup
All you need to do is find your CPU and Speedtest upload speed on this table to get your max resolution & FPS, and then use the smaller bitrate table to get the bitrate for that resolution/FPS pair. Be sure to use this to set your Stage Resolution (View->Resolution), and then set your channel to "Default Stage Resolution" to minimize resizing.
EXAMPLE 1: I have an i5-2500K and a 2.5 Mbps upload, so I should set my stage resolution to 720p and the framerate to 30 FPS. But I don't have enough upload speed to use a high-bitrate stream, so I should start by setting my VBV maxrate to 2000 Kbps.
EXAMPLE 2: I have a super high-end i7-3960X, but only 2.0 Mbps up. Even though my CPU is capable of streaming 1080p@60fps, my internet connection is so slow that all I can stream is 480p at 30fps, with a VBV maxrate of 1500 Kbps. Hmmm ... maybe I should get a better internet connection so I can stream in higher quality.
EXAMPLE 3: I have an old quad-core Athlon II that plays games like LoL and SC2 just fine, but a super awesome 10 Mbps upload speed from my ISP. Even though my connection could support 1080p@60fps, my computer is probably only powerful enough to handle 360p@25fps. Perhaps it's time to buy a new PC?
EXAMPLE 4: I have an i7-3770K and an 8.0 Mbps upload speed. I can afford to stream at 1080p@45fps and use the low bitrate of 4200 Kbps. However, the 1080p@30 high bitrate of 6000 Kbps is also available. My viewers might actually like that stream better than 1080p@45 at the low bitrate.
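One pattern worth noting in these examples: the recommended VBV maxrate lands around 75-80% of the measured upload speed. That factor is my inference from the numbers above, not an official rule, but it makes for a quick sanity check:

  # Cap VBV maxrate at roughly 80% of your Speedtest upload so the stream
  # leaves headroom for overhead and other traffic. The 0.8 factor is an
  # assumption inferred from the examples above, not a hard rule.
  UPLOAD_KBPS=2500                          # e.g. a 2.5 Mbps upload
  MAXRATE_KBPS=$(( UPLOAD_KBPS * 8 / 10 ))
  echo "Suggested VBV maxrate: ${MAXRATE_KBPS} Kbps"   # prints 2000

You'd still use the table for the resolution/FPS pair; this only sanity-checks the bitrate against your connection.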
As always, these are estimates at best. Be sure to experiment with your specific setup. But I hope people find it helpful and easy to use! Also, if you have a CPU that's not listed, please add a comment with the bitrate & resolution you stream at and I will be more than happy to add it.
And now, some FAQ answers:
How did you pick these bitrates?
I took a replay that included a 30-second segment containing both a large army battle and multiple camera changes (managing drops/harass) and recorded it using Fraps. This is meant to simulate a "high complexity" scene that is difficult to encode. I then transcoded it at several resolutions and framerates using ffmpeg, which uses the same encoder as XSplit (x264), at settings equivalent to what XSplit uses except for the bitrate constraints. This gave me an average bitrate for high-complexity footage without constraint. This is not an exact science, but it is at least based on real data and should be a decent starting point.
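For the curious, the test pipeline looked roughly like this (a reconstruction with placeholder filenames, not my exact commands):

  # Transcode the Fraps capture at one resolution/FPS pair, CRF-based
  # with no VBV cap, to see what bitrate the footage "wants".
  ffmpeg -i fraps_capture.avi -vf scale=1280:720 -r 30 \
    -c:v libx264 -preset veryfast -crf 23 -an battle_720p30.mp4
  # Then read the resulting average bitrate back out:
  ffprobe -v error -show_entries format=bit_rate \
    -of default=noprint_wrappers=1 battle_720p30.mp4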
Why are the "High quality" bitrate recommendations so much higher than I see when I look at other guides?
There are several reasons for this:
- Higher bitrates are more CPU intensive in streaming scenarios, which makes gaming harder
- Very few viewers have connections consistently > 6 Mbps, especially if they are doing other things while watching a stream. So high-bitrate streams are only viable for streamers who can offer multiple bitrates, at which point you need a > 10 Mbps upload, which is often not even available in home/small-business markets.
- The higher bitrates are used only during complex scenes, which is a small fraction of the time spent streaming (though arguably the most important time!)
What about other settings (quality, preset, etc.)?
The right starting point is almost always to set quality to 10 and the preset to either "veryfast" or "XSplit default". If the game feels too slow at that point, try, in order:
- Dropping from 30 fps to 25 fps and reducing the VBV max bitrate by 10%
- Dropping the VBV max bitrate again by another 10-20% (experiment here)
- Lowering the quality until you think the stream is too ugly
- Dropping your XSplit stage resolution (View->Resolution)
If the game still feels like it's running fine, you can instead try two things to crank up the quality (see the sketch after this list):
- Set quality to "not set", and set your preset to "veryfast&ex:crf:24" or "XSplit default&ex:crf:24". If things are still running fine, and you are using a bitrate from the High Quality or Extreme Quality tier, keep decreasing the number after crf until you can't play the game. Stop when you get to 21.
- Increase the bitrate.
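To make the crf trick less mysterious: as I understand it, XSplit's "&ex:" suffix passes extra options through to the x264 encoder, so "veryfast&ex:crf:24" corresponds roughly to the command line below (a sketch with illustrative VBV numbers, not XSplit's exact internal invocation):

  # CRF sets the quality target; the VBV options cap the bitrate so
  # complex scenes can't exceed what your upload can carry.
  x264 --preset veryfast --crf 24 \
       --vbv-maxrate 3000 --vbv-bufsize 3000 \
       -o stream_test.264 gameplay_720p30.y4m

Lowering the number after crf raises quality wherever the VBV cap isn't binding; raising the VBV maxrate is what helps in the scenes where it is.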
Broadly speaking, increasing the quality will have some impact at all times but only a small impact in complex scenes, while increasing the bitrate will have a big impact during large battles or other complex scenes.
I will be running some experiments to analyse the impact of various preset & quality changes. Stay tuned ...
|
Thanks for sharing, now my stream runs way smoother
|
This guide looks like it was well researched, but unfortunately it doesn't seem very realistic. Those recommended bitrate numbers seem needlessly high. By not using a VBV max bitrate when testing for bitrate, you're essentially telling x264 to stream at an unconstrained constant rate factor, which doesn't allocate bits the way a VBV-constrained encode would. Such high bitrates should never be needed with a VBV.
As an example, check out the TSL4 stream - we do 1080p30 at 3 Mbps and it's incredibly high quality. Going higher starts to cause a lot of issues with viewers being unable to watch your stream due to their own Internet or ISP not being able to sustain such a bitrate, as well as the risk of CPU / GPU decoding not being able to keep up (Flash player is pretty terrible at decoding).
Also, lowering the bitrate or quality will not help with game slowdown, as they are just rate control options. FPS, resolution and preset are the most important factors for CPU. Recommending a CRF below 25 is also pretty much pointless: the bitrate will balloon to the point where the generated frames just aren't usable. As an example, a CRF of 24 at 1080p30 in SC2 easily generates 10+ Mbps frame sizes, which are well above most people's bitrates.
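You can verify this yourself: encode a battle-heavy clip at CRF 24 with no VBV cap and dump the per-frame packet sizes (filenames here are just placeholders):

  # Encode without any VBV constraint, then list per-frame sizes in
  # bytes; watch them spike during battles.
  ffmpeg -i battle_1080p30.avi -c:v libx264 -preset veryfast -crf 24 \
    -an crf24_test.mp4
  ffprobe -v error -select_streams v:0 \
    -show_entries packet=pts_time,size -of csv=p=0 crf24_test.mp4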
In terms of raw quality, bitrate has the most impact on quality regardless of anything else. You should optimize for bitrate first based on upload, then select the appropriate resolution, preset and FPS for the CPU.
|
On September 05 2012 12:50 R1CH wrote: This guide looks like it was well researched, but unfortunately seems not to be very realistic. Those recommended bitrate numbers seem needlessly high.
...
In terms of raw quality, bitrate has the most impact on quality regardless of anything else. You should optimize for bitrate first based on upload, then select the appropriate resolution, preset and FPS for the CPU.
I tested CRFs all the way down to 18. It's true that when using a VBV constraint for streaming*, going below the 20-22 range doesn't improve quality much during any battle of meaningful size. During significant battles (say, when both players are above 100 supply) going below CRF 25 helps very little. However, during less complex moments of gameplay, which are the vast majority of the time, decreasing the CRF while using a bitrate somewhat higher than other recommendations will result in a big boost in quality. I tested this and observed minimal impact on game framerates until I got down to CRF <= 20. Conveniently, the marginal gain in quality in going from CRF 21 to CRF 18 is much smaller than the gain from CRF 24 to CRF 21, so I think CRF 21 is probably a good stopping point for "best viable streaming quality".
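If anyone wants to repeat the comparison, this is roughly how I'd sweep CRF values and get a crude numerical quality score (a sketch; x264 may warn that its psy optimizations make SSIM less meaningful, so treat the numbers as a rough guide):

  # Encode the same clip at several CRFs under a fixed VBV cap and print
  # the SSIM line x264 reports at the end of each encode.
  for crf in 18 21 24 27; do
    x264 --preset veryfast --crf $crf --ssim \
         --vbv-maxrate 3500 --vbv-bufsize 3500 \
         -o /dev/null clip_720p30.y4m 2>&1 | grep -i ssim
  done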
I realize that the High/Extreme bitrate recommendations are unrealistically high for 1080p streams. I'm considering deleting them, especially if there's feedback that many viewers can't watch streams at those bitrates. But there may still be an audience for them. Based on my "extremely scientific" poll in the SC2 General Forum, 70% of stream viewers have > 3 Mbps connections, and 60% have > 6 Mbps. I do think that anyone whose best-quality stream has a bitrate above, say, 4500 Kbps should provide a second low-res/bitrate/quality stream for lower-bandwidth users so that everyone can watch, and the current guide says that.
I'm a little surprised that 3000 Kbps is considered enough for the TSL high-quality stream, unless it's encoded using the medium or slow preset (an overclocked 3930K used as a dedicated streaming PC can encode a 720p source at the slow preset in real time!). Obviously a popular site like Team Liquid is going to have to provide a good experience for the largest number of users, but it might be worth considering an "extreme quality" stream for high-bandwidth users. Based on my eyeballing of Task Manager a few weeks ago, the MLG Ultimate stream used 4500-5000 Kbps (and even that was unable to keep up with large battles, though I don't know what the rest of their encoding settings looked like).
I'm not so sure it's correct to "optimize for bitrate first based on upload". If you have a pre-Sandy Bridge i3 and a 10 Mbps upload connection, there's no point using 7500 Kbps to stream 480p just because the bandwidth is there. Divorcing the bitrate from your resolution/FPS/CPU power entirely will lead to weird results like this. Also, increasing the bitrate does seem to result in increased CPU use and/or decreased encoding FPS, though my numbers for encoding FPS were from offline tests rather than streaming. CPU and bitrate end up going hand-in-hand, at least a little.
* If someone let me be king for a day, I'd decree that we all constrain the bitrate with --bitrate and a proper rate tolerance rather than fiddle around with CRF (an arbitrary unit of measurement), VBV bufsize, rc-lookahead, etc., but that's a whole 'nother kettle of fish.
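For concreteness, that decree would look something like this in x264 terms (numbers are illustrative):

  # A plain average-bitrate target with a rate tolerance, instead of
  # steering quality indirectly through CRF.
  x264 --preset veryfast --bitrate 3000 --ratetol 1.0 \
       -o stream.264 clip_720p30.y4m

In practice a live stream would still want a VBV cap layered on top so no short stretch exceeds the uplink.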