|
On November 30 2012 04:05 jacosajh wrote:
On November 25 2012 21:08 Firebolt145 wrote: For those that haven't been following the changelog: * Added stream delay feature. A feature which is only available for pro subscribers of XSplit is now free in OBS. ^_^
This is a nice feature. But it makes me wonder if OBS will be free for much longer ^_^
Did you read its license? It will stay free.
|
First time I streamed with OBS, only SC2 showed up on the stream. Now everything shows up. How do I keep it so that only SC2 is being streamed?
|
Use game capture or window capture with Aero on.
|
|
On November 30 2012 08:22 CrazyIvanPL wrote: I'm here to say one thing: dunno how, but I managed to stream 720p watchable SC2 from a C2D CPU... can anyone explain this to me? ;P
What fps is your average during gameplay, and what's your minimum? Use Fraps to bench a game, roughly. I truly wish some sort of benchmarking map/utility existed for SC2. The preloader might be okay, but since it runs so short, the fraction of a second at which you start it can heavily change the avg/min fps: between the idling fps right when the program ends and the heavy load immediately when it starts, starting or stopping your bench too early or too late gives inconsistent results, and you can't exactly tell it to start when you're done loading, since load time is never consistent.
But if you can do so, good for you. I'm concerned whether it's truly 'playable' with intense 200v200 mothership and broodlord armies fighting, but I imagine it is doable.
Which raises the question: what exactly determines CPU usage when streaming? Fps? Resolution? Bitrate? I mean, since bitrate is a big part of it... I don't understand exactly how changing the fps changes the CPU load, or if it does at all (I mean as long as you are below your minimum; I forgot how exactly you figure out the max fps your CPU can do, but minimum frames has something to do with it). What settings would raise or lower CPU usage?
I mean, the quality of your stream is set by your bitrate, which has nothing to do with your CPU. I guess it's about how much your CPU is compressing the image... but what exactly influences that? Is every single CPU going to be at 80%+ usage on every core, with more fps the better your CPU?
|
Well, my GPU is not new and my fps ingame is like 50-60, with drops to like 30-40 during big battles. The only thing that happens while streaming is a little stuttering on stream, but it's still small enough to watch imo.
And ofc I have affinity set for OBS so I have less CPU usage; it's between 70-90%, jumping all the time, so it's hard to say how much exactly ;p My settings are 1500/2000 bitrate/buffer, rescaling to 720p from 1080. Ofc Aero is disabled, along with all the stuff I don't need to run during streaming. I think most of the CPU is taken by the process of compressing the video and sending it; bitrate just determines how much you can send out, which is connected with how much your CPU can compress at a time...
|
Streaming = Encoding = CPU Usage.
When you encode and capture at the same time, it eats up your CPU, hence the fps drops and such. A capture card can help reduce this.
|
A capture card costs more than a new CPU + motherboard ^^
|
United Kingdom20163 Posts
On December 01 2012 18:11 Belial88 wrote: (quote trimmed, see the post above)
CPU usage is decided almost entirely by encoding settings, so the resolution*fps that you use and the preset
see http://mewiki.project357.com/wiki/X264_Settings
You can't really "compress more"; that's basically what the preset setting does, but tripling CPU usage only gives you like 20% better compression efficiency. For practical purposes it is basically never worth mentioning anything but Veryfast unless you are streaming at a lower resolution, e.g. 540p on a high-end CPU. Since 540p uses a quarter of the load of 1080p, you have plenty of room to drop the preset to medium or slow (maybe slower on an OC'd Sandy/Ivy Bridge quad), and it can make a noticeable difference then, but otherwise, no. I wouldn't recommend changing it either way unless you know how to test settings properly and do it really extensively.
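The scaling described above (load tracks resolution×fps, with a multiplier for the preset) can be sketched in a few lines. The preset multipliers below are illustrative guesses for the sake of the example, not measured x264 numbers; the 540p-is-a-quarter-of-1080p relationship is the only figure taken from the post.

```python
# Rough sketch: encoder load scales with pixels * fps, times a preset factor.
# The preset factors are ASSUMED for illustration, not x264 measurements.
PRESET_COST = {"superfast": 0.6, "veryfast": 1.0, "faster": 1.6, "medium": 3.0}

def relative_load(width, height, fps, preset="veryfast"):
    """Load relative to 1080p30 at veryfast (arbitrary baseline of 1.0)."""
    baseline = 1920 * 1080 * 30
    return (width * height * fps / baseline) * PRESET_COST[preset]

# 540p carries a quarter of the pixels of 1080p at the same fps, which is
# why there is headroom for a slower preset at lower resolutions.
print(relative_load(960, 540, 30))             # 0.25
print(relative_load(960, 540, 30, "medium"))   # 0.75
```

With numbers like these it's easy to see the point being made: 540p at medium is still cheaper than 1080p at veryfast.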
|
On December 01 2012 20:55 CrazyIvanPL wrote: (quote trimmed, see the post above)
A little stuttering in stream is a stream setting issue.
You should NOT set affinity for OBS; that would be limiting your system. If anything, set SC2's priority to High and the two streaming processes to Above Normal. You are aware that setting OBS's priority or affinity to something different doesn't really do anything, right? When you start a stream, another process appears in Task Manager, and that's the one you'd fiddle with, although you really shouldn't. Just put SC2's priority on Above Normal so your gaming is smooth.
(quoting Cyro's explanation above, trimmed)
I know that the preset is how fast you encode; a faster encode is obviously more work. But thanks, I see, about resolution and fps, since those change how much you compress per second.
|
Had to drop into the thread to say how awesome this program is. I could never stream in 720p with XSplit without getting massive frame drops both on stream and in game. With OBS I can stream at 720p while hardly noticing the difference between streaming and not streaming... it's pretty sick.
Had some trouble getting sound to work; had to get VAC and go a long way around it. I can't wait to see where this program goes as it gets refined and gains more features. The capture direct from the game is awesome too; that's prolly why it has such low impact on performance on my machine.
Edit: after a little playing around, I figured out why it wasn't picking up the game sounds. You need to have whatever output you are using in game set as your default output in Windows. I use my USB headset, but my default in Windows was my speakers... so obv no sound was being picked up by OBS. Once I switched it over to my USB headset, everything worked just fine with no need for VAC!
|
On December 02 2012 04:31 Belial88 wrote: (quote trimmed, see the posts above)
A faster encode is actually a lot less work; it uses a lot fewer CPU cycles. In an offline environment you would encode the same video in less time compared to slower presets, since you don't have to process in real time.
Resolution*FPS is about how much data is running through the encoder, not really compression efficiency. Twice as many pixels takes twice as many CPU cycles to encode.
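There is also a real-time angle to the resolution*fps point: in a live stream every frame has to be encoded before the next one arrives, so raising the framerate both shrinks the per-frame time budget and roughly linearly raises total cycles. A tiny sketch of the budget arithmetic:

```python
def frame_budget_ms(fps):
    # In live encoding each frame must finish before the next arrives,
    # so the wall-clock budget per frame is simply 1000 ms / fps.
    return 1000.0 / fps

for fps in (25, 30, 60):
    print(fps, round(frame_budget_ms(fps), 1))
# 25 -> 40.0 ms, 30 -> 33.3 ms, 60 -> 16.7 ms per frame
```

Going from 30 to 60 fps halves the time available per frame, which is the live-streaming version of "twice as many pixels, twice as many cycles".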
|
Yea, I had it backwards.
Resolution*FPS is about how much data is running through encoder, not really compression efficiency. Twice as many pixels takes twice as many CPU cycles to encode
Are you sure about that?
If you have X bitrate at Y resolution and you increase your fps, you're still using the same bitrate, so the image quality will go down unless you increase the bitrate to compensate (or lower the resolution). So even though it's encoding twice as many pixels, the image quality will drop to half...
I'm not sure about CPU load though. I think the issue of the image quality halving, with increased fps at the same resolution and bitrate, is more about your upload and connection... right?
Obviously it's not exactly halved, because the higher your fps, the less work your CPU does per frame, since each additional frame is more similar to the one before it; but that's the basic idea of what is happening...
Does bitrate affect your CPU load? It should, right? It affects how much data your CPU is sending out. Shouldn't your bitrate really be the biggest determinant of CPU load? Since lower bitrate at higher fps means the CPU isn't compressing as much, because the images just come out looking shitty...
|
Wow, this program is pretty beast. I have an i5-2300, which is pretty good but has a weak clock speed that's almost unsuitable for streaming, yet I can stream at 720p@30fps with this thing no problem. I had trouble streaming even 540p with FFSplit+DXtory and XSplit.
|
On December 02 2012 16:06 Belial88 wrote: (quote trimmed, see the post above)
Doubling the framerate would halve the bitrate per frame, but you can't just measure quality like that; quality vs. bitrate is nonlinear with massive diminishing returns, so a 50% cut to bitrate could be only a 20% cut to "quality".
Bitrate has no effect on CPU load; you need to get "compressing" out of your head, it doesn't work that way at all. Processing a frame at X resolution is a set amount of work depending on a lot of factors, and increasing the framerate is a pretty close to linear increase in load. The preset you encode at sets the difficulty of encoding and the compression efficiency relative to each other (Medium might give the same quality at 800 kbit as Veryfast at 1000 kbit), but the input and output bitrates are irrelevant to CPU load. You are not increasing the complexity or difficulty by adding more bitrate, just telling the encoder whether it is allowed to use more bits to increase quality. Compression efficiency and difficulty come from the preset, and then resolution*fps, and the returns are extremely low. Going from Veryfast to Superfast nets ~6% less CPU load but more than doubles the bitrate you need for the same quality. Stepping from Veryfast to Faster, Fast, Medium will increase CPU requirements many times over but give you extremely minor increases in quality at the same bitrate. The preset setting is really not of much use for live encoding (as opposed to offline, where you can leave a PC with 3 hours of video and give it a 12-hour night to encode at a setting that gives better quality within filesize limits).
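The preset trade-off above can be restated as "bitrate needed for roughly equal quality". In this sketch, the veryfast/medium ratio follows the 1000 vs 800 kbit example from the post; the superfast figure is only an assumed illustration of "more than double".

```python
# Relative bitrate needed for roughly equal quality (veryfast = 1.0).
# veryfast/medium ratio comes from the 1000-vs-800-kbit example above;
# the superfast value is an ASSUMED illustration, not a measurement.
EQUAL_QUALITY_BITRATE = {"superfast": 2.2, "veryfast": 1.0, "medium": 0.8}

def bitrate_for_same_quality(kbits_at_veryfast, preset):
    """Estimate the bitrate another preset needs to match veryfast quality."""
    return kbits_at_veryfast * EQUAL_QUALITY_BITRATE[preset]

print(round(bitrate_for_same_quality(1000, "medium")))     # 800
print(round(bitrate_for_same_quality(1000, "superfast")))  # 2200
```

Note what is NOT in this model: CPU load. That is the whole point of the post, the bitrate knob and the CPU-load knob are independent.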
|
I'm sure this is a tall order, but is there any way for OBS to take advantage of Intel's Quick Sync encoding? I'm not sure what kind of difference it can make with the encoding process of a livestream, but the benchmarks look pretty promising.
|
nm, I found it. Wish I could delete my post :|
|
On December 03 2012 08:14 Cyro wrote: (quote trimmed, see the post above)
Yea, I noticed that going to superfast and ultrafast on an Athlon II x4 3.4GHz/2.6NB resulted in less than a 0.5 fps increase in min/max/avg fps, according to a 300-second Fraps benchmark on a 200v200 lategame ZvP map with full creep and mothership, but it clearly made the stream look much shittier. In either case I was still at 90%+ utilization on every core. I was getting 25+ min/avg frames, and it was very much playable, but it seemed like even the Athlon II was too powerful to really need the quicker presets. Or maybe that's not how it works, but I found it, yea, as you say, quite useless.
I always understood that the slower presets were so you could upload a 1080 or even 720 stream at a relatively lower bitrate, to make the stream more accessible to people who might have low down speeds, and that the faster ones were to help really shit CPUs on good connections. I know someone actually did a test of the slower presets, and it resulted in significant decreases in the necessary bitrate for equal image quality, like a 500-1000 kbit reduction. He didn't test the quicker presets though.
Thanks though. I never noticed increased CPU load or lower fps when I increased my stream fps from 25 to 45. I think I did notice a slight decrease in image quality, but the increased fps made it look much better. 60fps looked poor on my stream; it made it look too weird I thought, like too smooth.
|
Hey Jim!! Very nice software, but I think the 64-bit version lacks the ability to use a game capture source.
|
On December 03 2012 17:40 Belial88 wrote: (quote trimmed, see the posts above)
Your encoding preset is not going to affect your ingame framerate unless your CPU usage is too high, particularly with SC2, which only heavily loads one core. You won't get any fighting over resources unless you set demands too high, or unless you have a game that will heavily load all of your cores at times, leaving no resources anywhere for the encoding.
90% across multiple cores is far too high; there's no room to account for spikes etc. You should have no more than ~70-80% on any core while you hold mouse click on the minimap and drag the camera around as fast as you can for 10-15 seconds, with Task Manager open in an always-on-top window. If you're above that, your settings are too aggressive to give a smooth and consistent stream and to avoid causing PC-wide stutters and other performance issues.
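The stress-test rule of thumb above (scroll the map hard, watch per-core usage, stay under ~70-80%) amounts to a simple check. This is just the decision logic as a sketch; the 0.8 threshold is the rule of thumb from the post, and the sample usage lists are made-up inputs.

```python
def has_headroom(per_core_usage, threshold=0.8):
    # per_core_usage: peak per-core utilisation (0.0-1.0) sampled while
    # stress-scrolling the map for 10-15 seconds with Task Manager on top.
    # The 0.8 default follows the ~70-80% rule of thumb above.
    return all(u <= threshold for u in per_core_usage)

print(has_headroom([0.75, 0.60, 0.70, 0.65]))  # True  -> settings are fine
print(has_headroom([0.95, 0.90, 0.92, 0.90]))  # False -> drop resolution/fps/preset
```

If the check fails, the fixes discussed in this thread apply in order of impact: lower the output resolution, then the fps, and only then think about the preset.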
|