It finally hit me: Why GSL streams look washed out - Page 2
r.Evo
Germany 14079 Posts
I have always wondered where that problem could come from, from a photographer's point of view, and assumed that if the issue arises with video cameras it should show up with photo cameras as well, but the only place I noticed it was when stuff was badly edited. Thanks for teaching all of us a little bit about old standards wrecking us in today's world. =D
Lysenko
Iceland 2128 Posts
What's funny is that it's probably all coming from digital systems operating in compatibility modes to support outdated signal standards. I doubt much analog equipment remains in GOMTV's image stream, but this stuff is very hard to get away from.

Like you mention, interlacing is another great example. Interlacing was invented to reduce bandwidth when broadcasting over analog systems for display on CRTs, whose phosphors keep glowing for a bit after being illuminated. Nobody uses CRT displays anymore except in certain very narrow applications (for viewing HD, that is; of course there are millions of SD sets still in service), yet most countries' HD broadcast standards require that the 1920x1080 mode be broadcast interlaced, even though the digital broadcasting standard lets you adjust the amount of video compression independently of that. Then, on the other end, the TVs, which are generally not devices that support actual interlaced display modes, have to do all kinds of tricky signal processing to produce a nice-looking image. I think TV viewers would see a lot more objectionable artifacts, except that almost all pre-recorded content on mainstream TV is shot at 24 fps 1080p, and the round trip to 1080i/60 fields per second for broadcast and back again is something the TVs have gotten pretty good at doing automatically.

Of course, that 24 fps standard is itself a throwback to the early part of the last century, when motion picture film stock didn't have the tensile strength to be run through a camera much faster than that. Current film stock can handle higher frame rates without damage, and all modern digital cinema cameras support frame rates of 48 and above.

In a world with no analog broadcasting and no CRTs, we should not have 16-235 video color spaces or interlacing at all. I'm more ambivalent about 24 fps, because it has a characteristic look that I have learned to love as a filmmaker, and using a higher frame rate would have a direct negative impact on the cost and feasibility of visual effects work. Basically, 24 fps makes digital visual effects a lot easier and cheaper, since a lot of the work is still manual, frame-by-frame labor.
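For anyone curious what that 16-235 thing looks like in numbers, here's a quick Python sketch (just my own toy illustration, nothing from GOM's actual pipeline) of expanding limited-range 8-bit luma to full range. Skip this step and black sits at 16 and white at 235, which is exactly the washed-out look the blog is about:

```python
import numpy as np

def limited_to_full(y):
    """Expand 8-bit limited-range luma (16-235) to full range (0-255)."""
    y = np.asarray(y, dtype=np.float64)
    out = (y - 16.0) * 255.0 / 219.0   # 219 = 235 - 16, the limited-range span
    return np.clip(np.round(out), 0, 255).astype(np.uint8)

print(limited_to_full([16, 126, 235]))  # -> [  0 128 255]
```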
KMM
11 Posts
Blackmagic DeckLink, which operates only with digital signals.

The 1080i stream of GOM's GSL shown on the Korean TV channel "AniBox HD" is also wrong: it's zoomed out and has a small letterbox around it. The people at GOM are just incompetent.

@Lysenko: 24fps is cheaper... 48/60fps is still the way to go, imo. 24fps will slowly die out in the next 10 years.
Lysenko
Iceland 2128 Posts
On December 03 2012 12:55 KMM wrote: @Lysenko 24fps is cheaper... 48/60fps is still the way to go, imo. 24fps will slowly die out in the next 10 years.

I think the jury is totally still out. A lot of people who have seen footage from The Hobbit in 48 fps are unhappy with the look, including some non-experts. That said, I think the extra costs in post production will be the thing that prevents a wholesale switch to higher frame rates. Storage costs are an issue, and while storage costs are always dropping, visual effects people have been pretty good at expanding to fill all available space with things like massive caches for scene data and higher image and geometry resolutions. Furthermore, techniques like painstaking manual frame-by-frame painting and rotoscoping are essential tools that (almost) directly double in cost going to 48 fps. Render CPU resources also double, and that's another area where artists and software developers fill all available time no matter how fast the systems get, even at 24 fps.

Production is an issue too. With most productions shooting at 4K these days, on-set data storage requirements (which, because of the need to be ruggedized, fast, and portable, can be extremely expensive) would double as well. I think it's a very real possibility that a studio looking at 48 fps anytime in the next 20 years will find that visual effects costs go up 30% by making that decision, and that right there will kill it for a visual-effects-heavy film that doesn't have a James Cameron or Peter Jackson to push the issue. We'll see, though; there are obviously people who love the look and want to make it happen. Then there's the question of how to convert a 48 fps film to 1080i/60, lol...

Edit: Worth noting that the studios are already incurring a similar hit today for 3D, though there are some economies in 3D that might not apply to a higher frame rate. However, 3D can provide a much more different experience than a frame rate bump can, and the theaters can charge more as a result, so there's a better business case for it. Going from 24 to 48 is comparatively subtle.

Edit 2: I expect The Hobbit in the vast majority of theaters and on Blu-ray to be downsampled to 24 for both bandwidth and compatibility reasons. In fact, the reason they're choosing 48 rather than 60 is specifically for that.
KMM
11 Posts
Why would you still use 1080i in the future? As you already stated, interlacing is just a cheap method to save bandwidth, but as 48p becomes "standard", TV broadcasts will be in 1080p. Btw, 48p -> 60i is done the same way as 24p -> 60i, by repeating frames and making the video choppy. All you say against 48p is costs and data storage, but both factors get smaller in the future...

edit: 3D is imo still overrated, and afaik applying visual effects to a (real-footage) 3D movie costs more than 2D 48p... Why not just shoot in 60fps? That's way smoother than 48p, and most devices that can handle 1080p24 can also handle 1080p60. 60fps can be made into 30fps without problems, and the Blu-ray specification (AVCHD) allows 60fps.

As for interlacing and TVs:
All TV shows are 60fps
Ads are mostly 60fps (or 30fps)
Movies/series are 24fps

I live in Germany, so it's a whole lot different:
All TV shows are 50fps
Ads are 50fps (or 25fps)
Movies/series are 25fps (pal-speedup, see the small example below)

You are saying TVs got good at triple conversion deinterlacing; that's wrong. Motion adaptive deinterlacing is what TVs or PC GPUs got good at: upscaling the interlaced content to progressive while maintaining framerate. Most TVs will not even recognize that they're playing back 24fps; they will just deinterlace the 60i to 60p even if there are dupe frames.
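(Quick illustration of what pal-speedup means in numbers; the figures below are just a toy calculation, not from any broadcaster's spec:)

```python
import math

# "pal-speedup": a 24 fps film is simply played back at 25 fps for 50 Hz TV.
factor = 25 / 24                      # ~4.2% faster
runtime_24 = 120.0                    # a 120-minute film at 24 fps...
runtime_25 = runtime_24 / factor      # ...lasts about 115.2 minutes on PAL TV
pitch_shift = 12 * math.log2(factor)  # audio rises ~0.71 semitones unless corrected

print(f"{runtime_25:.1f} min, +{pitch_shift:.2f} semitones")
```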
Lysenko
Iceland 2128 Posts
On December 03 2012 14:24 KMM wrote: Why would you still use 1080i in the future? [...]

The reason 1080i isn't going away is that 720p/60 and 1080i/60 are the two licensed HD broadcast formats in the US and certain other countries, mostly in Asia. They're a lowest common denominator that everyone has to support (in terms of signal format; you can legally sell a 720p TV that downconverts 1080i). There are many TVs out there incapable of handling a 1080p signal, so the standards will require supporting 1080i for the foreseeable future.

The dynamic of storage and CPU is a lot more complicated than you suggest. The upward pressures on each in vfx, without regard to frame rate, keep pace with the technology. In fact, I see render times getting LONGER even as computers get faster, to achieve more accurate realism. Disk space is the same -- there are powerful techniques that require massive amounts of cached data, and as storage gets cheaper those techniques become more common and fill it up. Waiting 20 years will not help this, because we're going to be disk and CPU bound for that entire time even if formats stay the same.

24p to 60i conversion is actually a pretty smooth process. Each 24p frame is displayed for either 1/30 or 1/20 second, occupying 2 or 3 fields respectively, and all the lines in the frame always wind up being displayed. (A field, in interlaced video, consists of either the even or the odd lines in a frame.) Each frame alternates between those lengths, so there's no sensation of choppiness. 48p to 60i is something else again: you have to throw away half the image information for four fields and then show a full frame on the fifth. This will look choppy, so it's more likely that the procedure with The Hobbit will be to master to video from a 24 fps source.

Yes, TVs are decent at motion-adaptive deinterlacing, but speaking as a former television engineer for Mitsubishi Electric, which pioneered a lot of this digital HD technology, I can tell you that many TVs do specifically go looking to identify 24fps source material and show full frames at 1080p/24 or 60 rather than just deinterlacing the 1080i input. I know this because I was in the room in some instances where it was put in. However, whether that feature is there is irrelevant to this discussion.
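If it helps to see that cadence concretely, here's a rough Python sketch of the 2:3 pattern (illustration only; it ignores colorimetry and the exact field dominance a real encoder would pick):

```python
def pulldown_2_3(frames):
    """Map 24p frames onto 60i fields with the alternating 2:3 cadence:
    each frame occupies either 2 fields (1/30 s) or 3 fields (1/20 s),
    so every line of every frame reaches the screen."""
    fields = []
    parity = ("top", "bottom")
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3   # 2, 3, 2, 3, ...
        for _ in range(repeats):
            fields.append((frame, parity[len(fields) % 2]))
    return fields

# Four film frames become ten fields: 24 frames/s -> 60 fields/s.
print(pulldown_2_3(["A", "B", "C", "D"]))
# [('A', 'top'), ('A', 'bottom'), ('B', 'top'), ('B', 'bottom'), ('B', 'top'),
#  ('C', 'bottom'), ('C', 'top'), ('D', 'bottom'), ('D', 'top'), ('D', 'bottom')]
```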
Finally, 3D is probably somewhat more costly in visual effects than doubling the frame rate, but in many ways they're similar. The additional costs in 3D pretty much come down to extra labor in the layout and animation departments, plus possibly some up-front software development costs that can be amortized over multiple shows. Anyway, I've been working in these and related fields for pushing 18 years now, so I'm pretty sure of myself on the way the technology and costs scale over time. As for what fads or bandwagons the studios will jump on, that's harder to predict. I just think on frame rates the judgment will be that it's too costly for too small a benefit.
EtherealDeath
United States 8366 Posts
On December 03 2012 16:18 Lysenko wrote: The reason 1080i isn't going away is that 720p/30 and 1080i/60 are the two licensed HD broadcast formats in the US and certain other countries, mostly in Asia. [...]

18 years? Dang how old are you o.o
Lysenko
Iceland 2128 Posts
On December 03 2012 17:12 EtherealDeath wrote: 18 years? Dang how old are you o.o

I'm 41, turning 42 in June. I graduated college young, so I started my first real job at age 21, but it was a couple years later that I joined Mitsubishi, and I went to Disney Animation when I was 24, in April of 1996. (So "pushing 18 years" means a little less than 17 and a half. My 18th anniversary of joining Mitsubishi will be around June of 2013.) Lots more detail here, if you are interested in the whole story: http://www.teamliquid.net/blogs/viewblog.php?topic_id=230005
Aerisky
United States 12128 Posts
Great read and replies... I don't understand a lot of it (not even close!), but I thought it was really interesting to go through.
Lysenko
Iceland 2128 Posts
On December 03 2012 17:23 Aerisky wrote: Great read and replies... I don't understand a lot of it (not even close!), but I thought it was really interesting to go through.

Thanks man, I love this technology. Keeps me going at work when the specifics are getting me down.

Edit: I was around 26 when all my friends were telling me OH MAN YOU HAVE TO TRY STARCRAFT! Too bad I never did, or I'd probably have been a lot better at the game and have a 10+ year TL account. :D
unkkz
Norway 2196 Posts
Yeah fuck rotoscoping 48 fps is all i have to say on that issue. Interesting blog this. Made me learn a thing or two for work actually.
Lysenko
Iceland 2128 Posts
On December 03 2012 18:30 unkkz wrote: Yeah fuck rotoscoping 48 fps is all i have to say on that issue. Interesting blog this. Made me learn a thing or two for work actually.

All you do looking at a huge long roto is sigh and start clicking. :D
KMM
11 Posts
In Germany every HD channel is either 720p50 or 1080i50. I think high fps is still way too underrated; native 60fps HD content looks soooooo good in my eyes. 24fps is just wrong, a mediocre frame rate dating back to the time when you had to physically cut movies frame by frame... You're right about 48fps conversion, but let's be honest, what converts better to 60i than 60p does? haha
Lysenko
Iceland 2128 Posts
KMM: I agree that there are advantages to a higher frame rate, and producers might accept increases in cost to get there. However, there will be increases in cost. Also, I was in error: the U.S. broadcast standard specifies 720p to be broadcast at 60 fps, not 30. Sorry!!