Let's say a video has length T, and that at time t into the video the buffering ratio is r(t): that is, the part of the video at position t buffers at r(t) seconds of video per second of real time. (Above we did the case where r(t) is constant for all t.) Say, for example, the video is T = 10 seconds long and the ratio is r(t) = (t+1)/11 for 0 < t < T. What is the minimum total time you will spend paused over the course of the video?
Not sure if this makes sense, but I believe a simple answer can still be given (using calculus), provided r(t) isn't too messy. Curious to see if anyone gets an answer that agrees with mine.
My answer (spoiler):
I think it's just (∫_0^T 1/r(t) dt) - T. The reasoning: buffering the slice of video [t, t + dt] takes dt/r(t) real seconds, so the whole video takes ∫_0^T 1/r(t) dt real seconds to finish buffering, and exactly T of those seconds are spent playing; the rest you're paused. (This gives the minimum as long as r(t) ≤ 1 throughout, as in the example, so the buffer never gets ahead of the playhead.)
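For the example above that works out to ∫_0^10 11/(t+1) dt - 10 = 11 ln(11) - 10 ≈ 16.38 seconds of pausing. Here's a quick numerical sanity check in Python (my own sketch, not from the thread; simulated_pause is a made-up helper name, and the fluid step-by-step model of buffering is an assumption):

```python
import math

def simulated_pause(r, T, dt=1e-4):
    """Fluid simulation (assumed model): the buffer fills at r(b)
    video-seconds per real second, where b is the buffered position.
    In each real-time step of length dt the playhead advances by as
    much buffered content as is available, up to dt; the shortfall
    counts as time spent paused."""
    b = 0.0       # buffered position (video-seconds)
    p = 0.0       # playhead position (video-seconds)
    paused = 0.0  # total real seconds spent stalled
    while p < T:
        b = min(T, b + r(b) * dt)   # buffer grows at rate r(b)
        advance = min(dt, b - p)    # can't play more than is buffered
        p += advance
        paused += dt - advance
    return paused

T = 10.0
r = lambda t: (t + 1) / 11.0

closed_form = 11 * math.log(11) - T       # integral of 1/r from 0 to T, minus T
print(round(closed_form, 3))              # 16.377
print(round(simulated_pause(r, T), 3))    # should land very close
```

With dt = 1e-4 the two printed values should agree to roughly two decimal places; shrinking dt tightens the match, which is at least consistent with the formula.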