I don't understand what lines you're talking about there
120Hz did succeed; the main reason those monitors aren't highly popular is cost, and the vast majority of consumers not caring, or not caring enough, about input lag, motion quality, etc.
60Hz is a relatively low refresh rate that's at the bottom end of acceptable, yet not abhorrent enough to inspire mass adoption of another standard.
On March 25 2016 01:35 Cyro wrote: I don't understand what lines you're talking about there
120Hz did succeed; the main reason those monitors aren't highly popular is cost, and the vast majority of consumers not caring, or not caring enough, about input lag, motion quality, etc.
60Hz is a relatively low refresh rate that's at the bottom end of acceptable, yet not abhorrent enough to inspire mass adoption of another standard.
So, on a framebuffer display like an LCD, without a countermeasure for eye-tracking blur, the blur results in "resolution overlap": multiple rows of pixels effectively turn into single lines.
1920x1080, 30fps @ 60Hz = slide show
1920x1080, 60fps @ 60Hz = 300 lines of motion resolution
1920x1080, 120fps @ 120Hz = 600 lines of motion resolution
120Hz didn't succeed because you can't reliably hit 120fps, and without that it's the same as 60Hz. And even with a constant 120fps (no chance, really), it will only look as good as 1280x720, because that amount of blur makes the extra vertical resolution redundant. Or you can "empty" the framebuffer between frames to emulate a CRT (I know blanking is the better word, but I wanted to illustrate with 'emptying').
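To put rough numbers on that, here's a minimal sketch, assuming the usual rule of thumb that blur on an eye-tracked object is roughly pixel speed times frame persistence; the 960 px/s pan speed is just an illustrative value, not taken from anything above:

    # Tracked-motion blur on a full-persistence (sample-and-hold) display:
    # the eye keeps moving while the frame stays still, smearing the image.
    def blur_px(speed_px_per_s, persistence_s):
        return speed_px_per_s * persistence_s

    for hz in (60, 120):
        persistence = 1.0 / hz  # full persistence: frame lit for the whole refresh
        print(f"{hz}Hz: ~{blur_px(960.0, persistence):.0f} px of blur")
    # -> 60Hz: ~16 px, 120Hz: ~8 px. Half the smear, so roughly double
    #    the usable motion resolution.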
On a display without a framebuffer (high intensity in a short time slice), you have all the motion resolution you'd ever want; however, without v-sync you get tearing, and with v-sync there will be hiccups if the required framerate is too high to sustain - and 120Hz is way too high.
120Hz failed because pixel count always gets the priority, and 4K 60Hz will be the next interesting format to most people, since they still seem to enjoy whatever bad temporal behaviour, even slide shows, and they'll buy incrementally better hardware to hit a steady 60 frames.
120Hz is marginally better than 60Hz even if you're running @ 60fps
blurry 1080p is blurry 1080p, it's still 1080p. The blur doesn't effectively lower the resolution, it just makes the image look relatively worse. I think this part
it will only look as good as 1280x720, because that amount of blur makes the extra vertical resolution redundant
is outright false or at least poorly understood
---
because pixel count always gets the priority
That's extremely subjective. A LOT of people use >60hz monitors. It's a minority, but so what? Almost everyone uses crap.
1080p monitors are about 400x more popular than 4k monitors on the Steam hardware survey, another minority for you.
VR headsets are also putting performance firmly ahead of pixel counts, with 90-120Hz displays across the board and promises of further improvement past gen 1.
Indeed they are. When you can't control the shutter speed of your camera, well, it sucks. Do note that I was testing for input lag and not response time in these images. I used the stopwatch method as described by tftcentral. Further down that page they explain their newer testing methods, which are more accurate than photographing webpages.
Response time, on the other hand, was found to be between 4.6 and 12.6 ms, depending on which greys you were transitioning between and which monitor setting you chose.
--
I don't understand what lines you're talking about there
1920x1080, 60fps @ 60Hz = 300 lines of motion resolution
1920x1080, 120fps @ 120Hz = 600 lines of motion resolution
Yeah, I'm lost too. Do you mean that an image moving across a 60Hz screen at 60fps will move 8 pixels/frame, while on a 120Hz screen at 120fps it will move 4 pixels/frame?
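(As a quick sketch of that arithmetic - the 480 px/s speed is just an assumed value that makes the 8-vs-4 figures work out:)

    speed_px_per_s = 480
    print(speed_px_per_s / 60)   # 8.0 px moved per frame at 60 fps
    print(speed_px_per_s / 120)  # 4.0 px moved per frame at 120 fps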
because pixel count always gets the priority
I play SC2 on a CRT at 1024x768 @ 85Hz, just so I can experience (if I can detect it) the increased frame rate and lower frame time compared to the Asus at 1920x1200 @ 60Hz.
4K60hz will be the next interesting format
I read a rumor that higher color gamut and contrast are the "next big thing."
1. It depends on mouse drivers. Also, PS/2 was vastly faster than USB, and many WinXP machines still used PS/2 mice (this is because PS/2 bypasses the software queue and goes straight to the processor).
2. LCD will always be slower than CRT. This is simply because of the technology: a CRT draws the image with an electron beam striking phosphors (effectively instantaneous), while an LCD creates the image by partially covering a light source with liquid crystals that have to physically reorient, which is nowhere near as fast.
3. Response time can be measured in many ways: black to white, white to black, black to white to black, and finally grey to grey.
Grey to grey tells you nothing. The reason is that you don't know which shade of grey (no pun intended) is used, and that is important to know.
Black to white is not that useful because you don't know the white decay (how fast the white vanishes after you switch back to black), meaning you can still get ghosting or similar. (It also doesn't tell you much, since different monitors will have different brightness settings, and you don't know how it was measured.)
White to black tells you how good the brightness control and decay are, and it is a decent way to identify the response time of a monitor.
Black to white to black (or white to black to white) is the only solid form of testing response time, as it has to do the full cycle. Anything 5ms and under is perfect.
In other words, I'd take a 5ms B-W-B over a 1ms GtG.
Finally, please don't mistake response time for input lag (from the monitor) or total input lag (from your system).
Total input lag is quite a bit higher than input lag alone, usually about an additional 300-400 ms.
Test here to find out your response time (mine, with a lag-free plasma TV, is 230 ms at best, 280 ms at worst).
In terms of Hz: the more you have, the more fps you can get. Good in shooters, not so much in RTS.
In theory, the more Hz, the more frames, and the less time (delay) between frames. But the delay is very low even on a 60Hz monitor (if I remember correctly, it's about 16.67 ms of delay between each frame). Other factors just take priority over Hz in monitors.
On March 25 2016 04:39 Cyro wrote: 120Hz is marginally better than 60Hz even if you're running @ 60fps
blurry 1080p is blurry 1080p, it's still 1080p. The blur doesn't effectively lower the resolution, it just makes the image look relatively worse. I think this part
it will only look as good as 1280x720, because that amount of blur makes the extra vertical resolution redundant
is outright false or at least poorly understood
it does, "effectively lower the resolution" ,it's called 'smearing' , it means high frequency details get turned into some blobby mess.
Assuming ideal anti-aliasing, 1080p at 120hz with framebuffer blur is only better than 720p (and framebuffer blur) in slow motion , close-up scenes.
It's debatable if the usual RTS ingame camera position constitutes a good scenario for this "tilt" for spatial resolution fidelity because you can't have too much movement if viewed from that far. 60fps at 120hz is the very same thing as 60hz except for lag.
Oh, if something is even more quixotic than 120hz it's the AR/VR stuff, especially with the close to eye scenes (haha) , instead of solutions I see ostentatious stuff .
Test here to find out your response time (mine, with a lag-free plasma TV, is 230 ms at best, 280 ms at worst).
The input lag - as in the actual time between giving an action and having it performed on screen - is about 20ms on an optimized system (>60Hz LCD, CRT, vsync off, etc.)
My reaction times are about 170ms visual and 130ms auditory, but it's very important to know that reaction time is not directly comparable to input lag. You can have a 170ms reaction time and 50ms input lag, or a 210ms reaction time and 10ms input lag, and the input will happen at the same time if you're reacting to something on the screen - but it will feel very different to the user.
The difference between 20ms and 30ms of lag - and especially between 20ms and 40ms - is easily visible and feelable in a blind test.
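A minimal sketch of that distinction, reusing the illustrative numbers above:

    # Reaction time and input lag add up to the same on-screen moment here,
    # yet the second setup feels far more responsive to the user.
    reaction_ms, lag_ms = 170, 50
    print(reaction_ms + lag_ms)  # 220 ms from stimulus to on-screen effect
    reaction_ms, lag_ms = 210, 10
    print(reaction_ms + lag_ms)  # also 220 ms total, but much snappier in the hand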
This video should show some of that:
-------
In theory, the more Hz, the more frames, and the less time (delay) between frames. But the delay is very low even on a 60Hz monitor (if I remember correctly, it's about 16.67 ms of delay between each frame).
With a typical optimized system right now, you can have about 15ms latency from everything outside of the display refresh.
A 144Hz refresh makes the lag time about 15.00-21.94ms on the screen. A 60Hz refresh makes the lag time about 15.00-31.67ms on the screen.
The average lag is substantially higher, the peak lag is way higher, and more importantly, the variance in input lag is much bigger as well. A 60Hz screen refresh can often account for over half of the total peak input lag of a whole system.
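Here's a rough sketch of where those figures come from (assuming, as above, ~15 ms of fixed latency plus anywhere from zero to one full refresh period of waiting for the next scanout):

    def lag_range_ms(base_ms, hz):
        period = 1000.0 / hz  # one refresh period in ms
        return base_ms, base_ms + period / 2, base_ms + period  # min, avg, max

    for hz in (144, 60):
        lo, avg, hi = lag_range_ms(15.0, hz)
        print(f"{hz}Hz: {lo:.2f}-{hi:.2f} ms, average ~{avg:.2f} ms")
    # -> 144Hz: 15.00-21.94 ms; 60Hz: 15.00-31.67 ms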
So Bisu did the Human Benchmark today and his score is nothing to brag about. IMO, in RTS games minor response and/or input lag doesn't affect gameplay, unlike in FPS games...
On March 27 2016 00:54 Cyro wrote: This thread is a confusing mess of people comparing display latency to display response time to human reaction times etcetcetc
I literally work with this stuff for a living and people were telling me I'm wrong. I gave up and now I just bask in the stupidity of others.
The issue is not just about monitor refresh rate. The refresh rate could even be 1000Hz - but does it really matter if the screen refreshes after a significant delay? In practice, these flat-panel high-refresh-rate monitors cannot complete a full transition to the desired colour value in fast-switching scenes. The pixels are simply not fast enough to switch colour at the desired refresh rate, so what we get is not the real requested colour, but a partially changed colour on the way to it.
The issue is not just about monitor refresh rate. The refresh rate could even be 1000Hz - but does it really matter if the screen refreshes after a significant delay?
For input lag, yes. The monitor refresh time adds a delay; other stuff just adds yet more delay. Depending on the system, the refresh can be 50%+ of the lag, or only a small fraction of it, like 10%.
That video does not seem to have an issue with display lag; what's highlighted is the motion blur on the LCD. That kind of blur happens even with near-instant pixel switching because of the sample-and-hold effect described here - http://www.blurbusters.com/faq/oled-motion-blur/
That effect is the difference between this:
and this:
on a fast moving object, even on the same monitor with the same pixel transition times. It shows up differently depending on how the camera is set up - cameras act quite differently from human eyes, and these two images were taken with a pursuit camera aimed to simulate human eye tracking. Either way, you can see that something is screwy with the motion performance in your video.
First-generation LCDs were pretty awful for pixel change times, but we've gotten to the point where they are of little relevance on the fastest monitors compared to motion-ruining effects like sample-and-hold.
CRT is in the ~1ms region. Half a millisecond isn't very noticeable in practice, but motion blur increasing by a factor of 15 is painfully obvious.
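As a rough sketch of that comparison (the ~1 ms CRT phosphor persistence is an assumed ballpark):

    # Tracked-motion blur scales with how long each frame stays lit.
    crt_ms = 1.0            # assumed CRT phosphor persistence
    lcd_ms = 1000.0 / 60    # full sample-and-hold at 60Hz: ~16.7 ms
    print(lcd_ms / crt_ms)  # ~16.7x the blur width - the "factor of 15" above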
On March 23 2016 04:01 WinterViewbot420 wrote: My whole thing right now is that the game (excluding mouse) renders in 24 frames per second on Fastest speed and the mouse renders in 60 frames per second, so all this talk is almost useless. Not only that, but this isn't a game that requires much visual comprehension aside from cloaked units or bunches of stacked flying units, which comes down to the visual quality of the monitor.
This is a game where these discussions are important:
Isn't this more like a game that is playable without a monitor, since you can just memorize the moves and the rhythm?
Please tell me how, in the name of Aiur, a human is supposed to memorize over three thousand key presses on seven different lanes, all timed to the millisecond.
Well, at least in the posted vid there are long parts following a pattern, so you don't actually have to memorize all three thousand individual key presses.
How can a musician play an instrument without notes?
Piste, I watched the entire video with my mouth wide open. It was very impressive.
And yes, I am a musician who can play my instruments without notes. It's a different thing. For starters, on an instrument like a piano, each key always sounds the same, whereas in the game there's no such thing, so the same type of memorisation is not possible.
Since this conversation isn't going anywhere, I'll keep using another program to make my game response time faster... Take a look at this and compare it to Flash's FPVOD's game response time.
The answer is that the delay to move a unit left vs right is practically the same on a CRT and a decent LCD, but there are other significant advantages for CRT, like the motion performance.
On March 28 2016 20:46 LaStScan wrote: Since this conversation isn't going anywhere, I'll keep using another program to make my game response time faster... Take a look at this and compare it to Flash's FPVOD's game response time. https://www.youtube.com/watch?v=OlbKyloOQLc
Yeah, that looks like nothing short of hacking the engine to make it consider your moves in earlier simulation ticks. For more info on how some RTS engines (I think BW's included) work, read this: http://www.gamasutra.com/view/feature/131503/1500_archers_on_a_288_network_.php - this applies to Starcraft 2 as well. For SC2, one turn is about 50 milliseconds (~20-22 "turns" per second).
Their comments about a quarter to half a second of lag not being noticed are quite amusing, but they were talking about more average users. Today in SC2, and I assume Brood War with the settings multiplayer is played on, the actual delay is more like 100 milliseconds, I think.
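A minimal sketch of that lockstep arithmetic (the two-turn scheduling delay is an assumption for illustration, not a measured Brood War/SC2 value):

    # In a lockstep RTS, a command issued now is scheduled for a later
    # simulation turn, so the minimum visible delay is turns_ahead x turn length.
    turn_ms = 50      # ~20 turns per second, as stated above
    turns_ahead = 2   # assumed scheduling delay
    print(turn_ms * turns_ahead)  # -> 100 ms before the unit visibly reacts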