path: root/opengl/tests/gl_basic/gl_basic.cpp
author    Okan Arikan <okana@google.com>  2017-04-06 16:35:56 -0700
committer Okan Arikan <okana@google.com>  2017-04-06 21:22:30 -0700
commit    5e4b792b920d6420555788cbbf539f8badf90320 (patch)
tree      73c668af7b7b4728c1432d88215afa45e3743ff7 /opengl/tests/gl_basic/gl_basic.cpp
parent    0005f132ceef28184e9a3fd33cc313134b99d220 (diff)
Fix the latency model.
We were being too clever, modeling the latency with an exponentially moving average. The issue is that the average changes as new samples come in, and we then use it to update the timestamps. If the sampling rate is high enough (as with the IMU), the changes in the average are close to the delta times between samples. This causes the sample times to move, and sometimes even to change their updated timestamp order, which makes a mess when we linearly extrapolate because the slope is bogus.

The fix is to average a fixed number of latency samples and then stick with that constant average.

Bug: 36997591
Test: Run any 3DOF VR app.
Change-Id: I5411b2a6b7c3f258bf197f0615c0339d68fd2fd7
Diffstat (limited to 'opengl/tests/gl_basic/gl_basic.cpp')
0 files changed, 0 insertions, 0 deletions