You got it almost right :) (unless I made some mistakes reading your reworking of the formulas)
yaw(t) := yaw(t-1) + dx/k2 * (k1*sqrt(dx*dx + dy*dy)*a / deltaT + v)

deltaT = time passed between t and t-1
a = mouse acceleration (mouseaccel)
v = sensitivity
k1 and k2 are constants: k1 = 0.002, k2 = 33
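A minimal sketch of that update step in Python, assuming dx/dy are the raw mouse counts for the frame and deltaT is in the same time unit the constants were fitted for (function and variable names are my own, not from the game):

```python
import math

K1 = 0.002  # constants from the formula above
K2 = 33.0

def yaw_update(yaw_prev, dx, dy, delta_t, accel, sens):
    """One step of yaw(t) := yaw(t-1) + dx/k2 * (k1*sqrt(dx^2+dy^2)*a/deltaT + v)."""
    speed = math.sqrt(dx * dx + dy * dy)  # mouse speed magnitude for this frame
    return yaw_prev + dx / K2 * (K1 * speed * accel / delta_t + sens)
```

Note that with accel = 0 this collapses to plain linear sensitivity, yaw_prev + dx * sens / K2, which matches the usual no-acceleration behaviour.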
But I don't see the point of your last post?
I believe I once asked in this thread how you would solve this problem:
estimating the best-performing sensitivity from data captured inside the game.