Mon 12 Sep, 2011 11:13 am
I have a problem understanding a piece of a paper, and I would greatly appreciate any hint or help. It says:
A sensor records a value Z(i) at intervals of 1 second and calculates background values U(i) using the formula:
U(i)=R*U(i-1)+(1-R)*Z(i),
where R is a constant factor and U(0) is computed from pre-measurement data.
Now, does anyone know whether this is a well-known formula? Is it some kind of two-term Gaussian mixture noise?
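
Just to make sure I'm reading the recursion correctly, here is a minimal sketch of how I would compute it (Python; the function name, the value R = 0.9, and the sample readings are my own inventions, not from the paper):

    def update_background(u_prev, z, r):
        # One step of the paper's recursion: U(i) = R*U(i-1) + (1-R)*Z(i)
        return r * u_prev + (1 - r) * z

    r = 0.9    # assumed constant factor R; the paper does not give its value
    u = 100.0  # U(0), the pre-measurement estimate
    for z in [102.0, 98.0, 105.0, 97.0]:  # made-up 1-second readings Z(i)
        u = update_background(u, z, r)
        print(u)

So each new background value is a weighted blend of the previous background and the latest reading. Is that the right reading of it?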
Then it says, word for word:
The variance δU(i) of these values is computed from the calculated values U(i):
δU(i)=k*sqrt(U(i)/T)
where k is sigma factor and T is the given measuring time.
I have no idea how the variance ends up looking like that. I understand the role of T and the sqrt function, but the formula as a whole makes no sense to me.
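
In case it helps anyone answer: this is the only way I can make numerical sense of the second formula (Python again; the values k = 2 and T = 60 are placeholders I made up, since the paper doesn't state them):

    import math

    def sigma_u(u, k, t):
        # The paper's expression: deltaU(i) = k * sqrt(U(i) / T)
        return k * math.sqrt(u / t)

    k = 2.0   # assumed "sigma factor", e.g. a 2-sigma band; my guess
    t = 60.0  # assumed measuring time in seconds; my guess
    print(sigma_u(100.0, k, t))  # U(i) = 100 gives about 2.58

That is easy enough to evaluate, but I still don't see why the spread of U(i) should scale like sqrt(U(i)/T) in the first place.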