The standard deviation is defined as the square root of the variance:
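In symbols (my notation, assuming N samples x_1, …, x_N with mean \bar{x} and the N-1 denominator that the code below also uses):

\sigma = \sqrt{\mathrm{Var}} = \sqrt{\frac{1}{N-1} \sum_{k=1}^{N} (x_k - \bar{x})^2}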
The definition can be converted directly into an algorithm that computes the variance and standard deviation in two passes: compute the mean in one pass over the data, and then do a second pass to compute the squared differences from the mean.
[…]
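As a point of comparison, a minimal sketch of the two-pass approach in Python could look like this (the function name two_pass_variance is mine; it uses the same N-1 denominator as the single-pass code below):

def two_pass_variance(samples):
    """Two-pass sample variance: one pass for the mean, one for the squared differences."""
    n = len(samples)
    mean = sum(samples) / n                                  # pass 1: compute the mean
    squared_diffs = sum((x - mean) ** 2 for x in samples)    # pass 2: sum of squared differences
    return squared_diffs / (n - 1)                           # sample variance

The drawback is that the data has to be traversed twice, so all samples must be available; a single-pass method avoids this.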
Welford’s method is a usable single-pass algorithm for computing the variance. It can be derived by looking at the difference between the sums of squared differences for N and N-1 samples.
[…]
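Writing M_k for the running mean and S_k for the running sum of squared differences after k samples (names chosen to match M and S in the code below), the derivation yields the update rules

M_k = M_{k-1} + \frac{x_k - M_{k-1}}{k}, \qquad S_k = S_{k-1} + (x_k - M_{k-1})(x_k - M_k),

with M_1 = x_1 and S_1 = 0, and the sample variance after N samples given by S_N / (N-1).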
This means we can compute the variance in a single pass using the following algorithm:
variance(samples):
    # M is the running mean, S the running sum of squared differences
    M := 0
    S := 0
    for k from 1 to N:
        x := samples[k]
        oldM := M
        M := M + (x-M)/k            # update the running mean
        S := S + (x-M)*(x-oldM)     # update the sum of squared differences
    return S/(N-1)                  # sample variance
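For reference, a direct Python translation of the pseudocode might look like the following sketch (the function name welford_variance is mine; it assumes at least two samples):

import statistics

def welford_variance(samples):
    """Single-pass (Welford) sample variance, following the pseudocode above."""
    m = 0.0   # running mean
    s = 0.0   # running sum of squared differences from the mean
    for k, x in enumerate(samples, start=1):
        old_m = m
        m += (x - old_m) / k           # update the running mean
        s += (x - m) * (x - old_m)     # update the sum of squared differences
    return s / (len(samples) - 1)      # sample variance (N-1 denominator)

# Sanity check against the standard library's sample variance:
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
assert abs(welford_variance(data) - statistics.variance(data)) < 1e-12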
The algorithm can be extended to handle unequal sample weights by replacing the simple sample counter (k in the code above) with the sum of the weights seen so far. West suggests this incremental algorithm: