A probe interval of 500 ms (the default) vs. 15000 ms produces vastly different standard deviations.
Some difference is to be expected. But nearly 3x?
Thoughts?
See attached images.
Three or four probes aren't sufficient for a meaningful standard deviation — with that few samples, the standard-deviation estimate itself fluctuates wildly from run to run.
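A quick simulation illustrates the point. This is a sketch with made-up numbers (a 20 ms mean latency with a true 5 ms standard deviation): it repeatedly draws a small batch of simulated probes and shows how much the sample standard deviation swings when the batch is tiny versus reasonably sized.

```python
import random
import statistics

random.seed(42)  # reproducible demo

def stdev_estimates(n_probes, trials=1000):
    """For each trial, simulate n_probes latency samples
    (mean 20 ms, true sigma 5 ms -- hypothetical values)
    and record the sample standard deviation."""
    return [
        statistics.stdev([random.gauss(20, 5) for _ in range(n_probes)])
        for _ in range(trials)
    ]

for n in (3, 30):
    est = stdev_estimates(n)
    print(f"n={n:2d}: stdev estimates range "
          f"{min(est):.2f} .. {max(est):.2f} ms")
```

With only 3 probes the estimated standard deviation ranges from near zero to well above the true value; with 30 probes the estimates cluster much more tightly around 5 ms. So a longer probe interval, which yields far fewer samples per reporting window, can easily make the reported standard deviation differ by 3x without any real change in the link.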