Talk:2311: Confidence Interval
What's a millisigma? 188.8.131.52 03:31, 26 May 2020 (UTC)Ven
- Not an official scientific term - most likely referring to standard deviation. One standard deviation, or sigma, around the mean covers 68.3 % of values in a normal distribution. A millisigma of a standard deviation would then be .0683 % of a normal distribution, so that much variation would be bad? Not sure. 184.108.40.206 05:23, 26 May 2020 (UTC)
- Actually, if you integrate a normal distribution from −0.001σ to +0.001σ, you'll get about 0.08% of all values. This would be bad because it would mean that, as big as the confidence interval appears in the picture, the more meaningful 1- or 3-sigma interval (whose size represents the uncertainty of the model) would be larger by a factor of 1000 or 3000, respectively. --Koveras (talk) 08:38, 26 May 2020 (UTC)
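The integral above can be checked with a few lines of Python; this is just a sketch using the standard library's `math.erf` (the helper name `mass_within` is mine), relying on the identity Φ(z) − Φ(−z) = erf(z/√2) for a standard normal:

```python
import math

def mass_within(z_half_width: float) -> float:
    """Fraction of a normal distribution within ±z_half_width sigma of the mean."""
    return math.erf(z_half_width / math.sqrt(2))

# A "millisigma" interval (±0.001σ) versus the familiar one-sigma interval:
print(f"±0.001σ covers {mass_within(0.001):.4%} of values")  # ~0.0798%
print(f"±1σ covers {mass_within(1.0):.4%} of values")        # ~68.2689%
```

Note that for such a narrow interval the density is nearly flat, so the mass is close to width × φ(0) ≈ 0.002 × 0.3989 ≈ 0.08%.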
- Perhaps you have heard of Six Sigma, a quality method used by General Electric (among others) to keep specifications and processes within tiny tolerances. The six sigmas mean that even absolute (so-called) outliers in your production stay within the strict tolerances. With milli-sigmas it would be extremely rare to get an acceptable result at all. Sebastian --220.127.116.11 10:53, 26 May 2020 (UTC)
- No. But maybe it's related to the recent Mt. St. Helens comic... :p Seriously, not everything has to be related to the hot-button topic of the day.