There is a concept in mathematics called significant figures. In a nutshell, it means that when something is measured, there comes a point where additional precision doesn't make a great deal of difference to the overall value, or where you are less and less sure of the exact value of those smaller and smaller digits.

Let's take an example: about how many grains of sand are there on that beach you're standing on? (Bet you were wondering where that squishy feeling between your toes was coming from!) What if I told you there were 10 million? A big number, a bit hard to grasp, but we'll use it. Now imagine we remeasure it by calling up Acme Sand Counting Company, and they give us a value of 10,500,000 grains. Okay, ten and a half million is halfway between ten and eleven million; we can understand half or halfway. Now we get another company (Better Sand Counters Inc.) to do the job, and they tell us the other estimates were off and the actual count is 11,000,040 grains. So our brain says, "Okay, about 11 million."

What you've just done is adjust the figure in your head, rounding it to a significant number you can easily use. You looked at the 40 grains over the nice round 11 million and decided they were only slightly over, insignificant compared to the millions represented by the first digits of the count.

What this means for the lab is that we could give you a protein value of 12.1329 if we wanted, but in our experience the significant portion of the value in this case only goes to the tenths place, or 12.1 here. Mostly this is because of the increasing insignificance of the added digits, but another contributing factor is the concept of analytical variance.
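For the curious, the rounding described above can be sketched in a few lines of code. This is just an illustration, not how the lab actually processes results; the function name is made up, and the example values (12.1329 and 11,000,040) come from this article.

```python
from math import floor, log10

def round_sig(value, sig_figs):
    """Round a value to a given number of significant figures."""
    if value == 0:
        return 0.0
    # How many digits sit to the left of the decimal point?
    magnitude = floor(log10(abs(value)))
    # Round so that only sig_figs digits remain significant.
    return round(value, sig_figs - 1 - magnitude)

print(round_sig(12.1329, 3))       # the protein example: 12.1
print(round_sig(11_000_040, 2))    # the sand count: 11000000, "about 11 million"
```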

In the real world it is almost impossible to repeat a procedure exactly the same way twice. In a laboratory setting, chemical reactions can change due to differences in time, temperature, pressure, reagents used, sampling, or even just chaos math (ask a meteorologist about that one). Any of these changes can affect results; sometimes they cancel each other out, and sometimes they add up. Together they create a result distribution that forms that wonder of statistics, the bell-shaped curve, where the majority of results hang around the average value, with fewer and fewer values the further from the mean you travel. Stats folks use the standard deviation (SD) to describe how flat or tall the curve is. If for some strange reason you want to know more, you can find oodles of info on "normal distributions" elsewhere on the internet.
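You can see this clustering for yourself with a tiny simulation. The numbers here are made up for illustration (a "true" value of 20% and a random nudge on each run), not real lab data:

```python
import random

random.seed(42)
true_value = 20.0  # pretend this is the real protein %, say
# 10,000 runs of the "same" assay, each nudged by small random effects
runs = [true_value + random.gauss(0, 0.3) for _ in range(10_000)]

mean = sum(runs) / len(runs)
sd = (sum((r - mean) ** 2 for r in runs) / len(runs)) ** 0.5

# Most results land within 1 SD of the mean (roughly two-thirds of them)
within_1sd = sum(abs(r - mean) <= sd for r in runs) / len(runs)
print(f"mean={mean:.2f}  SD={sd:.2f}  within 1 SD: {within_1sd:.0%}")
```

Run it and you'll find the average hugging 20 and about two-thirds of the results inside one standard deviation, exactly the bell-curve behavior described above.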

In the same manner, we use an analytical variance (AV) to describe where typical values are expected to fall and where, if that range is exceeded, a result should be redone to check for error. Although statistics tells us there will be a few values that are legitimately very low or high compared to the average, if we see something vastly different (say, 3 AVs too high or low) we do everything over to make absolutely sure there wasn't an error on our part.

Note that different assays have different formulae for calculating the AV: a fat that should be 20% can read as low as 18% and still be within 1 AV, while a protein that should be 20% can only go down to 19.4% to stay within 1 AV.
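Putting the last two paragraphs together, the AV check can be sketched like this. The AV widths (2 percentage points for fat, 0.6 for protein) are back-calculated from the article's examples, and the function and table names are invented for illustration:

```python
# One AV per assay, in percentage points, inferred from the examples above
ANALYTICAL_VARIANCE = {"fat": 2.0, "protein": 0.6}

def flag_result(assay, expected, measured):
    """Return how many AVs the measured value is from the expected one,
    and whether the 3-AV "redo everything" rule would kick in."""
    av = ANALYTICAL_VARIANCE[assay]
    n_avs = abs(measured - expected) / av
    return n_avs, n_avs > 3

print(flag_result("fat", 20.0, 18.0))      # exactly 1 AV low: no redo
print(flag_result("protein", 20.0, 17.5))  # more than 3 AVs low: redo
```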

The main point is that although we deal with results that COULD be reported to 1/10,000 of a percent, we take real-world variability into account and present results in a manner that is easy to grasp, giving you the significant information you need without wading into minutiae.