My high school chemistry teacher, Mr. Danes, called them “sigfigs” or “significant figures.” Weekend Fisher calls them “significant digits,” and uses them as an analogy for the limits of human precision when it comes to matters of theology. Some things we can know with certainty, but that doesn’t mean we can extrapolate endlessly to reach conclusive answers on every question we might possibly have.
For example, if you have a digital thermometer, and it tells you that the temperature outside is 82.1 degrees F and it seems to be rising 1 degree per hour, you can’t necessarily project a temperature of 85.301247 at a certain future point in time. Why not? Because our measurements aren’t that accurate. Sure, our calculators are that accurate; the formulas we’re testing are (possibly) that accurate. But our measurements aren’t. You started out with precision to 1/10th of a degree. By the time you run past your original level of precision, any further precision is unwarranted and probably misleading. It’s a figment of your calculator’s imagination. It’s not a matter of whether your calculation is right; it’s that you can’t get out more precision than you had when you started.
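The rule described above — never report more precision than your least precise measurement had — can be sketched in a few lines of Python. The function name and parameters here are purely illustrative, not from the post; the point is simply that the projected value gets rounded back to the tenths place the thermometer actually supports:

```python
def project_temperature(measured_f, rate_per_hour, hours, decimals=1):
    """Project a future temperature, then round the result back to
    the number of decimal places the original reading supported.

    The extra digits a calculator produces beyond `decimals` are,
    as the post puts it, a figment of its imagination.
    """
    raw = measured_f + rate_per_hour * hours
    return round(raw, decimals)

# Reading of 82.1 F, rising 1 degree per hour, projected 3.2 hours out.
# The raw sum carries digits the thermometer never measured;
# rounding to tenths keeps the answer honest.
print(project_temperature(82.1, 1.0, 3.2))  # 85.3
```

This doesn’t make the projection *right* — the rate might change — it just keeps the stated precision from outrunning the measured precision.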
Well said, WF.