In October 1994, a flaw was discovered in the Pentium microprocessor installed in personal computers. The chip produced an incorrect result when dividing certain pairs of numbers. Intel, the manufacturer of the Pentium chip, initially announced that such an error would occur only once in 9 billion division operations, or "once every 27,000 years," for a typical user; consequently, Intel did not immediately offer to replace the chip.
Depending on the procedure, statistical software packages (e.g., SAS) may perform an extremely large number of divisions to produce required output. For heavy users of the software, 1 billion divisions over a short time frame is not unusual. Will the flawed chip be a problem for a heavy SAS user?
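One way to quantify the risk is to treat each division as an independent Bernoulli trial with Intel's stated error rate of 1 in 9 billion (a modeling assumption; the actual flaw depended on the operands rather than occurring at random). Under that assumption, the sketch below computes the probability that a heavy user performing 1 billion divisions sees at least one flawed result, along with the matching Poisson approximation:

```python
import math

# Assumption from the passage: Intel's stated rate of one error
# per 9 billion divisions, applied as an independent per-trial probability
p_error = 1 / 9e9
n_divisions = 1e9  # a heavy SAS user's workload

# P(at least one error) = 1 - P(no errors in n independent trials)
p_at_least_one = 1 - (1 - p_error) ** n_divisions

# Poisson approximation: for small p and large n, P(at least one) ~ 1 - e^(-np)
p_poisson = 1 - math.exp(-n_divisions * p_error)

print(round(p_at_least_one, 4))  # roughly 0.105
print(round(p_poisson, 4))
```

A probability of about 0.105 means roughly one heavy user in ten would encounter at least one incorrect division, which suggests the flaw is far from negligible for this population, despite Intel's "once every 27,000 years" framing for a typical user.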