Forums before death by AOL, social media and spammers... "We can't have nice things"
|    comp.arch    |    Apparently more than just beeps & boops    |    131,241 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 131,046 of 131,241    |
|    Stephen Fuld to quadi    |
|    Re: Combining Practicality with Perfecti    |
|    11 Feb 26 19:50:00    |
From: sfuld@alumni.cmu.edu.invalid

On 2/11/2026 3:04 PM, quadi wrote:

snip

> And I noticed that a lot of mathematical tables from the old days went up
> to 10 digit accuracy, and scientific calculators had 10 digit displays,
> calculating internally to a slightly higher precision.

The ten digit displays came from the design of the first electric
calculators, made by such companies as Friden and Monroe in the 1940s
and 50s. They had ten rows of numeric keys (0-9), so that the
operator, who presumably had ten fingers (including thumbs), could
operate them quickly. So 10 digits sort of became standard. When
computers came along and the designers wanted to use binary for them,
they needed 35 bits (including sign) to hold the ten digits. Going
with 36 bits allowed six six-bit characters. The requirement from the
US Navy (a major customer) for that precision led to the Univac 1100
series being a 36 bit machine. Once you have 36 bit integers, you
might as well use 36 bit floating point numbers, and then 72 bit
double precision floating point numbers, as the 1100 series did.

--
- Stephen Fuld
(e-mail address disguised to prevent spam)

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)
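The bit arithmetic in the post is easy to check. A minimal sketch (not part of the original thread) confirming that ten decimal digits need 34 bits of magnitude plus a sign bit, and that a 36 bit word holds six 6-bit characters:

```python
# Largest ten-digit decimal value: 9,999,999,999
max_ten_digits = 10**10 - 1

# Bits needed to represent the magnitude
magnitude_bits = max_ten_digits.bit_length()
print(magnitude_bits)       # 34 bits of magnitude

# Add one sign bit, as the post describes
print(magnitude_bits + 1)   # 35 bits including sign

# A 36 bit word also packs six 6-bit characters exactly
print(36 // 6)              # 6 characters per word
```

So 35 bits is the minimum, and rounding up to 36 buys the six-character packing mentioned above.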
(c) 1994, bbs@darkrealms.ca