Abnormal normalizations?
uunet!cwi.nl!Dik.Winter
Mon Nov 22 18:42:43 PST 1993
> My question was simply whether or not such a normalization could or might
> occur... an issue which (I believe) is essentially independent of the
> comparison of NaNs, and also totally independent of the knothole effect.
Can normalizations occur that cut off bits and precision on assignment?
I think yes. On the machines I know or have known there are two kinds
of normalization (a small sketch of both follows the list):
1. A number is normalized if the most significant machine-radix digit
   of the mantissa is non-zero. This is the most common way of
   normalization; it shifts the mantissa to the left, decreasing the
   exponent, until that digit is non-zero. This *can* result in
   underflow, in which case the value may be replaced by zero. This
   could happen on CDC Cybers and can happen on Crays, for instance.
2. A number is normalized if the absolute value of the exponent is
   minimal. (The A.A. Gray representation; it makes conversion to/from
   integer easier.) In this case normalization shifts the mantissa
   to the right or to the left, at the same time increasing or
   decreasing the exponent, depending on the sign of the exponent.
   But the machines I know that implemented this representation
   stopped shifting as soon as non-zero bits would shift out or a
   shift would result in over- or underflow.
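To make the two schemes concrete, here is a small toy model in Python.
The radix, precision, and exponent range are made-up illustration
values, not the parameters of any of the machines mentioned:

    # Toy format: value = mantissa * RADIX**exponent, where the mantissa
    # is an integer of at most PRECISION radix-RADIX digits.
    # Hypothetical parameters, chosen only for illustration.
    RADIX = 2
    PRECISION = 8
    EMIN, EMAX = -16, 15

    def normalize_kind1(mantissa, exponent):
        # Kind 1: shift the mantissa left, decreasing the exponent,
        # until the most significant digit is non-zero.  If the
        # exponent falls below EMIN the value underflows and is
        # replaced by zero.
        if mantissa == 0:
            return 0, 0
        while mantissa < RADIX ** (PRECISION - 1):
            mantissa *= RADIX
            exponent -= 1
            if exponent < EMIN:
                return 0, 0        # underflow: flushed to zero
        return mantissa, exponent

    def normalize_kind2(mantissa, exponent):
        # Kind 2: make the absolute value of the exponent as small as
        # possible, shifting the mantissa left or right, but stop as
        # soon as non-zero digits would shift out or the mantissa
        # would no longer fit in PRECISION digits.
        while exponent > 0 and mantissa * RADIX < RADIX ** PRECISION:
            mantissa *= RADIX
            exponent -= 1
        while exponent < 0 and mantissa % RADIX == 0:
            mantissa //= RADIX
            exponent += 1
        return mantissa, exponent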
IEEE machines do not allow this because the leading mantissa bit is hidden,
so no normalization takes place on assignment. The Gray-representation
machines I know also do not allow it. It can take place on machines
where assignment does normalize and the leading mantissa bit is not hidden.
I do not know off-hand the situation on Crays, but I know that on the
Cyber it was possible to get an unnormalized result from an operation,
which would then be normalized on assignment to another variable, losing
precision due to underflow.
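As an illustration with the toy model above (again with made-up
parameters, not the actual Cyber format): a value that fits only in
unnormalized form is flushed to zero when a normalizing assignment is
modelled.

    # 3 * 2**-15 fits unnormalized (mantissa 3, exponent EMIN + 1), but
    # its normalized form would need an exponent below EMIN, so the
    # normalizing assignment loses the value entirely.
    print(normalize_kind1(3, EMIN + 1))    # -> (0, 0)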