long long
D. Hugh Redelmeier
uunet!redvax!hugh
Thu Dec 19 14:07:48 PST 1991
There has been much talk of long long and of 64-bit machines and of
sort-of-64-bit machines, and so on.
Strictly speaking, nothing need be done to C to handle the current
situation. On a real 64-bit machine, short, int, and long could
very pleasantly be 16, 64, and 64 bits. On a sort-of-64-bit
machine, they could be 16, 32, and 64 bits.
I don't think that this gives the programmer the control that he
wishes. I say this as such a programmer.
(1) It may not give the programmer a way to "get at" certain sizes.
(2) It does not give the programmer a way to specify what capacity
he needs.
The first of these is dealt with indirectly by various long long
proposals. long long allows more integral types so that a more
diverse set of constraints can be met. [Henry points out that short
short may in fact be demanded, for example if int is 64 bits!].
Unfortunately, some of these constraints are sometimes
contradictory. Here are some that I can think of:
- (sloppy?) 32-bit code ought to continue to run. Maybe even
(sloppy?) 16-bit code.
- Some type must be 64 bits on machines that can support it well.
- There might be more than 4 integral sizes (char, short, int, long)
that ought to be expressible.
- int ought to be the "natural" size for the machine.
- We might need to cater to (sloppy?) programmers who don't seem to
  remember to use size_t when they wish to represent sizes of
  objects. Many programs predate size_t's definition or widespread
  implementation. (Side note: I still have trouble convincing
  programmers who care about portability to use size_t because so
  few (old) implementations of C support it.)
- There is no pre-defined signed analogue of size_t. X3J11 botched
the definition of ptrdiff_t, failing to promise that it would be
large enough to hold anything at all! Consequently, programmers
have been forced to arbitrarily choose such a type. They
generally choose int, so int had better be big enough to subscript
any reasonable array.
- There has been, and still is, no way for a programmer to portably
  get the size he requires. Many conventions have been followed.
I think that this whole mess should be fixed once and for all (I was
a C user when long was introduced, and also when the transition to
sort-of-32-bit and 32-bit implementations was made).
- I want to be able to specify a sufficient capacity for my types.
Capacity seems best expressed as a range of values that the type
can contain.
- I don't need the type to *only* hold those values. Giving me
  extra capacity is fine.
- I rarely need to specify a representation (i.e. how many bits or
bytes). For the specialized needs of external data
representation, further specifying bitfields would seem to be a
natural direction.
The solution that I have in mind is Pascal subranges. Let the
implementation choose the representation. Of course, the current
integral types have to remain.
There are some problems with subranges.
- What type should be used for evaluating expressions? The C answer
is probably: the type of the result of an operator is the widest
of the operand type(s) and int (this is not really great --
surprising overflow is quite possible).
- I don't know if any subranges should have the peculiar
characteristics of unsigned integral types (perhaps an explicit
"unsigned" could appear in such a typename).
I don't have a favourite notation for subranges in C. Perhaps a
variant of enum "enum { = lwb, = upb}". This has the merit of being
a slight extension of the current enum rather than a whole new
construct. The specification of enum already allows the
implementation to choose a representation for each enum type based
on the collection of enum constants. The notation is a bit awkward
-- a short form might be nice.
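To make the notation concrete, declarations might look like this
(hypothetical syntax, legal in no C I know of; the enum constants
are anonymous and only the bounds matter):

```
enum { = 0, = 1000000 } counter;          /* holds 0..1000000      */
unsigned enum { = 0, = 65535 } portnum;   /* unsigned variant (?)  */
```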
In summary, let us not patch C once again. Let's get it right.
Hugh Redelmeier
{utcsri, yunexus, uunet!attcan, utzoo, scocan}!redvax!hugh
When all else fails: hugh@csri.toronto.edu
+1 416 482-8253