No subject

uunet!research.att.com!doug
Tue Dec 31 17:12:19 PST 1991


Although I am not strongly enamored of the colon width
specifier, I am strongly opposed to the multiplication
of entities without reason.  The following reasons offered
in defense of new entities escape me.

> The colon for bitfield width is written after the declarator [sic], rather
> than before.  This is a gratuitous incoherence.

Incoherent, perhaps, but it exists.  Whatever incoherence exists
is pertinent only if you are arguing to abolish it.  Reasoning
by analogy to dimensions, one might even conclude that the
width *belongs* rightmost, for it refers to a level of 
aggregation finer than the last dimension, namely an
aggregation of bits.
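
A minimal sketch of that analogy (the struct and member names are
made up for illustration, not taken from anyone's proposal):

    struct example {
            int      matrix[4][8];  /* rightmost dimension is finest:
                                       4 rows of 8 ints apiece        */
            unsigned flags : 3;     /* finer still: a field of 3 bits */
    };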

> Also, the colon syntax has another meaning (in bitfields), which is
> *not* that of specifying a type.  The type you specify for a bitfield
> makes a difference, independent of the specified width.

The only things that are supposed to make a difference for
bitfields are "signed" and "unsigned".  (Some old compilers
heed "long" and "short" as well, but that's a garbage feature.)
As far as I can see, exactly the same situation will ensue no matter
what convention you adopt.  There will be some indicator that
this is a signed or unsigned integral type plus an indicator
of how big you want it to be.
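
A minimal sketch of that point (the struct and member names are
made up, and the stated ranges assume a two's-complement machine):
with the same width, "signed" versus "unsigned" is what changes the
meaning.

    #include <stdio.h>

    struct s {
            signed int   s3 : 3;    /* holds -4 .. 3 */
            unsigned int u3 : 3;    /* holds  0 .. 7 */
    };

    int main(void)
    {
            struct s x;

            x.s3 = -1;              /* representable: the field is signed   */
            x.u3 = -1;              /* converts to 7: the field is unsigned */
            printf("%d %u\n", (int)x.s3, (unsigned)x.u3);   /* "-1 7" */
            return 0;
    }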

Doug McIlroy


