even more on base conversion and standardization

David G. Hough dgh
Sat Jun 2 16:57:19 PDT 1990


> Date: Sat, 02 Jun 90 16:27:44 -0400
> From: Tim Peters <uunet!ksr!tim>
> 
> >  It is certainly the intent of NCEG to ratify the non-contentious
> >  part of that requirement [actually a recommendation that compile-time
> > and execution-time expression evaluation match];
> 
> That's in fact the basis of my concern:  NCEG shouldn't mandate stuff
> specifically intended to help networked platforms *if* that stuff
> imposes unreasonable costs on other platforms.

I think networking issues will help users realize that bitwise compatibility
among IEEE systems is desirable; early implementations and appropriate
standards will help them realize that it's possible; 
then they will demand it on all implementations.
As usual most high-performance implementations will have a "fast" mode
in which they bypass the more onerous requirements in favor of performance.

> 
> Strongly agree that C bindings for 754 are a legitimate NCEG activity.
> It's *going beyond* 754 that's dubious, precisely *because* there's only
> a finite amount of time for stds work.  E.g., suppose you get perfect
> conversions into NCEG -- what does that accomplish?

Everything.

> The *problem* here seems
> to be that 754 was too permissive, not that there's not a language
> binding -- change 754 instead and the problem goes away for every
> new-&-improved-754X vendor, under all languages, at one stroke.

Putting correctly-rounded conversions into IEEE 754 doesn't do much by itself,
because IEEE 754 doesn't say how to reach them from C (printf and scanf might do
something different if they aren't constrained to be correctly rounded).
It's not worth the trouble to revise 754/854 unless the revision includes
language bindings to C and Fortran, in my opinion.  
Given that, it makes just as much sense to
specify the C numeric environment in NCEG, an ongoing activity, as to
start a new one from scratch.  When Rex first proposed NCEG, I thought it would
be more appropriate to rejuvenate the 754 committee instead, but the people
who are willing and able to work on the problem now are already doing so...
in NCEG.  Fortran unfortunately fell into a black political hole of too many
standardizers trying to get in on the action; further technical input
there is not as helpful as getting something promising out of NCEG.

> Think the most effective way to correct that is to revise 754.

Think about the politics of that suggestion.  Put tighter requirements in NCEG,
a new entity, and nobody loses, at least immediately.  Tighten up 754
requirements and a bunch of existing stuff becomes non-conforming.
So the resistance to change in any standard is pretty significant.

> "The identical" is more often the enemy of "the correct" than the
> friend.  I.e., I bet if you went back to your customer and said "OK,
> we'll give you the same answers everywhere, but we're picking the worst
> of the lot across the board" they'd reply "good enough!  just so long as
> all these *differences* go away".

Since the optimal kind of algorithm varies among systems, you can't standardize
an algorithm without favoring some systems over others.  That's why the only
neutral specification is on the results rather than the algorithm.  The only
specification that can provide identical results without specifying an algorithm
is the best possible, i.e. correct rounding.

> Every time Cray introduces a new
> model it gets to ask its customers whether they want better arithmetic;
> that a Y-MP yields results bit-for-bit identical to a Cray-1's isn't
> because CRI's hardware designers *can't* make better arithmetic
> acceptably fast.

Upward compatibility is a bear; just ask the VAX/VMS contingent at DEC.
They had the choice of risking the gradual erosion of their customer base
and cash flow as sites converted to non-DEC RISC architectures to get performance, 
or risking the gradual erosion of their customer base and cash flow as 
sites considered DEC RISC, and then non-DEC RISC architectures, once DEC
in effect admitted that the VAX architecture wouldn't be able to keep up
with RISC performance.  An installed base of computers is your biggest asset and
your biggest liability.  NCEG doesn't have an installed base of 
conforming implementations; in standardization this is an asset.

> Fortran has been around a long
> time, and some programs have run successfully unchanged on every crazy
> platform that's come along in the last 20 years.

It's hard to imagine a program that ran on every previous machine that Seymour
Cray had a hand in suddenly coming unglued on the Cray-1.
Though I'm sure it happened.  Just as likely, stuff that failed
obscurely on a CDC 7600 might have started working again on a Cray-1.

> both acted as strong pressures to cause many speed-conscious Fortran
> programmers to do their integer arithmetic *using* floating-point
> operations.

I think this was true on the CDC 6400 (at least for * and /) whether you coded in
floating-point or integer arithmetic - it all ended up being done in
floating point.  And the results of / at least had to vary depending
on whether you used Seymour's not-quite-chopped or not-quite-rounded
arithmetic.



More information about the Numeric-interest mailing list