appropriate optimization technology

Thomas M. Breuel uunet!idiap.ch!tmb
Tue May 19 08:10:02 PDT 1992


David G. Hough on validgh writes:
 > You don't need to disable substantial traditional optimization in order
 > to prevent a+b-a-b from being optimized away to zero as dead code.   
 > Sun compilers and GCC 2.1 both manage OK, as well as HP, IBM, and SGI/MIPS.

By what measure do they "manage OK"?
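
To be concrete about the expression under discussion (a sketch of my
own, not something from David's posting): in code written for IEEE
double arithmetic, a+b-a-b extracts the rounding error of a+b, which
is why some people insist it must not be folded to zero:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double a = 1.0;
        double b = ldexp(1.0, -60);  /* 2^-60: nonzero, below half an ulp of 1.0 */

        /* Assuming strict IEEE double evaluation, a+b rounds back to
         * 1.0, so the expression below yields -2^-60 (minus the
         * rounding error of a+b), not 0.0.  Folding it to zero as
         * "dead code" would change the program's result. */
        printf("a+b-a-b = %g\n", a + b - a - b);
        return 0;
    }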

 > Aside from the real but diminishing
 > problems of extra-precise registers on x86 and 68k
 > systems, most of the "optimizations" that cause trouble for floating-point
 > arithmetic really don't buy much performance on realistic applications that
 > were carefully written for performance initially.

You are telling me that if I hand-optimize my code, then I don't need
an optimizer.

Well, I like to be able to input my formulas in the least error-prone
and most convenient way (often involving automatic generation of
formulas) and have the optimizer optimize them. That's what it's for,
after all.
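
For example, here is a hypothetical fragment of the sort a formula
generator emits (the function and variable names are made up):

    /* Hypothetical machine-generated code, e.g. from symbolic
     * differentiation: full of algebraically trivial terms that the
     * generator does not bother to clean up. */
    double df(double x, double y)
    {
        double dx = 1.0 * y + x * 0.0;       /* algebraically just y */
        double dy = 0.0 * y + x * 1.0;       /* algebraically just x */
        return dx * dx + (x - x) + dy * dy;  /* middle term algebraically 0 */
    }

Simplifying x * 0.0 or x - x to 0.0 is exactly the kind of
transformation that is not value-safe for IEEE infinities, NaNs, and
signed zeros, yet it is precisely what I want the optimizer to do
here.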

 > ANSI-C compilers already supply <float.h>.   The issue is what to do about
 > programs that were already written before C was standardized, especially
 > on machines that don't have ANSI-C compilers, such as most Sun
 > workstations.

Too bad. Such programs were unportable and bad practice then, and they
are unportable and bad practice now. Why make the language less useful
in order to cater to some poorly written old code?
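
For reference, a sketch of the sort of thing <float.h> already
provides under ANSI C; these constants make run-time probing of the
arithmetic with expressions like a+b-a-b unnecessary:

    #include <float.h>
    #include <stdio.h>

    int main(void)
    {
        /* ANSI C exposes the machine's floating-point characteristics
         * as compile-time constants, so a standard-conforming program
         * can simply read them off instead of probing for them. */
        printf("DBL_MANT_DIG = %d\n", DBL_MANT_DIG);
        printf("DBL_EPSILON  = %g\n", DBL_EPSILON);
        printf("DBL_MIN      = %g\n", DBL_MIN);
        printf("DBL_MAX      = %g\n", DBL_MAX);
        return 0;
    }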

And, as I said before, try as you may: given the lack of a mandated
floating-point model, you still can't use the standard to prevent
vendors of conforming implementations from doing these kinds of
optimizations anyway. At best you can complain that some compiler
doesn't yield the behavior you want for your unportable program, and,
depending on the market, some compiler vendors may listen; I hope
they don't.
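
To make that concrete (my own sketch, with the same numbers as the
earlier example): the standard explicitly allows floating operands
and results to be carried in greater precision than their type, so a
conforming implementation may give either of two answers for the very
same expression:

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double a = 1.0;
        double b = ldexp(1.0, -60);   /* 2^-60 */

        /* Evaluated strictly in IEEE double, this is -2^-60.
         * Evaluated in 80-bit extended registers, which the standard's
         * greater-precision latitude permits, 1.0 + 2^-60 is exact and
         * the result is 0.0.  A program that depends on one particular
         * answer is relying on its compiler, not on the language. */
        printf("a+b-a-b = %g\n", a + b - a - b);
        return 0;
    }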

					Thomas.


