(j3.2006) (SC22WG5.5761) [ukfortran] RE: RE: Units of measure
Van Snyder
Thu Jul 7 23:37:13 EDT 2016
On Fri, 2016-07-08 at 10:37 +0900, Cohen Malcolm wrote:
> Doing it via software tools has the significant advantages of
> (a) no impact on vendor implementations,
> (b) works even for compilers whose vendors have not provided it,
> (c) only takes a year or two to obtain, instead of more than 10 years via
> the standard (or even more than that, seeing as how here we are in 2016 but
> few compilers yet implement all of the standard we started working on in
> 1998 and published in 2004!).
Why were coarrays not done in this way? How about C interoperability? Parameterized derived types (PDTs)?
What was the point of DO CONCURRENT, especially given that OpenMP
directives do the same thing? Do OpenMP directives have any impact on
vendor implementations, or are they handled by a preprocessor?
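For concreteness, the comparison I have in mind is between loops like these (an illustrative sketch; the program and array names are invented for the example):

   program demo
     real :: a(1000), b(1000), c(1000)
     integer :: i
     b = 1.0;  c = 2.0

     do concurrent (i = 1:size(a))   ! part of the language proper
        a(i) = b(i) + c(i)
     end do

     !$omp parallel do               ! a directive: just a comment to a
     do i = 1, size(a)               ! compiler that doesn't do OpenMP
        a(i) = b(i) + c(i)
     end do
     !$omp end parallel do
   end program demo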
Doing anything via software tools has the significant disadvantage that
the lifetime of support of the tools bears no relationship to the
lifetime of support that a language feature would have. When Fujitsu
said "nothing newer than Fortran 95 for Intel processors, no 64-bit
product, and no Linux product maintenance," and Tom Lahey retired, we
switched to other compiler vendors without any trauma. That's not an
option if the software tool company goes out of business, and there
aren't any others who offer the same functionality. Maybe if it were
done by directives that the processor handled....
The things published in 2004 but not yet universally available were much
more difficult than the units proposal. Indeed, coarrays were much more
difficult, but they're widely available. Nice try at a straw man,
though.
We did ask for this in 1986. It took me eleven more years to get
funding to join J3. The powers-that-be at JPL didn't believe us in 1986
that automatic units checking might avert a catastrophe.
> There are software companies which provide software tools for Fortran. NAG
> is such a company. As I mentioned before, I am sure that NAG would be
> prepared to design the directives and implement tools for units handling,
> and almost certainly for less than 1% of $300M. So far, neither I nor
> anyone else at NAG has received even a hint of interest in such a thing.
A company that builds software tools (not NAG) contacted us. We were
expected to bear the entire development expense. Then they would sell
the tool (and sell maintenance and support to us) and we would not share
in the profits. We weren't sure whether Congress would stand for it.
It looked too much like a "Solyndra" scam. Maybe it would sneak
through. Our Ethics office said "don't touch it." We didn't want to
take the chance. Somebody might have gotten fired, or even prosecuted.
We used a software tool for a different purpose, from a vendor who had
developed it at their own expense. A few years after we became
almost totally dependent upon it, we found a few bugs and some
deficiencies. By then the company was out of business. Working around
the bugs wasn't free. Developing a simpler but nontrivial tool to
overcome the deficiencies by post-processing its output wasn't free. We were
happy to escape from it, but that wasn't free either. It wasn't obvious
it had reduced our long-term labor cost.
> <<<
> If your organization had lost $300 million due to a trivial software mistake
> that the programming language and its runtime could have caught or corrected
> automatically, would you roll over and play dead?
> >>>
>
> No, I would build, or contract to be built, a tool for checking code that is
> annotated with units information, for the reasons given above.
Those who read the proposal might have noticed that it doesn't just do
checking. Defining a conversion unit creates a trivial generic
function and its inverse. Call that "code generation" if you like, but
I expect it would be done long before the real "code generator" comes
into play. Nothing like the code generation necessary for a coindexed
reference, or DO CONCURRENT. That means the tool would be a
preprocessor, not an analysis tool. Conversion is not automatic; it's
explicit. It looks just like a function reference, and that's what the
proposal calls it. I don't think of that as a new code generation
problem. A preprocessor would probably need to inline the conversion
functions: if a conversion unit is defined in an internal procedure,
the generated functions couldn't themselves be internal functions
there, because Fortran doesn't allow internal procedures within
internal procedures. I'm
not sure how a preprocessor would or could handle units checking and
conversion during formatted input.
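For readers who haven't looked at the proposal, the flavor is roughly
this (an invented illustration; the spellings of the unit declarations
and the names metre and foot are mine, not necessarily the proposal's):

   unit :: metre                    ! a base unit
   unit :: foot = 0.3048 * metre    ! a conversion unit; its definition
                                    ! creates the trivial generic
                                    ! functions foot() and metre(),
                                    ! each the other's inverse
   real, unit(metre) :: height
   real, unit(foot)  :: altitude

   height   = metre(altitude)   ! conversion is explicit, and looks
   altitude = foot(height)      ! just like a function reference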
While we waited during the interregnum between Fortran 77 and the
availability of Fortran 90 processors, we used a preprocessor to provide
a more comprehensive set of control structures, and a limited form of
internal procedures (we actually started it before Fortran 77). We had
a love-hate relationship with it for decades. Debugging was tedious
because line numbers were different in the source and generated code.
Error messages from the compiler were difficult to attach to the right
place. Source-level debuggers were pointless, because they only worked
on the (ugly) generated code. It stuck around for more than ten years
after Fortran 90 compilers became available, long after the primary
developer had retired. Six million lines of navigation software used
it. It might have been part of the reason that the powers-that-be decided
to re-code that software in C++ instead of converting it to real Fortran
syntax. It's not obvious it reduced our overall long-term labor cost.
But the code looked nicer than Fortran 77.
Concerning preprocessors, we got a very strong "never again" signal.
Analysis tools are useful, and we use some.