Post by Jean François Martinez
If we intend to measure the actual quality of the compiler at
generating small code, then the significant variable is the size of the
.o file, not the size of the executable.
I remember a FORTRAN-IV compiler that generated direct and indirect code.
The latter was a kind of interpreted code: each "instruction" was a jump to a
short subprogram implementing its semantics. The direct code was twice as
large as the indirect one and 50% faster. So, no, the size of the object
files tells you nothing.
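To make the scheme concrete, here is a minimal sketch in C of such indirect
("threaded") code; the instruction names and the dispatcher are invented for
illustration and have nothing to do with the actual FORTRAN-IV implementation:

   #include <stdio.h>

   static double stack[16];
   static int    sp = 0;

   /* Each "instruction" is a jump to a short subprogram. */
   static void push2(void) { stack[sp++] = 2.0; }
   static void push3(void) { stack[sp++] = 3.0; }
   static void add2 (void) { sp--; stack[sp-1] += stack[sp]; }
   static void print(void) { printf("%g\n", stack[--sp]); }

   /* The "program" is merely a table of addresses of those subprograms. */
   static void (*const program[])(void) = { push2, push3, add2, print };

   int main(void)
   {
       /* A trivial dispatcher walks the table: the code is compact,
          but every step pays for an extra indirect call compared to
          directly generated code. */
       for (unsigned i = 0; i < sizeof program / sizeof program[0]; i++)
           program[i]();
       return 0;
   }

Compact object code, slower execution: exactly the trade-off described above.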
Post by Jean François Martinez
The executable is bloated by things like the runtime, libraries, the dynamic
linker (or the first stage of the dynamic linker) and so on.
The code size can be approximated by a linear function A + B*x, where x is
the program "complexity". The argument that A might be irrelevant is true only
if B is correspondingly small compared to the alternatives, so that with
growing x you could compensate for the losses on A.
Regarding GNAT, I doubt that B is really small, especially if generics are
used heavily, and A is substantially big, as we all know.
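To put some numbers on it (the figures below are invented, purely for
illustration), the break-even complexity at which a smaller B makes up for a
larger A is x = (A1 - A2) / (B2 - B1):

   #include <stdio.h>

   int main(void)
   {
       /* Invented numbers, only to show the break-even point.
          A = fixed runtime overhead (KB), B = code size added per
          unit of program "complexity" (KB). */
       double a1 = 600.0, b1 = 1.0;  /* large runtime, tighter generated code */
       double a2 =  20.0, b2 = 1.2;  /* small runtime, fatter generated code  */

       /* A1 + B1*x = A2 + B2*x  =>  x = (A1 - A2) / (B2 - B1) */
       double x = (a1 - a2) / (b2 - b1);
       printf("break-even complexity: about %.0f units\n", x);
       return 0;
   }

With these made-up figures the large-runtime toolchain only starts to pay off
beyond roughly 2900 complexity units.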
Post by Jean François Martinez
So the more capable your environment, the bigger the program.
Which should be the opposite, actually, since you could get some
functionality from the environment.
The problem is that the ratio of capability to useless garbage is quite low in
modern environments, and the trend is that it is rapidly approaching zero.
Post by Jean François Martinez
So if the program is small and will run on a half-decent box, you don't
care. Remember the Eee PC, that five-year-old laptop good for web surfing
and little more? It had 1G of memory. A million Ks. So who cares about
the size of hello world?
The empirical law of software teaches us that any resource available will be
consumed. A more specialized law, which is a consequence of the former, is:
Windows is here to make an i486 out of any i7.
Post by Jean François Martinez
If your program has megs and megs of code, then the overhead due to a more
capable runtime will be completely irrelevant relative to the "pure"
(that is, .o) size of your program and third-party (i.e. not related to GNAT)
libraries.
Which code must be linked dynamically or else, on systems without
virtual memory, loaded into RAM. Thus a large A usually translates directly
into longer start-up times. The effect we all know: each new
system, each new version takes 20% more time to start.
The time a program needs to start is a good estimate of the product line's
age. E.g. compare the start-up times of:
Borland C++
MS Visual Studio
AdaCore GPS
--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de