Discussion:
Ichbiah 2022 compiler mode
Kevin Chadwick
2024-09-05 11:52:37 UTC
I guess this is a very subjective question.

A number of Ada users have expressed that they would rather Ada were simpler,
whilst others desire more features.

I appreciate Ada 83 portability but also like a lot of modern Ada features.

Out of interest, could anyone help me with what a GNAT (or other compiler)
Ichbiah_2022_Mode might look like? Perhaps it would be possible to use
pragmas to estimate which features he might have kept or dropped.

I can continue researching, but currently I do not have the details of his
objections to Ada 95, and how those may have carried through to today is
perhaps a nuanced question.

What do you think Ichbiah would jettison from Ada 2022? All comments
welcome.
--
Regards, Kc
Jeffrey R.Carter
2024-09-05 13:40:35 UTC
Post by Kevin Chadwick
I can continue researching, but currently I do not have the details of his
objections to Ada 95, and how those may have carried through to today is
perhaps a nuanced question.
Ichbiah's objections to Ada 95 are in
https://web.elastic.org/~fche/mirrors/old-usenet/ada-with-null.
--
Jeff Carter
"[T]he language [Ada] incorporates many excellent structural
features which have proved their value in many precursor
languages ..."
C. A. R. Hoare
180
Bill Findlay
2024-09-05 13:49:44 UTC
Post by Jeffrey R.Carter
"[T]he language [Ada] incorporates many excellent structural
features which have proved their value in many precursor
languages ..."
C. A. R. Hoare
And he continued:

"one can look forward to a rapid and widespread improvement in programming
practice,
both from those who use the language and from those who study its concepts
and structures."

I am familiar with this, as one of the authors of the book
that contains that foreword. 8-)
--
Bill Findlay
Jeffrey R.Carter
2024-09-05 19:22:31 UTC
Post by Bill Findlay
Post by Jeffrey R.Carter
"[T]he language [Ada] incorporates many excellent structural
features which have proved their value in many precursor
languages ..."
C. A. R. Hoare
"one can look forward to a rapid and widespread improvement in programming
practice,
both from those who use the language and from those who study its concepts
and structures."
I am familiar with this, as one of the authors of the book
that contains that foreword. 8-)
That's signature 181.
--
Jeff Carter
"[T]he language [Ada] incorporates many excellent structural
features which have proved their value in many precursor
languages ..."
C. A. R. Hoare
180
Kevin Chadwick
2024-09-05 14:05:50 UTC
Post by Jeffrey R.Carter
Post by Kevin Chadwick
I can continue researching, but currently I do not have the details of his
objections to Ada 95, and how those may have carried through to today is
perhaps a nuanced question.
Ichbiah's objections to Ada 95 are in
https://web.elastic.org/~fche/mirrors/old-usenet/ada-with-null.
Thank you
--
Regards, Kc
Kevin Chadwick
2024-09-05 16:08:01 UTC
Post by Jeffrey R.Carter
Post by Kevin Chadwick
I can continue researching, but currently I do not have the details of his
objections to Ada 95, and how those may have carried through to today is
perhaps a nuanced question.
Ichbiah's objections to Ada 95 are in
https://web.elastic.org/~fche/mirrors/old-usenet/ada-with-null.
What does this mean?

"elimination of accuracy constraints in subtypes"
--
Regards, Kc
Jeffrey R.Carter
2024-09-05 19:24:05 UTC
Post by Kevin Chadwick
What does this mean?
"elimination of accuracy constraints in subtypes"
See ARM-95 J.3
(https://www.adaic.org/resources/add_content/standards/95lrm/ARM_HTML/RM-J-3.html),
Reduced Accuracy Subtypes.
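For example, a reduced-accuracy subtype looked like this (a minimal sketch; the subtype name is invented):

```ada
--  Ada 83 allowed an accuracy constraint in a floating-point subtype
--  declaration; Ada 95 moved this feature to Annex J (obsolescent).
procedure Accuracy_Demo is
   subtype Rough is Float digits 3;   --  fewer digits than Float itself
   D : Rough := 1.0 / 3.0;
begin
   null;
end Accuracy_Demo;
```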
--
Jeff Carter
"[T]he language [Ada] incorporates many excellent structural
features which have proved their value in many precursor
languages ..."
C. A. R. Hoare
180
Randy Brukardt
2024-09-06 00:03:22 UTC
"Kevin Chadwick" <kc-***@chadwicks.me.uk> wrote in message news:vbc625$at65$***@dont-email.me...
...
Post by Kevin Chadwick
What do you think Ichbiah would jettison from Ada 2022? All comments
welcome.
My recollection is that he wanted a more complex "class" feature, which IMHO
would have made Ada more complex, not simpler.

In any case, I can't guess what Ichbiah would have suggested after 40 years
of experience. (He probably would have moved on to some other language
anyway; you have to be somewhat resistant to change to stick with a single
language for your entire career. I seem to resemble that remark... ;-)

What I can do is suggest what an RLB_2022 mode would look like, as I did the
exercise when we all were cooped up during the early days of the pandemic.
My philosophy is that Ada has a lot of combinations of features that cause a
lot of implementation trouble, but which are not very useful. So I want to
reduce the combinations that cause trouble. I note that every feature is
useful for something (else it wouldn't be in Ada in the first place). But
some things are not useful enough for the trouble that they cause. Also note
that I am not worrying about compatibility with Ada, which is always a
problem when updating Ada itself.

Here's some highlights off the top of my head:

(1) Simplify the resolution model; essentially everything resolves like a
subprogram. For instance, objects resolve similarly to enumeration literals.
This substantially reduces the danger of use clauses (having matching
profiles and names is less likely than just matching names), and eliminates
the subtle differences between a constant and a function (they should really
act the same).

(2) Operator functions have to be primitive for at least one of the types in
the profile. (Operators in a generic formal part have a pseudo-primitive
requirement.) That includes renamings. In exchange for that, operators have
the same visibility as the type (which means they are always directly
visible when any object of the type is visible). One then can eliminate "use
type" (since it would literally do nothing).
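For reference, the "use type" clause that this would make redundant works like this in current Ada (package and names invented):

```ada
package Lengths is
   type Meters is range 0 .. 1_000_000;
end Lengths;

with Lengths;
procedure Demo is
   use type Lengths.Meters;       --  makes "+", "=", etc. directly visible
   A : Lengths.Meters := 1;
   B : Lengths.Meters := A + 2;   --  without the clause: Lengths."+" (A, 2)
begin
   pragma Assert (B = 3);
end Demo;
```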

(3) A number of syntax options are eliminated. Matching identifiers are
required at the end of subprograms and packages. Initializers are always
required (<> can be used if default initialization is needed). Keyword
"variable" is needed to declare variables (we do not want the worst option
to be the easiest to write, as it is in Ada).
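A sketch of how declarations might read under those rules (hypothetical syntax invented for illustration; not legal Ada):

```ada
--  Hypothetical RLB_2022 declarations (not compilable Ada):
Limit  : Integer := 10;              --  a constant; no keyword needed
Count  : variable Integer := 0;      --  "variable" keyword required
Buffer : variable Name_String := <>; --  explicit request for default init
```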

(4) Anonymous types of all sorts are eliminated. For access types, we would
use aspects to declare properties (static vs. dynamic accessibility,
"closure" types, etc.). For arrays, see next item.

(5) The array model would be greatly simplified. New Ada users (and old ones
as well) have a hard time dealing with the fact that the lower bound is not
fixed in Ada. Additionally, the existing Ada model is very complex when
private types are involved, with operators appearing long after a type is
declared. The more complex the model, the more complex the compiler, and
that means the more likely that errors occur in the compiler. There also is
runtime overhead with these features. The basic idea would be to provide the
features of an Ada.Containers.Vector, and no more. Very little is built-in.
That means that arrays can only be indexed by integers, but that is a good
thing: an array indexed by an enumeration type is really a map, and should
use a map interface. So I would add a Discrete_Map to the Ada.Containers
packages. Bounded_Arrays are a native type (most of the uses of arrays that
I have are really bounded arrays built by hand).

A side-effect of this model change is to greatly simplify what can be
written as discriminant-dependent components. Discriminant-dependent arrays
as we know them are gone, replaced by a parameterized array object that has
only one part that can change. Much of the nonsense associated with
discriminant-dependent components disappears with this model.
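The closest existing analogue to that array model is Ada.Containers.Vectors, where the lower bound is fixed by the index subtype; a minimal sketch:

```ada
with Ada.Containers.Vectors;
procedure Vector_Demo is
   package Int_Vectors is new Ada.Containers.Vectors
     (Index_Type => Positive, Element_Type => Integer);
   V : Int_Vectors.Vector;
begin
   V.Append (10);
   V.Append (20);
   --  Indexing always starts at Positive'First; no sliding, no
   --  per-object bounds.
   V (1) := V (1) + V (2);
end Vector_Demo;
```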

(6) Static items have to be declared as such (with a "static" keyword rather
than "constant"). Named numbers are replaced by explicit static constants.
(I would allow writing Universal_Integer and Universal_Real, so one could
declare static objects and operations of those types.)

(7) Types and packages have to be declared at library-level. This means that
most generic instances also have to be declared at library-level. Subtypes,
objects, and subprograms still can be declared at any nesting level. I make
this restriction for the following reasons:
(A) Accessibility checks associated with access types are simplified to
yes/no questions of library-level or not. The only cases where accessibility
checks do any real good are when library-level data structures are
constructed out of aliased objects. These would still be allowed, but almost
all of the complication would be gone. Even if the check needs to be done
dynamically, it is very cheap.
(B) Tagged types declared in nested scopes necessarily require complex
dynamic accessibility checks to avoid use of dangling types (that is, an
object which exists of a type that does not exist).
(C) Reusability pretty much requires ODTs to be declared in
library-level packages. Mandating that won't change much for most programs,
and you'll be happier in the long run if you declare the types in library
packages in the first place.
(D) There are a lot of semantic complications that occur from allowing
packages in subprograms, but this is rarely a useful construct.

(8) Protected types become protected records (that is, a regular record type
with the keyword "protected"). Primitive operations of a protected record
type are those that are protected actions. (Entries can be declared and
renamed as such; they would no longer match procedures, which leads to all
kinds of nonsense.) This would eliminate the problems declaring helper types
and especially *hiding* helper types for protected types.
(See the problems we had defining the queues in the Ada.Containers to see
the problem.) The protected operations would allow the keyword "protected"
in order to make the subprograms involved explicit.
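For contrast, the current construct that protected records would replace; a minimal counter type (names invented):

```ada
--  Current Ada: a protected type with its own dedicated syntax.
protected type Counter is
   procedure Increment;
   function Value return Natural;
private
   Count : Natural := 0;   --  helper state; hiding more than this is hard
end Counter;

protected body Counter is
   procedure Increment is
   begin
      Count := Count + 1;
   end Increment;

   function Value return Natural is (Count);
end Counter;
```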

(9) Strings are not arrays! Strings would be provided by dedicated packages,
supporting a variety of representations. There would be a Root_String'Class
that encompasses all string types. (So many operations could be defined on
Root_String'Class).
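Ada.Strings.Unbounded already hints at what such a dedicated string package looks like; a minimal sketch:

```ada
with Ada.Strings.Unbounded; use Ada.Strings.Unbounded;
procedure String_Demo is
   --  Unbounded_String is a private type, not an array.
   S : Unbounded_String := To_Unbounded_String ("Hello");
begin
   Append (S, ", world");
   pragma Assert (Length (S) = 12);
end String_Demo;
```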

(10) Variable-returning functions are introduced. They're pretty similar to
the semantics of anonymous access returns (or the aliased function returns
suggested by Tucker). This means that a variable can easily be treated as a
function (and indeed, a variable declaration is just syntactic sugar for
such a function).

(11) Various obsolete features like representation_clauses, representation
pragmas, and the ability to use 'Class on untagged private types are
eliminated or restricted.

There were a couple of areas that I never made up my mind on:

(A) Do we need tasks at all? Parallel and task are very much overlapping
capabilities. But the parallel model would need substantial changes if we
were to allow suspension of parallel threads (Ada 2022 does not allow this).
Suspension seems necessary to support intermittent inputs of any type
(including interrupts) without wasting resources running busy-wait loops.

(B) Should type conversions be operators or remain as the type name as in
Ada? A type conversion operator, automatically constructed, would allow
user-defined types to have the same sort of conversions to numeric and
string types that the predefined types do. But an operator would make conversions
easier, which is probably the wrong direction for a strongly typed language.

(C) I wanted to simplify the assignment model, but my initial attempt did not
work semantically. I'm not sure that simplification is possible with the Ada
feature set (I'm sure Bob and Tuck tried to do that when creating Ada 95,
but they failed). The main issue is that one would like to be able to
replace discriminant checks on user-defined assignment. (Imagine the
capacity checks on a bounded vector; Ada requires these to match, but that's
way too strong; the only problem is if the target capacity cannot hold the
actual length of the source object. A user-defined replacement would be
helpful.)

My $20 worth (this was a lot more work than $0.02!!). I probably forgot a
number of items; my actual document is about 20 pages long.

Randy.
Lawrence D'Oliveiro
2024-09-06 00:58:05 UTC
Post by Randy Brukardt
Keyword "variable" is needed to declare variables (we do not want the
worst option to be the easiest to write, as it is in Ada).
One language idea I toyed with years ago was that

«name» : «type»;

declared a variable, while

«name» : «type» := «value»;

declared a constant. So, no initialization of variables at declaration
time allowed.
Post by Randy Brukardt
(10) Variable-returning functions are introduced.
Is this like updater functions in POP-11, or “setf” in Lisp? So you have a
procedure

set_var(«var», «new value»)

which is declared to be attached to «var» in some way, such that when you
write

«var» := «new_value»

this automatically invokes set_var?
Randy Brukardt
2024-09-12 04:39:27 UTC
...
Post by Randy Brukardt
(10) Variable-returning functions are introduced.
Is this like updater functions in POP-11, or "setf" in Lisp? So you have a
procedure
set_var(«var», «new value»)
which is declared to be attached to «var» in some way, such that when you
write
«var» := «new_value»
this automatically invokes set_var?
No, it is a function that returns a variable, meaning you can assign into
the function result. If you have:
function Foo return variable Integer;
then you can use Foo on either side of an assignment:
Foo := 1;
Bar := Foo + 1;

Essentially, this idea treats:
Var : variable Integer;
as syntactic sugar for
function Var return variable Integer;

The worth of that is two-fold: (1) Objects and functions now resolve the
same; (2) one can write a function that acts exactly like an object, and
thus can replace it in all uses.

Note that Ada currently has generalized reference objects and functions that
return anonymous access types, and both of these act similarly to a variable
returning function. But neither is quite a perfect match.
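A sketch of the generalized-reference mechanism mentioned above, using the Implicit_Dereference aspect (type and object names invented):

```ada
procedure Reference_Demo is
   --  A reference type: using an Int_Ref implicitly dereferences
   --  its access discriminant.
   type Int_Ref (Element : not null access Integer) is null record
     with Implicit_Dereference => Element;

   X : aliased Integer := 0;

   --  A function whose result can be assigned into, via the
   --  implicit dereference -- close to a variable-returning function.
   function Foo return Int_Ref is (Element => X'Access);
begin
   Foo := 1;               --  assigns through the reference into X
   pragma Assert (X = 1);
end Reference_Demo;
```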

Randy.
Lawrence D'Oliveiro
2024-09-12 22:24:29 UTC
Post by Randy Brukardt
...
Post by Randy Brukardt
(10) Variable-returning functions are introduced.
Is this like updater functions in POP-11, or "setf" in Lisp?
No, it is a function that returns a variable, meaning you can assign
into the function result.
I think an updater function would be more generally useful. Because some
things you want to update might not (depending on the implementation) live
independently in an explicit variable. And it seems good not to constrain
implementations unnecessarily.
Randy Brukardt
2024-09-14 06:18:25 UTC
Post by Lawrence D'Oliveiro
Post by Randy Brukardt
...
Post by Randy Brukardt
(10) Variable-returning functions are introduced.
Is this like updater functions in POP-11, or "setf" in Lisp?
No, it is a function that returns a variable, meaning you can assign
into the function result.
I think an updater function would be more generally useful. Because some
things you want to update might not (depending on the implementation) live
independently in an explicit variable. And it seems good not to constrain
implementations unnecessarily.
Unfortunately, "updater" functions don't work with the Ada model of
components, because you can't tell what to do when a component appears or
disappears in an assignment. (That's why Ada doesn't allow overloading
":=".) And composition is very important to Ada -- stand-alone objects are
pretty rare outside of those for scalar types. I don't think something that
only worked with stand-alone objects would be very useful (can't use those
with ODTs, for instance).

Randy.
Lawrence D'Oliveiro
2024-09-14 07:18:29 UTC
Post by Randy Brukardt
Unfortunately, "updater" functions don't work with the Ada model of
components, because you can't tell what to do when a component appears
or disappears in an assignment.
But it’s just syntactic sugar, nothing more. Instead of

a := obj.get_prop()
obj.set_prop(a)

(both of which have valid Ada equivalents), you can unify them into

a := obj.prop
obj.prop := a

What difference does writing it differently make?
Simon Wright
2024-09-06 21:22:08 UTC
Post by Randy Brukardt
(A) Do we need tasks at all? Parallel and task are very much
overlapping capabilities.
I don't think I've ever wanted parallel. Most embedded system tasks are
one-off, aren't they?
Niklas Holsti
2024-09-07 17:13:03 UTC
Post by Simon Wright
Post by Randy Brukardt
(A) Do we need tasks at all? Parallel and task are very much
overlapping capabilities.
I don't think I've ever wanted parallel. Most embedded system tasks are
one-off, aren't they?
More and more embedded systems use multi-core processors and do heavy,
parallelizable computations. "Parallel" is intended to support that in a
light-weight way. In a recent discussion with the European Space Agency,
they expressed interest in using OpenMP for such computations on-board
spacecraft with multi-core processors, which is an "embedded" context.

Regarding tasks in embedded systems, I agree that most are one-off, but
I have occasionally also used tens of tasks of the same task type.

I disagree with Randy's view that tasks and "parallel" are much
overlapping. Tasks are able to communicate with each other, but AIUI
parallel tasklets are not meant to do that, and may not be able to do
that. Tasks can have different priorities; tasklets cannot.
Nioclás Pól Caileán de Ghloucester
2024-09-07 20:34:52 UTC
On Sat, 7 Sep 2024, Niklas Holsti wrote:
"More and more embedded systems use multi-core processors and do heavy,
parallelizable computations. "Parallel" is intended to support that in a
light-weight way. In a recent discussion with the European Space Agency, they
expressed interest in using OpenMP for such computations on-board spacecraft
with multi-core processors, which is an "embedded" context."


Hei!

Most of the languages which are referred to by
WWW.OpenMP.org/resources/openmp-compilers-tools
facilitate bugs. (The Spark which is referred to thereon is not the
Ada-related Spark language.)
Randy Brukardt
2024-09-12 04:46:18 UTC
"Niklas Holsti" <***@tidorum.invalid> wrote in message news:***@mid.individual.net...
...
Post by Niklas Holsti
I disagree with Randy's view that tasks and "parallel" are much
overlapping. Tasks are able to communicate with each other, but AIUI
parallel tasklets are not meant to do that, and may not be able to do
that. Tasks can have different priorities; tasklets cannot.
I was (of course) presuming that "tasklets" would get those capabilities if
they were to replace tasks. That's what I meant about "suspension", which is
not currently allowed for threads in Ada (parallel code is not allowed to
call potentially blocking operations). If that was changed, then all forms
of existing task communication would be allowed.

I'm less certain about the value of priorities; most of the time, they don't
help in writing correct Ada code. (You still need all of the protections
against race conditions and the like.) I do realize that they are a natural
way to express constraints on a program. So I admit I don't know in this
area, in particular if there are things that priorities are truly required
for.

Randy.
Niklas Holsti
2024-09-12 07:42:38 UTC
Post by Randy Brukardt
...
Post by Niklas Holsti
I disagree with Randy's view that tasks and "parallel" are much
overlapping. Tasks are able to communicate with each other, but AIUI
parallel tasklets are not meant to do that, and may not be able to do
that. Tasks can have different priorities; tasklets cannot.
I was (of course) presuming that "tasklets" would get those capabilities if
they were to replace tasks. That's what I meant about "suspension", which is
not currently allowed for threads in Ada (parallel code is not allowed to
call potentially blocking operations). If that was changed, then all forms
of existing task communication would be allowed.
Ok, I understand. In that case, what "parallel" adds to the current
tasking feature is an easy way to create a largish and perhaps
dynamically defined number of concurrent threads from a "parallel" loop,
where the threads are automatically created when the loop is started and
automatically "joined" and destroyed when the loop completes.
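In Ada 2022 syntax, such a loop looks roughly like this (a minimal sketch):

```ada
procedure Parallel_Demo is
   A : array (1 .. 1_000) of Integer := (others => 1);
begin
   --  The implementation creates, schedules, and joins the logical
   --  threads of control automatically; each iteration touches a
   --  distinct component, so there is no data race.
   parallel for I in A'Range loop
      A (I) := A (I) * 2;
   end loop;
end Parallel_Demo;
```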

I don't mind at all if a future Ada evolution merges tasks and
"parallel", although it might defeat the easier access to multi-core
true parallelism that is the goal of the "parallel" extension, AIUI.
Post by Randy Brukardt
I'm less certain about the value of priorities, most of the time, they don't
help writing correct Ada code. (You still need all of the protections
against race conditions and the like.) I do realize that they are a natural
way to express constraints on a program. So I admit I don't know in this
area, in particular if there are things that priorities are truly required
for.
Priorities (or the equivalent, such as deadlines) are absolutely
required for real-time systems where there are fewer cores than
concurrent/parallel activities so that the system has to schedule more
than one such activity on one core.

If Ada did not have tasks with priorities, most of the Ada applications
I have worked on in my life would have had to avoid Ada tasking and
retreat to using some other real-time kernel, with ad-hoc mapping of the
kernel's threads to Ada procedures.

Despite the transition to multi-core processors, I think that there will
continue to be systems where scheduling is required, because the number
of concurrent/parallel activities will increase too.
Dmitry A. Kazakov
2024-09-12 09:07:01 UTC
Post by Niklas Holsti
I don't mind at all if a future Ada evolution merges tasks and
"parallel", although it might defeat the easier access to multi-core
true parallelism that is the goal of the "parallel" extension, AIUI.
To me usefulness of "parallel" is yet to be seen, while tasks proved to
be immensely useful on all architectures available.
Post by Niklas Holsti
Priorities (or the equivalent, such as deadlines) are absolutely
required for real-time systems where there are fewer cores than
concurrent/parallel activities so that the system has to schedule more
than one such activity on one core.
If Ada did not have tasks with priorities, most of the Ada applications
I have worked on in my life would have had to avoid Ada tasking and
retreat to using some other real-time kernel, with ad-hoc mapping of the
kernel's threads to Ada procedures.
Right.
Post by Niklas Holsti
Despite the transition to multi-core processors, I think that there will
continue to be systems where scheduling is required, because the number
of concurrent/parallel activities will increase too.
Yes.

The law of nature is that any resources becoming available will be
consumed by the software, regardless of the purpose of... (:-))
--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de
Kevin Chadwick
2024-09-12 12:36:19 UTC
Post by Niklas Holsti
If Ada did not have tasks with priorities, most of the Ada applications
I have worked on in my life would have had to avoid Ada tasking and
retreat to using some other real-time kernel, with ad-hoc mapping of the
kernel's threads to Ada procedures.
Counter-intuitively, it is possible that this is holding Ada back. A lot of
Ada code cannot run without some fairly complex runtime support due to
tasks, protected objects, finalization, etc. Runtimes have to be developed
for each chip instead of each CPU. At least I assume that that is why these
features are not available to e.g. the light cortex-m33 or cortex-m4 or
cortex-m0+ runtimes. This requires rewriting code which isn't required with
equivalent C code, such as containers and IP stacks. Even support for
the Ada interrupt package is missing, but it looks like porting that support
to chips is less work and research.

If you need advanced multi-core support, then using an OS seems to me a
more suitable situation to be in.
--
Regards, Kc
Niklas Holsti
2024-09-12 15:43:45 UTC
Post by Kevin Chadwick
Post by Niklas Holsti
If Ada did not have tasks with priorities, most of the Ada
applications I have worked on in my life would have had to avoid Ada
tasking and retreat to using some other real-time kernel, with ad-hoc
mapping of the kernel's threads to Ada procedures.
Counter-intuitively, it is possible that this is holding Ada back. A lot of
Ada code cannot run without some fairly complex runtime support due to
tasks, protected objects, finalization, etc. Runtimes have to be developed
for each chip instead of each CPU.
True, however an Ada RTS can implement many of the tasking features with
moderate effort on top of non-Ada real-time kernels such as FreeRTOS,
VxWorks, etc., as AdaCore have done for some kernels. At least for the
Ravenscar and Jorvik profiles. AIUI, the processor-specific stuff is
then mainly in the kernel, not in the RTS.
Post by Kevin Chadwick
If you need advanced multi core support then using an OS seems like a more
suitable situation to be in to me.
Using a large OS like Linux would not be acceptable for many embedded
systems. Fortunately the smaller real-time kernels are adding multi-core
support too.

The great advantage of using the standard Ada tasking feature, special
syntax and all, is that your embedded Ada program can then be executed
on a PC or other non-embedded computer, for testing or other purposes,
tasking and all. It can also be analysed by static-analysis tools such
as AdaControl for race conditions and other tasking-sensitive issues.
Nioclás Pól Caileán de Ghloucester
2024-09-13 20:45:03 UTC
On Thu, 12 Sep 2024, Kevin Chadwick wrote:
"Counter-intuitively, it is possible that this is holding Ada back. A lot of
Ada code cannot run without some fairly complex runtime support due to
tasks, protected objects, finalization, etc. Runtimes have to be developed
for each chip instead of each CPU."


A book by Burns and Wellings unsensibly boasts that the demanding runtime
requirements of Ada are an advantage, whereas, as Kevin Chadwick points
out, they are not easy to provide.
J-P. Rosen
2024-09-12 09:04:58 UTC
Post by Randy Brukardt
I was (of course) presuming that "tasklets" would get those capabilities if
they were to replace tasks. That's what I meant about "suspension", which is
not currently allowed for threads in Ada (parallel code is not allowed to
call potentially blocking operations). If that was changed, then all forms
of existing task communication would be allowed.
Well, tasks are not only for speeding up code. They can be a very useful
design tool (active objects, independent activities). I think the Ada
model is clean and simple; I would hate to see it disappear.
Post by Randy Brukardt
I'm less certain about the value of priorities, most of the time, they don't
help writing correct Ada code. (You still need all of the protections
against race conditions and the like.) I do realize that they are a natural
way to express constraints on a program. So I admit I don't know in this
area, in particular if there are things that priorities are truly required
for.
If you had as many cores as tasks, you would not need priorities.
Priorities are just an optimization of how to manage cores when there are
not enough of them.
I know that people use priorities to guarantee mutual exclusion, and
other properties. All these algorithms were designed at the time of
mono-CPU machines, but they fail on multi-cores. Nowadays, relying on
priorities for anything other than optimization is bad -and dangerous-
design.
--
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
https://www.adalog.fr https://www.adacontrol.fr
Niklas Holsti
2024-09-12 11:35:27 UTC
On 2024-09-12 12:04, J-P. Rosen wrote:

[...]
Post by J-P. Rosen
Well, tasks are not only for speeding up code. They can be a very useful
design tool (active objects, independent activities). I think the Ada
model is clean and simple; I would hate to see it disappear.
I agree.
Post by J-P. Rosen
I'm less certain about the value of priorities, [...]
Priorities are just optimization on how to manage cores when there are
not enough of them.
In some contexts it could be optimization -- for example, to increase
throughput in a soft real-time system -- but in hard real-time systems
priorities (or deadlines) are needed for correctness, not just for
optimization.
Post by J-P. Rosen
I know that people use priorities to guarantee mutual exclusion, and
other properties. All these algorithms were designed at the time of
mono-CPU machines, but they fail on multi-cores.
In SW for multi-core systems it can be beneficial to collect tasks that
frequently interact with each other or with the same single-user
resources in the same core, and then the mono-core mutual-exclusion
algorithms like priority ceiling inheritance can be used for that group
of tasks, while using other algorithms for mutual exclusion between
tasks running in different cores.
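Ada 2012 can express that grouping directly with the CPU and Priority aspects; a sketch with invented task names and values (task bodies omitted):

```ada
with System.Multiprocessors;
package Grouped_Tasks is
   use System.Multiprocessors;

   --  Tasks that share single-user resources are pinned to CPU 1, so
   --  mono-core locking protocols such as priority ceiling apply
   --  among them.
   task Sensor_Reader with CPU => 1, Priority => 10;
   task Sensor_Filter with CPU => 1, Priority => 8;

   --  An independent activity runs on another core.
   task Logger with CPU => 2, Priority => 5;
end Grouped_Tasks;
```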
Randy Brukardt
2024-09-14 06:13:28 UTC
"Niklas Holsti" <***@tidorum.invalid> wrote in message news:***@mid.individual.net...
...
Post by Niklas Holsti
Post by J-P. Rosen
Priorities are just optimization on how to manage cores when there are
not enough of them.
In some contexts it could be optimization -- for example, to increase
throughput in a soft real-time system -- but in hard real-time systems
priorities (or deadlines) are needed for correctness, not just for
optimization.
This I don't buy: priorities never help with correctness. At least not
without extensive static analysis, but if you can do that, you almost
certainly can achieve correctness without depending upon priorities.

I view priorities as similar to floating point accuracy: most people use
them and get the results they want, but the reason for that is that they got
lucky, and not because of anything intrinsic. Unless you do a lot of
detailed analysis, you don't know if priorities really are helping or not
(and similarly, whether your results actually are meaningful in the case of
floating point).

Anyway, I don't see any such changes coming to Ada, but rather to some
separate follow-on language (which necessarily needs to be simpler), and
thus some things that are sometimes useful would get dropped.

(Different message)
...
Post by Niklas Holsti
Ok, I understand. In that case, what "parallel" adds to the current
tasking feature is an easy way to create a largish and perhaps dynamically
defined number of concurrent threads from a "parallel" loop, where the
threads are automatically created when the loop is started and
automatically "joined" and destroyed when the loop completes.
I think the parallel block is more useful for general tasking. The advantage
of using parallel structures is that they look very similar to sequential
structures, and one lets the system do the scheduling (rather than trying to
figure out an organization manually).

One of the advantages of the model I'm thinking about is that it separates
concerns such as parallel execution, mutual exclusion, inheritance,
organization (privacy, type grouping), and so on into separate (mostly)
non-overlapping constructs. Ada started this process by making tagged types
a separate construct from packages; you need both to get traditional OOP,
but you can also construct many structures that are quite hard in
traditional "one construct" OOP. I think that ought to be done for all
constructs, and thus the special task and protected constructs ought to go.
We already know that protected types cause problems with privacy of
implementation and with inheritance. Tasks have similar issues (admittedly
less encountered), so splitting them into a set of constructs would fit the
model.

In any case, this is still a thought experiment at this time, whether
anything ever comes of it is unknown.

Randy.




Dmitry A. Kazakov
2024-09-14 06:47:49 UTC
Reply
Permalink
Post by Randy Brukardt
I think the parallel block is more useful for general tasking. The advantage
of using parallel structures is that they look very similar to sequential
structures, and one lets the system do the scheduling (rather than trying to
figure out an organization manually).
Tasking is not about scheduling. It is about program logic expressed in
a sequential form. It is about software decomposition. Parallel
constructs simply do not do that.
Post by Randy Brukardt
One of the advantages of the model I'm thinking about is that it separates
concerns such as parallel execution, mutual exclusion, inheritance,
organization (privacy, type grouping), and so on into separate (mostly)
non-overlapping constructs.
To me it is exactly *one* construct: inheritance. You should be able to
inherit from an abstract protected interface at any point of the type
hierarchy in order to add mutual exclusion:

type Protected_Integer is new Integer and Protected;
Post by Randy Brukardt
Ada started this process by having tagged types
a separate construct from packages;
I see modules and types as unrelated things.

Post by Randy Brukardt
you need both to get traditional OOP,
but you can also construct many structures that are quite hard in
traditional "one construct" OOP. I think that ought to be done for all
constructs, and thus the special task and protected constructs ought to go.
Constructs yes, they must go. It must be all inheritance. The concepts
must stay.
Post by Randy Brukardt
We already know that protected types cause problems with privacy of
implementation and with inheritance. Tasks have similar issues (admittedly
less encountered), so splitting them into a set of constructs would fit the
model.
The problems are of a syntactic nature, IMO.

There is an issue with an incomplete inheritance model. You need not
just complete overriding but also finer mechanisms, like extension, in
order to deal with entry-point implementations. The same problem exists
with constructors and destructors, BTW. What should really go is the
Ada.Finalization mess, replaced by a sane model of user construction
hooks for all types, class-wide ones included.
--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de
Lawrence D'Oliveiro
2024-09-14 07:19:47 UTC
Reply
Permalink
... priorities never help for correctness.
Concurrent programming was never about correctness, it was about
efficiency/performance (throughput, latency, whatever is appropriate). And
priorities are just another part of this.
Niklas Holsti
2024-09-14 08:12:43 UTC
Reply
Permalink
Post by Randy Brukardt
...
Post by Niklas Holsti
Post by J-P. Rosen
Priorities are just optimization on how to manage cores when there are
not enough of them.
In some contexts it could be optimization -- for example, to increase
throughput in a soft real-time system -- but in hard real-time systems
priorities (or deadlines) are needed for correctness, not just for
optimization.
This I don't buy: priorities never help for correctness. At least not
without extensive static analysis, but if you can do that, you almost
certainly can achieve correctness without depending on priorities.
You misunderstood me; perhaps I was too brief.

I said "hard real-time systems", which means that the program is correct
only if it meets its deadlines, for which priorities or deadline-based
scheduling are necessary if there are fewer cores than
concurrent/parallel activities, and the application has a wide range of
deadlines and activity execution times.

(To be honest, there is the alternative of using a single thread that is
manually sliced into small bits, interleaving all the activities
increment by increment, according to a static, cyclic schedule, but that
is IMO a horribly cumbersome and unmaintainable design, though
unfortunately still required in some contexts.)

I believe we agree that priorities should be used for other things, such
as controlling access to shared data, only if there is a well-defined
and safe mechanism for it, such as protected objects with priority
ceilings and priority inheritance on a single core.
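(As a concrete sketch of the mechanism Niklas refers to: a protected object given a ceiling priority via the Priority aspect. Names here are invented for illustration, and the protected body is elided.)

```ada
with System;

package Shared_Data is

   protected Counter
     with Priority => System.Priority'Last   --  ceiling priority
   is
      procedure Increment;                   --  mutually exclusive update
      function  Value return Natural;        --  concurrent read
   private
      Count : Natural := 0;
   end Counter;

end Shared_Data;
```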
Lioneldraghi
2024-12-20 23:26:06 UTC
Reply
Permalink
I am sensitive to the complexity-vs-use balance, and I could easily cope
with the negative consequences of most of the simplifications you propose
(at least those I understand!), but obviously it is hard to be fully aware
of all the consequences.
Post by Randy Brukardt
(3) A number of syntax options are eliminated. Matching identifiers are
required at the end of subprograms and packages. Initializers are always
required (<> can be used if default initialization is needed). Keyword
"variable" is needed to declare variables (we do not want the worst option
to be the easiest to write, as it is in Ada).
Why do you consider variables worse than constants?

I don't want the "worst" option to be the easiest to write, but
neither do I want to add one more keyword in the most common case.

Note that :
1. I have no statistics, but it seems to me that there are more variables
than constants in my code.
2. I say "useless" from my coder's point of view; I don't know if it
simplifies the work for compiler or tool implementers.

Lionel
Randy Brukardt
2024-12-21 08:14:00 UTC
Reply
Permalink
"Lioneldraghi" <***@free.fr> wrote in message news:vk4uee$3lfp0$***@dont-email.me...
...
Post by Lioneldraghi
Post by Randy Brukardt
(3) A number of syntax options are eliminated. Matching identifiers are
required at the end of subprograms and packages. Initializers are always
required (<> can be used if default initialization is needed). Keyword
"variable" is needed to declare variables (we do not want the worst option
to be the easiest to write, as it is in Ada).
Why do you consider variables worse than constants?
I don't want the "worst" option to be the easiest to write, but
neither do I want to add one more keyword in the most common case.
A lot of "variables" in code actually are only written once. In Ada, those
are better modeled as constants. A constant tells the reader that the value
doesn't change during the life of the object, which is easier for analysis
(both human and machine).
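(A small illustration of the point, with invented names:)

```ada
--  Written once and never reassigned: declare it constant, so both
--  the reader and the compiler know the value cannot change.
Limit : constant Natural := Compute_Limit (Config);

--  The variable form compiles just as well, but forces the reader
--  to scan the rest of the scope for possible reassignments:
--  Limit : Natural := Compute_Limit (Config);
```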

Secondly, I am assuming that automation is helping to write a lot of code.
"One more keyword" is irrelevant in terms of creating code, the only
question is whether it hurts readability. I prefer to have most things
written explicitly (but not to the point of noise, of course). That seems
especially true if the code is being written by a program and mostly you are
trying to figure out why it doesn't work!
Post by Lioneldraghi
1. I have no statistics, but it seems to me that there are more variables
than constants in my code.
But how many of them *have* to be variables vs. the number that just are
because it is easier? I know I have a number of the latter.
Post by Lioneldraghi
2. I say "useless" from my coder's point of view; I don't know if it
simplifies the work for compiler or tool implementers.
Constants do help the compiler generate better code, although a lot of the
benefits can be gained also by working harder. (That's what C compilers do,
after all.)

Randy.
Jeffrey R.Carter
2024-12-21 09:50:41 UTC
Reply
Permalink
Post by Randy Brukardt
"One more keyword" is irrelevant in terms of creating code, the only
question is whether it hurts readability.
More keywords = fewer words that can be used as identifiers

so more keywords make writing code a little bit harder than fewer keywords.
Post by Randy Brukardt
But how many of them *have* to be variables vs. the number that just are
because it is easier? I know I have a number of the latter.
I put a lot of effort into making sure that all constants are so declared,
because I have the rule that (with certain exceptions) no non-local variables
may be referenced from subprograms, but constants may be referenced from
anywhere. However, I sometimes have constants that cannot be initialized with a
single expression, resulting in

C : T; -- Constant after initialization

Once C has been initialized, I treat it as a constant. Would your approach allow
the compiler to know that C is really a constant?
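(One workaround available in current Ada is to wrap the multi-step initialization in a function, so the object itself can be declared constant; a sketch with invented names:)

```ada
function Make_C return T is
   Result : T;
begin
   Step_1 (Result);   --  multi-step initialization happens here
   Step_2 (Result);
   return Result;
end Make_C;

C : constant T := Make_C;   --  now a true constant for the compiler too
```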
--
Jeff Carter
“Anyone who cannot cope with mathematics
is not fully human.”
The Notebooks of Lazarus Long
214
Randy Brukardt
2024-12-24 01:00:32 UTC
Reply
Permalink
"Jeffrey R.Carter" <***@spam.acm.org.not> wrote in message news:vk631h$3vfb4$***@dont-email.me...
...
Post by Jeffrey R.Carter
I put a lot of effort into making sure that all constants are so declared,
because I have the rule that (with certain exceptions) no non-local
variables may be referenced from subprograms, but constants may be
referenced from anywhere.
Precisely. The idea is to encourage use of constants by eliminating the
unnatural advantage to writing uninitialized variables. If everything is
equally easy/hard to write, then one is more likely to make the best choice
for the program.
Post by Jeffrey R.Carter
However, I sometimes have constants that cannot be initialized with a
single expression, resulting in
C : T; -- Constant after initialization
Once C has been initialized, I treat it as a constant. Would your approach
allow the compiler to know that C is really a constant?
Not with the approach I was envisioning. Of course, Ada 2022 and beyond
already make it possible to initialize a lot more objects (especially with
the introduction of container aggregates), so hopefully it will be less
necessary to write things like your example.

Randy.
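(For context, an Ada 2022 container aggregate lets such an object be declared constant in one step; a minimal sketch:)

```ada
with Ada.Containers.Vectors;

procedure Aggregate_Demo is
   package Int_Vectors is new Ada.Containers.Vectors (Positive, Integer);
   V : constant Int_Vectors.Vector := [1, 2, 3];   --  Ada 2022 aggregate
begin
   null;
end Aggregate_Demo;
```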
Post by Jeffrey R.Carter
--
Jeff Carter
"Anyone who cannot cope with mathematics
is not fully human."
The Notebooks of Lazarus Long
214
G.B.
2024-12-21 17:19:12 UTC
Reply
Permalink
Post by Randy Brukardt
Post by Lioneldraghi
1. I have no statistics, but it seems to me that there is more variables
than constants in my code.
But how many of them *have* to be variables vs. the number that just are
because it is easier? I know I have a number of the latter.
Post by Lioneldraghi
2. I say "Useless" from my coder point of view, I dont know if it simplify
the work for compiler or tools implementers.
Constants do help the compiler generate better code, although a lot of the
benefits can be gained also by working harder. (That's what C compilers do,
after all.)
What are some compilers offering today? That is, can they find declarations
of variables that could be constants, if so instructed?
I am seeing some warnings about non-initialized variables for a meaningless
mock-up, but not much else. Ada, C++, Java.
(Maybe there are options that I have missed. Or an analysis of a whole
program yields more.)

function testc (b : Boolean) return Integer is
   package P is
      x : Integer;
   end;
begin
   if b then
      P.x := 42;
   end if;
   return P.x;
end testc;

int testc(bool b) {
    struct {
        int x;
    } P;
    if (b) {
        P.x = 42;
    }
    return P.x;
}

class testc {
    class P {
        int x;
    }
    P P;
    int $(boolean b) {
        if (b) {
            P.x = 42;
        }
        return P.x;
    }
}
Chris Townley
2024-12-21 17:35:30 UTC
Reply
Permalink
Post by G.B.
Post by Randy Brukardt
Post by Lioneldraghi
1. I have no statistics, but it seems to me that there is more variables
than constants in my code.
But how many of them *have* to be variables vs. the number that just are
because it is easier? I know I have a number of the latter.
Post by Lioneldraghi
2. I say "Useless" from my coder point of view, I dont know if it simplify
the work for compiler or tools implementers.
Constants do help the compiler generate better code, although a lot of the
benefits can be gained also by working harder. (That's what C
compilers do,
after all.)
What are some compilers offering today? That is, can they find declarations
of variables that could be constants, if so instructed?
I am seeing some warnings about non-initialized variables for a meaningless
mock-up, but not much else.  Ada, C++, Java.
(Maybe there are options that I have missed. Or an analysis of a whole
program yields more.)
function testc (b : Boolean) return Integer is
   package P is
     x : Integer;
   end;
begin
  if b then
     P.x := 42;
  end if;
  return P.x;
end testc;
int testc(bool b) {
   struct {
     int x;
   } P;
   if (b) {
     P.x = 42;
   }
   return P.x;
}
class testc {
  class P {
    int x;
  }
  P P;
  int $(boolean b) {
    if (b) {
       P.x = 42;
    }
    return P.x;
  }
}
My understanding is that many compilers will optimise these, and for
trivial values will 'optimise' out the variable. Confusing in the debugger!
--
Chris
Simon Wright
2024-12-22 12:15:54 UTC
Reply
Permalink
Post by G.B.
I am seeing some warnings about non-initialized variables for a meaningless
mock-up, but not much else.
See -gnatwk, which is included in -gnatwa. Been around for a long time now.
G.B.
2024-12-22 15:52:18 UTC
Reply
Permalink
Post by Simon Wright
Post by G.B.
I am seeing some warnings about non-initialized variables for a meaningless
mock-up, but not much else.
See -gnatwk, which is included in -gnatwa. Been around for a long time now.
No effect of either here (gcc-14.1.0). I see a CONSTRAINT_ERROR at run time
when using the uninitialized return value, for indexing an array.
Simon Wright
2024-12-22 23:06:24 UTC
Reply
Permalink
Post by G.B.
Post by Simon Wright
Post by G.B.
I am seeing some warnings about non-initialized variables for a meaningless
mock-up, but not much else.
See -gnatwk, which is included in -gnatwa. Been around for a long time now.
No effect of either here (gcc-14.1.0). I see a CONSTRAINT_ERROR at run time
when using the uninitialized return value, for indexing an array.
I _meant_ to be commenting on
Post by G.B.
Post by Simon Wright
Post by G.B.
What are some compilers offering today? That is, can they find
declarations of variables that could be constants, if so instructed?
Sorry about that.
Randy Brukardt
2024-12-24 01:03:53 UTC
Reply
Permalink
...
Post by G.B.
Post by Randy Brukardt
Constants do help the compiler generate better code, although a lot of the
benefits can be gained also by working harder. (That's what C compilers do,
after all.)
What are some compilers offering today? That is, can they find
declarations of variables that could be constants, if so instructed?
I was thinking from a code generation perspective, as opposed to static
analysis. These are really the same process, but in one case the output is
only for a computer, and in the other, the output is for a human. The needs
of those outputs are quite different. I think a lot of compilers do
optimizations of constants by discovery, but few compilers do much beyond
rudimentary static analysis. I expect that to change, but I may be an
optimist on that...

Randy.

Keith Thompson
2024-12-21 21:26:06 UTC
Reply
Permalink
"Randy Brukardt" <***@rrsoftware.com> writes:
[...]
Post by Randy Brukardt
A lot of "variables" in code actually are only written once. In Ada, those
are better modeled as constants. A constant tells the reader that the value
doesn't change during the life of the object, which is easier for analysis
(both human and machine).
[...]

Agreed. But my understanding is that compilers typically do this kind
of analysis anyway, at least when optimization is enabled. For example,
if I write:

N : Integer := 42;

and later refer to the value of N, if the compiler is able to prove to
itself that N is never modified after its initialization, it can replace
a reference to N with the constant 42 (and possibly fold it into other
constant expressions).

Using "constant" for something that isn't going to be modified is good
practice, but I'd say it's for the benefit of the human reader.
--
Keith Thompson (The_Other_Keith) Keith.S.Thompson+***@gmail.com
void Void(void) { Void(); } /* The recursive call of the void */
Pascal Obry
2024-12-22 09:32:20 UTC
Reply
Permalink
Post by Keith Thompson
Using "constant" for something that isn't going to be modified is good
practice, but I'd say it's for the benefit of the human reader.
Exactly Randy's point. And as Jean-Pierre would say, "a program is
written once and read many times" (or something like that).

So having "constant" is a very important point when reading the code
for maintenance. Actually, a developer doesn't care about the compiler;
what matters is the semantics "visible" to the human reader.
--
  Pascal Obry /  Magny Les Hameaux (78)

  The best way to travel is by means of imagination

  http://photos.obry.net

  gpg --keyserver keys.gnupg.net --recv-key F949BD3B
Lioneldraghi
2024-12-21 00:52:43 UTC
Reply
Permalink
Post by Randy Brukardt
(10) Variable-returning functions are introduced. They're pretty similar to the
semantics of anonymous access returns (or the aliased function returns
suggested by Tucker). This means that a variable can easily be treated as a
function (and indeed, a variable declaration is just syntactic sugar for
such a function).
I suppose that, to allow the compiler to distinguish this nonsense code
Post by Randy Brukardt
Square (2) := 3;
from the legitimate
Post by Randy Brukardt
List.Index (3) := Item;
you will have to introduce some specific syntax, like Tucker's "aliased
function".

I see the huge benefit from a user point of view, but I'm not aware of
compiler internals: doesn't the introduction of a second function type
increase complexity?
Randy Brukardt
2024-12-21 08:19:09 UTC
Reply
Permalink
Post by Lioneldraghi
Post by Randy Brukardt
(10) Variable-returning functions are introduced. They're pretty similar to the
semantics of anonymous access returns (or the aliased function returns
suggested by Tucker). This means that a variable can easily be treated as a
function (and indeed, a variable declaration is just syntactic sugar for
such a function).
I suppose that, to allow the compiler to distinguish this nonsense code
Post by Randy Brukardt
Square (2) := 3;
from the legitimate
Post by Randy Brukardt
List.Index (3) := Item;
you will have to introduce some specific syntax, like Tucker's "aliased
function".
I see the huge benefit from a user point of view, but I'm not aware of
compiler internals: doesn't the introduction of a second function type
increase complexity?
Yes, but Ada already has a bunch of different mechanisms for dealing with
objects/functions/exceptions/packages/types. My intent is to collapse those
all into one (somewhat more complex) mechanism. The basic idea is that
everything resolves a single way, meaning that everything can be overloaded,
and there no longer is a semantic difference between:
A : constant T := ...;
and
function A return T is (...);

Whether that really helps remains to be seen, of course. But the goal is to
reduce the number of disjoint mechanisms both in the language description
and in the implementation. The hope is then to be able to introduce
additional capabilities on top of a simpler and stronger foundation.

Randy.
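(In current Ada the equivalence Randy describes can already be approximated for read-only use: a parameterless expression function is used just like a constant at the call site, even though today the two declarations still resolve differently. A sketch:)

```ada
package Demo is
   A1 : constant Integer := 42;            --  object view
   function A2 return Integer is (42);     --  expression function
   --  Callers write Demo.A1 and Demo.A2 identically in expressions;
   --  the proposal would make the two declarations truly interchangeable.
end Demo;
```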
Jeffrey R.Carter
2024-09-06 11:07:27 UTC
Reply
Permalink
Post by Kevin Chadwick
Out of interest, could anyone help me with what a GNAT or other compiler's
Ichbiah_2022_Mode might look like.
I have no idea what he would have done. For an idea of what I think a language
should have, you can look at my informal description of King
(https://github.com/jrcarter/King).
--
Jeff Carter
"My name is Jim, but most people call me ... Jim."
Blazing Saddles
39
Nioclás Pól Caileán de Ghloucester
2024-09-06 20:26:32 UTC
Reply
Permalink
On Fri, 6 Sep 2024, Jeffrey R.Carter wrote:
"I have no idea what he would have done. For an idea of what I think a
language should have, you can look at my informal description of King
(https://github.com/jrcarter/King)."

"Error rendering embedded code

Invalid PDF"
said
HTTPS://GitHub.com/jrcarter/King/blob/main/King_Basics_for_Adaists.pdf

"Probably no one will like the language except me."
said
HTTPS://GitHub.com/jrcarter/King

Hmm.