integrating concurrency with oop

4.19: adda/oop/co/integrating concurrency with oop:
. oop was described as msg passing,
and then a complication was noted
-- inversion of control --
because if you sent a msg
and also expected a response,
then you'd be waiting for a reply to your msg,
meaning that instead of
simply writing a function call,
you were now writing reply handlers for
every use of that function .
. the reason for the handler would be
that if your object was getting a lot of requests,
you'd otherwise be just waiting around
when you could be making more calls
or checking on other replies .
. but, what is there to wait for?
just have the compiler clear a flag
before a remote assignment;
then set it after the remote assignment completes;
and finally check for the flag
before continuing to use that var .
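. that flag scheme can be sketched in python;
threading.Event stands in for the compiler's flag,
and the names (RemoteVar, begin_remote_assignment, ...)
are hypothetical, not adda's:

```python
import threading

class RemoteVar:
    """sketch of a compiler-managed flag guarding a remote assignment
    (hypothetical names; Event stands in for the ready-flag)."""
    def __init__(self):
        self._ready = threading.Event()
        self._value = None

    def begin_remote_assignment(self, remote_fn, *args):
        self._ready.clear()            # flag cleared before the remote assignment
        def worker():
            result = remote_fn(*args)  # the remote call runs concurrently
            self._value = result
            self._ready.set()          # flag set after the remote assignment
        threading.Thread(target=worker).start()

    def get(self):
        self._ready.wait()             # check the flag before using the var
        return self._value

v = RemoteVar()
v.begin_remote_assignment(lambda a, b: a + b, 2, 3)
# ... the caller is free to do other work here ...
print(v.get())  # blocks only if the reply hasn't arrived yet
```

. so the caller writes what looks like an ordinary
function call and assignment, with no reply handler .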

. I've long assumed that oop naturally defined
a per-object granularity of concurrency
but now I'm wondering whether that assumption
really holds for my style of oop .
. I like the value-oriented paradigm
(vs the popular address-orientation);
. with value-orientation,
there are still the usual polymorphic functions;
but instead of asking an obj to operate on itself,
the functions can act like they do in
that classic example of polymorphism,
number.type: where {*,/,+,-} are
binary operations over the {N,C,Q,R,Z} subtypes .
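. python's numeric tower gives a quick illustration of
that value-oriented polymorphism: the one binary op +
is defined over the number subtypes and operates on values
rather than asking an object to mutate itself
(a sketch only; Z, Q, R, C are mapped to int, Fraction,
float, complex -- float only approximates R):

```python
from fractions import Fraction

# one polymorphic binary op, `+`, over several number subtypes:
results = [2 + 3,                            # Z + Z -> Z
           Fraction(1, 2) + Fraction(1, 3),  # Q + Q -> Q
           0.5 + 0.25,                       # R + R -> R (approx)
           (1 + 2j) + (3 - 1j)]              # C + C -> C
print(results)
```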

. after a var holds a value,
new values seem to come from function assignments;
but, here's what I like about this style:
the functions don't generate garbage;
they use an implicit out-mode parameter, y,
which points at whatever address
the function's result was assigned to .
. all polymorphic vars have expandable
-- but nevertheless localized -- memory
that is dealloc'd in the usual way,
whenever the owning scope retires .
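. the out-mode result parameter can be sketched in python
by passing the destination explicitly
(vec_add and dest are illustrative names only;
adda would pass y implicitly from the assignment):

```python
# sketch of an implicit out-mode parameter y (made explicit here):
# instead of returning a fresh list (garbage), the function writes
# into whatever destination the assignment names.
def vec_add(y, a, b):
    """models y`= a + b : the result lands in the caller's memory."""
    for i in range(len(a)):
        y[i] = a[i] + b[i]

dest = [0, 0, 0]   # the var's localized memory, owned by this scope
vec_add(dest, [1, 2, 3], [10, 20, 30])
print(dest)
```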
. conversely,
what seems like a self-mutating procedure
is really the cooperation between
a function and an assignment stmt .
. the way to view self-mutators like i`+1;
is that they are simply shorthand for i`= i+1;
y`*(x) means y`= (y)*(x),
and y`f means y`= f(y); each implicitly has
an inout-mode y parameter
rather than the usual implicit out-mode y .
. to model math's use of y= f(x)
in teaching about functions,
x is the name of a function's
initial activation record,
so the full name of a formal param p,
in a function f, is f`x`p;
or just x`p, meaning self`x`p,
where self is the current function
when referring to itself anonymously .
. the interface declares it as y`f(x);
the body implements it as:
f(x.t).proc: ( y`= routine(y, x) ) .
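. that shorthand-as-assignment view, sketched in python
(a pure function plus an ordinary assignment;
the names are illustrative only):

```python
# a "self-mutating" update is just a function plus an assignment:
# y`f is shorthand for y`= f(y) -- y is inout, not out.
def f(y):
    return y * 2 + 1   # pure: no mutation happens inside the function

y = 5
y = f(y)               # y`f    ==  y`= f(y)
i = 0
i = i + 1              # i`+1   ==  i`= i+1
x = 3
y = y * x              # y`*(x) ==  y`= (y)*(x)
print(y, i)
```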

. I also wondered how sequences are preserved;
and, from this brief survey
I discovered the need for a contiguous
(read, Quick-modify, write) critical zone;
now my job is to have it done auto'ly
without involving app developers .

. when concurrent subprograms (cosub's)
are sharing inout access to a var,
the generally required minimal cooperation
is that only one writer has access at a time,
and no reading should be allowed during a write .
. if the compiler can't prove a var is not shared
then it must assume it is,
and provide it with some cooperation scheme .
. oop's messages can serialize concurrent accesses;
but value-oriented oop doesn't require msg-passing .
. the efficient and safe way is a
compiler-administered lock system;
it holds the lock just long eno' to complete
an entire read or write .
. lock use must be minimal, to protect against
deadlocks and unnecessary waiting .
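. a per-var lock held just long eno' for one whole read
or one whole write can be sketched in python
(GuardedVar is a hypothetical name); note it deliberately
does not make a read-modify-write pair atomic:

```python
import threading

class GuardedVar:
    """sketch of a compiler-administered per-var lock:
    held just long enough for one entire read or write."""
    def __init__(self, value):
        self._lock = threading.Lock()
        self._value = value

    def read(self):
        with self._lock:       # no reading during a write
            return self._value

    def write(self, value):
        with self._lock:       # only one writer at a time
            self._value = value

shared = GuardedVar(0)

def writer():
    for _ in range(1000):
        # each read and each write is safe on its own,
        # but the read-then-write pair is NOT atomic:
        shared.write(shared.read() + 1)

threads = [threading.Thread(target=writer) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(shared.read())   # may fall short of 4000: updates can be lost
```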
. another safe use is for an atomic
(read, Qmodify, write) cycle
where Qmodify represents a certain class of procedures
that are guaranteed to be free of deadlock,
because either no dependencies are required
or they have already been procured
before initiating the lock attempt;
and the procedure's loops are proven to terminate .
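. the atomic cycle can be sketched by holding one lock
across the whole (read, Qmodify, write)
(AtomicVar is a hypothetical name; the qmodify argument
models that deadlock-free, terminating procedure class):

```python
import threading

class AtomicVar:
    """sketch: one lock around the whole (read, Qmodify, write) cycle."""
    def __init__(self, value):
        self._lock = threading.Lock()
        self._value = value

    def modify(self, qmodify):
        # qmodify must qualify as a Qmodify: it takes no further locks
        # (no new dependencies) and its loops terminate.
        with self._lock:
            self._value = qmodify(self._value)  # read, modify, write as one unit

    def read(self):
        with self._lock:
            return self._value

counter = AtomicVar(0)

def bump():
    for _ in range(1000):
        counter.modify(lambda v: v + 1)

threads = [threading.Thread(target=bump) for _ in range(4)]
for t in threads: t.start()
for t in threads: t.join()
print(counter.read())   # always 4000: no update can be lost
```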

. when is the (read, Qmodify, write) cycle really needed?
at least for the lock itself:
redo.loop:(is lock free?
lockit else redo).
-- the generalization of that exists
whenever an externally-aliased object is
accessed by both parts of the same conditional:
# read by the guarding expression;
# modified by the guarded stmt .
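. the redo loop maps onto a non-blocking acquire,
which performs the "is lock free?" read and the "lockit"
modify as one atomic step -- exactly the guard-and-body
pairing described above (python sketch; names illustrative):

```python
import threading

# redo.loop:(is lock free? lockit else redo),
# with acquire(blocking=False) as the atomic test-and-set:
# the guard's read and the body's modify happen as one step.
_lock = threading.Lock()

def lockit():
    while not _lock.acquire(blocking=False):  # is lock free? no -> redo
        pass                                  # spin (a real impl would yield)

def unlockit():
    _lock.release()

lockit()
held_twice = _lock.acquire(blocking=False)    # a second attempt must fail
unlockit()
print(held_twice)
```

. if the test and the set were two separate stmts,
another cosub could take the lock in between them .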