Despite all the recent talk, I think we are actually moving together
on many fronts.
For the moment, let's ignore the issue of a total order over volatile
memory actions, and just assume that volatile memory is sequentially
consistent.
Consider two programming models, A and B.
In both models, you check the validity of a totally ordered execution trace E.
For each thread T, the actions of T in E must be consistent with the
single-thread semantics of the program (ignoring the values returned
by reads, which are determined by the memory model).
A happens-before (hb) relation is defined on the actions in E, induced
by program order within a thread and by matching unlock/lock and
volatile write/read pairs.
In both models, the values read in E must be consistent with hb; i.e.,
if R is a read that returns the value of a write W, then we cannot have
R hb W, and there cannot be a write W' to the same location with
W hb W' hb R in E.
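To make the hb constraint concrete, here is a small sketch (the class
and field names are mine, not from either model's spec): a volatile
write/read pair creates the hb edge that makes an earlier ordinary
write visible.

    class HBSketch {
        int data = 0;                    // ordinary field
        volatile boolean ready = false;  // volatile field

        void writer() {        // thread 1
            data = 42;         // W1: ordinary write
            ready = true;      // W2: volatile write
        }

        void reader() {        // thread 2
            if (ready) {       // R2: volatile read; if it sees W2, W2 hb R2
                int r = data;  // R1: W1 hb W2 hb R2 hb R1, so R1 cannot
                               // return the initial 0; it must see 42
            }
        }
    }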
In model B only, a read must return the value of a write that occurs
earlier in E.
I think all of the models now being discussed fall somewhere between A and B.
A is too relaxed, because it allows causal loops and allows correctly
synchronized programs to exhibit non-sequentially-consistent behavior.
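For the causal loop, the usual example is something like this (my own
sketch); initially x == y == 0, neither field is volatile, and so
there are no hb edges between the two threads:

    class CausalLoop {
        int x = 0, y = 0;

        void thread1() {
            int r1 = x;   // read x
            y = r1;       // copy the value just read into y
        }

        void thread2() {
            int r2 = y;   // read y
            x = r2;       // copy the value just read into x
        }
    }

Model A does not forbid an execution in which r1 == r2 == 42: each
read is justified by the other thread's write, and the value 42
appears out of thin air. hb consistency alone cannot exclude it,
because there is no hb path between the threads' actions.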
B is too strict, because it does not allow reordering of a read with a
following write.
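The reordering B rules out is the usual independent read-then-write
case (again my own sketch; x == y == 0 initially, nothing volatile):

    class ReadThenWrite {
        int x = 0, y = 0;

        void thread1() {
            int r1 = x;   // read
            y = 1;        // following write to a different location
        }

        void thread2() {
            int r2 = y;   // read
            x = 1;        // following write to a different location
        }
    }

A compiler or processor can move each write above the preceding read,
since they touch different locations, making r1 == r2 == 1 possible.
In model B every read must return the value of a write earlier in E,
and no interleaving consistent with program order yields r1 == r2 == 1,
so B forbids that (harmless) reordering.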
Jason and Sarita: do you agree that your models fit between A and B?
Jason, I know that your model isn't specified this way, but I'm
pretty sure the semantics of your model fall between A and B.
If I'm correct, this will help us focus on the remaining differences
between our models.
Bill
-------------------------------
JavaMemoryModel mailing list - http://www.cs.umd.edu/~pugh/java/memoryModel