From daniel.smith at oracle.com Mon Oct 4 23:34:37 2021
From: daniel.smith at oracle.com (Dan Smith)
Date: Mon, 4 Oct 2021 23:34:37 +0000
Subject: Addressing the full range of use cases
Message-ID: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com>

When we talk about use cases for Valhalla, we've often considered a very broad set of class abstractions that represent immutable, identity-free data. JEP 401 mentions varieties of integers and floats, points, dates and times, tuples, records, subarrays, cursors, etc. However, as shorthand this broad set often gets reduced to an example like Point or Int128, and these latter examples are not necessarily representative of all candidate value types.

Specifically, our favorite example classes have a property that doesn't generalize: they'll happily accept any combination of field values as a valid instance. (In fact, they're even happy to accept any combination of *bits* of the appropriate length.) Many candidate primitive classes don't have this property -- the constructors do important validation work, and only certain combinations of fields are allowed to represent valid instances.

Related areas of concern that we've had on the radar for a while:

- The "all zeros is your default value" strategy forces an all-zero instance into the class's value set, even if that doesn't make sense for the class. Many candidate classes have no reasonable default at all, leading naturally to a wish for "null is your default value" (or other, more exotic, strategies involving revisiting the idea that every type has a default value). We've provided 'P.ref' for those use sites that *need* null, but haven't provided a complete story for value types that want it to be *their* default value, too.

- Non-atomic heap updates can be used to create new instances that arbitrarily combine previously-validated instances' fields. There is no guarantee that the new combination of fields is semantically valid. Again, while there's precedent for this with 'double' and 'long' (JLS 17.7), those are special cases that don't generalize -- any combination of double bit fields is *still a valid double*. (This is usually described as "tearing", although JLS 17.6 has something else in mind when it uses that word...) The language provides 'volatile' as a use-site opt-in to atomicity, and we've toyed with a declaration-site opt-in as well. But object integrity being "off" by default may not be ideal.

- Existing class types like LocalDate are both nullable and atomic. These are useful properties to preserve during migration; nullability, in particular, is essential for source compatibility. We've provided reference-default declarations as a mechanism to make reference types (which have these properties) the default, with 'P.val' as an opt-in to value types. But in doing so we take away the many benefits of value types by default, and force new code to work with the "bad name".

While we can provide enough knobs to accommodate all of these special cases, we're left with a complex user model which asks class authors to make n different choices whose consequences they may not immediately grasp, and class users to keep 2^n different categories straight in their heads.

As an alternative, we've been exploring whether a simpler model is workable. It is becoming clear that there are (at least) two clusters of uses for value types.
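To make the distinction concrete, here's a minimal sketch using the draft 'primitive class' syntax from JEP 401 (the class names and invariants are invented purely for illustration, and none of this compiles on a released JDK):

    // "Classic" style: any combination of field values (indeed, of bits) is a
    // valid instance, and the all-zero default is a perfectly reasonable Complex(0, 0).
    primitive class Complex {
        double re, im;
        Complex(double re, double im) { this.re = re; this.im = im; }
    }

    // "Encapsulated" style: the constructor does real validation work, so only some
    // field combinations are meaningful, and the all-zero default (0/0) is a value
    // the constructor would never have produced.
    primitive class Fraction {
        int num, den;
        Fraction(int num, int den) {
            if (den == 0) throw new ArithmeticException("zero denominator");
            this.num = num;
            this.den = den;
        }
    }

Only the second kind is threatened by an all-zero default or by field combinations its constructor never approved.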
The "classic" value types are like numerics -- they'll happily accept any combination of field values as a valid instance, and the zero value is a sensible (often the best possible) default value. They make relatively little use of encapsulation. These are the ones that best "work like an int."

The "encapsulated" value types are those that are more like typical aggregates ("codes like a class") -- their constructors do important validation work, and only certain combinations of fields are allowed to represent valid instances. These are more likely to not have valid zero values (and hence want to be nullable).

Some questions to consider for this approach:

- How do we group features into clusters so that they meet the sweet spot of user expectations and use cases while minimizing complexity? Is two clusters the right number? Is two already too many? (And what do we call them? What keywords best convey the intended intuitions?)

- If there are knobs within the clusters, what are the right defaults? E.g., should atomicity be opt-in or opt-out?

- What are the performance costs (or, in the other direction, performance gains) associated with each feature? For certain feature combinations, have we canceled out the performance gains over identity classes (and at that point, is that combination even worth supporting?)

From daniel.smith at oracle.com Tue Oct 5 00:04:07 2021
From: daniel.smith at oracle.com (Dan Smith)
Date: Tue, 5 Oct 2021 00:04:07 +0000
Subject: Addressing the full range of use cases
In-Reply-To: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com>
References: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com>
Message-ID: <6B7D957B-903A-4ED8-8DA0-0A9F35639F3F@oracle.com>

Here's a followup with some answers reflecting my own understanding and what we've learned at Oracle while investigating these ideas. (Presented separately because there's still a lot of uncertainty, and because I want to encourage focusing on the contents of the original mail, with this reply as a supplement.)

> On Oct 4, 2021, at 5:34 PM, Dan Smith wrote:
>
> Some questions to consider for this approach:
>
> - How do we group features into clusters so that they meet the sweet spot of user expectations and use cases while minimizing complexity? Is two clusters the right number? Is two already too many? (And what do we call them? What keywords best convey the intended intuitions?)

A "classic" and "encapsulated" pair of clusters seems potentially workable (better names TBD).

Classic primitive classes behave as described in JEP 401 -- this piece is pretty stable. (Although some pieces, like the construction model, could be refined to better match their less class-like semantics.)

Encapsulated primitive classes are always nullable and (maybe?) always atomic.

Nullability can be handled in one of two ways:

- Flush the previous mental model that null is inherently a reference concept. Null is a part of the value set of both encapsulated primitive value types and reference types.

- Encapsulated primitives are *always* reference types. They're just a special kind of reference type that can be optimized with flattening; if you want finer-grained control, use a classic primitive class. However, we often do the exercise of trying to get rid of the ".ref" type, only to find that there are still significant uses for a developer-controlled opt out of all flattening...

For migration, encapsulated primitive classes mostly subsume "reference-default" classes, and let us drop the 'Foo.val' feature.
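To illustrate the kind of code that keeps working only if a migrated type stays nullable, here's an ordinary-Java sketch (the class below, and the idea of LocalDate actually migrating, are hypothetical):

    import java.time.LocalDate;
    import java.util.HashMap;
    import java.util.Map;

    // Existing code routinely relies on LocalDate being nullable. If LocalDate became
    // an encapsulated (nullable) primitive class, this keeps compiling and behaving as
    // before; as a null-hostile classic primitive it would not.
    class Schedule {
        private final Map<String, LocalDate> deadlines = new HashMap<>();

        LocalDate deadlineFor(String task) {
            return deadlines.get(task);   // may legitimately be null today
        }

        public static void main(String[] args) {
            LocalDate d = new Schedule().deadlineFor("release");
            System.out.println(d == null ? "no deadline" : d.toString());
        }
    }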
As nullable types, encapsulated primitive value types are source compatible replacements for existing reference types, and potentially provide an instant performance boost on recompilation. (Still to do, though: binary compatibility. There are some strategies we can use that don't require so much attention in the language. This is a complex enough topic that it's probably best to set it aside for now until the bigger questions are resolved.)

> - If there are knobs within the clusters, what are the right defaults? E.g., should atomicity be opt-in or opt-out?

Fewer knobs are better. Potentially, the "encapsulated"/"classic" choice is the only one offered. Nullability and atomicity would come along for the ride, and be invisible to users. *However*, performance considerations could push us in a different direction.

For the "encapsulated"/"classic" choice, perhaps "encapsulated" should be the default. Classic primitives have sharper edges, especially for class authors, so perhaps they can be pitched as an "advanced" feature, with an extra modifier signaling this fact. (Everybody uses 'int', but most people don't need to concern themselves with declaring 'int'.) Alternatively, maybe we'd prefer a term for "classic", and a separate term for "encapsulated"? (Along the lines of "record" and "enum" being special kinds of classes with a variety of unique features.)

> - What are the performance costs (or, in the other direction, performance gains) associated with each feature? For certain feature combinations, have we canceled out the performance gains over identity classes (and at that point, is that combination even worth supporting?)

Nullability:

Encapsulated primitive class types need *nullable Q types* in the JVM. A straightforward way to get there is by adding a boolean flag to the classes. This increases footprint in some cases, but is often essentially free. (For example: if the size of an array component must be a power of 2, boolean flags only increase the array size for 2 or so classes in java.time. Most have some free space.) There are some other strategies JVMs could use to compress null flags into existing footprint. In full generality, this could involve cooperation with class authors ("this pointer won't be null"). But it seems like that level of complexity might be unnecessary -- for footprint-sensitive use cases, programmers can always fall back to classic primitive classes.

Execution time costs of extra null checks for nullable Q types need to be considered and measured, but it seems like they should be tolerable.

Atomicity:

JVM support for atomicity guarantees seems more difficult -- algorithms for ensuring atomicity above 64 bits tend to be prohibitively expensive. The current prototype simply gives up on flattening when atomicity is requested; it's not clear whether that's workable as the default behavior for a whole cluster of primitive classes. There are plenty of stack-level optimizations still to be had, but giving up on heap optimizations for these classes might be disappointing. (Can we discover better algorithms, or will hardware provide viable solutions in the near future? TBD...)

Alternatively, can we train programmers to treat out-of-sync values with the same tolerance they give to out-of-sync object state in classes that aren't thread safe? It seems bad that a hostile or careless third party could create a LocalDate for February 31 via concurrent read-writes, with undefined subsequent instance method behavior; but is this worse than the way the same third party could *mutate* (via validating setters) a similar identity object with non-final fields to represent February 31?
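For comparison, here's a small runnable sketch of that identity-class analogue -- per-field validating setters on a mutable object can already produce a "February 31", with no data race required (the class is invented for illustration):

    // Hypothetical mutable identity class: each setter validates its own argument,
    // but not the cross-field invariant.
    class MutableMonthDay {
        private int month = 1;
        private int day = 31;

        void setMonth(int month) {
            if (month < 1 || month > 12) throw new IllegalArgumentException();
            this.month = month;
        }

        void setDay(int day) {
            if (day < 1 || day > 31) throw new IllegalArgumentException();
            this.day = day;
        }

        public static void main(String[] args) {
            MutableMonthDay d = new MutableMonthDay();
            d.setMonth(2);                                // passes validation
            System.out.println(d.month + "/" + d.day);    // prints 2/31
        }
    }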
Migration:

As noted above, there are minimal performance costs in this approach, even when using the plain class name as a type. Legacy class files will continue to use L types, though, and those have some inherent limitations. (We've prototyped scalarizing of L types in certain circumstances, but you really need the Q signal for optimal performance.)

Overall:

Optimistically, even if pointers are the best implementation (for now) of heap storage for encapsulated primitive value types, there's a lot to be gained by stack optimizations, and those could well be enough to justify the feature.

If, pessimistically, the overall performance doesn't look good, it's worth asking whether we should tackle these use cases at all. But there's a risk that developers would misuse classic primitives if we don't provide the safer alternative. Could we effectively communicate "you're doing it wrong, just use identity"? Not sure.

From brian.goetz at oracle.com Tue Oct 5 15:00:00 2021
From: brian.goetz at oracle.com (Brian Goetz)
Date: Tue, 5 Oct 2021 15:00:00 +0000
Subject: Addressing the full range of use cases
In-Reply-To: <6B7D957B-903A-4ED8-8DA0-0A9F35639F3F@oracle.com>
References: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com> <6B7D957B-903A-4ED8-8DA0-0A9F35639F3F@oracle.com>
Message-ID: <45946748-E8A3-453B-BEF2-D2452F40AB7C@oracle.com>

> JVM support for atomicity guarantees seems more difficult -- algorithms for ensuring atomicity above 64 bits tend to be prohibitively expensive. The current prototype simply gives up on flattening when atomicity is requested; not clear whether

Note that this only gives up on flattening *in the heap*; flattening on the stack (calling convention optimization) and scalarization are still in play.

> Alternatively, can we train programmers to treat out-of-sync values with the same tolerance they give to out-of-sync object state in classes that aren't thread safe?

Note that to produce tearing, you have to have a data race (i.e., a broken program), where there is a read-write or write-write race on a reference to a primitive class.

From kevinb at google.com Tue Oct 5 19:40:39 2021
From: kevinb at google.com (Kevin Bourrillion)
Date: Tue, 5 Oct 2021 12:40:39 -0700
Subject: Addressing the full range of use cases
In-Reply-To: <6B7D957B-903A-4ED8-8DA0-0A9F35639F3F@oracle.com>
References: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com> <6B7D957B-903A-4ED8-8DA0-0A9F35639F3F@oracle.com>
Message-ID:

On Mon, Oct 4, 2021 at 5:04 PM Dan Smith wrote:

> and because I want to encourage focusing on the contents of the original mail, with this reply as a supplement.

Noted, but didn't have much useful to reply. I definitely think this is the right problem to be solving...

> A "classic" and "encapsulated" pair of clusters seems potentially workable (better names TBD).

Tend to agree.

> Nullability can be handled in one of two ways:
>
> - Flush the previous mental model that null is inherently a reference concept. Null is a part of the value set of both encapsulated primitive value types and reference types.

imho there are other arguments for striking "the null reference" in favor of "the null value". A reference ought to be something you can *de*reference.
And, it isn't really reference types *themselves* that bring null into the picture; it's the way Java "grafts" the null type onto most *usages* of a reference type.

I do think many people will experience initial shock/aversion at this, owing primarily to decades of hating null. But it shouldn't be hard for them to recognize that however bad null might seem, a false value is worse, and that fact has nothing to do with references.

> For migration, encapsulated primitive classes mostly subsume "reference-default" classes, and let us drop the 'Foo.val' feature.

Indeed, it would probably be bad to introduce the classic/encapsulated distinction if it *can't* fully get rid of the val-default/ref-default distinction.

> For the "encapsulated"/"classic" choice, perhaps "encapsulated" should be the default. Classic primitives have sharper edges, especially for class authors, so perhaps they can be pitched as an "advanced" feature, with an extra modifier signaling this fact. (Everybody uses 'int', but most people don't need to concern themselves with declaring 'int'.)

fwiw, I agree (strongly).

> Atomicity:
>
> Alternatively, can we train programmers to treat out-of-sync values with the same tolerance they give to out-of-sync object state in classes that aren't thread safe? It seems bad that a hostile or careless third party could create a LocalDate for February 31 via concurrent read-writes, with undefined subsequent instance method behavior; but is this worse than the way the same third party could *mutate* (via validating setters) a similar identity object with non-final fields to represent February 31?

That seems reasonable.

> If, pessimistically, the overall performance doesn't look good, it's worth asking whether we should tackle these use cases at all. But there's a risk that developers would misuse classic primitives if we don't provide the safer alternative. Could we effectively communicate "you're doing it wrong, just use identity"? Not sure.

It may be over-idealistic of me, but I think the less people have to make new identity objects when they didn't care about identity *per se*, the better.

-- 
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com

From daniel.smith at oracle.com Tue Oct 5 22:52:20 2021
From: daniel.smith at oracle.com (Dan Smith)
Date: Tue, 5 Oct 2021 22:52:20 +0000
Subject: EG meeting, 2021-10-06
Message-ID: <89B393B6-889A-4ECC-ADC1-8A13E8DF0BB4@oracle.com>

EG Zoom meeting tomorrow, Wednesday October 6, at 4pm UTC (9am PDT, 12pm EDT).

We can discuss "Addressing the full range of use cases", which concerns how we support nullability, atomicity, and migration.

From forax at univ-mlv.fr Wed Oct 6 09:56:27 2021
From: forax at univ-mlv.fr (Remi Forax)
Date: Wed, 6 Oct 2021 11:56:27 +0200 (CEST)
Subject: Addressing the full range of use cases
In-Reply-To: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com>
References: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com>
Message-ID: <708708839.1541691.1633514187916.JavaMail.zimbra@u-pem.fr>

----- Original Message -----
> From: "daniel smith"
> To: "valhalla-spec-experts"
> Sent: Tuesday, October 5, 2021 01:34:37
> Subject: Addressing the full range of use cases

> When we talk about use cases for Valhalla, we've often considered a very broad set of class abstractions that represent immutable, identity-free data. JEP 401 mentions varieties of integers and floats, points, dates and times, tuples, records, subarrays, cursors, etc.
> However, as shorthand this broad set often gets reduced to an example like Point or Int128, and these latter examples are not necessarily representative of all candidate value types.

yes !

> Specifically, our favorite example classes have a property that doesn't generalize: they'll happily accept any combination of field values as a valid instance. (In fact, they're even happy to accept any combination of *bits* of the appropriate length.) Many candidate primitive classes don't have this property -- the constructors do important validation work, and only certain combinations of fields are allowed to represent valid instances.

I now believe the mantra "code like a class acts as an int" is harmful. A class provides encapsulation, an int has no encapsulation, there is a mismatch.

> Related areas of concern that we've had on the radar for a while:
>
> - The "all zeros is your default value" strategy forces an all-zero instance into the class's value set, even if that doesn't make sense for the class. Many candidate classes have no reasonable default at all, leading naturally to a wish for "null is your default value" (or other, more exotic, strategies involving revisiting the idea that every type has a default value). We've provided 'P.ref' for those use sites that *need* null, but haven't provided a complete story for value types that want it to be *their* default value, too.
>
> - Non-atomic heap updates can be used to create new instances that arbitrarily combine previously-validated instances' fields. There is no guarantee that the new combination of fields is semantically valid. Again, while there's precedent for this with 'double' and 'long' (JLS 17.7), those are special cases that don't generalize -- any combination of double bit fields is *still a valid double*. (This is usually described as "tearing", although JLS 17.6 has something else in mind when it uses that word...) The language provides 'volatile' as a use-site opt-in to atomicity, and we've toyed with a declaration-site opt-in as well. But object integrity being "off" by default may not be ideal.
>
> - Existing class types like LocalDate are both nullable and atomic. These are useful properties to preserve during migration; nullability, in particular, is essential for source compatibility. We've provided reference-default declarations as a mechanism to make reference types (which have these properties) the default, with 'P.val' as an opt-in to value types. But in doing so we take away the many benefits of value types by default, and force new code to work with the "bad name".

The existing class LocalDate is not atomic per se; atomic in Java implies volatile, and currently, if a LocalDate field is updated in one thread, another thread may never see that update. LocalDate is currently not tearable; a QLocalDate; is tearable in case of racy code.

And yes, nullability is a huge compatibility issue.

> While we can provide enough knobs to accommodate all of these special cases, we're left with a complex user model which asks class authors to make n different choices whose consequences they may not immediately grasp, and class users to keep 2^n different categories straight in their heads.

yes !

> As an alternative, we've been exploring whether a simpler model is workable. It is becoming clear that there are (at least) two clusters of uses for value types.
> The "classic" value types are like numerics -- they'll happily accept any combination of field values as a valid instance, and the zero value is a sensible (often the best possible) default value. They make relatively little use of encapsulation. These are the ones that best "work like an int." The "encapsulated" value types are those that are more like typical aggregates ("codes like a class") -- their constructors do important validation work, and only certain combinations of fields are allowed to represent valid instances. These are more likely to not have valid zero values (and hence want to be nullable).

I agree.

> Some questions to consider for this approach:
>
> - How do we group features into clusters so that they meet the sweet spot of user expectations and use cases while minimizing complexity? Is two clusters the right number? Is two already too many? (And what do we call them? What keywords best convey the intended intuitions?)

Two is too many, see below.

> - If there are knobs within the clusters, what are the right defaults? E.g., should atomicity be opt-in or opt-out?

I prefer opt-in, see below.

> - What are the performance costs (or, in the other direction, performance gains) associated with each feature? For certain feature combinations, have we canceled out the performance gains over identity classes (and at that point, is that combination even worth supporting?)

Good question ... Let me reformulate. But before that, we can note that we have 3 ways of specifying primitive class features:
- we can use different types, for example, Foo.val vs Foo.ref
- we can have container attributes (opt-in or opt-out), for example, declaring a field volatile makes it non-tearable
- we have runtime knobs, like an array can allow null or not.

First the problem, as you said: if we have code like the one just below, the field primFoo is flattened, so primFoo.someValue is 0, bypassing the constructor.

    primitive class PrimFoo {
        PrimFoo(int someValue) {
            if (someValue == 0) { throw new IllegalArgumentException(); }
            this.someValue = someValue;
        }

        int someValue;
    }

    class Foo {
        PrimFoo primFoo;  // default instance: someValue == 0, despite the constructor check
    }

I believe we should try to make a primitive class nullable and flattenable by default, so we have one tent pole and knobs for 2 special cases: non-nullable primitive classes (for use cases like Complex) and non-flattenable classes when stored in a field/array cell (the "atomicity" use case).

So a primitive class (the default):
- represents the null value (initialized) with a supplementary field when stored on heap, and a supplementary register if necessary
- is tearable in case of racy code (don't write racy code)
- is represented by a Q-type in the bytecode for full flattening, or a L-type using a pointer to be backward compatible
- is represented by different java.lang.Class objects (one for the Q-type, the primary class, and one for the L-type, the secondary class)

I think that a Q-type can be backward compatible with a L-type in the method descriptors: a Q-type should be represented as a L-type + an out-of-band bit saying that this is a Q-type, so it should be loaded eagerly (like we use out-of-band attributes for the generic specialization). Obviously, the way to create a Q-type (default + with + with) is still different from a L-type (new + dup + invokespecial), so creating a Q-type instead of a L-type is not backward compatible.
So the VM has to generate several method entry points for a method that is annotated with the attribute saying there is a Q-type in the descriptor (or that overrides a method with such an attribute).

The special cases:

1) Non-nullable when flattened. I believe that all primitive types should be nullable, but that a user should have a knob to choose that a primitive class is non-nullable when flattened. So the VM will throw an NPE if a field or an array is annotated with something saying that null is not a supported value. For arrays, we already have that bit at runtime; I believe we should have a modifier for fields saying that null is a possible value when flattened.

2) Non-tearable. We already support the modifier 'volatile' to say that a primitive class should be manipulated by pointer. Should we have a declaration-site keyword? I don't know. It's perhaps a corner case where not using a primitive class is better.

To summarize, I believe that if a primitive class is always nullable (apart from some opt-in special cases), it can be backward compatible (enough) to transform all value-based classes to primitive classes and just let the new version of javac replace all the L-types by Q-types in the method descriptors (using an attribute), without asking the user to think too much about it (apart from when the code is racy).

regards,
Rémi

From forax at univ-mlv.fr Wed Oct 6 09:56:58 2021
From: forax at univ-mlv.fr (Remi Forax)
Date: Wed, 6 Oct 2021 11:56:58 +0200 (CEST)
Subject: EG meeting, 2021-10-06
In-Reply-To: <89B393B6-889A-4ECC-ADC1-8A13E8DF0BB4@oracle.com>
References: <89B393B6-889A-4ECC-ADC1-8A13E8DF0BB4@oracle.com>
Message-ID: <429264866.1542560.1633514218687.JavaMail.zimbra@u-pem.fr>

Sadly, I will not be able to attend this meeting :(

regards,
Rémi

----- Original Message -----
> From: "daniel smith"
> To: "valhalla-spec-experts"
> Sent: Wednesday, October 6, 2021 00:52:20
> Subject: EG meeting, 2021-10-06

> EG Zoom meeting tomorrow, Wednesday October 6, at 4pm UTC (9am PDT, 12pm EDT).
>
> We can discuss "Addressing the full range of use cases", which concerns how we support nullability, atomicity, and migration.

From maurizio.cimadamore at oracle.com Wed Oct 6 10:07:09 2021
From: maurizio.cimadamore at oracle.com (Maurizio Cimadamore)
Date: Wed, 6 Oct 2021 11:07:09 +0100
Subject: Addressing the full range of use cases
In-Reply-To: <708708839.1541691.1633514187916.JavaMail.zimbra@u-pem.fr>
References: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com> <708708839.1541691.1633514187916.JavaMail.zimbra@u-pem.fr>
Message-ID: <97bc6167-ee9e-ebfb-422b-81251646bb47@oracle.com>

On 06/10/2021 10:56, Remi Forax wrote:
>> - Existing class types like LocalDate are both nullable and atomic. These are useful properties to preserve during migration; nullability, in particular, is essential for source compatibility. We've provided reference-default declarations as a mechanism to make reference types (which have these properties) the default, with 'P.val' as an opt-in to value types. But in doing so we take away the many benefits of value types by default, and force new code to work with the "bad name".
> The existing class LocalDate is not atomic per se; atomic in Java implies volatile, and currently, if a LocalDate field is updated in one thread, another thread may never see that update.
> LocalDate is currently not tearable; a QLocalDate; is tearable in case of racy code.

The fact that QLocalDate is tearable is a consequence of the fact that elements of a QLocalDate[], for example, cannot be read/written atomically - e.g.
in a single shot, unlike for references, which are pointers and can be loaded/stored atomically (as per the Java Memory Model).

It's true the word "atomic" is sometimes used to refer to operations such as CAS which provide strong inter-thread guarantees - this is not what Dan had in mind here.

Maurizio

From forax at univ-mlv.fr Wed Oct 6 10:19:59 2021
From: forax at univ-mlv.fr (Remi Forax)
Date: Wed, 06 Oct 2021 10:19:59 +0000
Subject: Addressing the full range of use cases
In-Reply-To: <97bc6167-ee9e-ebfb-422b-81251646bb47@oracle.com>
References: <91151DC6-D221-4F16-ABA5-67434D567F7A@oracle.com> <708708839.1541691.1633514187916.JavaMail.zimbra@u-pem.fr> <97bc6167-ee9e-ebfb-422b-81251646bb47@oracle.com>
Message-ID: <5CCB1D0D-DAE2-437C-8313-52A2A91923EA@univ-mlv.fr>

On October 6, 2021 10:07:09 AM UTC, Maurizio Cimadamore wrote:
>
> On 06/10/2021 10:56, Remi Forax wrote:
>>> - Existing class types like LocalDate are both nullable and atomic. These are useful properties to preserve during migration; nullability, in particular, is essential for source compatibility. We've provided reference-default declarations as a mechanism to make reference types (which have these properties) the default, with 'P.val' as an opt-in to value types. But in doing so we take away the many benefits of value types by default, and force new code to work with the "bad name".
>> The existing class LocalDate is not atomic per se; atomic in Java implies volatile, and currently, if a LocalDate field is updated in one thread, another thread may never see that update.
>> LocalDate is currently not tearable; a QLocalDate; is tearable in case of racy code.
>
> The fact that QLocalDate is tearable is a consequence of the fact that elements of a QLocalDate[], for example, cannot be read/written atomically - e.g. in a single shot, unlike for references, which are pointers and can be loaded/stored atomically (as per the Java Memory Model).
>
> It's true the word "atomic" is sometimes used to refer to operations such as CAS which provide strong inter-thread guarantees - this is not what Dan had in mind here.

Right, I used to think that being tearable was a huge issue. Now, I don't. At least, not to the point of making primitive classes non-tearable by default.

As I said, the code needs to have race issues to observe that behavior, and I think that a security token should not be a primitive class, i.e. I can live with the fact that a user can forge any permutation of my primitive class if he writes racy code on purpose.

> Maurizio

Rémi
-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.

From daniel.smith at oracle.com Wed Oct 20 14:03:12 2021
From: daniel.smith at oracle.com (Dan Smith)
Date: Wed, 20 Oct 2021 14:03:12 +0000
Subject: EG meeting *canceled*, 2021-10-20
Message-ID: <2A84D5B9-6475-4F00-B1DA-541819F4AFDE@oracle.com>

No new topics today, so we'll cancel the meeting. Next scheduled slot is November 3.

From forax at univ-mlv.fr Wed Oct 20 15:10:25 2021
From: forax at univ-mlv.fr (Remi Forax)
Date: Wed, 20 Oct 2021 17:10:25 +0200 (CEST)
Subject: EG meeting *canceled*, 2021-10-20
In-Reply-To: <2A84D5B9-6475-4F00-B1DA-541819F4AFDE@oracle.com>
References: <2A84D5B9-6475-4F00-B1DA-541819F4AFDE@oracle.com>
Message-ID: <1993551958.1519278.1634742625700.JavaMail.zimbra@u-pem.fr>

I've sent a mail about considering all primitive types as always nullable on stack (as parameters or local variables).
Rémi

----- Original Message -----
> From: "daniel smith"
> To: "valhalla-spec-experts"
> Sent: Wednesday, October 20, 2021 16:03:12
> Subject: EG meeting *canceled*, 2021-10-20

> No new topics today, so we'll cancel the meeting. Next scheduled slot is November 3.