Nullness markers to enable flattening
Remi Forax
forax at univ-mlv.fr
Wed Feb 8 14:40:46 UTC 2023
> From: "Kevin Bourrillion" <kevinb at google.com>
> To: "daniel smith" <daniel.smith at oracle.com>
> Cc: "valhalla-spec-experts" <valhalla-spec-experts at openjdk.java.net>
> Sent: Tuesday, February 7, 2023 7:50:08 PM
> Subject: Re: Nullness markers to enable flattening
> As the person with a foot in both Valhalla and JSpecify camps, and who is
> obsessed with keeping Java as conceptually simple as it can be, as long as this
> all works out I'll consider it to be a major win.
> On Mon, Feb 6, 2023 at 5:26 PM Dan Smith <daniel.smith at oracle.com> wrote:
>> - Because nullness is distinct from types, it shouldn't impact type checking
>> rules, subtyping, overriding, conversions, etc. Nullness has its own analysis,
>> subject to its own errors/warnings. The precise error/warning conditions
>> haven't been fleshed out, but our bias is towards minimal intrusion—we don't
>> want to make it hard to adopt these features in targeted ways.
> (The points below are general, but when the type in question is flattenable it
> likely makes sense to enforce more strongly.)
> I think this is a very salient point: there is a trade-off between enforcing the
> new nullness information as well as we might like vs. the adoptability of the
> feature. To arrive eventually at a null-safe world, the most important early
> step is for as many APIs as possible to provide the information. If they worry
> about breaking clients already, they won't be able to do it; if that means
> javac being lenient on enforcement, that's okay! Third-party nullness analysis
> tools have their role to play for a very long time anyway, which is a fine
> thing. Really, the "worst" thing about these tools today is simply "ugh, the
> type annotations are too bulky", and there's value even in solving *just that*.
> In fact, the spec being stringent about which cases definitely produce a warning
> might even be counterproductive. Third-party analyzers will analyze more deeply
> and take advantage of wider sources of information, and the typical effect is
> to recognize that a certain expression *can't* be null when it otherwise seemed
> like it could. That means the deeper analysis would end up wanting to
> *subtract* warnings from javac, which of course it can't do (currently has no
> way to do?). (Again, this applies less to the value types that are the actual
> motivating use case here.)
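To make that concrete, here is a small sketch of the kind of finding involved, written with today's JSpecify annotations (assuming org.jspecify is on the classpath), since the '!'/'?' syntax is still only a proposal:

  import org.jspecify.annotations.NullMarked;
  import org.jspecify.annotations.Nullable;

  @NullMarked
  class Demo {
    static @Nullable String lookup(String key) {
      return key.isEmpty() ? null : key;
    }

    static int length(String s) {   // non-null parameter
      return s.length();
    }

    static void caller(String key) {
      // A nullness analysis reports a finding here (a nullable value flows
      // into a non-null parameter), but it is a warning, not a type error:
      // the program still compiles and runs.
      int a = length(lookup(key));

      // A deeper, flow-sensitive analysis can prove the call below is safe,
      // so it would want to *subtract* a naive finding rather than add one.
      String v = lookup(key);
      if (v != null) {
        int b = length(v);
      }
    }
  }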
>> - That said, *type expressions* (the syntax in programs that expresses a type)
>> are closely intertwined with *nullness markers*. 'Foo!' refers to a non-null
>> Foo, and 'Foo?' refers to a Foo or null. And nullness is an optional property
>> of type arguments, type variable bounds, and array components. Nullness markers
>> are the way programmers express their intent to the compiler's nullness
>> analysis.
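For comparison, here is roughly how those positions can already be expressed today with JSpecify type-use annotations (a sketch, assuming org.jspecify on the classpath; the mapping to '!'/'?' is my reading of the proposal, not settled syntax):

  import java.util.ArrayList;
  import java.util.List;
  import org.jspecify.annotations.NullMarked;
  import org.jspecify.annotations.Nullable;

  @NullMarked                            // unadorned types are non-null in this scope
  class Markers {
    String name = "x";                   // plays the role of 'String!'
    @Nullable String nickname;           // plays the role of 'String?'
    List<@Nullable String> cells = new ArrayList<>();  // nullness on a type argument
    @Nullable String[] row = new String[8];            // nullness on an array component
  }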
> Altogether, I think you're saying that this nullness attribute is attached to
> *every* static type recognized by the compiler: types of expressions, of
> expression contexts, and the (usually explicit) type usages
> (https://github.com/jspecify/jspecify/wiki/type-usages) you
> mention. Basically I would expect that most programmers most of the time will
> be well-served (enough) to think of this attribute as *if* it's an intrinsic
> part of the type. It's sort of the exact illusion this is all trying to create.
> (One mild exception is that it is possibly counterproductive to enforce override
> strict parameter agreement in the nullness attribute; as far as we can tell,
> allowing a nullable parameter to override a non-null one is nearly harmless and
> we think makes for a better migration story.)
I think we can follow the "generics erasure" playbook here, i.e. if a method has a nullness attribute, then an overriding method either has a nullness attribute with the same values or has none at all.
This allows a library to be annotated with nullness information without having to change client code that may contain a class overriding a method of the library.
About allowing a nullable parameter to override a non-null one: this goes against the idea that Java method parameters are invariant, not contravariant (you cannot have a method that takes an Object as a parameter override a method that takes a String, even though that would be sound type-wise). One advantage I see in keeping method parameters invariant is that, as a user of a method, I know that not only the method itself but also all its overrides will never accept null, even if the nullness checking is done at the declaration site (erasure of the nullness information aside).
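For the record, the scenario we are discussing looks like this with JSpecify annotations (a sketch, assuming org.jspecify on the classpath):

  import org.jspecify.annotations.NullMarked;
  import org.jspecify.annotations.Nullable;

  @NullMarked
  interface Sink {
    void accept(String s);               // declared null-free
  }

  // The override under discussion: the parameter is weakened to nullable.
  // It is sound for callers going through Sink, but someone reading Sink
  // alone can no longer assume that every implementation rejects null.
  class LenientSink implements Sink {
    @Override
    public void accept(@Nullable String s) {
      if (s == null) {
        return;
      }
      System.out.println(s);
    }
  }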
>> - There are features that change the default interpretation of the nullness of
>> class names. This is still pretty open-ended. Perhaps certain classes can be
>> declared (explicitly or implicitly) null-free by default (e.g., 'Point' is
>> implicitly 'Point!'). Perhaps a compilation-unit- or module- level directive
>> says that all unadorned types should be interpreted as '!'. Programs can be
>> usefully written without these convenience features, but for programmers who
>> want to widely adopt nullness, it will be important to get away from
>> "unspecified" as the default everywhere.
> If a release has the rest of it but not this part, users will find it extremely
> annoying, but it might be survivable if we can see the other side of the chasm
> from there. It's definitely essential long-term.
> Of course, TypeScript, Scala, C#, and Dart all do it this way, through either a
> configuration file or compiler flag etc. Java's analogue is module-info, but
> yes, I do think adoptability would benefit *very* much from a class- or
> compilation-unit level modifier as well. And (sigh) I guess package-info, even
> though when that gets used outside a module it's kind of a disaster.
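For what it's worth, JSpecify already allows that directive at several granularities (a class, a package, and I believe a module declaration as well), which I suspect is roughly the shape the language feature would take. A sketch at the package level (the annotation is JSpecify's and com.example.lib is a placeholder):

  // package-info.java: unadorned types in this package default to non-null
  @NullMarked
  package com.example.lib;

  import org.jspecify.annotations.NullMarked;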
>> - Nullness is generally enforced at run time, via cooperation between javac and
>> JVMs. Methods with null-free parameters can be expected to throw if a null is
>> passed in. Null-free storage should reject writes of nulls. (Details to be
>> worked out, but as a starting point, imagine 'Q'-typed storage for all types.
>> Writes reject nulls. Reads before any writes produce a default value, or if
>> none exists, throw.)
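Just to illustrate the observable behavior being described (this is not the actual javac/JVM mechanism, which as Dan says would go through Q-typed storage; it is only what the effect would resemble if spelled out by hand today, with Point as a stand-in value class):

  import java.util.Objects;

  final class Point {
    final double x, y;
    Point(double x, double y) { this.x = x; this.y = y; }
  }

  class Segment {
    private Point start = new Point(0, 0);  // null-free storage: never holds null

    // With the proposed markers this might be written 'void start(Point! p)';
    // the runtime effect is that a null argument is rejected with a throw.
    void start(Point p) {
      this.start = Objects.requireNonNull(p, "p");
    }
  }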
> This raises some questions for the scenarios where we are compelled to write
> `@SuppressWarnings("nullness")`; that information obviously isn't available at
> runtime. Perhaps there would need to be a more targeted suppression mechanism?
In that case, the attribute containing the nullness information can simply be erased (not generated) for the suppressed code.
>> - Type variable types have nullness, too. Besides 'T!' and 'T?', there's also a
>> "parametric" 'T*' that represents "whatever nullness is provided by the type
>> argument". (Again, some room for choosing the default interpretation of bare
>> 'T'; unspecified nullness is useful for type variables as well.) Nullness of
>> type arguments is inferred along with the types; when both a type argument and
>> its bound have nullness, bounds checks are based on '!' <: '*' <: '?'. Generics
>> are erased for now, but in the future '!' type arguments will be reified, and
>> specialized classes will provide the expected runtime behaviors.
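A sketch of that with today's JSpecify annotations (assuming org.jspecify on the classpath), where an unannotated use of a type variable inside a @NullMarked scope plays the role of the proposed parametric 'T*':

  import org.jspecify.annotations.NullMarked;
  import org.jspecify.annotations.Nullable;

  @NullMarked
  class Box<T extends @Nullable Object> {  // the bound admits nullable type arguments
    private @Nullable T value;             // always nullable, like the proposed 'T?'

    void set(T value) {                    // parametric, like the proposed 'T*':
      this.value = value;                  // nullness comes from the type argument
    }

    T orElse(T fallback) {                 // parametric result
      return value != null ? value : fallback;
    }
  }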
> I believe you will need the distinction between *unspecified* T and *parametric*
> T. This is the second reason why JSpecify needs `@NullMarked`
> (https://jspecify.dev/docs/api/org/jspecify/annotations/NullMarked.html);
> it isn't just "`!` by default"; it's also "type-variable
> usages become parametric instead of unspecified". Again, what I believe matters
> is that users have some way to express the distinction, not how much or little
> javac and the runtime do out of the gates to enforce it.
> In summary, I'm extremely glad we're thinking along these lines, and I hope to
> be helpful.
Rémi