Revisiting default values

Kevin Bourrillion kevinb at google.com
Tue Jun 29 21:36:47 UTC 2021


Thanks for giving this your attention!

On Tue, Jun 29, 2021 at 12:56 PM Dan Smith <daniel.smith at oracle.com> wrote:

> E.g., I can imagine a world in which a no-good-default primitive class is
> no better than an identity class in most use cases, and at that point,
> we're best off simply not supporting the no-good-default feature at all.


Ah. I don't have that imagination at the moment. Maybe my picture of this
whole feature went straight from overly choleric to overly sanguine!



> (With the implication that many of the primitive-candidate classes we are
> imagining should continue to be identity classes.)


I'm not sure that follows, because to the library owner, this should be
about letting their users have the *choice* between ref/val, and also about
not senselessly exposing identity when identity is senseless. I guess the
main reason I can see for resisting the change for some data type (one that's
appropriate for primitive-ness) is if "most users will want/have to use the
.ref projection, *and* they'll perform way worse than if ref were their only
choice." Is that a concern? (And of course, in this thread I'm talking
about *removing* one of the reasons for wide use of .ref, nullability.)
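
To make that choice concrete, here's a sketch in the provisional syntax from
the current Valhalla drafts (Complex is just the running hypothetical; none
of this compiles on any shipped JDK):

    primitive class Complex {
        private final double re, im;
        public Complex(double re, double im) { this.re = re; this.im = im; }
        public double re() { return re; }
        public double im() { return im; }
    }

    Complex     c     = new Complex(1.0, 2.0); // val: flattened, non-nullable
    Complex.ref boxed = c;                     // ref projection: a nullable reference
    boxed = null;                              // legal only on the .ref side

The library owner declares the class once; it's the *user* who picks ref or
val at each use site.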



> I think we're pretty locked in to:
> - Some primitive class types like Complex must be non-nullable (for
> compactness)
> - We won't (at least for now) support non-nullable types in full generality
>

Good good. I guess I'm submitting #3 for consideration, "We will
deliberately not worry about problems caused by nullability of primitive
types if those problems are just the same ones we already have with
reference types."
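
For example, an uninitialized array element already bites users of reference
types today; the primitive-type version is the same bug wearing a different
default (hypothetical Complex again):

    String[] names = new String[10];
    int n = names[0].length();   // NPE today: the default element is null

    Complex[] cs = new Complex[10];
    double re = cs[0].re();      // no exception: the default element is 0.0+0.0i

Either way the programmer forgot to initialize something; that's a problem we
already know how to live with.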


> Speaking of orthogonality, there *is* an open question about how we
> interpret <invalid>, and this is orthogonal to the question of whether
> <invalid> should be the "default default". We've talked about:
> - It's interchangeable with null
> - It's null-like (i.e., detected on member access), but distinct
> - It's a separate concept, and it is an error to ever read it from
> fields/arrays
>
> All still on the table.
>

Oh. Yeah, if you look at all the work we've all poured into managing null
and its attendant risks, and at the ongoing work (charitably assuming
JSpecify will be successful! :-)), then it's kiiiind of a disaster if
there's suddenly a second kind of nullness. #nonewnulls
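
To spell out the worry under the second and third interpretations (Name here
is a hypothetical no-good-default primitive class wrapping a reference, again
in provisional syntax, and requireNonNull is java.util.Objects'):

    primitive class Name {
        private final String value;  // the all-zeros instance has value == null
        // ...
    }

    Name[] names = new Name[1];
    Name n = names[0];            // under interpretation 3 this read is already
                                  // an error; under interpretation 2, n is a
                                  // null-like <invalid>...
    Objects.requireNonNull(n);    // ...that every existing null check, and every
                                  // nullness-analysis tool, waves right through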



> Complex and friends are special cases, but they're also the *most
> important* cases. I'd really prefer not to have to pick, but if forced to,
> it may be more important for primitive classes to optimally support the 10%
> "has a good default" cases (roughly, those that are number-like) than the
> 90% "no good default" cases (roughly, those that wrap references).
>

To clarify, I don't think I meant "special case" as "deprioritize", only as
"that's the case I think I'd have users opt into intentionally".



> >       • If we don't do something like Brian describes here, then I
> suppose second-best is that we make a lot of these things ref-default
> (beginning with Instant and not stopping there!) and warn about the dangers
> of `.val`
>
> I'm not a big fan of this approach. It gives you the illusion of safety
> (well-written code only sees valid values) but blows up in unpredictable
> ways when a bug or a hostile actor leaks <invalid> into your program. If we
> don't offer stronger guarantees, and your code isn't willing to check for
> <invalid>, you really shouldn't be programming with a primitive class.
>

Indeed, calling it my second-best was not meant to imply I don't also hate
it. :-)
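
For the record, here's the blow-up from that second-best world that I hate
too (a hypothetically migrated, ref-default Instant, provisional `.val`
syntax):

    // Instant's constructor can validate whatever it likes...
    Instant.val[] cache = new Instant.val[16];
    Instant t = cache[0];     // ...but this element was never constructed: it's
                              // the all-zeros value, now boxed into ref-typed code
    t.plusSeconds(5);         // blows up (or silently computes) far from the bug

No amount of warning about `.val` in the javadoc keeps that out of a
well-intentioned cache or pool.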

-- 
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com

