Addressing the full range of use cases

Kevin Bourrillion kevinb at google.com
Tue Oct 5 19:40:39 UTC 2021


On Mon, Oct 4, 2021 at 5:04 PM Dan Smith <daniel.smith at oracle.com> wrote:

> and because I want to encourage focusing on the contents of the original
> mail, with this reply as a supplement.
>

Noted, but I didn't have much useful to add in reply. I definitely think
this is the right problem to be solving...


> A "classic" and "encapsulated" pair of clusters seems potentially workable
> (better names TBD).


Tend to agree.



> Nullability can be handled in one of two ways:
>
> - Flush the previous mental model that null is inherently a reference
> concept. Null is a part of the value set of both encapsulated primitive
> value types and reference types.
>

imho there are other arguments for striking "the null reference" in favor
of "the null value". A reference ought to be something you can *de*reference.
And, it isn't really reference types *themselves* that bring null into the
picture; it's the way Java "grafts" the null type onto most *usages* of a
reference type.

I do think many people will experience initial shock/aversion at this,
owing primarily to decades of hating null. But it shouldn't be hard for
them to recognize that however bad null might seem, a false value is worse,
and that fact has nothing to do with references.
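The "a reference ought to be something you can *de*reference" point is easy to show in ordinary, pre-Valhalla Java (a trivial sketch; the class name is mine, nothing here is Valhalla-specific):

```java
public class NullDemo {
    public static void main(String[] args) {
        // Null enters through the *use site*: the String class itself
        // never declares null as part of its value set, yet most
        // reference-typed uses admit it anyway.
        String s = null;
        try {
            s.length(); // null cannot actually be dereferenced
        } catch (NullPointerException e) {
            System.out.println("null is not a dereferenceable reference");
        }
    }
}
```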


> For migration, encapsulated primitive classes mostly subsume
> "reference-default" classes, and let us drop the 'Foo.val' feature.


Indeed, it would probably be bad to introduce the classic/encapsulated
distinction if it *can't* fully get rid of the val-default/ref-default
distinction.



> For the "encapsulated"/"classic" choice, perhaps "encapsulated" should be
> the default. Classic primitives have sharper edges, especially for class
> authors, so perhaps can be pitched as an "advanced" feature, with an extra
> modifier signaling this fact. (Everybody uses 'int', but most people don't
> need to concern themselves with declaring 'int'.)
>

fwiw, I agree (strongly).


> Atomicity:
>
> Alternatively, can we train programmers to treat out-of-sync values with
> the same tolerance they give to out-of-sync object state in classes that
> aren't thread safe? It seems bad that a hostile or careless third party
> could create a LocalDate for February 31 via concurrent read-writes, with
> undefined subsequent instance method behavior; but is this more bad than
> how the same third party could *mutate* (via validating setters) a similar
> identity object with non-final fields to represent February 31?
>

That seems reasonable.
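Dan's analogy can be made concrete without any concurrency at all. A sketch (the `MutableDate` class is hypothetical, not from the thread): an identity object whose setters each validate their own argument can still be walked into a state like February 31, because neither setter re-checks the *combination* of fields.

```java
// Hypothetical identity class with per-field validating setters.
// Each individual write is "valid", yet the object can end up
// representing an impossible date -- the same kind of out-of-sync
// state a torn classic primitive could exhibit.
public class MutableDate {
    private int month; // 1..12
    private int day;   // 1..31

    public MutableDate(int month, int day) {
        setMonth(month);
        setDay(day);
    }

    public void setMonth(int month) {
        if (month < 1 || month > 12) throw new IllegalArgumentException();
        this.month = month; // does not re-check day against the new month
    }

    public void setDay(int day) {
        if (day < 1 || day > 31) throw new IllegalArgumentException();
        this.day = day;
    }

    public int getMonth() { return month; }
    public int getDay()   { return day; }

    public static void main(String[] args) {
        MutableDate d = new MutableDate(1, 31); // January 31: fine
        d.setMonth(2);                          // every check passed; date is now 2/31
        System.out.println(d.getMonth() + "/" + d.getDay()); // prints 2/31
    }
}
```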



> If, pessimistically, the overall performance doesn't look good, it's worth
> asking whether we should tackle these use cases at all. But there's a risk
> that developers would misuse classic primitives if we don't provide the
> safer alternative. Could we effectively communicate "you're doing it wrong,
> just use identity"? Not sure.
>

It may be over-idealistic of me, but I think the less people have to make
new identity objects when they didn't care about identity *per se*, the
better.


-- 
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com
