Nullity (was: User model stacking: current status)

Dan Smith daniel.smith at oracle.com
Thu May 12 17:07:08 UTC 2022


> On May 11, 2022, at 7:45 PM, Kevin Bourrillion <kevinb at google.com> wrote:
> 
> * `String!` indicates "an actual string" (I don't like to say "a non-null string" because *null is not a string!*)

The thread talks around this later, but... what do I get initially if I declare a field/array component of type 'String!'?

I think in most approaches this would end up being a warning, with the field/array erased to LString and storing a null. (Alternatively, we build 'String!' into the JVM, and I think that has to come with "uninitialized" detection on reads. We talked through that strategy quite a bit in the context of B2 before settling on "just use 'null'".)

So this is potentially a fundamental difference between String! and Point!: 'new String![5]' and 'new Point![5]' give you very different arrays.
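
To make the contrast concrete with today's closest analogs (a sketch only; '!' isn't real syntax, and the Point! half is my assumption about how a flattened B3 array would behave):

    class ArrayDefaults {
        public static void main(String[] args) {
            // Reference-typed array: every slot starts out null. Under the
            // "erase to LString" approach, 'new String![5]' would give you
            // exactly this, just with a warning at the declaration site.
            String[] strings = new String[5];
            System.out.println(strings[0]);   // null

            // Primitive array: every slot starts out zero; there is no null
            // to store. This is the analog of what I'd expect 'new Point![5]'
            // to give: a flat array of all-fields-zero Point values.
            int[] ints = new int[5];
            System.out.println(ints[0]);      // 0
        }
    }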

> * Exclamation fatigue would be very real, so assume there is some way to make `!` the default for some scope

+1

Yes, I think it's a dead end to expect users to sprinkle '!' everywhere they don't want nulls. Not wanting null is the informal default in most programming practice, so we need some way to flip the default for a whole scope.

Lesson for B3: if 'B3!' is primarily meant to be interpreted as a null-free type, people will naturally want to use that null-free type everywhere, and will want it to be the default. (Reference-default makes more sense where you generally want the nullable type, and only occasionally opt in to the value type, probably for reasons other than whether 'null' is semantically meaningful.)
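
Purely for illustration, here's one shape a scope-level flip could take. The annotation below is hypothetical, declared just for the sketch (JSpecify's @NullMarked plays roughly this role on the annotation side):

    // Hypothetical marker, not a real platform annotation: inside a scope
    // carrying it, an unadorned 'String' would read as 'String!'.
    @interface NullRestrictedByDefault {}

    @NullRestrictedByDefault
    class Inventory {
        String name;            // would mean String!  (null-free)
        // 'String? note;' would be the explicit opt-back-in to nullability
        // (again, not real syntax today).
    }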

Also, a danger for B3 is that casually flipping the default doesn't just affect compiler behavior: it changes the initial value, and possibly the atomicity, of a field/array. So it's a little scarier for a random switch somewhere to change all your 'Point' usages from ref-default to val-default.
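
The closest precedent we have today is the reference vs. non-volatile long/double split; a sketch of the analogy in current Java (the Point comments are my reading of how val-default would land):

    class Holder {
        // Reference field: initial value is null, and the JMM guarantees
        // that reads and writes of references are atomic, so you never
        // observe a torn value. That's the ref-default 'Point' story.
        Integer boxed;

        // Non-volatile long field: initial value is 0, and JLS 17.7 permits
        // implementations to split the write into two 32-bit halves. A
        // val-default, flattened 'Point' field would sit in this camp:
        // zero initial value, and possibly torn reads unless we pay for
        // atomicity some other way.
        long raw;
    }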

