Value type hash code
John Rose
john.r.rose at oracle.com
Wed Apr 11 20:59:00 UTC 2018
On Apr 11, 2018, at 9:40 AM, Dan Smith <daniel.smith at oracle.com> wrote:
>
> This is a reasonable language design, but I don't think the JVM should hard-wire that behavior. The compiler can generate the equals/hashCode methods it wants, and my thinking is that when the compiler fails to do so, the JVM's default (that is, the behavior of Object.equals and Object.hashCode if not overridden) should be conservative.
(As usual we are confusing ourselves by having simultaneous but distinct
discussions about what the JVM should do and what the language should do.)
What Dan is saying here concerns what the JVM should do if presented with a naked
value class, with no contents beyond what the verifier requires. Not even Java
1.0 had completely naked classes, since javac would often spin up synthetic
methods like <init> and <clinit>.
What others are saying is that the language should make the default be
something such as deepEquals or maybe substitutable or even "user
responsibility" (a la abstract inherited methods).
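To make the "deepEquals by default" option concrete, here is a sketch (with a hypothetical class, Point, invented for illustration) of the state-based equals/hashCode a compiler might generate for a two-field value-like class, rather than inheriting the identity-based defaults from Object:

```java
// Hypothetical value-like class. Under a "deepEquals" language default,
// the compiler would spin up state-based equals/hashCode like these.
final class Point {
    final int x;
    final int y;

    Point(int x, int y) { this.x = x; this.y = y; }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;   // component-wise, not identity-based
    }

    @Override
    public int hashCode() {
        return java.util.Objects.hash(x, y);  // consistent with equals
    }
}
```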
And I'm also saying that BSMs allow the relevant choices to be flipped easily
and declaratively, unlike in early versions of Java, which required javac
to weave fancy (and opaque) bytecodes for fancy stuff like string append,
class literals, or inner classes.
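As a later illustration of this declarative-BSM point: record classes (which postdate this message) get their equals/hashCode not from compiler-woven bytecode but from a single invokedynamic per method, bootstrapped by java.lang.runtime.ObjectMethods.bootstrap, which spins the state-based implementations at link time. The record name below is invented for the example:

```java
// Records (Java 16+) embody the BSM approach: javac emits one
// invokedynamic for equals, hashCode, and toString, each bootstrapped
// by java.lang.runtime.ObjectMethods.bootstrap.
record Range(int lo, int hi) {}

class RecordDemo {
    public static void main(String[] args) {
        Range a = new Range(1, 10);
        Range b = new Range(1, 10);
        assert a.equals(b);                  // state-based, not identity-based
        assert a.hashCode() == b.hashCode(); // derived from the same components
    }
}
```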
But, to turn back to the JVM responsibility: It seems like the most conservative
thing to do would be to implement a value-oriented substitutability test, which
appropriately mirrors (for values) the substitutability test of references, which
is acmp/op==. Then, for better or worse, the contract of Object.equals (and
thus Object.hashCode) would be always a substitutability check. Mostly
worse, since we are more comfortable these days requiring well-dressed
classes to have suitable equals/hashCode methods.
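A sketch of what such a substitutability test could look like (this is illustrative reflection code, not the JVM's actual intrinsic): primitive components compared by value bits, reference components by identity, mirroring acmp. The Pair class is hypothetical:

```java
import java.lang.reflect.Field;

// Sketch of a field-wise substitutability test: primitives by value
// bits, references by identity (==), mirroring acmp for references.
class Substitutability {
    static boolean substitutable(Object a, Object b) {
        if (a == b) return true;
        if (a == null || b == null || a.getClass() != b.getClass()) return false;
        try {
            for (Field f : a.getClass().getDeclaredFields()) {
                f.setAccessible(true);
                Object fa = f.get(a), fb = f.get(b);
                if (f.getType().isPrimitive()) {
                    // Boxed equals on Double/Float compares raw bits
                    // (NaN matches NaN, +0.0 differs from -0.0), which
                    // happens to match substitutability semantics.
                    if (!fa.equals(fb)) return false;
                } else if (fa != fb) {
                    return false;  // reference component: identity, not equals
                }
            }
            return true;
        } catch (IllegalAccessException e) {
            throw new AssertionError(e);
        }
    }
}

// Hypothetical two-component value-like class used for demonstration.
class Pair {
    final double d;
    final Object ref;
    Pair(double d, Object ref) { this.d = d; this.ref = ref; }
}
```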
Or we could choose to break symmetry between references and values
in the case of Object.equals, at the JVM level. I think the cost of this would
be low, though the esthetics would be very poor.
Third option: Keep symmetry on Object.equals, and define a new supertype
interface Equable (or some such) whose semantics would override the
substitutability check of Object.equals, replacing it with deepEquals.
Then declare that standard value types implement Equable as well as Object
(but allow the JVM to accept naked value types without Equable). Extend
the benefits of Equable to reference types also, of course.
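A rough sketch of the shape such an Equable interface might take. Since today's default methods cannot override Object.equals (javac rejects the attempt, which is exactly the difficulty noted below), the deep-equality method is shown under a different, invented name; everything here is hypothetical:

```java
import java.lang.reflect.Field;

// Hypothetical Equable interface. In the proposal, the JVM or language
// would wire this in as the class's actual equals; today's default
// methods cannot override Object methods, so a placeholder name is used.
interface Equable {
    default boolean deepEquals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        try {
            for (Field f : getClass().getDeclaredFields()) {
                f.setAccessible(true);
                // Unlike substitutability, reference components are
                // compared with equals(), not ==.
                if (!java.util.Objects.equals(f.get(this), f.get(o))) return false;
            }
            return true;
        } catch (IllegalAccessException e) {
            throw new AssertionError(e);
        }
    }
}

// Hypothetical class opting in to deep equality by implementing Equable.
class Name implements Equable {
    final String first, last;
    Name(String first, String last) { this.first = first; this.last = last; }
}
```

Note the contrast with the substitutability option: two Name instances holding distinct but equal strings are deepEquals-equal, while a substitutability check (identity on reference components) would report them unequal.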
This third option would seem to require the ability for default methods in
Equable to override Object methods, which would be a difficult new thing.
IMO it would be a not-unreasonable cost to pay, to avoid having to hard-wire
either kind of equality (substitutability or deep equality) into the JVM or require
boilerplate everywhere the hard-wired option was wrong. But, probably,
default methods aren't the right way to think about this.
Which brings us back to the "mindy" idea: The things implemented by
Equable (in the above scenario) wouldn't be today's default methods,
but rather some sort of inheritable BSM declaration, which would be
expanded separately in each concrete class that implements the
interface. I know how this would work in the JVM but I have not the
faintest idea what it would look like at the language level; maybe at
first it would be a hardwired Magick in that particular type, Equable.
Eventually along with template classes we might have template methods,
and then the language-level design would become clearer.
— John