Some thoughts and suggestions regarding last month's post on Deconstruction Patterns
David Alayachew
davidalayachew at gmail.com
Mon Apr 3 14:06:21 UTC 2023
Hello Amber Dev Team,
Here is the post I am referencing --
https://mail.openjdk.org/pipermail/amber-spec-experts/2023-March/003766.html
The link above paints a pretty beautiful picture for the future of object
deconstruction. We now have a solid idea of the semantics behind how
deconstruction can work, and I think it flows really nicely. All I need to
do is identify a common form of deconstruction in my code base, abstract it
out into its own deconstructor, and then call that deconstructor whenever I
need to decompose the object. And since
we know that patterns can compose, we can easily see how this might fit in
with nested patterns.
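For concreteness, here is that kind of nesting with the record patterns we
already have today (real, compilable Java as of JDK 21; I assume custom
deconstructors would slot into the same shape):

    record Point(int x, int y) {}
    record Line(Point start, Point end) {}

    static String describe(Object obj) {
        return switch (obj) {
            // The Line pattern nests two Point patterns,
            // binding four values in a single match.
            case Line(Point(var x1, var y1), Point(var x2, var y2)) ->
                    "line from (" + x1 + "," + y1 + ") to (" + x2 + "," + y2 + ")";
            default -> "not a line";
        };
    }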
However, looking at this also gives me some pause, because it means the
code is in our hands to write, for better or for worse. There seems to be
no constraint placed on these custom deconstruction patterns beyond the
requirement that you assign a value to every "out-parameter".
My fear is how open-ended and flexible the concept of custom
deconstruction is. Aside from the constraint above, these deconstructors
seem to function just like methods. And one very common source of bugs in
the code we all write is a method that does something wildly different from
what its name suggests: a method is called getField(), yet it modifies
state or throws an exception for reasons the caller never expected.
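To make that class of bug concrete, here is plain, legal Java today (Widget
is just a made-up example):

    class Widget {
        private int field;

        // Named like a pure getter, but it silently mutates state on every call.
        int getField() {
            field++;        // surprise side effect
            return field;
        }
    }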
Now, the above link does a good job of showing how easy and simple it is to
write and read a deconstructor. So I'm not concerned about it on the small
scale.
The problem is that deconstructors were meant to be composed. They were
meant to be nested. It's that trait that makes this feature so worth it.
You can deconstruct complex objects down to just their essentials in an
expressive and intuitive way.
And therefore, however small the chance of a bug in any one deconstructor,
that chance gets multiplied, not just for each level of decomposition, but
for each deconstructor that you write. After all, it's not just about the
tool being correct, but about using it correctly. So the overall chance of
failure, however small at any single point, compounds with every
deconstructor in the chain.
All that said, I have some ideas on how to severely limit that rate of
growth.
One suggestion would be the ability to create deconstructors that delegate
some of their work to other deconstructors. That way, you significantly
reduce the first point of failure -- writing a correct deconstructor in the
first place. Delegating instead of duplicating shrinks the attack surface,
in exchange for a greater downstream impact if the shared deconstructor
does fail.
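To sketch what I mean (and to be clear, this is invented strawman syntax on
my part, not anything the team has proposed), a delegating deconstructor
might look roughly like:

    // NOT valid Java -- illustrative strawman only. The "deconstructor"
    // keyword and the delegation line are placeholders for the idea.
    class Employee {
        private final String first;
        private final String last;
        private final String department;

        // ... constructor omitted ...

        deconstructor Employee(String first, String last) {
            first = this.first;
            last = this.last;
        }

        deconstructor Employee(String first, String last, String department) {
            // Reuse the smaller deconstructor for the shared bindings,
            // then only fill in what is new here.
            delegate Employee(first, last);    // hypothetical delegation syntax
            department = this.department;
        }
    }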
Another suggestion would be some concept of "final" for the bindings we are
assigning to. Being forced to assign a value, but to assign it exactly
once, is a powerful construct that already helps me when writing normal
code, so I know it would help me avoid the same class of mistakes here and
limit the errors in my decomposition.
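Again in invented syntax, purely to show the guarantee I am after -- the
same definite-assignment rules that final locals give us today:

    // NOT valid Java -- strawman only. With final bindings, the compiler
    // could reject both a double assignment and a missing assignment.
    deconstructor Employee(final String first, final String last) {
        first = this.first;
        first = this.last;   // error: "first" assigned twice (a copy-paste typo)
                             // error: "last" never assigned
    }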
Yet another suggestion is a little more controversial (not to mention that
this ship may have already sailed) -- I'd like the user of a deconstruction
pattern to be able to opt in to forcing their binding variables to CONTAIN
the same identifiers that were used in the deconstructor's signature.
Knowing the field names lets me be sure that I am grabbing the right field
from my pattern match. This one is definitely the most guard-rail-y and
least Java-like of the bunch, but I feel like deconstruction deserves the
extra guard rails.
We consider it a code smell for a method to have many parameters, but the
entire point of deconstruction is to get all of the necessary fields out as
efficiently and simply as possible, which starts to feel like that same
smell once you reach any sort of depth. However, I think we can sidestep
that risk by allowing people to opt in to this naming concept. Then the
number of bindings is no longer a threat, since each one is named and
therefore clearly identified. Again, opt in.
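One more invented sketch, just to show the shape of the opt-in. Suppose the
Employee deconstructor sketched earlier declares its bindings as
(first, last, department):

    // NOT valid Java -- strawman only. With the opt-in, each binding name at
    // the use site must contain the identifier declared by the deconstructor,
    // so a positional mix-up becomes a compile error instead of a silent bug.
    case Employee(var first, var last, var department)         -> ...  // ok
    case Employee(var firstName, var lastName, var department) -> ...  // ok: names contain the declared identifiers
    case Employee(var last, var first, var department)         -> ...  // error: wrong identifiers in those positions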
Honestly, even the first two alone would do a lot of good. But these are
three suggestions I had that might help people use this feature properly.
Thank you for your time and help!
David Alayachew