Some thoughts and suggestions regarding last month's post on Deconstruction Patterns

Brian Goetz brian.goetz at oracle.com
Sat Apr 8 18:41:00 UTC 2023


> However, looking at this gives me some pause too, because that means that the code is in our hands to write, for better or for worse. There seems to be no form of constraint placed on these custom deconstruction patterns, just that you have to assign a value to all of the "out-parameters".

It’s an overstatement to say “no constraints”, but the reality is, we can’t constrain away all the things people could do wrong.  So yes, there is a risk that people will not follow the rules and write bad deconstruction patterns.
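
To make the obligation concrete, here is roughly what such a deconstructor might look like. The "deconstructor" keyword and body shape below are a straw-man sketch only, not the proposed surface syntax:

    class Point {
        private final int x, y;

        Point(int x, int y) { this.x = x; this.y = y; }

        // Straw-man syntax, not a committed design: x and y act as
        // "out-parameters".  The only structural requirement is that
        // both are assigned before the body completes; nothing stops
        // a buggy body from assigning the wrong values.
        deconstructor Point(int x, int y) {
            x = this.x;
            y = this.y;
        }
    }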

Note that we have this problem already with record patterns!  Consider

    record Foo(int x) {
        int x() { return 42; }    // ignores the actual x
    }

or

    record Bar(int x) {
        int x() { throw new RuntimeException(); }
    }

The first merely deconstructs the record wrong; the latter “poisons” any pattern match on it.  We can try to eliminate the most egregious errors (e.g., disallow a “throw” statement at the top level), but this is obviously only a band-aid, since the throw can easily be laundered through a method call.  The bottom line is that deconstructors and records do have a restricted programming model, and it is on users to code them correctly.
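
Concretely, with the broken accessor in Foo, any record pattern match silently sees the wrong value:

    Object o = new Foo(7);
    if (o instanceof Foo(int x)) {
        // The record pattern calls the accessor, so x is 42 here,
        // not the 7 the record was constructed with.
        System.out.println(x);
    }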

Note we had a very similar conversation when we did streams.  Given a stream pipeline:

    ints.stream()
         .map(x -> { lastX = x; return x + lastX; })
         …

If you try to run this in parallel you will not get the answer you expected.
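
Spelled out as a complete program (a minimal illustration; promoting lastX to a shared field is the assumption the snippet above implies):

    import java.util.List;

    class StatefulPipeline {
        static int lastX = 0;   // shared mutable state: this is the bug

        public static void main(String[] args) {
            List<Integer> ints = List.of(1, 2, 3, 4, 5);

            // Sequential: the write and read of lastX happen in order,
            // so the result is deterministic (though still bad style).
            int seq = ints.stream()
                          .map(x -> { lastX = x; return x + lastX; })
                          .mapToInt(Integer::intValue)
                          .sum();

            // Parallel: another thread can write lastX between this
            // thread's write and read, so the result can vary per run.
            int par = ints.parallelStream()
                          .map(x -> { lastX = x; return x + lastX; })
                          .mapToInt(Integer::intValue)
                          .sum();

            System.out.println(seq + " vs " + par);
        }
    }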

tl;dr: pervasive mutability means we can’t stop people from shooting themselves in the foot; we can only guide them to a better programming model and hope they follow.

> The problem is that deconstructors were meant to be composed. They were meant to be nested. It's that trait that makes this feature so worth it. You can deconstruct complex objects down to just their essentials in an expressive and intuitive way.

Yes, composition is powerful, but it magnifies the risk of poison-in, poison-out.

> And therefore, regardless of how small the chance of a bug is, that chance gets multiplied, not just for each level of decomposition, but for each deconstructor that you write. After all, it's not just about having the tool be correct, but using it correctly. So the chance of failure, regardless of how small, grows exponentially.

I get your concern, though I think “exponentially” is a bit hyperbolic.

> One suggestion would be the ability to create deconstructors that delegate some of their work to other deconstructors.

That’s already in the plan; constructors delegate to super-constructors, and deconstructors are the dual of constructors.
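
Reusing the straw-man syntax from above (again, purely illustrative; the meaning of "super(...)" in a deconstructor body is my guess at the dual), the symmetry might look something like:

    class Point3D extends Point {
        private final int z;

        Point3D(int x, int y, int z) {
            super(x, y);      // constructor delegates up
            this.z = z;
        }

        // Straw-man: a deconstructor delegating the x and y bindings
        // to the superclass deconstructor, the dual of super(x, y).
        deconstructor Point3D(int x, int y, int z) {
            super(x, y);
            z = this.z;
        }
    }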

> Another suggestion would be to create some concept of a "final" for the fields that we are assigning stuff to. Being able to force myself to assign something a value, but also only assign it once, is a powerful construct that already helps me when writing normal code, so I know that it can really help to avoid making some of the same mistakes here and limit the errors in my decomposition.

It depends on the exact expression of the deconstructor body (and I don’t want to dive into syntax now), but yes, one rational model here is to treat bindings as blank finals, and require they be definitely assigned before exit.  Then we don’t need to “create some concept of final”, because we can use the finality we already have.
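
For comparison, this is exactly the discipline the compiler already enforces for blank finals in ordinary code:

    static int choose(boolean condition) {
        final int result;     // a blank final: declared but not yet assigned
        if (condition) {
            result = 1;       // every path must assign it exactly once...
        } else {
            result = 2;
        }
        return result;        // ...and it is definitely assigned on all paths
    }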

> Yet another suggestion would be something a little more controversial (not to mention that this ship may have already sailed) -- I'd like the user of the deconstruction pattern to be able to opt-in to forcing their binding variables to CONTAIN the same identifiers that were used at the deconstructing method's signature.

Unfortunately, this is just unworkable.  Suppose you could do this on a record:

    record R(@ForcedUseSiteNameMatchDammit int x) { }

Now, if we declare

    record Pair(R r1, R r2) { … }

Then no one can use pattern matching on Pair:

    case Pair(R(int x), R(int x)) // duplicate variable names

Now, you said “contain”, so perhaps you meant something like

    case Pair(R(int x1), R(int x2))

But I promise you, no one will thank you for this degree of “do it because I think it is good for you.”

> Being able to know the field names

Not all patterns will just be unpacking fields.
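
For example, in the same straw-man syntax, a hypothetical class stored in cartesian form could expose bindings that are computed rather than read from fields:

    class Complex {
        private final double re, im;   // stored in cartesian form

        Complex(double re, double im) { this.re = re; this.im = im; }

        // Straw-man syntax: the bindings r and theta are computed on
        // the fly; there are no corresponding fields to "unpack".
        deconstructor Complex(double r, double theta) {
            r = Math.hypot(re, im);
            theta = Math.atan2(im, re);
        }
    }

A match like case Complex(double r, double theta) would then bind derived values; the binding names need not correspond to any field at all.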


