JEP 468: updating non-updatable fields

Ethan McCue ethan at mccue.dev
Mon Jan 26 15:50:35 UTC 2026


When people start marking up their entities with Jackson annotations, the
explanation I give is basically:

1. Entities are a bad fit for the heuristic-based class -> JSON method of
serializing data (because of circularity, hydration, etc.)
2. Rather than try to make them fit, either produce the JSON manually or
move the data into a class hierarchy that *is* a good fit (which records
are); see the sketch below.
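
A minimal sketch of option 2 (the entity and record names are illustrative,
and this assumes a Jackson version recent enough to understand records):

    import com.fasterxml.jackson.databind.ObjectMapper;

    // A mutable JPA-style entity: identity, lazy associations, etc. make it
    // a poor fit for heuristic class -> JSON serialization.
    class PersonEntity {
        private long id;
        private String name;
        private String email;

        long getId() { return id; }
        String getName() { return name; }
        String getEmail() { return email; }
    }

    // A record carrying only the data we actually want to expose.
    record PersonView(String name, String email) {
        static PersonView of(PersonEntity e) {
            return new PersonView(e.getName(), e.getEmail());
        }
    }

    // Serialize the record, not the entity:
    // String json = new ObjectMapper().writeValueAsString(PersonView.of(entity));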

The underlying issue is that people broadly seem to choose the convenience
of not writing a field out again over picking semantically sensible
mechanisms.

If we conceive of carrier classes as mostly "record-like, but...", then I
think what we tell people to do won't have changed. It's really just that
there's a new "wrong" thing that can be done, and it will be tempting to do.


On Mon, Jan 26, 2026, 10:25 AM Brian Goetz <brian.goetz at oracle.com> wrote:

> I don't think adding the conditional deconstruction story (as interesting
> as it is!) would shed light on this question.  I think the answer lies in
> "why does the active-row pattern not obey the requirements for being a
> carrier."  Whether deconstruction is conditional or unconditional, the
> problem is still that if someone can create records/carriers with
>
>     PersonRow r = new PersonRow(rand.nextInt(), "Bob Smith");
>
> and then persist them with
>
>     database.persist(r);
>
> they are bestowing the right to update random rows that were not dispensed
> by the ORM.  The database API has, by virtue of the fact that PersonRow has
> a public constructor that accepts an ID, essentially exposed a wider API
> than it intended to.
>
> The answer has always been "don't use carriers/records for this", but the
> interesting sub-question is (a) how to explain this succinctly to users so
> they get it and (b) what to tell them to do instead.
>
>
> On 1/26/2026 10:12 AM, Ethan McCue wrote:
>
> My immediate thought (aside from imagining Brian trapped in an eternal
> version of that heffalumps and woozles scene from Winnie the Pooh, but with
> all these emails) is that database entities aren't actually good candidates
> for "unconditional deconstruction".
>
> I think this because the act of getting the data from the db/persistence
> context is intrinsically fallible *and* attached to instance behavior;
> maybe we need to look forward to what the conditional deconstruction story
> would be?
>
> On Mon, Jan 26, 2026, 10:04 AM Brian Goetz <brian.goetz at oracle.com> wrote:
>
>>
>>
>> It's interesting that when language designers make the code easier to
>> write, somebody may complain that it's too easy :-)
>>
>>
>> I too had that "you can't win" feeling :)
>>
>> I would recast the question here as "Can Java developers handle carrier
>> classes?"  Records are restricted enough to keep developers _mostly_ out of
>> trouble, but the desire to believe that this is a syntactic and not a
>> semantic feature is a strong one, and given that many developers' education
>> about how the language works is limited to "what does IntelliJ suggest to
>> me", they may not even _realize_ they are giving in to the dark side.
>>
>> I think it is worth working through the example here: how would we
>> recommend handling the case of an "active" row like this?
>>
>> I think it's a perfect place for static analysis tooling. One could invent
>> an annotation like `@NonUpdatable` with the `RECORD_COMPONENT` target and
>> use it on such components, then create an annotation processor (an Error
>> Prone plugin, IntelliJ IDEA inspection, CodeQL rule, etc.) that checks for
>> violations and fails the build if there are any.
>> Adding such a special case to the language specification would be an
>> overcomplication.
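>>
>> A minimal sketch of such an annotation, reusing Brian's PersonRow example
>> (the name `@NonUpdatable` and the retention choice are assumptions, not an
>> existing library):
>>
>>     import java.lang.annotation.ElementType;
>>     import java.lang.annotation.Retention;
>>     import java.lang.annotation.RetentionPolicy;
>>     import java.lang.annotation.Target;
>>
>>     /** Marks a record component that application code must never change. */
>>     @Target(ElementType.RECORD_COMPONENT)
>>     @Retention(RetentionPolicy.CLASS)   // visible to bytecode-level checkers
>>     @interface NonUpdatable {}
>>
>>     record PersonRow(@NonUpdatable int dbId, String name) {}
>>
>> A checker would then flag anything that assigns to the annotated component,
>> e.g. `row with { dbId++; }`, and fail the build.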
>>
>> With best regards,
>> Tagir Valeev.
>>
>> On Sun, Jan 25, 2026 at 11:48 PM Brian Goetz <brian.goetz at oracle.com>
>> wrote:
>>
>>> The important mental model here is that a reconstruction (`with`)
>>> expression is "just" a syntactic optimization for:
>>>
>>>  - destructure with the canonical deconstruction pattern
>>>  - mutate the components
>>>  - reconstruct with the primary constructor
>>>
>>> So the root problem here is not the reconstruction expression; if you
>>> can bork up your application state with a reconstruction expression, you
>>> can bork it up without one.
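>>>
>>> Concretely, using Andy's MyRec example from below (a sketch: the `with`
>>> form follows the JEP 468 preview syntax, and accessor calls stand in for
>>> the canonical deconstruction pattern):
>>>
>>>     record MyRec(int dbId, String name) {}
>>>
>>>     MyRec rec = new MyRec(42, "old name");
>>>
>>>     // rec = rec with { name = "new name"; };   // reconstruction expression
>>>
>>>     // ...is roughly shorthand for:
>>>     int dbId = rec.dbId();               // destructure into the components
>>>     String name = rec.name();
>>>     name = "new name";                   // mutate the local copies
>>>     rec = new MyRec(dbId, name);         // reconstruct with the canonical constructor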
>>>
>>> Primary constructors can enforce invariants _on_ or _between_
>>> components, such as:
>>>
>>>     record Rational(int num, int denom) {
>>>         Rational { if (denom == 0) throw ... }
>>>     }
>>>
>>> or
>>>
>>>     record Range(int lo, int hi) {
>>>         Range { if (lo > hi) throw... }
>>>     }
>>>
>>> What they can't do is express invariants between the record / carrier
>>> state and "the rest of the system", because they are supposed to be simple
>>> data carriers, not serialized references to some external system.  A
>>> class that models a database row in this way is complecting entity state
>>> with an external entity id.  By modeling it this way, you have declared
>>> that
>>>
>>>     rec with { dbId++ }
>>>
>>> *is explicitly OK* in your system; that the components of the record can
>>> be freely combined in any way (modulo enforced cross-component
>>> invariants).  And there are systems in which this is fine!  But you're
>>> imagining (correctly) that this modeling technique will be used in systems
>>> in which this is not fine.
>>>
>>> The main challenge here is that developers will be so attracted to the
>>> syntactic concision that they will willfully ignore the semantic
>>> inconsistencies they are creating.
>>>
>>>
>>>
>>>
>>> On 1/25/2026 1:37 PM, Andy Gegg wrote:
>>>
>>> Hello,
>>> I apologise for coming late to the party here - records have been of
>>> limited use to me, but Mr Goetz's email on carrier classes is something
>>> that would be very useful, so I've been thinking about the consequences.
>>>
>>> Since carrier classes and records are for data, somewhere or other in a
>>> database application you're going to get database ids in records:
>>> record MyRec(int dbId, String name,...)
>>>
>>> While everything is immutable this is fine, but JEP 468 opens up the
>>> possibility of mutation:
>>>
>>> MyRec rec = readDatabase(...);
>>> rec = rec with {name="...";};
>>> writeDatabase(rec);
>>>
>>> which is absolutely fine and what an application wants to do.  But:
>>> MyRec rec = readDatabase(...);
>>> rec = rec with {dbId++;};
>>> writeDatabase(rec);
>>>
>>> is disastrous.  There's no way the canonical constructor invoked from
>>> 'with' can detect the stupidity, and neither can anything the database
>>> access layer does.
>>>
>>> In the old days, the lack of a 'setter' would usually prevent stupid
>>> code - the above could still be achieved, obviously, but the code needed
>>> is devious enough to make people stop and think (one hopes).
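>>>
>>> For illustration, a hypothetical bean-style class of the kind described
>>> (not from the thread):
>>>
>>>     class MyBean {
>>>         private final int dbId;     // no setDbId(): the id cannot be quietly rewritten
>>>         private String name;
>>>
>>>         MyBean(int dbId) { this.dbId = dbId; }
>>>
>>>         int getDbId() { return dbId; }
>>>         String getName() { return name; }
>>>         void setName(String name) { this.name = name; }
>>>     }
>>>
>>> Changing the id means deliberately constructing a new object, which is
>>> visible and awkward enough to raise eyebrows in review.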
>>>
>>> Here there is nothing to say "do not update this!!!" except code
>>> comments, JavaDoc and naming conventions.
>>>
>>> It's not always obvious which fields may or may not be changed in the
>>> application.
>>>
>>> record MyRec(int dbId, int fatherId,...)
>>> probably doesn't want
>>> rec = rec with { fatherId = ... }
>>>
>>> but a HR application will need to be able to do:
>>>
>>> record MyRec(int dbId, int departmentId, ...);
>>> ...
>>> rec = rec with { departmentId = newDept; };
>>>
>>> Clearly, people can always write stupid code (guilty...), and the current
>>> state of play obviously allows the possibility (rec = new
>>> MyRec(rec.dbId() + 1, ...);), which is enough to stop people using records
>>> here; but carrier classes will be very tempting, and that brings derived
>>> creation back to the fore.
>>>
>>> It's not just database ids that might need to be protected from update,
>>> e.g. timestamps (which are better handled in the database layer), and no
>>> doubt different applications will have their own business-case
>>> restrictions.
>>>
>>> Thank you for your time,
>>> Andy Gegg
>>>
>>>
>>>
>>
>

