Public Behavior API proposal
Andy Goryachev
andy.goryachev at oracle.com
Tue Nov 14 23:55:39 UTC 2023
Dear colleagues:
I would like to thank John and Michael for a lively discussion, during which we clarified a number of concepts, specifically the roles of Control (C), Skin (S), and Behavior (B). There is still a bit of difference in how we see things, but I am pleased to say we do have a lot in common.
Especially valuable is the definition of Behavior as a layer that translates user input into state changes of the control. How these changes are effected is where our ideas differ, but I really like this definition.
We now have a clear separation of concerns when it comes to C, S, and B. We also acknowledge that the effort required to migrate from the current implementation to a new one should be minimized within reason, and that the migration could be done gradually and without breaking compatibility.
Another positive development is the realization that FX has an issue with event handling priority, one that cannot be solved using existing APIs. This problem can be decoupled from the behavior/input map discussion and solved in a separate PR ( https://github.com/openjdk/jfx/pull/1266 ), so maybe we should look at it first (or in parallel)?
On the other hand, we do have some disagreement. The good thing is, we share the same goal: to give application developers a platform and APIs that are both useful and convenient. This makes me think we can get to a mutually acceptable design. What should be the process for arriving at a common solution?
-andy
From: openjfx-dev <openjfx-dev-retn at openjdk.org> on behalf of John Hendrikx <john.hendrikx at gmail.com>
Date: Sunday, November 12, 2023 at 16:13
To: openjfx-dev at openjdk.org <openjfx-dev at openjdk.org>, Michael Strauß <michaelstrau2 at gmail.com>
Subject: Re: Public Behavior API proposal
Hi everyone, and specifically Andy and Michael,
I'm working on updating the Behavior API proposal, and I've been
thinking about the semantic events a lot. I would really like to hear
what you think, and how it matches with your ideas.
Quick recap: Semantic Events are high-level events (like the ActionEvent
from Button) that can be triggered by a combination of low-level events.
They represent an action to be achieved, but are not tied to any
specific means of triggering it. For example, the ActionEvent can be
triggered with the mouse, or with the keyboard; it is irrelevant which
one it was. Semantic events can be generated by Skins (as a result of
interactions with the Skin's managed children), by Controls (see below)
and users directly. You can compare these with Andy's FunctionTags or
Actions from various systems.
Let me describe exactly each part's role as I see it currently:
# Controls
Controls define semantic events and provide infrastructure for handling
events that is separate from internal needs (the user comes first). User
installed event handlers always have priority to make the user feel in
control. The Control also provides another new piece of infrastructure:
the management of key mappings. The mapping system can respond directly
to Key events (after the user had their chance) to generate a semantic
event. This means that both Control and Skin are allowed to generate
semantic events, although for Control this is strictly limited to the
mapping system. The key mappings are only overridable, and their base
configuration is provided by whatever Behavior is installed. Exchanging
the Behavior does not undo user key mapping overrides, but instead
provides a new baseline upon which the overrides are applied. So if a
Behavior provides a mapping for SPACE, and the user removed it,
installing a different behavior that also provides a mapping for SPACE
will still see that mapping as removed. If a behavior doesn't define
SPACE, and the user removed it, then nothing special happens (but the
removal is remembered).
- Controls refer to a Skin and Behavior
- Controls define semantic events
- Controls can generate semantic events (via mappings)
- Controls never modify their own (user writable) state of their own
accord (the user is in full control)
- Controls provide an override based key mapping system
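As a minimal plain-Java sketch of such an override-based mapping store (all names here, like KeyMappings, remap and disable, are illustrative, not proposed API; JavaFX types are deliberately avoided so the sketch stands alone):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Illustrative sketch of the override-based key mapping store described
// above; not a proposed API.
class KeyMappings<K, V> {
    private Map<K, V> base;                       // supplied by the installed Behavior
    private final Map<K, Optional<V>> overrides = new HashMap<>();  // user overrides; empty = removed

    KeyMappings(Map<K, V> base) { this.base = base; }

    void remap(K key, V target) { overrides.put(key, Optional.of(target)); }

    // A removal is remembered even if the current base has no such mapping
    void disable(K key) { overrides.put(key, Optional.empty()); }

    // Exchanging the Behavior replaces the baseline; user overrides survive
    void setBase(Map<K, V> newBase) { this.base = newBase; }

    Optional<V> lookup(K key) {
        Optional<V> override = overrides.get(key);
        return override != null ? override : Optional.ofNullable(base.get(key));
    }
}
```

With this shape, disabling SPACE survives a behavior swap: after `disable("SPACE")`, a `setBase(...)` that also maps SPACE still resolves SPACE to nothing, exactly as described above.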
# Skins
Skins provide the visuals, and although they get a Control reference,
they are restricted to only adding property listeners (not event
handlers) and modifying the children list (which is read only for users
as Control extends from Region). This keeps the user fully in control
when it comes to any writable properties and events on Control. Most
Skins already do this as I think it was an unwritten rule from the
beginning. Skins then install event handlers on their children (but
never the top level Control) where translation takes place to semantic
events. Skins have no reference to the Behavior to ensure that all
communication has to go through (interceptable) semantic events. Not
all events a Skin receives must be translated; if some events only
result in the Skin's internal state changing, and do not need to be
reflected in the Control's state, then Skins can handle these directly
without going through a Behavior. Examples might be the position of the
caret, or the exact scroll location of a View, if such things aren't
part of the Control state.
- Skins refer to a Control (legacy inheritance) but are limited in their
allowed interactions (unwritten rule since the beginning)
- Better would be to provide skins with only a Context object that
only allows installing of listeners to ensure they can't do nasty things
(and to track all changes a Skin did for when it is replaced, similar
idea to BehaviorContext)
- Skins interpret normal events of their CHILDREN only (after the user
did not consume them), and either:
  - Translate them to semantic events (targeted at the top level
Control)
  - Act upon them directly if only Skin-internal state is involved
- Skins never act upon semantic events
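To make the direction of flow concrete, here is a rough plain-Java sketch (JavaFX types replaced by stand-ins; SpinnerControl, UpButton, IncrementEvent and the handler fields are all hypothetical):

```java
import java.util.function.Consumer;

// Plain-Java stand-ins for JavaFX types, used only to show the direction
// of flow from a Skin's child up to the Control as a semantic event.
class IncrementEvent {}                 // a semantic event defined by the Control

class SpinnerControl {
    Consumer<IncrementEvent> onIncrement = e -> {};   // stands in for event dispatch
    void fireEvent(IncrementEvent e) { onIncrement.accept(e); }
}

class UpButton {                        // a child node managed by the Skin
    Runnable onMouseClicked = () -> {};
    void click() { onMouseClicked.run(); }
}

class SpinnerSkin {
    final UpButton upButton = new UpButton();

    SpinnerSkin(SpinnerControl control) {
        // The Skin installs handlers on its CHILDREN only, never on the
        // Control itself, and translates the low-level click into a
        // semantic event targeted at the Control.
        upButton.onMouseClicked = () -> control.fireEvent(new IncrementEvent());
    }
}
```

The Skin never touches the Control's writable state here; it only fires the semantic event and lets whatever handles that event (user or Behavior) decide what happens.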
# Behaviors
Behaviors provide a standard set of key mappings, and provide standard
ways of dealing with semantic events. Installing a new Behavior on a
control can make it "feel" completely different: how it reacts to
keys (and to which keys), how it deals with the high level semantic
events, as well as how it handles mouse interactions. Behaviors can act
upon both normal and semantic events, but don't generate any events
themselves. Again, they only act upon events after the user had a
chance to act upon them first. A behavior is free to act upon Key events
directly (for things too complicated for a simple key mapping), but it
would be better to route them as much as possible through a semantic
event that is triggered by a key mapping in the Control. Mouse events
are more complicated and don't need to be routed indirectly to be
handled (not 100% sure here yet). When receiving a semantic event that the user
didn't care about, the Behavior consumes it and does its thing by
calling methods on the Control and modifying control state.
- Behaviors refer to nothing
- Control reference is received via event handler and listeners only
- Behaviors define base key mappings
- Controls refer to these, after first checking for any user overrides
- Base key mappings can be global and immutable, user overrides
(maintained in Control) are mutable
- Behaviors never generate events
- Behaviors can act upon events of any type (if unconsumed by the user)
that are targeted at the control (enforced, as that's the only place
they can install handlers)
- Behaviors are allowed to modify control state and their own state
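The dispatch order (user handlers first, Behavior last, and only for unconsumed events) can be sketched in plain Java like this; Spinner, SpinnerBehavior and the handler fields are illustrative stand-ins, not JavaFX API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch of the dispatch order described above.
class SemanticEvent {
    private boolean consumed;
    void consume() { consumed = true; }
    boolean isConsumed() { return consumed; }
}

class Spinner {
    int value;
    final List<Consumer<SemanticEvent>> userHandlers = new ArrayList<>();
    Consumer<SemanticEvent> behaviorHandler = e -> {};   // installed by the Behavior

    void fireIncrement() {
        SemanticEvent e = new SemanticEvent();
        for (Consumer<SemanticEvent> h : userHandlers) {  // the user comes first
            h.accept(e);
            if (e.isConsumed()) return;
        }
        behaviorHandler.accept(e);    // the Behavior sees it only if unconsumed
    }
}

class SpinnerBehavior {
    SpinnerBehavior(Spinner spinner) {
        // The Behavior consumes the event and modifies control state
        spinner.behaviorHandler = e -> { e.consume(); spinner.value++; };
    }
}
```

If the user installs a handler that consumes the event, the Behavior never runs, which is exactly the "user feels in control" property described above.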
# Interaction with Andy's proposal
I think the above works pretty nicely together with Andy's proposal. By
moving the responsibility of managing the key mappings to the Control, but
leaving the responsibility of defining the mappings with Behaviors, I
see a nice path forward to opening up a key mapping system for simple
overrides, as well as having a public Behavior API for more advanced
behavioral modifications.
Notice that I didn't provide for a FunctionTag remapping system; as
these would be semantic events in this proposal, they can be
filtered/handled before the Behavior gets them. So to change a
function, just consume it and fire a different one. To completely block
it, just consume it. To replace a single function with two functions,
consume it and fire two new ones, etc. So to globally swap the
increment/decrement functions of all Spinners, you could install a
handler at the Scene level that does exactly this.
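A plain-Java sketch of that swap (SceneLike, SpinnerModel and the string event types are illustrative stand-ins, not JavaFX API). One subtlety: the sketch dispatches the swapped event directly to the target rather than re-firing it through the scene, which would trigger the swap filter again:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiConsumer;

// Illustrative sketch of consuming a semantic event in a Scene-level
// filter and dispatching the opposite one instead.
class SemanticEvent {
    final String type;
    private boolean consumed;
    SemanticEvent(String type) { this.type = type; }
    void consume() { consumed = true; }
    boolean isConsumed() { return consumed; }
}

class SpinnerModel {
    int value;
    // the Behavior's reaction to an unconsumed semantic event
    void behaviorHandle(SemanticEvent e) {
        if (e.type.equals("INCREMENT")) value++;
        else if (e.type.equals("DECREMENT")) value--;
    }
}

class SceneLike {
    final List<BiConsumer<SpinnerModel, SemanticEvent>> filters = new ArrayList<>();

    void fire(SpinnerModel target, SemanticEvent e) {
        for (BiConsumer<SpinnerModel, SemanticEvent> f : filters) {
            f.accept(target, e);          // filters run before the Behavior
            if (e.isConsumed()) return;
        }
        target.behaviorHandle(e);
    }
}
```

A scene-level filter then swaps the two functions for every spinner it sees, without touching any individual control.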
To also answer Andy's 10 questions:
Q1. Changing an existing key binding from one key combination to another.
-> Control provides a small API to override base mappings (remap)
Q2. Remapping an existing key binding to a different function.
-> Control provides a small API to override base mappings, including to
which semantic event they translate
Q3. Unmapping an existing key binding.
-> Control provides a small API to override base mappings (disable)
Q4. Adding a new key binding mapped to a new function.
-> Still debatable whether this should be provided, but I don't see many
blockers; someone will have to interpret the new function though; this
could be a user event handler that knows the event (which can be a
custom one) or a customized Behavior.
Q5. (Q1...Q4) scenarios, at run time.
-> All possible at runtime. With a "-fx-behavior" CSS feature, this
could also be provided via CSS selectors, allowing far-reaching changes
without having to modify each control individually.
Q6. How the set behavior handles a change from the default skin to a
custom skin with some visual elements that expects input removed, and
some added.
-> Behaviors act only upon events targeted at the Control. Events that
a Skin doesn't provide will simply not be picked up by Behaviors.
Skins that provide unknown semantic events require a corresponding
Behavior upgrade. Skin actions that don't require Control state changes
(only Skin state changes) can ignore this system altogether.
Q7. Once the key binding has been modified, is it possible to invoke the
default functionality?
-> Yes, just fire the appropriate semantic event at the Control
Q8. How are the platform-specific key bindings created?
-> Debatable whether this is needed, but probably something similar to
your proposal will be possible. Platforms don't change at runtime, so
why they are even added as bindings (instead of simply skipping them
when not on the right platform) is a mystery to me. A simple tool
(perhaps on the Platform class) to check the platform should be
sufficient; no need to interweave this with the key mappings themselves.
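For instance, the decision can be made once at registration time rather than encoded into the binding itself (a sketch; the key strings and bindings shown are illustrative, not actual FX defaults):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: skip platform-specific bindings at registration time instead of
// tagging each binding with a platform. Bindings shown are illustrative.
class PlatformAwareMappings {
    static Map<String, String> create() {
        Map<String, String> mappings = new HashMap<>();
        boolean isMac = System.getProperty("os.name").toLowerCase().contains("mac");
        if (isMac) {
            mappings.put("META+RIGHT", "end-of-line");    // only added on macOS
        } else {
            mappings.put("CTRL+RIGHT", "next-word");      // only added elsewhere
        }
        return mappings;
    }
}
```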
Q9. How are the skin-specific (see Q6) handlers removed when changing
the skins?
-> Skins clean up after themselves, and they're not allowed to install
handlers on the control (only on their children)
Q10. When a key press happens, does it cause a linear search through
listeners or just a map lookup?
-> No linear search is required; Controls have the freedom to optimize how they do this; Behaviors
provide base mappings in some kind of Map form, or have an API to
quickly look up a base mapping (probably the latter to encapsulate it
better).
Thanks for reading,
--John
On 07/11/2023 08:09, Michael Strauß wrote:
> Hi John,
>
> I like that you clearly define the terms Control, Skin and Behavior,
> as well as their roles within the control architecture.
>
> However, I don't see how your proposal scales to non-trivial controls,
> and I agree with Andy that the Button example doesn't help (because a
> Button lacks substructure and provides only a single interaction).
>
> I'm missing your previous idea of using the event system for
> higher-level semantic events, because I think they're required to make
> this work. Here's how I see these parts working together:
>
> 1) A control is an opaque node in the scene graph, which defines the
> API for a particular interactive element. It also defines the
> interactions afforded by its implementation. For example, a Spinner
> will usually consist of a text field and two buttons, but a skin might
> choose to implement these components differently. The interactions
> afforded by a control are exposed as semantic events:
>
> class SpinnerEvent extends Event {
>     static EventType<SpinnerEvent> COMMIT_TEXT;
>     static EventType<SpinnerEvent> START_INCREMENT;
>     static EventType<SpinnerEvent> STOP_INCREMENT;
>     static EventType<SpinnerEvent> START_DECREMENT;
>     static EventType<SpinnerEvent> STOP_DECREMENT;
> }
>
> 2) Skins are responsible for generating semantic events, and sending
> those events to the control. Since we don't need those events to have
> a tunneling/bubbling behavior, we could have a flag on the event that
> indicates a "direct event", one that is dispatched directly to its
> target.
>
> 3) Behaviors listen for semantic events on the control, and convert
> these events into state changes of the control. This part would
> probably be quite similar to some of the things that have already been
> proposed.
>
> In this way, controls, skins, and behaviors would end up as loosely
> coupled parts. In particular, I don't see the value in making
> behaviors public API if they are so tightly coupled to skins that they
> end up as being basically implementation details.
>
> Andy:
>> Imagine a specific skin that has a Node that accepts a user input. A scroll bar, a button, or a region with some function. Unless this element is proclaimed as a must-have for any skin and codified via some new public API (MySkin.getSomeElement()), it is specific to that particular skin and that particular behavior.
> I think that's a very important observation. A skin can't just be
> anything it wants to be, it must be suitable for its control. So we
> need a place where we define the API and the interactions afforded by
> that control. In my opinion, this place is the Control. Its
> functionality is exposed via properties and methods, and its
> interactions are specified using semantic events.
>
> Now skins are free to be implemented in any imaginable way, provided
> that they interact with the control using semantic events. This gives
> us very straightforward restrictions:
> * A skin can never add interactions that the control didn't specify.
> * If additional interactions are required, the control must be
> subclassed and the interactions must be specified by the control.
> Additionally, the behavior must be extended to account for the
> additional interactions.
>
>
>
>
> On Mon, Nov 6, 2023 at 4:50 AM John Hendrikx <john.hendrikx at gmail.com> wrote:
>> As promised, a public Behavior API proposal.
>>
>> Summary:
>>
>> Introduce a new Behavior interface that can be set on a control to replace its current behavior. The new behavior can be fully custom or composed (or later subclassed) from a default behavior. Some default behaviors will be provided as part of this proposal, but not all.
>>
>> See here: https://gist.github.com/hjohn/293f3b0ec98562547d49832a2ce56fe7
>>
>> --John