discussion about touch events
Pavel Safrata
pavel.safrata at oracle.com
Mon Nov 11 13:30:55 PST 2013
On 11.11.2013 17:49, Tomas Mikula wrote:
> On Mon, Nov 11, 2013 at 1:28 PM, Philipp Dörfler <phdoerfler at gmail.com> wrote:
>> I see the need to be aware of the area that is covered by fingers rather
>> than just considering that area's center point.
>> I'd guess that this adds a new layer of complexity, though. For instance:
>> say we have a button on some background, and both the background and the
>> button have an onClick listener attached. If you tap the button in a way
>> that the touched area's center point is outside of the button's boundaries -
>> what event will be fired? Will both the background and the button receive a
>> click event? Or just either the background or the button exclusively? Will
>> there be a new event type which gets fired in case of such area-based taps?
>>
>> My suggestion would therefore be to have an additional area tap event which
>> gives precise information about the diameter and center of the tap. Besides
>> that, there should be some kind of "priority" for choosing which node's
>> onClick will be called.
> What about picking the one that is closest to the center of the touch?
>
There is always something directly under the center of the touch (possibly
the scene background, but it can have event handlers too). That's what
we pick right now.
Pavel
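
A minimal sketch of that "closest to the center" idea in Java, assuming the
picker is handed the touch center plus an estimated finger radius and a list
of candidate nodes from the scene-graph walk (pickNearCenter and the candidate
list are hypothetical, not existing JavaFX API):

    import java.util.List;
    import javafx.geometry.Bounds;
    import javafx.scene.Node;

    // Sketch: among nodes whose bounds fall inside (or touch) the finger
    // circle, return the one closest to the touch center (cx, cy) in scene
    // coordinates; strict center-point picking is the radius == 0 case.
    static Node pickNearCenter(List<Node> candidates,
                               double cx, double cy, double radius) {
        Node best = null;
        double bestDist = Double.MAX_VALUE;
        for (Node n : candidates) {
            Bounds b = n.localToScene(n.getBoundsInLocal());
            // distance from the touch center to the node's bounds
            // (zero when the center lies inside the bounds)
            double dx = Math.max(Math.max(b.getMinX() - cx, 0), cx - b.getMaxX());
            double dy = Math.max(Math.max(b.getMinY() - cy, 0), cy - b.getMaxY());
            double dist = Math.hypot(dx, dy);
            if (dist <= radius && dist < bestDist) {
                bestDist = dist;
                best = n;
            }
        }
        return best; // null when nothing is inside or near the touch area
    }
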
> Tomas
>
>> Maybe the draw order / order in the scene graph / z-buffer value would be
>> sufficient to model what would happen in the real,
>> physical world.
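
Combining the two suggestions, that "priority" could be expressed as a
comparator (a sketch, assuming the candidate-gathering walk recorded each
node's render order and its distance to the touch center; both maps are
hypothetical helpers, not existing API):

    import java.util.Comparator;
    import java.util.Map;
    import javafx.scene.Node;

    // Sketch: nodes rendered later (drawn on top) win; distance to the
    // touch center breaks ties between overlapping candidates.
    static Comparator<Node> touchPriority(Map<Node, Integer> renderIndex,
                                          Map<Node, Double> distToCenter) {
        return Comparator.<Node>comparingInt(renderIndex::get).reversed()
                         .thenComparingDouble(distToCenter::get);
    }

Sorting the candidates with this comparator and taking the first element
would give the behaviour sketched above.
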
>> On 11.11.2013 13:05, "Assaf Yavnai" <assaf.yavnai at oracle.com> wrote:
>>
>>> The ASCII sketch looked fine on my screen before I sent the mail :( I hope
>>> the idea is clear from the text.
>>> (Now in the reply dialog it also looks good.)
>>>
>>> Assaf
>>> On 11/11/2013 12:51 PM, Assaf Yavnai wrote:
>>>
>>>> Hi Guys,
>>>>
>>>> I hope that I'm right about this, but it seems that touch events in glass
>>>> are translated (and reported) as single-point events (x & y) without an
>>>> area, just like pointer events.
>>>> AFAIK, the controls respond to touch events the same as to mouse events
>>>> (using the same pickers), and as a result a button press, for example,
>>>> will only be triggered if the x & y of the touch event is within the
>>>> control area.
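
That strict behaviour can be approximated with a recursive point pick over
the scene graph (a simplified sketch; the real picker also honours things
like pickOnBounds, clips, visibility and mouseTransparent):

    import java.util.List;
    import javafx.geometry.Point2D;
    import javafx.scene.Node;
    import javafx.scene.Parent;

    // Sketch of 'strict' picking: only a node containing the exact
    // reported (x, y) can be hit, however large the finger contact was.
    static Node pickStrict(Node node, double sceneX, double sceneY) {
        if (node instanceof Parent) {
            List<Node> children = ((Parent) node).getChildrenUnmodifiable();
            for (int i = children.size() - 1; i >= 0; i--) { // topmost first
                Node hit = pickStrict(children.get(i), sceneX, sceneY);
                if (hit != null) {
                    return hit;
                }
            }
        }
        Point2D p = node.sceneToLocal(sceneX, sceneY);
        return node.contains(p) ? node : null;
    }
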
>>>>
>>>> This means that small controls, or even quite large controls (like
>>>> buttons with text), will often get missed because of the 'strict' node
>>>> picking. From a UX point of view this is strange, as the user clearly
>>>> pressed on a node (the finger was clearly above it) but nothing happens...
>>>>
>>>> With the current implementation it's hard to use small features in
>>>> controls, like scrollbars in lists, and it's almost impossible to
>>>> implement something like a 'screen navigator' (the series of small dots
>>>> at the bottom of a smartphone's screen which lets you jump directly to a
>>>> 'far away' screen).
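
Until picking becomes area-aware, one app-level workaround (my assumption,
not a platform fix) is to inflate a small control's pickable bounds, e.g.
for one of those navigator dots:

    import javafx.geometry.Insets;
    import javafx.scene.layout.StackPane;
    import javafx.scene.paint.Color;
    import javafx.scene.shape.Circle;

    // Workaround sketch: wrap the tiny visible dot in a padded container
    // and pick on the container's full bounds, so touches landing near
    // the dot still hit it.
    Circle dot = new Circle(4, Color.GRAY);    // 8px visible dot
    StackPane touchTarget = new StackPane();
    touchTarget.getChildren().add(dot);
    touchTarget.setPadding(new Insets(16));    // ~40px square touch target
    touchTarget.setPickOnBounds(true);         // pick on bounds, not shape
    touchTarget.setOnMouseClicked(e -> System.out.println("dot tapped"));

This only helps where the layout can spare the empty space; it does not fix
the general problem described here.
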
>>>>
>>>> To illustrate, consider the low-resolution sketch below, where the "+"
>>>> is the actual x,y reported, the ellipse is the finger's touch area and
>>>> the rectangle is the node.
>>>> With the current implementation this type of tap will not trigger the
>>>> node's handlers:
>>>>
>>>>               ___
>>>>             /     \
>>>>            /   +   \
>>>>        ___/         \___        in this scenario the 'button' will not
>>>>       |   \         /   |       get pressed
>>>>       |___ \       / ___|
>>>>             \_____/
>>>>
>>>> If your smartphone supports it, turn on the touch debugging options in
>>>> settings and see that each point translates to a quite large circle, and
>>>> whatever falls in it, or reasonably close to it, gets picked.
>>>>
>>>> I want to start a discussion to understand whether my perspective is
>>>> accurate, and what, if anything, can be done for the coming release or
>>>> the next one.
>>>>
>>>> We might use the recently opened RT-34136
>>>> <https://javafx-jira.kenai.com/browse/RT-34136> to log this, or open a
>>>> new JIRA for it.
>>>>
>>>> Thanks,
>>>> Assaf
>>>>
>>>