RFR: 8305709: [testbug] Tree/TableViewResizeColumnToFitContentTest fails with fractional screen scale

Kevin Rushforth kcr at openjdk.org
Wed Sep 20 15:24:54 UTC 2023


On Thu, 7 Sep 2023 18:26:50 GMT, Andy Goryachev <angorya at openjdk.org> wrote:

> Snapping introduces differences between computed values and snapped values, so we need to use a non-zero tolerance when checking for equality.  The maximum tolerance is (1 / scale), i.e., one display pixel scaled back to local coordinates.
> 
> The tests have been modified to use the scale-specific tolerance.
> 
> Tested with macOS at scale 1.0 and win11 at scales (100%, 125%, 150%, 175%).
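
To make the snapping error in the description above concrete, here is a small standalone sketch (not code from the PR; the snapping helper below is a simplified stand-in for the Region snap methods) showing that the difference between a value and its snapped counterpart stays within one display pixel, i.e. (1 / scale), in local coordinates:

    public final class SnapToleranceDemo {

        // Snap a local-coordinate value to the nearest device pixel at the
        // given render scale (round-based snapping, simplified).
        static double snapToPixel(double value, double scale) {
            return Math.round(value * scale) / scale;
        }

        public static void main(String[] args) {
            double value = 123.37;
            for (double scale : new double[] { 1.0, 1.25, 1.5, 1.75, 2.0 }) {
                double snapped = snapToPixel(value, scale);
                double error = Math.abs(snapped - value);
                // One display pixel expressed in local coordinates is (1 / scale),
                // so the snapping error stays within that bound.
                System.out.printf("scale=%.2f snapped=%.4f error=%.4f bound=%.4f%n",
                        scale, snapped, error, 1.0 / scale);
            }
        }
    }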

The changes to the test look fine to me. I did add a question about the new test utility method, but I'll leave it up to you whether and how you want to address it.

tests/system/src/test/java/test/util/Util.java line 490:

> 488:                     double scale = win.getRenderScaleX();
> 489:                     // distance between pixels in the local (unscaled) coordinates is (1 / scale)
> 490:                     return 1.0 / scale;

Is this computation what you want? It gives a smaller tolerance for a screen scale of, say, 2.0 (tolerance 0.5) than for 1.0 (tolerance 1.0), which intuitively seems backwards. For a screen scale of 1, a tolerance of 1 might be too large, whereas for a screen scale of 3, a tolerance of 0.3333 might be too small. If the goal is to account for the results of snapping, would a fixed tolerance of 0.5 be better? It's possible I'm missing something.
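
For reference, a quick standalone sketch (these names are hypothetical, not part of Util.java) that just prints what each policy yields at a few render scales, to make the comparison concrete:

    public final class ToleranceComparison {
        public static void main(String[] args) {
            double fixed = 0.5; // the fixed tolerance suggested above
            for (double scale : new double[] { 1.0, 1.25, 2.0, 3.0 }) {
                double perPixel = 1.0 / scale; // the tolerance as computed in Util.java
                System.out.printf("scale=%.2f  1/scale=%.4f  fixed=%.2f%n",
                        scale, perPixel, fixed);
            }
        }
    }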

-------------

Marked as reviewed by kcr (Lead).

PR Review: https://git.openjdk.org/jfx/pull/1234#pullrequestreview-1635833485
PR Review Comment: https://git.openjdk.org/jfx/pull/1234#discussion_r1331751243
