[OpenJDK 2D-Dev] Question regarding GraphicsConfiguration/DisplayMode

Roman Kennke roman at kennke.org
Tue Mar 31 20:11:45 UTC 2009


Hi Dmitri,

> > I think my mental model regarding the GD/GC/DisplayMode API is a bit
> > messed up, or the API is a bit messed up.
> 
>    It's the API that's a bit messed up.

Phew. ;-)

>    Basically, a GD represents a real graphics device - like a video board on 
> Windows, or a screen on X11 (w/o Xinerama). If your video board has two outputs, 
> then depending on the settings a GD may represent either the board or each of 
> the outputs - "physical screens".

Ok, good.
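
(To keep my own notes straight, here is a minimal sketch of how those
devices are enumerated - just the standard GraphicsEnvironment calls,
nothing platform-specific:)

    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;

    public class ListDevices {
        public static void main(String[] args) {
            // Each GD is one "physical screen" (or the whole board,
            // depending on the settings, as described above).
            GraphicsEnvironment ge =
                GraphicsEnvironment.getLocalGraphicsEnvironment();
            for (GraphicsDevice gd : ge.getScreenDevices()) {
                System.out.println(gd.getIDstring()
                        + " type=" + gd.getType());
            }
        }
    }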

>    GCs represent Visuals on X11, and pixel formats on Windows.
> 
>    It makes a little less sense on Windows, where pixel formats are only 
> relevant for creating windows that work well with OpenGL, so that third-party 
> applications can use HWNDs from AWT windows for rendering OpenGL content.
> 
>    For D3D (and, somewhat confusingly, for the built-in OpenGL pipeline) 
> we have a single GraphicsConfiguration.
> 
>    This doesn't explain some methods that crept into this class (like 
> getBounds and getNormalizingTransform). They should have been on GraphicsDevice.

Why? Isn't getBounds() used to indicate which part of the whole virtual
screen (Xinerama) a GC is representing? At least, that is what the API
docs tell me. I suppose in an ideal world we'd need getBounds() on both
the GD (the bounds of the whole virtual screen) and the GC (the bounds
of the sub-screen). Then again, in an ideal world Xinerama would be
represented by a different class than GC, I suppose...
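
(The API docs actually show this pattern: the bounds of the whole
virtual screen are the union of the getBounds() of every GC, which is
how I've been reading it. A sketch:)

    import java.awt.GraphicsConfiguration;
    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;
    import java.awt.Rectangle;

    public class VirtualBounds {
        public static void main(String[] args) {
            // Union of all GC bounds = bounds of the virtual screen.
            Rectangle virtualBounds = new Rectangle();
            GraphicsEnvironment ge =
                GraphicsEnvironment.getLocalGraphicsEnvironment();
            for (GraphicsDevice gd : ge.getScreenDevices()) {
                for (GraphicsConfiguration gc : gd.getConfigurations()) {
                    virtualBounds = virtualBounds.union(gc.getBounds());
                }
            }
            System.out.println("Virtual screen: " + virtualBounds);
        }
    }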

>    So suppose you have an X11 system with a single board with two screens (no 
> Xinerama), a 32-bit IntBgr default visual, and the board is capable of 
> simultaneously displaying windows with an 8-bit grayscale visual.
> 
>    But also imagine that this system supports changing display modes to 16 bit 
> (I hope you have a good imagination!), and different resolutions.
> 
>    Then you'll have two GDs with two GCs each (32-bit Bgr and 8-bit GrayScale).
> 
>    The DisplayMode[] will have entries for 32- and 16-bit modes, and whatever 
> different resolutions you have.

Ok, this makes sense. So in my case, where I can display windows in only
one color model simultaneously (and of course no Xinerama on this poor
embedded box), I'd only ever return one GC at any time. Good.
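
(So on your imaginary system, dumping the tree should show two GDs, each
with two GCs and a shared mode list - something like this sketch, which
I'll use to sanity-check my port:)

    import java.awt.DisplayMode;
    import java.awt.GraphicsConfiguration;
    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;

    public class DumpDevices {
        public static void main(String[] args) {
            GraphicsEnvironment ge =
                GraphicsEnvironment.getLocalGraphicsEnvironment();
            for (GraphicsDevice gd : ge.getScreenDevices()) {
                System.out.println(gd.getIDstring());
                // One GC per visual the device can display at once.
                for (GraphicsConfiguration gc : gd.getConfigurations()) {
                    System.out.println("  GC: " + gc.getColorModel());
                }
                // One DisplayMode per mode the device can switch to.
                for (DisplayMode dm : gd.getDisplayModes()) {
                    System.out.println("  Mode: " + dm.getWidth() + "x"
                            + dm.getHeight() + "x" + dm.getBitDepth()
                            + "@" + dm.getRefreshRate() + "Hz");
                }
            }
        }
    }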

> > So far I was always assuming that the GCs (as returned by
> > GD.getGraphicsConfigurations()) represent all the possible settings
> > (resolution, color model, etc) that the GD can use. BUT, then we also
> 
>    This was an incorrect assumption.

I see.

> > have GD.getDisplayModes(). And GC is also used for Xinerama support,
> > right? So what does GC represent? I am implementing on an embedded
> > platform right now, I only have one screen, but a couple of different
> > settings, and would like to know what is correct here. Is there a
> > relationship between the list of display modes and the list of GCs? If
> 
>    No. The list of display modes is the list of possible display modes the 
> graphics device can be switched to. That includes different resolutions as 
> well as bit depths. (see below for more)

Ok, so this is what I assumed the GCs are for. Good to know.
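
(And switching between them goes through full-screen exclusive mode, if
I read the docs right - a sketch, assuming the requested 800x600x16 mode
is actually in the getDisplayModes() list:)

    import java.awt.DisplayMode;
    import java.awt.Frame;
    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;

    public class ModeSwitch {
        public static void main(String[] args) {
            GraphicsDevice gd = GraphicsEnvironment
                    .getLocalGraphicsEnvironment()
                    .getDefaultScreenDevice();
            Frame frame = new Frame(gd.getDefaultConfiguration());
            frame.setUndecorated(true);
            // Modes may only be changed while a window owns the screen.
            gd.setFullScreenWindow(frame);
            if (gd.isDisplayChangeSupported()) {
                gd.setDisplayMode(new DisplayMode(800, 600, 16,
                        DisplayMode.REFRESH_RATE_UNKNOWN));
            }
        }
    }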

> > so, how are color models in DisplayMode handled? I only see a bitDepth
> > property there, but for example how to differentiate between an RGB565
> > and an RGB556 setup in the DisplayMode API?
> 
>    DisplayMode for a device represents the desktop resolution and (where 
> applicable) the bit depth.
> 
>    It makes less sense if your device can display windows with different bit 
> depths simultaneously, so that there isn't one "desktop bit depth" - which is 
> why there's DisplayMode.BIT_DEPTH_MULTI to indicate that case.

I think I mean something different. Suppose your graphics board is
capable of using two different resolutions (1024x768 and 800x600) in two
different color modes (RGB565 and RGB556), though not simultaneously. This
would make 4 DMs. The problem I see is that bitDepth is not expressive
enough to differentiate between RGB565 and RGB556; both use 16 bits.
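
(To make that concrete: both layouts would have to be described by the
very same DisplayMode value, since the class only records width, height,
bit depth and refresh rate:)

    import java.awt.DisplayMode;

    public class DepthAmbiguity {
        public static void main(String[] args) {
            // An RGB565 mode and an RGB556 mode at 800x600 both reduce
            // to this - the component layout is simply not recorded.
            DisplayMode mode = new DisplayMode(800, 600, 16,
                    DisplayMode.REFRESH_RATE_UNKNOWN);
            System.out.println(mode.getWidth() + "x" + mode.getHeight()
                    + ", " + mode.getBitDepth() + " bits");
            // The layout only shows up afterwards, in the GC's
            // ColorModel (e.g. DirectColorModel masks 0xF800/0x7E0/0x1F
            // for 565).
        }
    }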

> > Then: what is the default configuration? Is it just some random
> 
>    This is tricky.
> 
>    In _most_ cases it corresponds to the screen's default visual / pixel format.
> 
>    Except for the cases where the default visual is, say, 8-bit, but there's a 
> 32-bit visual available. This happens on older SunOS systems with CDE, where the 
> default visual is 8-bit. There are Sun adapters for SPARC which can display 
> windows with different bit depths (and palettes, and gamma correction) 
> simultaneously.

Ok, I guess when I only ever have one GC (see above) that's easy now :-)
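
(For the record, with a single GC the "default" question answers itself -
this is all I'll need:)

    import java.awt.GraphicsConfiguration;
    import java.awt.GraphicsDevice;
    import java.awt.GraphicsEnvironment;

    public class DefaultConfig {
        public static void main(String[] args) {
            GraphicsDevice gd = GraphicsEnvironment
                    .getLocalGraphicsEnvironment()
                    .getDefaultScreenDevice();
            // On a single-visual device this is the only GC there is.
            GraphicsConfiguration gc = gd.getDefaultConfiguration();
            System.out.println("Default GC: " + gc.getColorModel());
        }
    }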

> > What is (and how to use) GD.getBestConfiguration()? The
> > GraphicsConfigTemplate class seems pretty useless to me.
> 
>    It is. =) I'm not sure anyone uses this stuff.

Haha, good. :-)
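
(For completeness, here is roughly the boilerplate it takes to use it -
a made-up template that just prefers the first config - which nicely
shows why everyone calls getDefaultConfiguration() instead:)

    import java.awt.GraphicsConfigTemplate;
    import java.awt.GraphicsConfiguration;

    // Hypothetical template: you have to subclass and supply the whole
    // selection logic yourself.
    class FirstConfigTemplate extends GraphicsConfigTemplate {
        public GraphicsConfiguration getBestConfiguration(
                GraphicsConfiguration[] gc) {
            return gc.length > 0 ? gc[0] : null;
        }

        public boolean isGraphicsConfigSupported(
                GraphicsConfiguration gc) {
            return true;
        }
    }

    // Usage: gd.getBestConfiguration(new FirstConfigTemplate());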

Thanks a lot for clearing this up, Dmitri. Oh, it means I have to rework
lots of code. THANKS! ;-)

/Roman




