[OpenJDK 2D-Dev] Question regarding GraphicsConfiguration/DisplayMode

Dmitri Trembovetski Dmitri.Trembovetski at Sun.COM
Tue Mar 31 18:37:57 UTC 2009

   Hi Roman,

Roman Kennke wrote:
> I think my mental model regarding the GD/GC/DisplayMode API is a bit
> messed up, or the API is a bit messed up.

   It's the API that's a bit messed up.

   Basically, a GD represents a real graphics device - like a video board on 
Windows, or a screen on X11 (w/o Xinerama). If your video board has two outputs, 
then depending on the settings a GD may represent the whole board, or each of the 
outputs - the "physical screens".

   GCs represent Visuals on X11, and pixel formats on Windows.

   It makes a little less sense on Windows, where pixel formats are only 
relevant when creating windows that work well with OpenGL, so that third-party 
applications can use the HWNDs of AWT windows for rendering OpenGL content.

   For D3D (and, somewhat confusingly, for the built-in OpenGL pipeline) we 
have one single GraphicsConfiguration.

   This doesn't explain some methods that crept into this class (like 
getBounds and getNormalizingTransform); they should have been on GraphicsDevice.
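   One common use of the (arguably misplaced) GraphicsConfiguration.getBounds 
is computing the overall virtual-desktop rectangle across screens. A minimal, 
hypothetical sketch (class name is mine; the headless guard just keeps it 
runnable without a display):

```java
import java.awt.GraphicsConfiguration;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;

public class VirtualBounds {
    public static void main(String[] args) {
        if (GraphicsEnvironment.isHeadless()) {
            System.out.println("no screens available (headless)");
            return;
        }
        Rectangle virtual = null;
        GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
        for (GraphicsDevice gd : ge.getScreenDevices()) {
            // note: getBounds() lives on GraphicsConfiguration, not GraphicsDevice
            GraphicsConfiguration gc = gd.getDefaultConfiguration();
            Rectangle b = gc.getBounds();
            virtual = (virtual == null) ? b : virtual.union(b);
        }
        System.out.println("virtual desktop: " + virtual);
    }
}
```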

   So suppose you have an X11 system with a single board with two screens (no 
Xinerama), a 32-bit IntBgr default visual, and a board capable of 
simultaneously displaying 8-bit grayscale windows.

   But also imagine that this system supports changing display modes to 16 bit 
(I hope you have a good imagination!), and different resolutions.

   Then you'll have two GDs with two GCs each (32-bit BGR and 8-bit grayscale).

   The DisplayMode[] will have entries for 32 and 16 bit modes, and whatever 
different resolutions you have.
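   The hierarchy above can be walked directly from the API. A small sketch 
(hypothetical class name; guarded so it runs even without a display) that 
prints each device, its per-visual GCs, and its switchable display modes:

```java
import java.awt.DisplayMode;
import java.awt.GraphicsConfiguration;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

public class EnumerateScreens {
    public static void main(String[] args) {
        if (GraphicsEnvironment.isHeadless()) {
            System.out.println("headless - nothing to enumerate");
            return;
        }
        GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
        for (GraphicsDevice gd : ge.getScreenDevices()) {
            System.out.println("Device " + gd.getIDstring());
            // one GC per visual / pixel format the device can show simultaneously
            for (GraphicsConfiguration gc : gd.getConfigurations()) {
                System.out.println("  config: "
                        + gc.getColorModel().getPixelSize() + "-bit");
            }
            // display modes: resolutions/depths the device can be switched to
            for (DisplayMode dm : gd.getDisplayModes()) {
                System.out.printf("  mode: %dx%d, depth=%d, %d Hz%n",
                        dm.getWidth(), dm.getHeight(),
                        dm.getBitDepth(), dm.getRefreshRate());
            }
        }
    }
}
```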

   Xinerama is a tricky case. Most X11 applications aren't really aware of 
Xinerama presence, but Java is (so that it doesn't place windows in between the 
screens and stuff like that). But I believe we still expose a single device in 
this case.

   On Windows one can configure a multiscreen system as having two separate 
screens, or one single continuous screen - similar to xinerama vs no xinerama on 
X11 (except you can always drag windows from screen to screen on Windows).
   This is typically done through vendor drivers, but I think it's generic 
starting with Vista or W7.

> So far I was always assuming that the GCs (as returned by
> GD.getGraphicsConfigurations()) represent all the possible settings
> (resolution, color model, etc) that the GD can use. BUT, then we also

   This was an incorrect assumption.

> have GD.getDisplayModes(). And GC is also used for Xinerama support,
> right? So what does GC represent? I am implementing on an embedded
> platform right now, I only have one screen, but a couple of different
> settings, and would like to know what is correct here. Is there a
> relationship between the list of display modes and the list of GCs? If

   No. The list of display modes is the list of modes the graphics device can 
be switched to; that includes different resolutions as well as bit depths. 
(See below for more.)

> so, how are color models in DisplayMode handled? I only see a bitDepth
> property there, but for example how to differentiate between a RGB565
> and RGB556 setup in the DisplayMode API?

   DisplayMode for a device represents the desktop resolution and (where 
applicable) the bit depth.

   It makes less sense if your device can display windows with different bit 
depths simultaneously, so that there isn't one "desktop bit depth" - which is 
why there's DisplayMode.BIT_DEPTH_MULTI to indicate that case.
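   So any code reading DisplayMode.getBitDepth() has to treat BIT_DEPTH_MULTI 
as a special value rather than a real depth. A tiny sketch (class and method 
names are mine) of handling that case:

```java
import java.awt.DisplayMode;

public class DepthLabel {
    // Returns a printable depth for a mode, handling the "no single
    // desktop bit depth" case signalled by BIT_DEPTH_MULTI.
    static String depthOf(DisplayMode dm) {
        return dm.getBitDepth() == DisplayMode.BIT_DEPTH_MULTI
                ? "multiple simultaneous depths"
                : dm.getBitDepth() + "-bit";
    }

    public static void main(String[] args) {
        DisplayMode multi = new DisplayMode(1280, 1024,
                DisplayMode.BIT_DEPTH_MULTI, DisplayMode.REFRESH_RATE_UNKNOWN);
        System.out.println(depthOf(multi));   // prints "multiple simultaneous depths"
        System.out.println(depthOf(new DisplayMode(1280, 1024, 32, 60)));  // prints "32-bit"
    }
}
```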

> Then: what is the default configuration? Is it just some random

   This is tricky.

   In _most_ cases it corresponds to the screen's default visual / pixel format.

   Except for the cases where the default visual is, say, 8-bit, but a 32-bit 
visual is available. This happens on older SunOS systems with CDE, where the 
default visual is 8-bit. There are Sun adapters for SPARC which can display 
windows with different bit depths (and palettes, and gamma correction).

   (BTW, you can force the default visual by using the FORCEDEFVIS env. variable.)

> configuration that is considered the default? Is it the one currently in
> use? Does it relate to getDisplayMode()?

   You can think of it as representing the current device's resolution and bit 
depth (where applicable).
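   In code terms: GraphicsDevice.getDisplayMode() reports the mode the device 
is currently in, and the default GC's color model usually (not always, per the 
SunOS/CDE caveat above) matches its depth. A sketch, with a headless guard so 
it runs anywhere:

```java
import java.awt.DisplayMode;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

public class CurrentMode {
    public static void main(String[] args) {
        if (GraphicsEnvironment.isHeadless()) {
            System.out.println("headless");
            return;
        }
        GraphicsDevice gd = GraphicsEnvironment.getLocalGraphicsEnvironment()
                                               .getDefaultScreenDevice();
        // the mode the device is currently switched to
        DisplayMode current = gd.getDisplayMode();
        System.out.printf("current mode: %dx%d, depth=%d%n",
                current.getWidth(), current.getHeight(), current.getBitDepth());
        // the default GC's depth - in most cases the same as above
        System.out.println("default GC depth: "
                + gd.getDefaultConfiguration().getColorModel().getPixelSize());
    }
}
```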

> What is (and how to use) GD.getBestConfiguration()? The
> GraphicsConfigTemplate class seems pretty useless to me.

   It is. =) I'm not sure anyone uses this stuff.


> Thanks in advance for your hints,
> Roman
