I'm working on a libGDX (a library on top of LWJGL) game project, and I use the IntelliJ IDEA IDE from several different workstations:
- Windows 7 x64 laptop with two displays (1920x1080 and 1600x1200), nVidia GT540M.
- Ubuntu 12.04 LTS on a laptop with a single display (1366x768), Intel integrated graphics.
- Ubuntu 12.04 LTS on a desktop with two displays (1920x1080 and 1280x1024), nVidia GTS 450.
I'm using OpenJDK Java 6 on the Ubuntu boxes and Sun/Oracle Java 6 on the Windows box (I've heard Java 6 is the one to use for Android compatibility).
When running full-screen:
- Windows 7 laptop: works fine.
- Ubuntu laptop: works fine.
- Ubuntu desktop: the background image is shown enlarged, and only part of it fits on the screen.
Looking into this further, I see that calls to Gdx.graphics.getWidth() and Gdx.graphics.getHeight() return the size of the rectangle needed to cover both displays (in my case, 3200x1080). But when I tell my game to run full-screen, it only uses one of the displays, so the camera gets set to 1920x1080 while my camera movement and Tiled map panning think they have 3200x1080 to work with. This makes things distorted and unusable, since the character can walk off the right edge of the screen.
I'm guessing my problem actually comes from the sizes returned by the awt.Toolkit's getScreenSize() call, but I don't know how to interrogate it more deeply to get the size of the screen it will actually use when I go fullscreen.
My DesktopStarter gets the screen size and sets fullscreen as follows:
import java.awt.Dimension;
import java.awt.Toolkit;
import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
// On a multi-monitor X11 setup this is the bounding box of ALL displays
// (e.g. 3200x1080), not the size of the display fullscreen will use.
Dimension screenDimension = Toolkit.getDefaultToolkit().getScreenSize();
cfg.width = screenDimension.width;
cfg.height = screenDimension.height;
cfg.fullscreen = true;
new LwjglApplication(new Game1(), cfg);
Is there a workaround to get the height/width of just the display that "full screen" will actually launch into?
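One possible workaround (a sketch of my own, not something from the libGDX docs): java.awt.GraphicsEnvironment can report the display mode of the default (primary) screen device separately, unlike Toolkit.getScreenSize(), which returns the combined virtual-desktop bounds on Xinerama/TwinView setups. Whether LWJGL actually goes fullscreen on that same device is an assumption that would need verifying on the desktop box:

```java
import java.awt.DisplayMode;
import java.awt.GraphicsEnvironment;

public class PrimaryDisplaySize {
    // Returns the mode of the primary display only, not the bounding
    // box of the whole virtual desktop that Toolkit.getScreenSize() gives.
    public static String describePrimary() {
        if (GraphicsEnvironment.isHeadless()) {
            return "headless: no displays";
        }
        DisplayMode mode = GraphicsEnvironment.getLocalGraphicsEnvironment()
                .getDefaultScreenDevice()
                .getDisplayMode();
        return mode.getWidth() + "x" + mode.getHeight();
    }

    public static void main(String[] args) {
        System.out.println(describePrimary());
    }
}
```

On the dual-head desktop this should print 1920x1080 rather than 3200x1080, and those values could then be fed into cfg.width/cfg.height before creating the LwjglApplication.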
The other trouble I'm seeing is that executing the game.jar file, exiting the game, and executing it again, repeatedly, results in a different set of display modes showing up in the list returned by Gdx.graphics.getDisplayModes() (as P.T. pointed out below, this is a thin wrapper around LWJGL's Display.getAvailableDisplayModes()). Why is this happening? Why would a different set of modes be presented on subsequent runs on Ubuntu?
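Until the root cause is found, one defensive option is to avoid trusting the raw mode list: filter it against the desktop size of the display you expect to use and pick the largest mode that fits, so a flaky or overlong list can never hand back the 3200x1080 virtual-desktop bounds. A minimal pure-Java sketch of that selection logic (Mode here is a hypothetical stand-in for org.lwjgl.opengl.DisplayMode, and the desktop dimensions are assumed to come from something like Display.getDesktopDisplayMode()):

```java
import java.util.Arrays;
import java.util.Comparator;

public class ModePicker {
    // Hypothetical stand-in for org.lwjgl.opengl.DisplayMode, for illustration.
    public static final class Mode {
        public final int w, h, hz;
        public Mode(int w, int h, int hz) { this.w = w; this.h = h; this.hz = hz; }
    }

    // Pick the largest mode (by area, then refresh rate) that fits within the
    // desktop mode of the target display, ignoring any bogus oversized entries.
    public static Mode pickBest(Mode[] available, int desktopW, int desktopH) {
        return Arrays.stream(available)
                .filter(m -> m.w <= desktopW && m.h <= desktopH)
                .max(Comparator.<Mode>comparingInt(m -> m.w * m.h)
                        .thenComparingInt(m -> m.hz))
                .orElse(available[0]);
    }

    public static void main(String[] args) {
        Mode[] modes = {
            new Mode(3200, 1080, 60), // bogus virtual-desktop "mode"
            new Mode(1920, 1080, 60),
            new Mode(1366, 768, 60),
        };
        Mode best = pickBest(modes, 1920, 1080);
        System.out.println(best.w + "x" + best.h + "@" + best.hz); // 1920x1080@60
    }
}
```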
Edit: per P.T.'s suggestion, I've added LWJGL references to the question, since it seems to be LWJGL that's providing the list of display modes.
Thanks!
Comments:
- Does Gdx.graphics.getDisplayModes() return a more consistently useful result? (A bit of a chicken-and-egg, as you have to initialize libGDX before you can query this, though.) – Shaum
- Gdx.graphics.getDisplayModes() is sporadic on my Ubuntu workstation. Quite often, of the 43 modes it reports, 1920x1080 isn't listed at all. Yet sometimes it's listed with several different refresh rates? Frustrating. – Autoionization
- Perhaps you might retag/rephrase this as an LWJGL question? (And you can probably use the LWJGL APIs, e.g. org.lwjgl.opengl.DisplayMode, before libGDX is initialized...) – Shaum