Java and libGDX / LWJGL game fullscreen wrong size for multiple monitors on Ubuntu
I'm working on a libGDX (a library on top of LWJGL) game project, and I use the IntelliJ IDEA IDE from several different workstations:

  • Windows 7 x64 laptop with two displays (1920x1080 and 1600x1200), nVidia GT540M.
  • Ubuntu 12.04 LTS on a laptop with a single display (1366x768), Intel integrated graphics.
  • Ubuntu 12.04 LTS on a desktop with two displays (1920x1080 and 1280x1024), nVidia GTS 450.

I'm using the OpenJDK for Java 6 on the Ubuntu boxes, and Sun/Oracle Java 6 on the Windows box (I heard Java 6 was the one to use for Android compatibility).

When running full-screen:

  • Windows 7 laptop: works fine.
  • Ubuntu laptop: works fine.
  • Ubuntu desktop: the background image is shown enlarged, and only part of it fits on the screen.

Looking into this further, I see that the calls to Gdx.graphics.getHeight() and Gdx.graphics.getWidth() return the size of the rectangle needed to cover both displays, in my case 3200x1080. But when I tell my game to run full-screen, it only uses one of the displays, so the cameras get set to 1920x1080 while my camera movement and Tiled map panning think they have 3200x1080 to work with, making things distorted and unusable (the character can walk off the right edge of the screen).

I'm guessing my problem actually comes from the sizes returned by the AWT Toolkit's getScreenSize() call, but I don't know how to interrogate it more deeply to get the size of the screen that will actually be used when I go fullscreen.

My DesktopStarter gets the screen size and sets fullscreen as follows:

    import java.awt.Dimension;
    import java.awt.Toolkit;
    import com.badlogic.gdx.backends.lwjgl.LwjglApplication;
    import com.badlogic.gdx.backends.lwjgl.LwjglApplicationConfiguration;

    LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
    // getScreenSize() reports the combined virtual desktop (3200x1080 here), not one monitor
    Dimension screenDimension = Toolkit.getDefaultToolkit().getScreenSize();
    cfg.width = screenDimension.width;
    cfg.height = screenDimension.height;
    cfg.fullscreen = true;
    new LwjglApplication(new Game1(), cfg);

Is there a work-around to get the height/width of just the display that "full screen" will actually launch into?

So the trouble I'm seeing is that executing the game.jar file, exiting the game, and then executing it again, repeatedly, results in a different set of display modes showing up in the list returned by Gdx.graphics.getDisplayModes() -- as P.T. pointed out below, this is a thin wrapper around LWJGL's Display.getAvailableDisplayModes(). Why is this happening? Why would a different set of modes be presented on subsequent runs on Ubuntu?

Edit: per P.T.'s suggestion, I've added LWJGL references to the question, since it seems to be LWJGL that's providing the list of display modes.
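
For reference, here's a minimal stand-alone probe (just a sketch; the class name is arbitrary) that dumps what LWJGL 2 reports, so the output of repeated runs can be compared:

    import org.lwjgl.LWJGLException;
    import org.lwjgl.opengl.Display;
    import org.lwjgl.opengl.DisplayMode;

    public class ModeProbe {
        public static void main(String[] args) throws LWJGLException {
            // What LWJGL thinks the desktop itself is running at
            DisplayMode desktop = Display.getDesktopDisplayMode();
            System.out.println("desktop: " + desktop.getWidth() + "x" + desktop.getHeight()
                    + " @" + desktop.getFrequency() + "Hz");

            // Every mode LWJGL reports as settable; compare this list between runs
            for (DisplayMode mode : Display.getAvailableDisplayModes()) {
                System.out.println(mode.getWidth() + "x" + mode.getHeight()
                        + " @" + mode.getFrequency() + "Hz, " + mode.getBitsPerPixel() + " bpp");
            }
        }
    }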

Thanks!

Autoionization answered 9/5, 2013 at 1:14 Comment(7)
Does Gdx.graphics.getDisplayModes() return a more consistently useful result? (A bit of a chicken-and-egg problem, as you have to initialize libGDX before you can query this, though.)Shaum
@Shaum Thanks! While not directly useful by itself (it simply returns a list of all supported display modes, presumably as reported by the JRE's interface to the hardware), I used that hint and set up a loop in my Game.Create() to spin through all modes, selecting the highest resolution (only up to 1920x1200), then the one with the highest refresh rate at that resolution. That selects a playable default unless two (or more) extremely low-resolution monitors are used (such that their widths add up to 1920 or less). Add that as an answer, and I'll mark it -- unless something better comes along. ;)Autoionization
Actually, after playing with this a bit more, it turns out Gdx.graphics.getDisplayModes() is sporadic on my Ubuntu workstation. Quite often, of the 43 modes it reports, 1920x1080 isn't listed at all. Yet sometimes it's listed with several different refresh rates. Frustrating.Autoionization
The source (github.com/libgdx/libgdx/blob/master/backends/gdx-backend-lwjgl/…) shows libGDX is just using org.lwjgl.opengl.DisplayMode; you might retag/rephrase this as an LWJGL question? (And you can probably use the LWJGL APIs before libGDX is initialized...)Shaum
Updated question and tags, per suggestion. ThanksAutoionization
Not a programming answer, but this is an ongoing issue with most Linux games and multiple monitors; it's one of the reasons alternatives to X like Wayland are being worked on. Please offer a way to override resolution detection on the command line or in a config file, just to make sure Linux multi-monitor people can play your game :)Inexpressive
@Torp: Thanks for the comment. Yeah, I've found similar info around the 'net, and more importantly, I've found quite a few apps that misbehave on my multi-monitor Ubuntu setup. For now, I've switched back to using Windows on my development box (I was able to re-use the license that was on the box prior to its hardware refresh / Ubuntu install). I'll still test on Ubuntu via my i5 laptop, but don't feel like dealing with it on a daily basis with my dev workstation. I'll be leaving in my command-line processing to select resolution that way, just in case...Autoionization
import java.awt.Dimension;
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;

// Query each attached monitor's current display mode via AWT
GraphicsDevice[] monitors = GraphicsEnvironment.getLocalGraphicsEnvironment().getScreenDevices();

Dimension[] screenResolution = new Dimension[monitors.length];

for (int monitorID = 0; monitorID < monitors.length; monitorID++)
{
    screenResolution[monitorID] = new Dimension(monitors[monitorID].getDisplayMode().getWidth(),
                                                monitors[monitorID].getDisplayMode().getHeight());
}
Ironist answered 31/8, 2013 at 19:52 Comment(0)

I would refrain from using Toolkit.getDefaultToolkit() and rely solely on lwjgl.util.Display.getAvailableDisplayModes() or the method described by libGDX.

Once you have set up a fullscreen window, fetch its size (if your set-up method doesn't already know it) and use only that information from then on.

If Display.getAvailableDisplayModes() changes its sort order on different executions, simply re-sort the modes and use the biggest one available, or use a standard one and provide in-game settings to change it.
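
A minimal sketch of that re-sorting idea, assuming the 2013-era libGDX/LWJGL desktop backend (the class and method names here are just for illustration):

import java.util.Arrays;
import java.util.Comparator;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Graphics.DisplayMode;

public class BestModePicker {
    /** Returns the largest reported mode, preferring higher refresh rates at equal size. */
    public static DisplayMode pickLargestMode() {
        DisplayMode[] modes = Gdx.graphics.getDisplayModes();
        Arrays.sort(modes, new Comparator<DisplayMode>() {
            @Override
            public int compare(DisplayMode a, DisplayMode b) {
                int byArea = (b.width * b.height) - (a.width * a.height);
                return byArea != 0 ? byArea : b.refreshRate - a.refreshRate;
            }
        });
        return modes[0];
    }
}

With the old (pre-1.0) backend the chosen mode can then be applied with Gdx.graphics.setDisplayMode(mode).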

Avidin answered 3/9, 2013 at 13:11 Comment(0)

You could use javafx.stage.Screen to work around this issue, but you'd either have to add JavaFX if you stick with Java 6, or upgrade to Java 7 (it's included with the Java 7 JDK).

From the documentation:

"Describes the characteristics of a graphics destination such as monitor. In a virtual device multi-screen environment in which the desktop area could span multiple physical screen devices, the bounds of the Screen objects are relative to the Screen.primary.

For example:

Rectangle2D primaryScreenBounds = Screen.getPrimary().getVisualBounds();

// set Stage boundaries to visible bounds of the main screen
stage.setX(primaryScreenBounds.getMinX());
stage.setY(primaryScreenBounds.getMinY());
stage.setWidth(primaryScreenBounds.getWidth());
stage.setHeight(primaryScreenBounds.getHeight());

stage.show();"

There are a variety of other methods (getBounds(), getDpi(), getScreens(), etc.) that would also likely be very useful for you.

Ridgepole answered 13/8, 2013 at 22:3 Comment(1)
Harvath II -- Thanks for the reply. I'll see if I can make any of that work within the context of my libGDX game -- main issue may be in supporting Android... I'd like to avoid platform dependencies as much as possible. I'll give it a try and see what I can make work.Autoionization

Maybe get the dimensions of the monitor you're currently using; on differently sized monitors, just query their dimensions. You may have to do things differently for different monitor sizes.

Maybe the background image is enlarged because the game thinks it's still working with the other screens.

Hope this helps; if not, good luck.

Empennage answered 9/9, 2013 at 21:46 Comment(0)

I would refrain from using LWJGL completely. It may seem simple enough on the surface, but as you try to add more advanced features, you'll realize that a wrapper over OpenGL won't give you that functionality, or it'll make the program 3x slower. I suggest moving to C++ and OpenGL (if this is an option). You will be surprised at its simplicity. Otherwise, cross-platform libraries generally do have these sorts of glitches.

To solve your problem, you need to stop relying on the library to determine your viewport resolutions. Instead, enumerate all the possible display modes and pick the best one according to your own criteria (see the sketch below). This will result in much more defined behavior and will be easier to debug than relying on a default config that appears as if by magic. In general, you always want to be sure you know what data you are using. Even worse, I just noticed that you are using a wrapper over a wrapper. This over-complicates things and will slow down rendering much more than you can imagine (20x on average).
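
A sketch of that enumerate-and-pick approach using plain LWJGL 2 (the class name and the selection criteria here are just illustrative assumptions):

import org.lwjgl.LWJGLException;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public final class FullscreenModePicker {
    /** Picks the largest fullscreen-capable mode reported by LWJGL and applies it. */
    public static void applyBestFullscreenMode() throws LWJGLException {
        DisplayMode best = null;
        for (DisplayMode mode : Display.getAvailableDisplayModes()) {
            if (!mode.isFullscreenCapable()) {
                continue; // skip modes the driver won't allow in fullscreen
            }
            if (best == null
                    || mode.getWidth() * mode.getHeight() > best.getWidth() * best.getHeight()
                    || (mode.getWidth() == best.getWidth()
                        && mode.getHeight() == best.getHeight()
                        && mode.getFrequency() > best.getFrequency())) {
                best = mode;
            }
        }
        if (best != null) {
            Display.setDisplayModeAndFullscreen(best);
        }
    }
}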

Piloting answered 10/9, 2013 at 5:13 Comment(5)
Yes, every wrapper over a wrapper always slows down everything 20 times on average. Using just a wrapper makes it only 3 times slower. Those are straight facts. And a wrapper can NEVER offer the same functionality as the thing being wrapped. Only C++ is the one and only true language, and it is so simple that a 3-year-old kid could make games with it. -1Sternmost
Not to mention C++ is automatically better and faster than Java! The internet said it is, so obviously I should believe it! Christian, unless you have sources to confirm your wildly inaccurate claims, please do not try to confuse the OP and give him unhelpful lies. -1Herrah
I'm not saying that all wrappers are terrible, but in this particular instance, libGDX is monstrously bad. As an example, look at this comparison:Gora
I'd like to make it very clear that I have used OpenGL natively, used LWJGL, and used libGDX for various activities. LWJGL is slower than C++ simply because of its memory management. java.nio.Buffers may alleviate some of the problem areas, but not all of it. Try rapidly creating interleaved vertex data in pure Java - not as easy, is it? And as for libGDX... it exposes very little control over its drawing routines (similar to XNA). It decides to just throw its hands up in the air and say: "Here you go, a handle to the wrapped LWJGL context, do with it what you want."Gora
"Not to mention C++ is automatically better and faster than Java!" -> Java runs in a JVM that s built using C++. Non-argument, Java can never be faster than C++. It's like trying to say that a C++ program will be faster than writing optimized machine code. Also, if you enjoy trolling these types of websites (judging by the anon username), I would highly encourage that you stop.Gora

Here

http://www.lwjgl.org/wiki/index.php?title=LWJGL_Basics_5_%28Fullscreen%29

you will find a convenience method for LWJGL to choose a suitable full-screen display mode for your desktop. You can use the same method to determine the resolution as well and use it for your GL operations later.

Connieconniption answered 18/3, 2014 at 6:17 Comment(0)

I am not sure why, but it worked for me. Just set the size after "starting" the game, as in the following code:

public static void main(String[] args) {
    LwjglApplicationConfiguration cfg = new LwjglApplicationConfiguration();
    cfg.title = "Palmeiras!!";
    cfg.vSyncEnabled = false;
    cfg.forceExit = true;
    cfg.allowSoftwareMode = true;
    new LwjglApplication(new Palmeiras(null), cfg);

    Gdx.graphics.setDisplayMode(Gdx.graphics.getDesktopDisplayMode().width,
            Gdx.graphics.getDesktopDisplayMode().height, true);
}

Let me know if it worked for you.

Conference answered 13/9, 2014 at 0:40 Comment(0)

I would refrain from using Toolkit.getDefaultToolkit() and use solely lwjgl.util.Display.getAvailableDisplayModes() or the method described by libgdx.

Once you have set up a fullscreen window, fetch its size (if your set-up method doesn't already know that) and only use this information from thereon.

If Display.getAvailableDisplayModes() changes its sort order on different executions, simply re-sort them and use the biggest one available or use a standard one and provide in-game settings to change them.

Paola answered 23/4, 2014 at 12:47 Comment(1)
If you're going to copy my answer, please at least retain the formatting.Avidin
