libGDX 1.9.6 released


Here’s what’s new (plus a ton of bug fixes, see the commit logs on GitHub).

- Fixed a performance regression in the LWJGL3 backend by using java.nio instead of BufferUtils. The java.nio buffer methods are HotSpot intrinsics and quite a bit faster than BufferUtils.
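For illustration, the java.nio pattern the bullet above refers to boils down to allocating direct, native-ordered buffers via the standard library. This is a self-contained sketch of that allocation pattern, not the backend's actual code:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class NioBufferExample {
    // Allocate a direct, native-ordered FloatBuffer using plain java.nio.
    // Calls like ByteBuffer.put/get are HotSpot intrinsics, which is where
    // the speedup over a utility-class copy loop comes from.
    static FloatBuffer newFloatBuffer(int numFloats) {
        ByteBuffer bb = ByteBuffer.allocateDirect(numFloats * 4) // 4 bytes per float
                                  .order(ByteOrder.nativeOrder());
        return bb.asFloatBuffer();
    }

    public static void main(String[] args) {
        FloatBuffer vertices = newFloatBuffer(6);
        vertices.put(new float[] {0f, 0f, 1f, 0f, 0.5f, 1f});
        vertices.flip(); // prepare for reading by the GL backend
        System.out.println(vertices.remaining()); // prints 6
    }
}
```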
- Updated to latest Sound Manager 2
- Added mappings for Xbox 360 controller for Linux
- Separated error log for vertex/fragment shaders for easier debugging
- Minimum Android API level is now level 9 (Android 2.3)
- API addition: Configurable TexturePacker bleed iterations
- Updated iOS Multi-OS Engine backend to 1.3.0-beta-2
- API Change: Pixmap.setBlending, Pixmap.setFilter are now instance methods
- VertexAttribute expert constructors exposed. Short types can now be used for attributes.
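The Pixmap change above in practice: setBlending and setFilter used to be static and affected global state; they now act on a single Pixmap instance. A short Java sketch (it needs the libGDX jar on the classpath, and assumes the nested Blending/Filter enums keep their familiar names):

```java
import com.badlogic.gdx.graphics.Pixmap;

Pixmap pixmap = new Pixmap(64, 64, Pixmap.Format.RGBA8888);

// Before 1.9.6 these were static calls on the class:
// Pixmap.setBlending(Pixmap.Blending.None);
// Pixmap.setFilter(Pixmap.Filter.NearestNeighbour);

// From 1.9.6 on, blending and filtering are configured per instance:
pixmap.setBlending(Pixmap.Blending.None);
pixmap.setFilter(Pixmap.Filter.NearestNeighbour);
```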

Please update your projects as usual.

If you use MobiDevelop’s RoboVM fork make sure to update your Eclipse/IntelliJ IDEA/Android Studio plugin to version 2.3.0.

If you use the GWT plugin in Eclipse and don’t run your GWT project via Gradle, make sure you update your GWT plugin to version 2.8.0.


In other news, I've been working on some VR-related projects recently. Our friends from LWJGL have created a wrapper for OpenVR, Valve's SDK for talking to all kinds of VR hardware, from Oculus to the HTC Vive. I've built a libGDX-specific wrapper on top of the OpenVR bindings, which should make working on VR projects with libGDX a little simpler.

You can find the repository, including a sample application, on GitHub.

There are a few caveats. OpenVR currently only really works on Windows; on Mac it appears to be broken. Valve released Linux support a few weeks ago, which I haven't had time to test yet, but it should work out of the box.

Another caveat is the way rendering is currently performed. A separate framebuffer is created for each eye, and the scene is rendered once per framebuffer, as in the example, using a separate camera/projection matrix for each eye, essentially drawing the scene twice. With an 11 ms/frame budget, that can be problematic if you need thousands of draw calls to render the scene for one eye, as it doubles the CPU/driver work in the VR case. Ideally, you issue only one batch of draw calls for both eyes, using instancing and a simple trick involving clipping planes in the vertex shader to redirect the rendering to either the left or right half of a combined framebuffer. Long story short: gdx-vr plus the libGDX 3D API will take you a long way, but for more complex stuff, you'll have to get your hands dirty with custom rendering.
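To make the "draw the scene twice" point concrete, here is a small self-contained Java sketch of the per-eye camera step: the camera position is shifted half the interpupillary distance along its right vector for each eye, and the scene is then rendered once per offset camera. The names and the IPD value are illustrative only, not gdx-vr API:

```java
public class StereoEyeSketch {
    // Half the interpupillary distance in world units (illustrative value).
    static final float HALF_IPD = 0.032f;

    // Shift a camera position along its right vector for one eye.
    // eye = -1 for the left eye, +1 for the right eye.
    static float[] eyePosition(float[] camPos, float[] right, int eye) {
        float s = eye * HALF_IPD;
        return new float[] {
            camPos[0] + right[0] * s,
            camPos[1] + right[1] * s,
            camPos[2] + right[2] * s
        };
    }

    public static void main(String[] args) {
        float[] pos   = {0f, 1.7f, 0f}; // head position
        float[] right = {1f, 0f, 0f};   // camera right vector
        float[] leftEye  = eyePosition(pos, right, -1);
        float[] rightEye = eyePosition(pos, right, +1);
        // The renderer would now draw the whole scene once from each of
        // these positions, into each eye's framebuffer: twice the draw calls.
        System.out.println(leftEye[0] + " " + rightEye[0]);
    }
}
```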

The framebuffers allocated per eye aren't set up to use MSAA at the moment. The problem with MSAA is that it doesn't play well with deferred rendering (not a problem for libGDX's current 3D API, which is a basic forward renderer).

The final caveat is that the project hasn't been released to Maven Central yet and hardcodes a dependency on LWJGL 3.1.2-SNAPSHOT. Once LWJGL 3.1.2 is released, I'll also release gdx-vr to Maven Central, with its version synced to the libGDX version that uses LWJGL 3.1.2. For now, you can import gdx-vr as a Maven project into the IDE of your choice, play around with the sample, or integrate it into your own project.

I haven't made this an official extension yet, as the implementation currently only supports desktop systems. I've received a Google Daydream and am looking into abstracting the gdx-vr API enough to support multiple backends, like any good extension citizen.

The minimal example looks like this:
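As a Java-flavored sketch of the structure described above (the class and method names here, such as VRContext, pollEvents(), and beginEye(), are assumptions and may not match the actual gdx-vr API; see the repository sample for the real thing):

```java
// Sketch only: render the scene once per eye into that eye's framebuffer,
// then hand both framebuffers to the OpenVR compositor.
public class HelloVR extends ApplicationAdapter {
    VRContext context;

    @Override
    public void create() {
        // Connects to the headset via OpenVR and creates the per-eye framebuffers.
        context = new VRContext();
    }

    @Override
    public void render() {
        context.pollEvents();           // headset and controller events

        context.begin();                // wait for the current HMD pose
        renderEye(VRContext.Eye.Left);  // draw the scene once per eye...
        renderEye(VRContext.Eye.Right); // ...with that eye's camera
        context.end();                  // submit both eyes to the compositor
    }

    void renderEye(VRContext.Eye eye) {
        context.beginEye(eye);
        // draw your scene here using the eye's camera/projection matrix
        context.endEye();
    }
}
```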

You can find a more elaborate sample with teleportation controls here.

Happy coding,
The libGDX team
