OpenGL ES 1.x, should we ditch it?

OpenGL ES 1.x has been around for a while. On Android it has been superseded by OpenGL ES 2.0 (supported by 99% of all devices) and, more recently, OpenGL ES 3.0. On the desktop, our OpenGL ES 2.0 emulation requires a graphics card supporting desktop OpenGL 2.1, released in 2006 and safely assumed to be available to pretty much everyone. WebGL is essentially OpenGL ES 2.0; 1.x will not be supported. And iOS is OpenGL ES 2.0 everywhere as well.

Removing OpenGL ES 1.x has a few benefits:

  • Removal of a lot of legacy code in the Android backend
  • Simplification of existing APIs
  • OpenGL ES 2.0 as the default; power-of-two texture restrictions wouldn’t confuse newcomers anymore (though you might still want to use power-of-two sizes with GLES 2.0!)
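To illustrate the power-of-two point: even under GLES 2.0, non-power-of-two textures come with restrictions (no mipmaps, clamp-to-edge wrapping only), so padding sizes up is still common practice. A small hypothetical helper (PotUtil is not part of libGDX):

```java
// Hypothetical helper for checking and padding texture sizes.
// Even under GLES 2.0, NPOT textures are restricted (no mipmaps,
// clamp-to-edge wrapping only), so padding up is still useful.
class PotUtil {
    // True if n is a positive power of two (e.g. 256, 512).
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    // Smallest power of two >= n, e.g. for sizing a texture atlas page.
    static int nextPowerOfTwo(int n) {
        if (n <= 1) return 1;
        return Integer.highestOneBit(n - 1) << 1;
    }
}
```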

Of course there are drawbacks:

  • Legacy apps relying on GLES 1.x couldn’t use the latest libgdx
  • Some tutorials on the web might get outdated

What do you think?

72 thoughts on “OpenGL ES 1.x, should we ditch it?”

  1. They can just use older versions of the library. There’s no sense in keeping older legacy code just to provide support for obsolete devices.

  2. I’m currently porting an HTML5 game to libGDX for Android.
    The game was written with y pointing downward.
    For me, as a beginner with OpenGL, it was easier to set up an OpenGL1 camera with y downward: it seemed impossible, or too difficult, with OpenGL2.
    A few months later I gained experience, refactored the code to use Scene2D, and was forced to translate all y draws to y-upward.
    So I remembered this post and figured it was time to move to OpenGL2: nothing keeps me on OpenGL1 anymore.
    Works fine on the desktop.

    But here is what I get when running on my ART-enabled Nexus 5:
    (no problem with GL1… switching to useGL2 and it crashes):

    02-07 21:09:27.972: A/art(6691): art/runtime/] JNI DETECTED ERROR IN APPLICATION: capacity must be greater than 0: 0
    02-07 21:09:27.972: A/art(6691): art/runtime/] in call to NewDirectByteBuffer
    02-07 21:09:27.972: A/art(6691): art/runtime/] from java.nio.ByteBuffer com.badlogic.gdx.utils.BufferUtils.newDisposableByteBuffer(int)
    02-07 21:09:27.972: A/art(6691): art/runtime/] "GLThread 632" prio=5 tid=14 Runnable
    02-07 21:09:27.972: A/art(6691): art/runtime/] | group="main" sCount=0 dsCount=0 obj=0x655a38c8 self=0x4414a880
    02-07 21:09:27.972: A/art(6691): art/runtime/] | sysTid=6739 nice=0 cgrp=apps sched=0/0 handle=0x4414ab80
    02-07 21:09:27.972: A/art(6691): art/runtime/] | state=R schedstat=( 49579528 20558077 96 ) utm=2 stm=2 core=3 HZ=100
    02-07 21:09:27.972: A/art(6691): art/runtime/] | stack=0x48b57000-0x48b5b000 stackSize=1040KB
    02-07 21:09:27.972: A/art(6691): art/runtime/] native: art::Thread::DumpStack(std::ostream&) const+87 [0x4166cedc] (
    02-07 21:09:27.972: A/art(6691): art/runtime/] native: ??? [0x4151a83e] (
    02-07 21:09:27.972: A/art(6691): art/runtime/] native: art::JniAbortF(char const*, char const*, …)+51 [0x4151b1f4] (
    02-07 21:09:27.972: A/art(6691): art/runtime/] native: ??? [0x415263e4] (
    02-07 21:09:27.972: A/art(6691): art/runtime/] native: Java_com_badlogic_gdx_utils_BufferUtils_newDisposableByteBuffer+44 [0x48b3e99c] (
    02-07 21:09:27.972: A/art(6691): art/runtime/] at com.badlogic.gdx.utils.BufferUtils.newDisposableByteBuffer(Native method)
    02-07 21:09:27.972: A/art(6691): art/runtime/] at com.badlogic.gdx.utils.BufferUtils.newUnsafeByteBuffer(
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at
    02-07 21:09:27.972: A/art(6691): art/runtime/] at android.opengl.GLSurfaceView$GLThread.guardedRun(
    02-07 21:09:27.972: A/art(6691): art/runtime/] at android.opengl.GLSurfaceView$

    Can it be solved?
    Please make sure such issues are not there anymore when ditching OpenGL1!
    I’d love to abandon OpenGL1, since it would mean less code for you to maintain, fewer bugs for us, and a cleaner API.
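For reference, the abort above comes from JNI’s NewDirectByteBuffer being called with a capacity of 0, which ART rejected at the time, while desktop JVMs accept a zero-capacity direct buffer just fine. A plain-Java sketch of the kind of defensive guard an application could apply (the BufferGuard class and newSafeBuffer name are hypothetical, not libGDX API):

```java
import java.nio.ByteBuffer;

// Illustration only: ART aborted on NewDirectByteBuffer with capacity 0,
// while desktop JVMs allow zero-capacity direct buffers. A hypothetical
// guard simply never requests a zero-byte buffer.
class BufferGuard {
    static ByteBuffer newSafeBuffer(int capacity) {
        // Clamp to at least 1 byte to avoid the zero-capacity abort.
        return ByteBuffer.allocateDirect(Math.max(1, capacity));
    }
}
```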

  3. OK, I dug a little deeper into the OpenGL2 crash on ART, and I see why you won’t be fixing it:

    I will wait for Google to fix the ART issue.
    Until then, I will stick with OpenGL1.

    I rely on ART to have a speedy Android and make my battery last longer.
    And more importantly, I don’t want to release my game and get negative reviews/ratings just because some people activated ART.

  4. There’s no difference in terms of camera handling between OpenGL 1.x and 2.0 if you use Scene2D/SpriteBatch.

    The ART error is known, Google is actually fixing it, which is why we don’t take action.

  5. Relying on ART is a terrible idea at this point as it’s quite broken. Wait for it to become the default.

  6. You’re right: I will test the game without ART from time to time before releasing it.
    It would be bad to get negative reviews because it crashes when not using ART 🙂

    But from a user point of view, ART is very stable and I appreciate its speed and battery saving (maybe a part of it is psychological 😉 but a part of it is real too).
    I only once got a crashy application, but it was a root application and it was fixed right away.
    Since then, I completely forgot I’m using an experimental feature.

    Well, to get back to the point of this blog entry: I understand your point of view regarding ART, and I’m OK with you ditching OpenGL1.

    And as for the camera issue, I don’t exactly recall what the problem was.
    I thought OrthographicCamera needed OpenGL1 a few months ago, and I was surprised it worked fine with OpenGL2 yesterday.
    Perhaps it was a problem with an older version of libGDX, or perhaps I was a newbie with OpenGL/libGDX and used it incorrectly.
    But don’t bother: I don’t have the time and/or knowledge in this area to recall what the exact problem was: it’s working fine now.

    And by the way: I love libGDX.
    It’s feature-rich and has very good documentation and a great community.
    I don’t think I would have started making an Android game if libGDX was not there: good job!
    I would have given up at the stage of “oh, the Canvas is very slow, and learning OpenGL is too time-consuming”.

  7. Finally, I found the code I was using:

    // Set inverted camera to have y down
    GL10 gl = Gdx.graphics.getGL10();
    camera.setToOrtho(true, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
    camera.apply(gl);

    This is what kept me on OpenGL1 a few months ago when I ported my application from HTML5 to libGDX: I wanted a very simple conversion, without too much refactoring.
    So I was not using Scene2D at all, and I used that camera trick to keep y downward.
    It was easier to begin with OpenGL/libGDX that way while learning it step by step.

    Would this GL1 camera trick be easier for other newcomers to libGDX, like it was for me?
    Would a full-GL2 libGDX discourage other newcomers?
    Or would it be easy for you to make camera.apply(gl) work for OpenGL2 if you ditch OpenGL1?

    Fast-forward to today: I was finally forced to do heavy refactoring and to use Scene2D to get a better game.
    And I think that if libGDX had already been full-GL2, I would have taken the time to adopt the y-upward philosophy.
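For what it’s worth, the y-down flip doesn’t need GL1’s matrix stack at all; it lives entirely in the projection math, which works identically under GLES 2.0. A minimal plain-Java sketch (the YDownOrtho class is hypothetical, just illustrating the mapping an orthographic camera performs):

```java
// Maps a pixel coordinate (origin top-left, y down) to OpenGL's
// normalized device coordinates (origin center, y up). This flip is
// all a y-down orthographic projection does -- no GL1 matrix stack.
class YDownOrtho {
    final float width, height;

    YDownOrtho(float width, float height) {
        this.width = width;
        this.height = height;
    }

    // x in [0, width] -> NDC x in [-1, 1]
    float ndcX(float x) {
        return 2f * x / width - 1f;
    }

    // y in [0, height], measured downward -> NDC y in [-1, 1], upward
    float ndcY(float y) {
        return 1f - 2f * y / height;
    }
}
```

In libGDX itself, calling setToOrtho(true, width, height) on an OrthographicCamera produces the same y-down mapping, and it works under GLES 2.0.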

  8. If you’re going to break the API anyway, and since the only reason for the degree-based methods was “it was like that in OpenGL 1.0”, then maybe also replace the degree-based methods with radian-based ones?

    I hate degree based methods so much.
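A radian-based API would be a thin wrapper anyway; a hypothetical sketch of what such an overload might look like (the Rotator class and rotateRad name are made up for illustration, not existing libGDX methods):

```java
// Sketch: storing angles in radians internally, with a legacy
// degree-based entry point delegating to the radian-based one.
class Rotator {
    float angleRad; // internal storage in radians

    void rotateRad(float radians) {
        angleRad += radians;
    }

    // Legacy degree-based method, kept as a thin conversion wrapper.
    void rotate(float degrees) {
        rotateRad((float) Math.toRadians(degrees));
    }
}
```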

  9. I’ll put it this way:
    It’s just a question of time before you have to do it, and posing the question of whether to do it or not is already part of doing it.
    Let’s lighten things up and look towards the future.

  10. Actually, a fix has been merged into the source since December. The next release of Android (major or minor) should include it.

  11. Drop it and get on with developing a decent 2.0 shader implementation. Anything else is a waste of time. Really, the API is lacking so much in that area that I decided to migrate to a different game engine (Unity).

  12. Being specific would help us improve things. More details other than “it sucks” would be helpful.

  13. The thing I found most problematic was the lack of a normal-mapping implementation.

    From an outside perspective, things seemed 99% ready for the new composite shader release a very long time ago. But then a long time passed without anything tangible being shipped.

    Development seemed focused on trying to make the shader engine extremely configurable, for example the fancy graph GUI stuff. In my opinion, putting effort into a GUI for shader plugins is a waste of time when the core functionality hasn’t even been released. As a developer, that was very frustrating.

    Fast-forward many months, and there were no further updates on Twitter, the blog here, or the forum. To get progress updates, I had to IRC Xoppa directly.

    Eventually, when looking at alternatives, I found the reliability, ease of use and depth of features of Unity to be much better than I expected. I had assumed Unity would be more complex than it is.

  14. I’m not using libGDX for mobile development at the moment, but how hard would it be to keep a separate, GLES1.x-only backend purely for compatibility purposes? Maybe even let the community maintain it if they want, so it’s still available just in case?

  15. I would like it to be removed too, since you could then concentrate on evolving the APIs without having to maintain compatibility with a barely used technology.
