Gdx2D and Super Jumper

Good news everyone. I finished the gdx2d backend today. What is gdx2d? An image manipulation library. A very small one. It allows us to load PNGs, JPEGs and other formats uniformly on all platforms in native code, thanks to the awesome stb_image library by Sean Barrett. The stb_image library will load an image to either alpha, luminance-alpha, rgb888 or rgba8888 format, immediately suitable for uploading as an OpenGL (ES) texture. In addition I added support for converting images to rgb565 and rgba4444, so one can conserve some memory if needed (e.g. for large backgrounds and the like).
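The rgb565 conversion boils down to keeping the top 5, 6 and 5 bits of the red, green and blue channels. Here's a minimal sketch of that bit-packing (this is the standard conversion, not gdx2d's actual code, and the class/method names are made up for illustration):

```java
public class Rgb565 {
    // Pack 8-bit r, g, b channels into a single rgb565 value:
    // red in bits 11-15, green in bits 5-10, blue in bits 0-4.
    public static int toRgb565(int r, int g, int b) {
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3);
    }
}
```

For example, pure white (255, 255, 255) packs to 0xFFFF and pure red (255, 0, 0) to 0xF800; the cost of the saved memory is the lost low bits of each channel.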

Based on the image loading and conversion functionality, I implemented procedures to set and get pixels, draw lines, circles and rectangles, fill circles and rectangles and, most of all, blit portions from one image to another. All of this works without you needing to care about the underlying pixel format: you always work with rgba8888 when specifying colors and alpha. The same is true for blitting, of course. Additionally, all operations support alpha blending. Finally, I implemented nearest-neighbour and bilinear filtering for the blitters. While I tried to make the performance not suck entirely, I didn't put any emphasis on speed. The blitters in particular are not something I'm proud of, so if you want a good laugh, check out the source code. I'll eventually convert the bilinear blitter to fixed point. For now the performance is sufficient.
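The alpha blending used by drawing and blitting operations is just source-over compositing per channel. A minimal sketch of the idea (the standard formula, not gdx2d's actual implementation; the names are hypothetical):

```java
public class Blend {
    // Source-over blend of a single 8-bit channel.
    // src, dst and alpha are all in [0, 255]; alpha is the source alpha.
    // Result approaches src as alpha approaches 255, dst as alpha approaches 0.
    public static int blendChannel(int src, int dst, int alpha) {
        return (src * alpha + dst * (255 - alpha)) / 255;
    }
}
```

Doing this with integer math per channel is also why a fixed-point version of the bilinear blitter is attractive: it avoids float conversions in the inner loop.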

Why did I do that? For one, I wanted to get rid of the dependency on BufferedImage on the desktop and Bitmap on Android. There's just so much shit going on behind the scenes (e.g. alpha premultiplication and so on) that working with these classes is no fun. Due to these dependencies we also had to implement custom backends for the Pixmap and Texture interfaces, which led to some inconsistencies when it comes to premultiplied alpha (the jogl and android backends have it, lwjgl and angle don't). With gdx2d we can decouple libgdx from those system dependencies and have a single Pixmap class and a single Texture class. Best of all: we have full control over everything. You can now directly access the memory a Pixmap occupies via a direct ByteBuffer if you feel that our implementation of the drawing methods is lacking!
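To give an idea of what direct pixel access through a ByteBuffer looks like, here's a minimal sketch assuming a row-major rgba8888 layout with 4 bytes per pixel (the helper is hypothetical, not the libgdx API):

```java
import java.nio.ByteBuffer;

public class PixelAccess {
    // Write an rgba8888 pixel at (x, y) into a direct buffer, assuming
    // row-major layout, 4 bytes per pixel, no row padding.
    public static void setPixel(ByteBuffer pixels, int width, int x, int y, int rgba8888) {
        int offset = (y * width + x) * 4;
        pixels.put(offset,     (byte) ((rgba8888 >>> 24) & 0xFF)); // red
        pixels.put(offset + 1, (byte) ((rgba8888 >>> 16) & 0xFF)); // green
        pixels.put(offset + 2, (byte) ((rgba8888 >>>  8) & 0xFF)); // blue
        pixels.put(offset + 3, (byte) ( rgba8888         & 0xFF)); // alpha
    }
}
```

Because the buffer is direct, whatever you write this way is exactly what gets handed to glTexImage2D, with no hidden premultiplication in between.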

It also means that using Pixmaps and Textures gets a bit easier. I have yet to make those changes live; I'll wait for Dave to give me the green light, as he's working off the trunk in Discretion. Don't want to fuck up his game so close to release 🙂 Here's a fun image from the gdx2d tests. Just some clearing, drawing, filling and blitting, with and without blending.

If all goes well, you can use the new Texture and Pixmap classes by tomorrow evening. Hurray…

In other news: I converted one of the example games from my book “Beginning Android Games” to libgdx. It took me maybe 20 minutes in total and worked out of the box as expected. You can find it in SVN/trunk/demos/superjumper. Here’s a screenshot.

There are only a few minor things left for 0.9, then it’s time to go full-blown 3D. The book is nearly finished as well (at least the first drafts for each chapter), so I’ll have a lot more free time to go back to libgdx more often.

6 thoughts on “Gdx2D and Super Jumper”

  1. A really nice, devoted job! LibGDX is becoming the most awesome drawing lib I know. I’m even strongly considering it for a Ph.D. scientific desktop tool to display data-flow graphs.

    Anyway, isn’t Super Jumper a clone of Abduction, or vice versa? 😀

  2. You’d be surprised. I use libgdx for stereoscopic visualizations at work. See the angle backend 🙂

    And yes, Super Jumper is heavily inspired by Abduction. Which was inspired by Doodle Jump. Which was inspired by PapiJump. Ad infinitum 🙂

  3. Really good job, Mario. Thanks for libgdx.

    Unfortunately, I’m really disappointed in today’s OpenGL/mobile performance. It is just a stinky pile of shit!

    Some years ago I was writing code for the Amiga in assembler. It was MUCH easier writing 2D intros in M68k code than coding today in Java & OpenGL, really.

    And the fun thing is: on the Amiga’s 7 MHz CPU I got super-smooth animation with multiple layers, big screen objects in full color, VGA res, etc.

    Today, we have 500 MHz+ phones that can barely run a simple 2D game with 2 layers… and with lags. Really disappointing. I have seen much better games even on the 10-year-old Game Boy Advance with its 32 MHz CPU than on the latest Android phone for $600…

    Where is this super-objective and useless programming technology going?

    Do not hate me, I’m just thinking out loud 🙂

  4. Haters gonna hate 🙂 I sort of agree, but it’s not all that bad. I was a DOS coder for a long time, so I know my way around assembler programming for the x86 architecture a little, and 0xA000 as well. That said, you just can’t compare the programming model of directly accessing the framebuffer like in the old days with OpenGL. Given enough knowledge of the OpenGL pipeline, you can achieve pretty amazing results we wouldn’t have dreamed of back in the day.

    But again, I agree: a lot of mobile GPUs and drivers out there are a pile of shit :/
