I'm back again after a week off. I had a short visit from my sister, who checked to see that I was still breathing and hadn't been buried in trash or carried away by dust bunnies. And I took a couple of days off to sulk about my OpenGL progress.
As you'll remember from the last part, I converted my demo to OpenGL 3.3 and started to port it to Mac OS X, only to find out I'd been misled by the OpenGL SuperBible and that Apple only supports OpenGL 2.1.
I had also discovered that there is no way to write a fragment shader using 3.3 shader language that works everywhere. The NVidia driver requires that I declare the (previously built-in) variable gl_FragColor, whereas the ATI drivers reject this as an error (or a warning, on some drivers). For the record, NVidia is following the spec as I read it, and ATI is wrong.
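To make the incompatibility concrete, here is an illustrative 3.3 fragment shader -- my own reconstruction from the description above, not the demo's actual source:

```glsl
#version 330

// NVidia required this declaration, since gl_FragColor is no longer
// built in under 3.3. ATI drivers rejected this same line as an
// error (or a warning, depending on the driver).
out vec4 gl_FragColor;

void main()
{
    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}
```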
This doesn't give me a lot of confidence that I'm writing portable code. In addition to the extra work in handling these small differences in OpenGL implementation, how do I test it all and make sure it really is going to work everywhere?
I finished off the last part by going back and implementing a Texture Atlas in my DirectX9 version. This replaces the Texture Array feature I was hoping to get in OpenGL. I could have done this after part 11, weeks ago.
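The idea behind a texture atlas is simple enough to sketch: pack all the block textures into one big texture, and remap each tile's texture coordinates into its cell. A minimal sketch of the coordinate math (names and layout are my assumptions, not the demo's actual code):

```cpp
#include <cassert>

// Map a tile's local UV (0..1) into its cell within a square texture
// atlas that is 'tilesPerRow' cells across. 'index' selects the cell,
// counting left to right, top to bottom.
void atlasUV(int index, int tilesPerRow, float u, float v,
             float* outU, float* outV)
{
    float cell = 1.0f / tilesPerRow;
    int col = index % tilesPerRow;
    int row = index / tilesPerRow;
    *outU = (col + u) * cell;
    *outV = (row + v) * cell;
}
```

The usual catch with this scheme is bleeding between adjacent cells when mipmapping, which is exactly the problem Texture Arrays were supposed to solve.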
I couldn't bear the thought of just dumping all the OpenGL work, so I decided that I had to try one more platform. I started porting the demo to Linux.
OpenGL on Linux
I haven't done anything other than Apache installations on my Linux machine, and not even that in ages. So I wiped it and installed Ubuntu 10.10. You need to request the proprietary drivers for the display. Then I needed programming tools and libraries for writing OpenGL code. Google pointed me to some pages with lists of stuff, and I just kept installing packages until the first OpenGL SuperBible example compiled and linked.
Unfortunately, the first example program didn't quite run. It's supposed to draw a toy block with letters on the faces, reflected in a desktop. Only the top of the cube and top of the desk were being drawn. The driver was reporting version 3.3 of both OpenGL and the shader language, and the program was running without errors, so it was hard to see what the problem could be.
My Linux machine has integrated ATI graphics though, so I thought perhaps it just wasn't a good enough display. I figured that if my code compiled and at least sort-of ran, I could always buy a new display card for the machine. So I continued.
There's a library called "glut" that's supposed to make it easy to port your OpenGL apps, but I had looked inside it when doing my Windows version and was not impressed. I just used it as a reference and wrote my own X Windows framework.
It took a while to get all the system calls in the right place, and get it all to compile and link. I had tried one of the C++ development environments a few months ago and had bad luck with it. There wasn't much code to write, so I just built my own makefile and used the default Ubuntu editor, gedit.
After a few days, everything seemed to work, so I sent the result to Florian to test, and of course it didn't run for him. He also couldn't get it to compile and link.
It turns out there were three problems. One was that I didn't keep track of which packages I installed, so I couldn't tell Florian what he needed. I've since started with a clean install (on my Windows XP machine, which has an NVidia display), and have the correct list of prereqs.
Second, the order of events coming out of X Windows was different than my code expected, and it was trying to open a 0 by 0 window. That was easily fixed. But the third problem was another nuisance. It turns out there's a bug in the NVidia display drivers.
It's not my fault!
When an OpenGL program starts up, it requests an interface, and it can request any version of OpenGL. I'm requesting version 3.3. The driver gives me a rendering context, and I'm good to go. One of the calls you can make is to get the list of OpenGL extensions supported by the driver. There are two versions, one which returns a big long string of all the extensions, and one which returns them a single extension at a time:
glGetString(GL_EXTENSIONS);
glGetStringi(GL_EXTENSIONS, index);
I didn't understand at first how important the extensions list is. I was intending to just use the core standard and not mess with extensions which may or may not be there. But it turns out that parts of the core are implemented by the extensions.
Under Windows and on Linux, you use a library called "glew", which looks through all the extensions and sets function pointers by calling into the display driver. So you might have a call like glGenerateMipmap(), which looks like a library function, but isn't. Instead, it's a macro covering a pointer set by the glewInit function at startup.
What this means in practice is that you have a program with a call to a core function. It compiles, it starts to run, but in glewInit, it has to find pieces of the standard in the extensions list. If it doesn't find them, it never sets the pointer that you are using when you call the OpenGL function. Your program tries to call address zero and dies.
There are no warnings in the initialization that you are about to use some part of the standard that glewInit couldn't find. This was happening on Florian's Linux machine with an NVidia 460 display, but not on my Linux machine with integrated ATI graphics.
If things had gone differently, at this point I would have thrown out the OpenGL books and gone back to DirectX. Fortunately, I knew what the problem was.
On Windows, you start with an OpenGL 1.1 interface, then get the OpenGL 3.3 interface you want and restart everything. When I discovered that the Mac only supported OpenGL 2.1, I tried requesting a 2.1 interface on my Windows machine. That worked, and so I thought, "well, at least I can debug a 2.1 version under Windows before I port it to the Mac." But there was actually a problem with that.
glewInit was being called too soon, while it had the 1.1 interface running, and it looked from the documentation like I should wait until I had selected the correct version. So I restructured the code a bit. When I did, I discovered this same NVidia bug under Windows.
If you request version 3.1, everything works. If you request version 3.3, glGetString returns NULL, as if there were no extensions at all. However, if you use glGetStringi, it will cheerfully report 195 extensions, one at a time! This is clearly a bug and doesn't happen with ATI displays. It does happen with NVidia on Windows 7, Windows XP, and Linux, and three different display types. (I've reported this to NVidia -- no reply so far.)
To code around this, I had to create my own extension list from glGetStringi calls, and feed that to glewInit so it would do the right thing. I had assumed this was Windows weirdness, but when the demo crashed at address zero on Florian's machine, I realized it was doing the same thing there. With that fix in, the demo ran under Linux, on Florian's machine and on two of mine.
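Stripped of the GL calls themselves (which need a live rendering context), the string-building half of that workaround looks something like this sketch. In the real code, the names would come from a loop over glGetStringi(GL_EXTENSIONS, i); the function name here is my own:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Join individually fetched extension names into the single
// space-separated string that the broken glGetString call should
// have returned. In the demo, 'names' would be filled by calling
// glGetStringi(GL_EXTENSIONS, i) for each reported extension.
std::string buildExtensionString(const std::vector<std::string>& names)
{
    std::string result;
    for (size_t i = 0; i < names.size(); i++)
    {
        if (i > 0)
            result += " ";
        result += names[i];
    }
    return result;
}
```

The rebuilt string is then handed to the glew initialization in place of the NULL that glGetString returned, so the function pointers get set up normally.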
The demo is still a bit slow. On my Windows XP box, using OpenGL under Windows, I get a render time of 12.72ms. On the same machine, using OpenGL under Linux, I get a time of 30.94ms. I'm thinking that I just don't have the right optimization options turned on in the compiler, but I haven't hunted it down.
With Linux working, it was time to take another stab at Mac OS X.
OpenGL on the Mac, part II
First, I needed to do an OpenGL 2.1 version of my code, which meant working through three problems.
I copied it all over to my Hackintosh and got it to compile with XCode. I used the "glut" library to wrap my drawing code in a simple application. And it worked! I saw the familiar bit of Minecraft landscape on the Mac! Yay!
Then I had a choice -- continue with "glut", which I didn't trust, or learn Cocoa, Apple's Objective-C based environment. I went with the easier "glut" route, since I didn't want to learn a new programming language and interface builder and find all the library calls to implement my framework on yet another platform.
As you'll see if you download the Mac demo, this was a mistake.
I was up very late last night trying to get this to work correctly, but it just doesn't. There's no resize handle on the window, and I can't seem to produce one. The application crashes on shutdown (hit ESC) because there's no way to request the application exit (it has to be quit by the user via the menus.) And most seriously, the cursor handling is crap.
My code is like a first-person shooter, in that it keeps the cursor at the center of the screen, rather than tracking it with the mouse. I do this by moving the (invisible) mouse pointer back to the center on each frame. You are supposed to be able to do this with glut, but it just doesn't work right.
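The recentering trick itself is simple. The per-event logic looks roughly like this (a sketch with invented names; the actual warp call is platform-specific, e.g. glutWarpPointer under glut):

```cpp
#include <cassert>

// Given a mouse-move event at (x, y), compute the look delta relative
// to the window center. After using the delta, the (hidden) cursor is
// warped back to the center, so the next event is measured from the
// center again.
void mouseMoved(int x, int y, int winWidth, int winHeight,
                int* outDX, int* outDY)
{
    int cx = winWidth / 2;
    int cy = winHeight / 2;
    *outDX = x - cx;
    *outDY = y - cy;
    // platform-specific: warp the cursor back to (cx, cy) here,
    // e.g. glutWarpPointer(cx, cy).
}
```

One subtlety: the warp itself generates a mouse-move event at the center, which conveniently produces a zero delta and can be ignored.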
In the demo, it will look like you are getting a terrible frame rate, but it's just cursor weirdness. Move with the cursor keys and you'll see that the update speed is fine.
To get this right, I'm going to have to recode the framework in Cocoa, which I'm not inclined to do at this point. If you want me to do that, you Mac guys need to download the demo and tell me if it works. I can't afford to buy a real Mac right now.
Update: Quite a few Mac people did download the demo, and I've had no message that it didn't work on the Mac. So I recoded the demo with a Cocoa framework. You shouldn't have any cursor issues now.
But wait, there's more! (work to do...)
So I have a version of this demo running on Windows 7, Windows XP, Linux and Mac OS X. But there is still more work to do. If I want text and GUI overlays, I need 2D graphics. I covered this for Windows in Part 3. There I ended up using GDI to render my 2D graphics to a bitmap, then turned the bitmap into a texture and painted it over my 3D graphics. I can do the same under Linux and Mac OS X, but it means learning the API for 2D graphics on both of those systems.
At the time, commenters said I should just do my 2D graphics with OpenGL (or DirectX) and not mess with the Windows 2D graphics libraries. For simple lines, rectangles and images, I could do that. But text has two nasty problems.
To draw a string with OpenGL, I need to create a big texture with all of the letters in the font. Then to write a word, I copy out portions of that font texture onto triangles in the display. This means I have to do my own character spacing, which is a problem.
You can get all the character widths under Windows (and the other OS, I assume), but this isn't the right way to draw high-quality text. Nice text includes things like kerning (see Part 3) and anti-aliased rendering. If I draw my own text, I'm not going to be able to do all of that.
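The spacing computation itself is not the hard part -- it's summing advance widths plus a kerning adjustment per letter pair. The hard part is getting metrics and rendering as good as the OS text engine's. A toy sketch of the measurement step (all names and metrics here are invented, not taken from any real font):

```cpp
#include <cassert>
#include <map>
#include <string>
#include <utility>

// Measure a string using per-character advance widths plus a kerning
// table keyed on letter pairs. Real metrics would come from the OS
// font APIs; here the caller supplies them.
int measureString(const std::string& text,
                  const std::map<char, int>& advances,
                  const std::map<std::pair<char, char>, int>& kerning)
{
    int width = 0;
    for (size_t i = 0; i < text.size(); i++)
    {
        std::map<char, int>::const_iterator a = advances.find(text[i]);
        if (a != advances.end())
            width += a->second;
        if (i + 1 < text.size())
        {
            std::map<std::pair<char, char>, int>::const_iterator k =
                kerning.find(std::make_pair(text[i], text[i + 1]));
            if (k != kerning.end())
                width += k->second;  // usually negative, e.g. for "AV"
        }
    }
    return width;
}
```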
The second problem is Asian scripts like Kanji, where the font includes thousands of characters. To draw them using the "copy-from-a-texture" technique, the source texture would have to be huge. Just drawing it and storing it will be a problem.
Now, I don't intend to support Kanji anytime soon, since I can't read it, but I would hate to make it impossible for architectural reasons that can't be fixed. At least on Windows, I think you can draw Kanji strings with 2D graphics. I would hope the same is true under Mac OS X. I'm not sure about Linux, since one of the XLib tutorials I read explicitly said that wide strings still don't work right.
My choice then is to either do all the 2D graphics with OpenGL and have limited text, or learn and reimplement 2D graphics on each platform (XLib for Linux and Quartz for Mac). Since I'm kind of sick of mucking around in the weeds here, I will probably defer this to later.
Finally, there's the problem that portability is not just about graphics.
The next version of the demo will be moving through a large world. I will want the demo to load new scenery in the background as you move. I will need multiple threads, critical sections (locks) and events to signal that new work is available. To keep the game platform-independent, I need to cover all of these with abstract classes, the way I've done for the 3D graphics. I also have to learn how these things are done under Linux and Mac OS X.
In other words, in order to keep this portable, I have to cover every single part of the operating system that I use with a class, and then implement that class on all three operating systems.
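For the locks, the wrapper might look something like this on the pthreads side (a sketch with invented class names, not the demo's actual code; a Windows implementation would wrap EnterCriticalSection/LeaveCriticalSection behind the same interface):

```cpp
#include <cassert>
#include <pthread.h>

// Abstract critical section; each operating system gets a subclass.
class CriticalSection
{
public:
    virtual ~CriticalSection() {}
    virtual void lock() = 0;
    virtual void unlock() = 0;
};

// Linux / Mac OS X implementation over a pthread mutex.
class PthreadCriticalSection : public CriticalSection
{
public:
    PthreadCriticalSection()  { pthread_mutex_init(&m_mutex, NULL); }
    ~PthreadCriticalSection() { pthread_mutex_destroy(&m_mutex); }
    virtual void lock()       { pthread_mutex_lock(&m_mutex); }
    virtual void unlock()     { pthread_mutex_unlock(&m_mutex); }
private:
    pthread_mutex_t m_mutex;
};
```

The game code only ever sees the abstract class, the same way it only sees the abstract 3D graphics interface.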
The Bottom Line
The limited demo I have currently is portable. It has run on every combination of OS and graphics card I could get my hands on.
To keep the demo portable, I need my 2D graphics on other platforms. I also need to do a good framework for the Mac. I need an implementation of threads and related stuff on all three platforms.
The current Windows and Linux version of the demo try to load NVidia shaders, since they seem to implement the spec correctly. If the compilation of the shader fails, it tries the ATI versions. For the Mac, it always uses the NVidia 1.2 shaders. Going forwards, I would have to restrict myself to what I can do with the 1.2 shaders or lose the Mac users.
I don't know how well my various machines really test the code. I did get reports last part from people who couldn't get the Windows version to run with OpenGL. People reported those ATI vs. NVidia shader problems. It's all very frustrating.
I want to get back into writing the real game. At worst, this portability work gives me an indication of what capabilities I should use. I could drop back to OpenGL 2.1 everywhere. I'm just not sure if I want to spend the time each week to get this debugged and tested on all three platforms.
Of course, if I were a glutton for punishment, I'd buy an iPad 2 and see if I can get any of this to work there. If I really do end up with a nice game development platform that worked on all these systems, that would be interesting.
It's going to depend on you, my readers! We'll see how many people download the Linux and Mac versions, and whether I get a lot of bug reports. If people come back and say that the Mac version just doesn't work on real Macs, there's no way I can support that. If every Linux version is different and the demo fails there, I can't really support that either. If I don't get reasonable success on either Mac or Linux, I will switch back to DirectX on Windows.
So please regard downloading this demo as a vote on which platforms I should support. No ballot stuffing -- I have the server logs... :-)
Most of the demo UI is still missing. Use WASD keys for movement, or use the cursor arrows. Hit ESC to exit the program.
For Windows, download The Part 14 Demo - Windows.
For Linux, download The Part 14 Demo - Linux.
For Mac, download The Part 14 Demo - Mac.
Set the "platform" string in the options.xml file to control which display support is used. The Windows version will run with either "DirectX9", "OpenGL3.3" or "OpenGL2.1". The Linux version will run with "OpenGL3.3" or "OpenGL2.1". The Mac version will run with "OpenGL2.1".
If the program fails, you will find a trace file called "errors.txt" in the demo directory. Please email the file to me at
The Source Code
Download The Part 14 Source for the source code of all three versions. In a change from previous parts, this does not contain built versions of the demo. It does contain the supporting files -- docs, options.xml and world.txt.
To build on Windows, use Visual C++ or equivalent compiler. Build the JpegLib first in release or debug mode, then the Crafty build.
To build on Linux, use the supplied makefile. See the readme.txt in the BuildsLinux directory for a list of packages you need to install.
To build on Mac, use XCode on the Crafty.xcodeproj file. See the notes there in case file names are not picked up correctly.
As you would expect, the growth in the code this part is all in the framework. Some sections of that are repeated across platforms. I didn't really write 4000 lines of code this week!
Thanks to Tapio, who spotted a major typo in my Linux makefile -- "-o3" isn't nearly as useful as "-O3"... So if you want a version actually compiled with optimization, download the Linux demo again.
He also pointed out that the source did not compile for 64-bit Linux. I've cleaned up the pointer to int problems. If you are running 64-bit Linux and want to recompile the code, download the source again.
Sorry for the lack of posting recently. I've had another week of poor sleep, and my brain has turned to mush.
Before I started getting 3 hours or so of sleep a night, I did a new port to the Mac, using the Cocoa framework instead of GLUT. I've replaced the version linked above. Download it again from here. If any of you Mac users have trouble with it, email me or leave a comment. I had to finish up a couple of bugs tonight and I can barely think straight. Send me the "errors.txt" file, which is now in the top directory, like the other platforms.
More new stuff soon.
In case you were wondering, I pulled my shoulder muscle again, just like back in January. This makes it hard to sleep, and hard even to sit and type (I can't lift my right arm.) It seems to be healing OK. I might even go to the grocery store today, since I'm out of food.
In the meantime, I've updated the Linux demo to support OpenGL 2.1, for those of you with really old displays/drivers. Just change the "platform" string in "options.xml" to "OpenGL2.1". Let me know if it still doesn't work for you. Download it again from here.