I continue to plod along on the project, doing miscellaneous bits and pieces.
I have a world which contains large objects like planets, medium-sized objects like asteroids or the cylindrical habitats, and nearby details such as buildings, avatars, etc.
I know that the Z-buffer limits what I can do at a distance. A planet covered with separate water or cloud polygons will flicker (Z-fighting), since the Z-buffer can't resolve the tiny differences in depth at that distance. So I had planned to manually sort all the distant objects and only use the Z-buffer for nearby rendering.
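The manual sort for distant objects is just the painter's algorithm: draw farthest-first so nearer objects overwrite farther ones without any depth testing. A minimal sketch (the `DistantObject` record and its fields are made up for illustration, not the project's actual data structures):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical minimal object record -- just a position and a handle.
struct DistantObject {
    float x, y, z;  // world-space position
    int id;         // handle used by the renderer
};

// Painter's-algorithm ordering: sort far-to-near by squared distance
// from the eye, so later draws correctly overwrite earlier ones
// without touching the Z-buffer.
inline void sortBackToFront(std::vector<DistantObject>& objs,
                            float eyeX, float eyeY, float eyeZ) {
    auto distSq = [&](const DistantObject& o) {
        float dx = o.x - eyeX, dy = o.y - eyeY, dz = o.z - eyeZ;
        return dx * dx + dy * dy + dz * dz;
    };
    std::sort(objs.begin(), objs.end(),
              [&](const DistantObject& a, const DistantObject& b) {
                  return distSq(a) > distSq(b);  // farthest first
              });
}
```

Squared distance avoids a square root per comparison, which matters when the sort runs every frame.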
However, the intermediate distances turned out to be a real problem. As you approach the cylindrical habitats, the hex framework on the outside is still in the long-range group and would need to be sorted. That's a lot of polygons, and sorting them takes a significant amount of time.
I tried breaking it into large chunks, sorting the chunks every frame, and only doing the detailed sort within a chunk as time permits. This didn't work well though. The landscape can briefly show through the shell when the sort is wrong, which is a glaring error. Even when it's mostly right, my eye seems to pick up on errors in sorting the hex mesh. It's very noticeable.
I went back to using two passes through the Z-buffer. I render far objects with my own sorting, medium objects with one Z-buffered pass, then close objects with a second pass using smaller settings for the near and far planes. This works, but there's some sort of error between the passes.
I noticed this way back in Part 28, but I've never been able to figure out exactly what is causing it. The near and far planes just don't seem to clip a triangle into two seamless pieces. I'm not sure whether that's due to the precision of the hardware clipping or of the Z-buffer.
This is the kind of thing that makes me hate 3D graphics, and I haven't put in the time to figure it out.
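Seam problems aside, the three-band split itself is simple: classify each object by its distance from the eye and send it to the matching pass. A sketch, with made-up cutoff distances (the real ones would depend on the scene scale):

```cpp
// Distance bands for the three rendering passes.  The cutoff values
// are illustrative, not the ones used in the actual project.
enum class Band { Far, Medium, Near };

inline Band classifyByDistance(float dist,
                               float nearCut = 100.0f,     // hypothetical
                               float mediumCut = 10000.0f) // hypothetical
{
    if (dist < nearCut)   return Band::Near;   // Z-buffered, tight near/far planes
    if (dist < mediumCut) return Band::Medium; // Z-buffered, wide near/far planes
    return Band::Far;                          // manually sorted, painter's algorithm
}
```

Between the Medium and Near passes, the depth buffer is cleared and the projection rebuilt with the smaller near/far range, which is where the mismatched clipping at the band boundary shows up.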
I've been generating all the graphics for the space station, habitat and colony in code. That's fine, since these structures are all pretty cheap to create. But I wanted an artist to be able to work on the architecture of things like the station. That meant reading models from an external file.
I implemented a reader for Wavefront OBJ files, since the format is very simple. I created all my models as files and it all works. The problem is that the files are large (a few megabytes altogether) and slow to read.
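The format really is simple. A minimal reader that handles only the `v` and triangular `f` records (ignoring normals, texture coordinates, and the `a/b/c` index forms, which a full reader would need) might look like this:

```cpp
#include <istream>
#include <sstream>
#include <string>
#include <vector>

struct ObjMesh {
    std::vector<float> verts;  // x,y,z triples
    std::vector<int>   faces;  // zero-based vertex indices, 3 per triangle
};

// Minimal Wavefront OBJ reader: handles only "v x y z" and
// triangular "f a b c" records, with 1-based indices as in the format.
// Normals, texture coordinates, and "a/b/c" index forms are skipped.
inline ObjMesh readObj(std::istream& in) {
    ObjMesh mesh;
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream ls(line);
        std::string tag;
        ls >> tag;
        if (tag == "v") {
            float x, y, z;
            ls >> x >> y >> z;
            mesh.verts.push_back(x);
            mesh.verts.push_back(y);
            mesh.verts.push_back(z);
        } else if (tag == "f") {
            int a, b, c;
            ls >> a >> b >> c;
            mesh.faces.push_back(a - 1);  // OBJ indices start at 1
            mesh.faces.push_back(b - 1);
            mesh.faces.push_back(c - 1);
        }
        // all other record types ("vn", "vt", comments, etc.) ignored
    }
    return mesh;
}
```

The slowness comes from exactly this kind of per-line text parsing, which is why converting to a binary format helps.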
I spent a few days playing around with this, putting the objects into my own format with a converter. They load faster, but are still very large, so I'm not completely happy with this. Most of the vertexes are in these hex meshes though, so perhaps I will generate those in code and take them out of the files. That would be the simplest solution, although it would constrain an artist.
Next, I wanted to add buildings to the various worlds in my demos. They really lack a sense of scale at this point. I also wanted to see the mix of cube-based objects and polygonal terrain.
When I looked over the old cube-drawing code, I also went through my ToDo list for it. The top item was to save lots of display memory by using instancing. Since I'm already using a lot of display memory for landscape textures, I thought this would be a good item to work on.
As I expected in Part 64, I got a big reduction in memory use, but slower performance. I'm also now referencing textures in the vertex shaders, which some low-end devices will not handle.
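Back-of-the-envelope, the saving comes from storing one cube's vertexes once plus a small per-instance record for each cube, instead of expanding every cube's geometry. The sizes below are illustrative guesses, not the project's actual vertex or instance formats:

```cpp
// Rough memory comparison for expanded vs. instanced cube geometry.
// Sizes are illustrative: a full vertex (position, normal, texture
// coords) at 32 bytes vs. a small per-instance record at 8 bytes.
inline long bytesExpanded(long cubes, int vertsPerCube = 24,
                          int bytesPerVert = 32) {
    return cubes * vertsPerCube * bytesPerVert;
}

inline long bytesInstanced(long cubes, int vertsPerCube = 24,
                           int bytesPerVert = 32,
                           int bytesPerInstance = 8) {
    // one shared cube mesh + one record per instance
    return (long)vertsPerCube * bytesPerVert + cubes * bytesPerInstance;
}
```

With these numbers, 1000 cubes drop from 768 KB of vertex data to under 9 KB, at the cost of the vertex shader doing per-instance lookups.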
For best performance, I should be cutting the number of vertexes by combining adjacent faces. I did that in Part 72 and got good results. But if I'm going to wrap my cubical scenery on spheres or cylinders, I can't do that. You would think that even on a small world, local curvature would not be a factor, but it is. These images are Minecraft rendered on small worlds.
On the planet, moon and ring, I could probably get away with flat coordinates within each 32 by 32 by 32 chunk. I'll have to try it and see if tall buildings start to have noticeable gaps between chunks.
A reader named Scott Hooper pointed out that all the processing of vertexes for faces that point away from the eye is wasted. So instead of having a single buffer for all the opaque cubes, and letting the display cull faces, I should have six buffers, for faces in each direction. Then if a chunk of scenery is completely facing away from the eye in "x+", I can just skip that buffer. This has a significant effect on performance and was an easy change to make.
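The skip test is cheap because the faces are axis-aligned: every "+x" face in a chunk is back-facing when the eye is at or below the chunk's minimum x, and similarly for the other five directions. A sketch of that test (the `Chunk` struct is a made-up stand-in for the real chunk data):

```cpp
// Hypothetical chunk bounding box.
struct Chunk {
    float minX, minY, minZ;
    float maxX, maxY, maxZ;
};

enum Dir { XPos, XNeg, YPos, YNeg, ZPos, ZNeg };

// Returns true if the buffer of faces pointing in direction 'd' might
// be visible from the eye position, false if the whole buffer can be
// skipped.  An axis-aligned face pointing "+x" at coordinate f is
// front-facing only when eye.x > f; the chunk bounds give a
// conservative test over all its faces at once.
inline bool faceBufferVisible(const Chunk& c, Dir d,
                              float ex, float ey, float ez) {
    switch (d) {
        case XPos: return ex > c.minX;
        case XNeg: return ex < c.maxX;
        case YPos: return ey > c.minY;
        case YNeg: return ey < c.maxY;
        case ZPos: return ez > c.minZ;
        case ZNeg: return ez < c.maxZ;
    }
    return true;
}
```

From any one eye position, at most three of the six buffers per chunk can pass this test, so roughly half the face vertexes are never submitted at all.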
Analyzing the new rendering code, the biggest time sink now is drawing all the non-cube faces. I really need to do near and far versions of these. Right now, I'm wasting thousands of vertexes on things like grass blocks that are too far away to even see clearly.
I've wanted to do some 3d printing for a long time now, but one thing really puts me off. I expect anything I design to take multiple attempts to debug, and the cost would really add up. I thought of buying myself a cheap 3d printer, but the pages I've read on them make it sound like a real chore to get them to work. Some of the sites just advise you to wait for patents to expire and better home printers to become available.
I was reading an Instructables item the other day and a commenter suggested you check your local library to see if they have any printers. To my surprise, the local library here has two machines which they make available to the public. Supposedly, it's even free until their grant runs out (they started the Design Spot this spring.) I wasn't able to get any information during the holiday, but I'll try again this week.
Shapeways.com wants models that are "manifold" and "watertight". The models I've generated by hand just place individual triangles wherever needed to get the right look, so they don't qualify. When I tried to create watertight models in my code, I was still failing the "manifold" test.
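A common way to test the "manifold" property is to count how many triangles share each edge: in a closed, consistently-wound 2-manifold mesh, every directed edge appears exactly once, and its reverse also appears exactly once. Individually placed triangles fail immediately because their edges have no partners. A sketch of that check:

```cpp
#include <map>
#include <utility>
#include <vector>

// Edge-count test for a closed 2-manifold triangle mesh with
// consistent winding: each directed edge must occur exactly once,
// and its reversed edge must also occur exactly once (i.e. every
// undirected edge is shared by exactly two oppositely-wound faces).
// 'tris' holds vertex indices, three per triangle.
inline bool isManifold(const std::vector<int>& tris) {
    std::map<std::pair<int, int>, int> directed;
    for (std::size_t i = 0; i + 2 < tris.size(); i += 3) {
        int v[3] = {tris[i], tris[i + 1], tris[i + 2]};
        for (int e = 0; e < 3; ++e)
            ++directed[{v[e], v[(e + 1) % 3]}];
    }
    for (const auto& kv : directed) {
        if (kv.second != 1) return false;  // duplicated edge
        auto rev = directed.find({kv.first.second, kv.first.first});
        if (rev == directed.end() || rev->second != 1)
            return false;                  // unmatched boundary edge
    }
    return true;
}
```

This catches both open boundaries (an edge with no partner) and doubled geometry (an edge used more than twice), which are the usual reasons a hand-placed mesh fails the printing services' checks.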
To do this right, I think I need a Constructive Solid Geometry package. The primitives will be manifold and so will unions and intersections. I looked around and downloaded a package called Carve CSG. It seemed like it would do the trick, and I've gotten it to do things, but I'm not happy with it.
After fighting through some compiler and platform issues, I subtracted two spheres to get a sphere with walls, and then subtracted a cylinder from that, to see inside. That worked, but when I rotated the component objects by 45 degrees, it failed. I thought this might be just my rendering code, but the package insists the 45 degree rotated one is no longer manifold, even though the exact same thing rotated 60 degrees is fine. Note the black strip of missing triangles on the top of the right image.
The other problem is performance. Some of these simple models take seconds to compute, despite their low polygon counts. It also seems to matter a lot whether I run the thing in the program development environment or standalone. I am not sure what feature they would be using that causes it to run 10 times slower under debug, even when no breakpoints are triggered. The whole library seems to be using a layer called Boost, which I may have configured incorrectly.
Needless to say, I would like another package to use. Any recommendations?
That's what I've been up to this month. Sorry there's no new demo or screen shots.