Wednesday, March 2, 2011

Minecraft, Python, and the Temple of Rule 30

I've noticed in my recent job search that Python is a commonly requested scripting language.  I haven't really had much interest in learning it; several years of Ruby (on Rails), a long stretch of Perl, and various other languages at various levels of abstraction ranging all the way down to assembler have made it quite clear that there are only so many ways to express a control flow statement.  I feel that a computer language, as with any tool, should make staying out of your way a central part of its design philosophy.  Ruby is very good about that.

So why learn Python?  Because it's frequently used as an embedded interpreter.  It's used to control and extend other software.  But what does this have to do with Minecraft?  MCEdit includes a Python interpreter, and I've been using MCEdit quite a bit.

I decided one day that I'd try to clear a large ziggurat-shaped space underground using more or less the same techniques used for TNT drift mining, only without leaving any of the intervening walls standing.  I'd clear the whole space and make the walls nice and tidy.

Right.

As amusing as it is to blast the hell out of stuff in Minecraft, clearing layers of 5x5x5 chunks became mighty tedious by the 5th layer down, at which point the floor was roughly 45x45.  Roughly, because some very large dirt and gravel inclusions were making a mess of things.  There was still a long way to go to bedrock.  I have my kinks, but holding down a mouse button long enough to remove or replace c. 73,000 blocks and calling that "fun" isn't one of them.  It was time to bring the power tools into the sandbox.

A long editing session brought the pattern down to bedrock, exposing an existing railway and a bunch of deep caves I'd explored earlier.  I cleared out the lava and cleaned up the walls.  Here's the result.  The floor is 125 x 125, with torches in the middle of each visible 5x5 face.


Great!  All done.  The simple repeating patterns of light and dark and the hundreds of pinpoint torches looked quite nice.  But after riding through it and viewing the room from many angles, I thought it was missing a little something.  How about some texture?

This is where Python comes in.  Why not script a volumetric texture generator and use it to "paint" the world?  I had a vague idea that cellular automata might be up to the task, rather than using discrete samples of continuous functions.  Water and lava already appear to follow CA rules, and Minecraft is basically a voxel space you can walk around in.  This seemed a perfect match.  Elegant, even.

It's been a long time since I played with CA.  It's a topic I revisit every few years as a sort of touchstone for how insanely fast microprocessors have become, like watching real-time fractal renders on a GPU while still having a memory of waiting for a single low-res Mandelbrot render to fill in one scan line at a time on workstation-class hardware.  Well, it's 2011, and now we have Golly and people simulating Conway's Life inside Conway's Life (sort of) just because they can.  I love it.

I experimented with Golly for a while and decided to implement a Generations algorithm with NumPy.  Generations is Life-like, but cells which are marked for death are not zeroed out immediately; instead they age for a certain number of iterations before they are removed from the grid.  This spreads out the active growth regions and tends to make the patterns flow rather than shimmer.
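To make the aging behavior concrete, here's a minimal sketch of one Generations update step in NumPy.  The survival/birth sets and state count are parameters (the defaults below match the S2/B13, 15-state rule pictured later); this is my illustration of the algorithm, not the actual filter script.

```python
import numpy as np

def generations_step(grid, survive={2}, birth={1, 3}, states=15):
    """One update of a Generations CA on a 2D grid (wrapped edges).

    Cell values: 0 = dead, 1 = alive, 2..states-1 = dying (aging).
    Only cells with value 1 count as live neighbors.
    """
    alive = (grid == 1).astype(np.int8)
    # Moore-neighborhood live count via eight wrapped shifts
    neighbors = sum(
        np.roll(np.roll(alive, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    new = np.zeros_like(grid)
    # birth: a dead cell with a birth-count of live neighbors comes alive
    born = (grid == 0) & np.isin(neighbors, list(birth))
    # survival: a live cell with a survive-count of live neighbors stays alive
    stays = (grid == 1) & np.isin(neighbors, list(survive))
    new[born | stays] = 1
    # everything else that was alive or aging ticks up one state,
    # and falls off the grid (back to 0) after states-1 steps
    aging = (grid >= 1) & ~stays & (grid < states - 1)
    new[aging] = grid[aging] + 1
    return new
```

The aging states are what spread out the active regions: a dying cell blocks births in its neighborhood without contributing to them, which is what makes the patterns flow instead of shimmer.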

After a lot of trial-and-error, I was able to fill a 3D selection within MCEdit using two dimensions for the CA grid and one for time.  It worked, and I learned a bit of Python and NumPy in the process.
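The two-dimensions-plus-time mapping is easy to sketch: run the CA, stack each generation along a new axis, and you have a 3D array ready to paint into a selection.  The step rule below (the self-inverse "parity" rule) is just a stand-in for illustration, and the MCEdit side of actually writing blocks is omitted.

```python
import numpy as np

def ca_to_volume(initial, step, depth):
    """Run a 2D CA and stack successive generations along a new
    axis, yielding a 3D array shaped (time/depth, height, width)."""
    layers = [initial]
    grid = initial
    for _ in range(depth - 1):
        grid = step(grid)
        layers.append(grid)
    return np.stack(layers)

def parity_step(grid):
    """Illustrative rule: each cell becomes the XOR (sum mod 2)
    of its four orthogonal neighbours, with wrapped edges."""
    return (np.roll(grid, 1, 0) + np.roll(grid, -1, 0) +
            np.roll(grid, 1, 1) + np.roll(grid, -1, 1)) % 2
```

In an MCEdit filter, each nonzero voxel of the resulting array would be written out as a block within the selection box; `generations_step` from above could be dropped in as the `step` argument.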

There was only one problem:  The results looked awful.

Within Golly, iterations of the Generations rule are expressed as video, with subtle changes in color indicating the changing age of each cell.  The result is a gnarly, mesmerizing wash of color.

Generations rule S2/B13/15 starting with only 5 living cells,
at 20, 100, 500, and 1000 iterations



But regardless of how I mapped the CA into the world, the subtlety was lost.  It always looked either streaky or random.  Part of the difficulty was the deliberate lo-fi look of Minecraft itself -- a smooth color palette isn't really possible -- but it was mostly due to the translation of time into depth.

Back to the drawing board.  I could continue to tweak the 2D CA and mapping rules to look for a pleasing translation, but a different approach might be more appropriate for this particular application.

How about a 1D CA instead?  I could run a 1D rule to create a 2D array and then project that into the 3D world.  This simpler approach actually looked much nicer in tests.  The most complex part turned out to be the projection of the texture onto the "front" of an MCEdit selection volume, which has no knowledge of where the camera is located.  I handled this by requiring that the center of one side of the selection contain nothing but air; it's assumed that the camera isn't inside a solid object.
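Generating the 2D array from a 1D elementary rule is only a few lines; here's a sketch (my own, with wrapped edges and a single seed cell) where the next state of each cell is a bit lookup into the rule number itself:

```python
import numpy as np

def elementary_ca(rule, width, steps):
    """History of a 1D elementary CA as a 2D array, one row per
    generation.  The 3-cell neighbourhood (left, self, right) forms
    an index 0..7 into the 8-bit rule number; e.g. Rule 30 is
    0b00011110.  Edges wrap around."""
    rows = np.zeros((steps, width), dtype=np.int8)
    rows[0, width // 2] = 1                      # single seed in the middle
    for t in range(1, steps):
        prev = rows[t - 1]
        left, right = np.roll(prev, 1), np.roll(prev, -1)
        idx = (left << 2) | (prev << 1) | right  # neighbourhood as 0..7
        rows[t] = (rule >> idx) & 1              # bit lookup in the rule
    return rows
```

Rule 30 is a nice pick for texturing: it's chaotic enough to avoid obvious repetition, but the left edge of its triangle keeps a regular diagonal structure, so the result reads as pattern rather than noise.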

Here's the final appearance.  All the torches have been eliminated and the walls "textured" with glowstone and obsidian based on Rule 30.
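The block mapping itself is the trivial part.  As a sketch (using the beta-era numeric block IDs, 89 for glowstone and 49 for obsidian, which is an assumption about the version in play rather than a quote from my script):

```python
import numpy as np

GLOWSTONE, OBSIDIAN = 89, 49   # classic Minecraft beta block IDs (assumed)

def texture_to_blocks(ca, on=GLOWSTONE, off=OBSIDIAN):
    """Map a binary CA array to block IDs: live cells become
    glowstone, dead cells obsidian."""
    return np.where(ca == 1, on, off).astype(np.uint8)
```

Since glowstone is a light source, the live cells of the CA double as the room's lighting, which is why all the torches could go.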


Sort of a cross between TRON and the Luxor Las Vegas.  :)
