Category: Demos


Plant Growth

Generating plants and foliage has been one of the primary successes of computer graphics over the last few decades; most people would be surprised at how often the plants in a film or advert are virtual copies of the real thing.

A simple way to create plants is with Lindenmayer systems (L-systems), which take a starting string and, over a number of steps, rewrite it according to simple rules.

So, for example, we could start with the character A, and every time we iterate we replace each A with AB and each B with A. This gives us a series that looks like:

A
AB
ABA
ABAAB
ABAABABA
ABAABABAABAAB

and so on.  Visually, we could map the characters to instructions for an image drawer (things like turn left, turn right, draw forwards) and instantly we’ve produced some nice little snowflakes.
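The rewriting step itself is tiny. Here’s a minimal sketch in Python (the demos in this post are written in Processing, but the idea is language-agnostic):

```python
# A minimal L-system rewriter for the rules A -> AB, B -> A.
def lsystem(axiom, rules, steps):
    """Apply the rewrite rules to every character, `steps` times over."""
    s = axiom
    for _ in range(steps):
        # Characters with no rule are copied through unchanged.
        s = "".join(rules.get(c, c) for c in s)
    return s

rules = {"A": "AB", "B": "A"}
for i in range(6):
    print(lsystem("A", rules, i))
# prints the series above: A, AB, ABA, ABAAB, ABAABABA, ABAABABAABAAB
```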

We can generate all sorts of interesting shapes by making tiny additions to those initial rules, such as the ability to jump spaces without drawing or to change colour. All in all it’s a very simple way to make some familiar-looking patterns.

Where things get more dynamic is when we allow the algorithm to maintain a stack of saved positions and orientations (transformation matrices), which lets the drawing branch, a bit like a tree.
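A sketch of how that stack-based turtle interpretation might look (the command letters and the 25° branch angle here are my own assumptions, not taken from any particular demo):

```python
import math

# Illustrative turtle interpreter: F draws forward, + / - turn,
# [ pushes the current state and ] pops it back -- the pop is what
# returns the pen to the branch point, producing tree-like forks.
def interpret(commands, angle_deg=25.0, step=1.0):
    x, y, heading = 0.0, 0.0, 90.0  # start at the origin, pointing "up"
    stack = []
    segments = []  # list of ((x1, y1), (x2, y2)) line segments to draw
    for c in commands:
        if c == "F":
            nx = x + step * math.cos(math.radians(heading))
            ny = y + step * math.sin(math.radians(heading))
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif c == "+":
            heading += angle_deg
        elif c == "-":
            heading -= angle_deg
        elif c == "[":
            stack.append((x, y, heading))   # remember where we branched
        elif c == "]":
            x, y, heading = stack.pop()     # jump back to the branch point
    return segments
```

Feeding it a string like `"F[+F][-F]"` draws a trunk with two branches sprouting from its tip.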

Move into the realm of 3D and we begin to get some very lifelike trees.  Of course, trees are not made up of identical sticks from root to branch, so we need to add in a little stochastic sampling (randomness in the growth), as well as the idea that the tree grows seasonally and that a trunk grows differently to a branch, a leaf or a flower. All of this can be managed with some imagination.

To get an idea of how far this has come, check out the virtual arboretum managed by SpeedTree, a company which specializes in dreaming up virtual flora.  For more detail on how the magic happens, I’d recommend The Algorithmic Beauty of Plants by Przemyslaw Prusinkiewicz and Aristid Lindenmayer.

 


Route Planning: A*

The first week of the Stanford AI Class focused on route planning.  This is a pretty standard problem in computer science, with a number of well-known solutions, including depth-first and breadth-first tree searching (it either means something to you or you really shouldn’t care).  Both of these have the drawback of taking a while to converge on the best route, so by simply applying a heuristic (an estimate) of the remaining distance from each tested location, the algorithm can focus on the best path faster.
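As a sketch of the idea (not the Processing demo itself), here’s a grid A* in Python using Manhattan distance as the heuristic; the grid encoding and four-way movement are my own assumptions:

```python
import heapq

def astar(grid, start, goal):
    """A* on a grid of 0 (open) / 1 (wall) cells; returns the path as a list."""
    def h(p):  # heuristic: Manhattan distance to the goal, never overestimates
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_set = [(h(start), 0, start)]   # priority queue ordered by g + h
    came_from, g = {}, {start: 0}
    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:                  # reached the finish: rebuild the path
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                new_g = g[cur] + 1
                if new_g < g.get(nxt, float("inf")):
                    g[nxt] = new_g
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (new_g + h(nxt), new_g, nxt))
    return None  # no route exists
```

The heuristic is what separates this from plain breadth-first search: nodes that look closer to the goal get popped from the queue sooner.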


Here are a few screenshots of the A* at work: blue is the starting point, pink the finish, yellow the tiles tested, and white the optimum path.

In this case there is a bug which means something is out of whack.  Unfortunately Processing (the language I’ve been using as a demo tool) doesn’t have much in the way of debugging support, so I’m going to have to move to C++ or Java in Xcode or Visual Studio.  It’ll take a while to transition this code, so I may as well head down the C++ route (in my opinion more useful, as it allows me to easily port code to iOS devices).

As an example of how procedural content generation (PCG) works, I’ve created a little demo.

In this case the level genotype is a random number, which the unpacking algorithm uses to mine out a level map.  Entering the same number as a seed creates exactly the same level, yet this simple algorithm provides millions of variations.  Granted, at this stage there is not much to tell one level from another besides geometry, but it should be easy to see how the geometry could be varied over time, and of course there is nothing restricting the technique to a grid pattern.
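The demo’s actual unpacking algorithm isn’t shown here, but a drunkard’s-walk “miner” driven by a seeded random number generator captures the genotype idea (the walk, dimensions and step count below are my own stand-ins):

```python
import random

# Sketch of seed-as-genotype generation: the same seed always "mines out"
# the same map, so an entire level is described by a single number.
def mine_level(seed, width=16, height=16, steps=200):
    rng = random.Random(seed)                    # deterministic per seed
    grid = [[1] * width for _ in range(height)]  # 1 = solid rock
    r, c = height // 2, width // 2               # start mining in the middle
    for _ in range(steps):
        grid[r][c] = 0                           # carve out floor
        dr, dc = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        r = min(max(r + dr, 0), height - 1)      # stay inside the map
        c = min(max(c + dc, 0), width - 1)
    return grid
```

Calling `mine_level(42)` twice returns identical maps; changing the seed changes the level.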

This second video has a minor modification: the algorithm backfills any areas it finds that are almost completely surrounded by space, which allows the map to look a little more natural.
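The exact backfill rule isn’t spelled out in the video; one plausible version is to open up any solid cell that has three or more open neighbours:

```python
# Hedged sketch of a backfill pass: any rock cell almost surrounded by
# space (three or more open orthogonal neighbours) becomes space too,
# smoothing away stray spurs so the map reads more naturally.
def backfill(grid):
    height, width = len(grid), len(grid[0])
    out = [row[:] for row in grid]  # write to a copy so reads stay stable
    for r in range(height):
        for c in range(width):
            if grid[r][c] == 1:
                open_neighbours = sum(
                    1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= r + dr < height and 0 <= c + dc < width
                    and grid[r + dr][c + dc] == 0
                )
                if open_neighbours >= 3:
                    out[r][c] = 0
    return out
```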

Perlin Noise

Perlin noise is an algorithm for generating smooth pseudo-random gradients in one, two or three dimensions, largely by layering ‘octaves’ of noise at varying frequencies and amplitudes.  The end result is that we can use this to create seemingly random but realistic-looking textures and fractal landscapes very easily.
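True Perlin noise interpolates pseudo-random gradients at lattice points; the octave-layering idea can be shown with a simpler 1D value-noise stand-in (not Perlin’s actual algorithm, and the persistence value is just a common default):

```python
import math
import random

def smooth_noise(x, lattice):
    """Cosine-interpolate between the two random lattice values around x."""
    i = int(math.floor(x))
    t = x - i
    t = (1 - math.cos(t * math.pi)) / 2   # ease the blend at each end
    a, b = lattice[i % len(lattice)], lattice[(i + 1) % len(lattice)]
    return a * (1 - t) + b * t

def octave_noise(x, octaves=3, persistence=0.5, lattice=None):
    """Layer octaves: each one varies twice as fast but contributes half as much."""
    if lattice is None:
        rng = random.Random(0)            # fixed seed: repeatable noise
        lattice = [rng.random() for _ in range(256)]
    total, amplitude, frequency = 0.0, 1.0, 1.0
    for _ in range(octaves):
        total += amplitude * smooth_noise(x * frequency, lattice)
        amplitude *= persistence          # quieter...
        frequency *= 2.0                  # ...but busier, adding fine detail
    return total
```

The low-frequency octave gives the broad hills, the high-frequency octaves add the roughness on top.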

The first video is a generic 3 octave Perlin noise gradient.

We can easily turn this into a 3D model; the following shows how this works, using frequency cutoffs to create terrain lines.

Bacteria

A very simple implementation of Conway’s Game of Life.

The above is the standard implementation; I then tweaked the program to be a bit more aggressive:

This stuff is really useful for creating progressions, and turns up in games, textures, AI and music.
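The update rule itself fits in a few lines. A Python sketch with wrap-around edges (the linked source below is the actual Processing implementation):

```python
# One generation of Conway's Game of Life on a small toroidal grid:
# a live cell survives with 2 or 3 live neighbours, a dead cell is
# born with exactly 3; everything else dies or stays empty.
def life_step(grid):
    height, width = len(grid), len(grid[0])

    def neighbours(r, c):
        return sum(
            grid[(r + dr) % height][(c + dc) % width]  # wrap at the edges
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)
        )

    return [
        [
            1 if (grid[r][c] and neighbours(r, c) in (2, 3))
            or (not grid[r][c] and neighbours(r, c) == 3) else 0
            for c in range(width)
        ]
        for r in range(height)
    ]
```

A vertical three-cell “blinker” flips to horizontal and back again, one generation at a time.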

Source code here:

https://github.com/opposable/Conway

Bristol Game Jam

Bristol is a wonderful place for digital creatives, so it’s a bit of a surprise that the games development scene seems to have quietened down over the last few years.

But that doesn’t mean it needs to stay quiet.

To stimulate the scene (and have some fun) a bunch of Bristol game developers, sound and graphics designers got together to stage the Bristol Game Jam event.  Held at Pervasive Media Studio, organised by Red Wasp Design with a few helping hands, and in cooperation with the ExPlay festival, a bunch of us got together to build a game in 24 hours.

Except that we got together on Friday night, whereupon we were given the theme for the games (Mirror) and then decided that food and beer were the way to generate some interesting ideas.  That ended at about midnight for most of us, and there were a few sore heads on Saturday morning.  The deadline was 7pm Saturday evening, so in effect we had about seven hours to put together a working game.

Amazingly all four groups achieved this, and by the end of the day there were five playable games.  Granted, none were going to win any awards (although given more time, who knows), but it was very inspiring for everyone involved. I think a lot of relationships were formed, and there was a general buzz about the day which everyone took out with them on Saturday night.

Our team put together a very simple game based upon Jason (of the Argonauts) trying to reflect Medusa’s gaze back before being hit himself:

Our team included Jae (DJ Task) and myself programming, Tessa providing the visuals and 3D models (you don’t want to know what it looked like before the visuals), and Dave (founder of Echoic Audio) producing the audio, which really made the game.  Dave has put a far better post up here to describe the day.

Great fun, and hopefully the start of many more Game Jams and the opening up of the game scene in Bristol.

Source code on GitHub.