Plug-In : An arts council technology residency


Project Stage 2


I decided to create different 'play' zones in a single application. Each zone would be accessible via a number key on the phone. I didn't want the program to be directed at competitive play or level-orientated goals, so there was to be no 'point scoring' or 'lives' in the traditional sense.

Key to the development of the zones was the use of a pixel engine. One useful resource for coding and experimental ideas was the 'demo-scene'. Popular in the 80s, this was a movement of young programmers and hackers writing advanced graphics and audio routines to overcome the limitations of contemporary computers (286, Amiga etc). They often focussed on optimizing code for small memory footprints and low-resolution, low-color displays. Mobile platform development is at much the same stage of technological development at the moment, and as such demo-coding ideas apply very well to these devices.

Various graphical effects were designed and tested for each potential zone and a simple particle system was developed. After trying out voxel-based engines and other pseudo-3D effects it seemed better to let the zones share a common ground. In this spirit, the user interacts with each zone using only the cursor (a box shape centered on a crosshair). This gives the user a familiar interface and reduces the complexity of input to just four directional keypresses. A brief description of the pixel engine design follows.
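The core of a simple particle system like the one described can be sketched in a few lines. This is a portable C++ illustration only (the device code was Symbian C++, which has no STL, so the container choice and the `Particle`/`ParticleSystem` names here are my own, not from the project):

```cpp
#include <algorithm>
#include <vector>

// One particle: position, per-frame velocity, and remaining lifetime.
struct Particle {
    float x, y;    // position in pixels
    float vx, vy;  // velocity in pixels per frame
    int   life;    // frames remaining before the particle dies
};

class ParticleSystem {
public:
    void Emit(float x, float y, float vx, float vy, int life) {
        iParticles.push_back({x, y, vx, vy, life});
    }

    // Advance every particle by one frame, then drop the expired ones.
    void Update() {
        for (auto& p : iParticles) {
            p.x += p.vx;
            p.y += p.vy;
            --p.life;
        }
        iParticles.erase(
            std::remove_if(iParticles.begin(), iParticles.end(),
                           [](const Particle& p) { return p.life <= 0; }),
            iParticles.end());
    }

    std::size_t Count() const { return iParticles.size(); }

private:
    std::vector<Particle> iParticles;
};
```

Each frame the engine would call `Update()` once and then plot every live particle into the offscreen buffer.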

Pixel Engine

I decided to write my own pixel drawing code rather than stick with the built-in sprite bitmap blitting functions. However, there are a few obstacles to overcome when working this way.

Firstly, if you are writing data to the screen you are at the mercy of the 'window server', the part of the Symbian OS that handles screen update callbacks, and it can run very slowly. It causes particular problems if it tries to render a screen area while you are writing to it. This can be solved by bypassing the window server and using direct screen access, which requires you to update the screen yourself (and write some handling code to cope with application interruptions).

Secondly, the displays on most Symbian 6.1 devices (the version I am working on) use a 12-bit display depth. This is unusual, as most systems are based on 8-bit, 16-bit, 24-bit or 32-bit depths. We are trying to represent r,g,b values (which in 24-bit would be 8 bits per color). In 12-bit mode there is a maximum of 4096 values, split into three 4-bit ranges as follows: blue [1 2 4 8], green [16 32 64 128], red [256 512 1024 2048]. An offscreen bitmap is used as a temporary 12-bit buffer in my application, and the whole scene is blitted to the screen after each frame of processing is finished.
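The bit ranges above (blue in bits 0-3, green in bits 4-7, red in bits 8-11) can be expressed directly as packing code. A minimal portable sketch, assuming 8-bit input channels; `Pack12`/`Unpack12` are illustrative names, not part of any Symbian API:

```cpp
#include <cstdint>

// Pack 8-bit r,g,b channels into the 12-bit layout described above:
// blue occupies bits 0-3, green bits 4-7, red bits 8-11.
// Each channel keeps only its top 4 bits.
inline uint16_t Pack12(uint8_t r, uint8_t g, uint8_t b) {
    return static_cast<uint16_t>(((r >> 4) << 8) | ((g >> 4) << 4) | (b >> 4));
}

// Expand a 12-bit value back to 8-bit channels by replicating each
// nibble (0xF -> 0xFF, i.e. multiply by 17).
inline void Unpack12(uint16_t c, uint8_t& r, uint8_t& g, uint8_t& b) {
    r = static_cast<uint8_t>(((c >> 8) & 0xF) * 17);
    g = static_cast<uint8_t>(((c >> 4) & 0xF) * 17);
    b = static_cast<uint8_t>((c & 0xF) * 17);
}
```

With this, white packs to 0xFFF and pure red to 0xF00 (256+512+1024+2048 = 3840), matching the ranges listed above.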

Once a solid pixel drawing routine had been established and a common interface decided upon, the programming of the zones could commence.


Sound

Initially I assumed that mobile devices, often sold on the strength of polyphonic ringtones and great sound quality, would be a good platform for developing interesting uses of sound. However, Symbian offers little in the way of sound interfaces and most of the target devices are monophonic, supporting only low-quality sound playback.

There are two main routes to sound creation on Symbian: CMdaAudioToneUtility for generating simple monophonic sine-wave tones, and CMdaAudioPlayerUtility for playback of media types such as wav and midi files. Although an application can hold a number of audio files in memory, it can only play one file at a time. There is no easy way to multi-track audio without writing data directly to the sound output port and building an independent mixer interface (a substantial project in itself).
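To give a sense of what such a mixer would involve: its inner loop has to sum the tracks sample by sample and clamp the result into the output range before it reaches the sound port. A minimal portable C++ sketch, assuming 16-bit signed samples and two tracks (`MixInto` is a hypothetical name, not a Symbian API):

```cpp
#include <cstddef>
#include <cstdint>

// Mix two 16-bit sample buffers of length n into out.
// Sums are widened to 32 bits and clamped so loud passages
// saturate instead of wrapping around.
void MixInto(int16_t* out, const int16_t* a, const int16_t* b, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i) {
        int32_t sum = static_cast<int32_t>(a[i]) + b[i];
        if (sum > 32767)  sum = 32767;
        if (sum < -32768) sum = -32768;
        out[i] = static_cast<int16_t>(sum);
    }
}
```

Even this simplest case ignores resampling, differing channel counts and timing, which is why a full mixer is a substantial project in itself.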

This level of simplicity only allows for basic event-triggered sounds (like menu selection noises and alert messages). I did create a parallel version of several zones that used basic audio like this, but ultimately I found the existing system too restrictive to do anything very dynamic. It is also worth noting that adding even a few short wave (.wav) files to the project instantly doubled the overall file size.

Next: Project Stage 3 >>>