Archive for the ‘ Coding ’ Category

Ruby, MySQL and Snow Leopard

It should have been a simple exercise. I just wanted to set up a dev environment for a Ruby on Rails app that uses MySQL on my Snow Leopard MacBook 5.5 – easy, right? I wish… after much trial and error, and many encounters with the dreaded error message:

"uninitialized constant MysqlCompat::MysqlRes"

… I finally got everything to work using the following steps:

  1. Installing the 64-bit version of MySQL 5.5.14 using the .dmg installer.
  2. Removing any pre-existing versions of the mysql gem:
    sudo gem uninstall mysql
  3. Installing the correct gem with the following line:
    sudo env ARCHFLAGS="-arch x86_64" gem install mysql -- --with-mysql-config=/usr/local/mysql/bin/mysql_config
  4. Setting the linker path (a step missing from most how-tos):
    export DYLD_LIBRARY_PATH=/usr/local/mysql/lib/

After this, you can verify your installation is working with irb:

Motoko:project-rails karsten$ irb
>> require 'rubygems'
=> false
>> require 'mysql_api'
=> true
>> Mysql.get_client_info
=> "5.5.14"

With this in place, the rake database creation and migration tasks ran smoothly.

Developing a VR tank

Last week I finished the most exciting project of my professional life (so far): I got to develop a virtual reality system with a 180° projection surface, more catchily dubbed a VR tank. It features:

  • a curved projection surface more than 4 meters wide and 2 meters high
  • an HD stereo rear-projection system consisting of 4 projectors
  • a 10-camera VICON motion capture system
  • optionally, a treadmill to give users an infinite walking space

My part in the project focused on developing a VR application framework in C++, integrating the open-source Horde3D engine, interfacing with the real-time Vicon data stream and developing a new perspective-based rendering pipeline. It was most satisfying!

Below you can see a video of the system in action (filmed in infrared). The bright objects atop the screen are the Vicon cameras; their glare is infrared only and normally not visible. You can see how the user's perspective is dynamically calculated from head position: the black areas outside his field of view are not visible to the user.

The objects that the user interacts with appear to him to float in front of the screen. The Vicon system does full-body skeleton reconstruction, and we track the user's hand position to detect object interactions: the spheres “bounce” and disappear when he touches them. They look blurry in the video because the two stereo-separated views are superimposed.

How does it work?

We use the Vicon system to implement a six-degrees-of-freedom head tracker, which lets us approximate the user's field of view in the virtual world. Using standard OpenGL cameras, we then render two fixed-FOV views, one for each eye. The “magical ingredient” is a GLSL shader that raytraces these world views onto a theoretical cylinder representing our projection surface. The final images are then drawn (quad-buffered) on a fullscreen quad spanning the projection screen.
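To make the cylinder warp a bit more concrete, here is the gist of what the shader does per output pixel, restated as plain C++ rather than GLSL. This is only a sketch, not our actual pipeline code: the cylinder parameterization, matrix layout and names are illustrative assumptions.

#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Map an output pixel (u, v in [0,1]) to its point on the physical cylinder.
// The cylinder is centered on the vertical axis, spans 'arcAngle' radians
// horizontally and 'height' meters vertically, and stands in front of the origin.
Vec3 cylinderPoint(float u, float v, float radius, float arcAngle, float height)
{
    float phi = (u - 0.5f) * arcAngle;           // horizontal angle along the arc
    return { radius * std::sin(phi),             // x
             (v - 0.5f) * height,                // y
             -radius * std::cos(phi) };          // z (OpenGL convention: screen towards -z)
}

// Project that cylinder point with the per-eye camera (head-tracked position,
// fixed FOV) and return the texture coordinate at which to sample the planar render.
Vec2 warpToPlanarView(const Vec3& p, const float viewProj[16])  // column-major matrix
{
    float cx = viewProj[0] * p.x + viewProj[4] * p.y + viewProj[8]  * p.z + viewProj[12];
    float cy = viewProj[1] * p.x + viewProj[5] * p.y + viewProj[9]  * p.z + viewProj[13];
    float cw = viewProj[3] * p.x + viewProj[7] * p.y + viewProj[11] * p.z + viewProj[15];
    // perspective divide, then NDC [-1,1] -> texture coordinates [0,1]
    return { (cx / cw) * 0.5f + 0.5f, (cy / cw) * 0.5f + 0.5f };
}

In the real shader this runs once per pixel and per eye, followed by a texture lookup into the corresponding pre-rendered view.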

The video unfortunately shows only a rather boring environment view – it’s amazing how real and immersive more complex environments feel when experienced through our setup.

I’d like to thank my esteemed colleague Martin Löffler for the productive collaboration and all the fun we had!

Screen-space Ambient Occlusion

Shading with texture or flat color and SSAO only; no lighting or other shading is used.

Screen-space ambient occlusion is all the rage today, and for good reason: it's hard to find an approximation of accessibility shading with similar run-time characteristics (most importantly, independence from scene complexity). Results are fast and consistent, and it looks a lot like global illumination (which, of course, it is not).

So, when we were tasked with making our tech demos prettier, this was a pretty obvious avenue to explore. Since we already use Horde3D, a shader-driven game engine framework, it was mostly a matter of finding sample algorithms and adapting them to our engine. In the picture above you can see the shader in action.

In the demo, you can control the character and walk/run around the famous Sponza atrium (which for some inexplicable reason has filled up with giant blue cubes). Smooth animation blending comes courtesy of the sweet Horde3D framework. Developing the shader took about a week; the demo application, about a day.

The cubes appear a bit out of focus because the ambient occlusion map is rendered at a reduced resolution, then filtered with a Gaussian blur kernel and upscaled to reduce the noise that is inherent to the method. Combining the SSAO shader with some actual lighting (which would accentuate the edges) would fix that, but wasn't in the scope of the demo.
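For the curious, the occlusion estimate itself is conceptually simple. Below is a CPU-side C++ restatement of the idea; the actual implementation is a GLSL fragment shader, and the kernel size, radius and depth-buffer access shown here are illustrative assumptions rather than the demo's real values.

#include <cstdlib>
#include <functional>

struct Vec3 { float x, y, z; };

// 'sceneDistanceAt' returns the positive camera-space distance stored in the
// depth buffer at the screen location that a point projects to (in the real
// shader this is just a texture fetch).
float ambientOcclusion(const Vec3& pos, const Vec3& normal,
                       const std::function<float(const Vec3&)>& sceneDistanceAt,
                       int kernelSize = 16, float radius = 0.5f)
{
    int occluded = 0;
    for (int i = 0; i < kernelSize; ++i)
    {
        // random offset, flipped into the hemisphere above the surface normal
        Vec3 dir = { std::rand() / (float)RAND_MAX * 2.0f - 1.0f,
                     std::rand() / (float)RAND_MAX * 2.0f - 1.0f,
                     std::rand() / (float)RAND_MAX * 2.0f - 1.0f };
        if (dir.x * normal.x + dir.y * normal.y + dir.z * normal.z < 0.0f)
        {
            dir.x = -dir.x; dir.y = -dir.y; dir.z = -dir.z;
        }

        Vec3 samplePos = { pos.x + dir.x * radius,
                           pos.y + dir.y * radius,
                           pos.z + dir.z * radius };

        // camera looks down -z in view space, so -z is the sample's distance
        float sampleDist = -samplePos.z;

        // if the depth buffer holds geometry closer than the sample, the sample
        // is covered and contributes occlusion (a real shader adds a range check)
        if (sceneDistanceAt(samplePos) < sampleDist)
            ++occluded;
    }
    // 1.0 for fully open pixels, approaching 0.0 for heavily occluded ones
    return 1.0f - occluded / (float)kernelSize;
}

The noisy output of this estimate is exactly what the downscaled rendering and Gaussian blur mentioned above are there to clean up.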

Code and binary downloads will follow in a few days!

iPhone, here I come!

I’ve recently started to acquaint myself with iPhone development. Being completely new to the Cocoa (Touch) API, I’ve been impressed by the ease and level of abstraction that this development environment offers (although, of course, getting used to the API concepts will take a while – I’ve never coded for the Mac before). I like how Apple strongly encourages the use of the MVC paradigm; it almost invariably results in good code architecture.

I’ve chosen to develop a “sellable” application as my tutorial project – I always learn best with a clear goal in mind. I’m calling it “BudgetBalancer”, and it’s a tool for a couple trying to balance their expenses. I may have to find a better name before putting it in the App Store… :)

The project will teach me about:

  • Objective-C
  • Interface building
  • Cocoa touch
  • Core Data
  • online synchronization

I’m a few days into the project now, and really starting to like Objective-C. I found this tutorial very helpful – it’s concise and concentrates on the essentials. Perfect if you already know a lot of languages and really don’t need to be taking baby steps.

I’ll try to keep posting about my progress as it, uh… progresses. :)

“Fishtank VR” for TrackIR

I’ve been incredibly impressed by Johnny Lee’s “Fishtank VR” demo using the Nintendo Wiimote as a head tracker: simply by introducing motion parallax tied to physical head movement, Lee was able to create a very strong sense of three-dimensionality – completely without stereoscopy!

Even more awesome: he provided the complete source code for his applications! Having a TrackIR 5 consumer-grade infrared head tracker at my desk, I decided to adapt Lee’s application to it. Using the OptiTrack SDK, I was soon up and running… here’s my application (source code included):

(Note that I do not claim any sort of originality, credit or even real authorship.  All I did was adapt the existing code to a new SDK.)
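If you are wondering what the adaptation boils down to: the heart of the effect is an off-axis (asymmetric) projection frustum, recomputed from the tracked head position every frame, so the physical screen behaves like a window into the scene. Here is a sketch of that math in C++ with fixed-function OpenGL; the screen dimensions, units and setup are illustrative, and the TrackIR/OptiTrack calls that deliver the head position are omitted.

#include <GL/gl.h>

// Screen half-extents are given in the same units as the head position
// (e.g. meters), measured from the screen center; the screen lies in the
// plane z = 0 and the head is at positive z in front of it.
void applyHeadCoupledProjection(float headX, float headY, float headZ,
                                float screenHalfW, float screenHalfH,
                                float nearPlane, float farPlane)
{
    // Scale the screen rectangle, as seen from the head, down to the near plane.
    float scale  = nearPlane / headZ;
    float left   = (-screenHalfW - headX) * scale;
    float right  = ( screenHalfW - headX) * scale;
    float bottom = (-screenHalfH - headY) * scale;
    float top    = ( screenHalfH - headY) * scale;

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(left, right, bottom, top, nearPlane, farPlane);

    // Move the world so the camera sits at the tracked head position.
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(-headX, -headY, -headZ);
}

Everything else is an ordinary OpenGL render; only the projection and the camera position change with the head.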