17 May 2009

1st position wheel navigation test in Blender

Thinking of Ho and the zebra crossing scene (visual part), I wanted to test navigation from a camera placed in 1st position in Blender. To do so, I again used:

* The largest real navigation wheel I have in the Berlin studio.
* An optical mouse (this time an Apple mouse with a higher refresh rate than the one employed before, some cheap model from Media Markt).
* The Blender 2.48 game engine, with a simple cube and a parented camera looking at a third object (a sphere).
* A Python script called MouseLook.py, found at www.tutorialsforblender3d.com.
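The core idea of a mouse-look script like the one above can be sketched in plain Python (this is a hypothetical helper, not the actual tutorialsforblender3d.com script): each frame, the mouse's offset from the screen centre becomes a rotation increment, and the cursor is then warped back to the centre so the deltas stay bounded.

```python
def mouse_to_rotation(mouse_x, mouse_y, width, height, sensitivity=0.0005):
    """Return (yaw, pitch) rotation increments in radians from one
    frame's mouse position; assumes the cursor is re-centred afterwards.
    The sensitivity value here is an assumption, tuned by trial."""
    dx = mouse_x - width // 2   # horizontal offset from centre, in pixels
    dy = mouse_y - height // 2  # vertical offset from centre, in pixels
    yaw = -dx * sensitivity     # applied around the vertical axis
    pitch = -dy * sensitivity   # applied around the horizontal axis
    return yaw, pitch
```

In Blender 2.48 this sort of function would be wired to a Mouse "Movement" sensor, with the increments applied to the parented cube each logic tick.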

The navigation wheel rotates the cube. The idea is that the navigator will need to turn the wheel several times to achieve a single 360-degree rotation of the cube. This may resemble the physical effort needed to move the virtual object in Ho. The quality of the video is not great, but I trust it is possible to see that the movement of the wheel coincides with the rotation of the cube.
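The "several wheel turns per cube revolution" gearing amounts to dividing the accumulated mouse counts by a ratio before mapping them to an angle. A minimal sketch, where both constants are assumptions for illustration:

```python
TURNS_PER_REVOLUTION = 4      # wheel turns needed for one full cube turn (assumed)
COUNTS_PER_WHEEL_TURN = 2000  # mouse counts read per wheel rotation (assumed)

def cube_angle(total_counts):
    """Map accumulated mouse counts to the cube's angle in degrees;
    raising TURNS_PER_REVOLUTION increases the physical effort needed."""
    counts_per_cube_turn = TURNS_PER_REVOLUTION * COUNTS_PER_WHEEL_TURN
    return (total_counts / counts_per_cube_turn) * 360.0
```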

Prior to this experiment I tested the navigator's wheel and the mouse in MaxMSP, using the aka objects. I learned how to hide and show the mouse, and also calibrated the mousestate movement for two screens (one above the other, as the mouse is in the wrong position; I can't hack the mouse because it belongs to Manchester University). In combination with Blender there is no need to hide the mouse, of course, because it disappears when the game engine is active.
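The two-screen calibration boils down to rescaling the raw vertical mouse coordinate so that the lower (working) screen maps to a 0..1 range. A sketch, where the screen heights are assumptions:

```python
TOP_H = 768     # height of the upper screen in pixels (assumed)
BOTTOM_H = 768  # height of the lower screen in pixels (assumed)

def calibrate_y(raw_y):
    """Normalise a raw vertical mouse coordinate (spanning both stacked
    screens) to 0..1 on the lower screen; positions on the upper screen
    come out negative, which makes them easy to clip or ignore."""
    return (raw_y - TOP_H) / BOTTOM_H
```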

Below are a couple of night-shot pictures of the Apple mouse (there are people sleeping here...), simply placed on top of the rotation axis. I added some masking tape to the wheel bar for better mouse reading.

Next, I guess, will be to assemble all the recent ideas together:

- OSC and Max to receive data from Blender
- A 3D model of the navigator (for 1st camera position)
- A 3D zebra crossing with vehicles passing by (I need to associate proximity to sound, etc.; I just read a number of tutorials about collision, proximity, etc.)
- The metadata Max patch to retrieve sounds from a database using custom-made metadata inspired by typomorphology for this particular scene.
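For the first item in the list, Blender can talk to Max by sending OSC messages over UDP; Max would pick them up with udpreceive. A minimal sketch of packing a single float by hand, following the OSC 1.0 layout (the address pattern and port number are assumptions):

```python
import socket
import struct

def osc_pad(data):
    """Null-terminate and pad a byte string to a multiple of 4 bytes,
    as the OSC 1.0 spec requires for strings."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address, value):
    """Build an OSC message carrying one big-endian float32 argument."""
    return (osc_pad(address.encode()) +  # address pattern, e.g. /cube/rotation
            osc_pad(b",f") +             # type tag string: one float
            struct.pack(">f", value))    # the float itself

def send_rotation(angle, host="127.0.0.1", port=7000):
    """Send the cube's rotation angle to Max (udpreceive 7000, assumed)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(osc_message("/cube/rotation", angle), (host, port))
    sock.close()
```

In practice a ready-made OSC library (or a Blender OSC script) would do the same packing, but the message format itself is simple enough to build directly.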