Friday, January 4, 2008

LadyBug Adventures at Mawson's Hut

Yesterday (Thurs 3rd) I manhauled a sled over to the Main Hut (truly in the manner of the early explorers...) carrying the LadyBug camera, a laptop, several other digital cameras, survival kit and so on. It was quite windy, which makes it surprisingly tough going with a reasonable amount of weight on the sled and the wind coming off the plateau - even though it was only about 25 knots. Nevertheless, having persevered through this extreme personal test of endurance, I arrived with all the equipment necessary to embark upon shooting some 360° video.

The LadyBug is the small red object you see on the top of the tripod in the right-hand image - it captures a 360° dome image via six lenses on the camera body (a surprisingly compact unit), each lens capturing at 1024x768 pixel resolution at a frame rate of 15fps. This data is piped down an optical fibre link to the laptop and the Ladybug software, where you can preview the view from each camera, the fully stitched pano, and a navigable 3D view in which the video is wrapped around a virtual sphere that the user can rotate. I placed the camera in the centre of the Main Hut - where the dining table used to be - which gave a pretty good view of the interior. Because of parallax issues with the camera, the closest objects need to be about 1m from the lenses, so this was a fairly optimal position that still allowed us to walk around the camera.
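To give a sense of the data volumes involved, here's a rough back-of-the-envelope calculation. It assumes about one byte per pixel of raw sensor data and no compression, which is almost certainly not exactly how the camera packs its stream - so treat the numbers as an upper bound:

```python
# Back-of-the-envelope data rate for the Ladybug stream, using the
# figures mentioned above (6 lenses, 1024x768, 15 fps).
# ASSUMPTION: roughly 1 byte per pixel of raw sensor data and no
# compression - the real stream may well be compressed.

LENSES = 6
WIDTH, HEIGHT = 1024, 768
FPS = 15
BYTES_PER_PIXEL = 1  # assumed

bytes_per_second = LENSES * WIDTH * HEIGHT * FPS * BYTES_PER_PIXEL
mb_per_second = bytes_per_second / 1e6

print(f"~{mb_per_second:.0f} MB/s of image data")           # ~71 MB/s
print(f"~{mb_per_second * 60 / 1000:.1f} GB per minute")    # ~4.2 GB/min
print(f"~{mb_per_second * 1800 / 1000:.0f} GB per 30 min")  # ~127 GB
```

That's rather more than the ~80GB per 30 minutes that the RAID actually fills (see below), which suggests the stream is compressed somewhere along the way - but either way it's a torrent of data, which is why the footage goes straight to a dedicated RAID rather than an ordinary laptop drive.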

I was able to interview Michelle and Anne about their respective activities and interests as materials conservator and archaeologist, using my iPod with a mic attachment. The Ladybug itself doesn't capture synced audio (or any audio at all), so I will have to sync this up in postproduction - trivial stuff. This was quite an interesting exercise, as the nature of the 360° camera changes the way you think about working in a scene: there's no panning the camera around or framing shots, but rather a more natural sense of gesturing to objects in a space that the user can then navigate to.

However, this also tempts one to stand anywhere in the space and to forget that objects and corners can obscure the interviewee! Similarly, it will be interesting to think through the sound design of the sequence: the iPod will pick up the immediate soundfield of the interview, but not the more general sound environment - an issue that interests me greatly. I will probably end up creating a multichannel surround sound mix using a variety of ambisonic sources that I have also been experimenting with. This sound field could be crossfaded and directionally mixed depending upon user focus in the navigable video when it is dome-projected. Lots of interesting possibilities.
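To sketch what I mean by directional mixing: imagine each ambience source assigned a direction in the panorama, with its gain falling off as the viewer looks away from it. The source names and directions below are entirely made up for illustration, and none of this is the Ladybug SDK or any particular audio engine - it's just the shape of the idea:

```python
import math

# Hypothetical view-dependent ambience mix: each source sits at an
# (assumed, made-up) direction in the panorama, and its gain falls off
# with the angular distance between that direction and where the
# viewer is currently looking.

SOURCES = {
    "interview_ipod": 0.0,          # degrees around the panorama (assumed)
    "hut_interior_ambience": 120.0,
    "wind_outside_door": 240.0,
}

def gain(view_deg: float, source_deg: float, width_deg: float = 120.0) -> float:
    """Cosine falloff from full gain at the source direction down to
    silence once the viewer looks width_deg or more away from it."""
    diff = abs((view_deg - source_deg + 180.0) % 360.0 - 180.0)  # shortest angle
    if diff >= width_deg:
        return 0.0
    return math.cos(math.pi * diff / (2.0 * width_deg))

view = 90.0  # the viewer's current heading
mix = {name: gain(view, deg) for name, deg in SOURCES.items()}
print(mix)
```

In practice the ambisonic beds would be decoded for the dome's speaker array rather than faded as simple point sources, but the weighting idea - the mix following the viewer's attention - would be much the same.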

More technically, I found the software that comes with the camera to be a bit flaky - but this is to be expected, as it is an alpha release! Despite the portable generator cutting out twice, and thus the RAID array (the hard drive array onto which the video is captured) stopping, it seems that all the data was written to disk correctly. You simply had to restart the generator, reboot the software on the laptop and start the camera system again, and it was all sweet - thank goodness.

The only real disaster came when I attempted to shoot some material outside and it became impossible to see the laptop screen in the glare - so I was forced to huddle under my Carhartt jacket as a hood (probably much as Hurley would have with his cameras!) in order to operate it properly. Then, thanks to the Wonder of Windows, the computer crashed and subsequently told me that there were no images on the hard drive - even though I had shot over 17,000 frames. Sigh. Bloody computers. But I'm used to this sort of thing - one can only be stoical, however careful one is. So, calling it a day, I trudged back - aware of what I had learnt about the camera and its various quirks and bugs, and thinking I'd have to come back the next day. Fortunately, though, as I had not faffed around with the laptop after the crash (I simply shut it down), when I got back to the Sorenson Hut and rebooted - a miracle! There it all was! I hope the scene at the end, when about 20 penguins came up to check out the camera, is there....

But there's one more aspect to this Ladybug saga - getting the data off the RAID. Because the camera captures a datastream from six cameras and writes it to the RAID, the array seems to have been formatted in some proprietary way that neither Windows nor OS X can access directly. This means that the only way to get the data off the RAID is via the Ladybug software - and that is a VERY slow process. Despite trying several techniques, it looks like I simply will not be able to transfer the raw data off the RAID quickly - it must be processed by the laptop and the Ladybug software/SDK (stitched either to large 2000x1000 .bmp files at 5.1MB per frame or, hopefully, to much more compact .png files) - and, given the amount of data, this will take days. Normally this wouldn't be a problem, but when you're in Antarctica and the generator only runs for about 3-4 hours a day, it most certainly is. I wish I could rapidly transfer all the data off and then process it back in Australia, as the camera can only capture about 30 minutes' worth of material at a time (80GB). Anyway, I was aware of this issue when I came, but it is a definite limitation for which it would be nice to find a workaround or kludge. I experimented with some raw files exported by the system to see if other stitching algorithms might work more rapidly, but as there is no EXIF data in the files and I don't have access to the camera specs, the results were not up to scratch.
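For a sense of scale, here's the rough arithmetic. The frame count and .bmp size come from the figures above (15fps, a ~30 minute capture, ~5.1MB per stitched frame, 3-4 hours of generator time a day); the stitching rate is a pure guess, so the time estimate is only illustrative:

```python
# Rough numbers for the export bottleneck. Frame count and BMP size
# come from the post (15 fps, ~30 min of capture, ~5.1 MB per stitched
# 2000x1000 .bmp). The stitching rate is a guess - I don't know how
# fast the laptop actually processes frames - so vary it to taste.

FPS = 15
CAPTURE_MINUTES = 30
BMP_MB_PER_FRAME = 5.1
GENERATOR_HOURS_PER_DAY = 3.5   # "about 3-4 hours a day"
ASSUMED_STITCH_FPS = 1.0        # guessed processing rate

frames = FPS * CAPTURE_MINUTES * 60      # 27,000 frames per capture
export_gb = frames * BMP_MB_PER_FRAME / 1000
hours_needed = frames / ASSUMED_STITCH_FPS / 3600
days_needed = hours_needed / GENERATOR_HOURS_PER_DAY

print(f"{frames} frames -> ~{export_gb:.0f} GB of .bmp output")
print(f"~{hours_needed:.1f} h of processing, ~{days_needed:.1f} generator-days")
```

Even at that fairly optimistic guess it works out to a couple of generator-days per half-hour capture, which is exactly why being able to offload the raw data and process it back in Australia would be so appealing.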

So - an interesting experiment. I think the results will be useful - even striking - once I've had a chance to work on the footage back in Oz - but it would have been nice to be able to do that here. Hopefully with the new sustainable energy sources operating here (the wind generator and solar panels) this will be possible in the future - we'll see!

- Peter Morse

1 comment:

Unknown said...

Hi Peter

Since the Ladybug sends a signal out via FireWire, I wonder if it would be possible to capture the image in FCP, with the capture settings set to 'generic capture' and 'non-controllable device'. Don't think this would be doable in FCE, but you could possibly customise an FCP template enough to get straight capture and avoid the long transfer times.

Hope you're enjoying the cold - been disgustingly hot here in Perth!