Check out the main OpenLase page here.
First of all, as I’m sure everyone knows by now, I’ve been working on hacking the Kinect and writing open drivers for it. There’s a website for the community and a Git repo with the code, and it’s working fairly nicely by now.
With that out of the way, here’s a project that I’ve been working on, on and off, for the past year or so. I’ve been interested in laser scanning and DIY laser projectors, but I couldn’t find any good open source software to drive them. Specifically, I was interested in the realtime aspect: rendering and showing dynamically generated images and responding to events, not just authoring and preprocessing canned laser shows. So I set out to write my own software to do real-time rendering. This was the result:
DIY laser projectors commonly use sound cards as DACs. This shifts most of the processing over to the PC, but it also gives us very fine control over the realtime aspects of projection, which is exactly what I want. Thus, my laser projector is based on a bog-standard USB sound card, modified to pass DC. I’ll probably write a detailed article on the hardware later, but suffice it to say that it’s a galvo kit, a hacked Chinese laser pointer, my own laser driver and monitoring circuitry, and some other minor parts. Total cost is about €200, if you play your cards right.
Since we’re converting laser images to audio data anyway, why not just treat the laser data as audio in the first place? After all, laser samples are audio-rate data, and 16-bit multichannel 48kHz audio fits the requirements for laser projection very well. So that is what I did. OpenLase isn’t really a monolithic framework. Instead, it’s a collection of stand-alone applications and components built on the excellent JACK audio connection kit, which pipes realtime laser data between the different bits.
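To give an idea of how simple a laser “image source” can be in this model, here’s a minimal sketch of a bare JACK client that draws a static circle by writing X/Y coordinates as two audio channels. This isn’t OpenLase code; the client and port names are made up for illustration.

```c
/* Minimal sketch (not OpenLase code): a bare JACK client that traces a
 * circle by emitting X/Y galvo positions as two audio channels. */
#include <jack/jack.h>
#include <math.h>
#include <stdio.h>
#include <unistd.h>

static jack_port_t *out_x, *out_y;
static jack_nframes_t srate;
static float phase;

static int process(jack_nframes_t nframes, void *arg)
{
    float *x = jack_port_get_buffer(out_x, nframes);
    float *y = jack_port_get_buffer(out_y, nframes);
    for (jack_nframes_t i = 0; i < nframes; i++) {
        x[i] = 0.7f * cosf(phase);      /* the galvos follow this path */
        y[i] = 0.7f * sinf(phase);
        phase += 2.0f * (float)M_PI * 100.0f / srate;  /* 100 circles/s */
        if (phase > 2.0f * (float)M_PI)
            phase -= 2.0f * (float)M_PI;
    }
    return 0;
}

int main(void)
{
    jack_client_t *client = jack_client_open("circle", JackNullOption, NULL);
    if (!client) {
        fprintf(stderr, "couldn't connect to JACK\n");
        return 1;
    }
    srate = jack_get_sample_rate(client);
    out_x = jack_port_register(client, "out_x", JACK_DEFAULT_AUDIO_TYPE,
                               JackPortIsOutput, 0);
    out_y = jack_port_register(client, "out_y", JACK_DEFAULT_AUDIO_TYPE,
                               JackPortIsOutput, 0);
    jack_set_process_callback(client, process, NULL);
    jack_activate(client);
    for (;;)
        sleep(1);   /* all the work happens in the realtime callback */
}
```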
In a typical setup, you’d have two processes running on top of JACK. On one hand, there’s the output processor, which is responsible for massaging the idealized laser data to suit the peculiarities of the hardware. This includes things like brightness scaling, the obvious X/Y inversion settings, the final output perspective transform (to fit the screen), and minor filters that try to compensate for hardware imperfections. It also generates a 1kHz square wave on one channel; this is a safety feature specific to my laser hardware. A microcontroller monitors this signal, so that if the software hangs or crashes for any reason, the laser shuts down immediately (to avoid leaving a static dot, which would be a serious eye hazard). The OpenLase output processor has a simple Qt GUI that lets you tweak these settings on the fly.
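To make the output processor’s job a bit more concrete, here’s a hedged sketch of two of its conceptual pieces: the homogeneous perspective transform applied per X/Y sample pair, and the 1kHz watchdog heartbeat. The function names and matrix layout are mine for illustration, not OpenLase’s actual code.

```c
/* Illustrative only: apply a 3x3 homogeneous (perspective) matrix to
 * each X/Y sample pair, the way the output processor conceptually
 * fits the image to the screen. OpenLase's real filter code differs. */
static void perspective(const float m[3][3], float *x, float *y,
                        jack_nframes_t nframes)
{
    for (jack_nframes_t i = 0; i < nframes; i++) {
        float px = x[i], py = y[i];
        float w = m[2][0] * px + m[2][1] * py + m[2][2];
        x[i] = (m[0][0] * px + m[0][1] * py + m[0][2]) / w;
        y[i] = (m[1][0] * px + m[1][1] * py + m[1][2]) / w;
    }
}

/* The 1kHz watchdog heartbeat: at 48kHz, one period is 48 samples,
 * so the signal toggles every 24 samples. */
static void heartbeat(float *wd, jack_nframes_t nframes)
{
    static unsigned n;
    for (jack_nframes_t i = 0; i < nframes; i++, n++)
        wd[i] = ((n / 24) & 1) ? 1.0f : -1.0f;
}
```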
On the other hand, you have whatever picture source you want to use. These can be bare JACK applications; two examples are included: ‘circlescope’, a circular oscilloscope that takes realtime audio data from a media player, and ‘playilda’, a bare-bones ILDA file player (ILDA / .ild is the standard file format for laser graphics). The circlescope is particularly good for showing off the real-time aspect (note that the input can come from the laser DAC’s line-in with only the small JACK buffering delay):
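For flavor, here’s roughly what the circlescope’s inner loop amounts to, reusing the client skeleton from the sketch earlier. This is my reconstruction of the idea, not the actual circlescope source; the constants are arbitrary.

```c
/* Rough reconstruction of the circlescope idea: sweep around a circle
 * and modulate the radius with the incoming audio. Assumes the JACK
 * skeleton above, plus in_port registered with JackPortIsInput. */
static int process(jack_nframes_t nframes, void *arg)
{
    float *in = jack_port_get_buffer(in_port, nframes);
    float *x  = jack_port_get_buffer(out_x, nframes);
    float *y  = jack_port_get_buffer(out_y, nframes);
    for (jack_nframes_t i = 0; i < nframes; i++) {
        float r = 0.6f + 0.3f * in[i];           /* waveform -> radius */
        x[i] = r * cosf(phase);
        y[i] = r * sinf(phase);
        phase += 2.0f * (float)M_PI * 30.0f / srate;   /* ~30 rev/s */
        if (phase > 2.0f * (float)M_PI)
            phase -= 2.0f * (float)M_PI;
    }
    return 0;
}
```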
However, the other big part of OpenLase is libol, a realtime rendering library loosely modeled on OpenGL, which lets you produce 2D and 3D graphics on the fly. This is what I used for the LASE demo above. The demo itself isn’t currently open source (and the code is utterly horrid - I wrote half of it at Euskal Encounter and finished it mere minutes before the deadline), but if there’s demand I might open source it too; just please don’t expect pretty code! That said, keep in mind that most of the features used by the demo (text rendering, 3D, “shaders”, ILDA file loading, etc.) were implemented as part of libol, so you aren’t missing out on much. There are two libol-based examples: some rotating cubes, and Pong.
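From memory, a minimal libol program looks something like the following. Treat it as a sketch - the exact signatures may have drifted - and consult the examples shipped with OpenLase for the authoritative API.

```c
/* Sketch of a minimal libol client (from memory): draws a spinning
 * square, rendered as laser vectors and handed off to JACK. */
#include "libol.h"

int main(void)
{
    if (olInit(3, 30000) < 0)       /* 3 frame buffers, 30000 points max */
        return 1;

    float rot = 0.0f;
    for (;;) {
        olLoadIdentity();
        olRotate(rot);
        olBegin(OL_LINESTRIP);
        olVertex(-0.5f, -0.5f, C_WHITE);
        olVertex( 0.5f, -0.5f, C_WHITE);
        olVertex( 0.5f,  0.5f, C_WHITE);
        olVertex(-0.5f,  0.5f, C_WHITE);
        olVertex(-0.5f, -0.5f, C_WHITE);   /* close the square */
        olEnd();
        rot += olRenderFrame(60);   /* cap at 60 FPS; returns frame time */
    }
    /* a real program would call olShutdown() on exit */
}
```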
One part of the demo did make it into the OpenLase distribution as a separate example: tools/trace.c. This started out as the tracing code I used for the metaballs and fire effects (I kind of cheated there, as those are rendered as bitmaps and then traced into laser vectors in realtime). It’s a terribly naive algorithm (check out the source for details), but it worked surprisingly well for certain kinds of video, so I hacked on it and tacked on more heuristics to make it work better. It now lives next to tools/playvid.c, a simple video player built on libavcodec. Here’s what it looks like (original YouTube video). More complex videos are hit and miss, but some things turn out surprisingly well for such a silly algorithm.
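To give a sense of just how naive it is, the detection step boils down to something like this. This is my simplified rendition, not the real trace.c, which chains these points into connected paths and layers extra heuristics on top.

```c
#include <stdint.h>

/* Simplified rendition of the tracing idea: a pixel is a contour point
 * if it's above the brightness threshold and touches at least one
 * darker-than-threshold neighbor. */
static int is_edge(const uint8_t *img, int w, int h, int x, int y,
                   uint8_t thresh)
{
    if (img[y * w + x] < thresh)
        return 0;
    return (x > 0     && img[y * w + (x - 1)] < thresh) ||
           (x < w - 1 && img[y * w + (x + 1)] < thresh) ||
           (y > 0     && img[(y - 1) * w + x] < thresh) ||
           (y < h - 1 && img[(y + 1) * w + x] < thresh);
}
```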
You can also add filters between the image generator and the output. This is what I did for my Kinect + OpenCV + OpenLase demo, which projects anything OpenLase can render onto a moving screen, with a dynamic perspective transform (in this case the perspective transform happens in the filter, not in the output processor). That code doesn’t even build against current libfreenect at the moment, but again, if someone is interested, drop me a line and I’ll make it work again and publish it. OpenLase doesn’t have any facilities of its own to patch these JACK apps together. Instead, you just use existing JACK tools, such as QJackCtl, to connect all the input and output ports. QJackCtl has a patchbay feature that automagically connects the ports when applications start up, so it’s quite seamless.
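For example, wiring a source into the output processor from a shell looks something like this with the stock jack_connect tool (the port names here are hypothetical; jack_lsp lists the real ones):

```
jack_connect circlescope:out_x output:in_x
jack_connect circlescope:out_y output:in_y
```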
Right now there is pretty much no documentation, but I’d like to know if people are interested. If you have (or want to build) this kind of DIY hardware, you run Linux or some other UNIX that can run JACK, and you’re interested in hacking on the code or using it for something, please let me know! Here’s the git repo.
Update: Three more videos, musically themed.