So suppose you’re writing some custom software, as can happen. Suppose that software produces some laser art. Then, at some point, you inevitably want to make that software output to an actual projector. Unfortunately, that requires you to get off your computer, leave the code behind for a while, and delve into hardware. At the least, you’d need a laser projector and some form of controller – a DAC. You’re then left with the task of letting your software talk to the controller, either over the network or through a USB driver. Some make it easy, some don’t. Most controller manufacturers don’t even give you that option and lock their stuff down. It’s no good.
But suppose you got it working. Your program is now sending its carefully crafted laser art to a controller. Coherent light is flowing. The next hurdle is the largest of them all: physics! Laser scanners are typically galvanometers with mirrors on a stick, driven by a feedback-loop-based analog amplifier. They are limited in rotational speed, and if you try to make them go faster, they are liable to burn up and let their magic smoke loose. This magic smoke is an essential component of laser scanners and is very hard to get back into the little buggers once it’s free. For this reason, laser software limits the distance between two consecutive samples (typically by interpolating large jumps, adding more samples in between). Does your software do that already? Better go and add that feature.
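As a concrete illustration, here is a minimal sketch of that kind of jump interpolation. It assumes a frame is simply a list of (x, y, r, g, b) tuples; the function name and the maximum distance are made up for the example:

```python
def interpolate_jumps(points, max_dist=2000):
    """Insert extra points so that no two consecutive samples are
    further apart than max_dist (in projector coordinate units)."""
    if not points:
        return []
    out = [points[0]]
    for prev, cur in zip(points, points[1:]):
        dx, dy = cur[0] - prev[0], cur[1] - prev[1]
        dist = (dx * dx + dy * dy) ** 0.5
        steps = int(dist // max_dist)          # extra points needed
        for i in range(1, steps + 1):
            t = i / (steps + 1)
            # keep the colour of the target point for the fill-ins
            out.append((round(prev[0] + t * dx),
                        round(prev[1] + t * dy), *cur[2:]))
        out.append(cur)
    return out
```

A jump of 10000 units with a 2000-unit limit gets five extra points spread along the way, so the scanner never has to sweep more than the limit between samples.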
Hang on. What’s happened to your beautiful frames? They look all distorted! Corners are too round, colour fades are way off, there are weird tails everywhere… What was supposed to be white now has the appearance of milk that’s been left out in the sun for a couple weeks. Foiled by physics again. Your next job is to fix all of that. You need to implement extra angle dwell points, extra points at blanking transitions, colour fixing curves, colour blanking offsets, and so on. I hope it’s a rainy day…
Let somebody else do the work
Or just forget all that and find yourself a program that does it already. Most laser software has advanced features that tackle all of these problems. All of these programs have one thing in common: they’re not yours, and they don’t produce the artwork you spent so long on. If only there existed a program that could accept your frames – in real time – and do all the heavy lifting for you…
That program is called LaserShow Xpress (LSX). It is compatible with quite a range of controllers (Etherdream, Easylase, Riya, Helios, Laserdock, modified sound card, among others) and has advanced optimisation features built in. For a few years now, it has had a feature that lets you stream a complete frame to it over the network using the Open Sound Control (OSC) protocol. The frame needs to be structured as a byte array (a blob) with a precise format.
Set up LSX to accept a frame over OSC
In the LSX settings panel, expand the Remote Control tree and click OSC Setup. In this window, note the root OSC name (by default LSX_0) and the port (by default 7000). If necessary, start the server. Also note the Help! button, which will essentially explain what I’m trying to say below.
Next, choose a timeline. Place an Animation event somewhere on that timeline. Set that event to a certain frame in the frame catalog, and remember which frame it is (don’t use frame 0).
That should do it.
The OSC packet and byte blob layout
This guide supposes that your program treats laser art as a “frame”: a static collection of “points”. Every point should have a position (x, y and optionally z components) and a colour (in RGB format). This is not dissimilar to how an Ilda file is composed. If your software generates laser art as a continuous “stream”, you’ll need to chop it up into discrete frames.
In your code, create a byte array (or list, or vector, or whatever you millennials call it nowadays). This array will, in most cases, have a length of 12 + (number of points) x 9 bytes. Bytes are unsigned, and multi-byte values are little-endian (careful: standard OSC values are big-endian).
Just like an Ilda file, the array starts with a header, a fixed structure of 12 bytes.
LSX OSC packet header

| Byte(s) | Type | Name | Description |
|---|---|---|---|
| 0 | byte | type | 0: XYRGB, 8 bytes/point<br>1: XYZRGB, 9 bytes/point (I mostly use this one)<br>2: XYZPRRGB, 11 bytes/point |
| 1 | byte | store | 0: incomplete frame, temporarily buffered; frame split into multiple blobs and OSC messages<br>1: complete frame, placed directly into the catalog |
| 4 & 5 | uint16_t | destination frame | frame number in the frame catalog |
| 6 & 7 | uint16_t | frame points | total points in the frame |
| 8 & 9 | uint16_t | blob start point | start point number of this blob (for when a frame needs to be split into multiple blobs); 0 when not split |
| 10 & 11 | uint16_t | blob points | number of points in this blob (equal to frame points when not split) |
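Packed with Python’s `struct` module, the header could be built like this. This is a sketch: the function name and default values are mine, and the two bytes at offsets 2 and 3, which the table doesn’t document, are simply left at zero.

```python
import struct

def lsx_header(frame_type, store, dest_frame, frame_points,
               blob_start=0, blob_points=None):
    """Pack the 12-byte LSX frame header, little-endian."""
    if blob_points is None:
        blob_points = frame_points          # not split: blob == frame
    # '<' = little-endian, 'B' = unsigned byte, 'H' = uint16_t
    return struct.pack('<BBBBHHHH',
                       frame_type,          # 0, 1 or 2
                       store,               # 1 = complete frame
                       0, 0,                # bytes 2 & 3: undocumented, zeroed
                       dest_frame,          # catalog frame number
                       frame_points,        # total points in the frame
                       blob_start,          # first point in this blob
                       blob_points)         # points in this blob
```

For a complete type-1 frame of 100 points destined for catalog frame 5, that would be `lsx_header(1, 1, 5, 100)`.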
There are three types of packets: 2D (type 0), 3D (type 1) and 3D with repeats/palettes (type 2). Typically you’d use type 1 for most applications, but type 2 allows you to send advanced LSX- (and LDS-) specific features like normal vector flags, blanking flags, a 6-bit palette colour, point repetition and point part. LSX should interpret RGB values of 0, 0, 0 as blanked.
The scanner byte refers to the timeline of the animation event. If you have multiple projectors, you could send your frames to multiple timelines at the same time and use this, for example, to make a chase of the same frame – which is not a simple task in the current state of LSX.
There is also a feature where, if your frame is too large to fit in a single blob, you can split it across multiple OSC messages, using the store byte and the blob start point and blob points fields of the header.
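A sketch of how the splitting could work, in terms of the header fields above. How LSX detects the final blob of a split frame isn’t covered here, so this simply marks every partial blob with store 0, as the header table describes; the chunk size is arbitrary:

```python
def split_frame(num_points, max_blob_points=500):
    """Yield (store, blob_start, blob_points) triples, one per blob."""
    if num_points <= max_blob_points:
        yield (1, 0, num_points)        # fits in one blob: complete frame
        return
    for start in range(0, num_points, max_blob_points):
        count = min(max_blob_points, num_points - start)
        yield (0, start, count)         # incomplete frame, buffered by LSX
```

Each triple goes straight into the corresponding header fields of one OSC message, with the matching slice of points behind it.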
The following structure follows directly behind the header in the byte array and is repeated for every point in the frame.
LSX OSC point structure

| Byte(s) | Type | Name | Description |
|---|---|---|---|
| 0 & 1 | int16_t | x | Point x coordinate between -32768 and 32767 |
| 2 & 3 | int16_t | y | Point y coordinate |
| 4 & 5 | int16_t | z | Point z coordinate |
| 6 | byte | palette - normal - blanked | bits 1-6: palette colour index<br>7th bit: normal vector<br>last bit: blanked |
| 7 | byte | part - repetition | first 4 bits: repetition number<br>last 4 bits: part number |
| 8 | byte | red | red colour channel, 0-255 |
| 9 | byte | green | green colour channel, 0-255 |
| 10 | byte | blue | blue colour channel, 0-255 |
When using type 0 (XYRGB), omit bytes 4, 5, 6 and 7; when using type 1 (XYZRGB), omit bytes 6 and 7. The RGB bytes shift down accordingly.
Even when using format 2 and a palette, it’s a good idea to specify RGB values. Some colour events in LSX only work with RGB values (the reverse is also true).
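As a sketch of the point layout, here is how types 1 and 2 could be packed with Python’s `struct` module. Type 0’s 8-byte layout is left out, since its exact composition isn’t spelled out above; the function name and argument order are mine:

```python
import struct

def lsx_point(ptype, x, y, z, r, g, b, flags=0, part_rep=0):
    """Pack one point, little-endian, per the point structure table."""
    if ptype == 1:                      # XYZRGB: omit bytes 6 and 7
        return struct.pack('<hhhBBB', x, y, z, r, g, b)
    if ptype == 2:                      # XYZPRRGB: full 11-byte layout
        return struct.pack('<hhhBBBBB', x, y, z, flags, part_rep, r, g, b)
    raise ValueError('unsupported point type')
```

Concatenating the header and one such chunk per point gives you the complete blob.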
Now that the byte array is formatted, create a new OSC message using your favourite OSC library or package or plugin. The address pattern should follow this structure: “/LSX_0/Frame” (unless you changed the root name in the OSC setup). Attach the byte blob to the message body and send it to the appropriate IP/port combination.
There are some requirements on the OSC message body (the blob must be preceded by its byte count and padded to a multiple of 4 bytes), but the library should take care of that automatically.
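For the curious, this is roughly what a well-formed OSC message with a single blob argument looks like on the wire – a sketch of what the library does for you, not a substitute for one:

```python
import struct

def osc_pad(data):
    """Pad to a multiple of 4 bytes, as OSC requires."""
    return data + b'\x00' * (-len(data) % 4)

def osc_blob_message(address, blob):
    """Build a raw OSC message carrying one blob argument."""
    # OSC strings are NUL-terminated and padded to 4 bytes; the blob
    # is prefixed with its byte count (big-endian, per the OSC spec)
    # and padded likewise. Only the blob *contents* (the LSX frame
    # itself) are little-endian.
    msg = osc_pad(address.encode('ascii') + b'\x00')
    msg += osc_pad(b',b\x00')                     # type tag: one blob
    msg += struct.pack('>i', len(blob)) + osc_pad(blob)
    return msg
```

You could then push the result to LSX with a plain UDP socket, e.g. `socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(osc_blob_message('/LSX_0/Frame', frame_bytes), (host, 7000))`.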
My Ilda library for Processing has some examples that demonstrate this procedure. This file is a direct implementation of the above format, although without the splitting into multiple messages when the frame is too large.