Drawing a circle with lasers, as seen here, was an interesting project that really stretched my trigonometry skills (read on for more details). What would be even better, though, would be to take an image (GIF, JPG, or otherwise) and convert it into light graffiti automatically. This is my latest experiment with servos, lasers, and the Python programming language using a pyMCU.* Check out the results in this video:
Note that the example images are not from the laser moving around in the video, since the same camera is used for both the video and the long-exposure light graffiti shots. The physical setup for this fixture is explained in the “servo circle” light graffiti post, and if you want to know how light graffiti works in general, here’s my intro. Basically, you open the camera’s shutter, record all of the light coming in, and merge it into a single image as if it all happened at one time. Because of this, a servo putting little light dots on a screen can appear as one coherent image, as if it were all there at once.
Here is the Python code that I used to program my pyMCU* for this project. The math is a bit less complicated than the circle code, since I just incremented the “mb.PulseOut”* command for the pyMCU by one for each pixel. The code is based on my previous pixel-machining experiment and works in a very similar manner.
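To give a flavor of the pixel-to-pulse idea, here’s a minimal sketch (not my actual project code): it walks a thresholded image row by row and builds a list of servo pulse positions, bumping the pulse value by one per pixel. The base pulse widths and the `image_to_pulses` helper are hypothetical names for illustration.

```python
# Hypothetical base pulse widths for the X and Y servos
X_BASE, Y_BASE = 100, 100

def image_to_pulses(image):
    """Return (x_pulse, y_pulse) pairs for every lit pixel in a 2D
    list of 0/1 values, scanning row by row. One pixel = one pulse step."""
    pulses = []
    for row_idx, row in enumerate(image):
        for col_idx, pixel in enumerate(row):
            if pixel:  # laser should be on for this pixel
                pulses.append((X_BASE + col_idx, Y_BASE + row_idx))
    return pulses

# On the real hardware, each pair would then be sent to the servos with
# something like mb.PulseOut(x_pin, x_pulse) and mb.PulseOut(y_pin, y_pulse)
# (the beta multi-servo command; see the note at the end of this post).
image = [[1, 0],
         [0, 1]]
print(image_to_pulses(image))  # [(100, 100), (101, 101)]
```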
Moving the servos, and laser timing issues:
One difference with this program is that the pixel-machining routine generated G-code for a CNC router, so the machine itself took care of getting the router to where it was supposed to go. Here, in contrast, I had to command the lasers to each position myself and figure out how long each move would take. As seen in two of the photos (three if you count the first, where the initial movement was lit up), if this isn’t done correctly the light will “run” and look weird. Originally I used a fixed delay for each movement, but eventually came up with a more “elegant” solution.
As the code loops through the points to move between, the first point is represented as one coordinate and the second as another. The distance between the two is calculated in the Y direction, representing a, and in the X direction, representing b. Since in a right triangle a^2 + b^2 = c^2, c represents the total distance the servos need to travel. So c is equal to sqrt(a^2 + b^2), and the delay can be scaled from it by a set variable.
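That Pythagorean delay can be sketched in a few lines. This is a simplified stand-in for my routine; the `SCALE` constant and `move_delay` name are hypothetical.

```python
import math

SCALE = 0.01  # hypothetical: seconds of delay per unit of pulse distance

def move_delay(p1, p2):
    """Delay for a move between two (x, y) pulse positions, scaled by
    the straight-line (Pythagorean) distance between them."""
    b = p2[0] - p1[0]  # X distance
    a = p2[1] - p1[1]  # Y distance
    c = math.sqrt(a**2 + b**2)
    return SCALE * c

print(move_delay((100, 100), (103, 104)))  # ~0.05 for a 3-4-5 move
```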
At least, that’s how this program is written. As I write this, though, I realize that each servo moves independently, so what actually dictates how long a movement takes isn’t “c” at all but the greater of “a” and “b”, since the two axes move simultaneously. The Pythagorean routine works pretty well, but it turns out to be unnecessary in this particular case. Regardless, these are still very early tests, and I definitely plan to refine this code and technique further.
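If I do refine it, the fix is simple: since both axes travel at once, the move takes as long as the longer single-axis distance. A hedged sketch of that rule (same hypothetical names as the previous snippet):

```python
SCALE = 0.01  # hypothetical: seconds of delay per unit of pulse distance

def move_delay(p1, p2):
    """Delay for a move between two (x, y) pulse positions. Because the
    X and Y servos move simultaneously, the move takes as long as the
    longer single-axis distance, not the diagonal."""
    b = abs(p2[0] - p1[0])  # X distance
    a = abs(p2[1] - p1[1])  # Y distance
    return SCALE * max(a, b)

print(move_delay((100, 100), (103, 104)))  # 0.04: the Y axis's 4 steps dominate
```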
I’m quite proud of how this device turned out. It’s a combination of things I’ve learned from making the stuff I talk about here. If you find this interesting, be sure to subscribe by any of the methods in the upper-right corner of this webpage. Note that this isn’t the first time I’ve tried automatic light graffiti: check out this experiment moving LEDs around with my CNC router, or this one where I use a laser instead.
*Please note that the pyMCU used in this post is a beta version that can control multiple servos simultaneously. Some commands (like mb.PulseOut) may not work in the standard pyMCU currently available here.