Blackstripes on Github

We decided to open-source the Blackstripes project on the popular GitHub platform:

There you will find all the info and source code you need to build and run these drawing bots yourself.

Clone this url: onto your desktop (or anywhere else) and you are ready. You need Python version 2.6/2.7 with a working Python Imaging Library installed, and additionally NumPy for the Mk2 software, to get rolling.

Scripts to run from the command line:
Mk1 :

bash$ python testies_210_180_120_50.png

This command should generate a folder called generated_data in ‘machine_motion’ and a subdir called ‘filename – level info‘. This folder contains previews of the drawing paths of the machine. The numbers in the filename represent ‘quantization thresholds’ for the grayscale values of the input image. The script also automagically tries to upload the machine data to the Raspberry Pi that is supposed to drive the machine.
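As an illustration of what those thresholds do, here is a minimal sketch (our own, not the actual Blackstripes code; the function name is made up): each grayscale value is mapped to a level by comparing it against the descending thresholds from the filename.

```python
def quantize(gray_value, thresholds):
    """Map a grayscale value (0-255) to a level index using the
    descending thresholds from the filename, e.g. (210, 180, 120, 50).
    Values at or above the first threshold land in level 0, and so on;
    anything below the last threshold gets the darkest level."""
    for level, threshold in enumerate(thresholds):
        if gray_value >= threshold:
            return level
    return len(thresholds)

print(quantize(200, (210, 180, 120, 50)))  # → 1 (between 210 and 180)
```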

Mk2 :

cd into the ‘image_input’ folder first and create a dir named input. Put a preferably square jpg or png file of around 1000 x 1000 px inside it. Run the following command:
bash$ python

This will convert any png or jpg inside the input folder and write the output to a new output folder. The output consists of a custom-made binary image file (.bsi, for the machine/simulator) and a preview image. You can encode the levels in the filename of the image by creating a name in this fashion: 180_150_103_73_43_13_.png
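The levels in such a filename could be parsed roughly like this (a sketch of the idea; the helper name is ours, not the project's):

```python
import os

def levels_from_filename(path):
    """Parse the threshold levels encoded in a filename such as
    '180_150_103_73_43_13_.png' into a list of integers."""
    name = os.path.splitext(os.path.basename(path))[0]
    return [int(part) for part in name.split("_") if part.isdigit()]

print(levels_from_filename("input/180_150_103_73_43_13_.png"))
# → [180, 150, 103, 73, 43, 13]
```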
The file creates a really nice high-res preview of the machine output; the final line in the module should read something like this:
where the first argument is the motion data to use (this motion-path data generation code is also included) and the second argument is the Blackstripes image to simulate (we created this in the previous Python command). The third argument is the total number of instructions; this is used to give some feedback on the simulation progress in the terminal while processing. To run the simulator, run this command in blackstripesMK2/machine_motion:

bash$ python
If that worked you should see the progress start to increase.
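That progress feedback can be pictured with a helper like this (our own hypothetical sketch, not the simulator's actual code):

```python
def progress_line(done, total, width=40):
    """Render a simple textual progress bar, the kind the simulator
    could print after every batch of processed instructions."""
    filled = width * done // total
    bar = "#" * filled + "-" * (width - filled)
    return "[%s] %3d%%" % (bar, 100 * done // total)

print(progress_line(2500, 10000, width=20))
```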

If this helped you in your own robot project, let us know; we love to hear about it!

Regards, Blackstripes!

Multicolor head II and demo tooling

We made a new printhead iteration, now with a fan mount built onto it and better support to keep the elastic rubber bands in place. We also swapped out the messy triple-double-wire solution for an all-in-one cable. This cable also routes the power for the fan and the three solenoids. Looks much cleaner too. In the picture you can see this upgraded head without the felt-tip markers installed.

fig 1. Printhead with fan mount and new cable.

That is not the only improvement we made to the machine. We had some requests to be able to demo this machine live at trade fairs, etc. We decided it would be cool to be able to shoot a portrait with your phone and, without any further hassle, have the drawing bot start printing that image. So we made an iPhone app that does exactly that. You take a picture (live or from the camera roll) and you adjust brightness and contrast, with a preview of the drawing that will be made. Push upload and you send it over wifi to the Raspberry Pi inside the machine (just another reason why the Raspberry Pi is in fact so much cooler than an Arduino for this kind of stuff). As soon as the machine has successfully received the image from the phone, it will more or less stop the Python server and start the process of drawing. As soon as it finishes drawing, it restarts the Python TCP server and is ready to receive the next drawing. (We stop the server so the drawing process won’t be interrupted; interrupts disturb the steppers from running nice and smoothly.)
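The stop/receive/draw/restart cycle could look roughly like this in Python (our own sketch; the port number, the raw-bytes protocol and the `draw` callback are assumptions, not the machine's actual code):

```python
import socket

def serve_one_job(draw, host="0.0.0.0", port=9999):
    """Accept one image upload, close the listening socket, then draw.

    Closing the server before drawing mirrors the machine's behaviour:
    while the steppers run, nothing is listening, so network traffic
    cannot interrupt the motion. `draw` is a stand-in for the real
    driver."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((host, port))
    server.listen(1)
    conn, _addr = server.accept()
    chunks = []
    while True:
        chunk = conn.recv(4096)
        if not chunk:                 # client closed: upload complete
            break
        chunks.append(chunk)
    conn.close()
    server.close()                    # stop the server before drawing
    draw(b"".join(chunks))            # blocking: steppers run undisturbed

# The machine would loop this forever, restarting the listener
# after every finished drawing:
#   while True:
#       serve_one_job(draw_image)
```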

fig 2. iPhone app for on the fly print job submission. (Lara was not really here ;-( )

Before we were able to do all this we made some serious changes to the printer driver code. In our previous workflow it took quite some time to regenerate all picture-dependent machine instructions on our iMacs, let alone the RPi, so we changed that process.
Now we split the generation into the picture-dependent data (pen up and down states) and the stepper data that creates the motion path of the head. The stepper data is more or less static and resides on the machine. If you want a different pattern to be drawn you can select that with an extra argument when starting the driver (programmed in C). We designed a few drawing paths but decided we like the circular one best.
The picture-dependent data is basically the bitmap, with a custom data header, sent from the iPhone, and the machine does realtime lookups into that bitmap data to see which pen should go up or down while it runs down the programmed path. So literally seconds after uploading the image it starts plotting ;-)
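The realtime lookup idea can be sketched like this (our own simplified illustration; the real driver is in C and the actual bitmap format differs):

```python
def pen_states(path_points, bitmap, width, threshold=128):
    """Walk a precomputed motion path and decide, per point, whether
    the pen should be down (dark pixel) or up (light pixel).
    `bitmap` is a flat, row-major list of grayscale values 0-255."""
    states = []
    for x, y in path_points:
        pixel = bitmap[y * width + x]
        states.append("down" if pixel < threshold else "up")
    return states

# 2 x 2 bitmap with a single dark (top-left) pixel:
bitmap = [0, 255,
          255, 255]
print(pen_states([(0, 0), (1, 0), (1, 1)], bitmap, width=2))
# → ['down', 'up', 'up']
```

Because the path is static, only this lookup depends on the uploaded picture, which is why plotting can start seconds after the upload.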

And what does it all look like in stop-motion? You can watch all that in the YouTube video below; in this video the image was prepared on the iPhone and sent over wifi to the Blackstripes drawing machine. By the way, in realtime a drawing takes a little bit longer to make than in this movie!


And last but not least, we created an HTML/JavaScript-based editor for our new machine; from now on everyone with a Dutch-based mail address is able to order this new drawing from Blackstripes. We are still working on the editor to create an as-responsive-as-possible web-based editor solution, and we are not sure yet if canvas objects are the way to go for this. On phones the editor is basically too big/sluggish right now, and we are looking into some server-side solutions to overcome these issues. Here is a preview of what it looks like on an iPad. It’s basically a 2-step process: first make the optimal circle-shaped crop; in the second step, choose the appropriate process levels for your image (this takes some serious processing that we want to offload to our server, not your mobile device).

fig 3. New html editor at work.


The multi color head

Since we have a laser cutter at our lab we have much more freedom to design and implement complex, elegant mechanical constructions. The head as it used to be on our Mk2 robot is no longer what it was: the way we used to lift the entire arm to lift the pen introduced loads of jitter in the arm, and the pen would bounce off the paper again once it landed; it was one giant spring system. So we abandoned this simple solution and are back to a system where a solenoid lifts the pen off the paper (as in our v-plotter). The downside is that you need something that rolls over the paper as the pen lifts up (to keep it up), and the pen needs more calibration so it is just not touching the paper when it shouldn’t. We quickly designed a contraption on the computer, and in a matter of minutes you are gluing together the MDF pieces you cut.

Since this worked much better and was so nice and easy to do, my companion decided he could do even better and made a head that contains and controls 3 pens. The pens are driven by solenoids through linear ball bearings, and the head design allows the pens to be loaded and unloaded quite easily; this is important because they need a refill with fresh ink after each print. From the image it may appear as if we used two black markers, but this is not the case: we diluted one marker to give us some kind of gray tone.

And here are some pictures of it.
Printhead from underneath
Printhead from above with 2 of 3 markers loaded
Results… Printed straight on 9mm mdf.
blackstripes Mk 2 multicolor
And 9mm plywood
multicolor drawing on plywood

Ballpoint pen testdrive

blackstripes MkII ballpen drawing

In the new Blackstripes Bot we have a really basic printhead design; this allows us to test output with all different shapes and sizes of pens/markers/etc. Today we decided to try something different, away from the Edding marker path. We took an ordinary ballpoint pen, increased the number of circles to over 500 because of the smaller line size (an estimated 0.5 mm line width), and this print came out (see the picture of Marilyn). We are pretty happy with it, although the image suffers from numerous artifacts. Halfway through the printing process the paper came half loose from the bot, causing lines to change in density (and leaving a gap where we rearranged the paper). And we have a bump in the wood backing panel; this bump creates a real lift-off of the pen, not too good. Apart from this, the increased size of the print-instruction data file came in at way over 256 MB. This is over the RAM limit of the Raspberry Pi (we are currently still running the old version B), so we had to change the code to swap the chopped datasets in and out, but that worked quite well. In the final ballpoint pen test we will probably draw well over 1000 circles, to get a really sharp-looking image. We hope we can stay under one hour of printing time for this circular print with a diameter of 1 meter (approx. 3 feet).
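The in-and-out swapping boils down to streaming the file in chunks instead of loading it whole. A minimal sketch of the idea (the actual chopping logic and chunk size in the driver may differ):

```python
def read_in_chunks(path, chunk_size=64 * 1024 * 1024):
    """Yield the print-instruction file in fixed-size chunks so the
    whole (>256 MB) dataset never has to sit in the Pi's RAM at once."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk
```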

As promised

Today was a good Blackstripes day. We modified the machine (MkII) a little bit by adding a solenoid which enables the machine to lift its printing head (the Edding 550 felt-tip pen). We also printed the first photographic image (more or less photographic). It’s a portrait in the form of one big spiral, going slowly from the inside out. The title of this post refers to the promise in the previous post, which stated that I would put some footage online once the printer was up and running, so here it is. For all the penplotter fans out there ;-)

Note! In the final shots you see some photos of the building process (Ken Burns style). At first we mounted the arms on the steppers directly (we specifically purchased some torque-potent motors to be able to do this). This led to really shaky print lines, so we went back to the drawing board and added gearboxes. In the final photo you can see how straight the lines were after that.

Any news?

finished drawing on the machine

So is there any news? Yes, we have had a lot of exposure since our last blog post and have worked some more on the software editor/webshop part of the site. First press coverage was with a popular Dutch magazine (digital technology, design & style): David Lemereis wrote a nice article about us. You can read all about that here (in Dutch). Lately we were picked up by a major tech site. This gave us quite some exposure in the Raspberry Pi scene (and beyond). Very nice…

So what’s next?

Currently we are in the process of developing a new machine. It’s quite a different one actually: not a V-plotter, but still a drawing machine. First attempts to run the machine were partly successful but also a little bit of a disappointment, and we had to re-design the mechanical part of the machine. At this moment I can only show a software demonstration of the machine drawing its signature. As soon as our machine starts to pick up some speed in the real world I will add some more videos.

From concept to product

Marilyn Monroe by Blackstripes
Fullscreen logo drawn by Blackstripes
BMW 3.0 CSL by Blackstripes

After we started to ship our first products, we got a lot of reactions that went like: “really nice, but can’t your bot draw a little bit smaller?” So we asked ourselves the same question, started to tweak the code a little, and it turns out that with a little tinkering we can pretty much draw in any size, as long as our machine physically supports it (max size would be approx. 2 x 2 m).

So right now we are contemplating this size thing. We think we will ship 2 different sizes in the very near future. We print the same number of lines in this smaller print, but we condense the space between the stripes. This looks best with a slightly smaller tip on the marker, so we no longer need to rough up our marker tips to make them draw wider when they come fresh out of the box. At the top you can see three of our tests printed in that smaller size, size being approx. 1 by 1 meter (3.25 by 3.25 feet). Not bad, eh? The not-so-good part is that the time it takes to print a much smaller print is not reduced by a considerable amount. That is not what you would expect, since the surface has been reduced to 2/3 x 2/3 = 4/9 of the original (though drawing time presumably tracks total line length, and with the same number of lines that only shrinks linearly, by 2/3).

Next to this size aspect there is also the finished product. We learned that people are having trouble hanging the drawing: they want to frame it or do something similar, without having to go through the hassle of purchasing some custom-sized frame. Right now we are looking into this; perhaps we might sell a framed version of the drawing as a different product. In total that would give you four different products (right now), without counting the endless possibilities with your own images of course ;-)

Old pony, new trick!

Today we taught our pony (Blackstripes) a new trick: it now automagically signs all its drawings with its own handwritten name. You can see this happen here:

The signing takes place near the end of the movie. It is nice to watch Blackstripes doing something different than just drawing strokes. So how did we get this signature converted into stepper instructions?

Simple: we tracked the drawing of the signature using a Flash movie which recorded the x,y positions while writing on a Wacom board, while also putting down a fat, felt-pen-like trace in the Flash player. After that we converted this array of recorded x,y positions into timing-belt lengths (offsetting x,y to the right position and scale on our canvas), then we fed these timing-belt lengths into our “convert belt lengths to delta steps” script, and that created the code that eventually drives the steppers, yay!
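The two conversions above can be sketched as follows (our own illustration; the anchor positions and steps-per-unit value are made up, not the machine's real calibration):

```python
import math

def belt_lengths(x, y, left_anchor=(0.0, 0.0), right_anchor=(2000.0, 0.0)):
    """V-plotter inverse kinematics: each belt length is simply the
    distance from its top anchor point to the head position (x, y)."""
    lx, ly = left_anchor
    rx, ry = right_anchor
    return (math.hypot(x - lx, y - ly), math.hypot(x - rx, y - ry))

def delta_steps(lengths, steps_per_unit=10.0):
    """Turn a sequence of belt lengths into integer per-move step
    deltas for a stepper (a stand-in for the real conversion script)."""
    steps = [int(round(l * steps_per_unit)) for l in lengths]
    return [b - a for a, b in zip(steps, steps[1:])]

left, right = belt_lengths(600.0, 800.0)
print(left)  # → 1000.0 (a 600-800-1000 right triangle to the left anchor)
```

Running `belt_lengths` over every recorded x,y point and then `delta_steps` over each belt's length sequence gives the per-move step counts that drive the two steppers.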

Interrupt problems solved?!

Erratic motor movement while printing, causing timing belt to bump up

While earlier I was really optimistic about having all interrupt problems solved by not nicing the process at the highest priority, I found out a day later that our celebrations were premature. We had a concrete printing job to be done, and after failing three times or so in the middle of the printing run, we gave up on it altogether. Interrupts were coming all the time, fully locking up the motors and causing the timing belts to derail. So we decided to order an Ethernet shield for the Arduino to give that a shot; after all, we are not married to the Raspberry Pi. Whatever thing can drive our printer is fine with us. We also spent the rest of the day looking for higher-performance boards with general-purpose IO facilities, and we were looking into realtime OSes for the Raspberry Pi. We found some and decided to try a downloadable Pi SD card image of the Xenomai Linux framework from th(e/i)s(e) guy(s). But since it was Friday afternoon we didn’t actually test anything and decided to think it over in the weekend.

That next Monday, Antratek had already delivered our Arduino shield (ordered Friday late afternoon! Not the first time; they are surprisingly fast in processing orders, thumbs up!!) and we recoded our little machine-instruction processor from C to “Arduino C dialect”. But once we introduced reading from the micro-SD card into the loop, the performance of the Arduino board went down the drain. The steppers were making a not-so-nice noise and performance was less than slow. So that was not our solution.

Then we went back to the Xenomai framework. We got this image running pretty quickly, installed the necessary tools and tried to run our previous program. It ran for some time before it kept bailing out at more or less the same point (perhaps because of our busy-wait loops interfering with the realtime-ness of the system). We could just hear the steppers spinning up, and then it stopped. So we turned to a code example that comes with the framework, did some tinkering to drive the correct GPIO pin, and soon we heard uninterrupted steppers running (like good music to our ears ;-) ). So that was certainly promising, but still without reading instructions from the SD card. As soon as we added that to the program, it appeared as if not much had changed from the standard Debian distro: the SD card reading was definitely unpredictable as to how much interruption it would create on the motors (of course this makes perfect sense). To kill this little issue we decided to read large amounts of machine instructions into memory and loop through these arrays. So at this moment we have produced an entire day without any issues, and all the prints came out fine. Perhaps we didn’t even need Xenomai to tackle this problem and we just had to fix our data-reading logic.
Perhaps it also explains why the problem was getting worse (we did not experience it in the first few weeks): the SD card is getting older, and perhaps our print data is written more scattered over time, causing more erratic read times. We don’t know for sure, but reading chunks of approx. 70 MB at once seems to kill our engine-locking problem. Anyway, for now we’ll stick with Xenomai, because the stepper motors never sounded this good before and we feel we could increase printing speed even further.
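The read-big-chunks-then-loop pattern looks roughly like this (our own sketch in Python; the real processor is C, and treating each byte as one instruction is a simplification for illustration):

```python
def run_instructions(path, step, chunk_size=70 * 1024 * 1024):
    """Read the machine instructions in big (approx. 70 MB) chunks
    into RAM and loop through them, so no SD-card I/O happens between
    steps and read latency cannot stall the steppers mid-move."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)   # the only slow SD-card access
            if not chunk:
                break
            for instruction in chunk:    # pure in-memory loop
                step(instruction)        # Python 3: each byte is an int
```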


The lineprinter development

This post is about inspiration. Almost everything you ever see has already been done in some shape or form; most “new” inventions are mere refinements/alterations of earlier designs. The same goes for this printer. The concept is called a V-plotter. Most of these plotters use some kind of vector-based input, and the media used varies from walls to paper, whiteboards, etc. Most of the plotters use 2 flexible lines connected to two stepper motors, and often these motors are driven using the really popular Arduino. We have settled (for now) on working with the not-so-realtime Raspberry Pi. The two lines (timing belts in our case) come together at the printer head. A lot of these printer heads consist of some weighty, balanced construction holding a spray can, marker or pen of some sort. The constructions often have an on/off facility; in our case that’s a solenoid pulling up the marker and some rubber string pulling it back to the paper. Online we found loads of varieties: servos, steppers and also solenoids/electromagnets. It’s all more or less the same. We used a solenoid to give us some instant on/off movement; there is almost no transition time involved, and in I/O terms it only goes from 0 to 1 once, or vice versa. To give some concrete links, I invite you to check out the following, as we find them quite interesting:

Der Kritzler

As you can clearly see, every machine has quite its own take on the concept. The output of the machines also shows quite a wide variety. If you feel you want more of this stuff, please google for V-plotter; that should get you started.