3D Printing, Computer Vision and Dragons Converge
When you're a 3D printing hobbyist like me, every morning is like Christmas: you set a print going before hitting the pillow, then head downstairs the next morning to discover what amazing things have appeared. At the moment, in my house it's usually articulated dragons that come right off the print bed ready to move around.
I currently have a printer farm with four machines running around the clock, working their mechanical hearts out for me. So Christmas comes nearly every day.
But there's a dark side – sometimes I get coal in my stocking. Ok, not actual coal, but a tangled mess of filament "spaghetti," which is just as disappointing.
While 3D printing can feel like technology straight out of Star Trek, sometimes things don't quite work out as planned. In fact, several things commonly go wrong with a print run. The most frustrating is when the first layer of filament does not properly adhere to the print bed. When that happens, the print object can shift from its expected position – meanwhile, the printer happily keeps adding layer upon layer of plastic onto something that isn't there.
The end result of that scenario is an overnight print that resembles a plate of colored plastic pasta – a failure known as "spaghettification" – and it's brought me to tears a few times.
Computer Vision and the IIoT
While this problem was a regular occurrence with early-generation 3D printers, it is far rarer today thanks to the evolution of these machines. Newer models incorporate computer vision and AI techniques common to the Internet of Things (IoT) and Industrial Internet of Things (IIoT). By combining edge computing, real-time monitoring, and robust internet connectivity, new 3D printers take much of the headache out of production.
That’s important well beyond the world of 3D printing hobbyists. These techniques are revolutionizing manufacturing and streamlining the production process.
Next-Gen 3D Printing
The latest printer I've added to my farm comes from this new generation of machines. The device is internet-ready and equipped with a companion mobile application (other models ship with remote desktop software) that I can use to control it. This connected app lets me conveniently check the print progress of my beloved dragons and receive status information, so I know early on whether there's a problem.
The most impressive advancement in this printer over earlier models is the use of machine vision to detect problems with the print. The machine vision compares what the first layer of the print should look like to what is actually on the print bed. If the reality drifts too far from the expectation, the printer sends me an alert via the companion mobile app and pauses the print until I can take a look at the situation.
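The printer's actual vision algorithm isn't public, but the comparison-and-pause idea can be sketched in a few lines of Python. Everything here is an assumption for illustration – the function names, the pixel-intensity threshold, and the deviation limit – and a real printer would also need camera calibration and lighting compensation:

```python
import numpy as np

# Hypothetical sketch: compare a rendering of what the first layer
# *should* look like against a camera frame of the print bed. Both are
# grayscale arrays of the same shape, with values 0-255.

def first_layer_deviation(expected: np.ndarray, actual: np.ndarray) -> float:
    """Return the fraction of pixels that differ noticeably."""
    diff = np.abs(expected.astype(int) - actual.astype(int))
    return float(np.mean(diff > 40))  # 40 = assumed intensity threshold

def check_first_layer(expected: np.ndarray, actual: np.ndarray,
                      max_deviation: float = 0.05) -> str:
    """Pause and alert if reality drifts too far from expectation."""
    if first_layer_deviation(expected, actual) > max_deviation:
        return "pause_and_alert"  # printer pauses, app notifies the operator
    return "continue"
```

The key design point is the tolerance: small differences (lighting, slight color variation) are expected, so the printer only reacts when the deviation crosses a threshold.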
While the 3D printer may seem like a simple use case for computer vision, it actually has a lot of relevance to more complex use cases like autonomous vehicles and sensitive diagnostic equipment. At ICS, we've worked on a number of customer projects utilizing vision techniques in a wide range of devices, among them machines that make circuit boards, medical devices that perform cell counts in biological samples, and even a self-driving car.
What these projects have in common is the pairing of AI with human decision making. My 3D printer's machine vision application has been trained specifically to detect the spaghettification condition mentioned above, and to halt the print job until I can examine the issue. This kind of "human in the loop" behavior – where the AI handles real-time monitoring of the situation but a person makes the final decisions on what actions to take – is essential, and it's at the heart of a lot of our IoT and IIoT work.
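That division of labor can be sketched roughly as follows. The names and the resume/abort choices are assumptions for illustration, not the printer's actual API – the point is only that the AI's authority stops at "pause," while the human makes the final call:

```python
from enum import Enum
from typing import Callable

class MonitorResult(Enum):
    CONTINUE = "continue"
    PAUSE = "pause"

def ai_monitor(deviation: float, threshold: float = 0.05) -> MonitorResult:
    """AI side of the loop: watches in real time, but its only power
    is to pause the job and raise an alert."""
    return MonitorResult.PAUSE if deviation > threshold else MonitorResult.CONTINUE

def operator_decision(prompt: Callable[[str], str]) -> str:
    """Human side of the loop: makes the final call on a paused print.
    `prompt` stands in for the mobile-app notification round trip."""
    choice = prompt("First layer drifted from expectation. resume/abort?")
    return choice if choice in ("resume", "abort") else "abort"  # fail safe
```

Note the fail-safe default: if the human response is unclear, the sketch aborts rather than resuming, since wasted filament is cheaper than a damaged machine.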
But that may change as industrial devices grow more sophisticated and IIoT product development matures. If you're creating a new device or industrial control that could (or should) incorporate AI techniques, we can provide guidance to help you keep pace with evolving requirements and best practices. If you're interested, get in touch.