The Project

      The primary objective of the EZ Laser was to create a child-friendly laser cutter capable of cutting paper and engraving wood. The most popular software for creating laser-cutting images, Inkscape, is difficult for even trained engineers to use, let alone children. To lower this barrier to entry, we developed a system that removes the need for any software know-how whatsoever.

      Our system uses a Raspberry Pi 2, two Arduino Uno microcontrollers, and various electromechanical components to create the most user-friendly laser-cutting experience yet. All a user has to do is plug the system in, wait for the “lightshow” to end in a constant blue, press the big red button, and begin drawing on the clear acrylic screen with an Expo marker (fitted with a customized ball tip). A camera then locates the centroid of the customized tip, records its position, converts the position into machine language (G-code), and commands the motors to move the laser to the specified XY coordinate. The laser will not turn on unless a centroid is detected; in other words, the laser shuts off if the specialized casing is lifted from the acrylic screen, allowing easy “jumps” in the laser-cut pattern.
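The interlock described above amounts to a simple per-frame gate: the laser is enabled only while the tracker reports a valid centroid. A minimal sketch of that logic, assuming GRBL-style M3/M5 laser on/off codes (our controller's exact commands may differ, and the function name is illustrative):

```python
def laser_command(centroid, was_on):
    """Decide the laser state for one camera frame.

    centroid is an (x, y) tuple when the marker tip is detected,
    or None when the casing is lifted off the acrylic screen.
    Returns (gcode, now_on): gcode is a command string to send,
    or None when no state change is needed.
    """
    detected = centroid is not None
    if detected and not was_on:
        return "M3", True       # tip seen: enable the laser
    if not detected and was_on:
        return "M5", False      # casing lifted: shut the laser off
    return None, was_on         # no change this frame
```

Because lifting the marker always produces an M5, the user can make safe “jumps” in the pattern simply by picking the casing up off the screen.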

    The laser will only make a complete cut through paper if the user is very conscientious and draws very slowly; otherwise the device simply etches (burns) the drawn design onto the paper. This, however, is not much of a drawback, as the experience of safely watching (via webcam) a laser cutter respond in real time to your own hand drawing is quite thrilling, even for graduate engineering students!

Motivation

STEM education has been articulated as a clear national priority, both to prepare the next generation and to drive innovation in our country (http://www.ed.gov/stem). To foster such interest in young people, many school-age groups are brought on tours of the mechanical engineering makerspace, the Invention Studio. Unfortunately, the tools in the Invention Studio require technical skills beyond most children’s capabilities. So while the children enjoy seeing what the machines are capable of, a longer-lasting impact could be made if they were actually able to use such machines themselves.

Image Capture with OpenCV

      Successfully tracking the centroid of the custom Expo marker case is a critical part of our project. To achieve this, we used the popular OpenCV (Open Source Computer Vision) Python library. This library is very powerful and offers many extremely useful functions to even a novice Python user. However, it is not immediately obvious how to integrate the Raspberry Pi camera module (Pi Cam) with it. Luckily, there is a great website, pyimagesearch.com by Adrian Rosebrock, detailing both how to use OpenCV and how to track a centroid. We adapted his ball-tracking code to find the relative XY coordinate of the ball in our system. One of the more challenging parts of using the camera was actually installing the library and getting all of its dependencies set up correctly. The install takes about nine hours, and we had many issues doing it correctly, including basic but disastrous ones like someone knocking the power supply loose, so it took a few nights to successfully give our Raspberry Pi “vision”.
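The heart of the tracking step is an image-moment calculation: threshold each frame to isolate the colored ball tip, then take the mask's first moments to get the centroid. In OpenCV this is done with cv2.inRange, cv2.findContours, and cv2.moments; the sketch below reproduces just the moment math in plain NumPy so the idea is clear without a camera attached (it is a simplified stand-in, not our production tracking code):

```python
import numpy as np

def mask_centroid(mask):
    """Centroid (cx, cy) of a binary mask via image moments.

    Equivalent to cv2.moments(mask): cx = m10/m00, cy = m01/m00.
    Returns None when the mask is empty, i.e. no ball tip in view,
    which is the signal used to shut the laser off.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # marker lifted off the screen
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame with a bright blob centred at (40, 70).
mask = np.zeros((100, 100), dtype=np.uint8)
mask[65:76, 35:46] = 255                 # rows index y, columns index x
print(mask_centroid(mask))               # -> (40.0, 70.0)
```

On the real system the mask comes from color-thresholding the Pi Cam frame around the ball tip's hue, as in Rosebrock's ball-tracking tutorial.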

Converting from Input to X-Y Coords
     The first step in converting visual cues into laser commands was writing a mouse-following algorithm in Python using the included pygame library. The mouse coordinates were then translated into G-code that the Arduino could understand. Once the OpenCV part of the project was worked out, the XY commands transferred over easily. This G-code output can be used on many different types of machines, not just laser cutters.
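The coordinate-to-G-code step can be sketched as a scaling from camera pixels to machine millimeters. The frame and bed dimensions below are illustrative assumptions, not our machine's actual calibration:

```python
def pixel_to_gcode(px, py, frame=(640, 480), bed=(200.0, 150.0)):
    """Map a pixel coordinate onto the work area and emit a G1 move.

    frame: camera resolution in pixels; bed: work area in mm.
    Image y grows downward, so it is flipped to machine y-up.
    """
    x_mm = px / frame[0] * bed[0]
    y_mm = (frame[1] - py) / frame[1] * bed[1]
    return f"G1 X{x_mm:.2f} Y{y_mm:.2f}"

print(pixel_to_gcode(320, 240))   # -> G1 X100.00 Y75.00
```

Because the output is plain G1 moves, the same stream could drive a CNC router or pen plotter just as well as our laser.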
Communicating between devices
     One of the main difficulties in making a "real-time" system is the communication between all the different microcontrollers. The two main microcontrollers used were the Raspberry Pi 2 and an Arduino Uno. To interface the two, we used serial communication, with a ser.write command executed after each X-Y point was defined.
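That per-point handoff can be sketched as below. Here ser stands for a pyserial port opened on the Pi (e.g. serial.Serial("/dev/ttyACM0", 115200)); the helper accepts any object with a write method, and the newline terminator is an assumption about the Arduino sketch's line-based parser:

```python
def send_gcode(ser, line):
    """Send one G-code line to the Arduino over serial.

    Appends a newline so a line-based parser on the Arduino
    (e.g. Serial.readStringUntil('\n')) knows the command is complete.
    """
    ser.write((line.strip() + "\n").encode("ascii"))

# On the Pi this would look like:
#   import serial
#   ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)
#   send_gcode(ser, "G1 X100.00 Y75.00")
```

Sending one short line per detected point keeps the latency low enough that the gantry visibly chases the marker in real time.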
Problems/Future Work
    The main limitation of this project is the power of the laser. The 500 mW laser needs to move very slowly to cut all the way through paper. Therefore, in the future it would be best to use a 1000 mW laser; the increases in price and operating temperature would not be very substantial, but the cutting speed could nearly double. The second main issue was getting the lighting right for the camera. Irregularities in the lab's ceiling caused false-positive points on the drawing surface. This problem was mitigated by covering the unit with sheets of green plexiglass. If another type of sensor were used instead of relying on optical techniques, the device could operate regardless of environment.
Lighting
    The lighting of the system was run on a completely independent circuit, mainly because the Raspberry Pi did not work well with two different Arduino controllers. The best lighting was provided by NeoPixels with RGB LEDs. We experimented with the lighting color inside the sensor box to ensure the best contrast between the 3D-printed ball and the acrylic drawing surface.
ME6408 - Advanced Mechatronics
Spring 2016 - Dr. Charles Ume