Industrial Light Painting aims to create high-fidelity three-dimensional light paintings of real people. This is done by combining the precision of a computer-controlled industrial arm and an RGB LED with a Kinect camera to capture and recreate portraits in depth and color.

Light painting is a photographic technique in which a light is moved in front of a camera taking a long exposure. The result is a streaking effect that resembles a stroke on a canvas. This is usually accomplished using a free-moving handheld light source, which produces paintings full of arcs and random patterns. While some artists can achieve recognizable shapes and figures in their paintings, these usually lack proper proportions and appear more abstract due to the absence of real-time visual feedback while painting. Unlike traditional painting, the lines the artist makes do not persist in physical space and are only visible through the camera. Recently, arrays of computer-controlled LEDs mounted on a rigid rod have allowed for highly precise paintings, but only on a single plane.

As in a manufacturing environment, an industrial robot replaces the fluid, less precise movements of a human with the highly accurate and controlled motions of a machine. The automated motions of the industrial robot solve the problem of the artist's lack of visual feedback while painting in light, by allowing him or her to create the painting virtually within the software used to instruct the robot as well as the light attached to it.

More Details: http://www.jeffcrossman.com/industriallightpainting/

How it Works
Industrial Light Painting creates full-color three-dimensional point clouds in real space using an ABB-manufactured IRB 6640 industrial robot. The point clouds are captured and stored using a Processing script and a Microsoft Kinect camera. The stored depth and RGB color values for each point are then fed into Grasshopper and HAL, which are plugins for Rhino, a 3-D modeler.
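The capture step can be sketched as follows. This is an illustrative Python version, not the project's actual Processing script; the Kinect v1 pinhole-model intrinsics are typical published values rather than the ones used here, and the CSV exchange format is an assumption standing in for however the data actually reaches Rhino.

```python
import csv
import io
from dataclasses import dataclass


@dataclass
class CloudPoint:
    x: float  # position in metres, back-projected from the depth image
    y: float
    z: float
    r: int    # color sampled from the Kinect RGB image at the same pixel
    g: int
    b: int


def depth_to_point(px, py, depth_mm, r, g, b,
                   fx=594.2, fy=591.0, cx=339.3, cy=242.7):
    """Back-project one Kinect depth pixel to a 3-D point.

    Uses the standard pinhole camera model; fx/fy/cx/cy are
    commonly cited Kinect v1 intrinsics, used here only as
    placeholder values.
    """
    z = depth_mm / 1000.0           # Kinect reports depth in millimetres
    x = (px - cx) * z / fx
    y = (py - cy) * z / fy
    return CloudPoint(x, y, z, r, g, b)


def save_cloud(points, fh):
    """Write one x,y,z,r,g,b row per point.

    A flat table like this is a guess at the hand-off format;
    the real pipeline feeds the values into Grasshopper/HAL.
    """
    writer = csv.writer(fh)
    writer.writerow(["x", "y", "z", "r", "g", "b"])
    for p in points:
        writer.writerow([f"{p.x:.4f}", f"{p.y:.4f}", f"{p.z:.4f}",
                         p.r, p.g, p.b])
```

A pixel at the optical centre maps straight down the camera axis, so its x and y come out as zero and its z is just the depth reading converted to metres.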
Within Rhino, toolpath commands are created for the industrial robot that instruct the arm how to move to each location in the point cloud. Custom-written instructions are also added to make use of the robot's built-in low-power digital and analog lines, which run to the end of the arm. This allows precise control of a BlinkM smart LED mounted at the end of the arm along with a Teensy microcontroller.

Using DSLR cameras set to capture long exposures, the commanded robot movements, together with precise control over the LED, recreate the colored point clouds of approximately 5,000 points within about a 25-minute period.

About the Creators
Jeff Crossman is a master's student at Carnegie Mellon University studying human-computer interaction. He is a software engineer turned designer who is interested in moving computing out of the confines of a screen and into the physical world.
www.jeffcrossman.com

Kevyn McPhail is an undergraduate student at Carnegie Mellon University studying architecture. He concentrates heavily on fabrication, crafting objects in a variety of mediums and pushing the limits of the latest CNC machines, laser cutters, 3D printers, and industrial robots.
www.kevynmc.com

Special Thanks To
Golan Levin for concept development support, equipment, and software.
Carnegie Mellon Digital Fabrication Lab for providing access to its industrial robots.
Carnegie Mellon Art Fabrication Studio for microcontroller and other electronic components.
ThingM for providing BlinkM ultra-bright LEDs.

Additionally, the creators would like to thank the following people for their help and support during the making of this project: Mike Jeffers, Tony Zhang, Clara Lee, Feyisope Quadri, Chris Ball, Samuel Sanders, Lauren Krupsaw.