
2017

I am fascinated with both art and robots. During my junior year, I worked on the Robot Art project in the Cornell University Autonomous Systems Lab (ASL) with a team of six.

 

The goal was to create artistic behavior for the KUKA YouBot, or, more specifically, a pipeline for the YouBot to reinterpret artworks and paint them on canvas.

My contributions included actuating the 5-link robot arm given commands specifying the brush width, line length, and color of a series of strokes. In the process I implemented and tested control of the YouBot using inverse kinematics, trajectory planning, and PID control. I also designed and built a compliant brush holder for the KUKA YouBot.
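To give a rough idea of the inverse kinematics piece, here is a minimal sketch (not the actual project code): it assumes a planar simplification of the arm with made-up link lengths and uses a damped least-squares solver to find joint angles that place the brush tip at a target point.

```python
import numpy as np

# Hypothetical planar link lengths in meters; the real YouBot arm is a
# 5-link spatial manipulator, so this is only a simplified illustration.
LINK_LENGTHS = np.array([0.155, 0.135, 0.218])

def forward_kinematics(q):
    """Brush-tip (x, y) position of a planar serial arm for joint angles q."""
    angles = np.cumsum(q)
    return np.array([np.sum(LINK_LENGTHS * np.cos(angles)),
                     np.sum(LINK_LENGTHS * np.sin(angles))])

def jacobian(q):
    """2 x n position Jacobian of the planar arm."""
    angles = np.cumsum(q)
    J = np.zeros((2, len(q)))
    for i in range(len(q)):
        # Joint i moves every link from i onward.
        J[0, i] = -np.sum(LINK_LENGTHS[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(LINK_LENGTHS[i:] * np.cos(angles[i:]))
    return J

def solve_ik(target_xy, q_init, damping=0.05, tol=1e-4, max_iters=200):
    """Damped least-squares IK: iterate dq = J^T (J J^T + lambda^2 I)^-1 e."""
    q = np.array(q_init, dtype=float)
    for _ in range(max_iters):
        error = np.asarray(target_xy) - forward_kinematics(q)
        if np.linalg.norm(error) < tol:
            break
        J = jacobian(q)
        q += J.T @ np.linalg.solve(J @ J.T + damping**2 * np.eye(2), error)
    return q

# Example: reach a point 30 cm forward and 15 cm up from the arm base.
q_solution = solve_ik([0.30, 0.15], q_init=[0.3, 0.3, 0.3])
```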

 

The following GIFs show the YouBot drawing a square using motion primitives implemented with different approaches. The PID controller was chosen for the final implementation because of its fast drawing speed.

[GIF] Inverse Kinematics

[GIF] PID Control

[GIF] Trajectory Planning
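For reference, here is a minimal sketch of the PID approach behind the square primitive. The gains, time step, and the pure-integration "plant" are all assumptions for illustration; on the real system the computed velocities are sent to the arm rather than integrated in software.

```python
import numpy as np

class PID:
    """Minimal discrete PID controller for a single axis."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def draw_square(side=0.10, dt=0.02, steps_per_edge=200):
    """Drive the brush tip through the four corners of a square."""
    corners = [(side, 0.0), (side, side), (0.0, side), (0.0, 0.0)]
    pid_x = PID(kp=2.0, ki=0.0, kd=0.1, dt=dt)   # hypothetical gains
    pid_y = PID(kp=2.0, ki=0.0, kd=0.1, dt=dt)
    pos = np.array([0.0, 0.0])
    path = [pos.copy()]
    for tx, ty in corners:
        for _ in range(steps_per_edge):
            vel = np.array([pid_x.update(tx - pos[0]),
                            pid_y.update(ty - pos[1])])
            # On hardware this velocity command would go to the arm; here we
            # integrate it to simulate the brush-tip trajectory.
            pos = pos + vel * dt
            path.append(pos.copy())
    return np.array(path)
```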

Reinterpretation of artworks was done by the image processing team, who collaborated with me to generate feasible commands for the robot to execute.

The following flowchart shows the image processing pipeline: an original artwork is first color-reduced and then simplified into a paintable image with a specified style. This image is broken down into a series of commands, which are stored in a JSON file. Each command contains information such as stroke color, stroke width, and stroke sequence.

[Figure] Image processing flow chart
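The exact JSON schema is not reproduced here, but the sketch below shows how such stroke commands might be loaded and executed in order. The field names (color, width, points) and the paint_stroke callback are assumptions for illustration only.

```python
import json
from dataclasses import dataclass

@dataclass
class Stroke:
    color: str     # e.g. a reduced-palette color name or RGB hex string
    width: float   # brush width on the canvas
    points: list   # ordered (x, y) waypoints for the stroke

def load_strokes(path):
    """Load an ordered list of stroke commands from a JSON file.

    Field names here are assumed for illustration; the project's actual
    schema may differ.
    """
    with open(path) as f:
        commands = json.load(f)
    return [Stroke(color=c["color"], width=c["width"],
                   points=[tuple(p) for p in c["points"]])
            for c in commands]

def execute(strokes, paint_stroke):
    """Execute strokes in sequence, changing paint only when the color changes."""
    current_color = None
    for stroke in strokes:
        if stroke.color != current_color:
            current_color = stroke.color
            # ...dip the brush in the new color here...
        paint_stroke(stroke)   # hypothetical callback that actuates the arm
```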

The compliant brush holder is designed for easy assembly and manufacturability; it is fabricated entirely with a laser cutter. A rubber band holds the pen or brush, and its tightness is adjusted to allow the pen or brush to flex.

[Photo] Compliant brush holder

The result is a painting system that reinterprets artworks and autonomously paints them in a predefined style.

[Photo] Bumblebee (but really Andy Warhol's Banana)
