Low-cost Raspberry Pi robot with computer vision

The ever-decreasing cost of hardware and the rise of Maker culture are allowing hobbyists to take advantage of state-of-the-art tools in robotics and computer vision for a fraction of the price. During my informal public talk at San Diego’s Pint of Science event “Machines: Train or be Trained” I talked about this trend and got to show off the results of a side project I had been working on. My aim in the project was to create a robot that was capable of acting autonomously, had computer vision capabilities, and was affordable for researchers and hobbyists.

When I was an undergrad at IU in the Computational Cognitive Neuroscience Laboratory, the trend was to use Khepera robots for research in Cognitive Science and Robotics. These robots run close to $2800 today. Years later I was fortunate to help teach some high school robotics courses (Andrew’s Leap and SAMS) with David Touretzky (here are videos of course projects) and got to see first-hand the great work being done to turn high-level cognitive robotics into a more affordable option. In the past few years, I’ve been helping to develop the introductory programming and robotics course (Hands-on Computing: SP13, SP14, FA15, SP15) here in the UCSD Cognitive Science department and have really enjoyed using the flexible platform and materials from Parallax (about the course).

For a while now, my advisor and I had wanted to set up a multi-agent system with robots capable of using computer vision. While some camera solutions exist for the Arduino, the setup was not ideal for our aims. My friends had recently used the Raspberry Pi to create an offline version of Khan Academy, and it seemed likely that the Raspberry Pi was up to the task.


I borrowed the chassis of the BOE Bot (which includes the wheels) and compiled this list of materials (totaling around $250, with the option of going cheaper, especially if you design your own chassis):


While I had some issues setting up wifi (UCSD has a protected network) and configuring an ad-hoc network, both should be doable. This was the procedure I followed to get the robot set up:

  1. Install Raspbian on the Raspberry Pi: link (and directions for installation)
  2. Configure the Raspberry Pi: link (make sure to enable camera support)
  3. Update the OS: sudo apt-get update && sudo apt-get upgrade
  4. Install the necessary software on the Raspberry Pi: sudo apt-get install python python-numpy python-scipy python-matplotlib ipython python-opencv openssh-server
  5. Install the RPIO Python package (link)
  6. Set up the Raspberry Pi to work with the camera: link1 link2
  7. Set up the Raspberry Pi to work with wifi: link
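Once step 4 finishes, it’s worth sanity-checking the installs from Python before wiring anything up. A minimal sketch (the package list mirrors the apt-get line above):

```python
# Check that the Python packages installed in step 4 import cleanly.
missing = []
for name in ("numpy", "scipy", "matplotlib", "cv2"):
    try:
        __import__(name)
    except ImportError:
        missing.append(name)

print("missing:", missing or "none")
```

If anything shows up as missing, re-run the apt-get line from step 4.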


The battery that powers the Raspberry Pi does not supply enough power to run the Raspberry Pi and the servos at the same time. I used the battery pack from the BOE Bot and connected two wires to the servos, one at the point in the battery pack where the power starts and one where it ends, so the batteries are used in series. Much of the wiring is based on the diagram found here. The pin numbers for the Raspberry Pi can be found here (that page also shows how to blink an LED). Here is an updated diagram (while the diagram shows 2 batteries, there were actually 5):


Learning Python is awesome and will help with any project after this one:
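As a small taste, here is a stand-in snippet in that spirit (the clamp helper is my own illustration, not code from the robot):

```python
# A taste of Python: clamp a motor speed into a safe range,
# the kind of small helper that comes in handy in robot code.
def clamp(value, lo=-1.0, hi=1.0):
    """Limit value to the interval [lo, hi]."""
    return max(lo, min(hi, value))

speeds = [clamp(s) for s in [-2.5, -0.3, 0.0, 0.7, 1.9]]
print(speeds)  # [-1.0, -0.3, 0.0, 0.7, 1.0]
```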

Controlling Servos

The control of servos is based on code found here. Use the following to center the servos (with the help of a screwdriver):
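The original listing isn’t reproduced here; below is a sketch along those lines using RPIO’s PWM servo support (GPIO pin 18 and the ±200 µs speed range are my assumptions, not values from the original code):

```python
import time

# Continuous-rotation servos (like the Parallax ones on the BOE Bot chassis)
# respond to pulse width: ~1.5 ms means "stop"; shorter or longer pulses rotate.
CENTER_US = 1500  # the 1.5 ms centering pulse, in microseconds

def pulse_width_us(speed):
    """Map a speed in [-1.0, 1.0] to a pulse width in microseconds.
    0.0 yields the centering pulse; the +/-200 us range is an assumption."""
    speed = max(-1.0, min(1.0, speed))
    return int(CENTER_US + 200 * speed)

def center_servo(gpio_pin=18):
    """Send the stop pulse so the trim pot can be adjusted with a screwdriver.
    RPIO is only importable on the Raspberry Pi, so the import lives here."""
    from RPIO import PWM
    servo = PWM.Servo()
    servo.set_servo(gpio_pin, CENTER_US)
    time.sleep(10)  # turn the pot until the wheel stops moving
    servo.stop_servo(gpio_pin)
```

With the servo powered and `center_servo()` running, turn the trim potentiometer until the wheel sits still.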

Using the Camera

There is excellent documentation for using the picamera module for Python:
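The basic capture pattern looks roughly like this (picamera’s PiRGBArray gives a NumPy array that OpenCV can use directly; the 320x240 resolution is just my choice for speed):

```python
def capture_frame(resolution=(320, 240)):
    """Grab one frame from the Pi camera as a NumPy array in BGR order,
    ready for OpenCV. Only runs on a Raspberry Pi with the camera enabled,
    so the picamera imports live inside the function."""
    import picamera
    import picamera.array
    with picamera.PiCamera() as camera:
        camera.resolution = resolution
        with picamera.array.PiRGBArray(camera) as output:
            camera.capture(output, format="bgr")
            return output.array
```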

Drawing a Circle

The easiest way to draw a circle on an image involves cv2, the newer Python interface to OpenCV (code modified from here).


And finally, below is a demo to show off the proof of concept. This awesome (!) article shows how to use the Raspberry Pi camera with OpenCV and Python; I found the ‘orangest’ object in the frame and had the robot turn towards it, and the section above shows how to draw circles on the images for display. I don’t recommend using a computer to view the output while the robot runs, as streaming the frames drastically slows down processing. It’s nice for a demo though! I hope you enjoyed this and found it useful!