Meshing the tracking with the robot controls was pretty straightforward. We used the database as a message queue: based on the image tracking, we would update the database to indicate where the camera should be pointed. The position-checking loop would read the new values from the database and correct the movement path. This made for fluid movement, as the head tilt/pan was always in motion. For small movements it would move slowly; for larger movements it would move faster, always along an acceleration arc that removed any jerkiness. We completed this several years ago but never posted the video… this is Half Built, after all.
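As a rough sketch of that kind of acceleration arc (not the actual control loop — `step_toward`, the gains, and the tick model are all made up for illustration), the idea is that speed ramps up each tick but is always capped by the speed from which the servo can still brake smoothly at the target:

```python
def step_toward(pos, target, vel, max_vel=30.0, accel=5.0):
    """Advance one tick toward target along a smooth acceleration arc.

    Speed grows by `accel` per tick, capped by `max_vel` and by the
    highest speed that can still decelerate to zero at the target.
    Small moves therefore stay slow, large moves speed up, and the
    approach never overshoots or jerks.
    """
    dist = target - pos
    direction = 1.0 if dist >= 0 else -1.0
    # Highest speed from which we can still brake to 0 at the target.
    braking_vel = (2.0 * accel * abs(dist)) ** 0.5
    new_vel = min(abs(vel) + accel, max_vel, braking_vel)
    step = direction * min(new_vel, abs(dist))
    return pos + step, direction * new_vel


# Simulate one pan move from 0 to 100 "units":
pos, vel = 0.0, 0.0
for _ in range(200):
    pos, vel = step_toward(pos, 100.0, vel)
```

The position trace eases in, cruises, and eases out, which is the fluid motion described above.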
I finally finished my port of the E1 object tracking from OpenCV (CV2) / Python 2.7 to the legacy OpenCV (CV) API on Python 2.6. I had to do this because the BeagleBoard doesn’t have a readily available Python 2.7 package (at least not with Angstrom), and I didn’t realize that OpenCV (CV2) required Python 2.7 until after I developed the original code on my laptop.
This was a bit of a pain because there are not many examples. If your platform supports Python 2.7, use OpenCV CV2! There are a lot of examples and it’s all object-oriented. It’s so much more enjoyable to work with.
I made a quick and dirty web GUI so that I can adjust the HSV values in real-time. It works pretty well.
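For a flavor of what such a tuning endpoint could look like, here is a minimal sketch — this is not the actual GUI; the `/set` route, the parameter names, and the use of Python 3's `http.server` are all assumptions for illustration. The tracking loop would read the shared bounds on every frame while a browser adjusts them live:

```python
import json
import threading
import urllib.parse
from http.server import BaseHTTPRequestHandler, HTTPServer

# Shared HSV bounds the tracking loop would read on every frame.
hsv_bounds = {"h_min": 0, "h_max": 179, "s_min": 0, "s_max": 255,
              "v_min": 0, "v_max": 255}

class TuneHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /set?h_min=30&h_max=90 updates the live bounds.
        parsed = urllib.parse.urlparse(self.path)
        if parsed.path == "/set":
            for key, vals in urllib.parse.parse_qs(parsed.query).items():
                if key in hsv_bounds:
                    hsv_bounds[key] = int(vals[0])
        body = json.dumps(hsv_bounds).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, format, *args):
        pass  # keep the console quiet

def serve(port=0):
    """Start the tuning server on a background thread (port 0 = ephemeral)."""
    server = HTTPServer(("127.0.0.1", port), TuneHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A real version would serve an HTML page with sliders, but the core of it is just "HTTP request mutates the thresholds the filter uses."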
I also built a quick-and-dirty web interface to control the servos. The big motivator here was that I replaced the original side-to-side movement with a pan-tilt configuration.
Unfortunately, when the two are used together, the board gets reset… still trying to figure out why…
One of the takeaways from last year’s Maker Faire was that having multiple battery packs and no overall power switch was problematic and actually dangerous.
As we were setting up our booth my boy took to hooking the batteries up to the robot. Somehow there was a short, followed by sparks and smoke. The battery was toast and the boy ended up with a small burn on his finger. It was at that moment that I knew a formal power system would be built before any further “playing” with the robot.
My requirements were simple.
- I wanted one source. I didn’t want one battery to run the servos and a different battery (or 2 as was the case) to run the motors.
- It had to be regulated for running the computer and all peripherals.
- Each subsystem could be switched off
- It needed one main power switch AND an emergency cutoff
While researching ideas I came across the approach of using step-down voltage regulators running in parallel off a large battery bank. This makes it easy to add power capacity as needed. I would carve off 5V for the BeagleBoard, 5–6V for the servos (and controller), and 7V for each motor.
I used these 3-amp buck converters for the BeagleBoard.
I used these 5-amp buck converters for each of the remaining subsystems (servos, left motor, right motor).
For power switches I couldn’t resist lighted rocker switches (just for the bling) and of course the emergency switch had to be a mushroom button.
I had envisioned the switches running horizontally across the shoulders of the back of the robot, but the boy overruled me with a vertical design (which looks great and works better). We used a corrugated plastic sign trimmed down to the proper dimensions and spray-painted white. Additionally, it was reinforced with white duct tape (so the painting ended up not being all that necessary).
On paper this all seemed great and was easy to layout in a crisp design:
The reality of the wiring was pretty messy:
Labels make for a polished look.
And it works!
I put together a fun script which isolates colors using HSV filtering and then finds the largest “blob”, which is presumably an object that should be tracked. It then finds the center of the “blob”, draws a target indicator on it, and labels it with the X,Y coordinates.
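To show the blob-finding step without pulling in OpenCV, here is a pure-Python sketch of "largest connected region, then centroid" on a binary mask — the actual script does this with OpenCV primitives, so `largest_blob_center` is just a stand-in for the logic:

```python
from collections import deque

def largest_blob_center(mask):
    """Return the (x, y) centroid of the largest 4-connected region of
    1s in a binary mask (a list of rows), or None if the mask is empty.
    This mirrors what happens after HSV thresholding: pick the biggest
    "blob" and target its center.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = None  # (size, cx, cy)
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                # Flood-fill this blob, summing its pixel coordinates.
                queue = deque([(x, y)])
                seen[y][x] = True
                sx = sy = size = 0
                while queue:
                    px, py = queue.popleft()
                    sx += px; sy += py; size += 1
                    for nx, ny in ((px+1, py), (px-1, py), (px, py+1), (px, py-1)):
                        if 0 <= nx < w and 0 <= ny < h and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                if best is None or size > best[0]:
                    best = (size, sx // size, sy // size)
    return None if best is None else (best[1], best[2])
```

With the centroid in hand, drawing the target indicator is just a crosshair and a text label at that (x, y).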
The result is surprisingly good object tracking with minimal code. I have yet to port it over to the BeagleBoard because I wrote the script taking advantage of OpenCV’s “CV2” library, which requires Python 2.7. Unfortunately, only 2.6 is available via packages for the BeagleBoard, and I have had little success in getting 2.7 to cross-compile. So I’m in the middle of porting the code to use only the core “CV” functions.
Here are the results of the object tracking (running on my Windows laptop):
Ultimately the E1 will be controlled by a BeagleBoard computer. To accomplish this I bought a Torobot 24-servo controller board, but had a really hard time finding an easy-to-use API to interface with it. I tried PyUSB to no avail.
Finally I found that the Torobot USB board could be communicated with through an Arduino serial driver. Conveniently this is available through opkg:
opkg install kernel-module-cdc-acm
When the board is plugged in, it comes up as /dev/ttyACM0.
From here you can simply echo commands to the device.
echo "#8P1500T100" > /dev/ttyACM0
This basically says “set servo 8 to position 1500 with speed 100”. Doesn’t get much simpler than that!
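From Python, the same command can be written to the device just like the echo above. A minimal sketch — note that the `\r\n` terminator and the 500–2500 pulse range are my assumptions here, not something from the Torobot docs, so verify against your board:

```python
def servo_cmd(channel, position, speed):
    """Build a Torobot-style command string: #<servo>P<position>T<speed>.

    The 500-2500 bound is an assumed typical RC pulse range, used only
    as a sanity check.
    """
    if not 500 <= position <= 2500:
        raise ValueError("position %d outside assumed pulse range" % position)
    return "#%dP%dT%d" % (channel, position, speed)

def send(cmd, device="/dev/ttyACM0"):
    """Write a command to the controller's serial device.

    Equivalent to: echo "#8P1500T100" > /dev/ttyACM0
    (the \r\n line ending is an assumption).
    """
    with open(device, "w") as port:
        port.write(cmd + "\r\n")
```

Centering servo 8 over 100 ms would then be `send(servo_cmd(8, 1500, 100))`.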
Make Magazine, one of my favorite sources of inspiration, is sponsoring the Rhode Island Mini Maker Faire at AS220 Foo Fest today. We are proud to have the opportunity to show off the E1 (now known as “Nuts”).
Join us at the Maker Faire today (Aug 10). We will be there from 1pm till at least 5pm – though the Faire will last till 1am.