This project used the Pixy cam's built-in hardware along with an Arduino UNO to design a device that can autonomously track a single-colored object. Two motor drivers were used to control the four wheel motors independently. Visual data is gathered by the Pixy cam, and the wheels move according to the position of the object on the Pixy's screen. The ability to recognize single-colored objects is built directly into the Pixy cam's hardware.


Once it has been trained, the Pixy cam software can deliver information about the object it is following on its screen. The training process is simple: present the object you want to track to the Pixy cam, then highlight the area it covers on screen using the mouse. As long as the lighting in the area is sufficient, the Pixy cam will recognize the object whenever it remains within the camera's view. Achieving consistent results under varying lighting conditions posed a challenge throughout implementation: depending on the lighting, the Pixy cam could confuse one object with another, or fail to pick up an object at all.

The Pixy cam stores important information about the object it is viewing, as well as the camera's pan/tilt angle as it moves. This information is exposed through member variables and functions in the Pixy Arduino library. After some research, I found documentation that elaborated on which members I needed to read in order to get useful data from the camera.

The links below contain the documentation for the color connected components (CCC) and camera pan/tilt APIs:

https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:ccc_api

https://docs.pixycam.com/wiki/doku.php?id=wiki:v2:pan_tilt_demo

The only information I needed to pull directly from the Pixy cam into the Arduino was: the x and y position of the object on the screen, the number of blocks being generated on the screen, and the deviation of the pan/tilt system from the center of the camera mount. This information was used in Arduino code to control a pair of DC motor drivers driving four DC motors that act as the wheels.
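To make the "deviation from center" idea concrete, here is a minimal off-hardware sketch. The Pixy2 Arduino library reports tracked objects as blocks (filled in by `pixy.ccc.getBlocks()`, with fields like `m_x` and `m_y`, per the CCC docs linked above); the small stand-in struct and the 316x208 frame size below are assumptions used so the math can run and be tested without the camera.

```cpp
// Hypothetical stand-in for one tracked block reported by the Pixy cam.
// On real hardware these values would come from the Pixy2 library's
// block fields (see the CCC API documentation linked above).
struct PixyBlock {
    int x;  // horizontal position of the block's center, 0..315
    int y;  // vertical position of the block's center, 0..207
};

// Assumed Pixy2 frame dimensions for the CCC tracking data.
const int FRAME_WIDTH  = 316;
const int FRAME_HEIGHT = 208;

// Signed deviation of the tracked block from the frame center.
// Negative = object is left of (or above) center; zero = centered.
int xError(const PixyBlock& b) { return b.x - FRAME_WIDTH / 2; }
int yError(const PixyBlock& b) { return b.y - FRAME_HEIGHT / 2; }
```

These signed errors are what the steering logic later consumes: the sign says which way to turn, and the magnitude says how far off-center the object is.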

I didn't know anything about motor drivers or DC motors before this project. The robot kit originally came with its own board with built-in motor drivers, but that board was not suitable for our purposes. So I used the original power supply and DC motors that came with the robot kit, but swapped in two L298N motor drivers to control the motors. I wired the motor drivers to receive power directly from the battery source in parallel with each other, while the Arduino UNO and Pixy cam received power in series with one of the motor drivers. Each motor driver can control two DC motors independently, using simple high/low logic to set the direction and a PWM signal to control power output.
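The high/low direction logic plus PWM scheme can be sketched as a pure function. This is a hypothetical model, not the project's actual code: each L298N channel takes two direction inputs (commonly labeled IN1/IN2) and a PWM duty cycle on its enable pin, and a single signed speed value maps onto all three.

```cpp
// Model of the signals sent to one L298N motor channel:
// two direction pins plus a PWM duty cycle on the enable pin.
struct MotorSignal {
    bool in1;   // direction pin 1
    bool in2;   // direction pin 2 (opposite of in1 when driving)
    int  pwm;   // 0..255 duty cycle on the enable (ENA/ENB) pin
};

// speed in -255..255: the sign selects direction, the magnitude
// becomes the PWM power level (clamped to the 8-bit PWM range).
MotorSignal driveSignal(int speed) {
    MotorSignal s;
    if (speed >= 0) { s.in1 = true;  s.in2 = false; s.pwm = speed;  }
    else            { s.in1 = false; s.in2 = true;  s.pwm = -speed; }
    if (s.pwm > 255) s.pwm = 255;
    return s;
}
```

On the Arduino side, the result would be applied with `digitalWrite()` on the two direction pins and `analogWrite()` on the enable pin.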

I created a simple standalone library called "directional control" containing several functions for moving forward, backward, left, and right. I wanted all of the movement functionality collected as functions I could call from the main file without risking breaking any previous work. So I built a library of movement instructions and, in the main code, used functions that inherited the movement functions to keep things simple. Working on this project taught me a lot about object-oriented programming: inheritance, classes and objects, libraries and methods.
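A minimal sketch of the "directional control" idea might look like the class below. The names and the differential-drive convention (spin in place by running the two sides in opposite directions) are illustrative assumptions, not the project's actual library; pin writes are omitted so the logic runs off-hardware.

```cpp
// Illustrative sketch of a directional-control class: it owns the
// commanded left/right wheel speeds and exposes movement helpers.
// A main sketch (or a subclass) would translate these speeds into
// motor-driver pin writes each loop.
class DirectionalControl {
public:
    int left  = 0;   // commanded speed for the left-side motors
    int right = 0;   // commanded speed for the right-side motors

    void forward(int speed)   { left =  speed; right =  speed; }
    void backward(int speed)  { left = -speed; right = -speed; }
    void turnLeft(int speed)  { left = -speed; right =  speed; }  // spin in place
    void turnRight(int speed) { left =  speed; right = -speed; }
    void stop()               { left = 0; right = 0; }
};
```

Keeping the movement commands in one class like this is what lets the main tracking loop stay short: it only decides *which* command to issue, never *how* the wheels are driven.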

The instructions that determine the robot's movement are simple. If the tracked object is too far to the left or right of the center X position, the corresponding directional control turn function ('turn_left' or 'turn_right') is called until the object is near the center of the screen. Once the object is within an acceptable range of the middle of the screen, the camera's tilt angle is taken into account: the shallower the angle, the farther away the object is likely to be. I made this assumption because the main targets the robot needed to track were little yellow ducks sitting low to the ground. If the camera is tilted beyond a certain angle and the object is near the center X position, the directional control command 'forward' is called and the robot drives until it comes close enough to the object.
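The rule above can be written as one pure decision function. All thresholds here are illustrative assumptions rather than the project's tuned values, and the sign convention (an object left of center calls for a left turn, a shallow tilt reading means the object is far away) is likewise assumed for the sketch.

```cpp
// The tracking rule as a pure decision function: steer until the block
// is near the center X, then drive forward while the camera tilt is
// still shallow (object assumed far away). Thresholds are placeholders.
enum Command { STOP, TURN_LEFT, TURN_RIGHT, FORWARD };

Command decide(int blockX, int tiltAngle,
               int centerX    = 158,   // half of an assumed 316-px frame
               int xTolerance = 30,    // "near the center" band
               int closeTilt  = 60) {  // tilt steeper than this = close enough
    if (blockX < centerX - xTolerance) return TURN_LEFT;
    if (blockX > centerX + xTolerance) return TURN_RIGHT;
    if (tiltAngle < closeTilt)         return FORWARD;  // shallow tilt: still far
    return STOP;                                        // centered and close
}
```

Because the function is stateless, it can be called once per loop with fresh Pixy readings, and the jitter issues described below show up as the command flipping between values on successive calls.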

A big challenge for this project was getting the robot to respond consistently. There were many instances where the robot would respond well, and then, when put in the same conditions again, would have difficulty tracking the object, often leading to erratic and jittery motion. Many of the issues came down to the fidelity of the Pixy cam itself, which ended up being very difficult to work around.