The final project for EEEE-485 Principles of Robotics at RIT was to create an autonomous robot with the following requirements:
Additionally, a unique "personality trait" was required; we were free to choose whatever trait we desired and could be as ambitious or conservative as we wanted.
The software for this project was written in C++ in a functional style, organized around a few distinct states and actions triggered by flags, and designed to iterate through the main loop as frequently as possible. This prevented the odd behavior that arises when a program enters several nested loops: by keeping a single top-level loop with timers and flags, the program could never get stuck in an unintended infinite loop, because the main loop was the only infinite loop and had feedback systems to change states as needed. The code is available on GitHub.
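In skeleton form, the structure looked something like the sketch below. The state names, flags, and timing values here are illustrative stand-ins, not the actual project code:

```cpp
// Minimal sketch of the single top-level loop with timers and flags.
// State names, flags, and intervals are hypothetical examples.
enum State { FOLLOW_LINE, SCAN_FOR_LINE, DRIVE_TO_HEADING };

State state = FOLLOW_LINE;
bool lineLostFlag = false;
unsigned long lastSensorRead = 0;
const unsigned long SENSOR_INTERVAL_MS = 5;

void readSensorsAndSetFlags() { /* update lineLostFlag from sensors */ }
void followLineStep()         { /* one proportional correction step */ }
bool scanStep()  { return true; /* advance the scan; true when done */ }
bool driveStep() { return true; /* turn toward heading; true when done */ }

void setup() {}

void loop() {
  // Non-blocking timer: read sensors only when the interval elapses,
  // so the loop itself never blocks or nests.
  if (millis() - lastSensorRead >= SENSOR_INTERVAL_MS) {
    lastSensorRead = millis();
    readSensorsAndSetFlags();
  }

  // Flags trigger state transitions; each state performs one short,
  // non-blocking action per iteration and returns to the top.
  switch (state) {
    case FOLLOW_LINE:
      if (lineLostFlag) state = SCAN_FOR_LINE;
      else followLineStep();
      break;
    case SCAN_FOR_LINE:
      if (scanStep()) state = DRIVE_TO_HEADING;
      break;
    case DRIVE_TO_HEADING:
      if (driveStep()) state = FOLLOW_LINE;
      break;
  }
}
```

Because each state does one small piece of work and returns, the loop rate stays high and every sensor and timer gets serviced on every pass.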
The physical design was printed on my Elegoo SLA printer and had to be rotated and trimmed slightly to make full use of the print volume. As an EE major with no formal CAD training, I spent more than half of the development time on the model before fabricating any parts, using SolidWorks to create an assembly of all my designs and vendor parts, including a few that I measured and modelled myself. This informed much of the design and reduced the need to re-design any parts due to improper fitment or collision, or rather eliminated it: all the parts fit together, and the small channels I included for routing wires kept everything neat and tidy. The models are available for reference and download on Thingiverse and Printables.
I envisioned an array of audio sensors on top of the robot that could calculate the direction of a noise source by comparing the relative time and intensity picked up by each sensor, and decided that theming the robot as one of the monsters from A Quiet Place was far more important than choosing something more mundane but easily achievable. I selected some commonly available sensors known to work with Arduino platforms, but in preliminary testing I found that they worked best with low frequencies or at close range, which would not be suitable for picking up conversational sound levels from more than a few feet away; they did respond well to close-range yelling or blowing, so I proposed a slight pivot in the action while the physical design and theory remained the same. Instead of tracking the source of sounds, the driving system used the signal from each sensor to influence the speed and direction of movement when that sensor was blown into, and with this the most important aspect, the theme, changed from a sound-tracking monster to a sailboat catching a big gust of wind.
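As a rough sketch of the idea, blowing into one side's sensor can be cross-coupled to the opposite motor so the robot steers away from the "gust". The pin assignments, noise floor, and scaling below are hypothetical, since the real wiring and tuning aren't listed here:

```cpp
// Illustrative sketch: map the analog level from left/right sound
// sensors to wheel speeds, so blowing into one side steers the robot.
// Pin assignments and constants are hypothetical.
const int LEFT_MIC_PIN    = A0;
const int RIGHT_MIC_PIN   = A1;
const int LEFT_MOTOR_PWM  = 5;
const int RIGHT_MOTOR_PWM = 6;
const int NOISE_FLOOR     = 80;  // ignore ambient sound below this level

void setup() {
  pinMode(LEFT_MOTOR_PWM, OUTPUT);
  pinMode(RIGHT_MOTOR_PWM, OUTPUT);
}

void loop() {
  // Read each "sail" sensor and subtract the ambient noise floor.
  int left  = max(analogRead(LEFT_MIC_PIN)  - NOISE_FLOOR, 0);
  int right = max(analogRead(RIGHT_MIC_PIN) - NOISE_FLOOR, 0);

  // A gust on the left sensor drives the right wheel harder, and vice
  // versa, so the robot turns away from the wind like a sail.
  analogWrite(LEFT_MOTOR_PWM,  constrain(map(right, 0, 943, 0, 255), 0, 255));
  analogWrite(RIGHT_MOTOR_PWM, constrain(map(left,  0, 943, 0, 255), 0, 255));
}
```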
The standard lab parts kit contained a Pololu QTRX reflectance sensor, and having worked with that sensor and the provided library before, I started with a proportional-gain closed-loop feedback system iterating as many times as possible every second to minimize deviation from the center point of the line on the course. This tracked the line extremely accurately without the need for an integral or derivative gain term in the error correction. By using both the raw reflectance values from the QTRSensors.read() function and the position estimate from the QTRSensors.readLine() function, the program could detect a situation where it had gone off the line but was not near the left or right edge of the line, indicating that it had reached the end of the line and had not just gone too far off the side of it.
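A minimal sketch of that approach, using the current QTRSensors API (readLineBlack(), which older library versions call readLine()) and hypothetical pins, gain, and thresholds; tracking the last on-line position is my assumption about how the end-of-line case is distinguished:

```cpp
#include <QTRSensors.h>

// Sketch of the proportional line follower and end-of-line check.
// Pins, gain, and thresholds are hypothetical, not the tuned values.
QTRSensors qtr;
const uint8_t SENSOR_COUNT = 3;
uint16_t sensorValues[SENSOR_COUNT];

const float KP = 0.05;                   // proportional gain only; no I or D
const int BASE_SPEED = 120;
const uint16_t ON_LINE_THRESHOLD = 200;  // calibrated value meaning "sees line"
const uint16_t CENTER = 1000;            // midpoint of the 0-2000 position range

uint16_t lastOnLinePosition = CENTER;

void setMotorSpeeds(int left, int right) { /* drive the gearmotors */ }
void stopAndScan() { /* halt and begin the rotational scan */ }

void setup() {
  qtr.setTypeRC();
  qtr.setSensorPins((const uint8_t[]){10, 11, 12}, SENSOR_COUNT);
  for (int i = 0; i < 200; i++) qtr.calibrate();  // sweep sensors over the line
}

void loop() {
  // readLineBlack() returns a 0-2000 position estimate and fills
  // sensorValues with calibrated readings, where higher means darker.
  uint16_t position = qtr.readLineBlack(sensorValues);

  bool anySensorOnLine = false;
  for (uint8_t i = 0; i < SENSOR_COUNT; i++) {
    if (sensorValues[i] > ON_LINE_THRESHOLD) anySensorOnLine = true;
  }

  if (anySensorOnLine) {
    lastOnLinePosition = position;
    int error = (int)position - (int)CENTER;
    int correction = (int)(KP * error);
    setMotorSpeeds(BASE_SPEED + correction, BASE_SPEED - correction);
  } else if (abs((int)lastOnLinePosition - (int)CENTER) < 500) {
    // Off the line, but the last good reading was near center: we ran
    // off the END of the line rather than drifting off its side.
    stopAndScan();
  }
  // else: drifted off the side; keep correcting back toward the line.
}
```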
Being able to identify this condition, and seeing the physical placement of the sensors relative to the axis of rotation, I thought that if I could ensure an accurate rotation I might be able to create a rotational map of reflectance values and use that data to decide which direction to turn. I used the quadrature encoder signals from the two gearmotors to create a barebones encoder system that regulated the relative driving pulse for each motor, ensuring consistent speed and a center of rotation that remained very stable. Through some empirical testing and plotting in Excel, I found that by averaging the reflectance values from all three sensors at every angle and calculating a gradient, the program could reliably identify each local maximum. After low values were filtered out, the turns on the sample course left only two remaining peaks: one ~180° behind where the robot came from, and a second angle representing the direction in which the course continues.
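A sketch of how that scan-and-peak-detect step could look, with hypothetical step sizes, tick counts, and thresholds; the encoder-regulated pivot is stubbed out, and the sensor object is assumed to be configured as in the sketch above:

```cpp
#include <QTRSensors.h>

// Sketch of the rotational scan: pivot in place in fixed encoder-tick
// increments, average the three reflectance sensors at each step, then
// keep local maxima above a noise floor. All constants are hypothetical.
QTRSensors qtr;                       // calibrated as in the sketch above
const uint8_t SENSOR_COUNT = 3;
uint16_t sensorValues[SENSOR_COUNT];

const int SCAN_STEPS = 72;            // 5-degree increments over 360 degrees
const long TICKS_PER_STEP = 30;       // encoder ticks per 5-degree pivot
const uint16_t PEAK_THRESHOLD = 600;  // filters out the low values
uint16_t scanProfile[SCAN_STEPS];

void rotateByEncoderTicks(long ticks) {
  /* pulse the motors in opposite directions, balancing their drive
     pulses until each encoder accumulates the requested ticks */
}

// Spin a full circle, recording the average calibrated reflectance of
// all three sensors at each angular step.
void performScan() {
  for (int step = 0; step < SCAN_STEPS; step++) {
    rotateByEncoderTicks(TICKS_PER_STEP);
    qtr.readCalibrated(sensorValues);
    uint32_t sum = 0;
    for (uint8_t i = 0; i < SENSOR_COUNT; i++) sum += sensorValues[i];
    scanProfile[step] = sum / SENSOR_COUNT;
  }
}

// Walk the profile and keep angles that are local maxima above the
// threshold; on the sample course this leaves two peaks: ~180 degrees
// (the way we came) and the direction the course continues.
int findPeaks(int peakSteps[], int maxPeaks) {
  int found = 0;
  for (int i = 0; i < SCAN_STEPS && found < maxPeaks; i++) {
    uint16_t prev = scanProfile[(i + SCAN_STEPS - 1) % SCAN_STEPS];
    uint16_t next = scanProfile[(i + 1) % SCAN_STEPS];
    if (scanProfile[i] > PEAK_THRESHOLD &&
        scanProfile[i] >= prev && scanProfile[i] > next) {
      peakSteps[found++] = i;  // angle in degrees = step index * 5
    }
  }
  return found;
}

void setup() {}
void loop()  {}
```

Comparing each sample only to its circular neighbors keeps the peak detection simple, and the threshold does the same job as filtering out the low values before reading off the two surviving angles.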
I believe my navigation system could be used to navigate the most complicated line-following courses, either by pre-loading the robot with a map of correct directions or by building a self-mapping system and allowing it to navigate the course at will, creating a map on the fly. I have posted a short video on YouTube with my developments so far and intend to post more if I develop this system further.