“Talk is cheap. Show me the code.” ~ Linus Torvalds
I’ve been doing a lot of talking these last few blogs, but I haven’t really said how I’m going to get this robot around the course, now have I? Well, as the creator and lead developer of the Linux kernel says, the only real way to do that is to demonstrate my code and show you!
Love it or hate it, coding is an integral part of being an engineer. You generally won’t get too far without at least being able to read it. In this assignment, we were tasked with reading and writing the code needed for the robot to perform its tight manoeuvres. Reading code was necessary if we wanted to use any of the ELEGOO’s built-in code. As it’s open source, that was more than okay, but for the purpose of the assignment it would have to be well commented in order to demonstrate our understanding. For this reason, I chose not to include the self-balancing code I previously talked about and instead physically attached a stabilising caster wheel as outlined in a previous blog.
Before you say it, I know, I know. I raved on about how interesting I found the balancing code in my Assembly and Testing blog (link this) but to be honest, considering time constraints and my growing number of fast-approaching deadlines, it wouldn’t have been feasible or logical for me to invest so much time into fully understanding code as sophisticated as that. For sure though, if I have time between the end of exams and needing to return the robot to Trinity, it is definitely something I plan on looking into a bit more!
So what in-built code was I going to use? Well, to be honest, not a lot of it. There are numerous scripts involved, all feeding into one another, and there was a lot of stuff I don’t think was quite necessary for what I was tasked with. Actually, most of it WAS unnecessary, because the final code I compiled to get Eli around the course was a fraction of the length of the pre-programmed code.
I stripped the code right back and created my own blank script. I used the style and format we were taught in the first half of the course last semester. I did keep the pin definitions and initialisations, however, as they saved me scouring the internet to see what pins were connected where. Essentially, I required only my void setup, void loop and a couple of functions which control the direction. The way in which these functions were executed took a bit of tinkering though, and I did run into a problem or two along the way.
First off, I planned on using my ultrasonic sensor to detect when the walls were a certain distance away and signal to Eli that he needed to stop and turn. Perhaps then I could’ve employed the IR sensors to detect which side was free from obstacles and turn the robot to face that way. The reason I say this so philosophically is because, in the end, I couldn’t actually use my sensors reliably. We all know they were a bit dodgy anyway, but stay with me as I take you through my coding process and explain why I chose to leave them out for good.
Shown here is the pre-programmed code detailing how the motors should move once the robot is told to go in a certain direction, whether that be back, forward, left or right. The part I want to draw your attention to, though, is how the robot is told to turn left and right. To turn left, the right motor is held stationary and the left motor is driven backwards. Now, this is all well and good if space isn’t an issue. It actually worked fine for the first turn, but as the course progressed, there was no way this was accurate enough.
I created the following little depiction (drawn pretty much to scale) to demonstrate exactly why this is the case. The first half of each step shows the direction the motors are operating and the second half shows the final stopping position of the robot. Zoom in to see the directional arrows more clearly.
Hence, based on this reasoning, I couldn’t use the ultrasonic sensor to detect when the robot juuusst reached the wall and then turn, or at least not with this method of turning.
So now that Plan A was in the bin, where did I go next? Well, to begin, I altered the turning mechanism. Instead of turning left like it did before (right wheel stationary, left driven back), I swapped this around so that the left wheel stayed stationary and the right was driven forward. With my turning altered, I hoped to use my ultrasonic sensor to detect when I was about 20 cm (roughly the robot’s width) from an obstacle in front, which would provide enough room for my robot to manoeuvre within the bounds of the course. Once again though, I was stumped. With the detection distance increased, the sensor just couldn’t seem to pick up the measurements accurately, even though the maximum range is supposed to be 4 m!! The accuracy just wasn’t there and, once again, I decided to scrap my plan.
Fed up with the sensors, I reverted to the most rudimentary method of getting Eli around this course and instead focused primarily on the DC motors and sequential execution. I also adapted my turning once again so that both wheels rotate to produce the movement. For example, to turn left, the left motor goes forward and the right motor goes backwards. This also helped ensure the robot turned the same amount each time (approx 90°), as turning was no longer dependent on one motor as it was before, and so variations in motor operation were not as cumbersome to deal with, as I’ll explain in a minute.
Essentially, I implemented the functions the same way as the in-built code, but with some minor tweaks they looked like this instead.
Now you’ll notice that the ‘Forward’ function is just slightly different from the other two. That is because, for a number of reasons, veering can occur. In this case, rather than occurring due to uneven mass distribution etc., the dominant factor is that variations between motors mean they may not always operate identically. Thus, I needed to make one motor operate slightly faster than the other to mitigate this drift when moving forwards. It didn’t affect the left and right functions, as I could simply change the timings on these in the void loop. Also, whether the AIN1 and BIN1 pins are in a high or low state dictates which way the motor turns, i.e. low is forwards and high is backwards.
Now I only needed to implement these into my void loop in the correct order and enter my timings. This took a bit of trial and error but was overall much quicker than the time I had already invested in my other two plans. I was pretty disappointed that I couldn’t use my sensors to actually get around the course, but I decided to incorporate the ultrasonic sensor anyway… just to show that I could, I suppose. Instead of turning Eli on and having him race forward straight away, or even implementing a delay of some arbitrary length, I decided to introduce an if statement so that when the robot is turned on, he doesn’t move until I wave my hand in front of him. This ensures I have the exact time I need to get my launcher and video recording set up with each run. It’s a bit superficial, but it’s one way of making use of the sensors I suppose.
Here’s a quick video of Eli navigating the course and putting himself in prime position to catch balls shot from the launcher.
What I Would Change Within The Code If I Were To Do It Again
- Shorten the script- As this was not actually a coding assignment, my execution didn’t have to be that fancy. However, the ideal in coding is that it’s as concise as possible, and mine was somewhat repetitive. Given that my code was relatively short, copying and pasting worked fine, but ideally this should be avoided.
- Avoid picking arbitrary motor speeds to rectify the motor variations- with some more time, I feel I would’ve been able to incorporate a control loop with speed feedback to rectify this inaccuracy. Perhaps a Hall sensor could work too.
- Use the balancing code- once again with some more time and as I previously expressed, I would love to be able to pick apart and understand the balancing code. This would eliminate my need for both the stabilising wheel and the counterweights. I tried to mitigate the need for counterweights by slowing down the robot and/or getting it to build up speed gradually however poor Eli took quite a few falls even while travelling at snail’s pace.
What I Would Change About The Robot
- The wheel- it worked perfectly as a stabiliser, don’t get me wrong, but I began to think that removing the gold supports and changing the height may have affected the sensors. They still worked, obviously, but with the new change they were highly inaccurate. It is also possible, of course, that I just received some extremely dodgy sensors, but I feel that physically changing the set-up could have disrupted/upset operations.
Instead of removing the supports, I also tried to rectify the new height difference and tilt by adding nuts in strategic areas as shown here. It was an extremely fiddly process though, and two was the maximum I could insert, yet it wasn’t enough.
At one stage I even had the front tray on upside down in case that little lip was affecting anything slightly.
All to no avail. I didn’t think it’d be a big deal at first, given my ability to physically change the set-up, but if I were to do this again, I would find a wheel that fit perfectly, (a) to avoid all this extra hassle and (b) in the hope that my sensors would work more precisely, more often.
Also I obviously would’ve found a flatter counterweight than my AirPods in the event that I did use the sensors.
Also also!! I sometimes had the problem of overshooting as you’ll see in the coming video, perhaps I could’ve rectified this by creating a taller backboard on my catcher so that balls could hit off it and fall back down into the box. After all, the brief never said how tall it could be!
So, while it may not have been the most advanced or technical coding execution, it worked reliably and that was all that I needed.
I now had a fully equipped robot that could get around the course and a launcher that was just waiting to be used. All that was left now was to bring it all together! It was a one-woman show. Operating the camera, starting the robot and firing the launcher proved tricky at times, and my poor knees certainly weren’t the better of it haha. There were also some odd shapes being thrown as I attempted to set the launcher up by myself - all four limbs definitely came in handy! So let’s take a look at how I got on…
So as you can see, not 100% success but not bad either!
I definitely landed balls out of each of the three cylinders and with the reload also… they just weren’t all at the same time unfortunately.
The robot could repeatedly take up the correct position, but the launcher wasn’t quite as accurate. A sliiightly bigger catching area and I could definitely have caught all the balls… but that’s the price of playing by the rules I suppose. I knew the launcher could produce roughly the same launch each time; however, 129 cm² was just that bit too small.
So if I were to do it again…
What Would I Change About The Launcher?
- Use different materials- The threaded bars seemed like a great idea to help adjust my fire; however, as the launcher was repeatedly fired, the nuts (those inside the cylinder dictating the maximum point, and also the nuts holding the washers/spring) slowly loosened and therefore had to be recalibrated often. You would fix and test one cylinder and then move on to the next, but as you were testing the second one, the first was silently shifting. It became one big game of chance, hoping they were all calibrated roughly the same. You can actually see in my video above how I have trouble pulling the middle bar back to reload the balls; this was because the inside/middle nuts (seen in the picture below) had moved and were preventing me from drawing the bar back far enough. Hence you can also see me performing a twisting action to correct this and get the balls out. I would like to point out that this didn’t happen extremely often or anything, but it is a definite drawback.
- Use different materials- I envisioned the heavy materials being sturdy and steady whereas in reality, the kick back they created made it hard for me to keep the launcher in the same position for all of the shots. I just didn’t have the strength to keep it fixed again and again.
- Use different materials- The centre cylinder, as you know, was made out of steel; the outer two were plastic. Once assembled, I found it much easier to pull the threaded bars through the plastic cylinders than through the steel one.
- And- you guessed it- use different materials- The compression spring worked well; however, the washers on top of it were prone to moving (due to movement of the nuts) and altering how the spring sat. Although this was primarily caused by the nuts and threaded-bar situation mentioned earlier, I would still like to see how this mechanism would’ve worked with tension springs positioned outside the cylinder. Perhaps the fact that they would be definitely fixed would mitigate some more of the possible variations. Perhaps they would cause more hassle; I’ve learned it is hard to predict or assume these things without actually testing them. This project definitely threw up problems I had not envisioned beforehand.
Again, if I were to attempt this in the future, I would also possibly experiment on different surfaces; I’ve heard these kinds of robots tend to have a soft spot for carpet. I would also get more ping pong balls! When my launcher reloading system was working smoothly, I had more than enough time to fire even more balls… I just didn’t have enough to fire! I did buy 9 of them but I ended up with only 6 when it came to completing the course; I think one or two of them may have fallen victim to Doug.
But anyways! That seems to be that for now folks. After the amount of time I put into this little project, it’s kind of hard to believe it’s over. But fear not, the show’s not over yet! I still have my Rube Goldberg machine to show you. There’s sure to be some more regular updates over on my Twitter if you can’t bear to wait until the final reveal.
Ciao, aoife xo