RoboCup 2024

Hi! I wanted to make a post to talk about my experience with RoboCup 2024, a robotics competition held in Singapore on 5-7 April. It’s a bit late to do so, but I figured better late than never, right?

The beginning

I was introduced to RoboCup by a friend of mine who used to do Robotics as a CCA. Around early January this year, he mentioned he was planning on attending this year’s competition and happened to have a free spot on his team. I was a little hesitant at first: apart from a brief experience with LEGO Mindstorms in primary school and working with Arduinos in my NUS Electrical Engineering introduction course, I did not have much experience in robotics. After mulling it over a little, I decided that it would be a good opportunity anyway!

Processing the problem

After we had gotten approval to represent our school, we began planning what to do. The task was quite simple on paper: the robot performs line tracking to navigate to a closed room with one entrance and one exit. It then collects the three colored balls, deposits them into their respective areas, exits the room, and continues line tracking. We immediately got to work coming up with solutions to the problem.

While we had many discussions, the crux of the problem boiled down to a dynamic approach versus a static approach. We could either have the robot map out the room, deposit locations, entrance, and exit before slowly sweeping the room to collect the balls, or have it enter and immediately scan for a ball, pick it up, and repeat the search. There were pros and cons to each method: the former was more reliable, whilst the latter would be faster and more efficient. Ultimately, with time running out, we decided on the former, for reasons that will be highlighted below.

Equipment

Of course, we did not just think about the problem from a purely theoretical standpoint. We had to consider the hardware we were working with, which… wasn’t quite the best. Unlike some very cool robots in the competition, we were broke high school students from a somewhat underfunded CCA. :( Another friend and I had originally wanted to use Arduinos and a Raspberry Pi to do CV, but we had to scrap that for an EV3 brick and last-gen sensors, as that was all the school had available. While we were dejected at first, we quickly bounced back, and the constraint did help us settle on which method to use faster. Considering the relative inaccuracy of the infrared sensors, we decided to go with the safer option.

Getting to Work

As the competition date drew nearer, we started spending more and more time on our robot. Meetings went from once a week to twice and sometimes even thrice. On the building side, we went through multiple iterations of our robot, thanks to the help of a genius builder on our team.

We experimented with different placements for the infrared sensors so they could serve both wall tracking and ball detection.

For ball collection, we used a mechanical claw system at the front, powered by motors and servos. Depositing the balls was also mechanical: the robot backed into a wall to turn a spoke, which rubber bands then snapped back into place.

Now here is where we ran into a major problem. We realized we needed to track both the color of the ball for sorting purposes and the lines on the floor for navigation. However, between the limited ports on an EV3 brick and our decision to use two infrared sensors, we found ourselves in a difficult situation. Should we remove one infrared sensor and redo our logic to accommodate this? Thankfully, a teammate came up with a brilliant solution: what if we made a rotating beam that could change the direction the light sensor faced?

And with that, our robot was done! Well, the hardware at least… We had finished most of the boilerplate code needed to run the robot, as well as the logic for the evacuation zone, so it was time to test it out, flesh out our code, and calibrate our values.

Testing the software

Now that we had finished the hardware aspect, we needed to write the actual software that would make our robot work. Our language of choice was MicroPython, a compact version of Python designed to run on the EV3 brick. The line tracking proved to be fairly simple: a bit of code to calibrate the colors and some (many) tweaks to motor values later, it was good to go. The evacuation zone was slightly trickier.
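
For the curious, here is a minimal sketch of the kind of proportional line follower we ran, assuming the Pybricks MicroPython library; the port assignments, wheel geometry, and constants are illustrative rather than our actual values.

    #!/usr/bin/env pybricks-micropython
    # Minimal proportional line follower (a sketch, not our exact code).
    # Ports, wheel geometry, and constants below are made up for illustration.
    from pybricks.ev3devices import Motor, ColorSensor
    from pybricks.parameters import Port
    from pybricks.robotics import DriveBase
    from pybricks.tools import wait

    left_motor = Motor(Port.B)
    right_motor = Motor(Port.C)
    line_sensor = ColorSensor(Port.S3)
    robot = DriveBase(left_motor, right_motor, wheel_diameter=56, axle_track=114)

    # Reflection values measured during calibration (hypothetical numbers).
    BLACK = 9
    WHITE = 85
    threshold = (BLACK + WHITE) / 2

    DRIVE_SPEED = 100   # mm/s
    GAIN = 1.2          # proportional gain, found by trial and error

    while True:
        # Steer in proportion to how far the reading is from the line edge.
        deviation = line_sensor.reflection() - threshold
        robot.drive(DRIVE_SPEED, GAIN * deviation)
        wait(10)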

Our algorithm was simple. Upon entering the room, the bot would do the following (a rough code skeleton follows the list):

  • Immediately turn left and head to the leftmost corner (relative to the entry point)
  • Use wall tracking to scan the dimensions of the room and identify the corners the deposit zones were located at, as well as the exit.
  • Upon reaching the leftmost corner, orient itself so that it faced the same direction as when it entered, then procedurally go forward and backward, “sweeping” the sides to detect a ball
  • When a ball is detected, turn towards it, record its color, collect it, and return to the position where it was when it detected the ball.
  • Deposit the balls in their respective corners based on the order recorded and deposit locations detected.
  • Exit the room and continue line tracking
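
As a rough illustration, the skeleton below captures that flow in plain Python. Every helper on the hypothetical bot object (scan_room, sweep_for_ball, and so on) is a made-up name for readability; the real code was tied directly to our EV3 motors and sensors.

    # Hardware-free skeleton of the evacuation-zone logic described above.
    # All bot.* helpers here are hypothetical placeholders, not our real API.

    def run_evacuation_zone(bot):
        # 1. Turn left at the entrance and wall-track around the room,
        #    recording the deposit corners and the gap that marks the exit.
        room = bot.scan_room()

        # 2. Face the same direction as on entry, then sweep the room in
        #    forward/backward strips until all three balls are collected.
        bot.align_with_entry_heading()
        collected = []  # ball colors, in pickup order
        while len(collected) < 3:
            ball = bot.sweep_for_ball()
            if ball is None:
                continue  # keep sweeping
            bot.turn_towards(ball)
            collected.append(bot.read_ball_color())
            bot.collect_ball()
            bot.return_to_sweep_position()

        # 3. Deposit each ball in its corner, then leave and resume line tracking.
        for color in collected:
            bot.deposit(color, room.deposit_corner(color))
        bot.exit_through(room.exit)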

We had originally tried to differentiate between a ball and a corner/wall using the difference in gradients between a curve, a straight line, and a diagonal line. However, we soon found this to be infeasible with our hardware: the sensors were not reliable enough to measure the minute differences in readings, especially once you account for the fact that the robot might not be moving in a perfectly straight line. We ended up relying on the fact that the deposit zones could only be in the corners, and that there would not be obstacles in the evacuation zone for Singapore (supposedly because of a lower expected standard). In retrospect, I did feel somewhat disappointed that we had to lean on something that wasn’t in the official rules to make our robot work. Yet at the same time, I learnt an important lesson about how hardware can limit software.
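
To make the abandoned gradient idea a little more concrete, here is a toy sketch (not our real code) that classifies a window of wall-distance readings by the sign and size of their differences. The readings and the noise band are invented numbers.

    # Toy illustration of the gradient idea we abandoned.
    # Assumes distance readings taken at regular intervals while driving
    # parallel to the wall; all numbers below are invented.

    def classify_feature(readings, noise=1.0):
        """Return 'ball', 'corner', or 'wall' from a window of distance readings."""
        diffs = [b - a for a, b in zip(readings, readings[1:])]
        if all(abs(d) <= noise for d in diffs):
            return "wall"    # gradient stays near zero: straight wall
        if any(d < -noise for d in diffs) and any(d > noise for d in diffs):
            return "ball"    # distance dips and recovers: something round ahead
        return "corner"      # distance changes steadily in one direction

    print(classify_feature([30, 30, 24, 19, 23, 29, 30]))  # -> 'ball'
    print(classify_feature([30, 27, 24, 21, 18, 15, 12]))  # -> 'corner'

    # In practice, our IR readings were so noisy that no choice of noise band
    # separated these cases reliably, which is why we dropped the idea.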

With a working robot ready, and with testing and tweaks continuing right up to the night of the competition, we were confident that we would be able to do well on the actual day.

The actual day

This was the day we had spent nearly four months waiting and preparing for. I was worried, of course, that something would go wrong. What if we ran out of batteries? What if the brick failed? We had plenty of backups and precautions, but there was still a gnawing feeling of worry in my mind. Yet, seeing the relatively confident behavior of my teammates despite their nerves, I loosened up and began to expect better. After all, we had tested our robot extensively, right?

The moment we were allowed to, we quickly flocked to the testing site to test our robot. We performed our calibration of the various colors, and placed the robot at the start of the track. We watched excitedly as it aligned itself along a straight path, navigated the weird curves and gaps in lines, and cheered when it cleared the first turn successfully.

Then disaster struck. It failed the next turn. And the next. We tried recalibrating the values and adjusting the thresholds, but to no avail.

What seemed like a simple problem ate up hours of work. We just couldn’t seem to get the robot to identify all the turns correctly; one problem would be resolved, only for another to appear. At the very least, we were able to identify two key causes.

  • One was the shade of green on the plates. The plates we had used in school were bright green, whilst the ones at the competition were a dark shade (so I was told, at least, given my color blindness). This caused the sensor’s readings for black and green to overlap in certain ranges. In addition, being too lenient with the thresholds would make it detect blue when positioned over a spot with both green and white. (A toy example of how this overlap breaks classification follows this list.)
  • Another was likely due to the material of the plates and the lighting conditions. We had originally thought our calibration would be able to handle different kinds of lighting, but it just didn’t seem to work for the actual venue.
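
As promised above, here is a toy version of threshold-based color classification, assuming single reflection readings and made-up calibration ranges; our real values were measured on-site and looked nothing like this clean.

    # Toy threshold-based color classifier; calibration numbers are invented.
    CALIBRATION = {
        "white": (70, 100),
        "green": (20, 35),
        "black": (5, 15),
    }

    def classify(reflection):
        """Return the first calibrated color whose range contains the reading."""
        for color, (low, high) in CALIBRATION.items():
            if low <= reflection <= high:
                return color
        return "unknown"

    # The failure mode: the competition's darker green read lower than our
    # bright school green, so the green and black ranges effectively overlapped
    # and the same reading could belong to either color.
    print(classify(13))  # 'black' with these ranges, yet possibly green on-site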

Ultimately, we had to give up on the evacuation zone. We ended up letting the robot do its best to line-track between the various checkpoints and manually moved it when it failed. By this point, most other teams had disassembled their apparatus for the evacuation zone, as they were facing similar problems.

Overall thoughts

Unsurprisingly, we didn’t exactly do very well. Personally, I was really disappointed, as the evacuation zone was the part I had put the most effort into, yet we never even got to run it on the actual course.

There were certainly things we could have done better. One of the competitors mentioned that he had sourced parts from the RoboCup organization to ensure his team had the proper conditions to test their robot. I do believe that if we had started earlier, we would also have been able to source better hardware, such as a better brain or sensors.

At the end of the day, I still felt it was a worthwhile experience. I got to learn something new and gained some hands-on experience with robotics. I truly had a lot of fun working on the project, although I can definitely say that engineering is not something I have much interest in pursuing for the foreseeable future.

Perhaps I’ll attend another robotics competition in the future? Who knows!