Nearly there…

With only two weeks to go, the final wiring is finished. We have all the sensors we think we need mounted on the robot. It is all coding from here… It is a shame that the next posts are not going to be judged, but I have pretty much covered every important part of building and developing my very first Pi Wars robot.

We’re going to try to visit a “test” location like Cambridge Makespace to test our robot before competition day and fine-tune the code. There is not much left to do on the hardware other than adding the firing mechanism and maybe another sensor if needed, but mostly it will just be aesthetics 😀

This is the last judged blog post.


Coding strategy

I was able to build what I think is a reliable bot. With my basic coding knowledge, I was able to use every single sensor mounted on the robot, acquire data, and drive it from the joystick.

The ace up the sleeve is going to be my amazing partner, who will be fine-tuning the code to cope with all of the obstacles/challenges. I was hoping he would write a blog post on his strategy, but work commitments have taken him away from me for the last two weeks, so you’re not going to see his strategy until D-Day! Hopefully he will post something soon, even though it is not going to be judged. 😦

Sorry!

The Challenges

This blog post is about our ideas for solving each of the challenges. We still have a lot to code, but we will first concentrate on the autonomous challenges and then the remote-controlled ones. Here are the general ideas…

The Hubble telescope nebula challenge: For this challenge we’ll be using the camera to identify the different colours. The general idea is that the robot will make one complete turn on the spot to identify where each colour is; it will then be able to drive towards each colour in the required order, returning each time to its initial location.
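
Just to make the “spotting a colour” step concrete, here is a rough OpenCV sketch (not our final code, and every HSV range in it is a placeholder guess that would need calibrating under the arena lights):

```python
# Sketch: mask one HSV colour range and return the biggest blob's
# horizontal offset, so the robot knows which way to keep turning.
# All HSV ranges are hypothetical and need tuning on real frames.
import cv2
import numpy as np

COLOUR_RANGES = {  # (lower HSV, upper HSV) - placeholder values
    "red":    ((0, 120, 70),   (10, 255, 255)),
    "blue":   ((100, 150, 50), (130, 255, 255)),
    "green":  ((40, 70, 50),   (80, 255, 255)),
    "yellow": ((20, 100, 100), (35, 255, 255)),
}

def find_colour_offset(frame_bgr, colour):
    """Return the target's offset from frame centre in [-1, 1], or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower, upper = COLOUR_RANGES[colour]
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 API
    if not contours:
        return None                      # colour not in view: keep turning
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    cx = m["m10"] / m["m00"]             # blob centre, x coordinate
    half = frame_bgr.shape[1] / 2
    return (cx - half) / half
```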

The Canyons of Mars: For this challenge the idea is to pre-build the maze so that the robot can localise itself in it. Then the idea is to use the ToF sensors to map and match the “environment” around the robot while it is moving. We also intend to use the camera and the 6DoF sensor to localise the little aliens and make accurate 90-degree turns when needed.
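
The “stay in the middle of the corridor” part of that plan boils down to something like the sketch below; read_tof() and set_motors() are hypothetical stand-ins for the real multiplexed ToF reads and the ThunderBorg motor calls:

```python
# Minimal "stay centred between the walls" sketch for the maze.
# Steers towards whichever side the ToF sensors say has more room.
def drive_step(read_tof, set_motors, base=0.4, gain=0.001, limit=0.3):
    left = read_tof("left")     # mm to the left wall
    right = read_tof("right")   # mm to the right wall
    # Positive turn steers left, towards the side with more room
    turn = max(-limit, min(limit, gain * (left - right)))
    set_motors(base - turn, base + turn)  # (left power, right power)
```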

Blast Off: the Straight-ish Line Speed Test: For this, our first idea was to go for the simplest solution and use infrared sensors to detect the white line and follow it.
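
In its simplest form that is just two digital IR sensors straddling the line; the BCM pin numbers and the set_motors() helper below are assumptions for illustration:

```python
# Two-sensor line follow sketch: steer towards whichever sensor sees
# the white line. Pins are hypothetical; set_motors() is a stand-in.
from gpiozero import LineSensor

left_ir = LineSensor(17)    # hypothetical wiring
right_ir = LineSensor(27)

def follow_step(set_motors, speed=0.5, turn=0.25):
    on_left = left_ir.value > 0.5     # line under the left sensor
    on_right = right_ir.value > 0.5
    if on_left and not on_right:
        set_motors(speed - turn, speed + turn)   # drift left to re-centre
    elif on_right and not on_left:
        set_motors(speed + turn, speed - turn)   # drift right to re-centre
    else:
        set_motors(speed, speed)                 # centred (or lost): straight
```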

I found out a bit later, in one of the Robot Club sessions at Makespace, that there is no obligation to “follow” the line, only to complete the course. There is for sure not much of a challenge in using the infrared sensors, but we are aiming just to finish the course. If we have time, we’ll also try to use the camera to follow the line and the 6DoF sensor to keep straight-ish, to increase the speed and finish the course a bit faster.
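
The “keep straight-ish with the 6DoF” idea is roughly this: integrate the gyro’s yaw rate into a heading estimate and feed the error back into the motor powers. A hedged sketch, where read_yaw_rate() and set_motors() are hypothetical helpers for whichever IMU driver we end up with:

```python
# Gyro heading-hold sketch: integrate yaw rate (deg/s, positive taken
# here as turning left) and steer back towards the starting heading.
import time

def drive_straight(read_yaw_rate, set_motors, speed=0.8, kp=0.02, secs=5.0):
    heading = 0.0                     # degrees relative to the start
    start = last = time.monotonic()
    while time.monotonic() - start < secs:
        now = time.monotonic()
        heading += read_yaw_rate() * (now - last)   # deg/s integrated to deg
        last = now
        correction = kp * heading     # drifted left -> steer back right
        set_motors(speed + correction, speed - correction)
        time.sleep(0.01)
    set_motors(0.0, 0.0)
```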

Pi Noon: This will rely entirely on the skills of the driver (which is not me). The only extra thing may be to add a button on our joystick to reduce the torque and speed of the motors, to make the robot easier to drive. Otherwise, the robot has been successfully equipped with the “famous” connector to mount the balloons.
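
That slow-mode button could be as simple as a toggled multiplier on every motor command; here is a sketch with pygame’s joystick module (the button and axis numbers are assumptions that depend on the controller’s mapping):

```python
# "Slow mode" sketch: one gamepad button toggles a multiplier applied
# to the throttle before it reaches the motors. Indices are guesses.
import time
import pygame

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)
pad.init()

SLOW_BUTTON = 4       # hypothetical shoulder button index
scale = 1.0

while True:
    for event in pygame.event.get():
        if event.type == pygame.JOYBUTTONDOWN and event.button == SLOW_BUTTON:
            scale = 0.4 if scale == 1.0 else 1.0    # toggle slow mode
    throttle = -pad.get_axis(1) * scale             # stick up = forwards
    # ...pass `throttle` on to the motor controller here...
    time.sleep(0.02)
```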

The same goes for the Spirit of Curiosity. The Space Invaders challenge has been explained in a previous post.

The Apollo obstacle course: Now, I have no idea how to approach this challenge. Obviously it will also rely on the skills of the driver… but at least we are sure that the robot will not get stuck, as there is nothing underneath the chassis other than the motors. Fingers crossed!

Space Invaders

It has taken us a while to work out how to cope with the Space Invaders challenge. The Pi Wars website recommends Nerf darts and a firing mechanism, but we wanted something original and different. My idea at the beginning was to use a device I had as a kid: the Mexican resortera (slingshot). It is a really easy device to use, as you only need a Y-shaped stick and a rubber band like the ones below. You can find these in any Mexican market, or on Amazon in the UK.

Mexican Art

Unfortunately, this idea would have taken us a considerable amount of time to build, with a motorised mechanism pulling back the rubber band. Much the same goes for a Nerf pistol, and we are running out of time. My partner is a fan of drones and found out that the Parrot Mambo has a mini cannon that looks perfect for our robot.

Mambo Cannon

The idea now is to do some reverse engineering to be able to mount it on the robot and fire it from the Pi Hut game controller. We’ll need to make a couple of modifications on top of the robot, but I think overall it will be “easy” to use in the Space Invaders challenge. We have it now and will try to “hack” it; if we succeed, we’ll be able to take part in that challenge and hopefully win some points. Fingers crossed!!

Nenemeni V2.0

Always be a first-rate version of yourself and not a second-rate version of someone else.

Judy Garland

This is how Nenemeni looks after (more or less) mounting all the elements on it:

Nenemeni V1.0

I am kind of regretting using “clear” acrylic; if you “cover” the inside, the bot has a much cleaner look (*ordering purple acrylic asap*). In the pictures above, I hadn’t yet taken off the protective “plastic” on the top acrylic layer, and the ToF sensors are mounted on the bottom part of the chassis.

Nenemeni V2.0

For version 2, the sensors will be mounted on the upper part of the chassis, the Pi will be rotated, and the battery and ThunderBorg re-arranged. After another night with the laser cutter, the results are much better:

The L-shaped front piece to mount the IR sensors, the connector and the ToF sensors came out exactly as I expected. The only drawback: there is no space to mount a third IR sensor if needed. In the picture there is still space for the “hand-made” board for the ToF connections to the MUX. Nenemeni V2.0 is looking good.

So far there is nothing on the top layer, but I am hoping to add a HyperPixel to be able to easily switch between the programs for the different challenges. Problem: the HyperPixel uses literally all the GPIO pins on the Pi, so there would be no way to use the IR sensors (which, if we are using the camera, wouldn’t be catastrophic). Luckily, the HyperPixel has a breakout I2C connector, so the rest of the sensors are safe… something to think about… HyperPixel or not…
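
If the HyperPixel wins, a first cut of the launcher could be as simple as this (the script names are placeholders; a touch-friendly pygame menu would be the natural next step for the screen itself):

```python
# Tiny challenge-launcher sketch: pick a script, run it, come back.
# Script names are hypothetical stand-ins for the real challenge code.
import subprocess

CHALLENGES = {
    "1": ("Blast Off", "blast_off.py"),
    "2": ("Canyons of Mars", "canyons.py"),
    "3": ("Hubble Nebula", "nebula.py"),
}

while True:
    for key, (name, script) in sorted(CHALLENGES.items()):
        print(f"{key}: {name} ({script})")
    choice = input("Run which challenge? (q quits) ").strip()
    if choice == "q":
        break
    if choice in CHALLENGES:
        subprocess.run(["python3", CHALLENGES[choice][1]])
```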

Tidying up…

To truly cherish the things that are important to you, you must first discard those that have outlived their purpose.

Marie Kondo

Things are starting to get “messy” on my (3rd) cardboard chassis, and I don’t need Marie Kondo to tell me it is time for a change. My cardboard chassis (the third one) has well and truly outlived its purpose, so we need a “serious” chassis. The material: 3 mm acrylic (purple, if possible).

One (whole) weekend has been spent on the CAD (in SolidWorks) to make sure we’re within the Pi Wars rules. And finally, here it is! Nenemeni is out of the box!!!

Nenemeni V1.0

This is V1.0 of Nenemeni in SolidWorks. Next step… the laser cutter, and migrating all the components to the new chassis. Happily, I have a friend with access to a laser cutter and a bending machine who is willing to help. I decided to go for the “clear” acrylic (as I already have it).

Laser cutting and bending pieces

I wish I had one of those fabulous machines at home (or at work). After the laser-cutting day, I could not wait to test whether it had come out as I imagined…

Now you see me…

It was almost perfect, honestly… a few tweaks and it will be perfect… I am already fixing some details for Nenemeni 2.0. I do have to say that adapting the idea of the Tiny 4WD camera holder to Nenemeni’s needs was the right call. I am, in general, really happy with the result… Wait until you see V2.0 😀

Camera or IR sensor? Both!


I am thankful for all of those who said “no” to me. It’s because of them I’m doing it myself.

Albert Einstein

That is the question… Of course, the IR sensors do not represent much of a challenge, right? But they are reliable, and in the end the aim is to get from one end of the course to the other as fast as possible. The question is: how fast can it go with just IR sensors? Just in case, I’ll keep the option of mounting a couple of IR sensors on the robot. Maybe I’ll be able to use both camera and IR to go “faster”… something to think about…

Buggy sensors

Once again, for IR I took the sensor from Pimoroni used on the Bit Buggy. It comes with everything you need to “plug and play”; it just needs mounting… and having a CAD (cardboard-aided design) chassis makes it easy to test…

Result of the testing: even with the stand-offs the sensors came with, they are still too high to reliably “see” (or not see) the line. For the final design I’ll need to find a way of bringing them closer to the ground. While thinking about this… I remembered that we also need to mount the “famous” connector for the Pi Noon and Spirit of Curiosity challenges. I thought that something like the “camera holder” from the Tiny 4WD would easily do the trick. The idea is something like this:

Tiny 4WD camera holder

Thoughts: we’ll use the same “front holder” for the ToF sensors, the connector and the IR sensors… *keep thinking…*

Obstacle avoidance

If you don’t know where you are going, any road will take you there

George Harrison

Now, for obstacle avoidance there are (for me) two possibilities: sonar sensors (which I often use for my Robogals workshops) or lasers. Thanks to Pi Wars peers… I just discovered a nice and cute little ToF sensor from @Pimoroni that uses pew-pew lasers: the VL53L1X.

The VL53L1X ToF sensor

For me (used to building huge industrial robots) it is amazing to get such accuracy in distance measurement from this tiny sensor. I love it… and when connecting only one to the Pi, it is pretty straightforward using the libraries for it. The problems started when trying to use 3 (maybe 4, I haven’t decided yet) ToF sensors at the same time, as they all have the same I2C address. Furthermore, the Raspberry Pi only has one available I2C port, and although you can configure ordinary BCM pins to host I2C, I’d still be short, as the motors and the Pan-Tilt HAT also use I2C.
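
For reference, the single-sensor case really is just a handful of lines with Pimoroni’s vl53l1x Python library (written from memory, so double-check against their examples):

```python
# Single VL53L1X on the default bus and address, using Pimoroni's
# vl53l1x library (pip install vl53l1x). API details from memory.
import time
import VL53L1X

tof = VL53L1X.VL53L1X(i2c_bus=1, i2c_address=0x29)
tof.open()
tof.start_ranging(1)       # 1 = short range, 2 = medium, 3 = long

try:
    while True:
        print(f"Distance: {tof.get_distance()} mm")
        time.sleep(0.1)
finally:
    tof.stop_ranging()
```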

One problem at a time… I forgot to mention in the last post that the motors and the Pan-Tilt HAT (unfortunately) also share the same I2C address, 0x15. But since I am using the ThunderBorg controller for the motors, it was possible (and actually pretty easy) to change its I2C address, a feature intended for running several boards. So, problem solved for the first part (changed to 0x0a)… But with 3 ToF sensors that also share a single I2C address, I needed a multiplexer (MUX).

TCA9548A

I found that the most used one is the TCA9548A from Adafruit (also called an I2C expander), with which it is possible to have up to 8 devices sharing an address. The MUX has the 0x70 address and the ToF sensors 0x29. First test: just one sensor on the MUX. It literally took me three days (and nights) to figure out how to get it to see the ToF’s address. I could only see the I2C address of the MUX, not of the ToF (see image 1 below). Apparently, I wasn’t “opening” the “gate” properly for the MUX to expose the sensor. Once that is done, the rest is pretty straightforward. The only thing to take into consideration is that it is not possible to range the three sensors at the “same” time; you have to keep “switching” between them, as they keep the same address. So, second problem solved.
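
For anyone fighting the same battle: the “gate” is opened by writing a single byte to the MUX whose set bit selects the active channel. Roughly like this (smbus2 assumed, the channel wiring is hypothetical, and the per-sensor init sequence may need adjusting):

```python
# TCA9548A pattern sketch: write 1 << channel to 0x70 to "open the gate",
# then talk to whichever sensor sits behind it on the shared 0x29 address.
import time
from smbus2 import SMBus
import VL53L1X

MUX_ADDR = 0x70
CHANNELS = {"left": 0, "front": 1, "right": 2}   # assumed wiring

bus = SMBus(1)

def select_channel(channel):
    bus.write_byte(MUX_ADDR, 1 << channel)   # enable exactly one channel
    time.sleep(0.001)

tof = VL53L1X.VL53L1X(i2c_bus=1, i2c_address=0x29)

# Start ranging on each sensor while its channel is the open one
for channel in CHANNELS.values():
    select_channel(channel)
    tof.open()
    tof.start_ranging(1)

def read_mm(name):
    """Switch the gate to one sensor, then read it - one at a time."""
    select_channel(CHANNELS[name])
    return tof.get_distance()
```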

These sensors will be really useful for the Canyons of Mars challenge, helping to keep the robot within the boundaries (hopefully centred). A 6DoF sensor (which is also I2C) will be used to keep turns at the right angle. Furthermore, hopefully with the camera, the robot will be able to find the little green aliens and stay on track!

The eyes


It’s not what you look at that matters, it’s what you see.


Henry David Thoreau

While we wait for a stronger chassis, and now that the body is moving, the robot needs to be able to “see” where it is going. A standard Raspberry Pi camera will be Nenemeni’s eyes. There are three autonomous challenges and the camera will be very useful for solving them, so we thought a pan-tilt mount would be the best option, and the module from Pimoroni is just perfect.

Furthermore, it is straightforward to make it work with their library and example code, as above. Now, in order to start testing more complex programs using OpenCV, we installed it on our “CAD” chassis.
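
A sweep test along the lines of Pimoroni’s pantilthat examples looks something like this:

```python
# Smooth pan/tilt sweep with Pimoroni's pantilthat library
# (pip install pantilthat); servo angles run from -90 to 90 degrees.
import math
import time
import pantilthat

while True:
    angle = 45 * math.sin(time.time())   # gentle -45..45 degree sweep
    pantilthat.pan(int(angle))
    pantilthat.tilt(int(angle / 2))
    time.sleep(0.05)
```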

Every time I start thinking about “autonomous robots”, I can’t help thinking about the very first chapter I wrote for my PhD dissertation. Here are a couple of lines…

Autonomous navigation in unknown environments has been the focus of attention in the mobile robotics community for the last three decades. When neither the location of the robot nor a map of the region is known, localisation and mapping are two tasks that are highly inter-dependent and must be performed concurrently. This problem is known as Simultaneous Localisation and Mapping (SLAM).

In order to gather accurate information about the environment, mobile robots are equipped with a variety of sensors (e.g. laser, vision, sonar, odometry, GPS) that together form a perception system, allowing accurate localisation and the reconstruction of reliable and consistent representations of the environment. Vision sensors give mobile robots a relatively cheap means of obtaining rich 3D information about their environment, but lack the depth information that laser range finders can provide.

Status of the robot so far…

Knowing this, the next step is to install and test some ToF sensors… keep reading in the next post!

Baby steps…


To climb steep hills requires slow pace at first.

William Shakespeare

Now that everything is set up and powered up, we need a couple of libraries from ThunderBorg to get started. The “getting started” guide (link in the last post) explains very clearly how to install the software needed to get the motors moving, with a couple of examples.

The library comes with a nice GUI to test the motors, and it is also possible to check the power input being received.

ThunderBorg GUI
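
Stripped down, the motor examples amount to something like this (API names from memory of the ThunderBorg library, so treat them as approximate):

```python
# Spin both motors for a couple of seconds and read the battery voltage
# (the same reading the GUI shows). ThunderBorg API names from memory.
import time
import ThunderBorg

TB = ThunderBorg.ThunderBorg()
TB.Init()                       # find the board on the default I2C address

print(f"Battery: {TB.GetBatteryReading():.2f} V")

TB.SetMotor1(0.5)               # half power forwards
TB.SetMotor2(0.5)
time.sleep(2)
TB.MotorsOff()
```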

It is alive!!!! We were even able to link it to the Pi game pad, but our chassis is too fragile to keep on testing… we’ll need to find something to harden it a bit before we continue testing…

Nenemeni is alive!!!

Next blog post… Nenemeni’s eyes.