Projects/Nav Bot I

What is it?

A small circular two-wheel skid-steer bot, currently pretty stupid and lacking in sensors, but the intention is to use it as a platform for experimenting with sensors, navigation, and all sorts of bot behaviour. Now with added lasers; all robots are better with lasers!

What's it made from?

Body

The main construction material is (believe it or not) chopping boards - they're made from polythene or polypropylene, which is light, strong, easy to work with, and you can pick it up incredibly cheaply at pound shops. The discs are cut with a large hole saw, which leaves a 10mm hole in the centre that is really handy for routing cables through. The spacers between the base and the first deck are made from polystyrene tube bought from a model shop, chosen for a perfect fit over the long bolts (M6x100) that hold the whole thing together.

Drive system

The drive system consists of two continuous-rotation servos and wheels with solid rubber tyres designed to fit the servo output gear (from techsupplies.co.uk).
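
Driving continuous-rotation servos from an Arduino is straightforward with the standard Servo library. A minimal skid-steer sketch might look something like this (the pin numbers and pulse widths are illustrative only, and most servos need their stop point trimming):

 #include <Servo.h>
 
 Servo leftWheel;            // continuous-rotation servo, left side
 Servo rightWheel;           // continuous-rotation servo, right side
 
 const int STOP_US = 1500;   // pulse width that (roughly) stops the servo
 const int SPEED_US = 200;   // offset from stop for full speed
 
 void setup() {
   leftWheel.attach(9);      // example pins - adjust to suit the wiring
   rightWheel.attach(10);
 }
 
 // Positive = forwards, negative = backwards, range -1.0 to 1.0 per wheel.
 // The right servo is mounted mirrored, so its pulse is offset the other way.
 void drive(float left, float right) {
   leftWheel.writeMicroseconds(STOP_US + (int)(left * SPEED_US));
   rightWheel.writeMicroseconds(STOP_US - (int)(right * SPEED_US));
 }
 
 void loop() {
   drive(1.0, 1.0);          // forwards
   delay(2000);
   drive(1.0, -1.0);         // spin on the spot (skid steer)
   delay(1000);
 }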

Controller

The onboard controller is an Arduino.

Sensors

  • Bumper

It's currently fitted with two plastic bumpers activating microswitches, which do at least prevent it from stalling against objects in its path (they don't prevent it from falling off the desk, though).
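
Reading the bumpers is about as simple as Arduino input gets. A rough sketch (pins chosen purely for illustration, reusing the hypothetical drive() helper from the drive-system sketch above) backs off and turns away when either switch closes:

 const int LEFT_BUMP = 2;          // microswitch pins - illustrative only
 const int RIGHT_BUMP = 3;
 
 void setupBumpers() {
   // Switches wired to ground, so use the internal pull-ups:
   // a pin reads LOW when its bumper is pressed.
   pinMode(LEFT_BUMP, INPUT_PULLUP);
   pinMode(RIGHT_BUMP, INPUT_PULLUP);
 }
 
 void checkBumpers() {
   bool hitLeft = (digitalRead(LEFT_BUMP) == LOW);
   bool hitRight = (digitalRead(RIGHT_BUMP) == LOW);
   if (!hitLeft && !hitRight) return;
 
   drive(-1.0, -1.0);              // back off...
   delay(500);
   drive(hitLeft ? 1.0 : -1.0,     // ...then turn away from the obstacle
         hitLeft ? -1.0 : 1.0);
   delay(400);
   drive(1.0, 1.0);                // and carry on forwards
 }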

  • Line Follower

An array of seven Vishay CNY70 IR LED/phototransistor sensors allows Bot I to follow a line of black tape stuck to a desk (time to build a bot arena so I don't need to take over all the desks). Balance is quite critical: if the bot tips backwards the sensors are lifted too high and register reflected light from outside their field of view (it's currently running with a roll of insulation on top acting as a counterweight). The sensor array seems fairly precise, but line following can be quite jerky; some more time mapping sensor inputs to driving directions should smooth that out.
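
One common way to map the sensor readings onto a driving direction is a weighted average of which sensors currently see the tape. The sketch below is only illustrative - it assumes the seven CNY70 outputs arrive as digital inputs (reading HIGH over the black tape), which may not match the real wiring, and it reuses the hypothetical drive() helper from the drive-system sketch:

 const int NUM_SENSORS = 7;
 const int sensorPins[NUM_SENSORS] = {4, 5, 6, 7, 8, 11, 12};  // illustrative
 
 void setupLineSensor() {
   for (int i = 0; i < NUM_SENSORS; i++) pinMode(sensorPins[i], INPUT);
 }
 
 void followLine() {
   long sum = 0;
   int count = 0;
   for (int i = 0; i < NUM_SENSORS; i++) {
     if (digitalRead(sensorPins[i]) == HIGH) {  // this sensor sees the tape
       sum += (i - 3) * 100;   // position: -300 (far left) to +300 (far right)
       count++;
     }
   }
   if (count == 0) return;     // lost the line - leave the speeds alone
 
   float error = (float)sum / count / 300.0;       // normalised to -1.0 .. 1.0
   float gain = 0.6;                               // steering gain, needs tuning
   drive(0.5 + gain * error, 0.5 - gain * error);  // steer back onto the line
 }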

  • IR Distance Sensor

Mounted on the front of the robot above the bumpers is a Sharp IR distance sensor. It's fitted on a servo, allowing it to be aimed 45° to each side of the robot's track. The range for this sensor is 10-80cm (though other models support different ranges). It should allow longer-range obstruction detection and a very basic mapping ability.
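
Reading one of these Sharp sensors is just an analogRead() plus a curve fit. The sketch below is an illustration rather than the actual code: the pins are made up and the conversion constants are a commonly quoted approximation for the 10-80cm analogue sensors, not calibrated values.

 #include <Servo.h>
 
 Servo irServo;                // the servo the Sharp sensor is mounted on
 const int IR_PIN = A0;        // analogue input from the sensor - illustrative
 
 void setupIr() {
   irServo.attach(6);          // illustrative pin
 }
 
 // Aim the sensor: 90 = straight ahead, 45 and 135 = 45 degrees either side.
 void aimSensor(int angle) {
   irServo.write(angle);
   delay(300);                 // give the servo time to get there
 }
 
 // Convert the analogue reading to centimetres. The constants are a common
 // curve fit for this family of sensors and only hold over roughly 10-80cm.
 float readRangeCm() {
   float volts = analogRead(IR_PIN) * 5.0 / 1023.0;
   return 27.86 * pow(volts, -1.15);
 }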

What can it do?

It can drive around (at random or following lines on the ground), sense when it has bumped into things, and use its IR sensor to detect more distant objects and map its environment.

It's a long way from world domination; I don't think we need another bot to protect us from it just yet - the worst it could do is bump into your ankles at a particularly ineffective speed.

Project Diary

9th May
It's happily following lines on a desk, and with a discontinuous line it will turn round at the end to re-acquire the line and follow it back the way it came - a sort of robot sentry. The next step is to re-fit the bumper switches above the line sensor so that objects blocking the line can be detected.

{{#widget:Vimeo|id=11620354}} {{#widget:Vimeo|id=11619056}}


16th May
In order to avoid running out of Arduino IO pins I've got some MCP23008 I2C IO expanders. These connect to the Arduino on the I2C serial bus (found on analog pins 4 and 5) and each provide 8 general-purpose IO pins. The first one I've connected up is currently configured with 4 input pins and 4 output pins, and the microswitch bumpers I'd already built have been refitted to the front of the bot above the line sensor, allowing it to detect collisions with objects obstructing its line. Currently, on detecting a collision, the robot turns around, relocates the line, and continues back the way it came.
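
For reference, talking to an MCP23008 only takes a handful of register writes over the Wire library. This is a minimal sketch assuming the default address 0x20 (all three address pins tied low) and the same 4-in/4-out split described above; the register addresses are from the MCP23008 datasheet:

 #include <Wire.h>
 
 const byte MCP_ADDR  = 0x20;  // base address with A0-A2 tied low
 const byte REG_IODIR = 0x00;  // direction register: 1 = input, 0 = output
 const byte REG_GPPU  = 0x06;  // pull-ups for the input pins
 const byte REG_GPIO  = 0x09;  // read inputs / write outputs
 
 void mcpWrite(byte reg, byte value) {
   Wire.beginTransmission(MCP_ADDR);
   Wire.write(reg);
   Wire.write(value);
   Wire.endTransmission();
 }
 
 byte mcpRead(byte reg) {
   Wire.beginTransmission(MCP_ADDR);
   Wire.write(reg);
   Wire.endTransmission();
   Wire.requestFrom(MCP_ADDR, (byte)1);
   return Wire.read();
 }
 
 void setup() {
   Wire.begin();                 // I2C lives on analog pins 4 (SDA) and 5 (SCL)
   mcpWrite(REG_IODIR, 0x0F);    // GP0-GP3 inputs (bumpers), GP4-GP7 outputs
   mcpWrite(REG_GPPU, 0x0F);     // enable pull-ups on the inputs
 }
 
 void loop() {
   byte bumpers = mcpRead(REG_GPIO) & 0x0F;  // low nibble = bumper switches
   mcpWrite(REG_GPIO, bumpers << 4);         // echo them on the output pins
 }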

{{#widget:Vimeo|id=11788794}}


22 May
The IR distance sensor is fitted, a function to scan an arc in front of the robot and return the ranges to obstructions is working, and the robot can hold station at a fixed distance from an obstruction, correcting its position if the obstruction is moved. Didn't get any further than that as we were quite busy.
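
The hold-station behaviour amounts to a simple proportional controller on the range reading. A rough sketch, reusing the hypothetical aimSensor()/readRangeCm() and drive() helpers from the earlier sketches, with a made-up target distance and gain:

 const float TARGET_CM = 30.0;   // distance to hold off the obstruction
 const float GAIN = 0.05;        // wheel speed per cm of error, needs tuning
 
 void holdStation() {
   aimSensor(90);                             // look straight ahead
   float error = readRangeCm() - TARGET_CM;   // +ve: too far, -ve: too close
   float speed = constrain(error * GAIN, -1.0, 1.0);
   drive(speed, speed);                       // creep forwards or backwards
 }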


29 May
More work on the IR sensor. The scanner function now accepts step angle, dwell time, and multi-sampling settings to give far more control over the speed and accuracy of the scan. The data is read into an array which can be dumped over serial as a graph for debugging purposes, and then post-processed to populate the map of the environment based on the robot's location and orientation. Here's the scanner in motion:

{{#widget:Vimeo|id=12135510}}
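
A scan routine along those lines might be structured like this - a sketch rather than the actual code, reusing the hypothetical irServo and readRangeCm() from the IR sensor section and assuming Serial.begin() has been called in setup():

 const int MAX_READINGS = 37;   // enough for a 180 degree arc at 5 degree steps
 float ranges[MAX_READINGS];
 
 // Sweep the sensor servo from startDeg to endDeg in stepDeg increments,
 // waiting dwellMs at each position and averaging 'samples' readings there.
 int scanArc(int startDeg, int endDeg, int stepDeg, int dwellMs, int samples) {
   int n = 0;
   for (int a = startDeg; a <= endDeg && n < MAX_READINGS; a += stepDeg) {
     irServo.write(a);
     delay(dwellMs);                        // dwell: let the servo settle
     float total = 0;
     for (int s = 0; s < samples; s++) {    // multi-sampling to average out noise
       total += readRangeCm();
       delay(10);
     }
     ranges[n++] = total / samples;
   }
   return n;                                // number of readings taken
 }
 
 // Dump the last scan over serial so it can be graphed for debugging.
 void dumpScan(int n) {
   for (int i = 0; i < n; i++) {
     Serial.print(ranges[i]);
     Serial.print(i < n - 1 ? "," : "\n");
   }
 }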

It tries to turn the environment in the image on the left to the map shown on the right:


17th July
[[File:Bot-1-rebuilt.jpg|thumb|Bot-1 post rebuild]]
A rebuild to replace a lot of the temporary attachments for sensors with something more permanent. The IR sensors are now securely mounted on brackets made from L-section plastic extrusion, and the line sensor and bumpers have been combined into a single unit which is firmly attached to the two structural bolts at the front of the robot. There is now a single bumper, although it still activates two independent microswitches. Still lots to wire up again, though.


21 Sept
[[File:Bot-1-RC.jpg|thumb|Bot-1 radio control system]]
We recently had a number of Arduino boards and a pair of XBee shields donated, which got me thinking about how useful a wireless robot would be. After digging through our junk pile I also managed to find a USB joypad; the end result was possibly the most needlessly over-complicated radio control system ever devised. Pretty useless, but great fun.


16 Nov
I've not spent a great deal of time working on this recently (I was working on other projects), but everything is now wired up again, I2C addressing clashes have been eliminated, and code has been written. So I now have a multi-mode bot: three buttons (select/enter/reset) let you pick a particular task (indicated on the status LEDs), which is executed when you press enter. Now that the encoders on the wheels are reconnected it's possible to measure the distance travelled (provided you know the wheel diameter and number of pulses per revolution) and the angle turned (for which you also need the track), so I now have something that approximates a BigTrak (the odometry sums are sketched at the end of this entry):

{{#widget:Vimeo|id=16962647}}

Or alternatively you could attach a pen (though not in the best place) in order to make a turtle:

{{#widget:Vimeo|id=16962786}}

So now I'm left wondering if I can program a Logo interpreter...
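
For anyone wanting to reproduce the odometry, the sums are straightforward. The constants below are made up - measure your own wheel diameter, pulses per revolution, and track (the distance between the wheels):

 const float WHEEL_DIA_MM   = 60.0;   // illustrative values only
 const float PULSES_PER_REV = 32.0;
 const float TRACK_MM       = 110.0;
 
 // Distance one wheel has rolled, from its encoder pulse count.
 float wheelDistanceMm(long pulses) {
   return pulses / PULSES_PER_REV * PI * WHEEL_DIA_MM;
 }
 
 // Angle turned (in degrees) when the wheels run in opposite directions:
 // each wheel traces a circle of radius track/2, so the rotation is the
 // wheel's arc length divided by that radius.
 float angleTurnedDeg(long pulses) {
   return wheelDistanceMm(pulses) / (TRACK_MM / 2.0) * 180.0 / PI;
 }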

20 Nov
Due to an attempt to make use of lots of sensors at once, Leeds Hackspace has witnessed the birth of a new sport: Robobowling. The idea is to locate a triangle of six pins, aim the bot, and knock as many as possible off the desk. If the bot falls off the desk you score zero (and probably have a lengthy rebuild to do too).

{{#widget:Vimeo|id=17030593}} {{#widget:Vimeo|id=17030670}}

The main reason behind all this was testing the maths in the turn routines, which can now manage turns of various radii. This has already resulted in much smoother line following, and a few more calculations should make precision turns possible at any available radius. If impact detection is enabled, then when the bumpers are triggered the bot will stop and return the time and/or distance left to run, allowing a position calculation even when a manoeuvre failed to complete.
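
The maths behind the variable-radius turns is just differential-drive geometry: for a turn of radius R about a point level with the axle, the inner and outer wheels trace circles of radius R - track/2 and R + track/2, so their speeds need to be in that ratio. A sketch, reusing the hypothetical drive() helper and TRACK_MM constant from the earlier sketches (and glossing over the fact that speed control on hobby servos is only approximate):

 // Drive round an arc of the given radius (measured to the robot's centre).
 // A radius of 0 spins on the spot; a very large radius is nearly straight.
 void turnWithRadius(float radiusMm, float speed, bool clockwise) {
   float outer = radiusMm + TRACK_MM / 2.0;
   float inner = radiusMm - TRACK_MM / 2.0;
   float ratio = inner / outer;         // inner wheel runs slower (or reverses)
 
   if (clockwise) {
     drive(speed, speed * ratio);       // right wheel is the inner one
   } else {
     drive(speed * ratio, speed);
   }
 }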