
Friday, June 3, 2016

Ultrasonic Sensor: More Sophisticated Scanning and Decision Making - The AuRGent Project

I will start by saying it's been too long; the delay is due to research projects at the university. However, I have made some decent improvements to my AuRGent robot project. Previously you only saw basic navigation with hard-coded movement. Recently I took the liberty of purchasing an ultrasonic sensor servo mount and hooked it up to my Arduino board, allowing the robot to make decisions based on its percepts from the sensor. Keep on reading, more details below.

The Sensor, Mount and Servo

From the figure below, you can see exactly how I mounted the sensor to the mount and thus to the servo. This allows me to program the ultrasonic range sensor to scan 165°, taking readings at 0° (the right distance), at 90° (the center distance) and at 165° (the left distance). Why 165, you ask? Well, I left some room to accommodate the error in the servo (keep in mind you have to experiment to find your own value).
What happens here is: as the robot navigates, the sensor sits at 90°. Whenever an obstacle is detected within 40 cm, the bot stops, scans around taking a reading on both the left and the right, and then decides to take the path (left or right) with the greater distance to the next obstacle. Okay, okay, I'm just going to give you the code.
    cm = get_distance();                //distance straight ahead (servo at 90 degrees)
    motor_speed = get_motor_speed(cm);  //map the distance to a suitable speed
    type = bot_move(motor_speed, cm);   //returns 1 when an obstacle is close ahead
    if( type == 1){
        servo2.write(0);
        delay(500);
        right = get_distance(); //scan to the right
        delay(500);
        servo2.write(165);
        delay(600);
        left = get_distance(); //scan to the left
        delay(500);
        servo2.write(90); //return to center
        delay(100);
        compareDistance(); //decide whether to turn left or right
    }
What you see here is the loop: we get the distance from the sensor with the get_distance() function.
    int get_distance(){
        // establish variables for the duration of the ping
        // and the distance result in centimeters:
        long duration, cm;
        digitalWrite(trig, LOW);
        delayMicroseconds(5);
        digitalWrite(trig, HIGH);
        delayMicroseconds(15);
        digitalWrite(trig, LOW);
        duration = pulseIn(echo, HIGH);
        cm = microseconds_to_centimeters(duration); //convert the echo time to cm
        return cm;
    }
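For completeness: the snippets above use servo2, trig and echo, plus a microseconds_to_centimeters() helper, none of which appear in this excerpt. Below is a minimal sketch of how they could be declared; the pin numbers are placeholders of my own (check the repo for the actual wiring), and the conversion uses the usual speed-of-sound figure from the standard Arduino ping examples.
    #include <Servo.h>

    // Placeholder pin assignments -- adjust to match the actual wiring.
    const int trig = 7;        // ultrasonic trigger pin (assumed)
    const int echo = 8;        // ultrasonic echo pin (assumed)
    const int servo_pin = 9;   // servo signal pin (assumed)

    Servo servo2;              // servo carrying the ultrasonic sensor

    void setup(){
        pinMode(trig, OUTPUT);
        pinMode(echo, INPUT);
        servo2.attach(servo_pin);
        servo2.write(90);      // start facing forward
    }

    long microseconds_to_centimeters(long microseconds){
        // Sound travels at roughly 29 microseconds per centimeter;
        // divide by 2 because the ping travels out and back.
        return microseconds / 29 / 2;
    }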
Based on those left and right readings, the compareDistance() function makes the decision on which path to take next.
    void compareDistance(){
        if (left > right) //if left is less obstructed
        {
            turn_left();
            delay(500);
        }
        else if (right > left) //if right is less obstructed
        {
            turn_right();
            delay(500);
        }
        else //if they are equally obstructed
        {
            turn_right();
            delay(1000);
        }
    }
That easy, right?

Well, you have seen most of the rest of the code in my previous posts; however, I have restructured some of the movements, for instance wrapping the left, right, back and forward moves in specialised function calls. I'm just going to give it to you and let you figure out the rest, but for argument's sake, here are some of the specialised functions you will find in the code:

  • get_motor_speed() takes a distance as its argument and uses map() to find a suitable speed for the bot at that distance, setting the global motor_speed variable.
  • set_speed() takes the motor speed as its argument and applies it to each of the motors (front left and right, back left and right); a rough sketch of both functions follows this list.
  • bot_move(), on the other hand, takes both the distance and the speed produced by the previous functions, decides how the bot should move given these variables, and also sets the speed of the motors.
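To make that list a bit more concrete, here is a rough sketch of what get_motor_speed() and set_speed() could look like. The distance and PWM ranges and the motor pin names are placeholders of my own, not the actual values from the repo.
    // Placeholder enable (PWM) pins for the four motors -- adjust to your driver.
    const int front_left_en  = 3;
    const int front_right_en = 5;
    const int back_left_en   = 6;
    const int back_right_en  = 10;

    int motor_speed;  // global speed shared with the rest of the sketch

    int get_motor_speed(int distance){
        // Placeholder ranges: map 10-200 cm of clearance onto a PWM range of 80-255,
        // so the bot slows down as it approaches an obstacle.
        int speed = map(distance, 10, 200, 80, 255);
        motor_speed = constrain(speed, 80, 255);
        return motor_speed;
    }

    void set_speed(int speed){
        // Apply the same PWM value to all four motors
        // (front left and right, back left and right).
        analogWrite(front_left_en, speed);
        analogWrite(front_right_en, speed);
        analogWrite(back_left_en, speed);
        analogWrite(back_right_en, speed);
    }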
Now you ask: don't bot_move() and compareDistance() do the same thing? Not quite. The first decides whether to keep going forward, go backwards, go left or go right, even in the initial state where there is no reading yet. The latter applies only when we encounter an obstacle while moving forward, and then it decides whether to turn left or right.
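And here is one possible shape for bot_move(). The real logic in the repo is richer (it also handles reversing and the initial state, as described above); the key point for this post is that it returns 1 when something is closer than the 40 cm threshold, so the scan in the loop kicks in. The helpers move_forward() and stop_bot() stand in for the actual movement functions.
    // Placeholder movement helpers -- stand-ins for the actual motor functions in the repo.
    void move_forward(){ /* drive all four motors forward */ }
    void stop_bot(){ /* halt all four motors */ }

    const int OBSTACLE_THRESHOLD_CM = 40;  // stop-and-scan distance mentioned above

    int bot_move(int speed, int distance){
        if (distance > 0 && distance <= OBSTACLE_THRESHOLD_CM){
            stop_bot();        // something is close ahead: halt
            return 1;          // tells the loop to scan left and right
        }
        set_speed(speed);      // speed chosen by get_motor_speed()
        move_forward();        // path is clear: keep driving forward
        return 0;              // no scan needed
    }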

Current AuRGent Structure

I think I can say we have covered the most important aspects of this issue. If you need the full code, kindly head over to my GitHub repo, and the video for this feature will be available shortly on my YouTube channel.

Coming soon: Upgrades

  • FSM (Finite State Machine): Honestly, I hate my current implementation with its bunch of if-then-elses; in my defense, it was a preliminary hack. In the next issue on this project I will introduce an FSM and a refactored version of my code, which will also be in the repo (see the sketch after this list for the rough direction).
  • Reinforcement Learning: I am still designing my policy matrix to enable the bot to learn from experience and come up with its own optimal policy, instead of the approximate policy in the current implementation.
  • Task and path finding: Using a GPS and compass module, I will also implement navigation from a given state to another with a solution that is as close to optimal as I can manage.
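For the curious, the FSM refactor I have in mind would be roughly along these lines; the state names and transitions are only a sketch of the direction, not the final design.
    // Possible states for the bot -- names are illustrative only.
    enum BotState { DRIVING, SCANNING, TURNING_LEFT, TURNING_RIGHT };
    BotState state = DRIVING;

    void loop(){
        switch (state){
            case DRIVING:
                cm = get_distance();
                if (bot_move(get_motor_speed(cm), cm) == 1) state = SCANNING;
                break;
            case SCANNING:
                servo2.write(0);   delay(500); right = get_distance();
                servo2.write(165); delay(600); left  = get_distance();
                servo2.write(90);  delay(100);
                state = (left > right) ? TURNING_LEFT : TURNING_RIGHT;
                break;
            case TURNING_LEFT:
                turn_left();
                state = DRIVING;
                break;
            case TURNING_RIGHT:
                turn_right();
                state = DRIVING;
                break;
        }
    }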
Thanks, and if you love my posts and videos, don't forget to share and to comment on improvements I can make to the posts, videos or projects. Learning never ends.