Proj-2013-2014-RobAIR-2/How to configure robAIR

Revision as of 14:23, 8 April 2014

Setting up the Arduino sketch

Now for a very important step in getting the robot to work properly. To retrieve information about the environment through the sensors, we need to run a sketch on the Arduino board. With it, the board will report back all the necessary information.

When setting up the robot, we used the file infrared_and_ultrasound_mega.ino, found at ~\arduino_sketches\infrared_and_ultrasound_mega. We will begin by configuring the sketch. As a first step, we set the global variables so that the number of sensors in the sketch corresponds to the number of sensors on the robot. For this, we use the variables INFRA_NB and SONAR_NUM.

Then, we define an integer array identifying the pins to which the infrared sensors are connected: const int INFRA_PINS[INFRA_NB] = { A0, A1, A2, A3, A4, A5, A6, A7 }; In bold, we can see the pins to which our sensors are connected; this part must be adapted to the configuration of the robot. For the ultrasonic sensors, we define an array of NewPing objects, each corresponding to one ultrasonic sensor:

NewPing sonar[SONAR_NUM] = {
  NewPing(27, 26, max_distance),  // front left
  NewPing(2, 3, max_distance),    // front center
  NewPing(4, 5, max_distance),    // front right
  NewPing(6, 7, max_distance),    // rear right
  NewPing(8, 9, max_distance),    // rear center
  NewPing(10, 11, max_distance)   // rear left
};

The values in bold correspond to the numbers of the pins to which the TRIG and ECHO lines of our sensors are connected. The functions that follow implement the basic data-acquisition algorithms; they only need to be changed if a particular behavior is desired.

Description and configuration files

We will now look at the most important files for handling and controlling the robot, seeing what they do and how to configure them so that the robot works properly:

Bags Folder

In this folder we find all files with the .bag extension. "Bags" are a format used to store and replay the messages exchanged. This system is used, for example, to collect data measured by the sensors and then replay it as many times as desired, in order to run simulations with real data. It is also very useful for debugging a system afterwards. So every time we record a map with the lidar, we save the messages in this bag file format. More information about bags on the following page: http://wiki.ros.org/rosbag

Launch folder

To launch the different scenarios of our robot, we need to use .launch files. A "launch" file is used to run multiple nodes/applications and the ROS core from the command line. A simple example file for keyboard navigation:

<launch>
  <!-- Allows you to move the robot simply with the keyboard -->
  <node pkg="robair_demo" type="kb_control.py" name="kb_control"/>
  <node pkg="robair_demo" type="motion_control_node.py" name="motion_control_node"/>
  <node pkg="robair_demo" type="arduino_sensors.py" name="arduino_sensors"/>
</launch>

Thus we see that in this launch file we use the nodes kb_control, motion_control_node and arduino_sensors. Later, we can make the launch more complex by adding parameters to our nodes to perform the actions we desire. More information about launch files on the following page: http://wiki.ros.org/roslaunch/XML

Maps folder

The maps folder is where all the generated maps are saved. Maps can be created either with the hokuyo_node node or with the hector_slam node, which exist for different versions of ROS. There are well-developed tutorials at the following addresses: http://wiki.ros.org/hokuyo_node http://wiki.ros.org/hector_slam During the execution of the map-creation node, we can create and save a .yaml file representing our map. The underlying idea is that any YAML data can be represented by a combination of lists, hashes (associative arrays) and scalar data. YAML describes the forms of these data (YAML representations), and a syntax to present these data as a character stream (YAML stream).
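As an illustration, a map description file in the usual ROS map_server YAML format might look like the sketch below; the file name and numeric values are made up for the example, not taken from an actual RobAIR map:

```yaml
# Hypothetical map description in ROS map_server format.
image: mymap.pgm          # occupancy grid image saved alongside this file
resolution: 0.050000      # meters per pixel
origin: [-25.0, -25.0, 0.0]  # (x, y, yaw) of the lower-left pixel
negate: 0                 # 0: white = free, black = occupied
occupied_thresh: 0.65     # pixels above this probability are occupied
free_thresh: 0.196        # pixels below this probability are free
```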

Msg folder

It is in this folder that we place the files defining the messages that name all the parts of the robot:

Command.msg file

In this file, all the movements of our robot are defined:

  int8 move
  uint8 speed1
  uint8 turn

encoderData.msg file

In this file, the names of our two wheels are defined:

  int32 Wheelright
  int32 wheelLeft

InfraredPotholes.msg file

In this file, the names of the infrared sensors are defined. The infrared sensors are defined as booleans: the boolean is set to True if there is a hole, False otherwise. They are named as follows:

  bool rear_left
  bool rear_center_left
  bool rear_center_right
  bool rear_right
  bool front_left
  bool front_center_left
  bool front_center_right
  bool front_right

UltrasoundObstacles.msg file

As with the InfraredPotholes.msg file, the ultrasonic sensors are defined in this file. The value of each sensor corresponds to its distance to an obstacle. They are named as follows:

  uint32 north_left
  uint32 north_right
  uint32 NORTH_EAST
  uint32 south_left
  uint32 south_right
  uint32 south_east

Thus, if we want to build a robot with more sensors, whether infrared or ultrasonic, or add movements, these are the files we will need to modify.

Rviz_cfg folder

In this folder, we have all the configurations of the rviz program, which is used to display the map that we create with the lidar. However, these files have the .vcg extension, while the current configuration-file extension of the rviz program is .rviz. So we have to redefine a configuration file.

Script folder

In this folder, we have all the script files that run on our board and provide the robot's functionality.

kb_control.py

Here are defined the keys used to move the robot forward, backward, or to turn it. Modify this file if you want to use different keys.
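A minimal sketch of the kind of key-to-command mapping such a file typically contains; the key bindings and the (move, turn) encoding below are assumptions for illustration, not the actual contents of kb_control.py:

```python
# Hypothetical key-to-motion mapping, in the spirit of kb_control.py.
# The actual keys and command encoding used by RobAIR may differ.
KEY_BINDINGS = {
    'z': (1, 0),   # forward  (move, turn)
    's': (-1, 0),  # backward
    'q': (0, -1),  # turn left
    'd': (0, 1),   # turn right
    ' ': (0, 0),   # stop
}

def command_for_key(key):
    """Return a (move, turn) tuple for a pressed key, or stop if unbound."""
    return KEY_BINDINGS.get(key, (0, 0))
```

Changing the robot's keyboard controls then amounts to editing a single dictionary rather than the control logic itself.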

motion_control_node.py

This file corresponds to the node that controls the robot. We must make sure that the port to which the motor controller is connected is configured properly.

arduino_sensors.py

This file corresponds to the sensor-management node. Here too, we must verify that the sensor port is correct.

Src folder

Arduino.py

In this file, we have all the functions used to handle the sensors. The two main functions are: process_infrared_line, which loops through all the infrared sensors of the robot and updates their values; and process_ultrasound_line, which is similar to the previous one but works on the ultrasonic sensors.
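A rough sketch of what such a parsing function might look like, assuming the Arduino sends one comma-separated line of 0/1 flags per infrared reading; the line format is a guess for illustration, not necessarily the protocol used by the actual file:

```python
INFRA_NB = 8  # number of infrared sensors, as in the Arduino sketch

def process_infrared_line(line):
    """Parse a comma-separated line of 0/1 flags into booleans.

    True means the sensor detected a hole. The line format is an
    assumption for this sketch, not the robot's actual protocol.
    """
    fields = line.strip().split(',')
    if len(fields) != INFRA_NB:
        raise ValueError("expected %d values, got %d"
                         % (INFRA_NB, len(fields)))
    return [field == '1' for field in fields]
```

process_ultrasound_line would follow the same pattern, converting each field to an integer distance instead of a boolean.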

keylogger.py

This file is used to record keyboard input. It is not necessary to modify this file.

motorcmd.py

send_order: function that sends a movement command to the robot.
isFrontSensorsOK: function that checks that no obstacle or hole interferes with the robot's forward motion.
isRearSensorsOK: function that checks that no obstacle or hole interferes with the robot's backward motion.
move: sends the robot an order to move.
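The safety check around move could be sketched as below, for the forward direction only. The sensor names are taken from the msg files described above, but the message shapes (plain dictionaries) and the distance threshold are assumptions of this sketch, not the actual motorcmd.py implementation:

```python
def is_front_sensors_ok(potholes, obstacles, min_distance_cm=30):
    """Return True if the front sensors allow moving forward.

    `potholes` maps infrared sensor names to booleans (True = hole);
    `obstacles` maps ultrasonic sensor names to distances in cm.
    Both shapes and the threshold are assumptions for this sketch.
    """
    front_holes = any(potholes[name] for name in
                      ('front_left', 'front_center_left',
                       'front_center_right', 'front_right'))
    front_clear = all(obstacles[name] >= min_distance_cm
                      for name in ('north_left', 'north_right'))
    return (not front_holes) and front_clear

def move(forward, potholes, obstacles, send_order):
    """Send a move order only if the sensors report it is safe.

    Returns True if the order was sent. Only the forward check is
    sketched here; a real implementation would guard backward motion
    with the rear sensors in the same way.
    """
    if forward and not is_front_sensors_ok(potholes, obstacles):
        return False
    send_order(1 if forward else -1)
    return True
```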

Util folder

Configuration files for the local IP and serial ports. We did not change these files, because the work had been done the previous year and was operating correctly.

Voice folder

In this folder, we put all the audio files that the robot plays when it moves and meets a person, among other situations. Audio files must be in wav format.

Starting the robot

We launch the robot from the command line using scripts. We most often use the run.sh script, which grants the rights to the different ports (sensors and motor control) and starts a launch scenario. Before starting the robot, check that all controllers and sensors are correctly connected and that they have the rights to operate.