Report - Robotics.
ACADEMIC VICE-RECTORATE
FACULTY OF ENGINEERING
COMPUTER SCHOOL
ROBOTICS
Members:
Fernandez, Jhannifer
27.260.129
Salazar, Daniel
26.827.908
Section:
IC0831
Introduction
The history of robots began much earlier than is often thought: since ancient
Greece, the most renowned philosophers considered the possibility of
creating artifacts or devices that could replace human labor, specifically in
activities related to farm chores and growing food.
Thanks to the fact that home computers are becoming more and more
powerful, the use of robotics simulators is now a reality.
There is currently a large number of simulators of varying quality and
capability.
Robot Morphology
There are transmissions, reducers, actuators and sensors on the market with
factory-defined operating specifications. In this sense, simulators should
either include a database of the different elements available on the market
or, at least, allow values or ranges to be assigned to the elements used in
our robot model.
In general, one of the requirements that a good simulator must have is the
ability to graphically represent, if possible in 3D, the robotic arm, and it must
show its movement in real time. It is advisable to have the possibility of
moving the camera and zooming in on a specific area, thus allowing the
complete operation of the system to be checked.
Robot Kinematics
Once the physical form of the robot has been defined, the simulator must
allow and control the robot's own kinematics to avoid impossible movements.
The simulator must allow both the links and the joints to be moved while
respecting the kinematic chain. In addition, any movement restrictions defined
in each axis must be respected at all times.
Robot Dynamics
Robot dynamics relates the motion of the robot to the forces involved in it.
In this sense, the simulator must allow defining the values specific to each
element of the robot. It must take into account all the forces involved in the
system and act accordingly in the event of abrupt movements or heavy loads.
Environment
As indicated above, the system should allow the definition of the environment
in which the robot operates and should also have the possibility of adding
sensors that respond to the conditions indicated in the simulation.
Programming
Connection
Simulators
Gazebo
Robot morphology
For the rendering part, Gazebo relies on OGRE. This software offers a
simple object-oriented design environment that is independent of the
underlying 3D implementation, so it can use either DirectX or OpenGL.
Gazebo has an editor that allows you to add basic shapes such as cylinders,
spheres or cubes, or more elaborate ones based on extrudable SVG graphics
or 3D meshes in “.dae” or “.stl” format. To create this type of mesh you can
use free software such as Blender.
To join the shapes and form the robot, there are joints in which you can define
the type of movement allowed (rotational, prismatic, spherical, etc.). In
addition, they allow defining movement limits, maximum supported force and
speed, viscosity, friction, etc.
Another possibility that Gazebo offers is to import robots using the SDF
format. This XML-based format describes objects and environments for robot
simulation, visualization, and control. In addition to allowing us to develop our
own prototypes, the system has its own models.
Robot dynamics
To simulate the dynamics, Gazebo can rely on one of several physics engines:
ODE (Open Dynamics Engine). This library simulates rigid bodies and allows
the control of collisions and friction.
SIMBODY. This library provides our objects with dynamic capabilities,
solving for example Newton's second law (F = m*a, force equals mass
times acceleration).
DART (Dynamic Animation and Robotics Toolkit). This library provides
data structures and algorithms for kinematics and dynamics.
BULLET. This open-source library is a 3D collision detection and dynamics
engine for both rigid and soft bodies.
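The physics engine to be used, together with its basic parameters, is selected in
the world's SDF description. The following is a minimal sketch, assuming ODE and
typical step values (not taken from the original project):
  <!-- Placed inside the <world> element of the .world file -->
  <physics type='ode'>
    <max_step_size>0.001</max_step_size>
    <real_time_update_rate>1000</real_time_update_rate>
  </physics>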
Environment
Gazebo has an interface that allows you to design the entire environment in
which the robot will operate, giving all objects the characteristics of rigid
bodies with collision. In addition to collisions inherent to objects, there is the
possibility of adding laser sensors, 2D or 3D cameras, force sensors, contact
sensors, etc. All of these sensors can be included with noise, thus achieving
a more realistic simulation.
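As an orientative example of how such a sensor could be declared, the following
sketch shows a laser sensor with Gaussian noise; the name and all numeric values
are illustrative and the element would be placed inside one of the model's links:
  <!-- Declared inside a <link> of the model -->
  <sensor name='laser' type='ray'>
    <update_rate>30</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>640</samples>
          <min_angle>-1.57</min_angle>
          <max_angle>1.57</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.10</min>
        <max>10.0</max>
      </range>
      <!-- Gaussian noise makes the simulated readings more realistic -->
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
  </sensor>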
Another point in its favor is the possibility of carrying out remote and cloud
simulations.
Programming
As for the programming of the system, and since it is open source software,
there is the possibility of developing specific plugins that alter the behavior of
the robot, the sensors or the environment, making it possible to configure it
for any situation.
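Plugins are compiled as shared libraries and then referenced from the SDF
description. The following is a minimal hedged sketch, in which the plugin name
and filename are hypothetical placeholders:
  <model name='miRobot'>
    <!-- Attaches a compiled shared library to the model;
         libmy_robot_plugin.so is a hypothetical plugin built separately -->
    <plugin name='my_robot_plugin' filename='libmy_robot_plugin.so'/>
  </model>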
Before starting with the development of the robot, we must configure our work
environment. To do this we must create a folder in which we will host our
code and where the compilation libraries will be located.
The first step will be to create the folders using the “mkdir” command. In our
case we will choose the user's home folder “~/”. On a Spanish keyboard, the
tilde character (~) is obtained with the key combination Alt + 126 or by
pressing AltGr + 4. Another option is to write the path as /home/user.
We will create an initial folder (in this case called “Robot”) and inside this we
will create the “src” folder (from source).
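As a sketch, both folders can be created with a single command from the home
folder chosen above:
mkdir -p ~/Robot/src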
Now we must configure the terminal to know the location of ROS, to do this
we will execute the command “source /opt/ros/kinetic/setup.bash”.
cd ~/Robot/src/
catkin_init_workspace
cd ..
catkin_make
Now we are ready to start developing our robot. To do this we will start by
creating our project within “src”. We do this using the “catkin_create_pkg”
command, in which we can indicate the libraries that our package is going to
need.
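As an illustrative sketch, assuming the package is called “miRobot” (the name
used by the launch file later on) and that it depends on gazebo_ros:
cd ~/Robot/src
catkin_create_pkg miRobot gazebo_ros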
The next step is to build our world. To do this, we create a “worlds” folder in
the package in which to save the world file, and give the file a recognizable
name. In this case it has been called “miRobot.world”.
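The contents of the world file are not reproduced in this report. As a minimal
hedged sketch, a world containing only a floor and a light source could look like
this (ground_plane and sun are standard models bundled with Gazebo; the SDF
version may vary with the Gazebo release):
  <?xml version='1.0'?>
  <sdf version='1.6'>
    <world name='default'>
      <!-- Standard Gazebo models for a flat floor and a sun light source -->
      <include>
        <uri>model://ground_plane</uri>
      </include>
      <include>
        <uri>model://sun</uri>
      </include>
    </world>
  </sdf>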
To launch our project, we will use the “roslaunch” command. This command requires
a configuration file in which we will indicate how we want to launch our simulation.
We will create the file in a folder inside the “worlds” folder, with the name
“miRobot.launch”:
<launch>
  <arg name="paused" default="false"/>
  <arg name="use_sim_time" default="true"/>
  <arg name="gui" default="true"/>
  <arg name="headless" default="false"/>
  <arg name="debug" default="false"/>
  <include file="$(find gazebo_ros)/launch/empty_world.launch">
    <arg name="world_name" value="$(find miRobot)/worlds/miRobot.world"/>
    <arg name="verbose" value="true"/>
    <arg name="debug" value="$(arg debug)"/>
    <arg name="gui" value="$(arg gui)"/>
    <arg name="paused" value="$(arg paused)"/>
    <arg name="use_sim_time" value="$(arg use_sim_time)"/>
    <arg name="headless" value="$(arg headless)"/>
  </include>
</launch>
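Once the workspace has been compiled with “catkin_make” and its setup file has
been sourced, the simulation can be started. An orientative example, assuming the
package is called “miRobot”:
source ~/Robot/devel/setup.bash
roslaunch miRobot miRobot.launch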
With the above options we can launch our simulation. If the “paused” argument is
set to true, the world starts paused, so that no movement or physics is applied
to our robot until the simulation is resumed.
Definition of the Robot
To define the robot we will use the SDF language (http://www.sdformat.org).
In this language the robot is defined using the Model element.
Within this element, basically two other elements will be used: the link and the
joint.
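As an orientative sketch of this structure, the model name follows the package
name used earlier, and the names “chassis”, “arm” and “arm_joint” are illustrative
placeholders, not part of the original model:
  <sdf version='1.6'>
    <model name='miRobot'>
      <!-- Each physical part of the robot is described by a link -->
      <link name='chassis'>
        <!-- pose, inertial, collision and visual elements go here -->
      </link>
      <link name='arm'>
        <!-- ... -->
      </link>
      <!-- Joints connect the links and define how they move relative to each other -->
      <joint name='arm_joint' type='revolute'>
        <parent>chassis</parent>
        <child>arm</child>
        <axis>
          <xyz>0 0 1</xyz>
        </axis>
      </joint>
    </model>
  </sdf>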
Link
The link allows you to define how the link should be displayed, what the
collision criteria are and what dynamics are applied.
In the following piece of definition code you can see the link's pose, its
inertial properties (mass and inertia tensor), its collision geometry with the
associated friction and contact parameters, and its visual geometry with the
material applied to it. All these parameters are processed by OGRE and ODE to
simulate a realistic environment.
<link name='arm_link_0'>
  <pose frame=''>0.143 0 0.046 0 -0 0</pose>
  <inertial>
    <pose frame=''>0 0 0 0 -0 0</pose>
    <inertia>
      <ixx>0.01</ixx>
      <ixy>0</ixy>
      <ixz>0</ixz>
      <iyy>0.01</iyy>
      <iyz>0</iyz>
      <izz>0.01</izz>
    </inertia>
    <mass>0.845</mass>
  </inertial>
  <collision name='arm_link_0_geom'>
    <pose frame=''>0 0 0 0 -0 0</pose>
    <geometry>
      <mesh>
        <uri>model://youbot/meshes/arm/arm0_convex.stl</uri>
      </mesh>
    </geometry>
    <surface>
      <friction>
        <ode>
          <mu>0</mu>
          <mu2>0</mu2>
          <fdir1>0 0 0</fdir1>
          <slip1>0</slip1>
          <slip2>0</slip2>
        </ode>
        <torsional>
          <ode/>
        </torsional>
      </friction>
      <bounce>
        <restitution_coefficient>0</restitution_coefficient>
        <threshold>0</threshold>
      </bounce>
      <contact>
        <ode>
          <soft_cfm>0</soft_cfm>
          <soft_erp>0.2</soft_erp>
          <kp>1e+13</kp>
          <kd>1e+11</kd>
          <max_vel>-1</max_vel>
          <min_depth>0</min_depth>
        </ode>
      </contact>
    </surface>
    <laser_retro>0</laser_retro>
    <max_contacts>10</max_contacts>
  </collision>
  <visual name='arm_link_0_geom_visual'>
    <pose frame=''>0 0 0 0 -0 0</pose>
    <geometry>
      <mesh>
        <uri>model://youbot/meshes/arm/arm0.dae</uri>
      </mesh>
    </geometry>
    <material>
      <script>
        <uri>model://youbot/materials/scripts/youbot.material</uri>
        <name>youbot/DarkGrey</name>
      </script>
    </material>
  </visual>
  <gravity>1</gravity>
  <self_collide>0</self_collide>
  <kinematic>0</kinematic>
</link>
Joint
Joints are one of the most important elements for defining the robot's movement.
They establish how the links are hierarchically connected and what type of joint
and actuator is used at each connection.
On the one hand, we have the hierarchy, through the definition of the
parent and the child. In the case of our first joint, it has “base” as its
parent, with the objective of fixing the robotic arm to the base of the
robot.
The next point is the axes. SDF allows you to define up to two axes of
movement per joint, although in the robot of this project only one axis per
joint will be defined. The axes can be rotational or prismatic and allow
defining:
The dynamics of the axis.
The limits, both the maximum and minimum angle that the joint can reach
and the maximum force and speed that can be applied to it.
Finally, the coordinate axis about which the joint rotates or moves.
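The following is a hedged sketch of what such a joint could look like in SDF; the
names and numeric values are illustrative and not taken from the original model:
  <joint name='arm_joint_1' type='revolute'>
    <!-- Hierarchy: parent and child links connected by this joint -->
    <parent>base</parent>
    <child>arm_link_1</child>
    <axis>
      <!-- Coordinate axis (here z) about which the child rotates -->
      <xyz>0 0 1</xyz>
      <dynamics>
        <damping>0.1</damping>
        <friction>0.0</friction>
      </dynamics>
      <limit>
        <!-- Minimum and maximum angle (rad), maximum force and speed -->
        <lower>-2.9</lower>
        <upper>2.9</upper>
        <effort>9.5</effort>
        <velocity>1.5</velocity>
      </limit>
    </axis>
  </joint>
The effort and velocity elements correspond to the maximum force and speed limits
mentioned above.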
Conclusion
One of the most popular applications of robotics simulators is the 3D
modeling and rendering of a robot and its environment. This type of robotics
software provides a virtual robot that is capable of emulating the movements
of a real robot within its work envelope. Some robotics simulators use a
physics engine to generate more realistic robot motion. The use of a robotics
simulator is recommended for the development of a robot control program,
regardless of whether or not a real robot is available. The simulator allows
robot programs to be conveniently written and debugged offline, with the
final version of the program then tested on a real robot.
By using a simulation, costs are reduced and robots can be programmed offline,
eliminating any downtime for an assembly line. Robot actions and assembly parts
can be visualized in a three-dimensional virtual environment months before
prototypes are produced. Writing code for a simulation is also easier than writing
code for a physical robot.
Gazebo offers the ability to accurately and efficiently simulate robot populations in
complex indoor and outdoor environments. It features a robust physics engine, high-
quality graphics, and convenient graphical and programming interfaces. Best of all,
Gazebo is free.
Bibliographic References
Robot simulation environments: https://programacionextrema.es/2016/07/16/entornos-de-simulacion-de-robots/
Building the robot: https://programacionextrema.es/2016/10/09/simulacion-robots-gazebo-construccion-del-robot/