
ROS1 Cloud Course → 16: Robot Models, from URDF to xacro

zhangrelay
Published 2022-09-28 12:51:13

Previous article: ROS1 Cloud Course → 15: Topics and Coordinate Frames


Supplementary reading:

2020: ROS Robot URDF Modeling (zhangrelay's blog, CSDN)

2022: URDF Robot Model ROS1&2 Examples (noetic + galactic) (zhangrelay's blog, CSDN)


First, a quick review of the concepts:

Defining a custom 3D robot model in ROS

In ROS, the 3D model of a robot, or of part of its structure, is mainly used for simulating the robot or for helping developers simplify their day-to-day work, and it is implemented through URDF files.

The Unified Robot Description Format (URDF) is an XML file format used to describe a robot and its structure: links, joints, degrees of freedom, and so on. Every 3D robot you see in a ROS system has a corresponding URDF file, for example the PR2 (Willow Garage) or Robonaut (NASA). In the following sections we will learn how to create these files and how the format is used to define the different values.
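As a first taste of the format, here is a minimal sketch (not taken from any real robot; the name and dimensions are made up) of a URDF file that declares a single link with a box as its visual geometry:

<?xml version="1.0"?>
<robot name="minimal_robot">
  <!-- One rigid body (link) with a simple box as its visual geometry -->
  <link name="base_link">
    <visual>
      <geometry>
        <box size="0.3 0.3 0.1"/>
      </geometry>
    </visual>
  </link>
</robot>

Real models add more links, connect them with <joint> elements, and attach <collision> and <inertial> blocks for simulation.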

The cloud environment supports all ROS1/2 robot models: install the corresponding package and start using it ^_^

Robot description packages usually follow this naming convention:

ros-<distro>-<robot_name>-description
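For example, Clearpath's Husky follows this pattern: its description package on ROS Noetic is ros-noetic-husky-description. Assuming it has been released as a binary package for your distro, it can be installed with:

sudo apt install ros-noetic-husky-description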

A simple differential-drive cart:
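As an illustration (a hypothetical sketch, not the exact model used in the course), such a cart can be described in URDF as one chassis link plus two wheel links, each attached by a continuous joint; all names and dimensions below are made up:

<?xml version="1.0"?>
<robot name="diff_cart">

  <!-- Chassis: a simple box -->
  <link name="base_link">
    <visual>
      <geometry>
        <box size="0.4 0.3 0.1"/>
      </geometry>
    </visual>
  </link>

  <!-- Left wheel: a cylinder whose link frame is rolled so the
       cylinder axis points along the chassis y axis -->
  <link name="left_wheel_link">
    <visual>
      <geometry>
        <cylinder radius="0.08" length="0.04"/>
      </geometry>
    </visual>
  </link>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel_link"/>
    <origin xyz="0 0.17 0" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>

  <!-- Right wheel: identical except for its name and the sign of y -->
  <link name="right_wheel_link">
    <visual>
      <geometry>
        <cylinder radius="0.08" length="0.04"/>
      </geometry>
    </visual>
  </link>
  <joint name="right_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="right_wheel_link"/>
    <origin xyz="0 -0.17 0" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>

</robot>

Note how the two wheels repeat almost the same XML verbatim; this duplication is exactly what xacro, introduced below, is designed to remove.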


The official ROS urdf tutorials are very detailed and well worth working through on your own.

xacro (short for XML Macros) helps keep URDF files small and makes them easier to read and maintain. It also lets us define a model once as a macro and reuse it wherever the same structure appears.
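For instance, the two-wheel cart sketched above can be rewritten so that the wheel is defined only once as a macro and instantiated per side; a reflect parameter of +1/-1 distinguishes left from right, the same convention the PR2 file uses further down. This is again a hypothetical sketch:

<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="diff_cart">

  <!-- Shared constants, defined once -->
  <xacro:property name="wheel_radius" value="0.08"/>
  <xacro:property name="wheel_width" value="0.04"/>

  <link name="base_link">
    <visual>
      <geometry>
        <box size="0.4 0.3 0.1"/>
      </geometry>
    </visual>
  </link>

  <!-- One macro describes a wheel; reflect places it on the left or right -->
  <xacro:macro name="wheel" params="side reflect">
    <link name="${side}_wheel_link">
      <visual>
        <geometry>
          <cylinder radius="${wheel_radius}" length="${wheel_width}"/>
        </geometry>
      </visual>
    </link>
    <joint name="${side}_wheel_joint" type="continuous">
      <parent link="base_link"/>
      <child link="${side}_wheel_link"/>
      <origin xyz="0 ${reflect*0.17} 0" rpy="-1.5708 0 0"/>
      <axis xyz="0 0 1"/>
    </joint>
  </xacro:macro>

  <xacro:wheel side="left" reflect="1"/>
  <xacro:wheel side="right" reflect="-1"/>

</robot>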

Here I will only add a few examples; this part works essentially the same way in ROS1 and ROS2.

The tutorials rely on one frequently used command:

roslaunch urdf_tutorial display.launch model:=xxx.urdf.xacro
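Under the hood, display.launch runs the xacro preprocessor on the given file and loads the resulting URDF into the robot_description parameter before starting rviz. You can also perform this expansion by hand and validate the result; assuming the xacro package and the urdfdom tools (check_urdf) are installed:

rosrun xacro xacro model.urdf.xacro -o model.urdf
check_urdf model.urdf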


Now, install a few example packages and view their models in rviz:

1. Industrial manipulator:
   roslaunch urdf_tutorial display.launch model:=irb4400l_30_243.xacro
2. Mobile robot:
   roslaunch urdf_tutorial display.launch model:=husky.urdf.xacro
3. Mobile collaborative robot:
   roslaunch urdf_tutorial display.launch model:=pr2.urdf.xacro
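Note that these three commands assume the corresponding description packages (husky_description, pr2_description, the ABB support package, and so on) are installed and that model:= resolves to the actual .xacro file. If the file is not in the current directory, pass its full path; for example, assuming pr2_description ships the model as robots/pr2.urdf.xacro (its usual location):

roslaunch urdf_tutorial display.launch model:=$(rospack find pr2_description)/robots/pr2.urdf.xacro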


In practice, this part is much like mechanical drafting: it simply takes repetition. There is no special trick to it; the work is mostly modeling.

Take the PR2 as an example:

Code language: XML
<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="pr2" >

  <!-- The following included files set up definitions of parts of the robot body -->
  <!-- misc common stuff? -->
  <xacro:include filename="$(find pr2_description)/urdf/common.xacro" />
  <!-- PR2 Arm -->
  <xacro:include filename="$(find pr2_description)/urdf/shoulder_v0/shoulder.urdf.xacro" />
  <xacro:include filename="$(find pr2_description)/urdf/upper_arm_v0/upper_arm.urdf.xacro" />
  <xacro:include filename="$(find pr2_description)/urdf/forearm_v0/forearm.urdf.xacro" />
  <!-- PR2 gripper -->
  <xacro:include filename="$(find pr2_description)/urdf/gripper_v0/gripper.urdf.xacro" />
  <!-- PR2 head -->
  <xacro:include filename="$(find pr2_description)/urdf/head_v0/head.urdf.xacro" />
  <!-- PR2 tilting laser mount -->
  <xacro:include filename="$(find pr2_description)/urdf/tilting_laser_v0/tilting_laser.urdf.xacro" />
  <!-- PR2 torso -->
  <xacro:include filename="$(find pr2_description)/urdf/torso_v0/torso.urdf.xacro" />
  <!-- PR2 base -->
  <xacro:include filename="$(find pr2_description)/urdf/base_v0/base.urdf.xacro" />
  <!-- Head sensors -->
  <xacro:include filename="$(find pr2_description)/urdf/sensors/head_sensor_package.urdf.xacro" />
  <!-- Camera sensors -->
  <xacro:include filename="$(find pr2_description)/urdf/sensors/wge100_camera.urdf.xacro" />
  <!-- Texture projector -->
  <xacro:include filename="$(find pr2_description)/urdf/sensors/projector_wg6802418.urdf.xacro" />
  <!-- generic simulator_gazebo plugins for starting mechanism control, ros time, ros battery -->
  <xacro:include filename="$(find pr2_description)/gazebo/gazebo.urdf.xacro" />
  <!-- materials for visualization -->
  <xacro:include filename="$(find pr2_description)/urdf/materials.urdf.xacro" />

  <!-- Now we can start using the macros included above to define the actual PR2 -->

  <!-- The first use of a macro.  This one was defined in base.urdf.xacro above.
       A macro like this will expand to a set of link and joint definitions, and to additional
       Gazebo-related extensions (sensor plugins, etc).  The macro takes an argument, name,
       that equals "base", and uses it to generate names for its component links and joints
       (e.g., base_link).  The included origin block is also an argument to the macro.  By convention,
       the origin block defines where the component is w.r.t its parent (in this case the parent
       is the world frame). For more, see http://www.ros.org/wiki/xacro -->
  <xacro:pr2_base_v0 name="base"/>

  <xacro:pr2_torso_v0 name="torso_lift" parent="base_link">
    <origin xyz="-0.05 0 0.739675" rpy="0 0 0" />
  </xacro:pr2_torso_v0>

  <!-- The xacro preprocesser will replace the parameters below, such as ${cal_head_x}, with
       numerical values that were specified in common.xacro which was included above -->
  <xacro:pr2_head_v0 name="head" parent="torso_lift_link">
    <origin xyz="-0.01707 0.0 0.38145"
            rpy="0.0 0.0 0.0" />
  </xacro:pr2_head_v0>

  <!-- Camera package: double stereo, prosilica -->
  <xacro:pr2_head_sensor_package_v0 name="sensor_mount" hd_frame_name="high_def"
         hd_camera_name="prosilica"
         stereo_name="double_stereo"
         parent="head_plate_frame">
    <origin xyz="0.0 0.0 0.0" rpy="0 0 0" />
  </xacro:pr2_head_sensor_package_v0>

  <!-- Projector -->
  <xacro:projector_wg6802418_v0 name="projector_wg6802418" parent="head_plate_frame" >
    <!-- Camera is slightly recessed from front, where is camera origin? Lens? -->
    <origin xyz="0 0.110 0.0546" rpy="0 0 0" />
  </xacro:projector_wg6802418_v0>

  <xacro:pr2_tilting_laser_v0 name="laser_tilt" parent="torso_lift_link" laser_calib_ref="0.0">
    <origin xyz="0.09893 0 0.227" rpy="0 0 0" />
  </xacro:pr2_tilting_laser_v0>

  <!-- This is a common convention, to use a reflect parameter that equals +-1 to distinguish left from right -->
  <xacro:pr2_shoulder_v0 side="r" reflect="-1" parent="torso_lift_link">
    <origin xyz="0.0 -0.188 0.0" rpy="0 0 0" />
  </xacro:pr2_shoulder_v0>
  <xacro:pr2_upper_arm_v0 side="r" reflect="-1" parent="r_upper_arm_roll_link"/>
  <xacro:pr2_forearm_v0 side="r" reflect="-1" parent="r_forearm_roll_link">
    <origin xyz="0 0 0" rpy="0 0 0" />
  </xacro:pr2_forearm_v0>

  <xacro:pr2_gripper_v0 reflect="-1.0" side="r" parent="r_wrist_roll_link"
               screw_reduction="${4.0/1000.0}"
               gear_ratio="${(729.0/25.0)*(22.0/16.0)}"
               theta0="${3.6029*M_PI/180.0}"
               phi0="${29.7089*M_PI/180.0}"
               t0="${-0.1914/1000.0}"
               L0="${37.5528/1000.0}"
               h="${0.0/1000.0}"
               a="${68.3698/1000.0}"
               b="${43.3849/1000.0}"
               r="${91.5/1000.0}" >
    <origin xyz="0 0 0" rpy="0 0 0" />
  </xacro:pr2_gripper_v0>

  <xacro:pr2_shoulder_v0 side="l" reflect="1" parent="torso_lift_link">
    <origin xyz="0.0 0.188 0.0" rpy="0 0 0" />
  </xacro:pr2_shoulder_v0>
  <xacro:pr2_upper_arm_v0 side="l" reflect="1" parent="l_upper_arm_roll_link"/>
  <xacro:pr2_forearm_v0 side="l" reflect="1" parent="l_forearm_roll_link">
    <origin xyz="0 0 0" rpy="0 0 0" />
  </xacro:pr2_forearm_v0>

  <xacro:pr2_gripper_v0 reflect="1.0" side="l" parent="l_wrist_roll_link"
               screw_reduction="${4.0/1000.0}"
               gear_ratio="${(729.0/25.0)*(22.0/16.0)}"
               theta0="${3.6029*M_PI/180.0}"
               phi0="${29.7089*M_PI/180.0}"
               t0="${-0.1914/1000.0}"
               L0="${37.5528/1000.0}"
               h="${0.0/1000.0}"
               a="${68.3698/1000.0}"
               b="${43.3849/1000.0}"
               r="${91.5/1000.0}" >
    <origin xyz="0 0 0" rpy="0 0 0" />
  </xacro:pr2_gripper_v0>

  <!-- Forearm cam Position is a guess, based on full robot calibration -->
  <!-- Forearm cam Orientation is from Function -->
  <xacro:wge100_camera_v0 name="l_forearm_cam" image_format="R8G8B8" camera_name="l_forearm_cam" image_topic_name="image_raw"
                          camera_info_topic_name="camera_info"
                          parent="l_forearm_roll_link" hfov="90" focal_length="320.000105"
                          frame_id="l_forearm_cam_optical_frame" hack_baseline="0"
                          image_width="640" image_height="480">
    <origin xyz=".135 0 .044" rpy="${-M_PI/2} ${-32.25*M_PI/180} 0" />
  </xacro:wge100_camera_v0>
  <xacro:wge100_camera_v0 name="r_forearm_cam" image_format="R8G8B8" camera_name="r_forearm_cam" image_topic_name="image_raw"
                          camera_info_topic_name="camera_info"
                          parent="r_forearm_roll_link" hfov="90" focal_length="320.000105"
                          frame_id="r_forearm_cam_optical_frame" hack_baseline="0"
                          image_width="640" image_height="480">
    <origin xyz=".135 0 .044" rpy="${M_PI/2} ${-32.25*M_PI/180} 0" />
  </xacro:wge100_camera_v0>

  <!-- Kinect models -->
  <xacro:arg name="KINECT1" default="false" />
  <xacro:arg name="KINECT2" default="false" />
  <!-- Kinect2 xacro -->
  <xacro:if value="$(arg KINECT2)">
    <xacro:include filename="$(find pr2_description)/urdf/sensors/kinect2.urdf.xacro" />
    <xacro:kinect2_v0 name="head_mount" parent="head_plate_frame" >
      <origin xyz="-0.137376 0 0.091746" rpy="0 0 0" />
    </xacro:kinect2_v0>
  </xacro:if>

  <!-- Kinect1 xacro -->
  <xacro:unless value="$(arg KINECT2)">
    <xacro:if value="$(arg KINECT1)">
      <xacro:include filename="$(find pr2_description)/urdf/sensors/kinect_prosilica_camera.urdf.xacro" />
      <!-- Base of Kinect/Prosilica Mount -->
      <xacro:kinect_prosilica_camera_swept_back_v0 name="head_mount" parent="head_plate_frame" >
        <origin xyz="-0.137376 0 0.091746" rpy="0 0 0" />
        <!--
              Was -0.1 0 0.09225, -0.138 0 0.09225 looks good front back and sheet metal on bolt pattern,
              -0.138 0 0.06225 is where Prosilica FOV just clears the head.
              This is 30 mm (!) below the position in SolidWorks.
              Lowering the Prosilica by this amount in SW makes the FOV hit around the aluminum / plastic junction.
              Turns out that head_plate_frame is in some random position in space.
              It is [0.0232 0 0.0645 from its parent, head_tilt_link.
              The distance from head_plate_frame to the head_kinect_prosilica is 137.376 mm X and 91.746 mm Z.
              So the origin should in fact be -0.137376 0 0.091746.
              John is changing the Prosilica FOV based on my numbers.
              Just need to roscd and svn up.
        -->
      </xacro:kinect_prosilica_camera_swept_back_v0>
    </xacro:if>
  </xacro:unless>

</robot>

Within this model, the tf tree is particularly complex:
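One way to inspect it on a running system (assuming the rqt_tf_tree package is installed) is:

rosrun rqt_tf_tree rqt_tf_tree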


Food for thought: to make a mobile robot move, you must understand the tf frames commonly used in ROS, such as map, odom and base_link. The map frame is a world-fixed frame that serves as a long-term global reference. The odom frame is useful as an accurate, short-term local reference. base_link is rigidly attached to the base of the mobile robot. These frames are normally related to one another, and the relationship can be drawn as a chain: map → odom → base_link.
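As an illustration of that chain only (not how a real system works: in practice map → odom is published by a localization node such as amcl, and odom → base_link by the robot's odometry source), the three frames could be tied together with static identity transforms in a hypothetical launch file:

<launch>
  <!-- For visualization only: publish identity transforms so the
       map -> odom -> base_link chain shows up in rviz / tf tools -->
  <node pkg="tf2_ros" type="static_transform_publisher" name="map_to_odom"
        args="0 0 0 0 0 0 map odom"/>
  <node pkg="tf2_ros" type="static_transform_publisher" name="odom_to_base_link"
        args="0 0 0 0 0 0 odom base_link"/>
</launch>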


This article was shared from the author's personal site/blog; originally published 2022-09-02.
