.. _camera:

Camera
======

Tower-mount camera calibration
------------------------------

**On workstation or your PC**
1. SSH into the youBot (in all terminals):

   .. code-block:: bash

      yb2

   .. note::

      ``alias yb2='ssh -X robocup@youbot-brsu-2-pc2'``

2. Point ROS on your machine to the youBot's ROS master:

   .. code-block:: bash

      export_yb2

   .. note::

      ``alias export_yb2='export ROS_MASTER_URI=http://youbot-brsu-2-pc2:11311'``
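   If these aliases are not yet defined on your workstation, a minimal sketch of what to add to your ``~/.bashrc`` could look like this (the host name is taken from the notes above; the ``ROS_IP`` line is an optional extra, not part of the original setup):

   .. code-block:: bash

      # Sketch of ~/.bashrc additions for working with youbot-brsu-2 (adapt to your robot)
      alias yb2='ssh -X robocup@youbot-brsu-2-pc2'
      alias export_yb2='export ROS_MASTER_URI=http://youbot-brsu-2-pc2:11311'
      # Optionally expose your own IP so nodes on the robot can reach this machine
      # export ROS_IP=<your-workstation-ip>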
**On robot**
3. Launch the robot:

   .. code-block:: bash

      roslaunch mir_bringup robot.launch

4. Launch 2D navigation (to visualize the robot in RViz):

   .. code-block:: bash

      roslaunch mir_2dnav 2dnav.launch

5. Run the calibration:

   .. code-block:: bash

      roslaunch mir_calibrate_pick calibrate_pick.launch

   A small GUI window for adjusting the pose of the end-effector in terms of XYZRPY will appear.
6. Place a small round object on the ground (at approx. 25 cm from the center of the front wheel).

**On workstation or your PC**
7. Run RViz:

   .. code-block:: bash

      rosrun rviz rviz

8. Load the ``camera_calibration_config.rviz`` config and set the global frame to ``base_link``.
   Make sure the ``raw_color_pcl`` and ``camera_calibration_obj_pose`` displays are enabled. You should see an arrow pointing away from the robot.
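   Alternatively, RViz can load the config directly at startup; the path below is an assumption, adjust it to wherever ``camera_calibration_config.rviz`` is stored on your machine:

   .. code-block:: bash

      # Start RViz with the calibration config preloaded (config path is an assumption)
      rosrun rviz rviz -d ~/camera_calibration_config.rviz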
9. Adjust the values in the pose mockup GUI so that the start of the arrow matches the center of the round object.
   .. note::

      Make sure the object is not too far away from the robot and that the Z value is slightly above the object.

   .. figure:: images/camera_calib_side.png
      :align: center

      Camera calibration example, side view

   .. figure:: images/camera_calib_top.png
      :align: center

      Camera calibration example, top view
**On robot**
10. Test the calibration in another terminal:

    .. code-block:: bash

       rosrun mir_calibrate_pick calibrate_pick_client_test.py

    The robot will move towards the object and bring the end-effector of the arm close to the object based on the given offset.
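    If nothing happens, it can help to first verify that the calibration nodes are actually running; the grep patterns below are only a guess at the naming:

    .. code-block:: bash

       # Sanity check that the calibration pipeline is up (node/topic names are assumptions)
       rosnode list | grep -i calibrate
       rostopic list | grep -i calibrate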
11. If the final end-effector position is not properly aligned with the desired goal position, edit ``robot.urdf.xacro`` in the robot's hardware configuration (on the robot):

    .. code-block:: bash

       /ros/noetic/robocup/src/mas_industrial_robotics/mir_robots/mir_hardware_config/youbot-brsu-2/urdf/robot.urdf.xacro

    .. code-block:: xml

       <xacro:realsense_d435 name="arm_cam3d" parent="base_link">
         <origin xyz="0.30 -0.05 0.80" rpy="0.00 1.137 0.0" />
       </xacro:realsense_d435>

    Adjust the xyz and rpy values to account for the offset set in step 9, then repeat from step 9.
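    After editing the xacro, it can be worth verifying that it still parses before relaunching the robot. This is a sketch assuming the standard ``xacro`` and ``check_urdf`` tools are available:

    .. code-block:: bash

       # Regenerate the URDF from the edited xacro and validate it (sketch)
       rosrun xacro xacro /ros/noetic/robocup/src/mas_industrial_robotics/mir_robots/mir_hardware_config/youbot-brsu-2/urdf/robot.urdf.xacro > /tmp/robot.urdf
       check_urdf /tmp/robot.urdf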
12. If the final end-effector position is properly aligned with the desired goal position, the camera calibration is complete.

13. Terminate all running processes and relaunch the robot to continue.
.. _realsense2_camera:

RealSense2 camera
-----------------

How to use the RealSense2 camera.
1. Installation

   Go to the intel-ros GitHub page and clone the realsense repository into the ``src`` folder of your catkin workspace:

   .. code-block:: bash

      git clone git@github.com:intel-ros/realsense.git
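   After cloning, the package still has to be built and the workspace sourced. A minimal sketch, assuming the workspace is ``~/catkin_ws``, that ``catkin build`` is used (``catkin_make`` works too), and that librealsense is already installed on the system:

   .. code-block:: bash

      # Build the RealSense ROS wrapper and source the workspace (paths/tools are assumptions)
      cd ~/catkin_ws
      catkin build realsense2_camera
      source devel/setup.bash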
2. Camera Output

   Run the following to get access to the camera:

   .. code-block:: bash

      roslaunch realsense2_camera rs_rgbd.launch

   Open RViz to visualize the camera output.
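   Before opening RViz you can check that the driver is publishing; the point cloud topic below is the one used later in this section:

   .. code-block:: bash

      # List the camera topics and check the rate of the registered point cloud
      rostopic list | grep /camera
      rostopic hz /camera/depth_registered/points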
3. Configure camera output (OPTIONAL)

   Run the following to open the dynamic reconfigure window:

   .. code-block:: bash

      rosrun rqt_reconfigure rqt_reconfigure

   You can also try changing the ``octree_resolution`` value in:

   .. code-block:: bash

      cd *catkin workspace*/src/mas_perception/mcr_scene_segmentation/ros/config
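   If you are unsure which file in that folder holds the parameter, a simple search helps (this assumes ``octree_resolution`` appears literally in one of the config files):

   .. code-block:: bash

      # Locate the octree_resolution parameter inside the config folder
      grep -rn "octree_resolution" .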
4. Setup Base Frame

   Run the following:

   .. code-block:: bash

      rosrun tf static_transform_publisher x y z yaw pitch roll base_link camera_link 100

   where x, y, z are the translations and yaw, pitch, roll are the rotations from ``base_link`` to ``camera_link``
   (note that ``static_transform_publisher`` expects the rotations in yaw-pitch-roll order).
   To visualize your frames in RViz, add the TF display in the RViz menu.
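   As a concrete example, using the illustrative mounting values from the tower-mount calibration section above (x=0.30 m, y=-0.05 m, z=0.80 m, pitched down by 1.137 rad):

   .. code-block:: bash

      # Illustrative values taken from the tower-mount calibration section above
      # (x=0.30, y=-0.05, z=0.80; yaw=0.0, pitch=1.137, roll=0.0)
      rosrun tf static_transform_publisher 0.30 -0.05 0.80 0.0 1.137 0.0 base_link camera_link 100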
5. Save Point Clouds

   If it's your first time saving point clouds, you need to choose where to save them and enable data collection:

   .. code-block:: bash

      cd *catkin workspace*/src/mas_perception/mcr_scene_segmentation/ros/launch

   Change the value of "dataset_collection" from "false" to "true" and change the value of "logdir" from "/temp/"
   to the path on your computer where you want to save the files.

   Run the following to get access to the point clouds given by the camera:

   .. code-block:: bash

      roslaunch mcr_scene_segmentation scene_segmentation.launch input_pointcloud_topic:=/camera/depth_registered/points
   Publish the message ``e_start``:

   .. code-block:: bash

      rostopic pub /mcr_perception/scene_segmentation/event_in std_msgs/String "data: 'e_start'"

   Publish the message ``e_add_cloud_start``:

   .. code-block:: bash

      rostopic pub /mcr_perception/scene_segmentation/event_in std_msgs/String "data: 'e_add_cloud_start'"

   The last command saves the current point cloud of the observed object on your system.

   .. warning::

      Sometimes the camera won't save the point cloud (don't worry, it's not your fault).
      Just try a different position for the object until it works.
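   To confirm that a cloud was actually written, you can watch the node's feedback topic (assuming it follows the usual ``event_out`` convention) and list the configured ``logdir``:

   .. code-block:: bash

      # Feedback topic name is an assumption based on the mcr event_in/event_out convention
      rostopic echo /mcr_perception/scene_segmentation/event_out
      # Check that new .pcd files appear in the configured logdir (placeholder path)
      ls -lt <logdir>/*.pcd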
6. Visualize Point Cloud

   Run the following in the folder where you saved the point clouds:

   .. code-block:: bash

      pcl_viewer <name of the .pcd file you want to open>