Here are the steps taken to get the iCub to grasp Lego pieces and, hopefully, put them together.
Take a look at the official documentation, or, if you are using Debian sid, follow this quick step-by-step guide.
sudo easy_install -U rosinstall
mkdir -p local/src/ros
cd local/src/
rosinstall ros http://www.ros.org/rosinstalls/cturtle_base.rosinstall
Put the following environment setup in ${HOME}/.bashrc.ros (adjust hostnames and paths to your own setup):
export ROS_ROOT=${HOME}/local/src/ros/ros
export PATH=${ROS_ROOT}/bin:${PATH}
export PYTHONPATH=${ROS_ROOT}/core/roslib/src:${PYTHONPATH}
export OCTAVE_PATH=${ROS_UP}/ros/core/experimental/rosoct/octave:${OCTAVE_PATH}
#if [ ! "$ROS_MASTER_URI" ] ; then export ROS_MASTER_URI=http://leela:11311 ; fi
if [ ! "$ROS_MASTER_URI" ] ; then export ROS_MASTER_URI=http://localhost:11311 ; fi
export ROS_PACKAGE_PATH=${HOME}/local/src/ros/stacks:${HOME}/local/src/repositories/oid5
#export ROS_STACK_PATH=${ROS_ROOT}:${ROS_UP}/ros-pkg
#source `rosstack find ias_semantic_mapping`/setup.sh
NUM_CPUS=`cat /proc/cpuinfo | grep processor | wc -l`
let "PAR_JOBS=${NUM_CPUS}+2"
export ROS_PARALLEL_JOBS="-j${PAR_JOBS}"
export ROS_LANG_DISABLE=roslisp:rosjava
export ROS_IP=`ip addr show \`/sbin/route -n | awk '/UG/ {print $8}'\` | awk '/inet /{print $2}' | sed -e 's/\/.*//'`
#export ROS_IP=192.168.137.2
#HOSTNAME=`hostname`
#if [[ "${HOSTNAME}" == "leela" ]] ; then echo "Hola"; else echo "lala" ;fi
alias kcart_left="`rospack find kbd-cart-cmd`/kbd-cart-cmd -p kbd-left -c /lwr/left"
alias kcart_right="`rospack find kbd-cart-cmd`/kbd-cart-cmd -p kbd-right -c /lwr/right"
. ${ROS_ROOT}/tools/rosbash/rosbash
Then add the following alias (for example in your ~/.bashrc) so that env_ros loads this environment:
alias env_ros='source ${HOME}/.bashrc.ros'
In the ros directory there is a subdirectory called stacks; this is where you can put extra packages. You just have to download the packages (for example with svn or git) and put them there.
Example: GStreamer video acquisition:
cd $ROS_ROOT
cd ../stacks
svn co http://brown-ros-pkg.googlecode.com/svn/tags/brown-ros-pkg
roscd
cd ../stacks/brown-ros-pkg/
cat /directory_to_patch/gscam.patch | patch -p0
rosmake gscam
gscam captures images from GStreamer video sources and publishes them to a ROS topic. First, test the camera directly with gst-launch:
gst-launch --gst-debug=v4l2:2 v4l2src device=/dev/video1 ! video/x-raw-rgb,width=800,height=600,framerate=30/1 ! ffmpegcolorspace ! xvimagesink
Then configure and run gscam:
export GSCAM_CONFIG="v4l2src device=/dev/video0 ! video/x-raw-rgb,width=800,height=600,framerate=30/1 ! ffmpegcolorspace ! identity name=ros ! fakesink"
rosrun gscam gscam
To view the published images:
rosmake image_view
rosrun image_view image_view image:=/gscam/image_raw
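If you prefer to check the stream from a node of your own instead of image_view, a minimal rospy subscriber (a sketch, not part of gscam; the node name is made up) looks like this:

#!/usr/bin/env python
# Sketch: print the size and encoding of each frame published by gscam
# on /gscam/image_raw, just to confirm that images are arriving.
import rospy
from sensor_msgs.msg import Image

def on_image(msg):
    rospy.loginfo("got %dx%d image, encoding %s", msg.width, msg.height, msg.encoding)

rospy.init_node("gscam_check")
rospy.Subscriber("/gscam/image_raw", Image, on_image)
rospy.spin()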
With a Logitech C600 webcam one can get better (less noisy) image quality by setting the video mode to YUV. With gst-launch:
gst-launch --gst-debug=v4l2:5 v4l2src device=/dev/video1 ! video/x-raw-yuv,format=\(fourcc\)YUY2,width=800,height=600,framerate=15/1 ! ffmpegcolorspace ! xvimagesink
For gscam:
export GSCAM_CONFIG="v4l2src device=/dev/video1 ! video/x-raw-yuv,width=800,height=600,framerate=15/1,format=(fourcc)YUY2 ! ffmpegcolorspace ! video/x-raw-rgb ! identity name=ros ! fakesink"
The final conversion to video/x-raw-rgb is necessary because gscam only accepts RGB images. Note carefully that, unlike on the gst-launch command line, the GSCAM_CONFIG export has no backslashes around the (fourcc) part of the format.
One can use gst-inspect to check the capabilities of the different GStreamer elements.
rosmake camera_calibration
rosrun camera_calibration cameracalibrator.py --size 5x4 --square 0.02464 image:=/gscam/image_raw camera:=/gscam
export ROS_NAMESPACE=gscam
rosrun image_proc image_proc
rosrun image_view image_view image:=/gscam/image_rect_color
The picture-selection algorithm in camera_calibration is not perfect: it tends to prefer blurry images over sharp ones, and most of the time it does not use enough pictures. So here we explain how to calibrate using your own pictures.
cd /tmp/
rosrun image_view image_view image:=/gscam/image
Images are stored in the current directory.
rosrun camera_calibration camera_calibrate_from_disk.py --size 8x6 --square 0.0247881 /tmp/frame00*
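If camera_calibrate_from_disk.py is not available in your patched camera_calibration, a rough OpenCV-only equivalent is sketched below; it is not the script used above, and the frame file pattern, board size and cv2 calls are assumptions:

#!/usr/bin/env python
# Sketch: calibrate from frames saved on disk using OpenCV directly.
# Assumes an 8x6 inner-corner chessboard with 0.0247881 m squares and
# images matching /tmp/frame00*.
import glob
import numpy as np
import cv2

size = (8, 6)            # inner corners per row and column
square = 0.0247881       # square edge length in metres

# 3D coordinates of the board corners in the board plane (z = 0)
board = np.zeros((size[0] * size[1], 3), np.float32)
board[:, :2] = np.mgrid[0:size[0], 0:size[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
image_size = None
for name in sorted(glob.glob("/tmp/frame00*")):
    gray = cv2.cvtColor(cv2.imread(name), cv2.COLOR_BGR2GRAY)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, size)
    if found:
        obj_points.append(board)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
print("reprojection error:", rms)
print("camera matrix:", K)
print("distortion:", dist.ravel())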
roscd; cd ../stacks
git clone http://robotics.ccny.cuny.edu/git/ccny-ros-pkg.git
cd ccny-ros-pkg
cat directory_to_patch/ar_pose.patch | patch -p1
rosmake artoolkit
rosmake ar_pose
Each marker used by ar_multi is described by one line in the marker list file, for example:
4x4_23 data/4x4/4x4_23.patt 25.0 0.0 0.0
The first field is the marker name, the second is the marker pattern file, the third is the marker size in mm, and the last two are presumably the marker centre offsets.
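Just to make the field meanings concrete, a tiny (hypothetical, not part of ar_pose) parser for such a line:

# Split one marker description line into name, pattern file, size (mm) and centre offset.
def parse_marker_line(line):
    name, pattern, size_mm, cx, cy = line.split()
    return name, pattern, float(size_mm), (float(cx), float(cy))

print(parse_marker_line("4x4_23 data/4x4/4x4_23.patt 25.0 0.0 0.0"))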
rosrun ar_pose ar_multi /usb_cam/camera_info:=/gscam/camera_info /usb_cam/image_raw:=/gscam/image_rect
rostopic echo /visualization_marker
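To consume the detections from a node of your own instead of rostopic echo, a minimal subscriber (a sketch; ar_pose publishes visualization_msgs/Marker messages on this topic) could look like:

#!/usr/bin/env python
# Sketch: print the id and position of every marker detected by ar_pose.
import rospy
from visualization_msgs.msg import Marker

def on_marker(m):
    p = m.pose.position
    rospy.loginfo("marker %d at (%.3f, %.3f, %.3f)", m.id, p.x, p.y, p.z)

rospy.init_node("marker_listener")
rospy.Subscriber("/visualization_marker", Marker, on_marker)
rospy.spin()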
roscd; cd ../stacks
git clone gitosis@git9.in.tum.de:tumros-internal
rosmake yarp2
rosmake yarp_to_ros_image
rosrun yarp_to_ros_image yarp_to_ros_image
We use a closed-loop inverse kinematics system which integrates a vector field in the task-space part of the controller. We use two controllers, one for each arm. Both controllers can control the torso, but a high-level arbiter has to make sure that only one of them is in control at any given moment.
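The actual controllers live in the oid5 motionControl code; purely to illustrate the idea (this is a sketch, not that implementation), one closed-loop inverse kinematics update can be written as follows, where forward_kin, jacobian and vector_field stand in for whatever kinematics model and task-space field you plug in:

# Sketch of one closed-loop IK step: the task-space velocity command is the
# vector field evaluated at the current pose plus a proportional error term,
# mapped to joint velocities through a damped pseudo-inverse of the Jacobian.
import numpy as np

def clik_step(q, x_des, forward_kin, jacobian, vector_field, dt, k=1.0, lam=1e-3):
    x = forward_kin(q)                          # current task-space position/pose
    J = jacobian(q)                             # m x n task Jacobian
    xdot = vector_field(x) + k * (x_des - x)    # vector field plus closed-loop correction
    # damped least-squares pseudo-inverse: J^T (J J^T + lam^2 I)^-1
    J_pinv = np.dot(J.T, np.linalg.inv(np.dot(J, J.T) + (lam ** 2) * np.eye(J.shape[0])))
    return q + dt * np.dot(J_pinv, xdot)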
sudo apt-get install mercurial kdiff3
Configure Mercurial by putting the following in your ~/.hgrc:
[ui]
username = Name Lastname <email_address>
merge = kdiff3
[extensions]
hgk=
mkdir -p local/src/repositories/
cd local/src/repositories
hg clone ssh://repo@nibbler.informatik.tu-muenchen.de//work/repos/hg/oid5
cd tools/yarp
make -j20
make install
cd ../../tools/kdl
make -j20
make install
cd
cd local/DIR
xstow yarp
xstow kdl
cd local/src/repositories/oid5/control/motionControl
./system_start.sh -r icub -i right -s -d
To stop it:
./kill.sh
roscd yarp_to_ros_image
rosrun yarp_to_ros_image yarp_to_ros_image
This will use the camera_calibration.txt file that is in the yarp_to_ros_image directory.
yarp connect /icub/cam/left /yarp_to_ros_image
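The same connection can also be made programmatically, assuming the YARP Python bindings (the yarp module) are built:

# Sketch: connect the iCub camera port to the bridge from Python
# instead of using the yarp command line tool.
import yarp

yarp.Network.init()
if not yarp.Network.connect("/icub/cam/left", "/yarp_to_ros_image"):
    print("could not connect /icub/cam/left to /yarp_to_ros_image")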
export ROS_NAMESPACE=yarp_to_ros_image
rosrun image_proc image_proc image_raw:=image
rosrun ar_pose ar_multi /usb_cam/camera_info:=/yarp_to_ros_image/camera_info /usb_cam/image_raw:=/yarp_to_ros_image/image_rect
rostopic echo /ar_multi/visualization_marker
roscd icub_bringup
roslaunch icub_marker.launch
roscd tf_yarp_bridge
./calib_map.py
roscd tf_yarp_bridge
./tf_yarp_bridge.py
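Once the bridge is running you can check that the transforms are available, for example with a small tf listener like the sketch below; the frame names /map and /marker are hypothetical placeholders, use whatever frames your launch files actually publish:

#!/usr/bin/env python
# Sketch: periodically look up a transform from the tf tree.
import rospy
import tf

rospy.init_node("tf_check")
listener = tf.TransformListener()
rate = rospy.Rate(1.0)
while not rospy.is_shutdown():
    try:
        trans, rot = listener.lookupTransform("/map", "/marker", rospy.Time(0))
        rospy.loginfo("translation %s rotation %s", trans, rot)
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        pass
    rate.sleep()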
cd ~/local/src/repositories/oid5/robots/iCub/gaze_follower/
./start_lego_demo_gaze_follower.sh
cd ~/local/src/repositories/oid5/perception/lego/
./start_lego_demo.sh
cd ~/local/src/repositories/oid5/control/motionControl/
./system_start.sh -r icub -i right
./system_start.sh -r icub -i left
cd ~/local/src/repositories/oid5/control/handControl
./start_lego_demo_hand_control.sh
cd ~/local/src/repositories/oid5/control/motionControl/
./lego_demo.py