<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="http://toychest.ai.uni-bremen.de/wiki/lib/exe/css.php?s=feed" type="text/css"?>
<rss version="2.0">
    <channel xmlns:g="http://base.google.com/ns/1.0">
        <title>Robot Toychest - projects</title>
        <description></description>
        <link>http://toychest.ai.uni-bremen.de/wiki/</link>
        <lastBuildDate>Sat, 09 May 2026 01:57:27 +0000</lastBuildDate>
        <generator>FeedCreator 1.8</generator>
        <image>
            <url>http://toychest.ai.uni-bremen.de/wiki/_media/wiki:dokuwiki.svg</url>
            <title>Robot Toychest</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/</link>
        </image>
        <item>
            <title>cucumber_cutting</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:cucumber_cutting?rev=1341931703&amp;do=diff</link>
            <description>Cucumber Cutting, magnetically tracked

This page hosts some experimental data from cucumber cutting, recorded
with the magnetic motion tracker &#039;Razer Hydra (tm)&#039;.

The following images show the setup before the start of the experiment.

[cucumber cutting setup]
[cucumber cutting setup (top)]</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 10 Jul 2012 14:48:23 +0000</pubDate>
        </item>
        <item>
            <title>dataglove</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:dataglove?rev=1358077255&amp;do=diff</link>
            <description>X-IST Dataglove Software

This is the software package that we have made for our dataglove, which is equipped with 15 bend sensors (3 per finger, one for each joint), 5 force sensors (one on each fingertip) and 2 acceleration sensors that can measure the tilt of the glove. Information about how far apart the fingers (and the thumb!) are spread is missing. Also, the glove cannot (by itself) detect whether the palm is facing up or down. The glove can record 60 frames per second. It is attached through a U…</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Sun, 13 Jan 2013 11:40:55 +0000</pubDate>
        </item>
        <item>
            <title>dlr_hit_hand_arm_sim</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:dlr_hit_hand_arm_sim?rev=1329149050&amp;do=diff</link>
            <description>Hand/arm simulator for the Kuka arms and DLR/HIT hands

You can run a system that simulates the impedance and damping control of the arms and hands on TUM-Rosie.

	*  Run the hand simulator.
cd ias_manipulation_nonfree/sahand_api
./sahand_yarp.py -s -d -n</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Mon, 13 Feb 2012 16:04:10 +0000</pubDate>
        </item>
        <item>
            <title>errors-and-solutions-cl-bullet-assertion</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:errors-and-solutions-cl-bullet-assertion?rev=1359099113&amp;do=diff</link>
            <description>CL-BULLET not working (assertion in `(setup-world-database)&#039;)

If you try to load the CRAM system and want to access Bullet-based reasoning, you might stumble across this error. For CL-BULLET to work correctly, the JSON-PROLOG node has to be running. It can be started directly or via launch files:
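
A minimal sketch of both options (the package, node, and launch-file names
below are assumptions based on typical KnowRob setups, not taken from this
page):

# start the node directly (names assumed)
rosrun json_prolog json_prolog
# or bring it up through a launch file (file name assumed)
roslaunch json_prolog json_prolog.launch</description>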
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 25 Jan 2013 07:31:53 +0000</pubDate>
        </item>
        <item>
            <title>fingertip</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:fingertip?rev=1349889718&amp;do=diff</link>
            <description>Fingertip Laser Sensor

[Fingertip sensor on paper (uses an ADNS-9500)]
[Fingertip sensor]
[Sensor installed on a DLR/HIT hand]

This is a sensor designed to be installed on the fingertip of robotic hands. It uses an Avago ADNS-9500 laser mouse sensor to acquire information about the surface of the grasped object and to detect slippage as well as the distance to the object.</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Wed, 10 Oct 2012 17:21:58 +0000</pubDate>
        </item>
        <item>
            <title>fingertip_distance</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:fingertip_distance?rev=1285325054&amp;do=diff</link>
            <description>Distance classification using the fingertip sensor

Client programs

The project contains client programs to train and test the sensors.

training file format

The training data needs to be in the following format, which allows it to be
used both in the distance evaluation program and in the libsvm-tools programs.
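
The format itself is cut off in this excerpt; since the page mentions the
libsvm tools, it is presumably libsvm&#039;s standard sparse text format. A sketch
with made-up numbers (one sample per line: a class label followed by
index:value feature pairs):

1 1:0.35 2:0.70 3:0.12
2 1:0.40 2:0.66 3:0.18</description>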
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 24 Sep 2010 10:44:14 +0000</pubDate>
        </item>
        <item>
            <title>fingertip_surface</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:fingertip_surface?rev=1321283534&amp;do=diff</link>
            <description>Experiment Setup

Initially, 13 object surfaces were recorded using the fingertip sensor. These would become the classes to be recognized by means of an SVM (Support Vector Machine). Depending on the object, especially the size and heterogeneity of its surface, a longer or shorter time was needed to record the surface. Below you can see a list of the objects recorded and how many frames were taken for each of them:
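
The list itself is cut off in this excerpt. For reference, training and
testing such a multi-class classifier with the stock libsvm command-line
tools could look like this (the file names are made up; the page&#039;s actual
pipeline may differ):

# train with an RBF kernel (-t 2); libsvm handles multi-class automatically
svm-train -t 2 surfaces.train surfaces.model
# classify held-out frames with the trained model
svm-predict surfaces.test surfaces.model predictions.txt</description>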
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Mon, 14 Nov 2011 15:12:14 +0000</pubDate>
        </item>
        <item>
            <title>graveyard</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:graveyard?rev=1570991413&amp;do=diff</link>
            <description>Some old or unmaintained pages:

	*  How to get started with UIMA/C++
	*  PR2: In-hand Objects Modelling
	*  PR2: Launch Kinect Driver
	*  PR2: Extract Region of Interest from 3D-2D Projection
	*  Troubleshoot cop for Pancake Demo
	*  Which current version of PCL in ROS are we using
	*  Windows-Linux Installation
	*  Development with Phidgets + player
	*  iCub related
	*  [ROS node for ARToolKitPlus]
	*  Ingo&#039;s ROS code sandbox
	*  Jan&#039;s Hands-on-examples
	*  Cucumber Cutting (Magnetically track…</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Sun, 13 Oct 2019 18:30:13 +0000</pubDate>
        </item>
        <item>
            <title>hands_on_examples_gazebo_intel</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:hands_on_examples_gazebo_intel?rev=1342710678&amp;do=diff</link>
            <description>Workarounds for Gazebo Intel Graphics Card Issues

Segmentation faults

If you encounter the error

[gazebo_gui-19] process has died [pid 6086, exit code 139, cmd /opt/ros/fuerte/stacks/simulator_gazebo/gazebo/scripts/gui __name:=gazebo_gui __log:=/home/winkler/.ros/log/ba357cea-d1b2-11e1-b926-8c705ac51404/gazebo_gui-19.log].
log file: /home/winkler/.ros/log/ba357cea-d1b2-11e1-b926-8c705ac51404/gazebo_gui-19*.log
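
The page&#039;s actual workaround is cut off in this excerpt. One widely used
mitigation for Gazebo crashes on Intel graphics (an assumption here, not
necessarily the fix described on the page) is to force Mesa software
rendering before launching:

# render in software instead of on the Intel GPU
export LIBGL_ALWAYS_SOFTWARE=1
roslaunch gazebo_worlds empty_world.launch</description>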
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Thu, 19 Jul 2012 15:11:18 +0000</pubDate>
        </item>
        <item>
            <title>hands_on_examples_rosservices_examples</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:hands_on_examples_rosservices_examples?rev=1340376659&amp;do=diff</link>
            <description>Simple ROS service examples

An example project (ROS node) that serves a virtual “song title database” via ROS can be checked out here:

git clone https://github.com/fairlight1337/ros_service_examples.git

This is just an example for reference purposes.
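
Once a service node from the project is running, the standard ROS
command-line tools can be used to poke at it (the service name below is
made up for illustration):

# list the services currently advertised
rosservice list
# show the request arguments one of them expects (name assumed)
rosservice args /song_database/query</description>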
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 22 Jun 2012 14:50:59 +0000</pubDate>
        </item>
        <item>
            <title>hands_on_examples_sending_pose_commands</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:hands_on_examples_sending_pose_commands?rev=1340369550&amp;do=diff</link>
            <description>Sending poses to PR2 in gazebo using action designators

This step-by-step explanation shows how to get the PR2 gazebo simulation
up and running and how to execute simple translation/rotation commands
using high-level components (i.e. action designators).
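
The actual steps are cut off in this excerpt; bringing up the simulation
itself usually started with something like this (the launch-file name is an
assumption for that ROS era, not quoted from the page):

roslaunch pr2_gazebo pr2_empty_world.launch</description>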
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 22 Jun 2012 12:52:30 +0000</pubDate>
        </item>
        <item>
            <title>hand_cartesian</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:hand_cartesian?rev=1282671399&amp;do=diff</link>
            <description>DLR/HIT hand Cartesian library

The hand Cartesian library enables the DLR/HIT hand to:

	*  Calculate the current Cartesian positions of the fingertips.
	*  Calculate the current forces at the fingertips.
	*  Run different types of controllers for each finger.
	*</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 24 Aug 2010 17:36:39 +0000</pubDate>
        </item>
        <item>
            <title>hand_on_exampes_sending_pose_commands</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:hand_on_exampes_sending_pose_commands?rev=1340369423&amp;do=diff</link>
            <description>Sending poses to PR2 in gazebo using action designators

This step-by-step explanation shows how to get the PR2 gazebo simulation
up and running and how to execute simple translation/rotation commands
using high-level components (i.e. action designators).</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 22 Jun 2012 12:50:23 +0000</pubDate>
        </item>
        <item>
            <title>hand_on_examples</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:hand_on_examples?rev=1359098819&amp;do=diff</link>
            <description>Hands-on-examples

	*  Simple (working) ROS service examples in C++, Python and LISP in one project
	*  Sending pose commands through action designators to a simulated (gazebo) PR2
	*  Gazebo workarounds for Intel graphics cards (segfaults, ...)

Errors and their Solutions

	*  CL-BULLET not working (assertion in `setup-world-database&#039;)</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 25 Jan 2013 07:26:59 +0000</pubDate>
        </item>
        <item>
            <title>kinect</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:kinect?rev=1295453467&amp;do=diff</link>
            <description>Mounting Plate for Kinect

Kudos to Brennand Pierce!

[CAD Model for Mounting Plate]

[Mounting Plate for Kinect (photo gallery)]</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Wed, 19 Jan 2011 16:11:07 +0000</pubDate>
        </item>
        <item>
            <title>kuka_lwr</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:kuka_lwr?rev=1360677496&amp;do=diff</link>
            <description>Information about the KUKA lightweight robot

Here we collect some bits and pieces about the KUKA lightweight robot. We hope it is useful for someone.

3D Model of the lightweight arm

[light weight robot]

[This] is a Blender model of the lightweight arm using subdivision surfaces. The number of vertices/faces can still be adjusted when generating meshes. It is licensed under CC-BY-3.0.</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 12 Feb 2013 13:58:16 +0000</pubDate>
        </item>
        <item>
            <title>kuka_lwr_cables</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:kuka_lwr_cables?rev=1456826249&amp;do=diff</link>
            <description>KUKA LWR-4 Cables



The LWR-4 has a Harting HAN-DD connector on the base.

Parts

Adjust accordingly for the cable length that you want (you can order different variants of the fiber-optic patch cable, for example a 15 m one that you could cut in two; in our last test we used two 4 m ones, as that was the longest type in stock at Digikey). The cable gauges (2.5 and 1.0 sqmm) shown below are the same as on the official 7 m cable from the company. We have tried 2 m and 4.5 m cables with success, but …</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 01 Mar 2016 09:57:29 +0000</pubDate>
        </item>
        <item>
            <title>libertymotiontracker</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:libertymotiontracker?rev=1272014304&amp;do=diff</link>
            <description>Magnetic motion tracker &quot;Polhemus Liberty&quot;

This motion tracker has a measurement space of about 4 m and 8 sensors that report position and orientation at a rate of 240 Hz over USB (with a precision better than 1 mm).

	*  Figuring out the orientation of the sensors

	*  Calibrating the skeleton of the test subject

Polhemus, the manufacturer, has a</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 23 Apr 2010 09:18:24 +0000</pubDate>
        </item>
        <item>
            <title>liberty_findingorientation</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:liberty_findingorientation?rev=1272014670&amp;do=diff</link>
            <description>Making sense out of tracker orientation data

Here we describe what we did to make sense out of the orientation data from our Polhemus Liberty tracking device. Maybe this is useful for someone...

Finding the global reference frame

First, we tried to find the tracker coordinate system in the real world. For this test we printed the position data to</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 23 Apr 2010 09:24:30 +0000</pubDate>
        </item>
        <item>
            <title>liberty_skeletoncalibration</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:liberty_skeletoncalibration?rev=1272016115&amp;do=diff</link>
            <description>Recovering the skeleton from magnetic motion tracking data

[Sensor placement on the arm]

This page describes the implementation of a paper by James F. O&#039;Brien et al. on Automatic Joint Parameter Estimation from Magnetic Motion Capture Data.
It requires the sensors to be attached as rigidly as possible to the limbs; in particular, their orientation should not change too much.</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Fri, 23 Apr 2010 09:48:35 +0000</pubDate>
        </item>
        <item>
            <title>lwr_force_measurements</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:lwr_force_measurements?rev=1279014748&amp;do=diff</link>
            <description>Force Measurements with the KUKA lightweight arms

We would like to find out how well we can use our lightweight arms to measure torques and forces.

We are interested in:

	*  sensitivity (what is the smallest force that we can detect)
	*  noise (variance, peak-to-peak)</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 13 Jul 2010 09:52:28 +0000</pubDate>
        </item>
        <item>
            <title>pr2_backpack</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:pr2_backpack?rev=1432624569&amp;do=diff</link>
            <description>Backpack PC for a Kinect-v2 on the PR2

[PR2 with a Kinect-v2 (Kinect One)]
[Backpack PC installed on the PR2]

This is a small computer for the PR2 robot, in order to install a Kinect v2 (or Kinect One).

The Kinect-v2 needs a USB-3 port, which is not available on the PR2&#039;s computers.

The driver also needs a GPU capable of running OpenCL.
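
When checking whether a candidate backpack PC qualifies, two quick tests
with generic Linux tooling (the commands are ours for illustration, not
from this page):

# USB-3 ports show up as 5000M links in the bus tree
lsusb -t
# list available OpenCL platforms and devices (clinfo package)
clinfo</description>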
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 26 May 2015 07:16:09 +0000</pubDate>
        </item>
        <item>
            <title>robot_limit_checker</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:robot_limit_checker?rev=1355849456&amp;do=diff</link>
            <description>Joint Command Limiter

The main purpose of the Joint Command Limiter is to keep the robot from hitting its joint limits and to keep the hand from hitting its own arm. Basically, once position, velocity and acceleration ranges are specified for a joint, the idea is to keep the robot within these ranges at all times. Imagine, for example, that a certain joint is moving at high speed toward its mechanical limit; the limiter&#039;s job is then to know how far the joint can continue at this speed and when to…
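
The reasoning behind that is the usual constant-deceleration stopping
distance (our sketch, not a formula quoted from the page): a joint moving
at velocity v with maximum deceleration a_max needs d = v^2 / (2 * a_max)
of travel to stop, so braking has to start as soon as the remaining
distance to the limit shrinks to d.</description>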
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 18 Dec 2012 16:50:56 +0000</pubDate>
        </item>
        <item>
            <title>sandbox_ingo</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:sandbox_ingo?rev=1366036059&amp;do=diff</link>
            <description>Ingo&#039;s ROS code sandbox

You can check out my public Mercurial repository with:
hg clone http://toychest.in.tum.de/users/kresse/sandbox

NOTE: The relevant code is being moved to GitHub and Bitbucket. Stay tuned.

[Constraint specification for pancake-pushing]

Among other things, it contains:

	*</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Mon, 15 Apr 2013 14:27:39 +0000</pubDate>
        </item>
        <item>
            <title>surface_classifier</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:surface_classifier?rev=1342536798&amp;do=diff</link>
            <description>fingerpub and surfaceClassifier

In order to do surface classification, both of these nodes should be running. Start the fingerpub node (./bin/fingerpub) first and then the surfaceClassifier (./bin/surfaceClassifier); a minimal startup sequence is sketched below.
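
# terminal 1: the sensor reader/publisher
./bin/fingerpub
# terminal 2, once fingerpub is up: the classifier
./bin/surfaceClassifier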

fingerpub reads information directly from the sensors and publishes it as adns messages (</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Tue, 17 Jul 2012 14:53:18 +0000</pubDate>
        </item>
        <item>
            <title>tuning_pr2_vel_resolved</title>
            <link>http://toychest.ai.uni-bremen.de/wiki/projects:tuning_pr2_vel_resolved?rev=1358181716&amp;do=diff</link>
            <description>Plots taken on the real PR2 (click an image to open it in a new tab):

x-unit: [s]
y-unit: [rad/s]
joint: l_shoulder_pan_joint

[plot]

Zoomed in:

[plot]

Plots taken on the simulated PR2 (click an image to open it in a new tab):

x-unit: [s]
y-unit: [rad/s]
joint: l_shoulder_pan_joint</description>
            <author>anonymous@undisclosed.example.com (Anonymous)</author>
            <pubDate>Mon, 14 Jan 2013 16:41:56 +0000</pubDate>
        </item>
    </channel>
</rss>
