FS2HSC - IEEE/RSJ IROS 2006 Workshop:
From sensors to human spatial concepts
(geometric approaches and appearance-based approaches)

October 10, 2006, Beijing, China


Sponsored by the COGNIRON project


Introduction: A dataset was recorded consisting of omnidirectional images, laser range data, sonar readings, and robot odometry. The dataset is intended for testing and benchmarking the methods proposed in papers submitted to the workshop, as well as for stimulating the discussion about spatial representations for mobile robots.

Environment: The dataset was acquired in a home environment constructed by UNET.
A Nomad Scout robot was driven around by tele-operation while data was collected on a laptop mounted on the robot. The robot was driven through the environment three times: one "clean" run with as little noise as possible, one "noisy" run with people walking through the environment, and one "home-tour" run in which a person guided the robot around the house. Each run took about 5 minutes.

Omnidirectional images: The omnidirectional images were taken by a camera with a hyperbolic mirror. On average, 7.5 images per second were captured, yielding approximately 2000 images per run.
Laser range: A SICK laser scanner (LMS-200) was mounted on the front of the Nomad, recording 180-degree range scans at approximately 3.5 scans per second.
Odometry and sonar: On average, 12 odometry measurements per second were recorded. Because the robot has solid wheels, the odometry is quite accurate. The readings of the 16 ultrasonic sonar sensors were recorded at the same time, giving a 360-degree range scan.

For more information on the time-stamping, the formats of the files provided for download, and the specifications of the sensors used, see the respective READMEs.
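Because the sensors run at different rates (roughly 7.5 images/s, 3.5 laser scans/s, and 12 odometry readings/s), users of the dataset will typically want to pair each reading of one stream with the closest-in-time reading of another. The sketch below shows one common way to do this with a binary search over sorted timestamps; the timestamp values here are invented for illustration, and the actual file formats are described in the READMEs.

```python
import bisect

def nearest_timestamp_index(timestamps, t):
    """Return the index of the entry in `timestamps` closest to t.

    `timestamps` must be sorted in increasing order.
    """
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # choose the closer of the two neighbouring timestamps
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

# Hypothetical example: odometry at ~12 Hz, laser at ~3.5 Hz.
odometry_ts = [0.00, 0.08, 0.17, 0.25, 0.33, 0.42, 0.50]
laser_ts = [0.00, 0.29, 0.57]

# For each laser scan, find the odometry reading recorded closest in time.
pairs = [(t, nearest_timestamp_index(odometry_ts, t)) for t in laser_ts]
```

A nearest-neighbour match like this is adequate for slow robot motion; for higher accuracy one could instead interpolate the odometry pose between the two surrounding readings.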

Annotation: We constrain ourselves here to a simple but still rich set of spatial concepts. We start with the human concept of distinct rooms in a home environment. Within each room we selected a number of prominent objects, which are manually and roughly segmented in the omnidirectional images. The region close to an object is defined as the region from which the object can be used in the manner common for that object. The data was annotated by an inexperienced person who had never visited the environment and who relied only on the omnidirectional images and a rough drawing of the environment map. Every second frame was annotated; see the example movie (900 KB). The annotation is provided in XML format, and the XML structure for one frame is described in frame.xml.

Download: The data is available at http://www2.science.uva.nl/sites/cogniron . Some useful MATLAB functions are also provided, e.g. for reading the annotation and for geometric transformations of the omnidirectional camera, together with a detailed description. For X = {1, 4, 5}, the data consists of the following files in the corresponding directories:

runAlmereX.jpg Graphical representation of geometric data
runAlmereXData.zip Odometry, sonar, laser and image-timestamp data
runAlmereXHQ.avi High quality video of image data (MPEG4)
runAlmereXLQ.avi Low quality video of image data (MPEG4)
runAlmereXYUV.zip Raw image data in YUV format
runAlmereXAnnotation.zip Data annotation (XML)

- Annotation is provided only for runs 1 and 5.
- The XML annotation files are large, so expect a delay when opening one.
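Since the annotation files are too large to load comfortably in one go, a streaming parser that discards each frame after processing keeps memory use bounded. The sketch below uses Python's `xml.etree.ElementTree.iterparse`; the `frame` tag name and `id` attribute are assumptions for illustration, so check frame.xml for the actual structure.

```python
import io
import xml.etree.ElementTree as ET

def iter_frames(source, frame_tag="frame"):
    """Stream annotation frames from a large XML file one at a time.

    `frame_tag` is an assumed per-frame element name; see frame.xml
    for the real structure of the annotation files.
    """
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == frame_tag:
            yield elem
            elem.clear()  # free the subtree so memory stays bounded

# Demo on a tiny in-memory document; real files come from
# runAlmereXAnnotation.zip and follow the structure in frame.xml.
demo = io.StringIO("<annotation><frame id='1'/><frame id='2'/></annotation>")
frame_ids = [f.get("id") for f in iter_frames(demo)]
```

Calling `elem.clear()` after each yielded frame is what makes this scale: without it, the whole tree would still accumulate in memory as with a plain `ET.parse`.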

Credits: The following people were involved in creating this dataset (in alphabetical order): Anne Doggenaar, Bas Terwijn, Ben Krose, Edwin Steffens, Elin Anna Topp, Henrik Christensen, Matthijs Spaan, Olaf Booij, Ruben Boumans, Zoran Zivkovic. Special thanks to UNET for making their space available for the experiments.

Contact: B.Terwijn@uva.nl