From Sensors to Human Spatial Concepts (FS2HSC) - Annotated Robot Sensor Data
Sponsored by the COGNIRON project.

The robot was driven through the environment three times: a "clean" run with as low a noise level as possible, a "noisy" run with people walking through the environment, and a "home-tour" run in which a person guides the robot around the house. Each run took around 5 minutes.

Omnidirectional vision
The omnidirectional images were taken by a camera with a hyperbolic mirror. On average 7.5 images per second were taken, amounting to approximately 2000 images per run.
Laser range
A SICK laser scanner (LMS-200) was mounted on the Nomad to record 180-degree range scans at the front of the robot. Approximately 3.5 scans were recorded per second.
Odometry and sonar
On average 12 odometry measurements per second were taken. Because the robot has solid wheels, the odometry is quite accurate. At the same time, the current values of the 16 ultrasonic sonar sensors were recorded, giving a 360-degree range scan.

For more information on the time-stamping, the format of the files available for download, and the specifications of the sensors used, have a look at the respective READMEs.
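Because the sensors run at different rates (roughly 7.5 images, 3.5 laser scans, and 12 odometry readings per second), consumers of the data typically need to match measurements across streams by timestamp. The exact file and timestamp formats are described in the READMEs; the following is only a generic sketch of nearest-timestamp matching, with made-up example values:

```python
import bisect

def nearest_measurement(timestamps, measurements, t):
    """Return the measurement whose timestamp is closest to t.

    Assumes `timestamps` is sorted ascending and parallel to `measurements`.
    """
    i = bisect.bisect_left(timestamps, t)
    candidates = []
    if i > 0:
        candidates.append(i - 1)          # neighbour just before t
    if i < len(timestamps):
        candidates.append(i)              # neighbour at or just after t
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return measurements[best]

# Hypothetical laser scans at ~3.5 Hz, matched against an image taken at t = 0.40 s
laser_t = [0.00, 0.29, 0.57, 0.86, 1.14]
laser = ["scan0", "scan1", "scan2", "scan3", "scan4"]
print(nearest_measurement(laser_t, laser, 0.40))  # prints "scan1"
```

The same lookup works for pairing odometry or sonar readings with images; only the timestamp lists differ.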

Annotation: We constrain ourselves here to a simple but still rich set of spatial concepts. We start with the human concept of different rooms in a home environment. Furthermore, within each room we selected a number of prominent objects, which are manually and roughly segmented in the omnidirectional images. The region close to an object is defined as the region from where the object can be used in a manner common for that object. The data was annotated by an inexperienced person who had never visited the environment and relied only on the omnidirectional images and a rough drawing of the environment map. Every second frame was annotated; see the example movie (900KB). The annotation is provided in XML format, and the XML structure for one frame is described in frame.xml.
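Since the annotation is plain XML, it can be read with any standard XML library. The following Python sketch shows the general idea on a made-up frame; the actual tag and attribute names are defined in frame.xml and will differ:

```python
import xml.etree.ElementTree as ET

# Hypothetical frame annotation; the real structure is given in frame.xml
xml_text = """
<frame id="42">
  <room>kitchen</room>
  <object name="table">
    <polygon>10,20 60,20 60,80 10,80</polygon>
  </object>
</frame>
"""

frame = ET.fromstring(xml_text)
room = frame.findtext("room")                      # room label for this frame
objects = {}
for obj in frame.findall("object"):
    # parse "x,y x,y ..." into a list of (x, y) vertex tuples
    points = [tuple(map(int, p.split(",")))
              for p in obj.findtext("polygon").split()]
    objects[obj.get("name")] = points

print(room, objects)
```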

Download: The data is available via: . Some useful MATLAB functions are also provided, e.g. for reading the annotation and for geometric transformations for the omnicam. A detailed description is provided.
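One geometric transformation such utilities typically offer is unwarping the circular omnidirectional image into a panoramic strip. As a rough illustration of the underlying idea (not the provided MATLAB code), here is a minimal pure-Python nearest-neighbour polar unwarp; the function name, its parameters, and the assumption of a purely radial projection are all illustrative:

```python
import math

def unwarp(omni, center, r_min, r_max, out_w, out_h):
    """Map an omnidirectional image (list of pixel rows) to a panorama.

    Each panorama column corresponds to a viewing angle; each row to a
    radius between r_max (top) and r_min (bottom) around `center`.
    Nearest-neighbour sampling, no interpolation.
    """
    cx, cy = center
    pano = []
    for row in range(out_h):
        r = r_max - (r_max - r_min) * row / max(out_h - 1, 1)
        line = []
        for col in range(out_w):
            theta = 2 * math.pi * col / out_w
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            line.append(omni[y][x])
        pano.append(line)
    return pano

# Synthetic 21x21 test image whose pixel value is the distance to the centre,
# so each panorama row should sample a roughly constant value
omni = [[round(math.hypot(x - 10, y - 10)) for x in range(21)] for y in range(21)]
pano = unwarp(omni, (10, 10), 2, 8, 8, 4)
print(pano[0])  # outermost ring, all values 8
```

A real implementation would use the mirror's calibrated hyperbolic projection model rather than this simple radial mapping, and would interpolate between pixels.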

Credits: The following persons were involved in making this dataset (in alphabetical order): Anne Doggenaar, Bas Terwijn, Ben Krose, Edwin Steffens, Elin Anna Topp, Henrik Christensen, Matthijs Spaan, Olaf Booij, Ruben Boumans, Zoran Zivkovic. Special thanks to UNET for making their space available for the experiments.

Contact: Bas Terwijn