SmokeBot Summary

SmokeBot - Solutions for Robots Facing Restricted Visibility

SmokeBot is driven by the application needs of robots that operate in domains with restricted visibility. The focus is on civil robots supporting fire brigades in search and rescue missions, e.g. in post-disaster management operations in response to tunnel fires. Existing sensor technology and the related cognitive approaches cannot cope with such demanding conditions. SmokeBot addresses this shortcoming and can thus bring about a step change for robotics. It will deliver software and hardware components which enable robot systems to perform under harsh conditions of smoke, dust or fog.

SmokeBot - An Instance of a Low Visibility Explorer Robot

The ability to operate under low visibility conditions will be demonstrated by integrating the project results into an industrial prototype of a Low Visibility Explorer Robot that provides situational awareness, based on a commercial platform from partner taurob. In close collaboration with the Fire Department Dortmund and other end users in the advisory group, SmokeBot will crucially improve the abilities of the selected platform, thus increasing the safety of rescue staff and European citizens.

SmokeBot - Development of a Novel RGT-V Sensor Unit

An even wider impact is expected through the development of a novel sensor unit and the corresponding cognitive approaches. In addition to traditional sensors such as LIDAR and cameras, which are affected by smoke or dust, this sensing unit also includes a novel 3D radar camera, a stereo thermal camera, and high-bandwidth gas sensors. Fusion of sensor modalities in an RGT-V unit (radar, gas sensors, thermal cameras, plus vision) will allow measurements from LIDAR and camera to be included in the world model on the occasions when they penetrate through smoke or dust. In addition, we will develop the means to integrate prior knowledge in the form of crude human sketch maps to allow for robust mapping and navigation even under conditions of low visibility. Sensor technology from SmokeBot will result in new products to be brought to market after the project. The software developed will be made available as open source.

SmokeBot Objectives

SmokeBot has three high-level objectives: hardware and software development of a novel sensor suite for low visibility conditions ([O1]), a set of perception modules to enhance the cognitive abilities of mobile robots, especially for use in disaster response ([O2]), and integration of the project results in a prototype for a commercial Low Visibility Explorer Robot ([O3]).

[O1] Perception with a Novel, Multimodal RGT-V Sensor Unit for Low Visibility Conditions

In SmokeBot we will develop perception algorithms in step with improvements to the corresponding sensing hardware. A key result of the project will be a novel sensing unit that combines a high-resolution 3D radar with a thermal camera (both of which can penetrate smoke or dust), with gas sensors (which are able to sense chemical compounds in the air that are potentially dangerous for people), and with vision sensors: a camera and a 3D laser range finder. The particular combination of rather unconventional radar, gas and thermal sensing with traditional vision into one RGT-V unit and the development of corresponding perception algorithms constitute a significant contribution to robotics. Our general approach in SmokeBot will be to start the development of perception algorithms with available state-of-the-art sensors and to integrate the improved hardware during the project. The radar and gas sensors will be improved specifically for use on mobile robots.
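
As an illustration of how such a unit might expose its data to the perception modules, the following is a minimal sketch of one time-synchronized RGT-V frame. The field names and array shapes are assumptions for illustration only, not the actual sensor interfaces.

```python
# Minimal sketch of a data structure for one time-synchronized RGT-V frame
# (Radar, Gas, Thermal, Vision). Field names and shapes are illustrative
# assumptions, not the project's actual sensor interfaces.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class RGTVFrame:
    stamp: float                   # common timestamp [s]
    radar_depth: np.ndarray        # dense 3D radar range image [m]
    gas_ppm: dict                  # e.g. {"CO": 35.0, "CH4": 1.2}
    thermal: np.ndarray            # thermal image [deg C]
    rgb: Optional[np.ndarray]      # camera image, None if fully obscured
    lidar_points: np.ndarray       # (N, 3) points that made it through the smoke

if __name__ == "__main__":
    frame = RGTVFrame(
        stamp=12.34,
        radar_depth=np.full((64, 64), 4.0),
        gas_ppm={"CO": 35.0},
        thermal=np.full((480, 640), 42.0),
        rgb=None,                                # camera blinded by smoke
        lidar_points=np.empty((0, 3)),
    )
    print("camera usable:", frame.rgb is not None)
```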

Objective [O1] includes two sub-objectives: [O1a] High Resolution 3D Radar Perception and [O1b] High-Bandwidth Mobile Robot Gas Perception.

[O2] Perception and Cognitive Abilities for Mobile Robots in Low Visibility Scenarios

In SmokeBot we will exploit the complementary characteristics of different sensor modalities to develop sensor-fusion-based mapping approaches, which make it possible to build detailed models of the environment despite limited visibility. To render high-resolution maps of the environment, the robot will integrate sparse patches, delivered by the traditional sensors in robotics (laser range finder and camera) when they glimpse through smoke or dust, with radar depth measurements.
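
The following is a minimal sketch of the underlying idea: per-cell inverse-variance fusion of a dense but noisy radar depth image with sparse but accurate LIDAR returns. The grid size, noise figures and weighting scheme are illustrative assumptions, not the project's actual fusion method.

```python
# Minimal sketch of fusing sparse, high-accuracy LIDAR patches with dense,
# lower-accuracy radar depth into one grid. Grid size, noise figures and
# the weighting are illustrative assumptions only.
import numpy as np

GRID = (100, 100)          # cells
RADAR_SIGMA = 0.30         # assumed radar range noise [m]
LIDAR_SIGMA = 0.03         # assumed LIDAR range noise [m]

def fuse(radar_depth, lidar_depth, lidar_valid):
    """Per-cell inverse-variance fusion.

    radar_depth : (H, W) radar range estimate for every cell
    lidar_depth : (H, W) LIDAR range, meaningful only where lidar_valid
    lidar_valid : (H, W) bool mask, True where the LIDAR beam penetrated
                  the smoke and returned a usable measurement
    """
    w_radar = np.full(GRID, 1.0 / RADAR_SIGMA**2)
    w_lidar = np.where(lidar_valid, 1.0 / LIDAR_SIGMA**2, 0.0)
    return (w_radar * radar_depth + w_lidar * lidar_depth) / (w_radar + w_lidar)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = np.full(GRID, 5.0)                          # flat wall at 5 m
    radar = truth + rng.normal(0, RADAR_SIGMA, GRID)    # noisy but dense
    lidar = truth + rng.normal(0, LIDAR_SIGMA, GRID)    # precise but sparse
    valid = rng.random(GRID) < 0.1                      # 10 % of beams get through
    fused = fuse(radar, lidar, valid)
    print("radar-only RMSE:", np.sqrt(np.mean((radar - truth) ** 2)))
    print("fused RMSE:     ", np.sqrt(np.mean((fused - truth) ** 2)))
```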

Based on input from thermal cameras and range sensors, we will develop methods to accurately map surface temperatures onto 3D structures in the environment. Sensor fusion will be crucial for handling reflections and gas radiation. We will also investigate how thermal imaging can aid localization, making the developed SLAM approach more robust in feature-sparse environments.
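
As a simple illustration of the temperature-mapping step, the sketch below projects 3D points from a range sensor into a thermal image with a pinhole model and reads off per-point temperatures. The intrinsics, extrinsics and image size are placeholder values, and the sketch ignores the reflection and gas-radiation effects mentioned above.

```python
# Minimal sketch of mapping thermal-camera temperatures onto 3D points via a
# pinhole projection. Intrinsics, extrinsics and image size are placeholders.
import numpy as np

K = np.array([[400.0, 0.0, 320.0],     # assumed thermal-camera intrinsics
              [0.0, 400.0, 240.0],
              [0.0, 0.0, 1.0]])
T_CAM_FROM_ROBOT = np.eye(4)           # assumed extrinsic calibration
IMG_W, IMG_H = 640, 480

def temperatures_for_points(points_robot, thermal_image):
    """Return per-point temperatures (NaN where the point is not visible)."""
    n = points_robot.shape[0]
    homog = np.hstack([points_robot, np.ones((n, 1))])
    pts_cam = (T_CAM_FROM_ROBOT @ homog.T).T[:, :3]
    temps = np.full(n, np.nan)
    in_front = pts_cam[:, 2] > 0.1
    uvw = (K @ pts_cam[in_front].T).T
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)
    inside = (u >= 0) & (u < IMG_W) & (v >= 0) & (v < IMG_H)
    idx = np.flatnonzero(in_front)[inside]
    temps[idx] = thermal_image[v[inside], u[inside]]
    return temps

if __name__ == "__main__":
    cloud = np.array([[0.0, 0.0, 2.0], [0.5, 0.2, 3.0], [0.0, 0.0, -1.0]])
    image = np.full((IMG_H, IMG_W), 25.0)    # fake 25 deg C thermal frame
    image[200:280, 280:360] = 300.0          # hot spot
    print(temperatures_for_points(cloud, image))
```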

Simultaneous gas discrimination and distribution mapping of several compounds is a very challenging task due to the combination of slow sensor response and fast concentration fluctuations in real-world environments. Through the development of high-bandwidth gas sensors, this issue will become manageable during the project. We will also develop multi-compound gas distribution mapping approaches that use input from the gas sensors and a wind sensor carried by the robot. The output will be gas distribution and wind distribution models. We will further investigate means to identify the type of gas source (area or point source) and the areas in which gas sources are likely located, using the gas distribution and wind maps.
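
One common way to extrapolate sparse gas readings to a map is kernel-based gas distribution mapping, where each point measurement is smeared over a grid with a kernel stretched along the local wind direction. The sketch below illustrates this idea for a single compound; the kernel width, grid size and wind-stretch factor are illustrative assumptions only, not the multi-compound approach developed in the project.

```python
# Minimal sketch of kernel-based gas distribution mapping with a wind-
# elongated Gaussian kernel. All parameters are illustrative assumptions.
import numpy as np

CELL = 0.25                 # grid resolution [m]
SIZE = (40, 40)             # grid cells (10 m x 10 m area)
SIGMA = 0.5                 # base kernel width [m]
WIND_STRETCH = 2.0          # how much the kernel elongates downwind

def gas_map(samples):
    """samples: list of (x, y, concentration, wind_vx, wind_vy)."""
    weights = np.zeros(SIZE)
    weighted_conc = np.zeros(SIZE)
    ys, xs = np.meshgrid(np.arange(SIZE[0]) * CELL,
                         np.arange(SIZE[1]) * CELL, indexing="ij")
    for x, y, c, wx, wy in samples:
        speed = np.hypot(wx, wy)
        ex, ey = (wx / speed, wy / speed) if speed > 1e-6 else (1.0, 0.0)
        dx, dy = xs - x, ys - y
        along = dx * ex + dy * ey                  # distance along the wind
        across = -dx * ey + dy * ex                # distance across the wind
        sigma_along = SIGMA * (1.0 + WIND_STRETCH * speed)
        w = np.exp(-0.5 * ((along / sigma_along) ** 2 + (across / SIGMA) ** 2))
        weights += w
        weighted_conc += w * c
    return np.where(weights > 1e-3, weighted_conc / weights, 0.0)

if __name__ == "__main__":
    # Two readings downwind of a source near (2, 2), wind blowing in +x.
    m = gas_map([(2.0, 2.0, 80.0, 0.5, 0.0), (4.0, 2.0, 30.0, 0.5, 0.0)])
    print("peak concentration estimate:", round(m.max(), 1))
```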

All measurements, in particular the high-resolution environment model, the thermal map and the gas distribution map, will be fused and integrated with a priori information into a General Disaster Model. As the central data structure, this model will inherently support different levels of granularity. This enables the system to automatically adapt to restricted bandwidth conditions when communicating with the mission experts during an operation. Based on the General Disaster Model, we will further develop means to predict hazards for decision support for mission experts and human operators.
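
To illustrate how multiple levels of granularity can support bandwidth adaptation, the sketch below serves a map layer at the finest resolution that still fits a given transmission budget. The downsampling scheme and the thresholds are assumptions for illustration, not the project's actual data model.

```python
# Minimal sketch of serving a map layer at a granularity that matches the
# available bandwidth. Downsampling scheme and budgets are illustrative.
import numpy as np

def coarsen(grid, factor):
    """Average-pool a 2-D grid by an integer factor."""
    h, w = grid.shape
    h2, w2 = h // factor * factor, w // factor * factor
    return (grid[:h2, :w2]
            .reshape(h2 // factor, factor, w2 // factor, factor)
            .mean(axis=(1, 3)))

def layer_for_bandwidth(grid, bytes_per_cell, budget_bytes):
    """Return the finest layer whose serialized size fits the budget."""
    for factor in (1, 2, 4, 8, 16):
        layer = coarsen(grid, factor)
        if layer.size * bytes_per_cell <= budget_bytes:
            return factor, layer
    return 16, coarsen(grid, 16)

if __name__ == "__main__":
    temperature_map = np.random.default_rng(1).uniform(20, 600, (256, 256))
    factor, layer = layer_for_bandwidth(temperature_map, bytes_per_cell=4,
                                        budget_bytes=20_000)
    print("chosen downsampling factor:", factor, "- cells sent:", layer.size)
```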

The General Disaster Model will also be used to select suitable locations for further inspection. Suitable locations are those where the most informative sensor measurements are expected and where self-preservation of the robot is ensured. The selected locations will either be used directly in autonomous mode or suggested to the mission experts in semi-autonomous mode.
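
A minimal sketch of such a selection step, trading expected information gain against a risk penalty under a hard self-preservation limit, is shown below. The gain and risk models, the weights and the candidate names are illustrative assumptions.

```python
# Minimal sketch of choosing the next inspection location: maximize expected
# information gain, penalize risk and travel cost, and enforce a hard
# self-preservation limit. All numbers are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    expected_info_gain: float   # e.g. number of unknown cells visible from there
    heat_risk: float            # 0 (safe) .. 1 (robot likely damaged)
    travel_cost: float          # metres of driving to reach it

RISK_WEIGHT = 50.0
COST_WEIGHT = 0.5
MAX_ACCEPTABLE_RISK = 0.7       # hard self-preservation limit

def score(c: Candidate) -> float:
    return c.expected_info_gain - RISK_WEIGHT * c.heat_risk - COST_WEIGHT * c.travel_cost

def select(candidates):
    safe = [c for c in candidates if c.heat_risk <= MAX_ACCEPTABLE_RISK]
    return max(safe, key=score) if safe else None

if __name__ == "__main__":
    options = [
        Candidate("corridor junction", expected_info_gain=120, heat_risk=0.2, travel_cost=15),
        Candidate("burning room door", expected_info_gain=200, heat_risk=0.9, travel_cost=8),
        Candidate("stairwell",         expected_info_gain=60,  heat_risk=0.1, travel_cost=30),
    ]
    best = select(options)
    print("suggested next location:", best.name if best else "none (all too risky)")
```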

Information about the environment can be very sparse given the conditions in a disaster scenario and the limitations of the sensors. We will investigate in SmokeBot how sketch maps can be used as heterogeneous prior information for Simultaneous Localization and Mapping (SLAM) under these circumstances. Such sketch maps can be hand-drawn sketches by an operator, or scans of a fire evacuation plan, for example. The methods developed to relate sketch maps to the environment models created by the robot will also be used for human-robot communication.
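
As one simple illustration of anchoring a sketch map to the robot's metric map, the example below estimates a similarity transform (scale, rotation, translation) from a few corresponding landmarks, e.g. doorways marked in both maps. The correspondences are made up for illustration; the sketch-map interpretation developed in the project is considerably more involved.

```python
# Minimal sketch of registering a hand-drawn sketch map to a metric map by
# fitting a similarity transform (Umeyama-style closed form) to a few
# corresponding landmarks. Correspondences here are made up.
import numpy as np

def fit_similarity(sketch_pts, metric_pts):
    """Fit s, R, t such that metric ~= s * R @ sketch + t (2-D points)."""
    mu_s, mu_m = sketch_pts.mean(axis=0), metric_pts.mean(axis=0)
    S, M = sketch_pts - mu_s, metric_pts - mu_m
    U, D, Vt = np.linalg.svd(M.T @ S / len(sketch_pts))
    sign = np.sign(np.linalg.det(U @ Vt))
    R = U @ np.diag([1.0, sign]) @ Vt
    scale = (D * [1.0, sign]).sum() / S.var(axis=0).sum()
    t = mu_m - scale * R @ mu_s
    return scale, R, t

if __name__ == "__main__":
    sketch = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 2.0], [0.0, 2.0]])
    # Same landmarks in the metric map: scaled by 3, rotated 90 deg, shifted.
    a = np.pi / 2
    R_true = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    metric = 3.0 * sketch @ R_true.T + np.array([5.0, 1.0])
    s, R, t = fit_similarity(sketch, metric)
    print("estimated scale:", round(s, 3))
    print("sketch point (1, 2) maps to:", s * R @ np.array([1.0, 2.0]) + t)
```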

Objective [O2] includes six sub-objectives: [O2a] High-Resolution Environment Model through Sensor Fusion, [O2b] Thermal Camera Perception, [O2c] Multi-modal, Multi-Compound Gas Distribution Mapping and Gas Source Localization, [O2d] General Disaster Model, [O2e] Sensor Planning and Self-Preservation, and [O2f] Sketch Maps for Navigation and Human-Robot Communication.

[O3] Low Visibility Explorer Robot Prototype

The RGT-V sensor unit for low visibility conditions and the developed perception and cognition modules will be integrated into a prototype of a Low Visibility Explorer Robot. In close collaboration with the partner taurob, an end user (the Fire Department Dortmund) and the advisory board, the prototype will be designed especially for disaster scenarios that involve fire in enclosed spaces such as tunnels. The entire system will be made to withstand temperatures of up to 800 °C for five minutes, and special care will be taken to protect the sensor hardware for missions under harsh environmental conditions. In addition, communication with the robot is addressed through the development of deployable rugged wireless repeaters, which will form a mesh network with redundant links during a mission.

SmokeBot Work Packages

Nr.   Description                                             Responsible
WP1   3D Structure Perception with Limited Vision             FHR
WP2   Gas Perception and Mapping with High-Bandwidth Sensors  UWAR
WP3   Thermal Imaging                                         LUH
WP4   Situation Analysis                                      LUH
WP5   Autonomous Behavior                                     ORU
WP6   Human-Robot Interface                                   ORU
WP7   Requirements, Integration & Evaluation                  TAUR
WP8   Management                                              ORU
WP9   Dissemination and Exploitation                          ORU