DHRS CIM
Latest revision as of 23:18, 29 September 2010
The DHRS-CIM project
Distributed Human-Robot System for Chemical Incident Management
- Disclaimer: To the best of our knowledge, the information provided herein is correct at the time of publication. The views or claims expressed in external links provided herein are not endorsed by Sheffield Hallam University. If you find any mistakes or omissions, please contact MMVL (details at the main MMVL wiki-page).
Aim and Objectives
The overall aim of this project is knowledge exchange between industrial and academic organizations in the field of Disaster Management, working towards implementing and prototyping an intelligent decision support system that helps humans manage chemical incidents.
This system can be described as a Distributed Human-Robot System for Chemical Incident Management (DHRS-CIM). It will combine ground robots with Micro Unmanned Aerial Vehicles (MAVs): the ground robots may be sensor-rich vehicles for ground inspection, whilst the MAVs carry a reduced sensor set (e.g. a chemical sensor and a stereo camera) that aids in building an overhead map of the ground.
The ultimate objectives are to:
- (1) devise an intelligent system which can be deployed to detect (potential) chemical incidents,
- (2) help crisis professionals to make informed decisions, and to
- (3) manage chemical incidents via effective data gathering and representation.
MMVL is predominantly involved in WP1 (Human / Robot sensing and decision support), particularly in environment exploration, active perception, and visualisation. The focus will be on the artificial perception and human interaction systems, integrating sensor information into an intelligent collaborative system for human decision support.
Partners
Coordinator
Academic partners involved
Industrial partners involved
Links
- GRASP Lab, University of Pennsylvania (http://www.youtube.com/watch?v=MvRTALJp8DM)
- GRASP Lab, ROS UAV mapping (http://www.youtube.com/watch?v=TjQPHprBTPs&feature=player_embedded)
- MIT MAV team (http://groups.csail.mit.edu/rrg/mit-mav/videos.shtml)
- ETH Open Source Micro Aerial Vehicle (http://pixhawk.ethz.ch/)
Reference material
- DVB driver subsystem
- Driver for the RF Syntek camera interface
- Range Video: Aerial Video System 2.4 GHz, 1000 mW (<3 km transmission range); Range Video: Aerial Video Systems
- EasyCAP: USB video capture device
- HornetsEye: Towards Interpreted Real-time Computer Vision; HornetsEye: V4L2
- Distributed Ruby
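The last reference, Distributed Ruby (dRuby), is Ruby's standard-library mechanism for transparent remote method calls, and fits a distributed human-robot setup in which MAVs and ground robots publish sensor readings to a shared hub queried by operator consoles. The sketch below is illustrative only: the `SensorHub` class, its methods, and the reading format are hypothetical names invented for this example, not part of the project.

```ruby
require 'drb/drb'

# Hypothetical hub: robots push readings, operator consoles query them.
class SensorHub
  def initialize
    @readings = []
  end

  # Record one reading from a robot (e.g. a chemical-sensor sample).
  def report(robot_id, kind, value)
    @readings << { robot: robot_id, kind: kind, value: value }
    true
  end

  # Return the most recent reading of the given kind, or nil.
  def latest(kind)
    @readings.reverse.find { |r| r[:kind] == kind }
  end
end

# Server side: expose the hub over dRuby. Port 0 picks a free port;
# a real deployment would use a fixed host and port.
DRb.start_service('druby://localhost:0', SensorHub.new)

# Client side (would normally run on another machine): obtain a proxy
# and call methods on it as if the object were local.
hub = DRbObject.new_with_uri(DRb.uri)
hub.report('mav-1', :chemical, 0.42)
puts hub.latest(:chemical)[:value]   # prints 0.42
```

dRuby marshals arguments and return values over TCP, so plain hashes and symbols travel unchanged; large payloads such as camera frames would be better sent over a dedicated link (e.g. the analog video system listed above), with only metadata passing through the hub.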