BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Date iCal//NONSGML kigkonsult.se iCalcreator 2.20.2//
METHOD:PUBLISH
X-WR-CALNAME;VALUE=TEXT:Eventi DIAG
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:STANDARD
DTSTART:20141026T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20140330T020000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:calendar.6918.field_data.0@www.u-gov-ricerca.uniroma1.it
DTSTAMP:20260405T115709Z
CREATED:20140630T095257Z
DESCRIPTION:The ability to build high-fidelity 3D representations of the en
 vironment from sensor data is critical for autonomous robots. Multi-senso
 r data fusion allows for more complete and accurate representations. Furt
 hermore\, using distinct sensing modalities (i.e.\, sensors using a diffe
 rent physical process and/or operating at different electromagnetic frequ
 encies) usually leads to more reliable perception\, especially in challen
 ging environments\, as modalities may complement each other. However\, th
 ey may react differently to certain materials or environmental conditions
 \, leading to catastrophic fusion. In this presentation\, we propose a ne
 w method to reliably fuse data from multiple sensing modalities\, includi
 ng in situations where they detect different targets. We first compute di
 stinct continuous surface representations for each sensing modality\, wit
 h uncertainty\, using Gaussian Process Implicit Surfaces (GPIS). Second\,
  we perform a local consistency test between these representations to sep
 arate consistent data (i.e.\, data corresponding to the detection of the
  same target by the sensors) from inconsistent data. The consistent data
  can then be fused together using another GPIS process\, and the rest of
  the data can be combined as appropriate. We will show that in challengin
 g environmental conditions\, where differences of perception between dist
 inct sensing modalities are common\, our proposed method with its integra
 ted consistency test avoids catastrophic fusion and thereby greatly impro
 ves object representations\, in terms of both accuracy and certainty.
DTSTART;TZID=Europe/Paris:20140724T110000
DTEND;TZID=Europe/Paris:20140724T110000
LAST-MODIFIED:20140723T114709Z
LOCATION:Aula Magna DIS
SUMMARY:Robust Multiple-Sensing-Modality Data Fusion for Reliable Perceptio
 n - Marcos Paul Gerardo Castro
URL;VALUE=URI:http://www.u-gov-ricerca.uniroma1.it/node/6918
END:VEVENT
END:VCALENDAR
