Abstract
Any mobile agent, whether biological or robotic, needs to avoid collisions with obstacles. Insects such as bees and flies use optic flow to estimate their relative nearness to obstacles. Optic flow induced by ego-motion is composed of a translational and a rotational component; segregating the two is computationally, and thus energetically, expensive. Flies and bees instead separate the rotational and translational optic-flow components behaviourally, i.e., by employing a saccadic strategy of flight and gaze control. Although robotic systems are able to mimic this gaze strategy, computing optic-flow fields from standard camera images remains time- and energy-consuming. To overcome this problem, we use a dynamic vision sensor (DVS), which provides event-based information about changes in contrast over time at each pixel location. To extract optic flow from this information, a plane-fitting algorithm estimates the relative velocity within a small spatio-temporal cuboid of events. The depth structure is derived from the translational optic flow by using local properties of the retina. A collision avoidance direction is then computed from the event-based depth structure of the environment. The system has been successfully tested on a robotic platform in open loop.
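The plane-fitting step described in the abstract can be illustrated with a short sketch. The following Python/NumPy snippet is a minimal, hypothetical illustration of the general technique (in the style of event-based plane-fitting flow estimation), not the authors' implementation: events `(x, y, t)` falling inside a small spatio-temporal cuboid around a query event are fitted with a plane `t = a·x + b·y + c`, whose spatial gradient is the inverse of the local image velocity. The function names `fit_plane_flow` and `avoidance_direction`, and the parameters `r` and `dt`, are illustrative assumptions; the nearness-weighted avoidance vector is likewise only one plausible reading of "computing a collision avoidance direction from the depth structure".

```python
import numpy as np

def fit_plane_flow(events, cx, cy, ct, r=3, dt=0.05):
    """Illustrative sketch (not the paper's code): estimate local optic flow
    at (cx, cy, ct) by fitting a plane t = a*x + b*y + c to events inside a
    small spatio-temporal cuboid. `events` is an (N, 3) array of (x, y, t)
    rows. Returns (vx, vy) in pixels/second, or None if under-constrained."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    mask = (np.abs(x - cx) <= r) & (np.abs(y - cy) <= r) & (np.abs(t - ct) <= dt)
    pts = events[mask]
    if len(pts) < 4:              # need enough support for a stable fit
        return None
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    a, b = coeffs[0], coeffs[1]   # spatial gradients of the event-time surface
    g2 = a * a + b * b
    if g2 < 1e-12:                # flat surface -> no measurable motion
        return None
    # The gradient of the event-time surface is the inverse of image
    # velocity, so v = (a, b) / (a^2 + b^2), in pixels per second.
    return a / g2, b / g2

def avoidance_direction(nearness, azimuths):
    """Hypothetical sketch: sum nearness-weighted unit vectors over viewing
    directions and steer away from the resulting 'centre of mass' of
    nearby obstacles."""
    vx = np.sum(nearness * np.cos(azimuths))
    vy = np.sum(nearness * np.sin(azimuths))
    return np.arctan2(-vy, -vx)   # heading pointing away from near obstacles
```

The least-squares fit exploits the fact that, for a moving edge, DVS events trace out a locally planar surface in (x, y, t); the flatter the surface in time, the faster the motion, which is why the velocity is the inverse of the fitted gradient.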
Original language | English |
---|---|
Title of host publication | Proceedings of First International Conference on Event-Based Control, Communication and Signal Processing (EBCCSP 2015), 17-19 June 2015, Krakow, Poland |
Publisher | Institute of Electrical and Electronics Engineers |
Number of pages | 7 |
ISBN (Print) | 9781467378888 |
Publication status | Published - 2015 |
Event | International Conference on Event-Based Control, Communication and Signal Processing - Duration: 17 Jun 2015 → 19 Jun 2015 |
Conference
Conference | International Conference on Event-Based Control, Communication and Signal Processing |
---|---|
Period | 17/06/15 → 19/06/15 |
Keywords
- optical data processing
- robotics
- signal processing
- algorithms