Low-cost unmanned aerial vehicles (UAVs) have great potential for applications in the civil sector. Cheap inertial sensors alone cannot provide the accuracy required for UAV control and navigation; additional sensors and sensor fusion techniques are needed to reduce the state estimation error. Flying insects compensate for inaccurate inertial sensing by exploiting the optical flow field. Because local image measurements underdetermine the flow (the aperture problem), optical flow cannot be computed locally. Methods such as Horn-Schunck impose a global smoothness constraint, but require significant computational effort. To simplify recovering the observer's motion from image sensor data, only rotation and translation relative to a plane are considered. A Kalman filter architecture in direct formulation is proposed that fuses inertial sensor data with data derived from the image sensor. Simulation results show that this architecture reduces the state estimation error. For further investigation, an internal project is proposed to test the approach on a four-rotor flying robot. The aim is to increase flight stability and allow hovering without the need for specified landmarks.
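The fusion idea described above can be sketched with a minimal linear Kalman filter: a noisy inertial measurement drives the prediction step, and a velocity estimate such as one derived from optical flow over a planar surface corrects it. This is an illustrative sketch only, not the filter from the talk; the 1-D state, the constant-velocity model, and all noise levels are assumptions chosen for demonstration.

```python
import numpy as np

# Illustrative 1-D sketch (assumed models, not the talk's filter):
# state x = [position, velocity]^T, predicted from inertial acceleration,
# corrected by a velocity measurement derived from optical flow.
dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
B = np.array([[0.5 * dt**2], [dt]])     # control input: measured acceleration
H = np.array([[0.0, 1.0]])              # optical flow observes velocity only
Q = 1e-3 * np.eye(2)                    # process noise (inertial sensor drift)
R = np.array([[1e-2]])                  # measurement noise (flow estimate)

def predict(x, P, accel):
    """Propagate the state with the (noisy) inertial acceleration."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, flow_velocity):
    """Correct the prediction with the optical-flow-derived velocity."""
    y = np.array([[flow_velocity]]) - H @ x   # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# One predict/update cycle with made-up sensor readings.
x = np.zeros((2, 1))
P = np.eye(2)
x, P = predict(x, P, accel=0.1)
x, P = update(x, P, flow_velocity=0.05)
```

After the update, the velocity estimate is pulled strongly toward the optical-flow measurement because the prior is still uncertain; as the covariance P shrinks over repeated cycles, the filter weights the inertial prediction more.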
State Estimation Using Optical Flow and Inertial Sensor Data
Venue: Room Seminarraum 117, Robert-Hooke-Str. 5, Bremen
As a rule, the talks are part of lecture series at the University of Bremen and are not open to the public. If you are interested in attending, please contact the secretariat at sek-ric(at)dfki.de.