MS Defense: Inertially Constrained Ruled Surfaces for Egomotion Estimation
IRB-4105
https://umd.zoom.us/j/5037331890?pwd=ZDN2TTlhbkNVYzQwYmlSeXRnV29KZz09&omn=97024016232
Abstract:
In computer vision, camera egomotion is typically estimated with visual odometry techniques, which rely on extracting features from a sequence of images and computing the optical flow. This, however, usually requires point-to-point correspondences between consecutive frames, which can be costly to compute, and whose varying accuracy greatly affects the quality of the estimated motion. Attempts have been made to bypass the difficulties arising from the correspondence problem by adopting line features and by fusing other sensors (event cameras, IMUs) to improve performance, yet many of these approaches still rely heavily on feature detectors. If the camera observes a straight line as it moves, the image of that line sweeps out a surface; this is a ruled surface, and analyzing its shape gives information about the egomotion. Inspired by event cameras' capabilities in edge detection, this research presents a novel algorithm to reconstruct 3D scenes with ruled surfaces, from which the camera egomotion is computed simultaneously. By constraining the egomotion with inertial measurements from an onboard IMU, the dimensionality of the solution space is greatly reduced.
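For reference, a ruled surface admits the standard textbook parameterization sketched below; the symbols c(t), d(t), and s are illustrative and not notation taken from the thesis.

$$
S(t, s) = \mathbf{c}(t) + s\,\mathbf{d}(t), \qquad s \in \mathbb{R},
$$

where \mathbf{c}(t) traces a curve (the directrix) and \mathbf{d}(t) gives the direction of the straight ruling through it. In the setting described above, each ruling can be thought of as the instantaneous image of the observed 3D line as the camera moves, so the shape of the swept surface carries information about the camera's trajectory.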