Show simple item record

dc.contributor.author: Azzi, Charbel
dc.date.accessioned: 2022-05-18 15:03:38 (GMT)
dc.date.available: 2022-05-18 15:03:38 (GMT)
dc.date.issued: 2022-05-18
dc.date.submitted: 2022-05-09
dc.identifier.uri: http://hdl.handle.net/10012/18293
dc.description.abstract: Motion estimation is essential for many applications, such as robotics, automation, and augmented reality. The low-cost sensors commonly used for motion estimation have significant shortcomings. Event cameras are a recent stream of imaging sensor technology characterized by low latency, high dynamic range, low power consumption, and high resilience to motion blur. These advantages give them the potential to fill some of the gaps left by other low-cost motion sensors, offering alternatives for motion estimation that are worth exploring. Current event-based approaches estimate motion by assuming that events in a neighborhood encode the local structure of the imaged scene and then tracking the evolution of this structure over time. This is problematic because events are only an approximation of the local structure, and that approximation can be very sparse in some cases. In this thesis, we tackle the problem in a fundamentally different way by considering that events generated by the motion of the same scene point relative to the camera constitute an event track. We show that consistency with a single camera motion is sufficient for correct data association of events with their previous firings along event tracks, resulting in more accurate and robust motion estimation. To this end, we present new voting-based solutions that evaluate all potential data-association candidates consistent with a single camera motion, handling each event individually without assuming any relationship to its neighbors beyond the shared camera motion. We first exploit this in a particle filtering framework for the simple case of a camera undergoing planar motion, and show that our approach can yield motion estimates an order of magnitude more accurate than optical-flow-based approaches. Furthermore, we show that the consensus-based approach can be extended to the case of arbitrary camera motion and unknown scene depth. Our general-motion framework significantly outperforms other approaches in accuracy and robustness. (An illustrative sketch of the consensus idea appears after this record.)
dc.language.iso: en
dc.publisher: University of Waterloo
dc.subject: Optical Flow
dc.subject: Egomotion
dc.subject: Motion Estimation
dc.subject: Event Camera
dc.subject: Planar Motion
dc.subject: General Motion
dc.title: Asynchronous Optical Flow and Egomotion Estimation from Address Events Sensors
dc.type: Doctoral Thesis
dc.pending: false
uws-etd.degree.department: Systems Design Engineering
uws-etd.degree.discipline: Systems Design Engineering
uws-etd.degree.grantor: University of Waterloo
uws-etd.degree: Doctor of Philosophy
uws-etd.embargo.terms: 0
uws.contributor.advisor: Abdel-Rahman, Eihab
uws.contributor.advisor: Fakih, Adel
uws.contributor.affiliation1: Faculty of Engineering
uws.published.city: Waterloo
uws.published.country: Canada
uws.published.province: Ontario
uws.typeOfResource: Text
uws.peerReviewStatus: Unreviewed
uws.scholarLevel: Graduate
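
Illustrative sketch. The abstract describes a consensus idea: every event, taken individually, votes for the camera-motion hypotheses under which it can be associated with an earlier firing along the same event track. Below is a minimal, self-contained Python sketch of that idea for the planar-motion case only. It assumes a simplified 2D rigid flow model and synthetic events; all function names, parameters, and the toy data are illustrative assumptions, not taken from the thesis.

    import numpy as np

    rng = np.random.default_rng(0)

    def flow(params, xy):
        # Image-plane velocity under a simplified planar (2D rigid) motion
        # model: params = (vx, vy, omega). This model is assumed for
        # illustration only; it is not the thesis's camera model.
        vx, vy, om = params
        x, y = xy[..., 0], xy[..., 1]
        return np.stack([vx - om * y, vy + om * x], axis=-1)

    def event_weight(params, event, history, sigma=0.5):
        # Score one motion hypothesis against one event: advect each recent
        # event forward by the hypothesized flow over the elapsed time and
        # check whether any lands near the new event. A close match means
        # the new event plausibly continues an existing event track under
        # this camera motion.
        xy, t = event[:2], event[2]
        pred = history[:, :2] + flow(params, history[:, :2]) * (t - history[:, 2:3])
        d2 = np.sum((pred - xy) ** 2, axis=1)
        return np.exp(-d2.min() / (2.0 * sigma ** 2))

    def particle_filter(events, n_particles=200, window=50):
        # Consensus-style filtering: every event individually re-weights
        # all motion hypotheses; no spatial neighborhood is assumed.
        particles = rng.normal(0.0, 2.0, size=(n_particles, 3))
        weights = np.full(n_particles, 1.0 / n_particles)
        recent = []
        for ev in events:
            if recent:
                hist = np.array(recent[-window:])
                weights *= np.array([event_weight(p, ev, hist) for p in particles])
                weights += 1e-12
                weights /= weights.sum()
                # Resample with jitter when the effective sample size collapses.
                if 1.0 / np.sum(weights ** 2) < n_particles / 2:
                    idx = rng.choice(n_particles, n_particles, p=weights)
                    particles = particles[idx] + rng.normal(0.0, 0.05, (n_particles, 3))
                    weights = np.full(n_particles, 1.0 / n_particles)
            recent.append(ev)
        return np.average(particles, weights=weights, axis=0)

    # Toy check: synthetic events from scene points drifting under a known
    # planar motion (linearized for small time steps).
    true_motion = np.array([1.0, -0.5, 0.2])
    points = rng.uniform(-5.0, 5.0, size=(15, 2))
    events = []
    for t in np.linspace(0.0, 1.0, 30):
        for p in points:
            q = p + flow(true_motion, p) * t
            events.append([q[0], q[1], t])
    print("estimate:    ", particle_filter(np.array(events)))
    print("ground truth:", true_motion)

Note that the filter never groups events into spatial neighborhoods: each event re-weights every motion hypothesis on its own, mirroring the abstract's premise that consistency with a single camera motion is the only relationship assumed between events. On this toy data, only coarse estimates should be expected.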

