Event-Driven Single-Photon Detectors and Sensors for ToF LiDAR Systems
Description
This Thesis focuses on conceiving, developing, and implementing a novel vision sensor in which Single-Photon Avalanche Diodes (SPADs) and an event-driven readout converge in a scalable architecture. The departure from conventional frame-based architectures represents a paradigm shift in SPAD-based sensors, aiming to alleviate the prevalent challenge of handling and processing extensive data volumes. By transmitting only meaningful data, the sensor eases storage and processing requirements, an advantage for applications such as augmented reality and autonomous driving. The proposed sensor embeds 2D and 3D imaging capabilities. Its operating principle also aligns with the paradigm of dynamic vision, introducing a discrete version of this paradigm that fits the operation of SPADs and supports motion detection with reduced data transmission.

The research begins by introducing the concept of an event-driven camera system specifically designed and tailored for Light Detection And Ranging (LiDAR) applications. This camera system encompasses the optical emitter, the receiver (vision sensor), and the auxiliary circuitry required for autonomous operation. It directly processes information through events received from the vision sensor. Among its functionalities, the camera system can dynamically adjust the sensitivity of individual pixels based on their absolute intensity values and detect intensity variations by analyzing the temporal information conveyed through these events. The discrete arrival of photons poses challenges for circuits performing motion detection, which motivates introducing the concept of a discrete dynamic vision sensor, whose behavior and metrics are analyzed in the Thesis.
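The discrete dynamic-vision idea described above can be illustrated with a minimal simulation: per-window SPAD photon counts are Poisson-distributed, and a pixel emits an ON/OFF event only when the current count departs from its last reference count by more than a threshold. This is a hedged sketch of the general principle, not the circuit proposed in the Thesis; all names (`photon_counts`, `change_events`, `threshold`) are illustrative.

```python
import math
import random

def photon_counts(rate_hz, window_s, n_windows, seed=0):
    """Simulate per-window SPAD photon counts as Poisson samples."""
    rng = random.Random(seed)
    mean = rate_hz * window_s

    def poisson(lam):
        # Knuth's algorithm; adequate for small means.
        limit = math.exp(-lam)
        k, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    return [poisson(mean) for _ in range(n_windows)]

def change_events(counts, threshold):
    """Emit (+1/-1, window index) events when a window's count departs
    from the last reference count by more than `threshold`, then update
    the reference (an event-driven change detector on discrete counts)."""
    events = []
    ref = counts[0]
    for i, c in enumerate(counts[1:], start=1):
        if c - ref > threshold:
            events.append((+1, i))
            ref = c
        elif ref - c > threshold:
            events.append((-1, i))
            ref = c
    return events
```

Because the counts are Poisson samples, the threshold trades false events caused by photon shot noise against sensitivity to genuine intensity changes.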
Theoretical models proposed in this study are evaluated against experimental data, showcasing a methodology for true event-driven dynamic vision with single-photon detectors that is suited to a digital implementation whose behavior would be limited only by photon shot noise. The Thesis then explores the implementation of the integrated circuit housing the vision sensor, elaborating on the pixel concept and its diverse implementations. A detailed discussion covers their respective advantages and drawbacks and suggests enhancements to augment and broaden the functionality of the pixel. Validation of the Thesis proposals follows a bottom-up characterization of the system, from the evaluation of SPAD devices to electrical verification of the sensor. This comprehensive process culminates in validating the sensor under controlled laboratory conditions and in real-world scenarios. Particularly notable is its validation in an astronomy application: the sensor's event-driven nature enabled the extraction of information at the single-photon level from the occultation of Betelgeuse by the asteroid Leona with microsecond resolution, an unprecedented result in the field. This substantial advancement surpasses the temporal resolution of conventional cameras, which is limited by their frame rate.

In summary, this dissertation describes a SPAD-based event-driven architecture that lays the foundation for future research in single-photon sensors. Its functionality and versatility are assessed through extensive experimental validation, setting the stage for prospective advancements and demonstrating remarkable potential to revolutionize single-photon imaging technology.
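The 3D (ToF) imaging capability mentioned above commonly reduces, in direct time-of-flight LiDAR, to histogramming photon arrival times relative to the laser pulse and converting the peak bin to distance via d = c·t/2. The following is a generic sketch of that principle under stated assumptions, not the Thesis's actual processing chain; the function and parameter names are hypothetical.

```python
# Hypothetical direct-ToF range estimation from SPAD timestamps:
# histogram arrival times and convert the dominant bin to distance.

C = 299_792_458.0  # speed of light, m/s

def range_from_timestamps(timestamps_s, bin_width_s, n_bins):
    """Return the estimated target distance (metres) from photon
    arrival times (seconds, relative to the laser pulse)."""
    hist = [0] * n_bins
    for t in timestamps_s:
        b = int(t / bin_width_s)
        if 0 <= b < n_bins:
            hist[b] += 1  # accumulate photons per time bin
    peak = max(range(n_bins), key=lambda b: hist[b])
    t_peak = (peak + 0.5) * bin_width_s  # centre of the peak bin
    return C * t_peak / 2.0  # halve the round-trip time
```

With a 1 ns bin width, the depth resolution of this naive estimator is about 15 cm; real systems refine the peak estimate and reject background photons statistically.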
Additional details
- URL
- https://idus.us.es/handle//11441/163382
- URN
- urn:oai:idus.us.es:11441/163382
- Origin repository
- USE