Published September 17, 2024 | Version v1
Publication

Spiking monocular event based 6D pose estimation for space application

Description

With the growing interest in On-Orbit Servicing (OOS) and Active Debris Removal (ADR) missions, spacecraft pose estimation algorithms are being developed using deep learning to improve the precision of this complex task and to find the most efficient solution. Given the advances in bio-inspired low-power solutions, such as spiking neural networks and event-based cameras, and their recent application to space problems, we investigate the feasibility of a fully event-based approach to spacecraft pose estimation. In this paper, we work with SEENIC, the first event-based dataset for this use case, which contains real event frames captured by an event-based camera on a testbed. We present the methods and results of the first event-based solution for this task: our Small Spiking End-to-End Network (S2E2) achieves promising results, with a position error of 0.21 m and a rotation error of 14.3°, a first step towards fully event-based processing for spacecraft pose estimation.
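The two reported figures correspond to the two components of a 6D pose: translation (in metres) and rotation (in degrees). As a rough illustration only, and not the paper's own evaluation code, the sketch below shows one common way such errors are computed, assuming predicted and ground-truth poses are given as a 3D position vector plus a unit quaternion; all function and variable names are hypothetical.

```python
import numpy as np

def position_error(t_pred, t_gt):
    """Euclidean distance between predicted and ground-truth positions (metres)."""
    return np.linalg.norm(np.asarray(t_pred) - np.asarray(t_gt))

def rotation_error_deg(q_pred, q_gt):
    """Angular distance between two unit quaternions (degrees).

    Uses the geodesic distance on SO(3): theta = 2 * arccos(|<q_pred, q_gt>|).
    """
    q_pred = np.asarray(q_pred) / np.linalg.norm(q_pred)
    q_gt = np.asarray(q_gt) / np.linalg.norm(q_gt)
    dot = np.clip(abs(np.dot(q_pred, q_gt)), -1.0, 1.0)
    return np.degrees(2.0 * np.arccos(dot))

# Hypothetical example: a prediction 0.21 m away and 14.3 degrees off.
t_gt, q_gt = [0.0, 0.0, 5.0], [1.0, 0.0, 0.0, 0.0]        # identity rotation
t_pred = [0.21, 0.0, 5.0]
theta = np.radians(14.3)
q_pred = [np.cos(theta / 2), np.sin(theta / 2), 0.0, 0.0]  # 14.3 deg about the x-axis

print(position_error(t_pred, t_gt))      # ~0.21
print(rotation_error_deg(q_pred, q_gt))  # ~14.3
```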


Additional details

Created:
September 24, 2024
Modified:
September 24, 2024