Published February 27, 2020 | Version v1
Conference paper

Meta-parameters Exploration for Unsupervised Event-based Motion Analysis

Description

Being able to estimate motion features is an essential step in dynamic scene analysis. Optical flow typically quantifies the apparent motion of objects. Motion features can benefit from bio-inspired models of the mammalian retina, where ganglion cells show preferences for global patterns of direction, especially in the four cardinal translatory directions. We study the meta-parameters of a bio-inspired motion estimation model that uses event cameras, bio-inspired vision sensors that naturally capture the dynamics of a scene. The motion estimation model consists of an elementary Spiking Neural Network that learns the motion dynamics in an unsupervised way through Spike-Timing-Dependent Plasticity (STDP). After short simulation times, the model can successfully estimate directions without supervision. Advantages of such networks include their unsupervised and continuous learning capabilities, as well as their implementability on very low-power hardware. The model is tuned using a synthetic dataset generated for parameter estimation, made of various patterns moving in several directions. The parameter exploration shows that attention should be given to model tuning, yet the model is generally stable over meta-parameter changes.
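To make the approach concrete, below is a minimal Python sketch of the kind of model the description outlines: a single layer of leaky integrate-and-fire neurons receiving event-camera-style input and adapting its weights with a simplified STDP rule, trained on a synthetic dataset of bars moving in the four cardinal directions. This is not the authors' implementation; the network size, time constants, thresholds, learning rates, the winner-take-all competition, and the toy 8x8 event stream are all illustrative assumptions, not the meta-parameters explored in the paper.

import numpy as np

rng = np.random.default_rng(0)

H = W = 8                # toy sensor resolution (assumption)
N_IN = H * W             # one input channel per pixel (polarity ignored)
N_OUT = 4                # ideally one neuron per cardinal direction

TAU_M = 20.0             # membrane time constant, ms (assumed)
TAU_PRE = 15.0           # presynaptic trace time constant, ms (assumed)
V_TH = 1.5               # firing threshold (assumed)
A_PLUS, A_MINUS = 0.01, 0.012   # STDP potentiation / depression (assumed)

weights = rng.uniform(0.2, 0.8, size=(N_OUT, N_IN))

def synthetic_bar_events(direction, n_steps=W, dt=1.0):
    """Yield (time, active_pixel_indices) for a bar sweeping across the
    toy sensor in one of the four cardinal directions."""
    for step in range(n_steps):
        if direction == "right":
            pixels = [r * W + step for r in range(H)]
        elif direction == "left":
            pixels = [r * W + (W - 1 - step) for r in range(H)]
        elif direction == "down":
            pixels = [step * W + c for c in range(W)]
        else:  # "up"
            pixels = [(H - 1 - step) * W + c for c in range(W)]
        yield step * dt, np.array(pixels)

def run_trial(direction, dt=1.0):
    """One presentation of a moving bar; returns spike counts per output."""
    v = np.zeros(N_OUT)          # membrane potentials
    pre_trace = np.zeros(N_IN)   # low-pass filtered presynaptic activity
    counts = np.zeros(N_OUT, dtype=int)
    for _, pixels in synthetic_bar_events(direction, dt=dt):
        # leaky decay of membrane potentials and presynaptic traces
        v *= np.exp(-dt / TAU_M)
        pre_trace *= np.exp(-dt / TAU_PRE)
        pre_trace[pixels] += 1.0
        # integrate the incoming events
        x = np.zeros(N_IN)
        x[pixels] = 1.0
        v += weights @ x
        if (v >= V_TH).any():
            # winner-take-all: only the most depolarized neuron spikes,
            # a common trick to make STDP neurons specialize (assumption)
            winner = int(np.argmax(v))
            counts[winner] += 1
            v[winner] = 0.0
            # simplified STDP: potentiate recently active inputs,
            # depress inputs that were silent before the spike
            weights[winner] += A_PLUS * pre_trace
            weights[winner] -= A_MINUS * (pre_trace.max() - pre_trace)
            np.clip(weights[winner], 0.0, 1.0, out=weights[winner])
    return counts

# unsupervised training: bars moving in random cardinal directions
directions = ["right", "left", "up", "down"]
for _ in range(400):
    run_trial(rng.choice(directions))

# after training, each direction should preferentially drive one neuron
for d in directions:
    print(d, run_trial(d))

Running the sketch prints per-direction spike counts; with the assumed settings each direction tends to drive a distinct neuron, which mirrors the unsupervised direction selectivity the paper evaluates while varying the meta-parameters.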

Audience

International

Additional details

Identifiers

URL
https://hal.archives-ouvertes.fr/hal-02529895
URN
urn:oai:HAL:hal-02529895v1

Origin repository

UNICA