Fast reactions to changes in the surrounding visual environment require efficient attention mechanisms to reallocate computational resources to the most relevant locations in the visual field. While current computational models keep improving their predictive ability thanks to the increasing availability of data, they still struggle to approximate...
-
June 23, 2020 (v1) Publication. Uploaded on: December 4, 2022
-
December 1, 2021 (v1) Journal article
Symmetries, invariances and conservation equations have always been an invaluable guide in Science to model natural phenomena through simple yet effective relations. For instance, in computer vision, translation equivariance is typically a built-in property of neural architectures that are used to solve visual tasks; networks with computational...
Uploaded on: December 3, 2022 -
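The translation equivariance mentioned in the entry above can be demonstrated in a few lines of plain Python (an illustrative sketch, not code from the paper; `conv1d` and `shift_right` are hypothetical helpers): shifting the input of a convolution shifts its output by the same amount.

```python
# Illustrative sketch of translation equivariance: a 1-D "valid"
# cross-correlation applies the same kernel at every position, so a
# shifted input yields a correspondingly shifted output.

def conv1d(signal, kernel):
    """Valid cross-correlation: slide one shared kernel over every position."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def shift_right(signal, n):
    """Shift a signal right by n samples, zero-padding on the left."""
    return [0.0] * n + signal[:-n]

x = [0.0, 1.0, 3.0, 2.0, 0.0, -1.0, 0.0, 0.0]
w = [1.0, -1.0]                       # a simple difference filter

y = conv1d(x, w)
y_shifted_input = conv1d(shift_right(x, 2), w)

# Equivariance: conv(shift(x)) equals shift(conv(x)), away from the
# zero-padded border introduced by the shift.
assert y_shifted_input[2:] == y[:len(y) - 2]
```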
August 22, 2022 (v1) Conference paper
In the last few years there has been a growing interest in approaches that allow neural networks to learn how to predict optical flow, both in a supervised and, more recently, unsupervised manner. While this clearly opens up the possibility of learning to estimate optical flow in a truly lifelong setting, by processing a potentially endless...
Uploaded on: December 4, 2022 -
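The abstract above does not detail a specific loss, but unsupervised optical-flow approaches typically build on the brightness-constancy principle: warp the second frame back by a candidate motion and compare it with the first. A minimal 1-D sketch with integer displacements (hypothetical `warp` and `photometric_loss` helpers, not the paper's method):

```python
# Hedged sketch of brightness constancy, the principle behind most
# unsupervised optical-flow losses, in 1-D with integer displacements.

def warp(frame, d):
    """Sample `frame` at i + d (replicating the border), i.e. undo motion d."""
    n = len(frame)
    return [frame[min(max(i + d, 0), n - 1)] for i in range(n)]

def photometric_loss(f1, f2, d):
    """Mean squared brightness difference between f1 and the warped f2."""
    w = warp(f2, d)
    return sum((a - b) ** 2 for a, b in zip(f1, w)) / len(f1)

frame1 = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]
frame2 = [0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0]   # frame1 moved right by 1

# The displacement with the smallest photometric loss recovers the motion.
best_d = min(range(-2, 3), key=lambda d: photometric_loss(frame1, frame2, d))
```

Learned approaches replace the exhaustive search over `d` with a network that predicts the displacement field, trained to minimize exactly this kind of loss.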
November 28, 2022 (v1) Publication
Devising intelligent agents able to live in an environment and learn by observing their surroundings is a longstanding goal of Artificial Intelligence. From a bare Machine Learning perspective, challenges arise when the agent is prevented from leveraging large fully-annotated datasets, but rather the interactions with supervisory signals are...
Uploaded on: December 4, 2022 -
January 14, 2022 (v1) Publication
Continual learning refers to the ability of humans and animals to incrementally learn over time in a given environment. Simulating this learning process in machines is a challenging task, due in part to the inherent difficulty of creating conditions with the continuously evolving dynamics that are typical of the real world. Many...
Uploaded on: December 3, 2022 -
September 19, 2022 (v1) Conference paper
The classic computational scheme of convolutional layers leverages filter banks that are shared over all the spatial coordinates of the input, independently of external information about what is specifically under observation and without any distinction between what is closer to the observed area and what is peripheral. In this paper we propose...
Uploaded on: December 4, 2022 -
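The "classic scheme" the abstract refers to can be made concrete with a plain-Python 2-D convolution (an illustrative sketch, not the paper's proposal; `conv2d` is a hypothetical helper): one kernel is reused at every spatial coordinate, so identical patches produce identical responses wherever they appear.

```python
# The classic shared-filter scheme: a single 3x3 kernel is applied at
# every spatial position of the input, with no dependence on what is
# being observed or on distance from an attended area.

def conv2d(image, kernel):
    """Valid 2-D cross-correlation with one kernel shared over all positions."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    return [[sum(image[r + i][c + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for c in range(w - kw + 1)]
            for r in range(h - kh + 1)]

# A 5x5 checkerboard: patches repeat, so responses repeat too.
image = [[float((r + c) % 2) for c in range(5)] for r in range(5)]
kernel = [[0.0, 1.0, 0.0],
          [1.0, -4.0, 1.0],
          [0.0, 1.0, 0.0]]           # a Laplacian-like filter

out = conv2d(image, kernel)          # same weights everywhere:
assert out[0][0] == out[2][2]        # identical patches, identical outputs
```

It is exactly this spatial uniformity, blind to where attention is focused, that the paper sets out to rethink.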
November 28, 2022 (v1) Publication
In this paper, we present PARTIME, a software library written in Python and based on PyTorch, designed specifically to speed up neural networks whenever data is continuously streamed over time, for both learning and inference. Existing libraries are designed to exploit data-level parallelism, assuming that samples are batched, a condition that...
Uploaded on: December 4, 2022 -
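The streaming setting the entry above targets can be sketched in a few lines (this is NOT PARTIME's API, just an illustration of the per-sample regime it is built for): samples arrive one at a time, so learning and inference must run sample by sample rather than on pre-batched data. The toy `stream` generator and the single-weight model are assumptions for illustration.

```python
# Hedged sketch of learning over a data stream: each sample triggers
# inference and an immediate update, with no batching assumption.

def stream():
    """A toy stream of (x, y) pairs with the true relation y = 2 * x."""
    for x in [0.5, 1.0, 1.5, 2.0, 2.5, 3.0] * 50:
        yield x, 2.0 * x

w = 0.0                   # single weight of the model y_hat = w * x
lr = 0.05                 # learning rate for online SGD

for x, y in stream():     # one sample at a time, as in a continuous stream
    y_hat = w * x         # inference on the current sample
    grad = 2.0 * (y_hat - y) * x
    w -= lr * grad        # immediate per-sample update

# w converges toward the true slope 2.0
```

Libraries built for batched data-level parallelism sit idle in this loop, since there is never more than one sample in flight; that is the gap PARTIME is described as addressing.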
August 22, 2022 (v1) Conference paper
Learning in a continual manner is one of the main challenges that the machine learning community is currently facing. The importance of the problem can be readily understood as soon as we consider settings where an agent is supposed to learn through an online interaction with a data stream, rather than operating offline on previously prepared...
Uploaded on: December 3, 2022