2023 (v1) Publication
The availability of new datasets and deep learning techniques has led to a surge of effort directed towards creating new models that can exploit large amounts of data. However, little attention has been given to developing models that are not only accurate but also suited to user-specific applications or geared towards...
Uploaded on: October 3, 2024

2021 (v1) Publication
In the past few years, transformer-based models have grown increasingly popular as they achieved new state-of-the-art performance on several natural language processing tasks. As these models are often extremely large, however, their use in applications on embedded devices may not be feasible. In this work, we look at one...
Uploaded on: April 14, 2023

2021 (v1) Publication
Emotion recognition, among other natural language processing tasks, has greatly benefited from the use of large transformer models. Deploying these models on resource-constrained devices, however, is a major challenge due to their computational cost. In this paper, we show that combining large transformers, as high-quality feature...
Uploaded on: April 14, 2023