Published 2023 | Version v1
Publication
Selecting Language Models Features via Software-Hardware Co-Design
Description
The availability of new datasets and deep learning techniques has led to a surge of effort directed towards creating new models that can exploit large amounts of data. However, little attention has been paid to developing models that are not only accurate but also suited to user-specific applications or to resource-constrained devices. Fine-tuning deep models on edge devices is impractical, and user customization often relies on the sub-optimal feature-extractor/classifier paradigm. Here, we propose a method to fully utilize the intermediate outputs of popular large pre-trained models in natural language processing when they are used as frozen feature extractors, further closing the gap between their fine-tuning and more computationally efficient solutions. We reach this goal by exploiting the concept of software-hardware co-design, and we propose a methodical procedure, inspired by Neural Architecture Search, to select the most desirable model while taking application constraints into account.
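To make the idea concrete, below is a minimal sketch of the frozen-feature-extractor setup the abstract describes: intermediate hidden states of a frozen pre-trained language model serve as features, and a simple NAS-inspired search picks the layer that maximizes validation accuracy under a compute budget. The backbone (DistilBERT), the depth budget, and the toy data are illustrative assumptions, not the paper's actual method or experimental setup.

```python
# Sketch: layer selection over a frozen pre-trained backbone under a
# depth (compute) budget. All names and numbers here are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

MODEL_NAME = "distilbert-base-uncased"  # assumed backbone, for illustration
DEPTH_BUDGET = 4                        # assumed constraint: at most 4 layers

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
backbone = AutoModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
backbone.eval()  # frozen feature extractor: the backbone is never fine-tuned

texts = ["great movie", "terrible plot", "loved it", "boring and slow"]
labels = [1, 0, 1, 0]  # toy sentiment labels, for illustration only

with torch.no_grad():
    enc = tokenizer(texts, padding=True, return_tensors="pt")
    out = backbone(**enc)
    # hidden_states is a tuple of (num_layers + 1) tensors, one per layer
    # (index 0 is the embedding layer). Mean-pool tokens into one vector.
    feats_per_layer = [h.mean(dim=1).numpy() for h in out.hidden_states]

# NAS-inspired selection: evaluate each candidate layer that satisfies the
# depth budget (a shallower layer implies cheaper truncated inference on
# resource-constrained hardware) and keep the most accurate one.
best = None
for depth in range(1, DEPTH_BUDGET + 1):
    clf = LogisticRegression(max_iter=1000).fit(feats_per_layer[depth], labels)
    acc = clf.score(feats_per_layer[depth], labels)  # use held-out data in practice
    if best is None or acc > best[1]:
        best = (depth, acc)

print(f"selected layer {best[0]}, accuracy {best[1]:.2f}, budget {DEPTH_BUDGET}")
```

In a real software-hardware co-design loop, the accuracy score above would be traded off against measured latency, memory, or energy on the target device rather than a simple depth count.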
Additional details
Identifiers
- URL: https://hdl.handle.net/11567/1212876
- URN: urn:oai:iris.unige.it:11567/1212876
Origin repository
- UNIGE