Top-tuning: A study on transfer learning for an efficient alternative to fine tuning for image classification with fast kernel methods
- Creators
- Alfano P. D.
- Pastore V. P.
- Rosasco L.
- Odone F.
Description
The impressive performance of deep learning architectures comes with a massive increase in model complexity: millions of parameters need to be tuned, with training and inference time, as well as energy consumption, scaling accordingly. But is massive fine-tuning always necessary? In this paper, focusing on image classification, we consider a simple transfer learning approach that exploits pre-trained convolutional features as input for a fast-to-train kernel method. We refer to this approach as top-tuning, since only the kernel classifier is trained on the target dataset. In our study, we perform more than 3000 training runs on 32 small to medium-sized target datasets, a typical setting in which transfer learning is necessary. We show that top-tuning provides accuracy comparable to fine-tuning, with training time between one and two orders of magnitude smaller. These results suggest that top-tuning is an effective alternative to fine-tuning on small/medium-sized datasets, and is especially useful when training time efficiency and saving computational resources are crucial.
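For concreteness, below is a minimal sketch of the top-tuning pipeline described in the abstract: features from a frozen pre-trained convolutional backbone are fed to a fast kernel classifier, and only that classifier is trained on the target dataset. The specific choices here are assumptions not stated in this record: a torchvision ResNet-18 as the backbone, CIFAR-10 as the target dataset, and a Nyström-approximated kernel ridge classifier from scikit-learn as a stand-in for the fast kernel solver used in the paper.

```python
# Hypothetical top-tuning sketch; backbone, dataset, and kernel solver
# are illustrative stand-ins, not the paper's exact experimental setup.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.models import resnet18, ResNet18_Weights
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import RidgeClassifier
from sklearn.pipeline import make_pipeline

# Frozen pre-trained backbone: replace the ImageNet head with identity
# so the forward pass returns convolutional features, not class scores.
weights = ResNet18_Weights.DEFAULT
backbone = resnet18(weights=weights)
backbone.fc = torch.nn.Identity()
backbone.eval()  # no fine-tuning: the backbone is never updated

# Example target dataset with the backbone's own preprocessing.
dataset = datasets.CIFAR10(root="data", train=True, download=True,
                           transform=weights.transforms())
loader = DataLoader(dataset, batch_size=256)

# One gradient-free forward pass to extract features for the whole set.
features, labels = [], []
with torch.no_grad():
    for x, y in loader:
        features.append(backbone(x))
        labels.append(y)
X = torch.cat(features).numpy()
y = torch.cat(labels).numpy()

# "Top-tuning": only this fast kernel classifier is trained on the
# target data (Nyström-approximated RBF kernel + ridge classifier).
clf = make_pipeline(Nystroem(kernel="rbf", n_components=1000),
                    RidgeClassifier())
clf.fit(X, y)
```

Because the backbone is frozen, training reduces to a single feature-extraction pass plus fitting the kernel classifier, which is what makes the approach one to two orders of magnitude faster to train than full fine-tuning.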
Additional details
- URL
- https://hdl.handle.net/11567/1160136
- URN
- urn:oai:iris.unige.it:11567/1160136
- Origin repository
- UNIGE