Published 2022 | Version v1
Publication

Deeper Insights into Neural Nets with Random Weights

Description

In this work, the "effective dimension" of the output of the hidden layer of a one-hidden-layer neural network with randomly chosen inner weights is investigated. To this end, the sigmoidal activation function of each computational unit is approximated by a polynomial whose degree is chosen based both on a desired upper bound on the approximation error and on an estimate of the range of that unit's input. This range estimate is parameterized by the number of network inputs and by upper bounds on the magnitudes of the random inner weights and of the network inputs. The results show that the Root Mean Square Error (RMSE) on the training set depends both on the effective dimension and on the quality of the features produced by the hidden layer.
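The two ingredients described above can be sketched as follows. This is a minimal illustration with assumed bounds and architecture sizes, using a least-squares polynomial fit for the degree selection and the numerical rank of the hidden-layer output matrix as a proxy for effective dimension; it is not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (assumptions, not the paper's values): d inputs,
# H hidden units, with inputs and random inner weights bounded in magnitude.
d, H, n_samples = 5, 100, 300
w_max, x_max = 0.3, 1.0
W = rng.uniform(-w_max, w_max, size=(H, d))   # random inner weights
b = rng.uniform(-w_max, w_max, size=H)        # random biases
X = rng.uniform(-x_max, x_max, size=(n_samples, d))

# Range estimate for each unit's input: |w . x + b| <= ||w||_1 * x_max + |b|.
R = np.abs(W).sum(axis=1) * x_max + np.abs(b)

def min_degree(f, radius, tol, max_deg=30):
    """Smallest degree whose least-squares polynomial fit of f on
    [-radius, radius] keeps the max error on a dense grid below tol."""
    t = np.linspace(-radius, radius, 2001)
    y = f(t)
    for deg in range(1, max_deg + 1):
        coeffs = np.polyfit(t, y, deg)
        if np.max(np.abs(np.polyval(coeffs, t) - y)) < tol:
            return deg
    return max_deg

# Degree chosen from the error bound and the largest estimated input range.
deg = min_degree(np.tanh, R.max(), tol=1e-2)

# Effective-dimension proxy: numerical rank of the hidden-layer outputs.
Hidden = np.tanh(X @ W.T + b)
s = np.linalg.svd(Hidden, compute_uv=False)
eff_dim = int(np.sum(s > 1e-6 * s[0]))
```

Tightening the weight or input bounds shrinks the estimated ranges, which lowers the polynomial degree needed for a given error tolerance and, in turn, bounds the effective dimension of the hidden-layer features.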

Additional details

Created:
February 22, 2023
Modified:
November 29, 2023