Published 2020 | Version v1
Publication

Some families of FSP functions and their properties

Description

We report properties of fixed-structure parametrized (FSP) functions that give insight into the effectiveness of the "Extended Ritz Method" (ERIM) as a methodology for the approximate solution of infinite-dimensional optimization problems. First, we present the structure of some widely used families of FSP functions, including linear combinations of fixed-basis functions, one-hidden-layer (OHL) and multiple-hidden-layer (MHL) networks, and kernel smoothing models. Second, focusing on OHL neural networks based on ridge and radial constructions, we report their density properties under different metrics. Third, we present rates of function approximation by ridge OHL neural networks, reporting a fundamental theorem by Maurey, Jones, and Barron together with its extensions, which are based on a norm tailored to approximation by computational units drawn from a given set of functions. We also discuss approximation properties that hold for MHL networks. Fourth, we compare the classical Ritz method and the ERIM from the point of view of the curse of dimensionality, proving advantages of the latter for a specific class of problems in which the functional to be optimized is quadratic. Finally, we provide rates of approximate optimization by the ERIM, based on the concepts of modulus of continuity and modulus of convexity of the functional to be optimized.
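
As an illustrative sketch (the notation below is assumed for this record, not taken verbatim from the publication), an OHL FSP function with n computational units and the associated Maurey-Jones-Barron approximation bound can be written as follows:

% Sketch of an OHL FSP function with n computational units; the symbols
% \gamma_n, \varphi, \kappa_i, w_n are illustrative notation (assumption).
\[
  \gamma_n(x, w_n) \;=\; \sum_{i=1}^{n} c_i \,\varphi(x, \kappa_i),
  \qquad w_n = (c_1, \dots, c_n, \kappa_1, \dots, \kappa_n),
\]
% Ridge construction: \varphi(x,\kappa_i) = \sigma(a_i^{\top} x + b_i).
% Radial construction: \varphi(x,\kappa_i) = \psi(\lVert x - \tau_i \rVert).
% Maurey-Jones-Barron bound: if f belongs to the closure of the convex hull
% of a set G of functions in a Hilbert space, with \lVert g \rVert \le B for
% all g in G, then for every n there exist units and coefficients such that
\[
  \Bigl\lVert\, f - \sum_{i=1}^{n} c_i \,\varphi(\cdot, \kappa_i) \Bigr\rVert
  \;\le\; \frac{B}{\sqrt{n}}.
\]

The point relevant to the curse-of-dimensionality comparison discussed above is that this bound decreases as $1/\sqrt{n}$ with the number of units and does not depend explicitly on the input dimension.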

Additional details

Created:
April 14, 2023
Modified:
November 22, 2023