Poster Session 1 · Wednesday, December 3, 2025 11:00 AM → 2:00 PM
#3310

Optimal Minimum Width for the Universal Approximation of Continuously Differentiable Functions by Deep Narrow MLPs

NeurIPS OpenReview

Abstract

In this paper, we investigate the universal approximation property of deep, narrow multilayer perceptrons (MLPs) for continuously differentiable ($C^1$) functions under the Sobolev norm. Although the optimal width of deep, narrow MLPs for approximating continuous functions has been extensively studied, significantly less is known about the corresponding optimal width for $C^1$ functions. We demonstrate that the optimal width can be determined in a wide range of cases in the $C^1$ setting.
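For orientation, a standard choice of Sobolev norm in the $C^1$ setting (an assumption here; the abstract does not display the specific norm) is the $W^{1,\infty}$ norm on a compact domain $K$, which controls both function values and first derivatives:

```latex
\[
  \|f\|_{W^{1,\infty}(K)} \;=\; \max\Bigl(\; \sup_{x \in K} \|f(x)\|,\;\; \sup_{x \in K} \|Df(x)\| \;\Bigr).
\]
```

Convergence in such a norm means the network approximates both $f$ and its Jacobian $Df$ uniformly, which is strictly stronger than uniform ($C^0$) approximation.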
Our approach consists of two main steps.
  1. First, leveraging control theory, we show that any diffeomorphism can be approximated by deep, narrow MLPs.
  2. Second, using the Borsuk–Ulam theorem and various results from differential geometry, we prove that in certain cases the optimal width for approximating arbitrary $C^1$ functions via diffeomorphisms can be determined exactly in terms of the input and output dimensions $d_x$ and $d_y$.
Our results apply to a broad class of activation functions.
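To make the object of study concrete, the following is a minimal, illustrative sketch (not the paper's construction) of a deep, narrow MLP: every hidden layer has the same fixed width `width`, and depth is the free resource. The function name, initialization, and the choice of `tanh` are assumptions for illustration only.

```python
import numpy as np

def deep_narrow_mlp(x, d_y, width=4, depth=16, activation=np.tanh, seed=0):
    """Apply a randomly initialized MLP R^{d_x} -> R^{d_y} whose every
    hidden layer has exactly `width` units (hypothetical example network)."""
    rng = np.random.default_rng(seed)
    h = x
    d_in = x.shape[-1]
    for _ in range(depth):
        W = rng.standard_normal((d_in, width)) / np.sqrt(d_in)  # scaled init
        b = 0.1 * rng.standard_normal(width)
        h = activation(h @ W + b)   # narrow hidden layer
        d_in = width
    W_out = rng.standard_normal((d_in, d_y)) / np.sqrt(d_in)
    return h @ W_out                # affine read-out, no activation

x = np.ones((5, 3))                 # batch of 5 points in R^3
y = deep_narrow_mlp(x, d_y=2)       # output in R^2, shape (5, 2)
```

The universal approximation question asks how small `width` can be, as a function of the input and output dimensions, while depth alone suffices to approximate any target function in the chosen norm.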