
AMITE: A Novel Polynomial Expansion for Analyzing Neural Network Nonlinearities

Date: 2021-05-06

Document pages: 12

Abstract: Polynomial expansions are an important technique in the analysis and study of neural network nonlinearities. Recently, expansions have been applied to neural networks to address well-known difficulties in their verifiable, explainable, and secure deployment. Existing approaches span classical Taylor and Chebyshev methods, asymptotics, and many numerical and algorithmic techniques. We find that while existing approaches individually have useful properties, such as exact error formulas, monic form, an adjustable domain, and robustness to undefined derivatives, no single approach provides a consistent method yielding an expansion with all of these properties. To address this gap, we develop an analytically modified integral transform expansion, referred to as AMITE, a novel expansion via integral transforms modified using derived criteria for convergence. We apply AMITE to the nonlinear activation functions of neural networks, including the hyperbolic tangent and the rectified linear unit. Compared with existing state-of-the-art expansion techniques such as Chebyshev, Taylor series, and numerical approximations, AMITE is the first polynomial expansion to provide six previously mutually exclusive desired expansion properties, such as exact formulas for the coefficients and exact expansion errors (Table II). Using a multilayer perceptron (MLP) as a case study, we demonstrate the effectiveness of AMITE on the equivalence-testing problem for MLPs, in which a black-box network under test is stimulated and a replicated multivariate polynomial form is efficiently extracted from a noisy response to enable comparison against an original network. AMITE presents a new class of expansion methods suitable for the analysis and approximation of nonlinearities in neural networks, opening new directions and opportunities for the theoretical analysis and systematic testing of neural networks.
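To make the trade-off the abstract describes concrete, the sketch below compares two of the baseline techniques mentioned there, a Chebyshev expansion and a truncated Taylor (Maclaurin) series, on the hyperbolic tangent activation. This is only an illustration of the baselines using NumPy's polynomial module, not an implementation of AMITE itself; the degree and the domain [-3, 3] are arbitrary choices for the example.

```python
import numpy as np
from numpy.polynomial import Chebyshev, Polynomial

# Degree-7 Chebyshev interpolant of tanh on the (arbitrarily chosen) domain [-3, 3].
cheb = Chebyshev.interpolate(np.tanh, deg=7, domain=[-3, 3])

# Degree-7 Maclaurin series of tanh: x - x^3/3 + 2x^5/15 - 17x^7/315.
taylor = Polynomial([0, 1, 0, -1/3, 0, 2/15, 0, -17/315])

# Maximum absolute error of each expansion over the domain.
x = np.linspace(-3, 3, 1001)
err_cheb = np.max(np.abs(cheb(x) - np.tanh(x)))
err_taylor = np.max(np.abs(taylor(x) - np.tanh(x)))
print(f"max |error| on [-3, 3]: Chebyshev = {err_cheb:.3e}, Taylor = {err_taylor:.3e}")
```

At equal degree the Chebyshev expansion stays accurate across the whole interval while the Taylor polynomial diverges rapidly away from the expansion point, which is one facet of the "no single method has all the desired properties" gap that AMITE targets: the Taylor series has exact coefficient formulas but a narrow useful domain, while the Chebyshev fit has an adjustable domain but no comparably simple exact coefficient expressions for general activations.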
