
Open Access 2022 | Original Paper | Book Chapter

LinSyn: Synthesizing Tight Linear Bounds for Arbitrary Neural Network Activation Functions


The most scalable approaches to certifying neural network robustness depend on computing sound linear lower and upper bounds for the network’s activation functions. Current approaches are limited in that the linear bounds must be handcrafted by an expert, and can be sub-optimal, especially when the network’s architecture composes operations using multiplication, such as in LSTMs and the recently popular Swish activation. The dependence on an expert prevents the application of robustness certification to developments in the state of the art of activation functions, and furthermore the lack of tightness guarantees may give a false sense of insecurity about a particular model. To the best of our knowledge, we are the first to consider the problem of automatically synthesizing tight linear bounds for arbitrary n-dimensional activation functions. We propose the first fully automated method that achieves tight linear bounds while only leveraging the mathematical definition of the activation function itself. Our method uses an efficient heuristic to synthesize bounds that are tight and usually sound, and then verifies the soundness (adjusting the bounds if necessary) using the highly optimized branch-and-bound SMT solver dReal. Even though our method depends on an SMT solver, we show that the runtime is reasonable in practice and, compared with the state of the art, our method often achieves 2-5X tighter final output bounds and more than quadruple the certified robustness.
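The abstract describes a synthesize-then-verify loop: guess a tight candidate bound heuristically, then use dReal to either certify soundness or produce a counterexample that drives an adjustment. Below is a minimal sketch of that loop for an upper bound on the Swish activation over an interval, assuming the dReal Python bindings (`pip install dreal`); the sampling-based candidate, the fixed widening step, and the helper names `candidate_upper` and `verify_and_adjust` are illustrative assumptions, not the paper's actual algorithm or artifact.

```python
# Sketch: synthesize a sound linear upper bound a*x + b for Swish on [l, u].
# Candidate synthesis here is simple sampling + least-squares (an assumption;
# the paper uses a more sophisticated heuristic); soundness is then checked
# with dReal, as the abstract describes.
import numpy as np
from dreal import Variable, And, CheckSatisfiability, exp

def swish_np(x):
    # Swish(x) = x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def candidate_upper(l, u, n=100):
    # Fit a line through sampled points, then shift it up so it lies
    # above every sample: tight, but only "usually sound".
    xs = np.linspace(l, u, n)
    ys = swish_np(xs)
    a, b = np.polyfit(xs, ys, 1)
    b += max(0.0, float(np.max(ys - (a * xs + b))))
    return a, b

def verify_and_adjust(l, u, a, b, delta=1e-4, step=1e-3):
    # Ask dReal whether Swish(x) can exceed a*x + b on [l, u].
    # An unsat answer (None) is exact, so the bound is sound; otherwise
    # pad the offset and retry (a crude stand-in for the paper's adjustment).
    x = Variable("x")
    while True:
        violation = And(x >= l, x <= u, x / (1 + exp(-x)) > a * x + b)
        if CheckSatisfiability(violation, delta) is None:
            return a, b
        b += step

a, b = candidate_upper(-2.0, 2.0)
a, b = verify_and_adjust(-2.0, 2.0, a, b)
print(f"sound upper bound on [-2, 2]: {a:.4f} * x + {b:.4f}")
```

Note the asymmetry that makes dReal suitable here: a delta-sat answer may be spurious, but an unsat answer is exact, so terminating on unsat genuinely certifies the bound.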

Metadata
Title
LinSyn: Synthesizing Tight Linear Bounds for Arbitrary Neural Network Activation Functions
Authors
Brandon Paulsen
Chao Wang
Copyright year
2022
DOI
https://doi.org/10.1007/978-3-030-99524-9_19
