
2018 | Book

Automated Technology for Verification and Analysis

16th International Symposium, ATVA 2018, Los Angeles, CA, USA, October 7-10, 2018, Proceedings


About this book

This book constitutes the refereed proceedings of the 16th International Symposium on Automated Technology for Verification and Analysis, ATVA 2018, held in Los Angeles, CA, USA, in October 2018.

The 27 full papers presented together with 5 short papers and 3 invited talks were carefully reviewed and selected from 82 submissions.

The symposium is dedicated to the promotion of research on theoretical and practical aspects of automated analysis, verification and synthesis by providing a forum for interaction between the regional and the international research communities and industry in the field.

Table of Contents

Frontmatter

Invited Papers

Frontmatter
DeepSafe: A Data-Driven Approach for Assessing Robustness of Neural Networks

Deep neural networks have achieved impressive results in many complex applications, including classification tasks for image and speech recognition, pattern analysis, and perception in self-driving vehicles. However, it has been observed that even highly trained networks are very vulnerable to adversarial perturbations. Adding minimal changes to inputs that are correctly classified can lead to wrong predictions, raising serious security and safety concerns. Existing techniques for checking robustness against such perturbations only consider searching locally around a few individual inputs, providing limited guarantees. We propose DeepSafe, a novel approach for automatically assessing the overall robustness of a neural network. DeepSafe applies clustering over known labeled data and leverages off-the-shelf constraint solvers to automatically identify and check safe regions in which the network is robust, i.e., all the inputs in the region are guaranteed to be classified correctly. We also introduce the concept of targeted robustness, which ensures that the neural network is guaranteed not to misclassify inputs within a region to a specific target (adversarial) label. We evaluate DeepSafe on a neural network implementation of a controller for the next-generation Airborne Collision Avoidance System for unmanned aircraft (ACAS Xu) and on the well-known MNIST network. For these networks, DeepSafe identified many regions which were safe, and also found adversarial perturbations of interest.

Divya Gopinath, Guy Katz, Corina S. Păsăreanu, Clark Barrett
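
A minimal sketch of the clustering step behind a DeepSafe-style analysis, in Python with numpy. The region proposal below (per-class centroid plus a conservative radius) and the solver stub are illustrative assumptions: the paper's tool derives regions by label-guided clustering and discharges the robustness query with an off-the-shelf constraint solver such as Reluplex.

```python
import numpy as np

def candidate_safe_regions(X, y):
    """For each label, propose a ball around the class centroid whose
    radius is half the distance to the nearest point of another class."""
    regions = []
    for label in np.unique(y):
        own, other = X[y == label], X[y != label]
        center = own.mean(axis=0)
        radius = np.linalg.norm(other - center, axis=1).min() / 2.0
        regions.append((label, center, radius))
    return regions

def check_region_robust(network, center, radius, label):
    # Stub: in a DeepSafe-style pipeline this query is discharged by a
    # constraint solver proving every input in the ball gets `label`.
    raise NotImplementedError

# Toy 2-D data with two well-separated classes.
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9]])
y = np.array([0, 0, 1, 1])
for label, c, r in candidate_safe_regions(X, y):
    print(f"label {label}: ball of radius {r:.2f} around {c}")
```
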
Formal Specification for Deep Neural Networks

The increasing use of deep neural networks in a variety of applications, including some safety-critical ones, has brought renewed interest in the topic of verification of neural networks. However, verification is most meaningful when performed with high-quality formal specifications. In this paper, we survey the landscape of formal specification for deep neural networks, and discuss the opportunities and challenges for formal methods for this domain.

Sanjit A. Seshia, Ankush Desai, Tommaso Dreossi, Daniel J. Fremont, Shromona Ghosh, Edward Kim, Sumukh Shivakumar, Marcell Vazquez-Chanlatte, Xiangyu Yue

Regular Papers

Frontmatter
Optimal Proofs for Linear Temporal Logic on Lasso Words

Counterexamples produced by model checkers can be hard to grasp. Often it is not even evident why a trace violates a specification. We show how to provide easy-to-check evidence for the violation of a linear temporal logic (LTL) formula on a lasso word, based on a novel sound and complete proof system for LTL on lasso words. Valid proof trees in our proof system follow the syntactic structure of the formula and provide insight on why each Boolean or temporal operator is violated or satisfied. We introduce the notion of optimal proofs with respect to a user-specified preference order and identify sufficient conditions for efficiently computing optimal proofs. We design and evaluate an algorithm that performs this computation, demonstrating that it can produce optimal proofs for complex formulas in under a second.

David Basin, Bhargav Nagaraja Bhatt, Dmitriy Traytel
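
To make the setting concrete, here is a small Python evaluator for LTL on a lasso word u·vω, representing each position as a set of atomic propositions; "until" is computed as a least fixpoint on the lasso graph. This only decides satisfaction; the paper's proof system additionally produces checkable evidence for the verdict.

```python
# An LTL formula is a nested tuple; a lasso word is prefix + loop^omega,
# with each position a set of atomic propositions.
def ltl_on_lasso(phi, prefix, loop):
    word = prefix + loop
    n = len(word)
    suc = lambda i: i + 1 if i + 1 < n else len(prefix)  # wrap into the loop

    def ev(f):
        op = f[0]
        if op == 'ap':   return [f[1] in word[i] for i in range(n)]
        if op == 'not':  return [not v for v in ev(f[1])]
        if op == 'and':
            a, b = ev(f[1]), ev(f[2])
            return [x and y for x, y in zip(a, b)]
        if op == 'next':
            a = ev(f[1])
            return [a[suc(i)] for i in range(n)]
        if op == 'until':
            # Least fixpoint of v[i] = b[i] or (a[i] and v[suc(i)]).
            a, b = ev(f[1]), ev(f[2])
            v = [False] * n
            for _ in range(n):
                v = [b[i] or (a[i] and v[suc(i)]) for i in range(n)]
            return v
        raise ValueError(op)

    return ev(phi)[0]

# q U p on the lasso word ({q}) . ({q} {p})^omega: holds at position 0.
print(ltl_on_lasso(('until', ('ap', 'q'), ('ap', 'p')),
                   [{'q'}], [{'q'}, {'p'}]))  # True
```
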
What’s to Come is Still Unsure
Synthesizing Controllers Resilient to Delayed Interaction

The possible interactions between a controller and its environment can naturally be modelled as the arena of a two-player game, and adding an appropriate winning condition permits the specification of desirable behavior. The classical model here is the positional game, where both players can (fully or partially) observe the current position in the game graph, which in turn is indicative of their mutual current states. In practice, however, neither sensing and actuating the environment through physical devices nor data forwarding to and signal processing in the controller is instantaneous. The resultant delays force the controller to make decisions before being aware of the recent history of a play. It is known that the existence of a winning strategy for the controller in games with such delays is decidable over finite game graphs and with respect to ω-regular objectives. The underlying reduction, however, is impractical for non-trivial delays, as it incurs a blow-up of the game graph which is exponential in the magnitude of the delay. For safety objectives, we propose a more practical incremental algorithm that synthesizes a series of controllers handling increasing delays and reduces game-graph size in between. We demonstrate on benchmark examples that even a simplistic explicit-state implementation of this algorithm outperforms state-of-the-art symbolic synthesis algorithms as soon as non-trivial delays have to be handled. We furthermore shed some light on the practically relevant case of non-order-preserving delays, as arising in actual networked control, thereby considerably extending the scope of regular game theory under delay pioneered by Klein and Zimmermann.

Mingshuai Chen, Martin Fränzle, Yangjia Li, Peter N. Mosaad, Naijun Zhan
A Formally Verified Motion Planner for Autonomous Vehicles

Autonomous vehicles are safety-critical cyber-physical systems. To ensure their correctness, we use a proof assistant to prove safety properties deductively. This paper presents a formally verified motion planner based on manoeuvre automata in Isabelle/HOL. Two general properties which we ensure are numerical soundness (the absence of floating-point errors) and logical correctness (satisfying a plan specified in linear temporal logic). From these two properties, we obtain a motion planner whose correctness only depends on the validity of the models of the ego vehicle and its environment.

Albert Rizaldi, Fabian Immler, Bastian Schürmann, Matthias Althoff
Robustness Testing of Intermediate Verifiers

Program verifiers are not exempt from the bugs that affect nearly every piece of software. In addition, they often exhibit brittle behavior: their performance changes considerably with details of how the input program is expressed—details that should be irrelevant, such as the order of independent declarations. Such a lack of robustness frustrates users, who have to spend considerable time figuring out a tool's idiosyncrasies before they can use it effectively. This paper introduces a technique to detect lack of robustness in program verifiers; the technique is lightweight and fully automated, as it is based on testing methods (such as mutation testing and metamorphic testing). The key idea is to generate many simple variants of a program that initially passes verification. All variants are, by construction, equivalent to the original program; thus, any variant that fails verification indicates a lack of robustness in the verifier. We implemented our technique in a tool called μgie, which operates on programs written in the popular Boogie language for verification—used as an intermediate representation in numerous program verifiers. Experiments targeting 135 Boogie programs indicate that brittle behavior occurs fairly frequently (16 programs) and is not hard to trigger. Based on these results, the paper discusses the main sources of brittle behavior and suggests means of improving robustness.

YuTing Chen, Carlo A. Furia
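
A toy metamorphic-testing harness in this spirit might look as follows. The `boogie` command-line invocation and its output parsing are assumptions, and swapping adjacent lines is only equivalence-preserving when those lines are independent top-level declarations; μgie implements a catalogue of genuinely behavior-preserving mutations.

```python
import os
import random
import subprocess
import sys
import tempfile

def swap_mutants(lines, trials=20, seed=0):
    # Swap adjacent lines; sound only when the swapped lines are
    # independent top-level declarations (the simplest possible stand-in
    # for the tool's equivalence-preserving mutation operators).
    rng = random.Random(seed)
    for _ in range(trials):
        i = rng.randrange(len(lines) - 1)
        mutant = lines[:]
        mutant[i], mutant[i + 1] = mutant[i + 1], mutant[i]
        yield mutant

def verdict(text):
    # Assumption: a Boogie binary on PATH whose output contains "Error"
    # on failed verification; adapt for the verifier under test.
    with tempfile.NamedTemporaryFile('w', suffix='.bpl', delete=False) as f:
        f.write(text)
        path = f.name
    try:
        out = subprocess.run(['boogie', path], capture_output=True, text=True)
        return 'Error' not in out.stdout
    finally:
        os.unlink(path)

lines = open(sys.argv[1]).read().splitlines()
base = verdict('\n'.join(lines))
for mutant in swap_mutants(lines):
    if verdict('\n'.join(mutant)) != base:
        print('brittleness: an equivalent variant changed the verdict')
        break
```
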
Simulation Algorithms for Symbolic Automata

We investigate means of efficient computation of the simulation relation over symbolic finite automata (SFAs), i.e., finite automata with transitions labeled by predicates over alphabet symbols. In one approach, we build on the algorithm by Ilie, Navarro, and Yu proposed originally for classical finite automata, modifying it using the so-called mintermisation of the transition predicates. This solution, however, generates all Boolean combinations of the predicates, which easily causes an exponential blowup in the number of transitions. Therefore, we propose two more advanced solutions. The first one still applies mintermisation but in a local way, mitigating the size of the exponential blowup. The other one focuses on a novel symbolic way of dealing with transitions, for which we need to sacrifice the counting technique of the original algorithm (counting is used to decrease the dependency of the running time on the number of transitions from quadratic to linear). We perform a thorough experimental evaluation of all the algorithms, together with several further alternatives, showing that all of them have their merits in practice, but with the clear indication that in most of the cases, efficient treatment of symbolic transitions is more beneficial than counting.

Lukáš Holík, Ondřej Lengál, Juraj Síč, Margus Veanes, Tomáš Vojnar
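
The mintermisation step that causes the blow-up is easy to state. Below is a Python sketch, modelling predicates as character sets (real SFAs use solver-backed predicates, but the combinatorics are the same): up to 2^n non-empty Boolean combinations of n transition predicates.

```python
from itertools import product

def minterms(predicates, alphabet):
    """All non-empty Boolean combinations (minterms) of the predicates.
    Each minterm is determined by a sign vector: which predicates it
    lies inside and which it lies outside."""
    result = []
    for signs in product([True, False], repeat=len(predicates)):
        m = set(alphabet)
        for p, keep in zip(predicates, signs):
            m &= p if keep else (set(alphabet) - p)
        if m:
            result.append((signs, m))
    return result

alphabet = set('abcdef')
preds = [set('abc'), set('bcd'), set('cde')]
for signs, m in minterms(preds, alphabet):
    print(signs, sorted(m))
```
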
Quantitative Projection Coverage for Testing ML-enabled Autonomous Systems

Systematically testing models learned by neural networks remains a crucial unsolved barrier to justifying the safety of autonomous vehicles engineered using data-driven approaches. We propose quantitative k-projection coverage as a metric to mediate combinatorial explosion while guiding the data sampling process. By assuming that domain experts propose largely independent environment conditions and by associating the elements of each condition with weights, the product of these conditions forms scenarios, and one may interpret the weight associated with each equivalence class as its relative importance. Achieving full k-projection coverage requires that the data set, when projected onto the hyperplane formed by arbitrarily selected k conditions, covers each class with a number of data points no less than the associated weight. For the general case where scenario composition is constrained by rules, precisely computing k-projection coverage remains in NP. In terms of finding minimum test cases to achieve full coverage, we present theoretical complexity results for important sub-cases and an encoding to 0-1 integer programming. We have implemented a research prototype that generates test cases for a visual object detection unit in automated driving, demonstrating the technological feasibility of our proposed coverage criterion.

Chih-Hong Cheng, Chung-Hao Huang, Hirotoshi Yasuoka
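
A minimal Python sketch of (unweighted) k-projection coverage: project the data set onto every choice of k condition categories and count which equivalence classes are hit. The weighted version in the paper would additionally require each class to be covered by at least its weight in data points.

```python
from itertools import combinations, product

def k_projection_coverage(data, domains, k):
    """Fraction of value combinations covered when the data set is
    projected onto every choice of k condition categories.
    `data` is a list of tuples (one value per category); `domains`
    lists each category's possible values."""
    total = covered = 0
    for dims in combinations(range(len(domains)), k):
        cells = set(product(*(domains[d] for d in dims)))
        seen = {tuple(row[d] for d in dims) for row in data}
        total += len(cells)
        covered += len(cells & seen)
    return covered / total

domains = [('day', 'night'), ('dry', 'rain', 'snow'), ('urban', 'highway')]
data = [('day', 'dry', 'urban'), ('night', 'rain', 'highway'),
        ('day', 'snow', 'highway')]
print(k_projection_coverage(data, domains, k=2))  # 9 of 16 cells: 0.5625
```
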
Recursive Online Enumeration of All Minimal Unsatisfiable Subsets

In various areas of computer science, we deal with sets of constraints to be satisfied. If the constraints cannot be satisfied simultaneously, it is desirable to identify the core problems among them. Such cores are called minimal unsatisfiable subsets (MUSes). The more MUSes are identified, the more information about the conflicts among the constraints is obtained. However, a full enumeration of all MUSes is in general intractable due to the large (even exponential) number of possible conflicts. Moreover, to identify MUSes, algorithms have to test sets of constraints for their simultaneous satisfiability. The type of test depends on the application domain, and the complexity of the tests can be extremely high, especially for domains like temporal logics, model checking, or SMT. In this paper, we propose a recursive algorithm that identifies MUSes in an online manner (i.e., one by one) and can be terminated at any time. The key feature of our algorithm is that it minimises the number of satisfiability tests and thus speeds up the computation. The algorithm is applicable to an arbitrary constraint domain. We benchmark our algorithm against the state-of-the-art algorithm MARCO on the Boolean and SMT constraint domains and demonstrate that our algorithm indeed requires fewer satisfiability tests and consequently finds more MUSes within the given time limits.

Jaroslav Bendík, Ivana Černá, Nikola Beneš
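
For intuition, here is the classic deletion-based shrinking of an unsatisfiable set to a single MUS, with z3 as the (pluggable) satisfiability oracle; the paper's contribution is an online enumeration of all MUSes that minimises such oracle calls.

```python
from z3 import Bool, Solver, Not, Or, sat

def satisfiable(constraints):
    s = Solver()
    s.add(*constraints)
    return s.check() == sat

def shrink(core):
    # Deletion-based reduction of an unsatisfiable set to one MUS:
    # drop a constraint whenever the remainder stays unsatisfiable.
    for c in list(core):
        rest = [d for d in core if d is not c]
        if not satisfiable(rest):
            core = rest
    return core

a, b = Bool('a'), Bool('b')
constraints = [a, Not(a), b, Not(b), Or(a, b)]
print(shrink(constraints))  # one MUS, here [b, Not(b)]
```
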
Synthesis in pMDPs: A Tale of 1001 Parameters

This paper considers parametric Markov decision processes (pMDPs) whose transitions are equipped with affine functions over a finite set of parameters. The synthesis problem is to find a parameter valuation such that the instantiated pMDP satisfies a (temporal logic) specification under all strategies. We show that this problem can be formulated as a quadratically constrained quadratic program (QCQP) and is non-convex in general. To deal with the NP-hardness of such problems, we exploit a convex-concave procedure (CCP) to iteratively obtain local optima. An appropriate interplay between CCP solvers and probabilistic model checkers creates a procedure—realized in the tool PROPhESY—that solves the synthesis problem for models with thousands of parameters.

Murat Cubuktepe, Nils Jansen, Sebastian Junges, Joost-Pieter Katoen, Ufuk Topcu
Temporal Logic Verification of Stochastic Systems Using Barrier Certificates

This paper presents a methodology for temporal logic verification of discrete-time stochastic systems. Our goal is to find a lower bound on the probability that a complex temporal property is satisfied by finite traces of the system. Desired temporal properties of the system are expressed using a fragment of linear temporal logic, called safe LTL over finite traces. We propose to use barrier certificates for computations of such lower bounds, which is computationally much more efficient than the existing discretization-based approaches. The new approach is discretization-free and does not suffer from the curse of dimensionality caused by discretizing state sets. The proposed approach relies on decomposing the negation of the specification into a union of sequential reachabilities and then using barrier certificates to compute upper bounds for these reachability probabilities. We demonstrate the effectiveness of the proposed approach on case studies with linear and polynomial dynamics.

Pushpak Jagtap, Sadegh Soudjani, Majid Zamani
Bisimilarity Distances for Approximate Differential Privacy

Differential privacy is a widely studied notion of privacy for various models of computation. Technically, it is based on measuring differences between probability distributions. We study (ε,δ)-differential privacy in the setting of labelled Markov chains. While the exact differences relevant to (ε,δ)-differential privacy are not computable in this framework, we propose a computable bisimilarity distance that yields a sound technique for measuring δ, the parameter that quantifies deviation from pure differential privacy. We show this bisimilarity distance is always rational, the associated threshold problem is in NP, and the distance can be computed exactly with polynomially many calls to an NP oracle.

Dmitry Chistikov, Andrzej S. Murawski, David Purser
A Symbolic Algorithm for Lazy Synthesis of Eager Strategies

We present an algorithm for solving two-player safety games that combines a mixed forward/backward search strategy with a symbolic representation of the state space. By combining forward and backward exploration, our algorithm can synthesize strategies that are eager in the sense that they try to prevent progress towards the error states as soon as possible, whereas standard backwards algorithms often produce permissive solutions that only react when absolutely necessary. We provide experimental results for two new sets of benchmarks, as well as the benchmark set of the Reactive Synthesis Competition (SYNTCOMP) 2017. The results show that our algorithm in many cases produces more eager strategies than a standard backwards algorithm, and solves a number of benchmarks that are intractable for existing tools. Finally, we observe a connection between our algorithm and a recently proposed algorithm for the synthesis of controllers that are robust against disturbances, pointing to possible future applications.

Swen Jacobs, Mouhammad Sakr
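
The standard backward part of safety-game solving is a simple attractor computation; a Python sketch over an explicit game graph follows. The paper's algorithm interleaves such backward steps with forward exploration from the initial states, which is what yields eager strategies.

```python
def attractor_to_error(edges, owner, error):
    """States from which the environment can force reaching `error`:
    an environment state needs one successor in the set, a controller
    state needs all successors. The controller wins safety elsewhere."""
    attr = set(error)
    changed = True
    while changed:
        changed = False
        for s, succs in edges.items():
            if s in attr or not succs:
                continue
            hit = (any(t in attr for t in succs) if owner[s] == 'env'
                   else all(t in attr for t in succs))
            if hit:
                attr.add(s)
                changed = True
    return attr

edges = {0: [1, 2], 1: [1], 2: [3], 3: [3]}
owner = {0: 'ctrl', 1: 'env', 2: 'env', 3: 'env'}
bad = attractor_to_error(edges, owner, {3})
print('controller wins from:', set(edges) - bad)  # {0, 1}: never move to 2
```
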
Modular Verification of Concurrent Programs via Sequential Model Checking

This work utilizes the plethora of work on verification of sequential programs for the purpose of verifying concurrent programs. We reduce the verification of a concurrent program to a series of verification tasks of sequential programs. Our approach is modular in the sense that each sequential verification task roughly corresponds to the verification of a single thread, with some additional information about the environment in which it operates. Information regarding the environment is gathered during the run of the algorithm, as needed. While our approach is general, it is specialized for concurrent programs where the threads are structured hierarchically. The idea is to exploit the hierarchy in order to minimize the amount of information that needs to be transferred between threads. To that end, we verify one of the threads, considered “main”, as a sequential program. Its verification process initiates queries to its “environment” (which may contain multiple threads). Those queries are answered by sequential verification, if the environment consists of a single thread, or, otherwise, by applying the same hierarchical algorithm on the environment. Our technique is fully automatic, and allows us to use any off-the-shelf sequential model checker. We implemented our technique in a tool called CoMuS and evaluated it against established tools for concurrent verification. Our experiments show that it works particularly well on hierarchically structured programs.

Dan Rasin, Orna Grumberg, Sharon Shoham
Quantifiers on Demand

Automated program verification is a difficult problem. It is undecidable even for transition systems over Linear Integer Arithmetic (LIA). Extending the transition system with the theory of arrays further complicates the problem by requiring inference of and reasoning with universally quantified formulas. In this paper, we present a new algorithm, Quic3, that extends IC3 to infer universally quantified invariants over the combined theory of LIA and arrays. Unlike other approaches that use either IC3 or an SMT solver as a black box, Quic3 carefully manages quantified generalization (to construct quantified invariants) and quantifier instantiation (to detect convergence in the presence of quantifiers). While Quic3 is not guaranteed to converge, it is guaranteed to make progress by exploring longer and longer executions. We have implemented Quic3 within the Constrained Horn Clause solver engine of Z3 and experimented with it by applying it to a variety of public benchmarks of array-manipulating C programs.

Arie Gurfinkel, Sharon Shoham, Yakir Vizel
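
Quic3 is implemented inside Z3's Constrained Horn Clause engine. The following quantifier-free toy shows the CHC encoding of a simple loop and solves it with z3py; the paper's extension is inferring universally quantified invariants over arrays on top of this machinery.

```python
from z3 import (SolverFor, Function, IntSort, BoolSort, Ints,
                ForAll, Implies, And)

# Encode  {x=0; y=0} while(*){x++; y++} assert(x==5 ==> y==5)
# as constrained Horn clauses; Z3's Spacer engine finds an invariant.
s = SolverFor('HORN')
inv = Function('inv', IntSort(), IntSort(), BoolSort())
x, y = Ints('x y')
s.add(ForAll([x, y], Implies(And(x == 0, y == 0), inv(x, y))))    # init
s.add(ForAll([x, y], Implies(inv(x, y), inv(x + 1, y + 1))))      # step
s.add(ForAll([x, y], Implies(And(inv(x, y), x == 5), y == 5)))    # safe
print(s.check())  # sat: an inductive invariant (x == y) exists
```
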
Signal Convolution Logic

We introduce a new logic called Signal Convolution Logic (SCL) that combines temporal logic with convolutional filters from digital signal processing. SCL enables reasoning about the percentage of time a formula is satisfied in a bounded interval. We demonstrate that this new logic is a suitable formalism to effectively express non-functional requirements in cyber-physical systems displaying noisy and irregular behaviours. We define both a qualitative and a quantitative semantics for it, providing an efficient monitoring procedure. Finally, we show SCL at work monitoring the artificial pancreas controllers that are employed to automate the delivery of insulin for patients with type-1 diabetes.

Simone Silvetti, Laura Nenzi, Ezio Bartocci, Luca Bortolussi
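
The core quantitative operator, the fraction of time a formula holds over a window, is a convolution of the Boolean satisfaction signal with a flat kernel. A numpy sketch follows; the glucose-like signal and the 80% threshold are invented for illustration.

```python
import numpy as np

def conv_satisfaction(holds, window, dt):
    """Fraction of time a Boolean signal holds within each sliding
    window of length `window`: convolution with a uniform kernel."""
    n = int(round(window / dt))
    kernel = np.ones(n) / n
    return np.convolve(holds.astype(float), kernel, mode='valid')

dt = 0.01
t = np.arange(0, 10, dt)
in_range = (np.sin(t) + 1.2 > 0.5)           # toy proxy for "in range"
frac = conv_satisfaction(in_range, window=2.0, dt=dt)
# SCL-style check: "at least 80% of every 2-second window is in range"
print(bool((frac >= 0.8).all()))
```
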
Efficient Symbolic Representation of Convex Polyhedra in High-Dimensional Spaces

This work is aimed at developing an efficient data structure for symbolically representing convex polyhedra. We introduce an original data structure, the Decomposed Convex Polyhedron (DCP), that is closed under intersection and linear transformations, and supports checking inclusion, equality, and emptiness. The main feature of DCPs lies in their ability to concisely represent polyhedra that can be expressed as combinations of simpler sets, which can overcome combinatorial explosion in high-dimensional spaces. DCPs also have the advantage of being reducible to a canonical form, which makes them efficient for representing simple sets constructed by long sequences of manipulations, such as those handled by state-space exploration tools. Their practical efficiency has been evaluated with the help of a prototype implementation, with promising results.

Bernard Boigelot, Isabelle Mainz
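
For reference, the basic operations on H-represented polyhedra {x : Ax ≤ b} that a DCP must support can each be reduced to linear programming; a scipy sketch of emptiness and inclusion checks follows (bounded polyhedra assumed). DCPs aim to make long sequences of such operations cheap through decomposition and canonical forms.

```python
import numpy as np
from scipy.optimize import linprog

def is_empty(A, b):
    """Is {x : Ax <= b} empty? An LP feasibility problem."""
    r = linprog(np.zeros(A.shape[1]), A_ub=A, b_ub=b, bounds=(None, None))
    return not r.success

def contains(A1, b1, A2, b2):
    """Does {A1 x <= b1} contain {A2 x <= b2}? True iff maximising
    each row of A1 over the second polyhedron never exceeds b1."""
    for c, d in zip(A1, b1):
        r = linprog(-c, A_ub=A2, b_ub=b2, bounds=(None, None))
        if not r.success or -r.fun > d + 1e-9:
            return False
    return True

sq_A = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
sq_b = np.array([1, 0, 1, 0])            # unit square
qt_b = np.array([0.5, 0, 0.5, 0])        # its lower-left quarter
print(is_empty(sq_A, sq_b))              # False
print(contains(sq_A, sq_b, sq_A, qt_b))  # True: square contains quarter
print(contains(sq_A, qt_b, sq_A, sq_b))  # False
```
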
Accelerated Model Checking of Parametric Markov Chains

Parametric Markov chains occur quite naturally in various applications: they can be used for a conservative analysis of probabilistic systems (no matter how the parameter is chosen, the system works to specification); they can be used to find optimal settings for a parameter; they can be used to visualise the influence of system parameters; and they can be used to make it easy to adjust the analysis when parameters change. Unfortunately, these advantages come at a cost: parametric model checking is (or rather was) often slow. To make the analysis of parametric Markov models scale, we need three ingredients: clever algorithms, the right data structure, and good engineering. Clever algorithms are often the main (or sole) selling point, but this paper focuses on the latter two ingredients of efficient model checking. Consequently, our easiest claim to fame is the speed-up we have often realised when comparing to the state of the art.

Paul Gainer, Ernst Moritz Hahn, Sven Schewe
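
The object being computed here is a rational function of the parameters. A naive sympy sketch on a three-state parametric chain, solving the linear reachability equations symbolically, illustrates this; the paper is about the algorithms, data structures, and engineering needed to do the same at scale.

```python
import sympy as sp

p = sp.symbols('p', positive=True)
x0, x2 = sp.symbols('x0 x2')  # probabilities of reaching the target

# Chain: s0 --p--> target, s0 --(1-p)--> s2, s2 --1/2--> s0, s2 --1/2--> sink
sol = sp.solve([sp.Eq(x0, p * 1 + (1 - p) * x2),
                sp.Eq(x2, sp.Rational(1, 2) * x0)], [x0, x2], dict=True)[0]
f = sp.simplify(sol[x0])
print(f)                              # 2*p/(p + 1)
print(f.subs(p, sp.Rational(1, 2)))   # instantiate the parameter: 2/3
```
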
Continuous-Time Markov Decisions Based on Partial Exploration

We provide a framework for speeding up algorithms for time-bounded reachability analysis of continuous-time Markov decision processes. The principle is to find a small, but almost equivalent subsystem of the original system and only analyse the subsystem. Candidates for the subsystem are identified through simulations and iteratively enlarged until runs are represented in the subsystem with high enough probability. The framework is thus dual to that of abstraction refinement. We instantiate the framework in several ways with several traditional algorithms and experimentally confirm orders-of-magnitude speed ups in many cases.

Pranav Ashok, Yuliya Butkova, Holger Hermanns, Jan Křetínský
A Fragment of Linear Temporal Logic for Universal Very Weak Automata

Many temporal specifications used in practical model checking can be represented as universal very weak automata (UVW). They are structurally simple and their states can be labeled by simple temporal logic formulas that they represent. For complex temporal properties, it can be hard to understand why a trace violates a property, so when employing UVWs in model checking, this information helps with interpreting the trace. At the same time, the simple structure of UVWs helps the model checker with finding short traces. While a translation from computation tree logic (CTL) with only universal path quantifiers to UVWs has been described in earlier work, complex temporal properties that define sequences of allowed events along computations of a system are easier to describe in linear temporal logic (LTL). However, no direct translation from LTL to UVWs with little blow-up is known. In this paper, we define a fragment of LTL that gives rise to a simple and efficient translation from it to UVW. The logic contains the most common shapes of safety and liveness properties, including all nestings of “Until”-subformulas. We give a translation from this fragment to UVWs that only has an exponential blow-up in the worst case, which we show to be unavoidable. We demonstrate that the simple shape of UVWs helps with understanding counter-examples in a case study.

Keerthi Adabala, Rüdiger Ehlers
Quadratic Word Equations with Length Constraints, Counter Systems, and Presburger Arithmetic with Divisibility

Word equations are a crucial element in the theoretical foundation of constraint solving over strings. A word equation relates two words over string variables and constants. Its solution amounts to a function mapping variables to constant strings that equates the left- and right-hand sides of the equation. While the problem of solving word equations is decidable, the decidability of the problem of solving a word equation with a length constraint (i.e., a constraint relating the lengths of words in the word equation) has remained a long-standing open problem. We focus on the subclass of quadratic word equations, i.e., those in which each variable occurs at most twice. We first show that the length abstractions of solutions to quadratic word equations are in general not Presburger-definable. We then describe a class of counter systems with Presburger transition relations which capture the length abstraction of a quadratic word equation with regular constraints. We provide an encoding of the effect of a simple loop of the counter systems in the existential theory of Presburger Arithmetic with divisibility (PAD). Since PAD is decidable, we get a decision procedure for quadratic word equations with length constraints for which the associated counter system is flat (i.e., all nodes belong to at most one cycle). In particular, we show a decidability result (in fact, also an NP algorithm with a PAD oracle) for a recently proposed NP-complete fragment of word equations called regular-oriented word equations, when augmented with length constraints. Decidability holds when the constraints are extended with regular constraints with a 1-weak control structure.

Anthony W. Lin, Rupak Majumdar
Round-Bounded Control of Parameterized Systems

We consider systems with unboundedly many processes that communicate through shared memory. In that context, simple verification questions have a high complexity or, in the case of pushdown processes, are even undecidable. Good algorithmic properties are recovered under round-bounded verification, which restricts the system behavior to a bounded number of round-robin schedules. In this paper, we extend this approach to a game-based setting. This allows one to solve synthesis and control problems and constitutes a further step towards a theory of languages over infinite alphabets.

Benedikt Bollig, Mathieu Lehaut, Nathalie Sznajder
PSense: Automatic Sensitivity Analysis for Probabilistic Programs

PSense is a novel system for sensitivity analysis of probabilistic programs. It computes the impact that noise in the values of the parameters of the prior distributions and the data has on the program’s result. PSense relates the program executions with and without noise using a developer-provided sensitivity metric. PSense calculates the impact as a set of symbolic functions of each noise variable and supports various non-linear sensitivity metrics. Our evaluation on 66 programs from the literature and five common sensitivity metrics demonstrates the effectiveness of PSense.

Zixin Huang, Zhenbang Wang, Sasa Misailovic
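
The flavour of the output, a symbolic function of the noise variable, can be reproduced by hand for a conjugate model. A sympy sketch using the expectation distance as the sensitivity metric; the Beta-Bernoulli model and the perturbation of alpha are illustrative assumptions.

```python
import sympy as sp

eps = sp.symbols('eps', real=True)
a, b, k, n = sp.symbols('alpha beta k n', positive=True)

# Posterior mean of a Beta(alpha, beta) prior after k successes in n trials.
post = (a + k) / (a + b + n)
noisy = post.subs(a, a + eps)       # perturb the prior parameter by eps

# Expectation-distance sensitivity as a symbolic function of eps.
sens = sp.simplify(noisy - post)
print(sens)
print(sens.subs({a: 1, b: 1, k: 7, n: 10, eps: sp.Rational(1, 10)}))  # 1/363
```
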
Information Leakage in Arbiter Protocols

Resource sharing while preserving privacy is an increasingly important problem due to a wide-scale adoption of cloud computing. Under multitenancy, it is common to have multiple mutually distrustful “processes” (e.g. cores, threads, etc.) running on the same system simultaneously. This paper explores a new approach for automatically identifying and quantifying the information leakage in protocols that arbitrate utilization of shared resources between processes. Our approach is based on symbolic execution of arbiter protocols to extract constraints relating adversary observations to victim requests, then using model counting constraint solvers to quantify the information leaked. We present enumerative and optimized methods of exact model counting, and apply our methods to a set of nine different arbiter protocols, quantifying their leakage under different scenarios and allowing for informed comparison.

Nestan Tsiskaridze, Lucas Bang, Joseph McMahan, Tevfik Bultan, Timothy Sherwood
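
A minimal sketch of the enumerative step with z3: count the distinct adversary observations consistent with the constraint and take log2 as the leaked information, a channel-capacity-style bound for a deterministic system. The two-bit "arbiter" relating grant delay to pending victim requests is an invented toy.

```python
import math
from z3 import BitVec, Solver, sat

def count_models(constraint, obs):
    """Enumerative model counting: block each found value of `obs`."""
    s = Solver()
    s.add(constraint)
    n = 0
    while s.check() == sat:
        n += 1
        s.add(obs != s.model().eval(obs, model_completion=True))
    return n

# Toy arbiter: the adversary observes its own grant delay, which here
# equals the number of pending victim requests (a 2-bit secret).
secret = BitVec('secret', 2)
delay = BitVec('delay', 2)
models = count_models(delay == secret, delay)
print('leakage =', math.log2(models), 'bits')  # 4 observations -> 2.0 bits
```
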
Neural State Classification for Hybrid Systems

We introduce the State Classification Problem (SCP) for hybrid systems, and present Neural State Classification (NSC) as an efficient solution technique. SCP generalizes the model checking problem as it entails classifying each state s of a hybrid automaton as either positive or negative, depending on whether or not s satisfies a given time-bounded reachability specification. This is an interesting problem in its own right, which NSC solves using machine-learning techniques, Deep Neural Networks in particular. State classifiers produced by NSC tend to be very efficient (run in constant time and space), but may be subject to classification errors. To quantify and mitigate such errors, our approach comprises: (i) techniques for certifying, with statistical guarantees, that an NSC classifier meets given accuracy levels; (ii) tuning techniques, including a novel technique based on adversarial sampling, that can virtually eliminate false negatives (positive states classified as negative), thereby making the classifier more conservative. We have applied NSC to six nonlinear hybrid system benchmarks, achieving an accuracy of 99.25% to 99.98%, and a false-negative rate of 0.0033 to 0, which we further reduced to 0.0015 to 0 after tuning the classifier. We believe that this level of accuracy is acceptable in many practical applications, and that these results demonstrate the promise of the NSC approach.

Dung Phan, Nicola Paoletti, Timothy Zhang, Radu Grosu, Scott A. Smolka, Scott D. Stoller
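
A compact illustration of the NSC pipeline, with labels generated by simulating a toy "braking car" automaton and an off-the-shelf scikit-learn MLP standing in for the paper's deep networks; the statistical certification and adversarial-tuning steps are not shown.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def reaches_unsafe(x, v, steps=50, dt=0.05):
    """Time-bounded reachability for a toy automaton: a braking car
    (deceleration 2 while v > 0) from position x; unsafe if x >= 10."""
    for _ in range(steps):
        x, v = x + v * dt, max(v - 2 * dt, 0.0)
        if x >= 10.0:
            return 1
    return 0

rng = np.random.default_rng(0)
X = rng.uniform([0.0, 0.0], [10.0, 10.0], size=(5000, 2))
y = np.array([reaches_unsafe(x, v) for x, v in X])

# Train a state classifier: positive = can reach the unsafe set in time.
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(X[:4000], y[:4000])
print('held-out accuracy:', clf.score(X[4000:], y[4000:]))
```
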
Bounded Synthesis of Reactive Programs

Most algorithms for the synthesis of reactive systems focus on the construction of finite-state machines rather than actual programs. This often leads to badly structured, unreadable code. In this paper, we present a bounded synthesis approach that automatically constructs, from a given specification in linear-time temporal logic (LTL), a program in Madhusudan’s simple imperative language for reactive programs. We develop and compare two principal approaches for the reduction of the synthesis problem to a Boolean constraint satisfaction problem. The first reduction is based on a generalization of bounded synthesis to two-way alternating automata; the second is based on a direct encoding of the program syntax in the constraint system. We report on preliminary experience with a prototype implementation, which indicates that the direct encoding outperforms the automata approach.

Carsten Gerstacker, Felix Klein, Bernd Finkbeiner
Maximum Realizability for Linear Temporal Logic Specifications

Automatic synthesis from linear temporal logic (LTL) specifications is widely used in robotic motion planning and control of autonomous systems. A common specification pattern in such applications consists of an LTL formula describing the requirements on the behaviour of the system, together with a set of additional desirable properties. We study the synthesis problem in settings where the overall specification is unrealizable, more precisely, when some of the desirable properties have to be (temporarily) violated in order to satisfy the system’s objective. We provide a quantitative semantics of sets of safety specifications, and use it to formalize the “best-effort” satisfaction of such soft specifications while satisfying the hard LTL specification. We propose an algorithm for synthesizing implementations that are optimal with respect to this quantitative semantics. Our method builds upon the idea of bounded synthesis, and we develop a MaxSAT encoding which allows for maximizing the quantitative satisfaction of the soft specifications. We evaluate our algorithm on scenarios from robotics and power distribution networks.

Rayna Dimitrova, Mahsa Ghasemi, Ufuk Topcu
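
The optimization backbone is MaxSAT: hard constraints for the main LTL objective and weighted soft constraints for the desirable properties. A tiny z3 Optimize example of the hard/soft split; the grant signals are invented, and the paper's encoding instead constrains the synthesized transition system.

```python
from z3 import Bool, Optimize, Or, And, Not, sat

g1, g2 = Bool('g1'), Bool('g2')

opt = Optimize()
opt.add(Or(g1, g2))                        # hard spec: serve some request
opt.add_soft(Not(And(g1, g2)), weight=2)   # soft: mutual exclusion
opt.add_soft(g1, weight=1)                 # soft: prefer serving client 1
opt.add_soft(g2, weight=1)                 # soft: prefer serving client 2
if opt.check() == sat:
    print(opt.model())  # keeps mutex, drops one unit-weight preference
```
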
Ranking and Repulsing Supermartingales for Reachability in Probabilistic Programs

Computing reachability probabilities is a fundamental problem in the analysis of probabilistic programs. This paper aims at a comprehensive and comparative account of various martingale-based methods for over- and underapproximating reachability probabilities. Based on the existing works that stretch across different communities (formal verification, control theory, etc.), we offer a unifying account. In particular, we emphasize the role of order-theoretic fixed points—a classic topic in computer science—in the analysis of probabilistic programs. This leads us to two new martingale-based techniques. Finally, we make an experimental comparison using our implementation of template-based synthesis algorithms for those martingales.

Toru Takisaka, Yuichiro Oyabu, Natsuki Urabe, Ichiro Hasuo
Bounded Synthesis of Register Transducers

Reactive synthesis aims at the automatic construction of systems from their behavioural specifications. The research mostly focuses on the synthesis of systems dealing with Boolean signals, but real-life systems are often described using bit-vectors, integers, etc. Bit-blasting would make such systems unreadable, hurt synthesis scalability, and is not possible for infinite data domains. One step closer to real-life systems are register transducers [10]: they can store the data-input into registers and later output the content of a register, but they do not directly depend on the data-input, only on its comparison with the registers. Previously [5] it was proven that synthesis of register transducers from register automata is undecidable, but there the authors considered transducers equipped with an unbounded queue of registers. First, we prove that the problem becomes decidable if we bound the number of registers in transducers, by reducing the problem to standard synthesis of Boolean systems. Second, we show how to use quantified temporal logic, instead of automata, for specifications.

Ayrat Khalimov, Benedikt Maderbacher, Roderick Bloem

Tool Papers

Frontmatter
EthIR: A Framework for High-Level Analysis of Ethereum Bytecode

Analyzing Ethereum bytecode, rather than the source code from which it was generated, is a necessity when: (1) the source code is not available (e.g., the blockchain only stores the bytecode), (2) the information to be gathered in the analysis is only visible at the level of bytecode (e.g., gas consumption is specified at the level of EVM instructions), or (3) the analysis results may be affected by optimizations performed by the compiler (thus the analysis should ideally be done after compilation). This paper presents EthIR, a framework for analyzing Ethereum bytecode, which relies on (an extension of) Oyente, a tool that generates CFGs. From the CFGs, EthIR produces a rule-based representation (RBR) of the bytecode that enables the application of (existing) high-level analyses to infer properties of EVM code.

Elvira Albert, Pablo Gordillo, Benjamin Livshits, Albert Rubio, Ilya Sergey
MGHyper: Checking Satisfiability of HyperLTL Formulas Beyond the ∃*∀* Fragment

Hyperproperties are properties that refer to multiple computation traces. This includes many information-flow security policies, such as observational determinism, (generalized) noninterference, and noninference, and other system properties like symmetry or Hamming distances between code words in error-resistant codes. We introduce MGHyper, a tool for automatic satisfiability checking and model generation for hyperproperties expressed in HyperLTL. Unlike previous satisfiability checkers, MGHyper is not limited to the decidable ∃*∀* fragment of HyperLTL, but provides a semi-decision procedure for the full logic. An important application of MGHyper is to automatically check equivalences between different hyperproperties (and different formalizations of the same hyperproperty) and to build counterexamples that disprove a claimed implication. We describe the semi-decision procedure implemented in MGHyper and report on experimental results obtained both with typical hyperproperties from the literature and with randomly generated HyperLTL formulas.

Bernd Finkbeiner, Christopher Hahn, Tobias Hans
Verifying Rust Programs with SMACK

Rust is an emerging systems programming language with guaranteed memory safety and modern language features that has been extensively adopted to build safety-critical software. However, there is currently a lack of automated software verifiers for Rust. In this work, we present our experience extending the SMACK verifier to enable its usage on Rust programs. We evaluate SMACK on a set of Rust programs to demonstrate a wide spectrum of language features it supports.

Marek Baranowski, Shaobo He, Zvonimir Rakamarić
SBIP 2.0: Statistical Model Checking Stochastic Real-Time Systems

This paper presents a major new release of SBIP, an extensible statistical model checker for Metric Temporal Logic (MTL) and Linear-time Temporal Logic (LTL) properties on, respectively, Generalized Semi-Markov Process (GSMP), Continuous-Time Markov Chain (CTMC), and Discrete-Time Markov Chain (DTMC) models. The newly added support for MTL, GSMPs, CTMCs, and rare events makes it possible to capture both real-time and stochastic aspects, allowing faithful specification, modeling, and analysis of real-life systems. SBIP is redesigned as an IDE providing project management, model editing, compilation, simulation, and statistical analysis.

Braham Lotfi Mediouni, Ayoub Nouri, Marius Bozga, Mahieddine Dellabani, Axel Legay, Saddek Bensalem
Owl: A Library for ω-Words, Automata, and LTL

We present the library Owl (Omega-Words, automata, and LTL) for ω-automata and linear temporal logic. It forms the backbone of several translations from LTL to automata and related tools by different authors. We describe the functionality of the library and report on recent experience, which has shown that the library is apt for easy prototyping of new tools in this area.

Jan Křetínský, Tobias Meggendorfer, Salomon Sickert
EVE: A Tool for Temporal Equilibrium Analysis

We present EVE (Equilibrium Verification Environment), a formal verification tool for the automated analysis of temporal equilibrium properties of concurrent and multi-agent systems. In EVE, systems are modelled using the Simple Reactive Module Language (SRML) as a collection of independent system components (players/agents in a game) and players’ goals are expressed using Linear Temporal Logic (LTL) formulae. EVE can be used to automatically check the existence of pure strategy Nash equilibria in such concurrent and multi-agent systems and to verify which temporal logic properties are satisfied in the equilibria.

Julian Gutierrez, Muhammad Najib, Giuseppe Perelli, Michael Wooldridge
Backmatter
Metadata
Title
Automated Technology for Verification and Analysis
Edited by
Shuvendu K. Lahiri
Chao Wang
Copyright year
2018
Electronic ISBN
978-3-030-01090-4
Print ISBN
978-3-030-01089-8
DOI
https://doi.org/10.1007/978-3-030-01090-4