The researcher from the IMDEA Software Institute, Isabel García, successfully presented her thesis last Wednesday, July 21. “A scalable static analysis framework for reliable program development exploiting incrementality and modularity” is the title of the work developed by the researcher, supervised by the distinguished professor and former director of the Institute, Manuel Hermenegildo. With this thesis, Isabel completes her doctoral studies in artificial intelligence at the Polytechnic University of Madrid (UPM).
Her object of study responds to a clear trend: automatic source code analysis and verification is of great importance both for software development and for software maintenance.
Automatic static analysis tools allow inferring properties about software without executing it and without the need for human interaction. When these tools are based on formal methods, the inferred properties are guaranteed to hold and come with a mathematical proof. Using these tools during the coding, testing, and maintenance phases of the software development cycle helps reduce time and cost, as they contribute to the early detection of bugs, automatic optimization, and automatic documentation. The increasing importance of the reliability of evolving software is evidenced by the current number of tools and online platforms for continuous integration and deployment. In this setting, where changes happen fast, analysis tools are only useful if they are precise and, at the same time, scalable enough to provide results before the next change happens.
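To give a flavor of how a static analysis can infer a property without running the program, here is a minimal illustrative sketch (not taken from the thesis) of abstract interpretation over a simple “sign” domain; all names in it are hypothetical:

```python
# Abstract values describe only the sign of a number: 'pos', 'neg',
# 'zero', or 'top' (sign unknown). The analysis computes with these
# abstract values instead of concrete numbers.

def abs_mul(a, b):
    """Abstract multiplication on signs."""
    if a == 'zero' or b == 'zero':
        return 'zero'
    if 'top' in (a, b):
        return 'top'
    return 'pos' if a == b else 'neg'  # same signs -> pos, else neg

def abs_add(a, b):
    """Abstract addition on signs."""
    if a == b:
        return a           # pos+pos=pos, neg+neg=neg, zero+zero=zero
    if a == 'zero':
        return b
    if b == 'zero':
        return a
    return 'top'           # pos+neg: the sign cannot be determined

# Analyze the expression (-3) * (-5) + 2 abstractly: the analysis
# concludes the result is positive without ever computing the value 13.
result = abs_add(abs_mul('neg', 'neg'), 'pos')
```

Coarser domains like this one are cheap and scalable but lose precision (e.g. `pos + neg` yields “unknown”), which is exactly the precision/scalability trade-off the thesis examines.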
In her thesis, Isabel studies scalable analyses in the context of abstract interpretation. Since one way to improve scalability is to perform coarser abstractions, she first inspects what effect this may have on effectively proving the absence of bugs. Second, she presents a framework for scalable static analyses that is generic, that is, independent of the data abstraction of the program. Isabel presents several algorithms for incrementally reanalyzing whole programs in a context-sensitive manner, reusing as much as possible of the previous analysis results. A key novel aspect of the approach is taking advantage of the modular structure of programs, typically as defined by the programmer, while keeping a fine-grained relation between the analysis result and the source program.
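The incrementality idea can be sketched at a high level as follows. This is a hypothetical illustration, not the thesis algorithms: analysis results are cached per module, and after a change only the modules whose source actually differs are reanalyzed, while cached results are reused for the rest.

```python
import hashlib

def analyze(src):
    # Stand-in for a real analysis: "infer" the set of names defined
    # in the module source (one definition per line).
    return {line.split('(')[0] for line in src.splitlines() if line}

cache = {}  # module name -> (source hash, cached analysis result)

def analyze_incrementally(modules):
    """Reanalyze only modules whose source changed; return their names."""
    reanalyzed = []
    for name, src in modules.items():
        h = hashlib.sha256(src.encode()).hexdigest()
        if name not in cache or cache[name][0] != h:
            cache[name] = (h, analyze(src))  # changed: recompute
            reanalyzed.append(name)
        # unchanged modules keep their cached result untouched
    return reanalyzed

mods = {'lists': "append(X)\nlength(X)", 'main': "main"}
analyze_incrementally(mods)            # first run analyzes both modules
mods['main'] = "main\nrun"
changed = analyze_incrementally(mods)  # only 'main' is reanalyzed
```

A real context-sensitive analysis must additionally propagate the effects of a change across module boundaries (callers and callees), which is where the fine-grained relation between analysis results and source program that the thesis maintains becomes essential.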
Additionally, she presents a mechanism that lets the programmer help the analyzer, in terms of both precision and performance, by means of assertions. She shows that these assertions, together with incremental analysis, are especially useful when analyzing generic code. All of these algorithms have been implemented and evaluated for different abstract domains within the CiaoPP framework.
Lastly, she presents an application of the analysis framework to perform on-the-fly assertion checking, providing continuous and almost instantaneous feedback to the programmer as the code is written. Her initial experience with this integrated tool shows quite promising results, with latencies low enough to provide early, continuous, and precise semantic feedback to programmers during the development process. This allows detecting many types of errors, including swapped variables, property incompatibilities, illegal calls to library predicates, violated numeric constraints, and unintended behavior with respect to termination, resource usage, determinism, covering, and failure.