Alessio Mansutti, Research Associate, University of Oxford, United Kingdom
In this talk I will touch on two results I obtained during my PhD and postdoc in the area of logics for algorithmic verification.
Integer linear arithmetic, also known as Presburger arithmetic, has been a central subject in computer science for many decades. A celebrated result by Ginsburg and Spanier shows that the family of sets definable in Presburger arithmetic coincides with the family of semilinear sets. Despite more than fifty years of work on semilinear sets and several attempts, one question has remained unanswered: how do we efficiently compute the complement of a semilinear set? In recent work with Chistikov and Haase, I answer this question by giving an optimal algorithm to compute such complements. As a corollary, this result gives a new way of deciding Presburger arithmetic optimally. Furthermore, our algorithm allows us to positively answer a conjecture by Nguyen and Pak on the VC dimension of Presburger arithmetic [Combinatorica 39, 2019]. The VC dimension is a fundamental concept from computational learning theory that can be applied to all first-order theories.
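To make the Ginsburg–Spanier correspondence concrete, the following sketch (illustrative only; the function names are my own, not from the paper) tests membership in a semilinear set, i.e., a finite union of linear sets of the form L(b; p_1, ..., p_k) = { b + c_1*p_1 + ... + c_k*p_k : c_i ∈ ℕ }, here in dimension one:

```python
def in_linear_set(x, base, periods):
    """Check whether the natural number x lies in the linear set
    L(base; periods) = { base + sum_i c_i * p_i : c_i in N }."""
    target = x - base
    if target < 0:
        return False
    # Coin-problem style dynamic programming over nonnegative combinations.
    reachable = [False] * (target + 1)
    reachable[0] = True
    for p in periods:
        for v in range(p, target + 1):
            if reachable[v - p]:
                reachable[v] = True
    return reachable[target]

def in_semilinear_set(x, components):
    """A semilinear set is a finite union of linear sets,
    given as pairs (base, periods)."""
    return any(in_linear_set(x, b, ps) for b, ps in components)

# Example: the set defined by the Presburger formula
#   exists y, z . x = 1 + 3y + 5z
S = [(1, [3, 5])]
print(in_semilinear_set(4, S))   # True: 4 = 1 + 3
print(in_semilinear_set(3, S))   # False
```

Deciding membership is easy; the hard problem addressed in the work above is producing a semilinear representation of the *complement* of such a set without a blow-up in size.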
Separation logic is a well-known logic for the modular verification of pointer programs. An open problem in the area was how to design sound and complete proof systems for its assertion language, the major issue being that the multiplicative connectives of separation logic break all standard techniques used to prove completeness of axiom systems. In a paper published at CSL'20, I showed how to handle this issue, and derived the first sound and complete proof system for a separation logic featuring both multiplicative connectives (the so-called separating conjunction and separating implication). Completeness is shown with a model-theoretic technique that is reusable in practice, and that can be adapted to tackle complexity questions.
After a round-up of the aforementioned results, I will move to my current research project, which aims to develop proof techniques and algorithms to mathematically prove the presence of arithmetic errors (e.g., buffer and integer overflows) in programs.
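The kind of arithmetic error targeted here can be illustrated with a small Python sketch that simulates 32-bit two's-complement addition (the helper names are mine, purely for illustration):

```python
def add_i32(a, b):
    """Simulate 32-bit two's-complement signed addition, as a C int would behave."""
    r = (a + b) & 0xFFFFFFFF
    return r - 0x100000000 if r >= 0x80000000 else r

def overflows_i32(a, b):
    """True iff a + b is not representable as a signed 32-bit integer."""
    return not (-2**31 <= a + b <= 2**31 - 1)

INT_MAX = 2**31 - 1
print(add_i32(INT_MAX, 1))        # wraps around to -2147483648
print(overflows_i32(INT_MAX, 1))  # True: the mathematical sum exceeds INT_MAX
```

Proving the *presence* of such a wrap-around on some program path, rather than its absence, is what distinguishes this project from classical verification.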