Invited Talks
UNIFYING Invited Speaker:
Geoffrey Smith (Florida International University, US)
Operational Significance and Robustness in Quantitative Information Flow
Abstract: Protecting sensitive information from improper disclosure is a fundamental security goal, but one that is clearly not being achieved well in today’s cyber infrastructure. The issue is complicated by the realization that some leakage of sensitive information is often unavoidable in practice, due either to system functionality or to side channels. For this reason, the last decade has seen growing interest in quantitative theories of information flow, which let us talk about "how much" information is leaked and perhaps allow us to tolerate "small" leaks. One major theme has been the development of leakage measures with strong operational significance, so that the amount of information leaked is associated with strong security guarantees; in this respect, notable measures include min-entropy leakage and g-leakage, which uses gain functions g to model the operational scenario. A second, somewhat contrary, theme aims at robustness, trying to minimize sensitivity to (perhaps questionable) assumptions about the adversary’s prior knowledge and goals, as modeled by the secret’s prior distribution and by the gain function. Approaches to robustness include the study of capacity (the maximum leakage over all priors or gain functions) and of the strong g-leakage ordering (which requires that one channel never leak more than another, regardless of the prior or gain function). This talk will survey these and other recent developments in quantitative information flow, and will also discuss directions for future research.
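To make the definitions concrete, here is a minimal Python sketch (not from the talk; the channel and prior are invented examples) of min-entropy leakage, the log-ratio of the adversary's posterior and prior probabilities of guessing the secret in one try:

```python
# Illustrative sketch: min-entropy leakage of a channel, following the
# standard definitions. A channel is a matrix C[x][y] = P(output y | secret x).
from math import log2

def prior_vulnerability(prior):
    # Probability of guessing the secret in one try, before any observation.
    return max(prior)

def posterior_vulnerability(prior, channel):
    # Expected probability of guessing the secret after seeing the output.
    return sum(max(prior[x] * channel[x][y] for x in range(len(prior)))
               for y in range(len(channel[0])))

def min_entropy_leakage(prior, channel):
    # Leakage in bits: how much the observation improves the adversary's guess.
    return log2(posterior_vulnerability(prior, channel) /
                prior_vulnerability(prior))

# Invented example: a channel revealing whether a 2-bit secret equals zero.
prior = [0.25, 0.25, 0.25, 0.25]
channel = [[1, 0],  # secret 0 -> observation "zero"
           [0, 1],  # secrets 1..3 -> observation "nonzero"
           [0, 1],
           [0, 1]]
print(min_entropy_leakage(prior, channel))  # 1.0 bit
```

In the g-leakage generalization described in the abstract, the one-try guessing gain is replaced by an arbitrary gain function g modeling the operational scenario.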
UNIFYING Invited Speaker:
John Launchbury (Galois, US)
Practical Challenges to Secure Computation
Abstract: In secure computation, one or more parties collaborate to compute a result while keeping all the inputs private. That is, no one can gain knowledge about the inputs from the other parties, except what can be determined from the output of the computation. Methods of secure computation include fully homomorphic encryption (where one party owns the input data and the other party performs the whole computation) and secure multiparty computation (where multiple parties collaborate in the computation itself). The underlying methods are still exceedingly costly in time, space, and communication requirements, but there are also many other practical problems to be solved before secure computation can be usable. For programmers, the algorithm construction is often nonintuitive; for compiler writers, the machine assumptions are very different from usual; and for application designers, the application information flow has to match the security architecture. In this talk we will highlight these challenges, and indicate promising research directions.
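As a concrete illustration of the multiparty flavor, here is a hedged Python sketch (a toy protocol invented for this listing, not any particular system's API) of a secure sum via additive secret sharing, a basic building block of many secure multiparty computation protocols:

```python
# Illustrative toy protocol: each party additively splits its private input
# into n shares over a prime field; any n-1 shares look uniformly random,
# yet the share-wise partial sums reveal the total (and nothing more).
import random

P = 2**61 - 1  # prime modulus (an arbitrary choice for this sketch)

def share(secret, n):
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % P)  # shares sum to the secret mod P
    return shares

def secure_sum(inputs):
    n = len(inputs)
    # all_shares[i][j] is the share party i sends to party j.
    all_shares = [share(x, n) for x in inputs]
    # Each party j locally adds the shares it received, one from every party.
    partials = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    # Publishing only the partial sums reveals the total, not the inputs.
    return sum(partials) % P

print(secure_sum([10, 20, 30]))  # 60
```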
CC 2014 Invited Speaker:
Benoît Dupont de Dinechin (Kalray, France)
Using the SSA-Form in a Code Generator
Abstract: In high-end compilers such as Open64, GCC or LLVM, the Static Single Assignment (SSA) form is a structural part of the target-independent program representation that supports most of the code optimizations. However, aggressive compilation also requires that optimizations that are more effective with the SSA form be applied to the target-specific program representations operated on by the code generator, that is, the set of compiler phases after and including instruction selection.
While using the SSA form in the code generator has definite advantages, the SSA form does not apply to all the code generator program representations, and is not suited for all optimizations. We discuss some of the issues of introducing the SSA form into a code generator, specifically: the challenges of maintaining the SSA form on a program representation based on machine instructions; how the SSA form may be used in if-conversion optimizations; why the SSA form does not seem to benefit instruction scheduling; and the state of the art in SSA form destruction on machine code.
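To fix intuitions about the representation itself, here is a toy Python sketch (invented for illustration; real code generators operate on machine instructions, not tuples) of SSA renaming for straight-line three-address code; control-flow joins would additionally require phi-functions, which this sketch omits:

```python
# Illustrative sketch: SSA renaming for straight-line three-address code,
# represented as (dest, op, operand1, operand2) tuples. Each assignment
# creates a fresh version of its target; each use refers to the latest
# version. Join points would also need phi-functions, omitted here.

def to_ssa(code):
    version = {}  # variable name -> current version number
    out = []
    for dst, op, lhs, rhs in code:
        def use(v):
            # Rename a use to its current version; constants and
            # never-assigned names pass through unchanged.
            return f"{v}{version[v]}" if v in version else v
        lhs, rhs = use(lhs), use(rhs)  # rename uses before the new definition
        version[dst] = version.get(dst, 0) + 1
        out.append((f"{dst}{version[dst]}", op, lhs, rhs))
    return out

# x = a + b; x = x * 2; y = x + a
code = [("x", "+", "a", "b"), ("x", "*", "x", "2"), ("y", "+", "x", "a")]
print(to_ssa(code))
# [('x1', '+', 'a', 'b'), ('x2', '*', 'x1', '2'), ('y1', '+', 'x2', 'a')]
```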
ESOP 2014 Invited Speaker:
Maurice Herlihy (Brown University, US)
Why Concurrent Data Structures are Still Hard
Abstract: Ultimately, whether we like it or not, everything in computer science is driven by advances in hardware. Today, it is getting harder and harder to make individual processors run faster, so hardware architects are turning to parallelism to provide scalability, from multicores to clusters to distributed systems.
Software designers must somehow learn to make effective use of increasing parallelism. This adaptation will not be easy. Conventional synchronization techniques based on locks and conditions have many well-known limitations. Models based on transactions are promising, but they, too, face daunting challenges, both in implementation and reasoning. This talk will survey some of those challenges, along with a number of open research problems.
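As a small reminder of why lock-based composition is delicate, here is a hedged Python sketch (a textbook example, not from the talk) of a two-account transfer: taking the locks in argument order can deadlock when two transfers run in opposite directions, and the standard fix of a global lock order is a discipline that every caller must know about and respect:

```python
# Illustrative sketch: composing two fine-grained lock-based operations.
# Naively locking src then dst deadlocks when transfer(a, b) and
# transfer(b, a) run concurrently; a canonical lock order avoids this.
import threading

class Account:
    def __init__(self, balance):
        self.balance = balance
        self.lock = threading.Lock()

def transfer(src, dst, amount):
    # Acquire both locks in a canonical (id-based) order to prevent deadlock.
    first, second = sorted((src, dst), key=id)
    with first.lock:
        with second.lock:
            src.balance -= amount
            dst.balance += amount

a, b = Account(100), Account(100)
t1 = threading.Thread(target=transfer, args=(a, b, 30))
t2 = threading.Thread(target=transfer, args=(b, a, 10))
t1.start(); t2.start(); t1.join(); t2.join()
print(a.balance, b.balance)  # 80 120
```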
FASE 2014 Invited Speaker:
Christel Baier (Technical University of Dresden, Germany)
Probabilistic Model Checking and Non-standard Multi-objective Reasoning
Abstract: Probabilistic model checking is a well-established method for automated quantitative system analysis. It has been used in various application areas such as coordination algorithms for distributed systems, communication and multimedia protocols, biological systems, resilient systems, and security. In this paper, we report on our experiences in interdisciplinary research projects where we contribute formal methods for the analysis of hardware and software systems. Many performance measures that the respective domain experts have identified as highly relevant refer to multiple objectives and require a good balance between two or more cost or reward functions, such as energy and utility. The formalization of these performance measures requires several concepts, such as quantiles, conditional probabilities and expectations, and ratios of cost or reward functions, that are not supported by state-of-the-art probabilistic model checkers. We report on our current work in this direction, including applications in the field of software product line verification.
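As one concrete instance of such a non-standard measure, here is a hedged Python sketch (a toy example invented for this listing, not from the paper or any model checker's API) of a quantile query on a discrete-time Markov chain: the least horizon k such that a goal state is reached within k steps with probability at least p:

```python
# Illustrative sketch: a quantile query on a discrete-time Markov chain,
# computed by iterating step-bounded reachability probabilities.

def quantile(trans, goal, s0, p, max_k=1000):
    # trans[s] = list of (successor, probability); goal = set of states.
    # prob[s] is Pr(reach goal from s within the current number of steps).
    prob = {s: 1.0 if s in goal else 0.0 for s in trans}
    for k in range(max_k + 1):
        if prob[s0] >= p:
            return k  # the least horizon meeting the threshold
        prob = {s: 1.0 if s in goal else
                   sum(pr * prob[t] for t, pr in trans[s])
                for s in trans}
    return None  # threshold not reached within max_k steps

# Invented chain: from s0, reach 'goal' with probability 0.5 per step.
trans = {"s0": [("goal", 0.5), ("s0", 0.5)], "goal": [("goal", 1.0)]}
print(quantile(trans, {"goal"}, "s0", 0.99))  # 7, since 1 - 0.5**7 >= 0.99
```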
FOSSACS 2014 Invited Speaker:
Petr Jančar (Technical University of Ostrava, Czech Republic)
Equivalences of Pushdown Systems are Hard
Abstract: The talk aims to draw renewed research attention to the still unclarified complexity of some fundamental equivalence problems.
Language equivalence of deterministic pushdown automata (DPDA) was shown to be decidable by Sénizergues, who received the Gödel Prize in 2002 for this achievement. He also generalized the proof to show the decidability of bisimulation equivalence of (nondeterministic) PDA with restricted epsilon-rules. Stirling showed that DPDA equivalence is primitive recursive, but no upper bound is known for the more general problem. Regarding lower bounds, only P-hardness is known in the case of DPDA, while the general problem for PDA is nonelementary, as shown by Benedikt, Göller, Kiefer, and Murawski, even for PDA with no epsilon-rules.
One main goal of the talk is to sketch the main ideas behind Stirling's approach in a non-technical way, in the framework of standard first-order terms (depicted as trees), which also allows one to see that DPDA equivalence is, in fact, in TOWER, i.e. in the "least" nonelementary complexity class.
Another main goal is to show why bisimulation equivalence on PDA with restricted epsilon-rules is even Ackermann-hard, and thus not primitive recursive.
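For readers less familiar with the equivalence itself, here is a hedged Python sketch (invented for illustration) of bisimulation equivalence on finite labelled transition systems, decided by naive partition refinement; the results above concern the same question for the infinite-state systems generated by PDA, which is what makes them hard:

```python
# Illustrative sketch: deciding bisimulation equivalence on a *finite*
# labelled transition system by naive partition refinement. Blocks split
# until any two states in a block can match each other's moves into blocks.

def bisimilar(trans, s, t):
    # trans: set of (state, action, state) triples.
    states = set()
    for (p, _, q) in trans:
        states.update((p, q))
    blocks = [frozenset(states)]
    def signature(x):
        # Which (action, target-block) moves x can make w.r.t. current blocks.
        return frozenset((a, i) for (p, a, q) in trans if p == x
                         for i, b in enumerate(blocks) if q in b)
    while True:
        new = []
        for b in blocks:
            groups = {}
            for x in b:
                groups.setdefault(signature(x), set()).add(x)
            new.extend(frozenset(g) for g in groups.values())
        if len(new) == len(blocks):  # no block was split: partition is stable
            return any(s in b and t in b for b in new)
        blocks = new

# Invented example: two cyclic processes that match each other's a/b moves.
trans = {("s", "a", "s1"), ("s1", "b", "s"), ("t", "a", "t1"), ("t1", "b", "t")}
print(bisimilar(trans, "s", "t"))  # True
```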
POST 2014 Invited Speaker:
David Mazières (Stanford University, US)
Security and the Average Programmer
TACAS 2014 Invited Speaker:
Orna Kupferman (Hebrew University Jerusalem, Israel)
Variations on Safety
Abstract: Of special interest in formal verification are safety properties, which assert that the system always stays within some allowed region, in which nothing "bad" happens. Equivalently, a property is a safety property if every violation of it occurs after a finite execution of the system. Thus, a computation violates the property if it has a "bad prefix", all of whose extensions violate the property. The theoretical properties of safety properties, as well as their practical advantages with respect to general properties, have been widely studied. The paper surveys several extensions and variations of safety. We start with bounded and checkable properties, fragments of safety properties that enable even simpler reasoning. We proceed to a reactive setting, where safety properties require the system to stay in a region of states that is both allowed and from which the environment cannot force it out. Finally, we describe a probability-based approach for defining different levels of safety.
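To illustrate the bad-prefix view, here is a hedged Python sketch (an invented example property, not from the paper) of a monitor for the safety property "no send event ever occurs after a close event"; any violation is witnessed by a finite bad prefix:

```python
# Illustrative monitor for the safety property "no 'send' after 'close'".
# A violation is detected on a finite prefix; every extension of a bad
# prefix also violates the property, which is what makes it a safety property.

def bad_prefix(trace):
    closed = False
    for i, event in enumerate(trace):
        if event == "close":
            closed = True
        elif event == "send" and closed:
            return trace[:i + 1]  # the shortest bad prefix of this trace
    return None  # no violation witnessed by this (finite) prefix

print(bad_prefix(["send", "close", "send"]))  # ['send', 'close', 'send']
print(bad_prefix(["send", "send", "close"]))  # None
```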