28 Nov: Correctness of Speculative Optimizations with Dynamic Deoptimization

  • Date: November 28th, 11am
  • B31
  • Speakers:
    • Olivier Flückiger (Northeastern University, US)
    • Gabriel Scherer (Parsifal, INRIA Saclay, France)


  Correctness of Speculative Optimizations with Dynamic Deoptimization


  In this talk we will not assume prior knowledge of JIT compilation.

  High-performance dynamic language implementations make heavy use of speculative optimizations to achieve speeds close to statically compiled languages. These optimizations are typically implemented by a just-in-time compiler that generates code optimized under a set of assumptions about the state of the program and its environment. In certain cases, a program may find itself executing code compiled under assumptions that have become invalid; the language implementation must then execute a bailout procedure that exits the speculatively optimized code region. Effectively, the implementation must deoptimize the program on the fly. This entails finding a corresponding code region that does not rely on invalid assumptions, translating the program state to that expected by the target code, and transferring control. The procedure is sometimes called on-stack replacement, as it entails modifying stack frames and, in some cases, synthesizing new ones for inlined functions.

  Reasoning about the correctness of speculative optimizations is challenging because the optimized code is not equivalent to the source program: the main benefit of speculative optimization is to remove low-likelihood code paths entirely. Soundness is recovered only when the bailout mechanism is taken into account, and even then, it must be possible to recreate the program state exactly as the unoptimized program expects.
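  The guard-and-bailout pattern described above can be sketched in a few lines. The following is a minimal illustration (all names are hypothetical, not from the presented work): an "optimized" function compiled under the assumption that both operands are integers, a guard that checks the assumption, and a dispatcher that transfers control to a baseline version when the guard fails.

```python
class DeoptException(Exception):
    """Signals that a speculative assumption no longer holds."""


def add_generic(x, y):
    # Baseline version: handles any operand types (slow but always correct).
    if isinstance(x, int) and isinstance(y, int):
        return x + y
    return str(x) + str(y)  # e.g. dynamic string concatenation


def add_speculated(x, y):
    # "Optimized" under the assumption that both operands are ints.
    # The low-likelihood non-int path has been removed entirely.
    if not (isinstance(x, int) and isinstance(y, int)):
        raise DeoptException()  # guard failed: bail out
    return x + y  # fast path


def call_add(x, y):
    # Stands in for the runtime: on bailout, transfer control (and the
    # live state x, y) to the baseline version.
    try:
        return add_speculated(x, y)
    except DeoptException:
        return add_generic(x, y)


print(call_add(1, 2))      # fast path: 3
print(call_add("a", "b"))  # guard fails, deoptimizes to baseline: "ab"
```

A real JIT does this at the machine-code level, reconstructing stack frames rather than raising an exception, but the control-flow shape is the same.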

  The presented work, joint work with Ming-Ho Yee, Aviral Goel, Amal Ahmed, and Jan Vitek at Northeastern University, looks closely at the interaction between compiler optimization and deoptimization. We demonstrate that reasoning about speculative optimizations becomes surprisingly easy when assumptions are reified in the intermediate representation. To this end, we present a model intermediate representation (IR) that stands in for the high-level IR of a compiler for a dynamic language. Our IR models core difficulties of speculative optimization, such as interleaving assumptions and optimizations, and the interaction with inlining. Our formalization stays at the IR level, thus abstracting away orthogonal issues such as code generation and self-modification. We describe a set of common optimizations (constant folding, unreachable code elimination, and function inlining) and prove them correct in the presence of speculative assumptions.
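  To give a flavor of what "reifying assumptions in the IR" buys the optimizer, here is a toy sketch (all instruction names and the fold pass are hypothetical, not the IR of the paper): an explicit Assume instruction records a speculated fact, and a constant-folding pass may treat that fact as known downstream, as long as the guard itself is kept in place.

```python
from dataclasses import dataclass


@dataclass
class Const:
    val: int


@dataclass
class Var:
    name: str


@dataclass
class Add:
    lhs: object
    rhs: object


@dataclass
class Assume:
    # Speculates `var == val`; the deopt target is elided in this sketch.
    var: str
    val: int


def fold(instrs):
    """Constant-fold, using facts established by earlier Assume instructions."""
    env = {}  # constants known to hold, learned from Assume guards
    out = []
    for ins in instrs:
        if isinstance(ins, Assume):
            env[ins.var] = ins.val  # the assumption is a usable fact...
            out.append(ins)         # ...but the guard itself must remain
        elif isinstance(ins, Add):
            lhs = env.get(ins.lhs.name) if isinstance(ins.lhs, Var) else ins.lhs.val
            rhs = env.get(ins.rhs.name) if isinstance(ins.rhs, Var) else ins.rhs.val
            if lhs is not None and rhs is not None:
                out.append(Const(lhs + rhs))  # both operands known: fold
            else:
                out.append(ins)
        else:
            out.append(ins)
    return out


prog = [Assume("x", 40), Add(Var("x"), Const(2))]
print(fold(prog))  # the Add folds to Const(42); the Assume guard stays
```

Because the assumption is an ordinary instruction, standard passes need no special machinery to exploit it; correctness then hinges on the guard remaining on every path that relies on the folded fact, which is the kind of invariant the formalization makes precise.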

Posted by admin at 27 November 2017, 8:04 am