Wed 18 Jan 2023 15:35 - 16:00 at Avenue34 - Security Chair(s): Benjamin C. Pierce

This paper proposes a reconciliation of two different theories of information. The first, originally proposed in a lesser-known work by Claude Shannon (some five years after the publication of his celebrated quantitative theory of communication), shows how the information content of channels can be characterised \emph{qualitatively}, yet still abstractly, in terms of \emph{information elements}, which can be viewed as equivalence relations over the data source domain. Shannon showed that these elements have a partial ordering, expressing when one information element is more informative than another, and that the partially ordered information elements form a complete lattice. In the context of security and information flow, this structure has been independently rediscovered several times and used as a foundation for understanding and reasoning about information flow.
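As an illustration (our own toy construction, not taken from the paper), Shannon's information elements over a finite data-source domain can be modelled as partitions, i.e. sets of equivalence classes, with the refinement ordering capturing when one element is more informative than another:

```python
# Toy model: an information element over a finite domain is a partition
# (the set of equivalence classes of an equivalence relation).

def blocks(eq, domain):
    """Partition `domain` into the equivalence classes of relation `eq`."""
    seen, parts = set(), []
    for x in domain:
        if x not in seen:
            cls = frozenset(y for y in domain if eq(x, y))
            seen |= cls
            parts.append(cls)
    return set(parts)

def refines(p, q):
    """p is at least as informative as q: every p-block lies inside a q-block."""
    return all(any(b <= c for c in q) for b in p)

domain = range(4)
exact   = blocks(lambda x, y: x == y, domain)          # observer learns the value
parity  = blocks(lambda x, y: x % 2 == y % 2, domain)  # observer learns parity only
nothing = blocks(lambda x, y: True, domain)            # observer learns nothing

assert refines(exact, parity) and refines(parity, nothing)
assert not refines(nothing, parity)  # the ordering is non-trivial
```

In the full lattice, meets and joins of such partitions exist for arbitrary sets of elements; the sketch above only checks the ordering on three sample elements.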

The second theory of information is Dana Scott's domain theory, a mathematical framework for giving meaning to programs as continuous functions over a particular topology. Scott's partial ordering also represents when one element is more informative than another, but in the sense of computational progress: one element is below another when the latter is a more defined or evolved version of the former.
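Scott's definedness ordering can be sketched with a hedged toy model (again our own, not the paper's machinery): a flat domain where an undefined element sits below every defined value, lifted componentwise to pairs:

```python
# Toy model of Scott's definedness ordering: BOT stands for the undefined
# element (e.g. a non-terminating computation). In the flat domain a value
# is only below itself, while BOT is below everything; pairs are ordered
# componentwise.

BOT = None

def below_flat(x, y):
    return x is BOT or x == y

def below_pair(p, q):
    return below_flat(p[0], q[0]) and below_flat(p[1], q[1])

# The first projection is monotone: p below q implies fst(p) below fst(q),
# so a more defined input can only yield a more defined observation.
fst = lambda p: p[0]

assert below_pair((BOT, 1), (0, 1))      # (BOT,1) is below (0,1)
assert not below_pair((0, 1), (BOT, 1))  # but not the other way round
assert below_flat(fst((BOT, 1)), fst((0, 1)))
```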

To give a satisfactory account of information flow in computer programs it is necessary to consider both theories together, in order to understand not only what information is conveyed by a program (viewed as a channel, à la Shannon) but also how the precision with which that information can be observed is determined by the definedness of its encoding (à la Scott). To this end we show how these theories can be fruitfully combined, by defining \emph{the Lattice of Computable Information} (LoCI), a lattice of preorders rather than equivalence relations. LoCI retains the rich lattice structure of Shannon's theory, filters out elements that do not make computational sense, and refines the remaining information elements to reflect how Scott's ordering captures possible varieties in the way that information is presented.
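To give a feel for why preorders, rather than equivalence relations, are the right notion here, consider a hypothetical observer who learns the parity of a number, but only when the computation producing it terminates. The following is our own simplified sketch under that assumption, not the paper's formal definition of LoCI:

```python
# Hedged toy construction: an information element as a preorder. BOT models
# a non-terminating computation; the observer cannot tell BOT apart from
# any defined outcome "from below", so BOT is related to everything, but
# defined outcomes are related only when they share a parity.

BOT = None

def leq(x, y):
    """Toy preorder: BOT below everything; defined values related iff same parity."""
    if x is BOT:
        return True
    if y is BOT:
        return False
    return x % 2 == y % 2

vals = [BOT, 0, 1, 2, 3]
assert all(leq(x, x) for x in vals)                        # reflexive
assert all(leq(x, z) for x in vals for y in vals for z in vals
           if leq(x, y) and leq(y, z))                     # transitive
# Not symmetric: BOT is below 2 but 2 is not below BOT, so this is a
# genuine preorder, not an equivalence relation.
assert leq(BOT, 2) and not leq(2, BOT)
```

The asymmetry between the undefined outcome and defined ones is precisely the kind of distinction that Scott's ordering contributes and that equivalence relations alone cannot express.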

We show how the new theory facilitates the first general definition of termination-insensitive information flow properties, a weakened form of information flow property commonly targeted by static program analyses.


Displayed time zone: Eastern Time (US & Canada)

15:10 - 16:25: Security (POPL) at Avenue34
Chair(s): Benjamin C. Pierce (University of Pennsylvania)
15:10 (25m, Talk): MSWasm: Soundly Enforcing Memory-Safe Execution of Unsafe Code (POPL)
Alexandra E. Michael (University of California at San Diego; University of Washington), Anitha Gollamudi (University of Massachusetts Lowell), Jay Bosamiya (Carnegie Mellon University), Evan Johnson (University of California at San Diego; Arm), Aidan Denlinger (University of California at San Diego), Craig Disselkoen (University of California at San Diego), Conrad Watt (University of Cambridge), Bryan Parno (Carnegie Mellon University), Marco Patrignani (University of Trento), Marco Vassena (Utrecht University), Deian Stefan (University of California at San Diego)
DOI
15:35 (25m, Talk): Reconciling Shannon and Scott with a Lattice of Computable Information (POPL)
Sebastian Hunt (City University of London), Dave Sands (Chalmers University of Technology), Sandro Stucki (Amazon Prime Video)
Link to publication DOI Pre-print
16:00 (25m, Talk): A Core Calculus for Equational Proofs of Cryptographic Protocols (POPL)
Joshua Gancher (Carnegie Mellon University), Kristina Sojakova (Inria), Xiong Fan (Rutgers University), Elaine Shi (Carnegie Mellon University), Greg Morrisett (Cornell University)
DOI