Abramsky [2008], “Information, Processes and Games”
presenter: Sasha
1. Prelude: Some Basic Puzzles
Why do we compute?
- to gain information - explicitness (normal form, relative to the user) & deletion (irrelevant data)
- happens in open systems - trades energy with environment (thermodynamics)
What does the internet compute?
- Traditional model is a function
- Nowadays it’s processes & behaviours
2. Introduction: Matter and Method
Existing theories of information are static: they describe invariants. What is information dynamics?
- how is information increased?
- qualitative + quantitative
- logic & geometry
- power of copying
3. Some Background Theories
|              | Static                       | Dynamic         |
|--------------|------------------------------|-----------------|
| Qualitative  | Domain Theory, Dynamic Logic | Process Algebra |
| Quantitative | Shannon Information Theory   | ?               |
Domain theory
- models infinite computations or computational objects as limits (LUBs) of processes of information increase, where at each stage information is finite.
- qualitative - only relates states, but doesn’t quantify information
- objective/subjective states
- Stone duality: points of a topological space <-> properties
- Abramsky’s thesis was on connecting denotational semantics and program logics via SD
- approximate infinite ideal objects by continuous processes (similar to intuitionism?)
- time is implicit, how to add causality?
- concrete domains -> event structures
- monotonicity: only positive information
- continuity: a process which uses only finite resources at each finite stage
- Kleene’s fixpoint theorem
- function domains (not concrete)
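The Kleene fixpoint bullet can be made concrete: the least fixed point of a monotone, continuous function is the LUB of the finite approximations f^n(⊥). A minimal Python sketch, using reachability in a made-up graph as the monotone function on the (finite) powerset lattice, where each iterate is a finite stage of information increase:

```python
# Kleene fixpoint: lfp(f) = lub of f^n(bottom) for a monotone, continuous f.
# Illustrated on the finite powerset lattice (ordered by inclusion):
# computing the set of nodes reachable from node 0 in a toy graph.

GRAPH = {0: {1}, 1: {2}, 2: {2}, 3: {0}}  # example graph (an assumption)

def step(reached: frozenset) -> frozenset:
    """Monotone function: the start node plus all successors of reached nodes."""
    return frozenset({0}) | frozenset(s for n in reached for s in GRAPH[n])

def kleene_lfp(f, bottom=frozenset()):
    """Iterate f from the bottom element until the least fixpoint is reached."""
    x = bottom
    while True:
        fx = f(x)
        if fx == x:
            return x
        x = fx

print(sorted(kleene_lfp(step)))  # → [0, 1, 2]
```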
Dynamic logic
modal logic + Hoare logic
- in Domain Theory (as in Kripke semantics for intuitionistic logic) information can only increase along a computation (like time in physics - we can’t change what happened)
- Kripke structures allow arbitrary state change - mutability
Mutually recursive formulas + relations (relational algebra)
Limited to input-output behaviour of relations (multi-modal). We can go beyond Hoare logic and imperative programs, but this breaks compositionality.
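The relational, input-output view can be sketched directly: programs denote binary relations on states, formulas denote sets of states, and sequencing is relational composition. The toy state space, relations, and postcondition below are invented for the example:

```python
# Dynamic-logic flavour: [a]phi holds where every a-successor satisfies phi,
# <a>phi where some a-successor does.  States/relations are a toy assumption.

STATES = {0, 1, 2, 3}
Ra = {(0, 1), (0, 2), (1, 3)}     # denotation of atomic program a
Rb = {(2, 3), (3, 3)}             # denotation of atomic program b

def compose(r, s):
    """Relational composition: the semantics of sequencing r; s."""
    return {(x, z) for (x, y) in r for (y2, z) in s if y == y2}

def box(r, phi):
    """[r]phi: states all of whose r-successors lie in phi (partial correctness)."""
    return {x for x in STATES if all(y in phi for (x2, y) in r if x2 == x)}

def dia(r, phi):
    """<r>phi: states with at least one r-successor in phi."""
    return {x for (x, y) in r if y in phi}

phi = {3}                          # postcondition: "terminated in state 3"
print(sorted(dia(compose(Ra, Rb), phi)))  # → [0, 1]
```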
Process algebra
- domain theory ~ order/topology
- concurrency theory ~ ?
different formalisms: LTSs, naming & scope, automata
What is the right notion of behavioural equivalence of processes?
LTS & (weak/strong) bisimulation:
- Kripke structures focus on states, in LTS they aren’t observable, only actions.
- Automata focus on sequences/traces, LTS also consider branching.
Hennessy-Milner logic: two states (of an image-finite LTS) are bisimilar iff they satisfy the same formulas; no standard algebra like for dynamic logic.
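The branching-vs-traces point can be demonstrated on the classic pair a.(b+c) versus a.b + a.c: same traces, different branching. A naive greatest-fixpoint computation of strong bisimilarity over a hand-rolled toy LTS (the transitions are an assumption, not from the paper):

```python
# p performs a and then offers b or c; q resolves the b/c choice at the first step.
# Trace-equivalent, but not bisimilar.

TRANS = {
    ("p", "a"): {"p1"}, ("p1", "b"): {"p2"}, ("p1", "c"): {"p3"},
    ("q", "a"): {"q1", "q2"}, ("q1", "b"): {"q3"}, ("q2", "c"): {"q4"},
}
STATES = {"p", "p1", "p2", "p3", "q", "q1", "q2", "q3", "q4"}
ACTIONS = {"a", "b", "c"}

def succ(s, a):
    return TRANS.get((s, a), set())

def matches(x, y, rel):
    """Every move of x can be matched by a move of y, landing inside rel."""
    return all(any((x2, y2) in rel for y2 in succ(y, a))
               for a in ACTIONS for x2 in succ(x, a))

def bisimilar(s, t):
    """Greatest fixpoint: prune the full relation until it is a bisimulation."""
    rel = {(x, y) for x in STATES for y in STATES}
    changed = True
    while changed:
        changed = False
        for pair in list(rel):
            x, y = pair
            conv = {(b, a) for (a, b) in rel}
            if not (matches(x, y, rel) and matches(y, x, conv)):
                rel.discard(pair)
                changed = True
    return (s, t) in rel

print(bisimilar("p", "q"))  # False: the branching structure differs
```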
Communication: involutive structure for synchronization
- Shannon information theory - a thermodynamic theory: information can only be lost during transmission (it considers the whole system)
- Domain Theory - adds an observer whose information increases during a computation (relatively)
4. Combining Qualitative and Quantitative Theories of Information
A purely quantitative theory is weak in that any two numbers are comparable, so structure such as different directions of information increase can be obscured.
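A small sketch of the "numbers are always comparable" point, assuming a toy system of two independent fair bits: two observers can each hold exactly one bit of Shannon information, yet about different aspects of the system, so neither informational state extends the other even though their entropies are equal.

```python
# Equal quantity, different direction: both posteriors carry 1 bit,
# but they constrain different components.  Distributions are assumptions.

from math import log2

def entropy(dist):
    """Shannon entropy in bits of a distribution {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Full system: two independent fair bits (X, Y), four equiprobable outcomes.
posterior_A = {("0", "0"): 0.5, ("0", "1"): 0.5}   # observer A learned X = 0
posterior_B = {("0", "0"): 0.5, ("1", "0"): 0.5}   # observer B learned Y = 0

print(entropy(posterior_A), entropy(posterior_B))  # both leave 1.0 bit of uncertainty
```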
Domains with measurements
- adds degree of undefinedness (maximal element ~ 0)
- non-monotonic functions
- rate of convergence
- real numbers
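A standard illustration of a domain with measurement (a sketch, not taken from the paper): the interval domain for the reals, where more information means a smaller interval, the measurement is the interval width (0 exactly at maximal, fully-defined elements), and bisection exhibits a fixed rate of convergence.

```python
# Interval domain: partial reals are intervals [lo, hi], x ⊑ y iff y ⊆ x.
# Measurement mu = width = degree of undefinedness.

def mu(iv):
    lo, hi = iv
    return hi - lo

def bisect_sqrt2(iv):
    """One information-increasing step: halve the interval containing sqrt(2)."""
    lo, hi = iv
    mid = (lo + hi) / 2
    return (lo, mid) if mid * mid >= 2 else (mid, hi)

iv = (1.0, 2.0)
for _ in range(20):
    iv = bisect_sqrt2(iv)
print(iv, mu(iv))  # mu halves each step: a quantified rate of convergence
```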
“Quantum domains”: Bayesian/spectral orders
Quantum logic: non-distributive lattices (cannot observe two variables at once)
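The non-distributivity claim can be checked in the smallest interesting model: the lattice of subspaces of R^2, with meet = intersection and join = span. The encoding below is a hand-rolled sketch of just that lattice ({0}, lines through the origin, the whole plane):

```python
# Distributivity fails for subspaces: x ∧ (y ∨ z) ≠ (x ∧ y) ∨ (x ∧ z)
# when x, y, z are three distinct lines through the origin.

ZERO, PLANE = "0", "R^2"

def meet(a, b):
    if a == b: return a
    if a == PLANE: return b
    if b == PLANE: return a
    return ZERO  # two distinct lines intersect only at the origin

def join(a, b):
    if a == b: return a
    if a == ZERO: return b
    if b == ZERO: return a
    return PLANE  # two distinct lines span the whole plane

x, y, z = "x-axis", "y-axis", "diagonal"
lhs = meet(x, join(y, z))           # x ∧ R^2 = x
rhs = join(meet(x, y), meet(x, z))  # 0 ∨ 0 = 0
print(lhs, rhs, lhs == rhs)         # distributivity fails
```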
5. Games, Logical Equilibria and Conservation of Information Flow
Game semantics + GoI
- old view: input -> output (Hoare/dynamic logic)
- distributed view: interaction (robustness/atomicity/security/quality)
System vs environment (+ role reversal) - needs a “logic of interaction”
- static tautology: it’s either raining or not
- dynamic tautology: copycat game (only works in a symmetrical setting)
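A minimal sketch of the copycat idea (the move names and driver are invented for illustration): Player wins the "raining or not" game played across two symmetric copies by echoing each Opponent move from one copy into the other, without ever needing to know the weather.

```python
# Copycat on G -o G: Player's strategy is to repeat each Opponent move
# in the opposite copy of the game.

def copycat(opponent_moves):
    """Map each Opponent move (side, move) to Player's echo in the other copy."""
    other = {"left": "right", "right": "left"}
    return [(other[side], move) for (side, move) in opponent_moves]

# Opponent probes both copies; Player just switches sides:
play = [("left", "raining?"), ("right", "not-raining?")]
print(copycat(play))
```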
Logical principles as conservation laws for information flow (resource-sensitivity, i.e., linear logic); the geometry of the flow is important.
- Games ~ module interfaces / dynamic propositions
- Agents ~ strategies, winning strategies are proofs
- Cut ~ hiding the “glued” intermediate game (intensionality?)
Quantitative aspect: rate of information flow, minimality, noise
What are computational processes, without referring to an operational model?
6. Emergent Logic: The Geometry of Information Flow
particle-style GoI (there’s also wave-style)
(linear) combinatory logic
agents that copy information ~ topologies that untangle/yank wires
reversible computation (fixed-point partial involutions): processes map “tokens” to positions with ticking time
Particle style is based on coproducts (a token can be in only one place at a time)
Two inputs, two outputs (internal + external)
Composition via mutual feedback.
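Composition by mutual feedback can be sketched with agents as partial involutions on port names (the ports and the two agents below are assumptions): a token entering an external port bounces back and forth across the shared interface until it exits on an external port of the composite.

```python
# GoI execution formula in miniature: compose two involutions by feeding
# tokens through the shared (hidden) interface until they exit.

def compose(f, g, shared):
    """Compose involutions f and g by mutual feedback over the shared ports."""
    def h(port):
        agent = f if port in f else g
        while True:
            port = agent[port]
            if port not in shared:
                return port                  # token exits on an external port
            agent = g if agent is f else f   # feedback: cross the interface
    return h

# f swaps external port a with internal port m; g swaps m with external port b.
f = {"a": "m", "m": "a"}
g = {"m": "b", "b": "m"}
wire = compose(f, g, shared={"m"})
print(wire("a"), wire("b"))  # the composite behaves as a single wire a <-> b
```

Hiding the intermediate port m is exactly the "cut = hiding the glued game" idea from the game-semantics bullets above.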
7. Concluding Remarks
Descriptive vs intrinsic logic
Discussion
- No mention of abstract interpretation / Cousot’s theory
- Does FCSL count as a dynamic theory / intrinsic logic?
2021.06.22 Information, Processes and Games