by James Chapman on March 7, 2008.
Tagged as: Lunches.
Last Friday I talked about OPEs, which Peter Morris and I have been thinking about recently. OPEs give you a nice way of implementing weakening for a syntax with de Bruijn variables and binding. Here’s one:
data Ty : Set where
  ι   : Ty
  _⇒_ : Ty -> Ty -> Ty

data Con : Set where
  ε   : Con
  _,_ : Con -> Ty -> Con

data Var : Con -> Ty -> Set where
  vZ : forall {Γ σ} -> Var (Γ , σ) σ
  vS : forall {Γ σ} (τ : Ty) -> Var Γ σ -> Var (Γ , τ) σ

data Tm : Con -> Ty -> Set where
  var : forall {Γ σ} -> Var Γ σ -> Tm Γ σ
  lam : forall {Γ σ τ} -> Tm (Γ , σ) τ -> Tm Γ (σ ⇒ τ)
  _$_ : forall {Γ σ τ} -> Tm Γ (σ ⇒ τ) -> Tm Γ σ -> Tm Γ τ
The simply-typed lambda calculus with one base type. If you throw away the types, your contexts become natural numbers, your variables become Fin, and your terms become the well-scoped terms.
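For concreteness, here is the identity function at the base type as a closed term (the name idTm is mine, not from the talk):

idTm : Tm ε (ι ⇒ ι)
idTm = lam (var vZ)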
The operation we want is this one:
wk : forall {Γ σ} (τ : Ty) -> Tm Γ σ -> Tm (Γ , τ) σ
But we cannot implement it by induction on the terms:
wk τ (var x)  = var (vS τ x)
wk τ (t $ u)  = wk τ t $ wk τ u
wk τ (lam t)  = lam ? -- blast!
The type of ? is Tm ((Γ , τ) , σ) ρ, where σ and ρ are the domain and range of the lambda.
Clearly we need a more liberal kind of weakening where we can introduce new variables earlier in the context. We could introduce thinning like this:
th : forall {Γ σ} Δ τ -> Tm (Γ + Δ) σ -> Tm ((Γ , τ) + Δ) σ
Here _+_ is concatenation of contexts. We can then define weakening as:
wk = th ε
This works ok until you have to prove properties about thinnings. For example:
wk ρ (th Γ' τ v) == th (Γ' , ρ) τ (wk ρ v)
To prove this you need a more general lemma in which both operations are thinnings, and that lemma is difficult even to state because associativity of _+_ does not hold definitionally in type theory.
Ok, that's the problem. What's the solution? Yes: order preserving embeddings. The idea is to give an inductive description of exactly the types of functions from one context to another that we are interested in. We can then prove the properties we want in a category where the objects are contexts and the morphisms are OPEs, and lift our results to the category of terms.
What are OPEs then?
data OPE : Con -> Con -> Set where
  done : OPE ε ε
  keep : forall {Γ Δ σ} -> OPE Γ Δ -> OPE (Γ , σ) (Δ , σ)
  skip : forall {Γ Δ σ} -> OPE Γ Δ -> OPE (Γ , σ) Δ
This type classifies exactly those maps from one context to another where all you can do is say which entries of the target context to avoid (the target is the first index, by the way).
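For example, here is an OPE (the name drop1 is my invention) embedding a one-entry context into a two-entry one by avoiding the outermost entry:

drop1 : OPE ((ε , ι) , ι) (ε , ι)
drop1 = skip (keep done)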
We define identity and composition like so:
id : forall {Γ} -> OPE Γ Γ
id {ε}     = done
id {Γ , σ} = keep id
_•_ : forall {B Γ Δ} -> OPE B Γ -> OPE Γ Δ -> OPE B Δ
done   • done   = done
skip f • g      = skip (f • g)
keep f • keep g = keep (f • g)
keep f • skip g = skip (f • g)
and we can easily show that they satisfy the left and right identity laws and associativity of composition, if we feel so inclined.
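As a sketch of what the identity laws look like (not from the original post; I assume the standard library's propositional equality, renamed to == to match the notation of this post, and the names idl and idr are mine):

open import Relation.Binary.PropositionalEquality renaming (_≡_ to _==_)

idl : forall {Γ Δ} (f : OPE Γ Δ) -> id • f == f
idl done     = refl
idl (keep f) = cong keep (idl f)
idl (skip f) = cong skip (idl f)

idr : forall {Γ Δ} (f : OPE Γ Δ) -> f • id == f
idr done     = refl
idr (keep f) = cong keep (idr f)
idr (skip f) = cong skip (idr f)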
Given identity it is easy to define the weakening OPE:
weak : forall {Γ} τ -> OPE (Γ , τ) Γ
weak τ = skip id
and I can state my property easily:
weak ρ • f == (keep f) • (weak ρ)
The proof is now one line.
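Concretely, both sides compute to a skip, so a proof sketch (again with names of my own, given idl and idr above) is a single appeal to the identity laws:

lemma : forall {Γ Δ} (ρ : Ty) (f : OPE Γ Δ) -> weak ρ • f == keep f • weak ρ
lemma ρ f = cong skip (trans (idl f) (sym (idr f)))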
We define functors:
vmap : forall {Γ Δ σ} -> OPE Γ Δ -> Var Δ σ -> Var Γ σ
tmap : forall {Γ Δ σ} -> OPE Γ Δ -> Tm Δ σ -> Tm Γ σ
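The post leaves the definitions out; a sketch by structural recursion might look as follows. Note how tmap uses keep to go under the binder, which is exactly where the naive wk got stuck.

vmap done     ()
vmap (keep f) vZ       = vZ
vmap (keep f) (vS τ x) = vS τ (vmap f x)
vmap (skip f) x        = vS _ (vmap f x)

tmap f (var x) = var (vmap f x)
tmap f (lam t) = lam (tmap (keep f) t)
tmap f (t $ u) = tmap f t $ tmap f u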
Now we can define weakening:
wk τ = tmap (weak τ)
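As a quick sanity check (using idTm from above), weakening a closed lambda leaves its bound variable untouched:

check : wk ι idTm == lam (var vZ)
check = refl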
And that's it. Ok, so we did have to do some work, but the structure is just the categorical structure you'd expect. In fact, for a dependently-typed version of this (using induction-recursion) you are forced to prove the laws about identity, composition and map as you define the operations. The structure is forced upon you.
A more liberal version of the above, called Order Preserving Functions, is mentioned in Conor's excellent AFP notes (well worth a read for any budding (wilting?) Epigram or Agda programmer). Peter and I plan to keep digging on this subject.