Sep. 15th, 2013

akuklev: (Candle and Valocordin)
What an excellent talk on ordinals, type theories, and induction-recursion/induction-induction/codata: http://www.cs.swan.ac.uk/~csetzer/slides/lisbon2012/setzerLisbon2012InductionInductionInduction.pdf ! I genuinely regret that it didn't come my way six months or a year ago. :(
akuklev: (Candle and Valocordin)
Various axiomatizations of type theories deal with sum types in different ways. For one, you can define the type _+_ : (Type, Type) -> Type axiomatically, with constructors
  injl: {A: Type, B: Type} -> A -> (A + B)
  injr: {A: Type, B: Type} -> B -> (A + B)
Or you can define enumerations and say that sums are dependent pairs:
  A + B = (head: {'left, 'right}) * (value: {'left => A | 'right => B}).
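The tagged-pair reading of sums can be sketched in Python (an illustrative encoding, not ΠΣ itself; the helper names are mine): a sum value is a pair of a head tag drawn from {'left', 'right'} and a payload whose shape depends on the tag.

```python
# Sum values as (tag, payload) pairs -- the dependent-pair encoding of A + B.
def injl(a):
    return ('left', a)

def injr(b):
    return ('right', b)

def case(s, on_left, on_right):
    """Eliminator for sums: dispatch on the head tag."""
    tag, value = s
    return on_left(value) if tag == 'left' else on_right(value)
```

For example, `case(injl(3), lambda x: x + 1, lambda y: 0)` evaluates to 4.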

One other elegant option is axiomatic introduction of the Maybe type with constructors
  nil
  val a
and its eliminator (default | value => result). It's a very nice solution because you can define lots of useful types à la carte:

  Nat = Maybe Rec[Nat]
  List (A: Type) = Maybe (A, Rec[List A])
  BTree (A: Type) = Maybe (A, Rec[BTree A], Rec[BTree A])
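A hedged Python sketch of the first two of these definitions (the encoding and helper names are mine): zero is nil, a successor is a fold-wrapped predecessor, and a list is either nil or a pair of a head and a folded tail.

```python
# Nat = Maybe Rec[Nat]: nil is zero, a value wraps the predecessor.
zero = None

def succ(n):
    return ('fold', n)          # each recursive occurrence passes through fold

def to_int(n):
    return 0 if n is None else 1 + to_int(n[1])

# List A = Maybe (A, Rec[List A]): nil, or (head, folded tail).
empty = None

def cons(x, xs):
    return (x, ('fold', xs))

def length(xs):
    return 0 if xs is None else 1 + length(xs[1][1])
```

For example, `to_int(succ(succ(zero)))` is 2 and `length(cons(1, cons(2, empty)))` is 2.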

  Fin (n: Nat) = (() | prevN => Maybe (Fin prevN))
  Vec (A: Type) (l: Nat) = (() | prevL => (A, (Vec A prevL)))
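These two are types computed by recursion on a Nat index. A Python sketch can at least enumerate the inhabitants of Fin n and check membership in Vec A l (helper names are mine; note that, taking the definition above literally, Fin 0 is the unit type):

```python
def fin(n):
    """Enumerate the inhabitants of Fin n, following the recursion on n."""
    if n == 0:
        return [()]                                        # base case: unit
    return [None] + [('val', x) for x in fin(n - 1)]       # Maybe (Fin (n-1))

def is_vec(x, l):
    """Check that x inhabits Vec A l: unit at length 0, nested pairs otherwise."""
    if l == 0:
        return x == ()
    return isinstance(x, tuple) and len(x) == 2 and is_vec(x[1], l - 1)
```

For example, `is_vec((1, (2, ())), 2)` holds, while `is_vec((), 1)` does not.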

Note how short and natural primitive recursion looks in this setting:
  add (a: Nat) (b: Nat) = (a | prevB => add (val a) prevB)
  mul (a: Nat) (b: Nat) = (nil | prevB => add a (mul a prevB))
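A runnable rendering of these two recursions in Python, with the eliminator (default | value => result) spelled out as a helper function (the encoding and names are mine):

```python
nil = None

def val(n):
    return ('val', n)

def elim(default, f, m):
    """The Maybe eliminator (default | value => result)."""
    return default if m is None else f(m[1])

def add(a, b):
    # (a | prevB => add (val a) prevB): shift one successor from b onto a.
    return elim(a, lambda prev_b: add(val(a), prev_b), b)

def mul(a, b):
    # (nil | prevB => add a (mul a prevB))
    return elim(nil, lambda prev_b: add(a, mul(a, prev_b)), b)

def to_int(n):
    return 0 if n is None else 1 + to_int(n[1])

def from_int(k):
    return nil if k == 0 else val(from_int(k - 1))
```

For example, `to_int(add(from_int(2), from_int(3)))` gives 5 and `to_int(mul(from_int(2), from_int(3)))` gives 6.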
akuklev: (Candle and Valocordin)
The nice thing about the ΠΣ-calculus is that everything is explicit there. For example, recursive datatypes are marked with Rec in their definitions, and values of these datatypes must contain a "fold" constructor every time the recursiveness of their type is made use of.

This allows us to define an important Nat-valued property of normalizable values:
The structural complexity |t| of a term t is the number of "fold" constructor usages in its normal form. Note that the memory footprint of a term t is always O(|t|). For a type T, let us denote by T[n] the set of terms of type T with structural complexity at most n.
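With terms represented as nested tuples in which each recursive occurrence is wrapped in an explicit 'fold' node (a toy representation, not actual ΠΣ syntax), structural complexity is just a count:

```python
def complexity(t):
    """Number of 'fold' constructors in a (normal-form) term."""
    if not isinstance(t, tuple):
        return 0
    own = 1 if t and t[0] == 'fold' else 0
    return own + sum(complexity(child) for child in t[1:])

# The numeral 3 as Maybe Rec[Nat]: three folds around nil.
three = ('fold', ('fold', ('fold', None)))
```

Here `complexity(three)` is 3, matching the intuition that the numeral n costs n folds.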

For a function f: A -> B we can always determine its structural boost function
  ||f|| = (n: Nat) => max {|f(t)| | t : A[n]},
i.e. how much it can at most increase the structural complexity of its argument. Since every total function Nat -> Nat is majorized by some function from the canonical Hardy hierarchy H_o(n), we can find the minimal ordinal o such that ||f||(n) ≤ H_o(n) (and at least at one point they coincide); we'll call it the complexity ordinal |f| of the function f.
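Specialized to f: Nat -> Nat, where the numeral k has exactly k folds (so A[n] is just {0, ..., n}), the boost function can be computed by brute force (an illustrative helper, not part of the calculus):

```python
def boost(f, n):
    """||f||(n): max structural complexity of f(t) over numerals t with |t| <= n."""
    return max(f(k) for k in range(n + 1))
```

For example, `boost(lambda k: k + 2, 5)` is 7 and `boost(lambda k: 2 * k, 5)` is 10.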

(Canonical) Examples:
|id| = ω
|n: Nat => n + 1| = ω + 1
|n: Nat => n + 2| = ω + 2
|n: Nat => n + n| = ω·2
|n: Nat => n + n + n| = ω·3
|n: Nat => (n choose p)| = ω·p
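For ordinals below ω², written as ω·a + b, the canonical Hardy hierarchy (H_0(n) = n, H_{o+1}(n) = H_o(n+1), and H_λ(n) = H_{λ[n]}(n) with the fundamental sequence (ω·a)[n] = ω·(a−1) + n) can be sketched as:

```python
def hardy(a, b, n):
    """H_{w*a + b}(n), for ordinals w*a + b below omega^2."""
    if a == 0 and b == 0:
        return n                       # H_0(n) = n
    if b > 0:
        return hardy(a, b - 1, n + 1)  # successor: H_{o+1}(n) = H_o(n+1)
    return hardy(a - 1, n, n)          # limit: (w*a)[n] = w*(a-1) + n
```

For example, `hardy(1, 0, n)` computes H_ω(n) = 2n, and `hardy(2, 0, n)` computes H_{ω·2}(n) = 4n.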

It's not hard to see that for a completely nonrecursive function (one that is not itself recursive and makes no use of other recursive functions), the boost function is O(1): structural complexity is decremented whenever you use unfold, incremented whenever you use fold, summed when you construct a pair, maxed when you perform a choice, and remains unaffected by all other constructions of ΠΣ. The complexity ordinals of such functions are therefore of the form ω·n + c. If a function uses every variable at most once, it is a function of constant boost, i.e. its ||f|| is bounded by x => x + c for some constant c.
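These counting rules can be phrased as a tiny static calculator over a toy term AST (entirely illustrative; it tracks the net fold increment per rule, which bounds the boost when every variable is used at most once):

```python
def boost_bound(t):
    """Net complexity increment of a nonrecursive one-argument term."""
    tag = t[0]
    if tag == 'arg':                  # the argument itself: increment 0
        return 0
    if tag == 'fold':                 # fold increments complexity
        return boost_bound(t[1]) + 1
    if tag == 'unfold':               # unfold decrements it
        return boost_bound(t[1]) - 1
    if tag == 'pair':                 # pairing sums the components
        return boost_bound(t[1]) + boost_bound(t[2])
    if tag == 'choice':               # choice takes the maximum branch
        return max(boost_bound(t[1]), boost_bound(t[2]))
    raise ValueError(tag)
```

For example, wrapping the argument in two folds gives a bound of 2, while a choice between an unfold and a fold gives 1.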

A recursive function f: A -> B can always be defined as
  f = finalize . (fix step) . init
where fix is a fixed point combinator and the functions
  init: A -> I
  step: I -> I
  finalize : I -> B
are completely nonrecursive. The function f is total iff the preorder P induced on the type I by applications of the step function is well-founded. We'll show that
  |f| = P⨳|step| ⧺ |init| ⧺ |finalize|
(Conway addition and multiplication).
We also conjecture that this relation still holds when one allows step with internal recursion.
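A concrete instance of this decomposition in Python, with factorial as the (my own, illustrative) example; fix iterates step until a fixed point is reached, which terminates exactly when the induced preorder on I is well-founded:

```python
def fix(step):
    """Iterate step until it reaches a fixed point."""
    def run(x):
        while True:
            y = step(x)
            if y == x:
                return x
            x = y
    return run

# Factorial via the f = finalize . (fix step) . init decomposition,
# with intermediate type I = (accumulator, counter).
def init(n):
    return (1, n)

def step(state):
    acc, k = state
    return (acc * k, k - 1) if k > 0 else state   # fixed point at k == 0

def finalize(state):
    return state[0]

def fact(n):
    return finalize(fix(step)(init(n)))
```

Here init, step and finalize are each completely nonrecursive; all the recursion lives in fix.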