TYPES CONFERENCE - TORINO 2008
March 26 (Wednesday morning) to March 29 (Saturday morning)
Colocated with ITRS08, March 25, 2008


VILLA GUALINO
Viale Settimio Severo 65 - 10133 - Torino - Tel: +39 011 6603555

Menu

1. Introduction To Types 2008
2. Invited Speakers
3. List Of Participants
4. Schedule Of The Congress
5. How To Register
6. Hotel Booking
7. How To Submit A Talk
8. Social Events
9. Call For Post Proceedings Papers
10. How To Get To Villa Gualino
11. Map Of The Town Of Turin
12. Picture Of Villa Gualino
13. Organizing Committee

4. Schedule Of The Congress




Wednesday: 1 invited talk, 12 contributed talks.
  • 09.00-10.00 Jan Willem Klop: Infinity and Productivity
  • 10.00-10.30 Ferruccio Guidi. Towards the Unification of Terms, Types and Contexts


  • 10.30-11.00 COFFEE BREAK


  • 11.00-11.30 Thorsten Altenkirch and Nicolas Oury. A core language for Dependently Typed Programming
  • 11.30-12.00 Pierre Corbineau. Towards the end of the Streicher K axiom (as an axiom)
  • 12.00-12.30 Stefan Berghofer and Christian Urban. Nominal Inversion Principles
  • 12.30-13.00 Cezary Kaliszyk and Russell S. S. O'Connor. Computing with classical real numbers.


  • 13.00-14.30 LUNCH BREAK


  • 14.30-15.00 Andrew Polonsky. Translation from First Order Logic to Coherent Logic
  • 15.00-15.30 Zhaohui Luo. Modelling object-oriented features in intensional type theory
  • 15.30-16.00 Michele Pagani. Between interaction and semantics: visible acyclic nets
  • 16.00-16.30 Alexis Saurin. Interactive Proof Search: Toward Ludics Programming.


  • 16.30-17.00 COFFEE BREAK


  • 17.00-17.30 Mauro Piccolo and Luca Paolini. Processing Programs: the linear case
  • 17.30-18.00 Marco Gaboardi. Towards a Soft logic for PSPACE
  • 18.00-18.30 Michele Basaldella and Claudia Faggian. Ludics with repetitions: Hyland-Ong style exponentials


Thursday: 1 invited talk, 12 contributed talks.
  • 09.00-10.00 Andrea Asperti: About the formalization of some results by Chebyshev in number theory.
  • 10.00-10.30 Carsten Schurmann and Jeffrey Sarnat. Structural Logical Relations


  • 10.30-11.00 COFFEE BREAK

  • 11.00-11.30 Aarne Ranta, Bjorn Bringert and Krasimir Angelov. PGF: A Run-Time Format for Type-Theoretical Grammars
  • 11.30-12.00 Ralph Matthes. A representation of untyped lambda-calculus with explicit flattening through a nested datatype
  • 12.00-12.30 Herman Geuvers. Formalizing Hybrid Systems in Coq
  • 12.30-13.00 Francois Garillot. A small reflection on group automorphisms


  • 13.00-14.30 LUNCH BREAK

  • 14.30-15.00 Aleksy Schubert. A characterisation of solutions for non-structural subtype constraints
  • 15.00-15.30 Dulma Rodriguez. Algorithmic Subtyping for Higher Order Bounded Quantification Revisited
  • 15.30-16.00 Rene David and Delia Kesner. Strong normalisation of MELL proof-nets
  • 16.00-16.30 Andreas Abel. Normalization by Evaluation for Martin-Lof Type Theory


  • 16.30-17.00 COFFEE BREAK

  • 17.00-17.30 Christophe Raffalli. PML and strong normalisation
  • 17.30-18.00 Jose Espirito Santo, Ralph Matthes and Luis Pinto. Cgps-translations for intuitionistic higher-order sequent calculi
  • 18.00-18.30 Stephane Lengrand. Functional logic programming: a call for interest
  • 18.30-19.00 BUSINESS MEETING


Friday: 1 invited talk, 12 contributed talks.
  • 09.00-10.00 Gilles Dowek: TBA
  • 10.00-10.30 Alexandre Miquel. Classical program extraction in the calculus of constructions


  • 10.30-11.00 COFFEE BREAK

  • 11.00-11.30 Luca Chiarabini. Program Development by Proof Transformation.
  • 11.30-12.00 Silvia Ghilezan, Hugo Herbelin and Alexis Saurin. A Uniform Approach to Call-by-Name and Call-by-Value Delimited Control
  • 12.00-12.30 Maria Emilia Maietti and Giovanni Sambin. A minimalist two-level foundation for constructive mathematics
  • 12.30-13.00 Aldo Ursini. Algebraic Types

  • 13.00-14.30 LUNCH BREAK

  • 14.30-15.00 Anton Setzer. The strength of Martin-Lof Type Theory with the logical framework
  • 15.00-15.30 Jean-Francois Monin. On defining f91 in Coq
  • 15.30-16.00 Yves Bertot and Ekaterina Komendantskaya. Inductive and Coinductive Components of Corecursive Functions in Coq
  • 16.00-16.30 Marcin Benke. Multiple Recursion in Type Theory and Generic Programming


  • 16.30-17.00 COFFEE BREAK

  • 17.00-17.30 Johannes Kanig, Sylvain Conchon and Evelyne Contejean. Lightweight Equality Certificates
  • 17.30-18.00 Evelyne Contejean. Coccinelle, a Coq library for rewriting
  • 18.00-18.30 Zhaohui Luo and Serguei Soloviev. Coercive subtyping and multilingual semantics.
  • 19.30-20.00 Bus from Villa Gualino to Restaurant "CIACOLON".
  • 20.00-23.30 Social dinner

Saturday: 5 contributed talks.
  • 09.00-09.30 Guillaume Burel. Encoding Pure Type Systems in Superdeduction
  • 09.30-10.00 Paul Brauner and Lisa Allali. A semantic normalization proof for a system with recursors
  • 10.00-10.30 Yves Bertot and Vladimir Komendantsky. Fixed point semantics and partial recursion in Coq


  • 10.30-11.00 COFFEE BREAK

  • 11.00-11.30 Muhammad Humayoun. Software Specifications and Mathematical Proofs in Natural Languages
  • 11.30-12.00 Davide Ancona, Giovanni Lagorio and Elena Zucca. Type inference for Java-like programs by coinductive logic programming



  • LIST OF PRESENTATIONS (sorted by topic)

    Typed Lambda Calculus

    • Ferruccio Guidi. Towards the Unification of Terms, Types and Contexts (typed lambda calculus: abbreviations)
      • KEYWORDS: abbreviations, contexts as terms, types as terms, uniformly dependent types
      • ABSTRACT: lambda-delta is a typed lambda calculus intended as the first step towards the verification of the following conjecture: the adoption of a single set of constructions for terms, types, and contexts (i.e. the ``contexts as types as terms'' paradigm) is compatible with the presence of a desirable meta-theory. lambda-delta realizes the ``types as terms'' paradigm in full and features a uniform typing policy for all binders. Therefore this calculus is lambda-typed, but differs from the Automath-related calculi in that it provides an abbreviation construction at the level of terms. In the ``propositions as types'' perspective it can encode the implicative fragment of predicative logic without quantifiers, because uniformly dependent types are allowed.
    • Thorsten Altenkirch and Nicolas Oury. A core language for Dependently Typed Programming (typed lambda calculus: dependent types)
      • KEYWORDS: Functional Programming, core language, recursion
      • ABSTRACT: We discuss the design of a core language which should play a role for dependently typed programming similar to the one that extensions of System F play for conventional functional languages like Haskell. The core language incorporates mutual dependent recursive definitions, type:type, pi and sigma types, and finite sets of labels. We show that standard constructions from Type Theory, including universes, can be easily encoded in our language. We address some important issues: having an equality checker which unfolds recursion only when needed, avoiding looping when typechecking "sensible" programs; and the simplification of type checking for eliminators like case by using equational constraints, allowing the flexible use of case expressions within dependently typed programming. We discuss some interesting extensions: lazy datatypes, integrating equality types with constraints, and reflection.
    • Pierre Corbineau. Towards the end of the Streicher K axiom (as an axiom) (typed lambda calculus: Streicher K axiom)
      • KEYWORDS: Type Theory, Streicher K axiom, Equality, Inductive types
      • ABSTRACT: It is a well-known result that the Streicher K axiom, which states that all proofs of x=x are canonical, is independent of the rules of the Calculus of Inductive Constructions. This result has several annoying consequences for CIC: the heterogeneous equality (a.k.a. John Major equality) and the homogeneous equality are not equivalent, and the injectivity of the second projection does not hold for dependent pairs. In this talk, we present a simple extension of the Calculus of Inductive Constructions that makes the K axiom admissible by allowing more pattern-matching terms to be correctly typed. We show that, conversely, the new theory can be simulated by CIC extended with a constant representing the K axiom equipped with a corresponding reduction rule, and hence is consistent. We finally give several examples of the possibilities offered by the new type system and discuss the adoption of the heterogeneous equality as the default equality for CIC.
    • Stefan Berghofer and Christian Urban. Nominal Inversion Principles (typed lambda calculus: alpha rule)
      • KEYWORDS: Nominal syntax, Inductive definitions
      • ABSTRACT: When reasoning about inductively defined predicates, such as typing judgements or evaluation relations, proofs are often done by case analysis on the last rule in a derivation. This case analysis involves solving equational constraints on the arguments of the inductively defined predicate at hand. This is well understood in the case where term constructors are injective. In the case of alpha-equivalence classes, which are not injective, this case analysis gives rise to annoying variable renamings. In this paper, we show that more convenient inversion principles can be derived where one does not have to deal with explicit variable renamings. An interesting observation is that this fact relies on the inductive rules satisfying the variable convention compatibility condition, which was introduced to justify the admissibility of Barendregt's variable convention in rule inductions.
    • Zhaohui Luo. Modelling object-oriented features in intensional type theory (typed lambda calculi: object-oriented features)
      • KEYWORDS: intensional type theory, object-oriented programming
      • ABSTRACT: Functional interpretations of object-oriented programming have been studied in the literature by means of type systems with powerful but questionable features, such as bounded quantification and manifest types, which have unwelcome consequences including serious meta-theoretic difficulties. In this talk, the modelling of OO features in class-based languages is studied in intensional type theory. In particular, and differently from other existing approaches, classes are interpreted as types and their instances ("objects") as objects of the corresponding types. This is made possible in an intensional type theory by considering the parameterised family of inductive unit types and coercive subtyping. Based on such an interpretation, it is then possible to consider the verification of programs with object-oriented features in a proof assistant that implements an intensional type theory.
    • Aleksy Schubert. A characterisation of solutions for non-structural subtype constraints
      • KEYWORDS: subtypes, constraint solving, entailment
      • ABSTRACT: The subtype entailment problem is the problem of deciding whether each solution of a given set of subtype inequality constraints is also a solution of a fixed inequality. This is a long-standing open problem which has been tackled using various techniques. In this work, we present a direct characterisation of all the solutions of non-structural subtype constraint inequalities. We hope that this characterisation can lead to a better understanding of the subtype entailment problem.
    Linear Logic

    • Rene David and Delia Kesner. Strong normalisation of MELL proof-nets (linear logic)
      • KEYWORDS: proof-nets, strong normalisation
      • ABSTRACT: We prove strong normalisation for proof-nets of the multiplicative exponential fragment of linear logic (MELL). In contrast to known proofs of the same result, which are based on the combination of weak normalisation and confluence properties, our proof is inspired by the simple arithmetical proof of strong normalisation for the lambda-calculus, where a natural notion of commutation is used.
    • Michele Pagani. Between interaction and semantics: visible acyclic nets (linear logic)
      • KEYWORDS: Linear Logic, Denotational semantics, Weak normalization, Finiteness spaces, Differential nets
      • ABSTRACT: I present a strict correspondence between cut-elimination, acyclicity and finiteness spaces in the framework of differential nets with promotion (DiffNet for short). DiffNet has been defined by Ehrhard and Regnier as an extension of multiplicative exponential linear logic, introducing the semantical constructions defined by Ehrhard in his finiteness spaces. Consider the orthogonal P° of a net P with conclusion C as the set of all nets R with a conclusion notC such that the cut between C and notC is weakly normalizable. Let P be a cut-free net with conclusion C; my results show that the following conditions are equivalent: (1) P is interpreted as a finitary relation in the finiteness space associated with C; (2) P is visible acyclic (a geometric condition more general than the usual Danos-Regnier correctness criterion); (3) P° contains all visible acyclic nets with conclusion notC. These equivalences provide a generalization of the ones in the main theorems of the theory of linear logic proof-nets, namely: Retoré's theorem (1 --> 2), the semantic soundness theorem (2 --> 1), the weak normalization theorem (2 --> 3), and Bechet's theorem (3 --> 2). Above all, it discloses a deep unity in DiffNet among normalization, visible acyclicity and finiteness spaces, which was present (even if never actually remarked) only in the multiplicative fragment of linear logic.
    • Alexis Saurin. Interactive Proof Search: Toward Ludics Programming. (linear logic)
      • KEYWORDS: Ludics, (Interactive) Proof Search, Logic Programming, Interactive Types
      • ABSTRACT: Proof theory and computation are research areas with very strong relationships: new concepts in logic and proof theory often apply to the theory of programming languages. The use of proofs to model computation led to the modelling of the two main programming paradigms of functional programming and logic programming. While functional programming is based on proof normalization, logic programming is based on proof search. This approach has been very successful in the sense that it allowed many programming primitives to be captured logically. Nevertheless, important fragments of real logic programming languages are still hardly understood from the logical point of view, and it has proved very difficult to give a logical semantics to control primitives. Girard introduced Ludics as a new theory to study interaction. In Ludics, everything is built on interaction or in an interactive way. In this talk, we shall explain how and why Ludics could be considered as a foundation for interactive proof search. We present the SLAM, an abstract machine (extending Faggian's Loci Abstract Machine for normalization of designs) allowing designs to be constructed interactively. We exemplify interactive proof search on several levels of examples (IPS in MALL, runs of the SLAM on simple designs, and toy programming examples). We then discuss the flexibility we obtain and how it can be used in order to capture control primitives from logic programming.
    • Mauro Piccolo and Luca Paolini. Processing Programs: the linear case (linear logic)
      • KEYWORDS: process calculi, linear programming languages, pcf, linear functions
      • ABSTRACT: We study how to express the computational processes generated by programs of SlPCF, a very simple paradigmatic programming language conceived in order to program functions that are linear in a semantic sense. For this purpose we adapt the Solos calculus introduced by Laneve, Parrow and Viktor. We show a fully abstract embedding of SlPCF in Solos, using a typed version of barbed congruence.
    • Marco Gaboardi. Towards a Soft logic for PSPACE (linear logic)
      • KEYWORDS: linear logic, Implicit Computational Complexity, type assignment system
      • ABSTRACT: In a previous work we introduced a type assignment system, named STA_B, correct and complete for PSPACE. STA_B is the first type system characterizing this class using the light logics approach. The expressiveness of STA_B depends on the use, at both the type and language level, of Boolean constants managed additively. Soundness is obtained by giving a big-step abstract machine that performs program evaluation in polynomial space. Starting from the ideas underlying STA_B, here we exploit the proofs-as-programs correspondence to design a logic characterizing PSPACE. The logic system obtained by forgetting terms in STA_B is unsatisfactory: Boolean constants in Soft Linear Logic are redundant, and moreover the STA_B evaluation mechanism does not translate directly into a standard cut-elimination procedure. Starting from these considerations, we propose a logic system extending Soft Linear Logic with an additive cut rule. This system is expressive enough to characterize all the PSPACE functions. The correctness of the cut-elimination process is ensured by the use of closed reduction.
    • Michele Basaldella and Claudia Faggian. Ludics with repetitions: Hyland-Ong style exponentials
      • KEYWORDS: Ludics, Linear Logic, Game Semantics, Interactive Types
      • ABSTRACT: Ludics is a research program introduced by J.-Y. Girard whose aim is to provide an interactive foundation of logic. It can be seen as a form of Game Semantics whose approach is completely based on interaction: strategies are untyped (given on a universal arena) and they can interact with each other (by composition). An interactive type is a set of strategies which react the same way to a set of tests (which are strategies themselves). The semantical types which are interpretations of logical formulas enjoy a closure property, called internal completeness, which is essential to prove full completeness. In this talk, we extend the original setting of Ludics by introducing the exponentials, in order to interpret proofs of polarized MELL (multiplicative exponential linear logic). To this purpose, we consider Hyland-Ong strategies, which naturally admit repetition of moves. Moreover, we show that we need to extend the universe of strategies by introducing non-uniform strategies: non-uniformity is essential to build exponential types interactively. We show that in our setting it is possible to define interactive types, prove internal completeness, and from this full completeness.
    Typed Lambda Calculi: Program Extraction

    • Alexandre Miquel. Classical program extraction in the calculus of constructions (classical logic)
      • KEYWORDS: program extraction, classical logic, calculus of constructions, lambda-calculus, continuations
      • ABSTRACT: In this talk, I will show that Krivine's techniques of classical realizability can be extended to the calculus of constructions with universes (CCw), which makes it possible to define an extraction mechanism from classical proofs (with proof-irrelevance and the countable axiom of choice) formalized in Coq (in sort Prop). For that, I will recall the basics of classical realizability, and I will show how to extend Krivine's realizability model (initially defined for second-order arithmetic) up to CCw, using a structure of Pi-sets which is reminiscent of Hyland and Moggi's omega-sets, Streicher's D-sets and Altenkirch's lambda-sets. After having presented the basic extraction scheme, I will discuss possible ways to extend it to inductively defined data-types using optimized representations of data structures in the target language.
    • Luca Chiarabini. Program Development by Proof Transformation. (typed lambda calculus: program extraction)
      • KEYWORDS: Pruning, Program Extraction
      • ABSTRACT: To "prune" a constructive proof means to drop some computational part of the proof, depending, for example, on additional knowledge about the input data. Christopher Alan Goad (1980) showed that a program extracted from a pruned proof can perform better than a program extracted from the same proof without pruning. We will show an implementation of the "pruning" method in the MINLOG proof assistant, with an application to the Bin Packing problem. At the end, some ideas on how to extend "pruning" will be presented.
    • Silvia Ghilezan, Hugo Herbelin and Alexis Saurin. A Uniform Approach to Call-by-Name and Call-by-Value Delimited Control
      • KEYWORDS: Classical logic, Observational completeness, Delimited control
      • ABSTRACT: We present a comparative study of the call-by-name and call-by-value variants of an extension of the lambda-mu calculus with delimited control, introduced by Ariola, Herbelin and Sabry, covering in particular the questions of simple typing, operational semantics, and continuation-passing-style semantics. Finally, we discuss the relevance of this calculus as a uniform framework for representing four different calculi of delimited continuations: two CBV calculi (Danvy and Filinski's shift/reset; Sabry's shift/lazy reset) and two CBN calculi (a CBN variant of Danvy and Filinski's shift/reset; de Groote and Saurin's Lambda-mu calculus -- unexpectedly!).
    Typed Lambda Calculi: induction, coinduction and recursion

    • Jean-François Monin. On defining f91 in Coq (typed lambda calculus: prop-bounded recursive functions)
      • KEYWORDS: Coq, general recursion
      • ABSTRACT: We propose a way to define Prop-bounded recursive functions in the Calculus of Inductive Constructions, using only inductive types (i.e. without auxiliary co-inductive types or Dybjer induction-recursion). The technique is illustrated on McCarthy's f91 function.
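      For reference, McCarthy's f91 is the classic example of nested (non-structural) recursion, which is exactly why defining it in type theory requires special techniques. A minimal executable sketch, in Python rather than Coq, purely for illustration:

      ```python
      def f91(n):
          # McCarthy's 91 function: the recursive call is nested,
          # so termination is not evident from the structure of the
          # argument -- a structural-recursion checker rejects it.
          if n > 100:
              return n - 10
          return f91(f91(n + 11))
      ```

      The function returns 91 for every n <= 100 and n - 10 otherwise, but proving this (and hence termination) needs a non-structural argument.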
    • Yves Bertot and Ekaterina Komendantskaya. Inductive and Coinductive Components of Corecursive Functions in Coq
      • KEYWORDS: Coq, Coinduction, Productiveness, Guardedness, Accessibility predicates
      • ABSTRACT: In Constructive Type Theory, recursive and corecursive definitions are subject to syntactic restrictions which guarantee termination for recursive functions and productivity for corecursive functions. However, many terminating and productive functions do not pass the syntactic tests. Bove proposed in her thesis an elegant reformulation of the method of accessibility predicates that widens the range of terminating recursive functions formalisable in Constructive Type Theory. In this talk, we pursue the same goal for productive corecursive functions. Notably, our method of formalisation of coinductive definitions of productive functions in Coq requires not only the use of ad-hoc predicates, but also a systematic algorithm that separates the inductive and coinductive parts of functions.
    • Marcin Benke. Multiple Recursion in Type Theory and Generic Programming
      • KEYWORDS: type theory, recursion, generic programming
      • ABSTRACT: Many functions (such as e.g. boolean equality) are naturally defined by simultaneous recursion in two or more arguments. Yet most formulations of Type Theory allow only primitive recursion in one argument at a time. We discuss some ways in which multiple recursion can be justified and explained in the context of generic programming in Type Theory.
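      The boolean-equality example mentioned in the abstract can be sketched as follows (a hypothetical Python rendering, with Peano naturals encoded as None for zero and a one-tuple for successor; the point is that the definition recurses on both arguments at once, which plain one-argument primitive recursion does not directly provide):

      ```python
      def nat_eq(m, n):
          # Boolean equality on Peano naturals by simultaneous
          # recursion on both arguments.
          if m is None and n is None:   # Z == Z
              return True
          if m is None or n is None:    # Z vs S _, or S _ vs Z
              return False
          return nat_eq(m[0], n[0])     # S m' == S n'  iff  m' == n'
      ```

      For example, with one = (None,) and two = ((None,),), nat_eq(two, two) holds while nat_eq(one, two) does not.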
    Typed Lambda Calculi: type assignment algorithm

    • Dulma Rodriguez. Algorithmic Subtyping for Higher Order Bounded Quantification Revisited (typed lambda calculus: type assignment algorithm)
      • KEYWORDS: subtyping, bounded quantification
      • ABSTRACT: Algorithmic subtyping for the system Fomega_sub of higher-order bounded polymorphism has been investigated extensively in the past, yielding positive results. The reason for this new revision of the topic is the development of new proof techniques in recent years, and the question of whether the completeness of algorithmic subtyping for this system can be proved with these techniques, i.e. syntactically, without the construction of models. We provide a revision of the algorithm and prove directly, using lexicographic induction on kinds and derivations, its soundness, completeness and termination.
    • Davide Ancona, Giovanni Lagorio and Elena Zucca. Type inference for Java-like programs by coinductive logic programming (typed lambda calculus: type assignment algorithm)
      • KEYWORDS: class analysis, coinduction, logic programming, object-oriented programming
      • ABSTRACT: Although coinductive logic programming (Co-LP) has proved to have several applications, including for instance verification of infinitary properties, model checking and bisimilarity proofs, type inference by Co-LP has not been properly investigated yet. In this presentation we show a novel approach to solving the problem of type inference for Java-like programs, based on Co-LP and on the following generic scheme: first, the program P to be analyzed is translated into a corresponding logic program P' and into a goal G; then, the solution of the type inference problem corresponds to finding a most general substitution theta such that (G theta) belongs to the greatest model of P', that is, the coinductive model of P'. Operationally, this corresponds to finding a co-SLD derivation of G in P', according to the operational semantics of Co-LP recently defined by Simon et al. [ICLP06, ICALP07]. A complete example of instantiation of this scheme is considered for a simple object-oriented language with nominal subtyping, where type annotations are optional. In particular, the following two different type inference problems are considered: (1) type inference of a complete program, that is, a collection of classes equipped with a main expression; (2) type inference of a library of classes, where no main expression is provided. In the final part of the talk we try to address some scalability issues of the approach, by sketching some possible instantiations able to deal with more expressive object-oriented languages.
    Typed Lambda Calculi: normalization

    • Andreas Abel. Normalization by Evaluation for Martin-Lof Type Theory (typed lambda calculus: normalization)
      • KEYWORDS: Normalization by Evaluation for Martin-Lof Type Theory
      • ABSTRACT: The decidability of equality is proved for Martin-Lof type theory with a universe à la Russell and typed beta-eta-equality judgements. A corollary of this result is that the constructor for dependent function types is injective, a property which is crucial for establishing the correctness of the type-checking algorithm. The decision procedure uses normalization by evaluation, an algorithm which first interprets terms in a domain with untyped semantic elements and then extracts normal forms. The correctness of this algorithm is established using a PER-model and a logical relation between syntax and semantics.
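      The two-phase structure described in the abstract (interpret terms into a semantic domain, then extract normal forms) can be illustrated on the plain untyped lambda calculus. A toy Python sketch, far simpler than the typed algorithm of the talk and offered only to convey the idea:

      ```python
      # Toy normalization by evaluation for the untyped lambda calculus.
      # Terms use de Bruijn indices: ("var", i), ("lam", body), ("app", f, a).
      # Values: ("vlam", fn) or ("vneu", neutral);
      # neutrals: ("nvar", level) or ("napp", neutral, value).

      def evaluate(env, t):
          """Phase 1: interpret a term in the semantic domain."""
          if t[0] == "var":
              return env[t[1]]
          if t[0] == "lam":
              body = t[1]
              return ("vlam", lambda v: evaluate([v] + env, body))
          f, a = evaluate(env, t[1]), evaluate(env, t[2])
          if f[0] == "vlam":
              return f[1](a)                    # beta step happens semantically
          return ("vneu", ("napp", f[1], a))

      def reify(depth, v):
          """Phase 2: extract a beta-normal term from a value."""
          if v[0] == "vlam":
              fresh = ("vneu", ("nvar", depth))  # fresh variable at this level
              return ("lam", reify(depth + 1, v[1](fresh)))
          return reify_neutral(depth, v[1])

      def reify_neutral(depth, n):
          if n[0] == "nvar":
              return ("var", depth - n[1] - 1)   # de Bruijn level -> index
          return ("app", reify_neutral(depth, n[1]), reify(depth, n[2]))

      def normalize(t):
          return reify(0, evaluate([], t))
      ```

      For instance, with I = ("lam", ("var", 0)), normalize(("app", I, I)) returns I itself.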
    • Christophe Raffalli. PML and strong normalisation (typed lambda calculus: normalization)
      • KEYWORDS: strong normalisation, lambda-calculus, types
      • ABSTRACT: The development of the PML proof assistant is progressing: the language is now stable, one can start to write proofs, and the theoretical work is also maturing. In this talk, we will concentrate on the last aspect, mainly the strong normalisation proof for a subset of the language without fixpoints. The proof method is quite different from the usual ones, mainly because we use a kind of constraint checking algorithm (as opposed to the usual typing rules or constraint solving).
    • Jose Espirito Santo, Ralph Matthes and Luis Pinto. Cgps-translations for intuitionistic higher-order sequent calculi (typed lambda calculus: strong normalization)
      • KEYWORDS: cgps translation, strong normalisation, sequent calculus, higher order logic
      • ABSTRACT: In this talk we consider type theories defined in the sequent calculus format. Specifically, we want to obtain strong normalisation of these theories as a corollary to strong normalisation of the corresponding theories in the natural deduction format. The technique used is called continuation-and-garbage-passing-style translation, a refinement of cps-translation achieving a strict simulation of reduction in the source calculus by reduction in the target calculus. We review the technique in the propositional/simple type setting, and then we extend it to system F^omega and intuitionistic higher-order logic. Joint work with R. Matthes and L. Pinto.
    Foundation of constructive mathematics

    • Maria Emilia Maietti and Giovanni Sambin. A minimalist two-level foundation for constructive mathematics (foundation of constructive mathematics)
      • KEYWORDS: intuitionistic logic, quotients, type theory
      • ABSTRACT: We present a two-level theory to formalize constructive mathematics. One level is given by an intensional type theory previously introduced and called Minimal type theory. The other level is given by an extensional set theory which is interpreted in the first one by means of a quotient model. This two-level theory has two main features: it is minimal among the most relevant foundations for constructive mathematics, and it fulfils the ``proofs-as-programs'' paradigm thanks to the way the two levels are linked.
    • Aldo Ursini. Algebraic Types (foundation of typed lambda calculus)
      • KEYWORDS: Types, Algebraic terms
      • ABSTRACT: To a philosopher, a type is a 'co-set'; to a logician, a type is a 'formula'; to a computer scientist, a type is a data-type. What, then, is a type to a poor mathematician? The proposal is: a type is a (first-order) term, i.e. an algebraic term. This is not so silly, insofar as one can give an algebraic account of the crucial notions: an object a being of a given type t; such an object a being in normal form; what a procedure of reduction of such an object a to normal form is; and providing some (denotational) semantics for types and objects. This can be achieved using (a common generalization of) classical algebraic notions (ideals of rings, normal subgroups, filters in Heyting or Boolean algebras, etc.) and equational logic. Thus we get a "one-dimensional world": objects, types and derivations are all just algebraic terms. Hence, reduction is handled via equational logic, and at least in lucky cases -- namely, when a "generic" algebra is available -- denotational semantics is dealt with using just algebraic terms.
    • Anton Setzer. The strength of Martin-Lof Type Theory with the logical framework (typed lambda calculus: foundations)
      • KEYWORDS: Martin-Lof Type Theory, Proof Theory, Universes, W-type, Mahlo universe
      • ABSTRACT: Our previous investigations into Martin-Lof Type Theory were restricted to the version without the logical framework. This avoided having to give an interpretation of Set: all that was needed was to state when the judgement A : Set is correct, which was the case essentially when the interpretation of A is a partial equivalence relation. We show how to overcome this problem by interpreting Set and other higher types as proper set-theoretic classes. We then determine upper bounds for Martin-Lof Type Theory with one universe and the W-type, and for the same type theory with one Mahlo universe and the W-type.
    • Guillaume Burel. Encoding Pure Type Systems in Superdeduction (typed lambda calculus: subsuming PTS into superdeduction)
      • KEYWORDS: first-order logic, deduction modulo, pure type systems, supernatural deduction
      • ABSTRACT: Superdeduction is a formalism closely related to deduction modulo which permits enriching a first-order deductive system -- such as natural deduction or sequent calculus -- with new inference rules automatically computed from the presentation of a theory. We give a natural encoding from every Pure Type System (PTS) into superdeduction by defining an appropriate first-order theory. We prove that this translation is correct and conservative, showing a correspondence between valid typing judgments in the PTS and provable sequents in the corresponding superdeductive system. This implies that superdeduction can easily be used as a logical framework, and leads to a better understanding of the implementation and automation of proof search for PTS, as well as to more cooperation between proof assistants. This also opens new approaches to the normalization of PTS.

  • Implementations

  • Johannes Kanig, Sylvain Conchon and Evelyne Contejean. Lightweight Equality Certificates (applications: implementing proof assistants)
    • KEYWORDS:automation, computer-assisted reasoning
    • ABSTRACT: We present a mechanism to automate equality proofs in skeptical proof assistants. The equality we consider is the combination of the pure theory of equality over uninterpreted symbols and a built-in decidable theory X. Our approach is intermediate between traditional complete proof reconstruction and verified provers. Reasoning about X is implemented independently in the proof assistant and as an external decision procedure. This enables both tools to interact only through lightweight traces about pure equalities, which are produced by the decision procedure and interpreted by the proof assistant. We validate our approach with the Coq proof assistant and our decision procedure CC(X), a congruence closure algorithm modulo a built-in theory. CC(X) is implemented as a module parameterized by X, and the Coq development follows the same modular architecture. Currently, we have instantiated X with the theory of linear arithmetic.
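    The reasoning about pure equalities that the abstract describes can be illustrated, in a much simplified form, by a union-find structure: asserted equalities merge classes, and a query checks class membership. This is only an illustrative sketch (the class name and API below are invented for the example); CC(X) additionally handles congruence over function symbols and a built-in theory X, which this sketch omits.

    ```python
    # Minimal union-find sketch of the pure theory of equality.
    # Illustrative only: names are not from CC(X), and congruence
    # closure proper (f(a) = f(b) when a = b) is not implemented.

    class UnionFind:
        def __init__(self):
            self.parent = {}

        def find(self, x):
            self.parent.setdefault(x, x)
            while self.parent[x] != x:
                # path halving keeps trees shallow
                self.parent[x] = self.parent[self.parent[x]]
                x = self.parent[x]
            return x

        def union(self, a, b):
            self.parent[self.find(a)] = self.find(b)

        def equal(self, a, b):
            return self.find(a) == self.find(b)

    uf = UnionFind()
    uf.union("a", "b")         # trace step: a = b
    uf.union("b", "c")         # trace step: b = c
    print(uf.equal("a", "c"))  # True, by transitivity
    print(uf.equal("a", "d"))  # False
    ```

    A trace of such union steps is cheap to produce on the prover side and easy to replay and check on the proof-assistant side, which is the kind of lightweight interaction the abstract advocates.
    
    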
  • Evelyne Contejean. Coccinelle, a Coq library for rewriting (applications: implementing proof assistants)
    • KEYWORDS:term rewriting, termination, equational theory
    • ABSTRACT: The main purpose of the Coq library Coccinelle is to model term rewriting and to use this modelling, together with automatically generated traces, in order to prove key properties of rewrite systems such as termination and local confluence. For the moment Coccinelle contains: universal (first-order) term algebras; generic equational theories over terms; associativity and commutativity (AC) as a built-in equational theory; matching modulo AC; multisets (thanks to permutations modulo an arbitrary relation) and Dickson's lemma; RPO; and several criteria for termination of rewriting such as general dependency pairs, modular termination, and innermost termination. The traces are produced by the automated prover CiME, and currently Coccinelle + CiME can automatically prove the termination of a significant part of the TPDB (termination problem database).
  • Stephane Lengrand. Functional-logic programming: a call for interest (implementing proof assistants: proof-search)
    • KEYWORDS: Proof search, Logic Programming, Meta-variables, Unification, Workshop / project proposal
    • ABSTRACT: At Types'06 we presented a Sequent Calculus for Pure Type Systems, building on pioneering works by Pym, Dowek, Pinto, Munoz and others. We have since turned it into a framework with meta-variables to represent proofs to be found. The aim is to formalise the process of proof-search in proof assistants based on PTS and connect it to the computation paradigm of (pure) Logic Programming and its recent developments that build on e.g. Lambda-Prolog. They indeed share some standard techniques and issues, such as resolution and unification, but also face challenges such as object-level binders (cf. Nominal Isabelle) and searching for proofs by induction. Also, programming in proof assistants based on PTS can require tedious extra work (namely, the proving) from the programmer, for which it is highly desirable to raise the level of automation, inspired by computation in logic programming. While the addition of uncontrolled fixpoints (e.g. let rec in ML) breaks the Curry-Howard paradigm of programs as proofs, we push instead the idea of offering the user the convenience of meta-variables, so as to use the search for type inhabitants as a programming paradigm, in the tradition of logic programming (but in more expressive type theories). We would like to put forward such a research direction as a suggestion for a focused TYPES workshop and/or sub-project, and thus call for interest.
  • Automated Deduction

  • Andrew Polonsky. Translation from First Order Logic to Coherent Logic (Automated deduction)
    • KEYWORDS: Automated deduction, coherent logic, interactive theorem proving
    • ABSTRACT: Coherent Logic (CL) is a subsystem of FOL with a simple, natural proof theory. It is therefore possible to write an automated prover for CL which is quick yet allows recovery of high-quality proof objects. These properties would make CL an ideal candidate for automating reasoning in interactive proof assistants, if an efficient translation from arbitrary formulas to coherent logic can be found. A naive approach, based on the method of semantic tableaux, has proved to be inadequate for practical problems. We present recent efforts to improve the translator, and the latest results on the performance of the prover as measured on problems translated from the TPTP database.
  • Formalization works

  • Carsten Schürmann and Jeffrey Sarnat. Structural Logical Relations (machine-checked proofs: logical relations)
    • KEYWORDS:Twelf, logical frameworks
    • ABSTRACT: Tait's method (a.k.a.\ proof by logical relations) is a powerful proof technique frequently used for showing foundational properties of languages based on typed lambda-calculi. Historically, these proofs have been extremely difficult to formalize in proof assistants with weak meta-logics, such as Twelf, which allows only $\forall\exists$-sentences; some would even go as far as to say impossible. And yet they are often straightforward in proof assistants with stronger meta-logics. In my talk I propose a technique called "structural logical relations" that settles this debate once and for all. In support of our claims, we give a Twelf-checked proof of the completeness of an equality algorithm for the simply typed $\lambda$-calculus, a potential problem for the POPLmark challenge.
  • Ralph Matthes. A representation of untyped lambda-calculus with explicit flattening through a nested datatype (machine-checked proofs: injectivity of explicit flattening)
    • KEYWORDS: nested datatype, iteration in Mendler's style, termination guarantee, explicit flattening, Coq theorem prover
    • ABSTRACT: Lambda-calculus can be turned into a monad by a notion of parallel substitution that becomes the Kleisli extension operation. Following Bellegarde and Hook (1994), one can also study monad multiplication directly, and this amounts to an operation flattening terms whose free variables are themselves terms into just terms. Nested datatypes (Bird & Meertens 1998) are a good means of representing untyped lambda-calculus (Altenkirch & Reus 1999, Bird & Paterson 1999). This representation can be extended by an explicit form of flattening, hence with a new datatype constructor with a nested call to the datatype, yielding a truly nested datatype. For truly nested datatypes, no direct support by the Calculus of Inductive Constructions (CIC), as implemented by the theorem prover Coq, is available. In previous work with Abel and Uustalu (2005), the author proposed Mendler-style (Mendler 1987) iteration schemes that can work on truly nested datatypes and guarantee termination. In previous work, the author introduced an extension of the CIC, the Logic of Natural Mendler-style Iteration (LNMIt), that allows in particular to prove naturality of functions defined by these iteration schemes, and gave an implementation of LNMIt within Coq (assuming impredicative Set and propositional proof irrelevance). However, LNMIt makes essential use of non-canonical elements that prevent the proof of basic properties such as injectivity of the datatype constructors for application, abstraction and explicit flattening. Through a relativization to hereditarily canonical elements, the problems with non-canonical elements can be overcome, yielding a truly nested datatype for which exhaustivity and injectivity of the datatype constructors can be proven. The whole development is done in Coq and uses propositional proof irrelevance in order to discard the proof of hereditary canonicity (impredicativity of Set is not needed for the development within LNMIt). The proof of injectivity of explicit flattening is amazingly complicated at the moment, and feedback would be very welcome.
  • Herman Geuvers. Formalizing Hybrid Systems in Coq (machine-assisted proofs: hybrid systems)
    • KEYWORDS: Hybrid Systems, Formal Verification, Proof Assistant
    • ABSTRACT: We are in the process of formalizing Hybrid Systems in Coq, with the end goal of verifying safety properties of these systems. We use the abstraction method to do this, which maps the (uncountable) continuous state space to a finite state space. We will outline the general approach and indicate how safety properties can be decided for simple examples, where the continuous behavior is of a relatively simple form. In doing so, we have to solve inequalities over real-number expressions, for which we resort to a decidability algorithm for exact reals, as implemented in CoRN.
  • François Garillot. A small reflection on group automorphisms
      (machine-assisted proofs: group automorphisms)
    • KEYWORDS: proof assistants, finite groups, coq
    • ABSTRACT: A Small Scale Reflection extension for the Coq proof assistant has recently been developed in synergy with a long-term effort to formalize the Feit-Thompson theorem. Their reunion features a tactic language as well as a substantial library on finite groups, and multiple uses of the canonical structure mechanism of Coq. We describe our experience using this extension to reason about group automorphisms, and an application to the study of cyclic groups. Our work articulates various components of the library we contribute to, and thus allows an analysis of the costs and benefits (both significant) of their design and compositionality.
  • Constructive real analysis

  • Cezary Kaliszyk and Russell S S O'Connor. Computing with classical real numbers. (constructive real analysis)
    • KEYWORDS: real numbers, Coq, CoRN
    • ABSTRACT: There are two Coq libraries that have a theory of the real numbers. The Coq standard library gives an axiomatic treatment of classical reals, while the CoRN library from Nijmegen defines constructive reals. We present a way of making these two libraries compatible by showing that their real number structures are isomorphic, assuming the classical axioms. To do this we show that the axioms of the classical Coq reals imply decidability of Pi^0_1 sentences, namely: (forall n : nat, {P n} + {~P n}) -> {forall n, P n} + {~forall n, P n}. This shows that classical logic leaks into the computational part. We then show that this isomorphism preserves some basic constants (0, 1, e, pi), the arithmetic operations and the basic transcendental functions (exponential, logarithm, sine, cosine). This allows us to use O'Connor's decision procedure for solving ground inequalities, present in CoRN, with the reals from the Coq standard library, which will be used for formalizing hybrid systems within Coq. It also makes it possible to use some of the theorems from the Coq standard library with the CoRN reals, assuming classical logic.
  • Semantics

  • Paul Brauner and Lisa Allali. A semantic normalization proof for a system with recursors (semantics: cut-elimination)
    • KEYWORDS: recursors, cut elimination, deduction modulo, superconsistency
    • ABSTRACT: Semantic methods have been used to prove cut elimination theorems for a long time. It is only recently that they have been extended to prove strong normalization results, in particular for theories in deduction modulo. However, such semantic methods did not apply to systems with recursors like Gödel's system T. We show in this paper how supernatural deduction provides a bridge between superconsistency of arithmetic and strong normalization of system T. We then generalize this result to a family of inductive types before discussing the dependent case.
  • Yves Bertot and Vladimir Komendantsky. Fixed point semantics and partial recursion in Coq (semantics: fixed point in Coq)
    • KEYWORDS: Knaster-Tarski least fixed point theorem, partial recursive functions, denotational semantics, non-termination, program extraction
    • ABSTRACT: We propose to use the Knaster-Tarski least fixed point theorem as a basis to define recursive functions in the Calculus of Inductive Constructions. This widens the class of functions that can be modeled in type-theory based theorem proving tools to potentially non-terminating functions. This is only possible if we extend the logical framework by adding some axioms of classical logic. We claim that the extended framework makes it possible to reason about terminating or non-terminating computations, and we show that extraction can also be extended to handle the new functions.
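    The fixed-point view of recursion can be sketched outside Coq as well. The following Python fragment (purely illustrative; the functional F and the dict representation are assumptions of the example, not the talk's construction) approximates the least fixed point of a functional on partial functions by Kleene iteration from the empty function:

    ```python
    # Sketch: approach the least fixed point of a monotone functional
    # on partial functions (represented as dicts) by Kleene iteration,
    # starting from the everywhere-undefined function.

    def F(g):
        """One unfolding of the factorial recursion, as a functional
        mapping a partial function g to a slightly more defined one."""
        h = {0: 1}                      # base case: 0! = 1
        for n, v in g.items():
            h[n + 1] = (n + 1) * v      # recursive case uses g
        return h

    approx = {}            # bottom: the empty partial function
    for _ in range(6):     # each step defines factorial on one more input
        approx = F(approx)

    print(approx[5])  # 120
    ```

    A genuinely non-terminating definition would simply never become defined at some inputs, which mirrors how the least fixed point accommodates partiality without breaking the logic.
    
    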
  • Linguistic

  • Muhammad Humayoun. Software Specifications and Mathematical Proofs in Natural Languages (linguistic)
    • KEYWORDS: Natural languages, Formal languages, Mathematical Proofs, Software Specifications, translation
    • ABSTRACT: This project is an attempt to make a connection between formal and natural languages. We are developing a controlled natural language with large coverage, which will be good enough for writing Software Specifications and Mathematical Proofs interactively.
  • Aarne Ranta, Björn Bringert and Krasimir Angelov. PGF: A Run-Time Format for Type-Theoretical Grammars (linguistic)
    • KEYWORDS: grammatical framework, grammar formalism, natural language processing, type theory
    • ABSTRACT: PGF (Portable Grammar Format) is a run-time format for type theoretical grammars which is designed for easy and efficient embedding of grammar-based processing in programs written in different programming languages. The PGF format allows parsing, linearization, and type checking to be performed by a simple interpreter, which can be seamlessly used as a module in a larger program, such as a translator or a dialogue system. PGF interpreters already exist for C, C++, Haskell, Java, and JavaScript. The main way to produce grammars in the PGF format is compilation from higher-level grammar formalisms. As an example, a compiler from GF (Grammatical Framework) to PGF is outlined. The main goal of the present paper is to give a complete description of PGF, both its syntax and semantics, to support the implementation of PGF interpreters in new programming languages, and the compilation of other grammar formalisms to PGF.
  • Zhaohui Luo and Serguei Soloviev. Coercive subtyping and multilingual semantics.
    • KEYWORDS:coercive subtyping, multilingual semantics
    • ABSTRACT: In [1] the use of coercive subtyping in a dependent type framework was suggested to explain certain lexical phenomena, such as homonymy within a single language, for example English. Now we generalize this approach to the correspondence between different languages, with applications to the problems of translation. [1] Z. Luo and P. Callaghan. Coercive subtyping and lexical semantics (extended abstract). Logical Aspects of Computational Linguistics (LACL'98), Grenoble, 1998.
  • Invited Talks

    Infinity and Productivity. Jan Willem Klop, VU University Amsterdam
    Abstract. In the setting of infinitary term rewriting, both for the first-order case and for lambda calculus, we discuss the problem of productivity, which is for infinitary rewriting what termination is for finitary rewriting. While undecidable in general, methods can be given for subclasses of infinitary specifications. In particular we treat such a method for infinite stream specifications, but we will also sketch an approach for a wider context. We illustrate this with our pet guiding examples, such as the Thue-Morse sequence, the Toeplitz sequence (and, as an aside, their connection to the von Koch snowflake), and tree ordinals. These topics stem from joint work within our Infinity project, in particular with Dimitri, Joerg, Ariya, Roel, Clemens. (See http://infinity.few.vu.nl/)
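    The Thue-Morse sequence mentioned above is a standard example of a productive stream: every element is eventually produced. A small sketch (in Python generators rather than infinitary rewriting, and using the bit-count characterization of the sequence rather than a corecursive stream specification):

    ```python
    # Sketch: the Thue-Morse sequence as a productive stream.
    # The n-th element is the parity of the number of 1 bits in n.
    # The generator keeps producing digits forever; productivity here
    # is trivial, unlike for general corecursive stream specifications.

    from itertools import islice

    def thue_morse():
        n = 0
        while True:
            yield bin(n).count("1") % 2
            n += 1

    print(list(islice(thue_morse(), 8)))  # [0, 1, 1, 0, 1, 0, 0, 1]
    ```

    A corecursive specification of the same stream (e.g. by zipping the stream with its pointwise inversion) is where productivity becomes a genuine question: one must check that each element can be computed from finitely many earlier ones.
    
    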

    About the formalization of some results by Chebyshev in number theory. Andrea Asperti, Universita di Bologna. Abstract. We discuss the formalization, with the Matita Interactive Theorem Prover, of a few famous results by Chebyshev concerning the distribution of prime numbers, comprising a purely arithmetical proof of Bertrand's postulate.

    Web Address: http://types2008.di.unito.it/. Acknowledgements: Types 2008 is hosted by the Semantic Group of the Computer Science Dept. of Turin University. This site has been generated using a Small Site Generator written in the Mathematica language by Stefano Berardi.