
2010 | Book

The Mathematics of Language

10th and 11th Biennial Conference, MOL 10, Los Angeles, CA, USA, July 28-30, 2007, and MOL 11, Bielefeld, Germany, August 20-21, 2009, Revised Selected Papers

Editors: Christian Ebert, Gerhard Jäger, Jens Michaelis

Publisher: Springer Berlin Heidelberg

Book Series: Lecture Notes in Computer Science


Table of Contents

Frontmatter
Dependency Structures Derived from Minimalist Grammars
Abstract
This paper provides an interpretation of Minimalist Grammars [16], [17] in terms of dependency structures. Under this interpretation, merge operations derive projective dependency structures, and movement operations introduce both non-projectivity and ill-nestedness. This new characterization of the generative capacity of Minimalist Grammars makes it possible to discuss the linguistic relevance of non-projectivity and ill-nestedness. This in turn provides insight into grammars that derive structures with these properties.
Marisa Ferrara Boston, John T. Hale, Marco Kuhlmann
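The structural properties named in this abstract can be made concrete with a little code. The following is a minimal Python sketch, not the authors' construction: it checks projectivity and well-nestedness of a dependency tree, where the encoding (a list of heads, with 0 marking the root) and the example trees are invented for illustration.

```python
from itertools import combinations

def yields(heads):
    """Yield of each node: the set of string positions in its subtree."""
    n = len(heads)
    ys = {i: {i} for i in range(1, n + 1)}
    for i in range(1, n + 1):
        j = heads[i - 1]
        while j != 0:            # add i to the yield of every ancestor
            ys[j].add(i)
            j = heads[j - 1]
    return ys

def is_projective(heads):
    """Projective iff every yield is a contiguous block of positions."""
    return all(max(y) - min(y) + 1 == len(y) for y in yields(heads).values())

def _crosses(a, b):
    # some a1 < b1 < a2 < b2 with a1, a2 in a and b1, b2 in b
    return any(a1 < b1 < a2 < b2
               for a1, a2 in combinations(sorted(a), 2)
               for b1, b2 in combinations(sorted(b), 2))

def is_well_nested(heads):
    """Well-nested iff no two disjoint yields interleave."""
    ys = list(yields(heads).values())
    return not any(not (a & b) and (_crosses(a, b) or _crosses(b, a))
                   for a, b in combinations(ys, 2))

# heads[i-1] is the head of position i; 0 marks the root.
print(is_projective([0, 4, 1, 1]), is_well_nested([0, 4, 1, 1]))        # False True
print(is_projective([0, 1, 1, 2, 3]), is_well_nested([0, 1, 1, 2, 3]))  # False False
```

The first toy tree is non-projective but well-nested; in the second, the yields {2, 4} and {3, 5} interleave, so it is ill-nested.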
Deforesting Logical Form
Abstract
This paper argues against Logical Form (LF) as an intermediate level of representation in language processing. We apply a program transformation technique called deforestation to demonstrate the inessentiality of LF in a parsing system that builds semantic interpretations. We consider two phenomena, Quantifier Raising in English and Wh-movement in Chinese, which have played key roles in the broader argument for LF. Deforestation derives LF-free versions of these parsing systems. This casts doubt on LF’s relevance for processing models, contrary to suggestions in the literature.
Zhong Chen, John T. Hale
On the Probability Distribution of Typological Frequencies
Abstract
Some language types are more frequent among the world’s languages than others, and the field of linguistic typology attempts to elucidate the reasons for such differences in type frequency. However, there is no consensus in that field about the stochastic processes that shape these frequencies, and there is thus likewise no agreement about the expected probability distribution of typological frequencies. This paper explains the problem and presents a first attempt to build a theory of typological probability purely based on processes of language change.
Michael Cysouw
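As a purely illustrative aside (the model below is an assumption of this note, not the paper's proposal): if type change is idealized as a Markov process over a finite set of types, the stationary distribution of that process is one candidate for the expected probability distribution of typological frequencies.

```python
def stationary(P, steps=5000):
    """Approximate the stationary distribution of a row-stochastic matrix P
    by iterating from a uniform start (crude, but sufficient for a sketch)."""
    n = len(P)
    dist = [1.0 / n] * n
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

# Hypothetical one-step transition probabilities between three word-order types.
P = [[0.90, 0.08, 0.02],   # from SOV
     [0.05, 0.90, 0.05],   # from SVO
     [0.10, 0.10, 0.80]]   # from VSO
print([round(p, 3) for p in stationary(P)])
```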
A Polynomial Time Algorithm for Parsing with the Bounded Order Lambek Calculus
Abstract
[2] introduced the bounded-order Lambek Calculus and provided a polynomial time algorithm for its sequent derivability. However, this result is limited because it requires exponential time in the presence of lexical ambiguity. That is, [2] did not provide a polynomial time parsing algorithm. The purpose of this paper will be to provide such an algorithm. We will prove an asymptotic bound of \(O(n^4)\) for parsing and improve the bound for sequent derivability from \(O(n^5)\) to \(O(n^3)\).
Timothy A. D. Fowler
LC Graphs for the Lambek Calculus with Product
Abstract
This paper introduces a novel graph representation of proof nets for the Lambek calculus that extends the LC graph representation of [13] to include the product connective. This graph representation specifies the difference between the Lambek calculus with and without product more clearly than other proof net representations, which is important to the search for polynomial-time fragments of the Lambek calculus. We use LC graphs to further the effort to characterize the boundary between polynomial-time and NP-complete sequent derivability by analyzing the NP-completeness proof of [14] and discussing a sequent derivability algorithm.
Timothy A. D. Fowler
Proof-Theoretic Semantics for a Natural Language Fragment
Introduction
We propose a Proof-Theoretic Semantics (PTS) for a (positive) fragment \(E^{+}_{0}\) of Natural Language (NL) (English in this case). The semantics is intended [7] to be incorporated into actual grammars, within the framework of Type-Logical Grammar (TLG) [12]. Thereby, this semantics constitutes an alternative to the traditional model-theoretic semantics (MTS), originating in Montague’s seminal work [11], used in TLG.
Nissim Francez, Roy Dyckhoff
Some Interdefinability Results for Syntactic Constraint Classes
Abstract
Choosing as my vantage point the linguistically motivated Müller-Sternefeld hierarchy [23], which classifies constraints according to their locality properties, I investigate the interplay of various syntactic constraint classes on a formal level. For non-comparative constraints, I use Rogers’s framework of multi-dimensional trees [31] to state Müller and Sternefeld’s definitions in general yet rigorous terms that are compatible with a wide range of syntactic theories, and I formulate conditions under which distinct non-comparative constraints are equivalent. Comparative constraints, on the other hand, are shown to be best understood in terms of optimality systems [5]. From this I derive that some of them are reducible to non-comparative constraints. The results jointly vindicate a broadly construed version of the Müller-Sternefeld hierarchy, yet they also support a refined picture of constraint interaction that has profound repercussions for both the study of locality phenomena in natural language and how the complexity of linguistic proposals is to be assessed.
Thomas Graf
Sortal Equivalence of Bare Grammars
Abstract
We discuss a concept of structural equivalence between grammars in the framework of Keenan and Stabler’s bare grammars. The definition of syntactic sorts for a grammar L permits the introduction of a sort structure group \(\mathrm{Aut}_{\pi}(L)\). The automorphism group \(\mathrm{Aut}(L)\) of L is found to be a group extension by \(\mathrm{Aut}_{\pi}(L)\). We then develop a concept of equivalence of grammars based on isomorphisms between the syntactic sort algebras. We study the implications of this equivalence with techniques from category theory: we invert the class of grammar homomorphisms that induce isomorphisms of sort algebras. The resulting category of fractions is found to be equivalent to a category of sortally reduced grammars.
Thomas Holder
Deriving Syntactic Properties of Arguments and Adjuncts from Neo-Davidsonian Semantics
Abstract
This paper aims to show that certain syntactic differences between arguments and adjuncts can be thought of as a transparent reflection of differences between their contributions to neo-Davidsonian logical forms. Specifically, the crucial underlying distinction will be that between modifying an event variable directly, and modifying an event variable indirectly via a thematic relation. I note a convergence between the semantic composition of neo-Davidsonian logical forms and existing descriptions of the syntactic properties of adjunction, and then propose a novel integration of syntactic mechanisms with explicit neo-Davidsonian semantics which sheds light on the nature of the distinction between arguments and adjuncts.
Tim Hunter
On Monadic Second-Order Theories of Multidominance Structures
Introduction
Multidominance structures (MDSes) were introduced by Kracht [4] to provide a data structure for the formalisation of various aspects of GB-Theory. Kracht studied the PDL-theory of MDSes and showed in [5] that this theory is decidable, in fact 2EXPTIME-complete. He went on to conjecture that the MSO-theory of MDSes should therefore be decidable, too. We show here the contrary: both the MSO-theory over vertices only and the MSO-theory over vertices and edges turn out to be undecidable.
Stephan Kepser
The Equivalence of Tree Adjoining Grammars and Monadic Linear Context-Free Tree Grammars
Abstract
It was observed quite early after the introduction of Tree Adjoining Grammars (TAGs) that the adjoining operation seems to be a special case of the more general deduction step in a context-free tree grammar (CFTG) derivation. TAGs look like special cases of a subclass of CFTGs, namely monadic linear CFTGs. More than a decade ago it was shown that the two grammar formalisms are indeed weakly equivalent, i.e., that they define the same classes of string languages. This paper closes the remaining gap by showing strong equivalence for so-called non-strict TAGs, a variant of TAGs in which the restrictions on head and foot nodes are slightly generalised.
Stephan Kepser, James Rogers
A Formal Foundation for A and A-bar Movement
Abstract
It seems to be a fact that movement dependencies come in two flavours: "A" and "A-bar". Over the years, a number of apparently independent properties have been shown to cluster together around this distinction. However, the basic structural property relating these two kinds of movement, the ban on improper movement ('once you go bar, you never go back'), has never been given a satisfactory explanation. Here, I propose a timing-based account of the A/A-bar distinction, which derives the ban on improper movement and allows for a simple and elegant account of some of their differences. In this account, "A" dependencies are those which an expression enters into before it is first merged into a structure, and "A-bar" dependencies are those it enters into after having been merged. The resulting system is mildly context-sensitive, thereby providing a restrictive account of possible human grammars while remaining expressive enough to describe the kinds of dependencies that are thought to be manifest.
Gregory M. Kobele
Without Remnant Movement, MGs Are Context-Free
Abstract
Minimalist grammars offer a formal perspective on a popular linguistic theory, and are comparable in weak generative capacity to other mildly context-sensitive formalisms. Minimalist grammars allow for the straightforward definition of so-called remnant movement constructions, which have found use in many linguistic analyses. It has been conjectured that the ability to generate this kind of configuration is crucial to the super-context-free expressivity of minimalist grammars. This conjecture is proven here.
Gregory M. Kobele
The Algebra of Lexical Semantics
Abstract
The current generative theory of the lexicon relies primarily on tools from formal language theory and mathematical logic. Here we describe how a different formal apparatus, taken from algebra and automata theory, resolves many of the known problems with the generative lexicon. We develop a finite state theory of word meaning based on machines in the sense of Eilenberg [11], a formalism capable of describing discrepancies between syntactic type (lexical category) and semantic type (number of arguments). This mechanism is compared both to the standard linguistic approaches and to the formalisms developed in AI/KR.
András Kornai
Phonological Interpretation into Preordered Algebras
Abstract
We propose a novel architecture for categorial grammar that clarifies the relationship between semantically relevant combinatoric reasoning and semantically inert reasoning that only affects surface-oriented phonological form. To this end, we employ a level of structured phonology that mediates between syntax (abstract combinatorics) and phonology proper (strings). To notate structured phonologies, we employ a lambda calculus analogous to the φ-terms of [8]. However, unlike Oehrle's purely equational φ-calculus, our phonological calculus is inequational, in a way that is strongly analogous to the functional programming language LCF [10]. Like LCF, our phonological terms are interpreted into a Henkin frame of posets, with degree of definedness ('height' in the preorder that interprets the base type) corresponding to degree of pronounceability; only maximal elements are actual strings and therefore fully pronounceable. We illustrate with an analysis (also new) of some complex constituent-order phenomena in Japanese.
Yusuke Kubota, Carl Pollard
Relational Semantics for the Lambek-Grishin Calculus
Abstract
We study ternary relational semantics for LG: a symmetric version of the Lambek calculus with interaction principles due to Grishin [10]. We obtain completeness on the basis of a Henkin-style weak filter construction.
Natasha Kurtonina, Michael Moortgat
Intersecting Adjectives in Syllogistic Logic
Abstract
The goal of natural logic is to present and study logical systems for reasoning with sentences of (or which are reasonably close to) ordinary language. This paper explores simple systems of natural logic which make use of intersecting adjectives; these are adjectives whose interpretation does not vary with the noun they modify. Our project in this paper is to take one of the simplest syllogistic fragments, that of all and some, and to add intersecting adjectives. There are two ways to do this, depending on whether one allows iteration or prefers a “flat” structure of at most one adjective. We present rules of inference for both types of syntax, and these differ. The main results are four completeness theorems: for each of the two types of syntax we have completeness for the all fragment and for the full language of this paper.
Lawrence S. Moss
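The key semantic property of an intersecting adjective, namely that it denotes a fixed set combined with the noun by intersection, can be illustrated with a small finite-model sketch. The lexicon and model below are invented, and this is plain model evaluation rather than the paper's proof system.

```python
NOUNS = {"dog": {1, 2, 3}, "cat": {4, 5}}
ADJS  = {"small": {1, 4, 6}, "brown": {2, 3, 5}}   # noun-independent denotations

def den(term):
    """Denotation of a noun, or of (possibly iterated) adjectives + noun."""
    *adjs, noun = term.split()
    result = NOUNS[noun]
    for a in adjs:                 # each intersecting adjective just intersects
        result = result & ADJS[a]
    return result

def all_(p, q):  return den(p) <= den(q)
def some_(p, q): return bool(den(p) & den(q))

print(all_("brown dog", "dog"))         # True: an intersection is a subset
print(some_("small dog", "brown dog"))  # False in this model
print(den("small brown cat"))           # iterated adjectives: set() here
```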
Creation Myths of Generative Grammar and the Mathematics of Syntactic Structures
Abstract
Syntactic Structures (Chomsky [6]) is widely believed to have laid the foundations of a cognitive revolution in linguistic science, and to have presented (i) the first use in linguistics of powerful new ideas regarding grammars as generative systems, (ii) a proof that English was not a regular language, (iii) decisive syntactic arguments against context-free phrase structure grammar description, and (iv) a demonstration of how transformational rules could provide a formal solution to those problems. None of these things are true. This paper offers a retrospective analysis and evaluation.
Geoffrey K. Pullum
On Languages Piecewise Testable in the Strict Sense
Abstract
In this paper we explore the class of Strictly Piecewise languages, originally introduced to characterize long-distance phonotactic patterns by Heinz [7] as the Precedence Languages. We provide a series of equivalent abstract characterizations, discuss their basic properties, locate them relative to other well-known subregular classes and provide algorithms for translating between the grammars defined here and finite state automata as well as an algorithm for deciding whether a regular language is Strictly Piecewise.
James Rogers, Jeffrey Heinz, Gil Bailey, Matt Edlefsen, Molly Visscher, David Wellcome, Sean Wibel
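The defining idea of Strictly Piecewise languages, that a grammar is a finite set of forbidden subsequences and a string is accepted iff none of them occurs in it, no matter how much material intervenes, is simple enough to sketch directly. The toy grammar below (a made-up sibilant-harmony-style restriction) is illustrative only.

```python
def is_subsequence(pattern, string):
    """True iff pattern occurs in string as a possibly discontinuous subsequence."""
    it = iter(string)
    return all(symbol in it for symbol in pattern)

def accepts(forbidden, string):
    """Strictly Piecewise acceptance: no forbidden subsequence occurs."""
    return not any(is_subsequence(p, string) for p in forbidden)

# Toy long-distance restriction: 's' may never be followed, at any distance,
# by 'S' (standing in for a second, mismatched sibilant).
grammar = {"sS"}
print(accepts(grammar, "sokutaS"))   # False: s ... S occurs as a subsequence
print(accepts(grammar, "Sokutas"))   # True: the reverse order is not banned
```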
A Note on the Complexity of Abstract Categorial Grammars
Introduction
This paper presents a precise and detailed study of the complexities of the membership and universal membership problems for Abstract Categorial Grammars (ACGs). ACGs were introduced by Philippe de Groote in [2] as a simplification over categorial grammars which reduces the number of necessary primitives involved in the definition of the formalism. Thus, in ACGs, every structure is represented with the help of the linear λ-calculus, and the languages defined by means of ACGs are sets of linear λ-terms. The problem under investigation has already been studied in [7], but we give here more precise results with arguably simpler proofs. We use the same classification of grammars, in terms of the order of the lexicon and the order of the abstract language.
Sylvain Salvati
Almost All Complex Quantifiers Are Simple
Abstract
We prove that PTIME generalized quantifiers are closed under Boolean operations, iteration, cumulation and resumption.
Jakub Szymanik
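For concreteness, here is a small worked sketch of one of the closure operations, iteration: evaluating Q1·Q2 on a finite model by brute force, which stays polynomial in the size of the universe. The sets, the relation and the choice of quantifiers are invented for illustration.

```python
def most(A, B):  return len(A & B) > len(A - B)
def some(A, B):  return bool(A & B)

def iterate(Q1, Q2, A, B, R):
    """(Q1 . Q2)(A, B, R): Q1 of the A's bear R to Q2 of the B's."""
    return Q1(A, {a for a in A if Q2(B, {b for b in B if (a, b) in R})})

students = {"s1", "s2", "s3"}
books    = {"b1", "b2"}
read     = {("s1", "b1"), ("s2", "b2")}
# "Most students read some book": true, since two of the three students do.
print(iterate(most, some, students, books, read))   # True
```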
Constituent Structure Sets I
Abstract
We replace traditional phrase structure tree representations by a new type of set-based representations. In comparison to labeled trees, in which we can potentially copy one category onto an infinite number of distinct nodes, the proposed representation system significantly restricts syntactic copying. Thus, we do not need to filter out copying possibilities by way of additional constraints. Each structure represents a set of constituents with structurally distinguished items. Via our PF interpretation rules, we provide the intended PF structures with regard to which our set-based syntactic representations are sound and complete. The intended PF structures provide enough flexibility to accommodate word order variation across languages relative to the same syntactic structures.
Hiroyuki Uchida, Dirk Bury
Backmatter
Metadata
Title
The Mathematics of Language
Editors
Christian Ebert
Gerhard Jäger
Jens Michaelis
Copyright Year
2010
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-14322-9
Print ISBN
978-3-642-14321-2
DOI
https://doi.org/10.1007/978-3-642-14322-9
