Linguistics and language-related topics

Selected Publications

Neuvel, S. and Sean Fulop. (2002) Unsupervised Learning of Morphology Without Morphemes. Proceedings of the Workshop on Morphological and Phonological Learning 2002. ACL Publications.

Read the paper (on arxiv.org)


Neuvel, S. and R. Singh (2002) Vive la différence! What Morphology is About. Folia Linguistica 35/3-4. 313-320.

Read the paper (PDF)


Neuvel, S. (2002) Whole Word Morphologizer: Expanding the Word-Based Lexicon. A non-stochastic computational approach. Brain and Language 81. 454-463.

Read the paper (PDF)


Neuvel, S. (2001) Pattern Analogy vs. Word Internal Syntactic Structure in West-Greenlandic. Geert Booij and Jaap van Marle (eds.), Yearbook of Morphology 2000. Dordrecht: Kluwer. 253-278.

Whole Word Morphology

Whole Word Morphology is a theory of non-concatenative morphology developed by Alan Ford and Rajendra Singh at the Université de Montréal. It focuses on contrastive relations between whole words and is one of the very few truly word-based theories of morphology.

The theory of WWM was first suggested in Ford and Singh 1983. A series of papers dealing with various aspects of it was published by Ford and Singh between 1983 and 1990. Drawing on these papers, they published a full outline of it in 1991 and an even fuller defense of it in 1997 (with Martohardjono). Since then, aspects of it have been taken up in a series of publications by Agnihotri, Dasgupta, Ford, Neuvel, and Singh, and various combinations of these authors. (Read more ...)

Testing the theory: machine learning without morphemes.

Whole Word Morphologizer
WWM is a small application developed within the framework of Whole Word Morphology. It identifies word-based morphological relations in a lexicon and creates new words based on these relations.
WWM compares every pair of words in a small lexicon and computes the segmental differences between them to create bi-directional, word-based morphological strategies. Each word in the lexicon is then mapped onto as many strategies as possible, and contrasting new words are added to the lexicon. (Read more ...)
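The pairwise comparison just described can be illustrated with a toy sketch. The real WWM operates over full segmental representations and records the categories of the words involved; the Python fragment below is only a minimal, suffix-only approximation (all function names and the threshold parameter are illustrative assumptions, not part of WWM itself):

```python
from itertools import combinations

def common_prefix_len(a, b):
    """Length of the longest shared initial segment of two words."""
    n = 0
    while n < min(len(a), len(b)) and a[n] == b[n]:
        n += 1
    return n

def learn_strategies(lexicon, min_stem=3):
    """Collect bi-directional suffix-alternation strategies from word
    pairs that share a sufficiently long initial segment.
    Simplification: both alternants must be non-empty, so
    zero-alternations (walk ~ walked) are ignored here."""
    strategies = set()
    for a, b in combinations(sorted(lexicon), 2):
        k = common_prefix_len(a, b)
        if k >= min_stem:
            s1, s2 = a[k:], b[k:]
            if s1 and s2 and s1 != s2:
                strategies.add((s1, s2))
                strategies.add((s2, s1))  # strategies are bi-directional
    return strategies

def generate(lexicon, strategies, min_stem=3):
    """Map each word onto every applicable strategy and collect the
    contrasting new words not already in the lexicon."""
    new_words = set()
    for w in lexicon:
        for old, new in strategies:
            if w.endswith(old) and len(w) - len(old) >= min_stem:
                candidate = w[: len(w) - len(old)] + new
                if candidate not in lexicon:
                    new_words.add(candidate)
    return new_words

lexicon = {"walk", "walked", "walking", "talk", "talked"}
strategies = learn_strategies(lexicon)
print(generate(lexicon, strategies))  # → {'talking'}
```

Here the pair walked/walking yields the strategy ed ↔ ing, which, applied to talked, predicts the unattested talking, mirroring how WWM extends a lexicon by contrast rather than by concatenating morphemes.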

Autolexical Grammar

Autolexical Grammar is a variety of non-transformational generative grammar in which fully autonomous systems of rules characterize various dimensions of linguistic structure.

The framework is non-derivational, in that each component, usually called a dimension, is static: no processes apply to create new forms from underlying forms. Language is viewed as the intersection of a set of independent representations. There are no rewrite rules; instead, each component consists of a structural description of the information relevant to that dimension.

Read this introductory text by Eric Schiller.

Meaning-Text Linguistics

The Meaning-Text Theory (MTT), first put forward in Moscow by Zholkovskij & Mel'chuk (1965), operates on the principle that language consists in a mapping from the content or meaning (semantics) of an utterance to its form or text (phonetics). (Read more ...)