2023, Invited talk. First International Conference on Language and the Brain (Biolinguistics panel)
The concept of ‘computation’ is, to put it mildly, multifaceted. ‘Computation’ in syntax, cognitive science, and computer science often receives drastically different definitions, sometimes at direct odds with one another. Are we defining closed input-output mappings over the naturals? Are we integrating information from multiple sources interactively? What are the basic ingredients in a definition of ‘computation’ such that we can say that a digital computer and a human are both doing it? In this talk we will examine the relation between what ‘computation’ looks like in the theory of syntax, in aspects of neurocognition, and in computer science, and try to establish to what extent these approaches deal with the same kind of process. Asking these questions is important in order to bridge the gap between syntactic theory (which is concerned with providing empirically adequate structural descriptions for natural language sentences) and cognitive neuroscience (which is concerned with the neurocognitive underpinnings of language production and processing). Building on the distinction between emulation and simulation, of long pedigree in computer science and AI research, we will focus on the basic properties of syntactic computation, analyse what we should require of a descriptively adequate grammar, and ask whether a correspondence with neurocognitive processes is not only possible, but even desirable.
The focus of this talk is the nature of cognitive computation, and the relation between computation, linguistic theory, and dynamical systems. We will review traditional notions of computation and analyse their applicability to natural language, distinguishing it from formal languages as usually studied in Computer Science. The main theoretical result of the thesis is that imposing a single computational template for the assignment of structural descriptions to natural language sentences is not only empirically inadequate, but also theoretically more costly than assuming a strongly cyclic approach in which computational dependencies vary, oscillating up and down the Chomsky Hierarchy of formal grammars. The idea that the grammar assigns substrings the simplest possible structural description that captures semantic dependencies between syntactic objects will be referred to as mixed computation. The analysis of theories of computation will reveal that the theory of computable functions must not be identified with the theory of effective computation, and we will argue for the necessity to introduce aspects of interaction in the study of physically realized computational procedures, which configure dynamical systems of a very specific kind: those defined by the irreconcilable tension between opposing requirements.
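The idea of dependencies "oscillating up and down the Chomsky Hierarchy" can be made concrete with toy recognizers at three levels. The example languages below are standard textbook ones of my own choosing, not drawn from the abstract; they simply show how each step up the hierarchy licenses a dependency the level below cannot capture.

```python
import re

def regular_ok(s: str) -> bool:
    # (ab)* : a strictly local dependency, recognizable by a finite-state machine
    return re.fullmatch(r"(ab)*", s) is not None

def context_free_ok(s: str) -> bool:
    # a^n b^n : a nested (center-embedding-like) dependency, beyond finite-state
    n = len(s) // 2
    return s == "a" * n + "b" * n

def context_sensitive_ok(s: str) -> bool:
    # a^n b^n c^n : a cross-serial-like dependency, beyond context-free
    n = len(s) // 3
    return s == "a" * n + "b" * n + "c" * n
```

A "mixed computation" view would assign each substring the weakest recognizer that suffices, rather than forcing every dependency through a single template.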
Journal of Cognition and Neuroethics, 2018
This paper will attempt to debunk the idea that human language grammar, as part of the Faculty of Language (FoL), is intrinsically a computing device. The central argument here is that grammar does not compute. One way of demonstrating this is to show that the operations of grammar in the Generative model do not have the character typical of computations. Thus, the central operation of grammar, Merge, which combines lexical items to produce larger expressions, can be defined as a recursive function, but it does not share the inductive properties of recursive functions in mathematics, given that recursive functions define computability. On the other hand, if the language faculty is a computing system, the language faculty must inherit the halting problem as well. It is easy to impose the halting problem on the selection of lexical items from the lexicon in such a manner that FoL may or may not terminate over the selection of lexical items. We can say: there is no FoL way of telling whether FoL will ever terminate on x when x is a selection from the lexicon. The halting problem for FoL is disastrous for the view that grammar is a computing system of the brain/mind, since it detracts from the deterministic character of FoL. This has significant repercussions not just for grammar, which cannot be restricted to any limited view of mental computation, but also for the nature of the cognitive system as a whole, since any cognitive domain that is (supposed to be) language-like cannot be said to compute either.
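The contrast the abstract draws can be sketched directly. Merge, in its standard Generative definition, is binary set formation, Merge(α, β) = {α, β}; a primitive-recursive function like addition, by contrast, is defined by induction on the naturals with a base case and a step case. The phrase used below is my own example, not the paper's.

```python
def merge(a, b):
    # Merge(α, β) = {α, β}: order-free set formation, not a computation over numbers
    return frozenset([a, b])

# Build {the, {young, linguist}} bottom-up:
np_inner = merge("young", "linguist")
dp = merge("the", np_inner)

def add(m: int, n: int) -> int:
    # Primitive recursion on n: add(m, 0) = m; add(m, n+1) = add(m, n) + 1
    return m if n == 0 else add(m, n - 1) + 1
```

`merge` is recursively *defined* (its output can feed back in as input), but nothing in it is an inductive computation with a base case and successor step in the way `add` is — which is the distinction the paper presses on.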
2008
There is a tendency in science to proceed from descriptive methods towards an adequate explanatory theory and then move beyond its conclusions. Our purpose is to discover the concepts of computational efficiency in natural language that exclude redundancy, and to investigate how these relate to more general principles. By developing the idea that linguistic structures possess the features of other biological systems this article focuses on the third factor that enters into the growth of language in the individual. It is suggested that the core principles of grammar can be observed in nature itself. The Faculty of Language is an efficient mechanism designed for the continuation of movement in compliance with optimization requirements. To illustrate that, a functional explanation of syntactic Merge is offered in this work, and an attempt is made to identify some criteria that single out this particular computational system as species-specific.
The computational program for theoretical neuroscience calls for a study of biological information processing on several distinct levels of abstraction. At each of these levels, computational (defining the problems and considering possible solutions), algorithmic (specifying the sequence of operations leading to a solution), and implementational, significant progress has been made in the understanding of cognition. In the past three decades, computational principles have been discovered that are common to a wide range of functions in perception (vision, hearing, olfaction) and action (motor control). More recently, these principles have been applied to the analysis of cognitive tasks that require dealing with structured information, such as visual scene understanding and analogical reasoning. Insofar as language relies on cognition-general principles and mechanisms, it should be possible to capitalize on the recent advances in the computational study of cognition by extending its methods to linguistics.
A number of language processing studies indicate that violations of syntactic constraints are processed differently from violations of semantic constraints (brain imaging: e.g., Ainsworth-Darnell et al., 1998; Ni et al., in press; speeded grammaticality judgment: McElree & Griffith, 1995; eye-tracking: Ni et al., 1998). Although these results are often taken as support for the view that the processor employs two separate modules for enforcing the two classes of constraints, we find (in keeping with Rohde & Plaut, 1999, and Tabor & Tanenhaus, 1999) that a nonmodular connectionist network can learn a quantitative distinction between the two types of constraints. But prior connectionist studies have been inexplicit about why the distinction arises. We argue that it stems from the distinct distributional correlates of the different types of information: syntax involves gross distinctions; semantics involves subtle ones. We also describe the Bramble Net, an attractor network which derives grammatical categories and models an approximation of the syntax/semantics distinction in qualitative terms. These results support Elman's (1990) suggestion that grammatical structures may arise by self-organization, rather than by hardwiring. They also help clarify what the grammatical structures are in a self-organizing connectionist network, and emphasize the usefulness of dynamical systems theory in grammatical explanation.
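The attractor dynamics the abstract appeals to can be illustrated with a minimal Hopfield-style network: stored patterns become point attractors, so a noisy input settles into the nearest stored state. This is a generic sketch of attractor behaviour, not the Bramble Net itself; the patterns and sizes are my own toy choices.

```python
def sign(x: int) -> int:
    return 1 if x >= 0 else -1

def train(patterns):
    # Hebbian outer-product rule with zero diagonal
    n = len(patterns[0])
    w = [[0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    # Synchronous updates; a few steps suffice for these small patterns
    n = len(state)
    for _ in range(steps):
        state = [sign(sum(w[i][j] * state[j] for j in range(n))) for i in range(n)]
    return state

p1 = [1, 1, 1, 1, -1, -1, -1, -1]   # two orthogonal stored patterns
p2 = [1, -1, 1, -1, 1, -1, 1, -1]
w = train([p1, p2])
noisy = [-1] + p1[1:]               # p1 with its first unit flipped
recovered = recall(w, noisy)        # settles back into the p1 attractor
```

The point of the illustration: categories here are basins of attraction in a dynamical system, not symbols hardwired in advance.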
In this paper, a multi-agent computational model is used to simulate the emergence of a compositional language from a holistic signaling system through iterative interactions among heterogeneous agents. Syntax, in the form of simple word order, coevolves with the emergence of the lexicon through self-organization in individuals. We simulate an indirect meaning transference, in which the listener's comprehension is based on the interaction of linguistic and nonlinguistic information, together with feedback, without direct meaning checking. Homonyms and synonyms emerge inevitably during rule acquisition. Homonym avoidance is assumed to be a necessary mechanism for developing an effective communication system.
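The self-organizing dynamics described here are in the family of naming-game models, and the core loop is easy to sketch. The setup below is my own minimal toy (one object, invented word forms, synonym pruning on success), not the paper's model; it only shows how repeated pairwise interactions with feedback can align agents' lexicons without any direct meaning checking.

```python
import random

def naming_game(n_agents=10, n_rounds=2000, seed=0):
    """Agents repeatedly pair up to name a single shared object.

    On failure the hearer adopts the speaker's word; on success both
    prune competing synonyms. Returns final lexicons and success count.
    """
    rng = random.Random(seed)
    lexicons = [set() for _ in range(n_agents)]  # each agent's word set
    successes = 0
    for _ in range(n_rounds):
        s, h = rng.sample(range(n_agents), 2)    # speaker, hearer
        if not lexicons[s]:
            lexicons[s].add(f"w{rng.randrange(100)}")  # invent a word
        word = rng.choice(sorted(lexicons[s]))
        if word in lexicons[h]:
            lexicons[s] = {word}                 # success: prune synonyms
            lexicons[h] = {word}
            successes += 1
        else:
            lexicons[h].add(word)                # failure: hearer learns it
    return lexicons, successes

lexicons, successes = naming_game()
```

Synonyms arise inevitably (several agents invent words independently) and are eliminated only through the success-driven pruning step, mirroring the paper's point that avoidance mechanisms are needed for an effective communication system.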
Topics in Computer Mathematics, 2003
Chomsky's theory of syntax came after criticism of probabilistic associative models of word order in sentences. Immediate constituent structures are plausible, but their description by generative grammars has met with difficulties. The type 2 (context-free) grammars account for constituent structure, but already exceed the mathematical capacity required by language, because they generate unnatural mathematical sets: a consequence of being based on recursive function theory. Abstract associative models investigated by formal language theoreticians (Schützenberger, McNaughton, Papert, Brzozowski, Simon) are known as locally testable models. A combination of locally testable and constituent structure models is proposed under the name of Associative Language Description (ALD), arguing that it equals type 2 grammars in explanatory adequacy, yet is compatible with brain models. Two versions of ALD are exemplified and discussed: one based on modulation, the other on pattern rules. A sketch of brain organization in terms of cell assemblies and synfire chains concludes the paper. Key words and phrases: context-free grammar, word association, cell assemblies, synfire chains, non-counting property, grammar inference. This work was presented at the Workshop on Interdisciplinary approaches to a new understanding of cognition and consciousness, Villa Vigoni, Menaggio 1997. We acknowledge the support of Forschung für anwendungsorientierte Wissenverarbeitung, Ulm and of CNR-CESTIA.
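The contrast between locally testable and context-free recognition is concrete enough to sketch. A strictly 2-local (associative) check only scans adjacent pairs of symbols, while a context-free dependency like balanced brackets requires unbounded counting of nesting. The example languages below are standard illustrations of mine, not ALD itself.

```python
def strictly_2_local(s: str, allowed=frozenset({"#a", "ab", "ba", "b#"})) -> bool:
    # Locally testable: accept iff every adjacent pair (with # as boundary)
    # is a licensed association — no memory beyond a fixed window.
    padded = "#" + s + "#"
    return all(padded[i:i + 2] in allowed for i in range(len(padded) - 1))

def dyck_ok(s: str) -> bool:
    # Context-free: balanced brackets need an unbounded counter (nesting depth),
    # which no fixed-window associative check can track.
    depth = 0
    for c in s:
        depth += 1 if c == "(" else -1
        if depth < 0:
            return False
    return depth == 0
```

ALD's claim, on this picture, is that constituent structure can be layered on top of checks of the first kind, avoiding the "unnatural" counting power of the second.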
This book investigates the connection between language, mind and computation in theoretical linguistics in particular and cognitive science in general. The relationship between grammar, mind and computation which buttresses much of mainstream linguistic theory is rarely questioned but forms the basis of many theoretical developments and empirical advances. Language, Mind and Computation challenges and critiques the basis of this relationship, attempting to demonstrate that natural language grammars cannot be both mental and computational if the nature of interpretation is unaccounted for. This ambitious book will be of interest to theoretical linguists, philosophers of language, psycholinguists and even computer scientists. Reviews: 'This book sheds a new light on the relationship between language, mind and computation as conceived of in current linguistic theory.' Marcelo Dascal, Tel Aviv University, Israel
Advances in multimedia and interactive technologies book series, 2018
Computationalism should not be the view that (human) cognition is computation; it should be the view that cognition (simpliciter) is computable. It follows that computationalism can be true even if (human) cognition is not the result of computations in the brain. If semiotic systems are systems that interpret signs, then both humans and computers are semiotic systems. Finally, minds can be considered as virtual machines implemented in certain semiotic systems, primarily the brain, but also AI computers.
Theoretical Linguistics, 2000
Let us assume for the sake of argument that Martin Stokhof and Michiel van Lambalgen (S&vL) have correctly identified methodological flaws that obstruct progress in modern linguistics, and let us therefore comment on one of the alternatives S&vL point out as potentially promising. Out of curiosity we choose "Neural Syntax" (NS) (Fitz 2009), which instantiates one of the "approaches in which neuronal models of language acquisition and language use are studied." It is important for us to stress right at the outset that we cannot do justice to the intricacies of NS. In fact, we will be rather selective, focusing on issues that seem to us to cast doubt on the NS approach.