
1988 | Book

Mind, Language, Machine

Artificial Intelligence in the Poststructuralist Age


Table of Contents

Frontmatter
1. Mind, Language, Machine

Today there is no more exciting field of inquiry than that variegated and largely disunified one concerned with investigating human language. In so far as it can be defined as a discipline, it draws relentlessly on the resources of many other disciplines with which it shares interests. Recently the interests of language theorists and researchers have overlapped fruitfully with those of various kinds of computer scientists and psychologists, whose interests in turn appear to be melding in the development of the hybrid discipline of a general cognitive science. As Philip N. Johnson-Laird observes,

During the past decade, there has grown up a confederation of disciplines — linguistics, philosophy of mind, cognitive anthropology, neuroscience, artificial intelligence, and psychology itself — whose practitioners have realized that they are converging on the same set of problems and even independently invoking the same explanatory ideas. The so-called cognitive sciences have sometimes seemed like six subjects in search of an interdisciplinary synthesis; if that synthesis does not yet exist, it is certainly necessary to invent it.1

Michael L. Johnson
2. System, Text, Difference

Mind, language, machine: three systems without positive substance that, copulating interactively, create the universe as it is known and manipulated by man. Each seems to have a ‘physical’ reality, and yet each is also — and more importantly — an ethereal network of relations: a mind, elaborated through its eerie space like a language, is not simply a brain (a neural structure, ‘wetware’); a language is not simply air in motion or ink on paper; a machine is not simply silicon-based circuitry (hardware). Each, like the signs that comprise language, has a particle/wave nature, has something of Jacques Derrida’s trace (‘trace’) about it: each ‘can be focussed either materially or conceptually,… both is and is not matter, and carries within itself a kind of necessary exteriority’.1 Each in its insubstantiality consists of software: differential relations, diacritical nodes. No stuff at all finally: just functionalist patterns of interrelation, maps without territory, wispy microtexts interwoven. Each is — or is coming to be understood as — the same kind of system. And yet there seems to be a divorce among them, an otherness of each to the others.

Michael L. Johnson
3. Relations, Origin, Silicon

The consideration of language as a system of relations gives rise to a vertiginous question, one that feels like a petit-mal seizure in the mind of God: what if everything is a network of differences, nothing… centred or substantial at all, concrete and particular only by the delusions of desire? The question prompts a terror, an intoxication with the bottomless and indeterminate.

Michael L. Johnson
4. Writing, Urtext, Bullae

But to gain perspective on such questions, let me begin again, however questionably, with beginnings.

Michael L. Johnson
5. Invisibility, Children, Rules

If phylogenetic theories of the origin of language are quicksand and the history of writing is just now being pieced together, what about ontogenetic theories? What can they tell about the relations between mind and language?

Michael L. Johnson
6. Chimpanzees, Gorillas, Fictionalization

If research on the child’s acquisition of language is blocked by an invisibility, what about research, still quite controversial, on the acquisition of human language by chimpanzees and gorillas? There is an invisibility there too — but partly of a different kind.

Michael L. Johnson
7. Alternity, Acquisition, Invariance

I wonder if that aphasic moment when a word evades me is not suggestive of how a nonverbal animal engages the universe. But then when the word ‘comes to mind’, it has already reconstructed and particularized consciousness. It has the power to revise reality or even to generate alternate realities: fantasies, myths, lies. With the word forgotten or never learned, the universe must be taken nakedly, without the shaping gloss. For man the aphasic moment is a moment of irritation and dislocation, perhaps even panic or mental evacuation, or perhaps — if experienced religiously — of Buddhistic enlightenment.

Michael L. Johnson
8. Transformation, Discontinuity, Metalanguage

Life on Earth began some three or four billion years ago with a protein text embedded in the progenote, the ancestor of all cells. Its first major transformation was an act of information: the emergence of an organism that could perpetuate itself and evolve through the communication of its form and formal variations from generation to generation — biogenesis. The second major transformation, some 300-odd million years ago, occurred when ‘there emerged an organism that for the first time in the history of the world had more information in its brains than in its genes’.1 That transformation involved a radical morphogenesis, a revolution in the differentiation of the nervous system (itself a language) that separated it from the alimentary system and centralized its power. The third major transformation was the development, within the last million years or so, of the human brain, a structure that now has created more information exterior to itself, in human culture, than it could ever contain within its own boundaries. This last transformation involved a morphogenesis that, when its various aspects are taken together, was not only radical but singular.

Michael L. Johnson
9. Synergy, Neuroanatomy, Hemispheres

If there was some such dialectic of cultural and biological factors in the evolution of human language (one that is still ontogenetically recapitulated), the latter factors involved a synergy accounting for Bronowski’s ‘biological frame’ for language. It is certain that human language was made possible biologically not by any single genetic mutation but by a complex of mutations phenotypically realized and selected for over a (relatively) brief period. Together comprising a threshold event, they gave rise to the neuroanatomy of language and tuned the grosser structures of acoustic communication. The oral and aural structures, delicate and impressive as they are, are fairly obvious in their workings, and recent speech-synthesis programs demonstrate that the electronic production of speech, complete with a wide range of nuances, will soon be possible (though speech-recognition programs remain problematic — this situation contrasts with that of simulation programs that deal with semantically conditioned syntax, for there recognition presents fewer problems than production). The engaging subject is thus the neuroanatomy.

Michael L. Johnson
10. Interconnectivity, Self-Consciousness, Excess

A richly innervated binocular vision, erect posture (which may have encouraged the lengthening of the pharynx and the posterior location of the tongue and whose verticality suggests connections between language and religious sensibility), the invention of the master tool, a sense of the future, and so on — these and other factors contributed to a discontinuity in evolution that set man the language user on the path of his development. But what happened in the brain/mind itself? What was the threshold event?

Michael L. Johnson
11. Neocortex, Manifold, Catastrophe

A complicated terminology is involved in adapting Thom’s ideas to the graphic mapping of the animal—man discontinuity, and it must be introduced carefully so that no important assumptions are ignored.

Michael L. Johnson
12. AI, Computers, Density

After the neocortex reached a certain size, after its interconnectivity reached a certain density, the rapid development of language-dedicated areas was achievable — in an organism of small enough somatic dimensions to free a large portion of neocortical tissue for linguistic activities. Through time those activities — and their cultural systems — were selected for and enriched. The catastrophic discontinuity in the evolution of intelligence made man possible.

Michael L. Johnson
13. Program, Game, Assembler

In the wake of these concerns looms the question that most intrigues: how are mind and machine, whose evolutions have been perhaps too easily analogized, alike and different? Language, the third sector of the circle of metaphor, is obviously the key to properly exploring that question. Language is software, the ‘languescape’ of both man and computer; microelectronic circuitry and neural tissues are, after all, just two kinds of hardware. J. Z. Young is helpful in this connection:

Information is carried by physical entities, such as books or sound waves or brains, but it is not itself material. Information in a living system is a feature of the order and arrangement of its parts, which arrangement provides the signs that constitute a ‘code’ or ‘language.’… The organization of the brain can be considered as the written script of the programs of our lives. So the important feature of brains is not the material that they are made of but the information that they carry.

Michael L. Johnson
14. Turing, More, Analogies

Alan M. Turing several decades ago proved that ‘all computers (save a few special-purpose types…) are equivalent to one another, i.e., are all universal’.1 That is, all computers, regardless of the stuff of which they are made (they exist, in a functional sense, apart from it), are also abstract machines of a certain kind: they are universal Turing machines. Or, to put it another way, a computer language is, in effect, a set of instructions for building such a machine. (The Turing machine is a theoretical device formulated by Turing in the 1930s as a way of discussing algorithms or procedures for solving problems, a preoccupation that grew from logicians’ interest in a methodology whereby ‘the proofs of mathematical theorems could be generated automatically by a mechanical process’. The device consists of three parts: an infinitely long paper tape that is divided into an infinite number of squares, a mechanism that moves the tape and prints or erases marks [symbols from a finite alphabet] on individual squares, and a scanning head that senses whether or not a given square contains a mark. It ‘can be programmed to find the solution to a problem by executing a finite number of scanning and printing actions’, and ‘in spite of its simplicity it is not exceeded in problem-solving ability by any other known computing device.’)2
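The three-part device just described — unbounded tape, read/write head, finite control table — can be sketched in a few lines of Python. The simulator and the sample rule table below are illustrative inventions, not Turing's notation:

```python
# A minimal Turing machine simulator: an unbounded tape (stored sparsely),
# a head that reads and writes single symbols, and a finite rule table.

BLANK = " "

def run_turing_machine(rules, tape, state="start", steps=1000):
    """Execute `rules` {(state, symbol): (write, move, next_state)}
    on `tape` (a string) until no rule applies or `steps` runs out."""
    cells = dict(enumerate(tape))   # only the squares ever touched are stored
    head = 0
    for _ in range(steps):
        symbol = cells.get(head, BLANK)
        if (state, symbol) not in rules:
            break                    # no applicable rule: the machine halts
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, BLANK) for i in range(lo, hi + 1)).strip()

# Example machine: scan rightward, inverting each binary digit; halt on blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}

print(run_turing_machine(invert, "1011"))  # -> 0100
```

The point of the sketch is Turing's: the `rules` dictionary is itself just data, so a second machine could take it as input — which is what makes a universal machine possible.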

Michael L. Johnson
15. Form, Representation, Presence

Young argues that in the evolution of human consciousness ‘the critical stage was the acquisition of the power to make symbolic representations by language of concepts indicating the distinction between self and other’. That this distinction, in man—machine terms, may be confused by metaphors that extend self into other (‘face of a clock’) is fairly obvious, but that it also may be confused less comfortably by metaphors that extend other into self is less recognized though commonly observable. Technological man must be very canny about how he conceives himself, by this second kind of metaphor, as an extension of the environment of his artefacts, mind as an imitation of machine (an inversion that has engaged Derrida); for, as Young notes, ‘as man devises tools to substitute for the functions of his body he also creates new language to describe these very functions…’, a language almost invariably derived, by a perverse McLuhanism, from the terminology of the substitute. So a more specific caveat emerges: ‘We cannot hope to describe in detail how the brain works by looking at the bits of metal and minerals inside a computer.’ (The physical machine is not the simulation: the simulation is the computation of the consequences of a theory expressed as a program that more or less approximates what it models.) Rather, ‘What we have to use is the understanding of the principles….’1

Michael L. Johnson
16. Recognition, Hypothesis, HEARSAY

‘Each nerve fibre is a charged system’, according to Young; frequency-modulated messages propagate along it, by a ‘conduction without decrement’, as electrochemical impulses. ‘Differences of quality, such as different tones or visual contours, are encoded in the nervous system by having a different nerve cell and nerve fibre for each quality’, so that the brain is a ‘multichannel system’, a parallel processor — in that respect at least.1 The impulses animating this differential code are not individually meaningful to the mind; they must be combined, as the constituent actions of perception and cognition, to become so, just as letters must combine to become meaningful as words and words must combine to become meaningful as sentences.

Michael L. Johnson
17. ATN, Automata, Expectation

Of the four knowledge sources necessary to language comprehension as discussed by Rumelhart, the syntactic has been of greatest interest in man-machine comparative studies — largely because, of the four, it is the one most easily subject to productive model-building, though there is much controversy concerning how it is structured and used in human language processing. None the less, a powerful modelling tool has been developed by Rumelhart and others: the augmented transition network (ATN). A consideration of ATN research foregrounds many of the key issues in comparative studies, with regard not only to syntactic knowledge but, to a lesser extent, to the other kinds as well.
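The unaugmented core of the ATN — a recursive transition network, without the registers and condition-action arcs that supply the 'augmentation' — can be sketched briefly. The lexicon, the two networks, and all names below are illustrative assumptions, not drawn from Rumelhart:

```python
# A toy recursive transition network: "cat" arcs consume a word of the
# given category, "push" arcs recursively traverse a subnetwork, and
# "pop" arcs accept. Each network maps a state to its outgoing arcs.

LEXICON = {"the": "det", "dog": "noun", "bone": "noun",
           "big": "adj", "buried": "verb"}

NETWORKS = {
    "S":  {0: [("push", "NP", 1)], 1: [("cat", "verb", 2)],
           2: [("push", "NP", 3)], 3: [("pop",)]},
    "NP": {0: [("cat", "det", 1)],
           1: [("cat", "adj", 1), ("cat", "noun", 2)],  # adj arc loops
           2: [("pop",)]},
}

def traverse(net, words, i):
    """Return the word index reached after traversing `net` from
    position i, or None if no path through the network succeeds."""
    def step(state, i):
        for arc in NETWORKS[net][state]:
            if arc[0] == "pop":
                return i
            if arc[0] == "cat":
                _, cat, nxt = arc
                if i < len(words) and LEXICON.get(words[i]) == cat:
                    reached = step(nxt, i + 1)
                    if reached is not None:
                        return reached
            else:                                  # "push" arc
                _, sub, nxt = arc
                reached = traverse(sub, words, i)
                if reached is not None:
                    after = step(nxt, reached)
                    if after is not None:
                        return after
        return None
    return step(0, i)

def parses(sentence):
    words = sentence.split()
    return traverse("S", words, 0) == len(words)

print(parses("the big dog buried the bone"))  # True
print(parses("buried the dog"))               # False
```

The looping adjective arc in NP shows, in miniature, why such networks suit natural language: recursion and iteration come for free from the topology, without enumerating every sentence pattern.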

Michael L. Johnson
18. Semantics, Components, Rapprochement

What are the principal considerations in developing a model of semantic processing? According to Manfred Bierwisch, ‘The semantic analysis of a given language must explain how the sentences of this language are understood, interpreted, and related to states, processes, and objects in the universe.’ A problematic order, to be sure. None the less, if ‘analysis’ is changed to model, then this imperative is exactly the one to which the modeller must respond. The fundamental question of semantic analysis or modelling is, for Bierwisch, ‘What is the meaning of a sentence S of the language L?’ In attempting to answer that question, one necessarily deals with another: how is that sentence meaningful in that language? Of course, context is critical because one must interpret sentence S in terms of ‘its semantic relations to other expressions’ in that language. Furthermore, one must know ‘not only the meaning of its lexical elements, but also how they interrelate’ — which interrelation, of course, ‘depends on the syntactic structure of the sentence’. Thus a semantic theory — or model — must ‘make reference to the syntactic structure in a precise way’; ‘systematically represent the meaning of… lexical elements’; ‘show how the structure of the meanings of words and the syntactic relations interact, in order to constitute the interpretation of sentences’; and ‘indicate how these interpretations are related to things spoken [or written] about’.1

Michael L. Johnson
19. Context, Reference, Schema

All language involves context; its meaning is contextually constrained. There is always an interplay of text and context. Indeed, human consciousness is inherently responsive to context. For example, when the eye sweeps its environment, the peripheral visual system is continually sampling context while the central ocular attention shifts; the perceptual data of the former are buffered and used to update, in terms of a changing assessment of context, those of the latter. Likewise, in the use of verbal language, there is a continual retracing of the hermeneutic circle of sign and context, an attempt to ‘frame’ properly the associative scenario of the sign (the play of A. J. Greimas’s ‘semes’), to equilibrize the tension between its general (lexemic) and particular (sememic) meanings. Context, for human language, is very thick, and its specification seems constantly to invite an infinite regress, an endless recursive looping of the hermeneutic circle; but, just as a dog stops chasing its tail by an ineffable impulse, so that regress or recursion is somehow truncated. (The determination of that stopping point is a profound problem in the simulation of natural-language comprehension. It is interlinked with the problem of adequately specifying rhetorical ‘decorum’ in language use and the larger problem of formalizing linguistic performance as well as competence.) With these thoughts in mind, I would like to turn to a consideration of the problem of context in the machine modelling of semantic processing.

Michael L. Johnson
20. SHRDLU, Procedures, Mini-World

Rumelhart’s observation that ‘Linguistic inputs are designed to fit into a general framework and are dependent upon that framework to make sense’1 parallels Terry Winograd’s more poststructuralist one, about linguistically based knowledge, that ‘There is no self-contained set of “primitives” from which everything else can be defined. Definitions are circular, with the meaning of each concept depending on the other concepts.’2 (Every text has — and itself already is — an intertext.) Like Rumelhart, Winograd has researched extensively the machine modelling of language comprehension, but he has been especially concerned with building contextually conditioned ‘semantic structures’ (structured lists that describe subjects and relationships and summate plausibilities among their components) to explore how meaning inheres in the interrelations of concepts in some ‘general framework’.

Michael L. Johnson
21. Correspondence, MARGIE, Concept

Having discussed Winograd’s system in some detail, I would like to touch on some more general considerations and then turn to other research relevant to issues raised in that discussion.

Michael L. Johnson
22. Flexibility, Networks, Maps

Discussing the advantages of using the ATN in modelling language comprehension, Eric Wanner and Michael Maratsos observe that early models, based on transformational grammar, attempted complete syntactic analysis of input but that more recent ones tend to rely on minimal syntactic information. They eschew both extremes, arguing that the amount of such information required must vary: ‘In sentences where there is little contextual or semantic information, a complete syntactic analysis may be necessary. In others a contextual or semantic resolution may be possible.’1 They rightly insist on flexibility. None the less, knowledge about semantic information is doubtless much more important to achieving it than that about syntactic information (the limitations of language-production programs certainly argue that). This is a complex issue. Robert F. Simmons’s theory of semantic networks offers a fruitful approach to further exploration of it and should serve as well to enlarge understanding of some of the problems with Schank’s system.
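A semantic network of the kind Simmons proposes — concepts as nodes, labelled relations as edges — can be sketched in miniature. The class and the canary example below are illustrative, not Simmons's own formalism:

```python
# A miniature semantic network: facts are (subject, relation, object)
# triples, and inference follows labelled links between concept nodes.
from collections import defaultdict

class SemanticNet:
    def __init__(self):
        self.edges = defaultdict(list)        # node -> [(relation, node)]

    def assert_fact(self, subj, rel, obj):
        self.edges[subj].append((rel, obj))

    def isa_chain(self, node):
        """Follow 'isa' links upward, yielding ever more general concepts."""
        chain = [node]
        while True:
            supers = [o for r, o in self.edges[chain[-1]] if r == "isa"]
            if not supers:
                return chain
            chain.append(supers[0])

net = SemanticNet()
net.assert_fact("canary", "isa", "bird")
net.assert_fact("bird", "isa", "animal")
net.assert_fact("canary", "colour", "yellow")
print(net.isa_chain("canary"))  # ['canary', 'bird', 'animal']
```

Even this toy shows the attraction of the approach: properties can be stored once at the most general node and inherited down the 'isa' chain, so that meaning inheres in the pattern of relations rather than in any node taken alone.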

Michael L. Johnson
23. MT, Linguistics, Logic

Simmons’s concern with semantic mapping mirrors a general concern in AI theory and research with problems of translation. They are epitomized in attempts at the mechanical translation (MT) of natural languages, a project that has long intrigued the artificial intelligentsia, who were spurred on greatly by the development of Chomskyan grammar because it seemed at first to offer a formalism for generating all the sentences of any given natural language — perhaps from those of any other.

Michael L. Johnson
24. Interlingua, Thesaurus, Deduction

Wilks’s system, which has produced, in his opinion, ‘good French for small English paragraphs’, is still under development, and I do not wish to discuss its operation at much greater length; but some further consideration is required for a more thorough evaluation of his theoretical position.

Michael L. Johnson
25. Translation, Universals, Mystère

In an essay appropriately entitled ‘Understanding as Translation’, Steiner considers the omnipresence of translation in the human use of language. He notices how reading Shakespeare is tantamount to translating him: the historical development of language demands a decipherment between two psychocultural contexts. Likewise, ‘The world of an Austen novel is radically linguistic: all reality is “encoded” in a distinctive idiom’1 that the reader must translate (from ciphertext to cleartext by means of some encicode) in order to understand. But translation — the assumption of an alien vision, the inhalation of an alien voice — is not peculiar to literary experience; indeed, it is co-extensive with the whole human experience of language and essential to it. One who uses language effectively is a translator, an interpreter or, to use Steiner’s polyvalent word, an interprète.

Michael L. Johnson
26. Wiring, Conundrums, Topology

Steiner is not, however, finally unsympathetic to the MLM project, as the unravelling of his further, broader, and perhaps expectable objections shows.

Michael L. Johnson
27. Meaning, Discourse, Speech Act

It is obvious by now that there are differences between philosophical and modellist approaches to issues associated with language comprehension, especially that of meaning. Philosophers of language of various persuasions tend to split hairs, sometimes too abstractly; modellers tend to be practical, sometimes too reductively. Both approaches, however, have enlarged understanding of the issue, and their respective ideologies and insights increasingly show patterns of positive interference — as my discussion of Steiner suggests. A survey of other work representative of the two kinds of approaches can make a case for certain syntheses, however tensional, between them that urge a new paradigm for MLM phenomena.

Michael L. Johnson
28. Wittgenstein, Use, Functionalism

As I discuss meaning, the shade of Wittgenstein seems always at my side, aphoristically prompting or riddling, alternately confirming the MLM metaphor and calling it into question. But he is present also in another way, especially now, because during his career his thought underwent a sea change that summarizes in nuce the difference between the kind of conception of meaning with which the philosopher (especially if of a logicist bent) is typically concerned and the kind with which the modeller is — though, of course, the difference is not strictly applicable in all cases.

Michael L. Johnson
29. Entropy, Readiness, Probability

‘The meaning of a sentence,’ according to Jackson, ‘is the semantic information it conveys’ or a threefold description of ‘(1) whatever causes the sentence to be used; (2) whatever is caused by the use of the sentence; (3) whatever else is described by the sentence.’ For a computer, understanding that meaning ‘may be corresponded to making internal data structures (vectors, lists, graphs, programs, etc.) that model these elements of the meaning of the sentence’ (in referential terms, he says, ‘we may think of the “meaning” of a sentence as being a collection of situations, each of which the sentence possibly denotes’).1
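Jackson's referential formulation — a sentence's meaning as the collection of situations it possibly denotes — can be illustrated in a minimal sketch. The one-pattern grammar and the tiny world model below are wholly my assumptions:

```python
# A minimal illustration of 'understanding' as building an internal
# structure and checking which situations it denotes. The world model
# is a set of relational tuples; the grammar handles one pattern only.

WORLD = {("on", "cat", "mat"), ("on", "book", "table")}

def meaning(sentence):
    """Map 'the X is on the Y' to the internal tuple ('on', X, Y)."""
    words = sentence.lower().split()
    return ("on", words[1], words[5])

def denotes(sentence):
    """True if the situation the sentence describes holds in WORLD."""
    return meaning(sentence) in WORLD

print(denotes("the cat is on the mat"))   # True
print(denotes("the mat is on the cat"))   # False
```

Crude as it is, the sketch makes Jackson's point concrete: the data structure built from the sentence, not the sentence itself, is what the machine compares against its model of situations.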

Michael L. Johnson
30. Percept, Label, Core

For several decades George A. Miller has been concerned with an astounding variety of MLM issues, such as the structure and function of the mental lexicon as a dynamic database, the parallels between syntactic and semantic relations, and the application of formal-language theory to the study of natural language. His efforts, along with those of Philip N. Johnson-Laird, have generated a cornucopia of conclusions and speculations pertinent to the modelling of meaning, especially about the possibility that linguistic knowledge can be described in terms of ‘associations between words and percepts’.1

Michael L. Johnson
31. Isomorphism, Decipherment, Symbol

In a single big book entitled Gödel, Escher, Bach: An Eternal Golden Braid, Hofstadter attempts, with both playful creativity and rigour, to effect a rapprochement between the ‘artificial microworlds’ of the computer and the perceptual—conceptual universe of the brain/mind. His approach is by way of formal-systems theory. It involves a polymathic synthesis of ideas from many disciplines and is preoccupied with meaning.

Michael L. Johnson
32. Formalism, Recursion, Gödel

There are many problems and possibilities relevant to the new paradigm, especially in regard to the relation between formal systems and natural language, that must be investigated. I have already touched on some of them, but, since Hofstadter’s ideas concerning meaning are obviously interlocked with his ideas concerning formal systems and natural language, those latter ideas should be addressed first, to preserve continuity, before I turn to the ideas of others.

Michael L. Johnson
33. Loops, Self-Reference, Substrates

Hofstadter’s interest in formal systems leads him into considering a variety of processes that involve the isomorphic translation of information, many of which, seen as he sees them, have important implications for the MLM metaphor. He observes, for instance, that the ‘isomorphism that mirrors TNT inside the abstract realm of natural numbers can be likened to the quasi-isomorphism that mirrors the real world inside our brains, by means of symbols’, where the symbols themselves ‘play quasi-isomorphic roles to the objects …’. Such a translation occurs also on the intracellular level, in protein production, after DNA dispatches messenger RNA from the nucleus:

Now when a strand of mRNA, after its escape into the cytoplasm, encounters a ribosome, a very intricate and beautiful process called translation takes place…. Imagine the mRNA to be like a long piece of magnetic recording tape, and the ribosome to be like a tape recorder. As the tape passes through the playing head of the recorder, it is ‘read’ and converted into music, or other sounds. Thus magnetic markings are ‘translated’ into notes. Similarly, when a ‘tape’ of mRNA passes through the ‘playing head’ of a ribosome, the ‘notes’ which are produced are amino acids and the ‘pieces of music’ which they make up are proteins. This is what translation is all about….
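Hofstadter's tape-recorder analogy can be rendered quite literally in code: read the mRNA 'tape' one codon (three symbols) at a time and emit the amino-acid 'notes'. The codon assignments below are the standard genetic code, though only five of the sixty-four codons are included in this sketch:

```python
# Codon-by-codon translation of an mRNA string, after Hofstadter's
# tape-and-playing-head analogy. Partial table of the genetic code.

CODON_TABLE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly",
    "AAA": "Lys", "UAA": "STOP",
}

def translate(mrna):
    protein = []
    for i in range(0, len(mrna) - 2, 3):      # the 'head' steps codon by codon
        residue = CODON_TABLE.get(mrna[i:i+3], "???")
        if residue == "STOP":                  # stop codon halts the ribosome
            break
        protein.append(residue)
    return "-".join(protein)

print(translate("AUGUUUGGCUAA"))  # Met-Phe-Gly
```

The isomorphism Hofstadter stresses is visible here: the mapping preserves sequence while changing substrate entirely, which is what licenses calling both this process and the interlingual kind 'translation'.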

Michael L. Johnson
34. Disagreements, Problems, Possibilities

Most AI theorists and researchers would agree with Hofstadter’s arguments or at least be in close sympathy with them, as would information theorists like MacKay. Jackson, for example, contends that ‘There is no known a priori limit to the extensibility of a computer’s language capability other than those limits of a purely practical nature (memory size and processing speed)’ and that, ‘Although the difficulties involved with understanding natural language should not be minimized, no one has been able to show … that English is theoretically outside the language capability of all computers ….’1 However, a number of computer scientists and others, like Steiner, would deny that any formalism ever could simulate natural-language processes fully, even in theory. Weizenbaum’s opinions in that regard are well known and representative.

Michael L. Johnson
35. Memory, Learning, Self-Knowledge

Earl Hunt asks of himself the question ‘What kind of computer is Man?’ and offers a conjectural description (and a word of advice) as an answer:

Man is describable as a dual processor, dual memory system with extensive input-output buffering within each system. The input-output system appears to have substantial peripheral computing power itself. But man is not modeled by a dual processor computer. The two processors of the brain are asymmetric. The semantic memory processor is a serial processor with a list structure memory. The image memory processor may very well be a sophisticated analog processor attached to an associative memory. When we propose models of cognition it would perhaps be advisable if we specified the relation of the model to this system architecture and its associated addressing system and data structure.1

Michael L. Johnson
36. Dependency, Postsemantics, Paradigm

The Chomskyan revolution, which replaced the old paradigm that dictated the taxonomic study of natural language with another that proposed a theoretical methodology (of ‘hidden laws’, in Searle’s phrase) for interpreting the data accumulated by that study, has undoubtedly changed forever man’s view of his language and directly or indirectly occasioned much of the theory and research I have been discussing. That paradigm has stirred controversies that extend, like Chomsky’s speculations, far beyond questions about the details of transformational grammar. But, as has been seen, it has serious inadequacies. They raise important questions about the MLM metaphor and encourage speculation about the post-Chomskyan paradigm, which increasingly will be integrated with AI theory and tested by AI research.

Michael L. Johnson
37. Synthesis, Semiotics, Biology

If the post-Chomskyan paradigm is to be richly enough elaborated to address effectively the range of phenomena implicated by the MLM metaphor, it will require eclectic strategies, the synthesis of theoretical and empirical knowledge from many disciplines. To recall Johnson-Laird’s phrasing, that synthesis ‘does not yet exist’, and its invention is imperative. Theorists and researchers will have to conduct their work co-operatively, with sufficient intelligence and openness to avoid the kind of fruitless oppugnancy that has developed between Chomskyans and behaviourists or High Church Computationalists and Zen Holists. Especially, I would suggest, theorists of language and modellers should give enlarged attention to biological aspects of linguistic and cognitive processes.

Michael L. Johnson
38. Subsumption, Simulation, Information

By now it is obvious that I can hardly pretend, like the maître de philosophie in Molière’s Bourgeois Gentilhomme, to explain everything about the way language (or mind or machine) works — the evidence is strongly to the contrary — but I can at least offer some meditations toward a conclusion.

Michael L. Johnson
39. Ventriloquism, Indifference, Beyond

In the early 1960s Marshall McLuhan made the following argument about AI:

Any process that approaches instant interrelation of a total field tends to raise itself to the level of conscious awareness, so that computers seem to ‘think.’ In fact, they are highly specialized at present, and quite lacking in the full process of interrelation that makes for consciousness. Obviously, they can be made to simulate the process of consciousness, just as our electric global networks now begin to simulate the condition of our central nervous system. But a conscious computer would still be one that was an extension of our consciousness, as a telescope is an extension of our eyes, or as a ventriloquist’s dummy is an extension of the ventriloquist.1

Michael L. Johnson
Backmatter
Metadata
Title
Mind, Language, Machine
Author
Michael L. Johnson
Copyright Year
1988
Publisher
Palgrave Macmillan UK
Electronic ISBN
978-1-349-19404-9
Print ISBN
978-1-349-19406-3
DOI
https://doi.org/10.1007/978-1-349-19404-9
