cognitive critique



Department of Geology

The Field Museum, Chicago, Illinois, USA



abstract, assertion, discrete, equation, fractal, language, mind, phoneme, sentence, symmetry, syntax, truth


Language and mind share a common source in nature with the number system and algebra. An equation represents the property of symmetry expressed in the form of symbols, where the equals represents the axis of symmetry, and the numbers are attached to the symmetrical halves. At the same time, an equation is a simple declarative sentence whose main verb is the equals, and which asserts, I am symmetrical, and It is true that I am symmetrical. Equations with arithmetical operators are generated on the basis of a dynamically developing cascade of symmetrical fractal subunits occurring in successive tiers. Ordinary sentences are generated by modifying the equation’s symmetrical infrastructure into an asymmetrical configuration, thus (1) allowing ordinary sentences to accept verbs other than the equals, and nouns other than numbers, and (2) introducing syntax by introducing meaning to word order, i.e., a=b and b=a are the same because equations are symmetrical, while John keeps bees and Bees keep John are different because ordinary sentences are asymmetrical. Since only human beings possess language, the uniquely human property of mind consists of the sense of truth-and-falsity expressed through declarative sentences. The necessity of discreteness for algebra and language, together with the periodic nature of the phoneme chart and the periodic table of the integers, suggest that language and algebra represent the quantum property of matter manifested at a human scale.



In much the way that physics had accumulated unsustainable contradictions by the beginning of the twentieth century — spontaneous release of energy by pitchblende at room temperature, the passage of X-rays through solid matter, discrepancies between the calculated and observed orbits of the planet Mercury, the failure to detect the luminiferous aether, even the cold light of fireflies — our current understanding of language and mind has accumulated so many contradictions that only a systemic re-structuring can resolve them.

Thus we believe that language evolved by natural selection (Pinker & Bloom 1990; Pinker 1994) and that Chomsky’s “Colorless green ideas sleep furiously” shows that syntax does not carry meaning, i.e., that natural selection favored something that is meaningless.

The persistence of myths, prejudices and other nonsense shows that the human sense of truth-and-falsity has nothing to do with objective truth-and-falsity, while at the same time we believe that the mind evolved by natural selection, something that could have happened only through the selective advantage of objective truth.

We believe in a single origin of language, but cannot reconcile the structure of word-order languages like English and Chinese with that of word-ending languages like Latin and Greek under a single theory.

We believe that there are good reasons to reject the idea of an origin in physics (Pinker 1994, p.363), but the periodic property of the “phoneme” chart (Abler 1989) might cause us to withhold judgment.

We believe that what we say influences what we think, and that what we think influences what we say (Whorf 1956), and that this interaction helps us “understand the very essence of what makes us human” (Boroditsky 2011). All of our philosophy, science, mathematics, religion, law, ethics, politics, superstition and gossip is a matter of making assertions, and showing whether they are true or false. What could be more basic than that? But with no concept of assertion, or of truth-and-falsity, or even of sentence, we are taking for granted and bypassing the system (declarative sentences) that makes what we say possible, i.e., the system that makes us human in the first place.


We believe that language is a “kluge” (Marcus 2008), or structureless hodgepodge of unrelated bits and pieces thrown together by the random accidents of history, and that each language is its own individual hodgepodge. Yet we believe that children can acquire these vast and structureless hodgepodges despite poverty of the stimulus, a poverty that could be relieved only by underlying structure.


In much the way that the accumulated anomalies in physics were traceable to the natural but mistaken assumption that the arrival-time of light is instantaneous, the anomalies in the modern concept of language are traceable to the natural but mistaken assumption that language is an ordinary object in material nature, built out of its component parts in the way that a molecule is built out of atoms, or a beetle is built out of wings and legs. I will call this idea the brick-and-mortar theory of nature; and the apparently different forms which it can assume when applied to language are so numerous as to present an illusion of variety, or even of a revolution in science, when in fact nothing basic has changed.

Brick-and-mortar theories include sentence-building (Clark 1870), sentence parsing, or immediate-constituent analysis (Wells 1947), structuralism (Harris 1951), string analysis (Harris 1962), transformational grammar (Chomsky 1957), the “kluge” hypothesis (Marcus 2008), the “discrete combinatorial system” (Pinker 1994), recursion (Hauser, Chomsky & Fitch 2002), and language evolution (Pinker & Bloom 1990; Pinker 1994).

Clark’s English grammar of 1870 is perhaps the most elegant and successful presentation of the idea that sentences are built by attaching words and phrases to other words and phrases; immediate-constituent analysis proceeds in the opposite direction, by dividing already-existing sentences into their component words and phrases. Structuralism represents the idea that language is composed of “discrete combinatorial elements” (Harris 1951, p.367), and consists largely of methods for isolating and identifying such elements. String analysis (Harris 1962) represents the idea that language structure consists of word sequences.


Pinker’s (1994, p.84-85) “discrete combinatorial system” is a paraphrase of Harris’s (1951, p.367) “discrete combinatorial elements”, or structuralism, the theory that Noam Chomsky disproved during the 1960s. If language were a discrete combinatorial system, then “Dog bites man; Dog man bites; Bites dog man; Bites man dog; Man bites dog; Man dog bites” (to use one of Pinker’s favorite examples) would all be legitimate, and equally legitimate, sentences of language. By generating every possible combination, combinatorics is structureless by definition. Language, then, falls within the limits of combinatorics, but sentence structure is not driven by combinatorics. The “discrete combinatorial system” is Pinker’s (1994, p.434, 447) substitute for the geometry (Abler 1989, p.2) that is the real source of new or emergent properties in nature, and the source of structure in language.

Realizing that language is not a matter of combinatorics, Harris (1951, p.364) observes that “we state what limitations there are on the random distribution … of each element relative to each other element”, thus coming close to suggesting that the difference between combinatorics and the actual sequences of language amounts to the rules of language. Transformational grammar is structuralism in a dynamic form, and consists of the idea that all sentences represent primitive, or kernel sentences which have been modified in some way. Linguistic transformations work by moving structures around, and adding words to indicate what has been moved where. For example, John hit the ball is transformed into The ball was hit by John by interchanging ball and John, and adding was and by to show it. The kluge hypothesis implicitly holds that language consists of its constituent pieces, with no underlying system to hold them together. Recursion is the theory that sentences are built by inserting linguistic material into already-existing linguistic material. The idea of language evolution, with its familiar image of one structure morphing seamlessly into another — fins into legs, swim-bladders into lungs — is also a brick-and-mortar theory. With the exception of linguistic “deep structure” (Chomsky 1957, p.27), all theories of language tacitly hold that human language and mind are ordinary objects in material nature, obtained by modifying some precursor object, or by assembling its visible pieces in the right way. There is no definition of sentence, or even the idea that such a definition might be valuable, thus bypassing and taking for granted the most basic single structure in all of language. “Deep structure” was eventually abandoned. Special pleading aside, the question, then, is whether, in principle, the sentence can be understood under any brick-and-mortar theory, i.e., whether or not the sentence is an ordinary object in material nature.



Equations are simple declarative sentences. Thus A=B asserts that A is equal to B just as much as John keeps bees asserts that John keeps bees.

In addition to being a sentence, an equation is a geometric construction: An equation represents the symmetry relation expressed in the form of symbols, i.e., its two sides can be interchanged to obtain the same equation. Thus, if A=B, then B=A, where the equals symbol represents the axis of symmetry. Since the symbols A and B are in no sense mirror images of one another, however, it is clear that the symmetry relationship is independent of the symbols, and that the symbols are attached to the symmetry relation only after it has formed. The symmetry relation, then, functions as a scaffolding to which the symbols are attached, providing the comprehensive organization that we recognize as an equation. Further, the equivalency of A=B and B=A shows that word-order is not intrinsically meaningful, but that something causes it to become meaningful in the sentences of everyday language. Figure 1 illustrates the difference between the abstract properties of symmetry and asymmetry that form the functional basis of algebra and language.


Figure 1. Symmetrical equation infrastructure, at left, and its corresponding asymmetrical sentence counterpart, at right. The origin, original function, and physiological representation of the symmetrical infrastructure are not known.


In contrast to equations, sentences of ordinary language are asymmetrical, i.e., their two sides can not be interchanged to obtain the same sentence. Thus John keeps bees is not the same as Bees keep John. The consequence of infrastructure asymmetry, then, is that it causes word-order to become meaningful in everyday sentences; and Figure 2 shows the mechanism whereby asymmetry of infrastructure introduces syntax to language by making word-order meaningful in sentences.
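The distinction can be made concrete in a short computational sketch (my illustration, not part of the theory): a relation is symmetric when interchanging its two sides never changes its truth value, and the equals passes this test while an ordinary verb such as keeps fails it. The one-fact keeps world below is hypothetical.

```python
# A toy illustration (mine, not part of the theory): a relation is symmetric
# when interchanging its two sides never changes its truth value.
def symmetric(relation, a, b):
    return relation(a, b) == relation(b, a)

# The equals, the verb of an equation:
equals = lambda x, y: x == y

# A hypothetical one-fact world in which John keeps bees:
keeps = lambda x, y: (x, y) in {("John", "bees")}

print(symmetric(equals, 2, 3))           # True: A=B and B=A make the same claim
print(symmetric(keeps, "John", "bees"))  # False: Bees keep John is a different claim
```

In the same way, a=b and b=a are one equation, while John keeps bees and Bees keep John are two different sentences.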


Figure 2. A derivational genealogy of the equation and the sentence. Shared source, or common ancestor is the symmetry property, shown at upper left. Equations are formed by attaching symbols (numbers and the equals) to the ancestral symmetrical infrastructure, as shown at lower left. Sentences are formed by attaching symbols (nouns other than numbers, and verbs other than equals) to the asymmetrical infrastructure, as shown at lower right.

Languages with word-ending grammar take advantage of the same asymmetrical infrastructure as word-order languages, but use word endings, instead of word order, to indicate where on the infrastructure scaffolding the words are to be attached. Figure 3 compares English, a word-order language, to Latin, a word-ending language.



Figure 3. Comparison between the structure of a word-order language (English, at left), and a word-ending language (Latin, at right). The difference is that, in word-order languages, the intended placement of words on the sentence infrastructure is indicated by word order, while in word-ending languages, intended placement of words on sentence infrastructure is indicated by word endings. Word order, and word endings, acquire meaning only in reference to the asymmetrical sentence infrastructure. Latin sentence: "Cicero keeps bees."

The infrastructure scaffolding of complex equations is generated by repeated fractal duplication of the equation’s original symmetry relation. Figure 4 can be understood as the first three frames in a movie showing the development in progressive stages of the fractal scaffolding that provides structure to equations and sentences.



Figure 4. First three frames in a movie of the equation’s infrastructure as the symmetry relation (at top) is duplicated in repeating fractal tiers. In principle, the fractal infrastructure will continue to grow indefinitely, until it is stopped where a symbol is attached to one of its branches.

The equals symbol is always attached to the axis of symmetry in the first fractal subunit of the developing equation infrastructure. Arithmetical operators such as + or √ are attached to subsequent subunits at the place where the axis of symmetry originally stood. Numbers are attached to the branches of the subunits. Fractal development stops where a number is attached to one of the fractal branches. Figure 5 shows the fractal infrastructure of the equation A=B+C. The missing fractal subunit, blocked by the symbol A, is indicated by dotted lines.
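The attachment mechanism just described can be sketched computationally (my rendering; the Node structure and names are assumptions of the sketch, not the paper's notation): the scaffolding is a cascade of two-branch subunits, an operator stands where each subunit's axis of symmetry stood, and attaching a symbol to a branch stops further development there.

```python
# A sketch (mine) of the cascade described above. Each tier is a two-branch
# subunit; an operator stands where the subunit's axis of symmetry stood, and
# attaching a symbol to a branch stops further fractal development there.
from dataclasses import dataclass
from typing import Union

@dataclass
class Node:
    op: str                    # '=' on the first tier; '+', etc., on later tiers
    left: Union["Node", str]   # a string (symbol) caps the branch
    right: Union["Node", str]  # a Node continues the cascade one tier down

def render(n):
    # Read the attached symbols back off the scaffolding, left to right.
    if isinstance(n, str):
        return n
    return f"{render(n.left)}{n.op}{render(n.right)}"

# A=B+C: the symbol A blocks the left-hand subunit (the dotted lines of
# Figure 5), while the right-hand branch develops one further tier for B+C.
eq = Node("=", "A", Node("+", "B", "C"))
print(render(eq))   # A=B+C
```

The same construction extends tier by tier to longer equations, with development stopping wherever a symbol caps a branch.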



Figure 5. Fractal infrastructure of the equation A=B+C. Fractal development stops where a symbol is attached to one of the fractal branches. The missing fractal subunit at lower left is indicated by dotted lines.

To demonstrate the effectiveness of the fractal system in generating complex equations, Figure 6 shows the fractal development of the equation ax²+bx+c=0. The missing fractal subunits are indicated by narrow lines.


Figure 6. Fractal infrastructure of the equation ax²+bx+c=0. Missing fractal subunits are indicated by narrow lines.

The infrastructure of ordinary sentences is the same as that of equations, except that it is asymmetrical. Figure 7 shows the asymmetrical fractal infrastructure of the direct object (John threw the ball) and the indirect object (John threw the ball to Tom). The missing fractal subunit is indicated by dotted lines.



Figure 7. Fractal infrastructure of the indirect object. The geometry is the same as that of addition (Figure 5), except that the fractal subunit is asymmetrical. Sentence: John threw the ball to Tom.

The mechanism of sequentially cascading tiers of the fractal system elevates the concept of equations and sentences from the descriptive level to the theoretical level.

Everett (2005) correctly understood that the absence of recursion from a normal language spoken by normal people was a wake-up call. Under the fractal theory, “recursion”, the embedding of one linguistic structure into another, emerges as a psychological process that affects language, but is not a fundamentally linguistic process. Recursion is a matter of turning one’s attention briefly away from one task, pursuing another temporarily, then returning to the original task, as shown in Figure 8. In other words, recursion is a non-linguistic psychological process, and its presence or absence in any given language has nothing to do with the theory of language at a basic level.



Figure 8. Infrastructure of recursion, the embedding of one linguistic structure into another. The process is the non-linguistic one of shifting one’s attention from one task to another, and back again. Sentence: John, who raises apricots, also keeps bees.


The purpose of this section is to isolate, identify and define the uniquely human aspect of mind. Equations are true when the numbers attached to their symmetrically-placed branches have a symmetrical relation with one another, 2 and √4, for example; and false when the attached numbers have an asymmetrical relation to one another, 2 and √5, for example (Abler 2010). Thus the equation’s universal assertion, I am symmetrical and It is true that I am symmetrical, is independent of the numbers attached to it, confirming the independence, mentioned earlier, of the equation’s infrastructure from its symbols. Thus the symbols that we see are not the whole equation but are, so-to-speak, the ornaments on the tree. The equation has an independent infrastructure scaffolding that remains symmetrical regardless of the numbers attached to it. Equations accept only numbers as nouns, and only the equals as a main verb because only these symbols intrinsically represent symmetry and asymmetry, i.e., only numbers can cause the assertion I am symmetrical to be meaningful. Asymmetry of infrastructure introduces syntax to language, and allows ordinary sentences to accept nouns other than numbers and verbs other than equals, but at a cost. Ordinary sentences retain power of assertion and the property of truth-and-falsity, but are no longer intrinsically true or false. Thus speakers can talk about topics other than numbers, but the way is open for rumor, superstition, prejudice, myth, and every kind of nonsense and trouble (Abler 2010).


The single further observation, that only human beings possess language, allows us to isolate and identify the defining component of the human mind: It is the sense of truth-and-falsity expressed through assertions, or declarative sentences. The mutually interdependent nature of symmetry, assertion, truth-and-falsity, and the structure of the sentence-equation, confirms the accuracy of the fractal theory.

The persistence of myths and prejudices shows that perceived truth-and-falsity has nothing to do with objective truth-and-falsity. Thus, as paradoxical as it seems, the human sense of truth-and-falsity, and power of assertion, i.e., the formative components of the human mind, have their source not in natural selection, or even in language, but in the only other remaining system, the geometry of the symmetrical infrastructure itself: It can be seen in Figures 1 and 2, above, that the abstract symmetrical infrastructure of the equation is less differentiated (to borrow a word from the biologists) from the ancestral symmetry relation than is the infrastructure of the everyday sentence.


The symmetrical infrastructure that provides organization to equations, and is the mechanism that generates power-of-assertion, is not expressed in speech, and has formed for reasons that are not currently known. It stands in opposition to any notion that language or algebra, or the human mind, is an ordinary object in material nature, built out of its visible parts. The reader who knows a little history will recognize, in the component of the fractal infrastructure that has symbols attached to it, the “linguistic deep structure” introduced by Noam Chomsky in 1957 (p.27). The failure of the linguists to recognize that equations are sentences, and the eventual abandonment of Chomsky’s only lasting contribution, whose layered tiers might have suggested a fractal origin (Mandelbrot 1977), is among the most lamentable omissions in the history of science (cf. Nuland 2003, p.180).

The reader will also have noticed that the equation-sentence infrastructure has the same bifurcating geometry as hydraulic fractals — trees, river systems, tracheal tubes, blood vessels, and, of course, nerve cells (Strausfeld 1976, p.7), in effect “making the touch” between theory and physiology. While sentences are not necessarily generated inside individual nerve cells, the propagation of information from one nerve cell to another in a fractal pattern is a plausible process, and offers a first search-image for language in the brain. Without a theory-based search image to guide them, neuroscientists might not recognize a bifurcating fractal cascade as the substratum of algebra or language, or a symmetrical system as the mechanism of truth-and-falsity, even if they stumbled across it. Mirror neurons represent one possible mechanism.



The major constituents of language, words and the distinct speech-sounds called phonemes, represented roughly but faithfully by the letters of alphabets, are necessarily particulate, or discrete. Discrete objects are ones that are separate and distinct from their neighbors, that remain separate and distinct, retain their individual identities, and can be retrieved intact, even after they have formed combinations with other discrete objects (Abler 1989). The discreteness property of words and phonemes is the property that makes writing possible. It is the discreteness property of words that allows a sentence to be precisely identified and differentiated from other sentences, and that allows a word to be precisely identified and differentiated from other words. Discreteness thus underlies the precise quotes that form the basis of human culture, i.e., a waveform would be too subject to deterioration. Laws are quotes, and would be unreliable if they could not be repeated exactly, that is, with their exact discrete components. Discreteness allows exact quotes such as the Ten Commandments or the Lord’s Prayer to survive intact after millennia. Prayers or hymns, sometimes recited by thousands of people in unison, or by everyone in a small community, can be critical in providing cohesion to human society: They are exact quotes made possible by discreteness. In effect, human culture and human existence are built on the stability provided by the discreteness property of phonemes, words and sentences (Abler 2005, p.106).

The property of discreteness seems so simple and obvious as to be almost innocent, but its requirements and consequences are highly-structured and precisely-detailed. Discreteness is squeezed between blending on one side, and continuity on the other. Thus if two constituents combine by blending or mixing or melting together, like water and ink, the result is an average of properties (Abler 1989). Constituents lose their individual identities, and the result is that properties beyond those of the original constituents are not possible. In chemistry, the limits of the periodic table would define the properties of all matter, because the new or emergent properties that characterize chemical compounds are due, in large part, to the new or emergent geometry (Abler 1989) made possible by the discreteness property of the atom. Thus the genetic molecule is not a pile of atoms, or a soup of atoms, but a zipper, whose ability to gather, store, duplicate and dispense information is embodied in the geometry made possible by discreteness.


In addition to blending, discrete constituents must avoid blending’s geometric cousin, continuity. Under continuity, constituents are so similar that, as a practical matter, natural systems cannot distinguish one from the next. Thus if there were, say, a million chemical elements, one atom would be so similar to the next that molecules could not tell them apart (Abler 1989). The enzymes and crystals that are the basis of our daily life could not have their precise and characteristic shapes; and the specificity that makes enzymes viable would not be possible. There would be no life. As it is, there are only about eighty stable elements, and living things mostly take advantage of only about a dozen or a score of those.

In the same way, if there were a language with, say, a million phonemes, speakers could not tell one phoneme from the next. Precisely identifiable and repeatable words and sentences would not be possible, and there would be no stable human culture. As a practical matter, the upper limit to the number of phonemes in any given language is about a hundred, roughly the same as the number of stable elements. Thus continuity places an upper limit on the number of distinct constituents in a particulate or discrete system.

Continuity can creep into a discrete system on the basis of its discrete components, not only if there are too many different kinds, but also if there are not enough different kinds. Computer code gets along on just two discrete components, and the genetic code on four, but these are mechanical (computer code) or nearly mechanical (genetic code), where identifying a component is a matter of selecting from a small menu. But a language with just two discrete phonemes would produce sentences that look like this, Ba, abab ab bab baba a babab, that is, indistinguishable from a structureless buzz. Such a language would come dangerously close to the structureless sequences of combinatorics. And, in spite of computer code, a number system with just two digits would also come dangerously close to a continuity system, if used for our everyday activities such as adding up a grocery bill. Thus we might have difficulty distinguishing between (base-2) $1,001,100 and $1,011,000, while we distinguish easily enough between (base-10) $76 and $88.
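The grocery-bill example can be checked directly; a two-line sketch (mine, purely illustrative) prints the same pair of totals in base-10 and in base-2:

```python
# A direct check (my illustration) of the grocery-bill example above.
a, b = 76, 88
print(f"base-10: {a} vs {b}")       # 76 vs 88 -- distinct at a glance
print(f"base-2:  {a:b} vs {b:b}")   # 1001100 vs 1011000 -- near-twin strings
```

The base-10 totals differ in both digits; the base-2 totals are seven-character strings differing only in an internal transposition.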


Everybody knows that atoms are numbers, defined by the number of protons in their nuclei. Hydrogen is first, and the rest fall into line behind hydrogen. With their cyclically repeating properties, however, the elements also fall into place on a periodic table. In fact, the mis-named phoneme chart, actually a chart of speech articulations (IPA 1949), is also a periodic table (Abler 1989). That is to say that there is a first speech-articulation, b, which carries the most information because it is seen as well as heard. The cyclically-repeating articulation categories of stop, fricative, spirant, and vowel show that the properties of the speech articulations take their places on a periodic table, confirming their status as numbers. It goes without saying that the multiplication table is a periodic table, with the length of its periods corresponding to numerical base.

Why are the properties of the chemical elements, the speech articulations, and the numbers as orderly as they are? In other words, why aren’t the constituents of discrete systems randomly distributed around their respective property spaces, instead of being located at the junctions between intersecting dimensions? Random distribution would still maintain sufficient property distance to maintain constituent uniqueness, as shown in Figure 9. Why bother with the extra component of orderliness?


Figure 9. Periodic tables (above), and their non-periodic counterparts (below). Left: tables of the integers in base-4. Middle: tables of the biological elements. Right: Tables of the speech articulations.



Since the numbers are the most abstract of the periodic systems, their properties are the most general. For the numbers, then, the question of periodicity can be broken down into two questions. (1) Why are the numbers evenly-spaced along the number-line, instead of being randomly-spaced along it; and (2) Why is there a need for numerical base?

Numbers are evenly spaced along the number line because the magnitudes of randomly-spaced numbers would have to be recorded and physically stored somewhere. They would have to be individually consulted when needed, and the largest recorded number would represent some kind of upper limit to the known numbers. A sum such as 5+3 would not be equal to 6+2, so that the dimensions of objects would not line up, and the geometry of biological molecules would not be standard. Life could not form. On the other hand, numbers generated by rule are standard, have no upper limit, and do not have to be stored in an archive because they can be derived when needed, and they yield uniform dimensional quantities.

Numerical base is, ultimately, a means of avoiding the continuity trap for numbers that must be represented in the form of digits, that is, in the form of matter. Thus while there might be, in principle, a new digit for every integer on the number line, and no upper limit to the number of digits, in practice we soon run out of easily recognizable geometric shapes; and some means must be found for representing large numbers by a limited number of digits. A limited number of digits is another way of specifying numerical base. Thus if we used a numerical base of, say, a million, there would be so many digits that we could not distinguish one from the next; and the highest base ever in practical use was 60, probably chosen deliberately for handling small fractions (Boyer and Merzbach 1991, p.25), and still in use in surveying instruments. We have already considered the minimum base, 2, which is so small as to represent a transitional state. That is, numbers in base-2 can be represented as simple + and −, or not-quite-numbers. If numbers larger than unity are derived from the concept of zero and one as numbers, then the natural status of numbers is not as clear as it once seemed.
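The trade-off can be sketched computationally (my illustration; the helper to_base is a hypothetical name, not standard notation): a base fixes the inventory of digit shapes, and in exchange determines how many digit positions a large number occupies.

```python
# A sketch (mine) of the trade-off described above: small bases need few
# digit shapes but many positions; large bases the reverse.
def to_base(n, base):
    # Digit values of n in the given base, most significant first.
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return digits[::-1] or [0]

n = 3600  # one hour, in seconds -- a natural base-60 quantity
for base in (2, 10, 60):
    d = to_base(n, base)
    print(f"base {base:>2}: {len(d)} positions from an inventory of {base} digit shapes -> {d}")
```

Base 60, the highest base in practical use, compresses 3600 into three positions at the cost of a sixty-shape digit inventory; base 2 needs only two shapes but twelve positions.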



We sometimes tell ourselves that our standard numerical base of 10 is completely arbitrary, chosen, perhaps, to let us use our fingers as a portable adding machine, and that we might as easily have chosen base-8 or base-12. But we see that base-2 is too small, and base-a-million is too large, placing practical limits on the size of a functioning numerical base in nature. The number of biological elements agrees roughly with the minimum number of phonemes in a language, about ten or a dozen, while the total number of stable elements, about 80, agrees roughly with the maximum number of phonemes in a language, about 100 — which represents an outlier. Something near base-10, then, emerges as a naturally-occurring constant, while about 100 is the number of discrete constituents generated in a 10x10 periodic system.


Since the three great periodic systems, the speech-articulations, the atoms and the numbers, are subject to roughly the same limits for roughly the same reasons, I will venture a few guesses about their nature, and the systems of which they represent a part. First, all three systems are manifestations of the same underlying laws. Thus in addition to the atoms, phonemes are also numbers. Since atoms exhibit their orderly properties as a result of quantum constraints, numbers and phonemes exhibit their orderly properties under the same constraints, imposed because they must be manifested in the form of matter. While the speech articulations are ordinal numbers in some obvious way, they are merely addresses for the actual phonemes, which emerge as abstract numbers without magnitude, and without sequence and without dimension. The phonemes thus present themselves as some of the most mysterious objects in all of nature, i.e., it is not clear how the phonemes might be represented in the brain, or how we might recognize them if we stumbled across them there.


In looking at quantum properties, we ordinarily look at matter on a progressively smaller scale. But the orderly nature and absolute simplicity of the equation infrastructure, combined with its dependency upon numbers and discreteness, suggests that it represents the quantum property of matter manifested at a larger scale. Figure 10 shows a genealogy of algebra, mind and language, proceeding upward from the quantum property of matter. Further, the basis of the equation in symmetry shows that algebra is not a matter of manipulating symbols, but of maintaining an absolute symmetry that is only represented by symbols. Since the two sides of a symmetrical system are dependent upon one another, not upon external conditions, equations are self-regulated and independent of their surroundings. Algebra and the language-mind complex, which have their source in the geometry that generates equations, are thus not generated in natural selection, which is dependent upon external conditions. The perceived components of algebra and language are attached to their fractal infrastructure, not to one another, ruling out any brick-and-mortar theory of language and mind, and setting language and mind apart from other systems in material nature. This concept has the status in philosophy that the second law of thermodynamics has in physics, ruling out any theory that violates it. The idea that language is composed of its words is a little like the idea that the ocean is composed of waves, or a Christmas tree is composed of ornaments.


Figure 10. Derivation of syntax (at upper right) and the algebra-mind complex (at top) from the quantum property of matter (at bottom). New properties are indicated along the vertical axis of the diagram. New structures are indicated within boxes.


The absolute nature of the symmetry that characterizes the equation occupies a kind of limbo between natural law and physical object. We must maintain faith that the human mind is a physical system that is not intrinsically mysterious; yet unless the physiological equivalent of a perfect balance-scale can be found in the brain, the abstract nature of the phoneme, together with the symmetrical equation infrastructure, makes a physical understanding of mind more remote than ever.

While biology inevitably had something to do with it, the intimate and exhaustive relationship between language and algebra shows that the first question in the origin of language is not, How did language evolve?, but, How did language find a physical channel capable of expressing it? The new or emergent structure of the algebra-language complex is a matter of geometry, as indicated under the particulate principle (Abler 1989, p. 2). Language emerged when the first asymmetrical sentence scaffolding was formed, and became public with the first spoken sentence. Speech before the asymmetry event was non-syntactic and non-linguistic. The first syntactic sentence has to be seen as the type specimen of language, and the only example of language that is basic. The asymmetry event was internal to the brains of speakers, and would have passed unnoticed by an observer at the time. The symmetrical proto-infrastructure (Figure 1) remained unused, or was used in some other way, in the brains of speakers, until it was later exploited in the equation. The first sentence was a word-order sentence, and all subsequent developments must be seen as ramifications whose features are not basic, even if they exist only within language. Thus recursion, inflections or grammatical word-endings, morphemes or meaningful word-fragments, and linguistic gender, number, case, person, tense, mood, voice, finite versus infinitive, participle, gerund, absolute, and dependent versus independent constructions are all consequences of language, but are not basic to it. Linguistic transformations probably represent historical reconstructions in some form. The first language may have looked like Ab bab abab when there were only three or four words.
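The asymmetry event described above, the modification that introduces syntax by making word order meaningful, can be seen in miniature in a Python toy of my own (not from the paper): the equation's equals is order-blind, while an ordinary sentence changes when its halves are swapped.

```python
# The equation is symmetrical: swapping its two sides changes nothing,
# so a=b and b=a are the same assertion.
a, b = 3 + 4, 7
assert (a == b) == (b == a)

# An ordinary sentence is asymmetrical: word order carries meaning,
# so swapping subject and object yields a different sentence.
s1 = ("John", "keeps", "bees")
s2 = ("Bees", "keep", "John")
assert s1 != s2          # not the same sentence
assert s1 != s1[::-1]    # nor is a sentence its own mirror image
print("equality is symmetric; word order is not")
```

The toy only restates the paper's contrast between a=b / b=a and John keeps bees / Bees keep John in executable form; it makes no claim about how the brain implements either.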

Language and mind are aspects of the same thing, like the wave and particle properties of matter. The fundamental mechanism of the human brain will be understood through the study of language, not vice versa. The abstract, discrete stuff of the phoneme/number, and that of the equation/sentence infrastructure, emerge as the profoundest and most urgent of mysteries.



I thank J. Wine, S. Smerin, O. Pergams and L. Progovac for valuable comments. Much of this paper represents a refinement, unification and completion of ideas in Abler 1989, 2005, 2010. The role of asymmetry in introducing syntax, and the fractal mechanism that generates equations and sentences, as well as the comments on the origin of language, are new. Figures 1 and 2 are adapted after Abler 2010, with permission.


Abler WL (1989) On the particulate principle of self-diversifying systems. J Soc Biol Struct 12:1-13

Abler WL (2005) Structure of Matter, Structure of Mind. Pensoft, Sofia; BainBridgeBooks, Philadelphia

Abler WL (2010) The human mind: origin in geometry. Sci Prog 93:403-427

Boroditsky L (2011) How language shapes thought. Sci Am 304(2):63-65

Boyer CB, Merzbach UC (1991) A History of Mathematics. Wiley, New York

Chomsky N (1957) Syntactic Structures. Mouton, The Hague

Clark SW (1870) First Lessons in English Grammar. AS Barnes, New York

Everett D (2005) Cultural constraints on grammar and cognition in Pirahã: another look at the design features of human language. Curr Anthropol 46:621-646

Harris ZS (1951) Methods in Structural Linguistics. U of Chicago Press, Chicago

Harris ZS (1962, 1965) String Analysis of Sentence Structure. Mouton, The Hague

Hauser MD, Chomsky N, Fitch WT (2002) The faculty of language: What is it, who has it, and how did it evolve? Science 298:1569-1579

IPA (1949) The Principles of the International Phonetic Association. International Phonetic Association, Department of Phonetics, University College, London


Mandelbrot BB (1977, 1983) The Fractal Geometry of Nature. Freeman, New York

Marcus G (2008) Kluge. Houghton Mifflin, Boston

Nuland SB (2003) The Doctors’ Plague. WW Norton, New York

Pinker S, Bloom P (1990) Natural language and natural selection. Behav Brain Sci 13:707-784

Pinker S (1994) The Language Instinct. Morrow, New York

Strausfeld NJ (1976) Atlas of an Insect Brain. Springer, Berlin

Wells RS (1947) Immediate constituents. Language 23:81-117. In Joos M (ed. 1963) Readings in Linguistics. American Council of Learned Societies, New York

Whorf BL (1956) Language, Thought, and Reality. MIT Press, Cambridge

Online ISSN: 1946-7060
Cognitive Critique is published by the Center for Cognitive Sciences at the University of Minnesota.
©2016 Regents of the University of Minnesota. All rights reserved.