The goal of neural coding research is to understand how the brain uses adaptive neural signals to represent and transmit information. This review surveys recent evidence concerning the nature of the representations implemented by neural circuits. We contrast rate coding with different forms of temporal codes, arguing that at the level of a single neuron this dichotomy reduces to the problem of demonstrating the optimal integration window size that could carry the behaviorally relevant information. We also draw on examples from vision and from other systems to illustrate how information may be coded hierarchically along a pathway. Moreover, we stress the importance of higher-order interactions, such as the relative timing of first-spike latencies across ensembles of neurons, which gives the cortex a potentially immense representational capacity. Evidence derived from coupling massive multi-recording techniques with 3D real-time voltage and/or magnetic imaging should yield enough information to reveal a more realistic picture of neural codes and network interactions.
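At the single-neuron level, the rate-versus-temporal-code question the review poses can be illustrated with a toy sketch (the spike trains below are invented for illustration, not data from any study): two stimuli evoke the same overall spike count but different spike timing, so only a sufficiently fine integration window carries the discriminating information.

```python
# Toy illustration: two hypothetical spike trains over ten 10-ms bins.
# Stimulus A fires early, stimulus B fires late, same total count.
train_a = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
train_b = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1]

def counts(train, window):
    """Spike counts read out with an integration window of `window` bins."""
    return [sum(train[i:i + window]) for i in range(0, len(train), window)]

# One 100-ms window (a pure rate code): both stimuli give the same count.
rate_code = (counts(train_a, 10), counts(train_b, 10))

# 20-ms windows (a coarse temporal code): the count patterns now differ.
temporal_code = (counts(train_a, 2), counts(train_b, 2))

print(rate_code)      # identical readouts: rate alone cannot discriminate
print(temporal_code)  # distinct readouts: the finer window carries information
```

The sketch only makes the dichotomy concrete; finding the optimal window for a real neuron requires estimating the information each window size carries about the behaviorally relevant variable.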
This commentary argues that James Gibson's contribution to the field of perception can be best understood as a set of heuristics that directs researchers to describe the stimulus information that makes it possible for animals to function effectively in the environment in which they evolved. He argued that the description of stimulation provided by traditional physics could not account for veridical perception and needed to be replaced by a new, biologically relevant ecological description of the input for perceptual functioning.
Edward Tolman's ideas on cognition and animal behavior have broadly influenced modern cognitive science. His principal contribution, the cognitive map, has provided deep insights into how animals represent information about the world and how these representations inform behavior. Although a variety of modern theories share the same title, they represent ideas far different from what Tolman proposed. We revisit Tolman's original treatment of cognitive maps and the behavioral experiments that exemplify cognitive map-based behavior. We consider cognitive maps from the perspective of probabilistic learning and show how they can be treated as a form of hierarchical Bayesian learning. This probabilistic explanation of cognitive maps provides a novel interpretation of Tolman's theories and, in tandem with Tolman's original experimental observations, offers new possibilities for understanding animal cognition.
This paper presents results of investigations of children's use of referring expressions in spontaneous conversation with adults, and considers possible implications of this work for questions relating to development of a theory of mind. The study further confirms previous findings that children use the full range of referring forms (definite and indefinite articles, demonstrative determiners, and demonstrative and personal pronouns) appropriately by age 3 or earlier. It also provides support for two distinct stages in mind-reading ability. The first, which is implicit, includes the ability to assess non-propositional cognitive states such as familiarity and focus of attention in relation to the intended referent; the second, which is representational and more conscious, includes the ability to assess epistemic states such as knowledge and belief. Distinguishing these two stages supports attempts to reconcile seemingly inconsistent results concerning the age at which children develop a theory of mind. It also makes it possible to explain why children learn to use referring forms in ways consistent with the cognitive statuses they encode before they exhibit the pragmatic ability to consider and calculate quantity implicatures, which require assessment of how much information is relevant for the addressee.
I contend that Alva Noë's Enactive Approach to Perception fails to give an adequate account of the periphery of attention. Noë claims that our peripheral experience is not produced by the brain's representation of peripheral items, but rather by our mastery of sensorimotor skills and contingencies. I offer a two-pronged assault on this account of the periphery of attention. The first challenge comes from Mack and Rock's work on inattentional blindness, and provides robust empirical evidence for the semantic processing (and hence representation) of some wholly unattended stimuli. The second challenge draws on LaBerge's theory of attention to provide a substantial advantage to peripheral representations, saving time whenever we shift the focus of our attention to something which had been in the periphery, allowing us to respond to that thing more quickly than would be possible if Noë's account of perception were correct.
Sensorimotor control is the primary and fundamental function of the cerebral cortex. The cortex performs this function by generating patterns of electrical activity in neuronal networks that encode objects and events in the world and that program effective motor responses. In humans, and most likely higher primates, the engagement of cortical circuits to repeatedly represent stimuli and generate movements leads to the creation of knowledge, the capacity for abstraction, and the emergence of intelligence. One of the most crucial questions confronting cognitive neuroscience is how this occurs. We are far from a full understanding. However, research into neural coding in the posterior parietal cortex of the nonhuman primate is providing important clues, notably that sensorimotor control and cognition are processes embedded within the same neural architecture. That is, the same neurons and circuits within posterior parietal cortex that participate in processing stimuli and programming movements demonstrate the capacity for abstract representation. In these cells, patterns of activity can reflect computational processes that unfold independently of concurrent sensorimotor events, or that encode generalized information akin to a rule, principle, or concept that provides the basis for intelligent action in a given circumstance, but that is not directly related to the characteristics of the particular stimuli or movements involved. These neurons appear to participate in forms of information processing that ultimately must enable thinking as we presently understand it, and their existence within the sensorimotor circuits of the cerebral cortex suggests that the capacity for thought may emerge as a byproduct of sensorimotor experience. This is likely to open a new line of enquiry by which it may be possible to discover the biological principles that govern how cortical circuits acquire intelligence by interacting with the world.
In his 1994 doctoral thesis, Morten Christiansen used connectionist modeling to demonstrate that models could produce recursive language-like output without the use of prescribed recursive rules. The purpose of this paper is to provide a brief overview of Christiansen's arguments and the models he used to support them, as a foundation for understanding their implications regarding the role of recursion in the evolution of language. Based on connectionist principles, Christiansen argued against concepts such as the competence/performance distinction and poverty of stimulus as an objection to learning-based theories of language acquisition. He proposed that recursion, rather than being a prescriptive shaper of language, is an emergent outcome of processing constraints in the human brain.
The ability to read symbols and construct a coherent, meaningful message from those symbols is widely regarded as a complex and uniquely human cognitive skill. The study of how people attain meaning from symbols — i.e., how they comprehend — can provide a window into the inner workings of the human mind and be a testing ground for theories about human intellectual functioning. There are not just theoretical but also practical reasons for studying comprehension. To function adequately in society it is necessary to read and comprehend. In everyday life there are forms to fill out, informational documents to read, and instructions to follow, and education relies heavily on the transmission of knowledge and skills through written materials. The study of how reading comprehension takes place, of factors that prevent its success, and of procedures for diagnosing and remedying problems, has the potential to significantly influence educational practice.
The aim of this paper is to present an overview of the scientific findings on reading comprehension and, in doing so, (1) to convey the nature and complexity of the cognitive processes that take place during comprehension; (2) to illustrate the variety of methods and technologies used in comprehension research; (3) to describe the effects of those approaches on theoretical directions in the field; and (4) to discuss the potential relevance of understanding comprehension processes for educational practice.
The paper is divided into two major sections. In the first section, we review cognitive research on reading comprehension, distinguishing the product of comprehension — what is stored in the human mind once one has read and comprehended a text — from the process of comprehension — the cognitive activities by which the product is constructed. This first section concludes with a description of initial studies examining the neural basis for the described comprehension processes. The second section focuses on the practical implications of the research for educational practice, identifying the promises and limitations of translating cognitive science research into educational practice.
Questions about non-human animal language are intriguing, and finding an animal communication system complex enough to challenge human language’s unique status would be a significant milestone in linguistic and psychological research. In this paper I investigate the claim that the dance system of honey bees, one of the most complex animal communication systems known, displays a complexity and versatility that rivals human language and therefore should be considered ‘language’. I evaluate this claim based on both the features of honey bee dance and previous research into the features that characterize human language, and conclude that bee dance does not have enough of the significant features of human language to rival its unique status among systems of communication. In addition I argue against the claim that the label of ‘language’ is meaningful when describing an animal communication system. The aim of this paper is to inform future research into human and non-human communication systems both by arguing against the traditional ‘language/non-language’ dichotomy and by arguing that human language nevertheless remains unique among animal communication systems based on several significant features.
The study of the neural mechanisms underlying visual attention has been bound up with the study of oculomotor control for almost as long as both have been investigated scientifically. Before the 1970s, most knowledge about the neural mechanisms of visual attention and oculomotor control was derived from lesion studies in animals and from clinical case reports. The phenomenon of contralateral inattention following posterior parietal lobe injury in humans was well documented, but it did not seem possible to approach the study of the neural substrates of visual attention directly, given the technological limitations of the time. However, in the early 1970s, several laboratories began the study of the activity of single neurons in monkeys that were alert and performing learned behavioral tasks, including tasks involving voluntary eye movements and involving shifts of visual attention. A group at Johns Hopkins proposed that some neurons in posterior parietal cortex participated in the initiation and control of visually-guided saccadic eye movements and visual pursuit movements, while another group argued that all neural activity in the posterior parietal lobe was related to visual attention rather than oculomotor control, and that only the frontal eye field, located in prefrontal association cortex, participated in active oculomotor control. From this difference of interpretation grew more than thirty years of active investigation and discussion that has involved many different laboratories as they worked toward further understanding of the neural circuitry underlying visual attention and oculomotor control. Neurophysiological, neuroanatomical, and functional imaging studies have led to the present concept of cortico-cortical networks which incorporate nodes that play a role in both oculomotor and visual attention functions. 
In fact, recent research results suggest that the functions of visual attention and oculomotor control may be inextricably linked with one another - perhaps two sides of the same coin. The present paper is intended to be a brief historical overview of the development of some of the current concepts of the interaction between the neural mechanisms of attention and those of oculomotor control.
When looking at an object while exploring and manipulating it with the hands, the visual and haptic senses provide information about the properties of the object. How these two streams of sensory information are integrated by the brain to form a single percept is still not fully understood. Recent advances in computational neuroscience and brain imaging research have added new insights into the underlying mechanisms and identified possible brain regions involved in visuo-haptic integration. This review examines the following main findings of previous research. First, the notion that the nervous system combines visual and haptic inputs in a fashion that minimizes the variance of the final percept and performs operations comparable to those of a maximum-likelihood integrator. Second, the claim that, similar to vision, haptic information may be mediated by two separate neural pathways devoted to perception and action; this claim is based on a set of psychophysical studies investigating how humans judge the size and orientation of objects. Third, the identification of a cortical system described as the lateral occipital complex (LOC) as a possible locus of visuo-haptic integration; this claim rests on functional imaging studies revealing activation of the LOC in response to both visual and haptic stimulation. We conclude that much progress has been made toward a computational framework that can formalize and explain the results of behavioral and psychophysical studies on visuo-haptic integration. Yet there still exists a gap between the computationally driven studies and the results derived from brain imaging studies. One reason why closing this gap has proven difficult is that visuo-haptic integration processes appear to be highly influenced by task and context.
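The maximum-likelihood integration scheme in the first finding can be sketched as follows, under the standard assumption of independent Gaussian noise on each cue; the numbers are illustrative, not taken from any cited study. The fused estimate is an inverse-variance-weighted average, and its variance is lower than that of either cue alone.

```python
# Sketch of maximum-likelihood cue combination under independent Gaussian
# noise (the integrator scheme described above). All numbers are invented.

def ml_combine(mu_v, var_v, mu_h, var_h):
    """Fuse visual and haptic estimates by inverse-variance weighting."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)  # weight on the visual cue
    w_h = 1 - w_v                                # weight on the haptic cue
    mu = w_v * mu_v + w_h * mu_h                 # fused estimate
    var = (var_v * var_h) / (var_v + var_h)      # fused variance (below both)
    return mu, var

# Example: vision reports 10.0 cm (variance 1.0), touch reports 11.0 cm
# (variance 4.0); the fused percept leans toward the more reliable cue.
mu, var = ml_combine(10.0, 1.0, 11.0, 4.0)
print(mu, var)
```

With these numbers the visual weight is 0.8, so the fused estimate is 10.2 cm with variance 0.8, smaller than either single-cue variance, which is the variance-minimizing behavior the psychophysical studies test for.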
Assigning meaning to the actions of other subjects is an essential aspect of everyday social communication. A major contribution in understanding the brain mechanism supporting recognition of others’ actions was the discovery of ‘mirror neurons’, which are activated both when a monkey grasps an object and when it observes other subjects executing the same grasp. However, our recent findings indicate that the system responsible for understanding the actions of other subjects encompasses the entire brain circuitry that supports action execution, rather than just the part of cortex containing ‘mirror neurons’. These findings contradict the widely held notion that ‘mirror neurons’ alone code the meaning of observed actions performed by other subjects, and instead support the theory that we decode others’ actions by activating our own action system, i.e. by mentally simulating the observed acts.
The elaborate, agile spatial control skills of humans and many animals are not fully accounted for in the cognitive sciences. At the same time, engineering faces challenges in developing methodologies beyond traditional tracking and regulation functions that would lead toward improved adaptation and flexibility. The role of cognitive functions in spatial behavior has traditionally been confined to the planning and memorization required for simple forms of navigation. However, observations of humans and animals performing spatial control tasks under competitive conditions requiring agile and adaptive behavior suggest more elaborate cognitive dimensions. These dimensions support fine-tuned maneuvering and guidance behavior based on a broader understanding of the task and environment. This understanding must encompass not only the topology of geographical space but also the dynamics of movement and a dynamic fit between behavior, environment, and task. Here I highlight this gap and bring attention to these additional cognitive dimensions. In particular, I suggest that agile behavior must be based on, and operate within, some structure that is inherent to the dynamic interaction between the agent and its environment. I argue that these structures are central to cognition. They provide a physically consistent link between the high-dimensional, nonlinear physical nature of movement and the discrete, lower-dimensional space typically associated with tactical reasoning. These structures can be understood as a semantic basis for abstraction. Finally, their internal organizational principles would also provide a functional basis for the mechanisms needed for learning and adaptation. I conclude with avenues for understanding the structure and organizational principles of agile behavior based on principles of dynamics and optimal control.
What Hume has to say about personal identity in the section of the Treatise titled Of personal identity is the subject of this paper. Here I present my interpretation of his position and argue for its coherence and intelligibility. Further, I suggest that the view displays insight about human personal identity.
Exploratory Data Analysis (EDA) and Confirmatory Data Analysis (CDA) are two statistical methods widely used in scientific research. They are typically applied in sequence: first, EDA helps form a model or a hypothesis to be tested, and then CDA provides the tools to confirm whether that model or hypothesis holds true. When both analyses are applied within a single experiment, two main types of errors can occur that fall under the general term of selection bias. One error is the biased selection of the set of data used to confirm the model derived by the EDA. The other error occurs when CDA becomes part of EDA instead of being applied after EDA completion. As a result of selection bias, a model can be overfitted in a manner that makes it hold true only narrowly, i.e. for the specific sample from which it was derived, without any generalizability. This bias in planning the analysis occurs frequently in the literature. This paper provides the theoretical background and the conceptual tools by which to identify such errors in the literature and to carry out the analysis properly. Applications of EDA and CDA in medical biomarker research are used as paradigms for clarification of concepts.
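A minimal simulation (purely illustrative, not taken from the paper) makes the first kind of selection bias concrete: when a hypothetical "marker" is chosen because it looks strongest in the exploratory sample, its apparent effect largely evaporates on fresh data, because every marker here is pure noise by construction.

```python
# Illustrative sketch of selection bias: pick the best-looking marker on
# exploratory data, then "confirm" it on an independent sample.
import random
random.seed(0)

def best_marker_effect(groups_a, groups_b):
    """Return the index and apparent effect of the strongest-looking marker."""
    effects = [abs(sum(a) / len(a) - sum(b) / len(b))
               for a, b in zip(groups_a, groups_b)]
    k = max(range(len(effects)), key=lambda i: effects[i])
    return k, effects[k]

n_markers, n = 200, 20
# Exploratory phase: two groups measured on many candidate markers,
# every one of which is pure noise (no real group difference exists).
explore_a = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n_markers)]
explore_b = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n_markers)]
k, biased = best_marker_effect(explore_a, explore_b)

# Confirmatory phase on an independent sample. Since all markers are noise,
# a fresh sample for the selected marker is simply fresh noise.
fresh_a = [random.gauss(0, 1) for _ in range(n)]
fresh_b = [random.gauss(0, 1) for _ in range(n)]
unbiased = abs(sum(fresh_a) / n - sum(fresh_b) / n)

print(f"apparent effect on exploratory data: {biased:.2f}")
print(f"effect on independent data:          {unbiased:.2f}")
```

Selecting the maximum over 200 noise markers typically yields a sizable apparent effect, while the independent sample shows the effect near zero; confirming on the same data that drove the selection is exactly the overfitting the paper warns against.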
The serial order problem is the problem of how any animal controls action sequences. Lashley maintained that in the case of speech the order of sounds was externally imposed on units that had, in themselves, no temporal valences. Speech errors reveal that the external source of control is a syllable structure (or frame) constraint on the placement of consonants and vowels (content elements), whereby these two segmental forms cannot occupy each other's positions in syllable structure. According to the author's frame/content theory of evolution of speech, the frame constraint evolved because the original form of speech was a consonant-vowel (CV) syllabic cyclicity involving a close (consonant), open (vowel) mouth alternation produced by mandibular oscillation. As the requisite mandibular elevation and depression involve antagonistic movements, there was no opportunity in the evolution of speech for control signals related to these two phases to get mixed up with each other. The central contention of this paper is that babbling, which is a rhythmic series of CV alternations powered by mandibular oscillation, is an innate fixed action pattern which evolved as an ontogenetic affordance of the original frames for speech, and for their subsequent programmability with content elements in both phylogeny and ontogeny.
Law has spent surprisingly little time developing a theory of human nature. Its efforts have largely focused on the abnormal — notably, those not responsible for their actions by reason of mental illness or diminished capacity. The normal has barely been addressed. The field within legal scholarship that comes closest is law and economics. Law and economics embeds a theory — that people are rational maximizers of their self-interest. Law and economics admits its theory is unrealistic; it touts instead its theory's ability to predict. Behavioral law and economics aspires to more realism (and more predictive power). Its trajectory has, however, sometimes been contorted insofar as it has focused on exceptions to the law and economics view rather than a broader reconception of the overall endeavor. Such a reconception is desirable, necessary, and increasingly feasible.
In human history, a great number of significant discoveries in medical science began with case studies. These include John Martin Harlow's description of Phineas Gage in his letter to the editor of the Boston Medical and Surgical Journal, Alois Alzheimer's report on his patient Auguste Deter, and the initial six children reported by Landau and Kleffner in 1957. Similarly, modern research on autism is founded on cases reported by two physicians: Leo Kanner at Johns Hopkins and Hans Asperger at the University of Vienna.
The word autism is derived from the Greek word autós (self) and was originally used by Swiss psychiatrist Eugen Bleuler to describe patients with schizophrenia (Frith 1991). Subsequently, the term was used by two physicians (Leo Kanner in the United States, and Hans Asperger in Austria) to describe children whom they believed to suffer from a unique syndrome. In 1943, Leo Kanner published the paper Autistic Disturbances of Affective Contact describing eleven cases. A year later Hans Asperger published his paper entitled The Autistic Psychopathy in Childhood in which he described similar cases.
Exactly what symptoms exhibited by these children made the two physicians choose the same word to describe the pathology? Driven by this question, my paper will provide a description of autism by examining the original cases described by Kanner and Asperger. I will begin with an introduction to their works. I will then describe the core deficits of autism using cases described by the two physicians and end with a discussion of the impacts of their investigations on current research on autism.
Over a period of some fifty years, in three books and a mountain of articles, James J. Gibson developed what he called a theory of direct visual perception, a theory which, he believed, makes reasonable the common sense position that has been called by philosophers direct or naïve realism (Gibson 1967, p.168). His theory is novel, iconoclastic, and vastly important both for psychology and philosophy. I am as eager as he to defend some version of direct perceptual realism, and as dissatisfied as he with most theories currently in vogue. But I am not yet persuaded that his theory is what he claims it to be, and I would like to present my doubts in this paper. Like Gibson, I will concentrate on visual perception.
In brief, Gibson's theory is that visual perception is not a process of inferring from or organizing visual sensations produced by light falling on the retina, but rather a process in which the total visual system extracts (picks up) information about the environment from the light at the eye(s) of the organism as it explores its environment. He objects to theories that base perception on sensations (sense impressions, sense-data) or postulate some operation that converts sensations into percepts. His alternative information-based theory of perception assumes that sensory impressions are occasional and incidental symptoms of perception and are not required or normally involved in perception.
Economics, and law and economics, assume that preferences are fixed and not constructed. The assumption is unrealistic, they acknowledge, but is nevertheless useful for generating accurate predictions. They assume as well that having fixed preferences with the other attributes accorded to them under rational choice theory is normatively desirable. Both these assumptions are false. It is critical in many cases to acknowledge constructed preferences; moreover, such preferences are often not normatively undesirable. More can and should be done to develop a more nuanced conception of preferences; such an account should take into account the extent to which the 'discovery' metaphor underlying the idea of fixed preferences distracts from the needed inquiries into how preferences are 'created' — the importance of the process, and the role of narratives and classification. While the task is difficult, the payoffs potentially include better approaches to difficult public policy problems.
Traditionally, a sharp distinction was made between conscious perception of elapsed time, considered a key attribute of cognition, and automatic time processes involving basic sensory and motor functions. Recently, however, this dichotomous view has been challenged on the ground that time perception and timed actions share very similar features, at least for events lasting less than a second. For both perception and action, time estimates require internally generated and/or externally triggered signals, because there is no specific sensory receptor for time in the nervous system. We argue that time can be estimated by synchronizing a neural time base either to sensory stimuli reflecting external events or to an internal simulation of the corresponding events. We review evidence in favor of the existence of distributed, specialized mechanisms, possibly related to brain mechanisms which simulate the behavior of different categories of objects by means of distinct internal models. A critical specialization is related to the animate-inanimate distinction which hinges on different kinematic and kinetic properties of these two different categories. Thus, the time base used by the brain to process visual motion can be calibrated against the specific predictions regarding the motion of biological characters in the case of animate motion, whereas it can be calibrated against the predictions of motion of passive objects in the case of inanimate motion.
Decisions made during social interaction are complex due to the inherent uncertainty about their outcomes, which are jointly determined by the actions of the decision maker and others. Game theory, a mathematical analysis of such interdependent decision making, provides a computational framework to extract core components of complex social situations and to analyze decision making in terms of those quantifiable components. In particular, normative prescription of optimal strategies can be compared to the strategies actually used by humans and animals, thereby providing insights into the nature of observed deviations from prescribed strategies. Here, we review the recent advances in decision neuroscience based on game theoretic approaches, focusing on two major topics. First, a number of studies have uncovered behavioral and neural mechanisms of learning that mediate adaptive decision making during dynamic interactions among decision agents. We highlight multiple learning systems distributed in the cortical and subcortical networks supporting different types of learning during interactive games, such as model-free reinforcement learning and model-based belief learning. Second, numerous studies have investigated the role of social norms, such as fairness, reciprocity and cooperation, in decision making and their representations in the brain. We predict that in combination with sophisticated manipulation of socio-cognitive factors, game theoretic approaches will continue to provide useful tools to understand multifaceted aspects of complex social decision making, including their neural substrates.
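The model-free reinforcement learning highlighted above can be sketched in one of the simplest interactive games. In this hedged, illustrative example (the game, opponent bias, and learning parameters are all chosen arbitrarily, not drawn from the reviewed studies), a Q-learning agent playing matching pennies against a biased opponent learns to exploit the bias.

```python
# Toy model-free learner in matching pennies: the agent wins (+1) when its
# choice matches the opponent's, loses (-1) otherwise. The opponent is
# biased toward "heads", so a value learner should come to prefer "heads".
import random
random.seed(1)

q = {"heads": 0.0, "tails": 0.0}  # action values, updated from reward alone
alpha, epsilon = 0.1, 0.1         # learning rate, exploration probability

for _ in range(5000):
    # Epsilon-greedy action selection.
    if random.random() < epsilon:
        action = random.choice(["heads", "tails"])
    else:
        action = max(q, key=q.get)

    opponent = "heads" if random.random() < 0.8 else "tails"
    reward = 1.0 if action == opponent else -1.0

    # Model-free update: no model of the opponent, only a running value.
    q[action] += alpha * (reward - q[action])

print(q)
```

A model-based belief learner would instead estimate the opponent's choice probabilities and compute the best response; the review's point is that both kinds of systems appear to operate in distinct cortical and subcortical networks.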
Since its introduction in 1992, there has been a revolution in the ability to image brain function with functional magnetic resonance imaging (fMRI), going from early experiments demonstrating relatively coarse images of activity in the visual cortex, to mapping cortical columns, to "brain reading" that reconstructs the mental experiences of an individual, all exploiting the fact that we are endowed with a complex paramagnetic molecule sequestered in our blood vessels and that neuronal activity has spatially specific metabolic and physiologic consequences. These two decades of fMRI have been marked by incessant improvements in instrumentation, innovative developments in image acquisition and reconstruction methods, and a significant expansion in our knowledge of neurovascular coupling. Collectively, this body of work has recently brought us to the point of depicting functional activity in three dimensions, in the entire human brain, with submillimeter resolution. Some aspects of these accomplishments, and the rationale for their pursuit, are reviewed and presented together with a personal history of the development of fMRI.
The purpose of this essay is to address and reconcile the conflicting messages deriving from the functional neuroimaging literature regarding whether the brain mechanism of consciousness and the neuronal correlates of concepts and of transient experiences are visualizable, and to what extent they have been visualized. It is argued, first, that the likelihood of visualization of different aspects of mentation can be deduced unambiguously from the fundamental principles and facts that comprise the formal structure of the functional neuroimaging methods. Second, it is shown that to the degree that such aspects do have neurological validity and the formal structure of the methods holds true, the likelihood of visualizing their neuronal correlates varies from extremely high for all psychological functions, including consciousness, to practically null for conscious experiences constituting the stream of consciousness. Third, the various claims broadcast through the professional journals regarding visualization of the neuronal networks of consciousness and its products are scrutinized. The results of this scrutiny are that, thus far, and in spite of tremendous technical achievements on the part of the researchers in the area, none of these claims is correct. The validity of these results and what they bode for future research is left for the reader to evaluate, in the context of the formal structure of the methods and the pragmatic constraints that condition their implementation.
Full Text pdf (458kb) | Full Text HTML
What does it mean to say that one is listening to a piece of music, to a performance, or to an artist? Initially, it may seem that there is little or no difference between these three questions. Upon closer reflection, though, there may be significant differences, and these may be contingent on a variety of variables pertaining to musical content.
This paper examines the case of listening to Bob Dylan and argues that it may provide a unique listening protocol. This case is examined in the context of a larger theory of music listening that considers issues including memory, erudition, focus versus diffusion, and verbal versus non-verbal musicality. Aspects of the research are grounded in the field of disability studies in music, and intermittent mental disability in particular. The potential of a role for experimental research based on this theory of listening in the making is discussed.
Full Text pdf (642kb) | Full Text HTML
We propose here a science of human body/embodiment. We discuss our motivations, provide scientific and artistic precedents/foreshadowings, and then offer an overall picture of this nascent science, as well as a sample, unsolved problem from the field of disability studies. We imagine a science whose core subject is the connection of the body — as it appears in an action-perception paradigm derived from mirror neurons, embodied artificial intelligence (AI), and embodied theories of dance, music, and painting — to the cognition of emotions, language, mathematics and logic. We propose to construct the missing link between these two poles, action-body and cognitive stratum, by use of a theory of gestures developed from music theory. We conclude with some suggestions for the foundation of a science of human embodiment and an associated scientific journal.
Full Text pdf (655kb) | Full Text HTML
A fundamental goal of systems neuroscience is to understand how the collective dynamics of neurons encode sensory information and guide behavior. To answer this question, one needs to uncover the network of underlying neuronal interactions. Whereas advances in technology over the last several decades have made it possible to record neural activity simultaneously from a large number of network elements, these techniques do not provide information about the physical connectivity between the elements being recorded. Thus, neuroscientists are challenged to solve the inverse problem: inferring interactions between network elements from the recorded signals that arise from the network connectivity structure. Here, we review studies that address the problem of reconstructing network interactions from high-dimensional datasets generated by modern techniques, and focus on the emerging theoretical models capable of capturing the dominant network interactions of any order. These models are beginning to shed light on the structure and complexity of neuronal interactions.
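A common first step in the inverse problem described above is to estimate pairwise statistics from recorded activity; these are the quantities that pairwise maximum-entropy (Ising-type) models are then fit to reproduce. The sketch below is purely illustrative, with simulated data and an injected dependency (not drawn from any study in the review); fitting a full interaction model is substantially more involved.

```python
import numpy as np

# Illustrative sketch: estimate pairwise correlations from a binary
# spike raster (neurons x time bins). The raster is simulated, with
# one built-in dependency: neuron 1 tends to fire when neuron 0 does.
rng = np.random.default_rng(0)
n_neurons, n_bins = 5, 10_000

raster = (rng.random((n_neurons, n_bins)) < 0.1).astype(float)
raster[1] = np.where(raster[0] == 1,
                     rng.random(n_bins) < 0.8,   # co-firing probability
                     rng.random(n_bins) < 0.05   # baseline probability
                     ).astype(float)

# Pairwise Pearson correlations between spike trains
corr = np.corrcoef(raster)

# The injected interaction stands out against unrelated pairs
print(round(corr[0, 1], 2), round(corr[0, 2], 2))
```

Note that a strong correlation here does not by itself establish a physical connection, which is exactly the ambiguity the reviewed model-based approaches are designed to resolve.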
Full Text pdf (406kb) | Full Text HTML
For millennia, scholars, philosophers and poets have speculated on the origins of individual differences in behavior, and especially the extent to which these differences owe to inborn natural factors (nature) versus life circumstances (nurture). The modern form of the nature-nurture debate took shape in the late 19th century when, based on his empirical studies, Sir Francis Galton concluded that nature prevails enormously over nurture. However, Galton's interest in eugenics undermined early research into the genetic origins of behavior, which did not re-emerge until the latter half of the 20th century when behavioral geneticists started to publish their findings from twin and adoption studies. Current consensus is that the nature versus nurture debate represents a false dichotomy, and that to progress beyond this fallacy will require the investigation of how both genetic and environmental factors combine to affect the biological systems that underlie behavioral phenotypes.
Full Text pdf (5.11mb) | Full Text HTML
This paper offers a computational description and provisional statistical profile of properties of building layouts that contribute to making them intelligible. Buildings are arrangements of boundaries that demarcate interior space according to the organization of human life and activity. At the same time a building is usually a continuously connected interior. Thus, the implicit question addressed in practice by the design of every building layout is as follows: how can spatial demarcation and differentiation be consistent with our capacity to cognitively map a building as a whole? Everyday experience suggests that spaces with characteristic names such as corridors, courtyards, atria, halls and hallways act as references. They provide us with a more expansive visual field than the rooms devoted to particular uses; more importantly, they often afford an overview of the connections to adjoining spaces. This is generalized by the idea of the purview interface: the interface between ordinary spaces confining perception to a limited part of the interior, and prominent spaces providing overview not only of area but also of connections. The purview interface can function as a rudimentary foundation of layout intelligibility. I show that prominent overview spaces are strategically distributed so that all other spaces are only a few visual turns away from the nearest overview space. The strategic distribution of overview spaces limits the minimum number of turns around corners that are necessary in order to move between any two spaces in the building. This affords anchoring cognitive maps on overview spaces, which can act as effective references when navigating between any two spaces.
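The measure described above, how far every ordinary space lies from its nearest overview space, can be sketched as a multi-source breadth-first search on an adjacency graph of spaces. The layout, the space names, and the treatment of each graph edge as one "turn" are all simplifying assumptions for illustration; the paper's visual-turn analysis operates on actual layout geometry.

```python
from collections import deque

# Hypothetical layout: spaces and which spaces they open onto.
adjacency = {
    "hall":     ["room_a", "room_b", "corridor"],
    "corridor": ["hall", "room_c", "room_d"],
    "room_a":   ["hall"],
    "room_b":   ["hall"],
    "room_c":   ["corridor"],
    "room_d":   ["corridor", "closet"],
    "closet":   ["room_d"],
}
overview_spaces = {"hall", "corridor"}  # assumed purview spaces

def turns_to_overview(adj, sources):
    """Multi-source BFS: fewest steps from each space to an overview space."""
    dist = {s: 0 for s in sources}
    queue = deque(sources)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

dist = turns_to_overview(adjacency, overview_spaces)
print(dist["closet"])  # deepest space in this toy layout
```

In this toy layout every ordinary space is at most two steps from an overview space, which mirrors the paper's claim that overview spaces are distributed so that the rest of the building stays only a few turns away.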
Full Text pdf (388kb) | Full Text HTML
We present a structuralist analysis of the current state of cognitive science research into the phenomenon of metacognition. We begin from the assumption that cognitive intelligence is an organ just like any other biological organ, with the defined function of allowing intelligent entities to maintain homeostasis with a changing but predictable environment. This understanding leads to the conclusion that human cognition is functionally derived to generate accurate preference relations about the world. Examining empirical evidence from recent research using definitions of metacognition and theory of mind that emerge from our functionalist understanding of cognition, we conclude decisively in favor of the existence of metacognition in non-human animals.
Full Text pdf (452kb) | Full Text HTML
Recent work on decision-making suggests that we are a conglomeration of multiple decision-making information-processing systems. In this paper, I address consequences of this work on decision-making processes for three philosophical conceptualizations that can be categorized as forms of dualism.
All three of these dualist conceptualizations are still used in making decisions in current law and legal scholarship. As scientific results begin to undercut these dualist interpretations, it becomes dangerously easy to reject intention, free will, and responsibility. I argue that taking a more realistic perspective on human decision-making will provide a more reasonable basis for legal decisions.
The discipline of linguistics has isolated language from other natural systems, and linguistics has remained overwhelmingly a descriptive, rather than a theoretical science. The idea of language universals has thus been a matter of finding shared properties among the highest levels in the organization of language, rather than an attempt to understand the connections of language to other natural systems at all levels of its organization. Language basics such as discreteness have been treated as axiomatic or refractory. The origin of language has been treated as an exercise in continuity, rather than an attempt to understand the organization of language. These trends have been driven by the personalities of Edward Sapir, Zellig Harris and Noam Chomsky, more than by scientific considerations. Sapir’s eagerness, Harris’s indecision, and Chomsky’s magnetism have generated a perfect scientific storm.
Language and mind share a common source in nature with the number system and algebra. An equation represents the property of symmetry expressed in the form of symbols, where the equals sign represents the axis of symmetry, and the numbers are attached to the symmetrical halves. At the same time, an equation is a simple declarative sentence whose main verb is the equals sign, and which asserts, "I am symmetrical," and "It is true that I am symmetrical." Equations with arithmetical operators are generated on the basis of a dynamically developing cascade of symmetrical fractal subunits occurring in successive tiers. Ordinary sentences are generated by modifying the equation’s symmetrical infrastructure into an asymmetrical configuration, thus (1) allowing ordinary sentences to accept verbs other than the equals sign, and nouns other than numbers, and (2) introducing syntax by introducing meaning to word order, i.e., a=b and b=a are the same because equations are symmetrical, while John keeps bees and Bees keep John are different because ordinary sentences are asymmetrical. Since only human beings possess language, the uniquely human property of mind consists of the sense of truth-and-falsity expressed through declarative sentences. The necessity of discreteness for algebra and language, together with the periodic nature of the “phoneme” chart and the periodic table of the integers, suggest that language and algebra represent the quantum property of matter manifested at a human scale.
Important institutions and events merit occasional reflection—reflection on their accomplishments, on the antecedents to current events and situations, and on the people who brought them about. Cognitive Critique is an appropriate place for reflecting on recent events related to the Center for Cognitive Sciences. It was founded as the Center for Research in Human Learning (January 1964), and with shifting research trends became the Center for Research in Learning, Perception and Cognition (1987), and then the Center for Cognitive Sciences (1996). As of academic year 2013-14, the Center for Cognitive Sciences has reached a remarkable milestone: 50 years of continuous funding by grants from the National Science Foundation, the National Institute of Child Health and Human Development, and the colleges at the University of Minnesota. This record reflects special leadership, effective training, and highly successful research and scholarship by the faculty and student members of the Center. It certainly is cause for reflection and celebration.
Sadly, however, the past year also records the loss of the Center’s first two Directors, James J. Jenkins and Herbert L. Pick. These two intellectual leaders of the Center modeled wide-ranging scholarship, commitment to true fellowship, the importance of working across disciplinary boundaries to advance our understanding, and abiding faith in the value of training the next generation. Their personal strengths shaped the Center for all time, led to many successes, and guided later directors. Herein, we shall reflect on these two seminal figures in the Center's history.