Neurophysiological investigations of the past two decades have consistently demonstrated a deficit in sensory gating associated with schizophrenia. Phillips & Silverstein interpret this impairment as being consistent with cognitive coordination dysfunction. However, the physiological mechanisms that underlie sensory gating have not been shown to involve gamma-band oscillations or NMDA-receptors, both of which are critical neural elements in the cognitive coordination model.
This article sets the stage for the essays in this issue of Perichoresis on the Trinitarianism of the Particular Baptists in the British Isles and Ireland between the 1640s and 1840s. It argues that this Trinitarianism is part of a larger debate about the Trinity that has been largely forgotten in the scholarly history of this doctrine. It also touches on the way that Baptist theologians like John Gill were critical to the preservation of Trinitarian witness among this Christian community.
Introduction: education, philosophy and politics -- Writing the self: Wittgenstein, confession and pedagogy -- Nietzsche, nihilism and the critique of modernity: post-Nietzschean philosophy of education -- Heidegger, education and modernity -- Truth-telling as an educational practice of the self: Foucault and the ethics of subjectivity -- Neoliberal governmentality: Foucault on the birth of biopolitics -- Lyotard, nihilism and education -- Gilles Deleuze's 'societies of control': from disciplinary pedagogy to perpetual training -- Geophilosophy, education and the pedagogy of the concept -- Humanism, Derrida and the new humanities -- Politics and deconstruction: Derrida, neoliberalism and democracy -- Neopragmatism, ethnocentrism and the politics of the ethnos: Rorty's 'postmodernist bourgeois liberalism' -- Achieving America: postmodernism and Rorty's critique of the cultural left -- Deranging the investigations: Cavell on the philosophy of the child -- White philosophy in/of America.
This article is concerned with developing a philosophical approach to a number of significant changes to academic publishing, and specifically the global journal knowledge system wrought by a range of new digital technologies that herald the third age of the journal as an electronic, interactive and mixed-media form of scientific communication. The paper emerges from an Editors' Collective, a small New Zealand-based organisation comprised of editors and reviewers of academic journals mostly in the fields of education and philosophy. The paper is the result of a collective writing process.
Management theory and practice are facing unprecedented challenges. The lack of sustainability, the increasing inequity, and the continuous decline in societal trust pose a threat to ‘business as usual’. Capitalism is at a crossroads, and scholars, practitioners, and policy makers are called to rethink business strategy in light of major external changes. In the following, we review an alternative view of human beings that is based on a renewed Darwinian theory developed by Lawrence and Nohria. We label this alternative view ‘humanistic’ and draw distinctions to current ‘economistic’ conceptions. We then develop the consequences that this humanistic view has for business organizations, examining business strategy, governance structures, leadership forms, and organizational culture. Afterward, we outline the influences of humanism on management in the past and the present, and suggest options for humanism to shape the future of management. In this manner, we will contribute to the discussion of alternative management paradigms that help solve the current crises.
In this book, Michael Arbib, a researcher in artificial intelligence and brain theory, joins forces with Mary Hesse, a philosopher of science, to present an integrated account of how humans 'construct' reality through interaction with the social and physical world around them. The book is a major expansion of the Gifford Lectures delivered by the authors at the University of Edinburgh in the autumn of 1983. The authors reconcile a theory of the individual's construction of reality as a network of schemas 'in the head' with an account of the social construction of language, science, ideology and religion to provide an integrated schema-theoretic view of human knowledge. The authors still find scope for lively debate, particularly in their discussion of free will and of the reality of God. The book integrates an accessible exposition of background information with a cumulative marshalling of evidence to address fundamental questions concerning human action in the world and the nature of ultimate reality.
Using relevant encyclicals issued over the last 100 years, the author extracts those principles that constitute the underpinnings of Catholic Social Teaching about the employment relationship and contemplates implications of their incorporation into human resource policy. Respect for worker dignity, for his or her family's economic security, and for the common good of society clearly emerge as the primary guidelines for responsible human resource management. Dovetailing these three Church mandates with the economic objectives of the firm could, in essence, alter the firm's nature because profit motivations would be constrained by consideration for worker and societal welfare. Integration of Church teaching with current corporate goals should therefore impact greatly on a variety of human resource policies.
Here, we argue that any neurobiological theory based on an experience/function division cannot be empirically confirmed or falsified and is thus outside the scope of science. A ‘perfect experiment’ illustrates this point, highlighting the unbreachable boundaries of the scientific study of consciousness. We describe a more nuanced notion of cognitive access that captures personal experience without positing the existence of inaccessible conscious states. Finally, we discuss the criteria necessary for forming and testing a falsifiable theory of consciousness.
This paper, based on an invited Thesis Eleven presentation, provides a ‘map of technopolitics’ that springs from an investigation of the theoretical notion of technological convergence adopted by the US National Science Foundation, signaling a new paradigm of ‘nano-bio-info-cogno’ technologies. This integration at the nano-level is expected to drive the next wave of scientific research, technology and knowledge economy. The paper explores the concept of ‘technopolitics’ by investigating the links between Wittgenstein’s anti-scientism and Lyotard’s ‘technoscience’, reviewing the history of the notion in the work of the Belgian philosopher Gilbert Hottois. The ‘deep convergence’ representing a new technoscientific synergy is the product of long-term trends of ‘bioinformational capitalism’ that harnesses the twin forces of information and genetic sciences, which coalesce in the least mature ‘cognosciences’ in their application to education and research. The map of technopolitics systematically identifies the political relations between Big Tech and ‘new digital publics’ to reveal that the new paradigm is based on the supreme value of cognitive efficiency. There is a closely-knit cluster of concerns that frame a map of political issues about the fifth-generation technological impacts on human beings, their bodies and minds, and public institutions, not least the logic of the distribution and ownership of data, information and knowledge, and its effects on democracy.
Bishop and Trout here present a unique and provocative new approach to epistemology. Their approach aims to liberate epistemology from the scholastic debates of standard analytic epistemology, and treat it as a branch of the philosophy of science. The approach is novel in its use of cost-benefit analysis to guide people facing real reasoning problems and in its framework for resolving normative disputes in psychology. Based on empirical data, Bishop and Trout show how people can improve their reasoning by relying on Statistical Prediction Rules. They then develop and articulate the positive core of the book. Their view, Strategic Reliabilism, claims that epistemic excellence consists in the efficient allocation of cognitive resources to reliable reasoning strategies, applied to significant problems. The last third of the book develops the implications of this view for standard analytic epistemology; for resolving normative disputes in psychology; and for offering practical, concrete advice on how this theory can improve real people's reasoning. This is a truly distinctive and controversial work that spans many disciplines and will speak to an unusually diverse group, including people in epistemology, philosophy of science, decision theory, cognitive and clinical psychology, and ethics and public policy.
Morals from Motives develops a virtue ethics inspired more by Hume and Hutcheson's moral sentimentalism than by recently influential Aristotelianism. It argues that a reconfigured and expanded "morality of caring" can offer a general account of right and wrong action as well as social justice. Expanding the frontiers of ethics, it goes on to show how a motive-based "pure" virtue theory can also help us to understand the nature of human well-being and practical reason.
The article analyzes the neural and functional grounding of language skills as well as their emergence in hominid evolution, hypothesizing stages leading from abilities known to exist in monkeys and apes and presumed to exist in our hominid ancestors right through to modern spoken and signed languages. The starting point is the observation that both premotor area F5 in monkeys and Broca's area in humans contain a “mirror system” active for both execution and observation of manual actions, and that F5 and Broca's area are homologous brain regions. This grounded the mirror system hypothesis of Rizzolatti and Arbib (1998) which offers the mirror system for grasping as a key neural “missing link” between the abilities of our nonhuman ancestors of 20 million years ago and modern human language, with manual gestures rather than a system for vocal communication providing the initial seed for this evolutionary process. The present article, however, goes “beyond the mirror” to offer hypotheses on evolutionary changes within and outside the mirror systems which may have occurred to equip Homo sapiens with a language-ready brain. Crucial to the early stages of this progression is the mirror system for grasping and its extension to permit imitation. Imitation is seen as evolving via a so-called simple system such as that found in chimpanzees (which allows imitation of complex “object-oriented” sequences but only as the result of extensive practice) to a so-called complex system found in humans (which allows rapid imitation even of complex sequences, under appropriate conditions) which supports pantomime. This is hypothesized to have provided the substrate for the development of protosign, a combinatorially open repertoire of manual gestures, which then provides the scaffolding for the emergence of protospeech (which thus owes little to nonhuman vocalizations), with protosign and protospeech then developing in an expanding spiral.
It is argued that these stages involve biological evolution of both brain and body. By contrast, it is argued that the progression from protosign and protospeech to languages with full-blown syntax and compositional semantics was a historical phenomenon in the development of Homo sapiens, involving few if any further biological changes. Key Words: gestures; hominids; language evolution; mirror system; neurolinguistics; primates; protolanguage; sign language; speech; vocalization.
Largely due to the popular allegation that contemporary science has uncovered indeterminism in the deepest known levels of physical reality, the debate as to whether humans have moral freedom, the sort of freedom on which moral responsibility depends, has to some extent put aside the traditional worry over whether determinism is true. As I argue in this paper, however, there are powerful proofs for both chronological determinism and necessitarianism, forms of determinism that pose the most penetrative threat to human moral freedom. My ultimate hope is to show that, despite the robust case against human moral freedom that can be made without even relying on them, chronological determinism and necessitarianism should be regarded with renewed urgency.
This essay examines the funeral sermon given by the Baptist theologian Andrew Fuller for his friend and deacon Beeby Wallis in 1792 as a vantage-point from which to pursue reflection on Fuller’s concept of heaven and the beatific vision. The sermon has two main themes: the rest and rewards of those who die in Christ. The essay examines how Fuller interprets both of these phrases and then, looking at the rest of Fuller’s corpus, notes that ultimately God himself is the believer’s reward.
Science and philosophy study well-being with different but complementary methods. Marry these methods and a new picture emerges: To have well-being is to be "stuck" in a positive cycle of emotions, attitudes, traits and success. This book unites the scientific and philosophical worldviews into a powerful new theory of well-being.
Although our subjective impression is of a richly detailed visual world, numerous empirical results suggest that the amount of visual information observers can perceive and remember at any given moment is limited. How can our subjective impressions be reconciled with these objective observations? Here, we answer this question by arguing that, although we see more than the handful of objects claimed by prominent models of visual attention and working memory, we still see far less than we think we do. We argue that, taken together, these considerations resolve the apparent conflict between our subjective impressions and empirical data on visual capacity, while also illuminating the nature of the representations underlying perceptual experience.
Coalescent argumentation is a normative ideal that involves the joining together of two disparate claims through recognition and exploration of opposing positions. By uncovering the crucial connection between a claim and the attitudes, beliefs, feelings, values and needs to which it is connected, dispute partners are able to identify points of agreement and disagreement. These points can then be utilized to effect coalescence, a joining or merging of divergent positions, by forming the basis for a mutual investigation of non-conflictual options that might otherwise have remained unconsidered. The essay proceeds by defining and discussing ‘argument’, ‘position’ and ‘understanding’. These notions are then brought together to outline the concept of coalescent reasoning.
The human mind has proven uniquely capable of unraveling untold mysteries, and yet, the mind is fundamentally challenged when it turns back on itself to ask what it itself is. How do we conceive of mind in this postmodern world; how can we use philosophical anthropology to understand mind and its functions? While philosophers and social scientists have made important contributions to our understanding of mind, existing theories are insufficient for penetrating the complexities of mind in the twenty-first century. Mind Unmasked: A Political Phenomenology of Consciousness draws on twentieth-century philosophies of consciousness to explain the phenomenon of mind in the broadest sense of the word. Michael A. Weinstein and Timothy M. Yetman develop a thought-provoking discourse that moves beyond the nature of the human experience of mind at both the individual and interpersonal levels and present a meditation on life in the contemporary world of global mass-mediated human culture.
Exploring the construct of social-responsibility orientation across three Asian and two Western societies, we show evidence that top-level executives in these societies hold fundamentally different beliefs about their responsibilities toward different stakeholders, with concomitant implications for their understanding and enactment of responsible leadership. We further find that these variations are more closely aligned with institutional factors than with cultural variables, suggesting a need to clarify the connection between culture and institutions on the one hand and culture and social-responsibility orientations on the other.
Thirty years ago, I elaborated on a position that could be seen as a compromise between an "extreme," symbol-based AI, and a "neurochemical reductionism" in AI. The present article recalls aspects of the espoused framework of schema theory that, I suggested, could provide a better bridge from human psychology to brain theory than that offered by the symbol systems of A. Newell and H. A. Simon.
In this paper, we offer a Piagetian perspective on the construction of the logico-mathematical schemas which embody our knowledge of logic and mathematics. Logico-mathematical entities are tied to the subject's activities, yet are so constructed by reflective abstraction that they result from sensorimotor experience only via the construction of intermediate schemas of increasing abstraction. The axiom set does not exhaust the cognitive structure (schema network) which the mathematician thus acquires. We thus view truth not as something to be defined within the closed world of a formal system but rather in terms of the schema network within which the formal system is embedded. We differ from Piaget in that we see mathematical knowledge as based on social processes of mutual verification which provide an external drive to any necessary dynamic of reflective abstraction within the individual. From this perspective, we argue that axiom schemas tied to a preferred interpretation may provide a necessary intermediate stage of reflective abstraction en route to acquisition of the ability to use formal systems in abstracto.
Strategic Reliabilism is a framework that yields relative epistemic evaluations of belief-producing cognitive processes. It is a theory of cognitive excellence, or more colloquially, a theory of reasoning excellence (where 'reasoning' is understood very broadly as any sort of cognitive process for coming to judgments or beliefs). First introduced in our book, Epistemology and the Psychology of Human Judgment (henceforth EPHJ), the basic idea behind SR is that epistemically excellent reasoning is efficient reasoning that leads in a robustly reliable fashion to significant, true beliefs. It differs from most contemporary epistemological theories in two ways. First, it is not a theory of justification or knowledge – a theory of epistemically worthy belief. Strategic Reliabilism is a theory of epistemically worthy ways of forming beliefs. And second, Strategic Reliabilism does not attempt to account for an epistemological property that is assumed to be faithfully reflected in the epistemic judgments and intuitions of philosophers. If SR makes recommendations that accord with our reflective epistemic judgments and intuitions, great. If not, then so much the worse for our reflective epistemic judgments and intuitions.
The generality problem is widely considered to be a devastating objection to reliabilist theories of justification. My goal in this paper is to argue that a version of the generality problem applies to all plausible theories of justification. Assume that any plausible theory must allow for the possibility of reflective justification—S's belief, B, is justified on the basis of S's knowledge that she arrived at B as a result of a highly (but not perfectly) reliable way of reasoning, R. The generality problem applies to all cases of reflective justification: Given that B is the product of a process-token that is an instance of indefinitely many belief-forming process-types (or BFPTs), why is the reliability of R, rather than the reliability of one of the indefinitely many other BFPTs, relevant to B's justificatory status? This form of the generality problem is restricted because it applies only to cases of reflective justification. But unless it is solved, the generality problem haunts all plausible theories of justification, not just reliabilist ones.
Viral modernity is a concept based upon the nature of viruses, the ancient and critical role they play in evolution and culture, and the basic application to understanding the role of information and forms of bioinformation in the social world. The concept draws a close association between viral biology on the one hand, and information science on the other – it is an illustration and prime example of bioinformationalism that brings together two of the most powerful forces that now drive cultural evolution. The concept of viral modernity applies to viral technologies, codes and ecosystems in information, publishing, education and emerging knowledge systems. This paper traces the relationship between epidemics, quarantine, and public health management and outlines elements of viral-digital philosophy based on the fusion of living and technological systems. We discuss Covid-19 as a ‘bioinformationalist’ response that represents a historically unprecedented level of information sharing, from the sequencing of the genome to testing for a vaccine. Finally, we look at the US response to Covid-19 through the lens of infodemics and post-truth. The paper is followed by three open reviews, which further refine its conclusions as they relate to philosophy and the notion of the virus as Pharmakon.
Martin Heidegger is, perhaps, the most controversial philosopher of the twentieth century. Little, however, has been written on his work and its significance for educational thought. This unique collection by a group of international scholars reexamines Heidegger's work and its legacy for educational thought.
Are thought experiments nothing but arguments? I argue that it is not possible to make sense of the historical trajectory of certain thought experiments if one takes them to be arguments. Einstein and Bohr disagreed about the outcome of the clock-in-the-box thought experiment, and so they reconstructed it using different arguments. This is to be expected whenever scientists disagree about a thought experiment's outcome. Since any such episode consists of two arguments but just one thought experiment, the thought experiment cannot be the arguments.