In Part 1 the reader is introduced to some standard systems of modal logic and encouraged through a series of exercises to become proficient in manipulating these logics. The emphasis is on possible world semantics for modal logics and the semantic emphasis is carried into the formal method, Jeffrey-style truth-trees. Standard truth-trees are extended in a simple and transparent way to take possible worlds into account. Part 2 systematically explores the applications of modal logic to philosophical issues such as truth, time, processes, knowledge and belief, obligation and permission. Accessible, authoritative, and assured, Modal Logics and Philosophy requires no more background than the completion of a standard introductory logic course. It will be welcomed not only by students looking for a bridge between introductory logic texts and the high-level technical literature but also as a guide to, and exploration of, work at the forefront of logic and philosophy.
The first edition, published by Acumen in 2000, became a prescribed textbook on modal logic courses. The second edition has been fully revised in response to readers' suggestions, including two new chapters on conditional logic, which was not covered in the first edition. "Modal Logics and Philosophy" is a fully comprehensive introduction to modal logics and their application suitable for course use. Unlike most modal logic textbooks, which are both forbidding mathematically and short on philosophical discussion, "Modal Logics and Philosophy" places its emphasis firmly on showing how useful modal logic can be as a tool for formal philosophical analysis. In part 1 of the book, the reader is introduced to some standard systems of modal logic and encouraged through a series of exercises to become proficient in manipulating these logics. The emphasis is on possible world semantics for modal logics and the semantic emphasis is carried into the formal method, Jeffrey-style truth-trees. Standard truth-trees are extended in a simple and transparent way to take possible worlds into account. Part 2 systematically explores the applications of modal logic to philosophical issues such as truth, time, processes, knowledge and belief, obligation and permission.
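Since both blurbs above turn on possible world semantics, a small worked example may help orient readers coming from an introductory logic course. The Python sketch below evaluates box and diamond formulas over a tiny Kripke model; the class, the tuple encoding of formulas, and the example model are my own illustrative assumptions, not the book's Jeffrey-style truth-tree method.

# Minimal sketch of possible-world (Kripke) semantics for propositional modal logic.
# The model layout and formula encoding are illustrative assumptions only.

class KripkeModel:
    def __init__(self, worlds, access, valuation):
        self.worlds = worlds        # set of world names, e.g. {"w0", "w1"}
        self.access = access        # dict: world -> set of accessible worlds
        self.valuation = valuation  # dict: world -> set of atoms true at that world

    def holds(self, formula, w):
        """Evaluate a formula, given as nested tuples, at world w."""
        op = formula[0]
        if op == "atom":
            return formula[1] in self.valuation[w]
        if op == "not":
            return not self.holds(formula[1], w)
        if op == "and":
            return self.holds(formula[1], w) and self.holds(formula[2], w)
        if op == "box":   # necessarily: true at every accessible world
            return all(self.holds(formula[1], v) for v in self.access[w])
        if op == "dia":   # possibly: true at some accessible world
            return any(self.holds(formula[1], v) for v in self.access[w])
        raise ValueError(f"unknown operator {op!r}")

# Two worlds: w0 sees both worlds, w1 sees only itself, and p holds only at w1.
m = KripkeModel(
    worlds={"w0", "w1"},
    access={"w0": {"w0", "w1"}, "w1": {"w1"}},
    valuation={"w0": set(), "w1": {"p"}},
)
print(m.holds(("dia", ("atom", "p")), "w0"))  # True: p is possible at w0
print(m.holds(("box", ("atom", "p")), "w0"))  # False: p is not necessary at w0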
This book examines the conservative theory that guided Jeb Bush's behavior and the aggressive manner in which he used the Office of Governor to pursue his goals. It offers insight into his motivations and competencies and analyzes the extent to which his self-proclaimed 'revolution' achieved its goals in Florida.
Ever since Saul Kripke and others developed a semantic interpretation for modal logic, 'possible worlds' has been a much debated issue in contemporary metaphysics. To propose the idea of a possible world that differs in some way from our actual world - for example a world where the grass is red or where no people exist - can help us to analyse and understand a wide range of philosophical concepts, such as counterfactuals, properties, modality, and of course, the notions of possibility and necessity. This book examines the ways in which possible worlds have been used as a framework for considering problems in logic and argument analysis. The book begins with a non-technical introduction to the basic ideas of modal logic in terms of Kripke's possible worlds and then moves on to a discussion of 'possible for' and 'possible that'. The central chapters examine questions of meaning, epistemic possibility, temporal logic, metaphysics, and impossibility. Girle also investigates how the idea of a possible world can be put to use in different areas of philosophy, the problems it may raise, and the benefits that can be gained.
Simulations using idealized numerical models can often generate behaviors or patterns that are visually very similar to the natural phenomenon being investigated and explained. The question arises, when should these model simulations be taken to provide an explanation for why the natural phenomena exhibit the patterns that they do? An important distinction for answering this question is that between ‘how-possibly’ explanations and ‘how-actually’ explanations. Despite the importance of this distinction, there has been surprisingly little agreement over how exactly this distinction should be drawn. I shall argue that inadequate attention has been paid to the different contexts in which an explanation can be given and the different levels of abstraction at which the explanandum phenomenon can be framed. By tracing how scientists are using model simulations to explain a striking periodic banding of vegetation known as tiger bush, I will show how our understanding of the distinction between how-possibly and how-actually model explanations needs to be revised.
The usual treatment of a dinner table utterance of ‘Can you pass the salt?’ is that it involves an indirect request to pass the salt as well as a direct question about the hearer’s ability to do so: an indirect speech act. These are held to involve two illocutionary forces and two illocutionary acts. Rod Bertolet has raised doubts about whether consideration of such examples warrants the postulation of indirect speech acts and illocutionary forces other than the literal ones. In a recent article, Mary Kate McGowan, Shan Shan Tam, and Margaret Hall claim to show that these doubts are unfounded. The purpose of this paper is to show that they have not established this.
Winner of the Gustave O. Arlt Award in the Humanities. Three understandings of the nature of religion--religion as experience, symbolic meaning, and power--have dominated scholarly discussions, in succession, for the past hundred years. Proponents of each of these three approaches have tended to downplay, ignore, or actively criticize the others. But why should the three approaches be at odds? Religion as it is practiced involves experiences, meanings, and power, so students of religion should attend to all three. Furthermore, theorists of religion should have an account that carefully conceptualizes all three aspects, without regarding any of them as more basic than the others. Visions of Religion provides just such an account. Stephen S. Bush examines influential proponents of the three visions, arguing that each approach offers substantial and lasting contributions to the study of religion, although each requires revision. Bush rehabilitates the concepts of experience and meaning, two categories that are much maligned these days. In doing so, he shows the extent to which these categories are implicated in matters of social power. As for power, the book argues that the analysis of power requires attention to meaning and experience. Visions of Religion accomplishes all this by articulating a social practical theory of religion that can account for all three aspects, even as it incorporates them into a single theoretical framework.
This work is an introduction to logic, covering what is most commonly taught in the first term of a two-term sequence in logic at four-year colleges and universities. It is designed for use by community college students who plan to transfer credits to four-year institutions. The material covered seeks to maintain logic's place in philosophical thought systems, and avoids political examples in order to appeal to reason and study rather than ill-conceived jokes that often offend students' varying political beliefs. This work concludes with studies in proof constructions and rules and provides explanations of various grading decisions commonly made in logic courses, a unique feature helpful to students and teachers alike.
Even if preventive military counter-terrorism may sometimes be ethically justifiable, it remains an open question whether the Bush Doctrine presents a discursively coherent account of the relevant normative conditions. With a view towards answering this question, this article critically examines efforts to ground the morally personifying language of the Bush Doctrine in terms of hegemonic stability theory. Particular critical attention is paid to the arguments of leading proponents of this brand of game theory, including J. Yoo, E. Posner, A. Sykes, and J. Goldsmith. When examined in their terms, the Bush Doctrine is best understood as an ethically hypocritical and shortsighted international discursive strategy. Its use of moralistic language in demonizing 'rogue states' for purely amoral purposes is normatively incoherent and discursively unsustainable. If it is a strategically rational piece of international communication, it seems designed to undermine globally shared normative meanings for the sake of short-term unilateral military advantage.
We report on some recent work centered on attempts to understand when one set is more random than another. We look at various methods of calibration by initial segment complexity, such as those introduced by Solovay [125], Downey, Hirschfeldt, and Nies [39], Downey, Hirschfeldt, and LaForte [36], and Downey [31]; as well as other methods such as lowness notions of Kučera and Terwijn [71], Terwijn and Zambella [133], Nies [101, 100], and Downey, Griffiths, and Reid [34]; higher level randomness notions going back to the work of Kurtz [73], Kautz [61], and Solovay [125]; and other calibrations of randomness based on definitions along the lines of Schnorr [117]. These notions have complex interrelationships, and connections to classical notions from computability theory such as relative computability and enumerability. Computability figures in obvious ways in definitions of effective randomness, but there are also applications of notions related to randomness in computability theory. For instance, an exciting by-product of the program we describe is a more-or-less natural requirement-free solution to Post's Problem, much along the lines of the Dekker deficiency set.
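For orientation, the following standard formulas spell out what "calibration by initial segment complexity" amounts to; the Levin-Schnorr characterization and the $\le_K$ ordering are standard in this literature, though the notation here is assumed rather than quoted from the survey.

% Levin--Schnorr: a real is Martin-L\"of random iff its initial segments are
% incompressible, where K denotes prefix-free Kolmogorov complexity.
A \text{ is Martin-L\"of random} \iff \exists c\,\forall n\; K(A \upharpoonright n) \ge n - c.
% One calibration of relative randomness: B is at least as random as A when
% B's initial segments are at least as complex, up to an additive constant.
A \le_K B \iff \exists c\,\forall n\; K(A \upharpoonright n) \le K(B \upharpoonright n) + c.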
Despite the evidence showing the promise of HIV treatment as prevention (TasP) in reducing HIV incidence, a variety of ethical questions surrounding the implementation and “scaling up” of TasP have been articulated by a variety of stakeholders including scientists, community activists and government officials. Given the high profile and potential promise of TasP in combatting the global HIV epidemic, an explicit and transparent research priority-setting process is critical to inform ongoing ethical discussions pertaining to TasP.
The Encyclopedia of Organ includes articles on the organ family of instruments, including famous players, composers, instrument builders, the construction of the instruments, and related terminology. It is the first complete A-Z reference on this important family of keyboard instruments. The contributors include major scholars of music and musical instrument history from around the world.
A common contemporary view is that the Bible and subsequent Christian thought authorize humans to exploit animals purely as means to human ends. This paper argues that Biblical and Christian thought have given rise to a more complex ethic of animal use informed by its pastoralist origins, Biblical pronouncements that permit different interpretations, and competing ideas and doctrines that arose during its development, and influenced by the rich and often contradictory features of ancient Hebrew and Greco-Roman traditions. The result is not a uniform ethic but a tradition of unresolved debate. Differing interpretations of the Great Chain of Being and the conflict over animal experimentation demonstrate the colliding values inherent in the complex history of Biblical and Christian thought on animals.
We prove that there exists a noncomputable c.e. real which is low for weak 2-randomness, a definition of randomness due to Kurtz, and that all reals which are low for weak 2-randomness are low for Martin-Löf randomness.
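To fix the terminology used here: an oracle is "low for" a randomness notion when relativizing to it does not derandomize anything. A standard formulation follows (notation assumed, not quoted from the paper).

% A is low for a randomness notion R (e.g. weak 2-randomness or Martin-L\"of
% randomness) if every R-random real remains R-random relative to A:
A \text{ is low for } \mathcal{R} \iff \mathcal{R} \subseteq \mathcal{R}^{A}.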
In their book, Darwinism Evolving: Systems Dynamics and the Genealogy of Natural Selection, Depew and Weber argued for the need to address the relationship between self-organization and natural selection in evolutionary theory, and focused on seven “visions” for doing so. Recently, Batten et al. in a paper in this journal, entitled “Visions of evolution: self-organization proposes what natural selection disposes,” picked up the issue with the work of Depew and Weber as a starting point. While the efforts of both sets of authors are to be commended, there are substantive errors in both the presentations of my work and of my work with colleagues that undermine theirs. My purpose here is to correct the errors in question, thereby removing the undermining effects and in so doing reassert the position my colleagues and I first advanced more than two decades ago, and that I still stand by and argue for today. The central points are as follows: Self-organization or spontaneous ordering is a process of selection; this selection process is governed by a “physical selection principle”; this principle is the law of maximum entropy production; and natural selection is a special case where the components are replicating.
Agriculture has been enormously productive in recent decades. The main problem is that fragmentation of issues, knowledge, and responsibilities has hidden the costs associated with this success. These are mainly environmental, social, and health costs, which have been assigned to other ministries, with their own histories unconnected to agriculture. Now that agricultural policy has achieved its success, its costs are becoming apparent. The current system is preoccupied with traditional views of competitiveness and efficiency. Policies, programs, and regulations are organized to support specific commodities, not farming and food systems. Responsibilities are extremely fragmented and frequently uncoordinated. In this environment, the focus on nourishment, food security, and environmental sustainability is subordinated to economic issues. The future lies in reorienting agricultural policy away from maximum production and towards sustainability. We propose a major transformation of the policy making apparatus in order to shift the focus of the system towards nourishment, food security, and sustainability. A new policy making system must be built on the themes of: integrated responsibilities and activities; emphasis on macro-policy; transdisciplinary policy development; proximity of policy makers to the diverse groups affected by problems needing resolution; food systems policy. The design principles for such a new system are taken from the theory of food security and ecology. Using these principles, we design a new provincial department of food and food security, and test this design with two case studies.
Philosophical arguments stemming from the public health ethics arena suggest that public health interventions ought to be subject to normative inquiry that considers relational values, including concepts such as solidarity, reciprocity and health equity. As yet, however, the extent to which ‘public’ values influence the ‘autonomous’ decisions of the public remains largely unexplored. Drawing on interviews with 50 men in Vancouver, Canada, this study employs a critical discourse analysis to examine participants’ decisions and motivations to voluntarily access HIV testing and/or to accept a routine HIV test offer. Within a sub-set of interviews, a transactional discourse emerged in which the decision to test features an arrangement of ‘giving and receiving’. Discourses related to notions of solidarity emphasize considerations of justice and position testing as a ‘public’ act. Lastly, ‘individualistic’ discourses focused on individual-level considerations, with less concern for the broader public ‘good’. These findings underscore how normative dimensions pertaining to men’s decisions to test are dialectically interrelated with the broader social and structural influences on individual and collective health-related behaviour, thereby suggesting a need to advance an explicit empirical-normative research agenda related to population and public health intervention research.
In Kate Bush’s 1993 album, The Red Shoes, and her film, The Line, the Cross and the Curve, she engages with the symbolism of The Red Shoes fairytale as first depicted in Hans Christian Andersen’s 1845 fairy tale and later developed by the Powell and Pressburger film of the same name. In Bush’s versions of the tale she attempts to find a space of agency for the main female protagonist in a plot structure over-determined by patriarchal narrative and symbolic logic. I will argue that it is through her own use of mystical symbolism — the Line, the Cross and the Curve — mediated through the deployment of ritual magick and kabbalistic ritual — that she breaks the ‘spell’ of the red shoes story so that the main female character can escape the gender-specific ‘curse’ of the red shoes.
Harold Garfinkel wrote a series of highly detailed and lengthy 'memos' during his time (1951-53) at Princeton, where remarkable developments in information theory were taking place. These very substantial manuscripts have been edited by Anne Warfield Rawls in Toward a Sociological Theory of Information (Garfinkel 2008). This paper explores some of the implications of these memos, which we suggest are still relevant for the study of 'information' and information theory. Definitional privilege of 'information' as a technical term has been arrogated by information science, which thereby excludes the interactional occasions of use of 'information'. The authors examine some 'professional' and 'laic' determinations of 'information'. Looking at in situ uses of 'information' shows how dealing with 'information' is characterized by ad hoc practices, such as specifications, 'authorization' and 'particularization' procedures. The authors report on a series of workplace studies in academic libraries, looking at how librarians account for 'information' through practices of classification. Classifying 'information' is a member's local accomplishment, and explicating practices of classifying 'information' undermines the formal-analytic project of the 'Philosophy of Information,' as formulated, for instance, by Luciano Floridi. Implications of Garfinkel's work must remain beyond the purview of information science if it is to maintain its status as the recognized field dealing with 'information'. However, such omission risks 'losing the phenomenon' of 'information': to adapt an argument from Dorothy Smith (Catalyst, 8, pp. 39-54, 1974), it trades upon decontextualized uses and recontextualizes 'information' for the practical purposes of formal analysis.
Many individuals domestically and internationally who strive for peace and justice are concerned about the new National Security Strategy issued by the George W. Bush Administration in September 2002. William Galston, for example, writes in a recent issue of Philosophy and Public Policy Quarterly: A global strategy based on the new Bush doctrine of preemption means the end of the system of international institutions, laws and norms that we have worked to build for more than half a century. To his credit, Kissinger recognizes this; he labels Bush’s new approach “revolutionary” and declares, “Regime change as a goal for military intervention challenges the international system.” Does the new Bush doctrine end the international legal system? Is the new Bush doctrine making policy declarations that are unprecedented in United States history? While I share many of the concerns critics are expressing about the new national security strategy, I contend that the more serious issue is not the ways in which this strategy represents a departure from those of prior United States presidential administrations but the actual practices of the Bush administration that appeal to this strategy. I will indicate how this new national security strategy does not represent much of a shift in policy, capability, or practice. Instead, Bush is using the strategy as an enabling device for a disturbing resurgence of United States global imperialism that serves interests that are actually opposed to the political rhetoric of the value of nations aiming for democracy and a market economy. I conclude by commenting on pursuing genuinely democratic values. I suggest that if the United States were truly committed to democratic values, then any military interventions would require the prior consent of the people. Otherwise what the United States refers to as “bringing democracy” to a people will be more like a militarily enforced authoritarianism that too closely resembles old-style exploitive imperialism.
Anterior cingulate cortex (ACC) is a part of the brain's limbic system. Classically, this region has been related to affect, on the basis of lesion studies in humans and in animals. In the late 1980s, neuroimaging research indicated that ACC was active in many studies of cognition. The findings from EEG studies of a focal area of negativity in scalp electrodes following an error response led to the idea that ACC might be the brain's error detection and correction device. In this article, these various findings are reviewed in relation to the idea that ACC is a part of a circuit involved in a form of attention that serves to regulate both cognitive and emotional processing. Neuroimaging studies showing that separate areas of ACC are involved in cognition and emotion are discussed and related to results showing that the error negativity is influenced by affect and motivation. In addition, the development of the emotional and cognitive roles of ACC is discussed, along with how the success of this regulation in controlling responses might be correlated with cingulate size. Finally, some theories are considered about how the different subdivisions of ACC might interact with other cortical structures as a part of the circuits involved in the regulation of mental and emotional activity.
As a natural example of a 1-random real, Chaitin proposed the halting probability Ω of a universal prefix-free machine. We can relativize this example by considering a universal prefix-free oracle machine U. Let $\Omega_U^A$ be the halting probability of $U^A$; this gives a natural uniform way of producing an A-random real for every $A \in 2^\omega$. It is this operator which is our primary object of study. We can draw an analogy between the jump operator from computability theory and this Omega operator. But unlike the jump, which is invariant under the choice of an effective enumeration of the partial computable functions, $\Omega_U^A$ can be vastly different for different choices of U. Even for a fixed U, there are oracles $A =^* B$ such that $\Omega_U^A$ and $\Omega_U^B$ are 1-random relative to each other. We prove this and many other interesting properties of Omega operators. We investigate these operators from the perspective of analysis, computability theory, and of course, algorithmic randomness.
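For reference, the Omega operator studied here has a simple closed form; the formula below uses standard notation and is recorded as an orientation aid rather than quoted from the paper.

% Halting probability of the universal prefix-free oracle machine U relative to A:
\Omega_U^A \;=\; \sum_{\sigma \,:\, U^A(\sigma)\downarrow} 2^{-|\sigma|},
% so the Omega operator A \mapsto \Omega_U^A sends each oracle A \in 2^\omega
% to an A-random real.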
The use of real clocks and measuring rods in quantum mechanics implies a natural loss of unitarity in the description of the theory. We briefly review this point and then discuss the implications it has for the measurement problem in quantum mechanics. The intrinsic loss of coherence allows one to circumvent some of the usual objections to the measurement process as due to environmental decoherence.
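As a rough indication of what the natural loss of unitarity looks like formally, work in this line typically replaces the von Neumann equation with a Lindblad-type equation containing a decoherence term in the energy basis; the form below is my paraphrase of that general shape, so the exact coefficient $D(t)$ and sign conventions should be checked against the paper.

% Evolution with respect to a real (physical) clock: the unitary term plus a
% decoherence term in the energy eigenbasis, with D(t) fixed by clock accuracy.
\frac{\partial \rho}{\partial t} \;=\; -\,i\,[H,\rho] \;-\; D(t)\,[H,[H,\rho]].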
We describe punctual categoricity in several natural classes, including binary relational structures and mono-unary functional structures. We prove that every punctually categorical structure in a finite unary language is ${\text {PA}}$-categorical, and we show that this upper bound is tight. We also construct an example of a punctually categorical structure whose degree of categoricity is $0''$. We also prove that, with a bit of work, the latter result can be pushed beyond $\Delta ^1_1$, thus showing that punctually categorical structures can possess arbitrarily complex automorphism orbits. As a consequence, it follows that binary relational structures and unary structures are not universal with respect to primitive recursive interpretations; equivalently, in these classes every rich enough interpretation technique must necessarily involve unbounded existential quantification or infinite disjunction. In contrast, it is well-known that both classes are universal for Turing computability.
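For orientation, the basic notion underlying this abstract is that of a punctual (fully primitive recursive) presentation; the precise definition of punctual categoricity varies slightly across papers and is not reproduced here.

% A countably infinite structure S is punctual (fully primitive recursive) if
% its domain is the natural numbers and all of its operations and relations
% are uniformly primitive recursive:
S \text{ is punctual} \iff \operatorname{dom}(S) = \mathbb{N} \ \text{and every operation and relation of } S \text{ is primitive recursive.}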
The new edition of this widely used and respected textbook includes three new chapters on conditional logic. Other chapters have been revised and updated, making the second edition a fully comprehensive introduction to modal logics and their application. Unlike most modal logic textbooks, which are both forbidding mathematically and short on philosophical discussion, Modal Logics and Philosophy focuses on showing how useful modal logic can be as a tool for formal philosophical analysis. In Part 1, the reader is introduced to some standard systems of modal logic and provided with a series of exercises that encourage proficiency in manipulating these logics. Girle emphasizes possible world semantics for modal logics and its formal method, Jeffrey-style truth-trees, in which standard truth-trees are extended in a simple and transparent way to take possible worlds into account. Part 2 explores the applications of modal logic to philosophical issues such as truth, time, processes, knowledge and belief, and obligation and permission.
Robert I. Soare, Automorphisms of the Lattice of Recursively Enumerable Sets. Part I: Maximal Sets.Manuel Lerman, Robert I. Soare, $d$-Simple Sets, Small Sets, and Degree Classes.Peter Cholak, Automorphisms of the Lattice of Recursively Enumerable Sets.Leo Harrington, Robert I. Soare, The $\Delta^0_3$-Automorphism Method and Noninvariant Classes of Degrees.
Active networking is an exciting new paradigm in digital networking that has the potential to revolutionize the manner in which communication takes place. It is an emerging technology, one in which new ideas are constantly being formulated and new topics of research are springing up even as this book is being written. This technology is very likely to appeal to a broad spectrum of users from academia and industry. Therefore, this book was written in a way that enables all these groups to understand the impact of active networking in their sphere of interest. Information services managers, network administrators, and e-commerce developers would like to know the potential benefits of the new technology to their businesses, networks, and applications. The book introduces the basic active networking paradigm and its potential impacts on the future of information handling in general and on communications in particular. This is useful for forward-looking businesses that wish to actively participate in the development of active networks and ensure a head start in the integration of the technology in their future products, be they applications or networks. Areas in which active networking is likely to make significant impact are identified, and the reader is pointed to any related ongoing research efforts in the area. The book also provides a deeper insight into the active networking model for students and researchers, who seek challenging topics that define or extend frontiers of the technology. It describes basic components of the model, explains some of the terms used by the active networking community, and provides the reader with a taxonomy of the research being conducted at the time this book was written. Current efforts are classified based on typical research areas such as mobility, security, and management. The intent is to introduce the serious reader to the background regarding some of the models adopted by the community, to outline outstanding issues concerning active networking, and to provide a snapshot of the fast-changing landscape in active networking research. Management is a very important issue in active networks because of its open nature. The latter half of the book explains the architectural concepts of a model for managing active networks and the motivation for a reference model that addresses limitations of the current network management framework by leveraging the powerful features of active networking to develop an integrated framework. It also describes a novel application enabled by active network technology called the Active Virtual Network Management Prediction (AVNMP) algorithm. AVNMP is a proactive management system; in other words, it provides the ability to solve a potential problem before it impacts the system by modeling network devices within the network itself and running that model ahead of real time.
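The proactive-management idea behind AVNMP (run a model of the network ahead of real time and act on predicted problems before they occur) can be illustrated with a toy sketch. The Python below is only an illustration of that idea under invented names and thresholds; it is not the AVNMP algorithm described in the book.

# Toy illustration of proactive management: run a simple load model a few steps
# ahead of real time and flag a predicted threshold breach before it happens.
# This sketch is illustrative only; it is not the AVNMP algorithm itself.

def predict_load(current_load, growth_per_step, steps_ahead):
    """Trivial stand-in for a device model running ahead of real time."""
    return current_load + growth_per_step * steps_ahead

def proactive_check(current_load, growth_per_step, threshold, horizon):
    """Return the first future step (if any) at which the model predicts overload."""
    for step in range(1, horizon + 1):
        if predict_load(current_load, growth_per_step, step) > threshold:
            return step
    return None

step = proactive_check(current_load=60.0, growth_per_step=7.5, threshold=100.0, horizon=10)
if step is not None:
    print(f"Predicted overload {step} steps ahead; reconfigure before it occurs.")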
A highly useful resource for professionals and students alike, this cutting-edge, first-of-its-kind book provides a thorough introduction to nanoscale communication networks. Written in a clear tutorial style, this volume covers a wide range of the most important topics in the area, from molecular communication and carbon nanotube nano-networks, to nanoscale quantum networking and the future direction of nano networks. Moreover, the book features numerous exercise problems at the end of each chapter to ensure a solid understanding of the material.
This book bridges the divide between the fields of power systems engineering and computer communication through the new field of power system information theory. Written by an expert with vast experience in the field, this book explores the smart grid from generation to consumption, both as it is planned today and how it will evolve tomorrow. The book focuses upon what differentiates the smart grid from the "traditional" power grid as it has been known for the last century. Furthermore, the author provides the reader with a fundamental understanding of both power systems and communication networking. It shows the complexity and operational requirements of the evolving power grid, the so-called "smart grid," to the communication networking engineer; and similarly, it shows the complexity and operational requirements for communications to the power systems engineer. The book is divided into three parts. Part One discusses the basic operation of the electric power grid, covering fundamental knowledge that is assumed in Parts Two and Three. Part Two introduces communications and networking, which are critical enablers for the smart grid. It also considers how communication and networking will evolve as technology develops. This lays the foundation for Part Three, which utilizes communication within the power grid. Part Three draws heavily upon both the embedded intelligence within the power grid and current research, anticipating how and where computational intelligence will be implemented within the smart grid. Each part is divided into chapters and each chapter has a set of questions useful for exercising the readers' understanding of the material in that chapter. Key features: bridges the gap between power systems and communications experts; addresses the smart grid from generation to consumption, both as it is planned today and how it will likely evolve tomorrow; explores the smart grid from the perspective of traditional power systems as well as from communications; discusses power systems, communications, and machine learning, all of which define the smart grid; and introduces the new field of power system information theory.
In this commentary on Stokoe’s article, ‘Moving forward with membership categorization analysis’, I take up the challenge to apply her keys for MCA to an extract of conversation recorded in a restaurant. The strengths of conversation analysis have not included – and indeed have not attempted to achieve – successful engagement with beyond-the-immediate-talk aspects of culture and the commonsense workings of society. The aim of the article is to explore what MCA might add to an analysis of a stretch of talk using conversation analytic tools. It was found that a systematic application of the keys did indeed provide a richer account of what was going on. Whereas categories alone did not appear to provide more insights than commonsense can tell us, when the broader array of MCA tools and keys were applied, an enhanced analysis of the passage of talk emerged. An exploration of whether this can be extended as a method for a rigorous investigation of culture and society while still being grounded in participants’ mutual, moment-by-moment orientations to categories seems at the very least worth the serious attention of scholars interested in interaction.