In this century the major insight in the field of moral philosophy has been that moral arguments need not proceed by way of the deduction of moral conclusions from non-moral premises. This realisation sprang from a recognition that the purpose of moral argument was not just to get one party to a moral disagreement to assent to a proposition that at the outset of the discussion he denied. For a moral argument to be considered successful, it was insufficient for someone to recognise that an action he had previously considered right was wrong; it was essential that this recognition have an influence on his subsequent conduct. The change in belief was important only in so far as it led to a change in action. And although this insight led people to unduly diminish the importance of belief and to propose various types of non-cognitive theories of ethics, it is none the less true that the acceptance of a proposition of the form ‘action X is wrong’ must have an impact of some kind on the behaviour of those who accept it.
This paper is an introduction to a virtual special issue of AI and Law exploring the legacy of the influential HYPO system of Rissland and Ashley. The papers included are: Arguments and cases: An inevitable intertwining, BankXX: Supporting legal arguments through heuristic retrieval, Modelling reasoning with precedents in a formal dialogue game, A note on dimensions and factors, An empirical investigation of reasoning with legal cases through theory construction and application, Automatically classifying case texts and predicting outcomes, A factor-based definition of precedential constraint and An improved factor based approach to precedential constraint. After describing HYPO, in this introduction to the special issue I look at various aspects of its influence on AI and Law: the developments led by Rissland at Amherst; the developments led by Ashley in Pittsburgh; the expression of these ideas in terms of rule based systems, and their subsequent formalisation; value based theories, which were inspired by a critique of HYPO; and contemporary approaches which revive the idea of dimensions.
Modelling reasoning with legal cases has been a central concern of AI and Law since the 1980s. The approach which represents cases as factors and dimensions has been a central part of that work. In this paper I consider how several varieties of the approach can be applied to the interesting case of Popov v Hayashi. After briefly reviewing some of the key landmarks of the approach, the case is represented in terms of factors and dimensions, and further explored using theory construction and argumentation schemes approaches.
This paper discusses some engineering considerations that should be taken into account when building a knowledge based system, and recommends isomorphism, the well defined correspondence of the knowledge base to the source texts, as a basic principle of system construction in the legal domain. Isomorphism, as it has been used in the field of legal knowledge based systems, is characterised and the benefits which stem from its use are described. Some objections to and limitations of the approach are discussed. The paper concludes with a case study giving a detailed example of the use of the isomorphic approach in a particular application.
In this paper I recapitulate the ideas of Berman and Hafner (1993) regarding the role of teleology in legal argument. I show how these ideas can be used to address some issues arising from more recent work on legal argument, and how this relates to ideas associated with the New Rhetoric of Perelman. I illustrate the points with a discussion of the classic problem of which vehicles should be allowed in parks.
In this paper we describe a method for the specification of computational models of argument using dialogue games. The method, which consists of supplying a set of semantic definitions for the performatives making up the game, together with a state transition diagram, is described in full. Its use is illustrated by some examples of varying complexity, including two complete specifications of particular dialogue games, Mackenzie's DC, and the authors' own TDG. The latter is also illustrated by a fully worked example illustrating all the features of the game.
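The combination of performative semantics with a state transition diagram can be sketched as a transition table plus a legality check. The states and performatives below are invented for illustration; they are not the actual rules of DC or TDG.

```python
# Toy state-transition skeleton of a dialogue game specification.
# States and performatives here are hypothetical, not those of DC or TDG.

transitions = {
    ("open",       "claim"):   "claimed",
    ("claimed",    "why"):     "challenged",
    ("claimed",    "concede"): "closed",
    ("challenged", "ground"):  "claimed",
    ("challenged", "retract"): "closed",
}

def run_dialogue(moves, state="open"):
    """Check a sequence of performatives against the transition diagram,
    returning the final state or raising on an illegal move."""
    for move in moves:
        try:
            state = transitions[(state, move)]
        except KeyError:
            raise ValueError(f"illegal move {move!r} in state {state!r}")
    return state

print(run_dialogue(["claim", "why", "ground", "concede"]))  # closed
```

The semantic definitions of the performatives (commitment-store effects and so on) would sit alongside this table; the diagram alone already rules out ill-formed exchanges, such as challenging a claim that was never made.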
In this introduction I give an overview of Carole Hafner’s work and discuss the papers in this volume. The final section offers some more personal reminiscences of Carole and her contribution to the AI and Law community, from myself and other colleagues.
During human-human interaction, emotion plays a vital role in structuring dialogue. Emotional content drives features such as topic shift, lexicalisation change and timing; it affects the delicate balance between goals related to the task at hand and those of social interaction; and it represents one type of feedback on the effect that utterances are having. These various facets are so central to most real-world interaction, that it is reasonable to suppose that emotion should also play an important role in human-computer interaction. To that end, techniques for detecting, modelling, and responding appropriately to emotion are explored, and an architecture for bringing these techniques together into a coherent system is presented.
Biologically motivated computing seeks to transfer ideas from the biosciences to computer science. In seeking to make transfers it is helpful to be able to appreciate the metaphors which people use. This is because metaphors provide the context through which analogies and similes are made and by which many scientific models are constructed. As such, it is important for any rapidly evolving domain of knowledge to have developments accounted for in these terms. This paper seeks to provide an overview of the process of modelling and shows how it can be used to account for a variety of biologically motivated computational models. Certain key ideas are identified in the subsequent analysis of biological sources, notably, systemic metaphors. Three important aspects of biological thinking are then considered in the light of computer science applications: biological organization, the cell, and models of evolution. The analysis throughout the paper is descriptive rather than formalized so that a large variety of potential applications may be considered.
This paper describes one way in which a precise reason model of precedent could be developed, based on the general idea that courts are constrained to reach a decision that is consistent with the assessment of the balance of reasons made in relevant earlier decisions. The account provided here has the additional advantage of showing how this reason model can be reconciled with the traditional idea that precedential constraint involves rules, as long as these rules are taken to be defeasible. The account presented is firmly based on a body of work that has emerged in AI and Law. This work is discussed, and there is a particular discussion of approaches based on theory construction, and how that work relates to the model described in this paper.
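The core a fortiori intuition behind reason models of precedential constraint can be shown in a few lines. This is a minimal sketch of that general idea, not the paper's own formalism, and the factor names are invented for illustration.

```python
# Minimal sketch of an a fortiori reading of precedential constraint.
# Cases are (pro-winner factors, pro-loser factors, outcome); the factor
# names below are hypothetical, not taken from any real case base.

def constrains(precedent, new_case):
    """A precedent constrains a new case to the same outcome when the new
    case is at least as strong for the winning side: it preserves all the
    precedent's factors for the winner (superset) and introduces no new
    factors for the loser (subset)."""
    p_pro, p_con, outcome = precedent
    n_pro, n_con = new_case
    if outcome != "plaintiff":          # normalise so p_pro favours winner
        p_pro, p_con = p_con, p_pro
        n_pro, n_con = n_con, n_pro
    return n_pro >= p_pro and n_con <= p_con

precedent = ({"bought_info"}, {"info_public"}, "plaintiff")
stronger  = ({"bought_info", "sole_developer"}, set())
weaker    = ({"bought_info"}, {"info_public", "reverse_engineerable"})

print(constrains(precedent, stronger))  # True: a fortiori for plaintiff
print(constrains(precedent, weaker))    # False: a new pro-defendant factor
```

Treating the precedent as the defeasible rule "p_pro, unless p_con, decide for the plaintiff" gives the reconciliation the abstract mentions: the rule applies outright in the stronger case, and is merely open to distinction, not violated, in the weaker one.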
The design and analysis of norms is a somewhat neglected topic in AI and Law, but this is not so in other areas of Computer Science. In recent years powerful techniques to model and analyse norms have been developed in the Multi-Agent Systems community, driven both by the practical need to regulate electronic institutions and open agent systems, and by a theoretical interest in mechanism design and normative systems. Agent based techniques often rely heavily on enforcing norms using the software to prevent violation, but I will also discuss the use of sanctions and rewards, and the conditions under which compliance by autonomous agents can be expected or encouraged without sanctions or rewards. In the course of the paper a suggested framework for the exploration of these issues is developed.
In this paper I argue that to explain and resolve some kinds of disagreement we need to go beyond what logic alone can provide. In particular, following Perelman, I argue that we need to consider how arguments are ascribed different strengths by different audiences, according to how accepting these arguments promotes values favoured by the audience to which they are addressed. I show how we can extend the standard framework for modelling argumentation systems to allow different audiences to be represented. I also show how this formalism can explain how some disputes can be resolved while in others the parties can only agree to differ. I illustrate this by consideration of a legal example. Finally, I make some suggestions as to where these values come from, and how they can be used to explain differences across jurisdictions, and changes in views over time.
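The audience-relative extension of abstract argumentation can be sketched concretely: an attack only succeeds as a defeat for an audience if that audience does not rank the attacked argument's value above the attacker's. The arguments, values, and audiences below are invented examples, not the paper's legal case.

```python
# Sketch of value-based argumentation with audiences. Arguments promote
# values; an audience is a strict preference over values. All names here
# are hypothetical illustrations.

def defeats(attacks, val, prefers, a, b):
    """a defeats b for an audience unless the audience ranks the value
    promoted by b strictly above the value promoted by a."""
    return (a, b) in attacks and (val[b], val[a]) not in prefers

def grounded(args, attacks, val, prefers):
    """Grounded extension of the audience-specific defeat relation:
    repeatedly accept any argument each of whose defeaters is itself
    defeated by an already-accepted argument."""
    accepted = set()
    changed = True
    while changed:
        changed = False
        for x in args:
            if x in accepted:
                continue
            defeaters = [a for a in args if defeats(attacks, val, prefers, a, x)]
            if all(any(defeats(attacks, val, prefers, d, a) for d in accepted)
                   for a in defeaters):
                accepted.add(x)
                changed = True
    return accepted

args = {"A", "B"}
attacks = {("A", "B"), ("B", "A")}          # mutual attack
val = {"A": "life", "B": "property"}

print(grounded(args, attacks, val, {("life", "property")}))   # {'A'}
print(grounded(args, attacks, val, {("property", "life")}))   # {'B'}
```

The two print lines show the "agree to differ" case: under a plain attack relation the mutual attack leaves both arguments undecided, but each audience resolves it in its own favour, rationally and without either being mistaken.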
There is a growing interest in how people conceptualise the legal domain for the purpose of legal knowledge systems. In this paper we discuss four such conceptualisations (referred to as ontologies): McCarty's language for legal discourse, Stamper's norma formalism, Valente's functional ontology of law, and the ontology of Van Kralingen and Visser. We present criteria for a comparison of the ontologies and discuss the strengths and weaknesses of the ontologies in relation to these criteria. Moreover, we critically review the criteria.
Norms provide a valuable mechanism for establishing coherent cooperative behaviour in decentralised systems in which there is no central authority. One of the most influential formulations of norm emergence was proposed by Axelrod (Am Polit Sci Rev 80:1095–1111, 1986). This paper provides an empirical analysis of aspects of Axelrod’s approach, by exploring some of the key assumptions made in previous evaluations of the model. We explore the dynamics of norm emergence and the occurrence of norm collapse when applying the model over extended durations. It is this phenomenon of norm collapse that can motivate the emergence of a central authority to enforce laws and so preserve the norms, rather than relying on individuals to punish defection. Our findings identify characteristics that significantly influence norm establishment using Axelrod’s formulation, but are likely to be of importance for norm establishment more generally. Moreover, Axelrod’s model suffers from significant limitations in assuming that private strategies of individuals are available to others, and that agents are omniscient in being aware of all norm violations and punishments. Because this is an unreasonable expectation, the approach does not lend itself to modelling real-world systems such as online networks or electronic markets. In response, the paper proposes alternatives to Axelrod’s model, by replacing the evolutionary approach, enabling agents to learn, and by restricting the metapunishment of agents to cases where the original defection is observed, in order to be able to apply the model to real-world domains. This work can also help explain the formation of a “social contract” to legitimate enforcement by a central authority.
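A single round of the norms game under discussion can be sketched as follows. The payoff values (T=3, H=-1, P=-9, E=-2) follow Axelrod's 1986 formulation; the population setup and round structure here are a simplified illustration, with metapunishment and the evolutionary strategy update omitted.

```python
import random

# One simplified round of Axelrod's norms game. Each agent has a boldness
# (propensity to defect) and a vengefulness (propensity to punish).
# Payoffs follow Axelrod (1986); the setup is otherwise illustrative.

T, H, P, E = 3, -1, -9, -2   # temptation, hurt, punishment, enforcement cost

def play_round(agents, rng):
    """agents: list of dicts with 'boldness' and 'vengefulness' in [0, 1].
    Each agent gets one opportunity to defect, seen with probability s."""
    scores = [0.0] * len(agents)
    for i, agent in enumerate(agents):
        s = rng.random()                    # chance any observer sees it
        if agent["boldness"] > s:           # defect when unlikely to be seen
            scores[i] += T
            for j, observer in enumerate(agents):
                if j == i:
                    continue
                scores[j] += H              # every other agent is hurt
                if rng.random() < s and rng.random() < observer["vengefulness"]:
                    scores[i] += P          # defector is punished
                    scores[j] += E          # punishing is costly
    return scores

rng = random.Random(42)
agents = [{"boldness": rng.random(), "vengefulness": rng.random()}
          for _ in range(20)]
print(play_round(agents, rng))
```

Because punishing costs the punisher (E) while its benefit is diffuse, vengefulness decays over long runs and norms collapse, which is exactly the dynamic the paper exploits when motivating centralised enforcement; Axelrod's remedy, metapunishing non-punishers, is what the paper restricts to observed defections.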
In this text, the reader will find a well focused, clearly written, and concise review of major themes in the philosophy of Edmund Husserl. This work could well serve the beginning student to focus on the major problems in Husserlian thought. Fuchs argues that Husserl’s phenomenology is in conformity with and an outgrowth of the traditional orientation of Western philosophy called the metaphysics of presence. In separate discussions of evidence, temporality, and intersubjectivity, the author attempts to demonstrate both that Husserl is tied to this traditional metaphysical doctrine which grants primacy to presence over absence, and that he nonetheless laid the foundation for the overcoming of this metaphysic. Throughout, Husserl is portrayed as wrestling with the juxtaposition of presence and absence; and yet on each point, presence is said to remain of primary constitutive status: evidence is seen as "absolute presence," the presence of impression plays a stronger constitutive role than the absences of past and future, and knowledge of other human beings is secured on the ground of presence. In short, the author claims that Husserl provided the foundation for the overcoming of the prejudice of the metaphysics of presence, and yet did not succeed in fully extricating himself from the confusions of this tradition.
This book deserves the attention of philosophers of religion. Tracy presents a monumental synthesis of philosophy and history within the context of a "revisionist" theological model. Part I attempts adequately to articulate a method of inquiry by outlining the sets of evaluative criteria, the uses of evidence, and the place of the various philosophical and historical methods within this model. Not only must the method be responsive to the historical tradition, but it also must heed the non-Christian scrutiny of what Van A. Harvey calls the "morality of scientific knowledge," which demands a critical posture toward beliefs and tradition. Tracy’s model involves three steps: 1) phenomenology of the "religious dimension" of our "common experience and language," 2) an "historical and hermeneutical investigation" of the Christian tradition, and 3) "transcendental or metaphysical reflection" in order both to determine the "truth-status" of the previous steps and to effect a critical correlation of philosophy and hermeneutics.
This is the collection of essays presented to Bochenski on his 60th birthday, and it contains, as a mirror of Bochenski's own work, a broad spectrum of studies ranging from formal logic and history of logic, to the philosophy of logic and language, and to the methodology of explanation in Greek philosophy. Of the seventeen articles, these are some of the more important to the reviewer: "Betrachtungen zum Sequenzen Kalkül" by Paul Bernays, which is an extensive study of Gentzen-type formulations of logic; "Remarks on Formal Deduction," H. B. Curry, a further discussion of sequenzen-logics; "Marginalia on Gentzen's Sequenzen Kalkül" by Hughes Leblanc; "Method and Logic in Presocratic Explanation," Jerry Stannard; "On the Logic of Preference and Choice," H. S. Houthakker, a suggestive presentation of decision and utility theory in logical form; "Leibniz's Law in Belief Contexts," Chisholm; "On Ontology and the Province of Logic," R. M. Martin; and "N. A. Vasilev and the Development of Many-valued Logics," G. L. Kline, an important addition to the history of logic. Other contributors are: Storrs McCall, Albert Menne, E. W. Beth, Benson Mates, Ivo Thomas, J. F. Staal, F. R. Barbò, A.-T. Tymieniecka, and N. M. Luyten. There is a bibliography of Bochenski's writings through 1962.—P. J. M.
OBJECTIVE: To report and analyse the pattern of end-of-life decision making for terminal Chinese cancer patients. DESIGN: Retrospective descriptive study. SETTING: A cancer clinical trials unit in a large teaching hospital. PATIENTS: From April 1992 to August 1997, 177 consecutive deaths of cancer clinical trial patients were studied. MAIN MEASUREMENT: Basic demographic data, patient status at the time of signing a DNR consent, or at the moment of returning home to die, were documented, and the circumstances surrounding these events evaluated. RESULTS: DNR orders were written for 64.4% of patients. Being in pain (odds ratio 0.45, 95% CI 0.22-0.89), especially if requiring opioid analgesia (odds ratio 0.40, 95% CI 0.21-0.77), was associated with a higher probability of such an order. Thirty-five patients were taken home to die, a more likely occurrence if the patient was over 75 years of age (odds ratio 0.12, 95% CI 0.04-0.34), had children (odds ratio 0.14, 95% CI 0.02-0.79), had Taiwanese as a first language (odds ratio 6.74, 95% CI 3.04-14.93), or was unable to take anything orally (odds ratio 2.73, 95% CI 1.26-5.92). CPR was performed in 30 patients; none survived to discharge. CONCLUSIONS: DNR orders are instituted in a large proportion of dying Chinese cancer patients in a cancer centre; however, the order is seldom signed by the patient personally. This study also illustrates that as many as 20% of dying patients are taken home to die, in accordance with local custom.
In a reflective and richly entertaining piece from 1979, Doug Hofstadter playfully imagined a conversation between ‘Achilles’ and an anthill (the eponymous ‘Aunt Hillary’), in which he famously explored many ideas and themes related to cognition and consciousness. For Hofstadter, the anthill is able to carry on a conversation because the ants that compose it play roughly the same role that neurons play in human languaging; unfortunately, Hofstadter’s work is notably short on detail suggesting how this magic might be achieved. Conversely in this paper, finally reifying Hofstadter’s imagination, we demonstrate how populations of simple ant-like creatures can be organised to solve complex problems; problems that involve the use of forward planning and strategy. Specifically we will demonstrate that populations of such creatures can be configured to play a strategically strong, though tactically weak, game of HeX (a complex strategic game). We subsequently demonstrate how tactical play can be improved by introducing a form of forward planning instantiated via multiple populations of agents; a technique that can be compared to the dynamics of interacting populations of social insects via the concept of meta-population. In this way although, pace Hofstadter, we do not establish that a meta-population of ants could actually hold a conversation with Achilles, we do successfully introduce Aunt Hillary to the complex, seductive charms of HeX.
We give a historical overview of the development of almost 50 years of empirical research on affordances, past and present. Defined by James Jerome Gibson in the early development of the Ecological Approach to Perception and Action as the prime of perception and action, affordances have become a rich topic of investigation in the fields of human movement science and experimental psychology. The methodological origins of the empirical research performed on affordances can be traced back to the mid-1980s and the works of Warren (1984, 1988) and Michaels (1988). Most of the research in Ecological Psychology performed since has focused on the actualization of discretely defined actions, the perception of action boundaries, the calculation of pi-numbers, and the measurement of response times. The research efforts have resulted in advancements in the understanding of the dynamic nature of affordances, affordances in a social context and the importance of calibration for perception of affordances. Although affordances are seen as an instrumental part of the control of action, most studies investigating affordances do not pay attention to the control of the action. We conclude that affordances are still primarily treated as a utility to select behaviour, which creates a conceptual barrier that hinders deeper understanding of affordances. A focus on action boundaries has largely prevented advancement in other aspects of affordances, most notably an integrative understanding of the role of affordances in the control of action.
This study investigated the differences between past and future temporal discounting in terms of neural activity in relation to temporal distance. Results show that brain regions are engaged differently in past and future temporal discounting. This is likely because past temporal discounting requires memory reconstruction, whereas future temporal discounting requires the processing of uncertainty about the future. In past temporal discounting, neural activity differed only when preferences were made between rewards received one hour prior and rewards received further in the past. The peak amplitudes of P2 and P3 varied as the temporal distance increased from 2 weeks to 50 years. In future temporal discounting, neural activity differed only when preferences were evaluated between two delayed rewards. The delay conditions had a significant influence on P2 and N2. Findings indicate the existence of different decision-making systems operating in past and future temporal discounting.
Prospective follow-up studies have shown that even though some children outgrow the disorder, a childhood diagnosis of attention deficit hyperactivity disorder is clearly a risk factor for a broad range of adverse outcomes, with extremes including drug abuse and juvenile delinquency. This article considers the use of several spectrum concepts and some neuroethical issues. It provides a list of criterion symptoms with a threshold set for the number of symptoms required for categorical diagnoses of disorders. It gives a brief review of some brain imaging and pharmacological treatment studies of ADHD to set the stage for a consideration of brain-specific issues related to neuroethics. Studies using reaction time tasks of cognitive control, response inhibition, and conflict have identified interindividual variance in task performance as one of the most prominent aspects of cognitive deficits related to ADHD.