Bruce Waller has defended a deductive reconstruction of the kinds of analogical arguments found in ethics, law, and metaphysics. This paper demonstrates the limits of such a reconstruction and argues for an alternative, non-deductive reconstruction. It will be shown that some analogical arguments do not fit Waller's deductive schema, and that such a schema does not allow for an adequate account of the strengths and weaknesses of an analogical argument. The similarities and differences between the account defended herein and Trudy Govier's account are discussed as well.
‘Particularism’ and ‘generalism’ refer to families of positions in the philosophy of moral reasoning, with the former playing down the importance of principles, rules or standards, and the latter stressing their importance. Part of the debate has taken an empirical turn, and this turn has implications for AI research and the philosophy of cognitive modeling. In this paper, Jonathan Dancy’s approach to particularism (arguably one of the best known and most radical approaches) is questioned both on logical and empirical grounds. Doubts are raised over whether Dancy’s brand of particularism can adequately explain the graded nature of similarity assessments in analogical arguments. Also, simple recurrent neural network models of moral case classification are presented and discussed. This is done to raise concerns about Dancy’s suggestion that neural networks can help us to understand how we could classify situations in a way that is compatible with his particularism. Throughout, the idea of a surveyable standard—one with restricted length and complexity—plays a key role. Analogical arguments are taken to involve multidimensional similarity assessments, and surveyable contributory standards are taken to be attempts to articulate the dimensions of similarity that may exist between cases. This work will be of relevance both to those who have interests in computationally modeling human moral cognition and to those who are interested in how such models may or may not improve our philosophical understanding of such cognition.
Work on analogy has been done from a number of disciplinary perspectives throughout the history of Western thought. This work is a multidisciplinary guide to theorizing about analogy. It contains 1,406 references, primarily to journal articles and monographs, and primarily to English language material. Classical through to contemporary sources are included. The work is classified into eight different sections (with a number of subsections). A brief introduction to each section is provided. Keywords and key expressions of importance to research on analogy are discussed in the introductory material. Electronic resources for conducting research on analogy are listed as well.
David Bohm's interpretation of quantum mechanics yields a quantum potential, Q. In his early work, the effects of Q are understood in causal terms as acting through a real (quantum) field which pushes particles around. In his later work (with Basil Hiley), the causal understanding of Q appears to have been abandoned. The purpose of this paper is to understand how the use of certain metaphors leads Bohm away from a causal treatment of Q, and to evaluate the use of those metaphors.
This paper presents the results of training an artificial neural network (ANN) to classify moral situations. The ANN produces a similarity space in the process of solving its classification problem. The state space is subjected to analysis that suggests that holistic approaches to interpreting its functioning are problematic. The idea of a contributory or pro tanto standard, as discussed in debates between moral particularists and generalists, is used to understand the structure of the similarity space generated by the ANN. A spectrum of possibilities for reasons, from atomistic to holistic, is discussed. Reasons are understood as increasing in nonlocality as they move away from atomism. It is argued that contributory standards could be used to understand forms of nonlocality that need not go all the way to holism. It is also argued that contributory standards may help us to understand the kind of similarity at work in analogical reasoning and argument in ethics. Some objections to using state space approaches to similarity are dealt with, as are objections to using empirical and computational work in philosophy.
I argue that a basic similarity analysis of analogical reasoning handles many apparent cases of visual analogy. I consider how the visual and verbal elements interact in analogical cases. Finally, I offer two analyses of visual elements. One analysis is evidential: the visual elements are evidence for their verbal counterparts. One is non-evidential: the visual elements link to verbal elements without providing evidence for those elements. The result is to make more room for the logical analysis of visual argumentation.
A simple recurrent artificial neural network is used to classify situations as permissible or impermissible. The trained ANN can be understood as having set up a similarity space of cases at the level of its internal or hidden units. An analysis of the network’s internal representations is undertaken using a new visualization technique for state space approaches to understanding similarity. Insights from the literature on moral philosophy pertaining to contributory standards will be used to interpret the state space set up by the ANN as being structured by implicit reasons. The ANN, on its own, is not capable of explicitly representing or offering reasons to itself or others. That said, the low level similarity space set up by the network could be made available to higher order processes that exploit it for case-based reasoning. It is argued that for normative purposes, similarity could be seen as a contributor to procedural coherence in case-based reasoning and local forms of substantive coherence, but not to global forms of coherence given the computational complexity of managing those more ambitious forms of coherence.
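The core analytic move in the ANN work above is to treat the hidden-unit activations produced for each case as points in a state space and to measure how close cases lie to one another. A minimal sketch of that kind of analysis follows; the case labels and activation vectors are illustrative stand-ins invented for this sketch, not activations from the actual trained network:

```python
import numpy as np

# Hypothetical hidden-unit activation vectors for four moral cases,
# as a trained classifier might produce (values are made up for
# illustration, not drawn from any actual model).
cases = {
    "breaking a promise for fun":        np.array([0.9, 0.1, 0.8]),
    "breaking a promise to save a life": np.array([0.2, 0.9, 0.3]),
    "lying for fun":                     np.array([0.8, 0.2, 0.9]),
    "telling a hard truth":              np.array([0.1, 0.8, 0.2]),
}

def cosine(u, v):
    """Cosine similarity: an angle-based measure commonly used to
    compare points in a network's hidden-unit state space."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Pairwise similarity over the case set: the "similarity space".
names = list(cases)
sim = {(a, b): cosine(cases[a], cases[b]) for a in names for b in names}

# Cases the (hypothetical) net treats alike sit close together,
# which is what a higher-order case-based reasoner could exploit.
for a in names:
    nearest = max((b for b in names if b != a), key=lambda b: sim[(a, b)])
    print(f"{a!r} is nearest to {nearest!r}")
```

With these illustrative vectors, the two "for fun" cases cluster together, which is the kind of structure the contributory-standard interpretation is meant to explain.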
This paper identifies a type of multi-source (case-based) reasoning and differentiates it from other types of analogical reasoning. Work in cognitive science on mental space mapping or conceptual blending is used to better understand this type of reasoning. The type of argument featured herein will be shown to be a kind of source-blended argument. While it possesses some similarities to traditionally conceived analogical arguments, there are important differences as well. The triple contract (a key development in the usury debates of the fifteenth and sixteenth centuries) will be shown to make use of source-blended arguments.
In "Representations without Rules, Connectionism and the Syntactic Argument", Kenneth Aizawa argues against the view that connectionist nets can be understood as processing representations without the use of representation-level rules, and he provides a positive characterization of how to interpret connectionist nets as following representation-level rules. He takes Terry Horgan and John Tienson to be the targets of his critique. The present paper marshals functional and methodological considerations, gleaned from the practice of cognitive modelling, to argue against Aizawa's characterization of how connectionist nets may be understood as making use of representation-level rules.
Terence Horgan and John Tienson claim that folk psychological laws are different in kind from basic physical laws in at least two ways: first, physical laws do not possess the kind of ceteris paribus qualifications possessed by folk psychological laws, which means the two types of laws have different logical forms; and second, applied physical laws are best thought of as being about an idealized world and folk psychological laws about the actual world. I argue that Horgan and Tienson have not made a persuasive case for either of the preceding views.
Theories of moral, and more generally, practical reasoning sometimes draw on the notion of coherence. Admirably, Paul Thagard has attempted to give a computationally detailed account of the kind of coherence involved in practical reasoning, claiming that it will help overcome problems in foundationalist approaches to ethics. The arguments herein rebut the alleged role of coherence in practical reasoning endorsed by Thagard. While there are some general lessons to be learned from the preceding, no attempt is made to argue against all forms of coherence in all contexts. Nor is the usefulness of computational modelling called into question. The point will be that coherence cannot be as useful in understanding moral reasoning as coherentists may think. This result has clear implications for the future of Machine Ethics, a newly emerging subfield of AI.
This dissertation is a work in the philosophical foundations of cognitive modelling. To a significant extent, it is presented as a response to a critique of connectionist modelling originated by Jerry Fodor and Zenon Pylyshyn. The essence of the critique is that either connectionist models implement classical models of cognition, or, if connectionist models are not implementational, then they are incapable of modelling cognition. I argue that, barring an implausible interpretation of "implementation," there exists a subset of connectionist models which cannot be implementations of classical models, and while no connectionist researcher has met all the demands Fodor and Pylyshyn placed on adequate models of cognition, the properties possessed by some non-implementational connectionist models provide reason to think that these demands can be met in principle, and that connectionist models are capable of kinds of processing which classical models are not capable of, processing which makes connectionist models worth developing in spite of some of their present limitations. Many have responded to the Fodor and Pylyshyn critique. However, the responses have not paid sufficient attention to key notions such as implementation and rule. One of the principal contributions of this dissertation is an analysis of the notion of implementation. The notion of a tacit rule is also defined. Many have suggested that the difference between connectionist and classical models of cognition is that connectionist nets do not have explicit rules. I argue that this is incorrect; I also argue that there is a certain kind of rule which connectionist nets cannot be understood as implementing: a tacit rule. Chapters one through four respond to the Fodor and Pylyshyn argument against connectionism. Chapter five takes up a critique of connectionist models made by Andy Clark. Chapter six responds to an argument by Terrence Horgan and John Tienson which aims to show that folk psychological laws make use of a type of ceteris paribus clause not possessed by physical laws. Horgan and Tienson are shown not to have made a persuasive case.
This paper responds to criticisms levelled by Fodor, Pylyshyn, and McLaughlin against connectionism. Specifically, I will rebut the charge that connectionists cannot account for representational systematicity without implementing a classical architecture. This will be accomplished by drawing on Paul Smolensky's Tensor Product model of representation and on his insights about split-level architectures.
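The Tensor Product model invoked above represents structured content by binding filler vectors to role vectors with an outer (tensor) product and superimposing the bindings. A minimal sketch follows; the particular roles, fillers, and one-hot vectors are illustrative choices made for this sketch, not Smolensky's own examples:

```python
import numpy as np

# Illustrative role and filler vectors (one-hot codes chosen for
# clarity; orthonormal roles make exact unbinding possible).
roles = {"agent":   np.array([1.0, 0.0]),
         "patient": np.array([0.0, 1.0])}
fillers = {"John": np.array([1.0, 0.0, 0.0]),
           "Mary": np.array([0.0, 1.0, 0.0])}

def encode(bindings):
    """Tensor product representation: each role/filler binding is the
    outer product of the two vectors; the structure is their sum."""
    return sum(np.outer(roles[r], fillers[f]) for r, f in bindings)

def unbind(rep, role):
    """Recover the filler bound to a role by contracting the
    representation with the role vector (exact when roles are
    orthonormal, approximate otherwise)."""
    return roles[role] @ rep

# A simple two-role structure: John as agent, Mary as patient.
rep = encode([("agent", "John"), ("patient", "Mary")])
print(unbind(rep, "agent"))   # recovers John's filler vector
```

Because distinct bindings are superimposed in one vector space rather than stored at distinct symbolic addresses, this is the sort of split-level scheme Smolensky offers as systematic representation without a classical implementation.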
One form of analogical argument proceeds by comparing a disputed case with an agreed upon case to try to resolve the dispute. There is a variation on the preceding form of argument not yet identified in the theoretical literature. This variation involves multiple sources, and it requires that the sources be combined or blended for the argument to work. Arguments supporting the Triple Contract are shown to possess this structure.