Nearly all students believe academic cheating is wrong, yet few students say they would report witnessed acts of cheating. To explain this apparent tension, the present research examined college students’ reasoning about whether to report plagiarism or other forms of cheating. Study 1 examined students’ conflicts when deciding whether to report cheating. Most students gave reasons against reporting a peer as well as reasons in favor of reporting. Study 2 provided experimental confirmation that the contextual factors referenced by Study 1 participants in fact influenced decisions about whether to report cheating. Overall, the findings indicate that students often decide against reporting peers’ acts of cheating, though not due to a lack of concern about integrity. Rather, students may refrain from reporting because of conflicting concerns, lack of information about school policy, and perceived better alternatives to reporting.
Norbert M. Samuelson is Harold and Jean Grossman Chair of Jewish Studies at Arizona State University. Trained in analytic philosophy, he has contributed to the professionalization of Jewish philosophy in America and to the field of religion and science.
The paper proposes a way for adherents of Fregean, structured propositions to designate propositions and other complex senses/concepts using a special kind of functor. I consider some formulations from Peacocke's works and highlight certain problems that arise as we try to quantify over propositional constituents while referring to propositions using "that"-clauses. With the functor notation, by contrast, we can quantify over senses/concepts with objectual, first-order quantifiers and speak without further ado about their involvement in propositions. The functor notation also turns out to come with an important kind of expressive strengthening, and is shown to be neutral on several controversial issues.
Two intellectual vices seem always to tempt us: arrogance and diffidence. Regarding the former, the world is permeated by dogmatism and table-thumping close-mindedness. From politics, to religion, to simple matters of taste, zealots and ideologues all too often define our disagreements, often making debate and dialogue completely intractable. But at the other extreme, given a world with so much pluralism and heated disagreement, intellectual apathy and a prevailing agnosticism can be simply all too alluring. So the need for intellectual humility, open-mindedness, and a careful, humble commitment to the truth is apparent. In this book, Dr Church and Dr Samuelson explicate a robust and vibrant account of the philosophy and science of this most valuable virtue, and they highlight how it can be best applied and personally developed.
Unmeasured confounding is one of the most important threats to the validity of observational studies. In this paper we scrutinize a recently proposed sensitivity analysis for unmeasured confounding. The analysis requires specification of two parameters, loosely defined as the maximal strength of association that an unmeasured confounder may have with the exposure and with the outcome, respectively. The E-value is defined as the strength of association that the confounder must have with the exposure and the outcome, to fully explain away an observed exposure-outcome association. We derive the feasible region of the sensitivity analysis parameters, and we show that the bounds produced by the sensitivity analysis are not always sharp. We finally establish a region in which the bounds are guaranteed to be sharp, and we discuss the implications of this sharp region for the interpretation of the E-value. We illustrate the theory with a real data example and a simulation.
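The E-value discussed in this abstract has a simple closed form in the standard formulation due to VanderWeele and Ding: for an observed risk ratio RR ≥ 1, E = RR + sqrt(RR·(RR − 1)). A minimal sketch of that computation (the function name is illustrative, not from the paper):

```python
import math

def e_value(rr):
    """E-value for an observed risk ratio rr.

    This is the minimum strength of association (on the risk-ratio scale)
    that an unmeasured confounder would need with both the exposure and
    the outcome to fully explain away the observed association.
    """
    if rr < 1:
        rr = 1 / rr  # for protective associations, invert first
    return rr + math.sqrt(rr * (rr - 1))

# Example: an observed risk ratio of 2.0
print(round(e_value(2.0), 2))  # 3.41
```

The paper's point is that the bounds implied by this sensitivity analysis are not always sharp; the formula above gives the reported E-value itself, not the sharp region the authors derive.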
I here argue that Ted Sider's indeterminacy argument against vagueness in quantifiers fails. Sider claims that vagueness entails precisifications, but holds that precisifications of quantifiers cannot be coherently described: they will either deliver the wrong logical form to quantified sentences, or involve a presupposition that contradicts the claim that the quantifier is vague. Assuming (as does Sider) that the “connectedness” of objects can be precisely defined, I present a counter-example to Sider's contention, consisting of a partial, implicit definition of the existential quantifier that in effect sets a given degree of connectedness among the putative parts of an object as a condition upon there being something (in the sense in question) with those parts. I then argue that such an implicit definition, taken together with an “auxiliary logic” (e.g., introduction and elimination rules), proves to function as a precisification in just the same way as paradigmatic precisifications of, e.g., “red”. I also argue that with a quantifier that is stipulated as maximally tolerant as to what mereological sums there are, precisifications can be given in the form of truth-conditions of quantified sentences, rather than by implicit definition.
Samuelson's text was first published in 1948, and it immediately became the authority for the principles of economics courses. The book continues to be the standard-bearer for principles courses, and this revision continues to be a clear, accurate, and interesting introduction to modern economics principles. Bill Nordhaus is now the primary author of this text, and he has revised the book to be as current and relevant as ever.
This paper explores the influence of operationalism and its corollary, descriptivism, on Paul Samuelson's revealed preference theory as it developed between 1937 and 1948. Samuelson urged the disencumbering of metaphysics from economic theory. As an illustration, he showed how utility could be operationally redefined as revealed preference, and, furthermore, how from hypotheses such as maximizing behavior, operationally meaningful theorems could be deduced, thereby satisfying his demand for a scientific, empirical approach toward consumer behavior theory. In this paper I discuss the ensuing debate during the 1950s and 1960s on Samuelson's operationalism that raised doubts about its efficacy. In addition, I argue that certain concepts (revealed preference, equilibrium) and theorems (e.g., weak and strong axioms) that are supposedly operational in revealed preference theory lack operational meaning, notwithstanding their mathematical implications. Finally, I suggest that, although Samuelson's methodological rhetoric did not correspond with his implicit aprioristic theorizing, he possibly thought that his methodology and theorizing would converge in the long run.
Emotions are foremost self-regulating processes that permit rapid responses and adaptations to situations of personal concern. They have biological bases and are shaped ontogenetically via learning and experience. Many situations and events of personal concern are social in nature. Thus, social exchanges play an important role in learning about rules and norms that shape regulation processes. I argue that (a) emotions often are actively auto-regulating—the behavior implied by the emotional reaction bias to the eliciting event or situation modifies or terminates the situation; (b) certain emotion components are likely to habituate dynamically, modifying the emotional states; (c) emotions are typically intra- and interpersonal processes at the same time, and modulating forces at these different levels interact; (d) emotions are not just regulated—they regulate. Important conclusions of my arguments are that the scientific analysis of emotion should not exclude regulatory processes, and that effortful emotion regulation should be seen relative to a backdrop of auto-regulation and habituation, and not the ideal notion of a neutral baseline. For all practical purposes unregulated emotion is not a realistic concept.
I show that the act-type theories of Soames and Hanks entail that every sentence with alternative analyses (including every atomic sentence with a polyadic predicate) is ambiguous, many of them massively so. I assume that act types directed toward distinct objects are themselves distinct, plus some standard semantic axioms, and infer that act-type theorists are committed to saying that ‘Mary loves John’ expresses both the act type of predicating [loving John] of Mary and that of predicating [being loved by Mary] of John. Since the two properties are distinct, so are the act types. Hence, the sentence expresses two propositions. I also discuss a non-standard “pluralist” act-type theory, as well as some retreat positions, which all come with considerable problems. Finally, I extrapolate to a general constraint on theories of structured propositions, and find that Jeffrey King’s theory has the same unacceptable consequence as the act-type theory.
A key feature of facial behavior is its dynamic quality. However, most previous research has been limited to the use of static images of prototypical expressive patterns. This article explores the role of facial dynamics in the perception of emotions, reviewing relevant empirical evidence demonstrating that dynamic information improves coherence in the identification of affect (particularly for degraded and subtle stimuli), leads to higher emotion judgments (i.e., intensity and arousal), and helps to differentiate between genuine and fake expressions. The findings underline that using static expressions not only poses problems of ecological validity, but also limits our understanding of what facial activity does. Implications for future research on facial activity, particularly for social neuroscience and affective computing, are discussed.
Semantic dispositionalism is roughly the view that meaning a certain thing by a word, or possessing a certain concept, consists in being disposed to do something, e.g., infer a certain way. Its main problem is that it seems to have so many and disparate exceptions. People can fail to infer as required due to lack of logical acumen, intoxication, confusion, deviant theories, neural malfunctioning, and so on. I present a theory stating possession conditions of concepts that are counterfactuals, rather than disposition attributions, but which is otherwise similar to inferentialist versions of dispositionalism. I argue that it can handle all the exceptions discussed in the literature without recourse to ceteris paribus clauses. Psychological exceptions are handled by suitably undemanding requirements (unlike that of giving the sum of any two numbers) and by setting the following two preconditions upon someone’s making the inference: that she considers the inference and has no motivating reason against it. The non-psychological exceptions, i.e., cases of neural malfunctioning, are handled by requiring that the counterfactuals be true sufficiently often during the relevant interval. I argue that this accommodates some important intuitions about concept possession, in particular, the intuition that concept possession is vague along a certain dimension.
Under normal circumstances, we experience that our center of awareness is located behind our eyes and inside our own body. To learn more about the perceptual processes that underlie this tight coupling between the spatial dimensions of our consciously perceived self and our physical body, we conducted a series of experiments using an ‘out-of-body illusion’. In this illusion, the conscious sense of self is displaced in the testing room by experimental manipulation of the congruency of visual and tactile information and a change in the visual perspective. We demonstrate that when healthy individuals experience that they are located in a different place from their real body, they disown this body and no longer perceive it as part of themselves. Our findings are important because they reveal a relationship between the representation of self-location in the local environment and the multisensory representation of one’s own body.
This paper examines the importance of aspirations as reference points in a multi-period decision-making context. After stating their personal aspiration level, 172 individuals made six sequential decisions among risky prospects as part of a choice experiment. The results show that individuals make different risky choices in a multi-period compared to a single-period setting. In particular, individuals’ aspiration level is their main reference point during the early stages of decision-making, while their starting status (wealth level at the start of the experiment) becomes the central reference point during the later stages of their multi-period decision-making.
I here defend a theory consisting of four claims about ‘property’ and properties, and argue that they form a coherent whole that can solve various serious problems. The claims are (1): ‘property’ is defined by the principles (PR): ‘F-ness/Being F/etc. is a property of x iff F’ and (PA): ‘F-ness/Being F/etc. is a property’; (2) the function of ‘property’ is to increase the expressive power of English, roughly by mimicking quantification into predicate position; (3) property talk should be understood at face value: apparent commitments are real and our apparently literal use of ‘property’ is really literal; (4) there are no properties. In virtue of (1)–(2), this is a deflationist theory and in virtue of (3)–(4), it is an error theory. (1) is fleshed out as a claim about understanding conditions, and it is argued at length, and by going through a number of examples, that it satisfies a crucial constraint on meaning claims: all facts about ‘property’ can be explained, together with auxiliary facts, on its basis. Once claim (1) has been expanded upon, I argue that the combination of (1)–(3) provides the means for handling several problems: they help give a happy-face solution to what I call the paradox of abstraction, they form part of a plausible account of the correctness of committive sentences, and, most importantly, they help respond to various indispensability arguments against nominalism.
Michael Fishbane is Nathan Cummings Distinguished Service Professor of Jewish Studies at the University of Chicago Divinity School. Trained in biblical studies, he also writes constructive hermeneutic theology.
It is argued that although George Bealer's influential ‘Self-Consciousness argument’ refutes standard versions of reductive functionalism (RF), it fails to generalize in the way Bealer supposes. To wit, he presupposes that any version of RF must take the content of ‘pain’ to be the property of being in pain (and so on), which is expressly rejected in independently motivated versions of conceptual role semantics (CRS). Accordingly, there are independently motivated versions of RF, incorporating CRS, which avoid Bealer's main type of refutation. I focus particularly on one such theory, which takes concepts to be event types that are individuated by their psychological roles, and which has the resources to respond to each of the more specific worries Bealer expresses.
Based on metatheoretical considerations, this article discusses what kinds of traffic forecasts are possible and what kinds are impossible to make with any reasonable degree of accuracy. It will be argued on ontological and epistemological grounds that it is inherently impossible to make exact predictions about the magnitude of the ‘general’ traffic growth 20–30 years ahead, since many of the influencing factors depend on inherently unpredictable geopolitical trajectories as well as contested political decision-making. Due to the context-dependency of each particular planning situation, it is also hardly possible to make exact, quantitative predictions about the impact of implementing a specific infrastructure project, compared to ‘doing nothing’. Instead of relying on traffic model simulations as the general forecasting and assessment tool in transport planning, we propose to separate the so-called strategic, tactical and operational levels of traffic forecasting into three distinct methodological approaches reflecting the different degrees of openness/closure of the systems at hand: scenario analyses at the strategic level; theory-informed, mainly qualitative analyses supplemented with simple calculations at the tactical level; while more traditional micro-simulations should be applied only at a detailed operational level.
In the second half of the twentieth century, humanism—namely, the worldview that underpinned Western thought for several centuries—has been severely critiqued by philosophers who highlighted its theoretical and ethical limitations. Inspired by the emergence of cybernetics and new technologies such as robotics, prosthetics, communications, artificial intelligence, genetic engineering, and nanotechnology, there has been a desire to articulate a new worldview that will fit the posthuman condition. Posthumanism is a description of a new form of human existence in which the boundaries between humans and nature and humans and machines are blurred, as well as a prescription for an ideal situation in which the limitations of human biology are transcended, replaced by machines. The transition from the human condition to the posthuman condition will be facilitated by transhumanism, the project of human enhancement that will ultimately yield the transformation of the human species from the human to the posthuman. As an intellectual movement, transhumanism is still very small, but transhumanist ideas exert deep and broad influence on contemporary culture and society. This essay highlights the religious dimension of transhumanism and argues that it should be seen as a secularist faith: transhumanism secularizes traditional religious themes, concerns, and goals, while endowing technology with religious significance. Science‐Religion Studies is the most appropriate context to explore the cultural significance of transhumanism.
The article first rehearses three deflationary theories of reference, (1) disquotationalism, (2) propositionalism (Horwich), and (3) the anaphoric theory (Brandom), and raises a number of objections against them. It turns out that each corresponds to a closely related theory of truth, and that these are subject to analogous criticisms to a surprisingly high extent. I then present a theory of my own, according to which the schema “That S(t) is about t” and the biconditional “S refers to x iff S says something about x” are exhaustive of the notions of aboutness and reference. An account of the usefulness of “about” is then given, which, I argue, is superior to that of Horwich. I close with a few considerations about how the advertised theory relates to well-known issues of reference, the conclusions of which are (1) that the issues concern reference and aboutness only insofar as the words “about” and “refer” serve to generalise over the claims that are really at issue, (2) that the theory of reference will not settle the issues, and (3) that it follows from (2) that the issues do not concern the nature of aboutness or reference.
In 1997, the Japanese Diet revised the Bank of Japan law thereby granting the central bank greater independence in monetary policy making. The revision was an attempt by Japan's political class to weaken the authority of the powerful Ministry of Finance over the central bank and augment its own influence. The Bank of Japan, however, gained more autonomy than politicians ever intended, leading to frequent confrontations between the government and the central bank over monetary policy. This paper explores the new strategic relationship that emerged between the Bank of Japan and government and the nature of monetary policy implemented in the post-reform period. We demonstrate that several factors contributed to the Bank's unexpected ability to enhance its independence: the astute leadership of the first post-reform governor Hayami Masaru; the Bank's ability to turn politicization of monetary policy to its advantage; and its pursuit of a strategy achieved by augmenting its own research capacity. On a theoretical level, our findings show that the passage of a new legal framework only marks the completion of one stage of institutional change and the start of the next; post-enactment politics have as much importance as pre-enactment politics in shaping outcomes. In the post-enactment phase, various factors, including the state of the economy and informal institutions or processes, matter greatly and may shift the direction of institutional change away from the intended path.
I here discuss two problems facing Russellian act-type theories of propositions, and argue that Fregean act-type theories are better equipped to deal with them. The first relates to complex singular terms like '2+2', which turn out not to pose any special problem for Fregeans at all, whereas Soames' theory currently has no satisfactory way of dealing with them (particularly, with such "mixed" propositions as the proposition that 2+2 is greater than 3). Admittedly, one possibility stands out as the most promising one, but it requires that the Russellian treat complex properties as constituents of propositions. This leads to the second major problem for Russellians: that of proliferating propositions. I show how the most direct solution to this problem, that of rejecting complex predicative propositional constituents, is available to Fregeans but very implausible for Russellians, since this virtually means rejecting complex properties.
Depicts the development of societal organization, welfare, and political freedom as a gradual process of increased self-steering, with man as a self-steering actor, thereby rejecting the man-machine analogy.
Contents: 1. Introduction, 2. Overviews, 3. History and major works, 3.1 Gerhard Gentzen and proof-theory, 3.2 Wilfrid Sellars, 3.3 Gilbert Harman, 3.4 Christopher Peacocke, 3.5 Robert Brandom, 3.6 Paul Horwich, 3.7 Major works by other authors, 4. Mental content first vs. linguistic meaning first, 4.1 Content-first views, 4.2 Meaning-first views, 5. Wide vs. narrow CRS, 5.1 Overviews and major works about externalism/internalism, 5.2 Discussions about externalism within CRS, 6. Descriptive vs. normative CRS, 6.1 Overviews and major works about the normativity of meaning, 6.2 The debate on normativity within CRS, 7. Holistic vs. non-holistic CRS, 7.1 Overviews of meaning holism and compositionality, 7.2 Overviews and major works about analyticity, 7.3 Criticisms against holism, 7.4 Replies to the criticisms, 8. Unilateral vs. bilateral CRS, 9. Connections between CRS and meta-ethics.
This paper sheds new light on Samuelson’s early methodology as presented in his Foundations of Economic Analysis by reflecting on the similarity between his mathematical economics and Edwin B. Wilson’s mathematics. Wilson was Samuelson’s professor of advanced mathematical and statistical economics; he was also a protégé of Josiah Willard Gibbs. Wilson defined mathematics as a language that consisted of three interconnected aspects: postulational, axiomatic, and operational. In his Foundations, in a Wilsonian style, Samuelson wrote in the opening page, ‘Mathematics is a Language’ and claimed that he offered operationally meaningful theorems. In this paper, it is argued that these maxims embodied Wilson’s approach, which framed Samuelson’s mathematical and statistical thinking around 1940 and which led him to present his work as being mathematically, theoretically, and empirically well founded. Wilson’s and Percy Bridgman’s operational methodologies are also compared and Wilson is presented as a mediator between Bridgman and Samuelson.
A new kind of defense of the Millian theory of names is given, which explains intuitive counter-examples as depending on pragmatic effects of the relevant sentences, by direct application of Grice’s and Sperber and Wilson’s Relevance Theory and uncontroversial assumptions. I begin by arguing that synonyms are always intersubstitutable, despite Mates’ considerations, and then apply the method to names. A fairly large sample of cases concerning names is then dealt with in related ways. It is argued that the method, as applied to the various cases, satisfies the criterion of success: that for every sentence in context, it is a counter-example to Millianism to the extent that it has pragmatic effects (matching speakers’ intuitions).
I here develop a specific version of the deflationary theory of truth. I adopt a terminology on which deflationism holds that an exhaustive account of truth is given by the equivalence between truth-ascriptions and de-nominalised (or disquoted) sentences. An adequate truth-theory, it is argued, must be finite, non-circular, and give a unified account of all occurrences of “true”. I also argue that it must descriptively capture the ordinary meaning of “true”, which is plausibly taken to be unambiguous. Ch. 2 is a critical historical survey of deflationary theories, where notably disquotationalism is found untenable as a descriptive theory of “true”. In Ch. 3, I aim to show that deflationism cannot be finitely and non-circularly formulated by using “true”, and so must only mention it. Hence, it must be a theory specifically about the word “true” (and its foreign counterparts). To capture the ordinary notion, the theory must thus be an empirical, use-theoretic, semantic account of “true”. The task of explaining facts about truth now becomes that of showing that various sentences containing “true” are (unconditionally) assertible. In Ch. 4, I defend the claim (D) that every sentence of the form “That p is true” and the corresponding “p” are intersubstitutable (in a use-theoretic sense), and show how this claim provides a unified and simple account of a wide variety of occurrences of “true”. Disquotationalism then only has the advantage of avoiding propositions. But in Ch. 5, I note that (D) is not committed to propositions. Use-theoretic semantics is then argued to serve nominalism better than truth-theoretic ditto. In particular, it can avoid propositions while sustaining a natural syntactic treatment of “that”-clauses as singular terms and of “Everything he says is true”, as any other quantification. Finally, Horwich’s problem of deriving universal truth-claims is given a solution by recourse to an assertibilist semantics of the universal quantifier.
I begin with an exposition of the two main variants of the Prosentential Theory of Truth (PT), those of Dorothy Grover et al. and Robert Brandom. Three main types of criticisms are then put forward: (1) material criticisms to the effect that (PT) does not adequately explain the linguistic data, (2) an objection to the effect that no variant of (PT) gives a properly unified account of the various occurrences of "true" in English, and, most importantly, (3) a charge that the comparison with proforms is explanatorily idle. The last objection is that, given a complete semantic account of pronouns, proadjectives, antecedents, etc., together with a complete (PT), the essential semantic character of "true" could be deduced, but then, the idleness of the comparison with pronouns would be apparent. It turns out that objections (2) and (3) are related in the following way: the prosentential terminology is held to conceal the lack of unity in (PT), by describing the different data in the same terms ("proform", "antecedent", etc.). But this, I argue, is only a way of truly describing, rather than explaining, the data, these being certain relations of equivalence and consequence between sentences. I consider a language for which (PT) would be not only true, but also explanatory, but note that this language is very different from English. I end by showing that Robert Brandom's case that "is true" is not a predicate fails, and that his motivation for saying so is based on fallacious reasoning (namely, Boghossian's argument against deflationism).
In this paper, we explore the literature on cognitive heuristics and biases in light of virtue epistemology, specifically highlighting the two major positions—agent-reliabilism and agent-responsibilism—as they apply to dual systems theories of cognition and the role of motivation in biases. We investigate under which conditions heuristics and biases might be characterized as vicious and conclude that a certain kind of intellectual arrogance can be attributed to an inappropriate reliance on Type 1, or the improper function of Type 2, cognitive processes. By the same token, the proper intervention of Type 2 processes results in the virtuous functioning of our cognitive systems. Moreover, the role of motivation in attenuating cognitive biases and the cultivation of certain epistemic habits points to the tenets of agent-responsibilism.
Do axiomatic derivations advance positive economics? If economists are interested in predicting how people behave, without a pretense to change individual decision making, how can they benefit from representation theorems, which are no more than equivalence results? We address these questions. We propose several ways in which representation results can be useful and discuss their implications for axiomatic decision theory.
The paper discusses what kind of truth bearer, or truth-ascription, a deflationist should take as primary. I first present a number of arguments against a sententialist view. I then present a deflationary theory which takes propositions as primary, and try to show that it deals neatly with a wide range of linguistic data. Next, I consider both the view that there is no primary truth bearer, and the most common account of sentence truth given by deflationists who take propositions as primary, and argue that they both attribute an implausible type of ambiguity to “true”. This can be avoided, however, if truth-ascriptions to sentences are taken as a certain form of pragmatic ellipses. I end by showing how this hypothesis accommodates a number of intuitions involving truth-ascriptions to sentences.
I here propose a hitherto unnoticed possibility of solving embedding problems for noncognitivist expressivists in metaethics by appeal to Conceptual Role Semantics. I show that claims from the latter as to what constitutes various concepts can be used to define functions from states expressed by atomic sentences to states expressed by complex sentences, thereby allowing an expressivist semantics that satisfies a rather strict compositionality constraint. The proposal can be coupled with several different types of concept individuation claim, and is shown to pave the way to novel accounts for, e.g., negation.
I here argue for a particular formulation of truth-deflationism, namely, the propositionally quantified formula, (Q) “For all p, <p> is true iff p”. The main argument consists of an enumeration of the other (five) possible formulations and criticisms thereof. Notably, Horwich’s Minimal Theory is found objectionable in that it cannot be accepted by finite beings. Other formulations err in not providing non-question-begging, sufficiently direct derivations of the T-schema instances. I end by defending (Q) against various objections. In particular, I argue that certain circularity charges rest on mistaken assumptions about logic that lead to Carroll’s regress. I show how the propositional quantifier can be seen as on a par with first-order quantifiers and so equally acceptable to use. While the proposed parallelism between these quantifiers is controversial in general, deflationists have special reasons to affirm it. I further argue that the main three types of approach to the truth-paradoxes are open to an adherent of (Q), and that the derivation of general facts about truth can be explained on its basis.
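The formulation (Q) can be rendered in standard propositional-quantifier notation as follows; the choice of the predicate symbol and angle-bracket notation here is illustrative, following the abstract's own “<p> is true iff p”:

```latex
(Q)\quad \forall p\,\bigl(\mathrm{True}(\langle p \rangle) \leftrightarrow p\bigr)
```

where $\forall p$ is a quantifier binding sentential position and $\langle p \rangle$ abbreviates “the proposition that p”. The point of contrast with Horwich’s Minimal Theory is that (Q) is a single, finitely stated formula rather than an infinite collection of T-schema instances.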