Formalised knowledge systems, including universities and research institutes, are important for contemporary societies. They are, however, also arguably failing humanity when their impact is measured against the level of progress being made in stimulating the societal changes needed to address challenges like climate change. In this research we used a novel futures-oriented and participatory approach that asked what future envisioned knowledge systems might need to look like and how we might get there. Findings suggest that envisioned future systems will need to be much more collaborative, open, diverse, egalitarian, and able to work with values and systemic issues. They will also need to go beyond producing knowledge about our world to generating wisdom about how to act within it. To get to envisioned systems we will need to rapidly scale methodological innovations, connect innovators, and creatively accelerate learning about working with intractable challenges. We will also need to create new funding schemes and a global knowledge commons, and challenge deeply held assumptions. To genuinely be a creative force in supporting longevity of human and non-human life on our planet, the shift in knowledge systems will probably need to be at the scale of the Enlightenment and the speed of the scientific and technological revolution accompanying the Second World War. This will require bold and strategic action from governments, scientists, and civil society, and sustained transformational intent.
Unmeasured confounding is one of the most important threats to the validity of observational studies. In this paper we scrutinize a recently proposed sensitivity analysis for unmeasured confounding. The analysis requires specification of two parameters, loosely defined as the maximal strength of association that an unmeasured confounder may have with the exposure and with the outcome, respectively. The E-value is defined as the strength of association that the confounder must have with the exposure and the outcome, to fully explain away an observed exposure-outcome association. We derive the feasible region of the sensitivity analysis parameters, and we show that the bounds produced by the sensitivity analysis are not always sharp. We finally establish a region in which the bounds are guaranteed to be sharp, and we discuss the implications of this sharp region for the interpretation of the E-value. We illustrate the theory with a real data example and a simulation.
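For orientation: on the risk-ratio scale the E-value has a simple closed form, E = RR + sqrt(RR(RR − 1)) for an observed risk ratio RR ≥ 1 (VanderWeele and Ding's formula). The sketch below is a minimal illustration of that standard computation, not code from the paper; the function name is mine.

```python
import math

def e_value(rr: float) -> float:
    """Standard E-value for an observed risk ratio rr: the minimum strength of
    association, on the risk-ratio scale, that an unmeasured confounder would
    need with both exposure and outcome to fully explain away the association."""
    if rr < 1:
        rr = 1.0 / rr  # protective estimates are inverted first
    return rr + math.sqrt(rr * (rr - 1.0))

print(round(e_value(2.0), 2))  # 3.41
```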
I here argue that Ted Sider's indeterminacy argument against vagueness in quantifiers fails. Sider claims that vagueness entails precisifications, but holds that precisifications of quantifiers cannot be coherently described: they will either deliver the wrong logical form to quantified sentences, or involve a presupposition that contradicts the claim that the quantifier is vague. Assuming (as does Sider) that the “connectedness” of objects can be precisely defined, I present a counter-example to Sider's contention, consisting of a partial, implicit definition of the existential quantifier that in effect sets a given degree of connectedness among the putative parts of an object as a condition upon there being something (in the sense in question) with those parts. I then argue that such an implicit definition, taken together with an “auxiliary logic” (e.g., introduction and elimination rules), proves to function as a precisification in just the same way as paradigmatic precisifications of, e.g., “red”. I also argue that with a quantifier that is stipulated as maximally tolerant as to what mereological sums there are, precisifications can be given in the form of truth-conditions of quantified sentences, rather than by implicit definition.
A key feature of facial behavior is its dynamic quality. However, most previous research has been limited to the use of static images of prototypical expressive patterns. This article explores the role of facial dynamics in the perception of emotions, reviewing relevant empirical evidence demonstrating that dynamic information improves coherence in the identification of affect (particularly for degraded and subtle stimuli), leads to higher emotion judgments (i.e., of intensity and arousal), and helps to differentiate between genuine and fake expressions. The findings underline that using static expressions not only poses problems of ecological validity, but also limits our understanding of what facial activity does. Implications for future research on facial activity, particularly for social neuroscience and affective computing, are discussed.
I show that the act-type theories of Soames and Hanks entail that every sentence with alternative analyses (including every atomic sentence with a polyadic predicate) is ambiguous, many of them massively so. I assume that act types directed toward distinct objects are themselves distinct, plus some standard semantic axioms, and infer that act-type theorists are committed to saying that ‘Mary loves John’ expresses both the act type of predicating [loving John] of Mary and that of predicating [being loved by Mary] of John. Since the two properties are distinct, so are the act types. Hence, the sentence expresses two propositions. I also discuss a non-standard “pluralist” act-type theory, as well as some retreat positions, which all come with considerable problems. Finally, I extrapolate to a general constraint on theories of structured propositions, and find that Jeffrey King’s theory has the same unacceptable consequence as the act-type theory.
Under normal circumstances, we experience that our center of awareness is located behind our eyes and inside our own body. To learn more about the perceptual processes that underlie this tight coupling between the spatial dimensions of our consciously perceived self and our physical body, we conducted a series of experiments using an ‘out-of-body illusion’. In this illusion, the conscious sense of self is displaced in the testing room by experimental manipulation of the congruency of visual and tactile information and a change in the visual perspective. We demonstrate that when healthy individuals experience that they are located in a different place from their real body, they disown this body and no longer perceive it as part of themselves. Our findings are important because they reveal a relationship between the representation of self-location in the local environment and the multisensory representation of one’s own body.
Emotions are foremost self-regulating processes that permit rapid responses and adaptations to situations of personal concern. They have biological bases and are shaped ontogenetically via learning and experience. Many situations and events of personal concern are social in nature. Thus, social exchanges play an important role in learning about the rules and norms that shape regulation processes. I argue that (a) emotions often are actively auto-regulating—the behavior implied by the emotional reaction feeds back on the eliciting event or situation and modifies or terminates it; (b) certain emotion components are likely to habituate dynamically, modifying the emotional states; (c) emotions are typically intra- and interpersonal processes at the same time, and modulating forces at these different levels interact; (d) emotions are not just regulated—they regulate. Important conclusions of my arguments are that the scientific analysis of emotion should not exclude regulatory processes, and that effortful emotion regulation should be seen against a backdrop of auto-regulation and habituation, rather than against the idealized notion of a neutral baseline. For all practical purposes, unregulated emotion is not a realistic concept.
Semantic dispositionalism is roughly the view that meaning a certain thing by a word, or possessing a certain concept, consists in being disposed to do something, e.g., infer a certain way. Its main problem is that it seems to have so many disparate exceptions. People can fail to infer as required due to lack of logical acumen, intoxication, confusion, deviant theories, neural malfunctioning, and so on. I present a theory stating possession conditions of concepts that are counterfactuals, rather than disposition attributions, but which is otherwise similar to inferentialist versions of dispositionalism. I argue that it can handle all the exceptions discussed in the literature without recourse to ceteris paribus clauses. Psychological exceptions are handled by suitably undemanding requirements (unlike that of giving the sum of any two numbers) and by setting the following two preconditions upon someone’s making the inference: that she considers the inference and has no motivating reason against it. The non-psychological exceptions, i.e., cases of neural malfunctioning, are handled by requiring that the counterfactuals be true sufficiently often during the relevant interval. I argue that this accommodates some important intuitions about concept possession, in particular, the intuition that concept possession is vague along a certain dimension.
Depicts the development of societal organization, welfare, and political freedom as a gradual process of increasing self-steering, with man as a self-steering actor, thereby rejecting the man-machine analogy.
The paper proposes a way for adherents of Fregean, structured propositions to designate propositions and other complex senses/concepts using a special kind of functor. I consider some formulations from Peacocke's works and highlight certain problems that arise as we try to quantify over propositional constituents while referring to propositions using "that"-clauses. With the functor notation, by contrast, we can quantify over senses/concepts with objectual, first-order quantifiers and speak without further ado about their involvement in propositions. The functor notation also turns out to come with an important kind of expressive strengthening, and is shown to be neutral on several controversial issues.
This paper examines the importance of aspirations as reference points in a multi-period decision-making context. After stating their personal aspiration level, 172 individuals made six sequential decisions among risky prospects as part of a choice experiment. The results show that individuals make different risky choices in a multi-period compared to a single-period setting. In particular, individuals’ aspiration level is their main reference point during the early stages of decision-making, while their starting status (wealth level at the start of the experiment) becomes the central reference point during the later stages of their multi-period decision-making.
It is argued that although George Bealer's influential ‘Self-Consciousness argument’ refutes standard versions of reductive functionalism (RF), it fails to generalize in the way Bealer supposes. To wit, he presupposes that any version of RF must take the content of ‘pain’ to be the property of being in pain (and so on), which is expressly rejected in independently motivated versions of conceptual role semantics (CRS). Accordingly, there are independently motivated versions of RF, incorporating CRS, which avoid Bealer's main type of refutation. I focus particularly on one such theory, which takes concepts to be event types that are individuated by their psychological roles, and which has the resources to respond to each of the more specific worries Bealer expresses.
Based on metatheoretical considerations, this article discusses what kinds of traffic forecasts are possible and what kinds are impossible to make with any reasonable degree of accuracy. It will be argued on ontological and epistemological grounds that it is inherently impossible to make exact predictions about the magnitude of the ‘general’ traffic growth 20–30 years ahead, since many of the influencing factors depend on inherently unpredictable geopolitical trajectories as well as contested political decision-making. Due to the context-dependency of each particular planning situation, it is also hardly possible to make exact, quantitative predictions about the impact of implementing a specific infrastructure project, compared to ‘doing nothing’. Instead of relying on traffic model simulations as the general forecasting and assessment tool in transport planning, we propose to separate the so-called strategic, tactical and operational levels of traffic forecasting into three distinct methodological approaches reflecting the different degrees of openness/closure of the systems at hand: scenario analyses at the strategic level; theory-informed, mainly qualitative analyses supplemented with simple calculations at the tactical level; while more traditional micro-simulations should be applied only at a detailed operational level.
I here defend a theory consisting of four claims about ‘property’ and properties, and argue that they form a coherent whole that can solve various serious problems. The claims are (1): ‘property’ is defined by the principles (PR): ‘F-ness/Being F/etc. is a property of x iff F’ and (PA): ‘F-ness/Being F/etc. is a property’; (2) the function of ‘property’ is to increase the expressive power of English, roughly by mimicking quantification into predicate position; (3) property talk should be understood at face value: apparent commitments are real and our apparently literal use of ‘property’ is really literal; (4) there are no properties. In virtue of (1)–(2), this is a deflationist theory and in virtue of (3)–(4), it is an error theory. (1) is fleshed out as a claim about understanding conditions, and it is argued at length, and by going through a number of examples, that it satisfies a crucial constraint on meaning claims: all facts about ‘property’ can be explained, together with auxiliary facts, on its basis. Once claim (1) has been expanded upon, I argue that the combination of (1)–(3) provides the means for handling several problems: they help give a happy-face solution to what I call the paradox of abstraction, they form part of a plausible account of the correctness of committive sentences, and, most importantly, they help respond to various indispensability arguments against nominalism.
The article first rehearses three deflationary theories of reference, (1) disquotationalism, (2) propositionalism (Horwich), and (3) the anaphoric theory (Brandom), and raises a number of objections against them. It turns out that each corresponds to a closely related theory of truth, and that these are subject to analogous criticisms to a surprisingly high extent. I then present a theory of my own, according to which the schema “That S(t) is about t” and the biconditional “S refers to x iff S says something about x” are exhaustive of the notions of aboutness and reference. An account of the usefulness of “about” is then given, which, I argue, is superior to that of Horwich. I close with a few considerations about how the advertised theory relates to well-known issues of reference, the conclusions of which are (1) that the issues concern reference and aboutness only insofar as the words “about” and “refer” serve to generalise over the claims that are really at issue, (2) that the theory of reference will not settle the issues, and (3) that it follows from (2) that the issues do not concern the nature of aboutness or reference.
I here discuss two problems facing Russellian act-type theories of propositions, and argue that Fregean act-type theories are better equipped to deal with them. The first relates to complex singular terms like '2+2', which turn out not to pose any special problem for Fregeans at all, whereas Soames' theory currently has no satisfactory way of dealing with them (particularly, with such "mixed" propositions as the proposition that 2+2 is greater than 3). Admittedly, one possibility stands out as the most promising one, but it requires that the Russellian treat complex properties as constituents of propositions. This leads to the second major problem for Russellians: that of proliferating propositions. I show how the most direct solution to this problem, that of rejecting complex predicative propositional constituents, is available to Fregeans but very implausible for Russellians, since this virtually means rejecting complex properties.
In 1997, the Japanese Diet revised the Bank of Japan law, thereby granting the central bank greater independence in monetary policy making. The revision was an attempt by Japan's political class to weaken the authority of the powerful Ministry of Finance over the central bank and augment its own influence. The Bank of Japan, however, gained more autonomy than politicians ever intended, leading to frequent confrontations between the government and the central bank over monetary policy. This paper explores the new strategic relationship that emerged between the Bank of Japan and the government and the nature of monetary policy implemented in the post-reform period. We demonstrate that several factors contributed to the Bank's unexpected ability to enhance its independence: the astute leadership of the first post-reform governor, Hayami Masaru; the Bank's ability to turn politicization of monetary policy to its advantage; and its pursuit of a strategy of augmenting its own research capacity. On a theoretical level, our findings show that the passage of a new legal framework only marks the completion of one stage of institutional change and the start of the next; post-enactment politics are as important as pre-enactment politics in shaping outcomes. In the post-enactment phase, various factors, including the state of the economy and informal institutions or processes, matter greatly and may shift the direction of institutional change away from the intended path.
Contents: 1. Introduction, 2. Overviews, 3. History and major works, 3.1 Gerhard Gentzen and proof theory, 3.2 Wilfrid Sellars, 3.3 Gilbert Harman, 3.4 Christopher Peacocke, 3.5 Robert Brandom, 3.6 Paul Horwich, 3.7 Major works by other authors, 4. Mental content first vs. linguistic meaning first, 4.1 Content-first views, 4.2 Meaning-first views, 5. Wide vs. narrow CRS, 5.1 Overviews and major works about externalism/internalism, 5.2 Discussions about externalism within CRS, 6. Descriptive vs. normative CRS, 6.1 Overviews and major works about the normativity of meaning, 6.2 The debate on normativity within CRS, 7. Holistic vs. non-holistic CRS, 7.1 Overviews of meaning holism and compositionality, 7.2 Overviews and major works about analyticity, 7.3 Criticisms against holism, 7.4 Replies to the criticisms, 8. Unilateral vs. bilateral CRS, 9. Connections between CRS and meta-ethics.
A new kind of defense of the Millian theory of names is given, which explains intuitive counter-examples as depending on pragmatic effects of the relevant sentences, by direct application of Grice's pragmatics, Sperber and Wilson's Relevance Theory, and uncontroversial assumptions. I begin by arguing that synonyms are always intersubstitutable, despite Mates' considerations, and then apply the method to names. Then, a fairly large sample of cases concerning names is dealt with in related ways. It is argued that the method, as applied to the various cases, satisfies the criterion of success: that every sentence in context is a counter-example to Millianism to the extent that it has pragmatic effects (matching speakers' intuitions).
I here develop a specific version of the deflationary theory of truth. I adopt a terminology on which deflationism holds that an exhaustive account of truth is given by the equivalence between truth-ascriptions and de-nominalised (or disquoted) sentences. An adequate truth-theory, it is argued, must be finite, non-circular, and give a unified account of all occurrences of “true”. I also argue that it must descriptively capture the ordinary meaning of “true”, which is plausibly taken to be unambiguous. Ch. 2 is a critical historical survey of deflationary theories, where notably disquotationalism is found untenable as a descriptive theory of “true”. In Ch. 3, I aim to show that deflationism cannot be finitely and non-circularly formulated by using “true”, and so must only mention it. Hence, it must be a theory specifically about the word “true” (and its foreign counterparts). To capture the ordinary notion, the theory must thus be an empirical, use-theoretic, semantic account of “true”. The task of explaining facts about truth now becomes that of showing that various sentences containing “true” are (unconditionally) assertible. In Ch. 4, I defend the claim (D) that every sentence of the form “That p is true” and the corresponding “p” are intersubstitutable (in a use-theoretic sense), and show how this claim provides a unified and simple account of a wide variety of occurrences of “true”. Disquotationalism then only has the advantage of avoiding propositions. But in Ch. 5, I note that (D) is not committed to propositions. Use-theoretic semantics is then argued to serve nominalism better than its truth-theoretic counterpart. In particular, it can avoid propositions while sustaining a natural syntactic treatment of “that”-clauses as singular terms and of “Everything he says is true” like any other quantification. Finally, Horwich’s problem of deriving universal truth-claims is given a solution by recourse to an assertibilist semantics of the universal quantifier.
I begin with an exposition of the two main variants of the Prosentential Theory of Truth (PT), those of Dorothy Grover et al. and Robert Brandom. Three main types of criticisms are then put forward: (1) material criticisms to the effect that (PT) does not adequately explain the linguistic data, (2) an objection to the effect that no variant of (PT) gives a properly unified account of the various occurrences of "true" in English, and, most importantly, (3) a charge that the comparison with proforms is explanatorily idle. The last objection is that, given a complete semantic account of pronouns, proadjectives, antecedents, etc., together with a complete (PT), the essential semantic character of "true" could be deduced, but then the idleness of the comparison with pronouns would be apparent. It turns out that objections (2) and (3) are related in the following way: the prosentential terminology is held to conceal the lack of unity in (PT), by describing the different data in the same terms ("proform", "antecedent", etc.). But this, I argue, is only a way of truly describing, rather than explaining, the data, these being certain relations of equivalence and consequence between sentences. I consider a language for which (PT) would be not only true, but also explanatory, but note that this language is very different from English. I end by showing that Robert Brandom's case that "is true" is not a predicate fails, and that his motivation for saying so is based on fallacious reasoning (namely, Boghossian's argument against deflationism).
The paper discusses what kind of truth bearer, or truth-ascription, a deflationist should take as primary. I first present a number of arguments against a sententialist view. I then present a deflationary theory which takes propositions as primary, and try to show that it deals neatly with a wide range of linguistic data. Next, I consider both the view that there is no primary truth bearer, and the most common account of sentence truth given by deflationists who take propositions as primary, and argue that they both attribute an implausible type of ambiguity to “true”. This can be avoided, however, if truth-ascriptions to sentences are taken as a certain form of pragmatic ellipses. I end by showing how this hypothesis accommodates a number of intuitions involving truth-ascriptions to sentences.
I here propose a hitherto unnoticed possibility of solving embedding problems for noncognitivist expressivists in metaethics by appeal to Conceptual Role Semantics. I show that claims from the latter as to what constitutes various concepts can be used to define functions from states expressed by atomic sentences to states expressed by complex sentences, thereby allowing an expressivist semantics that satisfies a rather strict compositionality constraint. The proposal can be coupled with several different types of concept individuation claim, and is shown to pave the way to novel accounts for, e.g., negation.
Nearly all students believe academic cheating is wrong, yet few students say they would report witnessed acts of cheating. To explain this apparent tension, the present research examined college students’ reasoning about whether to report plagiarism or other forms of cheating. Study 1 examined students’ conflicts when deciding whether to report cheating. Most students gave reasons against reporting a peer as well as reasons in favor of reporting. Study 2 provided experimental confirmation that the contextual factors referenced by Study 1 participants in fact influenced decisions about whether to report cheating. Overall, the findings indicate that students often decide against reporting peers’ acts of cheating, though not due to a lack of concern about integrity. Rather, students may refrain from reporting because of conflicting concerns, lack of information about school policy, and perceived better alternatives to reporting.
I here argue for a particular formulation of truth-deflationism, namely, the propositionally quantified formula, (Q) “For all p, <p> is true iff p”. The main argument consists of an enumeration of the other (five) possible formulations and criticisms thereof. Notably, Horwich’s Minimal Theory is found objectionable in that it cannot be accepted by finite beings. Other formulations err in not providing non-question-begging, sufficiently direct derivations of the T-schema instances. I end by defending (Q) against various objections. In particular, I argue that certain circularity charges rest on mistaken assumptions about logic that lead to Carroll’s regress. I show how the propositional quantifier can be seen as on a par with first-order quantifiers and so equally acceptable to use. While the proposed parallelism between these quantifiers is controversial in general, deflationists have special reasons to affirm it. I further argue that the main three types of approach to the truth-paradoxes are open to an adherent of (Q), and that the derivation of general facts about truth can be explained on its basis.
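To make the face-value reading of (Q) concrete, here is a minimal Lean sketch (my own illustration, not the paper's formal system): if the truth operator is modeled deflationarily as the identity on propositions, (Q) becomes a one-line theorem and instantiates exactly as a first-order quantification would.

```lean
-- Illustrative model: read "<p> is true" as Tr p, with Tr the identity on Prop.
def Tr (p : Prop) : Prop := p

-- (Q): for all p, <p> is true iff p.
theorem Q : ∀ p : Prop, Tr p ↔ p :=
  fun _ => Iff.rfl

-- The propositional quantifier instantiates like any quantifier:
example : Tr (2 + 2 = 4) ↔ (2 + 2 = 4) := Q (2 + 2 = 4)
```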
Affective computing (AC) adopts a computational approach to study affect. We highlight the AC approach towards automated affect measures that jointly model machine-readable physiological/behavioral signals with affect estimates reported by humans or elicited experimentally. We describe the conceptual and computational foundations of the approach, followed by two case studies: one on discrimination between genuine and faked expressions of pain in the lab, and the other on measuring nonbasic affect in the wild. We discuss applications of the measures, analyze measurement accuracy and generalizability, and highlight advances afforded by computational tipping points, such as big data, wearable sensing, crowdsourcing, and deep learning. We conclude by advocating for increasing synergies between AC and affective science and offer suggestions toward that direction.
I here investigate whether there is any version of the principle of charity both strong enough to conflict with an error-theoretic version of nominalism (EN) about abstract objects, and supported by the considerations adduced in favour of interpretive charity in the literature. I argue that in order to be strong enough, the principle, which I call (Charity), would have to read, “For all expressions e, an acceptable interpretation must make true a sufficiently high ratio of accepted sentences containing e”. I next consider arguments based on Davidson's intuitive cases for interpretive charity, the reliability of perceptual beliefs, and the reliability of “non-abstractive inference modes”, and conclude that none support (Charity). I then propose a diagnosis of the view that there must be some universal principle of charity ruling out (EN). Finally, I present a reason to think (Charity) is false, namely, that it seems to exclude the possibility of such disagreements as that between nominalists and realists.
In late spring 2007, Professor Allan Gibbard gave the Hägerström Lectures at Uppsala University, Sweden, under the title “Meaning as a Normative Concept”. He met up with Gunnar Björnsson and Arvid Båve to talk about the views he develops and defends in the lectures.
The paper discusses the Inconsistency Theory of Truth (IT), the view that “true” is inconsistent in the sense that its meaning-constitutive principles include all instances of the truth-schema (T). It argues that (IT) entails that anyone using “true” in its ordinary sense is committed to all the (T)-instances and that any theory in which “true” is used in that sense entails the (T)-instances (which, given classical logic, entail contradictions). More specifically, I argue that theorists are committed to the meaning-constitutive principles of logical constants, relative to the interpretation they intend thereof (e.g., classical), and that theories containing logical constants entail those principles. Further, I argue, since there is no relevant difference from the case of “true”, inconsistency theorists’ uses of “true” commit them to the (T)-instances. Adherents of (IT) are recommended, as a consequence, to eschew the truth-predicate. I also criticise Matti Eklund’s account of how the semantic value of “true” is determined, which can be taken as an attempt to show how “true” can be consistently used, despite being inconsistent.
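To see why the (T)-instances are explosive, consider the Liar: a sentence L whose (T)-instance says that Tr(L) holds iff it does not. The sketch below is a textbook-style reconstruction in Lean (mine, not the paper's; the names TrL and h are illustrative), deriving absurdity from that single instance; notably, it needs even less than full classical logic.

```lean
-- Textbook Liar derivation: from the (T)-instance for a Liar sentence L,
-- taken here as the hypothesis h : TrL ↔ ¬TrL, a contradiction follows.
theorem liar (TrL : Prop) (h : TrL ↔ ¬TrL) : False :=
  -- TrL cannot hold: if it did, it would imply its own negation.
  have nt : ¬TrL := fun t => h.mp t t
  -- But by the biconditional, ¬TrL yields TrL back, contradiction.
  nt (h.mpr nt)
```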
I here respond to Pietro Salis’s objections against my original critique of the Prosentential Theory of Truth. In addition, I clarify some points regarding the relationship between anaphoric relations and “general semantic notions” like “equivalence”, “consequence”, and “sameness of content”, and make some further points about the prosentential theory’s ability to explain pragmatic and expressive features of “true”.
Biological and epidemiological phenomena are often measured with error or imperfectly captured in data. When the true state of this imperfect measure is a confounder of an exposure-outcome relationship of interest, it was previously widely believed that adjustment for the mismeasured observed variables provides a less biased estimate of the true average causal effect than not adjusting. However, this is not always the case, and depends on both the nature of the measurement and the confounding. We describe two sets of conditions under which adjusting for a non-differentially mismeasured proxy comes closer to the unidentifiable true average causal effect than the unadjusted or crude estimate. The first set of conditions applies when the exposure is discrete or continuous and the confounder is ordinal, and the expectation of the outcome is monotonic in the confounder for both treatment levels contrasted. The second set of conditions applies when the exposure and the confounder are categorical. In all settings, the mismeasurement must be non-differential, as differential mismeasurement, particularly of an unknown pattern, can cause unpredictable results.
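The qualitative claim is easy to see in a toy simulation. The sketch below is my illustration with arbitrary parameter values, not the paper's analysis: it draws a binary confounder U, a nondifferentially misclassified proxy U*, an exposure affected by U, and a continuous outcome, then compares crude, proxy-adjusted, and fully adjusted estimates. Under these monotonicity-friendly settings the proxy-adjusted estimate lands between the crude estimate and the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

U = rng.binomial(1, 0.5, n)                    # true binary confounder
flip = rng.random(n) < 0.2                     # 20% misclassification, independent
U_star = np.where(flip, 1 - U, U)              # nondifferential proxy of U

A = rng.binomial(1, 0.2 + 0.5 * U)             # exposure; U raises P(A = 1)
Y = 1.0 * A + 2.0 * U + rng.normal(0, 1, n)    # outcome; true effect of A is 1.0

def standardized_effect(conf):
    """Mean difference in Y (A=1 vs A=0), standardized over levels of conf."""
    total = 0.0
    for c in (0, 1):
        weight = (conf == c).mean()
        diff = (Y[(A == 1) & (conf == c)].mean()
                - Y[(A == 0) & (conf == c)].mean())
        total += weight * diff
    return total

print(f"crude:          {Y[A == 1].mean() - Y[A == 0].mean():.3f}")   # biased
print(f"proxy-adjusted: {standardized_effect(U_star):.3f}")  # between crude and 1.0
print(f"true-adjusted:  {standardized_effect(U):.3f}")       # approximately 1.0
```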
I here present and defend what I call the Triviality Theory of Truth, to be understood in analogy with Matti Eklund’s Inconsistency Theory of Truth. A specific formulation of the theory is defended and compared with alternatives found in the literature. A number of objections against the proposed notion of meaning-constitutivity are discussed and held inconclusive. The main focus, however, is on the problem, discussed at length by Gupta and Belnap, that speakers do not accept epistemically neutral conclusions of Curry derivations. I first argue that the facts about speakers’ reactions to such Curry derivations do not constitute a problem for the Triviality Theory specifically. Rather, they follow from independent, uncontroversial facts. I then propose a solution which coheres with the theory as I understand it. Finally, I consider a normative reading of their objection and offer a response.
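For reference, the derivations at issue run as follows: a Curry sentence C has a (T)-instance saying that Tr(C) holds iff Tr(C) implies p, where p may be any proposition, however epistemically neutral. The Lean sketch below is my own textbook-style reconstruction, not the paper's (the names TrC, p, and h are illustrative); from that single instance, p follows outright.

```lean
-- Textbook Curry derivation: from the (T)-instance for a Curry sentence C,
-- taken here as the hypothesis h : TrC ↔ (TrC → p), the arbitrary p follows.
theorem curry (TrC p : Prop) (h : TrC ↔ (TrC → p)) : p :=
  -- Contraction step: if TrC held, it would yield TrC → p and hence p.
  have hp : TrC → p := fun t => h.mp t t
  -- The biconditional then gives TrC back, and modus ponens delivers p.
  hp (h.mpr hp)
```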