The sensitive and conflict-ridden scenario of the pandemic posed many challenges to societies around the world in 2020. Part of the problem concerns how the differences between politics and science are not understood in their particularities. Recognizing the limits and the power of both science and politics can not only contribute to better actions and strategies for confronting the novel coronavirus but also improve many domains of society.
If citizens are to make enlightened collective decisions, they need to rely on true factual beliefs, but misinformation impairs their ability to do so. Although some cases of misinformation are deliberate and amount to propaganda, cases of inadvertent misinformation are just as problematic in affecting the beliefs and behavior of democratic citizens. A review of empirical evidence suggests that this is a serious problem that cannot entirely be corrected by means of deliberation.
If political fake news is a serious concern for democratic politics, no less worrisome is scientific news with patently distorted content. Prima facie, scientific misinformation partially escapes the definition of fake news provided by empirical and philosophical analysis, mainly patterned after political disinformation. Most notably, we aim to show that people are often unaware not only of disseminating, but also of producing false or misleading information. However, by leveraging the philosophical and psychological literature, we advance some reasons for keeping scientific misinformation under the same umbrella, broadening the definition of fake news in order to account for it as well. In concluding, we shall advance some ideas on how to reform scientific communication, which may help to address the issue of scientific misinformation.
This chapter addresses the relationship between misinformation and disagreement. We begin by arguing that one traditional bogeyman in this domain, ideological polarization, does not account for the many problems that have been documented. Instead, affective polarization seems to be the root cause of most of these problems. We then discuss the relationships between moral outrage, misinformation, and affective polarization. We next turn to the political implications of affective polarization and conclude by discussing some potential solutions to the problems that arise in this area.
This study examines the effectiveness of the inoculation strategy in countering vaccine-related misinformation among Hong Kong college students. A three-phase between-subject experiment was conducted to compare the persuasive effects of inoculation messages, supportive messages, and a no-message control. The results show that inoculation messages were superior to supportive messages at generating resistance to misinformation, as evidenced by more positive vaccine attitudes and stronger vaccine intention. Notably, while we expected the inoculation condition to produce more resistance than the control condition, there was little evidence in favor of this prediction. Attitudinal threat and counterarguing moderated the experimental effects; issue involvement and political trust were found to directly predict vaccine attitudes and intention. The findings suggest that future interventions should focus on developing preventive mechanisms to counter misinformation, and that spreading inoculation across the issue is an effective strategy for generating resistance to misinformation. Interventions should be cautious about using health advocacy initiated by governments among populations with low political trust.
This book investigates the impact of misinformation and the role of truth in political struggle. It develops a theory of objective truth for political controversy over topics such as racism and gender, based on the insights of intersectionality, the Black feminist theory of interlocking systems of oppression. Truth is defined using the tools of model theory and formal semantics, but the theory also captures how social power dynamics strongly influence the operation of the concept of truth within the social fabric. Systemic ignorance, propagated through false speech and misinformation, sustains oppressive power structures and perpetuates systemic inequity. Truth tends to empower marginalized groups precisely because oppressive systems are maintained through systemic ignorance. If the truth sets people free, then power will work to obscure it. Hence, the rise of misinformation as a political weapon is a strategy of dominant power to undermine the political advancement of marginalized groups.
Agnotology is the study of how ignorance arises via circulation of misinformation calculated to mislead. Legates et al. had questioned the applicability of agnotology to politically charged debates. In their reply, Bedford and Cook, seeking to apply agnotology to climate science, asserted that fossil-fuel interests had promoted doubt about a climate consensus. Their definition of climate 'misinformation' was contingent upon the post-modernist assumptions that scientific truth is discernible by measuring a consensus among experts, and that a near unanimous consensus exists. However, inspection of a claim by Cook et al. of 97.1% consensus, heavily relied upon by Bedford and Cook, shows just 0.3% endorsement of the standard definition of consensus: that most warming since 1950 is anthropogenic. Agnotology, then, is a two-edged sword, since either side in a debate may claim that general ignorance arises from misinformation allegedly circulated by the other. Significant questions about anthropogenic influences on climate remain. Therefore, Legates et al. appropriately asserted that partisan presentations of controversies stifle debate and have no place in education.
Why are mistaken beliefs about Covid-19 so prevalent? Political identity, education and other demographic variables explain only a part of individual differences in the susceptibility to Covid-19 misinformation. This paper focuses on another explanation: epistemic vice. Epistemic vices are character traits that interfere with acquiring, maintaining, and transmitting knowledge. If the basic assumption of vice epistemology is right, then people with epistemic vices such as indifference to the truth or rigidity in their belief structures will tend to be more susceptible to believing Covid-19 misinformation. We carried out an observational study (US sample, n = 998) in which we measured the level of epistemic vice of participants using a novel Epistemic Vice Scale. We also asked participants questions eliciting the extent to which they subscribe to myths and misinformation about Covid-19. We find overwhelming evidence to the effect that epistemic vice is associated with susceptibility to Covid-19 misinformation. In fact, the association turns out to be stronger than with political identity, educational attainment, scores on the Cognitive Reflection Test, personality, dogmatism, and need for closure. We conclude that this offers evidence in favor of the empirical presuppositions of vice epistemology.
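For readers who want to see the shape of the reported comparison, the following is a minimal Python sketch using simulated data: it correlates an epistemic vice score and one competing predictor (a Cognitive Reflection Test score) with a misinformation susceptibility score. The variable names, sample, and effect sizes are illustrative assumptions, not the study's instruments or analysis code.

```python
# Minimal sketch with simulated data (not the study's dataset or code):
# compare how strongly two predictors correlate with susceptibility scores.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 998  # sample size taken from the abstract; the data itself is simulated

vice = rng.normal(size=n)                                    # epistemic vice score
crt = rng.normal(size=n)                                     # competing predictor
susceptibility = 0.6 * vice + rng.normal(scale=0.8, size=n)  # assumed relationship

df = pd.DataFrame({"vice": vice, "crt": crt, "susceptibility": susceptibility})
print(df[["vice", "crt"]].corrwith(df["susceptibility"]))    # vice should dominate
```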
There is a growing movement in online social networks and within some governments to deny the long-established scientific consensus regarding climate change. Scientific research has shown that a series of climatic events in Latin America, and especially in Brazil, are being exacerbated by global warming. These events have had a profound impact on populations. Disruptions to Brazilian rainfall patterns with their devastating environmental and economic effects on agriculture have been directly linked with Amazonian deforestation. Furthermore, the Bolsonaro government, with its erratic environmental policies and ardent followers, has promoted misinformation about climate change and conservation of the Amazon. It is necessary to simultaneously inform society about scientific misunderstandings regarding the environment and the way these misunderstandings are amplified via the internet and social networks.
The world is swimming in misinformation. Conflicting messages bombard us every day with news on everything from politics and world events to investments and alternative health. The daily paper, nightly news, websites, and social media each compete for our attention and each often insist on a different version of the facts. Inevitably, we have questions: Who is telling the truth? How would we know? How did we get here? What can we do? Beyond Fake News answers these and other queries. It offers a technological and market-based explanation for how our informational environment became so polluted. It shows how purveyors of news often have incentives to mislead us, and how consumers of information often have incentives to be misled. And it chronicles how, as technology improves and regulatory burdens drop, our information-scape becomes ever more littered with misinformation. Beyond Fake News argues that even when we really want the truth, our minds are built in such a way as to be incapable of grasping many facts, and blind spots mar our view of the world. But we can do better, both as individuals and as a society. As individuals, we can improve the accuracy of our understanding of the world by knowing whom to trust and recognizing our limitations. And as a society, we can take important steps to reduce the quantity and effects of misinformation.
This open access book looks at how a democracy can devolve into a post-factual state. The media is being flooded by populist narratives, fake news, conspiracy theories and make-believe. Misinformation is turning into a challenge for all of us, whether politicians, journalists, or citizens. In the age of information, attention is a prime asset and may be converted into money, power, and influence – sometimes at the cost of facts. The point is to obtain exposure on the air and in print media, and to generate traffic on social media platforms. With information in abundance and attention scarce, the competition is ever fiercer, with truth all too often becoming the first victim. Reality Lost: Markets of Attention, Misinformation and Manipulation is an analysis by philosophers Vincent F. Hendricks and Mads Vestergaard of the nuts and bolts of the information market, the attention economy and the media ecosystem which may pave the way to postfactual democracy. Here misleading narratives become the basis for political opinion formation, debate, and legislation. To curb this development and the threat it poses to democratic deliberation, political self-determination and freedom, it is necessary that we first grasp the mechanisms and structural conditions that cause it.
This study investigates the types of misinformation spread on Twitter that evoke scientific authority or evidence when making false claims about the antimalarial drug hydroxychloroquine as a treatment for COVID-19. Specifically, we examined tweets generated after former U.S. President Donald Trump retweeted misinformation about the drug, using an unsupervised machine learning approach called the biterm topic model, which clusters tweets into misinformation topics based on textual similarity. The top 10 tweets from each topic cluster were content coded for three types of misinformation categories related to scientific authority: medical endorsements of hydroxychloroquine, scientific information used to support hydroxychloroquine's use, and a comparison group that included scientific evidence opposing hydroxychloroquine's use. Results show a much higher volume of tweets featuring medical endorsements and use of supportive scientific information compared to accurate and updated scientific evidence, that misinformation-related tweets propagated for a longer time frame, and that the majority of hydroxychloroquine Twitter discourse expressed positive views about the drug. Metadata from Twitter accounts found that prominent users within misinformation discourse were more likely to have media or political affiliation and explicitly expressed support for President Trump. Conversely, prominent accounts within the scientific opposition discourse primarily consisted of medical doctors or scientists but had far less influence in the Twitter discourse. Implications of these findings and connections to related social media research are discussed, as well as cognitive mechanisms for understanding susceptibility to misinformation and strategies to combat misinformation spread via online platforms.
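As a rough illustration of the clustering step described in this abstract, here is a hedged Python sketch: it extracts the word-pair "biterms" that the biterm topic model operates on, then groups tweets by textual similarity using TF-IDF and k-means as a simplified stand-in for full BTM inference. The tweet texts and cluster count are placeholders, not the study's data or pipeline.

```python
# Illustrative sketch only: biterm extraction plus a simple textual-similarity
# clustering as a stand-in for the biterm topic model used in the study.
from itertools import combinations
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

tweets = [  # placeholder tweet texts
    "doctors endorse hydroxychloroquine as a covid cure",
    "physicians everywhere say hydroxychloroquine works",
    "new study finds no benefit of hydroxychloroquine for covid",
]

def biterms(text):
    """Unordered word pairs within one short document: the unit BTM models."""
    words = sorted(set(text.lower().split()))
    return list(combinations(words, 2))

print(biterms(tweets[0])[:3])  # a few biterms from the first tweet

# Cluster tweets into candidate topics by textual similarity (stand-in step).
X = TfidfVectorizer(stop_words="english").fit_transform(tweets)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(list(zip(labels, tweets)))
```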
As current events around the world have illustrated, epistemological issues are at the center of our political lives. It has become increasingly difficult to discern legitimate sources of evidence, misinformation spreads faster than ever, and the role of truth in politics has allegedly decayed in recent years. It is therefore no coincidence that political discourse is currently saturated with epistemic notions like 'post-truth,' 'fake news,' 'truth decay,' 'echo chambers,' and 'alternative facts.' This book brings together leading philosophers to explore ways in which the analytic and conceptual tools of epistemology bear on political philosophy, and vice versa. It is organized around three broad themes: truth and knowledge in politics; epistemic problems for democracy; and disagreement and polarization. The authors provide new and rich insights on topics such as: propaganda, fake news, weaponized skepticism, belief polarization, political disagreement, the epistemic value of democracy, voter ignorance, irrationality in politics, political bullshit, and identity politics.
The spreading of COVID-19 misinformation on social media could have severe consequences for people's behavior. In this paper, we investigated the emotional expression of misinformation related to the COVID-19 crisis on Twitter and whether emotional valence differed depending on the type of misinformation. We collected 17,463,220 English tweets with 76 COVID-19-related hashtags for March 2020. Using the Google Fact Check Explorer API we identified 226 unique COVID-19 false stories for March 2020. These were clustered into six types of misinformation. Applying the 226 classifiers to the Twitter sample, we identified 690,004 tweets. Instead of running the sentiment analysis on all tweets, we manually coded a random subset of 100 tweets for each classifier to increase validity, reducing the dataset to 2,097 tweets. We found that only a minor part of the entire dataset was related to misinformation. Also, misinformation in general does not lean towards a certain emotional valence. However, comparisons of emotional valence for different types of misinformation uncovered that misinformation related to "virus" and "conspiracy" had a more negative valence than "cures," "vaccine," "politics," and "other." Knowing from existing studies that negative misinformation spreads faster, this demonstrates that filtering for misinformation type is fruitful and indicates that a focus on "virus" and "conspiracy" could be one strategy for combating misinformation. As emotional contexts affect misinformation spreading, knowledge about the emotional valence of different types of misinformation will help to better understand the spreading and consequences of misinformation.
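The valence comparison described here can be sketched, under assumptions, with an off-the-shelf sentiment scorer: score each tweet's valence and average by misinformation type. The tweets, type labels, and the choice of NLTK's VADER scorer below are illustrative assumptions, not the study's instruments or data.

```python
# Hedged sketch of the comparison step only: mean emotional valence per
# misinformation type, using NLTK's VADER scorer on placeholder tweets.
from collections import defaultdict
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

labeled_tweets = [  # (misinformation type, tweet text) -- placeholder data
    ("virus", "the virus was engineered to destroy us all"),
    ("conspiracy", "they are hiding the real death toll from you"),
    ("cures", "drinking warm water cures the infection, doctors confirm"),
]

valence_by_type = defaultdict(list)
for mtype, text in labeled_tweets:
    valence_by_type[mtype].append(sia.polarity_scores(text)["compound"])

for mtype, scores in sorted(valence_by_type.items()):
    print(mtype, round(sum(scores) / len(scores), 3))  # mean valence per type
```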
Government lockdowns, school closures, mass unemployment, health and wealth inequality. Political Philosophy in a Pandemic asks us, where do we go from here? What are the ethics of our response to a radically changed, even more unequal society, and how do we seize the moment for enduring change? Addressing the moral and political implications of pandemic response from states and societies worldwide, the 20 essays collected here cover the most pressing debates relating to the biggest public health crisis in the last century. Discussing the pandemic in five key parts covering social welfare, economic justice, democratic relations, speech and misinformation, and the relationship between justice and crisis, this book reflects the fruitful combination of political theory and philosophy in laying the theoretical and practical foundations for justice in the long term.
In matters of governance, is believing subject to ethical standards? If so, what are the criteria, and how relevant are they in our personal and political culture today? The really important matters in politics and governance necessitate a confidence that our beliefs will lead dependably to predictable and verifiable outcomes. Accordingly, it is unethical to hold a belief that is founded on insufficient evidence or based on hearsay or blind acceptance. In this paper, we demonstrate that the pragmatist concept of truth best meets this standard for ethically held belief in matters of politics and governance. Currently, these standards are abused by the gaslighting and distortion characteristic of the often social-media-driven 'misinformation society'. The legitimacy of and trust in our institutions and leadership that are requisite for good governance are thereby challenged, threatening the viability of our republic.
Politics is full of people who don't care about the facts. Still, while not caring about the facts, they are often concerned to present themselves as caring about them. Politics, in other words, is full of bullshitters. But why? In this paper I develop an incentives-based analysis of bullshit in politics, arguing that it is often a rational response to the incentives facing different groups of agents. In a slogan: bullshit in politics pays, sometimes literally. After first outlining an account of bullshit (Section 1), I discuss the incentives driving three different groups of agents to bullshit: politicians, the media, and voters (Section 2). I then examine several existing proposals to combat bullshit in politics, arguing that each will fail because they ignore the relevant underlying incentives (Section 3). I conclude somewhat pessimistically that a certain amount of bullshit in politics is inevitable (Section 4).
This handbook provides an overview of key ideas, questions, and puzzles in political epistemology. It is divided into seven sections: (1) Politics and Truth: Historical and Contemporary Perspectives; (2) Political Disagreement and Polarization; (3) Fake News, Propaganda, Misinformation; (4) Ignorance and Irrationality in Politics; (5) Epistemic Virtues and Vices in Politics; (6) Democracy and Epistemology; (7) Trust, Expertise, and Doubt.
Rumour has been part of collective human life for centuries. Communities deal with anxiety and make sense of the unknowable by mixing apprehensions with what is already known to them. With modernity, and in line with studies on a range of social phenomena, there have been efforts to develop a science of rumour. Most of these studies deal with rumour at the propositional level, such that the rumouring or rumour-rebutting subject invariably belongs to one of the two sides of the 'true–false' divide. Similar categories are followed in the study of rumour in social media, where the nodes in a rumour chain are, however, less hierarchical, and where images are increasingly used for persuasion. This paper, following a value-oriented approach, argues that the science of rumour has objectivized the problem and has suggested instrumental solutions like enhancing the digital literacy of social media users, whereas a value position should ideally attempt to efface the dilemma of the rumouring/rumour-rebutting subject and locate rumours within the larger socio-political and historical context of a society.
A striking feature of political discourse is how prone we are to disagree. Political opponents will even give different answers to factual questions, which suggests that opposing parties cannot agree on facts any more than they can on values. This impression is widespread and supported by survey data. I will argue, however, that the extent and depth of political disagreement is largely overstated. Many political disagreements are merely illusory. This claim has several important upshots. I will explore the implications of this idea for theories about voter misinformation, motivated reasoning, public reason liberalism, deliberative democracy, and a number of other issues.
There is a widely recognized dilemma of political epistemic trust. While the public needs to rely on the testimonies of epistemic authorities (e.g. politicians, policymakers, and scientists), it is risky to do so. One source of risk is self-interest. Epistemic authorities are prone to abuse the trust placed in them by misinforming the public for material and social gain. To reap the benefits of trust and mitigate the risk of abuse, liberal political theorists adopt the strategy of cultivating vigilant trust. By enhancing epistemic vigilance and epistemic autonomy, trust is both constrained and intellectualized. This chapter rejects this strategy for two reasons. First, it is undesirable. By over-intellectualizing trust, such an approach deprives trust of its important epistemic and social benefits. Second, it is unnecessary. The risk of abuse is exaggerated. The strategy fails to appreciate that epistemic authorities, in virtue of their social roles, are typically governed by the social norm of epistemic trustworthiness, not self-interest. The chapter concludes by suggesting an alternative strategy of cultivating trustworthiness. It seeks to strengthen epistemic authorities' responsiveness to the social norm of epistemic trustworthiness, thereby improving epistemic trust without over-intellectualizing it. It is further outlined how liberal-democratic institutions can implement it.
The research identifies the amount of headline/article discrepancy in corpora of Western and Russian online articles on sensitive political topics. A quarter of the Western headlines and nearly half of the Russian headlines distort the publications they introduce. The language means and manipulative strategies employed by the different sides vary considerably. Extensive use of expressive language and style variation are seen as leading causes of distortion in the Western corpus. The rich imagery used by the authors forms emotional implicatures that affect the reader's perception of the issue. In contrast, information substitution, subjective modality and selective citation are identified as major causes of distortion in the Russian corpus. Contributors to Russian news outlets rely on general rather than language manipulation strategies, including frequent use of logical fallacies and wrong generalizations. These techniques establish false logical sequences and wrong causative implicatures that compromise objective reporting. The underlying motives of the journalists' creating false emotional and causative implicatures in the headlines lie beyond the scope of the study; however, it is assumed that intentional change of the information introduced by the headline could be viewed as a covert misinformation attempt.
Grzegorz W. Kolodko, one of the world's leading authorities on economics and development policy and a key architect of Poland's successful economic reforms, applies his far-reaching knowledge to the past and future of the world economy, introducing a framework for understanding our global situation that transcends any single discipline or paradigm. Deploying a novel mix of scientific evaluation and personal observation, Kolodko begins with a brief discussion of misinformation and its perpetuation in economics and politics. He criticizes the simplification of complex economic and social issues and investigates the link between developments in the global economy and cultural change, scientific discoveries, and political fluctuations. Underscoring the necessity of conceptual and theoretical innovation in understanding our global economic situation, Kolodko offers a provocative study of globalization and the possibility of coming out ahead in an era of worldwide interdependence. Deeply critical of neoliberalism, which sought to transfer economic control exclusively to the private sector, Kolodko explores the virtues of social-economic development and the new rules of the economic game. He concludes with a look at our near and distant future, questioning whether we have a say in its making.
The contemporary debate in democracies routinely refers to online misinformation, disinformation, and deception as security issues in need of urgent attention. Despite this pervasive discourse, however, policymakers often appear incapable of articulating what security means in this context. This paper argues that we must understand the unique practical and normative challenges to security actualized by such online information threats when they arise in a democratic context. Investigating security-making in the nexus between technology and national security through the concept of "cybersovereignty," the paper highlights a shared blind spot in the envisaged protection of national security and democracy in cyberspace. Failing to consider the implications of non-territoriality in cyberspace, the "cybersovereign" approach runs into a cul-de-sac. Security-making, when understood as the continuous constitution of "cybersovereign" boundaries, presumes the existence of a legitimate securitizing actor; however, this actor can only be legitimate as a product of pre-existing boundaries. In response to the problems outlined, the article proposes an alternative object of protection in the form of human judgment and, specifically, "political judgment" in the Arendtian sense. The turn to political judgment offers a conceptualization of security that can account for contemporary policy practices in relation to security and the online information threat, as well as for the human communicating subject in the interactive and essentially incomplete information and communication environment.
This paper uses the controversy surrounding abstinence-only education to depict the current struggle between US government policy and science. The paper demonstrates the way in which this fight over science has become a communications battle and how the internet has become the vehicle through which ideology is able to masquerade as science. In addition, this paper identifies the damage to public health programs, and the ethical problems of providing selected information and misinformation to teenagers. Part of the resolution may be for scientists to become better communicators to the public about scientific principles and findings. If scientists are interested in improving sexuality education, we need to rely on science but may find it more advantageous to reframe our arguments around themes that perhaps have greater cultural salience.
As government pressure on major technology companies builds, both firms and legislators are searching for technical solutions to difficult platform governance puzzles such as hate speech and misinformation. Automated hash-matching and predictive machine learning tools – what we define here as algorithmic moderation systems – are increasingly being deployed to conduct content moderation at scale by major platforms for user-generated content such as Facebook, YouTube and Twitter. This article provides an accessible technical primer on how algorithmic moderation works; examines some of the existing automated tools used by major platforms to handle copyright infringement, terrorism and toxic speech; and identifies key political and ethical issues for these systems as the reliance on them grows. Recent events suggest that algorithmic moderation has become necessary to manage growing public expectations for increased platform responsibility, safety and security on the global stage; however, as we demonstrate, these systems remain opaque, unaccountable and poorly understood. Despite the potential promise of algorithms or 'AI', we show that even 'well optimized' moderation systems could exacerbate, rather than relieve, many existing problems with content policy as enacted by platforms, for three main reasons: automated moderation threatens to further increase opacity, making a famously non-transparent set of practices even more difficult to understand or audit; to further complicate outstanding issues of fairness and justice in large-scale sociotechnical systems; and to re-obscure the fundamentally political nature of speech decisions being executed at scale.
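To make the hash-matching half of "algorithmic moderation" concrete, here is a toy Python sketch under simplifying assumptions: uploads are fingerprinted and checked against a blocklist of known prohibited items. Production systems use perceptual hashes that survive re-encoding rather than the exact SHA-256 match shown here, and the content bytes are placeholders.

```python
# Toy sketch of hash-matching moderation (exact hashes; real systems use
# perceptual hashing so that re-encoded copies of known content still match).
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a hex digest used as the content's identity for matching."""
    return hashlib.sha256(content).hexdigest()

# Blocklist seeded with fingerprints of known prohibited items (placeholder).
blocklist = {fingerprint(b"known prohibited media bytes")}

def moderate(upload: bytes) -> str:
    """Block an upload if its fingerprint appears on the blocklist."""
    return "blocked" if fingerprint(upload) in blocklist else "allowed"

print(moderate(b"known prohibited media bytes"))  # -> blocked
print(moderate(b"some other upload"))             # -> allowed
```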
Public reason liberals claim that legitimate rules must be justifiable to diverse perspectives. This Public Justification Principle threatens to render rules illegitimate when they cannot be justified to reprehensible agents. Although public reason liberals have replies to this objection, they cannot avoid the challenge of powerful deceivers. Powerful deceivers trick people who are purportedly owed public justification into considering otherwise good rules unjustified. Avoiding this challenge requires discounting some failures of justification according to what caused people's beliefs. I offer a conception of public justification that accommodates these externalist considerations while positioning Public Reason Liberalism to provide insight into real cases of deception.
The distinction between misinformation and disinformation becomes especially important in political, editorial, and advertising contexts, where sources may make deliberate efforts to mislead, deceive, or confuse an audience in order to promote their personal, religious, or ideological objectives. The difference consists in having an agenda. It thus bears comparison with lying, because lies are assertions that are false, that are known to be false, and that are asserted with the intention to mislead, deceive, or confuse. One context in which disinformation abounds is the study of the death of JFK, which I know from more than a decade of personal research experience. Here I reflect on that experience and advance a preliminary theory of disinformation that is intended to stimulate thinking on this increasingly important subject. Five kinds of disinformation are distinguished and exemplified by real-life cases I have encountered. It follows that the story you are about to read is true.
At the time of writing, social media is rife with misinformation and disinformation, having very real effects on our political processes and on the vaccination efforts of the COVID pandemic. As the effort to pass new laws and regulations on social media companies gains momentum, concerns remain about how to balance free speech rights and even who, if anyone, should be the one to regulate social media. Drawing on Dewey's conception of the public, I argue for the regulation of social media companies by the state as part of the effort to curb misinformation and disinformation.
There remains no consensus among social scientists as to how to measure and understand forms of information deprivation such as misinformation. Machine learning and statistical analyses of information deprivation typically contain problematic operationalizations, which are too often biased towards epistemic elites' conceptions in ways that can undermine their empirical adequacy. A mature science of information deprivation should include considerable citizen involvement that is sensitive to the value-ladenness of information quality; doing so may improve the predictive and explanatory power of extant models.
Hobbits and hooligans -- Ignorant, irrational, misinformed nationalists -- Political participation corrupts -- Politics doesn't empower you or me -- Politics is not a poem -- The right to competent government -- Is democracy competent? -- The rule of the knowers -- Civic enemies.
Purpose: As interest in technology ethics is increasing, so is the interest in bringing schools of ethics from non-Western philosophical traditions to the field, particularly when it comes to information and communication technology. In light of this development and recent publications that result from it, this paper responds critically to recent work on Confucian virtue ethics (CVE) and technology. Design/methodology/approach: Four critiques are presented as theoretical challenges to CVE in technology, claiming that the current literature insufficiently addresses overall applicability, collective ethics issues, epistemic overconfidence within technology corporations, and amplification of epistemic overconfidence by the implementation of CVE. These challenges make use of general CVE literature and work on technology critique, political philosophy, epistemology and business ethics. Findings: Implementing CVE in technology may yield some benefits, but these may be outweighed by other outcomes, including strengthening hierarchies, widening inequities, and increasing, rather than limiting, predictive activity, personal data collection, misinformation, privacy violations and challenges to the democratic process. Originality/value: Though not directly advocating against CVE, the paper reveals hitherto unidentified and serious issues that should be addressed before CVE are used to inform ethics guidelines or regulatory policies. It also serves as a foundation for further inquiry into how Eastern philosophy more broadly can inform technology ethics in the West.
In the era of information and communication, issues of misinformation and miscommunication are more pressing than ever. Epistemic injustice, one of the most important and ground-breaking subjects to have emerged in philosophy in recent years, refers to those forms of unfair treatment that relate to issues of knowledge, understanding, and participation in communicative practices. The Routledge Handbook of Epistemic Injustice is an outstanding reference source to the key topics, problems and debates in this exciting subject. The first collection of its kind, it comprises over thirty chapters by a team of international contributors, divided into five parts: Core Concepts; Liberatory Epistemologies and Axes of Oppression; Schools of Thought and Subfields within Epistemology; Socio-political, Ethical, and Psychological Dimensions of Knowing; and Case Studies of Epistemic Injustice. As well as fundamental topics such as testimonial and hermeneutic injustice and epistemic trust, the Handbook includes chapters on important issues such as social and virtue epistemology, objectivity and objectification, implicit bias, and gender and race. Also included are chapters on areas in applied ethics and philosophy, such as law, education, and healthcare. The Routledge Handbook of Epistemic Injustice is essential reading for students and researchers in ethics, epistemology, political philosophy, feminist theory, and philosophy of race. It will also be very useful for those in related fields, such as cultural studies, sociology, education and law.
The relationship between truth and politics has rarely seemed more vexed. Worries about misinformation and disinformation abound, and the value of expertise for democratic decision-making is dismissed. Whom can we trust to provide us with reliable testimony? In Truth and Evidence, the latest in the NOMOS series, Melissa Schwartzberg and Philip Kitcher present nine timely essays shedding light on practices of inquiry. These essays address urgent questions, including what it means to #BelieveWomen; what factual knowledge we require to confront challenges like COVID-19; and how white supremacy shapes the law of evidence.
Immigration politics are almost universally characterized by their complexity, their ability to raise public passions, and misinformation, often based on generalizations and stereotypes. Recently, immigration has been intrinsically linked to crime, and public agendas have squarely focused on security issues as nativist political forces have successfully created a prominent image of migrants as threats to public security. This article argues that immigrant participation in criminal markets should be studied at the local level, where micro-criminal economies often dominated by migrants actually develop. By examining criminal activity at its base, the article investigates the nature of power in these markets. Specifically, it examines migrant crime in four cities and compares it to migrant integration in regular labour markets. By doing so, the article studies levels of migrant autonomy in both criminal and regular markets and argues that this autonomy indicates whether migrant crime is entrepreneurial or a sign of social deviance.
Are corporations ever morally obligated to engage in counterspeech—that is, in speech that aims to counter hate speech and misinformation? While existing arguments in moral and political philosophy show that individuals and states have such obligations, it is an open question whether those arguments apply to corporations as well. In this essay, I show how two such arguments—one based on avoiding complicity, and one based on duties of rescue—can plausibly be extended to corporations. I also respond to several objections to corporate counterspeech.
As will be made clear below, the terms extremism, fundamentalism, Islamism and jihadism are often used interchangeably by the public, something that has negative implications both for the integration of the Muslim community into Western society and for the efficacy of counter-extremism efforts. This paper aims to provide working definitions for these terms by understanding them independently of their misinformed socio-political contexts, and by determining how they relate to one another in what will be identified as a series of conceptual subsets. In doing so, this paper will attempt to provide a framework for the usage of these terms in governmental, academic, and public contexts, while removing some of the noise surrounding the important, and often highly sensitive, contexts within which these terms are referenced. In this paper, religious extremism will be defined as an ideological prerequisite to fundamentalism, Islamism and jihadism. It is a rejectionist and dogmatic orientation that neglects balance in all elements of an individual's ideological outlook. However, a fundamentalist, when defined in terms of faith, seeks to legitimise his or her beliefs through a non-contextual analysis of the relevant religious texts. An Islamist seeks to implement his or her fundamentalist views with a view to altering the structures of governance in accordance with the aforementioned rejectionist traits. Finally, and in relatively simple terms, a jihadist is an individual who champions the violent, global imposition of the Islamists' beliefs. Jihadism is a subset of Islamism, Islamism is a subset of fundamentalism, and fundamentalism is a subset of extremism.
Purpose: This study investigates the incidence of ethical violations in the Ghanaian press, which has become topical in the wake of misinformation in a charged political atmosphere. Public interest institutions have questioned the unprofessional conduct of journalists covering election campaigns in recent years. This study content analysed political stories from two leading Ghanaian newspapers (Daily Graphic and Daily Guide) to determine the nature and extent of ethical violations, and to examine the level of prominence accorded to political news stories by the two dailies. Design/methodology/approach: This paper relied on qualitative content analysis for data gathering and analysis. A total of 387 political news items published between 1 October and 30 November 2020 were analysed. Findings: This study found infractions of various kinds against Article 1 of the Ghana Journalists Association (GJA) codes of ethics, chief among which is the deliberate publication of news stories without cross-checking facts. Other infractions of Articles 17, 11, 6 and 5 of the GJA codes of ethics were observed. Political news coverage favours the governing New Patriotic Party (NPP) and the main opposition National Democratic Congress (NDC) over other parties, with the two parties (NPP-NDC) given greater prominence and salience by the Ghanaian press. Originality/value: The research makes a modest contribution to the growing concern for journalism ethics in an increasing ecology of fake news.
In much of the current academic and public discussion, conspiracy theories are portrayed as a negative phenomenon, linked to misinformation, mistrust in experts and institutions, and political propaganda. Rather surprisingly, however, philosophers working on this topic have been reluctant to incorporate a negatively evaluative aspect when either analyzing or engineering the concept conspiracy theory. In this paper, we present empirical data on the nature of the concept conspiracy theory from five studies designed to test the existence, prevalence and exact form of an evaluative dimension to the ordinary concept conspiracy theory. These results reveal that, while there is a descriptive concept of conspiracy theory, the predominant use of conspiracy theory is deeply evaluative, encoding information about epistemic deficiency and often also derogatory and disparaging information. On the basis of these results, we present a new strategy for engineering conspiracy theory to promote theoretical investigations and institutional discussions of this phenomenon. We argue for engineering conspiracy theory to encode an epistemic evaluation, and for introducing a descriptive expression—such as 'conspiratorial explanation'—to refer to the purely descriptive concept conspiracy theory.
In late March of 2020, a new hashtag, #FilmYourHospital, made its first appearance on social media. The hashtag encouraged people to visit local hospitals to take pictures and videos of empty hospitals to help "prove" that the COVID-19 pandemic was an elaborate hoax. Using techniques from Social Network Analysis, this case study examines how this conspiracy theory propagated on Twitter and whether the hashtag's virality was aided by the use of automation or coordination among Twitter users. We found that while much of the content came from users with limited reach, the oxygen that fueled this conspiracy in its early days came from a handful of prominent conservative politicians and far-right political activists on Twitter. These power users used the hashtag to build awareness of the campaign and to encourage their followers to break quarantine and film what was happening at their local hospitals. After the initial boost by a few prominent accounts, the campaign was mostly sustained by pro-Trump accounts, followed by a secondary wave of propagation outside the U.S. The rise of the #FilmYourHospital conspiracy from a single tweet demonstrates the ongoing challenge of addressing false, viral information during the COVID-19 pandemic. While the spread of misinformation can potentially be mitigated by fact-checking and directing people to credible sources of information from public health agencies, false and misleading claims that are driven by politics and supported by strong convictions rather than science are much harder to root out.
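The network analysis described in this case study can be gestured at with a small Python sketch: build a retweet graph and rank accounts by in-degree centrality to surface the handful of "power users" who amplified the hashtag. The edges below are invented placeholders, not the study's data, and in-degree centrality is only one of the measures such an analysis might use.

```python
# Illustrative sketch: rank accounts in a toy retweet graph by in-degree
# centrality to find the most-amplified (most retweeted) users.
import networkx as nx

# (retweeter, original_author) pairs for the hashtag -- placeholder data.
retweets = [
    ("user_a", "prominent_politician"),
    ("user_b", "prominent_politician"),
    ("user_c", "far_right_activist"),
    ("user_d", "prominent_politician"),
]

G = nx.DiGraph()
G.add_edges_from(retweets)  # edge direction: retweeter -> original author

centrality = nx.in_degree_centrality(G)
for account, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(account, round(score, 2))
```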
In 1996, Alan Sokal, a Professor of Physics at New York University, wrote a paper for the cultural-studies journal Social Text, entitled 'Transgressing the Boundaries: Towards a transformative hermeneutics of quantum gravity'. It was reviewed, accepted and published. Sokal immediately confessed that the whole article was a hoax - a cunningly worded paper designed to expose and parody the style of extreme postmodernist criticism of science. The story became front-page news around the world and triggered fierce and wide-ranging controversy. Sokal is one of the most powerful voices in the continuing debate about the status of evidence-based knowledge. In Beyond the Hoax he turns his attention to a new set of targets - pseudo-science, religion, and misinformation in public life. 'Whether my targets are the postmodernists of the left, the fundamentalists of the right, or the muddle-headed of all political and apolitical stripes, the bottom line is that clear thinking, combined with a respect for evidence, are of the utmost importance to the survival of the human race in the twenty-first century.' The book also includes a hugely illuminating annotated text of the Hoax itself, and a reflection on the furore it provoked.
Belief polarization is widely seen as threatening to wreak havoc on our shared political lives. It is often assumed that BP is the product of epistemically irrational behaviors at the individual level. After distinguishing between BP as it occurs in intra-group and inter-group settings, this paper argues that neither process necessarily reflects individual epistemic irrationality. It is true that these processes can work in tandem to produce so-called "echo chambers." But while echo chambers are often problematic from the point of view of collective rationality, it doesn't follow that individuals are doing anything wrong, epistemically speaking, in seeking them out. In non-ideal socio-epistemic contexts, echo chamber construction might provide one's best defense against systematic misinformation and deception.
In the recent case of Nike v. Kasky both sides argued that their standard for distinguishing commercial speech from political speech would create the better policy for ensuring accurate and complete disclosure of social information by corporations. Using insights from information economics, we argue that neither standard will achieve the policy goal of optimal truthful disclosure. Instead, we argue that the appropriate standard is one of optimal truthful disclosure—balancing the value of speech against the costs of misinformation. Specifically, we argue that an SEC-sanctioned safe harbor available under a closely supervised system for social reporting will bring about optimal truthful disclosure. The scheme is intended to enhance stakeholder confidence in corporate social and political commentary, while at the same time encouraging corporations to provide accurate information in a fair playing field of public debate.
Recent work in economics has rediscovered the importance of belief-based utility for understanding human behaviour. Belief 'choice' is subject to an important constraint, however: people can only bring themselves to believe things for which they can find rationalizations. When preferences for similar beliefs are widespread, this constraint generates rationalization markets, social structures in which agents compete to produce rationalizations in exchange for money and social rewards. I explore the nature of such markets, I draw on political media to illustrate their characteristics and behaviour, and I highlight their implications for understanding motivated cognition and misinformation.
This article reflects on the problem of false belief produced by the integrated psychological and algorithmic landscape humans now inhabit. Following the work of scholars such as Lee McIntyre (Post-Truth, MIT Press, 2018) or Cailin O'Connor and James Weatherall (The Misinformation Age: How False Beliefs Spread, Yale University Press, 2019), it combines recent discussions of fake news, post-truth, and science denialism across the disciplines of political science, computer science, sociology, psychology, and the history and philosophy of science that variously address the ineffectiveness, in a digital era, of countering individual falsehoods with facts. Truth and falsehood, it argues, rather than being seen as properties or conditions attached to individual instances of content, should now be seen as collective, performative, and above all persuasive phenomena. They should be practically evaluated as networked systems and mechanisms of sharing in which individually targeted actions are combining with structural tendencies (both human and mechanical) in unprecedented ways. For example, the persuasive agency of apparent consensus (clicks, likes, bots, trolls) is newly important in a fractured environment that only appears to be, but is no longer 'public'; the control of narratives, labels, and associations is a live, time-sensitive issue, a continuous contest, or ongoing cusp. Taking a social approach to truth yields observations of new relevance; from how current strategies of negative cohesion, blame, and enemy-creation depend crucially on binary ways of constructing the world, to how the offer of identity/community powerfully cooperates with the structural tendencies of algorithm-driven advertiser platforms towards polarisation. Remedies for these machine-learned and psychological tendencies lie in end-user education. So the Arts and Humanities, whether via comparisons with previous historical periods, or via principles of critical thinking and active reading, offer crucial resources to help counter what since 1997 silicon valley executives and scholars have called 'persuasive technology' (Fogg in Persuasive Technology: Using Computers to Change What we Think and Do, Morgan Kaufmann, 2003; Hamari et al. (eds) in Persuasive Technology, Springer International Publishing, 2014; Harris in How a Handful of Tech Companies Control Billions of Minds Every Day, 2017; Lanier in Who Owns the Future? Simon & Schuster, 2014 and Ten Arguments for Deleting your Social Media Accounts Right Now, Picador, 2019). The article proposes a paradigm shift in public understandings of this new social environment: from a culture of discovery, where what matters is what exists or is in fact the case, to a culture of iteration, where what matters is what gets repeated.
There has been much concern with the abundance of misinformation in public discourse. Although misinformation has always played a role in political debate, its character has shifted from support fo...