In this article, inserting a highly interesting question at two stages of an online, survey-based field experiment is shown to produce a better survey completion rate (decreasing completion refusal by 8%) and better sample representativeness (increasing the number of moderate answer patterns by 12%) than placing the same highly interesting question at the beginning of the survey only. Using nonparametric tests and subgroup probability analysis, we measure effects on survey completion rates, response bias, and reported demographic differences. With regard to sample representativeness, the results also raise questions about the sensitivity of the conventional practice of comparing early-respondent to late-respondent mean scores as a method of investigating nonresponse bias in marketing research. Alternative approaches to measuring potential nonresponse bias are compared with the tradition of testing for differences between early-wave and late-wave respondent means. The results indicate that the conventional mean test fails to identify nonresponse bias: highly interested or opposed respondents in the early waves produce means equivalent to those of less interested or opposed respondents in the later waves (e.g., 1's and 5's vs. 2's and 4's, both averaging to 3), differences that are nonetheless identifiable through kurtosis and probability analysis.
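A minimal sketch of the statistical point above, using entirely hypothetical data (not the article's dataset or code): two response waves with identical means but different distribution shapes, where a conventional mean comparison flags nothing while kurtosis and the probability of extreme answers reveal the difference. The response patterns and sample sizes are made up for illustration.

```python
# Hypothetical illustration: why comparing early-wave vs. late-wave means can miss
# nonresponse bias that distribution shape (kurtosis) and extreme-response
# probabilities reveal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Early wave: polarized respondents answering mostly 1s and 5s.
early = rng.choice([1, 5], size=500)
# Late wave: less engaged respondents clustering around the scale midpoint.
late = rng.choice([2, 3, 4], size=500, p=[0.3, 0.4, 0.3])

# Conventional test: both waves average about 3, so the mean test flags nothing.
print("means:", early.mean(), late.mean())
print("t-test p-value:", stats.ttest_ind(early, late).pvalue)

# Shape-based checks expose the difference the means conceal.
print("excess kurtosis (early, late):", stats.kurtosis(early), stats.kurtosis(late))
print("share of extreme (1 or 5) answers:",
      np.isin(early, [1, 5]).mean(), np.isin(late, [1, 5]).mean())
```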
Corporate Social Responsibility (CSR) is a tortured concept. In this paper, we reframe CSR into a number of discrete Corporate Social Responsibilities (CSRs), each of which can have a positive or negative social impact, and each of which has an endogenous, managerially driven component and an exogenous, stakeholder-driven component. Using an industry-level sample drawn from the KLD database, we test the impact of hypothesized drivers of CSR on the various CSRs.
In a previous paper linking Simondon to biological and systems-theoretical discourses in autopoiesis and to debates about contemporary technogenesis, I have argued that Simondon’s ontology of individuation furnishes a basis to theorize the “agency” of the environment that comes to the fore as we humans enter, as we do increasingly today, into alliances with sophisticated computational technologies.1 In concert with researchers like Andy Clark and N. Katherine Hayles, I embrace the “technical distribution” of cognition and perception as a way of understanding the complex couplings between humans and machines that are typical of our contemporary world, but that have, in fact, been part of human technogenesis since …
The study of decision making has traditionally been dominated by axiomatic utility theories. More recently, an alternative approach, which focuses on the micro-mechanisms of the underlying deliberation process, has been shown to account for several "paradoxes" in human choice behavior that simple utility-based approaches cannot explain. Decision field theory (DFT) is a cognitive-dynamical model of decision making and preferential choice, built on the fundamental principle that decisions are based on the accumulation of subjective evaluations of choice alternatives until a threshold criterion is met. This article extends the basic DFT framework to the domain of dynamic decision making. DFT-Dynamic (DFT-D) is proposed as a new alternative to normative backward induction. Through its attention to the processes underlying planning and deliberation, DFT-D provides simple, emergent explanations for violations of choice principles traditionally taken as evidence of irrationality. A recent multistage decision-making study is used to showcase the model's efficacy for developing cognitive models of individual strategies.
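To make the accumulation-to-threshold principle concrete, here is a deliberately simplified sketch in the spirit of DFT; it is not the authors' DFT-D model, and the attention scheme, decay, noise level, threshold, and attribute values are all assumptions chosen for illustration.

```python
# Minimal sketch of accumulation-to-threshold choice, loosely inspired by DFT.
import numpy as np

def dft_choice(values, threshold=1.0, decay=0.05, noise=0.1, max_steps=10_000,
               rng=None):
    """Accumulate momentary subjective evaluations of each alternative until
    one preference state crosses the threshold; return the chosen index."""
    rng = rng or np.random.default_rng()
    values = np.asarray(values, dtype=float)   # rows: alternatives, cols: attributes
    n_alt, n_attr = values.shape
    pref = np.zeros(n_alt)                     # preference states
    for _ in range(max_steps):
        attr = rng.integers(n_attr)            # attention samples one attribute
        valence = values[:, attr] - values[:, attr].mean()  # comparative evaluation
        pref = (1 - decay) * pref + valence + rng.normal(0, noise, n_alt)
        if pref.max() >= threshold:
            return int(pref.argmax())
    return int(pref.argmax())                  # fall back to the current leader

# Example: two alternatives that trade off on two attributes (e.g., quality vs. price).
print("chosen alternative:", dft_choice([[0.8, 0.3], [0.4, 0.6]]))
```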
Motivation: There is an ongoing search for definitive and reliable biomarkers to forecast or predict imminent seizure onset, but to date most research has been limited to EEG with sampling rates <1,000 Hz. High-frequency oscillations (HFOs) have gained acceptance as an indicator of epileptic tissue, but few studies have investigated the temporal properties of HFOs or their potential role in seizure prediction. Here we evaluate time-varying trends in preictal HFO rates as a potential biomarker for seizure prediction. Methods: HFOs were identified for all interictal and preictal periods with a validated automated detector in 27 patients who underwent intracranial EEG monitoring. We used LASSO logistic regression with several features of the HFO rate to distinguish preictal from interictal periods in each individual. We then tested these models with held-out data and evaluated their performance with the area under the receiver-operating-characteristic curve (AUC). Finally, we assessed the significance of these results using non-parametric statistical tests. Results: There was variability in the ability of HFOs to discern preictal from interictal states across our cohort. We identified a subset of 10 patients in whom the presence of the preictal state could be predicted better than chance. For some of these individuals, the average AUC on the held-out data exceeded 0.80, which suggests that HFO rates can significantly differentiate preictal and interictal periods for certain patients. Significance: These findings show that temporal trends in HFO rate can predict the preictal state better than chance in some individuals. Such promising results indicate that future prediction efforts would benefit from the inclusion of high-frequency information in their predictive models and technological architecture.
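As a rough illustration of the modeling step described above (L1-penalized logistic regression evaluated by held-out AUC), the following sketch uses synthetic data; the feature names, distributions, split, and regularization strength are hypothetical and do not reproduce the study's pipeline or results.

```python
# Illustrative sketch: LASSO (L1) logistic regression separating "preictal" from
# "interictal" windows using simple HFO-rate features, scored by held-out ROC AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 400
# Hypothetical features per window: mean HFO rate, rate slope, rate variance.
X_interictal = rng.normal([1.0, 0.0, 0.5], 0.3, size=(n, 3))
X_preictal = rng.normal([1.4, 0.2, 0.8], 0.3, size=(n, 3))
X = np.vstack([X_interictal, X_preictal])
y = np.r_[np.zeros(n), np.ones(n)]            # 0 = interictal, 1 = preictal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"held-out AUC: {auc:.2f}")             # > 0.5 means better than chance
```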
Pickering & Garrod's (P&G's) integrated model of production and comprehension includes no explicit role for nonlinguistic cognitive processes. Yet, how domain-general cognitive functions contribute to language processing has become clearer with well-specified theories and supporting data. We therefore believe that their account can benefit by incorporating functions like working memory and cognitive control into a unified model of language processing.
Cognitive control refers to the regulation of mental activity to support flexible cognition across different domains. Cragg and Nation (2010) propose that the development of cognitive control in children parallels the development of language abilities, particularly inner speech. We suggest that children’s late development of cognitive control also mirrors their limited ability to revise misinterpretations of sentence meaning. Moreover, we argue that for certain tasks, a tradeoff between bottom-up (data-driven) and top-down (rule-based) thinking may actually benefit performance in both children and adults. Specifically, we propose that a lack of cognitive control may promote important aspects of cognitive development, like language acquisition and creativity.
Children are vulnerable to the adverse effects of food advertising. Food commercials are known to increase hedonic, taste-oriented, and unhealthy food decisions. The current study examined how promoting resilience to food commercials affected susceptibility to unhealthy food decision-making in children. To promote resilience to food commercials, we used a food advertising literacy intervention intended to enhance cognitive skepticism and critical thinking and to decrease positive attitudes toward commercials. Thirty-six children aged 8–12 years were randomly assigned to the food advertising literacy intervention or the control condition. Eighteen children received four brief intervention sessions via video over a 1-week period. In each session, children watched six food commercials with intervention narratives interspersed among them. While watching the food commercials and narratives, children were encouraged to speak their thoughts aloud spontaneously, which revealed their attitudes toward the commercials. The eighteen children in the control condition had four control sessions over 1 week and watched the same food commercials, without intervention narratives, while thinking aloud. The first and last sessions were held in the laboratory, and the second and third sessions were held at the children's homes. Susceptibility to unhealthy food decision-making was indexed by the decision weights of taste attributes, taste perception, food choices, ad libitum snacking, and cognitive and affective attitudes toward food commercials. As hypothesized, the intervention decreased susceptibility to unhealthy food decision-making, as evidenced by reduced decision weights of taste in food decisions, decreased perceived tastiness of unhealthy foods, and increased cognitive skepticism and critical thinking toward food commercials. In addition, as children's opinions assimilated to the intervention narratives, their cognitive skepticism and critical thinking toward commercials increased. These results were not observed in the control condition. However, this brief intervention was not enough to change actual food choices or food consumption. The results of this study suggest that promoting resilience to food commercials by enhancing cognitive skepticism and critical thinking effectively reduced children's susceptibility to unhealthy food decision-making.
Transcranial Magnetic Stimulation (TMS) is a non-invasive neurostimulatory and neuromodulatory technique increasingly used in clinical and research practices around the world. Historically, the ethical considerations guiding the therapeutic practice of TMS were largely concerned with aspects of subject safety in clinical trials. While safety remains of paramount importance, the recent US Food and Drug Administration approval of the Neuronetics NeuroStar TMS device for the treatment of specific medication-resistant depression has raised a number of additional ethical concerns, including marketing, off-label use, and technician certification. This article provides an overview of the history of TMS and highlights the ethical questions that are likely to arise as the therapeutic use of TMS continues to expand.
Political corruption imposes substantial costs on shareholders in the U.S. Yet, we understand little about the basic factors that exacerbate or mitigate the value consequences of political corruption. Using federal corruption convictions data, we find that firm-level economic rents and monitoring mechanisms moderate the negative relation between corruption and firm value. The value consequences of political corruption are exacerbated for firms operating in low-rent product markets and mitigated for firms subject to external monitoring by state governments or monitoring induced by disclosure transparency. Our results should inform managers and policymakers of the tradeoffs imposed on firms operating in politically corrupt districts.
The period in the foundations of mathematics that started in 1879 with the publication of Frege's Begriffsschrift and ended in 1931 with Gödel's Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I can reasonably be called the classical period. It saw the development of three major foundational programmes: the logicism of Frege, Russell and Whitehead, the intuitionism of Brouwer, and Hilbert's formalist and proof-theoretic programme. In this period, there were also lively exchanges between the various schools, culminating in the famous Hilbert-Brouwer controversy in the 1920s. The purpose of this anthology is to review the programmes in the foundations of mathematics from the classical period and to assess their possible relevance for contemporary philosophy of mathematics. What can we say, in retrospect, about the various foundational programmes of the classical period and the disputes that took place between them? To what extent do the classical programmes of logicism, intuitionism and formalism represent options that are still alive today? These questions are addressed in this volume by leading mathematical logicians and philosophers of mathematics.
This paper presents an introduction to Arne Grøn’s existential hermeneutics as a philosophical method, while also attempting to indicate how Grøn’s work contributes to and engages in a number of crucial topics in modern continental philosophy. The first section of the paper shows how Grøn draws on Paul Ricoeur and Michael Theunissen to rethink the concept of existence through a reading of Kierkegaard that uncouples this concept from the self-evident status it attained in twentieth-century existentialism. The second section of the paper argues that Grøn proposes an existential ethics that takes the Kierkegaardian notion that humans are inherently normative beings and uses this as a basis for a critique of ethics, as well as for establishing an ethics of vision inspired by Kierkegaard. The third section of the paper presents a reading of Grøn’s notion of religion as an inextricable part of human existence.
We propose a coevolutionary model of secrecy and stigmatization. According to this model, secrecy functions to conceal potential fitness costs detected in oneself or one’s genetic kin. In three studies, we found that the content of participants’ distressing secrets overlapped significantly with three domains of social information that were important for inclusive fitness and served as cues for discriminating between rewarding and unrewarding interaction partners: health, mating, and social-exchange behavior. These findings support the notion that secrecy functions primarily as a defense against stigmatization by suppressing information about oneself or one’s kin that evolutionarily has been devalued in mating and social exchange.