Hypertext may represent a new paradigm for exploring legal sources, one in which links are established according to pertinent relationships between statute texts and case law. However, to discover relevant information in such a network, a browsing mechanism is not enough when faced with a large volume of texts. This paper describes a new retrieval model where documents are represented according to both their content and their relationships with other sources of information.
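As a rough, hedged illustration of the kind of scoring such a model implies (not the paper's actual formulation), the sketch below blends a document's own content similarity with relevance propagated along its hypertext links between statutes and case law; the function names, the weight alpha, and the toy data are assumptions made for the example.

```python
# Illustrative sketch only: combine content similarity with link-based relevance.
# All names and weights here are assumptions, not the paper's model.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def score(query: str, docs: dict, links: dict, alpha: float = 0.7) -> list:
    """Rank documents by blending their own similarity to the query with the
    average similarity of the documents they link to (hypertext neighbours)."""
    q = Counter(query.lower().split())
    content = {d: cosine(q, terms) for d, terms in docs.items()}
    blended = {}
    for d in docs:
        neigh = [content[n] for n in links.get(d, []) if n in content]
        neighbourhood = sum(neigh) / len(neigh) if neigh else 0.0
        blended[d] = alpha * content[d] + (1 - alpha) * neighbourhood
    return sorted(blended.items(), key=lambda kv: -kv[1])

# Toy example: a case that cites a relevant statute inherits part of its score.
docs = {
    "statute_12": Counter("jurisdiction contract enforcement".split()),
    "case_A": Counter("appeal contract damages".split()),
}
links = {"case_A": ["statute_12"]}
print(score("contract jurisdiction", docs, links))
```

In this sketch the linked case inherits part of the statute's relevance, which is the intuition behind representing documents by both their content and their relationships.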
Hypertext and knowledge based systems can be viewed as complementary technologies, which if combined into a composite system may be able to yield a whole which is greater than the sum of the parts. To gain the maximum benefits, however, we need to think about how to harness this potential synergy. This will mean devising new styles of system, rather than merely seeking to enhance the old models. In this paper we describe our model for coupling hypertext and a knowledge based system, and then go on to describe two prototype systems which attempt to exploit this composite framework. The first application concerns animated hypertext which accords the text a central role whilst giving access to all the advantages of a knowledge based system. The second suggests how we can augment the hypertext by providing links which reflect the conceptual model of a knowledge based system in the domain, so as to provide a more structured traversal of the text.
In this article it is argued that the relation between the so-called Gutenberg galaxis of print culture and the Turing galaxis of digital media is not one of opposition and substitution, but rather one of co-evolution and integration. Or more precisely: that the Gutenberg galaxis on the one hand can be inscribed into the Turing galaxis, which on the other hand is textual in character since it is based on linear and serially processed representations manifested in a binary alphabet. In continuation of this, text and hypertext are described as notions of related but different sorts of complex systems, and it is argued that both texts and hypertexts are best thought of as sequentially organised. The first part of the paper contains a description of the basic properties of computers and digital representation, answering the question: what are the properties common to all kinds of use of computers – whether used for one purpose or another? The second part is concerned with the relationship between printed and electronic text. The third part addresses the idea of hypertext as systems which exploit modal shifts between node-mode and link-mode as a significant part of the semantics of the system.
The lostness measure, an implicit and unobtrusive measure originally designed for assessing the usability of hypertext systems, could be useful in Virtual Reality games where players need to find information to complete a task. VR locomotion systems with node-based movement mimic the actions for exploration and browsing found in hypertext systems. For that reason, hypertext usability measures such as "lostness" can be used to identify how disoriented a player is when completing tasks in an educational game by examining the steps made by the player. An evaluation of two different lostness measures, global and local lostness, based on two different types of tasks, is described in a VR educational game using 13 college students between 14 and 18 years old in a first study and extended using 12 extra participants in a second study. Multiple Linear Regression analyses showed, in both studies, that local lostness, and not global lostness, had a significant effect on a post-game knowledge test. Therefore, we argued that local lostness was able to predict how well participants would perform on a post-game knowledge test, indicating how well they learned from the game. In-game experience aspects were also evaluated and, interestingly, it was also found that participants learned less when they felt more present in the game. We believe these two measures relate to cognitive overload, which is known to have an adverse effect on learning. Further research should investigate the lostness measure for use in an online adaptive game system and design the game system in such a way that the risk of cognitive overload is minimized when learning, resulting in higher retention of information.
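For readers unfamiliar with the underlying metric, the classic lostness measure (Smith, 1996) combines how much a traversal revisits nodes with how far it exceeds the optimal path. The sketch below is a generic illustration of that formula, not the authors' exact global or local instrumentation, and the example path is invented.

```python
# Generic sketch of the classic lostness measure; the paper's global/local
# variants may compute their inputs differently.
from math import sqrt

def lostness(visited: list, required: int) -> float:
    """Lostness L = sqrt((N/S - 1)^2 + (R/N - 1)^2).

    visited  : ordered node ids the player actually traversed
    required : minimum number of nodes (R) needed to complete the task
    S is the total number of nodes visited (with revisits), N the unique nodes.
    """
    S = len(visited)
    N = len(set(visited))
    if S == 0 or N == 0:
        return 0.0
    return sqrt((N / S - 1) ** 2 + (required / N - 1) ** 2)

# Hypothetical example: six visits over four rooms when only three were needed.
path = ["lobby", "lab", "lobby", "archive", "lab", "exhibit"]
print(round(lostness(path, required=3), 3))  # ~0.417
```

Values near 0 indicate a near-optimal traversal, while larger values (up to about 1.41) indicate increasing disorientation.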
The composition of a timeline depends on purpose, perspective, and scale – and on the very understanding of the word, the phenomenon referred to, and whether the focus is the idea or concept, an instance of an idea or a phenomenon, a process, or an event, and so forth. The main function of timelines is to provide an overview of a long history; it is a kind of mnemotechnic device or a particular kind of Knowledge Organization System (KOS). The entries in the timeline should be brief and indisputable. Therefore, timelines often identify the first occurrences rather than the most widespread or most qualified instances, leaving the fuller and more complex, and possibly disputable, story out. But even first occurrences are often difficult to establish. The first occurrence is most often only the first finding of an instance. Older instances may be found, and competing definitions develop either within a field or in different fields. This is further complicated since the phenomena, their names, and their meanings may change over time. Former meanings may become redundant, or they must accommodate and coexist with new meanings. The time and place of the composition of the timeline are to be considered in interpreting the things listed. The following note will discuss these issues as they occur in the development of the notions of text, e-text and hypertext, and the origin of machine translation.
Interactive computer systems can support their users in problem solving, both in performing their work tasks and in using the systems themselves. Not only is direct support for heuristics beneficial, but providing it also modifies the form of computer support offered. This paper defines and explores the use of problem solving heuristics in user interface design.
There is more to legal knowledge representation than knowledge-bases. It is valuable to look at legal knowledge representation and its implementation across the entire domain of computerisation of law, rather than focussing on sub-domains such as legal expert systems. The DataLex WorkStation software and applications developed using it are used to provide examples. Effective integration of inferencing, hypertext and text retrieval can overcome some of the limitations of these current paradigms of legal computerisation which are apparent when they are used on a stand-alone basis. Effective integration of inferencing systems is facilitated by use of a (quasi) natural language knowledge representation, and the benefits of isomorphism are enhanced. These advantages of integration apply to all forms of inferencing, including document generation and case-based inferencing. Some principles for development of integrated legal decision support systems are proposed.
The first philosophical studies of digital technology did not specify the characteristics of its various technical applications or instruments. They looked at virtual reality and used the metaphor of immersion to study the set of all technological developments, as had been done before with the novel. Thus, the Internet is misunderstood by many experts as virtuality (following Baudrillard), rather than its hypertext system being linked more closely to other theories such as those proposed by Deleuze and Guattari or Bourriaud.
The study of hyperidentities is a growing field of research. While hyperidentities hark back to before 1965, they found a rebirth in the late seventies and early eighties. The field is being expanded in several directions, from connections with clone theory, to finite basis problems, to semigroup theory, to the classification of M-solid varieties. Applications to digital logic, formal languages, and hypertext systems have been suggested. The concept of a P-compatible equation, where P is a partition on the set of operation symbols, is a good tool to study the structure of identities. In [4] we asked for P-compatible hyperidentities. In this paper we will consider hypersubstitutions which are compatible with the partition P and will develop a generalized equational theory for certain P-compatible hyperidentities.
Electronic text can be defined on two different, though interconnected, levels. On the one hand, electronic text can be defined by taking the notion of "text" or "printed text" as the point of departure. On the other hand, electronic text can be defined by taking the digital format as the point of departure, where everything is represented in the binary alphabet. While the notion of text in most cases lends itself to being independent of medium and embodiment, it is also often tacitly assumed that it is in fact modeled on the print medium, instead of, for instance, on hand-written text or speech. In the late 20th century, the notion of "text" was subjected to increasing criticism, as can be seen in the question that has been raised in literary text theory about whether "there is a text in this class." At the same time, the notion was expanded by including extralinguistic sign modalities (images, videos). A basic question, therefore, is whether electronic text should be included in the enlarged notion of text as a new digital sign modality added to the repertoire of modalities, or whether it should be understood as a sign modality that is both an independent modality and a container that can hold other modalities. In the first case, the notion of electronic text would be paradigmatically formed around the e-book, which was conceived as a digital copy of a printed book but is now a deliberately closed work. Even closed works in digital form will need some sort of interface and hypertextual navigation that together constitute a particular kind of paratext needed for accessing any sort of digital material. In the second case, the electronic text is defined by the representation of content and (some parts of the) processing rules as binary sequences manifested in the binary alphabet. This wider notion would include, for instance, all sorts of scanning results, whether of the outer cosmos or the interior of our bodies, and of digital traces of other processes in-between (machine readings included). Since other alphabets, such as the genetic alphabet, and all sorts of images may also be represented in the binary alphabet, such materials will also belong to the textual universe within this definition. A more intriguing implication is that born-digital materials may also include scripts and interactive features as intrinsic parts of the text. The two notions define the text on different levels: one is centered on the Latin alphabet, the other on the binary alphabet, and both definitions include hypertext, interactivity, and multimodality as constituent parameters. In the first case, hypertext is included as a navigational, paratextual device; whereas in the second case, hypertext is also incorporated in the narrative within an otherwise closed work, or as a constituent element of the textual universe of the web, where it serves the ongoing production of (possibly scripted) connections and disconnections between blocks of textual content. Since the early decades of the 21st century still represent only the very early stages of the globally distributed universe of web texts, this is also a history of the gradual unfolding of the dimensions of these three constituents—hypertext, interactivity, and multimodality.
The result is a still-expanding repertoire of genres, including some that are emerging via path dependency; some via remediation; and some as new genres that are unique for networked digital media, including "social media texts" and a growing variety of narrative and discursive multiple-source systems.
In making a contribution, a person's life gains meaning. A small contribution affects a few people for a short time, while a large contribution affects many people for a long time. Within the framework of an abstract, computational world, a metric on contributions is defined. Simulation of the computational model shows the critical role of gradualness. Gradualness can be supported by human-computer systems in which the computer does the copying and arithmetic, and the human applies a rich understanding of the world. The role of gradualness in the research areas of machine learning and hypertext is highlighted.
"Ever Since This World Began" from Love Dog (Penny-Ante Editions, 2013) by Masha Tupitsyn continent. The audio-essay you've recorded yourself reading for continent. , “Ever Since the World Began,” is a compelling entrance into your new multi-media book, Love Dog (Success and Failure) , because it speaks to the very form of the book itself: vacillating and finding the long way around the question of love by using different genres and media. In your discussion of the face, one of the (...) themes of Love Dog , I think there is something to be said about the surfaces media create and how you constantly manipulate them in your work. This seems important for thinking also about your book LACONIA: 1,200 Tweets on Film ; a book on film written in tweets, interposing already three sets of expectations and pushing the boundaries of each medium's faciality, it's surface tension. In Love Dog , is there another kind of facial interaction? Perhaps the discursive faces of approaching love as topic and love as method? If so, how did/do these intersect for you, do/did they drive the creation of this book? MASHA TUPITSYN With LACONIA and Love Dog , I wanted to pay homage to the work modernism has done on subjective time and chronological time by carrying that experiment over into the digital economy. Because LACONIA is a time-based work of cultural criticism that employs the aphorism to look at 21 st century American culture, it is also an archival work of cultural mourning and memory. And in Love Dog , which is also a work about mourning as it relates to love, I wanted to think with all my senses, and to reflect that in the writing itself by using a multi-media form. LACONIA tackled the sound bite and the promotional image--the everyday language of consumer culture--which often wants to communicate ideological agendas through the repetition of a single image or reductive phrase. I had always been interested in the approaches of artists like Barbara Kruger and Jenny Holzer and essayistic, typographical filmmakers, like Chris Marker and Godard. In both cases, I felt it was important to try to compose a book in which deep—critical—thinking happens in so-called immediate, informal, and disposable contexts. That is, in places where you are either not supposed to be serious or are not required to take things seriously. To me the most intervention is needed in every day instances of culture and representation. In order to do that, I had to utilize and interrogate the very structures I was critiquing. In other words, the writing had to materialize in that live, digital, public space. It couldn’t simply happen at home on a piece of paper or in a word document on my computer that no one could see until it was finished. It had to unfold in real time, amidst everything else. And it had to literally be surrounded by the cultural landscape I wanted to assess. In both LACONIA and Love Dog , I wanted to know how and if we can get away from what we cannot get away from? From which there is no respite. Given this, I don’t think hypertexts can really be called hypertexts anymore. Hypertexts are simply the world today. This is not only the way we read the world now. It is the way the world reads. Likewise, interfaciality, as you put it, works on a number of levels for me, both on and off the page. There is my relation to epistemological and phenomenological surfaces—the screen, the body, the face, the voice, gender; the official story. 
Then there is the way this dovetails formally, and to which the digital adds yet another dimension. It's also where sound comes into Love Dog (giving the book a sound; giving tonality to the book's ideas and feelings). As Anne Carson pointed out in her essay "The Gender of Sound," the two are connected, and of course so are love and gender. In "Ever Since This World Began," I wanted to think about the phenomenology of the voice, which is why I visualize the sonic in the book and why, in my writing about faces, I look at the tonal aspects (the things a face voices and a voice faces) of a face. This was standard to do when images and faces were "silent" in the silent era. Those images/faces were extraordinarily audible. The greatest screen face, to me, is still Falconetti's in Dreyer's The Passion of Joan of Arc. We can hear her face—even though there is no actual sound on screen of her speaking. The internet is a similarly "silent" space where actual voices are lacking. So intertextuality goes with interfaciality. I've also talked a lot about how categorization worked when it came to the reception of my first book, Beauty Talk & Monsters. How despite the fact that Beauty Talk is a text that crisscrosses form and content in a variety of ways, not pinpointing its exact genre—choosing one genre over another—only made things worse. Fiction was the category people were most averse to me using, even though of course there is a lot of fiction in the book. Part of how fiction works in Beauty Talk is in the reader not knowing exactly where the fiction resides. In Godard's 1967 film, Weekend, for example, everything matters. Everything is political, whether it's real or imaginary, film or reality. In Weekend, life is not a game and neither is the game a game. The game is really life. Either everything is important or nothing is. But many people want clear answers and demarcations so that they can decide what is important and what is not important. My use of the "I," subjective criticism, made everything "real" in Beauty Talk. But fiction is in the construction. It is in the blending. This is why I perforate the movie screen and connect the onscreen and the offscreen; the official story and the backstory. Although I don't think there is a difference between onscreen and offscreen anymore. Nor is there a dialectic. It's all screen all the time now. Non-fiction, on the other hand, was more tolerated as a moniker. Unlike Love Dog, Beauty Talk wasn't explicitly or tangibly (what is tangible is a question in all these books) working with digital forms or within this digital economy, so some people resisted the book's hypertextuality or intertextuality because they couldn't see its other forms, if that makes sense. It was a problem of invisibility; of how to make something appear (this is where Nietzsche, a big presence in Love Dog, comes in—the nature of appearances). Something people don't necessarily want to see. Some readers couldn't see the way forms were interfacing in that book. On the surface, Beauty Talk was simply a text about media culture—the domination of entertainment as a mode of being and knowing—and most readers could only see that one side of the book. But as Godard puts it in Wim Wenders' documentary about the future of cinema, Room 666, "Films are made, images are made, when there's no one looking. That's what the invisible is, that which we don't see. That's what the incredible is, that which we don't see.
And cinema shows you that which we don't see, the incredible." We are living in the aftermath of narrative and temporal collapse, which means we don't read or feel in the same way. I began to use digital forms in my writing because I don't see us as traditional book people anymore. I would add that power also resides in the invisible, in the things we make invisible by making them visible. Or it did for a long time. The new face of power is quickly becoming so-called transparency, which is even more corrupt because even though we now live in a behind-the-scenes culture, and see and know how the mechanisms of power and corruption operate, we still don't change. The world still doesn't change. Finally, another important thing that Godard says in Room 666 that relates to Love Dog is: "I'm here in front of the camera, and yet in my body and in my head, I'm behind it. My world is the imaginary and the imaginary is a journey between forwards and backwards." This idea of visible/invisible, foresight and hindsight, backwards and forwards is an important dialectic in relation to time and the idea of the destinal. This is why I wanted Love Dog to travel, literally, figuratively, and discursively. You have to be open to not knowing. To epistemological, geographical, chronological, and emotional aporias. In Love Dog my story is both visible and invisible to me. Sometimes I could see where I was going. Other times I was completely in the dark existentially. Truth procedure, which love is, as bell hooks and Badiou tell us, requires expedition and openness to possibility. Unless you want a story you already know, but that is not truth procedure. So I tried to create this backwards and forwards journey in the book—this sense of travel and motion, hope and doubt—by jumping between forms, media, time; traveling to different places, texts, and emotional registers. The book's "Time-Jump series," which mostly takes the form of music—songs—but also a series of "green" videos that I shot, is the most obvious tribute to this idea of subjective and chronological time. continent.: Your work aligns with writers that play with the form of their language, or have assumed the role of performance artist at some point: Kathy Acker, Chris Kraus, Avital Ronell, and Anne Carson, to name a few representing varied approaches that show up in Love Dog. At the risk of ossifying the work, or missing the point—as these experimental modes of writing stack up in piles of published works, do they approach a genre? MASHA: As I noted in my book Life As We Show It, in Chelsea Girls the poet Eileen Myles points out, "You can't get money without a category." More importantly, you can't get a category—or respect, rank—without a clear genre. This makes genre and gender an obvious pair. The two words are even linked etymologically, and both genre and gender concern taxonomies of legitimacy—of sorting through what and who is valuable. So the words share common prejudices. The things one does not want to read are often synonymous with who one doesn't want to read about. Therefore a break-up of or with genre is maybe the genre or anti-genre that could be said to link these writers. Avital Ronell breaks up with philosophical tradition and modes of inquiry. Like Nietzsche, she revaluates methods of evaluation, testing out things you are not supposed to use philosophy to test (and, by the same token, testing philosophy in ways it's not supposed to be tested), like drugs and stupidity—where philosophy fails and we fail philosophy.
And Chris Kraus does something similar in her experimental fiction, using different forms to put female subjectivity "to the test," so to speak. All these female writers and thinkers have tried to destabilize the systems that have been set up in (and against) writing. Thus missing the mark with genre, even intentionally, means that we have missed the point in some sense as writers and thinkers. And that to me is a good thing, however difficult. We've started at the wrong point and gone somewhere else instead. We've acknowledged that writing and thinking are also about failure, and that failure is always embedded in the act of writing and in our reasons for writing. So that missing the point is also the point. Derrida insisted: "We must invent a name for those 'critical' inventions which belong to literature while deforming its limits." But how can you give something that resists and deforms a name? Wouldn't the name also be deformed? Isn't this why the aforementioned writers get hyphenated descriptors like ficto-criticism? Do we need a proper name to be able to read something carefully? I don't think so. I don't need it as a reader. I've always taken a work on its own terms. But for most people, if you don't have an address, people don't know how to find you. A lot of the time, they won't even know how to look. And in some cases, they'll think you're not even worth finding. You are not on the map because you have to literally make the map in order to exist. In her essay "The Gender of Sound," Carson asks, "Why is female sound bad to hear?" I think for the same reason something uncategorizable (pedagogically, creatively, racially, and sexually)—Other—is hard to read. continent.: We're also curious to know how you see the significance of 'performance' (and why the label sticks to the shoes of these authors like toilet paper) in describing this kind of writing work? MASHA: I think I responded to the parenthetical portion of this question in my previous answer. As far as how performance relates to Love Dog, in Acker's Don Quixote, which is another big presence in the book, she writes, "there is no other reality than anthropomorphism." In Don Quixote, the dog is human and the human is dog. Which brings us to the title of Love Dog and the totemic function of the dog in the book: giving human things animal characteristics and animal things human characteristics. Rather than investing all our human love ideals into dogs, which we do constantly as a culture of dog lovers, I wanted to put the dog into love. It is the difference between the consumption of love as a patriarchal institution or status position and the essence of a type of radical and liberatory relation that would benefit humans in their bonds with other humans. So the dog in Love Dog is not simply the book's cover or performative affect, so to speak. Love has always needed the dog, which is why the dog is the very embodiment of belief in love. Recall Argos the Great Dog and Odysseus. Argos is the only one who remembers and recognizes the ragged and old Odysseus even after his 20-year absence. So love is the high ideal and the dog, both common and dependable, is the bridge between the sacred and the profane. The dog is the house of love.
And because Love Dog is a digital project, it seemed impossible to think about the post-human, technology, the virtual, or difference (which Badiou says is essential to love in In Praise of Love) without thinking about animal-being and being-animal. Instead of simply "performing" these ideas and characteristics as literary affects, I'm interested in being-becoming. Which means the book's tropes, leitmotifs, series, and even its titles are in service of that truth procedure. In other words, I actually want to live this way, not just write this way. And, more importantly, I want to live this way, not just think this way. So Love Dog became both my totem animal and my autobiographical animal. This made the book organic, anthropomorphic—beyond literary.
This article describes a project which involved an attempt to integrate an expert system with a hypertext database of primary and secondary text materials. Our chosen legal domain was that of the Convention on Jurisdiction and the Enforcement of Judgments in Civil and Commercial Matters (The Brussels Convention 1968). In this article, we address three dimensions of system design. With regard to the legal dimension, we consider the choice of domain and the representation of both knowledge and data in the system. On the technological dimension, we discuss the selection of software development tools and problems associated with keeping knowledge bases and databases up to date. Finally, we pay particular attention to the Cinderella dimension of legal expert system development — the user interface.
The article presents a conceptual framework for distinguishing different sorts of heterogeneous digital materials. The hypothesis is that a wide range of heterogeneous data resources can be characterized and classified according to their particular configurations of hypertext features such as scripts, links, interactive processes, and time scalings, and that the hypertext configuration is a major but not sole source of the messiness of big data. The notion of hypertext will be revalidated, placed at the center of the interpretation of networked digital media, and used in the analysis of the fast-growing amounts of heterogeneous digital collections, assemblages, and corpora. The introduction summarizes the wider background of a fast-changing data landscape.
Scholarly hypertexts involve argument and explicit self-questioning, and can be distinguished from both informational and literary hypertexts. After making these distinctions, the essay presents general principles about attention, some suggestions for self-representational multi-level structures that would enhance scholarly inquiry, and a wish list of software capabilities to support such structures. The essay concludes with a discussion of possible conflicts between scholarly inquiry and hypertext.
In this essay, I focus on two questions. First, what is Kant's understanding of the sense in which our faculties form a unified system? And, second, what are the implications of this for the metaphysical relationships between the faculties within this system? To consider these questions, I begin with a brief discussion of Longuenesse's groundbreaking work on the teleological unity of the understanding as the faculty for judgment. In doing so, I argue for a generalization of Longuenesse's account along two dimensions. The result is a picture of our faculties as forming a teleological system—unified under the overarching aims of reason as the highest rational faculty. Then I discuss the recent debate between "additive" and "transformative" interpretations of the relationship between sensibility and the understanding, before proposing that we should interpret Kant as endorsing a moderate form of the "transformative" reading, which captures important elements of both the "additive" and the "transformative" account.
Hypertext publishing, the integration of a large body (perhaps billions) of public writings into a unified hypertext environment, will require the simultaneous solution of problems involving very wide database distribution, royalties, freedom of speech, and privacy. This paper describes these problems and presents, for criticism and discussion, an abstract design which seems to solve many of them. This design, called LinkText, is presented both as a specification and as a set of design approaches grouped around various levels of electronic publishing.
German Dadaists, Italian and Russian Futurists and Constructivists created in their experiments multi-medial orthopedic bodies as products of collage and montage. Sergei Eisenstein, who was influenced by these experiments, organized his theatrical productions as a chain of independent fragments capable of entering any possible combination/recombination and labelled this method the "montage of attractions". He used the same montage principle not only for a new theatrical or cinematic narrative but also to conceptualize the expressive movement of the theatrical or cinematic body created on stage and on screen. Finally, he conceptualized montage not only as a means of conveying movement, but also of conveying a way of thinking. This inspired him to create a new form of scientific narrative in his two unfinished books. The subject to be analysed in the first book from 1929 – montage – inspired him to look for a new structure by organizing different texts in the form of a sphere. This form defined the method of writing his second project on the theory of the arts as a hypertext. Eisenstein gave this book the title Method.
This two-volume work, first published in 1843, was John Stuart Mill's first major book. It reinvented the modern study of logic and laid the foundations for his later work in the areas of political economy, women's rights and representative government. In clear, systematic prose, Mill disentangles syllogistic logic from its origins in Aristotle and scholasticism and grounds it instead in processes of inductive reasoning. An important attempt at integrating empiricism within a more general theory of human knowledge, the work constitutes essential reading for anyone seeking a full understanding of Mill's thought. Volume 1 contains Mill's introduction, which elaborates upon his definition of logic as 'not the science of Belief, but the science of Proof, or Evidence'. It also features discussions of the central components of logical reasoning - propositions and syllogisms - in relation to Mill's theories of inductive reasoning and experimental method.
Hypertext can be used--in nearly any type of computer-assisted class--to allow students to engage in collaborative, socially-constructed composition and meaning-making; this essay both considers the underlying theory which supports the use of hypertext in composition instruction and provides a range of pedagogical approaches. Various classroom arrangements are considered, from standalone computers with no internet connections to networked, internet-accessible workstations; for each type of classroom, a different hypertext assignment which emphasizes collaboration is provided as an example.
Uriah Kriegel presents a rich exploration of the philosophy of the great nineteenth-century thinker Franz Brentano. He locates Brentano at the crossroads where the Anglo-American and continental European philosophical traditions diverged. At the centre of this account of Brentano's philosophy is the connection between mind and reality. Kriegel aims to develop Brentano's central ideas where they are overly programmatic or do not take into account philosophical developments that have taken place since Brentano's death a century ago; and to offer a partial defense of Brentano's system as quite plausible and in any case extraordinarily creative and thought-provoking. Brentano's system grounds a complete metaphysics and value theory in a well-developed philosophy of mind, and accordingly the book is divided into three parts, devoted to Brentano's philosophy of mind, his metaphysics, and his moral philosophy. The book's fundamental ambition is to show how Brentano combines the clarity and precision of the analytic philosopher with the sweeping vision of the continental philosopher. Brentano pays careful attention to important distinctions, conscientiously defines key notions, presents precise arguments for his claims, judiciously considers potential objections, and in general proceeds very methodically - yet he does so not as an end in itself, but as a means only. His end is the crafting of a grand philosophical system in the classical sense, attempting to produce nothing less than a unified theory of the true, the good, and the beautiful.
Fichte's System of Ethics, originally published in 1798, is at once the most accessible presentation of its author's comprehensive philosophical project, The Science of Knowledge or Wissenschaftslehre, and the most important work in moral philosophy written between Kant and Hegel. This study integrates the discussion of our moral duties into the systematic framework of a transcendental theory of the human subject. Ranging over numerous important philosophical themes, the volume offers a new translation of the work together with an introduction that sets it in its philosophical and historical contexts.
Current knowledge of the genetic, epigenetic, behavioural and symbolic systems of inheritance requires a revision and extension of the mid-twentieth-century, gene-based, 'Modern Synthesis' version of Darwinian evolutionary theory. We present the case for this by first outlining the history that led to the neo-Darwinian view of evolution. In the second section we describe and compare different types of inheritance, and in the third discuss the implications of a broad view of heredity for various aspects of evolutionary theory. We end with an examination of the philosophical and conceptual ramifications of evolutionary thinking that incorporates multiple inheritance systems.
1.1. PROGRAM It will be our aim to reconstruct, with precision, certain views which have been traditionally associated with nominalism and to investigate problems arising from these views in the construction of interpreted formal systems. Several such systems are developed in accordance with the demand that the sentences of a system which is acceptable to a nominalist must not imply the existence of any entities other than individuals. Emphasis will be placed on the constructionist method of philosophical analysis. To follow this method is to introduce the central notions of the subject-matter to be investigated into a system governed by exact rules. For example, the constructionist method of investigating the properties of geometric figures may consist in formulating a system of postulates and definitions which, together with the apparatus of formal logic, generates all necessary truths concerning geometric figures. Similarly, a constructionist analysis of the notion of an individual may take the form of an axiomatic theory whose provable assertions are just those which seem essential to the role played by the concept of an individual in systematic contexts. Such axiomatic theories gain in interest if they are supplemented by precise semantical rules specifying the denotation of all terms and the truth conditions of all sentences of the theory.
Towards the end of his life, Descartes published the first four parts of a projected six-part work, The Principles of Philosophy. This was intended to be the definitive statement of his complete system of philosophy, dealing with everything from cosmology to the nature of human happiness. In this book, Stephen Gaukroger examines the whole system, and reconstructs the last two parts, 'On Living Things' and 'On Man', from Descartes' other writings. He relates the work to the tradition of late Scholastic textbooks which it follows, and also to Descartes' other philosophical writings, and he examines the ways in which Descartes transformed not only the practice of natural philosophy but also our understanding of what it is to be a philosopher. His book is a comprehensive examination of Descartes' complete philosophical system.
Chapter 1: Why Systems Philosophy? Some reasons for synthetic philosophy generally. The persistent theme of this study is the timeliness and the necessity of ...
[GER] Michael Lewin geht es in seinem Buch nicht nur um philosophiehistorische Perspektiven der Kant- und Fichte-Forschung, sondern ebenso sehr um die Sache selbst: das Konzept der Vernunft im engeren Sinne als ein potenziell wohlbegründetes und in zeitgenössischen Kontexten fortführbares Forschungsprogramm. Dabei sind verschiedene, in einer Reihe der Reflexion stehende Theoriegefüge bewusst zu machen, die sich aus den vielfältigen Arten und Funktionen der Ideen ergeben, mit deren Hilfe die Vernunft das Verstehen und Wollen steuert und selbstreflexiv wird. Nach der Untersuchung von sieben Ideenarten bei Kant und ihrer von der Tathandlung (der Selbstsetzung der reinen Vernunft) ausgehenden Systematisierung bei Fichte wird die Frage erörtert, ob, wie und unter welchen Bedingungen sich ein solches Projekt inmitten alternativer Vernunftkonzepte, basaler und radikaler Einwände sowie postidealistischer Vernunftkritik als ein kooperations- und konkurrenzfähiges Unternehmen bewähren kann. Dazu entwickelt der Autor unter dem Stichpunkt „reflektierter Perspektivismus“ das Programm einer perspektivistischen Metaphilosophie, die den Hintergrundparametern hinter den philosophischen Positionierungen – forschungsprogrammatische Festlegungen (in Anlehnung an Imre Lakatos), Ansprüche und (Wissens-)Ziele – nachspürt und dadurch die Möglichkeiten und Grenzen der verschiedenen Projekte offenlegt. [ENG] Michael Lewin’s book is not only concerned with philosophical-historical perspectives of research on Kant and Fichte, but also with the matter itself: the concept of reason in the narrower sense as a potentially well-grounded research program that can be continued in contemporary contexts. In this, various theoretical structures related to the manifold types and functions of ideas are analyzed, by means of which reason controls the understanding and will, and becomes self-reflexive. After the examination of seven types of ideas in Kant and their systematization in Fichte’s work based on the fact-act (the self-positing of pure reason), the question is discussed as to whether, how and under what conditions such a project can prove itself as a cooperative and competitive enterprise in the midst of alternative concepts of reason, fundamental and radical objections and post-idealistic criticism of reason. To this end, the author develops the program of a perspectivistic metaphilosophy under the heading of »reflected perspectivism«, which traces the background parameters behind the philosophical positionings – research-programmatic determinations (following Imre Lakatos), demands and (knowledge) goals – and thereby reveals the possibilities and limits of the various projects.
The issue of a logic foundation for African thought connects well with the question of method. Do we need new methods for African philosophy and studies? Or are the methods of Western thought adequate for African intellectual space? These questions are not some of the easiest to answer, because they lead straight to the question of whether or not a logic tradition from African intellectual space is possible. Thus, in charting the course of future direction in African philosophy and studies, one must be confronted with this question of logic. The author boldly takes up this challenge and becomes the first to do so in a book by introducing new concepts and formulating a new African culture-inspired system of logic called Ezumezu, which he believes would ground new methods in African philosophy and studies. He develops this system to rescue African philosophy and, by extension, sundry fields in African Indigenous Knowledge Systems from the spell of Plato and the hegemony of Aristotle. African philosophers can now ground their discourses in Ezumezu logic, which will distinguish their philosophy as a tradition in its own right. On the whole, the book engages with some of the lingering controversies in the idea of African logic before unveiling Ezumezu as a philosophy of logic, methodology and formal system. The book also provides fresh arguments and insights on the themes of decolonisation and Africanisation for the intellectual transformation of scholarship in Africa. It will appeal to philosophers and logicians—undergraduates and postgraduate researchers—as well as those in various areas of African studies.
This work has been selected by scholars as being culturally important, and is part of the knowledge base of civilization as we know it. This work was reproduced from the original artifact, and remains as true to the original work as possible. Therefore, you will see the original copyright references, library stamps, and other notations in the work. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity has a copyright on the body of the work. As a reproduction of a historical artifact, this work may contain missing or blurred pages, poor pictures, errant marks, etc. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. We appreciate your support of the preservation process, and thank you for being an important part of keeping this knowledge alive and relevant.
This paper investigates variation in argumentative discourse as a consequence of the passage from traditional linear texts to hypertext, focusing in particular on NGOs’ campaigning on the web. The analysis, which combines linguistic and argumentation theory perspectives, addresses issues connected with the loss of linearity brought about by hypertexts, with special regard to its impact on textual coherence and the consequent loss of the writer’s control over the order of arguments.
The nature/nurture debate is not dead. Dichotomous views of development still underlie many fundamental debates in the biological and social sciences. Developmental systems theory (DST) offers a new conceptual framework with which to resolve such debates. DST views ontogeny as contingent cycles of interaction among a varied set of developmental resources, no one of which controls the process. These factors include DNA, cellular and organismic structure, and social and ecological interactions. DST has excited interest from a wide range of researchers, from molecular biologists to anthropologists, because of its ability to integrate evolutionary theory and other disciplines without falling into traditional oppositions. The book provides historical background to DST, recent theoretical findings on the mechanisms of heredity, applications of the DST framework to behavioral development, implications of DST for the philosophy of biology, and critical reactions to DST.
The author of The Death and Life of Great American Cities looks at business fraud and criminal enterprise, overextended government farm subsidies and zealous transit police, to show what happens when the moral systems of commerce collide with those of politics.