In this Editorial note, the Guest Editors introduce the theme of the Special Issue of the journal Philosophies, titled “Contemporary Natural Philosophy and Philosophies”.
The increased interactivity and connectivity of computational devices, along with the spread of computational tools and computational thinking across fields, has changed our understanding of the nature of computing. In the course of this development, computing models have been extended from the initial abstract symbol-manipulating mechanisms of stand-alone, discrete sequential machines to models of natural computing in the physical world: generally concurrent, asynchronous processes capable of modelling living systems, their informational structures, and their dynamics on both symbolic and sub-symbolic information-processing levels. The present account of models of computation highlights several topics of importance for the development of a new understanding of computing and its role: natural computation and the relationship between the model and its physical implementation; interactivity as fundamental for the computational modelling of concurrent information-processing systems such as living organisms and their networks; and the new developments in logic needed to support this generalized framework. Computing understood as information processing is closely related to the natural sciences; it helps us recognize connections between the sciences and provides a unified approach for the modeling and simulation of both living and non-living systems.
Computing is changing the traditional field of Philosophy of Science in a profound way. First, as a methodological tool, computing makes possible “experimental philosophy”, which can provide practical tests for different philosophical ideas. At the same time, the ideal object of investigation of the Philosophy of Science is changing. For a long period the ideal science was Physics (e.g., for Popper, Carnap, Kuhn, and Chalmers). Now the focus is shifting to the field of Computing/Informatics. There are many good reasons for this paradigm shift, one of them being the long-standing need for a new meeting between the sciences and the humanities, for which the new discipline of Computing/Informatics offers innumerable possibilities. Contrary to Physics, Computing/Informatics is very much human-centered. It brings the potential for a new Renaissance, in which Science and the Humanities, Arts and Engineering can reach a new synthesis, so much needed in our intellectually split culture. This paper investigates contemporary trends in the relation between the Philosophy of Science and the Philosophy of Computing and Information, a relation analogous to the present one between the Philosophy of Science and the Philosophy of Physics.
Cognitive science is considered to be the study of mind (consciousness and thought) and intelligence in humans. Under such a definition, a variety of unsolved/unsolvable problems appears. This article argues for a broad understanding of cognition, based on empirical results from, among others, the natural sciences, self-organization, artificial intelligence and artificial life, network science, and neuroscience, which, apart from the high-level mental activities in humans, includes sub-symbolic and sub-conscious processes such as emotions, recognizes cognition in other living beings, and encompasses extended and distributed/social cognition. In this new view, cognition is a complex, multiscale phenomenon that evolved in living organisms on the basis of bodily structures that process information; it links cognitivist and EEEE (embodied, embedded, enactive, extended) approaches to cognition with the idea of morphological computation (info-computational self-organisation) in cognizing agents, emerging in evolution through the interactions of a (living/cognizing) agent with the environment.
Alan Turing’s pioneering work on computability and his ideas on morphological computing support Andrew Hodges’ view of Turing as a natural philosopher. Turing’s natural philosophy differs importantly from Galileo’s view that the book of nature is written in the language of mathematics (The Assayer, 1623). Computing is more than a language used to describe nature, as computation produces real-time physical behaviors. This article presents the framework of Natural info-computationalism as a contemporary natural philosophy that builds on the legacy of Turing’s computationalism. The use of info-computational conceptualizations, models, and tools makes it possible, for the first time in history, to model complex self-organizing adaptive systems, including basic characteristics and functions of living systems, intelligence, and cognition.
Knowledge generation can be naturalized by adopting a computational model of cognition and an evolutionary approach. In this framework, knowledge is seen as the result of the structuring of input data (data → information → knowledge) by an interactive computational process going on in the agent during its adaptive interplay with the environment, which clearly presents a developmental advantage by increasing the agent’s ability to cope with situational dynamics. This paper addresses the mechanism of knowledge generation, a process that may be modeled as natural computation in order to be better understood and improved.
The emerging contemporary natural philosophy provides a common ground for an integrative view of natural, artificial, and human-social knowledge and practices. The learning process is central to acquiring, maintaining, and managing knowledge, both theoretical and practical. This paper explores the relationships between the present advances in the understanding of learning in the sciences of the artificial, the natural sciences, and philosophy. The question is what, at this stage of development, inspiration from nature, specifically its computational models such as info-computation through morphological computing, can contribute to machine learning and artificial intelligence, and how much, on the other hand, models and experiments in machine learning and robotics can motivate, justify, and inform research in computational cognitive science, the neurosciences, and computing nature. We propose that one contribution can be an understanding of the mechanisms of ‘learning to learn’, as a step towards deep learning with a symbolic layer of computation/information processing in a framework linking connectionism with symbolism. As all natural systems possessing intelligence are cognitive systems, we describe the evolutionary arguments for the necessity of learning to learn for a system to reach human-level intelligence through evolution and development. The paper thus presents a contribution to the epistemology of the contemporary philosophy of nature.
This paper presents a theoretical study of the binary oppositions underlying the mechanisms of natural computation understood as dynamical processes on natural information morphologies. Of special interest are the oppositions of discrete vs. continuous, structure vs. process, and differentiation vs. integration. The framework used is that of computing nature, where all natural processes at different levels of organisation are computations over informational structures. The interactions at different levels of granularity/organisation in nature, and the character of the phenomena that unfold through those interactions, are modeled from the perspective of an observing agent. This brings us to the movement from binary oppositions to dynamic networks built upon mutually related binary oppositions, where each node has several properties.
In discussions regarding models of cognition, the very mention of “computationalism” often incites reactions against the insufficiency of the Turing machine model: its abstractness, determinism, lack of naturalist foundations, triviality, and absence of clarity. None of those objections, however, concerns models based on natural computation or computing nature, where the model of computation is broader than symbol manipulation or conventional models of computation. Computing nature consists of physical structures that form a layered computational architecture, with computational processes ranging from quantum to chemical, biological/cognitive, and social-level computation. It is argued that, on the lower levels of information processing in the brain, finite automata or Turing machines may still be adequate models, while, on the higher levels of whole-brain information processing, natural computing models are necessary. A layered computational architecture of the mind based on the intrinsic computing of physical systems avoids the objections raised against early versions of computationalism as abstract symbol manipulation.
The dialogue develops arguments for and against a broad new world system, info-computationalist naturalism, that is supposed to overcome the traditional mechanistic view. It would make the older mechanistic view a special case of the new general info-computationalist framework (rather as Euclidean geometry remains valid inside a broader notion of geometry). We primarily discuss what the info-computational paradigm would mean, especially its pancomputationalist component. This includes the requirements for the new generalized notion of computing that would include sub-symbolic information processing. We investigate whether pancomputationalism can provide the basic causal structure of the world and whether the overall research program of info-computationalist naturalism appears productive, especially when it comes to new approaches to the living world, including computationalism in the philosophy of mind.
This paper connects information with computation and cognition via the concept of agents, which appear at a variety of levels of organization of physical/chemical/cognitive systems: from elementary particles to atoms, molecules, and life-like chemical systems, to cognitive systems starting with living cells, up to organisms and ecologies. In order to obtain this generalized framework, the concepts of information, computation, and cognition are generalized. In this framework, nature can be seen as an informational structure with computational dynamics, where an (info-computational) agent is needed for the potential information of the world to actualize. Starting from the definition of information as the difference in one physical system that makes a difference in another physical system, which combines Bateson’s and Hewitt’s definitions, the argument is advanced for natural computation as a computational model of the dynamics of the physical world, where information processing is constantly going on at a variety of levels of organization. This setting helps us to elucidate the relationships between computation, information, agency, and cognition within a common conceptual framework, with special relevance for biology and robotics.
The book presents investigations into the world of info-computational nature, in which information constitutes the structure, while computational process amounts to its change. Information and computation are inextricably bound: there is no computation without informational structure, and there is no information without computational process. Those two complementary ideas are used to build a conceptual net, which according to Novalis is a theoretical way of capturing reality. We apprehend reality within the framework of natural computationalism, the view that the whole universe can be understood as a computational system at many different levels, from the quantum-mechanical world to biological organisms, including intelligent minds and their societies. Questions about the nature of information and computation, and their unified view, are addressed, along with the application of the info-computational approach to knowledge generation.
This text presents the research field of natural/unconventional computing as it appears in the book COMPUTING NATURE. The articles discussed constitute a selection of works from the Symposium on Natural Computing at the AISB-IACAP (British Society for the Study of Artificial Intelligence and the Simulation of Behaviour and The International Association for Computing and Philosophy) World Congress 2012, held at the University of Birmingham to celebrate the Turing centenary. COMPUTING NATURE is about nature considered as the totality of physical existence, the universe. By physical we mean all phenomena, objects, and processes that can be detected either directly by our senses or via instruments. Historically, there have been many ways of describing the universe (cosmic egg, cosmic tree, theistic universe, mechanistic universe), while a particularly prominent contemporary approach is the computational universe, as discussed in this article. See more: http://arxiv.org/abs/1210.7784.
In his article “Open Problems in the Philosophy of Information”, Luciano Floridi presented a Philosophy of Information research program in the form of eighteen open problems, covering the following fundamental areas: information definition, information semantics, intelligence/cognition, the informational universe/nature, and values/ethics. We revisit Floridi’s program, highlighting some of the major advances, commenting on unsolved problems, and rendering the new landscape of the Philosophy of Information emerging at present. As we analyze the progress of PI, we try to situate Floridi’s program in the context of the scientific and technological developments of the last ten years. We emphasize that the Philosophy of Information is a huge and vibrant research field, with its origins dating from before Open Problems and its domains extending even outside their scope. In this paper, we have been able only to sketch some of the developments during the past ten years. Our hope is that, even if fragmentary, this review may serve as a contribution to the effort of understanding the present state of the art and the paths of development of the Philosophy of Information as seen through the lens of Open Problems.
The recent development of the research field of Computing and Philosophy has triggered investigations into the theoretical foundations of computing and information. This thesis consists of two parts, the result of studies in two areas, the Philosophy of Computing and the Philosophy of Information, regarding the production of meaning and the value system with applications. The first part develops a unified dual-aspect theory of information and computation, in which information is characterized as structure and computation as information dynamics. This enables the naturalization of epistemology, based on interactive information representation and communication. In the study of systems modeling, meaning, truth, and agency are discussed within the framework of the PI/PC unification. The second part of the thesis addresses the necessity of ethical judgment in rational agency, illustrated by the problem of information privacy and surveillance in the networked society. The value grounds and socio-technological solutions for securing the trustworthiness of computing are analyzed. Privacy issues clearly show the need for computing professionals to contribute to the understanding of the technological mechanisms of Information and Communication Technology. The main original contribution of this thesis is the unified dual-aspect theory of computation/information. The semantics of information is seen as a part of the data-information-knowledge structuring, in which complex structures are self-organized by the computational processing of information. Within the unified model, complexity is a result of computational processes on informational structures. The thesis argues for the necessity of computing beyond the Turing-Church limit, motivated by natural computation and, more broadly, by pancomputationalism and paninformationalism, seen as two complementary views of the same physical reality.
Moreover, it follows that pancomputationalism does not depend on the assumption that the physical world is digital at some basic level. Contrary to many beliefs, it is entirely compatible with dual quantum-mechanical computing.
Computers today are not only calculation tools; they directly (inter)act in the physical world, which itself may be conceived of as the universal computer (Zuse, Fredkin, Wolfram, Chaitin, Lloyd). In expanding its domains from abstract logical symbol manipulation to physically embedded and networked devices, computing goes beyond the Church-Turing limit (Copeland, Siegelman, Burgin, Schachter). Computational processes are distributed, reactive, interactive, agent-based, and concurrent. The main criterion of success of a computation is not its termination, but the adequacy of its response, its speed, generality and flexibility, adaptability, and tolerance to noise, error, faults, and damage. Interactive computing is a generalization of Turing computing, and it calls for new conceptualizations (Goldin, Wegner). In the info-computationalist framework, with computation seen as information processing, natural computation appears as the most suitable paradigm of computation, and information semantics requires logical pluralism.
This is a short presentation by the Guest Editors of the series of Special Issues of the journal _Philosophies_ under the common title “Contemporary Natural Philosophy and Philosophies” in which we present Part 2. The series will continue, and the call for contributions to the next Special Issue will appear shortly.
What is reality for an agent? What is minimal cognition? How does the morphology of a cognitive agent affect cognition? These are still open questions among scientists and philosophers. In this chapter we propose the idea of info-computational nature as a framework for answering those questions. Within the info-computational framework, information is defined as structure, and computation as the dynamics of information. To an agent, nature therefore appears as an informational structure with computational dynamics. Both information and computation in this context have a broader meaning than in everyday use, and both are necessarily grounded in physical implementation. The evolution of increasingly complex living agents is understood as a process of morphological computation driven by agents’ interactions with the environment. It is a process much more complex than random variation; instead, the mechanisms of change are morphological computational processes of self-organisation. Reality for an agent emerges as a result of interactions with the environment together with internal information processing. Following Maturana and Varela, we take cognition to be the process of living of an organism, and thus it appears on different levels of complexity, from cellular via organismic to social. The simpler the agent, the simpler its “reality”, defined by the network of networks of info-computational processes which constitute its cognition. The debated topic of consciousness takes its natural place in this framework, as a process of information integration that, we suggest, naturally evolved in organisms with a nervous system. Computing nature/pancomputationalism is sometimes confused with panpsychism, or claimed to necessarily imply panpsychism, which we show is not the case. Even though we focus on natural systems in this chapter, the info-computational approach is general and can be used to model both biological and artifactual cognitive agents.
Within the Computer Science community, many ethical issues have emerged as significant and critical concerns. Computer ethics is an academic field in its own right, and there are unique ethical issues associated with information technology. It encompasses a range of issues and concerns, including privacy and agency around personal information, Artificial Intelligence and pervasive technology, the Internet of Things, and surveillance applications. As computing technology impacts society at an ever-growing pace, there are growing calls for more computer ethics content to be included in Computer Science curricula. In this paper we present the results of a survey that polled faculty from Computer Science and related disciplines about teaching practices for computer ethics at their institutions. The survey was completed by respondents from 61 universities across 23 European countries. Participants were surveyed on whether or not computer ethics is taught to Computer Science students at each institution, the reasons why computer ethics is or is not taught, how computer ethics is taught, the background of the staff who teach computer ethics, and the scope of computer ethics curricula. This paper presents and discusses the results of the survey.
Play and games are among the basic means of expression in intelligent communication, influenced by the relevant cultural environment. Games have found a natural expression in the contemporary computer era, in which communications are increasingly mediated by computing technology. The widespread use of e-games results in conceptual and policy vacuums that must be examined and understood. Humans involved in designing, administering, selling, and playing computer games encounter new situations in which good and bad, right and wrong, are not defined by the experience of previous generations. This article gives an account of the historical necessity of games and the development of e-games, their pros and cons, threats and promises, focusing on the ethical awareness and attitudes of game developers.
With the rapidly growing amounts of information, visualization is becoming increasingly important, as it allows users to easily explore and understand large amounts of information. However, the field of information visualization currently lacks sufficient theoretical foundations. This article addresses foundational questions connecting information visualization with computing and philosophy studies. The idea of multiscale information granulation is described based on two fundamental concepts: information (structure) and computation (process). A new information-processing paradigm, Granular Computing, enables a stepwise increase of the granulation/aggregation of information at different levels of resolution, which makes dynamic viewing of data possible. The information produced by Google Earth is an illustration of visualization based on the clustering (granulation) of information on a succession of layers. Depending on the level, specific emergent properties become visible as a result of different ways of aggregating data/information. As information visualization ultimately aims at amplifying cognition, we discuss the processes of simulation and emulation in relation to cognition, and in particular visual cognition.
This review essay analyzes the book by Giuseppe Primiero, On the Foundations of Computing. Oxford: Oxford University Press (ISBN 978-0-19-883564-6/hbk; 978-0-19-883565-3/pbk), xix, 296 p. (2020). It gives a critical view from the perspective of physical computing as a foundation of computing and argues that the neglected pillar of material computation (Stepney) should be brought center stage, and computing recognized as the fourth great domain of science (Denning).
European Computing and Philosophy conference, 2–4 July, Barcelona. The Seventh ECAP (European Computing and Philosophy) conference was organized by Jordi Vallverdu at the Autonomous University of Barcelona. The conference started with the IACAP (The International Association for CAP) presidential address by Luciano Floridi, focusing on mechanisms of knowledge production in informational networks. The first keynote, delivered by Klaus Mainzer, set the frame for the rest of the conference by elucidating the fundamental role of the complexity of informational structures, which can be analyzed on different levels of organization, giving place to a variety of possible approaches that converge in this cross-disciplinary and multi-disciplinary research field. Keynotes by Kevin Warwick on the re-embodiment of rats’ neurons into robots, Raymond Turner on syntax and semantics in programming languages, Roderic Guigo on Biocomputing Sciences, and Francesco Subirada on the past and future of supercomputing presented different topics of philosophical as well as practical aspects of computing. Conference tracks included: Philosophy of Information (Patrick Allo), Philosophy of Computer Science (Raymond Turner), Computer and Information Ethics (Johnny Søraker and Alison Adam), Computational Approaches to the Mind (Ruth Hagengruber), IT and Cultural Diversity (Jutta Weber and Charles Ess), Crossroads (David Casacuberta), Robotics, AI & Ambient Intelligence (Thomas Roth-Berghofer), Biocomputing, Evolutionary and Complex Systems (Gordana Dodig Crnkovic and Søren Brier), E-learning, E-science and Computer-Supported Cooperative Work (Annamaria Carusi), and Technological Singularity and Acceleration Studies (Amnon Eden).
In this paper we analyze the methodological and philosophical implications of algorithmic aspects of unconventional computation. First, we describe how the classical algorithmic universe developed and analyze why it became closed in the conventional approach to computation. Then we explain how new models of algorithms turned the classical closed algorithmic universe into the open world of algorithmic constellations, allowing higher flexibility and expressive power, and supporting constructivism and creativity in mathematical modeling. As Gödel’s undecidability theorems demonstrate, the closed algorithmic universe restricts essential forms of mathematical cognition. In contrast, the open algorithmic universe, and even more so the open world of algorithmic constellations, removes such restrictions and enables a new, richer understanding of computation.
This essay presents arguments for the claim that in the best of all possible worlds (Leibniz) there are sources of unpredictability and creativity for us humans, even given a pancomputational stance. A suggested answer to Chaitin’s questions, “Where do new mathematical and biological ideas come from? How do they emerge?”, is that they come from the world and emerge from basic physical (computational) laws. For humans, as a tiny subset of the universe, a part of the new ideas comes as the result of the re-configuration and reshaping of already existing elements, and another part comes from the outside as a consequence of the openness and interactivity of the system. For the universe at large, it is randomness that is the source of unpredictability on the fundamental level. In order to be able to completely predict the Universe-computer, we would need the Universe-computer itself to compute its next state; as Chaitin has demonstrated, there are incompressible truths, that is, truths that cannot be computed by any computer other than the universe itself.
This paper investigates the relationship between reality and model, information and truth. It argues that meaningful data need not be true in order to constitute information. Information to which truth-value cannot be ascribed, partially true information, or even false information can lead to interesting outcomes such as technological innovation or scientific breakthrough. In the research process, during the transition between two theoretical frameworks, there is a dynamic mixture of old and new concepts in which truth is not well defined. Instead of veridicity, the correctness of a model and its appropriateness within a context are commonly required. Despite empirical models being in general only truthlike, they are nevertheless capable of producing results from which conclusions can be drawn and adequate decisions made.
We are living in an era when the focus of human relationships with the world is shifting from execution and physical impact to control and cognitive/informational interaction. This emerging, increasingly informational world is our new ecology, an infosphere that provides the grounds for a cognitive revolution based on interactions in networks of biological and artificial intelligent agents. After the industrial revolution, which extended the human body through mechanical machinery, the cognitive revolution extends the human mind/cognition through information-processing machinery. These novel circumstances come with new qualities and preferences demanding new conceptualizations. We have some work ahead of us to establish value systems and practices extended from the real to the increasingly virtual/info-computational. This paper first presents a current view of the virtual versus the real and then offers an interpretation framework based on an info-computational understanding of cognition, in which agency implies the computational processing of the informational structures of the world as an infosphere. The notion of the “good life” is discussed in light of different ideals of well-being in the infosphere, connecting virtuality as a space of potential and alternative worlds for an agent for whom reality is a space of actual experiences, in the sense of Deleuze. Even though the info-computational framework enables us to see both the real world and the diversity of virtual worlds in terms of computational processes on informational structures, based on a distinct layered cognitive architecture of all physical agents, there is a clear difference between the potential worlds of the virtual and the actual experiences an agent makes in the real. Info-computationalism provides insight into the mechanisms of the infosphere and elucidates its importance as a cognitively predominant environment and communication medium.
The conclusion is that by cocooning ourselves in an elaborate info-computational infrastructure of the virtual, we may be increasingly isolating ourselves from the reality of direct experience of the world. The biggest challenges of the cognitive revolution may not be technological but ethical. They are about the nature of being human and its values.
I review von Foerster’s computational approach to cognition in relation to foresight and hindsight, and to his Ethical Imperative. For him, ethics must remain implicit and becomes manifest ….
From the Philosophies journal program, one of the main aims of the journal is to help establish a new unity in diversity in human knowledge, which would include both “Wissen” (i.e., “Wissenschaft”) and “scīre” (i.e., “science”). As is known, “Wissenschaft” (the pursuit of knowledge, learning, and scholarship) is a broader concept of knowledge than “science”, as it involves all kinds of knowledge, including philosophy, and not exclusively knowledge in the form of directly testable explanations and predictions. The broader notion of scholarship incorporates an understanding and articulation of the role of the learner and the process of the growth of knowledge and its development, rather than only the final product and its verification and validation. In other words, it is a form of knowledge that is inclusive of both short-term and long-term perspectives; it is local and global, critical and hypothetical (speculative), breaking new ground. This new synthesis, or rather re-connection, of knowledge is expected to resonate with basic human value systems, including cultural values. Since knowledge tends to fragment spontaneously as it grows, we take existing diversity as a resource and a starting point for a new synthesis. The idea of broad, inclusive knowledge is in fact not so new. From the beginning, natural philosophy included all contemporary knowledge about nature. Newton was a natural philosopher, as were Bohr, Einstein, Prigogine, Weizsäcker, and Wheeler, to name but a few. Today, the unifying picture of the natural/physical world is sorely missing among the isolated silos of particular scientific domains, each with its own specific ontologies, methodologies, and epistemologies. From the profound need for connected and common knowledge, new trends towards synthesis have emerged in the last decades.
One major theme is complexity, especially as applied to biology and medicine, which helps us grasp the importance of connectedness between present-day disparate pieces of knowledge: frameworks, theories, approaches, and so on. Related to this is the emergence of network science, which studies structures of nodes (actors) and the edges that connect them. This book connects work on contemporary natural philosophy with existing philosophies, sciences, and other fields of knowledge.
For this book, the editors invited prominent researchers with different perspectives and deep insights into the various facets of the relationship between reality and representation in three classes of agent: humans, other living beings, and machines. The book enriches our views on representation and deepens our understanding of its different aspects, a question that connects philosophy, computer science, logic, anthropology, psychology, sociology, neuroscience, linguistics, information and communication science, systems theory and engineering, computability, cybernetics, synthetic biology, bioinformatics, and biosemiotics. The book will be relevant to researchers in all of these fields.
Today’s computer network technologies are sociologically founded on hunter-gatherer principles: common users may be subjected to surveillance, and sophisticated internet-based attacks are almost impossible to prevent. At the same time, information and communication technology (ICT) offers the technical possibility of embedded privacy protection. Making technology legitimate by design is part of intentional design for democracy. This means incorporating options for socially acceptable behaviour in technical systems, and making the basic principles of privacy protection, rights, and responsibilities transparent to the user. The current global e-polis already has, by means of different technologies, de facto built-in policies that define the level of user-privacy protection. What remains is to make their ethical implications explicit and understandable to citizens of the global village through interdisciplinary disclosive ethical methods, and to bring them into line with the high ethical norms that support trust, the essential precondition of any socialization. The good news is that research along these lines is already in progress. Hopefully, it will result in a future standard approach to the privacy of network communications.
This essay presents arguments for the claim that in the best of all possible worlds (Leibniz) there are sources of unpredictability and creativity for us humans, even given a pancomputational stance. A suggested answer to Chaitin’s questions “Where do new mathematical and biological ideas come from? How do they emerge?” is that they come from the world and emerge from basic physical (computational) laws. For humans, as a tiny subset of the universe, one part of the new ideas comes as the result of the re-configuration and reshaping of already existing elements, and another part comes from the outside world as a consequence of the openness and interactivity of biological and cognitive systems. For the universe at large, it is randomness that is the source of unpredictability at the fundamental level. In order to completely predict the Universe-computer, we would need the Universe-computer itself to compute its next state. As Chaitin demonstrated, there are incompressible truths, that is, truths that cannot be computed by any computer other than the universe itself.
This paper investigates the relationship between reality and model, information and truth. It argues that meaningful data need not be true in order to constitute information. Information to which no truth-value can be ascribed, partially true information, or even false information can lead to interesting outcomes such as technological innovation or scientific breakthrough. In the research process, during the transition between two theoretical frameworks, there is a dynamic mixture of old and new concepts in which truth is not well defined. Instead of veridicality, what is commonly required is the correctness of a model and its appropriateness within a context. Although empirical models are in general only truthlike, they are nevertheless capable of producing results from which conclusions can be drawn and adequate decisions made.