Information Theory as a General Language for Functional Systems
Abstract
Function refers to a broad family of concepts of varying abstractness and range of application, from a many-one mathematical relation of great generality to, for example, highly specialized roles of designed elements in complex machines, such as degaussing in a television set, or contributory processes to control mechanisms in complex metabolic pathways, such as the inhibitory function of the appropriate part of the lac-operon on the production of lactase through its action on the genome in the absence of lactose. We would like a language broad enough, neutral enough, and yet powerful enough to cover all such cases, and at the same time to give a framework for explanation of both the family resemblances and the differences. General logic and mathematics are too abstract but, more importantly, too broad, whereas other discourses of function, such as the biological and teleological contexts, are too narrow. Information is especially well suited since it is mathematically grounded, but it also has a well-known physical interpretation through the Schrödinger/Brillouin Negentropy Principle of Information, and an engineering or design interpretation through Shannon's communication theory. My main focus will be on the functions of autonomous anticipatory systems, but I will try to demonstrate the connections between this notion of function and the others, especially to dynamical systems with a physical interpretation on the one side and intentional systems on the other. The former are based in concepts like force, energy and work, while the latter involve notions like representation, control and purpose, traditionally, at least in Modern times, on opposite sides of the Cartesian divide. In principle, information can be reduced to energy, but it has the advantage of being more flexible and easier to apply to higher-level phenomena.