Abstract
Humans are cognitive entities. Our ongoing interactions with the environment are threaded with the creation and use of meaningful information. Animal life is also populated with meaningful information related to survival constraints. Information managed by artificial agents can also be considered as having meanings, as derived from the designer. Such a perspective leads us to propose an evolutionary approach to cognition based on meaningful information management. We use a systemic tool, the Meaning Generator System (MGS), and apply it consecutively to animals, humans and artificial agents [1, 2]. The MGS receives information from its environment and compares it with its constraint. The generated meaning is the connection between the received information and the constraint. It triggers an action aimed at satisfying the constraint. The action modifies the environment and, in turn, the generated meaning. Meaning generation links agents to their environments. The MGS is a system: a set of elements linked by a set of relations. Any system submitted to a constraint and capable of receiving information can lead to an MGS. Animals, humans and robots are agents containing MGSs dealing with different constraints. Similar MGSs carrying different constraints will generate different meanings. Cognition is system dependent. Contrary to approaches to meaning generation based on psychology or linguistics, the MGS approach is not based on the human mind; we want to avoid the circularity of taking the human mind as a starting point. Free will and self-consciousness participate in the management of human meanings. They do not exist for animals or robots. Staying alive is a constraint that we share with animals; robots do not have it. We first use the MGS for animals with “stay alive” and “group life” constraints. The analysis of meaning and cognition in animals is, however, limited by our incomplete understanding of the nature of life (the question of final causes).
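The MGS loop described above (receive information, connect it to the constraint, trigger a constraint-satisfying action) can be sketched in code. This is a minimal illustrative model, not the author's formalism: the `relevance` table, the class and method names, and the example inputs are all assumptions made for the sketch.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class MGS:
    """Sketch of a Meaning Generator System: a system submitted to a
    constraint, receiving information and generating meanings (the
    connection between information and constraint) that trigger
    constraint-satisfying actions."""
    constraint: str
    # Hypothetical relevance table: which information connects to this
    # constraint, and which action that connection triggers.
    relevance: Dict[str, str]

    def generate_meaning(self, info: str) -> Optional[str]:
        # The meaning is the connection between the received information
        # and the constraint; information with no connection to the
        # constraint generates no meaning for this system.
        if info in self.relevance:
            return f"{info} -> {self.constraint}"
        return None

    def act(self, info: str) -> Optional[str]:
        # A generated meaning triggers an action aimed at satisfying
        # the constraint.
        return self.relevance[info] if self.generate_meaning(info) else None

# Similar MGSs carrying different constraints generate different
# meanings (and actions) from the same information:
animal = MGS(constraint="stay alive", relevance={"predator scent": "flee"})
group = MGS(constraint="group life", relevance={"predator scent": "alert the group"})

print(animal.act("predator scent"))  # flee
print(group.act("predator scent"))   # alert the group
print(animal.act("birdsong"))        # None: no connection, no meaning
```

The two instances share the same mechanism but carry different constraints, so the same received information yields different meanings, illustrating the claim that cognition is system dependent.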
Extending the analysis of meaning generation and cognition to humans is complex and has real limitations, as the nature of the human mind remains a mystery for today’s science and philosophy. The natures of our feelings, free will and self-consciousness are unknown. Approaches to identifying human constraints are nevertheless possible, and the MGS can highlight some openings there [3, 4]. Modeling meaning management in artificial agents is rather straightforward with the MGS: we, the designers, know the agents and the constraints. The derived nature of the constraints, meanings and cognition must, however, be kept in mind. We define a meaningful representation of an item for an agent as the network of meanings relative to the item for the agent, together with the action scenarios involving the item. Such meaningful representations embed the agents in their environments and are far from GOFAI-type representations. Cognition, meanings and representations exist by and for the agents. We finish by summarizing the points presented here and highlighting possible continuations.

[1] “Information and Meaning”
[2] “Introduction to a systemic theory of meaning”
[3] “Computation on Information, Meaning and Representations. An Evolutionary Approach”
[4] “Proposal for a shared evolutionary nature of language and consciousness”
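The definition of a meaningful representation (the network of meanings relative to an item for an agent, plus the action scenarios involving the item) can be sketched as a simple data structure. All example content below (the item, its meanings, and the scenarios) is illustrative, assuming the MGS sketch's "info -> constraint" notation for meanings.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MeaningfulRepresentation:
    """Sketch: the network of meanings an item carries for an agent's
    constraints, together with the action scenarios involving the item."""
    item: str
    meanings: List[str] = field(default_factory=list)
    action_scenarios: List[str] = field(default_factory=list)

# Hypothetical example: the same item is represented only through its
# connections to the agent's constraints, not as an observer-independent
# GOFAI-style symbol. The representation exists by and for the agent.
water = MeaningfulRepresentation(
    item="water hole",
    meanings=[
        "water hole -> stay alive (drinking)",
        "water hole -> group life (meeting point)",
    ],
    action_scenarios=["approach and drink", "wait for the group"],
)

print(water.meanings)
```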
Keywords: cognition, information, meaning, constraint, animal, human, robot, evolution, representation, consciousness

Similar books and articles

Christophe Menant (2011). “Computation on Information, Meaning and Representations. An Evolutionary Approach”. In Gordana Dodig-Crnkovic & Mark Burgin (eds.), Information and Computation. World Scientific, pp. 255-286.

Christophe Menant (2013). “Turing Test, Chinese Room Argument, Symbol Grounding Problem. Meanings in Artificial Agents”. American Philosophical Association Newsletter on Philosophy and Computers 13 (1): 30-34.

Added to PP index: 2020-03-10