  1. Big Historical Foundations for Deep Future Speculations: Cosmic Evolution, Atechnogenesis, and Technocultural Civilization. Cadell Last - 2017 - Foundations of Science 22 (1): 39-124.
    Big historians are attempting to construct a general holistic narrative of human origins enabling an approach to studying the emergence of complexity, the relation between evolutionary processes, and the modern context of human experience and actions. In this paper I attempt to explore the past and future of cosmic evolution within a big historical foundation characterized by physical, biological, and cultural eras of change. From this analysis I offer a model of the human future that includes an addition and/or reinterpretation (...)
  2. A Reflective Note for Dialectical Thinkers. Cadell Last - 2018 - International Journal of Žižek Studies 12 (4).
    The dominant forms of thought today exist as either deconstructive or metalinguistic structures. Here we attempt to situate dialectical thinking as a constructive meta-mediation of this opposition between deconstruction and metalanguage. Dialectical thinking offers us a way to think about the processual nature of reason itself as a force of thought mediating being. In this mode of understanding we attempt to think the possibility of articulating the meaning and importance of ‘metaontology’ defined as the ontology of epistemology. In a metaontology (...)
  3. Abstraction, mimesis and the evolution of deep learning. Jon Eklöf, Thomas Hamelryck, Cadell Last, Alexander Grima & Ulrika Lundh Snis - forthcoming - AI and Society: 1-9.
    Deep learning developers typically rely on deep learning software frameworks (DLSFs)—simply described as pre-packaged libraries of programming components that provide high-level access to deep learning functionality. New DLSFs progressively encapsulate mathematical, statistical and computational complexity. Such higher levels of abstraction subsequently make it easier for deep learning methodology to spread through mimesis (i.e., imitation of models perceived as successful). In this study, we quantify this increase in abstraction and discuss its implications. Analyzing publicly available code from Github, we found that (...)