Results for 'algorithmic compressibility'

993 results found.
  1. Algorithmic compression of empirical data: reply to Twardy, Gardner, and Dowe. James McAllister - 2005 - Studies in History and Philosophy of Science Part A 36 (2):403-410.
    This discussion note responds to objections by Twardy, Gardner, and Dowe to my earlier claim that empirical data sets are algorithmically incompressible. Twardy, Gardner, and Dowe hold that many empirical data sets are compressible by the Minimum Message Length technique and offer this as evidence that these data sets are algorithmically compressible. I reply that the compression achieved by the Minimum Message Length technique is different from algorithmic compression. I conclude that Twardy, Gardner, and Dowe fail to establish that empirical data (...)
    2 citations
  2. Empirical data sets are algorithmically compressible: Reply to McAllister. Charles Twardy, Steve Gardner & David L. Dowe - 2005 - Studies in History and Philosophy of Science Part A 36 (2):391-402.
    James McAllister’s 2003 article, “Algorithmic randomness in empirical data”, claims that empirical data sets are algorithmically random, and hence incompressible. We show that this claim is mistaken. We present theoretical arguments and empirical evidence for compressibility, and discuss the matter in the framework of Minimum Message Length (MML) inference, which shows that the theory which best compresses the data is the one with the highest posterior probability and the best explanation of the data.
    7 citations
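    The disagreement in this pair of papers can be made concrete. Below is a minimal sketch of my own (not code from either paper; the three test signals are illustrative assumptions): since Kolmogorov complexity is uncomputable, a general-purpose compressor such as zlib is the usual stand-in, and it bounds complexity only from above, so it can witness compressibility but never prove incompressibility.

```python
import math
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size over original size. A ratio well below 1 shows the
    compressor found a pattern; a ratio near 1 is merely a failure to find one."""
    return len(zlib.compress(data, 9)) / len(data)

random.seed(0)
n = 10_000
# A noiseless, exactly periodic "law-governed" signal (period 50 bytes).
lawlike = bytes(int(127 + 120 * math.sin(2 * math.pi * i / 50)) for i in range(n))
# The same signal with small additive perturbations, as in McAllister's picture.
perturbed = bytes((b + random.randrange(-8, 9)) % 256 for b in lawlike)
# Pure i.i.d. noise: algorithmically incompressible with high probability.
noise = bytes(random.randrange(256) for _ in range(n))

for name, data in [("lawlike", lawlike), ("perturbed", perturbed), ("noise", noise)]:
    print(f"{name:>9}: {compression_ratio(data):.3f}")
```

    Typically the lawlike signal compresses drastically, the perturbed one only partially, and the noise hardly at all, which is roughly the empirical territory the two sides dispute.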
  3. Compressive Strength Prediction Using Coupled Deep Learning Model with Extreme Gradient Boosting Algorithm: Environmentally Friendly Concrete Incorporating Recycled Aggregate. Mayadah W. Falah, Sadaam Hadee Hussein, Mohammed Ayad Saad, Zainab Hasan Ali, Tan Huy Tran, Rania M. Ghoniem & Ahmed A. Ewees - 2022 - Complexity 2022:1-22.
    The application of recycled aggregate as a sustainable material in construction projects is considered a promising approach to decrease the carbon footprint of concrete structures. Prediction of the compressive strength (CS) of environmentally friendly (EF) concrete containing recycled aggregate is important for understanding the concrete behaviour of sustainable structures. In this research, the capability of the deep learning neural network approach is examined on the simulation of the CS of EF concrete. The developed approach is compared to the well-known artificial intelligence approaches named multivariate adaptive regression (...)
    1 citation
  4. Compressibility and the Algorithmic Theory of Laws. Billy Wheeler - 2019 - Principia: An International Journal of Epistemology 23 (3):461-485.
    The algorithmic theory of laws claims that the laws of nature are the algorithms in the best possible compression of all empirical data. This position assumes that the universe is compressible and that data received from observing it is easily reproducible using a simple set of rules. However, there are three sources of evidence that suggest that the universe as a whole is incompressible. The first comes from the practice of science. The other two come from the nature of (...)
    1 citation
  5. Vehicle Text Data Compression and Transmission Method Based on Maximum Entropy Neural Network and Optimized Huffman Encoding Algorithms. Jingfeng Yang, Zhenkun Zhang, Nanfeng Zhang, Ming Li, Yanwei Zheng, Li Wang, Yong Li, Ji Yang, Yifei Xiang & Yu Zhang - 2019 - Complexity 2019:1-9.
    1 citation
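    No abstract is indexed for this entry. As background for the second technique named in the title, here is a minimal textbook Huffman coder; this is my sketch of the classical algorithm, not the authors' optimized variant, and the sample message is an arbitrary assumption.

```python
import heapq
from collections import Counter

def huffman_code(text: str) -> dict:
    """Classical Huffman coding: repeatedly merge the two least frequent
    subtrees; frequent symbols end up with shorter prefix-free codewords."""
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    count = len(heap)  # tie-breaker so the heap never compares the dicts
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in left.items()}
        merged.update({s: "1" + w for s, w in right.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

message = "vehicle text data compression and transmission"
code = huffman_code(message)
encoded = "".join(code[ch] for ch in message)
print(f"{8 * len(message)} bits raw -> {len(encoded)} bits Huffman-coded")
```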
  6. Analysis and modification of graphic data compression algorithms. M. K. Bouza - 2020 - Artificial Intelligence Scientific Journal 25 (4):32-40.
    The article examines the algorithms for JPEG and JPEG-2000 compression of various graphic images. The main steps of the operation of both algorithms are given, their advantages and disadvantages are noted. The main differences between JPEG and JPEG-2000 are analyzed. It is noted that the JPEG-2000 algorithm allows removing visually unpleasant effects. This makes it possible to highlight important areas of the image and improve the quality of their compression. The features of each step of the algorithms are considered and (...)
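    To make the JPEG pipeline discussed in this article concrete, here is a toy version of its per-block transform-and-quantize step (a sketch of mine, assuming NumPy and SciPy; real JPEG adds 8x8 tiling of a full image, a standardized quantization table, zig-zag ordering, and entropy coding):

```python
import numpy as np
from scipy.fft import dctn, idctn

# One 8x8 "image" block: a smooth horizontal gradient, the easy case for JPEG.
block = np.tile(np.linspace(0, 255, 8), (8, 1))

coeffs = dctn(block - 128, norm="ortho")   # level shift, then 2D DCT-II
Q = 16                                     # flat quantization step (real JPEG uses a table)
quantized = np.round(coeffs / Q)           # the lossy step: most coefficients vanish
restored = idctn(quantized * Q, norm="ortho") + 128

print(f"nonzero coefficients kept: {np.count_nonzero(quantized)}/64")
print(f"max reconstruction error:  {np.abs(block - restored).max():.2f}")
```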
  7. Algorithmic randomness in empirical data. James W. McAllister - 2003 - Studies in History and Philosophy of Science Part A 34 (3):633-646.
    According to a traditional view, scientific laws and theories constitute algorithmic compressions of empirical data sets collected from observations and measurements. This article defends the thesis that, to the contrary, empirical data sets are algorithmically incompressible. The reason is that individual data points are determined partly by perturbations, or causal factors that cannot be reduced to any pattern. If empirical data sets are incompressible, then they exhibit maximal algorithmic complexity, maximal entropy and zero redundancy. They are therefore maximally (...)
    13 citations
  8. Compressibility, Laws of Nature, Initial Conditions and Complexity. Sergio Chibbaro & Angelo Vulpiani - 2017 - Foundations of Physics 47 (10):1368-1386.
    We critically analyse the point of view for which laws of nature are just a means to compress data. Discussing some basic notions of dynamical systems and information theory, we show that the idea that the analysis of large amounts of data by means of a compression algorithm is equivalent to the knowledge one can have from scientific laws is rather naive. In particular we discuss the subtle conceptual topic of the initial conditions of phenomena, which are generally incompressible. (...)
    1 citation
  9. Compressed Sensing for THz FMCW Radar 3D Imaging. Shanshan Gu, Guangrong Xi, Lingyu Ge, Zhong Yang, Yizhi Wang, Weina Chen & Zhenzhong Yu - 2021 - Complexity 2021:1-10.
    A terahertz frequency-modulated continuous wave (FMCW) imaging radar system has recently been developed for high-resolution 3D imaging. Aiming at the problems of long data acquisition periods and large sample sizes in the developed imaging system, an algorithm based on compressed sensing is proposed for THz FMCW radar 3D imaging in this paper. Firstly, the FMCW radar signal model is built, and the conventional range migration algorithm is introduced for THz FMCW radar imaging. Then, compressed sensing is extended for THz FMCW radar 3D (...)
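    The compressed-sensing step the abstract refers to (recovering a sparse scene from far fewer measurements than signal samples) can be sketched in a few lines. This is my illustration using orthogonal matching pursuit, not the paper's range-migration pipeline; the problem sizes and the Gaussian sensing matrix are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                 # scene size, measurements (m << n), sparsity

x = np.zeros(n)                      # k-sparse "scene" (e.g., a few strong reflectors)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x                          # compressed measurements

# Orthogonal matching pursuit: greedily pick the column most correlated with
# the residual, then least-squares refit on the support found so far.
support = []
residual = y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(Phi.T @ residual))))
    coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    residual = y - Phi[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
print(f"support recovered: {sorted(support) == sorted(np.flatnonzero(x))}")
print(f"reconstruction error: {np.linalg.norm(x - x_hat):.2e}")
```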
  10. Compression in Working Memory and Its Relationship With Fluid Intelligence. Mustapha Chekaf, Nicolas Gauvrit, Alessandro Guida & Fabien Mathy - 2018 - Cognitive Science 42 (S3):904-922.
    Working memory has been shown to be strongly related to fluid intelligence; however, our goal is to shed further light on the process of information compression in working memory as a determining factor of fluid intelligence. Our main hypothesis was that compression in working memory is an excellent indicator for studying the relationship between working-memory capacity and fluid intelligence because both depend on the optimization of storage capacity. Compressibility of memoranda was estimated using an algorithmic complexity metric. The (...)
  11. Image Compression Based on Block SVD Power Method. Khalid El Asnaoui - 2019 - Journal of Intelligent Systems 29 (1):1345-1359.
    In recent years, rapid growth in the development of, and demand for, multimedia products has contributed to a shortage of device bandwidth and network storage memory. Consequently, data compression has become more significant for reducing data redundancy and allowing more data to be transferred and stored. In this context, this paper addresses the problem of lossy image compression. The proposed method is based on the block singular value decomposition (SVD) power method (...)
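    The paper's block SVD power method is not reproduced here, but the idea it serves, low-rank image approximation by truncated SVD, is easy to demonstrate (my sketch, assuming NumPy; the synthetic "image" is an illustrative stand-in):

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in "image": nearly low-rank structure plus faint noise.
A = rng.standard_normal((64, 8)) @ rng.standard_normal((8, 64))
A += 0.01 * rng.standard_normal((64, 64))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 8                                    # keep only the k largest singular values
A_k = (U[:, :k] * s[:k]) @ Vt[:k]        # rank-k reconstruction

stored = k * (A.shape[0] + A.shape[1] + 1)   # numbers kept: U columns, V rows, s
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"stored {stored} of {A.size} values, relative error {rel_err:.4f}")
```

    The power method the title refers to is one way of computing the leading singular vectors without a full SVD; the compression idea is the same.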
  12. Algorithmics and the Limits of Complexity. Daniel Parrochia - 1996 - Science in Context 9 (1):39-56.
    Dagognet's work shows that making algorithmic compressions seems to be one of the major targets of scientific progress. This effort has been so successful that until recently one might have thought everything could be algorithmically compressed. Indeed, this statement, which might be seen as a scientific translation of the Hegelian thesis in its strong form, admits of some objective limits in computer science. Though a lot of algorithms are successful, there exist today, and perhaps forever, logical and physical (...)
    1 citation
  13. Algorithmic Probability and Friends. Bayesian Prediction and Artificial Intelligence: Papers From the Ray Solomonoff 85th Memorial Conference, Melbourne, VIC, Australia, November 30 - December 2, 2011. David L. Dowe (ed.) - 2013 - Springer.
    Algorithmic probability and friends: Proceedings of the Ray Solomonoff 85th memorial conference is a collection of original work and surveys. The Solomonoff 85th memorial conference was held at Monash University's Clayton campus in Melbourne, Australia, as a tribute to the pioneer Ray Solomonoff, honouring his various pioneering works - most particularly, his revolutionary insight in the early 1960s that the universality of Universal Turing Machines could be used for universal Bayesian prediction and artificial intelligence. This work continues to increasingly influence (...)
  14. Basic concepts in algorithms. Shmuel T. Klein - 2021 - Hoboken: World Scientific.
    This book is the result of several decades of teaching experience in data structures and algorithms. It is self-contained but does assume some prior knowledge of data structures, and a grasp of basic programming and mathematics tools. Basic Concepts in Algorithms focuses on more advanced paradigms and methods combining basic programming constructs as building blocks and their usefulness in the derivation of algorithms. Its coverage includes the algorithms' design process and an analysis of their performance. It is primarily intended as (...)
  15. Algorithmic Structuring of Cut-free Proofs. Matthias Baaz & Richard Zach - 1993 - In Egon Börger, Hans Kleine Büning, Gerhard Jäger, Simone Martini & Michael M. Richter (eds.), Computer Science Logic. CSL’92, San Miniato, Italy. Selected Papers. Springer. pp. 29–42.
    The problem of algorithmic structuring of proofs in the sequent calculi LK and LKB (LK where blocks of quantifiers can be introduced in one step) is investigated, where a distinction is made between linear proofs and proofs in tree form. In this framework, structuring coincides with the introduction of cuts into a proof. The algorithmic solvability of this problem can be reduced to the question of k-l-compressibility: "Given a proof of length k, and l ≤ (...)
  16. Compressed Environments: Unbounded Optimizers Should Sometimes Ignore Information. [REVIEW] Nathan Berg & Ulrich Hoffrage - 2010 - Minds and Machines 20 (2):259-275.
    Given free information and unlimited processing power, should decision algorithms use as much information as possible? A formal model of the decision-making environment is developed to address this question and provide conditions under which informationally frugal algorithms, without any information or processing costs whatsoever, are optimal. One cause of compression that allows optimal algorithms to rationally ignore information is inverse movement of payoffs and probabilities (e.g., high payoffs occur with low probability and low payoffs occur with high probability). If inversely (...)
    1 citation
  17. Application of Normalized Compression Distance and Lempel-Ziv Jaccard Distance in Micro-electrode Signal Stream Classification for the Surgical Treatment of Parkinson’s Disease. Kamil Ząbkiewicz - 2018 - Studies in Logic, Grammar and Rhetoric 56 (1):45-57.
    Parkinson’s Disease can be treated with the use of microelectrode recording and stimulation. This paper presents a data stream classifier that analyses raw data from micro-electrodes and decides whether the measurements were taken from the subthalamic nucleus (STN) or not. The novelty of the proposed approach is based on the fact that distances based on raw data are used. Two distances are investigated in this paper, i.e. Normalized Compression Distance (NCD) and Lempel-Ziv Jaccard Distance (LZJD). No new features needed to (...)
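    The first of the two distances is straightforward to implement. A minimal sketch of my own, with zlib standing in for the idealized compressor and toy byte strings in place of micro-electrode recordings:

```python
import random
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
        NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
    with C the compressed length; near 0 = similar, near 1 = unrelated."""
    c = lambda s: len(zlib.compress(s, 9))
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

random.seed(0)
a = b"segment of electrode signal " * 50
b = b"segment of electrode signal " * 49 + b"with a small perturbation   "
z = bytes(random.randrange(256) for _ in range(1400))   # unrelated random stream

print(f"NCD(a, b) = {ncd(a, b):.3f}   # near 0: shared structure")
print(f"NCD(a, z) = {ncd(a, z):.3f}   # near 1: no shared structure")
```

    The appeal for raw signal streams, as the abstract notes, is that no hand-crafted features are needed: the compressor supplies the similarity measure.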
  18. Algorithmic Randomness and Measures of Complexity. George Barmpalias - 2013 - Bulletin of Symbolic Logic 19 (3):318-350.
    We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention on measures of relative complexity. We focus on reducibilities that measure the initial segment complexity of reals and the power of reals to compress strings, when they are used as oracles. The results are put into context and several connections are made with various central issues in modern algorithmic randomness and computability.
    1 citation
  19. The application of algorithmic information theory to noisy patterned strings. Sean Devine - 2006 - Complexity 12 (2):52-58.
    Although algorithmic information theory provides a measure of the information content of a string of characters, problems of noise and noncomputability emerge. However, this article shows that if the pattern in a noisy string is recognized by reference to a set of similar strings, a compressed algorithmic description of the noisy string is possible, and it illustrates this with some simple examples. The article also shows that algorithmic information theory can quantify the information in complex organized systems where pattern is (...)
  20. Is Evolution Algorithmic? Marcin Miłkowski - 2009 - Minds and Machines 19 (4):465-475.
    In Darwin’s Dangerous Idea, Daniel Dennett claims that evolution is algorithmic. On Dennett’s analysis, evolutionary processes are trivially algorithmic because he assumes that all natural processes are algorithmic. I will argue that there are more robust ways to understand algorithmic processes that make the claim that evolution is algorithmic empirical and not conceptual. While laws of nature can be seen as compression algorithms of information about the world, it does not follow logically that they are (...)
  21. Selected papers on design of algorithms. Donald Ervin Knuth - 2010 - Stanford, Calif.: Center for the Study of Language and Information.
    Donald E. Knuth has been making foundational contributions to the field of computer science for as long as computer science has been a field. His award-winning textbooks are often given credit for shaping the field, and his scientific papers are widely referenced and stand as milestones of development over a wide variety of topics. The present volume, the seventh in a series of his collected papers, is devoted to his work on the design of new algorithms. Nearly thirty of Knuth’s (...)
  22. Study of Human Motion Recognition Algorithm Based on Multichannel 3D Convolutional Neural Network. Yang Ju - 2021 - Complexity 2021:1-12.
    Aiming at the problem that it is difficult to balance the speed and accuracy of human behaviour recognition, this paper proposes a method of motion recognition based on random projection. Firstly, optical flow pictures obtained by the Lucas-Kanade algorithm and Red, Green, Blue (RGB) pictures are used. Secondly, the data of the optical flow pictures and RGB pictures are compressed based on a random projection matrix from compressed sensing, which effectively reduces power consumption. At the same time, based on random projection (...)
  23. Construction of Social Security Fund Cloud Audit Platform Based on Fuzzy Data Mining Algorithm. Yangting Huai & Qianxiao Zhang - 2021 - Complexity 2021:1-11.
    Guided by the theories of system theory, synergetic theory, and other disciplines, and based on a fuzzy data mining algorithm, this article constructs a three-tier social security fund cloud audit platform. Firstly, the article systematically expounds the current situation of social security funds and social security fund auditing, as well as the technical basis of cloud computing and data mining. Combined with actual work, the necessity and feasibility of building a cloud audit platform for social security funds are analyzed. This article (...)
  24. Prediction of Ammunition Storage Reliability Based on Improved Ant Colony Algorithm and BP Neural Network. Fang Liu, Hua Gong, Ligang Cai & Ke Xu - 2019 - Complexity 2019:1-13.
    The interference of the complex background and less information of the small targets are two major problems in vehicle attribute recognition. In this paper, two cascaded networks of vehicle attribute recognition are established to solve the two problems. For vehicle targets with normal size, the multitask cascaded convolution neural network MC-CNN-NT uses the improved Faster R-CNN as the location subnetwork. The vehicle targets in the complex background are extracted by the location subnetwork to the classification subnetwork CNN for the classification. (...)
    1 citation
  25. Strengthening Weak Emergence. Nora Berenstain - 2020 - Erkenntnis 87 (5):2457-2474.
    Bedau's influential (1997) account analyzes weak emergence in terms of the non-derivability of a system’s macrostates from its microstates except by simulation. I offer an improved version of Bedau’s account of weak emergence in light of insights from information theory. Non-derivability alone does not guarantee that a system’s macrostates are weakly emergent. Rather, it is non-derivability plus the algorithmic compressibility of the system’s macrostates that makes them weakly emergent. I argue that the resulting information-theoretic picture provides a metaphysical (...)
  26. Humeanism and Exceptions in the Fundamental Laws of Physics. Billy Wheeler - 2017 - Principia: An International Journal of Epistemology 21 (3):317-337.
    It has been argued that the fundamental laws of physics do not face a ‘problem of provisos’ equivalent to that found in other scientific disciplines (Earman, Roberts and Smith 2002) and there is only the appearance of exceptions to physical laws if they are confused with differential equations of evolution type (Smith 2002). In this paper I argue that even if this is true, fundamental laws in physics still pose a major challenge to standard Humean approaches to lawhood, as they (...)
    2 citations
  27. The Nooscope manifested: AI as instrument of knowledge extractivism. Matteo Pasquinelli & Vladan Joler - 2021 - AI and Society 36 (4):1263-1280.
    Some enlightenment regarding the project to mechanise reason. The assembly line of machine learning: data, algorithm, model. The training dataset: the social origins of machine intelligence. The history of AI as the automation of perception. The learning algorithm: compressing the world into a statistical model. All models are wrong, but some are useful. World to vector: the society of classification and prediction bots. Faults of a statistical instrument: the undetection of the new. Adversarial intelligence vs. statistical intelligence: labour in the (...)
    10 citations
  28. Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers. Peter D. Grünwald & Paul M. B. Vitányi - 2003 - Journal of Logic, Language and Information 12 (4):497-529.
    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov (“algorithmic”) mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding (...)
    7 citations
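    One contrast the paper develops, Shannon entropy as a property of a distribution versus Kolmogorov complexity as a property of an individual string, can be shown in a few lines (my sketch; the compressed length is only a crude upper bound on Kolmogorov complexity):

```python
import math
import zlib
from collections import Counter

def shannon_bits_per_byte(data: bytes) -> float:
    """Entropy of the empirical byte distribution (an ensemble quantity)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def kolmogorov_upper_bound(data: bytes) -> float:
    """Compressed bits per byte: upper-bounds K(x)/|x| for this one string."""
    return 8 * len(zlib.compress(data, 9)) / len(data)

# Every byte value appears equally often, so the byte distribution is uniform
# (maximal Shannon entropy), yet the string itself is trivially patterned.
periodic = bytes(range(256)) * 40
print(f"Shannon entropy:        {shannon_bits_per_byte(periodic):.2f} bits/byte")
print(f"Kolmogorov upper bound: {kolmogorov_upper_bound(periodic):.2f} bits/byte")
```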
  29. Seeing Patterns in Randomness: A Computational Model of Surprise. Phil Maguire, Philippe Moser, Rebecca Maguire & Mark T. Keane - 2019 - Topics in Cognitive Science 11 (1):103-118.
    Much research has linked surprise to violation of expectations, but it has been less clear how one can be surprised when one has no particular expectation. This paper discusses a computational theory based on Algorithmic Information Theory, which can account for surprises in which one initially expects randomness but then notices a pattern in stimuli. The authors present evidence that a “randomness deficiency” heuristic leads to surprise in such cases.
    2 citations
  30. Solomonoff Prediction and Occam’s Razor. Tom F. Sterkenburg - 2016 - Philosophy of Science 83 (4):459-479.
    Algorithmic information theory gives an idealized notion of compressibility that is often presented as an objective measure of simplicity. It is suggested at times that Solomonoff prediction, or algorithmic information theory in a predictive setting, can deliver an argument to justify Occam’s razor. This article explicates the relevant argument and, by converting it into a Bayesian framework, reveals why it has no such justificatory force. The supposed simplicity concept is better perceived as a specific inductive assumption, the (...)
    6 citations
  31. Psychopower and Ordinary Madness: Reticulated Dividuals in Cognitive Capitalism. Ekin Erkan - 2019 - Cosmos and History 15 (1):214-241.
    Despite the seemingly neutral vantage of using nature for widely-distributed computational purposes, neither post-biological nor post-humanist teleology simply concludes with the real "end of nature" as entailed in the loss of the specific ontological status embedded in the identifier "natural." As evinced by the ecological crises of the Anthropocene—of which the 2019 Brazil Amazon rainforest fires are only the most recent—our epoch has transfixed the “natural order" and imposed entropic artificial integration, producing living species that become “anoetic,” made to serve (...)
    2 citations
  32. Universal Prediction: A Philosophical Investigation. Tom F. Sterkenburg - 2018 - Dissertation, University of Groningen.
    In this thesis I investigate the theoretical possibility of a universal method of prediction. A prediction method is universal if it is always able to learn from data: if it is always able to extrapolate given data about past observations to maximally successful predictions about future observations. The context of this investigation is the broader philosophical question into the possibility of a formal specification of inductive or scientific reasoning, a question that also relates to modern-day speculation about a fully automatized (...)
    4 citations
  33. An adaptive methodology for approximate estimation of the technical and economic indicators of developing a depleted gas condensate field. Mykhailo Fyk, Kyrylo Kurochkin, Mohammed Abbud, Mohammed Al Sultan & Olena Varavina - 2017 - Схід 5 (151):15-21.
    The article considers a method for evaluating the technical and economic indicators of gas condensate field development, tested across the branch enterprises of PJSC "Ukrgazvydobuvannya". The method is supplemented and simplified to give approximate estimates when the gas condensate factor and the thermobaric state of the extracted fluid change, naturally or artificially, from year to year during the late stage of operation. Targeted changes in the algorithms for calculating basic technical and economic indicators, especially for (...)
    3 citations
  34. Composition as pattern. Steve Petersen - 2019 - Philosophical Studies 176 (5):1119-1139.
    I argue for patternism, a new answer to the question of when some objects compose a whole. None of the standard principles of composition comfortably capture our natural judgments, such as that my cat exists and my table exists, but there is nothing wholly composed of them. Patternism holds, very roughly, that some things compose a whole whenever together they form a “real pattern”. Plausibly we are inclined to acknowledge the existence of my cat and my table but not of (...)
    11 citations
  35. Idealization and the Laws of Nature. Billy Wheeler - 2018 - Switzerland: Springer.
    This new study provides a refreshing look at the issue of exceptions and shows that much of the problem stems from a failure to recognize at least two kinds of exception-ridden law: ceteris paribus laws and ideal laws. Billy Wheeler offers the first book-length discussion of ideal laws. The key difference between these two kinds of laws concerns the nature of the conditions that need to be satisfied and their epistemological role in the law’s formulation and discovery. He presents a (...)
    3 citations
  36. Simplicity, Language-Dependency and the Best System Account of Laws. Billy Wheeler - 2016 - Theoria: Revista de Teoría, Historia y Fundamentos de la Ciencia 31 (2):189-206.
    It is often said that the best system account of laws (BSA) needs supplementing with a theory of perfectly natural properties. The ‘strength’ and ‘simplicity’ of a system is language-relative and without a fixed vocabulary it is impossible to compare rival systems. Recently a number of philosophers have attempted to reformulate the BSA in an effort to avoid commitment to natural properties. I assess these proposals and argue that they are problematic as they stand. Nonetheless, I agree with their aim, (...)
    5 citations
  37. Parental manual ventilation in resource-limited settings: an ethical controversy. Emily Barsky & Sadath Sayeed - 2020 - Journal of Medical Ethics 46 (7):459-464.
    Lower respiratory tract infections are a leading cause of paediatric morbidity and mortality worldwide. Children in low-income countries are disproportionately affected. This is in large part due to limitations in healthcare resources and medical technologies. Mechanical ventilation can be a life-saving therapy for many children with acute respiratory failure. The scarcity of functioning ventilators in low-income countries results in countless preventable deaths. Some hospitals have attempted to adapt to this scarcity by using hand-bag ventilation, as either a bridge to a (...)
    1 citation
  38. Consumption Reduction Solution of TV News Broadcast System Based on Wireless Communication Network. Haifeng Qiang - 2021 - Complexity 2021:1-13.
    At present, the news broadcast systems on the market that use mobile networks provide the basic functions required by TV stations, but many problems and shortcomings remain. In view of the main problems of the current systems, and combined with the actual needs of current users, this paper presents a preliminary news broadcast system based on 5G Live. Its card frame adaptive strategy significantly improves the user experience by using gradual video frame buffering technology. Hardware codec technology (...)
  39. Virtual Reality Video Image Classification Based on Texture Features. Guofang Qin & Guoliang Qin - 2021 - Complexity 2021:1-11.
    As one of the most widely used methods in deep learning technology, convolutional neural networks have powerful feature extraction capabilities and nonlinear data fitting capabilities. However, the convolutional neural network method still has disadvantages such as a complex network model, long training times, excessive consumption of computing resources, slow convergence, network overfitting, and classification accuracy that needs to be improved. Therefore, this article proposes a dense convolutional neural network classification algorithm based on texture features for images in virtual (...)
  40. Fusion-Learning-Based Optimization: A Modified Metaheuristic Method for Lightweight High-Performance Concrete Design. Ghodrat Rahchamani, Seyed Mojtaba Movahedifar & Amin Honarbakhsh - 2022 - Complexity 2022:1-15.
    In order to build high-quality concrete, it is imperative to know the raw materials in advance. It is possible to accurately predict the quality of concrete and the amount of raw materials used using machine learning-enhanced methods. An automated process based on machine learning strategies is proposed in this paper for predicting the compressive strength of concrete. Fusion-learning-based optimization is used in the proposed approach to generate a strong learner by pooling support vector regression models. The SVR technique proposes an (...)
  41. Improving the efficiency of intrusion detection in information systems. Bouderah Brahim, Nacer Eddine Yousfi, Bourenane Malika & Lounis Ouarda - 2022 - Journal of Intelligent Systems 31 (1):835-854.
    Policy Interaction Graph Analysis is a host-based intrusion detection tool that uses the Linux Mandatory Access Control (MAC) policy to build the licit information flow graph, and uses a detection policy defined by the administrator to extract illicit behaviour from the graph. The main limitation of this tool is that it generates a huge signature base of illicit behaviours, which in turn requires a huge amount of memory to store. Our primary goal in this article is to reduce this (...)
  42. Task Reallocating for Responding to Design Change in Complex Product Design. Meng Wei, Yu Yang, Jiafu Su, Qiucheng Li & Zhichao Liang - 2019 - Journal of Intelligent Systems 28 (1):57-76.
    In the real-world complex product design process, task allocation is an ongoing reactive process in which unexpected design changes are usually inevitable. Therefore, reallocation is necessary as a procedure to repair the affected task plan and respond to design changes positively. The general reallocation literature addresses versions of the problem with fixed execution times. In this paper, a multi-objective reallocation model is developed under the feasible assumption that task execution time is controllable. To illustrate this idea, a compressing executing (...)
    1 citation
  43. Dance Movement Recognition Based on Feature Expression and Attribute Mining. Xianfeng Zhai - 2021 - Complexity 2021:1-12.
    There are complex posture changes in dance movements, which lead to low accuracy in dance movement recognition, and none of the current motion recognition methods use the dancer’s attributes. The attribute features of dancers are important high-level semantic information in action recognition. Therefore, a dance movement recognition algorithm based on feature expression and attribute mining is designed to learn complicated and changeable dance movements. Firstly, the original image information is compressed by the time-domain fusion module, and the (...)
    1 citation
  44. Just Because You Can—Doesn’t Mean You Should. Mindy B. Statter - 2015 - Narrative Inquiry in Bioethics 5 (1):22-24.
    In lieu of an abstract, here is a brief excerpt of the content:“Just Because You Can—Doesn’t Mean You Should”Mindy B. StatterAs Albert R. Jonsen stated, “The technological imperative begins to rule clinical decisions: if a technology exists, it must be applied. Patients... are moved to higher and higher levels of care, finally becoming enmeshed in a tangle of tubes that extinguish their identity and needs as persons.” In this case the conflict created by the parental demand for the utilization of (...)
  45. Computer Science Logic. Dirk van Dalen & Marc Bezem (eds.) - 1997 - Springer.
    The related fields of fractal image encoding and fractal image analysis have blossomed in recent years. This book, originating from a NATO Advanced Study Institute held in 1995, presents work by leading researchers. It develops the subjects at an introductory level, but also includes some recent and exciting results in both fields. The book contains a thorough discussion of fractal image compression and decompression, including both continuous and discrete formulations, vector space and hierarchical methods, and algorithmic optimizations. (...)
  46. On Maxwell's demons and the origin of evolutionary variations: An internalist perspective. Eugenio Andrade - 2004 - Acta Biotheoretica 52 (1):17-40.
    This paper defends an internalist perspective of selection based on the hypothesis that considers living evolutionary units as Maxwell's demons (MD) or Zurek's Information Gathering and Using Systems (IGUS). Individuals are considered as IGUS that extract work by means of measuring and recording processes. Interactions or measurements convert uncertainty about the environment (Shannon's information, H) into internalized information in the form of a compressed record (Chaitin's algorithmic complexity, K). The requirements of the model and the limitations inherent to its (...)
  47. On the Kolmogorov-Chaitin complexity for short sequences. Hector Zenil - unknown.
    This is a presentation about joint work between Hector Zenil and Jean-Paul Delahaye. Zenil presents Experimental Algorithmic Theory as Algorithmic Information Theory and NKS, put together in a mixer. Algorithmic Complexity Theory defines the algorithmic complexity k(s) as the length of the shortest program that produces s. But since finding this short program is in general an undecidable question, the only way to approach k(s) is to use compression algorithms. He shows how to use the Compress (...)
    3 citations
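    The difficulty Zenil addresses is easy to demonstrate (a sketch of mine, with zlib and bz2 standing in for Compress): real compressors carry fixed header and model overhead, so for short strings the compressed length tells us almost nothing about k(s).

```python
import bz2
import zlib

# Compressed length vs. input length: for short inputs the fixed overhead
# dominates and the "compressed" form is longer than the string itself, so
# compression-based estimates of k(s) are only informative for long strings.
for s in [b"0", b"01" * 4, b"0" * 64, b"0" * 4096]:
    print(f"|s| = {len(s):>4}   zlib: {len(zlib.compress(s, 9)):>4}"
          f"   bz2: {len(bz2.compress(s, 9)):>4}")
```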
1 — 47 / 993