Very large databases are a major opportunity for science, and data analytics is a remarkable new field of investigation in computer science. The effectiveness of these tools has been used to support a “philosophy” opposed to the scientific method as developed throughout history. According to this view, computer-discovered correlations should replace understanding and guide prediction and action; consequently, there would be no need to give scientific meaning to phenomena by proposing, say, causal relations, since regularities in very large databases are enough: “with enough data, the numbers speak for themselves”. The “end of science” is proclaimed. Using classical results from ergodic theory, Ramsey theory and algorithmic information theory, we show that this “philosophy” is wrong. For example, we prove that very large databases have to contain arbitrary correlations. These correlations appear only because of the size, not the nature, of the data. They can be found in “randomly” generated, large enough databases, which, as we will prove, implies that most correlations are spurious. Too much information tends to behave like very little information. The scientific method can be enriched by computer mining in immense databases, but not replaced by it.
If x = x₁x₂⋯xₙ⋯ is a random sequence, then the sequence y = 0x₁0x₂⋯0xₙ⋯ is clearly not random; however, y seems to be “about half random”. L. Staiger [Kolmogorov complexity and Hausdorff dimension, Inform. and Comput. 103, 159–194, and A tight upper bound on Kolmogorov complexity and uniformly optimal prediction, Theory Comput. Syst. 31, 215–229] and K. Tadaki [A generalisation of Chaitin’s halting probability Ω and halting self-similar sets, Hokkaido Math. J. 31, 219–253] have studied the degree of randomness of sequences or reals by measuring their “degree of compression”. This line of study leads to various definitions of partial randomness. In this paper we explore some relations between these definitions. Among other results we obtain a characterisation of Σ₁-dimension in terms of strong Martin-Löf ε-tests, and we show that ε-randomness for ε ∈ (0, 1) is different from the classical 1-randomness.
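The “about half random” intuition can be mimicked with a real-world compressor (a rough analogue of our own devising; practical compressors only crudely approximate the Kolmogorov complexity used in the paper): diluting random data with zeros roughly halves its compression rate.

```python
# Rough empirical analogue: interleaving zeros with random data
# roughly halves the per-symbol compression rate.
import os
import zlib

random_bytes = os.urandom(10_000)
# Interleave a zero byte after every random byte (the dilution y above).
diluted = bytes(b for rb in random_bytes for b in (rb, 0))

def rate(s: bytes) -> float:
    """Compressed size per input byte; near 1 means incompressible."""
    return len(zlib.compress(s, 9)) / len(s)

print(f"random  rate: {rate(random_bytes):.2f}")  # close to 1: incompressible
print(f"diluted rate: {rate(diluted):.2f}")       # noticeably below 1
```

The diluted string compresses to markedly fewer bits per symbol than the random one, mirroring the idea that y carries about half the information density of x.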
We study some aspects of the emergence of _lógos_ from _xáos_ on a basal model of the universe, using methods and techniques from algorithmic information theory and Ramsey theory. Thereby an intrinsic and unusual mixture of meaningful and spurious emerging laws surfaces. The spurious, emergent laws abound; they can be found almost everywhere. In accord with ancient Greek theogony, one could say that _lógos_, the Gods and the laws of the universe originate from “the void”, or from _xáos_, a picture which supports the unresolvable/irreducible lawlessness hypothesis. The analysis presented in this paper suggests that the “laws” discovered in science correspond merely to syntactical correlations, and are local rather than universal.
Do the partial order and ortholattice operations of a quantum logic correspond to the logical implication and connectives of classical logic? Rephrased: how far might a classical understanding of quantum mechanics be, in principle, possible? A celebrated result of Kochen and Specker answers the above question in the negative. However, this answer is just one among various possible ones, not all negative. It is our aim to discuss the above question in terms of mappings of quantum worlds into classical ones, more specifically, in terms of embeddings of quantum logics into classical logics; depending upon the type of restrictions imposed on the embeddings, the question may receive negative or positive answers.
Recently, G. Asser obtained two interesting characterizations of the class of unary primitive recursive string-functions over a fixed alphabet as Robinson algebras. Both characterizations use a somewhat artificial string-function, namely the string-function lexicographically associated with the number-theoretical excess-over-a-square function. Our aim is to offer two new and natural Robinson algebras which are equivalent to Asser’s algebras.
Turing’s famous 1936 paper “On computable numbers, with an application to the Entscheidungsproblem” defines a computable real number and uses Cantor’s diagonal argument to exhibit an uncomputable real. Roughly speaking, a computable real is one that can be calculated digit by digit: there is an algorithm for approximating it as closely as one may wish. All the reals one normally encounters in analysis are computable, like π, √2 and e. But they are much scarcer than the uncomputable reals because, as Turing points out, the computable reals are countable, whilst the uncomputable reals have the power of the continuum. Furthermore, any countable set of reals has measure zero, so the computable reals have measure zero. In other words, if one picks a real at random in the unit interval with uniform probability distribution, the probability of obtaining an uncomputable real is unity. One may obtain a computable real, but that is infinitely improbable. But how about individual examples of uncomputable reals? We will show two: H and the halting probability Ω, both contained in the unit interval. Their construction was anticipated in..
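Turing’s notion of digit-by-digit calculability can be made concrete. A minimal sketch of our own (not Turing’s construction), assuming only exact integer arithmetic, produces any requested number of digits of the computable real √2:

```python
# sqrt(2) is computable: exact integer arithmetic yields any digit on demand.
from math import isqrt

def sqrt2_digits(n: int) -> str:
    """First n decimal digits of sqrt(2) after the decimal point."""
    # floor(sqrt(2 * 10^(2n))) is sqrt(2) scaled up by 10^n, truncated.
    scaled = isqrt(2 * 10 ** (2 * n))
    return str(scaled)[1:]          # drop the leading "1" of 1.41421...

print(sqrt2_digits(10))  # -> 4142135623
```

The same scheme works for any real with a computable approximation procedure; for H or Ω no such procedure exists, which is exactly what makes them uncomputable.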
The present paper proposes a generalisation of the notion of disjunctive sequence, that is, of an infinite sequence of letters having every finite word as a subword. Our aim is to give a reasonable notion of disjunctiveness relative to a given set of sequences F. We show that a definition like “every subword which occurs at infinitely many different positions in sequences in F has to occur infinitely often in the sequence” fulfils properties similar to the original unrelativised notion of disjunctiveness. Finally, we investigate our concept of generalised disjunctiveness in spaces of Cantor expansions of reals.
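Disjunctiveness is easy to witness computationally. The sketch below (our own illustration; the choice of example is ours, not the paper’s) checks that every binary word of length 8 already occurs in a short prefix of the binary Champernowne sequence, obtained by concatenating the binary representations of 1, 2, 3, …, a standard example of a disjunctive sequence:

```python
# Every 8-letter binary word occurs in a short prefix of the binary
# Champernowne sequence 1 10 11 100 101 ...
from itertools import product

prefix = "".join(format(i, "b") for i in range(1, 600))

k = 8
all_words = {"".join(w) for w in product("01", repeat=k)}           # all 256 words
seen = {prefix[i:i + k] for i in range(len(prefix) - k + 1)}         # words in prefix
print(all_words <= seen)  # -> True
```

A word w starting with 1 is itself the representation of some number below 256, and a word starting with 0 is a suffix of the representation of the number “1” + w, below 512; either way it appears inside a single concatenated piece, so the range up to 600 suffices.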
Gödel's Incompleteness Theorems have the same scientific status as Einstein's principle of relativity, Heisenberg's uncertainty principle, and Watson and Crick's double helix model of DNA. Our aim is to discuss some new faces of the incompleteness phenomenon unveiled by an information-theoretic approach to randomness and recent developments in quantum computing.