==Modern approaches==

During the 20th century various technical approaches to defining random sequences were developed, and three distinct paradigms can now be identified. In the mid-1960s, [[A. N. Kolmogorov]] and [[D. W. Loveland]] independently proposed a more permissive selection rule.<ref>A. N. Kolmogorov, "Three approaches to the quantitative definition of information", ''Problems of Information and Transmission'', 1(1):1–7, 1965.</ref><ref>D. W. Loveland, "A new interpretation of von Mises' concept of random sequence", ''Z. Math. Logik Grundlagen Math.'' 12 (1966), 279–294.</ref> In their view, Church's recursive function definition was too restrictive in that it read the elements in order. Instead they proposed a rule based on a partially computable process which, having read any ''N'' elements of the sequence, decides whether to select another element which has not been read yet. This definition is often called ''Kolmogorov–Loveland stochasticity''. However, this method was considered too weak by [[Alexander Shen]], who showed that there is a Kolmogorov–Loveland stochastic sequence which does not conform to the general notion of randomness.

In 1966 [[Per Martin-Löf]] introduced a new notion which is now generally considered the most satisfactory notion of [[algorithmic randomness]]. His original definition involved measure theory, but it was later shown that it can be expressed in terms of [[Kolmogorov complexity]]. Kolmogorov defined a string to be random if it has no description shorter than itself via a [[universal Turing machine]].<ref>Ming Li, P. M. B. Vitányi, ''An Introduction to Kolmogorov Complexity and Its Applications'', 1997, {{ISBN|0387948686}}, pages 149–151.</ref>

Three basic paradigms for dealing with random sequences have now emerged:<ref>R. Downey, "Some Recent Progress in Algorithmic Randomness", in ''Mathematical Foundations of Computer Science 2004'', ed. Jiří Fiala, Václav Koubek, 2004, {{ISBN|3-540-22823-3}}, page 44.</ref>

:* The ''frequency / measure-theoretic'' approach. This approach started with the work of Richard von Mises and Alonzo Church. In the 1960s Per Martin-Löf noticed that the sets coding such frequency-based stochastic properties are a special kind of [[measure zero]] sets, and that a more general and smooth definition can be obtained by considering all effectively measure zero sets.
:* The ''complexity / compressibility'' approach. This paradigm was championed by A. N. Kolmogorov, with contributions from [[Leonid Levin]] and [[Gregory Chaitin]]. For finite sequences, Kolmogorov defines the randomness of a binary string of length ''n'' as its entropy (or [[Kolmogorov complexity]]) normalized by the length ''n''. In other words, if the Kolmogorov complexity of the string is close to ''n'', it is very random; if the complexity is far below ''n'', it is not so random. The dual concept of randomness is compressibility: the more random a sequence is, the less compressible it is, and vice versa (see the illustrative sketch at the end of this section).
:* The ''predictability'' approach. This paradigm is due to [[Claus P. Schnorr]] and uses a slightly different definition of constructive [[Martingale (probability theory)|martingales]] than the martingales used in traditional probability theory (formalized below).<ref>{{cite journal | last1 = Schnorr | first1 = C. P. | year = 1971 | title = A unified approach to the definition of a random sequence | journal = Mathematical Systems Theory | volume = 5 | issue = 3 | pages = 246–258 | doi = 10.1007/bf01694181 | s2cid = 8931514}}</ref> Schnorr showed how the existence of a selective betting strategy implies the existence of a selection rule for a biased sub-sequence. If one only requires a recursive martingale to succeed on a sequence, rather than to succeed on it constructively, one obtains the concept of recursive randomness.{{explain|reason="Recursive randomness" is not defined or mentioned in any of the cited sources.|date=December 2020}} [[Yongge Wang]] showed<ref>Yongge Wang, ''Randomness and Complexity'', PhD thesis, 1996. http://webpages.uncc.edu/yonwang/papers/IPL97.pdf</ref><ref>{{cite journal | last1 = Wang | first1 = Yongge | year = 1999 | title = A separation of two randomness concepts | journal = Information Processing Letters | volume = 69 | issue = 3 | pages = 115–118 | doi = 10.1016/S0020-0190(98)00202-6 | citeseerx = 10.1.1.46.199}}</ref> that the concept of recursive randomness is different from Schnorr's notion of randomness.{{explain|reason=So then what is "Schnorr's randomness concept"?|date=December 2020}}
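The martingales of the predictability approach are betting strategies on the successive bits of a sequence. In the standard formalization used in algorithmic randomness (stated here only for orientation, with illustrative notation), a martingale is a function <math>d</math> from finite binary strings to the non-negative reals satisfying the fairness condition
:<math>d(w) = \frac{d(w0) + d(w1)}{2},</math>
meaning that the capital held after reading <math>w</math> is the average of the capital held after the two possible next bits. The martingale ''succeeds'' on an infinite sequence <math>X = X_1 X_2 X_3 \ldots</math> if
:<math>\limsup_{n\to\infty} d(X_1 X_2 \cdots X_n) = \infty,</math>
that is, if the gambler's capital grows without bound along <math>X</math>. The randomness notions in this paradigm differ in the effectiveness condition imposed on <math>d</math> (constructive, recursive, and so on) and in how fast the capital is required to grow.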
In most cases, theorems relating the three paradigms (often equivalence) have been proven.<ref>Wolfgang Merkle, "Kolmogorov–Loveland Stochasticity", in ''Automata, Languages and Programming: 29th International Colloquium, ICALP 2002'', ed. Peter Widmayer et al., {{ISBN|3-540-43864-5}}, page 391.</ref>
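The compressibility reading of the second paradigm can be illustrated computationally. Kolmogorov complexity itself is uncomputable, so the sketch below is an informal illustration only, not part of the formal theory: it uses Python's standard zlib compressor as a crude, computable upper bound, and compares a highly regular string with bytes drawn from the operating system's entropy source.

<syntaxhighlight lang="python">
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size.

    zlib gives only a computable upper bound on Kolmogorov complexity
    (plus container overhead); the true complexity is uncomputable.
    """
    return len(zlib.compress(data, 9)) / len(data)

# Highly regular string of 100,000 bytes: far from random.
regular = b"01" * 50_000

# 100,000 bytes from the OS entropy source: close to incompressible.
unpredictable = os.urandom(100_000)

print(f"regular:       {compression_ratio(regular):.3f}")        # close to 0
print(f"unpredictable: {compression_ratio(unpredictable):.3f}")  # close to (or above) 1
</syntaxhighlight>

A ratio near 1 mimics the "complexity close to ''n''" criterion for finite strings, while a ratio far below 1 certifies compressibility and hence non-randomness in the Kolmogorov sense; the converse does not hold, since a general-purpose compressor may miss regularities that a shorter description could exploit.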