=== Definition ===
A Markov process is a [[stochastic process]] that satisfies the [[Markov property]] (sometimes characterized as "[[memorylessness]]"). In simpler terms, it is a process for which predictions can be made regarding future outcomes based solely on its present state and—most importantly—such predictions are just as good as the ones that could be made knowing the process's full history.<ref name=":3">{{Cite book|title=Stochastic differential equations : an introduction with applications|author=Øksendal, B. K. (Bernt Karsten) |date=2003|publisher=Springer|isbn=3540047581|edition=6th|location=Berlin|oclc=52203046}}</ref> In other words, [[conditional probability|conditional]] on the present state of the system, its future and past states are [[independence (probability theory)|independent]]. A Markov chain is a type of Markov process that has either a discrete [[state space]] or a discrete index set (often representing time), but the precise definition of a Markov chain varies.<ref name="Asmussen2003page73">{{cite book|url=https://books.google.com/books?id=BeYaTxesKy0C|title=Applied Probability and Queues|date=15 May 2003|publisher=Springer Science & Business Media|isbn=978-0-387-00211-8|page=7|author=Søren Asmussen}}</ref> For example, it is common to define a Markov chain as a Markov process in either [[continuous or discrete variable|discrete or continuous time]] with a countable state space (thus regardless of the nature of time),<ref name="Parzen1999page1882">{{cite book|url=https://books.google.com/books?id=0mB2CQAAQBAJ|title=Stochastic Processes|date=17 June 2015|publisher=Courier Dover Publications|isbn=978-0-486-79688-8|page=188|author=Emanuel Parzen}}</ref><ref name="KarlinTaylor2012page292">{{cite book|url=https://books.google.com/books?id=dSDxjX9nmmMC|title=A First Course in Stochastic Processes|date=2 December 2012|publisher=Academic Press|isbn=978-0-08-057041-9|pages=29 and 30|author1=Samuel Karlin|author2=Howard E. Taylor}}</ref><ref name="Lamperti1977chap62">{{cite book|url=https://books.google.com/books?id=Pd4cvgAACAAJ|title=Stochastic processes: a survey of the mathematical theory|publisher=Springer-Verlag|year=1977|isbn=978-3-540-90275-1|pages=106–121|author=John Lamperti}}</ref><ref name="Ross1996page174and2312">{{cite book|url=https://books.google.com/books?id=ImUPAQAAMAAJ|title=Stochastic processes|publisher=Wiley|year=1996|isbn=978-0-471-12062-9|pages=174 and 231|author=Sheldon M. Ross}}</ref> but it is also common to define a Markov chain as having discrete time in either countable or continuous state space (thus regardless of the state space).<ref name="Asmussen2003page73" />
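
As an illustration, in the discrete-time, countable-state setting (the notation <math>X_1, X_2, \ldots</math> for the successive states is introduced here only for this sketch), the Markov property can be written as

:<math>\Pr(X_{n+1}=x \mid X_1=x_1, X_2=x_2, \ldots, X_n=x_n) = \Pr(X_{n+1}=x \mid X_n=x_n),</math>

provided the conditioning event has positive probability, i.e. <math>\Pr(X_1=x_1, \ldots, X_n=x_n) > 0</math>. In words, the conditional distribution of the next state given the entire history depends only on the current state.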