==Kolmogorov distribution==
[[File:KolmogorovDistrPDF.png|thumb|600px|Illustration of the Kolmogorov distribution's [[probability density function|PDF]]]]
The Kolmogorov distribution is the distribution of the [[random variable]]
<math display="block">K=\sup_{t\in[0,1]}|B(t)|</math>
where ''B''(''t'') is the [[Brownian bridge]]. The [[cumulative distribution function]] of ''K'' is given by<ref>{{Cite journal |vauthors=Marsaglia G, Tsang WW, Wang J |year=2003 |title=Evaluating Kolmogorov's Distribution |journal=Journal of Statistical Software |volume=8 |issue=18 |pages=1–4 |doi=10.18637/jss.v008.i18 |doi-access=free }}</ref>
<math display="block">\begin{align} \operatorname{Pr}(K\leq x) &= 1-2\sum_{k=1}^\infty (-1)^{k-1} e^{-2k^2 x^2} \\ &=\frac{\sqrt{2\pi}}{x}\sum_{k=1}^\infty e^{-(2k-1)^2\pi^2/(8x^2)}, \end{align}</math>
which can also be expressed by the [[Jacobi theta function]] <math>\vartheta_{01}(z=0;\tau=2ix^2/\pi)</math>. Both the form of the Kolmogorov–Smirnov test statistic and its asymptotic distribution under the null hypothesis were published by [[Andrey Kolmogorov]],<ref name=AK>{{Cite journal |author=Kolmogorov A |year=1933 |title=Sulla determinazione empirica di una legge di distribuzione |journal=G. Ist. Ital. Attuari |volume=4 |pages=83–91}}</ref> while a table of the distribution was published by [[Nikolai Smirnov (mathematician)|Nikolai Smirnov]].<ref>{{Cite journal |author=Smirnov N |year=1948 |title=Table for estimating the goodness of fit of empirical distributions |journal=[[Annals of Mathematical Statistics]] |volume=19 |issue=2 |pages=279–281 |doi=10.1214/aoms/1177730256 |doi-access=free }}</ref> Recurrence relations for the distribution of the test statistic in finite samples are available.<ref name=AK/>

Under the null hypothesis that the sample comes from the hypothesized distribution ''F''(''x''),
<math display="block">\sqrt{n}D_n\xrightarrow{n\to\infty}\sup_t |B(F(t))|</math>
[[convergence of random variables|in distribution]], where ''B''(''t'') is the Brownian bridge. If ''F'' is continuous, then under the null hypothesis <math>\sqrt{n}D_n</math> converges to the Kolmogorov distribution, which does not depend on ''F''. This result is also known as the Kolmogorov theorem.

The accuracy of this limit as an approximation to the exact CDF of <math>K</math> when <math>n</math> is finite is not very impressive: even when <math>n=1000</math>, the corresponding maximum error is about <math>0.9~\%</math>; this error increases to <math>2.6~\%</math> when <math>n=100</math> and to a totally unacceptable <math>7~\%</math> when <math>n=10</math>. However, a very simple expedient of replacing <math>x</math> by
<math display="block">x+\frac{1}{6\sqrt{n}}+ \frac{x-1}{4n}</math>
in the argument of the Jacobi theta function reduces these errors to <math>0.003~\%</math>, <math>0.027~\%</math>, and <math>0.27~\%</math> respectively; such accuracy is usually considered more than adequate for all practical applications.<ref>{{Cite journal |vauthors=Vrbik J |year=2018 |title=Small-Sample Corrections to Kolmogorov–Smirnov Test Statistic |journal=Pioneer Journal of Theoretical and Applied Statistics |volume=15 |issue=1–2 |pages=15–23}}</ref>

The ''goodness-of-fit'' test or the Kolmogorov–Smirnov test can be constructed by using the critical values of the Kolmogorov distribution.
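The following is a minimal numerical sketch (in Python; the truncation at 100 series terms and the bisection tolerance are illustrative choices, not taken from the cited references) that evaluates the series form of <math>\operatorname{Pr}(K\leq x)</math>, inverts it to obtain a critical value <math>K_\alpha</math> as used in the rejection rule described next, and applies the small-sample correction above to approximate the finite-<math>n</math> CDF of <math>\sqrt{n}D_n</math>:

<syntaxhighlight lang="python">
import math

def kolmogorov_cdf(x, terms=100):
    """Pr(K <= x) from the alternating series; 100 terms is ample unless x is very close to 0."""
    if x <= 0:
        return 0.0
    return 1.0 - 2.0 * sum((-1) ** (k - 1) * math.exp(-2.0 * k * k * x * x)
                           for k in range(1, terms + 1))

def kolmogorov_critical_value(alpha, lo=1e-8, hi=10.0, iters=100):
    """Solve Pr(K <= K_alpha) = 1 - alpha for K_alpha by bisection (the CDF is monotone)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if kolmogorov_cdf(mid) < 1.0 - alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def corrected_cdf(x, n):
    """Approximate Pr(sqrt(n) * D_n <= x) for finite n by evaluating the limit CDF
    at the shifted argument x + 1/(6 sqrt(n)) + (x - 1)/(4 n)."""
    return kolmogorov_cdf(x + 1.0 / (6.0 * math.sqrt(n)) + (x - 1.0) / (4.0 * n))

print(kolmogorov_critical_value(0.05))   # about 1.358
print(corrected_cdf(1.0, 100))           # finite-n approximation at x = 1, n = 100
</syntaxhighlight>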
This test is asymptotically valid when <math>n \to\infty.</math> It rejects the null hypothesis at level <math>\alpha</math> if
<math display="block">\sqrt{n}D_n>K_\alpha,\,</math>
where ''K''<sub>''α''</sub> is found from
<math display="block">\operatorname{Pr}(K\leq K_\alpha)=1-\alpha.\,</math>
The asymptotic [[statistical power|power]] of this test is 1.

Fast and accurate algorithms to compute the CDF <math>\operatorname{Pr}(D_n \leq x)</math> or its complement for arbitrary <math>n</math> and <math>x</math> are available from:
* Simard and L'Ecuyer<ref name=SL2011>{{Cite journal |vauthors=Simard R, L'Ecuyer P |year=2011 |title=Computing the Two-Sided Kolmogorov–Smirnov Distribution |journal=Journal of Statistical Software |volume=39 |issue=11 |pages=1–18 |doi=10.18637/jss.v039.i11 |doi-access=free }}</ref> and Moscovich and Nadler<ref>{{Cite journal |vauthors=Moscovich A, Nadler B |year=2017 |title=Fast calculation of boundary crossing probabilities for Poisson processes |journal=Statistics and Probability Letters |volume=123 |pages=177–182 |doi=10.1016/j.spl.2016.11.027 |arxiv=1503.04363 |s2cid=12868694 }}</ref> for continuous null distributions, with code in C and Java provided by the former.<ref name=SL2011/>
* Dimitrova, Kaishev and Tan<ref name=DKT2019>{{Cite journal |vauthors=Dimitrova DS, Kaishev VK, Tan S |year=2020 |title=Computing the Kolmogorov–Smirnov Distribution when the Underlying cdf is Purely Discrete, Mixed or Continuous |journal=Journal of Statistical Software |volume=95 |issue=10 |pages=1–42 |doi=10.18637/jss.v095.i10 |doi-access=free }}</ref> for purely discrete, mixed or continuous null distributions, implemented in the KSgeneral package<ref name=KSgeneral>{{Cite web |url=https://CRAN.R-project.org/package=KSgeneral |title=KSgeneral: Computing P-Values of the One-Sample K-S Test and the Two-Sample K-S and Kuiper Tests for (Dis)Continuous Null Distribution |last1=Dimitrova |first1=Dimitrina |last2=Yun |first2=Jia |last3=Kaishev |first3=Vladimir |last4=Tan |first4=Senren |website=CRAN.R-project.org/package=KSgeneral |date=21 May 2024}}</ref> of the [[R (programming language)|R project for statistical computing]], which for a given sample also computes the KS test statistic and its p-value; an alternative C++ implementation is also available.<ref name=DKT2019/>

===Test with estimated parameters===
If either the form or the parameters of ''F''(''x'') are determined from the data ''X''<sub>''i''</sub>, the critical values determined in this way are invalid. In such cases, [[Monte Carlo method|Monte Carlo]] or other methods may be required (a simulation sketch is given below), but tables have been prepared for some cases. Details for the required modifications to the test statistic and for the critical values for the [[normal distribution]] and the [[exponential distribution]] have been published,<ref name="Pearson & Hartley">{{cite book |title=Biometrika Tables for Statisticians |editor=Pearson, E. S. |editor2=Hartley, H. O. |year=1972 |volume=2 |publisher=Cambridge University Press |isbn=978-0-521-06937-3 |pages=117–123, Tables 54, 55}}</ref> and later publications also include the [[Gumbel distribution]].<ref name="Shorak & Wellner">{{cite book |title=Empirical Processes with Applications to Statistics |first1=Galen R. |last1=Shorack |first2=Jon A. |last2=Wellner |year=1986 |isbn=978-0-471-86725-8 |publisher=Wiley |page=239}}</ref> The [[Lilliefors test]] represents a special case of this for the normal distribution.
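As an illustration of the Monte Carlo approach, the following is a minimal sketch (in Python with NumPy and SciPy; the normal null, sample size, number of replications and significance level are illustrative assumptions, not values from the cited tables) that simulates critical values of <math>D_n</math> when the mean and standard deviation are estimated from the same sample, as in the Lilliefors setting:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def dn_estimated_normal(x):
    """KS statistic D_n against a normal CDF whose mean and standard deviation
    are the maximum likelihood estimates computed from the same sample."""
    mu, sigma = x.mean(), x.std(ddof=0)
    return stats.kstest(x, stats.norm(mu, sigma).cdf).statistic

def mc_critical_value(n, alpha=0.05, reps=5000):
    """Simulated (1 - alpha) quantile of D_n with estimated parameters.
    For a location-scale family this does not depend on the true mu and sigma,
    so simulating from the standard normal suffices."""
    sims = [dn_estimated_normal(rng.standard_normal(n)) for _ in range(reps)]
    return np.quantile(sims, 1.0 - alpha)

n = 50
print(mc_critical_value(n))   # markedly smaller than 1.358 / sqrt(n),
                              # the critical value without estimation
</syntaxhighlight>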
A logarithm transformation may help to overcome cases where the data do not seem to fit the assumption that they came from the normal distribution.

When parameters are estimated, the question arises which estimation method should be used. Usually this would be the [[Maximum likelihood estimation|maximum likelihood method]], but, for example, for the normal distribution the MLE of sigma has a large bias. Using a moment fit or KS minimization instead has a large impact on the critical values, and also some impact on test power. If we need to decide, for Student's t data with df = 2, via the KS test whether the data could be normal or not, then an ML estimate based on H<sub>0</sub> (the data are normal, so using the standard deviation for scale) would give a much larger KS distance than a fit with minimum KS. In this case we should reject H<sub>0</sub>, which is often the outcome with MLE, because the sample standard deviation may be very large for t(2) data, but with KS minimization we may still get a KS statistic too low to reject H<sub>0</sub>. In the Student's t case, a modified KS test with the KS estimate instead of the MLE indeed makes the KS test slightly worse; in other cases, however, such a modified KS test leads to slightly better test power.{{Citation needed|date=May 2022}}

===Discrete and mixed null distribution===
Under the assumption that <math>F</math> is non-decreasing and right-continuous, with a countable (possibly infinite) number of jumps, the KS test statistic can be expressed as:
<math display="block">D_n= \sup_x |F_n(x)-F(x)| = \sup_{0 \leq t \leq 1} |F_n(F^{-1}(t)) - F(F^{-1}(t))|.</math>
From the right-continuity of <math>F</math>, it follows that <math>F(F^{-1}(t)) \geq t</math> and <math>F^{-1}(F(x)) \leq x</math>, and hence the distribution of <math>D_{n}</math> depends on the null distribution <math>F</math>, i.e., it is no longer distribution-free as in the continuous case. Therefore, a fast and accurate method has been developed to compute the exact and asymptotic distribution of <math>D_{n}</math> when <math>F</math> is purely discrete or mixed,<ref name=DKT2019/> implemented in C++ and in the KSgeneral package<ref name=KSgeneral/> of the [[R (programming language)|R language]]. The functions <code>disc_ks_test()</code>, <code>mixed_ks_test()</code> and <code>cont_ks_test()</code> also compute the KS test statistic and p-values for purely discrete, mixed or continuous null distributions and arbitrary sample sizes. The KS test and its p-values for discrete null distributions and small sample sizes are also computed as part of the dgof package of the R language.<ref name=arnold-emerson>{{Cite journal |first1=Taylor B. |last1=Arnold |first2=John W. |last2=Emerson |year=2011 |title=Nonparametric Goodness-of-Fit Tests for Discrete Null Distributions |journal=The R Journal |volume=3 |issue=2 |pages=34–39 |url=http://journal.r-project.org/archive/2011-2/RJournal_2011-2_Arnold+Emerson.pdf |doi=10.32614/rj-2011-016 |doi-access=free }}</ref>
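The dependence of the null distribution of <math>D_n</math> on <math>F</math> can be seen by simulation. The following sketch (in Python with NumPy; the two example null distributions, sample size and number of replications are arbitrary illustrative choices, not the exact algorithm of the cited packages) computes <math>D_n</math> for a discrete null by checking each jump point of <math>F</math>, and shows that two different discrete nulls give different critical values at the same <math>n</math> and <math>\alpha</math>:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

def dn_discrete(sample, support, probs):
    """D_n = sup_x |F_n(x) - F(x)| for a discrete F with the given support and
    probabilities; the supremum is attained at, or just before, a jump of F.
    Assumes every sample value lies in the support (true when sampling from F)."""
    support = np.asarray(support, dtype=float)
    cdf = np.cumsum(probs)
    cdf_before = np.concatenate(([0.0], cdf[:-1]))
    sample = np.sort(np.asarray(sample, dtype=float))
    n = len(sample)
    ecdf_at = np.searchsorted(sample, support, side="right") / n
    ecdf_before = np.searchsorted(sample, support, side="left") / n
    return max(np.max(np.abs(ecdf_at - cdf)),
               np.max(np.abs(ecdf_before - cdf_before)))

def simulated_quantile(support, probs, n=20, alpha=0.05, reps=10000):
    """Simulated (1 - alpha) quantile of D_n under the given discrete null."""
    support = np.asarray(support, dtype=float)
    probs = np.asarray(probs, dtype=float)
    sims = [dn_discrete(rng.choice(support, size=n, p=probs), support, probs)
            for _ in range(reps)]
    return np.quantile(sims, 1.0 - alpha)

# The null distribution of D_n changes with F: at the same n and alpha, a
# two-point null and a ten-point null give clearly different critical values.
print(simulated_quantile([0, 1], [0.5, 0.5]))
print(simulated_quantile(np.arange(10), np.full(10, 0.1)))
</syntaxhighlight>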
Major statistical packages, among which [[SAS (software)|SAS]] <code>PROC NPAR1WAY</code><ref>{{cite web |url=https://support.sas.com/documentation/cdl/en/statug/68162/HTML/default/viewer.htm#statug_npar1way_toc.htm |title=SAS/STAT(R) 14.1 User's Guide |website=support.sas.com |access-date=14 April 2018}}</ref> and [[Stata]] <code>ksmirnov</code>,<ref>{{cite web |url=https://www.stata.com/manuals15/rksmirnov.pdf |title=ksmirnov — Kolmogorov–Smirnov equality-of-distributions test |website=stata.com |access-date=14 April 2018}}</ref> implement the KS test under the assumption that <math>F(x)</math> is continuous, which makes the test conservative if the null distribution is actually not continuous (see<ref name=Noether63>{{Cite journal |vauthors=Noether GE |year=1963 |title=Note on the Kolmogorov Statistic in the Discrete Case |journal=Metrika |volume=7 |issue=1 |pages=115–116 |doi=10.1007/bf02613966 |s2cid=120687545}}</ref><ref name=Slakter65>{{Cite journal |vauthors=Slakter MJ |year=1965 |title=A Comparison of the Pearson Chi-Square and Kolmogorov Goodness-of-Fit Tests with Respect to Validity |journal=Journal of the American Statistical Association |volume=60 |issue=311 |pages=854–858 |doi=10.2307/2283251 |jstor=2283251}}</ref><ref name=Walsh63>{{Cite journal |vauthors=Walsh JE |year=1963 |title=Bounded Probability Properties of Kolmogorov–Smirnov and Similar Statistics for Discrete Data |journal=Annals of the Institute of Statistical Mathematics |volume=15 |issue=1 |pages=153–158 |doi=10.1007/bf02865912 |s2cid=122547015 }}</ref>).
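To illustrate the conservativeness, the following small sketch (in Python with NumPy and SciPy; the Bernoulli(1/2) null and the particular observed counts are hypothetical) compares the p-value obtained from the continuous-case asymptotic distribution with the exact p-value for the discrete null:

<syntaxhighlight lang="python">
import math
import numpy as np
from scipy import stats, special

# Discrete null: X ~ Bernoulli(1/2) with support {0, 1}, so F(0) = 1/2 and F(1) = 1.
# For this null the KS statistic reduces to D_n = |#{x_i = 0}/n - 1/2|.
n, zeros = 25, 8                      # hypothetical sample: 8 zeros, 17 ones
d_obs = abs(zeros / n - 0.5)

# p-value computed as if F were continuous: asymptotic tail Pr(K > sqrt(n) * D_n).
p_continuous = special.kolmogorov(math.sqrt(n) * d_obs)

# Exact p-value for the discrete null: under H0 the number of zeros is
# Binomial(n, 1/2), so sum its pmf over all counts z with |z/n - 1/2| >= D_n.
z = np.arange(n + 1)
p_exact = stats.binom.pmf(z, n, 0.5)[np.abs(z / n - 0.5) >= d_obs - 1e-12].sum()

print(p_continuous, p_exact)          # the continuous-case p-value is much larger
</syntaxhighlight>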