=== Science ===

For the scientific investigation of efficient causality, the cause and effect are each best conceived of as temporally transient processes. Within the conceptual frame of the [[scientific method]], an investigator sets up several distinct and contrasting temporally transient material processes that have the structure of [[experiment]]s, and records candidate material responses, normally intending to determine causality in the physical world.<ref>[[Max Born|Born, M.]] (1949). ''Natural Philosophy of Cause and Chance'', Oxford University Press, Oxford UK, p. 18: "... scientific work will always be the search for causal interdependence of phenomena."</ref>

For instance, one may want to know whether a high intake of [[carrot]]s causes humans to develop the [[bubonic plague]]. The quantity of carrot intake is a process that is varied from occasion to occasion. The occurrence or non-occurrence of subsequent bubonic plague is recorded. To establish causality, the experiment must fulfill certain criteria, only one example of which is mentioned here. For example, instances of the hypothesized cause must be set up to occur at a time when the hypothesized effect is relatively unlikely in the absence of the hypothesized cause; such unlikelihood is to be established by empirical evidence. A mere observation of a [[correlation does not imply causation|correlation]] is not nearly adequate to establish causality. In nearly all cases, establishment of causality relies on repetition of experiments and probabilistic reasoning. Hardly ever is causality established more firmly than as more or less probable. It is most convenient for establishment of causality if the contrasting material states of affairs are precisely matched, except for only one variable factor, perhaps measured by a real number.

==== Physics ====
{{Main|Causality (physics)}}

One has to be careful in the use of the word cause in physics.
Properly speaking, the hypothesized cause and the hypothesized effect are each temporally transient processes. For example, force is a useful concept for the explanation of acceleration, but force is not by itself a cause. More is needed. For example, a temporally transient process might be characterized by a definite change of force at a definite time. Such a process can be regarded as a cause. Causality is not inherently implied in [[equations of motion]], but postulated as an additional [[constraint (classical mechanics)|constraint]] that needs to be satisfied (i.e. a cause always precedes its effect). This constraint has mathematical implications<ref name=kinsler2011> {{cite journal | author=Kinsler, P. | s2cid=56034806 | year=2011 | title=How to be causal | journal=Eur. J. Phys. | volume=32 | issue=6 | pages=1687–1700 | doi=10.1088/0143-0807/32/6/022 | arxiv=1106.1792|bibcode = 2011EJPh...32.1687K }}</ref> such as the [[Kramers-Kronig relations]].

Causality is one of the most fundamental and essential notions of physics.<ref>[[Albert Einstein|Einstein, A.]] (1910/2005). 'On Boltzmann's Principle and some immediate consequences thereof', unpublished manuscript of a 1910 lecture by Einstein, translated by B. Duplantier and E. Parks, reprinted on pp. 183–199 in ''Einstein,1905–2005, Poincaré Seminar 2005'', edited by T. Damour, O. Darrigol, B. Duplantier, V. Rivasseau, Birkhäuser Verlag, Basel, {{ISBN|3-7643-7435-7}}, from ''Einstein, Albert: The Collected Papers of Albert Einstein'', 1987–2005, Hebrew University and Princeton University Press; p. 183: "All natural science is based upon the hypothesis of the complete causal connection of all events."</ref> Causal efficacy cannot 'propagate' faster than light. Otherwise, reference coordinate systems could be constructed (using the [[Lorentz transform]] of [[special relativity]]) in which an observer would see an effect precede its cause (i.e. the postulate of causality would be violated).
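The argument above can be made concrete with a small numerical sketch (illustrative only, in units where c = 1): applying the Lorentz time transformation shows that a hypothetical faster-than-light signal would be received, in a suitably boosted frame, before it is emitted, whereas a subluminal signal keeps its temporal ordering in every frame.

```python
import math

def lorentz_t(t, x, v, c=1.0):
    """Time coordinate of event (t, x) in a frame moving at velocity v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (t - v * x / c**2)

v = 0.8  # observer moving at 0.8c relative to the original frame

# Hypothetical superluminal signal: emitted at (t=0, x=0), received at
# (t=5, x=10), i.e. at speed 2c. The boosted observer assigns the
# reception an earlier time than the emission (t' = -5 versus t' = 0),
# so the postulate of causality would be violated.
print(lorentz_t(5.0, 10.0, v) < lorentz_t(0.0, 0.0, v))  # True

# A subluminal signal (speed 0.4c) keeps its ordering in the boosted frame.
print(lorentz_t(5.0, 2.0, v) > lorentz_t(0.0, 0.0, v))   # True
```

The numbers are chosen only to make the sign flip visible; any signal faster than c admits some frame in which its ordering reverses.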
Causal notions appear in the context of the flow of mass-energy. Any actual process has causal efficacy that can propagate no faster than light. In contrast, an abstraction has no causal efficacy. Its mathematical expression does not propagate in the ordinary sense of the word, though it may refer to virtual or nominal 'velocities' with magnitudes greater than that of light. For example, wave packets are mathematical objects that have [[group velocity]] and [[phase velocity]]. The energy of a wave packet travels at the group velocity (under normal circumstances); since energy has causal efficacy, the group velocity cannot be faster than the speed of light. The phase of a wave packet travels at the phase velocity; since phase is not causal, the phase velocity of a wave packet can be faster than light.<ref>{{cite book |last1=Griffiths |first1=David |title=Introduction to electrodynamics |date=2017 |publisher=Cambridge University Press |isbn=978-1-108-42041-9 |page=418 |edition=Fourth}}</ref>

Causal notions are important in general relativity to the extent that the existence of an arrow of time demands that the universe's semi-[[Riemannian manifold]] be orientable, so that "future" and "past" are globally definable quantities.

==== Engineering ====

A [[causal system]] is a [[system]] whose output and internal states depend only on the current and previous input values. A system with ''some'' dependence on input values from the future (in addition to possible past or current input values) is termed an '''acausal''' system, and a system that depends ''solely'' on future input values is an [[anticausal system]]. Acausal filters, for example, can only exist as postprocessing filters, because these filters can extract future values from a memory buffer or a file.

Causality must be treated carefully in physics and engineering. Cellier, Elmqvist, and Otter<ref name=" Cellier"> Cellier, Francois E., Hilding Elmqvist, and Martin Otter.
"Modeling from physical principles." The Control Handbook, 1996 by CRC Press, Inc., ed. William S. Levine (1996): 99-108.</ref> argue that taking causality as the basis of physics is a misconception, because physics is essentially acausal. In their article they cite a simple example: "The relationship between voltage across and current through an electrical resistor can be described by Ohm's law: V = IR, yet, whether it is the current flowing through the resistor that causes a voltage drop, or whether it is the difference between the electrical potentials on the two wires that causes current to flow is, from a physical perspective, a meaningless question". Indeed, if cause and effect are read into the law, two different explanations of the same resistor are needed: as a voltage-drop-causer or as a current-flow-causer. No physical experiment can distinguish between action and reaction.

==== Biology, medicine and epidemiology ====

[[File:comparison confounder mediator.svg|thumb|Whereas a mediator is a factor in the causal chain (top), a confounder is a spurious factor incorrectly suggesting causation (bottom).]]

[[Austin Bradford Hill]] built upon the work of [[David Hume|Hume]] and [[Karl Popper|Popper]] and suggested in his paper "The Environment and Disease: Association or Causation?" that aspects of an association such as strength, consistency, specificity, and temporality be considered in attempting to distinguish causal from noncausal associations in the epidemiological situation. (See [[Bradford Hill criteria]].) He did not note, however, that temporality is the only necessary criterion among those aspects.
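The confounder case in the figure above can be illustrated with a toy simulation (the variable names and numbers are purely illustrative): a variable Z that drives both X and Y makes X and Y correlated even though neither influences the other.

```python
import random

random.seed(0)

# Confounder Z causes both X and Y; X has no effect on Y at all,
# yet X and Y end up correlated through their shared cause.
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 1) for zi in z]
y = [zi + random.gauss(0, 1) for zi in z]

def corr(a, b):
    """Pearson correlation coefficient of two equal-length samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

print(corr(x, y))  # ≈ 0.5: a spurious association induced by Z alone
```

This is exactly the pattern a naive correlation analysis would misread as causation, and the pattern that conditioning on Z (e.g. stratification or regression adjustment) removes.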
Directed acyclic graphs (DAGs) are increasingly used in epidemiology to help enlighten causal thinking.<ref>{{cite journal|last=Chiolero|first=A|author2=Paradis, G |author3=Kaufman, JS |title=Assessing the possible direct effect of birth weight on childhood blood pressure: a sensitivity analysis|journal=American Journal of Epidemiology|date=1 January 2014|volume=179|issue=1|pages=4–11|pmid=24186972|doi=10.1093/aje/kwt228|doi-access=free}}</ref>

==== Psychology ====
{{Main|Causal reasoning}}

Psychologists take an empirical approach to causality, investigating how people and non-human animals detect or infer causation from sensory information, prior experience and [[innatism|innate knowledge]].

'''Attribution:''' [[Attribution theory]] is the [[theory]] concerning how people explain individual occurrences of causation. [[Attribution (psychology)|Attribution]] can be external (assigning causality to an outside agent or force—claiming that some outside thing motivated the event) or internal (assigning causality to factors within the person—taking personal [[Moral responsibility|responsibility]] or [[accountability]] for one's actions and claiming that the person was directly responsible for the event). Taking causation one step further, the type of attribution a person provides influences their future behavior. The intention behind the cause or the effect can be covered by the subject of [[action (philosophy)|action]]. See also [[accident]]; [[blame]]; [[intent (law)|intent]]; and responsibility.

;Causal powers
Whereas [[David Hume#Causation|David Hume]] argued that causes are inferred from non-causal observations, [[Immanuel Kant]] claimed that people have innate assumptions about causes. Within psychology, [[Patricia Cheng]]<ref name="Cheng1997"/> attempted to reconcile the Humean and Kantian views.
According to her power PC theory, people filter observations of events through an intuition that causes have the power to generate (or prevent) their effects, thereby inferring specific cause-effect relations.

;Causation and salience
Our view of causation depends on what we consider to be the relevant events. Another way to view the statement, "Lightning causes thunder" is to see both lightning and thunder as two perceptions of the same event, viz., an electric discharge that we perceive first visually and then aurally.

;Naming and causality
David Sobel and [[Alison Gopnik]] from the Psychology Department of UC Berkeley designed a device known as ''the blicket detector'' which would turn on when an object was placed on it. Their research suggests that "even young children will easily and swiftly learn about a new causal power of an object and spontaneously use that information in classifying and naming the object."<ref>{{cite journal |last1=Gopnik |first1=A |author-link=Alison Gopnik |first2=David M. |last2=Sobel |title=Detecting Blickets: How Young Children Use Information about Novel Causal Powers in Categorization and Induction |journal=Child Development |date=September–October 2000 |volume=71 |issue=5 |pages=1205–1222 |doi=10.1111/1467-8624.00224 |pmid=11108092}}</ref>

;Perception of launching events
Some researchers such as Anjan Chatterjee at the University of Pennsylvania and Jonathan Fugelsang at the University of Waterloo are using neuroscience techniques to investigate the neural and psychological underpinnings of causal launching events in which one object causes another object to move.
Both temporal and spatial factors can be manipulated.<ref>{{cite journal |last1=Straube |doi=10.3389/fnhum.2010.00028 |pmid=20463866 |title=Space and time in perceptual causality |year=2010 |journal=Frontiers in Human Neuroscience |first1=B |last2=Chatterjee |first2=A |volume=4 |page=28 |pmc=2868299|doi-access=free }}</ref> See [[Causal Reasoning (Psychology)]] for more information.

==== Statistics and economics ====
{{See also|Causal graph}}

[[Statistics]] and [[economics]] usually employ pre-existing data or experimental data to infer causality by regression methods. The body of statistical techniques involves substantial use of [[regression analysis]]. Typically a linear relationship such as

:<math>y_i = a_0 + a_1x_{1,i} + a_2x_{2,i} + \dots + a_kx_{k,i} + e_i</math>

is postulated, in which <math>y_i</math> is the ''i''th observation of the dependent variable (hypothesized to be the caused variable), <math>x_{j,i}</math> for ''j''=1,...,''k'' is the ''i''th observation on the ''j''th independent variable (hypothesized to be a causative variable), and <math>e_i</math> is the error term for the ''i''th observation (containing the combined effects of all other causative variables, which must be uncorrelated with the included independent variables). If there is reason to believe that none of the <math>x_j</math>s is caused by ''y'', then estimates of the coefficients <math>a_j</math> are obtained. If the null hypothesis that <math>a_j=0</math> is rejected, then the alternative hypothesis that <math>a_{j} \ne 0</math> and equivalently that <math>x_j</math> causes ''y'' cannot be rejected. On the other hand, if the null hypothesis that <math>a_j=0</math> cannot be rejected, then equivalently the hypothesis of no causal effect of <math>x_j</math> on ''y'' cannot be rejected.
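As a toy illustration of this procedure (all numbers are invented for the example), one can simulate data from a known single-regressor relationship and recover the coefficients by ordinary least squares; a real analysis would then test the null hypothesis a<sub>1</sub> = 0 against the estimate's standard error.

```python
import random

random.seed(1)

# Simulate y_i = a0 + a1*x_i + e_i with known true coefficients.
n = 1_000
a0_true, a1_true = 1.0, 2.0
x = [random.gauss(0, 1) for _ in range(n)]
y = [a0_true + a1_true * xi + random.gauss(0, 1) for xi in x]

# OLS estimates for the single-regressor case.
mx, my = sum(x) / n, sum(y) / n
a1_hat = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
a0_hat = my - a1_hat * mx

# With n this large, a1_hat lands close to the true value 2.0, so a test
# of H0: a1 = 0 would be rejected, and (per the logic in the text) the
# hypothesis that x causes y cannot be rejected.
print(a0_hat, a1_hat)
```

The error term here is uncorrelated with x by construction; when that assumption fails (omitted confounders, reverse causation), the same arithmetic yields a biased estimate, which is the problem the surrounding text addresses.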
Here the notion of causality is one of contributory causality as discussed [[#Necessary and sufficient causes|above]]: If the true value <math>a_j \ne 0</math>, then a change in <math>x_j</math> will result in a change in ''y'' ''unless'' some other causative variable(s), either included in the regression or implicit in the error term, change in such a way as to exactly offset its effect; thus a change in <math>x_j</math> is ''not sufficient'' to change ''y''. Likewise, a change in <math>x_j</math> is ''not necessary'' to change ''y'', because a change in ''y'' could be caused by something implicit in the error term (or by some other causative explanatory variable included in the model).

The above way of testing for causality requires belief that there is no reverse causation, in which ''y'' would cause <math>x_j</math>. This belief can be established in one of several ways. First, the variable <math>x_j</math> may be a non-economic variable: for example, if rainfall amount <math>x_j</math> is hypothesized to affect the futures price ''y'' of some agricultural commodity, it is impossible that in fact the futures price affects rainfall amount (provided that [[cloud seeding]] is never attempted). Second, the [[instrumental variables]] technique may be employed to remove any reverse causation by introducing a role for other variables (instruments) that are known to be unaffected by the dependent variable. Third, the principle that effects cannot precede causes can be invoked, by including on the right side of the regression only variables that precede in time the dependent variable; this principle is invoked, for example, in testing for [[Granger causality]] and in its multivariate analog, [[vector autoregression]], both of which control for lagged values of the dependent variable while testing for causal effects of lagged independent variables.

Regression analysis controls for other relevant variables by including them as regressors (explanatory variables).
This helps to avoid false inferences of causality due to the presence of a third, underlying, variable that influences both the potentially causative variable and the potentially caused variable: its effect on the potentially caused variable is captured by directly including it in the regression, so that effect will not be picked up as an indirect effect through the potentially causative variable of interest. Given the above procedures, coincidental (as opposed to causal) correlation can be probabilistically rejected if data samples are large and if regression results pass [[Cross-validation (statistics)|cross-validation]] tests showing that the correlations hold even for data that were not used in the regression. Asserting with certitude that a common cause is absent and that the regression represents the true causal structure is ''in principle'' impossible.<ref>{{Cite journal|last=Henschen|first=Tobias|date=2018|title=The in-principle inconclusiveness of causal evidence in macroeconomics|journal=European Journal for the Philosophy of Science|volume=8|issue=3|pages=709–733|doi=10.1007/s13194-018-0207-7|s2cid=158264284}}</ref> The problem of omitted variable bias, however, has to be balanced against the risk of introducing [[Collider (statistics)|causal colliders]], in which the addition of a new variable <math>x_{j+1}</math> induces a correlation between <math>x_j</math> and <math>y</math> via [[Berkson's paradox]].<ref name="Pearl" />

Apart from constructing statistical models of observational and experimental data, economists use axiomatic (mathematical) models to infer and represent causal mechanisms. Highly abstract theoretical models that isolate and idealize one mechanism dominate microeconomics. In macroeconomics, economists use broad mathematical models that are calibrated on historical data.
A subgroup of calibrated models, [[dynamic stochastic general equilibrium]] (DSGE) models, are employed to represent (in a simplified way) the whole economy and simulate changes in fiscal and monetary policy.<ref>{{Cite journal|last1=Maziarz|first1=Mariusz|last2=Mróz|first2=Robert|date=2020|title=A rejoinder to Henschen: the issue of VAR and DSGE models|journal=Journal of Economic Methodology|volume=27|issue=3|pages=266–268|doi=10.1080/1350178X.2020.1731102|s2cid=212838652}}</ref>
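The Granger-causality idea mentioned earlier (causes precede, and help predict, their effects) can be sketched in stripped-down form. A real Granger test would also include lagged values of the dependent variable and a formal F-test; both are omitted here for brevity, and all data are simulated for illustration.

```python
import random

random.seed(2)

# Simulated series in which y depends on x lagged by one step,
# so past x should improve the prediction of y.
n = 2_000
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.0] + [0.8 * x[t - 1] + random.gauss(0, 0.5) for t in range(1, n)]

# Restricted model: predict y_t by its unconditional mean.
my = sum(y) / n
sse_restricted = sum((yt - my) ** 2 for yt in y[1:])

# Unrestricted model: regress y_t on x_{t-1} (OLS slope and intercept).
xs, ys = x[:-1], y[1:]
mxs, mys = sum(xs) / len(xs), sum(ys) / len(ys)
b = (sum((xi - mxs) * (yi - mys) for xi, yi in zip(xs, ys))
     / sum((xi - mxs) ** 2 for xi in xs))
sse_unrestricted = sum((yi - (mys + b * (xi - mxs))) ** 2
                       for xi, yi in zip(xs, ys))

# Lagged x sharply reduces the prediction error, as expected when x
# precedes and drives y.
print(sse_unrestricted < 0.5 * sse_restricted)  # True
```

Reversing the roles (trying to predict x from lagged y) would show no comparable error reduction, which is the asymmetry the Granger framework exploits.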