== Computing ==

=== Coordinate notation ===
[[File:3D Vector.svg|thumb|right|[[Standard basis]] vectors '''i''', '''j''', '''k''' and [[vector component]]s of '''a''', denoted here '''a'''<sub>x</sub>, '''a'''<sub>y</sub>, '''a'''<sub>z</sub>]]
If <math>(\mathbf{\color{blue}{i}}, \mathbf{\color{red}{j}}, \mathbf{\color{green}{k}})</math> is a positively oriented orthonormal basis, the basis vectors satisfy the following equalities:<ref name=":1" />
:<math>\begin{alignat}{2}
\mathbf{\color{blue}{i}}&\times\mathbf{\color{red}{j}} &&= \mathbf{\color{green}{k}}\\
\mathbf{\color{red}{j}}&\times\mathbf{\color{green}{k}} &&= \mathbf{\color{blue}{i}}\\
\mathbf{\color{green}{k}}&\times\mathbf{\color{blue}{i}} &&= \mathbf{\color{red}{j}}
\end{alignat}</math>
A [[mnemonic]] for these formulas is that each can be deduced from any other by a [[cyclic permutation]] of the basis vectors. This mnemonic also applies to many other formulas given in this article. The [[anticommutativity]] of the cross product implies that
:<math>\begin{alignat}{2}
\mathbf{\color{red}{j}}&\times\mathbf{\color{blue}{i}} &&= -\mathbf{\color{green}{k}}\\
\mathbf{\color{green}{k}}&\times\mathbf{\color{red}{j}} &&= -\mathbf{\color{blue}{i}}\\
\mathbf{\color{blue}{i}}&\times\mathbf{\color{green}{k}} &&= -\mathbf{\color{red}{j}}
\end{alignat}</math>
The anticommutativity of the cross product (and the obvious lack of linear independence) also implies that
:<math>\mathbf{\color{blue}{i}}\times\mathbf{\color{blue}{i}} = \mathbf{\color{red}{j}}\times\mathbf{\color{red}{j}} = \mathbf{\color{green}{k}}\times\mathbf{\color{green}{k}} = \mathbf{0}</math>
(the [[zero vector]]).

These equalities, together with the [[distributivity]] and [[linearity]] of the cross product (though neither follows easily from the definition given above), are sufficient to determine the cross product of any two vectors '''a''' and '''b'''. Each vector can be written as the sum of three orthogonal components parallel to the standard basis vectors:
:<math>\begin{alignat}{3}
\mathbf{a} &= a_1\mathbf{\color{blue}{i}} &&+ a_2\mathbf{\color{red}{j}} &&+ a_3\mathbf{\color{green}{k}} \\
\mathbf{b} &= b_1\mathbf{\color{blue}{i}} &&+ b_2\mathbf{\color{red}{j}} &&+ b_3\mathbf{\color{green}{k}}
\end{alignat}</math>
Their cross product {{nowrap|1='''a''' × '''b'''}} can be expanded using distributivity:
:<math>\begin{align}
\mathbf{a}\times\mathbf{b} = {} &(a_1\mathbf{\color{blue}{i}} + a_2\mathbf{\color{red}{j}} + a_3\mathbf{\color{green}{k}}) \times (b_1\mathbf{\color{blue}{i}} + b_2\mathbf{\color{red}{j}} + b_3\mathbf{\color{green}{k}})\\
= {} &a_1b_1(\mathbf{\color{blue}{i}} \times \mathbf{\color{blue}{i}}) + a_1b_2(\mathbf{\color{blue}{i}} \times \mathbf{\color{red}{j}}) + a_1b_3(\mathbf{\color{blue}{i}} \times \mathbf{\color{green}{k}}) + {}\\
&a_2b_1(\mathbf{\color{red}{j}} \times \mathbf{\color{blue}{i}}) + a_2b_2(\mathbf{\color{red}{j}} \times \mathbf{\color{red}{j}}) + a_2b_3(\mathbf{\color{red}{j}} \times \mathbf{\color{green}{k}}) + {}\\
&a_3b_1(\mathbf{\color{green}{k}} \times \mathbf{\color{blue}{i}}) + a_3b_2(\mathbf{\color{green}{k}} \times \mathbf{\color{red}{j}}) + a_3b_3(\mathbf{\color{green}{k}} \times \mathbf{\color{green}{k}})
\end{align}</math>
This can be interpreted as the decomposition of {{nowrap|1='''a''' × '''b'''}} into the sum of nine simpler cross products involving vectors aligned with '''i''', '''j''', or '''k'''. Each of these nine cross products operates on two vectors that are easy to handle, as they are either parallel or orthogonal to each other. From this decomposition, by using the above-mentioned [[#Coordinate notation|equalities]] and collecting similar terms, we obtain
:<math>\begin{align}
\mathbf{a}\times\mathbf{b} = {} &\quad\ a_1b_1\mathbf{0} + a_1b_2\mathbf{\color{green}{k}} - a_1b_3\mathbf{\color{red}{j}} \\
&- a_2b_1\mathbf{\color{green}{k}} + a_2b_2\mathbf{0} + a_2b_3\mathbf{\color{blue}{i}} \\
&+ a_3b_1\mathbf{\color{red}{j}}\ - a_3b_2\mathbf{\color{blue}{i}}\ + a_3b_3\mathbf{0} \\
= {} &(a_2b_3 - a_3b_2)\mathbf{\color{blue}{i}} + (a_3b_1 - a_1b_3)\mathbf{\color{red}{j}} + (a_1b_2 - a_2b_1)\mathbf{\color{green}{k}}
\end{align}</math>
meaning that the three [[scalar component]]s of the resulting vector '''s''' = ''s''<sub>1</sub>'''i''' + ''s''<sub>2</sub>'''j''' + ''s''<sub>3</sub>'''k''' = {{nowrap|1='''a''' × '''b'''}} are
:<math>\begin{align}
s_1 &= a_2b_3-a_3b_2\\
s_2 &= a_3b_1-a_1b_3\\
s_3 &= a_1b_2-a_2b_1
\end{align}</math>
Using [[column vector]]s, we can represent the same result as follows:
:<math>\begin{bmatrix}s_1\\s_2\\s_3\end{bmatrix}=\begin{bmatrix}a_2b_3-a_3b_2\\a_3b_1-a_1b_3\\a_1b_2-a_2b_1\end{bmatrix}</math>
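These component formulas translate directly into code. The following is a minimal Python sketch (the function name <code>cross</code> and the tuple representation are illustrative choices, not part of any standard interface):

<syntaxhighlight lang="python">
def cross(a, b):
    """Return the components (s1, s2, s3) of a x b for
    3-element sequences a = (a1, a2, a3) and b = (b1, b2, b3)."""
    a1, a2, a3 = a
    b1, b2, b3 = b
    return (a2 * b3 - a3 * b2,   # s1
            a3 * b1 - a1 * b3,   # s2
            a1 * b2 - a2 * b1)   # s3

# The basis-vector identities above serve as quick checks:
print(cross((1, 0, 0), (0, 1, 0)))  # i x j = k:  (0, 0, 1)
print(cross((0, 1, 0), (1, 0, 0)))  # j x i = -k: (0, 0, -1)
</syntaxhighlight>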
=== Matrix notation ===
[[File:Sarrus_rule_cross_product_ab.svg|thumb|Use of Sarrus's rule to find the cross product of '''a''' and '''b''']]
The cross product can also be expressed as the [[formal calculation|formal]] determinant:<ref group="note">Here, "formal" means that this notation has the form of a determinant, but does not strictly adhere to the definition; it is a mnemonic used to remember the expansion of the cross product.</ref><ref name=":1" />
:<math>\mathbf{a\times b} = \begin{vmatrix}
\mathbf{i}&\mathbf{j}&\mathbf{k}\\
a_1&a_2&a_3\\
b_1&b_2&b_3\\
\end{vmatrix}</math>
This determinant can be computed using [[Rule of Sarrus|Sarrus's rule]] or [[cofactor expansion]]. Using Sarrus's rule, it expands to
:<math>\begin{align}
\mathbf{a\times b} &=(a_2b_3\mathbf{i}+a_3b_1\mathbf{j}+a_1b_2\mathbf{k}) - (a_3b_2\mathbf{i}+a_1b_3\mathbf{j}+a_2b_1\mathbf{k})\\
&=(a_2b_3 - a_3b_2)\mathbf{i} -(a_1b_3 - a_3b_1)\mathbf{j} +(a_1b_2 - a_2b_1)\mathbf{k},
\end{align}</math>
which gives the components of the resulting vector directly.

=== Using Levi-Civita tensors ===
* In any basis, the cross product <math>a \times b</math> is given by the tensorial formula <math>E_{ijk}a^ib^j</math>, where <math>E_{ijk}</math> is the covariant [[Levi-Civita symbol#Levi-Civita tensors|Levi-Civita]] tensor (note the position of the indices). This corresponds to the intrinsic formula given [[#As an external product|here]].
* In an orthonormal basis '''having the same orientation as the space''', <math>a \times b</math> is given by the pseudo-tensorial formula <math>\varepsilon_{ijk}a^ib^j</math>, where <math>\varepsilon_{ijk}</math> is the Levi-Civita symbol (which is a pseudo-tensor). This is the formula used in everyday physics, but it works only for this special choice of basis; a sketch of this computation follows the list.
* In any orthonormal basis, <math>a \times b</math> is given by the pseudo-tensorial formula <math>(-1)^B\varepsilon_{ijk}a^ib^j</math>, where <math>(-1)^B = \pm 1</math> indicates whether the basis has the same orientation as the space or not. The latter formula avoids having to change the orientation of the space when one inverts an orthonormal basis.
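The second formula above can be checked numerically. The following minimal Python sketch (assuming NumPy; the name <code>cross_levi_civita</code> is an illustrative choice) builds the Levi-Civita symbol explicitly and contracts it against the components of '''a''' and '''b''':

<syntaxhighlight lang="python">
import numpy as np

# Levi-Civita symbol: +1 on even permutations of (0, 1, 2),
# -1 on odd permutations, 0 whenever an index repeats.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1

def cross_levi_civita(a, b):
    """(a x b)_k = epsilon_ijk a^i b^j, valid in a positively
    oriented orthonormal basis."""
    return np.einsum('ijk,i,j->k', eps, a, b)

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
print(cross_levi_civita(a, b))  # [-3.  6. -3.]
print(np.cross(a, b))           # same result
</syntaxhighlight>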