Basis Vector
A Basis Vector is a member of a vector set that forms a vector basis of an n-dimensional vector space.
- …
- Example(s):
- $(x,y)=x\,\vec{e}_1+y\,\vec{e}_2$ with $\vec{e}_1=(1,0)$ and $\vec{e}_2=(0,1)$; a numeric sketch of coordinate recovery follows this list.
- …
- Counter-Example(s):
- See: Random Vector, Multi-Dimensional Tensor, Orthogonal Transform, Vector Length Function.
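As a concrete illustration (an added sketch, not from the cited sources; the second basis $(1,1),(1,-1)$ is a hypothetical choice), the coordinates of a vector with respect to a basis can be recovered in NumPy by solving the linear system whose coefficient matrix has the basis vectors as columns:

```python
import numpy as np

# Illustrative sketch: the coordinates of v with respect to a basis are the
# unique solution of B @ c = v, where the basis vectors form the columns of B.
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
v = np.array([3.0, 5.0])

# In the standard basis the coordinates are just the components of v.
assert np.allclose(v, 3.0 * e1 + 5.0 * e2)

# For a non-standard basis (hypothetical choice), solve for the coordinates c.
B = np.column_stack([[1.0, 1.0], [1.0, -1.0]])  # basis vectors (1,1) and (1,-1)
c = np.linalg.solve(B, v)                       # unique because B is invertible
assert np.allclose(v, c[0] * B[:, 0] + c[1] * B[:, 1])
```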
References
2020a
- (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/Basis_(linear_algebra) Retrieved:2020-8-15.
- QUOTE: In mathematics, a set $B$ of elements (vectors) in a vector space $V$ is called a basis if every element of $V$ may be written in a unique way as a (finite) linear combination of elements of $B$. The coefficients of this linear combination are referred to as components or coordinates on $B$ of the vector. The elements of a basis are called basis vectors.
Equivalently, $B$ is a basis if its elements are linearly independent and every element of $V$ is a linear combination of elements of $B$. In more general terms, a basis is a linearly independent spanning set.
A vector space can have several bases; however, all the bases have the same number of elements, called the dimension of the vector space.
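The equivalence just quoted (linear independence plus spanning) suggests a direct computational test, sketched below with NumPy as an added illustration (the helper name is_basis is hypothetical, not from the quoted article): $n$ vectors in an $n$-dimensional space form a basis exactly when the matrix having them as columns has rank $n$.

```python
import numpy as np

def is_basis(vectors: np.ndarray) -> bool:
    """Return True if the columns of `vectors` form a basis of R^n.

    For n vectors in an n-dimensional space, full rank means the columns
    are linearly independent, and hence also a spanning set.
    """
    n = vectors.shape[0]
    return vectors.shape == (n, n) and np.linalg.matrix_rank(vectors) == n

print(is_basis(np.column_stack([[1.0, 0.0], [1.0, 1.0]])))  # True
print(is_basis(np.ones((2, 2))))  # False: the columns are parallel
```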
2020b
- (Weisstein, 2020) ⇒ Weisstein, Eric W. "Basis Vector." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/BasisVector.html
- QUOTE: A basis vector in an n-dimensional vector space is one of any chosen set of $n$ vectors in the space forming a vector basis, i.e., having the property that every vector in the space can be written uniquely as a linear combination of them.
For example, in the Euclidean plane, the unit vectors $\vec{e}_1=(1,0)$ and $\vec{e}_2=(0,1)$ form a vector basis since for any point $(x,y)$,
$(x,y)=x\,\vec{e}_1+y\,\vec{e}_2$, so for this basis, $\vec{e}_1$ and $\vec{e}_2$ are basis vectors.
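A one-line check of the uniqueness claim in this example (an added remark, not part of the quoted source): if $(x,y)=a\,\vec{e}_1+b\,\vec{e}_2$, then $(x,y)=(a,b)$ componentwise, so necessarily $a=x$ and $b=y$.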
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Covariant_derivative#Coordinate_description
- QUOTE: The covariant derivative of a basis vector along a basis vector is again a vector and so can be expressed as a linear combination $\Gamma^k \mathbf{e}_k$. To specify the covariant derivative it is enough to specify the covariant derivative of each basis vector field $\mathbf{e}_j$ along $\mathbf{e}_i$.
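In coordinates this reads $\nabla_{\mathbf{e}_i}\mathbf{e}_j=\Gamma^k_{ij}\,\mathbf{e}_k$, where the coefficients $\Gamma^k_{ij}$ are the Christoffel symbols. As an added illustration (not from the quoted article), the SymPy sketch below computes them for the Levi-Civita connection of the polar-coordinate metric on the plane, using the standard formula $\Gamma^k_{ij}=\frac{1}{2}g^{kl}\left(\partial_i g_{lj}+\partial_j g_{li}-\partial_l g_{ij}\right)$:

```python
import sympy as sp

# Illustrative sketch: Christoffel symbols Gamma^k_{ij} for the
# polar-coordinate metric on the plane, g = diag(1, r^2), from
# Gamma^k_{ij} = (1/2) g^{kl} (d_i g_{lj} + d_j g_{li} - d_l g_{ij}).
r, theta = sp.symbols('r theta', positive=True)
coords = [r, theta]
g = sp.Matrix([[1, 0], [0, r**2]])  # metric components g_{ij}
g_inv = g.inv()                     # inverse metric g^{kl}
n = len(coords)

Gamma = [[[sp.simplify(sp.Rational(1, 2) * sum(
              g_inv[k, l] * (sp.diff(g[l, j], coords[i])
                             + sp.diff(g[l, i], coords[j])
                             - sp.diff(g[i, j], coords[l]))
              for l in range(n)))
           for j in range(n)]
          for i in range(n)]
         for k in range(n)]

# Expected non-zero symbols: Gamma^r_{theta,theta} = -r and
# Gamma^theta_{r,theta} = Gamma^theta_{theta,r} = 1/r.
for k in range(n):
    for i in range(n):
        for j in range(n):
            if Gamma[k][i][j] != 0:
                print(f"Gamma^{coords[k]}_({coords[i]},{coords[j]}) =",
                      Gamma[k][i][j])
```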
2001
- (Luo & Hancock, 2001) ⇒ Bin Luo, and Edwin R. Hancock. (2001). “Structural Graph Matching Using the EM Algorithm and Singular Value Decomposition.” In: IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(10). doi:10.1109/34.954602
- QUOTE: Here, the proximity matrix is constructed by computing the Gaussian weighted distance between points. The eigenvectors of the proximity matrices can be viewed as the basis vectors of an orthogonal transformation on the original point identities. In other words, the components of the eigenvectors represent mixing angles for the transformed points.
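The following NumPy sketch reconstructs the construction described in the quote under assumed parameters (the point set and the Gaussian width $\sigma$ are hypothetical; this is not the authors' code): because the Gaussian-weighted proximity matrix is symmetric, its eigenvectors are orthonormal and can therefore serve as the basis vectors of an orthogonal transformation.

```python
import numpy as np

# Illustrative sketch: Gaussian-weighted proximity matrix of a point set
# and its eigenvectors, which form an orthonormal basis.
rng = np.random.default_rng(0)
points = rng.standard_normal((5, 2))  # five points in the plane (assumed data)
sigma = 1.0                           # Gaussian width (assumed parameter)

# H[i, j] = exp(-||p_i - p_j||^2 / (2 * sigma^2))
sq_dists = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
H = np.exp(-sq_dists / (2.0 * sigma**2))

eigvals, eigvecs = np.linalg.eigh(H)  # symmetric eigensolver

# The columns of eigvecs are orthonormal basis vectors: V^T V = I,
# so V defines an orthogonal transformation of the point identities.
assert np.allclose(eigvecs.T @ eigvecs, np.eye(len(points)))
```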