matrices tensor-product

Vector space

A vector space consists of a set of vectors and a set of scalars, closed under vector addition and scalar multiplication

  • linear combination = $a_{1}\vec{v}_{1} + a_{2}\vec{v}_{2} + \dots + a_{n}\vec{v}_{n}$ for scalars $a_{i}$
  • A vector is said to be linearly independent of a set of vectors if it cannot be written as a linear combination of that set
    • the unit vector $\hat{x}$ is linearly independent of the unit vectors $\hat{y}$ and $\hat{z}$, but any vector in the xy plane would be linearly dependent on $\hat{x}$ and $\hat{y}$
  • a collection of vectors spans the space when every vector can be written as a linear combination of that collection
  • basis = a collection of linearly independent vectors that span the space
  • dimension of the vector space = # of vectors in a basis
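The definitions above can be sketched in plain Python (an illustrative sketch; the helper name `linear_combination` and the use of lists as vectors in $\mathbb{R}^3$ are assumptions, not from the notes):

```python
# Sketch: the standard unit vectors form a basis of R^3, so the
# dimension is 3 (the number of basis vectors).
x_hat = [1, 0, 0]
y_hat = [0, 1, 0]
z_hat = [0, 0, 1]

def linear_combination(coeffs, vectors):
    """Componentwise sum of coeff * vector over the collection."""
    return [sum(c * v[i] for c, v in zip(coeffs, vectors))
            for i in range(len(vectors[0]))]

# Any vector in the xy plane is linearly dependent on x_hat and y_hat:
v = linear_combination([3, 4], [x_hat, y_hat])
print(v)  # [3, 4, 0]
```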

inner product

The generalization of the dot product to an n-dimensional vector space is called the inner product: $\vec{a} \cdot \vec{b} = \sum_{i} a_{i} b_{i}$

norm = $|\vec{v}| = \sqrt{ \vec{v} \cdot \vec{v} }$

  • normalized = a vector with norm 1: $\hat{v} = \vec{v} / |\vec{v}|$
  • orthogonal = $\vec{a} \cdot \vec{b} = 0$
  • orthonormal set = a collection of orthogonal normalized vectors where $\hat{e}_{i} \cdot \hat{e}_{j} = \delta_{ij}$
    • $\hat{e}_{i} \cdot \hat{e}_{j} = 1$ when $i = j$
    • $\hat{e}_{i} \cdot \hat{e}_{j} = 0$ when $i \neq j$
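These definitions translate directly to code (a minimal sketch, assuming real vectors as Python lists; `dot`, `norm`, and `normalize` are illustrative helper names):

```python
import math

def dot(a, b):
    """Inner product: sum of componentwise products."""
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    """Norm = sqrt(v . v)."""
    return math.sqrt(dot(v, v))

def normalize(v):
    """Rescale v to norm 1."""
    n = norm(v)
    return [x / n for x in v]

a = [3, 4]
b = [-4, 3]
print(norm(a))       # 5.0
print(normalize(a))  # [0.6, 0.8]
print(dot(a, b))     # 0, so a and b are orthogonal
```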

The Schwarz inequality $|\vec{a} \cdot \vec{b}| \leq |\vec{a}|\,|\vec{b}|$ defines the angle between two vectors: $\cos\theta = \frac{\vec{a} \cdot \vec{b}}{|\vec{a}|\,|\vec{b}|}$
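A quick sketch of computing the angle between two vectors (the helper names `dot`, `norm`, and `angle` are illustrative; the Schwarz inequality is what guarantees the cosine ratio stays in $[-1, 1]$, so `acos` is always defined):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def angle(a, b):
    """Angle between a and b via cos(theta) = (a.b) / (|a||b|)."""
    return math.acos(dot(a, b) / (norm(a) * norm(b)))

# The angle between (1, 0) and (1, 1) is pi/4, i.e. about 45 degrees.
print(math.degrees(angle([1, 0], [1, 1])))
```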

Gram-Schmidt Procedure

Given a set of linearly independent vectors $\{\vec{v}_{1}, \vec{v}_{2}, \dots\}$, the Gram-Schmidt procedure produces an orthogonal basis $\{\vec{e}_{1}, \vec{e}_{2}, \dots\}$, followed by an orthonormal basis $\{\hat{e}_{1}, \hat{e}_{2}, \dots\}$

$$\begin{align}
\hat{e}_{1} &= \frac{\vec{v}_{1}}{\sqrt{ \vec{v}_{1} \cdot \vec{v}_{1} }} \\
\text{For } \vec{e}_{2}: \quad \vec{e}_{2} &= \vec{v}_{2} - \hat{e}_{1} (\hat{e}_{1} \cdot \vec{v}_{2}), & \hat{e}_{2} &= \frac{\vec{e}_{2}}{\sqrt{ \vec{e}_{2} \cdot \vec{e}_{2} }} \\
\text{For } \vec{e}_{3}: \quad \vec{e}_{3} &= \vec{v}_{3} - \hat{e}_{1} (\hat{e}_{1} \cdot \vec{v}_{3}) - \hat{e}_{2} (\hat{e}_{2} \cdot \vec{v}_{3}), & \hat{e}_{3} &= \frac{\vec{e}_{3}}{\sqrt{ \vec{e}_{3} \cdot \vec{e}_{3} }}
\end{align}$$

So, the general formula:

$$\vec{e}_{n} = \vec{v}_{n} - \sum_{i<n} \hat{e}_{i} (\hat{e}_{i} \cdot \vec{v}_{n}), \qquad \hat{e}_{n} = \frac{\vec{e}_{n}}{\sqrt{ \vec{e}_{n} \cdot \vec{e}_{n} }}$$
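The general formula can be sketched in plain Python (an illustrative implementation assuming real vectors as lists and the helper names `dot` and `gram_schmidt`; this normalizes each $\vec{e}_{n}$ as it goes rather than in a second pass):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(vectors):
    """Orthonormal basis from linearly independent vectors:
    e_n = v_n - sum_{i<n} e_i (e_i . v_n), then divide by sqrt(e_n . e_n)."""
    basis = []
    for v in vectors:
        e = list(v)
        for u in basis:  # subtract the projection onto each earlier basis vector
            proj = dot(u, v)
            e = [ei - proj * ui for ei, ui in zip(e, u)]
        n = math.sqrt(dot(e, e))
        basis.append([x / n for x in e])
    return basis

e1, e2 = gram_schmidt([[1, 1], [1, 0]])
print(e1)           # ~ [0.707, 0.707]
print(dot(e1, e2))  # ~ 0: the outputs are orthogonal
```

Normalizing as you go (modified Gram-Schmidt style) is also numerically better behaved than subtracting all projections from the raw vectors first.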