
Vector

Table of contents
  1. Vector Definition
  2. Vector Properties
    1. Orthogonal
    2. Normalized
    3. Linear Independence
    4. Norms
  3. Vector Arithmetic
  4. Vector - Vector Products
  5. References

Objectives

A checklist of the topics covered in this note:

1. Concepts

  • Vector definition
  • Zero vector
  • Basis vector
  • Unit vector

2. Vector properties

  • Norms
  • Length
  • Magnitude

3. Vector operations

  • Transpose
  • Vector arithmetic
  • Vec2Vec products: dot, cross, and outer products
  • Vec2Vec intuitions

Vector Definition

In Euclidean space, let $A$ and $B$ be two points. The directed line segment from $A$ to $B$ is written $\overrightarrow{AB}$.

A vector is an equivalence class of all directed line segments with the same magnitude and direction as the directed line segment described above. (Immersive Linear Algebra, 2021)

In other words, a vector is defined by:

  • Magnitude (i.e. the line segment's length, denoted $\left\lVert\overrightarrow{AB}\right\rVert$)
  • Direction (i.e. the direction from $A$ to $B$)

Special kinds of vectors:

  • Zero vector (null vector): denoted $\boldsymbol{0}$, with $\lVert\boldsymbol{0}\rVert = 0$
  • Basis vector
  • Unit vector

Vector Properties

Orthogonal

(Pronunciation: /ɔrˈθɒgənl/)

Two vectors $\boldsymbol{x}, \boldsymbol{y} \in \mathbb{R}^n$ are orthogonal if $\boldsymbol{x}^T\boldsymbol{y} = 0$.
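
A minimal numerical check with NumPy (the example vectors are illustrative):

```python
import numpy as np

x = np.array([1.0, 2.0, -1.0])
y = np.array([2.0, 0.0, 2.0])

# x and y are orthogonal iff x^T y = 0
print(np.dot(x, y))  # 0.0 -> orthogonal
```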

Normalized

A vector $\boldsymbol{x} \in \mathbb{R}^n$ is normalized if $\lVert\boldsymbol{x}\rVert_2 = 1$.
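
Any nonzero vector can be normalized by dividing it by its 2-norm; a small sketch with illustrative values:

```python
import numpy as np

x = np.array([3.0, 4.0])
x_hat = x / np.linalg.norm(x)  # divide by the 2-norm, here 5.0

print(x_hat)                  # [0.6 0.8]
print(np.linalg.norm(x_hat))  # 1.0 -> x_hat is normalized
```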

Linear Independence

A set of vectors $\{\boldsymbol{x}_1, \dots, \boldsymbol{x}_n\} \subset \mathbb{R}^m$ is linearly independent if no vector in the set can be represented as a linear combination of the remaining vectors.
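
One practical test (standard, though not spelled out in the source): stack the vectors as columns of a matrix; the set is linearly independent iff the matrix rank equals the number of vectors. A sketch:

```python
import numpy as np

# columns are the vectors x_1, x_2, x_3 in R^3
A = np.column_stack([
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [2.0, 1.0, 0.0],  # = x_1 + x_2, so the set is dependent
])

# linearly independent iff rank == number of vectors
print(np.linalg.matrix_rank(A) == A.shape[1])  # False
```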

Norms

Informally, a norm measures the length (or magnitude) of a vector and returns a scalar.

A norm is a function $\lVert\cdot\rVert: \mathbb{R}^n\rightarrow\mathbb{R}$ that satisfies:

  1. Definiteness: $\lVert\boldsymbol{x}\rVert = 0 \iff \boldsymbol{x} = \boldsymbol{0}$
  2. Homogeneity: $\lVert c\boldsymbol{x}\rVert = \lvert c\rvert\,\lVert\boldsymbol{x}\rVert, \quad \forall c\in\mathbb{R}$
  3. Triangle inequality: $\lVert\boldsymbol{x}+\boldsymbol{y}\rVert \leq \lVert\boldsymbol{x}\rVert + \lVert\boldsymbol{y}\rVert, \quad \forall\,\boldsymbol{x},\boldsymbol{y}\in\mathbb{R}^n$
  4. Non-negativity: $\lVert\boldsymbol{x}\rVert \geq 0, \quad \forall\,\boldsymbol{x}\in\mathbb{R}^n$
Examples of norms, for $\boldsymbol{x}\in\mathbb{R}^n$:

  • 1-norm: $\lVert\boldsymbol{x}\rVert_1 = \lvert x_1\rvert + \dots + \lvert x_n\rvert$
  • 2-norm: $\lVert\boldsymbol{x}\rVert_2 = \left(x_1^2 + \dots + x_n^2\right)^{1/2}$
  • p-norm: $\lVert\boldsymbol{x}\rVert_p = \left(\sum_{i=1}^{n}{\lvert x_i\rvert^p}\right)^{1/p}$
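
All three can be checked against numpy.linalg.norm; a minimal sketch with an illustrative vector:

```python
import numpy as np

x = np.array([3.0, -4.0])

print(np.linalg.norm(x, 1))               # 1-norm: |3| + |-4| = 7.0
print(np.linalg.norm(x, 2))               # 2-norm: (9 + 16)**0.5 = 5.0

p = 3
print(np.sum(np.abs(x) ** p) ** (1 / p))  # p-norm from the definition
print(np.linalg.norm(x, p))               # agrees with NumPy's p-norm
```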

Vector Arithmetic

Vector addition
$$\boldsymbol{u} + \boldsymbol{v} = \begin{bmatrix}u_1+v_1 \\ \vdots \\ u_n+v_n\end{bmatrix}, \quad \forall\,\boldsymbol{u},\boldsymbol{v}\in\mathbb{R}^n$$
Vector subtraction
$$\boldsymbol{u} - \boldsymbol{v} = \begin{bmatrix}u_1-v_1 \\ \vdots \\ u_n-v_n\end{bmatrix}, \quad \forall\,\boldsymbol{u},\boldsymbol{v}\in\mathbb{R}^n$$
Scalar multiplication
$$k\boldsymbol{u} = \begin{bmatrix}ku_1 \\ \vdots \\ ku_n\end{bmatrix}, \quad \forall\,\boldsymbol{u}\in\mathbb{R}^n,\ k\in\mathbb{R}$$
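
With NumPy arrays these operations are elementwise, so the formulas above map directly to code; an illustrative sketch:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
k = 2.0

print(u + v)  # [5. 7. 9.]    vector addition
print(u - v)  # [-3. -3. -3.] vector subtraction
print(k * u)  # [2. 4. 6.]    scalar multiplication
```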

Vector - Vector Products

Given two vectors $\boldsymbol{x}, \boldsymbol{y} \in \mathbb{R}^n$, "multiplying" them can refer to three different operations:

Dot product
Notation: $\boldsymbol{x}\cdot\boldsymbol{y}$
Definition:
$$\begin{aligned} \boldsymbol{x}\cdot\boldsymbol{y} &= \boldsymbol{x}^T\boldsymbol{y} \\ &= \begin{bmatrix}x_1 & \dots & x_n\end{bmatrix}\begin{bmatrix}y_1 \\ \vdots \\ y_n\end{bmatrix} \\ &= \sum_{i=1}^{n}{x_i y_i} \\ &= \lVert\boldsymbol{x}\rVert\,\lVert\boldsymbol{y}\rVert\cos{\theta} \end{aligned}$$
where $\theta$ is the angle between $\boldsymbol{x}$ and $\boldsymbol{y}$.
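
A quick numerical check that the coordinate form and the $\cos\theta$ form agree (vectors chosen for illustration):

```python
import numpy as np

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

dot = np.dot(x, y)  # sum_i x_i * y_i = 1.0
cos_theta = dot / (np.linalg.norm(x) * np.linalg.norm(y))

print(dot)                               # 1.0
print(np.degrees(np.arccos(cos_theta)))  # 45.0 -> angle between x and y
```
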
Cross product
Notation: $\boldsymbol{x}\times\boldsymbol{y}$
Definition: defined only for $\boldsymbol{x}, \boldsymbol{y} \in \mathbb{R}^3$, where it yields a vector orthogonal to both:
$$\boldsymbol{x}\times\boldsymbol{y} = \begin{bmatrix}x_2y_3 - x_3y_2 \\ x_3y_1 - x_1y_3 \\ x_1y_2 - x_2y_1\end{bmatrix}$$
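
A sketch verifying the component formula with np.cross (illustrative unit vectors):

```python
import numpy as np

x = np.array([1.0, 0.0, 0.0])
y = np.array([0.0, 1.0, 0.0])

z = np.cross(x, y)
print(z)  # [0. 0. 1.]

# the result is orthogonal to both inputs
print(np.dot(z, x), np.dot(z, y))  # 0.0 0.0
```
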
Outer product
Notation: $\boldsymbol{x}\otimes\boldsymbol{y}$
Definition:
$$\begin{aligned} \boldsymbol{x}\otimes\boldsymbol{y} &= \boldsymbol{x}\boldsymbol{y}^T \\ &= \begin{bmatrix}x_1 \\ \vdots \\ x_n\end{bmatrix}\begin{bmatrix}y_1 & \dots & y_n\end{bmatrix} \\ &= \begin{bmatrix}x_1y_1 & \dots & x_1y_n \\ \vdots & \ddots & \vdots \\ x_ny_1 & \dots & x_ny_n\end{bmatrix} \end{aligned}$$
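
A sketch of the outer product with np.outer, using small illustrative vectors:

```python
import numpy as np

x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0, 5.0])

# x (outer) y = x y^T: a (len(x) x len(y)) matrix with entries x_i * y_j
print(np.outer(x, y))
# [[ 3.  4.  5.]
#  [ 6.  8. 10.]]
```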

References

  1. Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning. Cambridge University Press. https://doi.org/10.1017/9781108679930
  2. Mathematical objects. (2021). https://dynref.engr.illinois.edu/rvn.html
    [Online; accessed 2021-07-18]
  3. Petulla, S. (2021). Linear algebra cheatsheet. https://observablehq.com/@petulla/explorable-linear-algebra-cheatsheet
    [Online; accessed 2021-08-13]
  4. Kolter, Z., & Do, C. (2015). Linear Algebra Review and Reference. http://cs229.stanford.edu/section/cs229-linalg.pdf
    [Online; accessed 2021-08-13]
  5. Computational Linear Algebra for Coders. (2021). https://github.com/fastai/numerical-linear-algebra
    [Online; accessed 2021-08-13]
  6. Linear Algebra for Deep Learning. (2021). https://www.quantstart.com/articles/scalars-vectors-matrices-and-tensors-linear-algebra-for-deep-learning-part-1/
    [Online; accessed 2021-08-13]
  7. Immersive Linear Algebra. (2021). http://immersivemath.com/ila
    [Online; accessed 2021-08-13]
  8. Strang, G. (2009). Introduction to Linear Algebra (4th ed.). Wellesley-Cambridge Press.