This is a summary of the linear algebra concepts used in the course:
- $A \in \mathbb{R}^{m \times n}$: matrix with $m$ rows and $n$ columns;
- $x \in \mathbb{R}^{n}$: (column) vector with $n$ rows and $1$ column;
- A row vector is a matrix with $1$ row and $n$ columns;
- A scalar is a matrix with $1$ row and $1$ column (i.e. a number).
- Transpose of a matrix $A \in \mathbb{R}^{m \times n}$ is the matrix $A^T \in \mathbb{R}^{n \times m}$ such that $A^T_{ij} = A_{ji}$;
- Matrix $A$ is symmetric if $A = A^T$;
- The product of two matrices $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$ is the matrix $C \in \mathbb{R}^{m \times p}$ such that $C_{ij} = \sum_{k=1}^n A_{ik} B_{kj}$;
- The inner product (or dot product) of two vectors $x, y \in \mathbb{R}^n$ is the scalar $x^T y = \sum_{i=1}^n x_i y_i \in \mathbb{R}$;
- The outer product of two vectors $x, y \in \mathbb{R}^n$ is the matrix $xy^T \in \mathbb{R}^{n \times n}$ such that $(xy^T)_{ij} = x_i y_j$;
- The Hadamard/Schur product of two vectors $x, y \in \mathbb{R}^n$ is the vector $x \circ y \in \mathbb{R}^n$ such that $(x \circ y)_i = x_i y_i$.
Properties of matrix multiplication:
- Associativity: $(AB)C = A(BC)$;
- It is not commutative: $AB \neq BA$ in general, unless $A$ or $B$ is a scalar;
- Transpose of a product: $(AB)^T = B^T A^T$;
- Transpose of a sum: $(A + B)^T = A^T + B^T$.
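The products and transpose rules above can be checked numerically. A short NumPy sketch with small hypothetical matrices (the values are arbitrary, chosen only for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

inner = x @ y           # scalar x^T y = 32
outer = np.outer(x, y)  # 3x3 matrix xy^T, (xy^T)_{ij} = x_i y_j
hadamard = x * y        # elementwise (Hadamard/Schur) product

A = np.arange(6.0).reshape(2, 3)   # 2x3 matrix
B = np.arange(12.0).reshape(3, 4)  # 3x4 matrix
C = A @ B                          # 2x4 matrix product

# Transpose of a product: (AB)^T = B^T A^T
assert np.allclose(C.T, B.T @ A.T)
```

Note that `A @ B` only works because the inner dimensions agree ($A$ has $3$ columns and $B$ has $3$ rows), matching the definition of the matrix product.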
- The norm of a vector is its length or magnitude;
- The Euclidean norm (or $l_2$ norm) of a vector $x \in \mathbb{R}^n$ is:
$$ \|x\|_2 = \sqrt{\sum_{i=1}^n x_i^2} = \sqrt{x^T x} = \sqrt{\langle x, x \rangle} $$
- More generally, the $l_p$ norm of a vector $x \in \mathbb{R}^n$ is:
$$ \|x\|_p = \left(\sum_{i=1}^n |x_i|^p\right)^{1/p} $$
- The $l_1$ norm of a vector $x \in \mathbb{R}^n$ is $\|x\|_1 = \sum_{i=1}^n |x_i|$;
- The $l_\infty$ norm of a vector $x \in \mathbb{R}^n$ is $\|x\|_\infty = \max_i |x_i|$.
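All three norms are available through `np.linalg.norm` via its `ord` parameter; a quick sketch on a hypothetical vector:

```python
import numpy as np

x = np.array([3.0, -4.0])

l2 = np.linalg.norm(x)                # Euclidean norm: sqrt(9 + 16) = 5
l1 = np.linalg.norm(x, ord=1)         # sum of absolute values: 7
linf = np.linalg.norm(x, ord=np.inf)  # max absolute value: 4

# The l2 norm also equals sqrt(x^T x), as in the definition above.
assert np.isclose(l2, np.sqrt(x @ x))
```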
- Diagonal matrix: a matrix $D \in \mathbb{R}^{n \times n}$ such that $D_{ij} = 0$ for $i \neq j$;
- Identity matrix: a diagonal matrix $I \in \mathbb{R}^{n \times n}$ such that $I_{ii} = 1$ for all $i$;
- $AI = IA = A$ for any matrix $A \in \mathbb{R}^{m \times n}$ (with identity matrices of the appropriate sizes): the identity is the neutral element of matrix multiplication.
- Upper triangular matrix: a matrix $U \in \mathbb{R}^{n \times n}$ such that $U_{ij} = 0$ for $i > j$;
- Lower triangular matrix: a matrix $L \in \mathbb{R}^{n \times n}$ such that $L_{ij} = 0$ for $i < j$.
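NumPy has direct constructors for each of these special matrices; a sketch (the entries of `A` are random and purely illustrative):

```python
import numpy as np

D = np.diag([1.0, 2.0, 3.0])  # diagonal matrix from its diagonal entries
I = np.eye(3)                 # 3x3 identity matrix

A = np.random.default_rng(0).random((3, 3))
U = np.triu(A)  # upper triangular part: zeros strictly below the diagonal
L = np.tril(A)  # lower triangular part: zeros strictly above the diagonal

# The identity is the neutral element of matrix multiplication.
assert np.allclose(A @ I, A) and np.allclose(I @ A, A)
```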
- Eigenvector of a matrix $A \in \mathbb{R}^{n \times n}$ is a nonzero vector $\vec{v} \in \mathbb{R}^n$ such that $A \vec{v} = \lambda \vec{v}$ for some scalar $\lambda \in \mathbb{R}$; $\lambda$ is the corresponding eigenvalue;
- The eigenvalues of a diagonal matrix are its diagonal elements;
- The trace of a matrix is the sum of its diagonal elements, which equals the sum of its eigenvalues: $\text{tr}(A) = \sum_{i=1}^n A_{ii} = \sum_{i=1}^n \lambda_i$;
- The determinant of a matrix is the product of its eigenvalues: $\det(A) = |A| = \prod_{i=1}^n \lambda_i$;
- $|A| = |A^T|$;
- $|AB| = |A| |B|$;
- $|\alpha A| = \alpha^n |A|$ for a scalar $\alpha$.
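The eigenvalue identities for trace and determinant are easy to verify numerically. A sketch with a small symmetric matrix whose eigenvalues are $1$ and $3$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric; eigenvalues are 1 and 3

# eigh is the eigensolver for symmetric matrices; returns
# eigenvalues in ascending order and eigenvectors as columns.
eigvals, eigvecs = np.linalg.eigh(A)

# trace = sum of eigenvalues, determinant = product of eigenvalues
assert np.isclose(np.trace(A), eigvals.sum())
assert np.isclose(np.linalg.det(A), eigvals.prod())

# Each column of eigvecs satisfies A v = lambda v.
v, lam = eigvecs[:, 0], eigvals[0]
assert np.allclose(A @ v, lam * v)
```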
- A matrix $A \in \mathbb{R}^{n \times n}$ is invertible if there exists a matrix $B$ such that $AB = BA = I$;
- The inverse of a matrix $A \in \mathbb{R}^{n \times n}$ is the matrix $A^{-1} \in \mathbb{R}^{n \times n}$ such that $AA^{-1} = A^{-1}A = I$;
- $\det(A) \neq 0 \quad \Leftrightarrow \quad A \text{ is invertible}$;
- $\det(A^{-1}) = \frac{1}{\det(A)}$;
- If $A$ is invertible, then $Ax = b$ has a unique solution $x = A^{-1} b$ for any $b$;
- The computational cost of inverting a matrix is $O(n^3)$.
Properties:
- $(A^{-1})^{-1} = A$;
- $(AB)^{-1} = B^{-1} A^{-1}$;
- $(A^T)^{-1} = (A^{-1})^T$.
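A sketch checking these identities on a small invertible matrix (values arbitrary; note that in practice `np.linalg.solve(A, b)` is preferred over forming $A^{-1}$ explicitly, both for speed and numerical stability):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])  # det(A) = 10, so A is invertible
b = np.array([1.0, 0.0])

Ainv = np.linalg.inv(A)
x = np.linalg.solve(A, b)  # unique solution of Ax = b

assert np.allclose(A @ Ainv, np.eye(2))
assert np.allclose(x, Ainv @ b)

# det(A^{-1}) = 1 / det(A)
assert np.isclose(np.linalg.det(Ainv), 1.0 / np.linalg.det(A))

# (AB)^{-1} = B^{-1} A^{-1}
B = np.array([[1.0, 2.0],
              [3.0, 5.0]])  # det(B) = -1, also invertible
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ Ainv)
```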
Quadratic Form and Positive Semidefinite Matrices

Given a matrix $A \in \mathbb{R}^{n \times n}$ and a vector $x \in \mathbb{R}^n$, the quadratic form is the scalar $x^T A x$. This can be written as:
$$ x^T A x = \sum_{i=1}^n \sum_{j=1}^n A_{ij} x_i x_j $$
- A positive semidefinite matrix is a symmetric matrix $A \in \mathbb{R}^{n \times n}$ such that $x^T A x \geq 0$ for all $x \in \mathbb{R}^n$;
- $A$ is positive semidefinite $\quad \Leftrightarrow \quad \lambda_i(A) \geq 0$ for all $i$;
- A positive definite matrix is a symmetric matrix $A \in \mathbb{R}^{n \times n}$ such that $x^T A x > 0$ for all $x \neq 0$;
- $A$ is positive definite $\quad \Leftrightarrow \quad \lambda_i(A) > 0$ for all $i$.
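A sketch tying the pieces together: any matrix of the form $B^T B$ is symmetric and positive semidefinite, so its eigenvalues are all nonnegative and its quadratic form is nonnegative for every $x$ (the matrix `B` below is random and purely illustrative):

```python
import numpy as np

B = np.random.default_rng(1).random((4, 3))
A = B.T @ B  # symmetric 3x3, positive semidefinite by construction

# Eigenvalue criterion: all eigenvalues nonnegative
# (eigvalsh assumes a symmetric matrix).
eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals >= -1e-10)

# Quadratic form criterion: x^T A x >= 0 for any x, and it matches
# the double-sum expansion sum_{i,j} A_{ij} x_i x_j.
x = np.array([1.0, -2.0, 0.5])
quad = x @ A @ x
expanded = sum(A[i, j] * x[i] * x[j] for i in range(3) for j in range(3))
assert np.isclose(quad, expanded)
assert quad >= 0.0
```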