Linear algebra (9)
strangerRidingCaml
Advanced Topics: Singular Value Decomposition (SVD)
The Singular Value Decomposition (SVD) of a matrix $\mathbf{A}$ is a factorization of $\mathbf{A}$ into the product of three matrices: $$\mathbf{A} = \mathbf{U} \mathbf{\Sigma} \mathbf{V}^*$$ where $\mathbf{U}$ and $\mathbf{V}$ are unitary matrices, and $\mathbf{\Sigma}$ is a diagonal matrix with the singular values of $\mathbf{A}$ on its diagonal. P..
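As a quick illustration (a sketch added here, not taken from the post itself), the factorization can be computed and checked numerically with NumPy; the example matrix below is chosen purely for demonstration.

```python
# Minimal sketch: compute a thin SVD with NumPy and verify A = U diag(s) Vh.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])          # example 3x2 matrix (illustrative values)

U, s, Vh = np.linalg.svd(A, full_matrices=False)  # thin SVD: U is 3x2, Vh is 2x2

# Singular values are returned in descending order.
print("singular values:", s)

# Reconstruct A from the factors; this should match A up to floating-point error.
A_rebuilt = U @ np.diag(s) @ Vh
print("max reconstruction error:", np.max(np.abs(A - A_rebuilt)))
```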
Spectral Theory: Hermitian, Unitary, and Normal Matrices
A matrix $\mathbf{A}$ is:
- Hermitian if it is equal to its conjugate transpose: $\mathbf{A} = \mathbf{A}^*$.
- Unitary if its conjugate transpose is its inverse: $\mathbf{A}^* \mathbf{A} = \mathbf{A} \mathbf{A}^* = \mathbf{I}$.
- Normal if it commutes with its conjugate transpose: $\mathbf{A} \mathbf{A}^* = \mathbf{A}^* \mathbf{A}$.
Spectr..
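These three properties can be checked numerically; the sketch below is illustrative only, and the helper names (is_hermitian, is_unitary, is_normal) and example matrices are assumptions made here rather than anything from the post.

```python
# Minimal sketch: numerically checking the Hermitian / unitary / normal properties.
import numpy as np

def is_hermitian(A, tol=1e-10):
    return np.allclose(A, A.conj().T, atol=tol)

def is_unitary(A, tol=1e-10):
    return np.allclose(A.conj().T @ A, np.eye(A.shape[0]), atol=tol)

def is_normal(A, tol=1e-10):
    return np.allclose(A @ A.conj().T, A.conj().T @ A, atol=tol)

# A Hermitian example: equal to its own conjugate transpose.
H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
print(is_hermitian(H), is_normal(H))   # True True (Hermitian implies normal)

# A unitary example: a 2x2 rotation matrix.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(is_unitary(Q), is_normal(Q))     # True True (unitary implies normal)
```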
Inner Product Spaces: Inner Product, Norms, and Orthogonality in Euclidean Spaces
An inner product on a vector space $V$ is a function $\langle \cdot, \cdot \rangle: V \times V \rightarrow \mathbb{R}$ that satisfies the following properties for all vectors $\mathbf{u}, \mathbf{v}, \mathbf{w}$ and scalars $a, b$: Linearity: $\langle a\mathbf{u} + b\mathbf{v}, \mathbf{w} \rangle = a\langle \mathbf{u}, \mathbf{w} \rangle + b\langle \mathbf{v}, \mathbf{w} \rangle$..
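A small sketch (assuming the standard dot product on $\mathbb{R}^n$ as the inner product, which the post's Euclidean setting suggests) shows the inner product, the norm it induces, and an orthogonality check; the vectors are example values chosen here.

```python
# Minimal sketch: dot product as inner product, induced norm, orthogonality check.
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, -1.0, 0.0])

inner = np.dot(u, v)                 # <u, v>
norm_u = np.sqrt(np.dot(u, u))       # ||u|| = sqrt(<u, u>)

print("inner product:", inner)       # 0.0 -> u and v are orthogonal
print("norm of u:", norm_u)          # 3.0
print("matches np.linalg.norm:", np.isclose(norm_u, np.linalg.norm(u)))
```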
Eigenvalues and Eigenvectors: Definition and Characteristic Equation
Let $\mathbf{A}$ be an $n \times n$ matrix. A scalar $\lambda$ is called an eigenvalue of $\mathbf{A}$ if there exists a non-zero vector $\mathbf{v}$ such that: $$\mathbf{Av} = \lambda \mathbf{v}$$ The characteristic equation of $\mathbf{A}$ is given by: $$\text{det}(\mathbf{A} - \lambda \mathbf{I}) = 0$$ where $\mathbf{I}$ is the identity matrix..
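For concreteness, the sketch below (illustrative, with an example matrix chosen here) computes eigenpairs with NumPy, verifies $\mathbf{Av} = \lambda \mathbf{v}$, and checks that the eigenvalues are roots of the characteristic polynomial of a $2 \times 2$ matrix.

```python
# Minimal sketch: eigenpairs with NumPy, plus a check of A v = lambda v.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])           # example matrix (illustrative values)

eigvals, eigvecs = np.linalg.eig(A)  # columns of eigvecs are the eigenvectors

for i, lam in enumerate(eigvals):
    v = eigvecs[:, i]
    print(f"lambda = {lam:.4f}, residual = {np.linalg.norm(A @ v - lam * v):.2e}")

# For a 2x2 matrix, det(A - lambda I) = lambda^2 - trace(A) lambda + det(A);
# its roots should match the eigenvalues above.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
print("roots of characteristic polynomial:", np.roots(coeffs))
```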
Linear Transformations: Definition and Properties
A linear transformation $T: \mathbb{R}^m \rightarrow \mathbb{R}^n$ is a function that preserves vector addition and scalar multiplication: $T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v})$ for all vectors $\mathbf{u}, \mathbf{v}$ in the domain of $T$, and $T(c\mathbf{v}) = cT(\mathbf{v})$ for all scalars $c$ and vectors $\mathbf{v}$ in the domain of $T$..
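As a sketch (not from the post; the matrix and test vectors are chosen here), a linear transformation $T(\mathbf{v}) = \mathbf{M}\mathbf{v}$ given by a matrix can be checked against the two defining properties numerically.

```python
# Minimal sketch: a matrix defines a linear map T: R^3 -> R^2; verify the
# additivity and homogeneity properties on random vectors.
import numpy as np

M = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])     # represents T(v) = M v

def T(v):
    return M @ v

rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
c = 2.5

print(np.allclose(T(u + v), T(u) + T(v)))   # additivity
print(np.allclose(T(c * v), c * T(v)))      # homogeneity
```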
Vector Spaces: Basis and Dimension
A basis of a vector space $V$ is a set of linearly independent vectors that span $V$. The dimension of $V$, denoted as $\text{dim}(V)$, is the number of vectors in any basis of $V$.
Orthogonality, Orthogonal Complements, and Projections
Two vectors $\mathbf{v}$ and $\mathbf{w}$ in a vector space are orthogonal if their dot product is zero, i.e., $\mathbf{v} \cdot \mathbf{w} = 0$..
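A final sketch (illustrative only; the spanning vectors and the least-squares route to the projection are choices made here, not the post's method) estimates the dimension of a span via matrix rank and projects a vector onto that subspace, checking that the residual is orthogonal to it.

```python
# Minimal sketch: dimension of a span via rank, and orthogonal projection
# onto the span via least squares.
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2                          # linearly dependent on v1 and v2
B = np.column_stack([v1, v2, v3])     # columns span a 2-dimensional subspace

print("dimension of span:", np.linalg.matrix_rank(B))   # 2

# Orthogonal projection of x onto the column space of B.
x = np.array([1.0, 2.0, 3.0])
coeffs, *_ = np.linalg.lstsq(B, x, rcond=None)
proj = B @ coeffs                     # [1, 2, 0]

# The residual is orthogonal to every spanning vector (up to rounding).
print("residual dot products:", B.T @ (x - proj))
```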