SVD Calculator (Singular Value Decomposition)

Enter your matrix values into the SVD Calculator to decompose any matrix A into its singular value decomposition A = UΣVᵀ. Choose the matrix size (rows × columns, up to 4×4) and fill in the entries — the calculator returns the singular values (the diagonal of Σ), along with the rank, Frobenius norm, and condition number of your matrix. Results display automatically for the default example.

Number of rows in matrix A (1–4)

Number of columns in matrix A (1–4)

Results

Singular Values (σ₁ ≥ σ₂ ≥ …)

--

Matrix Rank

--

Frobenius Norm ‖A‖

--

Largest Singular Value (σ₁)

--

Smallest Non-zero Singular Value

--

Condition Number (σ₁ / σₘᵢₙ)

--

Singular Values (σᵢ)

Results Table

Frequently Asked Questions

What is singular value decomposition (SVD)?

Singular value decomposition is a factorization of a real or complex m×n matrix A into the product A = UΣVᵀ, where U is an m×m orthogonal matrix, Σ is an m×n rectangular diagonal matrix whose non-negative diagonal entries are the singular values, and V is an n×n orthogonal matrix. It generalizes eigendecomposition to non-square (and non-symmetric) matrices.
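The factorization above can be checked numerically. A minimal sketch using NumPy (the 2×3 matrix here is just an illustrative example):

```python
import numpy as np

# Hypothetical 2x3 example matrix; any real matrix works.
A = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# full_matrices=True returns U (2x2) and Vt (3x3); s holds the
# singular values in non-increasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the rectangular m x n Sigma and verify A = U @ Sigma @ Vt.
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)
```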

Is the singular value decomposition unique?

No, the SVD is not unique in general. The singular values themselves are unique and always listed in non-increasing order (σ₁ ≥ σ₂ ≥ … ≥ 0), but the columns of U and V (the singular vectors) can differ by sign flips or, when singular values are repeated, by orthogonal rotations in the corresponding subspace.
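The sign ambiguity is easy to demonstrate: flipping the sign of a left singular vector together with its matching right singular vector leaves the product unchanged. A small sketch (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
U, s, Vt = np.linalg.svd(A)

# Flip the sign of the first left singular vector and the matching
# right singular vector: the reconstructed product is unchanged.
U2 = U.copy();  U2[:, 0] *= -1
Vt2 = Vt.copy(); Vt2[0, :] *= -1
assert np.allclose(U @ np.diag(s) @ Vt, U2 @ np.diag(s) @ Vt2)
```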

What does SVD do to a matrix?

SVD decomposes any linear transformation (matrix) into three geometric steps: a rotation/reflection (Vᵀ), a scaling along principal axes (Σ), and another rotation/reflection (U). This reveals the fundamental structure of the transformation — how much it stretches space in each independent direction, which is captured by the singular values.
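The rotate–scale–rotate picture can be verified by applying the three factors to a vector one at a time (matrix and vector below are arbitrary examples):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
U, s, Vt = np.linalg.svd(A)
x = np.array([1.0, 2.0])

# Apply the three geometric steps in sequence:
step1 = Vt @ x              # rotation/reflection
step2 = np.diag(s) @ step1  # scaling along principal axes
step3 = U @ step2           # another rotation/reflection

# The composition equals the original transformation A @ x.
assert np.allclose(step3, A @ x)
```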

What is the SVD of a symmetric positive definite matrix?

For a symmetric positive definite matrix A, the SVD coincides with its eigendecomposition: A = QΛQᵀ. The singular values equal the eigenvalues, and the left and right singular vectors are the same orthonormal eigenvectors. This makes symmetric matrices a special, elegant case of SVD.
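This coincidence is straightforward to confirm by comparing the eigenvalues and singular values of a small SPD matrix (the matrix below is an illustrative example):

```python
import numpy as np

# Hypothetical symmetric positive definite matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

eigvals = np.linalg.eigvalsh(A)         # returned in ascending order
s = np.linalg.svd(A, compute_uv=False)  # returned in descending order

# For SPD matrices the singular values equal the eigenvalues.
assert np.allclose(np.sort(s), np.sort(eigvals))
```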

What is the SVD of a unitary (orthogonal) matrix?

If A is a unitary (or orthogonal) matrix, all its singular values are exactly 1. This is because a unitary matrix preserves vector lengths, so the scaling matrix Σ is simply the identity and the factorization reduces to A = UVᵀ (with Σ = I).
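A quick check with a 2D rotation matrix (the angle is an arbitrary example):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # orthogonal rotation

# Every singular value of an orthogonal matrix is exactly 1.
s = np.linalg.svd(Q, compute_uv=False)
assert np.allclose(s, 1.0)
```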

How is the rank of a matrix determined from its SVD?

The rank of matrix A equals the number of non-zero singular values in Σ. Numerically, a singular value is treated as zero if it falls below a small tolerance threshold (e.g., 1e-9), since floating-point rounding rarely produces exact zeros. This makes SVD one of the most reliable numerical methods for determining matrix rank.
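Counting singular values above a tolerance gives the numerical rank. A sketch (the matrix is an arbitrary rank-2 example whose second row is twice the first):

```python
import numpy as np

# Rank-deficient 3x3 matrix: row 2 is 2 x row 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

s = np.linalg.svd(A, compute_uv=False)
rank = int(np.sum(s > 1e-9))  # count singular values above tolerance

assert rank == 2
assert rank == np.linalg.matrix_rank(A)  # NumPy uses the same SVD approach
```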

What is the condition number and why does it matter?

The condition number is the ratio of the largest singular value to the smallest non-zero singular value (σ₁ / σₘᵢₙ). A large condition number indicates a nearly singular or ill-conditioned matrix, meaning small changes in input can cause large changes in the output — a critical property for numerical stability in linear solvers.
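The ratio σ₁/σₘᵢₙ matches NumPy's 2-norm condition number. A sketch with a nearly singular example matrix:

```python
import numpy as np

# Nearly singular matrix: the rows are almost identical.
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])

s = np.linalg.svd(A, compute_uv=False)
cond = s[0] / s[-1]  # largest over smallest singular value

# np.linalg.cond defaults to the 2-norm, i.e. the same ratio.
assert np.isclose(cond, np.linalg.cond(A))
assert cond > 1e3  # ill-conditioned: small input changes amplify greatly
```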

What are practical applications of SVD?

SVD is used across many fields: in data science and machine learning for Principal Component Analysis (PCA) and dimensionality reduction; in image compression to approximate images with fewer singular values; in natural language processing for Latent Semantic Analysis (LSA); and in numerical linear algebra for computing pseudoinverses and solving least-squares problems.
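The compression and dimensionality-reduction uses all rest on truncated SVD: keeping only the k largest singular values gives the best rank-k approximation (Eckart–Young theorem). A minimal sketch on random data:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 5))  # stand-in for an image or data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the k largest singular values (truncated SVD).
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Eckart-Young: the spectral-norm error of the best rank-k
# approximation equals the first discarded singular value.
err = np.linalg.norm(A - A_k, 2)
assert np.isclose(err, s[k])
```

Storing U[:, :k], s[:k], and Vt[:k, :] instead of A is what makes SVD-based image compression work: for k much smaller than min(m, n), far fewer numbers are kept.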

More Math Tools