Linear Independence Calculator

Enter your vectors into the matrix grid to check whether a set of vectors is linearly independent or linearly dependent. Set the number of vectors and vector size, fill in the component values, and the calculator determines independence by computing the matrix rank — showing you the result along with which vectors are redundant.

How many vectors are in your set (up to 5)

Number of components in each vector (up to 5)

Results

Linear Independence Status

--

Matrix Rank

--

Number of Vectors

--

Redundant Vectors

--

Span Dimension

--

Independent vs Redundant Vectors

Results Table

Frequently Asked Questions

What does it mean for vectors to be linearly independent?

A set of vectors is linearly independent if the only solution to the equation c₁v₁ + c₂v₂ + … + cₙvₙ = 0 is when all scalar coefficients c₁, c₂, …, cₙ equal zero. In other words, no vector in the set can be expressed as a linear combination of the others. If any non-trivial solution exists, the vectors are linearly dependent.

How do I check if vectors are linearly independent?

The standard method is to form a matrix with your vectors as columns (or rows), then row-reduce it to echelon form. The rank of that matrix equals the number of linearly independent vectors. If the rank equals the number of vectors, the set is independent; if the rank is less, the set is dependent.
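This row-reduction method can be sketched in plain Python. The `matrix_rank` helper below is illustrative (not the calculator's actual implementation) and uses exact rational arithmetic to avoid floating-point round-off when deciding whether a pivot is zero:

```python
from fractions import Fraction

def matrix_rank(rows):
    """Rank via Gaussian elimination with exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    rank = 0
    for col in range(len(m[0]) if m else 0):
        # Find a row at or below the current pivot row with a nonzero entry.
        pivot = next((r for r in range(rank, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            continue  # No pivot in this column; move on.
        m[rank], m[pivot] = m[pivot], m[rank]
        # Eliminate the entries below the pivot.
        for r in range(rank + 1, len(m)):
            factor = m[r][col] / m[rank][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Three vectors in R^3 where v3 = v1 + v2, so the set is dependent.
vectors = [[1, 0, 1], [0, 1, 1], [1, 1, 2]]
print(matrix_rank(vectors))                   # 2
print(matrix_rank(vectors) == len(vectors))   # False -> linearly dependent
```

Here the vectors are the matrix rows; the rank (2) is less than the number of vectors (3), so the set is dependent.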

Are [1, 1] and [1, −1] linearly independent in R²?

Yes. Forming the matrix [[1, 1], [1, -1]] and computing its determinant gives 1×(−1) − 1×1 = −2 ≠ 0, so the rank is 2 — equal to the number of vectors — confirming linear independence. Neither vector is a scalar multiple of the other.
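For a pair of 2-component vectors, the determinant test above reduces to one line of arithmetic. A minimal sketch (the `det2` helper is illustrative) with the vectors as matrix columns:

```python
def det2(v1, v2):
    """Determinant of the 2x2 matrix with v1 and v2 as columns."""
    return v1[0] * v2[1] - v1[1] * v2[0]

# 1*(-1) - 1*1 = -2, which is nonzero -> the vectors are independent.
print(det2([1, 1], [1, -1]))  # -2
```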

Can 2 vectors span R³?

No. Two vectors can span at most a 2-dimensional subspace (a plane) within R³. To span all of R³ you need at least 3 linearly independent vectors, since R³ has dimension 3.
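One way to see this concretely: every combination of u = [1, 0, 0] and v = [0, 1, 0] has a zero third component, so no choice of coefficients ever reaches [0, 0, 1]. A small sketch (the `combo` helper is illustrative):

```python
def combo(a, b):
    """Return a*u + b*v for u = [1, 0, 0], v = [0, 1, 0]."""
    u, v = [1, 0, 0], [0, 1, 0]
    return [a * ui + b * vi for ui, vi in zip(u, v)]

# The third component is always 0: these two vectors span only a plane in R^3.
print(combo(2, -3))  # [2, -3, 0]
```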

Is the identity matrix linearly independent?

Yes. The columns of the n×n identity matrix are the standard basis vectors e₁, e₂, …, eₙ, which are always linearly independent. The matrix has full rank n, so no column can be written as a combination of the others.

What is the relationship between rank and linear independence?

The rank of a matrix equals the maximum number of linearly independent rows or columns it contains. If you arrange your vectors as columns of a matrix, a rank equal to the number of columns confirms the entire set is linearly independent. A lower rank means some vectors are redundant.

Can more vectors than the dimension of the space be linearly independent?

No. In an n-dimensional vector space, any set of more than n vectors must be linearly dependent. For example, you cannot have 4 linearly independent vectors in R³ — at least one will always be expressible as a combination of the others.
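For example, add a fourth vector w = [1, 2, 3] to the standard basis of R³; w is forced to be a combination of the basis vectors, so the four-vector set is dependent. A quick check (the vectors here are an illustrative example):

```python
# In R^3, any 4th vector is a combination of three independent ones.
e1, e2, e3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
w = [1, 2, 3]
combo = [1 * a + 2 * b + 3 * c for a, b, c in zip(e1, e2, e3)]
print(combo == w)  # True: w = 1*e1 + 2*e2 + 3*e3, so {e1, e2, e3, w} is dependent
```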

What happens if one of my vectors is the zero vector?

Any set containing the zero vector is automatically linearly dependent, because you can always set the coefficient of the zero vector to any non-zero value while setting all other coefficients to zero and still satisfy the equation c₁v₁ + … = 0. The zero vector contributes nothing to the span.
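This non-trivial solution can be verified directly. Below, the coefficient on the zero vector is 5 (any non-zero value works) while the coefficient on v is 0, and the combination still lands on zero:

```python
# c1*v + c2*0 = 0 holds with c1 = 0, c2 = 5 -- a non-trivial solution,
# so any set containing the zero vector is linearly dependent.
v, zero = [3, 1], [0, 0]
c1, c2 = 0, 5
result = [c1 * a + c2 * b for a, b in zip(v, zero)]
print(result)  # [0, 0] even though not all coefficients are zero
```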

More Math Tools