Let $A$ be an $n\times m$ matrix. Prove that $\operatorname{rank} (A) = 1$ if and only if there exist column vectors $v \in \mathbb{R}^n$ and $w \in \mathbb{R}^m$ such that $A=vw^t$.

Progress: I'm going back and forth between using the definitions of rank: $\operatorname{rank} (A) = \dim(\operatorname{col}(A)) = \dim(\operatorname{row}(A))$, or using the rank theorem that says $\operatorname{rank}(A)+\operatorname{nullity}(A) = m$. So in the second case I have to prove that $\operatorname{nullity}(A)=m-1$.

  • I'm going back and forth between using the definitions of rank: $\operatorname{rank}(A) = \dim(\operatorname{col}(A)) = \dim(\operatorname{row}(A))$, or using the rank theorem that says $\operatorname{rank}(A)+\operatorname{nullity}(A) = m$. So in the second case I have to prove that $\operatorname{nullity}(A)=m-1$. I feel like that's maybe the harder approach. I'm looking at concrete examples now just to see what happens with matrix multiplication. Do you have any thoughts on how I should approach this? – coconutbandit Nov 24 '15 at 23:39
  • Also [Rank of the Outer Product of two Vectors](http://math.stackexchange.com/q/689679) –  Nov 24 '15 at 23:43

3 Answers



$A=\mathbf v\mathbf w^T\implies\operatorname{rank}A=1$ should be pretty easy to prove directly. Multiply a vector in $\mathbb R^m$ by $A$ and see what you get.

For the other direction, think about what $A$ does to the basis vectors of $\mathbb R^m$ and what this means about the columns of $A$.


Suppose $A=\mathbf v\mathbf w^T$. If $\mathbf u\in\mathbb R^m$, then $A\mathbf u=\mathbf v\mathbf w^T\mathbf u=(\mathbf u\cdot\mathbf w)\mathbf v$. Thus, $A$ maps every vector in $\mathbb R^m$ to a scalar multiple of $\mathbf v$, hence $\operatorname{rank}A=\dim\operatorname{im}A=1$.
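As a numerical sanity check of this direction (a hypothetical NumPy sketch, not part of the original answer): the outer product $\mathbf v\mathbf w^T$ maps every vector to a multiple of $\mathbf v$ and has rank $1$.

```python
import numpy as np

# Hypothetical rank-1 example: A = v w^T with v in R^3, w in R^4.
v = np.array([1.0, -2.0, 3.0])
w = np.array([2.0, 0.0, 1.0, -1.0])
A = np.outer(v, w)  # the n x m matrix v w^T

# Every product A u is a scalar multiple of v: A u = (w . u) v.
u = np.array([1.0, 2.0, 3.0, 4.0])
assert np.allclose(A @ u, np.dot(w, u) * v)

# Hence the image is spanned by v, so the rank is 1.
assert np.linalg.matrix_rank(A) == 1
```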

Now, assume $\operatorname{rank}A=1$. Then for all $\mathbf u\in\mathbb R^m$, $A\mathbf u=k\mathbf v$ for some fixed $\mathbf v\in\mathbb R^n$. In particular, this is true for the basis vectors of $\mathbb R^m$, so every column of $A$ is a multiple of $\mathbf v$. That is, $$ A=\pmatrix{w_1\mathbf v & w_2\mathbf v & \cdots & w_m\mathbf v}=\mathbf v\pmatrix{w_1&w_2&\cdots&w_m}=\mathbf v\mathbf w^T. $$
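The factorization in the converse can also be checked numerically (a hypothetical NumPy sketch): every column of a rank-$1$ matrix is a multiple $w_j\mathbf v$ of one fixed vector, and the multipliers assemble into $\mathbf w$.

```python
import numpy as np

# Hypothetical rank-1 matrix: every column is a multiple of (2, -1, 3).
A = np.array([[ 2.0,  4.0, -2.0],
              [-1.0, -2.0,  1.0],
              [ 3.0,  6.0, -3.0]])
assert np.linalg.matrix_rank(A) == 1

v = A[:, 0]  # any nonzero column generates the image
assert np.linalg.norm(v) > 0

# Column j equals w_j * v; recover w_j by projecting the column onto v.
w = np.array([np.dot(A[:, j], v) / np.dot(v, v) for j in range(A.shape[1])])

# A factors as the outer product v w^T.
assert np.allclose(A, np.outer(v, w))
```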

  • I understand now why rank(A) = 1, simply because the columns of A are a linear combination of w. But how would I show that by multiplying a vector in the stated dimension by A? – coconutbandit Nov 25 '15 at 00:15
  • You’re getting the spaces confused. $A$ has $n$ rows, but $w$ has $m$ entries. In any case, for the easy direction, take $u\in\mathbb R^m$ and multiply: $Au=vw^Tu=$? – amd Nov 25 '15 at 03:53
  • @amd I think you have the convention for rows and columns mixed up. $A$ is usually described as having $m$ rows and $n$ columns, and hence $u$ and $v$ would have $n$ rows. – csss Aug 24 '16 at 08:05
  • I meant $u$ should be in $\mathbb{R}^n$ and $v$ should be in $\mathbb{R}^m$. – csss Aug 24 '16 at 08:40
  • @csss I’m following the OP’s nomenclature. – amd Aug 24 '16 at 16:57
  • @coconutbandit You can use Sylvester’s Law of Nullity (rank) to prove the "if" part. It is pretty straightforward. – uuuuuuuuuu Jun 25 '19 at 14:00
  • @amd can you please elaborate more on the second part of the proof, i.e. $rank(A) = 1 \implies A = vw^T?$ – Samvid Mistry Oct 15 '19 at 17:34
  • @SamvidMistry I’m not sure that there’s really much more to say there. I’m just applying the definitions of rank and matrix multiplication. – amd Oct 18 '19 at 19:32
  • @amd I am confused about this. I understand that the basis vectors for $\mathbb{R}^m$ will be mapped onto a single line. But how can we derive the fact that all columns of A are multiples of $\textbf{v}$ from that? – Samvid Mistry Oct 19 '19 at 13:00
  • @SamvidMistry It’s a direct consequence of the definition of matrix multiplication. Hint: Think about the matrix products $A(1,0,0,\dots)^T$, $A(0,1,0,\dots)^T$ and so on. What do the columns of a transformation matrix correspond to? – amd Oct 19 '19 at 17:39
  • @amd Thank you. I got it, kind of. Multiplying $i^{th}$ basis vector will give $i^{th}$ column of $A$, since we have assumed $A$ is rank 1, all columns will end up on a single line, scaled by some constant $w_i$. – Samvid Mistry Oct 31 '19 at 18:12
  • @amd I just wanted to ask: since $\operatorname{rank}(A) =1$, then $\dim(\operatorname{col}(A))=1$, and generally after RREF we pick the pivot column in $A$ as the basis. So here only one column is in the basis, say $C_i$; then either we choose $C_i$ as the basis or $v=\alpha C_i, \alpha \neq 0$. So now $A=[ \alpha_1 v ~~ \alpha_2 v ~~ \cdots ~~ \alpha_m v]= v\alpha^T$. – Upstart Feb 01 '22 at 17:25

Suppose that $A$ has rank one. Then its image is one dimensional, so there is some nonzero $v$ that generates it. Moreover, for any other $w$, we can write $$Aw = \lambda(w)v$$

for some scalar $\lambda(w)$ that depends linearly on $w$, by virtue of $A$ being linear and $v$ being a basis of the image of $A$. This then defines a nonzero functional $\mathbb R^m \longrightarrow \mathbb R$, which must be given by taking the dot product with some $w_0$; say $\lambda(w) =\langle w_0,w\rangle$. It follows then that

$$ A(w) = \langle w_0,w\rangle v$$ for every $w$, or, what is the same, that $A= vw_0^t$.


Dealing with the more difficult direction: I will leave the details for you to complete, since this is a useful exercise to go through, in terms of familiarizing yourself with the index notation and gaining practice in thinking at the required level of abstraction. But there may be a fairly intuitive and straightforward solution along the following lines.

Suppose $n, m \ge 2$ (if $A$ has a single row or column, the factorization is immediate). Let $A$ be an $n \times m$ matrix which acts on the $m$-dimensional vector space $F^m$.

First, you may show that if any $2\times 2$ minor of $A$ does not vanish, then the image has dimension at least $2$. By permuting rows and columns, you can see that this is implied by the $2\times 2$ case, which is straightforward.

Second, knowing that all $2\times 2$ minors of a rank-$1$ matrix must vanish, you have the condition $$ a_{ij}a_{kl}=a_{il}a_{kj}. $$
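This minor identity can be verified numerically on a hypothetical rank-$1$ example (an illustrative NumPy sketch, not part of the original answer):

```python
import numpy as np
from itertools import combinations

# Hypothetical rank-1 matrix built as an outer product.
A = np.outer(np.array([1.0, 2.0, -1.0]), np.array([3.0, 0.0, 5.0, 1.0]))

# Every 2x2 minor of a rank-1 matrix vanishes: a_ij a_kl = a_il a_kj.
for i, k in combinations(range(A.shape[0]), 2):
    for j, l in combinations(range(A.shape[1]), 2):
        assert np.isclose(A[i, j] * A[k, l], A[i, l] * A[k, j])
```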

Suppose first that none of the entries of $A$ is zero. Then you should have no trouble deriving the desired conclusion.

We now need to show that the problem can, in effect, be reduced to the case already dealt with.

If $A \ne 0$ then some entry is nonzero; WLOG $a_{11} \ne 0$. You may now show that if $a_{1j}=0$, then $a_{kj}=0$ for all $k$.

The same thing follows for rows if you invoke the fact that the rank of a matrix is equal to the rank of its transpose.
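Once a nonzero entry $a_{11}$ is in hand, the minor condition $a_{ij}a_{11}=a_{i1}a_{1j}$ determines the whole matrix from its first row and column, which gives the factorization directly. A hypothetical NumPy sketch of this step:

```python
import numpy as np

# Hypothetical rank-1 example with a_11 nonzero.
A = np.outer(np.array([2.0, -1.0, 4.0]), np.array([1.0, 3.0, 0.0, -2.0]))
assert A[0, 0] != 0

# The minor condition a_ij a_11 = a_i1 a_1j gives a_ij = a_i1 * (a_1j / a_11),
# i.e. A = v w^T with v the first column and w^T the first row scaled by 1/a_11.
v = A[:, 0]               # first column
wT = A[0, :] / A[0, 0]    # first row divided by a_11

assert np.allclose(A, np.outer(v, wT))
```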

David Holden