I understand that a vector space is a collection of vectors that can be added and multiplied by scalars and that satisfies the eight axioms; however, I do not know what a vector is.

I know that in physics a vector is a geometric object that has a magnitude and a direction, and that in computer science a vector is a container that holds elements and can expand or shrink, but in linear algebra the definition of a vector isn't so clear.

So, what is a vector in linear algebra?

Aaron Hall
Paul Lee
  • A vector is simply an element of a vector space. – Mariano Suárez-Álvarez Sep 22 '16 at 18:59
  • Do not think too much about what a vector or a vector space is. It is just an algebraic structure which happens to have extremely interesting properties, which are then applied everywhere in science. –  Sep 22 '16 at 19:00
  • A mathematical vector is anything with the same formal properties as in physics (the axioms) but in a more general and abstract mathematical setting, so that results about it apply not just to physics but to anything satisfying the axioms. That’s what maths does! P.S. It is not much like the computer science vector. – PJTraill Sep 22 '16 at 22:47
  • "a vector space is a collection of **vectors** that..." You should avoid using the word "vectors" there, and simply say, "A vector space is a set $S$ together with a field $F$ and functions $+$ and $\times$ such that the following axioms are satisfied..." Then you have avoided using the word "vector", so you don't need to define it. If you'd like you can refer to the elements of $S$ as "vectors" to remind people that $(S,F,+,\times)$ is a vector space. – littleO Sep 22 '16 at 23:36
  • The reason is that physicists care about the definition of a vector. In linear algebra, the definition of a vector is irrelevant; what is important is the definition of a vector space. – Asaf Karagila Sep 23 '16 at 04:53
  • @AsafKaragila: Sort of. Actually in modern physics they care that a quantity *transforms* like a vector. So ultimately it's still defined by behaviour, just that it takes more behaviour than just the vector space axioms. – celtschk Sep 23 '16 at 06:27
  • In mathematics, a thing is exactly what its definition says it is. A vector is an element of a vector space. The same definition applies whether you're talking about a point in $\mathbb{R}^n$, a polynomial in $\mathbb{C}[X]$, a continuous function in $C(X)$ for some space $X$, etc. – anomaly Sep 23 '16 at 14:17
  • https://youtu.be/TgKwz5Ikpc8 – Martijn Sep 24 '16 at 21:41
  • The word "vector" has a mathematical meaning **only** in the context of a specified vector space. Once that context is specified, the meaning is very simple: A vector is an element of that vector space. One could try to introduce a context-less notion of vector, by saying that a vector is anything that is an element of **some** vector space, but this notion is useless because everything is an element of some vector space; that is, given any entity $x$, there is (as Hurkyl pointed out in a comment on one of the answers) a vector space having $x$ as an element (in fact, as its only element). – Andreas Blass Sep 25 '16 at 23:38
  • One possible source of confusion is that the physics definition ("magnitude and direction") is more specific than the linear algebra definition. A (linear algebra) vector in a vector space does not have a magnitude or a direction unless the vector space has the additional structure that makes it an *inner product space*. – gandalf61 Jan 08 '19 at 16:01

9 Answers


In modern mathematics, there's a tendency to define things in terms of what they do rather than in terms of what they are.

As an example, suppose that I claim that there are objects called "pizkwats" that obey the following laws:

  • $\forall x. \forall y. \exists z. x + y = z$
  • $\exists x. x = 0$
  • $\forall x. x + 0 = 0 + x = x$
  • $\forall x. \forall y. \forall z. (x + y) + z = x + (y + z)$
  • $\forall x. x + x = 0$

These rules specify what pizkwats do by saying what rules they obey, but they don't say anything about what pizkwats are. We can find all sorts of things that we could call pizkwats. For example, pizkwats could be the numbers 0 and 1, with addition done modulo 2. They could also be bitstrings of length 137, with "addition" meaning "bitwise XOR." Or they could be sets, with "addition" meaning "symmetric difference." Each of these collections of objects obeys the rules for what pizkwats do, but none of them "is" the pizkwats.

The advantage of this approach is that we can prove results about pizkwats knowing purely how they behave rather than what they fundamentally are. For example, as a fun exercise, see if you can use the above rules to prove that

$\forall x. \forall y. x + y = y + x$.

This means that anything that "acts like a pizkwat" must support a commutative addition operator. Similarly, we could prove that

$\forall x. \forall y. (x + y = 0 \rightarrow x = y)$.

The advantage of setting things up this way is that any time we find something that "looks like a pizkwat," in the sense that it obeys the rules given above, we're guaranteed that it must have some other properties as well: its addition is commutative, and every element is its own inverse (and that inverse is unique). We could develop a whole elaborate theory about how pizkwats behave and what pizkwats do purely from the rules of how they work, and since we never actually said what a pizkwat is, anything we find that looks like a pizkwat instantly falls into our theory.
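The bitstring example above can even be checked mechanically. Here is a minimal sketch (the name `pizkwat_add` is invented for illustration, and the length is shrunk from 137 to 4 so the exhaustive check is fast): it verifies each listed axiom for bitwise XOR, and then confirms that commutativity holds even though it was never assumed.

```python
# Model pizkwats as length-4 bitstrings (encoded as ints) under bitwise XOR,
# and exhaustively check the axioms from the text.

from itertools import product

N = 4                      # bitstring length; 137 in the text, 4 keeps this fast
elements = range(2 ** N)   # every length-4 bitstring, encoded as an int

def pizkwat_add(x, y):
    return x ^ y           # bitwise XOR plays the role of "+"

zero = 0

# Closure, identity, associativity, and x + x = 0:
assert all(pizkwat_add(x, y) in elements for x, y in product(elements, repeat=2))
assert all(pizkwat_add(x, zero) == x == pizkwat_add(zero, x) for x in elements)
assert all(pizkwat_add(pizkwat_add(x, y), z) == pizkwat_add(x, pizkwat_add(y, z))
           for x, y, z in product(elements, repeat=3))
assert all(pizkwat_add(x, x) == zero for x in elements)

# Commutativity was never assumed, yet it holds, just as the exercise predicts:
assert all(pizkwat_add(x, y) == pizkwat_add(y, x) for x, y in product(elements, repeat=2))
```

Of course, a finite check of one model is not a proof from the axioms; the exercise above asks for the general argument, which works for every model at once.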

In your case, you're asking about what a vector is. In a sense, there is no single thing called "a vector," because a vector is just something that obeys a bunch of rules. But any time you find something that looks like a vector, you immediately get a bunch of interesting facts about it - you can ask questions about spans, about changing basis, etc. - regardless of whether that thing you're looking at is a vector in the classical sense (a list of numbers, or an arrow pointing somewhere) or a vector in a more abstract sense (say, a function acting as a vector in a "vector space" made of functions.)

As a concluding remark, Grant Sanderson of 3blue1brown has an excellent video talking about what vectors are that explores this in more depth.

  • +1 for your first sentence which puts succinctly the modern algebraic/axiomatic viewpoint. – P Vanchinathan Sep 23 '16 at 03:48
  • To add an example: In Tomography, the image is a vector. So yeah, your insides, lungs, heart, muscles are a vector in CT scans! – Ander Biguri Sep 23 '16 at 10:34
  • This is an excellent explanation. For anyone else coming from a software development background, this is roughly analogous to how typeclasses, traits, and interfaces work! – Jules Sep 23 '16 at 17:53
  • These pizkwats are such a great example! If anyone is interested in the proofs of the two statements: http://pastebin.com/mYJVfRRa – Vincent Sep 23 '16 at 21:24
  • This is a good explanation, but I think it's also important to mention that we can go *from* the abstract axioms *to* a concrete representation in some cases. For instance, you can define the "dimension" of a vector space, and then rigorously prove that any finite-dimensional vector space is isomorphic to the usual R^n. – David Sep 24 '16 at 00:24
  • +1 for inclusion of Grant Sanderson's 3Blue1Brown video. His 'Essence of Linear Algebra' series of videos are fantastic for those who are looking for a quick introduction to the core concepts of Linear Algebra. – Perturbative Sep 25 '16 at 04:44
  • @David: Only those finite-dimensional vector spaces over $R$... – user21820 Sep 25 '16 at 07:27

When I was 14, I was introduced to vectors in a freshman physics course (algebra based). We were told that it was a quantity with magnitude and direction. This is stuff like force, momentum, and electric field.

Three years later in precalculus we thought of them as "points," but with arrows emanating from the origin to that point. Just another thing. This was the concept that stuck until I took linear algebra two more years later.

But now in the abstract sense, vectors don't have to be these "arrows." They can be anything we want: functions, numbers, matrices, operators, whatever. When we build vector spaces (linear spaces in other texts), we just call the objects vectors - who cares what they look like? It's a name to an abstract object.

For example, in $\mathbb{R}^n$ our vectors are ordered $n$-tuples. In $\mathcal{C}[a,b]$ our vectors are now functions - continuous functions on $[a, b]$, at that. In $L^2(\mathbb{R})$ our vectors are those functions for which

$$ \int_{\mathbb{R}} | f |^2 < \infty $$

where the integral is taken in the Lebesgue sense.

Vectors are whatever we take them to be in the appropriate context.
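To make the "functions as vectors" case concrete, here is a minimal sketch (the helper names `add`, `scale`, and `zero` are invented for illustration): addition and scalar multiplication of functions are defined pointwise, and with those operations functions behave exactly like any other vectors.

```python
# Functions on an interval form a vector space under pointwise operations.
import math

def add(f, g):
    return lambda x: f(x) + g(x)       # (f + g)(x) = f(x) + g(x)

def scale(c, f):
    return lambda x: c * f(x)          # (c*f)(x) = c * f(x)

zero = lambda x: 0.0                   # the zero "vector" is the zero function

# The "vector" 2*sin + cos, built with the vector space operations:
h = add(scale(2.0, math.sin), math.cos)

assert math.isclose(h(0.0), 1.0)                               # 2*sin(0) + cos(0) = 1
assert math.isclose(add(math.sin, zero)(1.2), math.sin(1.2))   # f + 0 = f
```

Nothing here uses what a function "looks like" as an arrow or a list of numbers; only the two operations matter.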

Sean Roberson
  • There are things which are not vectors, and spaces which are not vector spaces. Vectors can't be "anything you want". – Wouter Sep 23 '16 at 09:34
  • Well, vectors can be anything we want as long as we have (or invent) a way to sum them and multiply them by scalars, and those operations follow a few quite reasonable rules. – Pere Sep 23 '16 at 09:49
  • @Wouter: To wit, given any thing $x$, I can define a vector space whose only element is $x$, and the result of scalar multiplication and addition is always $x$. So there really is a vector space in which the thing $x$ is a vector. –  Sep 23 '16 at 11:16
  • @Hurkyl On the other hand, there are sets which cannot be turned into vector spaces (finite sets of most sizes). – Tobias Kildetoft Sep 23 '16 at 18:23
  • Nit: Electric charge is a scalar, not a vector (maybe you mean electric field). – kennytm Sep 24 '16 at 09:31
  • @kennytm Thank you, I made the appropriate change. – Sean Roberson Sep 25 '16 at 15:35

This may be disconcerting at first, but the whole point of the abstract notion of vectors is to not tell you precisely what they are. In practice (that is, when using linear algebra in other areas of mathematics and the sciences, and there are a lot of areas that use linear algebra), a vector could be a real or complex valued function, a power series, a translation in Euclidean space, a description of a state of a quantum mechanical system, or something quite different still.

The reason all these diverse things are gathered under the common name of vector is that, for certain types of questions about all these things, a common way of reasoning can be applied; this is what linear algebra is about. In all cases there must be a definite (large) set of vectors (the vector space in which the vectors live), and operations of addition and scalar multiplication of vectors must be defined. What these operations are concretely may vary according to the nature of the vectors. Certain properties are required to hold in order to serve as a foundation for reasoning; these axioms say, for instance, that there must be a distinguished "zero" vector that is neutral for addition, and that addition of vectors is commutative (a good linear algebra course will give you the complete list).

Linear algebra will tell you which facts about vectors, formulated exclusively in terms of the vector space operations, can be deduced purely from those axioms. Some kinds of vectors have more operations defined than just those of linear algebra: for instance, power series can be multiplied together (while in general one cannot multiply two vectors), and functions allow talking about taking limits. However, proving statements about such operations will be based on facts other than the axioms of linear algebra, and will require a different kind of reasoning adapted to each case. In contrast, linear algebra focuses on a large body of common properties that can be derived in exactly the same way in all of the examples, because it does not involve these additional structures at all, whether or not they happen to be present. It is for that reason that linear algebra speaks of vectors in an abstract manner and limits its language to the operations of addition and scalar multiplication (and other notions that can be entirely defined in terms of them).

Marc van Leeuwen

It's an element of a set which is endowed with a certain structure, i.e. a set satisfying the axioms of a vector space.

  • To add to this, the vector space axioms formalize the familiar rules of vector addition and scalar multiplication that work in $\mathbb R^2$ and $\mathbb R^3$. It's okay in the beginning to visualize an abstract vector space as $\mathbb R^3$, as long as you're careful not to confuse visualization with rigorous reasoning. –  Sep 23 '16 at 00:32
  • That's at most a comment, not an answer. – Jannik Pitt May 19 '17 at 19:24

You seem to be thinking that a vector is something different depending on the field you are working in, but this is not true. The definition of a vector that you learn in linear algebra tells you everything you need to know about what a vector is in any setting. A vector is simply an element of a vector space, period - a vector space being any set, with suitable operations, that satisfies the axioms you've been given.

The vector space $\mathbb{R}^3$ that you are used to from physics is just one example of a vector space. So, to say that a vector is a column of numbers, or a geometric object with magnitude and direction, is incorrect. These are just specific examples of the many possible vectors that are out there.

I think you are looking for a very specific notion of what a vector is, when instead you should try to reconcile why all of the types of vectors that you are already used to using actually are vectors in the sense of the true definition you've been given in linear algebra.

  • To be fair, a vector in computer science really *is* something that has very little to do with vectors in physics or in abstract algebra. – JohannesD Sep 25 '16 at 20:29

Just to help understand the shift of concept from physics to linear algebra, without pretending to be rigorous:
Consider that in (Newtonian) physics you work in a Euclidean space, so you can speak in terms of magnitude. In linear algebra we want to be able to define a vector in broader terms, in a reference system that is not necessarily orthogonal, in what is called an affine space/subspace.
In fact, in affine geometry (which helps with visualization) an oriented segment $\mathop {AB}\limits^ \to$ is an ordered pair of points, and a vector corresponds to the ordered $n$-tuple of the differences of their coordinates (the translation vector). A vector is therefore a representative of all the oriented segments which are parallel, point in the same direction, and produce the same "translation" (not the same modulus, which is not defined, or rather is not preserved, under an affine change of coordinates).

G Cab

Literally, an element, or a point, of a vector space; but to get direction and magnitude, the vector space requires an inner product. Nearly every vector space you have seen has one. An example of a vector space with an inner product is $\mathbb R^3$, which you probably use in physics a lot. In math, "element" and "point" are frequently interchangeable: the word "element" emphasizes the algebraic or simply set-based nature of the thing in question, whereas "point" emphasizes a geometric interpretation.

Inner products are not part of the definition of a vector space. A linear algebra course that always works with bases and matrices may not bother to define them, since a choice of basis for a finite-dimensional space always determines an inner product (the one that makes that basis orthonormal). A theoretical linear algebra course will not include the inner product in the definition of a vector space, but will probably study inner products by the end of the semester.

You are more familiar with the "point" interpretation, it seems. So a vector is just a point. But, as a point, it comes with additional information, since a vector space also has an origin; each point therefore corresponds to a line segment from the origin. The inner product gives the vector a direction, from the origin toward the point, and the inner product also gives the vector a magnitude. (In more detail: every inner product automatically defines a norm, and the norm is essentially synonymous with magnitude.)

Surely it is easier to say "the direction of a vector" than "the point defines a line segment," but you are right to be confused - it takes a lot of shorthand and skipped details to get from the bare-bones math of elements and sets to the intuitive geometric notions of direction and magnitude.

Every mathematician and physicist is fluent in this shorthand, and can apply it precisely as needed. You will see a lot of this in your math career, and you should always convince yourself that when steps are skipped, the steps are honest and precise.
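The chain described above - inner product, then norm (magnitude), then direction - can be sketched concretely in $\mathbb R^3$. This is a minimal illustration (the helper names `dot`, `norm`, and `direction` are invented for it), not a definitive implementation:

```python
# In R^3 the standard inner product recovers the physics picture:
# the norm gives the magnitude, and normalizing gives the direction.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))   # standard inner product on R^3

def norm(u):
    return math.sqrt(dot(u, u))               # magnitude induced by the inner product

def direction(u):
    n = norm(u)
    return tuple(a / n for a in u)            # unit vector: direction, stripped of magnitude

v = (3.0, 4.0, 0.0)
assert norm(v) == 5.0                          # |v| = sqrt(3^2 + 4^2)
assert direction(v) == (0.6, 0.8, 0.0)         # v / |v|
```

A vector space without an inner product still has its points and line segments, but the functions `norm` and `direction` above simply cannot be written down for it.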


The YouTube channel 3Blue1Brown has recently put out an amazing short series on the "Essence of linear algebra". As it so happens, the first chapter is called "Vectors, what even are they?" and is an outstanding explanation, far simpler than any of the answers above: https://www.youtube.com/watch?v=fNk_zzaMoSs

While I highly recommend just watching the video (since vectors are really best understood visually), I'll try to summarize here: vectors are simply lists of numbers - that's really it. They can be used in a geometric sense (similar to the physics sense you're already familiar with) where each number represents a coordinate relative to some "axes" (formally called "basis vectors"). In the most common case the basis vectors are $\hat{i}$, a 1-unit-long vector pointing right along the x-axis (represented as $[1, 0]$), and $\hat{j}$, a 1-unit-long vector orthogonal to $\hat{i}$ pointing up along the y-axis (represented as $[0, 1]$), and vectors are then just coordinates on the plane. So in that case $[1, 1]$ is a vector pointing up and to the right, from the origin to the coordinate $(1, 1)$.

The video also goes into how vectors relate to geometric transformations of the plane (e.g. squashing, stretching, shearing, or rotating), but that's something you really need to see to understand.
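The "coordinates are coefficients on basis vectors" idea in the summary above can be sketched in a few lines (the helper names `add` and `scale` are invented for this illustration):

```python
# A vector's entries are the coefficients of the basis vectors:
# [a, b] means a*i_hat + b*j_hat in the standard basis.

def scale(c, v):
    return [c * x for x in v]

def add(u, v):
    return [a + b for a, b in zip(u, v)]

i_hat, j_hat = [1, 0], [0, 1]        # the standard basis of the plane

assert add(scale(1, i_hat), scale(1, j_hat)) == [1, 1]
# Different coordinates are just different coefficients:
assert add(scale(3, i_hat), scale(-2, j_hat)) == [3, -2]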


Your question is a nice example of how the mathematics of today works and of how certain notions emerged. There is no fancy definition of a vector in mathematics; the content that the notion of a vector was meant to carry has been shifted onto other objects: mathematicians abstracted certain properties of "objects" that appear in geometry or physics. This fits the axiomatic requirements of today's mathematics better.

Near the end of the 19th century, a "vector" was an ordered pair (A,B) of points in an affine space. This was also called a "fixed vector", where one could imagine (A,B) as an arrow beginning at point A and ending with its tip at point B. In today's differential geometry one finds some relics of this situation, where a "vector" is usually given together with the base point to which it is attached.

In mechanics, there appeared so-called "line-bound vectors": vectors considered equivalent if they differed only by a translation along the line through A and B (when A is not equal to B). "Free vectors" were vectors considered equivalent if they differed only by a translation in the affine space. Free vectors can represent translations. Translations can be composed and inverted - they form a group - and they can be scaled by multiplying with a number.

From these properties emerged what is called a "vector space". Due to the axiomatic requirements of mathematics, one puts the cart before the horse:

First, one defines - abstractly - a "vector space" (over a field (K,+,0,$\cdot$,1) ) to be a group (V,+,0) on which K acts "compatibly" via a homomorphism (of rings with unit) from the field K to the group endomorphisms of V: (K,+,0,$\cdot$,1) $\to$ (Hom$_{Grp}$(V,V),+,0,$\circ$,id$_V$).

Afterwards, one defines a "vector" to be an element of a vector space.

So the abstracted properties appear in the definition of a vector space, not in the definition of a vector.

The original geometric content of a vector appears only later, as a very special case, when a real vector space acts on a (real) affine space as its space of translations, and when these translations are depicted, e.g., by arrows.

(N.B.: Historically speaking, a vector was not even a pair of points, but could have various meanings, e.g. a pair of parallel planes in 3-dimensional affine space. It also took some time until the notion of a vector space emerged and until various fields K of scalars, or even skew fields, were admitted. Generalizing the notion of a K-vector space from a field K to a commutative ring R with unit gives today's notion of an R-module.)
