Vectors are often represented using a lowercase character such as “v”; for example:

v = (v1, v2, v3)

Where v1, v2, and v3 are scalar values, often real numbers.

Vectors are also shown using a vertical representation or a column; for example:

    ( v1 )
v = ( v2 )
    ( v3 )

It is common to represent the target variable as a vector with the lowercase “y” when describing the training of a machine learning algorithm.

It is common to introduce vectors using a geometric analogy, where a vector represents a point or coordinate in an n-dimensional space, where n is the number of dimensions (e.g. n = 2 for a point in the plane).

The vector can also be thought of as a line from the origin of the vector space with a direction and a magnitude.

These analogies are good as a starting point, but should not be held too tightly, as we often consider very high-dimensional vectors in machine learning. I find the vector-as-coordinate analogy the most compelling in machine learning.

Now that we know what a vector is, let’s look at how to define a vector in Python.

Defining a Vector

We can represent a vector in Python as a NumPy array.

A NumPy array can be created from a list of numbers. For example, below we define a vector of length 3 with the integer values 1, 2, and 3.
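A minimal sketch of this, using the conventional `numpy` import alias:

```python
# Create a vector as a NumPy array from a Python list of numbers.
import numpy as np

v = np.array([1, 2, 3])
print(v)        # [1 2 3]
print(v.shape)  # (3,) -- a one-dimensional array of length 3
```

Note that NumPy prints the vector without commas, and that `shape` reports a one-dimensional array rather than a row or column matrix.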