

Vector Mathematics in 2D
An Introduction to 2D Vectors


Prof. David Bernstein
James Madison University

Computer Science Department
bernstdh@jmu.edu


Some Notation
  • The Set of Real Numbers:
    • \(\mathbb{R}^{1} \doteq \mathbb{R} \)
  • The Set of Ordered Pairs of Real Numbers:
    • \(\mathbb{R}^{2} \doteq \mathbb{R} \times \mathbb{R}\)
Members of \(\mathbb{R}^{2}\)
  • The Notation You Are Most Familiar With:
    • \((x, y)\)
  • A More Convenient Notation:
    • \(\bs{p} = (p_{x}, p_{y})\)
  • A Still More Convenient Notation:
    • \(\bs{p} = (p_{1}, p_{2})\)
Two Things to be Aware Of
  • Orientation:
    • It is sometimes important to distinguish between "rows" and "columns"
    • An example of a "row": \(\bs{p} = [p_1 \quad p_2]\)
    • An example of a "column": \(\bs{p} = \left[ \begin{array}{c}p_1 \\ p_2 \end{array} \right]\)
    • We won't distinguish for now
  • Points and Directions:
    • It is sometimes important to distinguish between different kinds of ordered pairs (e.g., points and directions)
    • We won't distinguish for now except in visualizations
Visualization of Points

Points

images/point2d.gif
Multiplication by a Scalar
  • Definition:
    • Given \(\alpha \in \mathbb{R}\) and \(\bs{p} \in \mathbb{R}^{2}\)
    • \(\alpha \bs{p} = (\alpha p_{1}, \alpha p_{2})\)
  • Example:
    • \(2 (1,1) = (2,2)\)
  • Visualization:
    • images/scalarmultiplication2d.gif
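A minimal Python sketch of the definition, using tuples for vectors (the helper name scale is illustrative, not from the slides):

    # Scalar multiplication: multiply each component of p by alpha.
    def scale(alpha, p):
        return (alpha * p[0], alpha * p[1])

    print(scale(2, (1, 1)))  # (2, 2), matching the example above
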
Addition
  • Definition:
    • Given \(\bs{q} \in \mathbb{R}^{2}\) and \(\bs{r} \in \mathbb{R}^{2}\)
    • \(\bs{q} + \bs{r} = (q_{1}+r_{1}, q_{2}+r_{2})\)
  • Example:
    • \((1.5, -0.5) + (0.5, -1.0) = (2, -1.5)\)
  • Visualization:
    • images/addition2d.gif
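A minimal Python sketch of the definition (the helper name add is illustrative):

    # Vector addition: add corresponding components.
    def add(q, r):
        return (q[0] + r[0], q[1] + r[1])

    print(add((1.5, -0.5), (0.5, -1.0)))  # (2.0, -1.5), matching the example above
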
Subtraction
  • Definition:
    • Given \(\bs{q} \in \mathbb{R}^{2}\) and \(\bs{r} \in \mathbb{R}^{2}\)
    • \(\bs{q} - \bs{r} = (q_{1}-r_{1}, q_{2}-r_{2})\)
  • Example:
    • \((1.5, 0.5) - (0.5, 1.0) = (1.0, -0.5)\)
  • Visualization:
    • images/subtraction2d.gif
Subtraction (cont.)
  • Another Visualization:
    • images/subtraction2d-vector.gif
  • Interpretation:
    • The direction vector points from \(\bs{r}\) to \(\bs{q}\)
    • So, \(\bs{r} + (\bs{q} - \bs{r}) = \bs{q}\)
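A minimal Python sketch of subtraction and of the identity \(\bs{r} + (\bs{q} - \bs{r}) = \bs{q}\) (the helper name subtract is illustrative):

    # Vector subtraction: subtract corresponding components.
    def subtract(q, r):
        return (q[0] - r[0], q[1] - r[1])

    q, r = (1.5, 0.5), (0.5, 1.0)
    d = subtract(q, r)                       # (1.0, -0.5), the direction from r to q
    print((r[0] + d[0], r[1] + d[1]) == q)   # True: r + (q - r) = q
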
Visualization of Points and Directions Revisited
images/vector2d.gif

We will use the different visualization techniques in different situations, so you will need to pay attention to whether an ordered pair is being drawn as a point or as a direction.

Visualization of Addition Revisited
  • Example:
    • \((1.5, -0.5) + (0.5, -1.0) = (2, -1.5)\)
  • Visualization:
    • images/addition2d-vector.gif
An Example of the Visualization of Points and Directions
  • Vector Fields:
    • A vector field is a map \(F: A \subset \mathbb{R}^{2} \rightarrow \mathbb{R}^{2}\) that assigns to each point \(\bs{p} \in A\) a vector \(F(\bs{p})\)
  • Visualization of Vector Fields:
    • images/vectorfield2d.gif
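To make the definition concrete, here is a minimal Python sketch of one possible vector field; the rotational field \(F(\bs{p}) = (-p_2, p_1)\) is an illustrative choice, not necessarily the field shown in the figure:

    def F(p):
        # Assigns to each point p the vector perpendicular to it (a rotational field).
        return (-p[1], p[0])

    for p in [(1, 0), (0, 1), (-1, 0), (0, -1)]:
        print(p, "->", F(p))
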
The Standard Basis
  • Two Important Vectors:
    • \(\bs{i} = (1, 0)\)
    • \(\bs{j} = (0, 1)\)
  • A General Representation of Vectors in \(\mathbb{R}^2\):
    • Suppose \(\bs{q} = (q_{1}, q_{2})\), then
    • \(\bs{q} = q_{1} \bs{i} + q_{2} \bs{j}\)
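A quick numeric check of this representation in Python (the tuple representation is illustrative):

    i, j = (1, 0), (0, 1)
    q = (3, 4)
    # Reconstruct q as q1*i + q2*j, component by component
    recon = (q[0] * i[0] + q[1] * j[0], q[0] * i[1] + q[1] * j[1])
    print(recon == q)  # True
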
The Standard Basis (cont.)

Visualization

images/standardbasis2d.gif
The Standard Basis (cont.)
  • An Important Observation:
    • Every vector \(\bs{p}\) can be written as \(\bs{p} = \alpha \bs{i} + \beta \bs{j}\) when \(\alpha\) and \(\beta\) are chosen appropriately
  • Some Terminology:
    • \(\bs{p}\) is then said to be a linear combination of \(\bs{i}\) and \(\bs{j}\) (more later)
    • \(\bs{i}\) and \(\bs{j}\) are said to be linearly independent (more later)
The Standard Basis and the Unit Circle
images/standard-basis_unit-circle.png

Since the distance from the origin to \(\bs{p}\) is 1, it follows that: \[ \cos(\theta) = \frac{p_1}{1} \\ \sin(\theta) = \frac{p_2}{1} \]

Which is to say that \( \bs{p} := (p_1, p_2) = (\cos(\theta), \sin(\theta)) \)

Hence, any point, \(\bs{q}\), on the unit circle can be represented as: \[ \bs{q} = (1, 0) \cdot \cos(\theta) + (0, 1) \cdot \sin(\theta) = \bs{i} \cos(\theta) + \bs{j} \sin(\theta) \]
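A minimal Python sketch of the same fact, assuming an angle of 60 degrees purely for illustration:

    import math

    theta = math.radians(60)
    q = (math.cos(theta), math.sin(theta))   # a point on the unit circle
    print(math.hypot(q[0], q[1]))            # 1.0 (up to rounding)
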

The Inner (Dot) Product
  • Two Notations:
    • \(\bs{p} \cdot \bs{q}\)
    • \(\langle \bs{p}, \bs{q} \rangle\)
  • Definition:
    • Given \(\bs{p} \in \mathbb{R}^{2}\) and \(\bs{q} \in \mathbb{R}^{2}\)
    • \(\bs{p} \cdot \bs{q} = p_{1} q_{1} + p_{2} q_{2}\)
  • Example:
    • \((2,3) \cdot (4,5) = 2 (4) + 3 (5) = 8 + 15 = 23\)
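A minimal Python sketch of the definition (the helper name dot is illustrative):

    def dot(p, q):
        # Sum of the products of corresponding components.
        return p[0] * q[0] + p[1] * q[1]

    print(dot((2, 3), (4, 5)))  # 23, matching the example above
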
The Euclidean Norm (Length) of a Vector
  • An Example:
    • images/norm2d.gif
  • Pythagorean Theorem:
    • Length of \(\bs{p}\) is \(\sqrt{4^{2} + 3^2} = \sqrt{25} = 5\)
The Euclidean Norm (Length) of a Vector (cont.)
  • The (Euclidean) Length of a Vector:
    • \(\sqrt{p_{1}^{2} + p_{2}^2}\)
  • An Observation:
    • \(\bs{p} \cdot \bs{p} = p_{1}^{2} + p_{2}^2\)
  • Substituting:
    • The length of \(\bs{p}\) is \((\bs{p} \cdot \bs{p})^{\frac{1}{2}}\) which we denote by \(||\bs{p}||\)
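A minimal Python sketch, computing the norm as the square root of the dot product of \(\bs{p}\) with itself (the helper name norm is illustrative):

    import math

    def norm(p):
        # ||p|| = (p . p)^(1/2)
        return math.sqrt(p[0] * p[0] + p[1] * p[1])

    print(norm((4, 3)))  # 5.0, matching the Pythagorean example
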
Unit Vector
  • Definition:
    • A vector with a norm (i.e., length) of 1
  • Two Examples Revisited:
    • \(\bs{i} = (1, 0)\)
    • \(\bs{j} = (0, 1)\)
Normalization
  • Definition:
    • The normalization of a vector \(\bs{p}\) is \(\frac{\bs{p}}{||\bs{p}||}\) (i.e., the vector divided by its norm)
  • An Observation:
    • \(||\bs{p}||\) is a scalar so normalization involves multiplication by a scalar
  • Relationship to Unit Vectors:
    • \( \begin{align*} \left\| \frac{\bs{p}}{||\bs{p}||} \right\| &= \left(\frac{\bs{p}}{||\bs{p}||} \cdot \frac{\bs{p}}{||\bs{p}||}\right)^\frac{1}{2} \\ &= \left[ \left(\frac{1}{||\bs{p}||}p_{1}, \frac{1}{||\bs{p}||}p_{2}\right) \cdot \left(\frac{1}{||\bs{p}||}p_{1}, \frac{1}{||\bs{p}||}p_{2}\right)\right]^\frac{1}{2} \\ &= \left[\frac{1}{||\bs{p}||^{2}}p_{1}^{2} + \frac{1}{||\bs{p}||^{2}}p_{2}^{2}\right]^\frac{1}{2} \\ &= \left[\frac{p_{1}^{2} + p_{2}^{2}}{||\bs{p}||^{2}}\right]^\frac{1}{2} \\ &= \left[\frac{p_{1}^{2} + p_{2}^{2}}{\left(\bs{p} \cdot \bs{p}\right)}\right]^\frac{1}{2} \\ &= \left[\frac{p_{1}^{2} + p_{2}^{2}}{p_{1}^{2} + p_{2}^{2}}\right]^\frac{1}{2} \\ &= 1 \end{align*} \)
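A minimal Python sketch that normalizes a vector and confirms the result is a unit vector (the helper name normalize is illustrative):

    import math

    def normalize(p):
        n = math.hypot(p[0], p[1])       # ||p||
        return (p[0] / n, p[1] / n)      # multiplication by the scalar 1/||p||

    u = normalize((4, 3))
    print(u)                             # (0.8, 0.6)
    print(math.hypot(u[0], u[1]))        # 1.0 (up to rounding)
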
Generalizing the Pythagorean Theorem
  • Visualization:
    • images/lawofcosines2d.gif
  • The Law of Cosines:
    • \(||\bs{q} - \bs{r}||^{2} = ||\bs{q}||^{2} + ||\bs{r}||^{2} - 2 ||\bs{q}|| \enspace ||\bs{r}|| \cos \theta\)
The Angle Formed by Vectors
  • From the Law of Cosines:
    • \(\cos \theta = \frac{||\bs{q}||^{2} + ||\bs{r}||^{2} - ||\bs{q} - \bs{r}||^{2}}{2 ||\bs{q}|| \enspace ||\bs{r}||} = \frac{\bs{q} \cdot \bs{r}}{||\bs{q}|| \enspace ||\bs{r}||}\)
    • \(\theta = \cos^{-1} \frac{\bs{q} \cdot \bs{r}}{||\bs{q}|| \enspace ||\bs{r}||}\)
  • Interpretation:
    • The angle formed by two vectors is related to their inner product
  • Special Cases:
    • If \(\bs{q} \cdot \bs{r} = 0\) then \(\cos \theta = 0\), so the two vectors are orthogonal (i.e., the angle is 90 degrees)
    • If the two vectors are not parallel (i.e., \(\theta\) is neither 0 nor 180 degrees), they are linearly independent
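A minimal Python sketch of this formula (the helper name angle is illustrative):

    import math

    def angle(q, r):
        dot = q[0] * r[0] + q[1] * r[1]
        return math.acos(dot / (math.hypot(q[0], q[1]) * math.hypot(r[0], r[1])))

    print(math.degrees(angle((1, 0), (0, 1))))  # 90.0: the dot product is 0, so the vectors are orthogonal
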
The Angle Formed by Vectors (cont.)
  • One Perpendicular Vector:
    • images/vector2d-perp.gif
  • Definition:
    • \(\bs{q}^{\perp} = (- q_{2}, q_{1})\)
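A minimal Python sketch confirming that \(\bs{q}\) and \(\bs{q}^{\perp}\) are orthogonal (the helper name perp is illustrative):

    def perp(q):
        return (-q[1], q[0])

    q = (3, 2)
    p = perp(q)                       # (-2, 3)
    print(q[0] * p[0] + q[1] * p[1])  # 0, so the angle between them is 90 degrees
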
Weighted Combinations
  • Linear Combination (Revisited):
    • Given two linearly independent vectors \(\bs{v}\) and \(\bs{w}\) and two scalars \(\alpha\) and \(\beta\), \(\alpha \bs{v} + \beta \bs{w}\) is said to be a linear combination of \(\bs{v}\) and \(\bs{w}\)
  • Barycentric Combination:
    • A linear combination in which the weights sum to 1
    • Hence, given two points \(\bs{q}\) and \(\bs{r}\) and a scalar \(\alpha\), \(\alpha \bs{q} + (1 - \alpha) \bs{r}\) is said to be a barycentric combination of \(\bs{q}\) and \(\bs{r}\)
  • Convex Combination:
    • A linear combination in which the weights sum to 1 and are all nonnegative (so each weight lies in \([0, 1]\))
    • Hence, given two points \(\bs{q}\) and \(\bs{r}\) and a scalar \(\alpha \in [0, 1]\), \(\alpha \bs{q} + (1 - \alpha) \bs{r}\) is said to be a convex combination of \(\bs{q}\) and \(\bs{r}\)
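A minimal Python sketch of a combination of two points (the helper name combine is illustrative); when \(\alpha \in [0, 1]\) the result is a convex combination and lies on the segment between the two points:

    def combine(alpha, q, r):
        # alpha*q + (1 - alpha)*r; a convex combination when 0 <= alpha <= 1
        return (alpha * q[0] + (1 - alpha) * r[0],
                alpha * q[1] + (1 - alpha) * r[1])

    print(combine(0.5, (0.0, 0.0), (2.0, 2.0)))  # (1.0, 1.0), the midpoint
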
Weighted Combinations (cont.)

Weighted Combinations of \(\bs{q}\) and \(\bs{r}\)

images/weightedcombinations2d.gif
Weighted Combinations (cont.)

Convex Combinations of \(\bs{q}\) and \(\bs{r}\)

images/convexcombinations2d.gif
Weighted Combinations (cont.)

Barycentric Combinations of \(\bs{q}\) and \(\bs{r}\)

images/barycentriccombinations2d.gif
Weighted Combinations (cont.)
  • Barycentric Combination of Three Points:
    • Given three points \(\bs{r}\), \(\bs{s}\) and \(\bs{t}\) and three scalars \(\rho\), \(\sigma\), and \(\tau\) with \(\rho + \sigma + \tau = 1\), \(\rho \bs{r} + \sigma \bs{s} + \tau \bs{t}\) is said to be a barycentric combination of the three points
  • Visualization:
    • images/barycentriccombinations2d-3points.png
  • An Observation:
    • We can always determine one of the scalars from the other two
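A minimal Python sketch of this observation: only two weights need to be supplied, since the third is determined by the requirement that they sum to 1 (the helper name and the example triangle are illustrative):

    def barycentric(rho, sigma, r, s, t):
        tau = 1 - rho - sigma              # the third weight is determined by the other two
        return (rho * r[0] + sigma * s[0] + tau * t[0],
                rho * r[1] + sigma * s[1] + tau * t[1])

    # Weights 0.5, 0.25, 0.25 give a point inside the triangle (0,0), (4,0), (0,4)
    print(barycentric(0.5, 0.25, (0.0, 0.0), (4.0, 0.0), (0.0, 4.0)))  # (1.0, 1.0)
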
There's Always More to Learn