Description

This course is aimed at majors in mathematics, the physical sciences, and engineering, and at other students interested in applications of mathematics to their disciplines. Linear algebra is among the most widely used areas of advanced mathematics in industry and science. A key idea is the mathematical modeling of a problem via systems of linear equations.

Desired Learning Outcomes

Prerequisites

Minimal Learning Outcomes

Upon completion of this course, the successful student will be able to:

Use Gaussian elimination to do all of the following: solve a linear system via reduced row echelon form, solve a linear system via row echelon form and back substitution, find the inverse of a given matrix, and find the determinant of a given matrix.
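As one possible illustration, the elimination-based computations above can be sketched with NumPy (the 3×3 system is ours and purely illustrative; `np.linalg.solve` performs LU-based elimination internally rather than textbook row reduction):

```python
import numpy as np

# Illustrative 3x3 system Ax = b
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])

x = np.linalg.solve(A, b)   # solve the system by elimination (LU)
A_inv = np.linalg.inv(A)    # inverse, also computed via elimination
det_A = np.linalg.det(A)    # determinant from the LU factorization
```

For this particular system the solution is (2, 3, -1) and the determinant is -1; by hand one would reach the same answers by row reduction.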

Demonstrate proficiency at matrix algebra. For matrix multiplication demonstrate understanding of the associative law, the reverse order law for inverses and transposes, and the failure of the commutative law and the cancellation law.
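These algebraic laws can be checked numerically; a minimal sketch with random matrices (illustrative only — almost all square matrices are invertible and fail to commute, which is what makes the last check meaningful):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

assoc = np.allclose((A @ B) @ C, A @ (B @ C))       # associative law holds
rev_T = np.allclose((A @ B).T, B.T @ A.T)           # reverse order law: transpose
rev_inv = np.allclose(np.linalg.inv(A @ B),
                      np.linalg.inv(B) @ np.linalg.inv(A))  # reverse order law: inverse
commutes = np.allclose(A @ B, B @ A)                # generically False
```

Note that a numerical check is evidence, not proof; the laws themselves are established in the course.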

Use Cramer's rule to solve a linear system.
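A direct sketch of Cramer's rule for a square system with nonzero determinant (the function name and the example system are ours):

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's rule: x_i = det(A_i) / det(A),
    where A_i is A with column i replaced by b."""
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b
        x[i] = np.linalg.det(Ai) / d
    return x

x = cramer_solve(np.array([[1.0, 2.0], [3.0, 5.0]]),
                 np.array([5.0, 13.0]))   # solution (1, 2)
```

Cramer's rule is computationally expensive for large systems, but it makes the determinant-based structure of the solution explicit.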

Use cofactors to find the inverse of a given matrix and the determinant of a given matrix.
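The cofactor computations can be sketched directly from the definitions (function names are ours; this recursive expansion is for understanding, not efficiency):

```python
import numpy as np

def minor(A, i, j):
    """Delete row i and column j of A."""
    return np.delete(np.delete(A, i, axis=0), j, axis=1)

def det_cofactor(A):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    return sum((-1) ** j * A[0, j] * det_cofactor(minor(A, 0, j))
               for j in range(n))

def inverse_adjugate(A):
    """Inverse via the adjugate: A^{-1} = adj(A) / det(A), where the
    (i, j) cofactor is (-1)^{i+j} det(minor(A, i, j))."""
    n = A.shape[0]
    C = np.array([[(-1) ** (i + j) * det_cofactor(minor(A, i, j))
                   for j in range(n)] for i in range(n)])
    return C.T / det_cofactor(A)
```

Cofactor expansion costs O(n!) and is practical only for small matrices; Gaussian elimination is the workhorse for larger ones.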

Determine whether a set with a given notion of addition and scalar multiplication is a vector space. Here, and in the relevant outcomes below, be familiar with both finite- and infinite-dimensional examples.

Determine whether a given subset of a vector space is a subspace.

Determine whether a given set of vectors is linearly independent, spans, or is a basis.
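For vectors in R^n, one standard criterion is the rank test: a list of vectors is independent iff the rank of the matrix they form equals the number of vectors. A minimal sketch (function names are ours):

```python
import numpy as np

def is_independent(vectors):
    """Vectors are linearly independent iff the rank of the matrix
    whose rows they form equals the number of vectors."""
    M = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(M) == M.shape[0]

def is_basis_of_Rn(vectors):
    """n independent vectors in R^n form a basis."""
    M = np.array(vectors, dtype=float)
    return M.shape[0] == M.shape[1] and np.linalg.matrix_rank(M) == M.shape[0]
```

In abstract or infinite-dimensional spaces the definition (only the trivial linear combination gives zero) must be applied directly; the rank test is a finite-dimensional coordinate computation.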

Determine the dimension of a given vector space or of a given subspace.

Find bases for the null space, row space, and column space of a given matrix, and determine its rank.
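By hand these bases come from row reduction; numerically, the SVD gives orthonormal bases for all three subspaces at once. A sketch (the tolerance and the helper name are ours; the bases differ from the RREF ones but span the same subspaces):

```python
import numpy as np

def fundamental_subspaces(A, tol=1e-10):
    """Orthonormal bases (as rows/columns) for the column space, row
    space, and null space of A via the SVD; rank = number of singular
    values above tol."""
    U, s, Vt = np.linalg.svd(A)
    r = int(np.sum(s > tol))
    col_basis = U[:, :r]       # columns span the column space
    row_basis = Vt[:r, :]      # rows span the row space
    null_basis = Vt[r:, :]     # rows span the null space
    return col_basis, row_basis, null_basis, r
```

For hand computation, the pivot columns of A give a column-space basis, the nonzero rows of the RREF give a row-space basis, and the free-variable solutions give a null-space basis.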

Demonstrate understanding of the Rank-Nullity Theorem and its applications.
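The theorem states rank(A) + nullity(A) = number of columns of A; a one-matrix check (the matrix is illustrative, with third row equal to the sum of the first two):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 1.0, 2.0]])   # row 3 = row 1 + row 2

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank            # Rank-Nullity: rank + nullity = # columns
```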

Given a description of a linear transformation, find its matrix representation relative to given bases.
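A classic worked example: the derivative map D on polynomials of degree at most 2, relative to the basis {1, x, x^2}. Column j of the matrix holds the coordinates of D applied to the j-th basis vector (D(1) = 0, D(x) = 1, D(x^2) = 2x):

```python
import numpy as np

# Matrix of D: P_2 -> P_2 relative to the basis {1, x, x^2}
D = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0],
              [0.0, 0.0, 0.0]])

p = np.array([5.0, 3.0, 4.0])   # coordinates of 5 + 3x + 4x^2
dp = D @ p                       # coordinates of p' = 3 + 8x
```

Applying the map is then just matrix-vector multiplication in coordinates.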

Demonstrate understanding of the relationship between similarity and change of basis.
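Concretely, if P has the new basis vectors as its columns, then B = P^{-1} A P represents the same linear map in the new basis. A small illustrative example where the new basis consists of eigenvectors, so B comes out diagonal:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0,  1.0],
              [1.0, -2.0]])          # columns: new basis (eigenvectors of A)

B = np.linalg.inv(P) @ A @ P          # same map, expressed in the new basis
```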

Find the norm of a vector and the angle between two vectors in an inner product space.
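A sketch for R^n with the standard dot product (for other inner product spaces, replace the dot product with the given inner product; the clip guards against floating-point values just outside [-1, 1]):

```python
import numpy as np

def angle_between(u, v):
    """Angle theta with cos(theta) = <u, v> / (||u|| ||v||)."""
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))
```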

Use the inner product to express a vector in an inner product space as a linear combination of an orthogonal set of vectors.
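For an orthogonal set {u_i}, the coefficients are c_i = <v, u_i> / <u_i, u_i>, with no system of equations to solve. A sketch for the dot product on R^n (function name is ours):

```python
import numpy as np

def orthogonal_coordinates(v, basis):
    """Coefficients c_i = <v, u_i> / <u_i, u_i> for an orthogonal
    basis {u_i}, so that v = sum_i c_i * u_i."""
    return np.array([np.dot(v, u) / np.dot(u, u) for u in basis])
```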

Find the orthogonal complement of a given subspace.

Demonstrate understanding of the relationships among the row space, column space, and null space of a matrix (and of its transpose) via orthogonal complements.

Demonstrate understanding of the Cauchy-Schwarz inequality and its applications.

Determine whether a vector space with a (sesquilinear) form is an inner product space.

Use the Gram-Schmidt process to find an orthonormal basis of an inner product space. Be capable of doing this both in R^n and in function spaces that are inner product spaces.
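A minimal sketch of classical Gram-Schmidt for R^n with the dot product (function name is ours; for a function space, replace `np.dot` with the given inner product, e.g. an integral):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list: subtract each new
    vector's projections onto the earlier basis vectors, then normalize."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in basis:
            w = w - np.dot(w, u) * u   # remove component along u
        basis.append(w / np.linalg.norm(w))
    return basis
```

In floating point, classical Gram-Schmidt can lose orthogonality; the modified variant (or a QR factorization) is preferred numerically.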

Use least squares to fit a line (y = ax + b) to a table of data, plot the line and data points, and explain the meaning of least squares in terms of orthogonal projection.
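A sketch of the line fit (the data table is illustrative; plotting with a library such as matplotlib is omitted here). Least squares orthogonally projects the data vector y onto the subspace spanned by the columns of the design matrix:

```python
import numpy as np

# Illustrative data table (x_i, y_i)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 4.0])

# Design matrix for y ~ a*x + b: columns span the model subspace
M = np.column_stack([x, np.ones_like(x)])

# lstsq returns the coefficients of the orthogonal projection of y
# onto the column space of M
(a, b), *_ = np.linalg.lstsq(M, y, rcond=None)
```

The residual vector y - M(a, b) is orthogonal to both columns of M, which is exactly the normal-equations condition.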

Use the idea of least squares to find orthogonal projections onto subspaces and for polynomial curve fitting.
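The same idea in general (function name is ours): the least-squares solution of Ac = y gives the orthogonal projection of y onto the column space of A, and polynomial fitting is the special case where A has Vandermonde columns 1, x, x^2, ….

```python
import numpy as np

def project_onto(A, y):
    """Orthogonal projection of y onto the column space of A:
    solve the normal equations A^T A c = A^T y, return A c."""
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ c
```

For degree-k polynomial fitting one can take `A = np.vander(x, k + 1)` and project the data vector y onto its column space.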