Linear Algebra is a powerful tool that allows scientists to model physical phenomena and efficiently solve the systems of equations that arise from them. It is essential to almost every field of mathematics, including geometry, topology, number theory, and abstract algebra. As such, it is not only computationally useful, with applications in data science, quantum mechanics, statistics, computer science, and all branches of engineering, but it is also a great opportunity for the student to explore the beauty and elegance of mathematics.

A Portrait of Linear Algebra introduces the student to the algorithms and structures of this subject with rigorous definitions, clear explanations, and interesting examples. The exercises include both computational and theoretical problems that will challenge the student’s understanding.

You will not just see how Linear Algebra works but, more importantly, why it works.

The structures of Linear Algebra – vector spaces, matrices, linear transformations, and their interrelated properties – are built from the field axioms of the real number system. This enables us to prove almost all the theorems discussed in the text, with many exercises that allow the student to develop their own proof-writing skills.

A Portrait of Linear Algebra rigorously prepares the student to tackle more advanced techniques and applications of Linear Algebra, whatever their field of interest may be. There are also many projects that direct the student to go deeper into the subject, such as using projections to draw a three-dimensional object, rotating a vector in space around an arbitrary axis, and simultaneously diagonalizing commuting matrices.

Chapter Zero. The Language of Mathematics: Sets, Axioms, Theorems, and Proofs

Chapter One. The Canvas of Linear Algebra: Euclidean Spaces and Systems of Linear Equations
1.1 The Main Subject: Euclidean Spaces
1.2 The Span of a Set of Vectors
1.3 Euclidean Geometry
1.4 Systems of Linear Equations
1.5 The Gauss-Jordan Algorithm
1.6 Types of Linear Systems

Chapter Two. Peeling the Onion: Subspaces of Euclidean Spaces
2.1 Linear Dependence and Independence
2.2 Introduction to Subspaces
2.3 The Fundamental Matrix Spaces
2.4 The Dot Product and Orthogonality
2.5 Orthogonal Complements
2.6 Full-Rank Systems and Dependent Systems

Chapter Three. Adding Movement and Colors: Linear Transformations on Euclidean Spaces
3.1 Mapping Spaces: Introduction to Linear Transformations
3.2 Rotations, Projections and Reflections
3.3 Operations on Linear Transformations and Matrices
3.4 Properties of Operations on Linear Transformations and Matrices
3.5 The Kernel and Range; One-to-One and Onto Transformations
3.6 Invertible Operators and Matrices
3.7 Finding the Inverse of a Matrix
3.8 Conditions for Invertibility

Chapter Four. From The Real to The Abstract: General Vector Spaces
4.1 Axioms for a Vector Space
4.2 Linearity Properties for Finite Sets of Vectors
4.3 A Primer on Infinite Sets
4.4 Linearity Properties for Infinite Sets of Vectors
4.5 Subspaces, Basis, and Dimension
4.6 Diagonal, Triangular, and Symmetric Matrices

Chapter Five. Movement in the Abstract: Linear Transformations of General Vector Spaces
5.1 Introduction to General Linear Transformations
5.2 Coordinate Vectors and Matrices for Linear Transformations
5.3 One-to-One and Onto Linear Transformations; Compositions of Linear Transformations
5.4 Isomorphisms

Chapter Six. Operations on Subspaces: The Isomorphism Theorems
6.1 The Join and Intersection of Two Subspaces
6.2 Restricting Linear Transformations and the Role of the Rowspace
6.3 The Image and Preimage of Subspaces
6.4 Cosets and Quotient Spaces
6.5 The Three Isomorphism Theorems of Emmy Noether

Chapter Seven. From Square to Scalar: Permutation Theory and Determinants
7.1 Permutations and The Determinant Concept
7.2 A General Determinant Formula
7.3 Computational Tools and Properties of Determinants
7.4 The Adjugate Matrix and Cramer’s Rule
7.5 The Wronskian

Chapter Eight. Painting the Lines: Eigentheory, Diagonalization, and Similarity
8.1 The Eigentheory of Square Matrices
8.2 Computational Techniques for Eigentheory
8.3 Diagonalization of Square Matrices
8.4 Change of Basis and Linear Transformations on Euclidean Spaces
8.5 Change of Basis for Abstract Spaces and Determinants for Operators
8.6 Similarity and The Eigentheory of Operators
8.7 The Exponential of a Matrix

Chapter Nine. Geometry in the Abstract: Inner Product Spaces
9.1 Axioms for an Inner Product Space
9.2 Geometric Constructions in Inner Product Spaces
9.3 Orthonormal Sets and The Gram-Schmidt Algorithm
9.4 Orthogonal Complements and Decompositions
9.5 Orthonormal Bases and Projection Operators
9.6 Orthogonal Matrices
9.7 Orthogonal Diagonalization of Symmetric Matrices

Chapter Ten. Imagine That: Complex Spaces and The Spectral Theorems
10.1 The Field of Complex Numbers
10.2 Complex Vector Spaces
10.3 Complex Inner Products
10.4 Complex Linear Transformations and The Adjoint
10.5 Normal Matrices
10.6 Schur’s Lemma and The Spectral Theorems
10.7 Simultaneous Diagonalization

Glossary of Symbols
Subject Index

Jude Socrates received his Ph.D. in Mathematics (Number Theory) from the California Institute of Technology in 1993, under the direction of Prof. Dinakar Ramakrishnan. He has been on the full-time faculty of Pasadena City College since that year.

Professor of Mathematics, Pasadena City College