Linear Algebra and Its Applications, Sixth Edition
By David C. Lay, Steven R. Lay, and Judi J. McDonald
Contents:
About the Authors 3
Preface 12
A Note to Students 22
Chapter 1 Linear Equations in Linear Algebra 25
INTRODUCTORY EXAMPLE: Linear Models in Economics
and Engineering 25
1.1 Systems of Linear Equations 26
1.2 Row Reduction and Echelon Forms 37
1.3 Vector Equations 50
1.4 The Matrix Equation Ax = b 61
1.5 Solution Sets of Linear Systems 69
1.6 Applications of Linear Systems 77
1.7 Linear Independence 84
1.8 Introduction to Linear Transformations 91
1.9 The Matrix of a Linear Transformation 99
1.10 Linear Models in Business, Science, and Engineering 109
Projects 117
Supplementary Exercises 117
Chapter 2 Matrix Algebra 121
INTRODUCTORY EXAMPLE: Computer Models in Aircraft Design 121
2.1 Matrix Operations 122
2.2 The Inverse of a Matrix 135
2.3 Characterizations of Invertible Matrices 145
2.4 Partitioned Matrices 150
2.5 Matrix Factorizations 156
2.6 The Leontief Input–Output Model 165
2.7 Applications to Computer Graphics 171
2.8 Subspaces of Rⁿ 179
2.9 Dimension and Rank 186
Projects 193
Supplementary Exercises 193
Chapter 3 Determinants 195
INTRODUCTORY EXAMPLE: Weighing Diamonds 195
3.1 Introduction to Determinants 196
3.2 Properties of Determinants 203
3.3 Cramer’s Rule, Volume, and Linear Transformations 212
Projects 221
Supplementary Exercises 221
Chapter 4 Vector Spaces 225
INTRODUCTORY EXAMPLE: Discrete-Time Signals and Digital
Signal Processing 225
4.1 Vector Spaces and Subspaces 226
4.2 Null Spaces, Column Spaces, Row Spaces, and Linear
Transformations 235
4.3 Linearly Independent Sets; Bases 246
4.4 Coordinate Systems 255
4.5 The Dimension of a Vector Space 265
4.6 Change of Basis 273
4.7 Digital Signal Processing 279
4.8 Applications to Difference Equations 286
Projects 295
Supplementary Exercises 295
Chapter 5 Eigenvalues and Eigenvectors 297
INTRODUCTORY EXAMPLE: Dynamical Systems and Spotted Owls 297
5.1 Eigenvectors and Eigenvalues 298
5.2 The Characteristic Equation 306
5.3 Diagonalization 314
5.4 Eigenvectors and Linear Transformations 321
5.5 Complex Eigenvalues 328
5.6 Discrete Dynamical Systems 335
5.7 Applications to Differential Equations 345
5.8 Iterative Estimates for Eigenvalues 353
5.9 Applications to Markov Chains 359
Projects 369
Supplementary Exercises 369
Chapter 6 Orthogonality and Least Squares 373
INTRODUCTORY EXAMPLE: Artificial Intelligence and Machine
Learning 373
6.1 Inner Product, Length, and Orthogonality 374
6.2 Orthogonal Sets 382
6.3 Orthogonal Projections 391
6.4 The Gram–Schmidt Process 400
6.5 Least-Squares Problems 406
6.6 Machine Learning and Linear Models 414
6.7 Inner Product Spaces 423
6.8 Applications of Inner Product Spaces 431
Projects 437
Supplementary Exercises 438
Chapter 7 Symmetric Matrices and Quadratic Forms 441
INTRODUCTORY EXAMPLE: Multichannel Image Processing 441
7.1 Diagonalization of Symmetric Matrices 443
7.2 Quadratic Forms 449
7.3 Constrained Optimization 456
7.4 The Singular Value Decomposition 463
7.5 Applications to Image Processing and Statistics 473
Projects 481
Supplementary Exercises 481
Chapter 8 The Geometry of Vector Spaces 483
INTRODUCTORY EXAMPLE: The Platonic Solids 483
8.1 Affine Combinations 484
8.2 Affine Independence 493
8.3 Convex Combinations 503
8.4 Hyperplanes 510
8.5 Polytopes 519
8.6 Curves and Surfaces 531
Project 542
Supplementary Exercises 543
Chapter 9 Optimization 545
INTRODUCTORY EXAMPLE: The Berlin Airlift 545
9.1 Matrix Games 546
9.2 Linear Programming–Geometric Method 560
9.3 Linear Programming–Simplex Method 570
9.4 Duality 585
Project 594
Supplementary Exercises 594
Chapter 10 Finite-State Markov Chains C-1
(Available Online)
INTRODUCTORY EXAMPLE: Googling Markov Chains C-1
10.1 Introduction and Examples C-2
10.2 The Steady-State Vector and Google’s PageRank C-13
10.3 Communication Classes C-25
10.4 Classification of States and Periodicity C-33
10.5 The Fundamental Matrix C-42
10.6 Markov Chains and Baseball Statistics C-54
Appendixes
A Uniqueness of the Reduced Echelon Form 597
B Complex Numbers 599
Credits 604
Glossary 605
Answers to Odd-Numbered Exercises A-1
Index I-1
Preface:
The response of students and teachers to the first five editions of Linear Algebra and Its Applications has been most gratifying. This Sixth Edition provides substantial support both for teaching and for using technology in the course. As before, the text provides a modern elementary introduction to linear algebra and a broad selection of interesting classical and leading-edge applications. The material is accessible to students with the maturity that should come from successful completion of two semesters of college-level mathematics, usually calculus.
The main goal of the text is to help students master the basic concepts and skills they will use later in their careers. The topics here follow the recommendations of the original Linear Algebra Curriculum Study Group (LACSG), which were based on a careful investigation of the real needs of the students and a consensus among professionals in many disciplines that use linear algebra. Ideas being discussed by the second Linear Algebra Curriculum Study Group (LACSG 2.0) have also been included. We hope this course will be one of the most useful and interesting mathematics classes taken by undergraduates.
What’s New in This Edition
The Sixth Edition has exciting new material, examples, and online resources. After talking with high-tech industry researchers and colleagues in applied areas, we added new topics, vignettes, and applications with the intention of highlighting for students and faculty the linear algebraic foundational material for machine learning, artificial intelligence, data science, and digital signal processing.
Content Changes
- Since matrix multiplication is a highly useful skill, we added new examples in Chapter 2 to show how matrix multiplication is used to identify patterns and scrub data. Corresponding exercises have been created to allow students to explore using matrix multiplication in various ways.
- In our conversations with colleagues in industry and electrical engineering, we heard repeatedly how important understanding abstract vector spaces is to their work. After reading the reviewers’ comments for Chapter 4, we reorganized the chapter, condensing some of the material on column, row, and null spaces; moving Markov chains to the end of Chapter 5; and creating a new section on signal processing. We view signals as an infinite-dimensional vector space and illustrate the usefulness of linear transformations to filter out unwanted “vectors” (a.k.a. noise), analyze data, and enhance signals.
- By moving Markov chains to the end of Chapter 5, we can now discuss the steady-state vector as an eigenvector (a short numerical sketch of this idea follows the list below). We also reorganized some of the summary material on determinants and change of basis to be more specific to the way they are used in this chapter.
- In Chapter 6, we present pattern recognition as an application of orthogonality, and the section on linear models now illustrates how machine learning relates to curve fitting.
- Chapter 9 on optimization was previously available only as an online file. It has now been moved into the regular textbook, where it is more readily available to faculty and students. After an opening section on finding optimal strategies for two-person zero-sum games, the rest of the chapter presents an introduction to linear programming: from two-dimensional problems that can be solved geometrically to higher-dimensional problems that are solved using the Simplex Method.
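As a small illustration of the steady-state idea mentioned above, the following sketch is not taken from the text; it assumes Python with NumPy and a made-up 2x2 column-stochastic transition matrix P, and it recovers the steady-state vector as an eigenvector of P for the eigenvalue 1.

```python
# Hedged sketch (not from the text): the steady-state vector of a Markov
# chain is an eigenvector of its transition matrix for the eigenvalue 1.
import numpy as np

P = np.array([[0.9, 0.2],    # hypothetical column-stochastic
              [0.1, 0.8]])   # transition matrix

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmin(np.abs(eigvals - 1))   # locate the eigenvalue closest to 1
q = np.real(eigvecs[:, k])
q = q / q.sum()                      # rescale so the entries sum to 1

print(q)          # steady-state vector, approximately [0.667, 0.333]
print(P @ q - q)  # essentially zero, since Pq = q
```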
Other Changes
- In the high-tech industry, where most computations are done on computers, judging the validity of information and computations is an important step in preparing and analyzing data. In this edition, students are encouraged to learn to analyze their own computations to see if they are consistent with the data at hand and the questions being asked. For this reason, we have added “Reasonable Answers” advice and exercises to guide students.
- We have added a list of projects to the end of each chapter (available online and in MyLab Math). Some of these projects were previously available online; their themes range from using linear transformations to create art to exploring additional ideas in mathematics. They can be used for group work or to enhance the learning of individual students.
- PowerPoint lecture slides have been updated to cover all sections of the text and cover them more thoroughly.
Distinctive Features
Early Introduction of Key Concepts
Many fundamental ideas of linear algebra are introduced within the first seven lectures, in the concrete setting of Rⁿ, and then gradually examined from different points of view. Later generalizations of these concepts appear as natural extensions of familiar ideas, visualized through the geometric intuition developed in Chapter 1. A major achievement of this text is that the level of difficulty is fairly even throughout the course.
A Modern View of Matrix Multiplication
Good notation is crucial, and the text reflects the way scientists and engineers actually use linear algebra in practice. The definitions and proofs focus on the columns of a matrix rather than on the matrix entries. A central theme is to view a matrix–vector product Ax as a linear combination of the columns of A. This modern approach simplifies many arguments, and it ties vector space ideas into the study of linear systems.
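A quick numerical check of this viewpoint, assuming Python with NumPy (the matrix and vector below are made up purely for illustration):

```python
# Hedged sketch: Ax equals the linear combination of the columns of A
# weighted by the entries of x.
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
x = np.array([10, -1])

column_combination = x[0] * A[:, 0] + x[1] * A[:, 1]   # x1*a1 + x2*a2
print(np.array_equal(A @ x, column_combination))        # prints True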
Linear Transformations
Linear transformations form a “thread” that is woven into the fabric of the text. Their use enhances the geometric flavor of the text. In Chapter 1, for instance, linear transformations provide a dynamic and graphical view of matrix–vector multiplication.
Eigenvalues and Dynamical Systems
Eigenvalues appear fairly early in the text, in Chapters 5 and 7. Because this material is spread over several weeks, students have more time than usual to absorb and review these critical concepts. Eigenvalues are motivated by and applied to discrete and continuous dynamical systems, which appear in Sections 1.10, 4.8, and 5.9, and in five sections of Chapter 5. Some courses reach Chapter 5 after about five weeks by covering Sections 2.8 and 2.9 instead of Chapter 4. These two optional sections present all the vector space concepts from Chapter 4 needed for Chapter 5.
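To give a flavor of the connection between eigenvalues and discrete dynamical systems, here is a minimal sketch, not from the text, in Python with NumPy using a made-up 2x2 stage matrix; the long-run growth factor of the system x_{k+1} = A x_k approaches the dominant eigenvalue of A.

```python
# Hedged sketch: iterate x_{k+1} = A x_k and compare the observed growth
# factor with the dominant eigenvalue of A.
import numpy as np

A = np.array([[1.1, 0.2],
              [0.1, 0.9]])          # hypothetical stage matrix
x = np.array([10.0, 5.0])           # hypothetical initial state

for _ in range(40):
    x_next = A @ x
    growth = np.linalg.norm(x_next) / np.linalg.norm(x)
    x = x_next

print(growth)                           # approaches about 1.173
print(np.max(np.abs(np.linalg.eigvals(A))))  # dominant eigenvalue, about 1.173
```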
Orthogonality and Least-Squares Problems
These topics receive a more comprehensive treatment than is commonly found in beginning texts. The original Linear Algebra Curriculum Study Group has emphasized the need for a substantial unit on orthogonality and least-squares problems, because orthogonality plays such an important role in computer calculations and numerical linear algebra and because inconsistent linear systems arise so often in practical work.
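For readers who want a concrete picture of why inconsistent systems matter, the following sketch is not from the text; it assumes Python with NumPy and made-up data, and it computes the least-squares solution of an inconsistent system Ax = b.

```python
# Hedged sketch: an inconsistent system Ax = b has no exact solution,
# but the least-squares solution x_hat minimizes ||b - Ax||.
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])          # hypothetical design matrix (fit a line)
b = np.array([1.0, 2.1, 2.9])       # hypothetical data, not exactly on a line

x_hat, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(x_hat)      # intercept and slope of the best-fit line
print(A @ x_hat)  # the projection of b onto the column space of A
```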
Pedagogical Features
Applications
A broad selection of applications illustrates the power of linear algebra to explain fundamental principles and simplify calculations in engineering, computer science, mathematics, physics, biology, economics, and statistics. Some applications appear in separate sections; others are treated in examples and exercises. In addition, each chapter opens with an introductory vignette that sets the stage for some application of linear algebra and provides a motivation for developing the mathematics that follows.
A Strong Geometric Emphasis
Every major concept in the course is given a geometric interpretation, because many students learn better when they can visualize an idea. There are substantially more drawings here than usual, and some of the figures have never before appeared in a linear algebra text. Interactive versions of many of these figures appear in MyLab Math.
Examples
This text devotes a larger proportion of its expository material to examples than do most linear algebra texts. There are more examples than an instructor would ordinarily present in class. But because the examples are written carefully, with lots of detail, students can read them on their own.