Linear Algebra and Matrix Analysis for Statistics


List price: 107,98 €
Discounted price: 102,58 €
PUBLISHER'S DESCRIPTION
Linear Algebra and Matrix Analysis for Statistics offers a gradual exposition to linear algebra without sacrificing the rigor of the subject. It presents both the vector space approach and the canonical forms in matrix theory. The book is as self-contained as possible, assuming no prior knowledge of linear algebra.

The authors first address the rudimentary mechanics of linear systems using Gaussian elimination and the resulting decompositions. They introduce Euclidean vector spaces using less abstract concepts and make connections to systems of linear equations wherever possible. After illustrating the importance of the rank of a matrix, they discuss complementary subspaces, oblique projectors, orthogonality, orthogonal projections and projectors, and orthogonal reduction. The text then shows how the theoretical concepts developed are handy in analyzing solutions for linear systems. The authors also explain how determinants are useful for characterizing and deriving properties concerning matrices and linear systems. They then cover eigenvalues, eigenvectors, singular value decomposition, Jordan decomposition (including a proof), quadratic forms, and Kronecker and Hadamard products.

The book concludes with accessible treatments of advanced topics, such as linear iterative systems, convergence of matrices, more general vector spaces, linear transformations, and Hilbert spaces.
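As a taste of the book's opening material, the following is a minimal sketch (not taken from the text) of Gaussian elimination with partial pivoting for solving a square linear system, the first technique the description mentions; the function name `solve` and the example system are illustrative only.

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    A is a list of n rows (each a list of n floats); b is a list of n floats.
    """
    n = len(A)
    # Build the augmented matrix [A | b] so row operations act on both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for k in range(n):
        # Partial pivoting: bring the largest remaining pivot into row k.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# Example: 2x + y = 3, x + 3y = 4 has the solution x = y = 1.
print(solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 4.0]))
```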

CONTENTS
  • Matrices, Vectors, and Their Operations: Basic definitions and notations; Matrix addition and scalar-matrix multiplication; Matrix multiplication; Partitioned matrices; The "trace" of a square matrix; Some special matrices
  • Systems of Linear Equations: Introduction; Gaussian elimination; Gauss-Jordan elimination; Elementary matrices; Homogeneous linear systems; The inverse of a matrix
  • More on Linear Equations: The LU decomposition; Crout's algorithm; LU decomposition with row interchanges; The LDU and Cholesky factorizations; Inverse of partitioned matrices; The LDU decomposition for partitioned matrices; The Sherman-Woodbury-Morrison formula
  • Euclidean Spaces: Introduction; Vector addition and scalar multiplication; Linear spaces and subspaces; Intersection and sum of subspaces; Linear combinations and spans; Four fundamental subspaces; Linear independence; Basis and dimension
  • The Rank of a Matrix: Rank and nullity of a matrix; Bases for the four fundamental subspaces; Rank and inverse; Rank factorization; The rank-normal form; Rank of a partitioned matrix; Bases for the fundamental subspaces using the rank-normal form
  • Complementary Subspaces: Sum of subspaces; The dimension of the sum of subspaces; Direct sums and complements; Projectors
  • Orthogonality, Orthogonal Subspaces, and Projections: Inner product, norms, and orthogonality; Row rank = column rank: a proof using orthogonality; Orthogonal projections; Gram-Schmidt orthogonalization; Orthocomplementary subspaces; The fundamental theorem of linear algebra
  • More on Orthogonality: Orthogonal matrices; The QR decomposition; Orthogonal projection and projector; Orthogonal projector: alternative derivations; Sum of orthogonal projectors; Orthogonal triangularization
  • Revisiting Linear Equations: Introduction; Null spaces and the general solution of linear systems; Rank and linear systems; Generalized inverse of a matrix; Generalized inverses and linear systems; The Moore-Penrose inverse
  • Determinants: Definitions; Some basic properties of determinants; Determinant of products; Computing determinants; The determinant of the transpose of a matrix, revisited; Determinants of partitioned matrices; Cofactors and expansion theorems; The minor and the rank of a matrix; The Cauchy-Binet formula; The Laplace expansion
  • Eigenvalues and Eigenvectors: Characteristic polynomial and its roots; Spectral decomposition of real symmetric matrices; Spectral decomposition of Hermitian and normal matrices; Further results on eigenvalues; Singular value decomposition
  • Singular Value and Jordan Decompositions: Singular value decomposition (SVD); The SVD and the four fundamental subspaces; SVD and linear systems; SVD, data compression, and principal components; Computing the SVD; The Jordan canonical form; Implications of the Jordan canonical form
  • Quadratic Forms: Introduction; Quadratic forms; Matrices in quadratic forms; Positive and nonnegative definite matrices; Congruence and Sylvester's law of inertia; Nonnegative definite matrices and minors; Extrema of quadratic forms; Simultaneous diagonalization
  • The Kronecker Product and Related Operations: Bilinear interpolation and the Kronecker product; Basic properties of Kronecker products; Inverses, rank, and nonsingularity of Kronecker products; Matrix factorizations for Kronecker products; Eigenvalues and determinant; The vec and commutator operators; Linear systems involving Kronecker products; Sylvester's equation and the Kronecker sum; The Hadamard product
  • Linear Iterative Systems, Norms, and Convergence: Linear iterative systems and convergence of matrix powers; Vector norms; Spectral radius and matrix convergence; Matrix norms and the Gerschgorin circles; SVD, revisited; Web page ranking and Markov chains; Iterative algorithms for solving linear equations
  • Abstract Linear Algebra: General vector spaces; General inner products; Linear transformations, adjoint, and rank; The four fundamental subspaces, revisited; Inverses of linear transformations; Linear transformations and matrices; Change of bases, equivalence, and similar matrices; Hilbert spaces
  • References

Exercises appear at the end of each chapter.
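The closing chapters tie convergence of linear iterative systems to the spectral radius of a matrix. As an illustrative sketch (not from the book, and the helper name `spectral_radius` is hypothetical), power iteration with infinity-norm scaling estimates the dominant eigenvalue magnitude for a matrix whose dominant eigenvector has nonnegative entries, as in the symmetric example below:

```python
def spectral_radius(A, iters=200):
    """Estimate the spectral radius of A by power iteration.

    Assumes the dominant eigenvalue is real, simple, and attained by an
    eigenvector with nonnegative entries (true for the example below).
    """
    n = len(A)
    x = [1.0] * n          # arbitrary nonzero starting vector
    lam = 0.0
    for _ in range(iters):
        # One multiplication by A, then rescale by the infinity norm.
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        lam = max(abs(v) for v in y)
        x = [v / lam for v in y]
    return lam

# [[2, 1], [1, 2]] has eigenvalues 1 and 3, so its spectral radius is 3;
# powers of A / 3 stay bounded, while powers of A itself diverge.
print(spectral_radius([[2.0, 1.0], [1.0, 2.0]]))
```

The estimate governs the book's convergence criterion: the matrix powers A^k tend to zero exactly when the spectral radius is below 1.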

OTHER INFORMATION
  • Condition: New
  • ISBN: 9781420095388
  • Series: Chapman & Hall/CRC Texts in Statistical Science
  • Dimensions: 9 × 6 in; weight: 2.05 lb
  • Format: Hardcover
  • Illustrations: 10 b/w images and 3 tables
  • Pages: 580