Module: Eigenvalues, Multiplicities, and Diagonalization

Review of Previous Concepts

  • Eigenvalues and Eigenvectors Calculation:

    • To find all eigenvalues for a given matrix, calculate the roots of its characteristic polynomial.

    • For each eigenvalue, calculate its corresponding eigenvectors by finding the null space of the matrix (A − λI).

    • The basis of this null space forms the eigenspace for that eigenvalue.
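The procedure above can be sketched with NumPy. This is a minimal illustration (the matrix A = [[4, 1], [2, 3]] is an invented example, not from the notes): it builds the characteristic polynomial, takes its roots as the eigenvalues, and recovers each eigenspace as the null space of (A − λI).

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of the characteristic polynomial of A (highest degree first);
# its roots are the eigenvalues. For this A: lambda^2 - 7*lambda + 10.
coeffs = np.poly(A)
eigenvalues = np.roots(coeffs)        # roots are 5 and 2

# For each eigenvalue, the eigenspace is the null space of (A - lambda*I).
# Here it is recovered from the SVD: the right-singular vectors belonging
# to (near-)zero singular values span the null space.
for lam in eigenvalues:
    M = A - lam * np.eye(2)
    _, s, Vt = np.linalg.svd(M)
    basis = Vt[s < 1e-10].T           # columns form a basis of the eigenspace
    # Check the defining property: A v = lambda v for each basis vector.
    assert np.allclose(A @ basis, lam * basis)
```

For larger matrices this root-finding route is numerically fragile; `np.linalg.eig` computes the same eigenvalues and eigenvectors directly.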

  • Characteristic Equation and Polynomial:

    • For an n × n matrix, the characteristic polynomial is obtained by calculating the determinant of (A − λI), where λ is the unknown.

    • Finding roots of high-degree polynomials can be computationally challenging, especially for large matrices (e.g., 100,000 × 100,000).

    • In practical applications, numerical or approximative algorithms are often used to compute eigenvalues for large matrices, but these algorithms are built on the theoretical foundations learned in this course.

  • Triangular Matrices:

    • Triangular matrices are convenient for finding eigenvalues because the eigenvalues are simply the entries on the main diagonal.

    • This is because the determinant of a triangular matrix is the product of its diagonal entries, which means the characteristic polynomial comes "pre-factored," revealing the roots immediately.

  • Distinguishing Matrices with the Same Eigenvalues:

    • Two different matrices can have the same set of eigenvalues but still possess distinct properties.

    • For example, their characteristic polynomials might not be identical if the multiplicities of the eigenvalues differ.

    • Example: a matrix with eigenvalues −1 (multiplicity 2) and 4 (multiplicity 1) versus another matrix with eigenvalues −1 (multiplicity 1) and 4 (multiplicity 2). This difference highlights the concept of algebraic multiplicity.

Algebraic Multiplicity (AM)

  • Definition: For an eigenvalue λ̄ of a matrix A, its algebraic multiplicity (denoted AM(λ̄)) is the number of times that (λ − λ̄) appears as a factor in the characteristic polynomial P_A(λ).

    • Here, λ is the unknown in the polynomial, and λ̄ is a specific eigenvalue.

  • Example (from slides): If P_A(λ) = (λ + 1)²(λ − 4), or equivalently (λ − (−1))²(λ − 4), then

    • AM(−1) = 2

    • AM(4) = 1

  • Example with a high-degree polynomial: Given a characteristic polynomial λ⁴(λ² − 2λ + 1):

    • This can be factored as λ⁴(λ − 1)², or equivalently (λ − 0)⁴(λ − 1)².

    • Eigenvalue 0 has AM(0) = 4.

    • Eigenvalue 1 has AM(1) = 2.

  • Rotation Matrix Example:

    • The characteristic polynomial of a rotation matrix (e.g., [[0, −1], [1, 0]]) is λ² + 1.

    • The characteristic equation λ² + 1 = 0 has no real solutions.

    • It has the complex solutions i and −i.

    • Therefore, this matrix has no real eigenvalues, but it has complex eigenvalues i and −i, with AM(i) = 1 and AM(−i) = 1.

  • Fundamental Theorem of Algebra:

    • If a polynomial has degree n and complex roots are allowed, then it has exactly n roots, counted with their multiplicities.

    • Consequently, an n × n matrix always has exactly n complex eigenvalues when each is counted according to its algebraic multiplicity.

Geometric Multiplicity (GM)

  • Motivation: Even if two matrices have the same eigenvalues with the same algebraic multiplicities (i.e., the same characteristic polynomial), their eigenproperties might still differ. This difference is related to the dimensions of their corresponding eigenspaces.

  • Definition: For an eigenvalue λ̄ of a matrix A, its geometric multiplicity (denoted GM(λ̄)) is the dimension of its eigenspace, i.e., the dimension of the null space of (A − λ̄I).

  • Example (Matrices A and B):

    • Consider the matrices A = [[−2, 1], [0, −2]] and B = [[−2, 0], [0, −2]].

    • Both have characteristic polynomial (−2 − λ)², or equivalently (λ + 2)².

    • Both have eigenvalue −2 with AM(−2) = 2.

    • However, their geometric multiplicities differ:

      • For matrix A: the null space of (A − (−2)I) = A + 2I = [[0, 1], [0, 0]] has dimension 1 (one free variable). So GM_A(−2) = 1.

      • For matrix B: the null space of (B − (−2)I) = B + 2I = [[0, 0], [0, 0]] has dimension 2 (two free variables). So GM_B(−2) = 2.

  • Theorem: For an n × n matrix A and an eigenvalue λ:  1 ≤ GM(λ) ≤ AM(λ) ≤ n.

    • This means the geometric multiplicity is always less than or equal to the algebraic multiplicity, and both are at most n.

Diagonalization and Markov Chains

  • Goal: Construct a basis of Rⁿ (or R² in the example) that consists entirely of eigenvectors of a given matrix.

  • Markov Chain Application: Eigenvalues and eigenvectors can simplify the analysis of system evolution, such as Markov chains.

    • Problem: Calculate Xₖ = Tᵏ X₀ for a large k (e.g., k = 100), where T is a transition matrix and X₀ is an initial distribution vector.

    • Steady State: For regular stochastic matrices, the system tends toward a unique steady state q, where Tq = q. This implies that λ = 1 is always an eigenvalue, and q is its corresponding eigenvector (rescaled to be a probability vector).

    • Simplified Calculation Using Eigenvectors:

      1. Find all eigenvalues and their corresponding eigenvectors for the matrix T.

      2. If the eigenvectors form a basis for the space (e.g., R²), express the initial distribution vector X₀ as a linear combination of these eigenvectors: X₀ = c₁v₁ + c₂v₂.

      3. Then Xₖ = Tᵏ X₀ = Tᵏ(c₁v₁ + c₂v₂).

      4. Due to the eigenvector property, this simplifies to Xₖ = c₁Tᵏv₁ + c₂Tᵏv₂ = c₁λ₁ᵏv₁ + c₂λ₂ᵏv₂.

      5. For Markov chains, λ₁ = 1 (corresponding to the steady-state eigenvector v₁). The other eigenvalues (e.g., λ₂) have magnitude less than 1 (e.g., 0.92).

      6. The expression then becomes Xₖ = c₁v₁ + c₂(0.92)ᵏv₂.

      7. As k → ∞, (0.92)ᵏ → 0, so Xₖ → c₁v₁.

      8. The term c₁v₁ is the steady state q. This provides a closed-form solution for Xₖ and clarifies the long-term behavior: the eigenvalue largest in magnitude dominates the calculation.

The Diagonalization Theorem

  • Purpose: Provides a way to factor a matrix A if it admits an eigenvector basis.

  • Assumptions:

    1. A is an n × n matrix.

    2. Let λ₁, …, λₙ be the eigenvalues of A, where each eigenvalue is repeated according to its algebraic multiplicity.

    3. Let v₁, …, vₙ be corresponding eigenvectors that form a basis of Rⁿ.

  • Conclusion: If these assumptions hold, then A can be expressed as A = PDP⁻¹, where:

    • P is an invertible matrix whose columns are the eigenvectors v₁, …, vₙ.

    • D is a diagonal matrix with the eigenvalues λ₁, …, λₙ on its main diagonal (in the same order as their corresponding eigenvectors in P).

  • Significance Later: While this theorem shows how to construct A from its eigen-elements, a more powerful aspect (discussed later) is that if A can be written in this form, one can immediately identify its eigenvectors and eigenvalues from P and D.
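The AM/GM distinction for the matrices A and B above can be checked numerically. This sketch uses a hypothetical helper `multiplicities` (not from the notes): AM is estimated by counting repeated eigenvalues, and GM is computed as n minus the rank of (M − λI), which equals the null-space dimension.

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [0.0, -2.0]])
B = np.array([[-2.0, 0.0],
              [0.0, -2.0]])

def multiplicities(M, lam, tol=1e-8):
    """Return (AM, GM) of eigenvalue lam for matrix M, estimated numerically."""
    n = M.shape[0]
    eigs = np.linalg.eigvals(M)
    # AM: count how many computed eigenvalues coincide with lam
    # (loose tolerance, since repeated roots are computed less accurately).
    am = int(np.sum(np.abs(eigs - lam) < tol**0.5))
    # GM: dim null(M - lam*I) = n - rank(M - lam*I) by the rank-nullity theorem.
    gm = n - np.linalg.matrix_rank(M - lam * np.eye(n), tol=tol)
    return am, gm

print(multiplicities(A, -2))  # (2, 1): eigenspace of A is only a line
print(multiplicities(B, -2))  # (2, 2): eigenspace of B is all of R^2
```

Both matrices share the characteristic polynomial (λ + 2)², yet only B has a full eigenvector basis, which is exactly what the theorem 1 ≤ GM ≤ AM ≤ n leaves room for.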
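The Markov-chain steps can also be sketched in code. The transition matrix T and initial vector X₀ below are invented for illustration; T was chosen so its eigenvalues are 1 and 0.92, matching the 0.92 used in the notes.

```python
import numpy as np

# A regular stochastic transition matrix (columns sum to 1),
# with eigenvalues 1 and 0.92.
T = np.array([[0.95, 0.03],
              [0.05, 0.97]])
X0 = np.array([0.6, 0.4])        # initial probability distribution

eigvals, P = np.linalg.eig(T)    # columns of P are the eigenvectors v1, v2
c = np.linalg.solve(P, X0)       # coordinates of X0 in that basis: X0 = c1*v1 + c2*v2

def Xk(k):
    """Closed form X_k = c1*lambda1^k*v1 + c2*lambda2^k*v2."""
    return P @ (c * eigvals**k)

# The closed form agrees with repeated multiplication by T...
assert np.allclose(Xk(100), np.linalg.matrix_power(T, 100) @ X0)

# ...and for large k the 0.92^k term dies out, leaving the steady state q = c1*v1.
steady = Xk(10_000)
assert np.allclose(T @ steady, steady)
print(steady)   # approximately [0.375, 0.625]
```

The eigendecomposition replaces k matrix multiplications with k-th powers of two scalars, which is what makes the long-term behavior transparent.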
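Finally, the Diagonalization Theorem can be verified directly for a small diagonalizable matrix (again an invented 2×2 example): build P from the eigenvectors and D from the eigenvalues in the same order, and check A = PDP⁻¹ and its converse reading.

```python
import numpy as np

# A diagonalizable example with eigenvalues 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)   # column j of P pairs with eigvals[j]
D = np.diag(eigvals)            # eigenvalues on the diagonal, same order as in P

# The Diagonalization Theorem: A = P D P^{-1}.
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Read the other way: column j of P is an eigenvector for eigenvalue D[j, j].
for j in range(2):
    assert np.allclose(A @ P[:, j], D[j, j] * P[:, j])
```

This is the "more powerful aspect" mentioned above: once A is written as PDP⁻¹, its eigenvalues and eigenvectors can be read off from D and P with no further computation.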