18.6 problem 6

Internal problem ID [12835]
Internal file name [OUTPUT/11488_Saturday_November_04_2023_08_47_35_AM_69163266/index.tex]

Book: Ordinary Differential Equations by Charles E. Roberts, Jr. CRC Press. 2010
Section: Chapter 8. Linear Systems of First-Order Differential Equations. Exercises 8.2 page 362
Problem number: 6.
ODE order: 4.
ODE degree: 1.

The type(s) of ODE detected by this program: "find eigenvalues and eigenvectors"

Find the eigenvalues and associated eigenvectors of the matrix \[ \left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] \] The first step is to determine the characteristic polynomial of the matrix in order to find the eigenvalues of the matrix \(A\). This is given by \begin {align*} \det (A-\lambda I) &= 0 \\ \det \left (\left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] - \lambda \left [\begin {array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end {array}\right ]\right ) &= 0 \\ \det \left [\begin {array}{cccc} 1-\lambda & 1 & 1 & 1 \\ 1 & 1-\lambda & 1 & 1 \\ 1 & 1 & 1-\lambda & 1 \\ 1 & 1 & 1 & 1-\lambda \end {array}\right ] &= 0 \\ \lambda ^{4}-4 \lambda ^{3} &= 0 \end {align*}

The eigenvalues are the roots of the above characteristic polynomial. Solving for the roots gives \begin {align*} \lambda _1&=4\\ \lambda _2&=0\\ \lambda _3&=0\\ \lambda _4&=0 \end {align*}
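As a quick numerical sanity check (not part of the textbook solution), the characteristic polynomial can be verified by evaluating \(\det (A-\lambda I)\) at several values of \(\lambda \) and comparing against \(\lambda ^{4}-4\lambda ^{3}\). The sketch below uses a small cofactor-expansion determinant in plain Python; the helper name `det` is my own.

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0.0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

# A is the 4x4 all-ones matrix from the problem.
A = [[1.0] * 4 for _ in range(4)]

# det(A - lam*I) should agree with lam^4 - 4*lam^3 for every lam.
for lam in [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0, 4.0, 5.0]:
    M = [[A[i][j] - (lam if i == j else 0.0) for j in range(4)]
         for i in range(4)]
    assert abs(det(M) - (lam ** 4 - 4 * lam ** 3)) < 1e-9
```

In particular the roots \(\lambda = 0\) and \(\lambda = 4\) make the determinant vanish, confirming the eigenvalues listed above.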

This table summarises the above result
\[ \begin {array}{ccc} \text {eigenvalue} & \text {algebraic multiplicity} & \text {type of eigenvalue} \\ \hline 0 & 3 & \text {real eigenvalue} \\ 4 & 1 & \text {real eigenvalue} \end {array} \]

For each eigenvalue \(\lambda \) found above, we now find the corresponding eigenvector. Considering \(\lambda = 0\)

We now need to determine the eigenvector \(\boldsymbol {v}\) where \begin {align*} A \boldsymbol {v} &= \lambda \boldsymbol {v} \\ A \boldsymbol {v} - \lambda \boldsymbol {v} &= \boldsymbol {0} \\ (A - \lambda I ) \boldsymbol {v} &= \boldsymbol {0} \\ \left (\left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] - \left (0\right ) \left [\begin {array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end {array}\right ] \right ) \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] &= \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \\ \left (\left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] - \left [\begin {array}{cccc} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end {array}\right ] \right ) \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] &= \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \\ \left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] &= \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \end {align*}

We will now do Gaussian elimination in order to solve for the eigenvector. The augmented matrix is \[ \left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} 1&1&1&1&0\\ 1&1&1&1&0\\ 1&1&1&1&0\\ 1&1&1&1&0 \end {array} \right ] \] \begin {align*} R_{2} = R_{2}-R_{1} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} 1&1&1&1&0\\ 0&0&0&0&0\\ 1&1&1&1&0\\ 1&1&1&1&0 \end {array} \right ] \end {align*}

\begin {align*} R_{3} = R_{3}-R_{1} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} 1&1&1&1&0\\ 0&0&0&0&0\\ 0&0&0&0&0\\ 1&1&1&1&0 \end {array} \right ] \end {align*}

\begin {align*} R_{4} = R_{4}-R_{1} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} 1&1&1&1&0\\ 0&0&0&0&0\\ 0&0&0&0&0\\ 0&0&0&0&0 \end {array} \right ] \end {align*}
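The three row operations above can be replayed programmatically. This is a plain-Python sketch of the same elimination, included only as a check; the augmented matrix `M` mirrors the one shown above.

```python
# Augmented matrix [A | 0] for lambda = 0.
M = [[1, 1, 1, 1, 0],
     [1, 1, 1, 1, 0],
     [1, 1, 1, 1, 0],
     [1, 1, 1, 1, 0]]

# R2 = R2 - R1, R3 = R3 - R1, R4 = R4 - R1.
for i in [1, 2, 3]:
    M[i] = [a - b for a, b in zip(M[i], M[0])]

# All rows below the first are now zero.
assert M == [[1, 1, 1, 1, 0],
             [0, 0, 0, 0, 0],
             [0, 0, 0, 0, 0],
             [0, 0, 0, 0, 0]]
```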

Therefore the system in echelon form is \[ \left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \end {array}\right ] \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] = \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \] The free variables are \(\{v_{2}, v_{3}, v_{4}\}\) and the leading variable is \(\{v_{1}\}\). Let \(v_{2} = t\), \(v_{3} = s\), and \(v_{4} = r\). We now start back substitution. Solving the above equation for the leading variable in terms of the free variables gives \(v_{1} = -t - s - r\).

Hence the solution is \[ \left [\begin {array}{c} -t -s -r \\ t \\ s \\ r \end {array}\right ] \] Since there are three free variables, we have found three eigenvectors associated with this eigenvalue. The above can be written as \begin {align*} \left [\begin {array}{c} -t -s -r \\ t \\ s \\ r \end {array}\right ] &= \left [\begin {array}{c} -t \\ t \\ 0 \\ 0 \end {array}\right ] + \left [\begin {array}{c} -s \\ 0 \\ s \\ 0 \end {array}\right ] + \left [\begin {array}{c} -r \\ 0 \\ 0 \\ r \end {array}\right ]\\ &= t \left [\begin {array}{c} -1 \\ 1 \\ 0 \\ 0 \end {array}\right ] + s \left [\begin {array}{c} -1 \\ 0 \\ 1 \\ 0 \end {array}\right ] + r \left [\begin {array}{c} -1 \\ 0 \\ 0 \\ 1 \end {array}\right ] \end {align*}

By letting \(t = 1\), \(s = 1\), and \(r = 1\) the above becomes \[ \left [\begin {array}{c} -3 \\ 1 \\ 1 \\ 1 \end {array}\right ] = \left [\begin {array}{c} -1 \\ 1 \\ 0 \\ 0 \end {array}\right ] + \left [\begin {array}{c} -1 \\ 0 \\ 1 \\ 0 \end {array}\right ]+ \left [\begin {array}{c} -1 \\ 0 \\ 0 \\ 1 \end {array}\right ] \] Hence the three eigenvectors associated with this eigenvalue are \[ \left (\left [\begin {array}{c} -1 \\ 1 \\ 0 \\ 0 \end {array}\right ],\left [\begin {array}{c} -1 \\ 0 \\ 1 \\ 0 \end {array}\right ],\left [\begin {array}{c} -1 \\ 0 \\ 0 \\ 1 \end {array}\right ]\right ) \] Considering \(\lambda = 4\)
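Each of the three eigenvectors for \(\lambda = 0\) can be checked directly: they must lie in the null space of \(A\), i.e. satisfy \(A \boldsymbol {v} = \boldsymbol {0}\). A minimal plain-Python check (the helper `matvec` is my own):

```python
def matvec(M, v):
    """Matrix-vector product with plain lists."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# A is the 4x4 all-ones matrix from the problem.
A = [[1] * 4 for _ in range(4)]

eigvecs_zero = [
    [-1, 1, 0, 0],
    [-1, 0, 1, 0],
    [-1, 0, 0, 1],
]

# A v = 0 for each claimed lambda = 0 eigenvector.
for v in eigvecs_zero:
    assert matvec(A, v) == [0, 0, 0, 0]
```

Intuitively, each row of \(A\) sums the entries of \(\boldsymbol {v}\), and each of these vectors has entries summing to zero.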

We now need to determine the eigenvector \(\boldsymbol {v}\) where \begin {align*} A \boldsymbol {v} &= \lambda \boldsymbol {v} \\ A \boldsymbol {v} - \lambda \boldsymbol {v} &= \boldsymbol {0} \\ (A - \lambda I ) \boldsymbol {v} &= \boldsymbol {0} \\ \left (\left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] - \left (4\right ) \left [\begin {array}{cccc} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end {array}\right ] \right ) \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] &= \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \\ \left (\left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ] - \left [\begin {array}{cccc} 4 & 0 & 0 & 0 \\ 0 & 4 & 0 & 0 \\ 0 & 0 & 4 & 0 \\ 0 & 0 & 0 & 4 \end {array}\right ] \right ) \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] &= \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \\ \left [\begin {array}{cccc} -3 & 1 & 1 & 1 \\ 1 & -3 & 1 & 1 \\ 1 & 1 & -3 & 1 \\ 1 & 1 & 1 & -3 \end {array}\right ] \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] &= \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \end {align*}

We will now do Gaussian elimination in order to solve for the eigenvector. The augmented matrix is \[ \left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 1&-3&1&1&0\\ 1&1&-3&1&0\\ 1&1&1&-3&0 \end {array} \right ] \] \begin {align*} R_{2} = R_{2}+\frac {R_{1}}{3} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 0&-{\frac {8}{3}}&{\frac {4}{3}}&{\frac {4}{3}}&0\\ 1&1&-3&1&0\\ 1&1&1&-3&0 \end {array} \right ] \end {align*}

\begin {align*} R_{3} = R_{3}+\frac {R_{1}}{3} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 0&-{\frac {8}{3}}&{\frac {4}{3}}&{\frac {4}{3}}&0\\ 0&{\frac {4}{3}}&-{\frac {8}{3}}&{\frac {4}{3}}&0\\ 1&1&1&-3&0 \end {array} \right ] \end {align*}

\begin {align*} R_{4} = R_{4}+\frac {R_{1}}{3} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 0&-{\frac {8}{3}}&{\frac {4}{3}}&{\frac {4}{3}}&0\\ 0&{\frac {4}{3}}&-{\frac {8}{3}}&{\frac {4}{3}}&0\\ 0&{\frac {4}{3}}&{\frac {4}{3}}&-{\frac {8}{3}}&0 \end {array} \right ] \end {align*}

\begin {align*} R_{3} = R_{3}+\frac {R_{2}}{2} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 0&-{\frac {8}{3}}&{\frac {4}{3}}&{\frac {4}{3}}&0\\ 0&0&-2&2&0\\ 0&{\frac {4}{3}}&{\frac {4}{3}}&-{\frac {8}{3}}&0 \end {array} \right ] \end {align*}

\begin {align*} R_{4} = R_{4}+\frac {R_{2}}{2} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 0&-{\frac {8}{3}}&{\frac {4}{3}}&{\frac {4}{3}}&0\\ 0&0&-2&2&0\\ 0&0&2&-2&0 \end {array} \right ] \end {align*}

\begin {align*} R_{4} = R_{4}+R_{3} &\Longrightarrow \hspace {5pt}\left [\begin {array}{@{}cccc!{\ifdefined \HCode |\else \color {red}\vline width 0.6pt\fi }c@{}} -3&1&1&1&0\\ 0&-{\frac {8}{3}}&{\frac {4}{3}}&{\frac {4}{3}}&0\\ 0&0&-2&2&0\\ 0&0&0&0&0 \end {array} \right ] \end {align*}
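The elimination steps for \(\lambda = 4\) can likewise be replayed programmatically. This sketch uses Python's `fractions.Fraction` so the intermediate entries \(-\frac {8}{3}\), \(\frac {4}{3}\), etc. come out exactly; the helper `add_multiple` is my own.

```python
from fractions import Fraction as F

# Augmented matrix [A - 4I | 0], with exact rational entries.
M = [[F(x) for x in row] for row in
     [[-3,  1,  1,  1, 0],
      [ 1, -3,  1,  1, 0],
      [ 1,  1, -3,  1, 0],
      [ 1,  1,  1, -3, 0]]]

def add_multiple(M, target, source, factor):
    """Row operation: R_target = R_target + factor * R_source."""
    M[target] = [a + factor * b for a, b in zip(M[target], M[source])]

add_multiple(M, 1, 0, F(1, 3))   # R2 = R2 + R1/3
add_multiple(M, 2, 0, F(1, 3))   # R3 = R3 + R1/3
add_multiple(M, 3, 0, F(1, 3))   # R4 = R4 + R1/3
add_multiple(M, 2, 1, F(1, 2))   # R3 = R3 + R2/2
add_multiple(M, 3, 1, F(1, 2))   # R4 = R4 + R2/2
add_multiple(M, 3, 2, F(1))      # R4 = R4 + R3

# Final echelon form matches the one derived above.
assert M[1] == [0, F(-8, 3), F(4, 3), F(4, 3), 0]
assert M[2] == [0, 0, -2, 2, 0]
assert M[3] == [0, 0, 0, 0, 0]
```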

Therefore the system in echelon form is \[ \left [\begin {array}{cccc} -3 & 1 & 1 & 1 \\ 0 & -\frac {8}{3} & \frac {4}{3} & \frac {4}{3} \\ 0 & 0 & -2 & 2 \\ 0 & 0 & 0 & 0 \end {array}\right ] \left [\begin {array}{c} v_{1} \\ v_{2} \\ v_{3} \\ v_{4} \end {array}\right ] = \left [\begin {array}{c} 0 \\ 0 \\ 0 \\ 0 \end {array}\right ] \] The free variable is \(\{v_{4}\}\) and the leading variables are \(\{v_{1}, v_{2}, v_{3}\}\). Let \(v_{4} = t\). We now start back substitution. Solving the above equation for the leading variables in terms of the free variable gives \(v_{1} = t\), \(v_{2} = t\), \(v_{3} = t\).

Hence the solution is \[ \left [\begin {array}{c} t \\ t \\ t \\ t \end {array}\right ] \] Since there is one free variable, we have found one eigenvector associated with this eigenvalue. The above can be written as \[ \left [\begin {array}{c} t \\ t \\ t \\ t \end {array}\right ] = t \left [\begin {array}{c} 1 \\ 1 \\ 1 \\ 1 \end {array}\right ] \] Or, by letting \(t = 1\), the eigenvector is \[ \left [\begin {array}{c} 1 \\ 1 \\ 1 \\ 1 \end {array}\right ] \] The following table summarises the result found above.
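As with the \(\lambda = 0\) case, this eigenvector can be checked directly: \(A \boldsymbol {v}\) should equal \(4 \boldsymbol {v}\). A minimal plain-Python check (the helper `matvec` is my own):

```python
def matvec(M, v):
    """Matrix-vector product with plain lists."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# A is the 4x4 all-ones matrix; each row of A v sums the entries of v.
A = [[1] * 4 for _ in range(4)]
v = [1, 1, 1, 1]

# A v = [4, 4, 4, 4] = 4 v, so v is an eigenvector for lambda = 4.
assert matvec(A, v) == [4 * x for x in v]
```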

\[ \begin {array}{ccccc} \lambda & \text {algebraic multiplicity} & \text {geometric multiplicity} & \text {defective eigenvalue?} & \text {associated eigenvectors} \\ \hline 0 & 3 & 3 & \text {No} & \left [\begin {array}{c} -1 \\ 1 \\ 0 \\ 0 \end {array}\right ], \left [\begin {array}{c} -1 \\ 0 \\ 1 \\ 0 \end {array}\right ], \left [\begin {array}{c} -1 \\ 0 \\ 0 \\ 1 \end {array}\right ] \\ 4 & 1 & 1 & \text {No} & \left [\begin {array}{c} 1 \\ 1 \\ 1 \\ 1 \end {array}\right ] \end {array} \]

Since the matrix is not defective, it is diagonalizable. Let \(P\) be the matrix whose columns are the eigenvectors found above, and let \(D\) be the diagonal matrix with the eigenvalues on its diagonal. Then we can write \[ A = P D P^{-1} \] where \begin {align*} D &= \left [\begin {array}{cccc} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 4 \end {array}\right ]\\ P &= \left [\begin {array}{cccc} -1 & -1 & -1 & 1 \\ 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 1 \end {array}\right ] \end {align*}

Therefore \[ \left [\begin {array}{cccc} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end {array}\right ]=\left [\begin {array}{cccc} -1 & -1 & -1 & 1 \\ 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 1 \end {array}\right ] \left [\begin {array}{cccc} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 4 \end {array}\right ] \left [\begin {array}{cccc} -1 & -1 & -1 & 1 \\ 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & 1 \end {array}\right ]^{-1} \]
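The diagonalization can be checked without computing \(P^{-1}\) by verifying the equivalent identity \(A P = P D\). A minimal plain-Python sketch (the helper `matmul` is my own):

```python
def matmul(X, Y):
    """Matrix product with plain lists."""
    n, m, p = len(X), len(Y), len(Y[0])
    return [[sum(X[i][k] * Y[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1] * 4 for _ in range(4)]
P = [[-1, -1, -1, 1],
     [ 1,  0,  0, 1],
     [ 0,  1,  0, 1],
     [ 0,  0,  1, 1]]
D = [[0, 0, 0, 0],
     [0, 0, 0, 0],
     [0, 0, 0, 0],
     [0, 0, 0, 4]]

# A P and P D both equal the matrix whose every row is [0, 0, 0, 4].
assert matmul(A, P) == matmul(P, D)
```

Checking \(A P = P D\) column by column is the same as checking \(A \boldsymbol {v}_i = \lambda _i \boldsymbol {v}_i\) for each eigenpair, which is why it confirms the factorization.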