Show that the characteristic polynomial is the same as the minimal polynomial

Let $$A =\begin{pmatrix}0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a\end{pmatrix}$$
Show that the characteristic and minimal polynomials of $A$ are the same.




I have already computed the characteristic polynomial



$$p_A(x)=x^3-ax^2-bx-c$$



and I know that if I could show that the eigenspaces of $A$ all have dimension $1$, I would be done. The problem is that solving for the eigenvalues of this (very general) cubic is difficult (albeit possible), which means it would be difficult to find bases for the eigenspaces.



A hint would be appreciated.










  • How about calculating $\det(xI-A)$? – Rodrigo Dias, May 17 at 0:29










  • @zz20s You wrote that you've found the minimal polynomial via computation. Did you mean the characteristic polynomial? – Theo Bendit, May 17 at 0:31










  • You said the minimal polynomial has degree $3$. – J. W. Tanner, May 17 at 0:31










  • Oh, yes, sorry, that should say characteristic. – zz20s, May 17 at 0:32















linear-algebra matrices eigenvalues-eigenvectors jordan-normal-form minimal-polynomials






asked May 17 at 0:19 by zz20s; edited May 18 at 23:56 by Rodrigo de Azevedo











3 Answers


















Answer by Theo Bendit (score 9), answered May 17 at 0:41, edited May 17 at 14:09:

Compute:
$$A^2 = \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2\end{pmatrix}.$$
So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
$$A^2 = pA + qI \iff \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2\end{pmatrix} = p\begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a\end{pmatrix} + q\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1\end{pmatrix}$$
for $p$ and $q$. In particular, examining the bottom-left entry, we get
$$1 = 0p + 0q,$$
which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic in $A$ equals the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and hence equal to the characteristic polynomial.
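As a quick sanity check (an editor's sketch, not part of the original answer), the independence argument can be verified symbolically with SymPy, keeping $a, b, c$ as free symbols:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
A = sp.Matrix([[0, 0, c], [1, 0, b], [0, 1, a]])

# Stack the flattened matrices I, A, A^2 as the rows of a 3x9 matrix;
# rank 3 means they are linearly independent, so no nonzero polynomial
# of degree < 3 annihilates A.
M = sp.Matrix([list(sp.eye(3)), list(A), list(A * A)])
print(M.rank())  # 3

# Cayley-Hamilton: the characteristic polynomial does annihilate A.
Z = (A**3 - a*A**2 - b*A - c*sp.eye(3)).applyfunc(sp.expand)
print(Z == sp.zeros(3, 3))  # True
```

Together the two checks pin the minimal polynomial down to degree exactly $3$, matching the answer's conclusion.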






  • Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"? – zz20s, May 17 at 0:44






  • Nice solution. Your argument can be rewritten as: if $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns, we get $$\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$ – N. S., May 17 at 0:45










  • @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there is no polynomial $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$. – Theo Bendit, May 17 at 0:48











  • Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix? – zz20s, May 17 at 0:54










  • It's a method that should work every time, provided you can solve the equations. If you're given an $n \times n$ matrix $A$ whose minimal polynomial you wish to show equals its characteristic polynomial, then this is equivalent to showing $I, A, A^2, \ldots, A^{n-1}$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could essentially be read off three entries (cf. N. S.'s comment). – Theo Bendit, May 17 at 1:01


















Answer (score 4):

The form of $A$ has a special name: it is the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.

For the standard basis $e_1,e_2,e_3$, one finds that $Ae_1=e_2$ and $Ae_2=e_3$, so $e_1,Ae_1,A^2e_1$ forms a basis.

The general context is the companion $n\times n$ matrix of the polynomial $$p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates of $v$ by $A$ form a basis for $\mathbb{R}^n$. As others point out, this suffices to show that the minimal polynomial is the same as the characteristic polynomial.
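To make the cyclic-vector observation concrete, here is a small NumPy sketch (an editor's illustration, using the same column-layout companion matrix as the $A$ above, with hypothetical numeric coefficients):

```python
import numpy as np

def companion(coeffs):
    """Companion matrix, in the layout of the answer, of
    x^n - c_{n-1} x^{n-1} - ... - c_1 x - c_0,
    where coeffs = [c_0, c_1, ..., c_{n-1}]."""
    n = len(coeffs)
    A = np.zeros((n, n))
    A[1:, :-1] = np.eye(n - 1)  # subdiagonal of ones: A e_k = e_{k+1}
    A[:, -1] = coeffs           # last column holds the coefficients
    return A

coeffs = [2.0, -1.0, 3.0]       # hypothetical example: c_0, c_1, c_2
A = companion(coeffs)
e1 = np.eye(3)[:, 0]

# Krylov matrix [e1, A e1, A^2 e1]; full rank means e1 is a cyclic vector.
K = np.column_stack([e1, A @ e1, A @ A @ e1])
print(np.linalg.matrix_rank(K))  # 3
```

Note that for this layout $K$ is always the identity, whatever the coefficients: that is exactly why $e_1$ is a cyclic vector for every companion matrix.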






Answer (score 3):

Assuming you already know, by Cayley–Hamilton, that $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

  • Let $e_1, e_2, e_3$ denote the canonical basis $\Rightarrow Ae_1=e_2,\ Ae_2 = e_3 \Rightarrow A^2e_1 = e_3$

Now, assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.

Applying $m(A)$ to $e_1$ gives
$$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix}0 \\ 0 \\ 0\end{pmatrix} \quad \mbox{Contradiction!}$$
The linear combination cannot equal the zero vector, as the coefficient of the basis vector $e_3$ is $1$.
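The contradiction can also be checked symbolically (an editor's sketch; $u, v$ are the coefficients of the hypothetical quadratic):

```python
import sympy as sp

a, b, c, u, v = sp.symbols('a b c u v')
A = sp.Matrix([[0, 0, c], [1, 0, b], [0, 1, a]])
e1 = sp.Matrix([1, 0, 0])

# Apply the hypothetical quadratic m(x) = x^2 + u x + v to A, then to e1.
w = (A**2 + u*A + v*sp.eye(3)) * e1
print(w.T)  # Matrix([[v, u, 1]]) -- the e3 coefficient is 1, never zero
```

Since the third component of $m(A)e_1$ is identically $1$, no choice of $u, v$ (or of $a, b, c$) can make it the zero vector.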






    share|cite|improve this answer









    $endgroup$













      Your Answer








      StackExchange.ready(function()
      var channelOptions =
      tags: "".split(" "),
      id: "69"
      ;
      initTagRenderer("".split(" "), "".split(" "), channelOptions);

      StackExchange.using("externalEditor", function()
      // Have to fire editor after snippets, if snippets enabled
      if (StackExchange.settings.snippets.snippetsEnabled)
      StackExchange.using("snippets", function()
      createEditor();
      );

      else
      createEditor();

      );

      function createEditor()
      StackExchange.prepareEditor(
      heartbeatType: 'answer',
      autoActivateHeartbeat: false,
      convertImagesToLinks: true,
      noModals: true,
      showLowRepImageUploadWarning: true,
      reputationToPostImages: 10,
      bindNavPrevention: true,
      postfix: "",
      imageUploader:
      brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
      contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
      allowUrls: true
      ,
      noCode: true, onDemand: true,
      discardSelector: ".discard-answer"
      ,immediatelyShowMarkdownHelp:true
      );



      );













      draft saved

      draft discarded


















      StackExchange.ready(
      function ()
      StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3229044%2fshow-that-the-characteristic-polynomial-is-the-same-as-the-minimal-polynomial%23new-answer', 'question_page');

      );

      Post as a guest















      Required, but never shown

























      3 Answers
      3






      active

      oldest

      votes








      3 Answers
      3






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      9












      $begingroup$

      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & aendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial






      share|cite|improve this answer











      $endgroup$












      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        May 17 at 0:44






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        May 17 at 0:45










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        May 17 at 0:48











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        May 17 at 0:54










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        May 17 at 1:01















      9












      $begingroup$

      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & aendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial






      share|cite|improve this answer











      $endgroup$












      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        May 17 at 0:44






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        May 17 at 0:45










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        May 17 at 0:48











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        May 17 at 0:54










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        May 17 at 1:01













      9












      9








      9





      $begingroup$

      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & aendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial






      share|cite|improve this answer











      $endgroup$



      Compute:
      $$A^2 = beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix.$$
      So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
      $$A^2 = pA + qI iff beginpmatrix 0 & c & ac \ 0 & b & c + ab \ 1 & a & b + a^2endpmatrix = pbeginpmatrix 0 & 0 & c \ 1 & 0 & b \ 0 & 1 & aendpmatrix + qbeginpmatrix 1 & 0 & 0 \ 0 & 1 & 0 \ 0 & 0 & 1endpmatrix$$
      for $p$ and $q$. In particular, if you examine the entries in the left column, bottom row, we get
      $$1 = 0p + 0q,$$
      which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and equal to the characteristic polynomial







      share|cite|improve this answer














      share|cite|improve this answer



      share|cite|improve this answer








      edited May 17 at 14:09

























      answered May 17 at 0:41









      Theo BenditTheo Bendit

      22.7k12359




      22.7k12359











      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        May 17 at 0:44






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        May 17 at 0:45










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        May 17 at 0:48











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        May 17 at 0:54










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        May 17 at 1:01
















      • $begingroup$
        Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
        $endgroup$
        – zz20s
        May 17 at 0:44






      • 1




        $begingroup$
        Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
        $endgroup$
        – N. S.
        May 17 at 0:45










      • $begingroup$
        @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
        $endgroup$
        – Theo Bendit
        May 17 at 0:48











      • $begingroup$
        Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
        $endgroup$
        – zz20s
        May 17 at 0:54










      • $begingroup$
        It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
        $endgroup$
        – Theo Bendit
        May 17 at 1:01















      $begingroup$
      Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
      $endgroup$
      – zz20s
      May 17 at 0:44




      $begingroup$
      Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"?
      $endgroup$
      – zz20s
      May 17 at 0:44




      1




      1




      $begingroup$
      Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
      $endgroup$
      – N. S.
      May 17 at 0:45




      $begingroup$
      Nice solution. Your argument can be rewritten as: If $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns we get $$beginbmatrix c_1 \c_2 \ c_3 endbmatrix=beginbmatrix 0 \ 0\0 endbmatrix$$
      $endgroup$
      – N. S.
      May 17 at 0:45












      $begingroup$
      @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
      $endgroup$
      – Theo Bendit
      May 17 at 0:48





      $begingroup$
      @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there are no polynomials $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$.
      $endgroup$
      – Theo Bendit
      May 17 at 0:48













      $begingroup$
      Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
      $endgroup$
      – zz20s
      May 17 at 0:54




      $begingroup$
      Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix?
      $endgroup$
      – zz20s
      May 17 at 0:54












      $begingroup$
      It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
      $endgroup$
      – Theo Bendit
      May 17 at 1:01




      $begingroup$
      It's a method that should work every time, provided you can solve the equations. If you're given an $n times n$ matrix $A$ that you wish to show is diagonalisable, then this is equivalent to showing $I, A, A^2, ldots, A^n-1$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could be essentially read off three entries (cf N. S.'s comment).
      $endgroup$
      – Theo Bendit
      May 17 at 1:01











      4












The form of $A$ has a special name: it is the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.

For the standard basis $e_1,e_2,e_3$, one finds that $Ae_1=e_2$ and $Ae_2=e_3$, so $e_1,Ae_1,A^2e_1$ forms a basis.

The general context is the companion $n\times n$ matrix of the polynomial $$p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates of $v$ under $A$ form a basis for $\mathbb{R}^n$. As others point out, this suffices to show that the minimal polynomial is the same as the characteristic polynomial.

















answered May 17 at 2:02
– user52817

































Assuming you already know, by Cayley–Hamilton, that $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

• Let $e_1, e_2, e_3$ denote the canonical basis $\Rightarrow Ae_1=e_2, Ae_2 = e_3 \Rightarrow A^2e_1 = e_3$

Now, assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.

Applying $m(A)$ to $e_1$ gives
$$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix}0 \\ 0 \\ 0\end{pmatrix} \quad \mbox{Contradiction!}$$
The linear combination cannot equal the zero vector, since the coefficient of the basis vector $e_3$ is $1$.
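The contradiction is easy to see numerically too: whatever $u$ and $v$ are, the third coordinate of $m(A)e_1$ is always $1$, so it can never be the zero vector. A sketch using the same hypothetical companion matrix with illustrative coefficients $a=2$, $b=3$, $c=5$:

```python
def matvec(M, v):
    """Apply a square matrix (list of rows) to a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(v))]

# Hypothetical companion matrix of x^3 - a x^2 - b x - c with a=2, b=3, c=5.
A = [[0, 0, 5],
     [1, 0, 3],
     [0, 1, 2]]

e1 = [1, 0, 0]
Ae1 = matvec(A, e1)     # = e2
A2e1 = matvec(A, Ae1)   # = e3

# m(A) e1 = A^2 e1 + u A e1 + v e1 for a few arbitrary (u, v) pairs:
for u, v in [(0, 0), (1, -1), (-7, 4)]:
    m_of_A_e1 = [A2e1[i] + u * Ae1[i] + v * e1[i] for i in range(3)]
    print(m_of_A_e1)    # third coordinate is always 1, never the zero vector
```

The third coordinate is the coefficient of $e_3$, which is fixed at $1$ because $m$ is monic of degree $2$; that is exactly the contradiction in the answer.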

















answered May 17 at 1:55
– trancelocation


























