Show that the characteristic polynomial is the same as the minimal polynomial


Let $$A = \begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a \end{pmatrix}.$$
Show that the characteristic and minimal polynomials of $A$ are the same.




I have already computed the characteristic polynomial



$$p_A(x)=x^3-ax^2-bx-c$$



and I know, from a related question, that if I could show that the eigenspaces of $A$ all have dimension $1$, I would be done. The problem is that solving this (very general) cubic for the eigenvalues is difficult (albeit possible), which makes it hard to find bases for the eigenspaces.



A hint would be appreciated.
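
(Editorial aside, not part of the original question: the stated characteristic polynomial can be confirmed symbolically. The sketch below is my own and assumes SymPy is available; it simply expands $\det(xI-A)$.)

```python
# Sketch assuming SymPy: confirm p_A(x) = x^3 - a x^2 - b x - c.
from sympy import Matrix, symbols

a, b, c, x = symbols("a b c x")

A = Matrix([[0, 0, c],
            [1, 0, b],
            [0, 1, a]])

# charpoly returns det(x*I - A) as a polynomial in x.
print(A.charpoly(x).as_expr().expand())  # equals x**3 - a*x**2 - b*x - c
```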










linear-algebra matrices eigenvalues-eigenvectors jordan-normal-form minimal-polynomials

asked May 17 at 0:19 by zz20s (edited May 18 at 23:56 by Rodrigo de Azevedo)

  • How about calculating $\det(xI-A)$? – Rodrigo Dias, May 17 at 0:29

  • @zz20s You wrote that you've found the minimal polynomial via computation. Did you mean the characteristic polynomial? – Theo Bendit, May 17 at 0:31

  • You said the minimal polynomial has degree $3$. – J. W. Tanner, May 17 at 0:31

  • Oh, yes, sorry, that should say characteristic. – zz20s, May 17 at 0:32

3 Answers

Answer by Theo Bendit (score 9; answered May 17 at 0:41, edited May 17 at 14:09)

Compute:
$$A^2 = \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2 \end{pmatrix}.$$
So, we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so we just need to show there is no solution to the equation
$$A^2 = pA + qI \iff \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2 \end{pmatrix} = p\begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a \end{pmatrix} + q\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
for $p$ and $q$. In particular, if you examine the entry in the bottom row of the left column, we get
$$1 = 0p + 0q,$$
which means there is indeed no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to the $0$ matrix. Thus, the minimal polynomial must be (at least) a cubic, and, since it divides the characteristic polynomial, it is equal to the characteristic polynomial.
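
(Editorial sketch, not part of the original answer: the independence check above is mechanical enough to verify symbolically. The code below is my own illustration and assumes SymPy; it stacks the flattened $I$, $A$, $A^2$ and also reproduces N. S.'s first-column shortcut from the comments below.)

```python
# My own verification sketch (assumes SymPy): I, A, A^2 are linearly independent,
# so no polynomial of degree <= 2 annihilates A.
from sympy import Matrix, eye, symbols

a, b, c = symbols("a b c")

A = Matrix([[0, 0, c],
            [1, 0, b],
            [0, 1, a]])
A2 = A * A

# Flatten I, A, A^2 into rows of a 3 x 9 matrix; rank 3 means they are independent.
M = Matrix([list(eye(3)), list(A), list(A2)])
print(M.rank())  # 3

# First-column shortcut (see N. S.'s comment below): the first columns are e1, e2, e3.
print(Matrix.hstack(eye(3)[:, 0], A[:, 0], A2[:, 0]).det())  # 1 (nonzero)
```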






share|cite|improve this answer











$endgroup$












  • Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"? – zz20s, May 17 at 0:44

  • Nice solution. Your argument can be rewritten as: if $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns, we get $$\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}.$$ – N. S., May 17 at 0:45

  • @zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So, $I, A, A^2$ being independent means that there is no polynomial $f$ of degree less than $3$ (except the $0$ polynomial) such that $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$. – Theo Bendit, May 17 at 0:48

  • Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix? – zz20s, May 17 at 0:54

  • It's a method that should work every time, provided you can solve the equations. If you're given an $n \times n$ matrix $A$ whose minimal polynomial you wish to show equals its characteristic polynomial, this is equivalent to showing $I, A, A^2, \ldots, A^{n-1}$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence could essentially be read off three entries (cf. N. S.'s comment). – Theo Bendit, May 17 at 1:01


Answer by user52817 (score 4; answered May 17 at 2:02)

The form of $A$ has a special name: the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.

For the standard basis $e_1,e_2,e_3$, one finds that $Ae_1=e_2$ and $Ae_2=e_3$, so $e_1, Ae_1, A^2e_1$ form a basis.

The general context is the companion $n\times n$ matrix of the polynomial $$p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates of $v$ under $A$ form a basis for $\mathbb{R}^n$. As others point out, this suffices to show that the minimal polynomial is the same as the characteristic polynomial.
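
(Editorial sketch, not part of the original answer, assuming SymPy: a direct check that $e_1$ is a cyclic vector for $A$, i.e. that $e_1, Ae_1, A^2e_1$ form a basis.)

```python
# Sketch assuming SymPy: e1 is a cyclic vector for the companion matrix A.
from sympy import Matrix, symbols

a, b, c = symbols("a b c")

A = Matrix([[0, 0, c],
            [1, 0, b],
            [0, 1, a]])
e1 = Matrix([1, 0, 0])

# Columns e1, A e1, A^2 e1; these turn out to be e1, e2, e3.
C = Matrix.hstack(e1, A * e1, A * (A * e1))
print(C)        # the identity matrix
print(C.det())  # 1, so the iterates of e1 under A form a basis
```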







Answer by trancelocation (score 3; answered May 17 at 1:55)

Assuming you already know that, by Cayley–Hamilton, $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

  • Let $e_1, e_2, e_3$ denote the canonical basis $\Rightarrow Ae_1=e_2, Ae_2 = e_3 \Rightarrow A^2e_1 = e_3$.

Now, assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.

Applying $m(A)$ to $e_1$ gives
$$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \quad \mbox{Contradiction!}$$
The linear combination cannot result in the zero vector, as the coefficient of the basis vector $e_3$ is $1$.
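
(Editorial sketch, not part of the original answer, assuming SymPy: with symbolic $u, v$, one sees directly that $m(A)e_1 = ve_1 + ue_2 + e_3$ can never be the zero vector.)

```python
# Sketch assuming SymPy: for m(x) = x^2 + u*x + v, m(A) e1 = v*e1 + u*e2 + e3,
# whose last entry is always 1, so m(A) can never be the zero matrix.
from sympy import Matrix, eye, symbols

a, b, c, u, v = symbols("a b c u v")

A = Matrix([[0, 0, c],
            [1, 0, b],
            [0, 1, a]])
e1 = Matrix([1, 0, 0])

m_of_A = A * A + u * A + v * eye(3)
print(m_of_A * e1)  # Matrix([[v], [u], [1]])
```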





