Show that the characteristic polynomial is the same as the minimal polynomial
Let $$A = \begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a \end{pmatrix}$$
Show that the characteristic and minimal polynomials of $A$ are the same.
I have already computed the characteristic polynomial
$$p_A(x)=x^3-ax^2-bx-c$$
and I know that if I could show that the eigenspaces of $A$ all have dimension $1$, I would be done. The problem is that solving for the eigenvalues of this (very general) cubic equation is difficult (albeit possible), so it would be hard to find bases for the eigenspaces.
A hint would be appreciated.
linear-algebra matrices eigenvalues-eigenvectors jordan-normal-form minimal-polynomials
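As a numerical sanity check on the characteristic polynomial computed above, one can compare `numpy`'s coefficients against $x^3 - ax^2 - bx - c$. This is an illustrative sketch only; the sample values for $a, b, c$ are arbitrary assumptions.

```python
# Hypothetical sanity check of p_A(x) = x^3 - a x^2 - b x - c
# using sample values for a, b, c.
import numpy as np

a, b, c = 2.0, 3.0, 5.0
A = np.array([[0.0, 0.0, c],
              [1.0, 0.0, b],
              [0.0, 1.0, a]])

# np.poly(A) returns the characteristic polynomial's coefficients
# (highest degree first), computed from the eigenvalues of A.
coeffs = np.poly(A)
expected = [1.0, -a, -b, -c]   # x^3 - a x^2 - b x - c
print(np.allclose(coeffs, expected))
```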
How about calculating $\det(xI-A)$? – Rodrigo Dias, May 17 at 0:29

@zz20s You wrote that you've found the minimal polynomial via computation. Did you mean the characteristic polynomial? – Theo Bendit, May 17 at 0:31

You said the minimal polynomial has degree $3$. – J. W. Tanner, May 17 at 0:31

Oh, yes, sorry, that should say characteristic. – zz20s, May 17 at 0:32
asked May 17 at 0:19 by zz20s; edited May 18 at 23:56 by Rodrigo de Azevedo
3 Answers
Compute:
$$A^2 = \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2 \end{pmatrix}.$$
So we just need to show that $A^2, A, I$ are linearly independent. Clearly $A$ is not a multiple of $I$, so it suffices to show that
$$A^2 = pA + qI \iff \begin{pmatrix} 0 & c & ac \\ 0 & b & c + ab \\ 1 & a & b + a^2 \end{pmatrix} = p\begin{pmatrix} 0 & 0 & c \\ 1 & 0 & b \\ 0 & 1 & a \end{pmatrix} + q\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
has no solution for $p$ and $q$. In particular, comparing the bottom-left entries gives
$$1 = 0p + 0q,$$
which indeed has no solution. Hence $I, A, A^2$ are linearly independent, so no quadratic polynomial in $A$ equals the zero matrix. Thus the minimal polynomial must have degree (at least) $3$, and is therefore equal to the characteristic polynomial.

– Theo Bendit (answered May 17 at 0:41; edited May 17 at 14:09)
Interesting! Can you elaborate on the sentence "$I, A, A^2$ are linearly independent, so no quadratic of $A$ will be equal to $0$"? – zz20s, May 17 at 0:44

Nice solution. Your argument can be rewritten as: if $c_1A^2+c_2A+c_3I_3=0_3$ then, looking at the first columns, we get $$\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}=\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$ – N. S., May 17 at 0:45

@zz20s To say that $I, A, A^2$ are linearly dependent is to say that there are some scalars $p, q, r$, not all equal to $0$, such that $pA^2 + qA + rI = 0$. That is, there is some non-zero polynomial $f(x) = px^2 + qx + r$, of degree at most $2$, such that $f(A) = 0$. So $I, A, A^2$ being independent means that there is no non-zero polynomial $f$ of degree less than $3$ with $f(A) = 0$. Hence, the minimal polynomial must have degree at least $3$. – Theo Bendit, May 17 at 0:48

Ah, right, thank you! That makes sense. Is this a standard method for proving such a statement, or does it only work because of some property inherent to this matrix? – zz20s, May 17 at 0:54

It's a method that should work every time, provided you can solve the equations. If you're given an $n \times n$ matrix $A$ whose minimal and characteristic polynomials you wish to show coincide, this is equivalent to showing $I, A, A^2, \ldots, A^{n-1}$ are linearly independent. You can always do this mechanically, but sometimes it might mean solving a system of $n^2$ equations in $n$ variables! This matrix is particularly nice because the independence can essentially be read off three entries (cf. N. S.'s comment). – Theo Bendit, May 17 at 1:01
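The independence criterion discussed in these comments can be checked mechanically: $I, A, A^2$ are linearly independent exactly when the matrix whose rows are their vectorizations has rank $3$. A sketch with assumed sample values for $a, b, c$:

```python
# Stack vec(I), vec(A), vec(A^2) as rows; full rank means no quadratic
# polynomial in A vanishes, so deg(minimal polynomial) >= 3.
import numpy as np

a, b, c = 2.0, 3.0, 5.0   # arbitrary sample values
A = np.array([[0.0, 0.0, c],
              [1.0, 0.0, b],
              [0.0, 1.0, a]])
I = np.eye(3)

powers = np.vstack([I.ravel(), A.ravel(), (A @ A).ravel()])
rank = np.linalg.matrix_rank(powers)
print(rank)  # 3
```

For an $n \times n$ matrix, the same check stacks $I, A, \ldots, A^{n-1}$ into an $n \times n^2$ matrix.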
The form of $A$ has a special name: it is the companion matrix of the polynomial $p(x)=x^3-ax^2-bx-c$.
For the standard basis $e_1, e_2, e_3$, one finds that $Ae_1=e_2$ and $Ae_2=e_3$, so $e_1, Ae_1, A^2e_1$ form a basis.
The general context is the companion $n\times n$ matrix of the polynomial $$p(x)=x^n-c_{n-1}x^{n-1}-\cdots-c_1x-c_0.$$ A vector $v$ is said to be a cyclic vector for $A$ if the iterates of $v$ under $A$ form a basis for $\mathbb{R}^n$. As others point out, the existence of a cyclic vector suffices to show that the minimal polynomial is the same as the characteristic polynomial.

– user52817 (answered May 17 at 2:02)
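The cyclic-vector claim above can be illustrated numerically: for the companion matrix, the Krylov matrix $[e_1 \mid Ae_1 \mid A^2e_1]$ is invertible (in fact it is the identity). A sketch with assumed sample values for $a, b, c$:

```python
# Check that e1 is a cyclic vector for the companion matrix:
# the Krylov matrix [v, Av, A^2 v] has full rank.
import numpy as np

a, b, c = 2.0, 3.0, 5.0   # arbitrary sample values
A = np.array([[0.0, 0.0, c],
              [1.0, 0.0, b],
              [0.0, 1.0, a]])
e1 = np.array([1.0, 0.0, 0.0])

K = np.column_stack([e1, A @ e1, A @ A @ e1])  # Krylov matrix
print(np.linalg.matrix_rank(K))  # 3, so e1, Ae1, A^2 e1 form a basis
```

Here $Ae_1 = e_2$ and $A^2e_1 = e_3$, so $K$ is exactly the identity matrix.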
Assuming you already know, by Cayley–Hamilton, that $p_A(A) = O_{3\times 3}$, you can also proceed as follows:

Let $e_1, e_2, e_3$ denote the canonical basis; then $Ae_1=e_2$ and $Ae_2 = e_3$, so $A^2e_1 = e_3$.
Now assume there is a polynomial $m(x)=x^2+ux+v$ such that $m(A) = O_{3\times 3}$.
Applying $m(A)$ to $e_1$ gives
$$m(A)e_1 = A^2e_1 + uAe_1 + ve_1 = e_3 + ue_2 + ve_1 = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}.$$
This is a contradiction: the linear combination cannot be the zero vector, since the coefficient of the basis vector $e_3$ is $1$.

– trancelocation (answered May 17 at 1:55)
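Both halves of this argument can be checked numerically: $p_A(A)$ vanishes, while $m(A)e_1 = (v, u, 1)^T$ always has third coordinate $1$. A sketch with assumed sample values for $a, b, c, u, v$:

```python
# Verify Cayley–Hamilton for sample values, and that
# m(A) e1 = A^2 e1 + u A e1 + v e1 can never be the zero vector.
import numpy as np

a, b, c = 2.0, 3.0, 5.0   # arbitrary sample values
A = np.array([[0.0, 0.0, c],
              [1.0, 0.0, b],
              [0.0, 1.0, a]])
I = np.eye(3)

pA = A @ A @ A - a * (A @ A) - b * A - c * I   # p_A(A)
print(np.allclose(pA, np.zeros((3, 3))))        # True: Cayley–Hamilton

u, v = -7.0, 4.0   # arbitrary quadratic coefficients
mAe1 = (A @ A + u * A + v * I) @ np.array([1.0, 0.0, 0.0])
print(mAe1[2])     # 1.0, regardless of u and v
```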