EXPLICIT MINIMUM POLYNOMIAL, EIGENVECTOR AND INVERSE FORMULA OF DOUBLY LESLIE MATRIX†

Journal of Applied Mathematics & Informatics, May 2015, 33(3–4), 247–260.

- Received : September 02, 2014
- Accepted : March 20, 2015
- Published : May 30, 2015


A special form of the Schur complement is extended to a Schur formula that yields explicit determinant, inverse, and eigenvector formulas for the doubly Leslie matrix, which is a generalized form of the Leslie matrix. It is also a generalized form of the doubly companion matrix and of the companion matrix. The doubly Leslie matrix is a nonderogatory matrix.
AMS Mathematics Subject Classification : 15A09, 15A15, 15A18, 65F15, 65F40.
The article, entitled "On the use of matrices in certain population mathematics" [1, pp. 117–120], appeared in the journal Biometrika. The Leslie model describes the growth of the female portion of a population which is assumed to have a maximum lifespan. The females are divided into age classes, all of which span an equal number of years. Using data on the average birth rates and survival probabilities of each class, the model can then determine the growth of the population over time [11, 7].
Chen and Li in [5] asserted that Leslie matrix models are discrete models for the development of age-structured populations. It is known that the eigenvalues of a Leslie matrix are important in describing the asymptotic behavior of the corresponding population model. It is also known that the ratio of the spectral radius to the second largest (subdominant) eigenvalue in modulus of a nonperiodic Leslie matrix determines the rate of convergence of the corresponding population distributions to a stable age distribution.
A Leslie matrix arises in a discrete, age-dependent model for population growth. It is a matrix of the form
$$L=\begin{bmatrix} r_1 & r_2 & \cdots & r_{n-1} & r_n\\ s_1 & 0 & \cdots & 0 & 0\\ 0 & s_2 & \cdots & 0 & 0\\ \vdots & \vdots & \ddots & \vdots & \vdots\\ 0 & 0 & \cdots & s_{n-1} & 0 \end{bmatrix},\tag{1}$$
where $r_j \geq 0$, $0 < s_j \leq 1$, $j = 1, 2, \ldots, n-1$.
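As a concrete illustration (a small numerical sketch with made-up vital rates, not data from the paper), a Leslie matrix projects an age-structured population forward one time step, and for large times the total population grows by the dominant eigenvalue per step:

```python
import numpy as np

# Hypothetical vital rates for a population with n = 3 age classes:
# r_j = average number of daughters per female in class j,
# s_j = probability of surviving from class j to class j+1.
r = np.array([0.0, 1.5, 0.8])   # birth rates (first row of L)
s = np.array([0.7, 0.5])        # survival probabilities (subdiagonal)

L = np.zeros((3, 3))
L[0, :] = r
L[1, 0], L[2, 1] = s[0], s[1]

x = np.array([100.0, 60.0, 20.0])   # initial population per age class
for _ in range(50):                  # project 50 time steps: x_{k+1} = L x_k
    x = L @ x

# After many steps the total population grows by the spectral radius per step.
lam = max(abs(np.linalg.eigvals(L)))
growth = (L @ x).sum() / x.sum()
print(lam, growth)
```

After 50 steps the age distribution has essentially converged, so the one-step growth factor agrees with the spectral radius to many digits, illustrating the convergence-rate remark of Chen and Li quoted above.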
For a given field $\mathbb{F}$, the set of all polynomials in $x$ over $\mathbb{F}$ is denoted by $\mathbb{F}[x]$. For a positive integer $n$, let $M_n(\mathbb{F})$ be the set of all $n \times n$ matrices over $\mathbb{F}$. The set of all vectors, or $n \times 1$ matrices, over $\mathbb{F}$ is denoted by $\mathbb{F}^n$. A nonzero vector $v \in \mathbb{F}^n$ is called an eigenvector of $A \in M_n(\mathbb{F})$ corresponding to a scalar $\lambda \in \mathbb{F}$ if $Av = \lambda v$; the scalar $\lambda$ is then an eigenvalue of the matrix $A$. The set of eigenvalues of $A$ is called the spectrum of $A$ and is denoted by $\sigma(A)$. In the most common case, in which $\mathbb{F} = \mathbb{C}$, the complex numbers, $M_n(\mathbb{C})$ is abbreviated to $M_n$.
Doubly companion matrices $C \in M_n$ were first introduced by Butcher and Chartier in [4, pp. 274–276], given by (2); that is, an $n \times n$ matrix $C$ with $n > 1$ is called a doubly companion matrix if its entries $c_{ij}$ satisfy $c_{i,i-1} = 1$ on the subdiagonal of $C$, and $c_{ij} = 0$ whenever $i \neq 1$, $j \neq n$, and $j \neq i-1$.
We define a doubly Leslie matrix, denoted by $L$, analogously to the doubly companion matrix by replacing the subdiagonal entries of the doubly companion matrix with $s_1, s_2, \ldots, s_{n-1}$; that is, a doubly Leslie matrix is defined to be a matrix of the form (3), where $a_j, b_j \in \mathbb{R}$, the real numbers, $j = 1, 2, \ldots, n$. As with the Leslie matrix, we restrict to $s_j > 0$, $j = 1, 2, \ldots, n-1$.
For convenience, we may write the matrix $L$ in a partitioned form as in (4), where $\Lambda = \operatorname{diag}(s_1, s_2, \ldots, s_{n-1})$ is a diagonal matrix of order $n-1$.
Note: If we define the doubly Leslie matrix in another form, with all symbols as above, then some of the resulting expressions take more complicated forms.
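Since display (3) is not reproduced above, the following sketch builds a doubly-Leslie-type matrix only from its zero pattern: an arbitrary first row, an arbitrary last column below row 1, and the $s_j$ on the subdiagonal. The sign conventions for the entries $a_j, b_j$ are left to the caller, and the helper name `doubly_leslie` is ours, not the paper's.

```python
import numpy as np

def doubly_leslie(first_row, last_col, s):
    """Assemble an n x n doubly-Leslie-type matrix: an arbitrary first row,
    an arbitrary last column in rows 2..n, subdiagonal s_1,...,s_{n-1},
    and zeros elsewhere.  With last_col all zero the result is an ordinary
    Leslie-type matrix; with all s_j = 1 it has the doubly companion
    zero pattern."""
    n = len(first_row)
    L = np.zeros((n, n))
    L[0, :] = first_row
    L[np.arange(1, n), np.arange(n - 1)] = s   # subdiagonal entries s_j
    L[1:, -1] = last_col                        # last column below row 1
    return L

# Made-up entries, n = 4.
L = doubly_leslie([0.2, 1.5, 0.8, 0.3], [0.1, 0.4, 0.6], [0.7, 0.5, 0.9])
print(L)
```

Only the first row, last column, and subdiagonal are populated; every other entry stays zero, matching the pattern described for the doubly companion matrix above.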
We recall some well-known results from linear algebra and matrix analysis.
Definition 1.1 ([6], Definition 1.3.1). A matrix $B \in M_n$ is said to be similar to a matrix $A \in M_n$ if there exists a nonsingular matrix $S \in M_n$ such that $B = S^{-1}AS$.
Theorem 1.2 ([6], Theorem 1.4.8). Let $A, B \in M_n$. If $x \in \mathbb{C}^n$ is an eigenvector corresponding to $\lambda \in \sigma(B)$ and if $B$ is similar to $A$ via $S$, then $Sx$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda$.
Theorem 1.3 ([6], Theorem 3.3.15). A matrix $A \in M_n$ is similar to the companion matrix of its characteristic polynomial if and only if the minimal and characteristic polynomials of $A$ are identical.
Definition 1.4 ([9], p. 664). A matrix $A \in M_n$ for which the characteristic polynomial $\Delta_A(x)$ equals the minimum polynomial $m_A(x)$ is said to be a nonderogatory matrix.
In the present paper we give explicit determinant, inverse, and eigenvector formulae for the doubly Leslie matrix, and discuss some related topics.
Let $M$ be a matrix partitioned into four blocks as in (4), where the submatrix $C$ is assumed to be square and nonsingular. Brezinski in [3, p. 232] asserted that the Schur complement of $C$ in $M$, denoted by $(M/C)$, is defined by (5), which is related to Gaussian elimination by (6).
Suppose that $B$ and $C$ are $k \times k$ and $(n-k) \times (n-k)$ matrices, respectively, $k < n$, and that $C$ is nonsingular. As in [8, p. 39], we have the following theorem.
Theorem 2.1 (Schur's formula). Let $M$ be a square matrix of order $n$ partitioned as in (4), where $B$ and $C$ are $k \times k$ and $(n-k) \times (n-k)$ matrices, respectively, $k < n$. If $C$ is nonsingular, then
$$\det M = (-1)^{(n+1)k}\,\det C\,\det(M/C).\tag{7}$$
Proof. Taking determinants of both sides of (6) yields the identity (7). Since the determinant of the unit triangular factor in (6) equals 1, it remains to evaluate the determinant of the remaining block factor. Expanding this determinant by Laplace's theorem along the first $k$ rows, i.e., rows $\{1, 2, \ldots, k\}$, we obtain $\det M = (-1)^{(n+1)k} \det C \det(M/C)$. This completes the proof.
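The partitioned display (4) is lost in this copy; the following numerical check assumes the block layout $M = \begin{bmatrix} E & B \\ C & F \end{bmatrix}$, with $B$ the $k \times k$ top-right block and $C$ the $(n-k) \times (n-k)$ bottom-left block, which is consistent with the sign $(-1)^{(n+1)k}$ in (7) and with the later application to $L$; the block letters $E$, $F$ are our labels.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 7, 3

# Assumed layout of (4): B is the k x k top-right block, C the square
# (n-k) x (n-k) bottom-left block (nonsingular almost surely here).
E = rng.standard_normal((k, n - k))
B = rng.standard_normal((k, k))
C = rng.standard_normal((n - k, n - k))
F = rng.standard_normal((n - k, k))
M = np.block([[E, B], [C, F]])

# Schur complement of C in M: (M/C) = B - E C^{-1} F.
S = B - E @ np.linalg.inv(C) @ F

lhs = np.linalg.det(M)
rhs = (-1) ** ((n + 1) * k) * np.linalg.det(C) * np.linalg.det(S)
print(lhs, rhs)
```

The sign works out because moving the $k$ top rows below the $n-k$ bottom rows costs $(-1)^{k(n-k)}$, and $(-1)^{k(n-k)} = (-1)^{(n+1)k}$ since $k(k+1)$ is always even.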
The following useful formula presents the inverse of a matrix in terms of Schur complements; analogously to [14, p. 19], we obtain the following.
Theorem 2.2. Let $M$ be partitioned as in (4) and suppose both $M$ and $C$ are nonsingular. Then $(M/C)$ is nonsingular and $M^{-1}$ is given by (8).
Proof. The Schur complement $(M/C)$ is nonsingular by virtue of (7). Under the given hypotheses, from (6) one checks a block factorization of $M$; inverting both sides yields the identity (8).
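Display (8) is lost in this copy, but in the block layout assumed here (B the $k \times k$ top-right block, C the square bottom-left block; our labels $E$, $F$ for the other blocks), the inverse can be assembled from $C^{-1}$ and $(M/C)^{-1}$ alone. A sketch, checked against a directly computed inverse:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2
E = rng.standard_normal((k, n - k))
B = rng.standard_normal((k, k))
C = rng.standard_normal((n - k, n - k))
F = rng.standard_normal((n - k, k))
M = np.block([[E, B], [C, F]])

Ci = np.linalg.inv(C)
S = B - E @ Ci @ F                  # Schur complement (M/C)
Si = np.linalg.inv(S)

# Blockwise inverse built only from C^{-1} and (M/C)^{-1}: solving
# E x + B y = u, C x + F y = v gives y = S^{-1}(u - E C^{-1} v) and
# x = C^{-1} v - C^{-1} F y, which packs into the block matrix below.
Minv = np.block([
    [-Ci @ F @ Si, Ci + Ci @ F @ Si @ E @ Ci],
    [Si,           -Si @ E @ Ci],
])
err = np.abs(Minv @ M - np.eye(n)).max()
print(err)
```

This is the same elimination pattern the proof uses: once $(M/C)$ is inverted, everything else is back-substitution through $C$.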
Theorem 3.1 (Determinant of doubly Leslie matrix). Let $L$ be a doubly Leslie matrix as in (3), partitioned as in (9), where $\Lambda = \operatorname{diag}(s_1, s_2, \ldots, s_{n-1})$, $s_j > 0$, $j = 1, 2, \ldots, n-1$, is a diagonal matrix of order $n-1$. Then
$$\det L = (-1)^{n+1}(L/\Lambda)\, s_1 s_2 \cdots s_{n-1}.$$
Proof. Since $\Lambda$ is an $(n-1) \times (n-1)$ submatrix of the matrix $L$, we may apply Schur's formula (7) with $k = 1$. As in (5), the Schur complement of $\Lambda$ in $L$, denoted by $(L/\Lambda)$, is a $1 \times 1$ matrix, that is, a scalar, given by (10). From (9) it is easy to see that $\det \Lambda = s_1 s_2 \cdots s_{n-1}$, and therefore $\det L = (-1)^{n+1}(L/\Lambda)\, s_1 s_2 \cdots s_{n-1}$. This completes the proof.
Immediately, we have the following corollaries.
Corollary 3.2. Let $L$ be a Leslie matrix defined as in (1), partitioned as above with entries $a_j$, $j = 1, 2, \ldots, n$, and $\Lambda = \operatorname{diag}(s_1, s_2, \ldots, s_{n-1})$, $s_j > 0$, $j = 1, 2, \ldots, n-1$, a diagonal matrix of order $n-1$. Then $\det L = (-1)^n a_n\, s_1 s_2 \cdots s_{n-1}$.
Corollary 3.3. Let $C$ be a doubly companion matrix as in (2), where $p = [\,a_1\ a_2\ \ldots\ a_{n-1}\,]^T$ and $q = [\,b_{n-1}\ b_{n-2}\ \ldots\ b_1\,]^T$. Then $\det C$ is given by the formula of Theorem 3.1 with $s_j = 1$, $j = 1, 2, \ldots, n-1$.
Corollary 3.4. Let $C$ be a companion matrix, where $p = [\,a_1\ a_2\ \ldots\ a_{n-1}\,]^T$. Then $\det C = (-1)^n a_n$.
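Corollary 3.4 is easy to verify numerically, assuming the convention (consistent with $\det C = (-1)^n a_n$) that the companion matrix of $x^n + a_1 x^{n-1} + \cdots + a_n$ carries $-a_1, \ldots, -a_n$ in its first row and ones on the subdiagonal:

```python
import numpy as np

def companion(a):
    """First-row companion matrix of x^n + a_1 x^{n-1} + ... + a_n
    (assumed convention: first row -a_1,...,-a_n, ones on the subdiagonal)."""
    n = len(a)
    C = np.zeros((n, n))
    C[0, :] = -np.asarray(a, dtype=float)
    C[np.arange(1, n), np.arange(n - 1)] = 1.0
    return C

for n in range(2, 7):
    a = np.arange(1, n + 1, dtype=float)      # a_j = j, an arbitrary choice
    C = companion(a)
    # Laplace expansion along the last column leaves only the (1, n) entry
    # -a_n times det(I_{n-1}) with cofactor sign (-1)^{1+n}.
    assert np.isclose(np.linalg.det(C), (-1) ** n * a[-1])
print("det C = (-1)^n a_n verified for n = 2..6")
```

The single nonzero entry in the last column makes the expansion collapse to one term, which is exactly how the sign $(-1)^n$ arises.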
Now we wish to find the inverse of doubly Leslie matrix.
Theorem 3.5. Let $L$ be a doubly Leslie matrix, where $p = [\,a_1\ a_2\ \ldots\ a_{n-1}\,]^T$, $q = [\,b_{n-1}\ b_{n-2}\ \ldots\ b_1\,]^T$, and $\Lambda = \operatorname{diag}(s_1, s_2, \ldots, s_{n-1})$, $s_j > 0$, $j = 1, 2, \ldots, n-1$, is a diagonal matrix of order $n-1$. If $\det L \neq 0$, then $L^{-1}$ is given explicitly in terms of the scalar $(L/\Lambda)$, as in (10), and $\Lambda^{-1} = \operatorname{diag}(1/s_1, 1/s_2, \ldots, 1/s_{n-1})$.
Proof. Applying the identity (8) to the matrix $L$ gives the block form of $L^{-1}$. The Schur complement of $\Lambda$ in $L$ is $(L/\Lambda)$, which (10) shows to be a scalar; the stated formula follows.
Immediately, we have the following corollaries.
Corollary 3.6. Let $L$ be a Leslie matrix defined in Corollary 3.2. If $\det L \neq 0$, then $L^{-1}$ is given by the formula of Theorem 3.5 with $(L/\Lambda) = -a_n$.
Corollary 3.7. Let $C$ be a doubly companion matrix defined in Corollary 3.3. If $\det C \neq 0$, then $C^{-1}$ is given by the formula of Theorem 3.5 with $\Lambda = I_{n-1}$.
Corollary 3.8. Let $C$ be a companion matrix defined in Corollary 3.4. If $\det C \neq 0$, then $C^{-1}$ is given by the formula of Theorem 3.5 with $\Lambda = I_{n-1}$ and $q = 0$.
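For a Leslie matrix the Schur complement $(L/\Lambda)$ is a scalar and $\Lambda$ is diagonal, so the inverse of Theorem 3.5 needs no general matrix inversion at all. A sketch with made-up rates, assuming the partition with the $(n-1)\times(n-1)$ block $\Lambda$ in the bottom-left (first row written here with entries $r_j$, so that $(L/\Lambda) = r_n$):

```python
import numpy as np

r = np.array([0.4, 1.2, 0.9, 0.5])   # first row (hypothetical rates), r_n != 0
s = np.array([0.8, 0.6, 0.7])        # subdiagonal survival rates
n = len(r)

L = np.zeros((n, n))
L[0, :] = r
L[np.arange(1, n), np.arange(n - 1)] = s

# The bottom-right block of the partition is a zero column, so the scalar
# Schur complement of Lambda in L is just r_n, and the block inverse needs
# only the reciprocals 1/s_j and 1/r_n.
Linv = np.zeros((n, n))
Linv[:n - 1, 1:] = np.diag(1.0 / s)          # Lambda^{-1} block
Linv[n - 1, 0] = 1.0 / r[-1]                 # 1/(L/Lambda)
Linv[n - 1, 1:] = -r[:-1] / (s * r[-1])      # -(L/Lambda)^{-1} E Lambda^{-1}
err = np.abs(Linv @ L - np.eye(n)).max()
print(err)
```

The whole inverse costs $O(n)$ arithmetic beyond writing down the zeros, which is the practical content of the scalar Schur complement.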
We now show that the doubly Leslie matrix $L$ in (3) is similar to a companion matrix, that is, that it is nonderogatory.
Theorem 4.1. The doubly Leslie matrix $L$ defined in (3) is nonderogatory, and its characteristic polynomial, which equals its minimum polynomial, is given explicitly by (13), where $c_1 = a_1$, $c_i = a_i \prod_{k=1}^{i-1} s_k$, and $d_1 = b_1$, $d_i = b_i \prod_{k=n-i+1}^{n-1} s_k$, for $i = 2, 3, \ldots, n$.
Proof. Let $D = \operatorname{diag}(1,\, s_1,\, s_1 s_2,\, \ldots,\, s_1 s_2 \cdots s_{n-1})$. We first show that $L$ is similar to a doubly companion matrix. By the similarity transformation with the diagonal matrix $D$, the matrix $L$ is transformed into a doubly companion matrix. For convenience, denote the doubly companion matrix $D^{-1}LD$ by $\Gamma$, whose entries are $c_1 = a_1$, $c_i = a_i \prod_{k=1}^{i-1} s_k$, and $d_1 = b_1$, $d_i = b_i \prod_{k=n-i+1}^{n-1} s_k$, for $i = 2, 3, \ldots, n$. Let $J$ be the backward identity matrix of order $n$ (also called the reversal matrix of order $n$), so that $J = J^{-1}$; conjugation by $J$ reverses the order of the rows and columns of $\Gamma$.
To show that the matrix $\Gamma$ is similar to a companion matrix, we prove by explicit construction the existence of an invertible matrix $M$ such that $M^{-1}\Gamma M$ is a companion matrix. Choose an $n \times n$ matrix $M$ of the indicated form. Then $M$ is a nonsingular matrix; in fact, $M$ is a lower triangular Toeplitz matrix with constant diagonal $1$, where $e_1 = [\,1\ 0\ \ldots\ 0\,]^T \in \mathbb{C}^n$ is the first unit column vector.
Computation shows that $M^{-1}\Gamma M$ is the desired companion matrix $C$; combining the similarity transformations gives $C = (DJM)^{-1}L(DJM)$, as in (14). Thus the doubly Leslie matrix $L$ is similar to the companion matrix $C$. By Theorem 1.3, the characteristic polynomial $\Delta_L(x)$ equals the minimum polynomial $m_L(x)$, which gives (13).
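The displayed diagonal matrix in the proof is lost in this copy; a matrix of the form $D = \operatorname{diag}(1, s_1, s_1 s_2, \ldots, s_1 \cdots s_{n-1})$ does the job, as a numerical check with arbitrary entries confirms: the subdiagonal of $D^{-1}LD$ becomes all ones while the zero pattern is preserved.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
s = rng.uniform(0.2, 1.0, n - 1)

# A doubly-Leslie-type matrix: arbitrary first row and last column,
# s_j on the subdiagonal (the signs of the entries are immaterial here).
L = np.zeros((n, n))
L[0, :] = rng.standard_normal(n)
L[1:, -1] = rng.standard_normal(n - 1)
L[np.arange(1, n), np.arange(n - 1)] = s

# Candidate similarity: D = diag(1, s_1, s_1 s_2, ..., s_1...s_{n-1}).
# Each subdiagonal entry is scaled by D_{i,i}/D_{i+1,i+1} = 1/s_i.
D = np.diag(np.concatenate(([1.0], np.cumprod(s))))
G = np.linalg.inv(D) @ L @ D

ones = G[np.arange(1, n), np.arange(n - 1)]   # subdiagonal of D^{-1} L D
print(ones)
```

Diagonal conjugation only rescales entries, so the zeros of $L$ stay zero and $G$ has exactly the doubly companion pattern, with first row and last column rescaled into the coefficients $c_i$ and $d_i$.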
Theorem 5.1. Let $\lambda$ be an eigenvalue of a doubly Leslie matrix $L$ defined in (3). Then $v = DJMu$, with $u = [\,1\ \lambda\ \cdots\ \lambda^{n-1}\,]^T$, is an eigenvector of $L$ corresponding to the eigenvalue $\lambda$, where $d_1 = b_1$ and $d_i = b_i \prod_{k=n-i+1}^{n-1} s_k$, for $i = 2, 3, \ldots, n$.
Proof. From Theorem 4.1, $L$ is similar to the companion matrix $C$ as in (14), so the two matrices have the same eigenvalues. Let $\lambda$ be an eigenvalue of $L$; then $\lambda$ is also an eigenvalue of $C$. Since $\lambda$ is a root of the characteristic polynomial $\Delta_L(x)$, we have $\Delta_L(\lambda) = 0$, and from (13) the coefficients $c_i$ and $d_i$ satisfy the corresponding relation.
Now put $u = [\,1\ \lambda\ \cdots\ \lambda^{n-2}\ \lambda^{n-1}\,]^T$. We must show that this vector $u$ is an eigenvector of $C$ corresponding to the eigenvalue $\lambda$. From equation (14), $C = (DJM)^{-1}L(DJM)$, we obtain $Cu = \lambda u$. Since the first component of $u$ is $1$ and hence nonzero, $u$ is not the zero vector, so it is an eigenvector of $C$ corresponding to $\lambda$.
Since $(DJM)^{-1}L(DJM) = C$, Theorem 1.2 asserts that $(DJM)u$ is an eigenvector of $L$ corresponding to the eigenvalue $\lambda$. Hence the explicit form of an eigenvector corresponding to an eigenvalue $\lambda$ of the matrix $L$ is $v = DJMu$. Since the last component of the vector $v$ cannot be zero, $v$ is not the zero vector, which proves the assertion.
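The Vandermonde-type eigenvector $u$ and its transport through a similarity (Theorem 1.2) can be checked numerically. The companion orientation used below (ones on the superdiagonal, coefficients in the last row) is our assumption, chosen so that $u = [1, \lambda, \ldots, \lambda^{n-1}]^T$ works directly; the paper's $C$ arises instead by conjugating $L$ with $DJM$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
a = rng.standard_normal(n)

# Companion form for which u = [1, lam, ..., lam^{n-1}]^T is an eigenvector:
# ones on the superdiagonal, last row -a_n, -a_{n-1}, ..., -a_1 for the
# polynomial x^n + a_1 x^{n-1} + ... + a_n.
C = np.zeros((n, n))
C[np.arange(n - 1), np.arange(1, n)] = 1.0
C[-1, :] = -a[::-1]

lam = np.linalg.eigvals(C)[0]              # any eigenvalue of C
u = lam ** np.arange(n)                    # u = [1, lam, ..., lam^{n-1}]^T
resid_u = np.abs(C @ u - lam * u).max()

# Theorem 1.2: if S^{-1} A S = C, then S u is an eigenvector of A.
S = rng.standard_normal((n, n))            # random S, invertible a.s.
A = S @ C @ np.linalg.inv(S)
resid_v = np.abs(A @ (S @ u) - lam * (S @ u)).max()
print(resid_u, resid_v)
```

The first $n-1$ rows of $Cu = \lambda u$ hold because each shifts the powers of $\lambda$ up by one, and the last row is exactly the characteristic equation; the transported vector $Su$ then inherits the eigenrelation.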
The following corollaries are particular cases of Theorem 5.1. If $b_1 = b_2 = \cdots = b_n = 0$, then the matrix becomes a Leslie matrix, and we have the following corollary.
Corollary 5.2. Let $\lambda$ be an eigenvalue of a Leslie matrix $L$ defined in Corollary 3.2. Then the vector $v$ of Theorem 5.1 with $b_j = 0$ is an eigenvector of $L$ corresponding to the eigenvalue $\lambda$. A nonzero scalar multiple of $v$ is also an eigenvector of $L$ corresponding to the eigenvalue $\lambda$.
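For the Leslie case the eigenvector can be written down explicitly: a candidate with components $v_i = (s_1 \cdots s_{i-1})\,\lambda^{n-i}$ (our normalization; the paper's displayed vector is lost in this copy) satisfies $Lv = \lambda v$, as a numerical check with made-up rates shows:

```python
import numpy as np

r = np.array([0.0, 1.0, 0.75])    # hypothetical birth rates (first row)
s = np.array([0.9, 0.6])          # survival rates (subdiagonal)
n = 3
L = np.zeros((n, n))
L[0, :] = r
L[1, 0], L[2, 1] = s

lam = max(np.linalg.eigvals(L), key=abs).real   # dominant (Perron) eigenvalue

# Candidate explicit eigenvector: v_i = (s_1 ... s_{i-1}) * lam^{n-i}.
cum = np.concatenate(([1.0], np.cumprod(s)))
v = cum * lam ** (n - 1 - np.arange(n))
resid = np.abs(L @ v - lam * v).max()
print(lam, resid)
```

Rows $2, \ldots, n$ of $Lv = \lambda v$ hold by construction of the cumulative products, and row 1 reduces to the characteristic equation, so the residual vanishes for any eigenvalue $\lambda$.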
If $s_1 = s_2 = \cdots = s_{n-1} = 1$, then we have the following corollary, as in [13, pp. 270–272].
Corollary 5.3. Let $\lambda$ be an eigenvalue of a doubly companion matrix $C$ defined in Corollary 3.3. Then the vector $v$ of Theorem 5.1 with $s_j = 1$ is an eigenvector of $C$ corresponding to the eigenvalue $\lambda$.
Corollary 5.4. Let $\lambda$ be an eigenvalue of a companion matrix $C$ defined in Corollary 3.4. Then $v = [\,\lambda^{n-1}\ \lambda^{n-2}\ \cdots\ \lambda\ 1\,]^T$ is an eigenvector of $C$ corresponding to the eigenvalue $\lambda$.
Wiwat Wanicharpichat received his M.Ed. (in Mathematics) from Srinakharinwirot University. Since 1976 he has been at Naresuan University. In February 1996, he was appointed associate professor by the Naresuan University Committee. His research interests include linear algebra, matrix analysis, and ring theory.
Department of Mathematics, Faculty of Science, Naresuan University, Phitsanulok 65000, Thailand.
e-mail: wiwatw@nu.ac.th

Keywords: Schur complement; Leslie matrix; doubly Leslie matrix; companion matrix; Toeplitz matrix; nonderogatory matrix; eigenvalue; eigenvector.

1. Introduction

One of the most popular models of population growth is a matrix-based model, first introduced by P. H. Leslie. In 1945, he published his most famous article in Biometrika.

2. Some Properties of Schur Complement


3. Inverse Formula of Doubly Leslie Matrix

The following theorem follows from Theorem 2.1.

4. Explicit Minimum Polynomial of Doubly Leslie Matrix

The author in [12, Theorem 33] asserted that the doubly companion matrix is nonderogatory. We now wish to show that the same holds for any doubly Leslie matrix.

5. Explicit Eigenvector Formula of Doubly Leslie Matrix

Analogously to the eigenvector of a companion matrix in [2, pp. 630–631] and in [10, p. 6], we obtain the following.

6. Conclusion

The doubly Leslie matrix is a nonderogatory matrix. This paper has used a special form of the Schur complement to obtain the determinant, inverse, and explicit eigenvector formulas of the doubly Leslie matrix, which generalizes the Leslie matrix; it is also a generalized form of the doubly companion matrix and of the companion matrix.
Acknowledgements

The author is very grateful to the anonymous referees for their comments and suggestions, which inspired the improvement of the manuscript. This work was supported by Naresuan University.

References

1. N. Bacaër, A Short History of Mathematical Population Dynamics, Springer, New York, 2011.
2. L. Brand, The companion matrix and its properties, The American Mathematical Monthly 71 (1964), no. 6, 629–634. DOI: 10.2307/2312322
3. C. Brezinski, Other manifestations of the Schur complement, Linear Algebra Appl. 111 (1988), 231–247. DOI: 10.1016/0024-3795(88)90062-6
4. J.C. Butcher and P. Chartier, The effective order of singly-implicit Runge-Kutta methods, Numerical Algorithms 20 (1999), 269–284. DOI: 10.1023/A:1019176422613
5. M.Q. Chen and X. Li, Spectral properties of a near-periodic row-stochastic Leslie matrix, Linear Algebra Appl. 409 (2005), 166–186. DOI: 10.1016/j.laa.2005.07.005
6. R.A. Horn and C.R. Johnson, Matrix Analysis, Cambridge University Press, Cambridge, UK, 1996.
7. S.J. Kirkland and M. Neumann, Convexity and concavity of the Perron root and vector of Leslie matrices with applications to a population model, SIAM J. Matrix Anal. Appl. 15 (1994), no. 4, 1092–1107. DOI: 10.1137/S0895479893249228
8. P. Lancaster and M. Tismenetsky, The Theory of Matrices, Second Edition with Applications, Academic Press, San Diego, 1985.
9. C.D. Meyer, Matrix Analysis and Applied Linear Algebra, SIAM, Philadelphia, 2000.
10. S. Moritsugu and K. Kuriyama, A linear algebra method for solving systems of algebraic equations, J. Jap. Soc. Symb. Alg. Comp. (J. JSSAC) 7 (2000), no. 4, 2–22.
11. D. Poole, Linear Algebra: A Modern Introduction, Second Edition, Thomson Learning, London, 2006.
12. W. Wanicharpichat, Nonderogatory of sum and product of doubly companion matrices, Thai J. Math. 9 (2011), no. 2, 337–348.
13. W. Wanicharpichat, Explicit eigenvectors formulae for lower doubly companion matrices, Thai J. Math. 11 (2013), no. 2, 261–274.
14. F. Zhang, The Schur Complement and Its Applications, Numerical Methods and Algorithms, Springer, New York, 2005.
