0. Introduction
A scalar $\lambda$ is called an eigenvalue of an $n\times n$ complex matrix $A$ if there is a nontrivial solution $x$ of $Ax=\lambda x$ [1]. The eigenvalues of a matrix $A$ are the roots of its characteristic polynomial, and so are difficult to evaluate in general [2,3]. "Computer software such as Mathematica and Maple can use symbolic calculations to find the characteristic polynomial of a moderate-sized matrix" [4]. But there is no formula or finite algorithm to solve the characteristic equation of a general $n\times n$ matrix for $n\ge 5$, and the best numerical methods for finding eigenvalues avoid the characteristic polynomial entirely [4]. Is there a way to represent each eigenvalue of a matrix by some simple numerical quantities, avoiding characteristic polynomials?
In this paper, we focus on the spectrum of orthogonal matrices whose cube is symmetric. This is a common matrix class: such a matrix need not be symmetric itself, but its cube is. It is difficult to obtain the eigenvalues of an orthogonal matrix by finding the roots of its characteristic polynomial, and the trace is a useful tool here. Smith [5] obtained exact trace-based expressions for the eigenvalues of a $3\times 3$ real symmetric matrix, although the expressions are somewhat complicated. Lin et al. [6] discussed the solutions of trace equations for orthogonal matrices whose square is symmetric, that is, whose eigenvalues are all real or purely imaginary. Chen et al. [7] obtained the spectrum of a $3\times 3$ orthogonal matrix whose trace is an integer, and Chen et al. [8] gave an explicit expression for the spectrum of a $3\times 3$ orthogonal matrix in terms of its trace.
Inspired by the close relationship between the possible eigenvalues of a matrix and the traces of its powers, we discuss the properties of orthogonal matrices whose cube is symmetric and obtain their exact eigenvalues, expressed by the traces of the matrix and its powers, in Section 2. The numerical examples in Section 3 show that our method is simple and practical: it only uses matrix traces and avoids solving characteristic polynomials. As an application, we derive a criterion for orthogonal similarity in Section 4 and obtain a condition for the sum of two orthogonal matrices to be orthogonal.
Throughout this paper, we denote the transpose, determinant, and trace of a square matrix $A$ by $A^{T}$, $\det(A)$, and $\operatorname{tr}(A)$ respectively. $I_n$ represents the identity matrix of order $n$; we sometimes write it briefly as $E$. Denote the imaginary unit by $i$, that is, $i^{2}=-1$. Furthermore, $O_n$ and $SO_n$ stand for the sets of orthogonal matrices and symmetric orthogonal matrices of order $n$, respectively.
1. Preliminary
As we know, the eigenvalues of an orthogonal matrix are always $1$, $-1$, or pairs of conjugate complex (non-real) numbers of modulus one. Let the multiplicities of the eigenvalues $1$ and $-1$ be $s$ and $t$ respectively, and let the number of pairs of conjugate complex (non-real) eigenvalues be $k$. We shall first recall the standard form of an orthogonal matrix $A$.
Lemma 1.
([1], Corollary 2.5.11(c)) For an orthogonal matrix $A$ of order $n$, there exists $Q\in O_n$ such that
$$Q^{T}AQ=\operatorname{diag}\bigl(I_s,\;-I_t,\;A_{\theta_1},\;\dots,\;A_{\theta_k}\bigr) \qquad (1)$$
and $s+t+2k=n$, where $A_{\theta_j}=\begin{pmatrix}\cos\theta_j & \sin\theta_j\\ -\sin\theta_j & \cos\theta_j\end{pmatrix}$, $\theta_j\in(0,\pi)\cup(\pi,2\pi)$, $j=1,\dots,k$.
The set of eigenvalues of a matrix $A$ is also called the spectrum of $A$, denoted by $\sigma(A)$. We collect some elementary properties of the blocks $A_{\theta}$ appearing in Lemma 1; they are easy to verify by direct calculation.
Lemma 2. Let $A_{\theta}=\begin{pmatrix}\cos\theta & \sin\theta\\ -\sin\theta & \cos\theta\end{pmatrix}$ with $\theta\in(0,2\pi)$. Then
- (1)
$\sigma(A_{\theta})=\{\cos\theta+i\sin\theta,\ \cos\theta-i\sin\theta\}$, and $\operatorname{tr}(A_{\theta})=2\cos\theta$;
- (2)
$A_{\theta}^{m}=A_{m\theta}$ for every positive integer $m$;
- (3)
$A_{\theta}^{T}=A_{2\pi-\theta}=A_{\theta}^{-1}$, so $A_{\theta}$ is symmetric if and only if $\theta=\pi$.
In view of Lemma 2, we see that $\cos\theta\pm i\sin\theta$ are eigenvalues of $A$ if and only if $A_{\theta}$ or $A_{2\pi-\theta}$ is a sub-block of the standard form (1). In fact, if $\cos\theta\pm i\sin\theta\in\sigma(A)$, then there is a sub-block $A_{\varphi}$ with $\cos\varphi\pm i\sin\varphi=\cos\theta\pm i\sin\theta$. Therefore $\cos\varphi=\cos\theta$, and it follows that $\sin\varphi=\pm\sin\theta$. Then $\varphi=\theta$ or $\varphi=2\pi-\theta$, so $A_{\varphi}=A_{\theta}$ or $A_{\varphi}=A_{2\pi-\theta}$.
The correspondence between sub-blocks of the standard form and the eigenvalues will be an important fact for our later discussion.
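To make the block–eigenvalue correspondence concrete, the following small numerical sketch (ours, not part of the original paper; it assumes NumPy, and the helper name `rot` is our own) checks the three properties of Lemma 2 for a sample angle.

```python
import numpy as np

def rot(theta):
    """2x2 block A_theta = [[cos t, sin t], [-sin t, cos t]] used in Lemmas 1 and 2."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

theta = 2 * np.pi / 3
A = rot(theta)

# (1) eigenvalues are cos(theta) +/- i sin(theta), and tr(A_theta) = 2 cos(theta)
print(np.linalg.eigvals(A))                  # approx -0.5 +/- 0.866i
print(np.trace(A), 2 * np.cos(theta))        # both equal -1.0

# (2) A_theta^m = A_{m theta}
print(np.allclose(np.linalg.matrix_power(A, 3), rot(3 * theta)))  # True

# (3) A_theta is symmetric only when theta = pi
print(np.allclose(A, A.T))                   # False for theta = 2*pi/3
```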
2. Main results
Based on the background in Section 1, we consider how to calculate the eigenvalues of an orthogonal matrix whose cube is symmetric quickly. We first show a simple fact.
Theorem 1.
For an orthogonal matrix A, it has
$$A^{3}=(A^{3})^{T}\ \Longleftrightarrow\ \sigma(A)\subseteq\Bigl\{1,\,-1,\,\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i,\,-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i\Bigr\},$$
that is, $A^{3}$ is symmetric if and only if every eigenvalue of $A$ is a sixth root of unity.
Proof. For one direction, if every eigenvalue of $A$ is $1$ or $-1$ or $\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$ or $-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$, then $I$, $-I$, $A_{\pi/3}$, $A_{2\pi/3}$ are the only possible different sub-blocks of the standard form. By (1), we can assume
$$Q^{T}AQ=\operatorname{diag}\bigl(I_s,\,-I_t,\,\underbrace{A_{\pi/3},\dots,A_{\pi/3}}_{k_1},\,\underbrace{A_{2\pi/3},\dots,A_{2\pi/3}}_{k_2}\bigr).$$
Combined with Lemma 2, this yields
$$Q^{T}A^{3}Q=\operatorname{diag}\bigl(I_s,\,-I_t,\,\underbrace{A_{\pi},\dots,A_{\pi}}_{k_1},\,\underbrace{A_{2\pi},\dots,A_{2\pi}}_{k_2}\bigr)=\operatorname{diag}\bigl(I_s,\,-I_t,\,-I_{2k_1},\,I_{2k_2}\bigr),$$
so $A^{3}$ is symmetric.
For the other direction, as $A^{3}=(A^{3})^{T}=(A^{T})^{3}=A^{-3}$, it has $(A^{2})^{3}=E$, so every eigenvalue of $A^{2}$ is a cube root of unity. As the eigenvalues of $A^{2}$ are the squares of those of $A$, every eigenvalue $\lambda$ of $A$ satisfies $\lambda^{6}=1$, hence is $1$ or $-1$ or $\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$ or $-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$. □
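As a quick numerical illustration of Theorem 1 (our own sketch, not the authors' computation), one can assemble an orthogonal matrix from the admissible blocks and check that its cube is symmetric and that all of its eigenvalues are sixth roots of unity.

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

# an orthogonal matrix built from the blocks allowed by Theorem 1
A = np.zeros((6, 6))
A[0, 0], A[1, 1] = 1.0, -1.0
A[2:4, 2:4] = rot(np.pi / 3)
A[4:6, 4:6] = rot(2 * np.pi / 3)

A3 = np.linalg.matrix_power(A, 3)
print(np.allclose(A3, A3.T))                        # True: the cube is symmetric
print(np.allclose(np.linalg.eigvals(A) ** 6, 1.0))  # True: sixth roots of unity
```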
Now we can obtain the standard form of an orthogonal matrix $A$ whose cube is symmetric.
Theorem 2.
Let A be an orthogonal matrix of order n whose cube is symmetric. Then there exists an orthogonal matrix Q such that
$$Q^{T}AQ=\operatorname{diag}\bigl(I_s,\,-I_t,\,\underbrace{A_{\pi/3},\dots,A_{\pi/3}}_{k_1},\,\underbrace{A_{2\pi/3},\dots,A_{2\pi/3}}_{k_2}\bigr). \qquad (2)$$
Besides, the parameters $s$, $t$, $k_1$, $k_2$ are as follows:
$$s=\frac{n+2\operatorname{tr}(A)+2\operatorname{tr}(A^{2})+\operatorname{tr}(A^{3})}{6},\qquad
t=\frac{n-2\operatorname{tr}(A)+2\operatorname{tr}(A^{2})-\operatorname{tr}(A^{3})}{6},$$
$$k_1=\frac{n+\operatorname{tr}(A)-\operatorname{tr}(A^{2})-\operatorname{tr}(A^{3})}{6},\qquad
k_2=\frac{n-\operatorname{tr}(A)-\operatorname{tr}(A^{2})+\operatorname{tr}(A^{3})}{6}. \qquad (3)$$
Proof. As $A^{3}=(A^{3})^{T}$, by Theorem 1 every eigenvalue of $A$ is $1$ or $-1$ or $\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$ or $-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$; then Equation (2) follows by Lemma 2 and Theorem 1.
By Equation (2) and Lemma 2,
$$\operatorname{tr}(A)=s-t+k_1-k_2, \qquad (4)$$
$$\operatorname{tr}(A^{2})=s+t-k_1-k_2, \qquad (5)$$
$$\operatorname{tr}(A^{3})=s-t-2k_1+2k_2, \qquad (6)$$
$$n=s+t+2k_1+2k_2. \qquad (7)$$
Adding Equations (4) and (5) gives $\operatorname{tr}(A)+\operatorname{tr}(A^{2})=2s-2k_2$, and adding Equations (6) and (7) gives $\operatorname{tr}(A^{3})+n=2s+4k_2$. Then
$$6s=n+2\operatorname{tr}(A)+2\operatorname{tr}(A^{2})+\operatorname{tr}(A^{3}). \qquad (8)$$
By Equations (5) and (7) it follows that $n-\operatorname{tr}(A^{2})=3(k_1+k_2)$. Therefore,
$$s+t=n-2(k_1+k_2)=\frac{n+2\operatorname{tr}(A^{2})}{3}.$$
Combined with Equation (8), this yields
$$6t=2n+4\operatorname{tr}(A^{2})-6s,$$
namely,
$$6t=n-2\operatorname{tr}(A)+2\operatorname{tr}(A^{2})-\operatorname{tr}(A^{3}).$$
Consequently, subtracting Equation (6) from Equation (4) gives $\operatorname{tr}(A)-\operatorname{tr}(A^{3})=3(k_1-k_2)$; together with $k_1+k_2=\tfrac{n-\operatorname{tr}(A^{2})}{3}$, we have
$$6k_1=n+\operatorname{tr}(A)-\operatorname{tr}(A^{2})-\operatorname{tr}(A^{3}),\qquad 6k_2=n-\operatorname{tr}(A)-\operatorname{tr}(A^{2})+\operatorname{tr}(A^{3}).$$
So Equation (3) is established. □
Theorem 2 indicates that the exact eigenvalues and multiplicities of an orthogonal matrix whose cube is symmetric can be obtained from Equation (3), which involves only the traces of the matrix and its powers, avoiding the computation of characteristic polynomials.
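A minimal computational sketch of this remark (our illustration, assuming NumPy; the function names and the randomly conjugated test matrix are not from the paper): Equation (3) needs only matrix products and traces, and its output can be compared with a direct eigenvalue computation.

```python
import numpy as np

def multiplicities(A):
    """Evaluate Equation (3) for an orthogonal A whose cube is symmetric."""
    n = A.shape[0]
    a, b, c = np.trace(A), np.trace(A @ A), np.trace(A @ A @ A)
    s  = (n + 2*a + 2*b + c) / 6   # multiplicity of eigenvalue 1
    t  = (n - 2*a + 2*b - c) / 6   # multiplicity of eigenvalue -1
    k1 = (n + a - b - c) / 6       # pairs  1/2 +/- (sqrt(3)/2) i
    k2 = (n - a - b + c) / 6       # pairs -1/2 +/- (sqrt(3)/2) i
    return s, t, k1, k2

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

# build a test matrix in the standard form of Theorem 2 and hide it behind a
# random orthogonal change of basis Q (so A itself is not symmetric in general)
blocks = [np.eye(2), -np.eye(1), rot(np.pi / 3), rot(2 * np.pi / 3)]
D = np.zeros((7, 7))
i = 0
for B in blocks:
    m = B.shape[0]
    D[i:i + m, i:i + m] = B
    i += m
Q, _ = np.linalg.qr(np.random.randn(7, 7))
A = Q @ D @ Q.T

print(multiplicities(A))                  # approx (2.0, 1.0, 1.0, 1.0)
print(np.round(np.linalg.eigvals(A), 6))  # sixth roots of unity, for comparison
```

Only three matrix products and four traces are needed, so no characteristic polynomial is formed at any point.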
In 1962, while solving the similarity problem of the symplectic square under the symplectic group, Hua Loo-Keng obtained the result that two orthogonal matrices are similar if and only if their characteristic matrices have the same elementary divisors [[9], Theorem 1]. By Lemma 1, this is equivalent to saying that the orthogonal similarity of two orthogonal matrices is determined by their eigenvalues.
Theorem 3. Let A and B be orthogonal matrices of the same order whose cubes are both symmetric. Then A and B are orthogonally similar if and only if $\operatorname{tr}(A)=\operatorname{tr}(B)$, $\operatorname{tr}(A^{2})=\operatorname{tr}(B^{2})$, and $\operatorname{tr}(A^{3})=\operatorname{tr}(B^{3})$.
Proof. For the necessity, if A and B are orthogonally similar, it is obvious that $\operatorname{tr}(A)=\operatorname{tr}(B)$, and the same holds for the traces of their squares and cubes.
For the sufficiency, Theorem 2 shows that the standard form of an orthogonal matrix whose cube is symmetric is determined by Equation (3), which only involves the traces of the matrix, its square, and its cube. Consequently, when these traces coincide, A and B have the same standard form and are orthogonally similar. □
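A hedged sketch of how Theorem 3 might be used in practice (our own code, not the authors'): for orthogonal matrices with symmetric cubes, orthogonal similarity reduces to comparing three traces.

```python
import numpy as np

def traces_1_2_3(A):
    """The three traces tr(A), tr(A^2), tr(A^3) used in Theorem 3."""
    A2 = A @ A
    return np.trace(A), np.trace(A2), np.trace(A2 @ A)

def orthogonally_similar(A, B, tol=1e-9):
    """Test of Theorem 3 for orthogonal A, B (same order) whose cubes are symmetric."""
    return np.allclose(traces_1_2_3(A), traces_1_2_3(B), atol=tol)

# usage: an orthogonal change of basis leaves all three traces unchanged
Q, _ = np.linalg.qr(np.random.randn(4, 4))
A = np.diag([1.0, 1.0, -1.0, -1.0])           # orthogonal, with symmetric cube
print(orthogonally_similar(A, Q @ A @ Q.T))    # True
```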
3. Numerical Examples
We first consider an example from [[10], Example 6.2].
Example 1.
Let D be the matrix of [[10], Example 6.2]. Then D is asymmetric. By [10], it belongs to the symmetry group G of the regular hexagon, and it is also orthogonal. By calculation, we see that $D^{3}$ is symmetric. It would be very complicated to calculate the eigenvalues of D from its characteristic polynomial. Instead, computing $\operatorname{tr}(D)$, $\operatorname{tr}(D^{2})$ and $\operatorname{tr}(D^{3})$ and applying Equation (3) gives the multiplicities $s$, $t$, $k_1$, $k_2$; combined with Equation (2), this yields the standard form and the spectrum of D.
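Since the explicit matrix D of [10, Example 6.2] is not reproduced above, the following sketch (ours) uses a hypothetical stand-in: the 6×6 permutation matrix of a 60-degree rotation of a regular hexagon, which is likewise orthogonal, asymmetric, and has a symmetric cube.

```python
import numpy as np

# hypothetical stand-in for Example 1: vertex permutation of a 60-degree rotation
P = np.roll(np.eye(6), 1, axis=0)     # cyclic permutation matrix, asymmetric

print(np.allclose(P @ P.T, np.eye(6)))   # orthogonal:      True
print(np.allclose(P, P.T))               # symmetric:       False
P3 = np.linalg.matrix_power(P, 3)
print(np.allclose(P3, P3.T))             # cube symmetric:  True

n = 6
a, b, c = np.trace(P), np.trace(P @ P), np.trace(P3)
print((n + 2*a + 2*b + c) / 6,   # s  = 1
      (n - 2*a + 2*b - c) / 6,   # t  = 1
      (n + a - b - c) / 6,       # k1 = 1
      (n - a - b + c) / 6)       # k2 = 1
```

For this stand-in all three traces vanish, so Equation (3) gives one copy of each admissible block, i.e., the spectrum consists of all six sixth roots of unity.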
Example 2.
From [[10], Example 8.1], there are four orthogonal matrices. By checking, they are pairwise different and all asymmetric, and it is easy to verify that their cubes are symmetric. Therefore they are orthogonal matrices whose cubes are symmetric.
Computing the traces of these matrices and of their squares and cubes, one finds that the corresponding traces coincide. Then by Theorem 3 the four matrices are orthogonally similar; they have the same orthogonal standard form and the same spectrum.
Example 3.
From [[11], Theorem 3.3], consider the orthogonal matrices of order 3 that have only three non-zero elements and whose diagonal elements are all zero. By calculation, each such matrix A satisfies
$$\operatorname{tr}(A)=\operatorname{tr}(A^{2})=0, \qquad (9)$$
and
$$A^{3}=\pm E,\quad\text{so that}\quad \operatorname{tr}(A^{3})=\pm 3. \qquad (10)$$
It shows that these matrices are orthogonal with symmetric cube, and they meet Theorem 2.
As we know, if two matrices are orthogonally similar, then the traces of their corresponding powers are equal. For the matrices above the converse is also correct. In fact, if two of them have the same value of $\operatorname{tr}(A^{3})$, then by Equations (9) and (10) all three traces coincide, which indicates that they meet the condition of Theorem 3, namely, they are orthogonally similar.
Furthermore, if $A^{3}=E$, then by Equations (9) and (10) it follows that $\operatorname{tr}(A)=\operatorname{tr}(A^{2})=0$ and $\operatorname{tr}(A^{3})=3$. Then from Equation (3), it follows that $s=1$, $t=0$, $k_1=0$, $k_2=1$, that is, $\sigma(A)=\{1,\,-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i\}$.
If $A^{3}=-E$, then by Equations (9) and (10) it follows that $\operatorname{tr}(A)=\operatorname{tr}(A^{2})=0$ and $\operatorname{tr}(A^{3})=-3$. Combined with Equation (3), this yields $s=0$, $t=1$, $k_1=1$, $k_2=0$, that is, $\sigma(A)=\{-1,\,\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i\}$.
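The matrices of [11, Theorem 3.3] are not reproduced above, so the following sketch (ours) uses two hypothetical representatives of that family, 3×3 orthogonal matrices with three non-zero entries and zero diagonal, to illustrate how Equation (3) recovers the two spectra just described.

```python
import numpy as np

def spectrum_data(A):
    """(s, t, k1, k2) of Equation (3)."""
    n = A.shape[0]
    a, b, c = np.trace(A), np.trace(A @ A), np.trace(np.linalg.matrix_power(A, 3))
    return ((n + 2*a + 2*b + c) / 6, (n - 2*a + 2*b - c) / 6,
            (n + a - b - c) / 6, (n - a - b + c) / 6)

# hypothetical members of the family in [11]: signed 3-cycle matrices
A1 = np.array([[0, 0, 1],
               [1, 0, 0],
               [0, 1, 0]])     # A1^3 =  E, tr(A1^3) =  3
A2 = np.array([[0, 0, -1],
               [1, 0,  0],
               [0, 1,  0]])    # A2^3 = -E, tr(A2^3) = -3

print(spectrum_data(A1))   # -> s=1, t=0, k1=0, k2=1: spectrum {1, -1/2 +/- (sqrt(3)/2) i}
print(spectrum_data(A2))   # -> s=0, t=1, k1=1, k2=0: spectrum {-1, 1/2 +/- (sqrt(3)/2) i}
```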
4. Application
As everyone knows, the sum of orthogonal matrices is not necessarily orthogonal. For example, the identity matrix E is orthogonal; however, $E+E=2E$ is not orthogonal. So it is natural to consider under what conditions the sum of two orthogonal matrices could be orthogonal. It is an interesting question.
Proposition 1. [12] Let be orthogonal. Then is orthogonal if and only if are orthogonal matrices of order , and , where P is an arbitrary orthogonal matrix, and .
There are some mistakes in Proposition 1: under its stated conditions, the sum A + B can have eigenvalues 2 or 0, which contradicts the orthogonality of A + B.
Proposition 2. [13] Let be orthogonal. Then is orthogonal if and only if are orthogonal matrices of order , and , where P is an arbitrary orthogonal matrix, and .
Proposition 2 is not rigorous either. Since P is arbitrary, as in Proposition 1, two different choices of P lead to contradictory conclusions.
As an application, we will give a new characterization of when the sum of two orthogonal matrices is orthogonal.
Theorem 4. Let A and B be orthogonal matrices of order n. Then A + B is orthogonal if and only if n is an even number and the cube of $A^{T}B$ is the identity while $A^{T}B$ has no real eigenvalue, namely,
$$\sigma(A^{T}B)=\Bigl\{-\tfrac{1}{2}+\tfrac{\sqrt{3}}{2}i,\ -\tfrac{1}{2}-\tfrac{\sqrt{3}}{2}i\Bigr\}. \qquad (11)$$
Proof. For the necessity, as A + B is orthogonal, then $(A+B)^{T}(A+B)=E$, namely, $A^{T}B+B^{T}A=-E$. Therefore, $A^{T}B+\tfrac{1}{2}E$ is an anti-symmetric matrix, so all of its eigenvalues are purely imaginary. Now we can suppose an eigenvalue of $A^{T}B+\tfrac{1}{2}E$ is $bi$, here $i$ is the imaginary unit of the complex field and the coefficient $b\in\mathbb{R}$. Then the corresponding eigenvalue of $A^{T}B$ is $-\tfrac{1}{2}+bi$. Consequently, since $A^{T}B$ is orthogonal, it has $\bigl|-\tfrac{1}{2}+bi\bigr|=1$, namely, $\tfrac{1}{4}+b^{2}=1$, and it follows that $b=\pm\tfrac{\sqrt{3}}{2}$.
It indicates that the eigenvalues of $A^{T}B$ are only $-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$. From Equation (1) and Lemma 2, we may take $Q\in O_n$ with
$$Q^{T}(A^{T}B)Q=\operatorname{diag}\bigl(\underbrace{A_{2\pi/3},\dots,A_{2\pi/3}}_{k}\bigr),$$
where $n=2k$. It shows that n is an even number and $A^{T}B$ has no real eigenvalue. Combining with Lemma 2 yields
$$Q^{T}(A^{T}B)^{3}Q=\operatorname{diag}\bigl(\underbrace{A_{2\pi},\dots,A_{2\pi}}_{k}\bigr)=E,$$
that means $(A^{T}B)^{3}=E$. This completes the proof of the necessity.
For the sufficiency, since $(A^{T}B)^{3}=E$, it follows that $x^{3}-1$ is an annihilating polynomial of $A^{T}B$. As $x^{3}-1=(x-1)(x^{2}+x+1)$ and $A^{T}B$ has no real eigenvalue, the eigenvalues of $A^{T}B$ are only $-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$, and Equation (11) follows. Then there exists $Q\in O_n$ such that
$$Q^{T}(A^{T}B)Q=\operatorname{diag}\bigl(\underbrace{A_{2\pi/3},\dots,A_{2\pi/3}}_{n/2}\bigr).$$
By Lemma 2, it follows that $A^{T}B+B^{T}A=A^{T}B+(A^{T}B)^{T}=-E$. Furthermore, $(A+B)^{T}(A+B)=2E+A^{T}B+B^{T}A=E$, so A + B is orthogonal. □
Corollary 1. For any orthogonal matrix A of even order, there exists an orthogonal matrix B of the same order such that A + B is orthogonal.
Proof. By Lemma 1 and Lemma 2, we can take
$$B=A\,Q\operatorname{diag}\bigl(\underbrace{A_{2\pi/3},\dots,A_{2\pi/3}}_{n/2}\bigr)Q^{T}, \qquad (12)$$
where Q is an arbitrary orthogonal matrix of order n. Then B is orthogonal, $A^{T}B$ satisfies Equation (11), and n is even. Hence, by Theorem 4, A + B is orthogonal. □
Corollary 1 indicates that there are many pairs of orthogonal matrices whose sum is orthogonal as well: we can construct many such pairs of the form given in Equation (12). We consider the following example.
Example 4.
Let A be an orthogonal matrix of order 4 such that A + B is orthogonal. By Theorem 4 and Equation (12), it follows that
$$B=A\,Q\operatorname{diag}\bigl(A_{2\pi/3},\,A_{2\pi/3}\bigr)Q^{T}$$
for some orthogonal matrix Q of order 4. Different choices of Q in general give different matrices B, which means the partner B of A is not unique.
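A minimal construction sketch (ours, under the assumption that Equation (12) has the block form reconstructed above): starting from any orthogonal A of even order, a partner B with A + B orthogonal can be produced from rotation blocks of angle $2\pi/3$.

```python
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [-s, c]])

def partner(A, Q=None):
    """Build B so that A + B is orthogonal, given an orthogonal A of even order n."""
    n = A.shape[0]
    if Q is None:
        Q, _ = np.linalg.qr(np.random.randn(n, n))
    C = np.zeros((n, n))
    for i in range(0, n, 2):
        C[i:i + 2, i:i + 2] = rot(2 * np.pi / 3)  # eigenvalues -1/2 +/- (sqrt(3)/2) i
    C = Q @ C @ Q.T                               # C^3 = E, no real eigenvalues
    return A @ C                                  # the form of Equation (12)

A, _ = np.linalg.qr(np.random.randn(4, 4))        # a random orthogonal A of order 4
B = partner(A)
S = A + B
print(np.allclose(S.T @ S, np.eye(4)))            # True: A + B is orthogonal
```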
From Theorem 4, we easily obtain [[14], Theorem 3] as follows.
Corollary 2. Let A and B be orthogonal matrices of order n. Then the following statements are equivalent:
- (1)
A + B is orthogonal;
- (2)
$A^{T}B+\tfrac{1}{2}E$ is real anti-symmetric;
- (3)
the eigenvalues of $A^{T}B$ are $-\tfrac{1}{2}\pm\tfrac{\sqrt{3}}{2}i$.
This gives several equivalent characterizations of when the sum of two orthogonal matrices is orthogonal.
Author Contributions
Investigation, Meixiang Chen, Zhongpeng Yang and Zhixing Lin; writing—review and editing, Meixiang Chen; formal analysis, Zhixing Lin; project administration, Meixiang Chen. All the authors contributed equally to all the parts of this work. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the National Natural Science Foundation of China No. 61772292, the Natural Science Foundation of Fujian Province No. 2021J011103, and by the Science and Technology Project of Putian City No. 2022SZ3001ptxy05.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Acknowledgments
The authors wish to thank the anonymous referees for their detailed and very helpful comments and suggestions that improved this article.
Conflicts of Interest
The authors declare no conflict of interest.
References
- HORN R. A., JOHNSON C. R. Matrix Analysis, 2nd ed. New York: Cambridge University Press, 2013.
- WOLKOWICZ H., STYAN G. P. H. Bounds for eigenvalues using traces. Linear Algebra and its Applications 1980, 29: 471–506. [CrossRef]
- WOLKOWICZ H., STYAN G. P. H. More bounds for eigenvalues using traces. Linear Algebra and its Applications 1980, 31: 1–17. [CrossRef]
- LAY D. C. Linear Algebra and its Applications, 5th ed. New Jersey: Pearson Education, Inc., 2019: 281.
- SMITH O. K. Eigenvalues of a symmetric 3×3 matrix. Communications of the ACM 1961, 4(4): 168. [CrossRef]
- LIN Z. X., YANG Z. P., CHEN M. X., et al. Some researches on orthogonal solutions to a class of matrix trace equations. 2020, 46(2): 115–121.
- CHEN M. X., YANG Z. P., YAN Y. Y., et al. Spectrum of 3×3 orthogonal matrices whose traces are integers. Journal of Beihua University (Natural Science) 2018, 19(2): 158–163. [CrossRef]
- CHEN M. X., YANG Z. P., LI Y. X., et al. The determination of the spectrum of 3×3 orthogonal matrices and its applications. Journal of Fujian Normal University (Natural Science Edition) 2020, 36(4): 1–8.
- HUA L. K. Symplectic similarity of symplectic square matrix. Acta Scientiarum Naturalium Universitatis Sunyatseni 1962, (4): 1–12.
- ZHANG S. G., LIN H. L. Algorithms for symmetric groups of simplexes. Applied Mathematics and Computation 2007, 188: 1610–1634. [CrossRef]
- YANG J. The diagonalization of some orthogonal matrices. Journal of Southwest University for Nationalities (Natural Science Edition) 2011, 37(6): 889–894.
- LIU Z. M. Discussion on the properties of orthogonal matrix. Journal of Chongqing Normal University (Natural Science Edition) 2000, (S1): 162–164.
- REN F. T., WANG Y. X., YANG S. L. On the necessary and sufficient conditions for the sum of orthogonal matrices to be orthogonal matrices. Journal of Mathematics (China) 1999, (05): 48–49.
- JIANG L. Some notes on the row simplicity of matrix and the sum of two orthogonal matrices. Journal of Mathematics for Technology 1998, 14(2): 134–137.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).