Preprint
Article

The Spectrum of the Orthogonal Matrix with its Cube being Symmetric

This version is not peer-reviewed

Submitted: 26 May 2023

Posted: 26 May 2023

Abstract
Orthogonal matrices whose cube is symmetric form a common class of matrices with important properties, and we focus on this class. By using the close relationship between the eigenvalues of a matrix and the traces of its powers, we obtain an algorithm for all of its possible distinct eigenvalues and their multiplicities. The resulting formulas involve only the traces of the matrix and its powers, so the characteristic polynomial never has to be solved. The method is simple and practical. Furthermore, a new essential characterization of when the sum of a pair of orthogonal matrices is orthogonal is given as well.
Keywords: 
Subject: Computer Science and Mathematics  -   Applied Mathematics

0. Introduction

A scalar $\lambda$ is called an eigenvalue of an $n \times n$ complex matrix $A$ if there is a nontrivial solution $x$ of $Ax = \lambda x$ [1]. The eigenvalues of a matrix $A$ are the roots of $\det(A - \lambda E) = 0$ and so are difficult to evaluate in general [2,3]. "Computer software such as Mathematica and Maple can use symbolic calculations to find the characteristic polynomial of a moderate-sized matrix" [4]. But there is no formula or finite algorithm for solving the characteristic equation of a general $n \times n$ matrix for $n \geq 5$, and the best numerical methods for finding eigenvalues avoid the characteristic polynomial entirely [4]. Is there a way to represent each eigenvalue of a matrix by some simple numerical features, avoiding the characteristic polynomial altogether?
In this paper, we focus on the spectrum of orthogonal matrices whose cube is symmetric. This is a common class of matrices: such a matrix need not be symmetric itself, but its cube is. It is difficult to obtain the eigenvalues of an orthogonal matrix by solving for the roots of its characteristic polynomial, and the trace is a simple but useful tool here. Smith [5] obtained the exact eigenvalues of a $3 \times 3$ real symmetric matrix expressed through traces; the expression is, of course, somewhat complicated. Lin et al. [6] discussed the solutions of a matrix trace equation over orthogonal matrices whose square is symmetric and whose eigenvalues are all real or purely imaginary. Chen et al. [7] obtained the spectrum of a $3 \times 3$ orthogonal matrix whose trace is an integer, and Chen et al. [8] gave an explicit expression for the spectrum of a $3 \times 3$ orthogonal matrix in terms of its trace.
Inspired by the close relationship between the possible eigenvalues of a matrix and the traces of its powers, we discuss the properties of orthogonal matrices whose cube is symmetric and obtain their exact eigenvalues, expressed through the traces of the matrix and its powers, in Section 2. The numerical examples in Section 3 show that our calculation method is simple and practical: it uses only matrix traces and avoids solving characteristic polynomials. As applications, we draw a conclusion on the judgment of orthogonal similarity and, in Section 4, obtain a condition for the sum of two orthogonal matrices to be orthogonal.
Throughout this paper, we denote the transpose, determinant, and trace of a square matrix $A$ by $A^T$, $|A|$, and $\operatorname{tr}(A)$, respectively. $E_n$ represents the identity matrix of order $n$, sometimes written briefly as $E$. Denote the imaginary unit by $i \in \mathbb{C}$, that is, $i^2 = -1$. Furthermore, $O_{n \times n}$ and $SO_{n \times n}$ stand for the sets of $n \times n$ orthogonal matrices and symmetric orthogonal matrices, respectively.

1. Preliminary

As we know, the eigenvalues of an orthogonal matrix are always $1$, $-1$, or pairs of conjugate complex (non-real) numbers of modulus one. Let the multiplicities of the eigenvalues $1$ and $-1$ be $t$ and $s$, respectively, and let the number of pairs of conjugate complex (non-real) eigenvalues be $k$. We first introduce the standard form of an orthogonal matrix $A$, denoted by $O_A$.
Lemma 1.
[[1], Corollary 2.5.11(c)] For an orthogonal matrix A of order n, there exists $Q \in O_{n \times n}$ such that
$$Q^{-1} A Q = \operatorname{diag}\left(E_t, -E_s, W_1, \ldots, W_k\right), \qquad t + s + 2k = n, \qquad (1)$$
and $A \in SO_{n \times n} \Longleftrightarrow k = 0$, where $W_j = \begin{pmatrix} a_j & -b_j \\ b_j & a_j \end{pmatrix}$, $a_j^2 + b_j^2 = 1$, $-1 < a_j < 1$, $a_j, b_j \in \mathbb{R}$, $j = 1, 2, \ldots, k$.
The set of eigenvalues of a matrix $A$ is also called the spectrum of $A$, denoted by $\sigma(A)$. We have noted some facts about the matrix $F_2 = \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}$ and list them as follows; they are easy to verify by direct calculation.
Lemma 2.
Let $F_2 = \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}$. Then
(1)
$\sigma(F_2) = \left\{-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)\right\}$ and $\sigma(-F_2) = \left\{\frac{1}{2}\left(1 \pm \sqrt{3}i\right)\right\}$;
(2)
$F_2^2 = F_2^T$, $F_2^3 = E_2$, $(-F_2)^3 = -E_2$;
(3)
$\operatorname{tr} F_2 = -1$, $\operatorname{tr}(-F_2) = 1$, $\operatorname{tr} F_2^3 = 2$, $\operatorname{tr}(-F_2)^3 = -2$.
In view of Lemma 2, we see that $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$ are eigenvalues of A if and only if $F_2$ or $F_2^T$ is a sub-block of the standard form $O_A$. In fact, if $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right) \in \sigma(A)$, then there is a sub-block $W = \begin{pmatrix} a & -b \\ b & a \end{pmatrix}$ in $O_A$. Therefore $\operatorname{tr} W = 2a = -1$, so $a = -\frac{1}{2}$. Then $b = \pm\frac{\sqrt{3}}{2}$ by $|W| = 1$, and hence $W = F_2$ or $W = F_2^T$.
The correspondence between the sub-blocks of the standard form $O_A$ and the eigenvalues will be an important fact in our later discussion.
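The facts collected in Lemma 2 are easy to check numerically. The following short script is a minimal sketch (assuming NumPy; the name F2 simply mirrors the notation above) that verifies items (1)-(3).

```python
import numpy as np

# F2 = (1/2) * [[-1, -sqrt(3)], [sqrt(3), -1]], the rotation block of Lemma 2
F2 = 0.5 * np.array([[-1.0, -np.sqrt(3)], [np.sqrt(3), -1.0]])
E2 = np.eye(2)

# (1) spectra of F2 and -F2: -(1/2)(1 +/- sqrt(3) i) and (1/2)(1 +/- sqrt(3) i)
print(np.linalg.eigvals(F2), np.linalg.eigvals(-F2))

# (2) F2^2 = F2^T, F2^3 = E2, (-F2)^3 = -E2
print(np.allclose(F2 @ F2, F2.T),
      np.allclose(np.linalg.matrix_power(F2, 3), E2),
      np.allclose(np.linalg.matrix_power(-F2, 3), -E2))

# (3) traces: -1, 1, 2, -2
print(np.trace(F2), np.trace(-F2),
      np.trace(np.linalg.matrix_power(F2, 3)),
      np.trace(np.linalg.matrix_power(-F2, 3)))
```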

2. Main results

Based on the background in Section 1, we consider how to quickly calculate the eigenvalues of an orthogonal matrix whose cube is symmetric. We first show a simple fact.
Theorem 1.
For an orthogonal matrix A of order n, we have
$$\sigma(A) \subseteq \left\{1,\; -1,\; -\tfrac{1}{2}\left(1+\sqrt{3}i\right),\; -\tfrac{1}{2}\left(1-\sqrt{3}i\right),\; \tfrac{1}{2}\left(1+\sqrt{3}i\right),\; \tfrac{1}{2}\left(1-\sqrt{3}i\right)\right\} \Longleftrightarrow A^3 \in SO_{n \times n}.$$
Proof. 
For the necessity, if every possible eigenvalue of A is $1$, $-1$, $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$ or $\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$, then $E_t$, $-E_s$, $-F_2$, $F_2$ are all the possible distinct sub-blocks of the standard form $O_A$ (note that $F_2^T$ is orthogonally similar to $F_2$). By (1), we can assume
$$O_A = \operatorname{diag}\big(E_t,\; -E_s,\; \underbrace{-F_2, \ldots, -F_2}_{k_1},\; \underbrace{F_2, \ldots, F_2}_{k_2}\big), \quad \text{where } F_2 = \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}.$$
Combining this with Lemma 2 yields
$$O_A^2 = \operatorname{diag}\big(E_{t+s},\; \underbrace{F_2^T, \ldots, F_2^T}_{k_1},\; \underbrace{F_2^T, \ldots, F_2^T}_{k_2}\big), \quad n = t + s + 2k_1 + 2k_2; \qquad O_A^3 = \operatorname{diag}\big(E_t,\; -E_s,\; \underbrace{-E_2, \ldots, -E_2}_{k_1},\; \underbrace{E_2, \ldots, E_2}_{k_2}\big) \in SO_{n \times n}.$$
For the sufficiency, as $A^3 \in SO_{n \times n}$, we have $\sigma(A^3) \subseteq \{1, -1\}$, so every possible eigenvalue of A is a cube root of $1$ or of $-1$. As
$$x^3 - 1 = (x - 1)\left(x + \tfrac{1}{2}\left(1+\sqrt{3}i\right)\right)\left(x + \tfrac{1}{2}\left(1-\sqrt{3}i\right)\right), \qquad x^3 + 1 = (x + 1)\left(x - \tfrac{1}{2}\left(1+\sqrt{3}i\right)\right)\left(x - \tfrac{1}{2}\left(1-\sqrt{3}i\right)\right),$$
every possible eigenvalue of A is $1$, $-1$, $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$ or $\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$. □
Now we can obtain the standard form of an orthogonal matrix A whose cube is symmetric.
Theorem 2.
Let A be an orthogonal matrix of order n whose cube is symmetric. Then there exists an orthogonal matrix Q such that
$$O_A = Q^{-1} A Q = \operatorname{diag}\big(E_t,\; -E_s,\; \underbrace{-F_2, \ldots, -F_2}_{k_1},\; \underbrace{F_2, \ldots, F_2}_{k_2}\big), \quad \text{where } F_2 = \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}. \qquad (2)$$
Moreover, the parameters $t, s, k_1, k_2$ are given by
$$t = \tfrac{1}{6}\left(n + \operatorname{tr}A^3 + 2\operatorname{tr}A^2 + 2\operatorname{tr}A\right), \quad s = \tfrac{1}{6}\left(n - \operatorname{tr}A^3 + 2\operatorname{tr}A^2 - 2\operatorname{tr}A\right), \quad k_1 = \tfrac{1}{6}\left(n - \operatorname{tr}A^3 - \operatorname{tr}A^2 + \operatorname{tr}A\right), \quad k_2 = \tfrac{1}{6}\left(n + \operatorname{tr}A^3 - \operatorname{tr}A^2 - \operatorname{tr}A\right). \qquad (3)$$
Proof. 
As $A^3 \in SO_{n \times n}$, Theorem 1 shows that every possible eigenvalue of A is $1$, $-1$, $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$ or $\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$; then Equation (2) follows from Lemma 1, Lemma 2 and Theorem 1.
By Lemma 2, we see that
$$\operatorname{tr}A = t - s + k_1 - k_2, \qquad (4)$$
$$\operatorname{tr}A^2 = t + s - k_1 - k_2, \qquad (5)$$
$$\operatorname{tr}A^3 = t - s - 2k_1 + 2k_2, \qquad (6)$$
$$n = t + s + 2k_1 + 2k_2. \qquad (7)$$
Adding Equations (4) and (5) gives $\frac{1}{2}\left(\operatorname{tr}A^2 + \operatorname{tr}A\right) = t - k_2$, and adding Equations (6) and (7) gives $\frac{1}{2}\left(n + \operatorname{tr}A^3\right) = t + 2k_2$. Then
$$k_2 = \tfrac{1}{6}\left(n + \operatorname{tr}A^3 - \operatorname{tr}A^2 - \operatorname{tr}A\right), \qquad t = \tfrac{1}{6}\left(n + \operatorname{tr}A^3 + 2\operatorname{tr}A^2 + 2\operatorname{tr}A\right). \qquad (8)$$
From Equations (5) and (7) it follows that
$$k_1 + k_2 = t + s - \operatorname{tr}A^2 = \left(t + s + 2k_1 + 2k_2\right) - 2\left(k_1 + k_2\right) - \operatorname{tr}A^2 = n - 2\left(k_1 + k_2\right) - \operatorname{tr}A^2.$$
Therefore,
$$k_1 + k_2 = \tfrac{1}{3}\left(n - \operatorname{tr}A^2\right).$$
Combining this with Equation (8) yields
$$k_1 = \tfrac{1}{3}\left(n - \operatorname{tr}A^2\right) - k_2 = \tfrac{1}{3}\left(n - \operatorname{tr}A^2\right) - \tfrac{1}{6}\left(n + \operatorname{tr}A^3 - \operatorname{tr}A^2 - \operatorname{tr}A\right),$$
namely, $k_1 = \frac{1}{6}\left(n - \operatorname{tr}A^3 - \operatorname{tr}A^2 + \operatorname{tr}A\right)$.
Consequently, we have
$$s = n - 2\left(k_1 + k_2\right) - t = n - \tfrac{2}{3}\left(n - \operatorname{tr}A^2\right) - \tfrac{1}{6}\left(n + \operatorname{tr}A^3 + 2\operatorname{tr}A^2 + 2\operatorname{tr}A\right) = \tfrac{1}{6}\left(n - \operatorname{tr}A^3 + 2\operatorname{tr}A^2 - 2\operatorname{tr}A\right).$$
So Equation (3) is established. □
Theorem 2 indicates that the exact eigenvalues and multiplicities of an orthogonal matrix whose cube is symmetric can be obtained from Equation (3), which involves only the traces of the matrix and its powers, avoiding the computation of characteristic polynomials.
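Equation (3) translates directly into a few lines of code. The following is a minimal computational sketch (assuming NumPy; the function name spectrum_from_traces is ours and not taken from the references) that returns the four multiplicities of Theorem 2 using only traces; rounding merely absorbs floating-point error, since the four quantities are integers by Theorem 2.

```python
import numpy as np

def spectrum_from_traces(A: np.ndarray):
    """Multiplicities (t, s, k1, k2) of Theorem 2 for an orthogonal A whose cube is symmetric.

    t  : multiplicity of the eigenvalue  1
    s  : multiplicity of the eigenvalue -1
    k1 : number of conjugate pairs  (1/2)(1 +/- sqrt(3) i)
    k2 : number of conjugate pairs -(1/2)(1 +/- sqrt(3) i)
    """
    n = A.shape[0]
    t1 = np.trace(A)
    t2 = np.trace(A @ A)
    t3 = np.trace(np.linalg.matrix_power(A, 3))
    t  = round((n + t3 + 2 * t2 + 2 * t1) / 6)
    s  = round((n - t3 + 2 * t2 - 2 * t1) / 6)
    k1 = round((n - t3 - t2 + t1) / 6)
    k2 = round((n + t3 - t2 - t1) / 6)
    return t, s, k1, k2
```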
In 1962, when solving the problem of the similarity of symplectic matrices under the symplectic group, Hua Loo-Keng obtained the following necessary and sufficient condition for two orthogonal matrices to be similar: their characteristic matrices have the same elementary divisors [[9], Theorem 1]. This is equivalent to Lemma 1; namely, the orthogonal similarity of two orthogonal matrices is determined by their eigenvalues.
Theorem 3.
Let A, B be orthogonal matrices whose cubes are both symmetric. Then A, B are orthogonally similar if and only if $\operatorname{tr}A = \operatorname{tr}B$, $\operatorname{tr}A^2 = \operatorname{tr}B^2$, and $\operatorname{tr}A^3 = \operatorname{tr}B^3$.
Proof. 
For the necessity, if A, B are orthogonally similar, it is obvious that $\operatorname{tr}A = \operatorname{tr}B$, and the same holds for the traces of their squares and cubes.
For the sufficiency, Theorem 2 shows that the standard form of an orthogonal matrix whose cube is symmetric is determined by Equation (3), which involves only the traces of the matrix, its square, and its cube. Consequently, when $\operatorname{tr}A = \operatorname{tr}B$, $\operatorname{tr}A^2 = \operatorname{tr}B^2$, and $\operatorname{tr}A^3 = \operatorname{tr}B^3$, the matrices A, B have the same standard form and hence are orthogonally similar. □

3. Numerical Examples

We first consider an example from [[10], Example 6.2].
Example 1.
Let D be the orthogonal matrix of order 4 given in [[10], Example 6.2], whose entries are of the form $\frac{1}{8}\left(\pm 6 \pm 2\sqrt{3}\right)$ and $\frac{1}{8}\left(\pm\sqrt{6} \pm \sqrt{2}\right)$. Then D is asymmetric. By [10], it belongs to the subgroup $\bar{G}$ of the symmetry group G associated with regular hexagons, and it is orthogonal. A direct calculation shows that $D^2$ is again asymmetric, while $D^3$ is symmetric. It would be very complicated to calculate the eigenvalues of D from its characteristic polynomial.
Since $D^2 \notin SO_{4 \times 4}$, $D^3 \in SO_{4 \times 4}$, and $\operatorname{tr}D = 3$, $\operatorname{tr}D^2 = 1$, $\operatorname{tr}D^3 = 0$, Equation (3) gives $t = 2$, $k_1 = 1$, $s = k_2 = 0$. Combined with Equation (2), this yields
$$O_D = \operatorname{diag}\left(1,\; 1,\; \frac{1}{2}\begin{pmatrix} 1 & \sqrt{3} \\ -\sqrt{3} & 1 \end{pmatrix}\right) = \operatorname{diag}\left(1, 1, -F_2\right), \qquad \sigma(D) = \left\{1,\; 1,\; \tfrac{1}{2}\left(1 \pm \sqrt{3}i\right)\right\}.$$
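Because the traces are invariant under orthogonal similarity, the computation above can be checked on the standard form $O_D = \operatorname{diag}(1, 1, -F_2)$ instead of on D itself. A minimal sketch (assuming NumPy) recovers $t = 2$, $k_1 = 1$, $s = k_2 = 0$ from the three traces alone.

```python
import numpy as np

F2 = 0.5 * np.array([[-1.0, -np.sqrt(3)], [np.sqrt(3), -1.0]])

# Standard form of D in Example 1: eigenvalues 1, 1 and (1/2)(1 +/- sqrt(3) i)
OD = np.block([[np.eye(2),        np.zeros((2, 2))],
               [np.zeros((2, 2)), -F2             ]])

n  = OD.shape[0]
t1 = np.trace(OD)                               # 3
t2 = np.trace(OD @ OD)                          # 1
t3 = np.trace(np.linalg.matrix_power(OD, 3))    # 0
print(round((n + t3 + 2 * t2 + 2 * t1) / 6),    # t  = 2
      round((n - t3 + 2 * t2 - 2 * t1) / 6),    # s  = 0
      round((n - t3 - t2 + t1) / 6),            # k1 = 1
      round((n + t3 - t2 - t1) / 6))            # k2 = 0
```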
Example 2.
From [[10], Example 8.1], there are four orthogonal matrices as follows,
$$\sigma_1 = \begin{pmatrix} 0 & 0 & 1 & 0 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}, \quad \sigma_2 = \begin{pmatrix} 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}, \quad \sigma_3 = \begin{pmatrix} 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{pmatrix}, \quad \sigma_4 = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix}.$$
By checking, $\sigma_1^2, \sigma_2^2, \sigma_3^2, \sigma_4^2$ are pairwise different and all asymmetric. It is easy to see that
$$\sigma_j^3 = E_4, \qquad \operatorname{tr}\sigma_j = \operatorname{tr}\sigma_j^2 = 1, \qquad \operatorname{tr}\sigma_j^3 = 4, \qquad j = 1, 2, 3, 4.$$
Therefore $\sigma_1, \sigma_2, \sigma_3, \sigma_4$ are orthogonal matrices whose cubes are symmetric.
Let $B = \begin{pmatrix} -\frac{1}{2} & \frac{\sqrt{3}}{2} & 0 & 0 \\ -\frac{\sqrt{3}}{2} & -\frac{1}{2} & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix} = \operatorname{diag}\left(F_2^T, E_2\right)$. Then Lemma 2 yields
$$B^2 = \operatorname{diag}\left(F_2^T, E_2\right)^2 = \operatorname{diag}\left(\left(F_2^T\right)^2, E_2^2\right) = \operatorname{diag}\left(F_2, E_2\right), \qquad B^3 = \operatorname{diag}\left(F_2^T, E_2\right)^3 = \operatorname{diag}\left(\left(F_2^T\right)^3, E_2^3\right) = E_4.$$
So $\operatorname{tr}B = \operatorname{tr}B^2 = 1$ and $\operatorname{tr}B^3 = 4$. Note that $\left(\sigma_j^2\right)^2 = \sigma_j^3 \sigma_j = \sigma_j$ and $\left(\sigma_j^2\right)^3 = \left(\sigma_j^3\right)^2 = E_4$, so $\operatorname{tr}\sigma_j^2 = \operatorname{tr}\left(\sigma_j^2\right)^2 = 1$ and $\operatorname{tr}\left(\sigma_j^2\right)^3 = 4$ as well. Then, by Theorem 3, $\sigma_1^2, \sigma_2^2, \sigma_3^2, \sigma_4^2$ and B are orthogonally similar: they have the same orthogonal standard form and the same spectrum.
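These claims are easy to verify mechanically. The sketch below (assuming NumPy; the helper perm_matrix and the 0-based cycle encodings are ours) builds $\sigma_1, \ldots, \sigma_4$, forms their squares, and prints the three traces of each square and of B; all of them coincide, which is exactly the hypothesis of Theorem 3.

```python
import numpy as np

def perm_matrix(p):
    """Permutation matrix sending e_j to e_{p[j]} (0-based indices)."""
    n = len(p)
    M = np.zeros((n, n))
    for j, i in enumerate(p):
        M[i, j] = 1.0
    return M

# sigma_1, ..., sigma_4 of Example 2, written as 3-cycles with one fixed point
sigmas = [perm_matrix(p) for p in ([1, 2, 0, 3], [1, 3, 2, 0], [3, 1, 0, 2], [0, 2, 3, 1])]

F2 = 0.5 * np.array([[-1.0, -np.sqrt(3)], [np.sqrt(3), -1.0]])
B  = np.block([[F2.T, np.zeros((2, 2))], [np.zeros((2, 2)), np.eye(2)]])

for M in [s @ s for s in sigmas] + [B]:
    print([round(np.trace(np.linalg.matrix_power(M, j))) for j in (1, 2, 3)])
    # every line prints [1, 1, 4], so Theorem 3 applies
```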
Example 3.
From [[11], Theorem 3.3],
$$A = \begin{pmatrix} 0 & a_{12} & 0 \\ 0 & 0 & a_{23} \\ a_{31} & 0 & 0 \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & 0 & b_{13} \\ b_{21} & 0 & 0 \\ 0 & b_{32} & 0 \end{pmatrix}$$
are the orthogonal matrices of order 3 with only three non-zero elements and all diagonal elements equal to zero. By calculation,
$$A^2 = \begin{pmatrix} 0 & 0 & a_{12}a_{23} \\ a_{23}a_{31} & 0 & 0 \\ 0 & a_{31}a_{12} & 0 \end{pmatrix}, \qquad B^2 = \begin{pmatrix} 0 & b_{13}b_{32} & 0 \\ 0 & 0 & b_{21}b_{13} \\ b_{32}b_{21} & 0 & 0 \end{pmatrix},$$
and
$$A^3 = \operatorname{diag}\left(a_{12}a_{23}a_{31},\; a_{23}a_{31}a_{12},\; a_{31}a_{12}a_{23}\right) = a_{12}a_{23}a_{31}E_3 = |A|E_3; \qquad (9)$$
$$B^3 = \operatorname{diag}\left(b_{13}b_{32}b_{21},\; b_{21}b_{13}b_{32},\; b_{32}b_{21}b_{13}\right) = b_{13}b_{32}b_{21}E_3 = |B|E_3. \qquad (10)$$
This shows that A, B are orthogonal matrices whose cubes are symmetric, with
$$\operatorname{tr}A = \operatorname{tr}B = \operatorname{tr}A^2 = \operatorname{tr}B^2 = 0, \qquad \operatorname{tr}A^3 = 3|A|, \qquad \operatorname{tr}B^3 = 3|B|,$$
so Theorem 2 applies.
As we know, if A, B are orthogonally similar, then $|A| = |B|$. Here the converse is also true. In fact, if $|A| = |B|$, then $\operatorname{tr}A^3 = \operatorname{tr}B^3$ by Equations (9) and (10); together with $\operatorname{tr}A = \operatorname{tr}B = \operatorname{tr}A^2 = \operatorname{tr}B^2 = 0$, this shows that A, B satisfy the condition of Theorem 3, namely, A and B are orthogonally similar.
Furthermore, if $|A| = 1$, then Equation (9) yields $\operatorname{tr}A^3 = 3$. Then from Equation (3) it follows that
$$O_A = \operatorname{diag}\left(1,\; \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}\right) = \operatorname{diag}\left(1, F_2\right), \qquad \sigma(A) = \left\{1,\; -\tfrac{1}{2}\left(1 \pm \sqrt{3}i\right)\right\}.$$
If $|A| = -1$, then Equation (9) yields $\operatorname{tr}A^3 = -3$. Combined with Equation (3), this yields
$$O_A = \operatorname{diag}\left(-1,\; \frac{1}{2}\begin{pmatrix} 1 & \sqrt{3} \\ -\sqrt{3} & 1 \end{pmatrix}\right) = \operatorname{diag}\left(-1, -F_2\right), \qquad \sigma(A) = \left\{-1,\; \tfrac{1}{2}\left(1 \pm \sqrt{3}i\right)\right\}.$$
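A quick numerical confirmation of this example (assuming NumPy; the signs of $a_{12}, a_{23}, a_{31}$ are chosen arbitrarily among $\pm 1$) is sketched below.

```python
import numpy as np

a12, a23, a31 = 1.0, -1.0, 1.0          # any choice of signs +/-1 gives an orthogonal A
A = np.array([[0.0, a12, 0.0],
              [0.0, 0.0, a23],
              [a31, 0.0, 0.0]])

detA = np.linalg.det(A)                 # equals a12 * a23 * a31
print(np.allclose(A.T @ A, np.eye(3)))                                # True: A is orthogonal
print(np.allclose(np.linalg.matrix_power(A, 3), detA * np.eye(3)))    # True: A^3 = |A| E_3
print(round(np.trace(A)), round(np.trace(A @ A)),
      round(np.trace(np.linalg.matrix_power(A, 3))))                  # 0 0 3|A|
print(np.round(np.linalg.eigvals(A), 6))                              # the three cube roots of |A|
```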

4. Application

As everyone knows, the sum of two orthogonal matrices is not necessarily orthogonal. For example, $A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ and $B = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ are both orthogonal, but $A + B = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}$ is not orthogonal. So it is natural to ask under what conditions the sum of two orthogonal matrices is orthogonal. This is an interesting question.
Proposition 1.
[12] Let A, B be orthogonal. Then $A + B$ is orthogonal if and only if A, B are orthogonal matrices of order $2n$ and $B = A P F P^T$, where P is an arbitrary orthogonal matrix, $F = \operatorname{diag}\left(E_t, -E_s, F_2^T, \ldots, F_2^T\right)$, and $F_2 = \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}$.
There are some mistakes in Proposition 1. If $t \neq 0$ or $s \neq 0$, then $A^T(A + B) = E + P F P^T = P \operatorname{diag}\left(2E_t, 0_s, E_2 + F_2^T, \ldots, E_2 + F_2^T\right) P^T$; that is, the orthogonal matrix $A^T(A + B)$ would have the eigenvalue $2$ or $0$, which is a contradiction.
Proposition 2.
[13] Let A, B be orthogonal. Then $A + B$ is orthogonal if and only if A, B are orthogonal matrices of order $2n$ and $B = A P F P^T$, where P is an arbitrary orthogonal matrix, $F = \operatorname{diag}\left(F_2^T, \ldots, F_2^T\right)$, and $F_2 = \frac{1}{2}\begin{pmatrix} -1 & -\sqrt{3} \\ \sqrt{3} & -1 \end{pmatrix}$.
Proposition 2 is not rigorous either, since P is said to be arbitrary in it. For instance, taking $P_1 = E_2$ gives $A^T B = F_2^T$ from $B = A P_1 F_2^T P_1^T = A F_2^T$, while taking an orthogonal $P_2$ with $P_2 F_2^T P_2^T = F_2$ (for example $P_2 = \operatorname{diag}(1, -1)$) gives $A^T B = F_2$. The two conclusions contradict each other, so P cannot be arbitrary.
As an application, we will give a new characterization of when the sum of two orthogonal matrices is orthogonal.
Theorem 4.
Let A, B be orthogonal matrices of order n. Then $A + B$ is orthogonal if and only if n is an even number and $A^T B$ has no real eigenvalue and satisfies $\left(A^T B\right)^3 = E_n$.
Proof. 
For the necessity, as $A + B$ is orthogonal, $(A + B)^T(A + B) = E_n$, namely, $2E_n + A^T B + B^T A = E_n$. Therefore
$$\left(A^T B + \tfrac{1}{2}E_n\right)^T = B^T A + \tfrac{1}{2}E_n = -\left(A^T B + \tfrac{1}{2}E_n\right),$$
so $A^T B + \frac{1}{2}E_n$ is a real anti-symmetric matrix and its eigenvalues are purely imaginary. We may write the eigenvalues of $A^T B + \frac{1}{2}E_n$ as $c_j i$, where $i$ is the imaginary unit of the complex field $\mathbb{C}$ and $c_j \in \mathbb{R}$. Then the eigenvalues of $A^T B = -\frac{1}{2}E_n + \left(A^T B + \frac{1}{2}E_n\right)$ are $-\frac{1}{2} + c_j i$. Consequently, $\left|-\frac{1}{2} + c_j i\right| = 1$, namely, $\left(\frac{1}{2}\right)^2 + c_j^2 = 1$, so $c_j = \pm\frac{\sqrt{3}}{2}$. This indicates that the eigenvalues of $A^T B$ are only $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$. From Equation (1) and Lemma 2 we get
$$Q^T A^T B Q = \operatorname{diag}\left(F_2, \ldots, F_2\right) = O_{A^T B}, \qquad (11)$$
where $Q \in O_{n \times n}$. This shows that n is an even number and $A^T B$ has no real eigenvalue. Combining with Lemma 2 yields
$$O_{A^T B}^2 = \operatorname{diag}\left(F_2^T, \ldots, F_2^T\right), \qquad O_{A^T B}^3 = \operatorname{diag}\left(E_2, \ldots, E_2\right) = E_n,$$
that is, $\left(A^T B\right)^3 = E_n$. This completes the proof of the necessity.
For the sufficiency, since $\left(A^T B\right)^3 = E_n$, it follows that $x^3 - 1$ is an annihilating polynomial of $A^T B$. As $x^3 - 1 = (x - 1)\left(x + \frac{1}{2}\left(1+\sqrt{3}i\right)\right)\left(x + \frac{1}{2}\left(1-\sqrt{3}i\right)\right)$ and $A^T B$ has no real eigenvalue, the eigenvalues of $A^T B$ are only $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$, and Equation (11) follows. Then, with $H_2 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, we have
$$A^T B = Q \operatorname{diag}\left(F_2, \ldots, F_2\right) Q^T = Q\left(-\tfrac{1}{2}E_n + \tfrac{\sqrt{3}}{2}\operatorname{diag}\left(H_2, \ldots, H_2\right)\right) Q^T.$$
By Lemma 2, it follows that
$$A^T(A + B) = E_n + A^T B = Q \operatorname{diag}\left(\tfrac{1}{2}E_2 + \tfrac{\sqrt{3}}{2}H_2, \ldots, \tfrac{1}{2}E_2 + \tfrac{\sqrt{3}}{2}H_2\right) Q^T = Q \operatorname{diag}\left(-F_2^T, \ldots, -F_2^T\right) Q^T \in O_{n \times n}.$$
Furthermore, $A + B = A\, Q \operatorname{diag}\left(-F_2^T, \ldots, -F_2^T\right) Q^T$ is orthogonal. □
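Theorem 4 reduces the question to a finite check on $A^T B$. The sketch below (assuming NumPy; the function name sum_is_orthogonal_by_thm4 is ours) applies the criterion and compares it with the definition on one pair whose sum is orthogonal and one whose sum is not.

```python
import numpy as np

def sum_is_orthogonal_by_thm4(A: np.ndarray, B: np.ndarray, tol: float = 1e-10) -> bool:
    """Criterion of Theorem 4: n even, (A^T B)^3 = E_n, and A^T B has no real eigenvalue."""
    n = A.shape[0]
    M = A.T @ B
    no_real_eig = bool(np.all(np.abs(np.linalg.eigvals(M).imag) > tol))
    return n % 2 == 0 and np.allclose(np.linalg.matrix_power(M, 3), np.eye(n)) and no_real_eig

F2 = 0.5 * np.array([[-1.0, -np.sqrt(3)], [np.sqrt(3), -1.0]])
A = np.eye(4)
B_good = np.block([[F2, np.zeros((2, 2))], [np.zeros((2, 2)), F2]])   # A + B_good is orthogonal
B_bad  = np.eye(4)                                                    # A + B_bad = 2E_4 is not

for B in (B_good, B_bad):
    direct = np.allclose((A + B).T @ (A + B), np.eye(4))
    print(direct, sum_is_orthogonal_by_thm4(A, B))   # True True, then False False
```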
Corollary 1.
For any $2n \times 2n$ orthogonal matrix A, there exists an orthogonal matrix B of the same order such that $A + B$ is orthogonal.
Proof. 
By Lemma 1 and Lemma 2, we can take
$$B = A\, Q \operatorname{diag}\left(F_2, \ldots, F_2\right) Q^T \in O_{2n \times 2n}, \qquad (12)$$
where Q is an arbitrary orthogonal matrix of order $2n$. Then B is orthogonal. Hence
$$A + B = A\left(E_{2n} + Q \operatorname{diag}\left(F_2, \ldots, F_2\right) Q^T\right) = A\, Q \operatorname{diag}\left(E_2 + F_2, \ldots, E_2 + F_2\right) Q^T = A\, Q \operatorname{diag}\left(-F_2^T, \ldots, -F_2^T\right) Q^T \in O_{2n \times 2n}. \qquad \square$$
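The construction in Equation (12) is easy to instantiate. The following sketch (assuming NumPy, with a random orthogonal factor obtained from a QR decomposition) builds B from a given A and an arbitrary orthogonal Q and confirms that A, B and A + B are all orthogonal.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n, rng):
    """Orthogonal Q factor of the QR decomposition of a random Gaussian matrix."""
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return Q

F2 = 0.5 * np.array([[-1.0, -np.sqrt(3)], [np.sqrt(3), -1.0]])
n  = 6                                           # any even order works
A  = random_orthogonal(n, rng)
Q  = random_orthogonal(n, rng)
FF = np.kron(np.eye(n // 2), F2)                 # diag(F2, ..., F2)

B = A @ Q @ FF @ Q.T                             # Equation (12)

for M in (A, B, A + B):
    print(np.allclose(M.T @ M, np.eye(n)))       # True, True, True
```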
Corollary 1 indicates that there are many pairs of orthogonal matrices whose sum is orthogonal as well: we can construct many orthogonal matrices of the form given in Equation (12). We consider the following example.
Example 4.
Let A be an orthogonal matrix of order 4, and let
$$Q_1 = \operatorname{diag}\left(\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\right), \qquad Q_2 = \operatorname{diag}\left(F_2, F_2\right)$$
be orthogonal; then
$$Q_1^T = \operatorname{diag}\left(\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}\right), \qquad Q_2^T = \operatorname{diag}\left(F_2^T, F_2^T\right).$$
Since $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} F_2 \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = F_2^T$, $\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} F_2 \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} = F_2$, and $F_2 F_2 F_2^T = F_2^T F_2 F_2 = F_2$, Equation (12) gives
$$B_1 = A\, Q_1 \operatorname{diag}\left(F_2, F_2\right) Q_1^T = A \operatorname{diag}\left(F_2^T, F_2\right), \qquad B_2 = A\, Q_2 \operatorname{diag}\left(F_2, F_2\right) Q_2^T = A \operatorname{diag}\left(F_2, F_2\right).$$
Then $B_1 - B_2 = A \operatorname{diag}\left(F_2^T - F_2,\; F_2 - F_2\right) = A \operatorname{diag}\left(\sqrt{3}\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix},\; 0_2\right) \neq 0$, that is, $B_1 \neq B_2$, which means $B_1$ and $B_2$ are different.
From Theorem 4, we easily obtain [[14], Theorem 3] as follows.
Corollary 2.
Let A, B be orthogonal matrices of order n. Then the following statements are equivalent:
(1)
$A + B$ is orthogonal;
(2)
$A^T B + \frac{1}{2}E_n$ is a real anti-symmetric matrix;
(3)
the eigenvalues of $A^T B$ are $-\frac{1}{2}\left(1 \pm \sqrt{3}i\right)$.
This gives several equivalent characterizations of when the sum of two orthogonal matrices is orthogonal.

Author Contributions

Investigations, Meixiang Chen, Zhongpeng Yang and Zhixing Lin; writing—review and editing, Meixiang Chen; Formal analysis, Zhixing Lin; Project administration, Meixiang Chen. All the authors contributed equally to all the parts of this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China No. 61772292, the Natural Science Foundation of Fujian Province No. 2021J011103, and by the Science and Technology Project of Putian City No. 2022SZ3001ptxy05.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors wish to thank the anonymous referees for their detailed and very helpful comments and suggestions that improved this article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. HORN R. A., JOHNSON C. R. Matrix Analysis, 2nd ed. New York: Cambridge University Press, 2013.
  2. WOLKOWICZ H., STYAN G. P. H. Bounds for eigenvalues using traces. Linear Algebra and its Applications, 1980, 29: 471–506.
  3. WOLKOWICZ H., STYAN G. P. H. More bounds for eigenvalues using traces. Linear Algebra and its Applications, 1980, 31: 1–17.
  4. LAY D. C. Linear Algebra and its Applications, 5th ed. New Jersey: Pearson Education, Inc., 2019: 281.
  5. SMITH O. K. Eigenvalues of a Symmetric 3×3 Matrix. Communications of the ACM, 1961, 4(4): 168.
  6. LIN Z. X., YANG Z. P., CHEN M. X., et al. Some researches on orthogonal solutions to a class of matrix trace equations. 2020, 46(2): 115–121.
  7. CHEN M. X., YANG Z. P., YAN Y. Y., et al. Spectrum of 3×3 Orthogonal Matrices Whose Traces are Integers. Journal of Beihua University (Natural Science), 2018, 19(2): 158–163.
  8. CHEN M. X., YANG Z. P., LI Y. X., et al. The Determination of the Spectrum of 3×3 Orthogonal Matrices and its Applications. Journal of Fujian Normal University (Natural Science Edition), 2020, 36(4): 1–8.
  9. HUA L. K. Symplectic Similarity of Symplectic Square Matrices. Acta Scientiarum Naturalium Universitatis Sunyatseni, 1962, (4): 1–12.
  10. ZHANG S. G., LIN H. L. Algorithms for Symmetric Groups of Simplexes. Applied Mathematics and Computation, 2007, 188: 1610–1634.
  11. YANG J. The Diagonalization of some Orthogonal Matrices. Journal of Southwest University for Nationalities (Natural Science Edition), 2011, 37(6): 889–894.
  12. LIU Z. M. Discussion on the Properties of Orthogonal Matrices. Journal of Chongqing Normal University (Natural Science Edition), 2000, (S1): 162–164.
  13. REN F. T., WANG Y. X., YANG S. L. On the Necessary and Sufficient Conditions for the Sum of Orthogonal Matrices to be Orthogonal Matrices. Journal of Mathematics (China), 1999, (05): 48–49.
  14. JIANG L. Some Notes on the Row Simplicity of Matrix and the Sum of Two Orthogonal Matrices. Journal of Mathematics for Technology, 1998, 14(2): 134–137.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.