A note on two-sided removal and cancellation properties associated with Hermitian matrices

A complex square matrix A is said to be Hermitian if A = A*, the conjugate transpose of A. We prove that each of the two triple matrix product equalities AA*A = A*AA* and A^3 = AA*A implies that A is Hermitian, by means of decompositions and determinants of matrices; we name these facts the two-sided removal and cancellation laws associated with a Hermitian matrix. We also present several general removal and cancellation laws as extensions of the preceding two facts about Hermitian matrices.

Throughout this note, let C^{m×n} and R^{m×n} stand for the sets of all m×n matrices over the fields of complex numbers and real numbers, respectively; let A̅, A*, and r(A) stand for the conjugate, the conjugate transpose, and the rank of A ∈ C^{m×n}, respectively; let det(A) stand for the determinant of A ∈ C^{m×m}; and let I_m denote the identity matrix of order m.
Recall that a matrix A ∈ C^{m×m} is said to be Hermitian if and only if it satisfies

A = A*,  (1)

or element-wise, a_{ij} = a̅_{ji} for i, j = 1, 2, ..., m. Hermitian matrices are named after the French mathematician Charles Hermite, who showed in 1855 that matrices of this kind always have real eigenvalues. Hermitian matrices possess many elegant and pleasing formulas and facts, have many significant applications in both theoretical and applied mathematics, and have long been recognized as basic conceptual objects and building blocks in matrix theory and linear algebra.
E-mail address: yongge.tian@gmail.com

It is known that, in addition to the definition in (1), there are many conditions in the literature under which a square complex matrix is Hermitian. Here we mention the following two well-known facts:

A^2 = AA*  ⇔  A = A*,  (2)
A^2 = A*A  ⇔  A = A*.  (3)

The underlying meaning of these two facts is that we can cancel A from the left- and right-hand sides of the equalities on the left of (2) and (3) to yield A = A*, without assuming that A is invertible. Obviously, these two cancellation laws can be utilized to simplify matrix equalities that involve the corresponding matrix products. The reader is referred to [1-3, 5, 13, 15] for their derivations and related work. It has been realized (cf. [7]) that this kind of cancellation problem can be regarded as a special case of the following type of two-sided implication:

f_1(A, A*) = f_2(A, A*)  ⇔  A = A*,  (4)

where f_1(·,·) and f_2(·,·) are certain ordinary algebraic operations in A and A*; namely, Hermitian matrices are the exclusive solutions of the matrix equation on the left-hand side. Inspired by the two known results in (2) and (3), the present author proposes in this note the two matrix equalities AA*A = A*AA* and A^3 = AA*A, composed of triple matrix products of A and A*, and shows that each of them is also equivalent to A = A*.
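As a quick numerical illustration of the cancellation law (2) (an example, not a proof), the following Python sketch checks A^2 = AA* for a Hermitian matrix and shows that it fails for a non-Hermitian one; the helper functions and sample matrices are our own choices for illustration.

```python
def ct(M):
    """Conjugate transpose A* of a matrix stored as a list of lists."""
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def mul(X, Y):
    """Ordinary matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def eq(X, Y, tol=1e-9):
    """Entry-wise approximate equality of two matrices."""
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(len(X)) for j in range(len(X[0])))

# Hermitian example: H = H*, so H^2 = HH* holds.
H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
hermitian_case = eq(mul(H, H), mul(H, ct(H)))       # True

# Non-Hermitian example: N^2 != NN*, consistent with (2).
N = [[1 + 0j, 2 + 0j],
     [0 + 0j, 1 + 0j]]
non_hermitian_case = eq(mul(N, N), mul(N, ct(N)))   # False
```

Running the same check on further random matrices gives the same picture: the equality A^2 = AA* singles out exactly the Hermitian matrices.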
We first present some known results on matrix equalities, the kth root of a positive semi-definite matrix, and the singular value decomposition (SVD) of a matrix.

Lemma 1.
Let A ∈ C^{m×n} and B, C ∈ C^{n×p}. Then the following results hold:
(i) AB = AC if and only if A*AB = A*AC.
(ii) [12] The principal kth root of a positive semi-definite matrix exists and is unique.
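Lemma 1(ii) can be illustrated numerically: for a real symmetric positive semi-definite matrix, the principal kth root is obtained by taking kth roots of the eigenvalues in a spectral decomposition, and raising it back to the kth power recovers the original matrix. The following is a minimal sketch restricted to the 2×2 case; the function name and the sample matrix are ours.

```python
import math

def mul(X, Y):
    """Ordinary matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def psd_root_2x2(S, k):
    """Principal kth root of a real symmetric PSD matrix S = [[a, b], [b, c]],
    via kth roots of the eigenvalues in a spectral decomposition."""
    a, b, c = S[0][0], S[0][1], S[1][1]
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    l1, l2 = (tr + disc) / 2.0, (tr - disc) / 2.0   # eigenvalues, l1 >= l2 >= 0
    if abs(b) < 1e-14:                              # already diagonal
        return [[a ** (1.0 / k), 0.0], [0.0, c ** (1.0 / k)]]
    n = math.hypot(b, l1 - a)
    v1 = (b / n, (l1 - a) / n)                      # unit eigenvector for l1
    v2 = (-v1[1], v1[0])                            # orthogonal unit eigenvector
    r1, r2 = l1 ** (1.0 / k), l2 ** (1.0 / k)
    # R = r1 * v1 v1^T + r2 * v2 v2^T is the (unique) PSD kth root
    return [[r1 * v1[i] * v1[j] + r2 * v2[i] * v2[j] for j in range(2)]
            for i in range(2)]

S = [[2.0, 1.0], [1.0, 2.0]]      # eigenvalues 3 and 1, so S is PSD
R = psd_root_2x2(S, 3)            # principal cube root
R3 = mul(mul(R, R), R)            # recovers S up to rounding
```

The uniqueness asserted in Lemma 1(ii) is what makes such a spectral construction well defined: any PSD kth root must agree with the one built from the eigenvalues.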
Lemma 2. Let A ∈ C^{m×n} with r(A) = s. Then A admits a singular value decomposition (SVD)

A = U [Σ 0; 0 0] V*,

where U ∈ C^{m×m} and V ∈ C^{n×n} are unitary matrices, namely, UU* = U*U = I_m and VV* = V*V = I_n, and Σ ∈ R^{s×s} is a positive diagonal matrix composed of the singular values of A. In particular, if A ∈ C^{m×m}, then A admits the corresponding decomposition in which the unitary product V*U can be partitioned into a 2×2 block form with blocks K and L in its first block row, where KK* ≤ I_s and LL* ≤ I_s mean that I_s − KK* and I_s − LL* are positive semi-definite.
We now present and prove the main result of this note.
Theorem 3. Let A ∈ C^{m×m}. Then the following results hold.
Proof. It is obvious that A = A* implies that both equalities AA*A = A*AA* and A^3 = AA*A hold. To prove the converse assertions, we first multiply both sides of each equality by suitable factors and invoke the uniqueness of the principal root of a positive semi-definite matrix from Lemma 1(ii); we next post-multiply the resulting equalities by appropriate terms. It is easy to verify by (5) that the four terms A^2, AA*, A^3A*, and (AA*)^2 can be decomposed accordingly. By (8), the corresponding equalities hold. Finally, combining this fact with the third inequality in (6) results in LL* = 0, that is, L = 0. In this case, the two matrices in (7) satisfy A^2 = AA*, which implies that (ii) holds by (2).
Theorem 3 shows that we can remove A and A* from both sides of AA*A = A*AA* simultaneously to yield A* = A, and cancel A from both sides of A^3 = AA*A simultaneously to yield A = A*. Hence, we call these two facts the two-sided removal and cancellation laws, respectively. It should be pointed out that they are not isolated facts associated with the conjugate transpose operation of a square matrix; rather, we are able to propose and prove many types of removal and cancellation facts for products of complex square matrices and their conjugate transposes. As direct consequences of the preceding results, we present three groups of removal and cancellation facts in the following three corollaries.
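Theorem 3 can also be probed numerically (an illustration, not a proof): for a Hermitian matrix both triple-product equalities hold, while for a generic non-Hermitian matrix both fail. The helper functions and sample matrices below are our own.

```python
def ct(M):
    """Conjugate transpose A*."""
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def mul(X, Y):
    """Matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def eq(X, Y, tol=1e-9):
    """Entry-wise approximate equality."""
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(len(X)) for j in range(len(X[0])))

def mul3(X, Y, Z):
    return mul(mul(X, Y), Z)

def removal_law(A):
    """Does AA*A = A*AA* hold?"""
    return eq(mul3(A, ct(A), A), mul3(ct(A), A, ct(A)))

def cancellation_law(A):
    """Does A^3 = AA*A hold?"""
    return eq(mul3(A, A, A), mul3(A, ct(A), A))

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]   # Hermitian: both laws hold
N = [[1 + 0j, 2 + 0j],
     [0 + 0j, 1 + 0j]]   # not Hermitian: both laws fail
```

Such spot checks are a convenient sanity test when applying the removal and cancellation laws to simplify concrete matrix products.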
Proof. The "⇐" parts of (9)-(13) are obvious, since replacing A* with A turns the seven matrix equalities on their left-hand sides into identities. To show the "⇒" parts of (9)-(11), let X = AA*A.
Then the left-hand sides of (9)-(11) can be rewritten as

X^2 = (AA*A)^2 = (AA*)^3 = (AA*A)(AA*A)* = XX*,
X^2 = (AA*A)^2 = (A*A)^3 = (AA*A)*(AA*A) = X*X,

and analogously for (11). By (2) and (3), the first two equalities imply that X = AA*A is Hermitian, so that A is Hermitian by Theorem 3(i), and the third equality implies that X = AA*A is Hermitian by Theorem 3(ii), so that A is Hermitian by Theorem 3(i). Substituting the first equality into the second equality in (12), we obtain A^5 = A^3A^2 = A*AA*A^2 = A*AA*AA*, which is equivalent to A^2 = AA* by applying Lemma 1(i) three times, thus establishing the equivalent facts in (12) through (2). Finally, we substitute the first equality into the second equality in (13) to yield A^7 = A^5A^2 = (AA*)^2A^3 = (AA*)^3A, which is equivalent to A^3 = AA*A by applying Lemma 1(i) twice, thus establishing the equivalent facts in (13) through Theorem 3(ii).
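The rewritings above rest on two unconditional identities: (AA*)^3 = (AA*A)(AA*A)* and (A*A)^3 = (AA*A)*(AA*A) hold for every complex matrix A, since (AA*A)* = A*AA*. The following sketch confirms both on an arbitrary sample matrix of our choosing.

```python
def ct(M):
    """Conjugate transpose A*."""
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def mul(X, Y):
    """Matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def eq(X, Y, tol=1e-6):
    """Entry-wise approximate equality."""
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(len(X)) for j in range(len(X[0])))

A = [[1 + 2j, 0 - 1j],
     [3 + 0j, 2 - 1j]]               # an arbitrary (non-Hermitian) sample

X = mul(mul(A, ct(A)), A)            # X = AA*A
AAs = mul(A, ct(A))                  # AA*
AsA = mul(ct(A), A)                  # A*A

first_identity = eq(mul(mul(AAs, AAs), AAs), mul(X, ct(X)))    # (AA*)^3 = XX*
second_identity = eq(mul(mul(AsA, AsA), AsA), mul(ct(X), X))   # (A*A)^3 = X*X
```

Both products expand to the same alternating word A A* A A* A A* (resp. A* A A* A A* A), which is why the identities hold with no hypothesis on A.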

Corollary 5.
Let A ∈ C^{m×n} and B ∈ C^{n×m}. Then

(BA)^2 = BB*A*A  ⇔  AB = B*A*.  (14)

Proof. The "⇐" part is obvious. To show the "⇒" part, we pre- and post-multiply the equality on the left of (14) by A and B, respectively, to obtain ABABAB = (AB)^3 = AB(AB)*AB, which implies that AB is Hermitian by Theorem 3(ii), thus establishing the equivalent facts in (14). In this situation, it is easy to verify the corresponding equalities for the three block matrices involved in (15) and (16); applying Theorem 3(i) and (ii) to these block matrices, we obtain the asserted equivalences, thus establishing (15) and (16).
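The equivalence in (14) can be illustrated numerically with matrices of our own choosing: taking B = A* makes AB = AA* Hermitian and the equality (BA)^2 = BB*A*A holds, whereas for a generic pair with AB non-Hermitian it fails.

```python
def ct(M):
    """Conjugate transpose A*."""
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def mul(X, Y):
    """Matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def eq(X, Y, tol=1e-9):
    """Entry-wise approximate equality."""
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(len(X)) for j in range(len(X[0])))

def law14(A, B):
    """Does (BA)^2 = B B* A* A hold?"""
    BA = mul(B, A)
    return eq(mul(BA, BA), mul(mul(B, ct(B)), mul(ct(A), A)))

A = [[1 + 0j, 2 + 0j],
     [0 + 0j, 1 + 0j]]
holds_for_adjoint = law14(A, ct(A))   # B = A*: AB = AA* is Hermitian -> holds

B = [[1 + 0j, 0 + 0j],
     [1 + 0j, 1 + 0j]]
holds_generic = law14(A, B)           # AB is not Hermitian here -> fails
```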
The preceding results become very straightforward under more stringent assumptions. For example, if A is nonsingular, then we immediately get A = A* by pre- and post-multiplying both sides of A^3 = AA*A by A^{-1} simultaneously. Similarly, if the two matrices A and B are square and nonsingular, then we get AB = B*A* from (BA)^2 = BB*A*A; if A is square and nonsingular, then we get A = B from AA*A = AB*A. The main merit of the results presented in this note is that the nonsingularity assumption is not needed to derive these cancellation laws. As concrete cases of the equivalence problems described in (4), some other mixed cancellation laws associated with Hermitian matrices have been proposed and proved (cf. [1, 3, 15]). In comparison, the preceding removal/cancellation laws link several fundamental matrix equalities together, so they should be recognized as fundamental facts and common knowledge regarding Hermitian matrices and their algebraic operations in matrix theory and its applications.
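In the nonsingular case the cancellation is elementary linear algebra: from M = AA*A one recovers A* directly as A^{-1} M A^{-1}. A small sketch with a 2×2 invertible (non-Hermitian) example of our own:

```python
def ct(M):
    """Conjugate transpose A*."""
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def mul(X, Y):
    """Matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def eq(X, Y, tol=1e-9):
    """Entry-wise approximate equality."""
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(len(X)) for j in range(len(X[0])))

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1 + 0j, 2 + 0j],
     [0 + 0j, 1 + 0j]]                      # invertible but not Hermitian
M = mul(mul(A, ct(A)), A)                   # M = AA*A
recovered = mul(mul(inv2(A), M), inv2(A))   # A^{-1} M A^{-1} equals A*
```

The point of Theorem 3 is precisely that the same cancellation conclusion survives without the inverse being available.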
Recall the well-known fact that a square matrix A is skew-Hermitian if and only if iA is Hermitian, where i^2 = −1. Thus, we directly obtain the following consequences for skew-Hermitian matrices from (2), (3), (9)-(13), and Theorem 3.

Corollary 7.
Let A ∈ C^{m×m}. Then the following ten statements are equivalent: (i) A is skew-Hermitian, i.e., A = −A*.
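The reduction behind Corollary 7, namely that A is skew-Hermitian exactly when iA is Hermitian, is easy to check numerically; the sample matrix below is our own.

```python
def ct(M):
    """Conjugate transpose A*."""
    return [[M[j][i].conjugate() for j in range(len(M))] for i in range(len(M[0]))]

def eq(X, Y, tol=1e-9):
    """Entry-wise approximate equality."""
    return all(abs(X[i][j] - Y[i][j]) < tol
               for i in range(len(X)) for j in range(len(X[0])))

def scal(c, M):
    """Scalar multiple c * M."""
    return [[c * M[i][j] for j in range(len(M[0]))] for i in range(len(M))]

A = [[0 + 1j, 2 + 1j],
     [-2 + 1j, 0 + 0j]]
is_skew = eq(ct(A), scal(-1, A))     # A* = -A, so A is skew-Hermitian
iA = scal(1j, A)
iA_is_hermitian = eq(ct(iA), iA)     # (iA)* = iA, so iA is Hermitian
```

Because the map A ↦ iA converts one class into the other, each removal/cancellation law for Hermitian matrices translates mechanically into one for skew-Hermitian matrices.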
It should be pointed out that the previous results remain valid for real matrices, by replacing the conjugate transpose of a complex matrix with the transpose of a real matrix in the statement and derivation of Theorem 3. They also hold for matrices with entries in some more general algebraic frameworks, such as the real and complex quaternion algebras, in which the conjugate transpose and the SVD of a matrix can be defined analogously. Moreover, recall that, as an extension of the conjugate transpose of a complex matrix, a *-involution on an associative ring R (or on semigroups and algebras) is defined to be a mapping a → a* that satisfies (a*)* = a, (a + b)* = a* + b*, and (ab)* = b*a* for all a, b ∈ R. It would therefore be of interest to consider the previous removal/cancellation laws in algebraic settings equipped with a *-involution, in which idempotent, self-adjoint, and skew-self-adjoint elements can routinely be defined (cf. [2, 7-11, 14]). Notice that the proof of Theorem 3 uses some well-known facts regarding the existence and uniqueness of the principal kth root of a positive semi-definite Hermitian matrix, as well as matrix decompositions, determinants, ranks, and matrix inequalities. Since these methods are not necessarily available in general algebraic settings, proofs or disproofs of the preceding removal/cancellation laws for self-adjoint/skew-self-adjoint elements in such settings should be given by means of other algebraic methods.