1 Introduction
Consider the following weakly absolute value equations (AVE):
$$ Ax-\vert x\vert =b, $$
(1.1)
where
\(b \in\mathbb{R}^{n}\),
\(\vert \cdot \vert \) denotes the componentwise absolute value,
\(A=W+iT\) where
\(W\in\mathbb{R}^{n\times n}\) is a symmetric positive definite matrix and
\(T\in\mathbb{R}^{n\times n}\) is a symmetric positive semidefinite matrix, and
\(i=\sqrt{-1}\) denotes the imaginary unit. A general form of AVE (
1.1)
$$ Ax+B\vert x\vert =b, $$
(1.2)
was first introduced in [
1] and investigated in [
2]. The AVE (
1.1) is an important class of nonlinear systems; it often arises because linear programs, quadratic programs, and bimatrix games can all be reduced to a linear complementarity problem (LCP) [
3‐
5]. This fact implies that the AVE (
1.1) is equivalent to the LCP [
5] and turns out to be NP-hard; see [
2].
In recent years, some efficient methods have been proposed to solve the AVE (
1.1), such as the smoothing Newton method [
6], the generalized Newton method [
7‐
11], and the sign accord method [
12]. For other forms of the iteration method, one can see [
13‐
15] for more details.
When the involved matrix
A in (
1.1) is a non-Hermitian positive definite matrix, based on the Hermitian and skew-Hermitian splitting (HSS) iteration method [
16], the Picard-HSS iteration method for solving the AVE (
1.1) has been proposed in [
17]. Numerical results show that the Picard-HSS iteration method outperforms the Picard and generalized Newton methods under certain conditions. Although the Picard-HSS iteration method is efficient and competitive, the number of inner HSS iteration steps is often problem-dependent and difficult to determine in actual computations. To overcome this disadvantage and improve the convergence of the Picard-HSS iteration method, the nonlinear HSS-like iteration method was presented in [
18], and its convergence conditions were established. Numerical experiments demonstrate that the nonlinear HSS-like iteration method is feasible and robust.
When the involved matrix
A in (
1.1) is
\(A = W + iT\), the convergence rate of the aforementioned Picard-HSS and nonlinear HSS-like methods may deteriorate. The reason is that each step of the Picard-HSS and HSS-like iterations requires solving two linear subsystems, one with the symmetric positive definite matrix
\(\alpha I +W\) and the other with the shifted skew-Hermitian matrix
\(\alpha I + iT\). It is well known that the solution of the linear system with the coefficient matrix
\(\alpha I + iT\) is not easy to obtain [
19]. To overcome this defect, based on the MHSS iteration method [
20], we will establish the nonlinear MHSS-like iteration method to solve the AVE (
1.1). Compared with the nonlinear HSS-like iteration method, the potential advantage of the nonlinear MHSS-like iteration method is that only two linear subsystems with coefficient matrices
\(\alpha I +W\) and
\(\alpha I +T\), both being real and symmetric positive definite, need to be solved at each step. This shows that the nonlinear MHSS-like iteration method can avoid a shifted skew-Hermitian linear subsystem with coefficient matrix
\(\alpha I + iT\). Therefore, in this case, these two linear subsystems can be solved either exactly by a sparse Cholesky factorization or inexactly by the conjugate gradient scheme. The convergence conditions of the nonlinear MHSS-like iteration method are obtained by using a smoothing approximate function.
The remainder of the paper is organized as follows. In Section
2, the MHSS iteration method is briefly reviewed. The nonlinear MHSS-like iteration method is discussed in Section
3. Numerical experiments are reported in Section
4. Finally, in Section
5 we draw some conclusions.
2 The MHSS iteration method
To establish the nonlinear MHSS-like iteration method for solving the AVE (
1.1), a brief review of the MHSS iteration method is needed.
From (
1.2), when
B is a zero matrix, the generalized AVE (
1.2) reduces to the system of linear equations
$$ Ax=b, $$
(2.1)
where
\(A=W+iT\) with symmetric positive definite matrix
\(W\in\mathbb{R}^{n\times n}\) and symmetric positive semidefinite matrix
\(T\in\mathbb{R}^{n\times n}\). In fact, the matrices
W and
iT are the Hermitian and skew-Hermitian parts of the complex symmetric matrix
A, respectively.
Obviously, the linear system (
2.1) can be rewritten in the following two equivalent forms:
$$\begin{aligned}& (\alpha I + W)x = (\alpha I -i T )x + b, \end{aligned}$$
(2.2)
$$\begin{aligned}& (\alpha I + T)x = (\alpha I +i W )x- ib. \end{aligned}$$
(2.3)
Combining (
2.2) with (
2.3), Bai
et al. in [
20] skillfully designed a modified HSS (MHSS) method to solve the complex symmetric linear system (
2.1), as follows.
The MHSS method. Let
\(x^{(0)}\in\mathbb{C}^{n}\) be an arbitrary initial point. For
\(k=0,1,2,\ldots\) , compute the next iterate
\(x^{(k+1)}\) according to the following procedure:
$$ \left\{ \textstyle\begin{array}{l} (\alpha I+ W)x^{(k+\frac{1}{2})}=(\alpha I-iT)x^{(k)}+b, \\ (\alpha I+T)x^{(k+1)}=(\alpha I+iW)x^{(k+\frac{1}{2})}-ib, \end{array}\displaystyle \right.$$
(2.4)
where
α is a given positive constant and
I is the identity matrix.
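As a concrete illustration, the two half-steps of (2.4) can be sketched in NumPy as follows. This is a minimal sketch of our own: the helper name, the small test matrices, and the dense solver are illustrative choices, not part of the method in [20].

```python
import numpy as np

def mhss_step(W, T, b, x, alpha):
    """One MHSS sweep (2.4) for (W + iT) x = b; the function name is ours."""
    n = W.shape[0]
    I = np.eye(n)
    # First half-step: (alpha I + W) x^{(k+1/2)} = (alpha I - iT) x^{(k)} + b
    x_half = np.linalg.solve(alpha * I + W, (alpha * I - 1j * T) @ x + b)
    # Second half-step: (alpha I + T) x^{(k+1)} = (alpha I + iW) x^{(k+1/2)} - ib
    return np.linalg.solve(alpha * I + T, (alpha * I + 1j * W) @ x_half - 1j * b)

# Tiny example: W SPD, T symmetric positive semidefinite (our test data).
W = np.diag([4.0, 3.0, 2.0])
T = np.diag([1.0, 0.5, 0.0])
rng = np.random.default_rng(0)
b = rng.standard_normal(3) + 1j * rng.standard_normal(3)

x = np.zeros(3, dtype=complex)
for _ in range(200):
    # alpha chosen as sqrt(lambda_max * lambda_min) of W, cf. Corollary 2.1 in [20]
    x = mhss_step(W, T, b, x, alpha=np.sqrt(8.0))
residual = np.linalg.norm((W + 1j * T) @ x - b)
```

Since the MHSS iteration converges unconditionally for such W and T, the residual of the final iterate is driven to machine-precision level.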
In matrix-vector form, the MHSS iteration (
2.4) can be equivalently rewritten as
$$x^{(k+1)}=M_{\alpha}x^{(k)}+G_{\alpha}b=M_{\alpha}^{k+1}x^{(0)}+ \sum_{j=0}^{k}M_{\alpha}^{j}G_{\alpha}b,\quad k=0,1,\ldots, $$
where
$$\begin{aligned}& M_{\alpha} =(\alpha I+T)^{-1}(\alpha I+iW) (\alpha I+W)^{-1}(\alpha I-iT), \\& G_{\alpha}=(1-i)\alpha(\alpha I+T)^{-1}(\alpha I+ W)^{-1}. \end{aligned}$$
Here,
\(M_{\alpha}\) is the iteration matrix of the MHSS method.
Theoretical analysis in [
20] shows that the MHSS method converges unconditionally to the unique solution of the complex symmetric linear system (
2.1) when
\(W\in\mathbb{R}^{n\times n}\) is symmetric positive definite and
\(T\in\mathbb{R}^{n\times n}\) is symmetric positive semidefinite.
3 The nonlinear MHSS-like iteration method
In this section, we will establish the MHSS-like iteration method for solving the AVE (
1.1). Using the techniques in [
21], the AVE (
1.1) can be rewritten as the system of the nonlinear fixed-point equations
$$\begin{aligned}& (\alpha I + W)x = (\alpha I -i T )x +\vert x\vert + b, \end{aligned}$$
(3.1)
$$\begin{aligned}& (\alpha I + T)x = (\alpha I +i W )x-i\vert x\vert - ib. \end{aligned}$$
(3.2)
In the following, we establish the following MHSS-like iteration method for solving the AVE (
1.1) by alternately iterating between two systems of the nonlinear fixed-point equations (
3.1) and (
3.2).
The nonlinear MHSS-like method. Let
\(x^{(0)}\in \mathbb{C}^{n}\) be an arbitrary initial point. For
\(k=0,1,2,\ldots\) , compute the next iterate
\(x^{(k+1)}\) according to the following procedure:
$$ \left\{ \textstyle\begin{array}{l} (\alpha I+ W)x^{(k+\frac{1}{2})}=(\alpha I-iT)x^{(k)}+\vert x^{(k)}\vert +b, \\ (\alpha I+T)x^{(k+1)}=(\alpha I+iW)x^{(k+\frac{1}{2})}-i\vert x^{(k+\frac{1}{2})}\vert -ib, \end{array}\displaystyle \right.$$
(3.3)
where
α is a given positive constant and
I is the identity matrix.
Evidently, the two linear subsystems involved in each step of the nonlinear MHSS-like iteration have the real symmetric positive definite coefficient matrices \(\alpha I+W\) and \(\alpha I+T\). Hence, both subsystems can be solved exactly and efficiently by the Cholesky factorization. By contrast, in the Picard-HSS and nonlinear HSS-like methods, a shifted skew-Hermitian linear subsystem with coefficient matrix \(\alpha I+iT\) needs to be solved at every iterative step.
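A minimal NumPy/SciPy realization of iteration (3.3) is sketched below. The function names, stopping rule, test matrices, and the manufactured right-hand side are our own choices; the point of the sketch is that the two real SPD matrices are factored once by Cholesky and only triangular solves are performed per step.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def solve_spd(cfac, rhs):
    """Apply a real SPD Cholesky factorization to a complex right-hand side
    by solving the real and imaginary parts separately (helper name is ours)."""
    return cho_solve(cfac, rhs.real) + 1j * cho_solve(cfac, rhs.imag)

def nonlinear_mhss_like(W, T, b, alpha, x0, maxit=1000, tol=1e-12):
    """Sketch of iteration (3.3) for (W + iT) x - |x| = b."""
    n = W.shape[0]
    I = np.eye(n)
    cW = cho_factor(alpha * I + W)   # alpha*I + W: real SPD, factored once
    cT = cho_factor(alpha * I + T)   # alpha*I + T: real SPD, factored once
    x = x0.astype(complex)
    for _ in range(maxit):
        x_half = solve_spd(cW, (alpha * I - 1j * T) @ x + np.abs(x) + b)
        x_new = solve_spd(cT, (alpha * I + 1j * W) @ x_half
                          - 1j * np.abs(x_half) - 1j * b)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Tiny check against a manufactured solution of (W + iT) x - |x| = b.
W = np.diag([10.0, 8.0, 6.0])        # SPD, strongly dominant over |.|
T = np.diag([1.0, 0.5, 0.0])         # symmetric positive semidefinite
x_star = np.array([1.0 - 2.0j, 0.5 + 1.0j, -3.0 + 0.0j])
b = (W + 1j * T) @ x_star - np.abs(x_star)
x = nonlinear_mhss_like(W, T, b, alpha=np.sqrt(60.0), x0=np.zeros(3))
```

With these diagonal test matrices the iteration is a componentwise contraction and recovers the manufactured solution to high accuracy.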
Let
$$ \left\{ \textstyle\begin{array}{l} U(x)=(\alpha I+ W)^{-1}((\alpha I-iT)x+\vert x\vert +b), \\ V(x)=(\alpha I+T)^{-1}((\alpha I+iW)x-i\vert x\vert -ib), \end{array}\displaystyle \right.$$
(3.4)
and
$$ \psi(x)=V\circ U(x)=V\bigl(U(x)\bigr). $$
(3.5)
Then, from (
3.4) and (
3.5), the nonlinear MHSS-like iteration scheme can be equivalently rewritten as
$$ x^{(k+1)}=\psi\bigl(x^{(k)}\bigr), \quad k=0,1,2,\ldots. $$
(3.6)
We note that we cannot directly use the Ostrowski theorem (Theorem 10.1.3 in [
22]) to give a local convergence theory for the iteration (
3.6). The reason is that the nonlinear term
\(\vert x\vert \) is non-differentiable. To overcome this defect, based on the smoothing approximate function introduced in [
6], we can establish the following local convergence theory for the nonlinear MHSS-like iteration method.
Define
\(\phi:\mathbb{D}\subset\mathbb{C}^{n}\rightarrow\mathbb{C}^{n}\) by
$$\phi(x)=\sqrt{x^{2}+\epsilon^{2}}=\Bigl(\sqrt {x_{1}^{2}+\epsilon^{2}},\ldots ,\sqrt {x_{n}^{2}+\epsilon^{2}} \Bigr)^{T}, \quad \epsilon>0, x\in\mathbb{D}. $$
It is easy to see that
\(\phi(x)\) is a smoothing function of
\(\vert x\vert \). Based on the results in [
6], the following properties of
\(\phi(x)\) can easily be given.
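For intuition, two properties of ϕ that are used in the sequel, namely \(0\leq\phi(x)-\vert x\vert \leq\epsilon\) componentwise (which yields the \(\sqrt{n}\epsilon\) bound in the 2-norm) and the smoothness of ϕ, can be checked numerically for real arguments. This is a sketch of our own; the value of ϵ and the sample grid are arbitrary.

```python
import numpy as np

eps = 1e-3
x = np.linspace(-5.0, 5.0, 1001)      # sample grid, including x = 0

phi = np.sqrt(x**2 + eps**2)          # smoothing function of |x|
gap = phi - np.abs(x)                 # componentwise approximation error
dphi = x / np.sqrt(x**2 + eps**2)     # derivative of phi; |phi'(x)| < 1
```

The gap is largest at x = 0, where ϕ(0) = ϵ, and vanishes as |x| grows.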
Using the smoothing approximate function
\(\phi(x)\) instead of
\(\vert x\vert \) in (
3.4), we have
$$ \left\{ \textstyle\begin{array}{l} \hat{U}(x)=(\alpha I+ W)^{-1}((\alpha I-iT)x+\phi(x)+b), \\ \hat{V}(x)=(\alpha I+T)^{-1}((\alpha I+iW)x-i\phi(x)-ib), \end{array}\displaystyle \right.$$
(3.8)
and
$$ \hat{\psi}(x)=\hat{V}\circ\hat{U}(x)=\hat{V}\bigl(\hat{U}(x) \bigr). $$
(3.9)
Then, from (
3.8) and (
3.9), the smoothing nonlinear MHSS-like iteration scheme can be equivalently rewritten as
$$ \hat{x}^{(k+1)}=\hat{\psi}\bigl(x^{(k)}\bigr),\quad k=0,1,2,\ldots. $$
(3.10)
To obtain the convergence conditions of the nonlinear MHSS-like method, we define a matrix norm
\(\Vert X\Vert =\Vert (\alpha I +T)X(\alpha I +T)^{-1}\Vert _{2}\) for any
\(X\in\mathbb{C}^{n\times n}\). Let
\(\theta(\alpha)= \Vert M_{\alpha} \Vert \). For
\(\alpha>0\), we have
$$\begin{aligned} \theta(\alpha)&= \Vert M_{\alpha} \Vert =\bigl\Vert (\alpha I+iW) ( \alpha I+W)^{-1}(\alpha I-iT) (\alpha I+T)^{-1}\bigr\Vert _{2} \\ &\leq\bigl\Vert (\alpha I+iW) (\alpha I+W)^{-1} \bigr\Vert _{2}\bigl\Vert (\alpha I-iT) (\alpha I+T)^{-1}\bigr\Vert _{2} \\ &\leq\bigl\Vert (\alpha I+iW) (\alpha I+W)^{-1}\bigr\Vert _{2}< 1; \end{aligned}$$
see [
20] for details.
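This bound is easy to confirm numerically. In the sketch below (our own illustration, with a random SPD matrix W), the 2-norm of the factor \((\alpha I+iW)(\alpha I+W)^{-1}\) is computed directly and matched against \(\max_{\lambda}\sqrt{\alpha^{2}+\lambda^{2}}/(\alpha+\lambda)\) over the eigenvalues λ of W, which is strictly less than 1 for all α, λ > 0.

```python
import numpy as np

rng = np.random.default_rng(1)
n, alpha = 5, 1.0
B = rng.standard_normal((n, n))
W = B @ B.T + n * np.eye(n)          # symmetric positive definite test matrix
I = np.eye(n)

factor = (alpha * I + 1j * W) @ np.linalg.inv(alpha * I + W)
norm_factor = np.linalg.norm(factor, 2)

# The factor is a function of the symmetric matrix W, so its 2-norm equals
# the maximum of sqrt(alpha^2 + lambda^2)/(alpha + lambda) over eigenvalues.
lam = np.linalg.eigvalsh(W)
predicted = np.max(np.sqrt(alpha**2 + lam**2) / (alpha + lam))
```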
Observe that the matrices
W and
T are strongly dominant over the matrix
\(\phi'(x^{\ast})\) in a certain norm (
i.e.,
\(\Vert W\Vert \gg \Vert \phi'(x^{\ast})\Vert \) and
\(\Vert T\Vert \gg \Vert \phi'(x^{\ast})\Vert \) for a certain matrix norm). Therefore, the matrix
A satisfies the condition (
3.11) with
W being symmetric positive definite and
T being symmetric positive semidefinite. In fact, in this case, the matrix
\(M_{\alpha,x^{\ast}}\) can be approximated by the matrix
\(M_{\alpha}\) and the condition (
3.11) reduces to
\(\rho(M_{\alpha})<1\). Clearly,
\(\rho(M_{\alpha})\leq\theta(\alpha)<1\).
Under the condition of Theorem
3.1,
$$\begin{aligned} \bigl\Vert x^{(k+1)}-\hat{x}^{(k+1)}\bigr\Vert \leq&\sqrt{n} \epsilon\delta(2+\delta)\leq\sqrt{n}\epsilon. \end{aligned}$$
This implies that, if we choose
ϵ such that
\(\sqrt{n}\epsilon<\varepsilon\), the results in Theorem
3.2 hold as well.
In fact, under the condition of Theorem
3.3, the solution of
\(Ax=\vert x\vert +b\) can be approximated by
\(x^{\ast}\), which satisfies
\(Ax^{\ast}=\phi(x^{\ast})+b\). This is because, when
\(Ax^{\ast}=\vert x^{\ast} \vert +b\), we have
$$\bigl\Vert \phi\bigl(x^{\ast}\bigr)-\bigl\vert x^{\ast}\bigr\vert \bigr\Vert \leq\sqrt{n}\epsilon, \quad\mbox{for any } \epsilon>0. $$
The convergence speed of the nonlinear MHSS-like method (
3.3) may depend on two factors: (1) the nonlinear term
\(\vert x\vert +b\); (2) finding the optimal parameters to guarantee that the spectral radius
\(\rho(M_{\alpha})\) of the iteration matrix
\(M_{\alpha}\) is less than 1. As the former is problem-dependent, the latter can be estimated. Based on Corollary 2.1 in [
20], the optimal parameter
\(\alpha^{\ast}=\sqrt{\lambda_{\max}\lambda_{\min}}\) is obtained to minimize the upper bound on the spectral radius
\(\rho(M_{\alpha})\) of the MHSS iteration matrix
\(M_{\alpha}\), where
\(\lambda_{\max}\) and
\(\lambda_{\min}\) are the largest and smallest eigenvalues of the symmetric positive definite matrix
W, respectively. It is noted that, although one usually cannot expect to minimize the spectral radius
\(\rho(M_{\alpha})\) of the corresponding iteration matrix
\(M_{\alpha}\) with the optimal parameter
\(\alpha^{\ast}\), it is still helpful for us to choose an effective parameter for the nonlinear MHSS-like method.
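As a small numerical illustration of this parameter choice (a sketch of our own; the matrix W is arbitrary), the upper bound \(\theta(\alpha)=\max_{\lambda}\sqrt{\alpha^{2}+\lambda^{2}}/(\alpha+\lambda)\) from Corollary 2.1 in [20] can be evaluated at \(\alpha^{\ast}\) and compared with other parameter values:

```python
import numpy as np

W = np.diag([9.0, 4.0, 1.0])                 # SPD example: lambda in {1, 4, 9}
lam = np.linalg.eigvalsh(W)
alpha_star = np.sqrt(lam.max() * lam.min())  # optimal parameter, = 3 here

def theta(alpha):
    # Upper bound on the MHSS convergence factor over the spectrum of W.
    return np.max(np.sqrt(alpha**2 + lam**2) / (alpha + lam))

bound_at_star = theta(alpha_star)
```

For this W, the bound at \(\alpha^{\ast}=3\) is smaller than at either extreme eigenvalue used as the parameter, consistent with the minimization property of \(\alpha^{\ast}\).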
Competing interests
The author declares that there is no conflict of interest regarding the publication of this paper.