In this paper, we consider the following nonsmooth \(l_{1}\)-norm minimization problem
$$ \min_{x \in R^{n}}\frac{1}{2}\|Ax - b\|_{2}^{2} + \tau \|x\|_{1}, $$
(1)
where \(x \in R^{n}\), \(A \in R^{m \times n}\) (\(m \ll n\)), \(b \in R^{m}\), \(\tau > 0\), \(\Vert v \Vert _{2}\) denotes the Euclidean norm of \(v\), and \(\|v\|_{1} = \sum_{i = 1}^{n} |v_{i}|\) is the \(l_{1}\)-norm of \(v\). This problem arises widely in compressed sensing, signal reconstruction, and analog-to-information conversion, and it is related to many mathematical problems [1–16]. Problem (
1) is also a typical compressed sensing scenario: reconstructing a length-\(n\) sparse signal from \(m\) observations. From the Bayesian perspective, problem (1) can also be seen as a maximum a posteriori criterion for estimating \(x\) from the observations \(b = Ax + \omega\), where \(\omega\) is Gaussian noise with variance \(\sigma^{2}\). Many researchers have studied numerical algorithms for solving problem (
1) with large-scale data, such as the fixed point method [1], the gradient projection method for sparse reconstruction [2], the interior-point continuation method [3, 4], iterative shrinkage-thresholding algorithms [5, 6], the linearized Bregman method [7, 8], alternating direction algorithms [9], a nonsmooth-equations-based method [10], and some related methods [11, 12]. Just recently, a smoothing gradient method has been given for solving problem (
1), based on a new transformation into absolute value equations [14, 15]. The transformation rests on the equivalence between the linear complementarity problem and the absolute value equation problem [17, 18]. The complementarity problem, the absolute value equation problem, and the related constrained optimization problem are three important classes of optimization problems [19–23]. On the other hand, nonlinear conjugate gradient methods and smoothing methods are widely used to solve large-scale optimization problems [
24,
25], total variation image restoration [
26], monotone nonlinear equations with convex constraints [
27], and nonsmooth optimization problems, such as nonsmooth nonconvex problems [
28], minimax problem [
29], P0 nonlinear complementarity problems [
30]. Specially, the effectiveness of widely used and attained different numerical outcomes three-term conjugate gradient method, which is based on Hang–Zhang conjugate gradient method and Polak–Ribière–Polyak conjugate gradient method [
31‐
33], has been widely studied. Based on the above analysis, in this paper, we propose a new smoothing modified three-term conjugate gradient method to solve problem (
1). The global convergence analysis of the proposed method is also presented.
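Problem (1) is the classical \(l_{1}\)-regularized least-squares (lasso) model, and the iterative shrinkage-thresholding idea cited above [5, 6] offers the simplest way to see how such problems are solved in practice. The following is a minimal illustrative sketch, not the smoothing conjugate gradient method proposed in this paper; the toy data, the step size \(1/L\), and the iteration count are assumptions chosen only for the example.

```python
import numpy as np

def soft_threshold(v, t):
    """Soft-thresholding (shrinkage) operator, the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, tau, iters=500):
    """Basic iterative shrinkage-thresholding for
    min_x 0.5 * ||Ax - b||_2^2 + tau * ||x||_1  (problem (1))."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient A^T(Ax - b)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)        # gradient of the smooth quadratic term
        x = soft_threshold(x - grad / L, tau / L)  # gradient step, then shrinkage
    return x

# Toy compressed-sensing instance (hypothetical data): m << n, sparse ground truth.
rng = np.random.default_rng(0)
m, n = 40, 100
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[[3, 17, 60]] = [1.5, -2.0, 1.0]
b = A @ x_true                          # noiseless observations b = A x_true
x_hat = ista(A, b, tau=0.01)
```

The shrinkage step is the proximal operator of the \(l_{1}\)-norm, which is why this operator recurs throughout the shrinkage-type methods surveyed above; each iteration trades a gradient step on the smooth term against a componentwise pull toward zero of size \(\tau/L\).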