The proximal term plays a significant role in the literature on the proximal Alternating Direction Method of Multipliers (ADMM), since (positive-definite or indefinite) proximal terms can promote convergence of ADMM and simplify the resulting subproblems. However, an overly large proximal parameter slows convergence numerically, even though convergence can still be established. In this paper, we therefore focus on a Linearized Symmetric ADMM (LSADMM) with suitable proximal terms for solving a family of multi-block separable convex minimization models, and we determine the optimal (smallest) value of the proximal parameter for which convergence of this LSADMM is still ensured. Our LSADMM partitions the variables into two groups and updates the Lagrange multiplier twice, in different forms, with suitable step sizes. The region of the proximal parameter involved in the subproblems of the second group is partitioned into the union of three different sets. We establish the global convergence and a sublinear ergodic convergence rate of LSADMM for two of these cases, while a counterexample shows that convergence of LSADMM cannot be guaranteed in the remaining case. Theoretically, we thus obtain the optimal lower bound of the proximal parameter. Numerical experiments on the latent variable Gaussian graphical model selection problem demonstrate the performance of the proposed algorithm and the significant advantage of using the optimal lower bound of the proximal parameter.
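To make the structure of the scheme concrete, the two hallmarks described above, a doubly updated multiplier with step sizes and a linearized subproblem with a proximal term, can be sketched on a toy two-block problem. The model, the penalty `beta`, the proximal parameter `tau`, and the step sizes `s`, `t` below are illustrative assumptions for this sketch only, not the paper's multi-block setting or its optimal bound.

```python
# Toy sketch of a linearized symmetric ADMM iteration (illustrative only).
# Assumed example problem (not the paper's model):
#   min_{x,z} 0.5*(x - a)^2 + 0.5*(z - b)^2   s.t.  x - z = 0,
# whose solution is x = z = (a + b) / 2.
# beta: penalty parameter; tau: proximal parameter in the linearized
# z-subproblem; s, t: step sizes for the two multiplier updates.

def lsadmm_toy(a, b, beta=1.0, tau=1.0, s=0.5, t=0.5, iters=300):
    x = z = lam = 0.0
    for _ in range(iters):
        # x-subproblem: min_x 0.5*(x-a)^2 - lam*(x-z) + (beta/2)*(x-z)^2
        x = (a + lam + beta * z) / (1.0 + beta)
        # first (intermediate) multiplier update, step size s
        lam = lam - s * beta * (x - z)
        # linearized z-subproblem: the quadratic coupling term is
        # linearized at the current z, and a proximal term
        # (tau/2)*(z - z_old)^2 is added; z on the right is the old iterate
        z = (b - lam + beta * (x - z) + tau * z) / (1.0 + tau)
        # second multiplier update, step size t
        lam = lam - t * beta * (x - z)
    return x, z, lam

x, z, lam = lsadmm_toy(0.0, 2.0)
# the iterates approach the solution x = z = (0 + 2) / 2 = 1.0
```

In this sketch, taking `tau` at least as large as `beta` keeps the linearized-plus-proximal surrogate an upper bound on the true quadratic coupling; the paper's contribution is precisely to identify the smallest such proximal parameter for which convergence survives.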