The one-parameter scheme by Hager and Zhang (Pac. J. Optim. 2(1), 35-58 (2006)) represents a family of descent iterative schemes for large-scale minimization problems. The nonnegative parameter of the scheme determines the relative weight of conjugacy and descent and, by extension, the scheme's effectiveness. The scheme, however, converges globally only for strongly convex quadratic functions; other strategies must be employed to ensure this property holds for general functions. Moreover, as the parameter approaches $0$, the scheme reduces to the method of Hestenes and Stiefel \cite{kab31}, which performs poorly in practice due to the jamming phenomenon. By carrying out an eigenvalue analysis of an adaptive two-parameter Hager-Zhang-type method, a new scheme is presented that ensures global convergence for monotone functions without imposing any condition on the line search procedure. The proposed scheme is motivated by attributes exhibited by the Hager-Zhang scheme and by various schemes designed with two parameters. The scheme is also applicable to non-smooth problems since it does not require derivatives. Under basic assumptions, we prove global convergence of the scheme, and preliminary numerical experiments comparing it with some recent methods indicate that it is more effective and efficient.
Mathematics Subject Classification: 90C30, 90C26
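For orientation, the following is a minimal sketch of the one-parameter Hager and Zhang family referred to in the abstract, written in notation we assume here (it may differ from the paper's): $g_k$ is the gradient at $x_k$, $d_k$ the search direction, $y_k = g_{k+1} - g_k$, and $\theta_k \ge 0$ the parameter.
\begin{align*}
  d_{k+1} &= -g_{k+1} + \beta_k^{(\theta_k)} d_k, \qquad y_k = g_{k+1} - g_k,\\
  \beta_k^{(\theta_k)} &= \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
      \;-\; \theta_k \, \frac{\lVert y_k \rVert^{2}\, g_{k+1}^{\top} d_k}{(d_k^{\top} y_k)^{2}}.
\end{align*}
Letting $\theta_k \to 0$ recovers the Hestenes-Stiefel formula $\beta_k^{HS} = g_{k+1}^{\top} y_k / (d_k^{\top} y_k)$ mentioned above, while $\theta_k = 2$ gives the original Hager-Zhang choice.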