The well-known class of conjugate gradient (CG) line search methods for large-scale unconstrained optimization will be analyzed, and simple examples will be used for illustration. To impose certain useful properties on this class, some recent techniques will be introduced. It will be shown, in particular, that certain techniques enforce the global convergence property for most members of the CG class of methods. Some results for a selection of CG algorithms and their modifications (in particular the Fletcher-Reeves, Polak-Ribière and Hestenes-Stiefel methods) will be described. It will be shown that the proposed techniques improve the performance of several CG algorithms substantially.
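As a concrete illustration of the three β update formulas named above, the following is a minimal Python sketch of a nonlinear CG line search method with a backtracking (Armijo) line search. The function name `cg_minimize`, the steepest-descent restart rule, and the nonnegativity truncation on the Polak-Ribière β (the PR+ variant) are illustrative assumptions, not the specific techniques proposed in the work itself.

```python
import numpy as np

def cg_minimize(f, grad, x0, beta_rule="FR", tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with backtracking Armijo line search.

    beta_rule selects the search-direction update:
      "FR": Fletcher-Reeves    beta = g+.g+ / g.g
      "PR": Polak-Ribiere (+)  beta = max(0, g+.(g+ - g) / g.g)
      "HS": Hestenes-Stiefel   beta = g+.(g+ - g) / d.(g+ - g)
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference used by PR and HS
        if beta_rule == "FR":
            beta = (g_new @ g_new) / (g @ g)
        elif beta_rule == "PR":
            beta = max(0.0, (g_new @ y) / (g @ g))
        else:  # "HS"
            denom = d @ y
            beta = (g_new @ y) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        # Restart with steepest descent if d fails to be a descent direction
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, all three β rules drive the iterates to the unique minimizer; their behavior differs mainly on non-quadratic problems and with inexact line searches, which is where the convergence-enforcing techniques discussed above become relevant.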