Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update.

An accelerated adaptive class of nonlinear conjugate gradient algorithms is suggested. The search direction in these algorithms is obtained by symmetrizing the scaled Perry conjugate gradient direction (Perry, 1978), which depends on a positive parameter. The value of this parameter is determined by minimizing, in the Frobenius norm, the distance between the symmetric scaled Perry search direction matrix and the self-scaling memoryless BFGS update of Oren. Two variants of the parameter in the search direction are considered, corresponding to the scalings of Oren and Luenberger (1973/74) and of Oren and Spedicato (1976). The resulting algorithm, ACGSSV, is equipped with a well-known acceleration scheme for conjugate gradient algorithms. Global convergence of the algorithm is established for both uniformly convex and general nonlinear functions under exact or Wolfe line searches. Using a set of 800 unconstrained optimization test problems of different structure and complexity, we show that this selection of the scaling parameter in the self-scaling memoryless BFGS update leads to algorithms that substantially outperform the CG_DESCENT, SCALCG, and CONMIN conjugate gradient algorithms, being both more efficient and more robust. However, the conjugate gradient algorithm ADCG, based on clustering the eigenvalues of the iteration matrix defined by the search direction, is more efficient and slightly more robust than our ACGSSV algorithm. By solving five applications from the MINPACK-2 test problem collection, we show that the adaptive Perry conjugate gradient algorithm based on the self-scaling memoryless BFGS update, endowed with the acceleration scheme, is a top performer versus CG_DESCENT.
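The core ingredient named above, the self-scaling memoryless BFGS update, has a closed form: starting from the scaled identity θI and applying one BFGS update with the step s and gradient difference y gives an inverse-Hessian approximation whose product with the gradient can be evaluated with a few inner products, no matrices stored. The following NumPy sketch computes that direction; it is our illustration of the memoryless update, not the paper's ACGSSV code, and the variant labels assume the standard Oren-Luenberger scaling θ = sᵀy/yᵀy and Oren-Spedicato scaling θ = sᵀs/sᵀy.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, variant="OL"):
    """Return d = -H g, where H is the memoryless BFGS update of theta*I:

        H = theta*I - theta*(s y' + y s')/(s'y)
            + (1 + theta*(y'y)/(s'y)) * (s s')/(s'y)

    (a sketch; the variant labels are our assumption):
      "OL": Oren-Luenberger scaling, theta = s'y / y'y
      "OS": Oren-Spedicato scaling,  theta = s's / s'y
    Requires the curvature condition s'y > 0 (guaranteed by a
    Wolfe line search) so that H is positive definite.
    """
    sy = s @ y
    theta = sy / (y @ y) if variant == "OL" else (s @ s) / sy
    yg, sg = y @ g, s @ g
    # d = -H g, expanded so only vectors are ever formed
    return (-theta * g
            + theta * (yg * s + sg * y) / sy
            - (1.0 + theta * (y @ y) / sy) * (sg / sy) * s)
```

Because only inner products and vector sums appear, the cost per iteration is O(n) in the number of variables, which is what makes such directions viable for the large-scale MINPACK-2 problems mentioned in the abstract.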