ALEC: An Adaptive Learning Framework for Optimizing Artificial Neural Networks

In this paper we present ALEC (Adaptive Learning by Evolutionary Computation), an automated computational framework for optimizing neural networks in which the network architecture, activation function, weights and learning algorithm are adapted to the problem at hand. We evaluate the performance of ALEC and conventional artificial neural networks on function approximation problems, using three well-known chaotic time series for the comparison. We also report experimental results on the convergence speed and generalization performance of four neural network learning algorithms, and examine how these algorithms perform when the activation function and architecture are varied. We further demonstrate how effective, and often indispensable, ALEC is for designing neural networks that are smaller, faster and generalize better.
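
To make the idea concrete, the listing below is a minimal, hypothetical sketch of evolutionary neural network design in the spirit described above: a small evolutionary loop searches over hidden-layer size, activation function and learning rate, scoring each candidate by training a one-hidden-layer network with gradient descent on one-step-ahead prediction of the logistic map (a simple chaotic series). All function names, parameter values and the choice of data set are assumptions made for illustration; they do not reproduce the paper's experimental setup or the full ALEC framework.

# Illustrative sketch only (assumptions noted above), not the ALEC implementation.
import numpy as np

rng = np.random.default_rng(0)

# Toy chaotic series: logistic map x_{n+1} = 4 x_n (1 - x_n).
def logistic_series(n=600, x0=0.3):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    return x

def make_dataset(series, lags=4):
    # Predict the next value from the previous `lags` values.
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = series[lags:].reshape(-1, 1)
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Candidate activation functions and their derivatives (w.r.t. the pre-activation).
ACTS = {
    "tanh":    (np.tanh, lambda z: 1.0 - np.tanh(z) ** 2),
    "sigmoid": (sigmoid, lambda z: sigmoid(z) * (1.0 - sigmoid(z))),
    "relu":    (lambda z: np.maximum(z, 0.0), lambda z: (z > 0).astype(float)),
}

def train_and_score(genome, Xtr, ytr, Xva, yva, epochs=200):
    # Train a one-hidden-layer MLP by plain gradient descent; return validation MSE.
    act, dact = ACTS[genome["act"]]
    h, lr = genome["hidden"], genome["lr"]
    W1 = rng.normal(0, 0.5, (Xtr.shape[1], h)); b1 = np.zeros(h)
    W2 = rng.normal(0, 0.5, (h, 1));            b2 = np.zeros(1)
    for _ in range(epochs):
        z1 = Xtr @ W1 + b1
        a1 = act(z1)
        yhat = a1 @ W2 + b2
        d_out = 2.0 * (yhat - ytr) / len(ytr)      # gradient of mean squared error
        dW2 = a1.T @ d_out;  db2 = d_out.sum(0)
        dz1 = (d_out @ W2.T) * dact(z1)
        dW1 = Xtr.T @ dz1;   db1 = dz1.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2
    val_pred = act(Xva @ W1 + b1) @ W2 + b2
    return float(np.mean((val_pred - yva) ** 2))

def random_genome():
    return {"hidden": int(rng.integers(2, 17)),
            "act": str(rng.choice(list(ACTS))),
            "lr": float(10 ** rng.uniform(-2.5, -0.5))}

def mutate(g):
    g = dict(g)
    if rng.random() < 0.5:
        g["hidden"] = int(np.clip(g["hidden"] + rng.integers(-3, 4), 2, 32))
    if rng.random() < 0.3:
        g["act"] = str(rng.choice(list(ACTS)))
    if rng.random() < 0.5:
        g["lr"] = float(np.clip(g["lr"] * 10 ** rng.uniform(-0.3, 0.3), 1e-3, 0.5))
    return g

def evolve(Xtr, ytr, Xva, yva, pop_size=10, generations=8):
    pop = [random_genome() for _ in range(pop_size)]
    for gen in range(generations):
        scores = [(train_and_score(g, Xtr, ytr, Xva, yva), g) for g in pop]
        scores.sort(key=lambda t: t[0])
        print(f"gen {gen}: best val MSE {scores[0][0]:.5f} with {scores[0][1]}")
        # Keep the best half, refill the population with mutated copies of survivors.
        survivors = [g for _, g in scores[: pop_size // 2]]
        pop = survivors + [mutate(survivors[rng.integers(len(survivors))])
                           for _ in range(pop_size - len(survivors))]
    return scores[0][1]

if __name__ == "__main__":
    series = logistic_series()
    X, y = make_dataset(series)
    split = int(0.7 * len(X))
    best = evolve(X[:split], y[:split], X[split:], y[split:])
    print("best genome found:", best)

The sketch only evolves three design choices and trains weights by gradient descent; the framework described in the paper additionally adapts the learning algorithm itself and uses a more elaborate evolutionary procedure.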