A COMPARATIVE ANALYSIS OF NEURAL NETWORK OPTIMIZERS USING ANALYSIS OF VARIANCE
Aditya Das, Ved Bhatt, Siba Panda
SVKM NMIMS Mukesh Patel School of Technology Management and Engineering, Mumbai

Abstract: Neural networks have transformed machine learning, with applications ranging from image recognition to natural language understanding. However, choosing the right optimizer is crucial to a network's performance, and no single optimizer suits every task. This study examines the performance of different optimization algorithms (SGD, Adam, RMSprop, Nadam, and FTRL) in neural network training. Using Analysis of Variance (ANOVA) and experiments on synthetic datasets, we compare how these optimizers affect training efficiency and model accuracy. Our results show that adaptive optimizers such as Adam and Nadam are particularly effective in the early phases of training, yielding faster convergence and greater error reduction. These findings highlight the importance of selecting an optimizer suited to the specific training scenario. This research clarifies how the choice of optimizer affects neural network training, guiding better choices in neural network design.
Keywords: Optimizers, Neural Networks, ANOVA, Deep Learning, Statistics
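
The sketch below illustrates the kind of comparison the abstract describes: training the same small network with each optimizer on a synthetic dataset, collecting final training losses over repeated runs, and testing for a difference in means with a one-way ANOVA. The dataset size, architecture, epoch count, and number of runs are assumptions for illustration only, not the paper's exact experimental setup.

```python
# Minimal sketch: compare Keras optimizers on a synthetic dataset via one-way ANOVA.
# Hyperparameters and architecture are illustrative assumptions, not the study's settings.
import tensorflow as tf
from scipy.stats import f_oneway
from sklearn.datasets import make_classification

# Synthetic binary-classification data.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

def final_loss(optimizer_name, seed):
    """Train a small MLP with the given optimizer and return its final training loss."""
    tf.keras.utils.set_random_seed(seed)
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=optimizer_name, loss="binary_crossentropy")
    history = model.fit(X, y, epochs=10, batch_size=64, verbose=0)
    return history.history["loss"][-1]

optimizers = ["sgd", "adam", "rmsprop", "nadam", "ftrl"]
runs = 5  # repeated runs per optimizer form the groups compared by ANOVA

losses = {opt: [final_loss(opt, seed) for seed in range(runs)] for opt in optimizers}

# One-way ANOVA: does the choice of optimizer affect the mean final loss?
f_stat, p_value = f_oneway(*losses.values())
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")
```

A small p-value here would indicate that at least one optimizer's mean final loss differs from the others, which is the kind of effect the study's ANOVA is designed to detect.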