000129895 001__ 129895
000129895 005__ 20240731103353.0
000129895 0247_ $$2doi$$a10.3390/forecast5030031
000129895 0248_ $$2sideral$$a136173
000129895 037__ $$aART-2023-136173
000129895 041__ $$aeng
000129895 100__ $$0(orcid)0000-0002-5801-0602$$aLujano-Rojas, Juan M.$$uUniversidad de Zaragoza
000129895 245__ $$aSearching for promisingly trained artificial neural networks
000129895 260__ $$c2023
000129895 5060_ $$aAccess copy available to the general public$$fUnrestricted
000129895 5203_ $$aAssessing the training process of artificial neural networks (ANNs) is vital for enhancing their performance and broadening their applicability. This paper employs the Monte Carlo simulation (MCS) technique, integrated with a stopping criterion, to construct the probability distribution of the learning error of an ANN designed for short-term forecasting. The training and validation processes were conducted multiple times, each time considering a unique random starting point, and the subsequent forecasting error was calculated one step ahead. From this, we ascertained the probability of having obtained all the local optima. Our extensive computational analysis involved training a shallow feedforward neural network (FFNN) using wind power and load demand data from the transmission systems of the Netherlands and Germany. Furthermore, the analysis was expanded to include wind speed prediction using a long short-term memory (LSTM) network at a site in Spain. The improvement gained from the FFNN, which has a high probability of being the global optimum, ranges from 0.7% to 8.6%, depending on the forecasting variable. This solution outperforms the persistent model by between 5.5% and 20.3%. For wind speed predictions using an LSTM, the improvement over an average-trained network stands at 9.5%, and is 6% superior to the persistent approach. These outcomes suggest that the advantages of exhaustive search vary based on the problem being analyzed and the type of network in use. The MCS method we implemented, which estimates the probability of identifying all local optima, can act as a foundational step for other techniques like Bayesian model selection, which assumes that the global optimum is encompassed within the available hypotheses.
000129895 536__ $$9info:eu-repo/grantAgreement/ES/AEI/PID2021-123172OB-I00$$9info:eu-repo/grantAgreement/EUR/AEI/TED2021-129801B-I00
000129895 540__ $$9info:eu-repo/semantics/openAccess$$aby$$uhttp://creativecommons.org/licenses/by/3.0/es/
000129895 592__ $$a0.532$$b2023
000129895 593__ $$aEconomics, Econometrics and Finance (miscellaneous)$$c2023$$dQ1
000129895 593__ $$aDecision Sciences (miscellaneous)$$c2023$$dQ2
000129895 593__ $$aComputational Theory and Mathematics$$c2023$$dQ2
000129895 593__ $$aComputer Science Applications$$c2023$$dQ2
000129895 594__ $$a5.8$$b2023
000129895 655_4 $$ainfo:eu-repo/semantics/article$$vinfo:eu-repo/semantics/publishedVersion
000129895 700__ $$0(orcid)0000-0002-1490-6423$$aDufo-López, Rodolfo$$uUniversidad de Zaragoza
000129895 700__ $$0(orcid)0000-0001-7764-235X$$aArtal-Sevil, Jesús Sergio$$uUniversidad de Zaragoza
000129895 700__ $$0(orcid)0000-0003-2457-0422$$aGarcía-Paricio, Eduardo$$uUniversidad de Zaragoza
000129895 7102_ $$15009$$2535$$aUniversidad de Zaragoza$$bDpto. Ingeniería Eléctrica$$cÁrea Ingeniería Eléctrica
000129895 773__ $$g5, 3 (2023), 550-575$$pForecasting$$tForecasting$$x2571-9394
000129895 8564_ $$s10462212$$uhttps://zaguan.unizar.es/record/129895/files/texto_completo.pdf$$yVersión publicada
000129895 8564_ $$s2613441$$uhttps://zaguan.unizar.es/record/129895/files/texto_completo.jpg?subformat=icon$$xicon$$yVersión publicada
000129895 909CO $$ooai:zaguan.unizar.es:129895$$particulos$$pdriver
000129895 951__ $$a2024-07-31-09:55:25
000129895 980__ $$aARTICLE
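The abstract (field 5203) describes a multi-restart Monte Carlo procedure: train the network repeatedly from random starting points, collect the resulting learning errors, and stop once it is probable that all local optima have been observed. The sketch below illustrates that idea on synthetic data with a tiny feedforward network; it is not the authors' implementation, and the network size, rounding tolerance, and stopping window are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic one-step-ahead style regression task
x = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(2.5 * x) + 0.05 * rng.normal(size=(200, 1))

def train_ffnn(x, y, hidden=5, lr=0.1, epochs=500, seed=0):
    """Train a 1-hidden-layer tanh network by full-batch gradient descent
    from a random starting point; return the final mean-squared error."""
    r = np.random.default_rng(seed)
    w1 = r.normal(scale=0.5, size=(1, hidden)); b1 = np.zeros(hidden)
    w2 = r.normal(scale=0.5, size=(hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ w1 + b1)          # hidden activations
        err = (h @ w2 + b2) - y           # prediction error
        # Backpropagation through the two layers
        g2 = h.T @ err / len(x); gb2 = err.mean(0)
        dh = (err @ w2.T) * (1 - h ** 2)
        g1 = x.T @ dh / len(x); gb1 = dh.mean(0)
        w2 -= lr * g2; b2 -= lr * gb2
        w1 -= lr * g1; b1 -= lr * gb1
    return float(np.mean(err ** 2))

def mc_search(max_restarts=200, window=30, tol=3):
    """Monte Carlo restarts with a simple stopping criterion: stop once
    `window` consecutive restarts yield no new distinct local optimum
    (final losses rounded to `tol` decimals count as the same optimum)."""
    seen, losses, since_new = set(), [], 0
    for k in range(max_restarts):
        loss = train_ffnn(x, y, seed=k)
        losses.append(loss)
        key = round(loss, tol)
        if key not in seen:
            seen.add(key); since_new = 0
        else:
            since_new += 1
        if since_new >= window:
            break
    return losses, seen

losses, optima = mc_search()
print(f"restarts: {len(losses)}, distinct optima: {len(optima)}, "
      f"best MSE: {min(losses):.4f}")
```

The collected `losses` give an empirical distribution of the learning error across starting points, and the best restart plays the role of the "promisingly trained" network the title refers to; a more refined stopping rule (as in the paper) would turn the repeat counts into an explicit probability of having found all local optima.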