Original scientific paper

https://doi.org/10.2498/cit.2001.01.01

Training Artificial Neural Networks: Backpropagation via Nonlinear Optimization

K. Wendy Tang
Jadranka Skorin-Kapov


Full text: English PDF, 425 KB

Pages: 1–14


Abstract

In this paper we explore different strategies to guide the backpropagation algorithm used for training artificial neural networks. Two variants of the steepest descent-based backpropagation algorithm and four variants of the conjugate gradient algorithm are tested. The variants differ in whether or not the time component is used, and in whether or not additional gradient information is utilized during one-dimensional optimization. Testing is performed on randomly generated data as well as on benchmark data for energy prediction. Based on our test results, the most promising backpropagation strategy appears to be to start with the steepest descent algorithm and then continue with the conjugate gradient algorithm. The backpropagation-through-time strategy combined with conjugate gradients appears promising as well.
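The hybrid strategy the abstract recommends — a steepest descent phase followed by a conjugate gradient phase — can be sketched on a toy problem. The model, data, learning rate, and the use of Fletcher–Reeves updates with a crude backtracking line search below are illustrative assumptions, not the paper's exact variants:

```python
import numpy as np

def loss_and_grad(w, x, y):
    """MSE loss and analytic gradient for a toy net y_hat = w2 * tanh(w1 * x)."""
    w1, w2 = w
    h = np.tanh(w1 * x)
    e = w2 * h - y
    loss = 0.5 * np.mean(e ** 2)
    g1 = np.mean(e * w2 * (1 - h ** 2) * x)  # d loss / d w1
    g2 = np.mean(e * h)                      # d loss / d w2
    return loss, np.array([g1, g2])

def train_hybrid(x, y, sd_steps=50, cg_steps=50, lr=0.1):
    """Phase 1: plain steepest descent. Phase 2: Fletcher-Reeves conjugate
    gradient with backtracking line search (stand-in for the paper's
    one-dimensional optimization)."""
    w = np.array([0.5, 0.5])
    # --- phase 1: steepest descent with a fixed learning rate ---
    for _ in range(sd_steps):
        _, g = loss_and_grad(w, x, y)
        w = w - lr * g
    # --- phase 2: conjugate gradient ---
    loss, g = loss_and_grad(w, x, y)
    d = -g
    for _ in range(cg_steps):
        alpha = 1.0
        # halve the step until the loss along d stops increasing
        while loss_and_grad(w + alpha * d, x, y)[0] > loss and alpha > 1e-8:
            alpha *= 0.5
        w = w + alpha * d
        new_loss, new_g = loss_and_grad(w, x, y)
        beta = (new_g @ new_g) / (g @ g + 1e-12)  # Fletcher-Reeves coefficient
        d = -new_g + beta * d
        loss, g = new_loss, new_g
    return w, loss

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 200)
y = 0.8 * np.tanh(2.0 * x)  # synthetic target the model can represent exactly
w, final_loss = train_hybrid(x, y)
print(w, final_loss)
```

The steepest descent phase moves the weights cheaply into a good region; the conjugate gradient phase then exploits curvature information implicitly through the search directions, which matches the ordering the abstract reports as most promising.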

Keywords

Hrčak ID:

44817

URI

https://hrcak.srce.hr/44817

Publication date:

30.3.2001.
