Show simple item record

dc.contributor.author: Zavriev, S.
dc.contributor.author: Solodov, Mikhail
dc.date.accessioned: 2013-01-25T19:38:16Z
dc.date.available: 2013-01-25T19:38:16Z
dc.date.issued: 1994-06-06
dc.identifier.citation: 94-05
dc.identifier.uri: http://digital.library.wisc.edu/1793/64528
dc.description.abstract: Convergence properties of the generalized gradient projection algorithm in the presence of data perturbations are investigated. It is shown that every trajectory of the method is attracted, in a certain sense, to an ε-stationary set of the problem, where ε depends on the magnitude of the perturbations. Estimates for the attraction sets of the iterates are given in the general (nonsmooth and nonconvex) case. In the convex case, our results imply convergence to an ε-optimal set. The results are further strengthened for weakly sharp and strongly convex problems. Convergence of the parallel algorithm in the case of an additive objective function is established. One of the principal applications of our results is the stability analysis of the classical backpropagation algorithm for training artificial neural networks.
dc.subject: backpropagation convergence
dc.subject: parallelization
dc.subject: error stability
dc.subject: gradient projection
dc.title: STABILITY PROPERTIES OF THE GRADIENT PROJECTION METHOD WITH APPLICATIONS TO THE BACKPROPAGATION ALGORITHM
dc.type: Technical Report
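The abstract studies gradient projection under data perturbations. A minimal sketch of that iteration, x_{k+1} = P_X(x_k − α(∇f(x_k) + e_k)) with ||e_k|| ≤ ε, is given below; the box constraint set, step size, noise model, and function names are illustrative assumptions, not details taken from the report.

```python
import numpy as np

def project_box(x, lo, hi):
    # Euclidean projection onto the box [lo, hi]^n (illustrative choice of X).
    return np.clip(x, lo, hi)

def perturbed_gradient_projection(grad, x0, lo, hi, step=0.1, eps=1e-3,
                                  iters=200, seed=0):
    # Gradient projection with an additive gradient error e_k of norm <= eps:
    #     x_{k+1} = P_X(x_k - step * (grad(x_k) + e_k))
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        e = rng.uniform(-1.0, 1.0, size=x.shape)
        e *= eps / max(np.linalg.norm(e), 1e-12)  # enforce ||e_k|| <= eps
        x = project_box(x - step * (grad(x) + e), lo, hi)
    return x

# Example: minimize f(x) = ||x - c||^2 / 2 over the box [0, 1]^2.
c = np.array([2.0, -0.5])        # unconstrained minimizer lies outside the box
grad = lambda x: x - c           # exact gradient of f
x_star = perturbed_gradient_projection(grad, x0=np.zeros(2), lo=0.0, hi=1.0)
print(x_star)                    # near the projected optimum (1, 0), up to O(eps)
```

In line with the abstract's claim, the iterates settle near the constrained optimum with a residual error on the order of the perturbation magnitude ε, rather than converging exactly.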



This item appears in the following Collection(s)

  • Math Prog Technical Reports
    Archive for the Department of Computer Sciences at the University of Wisconsin-Madison
