
Distributed Stochastic Subgradient Optimization Algorithms Over Random and Noisy Networks

Posted by king on 2021-05-06


Document length: 41 pages

Abstract: We study distributed stochastic optimization by networked nodes that cooperatively minimize a sum of convex cost functions. The network is modeled by a sequence of time-varying random digraphs, with each node representing a local optimizer and each edge representing a communication link. We consider the distributed subgradient optimization algorithm with noisy measurements of the local cost functions' subgradients, and with additive and multiplicative noises in the information exchanged between each pair of nodes. By the stochastic Lyapunov method, convex analysis, algebraic graph theory and martingale convergence theory, it is proved that if the local subgradient functions grow linearly and the sequence of digraphs is conditionally balanced and uniformly conditionally jointly connected, then the algorithm step sizes can be designed so that the states of all nodes converge to the global optimal solution almost surely.
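The abstract does not give the paper's exact algorithm or noise model, but the general scheme it describes — consensus mixing over a random balanced graph followed by a noisy subgradient step with diminishing step sizes — can be illustrated with a minimal sketch. The following assumes a scalar decision variable, hypothetical quadratic local costs f_i(x) = (x − c_i)², i.i.d. random link failures, additive subgradient noise only, and Metropolis consensus weights (which yield a balanced, doubly stochastic mixing matrix); none of these specifics come from the paper itself.

```python
import random

def metropolis_mix(x, edges, deg):
    """One consensus step with Metropolis weights (doubly stochastic, i.e. balanced)."""
    new = list(x)
    for i, j in edges:
        w = 1.0 / (1 + max(deg[i], deg[j]))
        new[i] += w * (x[j] - x[i])
        new[j] += w * (x[i] - x[j])
    return new

def run(num_nodes=5, iters=20000, seed=0):
    rng = random.Random(seed)
    targets = [float(i) for i in range(num_nodes)]   # local costs f_i(x) = (x - c_i)^2
    optimum = sum(targets) / num_nodes               # minimizer of sum_i f_i
    x = [0.0] * num_nodes                            # each node's local estimate
    for k in range(iters):
        # Random network: each undirected link is alive with probability 0.5.
        edges = [(i, j) for i in range(num_nodes)
                 for j in range(i + 1, num_nodes) if rng.random() < 0.5]
        deg = [0] * num_nodes
        for i, j in edges:
            deg[i] += 1
            deg[j] += 1
        x = metropolis_mix(x, edges, deg)
        # Diminishing step size: sum alpha_k = inf, sum alpha_k^2 < inf.
        step = 1.0 / (k + 1)
        # Noisy subgradient step: exact gradient 2(x - c_i) plus additive noise.
        x = [x[n] - step * (2.0 * (x[n] - targets[n]) + rng.gauss(0.0, 0.1))
             for n in range(num_nodes)]
    return x, optimum
```

Each iteration mixes the nodes' states over a freshly drawn random graph and then takes a noisy local subgradient step; with the step-size condition above, all local estimates drift toward the global minimizer (here the mean of the hypothetical targets) despite the link failures and measurement noise.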
