IEEE Trans Neural Netw Learn Syst; 32(12): 5723-5737, 2021 Dec.
Article in English | MEDLINE | ID: mdl-33048761

ABSTRACT

In this article, we propose a novel distributed synchronous optimization algorithm, called NHDA, that is doubly accelerated by a Nesterov gradient term and a heavy-ball momentum term, and we adopt a general asynchronous model to further derive an effective asynchronous variant, called ASY-NHDA, for the distributed optimization problem over directed graphs, in which each agent holds a local objective function and computes the optimal solution by communicating only with its immediate neighbors. The goal is to minimize the sum of all local objective functions, each of which is strongly convex and has a Lipschitz-continuous gradient. In the asynchronous model, agents communicate with their immediate neighbors and start new computations independently; that is, an agent can communicate with its neighbors at any time without coordination and uses delayed information from its in-neighbors to compute its next update. Delays are arbitrary, unpredictable, and time-varying, but bounded. The theoretical analysis of NHDA rests on the interaction among the consensus, gradient-tracking, and optimization processes. For ASY-NHDA, we equivalently transform the asynchronous system into an augmented synchronous system without delays and prove convergence using the generalized small-gain theorem. The results show that NHDA and ASY-NHDA converge to the optimal solution at a linear rate as long as the largest step size is positive and below an explicitly estimated upper bound, and the largest momentum parameter is nonnegative and below its own upper bound. Finally, we demonstrate the advantages of ASY-NHDA through simulations.
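To make the update structure concrete, below is a minimal sketch of one "double accelerated" gradient-tracking iteration, pairing a Nesterov-style lookahead with a heavy-ball momentum term. The mixing matrix, parameter values, quadratic test objectives, and the exact placement of the momentum terms are illustrative assumptions for a toy synchronous setting; they are not the paper's NHDA updates, which are specified in the article itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3  # number of agents, problem dimension

# Local strongly convex quadratics f_i(x) = 0.5 * ||Q_i x - b_i||^2
# (assumed test objectives, not from the paper)
Q = rng.standard_normal((n, d, d)) + 2.0 * np.eye(d)
b = rng.standard_normal((n, d))

def local_grads(X):
    """Row i holds the gradient of f_i evaluated at X[i]."""
    return np.stack([Q[i].T @ (Q[i] @ X[i] - b[i]) for i in range(n)])

# Mixing matrix for a directed ring with self-loops; identity plus a
# permutation, so it happens to be doubly stochastic in this toy example
A = 0.5 * (np.eye(n) + np.roll(np.eye(n), 1, axis=1))

alpha, beta, gamma = 0.02, 0.1, 0.1  # step size, heavy-ball, Nesterov

X = rng.standard_normal((n, d))   # one iterate per agent (rows)
X_prev = X.copy()
G = local_grads(X)
Y = G.copy()                      # trackers start at the local gradients

for _ in range(500):
    Z = X + gamma * (X - X_prev)                      # Nesterov lookahead
    X_new = A @ Z - alpha * Y + beta * (X - X_prev)   # mix + step + heavy ball
    G_new = local_grads(X_new)
    Y = A @ Y + G_new - G          # gradient tracking of the average gradient
    X_prev, X, G = X, X_new, G_new

print("consensus spread:", np.ptp(X, axis=0).max())
print("avg gradient norm:", np.linalg.norm(local_grads(X).mean(axis=0)))
```

In this toy setup, with a doubly stochastic mixing matrix and small enough alpha, beta, and gamma, both printed quantities shrink toward zero over the iterations, mirroring the regime the abstract describes: linear convergence whenever the largest step size and momentum parameter stay below their respective bounds.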
