Neural Comput ; 35(7): 1234-1287, 2023 Jun 12.
Article in English | MEDLINE | ID: mdl-37187168

ABSTRACT

Gradient descent methods are simple and efficient optimization algorithms with widespread applications. To handle high-dimensional problems, we study compressed stochastic gradient descent (CompSGD), which uses low-dimensional gradient updates. We provide a detailed analysis in terms of both optimization rates and generalization rates. To this end, we develop uniform stability bounds for CompSGD for both smooth and nonsmooth problems, from which we derive almost optimal population risk bounds. We then extend our analysis to two variants of SGD: batch and mini-batch gradient descent. Furthermore, we show that these variants achieve almost optimal rates compared with their high-dimensional counterparts. Thus, our results provide a way to reduce the dimension of gradient updates without affecting the convergence rate in the generalization analysis. Moreover, we show that the same result also holds in the differentially private setting, which allows us to reduce the dimension of the added noise at an "almost free" cost.
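The abstract does not specify the compression scheme, so the following is only a minimal sketch of the general idea of SGD with low-dimensional gradient updates, assuming a random-projection (Gaussian sketch) compressor; the function name compressed_sgd_step, the projection matrix, and the toy least-squares objective are illustrative assumptions, not the paper's method.

```python
import numpy as np

def compressed_sgd_step(w, grad, proj, lr):
    """One compressed SGD update (illustrative sketch).

    The stochastic gradient is projected onto a k-dimensional subspace
    spanned by the columns of `proj`, then lifted back to the original
    d-dimensional parameter space before the update is applied.

    w    : (d,) current parameters
    grad : (d,) stochastic gradient at w
    proj : (d, k) random projection matrix with k << d (assumed compressor)
    lr   : learning rate
    """
    compressed = proj.T @ grad   # k-dimensional compressed gradient
    update = proj @ compressed   # map back to d dimensions
    return w - lr * update

# Toy usage: least-squares regression with d = 1000 compressed to k = 50.
rng = np.random.default_rng(0)
d, k, n = 1000, 50, 200
X = rng.standard_normal((n, d))
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

w = np.zeros(d)
proj = rng.standard_normal((d, k)) / np.sqrt(k)   # scaled Gaussian sketch
for t in range(500):
    i = rng.integers(n)                 # sample one example (SGD)
    grad = (X[i] @ w - y[i]) * X[i]     # per-example gradient
    w = compressed_sgd_step(w, grad, proj, lr=0.01)
```

In the mini-batch variant mentioned in the abstract, `grad` would simply be averaged over a batch of examples before the same projection step is applied.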
