ABSTRACT
The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities. Inspired by the manifold hypothesis in machine learning, we propose and investigate a generalization of the standard setting that we name the random-features Hopfield model. Here, P binary patterns of length N are generated by applying a random projection followed by a nonlinearity to Gaussian vectors sampled in a latent space of dimension D. Using the replica method from statistical physics, we derive the phase diagram of the model in the limit P,N,D→∞ with fixed ratios α=P/N and α_{D}=D/N. Besides the usual retrieval phase, where the patterns can be dynamically recovered from some initial corruption, we uncover a new phase where the features characterizing the projection can be recovered instead. We call this phenomenon the learning phase transition, as the features are not explicitly given to the model but rather are inferred from the patterns in an unsupervised fashion.
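The pattern-generation step described above can be sketched in a few lines. This is a minimal illustration, not the paper's code: the names F (projection matrix), z (latent vector), and the choice of sign as the nonlinearity are our own labels for the generic construction.

```python
import random

random.seed(0)

N, D, P = 200, 50, 40  # pattern length, latent dimension, number of patterns

# Random projection F: an N x D matrix with i.i.d. Gaussian entries.
F = [[random.gauss(0.0, 1.0) for _ in range(D)] for _ in range(N)]

def make_pattern():
    """One binary pattern: a nonlinearity (here, sign) applied to a random
    projection of a Gaussian latent vector z of dimension D."""
    z = [random.gauss(0.0, 1.0) for _ in range(D)]
    return [1 if sum(F[i][k] * z[k] for k in range(D)) >= 0 else -1
            for i in range(N)]

patterns = [make_pattern() for _ in range(P)]

# The control parameters of the phase diagram are the fixed ratios
alpha, alpha_D = P / N, D / N
```

All P patterns live on (a binarized image of) a D-dimensional manifold embedded in {-1,+1}^N, which is what makes feature recovery possible in principle.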
ABSTRACT
We propose a simple yet very predictive form, based on a Poisson's equation, for the functional dependence of the cost on the density of points in the Euclidean bipartite matching problem. This leads, for quadratic costs, to the analytic prediction of the large-N limit of the average cost in dimension d = 1,2 and of the subleading correction in higher dimension. A nontrivial scaling exponent, γ(d) = (d-2)/d, which differs from the monopartite one, is found for the subleading correction. We argue that the same scaling holds true for a generic cost exponent in dimension d > 2.
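The d = 1 case admits a quick numerical check. For a convex cost such as the quadratic one, the optimal bipartite matching in one dimension is order-preserving, so it is found by simply sorting both point sets; for N uniform points on [0,1] the average total quadratic cost is N/(3(N+1)) → 1/3. The sketch below is our own illustration of this standard fact, not the ansatz of the paper.

```python
import random

random.seed(1)

def quadratic_matching_cost(n):
    """Optimal quadratic-cost bipartite matching cost in d = 1:
    sort both samples and pair them in order (order-preserving matching)."""
    xs = sorted(random.random() for _ in range(n))
    ys = sorted(random.random() for _ in range(n))
    return sum((x - y) ** 2 for x, y in zip(xs, ys))

n, trials = 500, 200
avg = sum(quadratic_matching_cost(n) for _ in range(trials)) / trials
# avg estimates n/(3(n+1)), which tends to 1/3 for large n
```

Note that the total cost stays O(1) as N grows: the typical displacement of a matched pair is O(1/√N), squared and summed over N pairs.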
Subject(s)
Statistics as Topic/methods, Poisson Distribution
ABSTRACT
Using a formalism based on the spectral decomposition of the replicated transfer matrix for disordered Ising models, we obtain several results that apply both to isolated one-dimensional systems and to locally treelike graph and factor-graph (p-spin) ensembles. We present exact analytical expressions, which can be efficiently approximated numerically, for many types of correlation functions and for the average free energies of open and closed finite chains. All the results achieved, with the exception of those involving closed chains, are then rigorously derived without replicas, using a probabilistic approach with the same flavor as the cavity method.
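The distinction between open and closed chains is already visible in the clean (non-disordered) 1D Ising model, where the transfer-matrix computation fits in a few lines. This toy sketch, much simpler than the replicated transfer matrix of the paper, uses the 2x2 matrix T[s,s'] = exp(βJ s s'): the closed-chain partition function is Tr(T^n), while the open chain sums T^(n-1) against free boundary vectors.

```python
import math

def free_energy(n, beta, J, closed=True):
    """Free energy per spin of a clean 1D Ising chain of n spins via the
    2x2 transfer matrix; periodic (closed) or free (open) boundaries."""
    a, b = math.exp(beta * J), math.exp(-beta * J)
    T = [[a, b], [b, a]]

    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    # n bonds for a closed ring, n - 1 bonds for an open chain
    M = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(n if closed else n - 1):
        M = matmul(M, T)

    if closed:
        Z = M[0][0] + M[1][1]  # trace: periodic boundary conditions
    else:
        Z = sum(M[i][j] for i in range(2) for j in range(2))  # free boundaries
    return -math.log(Z) / (beta * n)
```

For large n both boundary conditions converge to the bulk value -log(2 cosh βJ)/β, but the finite-n free energies differ, which is exactly the kind of boundary effect the chain results above quantify in the disordered case.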
Subject(s)
Models, Theoretical, Thermodynamics
ABSTRACT
We derive the analytical expression for the first finite-size correction to the average free energy of disordered Ising models on random regular graphs. The formula can be physically interpreted as a weighted sum over all non-self-intersecting loops in the graph, the weight being the free-energy shift due to the addition of the loop to an infinite tree.