A stochastic algorithm for solving the posterior inference problem in topic models
TELKOMNIKA; 20(5): 971-978, 2022.
Article in English | ProQuest Central | ID: covidwho-2025608
ABSTRACT
Latent Dirichlet allocation (LDA) is an important probabilistic generative model that is widely used in domains such as text mining, information retrieval, and natural language processing. Posterior inference is the key problem that determines the quality of an LDA model, but it is non-deterministic polynomial (NP)-hard and often intractable, especially in the worst case. For individual texts, methods such as variational Bayesian (VB), collapsed variational Bayesian (CVB), collapsed Gibbs sampling (CGS), and online maximum a posteriori estimation (OPE) have been proposed to avoid solving this problem directly, but apart from variants of OPE they typically provide no guarantee on the convergence rate or on the quality of the learned models. Building on OPE and incorporating the Bernoulli distribution, we design an algorithm, general online maximum a posteriori estimation using two stochastic bounds (GOPE2), for solving the posterior inference problem in the LDA model, which is itself an NP-hard non-convex optimization problem. Through theoretical analysis and experiments on large datasets, we show that GOPE2 provides an efficient method for learning topic models from big text collections, especially massive/streaming texts, and is more efficient than previous methods.
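The abstract describes GOPE2 only at a high level, so the following is a rough, hedged sketch of the OPE-style stochastic Frank-Wolfe scheme it builds on: the per-document MAP objective is split into a likelihood term and a Dirichlet-prior term, one of the two is drawn at each step via a Bernoulli(p) coin, and a Frank-Wolfe update is applied to the running average. All names, the function `ope_style_inference`, and the exact update rule are illustrative assumptions; the published GOPE2 combines two stochastic bounds, which this single Bernoulli-weighted choice only approximates.

```python
import numpy as np

def ope_style_inference(doc_counts, beta, alpha=0.01, p=0.5, n_iters=50, rng=None):
    """Sketch of an OPE-style stochastic Frank-Wolfe inference step for one document.

    Approximately maximizes, over the topic mixture theta on the simplex,
        f(theta) = sum_j d_j log(sum_k theta_k beta[k, j]) + (alpha - 1) sum_k log theta_k
    by drawing one of the two terms with a Bernoulli(p) coin at each iteration
    and taking a Frank-Wolfe step on the running average of the drawn terms.
    """
    rng = np.random.default_rng() if rng is None else rng
    K = beta.shape[0]
    theta = np.full(K, 1.0 / K)   # start at the simplex barycentre
    picks_likelihood = 0          # how many times the likelihood term was drawn

    for t in range(1, n_iters + 1):
        picks_likelihood += rng.random() < p

        # Gradients of the two component terms (floored to avoid division by zero).
        grad_lik = beta @ (doc_counts / np.maximum(theta @ beta, 1e-12))
        grad_prior = (alpha - 1.0) / np.maximum(theta, 1e-12)

        # Gradient of the averaged stochastic objective F_t(theta).
        grad = (picks_likelihood * grad_lik
                + (t - picks_likelihood) * grad_prior) / t

        # Frank-Wolfe update: move toward the best simplex vertex with step 1/t.
        vertex = np.argmax(grad)
        step = 1.0 / t
        theta *= (1.0 - step)
        theta[vertex] += step
    return theta

# Toy usage with synthetic data (purely illustrative sizes).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K, V = 10, 200
    beta = rng.dirichlet(np.ones(V), size=K)        # topic-word distributions
    doc = rng.multinomial(100, rng.dirichlet(np.ones(V)))  # word counts of one document
    print(ope_style_inference(doc, beta, rng=rng))
```

The appeal of this family of methods, as the abstract emphasizes, is that each step touches only one randomly chosen part of the objective yet the iterates still converge toward a local MAP solution, which keeps per-document inference cheap enough for massive or streaming text collections.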
Keywords

Full text: Available | Collection: Databases of international organizations | Database: ProQuest Central | Language: English | Journal: TELKOMNIKA | Year: 2022 | Document Type: Article
