Variational Learning for the Inverted Beta-Liouville Mixture Model and Its Application to Text Categorization

Authors

Y. Ling, W. Guan, Q. Ruan, H. Song, Y. Lai

DOI:

https://doi.org/10.9781/ijimai.2022.08.006

Keywords:

Bayesian Inference, Extended Variational Inference, Mixture Model, Text Categorization, Inverted Beta-Liouville Distribution

Supporting Agencies

This work was supported by the General Project of Science and Technology Plan of Beijing Municipal Commission of Education (No. KM201910009014) and the National Natural Science Foundation of China (Grant No. 62172193).

Abstract

The finite inverted Beta-Liouville mixture model (IBLMM) has recently gained attention owing to its capability of modeling positive data. Under the conventional variational inference (VI) framework, an analytically tractable solution to the optimization of the variational posterior distribution cannot be obtained, because the variational objective function involves the evaluation of intractable moments. Within the recently proposed extended variational inference (EVI) framework, a new function is introduced to replace the original variational objective function so that intractable moment computation is avoided, and an analytically tractable solution for the IBLMM can be derived in an effective way. The good performance of the proposed approach is demonstrated by experiments on both synthesized data and a real-world application, namely text categorization.
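
To make the EVI idea concrete, the following is a minimal sketch of the single-lower-bound principle in generic notation (an illustration under assumed notation, not the paper's exact derivation). Conventional VI maximizes the evidence lower bound

\[
\mathcal{L}(q) = \mathbb{E}_{q}\!\left[\ln p(\mathbf{X}, \boldsymbol{\Theta})\right] - \mathbb{E}_{q}\!\left[\ln q(\boldsymbol{\Theta})\right],
\]

where \(\boldsymbol{\Theta}\) collects the latent variables and parameters of the mixture. When \(\mathbb{E}_{q}[\ln p(\mathbf{X}, \boldsymbol{\Theta})]\) contains intractable moments, EVI substitutes a helper function \(\tilde{p}(\mathbf{X}, \boldsymbol{\Theta}) \le p(\mathbf{X}, \boldsymbol{\Theta})\), yielding the tractable surrogate

\[
\tilde{\mathcal{L}}(q) = \mathbb{E}_{q}\!\left[\ln \tilde{p}(\mathbf{X}, \boldsymbol{\Theta})\right] - \mathbb{E}_{q}\!\left[\ln q(\boldsymbol{\Theta})\right] \le \mathcal{L}(q),
\]

which can then be maximized in closed form with respect to each factor of \(q(\boldsymbol{\Theta})\).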

Published

2022-09-01

How to Cite

Ling, Y., Guan, W., Ruan, Q., Song, H., and Lai, Y. (2022). Variational Learning for the Inverted Beta-Liouville Mixture Model and Its Application to Text Categorization. International Journal of Interactive Multimedia and Artificial Intelligence, 7(5), 76–84. https://doi.org/10.9781/ijimai.2022.08.006