Year: 2022 (published 09/2022)
Authors: Yongfa Ling, Wenbo Guan, Qiang Ruan, Heping Song, Yuping Lai
Title: Variational Learning for the Inverted Beta-Liouville Mixture Model and Its Application to Text Categorization
Keywords: Bayesian Inference; Extended Variational Inference; Mixture Model; Text Categorization; Inverted Beta-Liouville Distribution
Volume: 7
Pages: 76-84
ISSN: 1989-1660
URL: https://www.ijimai.org/journal/sites/default/files/2022-08/ijimai_7_5_9.pdf
Abstract: The finite inverted Beta-Liouville mixture model (IBLMM) has recently gained attention for its ability to model positive data. Under the conventional variational inference (VI) framework, an analytically tractable solution to the optimization of the variational posterior distribution cannot be obtained, since the variational objective function involves the evaluation of intractable moments. Under the recently proposed extended variational inference (EVI) framework, a new function replaces the original variational objective function to avoid the intractable moment computation, so that an analytically tractable solution for the IBLMM can be derived efficiently. The good performance of the proposed approach is demonstrated by experiments on both synthesized data and a real-world application, namely text categorization.
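The key step described in the abstract, replacing the intractable variational objective with a tractable surrogate, can be written out as a short bound argument. The notation below (data $\mathbf{X}$, latent variables and parameters $\Theta$, surrogate $\tilde{p}$) is illustrative and not taken from the record; it is a minimal sketch of the general EVI substitution, not the paper's exact derivation:

\[
\mathcal{L}(q) \;=\; \mathbb{E}_q\!\left[\ln p(\mathbf{X},\Theta)\right] \;-\; \mathbb{E}_q\!\left[\ln q(\Theta)\right]
\;\geq\;
\mathbb{E}_q\!\left[\ln \tilde{p}(\mathbf{X},\Theta)\right] \;-\; \mathbb{E}_q\!\left[\ln q(\Theta)\right]
\;=\; \tilde{\mathcal{L}}(q),
\]

which holds provided $\tilde{p}(\mathbf{X},\Theta) \le p(\mathbf{X},\Theta)$ pointwise, so that $\tilde{\mathcal{L}}(q)$ remains a valid lower bound on the log evidence. Choosing $\tilde{p}$ so that $\mathbb{E}_q[\ln \tilde{p}]$ has a closed form is what makes analytically tractable update equations for the variational posterior of the IBLMM possible.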