MatLM: a Matrix Formulation for Probabilistic Language Models

Yanshan Wang, Hongfang Liu

Probabilistic language models are widely used in Information Retrieval (IR) to rank documents by the probability that they generate the query. However, implementing these probabilistic representations in programming languages that favor matrix calculations is challenging. In this paper, we reformulate probabilistic language models using matrix representations. The matrix representation serves as a superstructure that organizes the calculated probabilities, and as a potential formalism for standardizing language models and for further mathematical analysis. It also facilitates implementation in matrix-friendly programming languages. Specifically, we derive matrix formulations of the conventional language model with Dirichlet smoothing and of two language models based on Latent Dirichlet Allocation (LDA), namely LBDM and LDI. We release MatLM, a Java software package implementing the proposed models. Code is available at: https://github.com/yanshanwang/JGibbLDA-v.1.0-MatLM.
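As a rough illustration of the kind of matrix formulation the abstract describes, consider the standard Dirichlet-smoothed query-likelihood model; the sketch below uses conventional notation (term-document count matrix $C$, collection model $p_C$, smoothing parameter $\mu$) and is not necessarily the paper's own notation.

For a vocabulary of size $V$ and $N$ documents, let $C \in \mathbb{R}^{V \times N}$ hold the term counts $c(w, d_j)$, let $p_C \in \mathbb{R}^{V}$ hold the collection probabilities $p(w \mid \mathcal{C})$, and let $|d_j|$ denote document length. Dirichlet smoothing gives
\[
p(w \mid d_j) = \frac{c(w, d_j) + \mu\, p(w \mid \mathcal{C})}{|d_j| + \mu}.
\]
Stacking these probabilities column-wise yields a single smoothed term-document probability matrix
\[
P = \bigl(C + \mu\, p_C \mathbf{1}_N^{\top}\bigr) D^{-1},
\qquad
D = \mathrm{diag}\bigl(|d_1| + \mu, \ldots, |d_N| + \mu\bigr),
\]
and with a query count vector $q \in \mathbb{R}^{V}$, the log query-likelihood scores of all documents follow from one matrix product (with $\log$ applied elementwise):
\[
s^{\top} = q^{\top} \log P,
\qquad
s_j = \sum_{w} c(w, q)\, \log p(w \mid d_j).
\]
This is the sense in which the per-term probability computations of a language model can be organized into matrix operations suited to matrix-oriented programming languages.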
