PI: Andy Li
GraphBTM is an unsupervised topic model for understanding documents: it learns latent representations of documents and produces meaningful clusterings of words within each topic. GraphBTM aims to overcome the limitations of Latent Dirichlet Allocation (LDA), which suffers from data sparsity on short texts, and of the Biterm Topic Model (BTM), whose assumption of a single whole-corpus topic distribution is too restrictive. We model biterms (word pairs co-occurring in a fixed-size text window) as graphs to capture word-correlation features, and we apply Graph Convolutional Networks (GCNs) to these graphs to extract features and exploit the transitivity of biterms: given biterms (A, B) and (A, C), the words B and C tend to be similar because both are neighbors of A, a pattern that graphs capture naturally.

To learn the latent representation of texts, we use an AEVB-based (auto-encoding variational Bayes) inference method with neural networks. A Dirichlet prior is important in topic models, but it is hard to apply directly because the reparameterization trick does not work with it. To solve this problem, we approximate the Dirichlet with a logistic normal distribution.
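The biterm-graph idea above can be sketched in a few lines: count word pairs inside a sliding window into a symmetric adjacency matrix, then run one (weight-free) GCN propagation step over it. This is an illustrative sketch, not the project's implementation; the window size, toy vocabulary, and the use of identity features are assumptions made here for clarity.

```python
import itertools

import numpy as np

def biterm_adjacency(tokens, vocab, window=2):
    """Count biterms (unordered word pairs inside a fixed-size sliding
    window) and store them as a symmetric word-word adjacency matrix."""
    idx = {w: i for i, w in enumerate(vocab)}
    A = np.zeros((len(vocab), len(vocab)))
    for start in range(len(tokens) - window + 1):
        for w1, w2 in itertools.combinations(tokens[start:start + window], 2):
            if w1 != w2:
                A[idx[w1], idx[w2]] += 1
                A[idx[w2], idx[w1]] += 1
    return A

def gcn_propagate(A, X):
    """One GCN propagation step, H = D^{-1/2} (A + I) D^{-1/2} X,
    without a learned weight matrix (omitted for simplicity)."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return (A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]) @ X
```

On a toy corpus containing the biterms (A, B) and (A, C), the words B and C never co-occur, so their adjacency entry is zero; after one propagation step their features nevertheless overlap through the shared neighbor A, which is the transitivity effect described above:

```python
vocab = ["A", "B", "C"]
A = biterm_adjacency(["A", "B", "A", "C"], vocab)
H = gcn_propagate(A, np.eye(3))
cos = H[1] @ H[2] / (np.linalg.norm(H[1]) * np.linalg.norm(H[2]))
```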
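The logistic normal approximation admits a short sketch as well. A common choice (a Laplace approximation in the softmax basis, as in Srivastava and Sutton's AVITM; this specific parameterization is an assumption here) maps the Dirichlet parameter `alpha` to a Gaussian mean and variance, after which the reparameterization trick applies directly:

```python
import numpy as np

def dirichlet_to_logistic_normal(alpha):
    """Laplace approximation of Dirichlet(alpha) in the softmax basis
    (AVITM-style; assumed here, the exact prior mapping may differ)."""
    K = len(alpha)
    mu = np.log(alpha) - np.log(alpha).mean()
    var = (1.0 - 2.0 / K) / alpha + np.sum(1.0 / alpha) / K**2
    return mu, var

def sample_theta(mu, var, rng):
    """Reparameterization trick: theta = softmax(mu + sigma * eps),
    with eps drawn from a standard normal, so gradients flow through
    mu and var."""
    eps = rng.standard_normal(len(mu))
    z = mu + np.sqrt(var) * eps
    e = np.exp(z - z.max())
    return e / e.sum()
```

Because the randomness is isolated in `eps`, the sampled topic proportions `theta` remain differentiable with respect to the variational parameters, which is exactly what the Dirichlet's rejection-based sampler prevents.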