
We study the general class of estimators for graphical model structure based on optimizing an $\ell_1$-regularized approximate log-likelihood, where the approximate likelihood uses tractable variational approximations of the partition function. We provide a message-passing algorithm that \emph{directly} computes the $\ell_1$-regularized approximate MLE. Further, in the case of certain reweighted entropy approximations to the partition function, we show that, surprisingly, the $\ell_1$-regularized approximate MLE has a \emph{closed form}, so that we no longer need to run many iterations of approximate inference and message-passing. Lastly, we analyze this general class of estimators for graph structure recovery, i.e., its \emph{sparsistency}, and show that it is indeed sparsistent under certain conditions.
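The closed-form behavior of $\ell_1$ regularization can be illustrated with the soft-thresholding operator, the well-known proximal map of the $\ell_1$ norm, which yields exact zeros and hence sparse estimates. The quadratic objective and penalty level below are illustrative stand-ins, not the paper's estimator; this is a minimal NumPy sketch:

```python
import numpy as np

def soft_threshold(x, lam):
    # Closed-form proximal operator of lam * ||x||_1:
    # shrinks each coordinate toward zero by lam, zeroing small entries.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# Illustrative l1-regularized problem (NOT the paper's estimator):
#   minimize_x  0.5 * ||x - b||^2 + lam * ||x||_1
# whose exact minimizer is soft_threshold(b, lam).
b = np.array([3.0, -0.5, 0.0, 1.5])
lam = 1.0
x_hat = soft_threshold(b, lam)
print(x_hat)  # entries of b below lam in magnitude become exactly zero
```

Because the solution is exact and sparse, no iterative optimization is needed for this subproblem, which is the flavor of the closed-form result described above.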
