For this lab, you will code two different inference algorithms on the Latent Dirichlet Allocation (LDA) model.
You will use [[https://www.dropbox.com/s/yr3n9w61ifon04h/files.tar.gz?dl=0|a dataset of general conference talks]]. Download and untar these files; there is helper code in the ''Hints'' section to help you process them.
**Part 1: Basic Gibbs Sampler**
for iters in range(0,100):
    p = compute_data_likelihood( docs_i, qs, topics, pdtm )
    print("Iter %d, p=%.2f" % (iters,p))
    # resample per-word topic assignments qs
</code>
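The loop above leaves the key step as a comment: resampling each word's topic assignment. A minimal sketch of that step, using collapsed Gibbs sampling on a toy corpus, is below. All names here (''docs'', ''qs'', ''n_dk'', ''n_kw'', ''n_k'') and the hyperparameter values are illustrative assumptions, not the lab's actual variables or helper code.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 2                    # number of topics (assumed)
V = 5                    # vocabulary size (assumed)
alpha, beta = 0.1, 0.01  # Dirichlet hyperparameters (assumed)

# Toy corpus: each document is a list of word ids.
docs = [[0, 1, 1, 2], [2, 3, 4, 4]]

# Random initial topic assignment for every word token.
qs = [[int(rng.integers(K)) for _ in doc] for doc in docs]

# Count tables implied by the current assignments.
n_dk = np.zeros((len(docs), K))  # topic counts per document
n_kw = np.zeros((K, V))          # word counts per topic
n_k = np.zeros(K)                # total words per topic
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = qs[d][i]
        n_dk[d, k] += 1
        n_kw[k, w] += 1
        n_k[k] += 1

def resample_assignments():
    """One sweep: resample every word's topic from its full conditional."""
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k_old = qs[d][i]
            # Remove this token's contribution from the counts.
            n_dk[d, k_old] -= 1
            n_kw[k_old, w] -= 1
            n_k[k_old] -= 1
            # p(q = k | everything else), up to a normalizing constant.
            p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            p /= p.sum()
            k_new = int(rng.choice(K, p=p))
            # Add the token back under its newly sampled topic.
            qs[d][i] = k_new
            n_dk[d, k_new] += 1
            n_kw[k_new, w] += 1
            n_k[k_new] += 1

for sweep in range(10):
    resample_assignments()

print(int(n_k.sum()))  # total token count is invariant: 8
```

Note the structure of the conditional: a document-side factor (how much document ''d'' likes topic ''k'') times a topic-side factor (how much topic ''k'' likes word ''w''), which is why the count tables must be decremented before sampling and incremented after.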