Iteration 0 of 100; Number of tokens in collocation = 0
Iteration 10 of 100; Number of tokens in collocation = 4736
Iteration 20 of 100; Number of tokens in collocation = 5767
Iteration 30 of 100; Number of tokens in collocation = 6279
Iteration 40 of 100; Number of tokens in collocation = 6514
Iteration 50 of 100; Number of tokens in collocation = 6539
Iteration 60 of 100; Number of tokens in collocation = 6680
Iteration 70 of 100; Number of tokens in collocation = 6714
Iteration 80 of 100; Number of tokens in collocation = 6626
Iteration 90 of 100; Number of tokens in collocation = 6607
Elapsed time is 21.063034 seconds.
Concatenating terms in stream...
Find all unique terms...
Find term indices (this might be slow)...
Writing collocation topics to file: topics_psychreview_col.txt

Post-process the vocabulary to include collocations as separate entries, then recalculate the word-topic distributions over the expanded vocabulary.
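The post-processing step above (concatenate collocation tokens in the stream, find the unique terms, index them, and recount word-topic assignments) can be sketched as follows. This is a minimal illustration, not the toolbox's own code: it assumes a token stream, per-token topic assignments, and a boolean flag marking tokens that form a collocation with their predecessor; all names are hypothetical.

```python
import numpy as np

def expand_vocab_with_collocations(tokens, topics, is_colloc, n_topics):
    """Merge collocation tokens into single terms, rebuild the vocabulary,
    and recount the word-topic matrix over the expanded vocabulary.

    tokens    : word strings in stream order
    topics    : topic assignment (int) for each token
    is_colloc : True if a token attaches to the previous token as a collocation
    """
    # 1. Concatenate terms in the stream: merge collocation runs into one term.
    terms, term_topics = [], []
    for tok, z, c in zip(tokens, topics, is_colloc):
        if c and terms:
            # e.g. "short" + "term" -> "short_term"
            terms[-1] = terms[-1] + "_" + tok
        else:
            terms.append(tok)
            term_topics.append(z)

    # 2. Find all unique terms: this is the expanded vocabulary.
    vocab = sorted(set(terms))

    # 3. Find term indices into the expanded vocabulary.
    index = {w: i for i, w in enumerate(vocab)}

    # 4. Recalculate word-topic counts with the expanded vocabulary.
    wp = np.zeros((len(vocab), n_topics), dtype=int)
    for term, z in zip(terms, term_topics):
        wp[index[term], z] += 1
    return vocab, wp
```

For example, the stream `["short", "term", "memory", "decay"]` with the second token flagged as a collocation yields the expanded vocabulary `["decay", "memory", "short_term"]`, and the counts for the merged pair are credited to the single term `short_term`.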