pangoling::example | Worked-out example: Surprisal from a causal (GPT) model as a cognitive processing bottleneck in reading | HTML | source
pangoling::intro-bert | Using a BERT model to get the predictability of words in their context | HTML | source
pangoling::intro-gpt2 | Using a GPT-2 transformer model to get word predictability | HTML | source
pangoling::troubleshooting | Troubleshooting the use of Python in R | HTML | source | R code