Just open-sourced the datasets used in my NeurIPS 2024 paper, "Understanding Transformers via N-Gram Statistics." The release includes the training data and associated n-gram data, so the research community can replicate and build upon my work measuring the extent to which LLM predictions can be described in terms of n-gram statistics.
Jay McClelland | Neural Networks: Artificial and Biological
Jay McClelland is a pioneer in the field of artificial intelligence, a cognitive psychologist, and a professor at Stanford University in the psychology, linguistics, and computer science departments. Together with David Rumelhart, Jay published the two-volume work Parallel Distributed Processing, which led to the flourishing of the connectionist approach to understanding cognition.
Interview on Machine Learning Street Talk
Had a great time chatting with fellow podcaster Tim Scarfe over at Machine Learning Street Talk about my recent paper, "Understanding Transformers via N-gram Statistics": https://www.youtube.com/watch?v=W485bz0_TdI