oxford-cs-deepnlp-2017 / lectures
Oxford Deep NLP 2017 course
## Preamble

This repository contains the lecture slides and course description for the Deep Natural Language Processing course offered in Hilary Term 2017 at the University of Oxford.

This is an advanced course on natural language processing. Automatically processing natural language inputs and producing language outputs is a key component of Artificial General Intelligence. The ambiguities and noise inherent in human communication render traditional symbolic AI techniques ineffective for representing and analysing language data. Recently statistical techniques based on neural networks have achieved a number of remarkable successes in natural language processing, leading to a great deal of commercial and academic interest in the field.

This is an applied course focussing on recent advances in analysing and generating speech and text using recurrent neural networks. We introduce the mathematical definitions of the relevant machine learning models and derive their associated optimisation algorithms. The course covers a range of applications of neural networks in NLP including analysing latent dimensions in text, transcribing speech to text, translating between languages, and answering questions. These topics are organised into three high-level themes, forming a progression from understanding the use of neural networks for sequential language modelling, to understanding their use as conditional language models for transduction tasks, and finally to approaches employing these techniques in combination with other mechanisms for advanced applications. Throughout the course the practical implementation of such models on CPU and GPU hardware is also discussed.

This course is organised by Phil Blunsom and delivered in partnership with the DeepMind Natural Language Research Group.
## Lecturers

* Phil Blunsom (Oxford University and DeepMind)
* Chris Dyer (Carnegie Mellon University and DeepMind)
* Edward Grefenstette (DeepMind)
* Karl Moritz Hermann (DeepMind)
* Andrew Senior (DeepMind)
* Wang Ling (DeepMind)
* Jeremy Appleyard (NVIDIA)

## TAs

* Yannis Assael
* Yishu Miao
* Brendan Shillingford
* Jan Buys

## Timetable

### Practicals

* Group 1 - Monday, 9:00-11:00 (Weeks 2-8), 60.05 Thom Building
* Group 2 - Friday, 16:00-18:00 (Weeks 2-8), Room 379

1. Practical 1: word2vec
2. Practical 2: text classification
3. Practical 3: recurrent neural networks for text classification and language modelling
4. Practical 4: open practical

### Lectures

Public lectures are held in Lecture Theatre 1 of the Maths Institute, on Tuesdays and Thursdays (except week 8), 16:00-18:00 (Hilary Term Weeks 1, 3-8).

## Lecture Materials

* **Lecture 1a - Introduction [Phil Blunsom]**

  This lecture introduces the course and motivates why it is interesting to study language processing using Deep Learning techniques.

  [[slides]](Lecture%201a%20-%20Introduction.pdf) [[video]](http://media.podcasts.ox.ac.uk/comlab/deep_learning_NLP/2017-01_deep_NLP_1a_intro.mp4)

* **Lecture 1b - Deep Neural Networks Are Our Friends [Wang Ling]**

  This lecture revises basic machine learning concepts that students should know before embarking on this course.

  [[slides]](Lecture%201b%20-%20Deep%20Neural%20Networks%20Are%20Our%20Friends.pdf) [[video]](http://media.podcasts.ox.ac.uk/comlab/deep_learning_NLP/2017-01_deep_NLP_1b_friends.mp4)

* **Lecture 2a - Word Level Semantics [Ed Grefenstette]**

  Words are the core meaning-bearing units in language. Representing and learning the meanings of words is a fundamental task in NLP, and in this lecture the concept of a word embedding is introduced as a practical and scalable solution.

  [[slides]](Lecture%202a-%20Word%20Level%20Semantics.pdf) [[video]](http://media.podcasts.ox.ac.uk/comlab/deep_learning_NLP/2017-01_deep_NLP_2a_lexical_semantics.mp4)

  Reading

  Embeddings Basics

  * Firth, John R.
"A synopsis of linguistic theory, 1930-1955." (1957): 1-32.
  * Curran, James Richard. "From distributional to semantic similarity." (2004).
  * Collobert, Ronan, et al. "Natural language processing (almost) from scratch." Journal of Machine Learning Research 12 (2011): 2493-2537.
  * Mikolov, Tomas, et al. "Distributed representations of words and phrases and their compositionality." Advances in Neural Information Processing Systems. 2013.

  Datasets and Visualisation

  * Finkelstein, Lev, et al. "Placing search in context: The concept revisited." Proceedings of the 10th International Conference on World Wide Web. ACM, 2001.
  * Hill, Felix, Roi Reichart, and Anna Korhonen. "SimLex-999: Evaluating semantic models with (genuine) similarity estimation." Computational Linguistics (2016).
  * Maaten, Laurens van der, and Geoffrey Hinton. "Visualizing data using t-SNE." Journal of Machine Learning Research 9 (2008): 2579-2605.

  Blog posts

  * Deep Learning, NLP, and Representations, Christopher Olah.
  * Visualizing Top Tweeps with t-SNE, in Javascript, Andrej Karpathy.

  Further Reading

  * Hermann, Karl Moritz, and Phil Blunsom. "Multilingual models for compositional distributed semantics." arXiv preprint arXiv:1404.4641 (2014).
  * Levy, Omer, and Yoav Goldberg. "Neural word embedding as implicit matrix factorization." Advances in Neural Information Processing Systems. 2014.
  * Levy, Omer, Yoav Goldberg, and Ido Dagan. "Improving distributional similarity with lessons learned from word embeddings." Transactions of the Association for Computational Linguistics 3 (2015): 211-225.
  * Ling, Wang, et al. "Two/Too Simple Adaptations of Word2Vec for Syntax Problems." HLT-NAACL. 2015.

* **Lecture 2b - Overview of the Practicals [Chris Dyer]**

  This lecture motivates the practical segment of the course.
  [[slides]](Lecture%202b%20-%20Overview%20of%20the%20Practicals.pdf) [[video]](http://media.podcasts.ox.ac.uk/comlab/deep_learning_NLP/2017-01_deep_NLP_2b_practicals.mp4)

* **Lecture 3 - Language Modelling and RNNs Part 1 [Phil Blunsom]**

  Language modelling is an important task of great practical use in many NLP applications. This lecture introduces language modelling, including traditional n-gram…
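The traditional n-gram language models that Lecture 3 starts from can be sketched in a few lines. The snippet below is an illustrative sketch, not course code: the tiny corpus, the `train_bigram_lm` and `sentence_prob` helpers, and the `<s>`/`</s>` boundary markers are all assumptions chosen for brevity, and no smoothing is applied (unseen bigrams get probability zero).

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Estimate bigram probabilities P(w_i | w_{i-1}) by maximum likelihood."""
    bigrams, unigrams = Counter(), Counter()
    for words in sentences:
        tokens = ["<s>"] + words + ["</s>"]      # sentence boundary markers
        unigrams.update(tokens[:-1])             # history counts
        bigrams.update(zip(tokens, tokens[1:]))  # (history, word) counts
    return {(h, w): c / unigrams[h] for (h, w), c in bigrams.items()}

def sentence_prob(lm, words):
    """Sentence probability as a product of bigram probabilities."""
    tokens = ["<s>"] + words + ["</s>"]
    p = 1.0
    for h, w in zip(tokens, tokens[1:]):
        p *= lm.get((h, w), 0.0)                 # unseen bigram -> 0 (no smoothing)
    return p

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
lm = train_bigram_lm(corpus)
print(lm[("<s>", "the")])                        # 1.0: both sentences start with "the"
print(lm[("the", "cat")])                        # 0.5: "the" is followed by "cat" half the time
print(sentence_prob(lm, ["the", "cat", "sat"]))  # 0.5
```

The RNN language models the lecture builds towards replace these count-based conditional tables with a learned hidden state, which removes the fixed-length history limitation that makes n-gram models struggle with long-range dependencies.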