Lecturer(s)

- Sido Jakub, Ing.
- Konopík Miloslav, Ing. Ph.D.
- Pražák Ondřej, Ing.
Course content

1. Revision: multi-layer perceptron and backpropagation.
2. Language models and Word2Vec.
3. Convolutional neural networks.
4. Recurrent neural networks.
5. LSTM, GRU, tagging.
6. Encoder-decoder architecture, machine translation.
7. Attention principle.
8. Transformer architecture.
9. BERT and similar models.
10. Fine-tuning and pre-trained model application.
11. Generative models.
12. Adversarial training in NLP.
13. Deep learning frameworks for text.
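The attention and Transformer topics center on one core computation, softmax(QKᵀ/√d_k)·V. As a rough orientation for what the lectures cover, here is a minimal NumPy sketch of single-head, unbatched scaled dot-product attention (an illustration only, not taken from the official course materials):

```python
# Minimal sketch of scaled dot-product attention (illustrative, not course code).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value arrays."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_queries, n_keys)
    scores -= scores.max(axis=-1, keepdims=True)    # subtract max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights

# Self-attention over three toy token vectors of dimension four.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
```

Masking, multiple heads, and batching — all needed for a real Transformer — are deliberately omitted here.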
Learning activities and teaching methods
Lecture supplemented with a discussion, Lecture with practical applications, E-learning, Discussion, Multimedia supported teaching, Students' portfolio, One-to-One tutorial, Individual study, Students' self-study, Lecture, Practicum
- Practical training: 26 hours per semester
- Contact hours: 26 hours per semester
- Individual project (40): 60 hours per semester
- Preparation for formative assessments (2-20): 10 hours per semester
- Preparation for an examination (30-60): 40 hours per semester
Prerequisites

Knowledge
- having an overview of basic methods of probability and statistics
- solving computer tasks at the level of a Bachelor's degree in Computer Science or a similar field

Skills
- decompose tasks into simpler units
- solve linear algebra problems
- implement more advanced programs in an imperative programming language

Competences
- N/A
Learning outcomes

Knowledge
- be familiar with multilingual text processing
- be familiar with basic text summarization methods
- be familiar with evaluating the success of natural language processing methods
- describe the principles of natural language processing and text data retrieval

Skills
- train language models
- create algorithms for sentence parsing
- create algorithms for automatic evaluation of semantic similarity of words, sentences, and documents
- create named entity recognition algorithms
- create machine learning algorithms
- apply machine learning to natural language processing

Competences
- N/A
Teaching methods

Knowledge
- Self-study of literature
- Practicum
- Lecture supplemented with a discussion
- Interactive lecture
- Discussion
- One-to-One tutorial
- E-learning
- Multimedia supported teaching

Skills
- Individual study

Competences
- Interactive lecture
Assessment methods

Knowledge
- Test
- Oral exam

Skills
- Seminar work

Competences
- Oral exam
Recommended literature

- Aurélien Géron. Hands-On Machine Learning with Scikit-Learn and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. O'Reilly Media, 2017. ISBN 1491962291.
- Delip Rao, Brian McMahan. Natural Language Processing with PyTorch: Build Intelligent Language Applications Using Deep Learning. O'Reilly Media, 2019. ISBN 1491978236.
- François Chollet. Deep Learning with Python. Manning Publications, 2017. ISBN 9781617294433.
- Christopher D. Manning, Hinrich Schütze. Foundations of Statistical Natural Language Processing. MIT Press, Cambridge, MA, 1999.
- Ian Goodfellow, Yoshua Bengio, Aaron Courville. Deep Learning (Adaptive Computation and Machine Learning series). MIT Press, 2016. ISBN 9780262035613.
- Jacob Eisenstein. Introduction to Natural Language Processing (Adaptive Computation and Machine Learning series). MIT Press, 2019. ISBN 0262042843.
- Daniel Jurafsky, James H. Martin. Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. 2nd ed. Pearson/Prentice Hall, 2009. ISBN 978-0-13-504196-3.