"Advanced Text Processing (Natural Language Processing)" course
starts June 15 / price: 44,900 rubles
Book at the minimum price
By clicking the button, I consent to the processing of personal data and agree to the privacy policy and the offer agreement.
Course Objectives
1
Remember the basics and features of working with neural networks in the field of NLP
If a few months have passed since you last studied NLP and you have been working in another area of neural networks, the first lessons will quickly refresh your knowledge.
2
Gain practical experience in solving NLP tasks
Each session comes with a mini-project, which you will complete under your tutor's guidance.
3
Get an in-depth understanding of neural network design and operation principles in NLP tasks
If, in the future, you encounter an unusual problem, you will be able to solve it by combining the practical knowledge you already have.
Who this course is for
This course is for those who are already familiar with the structure and principles of neural networks and want to specialize in working with natural language.
What you will learn
After the course you will be able to solve a wide range of tasks, from text classification to named entity recognition (NER) and sentence translation.
Each lesson covers advanced theory, so that you can freely change the architecture and strategy when your task differs significantly from the typical one.
The course also provides practical advice for tasks that are too hard to figure out on your own without significant hands-on experience.
Training Program
Contents: Introduction to the course. Review of the basics of recurrent networks: the motivation for their creation, their advantages over fully connected networks, and the basic algorithm of operation. Working with transfer learning in recurrent networks. Analysis of popular pre-trained models for text analysis. Mini-project: detecting comparisons of a car with other brands in reviews.
Contents: Introduction to vector representations of words. Analysis of the mathematical properties of vectors that can be exploited when working with embeddings. Derivation of a cosine distance loss. Exploring word2vec, fastText, and GloVe. Comparison of language models and pre-trained embeddings in transfer learning. Mini-project: extracting hashtags from comments.
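The cosine distance mentioned above is the quantity that compares word embeddings by direction rather than magnitude. A minimal sketch (illustrative only, with made-up toy vectors, not course material):

```python
# Sketch: cosine distance between two hypothetical word vectors,
# the quantity behind a cosine-distance loss.
import math

def cosine_distance(u, v):
    """1 - cos(u, v): 0 for identical directions, up to 2 for opposite ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

# Toy 3-dimensional "embeddings", for demonstration only.
king = [0.9, 0.1, 0.4]
queen = [0.85, 0.15, 0.45]
car = [0.1, 0.9, 0.2]

print(cosine_distance(king, queen))  # small: similar words
print(cosine_distance(king, car))    # larger: unrelated words
```

Real embeddings from word2vec, fastText, or GloVe typically have 100 to 300 dimensions, but the comparison works exactly the same way.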
Contents: Continued work with language models. Building your own topic model with Gensim. Comparing neural network prediction accuracy with embeddings pre-trained on different topics. Comparison of methods for working with texts written in languages other than English. Morphological text analysis with pymorphy2 for text pre-processing.
Contents: The structure and purpose of popular recurrent layers: LSTM (and its modifications) and GRU. Comparison of the operating modes of recurrent neural networks. Study of model operation at the tensor level. The goal of the lesson is to understand how data moves through a recurrent neural network in Keras: the required input and output dimensions in different operating modes, and what each axis of the input and output tensors represents. This theoretical part will later speed up building a project with your own dataset and architecture many times over. Introduction to the Named Entity Recognition task. Mini-project: applying NER to world news.
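To make the tensor-level view concrete: a sketch of one step of a minimal Elman-style recurrent cell in pure Python (a simplification; LSTM and GRU add gates on top of this idea). In Keras, recurrent layers expect input of shape (batch, timesteps, features) and return (batch, units), or (batch, timesteps, units) with return_sequences=True.

```python
# Sketch, not course code: one step of a minimal recurrent cell,
# h_t = tanh(W_x @ x_t + W_h @ h_prev + b), with toy hand-picked weights.
import math

def rnn_step(x_t, h_prev, W_x, W_h, b):
    """Shapes: x_t is (features,), h_prev and the result are (units,)."""
    units = len(h_prev)
    h_t = []
    for i in range(units):
        s = b[i]
        s += sum(W_x[i][j] * x_t[j] for j in range(len(x_t)))
        s += sum(W_h[i][j] * h_prev[j] for j in range(units))
        h_t.append(math.tanh(s))
    return h_t

# Toy dimensions: 2 input features per timestep, 3 hidden units.
W_x = [[0.1, 0.2], [0.0, -0.1], [0.3, 0.1]]
W_h = [[0.05] * 3 for _ in range(3)]
b = [0.0, 0.0, 0.0]

h = [0.0, 0.0, 0.0]                               # initial hidden state
sequence = [[1.0, 0.5], [0.2, -0.3], [0.0, 1.0]]  # (timesteps=3, features=2)
for x_t in sequence:                              # unroll over time
    h = rnn_step(x_t, h, W_x, W_h, b)

print(len(h))  # 3 — the final hidden state has shape (units,)
```

Collecting `h` at every step instead of only the last one corresponds to `return_sequences=True`.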
Contents: Exploring a second approach to text analysis, convolutional networks: their structure and why this approach works. Comparison of CNN- and RNN-based models. Adapting convolutional neural networks originally designed for computer vision to NLP. Object detection in texts. Mini-project: object detection in car reviews.
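The core idea of applying convolutions to text can be shown in a few lines: a kernel slides over the token sequence and produces one feature per window. A hypothetical sketch with toy embeddings (real models use many kernels, padding, and pooling):

```python
# Sketch: a "valid" 1D convolution over a sequence of token embeddings,
# the building block of CNN-based text models.
def conv1d(sequence, kernel):
    """Output length = len(sequence) - len(kernel) + 1.
    Each output is the dot product of the kernel with one window of embeddings."""
    k = len(kernel)
    out = []
    for start in range(len(sequence) - k + 1):
        window = sequence[start:start + k]
        out.append(sum(
            w * x
            for kern_vec, emb in zip(kernel, window)
            for w, x in zip(kern_vec, emb)
        ))
    return out

# 5 tokens with 2-dimensional embeddings; the kernel spans 3 tokens.
embeddings = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0], [0.2, 0.8], [0.9, 0.1]]
kernel = [[1.0, -1.0], [0.5, 0.5], [-1.0, 1.0]]

features = conv1d(embeddings, kernel)
print(len(features))  # 3 positions = 5 - 3 + 1
```

Unlike a recurrent layer, every window is processed independently, which is why CNN text models parallelize so well.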
Contents: Exploring sequence-to-sequence transformation. The components of a seq2seq model: encoder and decoder. Analysis of seq2seq tasks and ready-made implementations. Mini-project: building your own question-answering chatbot.
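The encoder/decoder split can be illustrated without any neural network at all: the encoder compresses the input into a state, and the decoder emits tokens one at a time until an end-of-sequence marker. Everything below is a hypothetical toy; in a real seq2seq model both parts are trained networks.

```python
# Toy sketch of the seq2seq decoding loop (illustrative only).
def encode(tokens):
    # Stand-in "encoder": just a hashable summary of the input.
    return tuple(tokens)

# Stand-in "decoder": a lookup table mapping
# (encoder state, previous token) -> next token.
DECODER = {
    (("hi",), "<s>"): "hello",
    (("hi",), "hello"): "</s>",
}

def decode_greedy(state, max_len=10):
    """Emit tokens greedily until "</s>" or max_len is reached."""
    output, prev = [], "<s>"
    for _ in range(max_len):
        nxt = DECODER.get((state, prev), "</s>")
        if nxt == "</s>":
            break
        output.append(nxt)
        prev = nxt
    return output

print(decode_greedy(encode(["hi"])))  # ['hello']
```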
Contents: The boundary between images and texts is finally erased. Extracting meaning from images and text, with subsequent transformation: searching for an image by text, generating an image with a generative network conditioned on a text prompt, and generating a text description of an image.
Contents: Studying the concept of "attention" in neural networks. Analysis of how attention works in seq2seq tasks. Comparing the quality of neural networks with and without attention. Mini-project: building your own translator.
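At its core, attention scores each encoder state against the current decoder state, normalizes the scores with a softmax, and mixes the encoder states into a context vector. A minimal sketch assuming dot-product scoring (one common variant, not necessarily the one used in the course):

```python
# Sketch of dot-product attention with toy vectors.
import math

def attention(query, keys):
    """query: decoder state (d,); keys: encoder states, list of (d,)."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]  # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weighted sum of encoder states.
    context = [
        sum(w * key[d] for w, key in zip(weights, keys))
        for d in range(len(keys[0]))
    ]
    return weights, context

query = [1.0, 0.0]
keys = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]  # toy encoder hidden states
weights, context = attention(query, keys)
print(weights)  # weights sum to 1; the first key scores highest
```

The weights show which input positions the decoder "looks at", which is also why attention models are easier to inspect than plain seq2seq.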
Contents: Exploring an advanced attention-based architecture, the transformer. Building a basic version of the transformer. Examination of the structure and capabilities of state-of-the-art transformer implementations: BERT, GPT-2, T5. Mini-project: gist extraction: writing an abstract for a scientific article.
Contents: Searching for sentences with a required meaning in a text using recurrent neural networks. Mini-project: highlighting sentences in news articles that refer to a predetermined person.
Contents: Studying a scientific article and implementing its neural network from scratch, with a deep analysis of the theory. Solving the text generation problem with a GAN. Generating text that passes the Turing test. Analysis of a CNN+RNN ensemble for complex text processing/generation problems. Customizing Keras with low-level TensorFlow 2.0 code.
Training Format
11 webinars
Weekly 2+ hour webinars
11 practical assignments
After each session, you will receive a practical assignment
3 months of tutor support
You will be able to ask any questions about the course and the assignments in the common chat.
Apply for the course
Course teacher
Konstantin Slepov
Education
ITMO University, Faculty of Infocommunication Technologies
Data science and neural networks in Python course, University of Artificial Intelligence
Deep learning courses, University of San Francisco
Over 4 years of programming experience
Over 2 years of experience in AI
Areas of interest: machine vision, text processing, generative models, designing new complex types of neural networks, and introducing modifications to known architectures to improve their quality.
My strength is teaching topics at all levels: from best practices to the principles of neural network design and operation at a low level. I accompany explanations of complex low-level concepts with intuitive examples, so that anyone who is at least a little familiar with the basic structure of neural networks can grasp them.
Konstantin Slepov
Course author and teacher
Certificate
We issue a certificate of course completion
"Basic" plan 44.900 rubles
11 webinars
11 practical seminars
11 tasks
3 months of tutor support
Certificate
Pay 44,900 rubles
Our manager will answer any questions about the course