Course objectives
The main objective of the course is to understand how modern neural networks work at the algorithmic level.
1. Get acquainted with the mathematical foundations of training modern neural networks
2. Get to know the formal models of neurons, layers, and networks
3. Understand the principles of training modern neural networks
What you are going to learn
Understand how a network operates at the lowest level of information processing
Select neural network model parameters with the specifics of the task in mind
Training format
Recorded sessions
Webinar recordings will be available for viewing at any time
Closed training platform
Within 24 hours of payment, you will have access to the GetCourse closed training platform with class notes and course materials.
Mentor's assistance
You will have a chat room where you can ask your mentor any questions you may have
Training Program
You can see a brief training program below.
Click on a class title to see its detailed description.
Physiological model of a neuron
McCulloch-Pitts neuron implementation: weighted sum + activation function
Calculating the output of a linear neuron
Training a linear neuron with the delta rule on a linearly separable example set
Implementation of a neural network layer
Implementation of a multilayer network
Combining the decisions of linear neurons (a minimal sketch follows this list)
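To give a flavor of this module, here is a minimal NumPy sketch of a McCulloch-Pitts-style neuron trained with the delta rule; the step activation, learning rate, and toy AND dataset are illustrative assumptions rather than the course's exact code.

```python
import numpy as np

def mp_neuron(x, w, b):
    """McCulloch-Pitts neuron: weighted sum followed by a step activation."""
    return 1 if np.dot(w, x) + b >= 0 else 0

# Delta-rule training on a linearly separable toy problem (logical AND).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b, lr = np.zeros(2), 0.0, 0.1

for epoch in range(20):
    for xi, ti in zip(X, y):
        error = ti - mp_neuron(xi, w, b)  # delta = target - output
        w += lr * error * xi              # shift weights along the input
        b += lr * error

print([mp_neuron(xi, w, b) for xi in X])  # converges to [0, 0, 0, 1]
```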
Regression goal
Definition of the regression model
Error function
The method of least squares
Finding regression parameters with gradient descent
Stochastic gradient descent
Training a linear neural network and its limitations (see the gradient-descent sketch below)
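For illustration, a minimal sketch of least squares fitted by gradient descent; the synthetic data (y = 2x + 1 plus noise), learning rate, and step count are illustrative assumptions.

```python
import numpy as np

# Toy data: y = 2x + 1 with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 2 * x + 1 + 0.1 * rng.standard_normal(100)

w, b, lr = 0.0, 0.0, 0.1
for step in range(500):
    pred = w * x + b
    # Least-squares error E = mean((pred - y)^2); its gradients drive the update.
    grad_w = 2 * np.mean((pred - y) * x)
    grad_b = 2 * np.mean(pred - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # close to the true 2 and 1
```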
Nonlinear neuron
The problems of training a nonlinear neuron
Why do we need the backpropagation method
Implementation of the backpropagation method
Training a nonlinear MLP
Problems of training a nonlinear MLP: overfitting, underfitting, high computational complexity, vanishing gradients
The stochastic gradient descent method
Loss functions (types of loss function and their implementation)
L1 and L2 regularization
Implementation of the Adam method
Implementation of the NAdam method
Second-order methods (a backpropagation sketch follows this list)
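Below is a compact NumPy sketch of backpropagation for a two-layer MLP on XOR; the sigmoid activations, cross-entropy loss, network size, and hyperparameters are illustrative assumptions, not the course's exact code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR is not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(42)
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)
lr = 0.5

for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: the chain rule applied layer by layer.
    d_out = out - y                     # dE/dz for sigmoid + cross-entropy
    d_h = (d_out @ W2.T) * h * (1 - h)  # error propagated to the hidden layer
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0]
```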
Model of linear separation of objects in the feature space
Classifier error function
Linear classifier
Support vector machines
Logistic regression
KNN
Classifier based on non-linear MLP
Bias-variance tradeoff
Ensembles of classifiers
Evaluation of classifiers: metrics (accuracy, precision, recall, F1, ROC) - see the sketch after this list
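As a preview of the evaluation topic, a minimal scikit-learn sketch that trains a logistic-regression classifier and computes the listed metrics; the synthetic dataset and split are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.model_selection import train_test_split

# Synthetic two-class data in place of a real task.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("accuracy: ", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall:   ", recall_score(y_te, pred))
print("F1:       ", f1_score(y_te, pred))
```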
Why dimensionality reduction is needed
Evaluation of correlation of features with each other and with the target property
Feature selection based on correlation and on model analysis
Feature space transformation: PCA (a short sketch follows this list)
Feature visualization: t-SNE
What happens inside a neural network: hidden layers as feature detectors; network visualization
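A short PCA sketch with scikit-learn; the Iris dataset and the choice of two components are illustrative assumptions.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data                  # 150 samples, 4 features each
pca = PCA(n_components=2).fit(X)
X2 = pca.transform(X)                 # projection onto 2 principal axes
print(X2.shape)                       # (150, 2)
print(pca.explained_variance_ratio_)  # variance captured by each axis
```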
Tensors: definition and meaning
Why a convolutional network works: the architecture as tensor processing and composition of convolutions
Convolutional layers
Implementation of the convolutional layer and visualization of its work
Convolutional layer training
Pooling layers: visualization
Backpropagation through the pooling layer
Normalization layers: implementation and visualization
Reverse layers: how they work
Why transfer learning is possible
Degradation of deep networks
Residual models: implementation and training
Basic principles of building a convolutional network: examples and implementation (see the Keras sketch after this list)
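Here is a minimal Keras sketch that combines the layer types above (convolution, normalization, pooling) into a small network; the 28x28 input shape, layer sizes, and optimizer are illustrative assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A tiny convolutional network built from the layers discussed in this module.
model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, 3, padding="same", activation="relu"),  # convolutional layer
    layers.BatchNormalization(),                              # normalization layer
    layers.MaxPooling2D(2),                                   # pooling layer
    layers.Conv2D(32, 3, padding="same", activation="relu"),
    layers.MaxPooling2D(2),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```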
The model of recurrent connections in a neural network and their impact on the output
Hopfield network: implementation and operation
Recurrent network training: backpropagation through time (BPTT)
Long short-term memory (LSTM) networks
Gated recurrent unit (GRU) networks (a Keras sketch follows this list)
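A minimal Keras sketch of an LSTM sequence classifier; the sequence shape, layer sizes, and loss are illustrative assumptions, and swapping layers.LSTM for layers.GRU gives the GRU variant.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Binary classifier over 50-step sequences of 8 features.
model = keras.Sequential([
    layers.Input(shape=(50, 8)),
    layers.LSTM(32),  # gated memory cells help with long-range dependencies
    layers.Dense(1, activation="sigmoid"),
])
# model.fit unrolls the recurrence in time, i.e. trains with BPTT.
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```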
Unsupervised learning
Implementation of unsupervised learning (k-means; a sketch follows this module's list)
Self-organizing map networks
Training as distribution reconstruction (with a brief look at the EM algorithm)
Restricted Boltzmann machines (RBM): architecture and training
Visualization of probabilistic models
Problems of probabilistic models
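To give a flavor of the unsupervised module, a plain NumPy k-means sketch; the two Gaussian blobs and the fixed iteration count are illustrative assumptions, and empty clusters are not handled in this minimal version.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means: alternate assignment and centroid-update steps."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest center.
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Update step: each center moves to the mean of its points.
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Two well-separated blobs as toy data.
rng = np.random.default_rng(1)
X = np.vstack([rng.standard_normal((50, 2)), rng.standard_normal((50, 2)) + 5])
labels, centers = kmeans(X, 2)
print(centers.round(1))  # roughly [0, 0] and [5, 5]
```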
Apply for the course
Course teacher
Maria Korlyakova
Education: Bauman Moscow State Technical University, followed by postgraduate studies at MPEI. Candidate of Technical Sciences, Associate Professor. Over 30 publications in the last 5 years (Scopus, RINC, EAC, conferences)
Area of scientific interest: neural networks for image processing (tracking/segmentation/detectors), task complexity analysis and evaluation of neural network parameters (Machine Learning, Deep Learning, Computer Vision, Object Detection).
Main technology stack: Python (scikit-learn, Keras), OpenCV, MATLAB (Neural Network Toolbox, Computer Vision Toolbox, Statistics and Machine Learning Toolbox). I work on vision-system projects (tracking/segmentation/detectors) based on neural networks and on neural network analysis of time series. At BMSTU I supervise graduation projects (master's and bachelor's) and student research in Deep Learning and Computer Vision. My students take part in programs run by Kaggle, Sberbank, Polytech, and others.
Certificate
We issue a certificate of course completion
Photos from our face-to-face classes
Our manager will answer any questions about the course