
Machine Learning - Level 2

SEE SCHEDULE

Duration: 5 Days

Method: Hands-on, instructor-led

Price: $2975.00

Course Code: ML1010


Audience

Developers and Managers with a programming background who are interested in Machine Learning for Data Analysis

Description

This course covers modern artificial intelligence algorithms based on deep neural networks. It begins with a review of the necessary computer vision, neural network, and statistics background, then provides in-depth coverage of deep learning architectures such as deep convolutional networks, sparse autoencoders, recurrent neural networks, belief networks, and reinforcement learning techniques. Programming projects on the different deep networks are carried out with state-of-the-art libraries: Theano, Google's TensorFlow, and Microsoft's CNTK. Applications of deep learning to computer vision, text classification, speech recognition, and optimization problems are presented, and some recent research papers in the field are also explained in the course.

Objectives

Upon successful completion of this course, the student will be able to:

  • Understand the mathematical and statistical background in Machine Learning
  • Understand the fundamental concepts in deep neural networks
  • Program the different deep architectures using modern ML libraries
  • Understand the state of the art in ML and the future directions in Artificial Intelligence

Prerequisites

Good programming knowledge of Python, Java, or C#, and a prior course in Machine Learning.

Topics

  • I. Review of Applied Mathematics and Statistics for Machine Learning
    • Matrices and Tensors
    • Linear dependence and Span
    • Eigendecomposition
    • Principal Component Analysis
    • Probability distributions, expectation, variance and covariance
    • Bayes' rule, joint PDFs
  • II. Review of different classifiers, introduction to Neural Networks and their Training
    • k-Nearest Neighbors (KNN)
    • Support Vector Machines
    • Boosting and Bagging
    • Random Forests
    • Feedforward Neural Networks
    • Different activation functions for neural networks
    • Softmax function
    • Gradient descent algorithm
    • Backpropagation algorithm for training a neural network
    • Self-Organizing Map (SOM) networks
  • III. Regularization in Neural Networks
    • Avoiding overfitting via regularization constraints, sparsity, and dropout in neural networks
    • Cross validation and ROC concepts
  • IV. Introduction to Deep Neural Networks
    • Overview of deep neural architectures
    • Deep sparse autoencoders for unsupervised learning and dimensionality reduction; review of convolution and its application in deep convolutional networks (CNNs)
  • V. Deep Convolutional Neural Networks (CNNs)
    • Deep CNN architecture, explanation of the LeNet-5 deep CNN
    • Creating a deep CNN architecture for MNIST character recognition and its comparison to a regular neural network
    • Creating a deep CNN architecture for image recognition on the CIFAR-10 dataset
    • Residual deep CNN architecture and its comparison to the deep CNN architecture
  • VI. Deep Recurrent Neural Networks (RNNs)
    • Deep RNN architecture and its training
    • Long Short-Term Memory (LSTM) RNN and its application to Natural Language Processing
    • Optimization for long term dependencies
    • Application of deep RNNs for speech processing
  • VII. Deep Reinforcement Learning
    • Markov decision processes and dynamic programming
    • Deep Q-Networks (DQN) for reinforcement learning
    • Double DQN and Dueling DQN architectures
  • VIII. Recent Research Papers in Deep Learning
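As a small taste of the material in Topic II, here is a minimal, self-contained sketch (plain Python, no course library assumed; the toy loss is purely illustrative) of the softmax function and a single gradient-descent step:

```python
import math

def softmax(logits):
    """Numerically stable softmax: shift by the max before exponentiating."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def gradient_step(w, lr=0.1):
    """One gradient-descent step on the toy loss f(w) = (w - 3)^2."""
    grad = 2.0 * (w - 3.0)   # df/dw
    return w - lr * grad

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])        # [0.659, 0.242, 0.099]
print(round(gradient_step(0.0), 6))        # 0.6
```

In the course these primitives appear inside the backpropagation algorithm, where the softmax output feeds a loss whose gradient is propagated back through the network layers.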
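The Deep Q-Networks of Topic VII build on the tabular Q-learning update rule; the sketch below shows that update on a hypothetical two-state, two-action problem (the function name and the numbers are illustrative, not from the course materials). A DQN replaces the table with a neural network approximating Q.

```python
# Tabular Q-learning update:
#   Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))

def q_update(Q, s, a, r, s_next, alpha=0.5, gamma=0.9):
    """Apply one Q-learning update in place and return the new Q(s, a)."""
    target = r + gamma * max(Q[s_next])    # bootstrapped return estimate
    Q[s][a] += alpha * (target - Q[s][a])
    return Q[s][a]

# Toy problem: 2 states x 2 actions, all values initialized to zero.
Q = [[0.0, 0.0], [0.0, 0.0]]
q_update(Q, s=0, a=1, r=1.0, s_next=1)     # reward of 1 for action 1 in state 0
print(Q[0][1])                             # 0.5
```

The Double DQN and Dueling DQN variants listed above modify how this target term is computed and how the Q-values are parameterized, not the update rule itself.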