ECE 5973-961/983: Artificial Neural Networks and Applications

Artificial neural networks were introduced in the 1950s. In the last decade, however, there has been a strong resurgence of neural networks as processing techniques, and they have been applied to many real-world problems. This has led to numerous breakthroughs in image, video, and natural language processing applications.

This course is intended to be quite hands-on and should provide students with enough detail to quickly apply these techniques to their own research. In particular, applications relating to computer vision and natural language processing will be discussed. There may be some math, but we will not spend too much time on proofs. Instead, we may try to go through (not exhaustively) some of the free libraries such as Caffe and Torch, and you are definitely encouraged to explore and leverage them for your course project.

Textbook

  • Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press.

It is not required but is a very good reference.

Piazza

  • Please sign up for Piazza here

Reference

Some Deep Learning Toolboxes and Libraries

  • TensorFlow: From Google; probably the most popular package. Not particularly optimized for a single PC

  • Caffe2: From Facebook

  • Caffe: From Berkeley

  • Torch 7: From NYU, and used by Facebook/Twitter

  • PyTorch: The Python version of Torch

  • Theano: From Bengio's group in Montreal

  • Keras: High-level layer on top of Theano/TensorFlow (see the minimal sketch after this list)

  • Lasagne: High-level layer on top of Theano

  • MatConvNet: From Oxford; somewhat restricted

  • MXNet: From Amazon

  • Neon: From Intel

  • Deeplearning4j
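
To illustrate what a "high-level layer" buys you, here is a minimal, hypothetical Keras sketch that defines and compiles a small classifier in a few lines. The layer sizes and input shape are my own illustrative choices, and the same code runs on either the Theano or TensorFlow backend.

    # Minimal Keras sketch (illustrative sizes; backend may be Theano or TensorFlow)
    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential()
    model.add(Dense(128, activation='relu', input_dim=784))  # hidden layer on flattened 28x28 inputs
    model.add(Dense(10, activation='softmax'))               # 10-way class scores

    # Cross-entropy loss trained with stochastic gradient descent
    model.compile(optimizer='sgd', loss='categorical_crossentropy', metrics=['accuracy'])
    model.summary()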

Office Hours

There are no “regular” office hours, but you are welcome to catch me anytime or to contact me by email.

Course Syllabus (Tentative)

  • Overview of machine learning

  • History of artificial neural networks

  • Perceptrons

  • Backpropagation algorithms

  • Regularization and dropout

  • Weight initialization

  • Optimization methods

  • Convolutional neural networks (CNN)

  • R-CNN, Faster R-CNN

  • Weight visualization, deep visualization, t-SNE, DeepDream

  • Recurrent neural networks

  • LSTM networks

  • Restricted Boltzmann machines

  • Autoencoders

  • Deep belief networks

Projects

The video presentation is due on May 8. Please read this for guidelines. A written report is not mandatory but is worth up to 20% extra credit.

Grading

"Activities": 30%. Quizzes, paper review, presentations, etc.

Homework: 30%. Programming assignments.

Final Project: 40%.

Final grade (a minimal sketch of the computation follows the cutoffs below):

  • A: above 80%

  • B: above 60% but not more than 80%

  • C: above 40% but not more than 60%

  • D: above 20% but not more than 40%

  • F: not more than 20%
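
To make the weights and cutoffs concrete, here is a minimal Python sketch (with made-up component scores) of how a final percentage could be assembled and mapped to a letter grade. It is only an illustration of the arithmetic above, not the official grading script.

    def final_percentage(activities, homework, project):
        """Weighted total: 30% activities, 30% homework, 40% final project (all on a 0-100 scale)."""
        return 0.30 * activities + 0.30 * homework + 0.40 * project

    def letter_grade(total):
        """Map the weighted total to the cutoffs listed above."""
        if total > 80:
            return 'A'
        if total > 60:
            return 'B'
        if total > 40:
            return 'C'
        if total > 20:
            return 'D'
        return 'F'

    # Example with invented scores
    total = final_percentage(activities=85, homework=90, project=75)
    print(total, letter_grade(total))  # 82.5 A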

Prerequisites

Calculus (MATH 1914 or equivalent), linear algebra (MATH 3333 or equivalent), basic probability (MATH 4733 or equivalent), and intermediate programming skills (experience with Python/NumPy is preferred)

Note that we will “borrow” programming assignments from Stanford CS231n, so the ability to program in Python is expected. Python is not difficult if you are familiar with any other high-level general-purpose programming language such as C/C++/C#/Java/JavaScript/Perl/Matlab. If you don't know anything about Python, I would recommend trying out this app.
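
For a rough taste of the vectorized NumPy style these CS231n-derived assignments expect, here is a small sketch that computes softmax probabilities for a toy batch of scores without any explicit Python loops; the numbers and shapes are invented purely for illustration.

    import numpy as np

    # Toy batch: 4 examples, 3 class scores each (shapes are illustrative only)
    scores = np.array([[1.0, 2.0, 0.5],
                       [0.3, 0.2, 2.5],
                       [2.0, 1.0, 1.0],
                       [0.1, 0.1, 0.1]])

    # Numerically stable softmax, vectorized over the whole batch
    shifted = scores - scores.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)

    print(probs.sum(axis=1))  # each row sums to 1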

Late Policy

  • There will be a 5% deduction per late day for all submissions

  • The deduction saturates after 10 days, so you will still get half of your marks even if you are super late (see the sketch after this list)
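
As a concrete reading of the two rules above, here is a small Python sketch of how the deduction could be computed; the helper function is mine, not an official grading script.

    def late_penalty(days_late):
        """5% deduction per late day, saturating at 50% after 10 days."""
        return min(0.05 * days_late, 0.50)

    print(late_penalty(3))   # 0.15 -> lose 15% of the marks
    print(late_penalty(12))  # 0.5  -> still only lose half, even when very late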

Reasonable Accommodation Policy

Any student in this course who has a disability that may prevent the full demonstration of his or her abilities should contact me personally as soon as possible so we can discuss accommodations necessary to ensure full participation and facilitate your educational opportunities.

Should you need modifications or adjustments to your course requirements because of documented pregnancy-related or childbirth-related issues, please contact me as soon as possible to discuss. Generally, modifications will be made where medically necessary and similar in scope to accommodations based on temporary disability. Please see this for commonly asked questions.

Calendar

Date  Topics  Materials  Further reading/watching
1/16 Overview, AI, machine learning and its types, artificial neural networks and their history overview Andrew Ng: Artificial Intelligence is the New Electricity
1/18 Machine learning basics classification and regression
1/23 Linear regression, ridge regression, Lasso (video) regression notebook
1/25 Linear classification, logistic regression, softmax, SVM (video)
1/30 Neural networks, back-propagation (video) neural networks
2/1 Activation functions, weight initialization (video)
2/6 Dropout, batch normalization, data augmentation, optimizers (video) optimizer comparison notebook
2/8 BFGS, L-BFGS, babysitting the learning process (video)
2/13 Convolutional neural networks (screencast was not captured successfully, please check this instead) CNN
2/15 CNN Architectures (video), presentation and final project (video)
2/20 Class cancelled due to weather conditions
2/22 Class cancelled due to weather conditions
2/27 Torch (Hyeri's slides, Betsy's slides)
3/1 Tensorflow
3/6 Tensorflow
3/8 Matconvnet
3/13 Visualizing CNN layers (video) Visualizing CNN
3/15 CNNs and arts, fooling CNN, semantic segmentation, object detection, YOLO (video) CNN applications
3/20 Spring break
3/22 Spring break
3/27 Mxnet (slides), caffe (slides)
3/29 Keras (slides)
4/3 R-CNN (video)
4/5 Recurrent neural networks (RNNs), image captioning, long short-term memory (LSTM) (video) RNN
4/10 Generative models, PixelCNN, PixelRNN (video) Generative networks
4/12 Autoencoder, generative adversarial network (GAN) (video)
4/17 Variational autoencoder, neural machine translations, chatbots (video) Seq2seq models
4/19 Neural Turing machine (video) Neural Turing machine
4/24 Deep reinforcement learning (video) Deep reinforcement learning AlphaGo Zero