====CS501r, Fall 2016 - Deep Learning: Theory and Practice====

[[cs501r_f2016:desc|Here is the course description.]]

[[http://cs231n.github.io/python-numpy-tutorial/|Remember, this is a great tutorial on python / numpy!]]
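If you want a quick taste of what the tutorial linked above covers, here is a minimal numpy sketch (illustrative only, not part of the course materials) showing broadcasting, the feature the labs lean on most:

```python
import numpy as np

# Broadcasting applies an operation across axes without explicit loops.
X = np.arange(6).reshape(2, 3)   # 2x3 matrix [[0, 1, 2], [3, 4, 5]]
col_means = X.mean(axis=0)       # per-column means -> [1.5, 2.5, 3.5]
centered = X - col_means         # the (3,) row is broadcast over both rows

print(col_means)   # [1.5 2.5 3.5]
print(centered)    # [[-1.5 -1.5 -1.5] [ 1.5  1.5  1.5]]
```

The same pattern (centering data by subtracting a per-feature mean) comes up constantly when preparing inputs for the networks built in the labs below.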
  
[[cs501r_f2016:openlabtf|Some instructions for getting Tensorflow to run on the CS open labs]]
  
[[https://www.dropbox.com/sh/aox63ppfd14hf7b/AABGgv56Q98ikk5I8I4bNbO3a?dl=0|All of the slides are posted on Dropbox here]]
  
----
=== Labs ===
  
[[cs501r_f2016:lab_notes|General notes on ipython and seaborn]]
  
[[cs501r_f2016:lab1|Lab 1 - Anaconda and playground screenshot]]
  
[[cs501r_f2016:lab2|Lab 2 - Perceptron]]

[[cs501r_f2016:lab3|Lab 3 - Basic gradient descent]]

[[cs501r_f2016:lab4|Lab 4 - Automatic differentiation]]

[[cs501r_f2016:lab5|Lab 5 - Tensorflow image classifier]]

[[cs501r_f2016:lab5b|Lab 5b - Convolutional Tensorflow image classifier and Tensorboard]]

[[cs501r_f2016:lab6|Lab 6 - Feature zoo 1]]

[[cs501r_f2016:lab7|Lab 7 - Generative adversarial networks]]

[[cs501r_f2016:lab10|Lab 8 - RNNs, LSTMs, GRUs]]

[[cs501r_f2016:lab9|Lab 9 - Siamese networks]]

[[cs501r_f2016:lab13|Lab 10 - Inceptionism / deep art]]

[[cs501r_f2016:fp|Final project]]
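To give a feel for the kind of idea Lab 3 builds on, here is a rough sketch of plain gradient descent on a one-dimensional quadratic (a hypothetical illustration, not the actual lab code; the function, learning rate, and iteration count are all made up for the example):

```python
# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0      # initial guess
lr = 0.1     # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)   # step opposite the gradient

print(round(w, 4))   # converges toward the minimizer w = 3
```

The labs replace the hand-written `grad` with automatic differentiation (Lab 4) and then with Tensorflow's built-in optimizers (Lab 5 onward), but the update rule stays the same.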
  
  
cs501r_f2016.1459465617.txt.gz · Last modified: 2021/06/30 23:40 (external edit)