cs501r_f2016 [2016/03/31 23:09]
admin
cs501r_f2016 [2021/06/30 23:42] (current)
====CS501r, Fall 2016 - Deep Learning: Theory and Practice====

[[cs501r_f2016:desc|Here is the course description.]]

[[http://cs231n.github.io/python-numpy-tutorial/|Remember, this is a great tutorial on python / numpy!]]

[[cs501r_f2016:openlabtf|Some instructions for getting Tensorflow to run on the CS open labs]]

[[https://www.dropbox.com/sh/aox63ppfd14hf7b/AABGgv56Q98ikk5I8I4bNbO3a?dl=0|All of the slides are posted on Dropbox here]]

----
=== Labs ===

[[cs501r_f2016:lab_notes|General notes on ipython and seaborn]]

[[cs501r_f2016:lab1|Lab 1 - Anaconda and playground screenshot]]

[[cs501r_f2016:lab2|Lab 2 - Perceptron]]

[[cs501r_f2016:lab3|Lab 3 - Basic gradient descent]]

[[cs501r_f2016:lab4|Lab 4 - Automatic differentiation]]

[[cs501r_f2016:lab5|Lab 5 - Tensorflow image classifier]]

[[cs501r_f2016:lab5b|Lab 5b - Convolutional Tensorflow image classifier and Tensorboard]]

[[cs501r_f2016:lab6|Lab 6 - Feature zoo 1]]

[[cs501r_f2016:lab7|Lab 7 - Generative adversarial networks]]

[[cs501r_f2016:lab10|Lab 8 - RNNs, LSTMs, GRUs]]

[[cs501r_f2016:lab9|Lab 9 - Siamese networks]]

[[cs501r_f2016:lab13|Lab 10 - Inceptionism / deep art]]

[[cs501r_f2016:fp|Final project]]
  
Preliminary Syllabus and topics to be covered:

  - Basics of DNNs
    - Convolution layers
    - Maxpooling layers
    - Relu units
    - Softmax units
    - Local response normalization / contrast normalization
  - Regularization strategies
    - Dropout
    - Dropconnect
    - Batch normalization
    - Adversarial networks
    - Data augmentation
  - High-level implementation packages - pros and cons
    - Tensorflow, Theano, Caffe, Keras, Torch, Mocha
  - Case studies / existing networks and why they're interesting
    - AlexNet
    - VGG
    - GoogLeNet / Inception
    - ZFNet
  - Training & initialization
    - Initialization strategies: Xavier, Gaussian, Identity, Sparse
    - Optimization theory and algorithms
    - Local minima; saddle points; plateaus
    - SGD
    - RPROP
    - RMSprop
    - Adagrad
    - Adam
    - Higher-order algorithms (LBFGS; Hessian-free; trust-region)
    - Nesterov and momentum
  - Large-scale distributed learning
    - Parameter servers
    - Asynchronous vs. synchronous architectures
  - Temporal networks and how to train them
    - Basic RNNs and backprop-through-time
    - LSTMs
    - Deep Memory Nets
  - Application areas
    - Deep reinforcement learning
    - NN models of style vs. content (deepart.io)
    - Imagenet classification
    - The Neural Turing Machine
    - Sentiment classification
    - Word embeddings
  - Understanding and visualizing CNNs
    - tSNE embeddings
    - deconvnets
    - data gradients / inceptionism
  - Misc
    - Network compression
    - Low bit-precision networks
    - Sum-product networks
    - Evolutionary approaches to topology discovery
    - Spatial transformer networks
    - Network-in-network
    - Regions-with-CNN
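The dropout entry under "Regularization strategies" above can be made concrete with a small sketch. This is not the course's code, just a minimal numpy illustration of "inverted" dropout; the function name `dropout_forward` and its parameters are hypothetical.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, train=True, rng=None):
    """Inverted dropout (illustrative sketch): zero each activation with
    probability p_drop during training and rescale survivors by
    1/(1-p_drop), so the expected activation is unchanged and no
    rescaling is needed at test time."""
    if not train or p_drop == 0.0:
        return x
    rng = rng or np.random.default_rng(0)
    # mask entries are 0 (dropped) or 1/(1-p_drop) (kept and rescaled)
    mask = (rng.random(x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask

# The mean activation is preserved on average:
x = np.ones((1000, 100))
y = dropout_forward(x, p_drop=0.5)
print(abs(y.mean() - 1.0) < 0.05)
```

At test time (`train=False`) the input passes through untouched, which is why the rescaling is folded into training rather than inference.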
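The "Nesterov and momentum" topic under "Training & initialization" above can likewise be sketched. This is a generic illustration, not course material: classical (heavy-ball) momentum SGD minimizing an ill-conditioned quadratic; the helper `sgd_momentum` and the test function are invented for the example.

```python
import numpy as np

def sgd_momentum(grad_fn, w, lr=0.1, mu=0.9, steps=100):
    """Classical momentum SGD (illustrative sketch): the velocity v is an
    exponentially decaying sum of past gradients, which damps oscillation
    across steep directions and accelerates progress along shallow ones."""
    v = np.zeros_like(w)
    for _ in range(steps):
        v = mu * v - lr * grad_fn(w)  # accumulate decayed gradient history
        w = w + v                     # step along the velocity
    return w

# Minimize f(w) = 0.5 * w^T A w, whose gradient is A @ w.
A = np.diag([1.0, 10.0])  # ill-conditioned: curvature differs 10x per axis
grad = lambda w: A @ w
w_final = sgd_momentum(grad, np.array([1.0, 1.0]), lr=0.02, mu=0.9, steps=300)
print(np.allclose(w_final, 0.0, atol=1e-3))
```

Plain SGD with the same learning rate crawls along the shallow (curvature 1.0) axis; the momentum term is what lets both coordinates reach the minimum in the same number of steps.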
  
cs501r_f2016.1459465744.txt.gz · Last modified: 2021/06/30 23:40 (external edit)