cs601r_w2020:lab1

Details:
  
Your classifier must be a 20-layer ResNet.  You may NOT use a built-in or pretrained ResNet (i.e., from torch.hub), because you will need to be able to modify it.  However, you are encouraged to use other built-in PyTorch blocks, such as BatchNorm blocks.
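
For reference, here is a minimal sketch of one residual block, assuming the standard CIFAR-style ResNet-20 layout (3x3 convolutions, identity shortcuts, projection shortcuts when the shape changes). The class name BasicBlock and all of its details are illustrative, not a required design.

<code python>
import torch.nn as nn
import torch.nn.functional as F

class BasicBlock(nn.Module):
    """One residual block: two 3x3 convolutions plus a shortcut connection."""
    def __init__(self, in_channels, out_channels, stride=1):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)   # built-in block, allowed per the spec
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3,
                               stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)
        # Projection shortcut when the spatial size or channel count changes.
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=1,
                          stride=stride, bias=False),
                nn.BatchNorm2d(out_channels),
            )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + self.shortcut(x)   # residual connection
        return F.relu(out)
</code>

Stacking three stages of three such blocks (16, 32, and 64 channels), plus an initial convolution and a final linear layer, gives the 20 layers.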
  
The dataset is the CIFAR-10 dataset.  You may use the torchvision.datasets interface to simplify loading and managing the dataset.  While data augmentation is not required for this lab, make sure that you're prepared to implement it.
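
A minimal sketch of loading CIFAR-10 through torchvision.datasets follows. The data directory, batch size, and normalization statistics are placeholders (the means and standard deviations shown are commonly cited approximate per-channel values for CIFAR-10); augmentation transforms such as RandomCrop or RandomHorizontalFlip could later be prepended to the training transform.

<code python>
import torchvision
import torchvision.transforms as transforms
from torch.utils.data import DataLoader

train_transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),   # approx. CIFAR-10 channel means
                         (0.2470, 0.2435, 0.2616)),  # approx. CIFAR-10 channel std devs
])

train_set = torchvision.datasets.CIFAR10(root="./data", train=True,
                                         download=True, transform=train_transform)
test_set = torchvision.datasets.CIFAR10(root="./data", train=False,
                                        download=True, transform=train_transform)

train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=2)
test_loader = DataLoader(test_set, batch_size=128, shuffle=False, num_workers=2)
</code>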
  
You will probably want to use Google Colab to host your notebook.
For your loss curve, any reasonable visualization is acceptable.
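
For example, a matplotlib sketch, assuming you record a list of loss values (here called train_losses) during training:

<code python>
import matplotlib.pyplot as plt

# train_losses: a list of per-step (or per-epoch) losses recorded during training (assumed)
plt.plot(train_losses, label="training loss")
plt.xlabel("step")
plt.ylabel("loss")
plt.legend()
plt.show()
</code>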
  
For your confusion matrix, you may use any reasonable visualization or software package.
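
One possible approach is to accumulate the matrix by hand from model predictions, then display it however you like; the sketch below assumes a trained model, a test_loader, and a device exist elsewhere in your notebook.

<code python>
import torch

num_classes = 10
confusion = torch.zeros(num_classes, num_classes, dtype=torch.long)

model.eval()
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images.to(device)).argmax(dim=1).cpu()
        # Row = true class, column = predicted class.
        for t, p in zip(labels, preds):
            confusion[t, p] += 1

# Any reasonable display works, e.g. plt.imshow(confusion) or a seaborn heatmap.
</code>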
  
You will not be graded on any final accuracies.
  
You may use any code on the internet to help you, but all submitted code must be your own work.

MAJOR CONSIDERATION: we will be working with this classifier in future labs by adjusting its hyperparameters.  In particular, you should make it easy to do the following (one way to keep these options together is sketched after the list):
    * Change the activation function
    * Change whether or not you use BatchNorm
    * Change the learning rate schedule
    * Change the weight regularization
    * Change the weight initialization
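
A minimal sketch of one way to organize these knobs: a small configuration object passed into the model constructor and the training loop. All names here (TrainConfig, make_norm, apply_init) and the default values are illustrative, not required.

<code python>
from dataclasses import dataclass
from typing import Callable

import torch.nn as nn

@dataclass
class TrainConfig:
    """Hypothetical bundle of the knobs listed above; names and defaults are illustrative."""
    activation: Callable[[], nn.Module] = nn.ReLU       # e.g. nn.ReLU, nn.LeakyReLU, nn.GELU
    use_batchnorm: bool = True
    learning_rate: float = 0.1
    lr_schedule: str = "cosine"                         # interpreted by your training loop
    weight_decay: float = 5e-4                          # weight regularization
    weight_init: Callable = nn.init.kaiming_normal_     # weight initialization

def make_norm(cfg: TrainConfig, channels: int) -> nn.Module:
    """Return a BatchNorm layer or a no-op, depending on the config."""
    return nn.BatchNorm2d(channels) if cfg.use_batchnorm else nn.Identity()

def apply_init(cfg: TrainConfig, model: nn.Module) -> None:
    """Apply the configured initializer to every convolution's weights."""
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            cfg.weight_init(m.weight)
</code>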
  
  