=====BYU CS 501R - Deep Learning: Theory and Practice - Lab 6=====


====Objective:====
  
  * 30% Proper design, creation and debugging of a dense prediction network
  * 30% Proper design of a loss function and test set accuracy measure
  * 20% Proper experimentation with two different regularizers
  * 20% Tidy visualization of the output of your dense predictor
  
overfitting is a real possibility.
  
[[http://liftothers.org/cancer_data.tar.gz|The data can be downloaded here.]] //Please note that this dataset is not publicly available, and should not be redistributed.//
  
As in the previous lab, you are welcome to sub-sample the data if your computer is not powerful enough to fit it all in RAM.  However, if you do, please clearly report how much data you used in your final report.
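
If you do sub-sample, here is a minimal sketch of taking a reproducible subset.  The array names, shapes, and the loading step are placeholders, not part of the dataset specification; substitute however you actually read the images and labels.

<code python>
import numpy as np

# Stand-ins for the data you have already loaded (names and shapes are assumptions).
images = np.zeros((100, 128, 128, 3), dtype=np.float32)
labels = np.zeros((100, 128, 128), dtype=np.int32)

keep_fraction = 0.25                 # report this number in your final write-up
rng = np.random.RandomState(42)      # fixed seed so the subset is reproducible
idx = rng.permutation(len(images))[:int(len(images) * keep_fraction)]
images, labels = images[idx], labels[idx]
print("Kept %d examples" % len(images))
</code>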
  
Like the previous lab, you must choose your topology.  I have had good
luck implementing the "Deep Convolution U-Net" from this paper: [[https://arxiv.org/pdf/1505.04597.pdf|U-Net: Convolutional Networks for Biomedical Image Segmentation]] (See figure 1, replicated at the right).  This should be fairly easy to implement given the
''conv'' helper functions that you implemented previously; you
may also need the tensorflow function ''tf.concat''.
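
For reference, here is a minimal two-level sketch in the spirit of the U-Net figure.  It uses ''tf.layers'' in place of your own ''conv'' helper, and the input size and filter counts are assumptions, not the paper's exact architecture; swap in your own helper and depths as appropriate.

<code python>
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 512, 512, 3])   # NHWC input (size is an assumption)

# Contracting path
c1 = tf.layers.conv2d(x, 64, 3, padding="same", activation=tf.nn.relu)
p1 = tf.layers.max_pooling2d(c1, 2, 2)
c2 = tf.layers.conv2d(p1, 128, 3, padding="same", activation=tf.nn.relu)

# Expanding path: upsample, then concatenate the matching skip connection
u1 = tf.layers.conv2d_transpose(c2, 64, 2, strides=2, padding="same")
m1 = tf.concat([u1, c1], axis=3)          # skip connection via tf.concat
c3 = tf.layers.conv2d(m1, 64, 3, padding="same", activation=tf.nn.relu)

# Two output filters -> one logit per class at every pixel
logits = tf.layers.conv2d(c3, 2, 1, padding="same")   # [batch, 512, 512, 2]
</code>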
  
//Note that the simplest network you could implement (with all the desired properties) is just a single convolution layer with two filters and no relu!  Why is that?  (of course it wouldn't work very well!)//
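
As a sanity check, a sketch of that degenerate baseline (the input shape is an assumption): a single convolution with two output filters already produces one logit per class at every pixel, so the output has the right dense-prediction shape, and no relu is needed because the logits feed straight into a softmax cross-entropy.

<code python>
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 512, 512, 3])
logits = tf.layers.conv2d(x, 2, 3, padding="same")   # [batch, 512, 512, 2], no relu
</code>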
  
**Part 1b: Implement a cost function**
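
One possible choice (a sketch, not necessarily the loss you should settle on) is per-pixel softmax cross-entropy between the two-channel logits and integer class labels, averaged over every pixel, with pixel-wise accuracy as the matching test-set measure.  The placeholder shapes are assumptions to keep the example self-contained.

<code python>
import tensorflow as tf

logits = tf.placeholder(tf.float32, [None, 512, 512, 2])   # per-pixel class logits
labels = tf.placeholder(tf.int32, [None, 512, 512])        # per-pixel class labels

per_pixel_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
loss = tf.reduce_mean(per_pixel_loss)

# Accuracy measure: fraction of pixels whose predicted class matches the label
predictions = tf.cast(tf.argmax(logits, axis=3), tf.int32)
accuracy = tf.reduce_mean(tf.cast(tf.equal(predictions, labels), tf.float32))
</code>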