=====BYU CS 501R - Deep Learning: Theory and Practice - Lab 6=====


====Objective:====
  
notebook or PDF writeup that describes your (1) topology, (2) cost
function, (3) method of calculating accuracy, and (4) results from
experimenting with regularization.  You should also report how much of the data you used.
  
Your notebook / writeup should also include an image that
shows the dense prediction produced by your network on the
''pos_test_000072.png'' image.  This is an image in the test set that
your network will not have seen before.  This image, and the ground truth labeling, is shown at the right (and is contained in the downloadable dataset below).
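One simple way to produce a tidy visualization (a sketch only; the helper name and arguments are hypothetical) is to plot the input image next to the predicted per-pixel class mask with matplotlib:

<code python>
import matplotlib.pyplot as plt

def show_prediction(image, prediction, filename='dense_prediction.png'):
    # image: the input RGB image as an array
    # prediction: a 2-D array of per-pixel class labels (e.g. the argmax
    # of the network's output on pos_test_000072.png)
    fig, axes = plt.subplots(1, 2, figsize=(8, 4))
    axes[0].imshow(image)
    axes[0].set_title('input image')
    axes[1].imshow(prediction, cmap='gray')
    axes[1].set_title('predicted cancer mask')
    fig.savefig(filename)
</code>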
  
----
  * 30% Proper design, creation and debugging of a dense prediction network
  * 30% Proper design of a loss function and test set accuracy measure
  * 20% Proper experimentation with two different regularizers
  * 20% Tidy visualization of the output of your dense predictor
  
overfitting is a real possibility.
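Since the rubric asks for experimentation with two different regularizers, here is a hedged sketch of two common options in the TensorFlow graph style; ''base_loss'' and ''features'' are hypothetical stand-ins for your own cost and layer activations:

<code python>
import tensorflow as tf

def l2_regularized(base_loss, weight=1e-4):
    # (1) L2 weight decay: penalize the squared magnitude of every
    # trainable variable in the graph
    return base_loss + weight * tf.add_n(
        [tf.nn.l2_loss(v) for v in tf.trainable_variables()])

def with_dropout(features, keep_prob):
    # (2) dropout on an intermediate feature map; feed keep_prob of about
    # 0.5 while training and 1.0 when measuring test accuracy
    return tf.nn.dropout(features, keep_prob)
</code>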
  
[[http://liftothers.org/cancer_data.tar.gz|The data can be downloaded here.]] //Please note that this dataset is not publicly available, and should not be redistributed.//

As in the previous lab, you are welcome to sub-sample the data if your computer is not powerful enough to fit it all in RAM.  However, if you do, please clearly report how much data you used in your final report.
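For example, a minimal sketch of sub-sampling (the helper name is hypothetical), assuming the images and labels have already been loaded into numpy arrays:

<code python>
import numpy as np

def subsample(images, labels, keep, seed=0):
    # keep a random subset of `keep` image/label pairs
    rng = np.random.RandomState(seed)
    idx = rng.choice(len(images), size=keep, replace=False)
    return images[idx], labels[idx]
</code>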
  
----
  
Like the previous lab, you must choose your topology.  I have had good
luck implementing the "Deep Convolution U-Net" from this paper: [[https://arxiv.org/pdf/1505.04597.pdf|U-Net: Convolutional Networks for Biomedical Image Segmentation]] (see Figure 1, replicated at the right).  This should be fairly easy to implement given the
''conv'' helper functions that you implemented previously; you
may also need the TensorFlow function ''tf.concat''.
  
//Note that the simplest network you could implement (with all the desired properties) is just a single convolution layer with two filters and no relu!  Why is that?  (Of course, it wouldn't work very well!)//
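To make the shapes concrete, here is a hedged sketch of a very small U-Net-style network; it assumes 512x512 RGB inputs and two output classes, and uses ''tf.layers.conv2d'' in place of your ''conv'' helper (swap in your own helper and a deeper topology as appropriate):

<code python>
import tensorflow as tf

def conv_block(x, filters):
    # two 3x3 convolutions with relu; 'same' padding keeps the spatial size
    x = tf.layers.conv2d(x, filters, 3, padding='same', activation=tf.nn.relu)
    x = tf.layers.conv2d(x, filters, 3, padding='same', activation=tf.nn.relu)
    return x

images = tf.placeholder(tf.float32, [None, 512, 512, 3])

# encoder: convolve, then downsample
e1 = conv_block(images, 16)                           # 512x512x16
p1 = tf.layers.max_pooling2d(e1, 2, 2)                # 256x256x16
e2 = conv_block(p1, 32)                               # 256x256x32
p2 = tf.layers.max_pooling2d(e2, 2, 2)                # 128x128x32

# bottleneck
b = conv_block(p2, 64)                                # 128x128x64

# decoder: upsample, concatenate the skip connection, convolve
u2 = tf.layers.conv2d_transpose(b, 32, 2, strides=2)  # 256x256x32
d2 = conv_block(tf.concat([u2, e2], axis=3), 32)      # 256x256x32
u1 = tf.layers.conv2d_transpose(d2, 16, 2, strides=2) # 512x512x16
d1 = conv_block(tf.concat([u1, e1], axis=3), 16)      # 512x512x16

# per-pixel class scores (logits); note: no relu on the final layer
logits = tf.layers.conv2d(d1, 2, 1, padding='same')   # 512x512x2
</code>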
  
**Part 1b: Implement a cost function**
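One reasonable starting point (a sketch, not a required solution) is a per-pixel softmax cross-entropy, with accuracy measured as the fraction of correctly classified pixels.  This assumes the ''logits'' tensor and image size from the topology sketch above:

<code python>
import tensorflow as tf

# `logits` is the (batch x 512 x 512 x 2) output of the topology sketch above
labels = tf.placeholder(tf.int32, [None, 512, 512])

# per-pixel cross-entropy, averaged over every pixel in the batch
xent = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels,
                                                      logits=logits)
loss = tf.reduce_mean(xent)

# accuracy: fraction of pixels whose predicted class matches the label
predictions = tf.cast(tf.argmax(logits, axis=3), tf.int32)
accuracy = tf.reduce_mean(tf.cast(tf.equal(predictions, labels), tf.float32))

train_step = tf.train.AdamOptimizer(1e-4).minimize(loss)
</code>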