Also note that there are two versions of Tensorflow -- one that runs on GPUs, and one that runs only on the CPU.  You may want to try the GPU version first; if it works (and you have a GPU in your computer!) it may be **considerably** faster than the CPU-only version.  Performance won't be a big deal for this lab, but it will matter more later on.

A few other notes:
  - The GPU version of Tensorflow has considerably more external dependencies -- you will need to install several Nvidia packages to support it.  There be dragons there.
  - It might seem natural to use the ''anaconda'' install instructions, since that's what we're using in class.  However, I have found that the creation of virtual environments complicates life, and doesn't seem to work well with the notebooks we've been using.  A much simpler way is just to use the ''pip'' installation method.
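For reference, the ''pip'' route looks roughly like the following.  The exact package names (and whether a prebuilt GPU wheel exists for your platform) have varied across Tensorflow releases, so treat ''tensorflow'' and ''tensorflow-gpu'' below as placeholders to check against the official Tensorflow install page.

```shell
# Sketch of the pip install route (package names are assumptions;
# check the official Tensorflow install page for your platform/version).

# CPU-only build -- fewest dependencies, a good first try:
pip install --upgrade pip
pip install tensorflow

# GPU build -- requires Nvidia's CUDA toolkit and cuDNN to be installed
# separately before it will actually run (commented out for that reason):
# pip install tensorflow-gpu
```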
  
**Part 2: implement basic MNIST tutorial**
  
  * Use ''tf.nn.relu'' to create a ''relu'' layer.
  * **Variable initialization matters.**  If your classifier seems stuck at 10% or 11% accuracy, make sure you're not initializing to all zeros!  I usually initialize variables by drawing entries from a Gaussian with a small standard deviation: ''W = tf.Variable( tf.random_normal( [784, 100], stddev=0.001 ), name="W" )''
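To see why all-zeros initialization gets stuck at chance accuracy, here is a minimal sketch in NumPy (NumPy rather than Tensorflow, just to keep it self-contained): with ''W = 0'', every hidden unit computes exactly the same output, so every unit also receives the same gradient, and the network can never break symmetry.  The small-stddev Gaussian draw from the hint above avoids this.

```python
import numpy as np

# Hypothetical shapes matching the hint: 784 inputs, 100 hidden units.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 784))     # one fake input example

# All-zeros init: every hidden unit computes the same thing, so every
# unit also gets the same gradient -- symmetry is never broken and
# accuracy stays near chance (~10% on MNIST).
W_zero = np.zeros((784, 100))
h_zero = x @ W_zero
print(np.unique(h_zero))          # a single value: 0.0

# Small-stddev Gaussian init (the hint's recipe): units now differ.
W_gauss = rng.normal(scale=0.001, size=(784, 100))
h_gauss = x @ W_gauss
print(h_gauss.std() > 0)          # True: symmetry is broken
```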
  
  
cs501r_f2016/lab5.txt · Last modified: 2016/09/19 08:48 by wingated