  
  * Use ''tf.nn.relu'' to create a ''relu'' layer.
  * **Variable initialization matters.**  If your classifier seems stuck at 10% or 11% accuracy, make sure you're not initializing to all zeros!  I usually initialize variables by drawing entries from a Gaussian with a small standard deviation: ''W = tf.Variable( tf.random_normal( [784, 100], stddev=0.001 ), name="W" )''
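The initialization hint above can be sketched numerically. The following is a NumPy analogue (an illustration, not the course's TensorFlow code) of why a small-stddev Gaussian works where all-zeros gets stuck: with a zero weight matrix, every relu unit computes the same value, gradients stay symmetric, and the classifier never moves off chance accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small-stddev Gaussian init, analogous to the hint's
# tf.random_normal([784, 100], stddev=0.001).
W = rng.normal(loc=0.0, scale=0.001, size=(784, 100))

# Zero init: the failure mode the hint warns about.
W_zero = np.zeros((784, 100))

x = rng.normal(size=(1, 784))        # one fake MNIST-sized input
h = np.maximum(x @ W, 0.0)           # relu layer, as with tf.nn.relu
h_zero = np.maximum(x @ W_zero, 0.0)

print(h.std() > 0)       # units differ, so learning can break symmetry
print((h_zero == 0).all())  # every unit identical (zero): stuck at ~10%
```

With the Gaussian init the hidden units start out slightly different from each other, which is all gradient descent needs to differentiate them during training.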
  
  
cs501r_f2016/lab5.txt · Last modified: 2016/09/19 08:48 by wingated