cs501r_f2016:lab5 — last edited 2016/09/19 by wingated
  * Use ''tf.nn.relu'' to create a ''relu'' layer.
  * **Variable initialization matters**. If your classifier seems stuck at 10% or 11% accuracy, make sure you're not initializing to all zeros! I usually initialize variables by drawing entries from a Gaussian with a small standard deviation: ''W = tf.Variable( tf.random_normal( [784, 100], stddev=0.001 ), name="W" )''
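To see why all-zero initialization stalls a relu network at chance accuracy, here is a minimal numpy sketch of the idea (numpy stands in for the TF 1.x calls above; the shapes ''[784, 100]'' match the MNIST-style example, and the batch size of 5 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 784))  # a small batch of fake inputs

# Zero init: every hidden unit computes the identical (zero) output,
# so every unit gets the identical gradient and they never differentiate.
W_zero = np.zeros((784, 100))
h_zero = np.maximum(x @ W_zero, 0.0)  # relu
print(np.allclose(h_zero, 0.0))       # True: the layer is dead

# Small-stddev Gaussian init, mirroring stddev=0.001 in the hint above:
# units start out different, so gradients differ and learning can proceed.
W = rng.normal(scale=0.001, size=(784, 100))
h = np.maximum(x @ W, 0.0)
print(h.std() > 0)                    # True: activations carry signal
```

The standard deviation is kept small so pre-activations start near zero without being identically zero, which breaks the symmetry between units.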