<code python>
step_size = 0.1
for i in range(0,NUM_EPOCHS):
    loss_function_value_t = loss_function( W )
    grad_t = grad_f( W )
    W = W - step_size * grad_t
</code>
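
In case it helps to see where ''grad_f'' comes from, here is a minimal end-to-end sketch of that loop using autograd's ''grad''. The data, the loss, and the shapes below are invented for illustration only, not the ones the lab specifies; the point is that autograd builds the gradient function directly from ''loss_function''.

<code python>
import autograd.numpy as np
from autograd import grad

NUM_EPOCHS = 100
X = np.random.randn(50, 4)                  # hypothetical data: 50 examples, 4 features
T = np.eye(3)[np.random.randint(0, 3, 50)]  # hypothetical one-hot targets for 3 classes

def loss_function(W):
    scores = np.dot(X, W)                   # simple linear scoring function
    return np.mean((scores - T) ** 2)       # stand-in squared-error loss

grad_f = grad(loss_function)                # autograd builds the gradient function for us

W = np.random.randn(4, 3) * 0.01
step_size = 0.1
for i in range(0, NUM_EPOCHS):
    loss_function_value_t = loss_function(W)
    grad_t = grad_f(W)
    W = W - step_size * grad_t
</code>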
  
  
You should plot both the loss function and the classification accuracy.
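
One way to do the plotting, as a sketch: this assumes you appended each epoch's loss and accuracy to two lists while training (the names ''losses'' and ''accuracies'' are placeholders, not part of the lab).

<code python>
import matplotlib.pyplot as plt

losses = []      # e.g. losses.append(loss_function_value_t) each epoch
accuracies = []  # e.g. accuracies.append(accuracy_t) each epoch

plt.subplot(1, 2, 1)
plt.plot(losses)
plt.xlabel("epoch")
plt.ylabel("loss")

plt.subplot(1, 2, 2)
plt.plot(accuracies)
plt.xlabel("epoch")
plt.ylabel("classification accuracy")

plt.tight_layout()
plt.show()
</code>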

----
====Extra awesomeness:====

Now that you have a powerful automatic differentiation engine at your command, how hard would it be to take our simple linear scoring function and change it to something more complex? For example, could you swap in a simple 2-layer neural network? Or something else entirely?

Since this lab is so short, you should have some extra time to play around. I invite you to try something!
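
As a rough illustration of how little has to change, here is a sketch of a two-layer network dropped in as the scoring function. The hidden size, initialization, loss, and data below are all made up; the point is only that autograd differentiates through whatever scoring function you write.

<code python>
import autograd.numpy as np
from autograd import grad

X = np.random.randn(50, 4)                  # hypothetical data
T = np.eye(3)[np.random.randint(0, 3, 50)]  # hypothetical one-hot labels
HIDDEN = 32

def scores_fn(params):
    W1, b1, W2, b2 = params
    hidden = np.tanh(np.dot(X, W1) + b1)    # first layer plus nonlinearity
    return np.dot(hidden, W2) + b2          # second layer produces class scores

def loss_function(params):
    scores = scores_fn(params)
    scores = scores - np.max(scores, axis=1, keepdims=True)   # numerical stability
    probs = np.exp(scores) / np.sum(np.exp(scores), axis=1, keepdims=True)
    return -np.mean(np.sum(T * np.log(probs), axis=1))        # cross-entropy

params = [np.random.randn(4, HIDDEN) * 0.1, np.zeros(HIDDEN),
          np.random.randn(HIDDEN, 3) * 0.1, np.zeros(3)]
grad_f = grad(loss_function)                # gradients for the whole parameter list

step_size = 0.1
for i in range(100):
    grads = grad_f(params)
    params = [p - step_size * g for p, g in zip(params, grads)]
</code>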
  
----
You may want to test the automatic differentiation package on known test cases - for example, f(x)=x^2.
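
For example, a quick sanity check (assuming you are using autograd; the names ''f'' and ''df'' are just for illustration):

<code python>
import autograd.numpy as np
from autograd import grad

def f(x):
    return x ** 2

df = grad(f)
print(df(3.0))   # should print 6.0, i.e. 2*x at x=3
</code>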
  
If your code seems to take forever to get running, seems to stall right at the beginning, or you get an error like ''FloatNode has no attribute named 'exp' '', make sure you're using the autograd version of numpy!
  
<code python>
import autograd.numpy as np
</code>