  
You should plot both the loss function and the classification accuracy.
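For example, a minimal plotting sketch, assuming you have collected the per-step values in Python lists named ''losses'' and ''accuracies'' (both names are illustrative, not part of the lab scaffolding):

<code python>
import matplotlib.pyplot as plt

# losses and accuracies are assumed to be lists filled in during the training loop
plt.figure()
plt.plot(losses, label='loss')
plt.plot(accuracies, label='classification accuracy')
plt.xlabel('training step')
plt.legend()
plt.show()
</code>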
----
====Extra awesomeness:====

Now that you have a powerful automatic differentiation engine at your command, how hard would it be to take our simple linear scoring function and change it to something more complex? For example, could you swap in a simple 2-layer neural network? Or something else entirely?

Since this lab is so short, you should have some extra time to play around. I invite you to try something!
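As one concrete illustration, here is a rough sketch of what swapping in a 2-layer network might look like on top of autograd. Everything here (the function names ''two_layer_score'' and ''loss'', the ''tanh'' nonlinearity, and the parameter layout) is just one possible choice, not a required design:

<code python>
import autograd.numpy as np
from autograd import grad

def two_layer_score(params, X):
    # params = [W1, b1, W2, b2]; X holds one example per row
    W1, b1, W2, b2 = params
    hidden = np.tanh(np.dot(X, W1) + b1)   # hidden layer with a tanh nonlinearity
    return np.dot(hidden, W2) + b2         # one row of class scores per example

def loss(params, X, y):
    # softmax cross-entropy on the scores; y holds integer class labels
    scores = two_layer_score(params, X)
    scores = scores - np.max(scores, axis=1, keepdims=True)
    log_probs = scores - np.log(np.sum(np.exp(scores), axis=1, keepdims=True))
    return -np.mean(log_probs[np.arange(X.shape[0]), y])

# autograd gives the gradient with respect to every array in the params list
loss_grad = grad(loss)
</code>

The rest of the training loop can stay essentially the same; you would just initialize four parameter arrays instead of a single weight matrix.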
  
----
You may want to test the automatic differentiation package on known test cases - for example, f(x)=x^2.
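A quick check along those lines might look like this (just a sketch of the idea):

<code python>
import autograd.numpy as np
from autograd import grad

def f(x):
    return x ** 2

df = grad(f)      # the derivative of x^2 is 2x
print(df(3.0))    # should print 6.0
print(df(-1.5))   # should print -3.0
</code>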
  
If your code seems to take forever to get running, seems to stall right at the beginning, or you get an error like ''FloatNode has no attribute named 'exp' '', make sure you're using the autograd version of numpy!
  
<code python>
import autograd.numpy as np   # autograd's wrapped numpy, not plain numpy
</code>