**Applications**
  BERT - https://arxiv.org/abs/1810.04805 (masked-LM sketch after this list)
  Machine Theory of Mind - http://arxiv.org/pdf/1802.07740v2.pdf
  Video-to-Video Synthesis
  Video Prediction via Selective Sampling
  Learning to Decompose and Disentangle Representations for Video Prediction
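
A minimal sketch of the masked-LM corruption scheme described in the BERT paper (roughly 15% of tokens selected as prediction targets; of those, 80% become [MASK], 10% a random token, 10% unchanged). The token ids, mask id, and vocabulary size below are made-up placeholders, not the paper's actual vocabulary.

<code python>
import numpy as np

def mask_tokens(token_ids, mask_id, vocab_size, rng, mask_prob=0.15):
    """BERT-style masked-LM corruption (sketch).

    Select ~15% of positions as prediction targets; of those,
    80% are replaced by [MASK], 10% by a random token, 10% left unchanged.
    """
    token_ids = np.asarray(token_ids)
    targets = np.full_like(token_ids, -1)               # -1 marks "not a target"

    chosen = rng.random(token_ids.shape) < mask_prob
    targets[chosen] = token_ids[chosen]                  # the model must recover these

    corrupted = token_ids.copy()
    roll = rng.random(token_ids.shape)
    corrupted[chosen & (roll < 0.8)] = mask_id                      # 80% -> [MASK]
    rand_pos = chosen & (roll >= 0.8) & (roll < 0.9)                # 10% -> random token
    corrupted[rand_pos] = rng.integers(0, vocab_size, rand_pos.sum())
    # the remaining 10% of chosen positions keep their original token
    return corrupted, targets

rng = np.random.default_rng(0)
corrupted, targets = mask_tokens([5, 17, 42, 8, 99, 3], mask_id=0, vocab_size=100, rng=rng)
</code>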
  
**GANs / unsupervised**

  IntroVAE: Introspective Variational Autoencoders for Photographic Image Synthesis
  Wasserstein GAN (loss sketch after this list)
  Text-adaptive GAN: Manipulating images with natural language
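
For reference while reading the Wasserstein GAN paper, a bare-bones sketch of its objectives: the critic maximizes E[f(real)] - E[f(fake)], and the original paper enforces the Lipschitz constraint by clipping critic weights. The score arrays and clip constant below are placeholders.

<code python>
import numpy as np

def critic_loss(real_scores, fake_scores):
    # Critic maximizes E[f(x_real)] - E[f(G(z))]; return the negation to minimize.
    return -(np.mean(real_scores) - np.mean(fake_scores))

def generator_loss(fake_scores):
    # Generator tries to raise the critic's score on its samples.
    return -np.mean(fake_scores)

def clip_critic_weights(weights, c=0.01):
    # Weight clipping from the original WGAN paper (a crude Lipschitz constraint).
    return [np.clip(w, -c, c) for w in weights]
</code>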
  
**Network design**

  Attention is all you need (attention sketch after this list)
  Neural Ordinary Differential Equations - https://arxiv.org/pdf/1806.07366.pdf
  Reversible neural networks - https://arxiv.org/abs/1807.03039 - https://arxiv.org/abs/1605.08803
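
A minimal numpy sketch of the scaled dot-product attention at the core of "Attention is all you need": softmax(Q K^T / sqrt(d_k)) V. The shapes are placeholders, and masking / multi-head splitting are omitted.

<code python>
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq_len, seq_len) similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V

out = scaled_dot_product_attention(np.random.randn(4, 8),
                                   np.random.randn(4, 8),
                                   np.random.randn(4, 16))
</code>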
   ​   ​
**Foundations / Philosophy**

  Troubling trends in ML scholarship - https://arxiv.org/pdf/1807.03341
  A Theory of Local Learning, the Learning Channel, and the Optimality of Backpropagation - https://arxiv.org/pdf/1506.06472
  Why and When Can Deep -- but Not Shallow -- Networks Avoid the Curse of Dimensionality: a Review - https://arxiv.org/pdf/1611.00740
   ​   ​
**RL**

  Curiosity-driven exploration by self-supervised prediction (intrinsic-reward sketch after this list)
  Diversity is all you need: Learning skills without a reward function - http://arxiv.org/pdf/1802.06070v6.pdf
  World Models - https://arxiv.org/pdf/1803.10122v4.pdf
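
A rough sketch of the curiosity bonus from "Curiosity-driven exploration by self-supervised prediction": the intrinsic reward is the error of a learned forward model predicting the next state in a learned feature space. The encode and forward_model callables below are placeholders for networks the paper trains with its inverse-dynamics objective.

<code python>
import numpy as np

def intrinsic_reward(encode, forward_model, state, action, next_state, eta=0.01):
    """Curiosity bonus = scaled squared error of the forward model in feature space.

    encode(state)              -> feature vector phi(state)     (learned elsewhere)
    forward_model(phi, action) -> predicted phi(next_state)     (learned elsewhere)
    """
    phi_pred = forward_model(encode(state), action)
    phi_next = encode(next_state)
    return (eta / 2.0) * np.sum((phi_pred - phi_next) ** 2)

# The policy is then trained on r_total = r_extrinsic + intrinsic_reward(...).
</code>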
  
**Graph networks**

  Graph Neural Networks: A Review of Methods and Applications - https://arxiv.org/abs/1812.08434
  Relational inductive biases, deep learning, and graph networks - https://arxiv.org/pdf/1806.01261 (message-passing sketch after this list)
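
Both graph-network papers build on the same message-passing pattern; below is a minimal sketch of one round of mean-aggregation message passing over a dense adjacency matrix. The weight matrices and ReLU nonlinearity are illustrative choices, not a specific model from either paper.

<code python>
import numpy as np

def message_passing_step(H, A, W_self, W_neigh):
    """One message-passing round.

    H: (num_nodes, d) node features; A: (num_nodes, num_nodes) 0/1 adjacency.
    Each node averages its neighbours' features, then combines that message
    with its own features through two linear maps and a ReLU.
    """
    degree = A.sum(axis=1, keepdims=True).clip(min=1)   # avoid dividing by zero
    messages = (A @ H) / degree                          # mean over neighbours
    return np.maximum(H @ W_self + messages @ W_neigh, 0.0)

n, d = 5, 8
H = np.random.randn(n, d)
A = (np.random.rand(n, n) < 0.3).astype(float)
H_next = message_passing_step(H, A, np.random.randn(d, d), np.random.randn(d, d))
</code>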
  
**Optimization / training**

  Averaging weights leads to wider optima and better generalization - http://arxiv.org/pdf/1803.05407v2.pdf
  The loss surface of multilayer networks - https://arxiv.org/pdf/1412.0233
  Visualizing the Loss Landscape of Neural Nets - https://arxiv.org/pdf/1712.09913v3.pdf
  The Matrix Calculus You Need For Deep Learning - https://arxiv.org/pdf/1802.01528v3.pdf
  Group Norm - https://arxiv.org/pdf/1803.08494v3.pdf (sketch after this list)
  Kalman Normalization: Normalizing internal representations across network layers
  MetaReg: Towards Domain Generalization using Meta-Regularization
  AutoAugment - https://arxiv.org/abs/1805.09501
  A Disciplined Approach to Neural Network Hyper-Parameters: Part 1 - http://arxiv.org/pdf/1803.09820v2.pdf
  (Direct) Feedback Alignment
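
A minimal numpy sketch of Group Normalization as described in the Group Norm paper: channels are split into groups and each group is normalized over (channels-in-group, H, W) per sample, followed by the usual per-channel scale and shift. The shapes and group count below are placeholders.

<code python>
import numpy as np

def group_norm(x, num_groups, gamma, beta, eps=1e-5):
    """x: (N, C, H, W); gamma, beta: (C,). C must be divisible by num_groups."""
    N, C, H, W = x.shape
    g = x.reshape(N, num_groups, C // num_groups, H, W)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)         # per sample, per group
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    x_norm = g.reshape(N, C, H, W)
    return x_norm * gamma.reshape(1, C, 1, 1) + beta.reshape(1, C, 1, 1)

y = group_norm(np.random.randn(2, 8, 4, 4), num_groups=4,
               gamma=np.ones(8), beta=np.zeros(8))
</code>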
  
**Geometric deep learning**

  Geometric deep learning: going beyond Euclidean data - https://arxiv.org/pdf/1611.08097.pdf
  Convolutional Neural Networks on Surfaces via Seamless Toric Covers
  SchNet: A continuous-filter convolutional neural network for modeling quantum interactions
  Deriving Neural Architectures from Sequence and Graph Kernels
  CayleyNets: Graph convolutional neural networks with complex rational spectral filters
  Deep Functional Maps: Structured Prediction for Dense Shape Correspondence
  Geometric matrix completion with recurrent multi-graph neural networks
  Neural Message Passing for Quantum Chemistry
  Deep Learning on Lie Groups for Skeleton-based Action Recognition
   ​   ​
**Other**

  Bayesian neural networks?