This project aims to examine the complexity of neural network architectures and to leverage structure that improves the training phase.
My notes on many things ML and DL
- [ ] Neural Architecture Search with Reinforcement Learning https://openreview.net/pdf?id=r1Ue8Hcxg
- [ ] Deep Convolutional Neural Networks for LVCSR
- [ ] Introspection: Accelerating Neural Network Training by Learning Weight Evolution
- [ ] Incremental Growth of Semantic Branches on CNNs via Multi-Shot Learning (Quanshi Zhang, Ruiming Cao, Ying Nian Wu and Song-Chun Zhu)
- [ ] Unsupervised Large Graph Embedding (Feiping Nie, Wei Zhu and Xuelong Li)
- [ ] Regularization for Unsupervised Deep Neural Nets (Baiyang Wang and Diego Klabjan)
- [ ] Efficient Hyperparameter Optimization of Deep Learning Algorithms Using Deterministic RBF Surrogates (Ilija Ilievski, Jiashi Feng, Taimoor Akhtar and Christine Shoemaker)
- [ ] Tunable Sensitivity to Large Errors in Neural Network Training (Gil Keren, Sivan Sabato and Björn Schuller)
- [ ] Understanding the Semantic Structures of Tables with a Hybrid Deep Neural Network Architecture (Kyosuke Nishida, Kugatsu Sadamitsu, Ryuichiro Higashinaka and Yoshihiro Matsuo)
Feature engineering - the data you have may contain all the information the model needs, but not in a form the model can leverage.
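As a minimal sketch of this idea: a raw timestamp and price carry useful signal, but a model consumes them more easily as derived features. The field names `timestamp` and `price` below are hypothetical, chosen only for illustration.

```python
import datetime
import math

def engineer_features(raw):
    """Derive model-ready features from a raw record.

    Illustrative sketch: assumes `raw` is a dict with an ISO-format
    'timestamp' string and a non-negative 'price' (hypothetical fields).
    """
    ts = datetime.datetime.fromisoformat(raw["timestamp"])
    return {
        "hour": ts.hour,                       # time-of-day signal
        "is_weekend": ts.weekday() >= 5,       # Saturday/Sunday flag
        "log_price": math.log1p(raw["price"]), # compress heavy-tailed scale
    }
```

The raw values were always "there"; the transformation just exposes them in a mode the model can use.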
[Deep Learning](http://neuralnetworksanddeeplearning.com/chap6.html)
DL Bells and Whistles: Neural Network Hyper-Parameters
It has been shown that the use of computer clusters for hyper-parameter selection can have an important effect on results (Pinto et al., 2009).
We define a hyper-parameter for a learning algorithm A as a value to be selected prior to the actual application of A to the data, a value that is not directly selected by the learning algorithm itself.
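By this definition, hyper-parameters are chosen by an outer loop around the learning algorithm. A minimal random-search sketch of that outer loop (the names `train_and_eval` and `space` are illustrative, not a library API):

```python
import random

def random_search(train_and_eval, space, n_trials=20, seed=0):
    """Minimal random hyper-parameter search.

    `space` maps each hyper-parameter name to a list of candidate values;
    `train_and_eval(params)` is assumed to run the learning algorithm and
    return a validation score (higher is better).
    """
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Hyper-parameters are fixed *before* the algorithm runs...
        params = {name: rng.choice(values) for name, values in space.items()}
        # ...and only the outer loop, not the algorithm itself, selects them.
        score = train_and_eval(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score
```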
Matrices:
- Hessian matrix
- Gauss-Newton matrix
- Fisher information matrix
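For a loss $L(\theta)$, these three curvature matrices can be written as follows (a standard formulation, stated here as background rather than taken from these notes):

```latex
H_{ij} = \frac{\partial^2 L(\theta)}{\partial \theta_i \,\partial \theta_j},
\qquad
G = J^\top H_z J,
\qquad
F = \mathbb{E}_{y \sim p_\theta(\cdot \mid x)}\!\left[
      \nabla_\theta \log p_\theta(y \mid x)\,
      \nabla_\theta \log p_\theta(y \mid x)^\top
    \right],
```

where $J$ is the Jacobian of the network outputs $z$ with respect to $\theta$ and $H_z$ is the Hessian of the loss with respect to $z$. The Gauss-Newton and Fisher matrices are common positive semi-definite approximations to the full Hessian.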
What counts as a long time? In reference to standard deep learning tasks on standard datasets such as MNIST,
On ReLU (rectified linear unit)
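The ReLU activation and its gradient are simple enough to sketch directly (using NumPy here for illustration):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): passes positive inputs through, zeroes the rest.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is 1 where x > 0 and 0 elsewhere (0 at x == 0 by convention).
    return (x > 0).astype(float)
```

The piecewise-constant gradient is what makes ReLU cheap and helps avoid the vanishing gradients of saturating activations like sigmoid.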
Weight evolution - this is an interesting concept.
To avoid the warning, let's install TensorFlow from source.
* Dependencies
- Install Bazel: brew install bazel (macOS)
- Once installed, you can upgrade to a newer version of Bazel on Linux with: sudo apt-get upgrade bazel
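With Bazel in place, a from-source build that enables the CPU instructions the warning mentions looks roughly like this (a sketch: the exact `--copt` flags depend on your CPU, and the pip-package target path matches TensorFlow releases of this era):

```shell
# Configure the build (answers prompts about CUDA, paths, etc.)
./configure

# Build with optimizations and SSE4.2 enabled, then package and install.
bazel build -c opt --copt=-msse4.2 //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
pip install /tmp/tensorflow_pkg/tensorflow-*.whl
```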
tensorflow/core/platform/cpu_feature_guard.cc:45
The TensorFlow library wasn't compiled to use SSE4.2 instructions

Invalid path to CUDA 8.0 toolkit. /usr/local/cuda/lib/libcudart.8.0.dylib cannot be found