This repo contains from-scratch NumPy implementations of several ML algorithms, initially built for the MNIST digit classification task (~3% error, averaged over 20 runs of 5-fold cross-validation).
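The reported error figure comes from repeated k-fold cross-validation. A minimal sketch of that protocol, assuming a generic `fit`/`predict` pair (the function names here are illustrative, not the repo's actual API):

```python
import numpy as np

def kfold_error(X, y, fit, predict, k=5, n_runs=20, seed=0):
    """Average error rate over n_runs of k-fold cross-validation.

    `fit(X_tr, y_tr) -> model` and `predict(model, X_te) -> labels`
    are placeholders for any classifier; names are illustrative.
    """
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(n_runs):
        # fresh random shuffle for each run
        idx = rng.permutation(len(X))
        folds = np.array_split(idx, k)
        for i in range(k):
            test_idx = folds[i]
            train_idx = np.concatenate(folds[:i] + folds[i + 1:])
            model = fit(X[train_idx], y[train_idx])
            preds = predict(model, X[test_idx])
            errors.append(np.mean(preds != y[test_idx]))
    return float(np.mean(errors))
```

Averaging over multiple shuffled runs (rather than a single k-fold pass) reduces the variance introduced by any one particular fold assignment.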

- [x] kNN
- [x] 3-layer MLP (ReLU & softmax activations), cross-entropy loss
- [x] Least squares
- [x] Winnow
- [x] One-vs-one and one-vs-all multiclass kernel perceptron
- [x] Logistic regression with AdaGrad optimiser
- [ ] SVM (primal + dual)

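Of the algorithms above, Winnow's multiplicative update is the least commonly seen. A minimal sketch under the standard assumptions (binary {0, 1} features and labels, threshold equal to the input dimension; function names are illustrative, not this repo's API):

```python
import numpy as np

def winnow_train(X, y, n_epochs=10):
    """Winnow for labels y in {0, 1} and binary features X in {0, 1}.

    Weights start at 1 and the threshold is the input dimension n;
    on a mistake, the weights of the active features are doubled
    (missed positive) or halved (false positive).
    """
    n = X.shape[1]
    w = np.ones(n)
    for _ in range(n_epochs):
        for x, label in zip(X, y):
            pred = 1 if w @ x >= n else 0
            if pred != label:
                # promote on a missed positive, demote on a false positive
                w[x == 1] *= 2.0 if label == 1 else 0.5
    return w

def winnow_predict(X, w):
    return (X @ w >= X.shape[1]).astype(int)
```

The multiplicative update gives Winnow a mistake bound that grows only logarithmically with the number of irrelevant features, which is its main advantage over the perceptron's additive rule.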
## Additional Functions

Files: `CV.py` and `helper_functions.py`

- [x] Random train/test split
- [x] Cross-validation
- [x] Gram matrix (for polynomial and Gaussian kernels)
- [x] Numerical gradient check

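A numerical gradient check compares an analytic gradient against central finite differences. A minimal sketch (the signature and tolerance are illustrative, not the repo's actual implementation):

```python
import numpy as np

def numerical_grad_check(f, grad_f, x, eps=1e-6, tol=1e-5):
    """Return True if grad_f agrees with central differences of f at x.

    f: scalar-valued function of a 1-D array
    grad_f: its claimed analytic gradient
    """
    x = x.astype(float)
    num_grad = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        # central difference: O(eps^2) truncation error
        num_grad[i] = (f(x + e) - f(x - e)) / (2 * eps)
    ana_grad = grad_f(x)
    # relative error, guarded against division by zero
    rel_err = np.abs(num_grad - ana_grad) / np.maximum(
        1e-12, np.abs(num_grad) + np.abs(ana_grad))
    return bool(np.max(rel_err) < tol)
```

The relative (rather than absolute) error makes the check meaningful across gradient components of very different magnitudes.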
TODO: change the output type from error rate to predictions