1 - Sigmoid function, implemented with both the math library and the NumPy library.
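The two versions might look like the sketch below; the names `basic_sigmoid` and `sigmoid` are illustrative, not confirmed by the source:

```python
import math
import numpy as np

def basic_sigmoid(x):
    """Sigmoid of a scalar using the math library (raises on arrays)."""
    return 1 / (1 + math.exp(-x))

def sigmoid(x):
    """Vectorized sigmoid using NumPy; accepts scalars or arrays."""
    return 1 / (1 + np.exp(-x))
```

The NumPy version is the one used in practice, since inputs in deep learning are almost always arrays.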
2 - Derivative of the sigmoid function: computes the gradient of the sigmoid with respect to its input x. Derivatives are used in backpropagation, the technique used to optimize a network's parameters.
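Using the identity sigma'(x) = sigma(x) * (1 - sigma(x)), a minimal sketch (the name `sigmoid_derivative` is assumed) could be:

```python
import numpy as np

def sigmoid_derivative(x):
    """Gradient of the sigmoid with respect to x, via s * (1 - s)."""
    s = 1 / (1 + np.exp(-x))
    return s * (1 - s)
```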
3 - Reshaping function: takes an array of shape (length, height, 3) and returns a vector of shape (length*height*3, 1).
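A sketch of such a function, assuming a NumPy array input (the name `image2vector` is illustrative):

```python
import numpy as np

def image2vector(image):
    """Flatten an array of shape (length, height, 3) into (length*height*3, 1)."""
    return image.reshape(image.shape[0] * image.shape[1] * image.shape[2], 1)
```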
4 - Row normalization function: data is often normalized before being fed into a model; this generally speeds up learning and leads to faster convergence.
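One common choice is to divide each row by its L2 norm, which might be sketched as follows (the name `normalize_rows` is assumed):

```python
import numpy as np

def normalize_rows(x):
    """Divide each row of x by its L2 norm so every row has unit length."""
    norms = np.linalg.norm(x, axis=1, keepdims=True)  # shape (rows, 1), broadcasts
    return x / norms
```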
5 - Softmax function: a generalization of the logistic function to multiple dimensions/classes. It is used in multinomial logistic regression and is often the last activation function of a neural network, normalizing the output to a probability distribution over the predicted classes.
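A row-wise sketch; subtracting the per-row maximum before exponentiating is a standard numerical-stability trick and is an addition here, not necessarily part of the original:

```python
import numpy as np

def softmax(x):
    """Row-wise softmax: exp(x) normalized so each row sums to 1."""
    shifted = x - np.max(x, axis=1, keepdims=True)  # stability: avoids overflow in exp
    e = np.exp(shifted)
    return e / np.sum(e, axis=1, keepdims=True)
```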
6 - Element-wise multiplication function implemented in pure Python and in NumPy.
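The two variants might look like this (function names are illustrative); the NumPy version is vectorized and much faster on large arrays:

```python
import numpy as np

def multiply_loop(a, b):
    """Element-wise product using a plain Python loop."""
    return [a[i] * b[i] for i in range(len(a))]

def multiply_numpy(a, b):
    """Element-wise product, vectorized with NumPy."""
    return np.multiply(a, b)
```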
7 - Inner product (dot product) function implemented in pure Python and in NumPy.
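A sketch of both versions (names assumed): the loop accumulates a scalar, while `np.dot` does the same in one vectorized call:

```python
import numpy as np

def dot_loop(a, b):
    """Inner product accumulated element by element."""
    total = 0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

def dot_numpy(a, b):
    """Inner product using NumPy."""
    return np.dot(a, b)
```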
8 - Outer product function implemented in pure Python and in NumPy.
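A sketch of both versions (names assumed): nested loops build a (len(a), len(b)) matrix, which `np.outer` produces directly:

```python
import numpy as np

def outer_loop(a, b):
    """Outer product built with nested Python loops."""
    return [[a[i] * b[j] for j in range(len(b))] for i in range(len(a))]

def outer_numpy(a, b):
    """Outer product using NumPy."""
    return np.outer(a, b)
```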
9 - L1 loss function: Least Absolute Deviations
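The L1 loss sums the absolute differences between predictions and labels; a minimal sketch (the name `l1_loss` and argument order are assumptions):

```python
import numpy as np

def l1_loss(yhat, y):
    """Least Absolute Deviations: sum of |y - yhat| over all examples."""
    return np.sum(np.abs(y - yhat))
```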
10 - L2 loss function: Least Squares Error
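The L2 loss sums the squared differences instead, penalizing large errors more heavily than L1; a minimal sketch (the name `l2_loss` is an assumption):

```python
import numpy as np

def l2_loss(yhat, y):
    """Least Squares Error: sum of (y - yhat)^2 over all examples."""
    return np.sum((y - yhat) ** 2)
```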