bosgithub/Numpy_Operations


Numpy_Operations

Let's build some functions with NumPy.

1 - Sigmoid function, implemented with both the math library and the numpy library.
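A minimal sketch of the two variants (function names are illustrative). The key difference is that `math.exp` accepts only scalars, while `np.exp` broadcasts over arrays:

```python
import math
import numpy as np

def sigmoid_math(x):
    # Scalar-only: math.exp raises on arrays and lists.
    return 1 / (1 + math.exp(-x))

def sigmoid_np(x):
    # np.exp broadcasts, so x can be a scalar or an ndarray.
    return 1 / (1 + np.exp(-np.asarray(x, dtype=float)))
```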


2 - Derivative of the sigmoid function: computes the gradient of the sigmoid function with respect to its input x. Derivatives are used in backpropagation, which is the technique used for optimization.
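Since d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)), the derivative can reuse the sigmoid itself. A possible sketch:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-np.asarray(x, dtype=float)))

def sigmoid_derivative(x):
    # Gradient of the sigmoid: s * (1 - s), peaking at 0.25 when x = 0.
    s = sigmoid(x)
    return s * (1 - s)
```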


3 - Reshaping function: takes a matrix of shape (length, height, 3) and returns a vector of shape (length*height*3, 1).
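This kind of unrolling is a one-liner with `reshape`; one way it might look (the function name is illustrative):

```python
import numpy as np

def image2vector(image):
    # Unroll a (length, height, 3) array into a (length*height*3, 1)
    # column vector.
    length, height, channels = image.shape
    return image.reshape(length * height * channels, 1)
```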


4 - Row normalization function: data is often normalized before being fed into the model; this generally speeds up learning and leads to faster convergence.
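A common form of row normalization divides each row by its Euclidean norm; a sketch using broadcasting:

```python
import numpy as np

def normalize_rows(x):
    # Compute the L2 norm of each row, keeping the (n, 1) shape so that
    # broadcasting stretches it across the columns of x.
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / norms
```

After this, every row is a unit vector.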


5 - Softmax function: a generalization of the logistic function to multiple dimensions/classes. It is used in multinomial logistic regression and is often the last activation function of a neural network, normalizing the network's output into a probability distribution over the predicted output classes.
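A numerically stable sketch: subtracting the row-wise maximum before exponentiating avoids overflow, and the shift cancels in the ratio:

```python
import numpy as np

def softmax(x):
    x = np.asarray(x, dtype=float)
    # Shift by the row-wise max for numerical stability.
    shifted = x - np.max(x, axis=-1, keepdims=True)
    e = np.exp(shifted)
    # Each row sums to 1, forming a probability distribution.
    return e / np.sum(e, axis=-1, keepdims=True)
```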


6 - Element-wise multiplication function, implemented in pure Python and in numpy.
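The two versions might look like this (names are illustrative); the NumPy call is equivalent to `a * b` on arrays:

```python
import numpy as np

def multiply_py(a, b):
    # Pure-Python loop over paired elements.
    return [ai * bi for ai, bi in zip(a, b)]

def multiply_np(a, b):
    # Vectorized element-wise product.
    return np.multiply(a, b)
```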


7 - Inner product (dot product) function, implemented in pure Python and in numpy.
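The dot product is the sum of pairwise products; a sketch of both variants:

```python
import numpy as np

def dot_py(a, b):
    # Accumulate the sum of pairwise products.
    total = 0
    for ai, bi in zip(a, b):
        total += ai * bi
    return total

def dot_np(a, b):
    # Vectorized inner product.
    return np.dot(a, b)
```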


8 - Outer product function, implemented in pure Python and in numpy.
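The outer product of vectors a and b is the matrix with entries a[i] * b[j]; one possible sketch:

```python
import numpy as np

def outer_py(a, b):
    # result[i][j] = a[i] * b[j]
    return [[ai * bj for bj in b] for ai in a]

def outer_np(a, b):
    return np.outer(a, b)
```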


9 - L1 loss function: Least Absolute Deviations
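The L1 loss sums the absolute differences between predictions and targets; a minimal sketch (argument names are illustrative):

```python
import numpy as np

def l1_loss(yhat, y):
    # Sum of absolute differences (least absolute deviations).
    return np.sum(np.abs(np.asarray(y) - np.asarray(yhat)))
```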


10 - L2 loss function: Least Square Errors
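The L2 loss instead sums the squared differences, penalizing large errors more heavily; a possible sketch:

```python
import numpy as np

def l2_loss(yhat, y):
    diff = np.asarray(y) - np.asarray(yhat)
    # Sum of squared differences (least square errors);
    # np.dot(diff, diff) would compute the same value.
    return np.sum(diff ** 2)
```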

About

Some functions implemented using the NumPy library.
