Mekala02/Neural-Network-Implementation

Features

You can easily add layers to the network.
You can save and load your trained network.

Activation Functions:

Sigmoid
ReLu
Softmax
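
For reference, these three activations can be sketched with NumPy; these are generic textbook definitions, not the repository's exact code:

```python
import numpy as np

def sigmoid(z):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Zeroes out negative inputs, passes positives through
    return np.maximum(0, z)

def softmax(z):
    # Subtract the max for numerical stability, then normalize to a distribution
    e = np.exp(z - np.max(z, axis=0, keepdims=True))
    return e / np.sum(e, axis=0, keepdims=True)
```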

Regularization:

L2
Dropout (the drop probability can be set per hidden layer)
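
Both techniques can be sketched as below; `l2_grad_term` and `dropout_forward` are illustrative names, not functions from this repository. Note that inverted dropout works with a keep probability, i.e. `1 - drop_prob`:

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_grad_term(W, lam, m):
    # L2 regularization adds (lam / m) * W to each weight gradient,
    # shrinking weights toward zero (m = number of training examples)
    return (lam / m) * W

def dropout_forward(a, keep_prob):
    # Inverted dropout: zero each unit with probability (1 - keep_prob),
    # then rescale so the expected activation is unchanged
    mask = (rng.random(a.shape) < keep_prob).astype(a.dtype)
    return a * mask / keep_prob
```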

Optimizers:

Momentum
RMS (RMSProp)
Adam
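
Adam combines the other two ideas: a momentum-style first-moment average (decayed by a beta such as Momentum_B) and an RMS-style second-moment average (decayed by a beta such as RMS_B). A minimal sketch of one Adam step, following the textbook update rather than this repository's code:

```python
import numpy as np

def adam_step(w, grad, m, v, t, alpha=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and its square
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    # Bias correction counteracts the zero initialization of m and v
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    # Per-parameter step scaled by the RMS of recent gradients
    w = w - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```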

Weight initialization: (Kernel Initializer)

He
Xavier
(The kernel initializer can be chosen per layer.)
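
Both schemes draw weights from a zero-mean Gaussian and differ only in the variance scale; a minimal sketch (illustrative only, assuming a dense weight matrix of shape fan_out × fan_in):

```python
import numpy as np

rng = np.random.default_rng(0)

def init_weights(fan_in, fan_out, kernel_initializer="He"):
    # He: variance 2 / fan_in (suits ReLU layers)
    # Xavier: variance 1 / fan_in (suits sigmoid/tanh layers)
    scale = 2.0 if kernel_initializer == "He" else 1.0
    return rng.standard_normal((fan_out, fan_in)) * np.sqrt(scale / fan_in)
```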

Usage:

Adding Layer:

add_layer(node_count, activation, input_len=0, kernel_initializer=None)

node_count: The layer's node count.
activation: Activation function for this layer (Sigmoid, ReLu, Softmax); pass the function itself, not its name.
input_len: Input length for this layer (only needed for the first layer; the network fills it in automatically for later layers).
kernel_initializer: Weight initialization scheme (He, Xavier).

Training:

train(alpha, iteration, L2=0, dropout=0, optimizer=None, Momentum_B=0.9, RMS_B=0.999)

alpha: Learning rate for gradient descent.
iteration: Number of passes over the full training data.
L2: λ value for L2 regularization.
dropout: To use dropout, pass a list of dropout probabilities, one per layer (example: [0, 0.2, 0.2]).
optimizer: Optimizer to use ("Momentum", "RMS", or "Adam").
Momentum_B: Beta value for momentum.
RMS_B: Beta value for RMS.
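
To illustrate what alpha and iteration control, here is vanilla gradient descent on a toy one-dimensional objective (a hypothetical train_sketch helper, not this repository's train):

```python
def train_sketch(w, alpha, iteration):
    # Minimize (w - 3)^2 with plain gradient descent:
    # alpha scales each step, iteration counts full passes
    for _ in range(iteration):
        grad = 2 * (w - 3)   # derivative of the toy objective
        w = w - alpha * grad
    return w
```

With alpha=0.1 and iteration=100, w converges to the minimum at 3; too large an alpha would diverge, too few iterations would stop short.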

Saving and loading weights:

save_network(file_name)
load_network(file_name)

file_name: Name of the file to save or load the network's values.

Example Usage:

model = network()
model.add_layer(120, model.ReLu, input_len=81, kernel_initializer="He")
model.add_layer(120, model.ReLu, kernel_initializer="He")
model.add_layer(81, model.ReLu, kernel_initializer="He")
trainer = train(model)
trainer.load_data(x, y)
trainer.train(0.001, 10000, optimizer="Adam")
model.save_network("test")

About

This is my implementation of a neural network.
