Neural Network implemented from scratch in Python🔥
Here is the step-by-step explanation with code.
Thread🧵👇
Below is a simple neural network that consists of 2 layers:
- Hidden Layer
- Output Layer
First, initialize the layer sizes along with the weights & biases.
Also define the sigmoid activation function & its derivative, which is key to introducing non-linearity.
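A minimal NumPy sketch of this setup (the 2-3-1 layer sizes and the 0.1 scale are illustrative values, not from the thread):

import numpy as np

# Illustrative layer sizes: 2 inputs, 3 hidden neurons, 1 output
input_size, hidden_size, output_size = 2, 3, 1

# Randomly initialize weights, start biases at zero
W1 = np.random.randn(input_size, hidden_size) * 0.1   # input -> hidden weights
b1 = np.zeros((1, hidden_size))                        # hidden layer bias
W2 = np.random.randn(hidden_size, output_size) * 0.1   # hidden -> output weights
b2 = np.zeros((1, output_size))                        # output layer bias

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(s):
    # s is assumed to already be sigmoid(x), so the derivative is s * (1 - s)
    return s * (1 - s)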
Forward Pass:
Here the input data is passed through the neural network to obtain the predicted output.
In the forward pass, first calculate the pre-activation of the hidden layer.
hidden_input = (X•W1) + b1
Then apply the sigmoid activation to get the hidden layer's output.
hidden_output = sigmoid( (X•W1) + b1 )
The output layer repeats the same pattern with W2 & b2:
output = sigmoid( (hidden_output•W2) + b2 )
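A short sketch of this forward pass, reusing the W1, b1, W2, b2 and sigmoid defined above (X is assumed to be a batch of shape n_samples x input_size):

def forward(X):
    # hidden layer: linear step followed by sigmoid
    hidden_output = sigmoid(X @ W1 + b1)
    # output layer: same pattern with W2 & b2
    output = sigmoid(hidden_output @ W2 + b2)
    return hidden_output, output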
Backward Pass:
First compute the error at the output layer.
error = (y - output)
Then the gradient term at the output:
d_output = error * sigmoid_derivative(output)
Now calculate d_W2, which is the gradient of the loss function with respect to W2.
d_W2 = hidden_output.T • d_output
Similarly calculate d_W1, d_b2 & d_b1 (sketched in code after this list):
d_W1: gradient of the loss function wrt W1
d_b2: gradient of the loss function wrt b2 (bias of the output layer)
d_b1: gradient of the loss function wrt b1 (bias of the hidden layer)
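A sketch of these gradients, assuming a squared-error style loss so the output error is (y - output) as above; the hidden-layer error is obtained by pushing d_output back through W2:

def backward(X, y, hidden_output, output):
    # error term at the output layer
    d_output = (y - output) * sigmoid_derivative(output)

    # gradients for the output layer parameters
    d_W2 = hidden_output.T @ d_output
    d_b2 = np.sum(d_output, axis=0, keepdims=True)

    # propagate the error back through W2 to the hidden layer
    d_hidden = (d_output @ W2.T) * sigmoid_derivative(hidden_output)
    d_W1 = X.T @ d_hidden
    d_b1 = np.sum(d_hidden, axis=0, keepdims=True)

    return d_W1, d_b1, d_W2, d_b2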
Now update the weights:
Here the learning rate is the hyperparameter!
A low learning rate can cause the model to get stuck in local optima, while a high learning rate can cause the model to overshoot the optimal solution.
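A sketch of the update step. Because the error was defined as (y - output), the d_* terms above already point in the direction that reduces the loss, so they are added; the 0.1 learning rate is just an example value:

learning_rate = 0.1  # hyperparameter: example value, tune for your data

# d_W1, d_b1, d_W2, d_b2 come from the backward pass above
W1 += learning_rate * d_W1
b1 += learning_rate * d_b1
W2 += learning_rate * d_W2
b2 += learning_rate * d_b2

Repeating forward pass -> backward pass -> weight update over many epochs is the full training loop.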
You can now make Claude Code, Gemini CLI, and other coding agents 10x more powerful by giving them long‑term memory!
It just takes a single line of code.
Here’s a step‑by‑step breakdown (100% local):
Coding agents have a major limitation: they forget everything between sessions.
Without memory, they can’t retain project context, past fixes, or key decisions.
What they need is a persistent memory layer to store and recall context.
Let’s see how to set that up.
We’ll use Cipher, an open‑source memory layer for coding agents.
You can run Cipher as an MCP server, so coding agents like Claude Code, Gemini CLI, or plugins for VS Code, Cursor, and Windsurf can connect directly and use its memory layer out of the box.