
+ backpropagation
Mark-Kramer committed Aug 9, 2024
1 parent ae05cad commit 27eec51
Showing 12 changed files with 1,087 additions and 2 deletions.
243 changes: 243 additions & 0 deletions Backpropagation.ipynb
@@ -0,0 +1,243 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"---\n",
"title: Backpropagation\n",
"project:\n",
" type: website\n",
"format:\n",
" html:\n",
" code-fold: true\n",
" code-tools: true\n",
"jupyter: python 3\n",
"number-sections: true\n",
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this notebook, we'll implement a quick representation of the backpropagation algorithm for the simple two node network."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"from scipy.io import loadmat"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Steps to backpropagation\n",
"\n",
"We outlined 4 steps to perform backpropagation,\n",
"\n",
" 1. Choose random initial weights.\n",
" 2. Train the neural network on given input and output data.\n",
" 3. Update the weights.\n",
" 4. Repeat steps 2 & 3 many times.\n",
"\n",
"Let's now implement these steps in an example data set."
]
},
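{
"cell_type": "markdown",
"metadata": {},
"source": [
"To make step 3 concrete: a standard choice (and the one assumed in the completion sketch at the end of this notebook) is gradient descent on the squared error between the network output and the target,\n",
"\n",
"$$ E = \\tfrac{1}{2}(\\text{out} - \\text{target})^2, \\qquad w \\leftarrow w - \\alpha \\frac{\\partial E}{\\partial w}, $$\n",
"\n",
"where $\\alpha$ is the learning constant set in the training code below, and the partial derivative is computed for each weight by applying the chain rule backward through the network."
]
},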
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Load example data\n",
"\n",
"The training data is [backpropagation_example_data.mat](/Data/backpropagation_example_data.mat). Get these data, and load them:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"# Load the .mat file\n",
"data = loadmat('backpropagation_example_data.mat')\n",
"\n",
"# Extract the variables from the loaded data\n",
"in_true = data['in_true'].squeeze()\n",
"out_true = data['out_true'].squeeze() # .squeeze() removes any unnecessary dimensions"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here we acquire two variables:\n",
"\n",
"`in_true`: the true input to the hidden two-node neural network\n",
"\n",
"`out_true`: the true output of the hidden two-ndoe neural network\n",
"\n",
"The two-node neural network is hidden because we don't know the weights (`w[0]`, `w[1]`, and `w[2]`).\n",
"\n",
"Instead, all we observe are the pairs of inputs and outputs to this hidden neural network."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's look at some of these data:"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[[ -5.07721176 8.86495629]\n",
" [ -4.42151036 8.78851213]\n",
" [ 8.43023852 6.07694085]\n",
" ...\n",
" [ -1.09949946 8.25391663]\n",
" [ 2.72521796 7.36458545]\n",
" [-10.04105911 9.21913802]]\n"
]
}
],
"source": [
"print(np.transpose([in_true, out_true]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"These data were created by sending the inputs (`in_true`, the first column above) into a two-node neural network to produce the outputs (`out_true`, the second column above).\n",
"\n",
"Again, we do not know the weights of this network ... that's what we'd like to find.\n",
"\n",
"To do so, we'll use these data to train a neural network through back propagation."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# For training, first define two useful functions:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"def sigmoid(x):\n",
" return 1/(1+np.exp(-x)) # Define the sigmoid anonymous function.\n",
"\n",
"def feedforward(w, s0): # Define feedforward solution.\n",
" x1 = w[0]*s0 # ... activity of first neuron,\n",
" s1 = sigmoid(x1) # ... output of first neuron,\n",
" x2 = w[1]*s1 # ... activity of second neuron,\n",
" s2 = sigmoid(x2) # ... output of second neuron,\n",
" out= w[2]*s2 # Output of neural network.\n",
" return out,s1,s2"
]
},
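{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (not part of the original workflow), we can pass a single input through the network with arbitrary weights, for example the all-ones weights used to initialize training below, and confirm that `feedforward` returns the network output together with the two intermediate sigmoid outputs:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"out, s1, s2 = feedforward([1,1,1], in_true[0])  # Arbitrary weights, first observed input.\n",
"print(out, s1, s2)                              # Network output and the two sigmoid outputs."
]
},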
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Now, train the neural network with these (`in_true`, `out_true`) data."
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [],
"source": [
"w = [1,1,1] # Choose initial values for the weights.\n",
"alpha = 0.01 # Set the learning constant.\n",
"\n",
"K = np.size(in_true);\n",
"results = np.zeros([K,4]) # Define a variable to hold the results of each iteration. \n",
"\n",
"for k in np.arange(K):\n",
" s0 = in_true[k] # Define the input,\n",
" target = out_true[k] # ... and the target output.\n",
" \n",
" #Step 2. Calculate feedforward solution to get output.\n",
" \n",
" #Step 3. Update the weights.\n",
" w0 = w[0]; w1 = w[1]; w2 = w[2];\n",
" w[2] = \"SOMETHING\" \n",
" w[1] = \"SOMETHING\"\n",
" w[0] = \"SOMETHING\"\n",
" \n",
" # Save the results of this step. --------------------------------------\n",
" # Here we save the 3 weights, and the neural network output.\n",
" # results[k,:] = [w[0],w[1],w[2], out]\n",
"\n",
"# Plot the NN weights and error during training \n",
"# plt.clf()\n",
"# plt.plot(results[:,2], label='w2')\n",
"# plt.plot(results[:,1], label='w1')\n",
"# plt.plot(results[:,0], label='w0')\n",
"# plt.plot(results[:,3]-target, label='error')\n",
"# plt.legend() #Include a legend,\n",
"# plt.xlabel('Iteration number'); #... and axis label.\n",
"\n",
"# Print the NN weights\n",
"# print(results[-1,0:3])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Challenges\n",
"1. Complete the code above to determine the weights (`w[0]`, `w[1]`, and `w[2]`) of the hidden two-node neural network."
]
},
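{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you'd like to check your work, here is a sketch of one possible completion. It assumes gradient descent on the squared error $E = \\tfrac{1}{2}(\\text{out} - \\text{target})^2$, as sketched earlier, with the gradient obtained by the chain rule through the output weight and the two sigmoids; other cost functions or update schemes would give different expressions."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# A sketch of one possible completion, assuming a squared-error cost.\n",
"w = [1,1,1]                # Choose initial values for the weights.\n",
"alpha = 0.01               # Set the learning constant.\n",
"\n",
"K = np.size(in_true)\n",
"results = np.zeros([K,4]) # Hold the three weights and the output at each iteration.\n",
"\n",
"for k in np.arange(K):\n",
"    s0 = in_true[k]        # Define the input,\n",
"    target = out_true[k]   # ... and the target output.\n",
"\n",
"    # Step 2. Calculate the feedforward solution to get the output.\n",
"    out, s1, s2 = feedforward(w, s0)\n",
"\n",
"    # Step 3. Update the weights by gradient descent on the squared error.\n",
"    error = out - target                                   # dE/d(out)\n",
"    w0 = w[0]; w1 = w[1]; w2 = w[2]                        # Keep the current weights for the updates.\n",
"    w[2] = w2 - alpha*error*s2                             # dE/dw2 = error*s2,\n",
"    w[1] = w1 - alpha*error*w2*s2*(1-s2)*s1                # ... chain rule through the second sigmoid,\n",
"    w[0] = w0 - alpha*error*w2*s2*(1-s2)*w1*s1*(1-s1)*s0   # ... and through the first sigmoid.\n",
"\n",
"    results[k,:] = [w[0], w[1], w[2], out]  # Save the weights and the output.\n",
"\n",
"# Plot the weights and the error during training.\n",
"plt.plot(results[:,2], label='w2')\n",
"plt.plot(results[:,1], label='w1')\n",
"plt.plot(results[:,0], label='w0')\n",
"plt.plot(results[:,3]-out_true, label='error')\n",
"plt.legend()\n",
"plt.xlabel('Iteration number')\n",
"\n",
"# Print the estimated weights after training.\n",
"print(results[-1,0:3])"
]
},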
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.18"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
Binary file added Data/backpropagation_example_data.mat
Binary file not shown.
Binary file added Slides/Backpropagation_1.pdf
Binary file not shown.
2 changes: 2 additions & 0 deletions _quarto.yml
@@ -16,6 +16,8 @@ website:
      text: HH
    - href: Perceptron.html
      text: Perceptron
+   - href: Backpropagation.html
+     text: Backprop


format:
Binary file added backpropagation_example_data.mat
Binary file not shown.
