Commit e2555f9
Author: Horst Petschenig
Initial commit (0 parents)

32 files changed: +3390 −0 lines

LICENSE
Lines changed: 202 additions & 0 deletions

                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright 2024 The Authors of "Rapid learning with phase-change
   memory-based in-memory computing through learning-to-learn"

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.

README.md
Lines changed: 43 additions & 0 deletions

# Rapid learning with phase-change memory-based in-memory computing through learning-to-learn

This is the code repository for the paper

Rapid learning with phase-change memory-based in-memory computing through learning-to-learn
*Thomas Ortner, Horst Petschenig, Athanasios Vasilopoulos, Roland Renner, Spela Brglez, Thomas Limbacher, Enrique Pinero, Alejandro Linares Barranco, Angeliki Pantazi, Robert Legenstein*
[ArXiv Link](https://arxiv.org/abs/)

## Setup

You need [TensorFlow](https://www.tensorflow.org/) to run this code. We used Python 3.9 and TensorFlow 2.5. See the corresponding [Conda](https://docs.conda.io/en/latest/) `environment.yml` file to install all necessary dependencies:

    conda env create --file=environment.yml --name RapidLearningInMemoryComputing
    conda activate RapidLearningInMemoryComputing
    pip install --no-warn-conflicts -r requirements.txt

## Usage

### Few-shot image classification with PCM-based neuromorphic hardware

To start training on the few-shot image classification task, run

    cd few_shot_image_classification
    python main.py --seed 1234 --batch_size=32 --hidden_channels=56 --noise=False

To load an existing checkpoint, run

    cd few_shot_image_classification
    python main_omniglot.py --checkpoint checkpoints/pretrained-weights.pickle --seed 42 --batch_size=1 --hidden_channels=56 --noise=False --dataset_seed=128

### Rapid online learning of robot arm trajectories in biologically-inspired neural networks

To start training on the robotic arm online learning task, run

    cd online_learning_robot
    python main.py

To load an existing checkpoint, run

    cd online_learning_robot
    python main.py --checkpoint checkpoints/pretrained-weights.pickle

## Acknowledgements

This work was funded in part by the CHIST-ERA grant CHIST-ERA-18-ACAI-004, by the Austrian Science Fund (FWF) [10.55776/I4670-N], by grant PCI2019-111841-2 funded by MCIN/AEI/10.13039/501100011033, by SNSF under the project number 20CH21_186999 / 1, and by the European Union. For the purpose of open access, the author has applied a CC BY public copyright licence to any Author Accepted Manuscript version arising from this submission. This work was supported by NSF EFRI grant #2318152.

E. P.-F.'s work was supported by a "Formación de Profesorado Universitario" scholarship, reference number FPU19/04597, from the Spanish Ministry of Education, Culture and Sports. Furthermore, we thank the In-Memory Computing team at IBM for their technical support with the PCM-based NMHW, as well as the IBM Research AI Hardware Center. Moreover, we thank Joris Gentinetta for his help with the setup for the robotic arm experiments.

environment.yml
Lines changed: 34 additions & 0 deletions

name: RapidLearningInMemoryComputing
channels:
  - conda
  - anaconda
  - defaults
dependencies:
  - hdf5=1.10.6
  - python=3.9.5
  - readline=8.1
  - scipy=1.7.3
  - tk=8.6.10
  - pip:
      - aim
      - aim-ui
      - aimrecords
      - aimrocks
      - h5py==3.1.0
      - ipykernel==6.29.2
      - keras==2.6.0
      - matplotlib==3.4.2
      - matplotlib-inline==0.1.6
      - pandas==1.3.0
      - pandocfilters==1.5.1
      - pip==24.0
      - python-dateutil==2.8.2
      - python-json-logger==2.0.7
      - tensorboard==2.6.0
      - tensorflow==2.5.0
      - tensorflow-datasets==4.4.0
      - tensorflow-estimator==2.5.0
      - tensorflow-metadata==1.4.0
      - tensorflow-probability==0.13.0
      - tqdm==4.62.3
      - werkzeug==2.0.1
Lines changed: 30 additions & 0 deletions

import pickle as pkl

import numpy as np

from maml_trainee_omniglot import MAMLTraineeOmniglotConv


def load_checkpoint_conv(path, trainee: MAMLTraineeOmniglotConv):
    with open(path, 'rb') as f:
        weights = pkl.load(f)

    trainee.kernel_1.load(weights[0])
    trainee.kernel_2.load(weights[1])
    trainee.kernel_3.load(weights[2])
    trainee.kernel_4.load(weights[3])
    trainee.w_readout.load(weights[4])

    for i, w in enumerate(weights):
        print(f'Weight {i}', np.abs(w).sum())


def load_checkpoint_conv_only(path, trainee: MAMLTraineeOmniglotConv):
    with open(path, 'rb') as f:
        weights = pkl.load(f)

    trainee.kernel_1.load(weights[0])
    trainee.kernel_2.load(weights[1])
    trainee.kernel_3.load(weights[2])
    trainee.kernel_4.load(weights[3])

    # The readout weights are deliberately not loaded:
    # trainee.w_readout.load(weights[4])
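The loaders above assume the checkpoint pickle holds a plain list of weight arrays: four convolutional kernels followed by a readout matrix. A minimal, self-contained sketch of writing and reading a checkpoint in that format (the shapes here are illustrative, not taken from the repository):

```python
import io
import pickle

import numpy as np

# Hypothetical shapes for illustration; real kernels come from MAML training.
weights = [np.random.randn(3, 3, 1, 56).astype(np.float32) for _ in range(4)]
weights.append(np.random.randn(56, 5).astype(np.float32))  # readout matrix

# Round-trip through pickle (an in-memory buffer stands in for the file).
buf = io.BytesIO()
pickle.dump(weights, buf)
buf.seek(0)
restored = pickle.load(buf)

# The list structure and array contents survive the round trip.
print(len(restored), all(np.array_equal(a, b) for a, b in zip(weights, restored)))
# prints: 5 True
```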
Binary file not shown.
Lines changed: 81 additions & 0 deletions

import numpy as np
import tensorflow as tf

from einsums import tf_bidk_bkf_bdf_noise_factorout


@tf.function
def tf_bikf_bdk_bdf(image_shaped, kernel_flat):
    return tf.einsum("bkf, bidk->bidf", kernel_flat, image_shaped)


class ConvLayer(tf.keras.layers.Layer):
    """2D convolution layer for MAML and PCM devices.

    Two batch dimensions: outer and inner on the input, outer only on the kernel.
    Applies PCM read noise per dot product:
    tf_bidk_bkf_bdf_noise_factorout factors out the noise part and samples the noise afterwards.
    """

    def __init__(self, image_h, image_w, in_channels, kernel_dims, out_channels, padding=1, stride=1, noise=True,
                 **kwargs):
        super(ConvLayer, self).__init__(**kwargs)

        self.image_h = image_h
        self.image_w = image_w
        self.channels = in_channels
        self.kernel_dims = kernel_dims
        self.out_channels = out_channels
        self.padding = padding
        self.stride = stride
        self.noise = noise

        self.paddings = tf.constant([[0, 0], [0, 0], [padding, padding], [padding, padding], [0, 0]])

        self.out_height = int((image_h + 2 * padding - kernel_dims) / stride + 1)
        self.out_width = int((image_w + 2 * padding - kernel_dims) / stride + 1)

        if self.noise:
            self.mult_function = tf_bidk_bkf_bdf_noise_factorout
        else:
            self.mult_function = tf_bikf_bdk_bdf

        # Precompute the gather indices of the convolution patches
        all_in = []
        for center_index_h in range(padding, self.image_h - 1 + padding * 2, self.stride):
            for center_index_w in range(padding, self.image_w - 1 + padding * 2, self.stride):
                indices_kernel = []
                for k_h in range(self.kernel_dims):
                    for k_w in range(self.kernel_dims):
                        for k_c in range(self.channels):
                            indices_kernel.append((center_index_h + k_h - 1, center_index_w + k_w - 1, k_c))
                all_in.append(indices_kernel)

        self.all_in = tf.constant(np.asarray(all_in))

    def call(self, x, kernel):
        # Pad image with zeros
        x_pad = tf.pad(x, self.paddings, "CONSTANT")

        batch_dim = x.shape[0] * x.shape[1]

        # Flatten the index list; 3 index components because of 2D convolution over (h, w, c)
        indices = tf.reshape(self.all_in, (-1, 3))

        batch_inner_outer = x_pad.shape[:2]

        # Extract features in the order they get used by applying the convolution
        image_as = tf.gather_nd(tf.reshape(x_pad, (-1, *x_pad.shape[2:])),
                                tf.tile(indices[tf.newaxis, ...], [batch_dim, 1, 1]), batch_dims=1)

        # Reshape image to (batch, num_patches, flat_features) for the dot product; the number of
        # patches is width*height (it depends on padding and how many times the dot product is performed)
        image_ready_for_conv = tf.reshape(image_as, (*batch_inner_outer, self.all_in.shape[0], self.all_in.shape[1]))

        # Collapse kernel into (height*width*in_channels, out_channels)
        kernel_reshape = tf.reshape(kernel, (
            kernel.shape[0], self.kernel_dims * self.kernel_dims * self.channels, self.out_channels))

        conv_output_factor_out = self.mult_function(image_ready_for_conv, kernel_reshape)

        return tf.reshape(conv_output_factor_out,
                          (*batch_inner_outer, self.out_height, self.out_width, self.out_channels))
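ConvLayer expresses convolution as an explicit gather of kernel-sized patches followed by one batched dot product per output position (an im2col-style formulation), which is what lets a PCM read-noise sample be attached to each dot product. A small NumPy sketch of the same idea, checked against a direct sliding-window convolution (function names and shapes here are illustrative, not from the repository):

```python
import numpy as np


def im2col_conv(x, k, padding=1, stride=1):
    """Convolve x of shape (H, W, C) with k of shape (kh, kw, C, F) via gather + matmul."""
    kh, kw, c, f = k.shape
    xp = np.pad(x, ((padding, padding), (padding, padding), (0, 0)))
    oh = (x.shape[0] + 2 * padding - kh) // stride + 1
    ow = (x.shape[1] + 2 * padding - kw) // stride + 1
    # Gather one flattened patch per output position ("image ready for conv").
    patches = np.stack([
        xp[i * stride:i * stride + kh, j * stride:j * stride + kw, :].ravel()
        for i in range(oh) for j in range(ow)
    ])                                           # (oh*ow, kh*kw*c)
    # Collapse the kernel and do one dot product per output position.
    out = patches @ k.reshape(kh * kw * c, f)
    return out.reshape(oh, ow, f)


def direct_conv(x, k, padding=1, stride=1):
    """Reference implementation: explicit sliding window."""
    kh, kw, c, f = k.shape
    xp = np.pad(x, ((padding, padding), (padding, padding), (0, 0)))
    oh = (x.shape[0] + 2 * padding - kh) // stride + 1
    ow = (x.shape[1] + 2 * padding - kw) // stride + 1
    out = np.zeros((oh, ow, f))
    for i in range(oh):
        for j in range(ow):
            window = xp[i * stride:i * stride + kh, j * stride:j * stride + kw, :]
            out[i, j] = np.tensordot(window, k, axes=([0, 1, 2], [0, 1, 2]))
    return out


rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5, 2))
k = rng.standard_normal((3, 3, 2, 4))
print(np.allclose(im2col_conv(x, k), direct_conv(x, k)))  # True
```

In this formulation per-dot-product noise is straightforward: noise can be applied to each row of `patches @ k_flat` independently, which is not possible when the convolution is computed as a single fused kernel call.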
