This is what I encountered when running the script. Can anyone help me resolve this issue?
```
Layer (type)                          Output Shape     Param #   Connected to
==================================================================================================
 input (InputLayer)                   [(None, 784)]    0         []
 encoder_0 (Dense)                    (None, 500)      392500    ['input[0][0]']
 encoder_1 (Dense)                    (None, 500)      250500    ['encoder_0[0][0]']
 encoder_2 (Dense)                    (None, 2000)     1002000   ['encoder_1[0][0]']
 encoder_3 (Dense)                    (None, 10)       20010     ['encoder_2[0][0]']
 tf.expand_dims (TFOpLambda)          (None, 1, 10)    0         ['encoder_3[0][0]']
 tf.math.subtract (TFOpLambda)        (None, 10, 10)   0         ['tf.expand_dims[0][0]']
 tf.math.square (TFOpLambda)          (None, 10, 10)   0         ['tf.math.subtract[0][0]']
 tf.math.reduce_sum (TFOpLambda)      (None, 10)       0         ['tf.math.square[0][0]']
 tf.math.truediv (TFOpLambda)         (None, 10)       0         ['tf.math.reduce_sum[0][0]']
 tf.__operators__.add (TFOpLambda)    (None, 10)       0         ['tf.math.truediv[0][0]']
 tf.math.truediv_1 (TFOpLambda)       (None, 10)       0         ['tf.__operators__.add[0][0]']
 tf.math.pow (TFOpLambda)             (None, 10)       0         ['tf.math.truediv_1[0][0]']
 tf.compat.v1.transpose (TFOpLambda)  (10, None)       0         ['tf.math.pow[0][0]']
 tf.math.reduce_sum_1 (TFOpLambda)    (None,)          0         ['tf.math.pow[0][0]']
 tf.math.truediv_2 (TFOpLambda)       (10, None)       0         ['tf.compat.v1.transpose[0][0]',
                                                                  'tf.math.reduce_sum_1[0][0]']
 tf.compat.v1.transpose_1 (TFOpLambda) (None, 10)      0         ['tf.math.truediv_2[0][0]']
==================================================================================================
Total params: 1,665,010
Trainable params: 1,665,010
Non-trainable params: 0
```

```
Update interval 140
Save interval 1365
Initializing cluster centers with k-means.
2188/2188 [==============================] - 10s 4ms/step
Traceback (most recent call last):
  File "DEC.py", line 335, in <module>
    y_pred = dec.fit(x, y=y, tol=args.tol, maxiter=args.maxiter, batch_size=args.batch_size,
  File "DEC.py", line 210, in fit
    self.model.get_layer(name='clustering').set_weights([kmeans.cluster_centers_])
  File "/research/DEC_Pytorch_tutorial/dec_venv/lib/python3.8/site-packages/keras/engine/training.py", line 3353, in get_layer
    raise ValueError(
ValueError: No such layer: clustering. Existing layers are: ['input', 'encoder_0', 'encoder_1',
'encoder_2', 'encoder_3', 'tf.expand_dims', 'tf.math.subtract', 'tf.math.square',
'tf.math.reduce_sum', 'tf.math.truediv', 'tf.__operators__.add', 'tf.math.truediv_1',
'tf.math.pow', 'tf.compat.v1.transpose', 'tf.math.reduce_sum_1', 'tf.math.truediv_2',
'tf.compat.v1.transpose_1'].
```
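For context: the error means no layer named `clustering` exists in the built model. The run of auto-named `TFOpLambda` entries in the summary suggests the clustering computation was traced as raw TensorFlow ops on the Keras tensor rather than as a named custom `Layer` (often a symptom of a Keras/TF version mismatch). Below is a minimal sketch, not the DEC code, showing that when the computation lives inside a `Layer` subclass with `name='clustering'`, `get_layer` resolves it; the class body here is a toy stand-in, and all names are illustrative:

```python
import tensorflow as tf

class ClusteringLayer(tf.keras.layers.Layer):
    """Toy stand-in for a clustering layer: holds trainable cluster centers."""
    def __init__(self, n_clusters, **kwargs):
        super().__init__(**kwargs)
        self.n_clusters = n_clusters

    def build(self, input_shape):
        # Trainable centers, one row per cluster (this is what
        # set_weights([kmeans.cluster_centers_]) would overwrite).
        self.clusters = self.add_weight(
            name='clusters', shape=(self.n_clusters, int(input_shape[-1])))

    def call(self, inputs):
        # Placeholder computation; DEC actually uses a Student's t-distribution
        # similarity to the centers.
        return tf.matmul(inputs, self.clusters, transpose_b=True)

inputs = tf.keras.Input(shape=(10,), name='input')
outputs = ClusteringLayer(3, name='clustering')(inputs)
model = tf.keras.Model(inputs, outputs)

# Because the computation is wrapped in a named Layer, the lookup succeeds.
print(model.get_layer(name='clustering').name)

# Had the same math been written with bare tf.* ops on the Keras tensor,
# Keras would wrap each op in an auto-named TFOpLambda layer, and
# get_layer(name='clustering') would raise the ValueError shown above.
```

If the custom layer's `call` is being inlined into `TFOpLambda` ops instead of appearing as a single layer, pinning the TF/Keras versions the repository was written for is the first thing worth trying.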
Hi @CoasterJX, did you solve this issue? I'm having the same error now. I really hope the author can chime in and provide a solution. Thanks.
@XifengGuo Hi Xifeng, could you help take a look at this issue? Much appreciated!