This case is not yet supported in Model.save() / Model.load().
Please avoid using Model.save() / Model.load() to save / load models that contain such a Lambda layer. Instead, you may use Model.save_weights() / Model.load_weights() to save / load the model weights.
Note: in this case, fn_weights should be a list, so that the trainable weights in this Lambda layer can be added into the weights of the whole model.
- >>> vara = [tf.Variable(1.0)]
+ >>> a = tf.Variable(1.0)
  >>> def func(x):
- >>>     return x + vara
+ >>>     return x + a
  >>> x = tl.layers.Input([8, 3], name='input')
- >>> y = tl.layers.Lambda(func, fn_weights=a, name='lambda')(x)
+ >>> y = tl.layers.Lambda(func, fn_weights=[a], name='lambda')(x)
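A hedged sketch of the same idea in plain Keras (not TensorLayer, which may not be installed): a Lambda layer that closes over an external tf.Variable computes correctly in the forward pass, but the captured variable is exactly the kind of state that whole-model serialization cannot reconstruct, which is why the text above recommends save_weights() / load_weights(). All names here (`a`, `model`) are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

# An external variable captured by the Lambda's function, mirroring the
# fn_weights example above (illustrative sketch, not the original code).
a = tf.Variable(1.0)

inp = tf.keras.Input(shape=(3,))
# The Lambda closes over `a`; the forward pass works, but model.save()
# cannot serialize this captured state.
out = tf.keras.layers.Lambda(lambda t: t + a)(inp)
model = tf.keras.Model(inp, out)

x = np.ones((2, 3), dtype=np.float32)
y = model(x)  # each element is 1.0 + 1.0
```

Saving only the weights with `model.save_weights(path)` and restoring them into a freshly rebuilt model of the same architecture sidesteps the serialization problem entirely.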

- Parametric case, merge other wrappers into TensorLayer
+ Parametric case, merge other wrappers into TensorLayer:

This case is supported by Model.save() / Model.load() to save / load the whole model architecture and weights (optional).
>>> layers = [
@@ -74,27 +74,27 @@ class Lambda(Layer):
>>> perceptron = tf.keras.Sequential(layers)
>>> # in order to compile the keras model and get trainable_variables of the keras model
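The example above is truncated, so here is a hedged, self-contained sketch of the step it describes: building a small Keras Sequential and materializing its trainable_variables (which TensorLayer's Lambda would then receive via fn_weights). The layer sizes and input shape are assumptions, not the original's.

```python
import numpy as np
import tensorflow as tf

# A small perceptron; the exact layers are illustrative assumptions.
layers = [
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
]
perceptron = tf.keras.Sequential(layers)

# Calling the model once builds it, so trainable_variables is populated;
# this is what the "compile ... and get trainable_variables" comment means.
_ = perceptron(np.ones((4, 3), dtype=np.float32))
print(len(perceptron.trainable_variables))  # 2 Dense layers x (kernel + bias) = 4
```

Once built, this list of variables is exactly what would be passed as fn_weights so the wrapped Keras model's weights are tracked by the surrounding TensorLayer model.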