Retether Keras 3.8.0 #1485

Merged: 40 commits, Jan 23, 2025

Changes from all commits (40 commits)
e407ae3
iou metrics now default to int dtype
t-kalinowski Jan 23, 2025
200d139
support new export formats: onnx model export; h5 model weight file
t-kalinowski Jan 23, 2025
162ba09
`layer_random_contrast()` gains arg `value_range`
t-kalinowski Jan 23, 2025
8f88adf
clarify layer_random_rotation docs
t-kalinowski Jan 23, 2025
7975d2b
doc improvements
t-kalinowski Jan 23, 2025
2790a0e
`loss_tversky()` gains `axis` arg
t-kalinowski Jan 23, 2025
3d5f92d
updates to random tensor generator seeds
t-kalinowski Jan 23, 2025
f73d26f
updates to `random_seed_generator()`
t-kalinowski Jan 23, 2025
9a4e7df
`op_istft()` updates
t-kalinowski Jan 23, 2025
2494875
redocument
t-kalinowski Jan 23, 2025
728cc36
add `activation_sparse_plus`
t-kalinowski Jan 23, 2025
033061e
add `activation_sparsemax`
t-kalinowski Jan 23, 2025
d2b304d
add `activation_threshold`
t-kalinowski Jan 23, 2025
aa7dcfc
redocument
t-kalinowski Jan 23, 2025
45a3a34
add `op_sparse_plus()`
t-kalinowski Jan 23, 2025
802a720
add `op_sparsemax()`
t-kalinowski Jan 23, 2025
c34005d
add `op_threshold()`
t-kalinowski Jan 23, 2025
f86e96a
tether diffs
t-kalinowski Jan 23, 2025
1bcf8f0
redocument
t-kalinowski Jan 23, 2025
51bbe4b
add `op_diagflat`
t-kalinowski Jan 23, 2025
513ad38
add `op_unravel_index()`
t-kalinowski Jan 23, 2025
fd3f445
add `layer_equalization()`
t-kalinowski Jan 23, 2025
278d7aa
add `layer_mix_up()`
t-kalinowski Jan 23, 2025
6ea0d74
add `layer_rand_augment()`
t-kalinowski Jan 23, 2025
fc75d48
add `layer_random_color_degeneration()`
t-kalinowski Jan 23, 2025
d3cb470
redocument
t-kalinowski Jan 23, 2025
3803d5c
add `layer_random_color_jitter()`
t-kalinowski Jan 23, 2025
bfe965f
add `layer_random_grayscale()`
t-kalinowski Jan 23, 2025
f8889e7
add `layer_random_hue()`
t-kalinowski Jan 23, 2025
db65b99
add `layer_random_posterization()`
t-kalinowski Jan 23, 2025
da9928d
add `layer_random_saturation()`
t-kalinowski Jan 23, 2025
942538d
add `layer_random_sharpness()`
t-kalinowski Jan 23, 2025
420f496
add `layer_random_shear()`
t-kalinowski Jan 23, 2025
7aa79cf
tether diffs
t-kalinowski Jan 23, 2025
3b67ce7
`Layer$add_weight()` new aggregation strategy
t-kalinowski Jan 23, 2025
a8fbafe
redocument
t-kalinowski Jan 23, 2025
f7f22e9
checkin new tethers
t-kalinowski Jan 23, 2025
3853675
R CMD check fixes
t-kalinowski Jan 23, 2025
903494c
more R CMD check fixes
t-kalinowski Jan 23, 2025
89b5ccb
add NEWS
t-kalinowski Jan 23, 2025
11 changes: 6 additions & 5 deletions .tether/man/Layer.txt
@@ -244,7 +244,7 @@ class Layer(keras.src.backend.tensorflow.layer.TFLayer, keras.src.ops.operation.
| autocast=True,
| regularizer=None,
| constraint=None,
| aggregation='mean',
| aggregation='none',
| name=None
| )
| Add a weight variable to the layer.
@@ -271,10 +271,11 @@ class Layer(keras.src.backend.tensorflow.layer.TFLayer, keras.src.ops.operation.
| constraint: Constraint object to call on the variable after any
| optimizer update, or string name of a built-in constraint.
| Defaults to `None`.
| aggregation: String, one of `'mean'`, `'sum'`,
| `'only_first_replica'`. Annotates the variable with the type
| of multi-replica aggregation to be used for this variable
| when writing custom data parallel training loops.
| aggregation: Optional string, one of `None`, `"none"`, `"mean"`,
| `"sum"` or `"only_first_replica"`. Annotates the variable with
| the type of multi-replica aggregation to be used for this
| variable when writing custom data parallel training loops.
| Defaults to `"none"`.
| name: String name of the variable. Useful for debugging purposes.
|
| build(self, input_shape)
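The new `aggregation='none'` default matters mainly for custom layers that register state via `add_weight()`. A minimal Python sketch of the upstream Keras 3.8 API (the layer and names here are illustrative, not part of this PR):

```python
import keras

class ComputeSum(keras.layers.Layer):
    """Tracks a running total in a non-trainable scalar weight."""

    def build(self, input_shape):
        # 'none' (the new default) applies no multi-replica aggregation;
        # pass 'mean', 'sum', or 'only_first_replica' when writing custom
        # data-parallel training loops.
        self.total = self.add_weight(
            shape=(),
            initializer="zeros",
            trainable=False,
            aggregation="none",
            name="total",
        )

    def call(self, inputs):
        self.total.assign_add(keras.ops.sum(inputs))
        return inputs
```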
14 changes: 14 additions & 0 deletions .tether/man/activation_sparse_plus.txt
@@ -0,0 +1,14 @@
__signature__
keras.activations.sparse_plus(x)
__doc__
SparsePlus activation function.

SparsePlus is defined as:

`sparse_plus(x) = 0` for `x <= -1`.
`sparse_plus(x) = (1/4) * (x + 1)^2` for `-1 < x < 1`.
`sparse_plus(x) = x` for `x >= 1`.

Args:
x: Input tensor.

22 changes: 22 additions & 0 deletions .tether/man/activation_sparsemax.txt
@@ -0,0 +1,22 @@
__signature__
keras.activations.sparsemax(x, axis=-1)
__doc__
Sparsemax activation function.

For each batch `i`, and class `j`,
sparsemax activation function is defined as:

`sparsemax(x)[i, j] = max(x[i, j] - τ(x[i, :]), 0).`

Args:
x: Input tensor.
axis: `int`, axis along which the sparsemax operation is applied.

Returns:
A tensor, output of sparsemax transformation. Has the same type and
shape as `x`.

Reference:

- [Martins et.al., 2016](https://arxiv.org/abs/1602.02068)

19 changes: 19 additions & 0 deletions .tether/man/activation_threshold.txt
@@ -0,0 +1,19 @@
__signature__
keras.activations.threshold(
x,
threshold,
default_value
)
__doc__
Threshold activation function.

It is defined as:

`threshold(x) = x` if `x > threshold`,
`threshold(x) = default_value` otherwise.

Args:
x: Input tensor.
threshold: The value that decides when to retain or replace x.
default_value: Value to assign when `x <= threshold`.

1 change: 1 addition & 0 deletions .tether/man/callback_backup_and_restore.txt
@@ -33,6 +33,7 @@ class BackupAndRestore(keras.src.callbacks.callback.Callback)
| >>> callback = keras.callbacks.BackupAndRestore(backup_dir="/tmp/backup")
| >>> model = keras.models.Sequential([keras.layers.Dense(10)])
| >>> model.compile(keras.optimizers.SGD(), loss='mse')
| >>> model.build(input_shape=(None, 20))
| >>> try:
| ... model.fit(np.arange(100).reshape(5, 20), np.zeros(5), epochs=10,
| ... batch_size=1, callbacks=[callback, InterruptingCallback()],
12 changes: 7 additions & 5 deletions .tether/man/callback_model_checkpoint.txt
@@ -64,12 +64,13 @@ class ModelCheckpoint(keras.src.callbacks.callback.Callback)
| which will be filled the value of `epoch` and keys in `logs`
| (passed in `on_epoch_end`).
| The `filepath` name needs to end with `".weights.h5"` when
| `save_weights_only=True` or should end with `".keras"` when
| checkpoint saving the whole model (default).
| `save_weights_only=True` or should end with `".keras"` or `".h5"`
| when checkpoint saving the whole model (default).
| For example:
| if `filepath` is `"{epoch:02d}-{val_loss:.2f}.keras"`, then the
| model checkpoints will be saved with the epoch number and the
| validation loss in the filename. The directory of the filepath
| if `filepath` is `"{epoch:02d}-{val_loss:.2f}.keras"` or
| "{epoch:02d}-{val_loss:.2f}.weights.h5"`, then the model
| checkpoints will be saved with the epoch number and the validation
| loss in the filename. The directory of the filepath
| should not be reused by any other callbacks to avoid conflicts.
| monitor: The metric name to monitor. Typically the metrics are set by
| the `Model.compile` method. Note:
@@ -183,3 +184,4 @@ class ModelCheckpoint(keras.src.callbacks.callback.Callback)
| batch: Integer, index of batch within the current epoch.
| logs: Dict. Aggregated metric results up until this batch.
|

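A hedged sketch of the filepath rules described above; the directory and template here are made-up examples:

```python
import keras

# Weights-only checkpoints must end in ".weights.h5"; the epoch number and
# monitored metric are interpolated into the filename at save time.
checkpoint = keras.callbacks.ModelCheckpoint(
    filepath="ckpt/{epoch:02d}-{val_loss:.2f}.weights.h5",
    save_weights_only=True,
    monitor="val_loss",
)
# For full-model checkpoints, omit save_weights_only and end the filepath
# in ".keras" (or, per this change, ".h5").
```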
71 changes: 48 additions & 23 deletions .tether/man/export_savedmodel.keras.src.models.model.Model.txt
@@ -3,40 +3,65 @@ keras.Model.export(
self,
filepath,
format='tf_saved_model',
verbose=True
verbose=True,
input_signature=None,
**kwargs
)
__doc__
Create a TF SavedModel artifact for inference.
Export the model as an artifact for inference.

**Note:** This can currently only be used with
the TensorFlow or JAX backends.

This method lets you export a model to a lightweight SavedModel artifact
that contains the model's forward pass only (its `call()` method)
and can be served via e.g. TF-Serving. The forward pass is registered
under the name `serve()` (see example below).
Args:
filepath: `str` or `pathlib.Path` object. The path to save the
artifact.
format: `str`. The export format. Supported values:
`"tf_saved_model"` and `"onnx"`. Defaults to
`"tf_saved_model"`.
verbose: `bool`. Whether to print a message during export. Defaults
to `True`.
input_signature: Optional. Specifies the shape and dtype of the
model inputs. Can be a structure of `keras.InputSpec`,
`tf.TensorSpec`, `backend.KerasTensor`, or backend tensor. If
not provided, it will be automatically computed. Defaults to
`None`.
**kwargs: Additional keyword arguments:
- Specific to the JAX backend and `format="tf_saved_model"`:
- `is_static`: Optional `bool`. Indicates whether `fn` is
static. Set to `False` if `fn` involves state updates
(e.g., RNG seeds and counters).
- `jax2tf_kwargs`: Optional `dict`. Arguments for
`jax2tf.convert`. See the documentation for
[`jax2tf.convert`](
https://github.com/google/jax/blob/main/jax/experimental/jax2tf/README.md).
If `native_serialization` and `polymorphic_shapes` are
not provided, they will be automatically computed.

The original code of the model (including any custom layers you may
have used) is *no longer* necessary to reload the artifact -- it is
entirely standalone.
**Note:** This feature is currently supported only with TensorFlow, JAX
and Torch backends.

Args:
filepath: `str` or `pathlib.Path` object. Path where to save
the artifact.
verbose: whether to print all the variables of the exported model.
Examples:

Example:
Here's how to export a TensorFlow SavedModel for inference.

```python
# Create the artifact
model.export("path/to/location")
# Export the model as a TensorFlow SavedModel artifact
model.export("path/to/location", format="tf_saved_model")

# Later, in a different process/environment...
# Load the artifact in a different process/environment
reloaded_artifact = tf.saved_model.load("path/to/location")
predictions = reloaded_artifact.serve(input_data)
```

If you would like to customize your serving endpoints, you can
use the lower-level `keras.export.ExportArchive` class. The
`export()` method relies on `ExportArchive` internally.
Here's how to export an ONNX artifact for inference.

```python
# Export the model as an ONNX artifact
model.export("path/to/location", format="onnx")

# Load the artifact in a different process/environment
ort_session = onnxruntime.InferenceSession("path/to/location")
ort_inputs = {
k.name: v for k, v in zip(ort_session.get_inputs(), input_data)
}
predictions = ort_session.run(None, ort_inputs)
```

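Beyond the two examples in the docstring, the new `input_signature` argument can pin the exported signature instead of letting Keras infer it. A sketch, assuming `model` is an already-built image model (the shape is illustrative):

```python
import keras

model.export(
    "path/to/location",
    format="tf_saved_model",
    input_signature=[keras.InputSpec(shape=(None, 224, 224, 3), dtype="float32")],
)
```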
7 changes: 7 additions & 0 deletions .tether/man/keras.activations.txt
@@ -30,8 +30,15 @@ soft_shrink(x, threshold=0.5)
softmax(x, axis=-1)
softplus(x)
softsign(x)
sparse_plus(x)
sparsemax(x, axis=-1)
squareplus(x, b=4)
swish(x)
tanh(x)
tanh_shrink(x)
threshold(
x,
threshold,
default_value
)

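The newly listed activations plug into layers like any other callable; a one-line sketch:

```python
import keras

layer = keras.layers.Dense(10, activation=keras.activations.sparsemax)
```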
83 changes: 83 additions & 0 deletions .tether/man/keras.layers.txt
@@ -542,6 +542,12 @@ Embedding(
lora_rank=None,
**kwargs
)
Equalization(
value_range=(0, 255),
bins=256,
data_format=None,
**kwargs
)
Flatten(data_format=None, **kwargs)
FlaxLayer(
module,
@@ -913,6 +919,12 @@ MelSpectrogram(
)
minimum(inputs, **kwargs)
Minimum(**kwargs)
MixUp(
alpha=0.2,
data_format=None,
seed=None,
**kwargs
)
MultiHeadAttention(
num_heads,
key_dim,
@@ -950,14 +962,41 @@ PReLU(
shared_axes=None,
**kwargs
)
RandAugment(
value_range=(0, 255),
num_ops=2,
factor=0.5,
interpolation='bilinear',
seed=None,
data_format=None,
**kwargs
)
RandomBrightness(
factor,
value_range=(0, 255),
seed=None,
**kwargs
)
RandomColorDegeneration(
factor,
value_range=(0, 255),
data_format=None,
seed=None,
**kwargs
)
RandomColorJitter(
value_range=(0, 255),
brightness_factor=None,
contrast_factor=None,
saturation_factor=None,
hue_factor=None,
seed=None,
data_format=None,
**kwargs
)
RandomContrast(
factor,
value_range=(0, 255),
seed=None,
**kwargs
)
@@ -975,6 +1014,26 @@ RandomFlip(
data_format=None,
**kwargs
)
RandomGrayscale(
factor=0.5,
data_format=None,
seed=None,
**kwargs
)
RandomHue(
factor,
value_range=(0, 255),
data_format=None,
seed=None,
**kwargs
)
RandomPosterization(
factor,
value_range=(0, 255),
data_format=None,
seed=None,
**kwargs
)
RandomRotation(
factor,
fill_mode='reflect',
@@ -984,6 +1043,30 @@
data_format=None,
**kwargs
)
RandomSaturation(
factor,
value_range=(0, 255),
data_format=None,
seed=None,
**kwargs
)
RandomSharpness(
factor,
value_range=(0, 255),
data_format=None,
seed=None,
**kwargs
)
RandomShear(
x_factor=0.0,
y_factor=0.0,
interpolation='bilinear',
fill_mode='reflect',
fill_value=0.0,
data_format=None,
seed=None,
**kwargs
)
RandomTranslation(
height_factor,
width_factor,
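A hedged sketch chaining a few of the newly added image preprocessing layers (argument values are illustrative; inputs are assumed to be in the 0-255 range):

```python
import keras

augment = keras.Sequential([
    keras.layers.RandomHue(factor=0.2, value_range=(0, 255), seed=1),
    keras.layers.RandomSaturation(factor=0.2, value_range=(0, 255), seed=1),
    keras.layers.RandomSharpness(factor=0.2, value_range=(0, 255), seed=1),
    keras.layers.RandomGrayscale(factor=0.1, seed=1),
])
```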
4 changes: 3 additions & 1 deletion .tether/man/keras.losses.txt
@@ -210,13 +210,15 @@ tversky(
y_true,
y_pred,
alpha=0.5,
beta=0.5
beta=0.5,
axis=None
)
Tversky(
alpha=0.5,
beta=0.5,
reduction='sum_over_batch_size',
name='tversky',
axis=None,
dtype=None
)

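A sketch of the new `axis` argument, assuming it behaves like the Dice loss's `axis` (reduction restricted to the spatial and channel axes of channels-last masks); the data here is random filler:

```python
import numpy as np
import keras

y_true = (np.random.uniform(size=(2, 16, 16, 3)) > 0.5).astype("float32")
y_pred = np.random.uniform(size=(2, 16, 16, 3)).astype("float32")

loss_fn = keras.losses.Tversky(alpha=0.7, beta=0.3, axis=(1, 2, 3))
print(float(loss_fn(y_true, y_pred)))
```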
7 changes: 7 additions & 0 deletions .tether/man/keras.ops.nn.txt
@@ -151,7 +151,14 @@ sparse_categorical_crossentropy(
from_logits=False,
axis=-1
)
sparse_plus(x)
sparsemax(x, axis=-1)
squareplus(x, b=4)
swish(x)
tanh_shrink(x)
threshold(
x,
threshold,
default_value
)

2 changes: 2 additions & 0 deletions .tether/man/keras.ops.numpy.txt
@@ -112,6 +112,7 @@ cumsum(
dtype=None
)
diag(x, k=0)
diagflat(x, k=0)
diagonal(
x,
offset=0,
@@ -358,6 +359,7 @@ tril(x, k=0)
triu(x, k=0)
true_divide(x1, x2)
trunc(x)
unravel_index(indices, shape)
var(
x,
axis=None,
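A short sketch of the two new numpy-style ops; the expected outputs in the comments follow NumPy's `diagflat`/`unravel_index` semantics, which these ops are assumed to mirror:

```python
from keras import ops

m = ops.diagflat([1, 2, 3])        # 3x3 matrix with [1, 2, 3] on the main diagonal
m_k1 = ops.diagflat([1, 2], k=1)   # [1, 2] placed on the first super-diagonal

idx = ops.unravel_index(ops.convert_to_tensor([1, 5]), (2, 3))
# flat index 1 -> (0, 1), flat index 5 -> (1, 2) in a 2x3 array
```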