keras 2.7.0
- Default Tensorflow + Keras version is now 2.7.

- New API for constructing RNN (Recurrent Neural Network) layers. This is a
  flexible interface that complements the existing RNN layers. It is primarily
  intended for advanced / research applications, e.g., prototyping novel
  architectures. It allows you to compose an RNN with a custom "cell", a Keras
  layer that processes one step of a sequence. New symbols:
  - `layer_rnn()`, which can compose with the builtin cells:
    - `layer_gru_cell()`
    - `layer_lstm_cell()`
    - `layer_simple_rnn_cell()`
    - `layer_stacked_rnn_cells()`

  To learn more, including how to make a custom cell layer, see the new
  vignette: "Working with RNNs".

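  For illustration, a minimal sketch (the shapes are made up for the example):

  ```r
  library(keras)

  # Compose an RNN from a builtin cell; equivalent in spirit to
  # layer_lstm(units = 32).
  model <- keras_model_sequential() %>%
    layer_rnn(cell = layer_lstm_cell(units = 32),
              input_shape = c(10, 8)) %>%  # 10 timesteps, 8 features
    layer_dense(units = 1)
  ```
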
- New dataset functions:
  - `text_dataset_from_directory()`
  - `timeseries_dataset_from_array()`

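  For example, a sliding-window sketch with `timeseries_dataset_from_array()`
  (the data and window sizes here are illustrative):

  ```r
  library(keras)

  series <- matrix(rnorm(1000), ncol = 1)
  ds <- timeseries_dataset_from_array(
    data = series[1:990, , drop = FALSE],  # input windows of 10 steps
    targets = series[11:1000],             # target: value 10 steps ahead
    sequence_length = 10,
    batch_size = 32
  )
  ```
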
- New layers:
  - `layer_additive_attention()`
  - `layer_conv_lstm_1d()`
  - `layer_conv_lstm_3d()`

- `layer_cudnn_gru()` and `layer_cudnn_lstm()` are deprecated.
  `layer_gru()` and `layer_lstm()` will automatically use CuDNN if it is
  available.

- `layer_lstm()` and `layer_gru()`: default value for `recurrent_activation`
  changed from `"hard_sigmoid"` to `"sigmoid"`.

- `layer_gru()`: default value for `reset_after` changed from `FALSE` to
  `TRUE`.

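  If you need the pre-2.7 behavior (for example, to load weights saved with
  the old defaults), pass the previous values explicitly; a sketch:

  ```r
  library(keras)

  layer <- layer_gru(
    units = 16,
    recurrent_activation = "hard_sigmoid",  # new default is "sigmoid"
    reset_after = FALSE                     # new default is TRUE
  )
  ```
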
- New vignette: "Transfer learning and fine-tuning".

- New applications:
  - MobileNet V3: `application_mobilenet_v3_large()`,
    `application_mobilenet_v3_small()`
  - ResNet: `application_resnet101()`, `application_resnet152()`,
    `resnet_preprocess_input()`
  - ResNet V2: `application_resnet50_v2()`, `application_resnet101_v2()`,
    `application_resnet152_v2()`, and `resnet_v2_preprocess_input()`
  - EfficientNet: `application_efficientnet_b{0,1,2,3,4,5,6,7}()`

- Many existing `application_*()` functions gain argument
  `classifier_activation`, with default `'softmax'`. Affected:
  `application_{xception, inception_resnet_v2, inception_v3, mobilenet, vgg16, vgg19}()`

- New function `%<-active%`, an ergonomic wrapper around `makeActiveBinding()`
  for constructing Python `@property`-decorated methods in `%py_class%`.

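  A minimal sketch of an active binding (the property semantics come from
  base R's `makeActiveBinding()`; the example name is made up):

  ```r
  library(keras)

  # Reading `now` re-evaluates the bound function, like a Python @property.
  now %<-active% function() Sys.time()
  now  # returns the current time on each access
  ```
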
- `bidirectional()` sequence processing layer wrapper gains a
  `backwards_layer` argument.

- Global pooling layers `layer_global_{max,average}_pooling_{1,2,3}d()` gain
  a `keepdims` argument with default value `FALSE`.

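  A sketch of the difference (the shapes in the comments are illustrative):

  ```r
  library(keras)

  x <- layer_input(shape = c(28, 28, 3))
  # Default: pooled dimensions are squeezed out.
  y <- layer_global_average_pooling_2d(x)                   # shape (batch, 3)
  # keepdims = TRUE: pooled dimensions are kept with length 1.
  z <- layer_global_average_pooling_2d(x, keepdims = TRUE)  # shape (batch, 1, 1, 3)
  ```
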
- Signatures for layer functions are in the process of being simplified.
  Standard layer arguments are moving to `...` where appropriate (and will
  need to be provided as named arguments). Standard layer arguments include:
  `input_shape`, `batch_input_shape`, `batch_size`, `dtype`, `name`,
  `trainable`, `weights`. Layers updated:
  `layer_global_{max,average}_pooling_{1,2,3}d()`, `time_distributed()`,
  `bidirectional()`, `layer_gru()`, `layer_lstm()`, `layer_simple_rnn()`

- All backend functions with a shape argument, `k_*(shape = )`, now accept a
  mix of integer tensors and R numerics in the supplied list.

- All layer functions now accept `NA` as a synonym for `NULL` in arguments
  that specify shape as a vector of dimension values, e.g., `input_shape`,
  `batch_input_shape`.

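  This is convenient because `c(NULL, 8)` collapses to just `8` in R, whereas
  `c(NA, 8)` keeps both positions. A sketch:

  ```r
  library(keras)

  # NA marks an unknown dimension (here, a variable number of timesteps);
  # previously this required something like list(NULL, 8).
  layer <- layer_lstm(units = 4, input_shape = c(NA, 8))
  ```
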
- `k_random_uniform()` now automatically casts `minval` and `maxval` to the
  output dtype.

- `install_keras()` gains an argument with default
  `pip_ignore_installed = TRUE`.