TFGENZOO.layers.weight_norm module

Weight normalization layer from TensorFlow Addons; the code is heavily borrowed from https://github.com/tensorflow/addons/blob/v0.7.1/tensorflow_addons/layers/wrappers.py

Differences:

variable initialization across multiple GPUs
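A minimal sketch of what that difference targets: the wrapper is created inside a tf.distribute.MirroredStrategy scope like any other Keras layer, so its variables (and the data-dependent initialization) are handled consistently across replicas. The architecture and shapes below are illustrative only.

>>> import tensorflow as tf
>>> from TFGENZOO.layers.weight_norm import WeightNormalization
>>> strategy = tf.distribute.MirroredStrategy()
>>> with strategy.scope():
...     inputs = tf.keras.Input(shape=(32, 32, 3))
...     h = WeightNormalization(
...         tf.keras.layers.Conv2D(16, 3, activation='relu'),
...         data_init=True)(inputs)
...     h = tf.keras.layers.GlobalAveragePooling2D()(h)
...     outputs = tf.keras.layers.Dense(10)(h)
...     model = tf.keras.Model(inputs, outputs)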

class TFGENZOO.layers.weight_norm.WeightNormalization(layer: tf.keras.layers.Layer, data_init: bool = True, **kwargs)[source]

Bases: tensorflow.python.keras.layers.wrappers.Wrapper

This wrapper reparameterizes a layer by decoupling the weight’s magnitude and direction.

Note

This speeds up convergence by improving the conditioning of the optimization problem.

Tim Salimans, Diederik P. Kingma (2016), “Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks”, https://arxiv.org/abs/1602.07868

The WeightNormalization wrapper works for Keras and TF layers.
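In code, the reparameterization is w = g * v / ||v||: the direction comes from v and the magnitude from g, with one scalar g per output unit. The snippet below only restates that formula for a dense kernel; the tensor names and shapes are chosen for exposition and do not refer to the wrapper's internal attributes.

>>> import tensorflow as tf
>>> v = tf.random.normal([64, 120])        # unnormalized kernel (direction)
>>> g = tf.ones([120])                     # per-output-unit magnitude
>>> w = g * tf.nn.l2_normalize(v, axis=0)  # kernel used in the forward pass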

Examples

>>> net = WeightNormalization(
...     tf.keras.layers.Conv2D(2, 2, activation='relu'),
...     input_shape=(32, 32, 3),
...     data_init=True)(x)
>>> net = WeightNormalization(
...     tf.keras.layers.Conv2D(16, 5, activation='relu'),
...     data_init=True)(net)
>>> net = WeightNormalization(
...     tf.keras.layers.Dense(120, activation='relu'),
...     data_init=True)(net)
>>> net = WeightNormalization(
...     tf.keras.layers.Dense(n_classes),
...     data_init=True)(net)
Parameters
  • layer (tf.keras.layers.Layer) – a layer instance.

  • data_init (bool) – If True, use data-dependent variable initialization (see the sketch after the Raises list below)

Returns

Wrapped Layer

Return type

tf.keras.layers.Layer

Raises
  • ValueError – If not initialized with a Layer instance.

  • ValueError – If Layer does not contain a kernel of weights

  • NotImplementedError – If data_init is True and running graph execution
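
A sketch of the data_init behaviour referenced above, assuming (as in the tensorflow_addons code this module borrows from) that the data-dependent initialization runs on the first eager call; under graph execution the same call would raise NotImplementedError. Layer sizes and the batch are illustrative.

>>> import tensorflow as tf
>>> from TFGENZOO.layers.weight_norm import WeightNormalization
>>> wn = WeightNormalization(
...     tf.keras.layers.Dense(120, activation='relu'),
...     data_init=True)
>>> batch = tf.random.normal([8, 64])
>>> outputs = wn(batch)  # first eager call performs the data-dependent initialization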

build(input_shape)[source]

Build Layer

call(inputs)[source]

Call Layer

compute_output_shape(input_shape)[source]

Computes the output shape of the layer.

If the layer has not been built, this method will call build on the layer. This assumes that the layer will later be used with inputs that match the input shape provided here.

Parameters

input_shape – Shape tuple (tuple of integers) or list of shape tuples (one per output tensor of the layer). Shape tuples can include None for free dimensions, instead of an integer.

Returns

An output shape tuple.
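
A short example of querying the shape through the wrapper, assuming it simply delegates to the wrapped layer's own compute_output_shape as in the tensorflow_addons implementation it borrows from.

>>> import tensorflow as tf
>>> from TFGENZOO.layers.weight_norm import WeightNormalization
>>> wn = WeightNormalization(tf.keras.layers.Dense(120), data_init=False)
>>> wn.compute_output_shape((None, 64))  # expected: (None, 120)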

get_config()[source]

Returns the config of the layer.

A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

The config of a layer does not include connectivity information, nor the layer class name. These are handled by Network (one layer of abstraction above).

Returns

Python dictionary.
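
A sketch of the usual Keras serialization round trip through get_config, assuming the class relies on the standard Wrapper.from_config behaviour inherited from its base class.

>>> import tensorflow as tf
>>> from TFGENZOO.layers.weight_norm import WeightNormalization
>>> wn = WeightNormalization(tf.keras.layers.Dense(120), data_init=True)
>>> config = wn.get_config()                            # serializable dict: wrapped layer config, data_init, ...
>>> restored = WeightNormalization.from_config(config)  # same configuration, untrained weights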

remove()[source]
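
remove() carries no docstring here; in the tensorflow_addons code this module borrows from, remove() folds the normalization back into the wrapped layer's kernel and returns that inner layer. Assuming the same behaviour, usage might look like:

>>> import tensorflow as tf
>>> from TFGENZOO.layers.weight_norm import WeightNormalization
>>> wn = WeightNormalization(tf.keras.layers.Dense(120), data_init=False)
>>> _ = wn(tf.random.normal([8, 64]))  # build the wrapper on an example batch
>>> dense = wn.remove()  # assumed: returns the inner Dense layer with the folded kernel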