TFGENZOO.flows.unsafe.actnorm module
class TFGENZOO.flows.unsafe.actnorm.Actnorm(scale: float = 1.0, logscale_factor: float = 3.0, **kwargs)[source]

    Bases: TFGENZOO.flows.flowbase.FlowComponent
Actnorm Layer
This layer can act like SyncBatchNormalization, but it may crash frequently.
Sources:
Note

initialize
    mean = mean(first_batch)
    var = variance(first_batch)
    logs = log(scale / sqrt(var)) / logscale_factor
    bias = -mean

forward formula (see the sketch after this note)
    logs = logs * logscale_factor
    scale = exp(logs)
    z = (x + bias) * scale
    log_det_jacobian = sum(logs) * H * W

inverse formula
    logs = logs * logscale_factor
    inv_scale = exp(-logs)
    z = x * inv_scale - bias
    inverse_log_det_jacobian = sum(-logs) * H * W
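The following is a minimal sketch of the formulas in the note, assuming NHWC input and per-channel logs and bias tensors; the function names are illustrative only and not part of the TFGENZOO API, which wraps these steps inside the Actnorm layer.

    import tensorflow as tf

    def actnorm_initialize(first_batch, scale=1.0, logscale_factor=3.0):
        """Data-dependent initialization from the first batch (per channel)."""
        mean = tf.reduce_mean(first_batch, axis=[0, 1, 2], keepdims=True)
        var = tf.math.reduce_variance(first_batch, axis=[0, 1, 2], keepdims=True)
        logs = tf.math.log(scale / tf.sqrt(var)) / logscale_factor
        bias = -mean
        return logs, bias

    def actnorm_forward(x, logs, bias, logscale_factor=3.0):
        """z = (x + bias) * exp(logs * logscale_factor) and its log-det term."""
        h, w = x.shape[1], x.shape[2]
        logs = logs * logscale_factor
        z = (x + bias) * tf.exp(logs)
        log_det_jacobian = tf.reduce_sum(logs) * h * w
        return z, log_det_jacobian

    def actnorm_inverse(z, logs, bias, logscale_factor=3.0):
        """x = z * exp(-logs * logscale_factor) - bias and its log-det term."""
        h, w = z.shape[1], z.shape[2]
        logs = logs * logscale_factor
        x = z * tf.exp(-logs) - bias
        inverse_log_det_jacobian = tf.reduce_sum(-logs) * h * w
        return x, inverse_log_det_jacobian

Applying actnorm_inverse to the output of actnorm_forward recovers the input, and the two log-determinant terms cancel, which is the invertibility property the flow relies on.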
calc_ldj
    bool: flag for whether to calculate the log-determinant Jacobian

scale
    float: scaling applied to the initialization batch's variance

logscale_factor
    float: barrier keeping the log value away from -Inf
property bias
build(input_shape: tensorflow.python.framework.tensor_shape.TensorShape)[source]

    Creates the variables of the layer (optional, for subclass implementers).

    This is a method that implementers of subclasses of Layer or Model can override if they need a state-creation step in-between layer instantiation and layer call. This is typically used to create the weights of Layer subclasses.

    Parameters
        input_shape – Instance of TensorShape, or list of instances of TensorShape if the layer expects a list of inputs (one instance per input).
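As a rough illustration of such a state-creation step, the toy layer below creates per-channel bias and logs weights in build; the weight shapes and zero initializers are assumptions for NHWC input, and the actual Actnorm.build may differ (for example, the data-dependent values from the note are typically filled in on the first call).

    import tensorflow as tf

    class ToyActnorm(tf.keras.layers.Layer):
        # Hypothetical stand-in: creates one bias and one log-scale weight
        # per channel, as suggested by the note above.
        def build(self, input_shape):
            n_channels = input_shape[-1]
            weight_shape = [1, 1, 1, n_channels]
            self.bias = self.add_weight(
                name="bias", shape=weight_shape, initializer="zeros", trainable=True
            )
            self.logs = self.add_weight(
                name="logs", shape=weight_shape, initializer="zeros", trainable=True
            )
            super().build(input_shape)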
get_config()[source]

    Returns the config of the layer.

    A layer config is a Python dictionary (serializable) containing the configuration of a layer. The same layer can be reinstantiated later (without its trained weights) from this configuration.

    The config of a layer does not include connectivity information, nor the layer class name. These are handled by Network (one layer of abstraction above).

    Returns
        Python dictionary.
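A typical Keras config round trip looks as follows; whether the returned dictionary includes scale and logscale_factor is an assumption based on the constructor signature, but from_config works with the base Layer keys either way.

    from TFGENZOO.flows.unsafe.actnorm import Actnorm

    layer = Actnorm(scale=1.0, logscale_factor=3.0)
    config = layer.get_config()              # serializable Python dict
    restored = Actnorm.from_config(config)   # same configuration, untrained weights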
property logs