Neural network layers.
def nabu.neuralnetworks.components.layer.blstm(inputs, sequence_length, num_units, layer_norm=False, scope=None)
    a BLSTM layer

def nabu.neuralnetworks.components.layer.pblstm(inputs, sequence_length, num_units, num_steps=2, layer_norm=False, scope=None)
    a Pyramidal BLSTM layer

def nabu.neuralnetworks.components.layer.projected_subsampling(inputs, input_seq_lengths, num_steps, name=None)
§ blstm()
def nabu.neuralnetworks.components.layer.blstm(inputs, sequence_length, num_units, layer_norm=False, scope=None)

a BLSTM layer
- Parameters
  - inputs: the input to the layer as a [batch_size, max_length, dim] tensor
  - sequence_length: the length of the input sequences as a [batch_size] tensor
  - num_units: the number of units in each direction
  - layer_norm: whether layer normalization should be applied
  - scope: the variable scope; sets the namespace under which the variables created during this call will be stored
- Returns
  - the BLSTM outputs
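To illustrate why the BLSTM output has twice `num_units` features, here is a minimal NumPy sketch of a bidirectional recurrent layer (toy tanh cells with hypothetical random weights, not nabu's actual LSTM implementation): the input is processed once forward and once backward in time, and the two per-direction outputs are concatenated along the feature axis.

```python
import numpy as np

def simple_blstm_sketch(inputs, num_units, seed=0):
    """Toy bidirectional RNN: tanh cells, one weight set per direction.

    inputs: [batch_size, max_length, dim] array.
    Returns a [batch_size, max_length, 2 * num_units] array, mimicking
    the output shape of a BLSTM with `num_units` units per direction.
    """
    rng = np.random.default_rng(seed)
    batch_size, max_length, dim = inputs.shape
    # hypothetical small random weights, one set per direction
    W = rng.standard_normal((2, dim, num_units)) * 0.1
    U = rng.standard_normal((2, num_units, num_units)) * 0.1

    def run(direction):
        # direction 0 walks time forward, direction 1 walks it backward
        order = range(max_length) if direction == 0 else reversed(range(max_length))
        h = np.zeros((batch_size, num_units))
        outs = [None] * max_length
        for t in order:
            h = np.tanh(inputs[:, t] @ W[direction] + h @ U[direction])
            outs[t] = h
        return np.stack(outs, axis=1)

    # concatenate forward and backward outputs along the feature axis
    return np.concatenate([run(0), run(1)], axis=-1)

x = np.ones((4, 7, 3))
y = simple_blstm_sketch(x, num_units=5)
print(y.shape)  # (4, 7, 10): the feature dimension is 2 * num_units
```

Note that this sketch ignores `sequence_length` masking and layer normalization; it only demonstrates the output shape and the forward/backward concatenation.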
§ pblstm()
def nabu.neuralnetworks.components.layer.pblstm(inputs, sequence_length, num_units, num_steps=2, layer_norm=False, scope=None)

a Pyramidal BLSTM layer
- Parameters
  - inputs: the input to the layer as a [batch_size, max_length, dim] tensor
  - sequence_length: the length of the input sequences as a [batch_size] tensor
  - num_units: the number of units in each direction
  - num_steps: the number of time steps to concatenate
  - layer_norm: whether layer normalization should be applied
  - scope: the variable scope; sets the namespace under which the variables created during this call will be stored
- Returns
  - the PBLSTM outputs
  - the new sequence lengths
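The time-step concatenation at the heart of the pyramidal layer can be sketched in NumPy (an illustration of the idea with hypothetical helper names, not nabu's TensorFlow implementation): every `num_steps` consecutive time steps are folded into the feature axis, so the time axis shrinks by a factor of `num_steps` while the feature dimension grows by the same factor, and the sequence lengths are reduced accordingly (rounded up here; the exact rounding in nabu may differ).

```python
import numpy as np

def pyramid_concat_sketch(inputs, sequence_lengths, num_steps=2):
    """Concatenate every `num_steps` consecutive time steps.

    inputs: [batch_size, max_length, dim] array.
    sequence_lengths: [batch_size] array of valid lengths.
    Returns a [batch_size, ceil(max_length / num_steps), num_steps * dim]
    array and the new sequence lengths.
    """
    batch_size, max_length, dim = inputs.shape
    # pad the time axis so it divides evenly into groups of num_steps
    pad = (-max_length) % num_steps
    padded = np.pad(inputs, ((0, 0), (0, pad), (0, 0)))
    # fold each group of num_steps time steps into the feature axis
    outputs = padded.reshape(batch_size,
                             (max_length + pad) // num_steps,
                             num_steps * dim)
    # ceiling division; a partial final group counts as one step
    new_lengths = -(-sequence_lengths // num_steps)
    return outputs, new_lengths

x = np.arange(2 * 6 * 3, dtype=float).reshape(2, 6, 3)
out, lens = pyramid_concat_sketch(x, np.array([6, 5]), num_steps=2)
print(out.shape, lens)  # (2, 3, 6) [3 3]
```

In the pyramidal BLSTM this concatenation is followed by a BLSTM over the shortened sequence, which is why stacking such layers progressively reduces the time resolution.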