Nabu-asr
layer.py File Reference

Neural network layers. More...

Functions

def nabu.neuralnetworks.components.layer.blstm (inputs, sequence_length, num_units, layer_norm=False, scope=None)
 a BLSTM layer More...
 
def nabu.neuralnetworks.components.layer.pblstm (inputs, sequence_length, num_units, num_steps=2, layer_norm=False, scope=None)
 a Pyramidal BLSTM layer More...
 
def nabu.neuralnetworks.components.layer.projected_subsampling (inputs, input_seq_lengths, num_steps, name=None)
 

Detailed Description

Neural network layers.

Function Documentation

§ blstm()

def nabu.neuralnetworks.components.layer.blstm (inputs,
  sequence_length,
  num_units,
  layer_norm = False,
  scope = None
)

a BLSTM layer

Parameters
  inputs           the input to the layer as a [batch_size, max_length, dim] tensor
  sequence_length  the length of the input sequences as a [batch_size] tensor
  num_units        the number of units in one direction
  layer_norm       whether layer normalization should be applied
  scope            the variable scope; sets the namespace under which the variables created during this call will be stored
Returns
  the BLSTM outputs
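
A minimal usage sketch (not taken from the Nabu source): the placeholder shapes, num_units value, and scope name are illustrative, and it assumes the TensorFlow 1.x graph API. The comment about the output dimension reflects the usual concatenation of forward and backward states, which this page does not explicitly confirm.

import tensorflow as tf
from nabu.neuralnetworks.components import layer

# illustrative shapes: a batch of 8 sequences, at most 100 frames, 40-dim features
inputs = tf.placeholder(tf.float32, shape=[8, 100, 40])
sequence_length = tf.placeholder(tf.int32, shape=[8])

# forward and backward outputs of a BLSTM are typically concatenated,
# so the last dimension of the result would be 2 * num_units (assumption)
outputs = layer.blstm(
    inputs,
    sequence_length,
    num_units=128,
    layer_norm=False,
    scope='blstm_example')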

§ pblstm()

def nabu.neuralnetworks.components.layer.pblstm (inputs,
  sequence_length,
  num_units,
  num_steps = 2,
  layer_norm = False,
  scope = None
)

a Pyramidal BLSTM layer

Parameters
  inputs           the input to the layer as a [batch_size, max_length, dim] tensor
  sequence_length  the length of the input sequences as a [batch_size] tensor
  num_units        the number of units in one direction
  num_steps        the number of time steps to concatenate
  layer_norm       whether layer normalization should be applied
  scope            the variable scope; sets the namespace under which the variables created during this call will be stored
Returns
  • the PBLSTM outputs
  • the new sequence lengths
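
A minimal usage sketch (not taken from the Nabu source): the placeholder shapes, num_units value, and scope name are illustrative, and it assumes the TensorFlow 1.x graph API. The documented return values (outputs and the new sequence lengths) are unpacked as a pair.

import tensorflow as tf
from nabu.neuralnetworks.components import layer

# illustrative shapes: a batch of 8 sequences, at most 100 frames, 40-dim features
inputs = tf.placeholder(tf.float32, shape=[8, 100, 40])
sequence_length = tf.placeholder(tf.int32, shape=[8])

# with num_steps=2, consecutive pairs of time steps are concatenated, so the
# time axis shrinks (roughly by a factor of 2) and the shortened sequence
# lengths are returned alongside the outputs
outputs, new_sequence_length = layer.pblstm(
    inputs,
    sequence_length,
    num_units=128,
    num_steps=2,
    layer_norm=False,
    scope='pblstm_example')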