MomentumOptimizer

class ampligraph.latent_features.MomentumOptimizer(optimizer_params, batches_count, verbose=False)

Wrapper around the TensorFlow Momentum optimizer.

Methods

__init__(optimizer_params, batches_count[, …]) Initialize the optimizer.
minimize(loss) Create an optimizer to minimize the model loss.
update_feed_dict(feed_dict, batch_num, epoch_num) Fills the values of any placeholders created by the optimizer.
__init__(optimizer_params, batches_count, verbose=False)

Initialize the optimizer.

Parameters:
  • optimizer_params (dict) –

    Consists of key-value pairs. The optimizer checks these keys for the corresponding parameters:

    • 'lr': (float). Learning rate (default: 0.0005).
    • 'momentum': (float). Momentum (default: 0.9).

    Example: optimizer_params={'lr': 0.001, 'momentum': 0.9} (see the usage sketch after this parameter list).

  • batches_count (int) – Number of batches in an epoch.
  • verbose (bool) – Enable/disable verbose mode.
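
Usage sketch — a minimal construction example. The batches_count value below is illustrative, and the second snippet reflects a common AmpliGraph 1.x pattern (an assumption here): selecting the optimizer by name through a model constructor such as TransE rather than instantiating the wrapper directly:

    from ampligraph.latent_features import MomentumOptimizer, TransE

    # Direct construction, following the signature documented above.
    opt = MomentumOptimizer(optimizer_params={'lr': 0.001, 'momentum': 0.9},
                            batches_count=100,   # batches per epoch (example value)
                            verbose=False)

    # Typical usage (assumption): select the optimizer by name when building
    # an embedding model and pass the same optimizer_params.
    model = TransE(batches_count=100, epochs=20, k=50,
                   optimizer='momentum',
                   optimizer_params={'lr': 0.001, 'momentum': 0.9})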
minimize(loss)

Create an optimizer to minimize the model loss.

Parameters: loss (tf.Tensor) – Node that must be evaluated to compute the model loss.
Returns: train – Node that must be evaluated to minimize the loss during training.
Return type: tf.Operation
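
Usage sketch — a toy example of calling minimize() outside a model, assuming AmpliGraph 1.x and TensorFlow 1.x graph mode. The quadratic loss is a stand-in; normally the embedding model builds the loss tensor and invokes this method internally:

    import tensorflow as tf

    from ampligraph.latent_features import MomentumOptimizer

    # Stand-in loss node; in AmpliGraph the model builds the real loss tensor.
    x = tf.Variable(5.0)
    loss = tf.square(x)

    opt = MomentumOptimizer(optimizer_params={'lr': 0.01, 'momentum': 0.9},
                            batches_count=1)
    train = opt.minimize(loss)   # tf.Operation returned by the wrapper

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(train)          # one momentum update step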
update_feed_dict(feed_dict, batch_num, epoch_num)

Fills the values of any placeholders created by the optimizer.

Parameters:
  • feed_dict (dict) – Dictionary that will be passed to sess.run when optimizing the model loss.
  • batch_num (int) – Current batch number.
  • epoch_num (int) – Current epoch number.
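
Usage sketch — a per-batch training loop showing where update_feed_dict fits, again assuming TensorFlow 1.x graph mode with a stand-in loss in place of the model loss AmpliGraph would build:

    import tensorflow as tf

    from ampligraph.latent_features import MomentumOptimizer

    x = tf.Variable(5.0)
    loss = tf.square(x)          # stand-in for the model loss

    batches_count = 10
    opt = MomentumOptimizer(optimizer_params={'lr': 0.01, 'momentum': 0.9},
                            batches_count=batches_count)
    train = opt.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch_num in range(1, 3):
            for batch_num in range(1, batches_count + 1):
                feed_dict = {}
                # Let the optimizer fill any placeholders it created (e.g. for a
                # learning-rate schedule); with a fixed learning rate and momentum
                # this may leave feed_dict unchanged.
                opt.update_feed_dict(feed_dict, batch_num, epoch_num)
                sess.run(train, feed_dict=feed_dict)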