SGDOptimizer

class ampligraph.latent_features.SGDOptimizer(optimizer_params, batches_count, verbose=False)

Wrapper around the SGD optimizer.

Methods

__init__(optimizer_params, batches_count[, …])

Initialize the Optimizer

minimize(loss)

Create an optimizer to minimize the model loss

update_feed_dict(feed_dict, batch_num, epoch_num)

Fills in the values of the placeholders created by the optimizer.

__init__(optimizer_params, batches_count, verbose=False)

Initialize the Optimizer

Parameters
  • optimizer_params (dict) –

    Consists of key-value pairs. The optimizer will check the keys to get the corresponding params:

    • 'lr': (float). Initial learning rate; the upper bound of the decay schedule (default: 0.0005)

    • 'decay_cycle': (int). Number of epochs in a decay cycle; 0 disables decay (default: 0)

    • 'end_lr': (float). Learning rate lower bound (default: 1e-8)

    • 'cosine_decay': (bool). Use cosine decay instead of fixed-rate decay (default: False)

    • 'expand_factor': (float). Factor by which the decay cycle length is expanded after each cycle (default: 1)

    • 'decay_lr_rate': (float). Factor by which the starting learning rate is divided after each cycle (default: 2)

    Example: optimizer_params={'lr': 0.01, 'decay_cycle': 30, 'end_lr': 0.0001, 'cosine_decay': True}

  • batches_count (int) – number of batches in an epoch

  • verbose (bool) – Enable/disable verbose mode
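The interplay of the decay parameters can be hard to picture from the list above. The pure-Python sketch below illustrates the schedule they describe (start at 'lr', decay toward 'end_lr' over each cycle, then restart with the cycle expanded by 'expand_factor' and the starting rate divided by 'decay_lr_rate'). It is an illustration of the documented semantics only, not AmpliGraph's actual implementation, and the function name lr_at_epoch is hypothetical:

```python
import math

def lr_at_epoch(epoch, lr=0.0005, decay_cycle=0, end_lr=1e-8,
                cosine_decay=False, expand_factor=1.0, decay_lr_rate=2.0):
    """Sketch of the learning-rate schedule implied by optimizer_params."""
    if decay_cycle == 0:
        return lr  # no decay configured: constant learning rate
    cycle_start, cycle_len, start_lr = 0, decay_cycle, lr
    # Skip past completed cycles: each one is longer by expand_factor
    # and starts at a learning rate divided by decay_lr_rate.
    while epoch >= cycle_start + cycle_len:
        cycle_start += cycle_len
        cycle_len = int(cycle_len * expand_factor)
        start_lr = start_lr / decay_lr_rate
    t = (epoch - cycle_start) / cycle_len  # progress within the cycle, in [0, 1)
    if cosine_decay:
        cur = end_lr + 0.5 * (start_lr - end_lr) * (1 + math.cos(math.pi * t))
    else:
        cur = start_lr - t * (start_lr - end_lr)  # fixed-rate (linear) decay
    return max(cur, end_lr)
```

For example, with lr=0.01 and decay_cycle=30, epoch 0 starts at 0.01, and epoch 30 restarts the cycle at 0.01 / decay_lr_rate = 0.005.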

minimize(loss)

Create an optimizer to minimize the model loss

Parameters

loss (tf.Tensor) – Node which needs to be evaluated for computing the model loss.

Returns

train – Node that needs to be evaluated for minimizing the loss during training

Return type

tf.Operation

update_feed_dict(feed_dict, batch_num, epoch_num)

Fills in the values of the placeholders created by the optimizer.

Parameters
  • feed_dict (dict) – Dictionary passed to sess.run while optimizing the model loss.

  • batch_num (int) – current batch number

  • epoch_num (int) – current epoch number
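In a TF1-style training loop, update_feed_dict is called once per batch, just before sess.run, so the optimizer can refresh the values of any placeholders it created (such as the current learning rate). The loop below mocks that contract in pure Python; MockOptimizer and the 'lr_placeholder' key are hypothetical stand-ins for illustration, not AmpliGraph code:

```python
class MockOptimizer:
    """Hypothetical stand-in mimicking the update_feed_dict contract."""

    def __init__(self, optimizer_params, batches_count, verbose=False):
        self.lr = optimizer_params.get('lr', 0.0005)
        self.batches_count = batches_count

    def update_feed_dict(self, feed_dict, batch_num, epoch_num):
        # The real optimizer fills the placeholders it created with values
        # for this batch/epoch; here we just record the learning rate.
        feed_dict['lr_placeholder'] = self.lr


optimizer = MockOptimizer({'lr': 0.01}, batches_count=10)
feed_dict = {}  # would also hold the model's own input placeholders
for epoch_num in range(2):
    for batch_num in range(optimizer.batches_count):
        optimizer.update_feed_dict(feed_dict, batch_num, epoch_num)
        # sess.run(train_op, feed_dict=feed_dict) would go here
```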