deltaBarDelta - trains the network using delta-bar-delta

USAGE

    deltaBarDelta [<num-updates>] [-report <report-interval> | -setOnly]

DESCRIPTION

This is a shortcut for training the network using delta-bar-delta. The arguments are similar to those for train. The momentum term is taken from the network's momentum parameter; other important parameters are the network's rateIncrement and rateDecrement.
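
Assuming the standard setObject command, these parameters can be adjusted before training, for example (the values shown are illustrative, not defaults):

    lens> setObject momentum 0.9        ;# illustrative value, not a default
    lens> setObject rateIncrement 0.1   ;# illustrative value, not a default
    lens> setObject rateDecrement 0.9   ;# illustrative value, not a default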

In this algorithm, each link stores its own learning rate in its lastValue field. This is multiplied by the default learning rate (which is taken from the first one of the block's, group's, or network's learningRate fields that is not NaN). If the link weight takes consecutive steps in the same direction, the link's rate is incremented by rateIncrement. If the weight step changes direction, the rate is multiplied by rateDecrement to scale it back. Thus, rateDecrement is not, strictly speaking, a decrement.
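
The following Tcl sketch illustrates the per-link update just described. It is not the Lens source: it omits the momentum term and the block/group/network learningRate lookup, and the procedure and variable names are made up for illustration.

    # weight and grad are one link's weight and current error derivative,
    # prevGrad is the previous derivative, rate is the link's private rate
    # (the value stored in lastValue), and epsilon is the default learning rate.
    proc dbdStep {weight grad prevGrad rate epsilon rateIncrement rateDecrement} {
        if {$grad * $prevGrad > 0} {
            # consecutive steps in the same direction: additive increase
            set rate [expr {$rate + $rateIncrement}]
        } elseif {$grad * $prevGrad < 0} {
            # step changed direction: multiplicative scale-back (rateDecrement < 1)
            set rate [expr {$rate * $rateDecrement}]
        }
        set weight [expr {$weight - $epsilon * $rate * $grad}]
        return [list $weight $rate]
    }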

If the -setOnly flag is used, no training occurs; the network's numUpdates, reportInterval, and default algorithm are simply set. This can be used in an initialization script to establish the default training behavior prior to actually training (see the second example below).

EXAMPLES

To train for 1000 epochs, printing reports every 100:

    lens> deltaBarDelta 1000 -r 100
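
To set delta-bar-delta as the default algorithm for 1000 updates without training now (for example, in an initialization script):

    lens> deltaBarDelta 1000 -setOnly

A subsequent train command will then use these settings.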

SEE ALSO

train, steepest, momentum, dougsMomentum


Last modified: Fri Nov 17 12:37:33 EST 2000