updateWeights [-algorithm <algorithm> | -noreset | -report]
This performs a single weight update using the currently accumulated link derivatives. It is useful in conjunction with doExample for designing custom training regimes that do not rely on the standard fixed batch size.
If an algorithm (such as steepest, momentum, or dougsMomentum) is specified with -algorithm, it will be used for this update, but the network's default algorithm will not change. If no algorithm is specified, the network default is used.
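For example, the following sketch (it assumes a network has been built and a training set loaded) accumulates derivatives over a batch of 20 examples and then applies a single momentum update, leaving the network's default algorithm untouched:

    # accumulate derivatives over 20 examples, then update with momentum
    repeat 20 {doExample -train}
    updateWeights -algorithm momentum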
By default, the link and unit derivatives are reset after the update; this can be prevented with the -noreset option. The -report option causes a report to be printed.
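As a sketch of how -noreset might be used (again assuming a network and training set are already loaded), the following loop lets derivatives carry over from one update to the next, so each update is driven by the running sum of all gradients gathered so far:

    # derivatives are not cleared after each update, so each update
    # uses the accumulated gradients from all batches processed so far
    repeat 10 {
        repeat 50 {doExample -train}
        updateWeights -noreset
    }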
Here's a way to train using random batch sizes:
repeat 1000 { repeat [randInt 100] {doExample -train}; updateWeights }