BNNSOptimizerStep(_:_:_:_:_:_:_:)
Applies a single optimization step to one or more parameters.
Declaration
func BNNSOptimizerStep(
    _ function: BNNSOptimizerFunction,
    _ OptimizerAlgFields: UnsafeRawPointer,
    _ number_of_parameters: Int,
    _ parameters: UnsafeMutablePointer<UnsafeMutablePointer<BNNSNDArrayDescriptor>>,
    _ gradients: UnsafeMutablePointer<UnsafePointer<BNNSNDArrayDescriptor>>,
    _ accumulators: UnsafeMutablePointer<UnsafeMutablePointer<BNNSNDArrayDescriptor>?>?,
    _ filter_params: UnsafePointer<BNNSFilterParameters>?
) -> Int32
Parameters
- function:
The optimization algorithm.
- OptimizerAlgFields:
A pointer to the parameter structure for the optimization function (for example, BNNSOptimizerAdamFields for the Adam optimizer).
- number_of_parameters:
The number of parameters the step updates.
- parameters:
An array of pointers to parameter descriptors.
- gradients:
An array of pointers to gradient descriptors.
- accumulators:
An array of pointers to accumulator descriptors, if the optimization function requires them.
- filter_params:
A pointer to the filter runtime parameters, or nil to use default values.
Discussion
Use BNNSOptimizerStep(_:_:_:_:_:_:_:) to update a set of parameters using a supplied optimization algorithm.
For example, the following shows the code required to update the weights data described by weightsDescriptor using an Adam optimizer.
var weightsDescriptor: BNNSNDArrayDescriptor = ...
var deltaDescriptor: BNNSNDArrayDescriptor = ...
var accumulatorOneDescriptor: BNNSNDArrayDescriptor = ...
var accumulatorTwoDescriptor: BNNSNDArrayDescriptor = ...
var adamFields: BNNSOptimizerAdamFields = ...

withUnsafeMutablePointer(to: &weightsDescriptor) { weightsDescriptorPtr in
    withUnsafePointer(to: &deltaDescriptor) { deltaDescriptorPtr in
        withUnsafeMutablePointer(to: &accumulatorOneDescriptor) { accumulatorOneDescriptorPtr in
            withUnsafeMutablePointer(to: &accumulatorTwoDescriptor) { accumulatorTwoDescriptorPtr in

                var parameters = [ weightsDescriptorPtr ]
                var gradients = [ deltaDescriptorPtr ]
                var accumulators = [ Optional(accumulatorOneDescriptorPtr),
                                     Optional(accumulatorTwoDescriptorPtr) ]

                let error = withUnsafePointer(to: &adamFields) { adamFieldsPointer in
                    BNNSOptimizerStep(BNNSOptimizerFunctionAdam,
                                      adamFieldsPointer, 1,
                                      &parameters,
                                      &gradients,
                                      &accumulators,
                                      nil)
                }

                if error != 0 {
                    fatalError("BNNSOptimizerStep failed.")
                }
            }
        }
    }
}
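The example supplies two accumulator descriptors because Adam maintains two running statistics per parameter: an exponential moving average of the gradient (first moment) and of the squared gradient (second moment). As context, the standard Adam update, which may differ in minor details from the exact formulation BNNS implements, is:

\begin{align}
m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
v_t &= \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \\
\hat{m}_t &= \frac{m_t}{1 - \beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - \beta_2^t} \\
\theta_t &= \theta_{t-1} - \eta\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
\end{align}

Here \(\theta\) corresponds to the parameters descriptor, \(g_t\) to the gradients descriptor, and \(m_t\) and \(v_t\) to the first and second accumulator descriptors, respectively.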
See Also
Optimizers
BNNS.AdamOptimizer
BNNS.AdamWOptimizer
BNNS.RMSPropOptimizer
BNNS.SGDMomentumOptimizer
BNNSOptimizer
BNNSOptimizerRegularizationFunction
BNNSOptimizerAdamFields
BNNSOptimizerAdamWithClippingFields
BNNSOptimizerRMSPropFields
BNNSOptimizerRMSPropWithClippingFields
BNNSOptimizerSGDMomentumFields
BNNSOptimizerSGDMomentumWithClippingFields
BNNSOptimizerSGDMomentumVariant
BNNSOptimizerFunction