quri_parts.algo.optimizer.nft module#
- class OptimizerStateNFT(params: Params, cost: float = 0.0, status: OptimizerStatus = OptimizerStatus.SUCCESS, niter: int = 0, funcalls: int = 0, gradcalls: int = 0)#
Bases: OptimizerState
- Parameters:
params (algo.optimizer.interface.Params) –
cost (float) –
status (OptimizerStatus) –
niter (int) –
funcalls (int) –
gradcalls (int) –
- params: Params#
Current parameter values.
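A minimal sketch of the fields this state carries, based only on the signature above. In practice a state like this is produced and updated by an optimizer's get_init_state() and step() rather than constructed by hand; the import path simply follows this module's name.

```python
import numpy as np

from quri_parts.algo.optimizer.nft import OptimizerStateNFT

# Every field except params takes the default shown in the signature above.
state = OptimizerStateNFT(params=np.array([0.1, 0.2, 0.3]))
print(state.params)                                  # current parameter values
print(state.cost, state.status)                      # 0.0 OptimizerStatus.SUCCESS
print(state.niter, state.funcalls, state.gradcalls)  # 0 0 0
```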
- class NFTBase(randomize=False, ftol=1e-05)#
Bases: Optimizer
- Parameters:
randomize (bool) –
ftol (Optional[float]) –
- get_init_state(init_params)#
Returns an initial state for optimization.
- Parameters:
init_params (algo.optimizer.interface.Params) –
- Return type: OptimizerStateNFT
- step(state, cost_function, grad_function=None)#
Runs a single optimization step and returns a new state (see the loop sketch below for typical usage).
- Parameters:
state (OptimizerState) –
cost_function (algo.optimizer.interface.CostFunction) –
grad_function (algo.optimizer.interface.GradientFunction | None) –
- Return type: OptimizerStateNFT
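The two methods above are meant to be driven in a loop: create a state once with get_init_state(), then call step() repeatedly until the returned status signals convergence or failure. The sketch below assumes OptimizerStatus (with CONVERGED and FAILED members) is available from quri_parts.algo.optimizer.interface and uses the concrete NFT subclass documented below; the toy cost function is only an illustration.

```python
import numpy as np

from quri_parts.algo.optimizer.interface import OptimizerStatus
from quri_parts.algo.optimizer.nft import NFT


def cost_fn(params: np.ndarray) -> float:
    # Toy cost that is sinusoidal in each parameter, the setting the NFT
    # update rule targets (e.g. parametric-circuit expectation values).
    return float(np.sum(np.cos(params)))


optimizer = NFT()  # any NFTBase subclass is driven the same way
state = optimizer.get_init_state(np.array([0.1, 0.2, 0.3]))

while True:
    state = optimizer.step(state, cost_fn)  # grad_function is not needed here
    if state.status in (OptimizerStatus.CONVERGED, OptimizerStatus.FAILED):
        break

print(state.params, state.cost, state.niter)
```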
- class NFT(randomize=False, reset_interval=32, eps=1e-32, ftol=1e-05)#
Bases: NFTBase
Nakanishi-Fujii-Todo optimization algorithm proposed in [1]. The algorithm optimizes the cost function sequentially with respect to the parameters. (A usage sketch follows the reference below.)
- Parameters:
randomize (bool) – If True, the order of the parameters over which the cost function is optimized will be random. If False, the order will be the same as the order of the array of initial parameters.
reset_interval (Optional[int]) – The minimum value of the cost function is evaluated by a direct function call only once in every reset_interval updates.
eps (float) – A small number used to avoid division by zero.
ftol (Optional[float]) – If not None, convergence is judged by the cost function tolerance. See ftol() for details.
- Ref:
- [1]: Ken M. Nakanishi, Keisuke Fujii, and Synge Todo,
Sequential minimal optimization for quantum-classical hybrid algorithms, Phys. Rev. Research 2, 043158 (2020).
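A sketch of constructing NFT with each option written out; the argument values other than the documented defaults are illustrative, and the comments paraphrase the parameter descriptions above.

```python
import numpy as np

from quri_parts.algo.optimizer.nft import NFT

optimizer = NFT(
    randomize=True,     # optimize the parameters in a random order
    reset_interval=32,  # evaluate the cost by a direct call once per 32 updates
    eps=1e-32,          # small constant guarding against division by zero
    ftol=1e-5,          # cost-function tolerance used for the convergence check
)


def cost_fn(params: np.ndarray) -> float:
    # Stand-in for a parametric-circuit expectation value.
    return float(np.sum(np.cos(params)))


state = optimizer.get_init_state(np.array([0.0, 0.5, 1.0]))
state = optimizer.step(state, cost_fn)  # one gradient-free optimization step
```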
- class NFTfit(randomize=False, n_points=3, ftol=1e-05)#
Bases: NFTBase
Basically the same as the Nakanishi-Fujii-Todo optimization algorithm [1]. The difference from the NFT class is that NFTfit uses the SciPy fitting function curve_fit for the parameter update. (A usage sketch follows the reference below.)
- Parameters:
randomize (bool) – If True, the order of the parameters over which the cost function is optimized will be random. If False, the order will be the same as the order of the array of initial parameters.
n_points (int) – Number of values of the cost function used for function fitting with the SciPy fitting function curve_fit.
ftol (Optional[float]) – If not None, convergence is judged by the cost function tolerance. See ftol() for details.
- Ref:
- [1]: Ken M. Nakanishi, Keisuke Fujii, and Synge Todo,
Sequential minimal optimization for quantum-classical hybrid algorithms, Phys. Rev. Research 2, 043158 (2020).
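A corresponding sketch for NFTfit; it is driven exactly like NFT, and the n_points value here is an illustrative choice of how many cost evaluations feed curve_fit per parameter update. The status check assumes the same OptimizerStatus members as in the loop sketch above.

```python
import numpy as np

from quri_parts.algo.optimizer.interface import OptimizerStatus
from quri_parts.algo.optimizer.nft import NFTfit


def cost_fn(params: np.ndarray) -> float:
    return float(np.sum(np.cos(params)))


optimizer = NFTfit(n_points=7)  # 7 cost samples per parameter fit (illustrative)
state = optimizer.get_init_state(np.array([0.0, 0.5, 1.0]))

# Keep stepping while the optimizer reports plain SUCCESS
# (i.e. neither CONVERGED nor FAILED yet).
while state.status == OptimizerStatus.SUCCESS:
    state = optimizer.step(state, cost_fn)

print(state.params, state.cost)
```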