
Variational algorithms

Recently, variational quantum algorithms have been actively studied; in these algorithms, optimal values of the parameters in parametric quantum circuits are searched for. In this section, we see how to construct one of the variational algorithms, the variational quantum eigensolver (VQE), using the gradient.

The variational quantum eigensolver (VQE) is a method to optimize the expectation value of an operator (e.g. the energy of a molecule) over parametrized quantum states. There are two major components in VQE:

  • Ansatz: A parametric quantum circuit which generates the parametrized quantum states subject to optimization
  • Optimizer: A method to numerically optimize the expectation value of the operator
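The interplay between these two components can be sketched with a self-contained toy example in plain numpy (not QURI Parts): a one-parameter "ansatz" state, the Pauli Z matrix as the operator, and plain gradient descent as the optimizer. All names here are illustrative.

```python
import numpy as np

# Toy "ansatz": a single-qubit state cos(t)|0> + sin(t)|1>, parametrized by t.
def toy_state(theta: float) -> np.ndarray:
    return np.array([np.cos(theta), np.sin(theta)])

# Toy "operator": the Pauli Z matrix; its expectation value is the cost.
Z = np.diag([1.0, -1.0])

def cost(theta: float) -> float:
    psi = toy_state(theta)
    return float(psi @ Z @ psi)  # <psi|Z|psi> = cos(2*theta)

# Toy "optimizer": gradient descent using the analytic gradient.
theta, lr = 0.1, 0.1
for _ in range(200):
    grad = -2.0 * np.sin(2.0 * theta)  # d/dtheta of cos(2*theta)
    theta -= lr * grad

print(round(cost(theta), 6))  # approaches -1, the minimum eigenvalue of Z
```

A real VQE replaces the toy state with a parametric circuit applied by a quantum device or simulator, and the analytic gradient with an estimated one, but the optimization loop has the same shape.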

Ansatz

In the context of VQE, an ansatz refers to a parametric quantum circuit used to generate the parametrized quantum states for which expectation values of the target operator are evaluated. You can define a (LinearMapped)UnboundParametricQuantumCircuit on your own, or use a well-known ansatz defined in the quri_parts.algo.ansatz package. In this example we use a hardware-efficient ansatz1:

from quri_parts.algo.ansatz import HardwareEfficient

hw_ansatz = HardwareEfficient(qubit_count=4, reps=3)

In order to evaluate the expectation value, a parametrized quantum state is necessary; it is obtained by applying the ansatz to a specific initial state. Here we use the computational basis state |0011⟩.

from quri_parts.core.state import quantum_state, apply_circuit

cb_state = quantum_state(4, bits=0b0011)
parametric_state = apply_circuit(hw_ansatz, cb_state)

List of available ansatz

Here we list all the ansatz available in quri_parts.algo.ansatz. More chemistry-related ansatz are provided in quri_parts.chem.ansatz and quri_parts.openfermion.ansatz; they are introduced in the quantum chemistry tutorial.

  • HardwareEfficient – Hardware-efficient variational quantum eigensolver for small molecules and quantum magnets
  • HardwareEfficientReal – Hardware-efficient variational quantum eigensolver for small molecules and quantum magnets
  • SymmetryPreserving – Efficient Symmetry-Preserving State Preparation Circuits for the Variational Quantum Eigensolver Algorithm
  • SymmetryPreservingReal – Calculating transition amplitudes by variational quantum deflation
  • TwoLocal

Optimizer

An optimizer searches for optimal parameters that minimize a given cost function. In the context of VQE, the cost function is the expectation value of the target operator. Some optimizers use only the cost function itself, while others also use the gradient of the cost function for more efficient optimization. You can use optimizers provided by libraries such as scipy.optimize, or ones provided in the quri_parts.algo.optimizer package. In this example we use Adam2, which uses the gradient.

from quri_parts.algo.optimizer import Adam

# You can pass optional parameters. See the reference for details
adam_optimizer = Adam()

List of available optimizers

Here we list all the optimizers available in QURI Parts.

  • AdaBelief – AdaBelief Optimizer: Adapting Stepsizes by the Belief in Observed Gradients
  • Adam – Adam: A Method for Stochastic Optimization
  • NFT – Sequential minimal optimization for quantum-classical hybrid algorithms
  • NFTfit – Sequential minimal optimization for quantum-classical hybrid algorithms
  • LBFGS – Numerical Optimization
  • SPSA – Implementation of the simultaneous perturbation algorithm for stochastic optimization
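As a rough sketch of what a gradient-based optimizer such as Adam does internally, here is a minimal, self-contained implementation of the Adam update rule, following the default hyperparameters of the original paper. This is not the QURI Parts implementation, and the function name is made up for illustration.

```python
import numpy as np

def adam_minimize(grad_fn, params, lr=0.05, beta1=0.9, beta2=0.999,
                  eps=1e-8, n_steps=1000):
    """Minimal Adam loop: keeps exponential moving averages of the gradient
    (m) and squared gradient (v), with bias correction, and steps by the
    ratio of the two."""
    params = np.asarray(params, dtype=float)
    m = np.zeros_like(params)
    v = np.zeros_like(params)
    for t in range(1, n_steps + 1):
        g = np.asarray(grad_fn(params))
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)       # bias-corrected first moment
        v_hat = v / (1 - beta2**t)       # bias-corrected second moment
        params = params - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params

# Minimize f(x, y) = (x - 1)^2 + (y + 2)^2; its gradient is 2*(p - p_min).
opt = adam_minimize(lambda p: 2 * (p - np.array([1.0, -2.0])), [0.0, 0.0])
print(opt)  # close to [1, -2]
```

The per-coordinate normalization by the squared-gradient average is what makes Adam relatively insensitive to the scale of individual parameters, which is convenient when ansatz parameters affect the cost to very different degrees.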

Running VQE

We first define a target operator, whose expectation value is subject to the optimization:

from quri_parts.core.operator import Operator, pauli_label, PAULI_IDENTITY

# This is Jordan-Wigner transformed Hamiltonian of a hydrogen molecule
hamiltonian = Operator({
    PAULI_IDENTITY: 0.03775110394645542,
    pauli_label("Z0"): 0.18601648886230593,
    pauli_label("Z1"): 0.18601648886230593,
    pauli_label("Z2"): -0.2694169314163197,
    pauli_label("Z3"): -0.2694169314163197,
    pauli_label("Z0 Z1"): 0.172976101307451,
    pauli_label("Z0 Z2"): 0.12584136558006326,
    pauli_label("Z0 Z3"): 0.16992097848261506,
    pauli_label("Z1 Z2"): 0.16992097848261506,
    pauli_label("Z1 Z3"): 0.12584136558006326,
    pauli_label("Z2 Z3"): 0.17866777775953396,
    pauli_label("X0 X1 Y2 Y3"): -0.044079612902551774,
    pauli_label("X0 Y1 Y2 X3"): 0.044079612902551774,
    pauli_label("Y0 X1 X2 Y3"): 0.044079612902551774,
    pauli_label("Y0 Y1 X2 X3"): -0.044079612902551774,
})
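To make the notion of an operator expectation value concrete, the following numpy-only sketch (independent of QURI Parts) builds a small two-qubit operator from Pauli matrices and evaluates ⟨ψ|H|ψ⟩ for a computational basis state. The coefficients are made up for illustration.

```python
import numpy as np

I2 = np.eye(2)
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

# A toy 2-qubit operator: 0.5*Z0 - 0.3*Z1 + 0.2*X0 X1,
# with qubit 0 as the least significant bit of the state index.
H = (0.5 * np.kron(I2, Z)     # Z on qubit 0
     - 0.3 * np.kron(Z, I2)   # Z on qubit 1
     + 0.2 * np.kron(X, X))   # X on both qubits

# Computational basis state |01> (qubit 0 = 1, qubit 1 = 0).
psi = np.zeros(4)
psi[0b01] = 1.0

expval = psi @ H @ psi
print(expval)  # 0.5*(-1) - 0.3*(+1) = -0.8
```

An estimator plays the same role for a real Hamiltonian, but without ever materializing the exponentially large matrix: it evaluates each Pauli term directly on the (simulated or sampled) state.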

Using this operator and the parametric state prepared above, we can define the cost function as a function of the circuit parameters:

from typing import Sequence
from quri_parts.qulacs.estimator import create_qulacs_vector_parametric_estimator

estimator = create_qulacs_vector_parametric_estimator()

def cost_fn(param_values: Sequence[float]) -> float:
    estimate = estimator(hamiltonian, parametric_state, param_values)
    return estimate.value.real

We also define the gradient of the cost function, using a numerical gradient estimator:

import numpy as np
from quri_parts.core.estimator.gradient import create_numerical_gradient_estimator
from quri_parts.qulacs.estimator import create_qulacs_vector_concurrent_parametric_estimator

qulacs_concurrent_parametric_estimator = create_qulacs_vector_concurrent_parametric_estimator()
gradient_estimator = create_numerical_gradient_estimator(
    qulacs_concurrent_parametric_estimator,
    delta=1e-4,
)

def grad_fn(param_values: Sequence[float]) -> Sequence[float]:
    estimate = gradient_estimator(hamiltonian, parametric_state, param_values)
    return np.asarray([g.real for g in estimate.values])
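In essence, a numerical gradient estimator with step delta evaluates the cost at parameter values shifted by ±delta along each axis and takes central differences. Here is a minimal numpy sketch of that idea (not the QURI Parts implementation) applied to a plain Python function:

```python
import numpy as np

def numerical_gradient(f, params, delta=1e-4):
    """Central-difference gradient: (f(p + d*e_i) - f(p - d*e_i)) / (2*d)
    for each coordinate direction e_i."""
    params = np.asarray(params, dtype=float)
    grad = np.empty_like(params)
    for i in range(params.size):
        shift = np.zeros_like(params)
        shift[i] = delta
        grad[i] = (f(params + shift) - f(params - shift)) / (2 * delta)
    return grad

# Example: f(x, y) = x^2 + 3y, whose gradient at (1, 2) is (2, 3).
g = numerical_gradient(lambda p: p[0] ** 2 + 3 * p[1], [1.0, 2.0])
print(g)  # approximately [2., 3.]
```

Note that this requires two cost evaluations per parameter, which is why a concurrent estimator is passed in above: the shifted circuits can be evaluated in a batch.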

Then we can run VQE with a QURI Parts optimizer:

from quri_parts.algo.optimizer import (
    OptimizerStatus, Optimizer, OptimizerState, CostFunction, GradientFunction
)

def vqe(
    init_params: Sequence[float],
    cost_fn: CostFunction,
    grad_fn: GradientFunction,
    optimizer: Optimizer,
) -> OptimizerState:
    opt_state = optimizer.get_init_state(init_params)
    while True:
        opt_state = optimizer.step(opt_state, cost_fn, grad_fn)
        if opt_state.status == OptimizerStatus.FAILED:
            print("Optimizer failed")
            break
        if opt_state.status == OptimizerStatus.CONVERGED:
            print("Optimizer converged")
            break
    return opt_state

init_params = [0.1] * hw_ansatz.parameter_count
result = vqe(init_params, cost_fn, grad_fn, adam_optimizer)
print("Optimized value:", result.cost)
print("Optimized parameter:", result.params)
print("Iterations:", result.niter)
print("Cost function calls:", result.funcalls)
print("Gradient function calls:", result.gradcalls)
#output
Optimizer converged
Optimized value: -1.11198134062873
Optimized parameter: [ 5.47178292e-02 8.40762153e-02 5.12253346e-02 8.19750256e-02
-9.72099808e-03 -1.16141848e-01 -3.06727509e-03 9.66792839e-01
1.27323903e-01 1.04790843e-01 1.27097746e-01 9.40512669e-02
-1.60419261e-02 9.92326546e-01 -3.35897820e-02 9.91027220e-01
6.44048146e-02 2.49953437e-04 6.43611652e-02 -5.72090493e-03
-1.48640075e-02 -1.16555437e-01 -3.59503991e-02 9.79005521e-01
1.67652639e-02 -2.35033768e-01 1.34115104e-02 -2.24492914e-01
-2.91851965e-02 4.35033813e-01 -3.52284767e-03 4.24493167e-01]
Iterations: 24
Cost function calls: 25
Gradient function calls: 24

You can also run VQE with a SciPy optimizer:

from scipy.optimize import minimize, OptimizeResult
from typing import Any

def vqe_scipy(
    init_params: Sequence[float],
    cost_fn: CostFunction,
    grad_fn: GradientFunction,
    method: str,
    options: dict[str, Any],
) -> OptimizeResult:
    return minimize(cost_fn, init_params, jac=grad_fn, method=method, options=options)

init_params = [0.1] * hw_ansatz.parameter_count
bfgs_options = {
    "gtol": 1e-6,
}
result = vqe_scipy(init_params, cost_fn, grad_fn, "BFGS", bfgs_options)
print(result.message)
print("Optimized value:", result.fun)
print("Optimized parameter:", result.x)
print("Iterations:", result.nit)
print("Cost function calls:", result.nfev)
print("Gradient function calls:", result.njev)
#output
Optimization terminated successfully.
Optimized value: -1.1299047842810679
Optimized parameter: [ 8.21928344e-04 4.33960432e-02 6.61359653e-01 2.12944495e-03
3.12841863e-01 -4.24605927e-02 -1.39249846e+00 -9.30478701e-04
3.36619434e-01 6.83989561e-05 6.57368440e-01 -2.86291732e-01
6.78422588e-01 1.15552031e-01 2.19605550e+00 -2.44201069e-03
1.57041210e+00 -5.75440049e-06 -4.71087544e-04 1.77640837e-01
1.79445746e-01 -1.42657710e-01 -2.29209768e-01 1.42811907e-02
1.23444775e+00 1.10519712e-01 -6.26506317e-04 1.10540545e-01
-5.21624186e-01 8.94804419e-02 -6.31506659e-01 8.94593665e-02]
Iterations: 171
Cost function calls: 176
Gradient function calls: 176

Footnotes

  1. Kandala, A., Mezzacapo, A., Temme, K. et al. Hardware-efficient variational quantum eigensolver for small molecules and quantum magnets. Nature 549, 242–246 (2017).

  2. Diederik P. Kingma, Jimmy Ba, Adam: A Method for Stochastic Optimization. arXiv:1412.6980 (2014)