TransWikia.com

How to increase matrix size and get high fidelity for the HHL algorithm in Qiskit?

Quantum Computing Asked by Tanin Imanothai on March 29, 2021

I have followed an example that Qiskit provides here.
I tried to increase the matrix size to 16×16 and changed num_ancillae and num_time_slices in the create_eigs() function.

# Imports as in the (now-deprecated) Qiskit Aqua HHL tutorial; the helper
# functions create_eigs() and fidelity() used below are defined in the linked notebook.
import numpy as np
from qiskit import Aer
from qiskit.aqua import QuantumInstance, aqua_globals
from qiskit.aqua.algorithms import HHL, NumPyLSsolver
from qiskit.aqua.components.initial_states import Custom
from qiskit.aqua.components.reciprocals import LookupRotation
from qiskit.aqua.utils import random_hermitian

# set the random seed to get the same pseudo-random matrix for every run
aqua_globals.random_seed = 0
matrix = random_hermitian(16)
vector = matrix.dot(np.array([1, 2, 3, 1, 1, 2, 3, 1, 1, 2, 3, 1, 1, 2, 3, 1]))

m = np.array(matrix)

orig_size = len(vector)
matrix, vector, truncate_powerdim, truncate_hermitian = HHL.matrix_resize(matrix, vector)

# Initialize eigenvalue finding module
eigs = create_eigs(matrix, 2, 2, True)
num_q, num_a = eigs.get_register_sizes()

# Initialize initial state module
init_state = Custom(num_q, state_vector=vector)

# Initialize reciprocal rotation module
reci = LookupRotation(negative_evals=eigs._negative_evals, evo_time=eigs._evo_time)

algo = HHL(matrix, vector, truncate_powerdim, truncate_hermitian, eigs,
           init_state, reci, num_q, num_a, orig_size)
result = algo.run(QuantumInstance(Aer.get_backend('statevector_simulator'),
                                  seed_simulator=aqua_globals.random_seed,
                                  seed_transpiler=aqua_globals.random_seed))
print("solution ", np.round(result['solution'], 5))

result_ref = NumPyLSsolver(matrix, vector).run()
print("classical solution ", np.round(result_ref['solution'], 5))

print("probability %f" % result['probability_result'])
fidelity(result['solution'], result_ref['solution'])

The result is

solution  [ 0.19079-0.95092j  0.26228+0.11189j -0.30868-0.55258j -0.7612 +1.61692j
  0.64665-0.26533j  1.20938-0.40916j -0.51564+1.98277j -0.08177-2.63386j
  1.14807-0.1218j   0.87798+1.39184j  0.8494 +0.00695j -0.0529 -0.11107j
  0.28287+0.74082j  1.3964 +0.23344j -2.15506+1.25378j  1.07591-0.70505j]
classical solution  [1.+0.j 2.+0.j 3.-0.j 1.-0.j 1.+0.j 2.-0.j 3.+0.j 1.-0.j 1.-0.j 2.+0.j
 3.+0.j 1.-0.j 1.+0.j 2.-0.j 3.-0.j 1.+0.j]
probability 0.000000
fidelity 0.040951
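For reference, the fidelity printed by the tutorial's fidelity() helper is, as far as I can tell from the notebook, the squared overlap of the two normalized solution vectors. A minimal NumPy sketch of that metric (my own re-implementation, not the tutorial code):

```python
import numpy as np

def fidelity(a, b):
    """Squared overlap |<a|b>|^2 of the two vectors after normalization."""
    a = np.asarray(a, dtype=complex) / np.linalg.norm(a)
    b = np.asarray(b, dtype=complex) / np.linalg.norm(b)
    return abs(np.vdot(a, b)) ** 2

x = np.array([1, 2, 3, 1], dtype=complex)
print(fidelity(x, 1j * x))          # a global phase does not matter -> 1
print(fidelity(x, [1, -2, 3, -1]))  # different state -> below 1
```

Because both vectors are normalized first, only the direction of the solution counts, not its scale.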

I got very low fidelity when I changed num_ancillae to 2, and if I increased num_ancillae to 3, my kernel just died without showing any error.

My questions are,

What caused my kernel to die? Is it normal?

How do num_ancillae and num_time_slices affect the fidelity?
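For intuition on the second question (a rough classical model, not the actual QPE circuit): num_ancillae sets how many bits phase estimation has to resolve the eigenvalues of the matrix, and num_time_slices controls the Trotter error of the Hamiltonian simulation. Quantizing the eigenvalues of a small Hermitian system to k bits and re-solving shows how limited eigenvalue resolution degrades the solution:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
A = (A + A.T) / 2                       # random real symmetric (Hermitian) matrix
b = np.array([1.0, 2.0, 3.0, 1.0])

evals, evecs = np.linalg.eigh(A)
x_exact = np.linalg.solve(A, b)

def overlap(u, v):
    """Squared overlap of normalized vectors, as the tutorial's fidelity."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    return abs(np.vdot(u, v)) ** 2

for k in (2, 3, 4, 6, 8):
    # crude stand-in for QPE: snap each eigenvalue to a k-bit grid over its range
    grid = np.linspace(evals.min(), evals.max(), 2 ** k)
    q = grid[np.argmin(np.abs(evals[:, None] - grid[None, :]), axis=1)]
    x_k = evecs @ ((evecs.T @ b) / q)   # re-solve with the quantized eigenvalues
    print(k, "bits -> fidelity", overlap(x_k, x_exact))
```

With too few bits, distinct eigenvalues collapse onto the same grid point and the reconstructed solution can be almost orthogonal to the true one, which matches the very low fidelity seen above.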

One Answer

If you are using the IBM Quantum Experience Jupyter notebook environment, there is a maximum of 8 GB of memory that you can use.

You can use the command: !free -h to see how much memory you have left.

Actually, I just logged into my IBMQ Experience account and noticed that it now shows how much memory you are using, updated every 5 seconds. It is located at the top of your Jupyter notebook. See the picture below.

(Screenshot: memory-usage indicator at the top of the Jupyter notebook)

If your notebook uses more than this 8 GB allocation, the kernel will die and restart automatically.

I also see that you are using the statevector_simulator, which can be very expensive. Instead, switch to the qasm_simulator or ibmq_qasm_simulator and see if doing that fixes the memory problem.
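For scale (my own back-of-the-envelope numbers, not from the question): a statevector of n qubits stores 2^n complex amplitudes at 16 bytes each, so memory doubles with every extra qubit. A 16×16 system alone needs 4 state qubits, and the eigenvalue register, ancillae, and any extra workspace add more on top, which is why a single additional ancilla can push a simulation past an 8 GB limit:

```python
# Back-of-the-envelope memory cost of full statevector simulation.
def statevector_bytes(num_qubits: int) -> int:
    """2**n complex128 amplitudes, 16 bytes each."""
    return 16 * 2 ** num_qubits

for n in (20, 25, 28, 29, 30):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:.1f} GiB")
```

At 29 qubits the statevector alone is already 8 GiB, before the simulator's own overhead.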

Answered by KAJ226 on March 29, 2021
