TransWikia.com

Time result for my algorithm

Quantum Computing Asked by Simona99 on May 26, 2021

I’ve used the function result.time_taken to find out how long a task takes to execute on the real chip, but I can’t tell whether the result is in seconds or in milliseconds: when I run my circuit with 1 shot the result is 7.8, and with 1024 shots it is 8.4. How is this possible? Does the time cover only the execution of the algorithm, or other steps as well (for instance, the initialization of my input…)?

One Answer

The device time consists of multiple steps: loading the circuit onto the device, then the actual processor time, and finally returning the results.

Only the processor time scales with the number of shots; the device loading time and the result return are constant overhead. So the behavior you saw makes sense: there is a constant offset plus a component that scales linearly with the number of shots. You can estimate the overhead by doing a linear fit of total execution time against shot count; the offset (intercept) equals the overhead.
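As a rough sketch of that fit, here is the two-point version using the timings reported in the question (with only two measurements the "fit" is exact; with more data points you would use an actual least-squares fit, e.g. numpy.polyfit):

```python
# Estimate the constant overhead and per-shot time from two measurements.
# The numbers below are the ones reported in the question, for illustration.
shots_a, time_a = 1, 7.8       # 1 shot    -> 7.8 (total reported time)
shots_b, time_b = 1024, 8.4    # 1024 shots -> 8.4

# Slope of the line: time added per extra shot
per_shot = (time_b - time_a) / (shots_b - shots_a)

# Intercept of the line: the constant overhead (loading + result return)
overhead = time_a - per_shot * shots_a

print(f"per-shot time:     {per_shot:.6f}")   # ~0.000587 per shot
print(f"constant overhead: {overhead:.2f}")   # ~7.80
```

With these numbers almost the entire reported time is overhead, which is exactly why going from 1 shot to 1024 shots barely changes the total.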

As a more concrete example: on a recent job, loading the device and returning the results together took about 10 s, and the processor time was ~4 s for 1024 shots and ~28 s for 8192 shots (so even a little faster than 8× the 1024-shot time). Of course, these numbers depend on the device you're using!
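A quick arithmetic check of those example numbers shows that only the processor time grows with the shot count, and slightly sub-linearly at that:

```python
# Processor times reported in the answer above (device-dependent examples).
t_1024 = 4.0     # seconds of processor time for 1024 shots
t_8192 = 28.0    # seconds of processor time for 8192 shots

naive_8x = 8 * t_1024          # 32.0 s if scaling were exactly linear
factor = t_8192 / t_1024       # 7.0 -> actual scaling factor

print(t_8192 < naive_8x)       # True: a bit faster than 8x the 1024-shot time
print(factor)                  # 7.0
```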

Correct answer by Cryoris on May 26, 2021
