Stack Overflow Asked by user3599803 on December 20, 2020
How can I measure the full time grpc-python takes to handle a request?
So far the best I can do is:
def Run(self, request, context):
    start = time.time()
    # service code...
    end = time.time()
    return myservice_stub.Response()
But this doesn't measure how much time gRPC spends serializing the request and response, transferring them over the network, and so on. I'm looking for a way to "hook" into these steps.
You can measure on the client side:
start = time.time()
response = stub.Run(request)
total_end_to_end = time.time() - start
Then you can get the total overhead (serialization, transfer) by subtracting the computation time of the Run method.
To automate the process, you can (at least for the sake of the test) add the computation time as a field to myservice_stub.Response, as sketched below.
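A minimal sketch of that idea follows. The field name compute_seconds and the module names myservice_pb2 / myservice_pb2_grpc are assumptions for illustration; adjust them to match your .proto definitions.

import time

# Server side: report the handler's own computation time in the response.
class MyService(myservice_pb2_grpc.MyServiceServicer):
    def Run(self, request, context):
        start = time.time()
        # ... actual service work ...
        result = myservice_pb2.Response()
        result.compute_seconds = time.time() - start  # assumed proto field
        return result

# Client side: measure end-to-end latency and subtract the reported compute time.
def measure_overhead(stub, request):
    start = time.time()
    response = stub.Run(request)
    end_to_end = time.time() - start
    # Everything that is not handler computation: (de)serialization,
    # transport, and gRPC framework overhead on both sides.
    overhead = end_to_end - response.compute_seconds
    return end_to_end, overhead

Note that this attributes all non-handler time to "overhead" in one lump; it cannot break that time down into serialization versus network transfer.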
Answered by Mark Loyman on December 20, 2020