How do you calculate the running time using Big-O notation?

Computer Science Asked by Veree on October 21, 2021

I’m still new to data structures and algorithms, so I would like to clear up some doubts. I’m required to find the Big-O running time of myMethod():

 static int doIt(int n) {
     for i ← 1 to 100 do        -> 100
         for k ← 1 to n do      -> n x 100
             m ← 1; j ← n       -> 1
             while m < j do     -> lg n
                 m ← (m+j)/2    -> lg n/2
             end while
         end for
     end for
 }

 static int myMethod(int n) {
     i ← 1
     while (i < n) {    -> lg n
         doIt(i)        -> ?
         i ← i × 2      -> lg n
     }
     return 1;
 }

For each line, I have inserted the value that I think is logical, e.g. "-> 100". However, I am not confident in these answers. I would also like to ask the following questions:

  1. Do all while loops have a cost of lg n?

  2. If the doIt(n) method does not have a return statement, do we still include it in the calculation?

  3. For the line m ← (m+j)/2, the value I have entered is lg n / 2. Is that correct, or is it some other value?

  4. Whenever we multiply or divide by 2, is it always lg (i.e., log base 2)? What if n is multiplied by 3 instead?

Thank You

One Answer

  1. The while loop is executed $O(\log n)$ times (more precisely, $\Theta(\log n)$ times); a small counting sketch appears after this list.

  2. Yes. You account for every statement that is executed.

  3. The line $m \gets (m+j)/2$ can be executed in time $O(1)$, assuming that arithmetic operations take constant time. The overall time spent in one execution of the while loop of the function doIt(n) is $O(\log n)$, since the number of its iterations is also $O(\log n)$ (and, more precisely, $\Theta(\log n)$); see the second sketch after this list.

  4. This depends on the actual cost model you are using. Assuming that arithmetic operations take constant time, what you are probably thinking of is the number of divisions (resp. multiplications) by $2$ needed for $n$ to drop below (become larger than) some value $x < n$ (resp. $x > n$). If that's the case, then the number of iterations is $O(\log n - \log x) \subseteq O(\log n)$ (resp. $O(\log x - \log n) \subseteq O(\log x)$). Multiplying/dividing by any constant greater than $1$ does not change the asymptotic time complexity (recall that $\log_b a = \frac{\log_2 a}{\log_2 b}$, where $\log_2 b$ is a constant in this scenario).
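
As promised in item 1, here is a minimal Java sketch that counts how many times the body of a doubling loop like the one in myMethod runs. The helper name countDoublings and the test values are illustrative, not from the question:

    public class LoopCount {
        // Counts how many times the body of "i <- 1; while (i < n) { ...; i <- i * 2 }" runs.
        static int countDoublings(int n) {
            int count = 0;
            for (long i = 1; i < n; i *= 2) count++;
            return count;
        }

        public static void main(String[] args) {
            System.out.println(countDoublings(1024)); // 10 = log2(1024)
            System.out.println(countDoublings(1000)); // 10 = ceil(log2(1000))
        }
    }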
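
For item 3, note that the gap j - m roughly halves on every iteration, which is where the logarithm comes from. A minimal Java sketch follows; the helper name countHalvings is mine, and the average is rounded up so the loop always terminates (with plain floor division it can stall at m = j - 1, since e.g. (99 + 100)/2 = 99):

    public class GapHalving {
        // Counts iterations of "m <- 1; while (m < j) m <- (m + j) / 2",
        // rounding the average up so that m eventually reaches j.
        static int countHalvings(int j) {
            int count = 0;
            for (int m = 1; m < j; m = (m + j + 1) / 2) count++;
            return count;
        }

        public static void main(String[] args) {
            // The gap j - m halves each iteration, so the count grows like log2(j).
            System.out.println(countHalvings(16));      // 4
            System.out.println(countHalvings(1 << 20)); // 20
        }
    }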
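
Finally, to illustrate item 4's base-change point concretely: the sketch below (the helper name countMultiplies and the sample n are mine) counts how many multiplications by 2 versus by 3 it takes to reach n. The two counts differ only by the constant factor log2(3), about 1.58:

    public class BaseComparison {
        // Counts how many multiplications by `factor` it takes
        // for a counter starting at 1 to reach at least n.
        static int countMultiplies(long n, long factor) {
            int count = 0;
            for (long i = 1; i < n; i *= factor) count++;
            return count;
        }

        public static void main(String[] args) {
            long n = 1_000_000_000L;
            System.out.println(countMultiplies(n, 2)); // 30, about log2(n)
            System.out.println(countMultiplies(n, 3)); // 19, about log3(n)
            // 30/19 is about 1.58 = log2(3), a constant: both loops are Theta(log n).
        }
    }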

Answered by Steven on October 21, 2021
