Jeremy Ting edited p3.tex  about 10 years ago

Commit id: ea7845956e8a665d21dac3a28ecf889258fb6a06


\item Using the Master Theorem, we can write the recursive cost of this function as \[T(n) = aT\left(\frac{n}{b}\right) + O(n^d).\] The branching factor $a$ is 1 because we are left with only one subproblem at each iteration. Computing $\frac{y}{2}$ takes constant time because $y$ is on the order of $2^n$, so dividing by 2 is just a simple bitshift. The subproblem size is $n/2$ because you only have to deal with a number with half the bit-size. The cost of combining the subproblem back into the larger problem is the cost of squaring a number and then taking it mod a number on the order of $2^n$. Squaring a number is like multiplying two $n$-bit numbers, which by Gauss' observation takes $O(n^{1.59})$ time. Taking the mod by a power of 2 is simple because you do a shift that takes constant time, and the answer is the bits you shifted off. So the cost of recombining is $O(n^{1.59})$. Thus, using the Master Theorem: \[T(n) = 1 \cdot T\left(\frac{n}{2}\right) + O(n^{1.59}),\] which solves to $T(n) = O(n^{1.59})$.
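The recurrence above can be sketched concretely as a recursive modular exponentiation where the exponent is halved at each step. This is an illustrative sketch, not the assignment's reference solution: the function name and interface are ours, and we assume (as the text suggests) that the modulus is a power of two $2^n$, so the mod is a bit mask over the low $n$ bits.

```python
def mod_exp_pow2(x, y, n):
    """Sketch: compute x**y mod 2**n by repeated squaring.

    Halving y is a right shift, leaving one subproblem (a = 1).
    Recombining costs one squaring (the O(n^1.59) term, if a
    Karatsuba-style multiply is used) plus a mod by 2**n, which
    is just masking off the low n bits.
    """
    mask = (1 << n) - 1          # mod 2**n == keep the low n bits
    if y == 0:
        return 1 & mask
    half = mod_exp_pow2(x, y >> 1, n)   # one subproblem, exponent halved
    result = (half * half) & mask       # square, then mask = mod 2**n
    if y & 1:                           # odd exponent: one extra multiply
        result = (result * x) & mask
    return result
```

Note that Python's built-in multiplication stands in for the $O(n^{1.59})$ Karatsuba squaring here; the recursion structure is what matches the recurrence $T(n) = 1 \cdot T(n/2) + O(n^{1.59})$.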