exercise: prove that lg(1) equals 0
exercise: prove log(xy) equals log(x) plus log(y)
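one possible sketch for the second exercise (one approach among several, using the definition of log as the inverse of exponentiation; base 2 assumed, any fixed base works the same way):

```latex
% let a = \log x and b = \log y, so x = 2^a and y = 2^b
% then xy = 2^a \cdot 2^b = 2^{a+b}, and taking \log of both sides:
\log(xy) = \log\!\left(2^{a+b}\right) = a + b = \log x + \log y
```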
worst-case runtime? not even known whether the loop terminates for every input
runtime for n a power of 2?
case A: usual assumption?
n fits in one memory location, so each arithmetic op'n takes Theta(1) time
ignore printing
Theta(log n) loop iterations, each iteration takes Theta(1) time, so total runtime Theta(log n)
case B: assume n could be arbitrarily large
each arithmetic op'n takes Theta(log n) time, since n occupies Theta(log n) memory locations
so total runtime Theta(log n) * Theta(log n) = Theta((log n)^2)
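the loop itself isn't reproduced in these notes; given that termination is stated as unknown, it may be the Collatz (3n+1) loop — a hypothetical reconstruction, not necessarily the one from lecture. for n a power of 2 it only ever halves, giving the Theta(log n) iteration count above:

```python
def collatz_steps(n):
    # Hypothetical loop matching the analysis above: whether this
    # terminates for every n >= 1 is a famous open problem (the Collatz
    # conjecture), but for n a power of 2 it just halves n down to 1,
    # so it runs Theta(log n) iterations.
    steps = 0
    while n > 1:
        if n % 2 == 0:
            n = n // 2       # one arithmetic op: Theta(1) in case A
        else:
            n = 3 * n + 1
        steps += 1
    return steps
```

for n = 2^k this does exactly k halvings, matching the "runtime for n a power of 2?" question.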
def fib(n):
    if n <= 1:
        return n
    return fib(n-1) + fib(n-2)
first, we need a definition of the n'th fibonacci number f(n)
f(0) defined as 0
f(1) defined as 1
for all integers t at least 2, f(t) defined as f(t-1) plus f(t-2)
argue by induction
base cases
fib(n) returns f(n) for n = 0,1 (why?)
inductive case
fix an integer t >= 2
assume fib(k) returns f(k) for all k in {0, 1, ..., t-1} (strong induction hypothesis)
want to show fib(t) returns f(t)
exercise: finish this proof
n    fib(n)    s      current/prev
30   832040    0.503  1.615
31   1346269   0.815  1.619
32   2178309   1.328  1.628
33   3524578   2.164  1.629
34   5702887   3.517  1.625
within one call, constant number of arithmetic ops, plus 2 recursive calls
total number T(n) of calls?
T(0) = T(1) = 1
for n >= 2, T(n) = 1 + T(n-1) + T(n-2)
similar to fibonacci
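a quick numerical check that the call count tracks the fibonacci numbers (a sketch; the closed form T(n) = 2*f(n+1) - 1 follows from the two recurrences by induction):

```python
def f(n):
    # n'th Fibonacci number, computed iteratively
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def T(n):
    # total number of calls made by the recursive fib(n)
    if n <= 1:
        return 1
    return 1 + T(n-1) + T(n-2)

# T satisfies almost the same recurrence as f; in fact T(n) = 2*f(n+1) - 1
for n in range(15):
    assert T(n) == 2 * f(n+1) - 1
```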
can we estimate asymptotic growth of fibonacci?
so f(n) in Theta(g^n) where g = 1.618... is the golden ratio
turns out T(n) also in Theta(g^n)
matrix equation:

  | f(n)   |   | 1 1 |   | f(n-1) |
  |        | = |     | * |        |
  | f(n-1) |   | 1 0 |   | f(n-2) |

expand the equation n-2 times:

  | f(n)   |   | 1 1 |^(n-1)   | f(1) |
  |        | = |     |       * |      |
  | f(n-1) |   | 1 0 |         | f(0) |

solve this equation by diagonalizing the matrix; the two terms of Binet's formula come from the two eigenvalues
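as a follow-on (not in the notes above): instead of expanding one step at a time, repeatedly square the same matrix, which gives f(n) in O(log n) 2x2 multiplications:

```python
def mat_mult(A, B):
    # product of two 2x2 matrices, given as nested lists
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def mat_fib(n):
    # compute f(n) via [[1,1],[1,0]]^(n-1) by repeated squaring
    if n == 0:
        return 0
    result = [[1, 0], [0, 1]]          # identity matrix
    M = [[1, 1], [1, 0]]
    e = n - 1
    while e > 0:
        if e % 2 == 1:
            result = mat_mult(result, M)
        M = mat_mult(M, M)             # square: exponent doubles
        e //= 2
    # [f(n); f(n-1)] = M^(n-1) * [f(1); f(0)] = M^(n-1) * [1; 0]
    return result[0][0]
```

(as in case B, each multiplication costs more once the entries outgrow one memory location, so O(log n) multiplications is not O(log n) total time)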
def ifib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a+b
    return a
loop invariant: after j loop iterations, a = f(j), b = f(j+1)
exercise: prove by induction
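not a proof, but the invariant can be spot-checked mechanically by instrumenting the loop (ifib_checked is a name introduced here for illustration):

```python
def ifib_checked(n):
    # ifib with the loop invariant asserted after every iteration:
    # after j iterations, a == f(j) and b == f(j+1)
    fibs = [0, 1]                          # reference values of f
    while len(fibs) < n + 2:
        fibs.append(fibs[-1] + fibs[-2])
    a, b = 0, 1
    for j in range(n):                     # j counts completed iterations below
        a, b = b, a + b
        assert a == fibs[j+1] and b == fibs[j+2]
    return a
```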
n        s       current/prev
65536    0.096   3.641
131072   0.320   3.333
262144   1.205   3.756
524288   4.645   3.853
1048576  18.215  3.920
when n doubles, what happens to runtime?
compare time steps n and 2n
runtime goes from roughly c*n^2 to c*(2n)^2 = 4cn^2
so when n doubles, expect runtime to increase by factor of 4
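the prediction can be checked against the measurements above (the (n, seconds) pairs are copied from the timing table; for quadratic runtime the consecutive ratios should approach 4):

```python
# (n, seconds) pairs copied from the ifib timing table above
timings = [(65536, 0.096), (131072, 0.320), (262144, 1.205),
           (524288, 4.645), (1048576, 18.215)]

# ratio of consecutive runtimes as n doubles; c*(2n)^2 / (c*n^2) = 4
ratios = [t2 / t1 for (_, t1), (_, t2) in zip(timings, timings[1:])]
```

the measured ratios sit a bit under 4 for small n and creep toward it, consistent with n iterations each doing a Theta(n)-bit addition.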