Week V
Master method
Used for many divide-and-conquer recurrences of the form T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0.

Master theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = a·T(n/b) + f(n), where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically as follows.
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
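When the driving function is a plain polynomial f(n) = n^k, applying the theorem reduces to comparing k against the critical exponent log_b a. A minimal sketch of that mechanical comparison (the helper name `master_case` is made up for illustration; it only handles f(n) = n^k):

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + n^k via the master theorem.

    Only handles driving functions of the form f(n) = n^k, where the
    polynomial comparison against n^(log_b a) reduces to comparing
    the exponents k and log_b a.
    """
    crit = math.log(a, b)                # critical exponent log_b a
    if math.isclose(k, crit):            # case 2: same growth rate
        return f"Theta(n^{crit:g} lg n)"
    if k < crit:                         # case 1: f polynomially smaller
        return f"Theta(n^{crit:g})"
    # case 3: f polynomially larger; regularity holds automatically
    # for f(n) = n^k because a/b^k < 1
    return f"Theta(n^{k:g})"

print(master_case(2, 2, 1))   # merge sort, T(n) = 2T(n/2) + n   -> Theta(n^1 lg n)
print(master_case(4, 2, 1))   # T(n) = 4T(n/2) + n               -> Theta(n^2)
print(master_case(2, 2, 2))   # T(n) = 2T(n/2) + n^2             -> Theta(n^2)
```

`math.isclose` guards the case-2 test against floating-point error in computing log_b a.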
Based on the master theorem: compare n^(log_b a) vs. f(n).

Case 1: f(n) = O(n^(log_b a − ε)) for some constant ε > 0.
(f(n) is polynomially smaller than n^(log_b a).)
Solution: T(n) = Θ(n^(log_b a)). (Intuitively: cost is dominated by the leaves.)

Case 2: f(n) = Θ(n^(log_b a)).
(f(n) and n^(log_b a) grow at the same rate.)
Solution: T(n) = Θ(n^(log_b a) lg n). (Intuitively: cost is spread evenly across the Θ(lg n) levels.)

Case 3: f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.
(f(n) is polynomially greater than n^(log_b a).)
Solution: T(n) = Θ(f(n)). (Intuitively: cost is dominated by the root.)
What's with the Case 3 regularity condition? Generally not a problem: it always holds whenever f(n) = n^k and f(n) = Ω(n^(log_b a + ε)) for constant ε > 0, so you don't need to check it when f(n) is a polynomial. [Proving this makes a nice homework exercise. See below.] Here's a proof that the regularity condition holds when f(n) = n^k and f(n) = Ω(n^(log_b a + ε)) for constant ε > 0. Since f(n) = Ω(n^(log_b a + ε)) and f(n) = n^k, we have that k > log_b a. Using a base of b and treating both sides as exponents, we have b^k > b^(log_b a) = a, and so a/b^k < 1. Since a, b, and k are constants, if we let c = a/b^k, then c is a constant strictly less than 1. We have that a·f(n/b) = a·(n/b)^k = (a/b^k)·n^k = c·f(n), and so the regularity condition is satisfied.
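The algebra above can be checked numerically. In this sketch the sample values a = 2, b = 2, k = 2 (i.e. T(n) = 2T(n/2) + n²) are arbitrary illustrative choices; any polynomial with k > log_b a behaves the same way:

```python
# Check the regularity condition a*f(n/b) <= c*f(n) for f(n) = n^k.
# Sample values: a = 2, b = 2, k = 2, so k = 2 > log_b a = 1.
a, b, k = 2, 2, 2
c = a / b**k                     # c = a/b^k = 1/2, strictly less than 1
assert c < 1
for n in [4, 16, 256, 65536]:
    lhs = a * (n / b) ** k       # a * f(n/b)
    rhs = c * n**k               # c * f(n)
    assert lhs <= rhs * (1 + 1e-12)
print("regularity holds with c =", c)
```

For f(n) = n^k the two sides are in fact exactly equal, matching the proof's identity a·(n/b)^k = c·n^k.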
Example: T(n) = 27T(n/3) + n^3/lg n. Compare n^(log_3 27) = n^3 vs. f(n) = n^3/lg n = n^3 · (lg n)^(-1). Here f(n) is asymptotically smaller than n^3, but only by a lg n factor, which is not polynomial: n^3/lg n = ω(n^(3−ε)) for every constant ε > 0, so no case applies. Cannot use the master method.
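A numeric illustration of why the gap is not polynomial (the exponent ε = 0.01 and the sampled sizes are arbitrary choices): the ratio lg n / n^ε eventually tends to 0, so no constant ε > 0 makes n^3/lg n fit case 1.

```python
# lg n grows slower than n^eps for every eps > 0, so n^3/lg n is
# smaller than n^3 only by a sub-polynomial factor.
eps = 0.01                                # arbitrary sample exponent
for k in [10, 100, 1000, 2000, 4000]:     # n = 2^k
    ratio = k / 2 ** (eps * k)            # lg(n) / n^eps, computed in log space
    print(f"n = 2^{k}: lg(n)/n^eps = {ratio:.3e}")
# after an initial rise, the ratio falls toward 0: lg n = o(n^eps)
```

Working with k = lg n in log space avoids materializing astronomically large values of n as floats.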
Asymptotic notation
O-notation
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }.
Ω-notation
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }.
Θ-notation
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }.
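For a concrete instance of this definition, here are witness constants (picked by hand; many other choices work) showing 2n² + 3n + 1 = Θ(n²):

```python
# Verify 0 <= c1*g(n) <= f(n) <= c2*g(n) for all sampled n >= n0,
# using the hand-picked witnesses c1 = 2, c2 = 3, n0 = 4.
c1, c2, n0 = 2, 3, 4

def f(n): return 2 * n**2 + 3 * n + 1   # the function being bounded
def g(n): return n**2                   # the comparison function

for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)
print("2n^2 + 3n + 1 = Theta(n^2) with c1=2, c2=3, n0=4 (sampled check)")
```

Note that n0 = 4 matters: at n = 3, f(3) = 28 already exceeds c2·g(3) = 27, so the upper bound only kicks in from n0 onward.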
When Θ-notation is on the left-hand side: no matter how the anonymous functions are chosen on the left-hand side, there is a way to choose the anonymous functions on the right-hand side to make the equation valid. Interpret 2n² + Θ(n) = Θ(n²) as meaning: for all functions f(n) ∈ Θ(n), there exists a function g(n) ∈ Θ(n²) such that 2n² + f(n) = g(n). Can chain together: 2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²). Interpretation: First equation: there exists f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n). Second equation: for all g(n) ∈ Θ(n) (such as the f(n) used to make the first equation hold), there exists h(n) ∈ Θ(n²) such that 2n² + g(n) = h(n).
o-notation
o(g(n)) = { f(n) : for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0 }.
ω-notation
ω(g(n)) = { f(n) : for all constants c > 0, there exists a constant n0 > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0 }.
Comparisons of functions
Relational properties:
Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)); likewise for O, Ω, o, and ω.
Reflexivity: f(n) = Θ(f(n)); likewise for O and Ω.
Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); f(n) = o(g(n)) if and only if g(n) = ω(f(n)).
Comparisons:
f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)). f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)). No trichotomy: although intuitively we can liken O to ≤, Ω to ≥, etc., unlike real numbers, where exactly one of a < b, a = b, or a > b holds, we might not be able to compare two functions. Example: n^(1+sin n) and n cannot be compared, since the exponent 1 + sin n oscillates between 0 and 2.
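The oscillation can be seen numerically. This sketch samples the ratio n^(1+sin n) / n = n^(sin n) (the range 1..199 is an arbitrary sample window):

```python
import math

# n^(1+sin n) / n = n^(sin n): the exponent sin n keeps returning to
# values near +1 and near -1, so the ratio is neither eventually >= 1
# nor eventually <= 1 -- no asymptotic comparison is possible.
ratios = [n ** math.sin(n) for n in range(1, 200)]
print("min ratio:", min(ratios))   # well below 1 (n near a trough of sin)
print("max ratio:", max(ratios))   # well above 1 (n near a crest of sin)
```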
Exponentials
Useful inequality for all real x: e^x ≥ 1 + x.
Logarithms
Notations: lg n = log_2 n (binary logarithm), ln n = log_e n (natural logarithm), lg^k n = (lg n)^k (exponentiation), lg lg n = lg(lg n) (composition). Logarithm functions apply only to the next term in the formula, so that lg n + k means (lg n) + k, and not lg(n + k). In the expression log_b a: if we hold b constant, then the expression is strictly increasing as a increases; if we hold a constant, then the expression is strictly decreasing as b increases.
Useful identities, for all real a > 0, b > 0, c > 0, and n, where logarithm bases are not 1:
a = b^(log_b a),
log_c(ab) = log_c a + log_c b,
log_b a^n = n·log_b a,
log_b a = log_c a / log_c b,
log_b(1/a) = −log_b a,
log_b a = 1 / log_a b,
a^(log_b c) = c^(log_b a).
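These identities can be spot-checked numerically (the sample values a = 5, b = 3, c = 7, n = 4 are arbitrary positive reals with bases ≠ 1):

```python
import math

a, b, c, n = 5.0, 3.0, 7.0, 4.0          # arbitrary sample values

def close(x, y):
    return math.isclose(x, y, rel_tol=1e-9)

assert close(b ** math.log(a, b), a)                               # a = b^(log_b a)
assert close(math.log(a * b, c), math.log(a, c) + math.log(b, c))  # log of a product
assert close(math.log(a ** n, b), n * math.log(a, b))              # log of a power
assert close(math.log(a, b), math.log(a, c) / math.log(b, c))      # change of base
assert close(math.log(1 / a, b), -math.log(a, b))                  # log of a reciprocal
assert close(math.log(a, b), 1 / math.log(b, a))                   # swap base and argument
assert close(a ** math.log(c, b), c ** math.log(a, b))             # a^(log_b c) = c^(log_b a)
print("all seven identities hold on the sample values")
```

The comparisons use `math.isclose` rather than `==` because `math.log(a, b)` is computed as a quotient of floats and carries rounding error.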
Factorials
n! = 1 · 2 · 3 ⋯ n. Special case: 0! = 1. Can use Stirling's approximation, n! = √(2πn) · (n/e)^n · (1 + Θ(1/n)), to derive that lg(n!) = Θ(n lg n).
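The final claim can be observed numerically. Here lg(n!) is computed through `math.lgamma` (ln Γ(n+1) = ln n!) to avoid materializing enormous integers:

```python
import math

# Ratio lg(n!) / (n lg n): by Stirling, lg(n!) = n lg n - n lg e + O(lg n),
# so the ratio climbs (slowly) toward 1, consistent with lg(n!) = Theta(n lg n).
for n in [10, 100, 1000, 10_000]:
    lg_factorial = math.lgamma(n + 1) / math.log(2)   # lg(n!) = ln(n!) / ln 2
    print(n, lg_factorial / (n * math.log2(n)))
```

The convergence is slow because of the −n lg e lower-order term, which is why the constant-factor bounds of Θ-notation, not a limit of 1, are the right statement.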