
DAA

Week V

Master method
Used for many divide-and-conquer recurrences of the form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) > 0.

Master Theorem: Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = aT(n/b) + f(n), where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be bounded asymptotically as follows.
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

Based on the master theorem, compare n^(log_b a) vs. f(n):

Case 1: f(n) = O(n^(log_b a − ε)) for some constant ε > 0.
(f(n) is polynomially smaller than n^(log_b a).)
Solution: T(n) = Θ(n^(log_b a)). (Intuitively: cost is dominated by the leaves.)

Case 2: f(n) = Θ(n^(log_b a) lg^k n), where k ≥ 0.
[This formulation of Case 2 is more general than in the Master Theorem above.]
(f(n) is within a polylog factor of n^(log_b a), but not smaller.)
Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).
(Intuitively: the cost is n^(log_b a) lg^k n at each level, and there are Θ(lg n) levels.)
Simple case: k = 0, so f(n) = Θ(n^(log_b a)) and T(n) = Θ(n^(log_b a) lg n).

Case 3: f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.
(f(n) is polynomially greater than n^(log_b a).)
Solution: T(n) = Θ(f(n)). (Intuitively: cost is dominated by the root.)

What's with the Case 3 regularity condition? Generally it is not a problem: it always holds whenever f(n) = n^k and f(n) = Ω(n^(log_b a + ε)) for constant ε > 0, so you don't need to check it when f(n) is a polynomial. [Proving this makes a nice homework exercise. Here's a proof that the regularity condition holds when f(n) = n^k and f(n) = Ω(n^(log_b a + ε)) for constant ε > 0. Since f(n) = Ω(n^(log_b a + ε)) and f(n) = n^k, we have that k > log_b a. Using a base of b and treating both sides as exponents, we have b^k > b^(log_b a) = a, and so a/b^k < 1. Since a, b, and k are constants, if we let c = a/b^k, then c is a constant strictly less than 1. We have that a·f(n/b) = a(n/b)^k = (a/b^k)·n^k = c·f(n), and so the regularity condition is satisfied.]
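To make the proof concrete, here is a small numeric check (a sketch written for these notes, not part of the original): for f(n) = n^k the regularity condition holds with equality, with c = a/b^k. The constants a = 5, b = 2, k = 3 are illustrative.

```python
# For f(n) = n^k with k > log_b(a), the regularity condition holds exactly:
# a * f(n/b) = (a / b^k) * f(n), with c = a/b^k < 1.
a, b, k = 5, 2, 3
c = a / b ** k                                # 5/8 = 0.625 < 1
for n in (8, 64, 1024):
    assert a * (n / b) ** k == c * n ** k     # a*f(n/b) == c*f(n) exactly
print(c)  # 0.625
```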

Examples:

T(n) = 5T(n/2) + Θ(n²)
n^(lg 5) vs. n². Since lg 5 − ε = 2 for some constant ε > 0, use Case 1: T(n) = Θ(n^(lg 5)).

T(n) = 27T(n/3) + Θ(n³ lg n)
n^(log_3 27) = n³ vs. n³ lg n. Use Case 2 with k = 1: T(n) = Θ(n³ lg² n).

T(n) = 5T(n/2) + Θ(n³)
n^(lg 5) vs. n³. Now lg 5 + ε = 3 for some constant ε > 0. Check the regularity condition (we don't really need to, since f(n) is a polynomial): a·f(n/b) = 5(n/2)³ = 5n³/8 ≤ cn³ for c = 5/8 < 1. Use Case 3: T(n) = Θ(n³).

T(n) = 27T(n/3) + Θ(n³ / lg n)
n^(log_3 27) = n³ vs. n³/lg n = n³ lg⁻¹ n ≠ Θ(n³ lg^k n) for any k ≥ 0. Cannot use the master method.
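The four worked examples can be mechanized in a small sketch. This is illustrative code written for these notes: the function name and the (a, b, k, logs) encoding of f(n) = Θ(n^k lg^logs n) are my own conventions, and it handles only the forms used above, not the master method in full generality.

```python
import math

def master_theorem(a, b, k, logs=0):
    """Classify T(n) = a*T(n/b) + Theta(n^k * lg^logs n).
    Returns a Theta bound as a string, or None when no case applies."""
    crit = math.log(a, b)                    # critical exponent log_b(a)
    if math.isclose(k, crit):                # f(n) within a polylog of n^crit
        if logs >= 0:
            return f"Theta(n^{crit:.4g} * lg^{logs + 1} n)"   # extended Case 2
        return None                          # e.g. f(n) = n^3 / lg n
    if k < crit and logs == 0:
        return f"Theta(n^{crit:.4g})"        # Case 1: leaves dominate
    if k > crit and logs == 0:
        return f"Theta(n^{k})"               # Case 3: root dominates
                                             # (regularity automatic for n^k)
    return None

# The four examples above:
print(master_theorem(5, 2, 2))        # Case 1
print(master_theorem(27, 3, 3, 1))    # extended Case 2 -> Theta(n^3 * lg^2 n)
print(master_theorem(5, 2, 3))        # Case 3 -> Theta(n^3)
print(master_theorem(27, 3, 3, -1))   # None: master method does not apply
```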

Growth of Functions (overview)


A way to describe the behavior of functions in the limit. We're studying asymptotic efficiency:
describe the growth of functions;
focus on what's important by abstracting away low-order terms and constant factors;
indicate the running times of algorithms.
A way to compare "sizes" of functions:
O ≈ ≤
Ω ≈ ≥
Θ ≈ =
o ≈ <
ω ≈ >

Asymptotic notation
O-notation
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }.

Example: 2n² = O(n³), with c = 1 and n₀ = 2.

Examples of functions in O(n²):
n²
n² + n
n² + 1000n
1000n² + 1000n
Also:
n
n/1000
n^1.99999
n² / lg lg lg n

Ω-notation
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }.

Example: √n = Ω(lg n), with c = 1 and n₀ = 16.

Examples of functions in Ω(n²):
n²
n² + n
n² − n
1000n² + 1000n
1000n² − 1000n
Also:
n³
n^2.00001
n² lg lg lg n
2^(2^n)

Θ-notation
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }.

Example: n²/2 − 2n = Θ(n²), with c₁ = 1/4, c₂ = 1/2, and n₀ = 8.
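The witnesses in this example can be spot-checked numerically. `holds_Theta` is a hypothetical helper written for these notes; sampling finitely many n is evidence, not a proof.

```python
def holds_Theta(f, g, c1, c2, n0, n_max=10_000):
    """Spot-check Theta witnesses: 0 <= c1*g(n) <= f(n) <= c2*g(n)
    for every sampled n in [n0, n_max).  Numeric evidence, not a proof."""
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

# n^2/2 - 2n = Theta(n^2) with c1 = 1/4, c2 = 1/2, n0 = 8, as in the notes:
print(holds_Theta(lambda n: n * n / 2 - 2 * n, lambda n: n * n, 0.25, 0.5, 8))  # True
# n0 = 7 is too early: the lower bound c1*n^2 <= n^2/2 - 2n fails at n = 7
print(holds_Theta(lambda n: n * n / 2 - 2 * n, lambda n: n * n, 0.25, 0.5, 7))  # False
```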

Theorem: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).


Asymptotic notation in equations

When on the right-hand side: O(n²) stands for some anonymous function in the set O(n²).
2n² + 3n + 1 = 2n² + Θ(n) means 2n² + 3n + 1 = 2n² + f(n) for some f(n) ∈ Θ(n). In particular, f(n) = 3n + 1.
By the way, we interpret the number of anonymous functions as equal to the number of times the asymptotic notation appears.

When on the left-hand side: No matter how the anonymous functions are chosen on the left-hand side, there is a way to choose the anonymous functions on the right-hand side to make the equation valid. Interpret 2n² + Θ(n) = Θ(n²) as meaning "for all functions f(n) ∈ Θ(n), there exists a function g(n) ∈ Θ(n²) such that 2n² + f(n) = g(n)."

Can chain together: 2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²).
Interpretation:
First equation: There exists f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n).
Second equation: For all g(n) ∈ Θ(n) (such as the f(n) used to make the first equation hold), there exists h(n) ∈ Θ(n²) such that 2n² + g(n) = h(n).

o-notation
o(g(n)) = { f(n) : for all constants c > 0, there exists a constant n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n ≥ n₀ }.

ω-notation
ω(g(n)) = { f(n) : for all constants c > 0, there exists a constant n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n ≥ n₀ }.

Comparisons of functions
Relational properties:
Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)) (and likewise for O, Ω, o, and ω).
Reflexivity: f(n) = Θ(f(n)), f(n) = O(f(n)), f(n) = Ω(f(n)).
Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)); f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

Comparisons:
f(n) is asymptotically smaller than g(n) if f(n) = o(g(n)).
f(n) is asymptotically larger than g(n) if f(n) = ω(g(n)).
No trichotomy: although intuitively we can liken O to ≤, Ω to ≥, etc., unlike real numbers, where exactly one of a < b, a = b, or a > b holds, we might not be able to compare two functions. Example: n^(1 + sin n) and n, since 1 + sin n oscillates between 0 and 2.
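A quick numeric look at the no-trichotomy example (a sketch; the sample points are arbitrary): the exponent 1 + sin n keeps swinging below and above 1, so n^(1 + sin n) is infinitely often smaller than n and infinitely often close to n².

```python
import math

# The exponent 1 + sin n oscillates between 0 and 2 forever, so
# n^(1 + sin n) is neither O(n) nor Omega(n): no trichotomy.
for n in (100, 102, 105, 108, 111):
    e = 1 + math.sin(n)
    print(n, round(e, 3), round(n ** e, 1))
```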

Standard notations and common functions


Monotonicity
f(n) is monotonically increasing if m ≤ n implies f(m) ≤ f(n).
f(n) is monotonically decreasing if m ≤ n implies f(m) ≥ f(n).
f(n) is strictly increasing if m < n implies f(m) < f(n).
f(n) is strictly decreasing if m < n implies f(m) > f(n).

Exponentials
Useful identities: a⁻¹ = 1/a, (a^m)^n = a^(mn), a^m · a^n = a^(m+n).
Can relate rates of growth of polynomials and exponentials: for all real constants a and b such that a > 1,
lim_{n→∞} n^b / a^n = 0,
which implies that n^b = o(a^n).
A surprisingly useful inequality: for all real x,
e^x ≥ 1 + x.
As x gets closer to 0, e^x gets closer to 1 + x.
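Both facts can be sanity-checked numerically (a sketch; the constants b = 10 and a = 1.1 are arbitrary choices):

```python
import math

# e^x >= 1 + x for all real x (equality only at x = 0):
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    assert math.exp(x) >= 1 + x

# n^b = o(a^n): the ratio n^b / a^n heads to 0 even for b = 10, a = 1.1
for n in (100, 500, 1000, 2000):
    print(n, n ** 10 / 1.1 ** n)
```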

Logarithms
Notations:
lg n = log₂ n (binary logarithm),
ln n = log_e n (natural logarithm),
lg^k n = (lg n)^k (exponentiation),
lg lg n = lg(lg n) (composition).
Logarithm functions apply only to the next term in the formula, so that lg n + k means (lg n) + k, and not lg(n + k).
In the expression log_b a: if we hold b constant, then the expression is strictly increasing as a increases; if we hold a constant, then the expression is strictly decreasing as b increases.

Useful identities, for all real a > 0, b > 0, c > 0, and n, where logarithm bases are not 1:
a = b^(log_b a),
log_c(ab) = log_c a + log_c b,
log_b a^n = n log_b a,
log_b a = log_c a / log_c b,
log_b(1/a) = −log_b a,
log_b a = 1 / log_a b,
a^(log_b c) = c^(log_b a).
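All seven identities can be verified numerically for sample values (a sketch; the values of a, b, c, n are arbitrary):

```python
import math

# Numeric spot-checks of the logarithm identities (arbitrary sample values)
a, b, c, n = 7.0, 2.0, 10.0, 3.0

assert math.isclose(b ** math.log(a, b), a)                               # a = b^(log_b a)
assert math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c))  # log of a product
assert math.isclose(math.log(a ** n, b), n * math.log(a, b))              # log of a power
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))      # change of base
assert math.isclose(math.log(1 / a, b), -math.log(a, b))                  # log of a reciprocal
assert math.isclose(math.log(a, b), 1 / math.log(b, a))                   # swap base and argument
assert math.isclose(a ** math.log(c, b), c ** math.log(a, b))             # exponent swap
print("all identities hold")
```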

Factorials
n! = 1 · 2 · 3 ⋯ n. Special case: 0! = 1.
Can use Stirling's approximation, n! = √(2πn) (n/e)^n (1 + Θ(1/n)), to derive that lg(n!) = Θ(n lg n).
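A quick check of lg(n!) = Θ(n lg n) (a sketch using `math.lgamma`, since lgamma(n+1) = ln(n!)): the ratio lg(n!) / (n lg n) stays bounded and creeps toward 1 as n grows.

```python
import math

# lg(n!) = Theta(n lg n): compute lg(n!) exactly via lgamma(n+1) = ln(n!)
for n in (10, 100, 1000, 10_000):
    lg_fact = math.lgamma(n + 1) / math.log(2)        # lg(n!)
    print(n, round(lg_fact / (n * math.log2(n)), 3))  # ratio creeps toward 1
```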

Homework: 3.1-1; 3.1-2; 3.1-3; 3.1-4; 3.1-8; 3.2-4; 3.3
