CS211 Spring 2002 Lecture 13: Asymptotic Complexity
3/4/2003
-------------------------------------------------------------------------------
0) Announcements
   + Prelim 1 (T1) this week (see website for rooms) (bring ID)
   + review session this week (see website under Prelims)
   + A4 due 3/12 (see online material)
   + DIS schedule

Overview
   + math review
   + algorithm analysis
     - machine view of time and space
     - approximate view of time and space
   + complexity analysis
     - space (asymptotic space complexity)
     - timing (asymptotic time complexity)
   + big Oh notation
   + recurrence relations
   (DS&SD: 18.5, App C; DS&A: 2.1.5)
-------------------------------------------------------------------------------
1) Math Review
   See mathreview.txt, DS&SD: pg 100, App. C
   Why math?
-------------------------------------------------------------------------------
2) Algorithm Analysis
   Algorithm:
   Program:
   How to analyze an algorithm?
   "Exact" analysis:
   How we analyze:
-------------------------------------------------------------------------------
2) Machine Model (see DS&A, CS212 notes)
   Stack Machine:
   JVM:
   Runtime Performance for JVM:
   + examples:
     y = y + 1  has
     y = a[i]   has
   Space Analysis of Machine Model:
   Time Analysis of Machine Model:
-------------------------------------------------------------------------------
3) Simplified Model of Computer
   Space:
   Time:
   + Some "basics" from ECE:
     - processor CLOCK: coordinates memory ops by generating a time reference
       signal called the CLOCK CYCLE or TICK
     - CLOCK SPEED:
     - instruction/tick?
     - CLOCK PERIOD:
     - example)
   + Simplification:
   + geosum example courtesy of DS&A
   + DS&SD notation and concepts:
     - TSmax(n):
     - TAmax(n):
-------------------------------------------------------------------------------
4) Recurrence/Recursive Relations
   (see DS&SD 18.5, App C.5 and DS&A Chap 2, DIS notes)
   + RECURRENCE RELATION/RECURSIVE RELATION:
     - specify initial values: base case(s)
     - define remaining values as a function of previous values

   example from DS&A) factorial

   Code:
     int fact(int n) {
         if (n == 0)
             return 1;              // base case
         else
             return n * fact(n-1);  // recursive case
     }

   rr:
              { t1,           n == 0
       T(n) = {
              { T(n-1) + t2,  n > 0

   Unrolling the recurrence:
       T(n) = T(n-1) + t2
            = (T(n-2) + t2) + t2   = T(n-2) + 2*t2
            = (T(n-3) + t2) + 2*t2 = T(n-3) + 3*t2
            ...
            = T(n-k) + k*t2
            ...
            = T(0) + n*t2 = t1 + n*t2
-------------------------------------------------------------------------------
5) Asymptotic Complexity
   (also called "growth notation", "asymptotic notation", "Big Oh", ...)
   + How to compare algorithms?
   + Example: Compare these:
     - geosum1 (mult each term): (11/2)*n^2 + (47/2)*n + 24
     - geosum2 (Horner's rule):  13*n + 22
     - geosum3 (see DS&A):       19*(floor(log[2](n+1))+1) + 18
     - see Fig 2.3 in DS&A for plots, or MATLAB:
         >> n = linspace(0,10,100);
         >> E1 = (11/2)*n.^2 + (47/2)*n + 24;
         >> E2 = 13*n + 22;
         >> E3 = 19*(floor(log2(n+1))+1) + 18;
         >> plot(n,E1,n,E2,n,E3)
       observations:
       conclusion:
   + One measure: ASYMPTOTIC UPPER BOUND
       O(g(n)) = { f(n) | there exist constants (c, n0) such that
                   0 <= f(n) <= c*g(n) for all n >= n0 }
   + Why "asymptotic"?
   + Why "complexity"?
     - Amount of work required = complexity of the problem.
   + Why "complexity classes"?
     - Class: classification of sets of bounding functions
   + Why is this important?
-------------------------------------------------------------------------------
   + Example 1: f1(n) = (11/2)*n^2 + (47/2)*n + 22 is in O(n^3)
   + Example 2:
     Problem: Given f1(n) = O(g(n)), f2(n) = O(g(n)), and
              h(n) = f1(n) + f2(n), is h(n) = O(g(n))?
              Justify formally using witness pairs (k,N).
   + Example 3: Prove that for b > 1, log[b](n) = O(log[2](n))
-------------------------------------------------------------------------------
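The closed form of the factorial recurrence, T(n) = t1 + n*t2, can be sanity-checked by instrumenting fact() to count how many times the recursive (t2) case fires; it should fire exactly n times. This is a small sketch I added, not from the notes; the `steps` counter is my own instrumentation:

```java
public class FactCount {
    static int steps = 0; // counts how often the recursive case runs

    static int fact(int n) {
        if (n == 0)
            return 1;              // base case: cost t1, runs once
        steps++;                   // recursive case: cost t2 per call
        return n * fact(n - 1);
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 10; n++) {   // n <= 12 keeps fact(n) within int range
            steps = 0;
            fact(n);
            // T(n) = t1 + n*t2, so steps should equal n
            System.out.println("n=" + n + "  steps=" + steps);
        }
    }
}
```

Running it confirms the unrolling above: the t2 term is paid once per level of recursion, n levels in total.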
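Example 1's claim can be spot-checked numerically with a candidate witness pair. The pair (c, n0) = (6, 3) below is my own choice, not from the notes, and the loop only checks a finite range; the actual proof must argue the inequality for all n >= n0:

```java
public class WitnessCheck {
    // f1 from Example 1: (11/2)n^2 + (47/2)n + 22
    static double f1(double n) {
        return 5.5 * n * n + 23.5 * n + 22;
    }

    public static void main(String[] args) {
        double c = 6.0; // candidate constant (assumed)
        int n0 = 3;     // candidate threshold (assumed)
        boolean ok = true;
        for (int n = n0; n <= 1000; n++) {
            if (f1(n) > c * Math.pow(n, 3)) {
                ok = false;
                break;
            }
        }
        System.out.println("f1(n) <= 6*n^3 for 3 <= n <= 1000: " + ok);
        // prints: f1(n) <= 6*n^3 for 3 <= n <= 1000: true
    }
}
```

Note that the threshold matters: at n = 2, f1(2) = 91 > 6*2^3 = 48, so the inequality only holds from n0 = 3 onward, which is exactly why the definition allows an n0.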