### CS51 - Fall 2009 - Lecture 37

- put up practice final
- our final time slot is Dec. 18 at 9 a.m.
- TP2 is due last day of class at 3:30pm
- Schedule for the rest of the semester
- Monday: sorting, Turing machines, misc.
- Wednesday: review

• show demo (again) for Pictionary

• "For me, great algorithms are the poetry of computation. Just like verse, they can be terse, allusive, dense and even mysterious. But once unlocked, they cast a brilliant new light on some aspect of computing." -- Francis Sullivan

• What is an algorithm?
- a way of solving a problem
- a sequence of steps to accomplish a task

• Examples
- sort a list of numbers
- find a route from one place to another (cars, packet routing, phone routing, ...)
- find the longest common substring between two strings
- microchip wiring/design (VLSI)
- solving sudoku
- cryptography
- compression (file, audio, video)
- spell checking
- pagerank
- classify a web page
- ...

• Sorting (all code for the sorting algorithms can be found here)
Input: An array of numbers nums
Output: The array of numbers in sorted order, i.e. nums[i] <= nums[j] for all i < j
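The output condition can be checked directly; a minimal sketch (Python here, since the notes don't include the course code; `is_sorted` is a hypothetical helper, not from the notes):

```python
def is_sorted(nums):
    # Spec from above: nums[i] <= nums[j] for all i < j.
    # Checking adjacent pairs suffices, since <= is transitive.
    return all(nums[i] <= nums[i + 1] for i in range(len(nums) - 1))
```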

- cards
- sort cards: all cards in view
- sort cards: only view one card at a time

- Selection sort
- What is the running time? How many operations?
- We'll use the variable n to describe the length of the array
- what is the running time of indexOfSmallest?
- end_index - start_index + 1
- how many times is it called?
- n - 1 times
- what is the overall cost for selectionSort?
- indexOfSmallest
- n-1
- n-2
- n-3
- ...
- 1
- \sum_{i=1}^{n-1} i = ((n-1)n)/2
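The notes don't reproduce the course's `selectionSort` / `indexOfSmallest` code; a minimal Python sketch matching the cost analysis above (names written snake_case):

```python
def index_of_smallest(nums, start_index, end_index):
    # Position of the minimum of nums[start_index..end_index] (inclusive).
    # Cost: one comparison per remaining element.
    smallest = start_index
    for i in range(start_index + 1, end_index + 1):
        if nums[i] < nums[smallest]:
            smallest = i
    return smallest

def selection_sort(nums):
    # Calls index_of_smallest n - 1 times, on suffixes of shrinking length,
    # giving the (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparison count above.
    n = len(nums)
    for i in range(n - 1):
        j = index_of_smallest(nums, i, n - 1)
        nums[i], nums[j] = nums[j], nums[i]
```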
- Insertion sort
- what is the running time?
- How many times do we iterate through the while loop?
- in the best case: no times
- when does this happen?
- what is the running time? linear
- in the worst case: j - 1 times
- when does this happen?
- what is the running time?
- \sum_{j=1}^{n-1} j = ((n-1)n)/2
- quadratic
- average case: (j-1)/2 times
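Again the course code isn't shown in the notes; a sketch of insertion sort with the case analysis above marked in comments:

```python
def insertion_sort(nums):
    for j in range(1, len(nums)):
        key = nums[j]  # insert nums[j] into the sorted prefix nums[0..j-1]
        i = j - 1
        # Best case (input already sorted): this while loop runs 0 times,
        # so the whole sort is linear. Worst case (reverse sorted): it runs
        # j times here (j - 1 in the notes' 1-indexed count), giving the
        # quadratic sum above.
        while i >= 0 and nums[i] > key:
            nums[i + 1] = nums[i]  # shift larger elements one slot right
            i -= 1
        nums[i + 1] = key
```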

- asymptotic analysis
- Precisely calculating the actual steps is tedious and not generally useful
- Different operations take different amounts of time. Even from run to run, things such as caching, etc. will complicate things
- Want to identify categories of algorithmic runtimes
- Compare different algorithms
- f1(n) takes n^2 steps
- f2(n) takes 2n + 100 steps
- f3(n) takes 4n + 1 steps
- Which algorithm is better? Is the difference between f2 and f3 important/significant?
- runtimes table
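Plugging in concrete values makes the comparison above concrete; a quick sketch using the three step counts:

```python
def f1(n): return n * n        # quadratic
def f2(n): return 2 * n + 100  # linear
def f3(n): return 4 * n + 1    # linear

# f2 and f3 differ only in constants, so asymptotically they behave alike.
# f1 overtakes f2 for good at a modest input size:
crossover = next(n for n in range(1, 1000) if f1(n) > f2(n))
print(crossover)  # 12, since 12^2 = 144 > 2*12 + 100 = 124
```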

- Three common notations to talk about run-times
- Big-O: O(g(n))
- A function f(n) is Big-O of g(n) if eventually (asymptotically, i.e. for large enough inputs) f(n) is always UPPER bounded (<=) by some constant times g(n)
- gives us an upper bound on the function
- Omega(g(n))
- A function f(n) is Omega of g(n) if eventually (asymptotically, i.e. for large enough inputs) f(n) is always LOWER bounded (>=) by some constant times g(n)
- Theta(g(n))
- Both upper and lower bounded
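The three bullets above, stated formally (standard definitions; c and n_0 are the "some constant" and "large enough inputs" from the informal versions):

```latex
f(n) \in O(g(n))      \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \le c \cdot g(n)
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : \forall n \ge n_0,\ f(n) \ge c \cdot g(n)
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \text{ and } f(n) \in \Omega(g(n))
```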

- Gives us the big picture, without worrying about details

- What is the running time of Insertion sort?
- O(n^2)
- why Big-O?
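One way to see why the bound is stated with Big-O rather than Theta: the worst case is quadratic but the best case is linear, so n^2 is only an upper bound over all inputs. A small instrumented sketch (Python, counting while-loop iterations; `insertion_sort_shifts` is a hypothetical helper):

```python
def insertion_sort_shifts(nums):
    # Insertion sort that returns the total number of while-loop iterations.
    nums = list(nums)  # work on a copy
    shifts = 0
    for j in range(1, len(nums)):
        key, i = nums[j], j - 1
        while i >= 0 and nums[i] > key:
            nums[i + 1] = nums[i]
            i -= 1
            shifts += 1
        nums[i + 1] = key
    return shifts

n = 20
best = insertion_sort_shifts(list(range(n)))          # sorted input: 0 iterations
worst = insertion_sort_shifts(list(range(n, 0, -1)))  # reversed: n(n-1)/2 = 190
```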