CS51A - Spring 2022 - Class 25

Example code in this lecture

   sorting.py

Lecture notes

  • admin
       - no mentor hours this week (I'm trying to get one or two towards the weekend for study questions)
       - lab optional this week (come with questions!)
       - Course feedback forms on Wednesday + philosophy

  • nim tournament

  • "For me, great algorithms are the poetry of computation. Just like verse, they can be terse, allusive, dense and even mysterious. But once unlocked, they cast a brilliant new light on some aspect of computing." -- Francis Sullivan

  • What is an algorithm?
       - a way of solving a problem
       - a sequence of steps to accomplish a task

  • Examples
       - sort a list of numbers
       - find a route from one place to another (cars, packet routing, phone routing, ...)
       - find the longest common substring between two strings
       - add two numbers
       - microchip wiring/design (VLSI)
       - solving sudoku
       - cryptography
       - compression (file, audio, video)
       - spell checking
       - pagerank
       - classify a web page
       - ...

  • Main parts of algorithm analysis
       - developing algorithms that work
       - making them faster
       - analyzing/understanding the efficiency/run-time

  • What questions might we want to ask/measure about an algorithm?
       - how long does it take to run? How efficient is it?
       - how much memory does it use?
       - does it finish?
       - is it right?
       - how hard is it to code?

  • asymptotic analysis
       - Key idea: how does the run-time grow as we increase the input size?
          - for example, if we double the input, what will happen to the run-time?
             - unchanged?
             - double?
             - triple?
             - quadruple?
       - Big-O notation: an upper bound on the run-time
           - describes how the run-time changes as the input size changes
          - this gives us groups of methods/functions that behave similarly
           - O(f(n)): as n increases, the run-time grows proportionally to f(n)
             - O(n) = linear
                - double the input size, double the run-time
             - O(n^2) = quadratic
                - double the input size, quadruple the run-time
             - ..
       - An example
          - Say we have an O(n^2) algorithm
          - For an input of size k it takes t time
             - Since it's quadratic, for an input of size k it does on the order of k^2 work to get the answer in time t
             - If we double the size: (2k)^2 = 4 k^2 work needs to be done to get the answer.
             - How long will that take?
                - we know doing k^2 work takes time t
                - so 4 k^2 work will take 4t time
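           - to check this concretely, here's a quick timing sketch (the count_pairs helper is just for illustration, not part of sorting.py): it times a deliberately O(n^2) function on an input of size n and again on size 2n, and the second measurement should come out roughly 4 times the first

                import time

                def count_pairs(nums):
                    """Deliberately O(n^2): count all pairs (i, j) with i < j."""
                    count = 0
                    for i in range(len(nums)):
                        for j in range(i + 1, len(nums)):
                            count += 1
                    return count

                for n in [2000, 4000]:   # double the input size
                    start = time.time()
                    count_pairs(list(range(n)))
                    print(n, time.time() - start)   # the second time should be roughly 4x the first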

       - runtimes table


  • Revisit contains function on a list (i.e., "in")
       - What is the Big-O for contains?
          - O(n)
          - the runtime increases linearly with the size of the list
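        - for reference, a minimal sketch of what a linear contains looks like (the function name and code here are illustrative; this is not Python's actual implementation of "in"):

             def contains(lst, target):
                 """Return True if target appears in lst, checking one element at a time."""
                 for item in lst:          # worst case: look at all n elements
                     if item == target:
                         return True
                 return False

             print(contains([4, 8, 15, 16], 15))   # True
             print(contains([4, 8, 15, 16], 42))   # False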

  • Sorting
       Input: A list of numbers nums
       Output: The list of numbers in sorted order, i.e. nums[i] <= nums[j] for all i < j

       - cards
          - sort cards: all cards in view
          - sort cards: only view one card at a time
       - many different ways to sort a list

  • Selection sort
       - high-level:
          - starting from the beginning of the list and working to the back, find the smallest element in the remaining list
             - in the first position, put the smallest item in the list
             - in the second position, put the next smallest item in the list
             - ...
          - to find the smallest item in the remaining list, simply traverse it, keeping track of the smallest value

       - look at selection_sort in sorting.py code
          - What is the running time of selection sort?
             - We'll use the variable n to describe the length of the array/input
              - How many times do we go through the for loop in selection_sort?
                - n times
              - Each time through the for loop in selection_sort, we find the smallest element. How much work is this?
                - first time, n-1, second, n-2, third, n-3 ...
                - O(n)
              - what is the overall cost for selection_sort?
                - we go through the for loop n times
                - each time we go through the for loop we incur a cost of roughly n
                - O(n^2)
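       - a minimal sketch of what selection_sort might look like (assumed here, since sorting.py isn't reproduced in these notes):

            def selection_sort(nums):
                """Sort nums in place: repeatedly select the smallest remaining element."""
                for i in range(len(nums)):               # n passes through the list
                    smallest = i
                    for j in range(i + 1, len(nums)):    # scan the rest of the list (roughly n work)
                        if nums[j] < nums[smallest]:
                            smallest = j
                    nums[i], nums[smallest] = nums[smallest], nums[i]   # put it in position i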
       
  • Insertion sort
       - high-level: starting from the beginning of the list and working towards the end, keep the items we've seen so far in sorted order. For each new item, traverse the already-sorted portion and insert it in the correct place.
       - look at insertion_sort function in sorting.py code
          - what is the running time?
             - How many times do we iterate through the while loop?
                - in the best case: no times
                   - when does this happen?
                      - when the list is sorted already
                   - what is the running time? linear, O(n)
                - in the worst case: j - 1 times
                    - when does this happen?
                       - when the list is in reverse sorted order
                   - what is the running time?
                       - \sum_{j=1}^{n-1} j = n(n-1)/2
                      - O(n^2)
                - average case: (j-1)/2 times
                   - O(n^2)
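       - a minimal sketch of what insertion_sort might look like (again assumed, since sorting.py isn't reproduced here):

            def insertion_sort(nums):
                """Sort nums in place by inserting each item into the sorted prefix before it."""
                for j in range(1, len(nums)):
                    current = nums[j]
                    i = j - 1
                    # shift larger items to the right until current's spot is found;
                    # this while loop runs 0 times on already-sorted input and once
                    # per earlier element in the worst case
                    while i >= 0 and nums[i] > current:
                        nums[i + 1] = nums[i]
                        i -= 1
                    nums[i + 1] = current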

  • Mergesort
       - You'll learn about it later :)
       - O(n log n) running time!!