Sunday, November 16, 2014

Week 10: Linear search and Big-O

In the last part of the lecture, we learned about analyzing algorithms. For a given piece of Python code, we counted how many times each assignment statement executes. Why do we do that? Why do we count the steps in the code, add them up, and so on? The answer is that it lets us estimate the running time of a program.

For example,

def LS(A, x):
    """ Return an index i such that x == L[i]. Otherwise, return -1."""
    i = 0
    while i < len(A):
        if A[i] == x:
            return i
        i = i + 1
    return -1
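To make the step counting concrete, here is my own instrumented variant of LS (a sketch, not lecture code) that also reports how many loop iterations were executed:

```python
def LS_count(A, x):
    """Like LS, but also return the number of loop iterations executed."""
    steps = 0
    i = 0
    while i < len(A):
        steps += 1          # count one pass through the loop body
        if A[i] == x:
            return i, steps
        i = i + 1
    return -1, steps        # worst case: steps == len(A)
```

For example, LS_count([2, 4, 6, 8], 4) returns (1, 2): the item is found at index 1 after 2 iterations, while a failed search such as LS_count([2, 4, 6, 8], 5) takes all 4 iterations.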

If we trace the code on a few example calls such as LS([2, 4, 6, 8], 4), we can work out the running time of the program. This is an example of linear search, which overlaps with CSC108 content. 'Linear' can be taken to mean 'arranged in a line': the algorithm checks the items one by one. Then there is binary search. Binary search is much faster than linear search, although it requires the list to be sorted first (and when you go to sort the list, if it is already sorted, there is nothing to sort!). It is faster because it cuts the remaining search range in half with every comparison. For example, if we use linear search on one billion items, there will be one billion iterations in the worst case (literally going through all one billion items); with binary search, it only takes about 30 iterations in the worst case, since 2^30 > 10^9. Now we see the difference.
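To see the 30-iteration claim in action, here is a binary search sketch of my own (not from the lecture), again instrumented with an iteration counter. Passing range(10**9) as the "sorted list" avoids actually building a billion-item list, since range supports len() and indexing:

```python
def binary_search(A, x):
    """Return (i, steps) with A[i] == x in sorted A, else (-1, steps),
    where steps is the number of loop iterations executed."""
    lo, hi = 0, len(A) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2   # halve the remaining range each time
        if A[mid] == x:
            return mid, steps
        elif A[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

# Worst case on one billion items: the loop runs at most ~30 times.
index, steps = binary_search(range(10**9), -1)
```

Here index is -1 (the item is absent) and steps stays at or below 30, compared with the one billion iterations linear search would need.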

Now it's time for Big-O, Omega and Theta to come along. Why do we care about these? They express how the running-time curve of an algorithm such as linear or binary search behaves when n is 'large enough'.

To be precise, the definition of Big-O is the following.

For any function f : ℕ → ℝ≥0,
O(f) = {g : ℕ → ℝ≥0 | ∃c ∈ ℝ⁺, ∃B ∈ ℕ, ∀n ∈ ℕ, n ≥ B ⇒ g(n) ≤ c·f(n)}

So g ∈ O(f) means that f grows at least as fast as g, or that f is an upper bound for g. For Omega it is the other way around (f is a lower bound for g), and for Theta, g is bounded by f on both sides.
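The definition can be spot-checked numerically. My own example (not from the lecture): the claim g(n) = 2n + 7 ∈ O(n) is witnessed by the choices c = 3 and B = 7, because 2n + 7 ≤ 3n exactly when n ≥ 7:

```python
def g(n):
    return 2 * n + 7    # the function we claim is in O(n)

def f(n):
    return n            # the bounding function

c, B = 3, 7             # the witnesses the definition asks for

# The definition only requires g(n) <= c*f(n) for n >= B;
# spot-check the inequality over a finite range of n.
holds_past_B = all(g(n) <= c * f(n) for n in range(B, 10_000))

# Below the breakpoint B the inequality can fail, and here it does:
fails_below_B = g(6) > c * f(6)   # 19 > 18
```

Of course a finite check is not a proof; the point is that one concrete pair (c, B) making the inequality hold for all n ≥ B is all the definition demands.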

