For example,
def LS(A, x):
    """Return an index i such that x == A[i]. Otherwise, return -1."""
    i = 0
    while i < len(A):
        if A[i] == x:
            return i
        i = i + 1
    return -1
if we trace the code on a few examples, such as LS([2, 4, 6, 8], 4), we can work out the running time of the program. This is an example of linear search; this part overlaps with CSC108 content. "Linear" search can be taken to mean 'arranged in a line': we check the items one after another. And then there is binary search. Binary search is much faster than linear search, but it requires the list to be sorted. (And if the list is already sorted, there is nothing left to sort!) For example, if we use linear search on one billion items, there will be one billion iterations in the worst case (literally going through all one billion items); with binary search, however, it only takes about 30 iterations in the worst case, since 2^30 is more than one billion. Now we see the difference.
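To make that comparison concrete, here is one way binary search could be written, following the same style as LS above. This is my own sketch (the name BS and the iteration counter are additions for illustration, not from the lecture); the counter lets us see the roughly-30-iterations claim directly.

```python
import math

def BS(A, x):
    """Return (i, iterations) where x == A[i] in sorted list A, else (-1, iterations).
    The iteration count is returned only to compare against linear search."""
    lo, hi = 0, len(A) - 1
    iterations = 0
    while lo <= hi:
        iterations += 1
        mid = (lo + hi) // 2
        if A[mid] == x:
            return mid, iterations
        elif A[mid] < x:
            lo = mid + 1   # target is larger: discard the lower half
        else:
            hi = mid - 1   # target is smaller: discard the upper half
    return -1, iterations

# Each iteration halves the remaining range, so on n items the worst
# case is about log2(n) iterations; for one billion items that is:
print(math.ceil(math.log2(10**9)))   # -> 30

A = list(range(0, 2 * 10**6, 2))     # a million sorted even numbers
print(BS(A, 4))                      # index 2, found in far fewer than 10**6 steps
```

Linear search on the same million-item list could take a million iterations in the worst case; the counter shows binary search never needs more than about 20 here.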
Now it's time for big-O, Omega, and Theta to come along. Why do we care about these? They express how the running-time function of linear or binary search behaves when n is 'large enough'.
To be precise, the definition of big-O is the following.
For any function f : ℕ → ℝ≥0,
O(f) = {g : ℕ → ℝ≥0 | ∃c ∈ ℝ+, ∃B ∈ ℕ, ∀n ∈ ℕ, n ≥ B ⇒ g(n) ≤ c·f(n)}
So, informally, g ∈ O(f) means that g grows no faster than f, up to a constant factor, once n is large enough.
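As a sanity check on the definition, we can pick concrete witnesses c and B and spot-check the inequality numerically. The functions g and f below are my own example, not from the post:

```python
# Claim: g(n) = 3n + 5 is in O(f) for f(n) = n.
# Witnesses: c = 4 and B = 5, since 3n + 5 <= 4n exactly when n >= 5.
def g(n):
    return 3 * n + 5

def f(n):
    return n

c, B = 4, 5

# Spot-check the definition's inequality g(n) <= c*f(n) for n >= B.
assert all(g(n) <= c * f(n) for n in range(B, 10**4))
print("3n + 5 is in O(n) with witnesses c = 4, B = 5")
```

Of course, a finite check is not a proof; the definition demands the inequality for all n ≥ B, which here follows algebraically from 3n + 5 ≤ 4n ⟺ n ≥ 5.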