Search This Blog

Sunday, November 30, 2014

Week 11 & Week 12: More analysis, Computability Theory

I feel really stuck. Over the last two weeks I reviewed and studied big-O and big-Omega: I went over the lecture slides and lecture notes, practised on my own repeatedly, and reviewed the tutorials and quizzes, and I finally know how to do those. But this time I think I am really stuck. I am worried that I might not be able to get this even by the exam. (I hope that doesn't happen!)

These were the hardest two weeks for me because the content felt really hard and I barely understood the material. Unfortunately, the last tutorial wasn't about computability or halt. Usually, when I'm stuck, tutorials are really helpful: I would understand the material after a tutorial, which kept me going and studying. I was hoping to work through some of these problems in the last tutorial, but we did more big-O and big-Omega instead.

In terms of terminology: if f is a well-defined function, that is, we can say what f(x) is for every x, but there is no way to compute f(x), then f is non-computable. As an example, we were introduced to the function halt as a non-computable function. Also, to prove that a function is non-computable, we use reductions.

Suppose we already know some function f is non-computable, and some other well-defined function g could be extended to build f. In other words:
If g is computable, then f is computable.
Taking the contrapositive (this is what it means for f to reduce to g):
If f is non-computable, then g is non-computable.
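As a sketch of why halt itself can't be computable, here is my own paraphrase of the lecture idea. This is not runnable code: halt(f, i) is a hypothetical function that would return True exactly when f(i) eventually halts, and the whole point is that no such computable function can exist.

```python
# Suppose, for contradiction, that halt(f, i) is computable and
# returns True exactly when f(i) eventually halts.

def clever(f):
    # Built so that clever(f) halts exactly when f(f) does NOT halt.
    while halt(f, f):    # hypothetical call: loop forever if f(f) halts
        pass
    return 42

# Now feed clever to itself:
#   clever(clever) halts  <=>  halt(clever, clever) is False
#                         <=>  clever(clever) does not halt.
# That is a contradiction, so no computable halt can exist.
```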

Unfortunately, I haven't fully understood halt or the process of proving a function's computability, so I don't think I can go further than this. I mean, I did sort of copy the lecture notes for Q6 in Assignment #3, but I'm not confident about it at all because I don't really understand it. It feels bad that I could not write this SLOG with better understanding, but I will keep trying until the exam and get help if I really can't figure it out by myself. (I usually prefer to try by myself as much as possible, because then it feels so much better and the content actually lasts longer in my head... I think.) So I guess at least I know what I have to do: review the slides and the lecture notes over and over until I understand, and practise and practise until I get close to perfect.

Sunday, November 16, 2014

Week 10: Linear search and Big-O

In the last part of the lecture, we learned about analyzing algorithms. For a given piece of Python code, we analysed (counted) the number of times each statement executes. Why do we do that? Why do we analyse how many steps there are in the code, what the total step count is, etc.? The answer is to measure the running time of the program.

For example,

def LS(A, x):
    """ Return an index i such that x == A[i]. Otherwise, return -1."""
    i = 0
    while i < len(A):
        if A[i] == x:
            return i
        i = i + 1
    return -1

If we trace code like this on a few examples, such as LS([2, 4, 6, 8], 4), we can find the running time of the program. This is an example of linear search, which overlaps with CSC108 content: linear search just goes through the items 'arranged in a line', one by one. Then there is binary search. Binary search is much faster than linear search, but it requires the list to be sorted first; because the list is sorted, each comparison lets us throw away half of the remaining items. For example, with linear search on one billion items there will be one billion iterations in the worst case (literally going through all one billion items); with binary search, the worst case needs only about 30 iterations, since log₂ of one billion is about 30. Now we see the difference.
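Here is a minimal binary search sketch to go with that comparison. This is my own code, not the lecture's version; it assumes, as the text says, that A is already sorted in non-decreasing order.

```python
def binary_search(A, x):
    """Return an index i such that A[i] == x, or -1 if x is not in A.

    Precondition (the assumption from the text): A is sorted in
    non-decreasing order. Each iteration discards half of the
    remaining range, so n items need only about log2(n) iterations.
    """
    lo, hi = 0, len(A) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if A[mid] == x:
            return mid
        elif A[mid] < x:
            lo = mid + 1    # x can only be in the right half
        else:
            hi = mid - 1    # x can only be in the left half
    return -1

print(binary_search([2, 4, 6, 8], 4))   # 1
```

Compare this with LS above: both return an index of x or -1, but binary search never looks at most of the list.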

Now it's time for big-O, big-Omega and big-Theta to come along. Why do we care about these? They express how the running-time functions we established for linear or binary search behave once n is 'large enough'.

To be precise, the definition of big-O is the following.

For any function f : N → R≥0,
O(f) = {g : N → R≥0 | ∃c ∈ R⁺, ∃B ∈ N, ∀n ∈ N, n ≥ B ⇒ g(n) ≤ c·f(n)}

So g ∈ O(f) means that f grows at least as fast as g, or f is an upper bound for g (up to the constant c, once n ≥ B). For Omega it is the other way around (f is a lower bound for g), and for Theta, g is bounded by f on both sides.
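To make the definition concrete, here is a small numeric illustration of my own (not an example from lecture): take g(n) = 3n + 2 and f(n) = n, and try the witnesses c = 5 and B = 1.

```python
def g(n):
    return 3 * n + 2

def f(n):
    return n

# Candidate witnesses for g ∈ O(f): c = 5, B = 1.
c, B = 5, 1

# Spot-check the definition's inequality g(n) <= c * f(n) for n >= B.
# A finite check is only an illustration, not a proof; the real proof
# is algebraic: for n >= 1, 3n + 2 <= 3n + 2n = 5n.
ok = all(g(n) <= c * f(n) for n in range(B, 1000))
print(ok)   # True
```

The proof style from lecture does the same thing symbolically: pick c and B, then show the inequality holds for every n ≥ B.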


Sunday, November 9, 2014

Week 8 & 9: Proving negation, Algorithm Analysis(2)

We've been taught algorithm analysis for about two weeks now. (Oh my big-O! *face-palm*) I honestly had no idea what we were doing for the first couple of lectures on this. I guess I wasn't connecting the proofs about big-O, etc., to the algorithm that we analyzed on the first day of algorithm analysis. I was lost during that time, because I didn't know where big-O and all the proving stuff came from all of a sudden.

After the chaos in my head, after tutorial and reviewing the material, something clicked and I finally made an 'Ah!' sound.

The start is to find out what the given code means. Once I understand the code, I count the number of executions of each line. If it is not easy to generalize at once, I start with a small input, see how many times each line executes, and repeat this several times; then I generalize from the results. (While practising this with the tutorial problems and lecture material, I got a little confused at some point and lost track of the count... meaning I forgot whether I had already counted a line or was about to count it... duh...)
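The counting above can be sketched by instrumenting the linear search from Week 10 with a step counter. The convention of one step per executed line is my own choice here; the lecture's exact tally may differ.

```python
def LS_steps(A, x):
    """Linear search that also tallies how many lines execute."""
    steps = 1                     # i = 0
    i = 0
    while True:
        steps += 1                # the loop guard: i < len(A)
        if not i < len(A):
            break
        steps += 1                # if A[i] == x
        if A[i] == x:
            steps += 1            # return i
            return i, steps
        steps += 1                # i = i + 1
        i = i + 1
    steps += 1                    # return -1
    return -1, steps

print(LS_steps([2, 4, 6, 8], 2))   # (0, 4): best case, found right away
print(LS_steps([2, 4, 6, 8], 5))   # (-1, 15): worst case, scans everything
```

Running it on a few inputs like this, then generalizing (here, the worst case is 3n + 3 steps for a list of length n), is exactly the count-then-generalize process described above.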

Proving still feels difficult for me. At this point, the proof style requires that we choose values for the existentially quantified variables and massage the given expression into a simpler form (like finding a bigger but easier bound, etc.). I definitely need more practice on this. When Danny does this with us in lecture, I can follow what we are doing, but I am not confident I could do it all by myself from the beginning. I know practice makes perfect, so I guess that's what I will do, and hopefully by next week I can understand this fully and write the proofs fairly well.

Week 8 & 9: Proving negation, Algorithm Analysis

Quizzes always make me nervous. Well, what wouldn't make me nervous if it is graded? But the thing about the quizzes is that they come back every week. That is a bit of pressure sometimes, but there is an advantage to taking quizzes as well. The good thing I found is that they are an opportunity to test whether I studied the material correctly. Each quiz problem asks about a fairly basic concept from the previous week. What I learned from the quizzes actually helped me in Assignment #2.

Again (I wrote about a quiz last week as well), I got Quiz #6 wrong. I think I misunderstood proving a negation, or didn't apply disproving a statement by proving its negation flexibly enough.

Suppose there is a statement that you should prove or disprove. Say that, after some brainstorming in my head, I figure out that the given statement is false, so I have to disprove it. To disprove the statement, most of the time you negate it and try to find a counterexample. If I can show a counterexample, that is enough to prove the negation is true, and if the negation is true, then the original statement is disproved.
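To see the pattern on a made-up statement (my own example, NOT the actual Quiz #6 one): disprove "for all positive reals x and y, x + y ≥ xy" by proving its negation, "there exist positive reals x and y with x + y < xy".

```python
# Made-up statement (not the actual quiz inequality):
#   "For all real numbers x, y, if x and y are both positive,
#    then x + y >= x * y."
# Negation: "There exist positive reals x, y with x + y < x * y."
# One counterexample is enough to prove the negation:

x, y = 3.0, 3.0
assert x > 0 and y > 0    # the counterexample satisfies the hypothesis
assert x + y < x * y      # 6.0 < 9.0, so the negation is true
print("counterexample works:", x + y, "<", x * y)
```

The code just checks the counterexample numerically; the written proof states the same two facts as proof lines.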

Here is the statement from Quiz #6:

                For all real numbers x, y, if x and y are both positive, then [the inequality itself was an equation image that is missing here].


Now, I want to disprove this statement, so I found the negation and showed that there exists a counterexample.


[The counterexample steps here were equation images that are missing. The shape of the argument: let x and y be specific positive values, show by algebra that the quiz inequality fails for them, conclude the negation holds, and thus the original statement is disproved.]


Note that we don't indent anything when showing a counterexample. You indent the next proof line only when you assume something, but here, to show the negation is true, we only introduce a witness for the existential, so we can write each step without any indentation.

After this quiz, I realized that I hadn't written some of my proofs in Assignment #2 properly. Thanks to the quiz, I could hand in proofs in the right format.