We can work through any of these problems together at office hours. I have also posted brief solutions so that you can check your answers and get some guidance on how to solve these problems.
- For the following program fragment, compute the worst-case asymptotic
time complexity as a function of n. Where it says ``loop body'' you
may assume that a constant number of lines of code appears there.
Briefly explain how you obtained your answer.
Hint: Write a nested summation to express the number of times the
loop body is executed; a sketch of this setup appears below the fragment.
for (i=0; i<=n-1; i++){
    for (j=i+1; j<=n-1; j++){
        loop body
    }
}
solution
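As a sketch of the hint (not a full solution): the loop body runs once for
each pair (i, j) with i < j, so the count is

\[
\sum_{i=0}^{n-1} \sum_{j=i+1}^{n-1} 1
  = \sum_{i=0}^{n-1} (n-1-i)
  = \sum_{k=0}^{n-1} k
  = \frac{n(n-1)}{2},
\]

from which the asymptotic bound can be read off.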
- For each of the following pairs of functions T1(n) and T2(n),
clearly answer these four questions: Is T1(n) = O(T2(n))?
Is T1(n) = Omega(T2(n))? Is T1(n) = Theta(T2(n))? And if you were
given two algorithms, A1 with time complexity T1(n) and A2 with time
complexity T2(n), which would you pick if your goal was to have the
fastest algorithm? Justify your answers using either the definitions
of big-O, big-Omega, and big-Theta, or the limit approach discussed
in class (restated after this list for convenience).
- (a) T1(n) = 6n^2, T2(n) = n^2 log n
solution
- (b) T1(n) = (3/2)n^2 + 7n - 4, T2(n) = 8n^2
solution
- (c) T1(n) = n^4, T2(n) = n^3 log n
solution
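For reference, the limit approach (as covered in class; defer to the lecture
statement if it differs): if lim_{n -> infinity} T1(n)/T2(n) is a constant c
with 0 < c < infinity, then T1(n) = Theta(T2(n)); if the limit is 0, then
T1(n) = O(T2(n)) but not Omega(T2(n)); if the limit is infinity, then
T1(n) = Omega(T2(n)) but not O(T2(n)). For example, with a pair not from
this list,

\[
\lim_{n\to\infty} \frac{n}{n \log n} = \lim_{n\to\infty} \frac{1}{\log n} = 0,
\]

so n = O(n log n) but n is not Theta(n log n).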
- Prove or disprove each of the following statements.
For those that you believe are false, prove this by giving a counterexample
(i.e., particular functions f(n) and g(n) for which the given statement
fails). For those that you believe are true, use the formal definitions
of big-O, big-Omega, and big-Theta to prove it. In all parts you are given
that f(n) >= 0 and g(n) >= 0 for all n. A template for a definition-based
proof, on an unrelated claim, appears after this list.
- (a) If f(n) = O(g(n)) then g(n) = O(f(n))
solution
- (b) f(n)+g(n) = O(max(f(n),g(n)))
solution
- (c) If f(n) = Omega(g(n)) then g(n) = O(f(n))
solution
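As a template on an unrelated claim, a definition-based proof typically looks
like this (the constants chosen below are one valid choice, not the only one).
Claim: n = O(n^2). By the definition of big-O we must exhibit constants
c > 0 and n_0 such that

\[
n \le c \, n^2 \quad \text{for all } n \ge n_0 .
\]

Choosing c = 1 and n_0 = 1 works, since n <= n^2 whenever n >= 1.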
- Decide whether each of the following statements is true or false.
(A comparison of sqrt(n) and log n that several parts rely on is
sketched after the list.)
- (a) 3 n^2 + 10 n log n = O(n log n)
- (b) 3 n^2 + 10 n log n = Omega(n^2)
- (c) 3 n^2 + 10 n log n = Theta(n^2)
- (d) n log n + n/2 = O(n)
- (e) 10 sqrt(n) + log n = O(n)
- (f) sqrt(n) + log n = O(log n)
- (g) sqrt(n) + log n = Theta(log n)
- (h) sqrt(n) + log n = Theta(n)
- (i) 2 sqrt(n) + log n = Theta(sqrt(n))
- (j) sqrt(n) + log n = Omega(1)
- (k) sqrt(n) + log n = Omega(log n)
- (l) sqrt(n) + log n = Omega(n)
solutions
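One comparison several of these parts turn on is the growth of log n relative
to sqrt(n). By the limit approach (applying L'Hopital's rule),

\[
\lim_{n\to\infty} \frac{\log n}{\sqrt{n}}
  = \lim_{n\to\infty} \frac{1/n}{\tfrac{1}{2} n^{-1/2}}
  = \lim_{n\to\infty} \frac{2}{\sqrt{n}} = 0,
\]

so log n grows strictly more slowly than sqrt(n).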
- In this problem you will compute the asymptotic time complexity of the
following divide-and-conquer algorithm. You may assume that n is a power
of 2. (NOTE: It doesn't matter what this function computes!) A sketch of
the recurrence setup follows the code.
float useless(float[] A){
    int n = A.length;
    if (n == 1){
        return A[0];                  // base case: single element
    }
    // split A into two halves A1 and A2
    float[] A1 = new float[n/2];
    float[] A2 = new float[n/2];
    for (int i = 0; i <= (n/2)-1; i++){
        A1[i] = A[i];
        A2[i] = A[n/2 + i];
    }
    // compare each A1[i] against every later-indexed A2[j]
    for (int i = 0; i <= (n/2)-1; i++){
        for (int j = i+1; j <= (n/2)-1; j++){
            if (A1[i] == A2[j])
                A2[j] = 0;
        }
    }
    float b1 = useless(A1);           // recurse on each half
    float b2 = useless(A2);
    return Math.max(b1, b2);
}
solution
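As a sketch of how the recurrence is set up (the work terms, not the final
bound): the splitting loop does work proportional to n, the double loop
examines pairs of indices in a half-size range, and there are two recursive
calls on inputs of size n/2, giving

\[
T(n) = 2\,T(n/2) + \Theta\!\left((n/2)^2\right) + \Theta(n)
     = 2\,T(n/2) + \Theta(n^2).
\]

Solving this recurrence (for example, with the techniques from class)
yields the answer.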
- Give an exact solution to each of the following recurrences, then use
induction to prove that your solution is correct. A worked example of this
unroll-and-verify method, on a recurrence not in the list, follows.
- (a) T(n) = 2T(n/2) + cn for n >= 2 a power of 2, T(1) = 1
solution
- (b) T(n) = 4T(n/2) + cn for n >= 2 a power of 2, T(1) = 1
solution
- (c) T(n) = 2T(n/4) + sqrt(n) for n >= 4 a power of 4, T(1) = 1
solution
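As a worked example on a recurrence not in this list, take
T(n) = T(n/2) + c with T(1) = 1, for n a power of 2. Unrolling suggests
an exact solution:

\[
T(n) = T(n/2) + c = T(n/4) + 2c = \cdots = T(1) + c \log_2 n
     = 1 + c \log_2 n .
\]

Induction then verifies it: the base case gives T(1) = 1 + c log_2 1 = 1,
and if T(n/2) = 1 + c log_2(n/2) = 1 + c(log_2 n - 1), then
T(n) = T(n/2) + c = 1 + c log_2 n.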