Practical Info
Written exam: theory + exercises.
Held in mid-December or after Christmas.
Algorithm Performance
Algorithm: a precise set of instructions for how to solve a problem.
How to measure the performance of an algorithm?
- Write pseudocode
- Count the number of basic operations as a function of input size
Info
Basic operations: plus +, minus -, multiply *, divide /, modulo %, greater than >, less than <, a memory access, a function call, ...
In theoretical analysis, we don't really care about the exact number of basic operations; we bound it instead.
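As a minimal sketch of operation counting (Python; the function and the per-line accounting are illustrative, not a fixed convention):

```python
def max_with_count(xs):
    """Return the maximum of xs, counting basic operations along the way."""
    ops = 1                  # one memory access: best = xs[0]
    best = xs[0]
    for x in xs[1:]:         # n - 1 iterations for input size n
        ops += 1             # one comparison (x > best) per iteration
        if x > best:
            best = x
            ops += 1         # one assignment when a new maximum is found
    return best, ops

# The count lies between n and 2n - 1, i.e., it is proportional to n.
print(max_with_count([3, 1, 4, 1, 5, 9, 2, 6]))  # (9, 11)
```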
Upper and Lower Bound
- Upper bound: a function that the runtime is at most proportional to.
- Lower bound: a function that the runtime is at least proportional to.
The runtime should be expressed as a function of the input size \(n\).
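For instance (a standard illustration, counting only comparisons): insertion sort on \(n\) elements performs
\[
n-1 \;\le\; \#\text{comparisons} \;\le\; \frac{n(n-1)}{2},
\]
so \(n-1\) is a lower bound and \(n(n-1)/2\) an upper bound on its runtime, both as functions of \(n\).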
Big-O Notation
We have functions \(f(n)\) and \(g(n)\), where \(n\) is the input size.
"Less or Equal" - \(O\)
If \(\exists\) constants \(c>0,n_0>0\) s.t. \(\forall n>n_0\), \(f(n)\leq c\cdot g(n)\),
we say "\(f(n)\) is asymptotically upper bounded by \(g(n)\)", i.e., \(f(n)\in O(g(n))\).
"Greater or Equal" - \(\Omega\)
If \(\exists\) constants \(c>0,n_0>0\) s.t. \(\forall n>n_0\), \(f(n)\geq c\cdot g(n)\),
we say "\(f(n)\) is asymptotically lower bounded by \(g(n)\)", i.e., \(f(n)\in \Omega(g(n))\).
"Equal" - \(\Theta\)
If both \(f(n)\in O(g(n))\) and \(f(n)\in \Omega(g(n))\) hold,
we say "\(f(n)\) is asymptotically tightly bounded by \(g(n)\)", i.e., \(f(n)\in \Theta(g(n))\).
"Strictly Less" - \(o\)
If \(\forall\) constant \(c>0\), \(\exists n_0>0\) s.t. \(\forall n>n_0\), \(f(n)<c\cdot g(n)\),
we say "\(f(n)\) is asymptotically (strictly) upper bounded by \(g(n)\)", i.e., \(f(n)\in o(g(n))\).
"Strictly Greater" - \(\omega\)
If \(\forall\) constant \(c>0\), \(\exists n_0>0\) s.t. \(\forall n>n_0\), \(f(n)>c\cdot g(n)\),
we say "\(f(n)\) is asymptotically (strictly) lower bounded by \(g(n)\)", i.e., \(f(n)\in \omega(g(n))\).
A More Intuitive Way to Compare Two Functions
Let \(f(n),g(n)>0\) and suppose \(v=\lim_{n\to\infty}\frac{f(n)}{g(n)}\) exists (allowing \(v=\infty\)); then
- \(v=0\Rightarrow f(n)\in o(g(n))\) aka \(<_a\)
- \(v<\infty\Rightarrow f(n)\in O(g(n))\) aka \(\leq_a\)
- \(0<v<\infty\Rightarrow f(n)\in\Theta(g(n))\) aka \(=_a\)
- \(v>0\Rightarrow f(n)\in\Omega(g(n))\) aka \(\geq_a\)
- \(v=\infty\Rightarrow f(n)\in \omega(g(n))\) aka \(>_a\)
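This test can also be applied mechanically with a computer algebra system. A small sketch (Python with sympy; the helper compare and the example pairs are ours):

```python
import sympy

n = sympy.symbols("n", positive=True)

def compare(f, g):
    """Classify f against g via v = lim_{n->oo} f/g (assumes the limit exists)."""
    v = sympy.limit(f / g, n, sympy.oo)
    if v == 0:
        return "f in o(g)"                 # v = 0: f strictly smaller
    if v == sympy.oo:
        return "f in omega(g)"             # v = infinity: f strictly larger
    return f"f in Theta(g), limit = {v}"   # 0 < v < infinity: same order

print(compare(n, n**2))                # f in o(g)
print(compare(3*n**2 + 5*n, n**2))     # f in Theta(g), limit = 3
print(compare(n * sympy.log(n), n))    # f in omega(g)
```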
Other Definitions
\(O(T(n))\): If \(\exists c>0,n_0\geq 0\) s.t. \(\forall n\geq n_0\) and for every input of size \(n\), the algorithm terminates within \(c\cdot T(n)\) basic operations.
Knuth's definition of \(\Omega(T(n))\): If \(\exists c>0,n_0\geq 0\) s.t. \(\forall n\geq n_0\), \(\exists\) an input of size \(n\) on which the algorithm needs at least \(c\cdot T(n)\) basic operations to terminate.
Another definition of \(\Omega(T(n))\): If \(\exists c>0\) s.t. \(\forall n_0\), \(\exists n\geq n_0\) and an input of size \(n\) on which the algorithm needs at least \(c\cdot T(n)\) basic operations to terminate.
The second definition is weaker: it only requires slow inputs at infinitely many sizes, whereas Knuth's requires a slow input at every sufficiently large size. For example, an algorithm that takes \(n^2\) steps on inputs of odd size but only \(n\) steps on inputs of even size is \(\Omega(n^2)\) under the second definition but not under Knuth's. Unless otherwise specified, we can ignore the difference.