Asymptotic Notation for Analyzing Algorithm Runtime!
[Figure: plot of common Big-O growth rates. Source: apelbaum.files.wordpress.com/2011/10/yaacovapelbaumbigoplot.jpg]
Oftentimes it is useful to analyze the time it takes to run an algorithm in terms of the size of its input. A simple way to do this would be to write a function which outputs the number of "fundamental" instructions that run for an input of size n. However, most of the time we don't care about constant differences...we only care about how the workload grows with n!
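For instance, here's a minimal Python sketch of the "count the fundamental instructions" idea for a doubly nested loop (the choice of loop and of what counts as an instruction are my own, just for illustration):

```python
def count_steps(n):
    """Tally the "fundamental" instructions a doubly nested loop performs.

    Hypothetical illustration: treat each inner-loop iteration as one
    fundamental instruction and count them all.
    """
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1  # one "fundamental" instruction
    return steps

# The exact count is n^2; a constant factor (3n^2, n^2/2, ...) wouldn't
# change the story. What matters is that doubling n quadruples the work:
print(count_steps(10))  # 100
print(count_steps(20))  # 400
```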
In order to quantify this, we use a few different notations:
- Big-O notation: 2n^2 = O(n^3)
- Big-Omega notation: 2n^2 = Ω(n)
  (Written with the capital omega symbol, Ω)
- More!
Now what do these mean...
O(g(n)) is the set of all functions f(n) for which there exist constants c > 0 and n0 such that, for all n > n0, f(n) <= c*g(n)
(This means that there's a scalar c such that after a certain point [n0], f(n) will always be bounded above by a constant multiple of g(n))
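As a quick sanity check on the first example above (2n^2 = O(n^3)), here's a short Python sketch; the witnesses c = 2 and n0 = 1 are my own hand-picked choices, and checking finitely many n illustrates the definition rather than proving it:

```python
def f(n):
    return 2 * n**2

def g(n):
    return n**3

c, n0 = 2, 1  # hand-picked witnesses (assumption: these suffice)

# The definition asks that f(n) <= c*g(n) for all n > n0:
assert all(f(n) <= c * g(n) for n in range(n0 + 1, 10_000))
print("2n^2 <= 2*n^3 for every tested n > 1")
```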
Ω(g(n)) can be thought of as the complementary idea: being bounded below...
Ω(g(n)) is the set of all functions f(n) for which there exist constants c > 0 and n0 such that, for all n > n0, f(n) >= c*g(n)
(This means that there's a scalar c such that after a certain point [n0], f(n) will always be bounded below by a constant multiple of g(n))
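The same kind of spot-check works for the Omega example (2n^2 = Ω(n)); again, c = 2 and n0 = 1 are just one choice of witnesses that happens to work:

```python
def f(n):
    return 2 * n**2

c, n0 = 2, 1  # hand-picked witnesses again

# This time f(n) must be bounded *below* by c*g(n), with g(n) = n:
assert all(f(n) >= c * n for n in range(n0 + 1, 10_000))
print("2n^2 >= 2n for every tested n > 1")
```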
Abuse of notation!!!
Note that I said above that these notations describe sets!
When we say something like 2n = O(n^2), this should really be written as 2n ∈ O(n^2).
(The notation describes set membership...not really equality)
Examples:
2n = O(n^100)
(This is trivially true: for example, c = 1 and n0 = 2 work, since 2n <= n^100 for all n >= 2)
n^2 + 50n - 10 = O(n^2)
(This one is less obvious, but it means that there exist constants c and n0 such that, for all n > n0, n^2 + 50n - 10 <= c*n^2)
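One concrete pair of witnesses (my own pick; many others work) is c = 2 and n0 = 50: for n > 50 we have 50n - 10 < n^2, so n^2 + 50n - 10 < 2n^2. A quick numeric spot-check:

```python
def f(n):
    return n**2 + 50 * n - 10

c, n0 = 2, 50  # hand-picked witnesses

assert all(f(n) <= c * n**2 for n in range(n0 + 1, 10_000))
print("n^2 + 50n - 10 <= 2n^2 for every tested n > 50")
```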