J.E.D.I
The Big-Oh Notation
Although T(n) gives the actual amount of time taken in the execution of an algorithm, it is easier to classify the complexities of algorithms using a more general notation, the Big-Oh (or simply O) notation. T(n) grows at a rate proportional to n, and thus T(n) is said to have "order of magnitude n", denoted by the O-notation:

T(n) = O(n)

This notation is used to describe the time or space complexity of an algorithm. It gives an approximate measure of the computing time of an algorithm for large amounts of input. Formally, the O-notation is defined as:

g(n) = O(f(n)) if there exist two constants c and n0 such that
|g(n)| <= c |f(n)| for all n >= n0.
The following are examples of computing times in algorithm analysis:

Big-Oh         Description     Algorithm
O(1)           Constant
O(log2 n)      Logarithmic     Binary Search
O(n)           Linear          Sequential Search
O(n log2 n)                    Heapsort
O(n^2)         Quadratic       Insertion Sort
O(n^3)         Cubic           Floyd's Algorithm
O(2^n)         Exponential

To make the difference clearer, let's compare execution times where n = 1,000,000 and the time unit is 1 microsecond:
f(n)           Running Time
log2 n         19.93 microseconds
n              1.00 seconds
n log2 n       19.93 seconds
n^2            11.57 days
n^3            317.10 centuries
2^n            Eternity
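The figures in the table can be reproduced by direct arithmetic. The following sketch (class and method names are illustrative, not from the text) computes each running time for n = 1,000,000 at 1 microsecond per step:

```java
// Sketch: reproduce the running-time table for n = 1,000,000,
// assuming each basic step takes 1 microsecond.
public class GrowthRates {

    // Convert a step count into seconds at 1 microsecond per step.
    static double seconds(double steps) {
        return steps * 1e-6;
    }

    public static void main(String[] args) {
        double n = 1_000_000;
        double log2n = Math.log(n) / Math.log(2);   // ~19.93

        System.out.printf("log2 n : %.2f microseconds%n", log2n);
        System.out.printf("n      : %.2f seconds%n", seconds(n));
        System.out.printf("n log2n: %.2f seconds%n", seconds(n * log2n));
        System.out.printf("n^2    : %.2f days%n", seconds(n * n) / 86400.0);
        System.out.printf("n^3    : %.2f centuries%n",
                seconds(n * n * n) / (86400.0 * 365 * 100));
    }
}
```

Running it prints 19.93, 1.00, 19.93, 11.57, and 317.10, matching the table; 2^1000000 steps is far beyond any unit worth printing.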
1.8.2 Operations on the O-Notation
Data Structures 16
• Rule for Sums

Suppose that T1(n) = O(f(n)) and T2(n) = O(g(n)). Then,

T(n) = T1(n) + T2(n) = O(max(f(n), g(n))).

Proof: By definition of the O-notation, T1(n) <= c1 f(n) for n >= n1 and T2(n) <= c2 g(n) for n >= n2. Let n0 = max(n1, n2). Then

T1(n) + T2(n) <= c1 f(n) + c2 g(n)               for n >= n0
              <= (c1 + c2) max(f(n), g(n))       for n >= n0
              <= c max(f(n), g(n))               for n >= n0

where c = c1 + c2. Thus, T(n) = T1(n) + T2(n) = O(max(f(n), g(n))).
For example:

1. T(n) = 3n^3 + 5n^2 = O(n^3)
2. T(n) = 2^n + n^4 + n log2 n = O(2^n)
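The rule for sums can be seen concretely by counting steps. In this sketch (the class, method, and step counts are illustrative, not from the text), an O(n) phase runs before an O(n^2) phase, and the total is dominated by the larger term:

```java
// Illustrative sketch of the rule for sums: two consecutive phases,
// one taking O(n) steps and one taking O(n^2) steps, run in
// O(max(n, n^2)) = O(n^2) overall.
public class RuleForSums {

    // Tally the exact number of O(1) steps executed for input size n.
    static long countSteps(int n) {
        long steps = 0;
        // Phase 1: T1(n) = O(n)
        for (int i = 0; i < n; i++) steps++;
        // Phase 2: T2(n) = O(n^2)
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) steps++;
        return steps; // n + n^2, dominated by the n^2 term
    }

    public static void main(String[] args) {
        System.out.println(countSteps(1000)); // 1000 + 1000000 = 1001000
    }
}
```

For n = 1000 the n^2 phase contributes 1,000,000 of the 1,001,000 steps, which is why the smaller term is discarded.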
• Rule for Products

Suppose that T1(n) = O(f(n)) and T2(n) = O(g(n)). Then,

T(n) = T1(n) T2(n) = O(f(n) g(n)).

For example, consider the code snippet below:

for (int i = 1; i <= n; i++) {
    for (int j = i; j <= n; j++) {
        // steps taking O(1) time
    }
}

Since the steps in the inner loop will be executed n + (n-1) + (n-2) + ... + 2 + 1 times, the running time is

n(n+1)/2 = n^2/2 + n/2 = O(n^2)
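The triangular step count n(n+1)/2 can be checked by instrumenting the loop. This sketch (names are illustrative, not from the text) counts how many times the O(1) body actually runs:

```java
// Sketch verifying the step count of the dependent nested loop:
// the O(1) body runs n + (n-1) + ... + 2 + 1 = n(n+1)/2 times.
public class NestedLoopCount {

    // Count executions of the inner-loop body for input size n.
    static long bodyExecutions(int n) {
        long count = 0;
        for (int i = 1; i <= n; i++)
            for (int j = i; j <= n; j++)
                count++; // stands in for the O(1) body
        return count;
    }

    public static void main(String[] args) {
        int n = 100;
        System.out.println(bodyExecutions(n));      // 5050
        System.out.println((long) n * (n + 1) / 2); // 5050
    }
}
```

Both expressions agree for every n, confirming the closed form used in the analysis.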
1.8.3 Analysis of Algorithms
Example 1: Minimum Revisited
1.  public class Minimum {
2.
3.    public static void main(String[] args) {
4.      int a[] = { 23, 45, 71, 12, 87, 66, 20, 33, 15, 69 };
5.      int min = a[0];
6.      for (int i = 1; i < a.length; i++) {
7.        if (a[i] < min) min = a[i];
8.      }
9.      System.out.println("The minimum value is: " + min);
10.   }
11. }
In the algorithm, the declarations of a and min each take constant time. The constant-time if-statement in the for loop is executed n times, where n is the number of elements in the array a. The last line also executes in constant time.
Line Times Executed
4 1
5 1
6 n+1
7 n
9 1
Using the rule for sums, we have:

T(n) = 2n + 4 = O(n)

Since g(n) <= c f(n) for n >= n0, then

2n + 4 <= cn
2 + 4/n <= c

Thus c = 3 and n0 = 4. Therefore, the minimum algorithm is in O(n).
Example 2: Linear Search Algorithm
1  found = false;
2  loc = 1;
3  while (loc <= n && !found) {
4      if (item == a[loc]) found = true;
5      else loc = loc + 1;
6  }
Statement    # of Times Executed
1            1
2            1
3            n + 1
4            n
5            n
T(n) = 3n + 3, so that T(n) = O(n). Since g(n) <= c f(n) for n >= n0, then

3n + 3 <= cn
3 + 3/n <= c

Thus c = 4 and n0 = 3.
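The pseudocode above uses 1-based indexing; a minimal runnable Java version (0-based, with illustrative class and method names) might look like:

```java
// Sketch of the linear search above in runnable Java (0-based indexing).
// Returns the index of item in a, or -1 if it is not present; the loop
// body executes at most n times, so the search is O(n).
public class LinearSearch {

    static int search(int[] a, int item) {
        boolean found = false;
        int loc = 0;
        while (loc < a.length && !found) {
            if (item == a[loc]) found = true;
            else loc = loc + 1;
        }
        return found ? loc : -1;
    }

    public static void main(String[] args) {
        int[] a = { 23, 45, 71, 12, 87 };
        System.out.println(search(a, 12)); // 3
        System.out.println(search(a, 99)); // -1
    }
}
```

In the worst case (the item is absent or in the last position) the condition on line 3 of the pseudocode is tested n + 1 times, which is where the 3n + 3 total comes from.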
The following are the general rules for determining the running time of an algorithm:

● FOR loops
➔ At most the running time of the statements inside the for loop times the number of iterations.

● NESTED FOR loops
➔ Analysis is done from the innermost loop going outward. The total running time of a statement inside a group of nested for loops is the running time of the statement multiplied by the product of the sizes of all the for loops.

● CONSECUTIVE STATEMENTS
➔ The statement with the maximum running time.

● IF/ELSE
➔ Never more than the running time of the test plus the larger of the running times of the conditional blocks of statements.
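The rules above can be applied together on one piece of code. In this sketch (the class, method, and step counts are illustrative, not from the text), nested loops give O(n^2), the if/else inside costs no more than its larger O(1) branch, and of the two consecutive parts the O(n^2) one dominates:

```java
// Sketch applying the general rules: nested FOR loops give O(n * n) = O(n^2),
// the IF/ELSE body costs no more than its larger branch (O(1)), and the
// consecutive O(n) loop afterwards is dominated by the O(n^2) part.
public class GeneralRules {

    static long analyzeSteps(int n) {
        long steps = 0;
        // NESTED FOR loops with an IF/ELSE body: n * n * O(1) = O(n^2)
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                if ((i + j) % 2 == 0) steps++; // O(1) branch
                else steps++;                  // O(1) branch
        // CONSECUTIVE STATEMENT: a single O(n) loop, dominated by the above
        for (int i = 0; i < n; i++) steps++;
        return steps; // n^2 + n = O(n^2)
    }

    public static void main(String[] args) {
        System.out.println(analyzeSteps(100)); // 10000 + 100 = 10100
    }
}
```

By the rule for sums, the total n^2 + n reduces to O(max(n^2, n)) = O(n^2).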
1.9 Summary