2. A priori analysis and A posteriori analysis
• A priori analysis of algorithms
• It means we do analysis (space and time) of an algorithm prior to running it
on a specific system.
• That is, we determine time and space complexity of algorithm by just seeing
the algorithm rather than running it on a particular system (with different
processor and compiler).
• A posteriori analysis of algorithms
• It means we analyze the algorithm only after running it on a system.
• It directly depends on the system and it changes from system to system.
3. A priori analysis
• Algorithms
• Independent of language
• Hardware independent
• Time and Space function
• Results do not change
5. Analyzing algorithms
To analyze algorithms, their complexity must be calculated.
The most popular technique for expressing the complexity of
algorithms is Big-O notation.
6. Computational Complexity
• An algorithm is a series of self-contained steps that you follow in order to
achieve some goal, or to produce some output.
• Computational complexity is a continuum: some algorithms require
linear time (that is, the time required increases directly with the number of
items or nodes in the list, graph, or network being processed).
• Another group of algorithms requires quadratic or even exponential time to
complete (that is, the time required increases with the square of the number
of items, or with an exponential function of that number).
7. Big O notation
• Time complexity analysis in programming is just a simplified
mathematical way of analyzing how long an algorithm with a given number of
inputs (n) will take to complete its task. It is usually defined using Big-O notation,
which tells you how the running time of an algorithm grows.
• Big-O notation in data structures tells us how well an algorithm will perform in a
particular situation.
8. Big O notation
Assume we have the following program:
array = [2, 3, 4, 5, ……, 8]
int findSum(int[] array){
    int total = 0;
    for (int i = 0; i < array.length; i++)
        total += array[i];
    return total;
}
Instead of asking:
  How much time does it take to run this function?
  (This depends on the type of the machine.)
Ask:
  How does the run time of this function grow?
To answer this, use Big-O notation.
9. The general steps for Big-O runtime analysis are
as follows:
1. Figure out what the input is and what n represents.
2. Express the maximum number of operations the algorithm
performs in terms of n.
3. Eliminate all terms except the highest-order term.
4. Remove all the constant factors.
10. Eliminate all excluding the highest order terms
Regular     Big-O
2           O(1)    --> it's just a constant number
2n + 10     O(n)    --> n has the largest effect
5n^2        O(n^2)  --> n^2 has the largest effect
11. Common Time complexities
1. O(1) — Constant Time: Given an input of size n, it takes only a single
step for the algorithm to accomplish the task.
2. O(log n) — Logarithmic Time: Given an input of size n, the number of
steps it takes to accomplish the task is decreased by some factor with
each step.
3. O(n) — Linear Time: Given an input of size n, the number of steps
required is directly related (1 to 1) to n.
12. Common Time complexities cont..
4. O(n²) — Quadratic Time (polynomial): Given an input of size n, the
number of steps it takes to accomplish a task is the square of n.
5. O(C^n) — Exponential Time: Given an input of size n, the number of
steps it takes to accomplish a task is a constant raised to the power n (a
pretty large number).
14. Example:
let n = 16;
O (1) = 1 step "(awesome!)"
O (log n) = 4 steps "(awesome!)" -- assumed base 2
O (n) = 16 steps "(pretty good!)"
O(n^2) = 256 steps "(uhh..we can work with this)"
O(2^n) = 65,536 steps "(...)" (as n increases by 1, the step count roughly doubles)
17. O(1) – Example
//If I know the person's name, I only have to take one step to check:
function isFriend(name){ //similar to knowing the index in an Array
    return friends[name]; //(1 step)
}
isFriend('Mark') // returns true and only took one step
F(n) = 1;
________________________________________________________________________
function add(num1, num2){ // I have two numbers; it takes one step to return the value
    return num1 + num2; //(1 step)
}
Time Complexity: O(1)
Time Complexity: O(1)
18. O(1) – Example
void constantTimeComplexity(int arr[])
{
printf("First element of array = %d",arr[0]);
}
Answer: O(1)
Here, the input array could be 1 item or 1,000 items, but this function still
requires just one step.
20. O(log n) - Example
//You decrease the amount of work you have to do with each step
function thisOld(num, array){ //array must be sorted
    if( array.length === 0 ) return false;
    var midPoint = Math.floor( array.length / 2 );
    if( array[midPoint] === num ) return true;
    if( array[midPoint] < num ) //only look at the second half of the array
        return thisOld( num, array.slice(midPoint + 1) );
    //otherwise only look at the first half of the array
    return thisOld( num, array.slice(0, midPoint) );
}
When the input is divided with each iteration, it’s O(log n). Example: Binary Search
21. O(n) – Example
//The number of steps you take is directly correlated to the input size
function addAges(array){
var sum = 0;
for (let i=0 ; i < array.length; i++){ //has to go through each value
sum += array[i]
}
return sum;
}
22. O(n) – Example
void linearTimeComplexity(int arr[], int size)
{
for (int i = 0; i < size; i++)
{
printf("%d\n", arr[i]);
}
}
Answer: O(n)
This function runs in O(n) time (or "linear time"), where n is the number of
items in the array. If the array has 10 items, we have to print 10 times. If it
has 1000 items, we have to print 1000 times.
23. O(n²) – Example 1
function addedAges(array){
var addedAge = 0;
for (let i=0 ; i < array.length; i++){
for(let j=0 ; j < array.length ; j++){
addedAge += array[i][j];
}
}
return addedAge;
}
Note: If one for loop is linear time (n) Then two nested for loops are (n * n) or
(n^2) Quadratic!
Here we're nesting two loops. If our array has n items, our outer loop runs n times, and our inner loop runs n times for each iteration of the outer loop, giving us n^2 total prints. If the array has 10 items, we have to print 100 times. If it has 1,000 items, we have to print 1,000,000 times. Thus this function runs in O(n^2) time (or "quadratic time").
24. O(n²) – Example 2
void quadraticTimeComplexity(int arr[], int size)
{
for (int i = 0; i < size; i++)
{
for (int j = 0; j < size; j++)
{
printf("%d = %d\n", arr[i], arr[j]);
}
}
}
Answer: O(n^2)
25. O(2^n) – Example 1
The number of steps it takes to accomplish a task is a constant to the n power. For
example, trying to find every combination of letters for a password of length n.
26. O(2^n) – Example 2
int fibonacci(int num)
{
if (num <= 1) return num;
return fibonacci(num - 2) + fibonacci(num - 1);
}
Answer: O(2^n)
An example of an O(2^n) function is the recursive calculation of Fibonacci
numbers. O(2^n) denotes an algorithm whose growth doubles with each
addition to the input data set. The growth curve of an O(2^n) function is
exponential - starting off very shallow, then rising meteorically.
33. Example
public static void main(String[] args){
int a = 0, b = 0;
int N = 5, M = 5;
for (int i = 0; i < N; i++)
a += 5;
for (int i = 0; i < M; i++)
b += 10;
System.out.println(a + " " + b);
}
Answer: O(N + M)
The first loop runs N times and the second loop runs M times, so the total
number of steps is N + M, which is O(N + M).
34. More Examples:
Logarithmic algorithm – O(logn) – Binary Search.
Linear algorithm – O(n) – Linear Search.
Superlinear algorithm – O(n log n) – Heap Sort, Merge Sort.
Polynomial algorithm – O(n^c) – Strassen’s Matrix Multiplication, Bubble
Sort, Selection Sort, Insertion Sort, Bucket Sort.
Exponential algorithm – O(c^n) – Tower of Hanoi.
Factorial algorithm – O(n!) – Determinant Expansion by Minors, Brute force
Search algorithm for Traveling Salesman Problem.
35. Example - HW
int f(int n){
    if (n == 1)
        return 1;
    else
        return f(n - 1) + f(n - 1);
}
Every good developer wants to give their users more time, so they can do all those things they enjoy. They do this by minimizing time complexity.
At the far end of this continuum (quadratic or exponential time algorithms) lie intractable problems, those whose solutions cannot be efficiently implemented. For those problems, computer scientists seek heuristic algorithms that can almost solve the problem and run in a reasonable amount of time.
Any time the loop counter is increased by multiplication (for example, doubled each iteration), the time complexity of the loop is O(log n) (log base 2 when the counter doubles).