Wednesday 23 July 2014

Algorithm analysis

An algorithm is a well-ordered collection of unambiguous and effectively computable operations that when executed produces a result and halts in a finite amount of time [Schneider and Gersting 1995].
Internal Sort
The data to be sorted is stored entirely in the computer's main memory.
External Sort
Some of the data to be sorted may be stored on an external, slower device.
In-Place Sort
The amount of extra space required to sort the data is constant with respect to the input size.
A problem is said to be intractable if solving it by computer is impractical.
•Example: Algorithms with time complexity O(2ⁿ) take too long to run even for moderate values of n; a machine that executes 100 million instructions per second needs about 365 years to execute 2⁶⁰ instructions
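A quick sanity check of that figure, sketched in Python (nothing is assumed here beyond the machine speed and instruction count given above):

    # How long does executing 2**60 instructions take at 10**8 instructions/second?
    instructions = 2 ** 60
    per_second = 100_000_000                # 100 million instructions per second
    seconds = instructions / per_second
    years = seconds / (365 * 24 * 60 * 60)
    print(f"{years:.0f} years")             # prints "366 years", i.e. roughly 365 as claimed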
Constant Time Complexity
•Algorithms whose running time is independent of the size of the problem's input are said to have constant time complexity
•Constant time complexity is denoted as O(1)
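For example, reading a list element by its index takes the same amount of work however large the list is. A minimal Python sketch (the function name is illustrative, not from the notes):

    # O(1): the running time does not depend on len(data)
    def get_first(data):
        return data[0]                      # one indexing operation, regardless of input size

    print(get_first([7, 3, 9]))             # 7
    print(get_first(list(range(10**6))))    # 0 -- same amount of work for a million elements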
Selection Sort Algorithm
•The algorithm has deterministic complexity
-all possible instances of the problem (“best case”, “worst case”, “average case”) give the same number of operations, T(n) = n² − n = O(n²)
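A plain selection sort makes this visible: for a given n the two nested loops always perform the same comparisons, whatever order the input arrives in. A Python sketch (not code from the original notes):

    def selection_sort(a):
        """Sort a in place; the comparison count depends only on n, not on the data."""
        n = len(a)
        for i in range(n - 1):
            smallest = i
            for j in range(i + 1, n):       # this loop runs the same number of times
                if a[j] < a[smallest]:      # for every input of size n
                    smallest = j
            a[i], a[smallest] = a[smallest], a[i]
        return a

    print(selection_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]

Note that selection sort is also an in-place sort in the sense defined above: it needs only a constant amount of extra space.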
Characteristics of algorithms
Algorithms are well-ordered.
Algorithms have unambiguous operations.
Algorithms have effectively computable operations.
Algorithms produce a result.
Algorithms halt in a finite amount of time.
Quick sort: it is one of the earliest divide-and-conquer algorithms to be discovered. It was published by C. A. R. Hoare in 1962 and is still one of the fastest sorting algorithms in practice.
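A minimal sketch of the idea in Python. For brevity this uses the simple last-element (Lomuto) partition, not Hoare's original partitioning scheme:

    def quicksort(a, lo=0, hi=None):
        """Sort list a in place: partition around a pivot, then recurse on each side."""
        if hi is None:
            hi = len(a) - 1
        if lo < hi:
            p = partition(a, lo, hi)        # divide: split around the pivot
            quicksort(a, lo, p - 1)         # conquer: sort the smaller keys...
            quicksort(a, p + 1, hi)         # ...and the larger keys
        return a

    def partition(a, lo, hi):
        pivot = a[hi]                       # last element as pivot (Lomuto scheme)
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]           # put the pivot in its final position
        return i

    print(quicksort([3, 7, 1, 4, 1, 5]))    # [1, 1, 3, 4, 5, 7]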
Basic operation:
To analyze an algorithm, we isolate a particular operation fundamental to the problem, called the basic operation. In sorting, for example, the comparison of two keys is usually taken as the basic operation.
Complexity or work done by an algorithm:
The number of instructions executed or the number of storage bits used by a program can be taken as a complexity measure.
Complexity depends on the size of the input.
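For instance, a program can be instrumented to count its basic operations directly. A Python sketch (finding a maximum, with the key comparison taken as the basic operation; the names are illustrative):

    def find_max(a):
        """Return the maximum of a and the number of comparisons performed."""
        comparisons = 0
        best = a[0]
        for x in a[1:]:
            comparisons += 1                # one basic operation per remaining element
            if x > best:
                best = x
        return best, comparisons

    print(find_max([3, 1, 4, 1, 5]))        # (5, 4): n - 1 = 4 comparisons for n = 5
    print(find_max(list(range(100))))       # (99, 99): the count grows with the input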
Analysis of algorithms: criteria
a. Correctness
b. Amount of work done (complexity)
Complexity means the amount of work done, measured by some specified complexity measure, for example the number of basic operations performed. Complexity has nothing to do with how complicated or tricky an algorithm is; a very complicated algorithm may have low complexity.
Worst-case complexity:
The function W(n) is called the worst-case complexity of the algorithm. W(n) is the maximum number of basic operations performed by the algorithm on any input of size n. Determination of W(n) is called worst-case complexity analysis.
The worst-case complexity is valuable because it gives an upper bound on the work done by the algorithm. Worst-case analysis can also be used to estimate a time limit for a particular implementation of an algorithm (see the sequential-search sketch below).
c. Amount of space used
d. Simplicity and clarity
e. Optimality
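Returning to worst-case complexity, sequential search gives a familiar illustration (a Python sketch; sequential search is used here only as an example and is not discussed in the notes above). Its basic operation is the comparison, and W(n) = n, because in the worst case the key is absent and every element is examined:

    def sequential_search(data, key):
        """Return the index of key in data, or -1 if it is absent."""
        for i, item in enumerate(data):
            if item == key:                 # the basic operation: one comparison
                return i
        return -1

    # Worst case: the key is absent, so all n comparisons are performed -> W(n) = n.
    print(sequential_search([4, 8, 15, 16, 23, 42], 99))   # -1, after 6 comparisons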

Generalized searching routine: a procedure that processes an indefinite amount of data until it either exhausts the data or achieves its goal.
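A minimal sketch of that pattern in Python (the stream and the goal test are illustrative; itertools.count stands in for an indefinite supply of data):

    import itertools

    def search_stream(items, goal_reached):
        """Process items until the data is exhausted or the goal is achieved."""
        for item in items:                  # stops when the stream runs out...
            if goal_reached(item):          # ...or when the goal is achieved
                return item
        return None                         # data exhausted without reaching the goal

    # Example: find the first number divisible by 7 in a stream of unknown length.
    print(search_stream(itertools.count(1), lambda x: x % 7 == 0))   # 7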
