Efficiency of an Algorithm

Factors that affect the efficiency of a program include:

  • Speed of the machine
  • Compiler
  • Operating system
  • Programming language
  • Size of the input 

In addition to these factors, the way a program's data is organized and the algorithm used to solve the problem also have a significant impact on its efficiency.

The efficiency of an algorithm can be computed by determining the amount of resources it consumes. 

The primary resources that an algorithm consumes are:

  • Time: The CPU time required to execute the algorithm.
  • Space: The amount of memory used by the algorithm for its execution.

The fewer resources an algorithm consumes, the more efficient it is.

Time/Space Tradeoff: 

  • It refers to a situation where you can reduce the use of memory at the cost of slower program execution, or reduce the running time at the cost of increased memory usage.
  • An example is storing data in compressed versus uncompressed form: compression saves memory but adds processing time.
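The compression example above can be sketched in Python using the standard zlib module. This is only an illustration of the tradeoff, not a prescription: the compressed copy occupies less memory, but every access pays extra CPU time to decompress.

```python
import zlib

# Repetitive sample data compresses well; real savings depend on the data.
data = b"algorithm efficiency " * 1000

# Space saved: the compressed copy is much smaller than the original.
compressed = zlib.compress(data)
print(len(data), len(compressed))

# Time cost: reading the data back requires decompression work.
restored = zlib.decompress(compressed)
assert restored == data
```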

Memory is extensible, but time is not. Therefore, time considerations generally override memory considerations.

Method for Determining Efficiency

To measure the time efficiency of an algorithm, you can write a program based on the algorithm, execute it, and measure the time it takes to run. 
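A minimal sketch of this empirical approach in Python, timing a hypothetical function with time.perf_counter (the function and input size here are arbitrary choices for illustration):

```python
import time

def sum_to(n):
    """Sum the integers 0 .. n-1 with an explicit loop."""
    total = 0
    for i in range(n):
        total += i
    return total

start = time.perf_counter()
sum_to(1_000_000)
elapsed = time.perf_counter() - start

# The measured value varies with the machine, OS, and interpreter,
# which is exactly the limitation discussed below.
print(f"elapsed: {elapsed:.4f} s")
```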

The execution time that you measure in this case would depend on a number of factors such as:

  • Speed of the machine
  • Compiler
  • Operating system
  • Programming language
  • Input data

However, we would like to determine how the execution time is affected by the nature of the algorithm.

The execution time of an algorithm is roughly proportional to the number of key operations (such as comparisons) it performs, which is a function of n, where n is the size of the input data.
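This can be made concrete by instrumenting a simple algorithm to count its key comparisons. The sketch below uses linear search (chosen only as an illustration); in the worst case, when the target is absent, the comparison count equals n:

```python
def linear_search(items, target):
    """Return (index, comparisons); comparisons grows with the input size n."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1            # one key comparison per element examined
        if value == target:
            return i, comparisons
    return -1, comparisons

# Worst case (target absent): comparisons == n, the size of the input.
print(linear_search(list(range(10)), -1))    # → (-1, 10)
print(linear_search(list(range(100)), -1))   # → (-1, 100)
```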

The rate at which the running time of an algorithm increases as a result of an increase in the volume of input data is called the order of growth of the algorithm. 

The order of growth of an algorithm is defined by using the big O notation.

The big O notation has been accepted as a fundamental technique for describing the efficiency of an algorithm.

The different orders of growth and their corresponding big O notations are:

Constant - O(1)
Logarithmic - O(log n)
Linear - O(n)
Loglinear - O(n log n)
Quadratic - O(n²)
Cubic - O(n³)
Exponential - O(2ⁿ), O(10ⁿ)
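Minimal sketches of a few of these orders of growth in Python (the function names and inputs are illustrative, not standard terminology):

```python
def constant(items):
    """O(1): one step regardless of the input size."""
    return items[0]

def logarithmic(n):
    """O(log n): the problem size is halved on each step."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def linear(items):
    """O(n): each element is touched once."""
    return sum(items)

def quadratic(items):
    """O(n²): every pair of elements is combined."""
    return [(a, b) for a in items for b in items]

print(logarithmic(16))             # → 4, since 16 → 8 → 4 → 2 → 1
print(len(quadratic([1, 2, 3])))   # → 9, i.e. n² pairs for n = 3
```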