Analyzing run times of algorithms is often done in terms of "Big O" notation. This simplifies analysis, but requires some understanding of the rate of growth of some common functions.

For most practical algorithms the run times are polynomial:
bounded by some polynomial in **n**, the size of the problem.
Here are some functions that are bounded by a polynomial.
Note that **n^{k}×log(n)** is bounded by **n^{k+1}**, since **log(n)** grows more slowly than **n**.

f(n) | n = 10^{3} (1K) | n = 10^{6} (1M) | n = 10^{9} (1G) |
---|---|---|---|
1 | 1 | 1 | 1 |
log(n) | 10 | 20 | 30 |
√n | 3.2×10^{1} (32) | 10^{3} (1K) | 3.2×10^{4} (32K) |
n | 10^{3} (1K) | 10^{6} (1M) | 10^{9} (1G) |
n×log(n) | 10^{4} (10K) | 2×10^{7} (20M) | 3×10^{10} (30G) |
n^{2} | 10^{6} (1M) | 10^{12} (1T) | 10^{18} |
n^{3} | 10^{9} (1G) | 10^{18} | 10^{27} |
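The values in the table above can be reproduced with a few lines of Python. This is just a sanity-check sketch; logarithms are taken base 2, which matches the table (log_{2}(10^{3}) ≈ 10).

```python
import math

# Evaluate each polynomially bounded function at the table's n values.
for n in (10**3, 10**6, 10**9):
    print(f"n = {n:.0e}")
    print(f"  log(n)   = {math.log2(n):.0f}")
    print(f"  sqrt(n)  = {math.sqrt(n):.1e}")
    print(f"  n*log(n) = {n * math.log2(n):.1e}")
    print(f"  n^2      = {float(n**2):.0e}")
    print(f"  n^3      = {float(n**3):.0e}")
```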

Exhaustive search algorithms often have exponential or worse run times.
Here are some functions that are exponential or worse.
Note that the **n** values in this table are much smaller than in the
table above.

f(n) | n = 10 | n = 20 | n = 30 |
---|---|---|---|
2^{n} | 10^{3} (1K) | 10^{6} (1M) | 10^{9} (1G) |
3^{n} | 5.9×10^{4} (59K) | 3.5×10^{9} (3.5G) | 2.1×10^{14} |
n! | 3.6×10^{6} (3.6M) | 2.4×10^{18} | 2.7×10^{32} |
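Again, the table entries are easy to verify in Python. Even at these tiny values of **n**, the factorial column is already astronomically large:

```python
import math

# Exponential and factorial growth at small n.
for n in (10, 20, 30):
    print(f"n = {n}: 2^n = {float(2**n):.1e}, "
          f"3^n = {float(3**n):.1e}, "
          f"n! = {float(math.factorial(n)):.1e}")
```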

Some simple algorithms can be analyzed for run time by adding up the time spent in each line of code. This yields one term per line, each of the form

`run-time-for-a-single-execution` × `number-of-executions`

This is almost always easier with "Big O" notation, which can absorb unknown constants.

code | time |
---|---|
for j = 1 to A.length - 1 | O(n) |
key = A[j] | O(n) |
i = j | O(n) |
while i > 0 && A[i - 1] > key | Σt_{j}×O(1) |
A[i] = A[i - 1] | Σt_{j}×O(1) |
i = i - 1 | Σt_{j}×O(1) |
A[i] = key | O(n) |

In this analysis **t_{j}** is the number of steps down from
index **j** taken by the inner **while** loop when inserting **A[j]**.
In the worst case **t_{j}** can be as large as **j**, so the sum
**Σt_{j}** is bounded by **O(n^{2})**, which dominates the total run time.
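The pseudocode above is insertion sort. A direct Python translation (0-indexed, matching the pseudocode) might look like:

```python
def insertion_sort(A):
    """Sort list A in place; worst case O(n^2), best case O(n)."""
    for j in range(1, len(A)):
        key = A[j]
        i = j
        # Shift larger elements one slot right; this loop runs t_j times.
        while i > 0 and A[i - 1] > key:
            A[i] = A[i - 1]
            i -= 1
        A[i] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

On an already sorted input the inner loop never runs (every **t_{j}** is 0) and the algorithm takes O(n); on a reverse-sorted input each **t_{j}** equals **j**, giving the O(n^{2}) worst case.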