Time Complexity

Suppose computers were infinitely fast and computer memory was free. Would you have any reason to study algorithms? The answer is yes, if for no other reason than that you would still like to demonstrate that your solution method terminates and does so with the correct answer.

Excerpt from "Introduction to Algorithms"

Determining the exact running time f(n) of an algorithm is usually not worth the effort. Once the input is large enough and grows without bound (for all n ≥ n0), the multiplicative constants and lower-order terms of the exact running time become insignificant compared to its highest-order term g(n).

The asymptotic efficiency of an algorithm concerns input sizes large enough that only the order of growth of the running time is relevant.

Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.
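
To see why, here is a quick numeric sketch with hypothetical cost functions (the constants 100 and 2 are invented for illustration): even with a much larger constant factor, an n log n algorithm overtakes an n² one past a crossover point and stays ahead.

    import math

    # Hypothetical running times: an n log n algorithm with a large
    # constant factor versus an n^2 algorithm with a small one.
    def t_nlogn(n):
        return 100 * n * math.log2(n)

    def t_quadratic(n):
        return 2 * n * n

    # Find the first input size past which the asymptotically better
    # algorithm wins (it stays ahead, since n / log n is increasing).
    n = 2
    while t_nlogn(n) >= t_quadratic(n):
        n += 1
    print("n log n is faster for all n >=", n)   # 439 with these constants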

Asymptotic Notations

Informally, the term asymptotic means approaching a value or curve arbitrarily closely (that is, as some sort of limit is taken). A line or curve A that is asymptotic to a given curve C is called an asymptote of C.

Theta Notation (Θ-notation)

Θ(g(n)) is the set of all functions for which g(n) is an asymptotically tight bound.

g(n) is an asymptotically tight bound for f(n) if f(n) is a member of the set Θ(g(n)).

This is written f(n) ∈ Θ(g(n)), or more commonly f(n) = Θ(g(n)).

Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0
                  such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)
                  for all n ≥ n0 }
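
As a concrete illustration, take the classic example f(n) = n²/2 − 3n with g(n) = n²: the constants c1 = 1/14, c2 = 1/2 and n0 = 7 witness f(n) = Θ(n²), since n²/14 ≤ n²/2 − 3n ≤ n²/2 holds exactly when n ≥ 7. The sketch below samples the sandwich numerically (a spot check over a finite range, not a proof).

    # Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for sampled n >= n0,
    # with f(n) = n^2/2 - 3n, g(n) = n^2, c1 = 1/14, c2 = 1/2, n0 = 7.
    f = lambda n: n * n / 2 - 3 * n
    g = lambda n: n * n
    c1, c2, n0 = 1 / 14, 1 / 2, 7

    assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
    print("the Theta sandwich holds on the sampled range")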

Big-O Notation (O-notation)

g(n) is an asymptotic upper bound for f(n) if f(n) is a member of the set O(g(n)).

O(g(n)) = { f(n): there exist positive constants c and n0
                  such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
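
For instance, f(n) = 5n + 3 is O(n), witnessed by c = 6 and n0 = 3, because 5n + 3 ≤ 6n holds exactly when n ≥ 3. A minimal spot check of these witnesses:

    # Check 0 <= f(n) <= c*g(n) for sampled n >= n0,
    # with f(n) = 5n + 3, g(n) = n, c = 6, n0 = 3.
    f = lambda n: 5 * n + 3
    c, n0 = 6, 3

    assert all(0 <= f(n) <= c * n for n in range(n0, 10_000))
    print("the upper bound holds on the sampled range")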

Omega Notation (Ω-notation)

g(n) is an asymptotic lower bound for f(n) if f(n) is a member of the set Ω(g(n)).

Ω(g(n)) = { f(n): there exist positive constants c and n0
                  such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }
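
Note that a lower bound need not be tight. Reusing f(n) = n²/2 − 3n from above: besides being Ω(n²), it is also Ω(n), witnessed by c = 1 and n0 = 8, since n²/2 − 3n ≥ n holds exactly when n ≥ 8. A spot check:

    # Check 0 <= c*g(n) <= f(n) for sampled n >= n0,
    # with f(n) = n^2/2 - 3n, g(n) = n, c = 1, n0 = 8.
    f = lambda n: n * n / 2 - 3 * n
    c, n0 = 1, 8

    assert all(0 <= c * n <= f(n) for n in range(n0, 10_000))
    print("the lower bound holds on the sampled range")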

Little-o Notation (o-notation)

g(n) is a loose (not asymptotically tight) upper bound for f(n) if f(n) is a member of the set o(g(n)).

o(g(n)) = { f(n): for any positive constant c > 0, there exists
                  a constant n0 > 0 such that 0 ≤ f(n) < c·g(n)
                  for all n ≥ n0 }
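
Intuitively, f(n) = o(g(n)) says that f(n) becomes insignificant relative to g(n): the ratio f(n)/g(n) tends to 0 as n grows. A small numeric illustration: 2n = o(n²), while 2n² is not o(n²) because the ratio never shrinks below 2.

    # The ratio f(n)/g(n) -> 0 for f(n) = 2n, g(n) = n^2,
    # but stays constant at 2 for f(n) = 2n^2.
    for n in (10, 1_000, 100_000):
        print(n, (2 * n) / (n * n), (2 * n * n) / (n * n))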

Little-omega Notation (ω-notation)

g(n) is a loose (not asymptotically tight) lower bound for f(n) if f(n) is a member of the set ω(g(n)).

ω(g(n)) = { f(n): for any positive constant c > 0, there exists
                  a constant n0 > 0 such that 0 ≤ c·g(n) < f(n)
                  for all n ≥ n0 }
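
Symmetrically, f(n) = ω(g(n)) says that the ratio f(n)/g(n) grows without bound. For example, n²/2 = ω(n), but n²/2 is not ω(n²) because the ratio stays at 1/2.

    # The ratio f(n)/g(n) -> infinity for f(n) = n^2/2, g(n) = n,
    # but stays constant at 1/2 for g(n) = n^2.
    for n in (10, 1_000, 100_000):
        print(n, (n * n / 2) / n, (n * n / 2) / (n * n))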

Significance of Worst Case

  1. The worst-case running time of an algorithm is an upper bound on the running time for any input. It guarantees that the algorithm will never take any longer.

  2. For some algorithms, the worst case occurs often. For example, a searching algorithm's worst case often occurs when the information being searched for is not present in the data.

  3. The “average case” is often as bad as the worst case. For example, insertion sort's average-case running time on random input is Θ(n²), the same order of growth as its worst case.

To find the asymptotic worst-case running time g(n) of an algorithm, find the exact worst-case running time f(n) and determine g(n) such that f(n) = Θ(g(n)).
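
As a sketch of this recipe, consider insertion sort. Its worst case is reverse-sorted input, where every key is compared against the entire sorted prefix, so the exact worst-case comparison count is f(n) = n(n − 1)/2, and therefore f(n) = Θ(n²). The instrumented implementation below (written for this note, not taken from the excerpted text) confirms the count.

    # Count key comparisons made by insertion sort.
    def insertion_sort_comparisons(a):
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0:
                comparisons += 1          # one key comparison
                if a[j] <= key:
                    break
                a[j + 1] = a[j]           # shift larger element right
                j -= 1
            a[j + 1] = key
        return comparisons

    for n in (10, 100, 1_000):
        worst = list(range(n, 0, -1))     # reverse-sorted: the worst case
        assert insertion_sort_comparisons(worst) == n * (n - 1) // 2
    print("worst-case count matches n(n-1)/2 exactly")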

Average Case

The average case is usually found using probabilistic analysis.

In practice, most programming environments offer a pseudorandom-number generator: a deterministic algorithm returning numbers that “look” statistically random.
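
That determinism is easy to observe: reseeding the generator reproduces exactly the same "random-looking" sequence. A minimal sketch using Python's random module:

    import random

    # The same seed always yields the same sequence.
    random.seed(42)
    first = [random.random() for _ in range(3)]
    random.seed(42)
    second = [random.random() for _ in range(3)]

    assert first == second
    print(first)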