

The Efficiency of Algorithms
An Introduction


Prof. David Bernstein
James Madison University

Computer Science Department
bernstdh@jmu.edu


An Obvious Way to Measure Efficiency
  • The Process:
    • Write the necessary code
    • Measure the running time and space requirements
  • An Important Caveat:
    • It would be a big mistake to run the application only once
    • Instead, one must develop a sampling plan (i.e., a way of developing/selecting sample inputs) and test the application on a large number of inputs
Problems with This Approach
  • It is often very difficult to test algorithms on a large number of representative inputs.
  • The efficiency can't be known a priori, and it is often prohibitively expensive to implement and test an algorithm only to learn that it is not efficient.
An Alternative Approach: Bounds
  • The Idea:
    • Obtain an upper and/or lower bound on the time and space requirements
  • An Observation:
    • Bounds are used in a variety of contexts
Bounds (cont.)
  • A Common Question at Job Interviews:
    • How many hairs do you have on your head?
  • Obtaining a Bound:
    • Suppose a human hair occupies at least a square of 100 microns on a side (where one micron is one millionth of a meter)
    • Then, each hair occupies at least \(0.0001 \times 0.0001 = 0.00000001\) m\(^{2}\) in area
    • Suppose further that your head has at most 400 cm\(^{2}\) (i.e., 0.04 m\(^{2}\)) in total area
    • Then, your head has at most \(0.04 / 0.00000001 = 4\) million hairs
Bounds (cont.)
  • Worst Case Analysis:
    • Instead of trying to determine the "actual" performance experimentally, one instead attempts to find theoretical upper bounds on performance
  • Some Notation:
    • The worst case running time for an input of size \(n\) is denoted by \(T(n)\)
    • The worst case space requirement for an input of size \(n\) is denoted by \(S(n)\)
Comparing Different Algorithms
  • Which is Better?
    • An algorithm with worst case time of \(n^{2}+5\)
    • An algorithm with worst case time of \(n^{2}+2\)
  • Do You Care When \(n=1000\)?
    • \(1000^{2}+5 = 1{,}000{,}005\)
    • \(1000^{2}+2 = 1{,}000{,}002\)
Asymptotic Dominance
  • The Idea:
    • When comparing algorithms based on their worst case time or space it seems best to consider their asymptotic performance (i.e., their performance on "large" problems)
  • Applying this Idea:
    • Consider the limit of the worst case time or space
I Hate Math - Does Efficiency Really Matter?
  • Compare Two Algorithms When:
    • One requires \(n^{3}\) iterations and one requires \(2^{n}\) iterations
    • Each iteration requires one microsecond (i.e., one millionth of a second)
  • Different Problem Sizes:

    \(T(n)\)     \(n=10\)      \(n=25\)       \(n=50\)       \(n=75\)
    \(n^{3}\)    0.001 sec     0.016 sec      0.125 sec      0.422 sec
    \(2^{n}\)    0.001 sec     33.554 sec     35.702 yrs     11,979,620.707 centuries
Landau Numbers
  • Defining "little o":
    • \(T=o[f(n)] \mbox{ iff } \lim_{n \rightarrow \infty } \frac{T(n)}{f(n)} = 0\)
  • An Example: \(n^{2}+5\)
    • "little o" of \(n^{3}\) since \(\lim_{n \rightarrow \infty }[\frac{(n^{2}+5)}{n^{3}}]=0\)
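The limit itself follows by dividing each term of the numerator by \(n^{3}\):

```latex
\lim_{n \rightarrow \infty} \frac{n^{2}+5}{n^{3}}
  = \lim_{n \rightarrow \infty} \left( \frac{1}{n} + \frac{5}{n^{3}} \right)
  = 0 + 0
  = 0
```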
"big O" Notation
  • Interpreting "little o":
    • When \(T\) is "little o" of \(f\) it means that \(T\) is of lower order of magnitude than \(f\)
  • Defining "big O":
    • \(T=O[f(n)]\) iff there exist positive \(k\) and \(n_{0}\) such that \(T(n) \leq k f(n)\) for all \(n \geq n_{0}\)
  • Interpreting "big O":
    • \(T\) is not of higher order of magnitude than \(f\)
"big O" Notation (cont.)
  • Show:
    • \(n^{2}+5\) is \(O(n^{2})\)
  • Choose \(k=3\) and \(n_{0}=2\) and observe:
    • \(n^{2}+5 \leq 3n^{2}\) for all \(n \geq 2\)
Lower Bounds on Growth Rates
  • Defining "Big Omega":
    • \(T= \Omega [g(n)]\) iff there exists a positive constant \(c\) such that \(T(n) \geq c g(n)\) for an infinite number of values of \(n\)
  • An Example: \(n^{2}+5\)
    • Is \(\Omega (n^{2})\) since, setting \(c=1\):
        \(n^{2}+5 \geq n^{2}\) for all \(n \geq 0\)
Categorizing Performance

Asymptotic Bound Name
\(O(1)\) Constant
\(O(\log n)\) Logarithmic
\(O(n)\) Linear
\(O(n^{2})\) Quadratic
\(O(n^{3})\) Cubic
\(O(a^{n})\) Exponential
\(O(n!)\) Factorial

Determining Asymptotic Bounds
  1. If \(T_{1} = O[f_{1}(m)]\) then \(k T_{1} = O[f_{1}(m)]\) for any constant \(k\)
  2. If \(T_{1}= O[f_{1}(m)]\) and \(T_{2}= O[f_{2}(m)]\) then \(T_{1}T_{2}=O[f_{1}(m)]O[f_{2}(m)]\)
  3. If \(T_{1}= O[f_{1}(m)]\) and \(T_{2}= O[f_{2}(m)]\) then \(T_{1}+T_{2}= O[\max \{ f_{1}(m), f_{2}(m) \}]\)
Determining Asymptotic Bounds (cont.)

Suppose an algorithm has three "steps" with running times of \(O(n^{4})\), \(O(n^{2})\), and \(O(\log n)\).

Then, it follows from rule 3 that the running time for the entire algorithm is \(O(n^{4})\).

Determining Asymptotic Bounds (cont.)

Suppose an algorithm has one "step" with a running time of \(O(n)\) and that it repeats this "step" (in a loop) 1000 times.

Then, it follows from rule 1 that the running time for the entire algorithm is \(O(n)\).

An Example: Factorials

What is the asymptotic bound on the worst case time efficiency of the following recursive algorithm?

    int factorial(int n) {
        // Check for valid input (n must be in 0..12 to avoid
        // infinite recursion and int overflow)
        if ((n < 0) || (n > 12)) return -1;
        if ((n == 0) || (n == 1)) return 1;
        return n * factorial(n - 1);
    }
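One way to answer: each call does a constant amount of work and makes a single recursive call on \(n-1\), so the worst case time satisfies the recurrence

```latex
T(n) = T(n-1) + c, \qquad T(1) = c
\quad \Rightarrow \quad
T(n) = c\,n = O(n)
```

The recursion also uses \(O(n)\) space, one stack frame per call.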
An Example: Factorials (cont.)

What is the asymptotic bound on the worst case time efficiency of the following iterative algorithm?

    int factorial(int n) {
        int i, value;
        // Check for valid input (n must be in 0..12)
        if ((n < 0) || (n > 12)) return -1;
        value = 1;
        for (i = 2; i <= n; i++) {
            value *= i;
        }
        return value;
    }
An Example: Factorials (cont.)

What is the asymptotic bound on the worst case time efficiency of the following algorithm? What about the space efficiency?

    int factorial(int n) {
        int f[13] = {1, 1, 2, 6, 24, 120, 720, 5040, 40320,
                     362880, 3628800, 39916800, 479001600};
        // Check for valid input (n must index the 13-entry table)
        if ((n < 0) || (n > 12)) return -1;
        return f[n];
    }
There's Always More to Learn