Mathematical Background
– Set concepts and notation
– Recursion
– Induction Proofs
– Logarithms
– Summations
– Recurrence Relations
Estimation Techniques
1. Determine the major parameters that affect the problem.
2. Derive an equation that relates the parameters to the problem.
3. Select values for the parameters, and apply the equation to yield an estimated solution.
Estimation Example
How many library bookcases does it
take to store books totaling one
million pages?
Estimate:
– Pages/inch
– Feet/shelf
– Shelves/bookcase
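A rough worked calculation, using illustrative values chosen here (assumptions, not figures from the text): about 500 pages per inch of shelf space, 3 feet per shelf, and 5 shelves per bookcase.
1,000,000 pages / 500 pages per inch = 2,000 inches ≈ 167 feet of shelving.
167 feet / 3 feet per shelf ≈ 56 shelves.
56 shelves / 5 shelves per bookcase ≈ 11–12 bookcases.
Different assumed values give different answers; the goal is only an order-of-magnitude estimate.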
Algorithm Efficiency
There are often many approaches (algorithms) to solve a problem. How do we choose between them?
At the heart of computer program design are two (sometimes conflicting) goals:
1. To design an algorithm that is easy to understand, code, and debug.
2. To design an algorithm that makes efficient use of the computer's resources.
Algorithm Efficiency (cont)
Goal (1) is the concern of Software
Engineering.
Goal (2) is the concern of data structures
and algorithm analysis.
When goal (2) is important, how do we measure an algorithm's cost?
How to Measure Efficiency?
1. Empirical comparison (run programs).
2. Asymptotic algorithm analysis.
Critical resources: running time, memory space.
Factors affecting running time: machine speed, programming language and compiler, the algorithm used, and the input.
For most algorithms, running time depends on the "size" of the input.
Running time is expressed as T(n) for some
function T on input size n.
Examples of Growth Rate
See the growth-rate examples on pages 51–55 of the text.
Best,Worst,Average Cases
Not all inputs of a given size take the same time to run.
Sequential search for K in an array of n integers:
– Begin at the first element in the array and look at each element in turn until K is found.
Best case: 1 comparison (K is in the first position).
Worst case: n comparisons (K is in the last position).
Average case: about n/2 comparisons, if K is equally likely to be in any position.
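A minimal C sketch of this search (illustrative code, not from the text), with the case costs noted in comments:

/* Sequential search: return the index of K in A[0..n-1], or -1 if not found. */
int seqsearch(const int A[], int n, int K) {
  int i;
  for (i = 0; i < n; i++)     /* examine each element in turn            */
    if (A[i] == K)
      return i;               /* best case: found at i=0, 1 comparison   */
  return -1;                  /* worst case: n comparisons               */
}
/* Average case (K equally likely in any position): about n/2 comparisons. */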
Which Analysis to Use?
While average time appears to be the fairest measure, it may be difficult to determine.
When is the worst case time important?
Faster Computer or Algorithm?
What happens when we buy a computer 10
times faster?
T(n)       n      n'      Change              n'/n
10n        1,000  10,000  n' = 10n            10
20n        500    5,000   n' = 10n            10
5n log n   250    1,842   √10·n < n' < 10n    7.37
2n²        70     223     n' = √10·n          3.16
2ⁿ         13     16      n' = n + 3          -----
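For example, for T(n) = 2n² the new machine does 10 times as much work in the same time, so 2(n')² = 10·2n², giving n' = √10·n ≈ 3.16n. For T(n) = 2ⁿ, 2^(n') = 10·2ⁿ means n' = n + log₂10 ≈ n + 3: a faster machine barely helps an exponential-time algorithm.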
Asymptotic Analysis: Big-oh
Definition: For T(n) a non-negatively valued function, T(n) is in the set O(f(n)) if there exist two positive constants c and n₀ such that T(n) <= c·f(n) for all n > n₀.
Usage: The algorithm is in O(n²) in the [best, average, worst] case.
Meaning: For all data sets big enough (i.e., n > n₀), the algorithm always executes in fewer than c·f(n) steps in the [best, average, worst] case.
Big-oh Notation (cont)
Big-oh notation indicates an upper bound.
Example: If T(n) = 3n², then T(n) is in O(n²).
We want the tightest upper bound: while T(n) = 3n² is also in O(n³), we prefer the tighter statement O(n²).
Big-Oh Examples
Example 1: Finding value X in an array (average cost), where cₛ is the cost of examining one element.
T(n) = cₛn/2.
For all values of n > 1, cₛn/2 <= cₛn.
Therefore, by the definition, T(n) is in O(n) for n₀ = 1 and c = cₛ.
Big-Oh Examples
Example 2: T(n) = c₁n² + c₂n in the average case.
c₁n² + c₂n <= c₁n² + c₂n² <= (c₁ + c₂)n² for all n > 1.
So T(n) <= cn² for c = c₁ + c₂ and n₀ = 1.
Therefore, T(n) is in O(n²) by the definition.
Example 3: T(n) = c. We say this is in O(1).
A Common Misunderstanding
"The best case for my algorithm is n = 1 because that is the fastest." WRONG!
Big-oh refers to a growth rate as n grows to ∞.
Best case refers to the cheapest input among all inputs of a given size n.
Big-Omega
Definition: For T(n) a non-negatively valued function, T(n) is in the set Ω(g(n)) if there exist two positive constants c and n₀ such that T(n) >= c·g(n) for all n > n₀.
Meaning: For all data sets big enough (i.e., n > n₀), the algorithm always executes in at least c·g(n) steps.
Ω gives a lower bound.
Big-Omega Example
T(n) = c₁n² + c₂n.
c₁n² + c₂n >= c₁n² for all n > 1.
T(n) >= cn² for c = c₁ and n₀ = 1.
Therefore, T(n) is in Ω(n²) by the definition.
We want the greatest lower bound.
Theta Notation
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation.
Definition: An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)).
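For example, the function T(n) = c₁n² + c₂n analyzed above is in both O(n²) and Ω(n²), so it is Θ(n²).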
A Common Misunderstanding
Confusing worst case with upper bound.
Upper bound refers to a growth rate.
Worst case refers to the worst input from
among the choices for possible inputs of
a given size.
Simplifying Rules
1. If f(n) is in O(g(n)) and g(n) is in O(h(n)), then f(n) is in O(h(n)).
2. If f(n) is in O(k·g(n)) for any constant k > 0, then f(n) is in O(g(n)).
3. If f₁(n) is in O(g₁(n)) and f₂(n) is in O(g₂(n)), then (f₁ + f₂)(n) is in O(max(g₁(n), g₂(n))).
4. If f₁(n) is in O(g₁(n)) and f₂(n) is in O(g₂(n)), then f₁(n)·f₂(n) is in O(g₁(n)·g₂(n)).
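For instance, if f(n) = 3n² + 17n log n, rule (2) lets us drop the constant factors and rule (3) keeps only the larger term, so f(n) is in O(n²).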
Running Time Examples (1)
Example 1: a = b;
This assignment takes constant time, so it is Θ(1).
Example 2:
  sum = 0;
  for (i=1; i<=n; i++)
    sum += n;
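A sketch of the analysis, assuming constant costs c₁ for the assignment and c₂ per loop iteration: the loop body executes exactly n times, so T(n) = c₁ + c₂n, which is Θ(n).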
Running Time Examples (2)
Example 3:
  sum = 0;
  for (i=1; i<=n; i++)
    for (j=1; j<=i; j++)
      sum++;
  for (k=0; k<n; k++)
    A[k] = k;
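The nested loops execute sum++ a total of 1 + 2 + ... + n = n(n+1)/2 times, and the final loop runs n times, so the whole fragment is Θ(n²).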
Running Time Examples (3)
Example 4:
  sum1 = 0;
  for (i=1; i<=n; i++)
    for (j=1; j<=n; j++)
      sum1++;
  sum2 = 0;
  for (i=1; i<=n; i++)
    for (j=1; j<=i; j++)
      sum2++;
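The first pair of loops increments sum1 exactly n² times; the second increments sum2 n(n+1)/2 times. Both fragments are Θ(n²), differing only by a constant factor.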
Running Time Examples (4)
Example 5:
  sum1 = 0;
  for (k=1; k<=n; k*=2)
    for (j=1; j<=n; j++)
      sum1++;
  sum2 = 0;
  for (k=1; k<=n; k*=2)
    for (j=1; j<=k; j++)
      sum2++;
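Assuming n is a power of 2, the outer loop runs log n + 1 times. The first fragment does n work per pass, so sum1 is incremented about n log n times: Θ(n log n). The second does k work per pass, and 1 + 2 + 4 + ... + n = 2n - 1, so it is Θ(n).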
Other Control Statements
while loop: Analyze like a for loop.
if statement: Take the greater complexity of the then/else clauses.
switch statement: Take the complexity of the most expensive case.
Subroutine call: Complexity of the subroutine.
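A minimal sketch of the if-statement rule (hypothetical function and assumed costs, for illustration only): whichever branch is taken, the statement's cost is bounded by its more expensive branch.

/* Hypothetical example: the if statement as a whole is O(n^2),
   the cost of its more expensive branch. */
int branch_cost(const int A[], int n, int use_fast) {
  int sum = 0, i, j;
  if (use_fast) {
    for (i = 0; i < n; i++)        /* then branch: Theta(n)   */
      sum += A[i];
  } else {
    for (i = 0; i < n; i++)        /* else branch: Theta(n^2) */
      for (j = 0; j < n; j++)
        sum += A[i] * A[j];
  }
  return sum;
}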
Analyzing Problems
Upper bound: the upper bound of the best known algorithm.
Lower bound: the lower bound for every possible algorithm.
Analyzing Problems: Example
Common misunderstanding: making no distinction between upper/lower bound when you know the exact running time.
Example of imperfect knowledge: Sorting
1. Cost of I/O: Ω(n).
2. Bubble or insertion sort: O(n²).
3. A better sort (Quicksort, Mergesort, Heapsort, etc.): O(n log n).
4. We prove later that sorting is Ω(n log n).
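So the best known upper bound O(n log n) and the lower bound Ω(n log n) for sorting meet: the sorting problem itself is Θ(n log n).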
Multiple Parameters
Compute the rank ordering for all C pixel
values in a picture of P pixels.
for (i=0; i<C; i++)      // Initialize count
  count[i] = 0;
for (i=0; i<P; i++)      // Look at all pixels
  count[value(i)]++;     // Increment count
sort(count);             // Sort pixel counts
If we use P as the measure, then time is Θ(P log P).
More accurate is Θ(P + C log C).
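Breaking it down: initializing the counts is Θ(C), scanning the pixels is Θ(P), and sorting the C counts is Θ(C log C), giving Θ(P + C log C) in total. Since C is typically a small fixed value (for example, 256 possible pixel values) while P can be very large, folding everything into P as Θ(P log P) overstates the cost.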
Space Bounds
Space bounds can also be analyzed with
asymptotic complexity analysis.
Time: determined by the algorithm.
Space: determined by the data structure.
Space/Time Tradeoff Principle
One can often reduce time if one is willing to
sacrifice space,or vice versa.
– Encoding or packing information: Boolean flags.
– Table lookup: factorials.
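A minimal sketch of the table-lookup tradeoff (illustrative code, not from the text): precomputing the factorials that fit in a 32-bit int trades a small, fixed amount of space for constant-time lookups instead of repeated multiplication.

/* Illustrative space/time tradeoff: precompute factorials.
   0! through 12! are the only factorials that fit in a signed 32-bit int. */
static const int FACT[13] = {
  1, 1, 2, 6, 24, 120, 720, 5040, 40320,
  362880, 3628800, 39916800, 479001600
};

int factorial(int n) {             /* constant time, at the cost of the table */
  if (n < 0 || n > 12) return -1;  /* out of range for this table             */
  return FACT[n];
}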
Disk-based Space/Time Tradeoff Principle:
The smaller you make the disk storage
requirements,the faster your program
will run.