A way to find a tight asymptotic bound for a recurrence is via the master theorem, which requires the recurrence to have the following form:

$$T(n) = a \cdot T\!\left(\frac{n}{b}\right) + c \cdot n^k \log^i n$$

The term $c \cdot n^k \log^i n$ represents the work done in a function call, excluding the subproblems.
Master theorem does not work for all recurrences
The master theorem works for divide and conquer recurrences that match the recurrence form above. You may need to use another method (e.g. recursion tree, substitution) for incompatible recurrences.
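For example, $T(n) = T(n - 1) + n$ shrinks the problem by one element rather than by a constant factor, so it does not match the form and the master theorem does not apply; unrolling it directly gives $\Theta(n^2)$.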
Constant | Meaning
---|---
$a$ | Number of subproblems per problem. This applies recursively to subproblems, so each subproblem will also have $a$ sub-subproblems.
$b$ | The reduction factor, e.g. $b = 2$ means each subproblem has half the size of the problem, like in merge sort. This also applies recursively.
$c$ | Constant factor in the cost of a function call, i.e. the cost of splitting the problem and combining solutions. This won't matter for asymptotic runtime analysis.
$k$ | Exponent on $n$ in the cost of a function call, i.e. the cost of splitting the problem and combining solutions.
$i$ | Power on the logarithm in the cost of a function call.
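For example, merge sort (with its usual recurrence) splits a problem into $a = 2$ subproblems of half the size ($b = 2$) and does linear work to split and merge, so in this form:

$$T(n) = 2 \cdot T\!\left(\frac{n}{2}\right) + c \cdot n, \qquad a = 2,\ b = 2,\ k = 1,\ i = 0.$$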
Once we have $T(n)$ in the above form, we can use the master theorem to find a tight $\Theta$ bound for $T(n)$. Find the case that matches your recurrence from the table below, then prove the corresponding statement in the second column to get the runtime in the third column. You can choose any value of $\epsilon$ that makes the statement true (by visual inspection), then prove it via the limit lemma.
If … | then prove that … for some $\epsilon > 0$ | Runtime | Interpretation
---|---|---|---
$k < \log_b a$ | $n^k \log^i n \in O(n^{\log_b a - \epsilon})$ | $T(n) \in \Theta(n^{\log_b a})$ | Recursion costs asymptotically more than splitting and merging.
$k = \log_b a$ and $i = 0$ | $n^k \log^i n \in \Theta(n^{\log_b a})$ | $T(n) \in \Theta(n^{\log_b a} \log n)$ | Both cost the same asymptotically. There are $\Theta(\log n)$ levels and each level costs $\Theta(n^{\log_b a})$.
$k > \log_b a$ | $n^k \log^i n \in \Omega(n^{\log_b a + \epsilon})$ | $T(n) \in \Theta(n^k \log^i n)$ | Splitting and merging solutions cost asymptotically more than recursion.
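As a worked example (these recurrences are illustrative, not taken from the table above): merge sort's $T(n) = 2T(n/2) + c \cdot n$ has $k = 1 = \log_2 2$ and $i = 0$, so the second case applies and $T(n) \in \Theta(n \log n)$. For $T(n) = 4T(n/2) + c \cdot n$, we have $k = 1 < 2 = \log_2 4$, so the first case applies and $T(n) \in \Theta(n^2)$. For $T(n) = 2T(n/2) + c \cdot n^2$, we have $k = 2 > 1 = \log_2 2$, so the third case applies and $T(n) \in \Theta(n^2)$.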
The proof is only for educational purposes
If a case matches, then the master theorem already guarantees the tight bound without you having to prove the statement in the second column (as long as the recurrence matches the recurrence form exactly).
While proving the bounds, also find the $\epsilon$ that makes the relationship true.
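A sketch of such a proof, assuming the limit lemma in its usual form (if $\lim_{n \to \infty} f(n)/g(n) = 0$, then $f(n) \in O(g(n))$): for $T(n) = 4T(n/2) + c \cdot n$, the first case requires $n \in O(n^{2 - \epsilon})$ for some $\epsilon > 0$. Choosing $\epsilon = \tfrac{1}{2}$ gives $\lim_{n \to \infty} \frac{n}{n^{3/2}} = \lim_{n \to \infty} \frac{1}{\sqrt{n}} = 0$, so $n \in O(n^{3/2})$ by the limit lemma, and the runtime is $T(n) \in \Theta(n^2)$.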