Recurrence Functions #
Notes on Better Master Theorems for Divide-and-Conquer Recurrences
Linear recurrences #
A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs. In particular, we are concerned with the following form of linear recurrence $T(x) :[1, \infty) \to (0, \infty)$. $$ T(x) = \begin{cases} h(x) & \text { if } 1 \le x \le x _ 0 \\ \sum _ {i = 1}^k a _ i(x)T\p{b _ i(x)} + g(x) & \text { if } x > x _ 0 \end{cases} $$ This recurrence has several components:
- $a _ i(x): [1, \infty) \to (0, \infty)$ is a scaling function.
- $b _ i(x): [1, \infty) \to [1, \infty)$ where $1 \le b _ i(x) \le x$ is an indexing function.
- $g(x) : [x _ 0, \infty) \to (0, \infty)$.
If $g(n)$ is increasing on $\N$, and $a _ i(n)$, $b _ i(n)$ are increasing for $n > N _ 0$, then $T(n)$ is also increasing for $n > N _ 0$. We can prove this by induction: for $n \ge N _ 0$, $$ T(n + 1) - T(n) \ge \sum _ {i = 1}^k a _ i(n) \s{T(b _ i(n + 1)) - T(b _ i(n))} + \s{g(n + 1) - g(n)} \ge 0 $$ Moreover, $T(x)$ is linear in the initial condition $h(x)$ and the driving function $g(x)$.
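As a numeric sanity check (not a proof), we can tabulate a concrete instance of such a recurrence, say $T(n) = 2T(\floor{n/2}) + n$ with $T(1) = 1$, and confirm the values are nondecreasing; the instance here is chosen for illustration only:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    # Concrete instance: k = 1, a_1 = 2, b_1(n) = floor(n / 2), g(n) = n.
    if n == 1:
        return 1  # initial condition h(1) = 1
    return 2 * T(n // 2) + n

values = [T(n) for n in range(1, 500)]
assert all(x <= y for x, y in zip(values, values[1:]))  # T is increasing
```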
You may see asymptotic linear recurrences: $$ T(x)= \begin{cases} O(h(x)) & \text { for } 1 \leq x \leq x _ 0 \\ \sum _ {i=1}^k a _ i (x)T(b _ i(x))+ O(g(x)) & \text { for } x>x _ 0\end{cases} $$
- Where $O$ here may be replaced with $\Theta$ or $\Omega$.
Master Theorem -- Leighton recurrence #
Consider the following linear recurrence $T(x) :[1, \infty) \to (0, \infty)$.
Given constants $a _ i \in (0, \infty)$ and $b _ i \in (0, 1)$, perturbation functions $h _ i(x)$, and $g(x): (x _ 0, \infty) \to (0, \infty)$, $h(x): [1, x _ 0] \to (0, \infty)$. $$ T(x)= \begin{cases}h(x) & \text { for } 1 \leq x \leq x _ 0 \\ \sum _ {i=1}^k a _ i T\left(b _ i x+h _ i(x)\right)+g(x) & \text { for } x>x _ 0\end{cases} $$ We require the following conditions for the theorem to hold:
The driving function $g$ satisfies a polynomial-growth condition: $$ \exists c _ 1, c _ 2 > 0,\forall i,\forall x \in [1, \infty):g[b _ i x + h _ i(x), x] \subseteq [c _ 1g(x), c _ 2g(x)] $$
- Here $g[u, v]$ denotes the image of the interval $[u, v]$ under $g$. In particular, if $\abs{g'(x)}$ is bounded by a polynomial in $x$, this condition is satisfied.
The perturbation terms $h _ i(x)$ satisfy: $$ \exists \epsilon > 0, \forall x \in [x _ 0, \infty): \abs{h _ i(x)} \le \frac{x}{\log^{1 + \epsilon}x} $$
The constant $x _ 0$ must be sufficiently large; see Leighton's paper for the precise condition.
Then $T(x)$ is uniquely determined, and there is a unique $p \in \R$ such that: $$ \sum _ {i = 1}^k a _ i b _ i^p = 1, \quad T(x) \in \Theta\p{x^p \p{1 + \int _ 1^x \frac{g(u)}{u^{p + 1}} \dd u}} $$ The asymptotic class of $T(x)$ does not depend on the initial condition $h(x)$ at all: changing the initial conditions on $[1, x _ 0]$ leaves $T(x)$ in the same function class, so $h(x)$ can be ignored.
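A rough numeric sketch of this recipe: find $p$ by bisection (the sum $\sum _ i a _ i b _ i^p$ is strictly decreasing in $p$ since each $b _ i \in (0, 1)$), then read off the class from the integral. The function name below is illustrative, not from the paper:

```python
def solve_p(a, b, lo=-64.0, hi=64.0, tol=1e-12):
    """Find the unique p with sum(a_i * b_i**p) == 1 by bisection.

    The sum is continuous and strictly decreasing in p because
    every b_i lies in (0, 1), so the root is bracketed by [lo, hi].
    """
    f = lambda p: sum(ai * bi ** p for ai, bi in zip(a, b)) - 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# T(x) = 2 T(x/2) + x: p = 1, the integral of g(u)/u^(p+1) = 1/u is log x,
# so T(x) is Theta(x log x).
p_merge = solve_p([2.0], [0.5])

# T(x) = T(x/5) + T(7x/10) + x (selection-style): here p < 1, so the
# integral is Theta(x^(1-p)) and T(x) is Theta(x).
p_select = solve_p([1.0, 1.0], [0.2, 0.7])
```

For `p_merge` the bisection converges to $1$; `p_select` lands strictly below $1$, which is exactly what makes the linear bound work.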
Abbreviated notation of Leighton linear recurrences #
Due to the properties proved above, for Leighton linear recurrences:
- The initial conditions are not important as long as they are strictly positive.
- The threshold $x _ 0$, and hence the number of base-case points, can be chosen freely without changing the asymptotic behavior.
- Scaling the $g(n)$ function has no effect on the asymptotic behavior.
In this case, we can omit the details introduced above, and write recurrences as: $$ T(x) = \sum _ {i=1}^k a _ i T\left(b _ i x+h _ i(x)\right)+g(x) $$
Furthermore, when $g(x)$ is replaced with $O(g(x))$, $\Theta(g(x))$, or $\Omega(g(x))$, just substitute $g(x)$ itself and solve the concrete recurrence. The result is guaranteed to be in the class $O$, $\Theta$, or $\Omega$ of the concrete solution, respectively.
Recursion Tree #
Recursion tree #
A recursion tree is best used to generate a good guess for solving recurrences. It can then be verified by the substitution method.
- Each node represents the cost of a single subproblem (e.g. in splitting and merging).
- It is inherently a rough method, since we ignore floor and ceiling functions.
Master theorem #
Consider the Leighton recurrence $T(n) = a T(n / b) + \Theta(n^d)$.
$$ T(n) = \begin{cases} \Theta(n^d \log n) & \text{if } a = b^d\\ \Theta(n^d) & \text{if } a < b^d\\ \Theta(n^{\log _ b a}) & \text{if } a > b^d \end{cases} $$ In the case where $n$ is a power of $b$ and $b$ is an integer, we can draw the recursion tree and see that: $$ T(n) = a T\p{\frac{n}{b}} + n^d = n^d \cdot \sum _ {t = 0}^{\log _ b n}\p{\frac{a}{b^d}}^t $$ Otherwise, apply Leighton's formula with $p = \log a / \log b = \log _ b a$: $$ x^p \p{1 + \int _ 1^x u^{d - p - 1} \dd u} $$
- Suppose $d = p$, that is $a = b^d$. We have $T(n) \in \Theta(n^d \log n)$.
- Suppose $d < p$, that is $b^d < a$. We have $T(n) \in \Theta(n^{\log _ b a})$.
- Suppose $d > p$, that is $b^d > a$. We have $T(n) \in \Theta(n^d)$.
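The three-way comparison can be mechanized; below is a small helper (the name `master_case` is mine, not standard):

```python
import math

def master_case(a: float, b: float, d: float) -> str:
    """Classify T(n) = a T(n/b) + Theta(n^d) by comparing a with b**d."""
    if math.isclose(a, b ** d):
        return f"Theta(n^{d:g} log n)"
    if a < b ** d:
        return f"Theta(n^{d:g})"
    return f"Theta(n^{math.log(a, b):g})"

# Mergesort: a = 2, b = 2, d = 1 -> a = b^d -> Theta(n log n).
# a = 4, b = 2, d = 1 -> a > b^d -> Theta(n^2).
```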
Substitution Method #
Basically mathematical induction. We first guess a function form of $T(n)$, then prove it.
- Sometimes when the induction fails, try adding lower-order terms.
- Usually a missing linear term is the culprit: $n \log n \to n \log n + n$.
A basic trick is to use undetermined coefficients.
- First introduce the coefficients as variables.
- Choose their concrete values at the end.
Mostly we try to prove results of the form $T(n) = O(g(n))$.
In cases of $n / b$, we can safely ignore the ceil and floor here due to Leighton's theorem.
Example 1 #
$T(n) = 2 T(\floor{n / 2}) + n$. We guess $T(n) = O(n \log n)$.
Suppose $T(n) \le a n \log n + b n$.
$T(1) = 1 \le b$.
Suppose true for $T(k), k < n$. Consider $T(n)$. $$ T(n) \le \p{a n \log \frac{n}{2} + b n} + n = an \log n - an \log 2 + (1 + b)n $$
We need $(1 - a \log 2)n \le 0$, i.e. $a \ge 1 / \log 2$; taking logarithms base 2, this means $a \ge 1$.
Therefore, taking $a = 1$ and $b = 1$ is good enough.
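To sanity-check the chosen constants, we can compute $T$ exactly and compare against $n \log _ 2 n + n$ (i.e. $a = b = 1$, reading $\log$ as $\log _ 2$); a quick numeric sweep, not a proof:

```python
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T(n: int) -> int:
    return 1 if n == 1 else 2 * T(n // 2) + n

# Check T(n) <= n log2(n) + n over a range; equality holds at powers of 2.
assert all(T(n) <= n * math.log2(n) + n for n in range(1, 4096))
```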
Example 2 #
$T(n) = T(n / 5) + T(7n / 10) + n$. We can see that $T(n) = \Theta(n)$.
Suppose $T(n) \le cn$.
Base case: for each base value $k$ with $T(k) = 1$, we need $1 \le c \cdot k$, i.e. $c \ge 1/k$; any $c \ge 1$ works.
Suppose true for $T(k), k < n$. Consider $T(n)$. $$ T(n) \le \frac{cn}{5} + \frac{7cn}{10} + n = \frac{9cn}{10} + n \le cn $$ The last inequality gives $c \ge 10$.
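A numeric check of the bound $T(n) \le 10n$ on an integer version of the recurrence (floors added so the recursion terminates; by Leighton's theorem this does not change the class, and the small base case below is my choice):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    if n < 10:  # arbitrary small base case, T = 1
        return 1
    return T(n // 5) + T(7 * n // 10) + n

assert all(T(n) <= 10 * n for n in range(1, 20000))
```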
Example 3 #
$T(n) = 4 T(n / 2) + 100n$.
Guess that $T(n) \le an^2 + b n$.
$T(1) = 1 \le a + b$.
Suppose true for $T(k), k < n$. Consider $T(n)$. $$ T(n) \le 4\p{a\p{\frac{n}{2}}^2 + b \cdot \frac{n}{2}} + 100n = an^2 + (2b + 100)n $$
So $2b + 100 \le b$. We must have $b \le -100$.
Pick $b = -100$ and $a = 200$.
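Numerically checking the resulting bound $T(n) \le 200n^2 - 100n$ on the floored recurrence:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    return 1 if n == 1 else 4 * T(n // 2) + 100 * n

# Bound with a = 200, b = -100; floors only make T smaller.
assert all(T(n) <= 200 * n * n - 100 * n for n in range(1, 4096))
```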
Example 4 #
$$ T(n) = T(n / 4) + T(n / 2) + n^2 $$
With Leighton's method, $T(n) = \Theta(n^2)$.
Draw the recursion tree: the per-level costs are $n^2 + \frac{5}{16} n^2 + \p{\frac{5}{16}}^2 n^2 + \cdots$, a geometric series with ratio $5/16 = (1/4)^2 + (1/2)^2$. The series converges, giving $O(n^2)$.
Or use substitution method: suppose $T(n)\le cn^2$.
$T(1) = 1 \le c$.
Induction step: $$ T(n) \le \frac{cn^2}{16} + \frac{cn^2}{4} + n^2 = \p{\frac{5c}{16} + 1} n^2 \le cn^2 $$ The last inequality holds whenever $c \ge 16/11$.
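Indeed, $5c/16 + 1 \le c$ holds with equality at $c = 16/11$; a quick check on an integer version (floors and the base case are my additions):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    if n < 4:  # small base case so the recursion bottoms out
        return 1
    return T(n // 4) + T(n // 2) + n * n

assert all(T(n) <= (16 / 11) * n * n for n in range(1, 4096))
```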
Trick 1: skip first terms #
Consider $T(n) = 2T(\floor{n / 2}) + n$. Find its upper bound.
The guess is $T(n) \le c n \log n$.
- $T(1) = 1$, no $c$ works since $\log 1 = 0$.
- You can try to start with $cn \log n + bn$.
Strategy: notice that $T(2) = 4$, $T(3) = 5$ (taking $T(1) = 1$), and for $n \ge 4$, $T(n)$ does not depend on $T(1)$.
We remove $T(1)$ from induction, and instead show $\forall n \ge 2: T(n) \le cn \log n$.
$4 \le 2c$ and $5 \le 3c \log 3$ are the initial conditions.
Now suppose the claim holds for all $2 \le n < m$, and consider even $m$: $$ T(m) = 2 T(m / 2) + m \le c m \log _ 2 m + (1 - c)m \le cm \log _ 2 m $$ For odd $m$, it works similarly.
The induction step shows $c \ge 1$ is good enough, and the initial conditions require $c \ge 2$.
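A numeric check that $c = 2$ works from $n = 2$ on, and that the bound genuinely fails at $n = 1$:

```python
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T(n: int) -> int:
    return 1 if n == 1 else 2 * T(n // 2) + n

assert all(T(n) <= 2 * n * math.log2(n) for n in range(2, 4096))
assert not T(1) <= 2 * 1 * math.log2(1)  # fails at n = 1 since log 1 = 0
```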
Trick 2: change of variable #
Consider the recurrence: $$ T(n) = 2 T(\floor{\sqrt n}) + \log n $$
Writing $n = 2^{\log n}$ (and ignoring the floor): $$ T(2^{\log n}) = 2 T(2^{\log n/2}) + \log n $$ Substituting $m = \log n$ and $G(m) = T(2^m)$: $$ G(m) = 2 G(m / 2) + m = O(m \log _ 2 m) $$ Substituting back, $T(n) = O(\log n \log \log n)$.
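A numeric check of the final bound: the ratio $T(n) / (\log n \cdot \log \log n)$ should stay bounded as $n$ grows (the base case below is my choice):

```python
from functools import lru_cache
import math

@lru_cache(maxsize=None)
def T(n: int) -> float:
    # T(n) = 2 T(floor(sqrt(n))) + log2(n), bottoming out at n <= 2.
    if n <= 2:
        return 1.0
    return 2 * T(math.isqrt(n)) + math.log2(n)

# Evaluate at n = 2^k, so log2(n) = k; the ratio stays in a narrow band.
ratios = [T(2 ** k) / (k * math.log2(k)) for k in range(4, 64)]
assert max(ratios) < 3 and min(ratios) > 0.5
```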