Consider the following recursive algorithm:
```python
def f(n: int) -> int:
    if n <= 1:
        return 1
    else:
        # g is some function whose time complexity is given below
        return g(n) + f(n // 2) + f(n // 2)
```
Given that $g(n)$ has time complexity $O(\log{n})$, what is the time complexity of $f(n)$? Let's see if we can work out the answer without using any powerful machinery one might initially be tempted to use.[^1] Most of the difficulty will actually be in evaluating sums, but this should be doable if you've done MATH1081.
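Equivalently, writing $T(n)$ for the running time of `f` on input $n$ ($T$ is just bookkeeping introduced here, not something appearing in the code), the question is to bound the solution of the recurrence

$$T(n) = 2\,T(n/2) + O(\log{n}), \qquad T(1) = O(1).$$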
In all that follows, we will make some simplifying assumptions:
- Assume that $n$ is a power of 2. This is OK, because otherwise we can replace it by $2^{\lceil \log_2{n} \rceil} \geq n$ instead and still obtain a suitable upper bound on the complexity of $f(n)$ (a short code sketch of this rounding step follows this list).
- Assume that $n$ is big enough that an upper bound on the amount of work $g(n)$ does is $C\log(n)$ for some constant $C$, and that this is implicitly the logarithm of base 2. This is also OK, because $g(n)$ has time complexity $O(\log{n})$, so this is just a formal interpretation of what this big-O statement means. Also, by change of base, all logarithms are asymptotically equivalent regardless of base, so we can just consider base 2.
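As a small aside, the rounding-up step in the first assumption is cheap to perform; here is a minimal sketch (the helper name `next_power_of_two` is just illustrative, not part of the algorithm above):

```python
def next_power_of_two(n: int) -> int:
    """Return the smallest power of 2 that is >= n, i.e. 2**ceil(log2(n)) for n >= 1."""
    p = 1
    while p < n:
        p *= 2
    return p

assert next_power_of_two(5) == 8
assert next_power_of_two(8) == 8
```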
Let's look at the call tree generated by `f`, i.e. a diagrammatic representation of how one call initiates further calls:

```
                 f(n)                     | Level 0
                 /  \
               /      \
             /          \
           /              \
         /                  \
      f(n/2)              f(n/2)          | Level 1
       /  \                /  \
     /      \            /      \
  f(n/4)  f(n/4)      f(n/4)  f(n/4)      | Level 2
    ...     ...         ...     ...       | Levels 3 .. log(n)
```
This tree has exactly $2^k$ nodes on level $k$: each one is a call to `f` with the input $n/2^k$, and each such call in turn calls `g` with the same input. This suggests that the total amount of work done on this level is bounded above by

$$2^k \cdot C\log\!\left(\frac{n}{2^k}\right) = C\,2^k\,(\log{n} - k).$$

So, the total amount of work done over the whole tree is bounded above by

$$\sum_{k=0}^{\log{n}} C\,2^k\,(\log{n} - k) = C\log{n}\sum_{k=0}^{\log{n}} 2^k \;-\; C\sum_{k=0}^{\log{n}} k\,2^k.$$
The first sum is obviously a geometric series, which evaluates to $\sum_{k=0}^{\log{n}} 2^k = 2^{\log{n}+1} - 1 = 2n - 1$. The second sum needs a little more care, but it has the closed form $\sum_{k=0}^{\log{n}} k\,2^k = (\log{n} - 1)\cdot 2n + 2$ (a derivation is sketched below). Putting these together, the total amount of work is bounded above by

$$C\log{n}\,(2n - 1) - C\big((\log{n} - 1)\cdot 2n + 2\big) = C\,(2n - \log{n} - 2) = O(n),$$

so $f(n)$ has time complexity $O(n)$.
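For completeness, here is one way to obtain that closed form for the second sum, using the shift-and-subtract (perturbation) trick from MATH1081. Write $m = \log{n}$ and $S = \sum_{k=0}^{m} k\,2^k$; then

$$2S = \sum_{k=0}^{m} k\,2^{k+1} = \sum_{k=1}^{m+1} (k-1)\,2^{k},$$

$$S = 2S - S = m\,2^{m+1} + \sum_{k=1}^{m}\big((k-1)-k\big)\,2^k = m\,2^{m+1} - (2^{m+1} - 2) = (m-1)\,2^{m+1} + 2.$$

Substituting $m = \log{n}$, so that $2^{m+1} = 2n$, gives $(\log{n} - 1)\cdot 2n + 2$ as used above.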
This answer intuitively suggests that the number of recursive calls (the tree has about $2n$ nodes in total) was the dominating factor in `f`'s complexity, not the work it did in each call. Sure enough, this is exactly what certain powerful machinery suggests should happen. This isn't a priori obvious just by looking at the code, though!
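As a sanity check (this snippet is just an illustration, not part of the analysis above), we can instrument the recursion under an idealised cost model in which each call to `g(n)` costs $\log_2{n}$ units and every call to `f` costs one unit of overhead; the total should grow roughly linearly in $n$:

```python
import math

def total_work(n: int) -> float:
    """Total cost of f(n) under an idealised model:
    each call contributes 1 (overhead) plus log2(n) for the call to g(n)."""
    if n <= 1:
        return 1.0
    return 1.0 + math.log2(n) + total_work(n // 2) + total_work(n // 2)

# The ratio total_work(2n) / total_work(n) approaches 2, i.e. linear growth.
for exp in (10, 12, 14, 16):
    n = 2 ** exp
    print(n, total_work(n), total_work(2 * n) / total_work(n))
```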
[^1]: Well, what we're about to do is, more or less, work through a specific case of a specific case of this powerful machinery.