In all cases we examine only the leading term $a_d n^d$, since it dominates
the subject polynomial's growth. The lower-order terms would only
increase the $n_0$ needed to make the bounds hold true.
a.
We are seeking some constant $c$ such that $0 \le a_d n^d \le c n^k$.
If $k = d$, this simplifies
to $a_d \le c$. So any choice of $c \ge a_d$
gives a working upper bound on the subject polynomial.
If $k > d$, let $c_2 = k-d$. Our inequality holds when:
$$
\begin{align}
a_d &\le c n^{c_2} \\
\frac{a_d}{c} &\le n^{c_2} \\
\sqrt[c_2]{\frac{a_d}{c}} &\le n \\
\end{align}
$$
So after choosing $c$, we'd just have to wait for $n$ to be large enough to make the inequality hold true,
which must eventually happen: the left-hand side is a constant, while the right-hand side grows with $n$.
So $p(n) = O(n^k)$.
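As a quick numeric sanity check of the threshold formula $\sqrt[c_2]{a_d/c} \le n$, the sketch below picks arbitrary illustrative values ($a_d = 3$, $d = 2$, $k = 3$, $c = 1$ are not from the problem) and verifies the bound past the computed threshold:

```python
import math

# Arbitrary illustrative values: leading term 3*n^2, upper bound n^3, c = 1.
a_d, d, k, c = 3, 2, 3, 1
c2 = k - d                               # the exponent gap from the derivation
n0 = math.ceil((a_d / c) ** (1 / c2))    # threshold: the c2-th root of a_d/c

# Past the threshold, a_d * n^d <= c * n^k holds.
for n in range(n0, n0 + 500):
    assert a_d * n**d <= c * n**k
```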
b.
We seek some $c$ such that $0 \le c n^k \le a_d n^d$.
If $k = d$, this simplifies to $c \le a_d$, so any $c$ with
$0 \lt c \le a_d$ works.
If $k < d$, let $c_2 = d - k$. We would have the inequality we seek when:
$$
\begin{align}
c n^k &\le a_d n^d \\
c &\le a_d n^{c_2} \\
\frac{c}{a_d} &\le n^{c_2} \\
\sqrt[c_2]{\frac{c}{a_d}} &\le n \\
\end{align}
$$
So any $c \gt 0$ would work. We'd just have to wait for $n$ to be large enough to make the inequality hold
true, which must eventually happen since the left-hand side is a constant while the right-hand side grows with $n$.
So $p(n) = Ω(n^k)$.
c.
We seek two constants $c_1$ and $c_2$ (both greater than 0)
such that the inequality $0 \le c_1 n^k \le a_d n^d \le c_2 n^k$ holds.
Since $k = d$, this simplifies to $c_1 \le a_d \le c_2$. So we just need to choose $0 \lt c_1 \le a_d$ and
$c_2 \ge a_d$. Then we'll have tight lower and upper bounds.
So $p(n) = θ(n^k)$.
d.
Let $c_2 = k - d$. It is positive since $k \gt d$. We seek that $a_d n^d \lt c n^k$ for every $c \gt 0$, once $n$ is large enough.
This is true when:
$$
\begin{align}
a_d n^d &\lt c n^k \\
\frac{a_d}{c} &\lt n^{c_2} \\
\sqrt[c_2]\frac{a_d}{c} &\lt n \\
\end{align}
$$
This is clearly true for large $n$, since the left side is a constant and the right side grows with $n$. It is important
to observe that $c_2 \gt 0$. Otherwise, $n^{c_2}$ on the second line would shrink toward zero as $n$ grows,
while smaller and smaller choices of $c$ (values less than 1) make the left-hand side $a_d/c$ grow,
so the inequality could not hold for every $c$. But $c_2$ is positive.
So $p(n) = o(n^k)$.
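The "every $c$" requirement of little-o can be spot-checked numerically for ever smaller constants; the values $a_d = 5$, $d = 2$, $k = 3$ below are arbitrary illustrative picks, not from the problem:

```python
import math

# Little-o needs a_d * n^d < c * n^k for EVERY c > 0, once n passes
# the c-dependent threshold (a_d / c)^(1/c2) from the derivation.
a_d, d, k = 5, 2, 3
c2 = k - d
for c in (1.0, 0.1, 0.001):              # ever smaller constants
    n0 = math.floor((a_d / c) ** (1 / c2)) + 1
    for n in range(n0, n0 + 200):
        assert a_d * n**d < c * n**k     # strict inequality
```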
e.
We seek that $a_d n^d \gt c n^k$ for every $c \gt 0$. Let $c_2 = d-k$. It is greater than 0 since $k < d$.
Our inequality holds when:
$$
\begin{align}
a_d n^d &\gt c n^k \\
a_d n^{c_2} &\gt c \\
n^{c_2} &\gt \frac{c}{a_d} \\
n &\gt \sqrt[c_2]{\frac{c}{a_d}} \\
\end{align}
$$
which is clearly true for large $n$, since the left-hand side grows with $n$ and the right-hand side is a constant.
It is important to observe that $c_2 \gt 0$. Otherwise, the $n$ term on the second line of the above
would sit in the denominator, so the left-hand side would shrink as $n$ grows,
making it impossible for $n$ to keep up with larger and larger choices of $c$.
But $c_2$ is positive.
So $p(n) = ω(n^k)$.
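Symmetrically, the little-ω requirement can be spot-checked for ever larger constants; $a_d = 2$, $d = 3$, $k = 2$ below are arbitrary illustrative picks:

```python
import math

# Little-omega needs a_d * n^d > c * n^k for EVERY c > 0, once n
# exceeds the threshold (c / a_d)^(1/c2) from the derivation.
a_d, d, k = 2, 3, 2
c2 = d - k
for c in (1, 100, 10**6):                # ever larger constants
    n0 = math.floor((c / a_d) ** (1 / c2)) + 1
    for n in range(n0, n0 + 200):
        assert a_d * n**d > c * n**k     # strict inequality
```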
|   | A         | B            | O   | o   | Ω   | ω   | θ   |
|---|-----------|--------------|-----|-----|-----|-----|-----|
| a | $\lg^k n$ | $n^ε$        | yes | yes | no  | no  | no  |
| b | $n^k$     | $c^n$        | yes | yes | no  | no  | no  |
| c | $\sqrt n$ | $n^{\sin n}$ | no  | no  | no  | no  | no  |
| d | $2^n$     | $2^{n/2}$    | no  | no  | yes | yes | no  |
| e | $n^{\lg c}$ | $c^{\lg n}$ | yes | no  | yes | no  | yes |
| f | $\lg(n!)$ | $\lg(n^n)$   | yes | no  | yes | no  | yes |
a. By 3.24, the first function is little-o of the second.
b. By 3.13, the first function is little-o of the second.
c. The second function's exponent, $\sin n$, oscillates between $-1$ and $1$, so $n^{\sin n}$ oscillates between $n^{-1}$ and $n$. It is therefore infinitely often smaller and infinitely often larger than $\sqrt n$, and no asymptotic relation holds.
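A numeric spot-check illustrates the oscillation; the sample points $n = 11$ and $n = 33$ are just values where $\sin n$ happens to be near $-1$ and $+1$ respectively:

```python
import math

# At n = 11, sin(11) ≈ -1, so n^(sin n) ≈ 1/11, far below sqrt(11).
# At n = 33, sin(33) ≈ +1, so n^(sin n) ≈ 33, far above sqrt(33).
for n, expect_larger in ((11, False), (33, True)):
    val = n ** math.sin(n)
    root = math.sqrt(n)
    assert (val > root) == expect_larger
```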
d. We compare the two functions, starting with the conjecture that the second function times a
constant $c$ is less than the first.
$$
\begin{align}
c(2^{n/2}) &\lt 2^n \\
\lg c + \lg (2^{n/2}) &\lt \lg 2^n \\
\lg c + \frac{n}{2} &\lt n \\
2 \lg c + n &\lt 2n \\
\frac{2 \lg c}{n} + 1 &\lt 2 \\
\end{align}
$$
Here, the term $\frac{2 \lg c}{n}$ goes to 0 as $n$ grows. So
this inequality is always true for any $c$, for large enough $n$. So $2^{n} = ω(2^{n/2})$.
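The derivation's threshold, $n \gt 2 \lg c$, can be spot-checked for a few arbitrary constants:

```python
import math

# For any constant c, 2^n exceeds c * 2^(n/2) once n > 2*lg(c),
# because the ratio 2^n / 2^(n/2) = 2^(n/2) grows without bound.
for c in (1, 10, 10**6):
    n0 = math.floor(2 * math.log2(c)) + 1
    for n in range(n0, n0 + 100):
        assert 2**n > c * 2 ** (n / 2)
```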
e. We first transform the second function to be of base $n$, like the first, using the
change-of-base identity $\log_{n} c = \lg c / \lg n$.
$$
\begin{align}
c^{\lg n} &= n^{\log_{n} c^{\lg n}} \\
&= n^{\lg n \log_{n} c} \\
&= n^{\lg n \cdot \frac{\lg c}{\lg n}} \\
&= n^{\lg c} \\
\end{align}
$$
The two functions are identical, so each is both an upper and a lower bound on the other,
while neither strictly dominates. So $n^{\lg c} = θ(c^{\lg n})$, but the little-o and little-ω
relations do not hold.
f. We compare the two functions, conjecturing that the first is big-O of the second. We use
the upper exclusive bound from 3.29 as the value of n!.
We drop constant terms during this comparison.
$$
\begin{align}
\lg(n!) &\le c\lg(n^n) \\
\lg(n!) &\le \lg(n^{nc}) \\
n! &\le n^{nc} \\
\sqrt{2 \pi n} \left( \frac{n}{e} \right)^n e^{a_n} &\le n^{nc} \\
\frac{1}{2} \log_{n}(2 \pi n) + n(\log_n n - \log_n e) + \log_n e^{a_n} &\le nc \\
\frac{1}{2} \log_{n}(2 \pi) + \frac{1}{2} \log_{n}(n) + n \log_n n - n \log_n e + \frac{1}{12n}\log_n e &\le nc \\
n - n \log_n e + \frac{\log_n e}{12n} &\le nc \\
1 - \log_n e + \frac{\log_n e}{12n^2} &\le c \\
\end{align}
$$
The $\log_n e / 12n^2$ term goes to 0 as n increases. $- \log_n e$ is always negative, but it approaches
0 as n increases.
This inequality holds as long as $c > 1$, which is not enough for a little-o bound but is enough for
the big-O bound we conjectured.
We then test for a big-Ω bound. We don't have to repeat the algebra, because it mostly stays the same.
$$
\begin{align}
c \le 1 - \log_n e + \frac{\log_n e}{12n^2} \\
\end{align}
$$
This inequality holds for any constant $c \lt 1$ (for example $c = 0.25$), for large values of $n$, since the
right-hand side approaches 1 from below; but it fails for every $c \ge 1$.
Thus, we have enough for a big-Ω bound, but not enough for a little-ω bound.
Having met both bounds, $\lg(n!) = θ(\lg(n^n))$.
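As a rough numeric check of the θ relationship (a sketch using Python's `math.lgamma` to compute $\ln(n!)$), the ratio $\lg(n!) / \lg(n^n)$ stays between fixed positive constants and creeps toward 1, consistent with big-O and big-Ω holding while little-o and little-ω fail:

```python
import math

# lg(n!) = lgamma(n+1) / ln 2, and lg(n^n) = n * lg n. By Stirling, the
# ratio is roughly 1 - 1/ln(n): bounded between positive constants (θ)
# and slowly approaching 1, so neither strict bound can hold.
def ratio(n):
    return (math.lgamma(n + 1) / math.log(2)) / (n * math.log2(n))

assert 0.5 < ratio(100) < 1.0
assert ratio(10**6) > ratio(100)   # creeping upward toward 1
```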