[amp_mcq option1="Increases as log n" option2="Decreases as $$\log \left( {\frac{1}{n}} \right)$$" option3="Increases as n" option4="Increases as n log n" correct="option1"]
The correct answer is: A. Increases as $\log n$.
The entropy of a source is a measure of its uncertainty. A source with high entropy is more uncertain, while a source with low entropy is less uncertain.
In this case, the source emits $n$ symbols, each with the same probability $p = 1/n$, so every symbol is equally likely to be emitted. The entropy of this source is given by:
$H = -\sum_{i=1}^n p_i \log p_i$
where $p_i$ is the probability of symbol $i$ being emitted.
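As an illustration, here is a minimal Python sketch (not part of the original question) that evaluates this formula for an arbitrary probability distribution, using base-2 logarithms so the result is in bits:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    Terms with p = 0 are skipped, since they contribute nothing.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)
```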
Since all $n$ symbols are equally likely, each $p_i = 1/n$. Substituting into the formula gives:
$H = -\sum_{i=1}^n \frac{1}{n} \log \frac{1}{n} = \log n$
As $n$ increases, there are more possible symbols and each one becomes less probable, so the uncertainty of the source grows.
Therefore, the entropy of the source as a function of $n$ increases as $\log n$.
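As a quick numerical check (a hedged sketch, again assuming base-2 logarithms so entropy is measured in bits), the entropy of $n$ equiprobable symbols matches $\log_2 n$ exactly:

```python
import math

def entropy(probs):
    # Shannon entropy in bits; zero-probability terms contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

for n in (2, 4, 8, 16):
    probs = [1 / n] * n              # n equally likely symbols
    print(n, entropy(probs), math.log2(n))
# Prints H = 1, 2, 3, 4 bits: entropy grows as log2(n), not linearly in n.
```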
Option B is incorrect because $\log(1/n) = -\log n$ is negative for $n > 1$ and decreases as $n$ grows, whereas entropy is non-negative and does not decrease as more equally likely symbols are added.
Option C is incorrect because the entropy of a source does not increase linearly with $n$.
Option D is incorrect because the entropy of the source does not grow as $n \log n$.