(This posting is a minor revision of my answer on Math.StackExchange.)

In this posting, we compute:

\[\begin{equation} \lim_{n\to\infty} \left(\sum_{k=0}^{n} \binom{n}{k}^k \right)^{\frac{1}{n^2}} = \max_{p\in[0,1]} e^{pH(p)} = \exp\left( \max_{p\in[0,1]} p\,H(p) \right) \approx 1.5336, \tag{*} \end{equation}\]

where $H(p) = -p\log p - (1-p)\log(1-p)$ is the binary entropy function, and $\max_{p\in[0,1]} p\,H(p) \approx 0.4276362127321249214$.
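
As a quick numerical sanity check (my addition, not part of the argument), the following Python sketch compares the finite-$n$ quantity on the left of $\text{(*)}$ with $\exp\left(\max_{p} pH(p)\right)$. The function names are ad hoc, and the convergence in $n$ is slow, so only the trend is visible.

```python
import math

def finite_n_value(n):
    """(sum_{k=0}^n C(n,k)^k)^(1/n^2), computed via log-sum-exp to avoid overflow."""
    logs = [k * math.log(math.comb(n, k)) for k in range(n + 1)]
    m = max(logs)
    log_sum = m + math.log(sum(math.exp(t - m) for t in logs))
    return math.exp(log_sum / n**2)

def predicted_limit(grid=200_000):
    """exp(max_p p*H(p)), with the maximum approximated on a uniform grid."""
    def pH(p):
        return p * (-p * math.log(p) - (1 - p) * math.log(1 - p))
    return math.exp(max(pH(i / grid) for i in range(1, grid)))

for n in (50, 200, 800):
    print(n, finite_n_value(n))
print("limit:", predicted_limit())  # exp(0.427636...) ≈ 1.5336
```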

Toward this goal, we first establish the following lemma:

Lemma 1. Assume that $(a_{n,k})_{k \in [0:n];\, n \in \mathbb{N}_0}$ is a double sequence of positive numbers satisfying the following condition:

Condition: There exists a continuous function $f : [0, 1] \to [0, \infty)$ such that $a_{n,k_n} \to f(p)$ for every $p \in [0, 1]$ and every sequence $(k_n)_{n\geq 0}$ with $k_n \in [0:n]$ and $k_n/n \to p$.

If $(N_n)_{n\geq 0}$ is any sequence of positive integers such that $\log n \ll N_n$ as $n \to \infty$, then

\[\lim_{n\to\infty} \frac{1}{N_n} \log\left[ \sum_{k=0}^{n} a_{n,k}^{N_n} \right] = \max_{p \in [0, 1]} \log f(p).\]
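
To see the lemma in action on a toy instance (my own illustration, not from the original argument): take $a_{n,k} = 1 + k/n$, so that $f(p) = 1 + p$, and $N_n = n$; the lemma then predicts the limit $\max_{p\in[0,1]} \log(1+p) = \log 2$. A minimal numerical sketch:

```python
import math

def lemma1_quantity(n, N):
    """(1/N) * log( sum_{k=0}^n (1 + k/n)^N ), computed via log-sum-exp."""
    logs = [N * math.log(1 + k / n) for k in range(n + 1)]
    m = max(logs)
    return (m + math.log(sum(math.exp(t - m) for t in logs))) / N

for n in (10, 100, 1000, 10000):
    print(n, lemma1_quantity(n, n))  # approaches log 2 ≈ 0.693147
```

Note that $N_n = n$ indeed satisfies $\log n \ll N_n$.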

Before delving into the proof of the lemma, let us check that this indeed implies $\text{(*)}$. To this end, we set

\[a_{n,k} = \binom{n}{k}^{k/n^2}, \qquad f(p) = e^{pH(p)}, \qquad N_n = n^2.\]

Then for any $p \in [0, 1]$ and for any $(k_n)$ such that $k_n / n \to p$, Stirling’s approximation gives

\[\lim_{n\to\infty} a_{n,k_n} = f(p).\]
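
Indeed, writing the exponent in terms of $k_n/n$ makes the appearance of the entropy transparent:

\[\log a_{n,k_n} = \frac{k_n}{n^2}\log\binom{n}{k_n} = \frac{k_n}{n}\cdot\frac{1}{n}\log\binom{n}{k_n} \xrightarrow[n\to\infty]{} p\,H(p),\]

since Stirling's formula yields $\frac{1}{n}\log\binom{n}{k_n} \to H(p)$ whenever $k_n/n \to p$; the endpoint cases $p \in \{0, 1\}$ are consistent with $H(0) = H(1) = 0$.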

Then $\text{(*)}$ follows by applying Lemma 1 and exponentiating. So we turn to the proof of the lemma; the following observation will be useful:

Lemma 2. Let $(f_n)_{n\geq 0}$ and $f$ be real-valued functions on $[0, 1]$ such that

  1. $f$ is continuous on $[0, 1]$, and
  2. $f_n(x_n) \to f(x)$ whenever $x_n \to x \in [0, 1]$.

Then $f_n$ converges uniformly to $f$.
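
(For contrast, the classical example of non-uniform convergence, $f_n(x) = x^n$ on $[0,1]$, fails hypothesis 2: for $x_n = 1 - 1/n \to 1$ we get $f_n(x_n) \to e^{-1}$, while the pointwise limit satisfies $f(1) = 1$. So hypothesis 2 is genuinely stronger than pointwise convergence.)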

Proof of Lemma 2. Assume otherwise. Then there exist $\varepsilon > 0$, an infinite set $J$ of positive integers, and points $x_j \in [0, 1]$ for $j \in J$, such that $\lvert f_j(x_j) - f(x_j) \rvert \geq \varepsilon$ for all $j \in J$. Since $[0, 1]$ is sequentially compact, by passing to an infinite subset of $J$ if necessary, we may assume that $(x_j)_{j\in J}$ converges to some point $x \in [0, 1]$. Then, by the continuity of $f$, $\lvert f(x_j) - f(x) \rvert \to 0$ as $j \to \infty$ along $J$. Hence,

\[\begin{align*} \lvert f_j(x_j) - f(x)\rvert \geq \underbrace{\lvert f_j(x_j) - f(x_j)\rvert}_{\geq \varepsilon} - \underbrace{\lvert f(x_j) - f(x)\rvert}_{\to 0}, \end{align*}\]

and it follows that $f_j(x_j)$ cannot converge to $f(x)$ along $J$, contradicting hypothesis 2. $\square$

We can now proceed to the proof of Lemma 1:

Proof of Lemma 1. For each $n$, define $f_n : [0, 1] \to (0, \infty)$ by

\[f_n(p) = a_{n,\lfloor np\rfloor}.\]

Then for any $p \in [0, 1]$ and any $p_n \to p$, the sequence $k_n = \lfloor np_n\rfloor$ satisfies $k_n/n \to p$, and hence $f_n(p_n) = a_{n,k_n} \to f(p)$. Since $f$ is continuous by assumption, Lemma 2 applies and $f_n$ converges uniformly to $f$ on $[0, 1]$. Let $m_n = \max_{k\in[0:n]} a_{n,k}$ be the row-wise maximum of $(a_{n,k})$, and note that $m_n = \max_{p\in[0,1]} f_n(p)$. Then the uniform convergence implies that

\[\lim_{n\to\infty} m_n = \lim_{n\to\infty} \max_{p\in[0,1]} f_n(p) = \max_{p\in[0,1]} f(p).\]

On the other hand,

\[m_n^{N_n} \leq \sum_{k=0}^{n} a_{n,k}^{N_n} \leq (n+1)\, m_n^{N_n},\]

hence

\[m_n \leq \left( \sum_{k=0}^{n} a_{n,k}^{N_n} \right)^{\frac{1}{N_n}} \leq m_n\, e^{\frac{\log (n+1)}{N_n}}.\]

Since $\log(n+1) \ll N_n$, the squeeze lemma shows that $\left( \sum_{k=0}^{n} a_{n,k}^{N_n} \right)^{1/N_n} \to \max_{p\in[0,1]} f(p)$, and taking logarithms gives the desired conclusion. $\square$