where $t$ grows polynomially faster than $n$ (i.e. $t = n^{1 + \epsilon}$ for some $\epsilon > 0$) and we define $0 \log 0 \equiv 0$. The quadratic expression can also be written as the quadratic form $x^{\top} A x$, where $x \in \mathbb{R}^n$ and $A$ is the matrix with $1$'s on the first $k$ off-diagonals.
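
Spelled out, that matrix and form are (the factor of $2$ is just the symmetry of $A$, whose diagonal is zero):
$$A_{ij} = \begin{cases} 1 & 0 < |i-j| \le k, \\ 0 & \text{otherwise,} \end{cases} \qquad x^{\top} A x = 2 \sum_{\substack{1 \le i < j \le n \\ j - i \le k}} x_i x_j.$$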

Now, '$X_i = t/(k+1)$' for exactly $k+1$ contiguous indices, and zero otherwise, is a solution. Having spent a lot of time with the problem, I am nearly certain that this is the only solution (asymptotically in $n$), but I cannot come up with a proof.
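
For later reference, this candidate is easy to evaluate under the convention above: every pair of indices inside the block is within distance $k$, so
$$x^{\top} A x = 2\binom{k+1}{2}\left(\frac{t}{k+1}\right)^{2} = \frac{k}{k+1}\,t^{2}, \qquad \sum_{i} X_i \log X_i = t \log \frac{t}{k+1}.$$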

I tried reformulating the problem as minimizing the entropy-like term with the quadratic as a constraint. However, any critical point in the interior of $\mathbb{R}^{n}_{+}$ is a maximum, so the minimizers must lie on the boundary; accounting for the $n$ nonnegativity constraints together with the quadratic constraint gives $n+1$ Lagrange multipliers, making the approach seem intractable.
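
(Concretely, writing the quadratic constraint generically as $x^{\top} A x \ge c$ — the exact form and constant do not matter for the multiplier count, and $\lambda, \mu_1, \dots, \mu_n$ are notation I am introducing here — the stationarity conditions read
$$\log x_i + 1 = 2\lambda\,(Ax)_i + \mu_i, \qquad \lambda \ge 0, \qquad \mu_i \ge 0, \qquad \mu_i x_i = 0, \qquad i = 1, \dots, n,$$
with one multiplier $\lambda$ for the quadratic constraint and $n$ multipliers $\mu_i$ for the nonnegativity constraints.)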

I have already shown that this is the only solution if the mass is bounded above by $t$, or if there exists a set of indices, of size uniformly bounded in $n$, whose mass is bounded below by $t$. The trouble arises when one allows unboundedly many (in $n$) terms of size $o(t)$ to add up to something of order $\Theta(t^2)$; I cannot seem to find bounds sensitive enough to produce a contradiction there. Also note that this is not technically a convex problem: the fully concentrated option '$X_i = t$' once and zero otherwise does not satisfy the entropy inequality whenever $k > 0$.
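
(Explicitly, for the concentrated vector the quadratic form vanishes, since $A$ has zero diagonal, and the entropy term equals $t \log t$; if the entropy constraint is the cap $\sum_i X_i \log X_i \le t \log \frac{t}{k+1}$ suggested by the block candidate above — an assumption on my part about its exact form — then the concentrated vector exceeds the cap exactly when $k > 0$.)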

Does the implication follow, or is there some pathological example that satisfies both constraints without being concentrated on $k+1$ elements?