From Ronald de Wolf: Exercise 23.8 is wrong as stated. Counterexample: take X uniformly distributed over {0,1}, set Y=-X, and let alpha=1/2. Then the random variable alpha*X+(1-alpha)*Y is always 0, so it has entropy 0. On the other hand, H(X) and H(Y) are both equal to 1, so the left-hand side of the inequality is 0 while the right-hand side is 1.
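
As a quick numerical sanity check of the counterexample (a minimal sketch; the helper shannon_entropy and the dictionary representation of distributions are my own, not from the exercise):

    import math

    def shannon_entropy(dist):
        # dist: dict mapping outcomes to probabilities; entropy in bits
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    alpha = 0.5
    dist_X = {0: 0.5, 1: 0.5}      # X uniform over {0,1}
    dist_Y = {0: 0.5, -1: 0.5}     # Y = -X, uniform over {0,-1}
    # alpha*X + (1-alpha)*Y = X/2 - X/2 = 0 with probability 1
    dist_mixed_values = {0: 1.0}

    print(shannon_entropy(dist_mixed_values))            # 0.0  (left-hand side)
    print(alpha * shannon_entropy(dist_X)
          + (1 - alpha) * shannon_entropy(dist_Y))       # 1.0  (right-hand side)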

I think it is the probability distributions of X and Y that should be averaged (with weights alpha and 1-alpha), not the values of the random variables themselves. More precisely, assume X and Y are distributed over some set {1,...,n} (the two random variables may be dependent). Define a new random variable Z, also distributed over {1,...,n}, with probability distribution Pr(Z=z) = alpha*Pr(X=z) + (1-alpha)*Pr(Y=z). Now prove the inequality H(Z) >= alpha*H(X) + (1-alpha)*H(Y).
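
The corrected inequality can also be checked numerically on random distributions (again only a sketch; the helpers shannon_entropy and random_dist and the choice of test parameters are my own):

    import math, random

    def shannon_entropy(probs):
        # probs: list of probabilities summing to 1; entropy in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def random_dist(n):
        # a random probability distribution over {1,...,n}
        w = [random.random() for _ in range(n)]
        s = sum(w)
        return [x / s for x in w]

    random.seed(0)
    for _ in range(1000):
        n = random.randint(2, 8)
        p = random_dist(n)                 # distribution of X
        q = random_dist(n)                 # distribution of Y
        alpha = random.random()
        # distribution of Z: pointwise mixture of the two distributions
        z = [alpha * pi + (1 - alpha) * qi for pi, qi in zip(p, q)]
        lhs = shannon_entropy(z)
        rhs = alpha * shannon_entropy(p) + (1 - alpha) * shannon_entropy(q)
        assert lhs >= rhs - 1e-12, (lhs, rhs)
    print("H(Z) >= alpha*H(X) + (1-alpha)*H(Y) held on all random trials")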