Let A\subseteq [n]^r be a set of vectors a=(a_1,\ldots,a_r) with all a_i\in[n]=\{1,\ldots,n\}. The projection of a vector a=(a_1,\ldots,a_r) onto a set of coordinates S=\{i_1<\ldots<i_k\} is the vector \proj{a}{S}= (a_{i_1},\ldots,a_{i_k}). The projection of a set of vectors A is the set \proj{A}{S}=\left\{\proj{a}{S}\,:\, a\in A\right\}. Note that |\proj{A}{S}|\leq |A|\leq n^{r-|S|}\cdot |\proj{A}{S}| holds for all S\subseteq[r]. For a vector x\in \proj{A}{S}, the extensions of x in A are all vectors a\in A with \proj{a}{S}=x. Note that x has at least one and at most n^{r-|S|} extensions in A. The average number of extensions of a vector of \proj{A}{S} in A is \aveDeg{A}{S}=\frac{|A|}{|\proj{A}{S}|}\,. The minimum number, \minDeg{A}{S}, of extensions of vectors in \proj{A}{S} is the minimum, over all vectors x\in\proj{A}{S}, of the number of extensions of x in A. Note that 1\leq \minDeg{A}{S}\leq \aveDeg{A}{S}\leq n^{r-|S|} holds for every S\subseteq[r]. For a family \f\subseteq 2^{[r]} of sets of positions, let \minDeg{A}{\f}=\min\{\minDeg{A}{S}\colon S\in\f\}.
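For a small illustration, let r=2, n=3 and A=\{(1,1),(1,2),(2,1)\}\subseteq[3]^2. For S=\{1\} we have \proj{A}{S}=\{(1),(2)\}: the vector (1) has the two extensions (1,1) and (1,2) in A, while (2) has only the extension (2,1). Hence \aveDeg{A}{S}=3/2 and \minDeg{A}{S}=1.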
The following lemma is a slight generalization of a more specific fact (Corollary 2 below) proved by Ran Raz and Pierre McKenzie [1]; the simpler argument given below is due to Mike Saks.
Thickness Lemma: Let A\subseteq[n]^r be a set of vectors, 0<\alpha <1 a real number, and \f\subseteq 2^{[r]} a family of sets of positions, each of size k. If \aveDeg{A}{S}\geq \alpha\cdot n^{r-k} holds for all S\in\f, then for every \beta\geq 0 there is a subset B\subseteq A of |B|\geq (1-\beta)|A| vectors such that \minDeg{B}{\f}\geq \gamma\cdot n^{r-k} for \gamma=\alpha\beta/|\f|.
Proof: The lemma is trivial for \beta=0; so, assume that \beta>0, and set \Delta:=\gamma\cdot n^{r-k}=\frac{\alpha\beta n^{r-k}}{|\f|}. Define a sequence A=A^0\supset A^1\supset \ldots of subsets of A by the following procedure. Having constructed A^j, stop if either (1) |A^j|<(1-\beta)|A|, or (2) \minDeg{A^j}{\f}\geq \Delta. Otherwise there is a set S\in\f and a vector x\in\proj{A^j}{S} with fewer than \Delta extensions in A^j; remove all these extensions from A^j, and let A^{j+1} be the resulting set.
Note that when going from A^j to A^{j+1}, at least one vector x is removed from the projection of A^j onto the set S. So, the total number l of steps satisfies l\leq \sum_{S\in\f} |\proj{A}{S}| =\sum_{S\in\f}\frac{|A|}{\aveDeg{A}{S}}\,, and since \aveDeg{A}{S}\geq \alpha n^{r-k} holds for every S\in\f, we can conclude that the procedure must stop after l\leq \frac{|\f|\cdot|A|}{\alpha n^{r-k}} steps. On the other hand, A^{j+1} is obtained from A^j by removing at most \Delta vectors. So, the total number of vectors removed from A in all l steps to get A^l is at most l\cdot \Delta\leq \frac{|\f|\cdot |A|}{\alpha n^{r-k}}\cdot\Delta =\frac{|\f|\cdot |A|}{\alpha n^{r-k}}\cdot \frac{\alpha\beta n^{r-k}}{|\f|}=\beta|A|\,. Thus, when the procedure stops after l steps, |A^l|\geq (1-\beta)|A| still holds, so the first stopping condition (1) is not fulfilled. Since the procedure did stop, the second condition must be fulfilled, that is, \minDeg{A^l}{\f}\geq \Delta=\gamma\cdot n^{r-k}. We can thus take B:=A^l. \Box
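The deletion procedure is easy to run on small instances. Below is a minimal Python sketch of it; the function name thin and the representation of A as a set of tuples with 0-based positions are my own choices, not part of the lemma, and the early stop (1) is omitted since under the lemma's assumptions it never fires.

```python
def thin(A, F, delta):
    """Greedy deletion procedure from the proof of the Thickness Lemma.

    A: iterable of r-tuples; F: family of position sets (tuples of 0-based
    indices); delta: the threshold Delta = alpha*beta*n**(r-k) / len(F).
    Repeatedly removes, for some S in F, all extensions of a projection
    vector having fewer than delta extensions; stops when none is left.
    """
    B = set(A)
    while True:
        removed = False
        for S in F:
            # Group the vectors of B by their projection onto S.
            fibers = {}
            for a in B:
                fibers.setdefault(tuple(a[i] for i in S), []).append(a)
            # Is there a projection vector with fewer than delta extensions?
            for ext in fibers.values():
                if len(ext) < delta:
                    B.difference_update(ext)   # remove all its extensions
                    removed = True
                    break
            if removed:
                break
        if not removed:   # now minDeg(B, S) >= delta for every S in F
            return B
```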
Corollary 1: Let G=(V_1\cup V_2,E) be a bipartite n\times n graph such that the average degree of the non-isolated vertices in V_1, as well as in V_2, is at least \alpha n. Then for every \beta\geq 0, there exists a subset E'\subseteq E of |E'|\geq (1-\beta)|E| edges such that the degree of every non-isolated vertex of G'=(V_1\cup V_2,E') is at least \alpha\beta n/2.
Proof: Identify the edges of G with vectors in [n]^2, and apply the Thickness Lemma with r=2 and \f=\{\{1\},\{2\}\}; here k=1 and |\f|=2, so \gamma\cdot n^{r-k}=\alpha\beta n/2. \Box
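For example, if the average degree of the non-isolated vertices on both sides of G is at least n/2 (that is, \alpha=1/2), then taking \beta=1/2 we may discard at most half of the edges so that every non-isolated vertex of the remaining graph has degree at least \alpha\beta n/2=n/8.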
Let A\subseteq[n]^r. Given a position 1\leq i\leq r, an i-th neighbor of a vector a\in A is any vector b\in A such that b_j=a_j for all j\neq i.
Corollary 2: Let A\subseteq[n]^r and 0<\alpha <1. If for every position i\in [r] the average number of i-th neighbors of a vector of A is at least \alpha n, then for every \beta\geq 0 there is a subset B\subseteq A of |B|\geq (1-\beta)|A| vectors such that, for every position i\in [r], every vector of B has at least \alpha\beta n/r i-th neighbors in B.
Proof: The i-th neighbors of a vector a\in A are exactly the extensions in A of the projection \proj{a}{S} for S=[r]-\{i\}. So, apply the Thickness Lemma with \f=\{[r]-\{i\}\colon i=1,\ldots,r\}; here each set in \f has size k=r-1 and |\f|=r, so \gamma\cdot n^{r-k}=\alpha\beta n/r. \Box
Rectangle Lemma: Every subset X\subseteq V_1\times \cdots\times V_r of |X|\geq \alpha n^r vectors contains at least \fact{r}{\alpha}\cdot \binom{n}{k}^r k-rectangles.
The proof generalizes the double-counting argument of Kővári, Sós and Turán for bipartite graphs (see Sect. 2.4 in the book): the idea is to use (in a clever way) the convexity of the function g(x)=\binom{x}{k} (with \binom{x}{k}=0 for x< k-1) together with the Jensen inequality for convex functions g(x): \begin{equation}\label{eq:1} \sum_{i=1}^m g(x_i)\geq m\cdot g\Big(\frac{1}{m}\sum_{i=1}^m x_i\Big)\,. \end{equation}
Proof: We proceed by induction on the dimension r. The basis case r=1 is trivial. Assume the lemma holds for r, and take a set X\subseteq V_1\times \cdots\times V_{r+1} of |X|\geq \alpha n^{r+1} vectors. For v\in V_{r+1}, let \dens{v}:=|X_v|/n^r be the density of the set of vectors X_v=\{(v_1,\ldots,v_{r})\colon (v_1,\ldots,v_{r},v)\in X\}\,. Then we have \sum_{v\in V_{r+1}}\dens{v}\cdot n^r=|X|\geq \alpha n^{r+1}\,, and hence, the average density is \begin{equation}\label{eq:2} \frac{1}{n}\sum_{v\in V_{r+1}}\dens{v}\geq \alpha\,. \end{equation} Let \rect be the set of all |\rect|=\binom{n}{k}^r k-rectangles in V_1\times\cdots\times V_r, and consider the bipartite graph G\subseteq \rect\times V_{r+1} such that (R,v)\in G if and only if R\times \{v\}\subseteq X. Since |X_v|=\dens{v}\cdot n^r holds for every point v\in V_{r+1}, the induction hypothesis implies that each set X_v must contain at least \begin{equation}\label{eq:3} d_v:=\fact{r}{\dens{v}}\cdot\binom{n}{k}^r=\fact{r}{\dens{v}}\cdot|\rect| \end{equation} k-rectangles, and hence, at least so many edges of G must be incident with the vertex v. Since the function h(x)=\fact{r}{x} is convex, the inequalities \eqref{eq:1}-\eqref{eq:3} (with g(x):= \fact{r}{x} and x_v:=\dens{v}) yield the following lower bound on the number |G| of edges in G: \begin{equation}\label{eq:4} |G|\geq \sum_{v\in V_{r+1}} d_v\overset{\eqref{eq:3}}{=} |\rect|\sum_{v\in V_{r+1}}\fact{r}{\dens{v}}\overset{\eqref{eq:1}}{\geq} | \rect|\cdot n\cdot \ffact{r}\bigg(\frac{1}{n}\sum_{v\in V_{r+1}}\dens{v}\bigg) \overset{\eqref{eq:2}}{\geq} |\rect|\cdot n\cdot \fact{r}{\alpha}\,. \end{equation}
For a rectangle R\in\rect, let \dens{R} denote its degree in the graph G; hence, \sum_{R\in\rect}\dens{R}=|G|. By the definition of the graph G, for every rectangle R\in\rect and for every subset S\subseteq V_{r+1} consisting of k of its neighbors in G, the Cartesian product R\times S is a k-rectangle lying in our set X. So, we can again use the Jensen inequality \eqref{eq:1} with g(x):=\binom{x}{k} and m:=|\rect|=\binom{n}{k}^{r} to conclude that the number of k-rectangles in X is at least \begin{align*} \sum_{R\in\rect}\binom{\dens{R}}{k}\overset{\eqref{eq:1}}{\geq} |\rect|\binom{\tfrac{1}{|\rect|}\sum_{R\in\rect}\dens{R}}{k} \overset{\eqref{eq:4}}{\geq} |\rect| \binom{n\cdot \fact{r}{\alpha}}{k} =\binom{n}{k}^r\binom{n\cdot \fact{r}{\alpha}}{k}\,, \end{align*} which is exactly \fact{r+1}{\alpha}\cdot \binom{n}{k}^{r+1}, by the definition of the functions \ffact{i}. \Box
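To experiment with the Rectangle Lemma on tiny instances, here is a small brute-force counter; it is purely illustrative, and the function name and interface are my own. It enumerates all products S_1\times\cdots\times S_r with S_i\subseteq V_i and |S_i|=k, and counts those contained in X.

```python
from itertools import combinations, product

def count_k_rectangles(X, Vs, k):
    """Count the k-rectangles S_1 x ... x S_r (S_i a k-subset of Vs[i])
    that are entirely contained in the set X of r-tuples."""
    X = set(X)
    count = 0
    # Pick one k-subset per coordinate and test whether their product lies in X.
    for subsets in product(*(combinations(V, k) for V in Vs)):
        if all(point in X for point in product(*subsets)):
            count += 1
    return count
```

For instance, with Vs=[range(4), range(4)], k=2 and X the full grid \{0,1,2,3\}^2, the count is \binom{4}{2}^2=36, since every product of two 2-element subsets lies in the grid.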
Corollary 3: If \log n\geq k^r+\log k, then any subset A\subseteq [n]^r of |A|\geq 4\left(\frac{n}{2^k}\right)^r vectors contains at least one (r,k)-rectangle.
Proof: We have |A|\geq \alpha n^r for \alpha:=2^{2-kr}. We also have \fact{r}{\alpha}\ge (\alpha/\euler)^{k^r}\geq 2^{-krk^r}. By the Rectangle Lemma, the set A contains at least \fact{r}{\alpha}\cdot \binom{n}{k}^r\geq \fact{r}{\alpha}\cdot (n/k)^{kr} (r,k)-rectangles. So, it is enough to ensure that (n/k)^{kr}\geq (\euler/\alpha)^{k^r}. By taking logarithms, this turns into the inequality kr\log n\geq k^r\log(\euler/\alpha)+kr\log k\,. After dividing by kr, we obtain that \log n need only be at least \gamma k^r+\log k for \gamma= \log(\euler/\alpha)/kr. By our choice of \alpha, \gamma=(kr-2+\log\euler)/kr is at most 1, since \log\euler<2; so the assumed inequality \log n\geq k^r+\log k suffices. \Box
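For instance, for r=k=2 the condition reads \log n\geq 2^2+\log 2=5, i.e., n\geq 32. Then every set A\subseteq[n]^2 of |A|\geq 4(n/4)^2=n^2/4 vectors contains a (2,2)-rectangle, that is, two values a_1\neq b_1 of the first coordinate and two values a_2\neq b_2 of the second coordinate such that all four combinations belong to A.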