Proof for a combinatorial identity

Mathematics Asked on December 15, 2021

I have the following formula, which I believe is true since it works in Mathematica for all values of $N$ I have tried, but I don't know how to prove it:

$$\sum_{q=0}^{N} {N \choose q}^2 x^{q} = \frac{1}{{2N \choose N}} \sum_{k,l=0}^N \; \sum_{s=0}^{\min(m, \, N-M)} \; \sum_{t=0}^{\min(m, \, N-M)} \\ {N \choose M} {M \choose m-s} {N-M \choose s} {N \choose N-m} {N-m \choose N-M-t} {m \choose t} x^{M-m+s+t} $$

where $m=\min(k,l)$ and $M=\max(k,l)$, and $x$ can be any complex number. I know one can write the LHS as a Legendre polynomial, $ \sum_{q=0}^N { N \choose q }^2 x^q = (1-x)^N P_N \left( \frac{1+x}{1-x} \right)$, and as a hypergeometric function, $ \sum_{q=0}^N { N \choose q }^2 x^q = \, _2F_1 (-N, -N; 1; x)$, but apart from that I don't know how to simplify the RHS. I have tried the Egorychev method to transform sums involving binomial coefficients into residue integrals, but didn't get much from there. Any ideas?
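For a concrete sanity check, here is a minimal numerical verification of this first form for small $N$, using exact rational arithmetic (a Python sketch standing in for the Mathematica checks mentioned above; the helper names `lhs` and `rhs` are illustrative):

```python
# Minimal numerical check of the first identity; indices follow the formula above.
from math import comb
from fractions import Fraction

def lhs(N, x):
    return sum(comb(N, q) ** 2 * x ** q for q in range(N + 1))

def rhs(N, x):
    total = Fraction(0)
    for k in range(N + 1):
        for l in range(N + 1):
            m, M = min(k, l), max(k, l)
            for s in range(min(m, N - M) + 1):
                for t in range(min(m, N - M) + 1):
                    total += (comb(N, M) * comb(M, m - s) * comb(N - M, s)
                              * comb(N, N - m) * comb(N - m, N - M - t)
                              * comb(m, t) * x ** (M - m + s + t))
    return total / comb(2 * N, N)

for N in range(1, 7):
    for x in (Fraction(1, 3), Fraction(-2), Fraction(5, 7)):
        assert lhs(N, x) == rhs(N, x)
print("first identity holds for the sampled N and x")
```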

Edit: I have found yet another way of writing the same quantity:

$$\sum_{q=0}^{N} {N \choose q}^2 x^{q} = \\ \frac{1}{ {2N \choose N} } \sum_{p,q=0}^N \, \sum_{r=\max(0, \, q+p-N)}^{\min (q, \, p)} \, \sum_{s=\max (0, \, q-p)}^{\min (q, \, N-p)} {N \choose p} {N \choose N-p} {p \choose r} {N-p \choose s} {N-p \choose q-r} {p \choose q-s} x^q $$

This one looks simpler than the previous one, since for instance here the exponent of $x$ is decoupled from the inner summation indices. Again I have tried the Egorychev method on the RHS, which lets you trade the constrained inner sums for complex contour integrals, and then you can choose the summation limits to be whatever is most convenient so that you can actually compute the sums in $r$ and $s$. But in exchange you now have four complex contour integrals (one for every summation limit you want to "kill"), so I don't know if this is simpler. I suspect there must be a more general identity relating all three expressions. Any suggestions?
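The same kind of sanity check works for this second form (again a Python sketch with exact rational arithmetic; `rhs2` is an illustrative name):

```python
# Minimal numerical check of the second identity; indices follow the formula above.
from math import comb
from fractions import Fraction

def lhs(N, x):
    return sum(comb(N, q) ** 2 * x ** q for q in range(N + 1))

def rhs2(N, x):
    total = Fraction(0)
    for p in range(N + 1):
        for q in range(N + 1):
            for r in range(max(0, q + p - N), min(q, p) + 1):
                for s in range(max(0, q - p), min(q, N - p) + 1):
                    total += (comb(N, p) * comb(N, N - p) * comb(p, r)
                              * comb(N - p, s) * comb(N - p, q - r)
                              * comb(p, q - s) * x ** q)
    return total / comb(2 * N, N)

for N in range(1, 7):
    for x in (Fraction(2, 5), Fraction(-3)):
        assert lhs(N, x) == rhs2(N, x)
print("second identity holds for the sampled N and x")
```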

2 Answers

Consider the coefficient of $x^q$ (and being slightly lazy with the limits of the sums) ... It suffices to show
\begin{eqnarray*}
\sum_{p,r,s} \binom{N}{p} \binom{N}{N-p} \binom{p}{r} \binom{N-p}{s} \binom{N-p}{q-r} \binom{p}{q-s} = \binom{2N}{N} \binom{N}{q}^2.
\end{eqnarray*}
We shall use $2$ coefficient extractors
\begin{eqnarray*}
\binom{N-p}{s} = [x^0] \: \frac{(1+x)^{N-p}}{x^s} \\
\binom{N-p}{q-r} = [y^{0}] \: \frac{(1+y)^{N-p}}{y^{q-r}}.
\end{eqnarray*}
So
\begin{eqnarray*}
& & \sum_{p,r,s} \binom{N}{p} \binom{N}{N-p} \binom{p}{r} \binom{N-p}{s} \binom{N-p}{q-r} \binom{p}{q-s} \\
&=& \sum_{p} \binom{N}{p} \binom{N}{N-p} [x^0][y^{0}] \: \sum_{r,s} \binom{p}{r} \binom{p}{q-s} \frac{(1+x)^{N-p}}{x^s} \frac{(1+y)^{N-p}}{y^{q-r}} \\
&=& \sum_{p} \binom{N}{p} \binom{N}{N-p} [x^0][y^{0}] \: \frac{(1+x)^{N-p} (1+y)^{N-p}}{x^q y^q} \sum_{r,s} \binom{p}{r} y^r \binom{p}{q-s} x^{q-s} \\
&=& \sum_{p} \binom{N}{p} \binom{N}{N-p} [x^0][y^{0}] \: \frac{(1+x)^{N} (1+y)^{N}}{x^q y^q} \\
&=& \binom{N}{q}^2 \sum_{p} \binom{N}{p} \binom{N}{N-p}.
\end{eqnarray*}
Now recall the well known plum
\begin{eqnarray*}
\sum_{p} \binom{N}{p} \binom{N}{N-p} = \binom{2N}{N}
\end{eqnarray*}
and we are done. $\ddot \smile$
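A quick numerical spot-check of this reduced identity for small $N$ and all $q$ (a Python sketch; the sum over $r$ and $s$ runs over all values for which every binomial is nonzero, matching the deliberately lazy limits above; the name `reduced_sum` is illustrative):

```python
# Spot-check:  sum_{p,r,s} C(N,p) C(N,N-p) C(p,r) C(N-p,s) C(N-p,q-r) C(p,q-s)
#            = C(2N,N) * C(N,q)^2
from math import comb

def reduced_sum(N, q):
    total = 0
    for p in range(N + 1):
        for r in range(min(p, q) + 1):
            for s in range(min(N - p, q) + 1):
                total += (comb(N, p) * comb(N, N - p) * comb(p, r)
                          * comb(N - p, s) * comb(N - p, q - r) * comb(p, q - s))
    return total

for N in range(7):
    for q in range(N + 1):
        assert reduced_sum(N, q) == comb(2 * N, N) * comb(N, q) ** 2
print("coefficient identity verified for N <= 6")
```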

Answered by Donald Splutterwit on December 15, 2021

OK, I think I have a partial answer that might help prove the second identity from the definition of the hypergeometric function. However, I still don't know how this would apply to the first identity. Moreover, I would still like to understand this in a more general way, so I will leave the bounty open. I am only writing this answer to perhaps help someone else give a full answer.

Basically, the trick is to use the definition of the hypergeometric function or, more generally, of the generalized hypergeometric function. A sum

$$ \phi = \sum_{n \geq 0} \beta_n z^n$$

is a generalized hypergeometric function if the ratio $\beta_{n+1}/\beta_n$ is a rational function of $n$. In particular, the above sum is defined to be the generalized hypergeometric function $_pF_q (a_1, \ldots, a_p ; \, b_1, \ldots, b_q ; \, z)$ if the coefficients satisfy (up to an overall factor that can be reabsorbed into $z$)

$$\frac{\beta_{n+1}}{\beta_n} = \frac{(a_1+n) \cdots (a_p+n)}{(b_1+n) \cdots (b_q + n)(1+n)}$$

where the $a$'s and $b$'s are just the roots (up to sign) of the polynomials in the numerator and the denominator, respectively. One can check straightforwardly that the sum

$$\sum_{q=0}^N { N \choose q }^2 x^q$$

gives $\frac{\beta_{q+1}}{\beta_q} = \frac{(-N+q)^2}{(1+q)^2}$. Now for the second sum

$$ \frac{1}{ {2N \choose N} } \sum_{p,q=0}^N \, \sum_{r=\max(0, \, q+p-N)}^{\min (q, \, p)} \, \sum_{s=\max (0, \, q-p)}^{\min (q, \, N-p)} {N \choose p} {N \choose N-p} {p \choose r} {N-p \choose s} {N-p \choose q-r} {p \choose q-s} x^q $$

I don't know exactly how one can compute it, but Mathematica does give me $\frac{\beta_{q+1}}{\beta_q} = \frac{(-N+q)^2}{(1+q)^2}$. So they are both equal to $_2F_1(-N, -N; 1; x)$.
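One way to reproduce that computation without symbolic tools is to extract $\beta_q$ from the triple sum with exact arithmetic and test the term ratio directly (a Python sketch; the function name `beta` is illustrative):

```python
# Extract beta_q, the coefficient of x^q in the RHS (including the 1/C(2N,N)
# prefactor), and check that consecutive terms have the 2F1(-N,-N;1;x) ratio.
from math import comb
from fractions import Fraction

def beta(N, q):
    total = 0
    for p in range(N + 1):
        for r in range(max(0, q + p - N), min(q, p) + 1):
            for s in range(max(0, q - p), min(q, N - p) + 1):
                total += (comb(N, p) * comb(N, N - p) * comb(p, r)
                          * comb(N - p, s) * comb(N - p, q - r) * comb(p, q - s))
    return Fraction(total, comb(2 * N, N))

for N in range(1, 8):
    for q in range(N):
        assert beta(N, q + 1) / beta(N, q) == Fraction((q - N) ** 2, (q + 1) ** 2)
print("RHS coefficients have the 2F1(-N, -N; 1; x) term ratio")
```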

I don't know how one can check this for the first sum since there the exponent of $x$ is not just $q$. Suggestions are welcome.

Answered by MBolin on December 15, 2021
