# Critical values of the evaluation map for rational curves in a toric surface

Mathematics Asked by Blm on October 3, 2020

I'm looking at rational curves in a toric surface. Such curves admit a parametrization of the form
$$t \dashrightarrow \chi \prod_{j=1}^m (t-\alpha_j)^{n_j} \in (\mathbb{C}^*)^2,$$
for some scalars $\alpha_j$, a morphism $\chi:\mathbb{Z}^2\rightarrow \mathbb{C}^*$, and vectors $n_j\in\mathbb{Z}^2$ with $\sum n_j=0$. (This is just the standard parametrization of a rational curve in $(\mathbb{C}^*)^2$ without choosing a basis. For a curve of degree $d$ in the projective plane, the vectors would be $(0,1)$, $(1,0)$ and $(-1,-1)$, each repeated $d$ times.) The scalars $\alpha_j$ correspond to the points of $\mathbb{C}P^1$ which are sent to some toric divisor, and one can look at the coordinate of each such point on the corresponding toric divisor. The coordinate of $\alpha_i$ is
$$\mu_i = \chi(n_i^T)\prod_{j\neq i}(\alpha_i - \alpha_j)^{\det (n_i,n_j)},$$
where $n_i^T$ denotes the linear form $\det(n_i,-)$. In the case of the projective plane, one indeed recovers the coordinates of the intersection points with the coordinate axes. The map $f:(\chi,\alpha_j)\mapsto(\mu_j)$ is called the evaluation map. I would like to show that this map has no critical value.
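As a numerical sanity check of the formula for $\mu_i$, one can verify that the product of all the $\mu_i$ equals $\pm 1$ (the $\chi$-contributions cancel because $\sum_i n_i^T = 0$, and the factors $(\alpha_i-\alpha_j)$ and $(\alpha_j-\alpha_i)$ pair up to signs). Here is a minimal sketch for the degree-$d$ plane case, with a randomly chosen $\chi$ and random distinct $\alpha_j$; the identification of $n_i^T$ with the lattice vector $(\det(n_i,e_1),\det(n_i,e_2))$ and all variable names are my own conventions, not from the question:

```python
import numpy as np

rng = np.random.default_rng(1)

# Degree-d plane curve: vectors (1,0), (0,1), (-1,-1), each repeated d times
d = 2
ns = [(1, 0)] * d + [(0, 1)] * d + [(-1, -1)] * d
m = len(ns)

# Random character chi (determined by its values on e1, e2) and random alpha_j
chi = rng.standard_normal(2) + 1j * rng.standard_normal(2)
alpha = rng.standard_normal(m) + 1j * rng.standard_normal(m)

det = lambda u, v: u[0] * v[1] - u[1] * v[0]

mu = []
for i, ni in enumerate(ns):
    # n_i^T = det(n_i, -), identified with the vector (det(n_i,e1), det(n_i,e2))
    nT = (det(ni, (1, 0)), det(ni, (0, 1)))
    val = chi[0] ** nT[0] * chi[1] ** nT[1]
    for j, nj in enumerate(ns):
        if j != i:
            val *= (alpha[i] - alpha[j]) ** det(ni, nj)
    mu.append(val)

prod = np.prod(mu)
print(prod)  # should be +1 or -1 up to rounding
```

This only tests the relation $\prod_i \mu_i = \pm 1$ at random points; it is not a proof, but it confirms the constraint that forces the image of $df$ into a hyperplane below.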

In logarithmic coordinates for the $\mu_i$, the matrix of the differential $df$ has the following form:
$$\begin{pmatrix} n_1^T & \sum_{j\neq 1}\frac{\det(n_1,n_j)}{\alpha_1-\alpha_j} & & & & \\ \vdots & & \ddots & & -\frac{\det(n_i,n_j)}{\alpha_i-\alpha_j} & \\ \vdots & & & \sum_{j\neq i}\frac{\det(n_i,n_j)}{\alpha_i-\alpha_j} & & \\ \vdots & & -\frac{\det(n_i,n_j)}{\alpha_i-\alpha_j} & & \ddots & \\ n_m^T & & & & & \sum_{j\neq m}\frac{\det(n_m,n_j)}{\alpha_m-\alpha_j} \end{pmatrix}$$
The first two columns correspond to the derivatives with respect to the $\chi$ coordinates, and the last $m$ columns to the coordinates $\alpha_j$. I need to show (although I do not know if it is true) that this matrix surjects onto the hyperplane $\sum x_i=0$. The image does lie in this hyperplane: since $\sum n_i=0$, and since the symmetric square block on the right has the vector $(1,\dots,1)$ in its kernel (hence, by symmetry, also in its left kernel), each column of the matrix sums to zero. This reflects the fact that the product of the $\mu_j$ equals $\pm 1$. Moreover, I already know three vectors in the kernel, corresponding to the reparametrization of the rational curve by $PGL_2(\mathbb{C})$: the first two are $(0,0,1,\dots,1)$, corresponding to translations, and $(0,0,\alpha_1,\dots,\alpha_m)$, corresponding to dilations; the last one corresponds to inversion. This means that the kernel is at least $3$-dimensional. It should not be bigger.
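The expected kernel dimension can at least be tested experimentally: build the matrix above for the plane case at random (hence generic) values of the $\alpha_j$ and compute its rank. A small sketch, assuming the column layout described above (two $\chi$-columns, then the $m$ columns for the $\alpha_j$); the variable names are mine:

```python
import numpy as np

rng = np.random.default_rng(0)

# Degree-d plane curve: vectors (1,0), (0,1), (-1,-1), each repeated d times
d = 2
ns = [(1, 0)] * d + [(0, 1)] * d + [(-1, -1)] * d
m = len(ns)

# Random (hence generic) distinct alpha_j
alpha = rng.standard_normal(m) + 1j * rng.standard_normal(m)

det = lambda u, v: u[0] * v[1] - u[1] * v[0]

# df in logarithmic coordinates: 2 chi-columns, then m alpha-columns
M = np.zeros((m, 2 + m), dtype=complex)
for i, ni in enumerate(ns):
    # chi-part of row i: the linear form n_i^T = det(n_i, -)
    M[i, 0] = det(ni, (1, 0))
    M[i, 1] = det(ni, (0, 1))
    for j, nj in enumerate(ns):
        if j == i:
            M[i, 2 + i] = sum(det(ni, ns[k]) / (alpha[i] - alpha[k])
                              for k in range(m) if k != i)
        else:
            M[i, 2 + j] = -det(ni, nj) / (alpha[i] - alpha[j])

rank = np.linalg.matrix_rank(M)
print("rank:", rank, "expected:", m - 1)   # surjective onto {sum x_i = 0}
print("kernel dimension:", (2 + m) - rank) # expected: 3
```

Of course this only probes generic $\alpha_j$, which is exactly the case already understood; the open point is whether the rank can drop for special (still distinct) configurations.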

My question is: provided that the $\alpha_j$ are distinct, is this matrix always surjective onto the hyperplane $\sum x_i=0$? Equivalently, is the kernel always $3$-dimensional?

• I already know that for generic values of the $\alpha_j$ the matrix is surjective, but I would like to have it for all values.
• I could restrict to the real case (the scalars $\alpha_j$ are real or come in pairs of complex conjugate points), together with some weaker assumptions for the case that interests me, but the question about critical values remains mysterious to me in the complex case.
• The matrix seems to have a nice form, but I do not succeed in exploiting it.
