Suppose I have a bunch of polynomial equations with coefficients in a number field, and suppose further that I'm guaranteed a priori that they have a solution in that number field. Can I leverage that knowledge into a technique for solving the equations more easily? The cases we care about are massively overdetermined systems of linear and quadratic equations.

I am curious about the origin of the equations you are interested in. Could you say a few words about it?
–
jvp Oct 25 '09 at 2:12


Over R, systems of quadratic equations (especially massively overdetermined ones) are no easier to solve than general systems of equations, by Kempe's Universality Theorem for linkages. (Tim Abbott and I have an unpublished paper on this.) I'm not sure whether the proof works rationally, though.
–
Reid Barton Oct 25 '09 at 4:31

The origin of the equations is trying to construct fusion categories and subfactors by hand. The equations come from relations that we know to be satisfied.
–
Noah Snyder Oct 25 '09 at 17:10

6 Answers

By Scott's remark, it seems that the linear terms do not suffice for determining the set S. (What is the expected cardinality of S, btw?) I know nothing about programming or complexity, but my suggestion is to try the 2-uple (Veronese) embedding. Sure, you replace P^20 by P^200 or so, but now you end up with linear equations in the variables X_ij = x_i x_j. Of course, you still have the quadratic equations in the X_ij which cut out the image of the 2-uple embedding, but at least now you have 10^5 linear equations and then < 200 quadratics, even if you have a few more variables... If you are lucky, the linear space cut out in P^200 might actually be quite small.
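A minimal sketch of this linearization in sympy, on a toy system of my own invention (two variables, three quadrics), not the 10^5-equation case from the question:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
# Toy overdetermined quadratic system with solution (x1, x2) = (1, 2).
eqs = [x1**2 + x2**2 - 5, x1*x2 - 2, x1**2 - x1*x2 + 1]

# One new unknown X_ij per degree-2 monomial x_i*x_j (the 2-uple coordinates).
X11, X12, X22 = sp.symbols('X11 X12 X22')
subs = {x1**2: X11, x1*x2: X12, x2**2: X22}

# After substitution, each quadratic becomes linear in the X_ij.
linear_eqs = [sp.expand(e).subs(subs) for e in eqs]
sol = sp.solve(linear_eqs, [X11, X12, X22], dict=True)[0]

# The Veronese relations (here X11*X22 = X12**2) cut out the actual image,
# so the linear solution must still be checked against them.
assert sol[X11] * sol[X22] == sol[X12]**2
print(sol)  # {X11: 1, X12: 2, X22: 4}
```

Recovering the x_i from the X_ij then amounts to a square root and some divisions, since x_i = X_ii**(1/2) up to sign and x_j = X_ij / x_i.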

Practically speaking: I would try to reduce the equations modulo several small primes (or prime powers), solve them by brute force, and then try to lift to solutions in your number field using the Chinese remainder theorem. It might be helpful to combine this with solving them to high precision over the real (or complex) numbers.
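A toy sketch of that strategy over Q (the example system and prime choices are mine; a real instance with non-integer coordinates would additionally need rational reconstruction after the CRT step):

```python
import itertools
from sympy.ntheory.modular import crt

# Toy system with the integer solution (x, y) = (3, 4).
def residual(x, y):
    return (x**2 + y**2 - 25, x * y - 12, x + y - 7)

primes = [5, 7, 11]

# Brute-force the solutions modulo each small prime.
local = []
for p in primes:
    local.append([(x, y) for x in range(p) for y in range(p)
                  if all(r % p == 0 for r in residual(x, y))])

# Glue one local solution per prime with CRT, then verify over the integers.
found = None
for choice in itertools.product(*local):
    xr, _ = crt(primes, [c[0] for c in choice])
    yr, _ = crt(primes, [c[1] for c in choice])
    if all(r == 0 for r in residual(xr, yr)):
        found = (int(xr), int(yr))
        break
print(found)  # (3, 4)
```

The verification step matters: each prime typically admits several local solutions, and only the compatible combinations glue to a global one.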

Somewhat better? Find a local solution with invertible Jacobian, on a subset of the equations maybe, and use a Newton technique to lift it to high p-adic precision, so that (p-adic) LLL will recognize it in the number field. I don't think that $R$ or $C$ will converge as well as the p-adics. What is the expected field degree, and the size of the solutions? The largest I have seen handled this way is 8-10 variables and field degree 50-100. The work should be in finding the local solution.
–
Junkie Apr 30 '10 at 8:14
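A minimal sketch of the Newton (Hensel) lifting step in sympy, on a toy square subsystem of my own choosing; the final p-adic LLL recognition step is omitted:

```python
import sympy as sp

x, y = sp.symbols('x y')
# Toy square subsystem with Jacobian invertible mod 5; true solution (12, 5).
F = sp.Matrix([x**2 + y**2 - 169, x*y - 60])
J = F.jacobian([x, y])

p = 5
sol = sp.Matrix([2, 0])           # (12, 5) reduced mod 5, found by brute force
M = p
for _ in range(4):                # precision doubles: 5^2, 5^4, 5^8, 5^16
    M = M * M
    at = {x: sol[0], y: sol[1]}
    # Newton step s <- s - J(s)^{-1} F(s), computed mod M.
    step = J.subs(at).inv_mod(M) * F.subs(at)
    sol = (sol - step).applyfunc(lambda t: t % M)
print(sol.T)  # Matrix([[12, 5]])
```

Because the Jacobian is invertible mod p, each step doubles the p-adic precision, so a handful of iterations already gives enough digits for lattice recognition.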

I have to say that this sounds suspiciously close to Matiyasevich's proof of Hilbert's 10th problem (Yuri V. Matiyasevich, Hilbert's Tenth Problem, MIT Press, Cambridge, Massachusetts, 1993). The "yoga" of his proof is that you can put a "Gödel coding" on systems of quadratic and linear forms, which shows that solving a system of such equations is hard from a logic/TCS point of view.

No no, these are massively overdetermined. In typical examples, we're looking at >10^5 quadratics in 20 variables. Our experience so far with Groebner bases has been unhappy -- you can't just hand the entire list of equations to a Groebner basis implementation and expect it to fit in the RAM of a modern machine. If this sounds wrong, please tell us!
–
Scott Morrison♦ Oct 25 '09 at 6:46

Ahh, if they're THAT overdetermined, I can't really offer much help to you. I don't really have anything that can handle those numbers.
–
Charles Siegel Oct 25 '09 at 14:58


Generally, Groebner basis algorithms are doubly exponential; never let them NEAR anything of that magnitude.
–
Michael Hoffman Oct 29 '09 at 14:59

One minor comment: if you can ever reduce to solving equations in a single variable, and looking for rational roots, then you can use the rational root theorem. You can reduce to the case where your number field is the rationals easily enough. (For example, if you started out looking for roots in Q(i), write everything as a+bi.) But I don't know that there is any good way to reduce to equations in a single variable.
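For the one-variable case over Q, a small illustration of the rational root theorem (the helper function and example polynomial are mine): every rational root p/q in lowest terms of an integer polynomial has p dividing the constant term and q dividing the leading coefficient.

```python
from fractions import Fraction

def rational_roots(coeffs):
    """Rational roots of c_n x^n + ... + c_0 (integer coeffs, c_0, c_n != 0).
    Candidates are +-p/q with p | c_0 and q | c_n (rational root theorem)."""
    def divisors(n):
        n = abs(n)
        return [d for d in range(1, n + 1) if n % d == 0]

    cn, c0 = coeffs[0], coeffs[-1]
    roots = set()
    for p in divisors(c0):
        for q in divisors(cn):
            for cand in (Fraction(p, q), Fraction(-p, q)):
                value = sum(c * cand**i
                            for i, c in enumerate(reversed(coeffs)))
                if value == 0:
                    roots.add(cand)
    return sorted(roots)

# 2x^3 - 3x^2 - 3x + 2 has roots -1, 1/2, 2.
print(rational_roots([2, -3, -3, 2]))
```

The candidate set is finite, so this is a complete search, though it grows with the number of divisors of the two extreme coefficients.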

I don't think there's any reason to suspect that there's a good way to reduce to equations in a single variable. A system of n quadratics of the form x_{k+1} = x_k^2 + c reduces to a single polynomial, but of degree 2^n, and I see no reason that one should expect to be able to do any better even for this special case.
–
Qiaochu Yuan Nov 15 '09 at 20:32
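Qiaochu's degree count is easy to confirm symbolically; a quick check with n = 5 (the choice of n is mine):

```python
import sympy as sp

x, c = sp.symbols('x c')
# Eliminate x_1, ..., x_n from the chain x_{k+1} = x_k**2 + c
# by repeated substitution, starting from x_1 = x.
f = x
n = 5
for _ in range(n):
    f = f**2 + c
# The single remaining polynomial in x has degree 2**n.
print(sp.degree(f, x))  # 32
```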

Well, it's certainly easier to intersect lines and quadrics. I would simply go on intersecting all the lines, then select some nicer quadrics to intersect until I have a finite set of points, from which I would eliminate the wrong points using more quadrics until one is left: the answer.
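A toy sketch of that workflow in sympy (the example system is mine): solve the linear part first, substitute into one quadric to get finitely many candidate points, then filter with the remaining quadrics.

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
linear   = [x - y, x + y - z - 2]
quadrics = [x**2 + y**2 - 2, x*z]

# Intersect the linear equations: y = x, z = 2*x - 2.
lin_sol = sp.solve(linear, [y, z], dict=True)[0]

# One quadric restricted to that line gives finitely many points.
cands = sp.solve(quadrics[0].subs(lin_sol), x)   # x = -1 or x = 1
points = [{x: v, **{k: s.subs(x, v) for k, s in lin_sol.items()}}
          for v in cands]

# Eliminate the wrong points with the remaining quadrics.
points = [p for p in points if all(q.subs(p) == 0 for q in quadrics[1:])]
print(points)  # [{x: 1, y: 1, z: 0}]
```

Each quadric used as a filter only needs to be evaluated at the surviving candidates, which is cheap even when there are very many quadrics.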