I have a system of polynomial equations which I need to solve numerically. The one I'm currently interested in has around 20 variables and a similar number of equations. All coefficients are rational numbers. In the end I am only interested in real solutions with all variables in the range $[-1,+1]$.
The most elegant approach I know of involves computing a Gröbner basis (ideally in triangular form), which can then be solved by (numerically) solving a sequence of univariate polynomials, which is easy enough. This has worked for me on some smaller problems (around 10 variables, using the Singular software). But the current problem seems too big for this approach: the computation runs out of memory after many hours on a quite powerful workstation. Any other ideas I could pursue?
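For concreteness, here is a small sketch of the Gröbner-basis approach on a toy two-variable system (using SymPy rather than Singular, purely for illustration): a lex order yields a triangular basis whose last element is univariate, so it can be solved numerically and back-substituted.

```python
import sympy as sp

x, y = sp.symbols('x y')
# Toy system: unit circle intersected with the line x = y.
polys = [x**2 + y**2 - 1, x - y]

# A lex order gives an elimination (triangular) basis.
G = sp.groebner(polys, x, y, order='lex')

# The last basis element is univariate in y; find its roots numerically.
univ = G.exprs[-1]
roots_y = sp.nroots(sp.Poly(univ, y))
print(G.exprs)    # triangular basis
print(roots_y)    # numerical roots of the univariate polynomial
```

This is exactly the strategy that blows up in memory for the 20-variable problem; the sketch only shows the shape of the method on a case where it works.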
I know that the runtime of a Gröbner basis computation depends greatly on the chosen term order. By default, Singular uses 'degrevlex', which is usually good, but it might possibly be improved by assigning weights to the variables. Is there some heuristic for finding good weights?
I'm only interested in finding some solution as a numerical approximation. I don't need to find all solutions, or to prove that I have found all of them, or anything like that (which a Gröbner basis would allow).
I did try using Newton's method to find a root, but starting from a randomly chosen point it is extremely unstable. Is there a more stable method for this? (Most literature I could find deals with multi-dimensional minimization, not with root-finding.)
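One more robust alternative to plain Newton iteration, sketched here with SciPy (an assumption on my part, not something I have tried on the real system), is Powell's hybrid method (MINPACK's hybrd, via `scipy.optimize.root`) combined with random restarts inside the box $[-1,+1]^n$, discarding any solution that leaves the box. The two-variable system below is purely illustrative.

```python
import numpy as np
from scipy.optimize import root

def F(v):
    # Illustrative polynomial system; the real one has ~20 equations.
    x, y = v
    return [x**2 + y**2 - 1.0,
            x*y - 0.25]

rng = np.random.default_rng(0)
solution = None
for _ in range(50):                        # random restarts
    x0 = rng.uniform(-1.0, 1.0, size=2)    # start inside the box
    sol = root(F, x0, method='hybr')       # Powell hybrid: damps Newton steps
    if sol.success and np.all(np.abs(sol.x) <= 1.0):
        solution = sol.x                   # keep only in-range solutions
        break
print(solution)
```

The hybrid method falls back to a scaled gradient step when the Newton step is poor, which is why it tends to be far less sensitive to the starting point than raw Newton iteration.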
Any ideas would be welcome.