
In a small programming exercise I set myself, I want to calculate various things about ellipses. The part I'm stuck on is the following: I want to calculate the angle that is needed for covering a given distance on the circumference of a given ellipse, while starting at a given angle. This image should illustrate the problem:

[Image: an ellipse with starting angle α, additional angle β, and the arc length c between the two corresponding points]

It is guaranteed that α < 90°, but it is possible that α + β > 90°.

I would like to do this using only mathematical operations that are available in 'general' programming languages (like C++, e.g. those in cmath). So, while calculating the actual result in my program, I would like to avoid integrals or derivatives. Although the result will probably be less exact without these operations, it should have a maximum deviation of ~5%.


1 Answer


So you want to find, for a given arc distance on a given ellipse starting from a point whose polar coordinates are $(\alpha, r_\alpha)$, the extra angle $\beta$ that is needed, right? There are two possibilities.

Please look up the standard derivation of the ellipse arc length in terms of elliptic integrals of the second kind (IIRC, the full circumference is $4\,a\,E(\epsilon)$), expressed in terms of the eccentricity $\epsilon$ or the ratio $a/b$. It will be easy to find, because that is how elliptic integrals originally arose.
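For reference, here is a minimal sketch of how that circumference formula could be evaluated in C++. It assumes a C++17 toolchain that ships the mathematical special functions (`std::comp_ellint_2` in `<cmath>`); the function name `ellipse_circumference` is mine.

```cpp
#include <cmath>
#include <iostream>

// Circumference of an ellipse with semi-axes a >= b > 0, via the complete
// elliptic integral of the second kind (C++17 special functions in <cmath>).
double ellipse_circumference(double a, double b) {
    double e = std::sqrt(1.0 - (b * b) / (a * a)); // eccentricity
    return 4.0 * a * std::comp_ellint_2(e);
}

int main() {
    std::cout << ellipse_circumference(1.0, 1.0) << '\n'; // circle: 2*pi ~ 6.28319
    std::cout << ellipse_circumference(2.0, 1.0) << '\n'; // ~9.68845
}
```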

Next, you need the polar coordinate form for a central ellipse:

$$\frac{1}{r^2} =\frac{\cos ^2\theta }{a^2} + \frac{\sin ^2\theta}{b^2}. $$
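Putting the two together, here is a minimal sketch of what was asked for (plain cmath, no explicit integrals): step along the ellipse in small polar-angle increments from $\alpha$, summing the chord lengths of the resulting polyline until the given arc length is reached. The names `ellipse_r` and `beta_for_arc_length` are mine, and the step size is an assumption to tune against the ~5% accuracy target.

```cpp
#include <cmath>
#include <iostream>

// Radius of a central ellipse at polar angle theta, from
// 1/r^2 = cos^2(theta)/a^2 + sin^2(theta)/b^2.
double ellipse_r(double a, double b, double theta) {
    double c = std::cos(theta) / a;
    double s = std::sin(theta) / b;
    return 1.0 / std::sqrt(c * c + s * s);
}

// Advance the polar angle from alpha in small steps, summing the chord
// lengths of the resulting polyline until the accumulated length reaches
// the target arc length c; returns the extra angle beta.
double beta_for_arc_length(double a, double b, double alpha, double c,
                           double dtheta = 1e-4) {
    double theta = alpha;
    double x0 = ellipse_r(a, b, theta) * std::cos(theta);
    double y0 = ellipse_r(a, b, theta) * std::sin(theta);
    double length = 0.0;
    while (length < c) {
        theta += dtheta;
        double x1 = ellipse_r(a, b, theta) * std::cos(theta);
        double y1 = ellipse_r(a, b, theta) * std::sin(theta);
        length += std::hypot(x1 - x0, y1 - y0);
        x0 = x1;
        y0 = y1;
    }
    return theta - alpha;
}

int main() {
    const double pi = std::acos(-1.0);
    // Circle check: on a unit circle an arc of length pi/2 spans beta = pi/2.
    std::cout << beta_for_arc_length(1.0, 1.0, 0.0, pi / 2) << '\n'; // ~1.5708
}
```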

Narasimham
  • The two points you mentioned are right, but the arc distance between these points ($c$ in my image) is given; I want to get the angle $\beta$ that is needed for covering this distance. – sigalor Sep 01 '15 at 10:14