Consider a sphere with a coordinate system like the Earth's. There are $N$ points at random positions on its surface. Among all the infinitely many planes that cut the sphere exactly in half (i.e., planes containing the sphere's center), and given the set of $(latitude, longitude)$ coordinates of the points, find the global maximum number of points that can lie on one side of such a plane.
Things I've tried:
- Compute the sum of the vectors from the sphere's center to the points, then use the plane through the center perpendicular to that sum vector as the answer; I suspect there are counterexamples (see the edit below)
- Use the plane equation $ax+by+cz+d=0$, the constraint that the sphere's center lies on the plane, and some further inequality, and solve it as a linear program; I haven't figured out what that inequality should be. My first thought was to maximize the algebraic sum of the signed distances from the points to the plane, but that seems to weight the points differently, whereas in the final count every point counts equally
- Select every pair of points $\{p_1, p_2\}$ from the set, form the plane through them and the sphere's center, count how many points lie on each side, and add 2 (for the pair itself) to the larger side; this should work, but it is $O(n^3)$
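The $O(n^3)$ pairwise method can be sketched like this (my own Python sketch, not code from the question; it assumes latitude/longitude in radians on a unit sphere, and it ignores degenerate inputs such as repeated points):

```python
import numpy as np

def to_unit_vectors(latlon):
    """(latitude, longitude) pairs in radians -> unit vectors from the center."""
    lat, lon = latlon[:, 0], latlon[:, 1]
    return np.column_stack((np.cos(lat) * np.cos(lon),
                            np.cos(lat) * np.sin(lon),
                            np.sin(lat)))

def max_points_one_side(latlon):
    """O(n^3) brute force: every pair {p1, p2} together with the center
    defines a candidate plane; count the points strictly on each side and
    add 2 for the pair itself, which sits on the boundary."""
    pts = to_unit_vectors(np.asarray(latlon, dtype=float))
    n = len(pts)
    best = 0
    for i in range(n):
        for j in range(i + 1, n):
            normal = np.cross(pts[i], pts[j])
            if np.linalg.norm(normal) < 1e-12:  # identical/antipodal pair: no unique plane
                continue
            side = pts @ normal
            larger = max(np.sum(side > 1e-9), np.sum(side < -1e-9))
            best = max(best, larger + 2)
    return best
```

For example, five points clustered at latitude $45°$ all fit on one side, so the function returns 5 for them.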
Is there a method that solves this more efficiently?
Edit: a counterexample for method 1
For a sphere $s$ and a plane $p$ through its center $C$, draw the line $l$ perpendicular to $p$ through $C$, and let $l$ meet the sphere's surface at points $A$ and $B$. Place 3 points close enough to $A$, arranged symmetrically around $l$. Place another 2 points on the $B$ side of the plane, close enough to the plane and also symmetric about $l$. The symmetric components perpendicular to $l$ cancel, so the sum vector is approximately $3\vec{CA}$, and method 1 picks $p$, which has only 3 points on its larger side. That is clearly not optimal: if we tilt $p$ slightly, one of the points on the $B$ side can be included as well.
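This construction can be checked numerically (a sketch; the specific coordinates below are my own choice of "close enough": three points at latitude $80°$ spaced $120°$ apart in longitude, and two points at latitude $-1°$ at opposite longitudes):

```python
import numpy as np

def sph(lat, lon):
    """Unit vector for (latitude, longitude) in radians."""
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

# Three points near A (the north pole), symmetric around l (the z-axis),
# and two points just below the plane p (the equatorial plane z = 0).
top = [sph(np.radians(80), np.radians(a)) for a in (0, 120, 240)]
bottom = [sph(np.radians(-1), np.radians(a)) for a in (0, 180)]
pts = np.array(top + bottom)

def larger_side(normal, pts):
    """Number of points strictly on the larger side of the plane
    through the center with the given normal."""
    side = pts @ normal
    return max(np.sum(side > 1e-9), np.sum(side < -1e-9))

s = pts.sum(axis=0)               # sum vector: points (almost) straight up
print(larger_side(s, pts))        # 3 -- method 1's plane leaves both bottom points out

tilted = sph(np.radians(88), 0)   # tilt the normal 2 degrees toward one bottom point
print(larger_side(tilted, pts))   # 4 -- strictly better
```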