
The question is to find the centre and radius of the circle in which the plane through A, B and C cuts the sphere.

The sphere has the equation $(x-1)^2 + (y-1)^2 + (z-1)^2=4$

The three points are $A(2,2,1+\sqrt{2})$, $B(0,1+\sqrt{2},2)$ and $C(1,1,3)$.

Say the centre is $S(x_0,y_0,z_0)$

$|\vec{AS}| = |\vec{BS}| = |\vec{CS}| = R$, where $R$ is the radius.

This gives three equations in the three unknown coordinates of $S$, but that system seems impossible to solve by hand. How else could I solve it?

  • a related answer: https://math.stackexchange.com/questions/943383/determine-circle-of-intersection-of-plane-and-sphere – Robin to Roxel May 19 '21 at 11:46
  • Isn't there a typo in the coordinates of A,B,C ? – Viera Čerňanová May 19 '21 at 11:46
  • First you have to find the equation of the plane passing through three points, the rest is given in the reference in comment. – sirous May 19 '21 at 11:57
  • Bob, your approach is not wrong. But the solutions of the system you have form a line: the line orthogonal to the plane $(ABC)$. Therefore you need something more (an equation of the plane, or to minimize $R$, or ...) – Viera Čerňanová May 19 '21 at 12:07
  • @user376343 There is no typo, but thanks, I have now solved it. If I also add the equation stating that the coordinates of W lie in the plane of A, B and C, I can solve it. – Bob the Turtle May 19 '21 at 14:43
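As the comments suggest, adding the plane constraint makes the system solvable; in fact, subtracting the equal-distance equations pairwise makes the whole system linear. A minimal numerical sketch, assuming NumPy (variable names are illustrative):

```python
import numpy as np

s = np.sqrt(2)
A = np.array([2.0, 2.0, 1 + s])
B = np.array([0.0, 1 + s, 2.0])
C = np.array([1.0, 1.0, 3.0])

# |SA| = |SB| expands to the linear equation 2(B - A)·S = |B|^2 - |A|^2,
# and similarly for |SA| = |SC|; the plane through A, B, C supplies a
# third linear equation n·S = n·A, with n normal to that plane.
n = np.cross(B - A, C - A)
M = np.vstack([2 * (B - A), 2 * (C - A), n])
rhs = np.array([B @ B - A @ A, C @ C - A @ A, n @ A])

S = np.linalg.solve(M, rhs)      # centre of the circle
R = np.linalg.norm(A - S)        # its radius
print(S, R)                      # R ≈ 1.1025
```

The matrix is invertible because $n$ is orthogonal to the (independent) vectors $B-A$ and $C-A$, so no least-squares machinery is needed.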

1 Answer


Finding the radius of the circle is an operation which is invariant under translation.

Let us apply the translation that brings the center of the sphere to the origin, using vector

$$T=\begin{pmatrix}-1\\-1\\-1\end{pmatrix}$$

The images of points $A,B,C,\Omega$ (where $\Omega$ is the center of the sphere) become:

$$A'=\begin{pmatrix}1\\1\\s\end{pmatrix}, \ \ B'=\begin{pmatrix}-1\\s\\1\end{pmatrix}, \ \ C'=\begin{pmatrix}0\\0\\2\end{pmatrix}, \ \ O=\begin{pmatrix}0\\0\\0\end{pmatrix}$$

Where:

$$s:=\sqrt{2}$$

We check that points $A',B',C'$ belong to the sphere with center $O$ and radius $2$.

The equation of plane $P:=A'B'C'$ is easily found to be:

$$\det \begin{pmatrix}1&-1&0&x\\ 1&s&0&y\\ s&1&2&z\\ 1&1&1&1\end{pmatrix}=0$$

giving:

$$(3-2s)x+(s-3)y+(-1-s)z+(2+2s)=0\tag{1}$$
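Equation (1) can be checked numerically; the sketch below (assuming NumPy) verifies that the cross product of two edge vectors is proportional to the coefficient vector of (1), and that all three points satisfy the equation:

```python
import numpy as np

s = np.sqrt(2)
Ap = np.array([1.0, 1.0, s])
Bp = np.array([-1.0, s, 1.0])
Cp = np.array([0.0, 0.0, 2.0])

# Normal to plane (A'B'C'); it equals -(3-2s, s-3, -1-s), i.e. the
# coefficient vector of equation (1) up to sign.
n = np.cross(Bp - Ap, Cp - Ap)
print(n)

# Each point satisfies (3-2s)x + (s-3)y + (-1-s)z + (2+2s) = 0.
for P in (Ap, Bp, Cp):
    print((3 - 2*s)*P[0] + (s - 3)*P[1] + (-1 - s)*P[2] + 2 + 2*s)
```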

The (shortest) distance from origin $O$ to plane $P$ is therefore:

$$d=\dfrac{2+2s}{\sqrt{(3-2s)^2+(s-3)^2+(-1-s)^2}}=\dfrac{2+2s}{\sqrt{31-16s}}$$

$$d\approx 1.6687$$

But $d=OD$, where $D$ is the center of the circle circumscribed about $A'B'C'$. Therefore, using the Pythagorean theorem in right triangle $ODA'$ (see the figure in the document referenced by @Robin to Roxel in the comments):

$$OD^2+DA'^2=OA'^2 \ \iff \ \text{radius} = DA'=\sqrt{OA'^2-OD^2}=\sqrt{4-d^2}\approx \sqrt{4-1.6687^2}\approx 1.1025$$
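The two numerical values can be reproduced in a few lines (a sketch assuming NumPy):

```python
import numpy as np

s = np.sqrt(2)
n = np.array([3 - 2*s, s - 3, -1 - s])   # coefficient vector of (1)
d = (2 + 2*s) / np.linalg.norm(n)        # distance from O to plane P
r = np.sqrt(4 - d**2)                    # Pythagoras in triangle ODA'
print(d, r)                              # d ≈ 1.6687, r ≈ 1.1025
```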


In order to find the coordinates of $D$, just express the fact that :

$$\vec{OD}=k\begin{pmatrix}3-2s\\s-3\\-1-s\end{pmatrix}\tag{2}$$

for a certain constant $k$ where we have taken the normal vector to plane $A'B'C'$ (see equation (1)).

Knowing that $\vec{OD}^2=d^2$, one readily obtains the value of $k$ (up to sign; the sign is fixed by requiring $D$ to satisfy equation (1)). In this way, (2) gives us the coordinates of $D$. It now remains to subtract the translation vector $T$ from these coordinates to obtain the center of the circumscribed circle of the original triangle $ABC$.
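This last step can be sketched as follows (assuming NumPy; the sign of $k$ is chosen here by substituting $D = k\,n$ into equation (1) rather than from $\vec{OD}^2=d^2$):

```python
import numpy as np

s = np.sqrt(2)
n = np.array([3 - 2*s, s - 3, -1 - s])   # coefficient vector of (1)

# D = k*n must lie in plane (1): n·(k*n) + (2 + 2s) = 0, hence:
k = -(2 + 2*s) / (n @ n)
D = k * n                                # centre of circle A'B'C'
T = np.array([-1.0, -1.0, -1.0])
centre = D - T                           # undo the translation
print(centre)                            # ≈ (0.901, 1.915, 2.392)
```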

Jean Marie