I'm trying to calculate the minimum radius of curvature of a cubic Bezier curve (in C#). I know this question has been asked on StackExchange before, and I have browsed the answers thoroughly and tried different implementations. However, the results are not what I expect.
Given this Bezier curve:
If the curvature of the Bezier is k(t), I would expect it to be exactly 0.1 for any value of t for this specific curve. Given my results, that expectation seems wrong: the minimum radius I find is 7.5. However, if I calculate the radius (given as 1 / k(t)) for 10,000 values of t between 0 and 1 and then take the mean, I do get 10 (up to some rounding error). Does this make sense?
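For reference, here is a minimal sketch of the kind of computation I mean (the class and method names are just illustrative, not my actual code). It uses the standard formula k(t) = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2) on the first and second derivatives of the cubic:

```csharp
using System;

// Minimal sketch (illustrative names, not my actual code): curvature of a
// cubic Bezier with control points p0..p3.
public static class Bezier
{
    // First derivative B'(t) of the cubic Bezier.
    public static (double X, double Y) D1(
        (double X, double Y) p0, (double X, double Y) p1,
        (double X, double Y) p2, (double X, double Y) p3, double t)
    {
        double u = 1 - t;
        return (3 * u * u * (p1.X - p0.X) + 6 * u * t * (p2.X - p1.X) + 3 * t * t * (p3.X - p2.X),
                3 * u * u * (p1.Y - p0.Y) + 6 * u * t * (p2.Y - p1.Y) + 3 * t * t * (p3.Y - p2.Y));
    }

    // Second derivative B''(t) of the cubic Bezier.
    public static (double X, double Y) D2(
        (double X, double Y) p0, (double X, double Y) p1,
        (double X, double Y) p2, (double X, double Y) p3, double t)
    {
        double u = 1 - t;
        return (6 * u * (p2.X - 2 * p1.X + p0.X) + 6 * t * (p3.X - 2 * p2.X + p1.X),
                6 * u * (p2.Y - 2 * p1.Y + p0.Y) + 6 * t * (p3.Y - 2 * p2.Y + p1.Y));
    }

    // Curvature k(t) = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2); the radius is 1 / k(t).
    public static double Curvature(
        (double X, double Y) p0, (double X, double Y) p1,
        (double X, double Y) p2, (double X, double Y) p3, double t)
    {
        var d1 = D1(p0, p1, p2, p3, t);
        var d2 = D2(p0, p1, p2, p3, t);
        double cross = d1.X * d2.Y - d1.Y * d2.X;
        double speed = Math.Sqrt(d1.X * d1.X + d1.Y * d1.Y);
        return Math.Abs(cross) / (speed * speed * speed);
    }
}
```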
My goal is to find the minimum radius of curvature of a given Bezier curve, where the curve always bends in one direction. In other words: if one were to roll a ball of varying size along the Bezier so that it always fits the curve as snugly as possible, how small does the ball get?
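What I'm currently doing is essentially the brute-force version of this: sample t densely and keep the smallest 1 / k(t). A minimal sketch (again illustrative, and assuming the Bezier.Curvature helper from the sketch above):

```csharp
using System;

// Sketch of the brute-force estimate: sample t densely and keep the smallest
// radius 1 / k(t). Assumes the Bezier.Curvature helper from the sketch above.
public static class MinRadiusDemo
{
    public static double MinRadius(
        (double X, double Y) p0, (double X, double Y) p1,
        (double X, double Y) p2, (double X, double Y) p3, int samples = 10_000)
    {
        double min = double.PositiveInfinity;
        for (int i = 0; i <= samples; i++)
        {
            double t = (double)i / samples;
            double k = Bezier.Curvature(p0, p1, p2, p3, t);
            if (k > 0)                         // skip points where the curve is locally straight
                min = Math.Min(min, 1.0 / k);
        }
        return min;
    }
}
```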
Thank you for any thoughts.
[(0, 0), (40/3*sqrt(2) - 40/3, 0), (10, -40/3*sqrt(2) + 70/3), (10, 10)] give a better approximation to a quarter circle, so the range of the radius is much smaller, as this plot shows. – PM 2Ring Oct 28 '21 at 07:29

c is calculated by first getting the first and second derivatives (d1 and d2) of the Bezier, and then doing: c = (d1.X^2 + d1.Y^2) / sqrt((d1.X * d2.Y - d2.X * d1.Y)^3). This indeed gives me the results you mention. Taking a closer look in Inkscape, I now also see that my initial curve indeed does not coincide with a quarter of a circle. Thanks! – mennowo Oct 28 '21 at 08:14
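As a quick sanity check (not from the original thread, names again illustrative), plugging PM 2Ring's control points into the sampling sketch above should report a minimum radius much closer to 10 than the 7.5 from my original curve:

```csharp
using System;

// Quick check (illustrative): minimum radius for PM 2Ring's quarter-circle
// approximation of radius 10, using the sketches above.
public static class QuarterCircleCheck
{
    public static void Main()
    {
        double a = 40.0 / 3.0 * (Math.Sqrt(2) - 1);   // ~5.5228
        var p0 = (0.0, 0.0);
        var p1 = (a, 0.0);
        var p2 = (10.0, 10.0 - a);                    // equals -40/3*sqrt(2) + 70/3
        var p3 = (10.0, 10.0);

        // Expect a value close to 10 (within a few percent), rather than 7.5.
        Console.WriteLine(MinRadiusDemo.MinRadius(p0, p1, p2, p3));
    }
}
```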