How does the Bambu Lidar work? I am convinced that Bambu is not using a Time-of-Flight Lidar method for its bed sensors. It might instead be a camera (or smart camera, depending on your definition) that looks at an angled laser line projected onto the bed. The laser is offset from the camera and angled at 45 degrees. As the camera/laser assembly approaches the bed, the position of the laser line shifts and appears to move from one side of the camera's FOV to the opposite side. Given known offsets between the laser, the camera, and the nozzle, the height of the nozzle can be computed from where the line falls in the camera's FOV.
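If that geometry is right, the height calculation is just the standard laser-triangulation range equation. Here is a minimal sketch in Python, assuming a made-up baseline, laser angle, focal length, and nozzle offset rather than anything from Bambu's actual calibration:

```python
import math

def nozzle_height_mm(line_px, baseline_mm=10.0, laser_angle_deg=45.0,
                     focal_px=600.0, center_px=160.0, cam_to_nozzle_mm=2.0):
    """Estimate nozzle-to-bed distance from where the laser line sits in the frame.

    line_px         : detected column (or row) of the laser line in the image
    baseline_mm     : assumed laser-to-camera offset
    laser_angle_deg : assumed laser tilt relative to the camera's optical axis
    focal_px        : assumed camera focal length expressed in pixels
    center_px       : image center along the measured axis
    cam_to_nozzle_mm: assumed offset between camera reference and nozzle tip
    All of these constants are invented for illustration; a real sensor would be calibrated.
    """
    # Angle from the camera's optical axis to the detected laser line.
    view_angle = math.atan((line_px - center_px) / focal_px)
    # Standard triangulation range equation: Z = baseline / (tan(laser_angle) + tan(view_angle))
    z_cam = baseline_mm / (math.tan(math.radians(laser_angle_deg)) + math.tan(view_angle))
    return z_cam - cam_to_nozzle_mm

# As the head approaches the bed, the line drifts across the frame and the
# computed height drops accordingly.
for px in (120, 160, 200):
    print(px, round(nozzle_height_mm(px), 3))
```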
The Bambu part for the Lidar, as seen in "Bambu Micro Lidar (Dual Red Laser) - X1 Series" and "Bambu Lab micro lidar introduction and methods for distinguishing versions (single red or dual red)", looks very much like a regular CMOS sensor. The camera could also serve as a line-width measurement tool in one of two ways. At a known FOV/magnification, the sensor pixels correspond to a known physical size, so it can measure width by edge-detecting the contrast between the material and the bed; this might be why it has LED illuminators. Or, given a fine enough laser line, it could look at the line where it crosses the material, since it would appear offset on the material compared to where it falls on the bed.
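For the first width-measurement idea, the math is just pixel counting at a calibrated scale. A toy sketch, with the mm-per-pixel scale and the contrast threshold invented for illustration:

```python
MM_PER_PIXEL = 0.02   # assumed physical width of one pixel at the working distance
THRESHOLD = 0.5       # assumed normalized brightness cutoff between bed and filament

def line_width_mm(intensity_row, mm_per_pixel=MM_PER_PIXEL, threshold=THRESHOLD):
    """Measure the width of a printed line from one row of normalized pixel intensities.

    intensity_row: sequence of floats in [0, 1] sampled perpendicular to the extrusion.
    Returns the distance between the first and last above-threshold pixels.
    """
    bright = [i for i, v in enumerate(intensity_row) if v > threshold]
    if not bright:
        return 0.0
    # Width = span between the two detected edges, converted to millimeters.
    return (bright[-1] - bright[0] + 1) * mm_per_pixel

# Toy example: a 25-pixel-wide bright band on a dark bed -> 0.5 mm.
row = [0.1] * 20 + [0.9] * 25 + [0.1] * 20
print(line_width_mm(row))
```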
If this is how it works, then I think the name is confusing; it is closer to structured light sensing. I'm also trying to understand the benefit of their later iterations with two lasers orthogonal to each other. It would also be worth duplicating this feature using an RP2040 with a camera like the PICO-Cam-A and a laser line source. It seems like a basic bed probe replacement could be made with the RP2040 acting as a smart camera, and additional features like first-layer image analysis could be added by connecting it to a Raspberry Pi over USB.
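To make the RP2040 idea concrete, the firmware loop would roughly be: grab a frame, find the laser line, map its position to a height, and report it over USB serial so a host could treat it like a Z probe. A rough sketch in plain Python; the camera read and calibration numbers are placeholders, and a real build would use the PICO-Cam-A driver under MicroPython or the C SDK:

```python
# Assumed two-point calibration: line at column 120 when the nozzle is 1.0 mm up,
# column 200 when it is 0.2 mm up. These numbers are made up for illustration.
CAL_PX = (120.0, 200.0)
CAL_MM = (1.0, 0.2)

def find_laser_column(frame):
    """Average column of the brightest pixel in each row, i.e. the laser line's position."""
    cols = [max(range(len(row)), key=lambda c: row[c]) for row in frame]
    return sum(cols) / len(cols)

def column_to_height_mm(col):
    """Linear interpolation between the two calibration points (stand-in for real calibration)."""
    slope = (CAL_MM[1] - CAL_MM[0]) / (CAL_PX[1] - CAL_PX[0])
    return CAL_MM[0] + slope * (col - CAL_PX[0])

def probe_once(capture_grayscale_frame, report=print):
    """One probe cycle: grab a frame, find the line, report the estimated nozzle height."""
    frame = capture_grayscale_frame()        # placeholder for the actual camera driver
    col = find_laser_column(frame)
    report(f"Z_EST:{column_to_height_mm(col):.3f}")   # e.g. over USB CDC serial

# Toy test with a synthetic 4x320 frame whose bright column sits at 160 -> ~0.6 mm.
if __name__ == "__main__":
    fake_frame = [[1.0 if c == 160 else 0.0 for c in range(320)] for _ in range(4)]
    probe_once(lambda: fake_frame)
```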