G01B11/002

Position detector and method for 3D position determination
11707847 · 2023-07-25 ·

A position detector for generating 3D position information of an object in a position determination space for the object. The position detector has a camera with a lens and an image sensor that defines an imaging area, and at least one first light deflecting element arranged in the imaging area. The camera and the at least one light deflecting element are adapted to simultaneously produce on the image sensor at least two images of the position determination space, a first image being produced by light beams deflected at the first light deflecting element; the at least two images differ with respect to their viewing direction of the position determination space.

SYSTEM AND METHOD FOR DETECTING A POSITION OF A GUIDE CATHETER SUPPORT

A catheter procedure system includes a base and a robotic mechanism having a longitudinal axis and being movable relative to the base along the longitudinal axis. The robotic mechanism includes a robotic drive base including at least one drive mechanism, a cassette operatively secured to the robotic drive base, a rigid guide coupled to the cassette and fixed relative to the robotic mechanism, and a flexible track having a distal end, a proximal end, and a plurality of reflective sections. At least a portion of the flexible track is disposed within the rigid guide. The robotic mechanism also includes a position detector mounted to the robotic drive base and positioned beneath the flexible track. The position detector is configured to detect light reflected off the reflective sections of the flexible track and to determine the position of the distal end of the flexible track based on the detected reflected light.

VISUAL POSITIONING SYSTEM, BATTERY REPLACING DEVICE, AND BATTERY REPLACEMENT CONTROL METHOD
20230234234 · 2023-07-27 ·

A visual positioning system, comprising a first visual sensor (501), a second visual sensor (502), and a position obtaining unit (503), wherein the first visual sensor (501) is used for obtaining a first image (G11) of a first position (A) of a target apparatus (7), the second visual sensor (502) is used for obtaining a second image of a second position (B) of the target apparatus (7), and the position obtaining unit (503) is used for obtaining position information of the target apparatus (7) according to the first image (G11) and the second image. Further disclosed are a battery swapping device and a battery swapping control method. Relatively high positioning accuracy is obtained visually, and accurate positioning between the battery swapping device and a vehicle awaiting battery swap is achieved.
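The abstract above does not specify how the two images are combined into position information; a minimal planar sketch of one possible computation, recovering translation and yaw of the target from two observed reference points, is shown below. The function name, data layout, and midpoint/angle formulas are illustrative assumptions, not the patent's method:

```python
import math

def pose_from_two_points(pa, pb, ref_a, ref_b):
    """Hypothetical sketch: recover the planar offset and yaw of a target
    from the observed positions of two reference points A and B, given
    their nominal (pre-calibrated) positions ref_a and ref_b."""
    # Yaw: angle between the observed A->B vector and the nominal one
    obs = (pb[0] - pa[0], pb[1] - pa[1])
    nom = (ref_b[0] - ref_a[0], ref_b[1] - ref_a[1])
    yaw = math.atan2(obs[1], obs[0]) - math.atan2(nom[1], nom[0])
    # Translation: displacement of the observed midpoint from the nominal one
    mid_obs = ((pa[0] + pb[0]) / 2, (pa[1] + pb[1]) / 2)
    mid_nom = ((ref_a[0] + ref_b[0]) / 2, (ref_a[1] + ref_b[1]) / 2)
    return (mid_obs[0] - mid_nom[0], mid_obs[1] - mid_nom[1], yaw)
```

Using two separated reference points, as the two-sensor layout suggests, makes the yaw observable in addition to the offset; a single point would give translation only.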

Digital track recording device and coordinate calibration method
20230003505 · 2023-01-05 ·

A coordinate calibrating method includes the steps of:
(1) reading paper coordinates with two OID sensors, comparing the coordinates read by the two OID sensors with the pre-designed coordinates of the paper to obtain the coordinate deviation of the positions of the two OID sensors, and, from the coordinate deviation, calculating an inclination angle value θ between the coordinates of the two OID sensors; and
(2) according to the coordinate deviation and the inclination angle value, calibrating the actually read coordinate position of a pen, with the second OID sensor selected as a reference point: the pen coordinates to be corrected are first translated horizontally and vertically, and then rotated by the angle θ about the second OID sensor, so that the positions of the two OID sensors coincide with the pre-designed standard positions and the coordinate position of the pen is calibrated.
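The translate-then-rotate correction in step (2) can be sketched as follows; the function name, signature, and sign conventions are hypothetical, since the patent describes the geometry rather than an implementation:

```python
import math

def calibrate_pen_coordinate(pen_xy, sensor2_xy, deviation_xy, theta):
    """Illustrative sketch of the two-step correction described above:
    1) translate the pen coordinate by the measured coordinate deviation;
    2) rotate it by theta about the second OID sensor (reference point)."""
    # Step 1: horizontal and vertical translation by the coordinate deviation
    x = pen_xy[0] - deviation_xy[0]
    y = pen_xy[1] - deviation_xy[1]
    # Step 2: rotation by theta about the second sensor position
    dx, dy = x - sensor2_xy[0], y - sensor2_xy[1]
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    xr = sensor2_xy[0] + dx * cos_t - dy * sin_t
    yr = sensor2_xy[1] + dx * sin_t + dy * cos_t
    return (xr, yr)
```

With zero deviation and θ = 0 the pen coordinate passes through unchanged, which is the expected behavior when the two sensors already sit at their pre-designed standard positions.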

METHODS FOR OPTICAL TRACKING AND SURFACE ACQUISITION IN SURGICAL ENVIRONMENTS AND DEVICES THEREOF

A computer assisted system is disclosed that includes an optical tracking system and one or more computing devices. The optical tracking system includes an RGB sensor and is configured to capture color images of an environment in the visible light spectrum and tracking images of fiducials in the environment in the near-infrared spectrum. The computer assisted system is configured to generate a color image of the environment using the color images, identify fiducial locations using the tracking images, generate depth maps from the color images, reconstruct three-dimensional surfaces of structures based on the depth maps, and output a display comprising the reconstructed three-dimensional surfaces and one or more surgical objects that are associated with the tracked fiducials. The computer assisted system can further include a monitor or a head-mounted display (HMD) configured to present augmented reality (AR) images during a procedure.

INNER SURFACE SHAPE MEASUREMENT DEVICE, AND ALIGNMENT METHOD AND MAGNIFICATION CALIBRATION METHOD FOR INNER SURFACE SHAPE MEASUREMENT DEVICE
20230003510 · 2023-01-05 ·

An inner surface shape measurement device, which measures an inner surface shape of a small hole formed in a workpiece, includes: a rotating body for rotating the workpiece around a rotation axis, and a linear-and-tilting-motion stage; an elongated probe capable of being inserted into the small hole of the workpiece; a probe linear-and-tilting-motion mechanism capable of adjusting the posture of the probe; a camera, configured to be rotatable integrally with the rotating body, for imaging the probe from at least three circumferential positions on a rotation trajectory centered on the rotation axis; and a controller for adjusting the posture of the probe using the probe linear-and-tilting-motion mechanism based on an image taken by the camera at each of the circumferential positions.

MEASURING DEVICE, MEASURING SYSTEM, MEASURING METHOD, AND PROGRAM

Provided is a measuring device for measuring the hardness of a rotor blade groove. This measuring device comprises: a hardness meter for measuring hardness; an actuator that presses the hardness meter against an object to be measured; a camera for capturing an image of the hardness meter's measurement range on the object to be measured; a movement mechanism for moving the hardness meter and the camera to a desired position within the measurement range; and a fixing member for fixing the movement mechanism to the object to be measured.

METHOD FOR CORRECTING AN ASSEMBLY DEVIATION OF AN APPARATUS AND CORRECTING A PROCESS ERROR USING AN APRILTAG, AND AN APPARATUS FOR CORRECTING AN ASSEMBLY DEVIATION OF THE APPARATUS AND CORRECTING A PROCESS ERROR USING THE SAME
20230004137 · 2023-01-05 ·

An apparatus for correcting a process error includes: a frame; a machining unit, formed inside or outside the frame, that performs a predetermined process; a conveying unit, formed inside or outside the frame, that performs predetermined conveying; a sensing mark formed on the frame, the machining unit, or the conveying unit; an imaging unit, formed inside or outside the frame, that creates an original image by imaging the sensing mark; and a measuring unit that derives a 3D position variation value of the frame, the machining unit, or the conveying unit by deriving an image variation value of the sensing mark through analysis of the original image transmitted from the imaging unit.

METHOD AND SYSTEM FOR PERFORMING AUTOMATIC CAMERA CALIBRATION
20230025684 · 2023-01-26 ·

A system and method for performing automatic camera calibration are provided. The system receives a calibration image and determines a plurality of image coordinates representing the respective locations at which a plurality of pattern elements of a calibration pattern appear in the calibration image. The system determines, based on the plurality of image coordinates and defined pattern element coordinates, an estimate for a first lens distortion parameter of a set of lens distortion parameters, wherein the estimate for the first lens distortion parameter is determined while estimating a second lens distortion parameter of the set to be zero, or is determined without estimating the second lens distortion parameter at all. After the estimate of the first lens distortion parameter is determined, the system determines an estimate for the second lens distortion parameter based on the estimate for the first.
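One way to read the staged estimation is as two linear least-squares fits on a simple radial distortion model r_d = r·(1 + k1·r² + k2·r⁴); the model, function, and data layout below are illustrative assumptions, not the patent's actual procedure:

```python
def stage_wise_distortion_fit(r_ideal, r_observed):
    """Hedged sketch of staged estimation: stage 1 estimates the first
    radial distortion coefficient k1 while treating k2 as zero; stage 2
    estimates k2 from the remaining residual with k1 held fixed.
    Both stages are closed-form one-parameter least-squares fits."""
    # Relative radial residual: (r_d / r) - 1 = k1*r^2 + k2*r^4
    resid = [rd / r - 1.0 for r, rd in zip(r_ideal, r_observed)]
    # Stage 1: fit k1 alone, with k2 assumed to be zero
    k1 = sum(e * r**2 for e, r in zip(resid, r_ideal)) / sum(r**4 for r in r_ideal)
    # Stage 2: fit k2 to what stage 1 left unexplained, k1 fixed
    resid2 = [e - k1 * r**2 for e, r in zip(resid, r_ideal)]
    k2 = sum(e * r**4 for e, r in zip(resid2, r_ideal)) / sum(r**8 for r in r_ideal)
    return k1, k2
```

Fixing k2 at zero in the first stage keeps each fit one-dimensional and well-conditioned, at the cost of letting k1 absorb some higher-order distortion when k2 is actually nonzero.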

OBJECT DETECTION DEVICE AND MODEL PATTERN EVALUATION DEVICE
20230024736 · 2023-01-26 ·

Provided is an object detection device which can detect, with high accuracy, a symmetrical target object represented in an image. The object detection device includes a memory which stores a model pattern representing a plurality of predetermined features at mutually different positions on a target object when the target object is viewed from a predetermined direction; a feature extraction unit which extracts a plurality of the predetermined features from an image in which the target object is represented; and a collation unit. The collation unit calculates a degree of coincidence representing a degree of matching between the predetermined features of the model pattern and the predetermined features extracted from a region of the image corresponding to the model pattern, while changing at least one of a relative position, angle, direction, or size of the model pattern with respect to the image, and judges that the target object is represented in that region of the image when the degree of coincidence is equal to or greater than a predetermined threshold. The predetermined features stored in the memory include a feature of interest which can be used for detecting a position of the target object in a specific direction in the image, or for detecting an angle in a rotational direction centered about a predetermined point of the target object in the image. When calculating the degree of coincidence, the collation unit gives a match between the feature of interest of the model pattern and a predetermined feature in the image a greater contribution than a match involving a predetermined feature of the model pattern other than the feature of interest.
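The weighted degree-of-coincidence idea can be sketched as follows; the data layout, distance-tolerance match test, and weight value are assumptions for illustration, not details from the patent:

```python
def degree_of_coincidence(model_features, image_features, interest_ids,
                          w_interest=2.0, tol=1.0):
    """Illustrative sketch: features are dicts mapping feature id to an
    (x, y) position; a model feature 'matches' when some image feature
    lies within distance tol of it. Matches on features of interest
    contribute with weight w_interest, ordinary matches with weight 1."""
    score, total = 0.0, 0.0
    for fid, (mx, my) in model_features.items():
        w = w_interest if fid in interest_ids else 1.0
        total += w  # normalize so a full match yields 1.0
        matched = any((mx - ix)**2 + (my - iy)**2 <= tol**2
                      for ix, iy in image_features.values())
        if matched:
            score += w
    return score / total if total else 0.0
```

Up-weighting the feature of interest is what breaks ties between the symmetric poses of the target: a candidate pose that matches the asymmetric feature scores strictly higher than its mirror or rotated counterpart.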