G01B11/00

Optical detector system

An optical detector system provides beam positioning data to an optical tracking system to facilitate optical communications. The optical detector system comprises a plurality of optical photodetectors. For example, a two-by-two array may be used. Incoming light passes through one or more optical elements, such as a lens and a dispersive optical element. A first portion of the beam entering the optical elements is directed into a first spot having a first area on the array. A second portion of the beam entering the optical elements is dispersed to form a second spot having a second area on the array that is larger than the first area. This combination of first portion and second portion of the beam incident on the array provides unambiguous information in the output of the photodetectors that is indicative of a position of the incoming beam with respect to the array.
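The abstract does not give the position computation, but for a two-by-two photodetector array the classic normalized-difference estimate illustrates how the four outputs yield a beam position; the function name and signal layout below are assumptions for the sketch:

```python
# Hypothetical sketch of quadrant-detector position estimation for a
# two-by-two photodetector array. The patent gives no formulas; this is
# the standard normalized-difference computation.

def beam_position(q):
    """q = [[ul, ur], [ll, lr]] photodetector signals (arbitrary units).

    Returns (x, y) in normalized units, -1..+1, relative to the array center.
    """
    (ul, ur), (ll, lr) = q
    total = ul + ur + ll + lr
    if total == 0:
        raise ValueError("no signal on detector")
    x = ((ur + lr) - (ul + ll)) / total   # right half minus left half
    y = ((ul + ur) - (ll + lr)) / total   # top half minus bottom half
    return x, y

# A centered beam illuminates all four cells equally:
print(beam_position([[1.0, 1.0], [1.0, 1.0]]))   # (0.0, 0.0)
# A beam offset toward the upper right gives positive x and y:
print(beam_position([[0.5, 2.0], [0.2, 1.3]]))
```

The dispersed second spot described in the abstract would widen the capture range of exactly this kind of estimate, since some light reaches all four cells even for large offsets.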

Estimating gemstone weight in mounted settings

A system comprises a faceted structure imaging assembly and a faceted structure image analyzer. The system is configured to determine the carat weight of a gemstone while it remains in a mounted setting. In a first mode, the imaging assembly obtains a first image of a top gemstone surface. The image analyzer uses the first image to obtain at least one gemstone dimension, such as table and diameter dimensions. In a second mode, the imaging assembly obtains a second image of the top gemstone surface while a colored light pattern is reflected onto the gemstone. The image analyzer uses the second image to obtain at least one other gemstone dimension, such as crown and pavilion angles. The image analyzer uses the dimensions obtained from the first and second images to determine weight information of the gemstone. The system determines gemstone weight quickly, reliably, and consistently, without requiring a skilled gemologist or removal of the stone from its setting.
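The patent does not disclose its weight formula, but the final step it describes, computing weight from measured dimensions, can be illustrated with the standard gemological estimate for round brilliants (carats ≈ diameter² × depth × 0.0061, dimensions in millimeters); the function below is a sketch under that assumption:

```python
# Hedged illustration of weight-from-dimensions. This uses the standard
# gemological round-brilliant estimator, not a formula from the patent.

def round_brilliant_carats(diameter_mm, depth_mm):
    """Estimate carat weight of a round brilliant from its
    diameter and total depth, both in millimeters."""
    return diameter_mm ** 2 * depth_mm * 0.0061

# A 6.5 mm diameter, 4.0 mm deep round brilliant is roughly one carat:
print(round(round_brilliant_carats(6.5, 4.0), 2))   # 1.03
```

Other cuts use different multipliers and dimension sets, which is presumably why the system measures crown and pavilion angles as well.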

Methods and apparatus for absolute and relative depth measurements using camera focus distance

A depth measuring apparatus includes a camera assembly configured to capture a plurality of images of a target at a plurality of distances from the target. The depth measuring apparatus further includes a controller configured to, for each of a plurality of regions within the plurality of images: determine corresponding gradient values within the plurality of images; determine a corresponding maximum gradient value from the corresponding gradient values; and determine, based on the corresponding maximum gradient value, a depth measurement for a region of the plurality of regions.
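The per-region procedure in the abstract (compute a gradient per image, take the maximum, map it to a depth) can be sketched as follows; the gradient measure, region representation, and distance list are assumptions, not taken from the patent:

```python
# Hedged sketch of depth-from-focus: for each region, find the camera
# distance at which the image gradient (sharpness) peaks.

def region_gradient(region):
    """Sum of absolute horizontal and vertical pixel differences over a
    2-D list of intensities: a simple sharpness measure."""
    g = 0.0
    for r in range(len(region)):
        for c in range(len(region[0])):
            if c + 1 < len(region[0]):
                g += abs(region[r][c + 1] - region[r][c])
            if r + 1 < len(region):
                g += abs(region[r + 1][c] - region[r][c])
    return g

def depth_for_region(images_at_distances):
    """images_at_distances: list of (distance, region_pixels) pairs.
    Returns the distance whose image maximizes the gradient."""
    best_dist, _ = max(
        ((d, region_gradient(px)) for d, px in images_at_distances),
        key=lambda t: t[1],
    )
    return best_dist

sharp = [[0, 9], [9, 0]]    # strong edges -> large gradient
blurry = [[4, 5], [5, 4]]   # weak edges -> small gradient
print(depth_for_region([(0.3, blurry), (0.5, sharp), (0.7, blurry)]))  # 0.5
```

Running this over every region of the image stack yields a depth map, with each region's depth given by its best-focus distance.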

Exposure apparatus and exposure method, and device manufacturing method
11579532 · 2023-02-14

In corner sections of the first to fourth quadrants, whose origin is the center of an upper surface of a stage, three two-dimensional heads each are provided. Each set of three two-dimensional heads includes one first head and two second heads. The stage is driven while its position is measured using three of the four first heads, which face a two-dimensional grating of a scale plate provided above the stage. During the driving, difference data between the measurement values of the two second heads and that of the first head in a measurement direction are acquired for each head group to which the three measuring first heads belong, and grid errors are calibrated using the difference data.
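The difference-data step can be illustrated with a small sketch: each head group records the offset of its second heads relative to its first head at a grid location, and averaging those offsets over repeated passes gives a per-location grid-error estimate. The data layout, names, and averaging scheme below are assumptions for illustration only:

```python
# Hedged sketch of grid-error estimation from first/second-head
# difference data. Not the patent's actual calibration procedure.
from collections import defaultdict

def collect(diffs, location, first_val, second_vals):
    """Record second-head minus first-head differences at a grid location."""
    for i, s in enumerate(second_vals):
        diffs[(location, i)].append(s - first_val)

def grid_errors(diffs):
    """Average the collected differences per (location, second-head) key."""
    return {k: sum(v) / len(v) for k, v in diffs.items()}

diffs = defaultdict(list)
# Two passes over grid location (0, 0); second head 0 reads ~0.02 high:
collect(diffs, (0, 0), 10.00, [10.02, 9.99])
collect(diffs, (0, 0), 10.10, [10.12, 10.09])
print(round(grid_errors(diffs)[((0, 0), 0)], 3))   # 0.02
```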

Sensor device for welding

A gas flowing between a welding device and work pieces while the work pieces are welded together has a large influence on the weld. A sensor device includes a sensor unit and a container that includes a housing case (i.e., housing portion) and a shielding member (i.e., shielding portion). The shielding member is attached to the housing case and shields the lower surface of the housing case from radiant heat generated while the work pieces are welded together. The shielding member is inclined with respect to the flow direction of the gas passing through a detection outlet port of a second gas flow channel, so that the gas discharged from the detection outlet port is blown against the shielding member and thus flows toward the side opposite the side where the work pieces are to be welded together.

Determining the distance of an object
11579269 · 2023-02-14

An optoelectronic sensor for determining the distance of an object in a monitoring area has a light transmitter for transmitting transmitted light, a light receiver for generating a received signal from remitted light remitted by the object, and a control and evaluation unit configured to modulate the transmitted light with at least a first frequency and a second frequency, to determine a phase offset between transmitted light and remitted light for the first frequency and the second frequency, and to determine a light time of flight. The control and evaluation unit is configured to determine a first amplitude and a second amplitude for the first frequency and the second frequency from the received signal and to detect whether the transmitted light impinges on an edge in the monitoring area on the basis of an evaluation of the first amplitude and the second amplitude.
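The phase-to-distance relationship underlying this kind of sensor is standard: a phase offset φ at modulation frequency f corresponds to a distance d = cφ/(4πf), unambiguous only up to c/(2f), which is why two frequencies are used. The edge-check heuristic below (comparing the two amplitudes against a calibrated ratio) is an assumption sketching the idea, not the patent's actual evaluation:

```python
# Phase-based time-of-flight distance, plus a hypothetical two-amplitude
# edge check. Only the phase/distance formulas are standard physics.
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_rad, freq_hz):
    """Round-trip phase offset -> distance: d = c * phi / (4 * pi * f),
    ambiguous modulo the unambiguous range c / (2 f)."""
    return C * phase_rad / (4 * math.pi * freq_hz)

def unambiguous_range(freq_hz):
    return C / (2 * freq_hz)

def edge_suspected(amp1, amp2, expected_ratio=1.0, tol=0.2):
    """Hypothetical check: on a flat target the amplitude ratio at the two
    frequencies stays near a calibrated value; a large deviation suggests
    the spot straddles an edge."""
    return abs(amp1 / amp2 - expected_ratio) > tol

f1 = 10e6  # 10 MHz modulation
print(round(unambiguous_range(f1), 2))             # 14.99 m
print(round(distance_from_phase(math.pi, f1), 2))  # 7.49 m (half the range)
```

Combining phase measurements at the two frequencies extends the unambiguous range well beyond either frequency alone.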

Robot, measurement fixture, and tool-tip-position determining method
11577398 · 2023-02-14

A robot includes an arm, a tool attached to the arm, a measurement fixture detachably attached to a tip portion of the tool, and a controller that recognizes a reference coordinate system used to control the arm and that controls the arm. The controller stores data indicating a positional relationship between a tip of the tool and the measurement fixture, or data to be used to calculate that positional relationship, and the controller calculates positional coordinates of the tip of the tool in the reference coordinate system based on position data of the measurement fixture and the positional relationship, where the position data is detected using image data acquired from a visual sensor whose position is associated with the reference coordinate system.
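The tip-position calculation amounts to composing the camera-detected fixture pose with the stored fixture-to-tip offset. The sketch below uses 2-D (x, y, θ) poses for brevity; the names and frames are assumptions, not from the patent:

```python
# Hedged sketch: transform the stored tool-tip offset from the fixture
# frame into the reference frame using the detected fixture pose.
import math

def compose(pose, point):
    """Transform a point from a pose's local frame into the parent frame.
    pose = (x, y, theta) in the parent frame; point = (px, py) local."""
    x, y, th = pose
    px, py = point
    return (x + px * math.cos(th) - py * math.sin(th),
            y + px * math.sin(th) + py * math.cos(th))

# Fixture detected by the visual sensor at (1.0, 2.0), rotated 90 degrees,
# with a stored tip offset of (0.5, 0.0) in the fixture frame:
fixture_pose = (1.0, 2.0, math.pi / 2)
tip_offset = (0.5, 0.0)
tip = compose(fixture_pose, tip_offset)
print(tuple(round(v, 3) for v in tip))   # (1.0, 2.5)
```

A real implementation would use full 3-D homogeneous transforms, but the composition structure is the same.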

Spatial recognition device, spatial recognition method, and program

A spatial recognition device includes an analysis unit configured to acquire reflected-light information from an optical device that receives reflected light obtained by radiating light onto a reflective plate provided on a moving body positioned within a detection area, the information being obtained based on the reflected light in accordance with the radiation direction of the light, and to analyze a state of the moving body on which the reflective plate is provided, based on a distribution of the reflected-light information at coordinates within the detection area.
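One simple way to turn a distribution of reflected-light samples into a body state is to isolate the strong returns from the reflective plate and take their centroid; the threshold and estimator below are assumptions for illustration, not the patent's analysis:

```python
# Hedged sketch: estimate a moving body's position from the distribution
# of strong reflections (the reflective plate) within the detection area.

def body_position(returns, min_intensity=0.8):
    """returns: list of (x, y, intensity) reflected-light samples.
    Returns the centroid of samples bright enough to be the plate,
    or None if no sample clears the threshold."""
    pts = [(x, y) for x, y, i in returns if i >= min_intensity]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

# Two bright plate returns near (1.1, 1.0) and one dim background return:
samples = [(1.0, 1.0, 0.9), (1.2, 1.0, 0.95), (5.0, 5.0, 0.1)]
print(body_position(samples))
```

Tracking this centroid over successive scans would give the moving body's trajectory, one aspect of the "state" the abstract mentions.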

Methods and apparatus for inspecting an engine

A computer-implemented method comprising: receiving data comprising two-dimensional data and three-dimensional data of a component of an engine; identifying a feature of the component using the two-dimensional data; determining coordinates of the feature in the two-dimensional data; determining coordinates of the feature in the three-dimensional data using: the determined coordinates of the feature in the two-dimensional data; and a pre-determined transformation between coordinates in two-dimensional data and coordinates in three-dimensional data; and measuring a parameter of the feature of the component using the determined coordinates of the feature in the three-dimensional data.
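The 2-D-to-3-D step can be sketched concretely. The patent only states that a pre-determined transformation exists; below it is modeled, purely as an assumption, as a calibrated affine pixel-to-metric map plus a depth lookup from the 3-D data:

```python
# Hedged sketch of mapping a feature's 2-D image coordinates into 3-D
# coordinates via a pre-determined transformation. The affine model and
# depth lookup are illustrative stand-ins, not the patent's method.

def to_3d(uv, affine, depth_map):
    """uv: (u, v) pixel coordinates of the feature in the 2-D image.
    affine: ((ax, bx, cx), (ay, by, cy)) mapping pixels to metric x, y.
    depth_map: dict of (u, v) -> z taken from the 3-D data."""
    u, v = uv
    (ax, bx, cx), (ay, by, cy) = affine
    x = ax * u + bx * v + cx
    y = ay * u + by * v + cy
    z = depth_map[(u, v)]
    return (x, y, z)

# 0.1 mm per pixel, origin offset by -5 mm on each axis:
affine = ((0.1, 0.0, -5.0), (0.0, 0.1, -5.0))
depth = {(120, 80): 42.0}
print(to_3d((120, 80), affine, depth))   # x = 7.0 mm, y = 3.0 mm, z = 42.0 mm
```

With the feature located in 3-D, measuring a parameter such as a length or area becomes ordinary geometry on metric coordinates.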