Patent classifications
G02B7/32
Projector Focusing Method and Projector Focusing System Capable of Projecting High Resolution Images at Arbitrary Positions
A projector focusing method includes acquiring a plane angle of a light beam and acquiring a first distance and a second distance between two sides of a light beam edge displayed on a projection plane and a time of flight device after the time of flight device emits the light beam to the projection plane, acquiring a plane equation of the projection plane according to the first distance and the second distance, acquiring an optical axis vector of a digital micro-mirror device (DMD) disposed inside the projector, designating target coordinates of the DMD, converting the target coordinates to projection target coordinates on the projection plane according to the plane equation, acquiring a customized emitting vector according to the projection target coordinates and a lens position of the projector, and acquiring an ideal focal distance of the projector according to the customized emitting vector and the optical axis vector.
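The final steps of this abstract reduce to simple vector geometry: form the emitting vector from the lens position to the projection target, and measure its extent along the DMD optical axis. A minimal sketch of that geometry follows; the function name and the projection-onto-axis interpretation of "ideal focal distance" are assumptions, not the patent's disclosed implementation.

```python
import numpy as np

def ideal_focal_distance(target_xyz, lens_xyz, optical_axis):
    """Distance from the lens to the projection target, measured
    along the DMD optical axis (hypothetical reading of the claim)."""
    # Customized emitting vector: lens position -> projection target
    emit = np.asarray(target_xyz, float) - np.asarray(lens_xyz, float)
    # Normalize the optical axis vector to unit length
    axis = np.asarray(optical_axis, float)
    axis = axis / np.linalg.norm(axis)
    # Scalar projection of the emitting vector onto the optical axis
    return float(np.dot(emit, axis))

# Example: target straight ahead at 2 m along the axis
d = ideal_focal_distance((0.0, 0.0, 2.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

For an off-axis target such as (3, 0, 4) with the axis along z, the sketch returns 4.0: only the along-axis component of the emitting vector contributes to the focal distance.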
AUTOMATIC FOCUSING PROJECTION METHOD AND SYSTEM
Embodiments of the present disclosure relate to the technical field of digital projection and display, and in particular, relate to an automatic focusing projection method and system. The embodiments provide an automatic focusing projection method, which is applicable to an automatic focusing projection system. The automatic focusing projection system includes a ranging unit, a projection unit, and a reflection unit. The method includes: acquiring a depth image from the ranging unit, and acquiring a vertical projection distance from the ranging unit to a projection plane based on the depth image; acquiring position information of a center point of a projection picture in the depth image based on an elevation angle of the reflection unit and the vertical projection distance; acquiring a projection distance between the projection unit and the projection picture based on the position information; and performing focus adjustment on the projection unit based on the projection distance.
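Deriving a slant projection distance from a vertical distance and the reflection unit's elevation angle is a single trigonometric step. The sketch below illustrates one plausible relation (projection distance as vertical distance divided by the cosine of the elevation angle); the formula and names are assumptions for illustration, not the disclosed method.

```python
import math

def projection_distance(vertical_distance, elevation_deg):
    """Hypothetical slant distance to the projection picture, assuming
    the picture centre lies along the elevation angle of the reflection
    unit as seen from the ranging unit."""
    return vertical_distance / math.cos(math.radians(elevation_deg))

# Example: 1 m vertical distance, 60-degree elevation
d = projection_distance(1.0, 60.0)
```

At 0 degrees the slant distance equals the vertical distance; at 60 degrees it doubles, since cos(60°) = 0.5.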
Detection of image data of a moving object
A camera for detecting an object moved through a detection zone is provided that has an image sensor for recording image data, a reception optics having a focus adjustment unit for setting a focal position, a distance sensor for measuring a distance value from the object, and a control and evaluation unit connected to the distance sensor and the focus adjustment unit to set a focal position in dependence on the distance value, and to trigger a recording of image data at a focal position at which there is a focus deviation from a focal position that is ideal in accordance with the measured distance value, with the focus deviation remaining small enough for a required image definition of the image data.
ACTUATOR, IMAGE CAPTURING UNIT AND ELECTRONIC DEVICE
An actuator is configured to drive an imaging lens system. The actuator includes a frame portion configured to accommodate the imaging lens system, a supporting portion disposed on the frame portion, a driving portion configured to drive the imaging lens system to move, an optical mark structure disposed on part of the frame portion, the supporting portion or the driving portion and a liquid disposed on the optical mark structure. The supporting portion is configured to support the imaging lens system and give the imaging lens system a degree of freedom of movement with respect to the frame portion. The optical mark structure includes a plurality of optical mark units arranged side by side. The liquid is in physical contact with the imaging lens system, the frame portion, the supporting portion or the driving portion that is adjacent to the optical mark structure.
ELECTRONIC DEVICE
An electronic device includes at least one optical lens assembly. The optical lens assembly includes four lens elements, and the four lens elements are, in order from an outside to an inside, a first lens element, a second lens element, a third lens element and a fourth lens element. The first lens element has an outside surface being convex in a paraxial region thereof. The second lens element has an inside surface being convex in a paraxial region thereof. The fourth lens element has an inside surface being concave in a paraxial region thereof, wherein at least one of an outside surface and the inside surface of the fourth lens element includes at least one critical point in an off-axis region thereof.
METHOD AND MICROSCOPE FOR DETERMINING A TILT OF A COVER SLIP
A method for determining a tilting of a coverslip in a microscope, which has an object lens facing the coverslip, includes defining at least three measuring points which span a plane on a surface of the coverslip. The following steps are carried out for each of the measuring points: directing a measuring light beam through the object lens to the respective measuring point; producing a reflection light beam by at least partial reflection at the respective measuring point; directing the reflection light beam through the object lens onto a position-sensitive sensor and detecting an incidence position thereon; and determining a distance of the respective measuring point from the object lens along an optical axis thereof based on the detected incidence position. Based on the determined distances, a tilting of the plane spanned by the at least three measuring points relative to the optical axis is determined.
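The tilt determination described above, three measuring points with distances measured along the optical axis spanning a plane, can be sketched with a cross product: the plane normal is compared against the optical axis. The function name and the choice of the z-axis as the optical axis are assumptions for illustration.

```python
import numpy as np

def coverslip_tilt_deg(points):
    """Tilt of the plane spanned by three (x, y, z) measuring points,
    relative to the optical axis (taken here as the z-axis). z is the
    distance of each point from the object lens along that axis."""
    p0, p1, p2 = (np.asarray(p, float) for p in points)
    # Normal of the plane spanned by the three measuring points
    normal = np.cross(p1 - p0, p2 - p0)
    normal = normal / np.linalg.norm(normal)
    # Angle between the plane normal and the optical axis
    cosang = min(abs(float(normal[2])), 1.0)
    return float(np.degrees(np.arccos(cosang)))

# Example: an untilted coverslip (all points at the same z) gives 0 degrees
tilt = coverslip_tilt_deg([(0, 0, 0), (1, 0, 0), (0, 1, 0)])
```

A coverslip whose surface rises by tan(10°) per unit of x yields a tilt of 10 degrees, matching the expected geometry.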
Self-Calibrating and Directional Focusing Systems and Methods for Infinity Corrected Microscopes
A method and system for autofocusing an objective lens in a microscope system are disclosed. A decentered aperture is disposed in an optical path between an objective lens and an image plane of an image capturing device and a plurality of reference images are captured. Each reference image is captured when the objective lens is positioned at a corresponding z-position of a plurality of z-positions along an axis of travel of the objective lens and the optical path is at least partially occluded by the decentered aperture. At least one reference image of the plurality of the reference images is associated with a best focus position. The plurality of reference images are analyzed to develop a plurality of pattern locations, wherein each pattern location represents a position of a pattern formed on the image plane when a corresponding reference image was captured. The objective lens is positioned in accordance with the best focus position and the plurality of pattern locations.
Autofocus functionality in optical sample analysis
A method comprises: directing, using an objective and a first reflective surface, first autofocus light toward a sensor, the first autofocus light reflected from a first surface of a substrate; preventing second autofocus light from reaching the sensor, the second autofocus light reflected from a second surface of the substrate; and directing, using the objective and a second reflective surface, emission light toward the sensor, the emission light originating from a sample at the substrate.
Imaging a sample in a sample holder
A system 100 and method are provided for imaging a sample in a sample holder. For providing autofocus, a 2D pattern is projected onto the sample holder 050 via an astigmatic optical element 120. Image data 172 of the sample is acquired by an image sensor 140 via magnification optics 150. A difference in sharpness of the two-dimensional pattern in the image data is measured along a first axis and a second axis. Based on the difference, a magnitude and direction of defocus of the camera subsystem is determined with respect to the sample holder. This enables the sample holder, and thereby the sample, to be brought into focus in a fast and reliable manner.
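The core of this astigmatic autofocus scheme is that defocus in one direction sharpens the projected pattern along one axis and blurs it along the other, so the signed difference of per-axis sharpness encodes both magnitude and direction of defocus. A minimal sketch follows; the gradient-energy sharpness measure is an assumption, not the system's disclosed metric.

```python
import numpy as np

def defocus_signal(img):
    """Signed sharpness difference of the imaged 2D pattern along the
    two image axes. Sign indicates defocus direction, magnitude relates
    to defocus amount; near zero means in focus (hypothetical metric)."""
    img = np.asarray(img, float)
    # Gradient energy along the horizontal (first) axis
    sx = float(np.mean(np.diff(img, axis=1) ** 2))
    # Gradient energy along the vertical (second) axis
    sy = float(np.mean(np.diff(img, axis=0) ** 2))
    return sx - sy

# Example: vertical stripes are sharp along x only -> positive signal
signal = defocus_signal([[0, 1, 0, 1], [0, 1, 0, 1]])
```

Vertical stripes yield a positive signal, horizontal stripes a negative one, so the sign tells the stage which way to move, which is exactly what the astigmatic element makes possible.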