G06T2207/30208

Methods and systems for calibrating a camera
11538193 · 2022-12-27

A computer-implemented method for calibrating a camera comprises the following steps, carried out by computer hardware components: activating a subset of a plurality of light sources according to a plurality of activation schemes, wherein each activation scheme indicates which of the plurality of light sources to activate; capturing an image for each activation scheme using the camera; and calibrating the camera based on the captured images.
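
The scheme-driven capture loop described in the abstract can be sketched as follows. The `capture(mask)` callable is a hypothetical stand-in for driving the lights and reading the camera; it is not an interface named in the patent.

```python
from itertools import combinations

def activation_schemes(n_lights, subset_size):
    """Enumerate activation schemes: each scheme is a boolean mask
    indicating which of the n_lights light sources to switch on."""
    for subset in combinations(range(n_lights), subset_size):
        on = set(subset)
        yield [i in on for i in range(n_lights)]

def capture_per_scheme(capture, n_lights, subset_size=2):
    """Capture one image per activation scheme; `capture(mask)` stands in
    for activating the chosen subset and reading back an image."""
    return [(mask, capture(mask))
            for mask in activation_schemes(n_lights, subset_size)]
```

The enumeration of fixed-size subsets is one plausible reading of "a plurality of activation schemes"; the patent leaves the scheme design open.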

POSITIONING AND TRACKING MEMBER, METHOD FOR RECOGNIZING MARKER, STORAGE MEDIUM, AND ELECTRONIC DEVICE
20220405965 · 2022-12-22

A positioning and tracking member, a method for recognizing a marker (20), a storage medium, and an electronic device. Because the positioning and tracking member is stuck directly onto the patient's body, no rigid connection between the member and the human body is required, thereby avoiding damage to the human body. Furthermore, in combination with a recognition algorithm for the marker (20), the marker (20) is quickly recognized in image space by comparing the actual size of each candidate connected region in a three-dimensional medical model with that of the marker (20), giving both high recognition speed and high recognition accuracy.

IMAGE RESTORATION DEVICE, IMAGE RESTORATION METHOD, AND IMAGE RESTORATION PROGRAM

The image restoration device has an initialization block that initializes the luminance value of each pixel coordinate to an intermediate value in a luminance array list, which stores for each pixel coordinate either one of a pair of polarity values or the intermediate value. The device also has an update block that updates the initialized luminance array list according to the pixel coordinates and polarity values of each event, and an output block that outputs the luminance array list, as updated over the shooting period, as a binary image. In the update, the luminance values of the firing coordinates, where an event fired, are overwritten by the polarity values of the event, while the luminance values of the remaining, non-firing coordinates are preserved.
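
The initialize/update/output pipeline can be sketched directly. Events are assumed here to be `(x, y, polarity)` tuples with polarity in {0, 1}; the final mapping of the intermediate value to a binary pixel is an assumption, since the abstract leaves that choice to the implementation.

```python
def restore_binary_image(events, height, width):
    """Reconstruct a binary image from an event stream, following the
    three blocks described above."""
    INTERMEDIATE = 0.5  # placeholder value between the two polarities
    # initialization block: every pixel starts at the intermediate value
    lum = [[INTERMEDIATE] * width for _ in range(height)]
    # update block: each event overwrites its firing coordinate with the
    # event polarity; all non-firing coordinates keep their values
    for x, y, polarity in events:
        lum[y][x] = polarity
    # output block: map each value to 0 or 1 (intermediate rounds up here,
    # an assumption -- the patent leaves this mapping open)
    return [[1 if v >= INTERMEDIATE else 0 for v in row] for row in lum]
```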

Optical flow computing method and computing device

The present disclosure provides an optical flow calculation method for a computing device, including: acquiring an event data flow with a predetermined duration from a DVS, the event data flow including a coordinate position and a timestamp of a triggered event; generating a timestamp matrix in accordance with the coordinate position and the timestamp of the triggered event; scanning elements in the timestamp matrix in a predetermined scanning direction, so as to determine at least one intermediate point in each element in accordance with a value and a gradient of the element in the predetermined scanning direction; and calculating a distance between adjacent intermediate points and a gradient direction, and generating an optical flow matrix in accordance with a calculation result. The present disclosure further provides the computing device.
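The first two steps, building the timestamp matrix from DVS events and scanning it for gradients in a predetermined direction, can be sketched as below. Events are assumed to be `(x, y, t)` tuples, and keeping the latest timestamp per coordinate is an assumption not spelled out in the abstract.

```python
def timestamp_matrix(events, height, width):
    """Build the timestamp matrix: each cell holds the latest timestamp
    of an event triggered at that coordinate (0 where none fired)."""
    T = [[0.0] * width for _ in range(height)]
    for x, y, t in events:
        T[y][x] = max(T[y][x], t)
    return T

def scan_row_gradients(T, row):
    """Scan one row left-to-right (one predetermined scanning direction)
    and return (column, gradient) pairs -- the raw material from which
    intermediate points would be selected."""
    vals = T[row]
    return [(c, vals[c] - vals[c - 1]) for c in range(1, len(vals))]
```

The subsequent steps (selecting intermediate points and forming the optical flow matrix from distances between them) depend on thresholds the abstract does not specify, so they are omitted here.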

Locating content in an environment

A method includes determining a device location of an electronic device, and obtaining a content item to be output for display by the electronic device based on the device location, wherein the content item comprises coarse content location information and fine content location information. The method also includes determining an anchor in a physical environment based on the content item, determining a content position and a content orientation for the content item relative to the anchor based on the fine content location information, and displaying a representation of the content item using the electronic device using the content position and the content orientation.
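Placing content relative to an anchor amounts to composing the anchor's pose with the fine content location information. A minimal 2-D sketch, with illustrative names that are not from the patent:

```python
import math

def place_content(anchor_pos, anchor_yaw, fine_offset, fine_yaw):
    """Compose the content pose from the anchor pose and the fine content
    location information (offset and orientation relative to the anchor).
    2-D for brevity; a real system would use full 3-D transforms."""
    ax, ay = anchor_pos
    ox, oy = fine_offset
    c, s = math.cos(anchor_yaw), math.sin(anchor_yaw)
    # rotate the fine offset into the world frame, then translate by the anchor
    content_pos = (ax + c * ox - s * oy, ay + s * ox + c * oy)
    content_yaw = anchor_yaw + fine_yaw
    return content_pos, content_yaw
```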

OPTICAL AXIS CALIBRATION OF ROBOTIC CAMERA SYSTEM
20220392012 · 2022-12-08

A method, instructions for which are executed from a computer-readable medium, calibrates a robotic camera system having a digital camera connected to an end-effector of a serial robot. The end-effector and camera move within a robot motion coordinate frame ("robot frame"). The method includes acquiring, using the camera, a reference image of a target object on an image plane having an optical coordinate frame, and receiving input signals, including a depth measurement and joint position signals. Separate roll and pitch offsets of a target point within the reference image are determined with respect to the robot frame while moving the robot. Offsets with respect to the x, y, and z axes of the robot frame are also determined while moving the robot through another motion sequence. The offsets are stored in a transformation matrix, which is used to control the robot during subsequent operation of the camera system.
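
One plausible reading of "the offsets are stored in a transformation matrix" is a 4x4 homogeneous transform packing the roll/pitch rotation with the x, y, z translation. The rotation order (roll about x, then pitch about y) is an assumption, not stated in the abstract:

```python
import math

def offset_transform(roll, pitch, dx, dy, dz):
    """Pack calibrated offsets into a 4x4 homogeneous transform:
    R = Ry(pitch) @ Rx(roll) in the rotation block, translation in the
    last column."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    return [
        [cp,  sp * sr, sp * cr, dx],
        [0.0, cr,      -sr,     dy],
        [-sp, cp * sr, cp * cr, dz],
        [0.0, 0.0,     0.0,     1.0],
    ]
```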

Interactive augmented reality experiences using positional tracking

Interactive augmented reality experiences with an eyewear device including a position detection system and a display system. The eyewear device registers a first marker position for a user-controlled virtual game piece and a second marker position for an interaction virtual game piece. The eyewear device monitors its own position (e.g., location and orientation) and updates the position of the user-controlled virtual game piece accordingly. The eyewear device additionally monitors the position of the user-controlled virtual game piece with respect to the interaction virtual game piece for use in generating a score. Augmented reality examples include a "spheroidal balancing" augmented reality experience.

Fisheye camera calibration system, method and electronic device

Provided are a fisheye camera calibration system, a calibration method and an electronic device. The system includes a hemispherical target, a fisheye camera and an electronic device. The hemispherical target includes a hemispherical inner surface and multiple markers provided on the hemispherical inner surface. The fisheye camera photographs the hemispherical target to acquire a target image, in which the hemispherical target and the multiple markers on its inner surface are captured. The electronic device acquires initial values of the parameters k₁, k₂, k₃, k₄, k₅, u₀, v₀, mᵤ and mᵥ, and uses a Levenberg-Marquardt algorithm to optimize these initial values, so as to determine the imaging model parameters of the fisheye camera.
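
The Levenberg-Marquardt refinement step can be illustrated with a minimal damped Gauss-Newton loop. For brevity this sketch optimizes a single scalar parameter rather than the nine fisheye parameters, and the residual/Jacobian interfaces are illustrative, not the patent's:

```python
def levenberg_marquardt(residual, jacobian, p0, iters=50, lam=1e-3):
    """Minimal single-parameter Levenberg-Marquardt: solve the damped
    normal equation (J^T J + lam) dp = -J^T r, accept steps that lower
    the squared residual, and adapt the damping factor lam."""
    p = p0
    for _ in range(iters):
        r = residual(p)
        j = jacobian(p)
        jtj = sum(ji * ji for ji in j)
        jtr = sum(ji * ri for ji, ri in zip(j, r))
        dp = -jtr / (jtj + lam)
        if sum(ri * ri for ri in residual(p + dp)) < sum(ri * ri for ri in r):
            p, lam = p + dp, lam * 0.5   # accept step, reduce damping
        else:
            lam *= 10.0                  # reject step, increase damping
    return p
```

In practice the nine-parameter problem would use a library solver (e.g. a least-squares routine with a full Jacobian) rather than this scalar sketch.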

White Cap Detection Device
20220375115 · 2022-11-24

A device for analyzing a grain sample including a light source, an image sensor, and a controller. The light source is configured for illuminating the grain sample. The image sensor is configured for capturing images of the grain sample. The controller is coupled to the image sensor and is configured for receiving the images of the grain sample therefrom and for analyzing the images to detect at least one material other than grain in the grain sample. The light source is configured for illuminating the grain sample with a local light spot having a size that is smaller than a width of an average wheat kernel. The image analysis and the detection of material other than grain may, at least partly, be performed using trained neural networks and other artificial intelligence algorithms.

Method of calibrating a patient monitoring system for use with a radiotherapy treatment apparatus

Some embodiments are directed to using an image detector of a patient monitoring system to obtain calibration images of a calibration sheet or other calibration object at various orientations and locations. The images are stored and processed to calculate camera parameters defining the location and orientation of the image detector and identifying its internal characteristics, and these parameters are stored. The patient monitoring system can later be re-calibrated by using the image detector to obtain an additional image of a calibration sheet or calibration object. The additional image and the stored camera parameters are then used to detect any apparent change in the internal characteristics of the image detector.
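
The re-calibration check can be sketched as a reprojection test: project the known calibration-object points with the stored camera parameters and compare against where they appear in the new image. The `project` callable and the tolerance are illustrative assumptions, not the patent's method.

```python
import math

def recalibration_needed(project, stored_params, model_points,
                         observed_points, tol=1.0):
    """Flag an apparent change in the detector's internal characteristics:
    reproject each known calibration-object point with the stored camera
    parameters and compare against the observed image point."""
    errors = [
        math.dist(project(stored_params, m), o)
        for m, o in zip(model_points, observed_points)
    ]
    return max(errors) > tol  # re-calibrate if any point drifts past tol
```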