Patent classifications
H04N2213/001
LENS APPARATUS
A lens apparatus includes a lens disposed closest to an object, a holder holding the lens, a cover having a first opening to expose the lens when viewed from an optical axis direction of the lens and being positioned with the holder in the optical axis direction, and an exterior member having a second opening to engage with an outer diameter of the cover. A first gap in a diameter direction orthogonal to the optical axis direction formed between the holder and the cover is larger than a second gap in the diameter direction formed between the exterior member and the cover.
Hybrid lens for head mount display
A lens assembly, related methods and constituent optical elements are described. The assembly may be used to direct and focus light for various applications. In one instance, the lens assembly is used in conjunction with one or more sources of light such as projected images or video as part of a virtual reality system. The lens assembly includes two or more optical elements arranged to receive light or direct light through different spatial regions of the assembly at different focal powers corresponding to a first user viewing zone and a second user viewing zone. In one instance, the first user viewing zone is a peripheral viewing zone and the second viewing zone is a primary or non-peripheral viewing zone (or vice versa).
Grids for LED displays
In example implementations, an apparatus is provided. The apparatus includes a movable light emitting diode (LED) base. A plurality of LED arrays is coupled to the movable LED base. The apparatus includes a grid that includes a plurality of walls. The grid is positioned such that each LED array of the plurality of LED arrays is adjacent to a wall of the plurality of walls.
Imaging apparatus for use in a robotic surgery system
A stereoscopic imaging apparatus for use in a robotic surgery system is disclosed and includes an elongate sheath having a bore. First and second image sensors are adjacently mounted at a distal end of the sheath to capture high-definition images from different perspective viewpoints for generating three-dimensional image information. The image sensors produce an unprocessed digital data signal representing the captured images. A wired signal line transmits the unprocessed digital data signals along the sheath to processing circuitry at a proximal end. The processing circuitry is configured to perform processing operations on the unprocessed digital data signals to produce respective video signals suitable for transmission to a host system or for driving a 3D display. A secondary camera is also disclosed and includes an elongate strip of circuit substrate sized for insertion through a narrow conduit, the strip of circuit substrate connecting between an image sensor and a processing circuit substrate.
IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM
There is provided an image processing apparatus. An obtainment unit obtains a first circular fisheye image accompanied by a first missing region in which no pixel value is present. A generation unit generates a first equidistant cylindrical projection image by performing first equidistant cylindrical transformation processing based on the first circular fisheye image. The generation unit generates the first equidistant cylindrical projection image such that a first corresponding region corresponding to the first missing region has a pixel value in the first equidistant cylindrical projection image.
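The transformation this abstract relies on, mapping a circular fisheye image into an equidistant cylindrical (equirectangular) projection, can be sketched as follows. This is an illustrative reconstruction, not the patented method: it assumes an equidistant fisheye lens model with a 180-degree field of view, and the function name, default resolutions, and zero-fill are assumptions.

```python
import numpy as np

def fisheye_to_equirect(fisheye, out_w=1024, out_h=512, fov_deg=180.0):
    """Map a circular (equidistant-model) fisheye image to an equidistant
    cylindrical projection. Output pixels whose view direction falls
    outside the lens field of view get no source pixel and stay at zero."""
    in_h, in_w = fisheye.shape[:2]
    cx, cy = in_w / 2.0, in_h / 2.0
    radius = min(cx, cy)                    # radius of the fisheye image circle
    max_theta = np.radians(fov_deg) / 2.0   # half the lens field of view

    # Longitude/latitude grid of the output image
    lon = (np.arange(out_w) / out_w - 0.5) * 2.0 * np.pi   # [-pi, pi)
    lat = (0.5 - np.arange(out_h) / out_h) * np.pi         # (pi/2, -pi/2]
    lon, lat = np.meshgrid(lon, lat)

    # Unit view direction; the camera looks along +z
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant model: image-circle radius is proportional to the
    # angle theta between the view direction and the optical axis
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    phi = np.arctan2(y, x)
    r = radius * theta / max_theta

    u = (cx + r * np.cos(phi)).astype(int)
    v = (cy + r * np.sin(phi)).astype(int)

    valid = (theta <= max_theta) & (u >= 0) & (u < in_w) & (v >= 0) & (v < in_h)
    out = np.zeros((out_h, out_w) + fisheye.shape[2:], dtype=fisheye.dtype)
    out[valid] = fisheye[v[valid], u[valid]]
    return out
```

In this naive sketch the region with no corresponding fisheye pixels (the abstract's "missing region") is simply left at zero; the claimed generation unit is distinguished precisely by producing pixel values in that corresponding region.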
VIRTUAL REALITY SURGICAL CAMERA SYSTEM
A system includes a console assembly, a trocar assembly operably coupled to the console assembly, a camera assembly operably coupled to the console assembly and having a stereoscopic camera assembly, and at least one rotational position sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis. The console assembly includes a first actuator and a first actuator pulley operably coupled to the first actuator. The trocar assembly includes a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly operably coupled to the trocar. The camera assembly includes a camera support tube having a distal and a proximal end, the stereoscopic camera assembly operably coupled to the distal end of the support tube, and first and second camera modules having first and second optical axes.
EYEWEAR SURFACE TEMPERATURE EVALUATION
A method and apparatus for monitoring a surface temperature of eyewear proximate a processor, to characterize the surface temperature as a function of executed computer instructions, such as when the computer instructions are modified during software design. A sensor is coupled to the eyewear proximate the processor, such as at a temple of the eyewear that includes the processor, using one or more layers of tape. A server provides instructions to the processor for execution, such as instructions of an application, which vary the utilization of the processor. A testing device, such as a digital multimeter, is coupled to the sensor as well as to the server, and displays the surface temperature as a function of the processor utilization. The surface temperature of the eyewear is monitored to ensure it does not exceed a temperature threshold.
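The evaluation described amounts to logging processor utilization alongside sensor temperature and summarizing one as a function of the other. A minimal sketch, assuming (utilization, temperature) pairs have already been logged; the function name and data layout are illustrative assumptions, not details from the abstract:

```python
def max_temp_per_utilization(log):
    """Summarize logged (cpu_utilization_pct, surface_temp_c) pairs as the
    maximum surface temperature observed at each utilization level, so the
    surface temperature can be read as a function of processor utilization."""
    summary = {}
    for util, temp in log:
        summary[util] = max(temp, summary.get(util, temp))
    return summary
```

Comparing `max(summary.values())` against the chosen temperature threshold then answers whether any instruction mix drove the eyewear surface too hot.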
EXTENDED FIELD-OF-VIEW CAPTURE OF AUGMENTED REALITY EXPERIENCES
Augmented reality experiences of a user wearing an electronic eyewear device are captured by at least one camera on a frame of the electronic eyewear device, the at least one camera having a field of view that is larger than a field of view of a display of the electronic eyewear device. An augmented reality feature or object is applied to the captured scene. A photo or video of the augmented reality scene is captured and a first portion of the captured photo or video is displayed in the display. The display is adjusted to display a second portion of the captured photo or video with the augmented reality features as the user moves the user's head to view the second portion of the captured photo or video. The captured photo or video may be transferred to another device for viewing the larger field of view augmented reality image.
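The display adjustment described, panning a narrower display window across a wider captured field of view as the user's head turns, can be sketched as a crop whose offset follows normalized head rotation. The function name and the linear pan model are assumptions for illustration, not the patented mechanism:

```python
def visible_window(capture_w, capture_h, disp_w, disp_h, yaw_norm, pitch_norm):
    """Return the (x, y, w, h) crop of a wide-field capture to show on a
    narrower display, panned by normalized head rotation in [-1, 1].
    yaw_norm/pitch_norm = 0 centers the window; +/-1 reaches the edges."""
    max_dx = (capture_w - disp_w) / 2.0   # horizontal pan headroom
    max_dy = (capture_h - disp_h) / 2.0   # vertical pan headroom
    x = max_dx + yaw_norm * max_dx
    y = max_dy + pitch_norm * max_dy
    return int(round(x)), int(round(y)), disp_w, disp_h
```

At zero rotation the crop sits at the center of the capture; as rotation approaches the extremes the crop slides to the capture boundary, exposing the second portion of the photo or video.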
STEREOSCOPIC IMAGE DISPLAY DEVICE
A stereoscopic image display device includes a flat panel display unit, a lens array unit, and a light guide structure unit. The light guide structure unit includes a light guide microstructure. The light guide microstructure is disposed on a side of the lens array unit. A bottom angle of the light guide microstructure is defined as B, and a bottom length of the light guide microstructure is defined as P. The bottom angle B and the bottom length P of the light guide microstructure satisfy the following conditions: (i) 15.5 degrees≤B≤83.5 degrees; and (ii) 10 micrometers≤P≤2,000 micrometers, such that an oblique viewing angle of the stereoscopic image display device falls within a range from 10 degrees to 60 degrees.
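The claimed numerical ranges reduce to two interval checks on the microstructure geometry. A trivial sketch (function and parameter names are assumed for illustration):

```python
def satisfies_claimed_conditions(bottom_angle_deg, bottom_length_um):
    """Check whether a light guide microstructure meets the claimed ranges
    for bottom angle B (degrees) and bottom length P (micrometers)."""
    return (15.5 <= bottom_angle_deg <= 83.5) and (10 <= bottom_length_um <= 2000)
```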
Stereo camera and stereophotogrammetric method
A handheld device for the image-based measurement of a remote object comprises a housing having a front side and a rear side; first and second cameras, arranged with a stereo base on the rear side, for recording images of the object; an analysis unit having an algorithm for the stereophotogrammetric analysis of the images of the cameras; and a display unit, arranged on the front side, for displaying images of the object and results of the stereophotogrammetric analysis. The housing has a longitudinal axis, the stereo base is aligned diagonally relative to the longitudinal axis, and the analysis unit is designed to take the relative alignment of the stereo base into consideration during the stereophotogrammetric analysis.
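Taking a diagonal stereo base into consideration can be illustrated by rotating pixel coordinates into the baseline frame before computing disparity, after which standard triangulation Z = f·B/d applies. This is a sketch under ideal pinhole, rectified-pair assumptions; the names and the model are illustrative, not the patented algorithm:

```python
import numpy as np

def depth_from_diagonal_stereo(pt_left, pt_right, baseline_m, base_angle_deg,
                               focal_px):
    """Triangulate depth for a matched point pair from a stereo rig whose
    baseline is rotated by base_angle_deg relative to the image x axis
    (a diagonal stereo base). Rotating pixel coordinates into the baseline
    frame reduces the problem to ordinary horizontal-disparity
    triangulation: Z = focal_px * baseline_m / disparity."""
    a = np.radians(base_angle_deg)
    rot = np.array([[np.cos(a), np.sin(a)],
                    [-np.sin(a), np.cos(a)]])   # rotate into the baseline frame
    d = rot @ np.asarray(pt_left, float) - rot @ np.asarray(pt_right, float)
    disparity = d[0]                            # component along the baseline
    return focal_px * baseline_m / disparity
```

With the base angle set to zero this collapses to the familiar horizontal-baseline formula, which is why the analysis unit must know the alignment of the stereo base relative to the housing.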