Patent classifications
G02B7/287
Positioning method and positioning system
Embodiments of the present application provide a positioning method and a positioning system. The method comprises: obtaining a reference direction corresponding to a user; determining that an eye of the user is gazing at an auxiliary positioning object; acquiring position information of the auxiliary positioning object; acquiring a distance of the user relative to the auxiliary positioning object; acquiring an angle of a sight line direction of the user relative to the reference direction; and obtaining position information of the user according to the position information of the auxiliary positioning object, the distance of the user relative to the auxiliary positioning object, the reference direction, and the angle of the sight line direction of the user relative to the reference direction.
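The geometry described in this abstract can be sketched in a simplified 2D form: if the reference direction is taken as the +x axis (an assumption for illustration), the user's position is the object's position pulled back along the sight line by the measured distance. The function name and coordinate frame below are illustrative, not from the patent.

```python
import math

def user_position(obj_pos, distance, gaze_angle_deg):
    """Recover the user's 2D position from an auxiliary positioning object.

    obj_pos        -- (x, y) of the object the user is gazing at
    distance       -- measured user-to-object distance
    gaze_angle_deg -- angle of the sight line relative to the reference
                      direction (assumed here to be the +x axis)
    """
    theta = math.radians(gaze_angle_deg)
    # The user stands `distance` behind the object along the sight line,
    # so step back from the object's position along that direction.
    return (obj_pos[0] - distance * math.cos(theta),
            obj_pos[1] - distance * math.sin(theta))
```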
Focusing of a camera monitoring a scene
Focusing of a monitoring camera (100) with day and night functionality comprises selecting a focusing day mode or a focusing night mode based on the camera being in day mode or night mode. In focusing day mode, an IR laser range meter (110) measures a reference distance continuously; in focusing night mode, the IR laser range meter measures the reference distance only in response to a focus trigger signal being activated, and only during a predetermined time period. The focus distance of the camera is set based on the measured reference distance.
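The mode-dependent ranging policy reduces to a simple gating decision. A minimal sketch, assuming the "predetermined time period" is a configurable window (the 2-second default below is an illustrative assumption, not from the patent):

```python
def should_range(day_mode, trigger_active, time_since_trigger_s,
                 window_s=2.0):
    """Decide whether the IR laser range meter should take a reading.

    Day mode:   range continuously.
    Night mode: range only while a focus trigger is active and we are
                still inside the predetermined time window.
    """
    if day_mode:
        return True
    return trigger_active and time_since_trigger_s <= window_s
```

Gating the laser in night mode avoids emitting IR that would otherwise appear in the camera's night-mode (IR-sensitive) image stream except when a refocus is actually requested.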
Method and apparatus for independent control of focal vergence and emphasis of displayed and transmitted optical content
A first optic adjusts the focus of environment content in independent regions, and delivers the environment content to a see-through display. The display adds display content, and delivers environment and display content to a second optic. The second optic adjusts the focus of the environment and display contents in independent regions, and delivers the environment and display contents to a viewing point. The focuses of the environment and display contents are adjustable independently of one another and in different regions. Environment content may be similar in focus before and after passing through the first and second optics. Display content may be similar in focus to environment content after the display content passes through the second optic. A modifier also may darken, change opacity, or otherwise modify the environment content, independently in different regions; the display also may brighten, enlarge, or otherwise alter the display content, independently in different regions.
IMAGE-CAPTURING DEVICE, PROPOSAL-IMAGE GENERATING DEVICE, PROPOSAL-IMAGE GENERATING METHOD, PROPOSAL-IMAGE GENERATING PROGRAM, AND STORAGE MEDIUM
Provided is an image-capturing device including: an image-capturing unit that acquires an image of a subject; a subject-extracting unit that separates and extracts a main subject and a background from the acquired image; a distance-information acquiring unit that calculates the distance between the image-capturing unit and the extracted main subject and the distance between the image-capturing unit and the extracted background; a photographing-condition acquiring unit that acquires acquisition-time photographing conditions for an acquisition time of the image and an adjustable photographing condition that can be adjusted with respect to the acquisition-time photographing conditions; a photographing-condition selecting unit that selects a photographing condition that is worth adjusting on the basis of the acquired acquisition-time photographing conditions and adjustable photographing condition and the acquired distances; and an image-processing unit that generates, by performing image processing on the acquired image, an image that will be acquired when the selected photographing condition is set.
Focus adjusting virtual reality headset
A virtual reality headset displays a three-dimensional (3D) virtual scene and includes a varifocal element to dynamically adjust a focal length of an optics block included in the virtual reality headset based on a location in the virtual scene where the user is looking. The headset tracks a user's eyes to approximate gaze lines and determines a plane of focus for a frame of the virtual scene as the intersection of the gaze lines. The varifocal element adjusts the focal length of the optics block so the optics block is focused at the plane of focus, which keeps the user's eyes in a zone of comfort as vergence and accommodation change. Based on the plane of focus, the virtual reality headset may provide depth cues, such as depth of field blur, to planes in the virtual scene deeper in the user's field of view than the plane of focus.
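Determining the plane of focus as the intersection of the two gaze lines can be illustrated with a top-down 2D vergence model. The eye placement, sign convention, and function name below are assumptions for illustration only.

```python
import math

def vergence_depth(ipd_m, left_angle_deg, right_angle_deg):
    """Estimate the depth of the plane of focus from gaze convergence.

    Top-down 2D sketch: the eyes sit at x = -ipd/2 and x = +ipd/2 on the
    z = 0 line; each angle is measured from straight ahead, positive
    toward the nose (convergent). Returns the z-depth (meters) where the
    two gaze lines intersect.
    """
    tl = math.tan(math.radians(left_angle_deg))   # left eye rotates toward +x
    tr = math.tan(math.radians(right_angle_deg))  # right eye rotates toward -x
    # Gaze lines: x = -ipd/2 + z*tl  and  x = +ipd/2 - z*tr.
    # Setting them equal gives z = ipd / (tl + tr).
    return ipd_m / (tl + tr)
```

The varifocal element would then drive the optics block's focal length toward this depth, keeping vergence and accommodation consistent.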
Systems and methods for implementing a tracking camera system onboard an autonomous vehicle
Systems, methods, and non-transitory computer-readable media are provided for implementing a tracking camera system onboard an autonomous vehicle. Coordinate data of an object can be received. The tracking camera system actuates, based on the coordinate data, to a position such that the object is in view of the tracking camera system. Vehicle operation data of the autonomous vehicle can be received. The position of the tracking camera system can be adjusted, based on the vehicle operation data, such that the object remains in view of the tracking camera system while the autonomous vehicle is in motion. A focus of the tracking camera system can be adjusted to bring the object in focus. The tracking camera system captures image data corresponding to the object.
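Actuating the tracking camera toward an object's coordinates amounts to converting a relative 3D offset into pan and tilt angles; re-running the conversion as vehicle operation data updates the camera's pose keeps the object in view. The coordinate frame and function name below are illustrative assumptions.

```python
import math

def pan_tilt_to(target, camera_pos):
    """Compute pan/tilt angles (degrees) that aim the camera at a target.

    target, camera_pos -- (x, y, z) in a shared world frame, with x
    forward, y left, and z up (an assumed convention).
    """
    dx = target[0] - camera_pos[0]
    dy = target[1] - camera_pos[1]
    dz = target[2] - camera_pos[2]
    pan = math.degrees(math.atan2(dy, dx))                 # horizontal aim
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # vertical aim
    return pan, tilt
```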
Image pickup apparatus and method of controlling image pickup apparatus
An image pickup apparatus, including: an observation unit configured to observe an object; a detector configured to detect whether or not a user observes the object through use of the observation unit; and a controller configured to control an actuator, the actuator being configured to drive at least a part of a photography optical system and being capable of performing a first operation and a second operation different from the first operation, the second operation being an operation for maintenance of the actuator, wherein, when the detector detects that the user does not observe the object through use of the observation unit, the controller causes the actuator to perform the second operation.
Interactive projection display
Disclosed are an interactive projection display method and an interactive projection display system. The method comprises: obtaining to-be-displayed location information corresponding to a coordination device; obtaining display information corresponding to the coordination device; generating, according to the display information, virtual display content corresponding to the coordination device; and projecting the virtual display content to a location corresponding to the to-be-displayed location information at a fundus of a user. According to the method and the system, virtual display content of a coordination device, used to interact with a user, is projected near the coordination device, making interaction between the user and the coordination device more convenient.
Electronic device and control method therefor
An electronic device that has a line-of-sight detection function and is capable of suppressing the execution of an operation different from the intention of a user, and a control method therefor are disclosed. The device has a function for detecting coordinates on an image at which a user is gazing as point-of-gaze coordinates. The device, in a case where a movement of the point-of-gaze coordinates has been detected, measures a duration of the point-of-gaze coordinates after the movement. The device further determines that the movement of the point-of-gaze coordinates with the duration larger than a time threshold is a viewpoint movement that is intended by the user, and determines that the movement of the point-of-gaze coordinates with the duration equal to or smaller than the time threshold is a viewpoint movement that is not intended by the user.
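The intent classification in this abstract reduces to a dwell-time comparison against a threshold. A minimal sketch, where the threshold value is an assumption for illustration (the patent does not specify one):

```python
def is_intended_move(dwell_s, threshold_s=0.3):
    """Classify a point-of-gaze movement.

    Per the abstract's rule: a dwell strictly longer than the time
    threshold is an intended viewpoint movement; a dwell equal to or
    smaller than the threshold is treated as unintended.
    """
    return dwell_s > threshold_s
```

Filtering out short dwells this way suppresses actions triggered by involuntary saccades rather than deliberate gaze shifts.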