Patent classifications
H04N5/247
SYSTEMS AND METHODS FOR BIOMECHANICALLY-BASED EYE SIGNALS FOR INTERACTING WITH REAL AND VIRTUAL OBJECTS
Systems and methods are provided for discerning the intent of a device wearer primarily based on movements of the eyes. The system may be included within unobtrusive headwear that performs eye tracking and controls screen display. The system may also utilize remote eye tracking camera(s), remote displays and/or other ancillary inputs. Screen layout is optimized to facilitate the formation and reliable detection of rapid eye signals. The detection of eye signals is based on tracking physiological movements of the eye that are under voluntary control by the device wearer. The detection of eye signals results in actions that are compatible with wearable computing and a wide range of display devices.
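The abstract above hinges on detecting physiological eye movements that are under voluntary control. A standard way to separate such rapid, deliberate movements (saccades) from fixations is a velocity-threshold classifier; the sketch below is illustrative only and not taken from the patent, and the 100 deg/s threshold is an assumed example value.

```python
# Illustrative sketch (not from the patent): a minimal velocity-threshold
# classifier that separates rapid saccades, the building blocks of
# deliberate eye signals, from fixations in a stream of gaze samples.

def classify_gaze(samples, dt, velocity_threshold=100.0):
    """Label each inter-sample movement as 'saccade' or 'fixation'.

    samples: list of (x, y) gaze positions in degrees of visual angle.
    dt: sampling interval in seconds.
    velocity_threshold: angular speed (deg/s) above which movement is
        treated as a saccade; 100 deg/s is an assumed example value.
    """
    labels = []
    for i in range(1, len(samples)):
        (x0, y0), (x1, y1) = samples[i - 1], samples[i]
        # Angular speed between consecutive samples.
        speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append("saccade" if speed > velocity_threshold else "fixation")
    return labels
```

A real eye-signal system would layer sequence detection on top of such labels, mapping particular saccade patterns onto screen targets to trigger actions.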
Computer Vision Based Driver Assistance Devices, Systems, Methods and Associated Computer Executable Code
The present invention includes computer vision based driver assistance devices, systems, methods and associated computer executable code (hereinafter collectively referred to as: “ADAS”). According to some embodiments, an ADAS may include one or more fixed image/video sensors and one or more adjustable or otherwise movable image/video sensors, characterized by different dimensions of fields of view. According to some embodiments of the present invention, an ADAS may include improved image processing. According to some embodiments, an ADAS may also include one or more sensors adapted to monitor/sense an interior of the vehicle and/or the persons within. An ADAS may include one or more sensors adapted to detect parameters relating to the driver of the vehicle and processing circuitry adapted to assess mental conditions/alertness of the driver and directions of driver gaze. These may be used to modify ADAS operation/thresholds.
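The abstract states that assessed driver alertness and gaze direction may be used to modify ADAS operation and thresholds. One plausible form of such modification, sketched below purely as an assumed example (the patent does not specify this formula), is scaling a forward-warning distance so a distracted or drowsy driver is warned earlier.

```python
# Illustrative sketch (assumed logic, not the patent's): scale a warning
# threshold by driver alertness and gaze direction, so a distracted or
# drowsy driver is warned earlier.

def warning_distance(base_distance_m, alertness, gaze_on_road):
    """Return the distance at which to trigger a forward warning.

    base_distance_m: nominal warning distance for a fully alert driver.
    alertness: estimated alertness from 0.0 (drowsy) to 1.0 (fully alert).
    gaze_on_road: True if the driver's gaze is directed at the road ahead.
    """
    factor = 1.0 + (1.0 - alertness)   # warn earlier when less alert
    if not gaze_on_road:
        factor += 0.5                  # and earlier still when looking away
    return base_distance_m * factor
```

For example, a nominal 30 m threshold doubles for a half-alert driver who is looking away from the road.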
LED And/Or Laser Light Device Has Projection
An LED and/or laser light source for a bulb or light device, such as a garden light, that has one or more optic lenses and a top cover shaped as a flat surface, half ball, two-thirds ball, sphere, or dome. A laser light source incorporates a flat top protective lens and a laser film or grating film to enlarge or create a plurality of images or lighted patterns. An LED light source can have a projection assembly that is built in, added on, or assembled inside the light device. The device can further incorporate flexible, bendable arms to change the position, direction, and orientation of the LED and/or laser light beam. The light device can also offer nearby and far-away illumination, and/or lighted image and pattern projection with desired light effects, by rotating the optic lenses or LEDs. Desired effects can also be obtained when the LEDs are controlled by an IC or circuitry that changes their color or switches them on and off at desired times, durations, and cycles. The light device can further have one or more functions selected from a USB charger, power-failure backup, RF remote control, infrared controller, Bluetooth, Wi-Fi, internet, app software, motion sensor, and wireless multi-way communication. The light device may also have a recharging circuit, batteries or rechargeable batteries, and USB ports so that the LED and/or laser bulb can be charged or can supply current to other devices.
SEWING MACHINE, STITCHING PATTERN DISPLAY METHOD, AND RECORDING MEDIUM FOR STORING PROGRAM
A sewing machine includes a first image acquisition unit, a second image acquisition unit, and a display unit. In the sewing machine, the first image acquisition unit performs image acquisition from the front face side of a cloth for a stitching pattern formed in the cloth. The second image acquisition unit performs image acquisition from the back face side of the cloth such that the center position of the acquired image matches the center position of the image of the stitching pattern acquired by the first image acquisition unit. The display unit displays, on the same screen, a first stitching pattern video image acquired by the first image acquisition unit and a second stitching pattern video image acquired by the second image acquisition unit.
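The core operation here is presenting two center-aligned views of the same stitching pattern on one screen. The sketch below is an illustrative stand-in, not the patent's implementation: plain nested lists represent the frames, both views are cropped to a common size about their centers, and the rows are joined for side-by-side display.

```python
# Illustrative sketch (not from the patent): crop the front- and back-face
# frames to a common size about their centers, then join them row-wise so
# both views of the stitching pattern appear on one display canvas.

def crop_centered(img, h, w):
    """Crop an image (list of pixel rows) to h x w about its center."""
    top = (len(img) - h) // 2
    left = (len(img[0]) - w) // 2
    return [row[left:left + w] for row in img[top:top + h]]

def side_by_side(front, back):
    """Center-crop both frames to the smaller common size and join them."""
    h = min(len(front), len(back))
    w = min(len(front[0]), len(back[0]))
    f, b = crop_centered(front, h, w), crop_centered(back, h, w)
    return [fr + br for fr, br in zip(f, b)]
```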
AUTOMATED RADIAL IMAGING AND ANALYSIS SYSTEM
A system for imaging and analyzing a vehicle may include a frame having a central passage, wherein the central passage is configured and dimensioned to allow a vehicle to pass through. The frame may include, for example, a pair of substantially vertical legs connected at the top by a cross member, wherein the legs and cross member define the central passage. One or more bollards may be positioned in front of and/or behind the frame. A plurality of cameras within each leg, cross member, and/or bollard may be directed toward the passage to record video images of a passing vehicle. Integrated LED array panels may provide bands of light to aid in the detection of surface anomalies, for example by simultaneous analysis of symmetrical sides of the vehicle.
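The abstract's "simultaneous analysis of symmetrical sides" suggests comparing one side of the vehicle against a mirror image of the other. The sketch below is an assumed illustration of that idea, not the patent's method; the intensity tolerance is a made-up parameter.

```python
# Illustrative sketch (assumed approach, not the patent's): flag candidate
# surface anomalies by mirroring the image of one side of the vehicle and
# comparing it pixel-by-pixel with the nominally symmetrical opposite side.

def asymmetry_mask(left, right, tol=10):
    """Return a boolean mask marking pixels that differ beyond tol.

    left, right: grayscale images as lists of pixel rows; right is
    mirrored horizontally before comparison. tol is an assumed
    intensity tolerance.
    """
    mirrored = [row[::-1] for row in right]
    return [[abs(a - b) > tol for a, b in zip(lr, mr)]
            for lr, mr in zip(left, mirrored)]
```

True pixels mark where the two sides disagree, i.e. where a dent or scratch may break the vehicle's left-right symmetry.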
System and method for image stitching
A system for stitching images together is disclosed. The images are sometimes referred to as frames, such as frames in a video sequence. The system comprises one or more imagers (e.g., cameras) that work in coordination with a matching number of custom code modules. The system achieves image stitching using approximately one third of the field of view (FOV) of each imager and by increasing the number of imagers above a predetermined threshold. The system displays the stitched images or frames on a computer monitor, in both still-image and video contexts. Normally these tasks would involve a great deal of computation, but the system achieves these effects while managing the computational load. In stitching the images together, it is sometimes necessary to introduce some image distortion (faceting) in the combined image. The system ensures no gaps in any captured view, and helps the viewer achieve full situational awareness.
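The trade-off the abstract describes has a simple arithmetic core: if only about one third of each imager's FOV contributes unique coverage, the imager count needed for gap-free coverage rises threefold. The sketch below illustrates that arithmetic under stated assumptions; it is not the patent's stitching algorithm.

```python
# Illustrative sketch (assumed arithmetic, not the patent's method): if only
# usable_fraction of each imager's field of view contributes unique coverage,
# more imagers are needed for gap-free coverage of a given angular span.

import math

def imagers_needed(total_coverage_deg, fov_deg, usable_fraction=1 / 3):
    """Number of imagers for gap-free coverage of total_coverage_deg,
    when each imager contributes only usable_fraction of its FOV."""
    usable = fov_deg * usable_fraction
    # Round first to absorb floating-point error, then take the ceiling.
    return math.ceil(round(total_coverage_deg / usable, 6))
```

For instance, full 360-degree coverage with 90-degree imagers takes 4 imagers at full FOV but 12 when only a third of each FOV is used for unique coverage.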
System for Monitoring the Surroundings of a Motor Vehicle
The invention relates to a system (1) for monitoring the surroundings of a motor vehicle (100), in particular an autonomous or semi-autonomous motor vehicle. The system (1) includes at least one optical image capturing device (2) as well as a lighting device (3, 4), wherein the optical image capturing device (2) is arranged to capture an area of coverage (E1) of the surroundings, the area of coverage (E1) can be illuminated at least partially, preferably completely, by the lighting device (3, 4), and the lighting device (3, 4) is arranged to generate a motor vehicle light distribution or part of a motor vehicle light distribution. The system (1) further includes at least one additional optical image capturing device (5), which is arranged to capture an additional area of coverage (E2), also called the second area of coverage, and an additional lighting device (6), which is arranged to illuminate the additional area of coverage (E2) at least partially, preferably completely.
SYSTEM FOR EVALUATING THE STATE OF THE SURFACE OF A TIRE
The invention concerns a system for evaluating the surface of a tyre (10), comprising: a region (21) for entry of the tyre into the system, a capture region, and an exit region (22) distinct from the entry region; means (23) for moving a tyre and holding it in position; means for illuminating a sidewall and the crown of the tyre in the capture region; means for acquiring a visual image of the tyre in the capture region; and means for processing the acquired image, at least one acquisition means being installed on a shaft that is movable with respect to the tyre installed in the capture region.
CAMERA DEVICE AND CLEANING ROBOT
A cleaning robot includes a machine body, a perception system, a control system, and a driving system. The perception system includes a laser distance sensor and a camera; the laser distance sensor is located on a top surface of the cleaning robot, and the camera is mounted on the cleaning robot through a mounting bracket, with a field of view that includes the traveling direction of the cleaning robot. The camera device, applied to the cleaning robot, provides good shockproof performance and good stability. In addition, when the distance between the optical axes of two cameras changes, the cameras can be replaced and recalibrated at any time, facilitating maintenance and repair.
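The distance between the optical axes of two cameras is the stereo baseline, and the abstract's point about recalibration after a baseline change can be illustrated with the standard stereo depth relation depth = f * B / d. The sketch below is an assumed illustration, not the patent's procedure: it recovers the current baseline from the disparity of a target at a known distance and flags drift beyond a tolerance.

```python
# Illustrative sketch (not from the patent): use a calibration target at a
# known distance to estimate the current stereo baseline from observed
# disparity, and flag the camera pair for recalibration when the baseline
# drifts from its nominal value.

def baseline_from_disparity(disparity_px, focal_px, depth_m):
    """Invert depth = f * B / d to recover the baseline B in metres."""
    return disparity_px * depth_m / focal_px

def needs_recalibration(disparity_px, focal_px, depth_m,
                        nominal_baseline_m, tol_m=0.001):
    """True if the estimated baseline deviates from nominal by > tol_m."""
    estimated = baseline_from_disparity(disparity_px, focal_px, depth_m)
    return abs(estimated - nominal_baseline_m) > tol_m
```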
REALITY CAPTURE DEVICE
A reality capture device configured to perform a measuring process for generating a digital representation of an environment, the device comprising a body defining a first axis and an imaging unit with one or more 2D cameras configured to provide 2D image data of the environment. The device comprises a ToF camera arrangement configured for capturing 3D point-cloud data of the environment and comprising at least two time-of-flight cameras. Each time-of-flight camera comprises a sensor array and one or more laser emitters, the sensor array of each time-of-flight camera having an optical axis and being configured to receive reflections of light pulses emitted by the one or more laser emitters of the respective time-of-flight camera. The time-of-flight cameras are arranged around the first axis so that each sensor array has one or two other sensor arrays as a neighbouring sensor array.
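The geometric constraint in this abstract — cameras arranged around one axis so each sensor array has one or two neighbours — can be made concrete with a small sketch. The even angular spacing below is an assumption for illustration; the patent only requires the ring arrangement, not equal spacing.

```python
# Illustrative sketch (assumed geometry, not the patent's design): place n
# time-of-flight cameras at equal yaw angles around the body's first axis
# and list each sensor array's neighbouring arrays in the ring.

def camera_yaws(n):
    """Yaw angle (degrees) of each of n cameras spaced evenly about one axis."""
    return [i * 360.0 / n for i in range(n)]

def neighbours(n, i):
    """Indices of the one or two neighbouring cameras of camera i."""
    if n == 2:
        return [1 - i]                      # two cameras: one neighbour each
    return [(i - 1) % n, (i + 1) % n]       # ring of n >= 3: two neighbours
```

With exactly two cameras each array has a single neighbour; with three or more, every array sits between two neighbours, matching the "one or two" wording of the abstract.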