
ELECTRONIC DEVICE

Provided is an electronic device capable of suppressing the influence of internally reflected light within the device. The electronic device includes, in order from one side to the other: a first polarizing plate that converts incident light into linearly polarized light; a first quarter-wave plate whose slow axis is offset from the absorption axis of the first polarizing plate by 45 or 135 degrees; a self-luminous element layer; a second quarter-wave plate whose slow axis is parallel to the slow axis of the first quarter-wave plate; a second polarizing plate whose absorption axis is orthogonal to that of the first polarizing plate; and an imaging device that captures light transmitted through the second polarizing plate.
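
The suppression mechanism can be illustrated with Jones calculus. The sketch below is a toy model of ideal components (the matrices and conventions are ours, not from the patent): the two quarter-wave plates with parallel slow axes act together as a half-wave plate, so light traversing the full stack passes the crossed second polarizer and reaches the imager, while light reflected inside the device back toward the first polarizer returns with orthogonal polarization and is extinguished.

```python
import numpy as np

def rot(theta):
    """2x2 rotation matrix (angle in radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def polarizer(theta):
    """Jones matrix of an ideal linear polarizer, transmission axis at theta."""
    R = rot(theta)
    return R @ np.array([[1, 0], [0, 0]]) @ R.T

def qwp(theta):
    """Jones matrix of a quarter-wave plate, fast axis at theta."""
    R = rot(theta)
    return R @ np.diag([1.0, 1.0j]) @ R.T

mirror = np.diag([1.0, -1.0])   # specular reflection flips circular handedness
hor = np.array([1.0, 0.0])      # light after the first polarizer (axis at 0 deg)

# Forward path to the imager: two QWPs with the same slow axis form a
# half-wave plate that rotates the polarization by 90 degrees, which then
# passes the second (crossed) polarizer.
forward = polarizer(np.pi / 2) @ qwp(np.pi / 4) @ qwp(np.pi / 4) @ hor
I_transmitted = (np.abs(forward) ** 2).sum()

# Internally reflected path: out through QWP1, reflect, back through QWP1
# (whose fast axis appears at -45 deg for the reversed direction), then the
# first polarizer, which blocks the now-orthogonal polarization.
back = polarizer(0.0) @ qwp(-np.pi / 4) @ mirror @ qwp(np.pi / 4) @ hor
I_reflected = (np.abs(back) ** 2).sum()
```

With ideal elements the round-trip intensity is exactly zero, which is the classic circular-polarizer anti-reflection effect the stack exploits.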

WATER AREA OBJECT DETECTION SYSTEM AND MARINE VESSEL
20230228576 · 2023-07-20

A water area object detection system includes a first imager that images an object around a hull; a second imager provided on the hull, with its imaging direction the same as, or substantially the same as, that of the first imager, and operable to image the object around the hull; and a controller configured or programmed to create a water area map around the hull based on images captured by the first and second imagers. The second imager is spaced apart from the first imager in the upward-downward direction of the hull, and the first imager is spaced apart from the second imager in the imaging direction, so that the two imagers do not overlap in the upward-downward direction perpendicular to the imaging direction.
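
Two imagers separated along the hull's vertical axis permit range estimation by triangulation, one plausible ingredient of the water area map. A minimal pinhole-model sketch (focal length and baseline are assumed values, not from the patent):

```python
# Hypothetical camera parameters; the patent does not give specifics.
FOCAL_PX = 1400.0    # focal length in pixels (assumed)
BASELINE_M = 0.8     # vertical spacing between the two imagers (assumed)

def distance_from_vertical_disparity(disparity_px):
    """Triangulate range to an object from its vertical pixel disparity
    between the upper and lower imagers (simple pinhole model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_PX * BASELINE_M / disparity_px

# An object whose image shifts 28 px between the two views lies 40 m away.
range_m = distance_from_vertical_disparity(28.0)
```

Ranges computed this way for detected objects could then be rasterized into a map of open versus obstructed water around the hull.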

DISPLAY DEVICE

A display device includes first and second unit pixels adjacent to each other in a first direction. Each of the first and second unit pixels includes a first group disposed in a first row and a second group disposed in a second row spaced apart from the first row. The first group includes a first light emitting area emitting a first light and a second light emitting area emitting a second light. The second group includes a light receiving area and third and fourth light emitting areas spaced apart from each other with the light receiving area interposed therebetween, each emitting a third light. The fourth light emitting area of the first unit pixel and the third light emitting area of the second unit pixel are spaced apart from each other by a distance of about 25 micrometers to about 100 micrometers in the first direction.

AUTOMATICALLY CLASSIFYING ANIMAL BEHAVIOR

Systems and methods are disclosed to objectively identify sub-second behavioral modules in three-dimensional (3D) video data that represents the motion of a subject. Defining behavioral modules based upon structure in the 3D video data itself—rather than using a priori definitions of what should constitute a measurable unit of action—identifies a previously unexplored sub-second regularity that defines a timescale upon which behavior is organized, yields important information about the components and structure of behavior, offers insight into the nature of behavioral change in the subject, and enables objective discovery of subtle alterations in patterned action. The systems and methods of the invention can be applied to drug or gene therapy classification, drug or gene therapy screening, disease study including early detection of the onset of a disease, toxicology research, side-effect study, learning and memory process study, anxiety study, and analysis of consumer behavior.
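
The core idea—letting recurring structure in the motion data itself define the behavioral units—can be illustrated with a toy pipeline: window a pose trajectory at a sub-second scale, describe each window by a simple statistic, and cluster the windows into recurring modules. The windowing, feature, and minimal k-means below are our illustration, not the patent's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for depth-camera pose features: a 2-D trajectory alternating
# between two synthetic "modules" (fast locomotion vs. jitter in place).
blocks = []
for b in range(6):
    if b % 2 == 0:
        blocks.append(np.cumsum(rng.normal(0.5, 0.05, (30, 2)), axis=0))
    else:
        blocks.append(rng.normal(0.0, 0.05, (30, 2)))
traj = np.vstack(blocks)

# Describe each short window (~1/3 s at 30 fps) by its mean step length.
win = 10
feats = np.array([
    [np.linalg.norm(np.diff(traj[i:i + win], axis=0), axis=1).mean()]
    for i in range(0, len(traj) - win + 1, win)
])

# Minimal k-means (k=2) groups windows into recurring modules.
centers = np.array([[feats.min()], [feats.max()]])
for _ in range(20):
    labels = np.argmin(np.abs(feats - centers.T), axis=1)
    centers = np.array([feats[labels == c].mean(axis=0) for c in (0, 1)])
```

The recovered `labels` sequence segments the recording into alternating modules without any a priori definition of what a "behavior" is; the actual invention operates on far richer 3D pose data and more principled models.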

ROBOTIC SURGICAL SYSTEM WITH VIRTUAL CONTROL PANEL FOR TOOL ACTUATION
20230225607 · 2023-07-20

A surgical system includes a detector comprising an array of pixels configured to detect light reflected by a surgical instrument and to generate a first signal comprising a first dataset representative of a visible image of the surgical instrument. The surgical system also includes a processor configured to receive the first signal and generate a modified image of the surgical instrument that includes a control panel. The control panel includes one or more control elements representative of one or more operating parameters of the surgical instrument. The processor is further configured to receive an input to the control panel from a user, the input being effective to change one of the operating parameters. The processor is also configured to generate a command signal based on the input to change the one of the operating parameters.
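
The overlay-then-command flow can be sketched as a small state machine: render the control elements over the visible frame, accept a user input against one element, and emit a command signal. All names and the command format below are illustrative assumptions, not from the patent.

```python
class VirtualControlPanel:
    """Toy model of the virtual control panel overlaid on the instrument image."""

    def __init__(self, parameters):
        # Operating parameters shown as control elements (e.g. energy level).
        self._params = dict(parameters)

    def render_overlay(self, frame):
        """Return the 'modified image': the visible frame plus control elements."""
        return {"frame": frame, "controls": dict(self._params)}

    def handle_input(self, name, value):
        """Apply a user input to one control element and emit a command signal."""
        if name not in self._params:
            raise KeyError(name)
        self._params[name] = value
        return {"command": "set_parameter", "name": name, "value": value}

panel = VirtualControlPanel({"energy_level": 3, "jaw_force": 0.5})
overlay = panel.render_overlay("frame_0001")
cmd = panel.handle_input("energy_level", 5)
```

The emitted `cmd` dictionary stands in for the command signal that would drive the instrument's actuation hardware.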

Method and system for time-of-flight imaging with high lateral resolution

An image capturing system includes a light source configured to emit light toward an object or scene that is to be imaged. The system also includes a time-of-flight image sensor configured to receive light signals based on reflected light from the object or scene. The system also includes a processor operatively coupled to the light source and the time-of-flight image sensor. The processor is configured to perform compressive sensing of the received light signals. The processor is also configured to generate an image of the object or scene based at least in part on the compressive sensing of the received light signals.
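
Compressive sensing recovers a scene from fewer coded measurements than pixels by exploiting sparsity. The patent does not specify a reconstruction algorithm; the sketch below uses one common choice, iterative soft thresholding (ISTA) for the lasso, on a synthetic sparse scene with a random sensing matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 64, 32, 4                 # scene pixels, coded measurements, sparsity
x_true = np.zeros(n)                # sparse scene (e.g. few strong reflectors)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

A = rng.normal(0, 1, (m, n)) / np.sqrt(m)   # random sensing/mask matrix
y = A @ x_true                               # compressed measurements

# ISTA: gradient step on the data term, then soft-threshold for sparsity.
L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
lam = 0.01
x = np.zeros(n)
for _ in range(500):
    x = x + (A.T @ (y - A @ x)) / L
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

Here half as many measurements as pixels suffice to recover the 4-sparse scene closely; in the ToF setting the same principle lets a low-resolution sensor paired with coded illumination or masks yield a higher lateral-resolution depth image.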

SYSTEMS AND METHODS FOR FUSING DATA FROM SINGLE PIXEL THERMOPILES AND PASSIVE INFRARED SENSORS FOR COUNTING OCCUPANTS IN OPEN OFFICES
20230016414 · 2023-01-19

A system for determining occupancy in an environment is provided. The system includes a plurality of sensor bundles, each bundle including a presence sensor and a motion sensor. The system further includes a controller in communication with each sensor bundle. The controller is configured to designate one of the sensor bundles as presence triggered if persons are present within a field of view of the presence sensor. The controller is further configured to designate one of the sensor bundles as motion triggered if persons are moving within a field of view of the motion sensor. The controller is further configured to determine a triggered bundle count of the sensor bundles which are both presence triggered and motion triggered. The controller is further configured to determine an occupancy count for the environment based upon the triggered bundle count.
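
The fusion logic is simple to sketch: a bundle counts toward occupancy only when its presence sensor (thermopile) and its motion sensor (PIR) agree, which rejects false positives either sensor would produce alone. Field names and the one-person-per-bundle mapping below are assumptions for illustration, not from the patent.

```python
def count_occupants(bundles, persons_per_bundle=1):
    """Count sensor bundles that are BOTH presence- and motion-triggered,
    then map the triggered-bundle count to an occupancy estimate."""
    triggered = sum(
        1 for b in bundles
        if b["presence_triggered"] and b["motion_triggered"]
    )
    return triggered * persons_per_bundle

bundles = [
    {"presence_triggered": True,  "motion_triggered": True},   # occupied desk
    {"presence_triggered": True,  "motion_triggered": False},  # warm chair, no one present
    {"presence_triggered": False, "motion_triggered": True},   # air draft, no heat signature
    {"presence_triggered": True,  "motion_triggered": True},   # occupied desk
]
estimate = count_occupants(bundles)
```

Requiring both triggers drops the two ambiguous bundles, yielding an estimate of two occupants for the four-bundle office above.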

INFORMATION PROCESSING METHOD, NON-TRANSITORY STORAGE MEDIUM, AND INFORMATION PROCESSING SYSTEM
20230018515 · 2023-01-19

An information processing method includes a detection step of detecting a target based on a distance image of a monitoring region, and a stay decision step of deciding whether the target has stayed in the region. The stay decision is made based on an index indicating the positional change of the target over time.
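
One natural form for such an index is the target's accumulated displacement over a sliding window of detected positions: small total movement over time indicates a stay. The window length, threshold, and function names below are illustrative assumptions, not from the patent.

```python
import math

def stay_detected(track, window=5, max_displacement=0.3):
    """Return True if the target's accumulated movement over the last
    `window` positions stays below `max_displacement` (metres), i.e. the
    positional-change index indicates the target is lingering."""
    if len(track) < window:
        return False
    recent = track[-window:]
    moved = sum(
        math.dist(recent[i], recent[i + 1]) for i in range(window - 1)
    )
    return moved < max_displacement

# Centroid tracks (x, y in metres) extracted from successive distance images.
moving = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.1), (1.6, 0.1), (2.2, 0.2)]
lingering = [(2.2, 0.2), (2.21, 0.2), (2.2, 0.21), (2.22, 0.2), (2.2, 0.2)]
```

A walking target accumulates metres of displacement per window and is not flagged, while a target jittering within a few centimetres falls below the threshold and triggers the stay decision.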