G06T7/269

Systems and Methods for Measuring Vital Signs Using Multimodal Health Sensing Platforms

Systems and methods for measuring vital signs in accordance with embodiments of the invention are illustrated. One embodiment includes a method for measuring vital signs. The method includes steps for identifying regions of interest (ROIs) from video data of an individual, generating temporal waveforms from the ROIs, analyzing the generated temporal waveforms to extract vital sign measurements, and generating outputs based on the analyzed temporal waveforms.
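
The abstract describes a pipeline (ROI selection, temporal waveform generation, waveform analysis) typical of camera-based vital sign sensing. Below is a minimal, hypothetical Python sketch of such a pipeline for pulse-rate estimation; the fixed ROI coordinates, the green-channel mean as the temporal waveform, and the FFT-based frequency analysis are illustrative assumptions, not details taken from the abstract.

```python
# Hypothetical sketch: extract an ROI from each video frame, build a temporal
# waveform from it, and estimate a vital sign (pulse rate) from the waveform's
# dominant frequency.
import cv2
import numpy as np

def estimate_pulse_rate(video_path, roi=(100, 100, 60, 60)):
    x, y, w, h = roi  # assumed forehead region of interest
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    samples = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Temporal waveform: mean green-channel intensity inside the ROI.
        samples.append(frame[y:y + h, x:x + w, 1].mean())
    cap.release()
    signal = np.asarray(samples) - np.mean(samples)
    # Dominant frequency within a plausible heart-rate band (0.7-4 Hz).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]  # beats per minute
```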

TECHNIQUES TO PLACE OBJECTS USING NEURAL NETWORKS

Apparatuses, systems, and techniques to place one or more objects in a location and orientation. In at least one embodiment, one or more circuits are to use one or more neural networks to cause one or more autonomous devices to place one or more objects in a location and orientation based, at least in part, on one or more images of the location and orientation.
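
The abstract is deliberately broad, but its core idea (a neural network maps images of a target location to a placement pose that an autonomous device executes) can be sketched speculatively. The network architecture below and the 7-DoF position-plus-quaternion output are illustrative assumptions, not details from the claims.

```python
# Speculative sketch: a small CNN maps an image of the target location to a
# placement pose (3-D position + unit-quaternion orientation).
import torch
import torch.nn as nn

class PlacementNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # 3 position coordinates + 4 orientation quaternion components.
        self.head = nn.Linear(32, 7)

    def forward(self, images):
        pose = self.head(self.backbone(images))
        position, quaternion = pose[:, :3], pose[:, 3:]
        # Normalize so the orientation is a valid unit quaternion.
        quaternion = quaternion / quaternion.norm(dim=1, keepdim=True)
        return position, quaternion
```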

VEHICLE USING FULL-VELOCITY DETERMINATION WITH RADAR

A computer includes a processor and a memory storing instructions executable by the processor to receive radar data including a radar pixel having a radial velocity from a radar; receive camera data including an image frame including camera pixels from a camera; map the radar pixel to the image frame; generate a region of the image frame surrounding the radar pixel; determine association scores for the respective camera pixels in the region; select a first camera pixel of the camera pixels from the region, the first camera pixel having a greatest association score of the association scores; and calculate a full velocity of the radar pixel using the radial velocity of the radar pixel and a first optical flow at the first camera pixel. The association scores indicate a likelihood that the respective camera pixels correspond to a same point in an environment as the radar pixel.
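
The steps in this abstract map fairly directly onto code. The sketch below is a simplified reconstruction: it assumes the radar pixel has already been projected into the image frame, `scores` stands in for whatever association scoring the claims cover, and the closed-form combination of radial velocity and optical flow is a plausible reconstruction, not the patented formula.

```python
# Simplified sketch of the camera-radar velocity fusion described above.
import numpy as np

def full_velocity(radar_px, radial_vel, flow, scores, depth, fx, fy, half=8):
    u, v = radar_px
    # Region of the image frame surrounding the radar pixel.
    ys, xs = np.mgrid[v - half:v + half + 1, u - half:u + half + 1]
    region_scores = scores[ys, xs]
    # Camera pixel with the greatest association score in the region.
    k = np.unravel_index(np.argmax(region_scores), region_scores.shape)
    best_x, best_y = xs[k], ys[k]
    du, dv = flow[best_y, best_x]  # optical flow at the selected pixel
    # Tangential velocity from optical flow, scaled by depth and focal
    # length; the radar's radial velocity supplies the component along
    # the viewing ray (assumed here to be near the optical axis).
    return np.array([du * depth / fx, dv * depth / fy, radial_vel])
```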

SYSTEMS AND METHODS FOR STABILIZING VIDEOS USING OPTICAL FLOW DERIVED MOTION AND MOTION SOLVE DERIVED MOTION
20230232101 · 2023-07-20

An image capture device may capture a video while experiencing motion. Motion of the image capture device during video capture may be determined based on optical flow and structure from motion solve of the video. Optical flow derived motion and/or the motion solve derived motion of the image capture device may be selected to perform stabilization of the video.
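
A rough sketch of the selection idea follows: per frame, choose between motion derived from optical flow and motion derived from a structure-from-motion solve, then stabilize with the chosen estimate. The confidence-based selection rule and the smoothing step are assumptions for illustration, not the patented criterion.

```python
# Hedged sketch: select per-frame motion, smooth the chosen camera path, and
# return the stabilizing correction.
import numpy as np

def stabilization_correction(flow_motion, sfm_motion, sfm_confidence,
                             conf_threshold=0.5):
    """flow_motion, sfm_motion: (N, 2) per-frame translations.
    sfm_confidence: (N,) solve confidence in [0, 1] (assumed available)."""
    selected = np.where(
        sfm_confidence[:, None] >= conf_threshold, sfm_motion, flow_motion
    )
    # Smooth the chosen trajectory; the stabilizing correction is the
    # difference between the smoothed and observed paths.
    path = np.cumsum(selected, axis=0)
    kernel = np.ones(9) / 9.0
    smoothed = np.stack(
        [np.convolve(path[:, i], kernel, mode="same") for i in range(2)],
        axis=1,
    )
    return smoothed - path  # per-frame correction to apply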

VOLUMETRIC SAMPLING WITH CORRELATIVE CHARACTERIZATION FOR DENSE ESTIMATION
20230222673 · 2023-07-13

Systems and techniques are described herein for performing optical flow estimation for one or more frames. For example, a process can include determining an optical flow prediction associated with a plurality of frames. The process can include determining a position of at least one feature associated with a first frame and determining, based on the position of the at least one feature in the first frame and the optical flow prediction, a position estimate of a search area for searching for the at least one feature in a second frame. The process can include determining, from within the search area, a position of the at least one feature in the second frame.
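
A minimal sketch of the described search: shift a feature's first-frame position by the optical flow prediction to center a search area in the second frame, then locate the feature inside that area. Template matching is used here purely as a stand-in feature locator; it is not specified by the abstract.

```python
# Sketch: flow-guided search area for feature localization in the next frame.
# Assumes frame2 and template are same-dtype grayscale images.
import cv2
import numpy as np

def locate_in_next_frame(frame2, template, pos, flow_pred, search_half=24):
    x, y = pos
    dx, dy = flow_pred[y, x]                  # predicted motion at the feature
    cx, cy = int(round(x + dx)), int(round(y + dy))  # search-area center
    th, tw = template.shape[:2]
    x0 = max(cx - search_half, 0)
    y0 = max(cy - search_half, 0)
    window = frame2[y0:y0 + 2 * search_half + th,
                    x0:x0 + 2 * search_half + tw]
    scores = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(scores)
    # Position of the feature in the second frame (window offset + match).
    return x0 + best[0], y0 + best[1]
```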

SYSTEMS AND METHODS FOR GENERATING TIME-LAPSE VIDEOS
20230216987 · 2023-07-06

Positions of an image capture device may be used to determine a position-based time-lapse video frame rate factor. Apparent motion between pairs of images may be used to determine a visual-based time-lapse video frame rate factor. A time-lapse video frame rate for a time-lapse video may be determined based on the position-based time-lapse video frame rate factor and the visual-based time-lapse video frame rate factor.
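
A hedged sketch of combining the two factors: device positions yield a position-based factor, apparent motion between image pairs yields a visual-based factor, and the time-lapse frame rate follows from both. The specific factor formulas and the max() combination are assumptions, not the claimed method.

```python
# Sketch: derive a time-lapse frame rate from position- and visual-based factors.
import numpy as np

def time_lapse_frame_rate(positions, flow_magnitudes,
                          base_rate=1.0, max_rate=30.0):
    # Position-based factor: faster device movement -> larger factor.
    displacement = np.linalg.norm(np.diff(positions, axis=0), axis=1).mean()
    position_factor = 1.0 + displacement
    # Visual-based factor: more apparent motion between image pairs ->
    # larger factor, so fast scenes are sampled more densely.
    visual_factor = 1.0 + np.mean(flow_magnitudes)
    # Combine the factors; clamp to a supported frame-rate range.
    return float(np.clip(base_rate * max(position_factor, visual_factor),
                         base_rate, max_rate))
```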

Moving object tracking using object and scene trackers

A method of using both object features and scene features to track an object in a scene is provided. In one embodiment, the scene motion is compared with the object motion, and if the motions differ by more than a threshold, the pose from the object tracker is used; otherwise, the pose from the scene tracker is used. In another embodiment, the pose of an object is tracked by both the scene tracker and the object tracker, and these poses are compared. If this comparison results in a difference greater than a threshold, the pose from the object tracker is used; otherwise, the pose from the scene tracker is used.
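
The selection rule described above is simple enough to sketch directly: track the pose with both trackers, compare, and fall back to the object tracker when they disagree by more than a threshold. The translation-only pose representation and the threshold value are illustrative assumptions.

```python
# Compact sketch of the threshold-based tracker selection.
import numpy as np

def select_pose(object_pose, scene_pose, threshold=0.05):
    """object_pose, scene_pose: 3-vectors (assumed translational poses).
    threshold: maximum allowed disagreement before the object tracker's
    pose is preferred over the scene tracker's."""
    difference = np.linalg.norm(np.asarray(object_pose) -
                                np.asarray(scene_pose))
    # Large disagreement suggests the object moved relative to the scene,
    # so the object tracker's pose is the safer choice.
    return object_pose if difference > threshold else scene_pose
```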

Method and device for calculating river surface flow velocity based on variational principle
11544857 · 2023-01-03

A method and device for calculating a river surface flow velocity based on a variational principle are provided, which capture and process images of a target area to obtain high-precision flow velocity field data in a non-contact manner. The method comprises three steps: (1) preparation before initial flow measurement; (2) capturing a video of the river with an image acquisition device, converting the motion of the fluid's pixel flow field in the captured image sequence into an energy-functional optimization problem, and solving the resulting partial differential equations to obtain the pixel flow field distribution; and (3) obtaining the space coordinates of each pixel point in a world coordinate system and calculating the flow velocity from the data obtained in step (2) and the transformation relationship determined in step (1).
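
The variational step (2) can be sketched with the classic Horn-Schunck formulation, a standard energy-functional optical flow method used here as a plausible stand-in for the patent's own functional; step (3) then scales pixel displacements to world velocities. The meters-per-pixel scale factor (from the step-(1) calibration) and the iteration parameters are assumptions.

```python
# Sketch: Horn-Schunck variational optical flow, then pixel-to-world scaling.
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(im1, im2, alpha=1.0, n_iters=100):
    im1 = im1.astype(np.float64)
    im2 = im2.astype(np.float64)
    # Spatial and temporal derivatives of the image sequence.
    Ix = convolve(im1, np.array([[-1.0, 1.0]]))
    Iy = convolve(im1, np.array([[-1.0], [1.0]]))
    It = im2 - im1
    avg = np.array([[0, 0.25, 0], [0.25, 0, 0.25], [0, 0.25, 0]])
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iters):
        u_bar = convolve(u, avg)
        v_bar = convolve(v, avg)
        # Update rule from the Euler-Lagrange equations of the functional.
        common = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * common
        v = v_bar - Iy * common
    return u, v

def surface_velocity(u, v, meters_per_pixel, dt):
    # Convert pixel displacement per frame into world-space speed (m/s).
    return np.hypot(u, v) * meters_per_pixel / dt
```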