Patent classifications
G06T2207/30228
Image processing apparatus and method
An image processing apparatus that generates a virtual viewpoint image: obtains information related to the installation of cameras that capture the images used for generating the virtual viewpoint image, together with viewpoint information related to a virtual viewpoint; determines, based on the camera installation information and the viewpoint information, an image processing method to be used for generating the virtual viewpoint image; and generates the virtual viewpoint image corresponding to the virtual viewpoint by using the determined image processing method.
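The method-selection step could be sketched as follows. The abstract does not state the actual selection criteria, so the rule used here (pick image-based warping when the virtual viewpoint lies near an installed camera, otherwise fall back to model-based rendering) and the `near_threshold` parameter are purely illustrative assumptions:

```python
import math

def choose_rendering_method(camera_positions, virtual_viewpoint, near_threshold=2.0):
    """Illustrative selector: if the virtual viewpoint is close to one of the
    installed cameras, simple image-based warping of that camera's view may
    suffice; otherwise model-based (3-D reconstruction) rendering is chosen.
    Both the rule and the threshold are assumptions, not the patent's method."""
    nearest = min(math.dist(cam, virtual_viewpoint) for cam in camera_positions)
    return "image_based" if nearest <= near_threshold else "model_based"
```

A caller would pass the installed camera positions and the requested virtual viewpoint, e.g. `choose_rendering_method([(0, 0, 0), (10, 0, 0)], (0.5, 0, 0))`.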
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND STORAGE MEDIUM
A plurality of images captured by a plurality of cameras, together with camera parameters, are obtained; whether or not an object of interest exists within the depth of field of each camera is determined based on the camera parameters; and processing relating to shape data representing a three-dimensional shape of the object of interest is performed based on the images of the cameras for which the object of interest has been determined to exist within the depth of field.
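The per-camera depth-of-field test could look like the following sketch, assuming a thin-lens model; the abstract does not specify which camera parameters are used, so the parameter names and the circle-of-confusion default are assumptions:

```python
def depth_of_field(focal_mm, f_number, focus_dist_mm, coc_mm=0.03):
    """Return (near, far) limits of acceptable sharpness in millimetres,
    from the standard thin-lens depth-of-field formulas."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = (hyperfocal * focus_dist_mm) / (hyperfocal + focus_dist_mm - focal_mm)
    if focus_dist_mm >= hyperfocal:
        far = float("inf")  # everything beyond the near limit is sharp
    else:
        far = (hyperfocal * focus_dist_mm) / (hyperfocal - focus_dist_mm + focal_mm)
    return near, far

def object_in_dof(object_dist_mm, focal_mm, f_number, focus_dist_mm):
    """True if the object's distance falls inside the camera's depth of field."""
    near, far = depth_of_field(focal_mm, f_number, focus_dist_mm)
    return near <= object_dist_mm <= far
```

For a 50 mm lens at f/2 focused at 10 m, an object at 9 m falls inside the depth of field while one at 20 m does not; only images from cameras passing this test would then feed the shape-data processing.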
Generating augmented videos
One or more computing devices, systems, and/or methods for generating augmented videos are provided. First coordinates, of a first video frame of a video, may be determined for projection of a graphical object. A first plurality of keypoints on the first video frame may be determined based upon a set of objects. A second plurality of keypoints on a second video frame may be determined. A keypoint of the second plurality of keypoints may correspond to a keypoint of the first plurality of keypoints. A relationship between the first video frame and the second video frame may be determined based upon the first plurality of keypoints and the second plurality of keypoints. Second coordinates of the second video frame may be determined based upon the first coordinates and the relationship. The graphical object may be projected onto the second video frame based upon the second coordinates.
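The frame-to-frame relationship derived from corresponding keypoints is commonly modelled as a homography. Below is a minimal sketch using the direct linear transform (DLT), assuming the two frames are related by a planar projective transform; the abstract does not name the estimation method, so this is one plausible realization:

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate the 3x3 homography H with dst ~ H @ src via the DLT algorithm.
    src_pts, dst_pts: (N, 2) arrays of corresponding keypoints, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the last right singular vector.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project_points(H, pts):
    """Map (N, 2) points through H, returning (N, 2) image coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

With the homography estimated from the two keypoint sets, the first-frame coordinates of the graphical object are mapped through `project_points` to obtain the second-frame coordinates at which the object is drawn.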
SYSTEM AND METHOD FOR PROVIDING GOLF INFORMATION FOR GOLFER
According to the present invention, there is provided a system for providing golf information, the system including: a green photographing unit that photographs an image of an area including a green on a golf course and is communicatively connected to a communication network to transmit green image data to the outside; a golfer terminal that is carried by a golfer on the golf course and is communicatively connected to the communication network to transmit golfer position data to the outside; and an information providing server that is communicatively connected to the communication network to generate golf information using the green image data and the golfer position data and to transmit the generated golf information to the golfer terminal.
Impact detection
A kinematic analysis system captures and records participant motion via a plurality of video cameras. A participant feature and participant pose are identified in a frame of video data. The feature and pose are correlated across a plurality of frames. A three-dimensional path of the participant is determined based on correlating the feature and pose across the plurality of frames. A potential impact is identified based on analysis of the participant's path.
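One plausible reading of "identifying a potential impact based on analysis of the participant's path" is thresholding the acceleration derived from the tracked 3-D positions. The sketch below works under that assumption; the threshold value and the finite-difference scheme are illustrative, not from the patent:

```python
import numpy as np

def detect_potential_impacts(path, fps, accel_threshold_g=8.0):
    """Flag frame indices where the acceleration magnitude along a 3-D
    participant path exceeds a threshold (in units of g).

    path: (N, 3) array of positions in metres, one row per video frame.
    fps: frame rate at which the path was sampled."""
    dt = 1.0 / fps
    vel = np.gradient(path, dt, axis=0)   # finite-difference velocity, m/s
    acc = np.gradient(vel, dt, axis=0)    # finite-difference acceleration, m/s^2
    magnitude_g = np.linalg.norm(acc, axis=1) / 9.81
    return np.flatnonzero(magnitude_g > accel_threshold_g)
```

A participant moving at constant speed produces no flags; an abrupt stop in the path produces a spike in `magnitude_g` at the frames around the deceleration, which are returned for closer review.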
Player Trajectory Generation via Multiple Camera Player Tracking
A method for trajectory generation based on player tracking is described herein. The method includes determining a temporal association for a first player in a captured field of view and determining a spatial association for the first player. The method also includes deriving a global player identification based on the temporal association and the spatial association and generating a trajectory based on the global player identification.
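A minimal sketch of combining per-camera temporal associations (local tracks) with a spatial association across cameras to derive global player IDs. The greedy nearest-mean rule and the `radius` parameter are assumptions for illustration, not the patent's actual algorithm:

```python
def derive_global_ids(tracks, radius=1.0):
    """tracks: dict mapping (camera_id, local_track_id) -> list of (x, y)
    field-plane positions (the temporal association within each camera).
    Tracks from *different* cameras whose mean positions lie within `radius`
    metres are spatially associated and share one global player ID."""
    means = {}
    for key, positions in tracks.items():
        xs = [p[0] for p in positions]
        ys = [p[1] for p in positions]
        means[key] = (sum(xs) / len(xs), sum(ys) / len(ys))

    global_ids = {}
    next_id = 0
    for key in tracks:
        match = None
        for other, gid in global_ids.items():
            if other[0] == key[0]:
                continue  # never merge two tracks from the same camera
            dx = means[key][0] - means[other][0]
            dy = means[key][1] - means[other][1]
            if dx * dx + dy * dy <= radius * radius:
                match = gid
                break
        if match is None:
            match = next_id
            next_id += 1
        global_ids[key] = match
    return global_ids
```

Once every local track carries a global ID, the trajectory for a player is simply the time-ordered concatenation of the positions of all tracks sharing that ID.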
SYSTEMS AND METHODS FOR DETECTING OBJECTS CROSSING PLANES
There is provided a system for detecting an object crossing a plane at high resolution from long range. The system includes: a first transmitter that projects a light plane using light of a light plane wavelength that is not visible to the human eye, the light plane defining a boundary; a first receiver that detects a reflection of light when an object crosses the light plane defining the boundary, the receiver being configured to detect light in a range of the electromagnetic spectrum that includes the light plane wavelength and including an image sensor comprising a plurality of sensor pixels; and an image processing computer configured to receive an input from the receiver including a recorded frame captured by the receiver, wherein the image processing computer processes the recorded frame for review to identify a frame showing an object crossing the boundary.
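The frame-review step might amount to flagging frames in which enough sensor pixels register a bright reflection at the light-plane wavelength. The patent does not disclose the detection rule, so the following sketch and its threshold values are assumptions:

```python
import numpy as np

def frames_with_crossing(frames, intensity_threshold=200, min_pixels=5):
    """Given a sequence of grayscale frames (2-D uint8 arrays) from the
    receiver, return the indices of frames in which at least `min_pixels`
    pixels reach `intensity_threshold` -- a crude proxy for the reflection
    produced when an object crosses the light plane. Thresholds illustrative."""
    hits = []
    for i, frame in enumerate(frames):
        if np.count_nonzero(frame >= intensity_threshold) >= min_pixels:
            hits.append(i)
    return hits
```

Requiring a minimum pixel count rather than a single bright pixel gives some robustness to sensor noise; a real system would likely also subtract a background frame before thresholding.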
SYSTEMS AND METHODS FOR MONITORING OBJECTS AT SPORTING EVENTS
A system for monitoring objects at sporting events or other types of events uses a wearable drone that has at least one camera or other sensor for capturing or otherwise sensing data. When the drone is to be used for monitoring, such as monitoring an object at a sporting event, the wearable drone may be detached from its user, and it may hover or otherwise fly within a certain position of an object to be monitored. While flying, the drone's sensor may be used to capture information, such as performance data or images, of the object during the sporting event.
METHOD AND DEVICE FOR DISPLAYING DATA FOR MONITORING EVENT
An augmented reality method includes acquisition of a plurality of images by an image acquisition device that at least partially cover a space that has at least two landmarks. A three-dimensional position and orientation of the space in relation to the image acquisition device is determined. The instantaneous position, within the reference frame of the space, of a mobile element moving in the space is received. At least one acquired image is displayed on the screen. An overlay is superposed on the displayed image at a predetermined distance in relation to the position of the mobile element in the image. Also, a portable electronic device implements the method.
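Superposing the overlay requires projecting the mobile element's instantaneous position, given in the space's reference frame, into the displayed image. A standard pinhole-camera sketch follows, assuming the determined three-dimensional position and orientation are expressed as a rotation `R` and translation `t` with an intrinsic matrix `K` (these names and the projection model are assumptions; the abstract does not specify them):

```python
import numpy as np

def overlay_pixel(point_space, R, t, K):
    """Project a 3-D point expressed in the space's reference frame into
    pixel coordinates. R, t: space-to-camera pose; K: 3x3 intrinsic matrix."""
    p_cam = R @ np.asarray(point_space, dtype=float) + t  # space -> camera frame
    uvw = K @ p_cam                                       # camera frame -> image
    return uvw[:2] / uvw[2]                               # perspective divide
```

The overlay is then drawn at a fixed offset (the "predetermined distance") from the returned pixel position, so it tracks the mobile element as new images are acquired.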
OBJECT INFORMATION PROCESSING DEVICE, OBJECT INFORMATION PROCESSING METHOD, AND OBJECT INFORMATION PROCESSING PROGRAM
Objects are tracked in real time in a composed video created by joining a plurality of videos. A grouping candidate determining unit 12 extracts, as candidate objects, objects present within an overlapping area in which pieces of frame data overlap, from among the objects that have been detected and tracked in each of a plurality of pieces of frame data captured at the same time. A grouping unit 13 arranges, as a group, a plurality of candidate objects whose degree of overlapping is equal to or larger than a predetermined threshold, and an integration unit 14 assigns integration object IDs to the groups and to objects that have not been grouped.
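The grouping step reads like clustering detections by mutual overlap, for which intersection-over-union (IoU) is the usual measure. A union-find sketch under that assumption, with an illustrative threshold (the abstract does not define the overlap measure):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def assign_integration_ids(boxes, threshold=0.5):
    """Union-find grouping: boxes whose pairwise IoU meets the threshold end
    up in one group and share an integration ID; ungrouped boxes each get
    their own ID, mirroring units 13 and 14 in the abstract."""
    parent = list(range(len(boxes)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(boxes)):
        for j in range(i + 1, len(boxes)):
            if iou(boxes[i], boxes[j]) >= threshold:
                parent[find(i)] = find(j)

    roots, ids = {}, []
    for i in range(len(boxes)):
        ids.append(roots.setdefault(find(i), len(roots)))
    return ids
```

Two detections of the same physical object seen in the overlapping area of adjacent videos overlap heavily and receive one integration ID, while an isolated detection keeps a distinct ID.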