Patent classifications
H04N5/28
Stitched image
Various embodiments associated with a composite image are described. In one embodiment, a handheld device comprises a launch component configured to cause a launch of a projectile. The projectile is configured to capture a plurality of images. Individual images of the plurality of images are of different segments of an area. The handheld device also comprises an image stitch component configured to stitch the plurality of images into a composite image. The composite image is of a higher resolution than a resolution of individual images of the plurality of images.
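The core idea — many low-resolution segment images combined into one higher-resolution composite — can be illustrated with a minimal sketch. This is not the patented method; it assumes pre-aligned, equally sized grayscale tiles captured in row-major order, with no feature matching or blending.

```python
def stitch_tiles(tiles, grid_w):
    """Stitch equally sized grayscale tiles (row-major order, grid_w tiles
    per row) into one composite whose resolution exceeds any single tile.
    Each tile is a list of rows; each row is a list of pixel values."""
    tile_h = len(tiles[0])
    composite = []
    for row_start in range(0, len(tiles), grid_w):
        tile_row = tiles[row_start:row_start + grid_w]
        for y in range(tile_h):
            # Concatenate scanline y of every tile in this grid row.
            composite.append([px for tile in tile_row for px in tile[y]])
    return composite

# Four 2x2 tiles arranged in a 2x2 grid yield a 4x4 composite.
tiles = [
    [[1, 1], [1, 1]],
    [[2, 2], [2, 2]],
    [[3, 3], [3, 3]],
    [[4, 4], [4, 4]],
]
composite = stitch_tiles(tiles, grid_w=2)
```

A real stitcher would additionally register overlapping segments and blend seams; the sketch only shows why the composite's pixel count, and hence resolution, exceeds that of any individual segment image.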
SOFT FEC WITH PARITY CHECK
A method for data transmission includes receiving a data stream from a host device, the data stream as received from the host device including encoded data, separating the encoded data in the data stream into first data blocks and second data blocks, and generating a first forward error correction (FEC) block. The first FEC block includes a first parity section and a first data section, the first parity section includes a first parity bit corresponding to the first data blocks and a second parity bit corresponding to the second data blocks, and the first data section includes the first data blocks and the second data blocks. The method further includes transmitting the first FEC block.
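The FEC block layout described above — a parity section with one parity value per block group, followed by a data section carrying both groups — can be sketched as follows. This is an illustrative simplification, not the claimed method: it uses a single XOR parity byte per group where the abstract specifies parity bits, and all names are hypothetical.

```python
from functools import reduce

def make_fec_block(first_blocks, second_blocks):
    """Build one FEC block as a flat list of byte values:
    [parity(first group), parity(second group), data of both groups]."""
    def xor_parity(blocks):
        # XOR of every byte in the group; 0 if the group is empty.
        return reduce(lambda a, b: a ^ b,
                      (byte for blk in blocks for byte in blk), 0)

    parity_section = [xor_parity(first_blocks), xor_parity(second_blocks)]
    data_section = ([b for blk in first_blocks for b in blk]
                    + [b for blk in second_blocks for b in blk])
    return parity_section + data_section

first = [[0x0F, 0xF0]]          # first data blocks
second = [[0xAA], [0x55]]       # second data blocks
block = make_fec_block(first, second)
```

A receiver can recompute each group's XOR and compare it against the corresponding parity byte to detect corruption within that group before the block is consumed.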
Controller and information processing system for vehicle
A controller for a vehicle includes: a disaster determining section that determines presence or absence of damage caused by a disaster; and an activating section that activates a capturing section mounted on the vehicle. An information processing system is an information processing system that includes plural vehicles and a server. The vehicle activates the capturing section mounted on the vehicle at the time of the disaster, and sends a video image captured by the capturing section and location information to the server. The server accumulates the video images sent from the plural vehicles, and associates the video images with map information.
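The control flow above — a disaster determination triggering camera activation and an upload of video plus location, with the server accumulating uploads for map association — can be sketched as two small classes. All class and method names here are hypothetical stand-ins, not the patented implementation.

```python
import time

class MapServer:
    """Accumulates uploaded clips keyed by location, the structure a
    later map-association step could draw on."""
    def __init__(self):
        self.by_location = {}

    def upload(self, clip, location, timestamp):
        self.by_location.setdefault(location, []).append((timestamp, clip))

class VehicleController:
    """On a positive damage determination, activate the capture section
    and send the captured clip plus location to the server."""
    def __init__(self, server, location):
        self.server = server
        self.location = location
        self.camera_active = False

    def on_damage_determined(self, damaged: bool):
        if not damaged:                      # disaster determining section
            return
        self.camera_active = True            # activating section
        clip = b"<video frames>"             # stand-in for captured video
        self.server.upload(clip, self.location, time.time())

server = MapServer()
car = VehicleController(server, location=(35.68, 139.69))
car.on_damage_determined(True)
```

Keying the accumulated videos by location is what lets the server associate footage from many vehicles with map information.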
Image management apparatus, image management method, communication apparatus, control method, and storage medium
There is provided an image management apparatus. A first receiving unit receives an image from a communication apparatus. A sending unit sends the image and an evaluation request for the image to an image evaluation apparatus configured to perform evaluation with respect to likelihood that another user will purchase the image. A second receiving unit receives an evaluation result for the image from the image evaluation apparatus. A registering unit registers the image in a server apparatus configured to provide an image sales service, in a case where the evaluation result indicates that the likelihood satisfies a predetermined standard and information, which indicates that a user consents to having the image registered in the server apparatus, is set in the image.
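The registration decision above is a conjunction of two conditions: the evaluated purchase likelihood must satisfy the predetermined standard, and the consent flag must be set in the image. A minimal sketch of that gate, with an assumed threshold value standing in for the unspecified standard:

```python
PURCHASE_LIKELIHOOD_THRESHOLD = 0.7   # the "predetermined standard" (assumed value)

def should_register(evaluation_score: float, user_consents: bool) -> bool:
    """Register the image with the sales service only when both hold:
    the evaluated purchase likelihood meets the standard, AND the
    user-consent information is set in the image."""
    return evaluation_score >= PURCHASE_LIKELIHOOD_THRESHOLD and user_consents

ok = should_register(0.85, True)   # both conditions met -> register
```

Note that a high likelihood alone is not sufficient; absent consent, the image is never registered, which is the privacy safeguard the abstract describes.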
Human-automation collaborative tracker of fused object
A system includes a control station that enables efficient human collaboration with automated object tracking. The control station is communicatively coupled to an aerial vehicle to receive full motion video of a ground scene taken by an airborne sensor of the aerial vehicle. The control station spatially registers features of a movable object present in the ground scene and determines motion of the movable object relative to the ground scene. The control station predicts a trajectory of the movable object relative to the ground scene. The control station tracks the movable object based on data fusion of: (i) the spatially registered features; (ii) the determined motion; and (iii) the predicted trajectory of the movable object. The control station presents a tracking annotation and a determined confidence indicator for the tracking annotation on a user interface device to facilitate human collaboration with object tracking.
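The three-way data fusion and the confidence indicator above can be sketched simply. This assumes each cue (feature registration, motion, trajectory prediction) yields a 2-D position estimate, fuses them by averaging, and derives confidence from how closely the estimates agree; the patented fusion is not specified at this level of detail.

```python
def fuse_track(feature_pos, motion_pos, predicted_pos):
    """Fuse three independent 2-D position estimates of the movable
    object and report a confidence that shrinks as they disagree."""
    estimates = [feature_pos, motion_pos, predicted_pos]
    # Fused position: per-coordinate mean of the three estimates.
    fused = tuple(sum(coord) / 3 for coord in zip(*estimates))
    # Spread: largest per-coordinate deviation from the fused position.
    spread = max(abs(e[i] - fused[i]) for e in estimates for i in (0, 1))
    confidence = 1.0 / (1.0 + spread)   # 1.0 when all cues agree exactly
    return fused, confidence

fused, conf = fuse_track((10.0, 5.0), (10.0, 5.0), (10.0, 5.0))
```

Surfacing the confidence alongside the tracking annotation is what lets a human operator decide when to trust the automation and when to intervene.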
System And Method For Capturing And Projecting Images, And Use Of The System
The invention relates to a system and a method for capturing and projecting images for use in an integrated studio comprising a real location and panels that render the location partially virtual. Externally generated images are displayed on the panels to create a scene that is part real and part virtual.