Patent classifications
H04N5/28
STITCHED IMAGE
Various embodiments associated with a composite image are described. In one embodiment, a handheld device comprises a launch component configured to cause a launch of a projectile. The projectile is configured to capture a plurality of images, with individual images of the plurality of images being of different segments of an area. The handheld device also comprises an image stitch component configured to stitch the plurality of images into a composite image. The composite image is of a higher resolution than the resolution of the individual images of the plurality of images.
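The stitching step described above can be illustrated with a minimal sketch. This assumes each captured segment carries a known (row, column) offset into the composite; a real system would estimate those offsets, e.g. by feature matching, before compositing. The `stitch` function and tile format here are hypothetical, not from the patent.

```python
# Minimal sketch: place image segments into a larger composite grid.
# Assumption: each segment's offset in the composite is already known.

def stitch(segments, composite_h, composite_w):
    """segments: list of (offset_row, offset_col, tile), where tile is a
    2D list of pixel values. Later tiles overwrite overlapping pixels."""
    composite = [[0] * composite_w for _ in range(composite_h)]
    for off_r, off_c, tile in segments:
        for r, row in enumerate(tile):
            for c, px in enumerate(row):
                composite[off_r + r][off_c + c] = px
    return composite

# Two 2x2 segments placed side by side yield a 2x4 composite whose
# total resolution exceeds that of either individual segment.
tiles = [
    (0, 0, [[1, 1], [1, 1]]),
    (0, 2, [[2, 2], [2, 2]]),
]
print(stitch(tiles, 2, 4))  # → [[1, 1, 2, 2], [1, 1, 2, 2]]
```

The overwrite-on-overlap policy is the simplest choice; production stitchers blend overlapping regions instead.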
METHOD AND APPARATUS FOR PORTABLE LIGHTING
Embodiments of the present disclosure provide adjustable lighting devices for use in photography lighting systems. In one embodiment, an adjustable lighting device comprises at least two panel sections and a light element. The at least two panel sections are each configured to act as a light reflector or flag, and are rotatably coupled to each other such that each panel section is freely rotatable relative to its one or more adjacent panel sections. The light element is coupled to one side of a first panel section of the panel sections, and configured to direct light substantially in a direction away from the one side of the first panel section.
CONTACTLESS PHOTO SYSTEM
A contactless photobooth system configured with a viewing screen, a user interface, and light sources. The system allows a client representative to schedule and obtain photos, under controlled lighting parameters, of a subject other than the client, without requiring the subject to travel to a studio.
Weapon usage monitoring system with unified video depiction of deployment location
Systems and methods are provided for combining video and virtual reality presentations, comprising a server device running application software that receives, at a connection point, a plurality of video signals from a plurality of assets within a deployment location, a video processing facility configured to sync the plurality of video signals to generate a time-matched, unified video depiction of the deployment location, where the unified video depiction presents at least one of positions, orientations, actions or affiliations of the plurality of the assets within the deployment location, and a graphical user interface presenting a field of view of the unified video depiction of the deployment location.
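The time-matching step can be sketched as follows, assuming each asset's video signal reduces to a sorted list of frame timestamps. The pairing-by-nearest-timestamp approach and the `unify` helper are illustrative assumptions; a real facility would align on embedded presentation timestamps or a shared clock.

```python
# Hypothetical sketch: align frames from multiple assets onto one
# reference timeline to form a time-matched, unified depiction.

def unify(streams, tolerance):
    """streams: dict asset_id -> sorted list of frame timestamps (seconds).
    Returns a list of (reference_time, {asset_id: matched_timestamp}),
    pairing each reference frame with the nearest frame per asset that
    falls within `tolerance` seconds."""
    ref_id = min(streams)  # arbitrarily pick one asset as the clock
    unified = []
    for t in streams[ref_id]:
        matched = {}
        for asset, times in streams.items():
            nearest = min(times, key=lambda x: abs(x - t))
            if abs(nearest - t) <= tolerance:
                matched[asset] = nearest
        unified.append((t, matched))
    return unified

streams = {"cam1": [0.0, 1.0, 2.0], "cam2": [0.05, 1.1, 2.4]}
print(unify(streams, 0.2))
# cam2's frame at 2.4 s drifts beyond tolerance, so the 2.0 s slot
# carries only cam1's frame.
```

Dropping out-of-tolerance frames rather than interpolating keeps the sketch simple; either policy yields a unified timeline.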
Self-contained mobile sensor calibration structure
A mobile calibration room may be used for calibrating one or more sensors used on unmanned aerial vehicles (UAVs). A system can include folding or collapsible walls to enable the system to be moved between a stowed position and a deployed position. In the deployed position, the system can comprise a calibration room including one or more 2D or 3D targets used to calibrate one or more sensors (e.g., cameras) on a UAV. The system can include a turntable to rotate the UAV about a first axis during calibration. The system can also include a cradle to rotate the UAV around, or translate the UAV along, a second axis. The turntable can include a frame to rotate the UAV around a third axis during calibration. The mobile calibration room can be coupled to a vehicle to enable the mobile calibration room to be moved between locations.
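The three rotations applied to the UAV (turntable, cradle, frame) compose into a full orientation sweep for the sensors under calibration. The sketch below assumes the turntable axis is vertical and the cradle axis horizontal; the abstract does not fix these assignments, so both are assumptions.

```python
import math

# Hypothetical sketch: rotation matrices for the calibration sweep.
# Axis assignments (z = turntable, x = cradle) are assumptions.

def rot_z(a):
    """Rotation about the vertical (turntable) axis."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def rot_x(a):
    """Rotation about a horizontal (cradle) axis."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def apply(m, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# A calibration target at (1, 0, 0) after a quarter-turn of the turntable:
p = apply(rot_z(math.pi / 2), [1.0, 0.0, 0.0])
print([round(x, 6) for x in p])  # → [0.0, 1.0, 0.0]
```

Sweeping both angles exposes every sensor face to the 2D/3D targets on the room's walls.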
Weapon usage monitoring system with augmented reality and virtual reality systems
Systems and methods are provided for weapon system monitoring and virtual reality presentation, including a connection point that receives signals from a plurality of sensors within a deployment location, a server device running application software that receives the signals from the connection point and processes the signals to generate a virtual reality depiction of the deployment location, and a graphical user interface presenting a field of view of the virtual reality depiction of the deployment location.