Patent classification: G05D1/0692
Video sensor fusion and model based virtual and augmented reality systems and methods
Techniques are disclosed for systems and methods for video-based sensor fusion with respect to mobile structures. A mobile structure may include at least one imaging module and multiple navigational sensors and/or receive navigational data from various sources. A navigational database may be generated that includes data from the imaging module, the navigational sensors, and/or other sources. Aspects of the navigational database may then be used to generate an integrated model, forecast weather conditions, warn of dangers, identify hard-to-spot items, and generally aid in the navigation of the mobile structure.
METHODS AND SYSTEMS FOR DETERMINING A DEPTH OF AN OBJECT
A method comprising: providing an autonomous vehicle (AV) with a first estimated position of a target; directing the AV to travel toward the first estimated position at a constant velocity; receiving echo signals of transmitted sonar signals, the echo signals indicating a range and an azimuth of the target; determining a depth difference of the AV and the target based on the received echo signals, the depth difference being determined based on changes to the range and azimuth of the target over time; and in response to a depth difference existing, re-directing the AV toward a second estimated position of the target generated from the depth difference.
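The abstract above determines a depth difference from changes in the sonar-measured range while the AV closes on the target at constant velocity. A minimal geometric sketch of that idea, under the simplifying assumptions that the AV travels horizontally toward the target and that two slant-range measurements are taken a known time apart (function name and interface are illustrative, not from the patent):

```python
import math

def estimate_depth_difference(r1, r2, v, dt):
    """Estimate the vertical offset d between the AV and the target from two
    slant-range measurements r1, r2 taken dt seconds apart, while the AV
    closes horizontally at constant speed v (metres and seconds throughout).

    Geometry: r1**2 = h**2 + d**2 and r2**2 = (h - s)**2 + d**2, where h is
    the initial horizontal distance and s = v * dt is the distance travelled.
    Subtracting the two equations isolates h, after which d follows.
    """
    s = v * dt                                   # horizontal closure between pings
    h = (r1**2 - r2**2 + s**2) / (2 * s)         # initial horizontal distance
    d_sq = r1**2 - h**2
    if d_sq < 0:
        d_sq = 0.0                               # clamp: measurement noise
    return math.sqrt(d_sq)
```

If the returned depth difference is nonzero, the AV would be re-directed toward an estimated position corrected by that vertical offset, as the abstract describes.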
INTELLIGENT CLEANING ROBOT
An intelligent cleaning robot comprises a housing, an optical module, a pickup module, a central processing module, and a drive module. The housing defines light transmission holes for the optical module, which comprises an infrared light source, a complex light source, a structured-light lens to receive reflected infrared light through the holes to form a three-dimensional image, and a color lens to receive reflected light through the holes to form a color image. The central processing module can receive the three-dimensional image and the color image and form an image of the environment. The pickup module can be controlled to pick up garbage and objects in the environment according to the image of the environment, and the robot can be controlled to move on land or in water.
HYBRID AERIAL/UNDERWATER ROBOTICS SYSTEM FOR SCALABLE AND ADAPTABLE MAINTENANCE OF AQUACULTURE FISH FARMS
Systems and methods for operating a HAUCS sensing platform. The methods comprise: autonomous travel by a UAAV to a first location in proximity to a body of water (BoW) in accordance with a mission plan; actuating a mechanical device to transition a sensor from a retracted position, in which the sensor is adjacent to the UAAV, to an extended position, in which the sensor resides a given distance from the UAAV; collecting, by the HAUCS sensing platform and the sensor, sensor data concerning a water condition of the BoW at different depths; actuating the mechanical device to transition the sensor from the extended position to the retracted position after the sensor data has been collected; causing the sensor data to be processed using a machine learning-based analytical engine to determine whether a water distress condition exists or is predicted to occur; and modifying the mission plan when the water distress condition exists or is predicted to occur.
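The sensing cycle described above (travel, deploy sensor, sample at depths, retract, analyze, replan) can be sketched as a simple mission loop. All callables and names below are hypothetical stand-ins for platform-specific code, not APIs from the patent:

```python
from dataclasses import dataclass

@dataclass
class MissionStep:
    location: tuple   # (lat, lon) waypoint near the body of water
    depths: list      # depths (m) at which to sample the water condition

def run_sensing_mission(plan, fly_to, deploy_sensor, retract_sensor,
                        read_sensor, predict_distress, replan):
    """Sketch of the HAUCS sensing loop; callbacks are assumed interfaces."""
    for step in plan:                          # iterate the original plan
        fly_to(step.location)                  # autonomous travel per mission plan
        deploy_sensor()                        # extend sensor from the UAAV
        readings = [read_sensor(d) for d in step.depths]
        retract_sensor()                       # stow sensor after collection
        if predict_distress(readings):         # ML-based analytical engine
            plan = replan(plan, step)          # modify the mission plan
    return plan
```

The distress predictor stands in for the machine learning-based analytical engine; the loop returns the (possibly modified) mission plan for subsequent flights.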
Property assessment system with buoyancy adjust device
A property assessment system is provided to assess a property for various purposes. The system includes a property assessment apparatus configured to be inserted into any space of a property and retrieved therefrom once necessary information is collected from the space. The property assessment apparatus operates to monitor various conditions in the space, which is used to assess the property.
Underwater mobile body and non-transitory computer readable medium
An underwater mobile body includes: a movement control unit that controls movement underwater; a detection unit that moves underwater under the control of the movement control unit and that detects a position of a fish; and a guidance unit that guides the fish based on the position of the fish detected by the detection unit.
Multiple Autonomous Underwater Vehicle (AUV) System
Multiple autonomous underwater vehicles (AUVs) are operated by a single host surface vehicle (HSV) by configuring the AUVs with intermediate nodes (such as unmanned surface vehicles (USVs)) so as to allow the HSV to manage multiple AUVs. The intermediate nodes act as relays for communications between the HSV and the AUVs, allowing the HSV to scale to higher numbers of vehicles and thus simultaneously operate the entire fleet of AUVs. The AUVs may provide underwater mapping data.
Control Apparatus and Method for Swimming of Robot Fish
Provided are an apparatus and a method for controlling the swimming of a robotic fish. A robotic fish operated in a narrow space such as an aquarium often hits the outer wall while submerging or swimming upward. To solve this problem, the present invention provides an inclination adjusting means that adjusts the inclination while generating rotational propulsive force, making smooth submergence and upward swimming possible in the narrow space.
METHOD AND APPARATUS FOR SELF-CONTAINED POSITIONING OF A MOBILE ROBOT INSIDE A TANK
A method and apparatus for positioning a mobile robot inside a vertical cylindrical aboveground storage tank filled with a liquid is described. No additional hardware is needed other than the robot itself. The only piece of information needed is the tank's diameter, which is known by construction. The robot carries proprioceptive sensors needed to propagate its position estimate as well as exteroceptive sensors needed to control the dead-reckoning positional drift. Proprioceptive and exteroceptive data are merged using data fusion algorithms adapted to the sensor suite integrated in the vehicle, which can take different forms.
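The abstract above combines dead-reckoned position propagation with an exteroceptive correction anchored to the known tank diameter. A toy sketch of one such correction, assuming the tank center is at the origin and a range sensor measures the distance to the wall (the gain-blended update and all names are illustrative assumptions, not the patent's fusion algorithm):

```python
import math

def fuse_wall_range(pos, wall_range, tank_radius, gain=0.5):
    """Correct a dead-reckoned (x, y) estimate using a measured range to the
    tank wall. The wall range implies a radial distance
    r_meas = tank_radius - wall_range; blend it with the predicted radius
    using a simple complementary gain and rescale the position along its
    current bearing."""
    x, y = pos
    r_pred = math.hypot(x, y)                 # dead-reckoned radial distance
    r_meas = tank_radius - wall_range         # radius implied by the wall range
    r_new = r_pred + gain * (r_meas - r_pred) # complementary blend
    if r_pred == 0:
        return (r_new, 0.0)                   # bearing undefined at the center
    scale = r_new / r_pred
    return (x * scale, y * scale)
```

A full implementation would use a proper estimator (e.g. a Kalman filter) over both odometry and wall-range measurements; this sketch only shows how the known diameter turns a wall range into a drift-bounding constraint.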