Patent classifications
G01S7/629
Sonar sensor fusion and model based virtual and augmented reality systems and methods
Techniques are disclosed for systems and methods for sensor fusion with respect to mobile structures. A mobile structure may include multiple ranging sensor systems and/or receive navigational data from various sensors. A navigational database may be generated that includes data from the ranging sensor systems and/or other sensors. Aspects of the navigational database may then be used to generate an integrated model, which can be used to generally aid in the navigation of the mobile structure.
MARINE ELECTRONIC DEVICE FOR PRESENTMENT OF NAUTICAL CHARTS AND SONAR IMAGES
An apparatus for providing marine information is provided including a user interface, a processor, and a memory including computer program code. The memory and the computer program code are configured to, with the processor, cause the apparatus to generate a sonar image based on sonar return data received from an underwater environment, determine a location associated with the sonar return data based on location data received from one or more position sensors, and render a nautical chart on a display. The computer program code is further configured to cause the apparatus to receive a user input on the user interface directed to a portion of the display in which the nautical chart is presented, and modify presentation of the nautical chart such that the portion of the display presents the sonar image in response to receiving the user input.
Video sensor fusion and model based virtual and augmented reality systems and methods
Techniques are disclosed for systems and methods for video based sensor fusion with respect to mobile structures. A mobile structure may include at least one imaging module and multiple navigational sensors and/or receive navigational data from various sources. A navigational database may be generated that includes data from the imaging module, navigational sensors, and/or other sources. Aspects of the navigational database may then be used to generate an integrated model, forecast weather conditions, warn of dangers, identify hard to spot items, and generally aid in the navigation of the mobile structure.
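The navigational database described in this abstract can be pictured as a time-ordered merge of records from several sensor sources. The sketch below is a minimal illustration of that idea only; the `Record` fields, source names, and merge strategy are assumptions, not the patent's implementation.

```python
# Hypothetical sketch: time-stamped records from several sensor sources
# merged into one ordered store that an integrated model could later query.
# All field names and payloads are illustrative only.

from dataclasses import dataclass

@dataclass
class Record:
    t: float          # timestamp, seconds
    source: str       # e.g. "camera", "gps", "sonar"
    value: dict       # sensor-specific payload

def build_database(*streams: list) -> list:
    """Merge per-sensor streams into one time-ordered navigational database."""
    merged = [r for stream in streams for r in stream]
    return sorted(merged, key=lambda r: r.t)

gps = [Record(1.0, "gps", {"lat": 60.1, "lon": 24.9})]
cam = [Record(0.5, "camera", {"frame": 17}), Record(1.5, "camera", {"frame": 18})]
db = build_database(gps, cam)
print([r.source for r in db])  # time-ordered: ['camera', 'gps', 'camera']
```

A real system would index such a store by time and position so that model generation, weather forecasting, and hazard warnings can query a consistent fused view.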
Touch-gesture control for side-looking sonar systems
Techniques are disclosed for systems and methods to provide touch screen side-scan sonar adjustment for mobile structures. A side-scan sonar adjustment system includes a user interface with a touch screen display and a logic device configured to communicate with the user interface and a side-scan sonar system. The user interface is configured to receive and/or display side-scan sonar data provided by the side-scan sonar system. The logic device is configured to determine a horizontal swipe gesture rate component performed on the touch screen display, stretch the displayed image in accordance with the swipe gesture, and snap to a new field of view in accordance with current field of view and swipe length information. The user interface and logic device may be integrated together to form a multifunction display used to power and/or supply side-scan sonar transmission signals to the side-scan sonar system.
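The stretch-then-snap behaviour this abstract describes can be sketched as a small mapping from swipe length to a new field of view. Everything below is an illustrative assumption (the preset ranges, the scaling rule, and the clamp values), not the patented logic.

```python
# Hypothetical sketch of swipe-to-field-of-view adjustment: scale the
# displayed range with the swipe fraction, then snap to the nearest preset.
# Preset values and the scaling rule are illustrative assumptions.

FOV_PRESETS_M = [10, 25, 50, 100, 200]  # selectable side-scan ranges, metres

def snapped_fov(current_fov_m: float, swipe_px: float, display_px: int) -> float:
    """Scale the field of view by the swipe fraction, then snap to a preset."""
    # A rightward swipe across half the screen roughly halves the range.
    scale = 1.0 - 0.5 * (swipe_px / display_px)
    scale = max(0.25, min(scale, 2.0))          # clamp the zoom factor
    target = current_fov_m * scale
    # Snap to whichever preset range is closest to the stretched view.
    return min(FOV_PRESETS_M, key=lambda p: abs(p - target))

print(snapped_fov(100, 600, 800))  # a long swipe narrows 100 m to the 50 m preset
```

Snapping to presets keeps the display aligned with range scales the sonar hardware can actually transmit, rather than an arbitrary zoom level.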
REAL-TIME MONITORING OF SURROUNDINGS OF MARINE VESSEL
Real-time monitoring of surroundings of a marine vessel. One or more observation sensor modules are configured and positioned to generate sensor data extending around the marine vessel. One or more data processors are configured to map and visualize the sensor data in relation to a virtual model of the marine vessel. A user interface is configured to display the virtual model together with the visualized sensor data from a user selectable point of view to a mariner of the marine vessel.
UNDERWATER DETECTION DEVICE
An underwater detection device is provided, which includes an echo signal processing module configured to acquire echo signals and detect signal levels of the echo signals corresponding to water depths, the echo signals being reflection waves caused by ultrasonic waves transmitted underwater, a detection image display controlling module configured to display on a display unit a detection image indicating the signal levels of the echo signals corresponding to the water depths and placed in a chronological order, and a menu display controlling module configured to display first superordinate menu buttons on the detection image displayed on the display unit, one of the first superordinate menu buttons displayed in one of end sections of the detection image where oldest information is displayed, the rest of the first superordinate menu buttons displayed to extend from the one of the first superordinate menu buttons in one of depth directions and time axis directions.
FREQUENCY STEERED SONAR USER INTERFACE
A marine sonar display device comprises a display, a memory element, and a processing element. The display displays sonar images. The memory element stores sonar data. The processing element is configured to transmit a transmit electronic signal to a frequency steered sonar element which transmits an array of sonar beams into a body of water, each sonar beam transmitted in a different angular direction, receive a receive electronic signal from the frequency steered sonar element, the receive electronic signal including a plurality of frequency components, calculate an array of sonar data slices, one sonar data slice for each frequency component, generate an array of sonar image slices, one sonar image slice for each sonar data slice, and control the display to visually present the array of sonar image slices in near real time and a historical sequence of at least one sonar image slice.
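In a frequency steered array, each transmit frequency maps to a distinct beam angle, so the single receive signal can be separated into one data slice per frequency component, as the abstract describes. The band edges, sample rate, and envelope step below are illustrative assumptions, not the device's actual processing chain.

```python
# Hypothetical sketch of per-frequency slicing: band-pass filter one receive
# signal in the frequency domain to recover one data slice per beam angle.
# Sample rate and band edges are assumed values for illustration.

import numpy as np

FS = 200_000  # sample rate, Hz (assumed)
BANDS_HZ = [(40_000, 60_000), (60_000, 80_000), (80_000, 100_000)]  # one per beam

def sonar_slices(rx: np.ndarray) -> list:
    """Split one receive signal into per-band envelopes, one per beam angle."""
    spectrum = np.fft.rfft(rx)
    freqs = np.fft.rfftfreq(rx.size, d=1.0 / FS)
    slices = []
    for lo, hi in BANDS_HZ:
        band = np.where((freqs >= lo) & (freqs < hi), spectrum, 0)
        # Magnitude of the band-limited signal approximates echo strength
        # versus range for that beam's angular direction.
        slices.append(np.abs(np.fft.irfft(band, n=rx.size)))
    return slices

rx = np.random.default_rng(0).standard_normal(4096)
print(len(sonar_slices(rx)))  # one slice per frequency band
```

Rendering each slice as an image column then yields the near-real-time fan view plus history that the display device presents.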