MARINE DRIVER ASSIST SYSTEM AND METHOD
20230073225 · 2023-03-09
Inventors
- Anson Chin Pang Chan (Richmond, CA)
- Geoffrey David DUDDRIDGE (Nanaimo, CA)
- Declan George David MCINTOSH (Victoria, CA)
CPC classification
G01S17/58
PHYSICS
G01S13/58
PHYSICS
G01S13/86
PHYSICS
G01S17/86
PHYSICS
B63H25/04
PERFORMING OPERATIONS; TRANSPORTING
B63B49/00
PERFORMING OPERATIONS; TRANSPORTING
International classification
B63B49/00
PERFORMING OPERATIONS; TRANSPORTING
G01S13/42
PHYSICS
G01S17/86
PHYSICS
Abstract
A driver-assist system for a marine vessel may include a camera operable to obtain data comprising images of a view of the camera, and a data processor. The data processor may be programmed to distinguish between portions of the view representing water and/or sky and a portion of the view representing an object. The data processor may be programmed to detect an object by causing object detectors to search for objects in respective different subregions of the view. The data processor may be programmed to detect a boat in the view. The data processor may be programmed to cause the marine vessel to follow a detected boat.
Claims
1. A driver-assist system for a marine vessel, the system comprising: a camera operable to obtain data comprising images of a view of the camera; and a data processor programmed to, at least, distinguish between: portions of the view representing water and/or sky; and a portion of the view representing an object.
2. The system of claim 1 wherein the data processor is further programmed to, at least, utilize machine learning trained to identify water, sky, and anything that is not water or sky.
3. The system of claim 1 wherein the data processor is further programmed to, at least, utilize a convolutional neural network trained to identify water, sky, and anything that is not water or sky.
4. The system of claim 1, 2, or 3 wherein the data processor is further programmed to, at least, identify the object as anything that is not sky or water.
5. The system of claim 1, 2, 3, or 4 wherein the data processor is further programmed to, at least, adjust settings of the camera only according to image quality metrics values of only portions of the view not representing water and/or sky.
6. The system of any one of claims 1 to 5 wherein the data processor is further programmed to, at least, identify a horizon in the view.
7. The system of claim 6 wherein the data processor is programmed to identify the object near the horizon.
8. The system of any one of claims 1 to 7 wherein the data processor is further programmed to, at least, detect the object by, at least, causing at least some object detectors of a plurality of object detectors to search for objects only in respective different subregions of a plurality of different subregions of the view.
9. The system of claim 8 wherein the data processor is further programmed to, at least, cause the at least some object detectors to search for objects only in portions of the view not representing water and/or sky.
10. A driver-assist system for a marine vessel, the system comprising: a camera operable to obtain data comprising images of a view of the camera; and a data processor programmed to, at least, detect an object by, at least, causing at least some object detectors of a plurality of object detectors to search for objects only in respective different subregions of a plurality of different subregions of the view.
11. The system of claim 8, 9, or 10 wherein the data processor is further programmed to, at least, implement the plurality of object detectors.
12. The system of claim 8, 9, 10, or 11 wherein at least one object detector of the plurality of object detectors is a single-shot detector (SSD).
13. The system of any one of claims 8 to 12 wherein the plurality of subregions comprises a plurality of different predefined subregions.
14. The system of claim 13 wherein the predefined subregions of the plurality of predefined subregions are distributed evenly across the entire view.
15. The system of claim 13 or 14 wherein the predefined subregions of the plurality of predefined subregions overlap horizontally.
16. The system of claim 13, 14, or 15 wherein the predefined subregions of the plurality of predefined subregions are centered vertically within the view.
17. The system of claim 13, 14, 15, or 16 wherein the data processor is further programmed to, at least: cause available object detectors of the plurality of object detectors to search only some predefined subregions of the plurality of predefined subregions in a first object detection cycle; and cause the available object detectors of the plurality of object detectors to search other predefined subregions of the plurality of predefined subregions in a second object detection cycle after the first object detection cycle.
18. The system of any one of claims 8 to 17 wherein the data processor is further programmed to, at least, cause at least one object detector of the plurality of object detectors to search for objects in a dynamic subregion of the plurality of subregions, the dynamic subregion placed such that the object is within the dynamic subregion.
19. The system of claim 18 wherein the data processor is further programmed to, at least, cause the dynamic subregion to move in response to movement of the object.
20. The system of claim 18 or 19 wherein the data processor is further programmed to, at least, cause the dynamic subregion to have a size in response to a size of the object.
21. The system of any one of claims 8 to 20 wherein the data processor is further programmed to, at least, cause at least one object detector of the plurality of object detectors to search for objects in the entire view.
22. The system of any one of claims 8 to 21 wherein each object detector of the plurality of object detectors, in operation, detects objects at a resolution smaller than a resolution of the images.
23. The system of any one of claims 1 to 22 wherein the data processor is further programmed to, at least, alert a user in response to detecting the object.
24. The system of any one of claims 1 to 23 wherein the system is programmed to, at least, take collision avoidance measures in response to detecting the object.
25. The system of claim 24 wherein the system is programmed to take the collision avoidance measures by, at least, preventing the object from being, at any time, less than a safety distance from the marine vessel.
26. The system of claim 25 wherein the system is programmed to, at least, vary the safety distance according to, at least, a size of the marine vessel.
27. The system of claim 25 or 26 wherein the system is programmed to, at least, vary the safety distance according to, at least, a speed of the marine vessel.
28. The system of claim 25, 26, or 27 wherein the system is programmed to, at least, vary the safety distance according to, at least, a size of the object.
29. The system of claim 25, 26, 27, or 28 wherein the system is programmed to, at least, vary the safety distance according to, at least, a speed of the object.
30. The system of any one of claims 1 to 29 wherein the data processor is further programmed to, at least, identify a distance from the system to the object.
31. The system of claim 30 wherein the data processor is programmed to identify the distance from the system to the object according to, at least, radar data from a radar apparatus.
32. The system of claim 30 or 31 wherein the data processor is programmed to identify the distance from the system to the object according to, at least, lidar data from a lidar apparatus.
33. The system of claim 30 wherein the data processor is programmed to identify the distance from the system to the object according to, at least, data from a sensor separate from the camera.
34. The system of any one of claims 1 to 33 wherein the data processor is further programmed to, at least, identify a speed of the object.
35. The system of claim 34 wherein the data processor is programmed to identify the speed of the object according to, at least, radar data from a radar apparatus.
36. The system of claim 34 or 35 wherein the data processor is programmed to identify the speed of the object according to, at least, lidar data from a lidar apparatus.
37. The system of claim 34 wherein the data processor is programmed to identify the speed of the object according to, at least, data from a sensor separate from the camera.
38. The system of any one of claims 1 to 37 wherein the data processor is further programmed to, at least, identify a direction of movement of the object.
39. The system of any one of claims 1 to 38 wherein the data processor is further programmed to, at least, identify a size of the object.
40. The system of any one of claims 1 to 39 wherein the object is a detected boat.
41. A driver-assist system for a marine vessel, the system comprising: a camera operable to obtain data comprising images of a view of the camera; and a data processor programmed to, at least, detect a detected boat in the view.
42. The system of claim 40 or 41 wherein the data processor is further programmed to, at least, identify a direction of movement of the detected boat from a shape of the detected boat.
43. The system of claim 42 wherein the shape of the detected boat comprises a bow of the detected boat.
44. The system of claim 42 or 43 wherein the shape of the detected boat comprises a stern of the detected boat.
45. The system of claim 42, 43, or 44 wherein the data processor is further programmed to, at least, identify one of a plurality of different classes of direction of movement in response to, at least, the direction of movement of the detected boat.
46. The system of claim 42, 43, 44, or 45, when dependent from claim 24, wherein the system is programmed to take the collision avoidance measures in response to, at least, the direction of movement of the detected boat.
47. The system of any one of claims 40 to 46 wherein the system is programmed to, at least, cause the marine vessel to follow the detected boat.
48. The system of any one of claims 40 to 47 wherein the system is programmed to, at least, cause the marine vessel to follow the detected boat at a set distance.
49. The system of claim 48 wherein the system is programmed to, at least, set the set distance according to, at least, a user-programmable follow sensitivity.
50. The system of claim 48 or 49 wherein the system is programmed to, at least, set the set distance according to, at least, a speed of the marine vessel.
51. A driver-assist system for a marine vessel, the system comprising: a sensor operable to obtain data from surroundings of the sensor; and a data processor programmed to, at least, cause the marine vessel to follow a detected boat.
52. The system of claim 51 wherein the sensor is a camera, and wherein the data comprise images of a view of the camera.
53. The system of any one of claims 47 to 52 wherein the data processor is programmed to cause the marine vessel to follow the detected boat only if the detected boat is a closest object detected by the system within an adaptive cruise control range.
54. The system of any one of claims 47 to 53 wherein the data processor is programmed to cause the marine vessel to follow the detected boat only if the detected boat is at least a safe-following distance from the marine vessel.
55. The system of any one of claims 47 to 54 wherein the data processor is programmed to cause the marine vessel to follow the detected boat only if the detected boat is moving in a same general direction as the marine vessel.
56. The system of any one of claims 47 to 55 wherein the data processor is programmed to cause the marine vessel to follow the detected boat only if the system does not detect any risk of collision with other boats detected by the system.
57. The system of any one of claims 47 to 56 wherein the data processor is programmed to cause the marine vessel to follow the detected boat only if the system does not detect any objects, other than the detected boat, within a safety region.
58. The system of claim 57 wherein the safety region is a safety circle.
59. The system of claim 57 or 58 wherein the data processor is programmed to cause the marine vessel to resume following the detected boat automatically in response to detecting no objects, other than the detected boat, within the safety region.
60. A marine vessel comprising the system of any one of claims 1 to 59.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0087] Embodiments of this disclosure will be more readily understood from the following description of such embodiments given, by way of example only, with reference to the accompanying drawings.
DETAILED DESCRIPTION
[0109] Referring to the drawings and first to
[0111] A driver assist system, as described herein for example, may include a data processor that may be programmed, configured, or otherwise operable to implement some or all functions of the driver assist system as described herein, for example. More generally, a driver assist system, as described herein for example, may be programmed, configured, or otherwise operable to implement some or all functions of the driver assist system as described herein, for example.
[0112] A driver assist system, a data processor, a unit, a module, a controller, or a system as described herein, for example, may include one or more processor circuits that may include one or more central processing units (CPUs) or microprocessors, one or more machine learning chips, discrete logic circuits, or one or more application-specific integrated circuits (ASICs), or combinations of two or more thereof, for example, and that may include one or more of the same or different computer-readable storage media, which in various embodiments may include one or more of a read-only memory (ROM), a random access memory (RAM), a hard disc drive (HDD), a solid-state drive (SSD), and other computer-readable and/or computer-writable storage media. For example, one or more such computer-readable storage media may store program codes that, when executed, cause one or more processor circuits of a driver assist system, of a data processor, of a unit, of a module, of a controller, of a system, or of a combination of two or more thereof to implement functions as described herein, for example, in which case the driver assist system, the data processor, the unit, the module, the controller, the system, or the combination of two or more thereof may be programmed, configured, or operable to implement such functions. Of course, a driver assist system, a data processor, a unit, a module, a controller, a system, or a combination of two or more thereof may be configured or otherwise operable to implement other functions and to implement functions in other ways.
[0113] A driver assist system, a data processor, a unit, a module, a controller, or a system as described herein, for example, may be implemented in the same device, such as a device including one processor circuit, or in one or more separate devices in other implementations. For example, in some embodiments, the motion planner 69 and one or more other units, modules, controllers, systems, or a combination of two or more thereof as described herein may be implemented in one device including one processor circuit programmed, configured, or otherwise operable to implement some or all functions as described herein. In other embodiments, the motion planner 69 may be implemented in separate processor circuits or in multiple separate devices, and other units, modules, controllers, systems, or a combination of two or more thereof as described herein may be implemented in one device or in separate devices.
[0114] The marine driver assist system may utilize computer vision and machine learning to process the images received from the cameras such as camera 46, a stereo camera in this example, at the front of the craft shown in
[0116] The training may be simplified by focusing on objects that are neither sky nor water. Things that are not identified as sky or water may be deemed to be marine object candidates for a collision warning or avoidance. These objects could be a boat, a buoy, a shoreline, a bridge, a marker, a log, or a piling, for example.
[0117] By focusing on objects that are neither sky nor water, the processes of machine learning and real-time execution speed of image recognition may be accelerated. This method may also simplify the ground truth data annotation because there may be a reduction in the number of segmentation classes. Ground truth is a term used to refer to information provided by direct observation (i.e. empirical evidence) as opposed to information provided by inference. In some embodiments, such a system may be robust to classes of images that may not have been experienced before or which do not exist in training data. For example, a manatee may not have been experienced before, but the VPU may still recognize a manatee as an object which the boat may potentially collide with.
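The approach above can be sketched in a minimal form. This illustration assumes the network has already produced a per-pixel class mask and simply groups connected pixels that are neither water nor sky into candidate bounding boxes; the class codes and function names are assumptions, not terms from this disclosure.

```python
from collections import deque

# Segmentation classes assumed for illustration: 0 = water, 1 = sky, 2 = other.
WATER, SKY, OTHER = 0, 1, 2

def object_candidates(mask):
    """Group connected 'other' pixels into candidate regions.

    mask: 2-D list of class labels.  Returns a list of bounding boxes
    (row_min, col_min, row_max, col_max), one per 4-connected region of
    pixels that are neither water nor sky.
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] != OTHER or seen[r][c]:
                continue
            # Breadth-first search over the 4-connected region.
            queue = deque([(r, c)])
            seen[r][c] = True
            rmin = rmax = r
            cmin = cmax = c
            while queue:
                y, x = queue.popleft()
                rmin, rmax = min(rmin, y), max(rmax, y)
                cmin, cmax = min(cmin, x), max(cmax, x)
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < rows and 0 <= nx < cols \
                            and mask[ny][nx] == OTHER and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            boxes.append((rmin, cmin, rmax, cmax))
    return boxes
```

Any candidate box produced this way, whether it contains a boat or a never-before-seen manatee, would then be treated as a potential collision object.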
[0118] With reference to
[0119] The marine driver assist system may also utilize boat identification and tracking using computer vision and machine learning. In other words, the marine driver assist system may be capable of identifying boats which may be in a simplified image, similar to
[0120] The marine driver assist system may use a convolutional neural network to search predefined image regions, for example, the image of
[0121] The marine driver assist system may track boats using one or more of speed and acceleration, distance, the location of an object in the image, and the size of an object in the image (both height and width), and may estimate the physical size of a boat based on its size in the image and the measured distance to the boat.
[0122] Boats may have distinctive features such as bows and sterns which can be used to train the convolutional neural network using pictures of different boats facing in different directions. For example, with reference to
[0123] If another boat is detected coming towards the subject boat, then the marine driver assist system may monitor the other boat and may alert the user or take collision avoidance measures with respect to the other boat as required. If the other boat is detected going away from the subject boat, then the subject boat can choose to follow the other boat in an adaptive cruise control following application. As another example, if the other boat is going sideways to the left of the subject boat, then the VPU can choose to steer to the right of the other boat to avoid it. As another example, if the other boat is going sideways to the right, then the VPU can choose to steer to the left of the other boat to avoid it.
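The responses described in this paragraph can be summarized in a small lookup. The direction-class names below are illustrative assumptions rather than terms from this disclosure.

```python
def avoidance_action(direction_class):
    """Map a detected boat's direction-of-movement class to a response.

    The class names ("toward", "away", "crossing_left", "crossing_right")
    are hypothetical labels for the cases described in the text.
    """
    actions = {
        "toward": "monitor; alert or take collision avoidance measures as required",
        "away": "eligible target for adaptive cruise control following",
        "crossing_left": "steer to the right of the other boat",
        "crossing_right": "steer to the left of the other boat",
    }
    # Unknown direction classes default to monitoring the object.
    return actions.get(direction_class, "monitor")
```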
[0124] With reference to
[0125] The marine driver assist system may search through a predefined set of regions, and may check some (but not all) of the predefined regions every cycle. When objects are detected, new (or dynamic) regions may be set around them and may be checked every cycle as long as that object is still detected. Such use of predefined regions may extend detection range in an open water marine environment and may allow better detections while still using low resolution detectors with responsive performance. Such use of predefined regions may also save resources by only checking some fixed number of subregions in one cycle. Such use of dynamic searching algorithms with locking regions on detected objects is novel for marine object tracking and may allow efficient tracking of marine objects as the subject boat (and its sensors) pitches and rolls in waves or otherwise moves as the vessel travels through the water.
[0126] Many object detection algorithms run in real time. Inputs to such object detection algorithms may be at a relatively low resolution, e.g. 300×300 pixels, but at such low resolution, the model may struggle to detect small objects in the view. On the other hand, larger algorithms with larger resolution inputs may scale exponentially in complexity, and a processor may not have enough computation power and may struggle to complete object detection fast enough for real-time performance.
[0127] To enhance small object detection, or object detection more generally, the marine driver assist system may use a set of searching windows (or predefined subregions) in the view of one or more cameras.
[0128] With reference to
[0129] 1. Prepare predefined subregions: (a) Consider the entire region. (b) Divide this into a series of smaller predefined subregions. (c) Evenly distribute the predefined subregions across the entire region such that they horizontally overlap and are centered vertically. They are centered vertically (or otherwise including or centered on a horizon) because the concern is small boats on the horizon.
[0130] 2. Prepare dynamic subregions: (a) The dynamic subregions track boats detected only in a predefined subregion. (b) A dynamic subregion is placed so that the boat is on the edge of the dynamic subregion with some offset to allow the boat to move further into the dynamic subregion on the next cycle. (c) Boats which are not found for X cycles are dropped and the dynamic subregion is abandoned. (d) If two boats are close enough together, then one dynamic subregion is placed over both boats.
[0131] 3. The processor is capable of processing N regions per cycle using N single shot detectors (SSDs), although other embodiments may include other detectors that may not necessarily be SSDs. This comprises: (a) The first SSD is used to detect the entire region at lower resolution. N−1 SSDs are still available. (b) Dynamic subregions are generated to monitor objects detected in previous cycles. There are K of the dynamic subregions, to a maximum of (N−1). The dynamic subregions are repeatedly created based on results of previous iterations of steps 7-9 as described below. (N−1−K) SSDs are still available. (c) All remaining processing power is used in predefined subregions. Thus (N−1−K) detectors are available to search predefined subregions.
[0132] 4. Take N−1−K extra single shot detectors and feed N−1−K of the M predefined subregions through the SSDs, as well as the entire image through the first SSD.
[0133] 5. Move all the N−1−K single shot detectors to new predefined subregions every cycle such that it takes no more than ceiling(M/(N−1−K)) cycles to have a predefined subregion go through an SSD, where M is the number of predefined subregions. As a result, the marine driver assist system may be configured to cause N−1−K available detectors to search only some predefined subregions in one object detection cycle, and to cause the N−1−K available detectors to search other predefined subregions in another object detection cycle.
[0134] 6. Continue the cycle through the regions until a single shot detector detects a boat that is not also detected by the whole-image single shot detector.
[0135] 7. Then use one dynamic subregion to cover this boat. The marine driver assist system does not center the dynamic subregion on this boat, but rather places it to the edge of the boat detection with some offset to allow the boat to move in that direction frame to frame (or cycle to cycle). Some amount of spatial locality of boat detection from frame to frame (or cycle to cycle) is assumed.
[0136] 8. Continue to search this dynamic subregion with the updated location of the boat as it is tracked. Boats not found for X cycles are dropped and the SSD is returned to search other regions.
[0137] 9. If two boats sufficiently close to each other are detected by the SSDs of a predefined subregion, the marine driver assist system attempts to place one dynamic subregion over the boats with some buffer room. If this is not possible, then two dynamic subregions are used in a nonoverlapping manner.
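The numbered steps above can be sketched as follows. This is a simplified illustration under stated assumptions (square windows, round-robin rotation over the predefined subregions, and hypothetical names throughout); it shows the window layout of step 1 and the detector budget of steps 3 to 5, not the full tracking logic.

```python
def predefined_subregions(view_w, view_h, win, m):
    """Step 1: m square windows of side `win`, vertically centred in the
    view and overlapping horizontally (evenly spaced left edges).
    Returns (x, y, w, h) tuples."""
    top = (view_h - win) // 2
    step = (view_w - win) / (m - 1) if m > 1 else 0
    return [(round(i * step), top, win, win) for i in range(m)]

class SubregionScheduler:
    """Steps 3-5: allocate N detectors per cycle, one for the whole
    view, up to N-1 for dynamic subregions, and the remainder rotating
    round-robin over the M predefined subregions."""

    def __init__(self, n_detectors, regions):
        self.n = n_detectors
        self.regions = regions   # the M predefined subregions
        self.cursor = 0          # round-robin position

    def plan_cycle(self, dynamic_subregions):
        k = min(len(dynamic_subregions), self.n - 1)
        free = self.n - 1 - k    # detectors left for predefined subregions
        assigned = []
        for _ in range(min(free, len(self.regions))):
            assigned.append(self.regions[self.cursor])
            self.cursor = (self.cursor + 1) % len(self.regions)
        return {
            "whole_view": 1,                 # step 3(a): first detector
            "dynamic": list(dynamic_subregions[:k]),   # step 3(b)
            "predefined": assigned,          # steps 3(c)-5
        }
```

With N = 4 detectors and M = 5 subregions, three subregions are searched each cycle when no dynamic subregions exist, so every predefined subregion is visited within ceiling(5/3) = 2 cycles, consistent with step 5.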
[0138] The marine driver assist system may also employ smart brightness adjustment as seen in
[0139] In
[0140] With reference to
[0141] 1. Capture camera image and apply digital processing such as contrast or edge enhancement to the image to make some compensations and enhancements.
[0142] 2. Through the use of a convolutional neural network, determine and generate foreground segmentation on the image. The foreground regions contain objects of interest for advanced driver assist functions (e.g. boats, docks, buoys) and do not contain background regions of the image such as sky or water. The foreground regions may be identified as regions that are not water or sky.
[0143] 3. Mask the image obtained from step 1 with the foreground segmentation from step 2. This step provides a masked image that contains only the foreground regions. The foreground information is illustrated as an example in
[0144] 4. Analyze the masked regions from step 3 and determine the image quality metrics for those regions. Examples of image quality metrics include brightness, contrast, hue, saturation, exposure, and white balance, any one thereof, or a combination of any two or more thereof. The background region is ignored in the analysis.
[0145] 5. Compare the image quality metrics values of the masked regions from step 4 against desired quality metric values, and compute the difference.
[0146] 6. Adjust camera settings for the masked regions from step 3 by minimizing the computed metric difference in step 5.
[0147] Continue to iterate through steps 1 through 6 for each image frame or detection cycle.
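Steps 1 through 6 can be sketched, for the brightness metric only, as a simple proportional adjustment loop. The target value and gain below are illustrative assumptions; a real system would track several image quality metrics and drive actual camera settings.

```python
def masked_brightness(image, mask):
    """Step 4 (brightness only): mean pixel value over foreground pixels.

    image: 2-D list of grey levels; mask: 2-D list where True marks
    foreground (not water or sky).  Background pixels are ignored.
    """
    total, count = 0, 0
    for row_img, row_mask in zip(image, mask):
        for value, fg in zip(row_img, row_mask):
            if fg:
                total += value
                count += 1
    return total / count if count else 0.0

def adjust_exposure(exposure, image, mask, target=128.0, gain=0.01):
    """Steps 5-6: nudge the exposure setting to shrink the difference
    between foreground brightness and a desired value.  `target` and
    `gain` are illustrative constants, not values from this disclosure."""
    error = target - masked_brightness(image, mask)
    return exposure + gain * error
```

Iterating this per frame reproduces the behaviour described above: a bright background raises nothing, because only masked foreground pixels influence the adjustment.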
[0148] Camera settings may be compensated with every frame or detection cycle, which may ensure even foreground quality even as the image transitions into and away from areas with bright backgrounds. This may give the marine driver assist system the ability to tolerate a large range of lighting conditions within the background of an image without degrading performance of the marine driver assist system. The new contribution of this method is using the segmented foreground (areas apart from water or sky) to isolate the regions considered when compensating the camera and post-processing settings, enhancing only regions of interest in outdoor applications.
[0149] Embodiments of this disclosure may include sensor fusion. Different types of sensors have different attributes and are better at different things. Cameras may be good at recognition and classification. Radar may be good at measuring velocity and position and is not affected by bad weather. Stereo cameras can be used to measure distance to objects. However, radar may have a higher accuracy in measuring medium to long range distance. Lidar may be preferable to measure short to medium range distance.
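A minimal sketch of the range-based source selection implied above; the threshold value and the preference order are assumptions for illustration, not parameters from this disclosure.

```python
# Illustrative range threshold in metres (not specified in the source).
LIDAR_MAX = 150.0

def fused_distance(stereo=None, lidar=None, radar=None):
    """Choose a distance estimate by sensor strengths as described in
    the text: lidar for short to medium range, radar for medium to long
    range, stereo camera as a fallback.  Returns (distance, source)."""
    if lidar is not None and lidar <= LIDAR_MAX:
        return lidar, "lidar"
    if radar is not None:
        return radar, "radar"
    return stereo, "stereo"
```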
[0150] Referring to
[0152] Where there is risk of collision from the forward direction of the boat, thrust may be reduced according to a control algorithm considering the collidable object's distance and relative velocity. If the control algorithm requests zero thrust, the boat is still moving, and the collidable object is within the safe distance, “brakes” may be applied. The duration and intensity of braking may be based upon a control algorithm considering the object's distance and the subject boat's speed.
[0153] Since boats have significant momentum and no conventional brakes, a series of other mechanisms may be used to accomplish braking. A number of options may be available and may be used in any combination including friction from passing water, shifting out of gear, shifting into reverse gear, deploying the trim tabs, and deploying interceptor tabs as illustrated in
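One way to sketch the braking behaviour described in the two paragraphs above; the severity formula, thresholds, and mechanism ordering are illustrative assumptions rather than the control algorithm of this disclosure.

```python
def braking_command(distance, closing_speed, safe_distance):
    """Decide braking intensity and which mechanisms to engage.

    distance: metres to the collidable object; closing_speed: m/s at
    which the gap is shrinking; safe_distance: metres.  The 10 m/s
    normalization and the 0.3 / 0.7 thresholds are assumptions.
    """
    if distance >= safe_distance or closing_speed <= 0:
        return {"intensity": 0.0, "mechanisms": []}
    # 0..1 severity: fraction of the safe distance already consumed,
    # weighted by how fast the gap is closing.
    severity = (1.0 - distance / safe_distance) * min(closing_speed / 10.0, 1.0)
    mechanisms = ["shift out of gear"]
    if severity > 0.3:
        mechanisms.append("deploy trim tabs / interceptor tabs")
    if severity > 0.7:
        mechanisms.append("shift into reverse gear")
    return {"intensity": round(severity, 3), "mechanisms": mechanisms}
```

The escalation mirrors the text: gentle mechanisms (friction, shifting out of gear) first, with tabs and reverse gear reserved for more severe cases.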
[0155] The motion planner may receive sensor data 53 from the sensor fusion module 52 or 68. The sensor data may include velocities of other boats, positions of other boats, distances, positions, or both of non-boat objects, the velocity of the subject boat (the boat where the marine driver assist system is installed), or a combination of two or more thereof. The motion planner may communicate with a steering controller 84, a shift and throttle controller 82, and a trim tab controller 86 and may provide them with steering commands, shift and throttle commands, and trim commands respectively.
[0157] The adaptive cruise control unit or system for boats, according to an embodiment of this disclosure, may engage when following a preceding boat within a preset distance. However, other boats in a marine environment can come from different locations and can change course more suddenly compared to automobiles on a road. Also, boats have inertia but no actual brakes. It may take much longer for a boat to stop or change course, particularly a large boat, compared to automobiles on a road. Also, boats may not travel within defined lanes and may travel in a staggered pattern even when travelling in a similar direction. Also, boats travelling in a similar direction may do so at different speeds. Low-speed adaptive cruise control can be engaged in a busy waterway or marina where oncoming boats may come too close. Adaptive cruise control and collision avoidance systems may need to work together to reduce speed or stop the subject boat if necessary. There may be more interruptions to stop adaptive cruise control systems in boats compared to cars. It may be advantageous to automatically reengage adaptive cruise control for a boat if it is interrupted. The following numbered steps characterize the operation of the adaptive cruise control for boats according to one embodiment, although alternative embodiments may differ.
[0158] 1. With reference to
[0159] 2. Where there are objects in the adaptive cruise control range, the adaptive cruise control unit or system then determines whether the closest object is a boat.
[0160] 3. Where the closest object is determined to be a boat, the adaptive cruise control unit or system determines whether the boat is at least a safe following distance away (i.e. not so close that it may cause a crash).
[0161] 4. The adaptive cruise control unit or system then checks whether the preceding boat is travelling in the same general direction as the subject boat.
[0162] 5. If the answer to step 2 was “no”, and the closest object is not a boat, then the closest object may be an obstacle such as a rock. If the answer to step 3 was “no”, then the subject boat may be too close to the preceding boat such that it may have a risk of crashing with it. If the answer to step 4 is “no” then the closest boat may be travelling towards the subject boat. When the answer to any of the steps 2, 3 or 4 is “no” then the adaptive cruise control will disengage and execute the collision avoidance routine to mitigate the risk of collision.
[0163] 6. Before engaging cruise control command, the adaptive cruise control unit or system will utilize its sensor data (e.g. radar, stereo camera, or lidar data) to determine velocities and positions of all detected boats. The adaptive cruise control unit or system then uses the present position and velocity of each boat to project its future position.
[0164] 7. If the projected future position of any boat indicates a risk of collision, then the adaptive cruise control will also disengage and collision avoidance will execute.
[0165] 8. Where there is no risk of collision, then adaptive cruise control will be engaged and locked onto the preceding boat as a target.
[0166] 9. The projected boat positions and velocities of the preceding boat and surrounding boats are used to calculate the adaptive cruise control commands for the steering, shift and throttle, and trim tab controllers.
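The engagement checks in steps 1 through 8 can be sketched as a single decision function. The data layout, the ±45° same-direction test, and the straight-line projection are illustrative assumptions; positions and velocities are taken relative to the subject boat, which sits at the origin.

```python
def acc_decision(objects, acc_range, safe_distance, own_heading, horizon=10.0):
    """Steps 1-8: decide whether adaptive cruise control may engage.

    objects: list of dicts with keys 'is_boat', 'distance', 'heading',
    'position' (x, y) and 'velocity' (vx, vy), all relative to the
    subject boat; headings in degrees.  Field names are assumptions.
    Returns ('engage', target), ('collision_avoidance', None), or
    ('idle', None).
    """
    in_range = [o for o in objects if o["distance"] <= acc_range]
    if not in_range:                                   # step 1
        return ("idle", None)
    closest = min(in_range, key=lambda o: o["distance"])
    if not closest["is_boat"]:                         # step 2
        return ("collision_avoidance", None)
    if closest["distance"] < safe_distance:            # step 3
        return ("collision_avoidance", None)
    # Step 4: same general direction (within +/-45 degrees, an assumption).
    diff = abs((closest["heading"] - own_heading + 180) % 360 - 180)
    if diff > 45:
        return ("collision_avoidance", None)
    # Steps 6-7: project every other boat forward and check for a
    # close approach to the subject boat at the origin.
    for o in in_range:
        if not o["is_boat"] or o is closest:
            continue
        px = o["position"][0] + o["velocity"][0] * horizon
        py = o["position"][1] + o["velocity"][1] * horizon
        if (px ** 2 + py ** 2) ** 0.5 < safe_distance:
            return ("collision_avoidance", None)
    return ("engage", closest)                         # step 8
```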
[0167] With reference to
[0168] It will be understood by someone skilled in the art that many of the details provided above are by way of example only and are not intended to limit the scope of the invention which is to be determined with reference to the following claims.