Patent classification: G07C5/0891
TRUSTED MONITORING SYSTEM AND METHOD
Methods and apparatus for monitoring remotely located objects with a system including at least one master data collection unit, remote sensor units, and a central data collection server are described. The master unit is configured to monitor any object, mobile or stationary, including monitoring multiple remote sensor units associated with the monitored objects. The master unit may be in a fixed location or attached to a mobile object. The master unit is configured for monitoring objects that enter and leave an area. The master unit may act as a parent controller for one or more child devices including remote sensors or monitors of measurable conditions including environmental conditions, substance identification, product identification, and/or biometric identification. The master unit may discover remote sensor units as they enter or leave the area where the master unit is located. The master unit can be remotely reprogrammed such as with authenticated instructions.
EXTERNAL ENVIRONMENT RECOGNITION APPARATUS FOR MOBILE BODY
In a device (1) for recognizing the external environment of a moving body, an image processing unit (10) generates image data representing the external environment of the moving body and, from that image data, generates object data indicating objects around the moving body. When a predetermined event occurs, a control unit (30) records the generated object data in a recording device (35) connected to the control unit (30). The object data recorded in the recording device (35) allows the external environment of the moving body at the time of the event to be verified.
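The event-triggered recording described above can be sketched as a rolling buffer of object data that is persisted when an event fires. This is a minimal illustration; the class, field names, and buffer size are assumptions, not details from the patent.

```python
from collections import deque

class EventRecorder:
    """Sketch of event-triggered object-data recording (names are illustrative)."""

    def __init__(self, capacity: int = 100):
        self.buffer = deque(maxlen=capacity)   # rolling pre-event history of object data
        self.recorded = []                     # stands in for the recording device (35)

    def on_object_data(self, object_data: dict) -> None:
        # Object data generated from image data arrives continuously.
        self.buffer.append(object_data)

    def on_event(self, event_name: str) -> None:
        # On a predetermined event, persist the buffered object data so the
        # external environment at the time of the event can later be verified.
        self.recorded.append({"event": event_name, "objects": list(self.buffer)})

rec = EventRecorder(capacity=3)
for i in range(5):
    rec.on_object_data({"frame": i, "object": "pedestrian"})
rec.on_event("sudden_braking")
# Only the most recent 3 frames (2, 3, 4) are retained and recorded.
```

Recording compact object data rather than raw video, as the abstract describes, keeps the persisted record small while still allowing post-event verification.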
VISION SYSTEM FOR A VEHICLE COOLANT SYSTEM
A vehicle has a prime mover to propel the vehicle, and a fluid circuit that defines a fluid passage and contains a coolant. The fluid circuit is in thermal communication with the prime mover. A camera is positioned to image coolant within the fluid passage. A controller is configured to receive image data from the camera, process the image data to determine a state of the coolant in the fluid circuit, and output an indication to a user regarding the state of the coolant. A method of monitoring a fluid circuit in a vehicle is also provided.
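One simple way the controller could infer a coolant state from image data is by examining the color of the imaged fluid region. The sketch below is a hedged illustration only: the thresholds, category names, and pure-Python pixel handling are assumptions, not the patent's method (a production system would use calibrated imaging and more robust classification).

```python
def classify_coolant(pixels: list[tuple[int, int, int]]) -> str:
    """Classify coolant condition from the mean RGB of the imaged fluid region.

    Thresholds are illustrative assumptions, not calibrated values.
    """
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    if r > 140 and r > g + 40 and r > b + 40:
        return "rusty"        # strong red/brown cast suggests corrosion products
    if max(r, g, b) < 60:
        return "degraded"     # very dark fluid suggests contamination
    return "ok"

sample = [(180, 70, 60)] * 100   # reddish-brown patch of the fluid passage
state = classify_coolant(sample)  # -> "rusty"
```

The controller would then map the classified state to the user-facing indication the abstract mentions.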
Autonomous driving method and apparatus
The present disclosure provides an autonomous driving method and an apparatus. The method includes: receiving a currently collected image transmitted by an unmanned vehicle, where the currently collected image is an image collected in a target scenario; acquiring current driving data according to the currently collected image and a pre-trained autonomous driving model, where the autonomous driving model indicates a relationship between an image and driving data in at least two scenarios, and the at least two scenarios include the target scenario; and sending the current driving data to the unmanned vehicle. The robustness of the autonomous driving method is thereby improved.
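The server-side flow described above can be sketched as: an image arrives from the vehicle, a pre-trained model maps it to driving data, and the result is sent back. The model here is a stub and all names are illustrative assumptions; the abstract does not specify the model architecture or message format.

```python
def autonomous_driving_model(image: bytes) -> dict:
    """Stub for the pre-trained model relating images to driving data.

    Per the abstract, one model covers at least two scenarios (e.g. highway
    and urban), which is what improves robustness across scenarios.
    """
    return {"steering_deg": 0.0, "throttle": 0.2}

def handle_collected_image(image: bytes, send) -> None:
    # Receive the currently collected image, acquire driving data from the
    # model, and send the driving data back to the unmanned vehicle.
    driving_data = autonomous_driving_model(image)
    send(driving_data)

sent = []
handle_collected_image(b"fake-image-bytes", sent.append)
# sent[0] now holds the driving data returned to the vehicle
```

In practice `send` would be a network call back to the vehicle; a list is used here only to keep the sketch self-contained.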
Modular intelligent transportation system
A modular intelligent transportation system, comprising an environmentally protected enclosure, a system communications bus, a processor module communicating with said bus and having an image data input and an audio input, the processor module analyzing the image data and/or audio input for data patterns represented therein, at least one available option slot, a power supply, and a communication link for external communications, in which at least one available option slot can be occupied by a wireless local area network access point, having a communications path between said communications link and said wireless access point, or by other modular components.
UNIVERSAL TELEMATICS AND STORAGE DEVICE FOR VEHICLE BASED CAMERA SYSTEMS
A system for remotely controlling the operation of a vehicle having a camera device via Wi-Fi or cellular networking.
DYNAMIC DRIVING METRIC OUTPUT GENERATION USING COMPUTER VISION METHODS
Aspects of the disclosure relate to dynamic driving metric output platforms that utilize improved computer vision methods to determine vehicle metrics from video footage. A computing platform may receive video footage from a vehicle camera. The computing platform may determine, based on brightness transitions, that a reference marker in the video footage has reached the beginning and the end of a road marking, and may insert time stamps into the video accordingly. Based on the time stamps, the computing platform may determine the amount of time during which the reference marker covered the road marking. Based on a known length of the road marking and that amount of time, the computing platform may determine a vehicle speed. The computing platform may generate driving metric output information, based on the vehicle speed, which may be displayed by an accident analysis platform. Based on known dimensions of pavement markings, the computing platform may obtain the parameters of the camera used to generate the video footage (e.g., focal length, camera height above the ground plane, and camera tilt angle) and use those camera parameters to determine the distance between the camera and any object in the video footage.
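The two computations described above reduce to simple geometry: speed is the known road-marking length divided by the time stamps' difference, and a pinhole-camera relation can estimate distance from an object's known real size and its apparent size in pixels. The functions below are an illustrative sketch under those assumptions; names and the simplified (tilt-free) distance formula are not from the patent.

```python
def vehicle_speed_mps(marking_length_m: float, t_start_s: float, t_end_s: float) -> float:
    """Speed = known road-marking length / time the reference marker covered it."""
    elapsed = t_end_s - t_start_s
    if elapsed <= 0:
        raise ValueError("end time stamp must follow start time stamp")
    return marking_length_m / elapsed

def distance_to_object_m(focal_length_px: float, real_width_m: float, pixel_width: float) -> float:
    """Simplified pinhole estimate: distance = focal length * real width / apparent width.

    The patent also accounts for camera height and tilt angle; this sketch
    assumes a level camera for clarity.
    """
    return focal_length_px * real_width_m / pixel_width

# Example: a 3 m lane marking covered in 0.1 s gives roughly 30 m/s (~108 km/h).
speed = vehicle_speed_mps(3.0, 10.0, 10.1)
```

Converting to km/h or mph for the driving metric output is then a constant multiplication (m/s x 3.6 = km/h).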
RIDER SATISFACTION SYSTEM
A rider satisfaction system for optimizing rider satisfaction includes an electronic commerce interface deployed for access by a rider in a vehicle, and a rider interaction circuit that captures rider interactions with the deployed interface. The rider satisfaction system also includes a rider state determination circuit that processes the captured rider interactions to determine a rider state, and an artificial intelligence system trained to optimize, responsive to the rider state, at least one parameter affecting operation of the vehicle to improve the rider state.
Intelligent vehicles with advanced vehicle camera systems for underbody hazard and foreign object detection
A method for operating an advanced driver-assistance system (ADAS) of a motor vehicle includes a vehicle controller receiving, from side and end cameras mounted to the vehicle, camera signals indicative of real-time images of outboard-facing side and end views of the vehicle. The controller determines a region of interest (ROI) inset within each end/side view within which foreign objects and/or hazards are expected. These ROIs are analyzed to detect whether a foreign object/hazard is present in the vehicle's end and/or side views. Responsive to detecting the foreign object/hazard, its movement is tracked to determine whether it moves toward or away from the vehicle's underbody region. If the foreign object/hazard moves to the underbody region, control signals are transmitted to the vehicle's propulsion and/or steering system to automate preventative action that prevents collision of the vehicle with the foreign object/hazard and/or removes it from the underbody.
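The tracking decision described above can be sketched as: given successive positions of a detected object within an ROI, decide whether its distance to the underbody region is shrinking. This is a hedged illustration; the coordinate convention, function name, and two-sample comparison are assumptions, not the claimed tracking method.

```python
def approaching_underbody(track: list[float], underbody_x: float = 0.0) -> bool:
    """Return True if the tracked object's lateral distance to the underbody
    region shrinks over the track (illustrative 1-D simplification)."""
    distances = [abs(x - underbody_x) for x in track]
    return len(distances) >= 2 and distances[-1] < distances[0]

# An object drifting from 2.0 m toward the underbody triggers preventative action;
# one drifting away does not.
toward = approaching_underbody([2.0, 1.4, 0.8])   # True
away = approaching_underbody([0.8, 1.4, 2.0])     # False
```

In the claimed system this boolean would gate the control signals sent to the propulsion and/or steering systems.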