Patent classifications
B60Q1/525
DETACHABLE VEHICLE-TRAVELING GUIDANCE SYSTEM AND METHOD OF GUIDING VEHICLE TRAVELING
Proposed is a detachable traveling guidance system including: a frame detachably mounted on a roof of a vehicle; a monitoring module, mounted on the frame, including a sensor array configured to monitor a vicinity of the vehicle and a communicator, the frame being configured to adjust a position of the monitoring module; and a display mounted on a lower portion of the frame and positioned to cover a side surface or a rear surface of the vehicle, necessary information being output on the display for viewing from the outside.
TRACKING AND ALERT METHOD AND SYSTEM FOR WORKER PRODUCTIVITY AND SAFETY
An exemplary method and system are disclosed that facilitate the monitoring of a worker's productivity and safety, for example, at a construction site and in other similarly labor-intensive occupations and settings. The exemplary method and system provide wearable-based motion sensing and/or proximity sensing between a worker and construction equipment. A study discussed herein suggests that sensing at two or more torso locations can provide 95% accuracy in classifying a worker's action or motion at a construction site. In some embodiments, as discussed herein, the acquired sensed data are transmitted, over a mesh network, e.g., established between the wearable devices, to a cloud infrastructure to facilitate the real-time monitoring of actions at such sites.
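The two-torso-location classification described above can be illustrated with a minimal rule-based sketch. The sensor locations (chest, waist), variance thresholds, and action labels here are illustrative assumptions, not details taken from the disclosed system:

```python
# Hypothetical sketch: classify a worker's action from motion statistics
# measured at two torso locations. Thresholds and labels are assumptions.

def classify_action(chest_accel_var: float, waist_accel_var: float) -> str:
    """Classify an action from the variance of acceleration magnitude
    at two torso-mounted wearable sensors (chest and waist)."""
    if chest_accel_var < 0.05 and waist_accel_var < 0.05:
        return "idle"         # both sensors nearly still
    if waist_accel_var > 0.5:
        return "walking"      # strong lower-torso motion
    return "manual_work"      # upper-torso motion dominates

# Example: quiet chest and waist readings classify as idle.
print(classify_action(0.01, 0.02))
```

A deployed classifier would be trained on labeled motion data rather than fixed thresholds; the sketch only shows why two sensing locations carry more discriminative information than one.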
ALERTING SYSTEM
An alerting system to be applied to a vehicle configured to execute driving assistance control based on a recognition result from a recognition sensor includes: one or more memories configured to store blind area information indicating a blind area around a recognition range recognized by the recognition sensor; and one or more processors configured to acquire a vehicle condition including at least one of a vehicle speed or a door open or closed state of the vehicle, start a rendering process for rendering the blind area on a road surface around the vehicle in response to satisfaction of a start condition by the vehicle condition, and execute the rendering process by using a light emitting device.
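The start condition described above can be sketched as a simple predicate over the acquired vehicle condition. The specific threshold and the rule combining speed and door state are illustrative assumptions, not the claimed condition:

```python
# Hypothetical sketch of the start condition for rendering the blind area
# on the road surface. The speed threshold (1 km/h) is an assumption.

def should_render_blind_area(vehicle_speed_kph: float, door_open: bool) -> bool:
    """Start rendering when the vehicle is effectively stopped and a door
    is open, i.e., when people may step into the sensor's blind area."""
    return vehicle_speed_kph < 1.0 and door_open

# Example: stopped with a door open triggers rendering.
print(should_render_blind_area(0.0, True))
```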
TARGET DETECTION APPARATUS AND VEHICLE HAVING THE SAME MOUNTED THEREON
A target detection apparatus is configured to recognize a stationary object present within a detection range of an external sensor based on map information. Next, the target detection apparatus is configured to determine, by checking an image of the stationary object detected by the external sensor against the stationary object recognized from the map information, whether the image of the stationary object detected by the external sensor includes an undetected region. When the undetected region is identified, the target detection apparatus is configured to recognize an undetectable object present between the stationary object and a vehicle.
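The check between the map-derived stationary object and the sensor image can be sketched as a set difference: the region expected from the map but missing from the detected image is the undetected region, and its existence implies an occluding, otherwise undetectable object. The grid-cell representation here is an illustrative assumption:

```python
# Hypothetical sketch: compare the stationary object's expected extent
# (from map information) with the extent the external sensor actually
# detected. Cells are abstract region identifiers, an assumed model.

def find_undetected_region(expected: set, observed: set) -> set:
    """Cells expected from the map but absent from the sensor image."""
    return expected - observed

def occluder_present(expected: set, observed: set) -> bool:
    """Infer an undetectable object between the vehicle and the
    stationary object whenever an undetected region exists."""
    return bool(find_undetected_region(expected, observed))

# Example: half of a guardrail's expected extent is missing from the image.
print(find_undetected_region({1, 2, 3, 4}, {1, 2}))
```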
CONNECTED VEHICLE LOCATION TRACKING IN TAILLIGHT
The present disclosure provides methods and apparatuses of a highly sensitive and reliable IoT location tracking device in a form factor of a vehicle taillight. The tracking device replaces the existing taillight and comprises custom-designed GPS and Cellular antennae that are specially integrated with a novel dual-PCBA structure and an anti-theft tactile switch detection mechanism. The purposes of the new methods and apparatuses are to achieve the following: (1) improved reliability of signal reception and transmission with taillight integration; (2) improved electronic shielding of the processor board and PCB (printed circuit board) assembly; (3) high sensitivity of the new GPS and cellular antennae geometry and dimension design; (4) novel anti-theft security and better vehicle tracking management.
Autonomous vehicle intent signaling
Various technologies described herein pertain to controlling an autonomous vehicle to provide indicators that signal a driving intent of the autonomous vehicle. The autonomous vehicle includes a plurality of sensor systems that generate a plurality of sensor signals, a notification system, and a computing system. The computing system determines that the autonomous vehicle is to execute a maneuver that will cause the autonomous vehicle to traverse a portion of a driving environment of the autonomous vehicle. The computing system predicts that a person in the driving environment is to traverse the portion of the driving environment based upon the plurality of sensor signals. The computing system then controls the notification system to output a first indicator indicating that the autonomous vehicle plans to yield to the person or a second indicator indicating that the autonomous vehicle plans to execute the maneuver prior to the person traversing the portion of the driving environment.
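The choice between the two indicators can be sketched as a comparison of predicted arrival times at the shared portion of the driving environment. The time-based criterion and the two-second margin are illustrative assumptions, not the disclosed decision logic:

```python
# Hypothetical sketch: pick which indicator the notification system
# outputs. The margin and the time-to-conflict criterion are assumptions.

def select_indicator(av_time_to_conflict: float,
                     person_time_to_conflict: float) -> str:
    """Signal yielding if the person would plausibly reach the shared
    portion first (within a safety margin); otherwise signal that the
    vehicle will execute the maneuver before the person arrives."""
    margin = 2.0  # seconds, hypothetical safety margin
    if person_time_to_conflict < av_time_to_conflict + margin:
        return "yield_indicator"
    return "proceed_indicator"

# Example: a nearby pedestrian arriving first prompts the yield indicator.
print(select_indicator(5.0, 3.0))
```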
SADDLE-RIDING TYPE VEHICLE
A saddle-riding type vehicle includes: a front object recognition unit (54) which recognizes an object in front of a host vehicle (M); a side object recognition unit (54) which recognizes an object at the rear side of the host vehicle (M); a display unit (42) which notifies a driver of the existence of an object in the periphery of the host vehicle (M); and a notification control unit (160) which determines the existence of an object in front of the host vehicle (M) and the existence of an object at the rear side of the host vehicle (M) on the basis of the recognition results of the front object recognition unit (54) and the side object recognition unit (54) and controls the display unit (42). The notification control unit (160) controls, when it is determined that there is an object at the rear side of the host vehicle (M), the display unit (42) to display a first notification (A1) and controls, when it is determined that there are both an object at the rear side of the host vehicle (M) and an object in front of the host vehicle (M), the display unit (42) to display a second notification (A2) different from the first notification (A1).
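The notification control unit's selection between the two notifications reduces to a small decision table. The return labels reuse the abstract's reference signs A1 and A2; representing "no notification" as `None` is an illustrative assumption:

```python
# Hypothetical sketch of the notification control logic: A1 for a
# rear-side object only, A2 when objects exist both at the rear side
# and in front, nothing otherwise.

def select_notification(object_rear_side: bool, object_front: bool):
    """Return 'A1', 'A2', or None per the determined object existence."""
    if object_rear_side and object_front:
        return "A2"
    if object_rear_side:
        return "A1"
    return None

# Example: rear-side object only yields the first notification.
print(select_notification(True, False))
```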
INTELLIGENT ELECTRONIC FOOTWEAR AND LOGIC FOR NAVIGATION ASSISTANCE BY AUTOMATED TACTILE, AUDIO, AND VISUAL FEEDBACK
Presented are intelligent electronic footwear and apparel with controller-automated features, methods for making/operating such footwear and apparel, and control systems for executing automated features of such footwear and apparel. A method for operating an intelligent electronic shoe (IES) includes receiving, e.g., via a controller through a wireless communications device from a GPS satellite service, location data of a user. The controller also receives, e.g., from a backend server-class computer or other remote computing node, location data for a target object or site, such as a virtual shoe hidden at a virtual spot. The controller retrieves or predicts path plan data including a derived route for traversing from the user's location to the target's location within a geographic area. The controller then transmits command signals to a navigation alert system mounted to the IES's shoe structure to output visual, audio, and/or tactile cues that guide the user along the derived route.
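Turning the derived route into tactile, audio, or visual cues can be sketched as a bearing comparison between the user's heading and the next waypoint. The planar coordinates, the 20-degree straight-ahead band, and the cue labels are illustrative assumptions:

```python
import math

# Hypothetical sketch: translate the derived route into a simple turn cue
# by comparing the bearing to the next waypoint with the user's heading.

def next_cue(user_xy, waypoint_xy, heading_deg: float) -> str:
    """Return 'straight', 'left', or 'right' toward the next waypoint.
    Coordinates are planar; 0 degrees points along +y (north)."""
    dx = waypoint_xy[0] - user_xy[0]
    dy = waypoint_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    delta = (bearing - heading_deg + 540) % 360 - 180  # signed difference
    if abs(delta) < 20:          # assumed straight-ahead tolerance
        return "straight"
    return "right" if delta > 0 else "left"

# Example: waypoint due east of a north-facing user prompts a right cue.
print(next_cue((0, 0), (10, 0), 0))
```

The IES would render these cues through its navigation alert system as haptic pulses, audio prompts, or lights; the mapping from cue to actuator is outside this sketch.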
Haptic and/or Tactile Warning of People via a Vehicle
An assembly for outputting tactile warning signals into the environment of a vehicle includes at least one sensor for scanning the vehicle environment and detecting sensor data, and at least one signal generator for outputting a warning signal that is tactilely and/or haptically perceptible to people in the vehicle environment. The assembly further includes a control device for receiving and evaluating the sensor data and for generating control commands for actuating the signal generator in the event of a person being detected in the vehicle environment.
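The control device's evaluate-then-command step can be sketched as follows. The detection record format, the 5 m warning radius, and the command dictionary are illustrative assumptions rather than the disclosed interfaces:

```python
# Hypothetical sketch of the control device: evaluate person detections
# from the sensor and generate an actuation command for the tactile
# signal generator. Field names and the radius are assumptions.

def control_command(persons_detected: list, warn_radius_m: float = 5.0) -> dict:
    """Actuate the signal generator toward persons within the warning radius."""
    near = [p for p in persons_detected if p["distance_m"] <= warn_radius_m]
    return {"actuate": bool(near), "targets": [p["id"] for p in near]}

# Example: one of two detected persons is close enough to warn.
print(control_command([{"id": 1, "distance_m": 3.0},
                       {"id": 2, "distance_m": 9.0}]))
```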
Vehicle having a projector for projecting an image on a road surface
Information related to a vehicle can be displayed by projecting an image, based on the information, on a road surface or the like. An image projection apparatus that projects an image includes: a sensor unit that acquires information related to a vehicle; and an image projection unit that projects the image based on the information acquired by the sensor unit.
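The sensor-unit-to-projection-unit flow can be sketched as a small pipeline. The information fields and the image names chosen from them are illustrative assumptions:

```python
# Hypothetical sketch: a sensor unit supplies vehicle-related information,
# and the projection unit selects an image to project from it. The field
# names ("turn_signal", "braking") and image names are assumptions.

class RoadSurfaceProjector:
    def __init__(self, read_sensor):
        self.read_sensor = read_sensor  # callable returning vehicle info

    def compose_image(self) -> str:
        """Pick the image to project on the road surface."""
        info = self.read_sensor()
        if info.get("turn_signal") == "left":
            return "left-arrow"
        if info.get("braking"):
            return "stop-pattern"
        return "none"

# Example: a left turn signal selects a left-arrow projection.
projector = RoadSurfaceProjector(lambda: {"turn_signal": "left"})
print(projector.compose_image())
```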