G05D1/249

Navigating an autonomous vehicle based upon an image from a mobile computing device

An autonomous vehicle receives geographic location data defined via a mobile computing device operated by a user. The geographic location data is indicative of a device position of the mobile computing device. The autonomous vehicle also receives image data generated by the mobile computing device. The image data is indicative of a surrounding position near the device position. The surrounding position is selected from an image captured by a camera of the mobile computing device. A requested vehicle position (e.g., a pick-up or drop-off location) is set for a trip of the user in the autonomous vehicle based on the geographic location data and the image data. A route from a current vehicle position of the autonomous vehicle to the requested vehicle position for the trip of the user in the autonomous vehicle is generated. Moreover, the autonomous vehicle can follow the route to the requested vehicle position.
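The fusion of the device's GPS fix with the image-selected point, and the subsequent route generation, can be sketched roughly as follows. This is a minimal illustration, not the patented method: reducing the image selection to a bearing and range, the equirectangular offset, and the straight-line waypoint route are all simplifying assumptions.

```python
import math

def requested_vehicle_position(device_pos, bearing_deg, distance_m):
    """Fuse the device's GPS fix with a point the user selected in the
    camera image (reduced here to a bearing and range estimate)."""
    lat, lon = device_pos
    # Equirectangular approximation: adequate for short pick-up offsets.
    dlat = (distance_m * math.cos(math.radians(bearing_deg))) / 111_320.0
    dlon = (distance_m * math.sin(math.radians(bearing_deg))) / (
        111_320.0 * math.cos(math.radians(lat)))
    return (lat + dlat, lon + dlon)

def straight_line_route(current_pos, target_pos, n_segments=5):
    """Placeholder route generator: interpolate waypoints between the
    vehicle's current position and the requested vehicle position."""
    (lat0, lon0), (lat1, lon1) = current_pos, target_pos
    return [(lat0 + (lat1 - lat0) * i / n_segments,
             lon0 + (lon1 - lon0) * i / n_segments)
            for i in range(n_segments + 1)]
```

A production system would replace the straight-line interpolation with a road-network planner; the sketch only shows how the two inputs combine into one requested position.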

Visual overlays for indicating stopping distances of remote controlled machines
11885102 · 2024-01-30

A controller may receive, from a sensor device, machine velocity data indicating a velocity of a machine controlled by a remote control device. The controller may determine, based on the machine velocity data, a distance to be traveled by the machine until the machine stops traveling after a communication, between the machine and the remote control device, is interrupted. The controller may generate, based on the distance, an overlay to indicate the distance. The controller may provide the overlay, for display, with a video feed of an environment surrounding the machine.
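The stopping-distance computation lends itself to a short sketch. The model below (coast for the link-timeout latency, then brake at a constant rate) and all numeric values are illustrative assumptions, not taken from the patent:

```python
def stopping_distance(v_mps, latency_s=0.5, decel_mps2=3.0):
    """Distance the machine keeps travelling after communication with the
    remote control device is interrupted: it coasts for the link-timeout
    latency, then decelerates at a fixed rate until it stops."""
    return v_mps * latency_s + v_mps ** 2 / (2.0 * decel_mps2)

def overlay_for(v_mps, metres_per_pixel=0.05):
    """Build a minimal overlay description to composite onto the video
    feed of the environment surrounding the machine."""
    d = stopping_distance(v_mps)
    return {
        "label": f"stop in {d:.1f} m",
        "bar_px": int(round(d / metres_per_pixel)),
    }
```

At 6 m/s this yields 3 m of coasting plus 6 m of braking; the overlay renderer would draw `bar_px` pixels of warning bar scaled to the camera's ground resolution.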

MOBILE BODY, INFORMATION PROCESSOR, MOBILE BODY SYSTEM, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM
20240103517 · 2024-03-28

An information processing method of an information processor includes: obtaining information received from a mobile body through wireless communication, the mobile body including a movement mechanism and an imaging unit configured to capture image data, the information received from the mobile body including captured image data obtained by the imaging unit, with the captured image data being updated periodically; and generating route guidance information for use in moving the mobile body by the movement mechanism. The captured image data is stored together with data update time information. The route guidance information includes at least two selectable routes. The route guidance information is generated based on the captured image data, position information of the mobile body, and the data update time information.
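One plausible use of the data update time information is to rank the selectable routes by how fresh the captured imagery along them is. The sketch below is an assumed interpretation, not the patented algorithm; the store layout and scoring rule are illustrative:

```python
import time

def score_route(route, image_store, now=None):
    """Score a candidate route by the freshness of the captured image data
    along it. image_store maps waypoint -> (image_data, update_time); the
    stalest observation on the route dominates the score."""
    now = time.time() if now is None else now
    staleness = [now - image_store[w][1] for w in route if w in image_store]
    return -max(staleness, default=float("inf"))

def guidance(routes, image_store, now=None):
    """Return the selectable routes ordered freshest-imagery first; the
    method requires at least two selectable routes."""
    assert len(routes) >= 2
    return sorted(routes,
                  key=lambda r: score_route(r, image_store, now),
                  reverse=True)
```

The operator (or planner) still chooses among the returned routes; the ordering just surfaces the corridor most recently confirmed by the imaging unit.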

ADAPTIVE ILLUMINATION FOR A TIME-OF-FLIGHT CAMERA ON A VEHICLE
20240107175 · 2024-03-28

Disclosed are devices, systems and methods for capturing an image. In one aspect an electronic camera apparatus includes an image sensor with a plurality of pixel regions. The apparatus further includes an exposure controller. The exposure controller determines, for each of the plurality of pixel regions, a corresponding exposure duration and a corresponding exposure start time. Each pixel region begins to integrate incident light starting at the corresponding exposure start time and continues to integrate light for the corresponding exposure duration. In some example embodiments, at least two of the corresponding exposure durations or at least two of the corresponding exposure start times are different in the image.
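A minimal exposure-controller sketch follows. Choosing durations inversely proportional to measured region brightness, and centring each exposure in the frame window, are illustrative assumptions; the patent only requires per-region durations and start times, at least two of which differ.

```python
def plan_exposures(mean_intensity, target=128.0, base_ms=10.0, frame_ms=33.0):
    """For each pixel region, pick an exposure duration inversely
    proportional to its measured brightness, and a start time that
    centres the exposure within the frame window."""
    plan = {}
    for region, mean in mean_intensity.items():
        duration = min(frame_ms, base_ms * target / max(mean, 1.0))
        start = (frame_ms - duration) / 2.0   # centre-aligned exposure
        plan[region] = (start, duration)
    return plan
```

A dark region thus integrates incident light longer than a bright one, and regions with different durations necessarily begin integrating at different start times.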

MAP GENERATION AND CONTROL SYSTEM

One or more information maps are obtained by an agricultural work machine. The one or more information maps map one or more agricultural characteristic values at different geographic locations of a field. An in-situ sensor on the agricultural work machine senses an agricultural characteristic as the agricultural work machine moves through the field. A predictive map generator generates a predictive map that predicts a predictive agricultural characteristic at different locations in the field based on a relationship between the values in the one or more information maps and the agricultural characteristic sensed by the in-situ sensor. The predictive map can be output and used in automated machine control.
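The "relationship between the values in the information maps and the sensed characteristic" can be illustrated with an ordinary least-squares fit. This is a sketch under that assumption; the patent does not commit to a linear model, and the function names are hypothetical:

```python
def fit_relationship(map_vals, sensed_vals):
    """Least-squares line relating prior information-map values to the
    in-situ sensor readings collected at the same field locations."""
    n = len(map_vals)
    mx = sum(map_vals) / n
    my = sum(sensed_vals) / n
    sxx = sum((x - mx) ** 2 for x in map_vals)
    sxy = sum((x - mx) * (y - my) for x, y in zip(map_vals, sensed_vals))
    slope = sxy / sxx
    return slope, my - slope * mx

def predictive_map(info_map, slope, intercept):
    """Apply the fitted relationship at every mapped location, yielding
    the predicted agricultural characteristic for machine control."""
    return {loc: slope * v + intercept for loc, v in info_map.items()}
```

The resulting predictive map can then be consumed by the machine controller exactly like any other georeferenced layer.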

CONVEYANCE SYSTEM, CONTROL APPARATUS, AND CONTROL METHOD
20240103544 · 2024-03-28

A conveyance system (1) according to the present disclosure includes: a conveyance vehicle (10) that conveys an object based on a first traveling path; a sensor (20) that transmits information regarding a position of the conveyance vehicle (10) via a network; a communication unit (32) that can communicate with the conveyance vehicle (10) and the sensor (20); and a control unit (31) that controls the conveyance vehicle (10) via the communication unit (32). The control unit (31) determines a second traveling path based on the information regarding the position of the conveyance vehicle (10) and corrects a traveling trajectory of the conveyance vehicle (10) based on the first traveling path and the second traveling path.
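The trajectory correction from the first and second traveling paths can be sketched as a simple proportional blend: the externally observed path pulls the planned trajectory back toward the vehicle's true position. The gain and the point-wise pairing are illustrative assumptions, not the disclosed control law:

```python
def corrected_trajectory(first_path, second_path, gain=0.5):
    """Blend the planned (first) path with the path reconstructed from the
    external sensor's position reports (second path); gain sets how
    strongly the observed deviation corrects the trajectory."""
    return [
        (x1 + gain * (x2 - x1), y1 + gain * (y2 - y1))
        for (x1, y1), (x2, y2) in zip(first_path, second_path)
    ]
```

With gain 0.5 the corrected trajectory splits the difference between plan and observation at every waypoint; a gain of 1.0 would follow the sensor-derived path outright.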

AUTO-LOCATING AND AUTONOMOUS OPERATION DATA STORAGE
20240101278 · 2024-03-28

Techniques for auto-locating and autonomous operation data storage are disclosed. An example method can include storing a representation of an aircraft in a data storage of the computing system. The representation and instructions for performing a first operation on the aircraft can be transmitted to a first robot. The first robot can be configured to identify the aircraft based on a first feature of the representation. The method can further include receiving sensor data. The method can further include transmitting the sensor data to an analysis service to identify a state of a part of the aircraft. The method can further include storing the analysis in the data storage. The method can further include generating second instructions for performing a second operation on the aircraft based on the analysis and the representation stored in the data storage.
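The store-identify-analyse-replan loop described above can be sketched as a small data store. The class, its method names, and the "repair any part not analysed as ok" rule are all hypothetical illustrations of the workflow, not the claimed system:

```python
class OperationStore:
    """Minimal data store tying an aircraft representation to analysis
    results and deriving follow-up operations from both."""

    def __init__(self):
        self.representations = {}
        self.analyses = {}

    def store_representation(self, aircraft_id, features):
        # The robot identifies the aircraft from a feature of this record.
        self.representations[aircraft_id] = features

    def record_analysis(self, aircraft_id, part, state):
        # Persist the analysis-service verdict for one part.
        self.analyses.setdefault(aircraft_id, {})[part] = state

    def second_instructions(self, aircraft_id):
        """Generate second operations for any part whose analysed state
        needs work, locating via the stored representation."""
        rep = self.representations[aircraft_id]
        return [
            {"op": "repair", "part": part, "locate_by": rep["key_feature"]}
            for part, state in self.analyses.get(aircraft_id, {}).items()
            if state != "ok"
        ]
```

The point of the sketch is the dependency flow: the second instructions are a function of both the stored analysis and the stored representation, as the abstract describes.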