G05D1/2235

Individual plant recognition and localization

Implementations are described herein for training and applying machine learning models to digital images capturing plants, and to other data indicative of attributes of individual plants captured in the digital images, to recognize individual plants in distinction from other individual plants. In various implementations, a digital image that captures a first plant of a plurality of plants may be applied, along with additional data indicative of an additional attribute of the first plant observed when the digital image was taken, as input across a machine learning model to generate output. Based on the output, an association may be stored in memory, e.g., of a database, between the digital image that captures the first plant and one or more previously-captured digital images of the first plant.
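The abstract does not specify the model or the matching rule, so the following is only a minimal sketch of the idea: a stand-in `embed` function combines image-derived features with the additional observed attribute, and a cosine-similarity threshold (an assumed criterion) decides whether the new capture is associated with previously captured images of the same plant. All names and the threshold are illustrative, not from the patent.

```python
import math

def embed(image_features, extra_attribute):
    # Stand-in for the machine learning model: combines image-derived
    # features with the additional per-plant attribute observed at capture
    # time (e.g. a height measurement).
    return image_features + [extra_attribute]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def associate(new_image, new_attr, database, threshold=0.95):
    """Link a new capture to prior images of the same individual plant."""
    query = embed(new_image, new_attr)
    for plant_id, record in database.items():
        if cosine(query, record["embedding"]) >= threshold:
            record["images"].append(new_image)   # store the association
            return plant_id
    # No sufficiently similar prior plant: register a new individual.
    plant_id = len(database)
    database[plant_id] = {"embedding": query, "images": [new_image]}
    return plant_id
```

Here the "database" is a plain dict keyed by plant identifier; the patent's stored association between the new image and previously captured images of the same plant corresponds to appending to the matched record's image list.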

Object search service employing an autonomous vehicle fleet

A computing system can receive an object search request from a user, indicating a request to search for a specific object in an area traversed by one or more autonomous vehicles. The object search request can include a set of physical characteristics of the specific object. The computing system can then transmit a signal to an autonomous vehicle requesting that the autonomous vehicle search for the specific object. The signal can cause the autonomous vehicle to transmit an image, selected based on a physical characteristic of the object, to the computing system. The computing system can then generate a score indicative of a difference between one or more physical characteristics of the object in the image and the specific object. The computing system can then selectively transmit the image to a mobile device operated by the user based on the score.
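The scoring rule is not specified in the abstract; a plausible minimal sketch is a weighted mismatch sum over the requested characteristics, with the image forwarded to the user's device only when the score is low enough. The characteristic names, weights, and threshold below are assumptions for illustration.

```python
def difference_score(detected, requested, weights=None):
    """Weighted sum of per-characteristic mismatches (0 = perfect match)."""
    weights = weights or {key: 1.0 for key in requested}
    score = 0.0
    for key, wanted in requested.items():
        seen = detected.get(key)
        if isinstance(wanted, (int, float)) and isinstance(seen, (int, float)):
            score += weights[key] * abs(seen - wanted)   # numeric difference
        else:
            score += weights[key] * (0.0 if seen == wanted else 1.0)
    return score

def should_transmit(detected, requested, max_score=1.0):
    """Selectively forward the image to the user's mobile device."""
    return difference_score(detected, requested) <= max_score
```

For example, a request for a red object about 1 m tall would accept a detected red object 1.2 m tall (score 0.2) but reject a blue one 2.5 m tall (score 2.5).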

Server, information processing system and information processing method
12197210 · 2025-01-14

A vehicle, which is an automatic driving vehicle, acquires a control program for an ECU from a control center by wireless communication. A server includes a keyboard and a mouse to accept input from a remote monitoring person who monitors the vehicle from outside the vehicle, a display to present information to the remote monitoring person, a communication interface (IF) configured to communicate with the vehicle, and a processor. When the control program for the vehicle has been updated, the processor controls the display so that the server presents the changes to the control program to the remote monitoring person. When the keyboard and mouse have accepted the remote monitoring person's input permitting the vehicle to travel in accordance with the updated control program, the processor controls the communication IF so that a notice of permission to travel is sent to the vehicle.

Camera calibration

A first plurality of center points of first two-dimensional bounding boxes, corresponding to a vehicle occurring in a first plurality of images acquired by a first camera, can be determined. A second plurality of center points of second two-dimensional bounding boxes, corresponding to the vehicle occurring in a second plurality of images acquired by a second camera, can also be determined. A plurality of non-linear equations based on the locations of the first and second pluralities of center points and first and second camera parameters corresponding to the first and second cameras can be determined. The plurality of non-linear equations can be solved simultaneously for the locations of the vehicle with respect to the first and second cameras and the six-degree-of-freedom pose of the second camera with respect to the first camera.
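A sketch of how such a system of equations can be set up, under simplifying assumptions not stated in the abstract: a pinhole camera with shared intrinsics, and a camera-2 pose reduced to a yaw angle plus translation rather than the full six-degree-of-freedom pose. The residual vector below (one reprojection error pair per observation per camera) is what would in practice be handed to a nonlinear least-squares solver such as Levenberg-Marquardt; the solver itself is omitted.

```python
import math

def bbox_center(box):
    """Center point of a 2D bounding box given as (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = box
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def project(point_cam, f, cx, cy):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates."""
    X, Y, Z = point_cam
    return (f * X / Z + cx, f * Y / Z + cy)

def residuals(params, centers1, centers2, f, cx, cy):
    """Nonlinear system: reprojection residuals against both cameras.

    params packs the vehicle positions (3 values per observation, in the
    first camera's frame) followed by the second camera's pose, simplified
    here to (yaw, tx, ty, tz).
    """
    n = len(centers1)
    yaw, tx, ty, tz = params[3 * n:]
    c, s = math.cos(yaw), math.sin(yaw)
    res = []
    for i, ((u1, v1), (u2, v2)) in enumerate(zip(centers1, centers2)):
        X, Y, Z = params[3 * i:3 * i + 3]
        # Residual against the first camera.
        pu, pv = project((X, Y, Z), f, cx, cy)
        res += [pu - u1, pv - v1]
        # Rotate about the vertical axis and translate into camera 2's
        # frame, then reproject for the second residual.
        X2 = c * X + s * Z + tx
        Z2 = -s * X + c * Z + tz
        pu, pv = project((X2, Y + ty, Z2), f, cx, cy)
        res += [pu - u2, pv - v2]
    return res
```

At the true vehicle positions and camera-2 pose, every residual is zero; the calibration amounts to finding the parameter vector that minimizes the sum of squared residuals.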

Autonomous and user controlled vehicle summon to a target

A processor coupled to memory is configured to receive an identification of a geographical location associated with a target specified by a user remote from a vehicle. A machine learning model is utilized to generate a representation of at least a portion of an environment surrounding the vehicle using sensor data from one or more sensors of the vehicle. At least a portion of a path to a target location corresponding to the received geographical location is calculated using the generated representation of the at least a portion of the environment surrounding the vehicle. At least one command is provided to automatically navigate the vehicle based on the determined path and updated sensor data from at least a portion of the one or more sensors of the vehicle.
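The abstract leaves the path-calculation method open; one common concrete choice, shown here only as an assumed illustration, is to rasterize the model-generated environment representation into an occupancy grid and run a shortest-path search from the vehicle's cell to the summon target's cell. Breadth-first search is used below for brevity; a real planner would typically use A* with kinematic constraints.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path over an occupancy grid (0 = free, 1 = blocked).

    Stand-in for path calculation over the environment representation;
    returns a list of (row, col) cells from start to goal, or None if the
    goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

The resulting cell sequence would then be turned into navigation commands and replanned as updated sensor data arrives.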

Lighting floor on sides of material handling vehicle to indicate limited or non-limited area

A method is provided for controlling a light source device associated with a materials handling vehicle, wherein the materials handling vehicle includes one or more sensing devices. The method comprises: sensing, via the one or more sensing devices, a first distance from a left side of the vehicle to a first boundary object and a second distance from a right side of the vehicle to a second boundary object; and controlling the light source device to designate a first area to the left side of the vehicle as either a limited operation area or a non-limited operation area and to designate a second area to the right side of the vehicle as either a limited operation area or a non-limited operation area.
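The abstract does not state how a sensed distance maps to a designation, so the following sketch assumes a simple clearance threshold: a side whose boundary object is closer than the threshold is designated a limited operation area, and the floor light colours are an illustrative choice, not from the patent.

```python
LIMIT_M = 1.5  # assumed clearance threshold in metres

def classify_sides(left_distance, right_distance, limit=LIMIT_M):
    """Designate each side as 'limited' or 'non-limited' from the two
    sensed boundary-object distances."""
    def side(distance):
        return "limited" if distance < limit else "non-limited"
    return {"left": side(left_distance), "right": side(right_distance)}

def light_command(designation):
    """Map a side's designation to a floor-light colour (illustrative)."""
    return {"limited": "red", "non-limited": "green"}[designation]
```

For example, with a rack 0.8 m to the left and an open aisle 3.0 m to the right, the left floor area would be lit as a limited operation area and the right as a non-limited one.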

RADIO CONTROLLED AIRCRAFT, REMOTE CONTROLLER AND METHODS FOR USE THEREWITH
20170102698 · 2017-04-13

A radio controlled (RC) vehicle includes a receiver that is configured to receive an RF signal from a remote control device. The RF signal contains command data in accordance with a first coordinate system that is from a perspective of the remote control device. A motion sensor is configured to generate motion data. A processor is configured to transform the command data into control data based on the motion data and in accordance with a second coordinate system that is from a perspective of the RC vehicle. A plurality of control devices are configured to control motion of the RC vehicle based on the control data.
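The core of this transform is a rotation between the two coordinate systems. As a minimal sketch, assuming the motion data yields a heading for each frame and the command is a planar vector, the conversion rotates the operator's command by the relative heading so that "forward" on the remote means the same world direction regardless of the vehicle's orientation. The function names and frame convention are assumptions.

```python
import math

def transform_command(cmd_xy, controller_heading, vehicle_heading):
    """Rotate a planar command from the remote control's coordinate system
    into the RC vehicle's, using headings (radians) from motion data."""
    dtheta = vehicle_heading - controller_heading
    x, y = cmd_xy
    c, s = math.cos(dtheta), math.sin(dtheta)
    # Express the operator-frame command in the vehicle's frame, so the
    # control devices can act on it directly.
    return (c * x + s * y, -s * x + c * y)
```

For instance, if the vehicle has yawed 90 degrees relative to the remote, a pure "forward" command is turned into a lateral command in the vehicle's own frame.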

METHOD AND DEVICE FOR REMOTE CONTROL OF THE MOVEMENT OF A VEHICLE

A method for remotely controlling the movement of a vehicle by means of a hand-held device is described. The method includes the following steps: activating a button by pressing and holding it; setting a tilt angle of the device, measured from the device's initial tilt angle at the moment the button was activated; receiving a signal representing the current tilt angle of the device; converting the received signal into a virtual pedal position, which corresponds to the position of the physical pedal that the virtual pedal represents; controlling the movement of the vehicle according to the virtual pedal position; and stopping the vehicle when the button is released.
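The steps above can be sketched as a tilt-to-pedal mapping with dead-man behaviour. The linear mapping, the full-pedal tilt of 30 degrees, and the speed scaling are assumptions for illustration; the patent only specifies that the tilt change since button activation drives a virtual pedal and that releasing the button stops the vehicle.

```python
def virtual_pedal_position(current_tilt, initial_tilt, max_tilt=30.0):
    """Map the tilt change since button press to a pedal position in [0, 1].

    Angles in degrees; max_tilt is the assumed tilt for a fully pressed pedal.
    """
    delta = current_tilt - initial_tilt
    return max(0.0, min(1.0, delta / max_tilt))

def vehicle_speed(button_held, current_tilt, initial_tilt, max_speed=5.0):
    """Dead-man behaviour: releasing the button stops the vehicle."""
    if not button_held:
        return 0.0
    return max_speed * virtual_pedal_position(current_tilt, initial_tilt)
```

Tilting the device 15 degrees past its angle at button press thus corresponds to a half-pressed pedal, while releasing the button always commands zero speed.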