Patent classifications
G01C21/3647
METHOD AND DEVICE FOR USING AUGMENTED REALITY IN TRANSPORTATION
Augmented reality may be used to help navigate a user to a desired location. A first location and a device orientation of a user device may be obtained. A second location may be obtained. A path from the first location to the second location may be determined. A camera feed may be displayed on the user device. An indicator of the path may be displayed over the camera feed based on the device orientation. A marker may be displayed over the camera feed in response to determining that the device orientation aligns with the second location, and the marker may indicate the second location.
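The alignment check described above reduces to comparing the device's compass heading against the great-circle bearing from the first location toward the second location, and showing the marker when the bearing falls inside the camera's field of view. A minimal sketch (function names and the field-of-view tolerance are illustrative, not from the abstract):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, degrees in [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def destination_in_view(device_heading_deg, device_loc, target_loc, fov_deg=60.0):
    """True when the destination's bearing lies within the camera's field of view,
    i.e. when the marker for the second location should be drawn over the feed."""
    b = bearing_deg(*device_loc, *target_loc)
    # Smallest signed angular difference between heading and bearing.
    diff = abs((b - device_heading_deg + 180) % 360 - 180)
    return diff <= fov_deg / 2
```

In a full implementation, the same bearing difference would also drive where on screen the path indicator is drawn as the device rotates.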
Techniques for manipulating panoramas
A multi-step animation sequence for smoothly transitioning from a map view to a panorama view of a specified location is disclosed. An orientation overlay can be displayed on the panorama, showing a direction and angular extent of the field of view of the panorama. An initial specified location and a current location of the panorama can also be displayed on the orientation overlay. A navigable placeholder panorama to be displayed in place of a panorama at the specified location when panorama data is not available is disclosed. A perspective view of a street name annotation can be laid on the surface of a street in the panorama.
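A multi-step transition like the one described can be sketched as a sequence of eased interpolation phases (e.g. zoom the map, crossfade, then pan into the panorama). The phase structure and easing curve below are illustrative assumptions, not the disclosed animation:

```python
def ease(t):
    """Smoothstep easing: 0 -> 1 with zero slope at both ends."""
    return t * t * (3 - 2 * t)

def transition_value(t, phases):
    """Evaluate a multi-step animation at time t (seconds).
    phases: list of (duration_s, start_value, end_value), run back to back."""
    for dur, a, b in phases:
        if t < dur:
            return a + (b - a) * ease(t / dur)
        t -= dur
    return phases[-1][2]  # animation finished: hold the final value
```

One such scalar could drive the map zoom level in the first phase and the panorama pan angle in a later phase, with the orientation overlay updated from the current pan angle each frame.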
Implement guidance display system for work vehicles
An implement guidance display system may be deployed onboard a work vehicle including an operator station and chassis. The implement guidance display system includes a display device within the operator station of the work vehicle, implement data sources configured to provide implement tracking data pertaining to the work implement when mounted to the chassis, and a controller in signal communication with the display device and with the implement data sources. The controller is configured to: (i) receive the implement tracking data from the implement data sources; (ii) establish a projected trajectory of the work implement utilizing the implement tracking data; and (iii) generate, on the display device, implement trajectory symbology indicative of the projected trajectory of the work implement.
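Establishing a projected trajectory from implement tracking data can be sketched as a flat-ground kinematic forward projection. The state variables, step size, and horizon below are assumptions for illustration, not the patent's model (a real system would also fuse hitch angle, articulation, and terrain):

```python
import math

def project_trajectory(x, y, heading_deg, speed_mps,
                       steering_rate_dps=0.0, horizon_s=5.0, dt=0.5):
    """Forward-project implement position at fixed time steps.
    Returns a list of (x, y) points suitable for drawing trajectory symbology."""
    pts = []
    h = math.radians(heading_deg)
    w = math.radians(steering_rate_dps)
    t = 0.0
    while t < horizon_s:
        t += dt
        h += w * dt                       # heading drifts with steering rate
        x += speed_mps * dt * math.sin(h)  # east component
        y += speed_mps * dt * math.cos(h)  # north component
        pts.append((x, y))
    return pts
```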
PROCESSING DEVICE, PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM
A processing device according to one aspect of the present invention includes a vibration detecting unit configured to detect a vibration of a vehicle, an orientation detecting unit configured to detect an orientation of the vehicle, a storing unit configured to store information indicating the orientation of the vehicle, a processing unit configured to set an orientation of the vehicle on a map of a navigation system based on the information indicating the orientation of the vehicle, and a power supply controlling unit configured to control power supplied to the orientation detecting unit and the storing unit based on a detection result of the vibration detecting unit and in response to an accessory power of the vehicle being turned OFF.
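The power-supply control can be sketched as a small decision function. The specific policy below, keeping the orientation detector and store energized only while vibration is detected after accessory power goes off (so orientation changes during towing or ferrying are still captured), is one plausible reading of the abstract, not a confirmed implementation:

```python
def units_powered(acc_on, vibration_detected):
    """Hypothetical power policy for the orientation detecting and storing units.
    ACC on: always powered. ACC off: powered only while the vehicle vibrates,
    i.e. while its orientation may still be changing."""
    if acc_on:
        return True
    return vibration_detected
```

The stored orientation would then let the processing unit restore the vehicle's map orientation correctly on the next power-up.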
PROVIDING A MODEL OF A VEHICLE TO A RIDER AT AN ACCURATE ORIENTATION
Systems and methods are disclosed herein for providing a model of a vehicle to a rider. In an embodiment, the systems and methods determine that a driver has accepted a task to operate a vehicle from a first location to a rider location. The systems and methods then access a model corresponding to the vehicle, determine a direction of movement of the vehicle, and determine a relative orientation of the vehicle with respect to the rider based on the first location, the direction of movement of the vehicle, and the rider location. The systems and methods generate for display, on a client device of the rider, the model at an angle determined based on the relative orientation.
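Determining the relative orientation reduces to expressing the vehicle's direction of movement relative to the rider's line of sight toward the vehicle. A flat-ground sketch with hypothetical names and local x/y coordinates (east, north):

```python
import math

def model_display_angle(vehicle_xy, movement_heading_deg, rider_xy):
    """Angle (degrees) at which to render the vehicle model on the rider's
    screen: the vehicle's heading relative to the rider's line of sight.
    0 means the rider sees the rear of the model, 180 the front."""
    dx = vehicle_xy[0] - rider_xy[0]
    dy = vehicle_xy[1] - rider_xy[1]
    line_of_sight = math.degrees(math.atan2(dx, dy)) % 360  # rider -> vehicle
    return (movement_heading_deg - line_of_sight) % 360
```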
METHODS AND APPARATUSES FOR NAVIGATION GUIDANCE AND ESTABLISHING A THREE-DIMENSIONAL REAL SCENE MODEL, DEVICE AND MEDIUM
Provided are methods and apparatuses for navigation guidance and establishing a three-dimensional real scene model, a device and a medium, which relate to the field of artificial intelligence and, in particular, intelligent transportation technologies. A specific implementation includes: determining a navigation route of a navigation device and candidate three-dimensional real scene data corresponding to the navigation route; where the candidate three-dimensional real scene data includes candidate location information, candidate three-dimensional model data and a candidate observation viewing angle; matching real-time positioning information of the navigation device with the candidate location information in the candidate three-dimensional real scene data to obtain target three-dimensional real scene data; and simulating target three-dimensional model data in the target three-dimensional real scene data at a target observation viewing angle in the target three-dimensional real scene data and at a location corresponding to the real-time positioning information to obtain a real-time navigation guidance image.
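The matching step, pairing the device's real-time positioning information with the candidate location information, can be sketched as a nearest-candidate lookup with a distance gate. The record layout, coordinate model, and threshold are illustrative; a production system would use geodesic distance and arc-length along the navigation route:

```python
import math

def match_candidate(position, candidates, max_dist=25.0):
    """Return the candidate 3-D real-scene record whose stored location is
    nearest the real-time fix, or None if nothing is within max_dist."""
    best, best_d = None, max_dist
    for cand in candidates:
        cx, cy = cand["location"]
        d = math.hypot(position[0] - cx, position[1] - cy)
        if d < best_d:
            best, best_d = cand, d
    return best
```

The matched record's model data and observation viewing angle would then drive the simulated guidance image at the device's current position.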
Lane change notification
Aspects of the disclosure relate to providing a lane change notification when a vehicle is to perform a lane change. One or more computing devices may generate and display a video, where the video is generated from a perspective of a virtual camera at a default position and default pitch. The one or more computing devices may receive an indication that the vehicle is to perform a lane change from a first lane to a second lane and adjust, after the vehicle receives the indication, the default position and default pitch of the virtual camera to an updated position further above the vehicle relative to ground than the default position and an updated pitch directed more towards the ground than the default pitch. The video may be generated and displayed from the perspective of the virtual camera at the updated position and updated pitch.
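The camera adjustment can be sketched as a blend between two poses, raising the camera and pitching it further toward the ground while a lane change is pending. The pose values and blend time are illustrative, not taken from the disclosure:

```python
DEFAULT_POSE = {"height_m": 8.0, "pitch_deg": -15.0}      # illustrative values
LANE_CHANGE_POSE = {"height_m": 14.0, "pitch_deg": -35.0}  # higher, more downward

def camera_pose(lane_change_pending, t_since_indication_s, blend_s=1.0):
    """Virtual camera pose for rendering the video: blends linearly from the
    default pose to the lane-change pose over blend_s seconds."""
    if lane_change_pending:
        a = min(max(t_since_indication_s / blend_s, 0.0), 1.0)
    else:
        a = 0.0
    return {k: DEFAULT_POSE[k] + a * (LANE_CHANGE_POSE[k] - DEFAULT_POSE[k])
            for k in DEFAULT_POSE}
```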
End of trip sequence
Aspects of the disclosure relate to providing an end of trip sequence when a vehicle is nearing its destination. One or more computing devices may generate and display a video indicating a projected trajectory of the vehicle and objects detected by sensors on the vehicle, on a map corresponding to a route the vehicle is currently following, where the video is generated from a perspective of a virtual camera at a default position and default pitch. A determination that the vehicle has reached a threshold relative to the route of the vehicle may be made and the position and pitch of the virtual camera may be adjusted to an updated position above the vehicle and a perspective which looks downwards towards a roof of the vehicle. The video may then be generated and displayed from the perspective of the virtual camera at the updated position.
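The threshold determination can be sketched as a simple check on the remaining distance along the route, switching the virtual camera to the top-down pose once the threshold is reached. The threshold and pose values are illustrative assumptions:

```python
def end_of_trip_camera(remaining_route_m, threshold_m=50.0):
    """Virtual camera pose for the end-of-trip sequence: within the threshold,
    move above the vehicle and look straight down at its roof."""
    if remaining_route_m <= threshold_m:
        return {"position": "above_vehicle", "pitch_deg": -90.0}
    return {"position": "behind_vehicle", "pitch_deg": -15.0}
```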
TRAVEL ROUTE OBSERVATION AND COMPARISON SYSTEM FOR A VEHICLE
A travel route observation and comparison system for a vehicle includes one or more processors and a memory communicably coupled to the one or more processors. The memory stores a display control module including instructions that when executed by the one or more processors cause the one or more processors to control at least one display device to simultaneously display at least a first visual representation of at least a portion of a first travel route currently being traveled by the vehicle, and a second visual representation of the at least a portion of the first travel route during a previous traveling of the first travel route.
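Displaying the same portion of the route from both the current and a previous traversal implies aligning the two recordings, for example by distance traveled along the route. A sketch under that assumption, with hypothetical names:

```python
import bisect

def previous_view_index(current_dist_m, prev_sample_dists):
    """Index of the previous-traversal sample closest to the vehicle's current
    arc-length along the route, so both visual representations show the same
    portion. Assumes prev_sample_dists is sorted ascending."""
    i = bisect.bisect_left(prev_sample_dists, current_dist_m)
    if i == 0:
        return 0
    if i == len(prev_sample_dists):
        return i - 1
    before, after = prev_sample_dists[i - 1], prev_sample_dists[i]
    return i - 1 if current_dist_m - before <= after - current_dist_m else i
```

The display control module could then render the live view alongside the recorded frame (or data) at the returned index.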
VEHICLE CONTROL SYSTEM AND OWN VEHICLE POSITION ESTIMATING METHOD
A vehicle control system includes a movement amount calculating unit configured to calculate a movement amount of a vehicle by using dead reckoning, an imaging device configured to capture an image of a travel route on which the vehicle is traveling, a map generating unit configured to generate a map of a surrounding area of the vehicle, and an own vehicle position estimating unit configured to estimate a position of the vehicle on the map. The own vehicle position estimating unit is configured to calculate a first own vehicle position based on the movement amount of the vehicle calculated by the movement amount calculating unit, calculate a second own vehicle position by comparing the image captured by the imaging device with the map, and estimate the position of the vehicle on the map based on the first own vehicle position and the second own vehicle position.
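Combining the two estimates into the final position on the map can be sketched as a fixed-weight blend, standing in for a proper filter (e.g. Kalman) that would weight by each estimate's uncertainty. The weight and the fallback behavior are illustrative assumptions:

```python
def fuse_positions(dr_pos, mm_pos, mm_weight=0.3):
    """Blend the dead-reckoning estimate (first own vehicle position) with the
    image-vs-map estimate (second own vehicle position). If map matching
    produced no fix this cycle, fall back to dead reckoning alone."""
    if mm_pos is None:
        return dr_pos
    return tuple((1 - mm_weight) * d + mm_weight * m
                 for d, m in zip(dr_pos, mm_pos))
```

Dead reckoning drifts without bound, while the image-to-map comparison is drift-free but intermittent; blending the two is what lets the estimate stay both smooth and anchored.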