Patent classifications
G01C21/3638
METHOD, APPARATUS AND COMPUTER PROGRAM PRODUCT FOR ADAPTIVE VENUE ZOOMING IN A DIGITAL MAP INTERFACE
A method, apparatus, and computer program product are provided for adaptive zoom control for zooming in on a venue beyond the maximum zoom level available in a digital map. An apparatus may be provided including at least one processor and at least one non-transitory memory including computer program code instructions. The computer program code instructions may be configured to, when executed, cause the apparatus to at least: provide for presentation of a map of a region including a venue; receive an input corresponding to a zoom-in action to view an enlarged portion of the region, where the enlarged portion of the region includes the venue; and in response to receiving the input corresponding to a zoom-in action to view the enlarged portion of the region, transition from the presentation of the map of the region to a presentation of a venue object corresponding to the venue.
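The zoom-transition behaviour described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the class, the `MAX_MAP_ZOOM` constant, and the mode names are all assumptions made for the sketch.

```python
MAX_MAP_ZOOM = 20  # assumed maximum zoom level of the base map

class MapView:
    """Minimal sketch of adaptive venue zooming: zoom-in inputs past the
    map's maximum zoom level transition the view to the venue object."""

    def __init__(self):
        self.zoom = 15
        self.mode = "map"

    def zoom_in(self, venue_in_enlarged_portion):
        if self.zoom < MAX_MAP_ZOOM:
            self.zoom += 1
        elif venue_in_enlarged_portion:
            # Beyond the map's maximum zoom: present the venue object instead.
            self.mode = "venue"
        return self.mode
```

A zoom-in at maximum zoom with a venue in view would thus switch the presentation from the regional map to the venue object.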
INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, PROGRAM, AND MOVING BODY
The present technique relates to an information processing apparatus, an information processing method, a program, and a moving body that can appropriately display content on top of a scene viewed by a user.
An aspect of the present technique provides an information processing apparatus that sets a frame as a superimposition location of content in a region corresponding to a surface of an object on the basis of a movement state of a user and generates visual information for displaying the content in the region corresponding to the set frame. The present technique can be applied to an apparatus that performs AR display of content.
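Selecting a superimposition frame from candidate surfaces on the basis of the user's movement state might look like the following sketch. The speed threshold, the minimum-area heuristic, and the surface dictionary fields are illustrative assumptions, not details from the patent.

```python
def choose_frame(speed_mps, surfaces):
    """Pick a surface region to use as the content frame, based on the
    user's movement state. Hypothetical heuristic: when the user moves
    faster, only larger surfaces remain legible, so small ones are skipped."""
    min_area = 1.0 if speed_mps < 1.5 else 4.0  # walking vs. faster movement
    candidates = [s for s in surfaces if s["area_m2"] >= min_area]
    if not candidates:
        return None  # no suitable superimposition location in view
    return max(candidates, key=lambda s: s["area_m2"])
```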
Navigation application with novel declutter mode
Some embodiments provide a navigation application with a novel declutter navigation mode. In some embodiments, the navigation application has a declutter control that, when selected, directs the navigation application to simplify a navigation presentation by removing or de-emphasizing non-essential items that are displayed in the navigation presentation. In some embodiments, the declutter control is a mode-selecting control that allows the navigation presentation to toggle between a normal first navigation presentation and a simplified second navigation presentation, which is also referred to below as a decluttered navigation presentation. During normal mode operation, the navigation presentation of some embodiments provides (1) a representation of the navigated route, (2) representations of the roads along the navigated route, (3) representations of major and minor roads that intersect or are near the navigated route, and (4) representations of buildings and other objects in the navigated scene. However, in the declutter mode, the navigation presentation of some embodiments provides a representation of the navigated route while providing a de-emphasized presentation of the roads that intersect or are near the navigated route. In some embodiments, the presentation shows the major roads that are not on the route with more emphasis than minor roads that are not on the route. Also, in some embodiments, the presentation fades out the minor roads not on the route more quickly than the major roads not on the route.
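The emphasis and fade behaviour described for the declutter mode can be sketched as an opacity function. The fade rates, field names, and linear fade model are illustrative assumptions for the sketch, not the patented rendering logic.

```python
def road_opacity(road, decluttered, seconds_since_declutter):
    """Opacity of a road in the navigation presentation.

    `road` is a dict with 'on_route' and 'major' flags. Off-route roads are
    de-emphasized in declutter mode; minor off-route roads fade faster than
    major ones (rates here are illustrative, per second)."""
    if not decluttered or road["on_route"]:
        return 1.0  # route and normal-mode roads stay fully emphasized
    fade_rate = 0.2 if road["major"] else 0.8
    return max(0.0, 1.0 - fade_rate * seconds_since_declutter)
```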
METHOD FOR PROVIDING A THREE-DIMENSIONAL MAP IN A MOTOR VEHICLE
A motor vehicle has at least one driver assistance system and a navigation system that generates a three-dimensional map and a route-related navigation representation when route guidance is activated by the navigation system. When the at least one driver assistance system is activated, sensor data describing the surroundings of the motor vehicle is acquired by at least one sensor unit of the motor vehicle according to a data acquisition rule of the driver assistance system. The acquired sensor data is evaluated to generate at least one driver-assistance-related additional information representation. The map with the route-related navigation representation and the driver-assistance-related additional information representation are output by a display device in the motor vehicle.
ROUTE GUIDANCE METHOD AND DEVICE USING AUGMENTED REALITY VIEW
A method of obtaining a route from a point of departure to a destination includes providing route guidance through an augmented reality (AR) view including an image captured by a camera. To provide route guidance, as a user terminal moves toward the destination along the obtained route, when an interaction occurs between the user terminal and a node or link of the route in which predetermined information is registered, content associated with the predetermined information is augmented and displayed as guidance information on an image of the AR view.
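One simple way such a node interaction could be detected is by proximity. The trigger radius, coordinate representation, and dictionary fields below are assumptions made for the sketch, not details from the patent.

```python
import math

def check_guidance_trigger(position, route_nodes, radius_m=15.0):
    """Return content registered on any route node the terminal has reached.

    `route_nodes` is a list of dicts with planar 'xy' coordinates and
    registered 'content'; an interaction is modelled here as coming within
    `radius_m` metres of the node (an illustrative proximity heuristic)."""
    for node in route_nodes:
        dx = position[0] - node["xy"][0]
        dy = position[1] - node["xy"][1]
        if math.hypot(dx, dy) <= radius_m:
            return node["content"]  # augment this as guidance on the AR view
    return None
```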
Emergency drone guidance device
Techniques are described for configuring a monitoring system to assist users during a detected emergency condition at a property. In some implementations, sensor data from one or more sensors located at the property is obtained by a monitoring system that is configured to monitor the property. A determination that there is an emergency condition at the property is made by the monitoring system based on the sensor data. A location of a person inside the property is determined by the monitoring system based on the sensor data. A first path to the person and a second path to guide the person away from the emergency condition are determined by the monitoring system. The first path to the person and the second path to guide the person away from the emergency condition are navigated by a computing device of the monitoring system.
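Determining the two paths amounts to path planning over a model of the property. The grid representation and breadth-first search below are a generic stand-in for whatever planner the monitoring system uses; the abstract does not specify one.

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first shortest path on a 0/1 occupancy grid (1 = blocked).

    Illustrative stand-in for the monitoring system's planner: called once
    to reach the person, and once to lead the person away from the hazard."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:  # walk predecessors back to the start
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cur
                queue.append((nr, nc))
    return None  # no route exists
```

A drone (or other computing device) would navigate `shortest_path(grid, drone_pos, person_pos)` first, then `shortest_path(grid, person_pos, exit_pos)`.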
LOCALIZING TRANSPORTATION REQUESTS UTILIZING AN IMAGE BASED TRANSPORTATION REQUEST INTERFACE
The present application discloses an improved transportation matching system, and corresponding methods and computer-readable media. According to the disclosed embodiments, the transportation matching system utilizes an image-based transportation request interface and an environmental digital image stream to efficiently generate transportation requests with accurate pickup locations. For instance, the disclosed system can utilize one or more environmental digital images provided from a requestor computing device (e.g., a mobile device or an augmented reality wearable device) to determine information such as the location of the requestor computing device and a transportation pickup location within the environmental digital images. Furthermore, the disclosed system can provide, for display on the requestor computing device, one or more augmented reality elements at the transportation pickup location within an environmental scene that includes the transportation pickup location.
Systems and methods for controlling mapping information inaccuracies
Systems and methods for correcting mapping information inaccuracies are described. In some aspects, the method includes receiving terrestrial data captured in an area of interest, and detecting, in the terrestrial data, features that identify ground points in the area of interest. The method also includes correlating the ground points with ground control points in the area of interest to determine a correspondence, and computing an aggregate of positional differences between corresponding points. The method further includes generating a report indicating a quality of the terrestrial data captured in the area of interest based on the aggregate.
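A common way to aggregate positional differences between corresponding points is a root-mean-square error against the ground control points. The RMSE choice, the pass/fail threshold, and the report fields are assumptions for the sketch; the abstract only says "an aggregate of positional differences".

```python
import math

def terrestrial_quality_report(ground_points, control_points, threshold_m=0.5):
    """Aggregate positional differences between detected ground points and
    surveyed ground control points, and report data quality.

    `ground_points` and `control_points` are parallel lists of (x, y) pairs
    already placed in correspondence; the threshold is illustrative."""
    squared = [(gx - cx) ** 2 + (gy - cy) ** 2
               for (gx, gy), (cx, cy) in zip(ground_points, control_points)]
    rmse = math.sqrt(sum(squared) / len(squared))
    return {"rmse_m": rmse, "quality": "pass" if rmse <= threshold_m else "fail"}
```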
Apparatus for displaying information of a vehicle and method thereof
A display apparatus for a vehicle includes: a controller configured to create map information; and a display device configured to display the map information created by the controller, wherein the controller controls the display device to display a path guidance texture based on a road shape when providing path guidance within the map information.
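Selecting a guidance texture from the road shape could be as simple as measuring how much the road polyline turns. The heading-sum heuristic, threshold, and texture names below are illustrative assumptions, not the patented method.

```python
import math

def guidance_texture(road_points):
    """Choose a path-guidance texture from the road shape.

    Sums the absolute heading change along the road polyline and picks a
    curved-arrow texture for winding segments (threshold is illustrative)."""
    headings = [math.atan2(y2 - y1, x2 - x1)
                for (x1, y1), (x2, y2) in zip(road_points, road_points[1:])]
    turn = sum(abs(b - a) for a, b in zip(headings, headings[1:]))
    return "curved_arrow" if turn > 0.3 else "straight_arrow"
```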
INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM
A configuration is achieved in which display data selected on the basis of characteristic information acquired while a vehicle is traveling is displayed on a display unit inside the vehicle. The configuration has a data processing unit that executes display control of data output to the display unit provided inside a mobile device. The data processing unit acquires characteristic information while the mobile device is traveling, and decides, on the basis of the acquired characteristic information, display data to be output to the display unit. The data processing unit selects display data recorded corresponding to the acquired characteristic information from a characteristic-information corresponding display data storage database, generates an AR image obtained by superimposing the selected display data on a real object image that is an image captured by a camera mounted on the mobile device, and outputs the AR image to the display unit.
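The lookup-and-superimpose flow described above can be sketched as follows. The store contents, key names, and the dict-based "AR frame" are illustrative assumptions; real rendering and the actual characteristic categories are not specified in the abstract.

```python
# Stand-in for the characteristic-information corresponding display data
# storage database; keys and values here are illustrative assumptions.
DISPLAY_DATA_DB = {
    "highway": "service_area_banner",
    "scenic_route": "landmark_caption",
}

def decide_display_data(characteristic):
    """Select display data recorded for the characteristic information
    acquired while the mobile device is traveling."""
    return DISPLAY_DATA_DB.get(characteristic)

def compose_ar_frame(camera_image, display_data):
    """Superimpose the selected display data on the real-object image
    captured by the vehicle camera (modelled as a dict for the sketch)."""
    return {"base": camera_image, "overlay": display_data}
```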