Patent classifications
G05G9/04737
Localization of a robot in an environment using detected edges of a camera image from a camera of the robot and detected edges derived from a three-dimensional model of the environment
Methods, apparatus, systems, and computer-readable media are provided for using a camera of a robot to capture an image of the robot's environment, detecting edges in the image, and localizing the robot based on comparing the detected edges in the image to edges derived from a three-dimensional (3D) model of the robot's environment from the point of view of an estimated pose of the robot in the environment. In some implementations, the edges are derived based on rendering, from the 3D model of the environment, a model image of the environment from the point of view of the estimated pose and applying an edge detector to the rendered model image to detect model image edges from the model image.
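The localization loop the abstract describes (render a model image from the estimated pose, detect edges in both the camera image and the model image, compare them) can be sketched in miniature. Everything below is illustrative: the gradient-threshold edge detector stands in for a real detector such as Canny, `render_from_pose` is a hypothetical rendering callback, and scoring a discrete set of candidate poses stands in for whatever continuous pose refinement an actual implementation would use.

```python
import numpy as np

def detect_edges(image, threshold=0.4):
    """Simple gradient-magnitude edge detector (stand-in for e.g. Canny)."""
    gy, gx = np.gradient(image.astype(float))
    return np.hypot(gx, gy) > threshold

def edge_overlap_score(camera_edges, model_edges):
    """Fraction of camera-image edge pixels that coincide with model edges."""
    if camera_edges.sum() == 0:
        return 0.0
    return float((camera_edges & model_edges).sum() / camera_edges.sum())

def localize(camera_image, render_from_pose, candidate_poses):
    """Pick the candidate pose whose rendered model image's edges best
    match the edges detected in the camera image."""
    cam_edges = detect_edges(camera_image)
    scores = {pose: edge_overlap_score(cam_edges,
                                       detect_edges(render_from_pose(pose)))
              for pose in candidate_poses}
    return max(scores, key=scores.get)
```

In practice the comparison would typically be a chamfer- or reprojection-error measure minimized over a continuous 6DOF pose, not a discrete argmax over candidates.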
Collaborative augmented reality eyewear with ego motion alignment
Eyewear providing an interactive augmented reality experience between two eyewear devices by using alignment between respective 6DOF trajectories, also referred to herein as ego motion alignment. An eyewear device of user A and an eyewear device of user B track the eyewear device of the other user, or an object of the other user, such as on the user's face, to provide the collaborative AR experience. This enables sharing common three-dimensional content between multiple eyewear users without using or aligning the eyewear devices to common image content such as a marker, which is a more lightweight solution with reduced computational burden on a processor. An inertial measurement unit may also be used to align the eyewear devices.
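One common way to realize the 6DOF-trajectory alignment the abstract describes is a rigid (Kabsch/Procrustes) fit between corresponding trajectory points observed by the two devices. The sketch below is written under that assumption and is not the patent's disclosed method; it recovers the rotation and translation that map device B's coordinate frame into device A's, so shared 3D content can be expressed in one common frame.

```python
import numpy as np

def align_trajectories(traj_a, traj_b):
    """Kabsch-style rigid alignment between two corresponding point
    trajectories: returns (R, t) such that a_i ~= R @ b_i + t."""
    A = np.asarray(traj_a, dtype=float)
    B = np.asarray(traj_b, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    # Cross-covariance of the centered trajectories.
    H = (B - cb).T @ (A - ca)
    U, _, Vt = np.linalg.svd(H)
    # Determinant correction guards against a reflection solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ca - R @ cb
    return R, t
```

With noiseless correspondences the fit is exact; real trajectories would carry tracking noise, so the result is a least-squares estimate.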
Control device for controlling real or virtual airborne objects
In order to further improve a control device for controlling unmanned and/or manned and/or virtual airborne objects in such a way that the control is easier to use and can be learned intuitively and more quickly even by untrained individuals, the control device has a first control element for controlling a movement about a vertical axis, a longitudinal axis and a transverse axis of the airborne object, a rotary movement and/or pivoting movement of the first control element about its vertical axis, its longitudinal axis and its transverse axis causing the airborne object to move about its vertical axis, longitudinal axis and transverse axis, and the control device also has a second control element for changing the flying altitude and/or a speed and/or a thrust of the airborne object.
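The axis-to-axis mapping the abstract describes (first element's rotations about its vertical, longitudinal, and transverse axes drive the craft's corresponding axes; second element drives altitude/speed/thrust) can be sketched as a direct 1:1 mapping. The names, the single gain, and the thrust clamping below are illustrative assumptions; the patent does not specify the mapping function.

```python
from dataclasses import dataclass

@dataclass
class CraftState:
    yaw: float = 0.0    # rotation about the vertical axis
    roll: float = 0.0   # rotation about the longitudinal axis
    pitch: float = 0.0  # rotation about the transverse axis
    thrust: float = 0.0 # normalized 0..1

def apply_controls(state, stick_yaw, stick_roll, stick_pitch,
                   lever_delta, gain=1.0):
    """Hypothetical direct mapping: each rotation of the first control
    element moves the craft about the same axis; the second element
    changes thrust, clamped to [0, 1]."""
    state.yaw += gain * stick_yaw
    state.roll += gain * stick_roll
    state.pitch += gain * stick_pitch
    state.thrust = max(0.0, min(1.0, state.thrust + lever_delta))
    return state
```

A real controller would add rate limits, expo curves, and dead zones; this shows only the intuitive same-axis correspondence the abstract emphasizes.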
HANDLEBAR TYPE INPUT DEVICE
The subject matter of this specification can be embodied in, among other things, a handlebar-shaped housing having an elongate central body having a first hand grip at a first axial end, a first flexible paddle affixed to the first hand grip, a first deflection sensor configured to identify a first amount of deflection of the first flexible paddle, a second hand grip at a second axial end, a second flexible paddle affixed to the second hand grip, a second deflection sensor configured to identify a second amount of deflection of the second flexible paddle, and circuitry configured to identify one or more of a pitch, a roll, and a yaw of the elongate central body, and a controller configured to receive orientation sensor signals, receive deflection signals from the deflection sensors, and provide a control signal based on one or more of the sensor signals.
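The controller's combination of orientation and deflection signals can be illustrated with a hypothetical mixing scheme: the handlebar's pitch/roll/yaw steer directly, while the two paddle deflections set a throttle (their sum) and a yaw trim (their difference). This particular mixing is an assumption for illustration; the abstract only says the control signal is based on one or more of the sensor signals.

```python
def handlebar_control(pitch, roll, yaw, left_deflection, right_deflection):
    """Hypothetical signal mixing for the handlebar input device:
    orientation steers, paddle sum throttles, paddle difference trims yaw."""
    throttle = 0.5 * (left_deflection + right_deflection)
    yaw_trim = 0.5 * (right_deflection - left_deflection)
    return {
        "pitch": pitch,
        "roll": roll,
        "yaw": yaw + yaw_trim,
        "throttle": throttle,
    }
```

Squeezing both paddles equally gives pure throttle with no yaw trim; squeezing one harder than the other skews the yaw output, much like differential brake levers.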