B25J13/081

Methods and apparatus to calibrate a positional orientation between a robot gripper and a component

Methods of calibrating a position of a component include providing a robot with a gripper and crush and crash sensors, a calibration tool coupled to the gripper, and the component, which has a recess and a crush zone. The methods also include moving the gripper in a first direction to sense contact between the calibration tool and the crush zone, recording the contact position, and moving the gripper to insert the tool into the recess. The gripper is then moved in second directions to sense contact between the tool and the recess, and moved in third directions to sense additional contacts between the tool and the recess. The methods further include recording and processing the contact positions to determine a surface location in the first direction and a physical center of the recess. Robot calibration apparatus for performing the method is also disclosed, as are other aspects.
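A minimal sketch of the center computation the abstract describes: the surface location is simply the recorded first-direction contact, and each axis of the recess center is the midpoint of the pair of opposing wall contacts. The function name and the contact-tuple layout are hypothetical, not taken from the patent.

```python
def recess_center(x_contacts, y_contacts):
    """Physical center of a recess from touch-probe contact positions.

    x_contacts: (x_minus, x_plus) gripper positions recorded when the
    calibration tool touched opposite recess walls while moving along
    the second (X) directions; y_contacts: the same pair for the third
    (Y) directions.  Each axis center is the midpoint of its pair.
    """
    cx = (x_contacts[0] + x_contacts[1]) / 2.0
    cy = (y_contacts[0] + y_contacts[1]) / 2.0
    return cx, cy
```

In practice the contact positions would come from the robot controller each time a crash sensor trips; the midpoint rule assumes a symmetric tool probing a symmetric recess.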

Touch input processing method and electronic device supporting the same

An electronic device includes: a housing; a sensor module disposed on an inner face of the housing and including a plurality of sensing units; and a processor positioned within the housing and electrically connected to the sensor module. Each of the plurality of sensing units is electrically connected to an adjacent sensing unit among the plurality of sensing units, and includes a central portion and a plurality of peripheral portions connected to a partial area of the central portion and arranged around the central portion; each of the central portion and the plurality of peripheral portions includes a touch sensor. In addition, various other embodiments understood from this document are possible.

ROBOT, ROBOT CONTROL METHOD, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM

A robot includes a storage unit and a control unit. The control unit acquires feature amounts of an external stimulus acting on the robot from outside, stores the acquired feature amounts in the storage unit as a history, compares feature amounts acquired at a given timing with the feature amounts stored in the storage unit to calculate a first similarity degree, and controls operations based on the calculated first similarity degree.
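The compare-against-history step can be sketched as follows. The patent does not name a similarity measure, so cosine similarity is an assumed stand-in, and taking the maximum over the stored history as the "first similarity degree" is likewise an illustrative choice.

```python
import math

def similarity(a, b):
    """Cosine similarity between two feature-amount vectors (assumed metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def first_similarity_degree(current, history):
    """Highest similarity between the current stimulus and any stored one."""
    return max((similarity(current, past) for past in history), default=0.0)
```

The control unit would then branch on the returned degree, for example reacting differently to a familiar stimulus (high degree) than to a novel one (low degree).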

Devices, systems, and methods for robotic pipe handling

The present disclosure relates to systems and methods for automated drill pipe handling operations, such as trip in, trip out, and stand building operations. A pipe handling system of the present disclosure may include a lifting system for handling a load of a pipe stand, a pipe handling robot configured for engaging with the pipe stand and manipulating a position of the pipe stand, and a feedback device configured to provide information about a condition of the pipe stand, the lifting system, or the pipe handling robot. In some embodiments, the pipe handling robot may be a first robot configured for engaging with and manipulating a first end of the pipe stand, and the system may include a second pipe handling robot configured for engaging with and manipulating a second end of the pipe stand.

Pixel circuit including conversion element, capacitive element, and transistors

Provided is a pixel circuit. The pixel circuit includes a conversion element forming a voltage of an input level at a first node, a first transistor adjusting the voltage of the first node to a first level in response to a first signal received in a first time interval, a first capacitive element forming a voltage at a second node based on the voltage of the first node, a second transistor adjusting a level of the voltage of the second node to a second level in response to the first signal, a third transistor forming a voltage at a third node, a fourth transistor outputting a current in response to a second signal received in a second time interval, and a fifth transistor adjusting the voltage of the third node to a third level in response to a third signal received in a third time interval.

Object recognition apparatus

The present disclosure provides an object recognition apparatus, which includes: an actuator unit configured to contact an object, generate vibrations, and transmit them through the object based on the object's inherent characteristics; and a sensor unit connected to the actuator unit to receive the vibrations and generate a voltage signal.

Sheet conveying device and sheet conveying method
11478938 · 2022-10-25

A sheet conveying device and a sheet conveying method that make it possible to convey a sheet more accurately. The sheet conveying device includes a holder capable of holding a sheet, bringing the held sheet into a stretched state, and conveying it. The device also includes a sheet state recognizer that recognizes the state of the sheet while the sheet is held in the stretched state by the holder.

EXTENDED REALITY SYSTEMS FOR VISUALIZING AND CONTROLLING OPERATING ROOM EQUIPMENT
20230083605 · 2023-03-16

A camera tracking system receives patient reference tracking information indicating pose of a patient reference array tracked by a patient tracking camera relative to a patient reference frame. A local XR headset view pose transform is determined between a local XR headset reference frame and the patient reference frame. Remote reference tracking information is received indicating pose of a remote reference array tracked by a remote reference tracking camera. A remote XR headset view pose transform is determined between a remote XR headset reference frame of a remote XR headset and the remote reference array. A 3D computer image is transformed from a local pose determined using the local XR headset view pose transform to a remote pose determined using the remote XR headset view pose transform. The transformed 3D computer image is provided to the remote XR headset for display with the remote pose relative to the remote XR headset reference frame.
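The transform chain described above can be illustrated with 4x4 homogeneous matrices: the image pose is mapped from the local headset frame back into the shared patient reference frame with the inverse of the local view transform, then forward into the remote headset frame with the remote view transform. The function names and the translation-only test transforms are hypothetical simplifications.

```python
def matmul4(A, B):
    """Multiply two 4x4 homogeneous transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    """4x4 homogeneous transform for a pure translation."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def invert_rigid(T):
    """Invert a rigid (rotation + translation) transform: R^T, -R^T t."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [T[i][3] for i in range(3)]
    ti = [-sum(Rt[i][k] * t[k] for k in range(3)) for i in range(3)]
    return [Rt[0] + [ti[0]], Rt[1] + [ti[1]], Rt[2] + [ti[2]],
            [0.0, 0.0, 0.0, 1.0]]

def to_remote_pose(T_local_view, T_remote_view, image_pose_local):
    """Re-express a 3D image pose from the local to the remote headset frame."""
    pose_in_reference = matmul4(invert_rigid(T_local_view), image_pose_local)
    return matmul4(T_remote_view, pose_in_reference)
```

The key design point is that both headsets share the patient reference array as a common frame, so the composition above stays valid even as each headset moves independently.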

Robotic picking assemblies configured to grasp multiple items

Systems, methods, and computer-readable media are disclosed for robotic picking assemblies configured to grasp multiple items. In one embodiment, an example system may include a picking assembly coupled to a vacuum system, the picking assembly having a first suction cup assembly with a first suction cup and a first sensor, and a second suction cup assembly with a second suction cup and a second sensor. The example system may include a controller configured to cause the picking assembly to grasp a plurality of items, where the plurality of items includes a first item and a second item. The controller may be further configured to cause the picking assembly to move from a first position to a second position, and cause the picking assembly to release the first item at a first time and the second item at a second time.
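A minimal sketch of the grasp-then-staged-release behavior the controller performs, with one held item per suction cup assembly. The class and method names are illustrative, not from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class PickingController:
    """Tracks which item each suction cup assembly currently holds."""
    held: dict = field(default_factory=dict)  # cup id -> item

    def grasp(self, cup, item):
        """Record that the given cup has grasped an item."""
        self.held[cup] = item

    def release(self, cup):
        """Release (and return) the item held by the given cup, if any."""
        return self.held.pop(cup, None)
```

Releasing the cups one at a time, after moving between positions, reproduces the first-item-at-a-first-time, second-item-at-a-second-time sequence in the abstract.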

Waveguides for use in sensors or displays

Waveguides, such as light guides, made entirely of elastomeric material or with indents on an outer surface are disclosed. These improved waveguides can be used in sensors, soft robotics, or displays. For example, the waveguides can be used in a strain sensor, a curvature sensor, or a force sensor. In one instance, the waveguide can be used in a hand prosthetic. Sensors that use the disclosed waveguides and methods of manufacturing such waveguides are also disclosed.