Patent classifications
G05B2219/45007
COMPUTER-AUTOMATED SCRIPTED ELECTRONIC ACTOR CONTROL
Computer-implemented systems and methods use a story to automatically control and coordinate electronic actors. The story may be written in a natural language (e.g., English), rather than a programming language or other language designed specifically to control machines. Embodiments of the present invention may parse the story into a script and generate cue signals based on the story. The cue signals are provided to the electronic actor(s), which may perform actions based on their interpretations of those cue signals. As a result, the story, the script, and the cue signals issued to the electronic actors do not deterministically dictate the actions performed by the electronic actors.
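The story-to-cue pipeline described above can be sketched as follows. This is a minimal illustration, not the patented method: the toy "parser" splits on sentences, and each actor maps keywords to a repertoire of candidate actions, choosing among them so that the cue signal does not deterministically dictate the performed action. All names are hypothetical.

```python
import random

def parse_story_to_cues(story):
    """Split a natural-language story into one cue per sentence.
    (A toy stand-in for parsing the story into a script.)"""
    return [s.strip() for s in story.split(".") if s.strip()]

class Actor:
    """An electronic actor that interprets cues rather than executing them literally."""
    def __init__(self, name, repertoire):
        self.name = name
        self.repertoire = repertoire  # keyword -> list of candidate actions

    def interpret(self, cue):
        for keyword, actions in self.repertoire.items():
            if keyword in cue.lower():
                # The cue does not dictate one action: the actor
                # picks among several matching behaviors.
                return random.choice(actions)
        return "idle"

story = "The robot waves to the crowd. The robot bows."
cues = parse_story_to_cues(story)
actor = Actor("robot", {"waves": ["wave_left", "wave_right"], "bows": ["deep_bow"]})
performed = [actor.interpret(cue) for cue in cues]
```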
ROBOT CONTROL METHOD, A ROBOT CONTROL SYSTEM AND A MODULAR ROBOT
The present disclosure relates to the field of robots, and particularly to a robot control method, a robot control system, and a modular robot. The robot control method includes the steps of: T1: providing a robot with at least one wheel and at least one motion posture; T2: adjusting the robot to a motion posture, saving motion-posture information corresponding to that posture, and generating preset action control information based on the wheel speed and the motion-posture information; T3: constructing an operating model based on the preset action control information; and T4: outputting, from the operating model, actual motion control information for a motion according to the user's input, to control the robot to perform the motion. This makes it convenient to set motion modes that meet users' diverse needs, and broadens the design space of the robot to suit more scenarios.
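Steps T2–T4 can be sketched as a small mapping from saved presets to motion commands. This is an illustrative reading of the abstract, with hypothetical names; the patent does not specify the model's form.

```python
class ModularRobot:
    """Sketch of steps T2-T4: record a posture and wheel speed as a preset
    (T2), keep the presets as a simple operating model (T3), and map user
    input to actual motion control information (T4)."""
    def __init__(self):
        self.operating_model = {}  # motion name -> preset control information

    def save_preset(self, motion_name, posture, wheel_speed):
        # T2: save motion-posture info and generate preset control info.
        self.operating_model[motion_name] = {
            "posture": posture, "wheel_speed": wheel_speed,
        }

    def control(self, user_input):
        # T4: output actual motion control information for the requested motion.
        preset = self.operating_model.get(user_input)
        if preset is None:
            return {"posture": {}, "wheel_speed": 0.0}
        return dict(preset)

robot = ModularRobot()
robot.save_preset("crawl", posture={"joint1": 30, "joint2": -15}, wheel_speed=0.2)
cmd = robot.control("crawl")
```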
Toy construction system with robotics control unit
A toy construction robotics system including a robotics control unit, the robotics control unit comprising: a housing comprising coupling elements configured for releasably interconnecting the robotics control unit with cooperating toy construction elements; a processor comprising programmed instructions; a plurality of I/O-ports connected to communicate with the processor; a plurality of separate light emitters arranged in a two-dimensional array on a front side of the housing, each of the light emitters being operable in response to instructions from the processor so as to produce at least two different indicator states; wherein the light emitters are aligned with respect to the I/O-ports such that each of the I/O-ports has an associated light emitter next to it. The light emitters may be operable, in response to instructions from the processor, to produce a machine readable code encoding data in respect of the robotics control unit. The machine readable code may comprise instructions for interaction between the robotics control unit and an external device.
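The two emitter roles described above — a per-port indicator state and a two-dimensional machine-readable pattern — can be sketched as follows. The bit layout and state names are illustrative assumptions, not taken from the patent.

```python
def port_status_pattern(port_states):
    """Map each I/O port's state to its associated light emitter
    (one emitter per port, with at least two indicator states)."""
    return ["on" if connected else "off" for connected in port_states]

def encode_machine_readable(data_bits, cols):
    """Arrange data bits into a 2D emitter grid, e.g. a pattern an
    external device's camera could read (layout is illustrative)."""
    return [data_bits[i:i + cols] for i in range(0, len(data_bits), cols)]

pattern = port_status_pattern([True, False, True, True])
grid = encode_machine_readable([1, 0, 1, 1, 0, 0], cols=3)
```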
DEVICE CONTROL APPARATUS, DEVICE CONTROL METHOD, AND RECORDING MEDIUM
A device control apparatus includes at least one processor executing a program stored in at least one memory. The at least one processor determines whether a predetermined condition related to a simulated growth degree of a device is satisfied, and acquires external stimulus data representing an external stimulus externally applied to the device. In a case where the predetermined condition is not satisfied, the processor changes and sets basic character data according to a first condition based on the acquired external stimulus data, and controls a movement of the device according to the basic character data. In a case where the predetermined condition is satisfied, the processor corrects the basic character data according to a second condition, different from the first condition and based on the acquired external stimulus data, to obtain character data, and controls a movement of the device according to the character data obtained by the correction.
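The branch described above can be sketched as a single update function. The growth threshold and the two update rules are illustrative stand-ins for the patent's first and second conditions.

```python
def update_and_control(growth_degree, threshold, basic_character, stimulus):
    """Below the growth threshold, change the basic character data directly
    (first condition) and move according to it; at or above the threshold,
    leave the basic data intact and derive corrected character data
    (second condition) to drive the movement. Rules are illustrative."""
    if growth_degree < threshold:
        basic_character = basic_character + stimulus      # first condition
        return basic_character, basic_character           # movement uses basic data
    corrected = basic_character + 0.5 * stimulus          # second condition
    return basic_character, corrected                     # movement uses corrected data
```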
More endearing robot, method of controlling the same, and non-transitory recording medium
A more endearing robot includes an operation unit that causes the robot to operate, a viewing direction determiner that determines whether a viewing direction of a predetermined target is toward the robot or not, and an operation controller that controls the operation unit based on a result of determination by the viewing direction determiner.
Communication device
A communication device includes a structure, a controller, and sensors that detect the relative position of an object around the structure. The structure has a face unit that is one of the units of the structure. The controller controls the movement of each unit of the structure, thereby causing the structure to perform a gesture expressing a communication behavior. The controller controls the movement of the units so that the structure performs different gestures depending on the angle between the direction of the face unit and the direction of the user's face as viewed from the face unit.
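The angle-dependent gesture selection can be sketched as below. The angle thresholds and gesture names are illustrative assumptions; the patent only states that different gestures correspond to different angles.

```python
import math

def choose_gesture(face_dir, user_dir):
    """Pick a gesture from the angle between the face unit's direction and
    the direction of the user's face as viewed from the face unit, both
    given as 2D vectors. Thresholds are illustrative."""
    dot = face_dir[0] * user_dir[0] + face_dir[1] * user_dir[1]
    norm = math.hypot(*face_dir) * math.hypot(*user_dir)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    if angle < 30:
        return "nod"         # user is nearly face-on
    if angle < 90:
        return "turn_head"   # user is off to the side
    return "wave"            # user is far off-axis
```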
ROBOT CAPABLE OF AUTONOMOUS DRIVING THROUGH IMITATION LEARNING OF OBJECT TO BE IMITATED AND AUTONOMOUS DRIVING METHOD FOR THE SAME
An artificial intelligence (AI) robot capable of imitation learning may collect olfactory information from an imitation target, together with motion information about the motion the imitation target executes in response to that olfactory information, and may perform machine learning on the collected data. When the AI robot later detects learned olfactory information, it may be caused to execute the motion that the imitation target executed. Accordingly, an imitation robot may imitate the imitation target based on olfactory information, in addition to sound and image information.
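The odor-to-motion association can be sketched as follows, with a nearest-neighbor lookup standing in for the machine-learned model; odor vectors and motion names are hypothetical.

```python
class OlfactoryImitator:
    """Sketch of olfactory imitation learning: pair each observed odor
    signature with the motion the imitation target executed, then replay
    the motion of the closest learned odor at run time."""
    def __init__(self):
        self.memory = []  # (odor_vector, motion) pairs

    def learn(self, odor, motion):
        self.memory.append((odor, motion))

    def react(self, odor):
        if not self.memory:
            return None
        def dist(pair):  # squared distance between odor vectors
            return sum((a - b) ** 2 for a, b in zip(pair[0], odor))
        return min(self.memory, key=dist)[1]

bot = OlfactoryImitator()
bot.learn((0.9, 0.1), "approach_food")
bot.learn((0.1, 0.9), "retreat")
action = bot.react((0.8, 0.2))
```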
Apparatus, robot, method, and recording medium
When an apparatus that is moved by a user is moved in the vertical direction, the apparatus compares the distance to the object located directly underneath it (at the same two-dimensional position) with a height which corresponds to that two-dimensional position and beyond which a fall could possibly damage the apparatus. When the distance is greater than that height, a display is caused to perform a first display and a speaker is caused to output a first sound. When the distance is equal to or shorter than the height, the display is caused to perform a second display and the speaker is caused to output a second sound.
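The comparison reduces to a single branch; the display and sound identifiers below are illustrative placeholders.

```python
def fall_warning(distance_below, damage_height):
    """Choose the display/sound pair: the first pair when the drop to the
    object directly underneath exceeds the height that could damage the
    apparatus, the second pair otherwise."""
    if distance_below > damage_height:
        return ("first_display", "first_sound")    # drop could cause damage
    return ("second_display", "second_sound")      # height is safe
```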
Device, method, and computer-readable recording medium for editing and playing robot motion
A device for editing and playing a robot motion comprises: a storage configured to store a 3D image file of a robot and further store, in units of key frames, time-dependent stationary postures of the robot edited by user input; and a robot motion viewer configured to display, as a video, motion units of the robot obtained by connecting the stationary postures stored in units of key frames using an interpolation technique.
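Connecting key-frame postures into a motion can be sketched with linear interpolation, the simplest of the interpolation techniques the abstract leaves open; postures are modeled as joint-angle dictionaries.

```python
def interpolate_postures(key_frames, steps_between):
    """Connect time-ordered stationary postures (joint-angle dicts) into a
    motion by linear interpolation between consecutive key frames."""
    frames = []
    for a, b in zip(key_frames, key_frames[1:]):
        for step in range(steps_between):
            t = step / steps_between
            frames.append({j: a[j] + t * (b[j] - a[j]) for j in a})
    frames.append(dict(key_frames[-1]))  # end exactly on the last key frame
    return frames

motion = interpolate_postures(
    [{"elbow": 0.0, "knee": 10.0}, {"elbow": 90.0, "knee": 30.0}],
    steps_between=3,
)
```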