Patent classifications
B25J11/001
SYSTEM AND METHOD FOR DYNAMIC PROGRAM CONFIGURATION
The present teaching relates to methods, systems, media, and implementations for configuring an animatronic device. Information is obtained about a user for whom an animatronic device is to be configured to carry out a dialogue. That information is used to select, from a plurality of selectable programs, a program related to a topic to be covered in the dialogue, where the program is to be used by the animatronic device to drive the dialogue. The animatronic device is then configured, based on the program, to carry out the dialogue with the user.
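As an illustration of the selection step described in the abstract, the following sketch matches a user's profile against candidate programs. The profile fields, program structure, and keyword-overlap scoring rule are all assumptions for illustration; the patent does not specify them.

```python
# Hypothetical sketch only: profile fields, program tags, and the
# overlap-based scoring rule are invented for illustration.
def select_program(user_profile, programs):
    """Pick the program whose topic tags best match the user's interests."""
    def score(program):
        # Count keywords shared between user interests and program topics.
        return len(set(user_profile["interests"]) & set(program["topics"]))
    return max(programs, key=score)

programs = [
    {"name": "space-talk", "topics": ["space", "science"]},
    {"name": "animal-talk", "topics": ["animals", "nature"]},
]
user = {"interests": ["science", "space"]}
```

Here `select_program(user, programs)` would return the `"space-talk"` entry, since it shares two topic keywords with the user's interests.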
Reading and contingent response educational and entertainment method and apparatus
The present invention describes devices and methods for assisting in the education of individuals, particularly children. The present invention provides technological interventions with information about the child's eye-gaze location, gesture activity, emotional expression, or other inputs. The disclosure further seeks to enhance learning by mimicking, for example, a parent's physical gestures and directed eye gaze or joint attention, to enrich the child's learning experience. The present invention responds to input from the user, from other individuals, or from information stored in memory. It then processes the inputs, via a processor and associated memory, according to one or more computer program modules. Based on the input and computer program module(s), the present invention then produces an output, of various forms, to the user.
SOCIALLY ASSISTIVE ROBOT
A companion robot is disclosed. In some embodiments, the companion robot may include a head having a facemask and a projector configured to project facial images onto the facemask; a facial camera; a microphone configured to receive audio signals from the environment; a speaker configured to output audio signals; and a processor electrically coupled with the projector, the facial camera, the microphone, and the speaker. In some embodiments, the processor may be configured to receive facial images from the facial camera; receive speech input from the microphone; determine an audio output based on the facial images and/or the speech input; determine a facial projection output based on the facial images and/or the speech input; output the audio output via the speaker; and project the facial projection output on the facemask via the projector.
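The processing loop the abstract describes can be sketched as follows. The hardware interfaces and the policy function are stand-ins invented for illustration; the abstract only specifies that both outputs are determined from the camera images and/or the speech input.

```python
# Hedged sketch of the companion robot's interaction step. Camera,
# microphone, speaker, projector, and policy are assumed callables.
class CompanionRobot:
    def __init__(self, camera, microphone, speaker, projector, policy):
        self.camera = camera
        self.microphone = microphone
        self.speaker = speaker
        self.projector = projector
        self.policy = policy

    def step(self):
        face = self.camera()                      # receive facial images
        speech = self.microphone()                # receive speech input
        audio_out, face_out = self.policy(face, speech)
        self.speaker(audio_out)                   # output audio via the speaker
        self.projector(face_out)                  # project onto the facemask
```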
Speech and behavior control device, robot, storage medium storing control program, and control method for speech and behavior control device
The present invention allows a robot to carry out emotionally expressive communication. A speech and behavior control device (1) includes an utterance content selecting section (16) which selects utterance content of a robot (100) from among a plurality of utterances, a movement control section (17) which controls a movable part (13) to move based on a kind of feeling corresponding to the utterance content, and an audio control section (18) which controls the robot (100) to output the utterance content as audio after movement of the movable part (13) has been started.
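The key ordering constraint above — movement tied to the utterance's feeling starts before audio output — can be sketched like this. The class name and the feeling table are assumptions, not taken from the patent.

```python
# Hypothetical sketch: movement begins before audio output, as the
# abstract describes. The utterance-to-feeling mapping is invented.
FEELING_BY_UTTERANCE = {"greeting": "joy", "apology": "sadness"}

class SpeechBehaviorControlDevice:
    def __init__(self, move_part, play_audio):
        self.move_part = move_part      # movement control section target
        self.play_audio = play_audio    # audio control section target

    def utter(self, kind, text):
        feeling = FEELING_BY_UTTERANCE.get(kind, "neutral")
        self.move_part(feeling)         # movable part starts moving first
        self.play_audio(text)           # audio follows movement onset
```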
SYSTEMS AND METHODS TO MANAGE CONVERSATION INTERACTIONS BETWEEN A USER AND A ROBOT COMPUTING DEVICE OR CONVERSATION AGENT
Exemplary implementations may: receive one or more inputs including parameters or measurements regarding a physical environment from the one or more input modalities; identify a user based on analyzing the received inputs from the one or more input modalities; determine if the user shows signs of engagement or interest in establishing a communication interaction by analyzing a user's physical actions, visual actions, and/or audio actions, the user's physical actions, visual actions and/or audio actions determined based at least in part on the one or more inputs received from the one or more input modalities; and determine whether the user is interested in an extended communication interaction with the robot computing device by creating visual actions of the robot computing device utilizing the display device or by generating one or more audio files to be reproduced by one or more speakers.
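A minimal sketch of the engagement determination, assuming the three cue channels are reduced to booleans. How the physical, visual, and audio cues are actually combined is not specified by the abstract; the two-of-three rule here is an assumption.

```python
# Illustrative engagement check; the two-of-three combination rule is
# an assumption, not taken from the patent.
def shows_engagement(physical_cue, visual_cue, audio_cue):
    """Treat the user as engaged when at least two modalities indicate interest."""
    return sum([physical_cue, visual_cue, audio_cue]) >= 2
```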
Robot and method of controlling the same
A robot includes a display, a sensing unit including at least one sensor for detecting a physical stimulus, and a processor configured to detect the physical stimulus based on a sensing value acquired from the at least one sensor while an operation is performed, identify the physical stimulus based on the acquired sensing value, perform control to stop or terminate the operation based on the identified physical stimulus, and control the display to display a graphical user interface (GUI) corresponding to the identified physical stimulus.
CONTROL DEVICE, CONTROL METHOD, AND CONTROL SYSTEM
A control device that controls a robot capable of self-propelling in a facility is provided. The control device comprises: a visitor identifying unit configured to identify a location of a visitor in the facility; a robot identifying unit configured to identify a position of the robot; an instruction unit configured to instruct the robot to capture an image of the visitor in a case where the robot is located in a predetermined range near the visitor; an estimation unit configured to estimate an emotion of the visitor, based on the image that has been captured by the robot; and a control unit configured to control whether the robot stays within the predetermined range or moves out of the predetermined range in accordance with the emotion of the visitor.
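The stay-or-leave decision the control unit makes can be sketched as below. The emotion labels and the policy (leave on negative emotion) are assumptions; the abstract only states that the robot stays within or moves out of the predetermined range in accordance with the visitor's estimated emotion.

```python
# Assumed emotion labels and policy, invented for illustration.
NEGATIVE_EMOTIONS = {"anger", "fear", "disgust"}

def control_robot(estimated_emotion):
    """Decide whether the robot stays near the visitor or moves away."""
    if estimated_emotion in NEGATIVE_EMOTIONS:
        return "move_out_of_range"
    return "stay_within_range"
```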
DEVICE CONTROL APPARATUS, DEVICE CONTROL METHOD, AND RECORDING MEDIUM
A device control apparatus includes at least one processor that executes a program stored in at least one memory. The at least one processor sets growth degree data representing a simulated growth degree of a device, acquires character data representing a simulated character of the device, acquires other data related to the device, the other data being different from the character data, selectively sets a first movement mode based on the character data and a second movement mode based on the other data as a movement mode of the device according to the set growth degree data, and controls a movement of the device according to the set movement mode.
DEVICE CONTROL APPARATUS, DEVICE CONTROL METHOD, AND RECORDING MEDIUM
A device control apparatus includes at least one processor executing a program stored in at least one memory. The at least one processor determines whether a predetermined condition related to a simulated growth degree of a device is satisfied, acquires external stimulus data representing an external stimulus externally applied to the device, changes and sets basic character data according to a first condition based on the acquired external stimulus data in a case where the predetermined condition is not satisfied, corrects the basic character data according to a second condition, different from the first condition and based on the acquired external stimulus data, to obtain character data in a case where the predetermined condition is satisfied, controls a movement of the device according to the basic character data in the case where the predetermined condition is not satisfied, and controls a movement of the device according to the character data obtained by the correction in the case where the predetermined condition is satisfied.
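The two branches described above — directly changing the basic character data when the growth condition is not satisfied, versus deriving corrected character data while leaving the basic data intact when it is — can be sketched like this. The growth threshold, the character field, and both update rules are assumptions.

```python
# Illustrative sketch; threshold, field name, and update rules are invented.
GROWTH_THRESHOLD = 10

def apply_stimulus(growth_degree, basic_character, stimulus):
    if growth_degree < GROWTH_THRESHOLD:
        # Condition not satisfied: change the basic character data itself
        # according to the first condition.
        basic_character["boldness"] += stimulus
        return basic_character
    # Condition satisfied: leave the basic character data intact and derive
    # corrected character data according to the second condition.
    corrected = dict(basic_character)
    corrected["boldness"] += stimulus * 0.5
    return corrected
```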
Autonomously acting robot that understands physical contact
A planar touch sensor (an electrostatic capacitance sensor) for detecting a contact by a user on a body surface is installed on a robot. Pleasantness and unpleasantness are determined in accordance with a place of contact and a strength of contact on the touch sensor. Behavioral characteristics of the robot change in accordance with a determination result. Familiarity with respect to the user who touches changes in accordance with the pleasantness or unpleasantness. The robot has a curved form and has a soft body.
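The pleasant/unpleasant determination and the familiarity update can be sketched as follows. The body regions, the strength threshold, and the unit familiarity change are invented for illustration; the abstract only states that the determination depends on the place and strength of contact.

```python
# Sketch of the contact judgment; regions and thresholds are assumptions.
PLEASANT_SPOTS = {"head", "back"}

def judge_contact(spot, strength):
    if spot in PLEASANT_SPOTS and strength < 0.5:
        return "pleasant"      # gentle touch on a favored spot
    return "unpleasant"        # rough or unwanted contact

def update_familiarity(familiarity, spot, strength):
    """Raise familiarity for pleasant contact, lower it for unpleasant contact."""
    delta = 1 if judge_contact(spot, strength) == "pleasant" else -1
    return familiarity + delta
```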