Patent classifications
G06V40/174
IMAGING APPARATUS, CONTROL METHOD, AND STORAGE MEDIUM
An imaging apparatus includes an imaging unit configured to image a subject, a transmission unit configured to transmit image data to an external apparatus, a subject search unit configured to automatically search for and detect a subject, a determination unit configured to determine whether to image a found subject, and a control unit configured to control the apparatus so that transmission processing for transmitting the image data and imaging processing by the imaging unit are not performed in parallel, wherein the subject search unit searches for a subject even during transmission of the image data, and wherein, in a case where the determination unit determines to image the found subject while the image data is being transmitted, the transmission unit suspends the transmission of the image data and the imaging unit images the found subject.
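The suspend-and-capture behaviour described above can be illustrated with a minimal state machine. This is a rough sketch, not the patented implementation; every class, method, and state name here is hypothetical:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    TRANSMITTING = auto()
    SUSPENDED = auto()

class ImagingController:
    """Toy controller: transmission and imaging never run in parallel.
    Subject search is assumed to keep running in the background and to
    call on_subject_found() whenever a candidate subject is detected."""

    def __init__(self):
        self.state = State.IDLE
        self.captured = []

    def start_transmission(self):
        self.state = State.TRANSMITTING

    def on_subject_found(self, should_image: bool) -> bool:
        """should_image stands in for the determination unit's decision."""
        if not should_image:
            return False
        if self.state is State.TRANSMITTING:
            # Suspend (do not abort) the transfer before capturing,
            # so the two operations are never concurrent.
            self.state = State.SUSPENDED
        self.captured.append("frame")
        return True

    def resume(self):
        """Resume the suspended transfer once the capture is done."""
        if self.state is State.SUSPENDED:
            self.state = State.TRANSMITTING
```

A transfer that is interrupted by a capture ends up in `SUSPENDED` rather than `IDLE`, which is what distinguishes "suspend" from "cancel" in the abstract.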
BREAKOUT OF PARTICIPANTS IN A CONFERENCE CALL
Systems and methods for creating and managing a breakout conference for a primary conference are disclosed. The system monitors communications between participants of a primary conference to determine whether (a) participants have a disagreement that needs to be resolved or (b) a topic from the meeting agenda requires additional time for discussion. Factors including participant language (such as negations and repetitive word usage), job profiles, body language, and overlapping voice signals are monitored to determine whether a disagreement exists. If a disagreement exists or additional time is required, the system automatically creates a virtual breakout session, determines the topic that created the disagreement, determines the participants associated with the disputed topic, and moves them to the breakout session. The system also provides meeting tools so that participants in the primary conference may communicate with and alert participants in the breakout session, and vice versa, without leaving their respective sessions.
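The disagreement-detection factors listed above could be combined into a simple score. A toy sketch follows; the word list, weights, and threshold are all invented for illustration and are not from the patent:

```python
# Hypothetical negation/dispute vocabulary (illustrative only).
NEGATIONS = {"no", "not", "never", "disagree", "wrong"}

def disagreement_score(utterances, overlap_events):
    """Blend two of the monitored signals: the fraction of negation
    words in participant language, and the rate of overlapping speech
    (participants talking over each other)."""
    words = [w.lower().strip(".,!?") for u in utterances for w in u.split()]
    neg = sum(w in NEGATIONS for w in words) / max(len(words), 1)
    overlap = overlap_events / max(len(utterances), 1)
    return 0.7 * neg + 0.3 * overlap

def needs_breakout(utterances, overlap_events, threshold=0.2):
    """Trigger the automatic breakout session above a score threshold."""
    return disagreement_score(utterances, overlap_events) >= threshold
```

A production system would of course also weigh the other monitored factors (job profiles, body language) rather than text statistics alone.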
SYSTEMS AND METHODS FOR INSERTING EMOTICONS WITHIN A MEDIA ASSET
Systems and methods are described herein for inserting emoticons within a media asset based on an audio portion of the media asset. Each audio portion of the media asset is associated with a respective part of speech, and an emotion corresponding to the audio portion is determined. A corresponding emoticon is identified based on the emotion determined for the audio portion and is caused to be presented at the corresponding location within the media asset.
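The emotion-to-emoticon lookup at the heart of this abstract can be sketched in a few lines. The mapping table and the segment format are assumptions for illustration, not the patent's actual pipeline:

```python
# Hypothetical emotion -> emoticon mapping.
EMOTICON_MAP = {"joy": "😀", "sadness": "😢", "anger": "😠"}

def insert_emoticons(segments):
    """segments: list of (text, emotion) pairs, one per audio portion
    (the emotion is assumed to come from an upstream classifier).
    Appends the matching emoticon at each portion's location."""
    out = []
    for text, emotion in segments:
        icon = EMOTICON_MAP.get(emotion)
        out.append(f"{text} {icon}" if icon else text)
    return " ".join(out)
```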
Industrial machine startup control system, startup control method, and program
An industrial machine startup control system for controlling startup of an industrial machine, the industrial machine startup control system including: a vital data measurement device that measures vital data of a worker; a health state analysis device that acquires the vital data of the worker measured by the vital data measurement device, and that determines a health state of the worker based on the acquired vital data; and a startup control device that controls whether or not to allow startup of the industrial machine based on the determination result from the health state analysis device.
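The gating logic described above reduces to a health check that conditions startup. A minimal sketch, with entirely hypothetical vital-sign thresholds (the patent does not specify any):

```python
def health_ok(vitals: dict) -> bool:
    """Toy health-state analysis: heart rate and body temperature
    within an assumed normal range. Thresholds are illustrative."""
    return 50 <= vitals["heart_rate_bpm"] <= 110 and vitals["temp_c"] < 37.5

def allow_startup(vitals: dict) -> bool:
    """Startup control device: permit machine startup only when the
    analysed health state of the worker is acceptable."""
    return health_ok(vitals)
```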
Electronic device and method of obtaining emotion information
Emotion information is obtained by an electronic device in order to improve communication between a person and the electronic device. Multimedia data is obtained regarding a person, predicted values for the person are obtained by applying the multimedia data to a plurality of neural network models, and emotion information of the person is obtained by applying the predicted values to a weight model. Then, feedback information is obtained from the person with respect to the obtained emotion information. Finally, the weight model is updated using the feedback information. Subsequently, when multimedia data is again obtained regarding the person, new predicted values are obtained by applying the later multimedia data to the plurality of neural network models, and emotion information of the person is again obtained, this time using the weight model updated with the feedback information.
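The weight model described above amounts to a weighted ensemble whose weights are adjusted by user feedback. Below is a toy sketch of that loop; the scoring format and update rule are assumptions, not the patent's actual method:

```python
def combine(predictions, weights):
    """Weighted vote over per-model emotion scores.
    predictions: one dict of emotion -> score per neural network model."""
    totals = {}
    for pred, w in zip(predictions, weights):
        for emotion, score in pred.items():
            totals[emotion] = totals.get(emotion, 0.0) + w * score
    return max(totals, key=totals.get)

def update_weights(weights, predictions, true_emotion, lr=0.1):
    """Feedback step: reward models whose top prediction matched the
    user's feedback, penalise the rest, then renormalise."""
    new = []
    for pred, w in zip(predictions, weights):
        correct = max(pred, key=pred.get) == true_emotion
        new.append(w + lr if correct else max(w - lr, 0.0))
    s = sum(new)
    return [w / s for w in new]
```

After one feedback round, a model that agreed with the user gains influence over subsequent predictions, which is the behaviour the abstract describes.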
Providing media based on image analysis
The present disclosure relates to a method of providing media to a user based on analysis of an image. The method comprises analysing the image to obtain image information about what is depicted therein. The method also comprises, based on said obtained image information, selecting a first plurality of media items comprising audio, from a media database, said media items of the first plurality being associated with that which is depicted in the image according to the image information. The method also comprises filtering the first plurality of media items based on metadata associated with the user to obtain a plurality of seed media items. The method also comprises providing at least one media item from the media database to the user based on the obtained seed media items.
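The two-stage selection above (image-based selection, then user-metadata filtering to obtain seed items) can be sketched as a small pipeline. Field names and the matching rules are illustrative assumptions:

```python
def select_by_image(media_db, image_tags):
    """First pass: media items whose tags overlap what the image
    analysis says is depicted (e.g. 'beach', 'sun')."""
    return [m for m in media_db if set(m["tags"]) & set(image_tags)]

def filter_by_user(items, user_genres):
    """Second pass: filter by user metadata to obtain seed media items,
    here modelled as a set of preferred genres."""
    return [m for m in items if m["genre"] in user_genres]
```

The seed items would then drive the final recommendation step (e.g. a similarity lookup in the media database), which this sketch omits.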
Generating digital avatar
In one embodiment, a method includes, by one or more computing systems: receiving one or more non-video inputs, where the one or more non-video inputs include at least one of a text input, an audio input, or an expression input; accessing a K-NN graph including several sets of nodes, where each set of nodes corresponds to a particular semantic context out of several semantic contexts; identifying, using the K-NN graph, one or more semantic contexts associated with the one or more non-video inputs; determining one or more actions to be performed by a digital avatar based on the one or more identified semantic contexts; generating, in real time in response to receiving the one or more non-video inputs and based on the determined one or more actions, a video output of the digital avatar including one or more human characteristics corresponding to the one or more identified semantic contexts; and sending, to a client device, instructions to present the video output of the digital avatar.
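The context-identification step over the K-NN graph can be illustrated as a nearest-neighbour search over per-context node sets. This is a toy sketch with made-up embeddings; the real system would use learned input embeddings and a proper K-NN index:

```python
def nearest_context(embedding, context_nodes):
    """context_nodes: dict mapping each semantic context to the list of
    node embeddings in its set. Returns the context whose nodes lie
    closest to the input embedding (1-NN over all node sets)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    best, best_d = None, float("inf")
    for ctx, nodes in context_nodes.items():
        d = min(dist2(embedding, n) for n in nodes)
        if d < best_d:
            best, best_d = ctx, d
    return best
```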
Information processing system for extracting images, image capturing apparatus, information processing apparatus, control methods therefor, and storage medium
An information processing system comprises an image capturing apparatus and an information processing apparatus. The image capturing apparatus includes an image capturing device, a first evaluation unit configured to perform a first evaluation on a captured image, and a first transmission unit configured to transmit the captured image to the information processing apparatus. The information processing apparatus includes a reception unit configured to receive the captured image, a second evaluation unit configured to perform a second evaluation on the captured image, and a second transmission unit configured to transmit an evaluation result to the image capturing apparatus. The image capturing apparatus further includes a sorting unit configured to receive the evaluation result of the second evaluation and sort the captured image using the evaluation results of the first evaluation and the second evaluation.
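The sorting unit's job of combining the on-device (first) and server-side (second) evaluations can be sketched as a weighted blend. The weights and score ranges are assumptions for illustration; the patent does not specify how the two results are combined:

```python
def sort_images(images, camera_scores, server_scores,
                w_camera=0.4, w_server=0.6):
    """Sort captured images best-first by a blend of the first
    (on-camera) and second (information-processing-apparatus)
    evaluation results, both assumed to be scores in [0, 1]."""
    blended = {
        img: w_camera * camera_scores[img] + w_server * server_scores[img]
        for img in images
    }
    return sorted(images, key=blended.get, reverse=True)
```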
METHOD AND SYSTEM FOR ESTIMATING GESTURE OF USER FROM TWO-DIMENSIONAL IMAGE, AND NON-TRANSITORY COMPUTER-READABLE RECORDING MEDIUM
There is provided a method of estimating a gesture of a user from a two-dimensional image. The method includes the steps of: acquiring a two-dimensional image of the user's body from a two-dimensional camera; specifying two-dimensional relative coordinate points corresponding to first and second body parts of the user in a relative coordinate system dynamically defined in the two-dimensional image, and comparing a first positional relationship between the two-dimensional relative coordinate points of the first and second body parts at a first time point with a second positional relationship between those coordinate points at a second time point; and estimating the gesture made by the user between the first and second time points based on the result of the comparison and context information acquired from the two-dimensional image.
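The relative-coordinate comparison above can be sketched with hand and head as the two body parts. The frame format, the choice of origin and scale, and the "raise hand" rule are all illustrative assumptions:

```python
def to_relative(point, origin, scale):
    """Map an absolute 2-D image point into a per-frame relative
    coordinate system defined by an origin (e.g. the neck) and a scale
    (e.g. shoulder width), making the comparison invariant to where the
    subject stands and how large they appear."""
    return ((point[0] - origin[0]) / scale, (point[1] - origin[1]) / scale)

def estimate_gesture(frame1, frame2):
    """frame: {'hand': (x, y), 'head': (x, y), 'origin': (x, y), 'scale': s}.
    Toy rule: report 'raise_hand' when the hand moves from below the
    head to above it between the two time points (image y grows
    downward)."""
    def rel(frame, part):
        return to_relative(frame[part], frame["origin"], frame["scale"])

    hand1, head1 = rel(frame1, "hand"), rel(frame1, "head")
    hand2, head2 = rel(frame2, "hand"), rel(frame2, "head")
    if hand1[1] > head1[1] and hand2[1] < head2[1]:
        return "raise_hand"
    return "unknown"
```

The context-information step from the abstract (e.g. disambiguating a wave from a stretch) is omitted here.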
INFORMATION PROCESSING DEVICE, AND EMOTION ESTIMATION METHOD
An information processing device includes an acquisition unit, an identification unit and an emotion estimation unit. The acquisition unit acquires input information as information regarding a user in a certain situation. The identification unit identifies an object to which the user is paying attention and the user's appearance when the user is paying attention to the object based on the input information. The emotion estimation unit estimates emotion of the user based on object information indicating the identified object, appearance information indicating the identified appearance, and a predetermined method of estimating the emotion.
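The "predetermined method of estimating the emotion" from object and appearance information can be modelled, at its simplest, as a rule table. This sketch and its example rules are purely illustrative:

```python
def estimate_emotion(object_name, appearance, rules):
    """rules: dict mapping (attended object, user appearance) pairs to
    an emotion — a stand-in for the predetermined estimation method.
    Falls back to 'neutral' when no rule matches."""
    return rules.get((object_name, appearance), "neutral")
```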