CUSTOMIZATION OF ROBOT

20190105783 · 2019-04-11

Abstract

The present invention relates to a robot (1) configured to exhibit a robot identity and/or a robot personality at least partly through execution of computer-readable program code. The robot comprises a robot head (3) configured for connection to a face part (9; 9A, 9B) including a display surface, the robot being configured to cause the display of a face image depicting a face of said robot on said display surface. The robot head (3) is configured for detachable connection to said face part to allow detachable connection of different face parts (9; 9A, 9B) to the robot head, and the robot (1) is configured to automatically adapt said robot identity and/or said robot personality based on the face part currently being connected to the robot head (3). This allows a user of the robot to change the character of the robot, i.e. to change the robot identity and/or personality, by connection of different face parts to the robot head.

Claims

1-21. (canceled)

22. A robot configured to: exhibit a robot personality at least partly through execution of computer-readable program code, the robot comprising a robot head configured for connection to a face part including a display surface, the robot being configured to cause the display of a face image depicting a face of the robot on the display surface, the robot head being configured for detachable connection to the face part to allow detachable connection of different face parts to the robot head, the robot being configured to automatically adapt the robot personality based on the face part currently being connected to the robot head, wherein the robot further comprises a data receiving unit configured to receive, from the face part being brought in the proximity of, or currently connected to, the robot head, a program code sequence stored on a data carrier of the face part, and the robot is configured to automatically adapt the robot personality through execution of the program code sequence received from the face part being brought in the proximity of, or currently connected to, the robot head.

23. The robot of claim 22, wherein the robot is configured to adapt the robot personality through execution of the program code sequence that is automatically selected in dependence of the face part currently being connected to the robot head.

24. The robot of claim 22, further comprising a communication unit configured to receive data from the face part currently being connected to the robot head, the robot being configured to use the data in the adaption of the robot personality.

25. The robot of claim 24, wherein the communication unit comprises a sensor arrangement for automatic identification of the face part currently being connected to the robot head.

26. The robot of claim 25, wherein the sensor arrangement is configured to retrieve a face part identifier from the face part currently being connected to the robot head, the robot being configured to retrieve a computer program sequence from an internal memory of the robot or from a network device to which the robot is communicatively connectable, which computer program sequence is selected in dependence of the face part identifier, and to adapt the robot personality through execution of the computer program code sequence.

27. The robot of claim 22, wherein the automatic adaption of the robot personality involves adaption of at least one robot characteristic selected from a group consisting of the face image depicting a face of the robot, a voice of the robot, a behaviour of the robot and a skill of the robot.

28. The robot of claim 22, wherein the automatic adaption of the robot personality comprises adaption of the face image depicting a face of the robot.

29. The robot of claim 22, wherein the display surface comprises a 3D display surface and the face image comprises a face animation adapted to the shape of the 3D display surface.

30. The robot of claim 22, wherein the display surface comprises a projection surface of a translucent mask constituting or forming part of the face part, the robot further comprising a projector for projecting the face image onto the projection surface.

31. The robot of claim 22, wherein the display surface comprises a display surface of an electronic or fibre-optic display unit constituting or forming part of the face part, the robot being configured to cause display of the face image on the electronic or fibre-optic display surface.

32. The robot of claim 22, wherein the robot is configured to adapt the robot personality based on the face part currently being connected to the robot head through adaption of at least one of the face image depicting a face of the robot or a voice of the robot, and at least one of a behaviour of the robot or a skill of the robot.

33. The robot of claim 22, wherein the robot is configured to provide telepresence of a remote user at a location of the robot.

34. The robot of claim 33, wherein the robot is configured to provide telepresence to one of a plurality of remote users, the robot being configured to select one of the remote users based on the face part currently being connected to the robot head, and to cause at least the face image depicting a face of the robot to be selected based on the selection of the remote user.

35. The robot of claim 22, wherein the robot is configured to run different software applications which, when executed, cause the robot to exhibit different skills, the robot being configured to select which application to run in dependence of the face part currently being connected to the robot head.

36. A robot system comprising: a robot configured to exhibit a robot personality at least partly through execution of computer-readable program code, and at least one face part comprising a display surface, the robot comprising a robot head configured for connection to the face part, wherein the robot is configured to cause the display of a face image depicting a face of the robot on the display surface, the robot head and the face part are configured for detachable connection to each other to allow detachable connection of different face parts to the robot head, the robot being configured to automatically adapt the robot personality based on the face part currently being connected to the robot head, wherein the face part further comprises a data carrier storing a program code sequence for automatically adapting the robot personality of the robot based on the face part, the robot further comprises a data receiving unit configured to receive, from the face part currently connected to the robot head, a program code sequence stored on a data carrier of the face part, and the robot is configured to automatically adapt the robot personality through execution of the program code sequence received from the face part.

37. A face part configured for connection to a robot head of a robot, the face part comprising: a display surface on which the robot is configured to cause the display of a face image depicting a face of said robot, the face part being configured for detachable connection to the robot head to allow detachable connection of different face parts to the robot head, wherein: the face part comprises a data carrier configured to allow the robot to automatically identify the face part when connected to the robot head, and the face part is configured to store a program code sequence for automatically adapting a robot personality of the robot based on the face part, the face part further configured to transmit the program code sequence to the robot when the face part is brought in the proximity of, or connected to, the robot head.

38. The face part of claim 37, wherein the data carrier is configured to transfer a face part identifier to the robot when the face part is brought in the proximity of, or connected to, the robot head.

39. A face part configured for connection to a robot head of a robot, the face part comprising: a display surface on which the robot is configured to cause the display of a face image depicting a face of the robot, the face part being configured for detachable connection to the robot head to allow detachable connection of different face parts to the robot head, wherein the face part comprises a data carrier storing a program code sequence for adapting the robot personality of the robot based on said face part, the face part being configured to transmit said program code sequence to the robot when the face part is brought in the proximity of, or connected to, the robot head, and the robot being configured to adapt the robot personality based on the face part currently being connected to the robot head through adaption of the face image depicting a face of the robot and a voice of the robot, and at least one of a behaviour of the robot or a skill of the robot.

40. A method for customizing a robot configured to exhibit a robot personality at least partly through execution of computer-readable program code, the robot comprising a robot head configured for connection to a face part, the method comprising: receiving a program code sequence from the face part currently connected to the robot head, the face part including a display surface on which the robot is configured to display a face image depicting a face of the robot, the robot head configured for detachable connection to the face part to allow detachable connection of the face part to the robot head, and automatically adapting the robot personality based on the face part currently being connected to the robot head through execution of the program code sequence received from the face part.

41. A computer-readable storage medium comprising instructions for customizing a robot configured to exhibit a robot personality at least partly through execution of the instructions, the robot comprising a robot head configured for connection to a face part, the instructions, when executed by a processor, cause the processor to carry out a method comprising: receiving a program code sequence from the face part currently connected to the robot head, the face part including a display surface on which the robot is configured to display a face image depicting a face of the robot, the robot head configured for detachable connection to the face part to allow detachable connection of the face part to the robot head, and automatically adapting the robot personality based on the face part currently being connected to the robot head through execution of the program code sequence received from the face part.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0054] The present invention will become more fully understood from the detailed description following hereinafter and the accompanying drawings, which are given by way of illustration only. Throughout the drawings, the same reference numerals denote the same elements.

[0055] FIG. 1 illustrates a side view of a robot according to an exemplary embodiment of the present disclosure, as well as a face part which is detachably connectable to a head part of the robot;

[0056] FIG. 2 illustrates the robot of FIG. 1 together with two different face parts, each associated with a respective robotic character;

[0057] FIG. 3 is a flowchart illustrating the principles of the proposed procedure for automatic customization of the robot;

[0058] FIG. 4 is a flowchart illustrating a way of implementing said procedure for automatic customization of the robot, and

[0059] FIG. 5 is a flowchart illustrating an alternative way of implementing said procedure for automatic customization of the robot.

DETAILED DESCRIPTION

[0060] FIG. 1 illustrates a side view of a robot 1 according to an exemplary embodiment of the present invention. The robot 1 constitutes what is sometimes referred to as a robot head or talking robot head within the art of robotics.

[0061] The robot 1 comprises a first structural part forming a head 3 of the robot, a second structural part forming a neck 5 of the robot, and a third structural part forming a torso 7 of the robot.

[0062] The robot head 3 is configured for detachable connection to a face part 9, hereinafter referred to as a face mask or simply a mask. That the robot head 3 is configured for detachable connection to the face mask 9 herein means that the robot head and the mask are intended and configured to be firmly but non-permanently connected to each other, and to be releasable from each other without causing any damage to the robot head or the mask. Preferably, the connection interface for establishment of the detachable connection between the robot head 3 and the face mask 9 is configured to permit detachment of the mask 9 from the robot head 3 without the use of any tools.

[0063] In the exemplary embodiment illustrated in FIG. 1, the robot head 3 is configured for magnetic connection to the face mask 9. To this end, the robot head 3 and the mask 9 are provided with oppositely polarized magnets 11A and 11B which, when the mask 9 is brought in the proximity of the robot head 3, draw the mask 9 into releasable magnetic engagement with the head 3 of the robot 1. Alternatively, the detachable connection between the robot head 3 and the face mask 9 may be achieved by means of a clamping or clip-on connection. For example, the robot head 3 may be provided with one or more clips allowing the face mask 9 to be brought into releasable clamped engagement with the robot head 3. In this case, the face part 9 may comprise structural parts which are adapted in dimension and structure to establish the firm but detachable connection with the robot head 3 when said structural parts of the face part 9 are brought into clamped engagement with the clips of the robot head.

[0064] The magnets 11A of the robot head 3 are attached to a rig or frame 13 of the robot 1. The rig constitutes a main structural component of the robot 1 and forms a skeleton structure serving to support and carry other components of the robot 1. When the magnets 11B of the face mask 9 are brought into magnetic engagement with the magnets 11A of the robot head 3, the face mask 9 becomes firmly but detachably attached to the robot rig 13.

[0065] The robot 1 is a projective robot comprising a projector 15, such as a pocket LED projector, for causing the display of a face image depicting a face of the robot 1 on a display surface of the face mask 9. The projector 15 is located in the neck 5 of the robot and the face image is projected onto the backside of the face mask 9, constituting said display surface, by means of a mirror 17 in the back of the robot head 3. The robot head 3 and the face mask 9 thus constitute a robot system comprising what is sometimes referred to as a rear-lit projection system, in which the rear projection screen is constituted by the backside of the face mask 9.

[0066] The projected face image is typically a face animation that is controlled by an onboard computer 19 of the robot 1 by running a software animation system installed on the onboard computer. The face animation may be a 2D or 3D face animation of a human face, an animal face or the face of a fictional character. In this exemplary embodiment, the face animation is a 3D animation of an anthropomorphic face, closely resembling the face of a human being.

[0067] The face mask 9 is a 3D translucent mask, typically made of plastic, allowing the projected face animation to be viewed by a viewer standing in front of the robot 1. The mask is designed without any enforced curvature of the lips and eyes, thereby avoiding any mismatch between the projected animation and the 3D shape of the mask.

[0068] The robot 1 is further configured to exhibit an anatomically inspired two-degree-of-freedom neck movement. To this end, the robot 1 comprises two mechanical servos 21A, 21B configured to mimic the anatomical movement of a human head. The first servo 21A is a pan servo located in the torso 7 of the robot, which pan servo 21A is configured to rotate the robot head 3 in a horizontal plane. The second servo 21B is a tilt servo located at an upper and rear end of the robot head 3, allowing for natural tilt and nodding movements of the robot head 3. In an exemplary embodiment, the servos 21A and 21B are high-torque Dynamixel servos. The servos 21A, 21B are connected to the onboard computer 19, which controls the servos and thus the neck and head movements of the robot 1 by means of a neck control software installed on the onboard computer.

[0069] The robot 1 comprises a skull part 22 forming a cover of the robot head 3. The skull 22 may also be detachably connected to the rig 13, for example by means of magnets, for easy transportation of the robot and maintenance of robot components arranged within the robot head, which components are at least partly covered by said skull. The skull 22 is preferably white and made of plastic.

[0070] The robot 1 further comprises a built-in audio system 23 for output of audio, such as a synthetic voice of the robot. The audio system 23 may comprise an amplifier and at least one speaker, internally connected to the onboard computer 19. The use of the built-in audio system may be optional. Preferably, the onboard computer 19 is provided with a connection interface, such as a 3.5 mm audio connector port, allowing an external audio system to be connected to the robot by plugging the external audio system into the onboard computer 19 via said connection interface.

[0071] The robot 1 further comprises a base plate 25 for carrying the components of the robot. The base plate 25 is provided with rubber feet 27 or the like to ensure that the robot 1 stands steady on the underlying surface. The base plate 25, the onboard computer 19, the audio system 23, the pan servo 21A and a lower part of the robot rig 13 are enclosed by a housing 29, visually resembling a torso of the robot. The housing 29 is preferably detachable from the base plate 25 or provided with one or more openings to provide easy access to the onboard computer 19 and the other components arranged within the housing.

[0072] The robot 1 is further provided with connectivity for connection to one or more network devices via a network, such as the internet. For example, the onboard computer 19 may be provided with an Ethernet connection interface allowing an Ethernet cable to be plugged into the computer to allow the robot 1 to communicate and exchange information with external network devices.

[0073] The robot 1 may also be configured to support connection of various hardware components and peripherals to the robot, e.g. via USB connectors of the onboard computer 19. Such external hardware components and peripherals may include an external monitor, a keyboard and a computer mouse, but may also include various sensors for adding functionalities to the robot 1. For example, the robot 1 may be configured to support connection of different image sensors and microphones. The robot 1 may contain software for facial recognition, facial expression recognition and gesture recognition as well as face and body tracking, and be configured to provide any or all of these functionalities (skills) based on image information recorded by an external image sensor connected to the onboard computer 19, such as a depth camera recording 3D images of the surroundings. Likewise, the robot may contain software for e.g. voice recognition and speech interpretation, and be configured to provide any or all of these functionalities based on audio information recorded by an external microphone. A connection interface 31 for connection of a network cable and external hardware equipment to the robot 1 is schematically illustrated by a dashed box in FIG. 1. In other embodiments (not shown), the robot 1 may comprise one or more integrated image sensors and microphones to provide any or all of the above mentioned functionalities.

[0074] The robot 1 is configured to exhibit a robot identity and a robot personality which, at least to some extent, are governed by software run on the onboard computer 19. This means that the robot 1 may select which identity and/or which personality to exhibit by executing different program code sequences that are stored in or downloadable to the onboard computer 19.

[0075] The above described face animation that is projected onto the display surface of the face mask 9 is a software-controlled robot characteristic forming part of what is herein referred to as the robot identity. Another software-controlled robot characteristic forming part of the robot identity is the voice of the robot, output by the built-in audio system 23 or an external audio system connected to the onboard computer 19 of the robot 1.

[0076] The behaviour of the robot 1 is a software-controlled robot characteristic forming part of what is herein referred to as the robot personality. The behaviour of the robot 1 includes the social behaviour of the robot 1, i.e. the way the robot interacts with humans or other automated machines, and the movement pattern of the robot, i.e. the way the robot moves the eyes of the animated face and the head 3 of the robot.

[0077] The behaviour of the robot 1 also includes the skills of the robot, including its interactive skills, its knowledge base, and its capability of providing answers to questions.

[0078] In accordance with the principles of the present invention, the robot 1 is configured to automatically adapt the robot identity and/or the robot personality based on the face part or mask 9 currently being connected to the robot head 3, which means that the robot 1 is configured to adapt said robot identity and/or robot personality through execution of a program code sequence which is automatically selected in dependence of the face mask currently being connected to the robot head 3.

[0079] To this end, the robot 1 may comprise a communication unit 33 for receiving data from the face mask 9 currently being connected to the robot head 3. The data received from the face mask may be used by the robot 1 to achieve said face-mask dependent adaption of the robot identity and/or personality.

[0080] In this exemplary embodiment, the communication unit 33 comprises a sensor arrangement for automatic identification of the face mask 9. The sensor arrangement is a radio frequency identification (RFID) reader, configured for wireless communication with a data carrier 35 in the form of an RFID tag attached to the face mask 9. The RFID reader is connected to the onboard computer 19 and configured to receive information from the data carrier 35 allowing the robot 1 to automatically identify the face mask 9 when the face mask is connected to or brought in the proximity of the robot head 3.

[0081] The data that is read from the RFID tag 35 by the RFID reader may be a face part identification number, hereinafter referred to as a mask ID. The onboard computer 19 uses the mask ID to identify one or more program code sequences which, when run by the onboard computer 19, cause the robot 1 to exhibit a certain robot identity and/or a certain robot personality, which robot identity and/or robot personality is thus automatically selected based on the face mask 9 identified by the sensor arrangement 33.

[0082] For example, the onboard computer 19 may store a look-up table comprising a plurality of mask IDs, wherein each mask ID is associated with a certain robotic character having a certain robot identity and/or robot personality, e.g. a certain face animation, a certain voice, a certain behaviour and/or a certain set of skills. When a new mask ID is read by the RFID reader, the onboard computer 19 executes a program code sequence causing the robot 1 to exhibit the robot identity and/or robot personality that is associated with the mask ID in said look-up table. The look-up table and the program code sequences causing the robot to exhibit different robot identities and/or different robot personalities in dependence of the currently connected face mask may be stored in an internal memory 37 of the onboard computer 19, from which the program code sequences may be executed by a processor 39 of the onboard computer 19.
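The look-up described in paragraph [0082] can be sketched as follows. This is a minimal illustrative sketch only; all names used here (RobotCharacter, CHARACTER_TABLE, select_character, and the example mask IDs and character attributes) are assumptions made for the example and do not appear in the patent.

```python
from dataclasses import dataclass

@dataclass
class RobotCharacter:
    """A robotic character: identity (face, voice) plus personality traits."""
    face_animation: str
    voice: str
    behaviour: str
    skills: tuple

# Hypothetical look-up table mapping each mask ID read from the RFID tag
# to the robotic character the robot should exhibit.
CHARACTER_TABLE = {
    "mask-001": RobotCharacter("einstein_3d", "einstein_tts", "curious", ("physics_qa",)),
    "mask-002": RobotCharacter("baby_3d", "baby_tts", "playful", ()),
}

def select_character(mask_id: str) -> RobotCharacter:
    """Return the character associated with the currently attached mask."""
    return CHARACTER_TABLE[mask_id]
```

In a real system the table entries would point at executable program code sequences stored in the internal memory 37 rather than plain attribute records.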

[0083] The automatic face-part dependent adaption of the robot 1 allows a user of the robot to change the character of the robot 1 completely, simply by changing the face mask of the robot. Consequently, it is contemplated that a user may be provided with a robot system according to the present disclosure, which robot system comprises a robot as described above and at least two mutually different face parts associated with different robotic characters, allowing the user to switch between said robotic characters by exchanging the face mask currently connected to the robot head for the other face mask.

[0084] Of course, in order for the robot 1 to know which robotic character to turn into when one of the face masks 9A and 9B is attached to the robot head 3, the association between each face mask and a corresponding robotic character needs first to be created. The creation of robotic characters, i.e. the programming of face animations, voices, behaviours and skills, and the assignment of a particular robotic character to a particular face mask, i.e. the creation of an association between a face mask and a robotic character to be presented when said face mask is attached to the robot 1, may be made prior to delivery of the robot system to the user, and/or be made by the user himself. Preferably but not necessarily, upon delivery of the robot system to a user, each face mask delivered with the robot system is associated with a robotic character which is presented to the user when the face mask is connected to the robot head 3. However, the robot system may advantageously comprise a software development kit allowing the user to alter or add robotic characters, and associate the robotic characters with a face mask of their choice. In this way, the user may create a robotic character having a desired face animation, voice, behaviour and set of skills, and associate the robotic character with a certain face mask which, whenever said face mask is connected to the robot 1, will bring the thus created robotic character to life. Preferably, the robot system further comprises a software application, such as an online market place, allowing users to upload robotic characters, each developed and intended for use with a particular face mask or potentially two or more face masks having similar design. In this way, other users having a corresponding face mask can download robotic characters intended for and specifically adapted to that face mask, created by other users.

[0085] FIG. 2 illustrates a robot system comprising a robot 1, a first face mask 9A associated with a first robotic character, and a second face mask 9B associated with a second robotic character.

[0086] In one exemplary scenario, one of the face masks 9A, 9B may be associated with a male robotic character having a masculine face and voice (i.e. a male robotic identity), while the other face mask may be associated with a female robotic character having a feminine face and voice (i.e. a female robotic identity).

[0087] In another exemplary scenario one of the face masks 9A, 9B may be associated with an adult robotic character having an adult face and voice, while the other face mask may be associated with a baby-like robotic character having a baby-like face and voice. Of course, the adult robotic character may also be pre-programmed to exhibit an adult-type of behaviour and skills that are typically possessed by an adult person, i.e. to exhibit an adult robotic personality. Likewise, the baby-like robotic character may be pre-programmed to exhibit a baby-like behaviour and skills or a lack of skills that are typical for a baby, i.e. to exhibit a baby-like robotic personality.

[0088] In yet another exemplary scenario in which the robot system is used for education or amusement, each face mask 9A, 9B of the system may be associated with a historical person, such as Albert Einstein, Napoleon, William Shakespeare or Aristotle. By connecting a face mask associated with e.g. Albert Einstein to the robot head 3, the robot 1 will automatically become Albert Einstein by causing the display of a face image depicting Albert Einstein, speaking with a voice resembling the voice of Albert Einstein, behaving like Albert Einstein, being capable of answering questions of interest to Albert Einstein, etc.

[0089] In yet another exemplary scenario in which the robot is used for providing telepresence of remote meeting participants during e.g. a business meeting, each face mask 9A, 9B may be associated with a remote meeting participant not physically present at the meeting. The meeting administrator could then attach a face mask associated with a remote meeting participant to the robot head 3, whereby the robot 1 would turn into said remote meeting participant by automatically causing the display of a face image depicting or representing said participant on the display surface of the face mask 9A, 9B. For example, the robot 1 may be configured to cause the display of a real time or near real time video of the face of said remote meeting participant as said face image, which video may be captured by a video camera collocated with the remote meeting participant. Alternatively, the robot may be configured to cause the display of a face animation, preferably a 3D face animation, of said remote meeting participant as said face image, which face animation is caused to mimic the facial expressions of the remote meeting participant. This may be achieved by the robot 1 or a network device to which the robot is communicatively connected through image processing of said video of the face of the remote meeting participant, e.g. by using known techniques for detection of faces, direction of gaze, emotions, speech-related face movements etc. Preferably, in this telepresence scenario, the robot 1 is also configured to output or reproduce speech of the remote meeting participant, which speech may be recorded by a microphone collocated with the remote meeting participant.

[0090] In yet another exemplary scenario, the face masks 9A, 9B may be associated with different skills or sets of skills of the robot 1. For example, each of the face masks 9A, 9B may be associated with any of or any combination of the skill of providing telepresence of a remote person, the skill of singing and the skill of telling stories. The robot 1 may then be configured to adapt its skills based on the face part currently being connected to the robot head. The robot 1 may for example be configured to run different applications (apps) in dependence of the face mask currently being connected to the robot head 3. Thus, by connecting a telepresence face part to the robot head 3 the user could trigger the robot 1 to run a telepresence application, and by connecting a singing face part to the robot head 3, the user could trigger the robot to run an application causing the robot to sing songs, etc.
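The skill-based scenario in paragraph [0090], where the connected mask determines which application the robot runs, can be sketched as a simple mapping with a fallback. The mask IDs and application names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical mapping from the ID of the connected face mask to the
# application (app) the robot should run, as in paragraph [0090].
APP_BY_MASK = {
    "telepresence-mask": "telepresence_app",   # remote-presence skill
    "singer-mask": "singing_app",              # singing skill
    "storyteller-mask": "storytelling_app",    # story-telling skill
}

def app_for_mask(mask_id: str, default: str = "idle_app") -> str:
    """Select which application to run for the currently connected mask,
    falling back to an idle application for unrecognized masks."""
    return APP_BY_MASK.get(mask_id, default)
```

Attaching the singing mask would then trigger `app_for_mask("singer-mask")`, launching the singing application.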

[0091] A method for customizing the robot 1 according to the principles of the present disclosure will now be described with reference to FIGS. 3 and 4. Simultaneous reference will be made to the components of the robot systems illustrated in FIGS. 1 and 2.

[0092] The method of customizing the robot 1 is a computer-implemented method which is performed by the robot 1, possibly in collaboration with one or more network devices to which the robot 1 is communicatively connectable. Unless stated otherwise, the method steps performed by the robot 1 are performed through execution by the processor 39 of a computer program stored in the memory 37 of the onboard computer 19.

[0093] FIG. 3 is a flowchart illustrating the principles of the proposed procedure for customization of the robot 1.

[0094] In a first step A1, a detachable connection is established between the robot head 3 and the face mask 9. This step, of course, is not performed through execution of a computer program. Instead, the detachable connection with the face part 9 is established by the robot 1 by means of a connection interface for firm but detachable connection of the robot head 3 to the face mask 9, such as the magnetic interface formed by the magnets 11A shown in FIGS. 1 and 2.

[0095] In a second step A2, the robot 1 automatically adapts the robot identity and/or personality based on the face mask 9 currently connected to the robot head 3.

[0096] FIG. 4 is a flowchart illustrating a way of implementing the principles of the proposed procedure for customization of the robot 1, illustrated in FIG. 3.

[0097] In a first step B1, corresponding to step A1 in FIG. 3, the robot establishes a detachable connection with the face mask 9.

[0098] In a second step B2, the program code which, when executed by the robot, causes the robot to exhibit a certain robot identity and/or robot personality is received directly from the face mask 9. As mentioned above, the program code may be received by a data receiving unit of the robot 1, e.g. a data receiving unit in the form of an SD card reader comprised in the communication unit 33.

[0099] In a third step B3, the robot identity and/or personality is adapted to or at least adapted based on the face mask 9 currently being connected to the robot head 3 through execution of the program code retrieved in step B2.
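Steps B1-B3 can be sketched as follows, with the "program code sequence" delivered by the mask modelled as a callable payload (e.g. read from the mask's SD card). This is an illustrative sketch under that assumption; the function and variable names are hypothetical.

```python
def customize_from_mask(read_payload):
    """Steps B2-B3: receive the code sequence from the connected mask,
    then execute it to adapt the robot identity/personality state."""
    personality = {}
    payload = read_payload()   # B2: data receiving unit reads the mask
    payload(personality)       # B3: executing the sequence adapts the state
    return personality

# A mask whose stored payload sets an "Einstein" identity (hypothetical):
mask_payload = lambda p: p.update(face="einstein_3d", voice="einstein_tts")
```

Here `read_payload` stands in for the SD card reader of the communication unit 33; step B1 (the mechanical attachment) happens before this code runs.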

[0100] FIG. 5 is a flowchart illustrating an alternative way of implementing the principles of the proposed procedure for customization of the robot 1, illustrated in FIG. 3.

[0101] In a first step C1, corresponding to step A1 in FIG. 3, the robot establishes a detachable connection with the face mask 9.

[0102] In a second step C2, a mask ID being an identifier of the face mask 9 currently connected to the robot head 3 is obtained by the robot 1. The mask ID may be obtained from a data carrier of the face mask 9 (step C2i) by means of a sensor arrangement of the robot, such as an RFID reader, or obtained from the user of the robot 1 through manual input of the mask ID via a user interface of the robot system (step C2ii).

[0103] In a third step C3, the program code which, when executed by the robot, causes the robot to exhibit a certain robot identity and/or robot personality is retrieved by the robot using the mask ID obtained in step C2. The program code may be retrieved from an internal memory 37 of the robot (step C3i) or from an external network device to which the robot 1 is communicatively connectable (step C3ii).

[0104] In a fourth step C4, the robot identity and/or personality is adapted to or at least adapted based on the face mask 9 currently being connected to the robot head 3 through execution of the program code retrieved in step C3.
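The alternative procedure of FIG. 5 (steps C2-C4) can be sketched in the same style: obtain the mask ID, retrieve the matching code sequence from the internal memory first and from a network device otherwise, then execute it. All names here are illustrative assumptions, and the stored sequences are again modelled as callables.

```python
# Hypothetical internal memory (memory 37) mapping mask IDs to sequences.
LOCAL_STORE = {"mask-001": lambda p: p.update(character="einstein")}

def fetch_remote(mask_id):
    """Stand-in for step C3ii: download from an external network device."""
    raise LookupError(f"no remote character available for {mask_id}")

def customize_by_id(mask_id, local_store=LOCAL_STORE, fetch=fetch_remote):
    """Steps C2-C4: resolve the mask ID to a code sequence and execute it."""
    personality = {}
    sequence = local_store.get(mask_id)   # C3i: internal memory
    if sequence is None:
        sequence = fetch(mask_id)         # C3ii: external network device
    sequence(personality)                 # C4: adapt identity/personality
    return personality
```

Step C2 (reading the mask ID via the RFID reader, or via manual user input) is assumed to have produced the `mask_id` argument.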