Household Appliance with Multifunctional Display

20230011404 · 2023-01-12


    Abstract

    The invention relates to a household appliance (100), in particular a kitchen appliance (100) for preparing food, comprising a processing device (110) for at least partially automatically carrying out a household activity, in particular for processing ingredients, wherein the processing device (110) is controlled by a control unit (102). Further, the household appliance (100) comprises a positioning device (120) for detecting position information (200) of a user (1) operating the household appliance (100), the position information (200) being transmitted to the control unit (102). Furthermore, the household appliance (100) has a display (130) which is adapted to reproduce image information (140) determined by the control unit (102), wherein the image information (140) reproduced by the display (130) generates an impression of depth for the user (1) and is dependent on at least the position information (200) of the user (1). Furthermore, the invention relates to a method for operating a household appliance (100).

    Claims

    1. A household appliance for preparing food, comprising: a processing device for at least partially automatically carrying out a household activity, the processing device being controlled by a control unit; a positioning device for detecting position information of a user operating the household appliance, wherein the position information is transmitted to the control unit; and a display adapted to display image information determined by the control unit, wherein the image information reproduced by the display generates an impression of depth for the user and is dependent on at least the position information of the user.

    2. The household appliance according to claim 1, wherein the display comprises at least an autostereoscopic display or virtual reality glasses or a holographic display.

    3. The household appliance according to claim 1, wherein the display comprises at least one of the following components: an optical component, a barrier mask, or a light unit.

    4. The household appliance according to claim 1, wherein the image information reproduced by the display generates an impression of depth for the user at a defined image position in relation to the display, the defined image position corresponding to the position information of the user when the user moves in relation to the display.

    5. The household appliance according to claim 1, wherein at least the positioning device is adapted to detect position information for each of a plurality of users, or the display is adapted to reproduce a plurality of items of image information, each of which creates an impression of depth for one of the plurality of users.

    6. The household appliance according to claim 1, wherein the display has a 2D mode and a 3D mode.

    7. The household appliance according to claim 1, further comprising a user identification device which is configured to identify the user, in particular to identify the user at least actively or passively.

    8. The household appliance according to claim 1, wherein the display has at least one axis of rotation about which the display can be rotated in relation to the user.

    9. The household appliance according to claim 1, wherein a further display is provided.

    10. The household appliance according to claim 1, wherein the positioning device comprises at least one of the following components: a manually operable input unit, a camera, a gesture sensor adapted to detect a gesture of the user for controlling the household appliance, or a sound transducer which is configured to detect a voice input of the user for controlling the household appliance.

    11. The household appliance according to claim 10, wherein the gesture sensor comprises at least one of the following components: a camera, a radar sensor, a lidar sensor, a capacitive sensor or an ultrasonic sensor.

    12. The household appliance according to claim 1, wherein the household appliance is of multipart configuration and has at least one main body which can at least be connected to a mains power supply or on which the processing device is provided.

    13. A method for operating a household appliance, comprising: detection of position information of a user by a positioning device, and reproduction of image information by a display adapted to create an impression of depth for the user, wherein the image information is dependent on at least the position information.

    14. The method of operating a household appliance according to claim 13, wherein the method further comprises at least one of the following: reproduction of image information specific to the operation of the household appliance by a display, wherein, in addition to that image information, at least previous image information or subsequent image information is displayed; initial recognition of a user by a user identification device; detection of a gesture of the user by at least a gesture sensor to control the household appliance; detection of a voice input of the user by a sound transducer to control the household appliance; or performing a household activity.

    15. The method of operating a household appliance according to claim 13, wherein the household appliance comprises: a processing device for at least partially automatically carrying out a household activity, the processing device being controlled by a control unit; a positioning device for detecting position information of a user operating the household appliance, wherein the position information is transmitted to the control unit; and a display adapted to display image information determined by the control unit, wherein the image information reproduced by the display generates an impression of depth for the user and is dependent on at least the position information of the user.

    Description

    [0069] Further advantages, features and details of the invention will be apparent from the following description, in which embodiments of the invention are described in detail with reference to the drawings. In this connection, the features mentioned in the claims and in the description may each be essential to the invention individually or in any combination. The drawings schematically show:

    [0070] FIG. 1 a household appliance according to the invention, which is designed/configured as a kitchen appliance,

    [0071] FIG. 2 a side view of a household appliance according to the invention with a user,

    [0072] FIG. 3 a coordinate system showing a display of a household appliance according to the invention and a head of a user,

    [0073] FIG. 4 a part of a household appliance according to the invention,

    [0074] FIG. 5 a display of a household appliance according to the invention,

    [0075] FIG. 6 an example of image information of a household appliance according to the invention,

    [0076] FIG. 7 an embodiment of a display comprising a lens array,

    [0077] FIG. 8 an embodiment of a display comprising a barrier mask, and

    [0078] FIG. 9 a method of operating a household appliance according to the invention.

    [0079] In the following description of some embodiments of the invention, the identical reference signs are used for the same technical features even in different embodiments.

    [0080] FIG. 1 shows a household appliance 100 according to the invention, which is designed/configured as a kitchen appliance 100 for preparing food. The household appliance 100 has a processing device 110, in particular a mixing vessel 111, for at least partially automatically carrying out a household activity. The household activity may in particular be a processing of ingredients. The processing device 110 is controlled by a control unit 102. Also provided on the household appliance 100 is a positioning device 120, which is used for detecting position information 200 of a user 1 (not shown here) operating the household appliance 100. The positioning device 120 may, for example, have sensors (shown schematically to the left of the display 130 in FIG. 1) and/or be designed/configured as a manually operable input unit 121, in particular as a rotary knob 121 (shown schematically to the right of the display 130 in FIG. 1). The positioning device 120 transmits the detected position information 200 to the control unit 102. Also shown in FIG. 1 is a display 130, which is designed/configured to reproduce image information 140 determined by the control unit 102. The image information 140 reproduced by the display 130 generates an impression of depth for the user 1 (not shown here) and is dependent at least on the position information 200 of the user 1.

    [0081] Also shown in FIG. 1 is that, in addition to the display 130, a further display 136 is provided, which may in particular be designed/configured as a 2-D display 136. With respect to the display 130, it can be provided that it has a 2-D mode and a 3-D mode, wherein in particular the display 130 can switch between a 2-D mode and a 3-D mode on the basis of the position information 200.
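    The switching between the 2-D mode and the 3-D mode on the basis of the position information 200 can be sketched as follows. This is a minimal illustrative sketch, not part of the disclosure: the function name and the range and angle thresholds are assumptions chosen only to show the decision logic.

```python
def select_display_mode(r, phi_deg, max_range=2.0, max_angle_deg=30.0):
    """Choose between the display's 2-D and 3-D modes from position
    information (distance r in metres, azimuth angle phi in degrees).

    The thresholds are hypothetical: the 3-D mode is used only while
    the user stands inside the viewing cone in which an impression of
    depth can be generated; otherwise the display falls back to 2-D.
    """
    if r <= max_range and abs(phi_deg) <= max_angle_deg:
        return "3D"
    return "2D"
```

A user close to and roughly in front of the display would thus be served the 3-D mode, while a user far away or far off-axis would see the 2-D mode.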

    [0082] Furthermore, it can be seen from FIG. 1 that the household appliance 100 can be of multi-part design/configuration, and has at least one main body 101 which can be connected to a mains power supply and/or on which the processing device 110 is provided, wherein in particular the display 130 can be detachably attached to the main body 101 or the display 130 is integrally connected to the main body 101.

    [0083] FIG. 2 shows a side view of the household appliance 100 according to the invention, wherein the household appliance is designed/configured as a kitchen appliance 100 and comprises a processing device 110 with a stirring vessel 111. Also shown in FIG. 2 is a user 1 carrying a mobile identification transmitter 4. According to the embodiment shown, this mobile identification transmitter 4 can cooperate with a user identification device 150 provided on the household appliance. The user identification device 150 is adapted to identify the user 1 and may in particular be adapted to communicate with the mobile identification transmitter 4 of the user 1. In this context, the mobile identification transmitter 4 may in particular be designed/configured as a mobile radio telephone and/or RFID tag 4. The communication between the mobile identification transmitter 4 and the user identification device 150 can in particular be wireless, preferably via Bluetooth, Bluetooth Low Energy and/or WLAN.

    [0084] Furthermore, it is shown in FIG. 2 that the display 130 has an axis of rotation about which the display 130 can be rotated in relation to the user 1, in particular based on the position information 200. This makes it possible for the depth impression to be generated for the user even if the latter changes his position, in particular his height, which is denoted by Z in FIG. 2. In particular, it may be provided that the display 130 has a motor which automatically moves the display 130 in such a way that its viewing direction, shown in FIG. 2 as an arrow starting from the display 130, is always directed towards the face of the user 1. This can ensure that the image information 140 reproduced by the display 130 can always create an impression of depth for the user 1.
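    The tilt required to keep the display directed towards the user's face follows from simple trigonometry on the quantities of FIG. 2. The following sketch illustrates this; the function name, the assumed mounting height of the display and the metric units are illustrative assumptions, not taken from the disclosure.

```python
import math

def display_tilt_deg(user_height, display_height, distance):
    """Tilt angle (degrees) about the horizontal axis of rotation so
    that the display normal points towards the user's face.

    user_height    : height Z of the user's face (assumed metres)
    display_height : mounting height of the display (assumption)
    distance       : horizontal distance R of the user to the display
    """
    return math.degrees(math.atan2(user_height - display_height, distance))
```

For a face 0.7 m above the display at 0.7 m distance, the motor would tilt the display up by 45 degrees; if face and display are level, the tilt is zero.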

    [0085] According to FIG. 2, the user 1 is located at a distance R from the household appliance 100 and has a height Z. It may be provided that the user manually communicates his distance R and/or his height Z to the positioning device 120 by an input. Alternatively or in addition, it may also be provided that the positioning device 120 automatically determines both the distance R and the height Z of the user.

    [0086] It may also be provided that the display 130 is rotatable about two axes. A gimbal mounting of the display 130 may also be provided, which offers the advantage that the display 130 can be optimally aligned in a space for the user 1 independently of the orientation of the household appliance 100.

    [0087] FIG. 3 shows the position of a user 1 in relation to the display 130. This essentially represents the position information 200. Thus, the user 1 can be at a distance R and an azimuth angle φ and a height Z to the display 130, in particular to a center of the display 130 or a normal to the center of the display 130. Through this, the position, in particular of a head, of the user 1 can be unambiguously determined as position information 200. Furthermore, the normal can correspond to a polar angle θ.
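    The conversion of a measured head position into the cylindrical position information 200 of FIG. 3 can be sketched as follows. The Cartesian axis convention (x along the display normal, y sideways, z upwards) and the function name are assumptions for illustration only.

```python
import math

def position_information(x, y, z):
    """Convert a head position in Cartesian display coordinates into
    the position information (R, phi, Z) of FIG. 3.

    x : distance along the display normal (assumed convention)
    y : sideways offset from the display centre
    z : height of the head
    Returns (R, phi in degrees, Z).
    """
    r = math.hypot(x, y)                  # horizontal distance R
    phi = math.degrees(math.atan2(y, x))  # azimuth angle to the normal
    return r, phi, z
```

A head one metre in front of and one metre to the side of the display thus yields R = sqrt(2) m and an azimuth angle of 45 degrees, which unambiguously locates the head relative to the display centre.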

    [0088] FIG. 4 shows a household appliance 100 according to the invention, in which a display 130 presents image information 140. Furthermore, it is shown in FIG. 4 that the positioning device 120 may further comprise a camera 122, which is in particular adapted to detect the user 1, a gesture sensor 123, which is adapted to detect a gesture of the user 1 for controlling the household appliance 100, and a sound transducer 124, which is adapted to detect a voice input of the user 1 for controlling the household appliance 100, wherein the sound transducer 124 is in particular configured as a microphone. Furthermore, in the embodiment according to FIG. 4, it is provided that the positioning device 120 further comprises a manually operable input unit 121 in the form of a rotary knob 121. Thereby, the manually operable input unit 121 can be designed/configured to control at least the processing device 110 by an input of the user 1 (both not shown here).

    [0089] In this regard, the gesture sensor 123 may comprise at least one of the following components: a camera, a radar sensor, a lidar sensor, a capacitive sensor, or an ultrasonic sensor.

    [0090] FIG. 5 shows a display 130 that displays image information 140. According to FIG. 5, the display 130 has a housing 134, wherein in particular the housing 134 has a frame. The positioning device 120 can in particular be provided on this housing 134, and in particular be integrated into it. Also shown in FIG. 5 is that the image information 140 may comprise image information 141 which is specific to the operation of the household appliance. In the example shown, the specific image information 141 displayed is an icon for a heater, which may be active, for example, in a kitchen appliance at the time of display. As further shown in FIG. 5, the image information 140 may also comprise previous image information 144 and/or subsequent image information 143, wherein in particular at least two of the image information items 141, 143, 144 differ in a display depth for the user 1 (not shown here). In the example shown in FIG. 5, the specific image information 141 is shown spatially in front of a previous image information 144 and a subsequent image information 143. In other words, the image information 140 in FIG. 5 is shown as a collection of spatially separated cuboidal information blocks 141, 143, 144, which are arranged within the image information 140 in particular like CD covers in a row. In particular, it may be provided that a user can spatially move the cuboidal information blocks 141, 143, 144 via a gesture which can be recognized by the gesture sensor 123. This simplifies the use of the household appliance 100, since the user in particular does not have to touch the household appliance 100 and can nevertheless work through complicated sequences of steps to be performed.
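    The depth arrangement of the information blocks of FIG. 5 can be sketched by assigning each block a display depth, with the block currently relevant to operation placed in front. The function name, the depth step and the recipe-step labels below are illustrative assumptions, not part of the disclosure.

```python
def arrange_information_blocks(blocks, current_index, step=0.1):
    """Assign each information block (cf. 141, 143, 144) a display
    depth so that the block currently relevant to operation appears
    closest to the user, in the style of the arrangement of FIG. 5.

    Depth 0.0 is the front; each step away from the current block
    pushes a block one (hypothetical) depth unit further back.
    Returns a list of (block, depth) pairs.
    """
    return [(name, abs(i - current_index) * step)
            for i, name in enumerate(blocks)]
```

A gesture recognized by the gesture sensor could then simply change `current_index`, which re-sorts the perceived depths without any touch input.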

    [0091] FIG. 6 discloses another variant of image information 140 represented by the display 130, in which a 3-D object 146 is represented. In addition to the 3-D object 146, a 3-D user object 147 is also shown, which can be moved in the spatial representation by the user 1, in particular by a gesture recognized by the positioning device 120 or the gesture sensor 123. It may be provided that the user 1 can enter a recipe selection on the 3-D object 146 by the recognized gesture. In particular, it may be provided that a hand of the user is rendered as a 3-D user object 147 in the image information 140. This offers the advantage that the user can see the gesture taking place in three-dimensional space also spatially represented in the image information 140, so that the use of the household appliance 100 is simplified. It may also be provided that the 3-D user object 147 is represented together with further 3-D objects 146 in the image information 140 in such a way that the user 1 is given the impression that he is manipulating the 3-D object 146 with the gesture. This also offers the advantage that the user 1 can operate the household appliance 100 particularly easily.

    [0092] FIG. 7 shows an example embodiment of the display 130 in which the display has an optical component 131. According to the example in FIG. 7, this is formed as a lens array 131. Furthermore, as shown in FIG. 7, the display 130 may further comprise a light unit 132 which, viewed from the eyes 2, 3 of the user 1, is located behind the lens array 131 and reproduces the image information 140 by emitting photons (shown here as dashed lines in the form of light beams 133). According to the representation in FIG. 7, image information 140.1 intended for the left eye 2 is deflected by the lens array 131 in such a way that these light beams 133 strike only the left eye 2, and image information 140.2 intended for the right eye 3 is deflected by the lens array 131 in such a way that its light beams 133 strike only the right eye 3. Accordingly, the lens array 131 is suitable for blocking the image information 140.1 intended for the left eye 2 from the right eye 3, and for blocking the image information 140.2 intended for the right eye 3 from the left eye 2.
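    The eye separation of FIG. 7 presupposes that the left-eye and right-eye image information 140.1, 140.2 are interleaved column by column on the light unit, with the interleaving phase shifted as the user's head position changes. The following is a deliberately simplified sketch of such column interleaving; real autostereoscopic rendering also accounts for lens pitch and subpixel layout, and the function name and offset parameter are assumptions.

```python
def interleave_views(left_row, right_row, offset=0):
    """Column-interleave one pixel row of the left-eye image 140.1 and
    the right-eye image 140.2 for an autostereoscopic display.

    Even columns (shifted by a position-dependent offset) carry the
    left view, odd columns the right view; the lens array then steers
    each column towards the matching eye. Sketch only.
    """
    assert len(left_row) == len(right_row)
    return [left_row[i] if (i + offset) % 2 == 0 else right_row[i]
            for i in range(len(left_row))]
```

Changing `offset` from 0 to 1 swaps which physical columns serve which eye, which is how the interleaving could follow the position information 200 as the user moves sideways.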

    [0093] FIG. 8 shows an alternative embodiment of the display 130, in which the optical component 131 is designed/configured as a barrier mask 135. The barrier mask 135 also serves to block image information 140.1 intended for the left eye 2 from reaching the right eye 3, and to block image information 140.2 intended for the right eye 3 from reaching the left eye 2. In contrast to the lens array 131, in which the light rays 133 are deflected by refraction, the barrier mask 135 blocks the light rays 133 by absorption due to the geometric arrangement of the mask.

    [0094] FIG. 9 illustrates the method according to the invention for operating a household appliance 100, in particular a kitchen appliance 100 for preparing food, showing the detection 300 of position information 200 of a user 1 by a positioning device 120 as well as the reproduction 400 of image information 140 by a display 130 which is designed/configured to generate an impression of depth for the user 1. In this case, the image information 140 is dependent on at least the position information 200. Furthermore, FIG. 9 also shows the reproduction 500 of image information 141 which is specific to the operation of the household appliance by a display 130, wherein, in addition to the specific image information 141, previous image information 144 and/or subsequent image information 143 are shown, wherein in particular at least two of the image information items 141, 143, 144 differ in a display depth for the user 1.
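    One pass of the method steps 300 and 400 of FIG. 9 can be sketched as a simple detect-then-render sequence. The two callables below stand in for the positioning device 120 and the rendering performed for the display 130; their names and signatures are illustrative assumptions, not part of the disclosure.

```python
def operate_household_appliance(detect_position, render_for_position):
    """One pass of the method of FIG. 9: detection 300 of position
    information, then reproduction 400 of image information that
    depends on that position.

    detect_position     : callable standing in for the positioning
                          device 120, returning position information
    render_for_position : callable producing the image information for
                          the detected position (display 130)
    """
    position = detect_position()           # detection 300
    image = render_for_position(position)  # reproduction 400
    return position, image
```

In an actual appliance this pass would run continuously, so that the reproduced image information tracks the user as the position information changes.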

    [0095] The foregoing explanation of embodiments describes the present invention exclusively in the context of examples. Of course, individual features of the embodiments can be freely combined with each other, if technically feasible, without affecting the scope of the present invention.

    LIST OF REFERENCE SIGNS

    [0096] 1 User
    [0097] 2 Left eye
    [0098] 3 Right eye
    [0099] 4 Mobile identification transmitter
    [0100] 100 Household appliance, especially kitchen appliance
    [0101] 101 Main body
    [0102] 102 Control unit
    [0103] 110 Processing device
    [0104] 111 Stirring vessel
    [0105] 120 Positioning device
    [0106] 121 Input unit, especially rotary knob
    [0107] 122 Camera
    [0108] 123 Gesture sensor
    [0109] 124 Sound transducer
    [0110] 130 Display, especially 3D display
    [0111] 131 Optical component, especially lens array
    [0112] 132 Light unit
    [0113] 133 Light rays
    [0114] 134 Display housing
    [0115] 135 Barrier mask
    [0116] 136 Further display
    [0117] 140 Image information
    [0118] 140.1 Image information for the left eye
    [0119] 140.2 Image information for the right eye
    [0120] 141 Image information, especially specific to the household appliance
    [0121] 143 Subsequent image information
    [0122] 144 Previous image information
    [0123] 146 3D object
    [0124] 147 3D user object
    [0125] 150 User identification device
    [0126] 200 Position information
    [0127] 300 Detection
    [0128] 400, 500 Reproduction
    [0129] R Distance, especially of the user, to the display
    [0130] Z Height, especially of the user
    [0131] φ Azimuth angle
    [0132] θ Polar angle