LABORATORY AUTOMATION DEVICE WITH TRANSPARENT DISPLAY IN DOOR

20230228776 · 2023-07-20

Abstract

A laboratory automation device comprises a workspace with liquid containers and a pipetting arm for moving liquids between the liquid containers; a housing enclosing the workspace; a door of the housing for accessing the workspace, wherein the door comprises a transparent display for displaying information and for allowing a person to view into the workspace and a tracking sensor for tracking an eye position of the person. The laboratory automation device is adapted for determining the eye position of the person from sensor data acquired with the tracking sensor; and for displaying information for a notification area in the workspace on the transparent display, such that the information is displayed from a perspective of the person in front of the notification area.

Claims

1. A laboratory automation device, comprising: a workspace with liquid containers and a pipetting arm for moving liquids between the liquid containers; a housing enclosing the workspace; a door of the housing for accessing the workspace, wherein the door comprises a transparent display for displaying information and for allowing a person to view into the workspace; a tracking sensor for tracking an eye position of the person; wherein the laboratory automation device is adapted for: determining the eye position of the person from sensor data acquired with the tracking sensor; displaying information for a notification area in the workspace on the transparent display, such that the information is displayed from a perspective of the person in front of the notification area.

2. The laboratory automation device according to claim 1, wherein the notification area is projected towards the eye position onto the transparent display for determining a display area on the transparent display, in which the information is displayed.

3. The laboratory automation device according to claim 1, wherein the notification area is an area, in which a component of the laboratory automation device is situated; wherein information, which is displayed on the transparent display, is a status information of the component.

4. The laboratory automation device according to claim 1, wherein the notification area is an area, in which a component of the laboratory automation device is to be placed; wherein information, which is displayed on the transparent display, is a status information of the component to be placed.

5. The laboratory automation device according to claim 1, wherein the tracking sensor comprises a camera.

6. The laboratory automation device according to claim 1, wherein the tracking sensor comprises an eye tracking sensor.

7. The laboratory automation device according to claim 1, wherein the tracking sensor is adapted for determining that more than one person is in front of the door.

8. The laboratory automation device of claim 7, wherein the person, for whom the eye position is determined, is selected via gesture detection performed with sensor data of the tracking sensor.

9. The laboratory automation device according to claim 7, wherein the person, for whom the eye position is determined, is selected via the transparent display.

10. The laboratory automation device according to claim 1, wherein the door has a position sensor for determining a position of the door; wherein a display area on the transparent display, in which the information is displayed, is determined in dependence of the position of the door.

11. The laboratory automation device according to claim 1, wherein the transparent display comprises a touchscreen.

12. A method for operating a laboratory automation device of claim 1, the method comprising: determining the eye position of the person from sensor data acquired with the tracking sensor; displaying information for a notification area in the workspace on the transparent display, such that the information is displayed from a perspective of the person in front of the notification area.

13. A computer program for a controller of a laboratory automation device, which, when being executed by a processor, is adapted for performing the method of claim 12.

14. A computer-readable medium, in which a computer program according to claim 13 is stored.

15. A controller of a laboratory automation device adapted for performing the method of claim 12.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0040] Below, embodiments of the present invention are described in more detail with reference to the attached drawings.

[0041] FIG. 1 shows a schematic perspective view of a laboratory automation device according to an embodiment of the invention.

[0042] FIG. 2 shows a schematic cross-sectional view of a laboratory automation device according to a further embodiment of the invention.

[0043] FIGS. 3A, 3B and 3C show schematic cross-sectional views of a laboratory automation device according to a further embodiment of the invention.

[0044] FIG. 4 shows a flow diagram illustrating a method for operating a laboratory automation device according to an embodiment of the invention.

[0045] The reference symbols used in the drawings, and their meanings, are listed in summary form in the list of reference symbols. In principle, identical parts are provided with the same reference symbols in the figures.

DETAILED DESCRIPTION

[0046] FIG. 1 shows a laboratory automation device 10, which comprises a workbench 12 onto which several components 14 of the laboratory automation device 10 are mounted. The shown examples include a cartridge 14a with pipette tips 14b, a cartridge 14c with test tubes 16a, a microplate 14d with wells 16b and a container 16c containing a reagent 18. In general, some of the components 14 may be liquid containers 16.

[0047] The laboratory automation device 10 further comprises a pipetting arm 20 with a pipette 22, which may be moved in three dimensions, for example with the aid of motors. A sample may be aspirated with the pipette 22 from one of the test tubes 16a and dispensed into a well 16b. Analogously, the reagent 18 may be conveyed into the well 16b. With a gripper arm 23 equipped with grippers 24, the microplate 14d may be exchanged and moved into further devices, such as a heater, an optical analysis device, etc.

[0048] The components 14, 20, 22, 24 of the laboratory automation device 10 are arranged in a workspace 26 above the workbench 12, which is enclosed by a housing 28. At a front side, the housing 28 has a door 30, which can be opened for accessing the components 14, 20, 22, 24. A part of the door 30 or the complete door 30 is a transparent display 32, which allows a view into the workspace 26 and on which information 34 can be displayed. Such information may include symbols, text and images 34a as well as an outlining and/or highlighting 34b of components 14, 20, 22, 24.

[0049] Some or all of the information 34 may be displayed with respect to a specific viewing perspective of a person in front of the laboratory automation device 10, such that the information 34 is displayed near or in front of a component 14, 20, 22, 24 for the person.

[0050] In general, the information 34 may contain stationary information 34a, which is displayed at the same position on the transparent display 32 independently of the eye position of a person (and optionally independently of a position of the door 30). Furthermore, the information 34 may contain augmented reality information 34b, which is displayed on the transparent display 32 at a position that depends on the eye position of the person (and optionally on a position of the door 30).

[0051] For example, the outlining and/or highlighting 34b of a component 14, 20, 22, 24 may be displayed, such that it overlays the area at which the component 14, 20, 22, 24 is arranged in the workspace.

[0052] For determining the eye position, the laboratory automation device 10 comprises a tracking sensor 36, which comprises a camera 36a and/or an eye tracking sensor 36b. For example, besides a camera 36a, an eye tracking sensor may comprise one or more infrared light sources 36c. The tracking sensor 36 and in particular the camera 36a and the infrared light sources 36c may be attached to the housing 28.

[0053] FIG. 1 furthermore shows a controller 38 of the laboratory automation device 10, which may perform determining the eye position and rendering the information 34 to be displayed on the transparent display 32. The controller 38 may be a computer or embedded device in data communication with the laboratory automation device 10 or may be included into the laboratory automation device 10.

[0054] FIG. 2 shows an embodiment of a laboratory automation device 10 with a hinged door 30. The door 30 of the laboratory automation device 10 of FIG. 1 may be designed in such a way. The door 30 may be hinged about an axis A. In a first position 40a, the door 30 is closed and prevents the workspace 26 from being accessed through the opening that it closes. In a second position 40b, the door 30 is opened, such that a person 42 may view through the transparent display and may reach into the workspace 26, for example for exchanging a liquid container 16. There may be a further position, in which the door 30 is completely opened. The positions 40a, 40b of the door are determined with a position sensor 44.

[0055] The position sensor 44 may be used by the controller 38 for stopping the operation of the pipetting arm 20 and/or of the gripper 24 when the door 30 is not completely closed, i.e. not in the position 40a.
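The door interlock described in paragraph [0055] can be sketched as follows. This is a minimal illustration, not the patented implementation; the class and method names (`SafetyInterlock`, `on_door_position`, `stop`) and the enumeration of door positions are assumptions for the sketch.

```python
from enum import Enum


class DoorPosition(Enum):
    # Door positions as described for the hinged and sliding doors
    CLOSED = "closed"                      # position 40a
    WORKBENCH_ACCESSIBLE = "accessible"    # position 40b
    COMPLETELY_OPENED = "opened"           # position 40c


class SafetyInterlock:
    """Stops the moving parts whenever the door leaves the closed position."""

    def __init__(self, pipetting_arm, gripper):
        self._arm = pipetting_arm
        self._gripper = gripper

    def on_door_position(self, position: DoorPosition) -> bool:
        """Called with each reading of the position sensor.

        Returns True if motion was stopped because the door is not closed.
        """
        if position is not DoorPosition.CLOSED:
            self._arm.stop()
            self._gripper.stop()
            return True
        return False
```

Any object providing a `stop()` method can stand in for the arm and gripper controllers, so the interlock logic can be exercised independently of the hardware.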

[0056] FIG. 2 also illustrates how the controller 38 determines a display area 50 for the information 34b. With the known eye position 46, a notification area 48 in the workspace is projected onto the transparent display 32. This projection is used as the display area 50, in which the information 34b is displayed. The notification area 48 may be a volume in the workspace 26. The display area 50 may be a two-dimensional shape, such as a polygonal part of the display.

[0057] For performing the projection of a notification area 48, the controller 38 has to know the eye position 46 and the position of the transparent display 32. This position may be determined from the door position 40a, 40b or may be assumed to be constant.
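For the hinged door of FIG. 2, the position of the transparent display can be derived from the door position by rotating the display about the hinge axis A. The following sketch assumes, for illustration only, a hinge axis parallel to the x-axis and corner points given in a shared workspace coordinate system; the function names are hypothetical.

```python
import math


def rotate_about_hinge(point, angle_rad, hinge_y=0.0, hinge_z=0.0):
    """Rotates a 3D point (x, y, z) about a hinge axis parallel to the
    x-axis located at (y=hinge_y, z=hinge_z), by angle_rad."""
    x, y, z = point
    dy, dz = y - hinge_y, z - hinge_z
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    return (
        x,
        hinge_y + dy * cos_a - dz * sin_a,
        hinge_z + dy * sin_a + dz * cos_a,
    )


def display_corners(closed_corners, opening_angle_rad):
    """Corner points of the transparent display for a given opening angle.

    closed_corners: corner points of the display when the door is closed
    (position 40a); opening_angle_rad = 0 reproduces the closed position.
    """
    return [rotate_about_hinge(p, opening_angle_rad) for p in closed_corners]
```

With an opening angle of zero the closed-door geometry is recovered, which matches the alternative mentioned in paragraph [0057] of simply assuming a constant display position.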

[0058] FIGS. 3A, 3B and 3C show an embodiment of a laboratory automation device 10 with a sliding door 30. The door 30 of the laboratory automation device 10 of FIG. 1 may be designed in such a way. The sliding door 30 may prevent access to the workspace 26 in a first door position 40a (position “closed”), may allow access to the workspace 26 in a second door position 40b (position “work bench accessible”), and may allow access to the whole workspace 26 in a third door position 40c (position “completely opened”). Again, the display area 50 may be determined in dependence on the door position 40a, 40b, 40c. The sliding door may have different “work bench accessible” positions 40b varying in the height d of the access gap 52.

[0059] FIG. 4 shows a flow diagram for a method for operating a laboratory automation device 10, such as shown in the previous figures.

[0060] In step S10, when more than one person 42 is in front of the door 30, a person 42 is selected. The controller 38 may determine how many persons are in front of the door 30 by evaluating the video stream of the camera 36a. The camera 36a may be arranged such that its field of view is directed to an area in front of the door 30, where persons 42 are to be expected. A person 42 or the head of a person may be identified with object recognition.

[0061] It may be that a person 42 is selected via gesture detection performed with sensor data of the tracking sensor 36 and in particular the camera 36a. For example, a waving arm or a specific gesture may be determined with object recognition of the hand and/or arm and tracking its position.

[0062] It also may be that the person 42 is selected via the transparent display 32. The heads and/or faces of all persons 42 in the field of view of the camera 36a may be displayed on the transparent display 32, for example as stationary information 34a. The transparent display 32 may comprise a touchscreen and the person may be selected by touching the area with the respective stationary information 34a.
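The touchscreen-based selection of paragraph [0062] reduces to a point-in-rectangle test: each displayed head corresponds to a rectangular region of stationary information 34a, and the touch coordinates select the matching person. The following is a minimal sketch; the function name and the representation of the thumbnail regions are assumptions.

```python
def select_person(touch_xy, face_areas):
    """Selects a person via a touch on the transparent display.

    touch_xy:   (x, y) touch coordinates on the touchscreen.
    face_areas: mapping person_id -> (x_min, y_min, x_max, y_max), the
                rectangle of the stationary thumbnail shown for that person.

    Returns the id of the person whose thumbnail contains the touch point,
    or None if the touch hit no thumbnail.
    """
    tx, ty = touch_xy
    for person_id, (x0, y0, x1, y1) in face_areas.items():
        if x0 <= tx <= x1 and y0 <= ty <= y1:
            return person_id
    return None
```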

[0063] In step S12, the eye position 46 of the person 42 is determined from sensor data acquired with the tracking sensor 36. For example, the eye position 46 may be extracted from the image of the head of the person selected in step S10 or from a single person in front of the door 30. The eye position 46 may be determined via object recognition in the video stream from the camera 36a. It also may be that the infrared light sources 36c are used, whose reflections from the eye may be identified in the video stream.

[0064] The eye position 46 may be a three-dimensional coordinate that is provided with respect to a coordinate system, in which also positions of points inside the workspace 26 may be calculated and/or in which also the position of the transparent display 32 is known.

[0065] In step S14, augmented reality information 34b for a notification area 48 in the workspace 26 is displayed on the transparent display 32.

[0066] The controller 38 determines one or more notification areas 48, for which information 34b should be displayed. The notification area 48 may be an area, in which a component 14 of the laboratory automation device 10 is situated. The information 34b may be status information of the component 14, such as an error message or a message for the person to exchange or refill the component 14. It also may be that the notification area 48 is an empty area in the workspace 26, for example an area in which a component 14 of the laboratory automation device 10 is to be placed by the person 42. The information 34b then may be status information of the component 14 to be placed.

[0067] The controller 38 may maintain an internal model of the workspace 26 and the components 14 inside. This model may provide information at which positions which components 14 are situated. The model also may provide information on a bounding volume of the components. Such bounding volumes may be used as notification area 48. The internal model may be updated, when a component 14 is put into the workspace 26 and removed from the workspace 26. The internal model may be updated, when a component 14 changes its place, for example due to the operation of the pipetting arm 20 and/or the gripper 24.
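The internal workspace model of paragraph [0067] can be sketched as a small registry of components with axis-aligned bounding volumes, updated on placement, removal and movement. The data layout and names below are assumptions for illustration, not the patented data structure.

```python
from dataclasses import dataclass, field


@dataclass
class Component:
    name: str
    # Axis-aligned bounding volume: (x_min, y_min, z_min, x_max, y_max, z_max)
    bounding_volume: tuple


@dataclass
class WorkspaceModel:
    """Internal model of which components are where in the workspace."""
    components: dict = field(default_factory=dict)

    def place(self, component: Component):
        # Called when a component is put into the workspace.
        self.components[component.name] = component

    def remove(self, name: str):
        # Called when a component is taken out of the workspace.
        self.components.pop(name, None)

    def move(self, name: str, new_volume: tuple):
        # Called when e.g. the gripper relocates a component.
        self.components[name].bounding_volume = new_volume

    def notification_area(self, name: str) -> tuple:
        """The bounding volume of a component, used as its notification area."""
        return self.components[name].bounding_volume
```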

[0068] When the one or more notification areas 48 are determined, the notification areas 48 are projected towards the eye position 46 onto the transparent display 32 for determining a display area 50 on the transparent display 32. The transparent display 32 may be modelled as a rectangle whose position is determined by the position 40a, 40b, 40c of the door 30. The notification areas 48 may be projected onto this rectangle via central perspective with the eye position 46.
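The central-perspective projection of paragraph [0068] can be sketched as follows. For simplicity the sketch assumes the display rectangle lies in a vertical plane z = display_z of a shared workspace coordinate system (the eye position 46, the notification volume 48 and the display 32 are all expressed in this system, as in paragraph [0064]); the function names are hypothetical.

```python
def project_point(eye, point, display_z):
    """Projects a workspace point towards the eye onto the display plane
    z = display_z (central perspective with the eye as projection center).

    Returns the (x, y) coordinates of the intersection on the plane.
    """
    ex, ey, ez = eye
    px, py, pz = point
    # Parameter t where the ray eye -> point crosses the display plane.
    t = (display_z - ez) / (pz - ez)
    return (ex + t * (px - ex), ey + t * (py - ey))


def display_area(eye, bounding_volume, display_z):
    """Projects the 8 corners of an axis-aligned notification volume 48 and
    returns the enclosing rectangle (x_min, y_min, x_max, y_max) on the
    display plane, i.e. the display area 50."""
    x0, y0, z0, x1, y1, z1 = bounding_volume
    corners = [
        (x, y, z) for x in (x0, x1) for y in (y0, y1) for z in (z0, z1)
    ]
    projected = [project_point(eye, c, display_z) for c in corners]
    xs, ys = zip(*projected)
    return (min(xs), min(ys), max(xs), max(ys))
```

Because the eye position enters the projection directly, moving the eye (or, via display_z, the door) shifts the resulting display area, which is exactly the behavior described in paragraph [0070].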

[0069] In the end, the information 34b, which is shown in the respective display area 50, is rendered. The information 34b may comprise at least one of text, numbers, symbols, outlines, highlights, etc. It may be that the graphical information 34b blinks or changes color to catch the attention of the person 42.

[0070] Due to the projection, the information 34b is displayed from a perspective of the person 42 in front of the notification area 48. In such a way, the person can easily identify to which component 14 the information 34b belongs, in particular independently of the position 40a, 40b of the sliding door 30 (FIGS. 3A-3C) or the opening angle of the hinged door 30 (FIG. 2).

[0071] While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art and practising the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single processor or controller or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.