Surgical microscope with gesture control and method for a gesture control of a surgical microscope

11284948 · 2022-03-29

Abstract

The present invention relates to a surgical microscope (1) with a field of view (9) and comprising an optical imaging system (3) which images an inspection area (11) which is at least partially located in the field of view (9), and to a method for a gesture control of a surgical microscope (1) having an optical imaging system (3). Surgical microscopes (1) of the prior art have the disadvantage that an adjustment of the microscope parameters, for instance the working distance, field of view (9) or illumination mode, requires the surgeon to put down his or her surgical tools, look up from the microscope's eyepiece (5) and perform the adjustment by operating the microscope handles. Foot switches and mouth switches have also been proposed, but these are not intuitive to operate, require training and are less accurate than hand control of the microscope handles. Thus, the surgeon may lose his or her focus, is confronted with inconveniences and delays and bears an increased risk of contamination. In order to overcome the above disadvantages, the present invention provides a surgical microscope (1) which is characterized in that the surgical microscope (1) further comprises a gesture detection unit (17) for detection of a movement (65) of an object (29) such as at least one of a finger and a tool, and in that the gesture detection unit (17) is connected to the optical imaging system (3) via a signal line (76) and in that the gesture detection unit (17) is configured to output a control signal (77) via the signal line (76) to the optical imaging system (3) depending on the movement (65) of the object (29), the optical imaging system (3) being configured to alter its state depending on the control signal (77). The invention further provides a method for gesture control of the inventive surgical microscope (1).

Claims

1. A surgical microscope having a field of view and comprising an optical imaging system which images an inspection area which is at least partially located in the field of view, characterized in that the surgical microscope further comprises a gesture detection unit for detection of a movement of an object, in that the gesture detection unit is connected to the optical imaging system via a signal line and in that the gesture detection unit is configured to output a control signal via the signal line to the optical imaging system depending on the movement of the object, the optical imaging system being configured to alter its state depending on the control signal, wherein the object is selected from a group of objects including a first type of object and a second type of object different from the first type of object; wherein the gesture detection unit further comprises a relationship mapping unit which is connected to a movement detection module via a movement-data line, wherein the relationship mapping unit is adapted to map relationships between a movement of the object and a control signal, which control signal is provided via the signal line to the optical imaging system, and to store a plurality of relationship maps each respectively associated with one type of object in the group of objects, each of the plurality of relationship maps relating a plurality of predefined movements of the associated type of object to a plurality of corresponding control signals, wherein the gesture detection unit is adapted to determine the type of the object and the relationship mapping unit is adapted to select a corresponding relationship map from the plurality of relationship maps according to the type of the object, wherein the gesture detection unit is adapted to select one of the corresponding control signals in the selected relationship map according to the detected movement of the object, wherein detection of one of the plurality of predefined movements of the object results in selection of a first control signal when the object is determined to be of the first type, and detection of the same one of the plurality of predefined movements of the object results in selection of a second control signal different from the first control signal when the object is determined to be of the second type.

2. The surgical microscope according to claim 1, wherein the first type of object is a finger and the second type of object is a tool.

3. The surgical microscope according to claim 1, wherein the gesture detection unit comprises a detection zone which is located between the inspection area and the optical imaging system and which detection zone is spaced apart from the inspection area, wherein gesture detection of the gesture detection unit is limited to movements of the object in the detection zone.

4. The surgical microscope according to claim 3, wherein the detection zone overlaps the inspection area at least partially.

5. The surgical microscope according to claim 3, wherein the gesture detection unit further comprises a movement detection module for differentiating 3-dimensional movements of the object in the detection zone, wherein the movement detection module is connected to the gesture detection unit via a movement-data line and the movement detection module is adapted to output a movement-data signal via the movement-data line to the gesture detection unit depending on the movement of the object.

6. The surgical microscope according to claim 2, wherein the relationship map for the first type of object provides control signals for coarse adjustment of the optical imaging system, and the relationship map for the second type of object provides control signals for fine adjustment of the optical imaging system.

7. The surgical microscope according to claim 1, wherein the optical imaging system comprises at least one moveable mechanical component, wherein the position of the at least one moveable mechanical component depends on the control signal.

8. The surgical microscope according to claim 1, further comprising a projection unit optically connected to the optical imaging system, wherein the projection unit is adapted to project optional and/or interactive images on at least one interaction portion, wherein the at least one interaction portion at least partially overlaps with the imaged inspection area.

9. A method for a gesture control of a surgical microscope having an optical imaging system, comprising the steps of: optically detecting an object, wherein the object is selected from a group of objects including a first type of object and a second type of object different from the first type of object; determining the type of the object by comparing the object with predetermined object patterns; selecting one of at least two relationship maps according to the type of the object; contactlessly detecting a movement of the object with respect to a reference point; and controlling the optical imaging system depending on the movement of the object and the selected relationship map, wherein the optical imaging system is controlled according to a first control signal in response to the detected movement when the object is determined to be the first type of object, and the optical imaging system is controlled according to a second control signal different from the first control signal in response to the same detected movement when the object is determined to be the second type of object.

10. The method according to claim 9, wherein the first type of object is a finger and the second type of object is a tool.

11. The method according to claim 9, wherein the reference point is a gesture detection unit.

12. The method according to claim 9, further comprising the step of providing distance and/or movement data of the object to a computing unit.

13. The method according to claim 9, further comprising the step of differentiating the detected movement of the object.

14. The method according to claim 9, further comprising the step of comparing the detected movement of the object with predetermined movement patterns.

15. The method according to claim 9, further comprising the step of controlling at least one moveable mechanical component of the surgical microscope depending on the movement of the object.

16. The method according to claim 9, further comprising the steps of: projecting at least one optional and/or interactive image onto at least one interaction portion of an inspection area imaged by the optical imaging system; comparing the movement of the object with predetermined interaction movement patterns; and controlling the optical imaging system and/or the surgical microscope depending on the movement of the object within the at least one interaction portion.

17. The method according to claim 10, wherein the relationship map for the first type of object provides control signals for coarse adjustment of the optical imaging system, and the relationship map for the second type of object provides control signals for fine adjustment of the optical imaging system.

18. A non-transitory storage medium storing instructions for performing the method of claim 9.

Description

BRIEF DESCRIPTION OF THE DRAWING VIEWS

(1) In the figures,

(2) FIG. 1 shows a schematic side view of a first embodiment of the inventive surgical microscope;

(3) FIG. 2 shows a schematic front view of a second embodiment of the inventive surgical microscope;

(4) FIG. 3 shows different possible movement patterns of the object;

(5) FIG. 4 shows an exemplary adjustment of the working distance of the inventive surgical microscope; and

(6) FIG. 5 shows an exemplary application of one of four interaction portions for selection of an imaging mode.

DETAILED DESCRIPTION OF THE INVENTION

(7) FIG. 1 shows a schematic side view of a first embodiment of an inventive surgical microscope 1, which surgical microscope 1 comprises an optical imaging system 3, which optical imaging system 3 comprises, for instance, a stereoscopic eyepiece 4 comprising eyepieces 5, a stereoscopic assistant's eyepiece 4a comprising assistant's eyepieces 5a located at the side of the surgical microscope 1 and an objective 7. The optical imaging system 3 determines a field of view 9 which is a portion of the inspection area 11 which may be a patient 13.

(8) The field of view 9 is determined by the numerical aperture of the optical imaging system 3 and a working distance 15, which working distance 15 is measured along a z-axis. The field of view 9 extends along the x-axis and y-axis.

(9) The surgical microscope 1 further comprises a gesture detection unit 17 which is attached to the housing 19 of the surgical microscope 1. The gesture detection unit 17 acts as a reference point 38 in the embodiment shown in FIG. 1. In other embodiments the reference point 38 may for instance be located in the inspection area 11.

(10) The eyepieces 5 and the assistant's eyepieces 5a comprise sensors 6 which detect whether one of the eyepieces 5, 5a is used for viewing the imaged field of view 9. The sensors 6 are proximity sensors 6a in the embodiment shown.

(11) The gesture detection unit 17 comprises two signal lines 76, wherein one signal line 76 is an electrical line (solid) and a second signal line 76 is a wireless connection (dashed). Via those two signal lines 76, a control signal 77, illustrated by an arrow, is transmitted.

(12) The signal line 76 indicated by the solid line is connected to a moveable mechanical component which is a joint 82a of a holding arm 82b which holds the surgical microscope 1.

(13) The gesture detection unit 17 is tilted around the x-axis and faces towards the field of view 9, such that a gesture detection zone 21 at least partially overlaps with the viewing section 23 of the optical imaging system 3. The viewing section 23 is indicated by a dashed surrounding line.

(14) The objective 7 defines an optical axis 25 which is oriented parallel to the z-axis. It is noted that the optical axis 25 is oriented according to the objective 7 and not necessarily to further components (e.g. the eyepiece 5) of the optical imaging system 3.

(15) In FIG. 1, it is shown that the detection zone 21, which is a 3D-volume, is located at a distance to the field of view 9 of the inspection area 11. Only the movement of an object (not shown) within the surrounded volume inside the dashed line is detected, i.e. movements of an object above (with reference to the z-axis) or below the detection zone 21 will be ignored and will not generate a control signal.
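The zone test described above can be sketched as a simple bounds check. This is an illustrative sketch only; the patent does not specify the shape of the detection zone 21, so an axis-aligned box and all names and bounds below are assumptions.

```python
# Sketch of the detection-zone gate of FIG. 1: only object movements inside
# the 3D detection zone 21 generate a control signal; positions above or
# below the zone (along z) are ignored. Box shape and bounds are assumed.
from dataclasses import dataclass


@dataclass
class DetectionZone:
    """Axis-aligned 3D volume; z runs along the optical axis."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    z_min: float
    z_max: float

    def contains(self, point: tuple[float, float, float]) -> bool:
        x, y, z = point
        return (self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max
                and self.z_min <= z <= self.z_max)


def filter_movement(zone: DetectionZone, samples):
    """Keep only position samples inside the zone; samples outside the zone
    produce no control signal."""
    return [p for p in samples if zone.contains(p)]
```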

(16) FIG. 2 shows a schematic front view of a second embodiment of the inventive surgical microscope 1, in which the gesture detection unit 17 is integrated into the housing 19 of the surgical microscope 1. This embodiment does not comprise the stereoscopic assistant's eyepiece 4a.

(17) In the embodiment shown in FIG. 2, the 3D-camera 27 of the surgical microscope 1 provides the distance information of objects 29 via a signal line 76.

(18) The gesture detection unit 17 further comprises a movement detection module 78, which is provided by the signal line 76a with data from the 3D-camera 27.

(19) The movement detection module 78 provides a movement-data signal 80 via a movement-data line 79 to the gesture detection unit 17.

(20) Via the signal line 76, the gesture detection unit 17 outputs a control signal 77 to the optical imaging system 3. The control signal 77 and the movement-data signal 80 are indicated by arrows.

(21) The gesture detection unit 17 further comprises a relationship mapping unit 81, which maps a detected movement 65 to an operation to be performed by the optical imaging system 3 or the moveable mechanical component 82.
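The behavior of the relationship mapping unit 81 can be sketched as one lookup table per object type, so that the same gesture selects a different control signal for a finger than for a tool (cf. claims 1, 6 and 17). The map contents and all names below are assumed examples, not taken from the patent.

```python
# Illustrative sketch of the relationship mapping unit 81: one relationship
# map per object type relates predefined movements to control signals.
# Per claims 6 and 17, a finger could trigger coarse adjustment and a tool
# fine adjustment of the optical imaging system; the concrete entries here
# are assumptions for illustration.
RELATIONSHIP_MAPS = {
    "finger": {            # first type of object: coarse adjustment
        "vertical": "coarse_focus",
        "circular": "coarse_zoom",
    },
    "tool": {              # second type of object: fine adjustment
        "vertical": "fine_focus",
        "circular": "fine_zoom",
    },
}


def select_control_signal(object_type: str, movement: str) -> str:
    """Select the relationship map for the determined object type, then the
    control signal for the detected movement (claim 1)."""
    relationship_map = RELATIONSHIP_MAPS[object_type]
    return relationship_map[movement]
```

Note that, as claim 1 requires, the same detected movement yields different control signals depending on the determined object type.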

(22) The surgical microscope 1 further comprises a projection unit 83, which is adapted to project optional and/or interactive images 71 (not shown) onto the imaged inspection area. In the embodiment shown, the projection unit 83 utilizes the beam splitters 33 for being optically coupled to the optical imaging system 3.

(23) In FIG. 2, the object 29 is a scalpel 31, which is located inside the detection zone 21 and whose movements are detected by the gesture detection unit 17.

(24) The 3D-camera 27 is only schematically shown; only the beam splitters 33 and the 2D-sensors 35 are indicated. The 2D-sensors 35 may be CCD cameras. Due to the different viewing sections 23a and 23b of the two channels, a three-dimensional image of the field of view 9 may be generated, imaged and viewed via the eyepiece 5.

(25) In the embodiment of the surgical microscope 1 shown in FIG. 2, the gesture detection unit 17 receives data from the 3D-camera 27, in particular from the 2D-sensors 35, and assesses the distance data of the scalpel 31, wherein the distance data correspond to the distance 37 of the tip of the scalpel 31.
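The patent states that the distance 37 is assessed from the two 2D-sensors 35 but does not spell out the computation. A standard way to obtain depth from two rectified camera channels is stereo triangulation; the following is a minimal sketch under the pinhole-camera assumption, with illustrative parameter values.

```python
# Sketch of distance assessment from the two channels of the 3D-camera 27
# (not specified in the patent): for rectified stereo views with focal
# length f (in pixels) and baseline B, a feature seen with disparity d
# lies at depth Z = f * B / d along the optical axis.
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth along the optical axis from rectified-stereo disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_mm / disparity_px
```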

(27) FIG. 3 shows a schematic drawing illustrating different movement patterns of the object (not shown), wherein one-hand gestures 39 and two-hand gestures 41 may be distinguished.

(28) Possible one-hand gestures 39 are movements of the object, for instance a vertical gesture 43 along the y-axis, a horizontal gesture 45, a combined 2D-gesture 47, a circular gesture 49 and an upward gesture 51 during which the object is moved along the z-direction in a positive sense of the z-axis.

(29) Possible two-hand gestures 41 are displacement gestures 53a and 53b, a zoom-in gesture 55 and a zoom-out gesture 57.

(30) A possible two-hand gesture 41 along the z-axis is the second upward gesture 59, in which both objects (not shown) are moved along the z-direction; the drawn arrows point in different directions due to the perspective chosen for the figure.

(31) FIG. 3 furthermore shows a rotational gesture 49a, which is one possible one-hand gesture 39 and a circular gesture 49 as well. The rotational gesture 49a is performed in the x-y plane by rotating the scalpel 31 from a first rotational position 31a to a second rotational position 31b and into a third rotational position 31c.
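Differentiating the detected movement and comparing it with predetermined movement patterns (claims 13 and 14) could be sketched, in its simplest form, as classifying a trajectory by the dominant axis of its net displacement. The thresholds-free rule and pattern names below are assumptions for illustration, not the patent's method.

```python
# Minimal sketch of movement differentiation (claims 13-14): classify a
# detected trajectory as a horizontal gesture 45 (x), vertical gesture 43
# (y) or upward gesture 51 (z) from the dominant component of the net
# displacement. Real pattern matching would compare whole trajectories.
def classify_gesture(trajectory: list[tuple[float, float, float]]) -> str:
    """Return the gesture name for the dominant net-displacement axis."""
    x0, y0, z0 = trajectory[0]
    x1, y1, z1 = trajectory[-1]
    dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
    # Pick the axis with the largest absolute net displacement.
    _, name = max((abs(dx), "horizontal"),
                  (abs(dy), "vertical"),
                  (abs(dz), "upward"))
    return name
```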

(32) FIG. 4 shows an exemplary operation which is performed by the optical imaging system depending on the movement of a scalpel 31.

(33) FIG. 4 shows the image of the field of view 9 in two states of the optical imaging system 3. In the unfocused state 61 the image is blurred due to an incorrect working distance and consequently misaligned focus of the surgical microscope 1.

(34) In the unfocused state 61, the scalpel 31 is located in the detection zone 21, which is, however, not visible in FIG. 4 (see for instance FIG. 2); movements of the scalpel 31 are thus detected by the gesture detection unit 17 (not shown), which in turn generates a control signal for altering the state of the optical imaging system 3.

(35) The scalpel 31 performs a movement 65 which is indicated by an arrow, which movement 65 starts at a start position 67 and ends at an end position 69 of the scalpel 31.

(36) The detected movement generates a control signal which initiates a change of the working distance 15 of the optical imaging system 3, which in turn adjusts the focus of the surgical microscope 1.

(37) If the scalpel 31 is moved from the start position 67 to the end position 69, the movement 65 of the scalpel 31 is detected as an online gesture, i.e. the surgeon directly sees the change of the state of the surgical microscope 1 during the movement 65.

(38) Upon arriving at a focused state 63 of the field of view 9 imaged by the optical imaging system 3, the surgeon stops the movement of the scalpel, which stopped position corresponds to the end position 69.
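The online gesture of FIG. 4 can be sketched as a loop that applies each incremental displacement of the tool to the working distance as soon as it is observed, so the surgeon sees the focus change during the movement and simply stops at the end position 69. The gain and the proportional update rule are illustrative assumptions.

```python
# Sketch of the 'online' gesture of FIG. 4: the working distance 15 is
# updated continuously while the scalpel 31 moves from the start position
# 67 toward the end position 69. Gain and update rule are assumed.
def online_focus_control(working_distance: float,
                         z_samples: list[float],
                         gain: float = 1.0) -> list[float]:
    """Apply each incremental z-displacement of the tool to the working
    distance as it is observed; return the sequence of resulting states."""
    states = []
    for prev, curr in zip(z_samples, z_samples[1:]):
        working_distance += gain * (curr - prev)
        states.append(working_distance)
    return states
```

Stopping the tool simply stops producing increments, leaving the working distance at its last value, which matches the behavior described for the focused state 63.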

(39) FIG. 5 shows an exemplary application of interaction portions for selecting one out of several imaging modes.

(40) In FIG. 5, the same field of view 9 of a focused state 63 is shown, where a projection unit (not shown) projects four opaque images 71 overlaying the imaged inspection area, which images 71 define interaction portions 73 of the field of view 9. The images 71 and interaction portions 73 are labeled only in the first view 75a of FIG. 5.

(41) The images 71 contain a visual feedback of the operation of the optical imaging system 3 which can be activated by the corresponding interaction portion 73.

(42) In FIG. 5, the shown interaction portions 73 correspond to different imaging modes, wherein in the first view 75a, the scalpel 31 is positioned outside the interaction portions 73 and subsequently moved into the left lower interaction portion 73a, which is shown in the second view 75b.

(43) In the third view 75c, a movement 65 of the scalpel 31 out of the drawing plane, i.e. along the z-axis, is indicated by an arrow. This gesture is detected by the gesture detection unit 17 (not shown), which generates a control signal processed by a computing unit (not shown), which subsequently initiates a change of the imaging mode to the mode corresponding to the left lower interaction portion 73a.

(44) The imaging mode selected in the third view 75c of FIG. 5 is the mode ‘MFL 400’. After recognition of the movement 65 of the scalpel 31, activation of the function corresponding to the left lower interaction portion 73a is indicated by projecting an alternate, i.e. modified, image 71 in the left lower interaction portion 73a.

(45) The movement 65 shown in the third view 75c is an offline movement, i.e. the operation corresponding to the activation of the according interaction portion 73 is only performed after completion of the movement 65.
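In contrast to the online gesture of FIG. 4, the offline gesture dispatches its operation only once the movement has completed. A minimal sketch: after the movement ends, the end position is tested against the interaction portions 73 and the corresponding mode is selected. The portion bounds and mode names are assumed for illustration.

```python
# Sketch of the 'offline' gesture of FIG. 5: the operation bound to an
# interaction portion 73 is performed only after completion of the
# movement 65. Portions are modeled as (x0, y0, x1, y1) rectangles in the
# image plane; bounds and mode names are illustrative assumptions.
def dispatch_offline(end_position: tuple[float, float],
                     portions: dict[str, tuple[float, float, float, float]]):
    """Return the imaging mode of the interaction portion containing the
    end position of the completed movement, or None if outside all."""
    x, y = end_position
    for mode, (x0, y0, x1, y1) in portions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return mode
    return None
```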

(46) As already mentioned at the beginning of the explanation of the figures, the embodiments shown in FIGS. 1-5 are to be understood as exemplary and do not limit the scope of protection of the present disclosure. Therefore, any number of interaction portions 73, one-hand gestures 39, two-hand gestures 41 and functional sections of the field of view 9 are conceivable.

REFERENCE NUMERALS

(47)
1 surgical microscope
3 optical imaging system
4 stereoscopic eyepiece
4a stereoscopic assistant's eyepiece
5 eyepiece
5a assistant's eyepiece
6 sensor
6a proximity sensor
7 objective
9 field of view
11 inspection area
13 patient
15 working distance
17 gesture detection unit
19 housing
21 detection zone
23, 23a, 23b viewing section
25 optical axis
27 3D-camera
29 object
31 scalpel
31a first rotational position
31b second rotational position
31c third rotational position
33 beam splitter
35 2D-sensor
37 distance
38 reference point
39 one-hand gestures
41 two-hand gestures
43 vertical gesture
45 horizontal gesture
47 combined 2D-gesture
49 circular gesture
49a rotational gesture
51 upward gesture
53a, 53b displacement gestures
55 zoom-in gesture
57 zoom-out gesture
59 second upward gesture
61 unfocused state
63 focused state
65 movement
67 start position
69 end position
71 image
73 interaction portion
73a left lower interaction portion
75a first view
75b second view
75c third view
76, 76a signal line
77 control signal
78 movement detection module
79 movement-data line
80 movement-data signal
81 relationship mapping unit
82 moveable mechanical components
82a joint
82b holding arm
83 projection unit
x x-axis
y y-axis
z z-axis