OBSERVATION SYSTEM AND OBSERVATION SUPPORT METHOD
20220343560 · 2022-10-27
CPC classification
G02B21/368 · G06V20/69 · G02B21/361 · G06V10/24
Abstract
An observation system includes: an observation device that includes an eyepiece lens and an objective and forms a real image of a sample on an optical path between the eyepiece lens and the objective; and an observation auxiliary device that is worn by a user and outputs auxiliary information to the user, the observation auxiliary device superimposing the auxiliary information on a virtual image of the sample to be observed by the user through the eyepiece lens on the basis of a relative position of the observation auxiliary device with respect to the observation device.
Claims
1. An observation system comprising: an observation device that includes an eyepiece lens and an objective and that forms a real image of a sample on an optical path between the eyepiece lens and the objective; and an observation auxiliary device that is worn by a user and outputs auxiliary information to the user, the observation auxiliary device superimposing the auxiliary information on a virtual image of the sample to be observed by the user via the eyepiece lens on a basis of a relative position of the observation auxiliary device with respect to the observation device.
2. The observation system according to claim 1, wherein the observation auxiliary device changes an output position of the auxiliary information on a basis of a distance and a direction from an optical axis of the eyepiece lens to the observation auxiliary device.
3. The observation system according to claim 1, wherein the observation auxiliary device stops output of the auxiliary information on a basis of a distance from the observation device to the observation auxiliary device.
4. The observation system according to claim 1, wherein one of the observation device and the observation auxiliary device includes a transmitter that transmits a signal for detecting the relative position, and the other of the observation device and the observation auxiliary device includes a receiver that receives the signal for detecting the relative position.
5. The observation system according to claim 4, wherein the signal includes at least one of an electromagnetic wave outside a visible range or an ultrasonic wave.
6. The observation system according to claim 4, wherein the observation device and the observation auxiliary device detect the relative position by three-point positioning.
7. The observation system according to claim 1, wherein the observation auxiliary device includes: a retina projection device that projects an image on a retina of the user; and an imaging device that captures an image of the retina, and the observation auxiliary device detects the relative position on a basis of the image of the retina captured by the imaging device during a period in which the retina projection device projects the image for detecting the relative position on the retina.
8. The observation system according to claim 7, wherein the observation auxiliary device detects the relative position on a basis of the image for detecting the relative position projected on the retina and the image of the sample projected on the retina, each of which is included in the image of the retina.
9. The observation system according to claim 7, wherein the retina projection device projects the image for detecting the relative position on the retina with light outside a visible range.
10. The observation system according to claim 9, wherein the light outside the visible range is infrared light.
11. The observation system according to claim 1, wherein the observation auxiliary device includes at least one of smart glasses, a see-through type head mounted display, or a smart contact lens.
12. The observation system according to claim 1, wherein the observation auxiliary device includes a device using at least one of a retinal projection method, a retinal scanning method, a hologram method, or a micro LED method.
13. The observation system according to claim 1, wherein the observation auxiliary device further includes a sensor that detects an orientation of the observation auxiliary device, and the observation auxiliary device outputs an image of the sample captured by the observation device as the auxiliary information when the relative position and the orientation detected by the sensor satisfy a predetermined condition.
14. The observation system according to claim 1, wherein the observation auxiliary device outputs an image indicating a position of the observation device as the auxiliary information when the relative position satisfies a predetermined condition.
15. The observation system according to claim 1, wherein the observation device has an eye relief of equal to or longer than 40 mm.
16. The observation system according to claim 2, wherein one of the observation device and the observation auxiliary device includes a transmitter that transmits a signal for detecting the relative position, and the other of the observation device and the observation auxiliary device includes a receiver that receives the signal for detecting the relative position.
17. The observation system according to claim 2, wherein the observation auxiliary device includes: a retina projection device that projects an image on a retina of the user; and an imaging device that captures an image of the retina, and the observation auxiliary device detects the relative position on a basis of the image of the retina captured by the imaging device during a period in which the retina projection device projects the image for detecting the relative position on the retina.
18. An observation support method comprising: by an observation device including an eyepiece lens and an objective, forming a real image of a sample on an optical path between the eyepiece lens and the objective; and by an observation auxiliary device worn by a user, outputting auxiliary information to the user and superimposing the auxiliary information on a virtual image of the sample to be observed by the user through the eyepiece lens on a basis of a relative position of the observation auxiliary device with respect to the observation device.
Description
BRIEF DESCRIPTION OF DRAWINGS
[0009] The present invention will become more apparent from the following detailed description when read in conjunction with the accompanying drawings.
DESCRIPTION OF EMBODIMENTS
[0033] In order to use the technique described in WO 2018/231204 A, a projection device needs to be incorporated in an existing observation device. It is therefore necessary to significantly change a configuration of the existing observation device.
[0034] In view of such circumstances, embodiments of the present invention will be described hereinafter.
[0035] (First embodiment)
[0037] As illustrated in
[0038] The observation device 100 is, for example, a microscope device. The observation device 100 includes an objective 101 and an eyepiece lens 102. The observation device 100 forms an optical image (real image) of a sample on an optical path between the objective 101 and the eyepiece lens 102. As illustrated in
[0039] In addition, the observation device 100 may include a digital camera 103. Using a trinocular tube, the observation device 100 may split the light from the objective 101 into an optical path that guides it to the eyepiece lens 102 and an optical path that guides it to the digital camera 103. A digital image 2 acquired by the digital camera 103 is output to, for example, a control device 105 that controls a microscope main body.
[0040] In the observation device 100, the control device 105 may generate auxiliary information 3 on the basis of a result of image analysis performed on the digital image 2. The auxiliary information 3 may be information for displaying an outline of a specific target (for example, cells, cell nuclei, etc.) included in the sample as illustrated in
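The outline-type auxiliary information described above could, for instance, be derived from a segmented digital image as in the following sketch. The segmentation input, the function name, and the 4-neighbour boundary rule are illustrative assumptions; the patent does not specify an image-analysis algorithm.

```python
# Hypothetical sketch of generating outline auxiliary information from a
# segmented digital image, as the control device 105 might do.
import numpy as np

def outline_mask(cell_mask: np.ndarray) -> np.ndarray:
    """Return a boolean mask of the boundary pixels of segmented cells.

    A pixel is on the outline if it belongs to a cell but at least one of
    its four neighbours does not.
    """
    m = cell_mask.astype(bool)
    interior = m.copy()
    # A pixel stays interior only if all four neighbours are cell pixels.
    interior[1:, :] &= m[:-1, :]
    interior[:-1, :] &= m[1:, :]
    interior[:, 1:] &= m[:, :-1]
    interior[:, :-1] &= m[:, 1:]
    return m & ~interior

# Example: a 5x5 image with a 3x3 cell; the outline is the 8-pixel ring.
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 1:4] = True
print(int(outline_mask(mask).sum()))  # 8
```

The resulting boundary pixels would then be rendered as the auxiliary information 3 superimposed on the optical image.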
[0041] The observation device 100 may further include a transmitter 104 for position detection. The transmitter 104 may be used in combination with a receiver 201, which will be described later, to detect a relative position of the observation auxiliary device 200 with respect to the observation device 100. Hereinafter, the relative position of the observation auxiliary device 200 with respect to the observation device 100 will be simply referred to as a relative position. To determine how far the user's eyes are from the eye point, the transmitter 104 is preferably provided at or near the eyepiece lens 102 as illustrated in
[0042] The observation auxiliary device 200 is, for example, smart glasses. The observation auxiliary device 200 is a wearable device worn by the user of the observation device 100 and outputs auxiliary information to the user when the user uses the observation device 100. The observation auxiliary device 200 need only output the auxiliary information in a mode that allows the user to visually recognize it. More specifically, the observation auxiliary device 200 need only output the auxiliary information in such a manner that the user can visually recognize it while observing the optical image of the sample through the eyepiece lens 102.
[0043] Thus, the observation auxiliary device 200 is not limited to the smart glasses and may include, for example, a see-through type head mounted display, may include a contact lens type device (smart contact lens) directly attached to the eyeball, or may include at least one of these. Note that a method of outputting the auxiliary information to the user is not particularly limited. The observation auxiliary device 200 may include, for example, a device using at least one of a retinal projection method, a retinal scanning method, a hologram method, or a micro LED method and may output the auxiliary information to the user using each method.
[0044] As illustrated in
[0045] The observation auxiliary device 200 may include the receiver 201 for position detection. The receiver 201 may be used to detect the relative position by being used in combination with the transmitter 104 described above. The method of position detection by the transmitter 104 and the receiver 201 is not particularly limited. For example, three or more transmitters 104 may be provided in the observation device 100, and three-point positioning in which positioning is performed by signals from the transmitters 104 may be used as a positioning method. In addition, three or more receivers 201 may be provided in the observation auxiliary device 200, and three-point positioning in which positioning is performed by signals received by the receivers 201 may be used as a positioning method. The observation device 100 and the observation auxiliary device 200 may detect the relative position by three-point positioning.
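The three-point positioning mentioned above can be illustrated with a standard 2-D trilateration sketch, assuming the receiving side can estimate its range to three transmitters at known positions. The linearisation used here is textbook material, not taken from the patent, and all coordinates are illustrative.

```python
# Minimal 2-D trilateration sketch: recover a position from three known
# anchor positions (e.g. transmitters 104) and measured ranges.
import numpy as np

def trilaterate(anchors: np.ndarray, dists: np.ndarray) -> np.ndarray:
    """Solve for a 2-D position given three anchors and their ranges."""
    p1, d1 = anchors[0], dists[0]
    # Subtracting the first range equation from the others removes the
    # quadratic term and leaves a linear system A x = b.
    A = 2.0 * (anchors[1:] - p1)
    b = (d1**2 - dists[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p1**2))
    return np.linalg.solve(A, b)

anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
true_pos = np.array([1.0, 1.0])
dists = np.linalg.norm(anchors - true_pos, axis=1)
print(np.round(trilaterate(anchors, dists), 6))  # [1. 1.]
```

In practice the ranges would be derived from the signals exchanged between the transmitter 104 and the receiver 201 (for example, from time of flight or signal strength).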
[0047] When the system 1000 starts the processing illustrated in
[0048] In addition, the system 1000 acquires auxiliary information (step S20). For example, the system 1000 may acquire the auxiliary information by performing the auxiliary information acquisition processing illustrated in
[0049] Note that, while
[0050] In addition, the system 1000 detects the relative position (step S30). For example, the system 1000 may detect the relative position by performing the relative position detection processing illustrated in
[0051] Note that while in
[0052] When the processing from step S10 to step S30 ends, the system 1000 outputs the auxiliary information and superimposes it on the optical image (step S40). Note that the order of the processing from step S10 to step S30 is not limited to this example. The processing from step S10 to step S30 need only be performed before the processing of step S40 and may be performed in an order different from the order of the processing illustrated in
[0053] In step S40, the system 1000 may superimpose the auxiliary information on the optical image, for example, by performing the auxiliary information output processing illustrated in
[0054] Finally, the observation auxiliary device 200 outputs the auxiliary information acquired in step S23 in accordance with the output position determined in step S42 (step S43). As a result, the output position of the auxiliary information is appropriately changed on the basis of the distance and the direction calculated in step S41, so that the auxiliary information can be output to a position corresponding to the position of the observation auxiliary device 200 with respect to the optical axis of the eyepiece lens 102.
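Steps S41 and S42 could be sketched as follows, under the simplifying assumption of a linear pixels-per-millimetre mapping between the device's lateral offset from the optical axis of the eyepiece lens 102 and the overlay shift. All names and constants are illustrative; the patent does not specify the mapping.

```python
# Hypothetical sketch of determining the output position of the auxiliary
# information from the distance and direction to the optical axis.
def overlay_position(center_px, lateral_offset_mm, px_per_mm=10.0):
    """Shift the nominal overlay centre opposite to the device's lateral
    displacement from the optical axis, so the auxiliary information stays
    registered on the virtual image seen through the eyepiece."""
    cx, cy = center_px
    dx, dy = lateral_offset_mm
    return (cx - dx * px_per_mm, cy - dy * px_per_mm)

# Device displaced 0.5 mm and 0.2 mm from the optical axis:
print(overlay_position((640, 360), (0.5, 0.2)))  # (635.0, 358.0)
```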
[0055] The system 1000 can display the auxiliary information 3 at an appropriate position on the optical image 1 by performing the processing illustrated in
[0056] Furthermore, in the system 1000, the observation device 100 can be configured without making a significant change to the existing observation device. Specifically, the observation device 100 can be configured only by adding the transmitter 104 to an existing observation device, for example. Other necessary changes can be achieved by software processing to be performed by the control device 105.
[0057] Thus, according to the system 1000, it is possible to provide the user with the auxiliary information together with the optical image without significantly changing the configuration of the existing observation device. In particular, the auxiliary information can be superimposed at an appropriate position on the optical image, so that the user can correctly recognize the auxiliary information. In addition, an increase in the size of the device as compared with an existing device can be avoided.
[0058] In addition, it is not necessary to significantly change the configuration of existing smart glasses for the observation auxiliary device 200. The observation auxiliary device 200 can be configured by, for example, only adding the receiver 201 to existing smart glasses. Other necessary changes can be achieved by software processing to be performed by a processor included in the observation auxiliary device 200.
[0059] Thus, according to the system 1000, existing smart glasses used for other applications can be used as the observation auxiliary device 200 included in the system 1000 with necessary modifications added. Thus, the user can use one pair of smart glasses in a plurality of systems including the system 1000, so that it is possible to avoid inconvenience such as having to replace the smart glasses in accordance with the system to be used.
[0061] The observation auxiliary device 200 calculates a distance (hereinafter referred to as a first distance) from the observation device 100 to the observation auxiliary device 200 on the basis of the relative position calculated in step S33 (step S51) and determines whether or not the calculated first distance is equal to or longer than a predetermined distance (step S52). The predetermined distance need only be a distance indicating that the user (observation auxiliary device 200) is far enough from the observation device 100 that the user can be determined not to be looking into the eyepiece lens 102. More specifically, the predetermined distance is determined in advance on the basis of, for example, an eye relief, which is the distance from the eyepiece lens 102 to the eye point. In other words, in step S52, the observation auxiliary device 200 determines whether or not the distance from the eyepiece lens 102 to the user's eye (for example, the pupil) is sufficiently longer than the eye relief.
[0062] In a case where it is determined in step S52 that the first distance is equal to or longer than the predetermined distance (step S52: Yes), the observation auxiliary device 200 further determines whether the first distance has increased beyond the predetermined distance compared with the first distance calculated last time, that is, whether the threshold has just been crossed from below (step S53). In a case where it is determined that the first distance has increased beyond the predetermined distance (step S53: Yes), the observation auxiliary device 200 stops output of the auxiliary information 3 (step S54), and thereafter ends the auxiliary information output processing illustrated in
[0063] On the other hand, in a case where it is determined in step S52 that the first distance is not equal to or longer than the predetermined distance (step S52: No), the observation auxiliary device 200 calculates a distance (hereinafter, referred to as a second distance) and a direction from the optical axis of the eyepiece lens 102 to the observation auxiliary device 200 on the basis of the relative position calculated in step S33 (step S55) and determines an output position of the auxiliary information on the basis of the calculated second distance and direction (step S56). Note that the processing in step S55 and step S56 is similar to the processing in step S41 and step S42 in
[0064] Thereafter, the observation auxiliary device 200 determines whether the first distance has decreased below the predetermined distance compared with the first distance calculated last time (step S57). In a case where it is determined that the first distance has decreased below the predetermined distance (step S57: Yes), the observation auxiliary device 200 starts outputting the auxiliary information 3 (step S58), outputs the auxiliary information in accordance with the output position determined in step S56 (step S59), and then ends the auxiliary information output processing illustrated in
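The gating behaviour of steps S51 to S54 and S57 to S58 amounts to edge-triggered switching on the first distance: output stops when the distance rises past the threshold and starts again when it falls back below it. A minimal sketch, with the class name and threshold value as illustrative assumptions:

```python
# Hypothetical sketch of the auxiliary-information output gate driven by
# the first distance (observation device to observation auxiliary device).
class AuxOutputGate:
    def __init__(self, threshold_mm: float):
        self.threshold = threshold_mm
        self.prev = None
        self.outputting = False

    def update(self, first_distance_mm: float) -> bool:
        """Return whether auxiliary information should currently be output."""
        prev = self.prev
        self.prev = first_distance_mm
        if prev is None:
            self.outputting = first_distance_mm < self.threshold
            return self.outputting
        # Crossed above the threshold: stop output (steps S52-S54).
        if first_distance_mm >= self.threshold > prev:
            self.outputting = False
        # Crossed back below the threshold: start output (steps S57-S58).
        elif first_distance_mm < self.threshold <= prev:
            self.outputting = True
        return self.outputting

gate = AuxOutputGate(threshold_mm=60.0)
print([gate.update(d) for d in (40.0, 50.0, 70.0, 65.0, 55.0)])
# [True, True, False, False, True]
```

Triggering only on threshold crossings, rather than on the absolute distance, avoids repeatedly re-issuing start and stop commands while the user hovers on one side of the threshold.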
[0065] In the auxiliary information output processing illustrated in
[0066] As a result, in a case where the user's eyes are located at the eye point, the auxiliary information 3 is output from the observation auxiliary device 200, so that, as illustrated in
[0067] Even in a case where the system 1000 performs the auxiliary information output processing illustrated in
[0069] As illustrated in
[0070] As illustrated in
[0072] In the example illustrated in
[0073] In the example illustrated in
[0074] (Second Embodiment)
[0076] The observation device 100a is different from the observation device 100 in that the observation device 100a includes a reception unit 150 instead of the transmitter unit 140. The observation auxiliary device 200a is different from the observation auxiliary device 200 in that the observation auxiliary device 200a includes a transmitter unit 250 instead of the reception unit 240.
[0077] In other words, the present system is different from the system 1000 in that the observation device 100a functions as the reception side of a position signal for detecting the relative position, and the observation auxiliary device 200a functions as the transmission side. Furthermore, in the present system, the relative position is calculated by the observation device 100a, and thus information regarding the relative position is transmitted from the observation device 100a to the observation auxiliary device 200a together with the auxiliary information. Effects similar to those of the system 1000 can be obtained by the present system.
[0078] (Third Embodiment)
[0080] The observation device 100b is different from the observation device 100 in that the transmitter unit 140 is not included. The observation auxiliary device 200b is different from the observation auxiliary device 200 in that the observation auxiliary device 200b includes an imaging unit 270 instead of the reception unit 240. The imaging unit 270 is an imaging device that captures an image of the retina of the user wearing the observation auxiliary device 200b, and the control unit 210 detects the relative position on the basis of the image captured by the imaging unit 270. In other words, the present system is different from the system 1000 in that the relative position is detected by the observation auxiliary device 200b alone.
[0082] The imaging unit 270 includes an imaging element 282, a half mirror 283, the lens 284, and the concave mirror 285. The light from the retina passes through the pupil and is incident on the half mirror 283 via the concave mirror 285 and the lens 284. Furthermore, the imaging element 282 is irradiated with the light reflected by the half mirror 283, whereby the image of the retina is projected on the imaging element 282.
[0083] When the imaging unit 270 captures an image of the retina, the output unit 260 projects an image for position detection on the retina. Specifically, for example, when the user approaches the eyepiece lens 102, the output unit 260 displays an annular ring having a size corresponding to the field number of the eyepiece lens 102 on the display element 281 and projects an image 10 for detecting the relative position on the retina. The output unit 260 preferably projects the image 10 for detecting the relative position on the retina with light outside the visible range by causing the display element 281 to emit such light. As the light outside the visible range, for example, infrared light is preferably used. The imaging unit 270 captures an image of the retina during a period in which the image 10 for detecting the relative position is projected on the retina by the output unit 260. As a result, an image including the image 9 projected on the retina via the eyepiece lens 102 and the image 10 for detecting the relative position is captured by the imaging unit 270.
[0084] In the observation auxiliary device 200b, the control unit 210 detects the relative position on the basis of the image of the retina captured by the imaging unit 270 during the period in which the image 10 for detecting the relative position is projected on the retina by the output unit 260. More specifically, the relative position may be detected on the basis of the image 9 and the image 10, for example by comparing their center positions.
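The centre-comparison approach just described can be sketched as follows, assuming the image 9 and the image 10 have already been segmented from the captured retina image as binary masks. Centroid-of-pixels estimation and all numeric values are illustrative assumptions, not from the patent.

```python
# Hypothetical sketch: estimating the relative offset from the centres of
# the sample image (image 9) and the position-detection ring (image 10)
# found in the captured retina image.
import numpy as np

def center_of(mask: np.ndarray) -> np.ndarray:
    """Centroid (row, col) of the True pixels of a mask."""
    ys, xs = np.nonzero(mask)
    return np.array([ys.mean(), xs.mean()])

def relative_offset(image9_mask, image10_mask):
    """Offset of the eyepiece image relative to the projected ring."""
    return center_of(image9_mask) - center_of(image10_mask)

h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
r = np.hypot(yy - 32, xx - 32)
ring = (r > 20) & (r < 22)                # image 10: ring centred at (32, 32)
sample = np.hypot(yy - 30, xx - 35) < 10  # image 9: disc shifted by (-2, +3)
print(np.round(relative_offset(sample, ring), 1))  # [-2.  3.]
```

A nonzero offset indicates that the observation auxiliary device is displaced from the optical axis of the eyepiece lens 102, which is exactly the quantity needed to correct the output position of the auxiliary information.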
[0085] Effects similar to those of the system 1000 can be obtained by the present system. In addition, according to the present system, modifications to the related-art observation device 100 can be minimized.
[0086] (Fourth Embodiment)
[0088] The observation auxiliary device 200c is different from the observation auxiliary device 200 in that the observation auxiliary device 200c includes an orientation detection unit 291. The orientation detection unit 291 includes a sensor that detects an orientation of the observation auxiliary device 200c. The observation auxiliary device 200c outputs the digital image captured by the observation device 100 as the auxiliary information when the relative position and the orientation detected by the orientation detection unit 291 satisfy a predetermined condition. As a result, the user's work of setting the sample on the stage can be supported.
[0089] The predetermined condition may be, for example, a condition that the distance between the observation device 100 and the observation auxiliary device 200 is equal to or longer than a third distance and shorter than a fourth distance (the fourth distance being longer than the third distance), and that the detected orientation is a predetermined orientation (for example, obliquely downward). The predetermined condition may be any condition that is satisfied when the user sets the sample (container) on the stage.
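A minimal sketch of this embodiment's predetermined condition follows; all numeric bounds are illustrative assumptions, since the patent leaves the third distance, fourth distance, and orientation range unspecified.

```python
# Hypothetical predicate for the fourth embodiment: show the live digital
# image only when the user stands at a working distance from the
# observation device and looks obliquely downward (toward the stage).
def should_show_stage_view(distance_mm: float, pitch_deg: float) -> bool:
    """pitch_deg: downward tilt of the observation auxiliary device."""
    THIRD_MM, FOURTH_MM = 300.0, 1000.0  # third distance <= d < fourth distance
    MIN_PITCH, MAX_PITCH = 20.0, 70.0    # "obliquely downward", in degrees
    in_range = THIRD_MM <= distance_mm < FOURTH_MM
    looking_down = MIN_PITCH <= pitch_deg <= MAX_PITCH
    return in_range and looking_down

print(should_show_stage_view(500.0, 45.0))   # True
print(should_show_stage_view(1500.0, 45.0))  # False
```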
[0090] As a result, as illustrated in
[0091] (Fifth Embodiment)
[0093] The observation auxiliary device 200d is different from the observation auxiliary device 200 in that the observation auxiliary device 200d includes an illuminance detection unit 292. The illuminance detection unit 292 includes an illuminance sensor that detects brightness around the observation auxiliary device 200d. As illustrated in
[0094] The predetermined condition may be, for example, a condition that the distance between the observation device 100 and the observation auxiliary device 200 is equal to or longer than a third distance and shorter than a fourth distance (the fourth distance being longer than the third distance). Further, the predetermined condition may additionally require that the detected illuminance be less than a predetermined value. In short, the predetermined condition may be a condition that is satisfied when the user approaches the observation device 100 to some extent, and more preferably, a condition that is satisfied when the user approaches the observation device 100 to some extent in a dark environment.
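The illuminance-based condition can be sketched in the same style; the distance bounds and darkness threshold are illustrative assumptions, not values from the patent.

```python
# Hypothetical predicate for the fifth embodiment: show the image
# indicating the position of the observation device only when the user has
# approached it to some extent and the surroundings are dark.
def should_show_position_image(distance_mm: float, illuminance_lux: float) -> bool:
    THIRD_MM, FOURTH_MM = 300.0, 1000.0  # third distance <= d < fourth distance
    DARK_LUX = 50.0                      # illustrative darkness threshold
    in_range = THIRD_MM <= distance_mm < FOURTH_MM
    return in_range and illuminance_lux < DARK_LUX

print(should_show_position_image(500.0, 10.0))   # True
print(should_show_position_image(500.0, 300.0))  # False
```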
[0095] As a result, the wire frame 11 of the observation device 100 is displayed as illustrated in
[0096] The above embodiments are specific examples for facilitating understanding of the invention, and the present invention is not limited to these embodiments; modifications of the above embodiments and alternative forms replacing them are also encompassed. In other words, in each embodiment, the components can be modified without departing from the spirit and the scope thereof. A new embodiment can also be implemented by appropriately combining the components disclosed in one or more of the embodiments. Further, some components may be omitted from the components described in an embodiment, or components may be added to those described in an embodiment. Further, the order of the processing procedures in each embodiment is interchangeable as long as no contradiction arises. In other words, the observation system and the observation support method of the present invention can be variously modified and changed without departing from the scope of the invention defined by the claims.
[0097] While in the above-described embodiment, for example,
[0098] In the above-described embodiment,
[0099] While in the above-described embodiment the smart glasses, the see-through type head mounted display, and the smart contact lens have been exemplified as specific examples of the observation auxiliary device, when the smart glasses or the see-through type head mounted display is used, the observation auxiliary device may interfere with the eyepiece lens 102 and prevent the eye from being placed at the eye point if the eye relief is too short, unlike the case where the user looks into the eyepiece lens 102 with the naked eye. It is therefore preferable that the observation device include a so-called high-eye-point eyepiece lens 102, for example one having an eye relief of equal to or longer than 40 mm.
[0100] Herein, expression of “on the basis of A” does not indicate “on the basis of only A”, but indicates “on the basis of at least A”. In other words, “on the basis of A” may be on the basis of B in addition to A.