DISPLAY PROCESSING APPARATUS, DISPLAY PROCESSING METHOD, AND PROGRAM
20200278754 · 2020-09-03
CPC classification
G06F3/017 (PHYSICS)
G09G5/36 (PHYSICS)
G09G3/002 (PHYSICS)
Abstract
[Problem] In a case where information is projected and displayed, confidentiality is ensured by controlling display according to a state of an object. [Solution] A display processing apparatus according to the present disclosure includes: a state acquisition unit configured to acquire a spatial state of an object; and a display control unit configured to control display of projected information according to the state of the object including a posture of the object. With this configuration, in a case where information is projected and displayed, it is possible to ensure confidentiality by controlling display according to the state of the object.
Claims
1. A display processing apparatus, comprising: a state acquisition unit configured to acquire a spatial state of an object; and a display control unit configured to control display of projected information according to the state of the object including a posture of the object.
2. The display processing apparatus according to claim 1, wherein the state acquisition unit acquires the state of the object within a projection area in which the information is projected.
3. The display processing apparatus according to claim 1, wherein the state acquisition unit acquires, as the state of the object, a position of the object in a depth direction with respect to a projection plane.
4. The display processing apparatus according to claim 1, wherein the state acquisition unit acquires, as the state of the object, a position of the object in a direction along a projection plane.
5. The display processing apparatus according to claim 1, wherein the state acquisition unit includes a detection unit configured to detect a spatial position of the object.
6. The display processing apparatus according to claim 1, wherein the state acquisition unit includes a tracking unit configured to track a position of the object.
7. The display processing apparatus according to claim 1, wherein the state acquisition unit includes a posture estimation unit configured to estimate the posture of the object.
8. The display processing apparatus according to claim 1, wherein: the state acquisition unit includes a recognition unit configured to recognize a gesture of the object; and the display control unit changes a display state of the information on the basis of the gesture.
9. The display processing apparatus according to claim 8, wherein: the object is a hand of a user; and the gesture includes at least one of operation of clenching the hand, operation of unclenching the hand, operation of turning over a palm, operation of tapping the displayed information, operation of dragging the displayed information, operation of touching a projection area with the hand, operation of lowering the hand toward the projection area, operation of moving the hand out of the projection area, and operation of waving the hand.
10. The display processing apparatus according to claim 1, wherein the display control unit changes a display state so that the information is unrecognizable according to the state of the object.
11. The display processing apparatus according to claim 1, wherein: the information is displayed in a reversible form; and the display control unit changes a display state of the information by reversing the information to a front or back side according to the state of the object.
12. The display processing apparatus according to claim 1, wherein the display control unit controls display of the information projected onto the object.
13. The display processing apparatus according to claim 1, wherein the display control unit controls display of the information projected onto a predetermined projection plane.
14. The display processing apparatus according to claim 1, wherein the object is an object held with a hand of a user.
15. A display processing method, comprising: acquiring a spatial state of an object; and controlling display of projected information according to the state of the object including a posture of the object.
16. A program for causing a computer to function as: means for acquiring a spatial state of an object; and means for controlling display of projected information according to the state of the object including a posture of the object.
Description
DESCRIPTION OF EMBODIMENTS
[0037] Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration will be denoted by the same reference sign, and description thereof will not be repeated.
[0038] Description will be made in the following order.
[0039] 1. Configuration example of system
[0040] 2. Processing performed in image projection system
[0041] 3. Examples of display on table
[0042] 4. Estimation of posture of hand by hand posture estimation unit
[0043] 5. Display control according to posture of hand
[0044] 6. Examples of specific operation
[0045] 7. Examples of operation using object other than hand
[0046] 8. Examples of control by server
[0047] 1. Configuration Example of System
[0048] First, a schematic configuration of a projection system 1000 according to an embodiment of the present disclosure will be described with reference to the drawings.
[0049] In the projection system 1000, for example, the table 200 having a flat projection plane is placed on the floor, and an output unit 116 of the projector apparatus 100 is provided above the table 200 so as to face downward.
[0050] In the projection system 1000, an image is projected from the output unit 116 of the projector apparatus 100 onto the projection plane of the table 200 below it, whereby the image is displayed on the table 200.
[0052] The input unit 102 is a device that, using a hand of a user on the screen as an object to be detected, acquires user operation on the basis of a state (position, posture, movement, or the like) of the object. For example, the input unit 102 includes an RGB camera serving as an image sensor, a stereo camera or a time-of-flight (TOF) camera serving as a distance measurement sensor, a structured light camera, and the like. A distance image (depth map) of the object can therefore be acquired on the basis of information detected by the input unit 102.
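As a minimal sketch of one of the sensor options named above, a distance image can be derived from a rectified stereo pair with OpenCV block matching; the function name and parameter values are illustrative assumptions, not taken from this disclosure.

```python
import cv2

def depth_from_stereo(left_gray, right_gray):
    """Compute a disparity map from a rectified 8-bit stereo pair.

    Disparity is inversely proportional to depth; converting to metric
    depth additionally needs the stereo baseline and focal length.
    OpenCV's StereoBM returns disparity in fixed point, scaled by 16.
    """
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    return stereo.compute(left_gray, right_gray)
```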
[0053] The hand detection unit 104 detects the region of the hand from the image on the basis of the information acquired from the input unit 102. The region of the hand may be detected from the image of the RGB camera or from the image of the distance measurement sensor; the method is not particularly limited.
[0054] For example, the region of the hand can be detected by performing block matching between a hand template image held in advance in a memory or the like and the image acquired by the input unit 102. By using the distance image (depth map), the hand detection unit 104 can detect the position of the hand in the depth direction with respect to the projection plane of the table 200 and the position of the hand in the direction along the projection plane.
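A minimal sketch of such block matching, assuming OpenCV, a grayscale template held in memory, and a depth image registered to the camera frame; the matching threshold is an illustrative assumption:

```python
import cv2
import numpy as np

def detect_hand(gray: np.ndarray, depth: np.ndarray,
                template: np.ndarray, threshold: float = 0.7):
    """Template (block) matching over a grayscale frame, followed by a
    depth lookup at the match centre.  Returns (x, y, z) or None when
    no match clears the threshold."""
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None
    th, tw = template.shape[:2]
    cx, cy = max_loc[0] + tw // 2, max_loc[1] + th // 2  # hand centre
    z = float(depth[cy, cx])   # position in the depth direction
    return cx, cy, z           # (cx, cy): position along the projection plane
```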
[0055] Upon receipt of the detection result from the hand detection unit 104, the hand tracking unit 106 associates the hand detected in the previous frame with the hand detected in the current frame, thereby tracking the position of the hand. Known tracking methods, such as associating objects at close positions across frames, can be used; the method is not particularly limited.
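A minimal sketch of such frame-to-frame association, assuming per-frame hand centres in pixel coordinates; the greedy strategy and the 50-pixel gate are illustrative assumptions:

```python
import numpy as np

def associate(prev_positions, curr_positions, max_dist=50.0):
    """Greedy nearest-neighbour association between frames.

    prev_positions, curr_positions: lists of (x, y) hand centres.
    Returns a list of (prev_index, curr_index) pairs; hands left
    unmatched are treated as newly appeared or lost.
    """
    pairs, used = [], set()
    for i, p in enumerate(prev_positions):
        best_j, best_d = None, max_dist
        for j, c in enumerate(curr_positions):
            if j in used:
                continue
            d = float(np.hypot(p[0] - c[0], p[1] - c[1]))
            if d < best_d:
                best_j, best_d = j, d
        if best_j is not None:
            used.add(best_j)
            pairs.append((i, best_j))
    return pairs
```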
[0056] Upon receipt of the detection result from the hand detection unit 104, the hand posture estimation unit 108 estimates a posture of the hand by using the image of the distance measurement sensor. At this time, the palm or the back of the hand is regarded as a plane, and the angle of that plane is estimated. An estimation method will be described later.
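One way to realize the plane assumption is a least-squares plane fit over the 3-D points of the hand region; the following sketch, the function names, and the choice of the z axis as the projection axis are assumptions for illustration:

```python
import numpy as np

def estimate_palm_plane(points: np.ndarray):
    """Fit a plane to (N, 3) points sampled from the hand region of the
    distance image.  Returns (normal, centroid)."""
    centroid = points.mean(axis=0)
    # The singular vector with the smallest singular value of the
    # centred points is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    if normal[2] < 0:          # orient the normal toward the projector
        normal = -normal
    return normal, centroid

def tilt_angle(normal: np.ndarray) -> float:
    """Angle (degrees) between the palm plane and the projection plane."""
    axis = np.array([0.0, 0.0, 1.0])   # assumed projection axis
    cosang = abs(float(normal @ axis)) / float(np.linalg.norm(normal))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
```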
[0057] Upon receipt of the detection results from the hand detection unit 104 and the hand posture estimation unit 108, the gesture recognition unit 110 recognizes a gesture of user operation. Recognizing a gesture by estimating a skeleton model of the hand is a common approach, and the gesture recognition unit 110 can use such a method. For example, the gesture recognition unit 110 recognizes the gesture by matching a gesture held in advance in the memory or the like against the position of the hand detected by the hand detection unit 104 and the posture of the hand estimated by the hand posture estimation unit 108.
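A toy rule-based matcher against a few of the gestures named in the claims might look as follows; the HandState fields (openness, palm_up) and all thresholds are illustrative assumptions, not features defined in this disclosure:

```python
from dataclasses import dataclass

@dataclass
class HandState:
    x: float          # position along the projection plane
    y: float
    z: float          # position in the depth direction (metres, assumed)
    palm_up: bool     # from the sign of the estimated plane normal
    openness: float   # 0.0 = clenched fist, 1.0 = open hand (assumed feature)

def recognize_gesture(prev: HandState, curr: HandState) -> str:
    """Match the change between two frames to a named gesture."""
    if prev.openness > 0.7 and curr.openness < 0.3:
        return "clench"                  # operation of clenching the hand
    if prev.openness < 0.3 and curr.openness > 0.7:
        return "unclench"                # operation of unclenching the hand
    if prev.palm_up != curr.palm_up:
        return "turn_over_palm"          # operation of turning over a palm
    if prev.z - curr.z > 0.10:
        return "lower_hand"              # lowering toward the projection area
    return "none"
```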
[0058] The hand detection unit 104, the hand tracking unit 106, the hand posture estimation unit 108, and the gesture recognition unit 110 described above function as a state acquisition unit 120 that acquires a spatial state of the hand including the posture of the hand (object). The state acquisition unit 120 can acquire the state of the hand within a projection area 202 or out of the projection area 202.
[0059] Upon receipt of the gesture recognition result by the gesture recognition unit 110, the information generation unit 114 generates information corresponding to the user operation.
[0060] For example, the information generation unit 114 compares display information held in advance for each gesture with the gesture recognition result, and generates the display information corresponding to the recognized gesture. The information generation unit 114 stores the context of the generated information in the memory or the like.
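A minimal sketch of such a gesture-to-information table, with entries that mirror the playing-card operations described later; the table contents and class shape are illustrative assumptions:

```python
# Hypothetical gesture -> display-information table.
GESTURE_TO_INFO = {
    "clench": {"action": "grab_card"},
    "unclench": {"action": "show_card_front"},
    "turn_over_palm": {"action": "show_card_back"},
    "lower_hand": {"action": "place_card_on_table"},
}

class InformationGenerationUnit:
    """Sketch of the information generation unit 114."""

    def __init__(self):
        self.context = []   # history of generated information

    def generate(self, gesture: str):
        info = GESTURE_TO_INFO.get(gesture)
        if info is not None:
            self.context.append(info)   # store context in memory
        return info
```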
[0061] Upon receipt of the information from the hand tracking unit 106 and the information generation unit 114, the display control unit 112 performs control so that the display information generated by the information generation unit 114 is displayed at a predetermined position on the table 200. Specifically, the display control unit 112 can perform control so that the display information generated by the information generation unit 114 is displayed at the position of the hand of the user tracked by the hand tracking unit 106. The output unit 116 includes, for example, a projection lens, a liquid crystal panel, a lamp, and the like, and outputs light under the control of the display control unit 112, thereby outputting an image to the table 200. As a result, content is displayed on the table 200.
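As a rough illustration of placing display information at the tracked hand position, the following sketch pastes a content image into the projector frame centred on (x, y); the function is an assumption for illustration and ignores the posture-dependent warping described later:

```python
import numpy as np

def compose_frame(canvas: np.ndarray, content: np.ndarray, x: int, y: int):
    """Paste `content` into the projector frame centred on the tracked
    hand position (x, y), clipping at the frame border."""
    ch, cw = content.shape[:2]
    x0, y0 = x - cw // 2, y - ch // 2
    x1, y1 = x0 + cw, y0 + ch
    fx0, fy0 = max(x0, 0), max(y0, 0)
    fx1, fy1 = min(x1, canvas.shape[1]), min(y1, canvas.shape[0])
    if fx0 >= fx1 or fy0 >= fy1:
        return canvas                    # entirely outside the frame
    canvas[fy0:fy1, fx0:fx1] = content[fy0 - y0:fy1 - y0, fx0 - x0:fx1 - x0]
    return canvas
```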
[0062] In the configuration described above, the state acquisition unit 120 and the display control unit 112 constitute a display processing apparatus 130.
[0063] 2. Processing Performed in Image Projection System
[0065] In Step S14 of the processing flow, the hand detection unit 104 determines whether or not a hand has been detected. When a hand is detected, the processing proceeds to Step S16, in which the hand tracking unit 106 performs processing of tracking the hand of the user.
[0066] The processing in Steps S20 to S26 is performed in parallel with the processing in Step S16. In Step S20, the hand posture estimation unit 108 performs processing of estimating the posture of the hand. After Step S20, the processing proceeds to Step S22, and the gesture recognition unit 110 performs processing of recognizing a gesture.
[0067] In the next Step S24, it is determined whether or not the gesture recognized by the gesture recognition unit 110 is a specific gesture. When the gesture is a specific gesture, the processing proceeds to Step S26. In Step S26, the information generation unit 114 performs processing of generating display information corresponding to the specific gesture.
[0068] After Steps S16 and S26, the processing proceeds to Step S28. In Step S28, based on the result of the hand tracking processing in Step S16 and the information generation processing in Step S26, the display control unit 112 performs processing for display control. In this way, display is performed on the table 200 on the basis of the result of the hand tracking processing and the result of the information generation processing.
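A sketch of one pass through this flow, assuming the unit sketches above; the unit objects and method names are illustrative, and the parallel branches (Step S16 versus Steps S20 to S26) are written sequentially for clarity:

```python
def process_frame(input_unit, detector, tracker, posture_estimator,
                  gesture_recognizer, generator, display_control):
    """One iteration of the flow up to Step S28 (illustrative sketch)."""
    gray, depth = input_unit.capture()          # image acquisition
    hand = detector.detect(gray, depth)
    if hand is None:                            # Step S14: hand detected?
        return
    position = tracker.update(hand)             # Step S16: track the hand
    state = posture_estimator.estimate(depth)   # Step S20: estimate posture
    gesture = gesture_recognizer.recognize(state)   # Step S22: recognize
    info = None
    if gesture != "none":                       # Step S24: specific gesture?
        info = generator.generate(gesture)      # Step S26: generate info
    display_control.render(position, info)      # Step S28: display control
```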
[0069] 3. Examples of Display on Table
[0071] 4. Estimation of Posture of Hand by Hand Posture Estimation Unit
[0073] Estimation of the posture of the hand is performed sequentially in three steps, Steps (1) to (3).
[0074] 5. Display Control According to Posture of Hand
[0077] By performing perspective projection transformation according to the posture of the hand 400, the content (A) 310 can be displayed in the correct shape on the palm. The user can therefore visually recognize the content (A) 310 on the palm without distortion.
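A minimal sketch of such a perspective warp, assuming OpenCV and a palm quadrilateral derived from the plane fit; the corner ordering (top-left, top-right, bottom-right, bottom-left) is an assumption:

```python
import cv2
import numpy as np

def warp_to_palm(content: np.ndarray, palm_quad: np.ndarray,
                 frame_shape) -> np.ndarray:
    """Pre-distort `content` so that it appears undistorted on a tilted palm.

    palm_quad: (4, 2) array of the palm's corner points in projector
    coordinates.  Returns a full projector frame containing the warped
    content."""
    h, w = content.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    homography = cv2.getPerspectiveTransform(src, palm_quad.astype(np.float32))
    out_h, out_w = frame_shape[:2]
    return cv2.warpPerspective(content, homography, (out_w, out_h))
```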
[0078] 6. Examples of Specific Operation
[0079] Next, specific operation performed by the user by using the projection system 1000 will be described.
[0080] Further, in a case where, after the playing card is projected onto the screen (Step (1)), the playing card is projected onto the back of the hand 400 (Step (6)), grabbed and moved (Step (7)), and the palm is then turned over and the hand 400 opened, the front side of the playing card is displayed (Step (8)).
[0085] As another example, in a case where a playing card is displayed on the palm (Step (3)) and the hand 400 is turned over, the back side of the playing card displayed on the palm is displayed on the back of the hand (Step (4)).
[0094] The projection system 1000 can also be operated without the table 200.
[0096] In a case where the hand 400 is excessively tilted, the display control unit 112 can change the display state so that the content becomes unrecognizable, which keeps the content from being viewed by unintended observers.
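A minimal sketch of such tilt-based gating, assuming the tilt angle computed by the plane fit above; the 60-degree threshold is an illustrative assumption, as this disclosure gives no numeric value:

```python
import numpy as np

MAX_TILT_DEG = 60.0   # assumed threshold

def gated_content(content: np.ndarray, tilt_deg: float) -> np.ndarray:
    """Blank the content when the palm is tilted past the threshold so
    that the projected information becomes unrecognizable to onlookers."""
    if tilt_deg > MAX_TILT_DEG:
        return np.zeros_like(content)   # nothing recognizable is projected
    return content
```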
[0097] Note that the above description has mainly dealt with an example in which, in a case where the playing card (content (A) 310) is displayed on the hand 400, the display state is changed in response to a gesture of the user. However, the display state can also be changed according to a gesture of the user in a case where the playing card is displayed on the table 200.
[0098] According to the specific operation examples described above, additional information can be presented to the user by recognizing the hand 400 existing in the projection area 202 and grasping the relative position between the hand 400 and the table 200. Further, usability can be improved by detecting the posture and state of the hand 400 in real time and dynamically changing the projected content in response to user operation or gestures. Furthermore, operation of the hand 400 can be associated with changes in the projected content through intuitive movement, which makes it possible to achieve an operation system with low learning costs. Still further, a private screen can be created by displaying content on the hand 400 within the public screen projected by the projector apparatus 100. Note that the above examples describe displaying content such as a playing card; however, other kinds of content may be displayed, such as mahjong, other card games using cards having a front side and a back side, and Gunjin shogi (a kind of board game). The present disclosure is also applicable to various kinds of projected content other than content related to such games. Because the hand 400 can be used as a private screen as described above, the system is particularly useful for applications that require confidentiality, such as entry of a personal identification number.
[0099] 7. Examples of Operation Using Object Other than Hand
[0100] An example where a display state of information regarding the content 300 is changed in response to operation of the hand 400 of the user has been described in the above description. However, the display state of the information may be changed in response to operation other than operation of the hand 400.
[0101] For example, an object held with a hand of the user, such as a board 410, may be used as the object, and the display of the information projected onto the board 410 may be controlled according to the state of the board 410.
[0102] 8. Examples of Control by Server
[0103] Part or all of the processing of the display processing apparatus described above may also be performed by a server, and display by the projector apparatus 100 may be performed under the control of the server.
[0106] As described above, according to this embodiment, display of projected content can be optimally controlled according to the spatial state of an object such as the hand 400 or the board 410. Further, an object such as the hand 400 or the board 410 can be used as a private screen, which makes it possible to achieve highly confidential applications that could not have been achieved by existing projection systems, without using any special device or tool. Furthermore, by associating operation of the object with changes in the projected content, the projection can be controlled through simple and intuitive operation.
[0107] Hereinabove, the preferred embodiments of the present disclosure have been described in detail with reference to the accompanying drawings. However, the technical scope of the present disclosure is not limited to such examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can make various changes and modifications within the scope of the technical idea recited in the claims. It is understood that those changes and modifications are also included in the technical scope of the present disclosure.
[0108] Further, the effects described in the present specification are merely illustrative or exemplary, and are not limitative. That is, the technology according to the present disclosure can have other effects that are apparent to those skilled in the art from the description of the present specification, in addition to or instead of the above effects.
[0109] The following configurations are also included in the technical scope of the present disclosure.
(1)
[0110] A display processing apparatus, comprising:
[0111] a state acquisition unit configured to acquire a spatial state of an object; and
[0112] a display control unit configured to control display of projected information according to the state of the object including a posture of the object.
(2)
[0113] The display processing apparatus according to (1), wherein the state acquisition unit acquires the state of the object within a projection area in which the information is projected.
(3)
[0114] The display processing apparatus according to (1) or (2), wherein the state acquisition unit acquires, as the state of the object, a position of the object in a depth direction with respect to a projection plane.
(4)
[0115] The display processing apparatus according to (1) or (2), wherein the state acquisition unit acquires, as the state of the object, a position of the object in a direction along a projection plane.
(5)
[0116] The display processing apparatus according to any one of (1) to (4), wherein the state acquisition unit includes a detection unit configured to detect a spatial position of the object.
(6)
[0117] The display processing apparatus according to any one of (1) to (5), wherein the state acquisition unit includes a tracking unit configured to track a position of the object.
(7)
[0118] The display processing apparatus according to any one of (1) to (6), wherein the state acquisition unit includes a posture estimation unit configured to estimate the posture of the object.
(8)
[0119] The display processing apparatus according to any one of (1) to (7), wherein:
[0120] the state acquisition unit includes a recognition unit configured to recognize a gesture of the object; and
[0121] the display control unit changes a display state of the information on the basis of the gesture.
(9)
[0122] The display processing apparatus according to (8), wherein:
[0123] the object is a hand of a user; and
[0124] the gesture includes at least one of operation of clenching the hand, operation of unclenching the hand, operation of turning over a palm, operation of tapping the displayed information, operation of dragging the displayed information, operation of touching a projection area with the hand, operation of lowering the hand toward the projection area, operation of moving the hand out of the projection area, and operation of waving the hand.
(10)
[0125] The display processing apparatus according to any one of (1) to (9), wherein the display control unit changes a display state so that the information is unrecognizable according to the state of the object.
(11)
[0126] The display processing apparatus according to any one of (1) to (10), wherein:
[0127] the information is displayed in a reversible form; and
[0128] the display control unit changes a display state of the information by reversing the information to a front or back side according to the state of the object.
(12)
[0129] The display processing apparatus according to any one of (1) to (11), wherein the display control unit controls display of the information projected onto the object.
(13)
[0130] The display processing apparatus according to any one of (1) to (11), wherein the display control unit controls display of the information projected onto a predetermined projection plane.
(14)
[0131] The display processing apparatus according to any one of (1) to (13), wherein the object is an object held with a hand of a user.
(15)
[0132] A display processing method, comprising:
[0133] acquiring a spatial state of an object; and
[0134] controlling display of projected information according to the state of the object including a posture of the object.
(16)
[0135] A program for causing a computer to function as:
[0136] means for acquiring a spatial state of an object; and
[0137] means for controlling display of projected information according to the state of the object including a posture of the object.
REFERENCE SIGNS LIST
[0138] 104 HAND DETECTION UNIT
[0139] 106 HAND TRACKING UNIT
[0140] 108 HAND POSTURE ESTIMATION UNIT
[0141] 110 GESTURE RECOGNITION UNIT
[0142] 112 DISPLAY CONTROL UNIT
[0143] 120 STATE ACQUISITION UNIT
[0144] 130 DISPLAY PROCESSING APPARATUS