INFORMATION PROCESSING APPARATUS, NON-TRANSITORY COMPUTER READABLE MEDIUM, AND INFORMATION PROCESSING METHOD
20230046226 · 2023-02-16
Assignee
Inventors
CPC classification
G06V10/44
PHYSICS
G06V30/224
PHYSICS
G06V10/22
PHYSICS
International classification
G06V10/22
PHYSICS
Abstract
An information processing apparatus includes a processor configured to: acquire a captured image of an object; specify a first area of the object in the captured image, the first area being an area occupied by a work target that is a target to be worked on; process the captured image to make a second area other than the first area invisible to generate a processed image; in response to a change in the first area with a deformation of the work target, apply a deformation area instead of the first area to make a second area obtained by the application invisible to generate a processed image, the deformation area being an area defined by a pre-registered shape of the work target after deformation; and transmit the processed image.
Claims
1. An information processing apparatus comprising: a processor configured to: acquire a captured image of an object; specify a first area of the object in the captured image, the first area being an area occupied by a work target that is a target to be worked on; process the captured image to make a second area other than the first area invisible to generate a processed image; in response to a change in the first area with a deformation of the work target, apply a deformation area instead of the first area to make a second area obtained by the application invisible to generate a processed image, the deformation area being an area defined by a pre-registered shape of the work target after deformation; and transmit the processed image.
2. The information processing apparatus according to claim 1, wherein: the processor is configured to: further acquire space information corresponding to the captured image, the space information being information on a three-dimensional space including the object; detect a feature point indicating the work target from the captured image; and specify a first space and a second space in the space information by using the feature point, the first space being a space corresponding to the first area, the second space being a space corresponding to the second area.
3. The information processing apparatus according to claim 2, wherein: the processor is configured to: set invisibility information for the second space, the invisibility information being information for making an area invisible; and make the second area corresponding to the second space in the captured image invisible by using the invisibility information to generate a processed image.
4. The information processing apparatus according to claim 3, wherein: the processor is configured to: further acquire position information and direction information in the space information, the position information being information on a position of an image capturing unit that obtains the captured image and a position of the work target, the direction information indicating a direction in which the image capturing unit captures an image of the object; and make the second area invisible in accordance with the position information and the direction information to generate a processed image.
5. The information processing apparatus according to claim 4, wherein: the processor is configured to: estimate the position information and the direction information from an amount of change of the feature point.
6. The information processing apparatus according to claim 1, wherein: the processor is configured to: store a plurality of deformation areas, each of the plurality of deformation areas comprising the deformation area; and accept designation of one deformation area among the plurality of deformation areas.
7. The information processing apparatus according to claim 2, wherein: the processor is configured to: store a plurality of deformation areas, each of the plurality of deformation areas comprising the deformation area; and accept designation of one deformation area among the plurality of deformation areas.
8. The information processing apparatus according to claim 3, wherein: the processor is configured to: store a plurality of deformation areas, each of the plurality of deformation areas comprising the deformation area; and accept designation of one deformation area among the plurality of deformation areas.
9. The information processing apparatus according to claim 4, wherein: the processor is configured to: store a plurality of deformation areas, each of the plurality of deformation areas comprising the deformation area; and accept designation of one deformation area among the plurality of deformation areas.
10. The information processing apparatus according to claim 5, wherein: the processor is configured to: store a plurality of deformation areas, each of the plurality of deformation areas comprising the deformation area; and accept designation of one deformation area among the plurality of deformation areas.
11. The information processing apparatus according to claim 1, wherein: the processor is configured to: detect a feature point indicating the work target from the captured image; and in response to detection of a deformation of the work target from an amount of change of the feature point, provide a notification of switching of the first area to the deformation area.
12. The information processing apparatus according to claim 2, wherein: the processor is configured to: detect a feature point indicating the work target from the captured image; and in response to detection of a deformation of the work target from an amount of change of the feature point, provide a notification of switching of the first area to the deformation area.
13. The information processing apparatus according to claim 3, wherein: the processor is configured to: detect a feature point indicating the work target from the captured image; and in response to detection of a deformation of the work target from an amount of change of the feature point, provide a notification of switching of the first area to the deformation area.
14. The information processing apparatus according to claim 4, wherein: the processor is configured to: detect a feature point indicating the work target from the captured image; and in response to detection of a deformation of the work target from an amount of change of the feature point, provide a notification of switching of the first area to the deformation area.
15. The information processing apparatus according to claim 5, wherein: the processor is configured to: detect a feature point indicating the work target from the captured image; and in response to detection of a deformation of the work target from an amount of change of the feature point, provide a notification of switching of the first area to the deformation area.
16. The information processing apparatus according to claim 6, wherein: the processor is configured to: detect a feature point indicating the work target from the captured image; and in response to detection of a deformation of the work target from an amount of change of the feature point, provide a notification of switching of the first area to the deformation area.
17. The information processing apparatus according to claim 11, wherein: the processor is configured to: in response to the detected deformation of the work target being a deformation that expands the first area, apply a deformation area corresponding to the deformed work target to generate a processed image.
18. The information processing apparatus according to claim 1, wherein: the processor is configured to: in response to receipt of an instruction to switch to a deformation area obtained by narrowing the first area, apply the deformation area in accordance with the instruction to generate a processed image.
19. A non-transitory computer readable medium storing a program causing a computer to execute a process for information processing, the process comprising: acquiring a captured image of an object; specifying a first area of the object in the captured image, the first area being an area occupied by a work target that is a target to be worked on; processing the captured image to make a second area other than the first area invisible to generate a processed image; in response to a change in the first area with a deformation of the work target, applying a deformation area instead of the first area to make a second area obtained by the application invisible to generate a processed image, the deformation area being an area defined by a pre-registered shape of the work target after deformation; and transmitting the processed image.
20. An information processing method comprising: acquiring a captured image of an object; specifying a first area of the object in the captured image, the first area being an area occupied by a work target that is a target to be worked on; processing the captured image to make a second area other than the first area invisible to generate a processed image; in response to a change in the first area with a deformation of the work target, applying a deformation area instead of the first area to make a second area obtained by the application invisible to generate a processed image, the deformation area being an area defined by a pre-registered shape of the work target after deformation; and transmitting the processed image.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Exemplary embodiments of the present disclosure will be described in detail based on the following figures.
DETAILED DESCRIPTION
First Exemplary Embodiment
[0025] Exemplary embodiments of the present disclosure will be described in detail hereinafter with reference to the drawings.
[0026] In one example, as illustrated in
[0027] The information processing apparatus 10 is a terminal such as a tablet including a monitor 16 and a camera 18 described below. The information processing apparatus 10 acquires an image of a target to be worked on (hereinafter referred to as a “work target”) and transmits the acquired image to the terminal 50. If the image to be transmitted to the terminal 50 includes an object other than the work target and a background, the information processing apparatus 10 processes the image to make the object other than the work target and the background invisible in the image. Further, the information processing apparatus 10 acquires from the terminal 50 information on assistance in the work performed by the assistant (the information is hereinafter referred to as “assistance information”) and presents the assistance information to the worker.
[0028] The terminal 50 acquires an image from the information processing apparatus 10 and presents the image to the assistant. Further, the terminal 50 transmits assistance information input from the assistant to the information processing apparatus 10.
[0029] In the information processing system 1, the information processing apparatus 10 transmits and presents an image captured by the worker to the terminal 50, and the terminal 50 transmits and presents assistance information input from the assistant to the information processing apparatus 10. The information processing system 1 allows the worker to receive the assistance information from the assistant at a remote location through the information processing apparatus 10 and to perform work on the work target.
[0030] In this exemplary embodiment, an image is captured, by way of example but not limitation. A video may be captured.
[0031] Next, the hardware configuration of the information processing apparatus 10 will be described with reference to
[0032] As illustrated in
[0033] The CPU 11 controls the overall operation of the information processing apparatus 10. The ROM 12 stores various programs including an information processing program according to this exemplary embodiment, data, and the like. The RAM 13 is a memory used as a work area for executing various programs. The CPU 11 loads a program stored in the ROM 12 into the RAM 13 and executes the program to perform a process for processing an image to generate a processed image. In one example, the storage 14 is a hard disk drive (HDD), a solid state drive (SSD), a flash memory, or the like. The storage 14 may store the information processing program and the like. The input unit 15 includes a touch panel and a keyboard for receiving text input and the like. The monitor 16 displays text and an image. The communication I/F 17 transmits and receives data. The camera 18 is an imaging device for capturing an image of the work target. The camera 18 is an example of an “image capturing unit”.
[0034] Next, the functional configuration of the information processing apparatus 10 will be described with reference to
[0035] In one example, as illustrated in
[0036] The acquisition unit 21 acquires an image including an object that is a work target whose image is captured by the camera 18. Further, the acquisition unit 21 acquires designation of a work target that is a target to be worked on among objects included in the image. The work target according to this exemplary embodiment is designated by reading a quick response code (QR code (registered trademark)) on the surface of the object, by way of example but not limitation. Identified objects may be displayed on the monitor 16 to allow a user to select the work target from among the objects.
[0037] The detection unit 22 detects feature points indicating the object from the acquired image. The feature points are points indicating the edges, the corners, and the like of the object included in the image.
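The embodiment does not prescribe a particular detector for the feature points described above; as a purely illustrative sketch, corner-like points of an object in a binary occupancy grid can be found by counting occupied four-neighbors (a practical system would typically use a corner or keypoint detector such as Harris or ORB instead).

```python
# Illustrative feature-point detector: in a binary occupancy grid, a cell
# belonging to the object is treated as a corner-like feature point when
# exactly two of its four neighbors also belong to the object (edge cells
# of a filled rectangle have three occupied neighbors, interior cells four).
def detect_feature_points(grid):
    rows, cols = len(grid), len(grid[0])
    points = []
    for r in range(rows):
        for c in range(cols):
            if not grid[r][c]:
                continue
            neighbors = sum(
                grid[nr][nc]
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                if 0 <= nr < rows and 0 <= nc < cols
            )
            if neighbors == 2:
                points.append((r, c))
    return points

# A filled rectangle yields exactly its four corners as feature points.
grid = [[0] * 7 for _ in range(6)]
for r in range(1, 4):
    for c in range(1, 5):
        grid[r][c] = 1
print(detect_feature_points(grid))  # → [(1, 1), (1, 4), (3, 1), (3, 4)]
```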
[0038] The estimation unit 23 estimates the position and direction of the worker (the camera 18) by using the detected feature points and estimates a target space in the position and direction of the worker. Specifically, Simultaneous Localization and Mapping (SLAM) technology is used to estimate position information indicating the position of the worker, direction information indicating the direction of the worker, and the target space.
[0039] For example, the worker starts capturing an image of the work target by causing the camera 18 to read a QR code (registered trademark) attached to the work target, and the detection unit 22 detects the feature points included in the captured image. The estimation unit 23 compares the feature points included in a plurality of images captured over time and estimates the positions and directions at and in which the worker (the camera 18) captured the images from the amount of change of the feature points.
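A full SLAM pipeline also recovers rotation and depth, but the core idea of paragraph [0039], estimating camera motion from the change of tracked feature points between frames, can be reduced to the following toy sketch (matched point lists and a pure in-plane translation are simplifying assumptions of this example):

```python
# Toy stand-in for the SLAM-based estimation: given feature points matched
# between two successive frames, the camera's in-plane translation is
# estimated as the mean displacement of the points.
def estimate_translation(prev_pts, curr_pts):
    n = len(prev_pts)
    dx = sum(c[0] - p[0] for p, c in zip(prev_pts, curr_pts)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_pts, curr_pts)) / n
    return dx, dy

prev_pts = [(0, 0), (4, 0), (4, 3), (0, 3)]
curr_pts = [(x + 3, y - 2) for x, y in prev_pts]  # all points shift together
print(estimate_translation(prev_pts, curr_pts))  # → (3.0, -2.0)
```

Averaging over many points makes the estimate robust to small per-point tracking noise, which is why real systems track as many feature points as practical.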
[0040] In one example, as illustrated in
[0041] The setting unit 24 sets three-dimensional space information by using the acquired image and the detected feature points. Specifically, in one example, the setting unit 24 sets three-dimensional space information illustrated in
[0042] The setting unit 24 further sets information for the non-target space 34 to make the non-target space 34 invisible as a background not to be visually recognized. This information is hereinafter referred to as “invisibility information”. That is, the three-dimensional space information is a three-dimensional coordinate space in which the feature points 31 indicating the object are set, and includes the target space 33 occupied by the designated work target 32, and the non-target space 34 that is a space excluding the target space 33.
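The three-dimensional space information of paragraphs [0041] and [0042] might be represented, under the simplifying assumption that the target space 33 is an axis-aligned box (the embodiment does not fix a representation), roughly as follows:

```python
# Hypothetical representation of the three-dimensional space information:
# the target space is held as an axis-aligned box, and any point outside it
# belongs to the non-target space, for which invisibility information (here
# simply a background fill value) is set.
class SpaceInfo:
    def __init__(self, target_box, invisibility=0):
        self.target_box = target_box      # ((x0, y0, z0), (x1, y1, z1))
        self.invisibility = invisibility  # fill value applied to the background

    def classify(self, point):
        (x0, y0, z0), (x1, y1, z1) = self.target_box
        x, y, z = point
        inside = x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1
        return "target" if inside else "non-target"

space = SpaceInfo(((0, 0, 0), (2, 2, 2)))
print(space.classify((1, 1, 1)))  # → target
print(space.classify((5, 0, 0)))  # → non-target
```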
[0043] If three-dimensional space information is held, the setting unit 24 sets the three-dimensional space information by performing positioning of the non-target space 34 and the target space 33 in the position information and direction information of the worker by using the acquired image and the detected feature points 31.
[0044] In one example, as illustrated in
[0045] In one example, as illustrated in
[0046] The storage unit 27 stores shapes of deformed work targets.
[0047] In response to a change of the target area 36 with a deformation of the work target 32, the acceptance unit 28 accepts, from the user, an instruction to switch the target area 36 to an area (hereinafter referred to as a "deformation area") defined by a pre-registered shape of the work target 32 after deformation, together with designation of the shape of the deformed work target 32.
[0048] In response to detection of a deformation of the work target 32 from the amount of change of the feature points 31, the notification unit 29 provides a notification of switching of the target area 36 to the deformation area.
[0049] Next, prior to the description of the operation of the information processing apparatus 10, a description of a method for switching the target area 36 to the deformation area will be given with reference to
[0050] For example, the work target 32 is an image forming apparatus. In this case, a door of the image forming apparatus may be opened or closed, and the image forming apparatus may be deformed in shape. If the work target 32 is deformed, in the target space 33 set by the setting unit 24 before the deformation, a portion of the work target 32 may be made invisible as a result of processing, or an area to be made invisible may fail to be made invisible as a result of processing.
[0051] In the information processing apparatus 10 according to this exemplary embodiment, accordingly, the storage unit 27 stores, for each work target 32, a shape of the work target 32 after deformation in advance, and the information processing apparatus 10 accepts, from the user, designation of a shape corresponding to the shape of the deformed work target 32 from among the shapes stored in advance.
[0052] The acceptance unit 28 accepts designation of a shape of the deformed work target 32 from the user, and the setting unit 24 sets a space (hereinafter referred to as a “deformation space”) corresponding to the shape of the deformed work target 32 for which the designation is accepted. As illustrated in
[0053] The deformation space is set in the three-dimensional space information in consideration of the estimated position information and direction information. For example, the shape of the work target 32 after deformation stored in the storage unit 27 is the shape of the work target 32 as viewed from the front. In this case, a deformation space corresponding to the estimated position information and direction information (for example, the shape of the deformed work target 32 as viewed from a side) is generated. Accordingly, the shape of the work target 32 after deformation stored in the storage unit 27 is associated with the shape of the work target 32 as viewed from the current position of the worker.
[0054] The specifying unit 25 compares the acquired image 35 with the three-dimensional space information and specifies a deformation area in the image 35 corresponding to the deformation space in the three-dimensional space information. The specifying unit 25 further specifies an area other than the deformation area as the non-target area 37 to be processed as a background.
[0055] The generation unit 26 makes the non-target area 37 invisible in the image 35 to generate the processed image 38.
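The masking step performed by the generation unit can be sketched, under the assumption (made only for this example) that the target area is an axis-aligned bounding box in pixel coordinates, as follows:

```python
# Minimal sketch of the generation step: pixels of the captured image that
# fall outside the specified target area are replaced with a fill value so
# that the background and other objects are not visually recognizable.
def make_invisible(image, target_box, fill=0):
    (r0, c0), (r1, c1) = target_box
    return [
        [pix if r0 <= r <= r1 and c0 <= c <= c1 else fill
         for c, pix in enumerate(row)]
        for r, row in enumerate(image)
    ]

image = [[9] * 4 for _ in range(3)]              # uniform 3x4 "captured image"
processed = make_invisible(image, ((1, 1), (2, 2)))
print(processed)  # → [[0, 0, 0, 0], [0, 9, 9, 0], [0, 9, 9, 0]]
```

In practice the target area would be an arbitrary polygon or per-pixel mask rather than a box, but the principle of filling everything outside it is the same.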
[0056] Next, the operation of the information processing apparatus 10 according to this exemplary embodiment will be described with reference to
[0057] In step S101, the CPU 11 acquires a captured image 35 of an object including the work target 32.
[0058] In step S102, the CPU 11 acquires the work target 32 designated by the user.
[0059] In step S103, the CPU 11 detects the feature points 31 of the object from the acquired image 35.
[0060] In step S104, the CPU 11 estimates the position information and direction information of the worker from the detected feature points 31 by using SLAM technology.
[0061] In step S105, the CPU 11 sets the target space 33 and the non-target space 34 in three-dimensional space information. In the three-dimensional space information, invisibility information is set for the non-target space 34. When a new image 35 is acquired over time and the feature points 31 are detected, the feature points 31 are used to set the target space 33 and the non-target space 34 in the three-dimensional space information.
[0062] In step S106, the CPU 11 determines whether designation of a shape of the deformed work target 32 has been accepted from the user. If designation of a shape of the deformed work target 32 has been accepted (step S106: YES), the CPU 11 proceeds to step S107. On the other hand, if designation of a shape of the deformed work target 32 has not been accepted (step S106: NO), the CPU 11 proceeds to step S109.
[0063] In step S107, the CPU 11 accepts an instruction to switch to the deformation area and the shape of the deformed work target 32.
[0064] In step S108, the CPU 11 sets a deformation space by using the accepted shape of the deformed work target 32, specifies a deformation area corresponding to the deformation space, and applies the deformation area instead of the target area 36.
[0065] In step S109, the CPU 11 compares the acquired image 35 with the three-dimensional space information and specifies the target area 36 in the image 35.
[0066] In step S110, the CPU 11 performs processing on the non-target area 37 to generate the processed image 38.
[0067] In step S111, the CPU 11 transmits the generated processed image 38 to the terminal 50.
[0068] In step S112, the CPU 11 determines whether to terminate the process. If the process is to be terminated (step S112: YES), the CPU 11 ends the process of generating the processed image 38. On the other hand, if the process is not to be terminated (step S112: NO), the CPU 11 proceeds to step S113.
[0069] In step S113, the CPU 11 acquires a new captured image 35 of an object including the work target 32.
[0070] In step S114, the CPU 11 detects the feature points 31 of the object from the acquired image 35. Then, the CPU 11 returns to step S104, performs positioning of the target space 33 and the non-target space 34 by using the detected feature points 31, and sets three-dimensional space information.
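The per-frame flow of steps S101 to S114 might be condensed, under heavy simplifying assumptions (bounding-box areas, a boolean standing in for the user's designation of a deformed shape, and a returned list standing in for transmission to the terminal 50), as:

```python
# Condensed sketch of the first-embodiment loop: each frame is masked with
# either the current target area or, once the user has designated a deformed
# shape, the corresponding deformation area; processed frames are collected
# in place of transmission.
def run_pipeline(frames, target_area, deformation_area=None, fill=0):
    transmitted = []
    for frame, deformed in frames:  # deformed: user designated a deformed shape
        (r0, c0), (r1, c1) = deformation_area if deformed else target_area
        processed = [
            [pix if r0 <= r <= r1 and c0 <= c <= c1 else fill
             for c, pix in enumerate(row)]
            for r, row in enumerate(frame)
        ]
        transmitted.append(processed)
    return transmitted

frame = [[7, 7, 7], [7, 7, 7]]
out = run_pipeline([(frame, False), (frame, True)],
                   target_area=((0, 0), (0, 0)),
                   deformation_area=((0, 0), (1, 2)))
print(out[0])  # → [[7, 0, 0], [0, 0, 0]]
print(out[1])  # → [[7, 7, 7], [7, 7, 7]]
```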
[0071] As described above, this exemplary embodiment may prevent a captured image from being excessively or insufficiently processed even upon a change in the area occupied by the work target 32.
Second Exemplary Embodiment
[0072] In the first exemplary embodiment, designation of a shape of the deformed work target 32 is accepted from the user. In a second exemplary embodiment, the shape of the deformed work target 32 is detected.
[0073] The configuration of the information processing system 1, the hardware configuration of the information processing apparatus 10, and the functional configuration of the information processing apparatus 10 according to this exemplary embodiment are similar to the configuration illustrated in
[0074] In one example, as illustrated in
[0075] In one example, as illustrated in
[0076] The operation of the information processing apparatus 10 according to this exemplary embodiment will be described with reference to
[0077] In step S115, the CPU 11 determines whether a deformation of the work target 32 is detected from the amount of change of the feature points 31. If a deformation of the work target 32 is detected (step S115: YES), the CPU 11 proceeds to step S116. On the other hand, if no deformation of the work target 32 is detected (step S115: NO), the CPU 11 proceeds to step S109.
[0078] In step S116, the CPU 11 determines whether the detected deformation is a deformation that reduces the size of the target space 33. If the detected deformation is a deformation that reduces the size of the target space 33 (step S116: YES), the CPU 11 proceeds to step S117. On the other hand, if the detected deformation is not a deformation that reduces the size of the target space 33 (i.e., the detected deformation is a deformation that increases the size of the target space 33) (step S116: NO), the CPU 11 proceeds to step S119.
[0079] In step S117, the CPU 11 notifies the user of switching to the deformation area.
[0080] In step S118, the CPU 11 accepts from the user an instruction to switch to the deformation area and the shape of the deformed work target 32.
[0081] In step S119, the CPU 11 detects the shape of the deformed work target 32 from the amount of change of the detected feature points 31.
[0082] In step S120, the CPU 11 sets a deformation space by using the shape of the deformed work target 32, specifies a deformation area corresponding to the deformation space, and applies the deformation area instead of the target area 36.
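The branching of steps S115 to S120 can be illustrated with a toy decision function; judging the deformation from the change of the feature points' bounding-box area is an assumption of this sketch, not a detail fixed by the embodiment:

```python
# Illustrative version of the second-embodiment decision: an expansion of the
# work target is applied automatically (step S119 onward), while a reduction
# only triggers a notification so the user can confirm the switch (step S117).
def bbox_area(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def deformation_action(prev_pts, curr_pts):
    prev_a, curr_a = bbox_area(prev_pts), bbox_area(curr_pts)
    if curr_a > prev_a:
        return "apply deformation area"     # expansion: process automatically
    if curr_a < prev_a:
        return "notify user of switching"   # reduction: wait for confirmation
    return "no deformation"

closed = [(0, 0), (4, 0), (4, 3), (0, 3)]
opened = [(0, 0), (6, 0), (6, 3), (0, 3)]  # e.g. a door swings open
print(deformation_action(closed, opened))  # → apply deformation area
print(deformation_action(opened, closed))  # → notify user of switching
```

This asymmetry reflects the rationale of the embodiment: an expansion risks leaving part of the work target invisible, so it is corrected immediately, whereas a reduction risks exposing background and is therefore confirmed with the user first.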
[0083] As described above, this exemplary embodiment may prevent the occurrence of unintended insufficient processing.
[0084] In the exemplary embodiments described above, the information processing apparatus 10 is a terminal carried by a worker, by way of example but not limitation. The information processing apparatus 10 may be a server. For example, a server including the information processing apparatus 10 may acquire the image 35 and the work target 32 from a terminal carried by a worker, process the image 35 to make the non-target area 37 invisible in the image 35, and transmit the processed image 38 to a terminal carried by an assistant.
[0085] While exemplary embodiments of the present disclosure have been described, the present disclosure is not limited to the scope described in the exemplary embodiments. The exemplary embodiments may be modified or improved in various ways without departing from the scope of the present disclosure, and such modifications or improvements also fall within the technical scope of the present disclosure.
[0086] In the embodiments above, the term “processor” refers to hardware in a broad sense. Examples of the processor include general processors (e.g., CPU: Central Processing Unit) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, and programmable logic device).
[0087] In the embodiments above, the term "processor" is broad enough to encompass one processor, or plural processors that are located physically apart from each other but work cooperatively. The order of operations of the processor is not limited to the order described in the embodiments above, and may be changed.
[0088] In the exemplary embodiments, the information processing program is installed in a storage, by way of example but not limitation. The information processing program according to the exemplary embodiments may be provided in such a manner as to be recorded in a computer-readable storage medium. For example, an information processing program according to an exemplary embodiment of the present disclosure may be provided in such a manner as to be recorded in an optical disk such as a compact disc ROM (CD-ROM) or a digital versatile disc ROM (DVD-ROM). An information processing program according to an exemplary embodiment of the present disclosure may be provided in such a manner as to be recorded in a semiconductor memory such as a Universal Serial Bus (USB) memory or a memory card. The information processing program according to the exemplary embodiments may be acquired from an external device via a communication line connected to the communication I/F 17.
[0089] The foregoing description of the exemplary embodiments of the present disclosure has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to understand the disclosure for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the following claims and their equivalents.