Artificial neural network apparatus and operating method for the same

11488003 · 2022-11-01

Abstract

An artificial neural network apparatus and an operating method including a plurality of layer processors for performing operations on input data are disclosed. The artificial neural network apparatus may include: a flag layer processor for outputting a flag according to a comparison result between a pooling output value of a current frame and a pooling output value of a previous frame; and a controller for stopping operation of a layer processor which performs operations after the flag layer processor among the plurality of layer processors when the flag is outputted from the flag layer processor, wherein the flag layer processor is a layer processor that performs a pooling operation first among the plurality of layer processors.

Claims

1. An artificial neural network apparatus including a plurality of layer processors for performing operations on input data, comprising: a flag layer processor for outputting a flag according to a comparison result between a pooling output value of a current frame and a pooling output value of a previous frame; and a controller for stopping operation of a layer processor which performs operations after the flag layer processor among the plurality of layer processors, when the flag is outputted from the flag layer processor, wherein the flag layer processor is a layer processor that performs a pooling operation first among the plurality of layer processors.

2. The artificial neural network apparatus of claim 1, wherein the flag layer processor outputs the flag when the pooling output value of the current frame is equal to the pooling output value of the previous frame.

3. The artificial neural network apparatus of claim 1, wherein the pooling output value of the current frame and the pooling output value of the previous frame are position information stored according to a comparison result between a pooling operation value and a previously stored reference value.

4. The artificial neural network apparatus of claim 1, wherein the flag layer processor includes a memory for storing the pooling output value of the current frame and the pooling output value of the previous frame.

5. The artificial neural network apparatus of claim 1, further comprising: a comparator for comparing the pooling output value of the current frame with the pooling output value of the previous frame, and outputting the flag when the pooling output value of the current frame and the pooling output value of the previous frame are the same.

6. An artificial neural network apparatus including a plurality of layer processors for performing operations on input data, comprising: a flag layer processor for outputting a flag when position information of a current frame and position information of a previous frame respectively stored in accordance with comparison results between pooling operation values and previously stored reference values are the same; and a controller for stopping operation of a layer processor which performs operations after the flag layer processor among the plurality of layer processors when the flag is outputted from the flag layer processor, wherein the flag layer processor is a layer processor that performs a pooling operation first among the plurality of layer processors.

7. An operating method of an artificial neural network apparatus including a plurality of layer processors for performing operations on input data, comprising: outputting a flag according to a comparison result between a pooling output value of a current frame and a pooling output value of a previous frame; and stopping operation of a layer processor which performs operations after the flag layer processor among the plurality of layer processors when the flag is outputted from the flag layer processor.

8. The operating method of claim 7, wherein the outputting a flag outputs the flag when the pooling output value of the current frame is equal to the pooling output value of the previous frame.

9. The operating method of claim 7, wherein the outputting a flag performs operations on input data, stores pooling output values according to a comparison result between a pooling operation value and a previously stored reference value, and outputs the flag according to a comparison result between the stored pooling output value of the current frame and the stored pooling output value of the previous frame.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) FIG. 1 is a schematic diagram of an artificial neural network.

(2) FIG. 2 is a schematic diagram illustrating an operation process of an artificial neural network layer.

(3) FIG. 3 is a block diagram of an artificial neural network apparatus according to an exemplary embodiment.

(4) FIG. 4 is a block diagram of a flag processor according to an exemplary embodiment.

(5) FIG. 5 is a flowchart of an operating method of an artificial neural network apparatus according to an exemplary embodiment.

(6) FIG. 6 is a flowchart for explaining a step of outputting a flag of an operating method of an artificial neural network apparatus according to an exemplary embodiment.

DETAILED DESCRIPTION OF THE EMBODIMENTS

(7) Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily practice the present disclosure. However, the present disclosure may be modified in various different ways and is not limited to the embodiments described herein. In the accompanying drawings, portions unrelated to the description are omitted in order to clearly describe the present disclosure, and similar reference numerals are used for similar portions throughout the present specification.

(8) Throughout the present specification and the claims, unless explicitly described to the contrary, the word “comprise” and variations such as “comprises” or “comprising” will be understood to imply the inclusion of stated elements but not the exclusion of any other elements.

(9) FIG. 1 is a schematic diagram of an artificial neural network.

(10) Referring to FIG. 1, when input data is input to an artificial neural network, all layers from a layer for processing input data to a layer for outputting a recognition result are activated.

(11) FIG. 2 is a schematic diagram illustrating an operation process of an artificial neural network layer.

(12) Referring to FIG. 2, a layer of a Convolutional Neural Network (CNN) model having an image recognition function performs convolution, ReLU (Rectified Linear Unit) activation, and pooling operations.

(13) Since pooling generally reduces the amount of data to about one half or one quarter of the total amount of input data, pooling is suitable for comparing values between a previous frame and a current frame. Since the pooling operation reflects the position information of an object on the two-dimensional plane, the position information of the previous frame and the current frame can be compared.
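As a hypothetical illustration (not taken from the patent), a non-overlapping 2×2 max pooling that both reduces the data to one quarter and records the position of each maximum can be sketched as follows; the function name and the plain nested-list input format are assumptions:

```python
def max_pool_2x2(frame):
    """Return (pooled values, argmax positions) for non-overlapping 2x2 windows.

    Each 2x2 window of the input contributes one value, so the output holds
    one quarter of the input data, and each value carries the (row, col)
    position of the maximum that produced it.
    """
    h, w = len(frame), len(frame[0])
    values, positions = [], []
    for r in range(0, h, 2):
        for c in range(0, w, 2):
            # Pair every element in the window with its position, then take
            # the maximum by value (ties broken by position).
            window = [(frame[r + dr][c + dc], (r + dr, c + dc))
                      for dr in range(2) for dc in range(2)]
            best_val, best_pos = max(window)
            values.append(best_val)
            positions.append(best_pos)
    return values, positions
```

For a 2×2 input `[[1, 2], [3, 4]]` this yields the single value `4` with position `(1, 1)`; it is this per-window position information that the following paragraphs compare across frames.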

(14) FIG. 3 is a block diagram of an artificial neural network apparatus according to an exemplary embodiment.

(15) Referring to FIG. 3, an artificial neural network apparatus according to an exemplary embodiment includes a plurality of layer processors 100 and a controller 200.

(16) A flag layer processor 102 that performs a pooling operation first among the plurality of layer processors 100 outputs a flag according to a comparison result between a pooling output value of a current frame and a pooling output value of a previous frame. Specifically, the flag layer processor 102 outputs the flag when the pooling output value of the current frame and the pooling output value of the previous frame are the same. The flag layer processor 102 may be a first-half layer module, that is, an initial layer module that recognizes pixel brightness and low-dimensional shapes and performs the first pooling operation. When the flag layer processor 102 is the first-half layer module, a layer processor which performs operations after the flag layer processor may be a second-half layer module.

(17) FIG. 4 is a block diagram of a flag processor according to an exemplary embodiment.

(18) Referring to FIG. 4, the flag layer processor 102 according to an exemplary embodiment includes a memory 111 for storing the pooling output value of the current frame and the pooling output value of the previous frame, and a comparator 113 for comparing the pooling output value of the current frame with the pooling output value of the previous frame and for outputting the flag when the pooling output value of the current frame and the pooling output value of the previous frame are the same.

(19) The pooling output value stored in the memory 111 is a value stored according to a comparison result between a pooling operation value and a previously stored reference value. Specifically, the pooling output value is position information stored, in the order of the pooling operations, when the pooling operation value of the current frame is larger than the previously stored reference value. The memory 111 may be a FIFO (First In, First Out) buffer. The reference value may be determined in advance through experiments and tests and stored in the FIFO, and may be modified during operation.

(20) The comparator 113 compares position information, which is the pooling output value of the current frame stored in the FIFO, with position information, which is the pooling output value of the previous frame stored in the FIFO, and outputs a flag based on the comparison result. The flag is a signal for determining whether to operate a layer processor 103 which performs operations after performing an operation of the flag layer processor 102 (i.e., which receives output data of the flag layer processor 102 and performs operations).
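The memory 111 and comparator 113 behavior described above can be sketched in Python as follows; this is a hypothetical illustration, and the class name, the `(value, position)` input format, and the use of a two-slot FIFO are assumptions, not details taken from the patent:

```python
from collections import deque


class FlagLayerProcessor:
    """Sketch of the flag layer: a FIFO memory plus a comparator."""

    def __init__(self, reference):
        self.reference = reference     # previously stored reference value
        self.memory = deque(maxlen=2)  # FIFO holding previous and current frame

    def process(self, pooled):
        """pooled: (value, position) pairs produced by the pooling operation.

        Returns True (the flag) when the position information of the current
        frame equals that of the previous frame.
        """
        # Store, in pooling order, only positions whose value exceeds the
        # previously stored reference value.
        positions = tuple(pos for val, pos in pooled if val > self.reference)
        self.memory.append(positions)
        # Comparator: the flag is raised when current == previous positions.
        return len(self.memory) == 2 and self.memory[0] == self.memory[1]
```

On the first frame there is no previous entry in the FIFO, so no flag can be raised; from the second frame onward, an unchanged object position produces the flag.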

(21) Referring to FIG. 3, the controller 200 determines whether to operate the layer processor 103, which is the second half layer module, according to whether a flag of the flag layer processor 102 is received or not. Specifically, when the flag is received from the flag layer processor 102, the controller 200 stops operation of the layer processor 103 which performs operations after the flag layer processor 102.

(22) When the operation of the layer processor 103 is stopped, the result values calculated by the flag layer processor 102 become the output data. On the other hand, when the pooling output value of the current frame is different from the pooling output value of the previous frame, the flag layer processor 102 does not output the flag, so the result values calculated by the layer processors 103, 104, and 105 following the flag layer processor 102 become the output data.
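The controller's early-exit behavior can be sketched as follows; this is a hypothetical illustration in which the layer modules are modeled as plain callables whose names and signatures are assumptions:

```python
def run_network(frame, first_half, flag_layer, second_half):
    """Run the pipeline, stopping after the flag layer when the flag is raised.

    first_half:  layers up to and including the work before the first pooling
    flag_layer:  returns (pooling output, flag) for the frame
    second_half: the layer processors that follow the flag layer
    """
    data = first_half(frame)
    pooled, flag = flag_layer(data)
    if flag:
        # Frame unchanged since the previous frame: skip the later layer
        # processors; the flag layer's result values become the output data.
        return pooled
    # Otherwise the remaining layer processors produce the output data.
    return second_half(pooled)
```

For example, with a `flag_layer` stub that raises the flag for a repeated input, `run_network` returns the pooled result directly and never invokes `second_half`, which is the computation saving the apparatus is designed to obtain.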

(23) FIG. 5 is a flowchart of an operating method of an artificial neural network apparatus according to an exemplary embodiment.

(24) Referring to FIGS. 3 and 5, an operating method of an artificial neural network apparatus according to an exemplary embodiment includes outputting, by a flag layer processor 102, a flag according to a comparison result between a pooling output value of a current frame and a pooling output value of a previous frame (S100), and stopping, by a controller 200, operation of a layer processor 103 which performs operations after performing an operation of the flag layer processor 102 when the controller 200 receives the flag (S300).

(25) FIG. 6 is a flowchart for explaining a step of outputting a flag of an operating method of an artificial neural network apparatus according to an exemplary embodiment.

(26) The outputting a flag (S100) performs operations on input data (S110), stores pooling output values according to a comparison result between a pooling operation value and a previously stored reference value (S120), and outputs the flag according to a comparison result between the stored pooling output value of the current frame and the stored pooling output value of the previous frame (S130). The outputting a flag (S100) is the same as the operation of the flag layer processor 102 described above, so detailed description will be omitted.

(27) The step S300 of stopping the operation of the layer processor 103 is the same as the operation of the controller 200 described above, and thus a detailed description thereof will be omitted.

(28) While this invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.