CONSTRUCTION SYSTEM FOR RECURRENT BAYESIAN NETWORKS, CONSTRUCTION METHOD FOR RECURRENT BAYESIAN NETWORKS, COMPUTER READABLE RECORDING MEDIUM, NON-TRANSITORY COMPUTER PROGRAM PRODUCT, AND RADIO NETWORK CONTROL SYSTEM
20230110592 · 2023-04-13
CPC classification
G06N3/126
PHYSICS
G06N3/086
PHYSICS
Abstract
A construction system for recurrent Bayesian networks, a construction method for recurrent Bayesian networks, a computer-readable recording medium, a non-transitory computer program product, and a radio network control system are provided. The method is performed by a processor, and includes: establishing an initial population, and setting the initial population as a current population; establishing a corresponding recurrent Bayesian network for each CPPN in the current population, to obtain a set of recurrent Bayesian networks corresponding to the current population; evolving the current population by using an evolutionary algorithm and a fitness function to obtain a next population; determining whether a termination condition is met; and repeatedly performing the foregoing steps in response to the termination condition being not met, and selecting a solution network in the current population as the task network based on the fitness function in response to the termination condition being met.
Claims
1. A construction system for recurrent Bayesian networks, comprising a processor, the processor being configured to perform the following steps to generate a task network: (a) setting a current population, the current population comprising a plurality of compositional pattern-producing networks (CPPNs); (b) establishing a corresponding recurrent Bayesian network for each CPPN in the current population, to obtain a set of recurrent Bayesian networks corresponding to the current population; (c) evolving the current population by using an evolutionary algorithm and a fitness function to obtain a next population, and setting the next population as the current population; (d) determining whether a termination condition is met according to the fitness function and the set of recurrent Bayesian networks corresponding to the current population; and (e) repeatedly performing steps (b), (c), and (d) in response to the termination condition being not met, and selecting a solution network in the current population as the task network based on the fitness function in response to the termination condition being met.
2. The construction system for recurrent Bayesian networks according to claim 1, wherein step (b) comprises: selecting a current CPPN in the current population; and establishing, by using an output generated by the current CPPN corresponding to an offspring node and a parent node corresponding to the offspring node in the corresponding recurrent Bayesian network, a conditional probability table of the parent node corresponding to the offspring node in the corresponding recurrent Bayesian network.
3. The construction system for recurrent Bayesian networks according to claim 2, wherein the evolutionary algorithm is neuroevolution of augmenting topologies.
4. The construction system for recurrent Bayesian networks according to claim 1, wherein the termination condition is that steps (b), (c), and (d) are repeatedly performed for a default number of times, or, according to the fitness function, the current population comprises a candidate network allowing a fitness value of the candidate network to be greater than a preset fitness value.
5. The construction system for recurrent Bayesian networks according to claim 1, wherein step (a) comprises: randomly generating an initial population according to a parameter; and setting the initial population as the current population, wherein the initial population comprises the CPPNs.
6. A construction method for recurrent Bayesian networks, performed by a processor, to generate a task network, the construction method for recurrent Bayesian networks comprises: (a) setting a current population, the current population comprising a plurality of compositional pattern-producing networks (CPPNs); (b) establishing a corresponding recurrent Bayesian network for each CPPN in the current population, to obtain a set of recurrent Bayesian networks corresponding to the current population; (c) evolving the current population by using an evolutionary algorithm and a fitness function to obtain a next population, and setting the next population as the current population; (d) determining whether a termination condition is met according to the fitness function and the set of recurrent Bayesian networks corresponding to the current population; and (e) repeatedly performing steps (b), (c), and (d) in response to the termination condition being not met, and selecting a solution network in the current population as the task network based on the fitness function in response to the termination condition being met.
7. The construction method for recurrent Bayesian networks according to claim 6, wherein step (b) comprises: selecting a current CPPN in the current population; and establishing, by using an output generated by the current CPPN corresponding to an offspring node and a parent node corresponding to the offspring node in the corresponding recurrent Bayesian network, a conditional probability table of the parent node corresponding to the offspring node in the corresponding recurrent Bayesian network.
8. The construction method for recurrent Bayesian networks according to claim 7, wherein the evolutionary algorithm is neuroevolution of augmenting topologies.
9. The construction method for recurrent Bayesian networks according to claim 6, wherein the termination condition is that steps (b), (c), and (d) are performed for a default number of times, or, according to the fitness function, the current population comprises a candidate network allowing a fitness value of the candidate network to be greater than a preset fitness value.
10. The construction method for recurrent Bayesian networks according to claim 6, wherein step (a) comprises: randomly generating an initial population according to a parameter; and setting the initial population as the current population, wherein the initial population comprises the CPPNs.
11. A computer-readable recording medium with a stored program, the stored program, when loaded and executed by a processing unit, performing the method according to claim 6.
12. A non-transitory computer program product, storing at least one instruction, the at least one instruction, when executed by a processor, causing the processor to perform the method according to claim 6.
13. A radio network control system, coupled to a radio access network, the radio network control system comprising: a construction system for recurrent Bayesian networks, comprising a processor, the processor being configured to perform the following steps to generate a task network: (a) setting a current population, the current population comprising a plurality of compositional pattern-producing networks (CPPNs); (b) establishing a corresponding recurrent Bayesian network for each CPPN in the current population, to obtain a set of recurrent Bayesian networks corresponding to the current population; (c) evolving the current population by using an evolutionary algorithm and a fitness function to obtain a next population, and setting the next population as the current population; (d) determining whether a termination condition is met according to the fitness function and the set of recurrent Bayesian networks corresponding to the current population; and (e) repeatedly performing steps (b), (c), and (d) in response to the termination condition being not met, and selecting a solution network in the current population as the task network based on the fitness function in response to the termination condition being met; an optimizer, configured to receive a network state and a target policy of the radio access network, and output an optimal configuration to the radio access network for configuring and optimizing the radio access network; and a control unit, configured to receive and store the task network; wherein the optimizer outputs, based on a plurality of solutions in a solution space, a plurality of configuration feature vectors of each solution to the control unit; the task network evaluates at least one key performance indicator (KPI) of each configuration feature vector to obtain at least one KPI value of each configuration feature vector; and the optimizer selects an optimal solution in the solution space based on the at least one KPI of each configuration 
feature vector, and outputs the optimal configuration to the radio access network based on the optimal solution.
14. The radio network control system according to claim 13, wherein components of each configuration feature vector comprise a topological structure, a routing table, a traffic intensity matrix, and a power intensity matrix of the radio access network.
15. The radio network control system according to claim 13, wherein the at least one KPI is selected from the group consisting of a delay indicator, a jitter indicator, and a loss indicator.
16. The radio network control system according to claim 13, wherein step (b) comprises: selecting a current CPPN in the current population; and establishing, by using an output generated by the current CPPN corresponding to an offspring node and a parent node corresponding to the offspring node in the corresponding recurrent Bayesian network, a conditional probability table of the parent node corresponding to the offspring node in the corresponding recurrent Bayesian network.
17. The radio network control system according to claim 16, wherein the evolutionary algorithm is neuroevolution of augmenting topologies.
18. The radio network control system according to claim 13, wherein the termination condition is that steps (b), (c), and (d) are repeatedly performed for a default number of times, or, according to the fitness function, the current population comprises a candidate network allowing a fitness value of the candidate network to be greater than a preset fitness value.
19. The radio network control system according to claim 13, wherein step (a) comprises: randomly generating an initial population according to a parameter; and setting the initial population as the current population, wherein the initial population comprises the CPPNs.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0025] The foregoing and other technical contents, features, and effects of the present invention are presented clearly in the detailed description below with reference to embodiments and the accompanying drawings. Proportions or sizes of the components in the drawings may be exaggerated, omitted, or generalized to help a person skilled in the art understand and read; they are not intended to limit the conditions under which the present invention can be implemented and therefore have no technical significance. Any modification to the structure, change to the proportional relationship, or adjustment of the size falls within the scope of the technical content disclosed by the present invention, provided it does not affect the effects and objectives that the present invention can achieve. The same reference numerals are used to indicate the same or similar components in all of the accompanying drawings. The term “couple” or “connect” used in the following embodiments may refer to any direct or indirect, wired or wireless connection means.
[0027] A construction method for recurrent Bayesian networks and collaboration between modules of the construction system 100 for recurrent Bayesian networks according to the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
[0029] In the embodiment shown in
[0030] A connection pattern of the neural network may be stored as a four-dimensional hypercube by using the CPPN. Each point of the four-dimensional hypercube may be represented as (x₁, y₁, x₂, y₂), where x₁, y₁, x₂, and y₂ are the coordinate components of the point. Each point encodes a connection between two nodes. The CPPN 205 may be considered as the four-dimensional function of equation (1):

w = CPPN(x₁, y₁, x₂, y₂)  (1).
[0031] w is an output value of the CPPN 205. As shown in
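Equation (1) can be sketched as a plain function that is queried once per node pair. The particular composition of activation functions below (Gaussian of distance, sine, tanh) is purely hypothetical and stands in for whatever function the evolved CPPN 205 actually encodes; only the query signature follows the text.

```python
import math

def cppn(x1, y1, x2, y2):
    # Hypothetical CPPN body: a small composition of pattern-producing
    # activation functions, not the patent's actual evolved network.
    d = math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)  # distance between the two nodes
    g = math.exp(-d * d)          # Gaussian of distance -> symmetric patterns
    s = math.sin(x1 + y2)         # periodic component -> repeating patterns
    return math.tanh(g + 0.5 * s) # squashed output w in (-1, 1)

# Query the weight w of the connection between node (0, 0) and node (1, 1),
# i.e. one point of the four-dimensional hypercube.
w = cppn(0.0, 0.0, 1.0, 1.0)
```

Querying the function once for every point of the hypercube yields the full connection pattern of the encoded network.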
[0032] Referring to
[0033] After the neural network is expressed as the node gene 301 and the connection gene 302, as shown in
[0034] It should be noted that the random addition of an element to the connection gene 401 may be implemented by randomly generating appropriate “in”, “out”, “weight”, and “innovation” values through a random simulation function (for example, the random() function in the random module in Python) of general software.
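The random addition described in paragraph [0034] can be sketched as follows; the list-of-dictionaries encoding of the connection gene is an assumption made for illustration, not the patent's data layout.

```python
import random

def add_connection(connection_gene, num_nodes, next_innovation):
    """Randomly add a new element to a connection gene (hypothetical helper).

    Each element records "in", "out", "weight", and a global "innovation"
    number, mirroring the fields named in the text.
    """
    element = {
        "in": random.randrange(num_nodes),        # random source node
        "out": random.randrange(num_nodes),       # random destination node
        "weight": random.uniform(-1.0, 1.0),      # random connection weight
        "innovation": next_innovation,            # historical marker of the gene
    }
    connection_gene.append(element)
    return next_innovation + 1                    # advance the global counter
```

The innovation counter is global across the population, which is what later allows matching elements of two parents to be aligned during crossover.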
[0035] Referring to
[0036] In the foregoing example, the connection gene element 5011 matches the connection gene element 5021. Therefore, one of the connection gene element 5011 and the connection gene element 5021 is randomly selected and placed into a connection gene 505 of a neural network 506. The connection gene element 5012 matches the connection gene element 5022. Therefore, one of the connection gene element 5012 and the connection gene element 5022 is randomly selected and placed into the connection gene 505 of the neural network 506, and so on. The connection gene elements that are not matched, such as the connection gene elements 5014, 5024, or 5025, are unconditionally placed into the connection gene 505 of the neural network 506. Finally, the connection gene 505 and the corresponding neural network 506 may be obtained. The connection gene 505 includes connection gene elements 5051, 5052, 5053, 5054, 5055, and 5056. The neural network 503 and the neural network 504 are referred to as parents of the neural network 506. The neural network 506 is referred to as an offspring of the neural network 503 and the neural network 504.
[0037] The algorithms for evolving the topological structure and the connection weights of the neural network shown in
[0038] It should be noted that randomly selecting one of the connection gene element 5012 and the connection gene element 5022 may be implemented through a random simulation function (for example, the random() function in the random module in Python) of general software.
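The alignment-and-selection procedure of paragraph [0036] can be sketched as follows: elements of the two parents that share an innovation number are matched, one of each matched pair is chosen at random, and unmatched elements are placed into the offspring unconditionally. The dictionary-based genome encoding is an illustrative assumption.

```python
import random

def crossover(gene_a, gene_b):
    """Combine two connection genes into an offspring gene (hypothetical sketch).

    Matching elements (same innovation number) are inherited from a randomly
    chosen parent; unmatched elements are inherited unconditionally.
    """
    by_innov_a = {e["innovation"]: e for e in gene_a}
    by_innov_b = {e["innovation"]: e for e in gene_b}
    child = []
    for innov in sorted(set(by_innov_a) | set(by_innov_b)):
        if innov in by_innov_a and innov in by_innov_b:
            # Matched pair: pick one parent's element at random.
            child.append(random.choice([by_innov_a[innov], by_innov_b[innov]]))
        else:
            # Unmatched element: place it into the offspring unconditionally.
            child.append(by_innov_a.get(innov) or by_innov_b[innov])
    return child
```

Applied to the parents of paragraph [0036], the offspring's connection gene contains every innovation number that appears in either parent.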
[0039] In step S901 in
[0042] In an embodiment of the present invention, the properties of the Bayesian network on the nodes are discrete. In this embodiment, the conditional probabilities may be represented by using a conditional probability table (CPT). The sum of the probabilities in any row of the CPT is always 1. Taking the Bayesian network shown in
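A discrete CPT with the row-sum property can be sketched as follows. The node names, the dictionary encoding, and the normalization helper (which would rescale raw CPPN outputs into a valid probability row) are illustrative assumptions, not the patent's representation.

```python
# Hypothetical CPT for a binary node B with a binary parent A:
# each row is the distribution P(B | A = a), so each row must sum to 1.
cpt = {
    "A=0": {"B=0": 0.7, "B=1": 0.3},
    "A=1": {"B=0": 0.2, "B=1": 0.8},
}

def normalize(row):
    """Rescale a row of non-negative raw values into a valid probability row."""
    total = sum(row.values())
    return {outcome: value / total for outcome, value in row.items()}

# Verify the defining property: every row of the CPT sums to 1.
for condition, row in cpt.items():
    assert abs(sum(row.values()) - 1.0) < 1e-9
```

In this sketch, the outputs the CPPN generates for an offspring node would be passed through `normalize` before being entered as a CPT row, guaranteeing the row-sum property by construction.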
[0045] The processor 101 establishes a corresponding recurrent Bayesian network for each CPPN in the current population by using the foregoing establishment method. Therefore, the processor 101 finally can obtain a set of recurrent Bayesian networks corresponding to the current population.
[0046] In step S903, the processor 101 evolves the current population by using an evolutionary algorithm and a fitness function to obtain a next population, and sets the next population as the current population. In an embodiment of the present invention, the processor 101 calculates a fitness value of each CPPN in the current population by using the fitness function, and ranks each CPPN in the current population by using the fitness value of each CPPN in the current population. The processor 101 selects N CPPNs in the current population with larger fitness values based on the ranking result, where N is a positive integer, and N is less than the population quantity M. The processor 101 then operates the selected N CPPNs by using the mutation process and the crossover process in the neural network evolutionary operation shown in
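The selection-and-refill procedure of step S903 can be sketched as follows: rank by fitness, keep the N fittest CPPNs, and refill the population to size M with mutated and crossed-over offspring of the survivors. The function signature and the 50/50 mutation-versus-crossover split are illustrative assumptions.

```python
import random

def next_generation(population, fitness, n, m, mutate, crossover):
    """Produce the next population of size m from the n fittest individuals.

    `mutate` and `crossover` stand in for the neural network evolutionary
    operations described in the text; n must be at least 2 for crossover.
    """
    ranked = sorted(population, key=fitness, reverse=True)
    survivors = ranked[:n]                 # keep the N CPPNs with larger fitness
    offspring = list(survivors)            # survivors carry over (elitism)
    while len(offspring) < m:              # refill to the population quantity M
        if random.random() < 0.5:
            offspring.append(mutate(random.choice(survivors)))
        else:
            a, b = random.sample(survivors, 2)
            offspring.append(crossover(a, b))
    return offspring
```

The returned list becomes the current population for the next iteration of steps S902 through S904.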
[0047] In an embodiment of the present invention, the process in which the processor 101 operates the selected N CPPNs by using the mutation process and the crossover process in the neural network evolutionary operation shown in
[0048] It should be noted that, any neural evolutionary algorithm based on the neuroevolution of augmenting topologies for evolving the topological structures and connection weights of the neural network as shown in
[0049] In step S904, the processor 101 determines whether a termination condition is met according to the fitness function and the set of recurrent Bayesian networks corresponding to the current population. In this embodiment, the termination condition is that steps S902 and S903 are performed for a default number of times, or, according to the fitness function, the current population includes a candidate network allowing a fitness value of the candidate network to be greater than a preset fitness value. If the termination condition is not met, the processor 101 executes steps S902 and S903 again. If the termination condition is met, the processor 101 executes step S905.
[0050] In step S905, because the current population already includes at least one candidate network whose fitness value is greater than the preset fitness value, the processor 101 selects a solution network in the current population, whose fitness value is greater than the preset fitness value, as the task network according to the fitness function.
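Steps S901 through S905 together can be sketched as a single loop; the callable parameters are placeholders for the operations described above, and the exact termination test is an assumption based on paragraph [0049].

```python
def construct_task_network(population, build_rbn, evolve, fitness,
                           max_iters, target):
    """Sketch of steps S901-S905 (hypothetical signature).

    population : initial population of CPPNs            (S901)
    build_rbn  : CPPN -> corresponding recurrent Bayesian network (S902)
    evolve     : (population, scores) -> next population (S903)
    Terminates after max_iters iterations (the default number of times)
    or when a candidate's fitness exceeds the preset target (S904).
    """
    for _ in range(max_iters):
        rbns = [build_rbn(c) for c in population]    # S902: one RBN per CPPN
        scores = [fitness(r) for r in rbns]          # evaluate each candidate
        if max(scores) > target:                     # S904: termination met
            break
        population = evolve(population, scores)      # S903: next generation
    # S905: the solution network with the highest fitness becomes the task network.
    rbns = [build_rbn(c) for c in population]
    scores = [fitness(r) for r in rbns]
    return rbns[scores.index(max(scores))]
```

Note that, as in the text, the termination check is applied to the set of recurrent Bayesian networks rather than to the CPPNs directly.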
[0054] The internal memory 1202 and the non-volatile memory 1203 are used to store programs. The programs may include program code, and the program code includes computer operation instructions. The internal memory 1202 and the non-volatile memory 1203 provide instructions and data to the processor 1201. The processor 1201 reads a corresponding computer program from the non-volatile memory 1203 into the internal memory 1202 and runs the corresponding computer program. The processor 1201 is specifically configured to perform the steps described in
[0055] The processor 1201 may be an integrated circuit chip with a signal processing capability. During implementation, the method and steps disclosed in the foregoing embodiments may be implemented by using a hardware integrated logic circuit in the processor 1201 or implemented by using instructions in a software form. The processor 1201 may be a general purpose processor, including a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or another programmable logic device that may implement or perform the methods and steps disclosed in the foregoing embodiments.
[0056] Embodiments of the present specification further provide a computer-readable storage medium. The computer-readable storage medium stores at least one instruction, the at least one instruction, when executed by the processor 1201 of the electronic device 1200, causing the processor 1201 of the electronic device 1200 to perform the methods and steps disclosed in the foregoing embodiments.
[0057] Examples of computer storage media include, but are not limited to, a phase change memory (PRAM), a static random access memory (SRAM), a dynamic random access memory (DRAM) or another type of random access memory (RAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a flash memory or another internal memory technology, a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD) or another optical storage, a cartridge tape, a magnetic tape or magnetic disk storage, another magnetic storage device, or any other non-transmission medium that may be configured to store information accessible by a computing device. As defined herein, the computer-readable medium does not include a transitory medium, such as a modulated data signal or a carrier wave.
[0059] The optimizer 1302 outputs, based on a plurality of solutions in a solution space, a plurality of configuration feature vectors of each solution to the control unit 1303. The task network 1304 then evaluates at least one KPI of each configuration feature vector to obtain at least one KPI value of each configuration feature vector. The optimizer 1302 selects an optimal solution in the solution space based on the at least one KPI value of each configuration feature vector, and outputs the optimal configuration to the RAN 1301 based on the optimal solution.
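The optimizer's selection step can be sketched as follows. The scoring rule (summing per-vector KPI values and preferring the lowest total, as would suit delay-, jitter-, or loss-style KPIs) and the function names are illustrative assumptions, not the patent's actual optimizer.

```python
def select_optimal(solutions, feature_vectors_of, task_network_kpi):
    """Hypothetical optimizer step: score each solution's configuration
    feature vectors with the task network's KPI estimate and keep the
    solution with the lowest total KPI (lower is better for delay/jitter/loss).
    """
    best, best_score = None, float("inf")
    for solution in solutions:
        vectors = feature_vectors_of(solution)           # from the optimizer
        score = sum(task_network_kpi(v) for v in vectors)  # from the task network
        if score < best_score:
            best, best_score = solution, score
    return best  # basis of the optimal configuration output to the RAN
```

A real KPI might combine delay, jitter, and loss with weights reflecting the target policy received from the RAN 1301.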
[0060] In some embodiments of the present invention, components of each configuration feature vector include a topological structure, a routing table, a traffic intensity matrix, and a power intensity matrix of the RAN.
[0061] In some embodiments of the present invention, the KPI is selected from a delay indicator, a jitter indicator, a loss indicator, or a combination thereof.
[0062] In some embodiments of the present invention, the RAN 1301 is a 5G RAN.
[0063] Based on the above, some embodiments of the present invention provide a construction system and construction method for recurrent Bayesian networks, a computer-readable recording medium, and a non-transitory computer program product, which may efficiently construct an appropriate recurrent Bayesian network by evolving a population of CPPNs.
[0064] Some embodiments of the present invention provide a radio network control system that can automatically complete a self-organizing network solution by applying a task network generated by the construction system for recurrent Bayesian networks.
[0065] Although the present invention has been described in considerable detail with reference to certain preferred embodiments, this disclosure is not intended to limit the scope of the invention. Persons of ordinary skill in the art may make various modifications and changes without departing from the scope and spirit of the invention. Therefore, the scope of the appended claims should not be limited to the description of the preferred embodiments above.