TRAINING METHOD AND SYSTEM FOR PASSENGER DISTRIBUTION PREDICTION MODEL, AND PASSENGER GUIDING METHOD AND SYSTEM

20220391717 · 2022-12-08

Abstract

The present invention discloses a method and system for training a passenger distribution prediction model, and a method and system for guiding passengers. In an embodiment, the passenger distribution is intelligently sensed by means of the characteristics of temperature, humidity and CO.sub.2 concentration distribution changes in cars caused by passenger density changes, which avoids the problems of crowd flow and obscuration faced by passenger distribution detection conducted with images, and avoids the difficulty in floor intrusive transformation faced by passenger distribution detection conducted with pressure sensors; the passenger flow is guided by adjusting the brightness of lighting tubes in the cars, for example, the lighting tubes in areas with high passenger density are dimmed, and the lighting tubes in areas with low passenger density are brightened, to guide ordered flow of passengers toward areas with low passenger density. Further details are disclosed herein.

Claims

1. A training method for a distribution prediction model of passengers in a subway car, comprising: establishing a spatial coordinate system of each car, installing a collection device constituted by temperature and humidity sensors and a CO.sub.2 concentration sensor in each car, each collection device serving as a sample collection point, and obtaining installation coordinates of each sample collection point in the corresponding spatial coordinate system; obtaining a temperature time series, a humidity time series, and a CO.sub.2 concentration time series at each sample collection point of each car during a sampling period, and obtaining a local average temperature value and a local average humidity value for the day, wherein the installation coordinates of each sample collection point, the temperature time series, the humidity time series, and the CO.sub.2 concentration time series corresponding to the sample collection point, and the average temperature value and the average humidity value constitute an environmental data sample at each sample collection point of each car during the sampling period; recording passenger positions in each car at an end time of the sampling period, and obtaining coordinates of passengers in the corresponding spatial coordinate system, i.e. 
obtaining a passenger distribution for each car during the sampling period; converting the passenger distribution for each car during the sampling period into a binary image, and calculating a mean value, a variance, a maximum value and a minimum value of pixels in the binary image; calculating an amplitude and a gradient direction of each pixel in the binary image, and establishing a frequency distribution histogram with the gradient direction of all pixels as the abscissa axis and the pixel value corresponding to each pixel as the ordinate axis; extracting a frequency distribution feature of the frequency distribution histogram, the frequency distribution feature referring to a feature vector constituted by pixel values corresponding to each interval among R intervals equally divided from the abscissa axis of the frequency distribution histogram; combining the mean value, the maximum value, the minimum value, the variance and the frequency distribution feature of the pixels to form an original feature vector, and clustering the binary images with the original feature vector as input of a clustering algorithm to obtain image clustering results; coding the passenger distribution mode according to the image clustering results to obtain a passenger distribution code, each category of image clustering result corresponding to a passenger distribution mode, and each passenger distribution mode corresponding to a passenger distribution code; and establishing a distribution prediction model for each car, and training the distribution prediction model with training samples to obtain a trained distribution prediction model, the training sample taking the environmental data sample at each sample collection point of each car during the sampling period as input and the corresponding passenger distribution code as output.
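The statistics-plus-histogram feature of claim 1 can be sketched in Python with NumPy as follows. This is a minimal illustration only: the function name, the zero padding at the car boundary, the quadrant-aware `arctan2`, and the default R=8 are assumptions of this sketch, not specifics of the patent.

```python
import numpy as np

def original_feature_vector(img, R=8):
    """Original feature vector of claim 1: pixel statistics plus an R-interval
    gradient-direction histogram whose ordinate accumulates pixel values.
    Illustrative sketch; padding and bin range are assumptions."""
    stats = [img.mean(), img.var(), img.max(), img.min()]
    # Central differences as in claim 5; the car boundary is zero-padded here.
    p = np.pad(img.astype(float), 1)
    gx = p[2:, 1:-1] - p[:-2, 1:-1]   # gradient along rows (x-axis)
    gy = p[1:-1, 2:] - p[1:-1, :-2]   # gradient along columns (y-axis)
    theta = np.arctan2(gy, gx)        # gradient direction of each pixel
    # R equal direction intervals; each pixel contributes its pixel value.
    hist, _ = np.histogram(theta, bins=R, range=(-np.pi, np.pi),
                           weights=img.astype(float))
    return np.concatenate([stats, hist])
```

The resulting vector has 4 + R entries: the four statistics followed by the frequency distribution feature, ready to be passed to the clustering algorithm.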

2. The training method for the distribution prediction model of passengers in the subway car according to claim 1, wherein the spatial coordinate system is a two-dimensional rectangular coordinate system with a horizontal centerline of the roof of the car as the x-axis and a longitudinal centerline of the roof as the y-axis, or the spatial coordinate system is a two-dimensional rectangular coordinate system with a horizontal centerline of the ground of the car as the x-axis and a longitudinal centerline of the ground as the y-axis.

3. The training method for the distribution prediction model of passengers in the subway car according to claim 1, wherein 15 collection devices are installed on the roof of each car, and the 15 collection devices are arranged in an array.

4. The training method for the distribution prediction model of passengers in the subway car according to claim 1, wherein the specific process of converting the passenger distribution for each car during the sampling period into a binary image is as follows: dividing each car into W equal parts along the x-axis of the spatial coordinate system of the car, and dividing each car into L equal parts along the y-axis of the spatial coordinate system of the car, to convert the space of each car into W×L independent units; setting each independent unit as a pixel, the number of passengers in each independent unit being a pixel value of the pixel; and obtaining pixel values of all pixels, and converting the passenger distribution into the binary image composed of W×L pixels.
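The grid conversion of claim 4 can be sketched as follows. The function name and argument order are illustrative, and the sketch assumes passenger coordinates have been shifted so that 0 ≤ x < car_length and 0 ≤ y < car_width; the patent's coordinate system is centered on the car's centerlines.

```python
import numpy as np

def passenger_distribution_image(coords, car_length, car_width, W, L):
    """Claim 4: divide the car into W equal parts along x and L equal parts
    along y, then count passengers per independent unit to form an L x W
    image whose pixel value is the passenger count in that unit."""
    img = np.zeros((L, W), dtype=int)
    for x, y in coords:
        w = min(int(x / car_length * W), W - 1)  # column index along x
        l = min(int(y / car_width * L), L - 1)   # row index along y
        img[l, w] += 1
    return img
```

Each independent unit becomes one pixel, so the total of all pixel values equals the number of passengers in the car.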

5. The training method for the distribution prediction model of passengers in the subway car according to claim 1, wherein formulas for calculating the amplitude and the gradient direction of the pixel are respectively: G(l, w)=√(G.sub.x(l, w).sup.2+G.sub.y(l, w).sup.2), θ(l, w)=arctan(G.sub.y(l, w)/G.sub.x(l, w)), where G.sub.x(l, w)=I(l+1, w)−I(l−1, w) and G.sub.y(l, w)=I(l, w+1)−I(l, w−1); where, G(l, w) is the amplitude of the pixel, θ(l, w) is the gradient direction of the pixel, G.sub.x(l, w) is the gradient amplitude of the pixel along the x-axis of the spatial coordinate system, G.sub.y(l, w) is the gradient amplitude of the pixel along the y-axis of the spatial coordinate system, I(l+1, w) is the pixel value of the pixel in row l+1 and column w, I(l−1, w) is the pixel value of the pixel in row l−1 and column w, l=1, 2, 3 . . . L, w=1, 2, 3 . . . W, W is a number of equal parts divided from the car along the x-axis of the spatial coordinate system of each car, and L is a number of equal parts divided from the car along the y-axis of the spatial coordinate system of each car.
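The per-pixel formulas of claim 5 translate directly into code. This sketch uses 0-based indices and `atan2` (a quadrant-aware variant of the claim's arctan); both are conveniences of the illustration.

```python
import math

def pixel_gradient(I, l, w):
    """Amplitude G(l, w) and gradient direction theta(l, w) of claim 5 for an
    interior pixel of the count image I (nested lists or a 2-D array)."""
    gx = I[l + 1][w] - I[l - 1][w]   # Gx: central difference along the x-axis
    gy = I[l][w + 1] - I[l][w - 1]   # Gy: central difference along the y-axis
    amplitude = math.hypot(gx, gy)   # sqrt(Gx^2 + Gy^2)
    direction = math.atan2(gy, gx)   # arctan(Gy/Gx), quadrant-aware
    return amplitude, direction
```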

6. The training method for the distribution prediction model of passengers in the subway car according to claim 1, wherein the clustering algorithm is an OPTICS algorithm, the initial neighborhood distance parameter ε of the OPTICS algorithm is 0.2, and the initial neighborhood sample parameter MinPts is 3.
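With scikit-learn, the claim-6 parameters map onto `max_eps`/`eps` and `min_samples`. The DBSCAN-style cluster extraction and the toy 4-D feature vectors below are choices of this sketch, not details from the patent.

```python
import numpy as np
from sklearn.cluster import OPTICS

# Cluster original feature vectors with OPTICS using the claim-6 parameters:
# neighborhood distance 0.2 and neighborhood sample parameter MinPts = 3.
rng0, rng1 = np.random.default_rng(0), np.random.default_rng(1)
features = np.vstack([
    rng0.normal(0.0, 0.02, (10, 4)),  # images of one distribution mode
    rng1.normal(1.0, 0.02, (10, 4)),  # images of another mode
])
labels = OPTICS(min_samples=3, max_eps=0.2,
                cluster_method="dbscan", eps=0.2).fit_predict(features)
# Each cluster label is then coded as one passenger distribution code.
codes = {lab: code for code, lab in enumerate(sorted(set(labels) - {-1}))}
```

On these well-separated toy features, OPTICS recovers the two modes; the `codes` mapping is the coding step of claim 1 in miniature.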

7. The training method for the distribution prediction model of passengers in the subway car according to claim 1, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: establishing a temperature time series LSTM neural network model, and randomly dividing the temperature time series in the training samples into two temperature training sets, wherein one of the temperature training sets and the passenger distribution code corresponding to the temperature training set constitute a first training set, and the other temperature training set is a second training set; training the temperature time series LSTM neural network model with the first training set to obtain a trained temperature time series LSTM neural network model; removing a classification layer of the trained temperature time series LSTM neural network model, and obtaining the temperature time series feature vector with the second training set as input and a vector outputted by a fully connected layer as output; establishing a humidity time series LSTM neural network model, and randomly dividing the humidity time series in the training samples into two humidity training sets, wherein one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute a third training set, and the other humidity training set is a fourth training set; training the humidity time series LSTM neural network model with the third training set to obtain a trained humidity time series LSTM neural network model; removing a classification layer of the trained 
humidity time series LSTM neural network model, and obtaining the humidity time series feature vector with the fourth training set as input and a vector outputted by a fully connected layer as output; and establishing a CO.sub.2 concentration time series LSTM neural network model, and randomly dividing the CO.sub.2 concentration time series in the training samples into two CO.sub.2 concentration training sets, wherein one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute a fifth training set, and the other CO.sub.2 concentration training set is a sixth training set; training the CO.sub.2 concentration time series LSTM neural network model with the fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; removing a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model, and obtaining the CO.sub.2 concentration time series feature vector with the sixth training set as input and a vector outputted by a fully connected layer as output.
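The extractor shape described in claim 7 (an LSTM whose classification layer is removed so the fully connected layer's output serves as the feature vector) can be outlined as below. The random weights are placeholders standing in for trained parameters, and all dimensions are assumptions of this sketch; only the forward pass and the "stop at the FC layer" idea are from the claim.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_feature_vector(series, hidden=16, feat_dim=8, seed=42):
    """Claim-7 extractor shape: a one-layer LSTM followed by a fully
    connected layer whose output is kept as the time-series feature vector
    once the classification (softmax) layer above it is removed."""
    rng = np.random.default_rng(seed)
    Wx = rng.normal(0.0, 0.1, (4 * hidden, 1))       # input weights (univariate)
    Wh = rng.normal(0.0, 0.1, (4 * hidden, hidden))  # recurrent weights
    b = np.zeros(4 * hidden)
    h, c = np.zeros(hidden), np.zeros(hidden)
    for x in series:                                  # unrolled forward pass
        z = Wx @ np.array([x]) + Wh @ h + b
        i, f, g, o = np.split(z, 4)                   # input/forget/cell/output gates
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    W_fc = rng.normal(0.0, 0.1, (feat_dim, hidden))   # fully connected layer
    return W_fc @ h   # the softmax classification layer would sit above this
```

The same extractor structure is instantiated three times in the claim, once each for the temperature, humidity, and CO.sub.2 concentration series.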

8. A training system for a distribution prediction model of passengers in a subway car, comprising a plurality of training subsystems, each car corresponding to a training subsystem, wherein each training subsystem comprises a plurality of collection devices arranged on the roof of the car and a control device; the collection device comprises temperature and humidity sensors and a CO.sub.2 concentration sensor; the control device comprises: a coordinate system establishment unit, configured to establish a spatial coordinate system of each car, and obtain installation coordinates of each sample collection point in the corresponding spatial coordinate system, each collection device serving as a sample collection point; an environmental data obtaining unit, configured to obtain a temperature time series, a humidity time series, and a CO.sub.2 concentration time series at each sample collection point of each car during a sampling period, and obtain a local average temperature value and a local average humidity value for the day, wherein the installation coordinates of each sample collection point, the temperature time series, the humidity time series, and the CO.sub.2 concentration time series corresponding to the sample collection point, and the average temperature value and the average humidity value constitute an environmental data sample at each sample collection point of each car during the sampling period; a passenger distribution obtaining unit, configured to obtain coordinates of passengers in the corresponding spatial coordinate system according to the recorded passenger positions in each car at the end time of the sampling period, i.e. 
obtain a passenger distribution for each car during the sampling period; an image conversion and feature extraction unit, configured to convert the passenger distribution for each car during the sampling period into a binary image, and calculate a mean value, a variance, a maximum value and a minimum value of pixels in the binary image; calculate an amplitude and a gradient direction of each pixel in the binary image, and establish a frequency distribution histogram with the gradient direction of all pixels as the abscissa axis and the pixel value corresponding to each pixel as the ordinate axis; and extract a frequency distribution feature of the frequency distribution histogram, the frequency distribution feature referring to a feature vector constituted by pixel values corresponding to each interval among R intervals equally divided from the abscissa axis of the frequency distribution histogram; a clustering classification unit, configured to combine the mean value, the maximum value, the minimum value, the variance and the frequency distribution feature of the pixels to form an original feature vector, and cluster the binary images with the original feature vector as input of a clustering algorithm to obtain image clustering results; a coding unit, configured to code the passenger distribution mode according to the image clustering results to obtain a passenger distribution code, each category of image clustering result corresponding to a passenger distribution mode, and each passenger distribution mode corresponding to a passenger distribution code; and a model training unit, configured to establish a distribution prediction model for each car, and train the distribution prediction model with training samples to obtain a trained distribution prediction model, the training sample taking the environmental data sample of each sample collection point of the car during the sampling period as input and the corresponding passenger distribution code as output.

9. A method for guiding passengers in a subway car based on environmental monitoring and lighting guidance, comprising: obtaining an environmental data sample at each sample collection point of each car during each sampling period in real time, the environmental data sample being constituted by installation coordinates of each sample collection point, a temperature time series, a humidity time series, and a CO.sub.2 concentration time series corresponding to the sample collection point, and an average temperature value and an average humidity value; inputting the environmental data samples into the distribution prediction model for the corresponding car which is trained by the training method according to claim 1, to obtain a predicted passenger distribution code for each car during the corresponding sampling period; and adjusting the brightness of lighting tubes in the corresponding car according to the predicted passenger distribution code for the car during the corresponding sampling period to guide passenger flow.

10. The method for guiding passengers in the subway car according to claim 9, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the environmental data samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: inputting the temperature time series into the trained temperature time series LSTM neural network model with the classification layer removed, the vector outputted by the fully connected layer being the corresponding temperature time series feature vector; inputting the humidity time series into the trained humidity time series LSTM neural network model with the classification layer removed, the vector outputted by the fully connected layer being the corresponding humidity time series feature vector; and inputting the CO.sub.2 concentration time series into the trained CO.sub.2 concentration time series LSTM neural network model with the classification layer removed according to claim 7, the vector outputted by the fully connected layer being the corresponding CO.sub.2 concentration time series feature vector.

11. The method for guiding passengers in the subway car according to claim 9, wherein the specific process of controlling the brightness of lighting tubes in the corresponding car according to the predicted passenger distribution code for the car during the corresponding sampling period is as follows: obtaining a current image clustering result according to the predicted passenger distribution code for the car during the corresponding sampling period, and determining a clustering center corresponding to the current image according to the current image clustering result; defining S binary images closest to the clustering center as typical clustering images of a corresponding category, and calculating an average pixel value of each pixel position in the S typical clustering images, to obtain an average pixel image under the category; assigning each lighting tube to a lighting area according to the layout of lighting tubes in the corresponding car, and calculating an average value of pixel values for pixels in the lighting area corresponding to each lighting tube; if the maximum pixel value in the average pixel image is greater than a pixel threshold, starting lighting guidance and performing the next step, otherwise skipping lighting guidance; extracting pixels in the average pixel image whose average pixel values are greater than the pixel threshold, the pixels greater than the pixel threshold constituting a set of pixels to be guided; calculating a lighting guidance coefficient according to the coordinates of the lighting tubes in the spatial coordinate system, the coordinates of pixels in the set of pixels to be guided in the spatial coordinate system, and the average value of pixel values for the pixels in the lighting area, wherein the specific calculation expression is: β.sub.n=A.sub.n(√((x.sub.0.sup.n−x.sub.1.sup.l).sup.2+(y.sub.0.sup.n−y.sub.1.sup.w).sup.2)+√((x.sub.0.sup.n−x.sub.2.sup.l).sup.2+(y.sub.0.sup.n−y.sub.2.sup.w).sup.2)+ . . . +√((x.sub.0.sup.n−x.sub.H.sup.l).sup.2+(y.sub.0.sup.n−y.sub.H.sup.w).sup.2)) where, β.sub.n is the lighting guidance coefficient, A.sub.n is the average value of pixel values for pixels in the lighting area corresponding to the n.sup.th lighting tube, (x.sub.0.sup.n, y.sub.0.sup.n) is the coordinates of the n.sup.th lighting tube in the spatial coordinate system, (x.sub.1.sup.l, y.sub.1.sup.w) is the coordinates of the first pixel in the set of pixels to be guided in the spatial coordinate system, (x.sub.2.sup.l, y.sub.2.sup.w) is the coordinates of the second pixel in the set of pixels to be guided in the spatial coordinate system, (x.sub.H.sup.l, y.sub.H.sup.w) is the coordinates of the H.sup.th pixel in the set of pixels to be guided in the spatial coordinate system, and H is the number of pixels in the set of pixels to be guided; adjusting the brightness of the lighting tubes in the lighting areas corresponding to the β.sub.n values according to the order of the β.sub.n values, and guiding passengers to flow toward the lighting area with a smaller β.sub.n value.
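The lighting guidance coefficient of claim 11 reduces to a distance-weighted sum per tube. In this sketch the function and argument names are illustrative; only the β.sub.n formula and the "guide toward smaller β.sub.n" ordering come from the claim.

```python
import math

def guidance_coefficients(tube_coords, tube_area_avgs, guided_pixels):
    """Claim 11: beta_n is the lighting area's average pixel value A_n times
    the summed Euclidean distances from tube n to every pixel in the set of
    pixels to be guided."""
    betas = []
    for (x0, y0), A in zip(tube_coords, tube_area_avgs):
        dist_sum = sum(math.hypot(x0 - xh, y0 - yh) for xh, yh in guided_pixels)
        betas.append(A * dist_sum)
    # Tube indices ordered by beta_n; passengers are guided toward areas
    # whose tubes have smaller beta_n values (those tubes are brightened).
    order = sorted(range(len(betas)), key=lambda n: betas[n])
    return order, betas
```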

12. A system for guiding passengers in a subway car based on environmental monitoring and lighting guidance, comprising: a real-time data obtaining unit, configured to obtain an environmental data sample at each sample collection point of each car during each sampling period in real time, the environmental data sample being constituted by installation coordinates of each sample collection point, a temperature time series, a humidity time series, and a CO.sub.2 concentration time series corresponding to the sample collection point, and an average temperature value and an average humidity value; a prediction unit, configured to input the environmental data samples into the distribution prediction model for the corresponding car which is trained by the training method according to claim 1, to obtain a predicted passenger distribution code for each car during the corresponding sampling period; and an adjustment and guidance unit, configured to adjust the brightness of lighting tubes in the corresponding car according to the predicted passenger distribution code for the car during the corresponding sampling period to guide passenger flow.

13. The system for guiding passengers in the subway car according to claim 12, wherein the adjustment and guidance unit is specifically configured to: obtain a current image clustering result according to the predicted passenger distribution code for the car during the corresponding sampling period, and determine a clustering center corresponding to the current image according to the current image clustering result; define S binary images closest to the clustering center as typical clustering images of a corresponding category, and calculate an average pixel value of each pixel position in the S typical clustering images, to obtain an average pixel image under the category; assign each lighting tube to a lighting area according to the layout of lighting tubes in the corresponding car, and calculate an average value of pixel values for pixels in the lighting area corresponding to each lighting tube; if the maximum pixel value in the average pixel image is greater than a pixel threshold, start lighting guidance and perform the next step, otherwise skip lighting guidance; extract pixels in the average pixel image whose average pixel values are greater than the pixel threshold, the pixels greater than the pixel threshold constituting a set of pixels to be guided; calculate a lighting guidance coefficient according to the coordinates of the lighting tubes in the spatial coordinate system, the coordinates of pixels in the set of pixels to be guided in the spatial coordinate system, and the average value of pixel values for the pixels in the lighting area, wherein the specific calculation expression is: β.sub.n=A.sub.n(√((x.sub.0.sup.n−x.sub.1.sup.l).sup.2+(y.sub.0.sup.n−y.sub.1.sup.w).sup.2)+√((x.sub.0.sup.n−x.sub.2.sup.l).sup.2+(y.sub.0.sup.n−y.sub.2.sup.w).sup.2)+ . . . +√((x.sub.0.sup.n−x.sub.H.sup.l).sup.2+(y.sub.0.sup.n−y.sub.H.sup.w).sup.2)) where, β.sub.n is the lighting guidance coefficient, A.sub.n is the average value of pixel values for pixels in the lighting area corresponding to the n.sup.th lighting tube, (x.sub.0.sup.n, y.sub.0.sup.n) is the coordinates of the n.sup.th lighting tube in the spatial coordinate system, (x.sub.1.sup.l, y.sub.1.sup.w) is the coordinates of the first pixel in the set of pixels to be guided in the spatial coordinate system, (x.sub.2.sup.l, y.sub.2.sup.w) is the coordinates of the second pixel in the set of pixels to be guided in the spatial coordinate system, (x.sub.H.sup.l, y.sub.H.sup.w) is the coordinates of the H.sup.th pixel in the set of pixels to be guided in the spatial coordinate system, and H is the number of pixels in the set of pixels to be guided; adjust the brightness of the lighting tubes in the lighting areas corresponding to the β.sub.n values according to the order of the β.sub.n values, and guide passengers to flow toward the lighting area with a smaller β.sub.n value.

14. The training method for the distribution prediction model of passengers in the subway car according to claim 2, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: establishing a temperature time series LSTM neural network model, and randomly dividing the temperature time series in the training samples into two temperature training sets, wherein one of the temperature training sets and the passenger distribution code corresponding to the temperature training set constitute a first training set, and the other temperature training set is a second training set; training the temperature time series LSTM neural network model with the first training set to obtain a trained temperature time series LSTM neural network model; removing a classification layer of the trained temperature time series LSTM neural network model, and obtaining the temperature time series feature vector with the second training set as input and a vector outputted by a fully connected layer as output; establishing a humidity time series LSTM neural network model, and randomly dividing the humidity time series in the training samples into two humidity training sets, wherein one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute a third training set, and the other humidity training set is a fourth training set; training the humidity time series LSTM neural network model with the third training set to obtain a trained humidity time series LSTM neural network model; removing a classification layer of the trained 
humidity time series LSTM neural network model, and obtaining the humidity time series feature vector with the fourth training set as input and a vector outputted by a fully connected layer as output; and establishing a CO.sub.2 concentration time series LSTM neural network model, and randomly dividing the CO.sub.2 concentration time series in the training samples into two CO.sub.2 concentration training sets, wherein one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute a fifth training set, and the other CO.sub.2 concentration training set is a sixth training set; training the CO.sub.2 concentration time series LSTM neural network model with the fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; removing a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model, and obtaining the CO.sub.2 concentration time series feature vector with the sixth training set as input and a vector outputted by a fully connected layer as output.

15. The training method for the distribution prediction model of passengers in the subway car according to claim 3, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: establishing a temperature time series LSTM neural network model, and randomly dividing the temperature time series in the training samples into two temperature training sets, wherein one of the temperature training sets and the passenger distribution code corresponding to the temperature training set constitute a first training set, and the other temperature training set is a second training set; training the temperature time series LSTM neural network model with the first training set to obtain a trained temperature time series LSTM neural network model; removing a classification layer of the trained temperature time series LSTM neural network model, and obtaining the temperature time series feature vector with the second training set as input and a vector outputted by a fully connected layer as output; establishing a humidity time series LSTM neural network model, and randomly dividing the humidity time series in the training samples into two humidity training sets, wherein one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute a third training set, and the other humidity training set is a fourth training set; training the humidity time series LSTM neural network model with the third training set to obtain a trained humidity time series LSTM neural network model; removing a classification layer of the trained 
humidity time series LSTM neural network model, and obtaining the humidity time series feature vector with the fourth training set as input and a vector outputted by a fully connected layer as output; and establishing a CO.sub.2 concentration time series LSTM neural network model, and randomly dividing the CO.sub.2 concentration time series in the training samples into two CO.sub.2 concentration training sets, wherein one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute a fifth training set, and the other CO.sub.2 concentration training set is a sixth training set; training the CO.sub.2 concentration time series LSTM neural network model with the fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; removing a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model, and obtaining the CO.sub.2 concentration time series feature vector with the sixth training set as input and a vector outputted by a fully connected layer as output.

16. The training method for the distribution prediction model of passengers in the subway car according to claim 4, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: establishing a temperature time series LSTM neural network model, and randomly dividing the temperature time series in the training samples into two temperature training sets, wherein one of the temperature training sets and the passenger distribution code corresponding to the temperature training set constitute a first training set, and the other temperature training set is a second training set; training the temperature time series LSTM neural network model with the first training set to obtain a trained temperature time series LSTM neural network model; removing a classification layer of the trained temperature time series LSTM neural network model, and obtaining the temperature time series feature vector with the second training set as input and a vector outputted by a fully connected layer as output; establishing a humidity time series LSTM neural network model, and randomly dividing the humidity time series in the training samples into two humidity training sets, wherein one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute a third training set, and the other humidity training set is a fourth training set; training the humidity time series LSTM neural network model with the third training set to obtain a trained humidity time series LSTM neural network model; removing a classification layer of the trained 
humidity time series LSTM neural network model, and obtaining the humidity time series feature vector with the fourth training set as input and a vector outputted by a fully connected layer as output; and establishing a CO.sub.2 concentration time series LSTM neural network model, and randomly dividing the CO.sub.2 concentration time series in the training samples into two CO.sub.2 concentration training sets, wherein one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute a fifth training set, and the other CO.sub.2 concentration training set is a sixth training set; training the CO.sub.2 concentration time series LSTM neural network model with the fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; removing a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model, and obtaining the CO.sub.2 concentration time series feature vector with the sixth training set as input and a vector outputted by a fully connected layer as output.

17. The training method for the distribution prediction model of passengers in the subway car according to claim 5, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: establishing a temperature time series LSTM neural network model, and randomly dividing the temperature time series in the training samples into two temperature training sets, wherein one of the temperature training sets and the passenger distribution code corresponding to the temperature training set constitute a first training set, and the other temperature training set is a second training set; training the temperature time series LSTM neural network model with the first training set to obtain a trained temperature time series LSTM neural network model; removing a classification layer of the trained temperature time series LSTM neural network model, and obtaining the temperature time series feature vector with the second training set as input and a vector outputted by a fully connected layer as output; establishing a humidity time series LSTM neural network model, and randomly dividing the humidity time series in the training samples into two humidity training sets, wherein one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute a third training set, and the other humidity training set is a fourth training set; training the humidity time series LSTM neural network model with the third training set to obtain a trained humidity time series LSTM neural network model; removing a classification layer of the trained 
humidity time series LSTM neural network model, and obtaining the humidity time series feature vector with the fourth training set as input and a vector outputted by a fully connected layer as output; and establishing a CO.sub.2 concentration time series LSTM neural network model, and randomly dividing the CO.sub.2 concentration time series in the training samples into two CO.sub.2 concentration training sets, wherein one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute a fifth training set, and the other CO.sub.2 concentration training set is a sixth training set; training the CO.sub.2 concentration time series LSTM neural network model with the fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; removing a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model, and obtaining the CO.sub.2 concentration time series feature vector with the sixth training set as input and a vector outputted by a fully connected layer as output.

18. The training method for the distribution prediction model of passengers in the subway car according to claim 6, wherein the temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training samples are replaced with a temperature time series feature vector, a humidity time series feature vector, and a CO.sub.2 concentration time series feature vector, respectively, wherein the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector are obtained as follows: establishing a temperature time series LSTM neural network model, and randomly dividing the temperature time series in the training samples into two temperature training sets, wherein one of the temperature training sets and the passenger distribution code corresponding to the temperature training set constitute a first training set, and the other temperature training set is a second training set; training the temperature time series LSTM neural network model with the first training set to obtain a trained temperature time series LSTM neural network model; removing a classification layer of the trained temperature time series LSTM neural network model, and obtaining the temperature time series feature vector with the second training set as input and a vector outputted by a fully connected layer as output; establishing a humidity time series LSTM neural network model, and randomly dividing the humidity time series in the training samples into two humidity training sets, wherein one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute a third training set, and the other humidity training set is a fourth training set; training the humidity time series LSTM neural network model with the third training set to obtain a trained humidity time series LSTM neural network model; removing a classification layer of the trained 
humidity time series LSTM neural network model, and obtaining the humidity time series feature vector with the fourth training set as input and a vector outputted by a fully connected layer as output; and establishing a CO.sub.2 concentration time series LSTM neural network model, and randomly dividing the CO.sub.2 concentration time series in the training samples into two CO.sub.2 concentration training sets, wherein one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute a fifth training set, and the other CO.sub.2 concentration training set is a sixth training set; training the CO.sub.2 concentration time series LSTM neural network model with the fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; removing a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model, and obtaining the CO.sub.2 concentration time series feature vector with the sixth training set as input and a vector outputted by a fully connected layer as output.

19. A system for guiding passengers in a subway car based on environmental monitoring and lighting guidance, comprising: a real-time data obtaining unit, configured to obtain an environmental data sample at each sample collection point of each car during each sampling period in real time, the environmental data sample being constituted by installation coordinates of each sample collection point, a temperature time series, a humidity time series, and a CO.sub.2 concentration time series corresponding to the sample collection point, and an average temperature value and an average humidity value; a prediction unit, configured to input the environmental data samples into the distribution prediction model for the corresponding car which is trained by the training method according to claim 2, to obtain a predicted passenger distribution code for each car during the corresponding sampling period; and an adjustment and guidance unit, configured to adjust the brightness of lighting tubes in the corresponding car according to the predicted passenger distribution code for the car during the corresponding sampling period to guide passenger flow.

20. A system for guiding passengers in a subway car based on environmental monitoring and lighting guidance, comprising: a real-time data obtaining unit, configured to obtain an environmental data sample at each sample collection point of each car during each sampling period in real time, the environmental data sample being constituted by installation coordinates of each sample collection point, a temperature time series, a humidity time series, and a CO.sub.2 concentration time series corresponding to the sample collection point, and an average temperature value and an average humidity value; a prediction unit, configured to input the environmental data samples into the distribution prediction model for the corresponding car which is trained by the training method according to claim 3, to obtain a predicted passenger distribution code for each car during the corresponding sampling period; and an adjustment and guidance unit, configured to adjust the brightness of lighting tubes in the corresponding car according to the predicted passenger distribution code for the car during the corresponding sampling period to guide passenger flow.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0073] In order to more clearly illustrate the technical solutions of the present invention, the drawings used in the description of the embodiments will be briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those of ordinary skill in the art can also obtain other drawings according to the drawings without any creative effort.

[0074] FIG. 1 is a flowchart of a training method for a distribution prediction model of passengers in a subway car in an embodiment of the present invention;

[0075] FIG. 2 is a flowchart of a passenger guiding method for a subway car based on environmental monitoring and lighting guidance in an embodiment of the present invention;

[0076] FIG. 3 is a schematic diagram of the layout of lighting tubes and lighting areas in an embodiment of the present invention.

DETAILED DESCRIPTION OF THE EMBODIMENTS

[0077] A clear and complete description will be made to the technical solutions in the present invention below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the embodiments described are only part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without any creative effort shall fall within the protection scope of the present invention.

[0078] As shown in FIG. 1, a training method for a distribution prediction model of passengers in a subway car provided by this embodiment includes:

[0079] 1. Obtaining of Training Samples

[0080] 1.1 Establishment of a Coordinate System and Setting of Collection Devices

[0081] A spatial coordinate system of each car is established, a collection device constituted by temperature and humidity sensors and a CO.sub.2 concentration sensor is installed in each car, each collection device serves as a sample collection point, and installation coordinates of each sample collection point in the corresponding spatial coordinate system are obtained. The spatial coordinate system is a two-dimensional rectangular coordinate system parallel to the roof or the ground, or the spatial coordinate system is a two-dimensional rectangular coordinate system coincident with the roof or the ground. The spatial coordinate system facilitates the mapping of sample collection points and passengers (or independent units constituted by passengers) into the spatial coordinate system.

[0082] In this embodiment, the spatial coordinate system is a two-dimensional rectangular coordinate system with the horizontal centerline of the roof as the x-axis and the longitudinal centerline of the roof as the y-axis, or the spatial coordinate system is a two-dimensional rectangular coordinate system with the horizontal centerline of the ground as the x-axis and the longitudinal centerline of the ground as the y-axis.

[0083] In order to obtain the mapping relationship between the distribution of passengers in the subway car and environmental parameters, original environmental data is first collected by the collection devices. 15 collection devices are installed on the roof of each car, arranged in a 3×5 array. It is assumed that the installation coordinates of the i.sup.th sample collection point in the corresponding spatial coordinate system are P.sub.i(x.sub.i, y.sub.i), i=1, 2, . . . , o, and o is 15 in this embodiment.
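As an illustrative sketch only, the 3×5 roof layout above can be expressed as installation coordinates in the centerline-based coordinate system. The car dimensions (3.0 m wide, 20.0 m long) are hypothetical values for illustration, not figures from the embodiment.

```python
# Sketch: installation coordinates P_i = (x_i, y_i) of a 3x5 array of
# collection devices on the car roof, with the origin at the intersection
# of the horizontal and longitudinal centerlines. Car dimensions are
# illustrative assumptions, not values from the embodiment.
import numpy as np

CAR_WIDTH, CAR_LENGTH = 3.0, 20.0   # metres (hypothetical)
ROWS, COLS = 3, 5                   # 3x5 array of collection devices

# Evenly spaced rows/columns, leaving a margin at the car edges.
xs = np.linspace(-CAR_WIDTH / 2, CAR_WIDTH / 2, ROWS + 2)[1:-1]
ys = np.linspace(-CAR_LENGTH / 2, CAR_LENGTH / 2, COLS + 2)[1:-1]
points = [(round(float(x), 2), round(float(y), 2)) for x in xs for y in ys]
print(len(points))  # 15 sample collection points, o = 15
```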

[0084] 1.2 Obtaining of Training Input Samples (i.e. Original Environmental Data Samples)

[0085] A temperature time series, a humidity time series, and a CO.sub.2 concentration time series at each sample collection point of each car during a sampling period are obtained, an average temperature value and an average humidity value locally of the day are obtained; and the installation coordinates of each sample collection point, the temperature time series, the humidity time series, and the CO.sub.2 concentration time series corresponding to the sample collection point, and the average temperature value and the average humidity value constitute an environmental data sample at each sample collection point of each car during the sampling period.

[0086] The time when the screen doors of the subway are opened after the arrival of the train is defined as the start time 0 of the sampling period, the time when the positions of passengers are basically stable after completion of passenger boarding (i.e. no crowd flow) is defined as the end time T of the sampling period, and the collection device obtains a temperature time series

[00004] Temp.sub.i={temp.sub.t.sup.i, t=1, 2, . . . , T/t.sub.0},

a humidity time series

[00005] RH.sub.i={Rh.sub.t.sup.i, t=1, 2, . . . , T/t.sub.0}

and a CO.sub.2 concentration time series

[00006] C.sub.CO2.sup.i={c.sub.t.sup.i, t=1, 2, . . . , T/t.sub.0}

of the i.sup.th sample collection point during a sampling period, where t.sub.0 is the sampling interval, T is the duration of the sampling period, temp.sub.t.sup.i is the temperature value of the i.sup.th sample collection point at time t, Rh.sub.t.sup.i is the humidity value of the i.sup.th sample collection point at time t, and c.sub.t.sup.i is the CO.sub.2 concentration value of the i.sup.th sample collection point at time t. An average temperature value temp.sub.0 and an average humidity value Rh.sub.0 locally of the day are obtained; then the environmental data sample obtained during a sampling period (time 0 to time T) is E=[x.sub.i, y.sub.i, temp.sub.t.sup.i, Rh.sub.t.sup.i, c.sub.t.sup.i, temp.sub.0, Rh.sub.0].
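The assembly of one environmental data sample E can be sketched as follows. The sampling interval t.sub.0, the period length T, and all sensor readings below are hypothetical values chosen only to show the structure of E.

```python
# Sketch: assembling one environmental data sample
# E = [x_i, y_i, Temp_i, RH_i, C_i, temp_0, Rh_0] for collection point i.
# All numeric values are illustrative assumptions.
import numpy as np

t0, T = 5.0, 60.0                     # sampling interval and period (hypothetical)
n = int(T / t0)                       # length of each time series, T/t_0 = 12
rng = np.random.default_rng(0)

temp_series = 24 + 0.5 * rng.standard_normal(n)   # temp_t^i
rh_series = 55 + 2.0 * rng.standard_normal(n)     # Rh_t^i
co2_series = 600 + 30 * rng.standard_normal(n)    # c_t^i
x_i, y_i = 0.75, -4.0                 # installation coordinates of point i
temp0, rh0 = 26.0, 60.0               # local daily average temperature/humidity

sample = np.concatenate([[x_i, y_i], temp_series, rh_series, co2_series, [temp0, rh0]])
print(sample.shape)  # (40,): 2 coordinates + 3 series of length 12 + 2 averages
```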

[0087] 1.3 Obtaining of Training Output Samples (i.e. Coding of Passenger Distribution)

[0088] 1.31 Obtaining of Passenger Distribution

[0089] In order to avoid the problem of low accuracy of passenger distribution detection due to obscuration and the like in other existing passenger distribution collection modes, passenger positions in each car at the end time T of the sampling period are recorded by manual statistics, and coordinates of passengers in the corresponding spatial coordinate system are obtained, i.e. a passenger distribution for each car during the sampling period is obtained, specifically denoted as D={(x.sub.j, y.sub.j), j=1, 2, . . . , N}, where j indexes the j.sup.th passenger, N is the number of passengers in the car, and (x.sub.j, y.sub.j) is the coordinates of the j.sup.th passenger in the corresponding spatial coordinate system.

[0090] 1.32 Coding of a Passenger Distribution Mode of Each Car

[0091] In order to quantitatively describe the passenger distribution, the passenger distribution mode is coded. The specific coding steps are as follows:

[0092] a. The passenger distribution for each car during the sampling period (time 0 to time T) is converted into a binary image, specifically:

[0093] a1. Each car is divided into W equal parts along the x-axis of the spatial coordinate system of the car, and each car is divided into L equal parts along the y-axis of the spatial coordinate system of the car, to convert the space of each car into W×L independent units. In this embodiment, L=3 and W=10.

[0094] a2. Each independent unit is set as a pixel, and the number of passengers in each independent unit is a pixel value of the pixel, that is, the number of passengers I (l, w) in the independent unit of row l and column w is the pixel value of the pixel in row l and column w, l=1, 2, 3, . . . , L, w=1, 2, 3, . . . , W.

[0095] a3. The pixel values of all pixels are obtained, and the passenger distribution is converted into a binary image constituted by W×L pixels.
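Steps a1-a3 can be sketched as follows: each passenger coordinate is mapped to one of the L×W independent units, and the per-unit passenger counts become the pixel values I(l, w). The car dimensions used for the unit boundaries are hypothetical.

```python
# Sketch: converting a passenger distribution D = {(x_j, y_j)} into the
# L x W image whose pixel value I(l, w) is the passenger count in unit (l, w).
# Car dimensions are illustrative assumptions.
import numpy as np

L, W = 3, 10                        # divisions as in the embodiment
CAR_WIDTH, CAR_LENGTH = 3.0, 20.0   # hypothetical car dimensions

def distribution_to_image(coords):
    img = np.zeros((L, W))
    for x, y in coords:
        # Shift origin from the car centre to a corner, then bin.
        l = min(int((x + CAR_WIDTH / 2) / (CAR_WIDTH / L)), L - 1)
        w = min(int((y + CAR_LENGTH / 2) / (CAR_LENGTH / W)), W - 1)
        img[l, w] += 1              # one more passenger in unit (l, w)
    return img

img = distribution_to_image([(0.0, -9.0), (0.1, -9.0), (1.2, 8.0)])
print(int(img.sum()))  # 3 passengers in total
```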

[0096] b. A mean value, a variance, a maximum value I.sub.max and a minimum value I.sub.min of pixels in the binary image are calculated.

[0097] The mean value Ī of pixels in the binary image is

[00007] Ī=(1/(L×W))·Σ.sub.l=1,w=1.sup.l=L,w=W I(l,w) (1)

[0098] The variance I.sub.SDE of pixels in the binary image is

[00008] I.sub.SDE=Σ.sub.l=1,w=1.sup.l=L,w=W(I(l,w)−Ī).sup.2/(L×W) (2)
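The pixel statistics of step b follow directly from formulas (1) and (2); a minimal sketch on a toy image (the embodiment uses L=3, W=10):

```python
# Sketch: mean (1), variance (2), maximum and minimum of the pixels of a
# passenger-count image. The 2x3 image is a toy example.
import numpy as np

img = np.array([[0, 2, 1],
                [3, 0, 0]], dtype=float)
mean_I = img.mean()                    # formula (1): sum of I(l,w) over L*W
var_I = ((img - mean_I) ** 2).mean()   # formula (2): I_SDE
i_max, i_min = img.max(), img.min()    # I_max, I_min
print(mean_I)  # 1.0
```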

[0099] c. In order to obtain a feature of change in the pixel values of the binary image in different directions, an amplitude and a gradient direction of each pixel in the binary image are calculated, and a frequency distribution histogram is established with the gradient direction of all pixels as the abscissa axis and the pixel value corresponding to each pixel as the ordinate axis; and a frequency distribution feature of the frequency distribution histogram is extracted, the frequency distribution feature referring to a feature vector constituted by pixel values corresponding to each interval among R intervals equally divided from the abscissa axis of the frequency distribution histogram. Formulas for calculating the amplitude and the gradient direction of each pixel in the binary image are respectively:

[00009] G(l,w)=√(G.sub.x(l,w).sup.2+G.sub.y(l,w).sup.2) (3)
θ(l,w)=arctan(G.sub.y(l,w)/G.sub.x(l,w)) (4)
G.sub.x(l,w)=I(l+1,w)−I(l−1,w) (5)
G.sub.y(l,w)=I(l,w+1)−I(l,w−1) (6)

[0100] Where, G(l, w) is the amplitude of the pixel, θ(l, w) is the gradient direction of the pixel, G.sub.x(l, w) is the gradient amplitude of the pixel along the x-axis of the spatial coordinate system, G.sub.y(l, w) is the gradient amplitude of the pixel along the y-axis of the spatial coordinate system, I(l+1, w) is the pixel value of the pixel in row l+1 and column w, and I(l−1, w) is the pixel value of the pixel in row l−1 and column w.

[0101] In this embodiment, R=9, that is, the gradient direction is divided into 9 equal parts within [0, 180°], to construct 9 gradient direction intervals [0, 20°), [20, 40°), [40, 60°), [60, 80°), [80, 100°), [100, 120°), [120, 140°), [140, 160°), and [160, 180°]. At the start of each sampling period, the initial frequency value of each interval is 0, that is, the cumulative pixel value of the gradient directions in this interval is 0, and the initial gradient direction frequency vector is F.sub.0={0, 0, 0, 0, 0, 0, 0, 0, 0}, where F.sub.0 indicates that the initial frequency values of the 9 intervals are all 0. The gradient direction frequency vector is set to be F.sub.l×w={f.sub.1, . . . , f.sub.r, . . . , f.sub.9}, and if the gradient direction of the (l×w).sup.th pixel falls within the r.sup.th interval, the updated gradient direction frequency vector is F.sub.l×w={f.sub.1, . . . , f.sub.r+G(l,w), . . . , f.sub.9}, where l×w=1, 2, . . . , L×W, r=1, 2, . . . , 9. The final gradient direction frequency vector F.sub.L×W obtained after all pixels are traversed is the frequency distribution feature of the binary image.
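Formulas (3)-(6) and the 9-interval accumulation can be sketched as follows. Two details are assumptions of this sketch, not statements of the embodiment: boundary pixels (which lack a neighbor on one side) are skipped, and the direction is computed with arctan2 folded into [0, 180°) so that it always falls in one of the 9 intervals.

```python
# Sketch of formulas (3)-(6) and the gradient direction frequency vector F.
# Boundary handling and the arctan2 fold into [0, 180) are assumptions of
# this sketch; the text itself writes theta = arctan(G_y / G_x).
import numpy as np

def gradient_histogram(img, bins=9):
    L, W = img.shape
    hist = np.zeros(bins)                             # F_0 = {0, ..., 0}
    for l in range(1, L - 1):                         # interior pixels only
        for w in range(1, W - 1):
            gx = img[l + 1, w] - img[l - 1, w]        # (5)
            gy = img[l, w + 1] - img[l, w - 1]        # (6)
            g = np.hypot(gx, gy)                      # (3) amplitude G(l, w)
            theta = np.degrees(np.arctan2(gy, gx)) % 180.0  # (4), folded
            r = min(int(theta // (180.0 / bins)), bins - 1)
            hist[r] += g                              # f_r <- f_r + G(l, w)
    return hist

img = np.array([[0, 1, 0, 0],
                [2, 0, 1, 0],
                [0, 3, 0, 1]], dtype=float)
F = gradient_histogram(img)
print(F.shape)  # (9,)
```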

[0102] d. The mean value, the maximum value, the minimum value, the variance and the frequency distribution feature of the pixels are combined to form an original feature vector P=[Ī,I.sub.max,I.sub.min,I.sub.SDE,F.sub.L×W], and the binary images are clustered with the original feature vector P as input of a clustering algorithm to obtain image clustering results.

[0103] The clustering algorithm for classification is the prior art. In this embodiment, the clustering algorithm is an OPTICS algorithm for clustering the binary images to obtain image clustering results. The initial neighborhood distance parameter ε of the OPTICS algorithm is set to 0.2, and the initial neighborhood sample parameter MinPts is set to 3. The OPTICS algorithm is the prior art, and the OPTICS algorithm is not sensitive to input parameters, which improves the accuracy of clustering results.
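The embodiment clusters the original feature vectors P with the OPTICS algorithm (available, for example, in scikit-learn). As a dependency-free illustration of density-based clustering over such feature vectors, here is a minimal DBSCAN-style routine; it is a simplified stand-in for, not an implementation of, OPTICS, and the eps/min_pts values are illustrative.

```python
# Simplified density-based clustering sketch (DBSCAN-style), standing in for
# the OPTICS clustering of the original feature vectors P described above.
import numpy as np

def density_cluster(X, eps=0.5, min_pts=3):
    n = len(X)
    labels = np.full(n, -1)            # -1 = noise / unvisited
    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        nbrs = [j for j in range(n) if np.linalg.norm(X[i] - X[j]) <= eps]
        if len(nbrs) < min_pts:
            continue                   # not a core point (for now)
        labels[i] = cluster
        stack = list(nbrs)
        while stack:                   # expand the cluster from core points
            j = stack.pop()
            if labels[j] != -1:
                continue
            labels[j] = cluster
            jn = [k for k in range(n) if np.linalg.norm(X[j] - X[k]) <= eps]
            if len(jn) >= min_pts:
                stack.extend(jn)
        cluster += 1
    return labels

X = np.vstack([np.zeros((4, 2)), np.ones((4, 2)) * 5])  # two tight groups
labels = density_cluster(X, eps=1.0, min_pts=3)
print(len(set(labels.tolist())))  # 2 clusters found
```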

[0104] e. The passenger distribution mode is coded according to the image clustering results to obtain a passenger distribution code. Each category of image clustering result corresponds to a passenger distribution mode, and each passenger distribution mode corresponds to a passenger distribution code.

[0105] The clustering results show that images in the same category are the same category of image samples, the passenger distribution mode corresponding to the same category of image samples is the same type of distribution mode, and each type of passenger distribution mode corresponds to a passenger distribution code. For example, if the image clustering results show that the images are divided into 5 categories, then 5 passenger distribution codes [1, 0, 0, 0, 0].sup.T, [0, 1, 0, 0, 0].sup.T, [0, 0, 1, 0, 0].sup.T, [0, 0, 0, 1, 0].sup.T and [0, 0, 0, 0, 1].sup.T can be obtained according to binary codes. The passenger distribution codes are sorted in descending order according to the number of image samples in the category each belongs to: the first passenger distribution code is [1, 0, 0, 0, 0].sup.T, the second is [0, 1, 0, 0, 0].sup.T, the third is [0, 0, 1, 0, 0].sup.T, the fourth is [0, 0, 0, 1, 0].sup.T, and the fifth is [0, 0, 0, 0, 1].sup.T.
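The assignment of one-hot codes ordered by descending category size can be sketched as:

```python
# Sketch: assigning passenger distribution codes from cluster labels,
# largest category first, as in the coding example above.
import numpy as np

labels = np.array([0, 0, 1, 1, 1, 2])     # toy cluster labels per image
cats, counts = np.unique(labels, return_counts=True)
order = cats[np.argsort(-counts)]         # categories sorted by size, descending
K = len(cats)
# rank 0 gets [1,0,...], rank 1 gets [0,1,...], and so on
codes = {int(c): np.eye(K, dtype=int)[rank] for rank, c in enumerate(order)}
print(codes[1])  # largest category (3 samples) -> [1 0 0]
```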

[0106] 1.4 Processing of the Original Environmental Data Samples

[0107] The temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the original environmental data samples have high dimensionality. In order to reduce the dimensionality of input training samples and reduce the complexity of subsequent model training and testing, a LSTM neural network model is established to extract feature vectors. The specific process is:

[0108] A temperature time series LSTM neural network model is established, and the temperature time series LSTM neural network model is trained with a first training set to obtain a trained temperature time series LSTM neural network model; a classification layer of the trained temperature time series LSTM neural network model is removed, and a temperature time series feature vector is obtained with a second training set as input and a vector outputted by a fully connected layer as output; the temperature time series during a sampling period in step 1.21 is randomly divided into two temperature training sets, one of the temperature training sets and the passenger distribution code corresponding to the temperature training set (step e) constitute the first training set, and the other temperature training set is the second training set. For example, the temperature time series

[00010] Temp i = { temp t i , t = 1 , 2 , .Math. , T t 0 }

during a sampling period is divided into two temperature training sets according to a proportion of 5:3, the time series has a length of T/t.sub.0, and the feature dimension is the number o of the sample collection points. The humidity time series and the CO.sub.2 concentration time series are processed in similar ways as that of the temperature time series.

[0109] A humidity time series LSTM neural network model is established, and the humidity time series LSTM neural network model is trained with a third training set to obtain a trained humidity time series LSTM neural network model; a classification layer of the trained humidity time series LSTM neural network model is removed, and a humidity time series feature vector is obtained with a fourth training set as input and a vector outputted by a fully connected layer as output; the humidity time series during a sampling period in step 1.21 is randomly divided into two humidity training sets, one of the humidity training sets and the passenger distribution code corresponding to the humidity training set constitute the third training set, and the other humidity training set is the fourth training set.

[0110] A CO.sub.2 concentration time series LSTM neural network model is established, and the CO.sub.2 concentration time series LSTM neural network model is trained with a fifth training set to obtain a trained CO.sub.2 concentration time series LSTM neural network model; a classification layer of the trained CO.sub.2 concentration time series LSTM neural network model is removed, and a CO.sub.2 concentration time series feature vector is obtained with a sixth training set as input and a vector outputted by a fully connected layer as output; the CO.sub.2 concentration time series during a sampling period in step 1.21 is randomly divided into two CO.sub.2 concentration training sets, one of the CO.sub.2 concentration training sets and the passenger distribution code corresponding to the CO.sub.2 concentration training set constitute the fifth training set, and the other CO.sub.2 concentration training set is the sixth training set.

[0111] The temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the training input samples are replaced with the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector respectively, to obtain new environmental data samples E′ in the training input samples.

[0112] The establishment of the LSTM neural network model is the prior art. In this embodiment, the number of neurons in a hidden layer of the LSTM neural network model is 100, and the number of training iterations is 2000.
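The feature-extraction idea of step 1.4 — train a network ending in a classification layer, then drop that layer and read the fully connected layer's output as the feature vector — can be sketched with a toy feed-forward network in NumPy. This is an illustration of the remove-the-classification-layer idea only; the embodiment uses trained LSTM networks, and all dimensions and weights below are hypothetical.

```python
# Toy sketch of "remove the classification layer, keep the fully connected
# layer's output as the feature vector". Weights are random stand-ins for a
# trained network; a real implementation would use an LSTM as described.
import numpy as np

rng = np.random.default_rng(1)
W_fc = rng.standard_normal((8, 20))    # fully connected layer (pretend trained)
W_cls = rng.standard_normal((5, 8))    # classification layer (5 distribution codes)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

x = rng.standard_normal(20)            # a flattened time series of length T/t_0
h = np.tanh(W_fc @ x)                  # fully connected layer output
probs = softmax(W_cls @ h)             # with classification layer: class probabilities
feature = h                            # classification layer removed: 8-dim feature
print(feature.shape)  # (8,)
```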

[0113] 2. Establishment of a Distribution Prediction Model for Each Car, and Training of the Distribution Prediction Model (i.e. Obtaining of a Mapping Relationship Between Environmental Data and Passenger Distribution)

[0114] A distribution prediction model for each car is established, and the distribution prediction model is trained with training samples to obtain a trained distribution prediction model, the training sample taking the new environmental data sample E′ at each sample collection point of the car during a sampling period in step 1 as input and the corresponding passenger distribution code as output.

[0115] The distribution prediction model may be any of a support vector machine model, a relevance vector machine model or a neural network model, and the establishment of the distribution prediction model is the prior art.
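The distribution prediction model maps an environmental data sample E′ to a passenger distribution code. As a dependency-free illustration of that mapping, here is a nearest-centroid classifier over toy feature vectors; it is a simplified stand-in, not one of the models (SVM, relevance vector machine, neural network) named above.

```python
# Nearest-centroid sketch of the trained distribution prediction model:
# input a feature sample, output the index of a passenger distribution code.
import numpy as np

def fit_centroids(X, y):
    # One centroid per distribution-code index.
    return {int(c): X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    # Nearest centroid wins.
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.1, (10, 4)),    # samples of code index 0
               rng.normal(3, 0.1, (10, 4))])   # samples of code index 1
y = np.array([0] * 10 + [1] * 10)
model = fit_centroids(X, y)
print(predict(model, np.full(4, 3.0)))  # 1
```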

[0116] A training system for a distribution prediction model of passengers in a subway car provided by this embodiment includes a plurality of training subsystems, each car corresponds to a training subsystem, and each training subsystem includes a plurality of collection devices arranged on the roof of the car and a control device; and the collection device includes temperature and humidity sensors and a CO.sub.2 concentration sensor. The setting of the collection device is recorded in step 1.1.

[0117] The control device may be shared by the entire train, or each car corresponds to a control device. Corresponding to the training method for a distribution prediction model of passengers in a subway car, the control device includes:

[0118] a coordinate system establishment unit, configured to establish a spatial coordinate system of each car, and obtain installation coordinates of each sample collection point in the corresponding spatial coordinate system, each collection device serving as a sample collection point;

[0119] an environmental data obtaining unit, configured to obtain a temperature time series, a humidity time series, and a CO.sub.2 concentration time series of each sample collection point of each car during a sampling period, and obtain an average temperature value and an average humidity value locally of the day, wherein the installation coordinates of each sample collection point, the temperature time series, the humidity time series, and the CO.sub.2 concentration time series corresponding to the sample collection point, and the average temperature value and average humidity value constitute an environmental data sample of each sample collection point of each car during the sampling period;

[0120] a passenger distribution obtaining unit, configured to obtain coordinates of passengers in the corresponding spatial coordinate system according to the recorded passenger positions in each car at an end time of the sampling period, i.e. obtain a passenger distribution for each car during the sampling period;

[0121] an image conversion and feature extraction unit, configured to convert the passenger distribution for each car during the sampling period into a binary image, and calculate a mean value, a variance, a maximum value and a minimum value of pixels in the binary image according to formulas (1) and (2); calculate an amplitude and a gradient direction of each pixel in the binary image according to formulas (3) to (6), and establish a frequency distribution histogram with the gradient direction of all pixels as the abscissa axis and the pixel value corresponding to each pixel as the ordinate axis;

[0122] and extract a frequency distribution feature of the frequency distribution histogram, the frequency distribution feature referring to a feature vector constituted by pixel values corresponding to each interval among R intervals equally divided from the abscissa axis of the frequency distribution histogram;

[0123] a clustering classification unit, configured to combine the mean value, the maximum value, the minimum value, the variance and the frequency distribution feature of the pixels to form an original feature vector, and cluster the binary images with the original feature vector as input of a clustering algorithm to obtain image clustering results;

[0124] a coding unit, configured to code the passenger distribution mode according to the image clustering results to obtain a passenger distribution code, each type of image clustering result corresponding to a passenger distribution mode, and each passenger distribution mode corresponding to a passenger distribution code; and

[0125] a model training unit, configured to establish a distribution prediction model for each car, and train the distribution prediction model with training samples to obtain a trained distribution prediction model, the training sample taking the environmental data sample at each sample collection point of each car during the sampling period as input and the corresponding passenger distribution code as output.

[0126] As shown in FIG. 2, a method for guiding passengers in a subway car based on environmental monitoring and lighting guidance provided by this embodiment includes:

[0127] 3.1 Obtaining of Prediction Samples

[0128] 3.11 During the prediction, the collection devices set in step 1.1 obtain an environmental data sample at each sample collection point of each car during each sampling period in real time. The environmental data sample is constituted by installation coordinates of each sample collection point, a temperature time series, a humidity time series, and a CO.sub.2 concentration time series corresponding to the sample collection point, and an average temperature value and an average humidity value.

[0129] 3.12 The temperature time series in the environmental data samples of step 3.11 is inputted into the trained temperature time series LSTM neural network model with the classification layer removed in step 1.4, the vector outputted by the fully connected layer being a corresponding temperature time series feature vector;

[0130] The humidity time series in the environmental data samples of step 3.11 is inputted into the trained humidity time series LSTM neural network model with the classification layer removed in step 1.4, the vector outputted by the fully connected layer being a corresponding humidity time series feature vector;

[0131] The CO.sub.2 concentration time series in the environmental data samples of step 3.11 is inputted into the trained CO.sub.2 concentration time series LSTM neural network model with the classification layer removed in step 1.4, the vector outputted by the fully connected layer being a corresponding CO.sub.2 concentration time series feature vector.

[0132] 3.13 The temperature time series, the humidity time series, and the CO.sub.2 concentration time series in the environmental data samples obtained in real time are replaced with the temperature time series feature vector, the humidity time series feature vector, and the CO.sub.2 concentration time series feature vector respectively, to obtain new environmental data samples obtained in real time.
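Steps 3.12–3.13 amount to running each raw series through the corresponding trained LSTM (with its classification layer removed) and keeping the network's final output vector as the feature vector. The following is a minimal, self-contained NumPy sketch; the single LSTM cell, hidden size, and random weights are illustrative assumptions standing in for the trained networks of step 1.4, not the patent's actual models:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_feature_vector(series, Wx, Wh, b):
    """Run a single-layer LSTM over a 1-D time series and return the final
    hidden state as the feature vector (no classification layer applied)."""
    hidden = Wh.shape[1]
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x_t in series:
        z = Wx * x_t + Wh @ h + b                     # stacked gate pre-activations i|f|g|o
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell-state update
        h = sigmoid(o) * np.tanh(c)                   # hidden-state update
    return h

# Illustrative use: a 60-step temperature series, hidden size 8.
rng = np.random.default_rng(0)
hidden = 8
Wx = 0.1 * rng.normal(size=4 * hidden)            # input weights (scalar input)
Wh = 0.1 * rng.normal(size=(4 * hidden, hidden))  # recurrent weights
b = np.zeros(4 * hidden)
temperature_series = 24.0 + 0.1 * rng.normal(size=60)
feat = lstm_feature_vector(
    (temperature_series - temperature_series.mean()) / temperature_series.std(),
    Wx, Wh, b)
```

In the actual system the trained weights of the three networks from step 1.4 would be loaded instead of random ones, and the same routine would be applied to the humidity and CO.sub.2 concentration series.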

[0133] 3.2 Obtaining of Prediction Results

[0134] The new environmental data samples obtained in real time are inputted into the distribution prediction model for the corresponding car which is trained by the training method for the distribution prediction model of passengers in the subway car, to obtain a predicted passenger distribution code for each car during the corresponding sampling period.

[0135] 3.3 Lighting Guidance

[0136] Taking the end time of the sampling period as the starting time, the brightness of the lighting tubes in the corresponding car is adjusted according to the predicted passenger distribution code for the car during the corresponding sampling period, to guide passenger flow. The specific process is:

[0137] 3.31 A current image clustering result (i.e. a category of current passenger distribution) is obtained according to the predicted passenger distribution code for the car during the corresponding sampling period, and a clustering center corresponding to the current image is determined according to the current image clustering result. The image clustering result includes the clustering center and samples around the clustering center.

[0138] 3.32 S binary images closest to the clustering center are defined as typical clustering images of a corresponding category, and an average pixel value of each pixel position in the S typical clustering images is calculated, to obtain an average pixel image under the category.

[0139] In this embodiment, S=10, and an average pixel value of the pixel position (l, w) in the 10 typical clustering images is

[00011] Ī(l,w)=(1/S)Σ.sub.k=1.sup.S I(l,w).sub.k,

where I(l, w).sub.k is the pixel value of the k.sup.th typical clustering image at the position (l, w). The average pixel value of each pixel position in the 10 typical clustering images is calculated, that is, the average of the pixels at the same (corresponding) position across the 10 typical clustering images is computed, and the number of average pixel values equals the number of pixels in a typical clustering image.
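The averaging of step 3.32 can be sketched in a few lines of NumPy; treating the binary images as 0/255 pixel values is an assumption for illustration, consistent with the pixel threshold δ=15 used below:

```python
import numpy as np

def average_pixel_image(typical_images):
    """Pixel-wise mean over the S typical clustering images."""
    stack = np.stack(typical_images).astype(float)  # shape (S, L, W)
    return stack.mean(axis=0)                       # shape (L, W)

# Illustrative use: S=10 small binary images; 3 of them mark pixel (0, 0).
S, L, W = 10, 4, 6
imgs = [np.zeros((L, W), dtype=np.uint8) for _ in range(S)]
for k in range(3):
    imgs[k][0, 0] = 255
avg = average_pixel_image(imgs)   # avg[0, 0] == 76.5, 0.0 elsewhere
```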

[0140] 3.33 According to the layout of lighting tubes in each car, each lighting tube is assigned a lighting area in accordance with the principle of proximity. As shown in FIG. 3, a dual-line parallel layout with 8 strip LED lamp panels is the typical lighting mode of a subway train. The brightness of the 8 strip LED tubes is controlled to change the brightness distribution of the passenger compartment, so as to guide passenger flow.

[0141] An average value of the pixel values for the pixels in the lighting area corresponding to each lighting tube is calculated. The line connecting the points located at equal distances from the centers of the lighting tubes in two adjacent lighting areas constitutes the dividing line of the two adjacent lighting areas.
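The proximity assignment of step 3.33 and the per-area averaging A.sub.n can be sketched as follows; mapping each pixel index to a car coordinate through a regular grid is an assumption made for illustration:

```python
import numpy as np

def area_averages(avg_image, tube_xy, pixel_xy):
    """Assign each pixel to its nearest lighting tube (principle of
    proximity) and average the pixel values within each tube's area."""
    # Pairwise distances, shape (n_pixels, n_tubes).
    d = np.linalg.norm(pixel_xy[:, None, :] - tube_xy[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    vals = avg_image.ravel()
    return np.array([vals[nearest == n].mean() if np.any(nearest == n) else 0.0
                     for n in range(len(tube_xy))])

# Illustrative use: a 2x4 average pixel image and two lighting tubes.
avg = np.arange(8, dtype=float).reshape(2, 4)
ys, xs = np.mgrid[0:2, 0:4]
pixel_xy = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
tube_xy = np.array([[0.5, 0.5], [2.5, 0.5]])   # left and right half of the car
A = area_averages(avg, tube_xy, pixel_xy)      # A == [2.5, 4.5]
```

The nearest-tube assignment implicitly realizes the dividing lines described above: pixels equidistant from two tube centers lie exactly on the boundary between the two lighting areas.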

[0142] 3.34 If the maximum pixel value Ī.sub.max in the average pixel image is greater than a pixel threshold δ, lighting guidance is started and the next step 3.35 is performed (i.e. lighting guidance is started as long as the average value for at least one pixel position is greater than the pixel threshold δ); otherwise, no lighting guidance is required. In this embodiment, δ=15, and an average value for a pixel position greater than δ indicates a crowded state.

[0143] 3.35 Pixels in the average pixel image whose average pixel values are greater than the pixel threshold δ are extracted, and these pixels constitute a set of pixels to be guided Ī.sub.A={Ī(l.sub.1,w.sub.1),Ī(l.sub.2,w.sub.2), . . . ,Ī(l.sub.H,w.sub.H)}, where Ī(l.sub.1,w.sub.1) is the average pixel value of the first pixel to be guided, Ī(l.sub.2,w.sub.2) is the average pixel value of the second pixel to be guided, Ī(l.sub.H,w.sub.H) is the average pixel value of the H.sup.th pixel to be guided, and H is the number of pixels in the set. (l.sub.1,w.sub.1) indicates that the first pixel to be guided corresponds to the independent unit in row l.sub.1 and column w.sub.1, i.e. the pixel coordinates of that pixel; the other pixels are denoted analogously.
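Steps 3.34–3.35, the threshold check and the extraction of the set Ī.sub.A, reduce to a mask over the average pixel image; a short sketch:

```python
import numpy as np

def pixels_to_guide(avg_image, delta=15.0):
    """Return coordinates (l, w) and values of the pixels whose average
    pixel value exceeds the threshold delta (the set to be guided)."""
    rows, cols = np.where(avg_image > delta)
    return list(zip(rows.tolist(), cols.tolist())), avg_image[rows, cols]

# Illustrative use: guidance starts only if the set is non-empty,
# which is equivalent to the check max(avg_image) > delta.
avg = np.array([[10.0, 20.0],
                [ 5.0, 30.0]])
coords, values = pixels_to_guide(avg, delta=15.0)
start_guidance = len(coords) > 0
```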

[0144] 3.36 A lighting guidance coefficient is calculated according to the coordinates of the lighting tubes in the spatial coordinate system, the coordinates of the pixels in the set of pixels to be guided in the spatial coordinate system, and the average value of the pixel values for the pixels in each lighting area. The specific calculation expression is:

[00012] β.sub.n=A.sub.n/(√((x.sub.0.sup.n−x.sub.1.sup.l).sup.2+(y.sub.0.sup.n−y.sub.1.sup.w).sup.2)+√((x.sub.0.sup.n−x.sub.2.sup.l).sup.2+(y.sub.0.sup.n−y.sub.2.sup.w).sup.2)+ . . . +√((x.sub.0.sup.n−x.sub.H.sup.l).sup.2+(y.sub.0.sup.n−y.sub.H.sup.w).sup.2)) (7)

[0145] Where β.sub.n is the lighting guidance coefficient, and A.sub.n is the average value of the pixel values for the pixels in the lighting area corresponding to the n.sup.th lighting tube. (x.sub.0.sup.n, y.sub.0.sup.n) is the coordinates of the n.sup.th lighting tube in the spatial coordinate system; (x.sub.1.sup.l, y.sub.1.sup.w) is the coordinates of the first pixel in the set of pixels to be guided in the spatial coordinate system; (x.sub.2.sup.l, y.sub.2.sup.w) is the coordinates of the second pixel in the set of pixels to be guided; and (x.sub.H.sup.l, y.sub.H.sup.w) is the coordinates of the H.sup.th pixel in the set of pixels to be guided.
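A sketch of the lighting guidance coefficient follows. Reading equation (7) as the area average A.sub.n divided by the summed distances from the n.sup.th tube to the pixels to be guided is an interpretation of the garbled original expression; it is consistent with step 3.37, since a tube that is close to the crowded pixels and has a high area average then receives a large β.sub.n and is dimmed:

```python
import numpy as np

def guidance_coefficients(A, tube_xy, guide_xy):
    """beta_n = A_n / (sum of distances from tube n to every pixel
    in the set to be guided)."""
    betas = []
    for A_n, (x0, y0) in zip(A, tube_xy):
        dist_sum = sum(np.hypot(x0 - x, y0 - y) for x, y in guide_xy)
        betas.append(A_n / dist_sum)
    return np.array(betas)

# Illustrative use: two tubes; the crowded pixels lie near the first tube,
# so its coefficient comes out larger and it will be dimmed.
A = [100.0, 40.0]                      # per-area average pixel values
tube_xy = [(0.0, 0.0), (10.0, 0.0)]    # tube coordinates in the car frame
guide_xy = [(1.0, 0.0), (2.0, 0.0)]    # pixels to be guided
betas = guidance_coefficients(A, tube_xy, guide_xy)
```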

[0146] 3.37 The brightness of the lighting tubes in the lighting areas is adjusted according to the order of the β.sub.n values, and passengers are guided to flow toward the lighting areas with smaller β.sub.n values.

[0147] For example, the β.sub.n values are sorted from large to small, the lighting tubes in the lighting area corresponding to the maximum β.sub.n value are adjusted to the darkest, the lighting tubes in the lighting area corresponding to the minimum β.sub.n value are adjusted to the brightest, and in general the lighting tubes in a lighting area with a smaller β.sub.n value are adjusted to be brighter than those in a lighting area with a larger β.sub.n value. The passenger flow is guided by changing the brightness distribution in the car. This guidance method is simple and direct, and enjoys a high level of public acceptance.
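The ordering rule of step 3.37 can be realized by any monotone decreasing map from β.sub.n to brightness. One option is a linear map between an assumed minimum and maximum dimming level; the 20–100% range below is purely illustrative, as the embodiment fixes only the ordering, not the absolute levels:

```python
import numpy as np

def brightness_levels(betas, b_min=20.0, b_max=100.0):
    """Map beta values to brightness percentages: largest beta -> darkest
    (b_min), smallest beta -> brightest (b_max), monotone in between."""
    betas = np.asarray(betas, dtype=float)
    lo, hi = betas.min(), betas.max()
    if hi == lo:                       # uniform crowding: no guidance gradient
        return np.full(betas.shape, (b_min + b_max) / 2.0)
    return b_max - (betas - lo) / (hi - lo) * (b_max - b_min)

# Illustrative use: the tube with the largest beta goes darkest.
levels = brightness_levels([3.0, 1.0, 2.0])   # -> [20.0, 100.0, 60.0]
```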

[0148] This embodiment further provides a system for guiding passengers in a subway car based on environmental monitoring and lighting guidance, including:

[0149] a real-time data obtaining unit, configured to obtain an environmental data sample at each sample collection point of each car during each sampling period in real time, the environmental data sample being constituted by the installation coordinates of the sample collection point, the temperature time series, the humidity time series, and the CO.sub.2 concentration time series corresponding to the sample collection point, and the average temperature value and the average humidity value;

[0150] a prediction unit, configured to input the environmental data samples into the distribution prediction model for the corresponding car which is trained by the training method for the distribution prediction model of passengers in the subway car, to obtain a predicted passenger distribution code for each car during the corresponding sampling period; and

[0151] an adjustment and guidance unit, configured to adjust the brightness of lighting tubes in the corresponding car according to the predicted passenger distribution code for the car during the corresponding sampling period to guide passenger flow.

[0152] Disclosed above are only the specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person skilled in the art can easily conceive of changes or modifications within the technical scope disclosed in the present invention, and these changes and modifications shall fall within the protection scope of the present invention.