VEHICLE DETECTION APPARATUS, METHOD AND PROGRAM

20230154314 · 2023-05-18

Abstract

An apparatus includes a signal acquisition part that acquires oscillation signals from sensors provided under the lanes of a bridge and close to an expansion joint, a signal separation part that applies blind source separation (BSS) to the oscillation signals to estimate source oscillation signals respectively separated into the plurality of lanes and adjusts the amplitude of the source oscillation signals to output amplitude adjusted oscillation signals, and a vehicle estimation part that estimates, from the amplitude adjusted oscillation signal, a response oscillation due to a vehicle passing on a lane of interest to detect and count vehicles passing on the lane.

Claims

1. A vehicle detection apparatus comprising: at least a processor; and a memory storing program instructions executable by the processor, wherein the processor is configured to execute the program instructions to implement: a signal acquisition part that acquires a plurality of oscillation signals from a plurality of sensors, respectively, each sensor provided for each lane of a bridge that includes a plurality of lanes running in parallel and capable of sensing oscillation induced by a vehicle passing on the lane; a signal separation part that receives, from the signal acquisition part, the plurality of the oscillation signals, applies blind source separation (BSS) to the plurality of the oscillation signals to estimate and separate a plurality of source oscillation signals, each source oscillation signal specific to each lane, and adjusts amplitude of an individual one of the plurality of the source oscillation signals to output a plurality of amplitude adjusted oscillation signals; and a vehicle estimation part that receives the amplitude adjusted oscillation signal corresponding to the lane of interest from the signal separation part, and estimates, from the amplitude adjusted oscillation signal, a response oscillation due to a vehicle passing on the lane of interest to detect and count individual vehicles passing on the lane of interest.

2. The vehicle detection apparatus according to claim 1, wherein the signal separation part calculates a frequency spectrum of each frame of the oscillation signal by applying Fourier transform to the each frame extracted from the oscillation signal by a sliding window, calculates a sum of a normalized frequency spectrum, from the frequency spectrum of the each frame, for each lane, detects a first peak in at least one of a plurality of the sums of normalized frequency spectrum, each of the sums corresponding to each lane, to find a first vehicle region that is a time interval during which a vehicle is travelling on a lane and there is no vehicle travelling on other lane(s), performs the BSS to the oscillation signals of a time interval including the first vehicle region to estimate and separate the source oscillation signals, calculates an amplitude ratio based on the oscillation signal measured by the sensor for each lane and a corresponding source oscillation signal specific to each lane estimated by the BSS, and adjusts an amplitude of the source oscillation signal specific to each lane, by amplifying the amplitude of the source oscillation signal with the amplitude ratio to output the amplitude adjusted oscillation signals.

3. The vehicle detection apparatus according to claim 1, wherein the vehicle estimation part obtains a feature value for each frame of the amplitude adjusted oscillation signal by applying Fourier transform to the each frame extracted by a window of a predetermined length, and calculating a feature value for the each frame in a frequency domain, performs Gaussian mixture model-clustering on a time series of the feature values for respective frames to estimate one or more clusters, each of which is modeled with a Gaussian probability distribution best fit to the time series, and detects and counts individual vehicles passing on the lane by counting the number of clusters, each of which has a probability density value greater than a predetermined threshold value for detection of a response oscillation due to a vehicle passing on the lane.

4. The vehicle detection apparatus according to claim 1, wherein the vehicle estimation part applies Fourier transform to each frame extracted by a window of a predetermined length to obtain a frequency spectrum of the each frame, calculates a normalized frequency spectrum of the each frame by normalizing the frequency spectrum of the each frame, calculates a frame-wise sum of an amplitude spectrum of the normalized frequency spectrum, for the each frame, performs scaling of the frame-wise sum to a pre-defined range, obtains a time repeated vector of the scaled frame-wise sum by multiplying a time value of each scaled frame-wise sum by a magnitude of the each frame-wise sum, performs the Gaussian mixture model-clustering on the time repeated vector, and detects and counts individual vehicles passing on the lane by counting the number of clusters, each of which has a probability density value greater than a predetermined threshold value for detection of a response oscillation due to a vehicle passing on the lane.

5. A computer-based vehicle detection method comprising: acquiring a plurality of oscillation signals from a plurality of sensors, respectively, each sensor provided for each lane of a bridge that includes a plurality of lanes running in parallel and capable of sensing oscillation induced by a vehicle passing on the lane; performing signal separation processing comprising: applying blind source separation (BSS) to the plurality of the oscillation signals to estimate and separate a plurality of source oscillation signals, each source oscillation signal specific to each lane; and adjusting amplitude of an individual one of the plurality of the source oscillation signals to output a plurality of amplitude adjusted oscillation signals; and performing estimation processing comprising: receiving the amplitude adjusted oscillation signal corresponding to the lane of interest output from the signal separation processing; and estimating, from the amplitude adjusted oscillation signal, a response oscillation due to a vehicle passing on the lane of interest to detect and count individual vehicles passing on the lane of interest.

6. The computer-based vehicle detection method according to claim 5, wherein the signal separation processing comprises: calculating a frequency spectrum of each frame of the oscillation signal by applying Fourier transform to each frame extracted by a window of a predetermined length; calculating a sum of a normalized frequency spectrum, from the frequency spectrum of each frame, for each lane, detecting a first peak in at least one of a plurality of the sums of normalized frequency spectrum, each of the sums corresponding to each lane, to find a first vehicle region that is a time interval during which a vehicle is travelling on a lane and there is no vehicle travelling on other lane(s), performing the BSS to the oscillation signals of a time interval including the first vehicle region to estimate and separate the source oscillation signals, calculating an amplitude ratio based on the oscillation signal measured by the sensor for each lane and a corresponding source oscillation signal specific to each lane estimated by the BSS, and adjusting an amplitude of the source oscillation signal specific to each lane, by amplifying the amplitude of the source oscillation signal with the amplitude ratio to output the amplitude adjusted oscillation signals.

7. The computer-based vehicle detection method according to claim 5, wherein the estimation processing comprises: obtaining a feature value for each frame of the amplitude adjusted oscillation signal by applying Fourier transform to each frame extracted by a window of a predetermined length, and calculating a feature value for the each frame in a frequency domain, performing Gaussian mixture model-clustering on a time series of the feature values for respective frames to estimate one or more clusters, each of which is modeled with a Gaussian probability distribution best fit to the time series, and detecting and counting individual vehicles passing on the lane by counting the number of clusters, each of which has a probability density value greater than a predetermined threshold value for detection of a response oscillation due to a vehicle passing on the lane.

8. The computer-based vehicle detection method according to claim 5, wherein the estimation processing comprises: applying Fourier transform to each frame extracted by a window of a predetermined length to obtain a frequency spectrum of the each frame, calculating a normalized frequency spectrum of the each frame by normalizing the frequency spectrum of the each frame, calculating a frame-wise sum of an amplitude spectrum of the normalized frequency spectrum, for the each frame, performing scaling of the frame-wise sum to a pre-defined range, obtaining a time repeated vector of the scaled frame-wise sum by multiplying a time value of each scaled frame-wise sum by a magnitude of the each frame-wise sum, performing the Gaussian mixture model-clustering on the time repeated vector, and detecting and counting individual vehicles passing on the lane by counting the number of clusters, each of which has a probability density value greater than a predetermined threshold value for detection of a response oscillation due to a vehicle passing on the lane.

9. A non-transitory computer readable medium storing thereon a program causing a computer to execute processing comprising: acquiring a plurality of oscillation signals from a plurality of sensors, respectively, each sensor provided for each lane of a bridge that includes a plurality of lanes running in parallel and capable of sensing oscillation induced by a vehicle passing on the lane; performing signal separation processing comprising: applying blind source separation (BSS) to the plurality of the oscillation signals to estimate and separate a plurality of source oscillation signals, each source oscillation signal specific to each lane; and adjusting amplitude of an individual one of the plurality of the source oscillation signals to output a plurality of amplitude adjusted oscillation signals; and performing estimation processing comprising: receiving the amplitude adjusted oscillation signal corresponding to the lane of interest output from the signal separation processing; and estimating, from the amplitude adjusted oscillation signal, a response oscillation due to a vehicle passing on the lane of interest to detect and count individual vehicles passing on the lane of interest.

10. The non-transitory computer readable medium according to claim 9, wherein the signal separation processing comprises: calculating a frequency spectrum of each frame of the oscillation signal by applying Fourier transform to each frame extracted by a window of a predetermined length; calculating a sum of a normalized frequency spectrum, from the frequency spectrum of each frame, for each lane, detecting a first peak in at least one of a plurality of the sums of normalized frequency spectrum, each of the sums corresponding to each lane, to find a first vehicle region that is a time interval during which a vehicle is travelling on a lane and there is no vehicle travelling on other lane(s), performing the BSS to the oscillation signals of a time interval including the first vehicle region to estimate and separate the source oscillation signals, calculating an amplitude ratio based on the oscillation signal measured by the sensor for each lane and a corresponding source oscillation signal specific to each lane estimated by the BSS, and adjusting an amplitude of the source oscillation signal specific to each lane, by amplifying the amplitude of the source oscillation signal with the amplitude ratio to output the amplitude adjusted oscillation signals.

11. The non-transitory computer readable medium according to claim 9, wherein the estimation processing comprises: obtaining a feature value for each frame of the amplitude adjusted oscillation signal by applying Fourier transform to each frame extracted by a window of a predetermined length, and calculating a feature value for the each frame in a frequency domain, performing Gaussian mixture model-clustering on a time series of the feature values for respective frames to estimate one or more clusters, each of which is modeled with a Gaussian probability distribution best fit to the time series, and detecting and counting individual vehicles passing on the lane by counting the number of clusters, each of which has a probability density value greater than a predetermined threshold value for detection of a response oscillation due to a vehicle passing on the lane.

12. The non-transitory computer readable medium according to claim 9, wherein the estimation processing comprises: applying Fourier transform to each frame extracted by a window of a predetermined length to obtain a frequency spectrum of the each frame, calculating a normalized frequency spectrum of the each frame by normalizing the frequency spectrum of the each frame, calculating a frame-wise sum of an amplitude spectrum of the normalized frequency spectrum, for the each frame, performing scaling of the frame-wise sum to a pre-defined range, obtaining a time repeated vector of the scaled frame-wise sum by multiplying a time value of each scaled frame-wise sum by a magnitude of the each frame-wise sum, performing the Gaussian mixture model-clustering on the time repeated vector, and detecting and counting individual vehicles passing on the lane by counting the number of clusters, each of which has a probability density value greater than a predetermined threshold value for detection of a response oscillation due to a vehicle passing on the lane.

13. The vehicle detection apparatus according to claim 1, wherein the vehicle estimation part obtains a frequency spectrum of each frame of the oscillation signal by applying Fourier transform to the each frame extracted from the oscillation signal by a sliding window and calculates a sum of a normalized frequency vector, from the frequency spectrum of the each frame, for each lane, to generate a maximum vector with each element thereof being a maximum value of the sums of normalized frequency vectors calculated for the plurality of lanes, generates a binary matrix with columns, the number of which is equal to the number of the sums of normalized frequency vectors, and with rows, an element of which corresponds to a time index and has a value 1 at an element position with a maximum value and 0 otherwise, obtains time repeated vectors of the sums of normalized frequency vectors and the maximum vector by multiplying a time value by a magnitude value of each of the vectors, performs Gaussian mixture model-clustering on the time repeated vectors to count the number of clusters and finds a start time and an end time of presence of a vehicle in each lane, and calculates a column-wise mean value of elements corresponding to the start time and the end time, in a row of the binary matrix, to decide that the vehicle is present in a lane corresponding to a column of the binary matrix for which the column-wise mean value is calculated if the column-wise mean value is not less than a predetermined threshold value, while if the column-wise mean value is less than the predetermined threshold value for all columns of the binary matrix, deciding that the vehicle is present in the plurality of lanes, wherein the vehicle estimation part is enabled to detect presence or absence of a vehicle passing on a lane of the bridge, without using BSS.

14. The computer-based vehicle detection method according to claim 5, wherein the estimation processing comprises: obtaining a frequency spectrum of each frame of the oscillation signal by applying Fourier transform to the each frame extracted from the oscillation signal by a sliding window and calculating a sum of a normalized frequency vector, from the frequency spectrum of the each frame, for each lane, to generate a maximum vector with each element thereof being a maximum value of the sums of normalized frequency vectors calculated for the plurality of lanes; generating a binary matrix with columns, the number of which is equal to the number of the sums of normalized frequency vectors, and with rows, an element of which corresponds to a time index and has a value 1 at an element position with a maximum value and 0 otherwise; obtaining time repeated vectors of the sums of normalized frequency vectors and the maximum vector by multiplying a time value by a magnitude value of each of the vectors; performing Gaussian mixture model-clustering on the time repeated vectors to count the number of clusters to find a start time and an end time of presence of a vehicle in each lane; and calculating a column-wise mean value of elements corresponding to the start time and the end time, in a row of the binary matrix, to decide that the vehicle is present in a lane corresponding to a column of the binary matrix for which the column-wise mean value is calculated if the column-wise mean value is not less than a predetermined threshold value, while if the column-wise mean value is less than the predetermined threshold value for all columns of the binary matrix, deciding that the vehicle is present in the plurality of lanes, wherein the estimation processing is enabled to detect presence or absence of a vehicle passing on a lane of the bridge, without using BSS.

15. The non-transitory computer readable medium according to claim 9, wherein the estimation processing comprises: obtaining a frequency spectrum of each frame of the oscillation signal by applying Fourier transform to the each frame extracted from the oscillation signal by a sliding window and calculating a sum of a normalized frequency vector, from the frequency spectrum of the each frame, for each lane, to generate a maximum vector with each element thereof being a maximum value of the sums of normalized frequency vectors calculated for the plurality of lanes; generating a binary matrix with columns, the number of which is equal to the number of the sums of normalized frequency vectors, and with rows, an element of which corresponds to a time index and has a value 1 at an element position with a maximum value and 0 otherwise; obtaining time repeated vectors of the sums of normalized frequency vectors and the maximum vector by multiplying a time value by a magnitude value of each of the vectors; performing Gaussian mixture model-clustering on the time repeated vectors to count the number of clusters to find a start time and an end time of presence of a vehicle in each lane; and calculating a column-wise mean value of elements corresponding to the start time and the end time, in a row of the binary matrix, to decide that the vehicle is present in a lane corresponding to a column of the binary matrix for which the column-wise mean value is calculated if the column-wise mean value is not less than a predetermined threshold value, while if the column-wise mean value is less than the predetermined threshold value for all columns of the binary matrix, deciding that the vehicle is present in the plurality of lanes, wherein the estimation processing is enabled to detect presence or absence of a vehicle passing on a lane of the bridge, without using BSS.

Description

BRIEF DESCRIPTION OF DRAWINGS

[0054] FIGS. 1A and 1B are diagrams illustrating an embodiment of the invention.

[0055] FIG. 2 is a diagram illustrating an arrangement of a vehicle detection system of the embodiment.

[0056] FIGS. 3A to 3C are diagrams illustrating lane separation by BSS.

[0057] FIG. 4 is a flowchart illustrating an operation example of the embodiment.

[0058] FIGS. 5A to 5C are diagrams illustrating an example of the embodiment.

[0059] FIGS. 6A to 6D are diagrams illustrating an example of the embodiment.

[0060] FIG. 7 is a diagram illustrating an example of the embodiment.

[0061] FIGS. 8A to 8D are diagrams illustrating an example of the embodiment.

[0062] FIGS. 9A and 9B are diagrams illustrating an example of the embodiment.

[0063] FIG. 10 is a flowchart illustrating an operation example of the embodiment.

[0064] FIGS. 11A to 11C are diagrams illustrating an example of the embodiment.

[0065] FIGS. 12A to 12C are diagrams illustrating an example of the embodiment.

[0066] FIGS. 13A and 13B are diagrams illustrating an example of the embodiment.

[0067] FIG. 14 is a diagram illustrating an arrangement of a vehicle detection system of the embodiment.

[0068] FIG. 15 is a diagram illustrating an operation example of the vehicle estimation part of another embodiment.

[0069] FIGS. 16A and 16B are figures cited from NPL1.

[0070] FIG. 17 is a schematic diagram of traffic models.

[0071] FIGS. 18A and 18B are a diagram and an example of an acceleration signal of parallel traffic model.

[0072] FIGS. 19A and 19B are a diagram and an example of an acceleration signal of serial traffic model.

[0073] FIGS. 20A and 20B are a diagram and an example of an acceleration signal of mixture traffic model.

DETAILED DESCRIPTION

[0074] The following describes an example embodiment with reference to the drawings. FIGS. 1A and 1B are schematic diagrams illustrating the example embodiment of the present invention. FIG. 1A is a schematic illustration of a side view, while FIG. 1B is a schematic illustration of a plan view. Referring to FIGS. 1A and 1B, an expansion joint 14 is a joint provided between separate structures with different properties for accommodating movement, shrinkage, and temperature variations on reinforced and pre-stressed concrete, composite, and steel structures. An accelerometer is used as a sensor 12 for detecting acceleration (i.e., oscillation) of the bridge induced by a vehicle 1 passing on a lane of the bridge 10 and converting the oscillation to an electric signal. The sensor (accelerometer) 12 is provided below a concrete slab of the bridge 10, at least at an edge point of the bridge 10. Acceleration data (an oscillation signal) captured by the accelerometer is transmitted as digital data via wired or wireless communication to a vehicle detection apparatus (not shown).

[0075] FIG. 2 is a schematic diagram illustrating an example of an arrangement of a vehicle detection apparatus of the example embodiment. Referring to FIG. 2, the vehicle detection apparatus 100 includes a signal acquisition part 102, a signal separation part 104, a vehicle estimation part 106, and an output part 108. The signal acquisition part 102 acquires acceleration data from the sensors s1 and s2, which are communicatively connected to the signal acquisition part 102. The acceleration data includes an impulse response(s) of the lane, the impulse being given to the bridge 10 by axles of the vehicle passing over the bridge. The acceleration data (oscillation signals) from the sensors s1 and s2 are synchronized in time; that is, the acceleration data sampled by the sensors s1 and s2 at the same sampling time are received by the signal acquisition part 102 as data of the same index in a time series acceleration data vector. The signal separation part 104 receives, from the signal acquisition part 102, the respective acceleration data (oscillation signals) acquired by the sensors s1 and s2 and applies blind source separation (BSS) to them to separate an oscillation signal specific to each lane. The vehicle estimation part 106 detects and counts individual vehicles in each lane, based on the oscillation signal specific to each lane which has been separated by the signal separation part 104. The vehicle estimation part 106 calculates a time repetition feature from peaks of an amplitude of the oscillation signal separated into each lane and performs clustering on the time repetition feature to detect the presence of a vehicle(s) on each lane. The output part 108 outputs the detection result to a display apparatus, to a storage apparatus, or via a communication network to a terminal or a computer system.
The signal acquisition part 102, signal separation part 104, vehicle estimation part 106, and output part 108 may be implemented by a processor that is included in the vehicle detection apparatus 100 and executes program instructions stored in a memory included in the vehicle detection apparatus 100.

[0076] FIGS. 3A-3C are schematic illustrations of the blind source separation (BSS), which separates a set of source signals from a set of mixed signals without the aid of information about the source signals or the mixing system.

[0077] BSS is used under the assumption that measured vibration responses can be separated to estimate the individual vibration response of each lane. In FIG. 3A, two vehicles are passing in lane 1 and lane 2 on the bridge. The sensors s1 and s2 capture mixed oscillation signals y.sub.1 and y.sub.2, which are outputs of a mixing system that mixes actual source signals (oscillation signals) x.sub.1 and x.sub.2 of lane 1 and lane 2, respectively, as illustrated in FIG. 3B and FIG. 3C.


y.sub.j(n)=Σ.sub.i=1.sup.2Σ.sub.p=1.sup.PH.sub.ji(p)x.sub.i(n−p+1)  (1)

where x.sub.i is a signal from the actual source i, y.sub.j is a signal captured by the sensor (accelerometer) s.sub.j (j=1, 2), and H.sub.ji is a mixing filter with P taps from the actual source i to the sensor (accelerometer) s.sub.j.
Outputs of BSS are given as follows.


{circumflex over (x)}.sub.i(n)=Σ.sub.j=1.sup.2Σ.sub.q=1.sup.QW.sub.ij(q)y.sub.j(n−q+1)  (2)

where {circumflex over (x)}.sub.i (i=1,2) is a separated signal, and W.sub.ij is a separation filter with Q taps.

[0078] In the case of P=1 and Q=1, as illustrated in FIG. 3B and FIG. 3C:


y.sub.1(n)=H.sub.11x.sub.1(n)+H.sub.12x.sub.2(n)


y.sub.2(n)=H.sub.21x.sub.1(n)+H.sub.22x.sub.2(n)


{circumflex over (x)}.sub.1(n)=W.sub.11y.sub.1(n)+W.sub.12y.sub.2(n)


{circumflex over (x)}.sub.2(n)=W.sub.21y.sub.1(n)+W.sub.22y.sub.2(n)  (3)
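The 2×2 instantaneous model of expressions (1) to (3) may be sketched as follows in Python. The mixing matrix H below is an assumed example, and the separation filter W is simply taken as the inverse of H to illustrate the algebra; an actual BSS algorithm must estimate W blindly from the sensor signals alone.

```python
import numpy as np

# Illustrative sketch of the instantaneous (P = Q = 1) mixing model of
# expressions (1)-(3). H is an assumed example; a real BSS algorithm
# estimates W blindly, while here W = H^-1 only demonstrates the algebra.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 1000))   # source oscillation signals x1, x2 (one per lane)
H = np.array([[1.0, 0.4],            # H[j, i]: gain from source i to sensor j
              [0.3, 1.0]])
y = H @ x                            # sensor signals y1, y2 (expression 1)
W = np.linalg.inv(H)                 # separation filter (expression 2)
x_hat = W @ y                        # separated signals (expression 3)
```

In practice W would be estimated by, e.g., independent component analysis, and the recovered sources carry an indeterminate scale, which is why the amplitude adjustment of the signal separation part 104 is needed.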

[0079] FIG. 4 is a flowchart of the embodiment. Referring to FIG. 4, the signal acquisition part 102 acquires oscillation signals (time domain signals) from the sensors s1 and s2 (S101). The signal acquisition part 102 may cut off a DC component of the oscillation signals.

[0080] The signal separation part 104 calculates a sum of a normalized frequency spectrum (S102). More specifically, the signal separation part 104 extracts waveform data (a frame) using a sliding window of a time interval T (the length of one frame) and performs an FFT (fast Fourier transform) on the extracted waveform data to obtain a frequency spectrum, which is normalized by dividing each frequency component (amplitude) by a total sum value of the frequency components (amplitudes) of the frequency spectrum. The signal separation part 104 calculates the sum of the normalized frequency spectrum for each frame of the waveform data acquired by the sensors s1 and s2.

[0081] FIGS. 5A to 5C are diagrams for explaining the embodiment. It is assumed that a vehicle 1 (truck) with 3 axles is passing in lane 1 and a vehicle 2 (car) with 2 axles is passing in lane 2, as illustrated in FIG. 5A. FIGS. 5B and 5C show the waveform data (oscillation signals) acquired by the sensors s1 and s2 provided in lane 1 and lane 2, where the vehicle 1 (3-axle truck) and the vehicle 2 (2-axle car) are passing, respectively. In FIGS. 5B and 5C, the horizontal axis is time and the vertical axis is acceleration [m/s.sup.2].

[0082] FIG. 6A shows a time series of the sum of the normalized frequency spectrum of lane 1, and FIG. 6B shows a time series of the sum of the normalized frequency spectrum of lane 2. In FIGS. 6A and 6B, the sum of the normalized frequency spectrum for each frame is plotted along the time axis (horizontal axis).

[0083] The frequency component is normalized by a combined sum of the frequency spectra of the oscillation signals (each frame). In the case of FIG. 6A, the sum of the normalized frequency of a frame acquired by the sensor s1 (lane 1) is given by the following expression:

[00001] Sum of normalized frequency (s1)=sum(spectrum(s1)/sum of (spectrum(s1), spectrum(s2)))  (4)

[0084] where spectrum (s1) is a vector having, as each element, an amplitude spectrum of each frequency bin in the frequency spectrum of the frame of the oscillation signal acquired by the sensor s1,

[0085] spectrum (s2) is a vector having, as each element, an amplitude spectrum of each frequency bin in the frequency spectrum of the frame of the oscillation signal acquired by the sensor s2,

[0086] sum of (spectrum (s1), spectrum (s2)) is a combined sum operation which adds each amplitude spectrum at each element of the vector spectrum (s1) and the vector spectrum (s2) to generate a scalar value (sum) by summing the added amplitude spectra,

[0087] spectrum (s1)/sum of (spectrum (s1), spectrum (s2)) is an operation of dividing an amplitude spectrum of each element (frequency bin) in the vector spectrum (s1) by the sum of (spectrum (s1), spectrum (s2)) to obtain a normalized amplitude spectrum of each element (frequency bin) in the spectrum (s1), and sum ( ) is an operation of summing each normalized amplitude spectrum of each element (frequency bin) in the spectrum (s1).

[0088] In the case of FIG. 6B, the sum of the normalized frequency of a frame acquired by the sensor s2 (lane 2) is given, as with expression (4), by the following expression:

[00002] Sum of normalized frequency (s2)=sum(spectrum(s2)/sum of (spectrum(s1), spectrum(s2)))  (5)
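Expressions (4) and (5) may be sketched as follows in Python; the frame length, sampling rate, and test signals below are illustrative assumptions, not values specified by the embodiment. The two sums are complementary, adding up to 1 for every frame, so each one indicates how strongly its lane dominates the frame.

```python
import numpy as np

def sums_of_normalized_spectrum(frame_s1, frame_s2):
    """Expressions (4) and (5): per-frame sums of the normalized spectra."""
    a1 = np.abs(np.fft.rfft(frame_s1))   # amplitude spectrum, sensor s1
    a2 = np.abs(np.fft.rfft(frame_s2))   # amplitude spectrum, sensor s2
    total = np.sum(a1) + np.sum(a2)      # combined sum over both lanes (scalar)
    return np.sum(a1 / total), np.sum(a2 / total)

# Assumed example: a strong oscillation on lane 1, near-silence on lane 2
t = np.arange(0, 0.5, 1 / 200)           # one 0.5 s frame sampled at 200 Hz
v1, v2 = sums_of_normalized_spectrum(np.sin(2 * np.pi * 10 * t),
                                     0.01 * np.sin(2 * np.pi * 10 * t))
# v1 + v2 == 1 by construction; v1 >> v2 while lane 1 dominates
```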

[0089] The signal separation part 104 detects a first peak in the sum of the normalized frequency spectrum of each of lane 1 and lane 2 (S103). FIG. 6C shows the sum of normalized frequencies of lane 1 (sensor s1), where a first peak designated by a small circle indicates that a first vehicle has entered lane 1. FIG. 6D shows the sum of normalized frequencies of lane 2, where a first peak designated by a circle indicates that a second vehicle has entered lane 2. Using a peak(s) of the sum of normalized frequencies, the signal separation part 104 determines a region boundary between a first vehicle region, which is a time interval during which a vehicle has entered one of the lanes and no vehicle has entered the other lane, and a mixed signal region, which is a time interval during which vehicles have entered both lanes. The signal separation part 104 selects, as the first vehicle region, the interval from the time-stamp of the first peak of FIG. 6C until the time-stamp of the first peak of FIG. 6D (S104).
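The region selection of S103-S104 may be sketched as follows in Python; the 0.5 threshold and the toy per-frame sums are illustrative assumptions, not values specified by the embodiment.

```python
import numpy as np

def first_peak(sums, height=0.5):
    """Index of the first local maximum above `height` (assumed threshold)."""
    s = np.asarray(sums)
    for i in range(1, len(s) - 1):
        if s[i] > height and s[i] >= s[i - 1] and s[i] >= s[i + 1]:
            return i
    return None

# Toy per-frame sums: a vehicle enters lane 1 (frame 2) before lane 2 (frame 4)
lane1 = [0.1, 0.2, 0.8, 0.3, 0.2, 0.1]
lane2 = [0.1, 0.1, 0.2, 0.3, 0.7, 0.2]
p1, p2 = first_peak(lane1), first_peak(lane2)
start, end = sorted((p1, p2))   # first vehicle region: earlier first peak to later one
```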

[0090] FIG. 7 shows the first vehicle region and the mixed signal region mapped to the oscillation signal (time domain signal) of lane 1 (FIG. 5B) acquired by the sensor s1. The signal separation part 104 stores the time domain signals of lane 1 and lane 2 (FIG. 5B and FIG. 5C) in a semiconductor memory (e.g., RAM (Random Access Memory) or EEPROM (Electrically Erasable Programmable Read-Only Memory)) or a storage unit (e.g., HDD (Hard Disk Drive) or SSD (Solid State Drive)) provided in the vehicle detection apparatus.

[0091] The signal separation part 104 applies BSS to the time domain signals (FIG. 5B and FIG. 5C), including the first vehicle region and the mixed signal region, to obtain estimated source signals ({circumflex over (x)}1 and {circumflex over (x)}2 in FIG. 3C) (S105).

[0092] FIG. 8A and FIG. 8B show the estimated source signals {circumflex over (x)}1 of lane 1 and {circumflex over (x)}2 of lane 2 which have been estimated by the BSS.

[0093] The signal separation part 104 calculates an amplitude ratio α by dividing a sum of the power spectrum of the first vehicle region of the measured time domain signal by a sum of the power spectrum of the first vehicle region of the time domain signal estimated by the BSS.

[00003] α=sum of power spectrum of first vehicle region (measured)/sum of power spectrum of first vehicle region (estimated)  (6)

[0094] The signal separation part 104 multiplies the estimated signals x̂1 and x̂2 by the amplitude ratio α to obtain amplified estimated signals α·x̂1 and α·x̂2.

[0095] FIG. 8C and FIG. 8D respectively show the amplified estimated signals α·x̂1 and α·x̂2 of lane 1 and lane 2, obtained by multiplying the estimated source signals x̂1 and x̂2 of lane 1 and lane 2 shown in FIG. 8A and FIG. 8B by the amplitude ratio α.
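Equation (6) and the rescaling of paragraph [0094] can be sketched as below; `amplitude_ratio` is a hypothetical helper name, and the power spectra are taken over the first vehicle region only, as the patent specifies.

```python
import numpy as np

def amplitude_ratio(measured_region, estimated_region):
    """Equation (6): sum of the power spectrum of the first vehicle region
    of the measured signal divided by that of the BSS-estimated signal."""
    p_measured = np.sum(np.abs(np.fft.rfft(measured_region)) ** 2)
    p_estimated = np.sum(np.abs(np.fft.rfft(estimated_region)) ** 2)
    return p_measured / p_estimated

# α then scales the full estimated signals to give α·x̂1 and α·x̂2:
# x_hat_amplified = alpha * x_hat
```

By Parseval's relation the ratio reflects the energy mismatch introduced by the BSS's scale ambiguity, so the amplified signals are restored toward the measured amplitude level.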

[0096] The following describes an example of the operation of the vehicle estimation part 106, which can individually detect and count vehicles passing one after the other in a combination of vehicle types, e.g., a truck and a car (serial model), on a single lane of interest (one of lane 1 and lane 2), based on one of the amplified estimated signals α·x̂1 and α·x̂2.

[0097] FIG. 9A illustrates a serial case where a vehicle 1 (3-axle truck) and a vehicle 2 (2-axle car) following the vehicle 1 are present on lane 1 of the bridge 10. The vehicle 2 is preferably spaced apart from the vehicle 1 by a time interval of, e.g., about 0.5 second or more. FIG. 9B illustrates estimation of the number of vehicles based on clustering using a Gaussian mixture model with a Dirichlet process prior. The number of Gaussian probability density functions that are obtained by the clustering and take a value greater than a predetermined threshold value is counted as the number of vehicles.

[0098] FIG. 10 is a flow chart illustrating an operation example of the vehicle estimation part 106 which detects the number of vehicles passing a lane.

[0099] It is assumed that the vehicle estimation part 106 receives, from the signal separation part 104, the oscillation signal, i.e., the amplified estimated signal of a lane of interest, as shown in FIG. 11A.

[0100] The vehicle estimation part 106 is configured to detect a vehicle (s) passing serially on a single lane (e.g., lane 1) from the oscillation signal of lane 1 captured by the sensor s1.

[0101] The vehicle estimation part 106 calculates a normalized frequency spectrum (S201). FIG. 11B shows the normalized frequency spectrum of the signal shown in FIG. 11A. More specifically, the vehicle estimation part 106 applies a short-time (short-term) FFT (Fast Fourier Transform) (STFT) to the oscillation signal shown in FIG. 11A. That is, using a sliding window of a predetermined length (the length of one frame), shifted each time by a predetermined value, frames are extracted from the oscillation signal. Neighboring frames have an overlapping portion. The FFT is applied to each frame to obtain a frequency spectrum of that frame. A discrete Fourier transform (DFT) may, as a matter of course, be used in place of the FFT.

[0102] Let X_t = [x^0, x^1, …, x^{N−1}, x^N, …] be a time series of sampled values x^k (where k is a non-negative integer) of the oscillation signal. With a shift of the sliding window of m, frames X_j (j = 1, 2, 3, …) of length N (N > m) are extracted from the oscillation signal by the sliding window, and the FFT is applied to each frame to obtain a frequency spectrum Y(ω)_j of the j-th frame X_j (j = 1, 2, 3, …):

X_1 = [x^0, …, x^{N−1}] → Y(ω)_1 = FFT(X_1)

X_2 = [x^m, …, x^{N+m−1}] → Y(ω)_2 = FFT(X_2)

X_3 = [x^{2m}, …, x^{N+2m−1}] → Y(ω)_3 = FFT(X_3)
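The framing and per-frame FFT of paragraph [0102] can be sketched as follows; frame j (1-indexed in the text) starts at sample (j−1)·m, and `stft_frames` is an illustrative name.

```python
import numpy as np

def stft_frames(x, N, m):
    """Extract overlapping frames X_j of length N with hop m from the
    oscillation signal x, then apply an FFT to each frame to obtain the
    per-frame spectra Y(ω)_j, as in paragraph [0102]."""
    x = np.asarray(x, dtype=float)
    n_frames = (len(x) - N) // m + 1
    frames = np.stack([x[j * m : j * m + N] for j in range(n_frames)])
    return np.fft.fft(frames, axis=1)  # one complex spectrum per frame
```

Since N > m, consecutive frames overlap by N − m samples, matching the note that neighboring frames share an overlapping portion.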

[0103] The vehicle estimation part 106 calculates a normalized frequency spectrum by dividing each frequency component (amplitude) by the total sum of the amplitudes of the frequency components of the frequency spectrum. The total sum S_j of the amplitude spectrum for the j-th frame X_j is given by

[00004]

S_j = \sum_{i=1}^{N/2-1} q_j(i)  (7)

where q_j(i) is the amplitude of the i-th frequency bin of the frequency spectrum Y(ω)_j of the j-th frame X_j.


q_j(i) = \sqrt{\mathrm{Re}(y_j(i))^2 + \mathrm{Im}(y_j(i))^2}  (8)

where y_j(i) (i = 1, …, N/2−1) is the i-th frequency component (bin) of the frequency spectrum Y(ω)_j, and Re( ) and Im( ) are the real and imaginary parts of the complex value y_j(i). Here, y_j(0) (i = 0) is the DC component, the imaginary part of which is zero and the real part of which is assumed to be zero, and the index i = N/2 corresponds to the Nyquist frequency bin.

[0104] The normalized frequency spectrum Q_j for the j-th frame X_j is given as

[00005]

Q_j = \frac{1}{S_j}\left[q_j(1), \ldots, q_j\!\left(\tfrac{N}{2}-1\right)\right]  (9)

[0105] The vehicle estimation part 106 calculates a frame-wise sum of a normalized frequency spectrum (S202).

[0106] The frame-wise sum f(j) of the normalized frequency spectrum for the j-th frame X_j (j = 1, 2, …) is given as

[00006]

f(j) = \sum_{i=1}^{N/2-1} q_j(i)  (10)
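Equations (7)-(9) can be sketched as follows; the DC bin i = 0 and the Nyquist bin i = N/2 are excluded per paragraph [0103], and the helper name is an assumption of this example.

```python
import numpy as np

def normalized_spectrum(frame):
    """Equations (7)-(9): amplitude q_j(i) of each frequency bin,
    total sum S_j, and the normalized spectrum Q_j for one frame
    (DC and Nyquist bins excluded)."""
    frame = np.asarray(frame, dtype=float)
    N = len(frame)
    Y = np.fft.fft(frame)
    q = np.abs(Y[1 : N // 2])   # q_j(i) = sqrt(Re² + Im²), i = 1..N/2−1
    S = q.sum()                 # equation (7)
    return q / S                # equation (9): elements of Q_j sum to 1
```

For a pure tone the normalized spectrum concentrates its mass in a single bin, which is the behavior the frame-wise sums of S202 exploit to flag frames that contain a vehicle response.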

[0107] FIG. 11C shows the frame-wise sum for each frame. In FIG. 11C, the values of the frame-wise sums f(j) (j = 1, 2, 3, …) are plotted, where the horizontal axis is a time axis (i.e., the index j = 1, 2, 3, …) and the vertical axis is the value of the frame-wise sum f(j). FIG. 11C is a plot of the following vector (frame-wise sum vector):


F=(f(1),f(2),f(3), . . . )  (11)

[0108] The vehicle estimation part 106 performs an amplitude transformation of the vector F to scale it into a pre-defined range (S203). FIG. 12A shows the result of the amplitude transformation of the frame-wise sum vector shown in FIG. 11C. In the example of FIG. 12A, the vector F of FIG. 11C is transformed into a vector F_scaled having a range between 0 and 100, though not limited thereto, where:

scaled_min = 0, scaled_max = 100, F_min = min(F), F_max = max(F).

[0109] The amplitude transformation is calculated as:

F_scaled = scale*F + scaled_min − F_min*scale  (12)

where

scale = (scaled_max − scaled_min)/(F_max − F_min)  (13)
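The amplitude transformation of step S203 is a standard min-max scaling and can be sketched as follows (the function name is an assumption of this example):

```python
import numpy as np

def scale_vector(F, scaled_min=0.0, scaled_max=100.0):
    """Equations (12)-(13): min-max transform of the frame-wise sum
    vector F into the range [scaled_min, scaled_max]."""
    F = np.asarray(F, dtype=float)
    scale = (scaled_max - scaled_min) / (F.max() - F.min())  # equation (13)
    return scale * F + scaled_min - F.min() * scale          # equation (12)
```

After scaling, the minimum element of F maps to scaled_min and the maximum to scaled_max, so the repetition counts used in S204 fall in a controlled range regardless of the absolute signal level.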

[0110] The vehicle estimation part 106 creates a new vector (time-repeated vector) from the vector F_scaled by repeating each time value by its magnitude value (S204). FIG. 12B shows an example of the new vector (time-repeated vector), where the vertical axis is a time axis and the horizontal axis is a repetition index. More specifically, the vehicle estimation part 106 repeats each time occurrence by its magnitude, i.e., repeats x (the time value) y (the scaled amplitude at that time) times. For example, at time 0 (index 0 of F_scaled), the scaled amplitude (the value of element 0 of F_scaled) is 2, so the time-repeated vector V starts with the time value 0 repeated 2 times; next, at time 1 (index 1 of F_scaled), the scaled amplitude (the value of element 1 of F_scaled) is 10, so the time value 1 is repeated 10 times. After this repetition is iterated for all time values, we have a new time-repeated vector V as shown in FIG. 12B. In FIG. 12B, the time axis (horizontal axis) is the index of each element of the time-repeated vector V and the vertical axis is the value of each element of the time-repeated vector V.
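Step S204 can be sketched with `numpy.repeat`; rounding the scaled amplitudes to integer repetition counts is an assumption of this example, since the patent does not state how fractional magnitudes are handled.

```python
import numpy as np

def time_repeated_vector(F_scaled):
    """Step S204: repeat each time index by its (rounded) scaled
    amplitude, so larger amplitudes contribute more samples at that
    time and create density at amplitude peaks."""
    F_scaled = np.asarray(F_scaled, dtype=float)
    times = np.arange(len(F_scaled))
    counts = np.rint(F_scaled).astype(int).clip(min=0)
    return np.repeat(times, counts)
```

With F_scaled = [2, 10, 0], the result is the time value 0 twice followed by the time value 1 ten times, matching the worked example in paragraph [0110].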

[0111] Assuming that each vehicle is estimated based on a Gaussian mixture model, transforming the signal into the time-repeated feature makes it easy to perform clustering and vehicle detection by Gaussian mixture modelling. The vehicle occurrence times in the oscillation signal are not known. To detect a vehicle occurrence time, repeating each time value a scaled-amplitude number of times is adopted, which generates more density at a peak location of the amplitude. This operation results in an expected distribution (such as a Gaussian probability distribution) at each vehicle occurrence, as shown in FIG. 12C, which is a histogram (time axis histogram) of the time-repeated vector created by the vehicle estimation part 106.

[0112] The vehicle estimation part 106 performs clustering based on learning (unsupervised model training) of a mixture of Gaussian probability distributions (S205). The Gaussian mixture model is a probabilistic model that assumes all the data points are generated from a mixture of a finite number of Gaussian probability distributions with unknown parameters. The Gaussian mixture model may be learned from training data. Though not limited thereto, in the embodiment, a Variational Bayesian Gaussian mixture model, a variant of the Gaussian mixture model with variational inference algorithms, such as a Variational Bayesian DPGMM (Dirichlet Process Gaussian Mixture Model), is used, which is an infinite mixture model with the Dirichlet process as a prior distribution on the number of clusters. Regarding the Variational Bayesian DPGMM, reference may be made to NPL2 or NPL3. FIG. 13A shows a clustering result of the time-repeated vector V using the Variational Bayesian DPGMM. In FIG. 13A, the horizontal axis is a time axis and the vertical axis is a scaled version of the probability density value.
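Steps S205-S206 can be sketched with scikit-learn's `BayesianGaussianMixture`, which implements a variational Dirichlet-process mixture. The synthetic two-vehicle data and the mixture-weight threshold of 0.1 are assumptions of this example, standing in for the patent's probability-density threshold.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
# Hypothetical time-repeated samples concentrated around two vehicle
# occurrence times (frame indices ~30 and ~80).
V = np.concatenate([rng.normal(30, 2, 500), rng.normal(80, 2, 500)])

dpgmm = BayesianGaussianMixture(
    n_components=10,  # upper bound; the DP prior prunes unused components
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(V.reshape(-1, 1))

# Step S206 stand-in: count components whose mixture weight is significant.
n_vehicles = int(np.sum(dpgmm.weights_ > 0.1))
```

The Dirichlet process prior lets the model keep only as many active components as the data supports, which is why the number of vehicles need not be fixed in advance.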

[0113] The vehicle estimation part 106 counts the number of clusters each of which has a probability density function value greater than a predetermined threshold value (S206), as shown in FIG. 13B. The predetermined threshold value is defined so as to identify a response oscillation of the bridge induced by a vehicle passing on the lane of the bridge.

[0114] The vehicle detection apparatus 100 may be implemented on a computer system as illustrated in FIG. 14. Referring to FIG. 14, a computer apparatus 200, such as a server, includes a processor (Central Processing Unit) 202; a memory 204 including, for example, a semiconductor memory (for example, Random Access Memory (RAM), Read Only Memory (ROM), and/or Electrically Erasable and Programmable ROM (EEPROM)) and/or a storage device including at least one of a Hard Disk Drive (HDD), SSD (Solid State Drive), Compact Disc (CD), Digital Versatile Disc (DVD) and so forth; a display apparatus 206 that displays the result of detection of the number of vehicles passing on each lane; and a communication interface 208. The communication interface 208 (such as a network interface controller (NIC)) may well be configured to communicatively connect to the sensor(s) provided under the lanes of a bridge. A program 210 including program instructions (program modules) for executing the processing of the signal acquisition part 102, the signal separation part 104, the vehicle estimation part 106 and the output part 108 of the vehicle detection apparatus 100 shown in FIG. 2 is stored in the memory 204. The processor 202 is configured to read the program 210 (program instructions) from the memory 204 and execute the program (program instructions) to realize the functions and processing of the vehicle detection apparatus 100.

[0115] In the above embodiments, detection of the number of vehicles passing on a single lane of a bridge is described, but the present invention is not limited to the number of vehicles. The present invention can be applied to detection of the weight of a vehicle passing on a single lane of a bridge, the load weight of a vehicle, deterioration/fatigue diagnosis of a bridge, and so forth.

[0116] In the above embodiments, accelerometers are used as sensors to detect impulse response (oscillation) of the bridge. However, in the present invention, a sensor is not limited to detection of impulse response (oscillation) of the bridge. That is, the present invention is applicable to an oscillation signal detected by an acoustic sensor such as a piezoelectric transducer, microphone, etc.

[0117] The following describes a further example embodiment. In this embodiment, the vehicle estimation part 106 is configured to realize vehicle detection in a parallel traffic model (FIG. 18A). In this embodiment, without using the BSS, the vehicle estimation part 106 can detect the presence/absence of a vehicle(s) passing on the lane(s) of the bridge 10. In this embodiment, the vehicle detection apparatus 100 includes the signal acquisition part 102, the vehicle estimation part 106, and the output part 108. The signal separation part 104 can be omitted.

[0118] FIG. 15 is a flow chart illustrating an operation example of the vehicle estimation part 106 in this embodiment. The signal acquisition part 102 acquires oscillation signals (time domain signals) from sensors s1 and s2 (S301).

[0119] The vehicle estimation part 106 calculates a sum of a normalized frequency spectrum (S302). More specifically, the vehicle estimation part 106 extracts waveform data (a frame) using a sliding window of a time interval T (the length of one frame) and performs an FFT (fast Fourier transform) on the extracted waveform data to obtain a frequency spectrum, which is normalized by dividing each frequency component (amplitude) by the total sum of the frequency components (amplitudes) of the frequency spectrum.

[0120] The vehicle estimation part 106 obtains sums of normalized frequency vectors for the sensors s1 and s2, by using the equations (4) and (5), respectively, which define the sum of normalized frequency for each frame (S303). The sum of normalized frequency vector (s1) is a time series vector whose elements (N elements in total are assumed) are the sums of normalized frequency (each given by the equation (4)) for frames 1, 2, …, N (or time instants 1 to N). The sum of normalized frequency vector (s2) is likewise a time series vector whose N elements are the sums of normalized frequency (each given by the equation (5)) for frames 1, 2, …, N (or time instants 1 to N).

[0121] Then, the vehicle estimation part 106 obtains an s1s2_maximum vector by selecting the maximum amplitude value at each time instant (each element) from the sum of normalized frequency vector (s1) and the sum of normalized frequency vector (s2). The s1s2_maximum vector has N elements (S304).


s1s2_maximum vector=max(sum of normalized frequency vector(s1),sum of normalized frequency vector(s2))  (14)

[0122] For instance, assume that the sum of normalized frequency vector (s1) = [sa1, sa2, …, saN], where sa1, sa2, …, saN are its elements at time instants 1, 2, …, N, that the sum of normalized frequency vector (s2) = [sb1, sb2, …, sbN], where sb1, sb2, …, sbN are its elements at time instants 1, 2, …, N, and that sa1 > sb1, sa2 < sb2, …, and saN < sbN; then


s1s2_maximum vector=[sa1,sb2, . . . ,sbN]  (15)

[0123] The vehicle estimation part 106 generates a new binary matrix B with the number of rows equal to the number of elements in the s1s2_maximum vector (N elements) and the number of columns equal to 2 (corresponding to the sums of normalized frequency vectors (s1) and (s2)), where, at each time instant, i.e., in each row, a value 1 marks the element position having the maximum value and 0 is set otherwise (S305). In the case of the equation (15), the N×2 binary matrix is given as

[00007]

B^T = \begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 1 \end{bmatrix}  (16)

where a superscript T indicates a transpose of the matrix B.
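Steps S304-S305 (the element-wise maximum of equation (14) and the construction of B) can be sketched as follows; assigning ties to s1 is an assumption of this example, since the patent does not specify tie-breaking.

```python
import numpy as np

def max_vector_and_binary(sum_s1, sum_s2):
    """Steps S304-S305: the s1s2_maximum vector (equation (14)) and the
    N×2 binary matrix B, where each row holds a 1 at the position of the
    lane whose element was the maximum at that time instant."""
    sum_s1 = np.asarray(sum_s1, dtype=float)
    sum_s2 = np.asarray(sum_s2, dtype=float)
    s1s2_max = np.maximum(sum_s1, sum_s2)
    B = np.zeros((len(s1s2_max), 2), dtype=int)
    B[:, 0] = sum_s1 >= sum_s2   # 1 where the s1 element is the maximum
    B[:, 1] = sum_s2 > sum_s1    # 1 where the s2 element is the maximum
    return s1s2_max, B
```

Each row of B sums to 1, so the column-wise mean over any time slice (equation (17)) directly measures the fraction of time instants at which each lane dominated.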

[0124] The vehicle estimation part 106 creates a time-repeated vector by repeating each time value by the corresponding magnitude value of the s1s2_maximum vector (S306). Similarly, the vehicle estimation part 106 also repeats the rows of the matrix B, repeating each row by the corresponding magnitude value of the s1s2_maximum vector.

[0125] Then, the vehicle estimation part 106 applies Gaussian mixture model-based clustering (S307) to the time-repeated vector and counts the number of clusters (S308), as explained for steps S205 and S206 in FIG. 10, that is, counts the number of estimated Gaussian distributions (the number of vehicles).

[0126] Assuming, for example, that the ±3σ values (where σ is a standard deviation) from the mean value of a Gaussian distribution are the start time and end time of a vehicle impulse, the vehicle estimation part 106 obtains the start time and end time of each vehicle present in each lane (S309). In order to detect in which lane the vehicle is present, the vehicle estimation part 106 calculates a column-wise mean of the sliced rows of the binary matrix B that correspond to the start and end time of each vehicle, where b_ij is the value (1 or 0) of the i-th row and j-th column of the binary matrix B (j = 1 or 2) (S310).

[00008]

\text{column-wise mean value (for the } j\text{-th column)} = \frac{1}{\text{end\_time} - \text{start\_time}} \sum_{i=\text{start\_time}}^{\text{end\_time}} b_{ij}  (17)

[0127] When the column-wise mean value of the sliced rows of the binary matrix B is greater than or equal to a pre-defined threshold (S311: “Yes” branch), the vehicle estimation part 106 estimates that a vehicle is detected in the lane corresponding to the column of the matrix B for which the column-wise mean value is calculated (corresponding to the sensor s1 or s2) (S312). When the column-wise mean values in both columns of the matrix B are less than the pre-defined threshold (S311: “No” branch), the vehicle estimation part 106 estimates that the presence of a vehicle is detected in both of the lanes under which the sensors s1 and s2 are arranged (S313). It is noted that the above described embodiments are not limited to two lanes running in parallel on the bridge but can be applied to detection of vehicles passing on three or more lanes on the bridge.
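The lane decision of steps S310-S313 can be sketched as follows. The threshold value of 0.7 and the inclusive row slice are assumptions of this example; the patent only says a pre-defined threshold is used.

```python
import numpy as np

def detect_lanes(B, start, end, threshold=0.7):
    """Steps S310-S313: compute the column-wise mean of rows start..end
    of the binary matrix B (equation (17)); a mean at or above the
    threshold assigns the vehicle to that lane (S312), and if neither
    column reaches the threshold a vehicle is assumed present in both
    lanes (S313). The 0.7 threshold is hypothetical."""
    means = B[start : end + 1].mean(axis=0)
    lanes = [j + 1 for j in range(B.shape[1]) if means[j] >= threshold]
    return lanes if lanes else [1, 2]
```

When one lane dominates the slice, its column mean approaches 1 and the other approaches 0; near-equal means indicate that both sensors took turns holding the maximum, i.e., vehicles in both lanes.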

[0128] Each disclosure of the above-listed PTLs 1-3 and NPLs 1-3 is incorporated herein by reference. Modification and adjustment of each example embodiment and each example are possible within the scope of the overall disclosure (including the claims) of the present invention and based on the basic technical concept of the present invention. Various combinations and selections of various disclosed elements (including each element in each Supplementary Note, each element in each example, each element in each drawing, and the like) are possible within the scope of the claims of the present invention. That is, the present invention naturally includes various variations and modifications that could be made by those skilled in the art according to the overall disclosure including the claims and the technical concept.