SYSTEM AND METHOD FOR DETECTING AND TRACKING A MOVING OBJECT
20180005033 · 2018-01-04
CPC classification: G06V20/49 (Physics)
Abstract
A device includes a memory configured to store instructions and a processor configured to execute the instructions to obtain image data of a region of interest included in an image frame. The processor may also be configured to compare the image data of the region of interest with image data of a background to detect a change in the region of interest. The processor may further be configured to detect an object in the image frame based on the detected change.
Claims
1. A device, comprising: a memory configured to store instructions; and a processor configured to execute the instructions to: obtain image data of a region of interest included in an image frame; compare the image data of the region of interest with image data of a background to detect a change in the region of interest; and detect an object in the image frame based on the detected change.
2. The device of claim 1, wherein the processor is further configured to execute the instructions to detect the object in the image frame based on: time information indicating when the change occurred to the region of interest; and at least one of brightness data or color data obtained from a location within the region of interest where the change occurred.
3. The device of claim 1, wherein the processor is further configured to execute the instructions to: determine, based on the comparison, a difference between the image data of the region of interest and the image data of the background.
4. The device of claim 3, wherein the processor is further configured to execute the instructions to: determine a difference vector between a first vector storing the image data of the region of interest and a second vector storing the image data of the background; and determine whether the difference vector satisfies a predetermined condition.
5. The device of claim 3, wherein the processor is further configured to execute the instructions to: determine that the difference is smaller than a predetermined difference value, and update the image data of the background using the image data of the region of interest.
6. The device of claim 5, wherein updating the image data of the background includes: updating the image data of the background using image data of the region of interest from a predetermined number of image frames.
7. The device of claim 3, wherein the processor is further configured to execute the instructions to: determine that the difference is greater than or equal to a predetermined difference value, determine a number of image frames, from a predetermined plurality of image frames, in which the difference is greater than or equal to the predetermined difference value; and when the number of image frames is greater than or equal to a predetermined frame number value, update the image data of the background using the image data of the region of interest from the predetermined plurality of image frames.
8. The device of claim 1, wherein each image frame includes a plurality of regions of interest, and wherein the processor is further configured to execute the instructions to: determine, from the comparison, a location in each of the plurality of regions of interest where a change has occurred; obtain color data at the location in different regions of interest from different image frames; and compare the color data obtained from the different regions of interest in the different image frames.
9. The device of claim 8, wherein the processor is further configured to execute the instructions to: obtain time instances that indicate when changes occurred in a plurality of regions of interest; determine a time sequence of the changes based on the time instances; and determine a moving direction of the object across the regions of interest based on the time sequence.
10. The device of claim 9, wherein the processor is further configured to execute the instructions to: calculate a plurality of time intervals based on a plurality of pairs of adjacent time instances; compare the time intervals with a plurality of predetermined time delay values; and determine that a same object is moving across the regions of interest in the moving direction when: each of the time intervals is less than the corresponding predetermined time delay value, and a difference in the color data obtained from the different regions of interest in the different image frames is smaller than a predetermined color value.
11. The device of claim 10, wherein the processor is further configured to execute the instructions to initiate tracking of the object in the moving direction by sending a control signal to drive a motor to turn a camera in the moving direction to follow the object.
12. The device of claim 11, wherein the processor is further configured to execute the instructions to determine a moving speed of the object based on the time instances and distances between adjacent regions of interest, and wherein the control signal controls the speed of driving the motor based on the moving speed.
13. The device of claim 1, wherein the processor is further configured to execute the instructions to: obtain a brightness matrix including brightness values of pixels included in the region of interest; and transform the brightness matrix into a brightness vector, each row of the vector being calculated based on brightness values of all columns in the row, wherein the image data of the region of interest is represented by the brightness vector.
14. The device of claim 13, wherein the processor is further configured to execute the instructions to: obtain a color data matrix including color data of pixels included in the region of interest; and transform the color data matrix into a color value vector, each row of the vector being a sum of color values of all columns in the row, wherein the image data associated with the region of interest includes the color value vector.
15. The device of claim 13, wherein the processor is further configured to execute the instructions to: obtain an initial brightness vector as the image data of the background, based on brightness vectors obtained from a predetermined number of image frames.
16. The device of claim 1, wherein the image frame comprises three regions of interest.
17. A method for detecting an object, comprising: obtaining image data of a region of interest included in an image frame; comparing the image data of the region of interest with image data of a background to detect a change in the region of interest; and detecting the object in the image frame based on the detected change.
18. The method of claim 17, wherein detecting the object in the image frame includes detecting the object based on time information indicating when a plurality of changes occur to a plurality of regions of interest in a plurality of image frames, and color data obtained from a location within the plurality of regions of interest.
19. A non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform a method for detecting an object, the method comprising: obtaining image data of a region of interest included in an image frame; comparing the image data of the region of interest with image data of a background to detect a change in the region of interest; and detecting the object in the image frame based on the detected change.
20. The non-transitory computer-readable storage medium of claim 19, wherein detecting the object in the image frame includes detecting the object based on time information indicating when a plurality of changes occur to a plurality of regions of interest in a plurality of image frames, and color data obtained from a location within the plurality of regions of interest.
Description
DESCRIPTION OF DRAWINGS
[0015] The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
DETAILED DESCRIPTION
[0025] The disclosed embodiments provide a computer-vision-based system and method for detecting an object, such as a moving object, from image frames captured by a camera. The disclosed embodiments may detect the moving object and track it in real time, in an accurate, fast, and efficient manner. In addition, the disclosed system and method may simultaneously monitor a certain area in an environment and detect and track a moving object in that area. Thus, the disclosed system may eliminate the need to set up two different systems, one for monitoring the area and one for detecting and tracking a moving object in that area. Furthermore, the disclosed method may be implemented as an algorithm that requires a small memory size and a low computational cost. Accordingly, the disclosed methods may be implemented in an embedded system, such as a vehicle, a wearable device, or an airplane, which may not have a large computational capacity.
[0026] Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise represented. The implementations set forth in the following description of exemplary embodiments do not represent all implementations consistent with the invention. Instead, they are merely examples of devices and methods consistent with aspects related to the invention as recited in the appended claims.
[0028] Image capturing device 105 may include a camera, such as a video camera, a digital photo camera, an infrared camera, etc. For simplicity of discussion, image capturing device 105 may be referred to as a camera 105. Camera 105 may have a field of view indicated by dashed lines 112 and 113. Camera 105 may be mounted on a camera mount 115, which may be further attached to a human (e.g., as a wearable device), a vehicle, an item of furniture, a post on a street, a wall of an office, a house, or a building, a roof, etc. Camera mount 115 may include a motor 120. Motor 120 may be configured to adjust an angle of camera 105, such as a panning angle or a tilt angle. In some embodiments, the angle of camera 105 may be adjusted in all directions. When motor 120 adjusts the angle of camera 105, the field of view of camera 105 is adjusted accordingly. Motor 120 may include any suitable motor, such as a stepper motor, a brushless motor, etc.
[0029] The disclosed system may include a computer 130. Computer 130 may include a monitor, a processor, a memory, a storage device, and other components. Computer 130 may communicate with camera 105 through a network 135, which may be a wired or wireless network. Network 135 may enable data transmission between camera 105 and computer 130 through a wired connection, such as a wire, a cable, etc., or a wireless connection, such as an infrared, Wi-Fi, Bluetooth, near field communication, cellular, or radio connection. Camera 105 may transmit captured image frames to computer 130. Computer 130 may send signals to camera 105 to control various settings of camera 105, such as zoom, pixel resolution, etc. Computer 130 may also send a control signal to motor 120 to adjust an angle of camera 105, thereby changing the field of view of camera 105. For example, when a moving object (e.g., human 110) is detected by computer 130 from the image frames captured by camera 105, computer 130 may send a control signal to motor 120 to adjust the angle of camera 105 such that the field of view follows the movement of moving object 110. Thus, system 100 may detect moving object 110 from a plurality of image frames, and may track moving object 110.
[0030] Although shown as separate components in
[0032] The components included in system 200 may communicate with each other through any suitable communication means, such as a data and/or signal transmission bus, a cable, or other wired and/or wireless transmission means. For example, various components may communicate with each other through network 135 shown in
[0033] Image capturing module 205 shown in
[0034] Image capturing module 205 may include hardware components, such as any suitable optical or non-optical image capturing devices or sensors. For example, image capturing module 205 may include one or more video cameras, digital cameras, film cameras, infrared cameras, etc. In one embodiment, image capturing module 205 includes camera 105 shown in
[0035] Image capturing module 205 may include software components, such as image processing code, for implementing methods related to image capturing and/or processing. In some embodiments, the image frames captured by image capturing module 205 may be directly transmitted to processor 210 for processing. In some embodiments, the image frames captured by image capturing module 205 may be stored in memory 215 (or other storage devices included in system 200), and retrieved or read by processor 210 for processing.
[0036] Processor 210 may be configured to (e.g. programmed to) process instructions to perform the disclosed methods for detecting and/or tracking an object, such as a moving object, from a plurality of image frames captured by image capturing module 205. For example, processor 210 may be configured to obtain image data of one or more image frames from image capturing module 205 or memory 215. Processor 210 may be configured to process the image data for detecting and/or tracking an object, such as a moving object from image frames.
[0037] Processor 210 may include hardware and/or software components. For example, processor 210 may include hardware components, such as at least one of a central processing unit (CPU), a graphical processing unit (GPU), a microprocessor, a digital signal processor, circuits, etc. In some embodiments, processor 210 may include any appropriate type of general-purpose processors, microprocessors, and controllers. In some embodiments, processor 210 may also include special-purpose microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), or field programmable gate arrays (FPGAs).
[0038] Processor 210 may also include executable software code that is embedded within the circuits of processor 210 or stored within memory 215 for performing computational functions and/or image processing functions provided by processor 210. In some embodiments, processor 210 may be implemented as software for performing the disclosed methods for detecting and/or tracking an object, such as a moving object. In one embodiment, processor 210 may be included in computer 130 shown in
[0039] Memory 215 may include any suitable non-transitory computer-readable medium. Memory 215 may be implemented using any type of volatile or non-volatile memory devices, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic or optical disk.
[0040] In addition to executable code, memory 215 may be configured to store data, such as image data of a plurality of image frames captured by image capturing module 205. In some embodiments, processor 210 may read or retrieve image data from memory 215 for processing.
[0041] Memory 215 may include various databases and image processing software. For example, memory 215 may store instructions that are executable by processor 210. When executed by processor 210, the instructions may cause processor 210 (and/or system 200) to perform the disclosed methods for detecting and/or tracking an object from the image frames, such as detecting and tracking a moving object.
[0042] Control module 220 may include hardware and/or software components configured to perform various control functions. For example, control module 220 may include one or more circuits configured to drive a motor (e.g., motor 120) installed at a camera mount (e.g., camera mount 115), on which a camera (e.g., camera 105) is mounted. Motor 120 may be configured to change a mounting angle of camera 105 or an angle that camera 105 is pointing, thereby changing a field of view of camera 105. For example, when a moving object is detected by processor 210 from a plurality of image frames captured by image capturing module 205 (which may include camera 105), processor 210 may send a control signal, e.g., through control module 220, to motor 120. The control signal may drive motor 120 to adjust an angle (e.g., a panning angle and/or a tilt angle) of camera 105, thereby changing the field of view of camera 105 for, e.g., tracking a moving object identified in the image frames.
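As an illustration of how a detected motion might be translated into a control signal for driving motor 120, the following Python sketch maps a moving direction and an image-plane speed to a pan rate. The function name, the pixels-per-degree calibration factor, and the sign convention are hypothetical and not part of the disclosure; they stand in for whatever calibration a real camera mount would require.

```python
def pan_command(direction, speed_px_s, px_per_deg=20.0):
    """Map a detected moving direction and image speed (pixels/s) to a
    pan rate (degrees/s) for the camera motor.

    px_per_deg is an assumed calibration value relating image-plane
    displacement to camera rotation; direction strings are illustrative.
    """
    rate = speed_px_s / px_per_deg
    # Positive pan rate pans the camera toward the right (assumed convention)
    sign = {"left-to-right": +1, "right-to-left": -1}.get(direction, 0)
    return sign * rate

# An object crossing left-to-right at 100 px/s yields a +5.0 deg/s pan rate
cmd = pan_command("left-to-right", 100)
```

In a real system, the returned rate would be quantized to the step size of a stepper motor or fed to a motor driver through control module 220.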
[0043]
[0044] In
[0045] One or more regions of interest (ROIs) may be arranged in each image frame. The ROIs shown in
[0046] The three ROIs may be arranged at a center portion of the field of view (e.g., at the center portion of image frame 301). For example, the three ROIs may be arranged as a group around the center ⅓ portion of image frame 301. Although one group of three ROIs is shown as arranged at the center portion of image frame 301, the disclosed systems and methods may use multiple groups of ROIs in image frame 301 for detecting multiple moving objects simultaneously. For example, image frame 301 may include two or more groups of ROIs arranged at suitable locations within image frame 301 for detecting and/or tracking two or more moving objects at two or more regions of image frame 301. The number of ROIs included in different groups of ROIs may be the same or may be different.
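The grouping of three ROIs around the center portion of a frame can be illustrated with the following Python sketch. The strip dimensions, the spacing between strips, and the (x, y, w, h) rectangle representation are assumed values for illustration only; the disclosure does not specify ROI coordinates or sizes.

```python
def center_roi_strips(frame_w, frame_h, strip_w=40, strip_h=200, gap=80):
    """Lay out three vertical ROI strips (left, center, right) grouped
    around the middle of the frame.

    Returns a dict mapping ROI name to an (x, y, w, h) rectangle.
    All dimensions are hypothetical defaults, not from the disclosure.
    """
    cx, cy = frame_w // 2, frame_h // 2
    y0 = cy - strip_h // 2
    rois = {}
    for name, dx in (("left", -gap), ("center", 0), ("right", gap)):
        x0 = cx + dx - strip_w // 2
        rois[name] = (x0, y0, strip_w, strip_h)
    return rois

# For a 1280x720 frame, the three strips cluster around x = 640
rois = center_roi_strips(1280, 720)
```

A second group of ROIs for another region of the frame could be produced by calling the same helper with a different center offset.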
[0047] In the embodiment shown in
[0048] In the embodiment shown in
[0049] Processor 210 may obtain image data of right ROI 313 from image frame 301 after detecting moving object 305 entering right ROI 313. Processor 210 may compare the image data (e.g., brightness data) of right ROI 313 with image data (e.g., brightness data) of a background image corresponding to right ROI 313 using the methods described below. Processor 210 may also obtain color data as part of the image data of right ROI 313. The color data of right ROI 313 may be used to compare to color data of center ROI 312, which may be obtained in an image frame 302 shown in
[0050]
[0051] Processor 210 may obtain image data of center ROI 312 from image frame 302 after detecting moving object 305 entering center ROI 312. Processor 210 may compare the image data (e.g., brightness data) of center ROI 312 with image data (e.g., brightness data) of a background image corresponding to center ROI 312 using the methods described below. Processor 210 may also obtain color data as part of the image data of center ROI 312. The color data of center ROI 312 may be used to compare to color data of right ROI 313, which may be obtained in image frame 301 shown in
[0052]
[0053] Processor 210 may obtain image data of left ROI 311 from image frame 303 after detecting moving object 305 entering left ROI 311. Processor 210 may compare the image data (e.g., brightness data) of left ROI 311 with image data (e.g., brightness data) of a background image corresponding to left ROI 311 using the methods described below. Processor 210 may also obtain color data as part of the image data of left ROI 311. The color data of left ROI 311 may be used to compare to color data of center ROI 312, which may be obtained in an image frame 302 shown in
[0054] As shown in
[0055]
[0056] For each image frame, processor 210 may perform one or more of steps 410, 415, and 420. In steps 410, 415, and 420, processor 210 may carry out a similar analysis and process for a different region of interest (ROI). The analysis of an ROI and its corresponding background image (also referred to as "background") may be performed independently of any other ROI. For example, in step 410, processor 210 may analyze image data of the left ROI (e.g., left ROI 311) and the corresponding background image in an image frame (e.g., image frame 303). In step 415, processor 210 may analyze image data of the center ROI (e.g., center ROI 312) and the corresponding background image in an image frame (e.g., image frame 302). In step 420, processor 210 may analyze image data of the right ROI (e.g., right ROI 313) and the corresponding background image in an image frame (e.g., image frame 301). In some embodiments, for each image frame, only one of steps 410, 415, and 420 is carried out by processor 210. In other embodiments, for each image frame, two or more of steps 410, 415, and 420 may be carried out by processor 210. In steps 410, 415, and 420, analyzing an ROI and the corresponding background image may include comparing the image data of the ROI with image data of the background, updating the image data of the background when certain criteria are satisfied based on a result of the comparison, and obtaining time and/or color data for detecting an object, such as a moving object, from the image frames. When more than three ROIs are included in an image frame, method 400 may include additional steps similar to steps 410, 415, and 420. When fewer than three ROIs are included in an image frame, one or more of steps 410, 415, and 420 may be omitted from method 400. For example, when only two ROIs (right and center ROIs 313 and 312) are included in the image frames, step 410 may be omitted.
[0057] Updating the image data of a background corresponding to an ROI in a present image frame may be performed for each ROI independently. First, initial image data for the background of an ROI is obtained after a certain number (e.g., N, which may be a positive integer, e.g., 2, 3, 5, 10, 15, 20, etc.) of image frames have been obtained. As image frames are obtained subsequently, the background may be updated using the newly obtained image data of the ROI. The initial image data of the background may be obtained using the following method.
[0058] For each ROI, the initial image data of the background may be determined based on image data of the ROI from a predetermined number of image frames, such as the last N image frames, as described below. The term "last N image frames" refers to the previously obtained image frames counted backwards from the present image frame under analysis. The term "last N image frames" may include the present image frame that includes the ROI being analyzed, and the last N image frames may be the last N consecutively or non-consecutively obtained image frames. In other embodiments, the term "last N image frames" may not include the present image frame that includes the ROI being analyzed.
[0059] The image data of an image (e.g., of ROI or background) may include pixel values of pixels included in the image. A pixel value may include both the brightness data and the color data of a pixel. First, for any image frame, the brightness of the pixels included in an ROI may be obtained as an n-by-m brightness matrix M.sub.n,m, where n is a positive integer standing for the number of rows and m is a positive integer standing for the number of columns. The brightness matrix M.sub.n,m may be converted or transformed into an n-by-1 brightness vector P by calculating a value based on all m columns in each row. For example, processor 210 may add up values in all m columns in each row, and use that sum as a value for the row in vector P. In other words, each row in the brightness vector P may be obtained by P(i)=sum (M(i,1), M(i,2), . . . , M(i,m)), where sum ( ) stands for a mathematical function of summation, and where i=1, 2, . . . , n, standing for the row number of the vector P.
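The row-wise reduction of the brightness matrix M.sub.n,m into the brightness vector P described above can be sketched as follows. The sketch uses NumPy, and the sample pixel values are illustrative only:

```python
import numpy as np

def brightness_vector(roi_brightness):
    """Collapse an n-by-m ROI brightness matrix M into an n-by-1 vector P,
    where P(i) = sum(M(i,1), M(i,2), ..., M(i,m)) as in paragraph [0059]."""
    M = np.asarray(roi_brightness, dtype=np.int64)
    return M.sum(axis=1)  # one summed value per row

# Example: a 3x4 brightness patch for an ROI
M = [[10, 12, 11, 10],
     [200, 210, 205, 198],
     [50, 52, 49, 51]]
P = brightness_vector(M)  # P -> [43, 813, 202]
```

Reducing each n-by-m matrix to an n-element vector is what keeps the per-frame memory and comparison cost low, which the disclosure cites as an advantage for embedded systems.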
[0060] This conversion or transformation is performed for the last N image frames to generate N brightness vectors representing image data of the last N image frames, P.sub.f1, P.sub.f2, . . . , P.sub.fN. The initial brightness vector Q for the background image corresponding to the ROI may be obtained based on a statistical analysis of P.sub.f1, P.sub.f2, . . . , P.sub.fN corresponding to the last N image frames. The statistical analysis may include any suitable statistical analysis. In one embodiment, the statistical analysis may include calculating a median vector based on P.sub.f1, P.sub.f2, . . . , P.sub.fN. For example, each row of the initial vector Q may be calculated as Q(i)=median (P.sub.f1(i), P.sub.f2(i), . . . , P.sub.fN(i)), where median ( ) stands for a mathematical function of median calculation, and i=1, 2, . . . , n, standing for the row number of vector Q. In other words, each row of the initial vector Q for the background image may be the median value of the corresponding rows of the brightness vectors of the last N image frames. In another embodiment, the statistical analysis may include calculating an average vector based on P.sub.f1, P.sub.f2, . . . , P.sub.fN. For example, each row of the initial vector Q may be calculated as Q(i)=mean (P.sub.f1(i), P.sub.f2(i), . . . , P.sub.fN(i)), where mean ( ) stands for a mathematical function of mean or average calculation. In other words, each row of the initial vector Q for the background image may be the average value of the corresponding rows of the brightness vectors of the last N image frames.
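The initialization of the background vector Q from the last N brightness vectors may be sketched as follows; the choice of N and the sample values are illustrative only:

```python
import numpy as np

def initial_background(brightness_vectors, stat="median"):
    """Build the initial background vector Q from the brightness vectors
    P_f1..P_fN of the last N frames, row-wise, per paragraph [0060].

    stat selects the statistical analysis: "median" (default) or "mean".
    """
    stack = np.stack(brightness_vectors)  # shape (N, n)
    if stat == "median":
        return np.median(stack, axis=0)   # Q(i) = median over the N frames
    return np.mean(stack, axis=0)         # Q(i) = mean over the N frames

# Example with N = 3 frames and n = 2 rows per brightness vector
Pf = [np.array([40, 800]), np.array([43, 813]), np.array([45, 790])]
Q = initial_background(Pf)  # row-wise median -> [43., 800.]
```

The median variant is less sensitive than the mean to a transient object passing through the ROI during initialization, which is presumably why the disclosure lists it first.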
[0061] Other statistical values calculated based on P.sub.f1, P.sub.f2, . . . , P.sub.fN corresponding to the last N image frames may also be used in the initial vector Q for the background image.
[0062] After the initial vector Q is obtained for a background image of an ROI, the initial vector Q may be updated as more image frames are captured by camera 105. Some existing technologies update the background image (e.g., update Q) in real time, meaning that Q is updated with every newly captured image frame. Such methods tend to blend image data of the moving object appearing at the foreground into the background image, which affects the extraction of background image. Other existing technologies do not update the background image at all, which are ineffective for changing scenes.
[0063] The present disclosure provides a new method for updating the background image that addresses problems associated with the existing technologies.
[0064] Processor 210 may determine whether a difference between the image data of the ROI and the image data of the background is greater than or equal to a predetermined difference value (step 520). When the difference is not greater than or equal to (i.e., smaller than) the predetermined difference value (No, step 520), processor 210 may update the image data of the background using the image data of the ROI obtained from the present image frame (step 525). When the difference is greater than or equal to the predetermined difference value (Yes, step 520), processor 210 may determine whether, within the last N image frames, a total number of image frames N.sub.f, in which such difference is detected, is greater than or equal to a predetermined frame number N.sub.1 (i.e., N.sub.f≧N.sub.1) (step 530). When the total number of image frames N.sub.f is not greater than or equal to N.sub.1 (i.e., N.sub.f<N.sub.1) (No, step 530), processor 210 may not update the image data of the background (step 535). When the total number of image frames N.sub.f is greater than or equal to N.sub.1 (i.e., N.sub.f≧N.sub.1) (Yes, step 530), processor 210 may update the image data of the background (step 540). In some embodiments, processor 210 may update the image data of the background using image data of the ROI from the last N image frames. The updated image data of the background may be stored in memory 215 or other storage devices.
[0065] In some embodiments, step 515 may be implemented as follows. Take left ROI 311 (
[0066] In some embodiments, step 520 may be implemented as follows. Each row of D.sub.1, i.e., D.sub.1(i), may be compared with a predetermined absolute change D.sub.abs, e.g., 5, 10, 15, 20, etc. Processor 210 may determine whether D.sub.1(i) is greater than or equal to D.sub.abs. Additionally or alternatively, each row of D.sub.2, i.e., D.sub.2(i), may be compared with a predetermined relative change D.sub.perc, e.g., 15%, 20%, 30%, etc. Processor 210 may determine whether D.sub.2(i) is greater than or equal to D.sub.perc.
[0067] Various methods may be used to determine whether the difference between the image data of the ROI and the image data of the background is greater than or equal to a predetermined difference value in step 520. For example, in some embodiments, processor 210 may determine that the difference between the image data of the ROI and the image data of the background is greater than or equal to a predetermined difference value (Yes, step 520) when processor 210 determines that a total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is greater than or equal to a predetermined row number R.sub.1 (e.g., 5, 8, 10, etc.) and a total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to a predetermined row number R.sub.2 (e.g., 5, 8, 10, etc.). Otherwise, if the total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is less than R.sub.1, and/or the total number of rows in D.sub.2 that are greater than D.sub.perc is less than R.sub.2, processor 210 may determine that the difference between the image data of the ROI and the image data of the background is less than a predetermined difference value (No, step 520). For example, even if the total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to R.sub.2, processor 210 may determine that the difference is smaller than the predetermined difference value when the total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is less than R.sub.1, which indicates that the absolute changes are small, and vice versa. In these embodiments, the “difference between the image data of the ROI and the image data of the background” includes both the absolute difference and the relative difference, and the “predetermined difference value” includes both the predetermined absolute change D.sub.abs and the predetermined relative change D.sub.perc.
[0068] In other embodiments, method 500 may use only one of the absolute difference vector D.sub.1 and the relative difference vector D.sub.2 in the determination performed in step 520. For example, processor 210 may implement step 520 by determining that the total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is greater than or equal to R.sub.1 (Yes, step 520), or is less than R.sub.1 (No, step 520). In this example, the difference is represented by the absolute change in brightness values, and the predetermined difference value is D.sub.abs. As another example, processor 210 may implement step 520 by determining that the total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to R.sub.2 (Yes, step 520), or is less than R.sub.2 (No, step 520). In this example, the difference is represented by the relative change in brightness values, and the predetermined difference value is D.sub.perc.
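A minimal sketch of the step 520 decision follows. Because the text defining D.sub.1 and D.sub.2 is truncated above, the sketch assumes D.sub.1 is the row-wise absolute difference |P−Q| and D.sub.2 the corresponding relative difference; the threshold defaults are illustrative values from the ranges given in paragraphs [0066] and [0067]:

```python
import numpy as np

def difference_exceeds(P, Q, d_abs=10, d_perc=0.2, r1=5, r2=5):
    """Step 520 sketch: decide whether the ROI vector P differs from the
    background vector Q by at least the predetermined difference value.

    Assumptions (the source text is truncated here): D1 = |P - Q| is the
    absolute difference vector and D2 = D1 / Q the relative difference
    vector. Returns True ("Yes") when the count of rows with D1 >= d_abs
    reaches r1 AND the count of rows with D2 > d_perc reaches r2.
    """
    P = np.asarray(P, dtype=np.float64)
    Q = np.asarray(Q, dtype=np.float64)
    D1 = np.abs(P - Q)                 # absolute change per row
    D2 = D1 / np.maximum(Q, 1e-9)      # relative change per row (guard /0)
    rows_abs = int(np.count_nonzero(D1 >= d_abs))
    rows_rel = int(np.count_nonzero(D2 > d_perc))
    return rows_abs >= r1 and rows_rel >= r2

# A uniform jump of 30 on a background of 100 trips both thresholds
changed = difference_exceeds([130] * 6, [100] * 6)
```

Using only one of the two counts, as paragraph [0068] describes, amounts to dropping one of the two conditions in the final return statement.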
[0069] If processor 210 determines “No” in step 520 (e.g., by determining at least one of the following is not satisfied: “a total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is greater than or equal to R.sub.1,” “a total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to R.sub.2,” or both), processor 210 may update the image data of the background (step 525). This means, when the difference between the image data of the ROI and the background is small, the background image is updated using the image data of the ROI in the present image frame. The updating may be the same as the background image updating method discussed above, i.e., using the median or average value of the ROI image data from the last N image frames including the present image frame. For example, the brightness vector Q of the background image may be replaced with a median or average vector obtained based on brightness vectors P.sub.f1, P.sub.f2, . . . , P.sub.fN corresponding to the last N image frames.
[0070] If processor 210 determines “Yes” in step 520 (e.g., by determining at least one of the following is satisfied: “a total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is greater than or equal to R.sub.1,” “a total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to R.sub.2,” or both), processor 210 may check the results of similar determinations made in step 520 for the last N image frames (which may or may not include the present image frame). Processor 210 may calculate the total number of image frames N.sub.f in the last N image frames, in which a “Yes” determination is made in a similar step 520 performed previously. In other words, in those N.sub.f image frames, at least one of the following is satisfied: “a total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is greater than or equal to R.sub.1,” “a total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to R.sub.2,” or both.
[0071] In step 530, processor 210 may also determine whether N.sub.f is greater than or equal to the predetermined frame number N.sub.1. If N.sub.f≥N.sub.1, processor 210 may update the image data of the background (e.g., brightness vector Q) using the image data of the ROI (e.g., brightness vector P) (step 540). Otherwise, processor 210 may not update the image data of the background (step 535). The background image updating in step 540 may be similar to that discussed above for step 525. That is, the background image updating may be based on the last N image frames, including the present image frame.
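The persistence check of steps 530-540 may be sketched as follows (illustrative only; the window size N and frame-count threshold N1 are assumed parameters):

```python
from collections import deque

class ChangePersistence:
    """Track the 'Yes'/'No' results of step 520 over the last N image
    frames; per step 530, the background is updated only when the
    number of 'Yes' frames N_f reaches the predetermined number N1."""
    def __init__(self, n=10, n1=8):
        self.results = deque(maxlen=n)  # step-520 results for last N frames
        self.n1 = n1
    def should_update(self, changed):
        self.results.append(bool(changed))
        n_f = sum(self.results)         # N_f: frames with a "Yes" result
        return n_f >= self.n1           # True -> step 540; False -> step 535
```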
[0072] Referring back to
[0073] In step 425, processor 210 may determine a time sequence of the changes that occurred to right, center, and left ROIs 313, 312, and 311, based on the recorded time instances T.sub.1, T.sub.2, and T.sub.3. For example, when T.sub.1 is earlier than T.sub.2, and T.sub.2 is earlier than T.sub.3, processor 210 may determine the time sequence of changes as follows: the changes occurred to right ROI 313 first, then to center ROI 312, and finally to left ROI 311. Alternatively, if T.sub.3 is earlier than T.sub.2, and T.sub.2 is earlier than T.sub.1, processor 210 may determine the time sequence of changes as follows: the changes occurred to left ROI 311 first, then to center ROI 312, and finally to right ROI 313.
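The two orderings above can be sketched directly (illustrative; the return strings are assumed labels):

```python
def movement_direction(t1, t2, t3):
    """Infer the time sequence of changes from the recorded instances
    T1 (right ROI 313), T2 (center ROI 312), and T3 (left ROI 311)."""
    if t1 < t2 < t3:
        return "right-to-left"   # right ROI changed first
    if t3 < t2 < t1:
        return "left-to-right"   # left ROI changed first
    return "undetermined"        # no consistent ordering
```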
[0074] Referring to
[0075] Processor 210 may select certain rows of color vector V.sub.c, and generate another color vector V.sub.c2 containing the selected rows. The selected rows include color data of areas (e.g., areas around a certain height) of the ROI to which changes occurred. Processor 210 may determine which rows to select from V.sub.c based on the rows in the brightness vector P of the ROI in which a difference (as compared to the brightness vector Q of the background image) greater than or equal to the predetermined difference value was detected in step 520. For example, when P is compared to Q, one or both of the following conditions may be satisfied: a total number of rows in D.sub.1 that are greater than or equal to D.sub.abs is greater than or equal to R.sub.1, and a total number of rows in D.sub.2 that are greater than D.sub.perc is greater than or equal to R.sub.2. Depending on the application, if one or both of the above conditions are satisfied, processor 210 determines that there exists a moving object in the ROI. The rows satisfying one or both of these conditions in D.sub.1 and/or D.sub.2 are identified, and their positions in the vectors are recorded (e.g., saved in a variable named “position_Move”). The rows in brightness vector P at these positions are identified (e.g., row numbers 1, 5, 8, and 10). Then, the corresponding rows at these positions in the color vector V.sub.c are identified and selected to form the color vector V.sub.c2. V.sub.c2 includes color data of areas of the ROI to which changes occurred. For example, each row of V.sub.c2 indicates the color of an area of the ROI at a height that corresponds to the position of the row.
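The row-selection step can be sketched as follows (illustrative only; here only the absolute test is used, and `position_move` mirrors the “position_Move” variable mentioned above):

```python
import numpy as np

def changed_color_rows(p, q, v_c, d_abs=20.0):
    """Form V_c2 by selecting the rows of the ROI color vector V_c at
    positions where the brightness difference |P - Q| meets the
    predetermined absolute threshold."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    v_c = np.asarray(v_c)
    position_move = np.flatnonzero(np.abs(p - q) >= d_abs)  # changed rows
    return v_c[position_move]   # V_c2: color data of the changed areas
```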
[0076] Processor 210 may obtain a color vector V.sub.c2 for each ROI. For example, left ROI 311 may be associated with a color vector V.sub.c2.sub._L, center ROI 312 may be associated with a color vector V.sub.c2.sub._C, and right ROI 313 may be associated with a color vector V.sub.c2.sub._R.
[0077] In step 430, processor 210 may determine whether a same moving object has moved across the right, center, and left ROIs 313, 312, and 311 based on a determination of whether the differences between the color vectors V.sub.c2.sub._L, V.sub.c2.sub._C, and V.sub.c2.sub._R are small (e.g., smaller than a predetermined color difference value), i.e., whether the color vectors are substantially the same. Various methods may be used to quantify the color difference and determine whether the color difference between the color vectors is sufficiently small such that processor 210 may determine that a same moving object has moved across the ROIs. In one embodiment, processor 210 may compare the color vectors to obtain color difference vectors, e.g., (V.sub.c2.sub._L−V.sub.c2.sub._C) and (V.sub.c2.sub._C−V.sub.c2.sub._R). Processor 210 may determine whether the value of each row of the resulting color difference vectors falls within a predetermined range of color difference values, or is smaller than a predetermined color difference value. For example, if each row of the color difference vectors falls within the predetermined range of color difference values or is smaller than the predetermined color difference value, the color difference in the ROIs may be determined to be sufficiently small to indicate that a same object moved across the ROIs. Otherwise, the color difference in the three ROIs may be determined to be large, and processor 210 may determine that the changes in the ROIs were not caused by the same moving object. In one embodiment, when the color difference is large (indicating that different moving objects may have caused the changes in the ROIs), processor 210 may not perform steps 435-455. When the color difference between the ROIs is sufficiently small, as discussed above, processor 210 may continue to perform steps 435-455.
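As an illustrative sketch of this color-matching test (the threshold value is an assumed parameter, and the per-row comparison uses the "smaller than a predetermined color difference value" variant):

```python
import numpy as np

def same_object(vc2_l, vc2_c, vc2_r, max_color_diff=15.0):
    """Step-430 style test: the same object is deemed to have crossed
    all three ROIs when every row of both color difference vectors,
    (V_c2_L - V_c2_C) and (V_c2_C - V_c2_R), is below the threshold."""
    vc2_l = np.asarray(vc2_l, dtype=float)
    vc2_c = np.asarray(vc2_c, dtype=float)
    vc2_r = np.asarray(vc2_r, dtype=float)
    d_lc = np.abs(vc2_l - vc2_c)   # left vs. center color difference
    d_cr = np.abs(vc2_c - vc2_r)   # center vs. right color difference
    return bool(np.all(d_lc < max_color_diff) and np.all(d_cr < max_color_diff))
```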
[0078] In step 435, processor 210 may detect, in the image frames, a moving object (e.g., moving object 305) moving from right to left in the field of view. In step 445, processor 210 may detect, in the image frames, a moving object moving from left to right in the field of view. In some embodiments, processor 210 may perform one of steps 435 and 445. In some embodiments, processor 210 may perform both steps 435 and 445.
[0079] Detection of the moving object in steps 435 and 445 may be based on the time information and/or the color data obtained from the image frames.
[0080] Processor 210 may compare first time interval T.sub.i1 with a first predetermined time delay T.sub.d1, and compare second time interval T.sub.i2 with a second predetermined time delay T.sub.d2. In some embodiments, T.sub.d1 is the same as T.sub.d2. In some embodiments, T.sub.d1 and T.sub.d2 are different from each other. In some embodiments, each of time delays T.sub.d1 and T.sub.d2 is predetermined based on distance d.sub.1 or d.sub.2 between the ROIs. For example, first predetermined time delay T.sub.d1 may be proportional to distance d.sub.1, and second predetermined time delay T.sub.d2 may be proportional to distance d.sub.2. When d.sub.1 is larger than d.sub.2, T.sub.d1 may be pre-set to be larger than T.sub.d2. Processor 210 may detect moving object 305 moving across right ROI 313, center ROI 312, and left ROI 311 when both a time requirement and a color requirement are satisfied. The time requirement may be that first time interval T.sub.i1 is less than first predetermined time delay T.sub.d1 (i.e., T.sub.i1<T.sub.d1) and second time interval T.sub.i2 is less than second predetermined time delay T.sub.d2 (i.e., T.sub.i2<T.sub.d2). The color requirement may be that the color differences represented by the color difference vectors, e.g., (V.sub.c2.sub._L−V.sub.c2.sub._C) and (V.sub.c2.sub._C−V.sub.c2.sub._R), are small, or in other words, that the color vectors for the ROIs are substantially the same. Additionally, in some embodiments, processor 210 may also calculate the color difference vector between the right and left ROIs, e.g., (V.sub.c2.sub._L−V.sub.c2.sub._R), and compare this vector with the above two color difference vectors.
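The combined time and color requirement can be sketched as follows (illustrative only; the delay values are assumed, and the color requirement is passed in as a precomputed boolean such as the result of a step-430 comparison):

```python
def object_detected(t_i1, t_i2, colors_match, t_d1=0.5, t_d2=0.5):
    """Detect the moving object only when both requirements hold:
    - time requirement: T_i1 < T_d1 and T_i2 < T_d2
    - color requirement: the ROI color vectors are substantially
      the same (colors_match is True)."""
    time_ok = (t_i1 < t_d1) and (t_i2 < t_d2)
    return time_ok and colors_match
```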
[0081] In step 435, processor 210 may determine that moving object 305 is moving from right to left in the field of view of camera 105 based on the time sequence of the changes that occurred to right, center, and left ROIs 313, 312, and 311. For example, as shown in
[0082] Still referring to
[0083] As shown in FIG. 4, processor 210 may determine a moving speed of the left-moving object (step 440). Alternatively or additionally, processor 210 may determine a moving speed of the right-moving object (step 450). Referring to
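The paragraph above is truncated in this excerpt, so the exact speed computation is not shown. As a hedged sketch only, one plausible estimate, assuming the speed is derived from the ROI spacings d.sub.1 and d.sub.2 and the measured crossing intervals T.sub.i1 and T.sub.i2, is the average of the two per-leg speeds:

```python
def moving_speed(d1, d2, t_i1, t_i2):
    """Illustrative speed estimate (not the specification's formula):
    average the speed over each leg, d1/T_i1 and d2/T_i2."""
    return 0.5 * (d1 / t_i1 + d2 / t_i2)
```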
[0084] As shown in
[0085] As shown in
[0086]
[0087] While illustrative embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations and/or alterations as would be appreciated by those skilled in the art based on the present disclosure. The limitations in the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application. The examples are to be construed as non-exclusive. Furthermore, the steps of the disclosed processes may be modified in any manner, including by reordering steps and/or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as illustrative only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.