Image capture and processing system supporting a multi-tier modular software architecture
09785811 · 2017-10-10
CPC classification
G06K7/14
PHYSICS
G06K7/10831
PHYSICS
International classification
G06K5/00
PHYSICS
G06K7/14
PHYSICS
Abstract
An image capture and processing system supports a multi-tier, modular, plug-in-extendable software architecture. The image capture and processing system can be realized as an image-capturing cell phone, a digital camera, a video camera, a mobile computing terminal, or a portable data terminal (PDT), provided with a suitable hardware platform, communication protocols and user interfaces. A third-party customer can write and install a software plug-in into the application layer so as to enhance or modify the behavior of the image capture and processing system without any required knowledge of the hardware platform, communication protocols and/or user interfaces.
Claims
1. A system, comprising: an image formation and detection subsystem; an image capturing subsystem for capturing images detected by the image formation and detection subsystem; an image processing subsystem for processing images captured by the image capturing subsystem; an illumination subsystem comprising at least one light emitting diode (LED); a processing device; and memory comprising processing instructions that, when executed by the processing device, execute an application for reading indicia within the processed images and producing symbol character data representative of the read indicia; wherein the memory comprises a memory architecture that supports a multi-tier modular software architecture characterized by an operating system (OS) layer and an application layer in which the application for reading indicia and producing symbol character data is run.
2. The system according to claim 1, comprising an input/output subsystem for transmitting signals to an external system.
3. The system according to claim 2, comprising an image buffering subsystem for buffering images detected by the image formation and detection subsystem.
4. The system according to claim 2, wherein the multi-tier modular software architecture comprises an intermediate layer disposed between the application layer and the OS layer.
5. The system according to claim 4, wherein: the OS layer comprises one or more software modules comprising an OS kernel module, an OS file system module, and/or a device driver module; the intermediate layer comprises one or more software modules comprising a tasks manager module, an events dispatcher module, an input/output manager module, a user commands manager module, a timer subsystem module, an input/output subsystem module, and/or a memory control subsystem module; and the application layer comprises one or more software modules comprising an indicia decoding module, a function programming module, an application events manager module, a user commands table module, and/or a command handler module.
6. The system according to claim 4, wherein, prior to capturing images, the processing device is configured to: (1) access one or more software modules from the OS layer and executing code contained therein; (2) access one or more software modules from the intermediate layer and executing code contained therein; and (3) access one or more software modules from the application layer and executing code contained therein.
7. The system according to claim 1, wherein the at least one light emitting diode (LED) produces narrow-band illumination.
8. The system according to claim 1, comprising an illumination control subsystem for controlling the operation of the illumination subsystem.
9. The system according to claim 8, wherein the one or more indicia comprise 1D bar code symbols, 2D bar code symbols, and/or data matrix type code symbol structures.
10. The system according to claim 1, comprising an automatic object detection subsystem; wherein the image formation and detection subsystem has a field of view (FOV); and wherein the automatic object detection subsystem automatically detects the presence of an object in the FOV, and in response thereto, generates a trigger signal.
11. A system, comprising: an image formation and detection subsystem; an image capturing subsystem for capturing images detected by the image formation and detection subsystem; an image processing subsystem for processing images captured by the image capturing subsystem; an input/output subsystem for transmitting signals to an external system; an illumination subsystem comprising at least one light emitting diode (LED); a processing device; memory comprising processing instructions that, when executed by the processing device, execute an application for reading indicia within the processed images and producing symbol character data representative of the read indicia; Flash ROM for storing the application; and RAM for storing the images captured by the image capturing subsystem; wherein the memory comprises a memory architecture that supports a multi-tier modular software architecture characterized by an operating system (OS) layer and an application layer in which the application for reading indicia and producing symbol character data is run.
12. The system according to claim 11, comprising an image buffering subsystem for buffering images detected by the image formation and detection subsystem.
13. The system according to claim 11, wherein the multi-tier modular software architecture comprises an intermediate layer disposed between the application layer and the OS layer.
14. The system according to claim 13, wherein: the OS layer comprises one or more software modules comprising an OS kernel module, an OS file system module, and/or a device driver module; the intermediate layer comprises one or more software modules comprising a tasks manager module, an events dispatcher module, an input/output manager module, a user commands manager module, a timer subsystem module, an input/output subsystem module, and/or a memory control subsystem module; and the application layer comprises one or more software modules comprising an indicia decoding module, a function programming module, an application events manager module, a user commands table module, and/or a command handler module.
15. The system according to claim 11, wherein, prior to capturing images, the processing device is configured to: (1) access one or more software modules from the OS layer and executing code contained therein; (2) access one or more software modules from the intermediate layer and executing code contained therein; and (3) access one or more software modules from the application layer and executing code contained therein.
16. The system according to claim 11, wherein the memory maintains system parameters used to configure functions of the image capture and processing system.
17. A system, comprising: an image formation and detection subsystem; an image capturing subsystem for capturing images detected by the image formation and detection subsystem; an image processing subsystem for processing images captured by the image capturing subsystem; an input/output subsystem for transmitting signals to an external system; an illumination subsystem comprising at least one light emitting diode (LED); a processing device; and memory comprising processing instructions that, when executed by the processing device, execute an application for reading indicia within the processed images and producing symbol character data representative of the read indicia; wherein the memory comprises a memory architecture that supports a multi-tier modular software architecture characterized by an operating system (OS) layer and an application layer in which the application for reading indicia and producing symbol character data is run; and wherein the application layer comprises one or more software modules comprising an indicia decoding module, a function programming module, an application events manager module, a user commands table module, and/or a command handler module.
18. The system according to claim 17, wherein the OS layer comprises one or more software modules comprising an OS kernel module, an OS file system module, and/or a device driver module.
19. The system according to claim 17, wherein the multi-tier modular software architecture comprises an intermediate layer disposed between the application layer and the OS layer.
20. The system according to claim 19, wherein the intermediate layer comprises one or more software modules comprising a tasks manager module, an events dispatcher module, an input/output manager module, a user commands manager module, a timer subsystem module, an input/output subsystem module, and/or a memory control subsystem module.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) For a more complete understanding of how to practice the Objects of the Present Invention, the following Detailed Description of the Illustrative Embodiments can be read in conjunction with the accompanying Drawings, briefly described below:
(4) FIGS. 1C1-1C3, taken together, set forth a table indicating the features and functions supported by each of the subsystems provided in the system architecture of the digital image capture and processing system of the present invention, represented in
(19) FIG. 2L1 is a schematic block diagram representative of a system design for the hand-supportable digital imaging-based bar code symbol reading device illustrated in
(20) FIG. 2L2 is a schematic block representation of the Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem, realized using the three-tier computing platform illustrated in
(27) FIG. 3F1 is a first schematic representation showing, from a side view, the physical position of the LEDs used in the Multi-Mode Illumination Subsystem, in relation to the image formation lens assembly, the image sensing array employed therein (e.g. a Motorola MCM20027 or National Semiconductor LM9638 CMOS 2-D image sensing array having a 1280×1024 pixel resolution ( format), 6 micron pixel size, 13.5 MHz clock rate, with randomly accessible region of interest (ROI) window capabilities);
(28) FIG. 3F2 is a second schematic representation showing, from an axial view, the physical layout of the LEDs used in the Multi-Mode Illumination Subsystem of the digital imaging-based bar code symbol reading device, shown in relation to the image formation lens assembly, and the image sensing array employed therein;
(29) FIG. 4A1 is a schematic representation specifying the range of narrow-area illumination, near-field wide-area illumination, and far-field wide-area illumination produced from the LED-Based Multi-Mode Illumination Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
(30) FIG. 4A2 is a table specifying the geometrical properties and characteristics of each illumination mode supported by the LED-Based Multi-Mode Illumination Subsystem employed in the hand-supportable digital imaging-based bar code symbol reading device of the present invention;
(32) FIG. 4C1 is a graphical representation showing the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the narrow-area illumination array in the Multi-Mode Illumination Subsystem of the present invention;
(33) FIG. 4C2 is a graphical representation showing the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the narrow-area illumination array in the Multi-Mode Illumination Subsystem of the present invention;
(34) FIG. 4C3 is a schematic representation of the cylindrical lenses used before the LEDs in the narrow-area (linear) illumination arrays in the digital imaging-based bar code symbol reading device of the present invention, wherein the first surface of the cylindrical lens is curved vertically to create a narrow-area (i.e. linear) illumination pattern, and the second surface of the cylindrical lens is curved horizontally to control the height of the narrow-area illumination pattern to produce a narrow-area (i.e. linear) illumination field;
(35) FIG. 4C4 is a schematic representation of the layout of the pairs of LEDs and two cylindrical lenses used to implement the narrow-area (linear) illumination array employed in the digital imaging-based bar code symbol reading device of the present invention;
(36) FIG. 4C5 is a set of six illumination profiles for the narrow-area (linear) illumination fields produced by the narrow-area (linear) illumination array employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 30, 40, 50, 80, 120, and 220 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the narrow-area illumination field begins to become substantially uniform at about 80 millimeters;
(37) FIG. 4D1 is a graphical representation showing the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the wide area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention;
(38) FIG. 4D2 is a graphical representation showing the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the far-field and near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention;
(39) FIG. 4D3 is a schematic representation of the plano-convex lenses used before the LEDs in the far-field wide-area illumination arrays in the illumination subsystem of the present invention,
(40) FIG. 4D4 is a schematic representation of the layout of LEDs and plano-convex lenses used to implement the far-field and near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the present invention, wherein the illumination beam produced therefrom is aimed by positioning the lenses at angles before the LEDs in the near-field (and far-field) wide-area illumination arrays employed therein;
(41) FIG. 4D5 is a set of six illumination profiles for the near-field wide-area illumination fields produced by the near-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 10, 20, 30, 40, 60, and 100 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the near-field wide-area illumination field begins to become substantially uniform at about 40 millimeters;
(42) FIG. 4D6 is a set of three illumination profiles for the far-field wide-area illumination fields produced by the far-field wide-area illumination arrays employed in the digital imaging-based bar code symbol reading device of the illustrative embodiment, taken at 100, 150 and 220 millimeters along the field away from the imaging window (i.e. working distance) of the digital imaging-based bar code symbol reading device, illustrating that the spatial intensity of the far-field wide-area illumination field begins to become substantially uniform at about 100 millimeters;
(43) FIG. 4D7 is a table illustrating a preferred method of calculating the pixel intensity value for the center of the far-field wide-area illumination field produced from the Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device of the present invention, showing a significant signal strength (greater than 80 DN);
(44) FIG. 5A1 is a schematic representation showing how the red-wavelength reflecting (high-pass) imaging window integrated within the hand-supportable housing of the digital imaging-based bar code symbol reading device, and the low-pass optical filter disposed before its CMOS image sensing array therewithin, cooperate to form a narrow-band optical filter subsystem for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the Multi-Mode Illumination Subsystem employed in the digital imaging-based bar code symbol reading device, and rejecting all other optical wavelengths outside this narrow optical band, however generated (e.g. by ambient light sources);
(45) FIG. 5A2 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element disposed after the red-wavelength reflecting high-pass imaging window within the hand-supportable housing of the digital imaging-based bar code symbol reading device, but before its CMOS image sensing array, showing that optical wavelengths below 620 nanometers are transmitted and wavelengths above 620 nm are substantially blocked (e.g. absorbed or reflected);
(46) FIG. 5A3 is a schematic representation of transmission characteristics (energy versus wavelength) associated with the red-wavelength reflecting high-pass imaging window integrated within the hand-supportable housing of the digital imaging-based bar code symbol reading device of the present invention, showing that optical wavelengths above 700 nanometers are transmitted and wavelengths below 700 nm are substantially blocked (e.g. absorbed or reflected);
(47) FIG. 5A4 is a schematic representation of the transmission characteristics of the narrow-band spectral filter subsystem integrated within the hand-supportable imaging-based bar code symbol reading device of the present invention, plotted against the spectral characteristics of the LED emissions produced from the Multi-Mode Illumination Subsystem of the illustrative embodiment of the present invention;
(50) FIGS. 6C1 and 6C2, taken together, set forth a schematic diagram of a hybrid analog/digital circuit designed to implement the Automatic Light Exposure Measurement and Illumination Control Subsystem of
(52) FIGS. 6E1 and 6E2, taken together, set forth a flow chart describing the steps involved in carrying out the global exposure control method of the present invention, within the digital imaging-based bar code symbol reading device of the illustrative embodiments;
(62) FIGS. 12E1 and 12E2 set forth a schematic representation of the Input/Output Subsystem software module which provides a means for creating and deleting input/output connections, and communicating with external systems and devices;
(63) FIGS. 12F1 and 12F2 set forth a schematic representation of the Timer Subsystem which provides a means for creating, deleting, and utilizing logical timers;
(64) FIGS. 12G1 and 12G2 set forth a schematic representation of the Memory Control Subsystem which provides an interface for managing the thread-level dynamic memory within the device, fully compatible with standard dynamic memory management functions, as well as a means for buffering collected data;
(65)
(66)
(67)
(68)
(69)
(70)
(71)
(72)
(73)
(74)
(75)
(76)
(77)
(78)
(79)
(80)
(81)
(82)
(83)
(84)
(85)
(86)
(87)
(88)
(89)
(90)
(91)
(92)
(93) FIG. 19A1 is a perspective view of an alternative illustrative embodiment of the digital image capture and processing engine shown in
(94) FIG. 19A2 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in FIG. 19A1, wherein the system design is similar to that shown in FIG. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, involving the real-time exposure quality analysis of captured digital images in accordance with an adaptive system control method;
(95) FIG. 19B1 is a perspective view of an automatic imaging-based bar code symbol reading system of the present invention supporting a presentation-type mode of operation using wide-area illumination and video image capture and processing techniques, and employing the general engine design shown in FIG. 19A1;
(96) FIG. 19B2 is a cross-sectional view of the system shown in FIG. 19B1;
(97) FIG. 19B3 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in FIG. 19B1, wherein the system design is similar to that shown in FIG. 2A1, except that the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with a software-based illumination metering program realized within the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, performing the real-time exposure quality analysis of captured digital images in accordance with an adaptive system control method;
(98) FIG. 19C1 is a perspective view of an automatic imaging-based bar code symbol reading system of the present invention supporting a pass-through mode of operation using narrow-area illumination and video image capture and processing techniques, as well as a presentation-type mode of operation using wide-area illumination and video image capture and processing techniques;
(99) FIG. 19C2 is a schematic representation illustrating the system of FIG. 19C1 operated in its Pass-Through Mode of system operation;
(100) FIG. 19C3 is a schematic representation illustrating the system of FIG. 19C1 operated in its Presentation Mode of system operation;
(101) FIG. 19C4 is a schematic block diagram representative of a system design for the digital image capture and processing engine of the present invention shown in FIGS. 19C1 and 19C2, wherein the system design is similar to that shown in FIG. 2A1, except for the following differences: (1) the Automatic Light Exposure Measurement and Illumination Control Subsystem is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem in cooperation with the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem, carrying out real-time quality analysis of captured digital images in accordance with an adaptive system control method; (2) the narrow-area field of illumination and image capture is oriented in the vertical direction with respect to the counter surface of the POS environment, to support the Pass-Through Mode of the system, as illustrated in FIG. 19C2; and (3) the IR-based object presence and range detection system employed in FIG. 19C1 is replaced with an automatic IR-based object presence and direction detection subsystem which comprises four independent IR-based object presence and direction detection channels;
(102) FIG. 19C5 is a schematic block diagram of the automatic IR-based object presence and direction detection subsystem employed in the bar code reading system illustrated in FIGS. 19C1 through 19C4, showing four independent IR-based object presence and direction detection channels which automatically generate activation control signals for four orthogonal directions within the FOV of the system, which are received and processed by a signal analyzer and control logic block;
(112) FIG. 23C1 is an exemplary flow chart representation showing what operations are carried out when the system feature called Data Formatting Procedure is executed within the Data Output Procedure software module in the Application Layer of the system as shown in
(113) FIG. 23C2 is an exemplary flow chart representation showing what operations are carried out when the system feature called Scanner Configuration Procedure is executed within the Data Output Procedure software module in the Application Layer of the system as shown in
DETAILED DESCRIPTION
(114) Referring to the figures in the accompanying Drawings, the various illustrative embodiments of the hand-supportable imaging-based bar code symbol reading system of the present invention will be described in greater detail, wherein like elements will be indicated using like reference numerals.
(115) Overview of the Digital Image Capture and Processing System of the Present Invention Employing Multi-Layer Software-Based System Architecture Permitting Modification and/or Extension of System Features and Functions by Way of Third Party Code Plug-Ins
(116) The present invention addresses the shortcomings and drawbacks of prior art digital image capture and processing systems and devices, including laser and digital imaging-based bar code symbol readers, by providing a novel system architecture, platform and development environment which enables VARs, OEMs and others (i.e. other than the original system designers) to modify and/or extend the standard system features and functions of a very broad class of digital image capture and processing systems and devices, without requiring such third parties to possess detailed knowledge about the hardware platform of the system, its communications with the outside environment, and/or its user-related interfaces. This novel approach has numerous benefits and advantages for third parties wishing to employ, in their own products, the digital image capture and processing technology of an expert digital imager designer and manufacturer, such as Applicants and their Assignee, Metrologic Instruments, Inc., without that manufacturer having to sacrifice or risk the disclosure of its valuable intellectual property and know-how during such system feature and functionality modification and/or extension processes carried out to meet the requirements of the end-user applications at hand.
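The third-party extension concept described above can be illustrated with a minimal sketch. All names below (PluginRegistry, ApplicationLayer, decode_indicia, the "data_formatting" extension point) are assumptions of this illustration, not identifiers from the patent; the point is only that a third party alters application-layer behavior without touching the hardware-facing lower layers.

```python
class PluginRegistry:
    """Holds third-party callables keyed by the extension point they modify."""

    def __init__(self):
        self._plugins = {}

    def install(self, extension_point, plugin):
        self._plugins.setdefault(extension_point, []).append(plugin)

    def run(self, extension_point, value):
        # Each installed plug-in transforms the value in installation order.
        for plugin in self._plugins.get(extension_point, []):
            value = plugin(value)
        return value


class ApplicationLayer:
    """Application-layer indicia reader; lower layers stay hidden behind it."""

    def __init__(self, registry):
        self.registry = registry

    def decode_indicia(self, raw_symbol_data):
        # Standard behavior: produce symbol character data as-is.
        symbol_data = raw_symbol_data.strip()
        # Extension point: third-party plug-ins may reformat the output.
        return self.registry.run("data_formatting", symbol_data)


registry = PluginRegistry()
# A third-party plug-in that prefixes every decoded symbol with a tag.
registry.install("data_formatting", lambda s: "ITEM:" + s)
app = ApplicationLayer(registry)
print(app.decode_indicia("  0123456789  "))  # -> ITEM:0123456789
```

The third party never sees the driver or OS layers; it only registers a callable against a published extension point.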
(117) As shown in
(118) For the illustrative embodiments of the present invention disclosed herein, exemplary standard system features and functions are described in the table of FIGS. 1C1-1C3. Such system features and functions are described below, in conjunction with the subsystem that generally supports each feature and function in the digital image capture and processing system of the present invention:
(119) System Triggering Feature (i.e. Trigger Event Generation): Object Presence Detection Subsystem
(120) Standard System Functions:
(121) Automatic Triggering (i.e. IR Object Presence Detection) (e.g. ON, OFF)
(122) Manual Triggering (e.g. ON, OFF)
(123) Semi-Automatic Triggering (e.g. ON, OFF)
(124) Object Range Detection Feature: Object Range Detection Subsystem
(125) Standard System Functions:
(126) (IR-Based) Long/Short Range Detection (e.g. ON, OFF)
(127) (IR-Based) Quantized/Incremental Range Detection (e.g. ON, OFF)
(128) Object Velocity Detection Feature: Object Velocity Detection Subsystem
(129) Standard System Functions:
(130) LIDAR-Based Object Velocity Detection (e.g. ON, OFF)
(131) IR-PULSE-DOPPLER Object Velocity Detection (e.g. ON, OFF)
(132) Object Dimensioning Feature: Object Dimensioning Subsystem
(133) Standard System Functions:
(134) LIDAR-based Object Dimensioning (e.g. ON or OFF)
(135) Structured-Laser Light Object Dimensioning (e.g. ON or OFF)
(136) Field of View (FOV) Illumination Feature: Illumination Subsystem
(137) Standard System Functions:
(138) Illumination Mode (e.g. Ambient/OFF, LED Continuous, and LED Strobe/Flash)
(139) Automatic Illumination Control (i.e. ON or OFF)
(140) Illumination Field Type (e.g. Narrow-Area Near-Field Illumination, Wide-Area Far-Field Illumination, Narrow-Area Field Of Illumination, Wide-Area Field Of Illumination)
(142) Imaging Formation and Detection Feature: Imaging Formation and Detection (IFD) Subsystem
(143) Standard System Functions:
(144) Image Capture Mode (e.g. Narrow-Area Image Capture Mode, Wide-Area Image Capture Mode)
(145) Image Capture Control (e.g. Single Frame, Video Frames)
(146) Electronic Gain Of The Image Sensing Array (e.g. 1-10,000)
(147) Exposure Time For Each Image Frame Detected by The Image Sensing Array (e.g. programmable in increments of milliseconds)
(148) Exposure Time For Each Block Of Imaging Pixels Within The Image Sensing Array (e.g. programmable in increments of milliseconds)
(149) Field Of View Marking (e.g. One Dot Pattern; Two Dot Pattern; Four Dot Pattern; Visible Line Pattern; Four Dot And Line Pattern)
(150) Digital Image Processing Feature: Digital Image Processing Subsystem
(151) Standard System Functions:
(152) Image Cropping Pattern on Image Sensing Array (e.g. x1, y1, x2, y2, x3, y3, x4, y4)
(153) Pre-processing of Image frames (e.g. digital filter 1, digital filter 2, . . . digital filter n)
(154) Information Recognition Processing (e.g. Recognition of A-th Symbology; . . . Recognition of Z-th Symbology; Alphanumerical Character String Recognition using OCR 1, . . . Alphanumerical Character String Recognition using OCR n; and Text Recognition Processes)
(157) Post-Processing (e.g. Digital Data Filter 1, Digital Data Filter 2, . . . )
(158) Object Feature/Characteristic Set Recognition (e.g. ON or OFF)
(159) Sound Indicator Output Feature: Sound Indicator Output Subsystem
(160) Standard System Functions:
(161) Sound Loudness (e.g., High, Low, Medium Volume)
(162) Sound Pitch (e.g. freq. 1, freq2, freq3, . . . sound 1, . . . sound N)
(163) Visual Indicator Output Feature: Visual Indicator Output Subsystem
(164) Standard System Functions:
(165) Indicator Brightness (e.g., High, Low, Medium Brightness)
(166) Indicator Color (e.g. red, green, yellow, blue, white)
(167) Power Management Feature: Power Management Subsystem
(168) Standard System Functions:
(169) Power Operation Mode (e.g. OFF, ON Continuous, ON Energy Savings)
(170) Energy Savings Mode (e.g. Savings Mode No. 1, Savings Mode No. 2, . . . Savings Mode M)
(171) Image Time/Space Stamping Feature: Image Time/Space Stamping Subsystem
(172) Standard System Functions:
(173) GPS-Based Time/Space Stamping (e.g. ON, OFF)
(174) Network Server Time Assignment (e.g. ON, OFF)
(175) Network (IP) Address Storage Feature: IP Address Storage Subsystem
(176) Standard System Functions:
(177) Manual IP Address Storage (e.g. ON, OFF)
(178) Automatic IP Address Storage via DHCP (e.g. ON, OFF)
(179) Remote Monitoring/Servicing Feature: Remote Monitoring/Servicing Subsystem
(180) Standard System Functions:
(181) TCP/IP Connection (e.g. ON, OFF)
(182) SNMP Agent (e.g. ACTIVE or DEACTIVE)
(183) Input/Output Feature: Input/Output Subsystem
(184) Standard System Functions:
(185) Data Communication Protocols (e.g. RS-232 Serial, USB, Bluetooth, etc)
(186) Output Image File Formats (e.g. JPG/EXIF, TIFF, PICT, PDF, etc)
(187) Output Video File Formats (e.g. MPEG, AVI, etc)
(188) Data Output Format (e.g. ASCII)
(189) Keyboard Interface (e.g. ASCII)
(190) Graphical Display (LCD) Interface (e.g. ACTIVE or DEACTIVE)
(191) System Control and/or Coordination Feature: System Control and/or Coordination Subsystem
Standard System Functions:
(192) Mode of System Operation (e.g. System Mode 1, System Mode 2, . . . System Mode N)
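Purely as an illustration of how the standard system features and functions enumerated above could be held as named system parameters with enumerated settings: the parameter names and allowed values below are assumptions of this sketch, not the patent's actual configuration schema.

```python
# Hypothetical parameter table; each entry maps one standard system
# function to its set of enumerated settings (assumed names and values).
SYSTEM_PARAMETERS = {
    "automatic_triggering": {"ON", "OFF"},
    "illumination_mode": {"AMBIENT_OFF", "LED_CONTINUOUS", "LED_STROBE"},
    "image_capture_mode": {"NARROW_AREA", "WIDE_AREA"},
    "data_output_format": {"ASCII"},
}


def configure(settings, parameter, value):
    """Validate a setting against the parameter table before applying it."""
    allowed = SYSTEM_PARAMETERS.get(parameter)
    if allowed is None:
        raise KeyError(f"unknown system parameter: {parameter}")
    if value not in allowed:
        raise ValueError(f"{value!r} not allowed for {parameter}")
    settings[parameter] = value
    return settings


settings = {}
configure(settings, "illumination_mode", "LED_STROBE")
configure(settings, "image_capture_mode", "NARROW_AREA")
print(settings)
```

Validating every setting against a fixed table is one simple way to keep third-party configuration requests from driving the device into an unsupported state.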
(193) As indicated in
(194) As shown in
(195) In general, the digital image capture and processing system of the present invention has a set of standard features and functions as described above, and a set of custom features and functionalities that satisfy customized end-user application requirements, which typically aim to modify and/or extend such standard system features and functions for particular applications at hand.
(196) In the illustrative embodiments described in detail below with reference to
(197) a digital camera subsystem for projecting a field of view (FOV) upon an object to be imaged in said FOV, and detecting imaged light reflected off the object during illumination operations in an image capture mode in which one or more digital images of the object are formed and detected by said digital camera subsystem;
(198) a digital image processing subsystem for processing digital images and producing raw or processed output data or recognizing or acquiring information graphically represented therein, and producing output data representative of the recognized information;
(199) an input/output subsystem for transmitting said output data to an external host system or other information receiving or responding device;
(200) a system control system for controlling and/or coordinating the operation of the subsystems above; and
(201) a computing platform for supporting the implementation of one or more of the subsystems above, and the features and functions of the digital image capture and processing system.
(202) The computing platform includes (i) memory for storing pieces of original product code written by the original designers of the digital image capture and processing system, and (ii) a microprocessor for running one or more Applications by calling and executing pieces of said original product code in a particular sequence, so as to support a set of standard features and functions which characterize a standard behavior of the digital image capture and processing system.
(203) As will be described in greater detail with reference to
(204) In accordance with the novel principles of the present invention, one or more pieces of third-party code (plug-ins) are inserted or plugged into the set of place holders, and operate to extend the standard features and functions of the digital image capture and processing system, and modify the standard behavior thereof into a custom behavior for the digital image capture and processing system.
(205) In most embodiments of the present invention, the digital image capture and processing system will further comprise a housing having a light transmission window, wherein the FOV is projected through the light transmission window and upon an object to be imaged in the FOV. Also, typically, these pieces of original product code as well as third-party product code are maintained in one or more libraries supported in the memory structure of the computing platform. In general, such memory comprises a memory architecture having different kinds of memory, each having a different access speed and performance characteristics.
(206) In accordance with the principles of the present invention, the end-user, such as a value-added reseller (VAR) or original equipment manufacturer (OEM), can write such pieces of third-party code (i.e. plug-ins) according to specifications set by the original system designers, and these pieces of custom code can be plugged into the place holders, so as to modify and extend the features and functions of the digital image capture and processing system (or third-party product into which the system is integrated or embodied), and modify the standard behavior of the digital image capture and processing system into a custom behavior for the digital image capture and processing system, without permanently modifying the standard features and functions of the digital image capture and processing system.
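The place-holder mechanism described above can be sketched as a table of named hook points: the original product code installs standard behavior, and a third-party plug-in overrides a hook without touching the original code. All names in this sketch (the `HookTable` class, the `format_output` hook, the framing characters) are hypothetical illustrations, not part of the patented system.

```python
# Sketch of the plug-in place-holder mechanism (all names hypothetical):
# the original product code exposes named hook points with default
# behavior; a third-party plug-in replaces a hook without modifying
# the original product code.

class HookTable:
    def __init__(self):
        self._hooks = {}  # place holders: name -> callable

    def register_default(self, name, fn):
        # Original designers install the standard behavior.
        self._hooks.setdefault(name, fn)

    def plug_in(self, name, fn):
        # Third-party plug-in overrides a standard behavior in place.
        if name not in self._hooks:
            raise KeyError(f"no place holder named {name!r}")
        self._hooks[name] = fn

    def call(self, name, *args):
        return self._hooks[name](*args)

# Original product code installs standard output formatting.
hooks = HookTable()
hooks.register_default("format_output", lambda data: data)

# A VAR/OEM plug-in customizes the output framing (illustrative only).
hooks.plug_in("format_output", lambda data: f"<STX>{data}<ETX>")

print(hooks.call("format_output", "0123456789"))  # prints <STX>0123456789<ETX>
```

Because the override is confined to the hook table, removing the plug-in restores the standard behavior, matching the "without permanently modifying" property described above.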
(207) In some illustrative embodiments of the present invention, the digital camera system comprises: a digital image formation and detection subsystem having (i) image formation optics for projecting the FOV through a light transmission window and upon the object to be imaged in the FOV, and (ii) an image sensing array for detecting imaged light reflected off the object during illumination operations in an image capture mode in which sensor elements in the image sensing array are enabled so as to detect one or more digital images of the object formed on the image sensing array; an illumination subsystem having an illumination array for producing and projecting a field of illumination through the light transmission window and within the FOV during the image capture mode; and an image capturing and buffering subsystem for capturing and buffering these digital images detected by the image formation and detection subsystem.
(208) The image sensing array can be realized by a digital image sensing structure selected from the group consisting of an area-type image sensing array, and a linear-type image sensing array.
(209) Preferably, the memory employed in the computing platform of the system maintains system parameters used to configure the functions of the digital image capture and processing system. In the illustrative embodiments, the memory comprises a memory architecture that supports a three-tier modular software architecture characterized by an Operating System (OS) layer, a System CORE (SCORE) layer, and an Application layer, and is responsive to the generation of a triggering event within said digital-imaging based code symbol reading system, as shown in
(210) The field of illumination projected from the illumination subsystem can be narrow-band illumination produced from an array of light emitting diodes (LEDs). Also, the digital image processing subsystem is typically adapted to process captured digital images so as to read one or more code symbols graphically represented in the digital images, and produces output data in the form of symbol character data representative of the read one or more code symbols. Each code symbol can be a bar code symbol selected from the group consisting of a 1D bar code symbol, a 2D bar code symbol, and a data matrix type code symbol structure.
(211) These and other aspects of the present invention will become apparent hereinafter and in the claims. It is, therefore, appropriate at this juncture to now describe in detail, the various illustrative embodiments of the digital image capture and processing system of the present invention depicted in
(212) Hand-Supportable Digital Imaging-Based Bar Code Reading Device of the First Illustrative Embodiment of the Present Invention
(213) Referring to
(214) As best shown in
(215) In other embodiments of the present invention the form factor of the hand-supportable housing might be different. In yet other applications, the housing need not even be hand-supportable, but rather might be designed for stationary support on a desktop or countertop surface, or for use in a commercial or industrial application.
(216) Schematic Block Functional Diagram as System Design Model for the Hand-Supportable Digital Image-Based Bar Code Reading Device of the Present Invention
(217) As shown in the system design model of FIG. 2L1, the hand-supportable Digital Imaging-Based Bar Code Symbol Reading Device 1 of the illustrative embodiment comprises: an IR-based Object Presence and Range Detection Subsystem 12; a Multi-Mode Area-type Image Formation and Detection (i.e. camera) Subsystem 13 having a narrow-area mode of image capture, a near-field wide-area mode of image capture, and a far-field wide-area mode of image capture; a Multi-Mode LED-Based Illumination Subsystem 14 having a narrow-area mode of illumination, a near-field wide-area mode of illumination, and a far-field wide-area mode of illumination; an Automatic Light Exposure Measurement and Illumination Control Subsystem 15; an Image Capturing and Buffering Subsystem 16; a Multi-Mode Image-Processing Bar Code Symbol Reading Subsystem 17 having five modes of image-processing based bar code symbol reading indicated in FIG. 2L2 and to be described in detail hereinafter; an Input/Output Subsystem 18; a manually-actuatable trigger switch 2C for sending user-originated control activation signals to the device; a System Mode Configuration Parameter Table 70; and a System Control Subsystem 19 integrated with each of the above-described subsystems, as shown.
(218) The primary function of the IR-based Object Presence and Range Detection Subsystem 12 is to automatically produce an IR-based object detection field 20 within the FOV of the Multi-Mode Image Formation and Detection Subsystem 13, detect the presence of an object within predetermined regions of the object detection field (20A, 20B), and generate control activation signals A1 which are supplied to the System Control Subsystem 19 for indicating when and where an object is detected within the object detection field of the system.
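The region classification performed by Subsystem 12 can be sketched as a simple mapping from an IR-based range estimate to an object-presence control signal. The 100 mm and 200 mm boundaries below are assumptions drawn from the illumination-field working ranges quoted later in this specification; the function name and signal shape are hypothetical.

```python
# Hypothetical sketch: classify an IR-based range estimate (in mm) into
# the object detection field regions (near field 20A, far field 20B) and
# produce a control activation signal for the System Control Subsystem.

NEAR_FIELD_MAX_MM = 100   # assumed near/far field boundary
FAR_FIELD_MAX_MM = 200    # assumed end of the working range

def object_detection_signal(range_mm):
    """Return (object_present, region) for a measured object range."""
    if range_mm is None or range_mm > FAR_FIELD_MAX_MM:
        return (False, None)          # nothing detectable in the field
    region = "near" if range_mm <= NEAR_FIELD_MAX_MM else "far"
    return (True, region)

print(object_detection_signal(60))    # (True, 'near')
print(object_detection_signal(150))   # (True, 'far')
```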
(219) In the first illustrative embodiment, the Multi-Mode Image Formation And Detection (I.E. Camera) Subsystem 13 has image formation (camera) optics 21 for producing a field of view (FOV) 23 upon an object to be imaged and a CMOS area-image sensing array 22 for detecting imaged light reflected off the object during illumination and image acquisition/capture operations.
(220) In the first illustrative embodiment, the primary function of the Multi-Mode LED-Based Illumination Subsystem 14 is to produce a narrow-area illumination field 24, a near-field wide-area illumination field 25, and a far-field wide-area illumination field 26, each having a narrow optical-bandwidth and confined within the FOV of the Multi-Mode Image Formation And Detection Subsystem 13 during narrow-area and wide-area modes of imaging, respectively. This arrangement is designed to ensure that only light transmitted from the Multi-Mode Illumination Subsystem 14 and reflected from the illuminated object is ultimately transmitted through a narrow-band transmission-type optical filter subsystem 4 realized by (1) high-pass (i.e. red-wavelength reflecting) filter element 4A mounted at the light transmission aperture 3 immediately in front of panel 5, and (2) low-pass filter element 4B mounted either before the image sensing array 22 or anywhere after panel 5 as shown in
(221) The primary function of the narrow-band integrated optical filter subsystem 4 is to ensure that the CMOS image sensing array 22 only receives the narrow-band visible illumination transmitted by the three sets of LED-based illumination arrays 27, 28 and 29 driven by LED driver circuitry 30 associated with the Multi-Mode Illumination Subsystem 14, whereas all other components of ambient light collected by the light collection optics are substantially rejected at the image sensing array 22, thereby providing improved SNR thereat, thus improving the performance of the system.
(222) The primary function of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is twofold: (1) to measure, in real-time, the power density [joules/cm²] of photonic energy (i.e. light) collected by the optics of the system at about its image sensing array 22, and generate Auto-Exposure Control Signals indicating the amount of exposure required for good image formation and detection; and (2) in combination with the Illumination Array Selection Control Signal provided by the System Control Subsystem 19, automatically drive and control the output power of selected LED arrays 27, 28 and/or 29 in the Multi-Mode Illumination Subsystem, so that objects within the FOV of the system are optimally exposed to LED-based illumination and optimal images are formed and detected at the image sensing array 22.
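The exposure-control logic described above can be illustrated with a minimal sketch: derive an exposure duration from a real-time light-level measurement so the sensor accumulates a fixed target energy density, then clamp the result to safe limits. This is not the patented circuit; the target energy and clamp values are invented for illustration.

```python
# Illustrative sketch (not the patented circuit): compute an exposure
# duration from a measured power density so that the sensor accumulates
# a fixed target energy density, clamped to assumed safe limits.

TARGET_ENERGY = 1.0e-6    # assumed target energy density at sensor [J/cm^2]
MIN_EXPOSURE_S = 0.0005   # assumed shortest usable exposure
MAX_EXPOSURE_S = 0.0100   # assumed longest exposure before motion blur

def auto_exposure_duration(power_density_w_per_cm2):
    """Exposure time needed to reach TARGET_ENERGY at the measured power."""
    t = TARGET_ENERGY / power_density_w_per_cm2   # energy = power x time
    return max(MIN_EXPOSURE_S, min(MAX_EXPOSURE_S, t))

# Bright scene: the required time is clamped to the minimum exposure.
print(auto_exposure_duration(1e-2))   # 0.0005
```

The clamp mirrors the subsystem's role of terminating illumination once the computed duration expires, so captured frames are neither underexposed nor saturated.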
(223) The primary function of the Image Capturing and Buffering Subsystem 16 is to (1) detect the entire 2-D image focused onto the 2D image sensing array 22 by the image formation optics 21 of the system, (2) generate a frame of digital pixel data 31 for either a selected region of interest of the captured image frame, or for the entire detected image, and then (3) buffer each frame of image data as it is captured. Notably, in the illustrative embodiment, a single 2D image frame (31) is captured during each image capture and processing cycle, or during a particular stage of a processing cycle, so as to eliminate the problems associated with image frame overwriting, and synchronization of image capture and decoding processes, as addressed in U.S. Pat. Nos. 5,932,862 and 5,942,741 assigned to Welch Allyn, and incorporated herein by reference.
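The one-frame-per-cycle discipline described above can be sketched as a capture step that completes before the decoder runs, so no new capture can overwrite a frame still being processed. The class and function names below are hypothetical stand-ins.

```python
# Sketch of the single-frame-per-cycle discipline (names hypothetical):
# a frame is captured once per cycle and handed to the decoder before
# any new capture may overwrite the buffer, avoiding the frame
# overwriting / synchronization problems noted above.

class SingleFrameCycle:
    def __init__(self, capture_fn, decode_fn):
        self._capture = capture_fn
        self._decode = decode_fn

    def run_cycle(self, roi=None):
        frame = self._capture(roi)   # (1)-(2): detect and digitize frame/ROI
        return self._decode(frame)   # (3): decode only after capture completes

# Stand-ins for the 2-D sensor and the decoder (illustrative only).
fake_sensor = lambda roi: [[0, 255], [255, 0]]
fake_decoder = lambda frame: f"decoded {len(frame)}x{len(frame[0])} frame"

cycle = SingleFrameCycle(fake_sensor, fake_decoder)
print(cycle.run_cycle())   # prints decoded 2x2 frame
```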
(224) The primary function of the Multi-Mode Imaging-Based Bar Code Symbol Reading Subsystem 17 is to process images that have been captured and buffered by the Image Capturing and Buffering Subsystem 16, during both narrow-area and wide-area illumination modes of system operation. Such image processing operations include image-based bar code decoding, and are described in detail in Applicant's WIPO International Publication No. 2005/050390, incorporated herein by reference.
(225) The primary function of the Input/Output Subsystem 18 is to support standard and/or proprietary communication interfaces with external host systems and devices, and output processed image data and the like to such external host systems or devices by way of such interfaces. Examples of such interfaces, and technology for implementing the same, are given in U.S. Pat. No. 6,619,549, incorporated herein by reference in its entirety.
(226) The primary function of the System Control Subsystem 19 is to provide some predetermined degree of control or management signaling services to each of the integrated subsystem components, as shown. While this subsystem can be implemented by a programmed microprocessor, in the illustrative embodiment, it is implemented by the three-tier software architecture supported on the computing platform shown in
(227) The primary function of the manually-actuatable Trigger Switch 2C integrated with the hand-supportable housing is to enable the user to generate a control activation signal upon manually depressing the Trigger Switch 2C, and to provide this control activation signal to the System Control Subsystem 19 for use in carrying out its complex system and subsystem control operations, described in detail herein.
(228) The primary function of the System Mode Configuration Parameter Table 70 is to store (in non-volatile/persistent memory) a set of configuration parameters for each of the available Programmable Modes of System Operation specified in the Programmable Mode of Operation Table, and which can be read and used by the System Control Subsystem 19 as required during its complex operations.
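The System Mode Configuration Parameter Table can be pictured as a per-mode record of configuration parameters that the control logic looks up at run time. The mode names and parameter values below are purely illustrative, not the patented table contents.

```python
# Hypothetical sketch of the System Mode Configuration Parameter Table:
# one record of configuration parameters per Programmable Mode of System
# Operation, read by the system control logic as required.

SYSTEM_MODE_TABLE = {
    # mode name: configuration parameters (values illustrative only)
    "mode_1": {"illumination": "narrow-area", "decode_attempts": 1},
    "mode_2": {"illumination": "wide-area-near", "decode_attempts": 3},
}

def configure(mode):
    """Fetch the stored parameters for a programmable mode of operation."""
    try:
        return SYSTEM_MODE_TABLE[mode]
    except KeyError:
        raise ValueError(f"unknown programmable mode: {mode!r}")

print(configure("mode_2")["illumination"])   # prints wide-area-near
```

In the real device such a table would live in non-volatile memory, as the text notes, so the configured behavior survives power cycling.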
(229) The detailed structure and function of each subsystem will now be described in detail below.
(230) Schematic Diagram as System Implementation Model for the Hand-Supportable Digital Imaging-Based Bar Code Reading Device of the Present Invention
(231)
(232) In the illustrative embodiment, the image formation optics 21 supported by the bar code reader provides a field of view of 103 mm at the nominal focal distance to the target, of approximately 70 mm from the edge of the bar code reader. The minimal size of the field of view (FOV) is 62 mm at the nominal focal distance to the target of approximately 10 mm. Preliminary tests of the parameters of the optics are shown on
(233) The Multi-Mode Illumination Subsystem 14 is designed to cover the optical field of view (FOV) 23 of the bar code symbol reader with sufficient illumination to generate high-contrast images of bar codes located at both short and long distances from the imaging window. The illumination subsystem also provides a narrow-area (thin height) targeting beam 24 having dual purposes: (a) to indicate to the user where the optical view of the reader is; and (b) to allow a quick scan of just a few lines of the image and attempt a super-fast bar code decoding if the bar code is aligned properly. If the bar code is not aligned for a linearly illuminated image to decode, then the entire field of view is illuminated with a wide-area illumination field 25 or 26 and the image of the entire field of view is acquired by Image Capture and Buffering Subsystem 16 and processed by Multi-Mode Bar Code Symbol Reading Subsystem 17, to ensure reading of a bar code symbol presented therein regardless of its orientation.
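The two-stage reading strategy described above can be sketched as: attempt a super-fast decode on the few narrow-area image rows first, and on failure fall back to capturing and decoding the full wide-area image. The capture and decode functions below are hypothetical stand-ins, not the subsystem's actual interfaces.

```python
# Sketch of the two-stage reading strategy (stand-in functions only):
# try a super-fast decode of the narrow-area (linear) rows first; if the
# bar code is not aligned with the targeting beam, fall back to the
# entire wide-area field of view.

def read_symbol(capture_narrow, capture_wide, decode):
    """Try the narrow-area image first, then the full-FOV image."""
    symbol = decode(capture_narrow())
    if symbol is not None:
        return symbol                 # bar code was aligned with the beam
    return decode(capture_wide())     # orientation-independent fallback

# Stand-in example: the linear scan fails, the wide-area image decodes.
narrow_scan = lambda: "misaligned rows"
wide_scan = lambda: "full FOV image"
decoder = lambda img: "0123456789" if img == "full FOV image" else None

print(read_symbol(narrow_scan, wide_scan, decoder))   # prints 0123456789
```

The design choice mirrors the text: the cheap linear attempt wins when the symbol is aligned, while the wide-area fallback guarantees a read regardless of orientation.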
(234) The interface board 43 employed within the bar code symbol reader provides the hardware communication interfaces for the bar code symbol reader to communicate with the outside world. The interfaces implemented in system will typically include RS232, keyboard wedge, and/or USB, or some combination of the above, as well as others required or demanded by the particular application at hand.
(235) Specification of the Area-Type Image Formation and Detection (i.e. Camera) Subsystem During its Narrow-Area (Linear) and Wide-Area Modes of Imaging, Supported by the Narrow and Wide Area Fields of Narrow-Band Illumination, Respectively
(236) As shown in
(237) The Multi-Mode Illumination Subsystem 14 includes three different LED-based illumination arrays 27, 28 and 29 mounted on the light transmission window panel 5, and arranged about the light transmission window 4A. Each illumination array is designed to illuminate a different portion of the FOV of the bar code reader during different modes of operation. During the narrow-area (linear) illumination mode of the Multi-Mode Illumination Subsystem 14, the central narrow-wide portion of the FOV indicated by 23 is illuminated by the narrow-area illumination array 27, shown in
(238) In
(239) As shown in
(240) As shown in
(241) In FIGS. 3F1 and 3F2, the lens holding assembly 45 and image sensing array 22 are mounted along an optical path defined along the central axis of the system. In the illustrative embodiment, the image sensing array 22 has, for example, a 1280 × 1024 pixel resolution, 6 micron pixel size, with randomly accessible region of interest (ROI) window capabilities. It is understood, though, that many other kinds of image sensing devices (e.g. CCD) can be used to practice the principles of the present invention disclosed herein, without departing from the scope or spirit of the present invention.
(242) Details regarding a preferred Method of Designing the Image Formation (i.e. Camera) Optics Within the Image-Based Bar Code Reader Of The Present Invention Using The Modulation Transfer Function (MTF) and also a Method Of Theoretically Characterizing The DOF Of The Image Formation Optics Employed In The Imaging-Based Bar Code Reader Of The Present Invention can be found in WIPO International Publication No. WO 2005/050390, supra.
(243) Specification of Multi-Mode LED-Based Illumination Subsystem Employed in the Hand-Supportable Image-Based Bar Code Reading System of the Present Invention
(244) In the illustrative embodiment, the LED-Based Multi-Mode Illumination Subsystem 14 comprises: narrow-area illumination array 27; near-field wide-area illumination array 28; and far-field wide-area illumination array 29. The three fields of narrow-band illumination produced by the three illumination arrays of subsystem 14 are schematically depicted in FIG. 4A1. As will be described hereinafter, narrow-area illumination array 27 can be realized as two independently operable arrays, namely: a near-field narrow-area illumination array and a far-field narrow-area illumination array, which are activated when the target object is detected within the near and far fields, respectively, of the automatic IR-based Object Presence and Range Detection Subsystem 12 during wide-area imaging modes of operation. However, for purposes of illustration, the first illustrative embodiment of the present invention employs only a single-field narrow-area (linear) illumination array which is designed to illuminate over substantially the entire working range of the system, as shown in FIG. 4A1.
(245) As shown in
(246) The near-field wide-area illumination array 28 includes two sets of (flattop) LED light sources 28A1-28A6 and 28A7-28A13 without any lenses mounted on the top and bottom portions of the light transmission window panel 5, as shown in
(247) As shown in
(248) Narrow-Area (Linear) Illumination Arrays Employed in the Multi-Mode Illumination Subsystem
(249) As shown in FIG. 4A1, the narrow-area (linear) illumination field 24 extends from about 30 mm to about 200 mm within the working range of the system, and covers both the near and far fields of the system. The near-field wide-area illumination field 25 extends from about 0 mm to about 100 mm within the working range of the system. The far-field wide-area illumination field 26 extends from about 100 mm to about 200 mm within the working range of the system. The Table shown in FIG. 4A2 specifies the geometrical properties and characteristics of each illumination mode supported by the Multi-Mode LED-based Illumination Subsystem 14 of the present invention.
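The working ranges quoted above can be expressed as a simple selection helper that picks the wide-area illumination array for a detected object range. The boundary values come from the text; the tie-breaking behavior exactly at 100 mm and the function name are assumptions.

```python
# The illumination working ranges quoted above, expressed as a selection
# helper (boundaries from the text; tie-breaking at 100 mm is assumed).

def wide_area_array_for(range_mm):
    """Pick the wide-area illumination array for a detected object range."""
    if 0 <= range_mm <= 100:
        return "near-field wide-area array 28"   # field 25: ~0-100 mm
    if 100 < range_mm <= 200:
        return "far-field wide-area array 29"    # field 26: ~100-200 mm
    return None                                  # outside the working range

print(wide_area_array_for(60))    # near-field wide-area array 28
print(wide_area_array_for(150))   # far-field wide-area array 29
```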
(250) The narrow-area illumination array 27 employed in the Multi-Mode LED-Based Illumination Subsystem 14 is optically designed to illuminate a thin area at the center of the field of view (FOV) of the imaging-based bar code symbol reader, measured from the boundary of the left side of the field of view to the boundary of its right side, as specified in FIG. 4A1. As will be described in greater detail hereinafter, the narrow-area illumination field 24 is automatically generated by the Multi-Mode LED-Based Illumination Subsystem 14 in response to the detection of an object within the object detection field of the automatic IR-based Object Presence and Range Detection Subsystem 12. In general, the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV of the Image Formation and Detection Subsystem 13 are spatially co-extensive and the object detection field spatially overlaps the FOV along the entire working distance of the imaging-based bar code symbol reader. The narrow-area illumination field 24, produced in response to the detection of an object, serves a dual purpose: it provides a visual indication to an operator about the location of the optical field of view of the bar code symbol reader, and thus serves as a field-of-view aiming instrument; and during its image acquisition mode, the narrow-area illumination beam is used to illuminate a thin area of the FOV within which an object resides, and a narrow 2-D image of the object can be rapidly captured (by a small number of rows of pixels in the image sensing array 22), buffered and processed in order to read any linear bar code symbols that may be represented therewithin.
(251) FIG. 4C1 shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the narrow-area illumination array 27 in the Multi-Mode Illumination Subsystem 14. FIG. 4C2 shows the Lambertian emittance versus polar angle characteristics of the same LEDs. FIG. 4C3 shows the cylindrical lenses used before the LEDs (633 nm InGaAlP) in the narrow-area (linear) illumination arrays in the illumination subsystem of the present invention. As shown, the first surface of the cylindrical lens is curved vertically to create a narrow-area (linear) illumination pattern, and the second surface of the cylindrical lens is curved horizontally to control the height of the linear illumination pattern to produce a narrow-area illumination pattern. FIG. 4C4 shows the layout of the pairs of LEDs and two cylindrical lenses used to implement the narrow-area illumination array of the illumination subsystem of the present invention. In the illustrative embodiment, each LED produces a total output power of about 11.7 mW under typical conditions. FIG. 4C5 sets forth a set of six illumination profiles for the narrow-area illumination fields produced by the narrow-area illumination arrays of the illustrative embodiment, taken at 30, 40, 50, 80, 120, and 220 millimeters along the field away from the imaging window (i.e. working distance) of the bar code reader of the present invention, illustrating that the spatial intensity of the narrow-area illumination field begins to become substantially uniform at about 80 millimeters. As shown, the narrow-area illumination beam is usable beginning at 40 mm from the light transmission/imaging window.
(252) Near-Field Wide-Area Illumination Arrays Employed in the Multi-Mode Illumination Subsystem
(253) The near-field wide-area illumination array 28 employed in the LED-Based Multi-Mode Illumination Subsystem 14 is optically designed to illuminate a wide area over a near-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in FIG. 4A1. As will be described in greater detail hereinafter, the near-field wide-area illumination field 28 is automatically generated by the LED-based Multi-Mode Illumination Subsystem 14 in response to: (1) the detection of any object within the near-field of the system by the IR-based Object Presence and Range Detection Subsystem 12; and (2) one or more of the following events, including, for example: (i) failure of the image processor to successfully decode-process a linear bar code symbol during the narrow-area illumination mode; (ii) detection of code elements such as control words associated with a 2-D bar code symbol; and/or (iii) detection of pixel data in the image which indicates that the object was captured in a state of focus.
(254) In general, the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV of the Image Formation And Detection Subsystem 13 are spatially co-extensive and the object detection field spatially overlaps the FOV along the entire working distance of the imaging-based bar code symbol reader. The near-field wide-area illumination field 25, produced in response to one or more of the events described above, illuminates a wide area over a near-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in
(255) FIG. 4D1 shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the wide-area illumination arrays in the illumination subsystem of the present invention. FIG. 4D2 shows the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the near-field wide-area illumination arrays in the Multi-Mode Illumination Subsystem 14. FIG. 4D4 shows the geometrical layout of the LEDs used to implement the near-field wide-area illumination array of the Multi-Mode Illumination Subsystem 14, wherein the illumination beam produced therefrom is aimed by angling the lenses before the LEDs in the near-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem 14. FIG. 4D5 sets forth a set of six illumination profiles for the near-field wide-area illumination fields produced by the near-field wide-area illumination arrays of the illustrative embodiment, taken at 10, 20, 30, 40, 60, and 100 millimeters along the field away from the imaging window (i.e. working distance) of the imaging-based bar code symbol reader 1. These plots illustrate that the spatial intensity of the near-field wide-area illumination field begins to become substantially uniform at about 40 millimeters (i.e. center:edge=2:1 max).
(256) Far-Field Wide-Area Illumination Arrays Employed in the Multi-Mode Illumination Subsystem
(257) The far-field wide-area illumination array 29 employed in the Multi-Mode LED-based Illumination Subsystem 14 is optically designed to illuminate a wide area over a far-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in FIG. 4A1. As will be described in greater detail hereinafter, the far-field wide-area illumination field 26 is automatically generated by the LED-Based Multi-Mode Illumination Subsystem 14 in response to: (1) the detection of any object within the far-field of the system by the IR-based Object Presence and Range Detection Subsystem 12; and (2) one or more of the following events, including, for example: (i) failure of the image processor to successfully decode-process a linear bar code symbol during the narrow-area illumination mode; (ii) detection of code elements such as control words associated with a 2-D bar code symbol; and/or (iii) detection of pixel data in the image which indicates that the object was captured in a state of focus. In general, the object detection field of the IR-based Object Presence and Range Detection Subsystem 12 and the FOV 23 of the image detection and formation subsystem 13 are spatially co-extensive and the object detection field 20 spatially overlaps the FOV 23 along the entire working distance of the imaging-based bar code symbol reader. The far-field wide-area illumination field 26, produced in response to one or more of the events described above, illuminates a wide area over a far-field portion of the field of view (FOV) of the imaging-based bar code symbol reader, as defined in
(258) During both near and far field wide-area illumination modes of operation, the Automatic Light Exposure Measurement and Illumination Control Subsystem (i.e. module) 15 measures and controls the time duration for which the Multi-Mode Illumination Subsystem 14 exposes the image sensing array 22 to narrow-band illumination (e.g. 633 nanometers, with approximately 15 nm bandwidth) during the image capturing/acquisition process, and automatically terminates the generation of such illumination when such computed time duration expires. In accordance with the principles of the present invention, this global exposure control process ensures that each and every acquired image has good contrast and is not saturated, two conditions essential for consistent and reliable bar code reading.
(259) FIG. 4D1 shows the Lambertian emittance versus wavelength characteristics of the LEDs used to implement the far-field wide-area illumination arrays 29 in the Multi-Mode Illumination Subsystem 14. FIG. 4D2 shows the Lambertian emittance versus polar angle characteristics of the LEDs used to implement the same. FIG. 4D3 shows the plano-convex lenses used before the LEDs in the far-field wide-area illumination arrays in the Multi-Mode Illumination Subsystem 14. FIG. 4D4 shows a layout of LEDs and plano-convex lenses used to implement the far-field wide-area illumination array 29 of the illumination subsystem, wherein the illumination beam produced therefrom is aimed by angling the lenses before the LEDs in the far-field wide-area illumination arrays of the Multi-Mode Illumination Subsystem 14. FIG. 4D6 sets forth a set of three illumination profiles for the far-field wide-area illumination fields produced by the far-field wide-area illumination arrays of the illustrative embodiment, taken at 100, 150 and 220 millimeters along the field away from the imaging window (i.e. working distance) of the imaging-based bar code symbol reader 1, illustrating that the spatial intensity of the far-field wide-area illumination field begins to become substantially uniform at about 100 millimeters. FIG. 4D7 shows a table illustrating a preferred method of calculating the pixel intensity value for the center of the far-field wide-area illumination field produced from the Multi-Mode Illumination Subsystem 14, showing a significant signal strength (greater than 80 DN at the far center field).
(260) Specification of the Narrow-Band Optical Filter Subsystem Integrated within the Hand-Supportable Housing of the Imager of the Present Invention
(261) As shown in FIG. 5A1, the hand-supportable housing of the bar code reader of the present invention has integrated within its housing a narrow-band optical filter subsystem 4 for transmitting substantially only the very narrow band of wavelengths (e.g. 620-700 nanometers) of visible illumination produced from the narrow-band Multi-Mode Illumination Subsystem 14, and rejecting all other optical wavelengths outside this narrow optical band however generated (i.e. ambient light sources). As shown, narrow-band optical filter subsystem 4 comprises: red-wavelength reflecting (high-pass) imaging window filter 4A integrated within its light transmission aperture 3 formed on the front face of the hand-supportable housing; and low-pass optical filter 4B disposed before the CMOS image sensing array 22. These optical filters 4A and 4B cooperate to form the narrow-band optical filter subsystem 4 for the purpose described above. As shown in FIG. 5A2, the light transmission characteristics (energy versus wavelength) associated with the low-pass optical filter element 4B indicate that optical wavelengths below 700 nanometers are transmitted therethrough, whereas optical wavelengths above 700 nm are substantially blocked (e.g. absorbed or reflected). As shown in FIG. 5A3, the light transmission characteristics (energy versus wavelength) associated with the high-pass imaging window filter 4A indicate that optical wavelengths above 620 nanometers are transmitted therethrough, thereby producing a red-color appearance to the user, whereas optical wavelengths below 620 nm are substantially blocked (e.g. absorbed or reflected) by optical filter 4A.
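The cooperation of the two filter elements can be modeled in an idealized way: the band-pass response is the product of a high-pass element and a low-pass element, passing only the ~620-700 nm band stated above. Sharp cutoffs are an assumption; real filter edges roll off gradually.

```python
# Idealized model of the narrow-band filter subsystem: combined
# transmission is the product of a high-pass element and a low-pass
# element, passing only the ~620-700 nm band (sharp cutoffs assumed).

HIGH_PASS_CUTOFF_NM = 620   # imaging-window filter 4A (red-transmitting)
LOW_PASS_CUTOFF_NM = 700    # filter 4B before the image sensing array

def transmits(wavelength_nm):
    """True if the combined filter subsystem passes this wavelength."""
    passes_high = wavelength_nm >= HIGH_PASS_CUTOFF_NM
    passes_low = wavelength_nm <= LOW_PASS_CUTOFF_NM
    return passes_high and passes_low

print(transmits(633))   # True: the 633 nm LED illumination passes
print(transmits(550))   # False: ambient green light is rejected
```

This is why the 633 nm LED illumination reaches the sensor while most ambient light does not, which is the SNR benefit described earlier in the specification.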
(262) During system operation, spectral band-pass filter subsystem 4 greatly reduces the influence of the ambient light which falls upon the CMOS image sensing array 22 during the image capturing operations. By virtue of the optical filter of the present invention, an optical shutter mechanism is eliminated in the system. In practice, the optical filter can reject more than 85% of incident ambient light, and in typical environments, the intensity of LED illumination is significantly greater than that of the ambient light on the CMOS image sensing array 22. Thus, while an optical shutter is required in nearly all conventional CMOS imaging systems, the imaging-based bar code reading system of the present invention effectively manages the exposure time of narrow-band illumination onto its CMOS image sensing array 22 by simply controlling the illumination time of its LED-based illumination arrays 27, 28 and 29, using control signals generated by the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 and the CMOS image sensing array 22, while controlling illumination thereto by way of the band-pass optical filter subsystem 4 described above. The result is a simple system design, without moving parts, and having a reduced manufacturing cost.
(263) While the band-pass optical filter subsystem 4 is shown comprising a high-pass filter element 4A and low-pass filter element 4B, separated spatially from each other by other optical components along the optical path of the system, subsystem 4 may be realized as an integrated multi-layer filter structure installed in front of the image formation and detection (IFD) module 13, or before its image sensing array 22, without the use of the high-pass window filter 4A, or with the use thereof so as to obscure viewing within the imaging-based bar code symbol reader while creating an attractive red-colored protective window. Preferably, the red-color window filter 4A will have substantially planar surface characteristics to avoid focusing or defocusing of light transmitted therethrough during imaging operations.
(264) Specification of the Automatic Light Exposure Measurement and Illumination Control Subsystem of the Present Invention
(265) The primary function of the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is to control the brightness and contrast of acquired images by (i) measuring light exposure at the image plane of the CMOS imaging sensing array 22 and (ii) controlling the time duration that the Multi-Mode Illumination Subsystem 14 illuminates the target object with narrow-band illumination generated from the activated LED illumination array. Thus, the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 eliminates the need for a complex shuttering mechanism for CMOS-based image sensing array 22. This novel mechanism ensures that the imaging-based bar code symbol reader of the present invention generates non-saturated images with enough brightness and contrast to guarantee fast and reliable image-based bar code decoding in demanding end-user applications.
(266) During object illumination, narrow-band LED-based light is reflected from the target object (at which the hand-supportable bar code reader is aimed) and is accumulated by the CMOS image sensing array 22. Notably, the object illumination process must be carried out for an optimal duration so that the acquired image frame has good contrast and is not saturated. Such conditions are required for the consistent and reliable bar code decoding operation and performance. The Automatic Light Exposure Measurement and Illumination Control Subsystem 15 measures the amount of light reflected from the target object, calculates the maximum time that the CMOS image sensing array 22 should be kept exposed to the actively-driven LED-based illumination array associated with the Multi-Mode Illumination Subsystem 14, and then automatically deactivates the illumination array when the calculated time to do so expires (i.e. lapses).
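The exposure rule described above can be sketched as a simple inverse relationship: the brighter the reflected light measured by the subsystem, the shorter the calculated LED illumination time. The function name, target constant and units below are hypothetical, chosen only to illustrate the idea:

```python
def illumination_duration_us(measured_light, target_exposure=1000.0,
                             max_duration_us=5000.0):
    """Sketch of the auto-exposure calculation: keep (intensity x time)
    near a target so the acquired frame is neither dark nor saturated.

    measured_light: reflected-light reading from the photodetector
    (arbitrary units); the return value is the LED on-time in microseconds,
    after which the illumination array is automatically deactivated.
    """
    if measured_light <= 0:
        return max_duration_us          # no reading: use the longest safe time
    duration = target_exposure / measured_light
    return min(duration, max_duration_us)
```

A strongly reflecting, nearby object thus receives a short illumination burst, while a dim, distant object is illuminated for up to the maximum permitted duration.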
(267) As shown in
(268) As shown in
(269) Notably, in the illustrative embodiment, there are three possible LED-based illumination arrays 27, 28 and 29 which can be selected for activation by the System Control Subsystem 19, and the upper and/or lower LED subarrays in illumination arrays 28 and 29 can be selectively activated or deactivated on a subarray-by-subarray basis, for various purposes taught herein, including automatic specular reflection noise reduction during wide-area image capture modes of operation.
(270) Each one of these illumination arrays can be driven to different states depending on the auto-exposure control signal generated by electronic signal processing circuit 57, which will be generally a function of object distance, object surface reflectivity and the ambient light conditions sensed at photo-detector 56, and measured by signal processing circuit 57. The operation of signal processing circuitry 57 will now be detailed below.
(271) As shown in
(272) As will be explained in greater detail below, the LED array driver circuit 64 shown in
(273) As shown in
(274) Global Exposure Control Method of the Present Invention Carried Out Using the CMOS Image Sensing Array
(275) In the illustrative embodiment, the CMOS image sensing array 22 is operated in its Single Frame Shutter Mode (i.e. rather than its Continuous Frame Shutter Mode) as shown in
(276) As indicated at Block A in FIG. 6E1, Step A in the global exposure control method involves selecting the single frame shutter mode of operation for the CMOS imaging sensing array provided within an imaging-based bar code symbol reading system employing an automatic light exposure measurement and illumination control subsystem, a multi-mode illumination subsystem, and a system control subsystem integrated therewith, and image formation optics providing the CMOS image sensing array with a field of view into a region of space where objects to be imaged are presented.
(277) As indicated in Block B in FIG. 6E1, Step B in the global exposure control method involves using the automatic light exposure measurement and illumination control subsystem to continuously collect illumination from a portion of the field of view, detect the intensity of the collected illumination, and generate an electrical analog signal corresponding to the detected intensity, for processing.
(278) As indicated in Block C in FIG. 6E1, Step C in the global exposure control method involves activating (e.g. by way of the system control subsystem 19 or directly by way of trigger switch 2C) the CMOS image sensing array so that its rows of pixels begin to integrate photonically generated electrical charge in response to the formation of an image onto the CMOS image sensing array by the image formation optics of the system.
(279) As indicated in Block D in FIG. 6E1, Step D in the global exposure control method involves the CMOS image sensing array 22 automatically (i) generating an electronic rolling shutter (ERS) digital pulse signal when all rows of pixels in the image sensing array are operated in a state of integration, and (ii) providing this ERS pulse signal to the Automatic Light Exposure Measurement And Illumination Control Subsystem 15 so as to activate light exposure measurement and illumination control functions/operations therewithin.
(280) As indicated in Block E in FIG. 6E2, Step E in the global exposure control method involves, upon activation of light exposure measurement and illumination control functions within Subsystem 15, (i) processing the electrical analog signal being continuously generated therewithin, (ii) measuring the light exposure level within a central portion of the field of view 23 (determined by light collecting optics 55 shown in
(281) Finally, as indicated at Block F in FIG. 6E2, Step F in the global exposure control method involves using (i) the auto exposure control signal and (ii) the illumination array selection control signal to drive the selected LED-based illumination array(s) and illuminate the field of view of the CMOS image sensing array 22 in whatever image capture mode it may be configured, precisely when all rows of pixels in the CMOS image sensing array are in a state of integration, as illustrated in
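The ordering of Steps A through F can be sketched as a linear control sequence in which callbacks stand in for the hardware subsystems. This is a schematic of the sequencing only, not of the actual firmware; all names and the near/far selection threshold are hypothetical:

```python
def global_exposure_control(measure_light, select_array, drive_leds):
    """Sketch of the global exposure control method, Steps A-F."""
    events = []
    events.append("single_frame_shutter_selected")   # Step A: configure sensor mode
    level = measure_light()                          # Step B: continuous light measurement
    events.append("sensor_rows_integrating")         # Step C: activate pixel integration
    events.append("ers_pulse_received")              # Step D: all rows now integrating
    array = select_array(level)                      # Step E: derive control signals
    drive_leds(array, level)                         # Step F: illuminate during integration
    events.append("illumination_driven:" + array)
    return events

# Example wiring with dummy callbacks: a weak reading selects the far-field array.
log = global_exposure_control(lambda: 5.0,
                              lambda level: "far" if level < 10 else "near",
                              lambda array, level: None)
```

The essential point captured here is that the LEDs are driven only after the ERS pulse indicates every pixel row is integrating, so all rows receive the same illumination.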
(282) Specification of the IR-Based Automatic Object Presence and Range Detection Subsystem Employed in the Hand-Supportable Digital Image-Based Bar Code Reading Device of the Present Invention
(283) As shown in
(284) As shown in
(285) In general, the function of range analysis circuitry 93 is to analyze the digital range data from the A/D converter 90 and generate two control activation signals, namely: (i) an object presence detection type of control activation signal A.sub.1A indicating simply whether an object is present or absent from the object detection field, regardless of the mode of operation in which the Multi-Mode Illumination Subsystem 14 might be configured; and (ii) a near-field/far-field range indication type of control activation signal A.sub.1B indicating whether a detected object is located in either the predefined near-field or far-field portions of the object detection field, which correspond to the near-field and far-field portions of the FOV of the Multi-Mode Image Formation and Detection Subsystem 13.
(286) Various kinds of analog and digital circuitry can be designed to implement the IR-based Automatic Object Presence and Range Detection Subsystem 12. Alternatively, this subsystem can be realized using various kinds of range detection techniques as taught in U.S. Pat. No. 6,637,659, incorporated herein by reference in its entirety.
(287) In the illustrative embodiment, Automatic Object Presence and Range Detection Subsystem 12 operates as follows. In System Modes of Operation requiring automatic object presence and/or range detection, Automatic Object Presence and Range Detection Subsystem 12 will be activated at system start-up and operational at all times of system operation, typically continuously providing the System Control Subsystem 19 with information about the state of objects within both the far and near portions of the object detection field 20 of the imaging-based symbol reader. In general, this Subsystem detects two basic states of presence and range, and therefore has two basic states of operation. In its first state of operation, the IR-based automatic Object Presence and Range Detection Subsystem 12 automatically detects an object within the near-field region of the FOV 20, and in response thereto generates a first control activation signal which is supplied to the System Control Subsystem 19 to indicate the occurrence of this first fact. In its second state of operation, the IR-based automatic Object Presence and Range Detection Subsystem 12 automatically detects an object within the far-field region of the FOV 20, and in response thereto generates a second control activation signal which is supplied to the System Control Subsystem 19 to indicate the occurrence of this second fact. As will be described in greater detail and throughout this patent specification, these control activation signals are used by the System Control Subsystem 19 during particular stages of the system control process, such as determining (i) whether to activate either the near-field and/or far-field LED illumination arrays, and (ii) how strongly should these LED illumination arrays be driven to ensure quality image exposure at the CMOS image sensing array 22.
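The two basic states of presence and range described above reduce to a small decision rule over the measured range. A minimal sketch follows; the boundary value and all names are hypothetical, standing in for the A.sub.1A (presence) and A.sub.1B (near/far) control activation signals:

```python
NEAR_FIELD_MAX_MM = 200   # hypothetical boundary between near-field and far-field

def range_analysis(range_mm):
    """Sketch of range analysis: derive the presence signal and the
    near-field/far-field indication from a range reading.

    range_mm is None when no object is detected in the object detection field.
    """
    if range_mm is None:
        return {"present": False, "field": None}
    field = "near" if range_mm <= NEAR_FIELD_MAX_MM else "far"
    return {"present": True, "field": field}
```

The System Control Subsystem would consume these two signals to decide which LED illumination array to activate and how strongly to drive it.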
(288) Specification of the Mapping of Pixel Data Captured by the Imaging Array into the SDRAM Under the Control of the Direct Memory Access (DMA) Module within the Microprocessor
(289) As shown in
(290) Referring to
(291) In the implementation of the illustrative embodiment, the CMOS image sensing array 22 sends 7-bit gray-scale data bytes over a parallel data connection to FPGA 39, which implements a FIFO using its internal SRAM. The FIFO 39 stores the pixel data temporarily, and the microprocessor 36 initiates a DMA transfer from the FIFO (which is mapped to address 0X0C000000, chip select 3) to the SDRAM 38. In general, modern microprocessors have internal DMA modules, and in a preferred microprocessor design, the DMA module will contain a 32-byte buffer. Without consuming any CPU cycles, the DMA module can be programmed to read data from the FIFO 39, store the read data bytes in the DMA's buffer, and subsequently write the data to the SDRAM 38. Alternatively, a DMA module can reside in FPGA 39 to directly write the FIFO data into the SDRAM 38. This is done by sending a bus request signal to the microprocessor 36, so that the microprocessor 36 releases control of the bus to the FPGA 39, which then takes over the bus and writes data into the SDRAM 38.
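The burst-oriented transfer through the 32-byte DMA buffer can be sketched as draining a FIFO into memory in fixed-size chunks. This models the data movement only (the SDRAM here is a plain address-to-byte mapping); all names are hypothetical:

```python
def dma_transfer(fifo, sdram, base_addr, buf_size=32):
    """Sketch of the DMA loop: drain the FIFO in 32-byte bursts into
    SDRAM, modeled as a dict of address -> byte, with no CPU involvement
    per byte. Returns the next free SDRAM address."""
    addr = base_addr
    while fifo:
        # Fill the 32-byte DMA buffer from the head of the FIFO.
        burst, fifo[:] = fifo[:buf_size], fifo[buf_size:]
        for b in burst:           # write the buffered bytes out to SDRAM
            sdram[addr] = b
            addr += 1
    return addr
```

A 70-byte frame fragment would thus move in two full bursts plus a 6-byte remainder, leaving the FIFO empty.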
(292) Below, a brief description will be given on where pixel data output from the CMOS image sensing array 22 is stored in the SDRAM 38, and how the microprocessor (i.e. implementing a decode algorithm) 36 accesses such stored pixel data bytes.
(293) During image acquisition operations, the image pixels are sequentially read out of the image sensing array 22. Although one may choose to read out column-wise or row-wise for some CMOS image sensors, without loss of generality, the row-by-row read out of the data is preferred. The pixel image data set is arranged in the SDRAM 38 sequentially, starting at address 0XA0EC0000. To randomly access any pixel in the SDRAM 38 is a straightforward matter: the pixel at row y, column x is located at address (0XA0EC0000+y X 1280+x).
(294) As each image frame always has a frame start signal out of the image sensing array 22, that signal can be used to start the DMA process at address 0XA0EC0000, and the address is continuously incremented for the rest of the frame. But the reading of each image frame is started at address 0XA0EC0000 to avoid any misalignment of data. Notably, however, if the microprocessor 36 has programmed the CMOS image sensing array 22 to have a ROI window, then the starting address will be modified to (0XA0EC0000+1280 X R.sub.1), where R.sub.1 is the row number of the top left corner of the ROI.
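The addressing rules above (sequential row-major layout, 1280 pixels per row, ROI offset of 1280 X R.sub.1) can be collected into one small function; the base address and row pitch are taken from the text, the function name is hypothetical:

```python
IMAGE_BASE = 0xA0EC0000   # SDRAM start address for frame data (from the text)
ROW_PITCH  = 1280         # pixels (one byte each) per row

def pixel_address(x, y, roi_top_row=0):
    """SDRAM address of the pixel at column x, row y.

    roi_top_row (R1 in the text) shifts the frame start when the
    microprocessor has programmed a ROI window into the sensor."""
    start = IMAGE_BASE + ROW_PITCH * roi_top_row
    return start + y * ROW_PITCH + x
```

For example, the pixel at row 2, column 5 of a full frame lives 2565 bytes past the base address.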
(295) Specification of the Three-Tier Software Architecture of the Hand-Supportable Digital Image-Based Bar Code Reading Device of the Present Invention
(296) As shown in
(297) While the operating system layer of the imaging-based bar code symbol reader is based upon the Linux operating system, it is understood that other operating systems can be used (e.g. Microsoft Windows, Mac OS X, Unix, etc.), and that the design preferably provides for independence between the main Application Software Layer and the Operating System Layer, and therefore enables the Application Software Layer to be potentially ported to other platforms. Moreover, the system design principles of the present invention provide extensibility of the system to other future products with extensive usage of the common software components, which should make the design of such products easier, decrease their development time, and ensure their robustness.
(298) In the illustrative embodiment, the above features are achieved through the implementation of an event-driven, multi-tasking, potentially multi-user Application layer running on top of the System Core software layer, called SCORE. The SCORE layer is statically linked with the product Application software, and therefore runs in the Application Level or layer of the system. The SCORE layer provides a set of services to the Application in such a way that the Application would not need to know the details of the underlying operating system, although all operating system APIs are, of course, available to the application as well. The SCORE software layer provides a real-time, event-driven, OS-independent framework for the product Application to operate. The event-driven architecture is achieved by creating a means for detecting events (usually, but not necessarily, when hardware interrupts occur) and posting the events to the Application for processing in a real-time manner. The event detection and posting is provided by the SCORE software layer. The SCORE layer also provides the product Application with a means for starting and canceling software tasks, which can be running concurrently; hence the multi-tasking nature of the software system of the present invention.
(299) Specification of Software Modules within the SCORE Layer of the System Software Architecture Employed in Imaging-Based Bar Code Reader of the Present Invention
(300) The SCORE layer provides a number of services to the Application layer.
(301) The Tasks Manager provides a means for executing and canceling specific application tasks (threads) at any time during the product Application run.
(302) The Events Dispatcher provides a means for signaling and delivering all kinds of internal and external synchronous and asynchronous events.
(303) When events occur, synchronously or asynchronously to the Application, the Events Dispatcher dispatches them to the Application Events Manager, which acts on the events accordingly as required by the Application based on its current state. For example, based on the particular event and the current state of the application, the Application Events Manager can decide to start a new task, stop a currently running task, do something else, or do nothing and completely ignore the event.
(304) The Input/Output Manager provides a means for monitoring activities of input/output devices and signaling appropriate events to the Application when such activities are detected.
(305) The Input/Output Manager software module runs in the background and monitors activities of external devices and user connections, and signals appropriate events to the Application Layer when such activities are detected. The Input/Output Manager is a high-priority thread that runs in parallel with the Application and reacts to the input/output signals coming asynchronously from the hardware devices, such as the serial port, user trigger switch 2C, bar code reader, network connections, etc. Based on these signals and optional input/output requests (or lack thereof) from the Application, it generates appropriate system events, which are delivered through the Events Dispatcher to the Application Events Manager as quickly as possible, as described above.
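The interplay between the Events Dispatcher and the Application Events Manager amounts to a registry of handlers keyed by event name: the dispatcher delivers whatever events are posted, and the application decides what, if anything, to do. A minimal sketch follows, with hypothetical class and event names:

```python
class EventsDispatcher:
    """Sketch of the SCORE Events Dispatcher: modules post events, and the
    Application Events Manager registers handlers keyed by event name."""

    def __init__(self):
        self.handlers = {}

    def register(self, event, handler):
        """Application Events Manager registers how to react to an event."""
        self.handlers[event] = handler

    def post(self, event, payload=None):
        """Deliver an event; events with no registered handler are ignored."""
        handler = self.handlers.get(event)
        return handler(payload) if handler else None

# Example: reacting to a trigger pull as described for the CodeGate Task.
dispatcher = EventsDispatcher()
dispatcher.register("TRIGGER_ON", lambda payload: "start_main_task")
```

Whether an event starts a task, cancels one, or is ignored depends entirely on the handlers the Application has registered for its current state.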
(306) The User Commands Manager provides a means for managing user commands; it utilizes the User Commands Table provided by the Application and executes the appropriate User Command Handler based on the data entered by the user.
(307) The Input/Output Subsystem software module provides a means for creating and deleting input/output connections and communicating with external systems and devices.
(308) The Timer Subsystem provides a means of creating, deleting, and utilizing all kinds of logical timers.
(309) The Memory Control Subsystem provides an interface for managing the multi-level dynamic memory within the device, fully compatible with standard dynamic memory management functions, as well as a means for buffering collected data. The Memory Control Subsystem provides a means for thread-level management of dynamic memory. The interfaces of the Memory Control Subsystem are fully compatible with standard C memory management functions. The system software architecture is designed to provide connectivity of the device to potentially multiple users, which may have different levels of authority to operate with the device.
(310) The User Commands Manager provides a standard way of entering user commands and executing the application modules responsible for handling the same. Each user command described in the User Commands Table is a task that can be launched by the User Commands Manager per user input, but only if the particular user's authority matches the command's level of security.
(311) The Events Dispatcher software module provides a means of signaling and delivering events to the Application Events Manager, including the starting of a new task, the stopping of a currently running task, or the ignoring of the event.
(313) The imaging-based bar code symbol reading device of the present invention provides the user with a command-line interface (CLI), which can work over the standard communication lines, such as RS232, available in the bar code reader. The CLI is used mostly for diagnostic purposes, but can also be used for configuration purposes in addition to the MetroSet and MetroSelect programming functionalities. To send commands to the bar code reader utilizing the CLI, a user must first enter the User Command Manager by typing in a special character, which could actually be a combination of multiple and simultaneous keystrokes, such as Ctrl and S for example. Any standard and widely available software communication tool, such as Windows HyperTerminal, can be used to communicate with the bar code reader. The bar code reader acknowledges the readiness to accept commands by sending the prompt, such as MTLG>, back to the user. The user can now type in any valid Application command. To quit the User Command Manager and return the scanner back to its normal operation, a user must enter another special character, which could actually be a combination of multiple and simultaneous keystrokes, such as Ctrl and R for example.
(314) An example of the valid command could be the Save Image command, which is used to upload an image from the bar code reader's memory to the host PC. This command has the following CLI format:
(315) save [filename [compr]]
(316) where
(317) (1) save is the command name.
(318) (2) filename is the name of the file the image gets saved in. If omitted, the default filename is image.bmp.
(319) compr is the compression number, from 0 to 10. If omitted, the default compression number is 0, meaning no compression. The higher the compression number, the higher the image compression ratio and the faster the image transmission, but the more distorted the image gets.
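The `save [filename [compr]]` format with the documented defaults can be sketched as a small parser. The function name is hypothetical; the defaults (`image.bmp`, compression 0) and the 0-10 range come from the text:

```python
def parse_save_command(line):
    """Parse a CLI line of the form 'save [filename [compr]]'.

    Returns (filename, compr), applying the documented defaults when
    either optional argument is omitted."""
    parts = line.split()
    if not parts or parts[0] != "save":
        raise ValueError("not a save command")
    filename = parts[1] if len(parts) > 1 else "image.bmp"  # default filename
    compr = int(parts[2]) if len(parts) > 2 else 0          # default: no compression
    if not 0 <= compr <= 10:
        raise ValueError("compression number must be 0-10")
    return filename, compr
```

So `save` alone uploads the image as `image.bmp` uncompressed, while `save photo.bmp 7` names the file and requests a higher compression ratio.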
(320) The imaging-based bar code symbol reader of the present invention can have numerous commands. All commands are described in a single table (User Commands Table shown in
(321) When a user enters a command, the User Command Manager looks for the command in the table. If found, it executes the function the address of which is provided in the record for the entered command. Upon return from the function, the User Command Manager sends the prompt to the user indicating that the command has been completed and the User Command Manager is ready to accept a new command.
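The table-lookup-and-execute cycle, combined with the per-command security level mentioned in paragraph (310), can be sketched as follows. The table layout (name mapped to a required authority level and a handler function) and the return strings are hypothetical; only the lookup, authority check, execution and prompt behavior come from the text:

```python
def execute_command(commands_table, name, user_level):
    """Sketch of the User Command Manager cycle: look the command up in
    the User Commands Table, run its handler only if the user's authority
    matches the command's security level, then return the prompt."""
    entry = commands_table.get(name)
    if entry is None:
        return "unknown command"
    required_level, handler = entry
    if user_level < required_level:
        return "permission denied"
    handler()                 # call the function whose address the record holds
    return "MTLG> "           # prompt signals the command completed

# Example table with a single 'save' command requiring authority level 1.
ran = []
table = {"save": (1, lambda: ran.append("save"))}
```

On return from the handler, sending the prompt back tells the user the manager is ready for the next command.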
(322) Specification of Software Modules within the Application Layer of the System Software Architecture Employed in Imaging-Based Bar Code Reader of the Present Invention
(323) The image processing software employed within the system hereof performs its bar code reading function by locating and recognizing the bar codes within the frame of a captured image comprising pixel data. The modular design of the image processing software provides a rich set of image processing functions, which could be utilized in the future for other potential applications, related or not related to bar code symbol reading, such as: optical character recognition (OCR) and verification (OCV); reading and verifying directly marked symbols on various surfaces; facial recognition and other biometrics identification; etc.
(324) The CodeGate Task, in an infinite loop, performs the following task. It illuminates a thin narrow horizontal area at the center of the field-of-view (FOV) and acquires a digital image of that area. It then attempts to read bar code symbols represented in the captured frame of image data using the image processing software facilities supported by the Image-Processing Bar Code Symbol Reading Subsystem 17 of the present invention to be described in greater detail hereinafter. If a bar code symbol is successfully read, then Subsystem 17 saves the decoded data in the special Decode Data Buffer. Otherwise, it clears the Decode Data Buffer. Then, it continues the loop. The CodeGate Task routine never exits on its own. It can be canceled by other modules in the system when reacting to other events. For example, when a user pulls the trigger switch 2C, the event TRIGGER_ON is posted to the application. The Application software responsible for processing this event, checks if the CodeGate Task is running, and if so, it cancels it and then starts the Main Task. The CodeGate Task can also be canceled upon OBJECT_DETECT_OFF event, posted when the user moves the bar code reader away from the object, or when the user moves the object away from the bar code reader. The CodeGate Task routine is enabled (with Main Task) when semi-automatic-triggered system modes of programmed operation (Modes of System Operation Nos. 11-14) are to be implemented on the illumination and imaging platform of the present invention.
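The CodeGate loop described above (acquire a narrow-area image, attempt a decode, fill or clear the Decode Data Buffer, repeat until cancelled from outside) can be sketched as follows. Since the real routine never exits on its own, a `cancelled` callback stands in here for external cancellation; all names are hypothetical:

```python
def codegate_task(acquire_narrow_image, try_decode, decode_buffer, cancelled):
    """Sketch of the CodeGate Task loop.

    acquire_narrow_image: captures the thin horizontal area at the FOV center.
    try_decode: returns decoded symbol data, or None on failure.
    decode_buffer: stands in for the special Decode Data Buffer.
    cancelled: external-cancellation check (e.g. TRIGGER_ON handling)."""
    while not cancelled():
        frame = acquire_narrow_image()
        data = try_decode(frame)
        if data is not None:
            decode_buffer["data"] = data      # success: save the decoded data
        else:
            decode_buffer.pop("data", None)   # failure: clear the buffer
```

When the trigger event later cancels this loop and starts the Main Task, any data left in the buffer lets the Main Task skip straight to data output.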
(325) The Narrow-Area Illumination Task is a simple routine which is enabled (with Main Task) when manually-triggered system modes of programmed operation (Modes of System Operation Nos. 1-5) are to be implemented on the illumination and imaging platform of the present invention. However, this routine is never enabled simultaneously with CodeGate Task. As shown in the event flow chart of
(326) Depending on the System Mode in which the imaging-based bar code symbol reader is configured, Main Task will typically perform differently, but within the limits described in
(327) The MetroSet functionality is executed by the special MetroSet Task. When the Focus RS232 software driver detects a special NULL-signal on its communication lines, it posts the METROSET_ON event to the Application. The Application software responsible for processing this event starts the MetroSet task. Once the MetroSet Task is completed, the scanner returns to its normal operation.
(328) The function of the Plug-In Controller (i.e. Manager) is to read configuration files, find plug-in libraries within the Plug-In and Configuration File Library, and install the plug-in into the memory of the operating system, which returns back an address to the Plug-In Manager indicating where the plug-in has been installed, for future access. As will be described in greater detail hereinafter, the Plug-In Development Platform supports the development of plug-ins that enhance, extend and/or modify the features and functionalities of the image-processing based bar code symbol reading system, and, once plug-ins are developed, the uploading of such plug-ins within the file system of the operating system layer, while storing the addresses of such plug-ins within the Plug-In and Configuration File Library in the Application Layer.
(329) Modes of System Operation Nos. 6-10 can be readily implemented on the illumination and imaging platform of the present invention by making the following software system modifications: (1) an Auto-Read Task routine would be added to the system routine library (wherein Auto-Read Task could be an infinite loop routine where the primary operations of CodeGate Task and Main Task are sequenced together to attempt first automatic narrow-area illumination and image capture and processing, followed by automatic wide-area illumination and image capture and processing, and repeating the wide-area operation in an infinite loop, until the object is no longer detected within a particular predetermined time period); and (2) modifying the query block Is CodeGate Task or Narrow-Area Illumination Task Enabled? in the Object_Detect_On event handling routine shown in
(330) Operating System Layer Software Modules within the Application Layer of the System Software Architecture Employed in Imaging-Based Bar Code Reader of the Present Invention
(331) The Devices Drivers software modules, which include trigger drivers, provide a means for establishing a software connection with the hardware-based manually-actuated trigger switch 2C employed on the imaging-based device, an image acquisition driver for implementing image acquisition functionality aboard the imaging-based device, and an IR driver for implementing object detection functionality aboard the imaging-based device.
(332) As shown in
(333) Basic System Operations Supported by the Three-Tier Software Architecture of the Hand-Supportable Digital Imaging-Based Bar Code Reading Device of the Present Invention
(334) The basic system operations supported by the three-tier software architecture of the hand-supportable digital imaging-based bar code reading device of the present invention are shown in
(335) Notably, these basic operations represent functional modules (or building blocks) within the system architecture of the present invention, which can be combined in various combinations to implement numerous Programmable Modes of System Operation using the image acquisition and processing platform disclosed herein. For purposes of illustration, and the avoidance of obfuscation of the present invention, these basic system operations will be described below with reference to Programmable Mode of System Operation No. 12: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing The No-Finder Mode And The Manual Or Automatic Modes Of the Multi-Mode Bar Code Reading Subsystem 17.
(337) Upon receiving the SCORE_OBJECT_DETECT_ON event at the Application Layer, the Application Events Manager executes an event handling routine (shown in
(338) As shown in
(339) As shown in
(340) The routine determines whether the Presentation Mode (i.e. Programmed Mode of System Operation No. 10) has been enabled, and if so, then the routine exits. If the routine determines that the Presentation Mode (i.e. Programmed Mode of System Operation No. 10) has not been enabled, then it determines whether the CodeGate Task is running, and if it is running, then it first cancels the CodeGate Task and then deactivates the narrow-area illumination array 27 associated with the Multi-Mode Illumination Subsystem 14, and thereafter executes the Main Task. If however the routine determines that the CodeGate Task is not running, then it determines whether Narrow-Area Illumination Task is running, and if it is not running, then Main Task is started. However, if Narrow-Area Illumination Task is running, then the routine increases the narrow-illumination beam to full power and acquires a narrow-area image at the center of the field of view of the system, then attempts to read the bar code in the captured narrow-area image. If the read attempt is successful, then the decoded (symbol character) data is saved in the Decode Data Buffer, the Narrow-Area Illumination Task is canceled, the narrow-area illumination beam is stopped, and the routine starts the Main Task, as shown. If the read attempt is unsuccessful, then the routine clears the Decode Data Buffer, the Narrow-Area Illumination Task is canceled, the narrow-area illumination beam is stopped, and the routine starts the Main Task, as shown.
(341) The Narrow-Area Task routine is an infinite loop routine that simply keeps a narrow-area illumination beam produced and directed at the center of the field of view of the system in a recursive manner (e.g. typically at half or less power in comparison with the full-power narrow-area illumination beam produced during the running of CodeGate Task).
(342) The first step performed in the Main Task by the Application Layer is to determine whether CodeGate Data is currently available (i.e. stored in the Decode Data Buffer), and if such data is available, then the Main Task directly executes the Data Output Procedure. However, if the Main Task determines that no such data is currently available, then it starts the Read TimeOut Timer, and then acquires a wide-area image of the detected object, within the time frame permitted by the Read Timeout Timer. Notably, this wide-area image acquisition process involves carrying out the following operations, namely: (i) first activating the wide-area illumination mode in the Multi-Mode Illumination Subsystem 14 and the wide-area capture mode in the CMOS image formation and detection module; (ii) determining whether the object resides in the near-field or far-field portion of the FOV (through object range measurement by the IR-based Object Presence and Range Detection Subsystem 12); and (iii) then activating either the near or far field wide-area illumination array to illuminate either the object in either the near or far field portions of the FOV using either the near-field illumination array 28 or the far-field illumination array 29 (or possibly both 28 and 29 in special programmed cases) at an intensity and duration determined by the automatic light exposure measurement and control subsystem 15; while (iv) sensing the spatial intensity of light imaged onto the CMOS image sensing array 22 in accordance with the Global Exposure Control Method of the present invention, described in detail hereinabove. 
Then the Main Task performs image processing operations on the captured image using either the Manual, ROI-Specific or Automatic Modes of operation (although it is understood that other image-processing based reading methods taught herein, such as the OmniScan Mode, as well as other suitable alternative decoding algorithms/processes not disclosed herein, can be used depending on which Programmed Mode of System Operation has been selected by the end user for the imaging-based bar code symbol reader of the present invention). Notably, the time duration of each image acquisition/processing frame is set by the Start Read Timeout Timer and Stop Read Timeout Timer blocks shown therein, and, within the Programmed Mode of System Operation No. 12, the Main Task will support repeated (i.e. multiple) attempts to read a single bar code symbol so long as the trigger switch 2C is manually depressed by the operator and a single bar code has not yet been read. Then, upon successfully reading a (single) bar code symbol, the Main Task will execute the Data Output Procedure. Notably, in other Programmed Modes of System Operation, in which a single attempt at reading a bar code symbol is enabled, the Main Task will be modified accordingly to support such system behavior. In such a case, an alternatively named Main Task (e.g. Main Task No. 2) would be executed to enable the required system behavior during run-time.
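Step (iii) of the wide-area image acquisition process above can be sketched as follows (hypothetical C; the function name, the millimeter range threshold parameter, and the "both arrays" flag are assumptions introduced for illustration):

```c
/* Hypothetical sketch of step (iii): choosing which wide-area
   illumination array to activate, based on the object range reported
   by the IR-based Object Presence and Range Detection Subsystem 12. */
typedef enum {
    NEAR_FIELD_ARRAY_28,   /* object in near-field portion of the FOV */
    FAR_FIELD_ARRAY_29,    /* object in far-field portion of the FOV  */
    BOTH_ARRAYS_28_29      /* special programmed case                 */
} wide_area_array_t;

wide_area_array_t select_wide_area_array(unsigned range_mm,
                                         unsigned near_far_boundary_mm,
                                         int both_arrays_programmed)
{
    if (both_arrays_programmed)
        return BOTH_ARRAYS_28_29;
    /* objects at or inside the boundary use the near-field array */
    return (range_mm <= near_far_boundary_mm) ? NEAR_FIELD_ARRAY_28
                                              : FAR_FIELD_ARRAY_29;
}
```

The intensity and duration of whichever array is selected remain under the control of the automatic light exposure measurement and control subsystem 15, per step (iv).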
(343) It should also be pointed out at this juncture that it is possible to enable and utilize several different kinds of symbol reading methods during the Main Task, and to apply particular reading methods based on the computational results obtained while processing the narrow-area image during the CodeGate Task, and/or while preprocessing the captured wide-area image during one of the image acquiring/processing frames or cycles running in the Main Task. The main point to be made here is that the selection and application of image-processing based bar code reading methods will preferably occur through the selective activation of the different modes available within the multi-mode image-processing based bar code symbol reading Subsystem 17, in response to information learned about the graphical intelligence represented within the structure of the captured image, and that such dynamic selection should occur in accordance with principles of dynamic adaptive learning commonly used in advanced image processing systems, speech understanding systems, and the like. This general approach is in marked contrast with the approaches used in prior art imaging-based bar code symbol readers, wherein permitted methods of bar code reading are pre-selected based on statically defined modes selected by the end user, and not in response to detected conditions discovered in captured images on a real-time basis.
(344) The first step carried out by the Data Output Procedure, called in the Main Task, involves determining whether the symbol character data generated by the Main Task is for programming the bar code reader or not. If the data is not for programming the bar code symbol reader, then the Data Output Procedure sends the data out according to the bar code reader system configuration, and then generates the appropriate visual and audio indication to the operator, and then exits the procedure. If the data is for programming the bar code symbol reader, then the Data Output Procedure sets the appropriate elements of the bar code reader configuration (file) structure, and then saves the Bar Code Reader Configuration Parameters in non-volatile RAM (i.e. NOVRAM). The Data Output Procedure then reconfigures the bar code symbol reader and then generates the appropriate visual and audio indication to the operator, and then exits the procedure. Decoded data is sent from the Input/Output Module at the System Core Layer to the Device Drivers within the Linux OS Layer of the system.
(345) Specification of Symbologies and Modes Supported by the Multi-Mode Bar Code Symbol Reading Subsystem Module Employed within the Hand-Supportable Digital Image-Based Bar Code Reading Device of the Present Invention
(346) Various bar code symbologies are supported by the Multi-Mode Bar Code Symbol Reading Subsystem 17 employed within the hand-supportable digital imaging-based bar code symbol reading device of the present invention. These bar code symbologies include: Code 128; Code 39; I2of5; Code93; Codabar; UPC/EAN; Telepen; UK-Plessey; Trioptic; Matrix 2of5; Airline 2of5; Straight 2of5; MSI-Plessey; Code11; and PDF417.
(347) Specification of the Various Modes of Operation in the Multi-Mode Bar Code Symbol Reading Subsystem of the Present Invention
(348) The Multi-Mode Image-Processing Based Bar Code Symbol Reading Subsystem 17 of the illustrative embodiment supports five primary modes of operation, namely: the Automatic Mode of Operation; the Manual Mode of Operation; the ROI-Specific Mode of Operation; the No-Finder Mode of Operation; and Omniscan Mode of Operation. Various combinations of these modes of operation can be used during the lifecycle of the image-processing based bar code reading process of the present invention.
(349) The Automatic Mode of Multi-Mode Bar Code Symbol Reading Subsystem
(350) In its Automatic Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically start processing a captured frame of digital image data, prior to the complete buffering thereof, so as to search for one or more bar codes represented therein in an incremental manner, and to continue searching until the entire image is processed.
(351) This mode of image-based processing enables bar code locating and reading when no prior knowledge about the location of, or the orientation of, or the number of bar codes that may be present within an image, is available. In this mode of operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts processing the image from the top-left corner and continues until it reaches the bottom-right corner, reading any potential bar codes as it encounters them.
(352) The Manual Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
(353) In its Manual Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data, starting from the center or sweet spot of the image at which the user would have aimed the bar code reader, so as to search for (i.e. find) at least one bar code symbol represented therein. Unlike the Automatic Mode, this is done by searching in a helical manner through frames or blocks of extracted image feature data, and then marking the same and image-processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data.
(354) This mode of image processing enables bar code locating and reading when the maximum number of bar codes that could be present within the image is known a priori and when portions of the primary bar code have a high probability of spatial location close to the center of the image. The Multi-Mode Bar Code Symbol Reading Subsystem 17 starts processing the image from the center, along rectangular strips progressively further from the center and continues until either the entire image has been processed or the programmed maximum number of bar codes has been read.
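The center-outward ("helical") search order described above can be illustrated with a small sketch (hypothetical C; the block-coordinate representation is an assumption for illustration). Blocks are visited in concentric rectangular strips around the center block, so sorting blocks by their Chebyshev distance from the center yields a center-first traversal:

```c
#include <stdlib.h>

/* Hypothetical sketch of the Manual Mode search order: image feature
   blocks are processed in rectangular strips progressively further
   from the center.  A block's strip index is its Chebyshev distance
   from the center block, so strip 0 is the center itself. */
int block_ring(int bx, int by, int center_bx, int center_by)
{
    int dx = abs(bx - center_bx);
    int dy = abs(by - center_by);
    return dx > dy ? dx : dy;   /* Chebyshev distance = strip index */
}
```

Processing blocks in order of increasing strip index, and stopping once the programmed maximum number of bar codes has been read, matches the termination condition described above.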
(355) The ROI-Specific Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
(356) In its ROI-Specific Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data, starting from the region of interest (ROI) in the captured image, specified by coordinates acquired during a previous mode of operation within the Multi-Mode Bar Code Symbol Reading Subsystem 17. Unlike the Manual Mode, this is done by analyzing the received ROI-specified coordinates, derived during either a previous No-Finder Mode, Automatic Mode, or Omniscan Mode of operation, and then immediately processing image feature data, and image-processing the corresponding raw digital image data until a bar code symbol is recognized/read within the captured frame of image data. Thus, typically, the ROI-Specific Mode is used in conjunction with other modes of the Multi-Mode Bar Code Symbol Reading Subsystem 17.
(357) This mode of image processing enables bar code locating and reading when the maximum number of bar codes that could be present within the image is known a priori and when portions of the primary bar code have a high probability of spatial location close to the specified ROI in the image. The Multi-Mode Bar Code Symbol Reading Subsystem starts processing the image from these initially specified image coordinates, and then progressively further in a helical manner from the ROI-specified region, and continues until either the entire image has been processed or the programmed maximum number of bar codes has been read.
(358) The No-Finder Mode of the Multi-Mode Bar Code Symbol Reading Subsystem
(359) In its No-Finder Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured narrow-area (linear) frame of digital image data, without the feature extraction and marking operations used in the Automatic, Manual and ROI-Specific Modes, so as to read one or more bar code symbols represented therein.
(360) This mode enables bar code reading when it is known, a priori, that the image contains at most one (1-dimensional) bar code symbol, portions of which have a high likelihood of spatial location close to the center of the image and when the bar code is known to be oriented at zero degrees relative to the horizontal axis. Notably, this is typically the case when the bar code reader is used in a hand-held mode of operation, where the bar code symbol reader is manually pointed at the bar code symbol to be read. In this mode, the Multi-Mode Bar Code Symbol Reading Subsystem 17 starts at the center of the image, skips all bar code location steps, and filters the image at zero (0) degrees and 180 degrees relative to the horizontal axis. Using the bar-and-space-count data generated by the filtration step, it reads the potential bar code symbol.
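The bar-and-space-count data generated by the filtration step can be illustrated with a simple run-length sketch (hypothetical C; the binarized scan-line representation is an assumption, as the patent does not disclose the filtering implementation). Each run of identical pixels along the scan line becomes one bar or space count:

```c
#include <stddef.h>

/* Hypothetical sketch of deriving bar-and-space-count data from a
   filtered (binarized) scan line, as used by the No-Finder Mode:
   each run of identical pixel values becomes one bar or space count.
   Returns the number of runs written into counts[]. */
size_t run_lengths(const unsigned char *row, size_t n,
                   unsigned counts[], size_t max_counts)
{
    size_t nruns = 0;
    for (size_t i = 0; i < n && nruns < max_counts; ) {
        size_t j = i;
        while (j < n && row[j] == row[i])
            j++;                          /* extend the current run  */
        counts[nruns++] = (unsigned)(j - i);
        i = j;                            /* start of the next run   */
    }
    return nruns;
}
```

The resulting count sequence is what a symbology decoder would then match against the element-width patterns of the candidate bar code.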
(361) The Omni-Scan Mode of the Multi-Mode Bar Code Reading Subsystem
(362) In its Omniscan Mode of Operation, the Multi-Mode Bar Code Symbol Reading Subsystem 17 is configured to automatically process a captured frame of digital image data along any one or more predetermined virtual scan line orientations, without feature extraction and marking operations used in the Automatic, Manual and ROI-Specific Modes, so as to read a single bar code symbol represented in the processed image.
(363) This mode enables bar code reading when it is known, a priori, that the image contains at most one (1-dimensional) bar code, portions of which have a high likelihood of spatial location close to the center of the image but which could be oriented in any direction. The Multi-Mode Bar Code Symbol Reading Subsystem 17 starts at the center of the image, skips all bar code location steps, and filters the image at different start-pixel positions and at different scan-angles. Using the bar-and-space-count data generated by the filtration step, the Omniscan Mode reads the potential bar code symbol.
(364) Programmable Modes of Bar Code Reading Operation within the Hand-Supportable Digital Image-Based Bar Code Reading Device of the Present Invention
(365) In the illustrative embodiment, the imaging-based bar code symbol reader of the present invention has at least seventeen (17) Programmable System Modes of Operation, namely:
Programmed Mode of System Operation No. 1: Manually-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 2: Manually-Triggered Multiple-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 3: Manually-Triggered Single-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode and the Automatic or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 4: Manually-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode and the Automatic or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 5: Manually-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode and the Automatic or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 6: Automatically-Triggered Single-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 7: Automatically-Triggered Multi-Attempt 1D Single-Read Mode Employing the No-Finder Mode of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 8: Automatically-Triggered Multi-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode and Manual and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 9: Automatically-Triggered Multi-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode and Manual and/or Automatic Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmable Mode of System Operation No. 10: Automatically-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing the Manual, Automatic or Omniscan Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmed Mode of System Operation No. 11: Semi-Automatic-Triggered Single-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode and the Automatic or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmable Mode of System Operation No. 12: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Single-Read Mode Employing the No-Finder Mode and the Automatic or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmable Mode of Operation No. 13: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode and the Automatic or Manual Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmable Mode of Operation No. 14: Semi-Automatic-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the No-Finder Mode and the Omniscan Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmable Mode of Operation No. 15: Continuously-Automatically-Triggered Multiple-Attempt 1D/2D Multiple-Read Mode Employing the Automatic, Manual or Omniscan Modes of the Multi-Mode Bar Code Reading Subsystem;
Programmable Mode of System Operation No. 16: Diagnostic Mode of Imaging-Based Bar Code Reader Operation; and
Programmable Mode of System Operation No. 17: Live Video Mode of Imaging-Based Bar Code Reader Operation.
(366) Preferably, these Modes of System Operation can be programmed by reading a sequence of bar code symbols from a programming menu as taught, for example, in U.S. Pat. No. 6,565,005, which describes a bar code scanner programming technology developed by Metrologic Instruments, Inc., and marketed under the name MetroSelect Single Line Configuration Programming Method.
(367) These Programmable System Modes of Operation will be described in detail hereinbelow. Alternatively, the MetroSet Graphical User Interface (GUI) can be used to view and change configuration parameters in the bar code symbol reader using a PC. Alternatively, a Command Line Interface (CLI) may also be used to view and change configuration parameters in the bar code symbol reader.
(368) Each of these programmable modes of bar code reader operation is described in greater detail in WIPO International Publication No. WO 05/05039, supra.
(369) Overview of the Imaging-Based Bar Code Reader Start-Up Operations
(370) When the bar code reader hereof boots up, its FPGA is programmed automatically with 12.5/50/25 MHz clock firmware and all required device drivers are also installed automatically. The login to the Operating System is also done automatically for the user root, and the user is automatically directed to the /root/ directory. For nearly all programmable modes of system operation employing automatic object detection, the IR object detection software driver is installed automatically. Also, for all Programmable System Modes of operation employing the narrow-area illumination mode, the narrow-area illumination software drivers are automatically installed, so that a Pulse Width Modulator (PWM) is used to drive the narrow-area LED-based illumination array 27. To start the bar code reader operation, the operating system first changes to the /tmp/ directory (cd /tmp), and then runs the focusapp program located in the /root/ directory. The /root/ directory is located in Flash ROM, while the /tmp/ directory is located in RAM; /tmp/ is therefore made the current directory so that captured images in transition to the host are stored there.
(371) Second Illustrative Embodiment of Digital Imaging-Based Bar Code Symbol Reading Device of the Present Invention, Wherein Four Distinct Modes of Illumination are Provided
(372) In the first illustrative embodiment described above, the Multi-mode Illumination Subsystem 14 had three primary modes of illumination: (1) narrow-area illumination mode; (2) near-field wide-area illumination mode; and (3) far-field wide-area illumination mode.
(373) In a second alternative embodiment of the digital imaging-based bar code symbol reading device of the present invention, the Multi-Mode Illumination Subsystem 14 is modified to support four primary modes of illumination: (1) near-field narrow-area illumination mode; (2) far-field narrow-area illumination mode; (3) near-field wide-area illumination mode; and (4) far-field wide-area illumination mode. In general, these near-field and far-field narrow-area illumination modes of operation are conducted during the narrow-area image capture mode of the Multi-Mode Image Formation and Detection Subsystem 13, and are supported by a near-field narrow-area illumination array and a far-field narrow-area illumination array. In the second illustrative embodiment, each of these illumination arrays is realized using at least a pair of LEDs, each having a cylindrical lens of appropriate focal length to focus the resulting narrow-area (i.e. linear) illumination beam into the near-field portion and far-field portion of the field of view of the system, respectively.
(374) One of the advantages of using a pair of independent illumination arrays to produce narrow-area illumination fields over the near and far field portions of the FOV is that it is possible to more tightly control the production of a relatively narrow or narrowly-tapered narrow-area illumination field along its widthwise dimension. For example, during bar code menu reading applications, the near-field narrow-area illumination array can be used to generate (over the near-field portion of the FOV) an illumination field that is narrow along both its widthwise and height-wise dimensions, to enable the user to easily align the illumination field (beam) with a single bar code symbol to be read from a bar code menu of one type or another, thereby avoiding inadvertent reads of two or more bar code symbols or simply the wrong bar code symbol. At the same time, the far-field narrow-area illumination array can be used to generate (over the far-field portion of the FOV) an illumination field that is sufficiently wide along its widthwise dimension to enable the user to easily read elongated bar code symbols in the far-field portion of the field of view of the bar code reader, by simply moving the object towards the far portion of the field.
(375) Third Illustrative Embodiment of Digital Imaging-Based Bar Code Symbol Reading Device of the Present Invention
(376) Alternatively, the imaging-based bar code symbol reading device of the present invention can have virtually any type of form factor that would support the reading of bar code symbols in diverse application environments. One alternative form factor is a portable digital imaging-based bar code symbol reading device arranged in a Presentation Mode (i.e. configured in Programmed System Mode No. 12).
(377) The digital imaging-based bar code symbol reading device of the present invention can also be realized in the form of a Digital Imaging-Based Bar Code Reading Engine that can be readily integrated into various kinds of information collection and processing systems. Notably it is understood that a trigger switch or functionally equivalent device will be typically integrated with the housing of the resultant system into which the engine is embedded so that the user can interact with and actuate the same. Such Engines according to the present invention can be realized in various shapes and sizes and be embedded within various kinds of systems and devices requiring diverse image capture and processing functions as taught herein.
(378) Digital Image Capture and Processing Engine of the Present Invention Employing Linear Optical Waveguide Technology for Collecting and Conducting LED-Based Illumination in the Automatic Light Exposure Measurement and Illumination Control Subsystem During Object Illumination and Image Capture Modes of Operation
(379) Referring to
(380) The digital image capture and processing engine 220 is shown generating and projecting a visible illumination-based Image Cropping Pattern (ICP) 200 within the field of view (FOV) of the engine, during object illumination and image capture operations. Typically, as shown, the digital image capture and processing engine 220 will be embedded or integrated within a host system 222 which uses the digital output generated from the digital image capture and processing engine 220. The host system 222 can be any system that requires the kind of information that the digital image capture and processing engine 220 can capture and process.
(381) As shown in 14Q and 15, the digital image capture and processing engine 220 depicted in
(382) In
(383) In
(384) As shown in
(385) In
(386) In
(387) Notably, use of FOV folding mirror 236 can help to achieve a wider FOV beyond the light transmission window, while using a housing having narrower depth dimensions. Also, use of the linear optical waveguide 221 obviates the need for large aperture light collection optics which requires significant space within the housing.
(388) Digital Image Capture and Processing Engine of the Present Invention Employing Curved Optical Waveguide Technology for Collecting and Conducting LED-Based Illumination in the Automatic Light Exposure Measurement and Illumination Control Subsystem During Object Illumination and Image Capture Modes of Operation
(389) In FIGS. 19A1 and 19A2, an alternative embodiment of the digital image capture and processing engine 220 of the present invention is shown reconfigured in such a way that the illumination/aiming subassembly 232 (depicted in
(390) Automatic Imaging-Based Bar Code Symbol Reading System of the Present Invention Supporting Presentation-Type Modes of Operation Using Wide-Area Illumination and Video Image Capture and Processing Techniques
(391) In FIGS. 19B1 through 19B3, a presentation-type imaging-based bar code symbol reading system 300 is shown constructed using the general components of the digital image capture and processing engine of FIGS. 19A1. As shown, the illumination/aiming subassembly 232 of
(392) Automatic Imaging-Based Bar Code Symbol Reading System Of The Present Invention Supporting a Pass-Through Mode Of Operation Using Narrow-Area Illumination and Video Image Capture And Processing Techniques, and a Presentation-Type Mode Of Operation Using Wide-Area Illumination and Video Image Capture and Processing Techniques
(393) In FIGS. 19C1 through 19C5, there is shown an automatic imaging-based bar code symbol reading system of the present invention 400 supporting a pass-through mode of operation illustrated in FIG. 19C2 using narrow-area illumination and video image capture and processing techniques, and a presentation-type mode of operation illustrated in FIG. 19C3 using wide-area illumination and video image capture and processing techniques. As shown in FIGS. 19C1 through 19C5, the POS-based imaging system 400 employs a digital image capture and processing engine similar in design to that shown in FIGS. 19B1 and 19B2 and that shown in FIG. 2A1, except for the following differences:
(394) (1) the Automatic Light Exposure Measurement and Illumination Control Subsystem 15 is adapted to measure the light exposure on a central portion of the CMOS image sensing array and control the operation of the LED-Based Multi-Mode Illumination Subsystem 14 in cooperation with the Multi-Mode Image Processing Based Bar Code Symbol Reading Subsystem 17, employing software for performing real-time exposure quality analysis of captured digital images in accordance with an adaptive system control method of the present invention;
(395) (2) the substantially-coplanar narrow-area field of illumination and narrow-area FOV 401 are oriented in the vertical direction (i.e. oriented along Up and Down directions) with respect to the counter surface of the POS environment, so as to support the pass-through imaging mode of the system, as illustrated in FIG. 19C2; and
(396) (3) the IR-based object presence and range detection system 12 employed in FIG. 19A2 is replaced with an automatic IR-based object presence and direction detection subsystem 12 comprising four independent IR-based object presence and direction detection channels (i.e. fields) 402A, 402B, 402C and 402D, generated by IR LED and photodiode pairs 12A1, 12A2, 12A3 and 12A4 respectively, which automatically produce activation control signals A1(t), A2(t), A3(t) and A4(t) upon detecting an object moving through the object presence and direction detection fields, and a signal analyzer and control logic block 12B for receiving and processing these activation control signals A1(t), A2(t), A3(t) and A4(t), according to Processing Rules 1 through 5 set forth in FIG. 19C4, so as to generate a control activation signal indicating that the detected object is being moved either in a pass-through direction (e.g. L→R, R→L, U→D, or D→U), or in a presentation direction (i.e. towards the imaging window of the system).
(397) Preferably, this POS-based imaging system supports an adaptive control process, and in the illustrative embodiment of the present invention, operates generally according to System Mode No. 17, described hereinabove. In this POS-based imaging system, the trigger signal is generated from the automatic IR-based object presence and direction detection subsystem 12. In the illustrative embodiment, the trigger signal can take on one of three possible values, namely: (1) that no object has been detected in the FOV of the system; (2) that an object has been detected in the FOV and is being moved therethrough in a Pass-Through manner; or (3) that an object has been detected in the FOV and is being moved therethrough in a Presentation manner (i.e. toward the imaging window).
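The mapping from channel activations to the three trigger values can be sketched as follows. Note that the actual Processing Rules 1 through 5 appear in FIG. 19C4 and are not reproduced here; the rule used below (distinct activation times on laterally opposed channels imply a pass-through crossing, otherwise a presentation approach) is an assumption for illustration only, as are the function and type names:

```c
/* Hypothetical classification of object motion from the activation
   times of the four IR channels.  t_aN is the activation time of
   channel N (from signals A1(t)..A4(t)), or -1 if never activated. */
typedef enum { NO_OBJECT, PASS_THROUGH, PRESENTATION } object_motion_t;

object_motion_t classify_motion(int t_a1, int t_a2, int t_a3, int t_a4)
{
    if (t_a1 < 0 && t_a2 < 0 && t_a3 < 0 && t_a4 < 0)
        return NO_OBJECT;                 /* trigger value (1)          */
    if (t_a1 >= 0 && t_a2 >= 0 && t_a1 != t_a2)
        return PASS_THROUGH;              /* lateral crossing, (2)      */
    return PRESENTATION;                  /* toward imaging window, (3) */
}
```

A real implementation would apply analogous ordering tests to each opposed channel pair to distinguish L/R, R/L, U/D and D/U crossings.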
(398) By virtue of the intelligent automatic pass-through/presentation digital image capture and processing system of the present invention, it is now possible for operators to move objects past the imager in either a pass-through or presentation type manner, and the system will automatically adapt and reconfigure itself to optimally support the method of image-based scanning chosen by the operator.
(399) Alternative Embodiments of Imaging-Based Bar Code Symbol Reading System of the Present Invention
(400) In
(401) In
(402) In
(403) In each of the POS-based systems disclosed in
(404) In
(405) Method of and Apparatus for Modifying and/or Extending System Features and Functions within a Digital Image Capture and Processing System in Accordance with Principles of the Present Invention
(406) Referring now to
(407) As indicated in Block A of
(408) As indicated in Block B of
(409) As indicated in Block C of
(410) As indicated in Block D of
(411) As indicated in Block E of
(412) Having provided a brief overview on the system feature/functionality modification methodology of the present invention, it is now in order to describe these method steps in greater detail referring to
(413) In the illustrative embodiment, each plug-in module, stored within the Plug-In and Configuration File Library, shown in
(414) The management of all plug-in modules (i.e. third-party code) is performed by the Plug-in Controller shown in
(415) Any task of the Image-Processing Based Bar Code Symbol Reading System can request information from the Plug-in Controller about a plug-in module and/or request an operation on it. For a set of predetermined features, the Application tasks can request the Plug-in Controller to check the availability of a third-party plug-in module, and if such a module is available, install it and provide its executable address as well as the rules of the plug-in's engagement. The tasks can then execute it either instead of, or along with, the standard module that implements the particular feature. The rules of engagement of the plug-in module, i.e. the determination whether the plug-in module should be executed as a replacement for, or a complement to, the standard module, can be unique to the particular feature. The rules can also specify whether a complementary plug-in module should be executed before or after the standard module. Moreover, the plug-in module, if executed first, can indicate back to the device whether the standard module should also be called or not, thus allowing the alteration of the device's behavior. The programming interfaces are predefined for the features that allow the plug-in functionality, thus enabling third parties to develop their own software for the device.
(416) Consider, as a first and very simple example, an Image Pre-Processing Plug-in. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's standard Image Pre-Processing Module (i.e. original product code of executable binary format), which is normally executed by the Main Task, after the system acquires an image. In accordance with the principles of the present invention, the customer can provide its own image preprocessing software as a plug-in module (i.e. third-party code) to the multi-tier software-based system. Notably, the third-party code is typically expressed in executable binary format. The plug-in can be described in an Image Preprocessing Plug-in Configuration File, having a format, for example, as expressed below:
(417) TABLE-US-00001
// Image Preprocessing Configuration File
// type            param  library                    function
IMGPREPR:                 libimgprepr_plugin.so.1 > PluginImgprepr
IMGPREPR_PROGMD:          libimgprepr_plugin.so.1 > PluginImgpreprProgmd
IMGPREPR_PROGBC:          libimgprepr_plugin.so.1 > PluginImgpreprProgbc
(418) The block-diagram set forth in
(419) Consider, as a second, more interesting example, the Image Processing and Barcode Decoding Plug-in described in
(420) TABLE-US-00002
// Decode Plug-in Configuration File
// type   param  library                 function
DECODE:   0x02:  libdecode_plugin.so.1 > PluginDecode
wherein DECODE is a keyword identifying the image processing and barcode decoding plug-in; wherein 0x02 is the value identifying the plug-in's rules of engagement; wherein libdecode_plugin.so.1 is the name of the plug-in library in the device's file system; and wherein PluginDecode is the name of the plug-in function that implements the customer-specific image processing and barcode decoding functionality. The individual bits of the param value, which indicates the rules of this particular plug-in's engagement, have the following meaning:
(421) TABLE-US-00003
bit  meaning
0    0 = complement standard; 1 = replace standard
1    (if bit0 == 0) 0 = call before standard func; 1 = call after standard func
2    reserved
. . .
The value 0x02, therefore, means that the customer plug-in is a complimentary, not a replacement, module (the bit 0 is 0), and it should be executed after the execution of the standard module (bit 1 is 1).
(422) Consider, as a third example, the Data Formatting Plug-in described in FIG. 23C1. The original equipment manufacturer of the Image-Processing Based Bar Code Symbol Reading System supplies the system's standard Data Formatting Module, which is normally executed by the Main Task after the system acquires an image as indicated in
(423)
// Data Formatting Plug-in Configuration File
//type          library>function
PREFORMAT:      libformat_plugin.so.1>PluginPreformat
FORMAT_PROGMD:  libformat_plugin.so.1>PluginFormatProgmd
FORMAT_PROGBC:  libformat_plugin.so.1>PluginFormatProgbc
(424) The block-diagram set forth in FIG. 23C1 illustrates the logic of the Data Formatting Procedure plug-in.
(425) The Plug-Ins described above provide a few examples of the many kinds of plug-ins (objects) that can be developed so that allowed features and functionalities of the system can be modified by persons other than the system designer, in accordance with the principles of the present invention. Other system features and functionalities for which Plug-in modules can be developed and installed within the Image-Processing Based Bar Code Symbol Reading System include, but are not limited to, control over functions supported and performed by the following systems: the IR-based Object Presence and Range Detection Subsystem 12; the Multi-Mode Area-type Image Formation and Detection (i.e. camera) Subsystem 13; the Multi-Mode LED-Based Illumination Subsystem 14; the Automatic Light Exposure Measurement and Illumination Control Subsystem 15; the Image Capturing and Buffering Subsystem 16; the Multi-Mode Image-Processing Bar Code Symbol Reading Subsystem 17; the Input/Output Subsystem 18; the manually-actuatable trigger switch 2C; the System Mode Configuration Parameter Table 70; the System Control Subsystem 18; and any other subsystems which may be integrated within the Image-Processing Based Bar Code Symbol Reading System.
(426) Having described the structure and function of Plug-In Modules that can be created by persons other than the OEM system designer, it is now in order to describe an illustrative embodiment of the Plug-In Development Platform of the present invention with reference to
(427) In the illustrative embodiment, the system designer/OEM of the system (e.g. Metrologic Focus1690 Image-Processing Bar Code Reader) will provide the plug-in developer with a CD that contains, for example, the following software tools:
(428) Arm Linux Toolchain for Linux PC This directory contains the Arm Linux cross-compiling toolchain package for IBM-compatible Linux PC.
(429) Arm Linux Toolchain for Cygwin This directory contains the Arm Linux cross-compiling toolchain package for IBM-compatible Windows PC. The Cygwin software must be installed prior to the usage of this cross-compiling toolchain.
(430) Plug-in Samples This directory contains sample plug-in development projects. The plug-in software must be compiled on the IBM-compatible Linux PC using the Arm Linux Toolchain for Linux PC or on Windows PC with installed Cygwin software using Arm Linux Toolchain for Cygwin.
(431) FWZ Maker This directory contains the installation package of the program FWZ Maker for Windows PC. This program is used to build the FWZ-files for downloading into the Focus 1690 scanner.
(432) Latest Metrologic Focus Software This directory contains the FWZ-file with the latest Metrologic Focus scanner software.
(433) The first step of the plug-in software development process involves configuring the plug-in developer platform by installing the above tools on the host/developer computer system. The next step involves installing system software onto the Image-Processing Bar Code Reader, via the host plug-in developer platform using a communications cable between the communication ports of the system and the plug-in developer computer shown in
(434) To develop plug-in software, the corresponding shared library can be developed on the plug-in developer platform (i.e. the Linux PC) or in Windows Cygwin, along with the proper plug-in configuration file. The plug-in configuration file is then loaded into the /usr directory when developing a plug-in for an image capture and receiving device, such as Metrologic's Focus image-processing bar code reader. In this illustrative embodiment, each line of the plug-in configuration file contains information about a plug-in function in the following format:
(435) plug-in type: parameter: filename->function_name
(436) wherein plug-in type is one of the supported plug-in type keywords, followed by the field separator ':';
(437) wherein parameter is a number (decimal, or hexadecimal if preceded with 0x) having a specific and unique meaning for some plug-in functions. The parameter is also called a call-mode, for it can provide specific information on how the plug-in should be called. The parameter is not required and can be omitted; if specified, it must be followed by the field separator ':';
wherein filename is the name of the shared library, followed by the filename separator '->'. The filename can contain a full path to the library. If the path is omitted, the library is assumed to be located in either the /usr/local/lib or the /usr/lib directory in the Focus scanner. It is therefore important to make sure that the shared library is loaded into the correct directory in the Focus scanner, as specified by the plug-in configuration file; and
wherein function_name is the name of the corresponding plug-in C function.
(438) Notably, the configuration file can also contain single-line C-style comments.
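The line format above can be parsed mechanically by a host-side tool. The sketch below shows one way such a parser might split a configuration line into its fields; it is not part of the OEM toolkit, and the PLUGIN_LINE type, the field sizes, and the parse_plugin_line name are illustrative assumptions.

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical holder for one parsed configuration line of the form:
 *   plug-in type: parameter: filename->function_name
 * The parameter (call-mode) field is optional and defaults to 0. */
typedef struct {
    char type[32];
    unsigned long param;    /* call-mode; decimal or hex with 0x prefix */
    char filename[128];
    char function[64];
} PLUGIN_LINE;

/* Parse one line; returns 1 on success, 0 on malformed input. */
static int parse_plugin_line(const char *line, PLUGIN_LINE *out)
{
    const char *colon = strchr(line, ':');
    const char *arrow = strstr(line, "->");
    const char *second, *fname_start;
    size_t type_len, fname_len;

    if (colon == NULL || arrow == NULL || arrow < colon)
        return 0;

    type_len = (size_t)(colon - line);
    if (type_len == 0 || type_len >= sizeof(out->type))
        return 0;
    memcpy(out->type, line, type_len);
    out->type[type_len] = '\0';

    /* The optional parameter is present if a second ':' occurs before "->". */
    second = strchr(colon + 1, ':');
    if (second != NULL && second < arrow) {
        out->param = strtoul(colon + 1, NULL, 0);  /* base 0 accepts 0x.. */
        fname_start = second + 1;
    } else {
        out->param = 0;
        fname_start = colon + 1;
    }
    while (*fname_start == ' ')
        fname_start++;
    if (fname_start >= arrow)
        return 0;

    fname_len = (size_t)(arrow - fname_start);
    while (fname_len > 0 && fname_start[fname_len - 1] == ' ')
        fname_len--;                   /* tolerate space before "->" */
    if (fname_len == 0 || fname_len >= sizeof(out->filename))
        return 0;
    memcpy(out->filename, fname_start, fname_len);
    out->filename[fname_len] = '\0';

    if (strlen(arrow + 2) >= sizeof(out->function))
        return 0;
    strcpy(out->function, arrow + 2);
    return 1;
}
```

For example, the line `DECODE: 0x02: libdecode_plugin.so.1->PluginDecode` would yield type DECODE, call-mode 2, the library name, and the function name.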
(439) It is within the discretion of the plug-in developer to decide which plug-in functions (of those supported by the system designer) should be included in the plug-in module (i.e. object). Once the shared library is built and the configuration file is prepared on the plug-in development platform (illustrated in
(440) In the case of installing plug-in software for Metrologic's Focus image-processing bar code reader, it is recommended to avoid dynamic memory allocation and to use static buffers rather than allocating them dynamically. As far as the file system is concerned, if it is necessary to store data in a file, then locations such as /usr/ and /usr/local are recommended for storing data in non-volatile Flash memory; the /tmp directory can be used to store data in RAM.
(441) Programming Barcodes and Programming Modes
(442) In the illustrative embodiment, the configuration of the image-processing bar code reader of the present invention can be changed by scanning special programming barcodes, or by sending equivalent data to the reader from the host computer (i.e. plug-in development computer). Programming barcodes are usually Code 128 symbols with the FNC3 codeword.
(443) When scanning a programming barcode, the reader may or may not be in its so-called programming mode. When the reader is not in its programming mode, the effect of the programming barcode is supposed to be immediate. On the other hand, when the reader is in its programming mode, the effect of all the programming barcodes read during the programming mode should occur at the time when the reader exits the programming mode.
(444) There is a special set of programming barcodes reserved for plug-in software configuration purposes. These barcodes have at least 4 data characters, and the first three data characters are 990. It is recommended (but not required) that the Decode Plug-in use programming barcodes 6 characters long, starting with 9900xx; that the Image Preprocessing Plug-in use programming barcodes 6 characters long, starting with 9901xx; and that the Formatting Plug-in use programming barcodes 6 characters long, starting with 9902xx.
(445) Once a plug-in module has been developed in accordance with the principles of the present invention, the plug-in can be uninstalled by simply downloading an empty plug-in configuration file. For example, to uninstall a Decode plug-in, download an empty decode.plugin file into the /usr directory of the file system within the OS layer, shown in
(446) Details about the Decode Plug-in of the Illustrative Embodiment
(447) The purpose of the Decode Plug-in is to provide a replacement for, or a complement to, the standard Focus barcode decoding software. The Decode Plug-in can have the following plug-in functions:
(448) DECODE; DECODE_ENABLE2D; DECODE_PROGMD; DECODE_PROGBC.
(449) DECODE Plug-in Function
(450) This function is called to perform barcode decoding on the given image in memory. The image is represented in memory as a two-dimensional array of 8-bit pixels. The first pixel of the array represents the upper-left corner of the image.
(451) Function Prototype:
(452)
int  /* Return: number of decoded barcodes; negative if error */
(*PLUGIN_DECODE)(
    void *p_image,                     /* Input: pointer to the image */
    int size_x,                        /* Input: number of columns */
    int size_y,                        /* Input: number of rows */
    int pitch,                         /* Input: row size, in bytes */
    DECODE_RESULT *p_decode_results,   /* Output: decode results */
    int max_decodes,                   /* Input: maximum decode results allowed */
    int *p_cancel_flag);               /* Input: if not NULL, pointer to the cancel flag */
(453) Note that p_decode_results points to the location in memory where the Decode plug-in function should store one or more results of barcode decoding (if of course the plug-in successfully decodes one or more barcodes in the given image) in the form of the array of DECODE_RESULT structures. The maximum number of allowed decode results (i.e. the size of the array) is given in max_decodes. The plug-in must return the number of successfully decoded barcodes (i.e. the number of populated elements in the array p_decode_results), or a negative number in case of an error.
(454) If p_cancel_flag is not NULL, it points to the integer flag (called the Cancel flag) that indicates whether the decoding process should continue or should stop as soon as possible. If the flag is 0, the decoding process can continue. If the flag is not zero, the decoding process must stop as soon as possible. The reason for aborting the decoding process could be, for example, a time-out. It is recommended to check the Cancel flag often enough so that the latency in aborting the decoding process is as short as possible.
(455) Note that the Cancel flag is not the only way the Decoding plug-in (or any plug-in for that matter) can be aborted. Depending on the circumstances, the system can decide to abruptly kill the thread, in which the Decoding plug-in is running, at any time.
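The Cancel-flag convention above can be honored with a periodic check inside the plug-in's main processing loop, for example once per image row. The sketch below is illustrative only; check_cancelled and process_rows are hypothetical helper names, not part of the documented interface.

```c
#include <assert.h>
#include <stddef.h>

/* Returns nonzero if the host has requested cancellation.
 * p_cancel_flag may be NULL, in which case cancellation is never signaled. */
static int check_cancelled(const int *p_cancel_flag)
{
    return p_cancel_flag != NULL && *p_cancel_flag != 0;
}

/* Skeleton of a row-by-row processing loop that aborts promptly on cancel.
 * Returns the number of rows actually processed. */
static int process_rows(const unsigned char *p_image, int size_y, int pitch,
                        const int *p_cancel_flag)
{
    int y;
    for (y = 0; y < size_y; y++) {
        if (check_cancelled(p_cancel_flag))
            break;                                    /* stop as soon as possible */
        (void)(p_image + (size_t)y * (size_t)pitch);  /* scan this row here */
    }
    return y;
}
```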
(456) Structure DECODE_RESULT
(457) The structure DECODE_RESULT has the following format:
(458)
#define MAX_DECODED_DATA_LEN 4096
#define MAX_SUPPL_DATA_LEN   128

typedef struct {
    int x;
    int y;
} BC_POINT;

typedef struct {
    BC_POINT BCPts[4];   /* Coordinates of the 4 corners of the barcode */
} BC_BOUNDS;

The order of the array elements (i.e. corners) in the BC_BOUNDS structure is as follows:
(459)
0 - top left
1 - top right
2 - bottom right
3 - bottom left

typedef struct {
    int decode_result_index;   /* index of the decode result, starting from 0 */
    int num_decode_results;    /* total number of decode results minus 1 (i.e. 0-based) */
    char SymId[32];            /* the symbology identifier characters */
    int Symbology;             /* the decoded barcode's symbology identifier number */
    int Modifier;              /* additional information about the decoded barcode */
    int DecId;                 /* reserved */
    int Class;                 /* 1 for 1D, 2 for 2D */
    unsigned char Data[MAX_DECODED_DATA_LEN];      /* decoded data - may contain null chars */
    int Length;                /* number of characters in the decoded barcode */
    unsigned char SupplData[MAX_SUPPL_DATA_LEN];   /* supplemental code's data */
    int SupplLength;           /* number of characters in the supplemental code's data */
    unsigned char LinkedData[MAX_DECODED_DATA_LEN];
    int LinkedLength;
    BC_BOUNDS C_Bounds;        /* Bounds for the primary barcode */
    BC_BOUNDS S_Bounds;        /* Bounds for the supplemental barcode */
} DECODE_RESULT;
The first two members of each populated DECODE_RESULT structure must contain a zero-based index of the decode result in the array (i.e. the first decode result must have decode_result_index=0, the second must have decode_result_index=1, and so on) and the zero-based total number of successfully decoded barcodes (which should equal the returned value minus 1).
(460) The SymId member of the DECODE_RESULT structure can contain a null-terminated string of up to 31 characters describing the barcode symbology. It is used for informational purposes only. The following values are recommended for some known barcode symbologies:
(461)
AZTEC       Aztec
CBR         Codabar
CBK_A       Codablock A
CBK_F       Codablock F
C11         Code 11
C128        Code 128
C39         Code 39
C93         Code 93
DM          Datamatrix
S2O5        Straight 2 of 5
I2O5        Interleaved 2 of 5
MC          MaxiCode
PDF         Code PDF
QR          Code QR
RSS-E       Code RSS-E
RSS-EST     Code RSS-EST
RSS14-LIM   Code RSS Limited
RSS14       Code RSS-14
RSS14-ST    Code RSS-ST
UPC         Code UPC/EAN
The Symbology member of the DECODE_RESULT structure must contain the id of the decoded barcode symbology. The following symbology ids must be used for the known barcode symbologies:
(462)
MBCD_SYM_C128       Code 128
MBCD_SYM_C39        Code 39
MBCD_SYM_ITF        Interleaved 2 of 5
MBCD_SYM_C93        Code 93
MBCD_SYM_CBR        Codabar
MBCD_SYM_UPC        Code UPC/EAN
MBCD_SYM_TPEN       Telepen
MBCD_SYM_RSS14      Code RSS-14
MBCD_SYM_RSSE       Code RSS-E
MBCD_SYM_RSSL       Code RSS Limited
MBCD_SYM_MTF        Matrix 2 of 5
MBCD_SYM_ATF        Airline 2 of 5
MBCD_SYM_STF        Straight 2 of 5
MBCD_SYM_MPLY       MSI Plessey
MBCD_SYM_C11        Code 11
MBCD_SYM_PDF        Code PDF
MBCD_SYM_PN         Postnet
MBCD_SYM_DM         Datamatrix
MBCD_SYM_MC         MaxiCode
MBCD_SYM_QR         Code QR
MBCD_SYM_AZ         Aztec
MBCD_SYM_MICROPDF   MicroPDF
MBCD_SYM_CBLA       Codablock A
MBCD_SYM_CBLF       Codablock F
MBCD_SYM_UNKNOWN    User-defined symbology
(463) The Modifier member of the DECODE_RESULT structure contains additional information about the decoded barcode. The values of the Modifier are usually bit-combinatory. They are unique for different symbologies, and many symbologies don't use it at all. If the Modifier is not used, it should be set to 0. For some symbologies that support the Modifier, the possible values are presented below.
(464)
Coupon Modifier
MBCD_MODIFIER_COUP       Coupon code

UPC Modifier Bit Flag Constants
MBCD_MODIFIER_UPCA       UPC-A
MBCD_MODIFIER_UPCE       UPC-E
MBCD_MODIFIER_EAN8       EAN-8
MBCD_MODIFIER_EAN13      EAN-13
MBCD_MODIFIER_SUPP2      2-digit supplement
MBCD_MODIFIER_SUPP5      5-digit supplement

Code 128 Modifier Bit Flag Constants
MBCD_MODIFIER_C128A      Code 128 with A start character
MBCD_MODIFIER_C128B      Code 128 with B start character
MBCD_MODIFIER_C128C      Code 128 with C start character, but not an EAN-128
MBCD_MODIFIER_EAN128     EAN-128
MBCD_MODIFIER_PROG       Programming label (overrides all other considerations)
MBCD_MODIFIER_AIM_AI     Code 128 with AIM Application Indicator

Code 39 Modifier Bit Flag Constants
MBCD_MODIFIER_ITPHARM    Italian Pharmaceutical

Codabar Modifier Bit Flag Constants
MBCD_MODIFIER_CBR_DF     Double-Field Codabar

POSTNET Modifier Bit Flag Constants
MBCD_MODIFIER_PN         POSTNET
MBCD_MODIFIER_JAP        Japan Post
MBCD_MODIFIER_AUS        Australia Post
MBCD_MODIFIER_PLANET     PLANET
MBCD_MODIFIER_RM         Royal Mail
MBCD_MODIFIER_KIX        KIX Code
MBCD_MODIFIER_UPU57      UPU (57-bar)
MBCD_MODIFIER_UPU75      UPU (75-bar)

Datamatrix Modifier Bit Flag Constants
MBCD_MODIFIER_ECC140     ECC 000-140
MBCD_MODIFIER_ECC200     ECC 200
MBCD_MODIFIER_FNC15      ECC 200, FNC1 in first or fifth position
MBCD_MODIFIER_FNC26      ECC 200, FNC1 in second or sixth position
MBCD_MODIFIER_ECI        ECC 200, ECI protocol implemented
MBCD_MODIFIER_FNC15_ECI  ECC 200, FNC1 in first or fifth position, ECI protocol
MBCD_MODIFIER_FNC26_ECI  ECC 200, FNC1 in second or sixth position, ECI protocol
MBCD_MODIFIER_RP         Reader Programming Code

MaxiCode Modifier Bit Flag Constants
MBCD_MODIFIER_MZ         Symbol in Mode 0
MBCD_MODIFIER_M45        Symbol in Mode 4 or 5
MBCD_MODIFIER_M23        Symbol in Mode 2 or 3
MBCD_MODIFIER_M45_ECI    Symbol in Mode 4 or 5, ECI protocol
MBCD_MODIFIER_M23_ECI    Symbol in Mode 2 or 3, ECI protocol
The DecId member of the DECODE_RESULT structure is currently not used and should be set to 0.
(465) The Class member of the DECODE_RESULT structure must be set either to 1 or 2. If the decoded barcode is a regular linear barcode, such as UPC, Code 39, RSS, etc., the Class should be set to 1. If the decoded barcode is a 2D symbology, such as Code PDF, Datamatrix, Aztec, MaxiCode, etc., the Class should be set to 2.
(466) The Data member of the DECODE_RESULT structure contains the decoded data. It can contain up to MAX_DECODED_DATA_LEN bytes of data.
(467) The Length member of the DECODE_RESULT structure specifies how many bytes of decoded data are stored in Data.
(468) The SupplData member of the DECODE_RESULT structure contains the data decoded in a supplemental part of the barcode, such as a coupon. It can contain up to MAX_SUPPL_DATA_LEN bytes of data.
(469) The SupplLength member of the DECODE_RESULT structure specifies how many bytes of decoded data are stored in SupplData.
(470) The LinkedData member of the DECODE_RESULT structure contains the data decoded in a secondary part of the composite barcode, such as RSS/PDF composite. It can contain up to MAX_DECODED_DATA_LEN bytes of data.
(471) The LinkedLength member of the DECODE_RESULT structure specifies how many bytes of decoded data are stored in LinkedData.
(472) The C_Bounds and S_Bounds members of the DECODE_RESULT structure are currently not used.
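The population conventions described above (zero-based index, zero-based total, Class, and the unused members zeroed) can be illustrated with a small helper that fills one DECODE_RESULT. The structure definition below is a simplified stand-in for the development-kit header, and fill_decode_result is a hypothetical helper, not part of the documented interface; a real plug-in would use the OEM declarations and real symbology id constants.

```c
#include <assert.h>
#include <string.h>

/* Simplified stand-ins for the OEM-supplied definitions described above. */
#define MAX_DECODED_DATA_LEN 4096
#define MAX_SUPPL_DATA_LEN   128
#define MBCD_SYM_UNKNOWN     0      /* placeholder id value for illustration */

typedef struct { int x; int y; } BC_POINT;
typedef struct { BC_POINT BCPts[4]; } BC_BOUNDS;

typedef struct {
    int decode_result_index;
    int num_decode_results;
    char SymId[32];
    int Symbology;
    int Modifier;
    int DecId;
    int Class;
    unsigned char Data[MAX_DECODED_DATA_LEN];
    int Length;
    unsigned char SupplData[MAX_SUPPL_DATA_LEN];
    int SupplLength;
    unsigned char LinkedData[MAX_DECODED_DATA_LEN];
    int LinkedLength;
    BC_BOUNDS C_Bounds;
    BC_BOUNDS S_Bounds;
} DECODE_RESULT;

/* Fill one result following the conventions above: zero-based index,
 * zero-based total, Class 1 for a linear symbology, Modifier/DecId/bounds
 * zeroed because they are unused here. */
static void fill_decode_result(DECODE_RESULT *r, int index, int total,
                               const unsigned char *data, int len)
{
    memset(r, 0, sizeof(*r));
    r->decode_result_index = index;
    r->num_decode_results  = total - 1;   /* zero-based total */
    strcpy(r->SymId, "C128");             /* informational symbology string */
    r->Symbology = MBCD_SYM_UNKNOWN;      /* real id comes from the kit header */
    r->Class = 1;                         /* 1 = linear (1D) symbology */
    if (len > MAX_DECODED_DATA_LEN)
        len = MAX_DECODED_DATA_LEN;
    memcpy(r->Data, data, (size_t)len);
    r->Length = len;
}
```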
(473) DECODE Plug-in Call-Mode
(474) The DECODE plug-in can have the following call-mode values:
(475)
bit   meaning
0     0 = complement standard; 1 = replace standard
1     (if bit 0 == 0) 0 = call before standard function; 1 = call after standard function
(476) The default call-mode value is 0, meaning that by default, the DECODE plug-in is considered a complementary module to the standard Focus barcode decoding software and is executed before the standard function. In this case, the standard function will be called only if the result returned from the DECODE plug-in is non-negative and less than max_decodes.
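The call-mode bits above can be tested in code as in the sketch below. The macro and function names are illustrative assumptions, not part of the documented interface; only the bit positions come from the table.

```c
#include <assert.h>

/* Call-mode bit meanings, per the table above. */
#define CALLMODE_REPLACE_STANDARD  0x01u   /* bit 0 */
#define CALLMODE_CALL_AFTER        0x02u   /* bit 1, meaningful when bit 0 == 0 */

/* Nonzero if the plug-in replaces the standard module entirely. */
static int callmode_replaces_standard(unsigned callmode)
{
    return (callmode & CALLMODE_REPLACE_STANDARD) != 0;
}

/* For a complementary plug-in: nonzero if it runs after the standard module. */
static int callmode_runs_after_standard(unsigned callmode)
{
    return !callmode_replaces_standard(callmode)
        && (callmode & CALLMODE_CALL_AFTER) != 0;
}
```

With these helpers, the 0x02 example from the Decode configuration file reads as "complementary, called after the standard function", matching the text above.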
(477) DECODE_ENABLE2D Plug-in Function
(478) This function is called to notify the plug-in that the scanner enters a mode of operation in which decoding of 2D symbologies (such as PDF417, Datamatrix, etc.) should be either allowed or disallowed. By default, the decoding of 2D symbologies is allowed.
(479) Function Prototype:
(480) void
(481) (*PLUGIN_ENABLE2D)(int enable); /* Input: 0=disable; 1=enable */
(482) For example, when the Focus scanner is configured to work in linear mode (as opposed to omni-directional mode), the decoding of 2D symbologies is disallowed.
(483) DECODE_PROGMD Plug-in Function
(484) This function is called to notify the plug-in that the scanner enters a programming mode.
(485) Function Prototype:
(486) void
(487) (*PLUGIN_PROGMD)(int progmd); /* Input: 1=enter; 0=normal exit; (-1)=abort */
(488) DECODE_PROGBC Plug-in Function
(489) This function is called to notify the plug-in that the scanner just scanned a programming barcode, which can be used by the plug-in for its configuration purposes.
(490) Function Prototype:
(491)
int  /* Return: 1 if successful; 0 if barcode is invalid; negative if error */
(*PLUGIN_PROGBC)(unsigned char *bufferptr, int data_len);
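Following the recommended 9900xx convention for Decode plug-in programming barcodes, a DECODE_PROGBC handler might look like the sketch below. The return values follow the prototype above, but the option store (g_plugin_option) and the handler logic are hypothetical examples, not part of the documented interface.

```c
#include <assert.h>
#include <string.h>

static int g_plugin_option = 0;   /* hypothetical plug-in setting */

/* Accept 6-character programming barcodes of the form 9900xx, where the
 * two trailing digits select a plug-in option.
 * Return: 1 if successful; 0 if the barcode is not ours; negative if error. */
static int plugin_decode_progbc(unsigned char *bufferptr, int data_len)
{
    if (bufferptr == NULL || data_len != 6)
        return 0;                             /* not a plug-in barcode */
    if (memcmp(bufferptr, "9900", 4) != 0)
        return 0;                             /* not a Decode plug-in barcode */
    if (bufferptr[4] < '0' || bufferptr[4] > '9' ||
        bufferptr[5] < '0' || bufferptr[5] > '9')
        return -1;                            /* malformed option digits */
    g_plugin_option = (bufferptr[4] - '0') * 10 + (bufferptr[5] - '0');
    return 1;                                 /* barcode accepted */
}
```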
Details about the Image Preprocessing Plug-in of the Illustrative Embodiment of the Present Invention
(492) The purpose of the Image Preprocessing Plug-in is to allow the plug-in to perform some special image processing right after the image acquisition and prior to the barcode decoding. The Image Preprocessing Plug-in can have the following plug-in functions:
(493) IMGPREPR; IMGPREPR_PROGMD; IMGPREPR_PROGBC.
(494) IMGPREPR Plug-in Function
(495) This function is called to perform image preprocessing. The image is represented in memory as a two-dimensional array of 8-bit pixels. The first pixel of the array represents the upper-left corner of the image.
(496) Function Prototype:
(497)
int  /* Return: 1 if preprocessing is done; 0 if not; neg. if error */
(*PLUGIN_IMGPREPR)(
    void *p_image,          /* Input: pointer to the image */
    int size_x,             /* Input: number of columns */
    int size_y,             /* Input: number of rows */
    int pitch,              /* Input: row size, in bytes */
    void **pp_new_image,    /* Output: pointer to the new image */
    int *p_new_size_x,      /* Output: new number of columns */
    int *p_new_size_y,      /* Output: new number of rows */
    int *p_new_pitch);      /* Output: new row size, in bytes */
(498) If the IMGPREPR plug-in function is successful, it should return 1 and store the address of the new image in the location in memory pointed to by pp_new_image. The new image dimensions should be stored in the locations pointed to by p_new_size_x, p_new_size_y, and p_new_pitch.
(499) If the preprocessing is not performed for whatever reason, the IMGPREPR plug-in function must return 0.
(500) The negative returned value indicates an error.
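As an illustration of the IMGPREPR contract above, the sketch below inverts the image into a static buffer (following the static-buffer recommendation given earlier) and reports the new image and its dimensions. The buffer limits and the choice of inversion as the preprocessing operation are illustrative assumptions only.

```c
#include <assert.h>
#include <stddef.h>

#define MAX_IMG_W 1280   /* illustrative limits for the static output buffer */
#define MAX_IMG_H 1024

static unsigned char g_out_image[MAX_IMG_W * MAX_IMG_H];

/* Sketch of an IMGPREPR plug-in function that inverts the 8-bit image.
 * Return: 1 if preprocessing is done; 0 if not; negative if error. */
static int plugin_imgprepr(void *p_image, int size_x, int size_y, int pitch,
                           void **pp_new_image, int *p_new_size_x,
                           int *p_new_size_y, int *p_new_pitch)
{
    const unsigned char *src = (const unsigned char *)p_image;
    int x, y;

    if (size_x <= 0 || size_y <= 0 || size_x > MAX_IMG_W || size_y > MAX_IMG_H)
        return 0;                        /* image not handled: no preprocessing */

    for (y = 0; y < size_y; y++)
        for (x = 0; x < size_x; x++)
            g_out_image[(size_t)y * (size_t)size_x + (size_t)x] =
                (unsigned char)(255 - src[(size_t)y * (size_t)pitch + (size_t)x]);

    *pp_new_image = g_out_image;         /* new image and its dimensions */
    *p_new_size_x = size_x;
    *p_new_size_y = size_y;
    *p_new_pitch  = size_x;              /* output rows are packed */
    return 1;                            /* preprocessing done */
}
```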
(501) IMGPREPR_PROGMD Plug-in Function
(502) This function is called to notify the plug-in that the scanner enters a programming mode.
(503) Function Prototype:
(504) void
(505) (*PLUGIN_PROGMD)(int progmd); /* Input: 1=enter; 0=normal exit; (-1)=abort */
(506) IMGPREPR_PROGBC Plug-in Function
(507) This function is called to notify the plug-in that the scanner just scanned a programming barcode, which can be used by the plug-in for its configuration purposes.
(508) Function Prototype:
(509)
int  /* Return: 1 if successful; 0 if barcode is invalid; negative if error */
(*PLUGIN_PROGBC)(unsigned char *bufferptr, int data_len);
(510) Details about Formatting Plug-in of the Illustrative Embodiment
(511) The purpose of the Formatting Plug-in is to provide a replacement for, or a complement to, the standard Focus data formatting software. The Formatting Plug-in configuration file must have the name format.plugin and be loaded into the /usr directory in the Focus scanner.
(512) The Formatting Plug-in can currently have the following plug-in functions:
(513) PREFORMAT; FORMAT_PROGMD; FORMAT_PROGBC.
(514) PREFORMAT Plug-in Function
(515) This function is called to perform a necessary transformation of the decoded barcode data prior to the data being actually formatted and sent out.
(516) Function Prototype:
(517)
int  /* Return: 1 if preformat is done; 0 if not; neg. if error */
(*PLUGIN_PREFORMAT)(
    DECODE_RESULT *decode_results,        /* Input: decode results */
    DECODE_RESULT *new_decode_results);   /* Output: preformatted decode results */
(518) If the PREFORMAT plug-in function is successful, it should return 1 and store the new decode result in the location in memory pointed to by new_decode_results.
(519) If the preformatting is not performed for whatever reason, the PREFORMAT plug-in function must return 0.
(520) The negative returned value indicates an error.
(521) For the details about the DECODE_RESULT structure, please refer to the section DECODE Plug-in Function.
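As an illustration of the PREFORMAT contract, the sketch below prepends a fixed prefix to the decoded data. The cut-down DECODE_RESULT stand-in (only the members touched here) and the prefix value are illustrative assumptions; a real plug-in would use the full structure from the development kit.

```c
#include <assert.h>
#include <string.h>

#define MAX_DECODED_DATA_LEN 4096

/* Simplified stand-in for the OEM DECODE_RESULT described earlier;
 * only the members used by this sketch are shown. */
typedef struct {
    unsigned char Data[MAX_DECODED_DATA_LEN];
    int Length;
} DECODE_RESULT;

/* Sketch of a PREFORMAT plug-in function that prepends a fixed site prefix
 * to the decoded data.
 * Return: 1 if preformat is done; 0 if not; negative if error. */
static int plugin_preformat(DECODE_RESULT *decode_results,
                            DECODE_RESULT *new_decode_results)
{
    static const char prefix[] = "SITE42:";      /* hypothetical prefix value */
    const int plen = (int)(sizeof(prefix) - 1);

    if (decode_results->Length < 0)
        return -1;                               /* malformed input: error */
    if (decode_results->Length + plen > MAX_DECODED_DATA_LEN)
        return 0;                                /* no room: leave data as-is */

    *new_decode_results = *decode_results;       /* copy the other members */
    memcpy(new_decode_results->Data, prefix, (size_t)plen);
    memcpy(new_decode_results->Data + plen,
           decode_results->Data, (size_t)decode_results->Length);
    new_decode_results->Length = decode_results->Length + plen;
    return 1;                                    /* preformat done */
}
```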
(522) FORMAT_PROGMD Plug-in Function
(523) This function is called to notify the plug-in that the scanner enters a programming mode.
(524) Function Prototype:
(525) void
(526) (*PLUGIN_PROGMD)(int progmd); /* Input: 1=enter; 0=normal exit; (-1)=abort */
(527) FORMAT_PROGBC Plug-in Function
(528) This function is called to notify the plug-in that the scanner just scanned a programming barcode, which can be used by the plug-in for its configuration purposes.
(529) Function Prototype:
(530)
int  /* Return: 1 if successful; 0 if barcode is invalid; negative if error */
(*PLUGIN_PROGBC)(unsigned char *bufferptr, int data_len);
(531) The method of system feature/functionality modification described above can be practiced in diverse application environments which are not limited to image-processing based bar code symbol reading systems described hereinabove. In general, any image capture and processing system or device that supports an application software layer and at least an image capture mechanism and an image processing mechanism would be suitable for the practice of the present invention. Thus, image-capturing cell phones, digital cameras, video cameras, and portable or mobile computing terminals and portable data terminals (PDTs) are all suitable systems in which the present invention can be practiced.
(532) Also, it is understood that the application layer of the image-processing bar code symbol reading system of the present invention, illustrated in
(533) The Image Capture and Processing System of the present invention described above can be implemented on various hardware computing platforms such as Palm, PocketPC, MobilePC, JVM, etc. equipped with CMOS sensors, trigger switches etc. In such illustrative embodiments, the 3-tier system software architecture of the present invention can be readily modified by replacing the low-tier Linux OS (described herein) with any operating system (OS), such as Palm, PocketPC, Apple OSX, etc. Furthermore, provided that the mid-tier SCORE subsystem described hereinabove supports a specific hardware platform equipped with an image sensor, trigger switch of one form or another etc., and that the same (or similar) top-tier Bar Code Symbol Reading System Application is compiled for that platform, any universal (mobile) computing device can be transformed into an Image Acquisition and Processing System having the bar code symbol reading functionalities of the system shown in
(534) Some Modifications which Readily Come to Mind
(535) In alternative embodiments of the present invention, illumination arrays 27, 28 and 29 employed within the Multi-Mode Illumination Subsystem 14 may be realized using solid-state light sources other than LEDs, such as, for example, visible laser diodes (VLDs) taught in great detail in WIPO International Publication No. WO 02/43195 A2, published on May 30, 2002, assigned to Metrologic Instruments, Inc., and incorporated herein by reference in its entirety as if set forth fully herein. However, when using VLD-based illumination techniques in the imaging-based bar code symbol reader of the present invention, great care must be taken to eliminate or otherwise substantially reduce speckle-noise generated at the image detection array 22 when using a coherent illumination source during object illumination and imaging operations. WIPO Publication No. WO 02/43195 A2, supra, provides diverse methods of and apparatus for eliminating or substantially reducing speckle-noise during image formation and detection when using VLD-based illumination arrays.
(536) While CMOS image sensing array technology was described as being used in the preferred embodiments of the present invention, it is understood that in alternative embodiments, CCD-type image sensing array technology, as well as other kinds of image detection technology, can be used.
(537) The bar code reader design described in great detail hereinabove can be readily adapted for use as an industrial or commercial fixed-position bar code reader/imager, having the interfaces commonly used in the industrial world, such as Ethernet TCP/IP for instance. By providing the system with an Ethernet TCP/IP port, a number of useful features will be enabled, such as, for example: multi-user access to such bar code reading systems over the Internet; control of multiple bar code reading system on the network from a single user application; efficient use of such bar code reading systems in live video operations; web-servicing of such bar code reading systems, i.e. controlling the system or a network of systems from an Internet Browser; and the like.
(538) While the illustrative embodiments of the present invention have been described in connection with various types of bar code symbol reading applications involving 1-D and 2-D bar code structures, it is understood that the present invention can be used to read (i.e. recognize) any machine-readable indicia, dataform, or graphically-encoded form of intelligence, including, but not limited to, bar code symbol structures, alphanumeric character recognition strings, handwriting, and diverse dataforms currently known in the art or to be developed in the future. Hereinafter, the term code symbol shall be deemed to include all such information carrying structures and other forms of graphically-encoded intelligence.
(539) Also, imaging-based bar code symbol readers of the present invention can also be used to capture and process various kinds of graphical images including photos and marks printed on driver licenses, permits, credit cards, debit cards, or the like, in diverse user applications.
(540) It is understood that the image capture and processing technology employed in bar code symbol reading systems of the illustrative embodiments may be modified in a variety of ways which will become readily apparent to those skilled in the art having the benefit of the novel teachings disclosed herein. All such modifications and variations of the illustrative embodiments thereof shall be deemed to be within the scope and spirit of the present invention as defined by the Claims to Invention appended hereto.