US10244180B2 - Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances - Google Patents
- Publication number
- US10244180B2 (application US15/083,493; US201615083493A)
- Authority
- US
- United States
- Prior art keywords
- imager
- imaging
- target
- determined
- illumination light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H04N23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N25/00 — Circuitry of solid-state image sensors [SSIS]; control thereof
- G06K7/10801 — Multidistance reading
- H04N23/72 — Combination of two or more brightness compensation controls
- H04N5/2352
- G06K7/10722 — Fixed beam scanning with a photodetector array or CCD
- G06K7/10752 — Exposure time control
- G06K7/10881 — Constructional details of hand-held scanners
- H04N23/45 — Generating image signals from two or more image sensors of different type or operating in different modes
- H04N23/51 — Housings (constructional details of cameras or camera modules)
- H04N23/56 — Cameras or camera modules provided with illuminating means
- H04N23/74 — Compensating scene brightness variation by influencing the scene brightness using illuminating means
- H04N23/76 — Compensating scene brightness variation by influencing the image signals
- H04N5/2256
- H04N5/2258
- H04N5/2354
- G06K2007/10524 — Hand-held scanners
- G06K2207/1013 — Multi-focal
- G06K7/10821 — Further details of bar or optical code scanning devices
Abstract
An imaging reader has near and far imagers for imaging illuminated targets to be read over a range of working distances. A range finder determines a distance to a target. A default imager captures a minor portion of an image of the target, and rapidly determines its light intensity level. At least one of the imagers is selected based on the determined distance and/or the determined light intensity level. The exposure and/or gain of the selected imager is set to a predetermined value, and an illumination level is determined, also based on the determined light intensity level and/or the determined distance. The selected imager, which has been set with the predetermined value, captures an image of the target, which has been illuminated at the illumination light level.
Description
The present invention relates generally to an imaging module and an imaging reader for, and a method of, expeditiously setting one or more imaging parameters, such as exposure and/or gain values, of at least one imager for imaging targets to be electro-optically read by image capture over a range of working distances.
Solid-state imaging systems or imaging readers have been used, in both handheld and/or hands-free modes of operation, to electro-optically read targets, such as one- and two-dimensional bar code symbol targets, and/or non-symbol targets, such as documents. A handheld imaging reader includes a housing having a handle held by an operator, and an imaging module, also known as a scan engine, supported by the housing and aimed by the operator at a target during reading. The imaging module includes an imaging assembly having a solid-state imager or imaging sensor with an imaging array of photocells or light sensors, which correspond to image elements or pixels in an imaging field of view of the imager, and an imaging lens assembly for capturing return light scattered and/or reflected from the target being imaged, and for projecting the return light onto the array to initiate capture of an image of the target. Such an imager may include a one- or two-dimensional charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) device and associated circuits for producing and processing electronic signals corresponding to a one- or two-dimensional array of pixel data over the imaging field of view. In order to increase the amount of the return light captured by the array, for example, in dimly lit environments, the imaging module generally also includes an illuminating light assembly for illuminating the target with illumination light in an illumination pattern for reflection and scattering from the target.
In some applications, for example, in warehouses, it is sometimes necessary for the same reader to read not only far-out targets, e.g., on products located on high overhead shelves, which are located at a far-out range of working distances on the order of thirty to fifty feet away from the reader, but also close-in targets, e.g., on products located at floor level or close to the operator, which are located at a close-in range of working distances on the order of less than two feet away from the reader. The reader may illuminate the far-out targets by emitting an illumination light at an intense, bright level, and capturing the return light from the illuminated far-out targets by employing a far-out imager having a relatively narrow field of view, and may illuminate the close-in targets by emitting the illumination light at a less intense, dimmer level, and capturing the return light from the illuminated close-in targets by employing a close-in imager having a relatively wide field of view. This variable illumination light level enables each such target to be more reliably imaged and successfully read.
However, the use of more than one imager and the variable illumination level presents a challenge to reader performance. For optimum reader performance, each target must be read by the correct imager; the correct imager should be set with one or more optimum imaging parameters, such as exposure values and/or gain values; and the illumination light should be set at an optimum illumination light level or value. These values are different for each imager, and vary, among other things, as a function of the working distance and of the illumination light level. Increasing the exposure and/or the gain values of the imager, as well as increasing the illumination light level, will increase the captured image brightness of the image of the target, and vice versa.
In order to set an imager with one or more optimum imaging parameters, it is known for the imager to capture an entire image from the target, to analyze the brightness of the entire image, to change the imaging parameters based on the analysis of the entire image, to capture another entire image from the target, and to repeat all the steps of this process for as many times as it takes until the brightness of the entire image is within an acceptable level. An automatic exposure controller (AEC) is typically used to control the imager's exposure, and an automatic gain controller (AGC) is typically used to control the imager's gain. A typical known strategy is to use exposure priority, in which the exposure is increased first until a maximum exposure time or threshold (typically around 4-8 ms in order to reduce hand jitter motion effects for a handheld reader) is reached. If the image brightness is still too low, then the gain is increased. This strategy maximizes the signal-to-noise ratio (SNR) of the imager, because the gain is only increased when necessary. Although generally satisfactory for its intended purpose, this known process is very slow and inefficient in practice, especially when more than one imager is involved, and when the entire known process has to be repeated for each additional imager. Reader performance can be deemed sluggish, and is unacceptable in many applications.
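By way of illustration only, the following sketch (not part of the original disclosure; the simulated imager, brightness target, and step sizes are assumptions) shows the general shape of such an exposure-priority loop, and why repeatedly capturing and analyzing full frames before each adjustment makes the process slow.

```python
# Illustrative sketch of the prior-art exposure-priority loop described above.
# The SimulatedImager and all numeric targets/step sizes are assumptions.

MAX_EXPOSURE_MS = 8.0   # jitter-limited exposure cap cited above (about 4-8 ms)
TARGET = 128.0          # assumed mid-scale brightness target for 8-bit pixels
TOLERANCE = 24.0

class SimulatedImager:
    """Stand-in for a real sensor: mean brightness grows with exposure and gain."""
    def __init__(self, scene_radiance=4.0):
        self.scene_radiance = scene_radiance
    def mean_brightness(self, exposure_ms, gain):
        return min(255.0, self.scene_radiance * exposure_ms * gain)

def exposure_priority_aec_agc(imager, exposure_ms=1.0, gain=1.0, max_iterations=20):
    """Capture and analyze repeatedly, raising exposure first and raising gain
    only once the jitter-limited exposure cap is reached (exposure priority)."""
    for _ in range(max_iterations):
        brightness = imager.mean_brightness(exposure_ms, gain)  # one full-frame capture per pass
        if abs(brightness - TARGET) <= TOLERANCE:
            break                                               # brightness acceptable
        if brightness < TARGET:
            if exposure_ms < MAX_EXPOSURE_MS:
                exposure_ms = min(exposure_ms * 1.5, MAX_EXPOSURE_MS)
            else:
                gain *= 1.5                                     # gain only raised when necessary
        else:
            if gain > 1.0:
                gain = max(gain / 1.5, 1.0)
            else:
                exposure_ms /= 1.5
    return exposure_ms, gain

if __name__ == "__main__":
    print(exposure_priority_aec_agc(SimulatedImager()))
```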
Accordingly, there is a need to expeditiously select the correct imager in such readers, to expeditiously set the selected imager with one or more optimum imaging parameters, and to expeditiously set the illuminating light assembly to illuminate the target with illumination light at an optimum illumination light level, in order to more rapidly, efficiently, reliably, and successfully read both far-out targets and close-in targets with the same reader.
The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
FIG. 1 is a side elevational view of a portable imaging reader operative for reading targets by image capture over an extended range of working distances in accordance with this disclosure.
FIG. 2 is a schematic diagram of various components, including imaging, illuminating, and range finding assemblies supported on an imaging module that is mounted inside the reader of FIG. 1.
FIG. 3 is a perspective view of the imaging module of FIG. 2 in isolation.
FIG. 4 is a cross-sectional view taken on line 4-4 of FIG. 3.
FIG. 5 is a cross-sectional view taken on line 5-5 of FIG. 3.
FIG. 6 is a vertical sectional view taken on line 6-6 of FIG. 3.
FIG. 7 is a flow chart depicting steps performed in accordance with a method of this disclosure.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and locations of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
The system and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
One aspect of the present disclosure relates to an imaging module, also known as a scan engine, for setting one or more imaging parameters, e.g., exposure and/or gain values, of at least one imager for imaging targets to be electro-optically read over a range of working distances away from the module. Another aspect of the present disclosure relates to an imaging reader having a housing for supporting the imaging module, and a light-transmissive window on the housing.
In both aspects, the imaging module comprises an imaging assembly including a near imager for imaging targets over a relatively wider imaging field of view, and a far imager for imaging targets over a relatively narrower imaging field of view. An illuminating light assembly illuminates targets with illumination light. A range finder determines a distance to a target. A main controller or programmed microprocessor controls a default one of the imagers, for example, the far imager, to capture a minor portion of an image of the target, determines a light intensity level of the captured minor portion of the image, selects at least one of the imagers based on the determined distance and/or the determined light intensity level, and controls the illuminating light assembly to illuminate the target with illumination light at an illumination light level based on the determined distance and/or the determined light intensity level. In addition, the main controller sets at least one of the imaging parameters of the selected at least one imager to a predetermined value based on the determined light intensity level and/or the determined distance, and controls the selected at least one imager, which has been set with the predetermined value, to capture an image of the target, which has been illuminated at the illumination light level.
A memory is accessible to the main controller and stores a plurality of predetermined values, e.g., exposure values and/or gain values, of the at least one imaging parameter for retrieval by the main controller from a look-up table. These predetermined values are different based on the determined light intensity level and/or the determined distance. Advantageously, the default imager is controlled by the main controller to operate at a predetermined frame rate, e.g., 60 frames per second (fps). The main controller determines the light intensity level from the minor portion of the image at a rate faster than the predetermined frame rate. By way of numerical example, if the image is subdivided into four quadrants, then the minor portion of the image can be one of these quadrants, in which case, the main controller can determine the light intensity level from the minor portion of the image at a rate that is four times faster, e.g., 240 fps, than the predetermined frame rate. Thus, the selected imager is more rapidly and efficiently set with optimum exposure values and/or gain values than heretofore.
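As a worked numerical example of the frame-rate claim above (the 1280 x 960 resolution is an assumption, not a figure from this disclosure), analyzing one quadrant touches a quarter of the pixels, so, if readout and analysis time scale with pixel count, the effective analysis rate is four times the 60 fps frame rate:

```python
# Worked example of the 60 fps -> 240 fps figure above, assuming analysis time
# scales with pixel count. The 1280 x 960 resolution is an assumed value.
full_cols, full_rows = 1280, 960
full_frame_rate_fps = 60.0

quadrant_pixels = (full_cols // 2) * (full_rows // 2)
fraction_of_frame = quadrant_pixels / float(full_cols * full_rows)  # 0.25

effective_rate_fps = full_frame_rate_fps / fraction_of_frame
print(effective_rate_fps)  # 240.0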
Still another aspect of the present disclosure relates to a method of setting one or more imaging parameters of at least one imager for imaging targets to be electro-optically read over a range of working distances. The method is performed by providing a near imager to image targets over a relatively wider imaging field of view, by providing a far imager to image targets over a relatively narrower imaging field of view, by providing an illuminator to illuminate targets with illumination light, and by determining a distance to a target. The method is further performed by controlling a default one of the imagers, e.g., the far imager, to capture a minor portion of an image of the target, by determining a light intensity level of the captured minor portion of the image, by selecting at least one of the imagers based on the determined distance and/or the determined light intensity level, by controlling the illuminating light assembly to illuminate the target with illumination light at an illumination light level based on the determined distance and/or the determined light intensity level, and by setting the at least one imaging parameter of the selected at least one imager to a predetermined value based on the determined light intensity level and/or the determined distance. The method is still further performed by controlling the selected at least one imager, which has been set with the predetermined value, to capture an image of the target, which has been illuminated at the illumination light level.
Reference numeral 30 in FIG. 1 generally identifies an ergonomic imaging reader configured as a gun-shaped housing having an upper barrel or body 32 and a lower handle 28 tilted rearwardly away from the body 32 at an angle of inclination, for example, fifteen degrees, relative to the vertical. A light-transmissive window 26 is located adjacent the front or nose of the body 32 and is preferably also tilted at an angle of inclination, for example, fifteen degrees, relative to the vertical. The imaging reader 30 is held in an operator's hand and used in a handheld mode in which a trigger 34 is manually depressed to initiate imaging of targets, especially bar code symbols, to be read in an extended range of working distances, for example, on the order of thirty to fifty feet, away from the window 26. Housings of other configurations, as well as readers operated in the hands-free mode, could also be employed.
As schematically shown in FIG. 2, and as more realistically shown in FIGS. 3-6, an imaging module 10 is mounted in the reader 30 behind the window 26 and is operative, as described below, for expeditiously setting one or more imaging parameters, e.g., exposure and/or gain values, of an imager for imaging targets to be electro-optically read by image capture through the window 26 over an extended range of working distances away from the module 10. A target may be located anywhere in a working range of distances between a close-in working distance (WD1) and a far-out working distance (WD2). In a preferred embodiment, WD1 is either at, or about eighteen inches away, from the window 26, and WD2 is much further away, for example, over about sixty inches away from the window 26. An intermediate working distance between WD1 and WD2 is about eighteen to about sixty inches away from the window 26. The module 10 includes an imaging assembly that has a near imaging sensor or imager 12, and a near imaging lens assembly 16 for capturing return light over a relatively wide imaging field of view 20, e.g., about thirty degrees, from a near target located in a close-in region of the range, e.g., from about zero inches to about eighteen inches away from the window 26, and for projecting the captured return light onto the near imager 12, as well as a far imaging sensor or imager 14, and a far imaging lens assembly 18 for capturing return light over a relatively narrow imaging field of view 22, e.g., about sixteen degrees, from a far target located in a far-out region of the range, e.g., greater than about sixty inches away from the window 26, and for projecting the captured return light onto the far imager 14. Although only two imagers 12, 14 and two imaging lens assemblies 16, 18 have been illustrated in FIG. 2, it will be understood that more than two could be provided in the module 10.
Each imager 12, 14 is a solid-state device, for example, a CCD or a CMOS imager having a one-dimensional array of addressable image sensors or pixels arranged in a single, linear row, or preferably a two-dimensional array of such sensors arranged in mutually orthogonal rows and columns, and operative for detecting return light captured by the respective imaging lens assemblies 16, 18 along respective imaging axes 24, 36 through the window 26. Each imaging lens assembly is advantageously a Cooke triplet, although other fixed focus and variable focus lens combinations can also be employed.
As also shown in FIGS. 2 and 4, an illuminating light assembly is also supported by the imaging module 10 and includes an illumination light source, e.g., at least one light emitting diode (LED) 40, stationarily mounted on an optical axis 42, and an illuminating lens assembly that includes an illuminating lens 44 also centered on the optical axis 42. The illuminating light assembly is shared by both imagers 12, 14. As further shown in FIGS. 2, 5 and 6, an aiming light assembly is also supported by the imaging module 10 and includes an aiming light source 46, e.g., a laser, stationarily mounted on an optical axis 48, and an aiming lens 50 centered on the optical axis 48. The aiming lens 50 may be a diffractive or a refractive optical element, and is operative for projecting a visible aiming light pattern on the target prior to reading.
As further shown in FIG. 2, the imagers 12, 14, the LED 40 and the laser 46 are operatively connected to a main controller or programmed microprocessor 52 operative for controlling the operation of these components. A memory 54 is connected and accessible to the controller 52. Preferably, the controller 52 is the same as the one used for processing the return light from the targets and for decoding the captured target images.
The aforementioned aiming light assembly also serves as a range finder to determine the distance to a target. The aiming axis 48 is offset from the imaging axes 24, 36 so that the resulting parallax provides target distance information. More particularly, the parallax between the aiming axis 48 and either one of the imaging axes 24, 36 provides range information from the pixel position of the aiming beam on one of the imaging sensor arrays. It is preferred to use the imaging axis 36 of the far imager 14, because the parallax error will be greater for the far imager 14 than for the near imager 12. It will be understood that other types of range finders, e.g., acoustic devices, can be employed to determine the target distance. Thus, the range finder locates the target to determine whether the target is in a close-in region, or an intermediate region, or a far-out region, of the range.
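A minimal triangulation sketch of this parallax-based range finding follows; the baseline, focal length, and pixel pitch below are assumed example values rather than parameters taken from this disclosure.

```python
# Minimal parallax/triangulation sketch of the range finder described above.
# Baseline, focal length, and pixel pitch are assumed example values.

BASELINE_MM = 20.0      # assumed offset between aiming axis 48 and imaging axis 36
FOCAL_LENGTH_MM = 8.0   # assumed effective focal length of the far imaging lens
PIXEL_PITCH_MM = 0.003  # assumed 3 um pixel pitch

def distance_from_aim_spot(spot_col, principal_col):
    """Estimate target distance (mm) from the column at which the aiming spot is
    imaged, using similar triangles: offset_on_sensor / f = baseline / distance."""
    disparity_px = abs(spot_col - principal_col)
    if disparity_px == 0:
        return float("inf")  # spot on axis: target effectively at infinity
    offset_on_sensor_mm = disparity_px * PIXEL_PITCH_MM
    return BASELINE_MM * FOCAL_LENGTH_MM / offset_on_sensor_mm

if __name__ == "__main__":
    # A spot imaged 100 columns off the principal point corresponds to about 533 mm here.
    print(round(distance_from_aim_spot(spot_col=740, principal_col=640)))
```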
In operation, the main controller 52 controls a default one of the imagers, for example, the far imager 14, to capture a minor or fractional portion of an image of the target. For example, if the image is comprised of a two-dimensional array of pixels arranged in a predetermined row number (M) of rows and a predetermined column number (N) of columns, then the minor portion of the image is comprised of a subarray of pixels arranged in a number of rows less than M and in a number of columns less than N. The subarray can be located anywhere on the image; for example, it can be in a corner or central area of the image, or it can be the area of the image covered by the aiming light pattern.
The main controller 52 then determines a light intensity level of the captured minor portion of the image. This is performed much faster than in the known art where the light intensity level had to be determined from the entire image. For example, if the default far imager 14 operates at a predetermined frame rate, e.g., 60 frames per second (fps), and if the image is subdivided into four quadrants, then the main controller 52 can determine the light intensity level from the minor portion or quadrant at a rate that is four times faster, e.g., 240 fps, than the predetermined frame rate.
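The following sketch (using NumPy; the region names and the mean-gray-level metric are illustrative choices, not requirements of this disclosure) shows one way the minor portion could be extracted and its light intensity level measured.

```python
# Sketch of extracting a minor portion (a subarray of fewer than M rows and
# fewer than N columns) and measuring its light intensity as a mean gray level.
import numpy as np

def minor_portion(frame, region="top_left_quadrant"):
    """Return a subarray of the M x N frame; any placement (a corner, the center,
    or the area covered by the aiming pattern) would serve equally well."""
    m, n = frame.shape
    if region == "top_left_quadrant":
        return frame[: m // 2, : n // 2]
    if region == "center":
        return frame[m // 4 : 3 * m // 4, n // 4 : 3 * n // 4]
    raise ValueError("unknown region")

def light_intensity_level(subarray):
    """A simple light intensity metric: the mean pixel value of the subarray."""
    return float(subarray.mean())

if __name__ == "__main__":
    fake_frame = np.random.randint(0, 256, size=(960, 1280), dtype=np.uint8)
    quadrant = minor_portion(fake_frame)
    print(quadrant.shape, light_intensity_level(quadrant))
```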
The main controller 52 then selects either the near imager 12 or the far imager 14 based on the determined distance and/or the determined light intensity level. Once the correct imager has been selected, the main controller 52 then sets the one or more imaging parameters, e.g., exposure and/or gain values, of the selected imager to a predetermined or optimum value based on the determined light intensity level and/or the determined distance. The aforementioned memory 54 stores a set of exposure values and/or a set of gain values in a look-up table 60. The main controller 52 has a gain controller 56 that can access the look-up table 60 and retrieve the correct gain value that corresponds to the determined distance and/or the determined light intensity level. The main controller 52 also has an exposure controller 58 that can access the look-up table 60 and retrieve the correct exposure value that corresponds to the determined distance and/or the determined light intensity level. Each set of exposure and gain values includes a range of different values, and is determined in advance by knowledge of the F-stop and responsivity of each imager as a function of distance away from the respective imager and/or the light intensity level.
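A minimal sketch of such a look-up table retrieval is shown below; the table layout and every numeric exposure/gain value are placeholders, since the actual values would be characterized in advance from each imager's F-stop and responsivity, while the 18-inch and 60-inch boundaries follow the example regions described in this disclosure.

```python
# Sketch of the look-up table retrieval described above. All exposure/gain
# numbers and the bucket names are illustrative placeholders.

LOOKUP_TABLE = {
    # (distance region, intensity bucket): (exposure in ms, gain)
    ("close_in",     "bright"): (0.5, 1.0),
    ("close_in",     "dim"):    (2.0, 2.0),
    ("intermediate", "bright"): (1.0, 1.0),
    ("intermediate", "dim"):    (4.0, 2.0),
    ("far_out",      "bright"): (2.0, 2.0),
    ("far_out",      "dim"):    (8.0, 4.0),
}

def classify_distance(distance_inches):
    if distance_inches <= 18:
        return "close_in"
    if distance_inches <= 60:
        return "intermediate"
    return "far_out"

def classify_intensity(mean_gray_level):
    return "bright" if mean_gray_level >= 100 else "dim"  # assumed threshold

def select_exposure_and_gain(distance_inches, mean_gray_level):
    """Retrieve the predetermined exposure and gain for the selected imager."""
    key = (classify_distance(distance_inches), classify_intensity(mean_gray_level))
    return LOOKUP_TABLE[key]

if __name__ == "__main__":
    print(select_exposure_and_gain(distance_inches=200, mean_gray_level=40))  # (8.0, 4.0) for a dim, far-out target
```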
The main controller 52 also controls the illuminating light assembly to illuminate the target with illumination light at an illumination light level based on the determined distance and/or the determined light intensity level. The main controller 52 energizes the illuminating light assembly to illuminate the target with illumination light of a relatively lesser intensity when the range finder determines that the target to be imaged and read is located in a close-in region of the range; or energizes the illuminating light assembly to illuminate the target with illumination light of a relatively greater intensity when the range finder determines that the target to be imaged and read is located in a far-out region of the range; or energizes the illuminating light assembly to illuminate the target with illumination light of a relatively intermediate intensity that is between the lesser intensity and the greater intensity when the range finder determines that the target to be imaged and read is located in an intermediate region that is between the close-in region and the far-out region of the range.
More particularly, the main controller 52 energizes the LED 40 with a variable electrical current to vary the intensity or level of the illumination light. By way of non-limiting numerical example, the electrical current is on the order of 30 milliamperes when the close-in region lies between about 0.0 inches and about eighteen inches from the window 26, is on the order of 150 milliamperes when the intermediate region lies between about eighteen inches and about sixty inches from the window 26, and is on the order of 600 milliamperes when the far-out region lies between about sixty inches and infinity from the window 26. The main controller 52 varies the intensity of the illumination light either as a continuous analog function, or as a stepwise, multi-level function, of the distance determined by the range finder.
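The sketch below illustrates both drive strategies; the stepwise currents and region boundaries follow the example values above, while the continuous quadratic profile is merely one assumed possibility.

```python
# Sketch of driving the illumination LED 40 as a function of the determined
# distance. The stepwise currents (30/150/600 mA) and region boundaries follow
# the example above; the continuous variant is an assumed analog profile.

def led_current_stepwise_ma(distance_inches):
    """Stepwise, multi-level illumination drive current."""
    if distance_inches <= 18:
        return 30.0    # close-in region
    if distance_inches <= 60:
        return 150.0   # intermediate region
    return 600.0       # far-out region

def led_current_continuous_ma(distance_inches, min_ma=30.0, max_ma=600.0):
    """One possible continuous profile: current grows with the square of distance
    (return light falls off roughly with distance squared), clamped to limits."""
    scaled = min_ma * (distance_inches / 18.0) ** 2
    return max(min_ma, min(max_ma, scaled))

if __name__ == "__main__":
    for d in (6, 30, 120):
        print(d, led_current_stepwise_ma(d), round(led_current_continuous_ma(d), 1))
```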
Once the correct imager has been selected by the main controller 52, and once the gain and/or exposure values for the selected imager have been set by the gain and exposure controllers 56, 58, and once the illumination light level has been determined by the main controller 52, then the selected imager is operated by the main controller 52 to capture an image of the target to be read. Reader performance is rapid and aggressive.
The flow chart of FIG. 7 depicts the method disclosed herein. In step 100, the minor portion of the image of the target is captured by one of the imagers by default, e.g., the far imager 14. In step 102, the light intensity level of the captured minor portion of the image is determined, and the distance to the target is also determined. Either the far imager 14 or the near imager 12 may be selected based on the determined distance and/or the determined light intensity level in step 104. The target is illuminated with illumination light whose illumination level is also based on the determined distance and/or the determined light intensity level in step 106. Then, optimum exposure and/or gain values are set for the selected imager in step 108 by referral to the look-up table 60. In step 110, the selected imager, whose exposure and/or gain has already been set, captures an image of the target, which has been illuminated at the illumination light level.
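The following end-to-end sketch mirrors steps 100-110; the stub classes and function names are placeholders standing in for the hardware and routines described above, not APIs defined by this disclosure, and exist only so the sketch runs.

```python
# End-to-end sketch of the flow chart of FIG. 7 (steps 100-110). The stubs and
# their behavior are assumptions for illustration only.
import random

class StubImager:
    def __init__(self, name): self.name = name
    def capture_minor_portion(self):
        return [random.randint(0, 255) for _ in range(1000)]
    def set_exposure(self, ms): self.exposure_ms = ms
    def set_gain(self, g): self.gain = g
    def capture_full_frame(self):
        return f"{self.name} frame @ {self.exposure_ms} ms, gain {self.gain}"

class StubRangeFinder:
    def distance_inches(self): return 120.0

class StubIlluminator:
    def set_level(self, distance, intensity):
        self.level_ma = 600.0 if distance > 60 else (150.0 if distance > 18 else 30.0)

def lookup(distance, intensity):
    # placeholder for the look-up table 60 sketched earlier
    return (8.0, 4.0) if distance > 60 else (2.0, 1.0)

def read_target(far_imager, near_imager, illuminator, range_finder):
    minor = far_imager.capture_minor_portion()               # step 100: minor portion via default (far) imager
    intensity = sum(minor) / len(minor)                      # step 102: light intensity level
    distance = range_finder.distance_inches()                # step 102: distance to target
    imager = near_imager if distance <= 18 else far_imager   # step 104: select near or far imager
    illuminator.set_level(distance, intensity)               # step 106: set illumination level
    exposure_ms, gain = lookup(distance, intensity)          # step 108: predetermined exposure/gain
    imager.set_exposure(exposure_ms)
    imager.set_gain(gain)
    return imager.capture_full_frame()                       # step 110: capture the illuminated target

if __name__ == "__main__":
    print(read_target(StubImager("far"), StubImager("near"), StubIlluminator(), StubRangeFinder()))
```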
In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has,” “having,” “includes,” “including,” “contains,” “containing,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a,” “has . . . a,” “includes . . . a,” or “contains . . . a,” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially,” “essentially,” “approximately,” “about,” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
Moreover, an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
Claims (20)
1. An imaging module for setting at least one imaging parameter of at least one imager for imaging targets to be electro-optically read over a range of working distances away from the module, the module comprising:
an imaging assembly including a near imager for imaging targets over a relatively wider imaging field of view, and a far imager for imaging targets over a relatively narrower imaging field of view, the near imager being separate from the far imager;
an illuminating light assembly for illuminating targets with illumination light;
a range finder for determining a distance to a target; and
a main controller for controlling only a default one of the imagers to capture a minor portion of an image of the target, for determining a light intensity level of the captured minor portion of the image, for selecting at least one of the imagers based on at least one of the determined distance and the determined light intensity level, for controlling the illuminating light assembly to illuminate the target with illumination light at an illumination light level based on the at least one of the determined distance and the determined light intensity level, for setting the at least one imaging parameter of the selected at least one imager to a predetermined value based on the at least one of the determined light intensity level and the determined distance, and for controlling the selected at least one imager, which has been set with the predetermined value, to capture an image of the target, which has been illuminated at the illumination light level.
2. The module of claim 1 , and a memory accessible to the main controller for storing a plurality of predetermined values of the at least one imaging parameter for retrieval by the main controller, and wherein the predetermined values are different based on the at least one of the determined light intensity level and the determined distance.
3. The module of claim 2 , wherein the main controller includes an exposure controller and a gain controller, wherein the plurality of predetermined values stored in the memory include a set of exposure values for retrieval by the exposure controller and a set of gain values for retrieval by the gain controller, wherein the exposure values are different based on the at least one of the determined light intensity level and the determined distance, and wherein the gain values are different based on the at least one of the determined light intensity level and the determined distance.
4. The module of claim 1 , wherein the default imager is controlled by the main controller to operate at a predetermined frame rate, and wherein the main controller determines the light intensity level from the minor portion of the image at a rate faster than the predetermined frame rate.
5. The module of claim 1 , wherein the image is comprised of pixels arranged in a predetermined row number of rows and a predetermined column number of columns, and wherein the minor portion of the image is comprised of pixels arranged in a number of rows less than the predetermined row number and in a number of columns less than the predetermined column number.
6. The module of claim 1 , wherein the main controller is operative for controlling the illuminating light assembly to illuminate the target with illumination light having a relatively lesser illumination light level when the range finder determines that the target to be imaged and read is located in a close-in region of the range, for controlling the illuminating light assembly to illuminate the target with illumination light of a relatively greater illumination light level when the range finder determines that the target to be imaged and read is located in a far-out region of the range, and for controlling the illuminating light assembly to illuminate the target with illumination light of a relatively intermediate illumination light level that is between the lesser illumination light level and the greater illumination light level when the range finder determines that the target to be imaged and read is located in an intermediate region that is between the close-in region and the far-out region of the range.
7. The module of claim 1 , wherein the main controller varies the illumination light level as one of a continuous and a stepwise function based on the at least one of the determined distance and the determined light intensity level.
8. The module of claim 1 , wherein each imager captures return light from the target along an optical axis, and wherein the range finder includes an aiming assembly for emitting an aiming beam along an aiming axis that is offset from at least one of the optical axes.
9. An imaging reader for reading targets by image capture over a range of working distances away from the reader, comprising:
a housing having a light-transmissive window; and
an imaging module for setting at least one imaging parameter of at least one imager for imaging the targets, the module including
an imaging assembly including a near imager for imaging targets over a relatively wider imaging field of view, and a far imager for imaging targets over a relatively narrower imaging field of view, the near imager being separate from the far imager,
an illuminating light assembly for illuminating targets with illumination light,
a range finder for determining a distance to a target, and
a main controller for controlling only a default one of the imagers to capture a minor portion of an image of the target, for determining a light intensity level of the captured minor portion of the image, for selecting at least one of the imagers based on at least one of the determined distance and the determined light intensity level, for controlling the illuminating light assembly to illuminate the target with illumination light at an illumination light level based on the at least one of the determined distance and the determined light intensity level, for setting the at least one imaging parameter of the selected at least one imager to a predetermined value based on the at least one of the determined light intensity level and the determined distance, and for controlling the selected at least one imager, which has been set with the predetermined value, to capture an image of the target, which has been illuminated at the illumination light level.
10. The reader of claim 9 , and a memory accessible to the main controller for storing a plurality of predetermined values of the at least one imaging parameter for retrieval by the main controller, and wherein the predetermined values are different based on the at least one of the determined light intensity level and the determined distance.
11. The reader of claim 10 , wherein the main controller includes an exposure controller and a gain controller, wherein the plurality of predetermined values stored in the memory include a set of exposure values for retrieval by the exposure controller and a set of gain values for retrieval by the gain controller, wherein the exposure values are different based on the at least one of the determined light intensity level and the determined distance, and wherein the gain values are different based on the at least one of the determined light intensity level and the determined distance.
12. The reader of claim 9 , wherein the default imager is controlled by the main controller to operate at a predetermined frame rate, and wherein the main controller determines the light intensity level from the minor portion of the image at a rate faster than the predetermined frame rate.
13. The reader of claim 9 , wherein the image is comprised of pixels arranged in a predetermined row number of rows and a predetermined column number of columns, and wherein the minor portion of the image is comprised of pixels arranged in a number of rows less than the predetermined row number and in a number of columns less than the predetermined column number.
14. A method of setting at least one imaging parameter of at least one imager for imaging targets to be electro-optically read over a range of working distances, the method comprising:
providing a near imager to image targets over a relatively wider imaging field of view;
providing a far imager to image targets over a relatively narrower imaging field of view, the near imager being separate from the far imager;
providing an illuminator to illuminate targets with illumination light;
determining a distance to a target;
controlling only a default one of the imagers to capture a minor portion of an image of the target;
determining a light intensity level of the captured minor portion of the image;
selecting at least one of the imagers based on at least one of the determined distance and the determined light intensity level;
controlling the illuminating light assembly to illuminate the target with illumination light at an illumination light level based on the at least one of the determined distance and the determined light intensity level;
setting the at least one imaging parameter of the selected at least one imager to a predetermined value based on the at least one of the determined light intensity level and the determined distance; and
controlling the selected at least one imager, which has been set with the predetermined value, to capture an image of the target, which has been illuminated at the illumination light level.
15. The method of claim 14 , and storing a plurality of predetermined values of the at least one imaging parameter, and configuring the predetermined values to be different based on the at least one of the determined light intensity level and the determined distance.
16. The method of claim 15 , and storing the plurality of predetermined values as a set of exposure values and as a set of gain values, and configuring the exposure values to be different based on the at least one of the determined light intensity level and the determined distance, and configuring the gain values to be different based on the at least one of the determined light intensity level and the determined distance.
17. The method of claim 14 , and controlling the default imager to operate at a predetermined frame rate, and wherein the light intensity level is determined from the minor portion of the image at a rate faster than the predetermined frame rate.
18. The method of claim 14 , and configuring the image of pixels arranged in a predetermined row number of rows and a predetermined column number of columns, and configuring the minor portion of the image of pixels arranged in a number of rows less than the predetermined row number and in a number of columns less than the predetermined column number.
19. The method of claim 14 , and illuminating the target with illumination light having a relatively lesser illumination light level upon the determination that the target to be imaged and read is located in a close-in region of the range, and with illumination light of a relatively greater illumination light level upon the determination that the target to be imaged and read is located in a far-out region of the range, and with illumination light of a relatively intermediate illumination light level that is between the lesser illumination light level and the greater illumination light level upon the determination that the target to be imaged and read is located in an intermediate region that is between the close-in region and the far-out region of the range.
20. The method of claim 14 , wherein the determining of the distance to the target is performed by emitting an aiming beam along an aiming axis that is offset from an optical axis along which the selected at least one imager captures the image.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/083,493 US10244180B2 (en) | 2016-03-29 | 2016-03-29 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
GB1704592.3A GB2550660B (en) | 2016-03-29 | 2017-03-23 | Imaging module, reader and method for expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
DE102017106509.1A DE102017106509A1 (en) | 2016-03-29 | 2017-03-27 | An imaging module and reader and method for rapidly setting imaging parameters of images to map targets to be read over a range of working distances |
CN201710192572.8A CN107241534B (en) | 2016-03-29 | 2017-03-28 | Imaging module and reader for rapidly setting imaging parameters of imager and method thereof |
US16/280,834 US20190182413A1 (en) | 2016-03-29 | 2019-02-20 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/083,493 US10244180B2 (en) | 2016-03-29 | 2016-03-29 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/280,834 Continuation US20190182413A1 (en) | 2016-03-29 | 2019-02-20 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170289421A1 (en) | 2017-10-05
US10244180B2 (en) | 2019-03-26
Family
ID=58687975
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/083,493 Active US10244180B2 (en) | 2016-03-29 | 2016-03-29 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
US16/280,834 Abandoned US20190182413A1 (en) | 2016-03-29 | 2019-02-20 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/280,834 Abandoned US20190182413A1 (en) | 2016-03-29 | 2019-02-20 | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
Country Status (4)
Country | Link |
---|---|
US (2) | US10244180B2 (en) |
CN (1) | CN107241534B (en) |
DE (1) | DE102017106509A1 (en) |
GB (1) | GB2550660B (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD982585S1 (en) * | 2013-12-05 | 2023-04-04 | Hand Held Products, Inc. | Indicia scanner |
USD791137S1 (en) * | 2015-07-22 | 2017-07-04 | Hand Held Products, Inc. | Scanner |
US10491790B2 (en) * | 2016-03-22 | 2019-11-26 | Symbol Technologies, Llc | Imaging module and reader for, and method of, variably illuminating targets to be read by image capture over a range of working distances |
US10701277B2 (en) * | 2017-05-31 | 2020-06-30 | Fotonation Limited | Automatic exposure module for an image acquisition system |
US10803265B2 (en) * | 2018-03-22 | 2020-10-13 | Symbol Technologies, Llc | Aiming light patterns for use with barcode readers and devices systems and methods associated therewith |
US10452885B1 (en) * | 2018-04-17 | 2019-10-22 | Zebra Technologies Corporation | Optimized barcode decoding in multi-imager barcode readers and imaging engines |
CN109375358B (en) * | 2018-11-28 | 2020-07-24 | 南京理工大学 | Differential phase contrast quantitative phase microscopic imaging method based on optimal illumination mode design |
CN111343389B (en) * | 2019-05-16 | 2021-09-10 | 杭州海康慧影科技有限公司 | Automatic exposure control method and device |
CN113679401B (en) * | 2020-05-18 | 2024-09-24 | 西门子(深圳)磁共振有限公司 | Imaging control method and system, imaging system and storage medium |
US11928546B2 (en) * | 2020-12-30 | 2024-03-12 | Datalogic IP Tech, S.r.l. | Dual illuminator as field of view identification and aiming |
US11381729B1 (en) | 2021-01-08 | 2022-07-05 | Hand Held Products, Inc. | Systems, methods, and apparatuses for focus selection using image disparity |
US12067450B2 (en) | 2022-08-29 | 2024-08-20 | Hand Held Products, Inc. | Near co-axial polarized illuminator apparatuses and uses thereof |
US12120431B2 (en) | 2022-09-30 | 2024-10-15 | Motorola Mobility Llc | Tag assisted image capture parameter generation |
US11775783B1 (en) * | 2022-10-31 | 2023-10-03 | Zebra Technologies Corporation | Compact opto-mechanical layout of long-range dual-camera bar-code imager |
US20240161439A1 (en) * | 2022-11-16 | 2024-05-16 | Motorola Mobility Llc | Tag based flash intensity determination for image capture |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5640001A (en) * | 1986-08-08 | 1997-06-17 | Norand Technology Corporation | Hand-held instant bar code reader having automatic focus control for operation over a range of distances |
US6189793B1 (en) * | 1992-06-12 | 2001-02-20 | Metrologic Instruments, Inc. | Automatic laser projection scanner with improved activation controlling mechanism |
CN1037379C (en) * | 1992-09-29 | 1998-02-11 | 欧林巴斯光学工业股份有限公司 | Combined two distance laser scanner |
US7128266B2 (en) * | 2003-11-13 | 2006-10-31 | Metrologic Instruments. Inc. | Hand-supportable digital imaging-based bar code symbol reader supporting narrow-area and wide-area modes of illumination and image capture |
US7490774B2 (en) * | 2003-11-13 | 2009-02-17 | Metrologic Instruments, Inc. | Hand-supportable imaging based bar code symbol reader employing automatic light exposure measurement and illumination control subsystem integrated therein |
US7163149B2 (en) * | 2004-03-02 | 2007-01-16 | Symbol Technologies, Inc. | System and method for illuminating and reading optical codes imprinted or displayed on reflective surfaces |
US20080265035A1 (en) * | 2007-04-25 | 2008-10-30 | Symbol Technologies, Inc. | Dual imaging lens system for bar code reader |
US8800874B2 (en) * | 2009-02-20 | 2014-08-12 | Datalogic ADC, Inc. | Systems and methods of optical code reading using a color imager |
US8146821B2 (en) * | 2009-04-02 | 2012-04-03 | Symbol Technologies, Inc. | Auto-exposure for multi-imager barcode reader |
JP5454348B2 (en) * | 2010-05-12 | 2014-03-26 | ソニー株式会社 | Imaging apparatus and image processing apparatus |
CN203482300U (en) * | 2013-09-24 | 2014-03-12 | 深圳市民德电子科技有限公司 | Image recognition device |
US9185306B1 (en) * | 2014-05-15 | 2015-11-10 | Symbol Technologies, Llc | Imaging module and reader for, and method of, illuminating and imaging targets to be read over an extended range of working distances |
US9679175B2 (en) * | 2014-06-13 | 2017-06-13 | The Code Corporation | Barcode reader ranging using targeting illumination |
CN104618661A (en) * | 2015-03-05 | 2015-05-13 | 广东欧珀移动通信有限公司 | Method and device for controlling light supplement of a camera |
US10244180B2 (en) * | 2016-03-29 | 2019-03-26 | Symbol Technologies, Llc | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
2016
- 2016-03-29: US US15/083,493, published as US10244180B2, status: Active
2017
- 2017-03-23: GB GB1704592.3A, published as GB2550660B, status: Active
- 2017-03-27: DE DE102017106509.1A, published as DE102017106509A1, status: Pending
- 2017-03-28: CN CN201710192572.8A, published as CN107241534B, status: Active
2019
- 2019-02-20: US US16/280,834, published as US20190182413A1, status: Abandoned
Patent Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4506150A (en) | 1982-02-19 | 1985-03-19 | Nippon Kogaku K.K. | Automatic focus adjusting device with a brightness dependent lens drive signal |
US5702059A (en) | 1994-07-26 | 1997-12-30 | Meta Holding Corp. | Extended working range dataform reader including fuzzy logic image control circuitry |
US5808296A (en) | 1996-03-22 | 1998-09-15 | Banner Engineering Corporation | Programmable detection sensor with means to automatically adjust sensor operating characteristics to optimize performance for both high gain and low contrast applications |
US7148923B2 (en) | 2000-09-30 | 2006-12-12 | Hand Held Products, Inc. | Methods and apparatus for automatic exposure control |
US20020176605A1 (en) | 2001-05-25 | 2002-11-28 | The Regents Of The University Of California | Method and apparatus for intelligent ranging via image subtraction |
US6621987B1 (en) | 2003-01-29 | 2003-09-16 | Novatek Microelectronics Corp. | Method of fast converging appropriate exposure value |
US20050140786A1 (en) | 2003-07-14 | 2005-06-30 | Michael Kaplinsky | Dual spectral band network camera |
US7546951B2 (en) | 2003-11-13 | 2009-06-16 | Metrologic Instruments, Inc. | Digital image capture and processing system employing real-time analysis of image exposure quality and the reconfiguration of system control parameters based on the results of such exposure quality analysis |
US20060011725A1 (en) | 2003-11-13 | 2006-01-19 | Michael Schnee | System for detecting image light intensity reflected off an object in a digital imaging-based bar code symbol reading device |
US20050199719A1 (en) * | 2004-01-21 | 2005-09-15 | Hepworth Paul J. | Graphical code reader having illumination leds of different wavelengths |
US20070102520A1 (en) | 2004-07-15 | 2007-05-10 | Symbol Technologies, Inc. | Optical code reading system and method for processing multiple resolution representations of an image |
US20060065732A1 (en) | 2004-09-30 | 2006-03-30 | Edward Barkan | System and method for reducing motion sensitivity in an imager based optical code reader |
US20060170816A1 (en) * | 2005-01-28 | 2006-08-03 | Silverstein D A | Method and system for automatically adjusting exposure parameters of an imaging device |
US7780089B2 (en) | 2005-06-03 | 2010-08-24 | Hand Held Products, Inc. | Digital picture taking optical reader having hybrid monochrome and color image sensor array |
US20070002163A1 (en) | 2005-06-29 | 2007-01-04 | Dariusz Madej | Imager settings |
US20070177048A1 (en) | 2006-01-31 | 2007-08-02 | Phil Van Dyke | Long exposure images using electronic or rolling shutter |
US7227117B1 (en) | 2006-05-30 | 2007-06-05 | Symbol Technologies, Inc. | High speed auto-exposure control |
US20080296379A1 (en) | 2007-05-31 | 2008-12-04 | The Code Corporation | Graphical code readers for balancing decode capability and speed by using image brightness information |
US20090001163A1 (en) | 2007-06-27 | 2009-01-01 | Symbol Technologies, Inc. | Imaging scanner with illumination and exposure control |
US20100147957A1 (en) * | 2008-12-17 | 2010-06-17 | Vladimir Gurevich | Range finding in imaging reader for electro-optically reading indicia |
US20100252635A1 (en) | 2009-04-02 | 2010-10-07 | Symbol Technologies, Inc. | Exposure control for multi-imaging scanner |
US20120000982A1 (en) | 2010-06-30 | 2012-01-05 | Datalogic Scanning, Inc. | Adaptive data reader and method of operating |
US20120091206A1 (en) | 2010-10-15 | 2012-04-19 | Symbol Technologies, Inc. | Method and apparatus for capturing images with variable sizes |
US20120126015A1 (en) * | 2010-11-22 | 2012-05-24 | MOTOROLA, INC., Law Department | Light seal gasket for using in imaging-based barcode reader |
US20120181338A1 (en) | 2011-01-18 | 2012-07-19 | Datalogic ADC, Inc. | Systems and methods for illuminating a scan volume of an optical code reader |
US20130083201A1 (en) | 2011-10-03 | 2013-04-04 | Raytheon Company | Methods and apparatus for determining misalignment of first and second sensors |
US20140362286A1 (en) | 2011-12-12 | 2014-12-11 | Opticon, Inc. | Miniature imaging and decoding module |
US20130248602A1 (en) | 2012-03-21 | 2013-09-26 | Symbol Technologies, Inc. | Apparatus for and method of controlling imaging exposure of targets to be read |
US20150244923A1 (en) | 2014-02-21 | 2015-08-27 | Samsung Electronics Co., Ltd. | Method for acquiring image and electronic device thereof |
Non-Patent Citations (1)
Title |
---|
Utility U.S. Appl. No. 15/171,266, filed Jun. 2, 2016, entitled "Imaging Module and Reader for, and Method of, Expeditiously Setting Imaging Parameters of an Imager Based on the Imaging Parameters Previously Set for a Default Imager" (25 pages). |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190182413A1 (en) * | 2016-03-29 | 2019-06-13 | Symbol Technologies, Llc | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances |
US11227173B2 (en) | 2020-02-18 | 2022-01-18 | Datalogic IP Tech, S.r.l. | Virtual-frame preprocessing for optical scanning |
US11675986B2 (en) | 2020-02-18 | 2023-06-13 | Datalogic IP Tech, S.r.l. | Virtual-frame preprocessing for optical scanning |
US11893450B2 (en) | 2021-12-06 | 2024-02-06 | Datalogic IP Tech, S.r.l. | Robust optical aimer for triangulation-based distance measurement |
Also Published As
Publication number | Publication date |
---|---|
DE102017106509A1 (en) | 2017-10-05 |
US20190182413A1 (en) | 2019-06-13 |
CN107241534A (en) | 2017-10-10 |
US20170289421A1 (en) | 2017-10-05 |
GB2550660B (en) | 2020-09-02 |
GB2550660A (en) | 2017-11-29 |
CN107241534B (en) | 2020-01-07 |
GB201704592D0 (en) | 2017-05-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10244180B2 (en) | Imaging module and reader for, and method of, expeditiously setting imaging parameters of imagers for imaging targets to be read over a range of working distances | |
US9646188B1 (en) | Imaging module and reader for, and method of, expeditiously setting imaging parameters of an imager based on the imaging parameters previously set for a default imager | |
US9800749B1 (en) | Arrangement for, and method of, expeditiously adjusting reading parameters of an imaging reader based on target distance | |
US10929623B2 (en) | Imaging module and reader for, and method of, reading targets by image capture over a range of working distances with multi-functional aiming light pattern | |
US8857719B2 (en) | Decoding barcodes displayed on cell phone | |
US10534944B1 (en) | Method and apparatus for decoding multiple symbology types | |
US9305197B2 (en) | Optimizing focus plane position of imaging scanner | |
US20120097744A1 (en) | Arrangement For And Method Of Reducing Vertical Parallax Between An Aiming Pattern And An Imaging Field Of View In A Linear Imaging Reader | |
US11009347B2 (en) | Arrangement for, and method of, determining a distance to a target to be read by image capture over a range of working distances | |
US9082033B2 (en) | Apparatus for and method of optimizing target reading performance of imaging reader in both handheld and hands-free modes of operation | |
US10671824B2 (en) | Decoding designated barcode in field of view of barcode reader | |
US10491790B2 (en) | Imaging module and reader for, and method of, variably illuminating targets to be read by image capture over a range of working distances | |
CN109154974B (en) | Apparatus and method for determining target distance and adjusting reading parameters of an imaging reader based on target distance | |
CN110390221B (en) | Optimized barcode decoding in a multi-imager barcode reader and imaging engine | |
WO2024220697A1 (en) | Scan engine ranging failure handling | |
US20190286858A1 (en) | Barcode Readers Having Multiple Image Sensors and Methods Associated Therewith | |
US11853838B2 (en) | Systems and approaches for reducing power consumption in industrial digital barcode scanners | |
US8686338B2 (en) | Method and apparatus for controlling output of the solid-state imager in a barcode reader | |
US9213880B2 (en) | Method of optimizing focus plane position of imaging scanner | |
US20240354527A1 (en) | Scan Engine Ranging Failure Handling |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SYMBOL TECHNOLOGIES, LLC, NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAN, CHINH;GOREN, DAVID P.;KUCHENBROD, HARRY E.;AND OTHERS;SIGNING DATES FROM 20160322 TO 20160329;REEL/FRAME:038122/0093 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |