CN104680113A - Image capturing device - Google Patents
- Publication number
- CN104680113A CN104680113A CN201410830194.8A CN201410830194A CN104680113A CN 104680113 A CN104680113 A CN 104680113A CN 201410830194 A CN201410830194 A CN 201410830194A CN 104680113 A CN104680113 A CN 104680113A
- Authority
- CN
- China
- Prior art keywords
- light source
- array
- sensor
- axis
- illumination
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Facsimile Scanning Arrangements (AREA)
- Image Input (AREA)
Abstract
The invention relates to an image capturing device, in particular of the imager type. The image capturing device comprises an image forming device and a lighting device. The image forming device comprises a sensor, which defines an optical reception axis, at least one reading distance, and the region framed by the sensor on a substrate at the at least one reading distance. The lighting device comprises an array of adjacent light sources, which defines an optical illumination axis. The device is characterized in that the light sources can be driven individually, each light source being suitable for illuminating a zone much smaller in size than the region framed by the sensor; the illumination axis does not coincide with the reception axis; and the device comprises a driver for the light sources, suitable for driving them so as to switch off at least the light sources illuminating outside the boundary of the region framed by the sensor on the substrate at the at least one reading distance.
Description
Divisional application statement
This application is a divisional application of Chinese invention patent application No. 201080066573.4, filed on March 11, 2010 and entitled "Image capture device".
Technical field
The present invention relates to an image capture device, and in particular to such a device in an optical information reading system or reader of the "imager" type.
Background art
Optical information readers of the imager type are known. Such a reader comprises an image capture device capable of capturing, or acquiring, the image of optical information present on a substrate of any type, including a display on which the optical information is shown by any electric or electronic device.
In the present description and in the appended claims, the expression "optical information" is used in its broadest sense to comprise one-dimensional, stacked and two-dimensional optical codes, wherein information is encoded in the shapes, sizes, colors and/or mutual positions of elements of at least two distinct colors, as well as alphanumeric characters, signatures, logos, seals, trademarks, labels, handwritten text and images in general, also in combination with one another, in particular when present on pre-printed forms, and including images suitable for identifying and/or selecting an object based on features of its shape and/or volume.
In the present description and in the appended claims, the term "light" is used in its broadest sense, indicating electromagnetic radiation of a wavelength, or range of wavelengths, not only in the visible spectrum but also in the ultraviolet and infrared spectra. Terms such as "color", "optical", "image" and "view" are likewise used in their broadest sense. In particular, coded information can be marked on a substrate with inks that are invisible, but responsive to ultraviolet or infrared rays.
An optical information reader of the imager type generally includes, besides the image capture device, devices having one or more further, different functions, or communicates with such devices.
Such further devices comprise, in this context: devices for processing the captured image, capable of extracting the information content from such an image or from a portion of it; storage devices; devices or interfaces for transferring the captured image and/or the extracted information content outside the reader; devices or interfaces for inputting setting data for the reader from an external source; devices for indicating to the user alphanumeric and/or graphic information relating, for example, to the operating state of the reader, the content of the read information and the like; devices for the manual input of control signals and data; and internal devices for power supply, or for obtaining a power supply signal from outside.
Moreover, further devices that can be included in an optical information reader of the imager type, or combined with it, comprise, in this context: an aiming device, which assists the operator in positioning the reader with respect to the optical information by displaying on the substrate a visual indication of the region framed by the image capture device, for example of the center of such region and/or of at least part of its edges and/or corners; an auxiliary device (rangefinder) for correctly focusing the image capture device, which displays on the substrate a luminous figure of variable shape, size and/or position between the focused and the non-focused condition, possibly also indicating the direction in which the image capture device and the substrate have to be moved relative to each other to reach the focused condition; a result indicating device, which displays on the substrate a luminous figure indicating, through a change of its shape, size, color and/or position, the positive or negative result of an attempt to capture an image and/or decode its information, possibly also displaying the reason for a negative result; and a device for detecting the presence of a substrate, and/or for measuring or estimating the reading distance, namely the distance of a reference object from the reader, in particular from the sensor of the image capture device. The aiming and focus indication functions can also be accomplished together by projecting suitable luminous figures, for example a pair of inclined bars or a pair of crosses, respectively, which cross each other at the center of the region framed by the image capture device, or overlap each other, only at the focusing distance.
The measurement or estimate of the distance is typically used by the reader to activate the decoding algorithm only when the optical information lies between the minimum and the maximum working distance, and/or to control a zoom device and/or a device for automatically changing the focusing distance of the image capture device (autofocus). Moreover, the measurement or estimate of the distance can be used where a digital restoration of the image is needed, since the degradation function, or PSF (Point Spread Function), of the optics of the image forming device depends on the reading distance. The measurement or estimate of the distance is also needed, for example, to compute the volume of an object.
Aiming and/or focus indicating devices are described, for example, in US 5,949,057, US 6,811,085, US 7,392,951 B2, US 5,331,176, US 5,378,883 and EP 1 466 292 B1.
Result indicating devices are described, for example, in the aforementioned US 5,331,176 and in EP 1 128 315 A1.
It is worth emphasizing that each of the aiming, focus indication, result indication, presence detection and reading distance measurement or estimation functions can also be accomplished in different ways, known per se, that do not make use of the projection of light onto the substrate. Merely by way of example: for aiming and/or focus indication, what is framed by the sensor can be displayed on a viewfinder; for result indication, acoustic indications, or visual indications not projected onto the substrate but directed towards the operator, can be used; for presence detection, distance measurement or estimation and/or focus assessment, electro-optical, radar or ultrasound devices and the like can be used.
An image capture device of the imager type comprises an image forming device or section, which comprises a sensor in the form of a regular arrangement or array of photosensitive elements - linear or, preferably, of the matrix type - capable of generating electric signals from optical signals, and which usually also comprises an image receiver optics, capable of forming on the sensor the image of the substrate carrying the optical information, or of a region of it.
The image capture device is characterized by a reception optical axis, defined by the centers of the elements of the receiver optics or, in the case of a single lens, by the centers of curvature of its optical surfaces, which defines the main working direction of the image capture device. The image capture device is also characterized by a working region, usually shaped approximately as a pyramidal frustum extending in front of the sensor. The working region - in other words, the region of space wherein optical information is correctly framed by the sensor and its image is sufficiently focused on the sensor - is usually characterized by a field of view, expressing the angular width of the working region about the reception axis, and by a depth of field, expressing the size of the working region along the direction of the reception axis. The depth of field therefore expresses the range, along the reception axis, between the minimum and the maximum useful distance between the reader and the substrate of the region framed by the sensor. The field of view can also be expressed in terms of a "horizontal" and a "vertical" field of view, in other words in terms of two angular widths in planes containing the reception axis and orthogonal to each other, so as to take the form factor of the sensor into account, or even in terms of four angular widths in half-planes at 90° to one another, in the case of a receiving system without any symmetry.
The working region - and therefore the field of view and the depth of field - can be fixed, or dynamically changed in size and/or proportions by zoom and/or autofocus systems known per se, such as electromechanical, piezoelectric or electro-optical actuators for moving one or more lenses or diaphragms, mirrors or other components of the receiver optics, for moving the sensor, and/or for changing the curvature of one or more lenses of the receiver optics, such as liquid or deformable lenses.
EP 1 764 835 A1 describes an optical sensor wherein each photosensitive element, or group of photosensitive elements, has an associated lens or other optical element, such as a diaphragm, a prismatic surface, a light guide or a GRIN lens. This document does not deal with the illumination of the region framed by the sensor as a whole.
Although image capture devices operating with ambient light alone are known, an image capture device of the imager type usually also comprises a lighting device or section, suitable for projecting one or more light beams, possibly with variable intensity and/or spectral content, onto the substrate carrying the optical information. The beam, or the set of beams, emitted by the lighting device defines an illumination optical axis, which is the mean direction of such single or composite beam, as an axis of symmetry of the beam in at least one plane and, in the case of a two-dimensional array, usually in two orthogonal planes.
For the image capture device to operate correctly, the lighting device must be capable of illuminating the whole working region of the image forming device.
An image capture device similar to that of Fig. 4 of the above-mentioned US 5,378,883 - wherein, as illustrated in Fig. 1, the lighting device 90 is not coaxial with the image forming device 91, but is arranged at the side of the image forming device 91 and configured so that the illumination axis 92 of the illuminating beam 93 converges with the reception axis 94 - suffers from an intrinsic parallax error and, in the two-dimensional case, from an intrinsic perspective distortion error. Because of this error, the intersection between the substrate S and the illuminating beam 93 and the intersection between the substrate S and the working region 95 of the image forming device 91 are substantially concentric at most within a very small range of reading distances (approximately the distance of the substrate S partly shown in Fig. 1). Therefore, for the lighting device 90 to be able to illuminate the whole working region 95 of the image forming device 91, at most reading distances the illumination is over-redundant (cf. the distances of the substrates S₁ and S₂ partly shown in Fig. 1); in other words, the illumination extends outside the region framed by the sensor on the substrate, with a consequent waste of energy.
In some prior-art image capture devices, the parallax error is solved by making the lighting device and the image forming device coaxial.
US 5,319,182 describes an image capture device not of the imager type, but of the scan type, wherein the lighting device and the sensor are coaxial as a whole, since they are formed by a matrix in which emitters and photosensitive elements of the sensor, activated by a program, alternate. Such a device is potentially compact and flexible, but it also suffers from a significant problem of optical isolation between the emitters and the photosensitive elements: even providing, between the emitters and the photosensitive elements, isolators such as opaque partition walls or a projection optics with an anti-reflection treated rear surface, as suggested in that document, the intensity of the light emitted by the emitters and reflected, even to a minimal extent, onto the photosensitive elements by any surface is still far higher than the intensity of the light received from the substrate carrying the optical information. Moreover, arranging photosensitive and light-emitting elements on a single substrate entails a compromise in efficiency, since the material characteristics required for effective light-emitting elements conflict with those required for effective photosensitive elements.
In US 5,430,286, the coaxiality between the light emitted by the lighting device and the image forming device is obtained through a beam splitter. Because of the 50% power loss both along the illumination path and along the reception path, the result is a very bulky arrangement within the reader and a very low efficiency.
A similar system, which also suffers from the problem of bulk, is described in the aforementioned US 5,331,176, which uses a semi-transparent mirror instead of a beam splitter. This document also teaches adjusting the size of part of the illuminating beam, but through mechanically moved devices, which entail consumption and bulk within the reader. Moreover, such a solution cannot avoid the drawback of wasted illumination energy, since part of the illuminating beam is merely blocked.
US 2007/0158427 A1, which represents the closest prior art, describes in its Fig. 5b an illumination system comprising a pair of illumination arrays arranged on opposite sides of the sensor and associated with a larger working distance, and a pair of illumination arrays, also arranged on opposite sides of the sensor, associated with a smaller working distance. Since the beams emitted by the pair of arrays associated with the larger working distance are directed and sized so as to illuminate, as a whole, at least the entire region framed by the sensor substantially uniformly at the maximum distance, at that distance and, even more so, at shorter reading distances the illumination by these arrays is over-redundant; in other words, it extends outside the region framed by the sensor. A similar drawback occurs for the pair of arrays associated with the smaller working distance. The efficiency of the device of this document is therefore very low, and it is hardly suitable for battery-powered portable readers, for which energy saving is an important requirement. The document also teaches switching on only one array of each pair to avoid reflection problems from the substrate - thus falling back into the case of systems suffering from parallax and perspective distortion errors - or switching on both arrays of a pair when the reading distance is unknown. The document further describes another pair of illuminators, arranged on the other two sides of the sensor, for illuminating a thin line for reading one-dimensional codes, and four illuminators, arranged at the corners of the sensor, for aiming at a region of interest.
Summary of the invention
The technical problem at the basis of the present invention is to provide an efficient image capture device, and more specifically such a device for an optical information reader of the imager type, which in particular is free from parallax error, does not provide over-redundant illumination extending outside the region framed by the sensor, and avoids any possibility of optical interference between light sources and photosensitive elements.
In a first aspect thereof, the present invention relates to an image capture device of the imager type, comprising:
– an image forming device, comprising a sensor which comprises a one-dimensional or two-dimensional array of photosensitive elements, and which defines a reception optical axis, at least one reading distance, and the region framed by the sensor on a substrate at said at least one reading distance;
– a lighting device, comprising an array of adjacent light sources, which defines an illumination optical axis;
characterized in that:
– the light sources can be driven individually, each light source being suitable for illuminating a zone of size much smaller than the size of the region framed by the sensor,
– the illumination axis does not coincide with the reception axis,
– the image capture device comprises a driver of the light sources, suitable for driving the light sources so as to switch off at least the light sources illuminating outside the boundary of the region framed by the sensor on the substrate at said at least one reading distance.
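Purely as an illustration of this characterizing driving logic, the sketch below models it under assumed geometry (a single row of sources, a small baseline between the two non-coincident axes, one beam angle per source); all names and numeric values are hypothetical, not taken from the patent:

```python
import math

# Assumed geometry, for illustration only: reception axis = z axis; the
# array of sources is displaced by BASELINE along x and its illumination
# axis is inclined by TILT; each source covers PITCH radians.
BASELINE = 0.02   # m, offset between illumination and reception axes
TILT     = -0.05  # rad, inclination of the illumination axis
PITCH    = 0.004  # rad, angular pitch of a single light source
N        = 64     # number of sources in the (one-dimensional) array
FOV_H    = 0.10   # rad, half field of view of the sensor

def spot_x(i, d):
    """x position, on a substrate at reading distance d, of the zone lit by source i."""
    return BASELINE + d * math.tan(TILT + (i - N / 2) * PITCH)

def sources_to_enable(d):
    """Keep on only the sources whose zone lies inside the region framed at d."""
    half_width = d * math.tan(FOV_H)  # framed region: |x| <= half_width
    return [i for i in range(N) if abs(spot_x(i, d)) <= half_width]
```

At short reading distances the enabled subset shifts towards one end of the array, reproducing the parallax behavior discussed in the background section above.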
In the present description and in the appended claims, the expression "reception optical axis" is meant to indicate the direction defined by the centers of the elements of the receiver optics or, in the case of a single lens, by the centers of curvature of its optical surfaces.
In the present description and in the appended claims, the expression "illumination optical axis" is meant to indicate the mean direction of the maximum illuminating beam, namely the beam emitted by the lighting device when all the light sources of the array are switched on, apart from the possible different angular spread of the light sources at the opposite ends of the array.
It should be noted that, in the present description and in the appended claims, the term "axis" is used for the sake of simplicity, but in both cases it is in practice a half-axis.
In the present description and in the appended claims, the term "adjacent" is meant to indicate that between the light sources there are no components having a function different from the emission function and/or ancillary to it, such as the addressing, driving, heat dissipation or optical isolation of the light sources; the term must therefore not be construed in the limiting sense of indicating that the light sources are in mutual contact.
In the present description and in the appended claims, the term "boundary" of the region framed by the sensor on the substrate is meant to indicate rows and columns of thickness at most equal to that of the zone illuminated by a single light source of the array. In other words, the term takes into account the fact that the light sources are in any case finite in number and that the zone illuminated by each of them has a finite size, which therefore defines the resolution limit of the illumination system with respect to the geometric boundary of the region framed by the sensor.
Each individually drivable light source preferably comprises a single illuminating element, but it may comprise more than one illuminating element.
Preferably, said at least one reading distance comprises a plurality of reading distances within the depth of field, in other words a plurality of reading distances comprised between a minimum reading distance and a maximum reading distance, inclusive.
The reading distances at which the driver is suitable for driving the light sources so as to switch off at least those illuminating outside the boundary of the region framed by the sensor on the substrate can be discrete, or can vary continuously within the depth of field.
Typically, in order to increase the depth of field and/or to better define the direction and/or shape in space of the region framed by the sensor, the image forming device also comprises at least one receiver optics with fixed or variable focal length. Such receiver optics can in particular comprise a single lens or group of lenses shared by the photosensitive elements of the sensor, and/or an array of lenses, prismatic surfaces and/or diaphragms, each associated with a photosensitive element or subgroup of elements, for example as described in the aforementioned EP 1 764 835 A1.
Typically, the image forming device comprises a zoom and/or autofocus system; in this case, the region framed by the sensor changes within the depth of field in a manner not directly proportional to the reading distance.
The reception axis can coincide with the normal to the surface of the sensor, or be inclined at an angle with respect to such normal.
Preferably, the light source array is associated with at least one projection lens, in order to increase the depth of focus on the image side and/or to incline the illumination axis with respect to the normal to the light source array. More specifically, each light source can be provided with its own projection lens, and/or at least one single projection lens shared by the light sources of the array can be provided.
Each projection lens can be replaced by, or combined with, other optical elements such as diaphragms, prismatic surfaces, light guides and/or GRIN lenses, similarly to what is described in the aforementioned EP 1 764 835 A1.
The illumination axis can coincide with the normal to the plane of the array, or be inclined at an angle with respect to such normal.
In some embodiments, the illumination axis is parallel to, and spaced from, the reception axis.
In other embodiments, the illumination axis is inclined with respect to the reception axis. When the two axes are inclined, they can be incident, usually crossing in front of the sensor, or they can be skew (non-coplanar).
In some embodiments, the array and the sensor are coplanar; they can then advantageously be formed on a same support of a same printed circuit board, or on a same integrated circuit substrate.
In other embodiments, the array and the sensor lie in mutually inclined planes, so that an angle of inclination between the illumination axis and the reception axis is advantageously determined, or contributed to, thereby.
Preferably, the light sources of the array, if all switched on, are suitable for illuminating, as a whole, a zone larger than the largest region framed by the sensor within the depth of field.
More specifically, the number of light sources is selected such that, when a single light source is switched on/off, the zone illuminated as a whole by the lighting device on the substrate undergoes a sufficiently small percentage change.
Preferably, the percentage change is less than or equal to 15%, more preferably less than or equal to 10%, still more preferably less than or equal to 5%.
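As a back-of-envelope reading of this preference (assuming, for simplicity only, equal and non-overlapping zones per source), toggling one of M lit sources changes the overall illuminated zone by roughly 1/M, so the thresholds translate into minimum counts of lit sources:

```python
import math

def min_lit_sources(max_fraction):
    """Smallest number of lit sources keeping the per-source change below max_fraction."""
    return math.ceil(1.0 / max_fraction)

assert min_lit_sources(0.15) == 7    # <= 15% change per toggled source
assert min_lit_sources(0.10) == 10   # <= 10% change
assert min_lit_sources(0.05) == 20   # <= 5% change
```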
Preferably, the driver is suitable for never switching on all the light sources of the array at any reading distance.
More preferably, the driver is suitable for keeping at least one light source at the edge of the array switched off at each reading distance. In other words, the driver is suitable for never switching on, at any reading distance, both light sources arranged at the opposite ends of the array.
Preferably, the driver is suitable for switching off all the light sources that, at the reading distance, illuminate outside the boundary of the region framed by the sensor, and for switching on, in an operating mode, all the light sources illuminating within the boundary of the region framed by the sensor.
Preferably, the driver is suitable for switching on, in an operating mode, only the light sources that illuminate at least one region of interest within the region framed by the sensor.
The driver can be responsive to a measurer of the reading distance, or to a device for estimating the reading distance.
The measurer of the reading distance can be a device distinct from the reader and communicating with it, such as a photocell system, a device based on phase measurement or on the time of flight of a beam from a laser or LED, in visible light or IR (infrared), a device based on radar or ultrasound principles, and the like.
Preferably, however, the driver is suitable for switching on, in an operating mode, light sources of the array selected so as to project a luminous figure for evaluating the reading distance. The reading distance is measured or estimated based on the shape and/or position of the image formed on the sensor by the light emitted by at least some of the light sources of the array.
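The estimation formula is not spelled out at this point of the text; one plausible reading, sketched below purely as an assumption, is plain triangulation: with a known baseline b between the two axes and a known beam angle theta of the switched-on source, the viewing angle phi recovered from the pixel at which the spot is imaged yields the distance.

```python
import math

def distance_from_spot(b, theta, phi):
    """Reading distance from the parallax between the emitted beam angle (theta)
    and the viewing angle of the imaged spot (phi); b is the axis offset in m."""
    return b / (math.tan(phi) - math.tan(theta))

# e.g. b = 0.02 m, theta = -0.05 rad, spot imaged at phi = 0.03 rad:
print(distance_from_spot(0.02, -0.05, 0.03))  # ~0.25 m
```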
The driver can be suitable for switching on, in an operating mode, light sources of the array selected so as to illuminate, as a whole, a figure for aiming at the region framed by the sensor and/or at least one region of interest thereof.
The driver can be suitable for switching on, in an operating mode, light sources of the array selected so as to illuminate, as a whole, a figure indicating, within the region framed by the sensor, the result of an attempt to capture an image.
The light sources of the array can preferably also be driven individually in terms of emission intensity.
Preferably, the array of light sources is suitable for emitting light at more than one wavelength. In particular, the array can comprise a first subset of light sources suitable for emitting at a first wavelength and at least one second subset of light sources suitable for emitting at a second wavelength different from the first. Alternatively, each light source can be suitable for selectively emitting light at different wavelengths.
With such provisions it is possible, for example, to adjust the color of the illumination based on the color of the optical code and of its background. Moreover, different indications of the result of a capture or reading attempt can easily be provided, for example by projecting a green luminous figure for a positive result and a red luminous figure for a negative result. Furthermore, the luminous figure can be differentiated for aiming at a plurality of regions of interest, also to ease their selection by the user.
The array of light sources can be one-dimensional or two-dimensional.
The array of light sources can be flat or curved. By arranging the light sources on a curved surface, the lengths of the optical paths between the individual light sources and the substrate can be made equal or substantially equal, thus compensating for the different attenuations that the light emitted by the sources would undergo in the case of a flat array, and therefore obtaining uniform illumination at the reading distance. A curved arrangement can also be used to determine, or contribute to determining, different directions of the principal beams of the different light sources.
Preferably, the number of light sources of the array is greater than or equal to 32 in the one-dimensional case, or greater than or equal to 32 × 32 in the two-dimensional case.
More preferably, the number of light sources of a two-dimensional array is selected from the group comprising 32 × 32, 64 × 64, 44 × 32 and 86 × 64, and in the one-dimensional case from the group comprising 32 and 64.
In an embodiment, the driver is suitable for switching off all the light sources illuminating at least outside the boundary of a first half of the region framed by the sensor at the reading distance, and the image capture device further comprises a second array of individually drivable adjacent light sources, defining a second illumination axis not coinciding with the reception axis; the driver of the light sources is suitable for driving the light sources of the second array so as to switch off the light sources illuminating at least outside the boundary of a second half of the region framed by the sensor, complementary to the first half.
In an embodiment, the image capture device further comprises a second array of individually drivable adjacent light sources, defining a second illumination axis not coinciding with the reception axis; the driver of the light sources is suitable for driving the light sources of the second array so as to switch off the light sources illuminating at least outside the boundary of the region framed by the sensor.
In an embodiment, the driver is suitable for determining in real time which light sources of the array to switch on or off, respectively, at least as a function of the reading distance.
In an embodiment, the real-time determination is performed analytically, in other words through analytic formulas depending only on known (design) geometric parameters of the reader, in particular of its image forming device, of its lighting device and/or of their spatial arrangement, including the spatial arrangement of their components or subcomponents.
Preferably, the analytic method comprises the steps of:
– computing the coordinates, in a first reference system associated with the receiving device, of specified points of the region framed by the sensor on the substrate;
– performing a coordinate transformation into a second reference system associated with the lighting device; and
– computing, in the second reference system, the light sources of the array that illuminate the corresponding specified points.
Preferably, in the abovementioned steps, one or more of formulas (1) to (31) discussed below are applied.
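The formulas themselves appear later in the patent; the sketch below only mirrors the structure of the three steps, with an assumed rigid transform (rotation R, translation t) between the two reference systems and an idealized angular-pitch model of the array - every name and value is illustrative, not a reproduction of formulas (1) to (31).

```python
import math
import numpy as np

def framed_corners(d, beta_h, beta_v):
    """Step 1: corners of the framed region in the receiver reference system
    (symmetric half-fields beta_h, beta_v assumed; z = reading distance d)."""
    x, y = d * math.tan(beta_h), d * math.tan(beta_v)
    return np.array([[x, y, d], [-x, y, d], [-x, -y, d], [x, -y, d]])

def to_lighting_frame(points, R, t):
    """Step 2: the same points expressed in the lighting-device reference system."""
    return (R @ points.T).T + t

def covering_sources(points, pitch, n):
    """Step 3: array indices of the sources whose beams reach those points."""
    i = np.round(np.arctan2(points[:, 0], points[:, 2]) / pitch + n / 2).astype(int)
    j = np.round(np.arctan2(points[:, 1], points[:, 2]) / pitch + n / 2).astype(int)
    return list(zip(i, j))
```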
In an embodiment, the real-time determination is performed, at least in part, through an empirical or adaptive method, which comprises driving a subset of light sources on in a recursive manner, assessing the position and/or extent of the zone illuminated on the substrate with respect to the region framed by the sensor, and adapting the subset of light sources based on such assessment.
The initial subset of light sources can be predetermined analytically, empirically or adaptively; the empirical or adaptive method is thus used, for example, to correct the inaccuracies of the light source array of each individual image capture device of a batch production.
In an embodiment, said recursive adaptation of the switched-on subset of light sources is performed along a plurality of radially spaced directions.
In an embodiment, the switched-on subset of light sources is determined by interpolating the positions of the outermost switched-on light sources along said plurality of directions.
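A runnable toy version of this recursive adaptation, restricted to a single direction for brevity and reusing the assumed spot_x() model of the earlier sketch, could look as follows; the actual method would repeat this along several radial directions and interpolate the contour between them:

```python
def adapt_edge(spot_x, half_width, i_start, i_max, d, steps=10):
    """Step the outermost lit source until its zone straddles the framed border."""
    i = i_start
    for _ in range(steps):
        if spot_x(i, d) > half_width and i > 0:
            i -= 1                                  # light spills outside: retract
        elif i < i_max and spot_x(i + 1, d) <= half_width:
            i += 1                                  # next source still inside: extend
        else:
            break                                   # border source found
    return i
```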
In an alternative embodiment, the driver is suitable for determining which light sources to switch on or off, respectively, as a function of the reading distance by reading them from a look-up table.
The driver can be suitable for building said look-up table once and for all (una tantum), in particular through an analytic or empirical/adaptive method similar to those of the real-time determination.
Alternatively, the driver can be suitable for receiving said look-up table as an input, the table being built once and for all by a separate processing device, through an analytic or empirical/adaptive method similar to those of the real-time determination.
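In code, the difference between the two variants reduces to where the table comes from; the sketch below (the distance bins and helper names are assumptions) builds it once, for example from the analytic determination, and only reads it back at run time:

```python
DISTANCES = [0.10, 0.25, 0.50, 1.00]   # m, assumed discrete reading distances

def build_table(sources_to_enable):
    """One-off ('una tantum') construction of the distance -> subset table."""
    return {d: sources_to_enable(d) for d in DISTANCES}

def subset_for(table, d):
    """Run-time access: the nearest tabulated distance is used."""
    nearest = min(table, key=lambda k: abs(k - d))
    return table[nearest]
```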
If the determination of the light sources to be switched on or off, respectively, as a function of the reading distance takes place once and for all in a separate processing device, it is preferably implemented through a computer program, parametric in one or more quantities of the image capture device. In this way, advantageously, the same computer program can be used, for example, for a range of reader models.
Such a computer program represents a further aspect of the present invention.
The light sources of the array are preferably solid-state or organic light sources; more preferably, they are selected from the group comprising LEDs, OLEDs, micro LEDs and micro lasers.
In a further aspect thereof, the present invention relates to an optical information reader of the imager type, comprising an image capture device as described above.
In a further aspect thereof, the present invention relates to a computer-readable storage device comprising the aforementioned program.
In a further aspect thereof, the present invention relates to an optical reader comprising an array of individually drivable adjacent light sources, and a driver suitable for driving the light sources of the array in an illumination mode, in an aiming mode and in a reading result indication mode.
Preferably, said driver is also suitable for driving the light sources in an optical distance measurement or estimation mode.
Brief description of the drawings
Further features and advantages of the present invention will be better highlighted by the description of some embodiments thereof, made with reference to the accompanying drawings, wherein:
Fig. 1, already described in detail above, illustrates a prior-art image capture device wherein the lighting device and the image forming device are not coaxial,
Fig. 2 schematically illustrates an optical information reader of the imager type according to the invention,
Fig. 3 schematically illustrates an image capture device according to the invention,
Fig. 4 illustrates, at an enlarged scale, part of an array of micro LEDs with a pre-collimation lens on each light source,
Fig. 5 illustrates the illumination by a flat array of light sources of a lighting device not coaxial with the image forming device,
Fig. 6 illustrates the illumination by a curved array of light sources of a lighting device not coaxial with the image forming device,
Figs. 7 to 9 are block diagrams illustrating some embodiments of the driving of the light sources of the lighting device,
Figs. 10 to 17 are representations of the geometry of the image capture device or of parts thereof,
Fig. 18 is a block diagram illustrating a further embodiment of the driving of the light sources of the lighting device,
Fig. 19 is a graphic representation of the embodiment of the driving of the light sources of the lighting device of Fig. 18,
Fig. 20 is a graphic representation of an embodiment of the driving of the light sources of the lighting device,
Figs. 21a, 21b and 21c illustrate an overall block diagram detailing the embodiment of the driving of the light sources of the lighting device of Fig. 20,
Figs. 22 to 27 are schematic representations of various embodiments of the image capture device,
Fig. 28 is a representation of the geometry of an embodiment of the lighting device of the image capture device,
Fig. 29 illustrates the light sources of an image capture device of an embodiment that are to be switched on to illuminate the whole region framed by the sensor at different working distances,
Figs. 30 to 37 schematically illustrate further functions of the lighting device of the image capture device,
Figs. 38 and 39 schematically illustrate further embodiments of the image capture device.
Detailed description of embodiments
Fig. 2 shows a block diagram of a reading system, or in short reader 1, of optical information of the imager type according to the invention.
The reader 1 comprises an image capture device 2 capable of capturing, or acquiring, the image of optical information C, represented in Fig. 2 by a two-dimensional optical code present on a substrate S.
The image capture device 2, described better hereinafter, comprises an image forming device or section 3, comprising a sensor 4 in the form of an array of photosensitive elements - linear or, preferably, of the matrix type as shown - capable of generating electric signals from optical signals, in other words from the light R emitted by the substrate S, as modulated by the graphic elements present on it, in particular by the code or other optical information C.
The image forming device 3 usually, even if not necessarily, also comprises an image receiver optics 5, capable of forming on the sensor 4 a sufficiently focused image of the substrate S containing the optical information C, or of a region thereof.
The image capture device 2 also comprises a lighting device or section 6, suitable for projecting an illuminating beam T onto the substrate S.
The reader 1 also comprises processing and/or control devices 7, capable of acquiring the information content from the image captured by the image capture device 2, or from a portion thereof, for example of decoding the two-dimensional code C, and of controlling the other components of the reader 1.
The processing and/or control devices 7 are known per se and comprise: hardware and/or software modules for processing the signals emitted by the sensor 4, such as filters, amplifiers, samplers and/or binarizers; modules for reconstructing and/or decoding the optical code, including modules for querying tables of possible codes and tables of any plain-text information associated with the possible codes; optical character recognition modules; and the like.
The captured images and/or the results of their processing, as well as the programming code of the reader 1, the values of its processing parameters and said look-up table, are usually stored in digital form in at least one temporary and/or mass storage device 8, possibly removable from the reader 1. The storage device 8 is also used as a working memory for the execution of the software algorithms.
The reader 1 can also comprise communication devices or interfaces 9, for communicating the captured image and/or the extracted information content outside the reader 1 and/or for inputting configuration data for the reader 1 from an external source.
The reader 1 also comprises at least one output device 10, for indicating to the user, for example, alphanumeric and/or graphic information relating to the operating state of the reader 1, the content of the read information and the like, and/or for displaying the image currently framed by the sensor 4. The output device 10 can alternatively or additionally comprise a printer, a speech synthesizer or other output devices for the aforementioned information.
The reader 1 also comprises at least one manual input device 11 for control signals and/or data, for example for configuring the reader 1, such as a keyboard or a plurality of keys or levers, arrow keys, a mouse, a touchpad, a touch screen, a voice control device and the like.
The reader 1 also comprises at least one power supply device 12, for supplying suitable voltage and current levels to its various components, using battery power or drawing a power supply signal from the mains or from an external device.
The reader 1 also comprises a driver 13 of the lighting device 6, described better hereinafter.
As described better hereinafter, the driver 13 and the lighting device 6 preferably implement, besides the function of illuminating the substrate S, or one or more regions of interest (ROI) thereof, for image capture by the image forming device 3, also the illumination functions of one or more of the following devices: an aiming device, a result indicating device, a device for detecting the presence of the substrate S, and a device for optically measuring or estimating the reading distance and/or the focus condition of the image capture device 2 (rangefinder).
The processing and/or control devices 7 are implemented by one or more processors, in particular one or more microprocessors or microcontrollers, and/or by circuits with discrete or integrated components.
Similarly, the driver 13 is implemented by one or more circuits with discrete or integrated components and/or by one or more processors, in particular one or more microprocessors or microcontrollers.
Moreover, although the processing and/or control devices 7 and the driver 13 are shown in Fig. 2 as separate devices, they can share one or more of such circuits and processors, and/or one or more devices implementing the storage device 8.
More generally, it should be understood that Fig. 2 illustrates the various blocks from a functional point of view. From a physical point of view, the various components of the above-described reader 1 can be made as distinct objects, provided that they communicate with one another, as schematically illustrated in Fig. 2, for the exchange of control, data and/or power supply signals. The connections can be made by cable and/or wirelessly.
The above-described reader 1 can therefore be made as a single object, wherein all the components are housed in a housing, not shown, having a shape and size suitable, for example, for fixed or portable use; the housing comprises at least one transparent region as a passage for the emitted light T and the received light R. The housing and/or one or more internal supports are also configured to support the components of the image capture device 2 and of the lighting device 6 in a predetermined mutual relationship.
Otherwise, the output device 10 and/or the manual input device 11 and/or the processing and/or control devices 7 can be implemented, at least in part, by a computer.
Moreover, the lighting device 6 and the image forming device 3 can be made in separate housings, each with its own transparent region, and be constrained in space in a predetermined mutual relationship during the installation step of the reader or reading system 1.
Fig. 3 illustrates, in greater detail but still schematically, the image capture device 2 according to an embodiment of the invention.
The sensor 4 of the image forming device 3 of the image capture device 2 comprises an array of photosensitive elements 14, each providing an electric signal whose intensity is a function of the light impinging on it. By way of example, Fig. 3 shows a square two-dimensional sensor 4, but the sensor could also be rectangular, circular or elliptical. The sensor 4 can be made, for example, in C-MOS or CCD technology. Optionally, the sensor 4 can be driven so as to extract the signals generated by a subset of its photosensitive elements 14 and, as a limiting case, each individual photosensitive element 14 can be driven individually.
The receiver optics 5 of the image forming device 3 of the image capture device 2 is designed to form, on the sensor 4, the image of the substrate S containing the optical information C, or of a region thereof. The receiver optics 5 can comprise one or more lenses, one or more diaphragms, and refractive, reflective or diffractive optical elements; the receiver optics 5 can introduce distortions to modify the effective aspect ratio of the sensor 4. By way of example, the receiver optics 5 is shown in Fig. 3 as a single inverting lens placed in a plane parallel to the sensor 4 and coaxial with it.
The image forming device 3 defines a working region 15 extending in front of the sensor 4. The working region 15 is the region of space wherein optical information C is correctly framed by the sensor 4 and its image is sufficiently focused on the sensor 4. Within this working region 15, the plane of best focus can be fixed or changed through an autofocus system. In the illustrated case of a square sensor 4, as a particular case of a rectangular sensor, the working region 15 is a pyramid or pyramidal frustum; in the case of a circular or elliptical sensor 4, the working region 15 is a cone or conical frustum; in the case of a one-dimensional sensor 4, the base of the pyramid becomes substantially thin, and the working region 15 can be regarded as substantially flat.
The image forming device 3 also defines the optical axis of the receiver optics 5, in short the reception axis Z. The reception axis Z is defined by the centers of the elements of the receiver optics 5 or, in the case of a single lens, by the centers of curvature of its optical surfaces. As will become clearer hereinafter, the reception axis Z need not be perpendicular to the sensor 4, nor pass through the center of the sensor 4.
In particular, when the receiver optics 5 comprises deflecting elements, the reception axis Z may not be a straight line within the image forming device 3, but it can in any case be modeled, within the meaning of the invention, by a straight reception axis Z.
Along the reception axis Z lies the vertex O of the working region 15, in short the reception vertex O. The vertex O of the working region 15 is the vertex of the pyramid or cone, which falls in the optical center of the receiver optics 5 in the case of an inverting receiver optics 5, while it usually falls behind the sensor 4 in the case of a non-inverting receiver optics 5.
The image forming device 3 also defines the angular width of the working region 15 about the reception axis Z, usually expressed in terms of four angles β₁, β₂, β₃, β₄, having their origin at the reception vertex O and each lying in one of four mutually orthogonal half-planes having the reception axis Z as their edge. With reference to the two principal directions of the sensor 4, namely the row direction and the column direction of its photosensitive elements 14, one can speak of a "horizontal" field of view, expressed by the angles β₁, β₃, and of a "vertical" field of view, expressed by the angles β₂, β₄. In the particular case where the sensor 4 is coaxial with, and centered on, the receiver optics 5, the working region 15 has symmetry, and in absolute value β₁ = β₃ and β₂ = β₄. In the case of a one-dimensional sensor, the "vertical" field of view is much smaller than the "horizontal" one, and substantially negligible.
The image forming device 3 also defines a depth of field DOF, which expresses the extent of the working region 15 along the reception axis Z.

In Fig. 3, the substrate S at a generic reading distance D is indicated by S, and the region framed by the sensor on it by 16; as particular cases, the substrate S at the minimum reading distance D₁ is indicated by S₁, with the region framed by the sensor indicated by 16₁, and the substrate S at the maximum reading distance D₂ is indicated by S₂, with the region framed by the sensor indicated by 16₂. The depth of field is therefore given by DOF = D₂ − D₁.

It is emphasized that the reading distances D, D₁, D₂ are measured from the reception vertex O along the reception axis Z, even though the reception axis Z is not necessarily perpendicular to the sensor 4, nor to the region 16 of the substrate framed by the sensor 4.
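Putting assumed numbers on these definitions (values chosen only for illustration): with half-fields β₁ = β₃ = 0.3 rad and reading distances D₁ = 0.1 m, D₂ = 0.5 m, the width of the framed region 16 grows linearly with the reading distance, while DOF = D₂ − D₁:

```python
import math

def framed_width(d, beta1, beta3):
    """Width of the region 16 on a substrate orthogonal to Z at distance d."""
    return d * (math.tan(beta1) + math.tan(beta3))

D1, D2 = 0.10, 0.50
print(framed_width(D1, 0.3, 0.3))  # ~0.062 m at the minimum reading distance
print(framed_width(D2, 0.3, 0.3))  # ~0.309 m at the maximum reading distance
print(D2 - D1)                     # depth of field DOF = 0.4 m
```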
The working region 15 can be fixed, or dynamically changed in size and/or proportions through zoom and/or autofocus systems known per se, such as electromechanical, piezoelectric or electro-optical actuators for moving one or more lenses or diaphragms, mirrors or other components of the receiver optics 5, and/or for changing the curvature of one or more lenses, such as liquid or deformable lenses, and/or for moving the sensor 4. In a preferred embodiment, the receiver optics 5 comprises an Arctic 416 SL-C1 liquid lens manufactured by Varioptic SA, France.
In other words, although in Fig. 3 the field of view β₁, β₂, β₃, β₄ is assumed, for the sake of simplicity, to be the same at the different reading distances D, it can be changed along the reception axis Z by a zoom system, so that the working region 15 is no longer a static pyramidal or conical frustum, but has variable size and/or proportions. The present description remains in any case entirely valid.
The lighting device 6 of the image capture device 2 of the optical information reader 1 of the imager type comprises an array 17 of adjacent light sources 18. In Fig. 3, for the sake of clarity, only some of the light sources 18 are shown.
The light sources 18 of the array 17 can be driven individually by the driver 13, for switching on and off, and preferably also in terms of intensity and/or wavelength or wavelength range of emission. The array thus defines therein a "pixelated source", which may also be termed a PPEA (Programmable Photon Emitter Array).
The light sources 18 of the array 17 preferably each comprise a single illuminating element, the illuminating elements being identical to one another in shape and size. However, the light sources 18 of the array 17 can also comprise illuminating elements of different shapes and/or sizes. Moreover, the light sources 18 of the array 17 can each comprise a plurality of illuminating elements grouped in clusters of equal or different shape and/or size. In other words, the driving of the pixelated source can take place at the level of clusters of illuminating elements or pixels, provided that the number of individually drivable clusters - in other words, of light sources 18 within the meaning of the invention - is still large enough to accomplish the functions of the lighting device 6 described below.
The lighting device 6 optionally comprises an illumination optics.
The illumination optics can comprise one or more lenses, possibly deformable, possible diaphragms, and refractive, reflective or diffractive optical elements, common to all the light sources 18 of the array 17. Such a common illumination optics is shown by way of example in Fig. 3 as an inverting imaging optics 19a coaxial with the array 17.
As better described by way of example with reference to Figs. 14 to 16 below, the illumination optics can also, alternatively or additionally, comprise a plurality of lenses 19b, each associated with a light source 18 of the array 17. Such lenses 19b, of size comparable to that of the light source 18 or of its illuminating element, have the function of determining, and in particular reducing, the effective emission angle of the individual light source 18, and can also have the function of determining the direction of the illuminating beam emitted by the individual light source 18.
Each lens 19b can be replaced by, or combined with, other optical elements such as diaphragms, prismatic surfaces, light guides or GRIN lenses, in order to better select the direction of the beam emitted by the individual light source, for example as described in the aforementioned EP 1 764 835 A1.
As illustrated by way of example in Fig. 16 below, the plurality of lenses 19b can also be combined with a common non-inverting imaging optics 19c, or with a common inverting imaging optics 19a.
The light sources 18 of the array 17 are preferably formed on a common substrate, in the form of an integrated circuit. Preferably, the light sources 18 are also driven through an address bus with row and column indices.
Preferably, the fill factor, namely the ratio between the total area occupied by the active surfaces of the light sources 18 (or by the lenses 19b) and the total area of the substrate of the integrated circuit housing the sources (lenses), is high, preferably above 90%.
In one embodiment, the light sources 18 of the array 17 are micro LEDs. Micro LEDs are miniature emitters, made for example in gallium nitride (GaN) technology, with an emission area whose larger linear dimension is about 20 microns, although this linear dimension can be as small as 4 microns; with this technology, the array 17 can be made of very small size, comprising thousands or tens of thousands of light sources 18 (for example, an array of 512 × 512 illuminating elements a very few mm in size), and with minimal cost and power consumption. Such devices can also emit at different wavelengths.
In one embodiment, the light sources 18 of the array 17 are OLEDs (Organic Light Emitting Diodes). An OLED is an optoelectronic device obtained by arranging a series of thin organic films between two conductors. When a current is applied, a light flux is emitted. This process is called electroluminescent phosphorescence. Even with multilayer systems, an array 17 of OLEDs 18 is very thin, usually less than 500 nanometers (0.5 thousandths of a millimeter) and down to 100 nm. OLEDs consume very little energy and require very low voltages (2 to 10 volts). OLEDs can emit at different wavelengths in the visible spectrum. OLEDs can also be arranged in very dense arrays, with densities usually reaching 740 illuminating elements per inch (pixels/inch), each of 15 square microns ("OLED/CMOS combo opens a new world of microdisplay", Laser Focus World, December 2001, Vol. 37, No. 12, Pennwell Publications, available at: http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display/130152/articles/laser-focus-world/volume-37/issue-12/features/microdisplays/oled-cmos-combo-opens-a-new-world-of-microdisplay.html; "Organically grown: Luminescent organic crystals and polymers promise to revolutionize flat-panel displays with possibilities for low-cost manufacture and more portability", Laser Focus World, August 2001, Vol. 37, No. 8, Pennwell Publications, available at: http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display/113647/articles/laser-focus-world/volume-37/issue-8/features/back-to-basics/organically-grown.html). OLEDs have a very wide emission angle, usually up to 160°. An array 17 of OLEDs can also be arranged on a flexible substrate and therefore take a curved configuration. An array 17 of OLEDs can also be formed so that the emitting elements have different shapes and/or sizes.
In one embodiment, the light sources 18 of the array 17 are LEDs (Light Emitting Diodes). LEDs are optoelectronic emitting devices with a larger linear dimension from 50 microns up to 350 microns or more; these devices achieve high efficiency, but at the cost of a large chip size and of the need for heat dissipation elements, which make an array 17 thus formed rather bulky, with large inactive areas between the emitters, in other words a low fill factor. Alternatively, LED emitters can be formed on a substrate, for example a C-MOS substrate, as described for example in the aforementioned US 5,319,182, but with a lower efficiency. Moreover, LED driver chips tend to have a contact at their center, which generates a shadow at the center of the respectively illuminated zone. Even though there are ways to avoid this drawback, such as the contact geometries proposed in the aforementioned US 6,811,085, such systems are relatively expensive and consume a relatively large amount of energy; furthermore, a relatively large dissipation area is often needed near each light source 18, which reduces the fill factor, as stated above.
In one embodiment, the light sources 18 of the array 17 are lasers combined with micro mirrors made in MEMS (Micro Electro-Mechanical Systems) technology, each movable into an orientation that does not let light through - in other words, within the meaning of the invention, the laser is switched off - and into at least one orientation that lets light through - in other words, within the meaning of the invention, the laser is switched on. Such devices are known in the field as "micro projectors". A laser can be provided in combination with each micro mirror, or a single laser common to several micro mirrors can be provided. The presence of moving parts, however, entails a certain amount of consumption and wear.
Other technologies can be used to make the array 17 of light sources 18.
As an example of an array 17 of light sources 18, Fig. 4 shows, at very high magnification, part of an array 17 of micro-LEDs with a pre-collimation lens 19b on each light source 18.
The lighting device 6 is constructed so that each light source 18 of the array 17 emits an elementary illumination beam having its own mean propagation direction in the space in front of the lighting device 6. As will be better explained hereinafter, the lighting device 6 is also constructed so that the regions illuminated on the substrate S by adjacent light sources 18 of the array 17 are adjacent to each other, and possibly slightly overlapping, so as to form an overall illumination beam, indicated hereinafter with T, whose shape and size depend on how many and which light sources 18 are currently switched on by the driver 13. The number of light sources 18 of the array 17 is selected so that the region illuminated on the substrate S by the lighting device 6 as a whole undergoes a sufficiently small percentage change when an individual light source 18 is switched on/off. Preferably, said percentage change is less than or equal to 15%, more preferably less than or equal to 10%, still more preferably less than or equal to 5%.
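Merely by way of illustration, the granularity requirement just stated can be turned into a minimum source count; the following sketch (not part of the original disclosure) assumes equally contributing, essentially disjoint illuminated tiles:

```python
import math

def min_sources_for_granularity(max_percent_change: float) -> int:
    """Smallest number of equally contributing light sources such that
    switching a single source on/off changes the overall illuminated
    area by at most max_percent_change percent (disjoint tiles assumed)."""
    return math.ceil(100.0 / max_percent_change)

for target in (15.0, 10.0, 5.0):
    n = min_sources_for_granularity(target)
    side = math.ceil(math.sqrt(n))  # e.g. as a square two-dimensional array
    print(f"<= {target:4.1f}% change: at least {n} sources "
          f"(e.g. a {side}x{side} array)")
```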
Fig. 3 illustrates the illumination beam T₀ which, apart from the beam divergence of the light sources at the opposite ends of the array 17, would be emitted by the lighting device 6 if all the light sources 18 of the array 17 were switched on.
The lighting device 6 defines an illumination optical axis A, which is the mean direction of this maximum illumination beam T₀, namely the axis of symmetry of the illumination beam T₀ in at least one plane and, in the case of the two-dimensional array 17 illustrated, generally in two perpendicular planes.
When a common illumination optics 19a, 19c is present and the array 17 is centred with respect to the optical axis of the common illumination optics 19a, 19c, the illumination axis A is defined by the centres of the elements of the common illumination optics 19a, 19c or, in the case of a single common lens 19a, 19c, by the centres of curvature of its optical surfaces. In particular, when the illumination optics 19a, 19b, 19c comprises deflecting elements, the illumination axis A may not form a straight line inside the lighting device 6, but it can still be modelled by a straight illumination axis A within the meaning of the present invention.
In the representative case of a square or, more generally, rectangular two-dimensional array 17, the maximum illumination beam T₀ has the shape of a pyramid or truncated pyramid; in the case of a circular or elliptical array 17, the illumination beam T₀ is conical or frustoconical; in the case of a one-dimensional array 17, the base of the pyramid becomes very thin, its thickness being equal to the size of the region illuminated by an individual light source 18, and the maximum illumination beam T₀ can be regarded as substantially flat.
The lighting device 6 also defines an illumination vertex A₀, which is the vertex of the above pyramid or cone: in the case of a common inverting illumination optics 19a, the illumination vertex A₀ coincides with its optical centre, while in the case of non-inverting illumination optics 19b, 19c, the illumination vertex A₀ usually falls behind the array 17.
It is worth emphasising that, as will become apparent hereinafter, depending on the orientation and position of the common illumination optics 19a, 19c with respect to the array 17 and/or on the geometry of the individual lenses 19b combined with the light sources 18, the illumination axis A need not be perpendicular to the array 17, nor need it pass through the centre of the array 17.
According to the invention, the illumination axis A does not coincide with the receiving axis Z. In particular, the lighting device 6 and the image forming device 3 are not coaxial. Typically, the receiving vertex O and the illumination vertex A₀ do not coincide, and the illumination axis A and the receiving axis Z are mutually inclined. Provided that the receiving vertex O and the illumination vertex A₀ do not coincide, the illumination axis A may however be parallel to the receiving axis Z; conversely, provided that the illumination axis A and the receiving axis Z are mutually inclined, the receiving vertex O and the illumination vertex A₀ may in principle coincide.
According to the invention, the driver 13 of the light sources 18 of the array 17 is adapted to drive the light sources 18, in a manner described later, so as to switch off those light sources 18 that would illuminate beyond the boundary of the region 16 framed by the sensor 4 on the substrate S at a generic reading distance D. Accordingly, in Fig. 3, reference numeral 20 indicates the light sources 18 of the array 17 that illuminate the boundary of the region 16 framed by the sensor 4 at reading distance D. At distance D, the driver 13 switches on at most the light sources 18 within the perimeter 20 (perimeter 20 included), and switches off the light sources outside the perimeter 20. When it is desired to illuminate only part of the region 16 framed by the sensor 4, as better described hereinafter, the driver 13 will switch on only a subset of the light sources 18 within the perimeter 20.
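Merely by way of illustration, the driving rule just described can be sketched as follows; this is not part of the original disclosure, and it assumes for simplicity that the perimeter 20 has already been reduced to rectangular index bounds (for tilted axes the region is in general trapezoidal):

```python
import numpy as np

def drive_mask(rows, cols, m1, m2, n1, n2):
    """Boolean on/off pattern for a rows x cols source array: only sources
    whose indices fall inside the perimeter [m1..m2] x [n1..n2] (the sources
    illuminating up to the boundary of the framed region 16) are on."""
    mask = np.zeros((rows, cols), dtype=bool)
    mask[m1:m2 + 1, n1:n2 + 1] = True
    return mask

# Hypothetical 64x64 array where, at the current distance D, the framed
# region is covered by sources with rows 10..52 and columns 18..47.
print(drive_mask(64, 64, 10, 52, 18, 47).sum(), "sources switched on")
```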
It is worth emphasising that, here and in the remainder of this description and of the claims, "switching off" and "switching on" and their derived forms do not necessarily imply a change of state: if a light source 18 is already in the desired state, the driver 13 simply maintains that state.
It should be understood that the "boundary" of the region 16 framed by the sensor means a strip straddling the geometric perimeter of the region, the thickness of this strip being determined by the regions illuminated by the individual light sources 18 of the array 17, and therefore quite small with respect to the entire region 16 framed by the sensor 4.
As specific cases, in Fig. 3, reference numeral 20₁ indicates the light sources 18 of the array 17 that illuminate the boundary of the region 16₁ framed by the sensor 4 at the minimum reading distance D₁; at this distance D₁, the driver 13 switches on at most all the light sources within the perimeter 20₁ (perimeter 20₁ included) and switches off the light sources outside the perimeter 20₁. Reference numeral 20₂ indicates the light sources 18 of the array 17 that illuminate the boundary of the region 16₂ framed by the sensor 4 at the maximum reading distance D₂; at this distance D₂, the driver 13 switches on at most all the light sources within the perimeter 20₂ (perimeter 20₂ included) and switches off the light sources outside the perimeter 20₂.
As can be seen from Fig. 3, when the reading distance D changes between D₁ and D₂, the change of the peripherally switched-on light sources 20₁, 20 and 20₂ within the array 17 allows the parallax and perspective distortion errors to be corrected, which are intrinsic to the non-coaxial arrangement of the lighting device 6 with respect to the image forming device 3 (in this specific case, the axis A is also inclined with respect to the axis Z, so that the regions 20, 20₁, 20₂ are trapezoidal). Reference numerals 21, 21₁, 21₂ indicate the perimeters of the regions that would be illuminated at distances D, D₁, D₂ respectively if all the light sources 18 of the array 17 were switched on — in other words, the intersections of the maximum illumination beam T₀ with the substrate S, S₁, S₂ at the different distances D, D₁, D₂. It should be noted how each of such maximally illuminated regions 21, 21₁, 21₂ extends far beyond the region 16, 16₁, 16₂ framed by the sensor 4 at the corresponding distance; this would amount to a waste of energy and, when the light emitted by the light sources 18 is in the visible spectrum, would have the drawback of giving the user a misleading visual indication of the region 16, 16₁, 16₂ framed by the sensor 4.
Although this is not entirely clear from Fig. 3, the optical path between the individual switched-on light sources 18 of the array 17 — at the common illumination optics 19a, 19c, where present — and the region 16, 16₁, 16₂ framed by the sensor 4 on the substrate S, S₁, S₂ is not constant. As a consequence, illumination non-uniformities and/or focus losses arise, as schematically shown in Fig. 5.
Such illumination non-uniformities and/or focus losses could be corrected by a suitable design of the illumination optics 19a, 19b, 19c, but this may prove particularly burdensome.
Alternatively or additionally, the driver 13 can drive the light sources 18 so that they emit with different intensities, in particular with intensity increasing from right to left in Fig. 3.
It is emphasised that, by modulating the intensity of the individual light sources 18, possible non-uniformities in the intensity of the light sources 18 themselves can also be corrected, thus making the lighting device 6 less sensitive to product tolerances. In other words, a uniform emitter array 17 is not needed.
Still alternatively or additionally, the array 17 of light sources 18 can be arranged on a curved surface corresponding to the best-focus curve (caustic curve) of the common illumination optics 19a, 19b, 19c — this surface reducing essentially to a curve in the case of a one-dimensional array — so that the outermost light sources 18 of the array 17 are placed at the correct distance for the common illumination optics 19a, 19b, 19c to project a focused image onto the substrate S. An embodiment with a curved array 17 is schematically shown in Fig. 6, and is particularly feasible in the case of OLEDs. In one embodiment, the light sources 18 of the array 17 can be arranged on a curved surface having a concavity opposite to that of Fig. 6. In that case, the illumination beams of the individual light sources 18 diverge, and the illumination optics can be dispensed with.
There are different methods according to which the driver 13 can select the light sources 18 of the array 17 to be switched on — and optionally their intensity and/or emission wavelength(s) — as a function of the reading distance D within the depth of field DOF of the image forming device 3, so as to illuminate only, and in full, the region 16 framed by the sensor 4 on the substrate S. In the following, for brevity, reference will be made only to the determination of the light sources 18 to be switched on, it being understood that the corresponding intensities and/or emission wavelength(s) can be determined at the same time.
First of all, this determination can be carried out in real time or once and for all.
When the determination is carried out in real time, the driver 13 itself should comprise hardware and/or software modules implementing the determination algorithm. As illustrated in Fig. 7, in step 100 the current working distance D within the depth of field DOF (D₁ ≤ D ≤ D₂) is set or detected. In step 101, the subset 18a of the light sources 18 that must be switched on to illuminate only and all of the region 16 framed by the sensor 4 is determined, in particular with one of the methods described below. In step 102, at most all the light sources of the subset 18a are switched on.
When the determination is carried out once and for all, a look-up table is built, to which the driver 13 then refers during normal operation of the image capture device 2 of the reader 1. The driver 13 can again comprise the hardware and/or software modules, or the method can be executed by an external processor and only the look-up table loaded into the memory 8 of the reader 1 associated with the driver 13. The once-and-for-all determination is preferably carried out for essentially every reading distance D within the depth of field DOF — in other words, continuously or at a suitable sampling rate between D₁ and D₂ — and is therefore organised as a cycle of operations. The sampling of the working distance D may be non-uniform; in particular, the sampled working distances D may be closer to each other near the minimum working distance D₁, and less close to each other near the maximum working distance D₂, where the configuration of switched-on light sources 18 changes more slowly. With reference to Fig. 8, in step 103 the working distance D is set to the minimum working distance D₁ or to the maximum working distance D₂, respectively. Step 101 is then executed, determining the subset 18a of the light sources 18 that must be switched on to illuminate only and all of the region 16 framed by the sensor 4. In step 104, a record is then stored in the look-up table, comprising the selected working distance D (corresponding, at the first execution of the cycle, to D₁ or D₂ respectively) and the subset 18a determined in step 101. In step 105, the working distance D is then respectively increased or decreased by an infinitesimal amount or by an amount based on the preselected sampling. In step 106, it is then checked whether the cycle has been executed over the entire depth of field DOF — in other words, whether the working distance D respectively exceeds the maximum working distance D₂ or falls below the minimum working distance D₁. In the negative case, steps 101, 104, 105 and 106 are repeated, thus inserting new records into the look-up table. When the cycle has been executed over the entire depth of field DOF — in other words, when the check of step 106 is affirmative — the driver 13 can enter the normal mode of use. In this mode, the current working distance D within the depth of field DOF (D₁ ≤ D ≤ D₂) is set or detected in step 100. In step 107, the subset 18a of the light sources 18 that must be switched on to illuminate only and all of the region 16 framed by the sensor 4 at the current working distance D is read from the look-up table. In step 102, at most the light sources of the subset 18a are switched on.
In different embodiments of the image capture device 2, step 101 — determining the subset 18a of the light sources 18 that must be switched on at a given working distance D within the depth of field DOF to illuminate only and all of the region 16 framed by the sensor 4 — can be executed according to different methods, both in real time (step 101 of Fig. 7) and once and for all (step 101 of Fig. 8).
A first method is of the analytic type. Once the geometric and optical configuration of the image capture device 2 has been established, it is in fact possible to calculate, for each reading distance D, which light sources 18 of the array 17 illuminate the elementary region framed by each photosensitive element 14 of the sensor 4. It should be noted that, in practice, the elementary region framed by each photosensitive element 14 of the sensor 4 can be illuminated by at most four light sources 18 arranged adjacent to each other in a square within the array 17.
A preferred embodiment of the analytic method is schematically illustrated in Fig. 9, and is described in more detail with reference to Figs. 10 to 17 below.
With reference to Fig. 9, in step 108, the coordinates of a few notable points, which allow the boundary of the region 16 framed by the sensor 4 on the substrate S to be identified, are calculated in a first reference frame associated with the receiving device 3 and based on the configuration of the image forming device 3 — in particular a first reference frame having the receiving vertex O as its origin. Such notable points are preferably points whose images are formed on the photosensitive elements 14 defining the perimeter of the sensor 4 and/or on the central photosensitive element 14 of the sensor 4. In particular, in the case of a rectangular or square sensor 4, the reference frame is preferably a Cartesian coordinate system, and the notable points preferably correspond to those seen by the photosensitive elements 14 at least at two opposite corners of the sensor 4; in the case of a circular or elliptical sensor 4, the reference frame is preferably a cylindrical coordinate system, and the notable points preferably correspond to the central photosensitive element 14 and to one peripheral photosensitive element 14, or to two or four peripheral photosensitive elements 14 along the axes of symmetry of the sensor 4. There is indeed an analytic relationship expressing the coordinates of all the points of the perimeter of the region 16 framed by the sensor 4 as a function of such notable points.
In step 109, the coordinates of the notable points are transformed into a second reference frame, associated with the lighting device 6 and in particular having the illumination vertex A₀ as its origin. In particular, in the case of a rectangular or square array 17, the second reference frame is preferably a Cartesian coordinate system; in the case of a circular or elliptical array 17, the second reference frame is preferably a cylindrical coordinate system. In some cases, it may be appropriate to change, increase or reduce the notable points used when passing from one reference frame to the other, exploiting the analytic relationship expressing the coordinates of all the points of the perimeter of the region 16 framed by the sensor 4 on the substrate S and/or the analytic relationship expressing the coordinates of all the points of the perimeter of the region illuminated by the array 17 on the substrate: for example, if the region 16 framed by the sensor 4 on the substrate S is rectangular but is seen by the lighting device 6 as a trapezoidal region, one can operate on the four corners, or for example on two opposite corners, or on the centre and one corner, in the first reference frame, and obtain the four corners of the trapezium in the second reference frame through the analytic relationship of the rectangle.
In step 110, the light sources 18 of the array 17 that illuminate the corresponding notable points are calculated in the second reference frame, based on the configuration of the lighting device 6.
The coordinate transformation between the two reference frames performed in step 109 is known per se. Merely by way of example, with reference to Fig. 10, in the case of a first Cartesian reference frame X, Y, Z with origin at the receiving vertex O and a second Cartesian reference frame U, V, W with origin at the illumination vertex A₀, the coordinate transformation is in general a rotation plus a translation, which in particular cases may reduce to a rotation alone or a translation alone. Denoting by x₀, y₀, z₀ the coordinates of the illumination vertex A₀ of the second reference frame in the first reference frame, and by cos α₁ … cos α₉ the direction cosines of the axes U, V, W of the second reference frame with respect to the first reference frame X, Y, Z (in the simplified representation, the angles α₁ … α₉ are indicated with respect to a reference frame U′, V′, W′ translated to O in Fig. 10), the transformation is expressed by the following set of relations:

u = (x − x₀)·cos α₁ + (y − y₀)·cos α₂ + (z − z₀)·cos α₃ (1)

v = (x − x₀)·cos α₄ + (y − y₀)·cos α₅ + (z − z₀)·cos α₆ (2)

w = (x − x₀)·cos α₇ + (y − y₀)·cos α₈ + (z − z₀)·cos α₉ (3)
The position of the illumination vertex A₀ is shown in the first quadrant (x₀, y₀, z₀ positive), but the illumination vertex A₀ may lie in any quadrant. The illumination vertex A₀ may also lie along one of the axes and/or at the receiving vertex O. Moreover, one or more axes of the two reference frames may be parallel and/or coincident or perpendicular, and one or more of the direction cosines cos α₁ … cos α₉ may be zero or one.
By means of Figs. 11 and 12, the relationship associating the points of the region 16 framed by the sensor 4 on the substrate S with the photosensitive elements 14 will now be explained; this relationship is used in step 108 of the method of Fig. 9 in the following case: the sensor 4 is two-dimensional, rectangular or, as a particular case, square, and the inverting receiver optics, represented as a paraxial lens 5, has its principal planes parallel to the plane of the sensor 4, such principal planes being in this case perpendicular to the receiving axis Z passing through the optical centre of the receiver optics 5. It should be noted that, to preserve generality, the receiving axis Z does not pass through the centre of the sensor 4, but through a point indicated as Oₛ.
In the case of this embodiment of the image forming device 3, the first reference frame is preferably selected as a Cartesian coordinate system X, Y, Z with origin at the receiving vertex O, the Z axis being selected coincident with the receiving axis Z but oriented opposite to the path of the received light R, and the X, Y axes being oriented parallel to the principal directions of the sensor 4, i.e. the column and row directions of the photosensitive elements 14.
At a generic working distance D — that is, in the plane of equation

z = D (4)

— the region 16 framed by the sensor 4 (not shown for brevity) defines the working region 15 (indicated with dot-and-dash and dotted lines) on the substrate S.
The angles β₁, β₂, β₃, β₄ delimiting the field of view on the substrate S side, and the angles β′₁, β′₂, β′₃, β′₄ between the receiving axis Z and the edges of the sensor 4 — on the sensor 4 side, in the opposite quadrant — are related by:

β′ₖ = AMAGₛ·βₖ (5)

where AMAGₛ is the angular magnification of the receiver optics 5, usually AMAGₛ ≤ 1.
As stated above, although the field of view β₁, β₂, β₃, β₄ is depicted as constant along the receiving axis Z, in general this is not necessary — for example in the presence of a field-of-view zoom and/or autofocus system operating as a function of the working distance, i.e. of the current z coordinate. In such a case, formula (5) above and some of the formulas set out below will use the value of the field of view at the working distance under consideration.
If s is the distance between the sensor 4 and the principal planes of the receiver optics 5, the receiving axis Z meets the sensor 4 at the point Oₛ of coordinates (0, 0, s). With reference to Fig. 12, if the point Oₛ falls behind the centre of a photosensitive element 14 of the sensor 4, and if I and J are the row and column pitches of the sensor 4 — i.e. the distances between the centres of two adjacent photosensitive elements 14 in the row and column directions, respectively — then each photosensitive element 14 is identified by its centre, which in the reference frame X, Y, Z has the coordinates expressed by the following relation:

F(i·I, j·J, s) (6)

where i and j are respectively the row and column indices of the sensor 4, such indices taking positive and negative integer values and being zero at the photosensitive element 14 centred at Oₛ.
If the point Oₛ does not fall at the centre of a photosensitive element 14, but at distances I₁, J₁ from the centre, the coordinates of the centre of each photosensitive element will be expressed by (i·I + I₁, j·J + J₁, s). If the photosensitive elements 14 of the sensor 4 are not all equal, their coordinates in the reference frame X, Y, Z can still be calculated. It should be noted that, in the case of square or circular photosensitive elements uniformly spaced on the sensor 4, the row and column pitches I, J of the sensor 4 are equal to each other.
If the point Oₛ falls at the centre of a photosensitive element 14, the receiving axis Z is an axis of symmetry of the sensor 4, and the working region 15 has two planes of symmetry, so that β₁ = β₃ and β₂ = β₄. In this case, the row and column indices have limit values equal in absolute value.
It can easily be recognised that the centre P of the region framed at distance D by the generic photosensitive element 14 identified by indices i, j has the coordinates expressed by relations (7) and (8), together with:

z = D (9)

In the case of unit angular magnification, AMAGₛ = 1, relations (7), (8) reduce to the simple proportions (10), (11).
In the case of the embodiment illustrated in Fig. 11, in step 108 of the method of Fig. 9, relations (7), (8), (9) — or (10), (11), (9) — are respectively applied to the four points P₁, P₂, P₃, P₄ delimiting the corners of the region 16 framed by the sensor 4 on the substrate S, or only to the opposite corners P₁ and P₃, or P₂ and P₄.
Although do not use in the method for Fig. 9, be worth the relation that is described below and relational expression (7), (8) trans, and wherein operating distance D is replaced by general coordinate z:
Described trans permission, for the arbitrfary point P of given workspace areas 15, makes the index of the light activated element 14 of its image of reception of sensor 4 be identified.Certainly, because index i, j are integers, so relational expression is taken as hithermost integer by approximate.When the visual field of independent light activated element 14 is slightly overlapping, in overlay region, by two integers being similar to insufficiently or too much by identify receive the light activated element 14 of the image of point right considered.
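Merely by way of numeric illustration, the direct and inverse mappings just described can be sketched as follows under the assumption of unit angular magnification AMAGₛ = 1 and the untilted geometry of Figs. 11-12; pitches, distances and indices below are hypothetical, and the sign of s encodes the image inversion:

```python
def framed_point(i, j, I, J, s, D):
    """Centre P of the region framed at distance D by the photosensitive
    element (i, j): intersection of the line through the element centre
    F(i*I, j*J, s) and the receiving vertex O with the plane z = D
    (AMAG_s = 1 assumed). s is negative in the frame X, Y, Z, so the
    inversion of the optics comes out naturally."""
    f = D / s
    return (i * I * f, j * J * f, D)

def element_indices(x, y, I, J, s, D):
    """Inverse mapping: indices of the photosensitive element receiving
    the image of the point (x, y, D), rounded to the nearest integers."""
    f = D / s
    return (round(x / (I * f)), round(y / (J * f)))

# Hypothetical numbers: 5 um pitches, sensor 6 mm from O, D = 150 mm.
print(framed_point(100, -40, 0.005, 0.005, -6.0, 150.0))
print(element_indices(-12.5, 5.0, 0.005, 0.005, -6.0, 150.0))
```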
It will be clearly recognised that the situation discussed with reference to Figs. 11 and 12 holds, mutatis mutandis, for an embodiment of the lighting device 6 having a rectangular — or, as a particular case, square — two-dimensional array 17, with a common inverting illumination optics 19a whose principal planes are parallel to the plane of the array 17, such principal planes being in this particular case perpendicular to the optical axis of the illumination optics 19a passing through its own optical centre. The relevant reference symbols are indicated in brackets in Figs. 11 and 12. The point G indicates the position of the virtual light source illuminating the point P; at least one light source 18 of the array 17 — and at most four light sources 18 forming a square and adjacent to one another — corresponds to the point G.
In the case of this embodiment of the lighting device 6, the second reference frame is advantageously selected as a Cartesian coordinate system U, V, W with origin at the illumination vertex A₀, with the W axis coincident with the optical axis of the common inverting illumination optics 19a, and the U, V axes oriented parallel to the principal directions of the array 17, i.e. the row and column directions of the light sources 18. It should be noted that the W axis coincides with the illumination axis A only in the particular case where the illumination axis A passes through the centre of the array 17.
Once the coordinates u, v, w in the coordinate system U, V, W of the generic point P — or specifically of the points P₁, P₂, P₃, P₄, or P₁, P₃, or P₂, P₄ — have been obtained in step 109 of the method of Fig. 9 through relations (1), (2), (3), in step 110 of the method of Fig. 9, relations (14), (15) — corresponding to relations (12), (13) — are therefore applied to such coordinates; they allow the row and column indices m, n of the light source 18 of the array 17 illuminating the point P to be calculated, where the indices 0, 0 relate to the light source 18 arranged along the axis W (point A₂).
In relations (14), (15), M and N are the row and column pitches of the light sources 18, AMAGₐ is the possible angular magnification of the common inverting illumination optics 19a, for which the following relation (16) holds:

γ′ₖ = AMAGₐ·γₖ (16)

and t is the distance between the plane of the array 17 and the illumination vertex A₀, measured along the axis W; the general and particular cases discussed above with reference to the image capture device 3 hold here as well.
In the embodiment of the lighting device 6 illustrated in Fig. 11, lenses 19b combined with the individual light sources 18 of the array 17 may additionally be present, to modify the angular emission width and/or the emission direction of said light sources 18. In this respect, reference is made to the subsequent description of Figs. 14 to 16.
Relations (14) and (15) express the correlation, used in step 110 of the method of Fig. 9, between an arbitrary point P of the working region 15 and the row and column indices of the light source 18 of the array 17 illuminating that point P; the same correlation holds for an embodiment of the lighting device 6 with a common non-inverting illumination optics 19c, again having its principal planes parallel to the plane of the array 17, such principal planes being in this particular case perpendicular to the optical axis passing through its optical centre, as shown in Fig. 13.
It is emphasised that relations (1) to (16) are analytic relationships depending only on known (design) geometric parameters of the reader 1 — in particular on known (design) geometric parameters of the image forming device 3 and of the lighting device 6 of the reader 1 — and/or only on their known (design) spatial arrangement, including the spatial arrangement of their components or sub-components.
Relations (14) and (15) also hold in the case where the non-inverting illumination optics comprises the aforementioned plurality of lenses 19b combined with the individual light sources 18 of the array 17, possibly combined with a common non-inverting lens 19c, as illustrated in Figs. 14 to 16.
For brevity, the lighting device 6 with such lenses 19b is illustrated in Fig. 14 in a plane assumed to contain a principal direction of the (one- or two-dimensional) array 17 and the illumination axis A; this figure is considered here to describe the more general three-dimensional case adequately, according to the foregoing teaching.
Each individual lens 19b processes the light emitted by the light source 18 it is placed on, so as to form a beam centred about its own illumination axis Aₘ and contained within an angle ωₘ determined by the size of the lens 19b, the illumination axis Aₘ being determined by the line joining the centre of the light source 18 with the centre of the lens 19b combined with it. By suitably positioning the centre of the lens 19b with respect to the light source 18, the individual illumination axes Aₘ can therefore be made to tilt at the desired angles with respect to the plane of the array 17.
In the embodiment of Fig. 14, the illumination axes Aₘ diverge from one another, so as to define an illumination vertex A₀ — falling in this case behind the array 17 — and are uniformly spread radially with angular spacing μ. In this case, the illumination axis A of the lighting device 6 is the bisector of the angle defined by the illumination axes of the first and last light sources, and is perpendicular to the plane of the array 17. However, by suitably orienting the lenses 19b with respect to the light sources 18, elementary beams crossing in front of the array 17 can be obtained. In the illustrated case, the emission angles ωₘ are all equal to the angle μ, so that adjacent light sources 18 illuminate adjacent, touching regions 22ₘ; however, the emission angles ωₘ may be slightly larger than the angle μ, so that the illuminated regions 22ₘ slightly overlap.
The embodiment of Fig. 15 differs from that of Fig. 14 in that the illumination axis A is not perpendicular to the plane of the array 17, but is inclined at an angle η₀ with respect to the normal to the plane of the array 17.
These embodiments of the lighting device 6 have the advantage of a very small thickness in the direction perpendicular to the plane of the array 17.
In both cases, to reduce the size of the emitted beams, it is sufficient to place the lenses so as to obtain an angular magnification < 1 — more precisely, to place in front of each light source 18, which would emit within its own angle ω, a lens of angular magnification AMAGₘ = ωₘ/ω.
The embodiment of the lighting device 6 of Fig. 16 differs from that of Fig. 14 by an additional common non-inverting optics 19c with angular magnification < 1, arranged downstream of the array 17 and of the individual lenses 19b, so as to further reduce the emission angle of the individual light sources 18 to the value ω′ = AMAG·ωₘ. The thickness of the lighting device 6 in the direction perpendicular to the plane of the array 17 increases, but the illumination beam T is further collimated. In the case of light sources 18 with a particularly small own emission angle ω, the angular magnification AMAGₘ may even be AMAGₘ > 1. A similar common non-inverting optics 19c can also be provided in the embodiment of Fig. 15.
In the embodiments of Figs. 14 to 16, the illumination axes Aₘ, the emission angles ωₘ and the angular magnifications AMAGₘ of the individual light sources 18 may differ from one another and, similarly — albeit at the cost of a greater complexity of the method (step 101) determining the light sources 18 of the array 17 to be switched on — the illumination axes Aₘ need not be uniformly spaced. For a given lighting device 6, however, the angle formed by the individual illumination axis of each light source 18 with the illumination axis A can in any case be calculated or measured. It is therefore always possible to determine the function associating a point P on the substrate S with the light sources 18 of the array 17 illuminating that point P, however complex this function may be.
In Figs. 14 to 16, the regions 22ₘ illuminated by the light sources 18 are shown, for illustration purposes only, in a plane parallel to the plane of the array; such a plane need not be the plane at a given reading distance D, nor the focal plane or an iso-blur plane of the image forming device 3.
Figs. 11 and 13 to 16, as well as Fig. 17 described hereinafter, can also be regarded as representing the many embodiments of the lighting device 6 in which the array 17 is a curved array (Fig. 6).
It will be clearly recognised that the situation discussed with reference to Figs. 13 to 16 holds, mutatis mutandis, for corresponding embodiments of the image forming device 3. For brevity, the corresponding reference numerals are not indicated in brackets in Figs. 13 to 16.
By means of Fig. 17, the relationship associating the points of the region 16 framed by the sensor 4 on the substrate S with the photosensitive elements 14 of the sensor 4 will now be explained; this relationship is used in step 108 of the method of Fig. 9 in the following case: a two-dimensional sensor 4, rectangular or, as a particular case, square, with an inverting receiver optics represented as a single paraxial lens 5, whose principal planes are not parallel to the plane of the sensor 4 — such principal planes being in this particular case perpendicular to the receiving axis Z passing through the optical centre of the receiver optics 5.
In the case of this embodiment of the image forming device 3, the first reference frame is advantageously selected as a Cartesian reference frame X, Y, Z with origin at the receiving vertex O, its Z axis selected coincident with the receiving axis Z but oriented opposite to the path of the received light R (Fig. 2), and its Y axis oriented parallel to the row direction of the sensor 4, along which the photosensitive elements 14 are indicated with index i. The column direction of the sensor 4, along which the photosensitive elements 14 are indicated with index j, forms an angle δ with the X axis. The case in which the principal planes of the receiver optics 5 are also inclined with respect to the row direction of the sensor 4 is the fully general case, which is not dealt with for simplicity. Moreover, in Fig. 17, the receiving axis Z is shown, for simplicity, as passing through the sensor 4 at the point Oₛ corresponding to its centre, and in particular to the centre of a photosensitive element 14 of the sensor 4; in general, however, this is not strictly required, and the considerations discussed with reference to Fig. 12 hold.
The plane 30 on which the sensor 4 lies (schematically indicated by a triple dot-and-dash line) meets the plane X, Y along the straight line 31 defined by the following set of equations:

x = −s/tan δ (17)

y arbitrary (18)

z = 0 (19)

where the minus sign in relation (17) accounts for the fact that the distance s between the receiving vertex O and the intersection point Oₛ of the receiving axis Z with the sensor 4 is negative in the reference frame X, Y, Z.
Within the depth of field DOF of the image forming device 3, a generic working distance D, measured along the receiving axis Z and identified by the point Q of the receiving axis Z with coordinates

Q(0, 0, D) (20)

defines the plane 32 (also schematically indicated by a triple dot-and-dash line) passing through the line 31 and through the point Q, which is therefore expressed by the following relation:

x·D + z·(−s/tan δ) − [(−s/tan δ)·D] = 0 (21)
With the conventions used above with reference to Fig. 12, each photosensitive element 14 of the sensor 4 is identified by its centre, whose coordinates in the reference frame X, Y, Z are expressed by the following relation (22):

F(j·J·cos δ, i·I, s + j·J·sin δ) (22)
For simplicity, unit angular magnification of the receiver optics 5, AMAGₛ = 1, is assumed in what follows. Under this assumption, the region framed by the generic photosensitive element 14 identified by indices i, j lies on the straight line passing through the centre of the photosensitive element itself and through the receiving vertex O (the straight line FOP of Fig. 17), which is expressed by the following set of parametric equations:

x = j·J·cos δ·f (23)

y = i·I·f (24)

z = (s + j·J·sin δ)·f (25)

with f taking any value.
Within the region 16 framed by the sensor 4 on the plane 32, the coordinates of the centre P of the region framed by each photosensitive element 14 are therefore expressed by formulas (23), (24), (25) for the value of the parameter f obtained by combining formulas (23), (25) with relation (21), that is, by relation (26).
In the case of the embodiment illustrated in Fig. 17, in step 108 of the method of Fig. 9, relations (23), (24), (25) with the value of f given by relation (26) are applied to the four points P₁, P₂, P₃, P₄ defining the corners of the region 16 framed by the sensor 4 on the substrate S, or even only to the opposite corners P₁ and P₃, or P₂ and P₄.
Fig. 17 and the foregoing description also apply, mutatis mutandis, to the lighting device 6 in the case of the corresponding embodiment — in other words, a lighting device 6 with a rectangular or, as a particular case, square two-dimensional array 17 and an inverting optics whose principal planes are not parallel to the plane of the array 17, such principal planes being in this particular case perpendicular to the axis of the illumination optics 19a passing through its own optical centre. Again, the corresponding reference numerals are given in brackets in Fig. 17.
Once the coordinates u, v, w of the generic point P — or better, of the points P₁, P₂, P₃, P₄, or P₁, P₃, or P₂, P₄ — in the coordinate system U, V, W have been obtained in step 109 of the method of Fig. 9 through relations (1), (2), (3), in step 110 of the method of Fig. 9 the following relations are therefore applied to such coordinates:

n = u/(N·cos ε·f) (27)

m = v/(M·f) (28)
These are the inversions of relations (23), (24), where the value of the parameter f also satisfies relation (29), together with

w = (t + n·N·sin ε)·f (30)

relations (29) and (30) corresponding to relations (26) and (25).
By combining relations (30) and (27), relation (31) is obtained; by substituting (31) into (29), f(u, w) is obtained; and finally, by substituting the value of f(u, w) into (28), m(u, v, w) is obtained — the details being omitted for brevity.
It is emphasised that relations (17) to (31) are likewise analytic relationships depending only on known (design) geometric parameters of the reader 1 — in particular on known (design) geometric parameters of the image forming device 3 and of its lighting device 6 — and/or on the spatial arrangement of the image forming device 3 and of the lighting device 6, including the spatial arrangement of their components or sub-components.
In the various configurations of the image forming device 3 and of the lighting device 6, the above relations therefore allow the row and column indices m, n of the light source 18 of the array 17 illuminating a notable point — and in general an arbitrary point P of the working region 15 — to be calculated, where the indices 0, 0 relate to the light source 18 placed along the axis W (point A₂).
As described above with reference to Fig. 9, it should be understood that it may be necessary or advantageous to change/increase/reduce the notable points in either reference frame, according to the type of figure obtained in each reference frame. Accordingly, relations (1) to (3) and their inversions can be applied not only to specific points, but equally to the expressions of straight lines or curves.
The above formulas therefore allow the determination according to the analytic method — i.e. in step 101 of the real-time method of Fig. 7 or of the once-and-for-all method of Fig. 8 — of the subset 18a of the light sources 18 of the array 17 to be switched on at the given working distance D to illuminate the whole region 16 framed by the sensor 4 on the substrate S.
The above formulas can be simplified according to the specific configuration of the image capture device 2. Moreover, different reference frames may be used, to which corresponding different formulas apply, identically or more advantageously.
In order to calculate the intensity of each light source 18 of the array 17 to be switched on, determined in step 101 of the method of Fig. 7 or Fig. 8, when such intensity is variable, it is easy to calculate the distance d (not indicated in the figures for brevity) of each point P of the region 16 framed by the sensor 4 on the substrate S from the light source 18 illuminating that point P. In the cases of Figs. 10 to 17, this distance can easily be expressed in the reference frame U, V, W by relation (32), where the quantities take the appropriate signs.
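The body of relation (32) is not reproduced in this text; a sketch of such a distance computation, together with a possible square-law intensity compensation, is given below under the assumption that the source of indices (m, n) sits at (m·M, n·N, t) on the array plane (values hypothetical):

```python
import math

def source_to_point_distance(u, v, w, m, n, M, N, t):
    """Distance d between the point P(u, v, w) and the light source of
    indices (m, n), taken at (m*M, n*N, t) on the array plane."""
    return math.dist((u, v, w), (m * M, n * N, t))

def relative_drive_level(d, d_ref):
    """Square-law compensation: drive each source so that the irradiance
    at the substrate matches that of a source at distance d_ref."""
    return (d / d_ref) ** 2

d = source_to_point_distance(-10.0, 4.0, 150.0, 11, -4, 0.05, 0.05, -8.0)
print(d, relative_drive_level(d, d_ref=150.0))
```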
For the purpose of the initial design of the array 17 of light sources 18, it is worth calculating the minimum solid angle within which the array 17 must be able to emit in order to illuminate the entire working region 15 — i.e. the whole field of view of the image forming device 3 — over its entire depth of field DOF. In other words, the solid angle subtended by the maximum illumination beam T₀ must at least equal this minimum solid angle.
This can easily be achieved by applying the above concepts and formulas to suitable notable points — such as the corners of the working region 16₁ framed by the sensor 4 at the minimum reading distance D = D₁ and the corners of the working region 16₂ framed by the sensor 4 at the maximum reading distance D = D₂ — and by assessing which are the most positive and the most negative values obtained for the indices m and n. In this assessment, one or more of the quantities expressing the configuration and geometry of the lighting device 6, and its position with respect to the image forming device 3, are advantageously kept in parametric form. Such quantities include the coordinates x₀, y₀, z₀ of the illumination vertex A₀ in the reference frame X, Y, Z associated with the image forming device 3, the direction cosines cos α₁ to cos α₉, the distance of the array 17 from the illumination vertex A₀, the angular magnification AMAGₐ of the illumination optics, the tilt angle ε in the case of the embodiment of Fig. 17, and also typically the row and column pitches M, N, the limit values of the row and column indices m, n of the array 17 — in other words, the number of light sources 18 of the array 17 — and the position of the point A₂ within the array 17. Of course, the admissible values of such quantities are subject to possible design constraints, such as the maximum size of the image capture device 2 and the availability of arrays 17 with suitable characteristics. In general, however, the size and position of the array 17 can always be defined so that all the light sources 18 of the array 17 are exploited — in other words, are all switched on — at at least one reading distance D. Similar considerations hold when designing the whole image capture device 2, in which case one or more of the quantities expressing the configuration and geometry of the image forming device 3 will also be kept in parametric form, such as: the distance s of the sensor 4 from the receiving vertex O, the angular magnification AMAGₛ of the receiver optics 5, the tilt angle δ in the case of the embodiment of Fig. 17, and also typically the row and column pitches I and J, the limit values of the row and column indices i, j of the sensor 4 — in other words, the number of photosensitive elements 14 of the sensor 4 — and the position of the point Oₛ within the sensor 4.
It should be understood, however, that for a given image capture device 2 the values of the quantities listed above are known constants.
Depending on the case at hand, simplified formulas can be derived for application to a one-dimensional sensor 4 and/or array 17, and/or generally more complex formulas can be derived for application to the aforementioned curved array 17 (Fig. 6).
It must also be clear that, in practice, the substrate S may have any orientation with respect to the receiving axis Z, provided the optical information C occupies a region of space generally lying within the working region 15 and at sufficiently similar local distances, so that the focus on the sensor 4 is sufficiently uniform; in that case, the illumination conditions obtained by the lighting device 6 are adequate in practice even if calculated on the basis of a single working distance D. The determination of the light sources 18 to be switched on can also be performed taking tilt into account more accurately, even though the formulas to be applied in the analytic case become more complex.
A second method, used in different embodiments of the image capture device 2 to execute step 101 — determining, at a given current working distance D within the depth of field DOF, the subset 18a of the light sources 18 that must be switched on to illuminate only the whole region 16 framed by the sensor 4 — both in real time (Fig. 7) and once and for all (Fig. 8), is of the empirical or adaptive type; a representative embodiment is shown in Fig. 18.
This embodiment is suitable for the case in which the plane of the sensor 4 and the plane of the array 17 are parallel and both regions are rectangular. A more general method is described hereinafter.
In step 120, the driver initially switches on all the light sources 18 of the array 17. In this case, the whole region 16 framed by the sensor 4 is certainly illuminated, and this is checked in step 121. A negative outcome means a design and/or assembly error of the image capture device 2 — in other words, the condition that the maximum illumination beam T₀ be greater than or equal to the required minimum solid angle is not met, and/or the position of the illumination vertex A₀ and/or the inclination of the illumination axis A with respect to the receiving axis Z are incorrect — and/or a failure of the array 17, and the method therefore terminates. Steps 120, 121 can however be omitted.
When the outcome of step 121 is affirmative, in step 122 a flag is set to TRUE, an edge of the array 17 is preselected, and the following operating cycle is started. In step 123, the driver switches off p light sources 18 starting from the preselected edge of the array 17. In step 124, it is checked whether the whole region 16 framed by the sensor 4 is still illuminated. In the negative case, the flag is set to FALSE in step 125, and the number p is decreased by a quantity a in step 126; execution then returns to step 123 and to the subsequent check of step 124. When the outcome of step 124 is affirmative, it is checked in step 127 whether the flag is still TRUE. In the affirmative case, the number p is increased by a quantity b in step 128, and execution returns to step 123. When the outcome of step 127 is negative — i.e. when the flag was set to FALSE in step 125 — in step 129 the quantities a, b are decreased and the flag is set back to TRUE. In step 130, it is checked whether the quantities a, b have reached zero. In the negative case, execution returns to step 128. In the affirmative case, the current value of p indicates how many light sources 18 must be switched off from the preselected edge of the array 17, and in step 131 a temporary form of the subset 18a of the light sources 18 to be switched on is accordingly set. In step 132, it is then checked whether all the edges of the array 17 have been examined; in the negative case, execution returns to step 122, where a different edge of the array 17 is of course selected. When all the edges of the array 17 have been examined, the outcome of step 132 is affirmative, and the final subset 18a of the light sources 18 to be switched on is set in step 133.
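A compact sketch of the cycle of Fig. 18 for one edge is given below; it is not part of the original disclosure, and it assumes a callback reporting whether the framed region 16 is still fully illuminated (the doubling/halving of the stride mirrors the dichotomic search described further on):

```python
def trim_edge(fully_illuminated, max_p):
    """Steps 122-131 of Fig. 18 for one preselected edge: find the largest
    number p of sources that can be switched off from that edge while the
    framed region 16 stays fully illuminated. fully_illuminated(p) stands
    for steps 123-124 (switch off p sources, check the sensor)."""
    p, b = 0, 1
    best = 0
    while True:
        if fully_illuminated(p):
            best = p
            p, b = p + b, b * 2      # step 128: advance, doubling the stride
        else:
            b = b // 2               # step 129: shrink the stride
            if b == 0:               # step 130: converged
                return best
            p = best + b             # retry closer to the last good p
        p = min(p, max_p)

# Toy check: suppose the region stays covered as long as p <= 13.
print(trim_edge(lambda p: p <= 13, max_p=64))   # -> 13
```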
For convenience of description, referring to the foregoing analytic model and also to Fig. 19: selecting the edge of the array corresponding to the light sources 18 with the most negative row index m, i.e. m = −m_min, when step 130 gives an affirmative outcome with p = p₁, the row index of the first switched-on light source 18 is m₁ = −m_min + p₁; selecting the edge of the array corresponding to the light sources 18 with the largest positive row index m, i.e. m = m_max, when step 130 gives an affirmative outcome with p = p₂, the row index of the last switched-on light source 18 is m₂ = m_max − p₂; selecting the edge of the array corresponding to the light sources 18 with the most negative column index, i.e. n = −n_min, when step 130 gives an affirmative outcome with p = p₃, the column index of the first switched-on light source 18 is n₃ = −n_min + p₃; selecting the edge of the array corresponding to the light sources 18 with the largest positive column index, i.e. n = n_max, when step 130 gives an affirmative outcome with p = p₄, the column index of the last switched-on light source 18 is n₄ = n_max − p₄. The light sources delimited by the indices (m₁, n₃), (m₂, n₃), (m₂, n₄), (m₁, n₄) will therefore be switched on.
The cycle of steps 123 to 130 described above can be executed simultaneously for the two edges of a one-dimensional array 17 or, in the case of a two-dimensional array 17, simultaneously for a pair of opposite edges (i.e. determining the row subset and the column subset at the same time) or simultaneously for adjacent edges (i.e. determining at the same time the row and column indices of the first light source to be switched on from a corner of the array); of course, in this case, the variables p, a, b and the flag will be duplicated appropriately. In certain configurations of the image capture device 2, it may moreover be sufficient to repeat the cycle of steps 123 to 130 on only two or three edges of the array — for example when the region 16 framed by the sensor 4 on the substrate S is a rectangular region centred both as observed by the sensor 4 and as observed by the lighting device 6.
It should be understood that the use of the quantities a and b generally allows the number of cycles executed to be reduced, by performing a binary search for the first source of the subset 18a of the light sources 18 to be switched on from the preselected edge of the array 17. In other words, as long as the region 16 framed by the sensor 4 on the substrate S is fully illuminated — and the flag remains TRUE — several (b) light sources 18 are switched off at a time in step 128. When too many light sources 18 have been switched off — and the flag becomes FALSE — fewer light sources are tried by switching back on some (a) light sources at a time, until the last light source that can be switched off from the edge is found. In particular, in steps 126, 129, the decrease and/or increase of a and/or b can proceed by successive halving and/or doubling (dichotomic search), to achieve fast convergence of the algorithm. The use of the quantities a and b is however optional: a single light source may be switched on and off at a time.
Those skilled in the art will understand how to modify the block diagram of Fig. 18 in order to obtain configurations that start from all light sources 18 switched off and switch on one or more light sources at a time, or configurations that start from the light sources 18 of an intermediate region of the array 17 initially switched on.
Furthermore, it should be understood that the initial number p of switched-off light sources 18 may be selected as a function of the last determination performed. Indeed, when the working distance D increases (or correspondingly decreases), the number of light sources 18 to be switched off from one edge of the array 17 increases, and the number of light sources 18 to be switched on from the opposite edge of the array 17 decreases (compare Fig. 3). Therefore, instead of always starting from the illumination of the whole array 17, one may start from the subset 18a of light sources determined for the closest working distance D.
In the more general case in which the region of the array 17 to be switched on is a generic quadrilateral rather than a rectangle or square — which generally occurs when the planes of the sensor 4 and of the array 17 are not parallel, and in particular in the case of Fig. 17 — a different embodiment of the empirical/adaptive method is better suited to executing step 101, in real time (Fig. 7) and once and for all (Fig. 8), i.e. determining, at the given current working distance D within the depth of field DOF, the subset 18a of the light sources 18 that must be switched on to illuminate only the region 16 framed by the sensor 4.
This embodiment is based on the following steps, described with reference to Fig. 20 (a sketch of the resulting perimeter construction is given after this list):

a) successively switching on a series of rows or columns of the array 17 until the region 16 framed by the sensor 4 is at least partly illuminated — in particular, until the sensor 4 detects the image of a line, usually inclined and off-centre;

b) identifying and switching on a light source 18 of the array 17, hereinafter referred to as the "starting light source" and represented in Fig. 20 by the point G₀, which illuminates a point P₀ of the illuminated line, P₀ in turn illuminating a point F₀ (or photosensitive element 14) of the sensor 4; preferably, the starting light source is selected as the light source illuminating the middle point of the part of the illuminated line seen by the sensor 4; the selection can be made, for example, by rapidly switching on all the light sources 18 of the row or column under examination, one after the other;

c) selecting an oriented direction along the array 17, with origin at the starting light source G₀, and identifying along this direction the light source 18 (represented by the point G₁ in Fig. 20) that illuminates a point P₁ whose image is formed on one of the photosensitive elements 14 at an edge of the sensor 4, this photosensitive element being represented by the point F₁ in Fig. 20;

d) storing the light source 18 thus found and the corresponding edge of the sensor 4;

e) repeating steps c) and d) for oriented directions angularly spaced from those previously followed, until 360° are covered, thus identifying the points G₂, G₃, G₄ corresponding to the photosensitive elements F₂, F₃, F₄; it should be noted that the edge of the sensor 4 identified at each repetition may be the same edge as, or an edge adjacent to, the one previously found; the angular spacing between the directions is suitably selected so that there are at least eight iterations of steps c) and d), preferably at least twelve, so as to identify at least two light sources 18 for each edge of the sensor 4;

f) for each group of light sources illuminating points whose images are formed on photosensitive elements 14 of the same edge of the sensor 4 — for example the light sources G₂, G₃ of Fig. 20 — identifying the straight line connecting that group of light sources on the array 17; and

g) connecting such straight lines to form the perimeter of the polygon (quadrilateral) delimiting the light sources 18 to be switched on.
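Steps f) and g) can be sketched as follows; this minimal illustration (not part of the original disclosure) assumes that each sensor edge has yielded exactly two source positions on the array, fits a line through each pair, and intersects consecutive lines to obtain the quadrilateral of sources to be driven:

```python
def line_through(p, q):
    """Coefficients (a, b, c) of the line a*x + b*y = c through p and q."""
    (x1, y1), (x2, y2) = p, q
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def intersect(l1, l2):
    """Intersection of two lines in (a, b, c) form (Cramer's rule)."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical pairs of identified sources (array row/column coordinates),
# one pair per sensor edge, e.g. (G2, G3) for the top edge:
edges = [((2.0, 3.0), (10.0, 4.0)),    # top
         ((11.0, 4.5), (12.0, 12.0)),  # right
         ((10.5, 13.0), (3.0, 12.0)),  # bottom
         ((2.5, 11.0), (1.8, 3.5))]    # left
lines = [line_through(p, q) for p, q in edges]
# Step g): consecutive lines meet at the corners of the quadrilateral.
corners = [intersect(lines[k], lines[(k + 1) % 4]) for k in range(4)]
print(corners)
```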
For a circular/elliptical sensor, the method is the same, but distinguishing the different edges of the sensor 4 is obviously meaningless; in order to find the boundary of the light sources 18 to be switched on from the identified light sources, a non-linear interpolation between the positions of the identified light sources must be used, such non-linear interpolation being known to those skilled in the art.
A possible implementation of this embodiment is shown in Figure 21, which is split over several sheets.
Step a) above is thus implemented with a first run through the loop. In a first step 150, a counter QUAD is initialized, for example to 0. This counter identifies the region of the array 17 within which the search for the subset 18a of light sources 18 to be switched on is carried out. In a preferred implementation, the value QUAD=0 identifies the whole array 17, and the values QUAD=1 to 4 identify the four quadrants of the array 17. Other partitions of the array 17 may be used. In a subsequent step 151, the central column of the region identified by the current value of the counter QUAD is switched on, so that for QUAD=0 all the light sources 18 of the central column of the array 17 are switched on. In step 152 it is checked whether the region 16 framed by the sensor 4 on the substrate S is at least partly illuminated, in other words whether the illuminated line is at least partly "seen" by the sensor 4. In the negative case, step 153 is reached, wherein the column of light sources 18 currently on is switched off and the central row of the region identified by the current value of the counter QUAD is switched on, all the light sources 18 of the central row of the array 17 being switched on for QUAD=0. In a subsequent step 154, it is again checked whether the region 16 framed by the sensor 4 on the substrate S is at least partly illuminated, in other words whether the illuminated line is at least partly "seen" by the sensor 4. In the negative case, the counter QUAD is increased by 1 in step 155, and in step 156 it is checked whether all the regions into which the array 17 is divided, ideally in particular its four quadrants, have been exhausted (QUAD > QUADmax, in particular QUAD > 4). In the affirmative case the method ends, because there is a design error or a malfunction of the reader 1. If the quadrants have not yet all been examined (QUAD ≤ QUADmax), execution returns to step 151, thus switching on the central column of the quadrant under consideration (and, in step 153, its central row). If step 152 or step 154 gives an affirmative result, this means that the region 16 framed by the sensor 4 on the substrate S is at least partly illuminated by the row or column currently switched on in the array 17. It should be noted that, if the reader 1 is properly designed, the size of the subset 18a of light sources 18 of the array 17 to be switched on at any reading distance D within the depth of field DOF is negligible with respect to the overall size of the array 17, and the iteration with QUAD=0 is usually sufficient.
In step 158, the above step b) is implemented; in other words, the individual light source 18 belonging to the currently switched-on row or column of the array 17 and chosen so as to illuminate a point also "seen" by the sensor 4 is identified and switched on; preferably, the start light source is chosen as the one illuminating the midpoint of the portion, seen by the sensor 4, of the illuminated line on the substrate S. Step 158 may for example comprise: identifying the photosensitive element 14 lying midway between the photosensitive elements 14 of the sensor 4 currently illuminated, and then rapidly switching on, one after the other, all the light sources 18 of the row or column under examination while evaluating the output of that photosensitive element 14.
After step 158, the above step c) is implemented by running a loop of operations d), e). In step 159, four variables used by this loop are initialized: DIR=1, SENSOR_EDGE=FALSE and two positive integer values H, L, whose meaning will be explained hereinafter. The first variable indicates the oriented direction of the array 17 along which the light source 18 illuminating a point whose image is formed on one of the photosensitive elements 14 at an edge of the sensor 4 is searched for. The variable DIR may for example range from 1, corresponding to the column or row direction for which step 152 or step 154, respectively, was successful, up to the maximum number MAX_DIR of directions along which the search is performed. Each direction is rotated on the array 17 by a constant or non-constant angle with respect to the previous one, preferably by 45° so as to obtain eight oriented directions, or more preferably by 30° so as to obtain twelve oriented directions. The second variable SENSOR_EDGE is a flag indicating whether the searched light source (namely one illuminating a point whose image is formed on a photosensitive element 14 at an edge of the sensor 4) has already been found along direction DIR.
At this point, in step 160, H light sources of the array 17 along the oriented direction DIR are switched on. Step 161 follows, wherein it is checked whether at least one of the photosensitive elements 14 at one of the edges of the sensor 4 is illuminated. In the negative case, it is checked in step 162 whether SENSOR_EDGE=TRUE; in the negative case, as happens the first time step 162 is executed, execution returns to step 160, thus "extending" by H light sources the line switched on along direction DIR.
When, in step 161, at least one photosensitive element 14 at one of the edges of the sensor 4 is found to be illuminated, the output is "yes" and step 165 is executed, wherein the flag SENSOR_EDGE is set to TRUE; in the subsequent step 166 the values H and L are decreased; and in the subsequent step 167 it is checked whether H=0 and L=0.
In the negative case, namely if the quantities L, H are still positive, step 168 follows, wherein the light sources 18 switched on along direction DIR are reduced by L; in other words, L light sources 18 are switched off starting from the end of the line along the oriented direction DIR opposite to the start light source. Execution then returns to step 161, thus evaluating whether a photosensitive element 14 at the edge of the sensor 4 is still illuminated. In the affirmative case, steps 165 to 168 are repeated, thus switching off a progressively smaller number L of light sources each time, namely shortening (but each time by less) the illuminated line along direction DIR.
When the check of step 161 gives a negative result but previously gave an affirmative one, the check of step 162 is affirmative because SENSOR_EDGE is TRUE; step 163, wherein the values of the variables H and L are decreased, and step 164, wherein the flag SENSOR_EDGE is set to FALSE, are therefore executed, after which execution returns to step 160. Under these conditions, indeed, a photosensitive element 14 at the edge of the sensor 4 had been illuminated but is no longer so; the illuminated line along direction DIR is therefore "extended" again, so as to return towards illuminating the edge of the sensor 4, but it is extended by a smaller amount.
The above steps are repeated until the values H, L are both zero and the result of step 167 is affirmative, which indicates that the light source 18 illuminating the point whose image is formed on a photosensitive element 14 at the edge of the sensor 4 has been identified. A value indicating this light source 18, typically the pair of row and column indices of the light source 18, is stored in step 169 together with the corresponding edge of the sensor 4, thus implementing the above step d).
After step 169, step 170 is executed, checking whether DIR > MAX_DIR, namely whether the last search direction has been reached; in the negative case, in step 171 the flag SENSOR_EDGE is reset and the counter DIR is increased by 1; then, in step 172, all the currently switched-on light sources 18 other than the start light source (namely the row or column corresponding to the previous value of DIR) are switched off, and execution returns to step 160, repeating the whole loop of searching for the light source 18 illuminating a point whose image is formed on a photosensitive element 14 of the same or an adjacent edge of the sensor 4, and storing this light source 18 together with the edge of the sensor 4.
When the result of step 170 is affirmative, the repetition of the aforementioned step e) is complete. Steps f) and g) are then carried out by step 173 and step 174 respectively: in step 173 the straight lines joining, on the array 17, the light sources 18 found to illuminate points corresponding to photosensitive elements 14 of the same edge of the sensor 4 are found by interpolation, and in step 174 these straight lines are joined, thus defining the vertices of the perimeter of the light sources 18a to be switched on in the array 17.
The use of the parameters L, H is not strictly required, but it speeds up the search for the light source 18 illuminating the point corresponding to a photosensitive element 14 at the edge of the sensor 4. Preferably, the parameters L, H are initially set to a power of 2 and halved each time. Alternatively, the parameters L, H may be decreased by a constant amount each time, in particular by 1.
Alternatively, the light sources 18 may be switched on one at a time along each direction DIR, until the light source 18 illuminating the point corresponding to a photosensitive element 14 at the edge of the sensor 4 is identified directly.
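As an illustration of the role of the parameters H and L, the sketch below reproduces the expand/shrink search of steps 160 to 168 along a single direction DIR, with the preferred power-of-two halving; set_lit_span and lit_edge are hypothetical stand-ins for the driver 13 and sensor 4 interfaces, not part of the patent.

```python
def find_edge_source(max_len, set_lit_span, lit_edge, h=8, l=8):
    """Grow the lit line along direction DIR by H sources until a photosite
    at a sensor edge lights up, then shrink it by L, halving H and L at each
    reversal, until H = L = 0 (steps 160-168). Returns the offset, from the
    start light source, of the boundary source; None if the direction runs
    off the array without ever lighting a sensor edge."""
    n = 0                              # sources lit beyond the start source
    sensor_edge = False                # flag SENSOR_EDGE (step 159)
    while True:
        if not sensor_edge:
            if n >= max_len:
                return None            # step 161 can never succeed: give up
            n = min(n + h, max_len)    # step 160: extend the lit line by H
            set_lit_span(n)
        if lit_edge():                 # step 161 affirmative
            sensor_edge = True         # step 165
            h, l = h // 2, l // 2      # step 166
            if h == 0 and l == 0:      # step 167: boundary source reached
                return n
            n = max(n - l, 0)          # step 168: shorten the lit line by L
            set_lit_span(n)
        elif sensor_edge:              # edge was lit, is no longer (step 162)
            h, l = h // 2, l // 2      # step 163
            sensor_edge = False        # step 164, then back to step 160
            if h == 0 and l == 0:
                return n               # a one-step refinement may follow here
```

With H and L initially set to powers of two, the number of switching operations grows roughly logarithmically with the distance of the boundary source from the start light source, instead of linearly as in the one-source-at-a-time variant.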
The assessment, carried out in steps 121, 124, 152, 154 and 161, of whether and in which way the region 16 framed by the sensor 4 is illuminated may be performed through automatic analysis of the image output by the image forming device 3 of the image capture device 2.
The automatic assessment may be speeded up if it is based on the analysis of only a part of the image, as an alternative to the analysis of the whole output image; in particular, for a one-dimensional sensor 4 it may be based on the analysis of the edges of the image, and for a two-dimensional sensor 4 on the analysis of the rows and columns forming the perimeter of the image, or on the analysis of the central column and/or row only. A partial analysis of this kind exploits the capability of known image sensors referred to as ROI or Multi-ROI, which allows one or more regions of interest (ROI) to be defined and makes the output of the sensor 4 much quicker than reading out the whole framed region. Alternatively or additionally, the image may be captured and assessed at lower resolution, namely by analyzing, for example, only alternate photosensitive elements 14 of the whole sensor 4 or of one or more regions of interest.
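By way of example, such a partial assessment may be sketched as follows; the NumPy frame layout and the intensity threshold are illustrative assumptions, not prescriptions of the patent.

```python
import numpy as np

def framed_region_lit(frame: np.ndarray, threshold: int = 32):
    """Check only the perimeter rows/columns and the central row and column
    of a captured frame, in the spirit of the ROI / Multi-ROI capability.
    Returns (lit, lit_edges), where lit_edges names the lit sensor edges."""
    edges = {'top': frame[0, :], 'bottom': frame[-1, :],
             'left': frame[:, 0], 'right': frame[:, -1]}
    lit_edges = {name for name, px in edges.items()
                 if (px > threshold).any()}
    centre_lit = bool((frame[frame.shape[0] // 2, :] > threshold).any() or
                      (frame[:, frame.shape[1] // 2] > threshold).any())
    return centre_lit or bool(lit_edges), lit_edges
```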
The assessment, carried out in steps 121, 124, 152, 154 and 161, of whether and in which way the region 16 framed by the sensor 4 is illuminated may also be performed visually by an operator, the image acquired by the sensor 4 being displayed on the output device 10. In this case, suitable feedback is provided by the user to the reading system 1 through the manual control and/or data input device 11, in a way similar to the use of the flags in the method of Figure 18. Alternatively, there may be two or more controls allowing the user to respectively increase or decrease the number of light sources switched on (or off) at each edge of the array 17, thus performing functions similar to those of blocks 126, 128, 129, 163, 166.
It must be noted that a further factor comes into play when the real-time determination of the adaptive method and the automatic image assessment are used, namely the tilt of the substrate S with respect to the receiving axis Z. When the substrate S is not perpendicular to the receiving axis Z, the distances of the individual points of the region framed by the sensor 4 fall within a range of distances about the average operating distance D; in this case the adaptive method will provide, as its result, a subset 18a of light sources of the array 17 different from the subset switched on when the substrate S is perpendicular to the receiving axis Z. However, if the image capture device 2 is properly designed, there is no condition in which all the light sources 18 of an array 17 sized for an emission angle T₀ equal to the required minimum emission angle are switched on at the same time.
A reduced form of the adaptive method may also be used to refine the selection of the subset 18a of light sources 18 determined with an analytical method (such as that described above), for example to correct inaccuracies of the array 17 of light sources of each mass-produced image capture device 2. In this case, steps 123 to 131, or 160 to 169, are carried out only in the neighbourhood of the subset 18a calculated with the analytical method, in other words starting from the values indicating the boundary of this subset 18a (indices m, n, quantities p, H, L).
Figures 22 to 27 schematically show some particularly advantageous embodiments of the device 2 for capturing images. For simplicity of exposition, all the embodiments are described assuming that the illumination axis A and the receiving axis Z, the direction or main direction of the sensor 4 and the direction or main direction of the array 17 lie in a common plane; taking the foregoing teachings into account, this is considered to adequately describe the more general cases as well, including the case of a curved array (Fig. 6).
According to the embodiment of Figure 22, the image forming device 3 is according to one of the embodiments of Figures 11 and 13 with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 and 16. The lighting device 6 is according to one of the embodiments of Figures 11 and 13 with the illumination optics 19a, 19c coaxial with the array 17, or according to one of the embodiments of Figures 14 and 16. The receiving axis Z is therefore perpendicular to the sensor 4, and the illumination axis A is therefore perpendicular to the array 17. The illumination axis A is parallel to, but does not coincide with, the receiving axis Z. The array 17 and the sensor 4 may therefore be arranged coplanar, and are advantageously mounted on the same support, on the same printed circuit board, or even made on the same printed circuit board. It should be noted that in this case the lighting device 6 should be designed with a total solid emission angle larger than the required minimum solid emission angle, namely larger than the one corresponding to the maximum illumination beam T₀, so that some light sources 18 of the array 17 are always off. To mitigate this drawback, the array 17 may also be arranged parallel to the sensor 4, but not coplanar with it. The advantage of this embodiment is its simplicity of design and assembly.
On the other hand, the lighting devices 6 of the following embodiments of Figures 23 to 27 may be designed with a solid emission angle equal to the required minimum solid emission angle, without any light sources 18 of the array 17 being always off, so that the array 17 is fully exploited.
According to the embodiment of Figure 23, the image forming device 3 is according to one of the embodiments of Figures 11 and 13 with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 and 16. The lighting device 6 is according to one of the embodiments of Figures 11 and 13 with the illumination optics 19a, 19c not coaxial with the array 17, or according to one of the embodiments of Figures 15 and 17. The receiving axis Z is therefore perpendicular to the sensor 4, while the illumination axis A is tilted with respect to the plane of the array 17 by an angle here indicated by θ₀. The illumination axis A is tilted with respect to the receiving axis Z by the same angle θ=θ₀. The array 17 and the sensor 4 may therefore be arranged in parallel planes, in particular coplanar, with the advantages discussed above with reference to Figure 22. It should be noted that, if the structure of Figure 17 is used for the lighting device 6, a strongly tilted illumination plane is not very advantageous.
According to the embodiment of Figure 24, the image forming device 3 is according to one of the embodiments of Figures 11 and 13 with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 and 16. The lighting device 6 is according to one of the embodiments of Figures 11 and 13 with the illumination optics 19a, 19c coaxial with the array 17, or according to one of the embodiments of Figures 14 and 16. The receiving axis Z is therefore perpendicular to the sensor 4, and the illumination axis A is therefore perpendicular to the array 17. The sensor 4 and the array 17 are arranged in planes forming an angle θ₁ between them, so that the illumination axis A is tilted with respect to the receiving axis Z by the same angle θ=θ₁.
According to the embodiment of Figure 25, the image forming device 3 is according to one of the embodiments of Figures 11 and 13 with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 and 16. The lighting device 6 is according to one of the embodiments of Figures 11 and 13 with the illumination optics 19a, 19c not coaxial with the array 17, or according to one of the embodiments of Figures 15 and 17. The receiving axis Z is therefore perpendicular to the sensor 4, while the illumination axis A is tilted with respect to the plane of the array 17 by an angle here indicated by θ₀. The sensor 4 and the array 17 are arranged in planes forming an angle θ₁ between them, so that the illumination axis A is tilted with respect to the receiving axis Z by the angle θ=θ₁+θ₀. This embodiment allows the angles θ₁ and θ₀ to be kept small in absolute value, and therefore the image capture device 2 to remain small in size, while still obtaining a large angle θ, with greater design freedom thanks to the two parameters at play. This embodiment is particularly useful when the depth of field DOF is concentrated in the region close to the reader 1.
According to the embodiment of Figure 26, the image forming device 3 is according to the embodiment of Figure 17. The lighting device 6 is according to one of the embodiments of Figures 11 and 13 with the illumination optics 19a, 19c coaxial with the array 17, or according to one of the embodiments of Figures 14 and 16. The illumination axis A is therefore perpendicular to the array 17, while the receiving axis Z is tilted with respect to the plane of the sensor 4 by an angle here indicated by θ₂, so that the illumination axis A is tilted with respect to the receiving axis Z by the same angle θ=θ₂. The array 17 and the sensor 4 may therefore be arranged in parallel planes, in particular coplanar, with the advantages discussed above with reference to Figure 22.
According to the embodiment of Figure 27, the image forming device 3 is according to the embodiment of Figure 17. The lighting device 6 is according to one of the embodiments of Figures 11 and 13 with the illumination optics 19a, 19c not coaxial with the array 17, or according to one of the embodiments of Figures 15 and 17. The illumination axis A is therefore tilted with respect to the plane of the array 17 by an angle here indicated by θ₀, and the receiving axis Z is tilted with respect to the plane of the sensor 4 by an angle here indicated by θ₂. The array 17 and the sensor 4 may therefore be arranged in parallel planes, in particular coplanar, with the advantages discussed above with reference to Figure 22, and the illumination axis A is tilted with respect to the receiving axis Z by the angle θ=θ₀+θ₂. This embodiment likewise allows the angles θ₀ and θ₂ to be kept small in absolute value, and therefore the image capture device 2 to remain small in size, while still obtaining a wide angle θ, with greater design freedom thanks to the two parameters at play.
The common non-inverting illumination optics 19a of the embodiments of Figures 13 and 16 may also be arranged with its axis tilted with respect to the array 17, similarly to what has been described with reference to Figure 17. Such a lighting device 6 may advantageously be used in the image capture devices of the embodiments of Figures 23, 25 and 27.
Furthermore, the measures of Figures 11 and 17 may be combined, by tilting and offsetting the inverting illumination optics 19a with respect to the array 17 so as to obtain a large tilt angle θ₀ between the illumination axis A and the normal to the plane of the array 17, with only a small increase in the overall size of the image capture device 2; this is particularly useful when the depth of field DOF is concentrated in the region close to the reader.
With reference to Figure 28, which relates to the lighting device 6 of the embodiment of Figure 11, when the array 17 lies in the object plane of the inverting projection lens 19a, the illumination beam emitted by the array 17 of light sources 18 may be kept in focus over a suitable range of distances in front of the lighting device 6 by suitably selecting the f-number of the projection lens 19a. Such a range of distances should at least correspond to the depth of field DOF of the sensor 4 of the image capture device 2.
Such a range of distances, known in the literature as the image-side depth of focus (W.J. Smith, "Modern Optical Engineering", 3rd ed., McGraw Hill 2000, chap. 6.8), differs according to whether it is measured from the position of focus at distance D' from the projection lens 19a moving away from the lens (δ') or towards it (δ"). However, for large values of the distance D' this difference is negligible, so that δ' = δ" may be assumed; the range of distances is then substantially equal to δ' = D'²·κ/Wₐ = D'·K'/Wₐ, where K' is the design maximum size, in mm, of the image 22ₘ of each light source 18 on the substrate S, κ is the same quantity expressed as an angular blur (K' = D'·tan κ), and Wₐ is the aperture of the projection lens 19a. For small angles, such as those under consideration here, κ ≈ ωₘ or κ ≈ ω'ₘ, where the angles ωₘ, ω'ₘ are indicated in Figures 14 to 16.
The higher the working f-number D'/Wₐ of the projection lens 19a and the focusing distance D', the larger the image-side depth of focus δ", δ'. For example, assuming that an illumination beam T with an emission angle of ±25° is to be focused at a distance D' = 350 mm from the projection lens 19a, and taking an angular blur κ equal to about 2.5% of the size of the illuminated image, namely κ = 1.25°, a working f-number of 35 is sufficient to obtain an image-side depth of focus δ' ≈ δ" = 267 mm; in other words, if DOF = 2δ', the image projected by the array 17 onto the substrate S stays in focus over the whole depth of field DOF of the image forming device 3.
By selecting the aperture Wₐ of the projection lens 19a between 5 mm and 20 mm, preferably between 6 mm and 12 mm, and the focusing distance D' of the lighting device 6 between 100 mm and 350 mm, an image-side depth of focus δ' with the values typical of the applications may be obtained, in other words values matching the typical depth of field DOF of the image capture device 2 and the typical minimum reading distance D₁ and maximum reading distance D₂.
Therefore, provided that the projection lens 19a is suitably selected, the image projected by the lighting device 6 as a projection of the array 17 of light sources 18 has sharp edges at every operating distance D.
Similar considerations apply to the lighting devices of the embodiments of Figures 13 to 17.
In the embodiments of the lighting device 6 of Figures 11, 13, 16 and 17, the illumination optics 19a or 19c preferably comprises a collimation lens having, for example, a constant angular magnification characteristic AMAGₐ, preferably with an angular magnification ratio of 0.7. The illumination optics 19a, 19c preferably has a fixed focal length.
As a specific example, let us consider the structure of Figure 24 with the lighting device of Figure 11, with a one-dimensional array 17 of light sources 18 lying in the plane Z, Y (the plane of Figure 24), comprising mₜₒₜ = 52 light sources 18 arranged along axis Y, spaced M = 100 μm apart, for a total length of 52 × 100 μm = 5.2 mm. Assume that the tilt angle θ₁ between the illumination axis A and the receiving axis Z is θ₁ = 14°, so that cos α₁ = 1.000, cos α₂ = 0.000, cos α₃ = 0.000, cos α₄ = 0.000, cos α₅ = 0.970, cos α₆ = -0.242, cos α₇ = 0.000, cos α₈ = 0.242, cos α₉ = 0.970. The illumination vertex A₀ is at a distance y₀ from the receiving vertex O in the range 0 to 20 mm, for example y₀ = -10 mm, and the illumination vertex A₀ is moved 10 mm towards axis Z, so that its coordinates are A₀(0, -10, 10). Assume also that the image forming device 3, with its one-dimensional sensor 4 arranged along axis X and centred at the origin O, has a constant field of view symmetrical about axis Z, bounded between β₁ = +20° and β₃ = -20° (typically, the field of view β₁ = β₃ is between 10° and 30°). Assume also that the depth of field DOF of the sensor 4 extends between a minimum operating distance D₁ = 30 mm and a maximum operating distance D₂ = 500 mm, so that DOF = 470 mm. Assume finally that the illumination optics 19a has a constant angular magnification characteristic, with magnification ratio AMAGₐ = 0.7, and that the array 17 is arranged at a distance t = -6 mm from the illumination optics 19a. By applying formulas (1) to (15), at each distance D the minimum index m₁ and the maximum index m₂ of the end light sources 18 to be switched on in the array 17 to exactly cover the line framed by the sensor 4 at that distance are obtained. The variation of these indices, for operating distances D sampled with a step of 30 mm, the last step being of 20 mm, is shown in Table 1 below.
Table 1

D (mm) | m₁ | m₂
---|---|---
30 | -25 | 12
60 | -14 | 21
90 | -10 | 23
120 | -9 | 24
150 | -8 | 24
180 | -7 | 25
210 | -7 | 25
240 | -7 | 25
270 | -6 | 25
300 | -6 | 25
330 | -6 | 26
360 | -6 | 26
390 | -6 | 26
420 | -6 | 26
450 | -6 | 26
480 | -5 | 26
500 | -5 | 26
Figure 29 provides a visual representation of the subset 18a of light sources 18 to be switched on in the array 17, shown as a continuous band extending from the minimum index to the maximum index.
From a qualitative standpoint, it is evident from an examination of Table 1 and Figure 29 that, for equal fields of view β₁, β₃ or pointing angles, at operating distances D close to the maximum distance D₂ the positions of the first and last light sources 18a to be switched on in the array 17 do not change appreciably, in other words the end indices m₁ and m₂ vary slowly, whereas at operating distances close to the minimum distance D₁ the positions of the first and last light sources 18a to be switched on in the array 17 undergo larger changes, in other words the indices m₁ and m₂ vary more quickly.
It is also noted that at no operating distance D are all the light sources 18 of the array 17 switched on; rather, at each operating distance D a certain number of light sources 18, starting from at least one edge of the array 17, are off. Moreover, the first and the last light source 18 are switched on at the minimum operating distance D₁ and at the maximum operating distance D₂, respectively. As mentioned above, these optimal conditions can be obtained with any of the embodiments of Figures 23 to 27 if the solid emission angle of the lighting device 6 (encompassed by the maximum illumination beam T₀) equals the required minimum solid emission angle. The structure of Figure 22 may however also be used; simply, in that case some light sources 18 of the array 17 will always be off at all operating distances D.
From Figure 29, and/or from a representation similar to Table 1 but in the form of a look-up table, generally extended to the case of a two-dimensional array 17, and/or by applying the method described above, the indices mᵢ, nᵢ of the end light sources 18 to be switched on in the array 17 are thus obtained, so as to carry out step 101, which determines, at each operating distance D within the depth of field DOF of the sensor 4, the subset 18a of light sources 18 of the array 17 to be switched on to illuminate the whole region 16 framed by the sensor 4 on the substrate S.
If the determination of the subset 18a of light sources 18 of the array 17 to be switched on to illuminate the whole region 16 framed by the sensor 4 on the substrate S is carried out with an analytical method, whether in real time (Fig. 7) or once and for all (Fig. 8), the driver 13 must know the reading distance D.
Such information may be provided by a suitable reading distance measurer, which may be part of the reading system 1 shown in Figure 2 or communicate with it through the communication interface 9. Such a measurer of the reading distance D may be implemented in substantially different, per se known ways, for example by a photocell-based device, by a device based on the measurement of the phase or of the time of flight of a laser, LED, visible-light or IR (infrared) light beam, or by a device of the radar or ultrasonic type, etc.
However, the intrinsic flexibility of the array 17 of individually drivable light sources 18 of the lighting device 6 offers the possibility of illuminating on the substrate S luminous figures of variable shape, size and/or position within the region 16 framed by the sensor 4 as a function of the reading distance D, possibly also at variable wavelength(s); this allows the reading distance D to be measured or estimated, the presence of the substrate S to be detected, and the focusing condition of the image forming device 3 to be measured or estimated.
By acquiring, with the image forming device 3, the image of the substrate S (partly) illuminated by the luminous figure, it is thus possible to estimate, or even accurately measure, the distance at which the substrate S is located, namely the reading distance D. Alternatively, the estimate or measurement of the reading distance D may be made by the user and suitably supplied to the driver 13 through the manual control and/or data input device 11.
For example, with reference to Figures 30 and 31, where the image capture device 2 is for example according to the embodiment of Figure 23, the driver 13 may deliberately drive the array 17 so as to switch on a subset 18b of light sources 18 emitting, for example, within a certain predetermined emission angle φ, so as to illuminate only part of the region 16 framed by the sensor 4. When the reading distance D changes, because of the parallax error between the lighting device 6 and the image forming device 3 (which in this case is not corrected, but intentionally exploited for this purpose), the size and position of the boundary of the luminous figure 23 cast on the substrate S change in the image captured by the sensor 4. In the case shown, the projected luminous figure 23 is a rectangle that widens progressively, from one edge of the region 16 framed by the sensor 4 towards the opposite edge of the region 16, as the reading distance D increases. Similarly, a moving, widening band may be projected or, in the case of a two-dimensional array 17, a cross. If the substrate S is absent or outside the depth of field DOF, the luminous figure 23 does not fall, or only partly falls, within the region 16 framed by the sensor 4, or is excessively blurred, so that the function of detecting the presence of the substrate S is also obtained.
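The way the boundary position encodes the distance can be illustrated with elementary triangulation; in this sketch the baseline between the illumination and receiving vertices, the focal length and the pixel pitch are hypothetical values chosen only to show the principle, not parameters taken from the patent.

```python
def distance_from_pattern_edge(x_pixels: float,
                               baseline_mm: float = 10.0,
                               focal_mm: float = 6.0,
                               pitch_mm: float = 0.006):
    """Estimate reading distance D from the lateral offset, in the captured
    image, of a feature projected by the (uncorrected) lighting device.
    With a pinhole model, offset x = f * b / D, hence D = f * b / x."""
    x_mm = x_pixels * pitch_mm
    if x_mm <= 0:
        return float('inf')        # feature at infinity or not detected
    return focal_mm * baseline_mm / x_mm

# Example: an edge found 100 pixels from its infinity position suggests
# D = 6 * 10 / 0.6 = 100 mm under these assumed parameters.
print(distance_from_pattern_edge(100.0))
```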
In an alternative embodiment, the driver 13 may drive the array 17 so as to switch on a configuration 18b of light sources 18 projecting, for example, a pair of tilted bars, which define a luminous figure that changes continuously with its position on the substrate S as a function of the operating distance D, passing from two separate bars to a V, an X, an inverted V, and again two separate bars with a tilt opposite to the initial one, as described for example in the aforementioned EP 1466292 B1. The X shape may advantageously be associated with the best-focus distance of the image forming device 3. In another embodiment, the driver 13 may drive the array 17 so as to switch on a configuration 18b of light sources 18 projecting, for example, a pair of crosses, which define a luminous figure that changes continuously, between two distinct and differently tilted crosses, with its position on the substrate S as a function of the reading distance D; the single cross obtained at the operating distance D where the two crosses overlap may advantageously be associated with the best-focus distance of the image forming device 3, as described for example in the aforementioned US 5,949,057. The estimate or measurement of the operating distance D may also exploit the fact that the luminous figure 23 projected onto the substrate S progressively loses sharpness, in other words blurs, when moving away from the best-focus distance of the image forming device 3, as described above with reference to Figure 28. These and other similar embodiments therefore also allow the focusing condition to be estimated or assessed and/or, when the illumination is in the visible range, allow visual information about the focusing condition to be provided to the user, implemented by means of the image capture device 2. When, for example, two tilted bars are projected, the luminous figure also advantageously indicates to the user the direction in which the image capture device 2 and the substrate S should be moved relative to each other to achieve the focusing condition.
Alternatively or additionally, as schematically shown in Figures 32 to 34, after the operating distance D has been measured or automatically estimated, or on the basis of information received from a device external to the image capture device 2, the driver 13 may drive the array 17 so as to switch on a configuration 18c of light sources 18 projecting onto the substrate S located at distance D a luminous figure 24 in the visible spectrum that can be understood intuitively, such as the words "TOO FAR" or "TOO CLOSE"; this projection may be accompanied by a blurring of the figure 24, which may be exploited, through a suitable matching of the focal length of the array 17 with that of the receiving device 3, to further convey the intended meaning; in the in-focus condition the word may be "OK".
It should be noted that, while the above configurations 18b are switched on before the driver 13 has determined the subset 18a of light sources 18 illuminating the region 16 framed by the sensor 4 on the substrate S, the configurations 18c now described are switched on after the driver 13 has determined the subset 18a of light sources 18 illuminating the region 16 framed by the sensor 4 on the substrate S, and may therefore advantageously be centred with respect to this subset 18a, as shown in Figure 34.
The intrinsic flexibility of the array 17 of individually drivable light sources 18 of the lighting device 6 also offers the possibility of implementing a result indicator. In this operating mode, the driver 13 drives the array 17 so as to switch on a configuration of light sources 18 that illuminates on the substrate S a luminous figure indicating the positive or negative result of the attempt to capture an image and/or to decode the optical information C, and possibly the cause of a negative result; for example the word "OK", shown in Figure 34 as formed by the configuration 18c of light sources 18, or similarly the word "NO". As an alternative, or in addition to this change in the shape of the luminous figure used to indicate the result, changes in the size, colour and/or position of the luminous figure may be used; for example, any green luminous figure may indicate a positive result and a red luminous figure a negative result. Also in this case, the configuration 18c is preferably centred with respect to the subset 18a.
The intrinsic flexibility of the array 17 of individually drivable light sources 18 of the lighting device 6 also offers the possibility of implementing an aiming device.
Thus, for example, in order to provide an aiming figure for the whole region 16 framed by the sensor 4, which assists the operator in positioning the reader relative to the optical information C by displaying on the substrate S a visual indication of the region 16 framed by the sensor 4, once the subset 18a of light sources 18 to be switched on to illuminate the whole framed region 16 has been determined, the driver 13 may drive the array 17 so as to switch on one or a certain number of light sources 18d at or near the edges of this subset 18a, so as to illuminate the boundary, or one or more parts of the boundary, of the region 16 framed by the sensor 4, for example the corners in the case of a two-dimensional sensor 4, as schematically shown by the luminous aiming figure 26 in Figure 35. Alternatively or additionally, the driver 13 will take care of switching on one or a certain number of light sources 18 illuminating the middle of the four sides of the rectangle or quadrilateral defined by the subset 18a, and/or a certain number of light sources arranged as a cross at the centre of the rectangle or quadrilateral defined by the subset 18a.
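A sketch of how the driver 13 might derive such a configuration 18d from the bounds of the subset 18a (the rectangular-bounds representation of 18a is an illustrative assumption):

```python
def aiming_sources(m1, m2, n1, n2, arm=1):
    """Corner sources of subset 18a plus a small central cross (cf. Figure 35).
    (m1, m2) and (n1, n2) are the column and row bounds of subset 18a."""
    corners = [(m1, n1), (m1, n2), (m2, n1), (m2, n2)]
    cx, cy = (m1 + m2) // 2, (n1 + n2) // 2
    cross = [(cx + d, cy) for d in range(-arm, arm + 1)] + \
            [(cx, cy + d) for d in range(-arm, arm + 1) if d != 0]
    return corners + cross

print(aiming_sources(-6, 25, 0, 31))   # four corners plus a 5-source cross
```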
There are also applications in which it may be advantageous to have the image capture device 2 capture only one or more regions of interest ROI within the region 16 framed by the sensor 4. The provision of the plurality of individually drivable light sources 18 of the array 17 makes it easy to obtain the illumination and/or aiming of the part corresponding to such regions of interest ROI. In this case, the driver 13 drives the array 17 of the lighting device 6 so as to switch on only one or more configurations 18e (not shown) of light sources 18 of the subset 18a determined with one of the above methods, each configuration 18e being sized and positioned with respect to the subset 18a in a way corresponding to how the associated region of interest ROI is sized and positioned with respect to the whole region 16 framed by the sensor 4 on the substrate S.
A first application comprises a reader 1 with a two-dimensional sensor 4 configured as a linear reader. To increase the frame rate and the reading responsiveness, the number of working rows of the sensor 4 may be reduced to just a few rows (down to a single row), and this is known per se; in this case the region of interest ROI is ideally a thin rectangular region, reduced to a row, at the centre of the vertical field of view β₂, β₄ or of the horizontal field of view β₁, β₃. The configuration 18e of light sources 18 switched on by the driver 13 therefore comprises, in one direction, the central light source or a few central light sources of the subset 18a and, in the perpendicular direction, all the light sources of the subset 18a, so as to project a thin light strip onto the substrate S at the region of interest ROI.
A second application is the processing of images of documents having a standardized format or form. By way of example, Figure 36 shows a document or form 200 comprising different types of information to be processed, in particular:
– a region 201 containing information coded in OCR (Optical Character Recognition) form, in other words as written characters recognizable by suitable software;
– a region 202 containing one or more linear and/or two-dimensional optical codes;
– a region 203 containing other coded information in graphic form, such as handwritten text, a signature, a trademark or logo, a stamp, or an image.
Through suitable processing of a first image, or part thereof, captured by the sensor 4, possibly carried out at low resolution, the positions of such regions 201 to 203 within the region 16 framed by the sensor 4 are located in a per se known manner; for simplicity of representation, this region 16 is assumed to coincide with the whole document 200. When the region 16 framed by the sensor 4 extends beyond the whole document 200, the document itself may be regarded as a further region of interest.
Once one or more of such regions 201 to 203 have been located, the driver 13 may drive the array 17 so as to switch on, in a manner similar to that described with reference to Figure 35, only the light sources 18 illuminating at least part of the located region(s) 200 to 203, their centres and/or their boundaries, so as to serve as an aid for aiming and/or for the interactive selection of the located regions actually to be processed, as shown by the aiming luminous figures 26, 26₁, 26₂, 26₃ in Figure 36.
The interactive selection by the user may for example be carried out by displaying the different regions 201 to 203, possibly together with the whole region 16 framed by the sensor 4, each associated with a different number, the numbers themselves also being projected by the lighting device 6 at or near the aiming luminous figures 26, 26₁, 26₂, 26₃ of the located regions 200 to 203; or, when the light sources 18 are adapted to emit, individually or as a whole, at least two different wavelengths in the visible range, by displaying the different aiming luminous figures 26, 26₁, 26₂, 26₃ in different colours. Each number or colour may for example have a different key of the manual input device 11 of the reader 1 associated with it, or there may be one or two keys for cycling through the selectable regions 200 to 203, the selection becoming final after a certain time, by pressing a further key, or in another suitable manner. The region 200 to 203 currently selected may for example be highlighted by brighter or intermittent illumination or the like, or a single region 201 to 203 may be illuminated each time the selection key is pressed.
For the same purpose of interactive selection, or in a subsequent step for one or more located regions 200 to 203 selected by the user, the driver 13 may drive the array 17 so as to switch on a configuration 18e comprising only the light sources 18 illuminating the located regions 200 to 203, so as to provide optimized illumination for the purpose of capturing the image of each region, as shown for example by the illuminated regions 27₂, 27₃ in Figure 37.
When the reader 1 is used with standardized documents or forms, the sizes and positions of the aiming figures 26, 26ᵢ and/or of the partial illumination figures 27, 27ᵢ within the region 16 framed by the sensor 4, which corresponds to the whole form 200, may be preset in the reader 1 in a configuration step, as an alternative to real-time location.
As a further application, in the case of an unattended reader 1, for example for reading optical codes C carried by objects in relative motion with respect to the reader 1, for example optical codes C on a conveyor belt, the driver 13 may drive the array 17 so as to switch on a configuration 18f comprising only the light sources 18 illuminating the region where the optical code C is expected to be found.
Also for the purpose of aiming at and/or selecting a region of interest on the substrate S and/or the whole framed region 16, the region concerned may be illuminated completely, or at least partly (Figure 37), as an alternative to illuminating its centre and/or boundary.
In a further embodiment, shown in Figure 38, two arrays 17, 17a of individually drivable light sources 18 may be arranged on opposite sides of the sensor 4 of the image forming device 3. The two arrays 17, 17a are driven by the driver 13 so that each illuminates at most a respective half 28, 28a of the region 16 framed by the sensor 4. In this case, a large region 16 framed by the sensor 4 may be illuminated more easily.
Alternatively, the two arrays 17, 17a may be driven by the driver 13 symmetrically with respect to the receiving axis Z, so as to double the radiant intensity over the whole region 16 framed by the sensor 4, or over one or more regions of interest within it, by superimposing the emissions of the two arrays 17, 17a, as shown in Figure 39. A more uniform illumination of the region 16 framed by the sensor 4 on the substrate S is also automatically obtained, because where a light source 18 of array 17 illuminates more strongly, being closer to the substrate S, the corresponding light source 18 of array 17a illuminates more weakly, being farther from the substrate S, and vice versa.
In Figures 38 and 39 it is assumed that non-inverting illumination optics, for example comprising individual lenses 19b, are used at the two arrays 17, 17a, and that non-inverting optics are used at the sensor 4, but it is understood that all the structures described above may be used.
Similarly, in a further embodiment (not shown), four arrays of individually drivable light sources 18 may be provided, arranged in particular at the four sides of a rectangular or square sensor 4 of the image forming device 3.
In the several auxiliary functions described above, the lighting device 6 may be made to work at low "resolution", in other words with the subsets 18a, 18b, 18c, 18d, 18e, 18f respectively, by switching on only alternate light sources 18, or by alternately switching on and off groups of one or more light sources 18, so as to consume less energy. Alternatively or additionally, the image forming device 3 may operate at low resolution by analyzing only some photosensitive elements 14, for example only alternate photosensitive elements 14, or groups of photosensitive elements 14, of the whole sensor 4 or of one or more regions of interest thereof; in other words, the reader 1 may implement suitable algorithms for assessing at least one first sample image at low resolution.
Those skilled in the art will readily understand how to apply the concepts and methods described above, and in particular the correspondence, expressed by the above equations, between any point P of the working region 15 of the sensor 4 and the light source(s) 18 of each array 17, 17a to be switched on to illuminate that point P, in order to define precisely the criteria by which the driver 13 selects which light sources 18 of the arrays 17, 17a to switch on, possibly with which intensity and/or emission wavelength(s), so as to implement the various embodiments of illumination of the region 16 framed by the sensor 4 and/or the several further functions described above with reference to Figures 30 to 37.
Those skilled in the art will also understand that, in the various embodiments described, the number of light sources 18 of the arrays 17, 17a and/or their intensity may be selected according to various factors, including: the depth of field DOF of the image forming device 3, the size and resolution of the sensor 4, the cost, and the computing power of the driver 13 or of the processor that builds the look-up table.
It has been found that, in the two-dimensional case, a suitable number of light sources 18 is at least 32 × 32, preferably 64 × 64 or more, or 44 × 32 in the case of a sensor 4 with a 4:3 form factor, preferably 86 × 64 or more. Similarly, in the one-dimensional case, a suitable number of individually addressable light sources 18 is 32, or 64 or more.
The image capture device 2 described above, and in particular its lighting device 6, therefore has significant advantages.
A first advantage is that, although the field of view of the image forming device 3 and the illumination field of the lighting device 6 are not coaxial, any parallax error and perspective distortion error between them is avoided. This allows energy to be saved, because there is no need to extend the illuminated region beyond the region 16 framed by the sensor 4 to allow for parallax error.
Without any moving parts (except in the embodiments with micro-mirrors), the intrinsic flexibility of the array 17 of individually drivable light sources 18 also offers the possibility of easily changing the illuminated region on the substrate S, namely, as described above, by simply switching on only those light sources 18 needed to illuminate the region 16 framed by the sensor 4 on the substrate S, in other words the subset 18a, or by switching on only part of this subset 18a of light sources 18 for the various purposes described above. In other words, the array 17 of individually drivable light sources 18 allows the lighting device 6 of the invention to be used to implement one or more further, different functions that according to the prior art are usually implemented by separate devices, thus reducing the cost and size of the reader 1.
Even when the illumination axis A and the receiving axis Z coincide, the integration of one or more of the aforementioned auxiliary functions into a single image capture device 2 is novel and represents an inventive aspect in itself.
As a variant of forming the array 17 of light sources 18 on a substrate, the array 17 may be formed on a support having an opening at its centre, which allows an arrangement of the array 17 of light sources 18 concentric with the image forming device 3. This solution, which falls outside the scope of claim 1, has the advantage of implementing an arrangement symmetrical with respect to the receiving optical axis Z, at the cost of a perforated support, which is non-standard and complicates the design of the driver 13.
Similarly to the use of zoom and/or autofocus systems, the maximum illumination beam T₀ of the lighting device 6 may also be made dynamically changeable in size and/or proportions by known zoom and/or autofocus systems, such as electromechanical, piezoelectric or electro-optical actuators, for example for moving one or more lenses of the illumination optics, possibly using liquid or deformable lenses, and/or for changing the curvature of one or more lenses of the illumination optics, and/or for moving the array 17.
Other solutions falling outside the scope of claim 1 comprise forming the lighting device with a relatively small number of OLED portions, in particular with irregular portions shaped so that they can be juxtaposed to form a number, for example three, of partly overlapping figures. Based on the reading distance, the irregular portions forming the figure affected by parallax error with respect to the image forming device 3 are switched on. There may also be one or more sequences of rectangles and/or corner portions arranged as one or more rectangles concentric with the figure formed by the irregular portions, the sequences of rectangles and/or corner portions being illuminated to provide an aiming figure.
The illumination optics 19a, 19b, 19c may be absent when the light sources 18 are sufficiently collimated and emit along suitable directions, for example when the array 17 is arranged along a curved surface (Fig. 6).
In the case of an array 17 arranged along a curved surface (Fig. 6), all references to the plane of the array 17 apply to the plane locally tangent to the array 17.
Claims (11)
1. An imager-type optical information reader (1), comprising:
– an image forming device (3), said image forming device (3) comprising a sensor (4), said sensor (4) comprising a one-dimensional or two-dimensional array of photosensitive elements (14) and defining a receiving optical axis (Z), at least one reading distance (D, D₁, D₂) and, at said at least one reading distance (D, D₁, D₂), a region (16, 16₁, 16₂) framed by said sensor (4) on a substrate (S, S₁, S₂),
– a lighting device (6), said lighting device (6) comprising an array (17) of adjacent light sources (18) and defining an illumination optical axis (A),
– wherein each light source (18) is adapted to illuminate a region (22ₘ) of a size smaller than the size of the region (16, 16₁, 16₂) framed by said sensor (4),
– wherein the light sources (18) of said array (17) are micro-emitters made with gallium nitride (GaN) technology,
– and wherein said illumination axis (A) does not coincide with said receiving axis (Z).
2. The imager-type optical information reader (1) according to claim 1, wherein each light source (18) comprises a single illuminating element.
3. The imager-type optical information reader (1) according to any one of claims 1-2, wherein the array (17) of light sources is a one-dimensional array.
4. The imager-type optical information reader (1) according to any one of claims 1-3, wherein the percentage change undergone by the total region illuminated by said lighting device (6) on the substrate (S) when a single light source (18) is switched on/off is smaller than or equal to 15%.
5. The imager-type optical information reader (1) according to claim 4, wherein said percentage change is smaller than or equal to 10%.
6. The imager-type optical information reader (1) according to claim 4, wherein said percentage change is smaller than or equal to 5%.
7. The imager-type optical information reader (1) according to any one of claims 1-6, wherein the image capture device (2) further comprises a second array (17a) of adjacent light sources (18), said second array (17a) defining a second illumination axis that does not coincide with said receiving axis (Z).
8. The imager-type optical information reader (1) according to any one of claims 1-7, wherein the array (17) of light sources (18) is associated with at least one projection lens (19a, 19b, 19c).
9. The imager-type optical information reader (1) according to claim 8, wherein at least one projection lens (19a, 19c) shared by the light sources (18) of the array (17) is provided.
10. The imager-type optical information reader (1) according to any one of claims 8-9, wherein said projection lens (19a, 19b, 19c) is associated with a further optical element.
11. The imager-type optical information reader (1) according to any one of claims 1-10, wherein said light sources (18) can be driven individually.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410830194.8A CN104680113B (en) | 2010-03-11 | 2010-03-11 | Image capture device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080066573.4A CN102870121B (en) | 2010-03-11 | 2010-03-11 | Image capturing device |
CN201410830194.8A CN104680113B (en) | 2010-03-11 | 2010-03-11 | Image capture device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201080066573.4A Division CN102870121B (en) | 2010-03-11 | 2010-03-11 | Image capturing device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104680113A true CN104680113A (en) | 2015-06-03 |
CN104680113B CN104680113B (en) | 2018-05-15 |
Family
ID=53315136
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410830194.8A Active CN104680113B (en) | 2010-03-11 | 2010-03-11 | Image capture device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104680113B (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5168167A (en) * | 1991-01-31 | 1992-12-01 | International Business Machines Corporation | Optical scanner having controllable light sources |
US20010035489A1 (en) * | 2000-03-17 | 2001-11-01 | Chaleff Edward I. | Coplanar camera scanning system |
CN101485235A (en) * | 2006-06-30 | 2009-07-15 | 皇家飞利浦电子股份有限公司 | Device and method for controlling a lighting system by proximity sensing of a spotlight control device and spotlight control device |
US20090039383A1 (en) * | 2007-08-10 | 2009-02-12 | Hong Kong Applied Science and Technology Research Institute | Vertical light emitting diode and method of making a vertical light emitting diode |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107124533A (en) * | 2016-02-25 | 2017-09-01 | 讯宝科技有限责任公司 | Module or arrangement and method for reading objects by image capture |
CN107124533B (en) * | 2016-02-25 | 2020-10-09 | 讯宝科技有限责任公司 | Module or arrangement and method for reading objects by image capture |
CN106980316A (en) * | 2017-02-08 | 2017-07-25 | 南昌大学 | Ultrasonic array obstacle avoidance system for mobile platforms |
CN106980316B (en) * | 2017-02-08 | 2019-10-29 | 南昌大学 | Ultrasonic array obstacle avoidance system for mobile platforms |
CN108594451A (en) * | 2018-03-12 | 2018-09-28 | 广东欧珀移动通信有限公司 | Control method, control device, depth camera and electronic device |
WO2019174436A1 (en) * | 2018-03-12 | 2019-09-19 | Oppo广东移动通信有限公司 | Control method, control device, depth camera and electronic device |
US11441895B2 (en) | 2018-03-12 | 2022-09-13 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Control method, depth camera and electronic device |
CN110222712A (en) * | 2019-04-30 | 2019-09-10 | 杰创智能科技股份有限公司 | Multi-special-item target detection algorithm based on deep learning |
CN110222712B (en) * | 2019-04-30 | 2021-06-22 | 杰创智能科技股份有限公司 | Multi-special-item target detection algorithm based on deep learning |
Also Published As
Publication number | Publication date |
---|---|
CN104680113B (en) | 2018-05-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102870121B (en) | Image capturing device | |
US9292723B2 (en) | Indicia reading terminal including optical filter | |
US7516899B2 (en) | Hand held wireless reading viewer of invisible bar codes | |
ES2820451T3 (en) | Imaging barcode reader with light-emitting diode to generate a field of view | |
CN100483178C (en) | Hand-held imaging-based bar code symbol reader supporting narrow-area and wide-area modes of illumination and image capture | |
EP2577558B1 (en) | Arrangement for and method of generating uniform distributed illumination pattern for imaging reader | |
CN101999128A (en) | Systems and methods for forming a composite image of multiple portions of an object from multiple perspectives | |
CN104680113A (en) | Image capturing device | |
WO2008121483A1 (en) | Compact imaging lens assembly for an imaging-based bar code reader | |
CN108351955A (en) | Compact imaging module with rangefinder | |
MX2013002848A (en) | Device and method for monolithic image perception | |
CN1288892C (en) | Optoelectronic document reader for reading UV/IR visible marks | |
CN102985786A (en) | A light projector and vision system for distance determination | |
FR3055447A1 (en) | More visible aiming mark for optical reader | |
WO2006083531A2 (en) | Asymmetrical scanner | |
EP2572315B1 (en) | Focus adjustment with liquid crystal device in imaging scanner | |
ES2463390T3 (en) | Reverse vending machine and imaging method therefor | |
CN1902508A (en) | Method and apparatus for capturing images using a color laser projection display | |
KR100724118B1 (en) | Optical information reader | |
CN103119610A (en) | Method and apparatus for detecting and/or assessing three-dimensional elevated structures on a surface of a document | |
US20100213258A1 (en) | Arrangement for and method of generating uniform distributed line pattern for imaging reader | |
CN114663638A (en) | Systems, methods, and apparatus for imaging using dual-purpose illuminator | |
JP3222659B2 (en) | Data symbol reading device | |
CN109902650A (en) | Biometric detection module, backlight module and electronic device | |
CN1920888A (en) | Counterfeit detector | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||