
CN118489128A - Imaging system and method of operation - Google Patents


Info

Publication number
CN118489128A
Authority
CN
China
Prior art keywords
image sensor
image
partial images
radiation
partial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202180105044.9A
Other languages
Chinese (zh)
Inventor
曹培炎
刘雨润
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xpectvision Technology Co Ltd
Original Assignee
Shenzhen Xpectvision Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xpectvision Technology Co Ltd filed Critical Shenzhen Xpectvision Technology Co Ltd
Publication of CN118489128A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003Reconstruction from projections, e.g. tomography
    • G06T11/006Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/282Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Algebra (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Measurement Of Radiation (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

Disclosed herein is a method comprising capturing a plurality of partial images of an object with an image sensor of an imaging system. When the image sensor is on one circular track of a plurality of circular tracks (i), i = 1, …, M, the image sensor captures each partial image of the plurality of partial images. All centers of the plurality of circular tracks (i), i = 1, …, M, are on the same axis. All of the plurality of circular tracks (i), i = 1, …, M, have the same radius and are respectively on M different planes perpendicular to the axis. For each value of i, when the image sensor is on the circular track (i), the image sensor captures Ni partial images of the plurality of partial images. M and Ni, i = 1, …, M, are integers greater than 1.

Description

Imaging system and method of operation
[ Background Art ]
A radiation detector is a device that measures the characteristics of radiation. Examples of such characteristics may include the spatial distribution of the intensity, phase and polarization of the radiation. The radiation measured by the radiation detector may be radiation that has been transmitted through the object. The radiation measured by the radiation detector may be electromagnetic radiation, such as infrared light, visible light, ultraviolet light, X-rays or gamma rays. The radiation may be of other types, such as alpha rays and beta rays. The imaging system may include one or more image sensors, each of which may have one or more radiation detectors.
[ Invention ]
Disclosed herein is a method comprising: capturing a plurality of partial images of an object with an image sensor of an imaging system, wherein when the image sensor is on one circular track of a plurality of circular tracks (i), i = 1, …, M, the image sensor captures each partial image of the plurality of partial images, wherein all centers of the plurality of circular tracks (i), i = 1, …, M, are on a same axis, wherein all of the plurality of circular tracks (i), i = 1, …, M, have a same radius and are respectively on M different planes perpendicular to the axis, wherein for each value of i, the image sensor captures Ni partial images of the plurality of partial images when the image sensor is on the circular track (i), and wherein M and Ni, i = 1, …, M, are integers greater than 1.
In one aspect, the capturing the plurality of partial images includes moving the image sensor between the plurality of circular tracks (i), i = 1, …, M.
In one aspect, for each value of i, the image sensor captures the Ni partial images from Ni different image capture locations on the circular track (i).
In one aspect, the method further includes reconstructing a three-dimensional (3D) image of the object based on the plurality of partial images.
In one aspect, the reconstructing the three-dimensional image of the object comprises: reconstructing, for each value of i, a partial three-dimensional image of the object based on the Ni partial images; and combining the M partial three-dimensional images so obtained, thereby obtaining the three-dimensional image of the object.
In one aspect, each point of the object is in at least one partial image of the plurality of partial images.
In one aspect, each point of the object is in at least two partial images of the plurality of partial images, the at least two partial images being captured by the image sensor when the image sensor is on a circular track of the plurality of circular tracks (i), i = 1, …, M.
In one aspect, the image sensor captures the plurality of partial images one by one.
In one aspect, the image sensor captures each partial image of the plurality of partial images as the image sensor moves relative to the object along a circular track of the plurality of circular tracks (i), i = 1, …, M.
In one aspect, for each value of i, the image sensor captures all of the Ni partial images without leaving the circular track (i).
In one aspect, the image sensor has an angular direction, and wherein, for each value of i, the image sensor moves in the angular direction as the image sensor captures the Ni partial images.
In one aspect, for at least one value of i, the image sensor captures at least one partial image of the Ni partial images without capturing all of the Ni partial images, and then moves to another circular track of the plurality of circular tracks (i), i = 1, …, M.
In one aspect, the capturing the plurality of partial images with the image sensor includes rotating the image sensor about the axis.
In an aspect, the capturing the plurality of partial images with the image sensor further comprises translating the image sensor relative to the object in a direction parallel to the axis.
In one aspect, the imaging system includes a radiation source configured to transmit radiation to the object and to the image sensor, wherein the image sensor uses radiation from the radiation source that has passed through the object when capturing the plurality of partial images of the object, and wherein the capturing the plurality of partial images includes rotating the radiation source and the image sensor about the axis while the radiation source and the image sensor remain stationary relative to each other.
In an aspect, the capturing the plurality of partial images further comprises translating the image sensor relative to the object from one circular track of the plurality of circular tracks (i), i = 1, …, M, to another circular track of the plurality of circular tracks (i), i = 1, …, M, along a direction parallel to the axis while the radiation source and the object are stationary relative to each other.
In one aspect, the radiation transmitted by the radiation source comprises X-rays.
In one aspect, the radiation transmitted by the radiation source comprises radiation pulses, and wherein radiation in each of the radiation pulses that has passed through the object is used by the image sensor to capture a partial image of the plurality of partial images.
In one aspect, the image sensor includes P active areas, wherein each of the P active areas includes a plurality of sensing elements, wherein the P active areas are arranged in Q active area rows, wherein each of the Q active area rows includes a plurality of active areas of the P active areas, wherein, for each of the Q active area rows, a straight line perpendicular to the axis intersects all active areas of said each active area row, and wherein P and Q are integers of 1 or more.
In one aspect, any two adjacent active areas of any one of the Q active area rows overlap each other with respect to a direction perpendicular to a best fit plane that intersects all of the sensing elements of the image sensor.
In one aspect, the image sensor further comprises a row gap between any two adjacent active area rows of the Q active area rows, and wherein the row gap is along a direction perpendicular to the axis.
In one aspect, when the image sensor captures the plurality of partial images, the image sensor moves on the circular track (1), then moves on the circular track (2), …, then moves on the circular track (M), wherein for each value of i, i = 1, …, (M-1), for each pair of (A) a first image capture position of the image sensor on the circular track (i) and (B) a second image capture position of the image sensor on the circular track (i+1), a straight line passing through the first capture position and the second capture position is not parallel to the axis.
[ Description of the drawings ]
Fig. 1 schematically shows a radiation detector according to an embodiment.
Fig. 2 schematically shows a simplified cross-sectional view of a radiation detector according to an embodiment.
Fig. 3 schematically shows a detailed cross-sectional view of a radiation detector according to an embodiment.
Fig. 4 schematically shows a detailed cross-sectional view of a radiation detector according to an alternative embodiment.
Fig. 5 schematically illustrates a top view of a radiation detector package including a radiation detector and a Printed Circuit Board (PCB) according to an embodiment.
Fig. 6 schematically illustrates a cross-sectional view of an image sensor including the radiation detector package of fig. 5 mounted to a system printed circuit board (PCB), in accordance with an embodiment.
Fig. 7A to 8D schematically show perspective views of an imaging system in operation according to an embodiment.
Fig. 9 shows a flowchart outlining the operation of an imaging system according to an embodiment.
Fig. 10 schematically illustrates a top view of the image sensor of fig. 6, according to an embodiment.
Fig. 11 illustrates a cross-sectional view of the image sensor of fig. 10 according to an alternative embodiment.
[ Detailed Description ]
Radiation detector
Fig. 1 schematically shows an exemplary radiation detector 100. The radiation detector 100 may include an array of pixels 150 (also referred to as sensing elements 150). The array may be a rectangular array (as shown in fig. 1), a honeycomb array, a hexagonal array, or any other suitable array. The array of pixels 150 in the example of fig. 1 has 4 rows and 7 columns; in general, however, an array of pixels 150 may have any number of rows and any number of columns.
Each pixel 150 may be configured to detect radiation incident thereon from a radiation source (not shown) and may be configured to measure characteristics of the radiation (e.g., the energy, wavelength, and frequency of the particles). The radiation may include particles such as photons and subatomic particles. Each pixel 150 may be configured to count the number of radiation particles incident thereon whose energy falls into each of a plurality of energy bins over a period of time. All pixels 150 may be configured to count the numbers of radiation particles incident thereon within the plurality of energy bins over the same time period. When the incident radiation particles have similar energies, the pixel 150 may simply be configured to count the number of radiation particles incident thereon over a period of time, without measuring the energy of the individual radiation particles.
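As a rough illustration of this counting scheme, the following sketch bins a handful of particle energies into energy bins; the bin edges and energies are assumptions made here for illustration and are not values given in this disclosure.

```python
# Minimal sketch of per-pixel photon counting into energy bins.
# The bin edges and particle energies below are illustrative assumptions,
# not values taken from this disclosure.

def count_into_bins(energies_kev, bin_edges_kev):
    """Count how many incident particles fall into each energy bin."""
    counts = [0] * (len(bin_edges_kev) - 1)
    for e in energies_kev:
        for b in range(len(bin_edges_kev) - 1):
            if bin_edges_kev[b] <= e < bin_edges_kev[b + 1]:
                counts[b] += 1
                break
    return counts

# Example: one pixel sees five particles during an exposure period.
print(count_into_bins([22.0, 34.5, 41.0, 58.2, 61.7],
                      [20.0, 40.0, 60.0, 80.0]))  # -> [2, 2, 1]
```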
Each pixel 150 may have its own analog-to-digital converter (ADC) configured to digitize an analog signal representing the energy of an incident radiation particle into a digital signal, or to digitize an analog signal representing the total energy of a plurality of incident radiation particles into a digital signal. The pixels 150 may be configured to operate in parallel. For example, while one pixel 150 is measuring an incident radiation particle, another pixel 150 may be waiting for a radiation particle to arrive. The pixels 150 may not necessarily be individually addressable.
The radiation detector 100 described herein may have applications such as X-ray telescopes, X-ray mammography, industrial X-ray defect detection, X-ray microscopy, X-ray casting inspection, X-ray non-destructive testing, X-ray weld inspection, X-ray digital subtraction angiography, and the like. It may be appropriate to use the radiation detector 100 in place of a photographic plate, photographic film, a photostimulable phosphor plate (PSP plate), an X-ray image intensifier, a scintillator, or another semiconductor X-ray detector.
Fig. 2 schematically illustrates a simplified cross-sectional view of the radiation detector 100 of fig. 1 along line 2-2, in accordance with an embodiment. Specifically, the radiation detector 100 may include a radiation absorbing layer 110 and an electronics layer 120 (which may include one or more application-specific integrated circuits, ASICs) for processing or analyzing electrical signals generated in the radiation absorbing layer 110 by incident radiation. The radiation detector 100 may or may not include a scintillator (not shown). The radiation absorbing layer 110 may include a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof. The semiconductor material may have a high mass attenuation coefficient for the radiation of interest.
By way of example, fig. 3 schematically shows a detailed cross-sectional view of the radiation detector 100 of fig. 1 along line 2-2. Specifically, the radiation absorbing layer 110 may include one or more diodes (e.g., p-i-n or p-n) formed by a first doped region 111 and one or more discrete regions 114 of a second doped region 113. The second doped region 113 may be separated from the first doped region 111 by an optional intrinsic region 112. The discrete regions 114 may be separated from one another by the first doped region 111 or the intrinsic region 112. The first doped region 111 and the second doped region 113 may have opposite types of doping (e.g., the first doped region 111 is p-type and the second doped region 113 is n-type, or the first doped region 111 is n-type and the second doped region 113 is p-type). In the example of fig. 3, each discrete region 114 of the second doped region 113 forms a diode with the first doped region 111 and the optional intrinsic region 112. That is, in the example of fig. 3, the radiation absorbing layer 110 has a plurality of diodes (more specifically, 7 diodes corresponding to the 7 pixels 150 of one row in the array of fig. 1, of which only 2 pixels 150 are labeled in fig. 3 for simplicity). The plurality of diodes may have an electrical contact 119A as a shared (common) electrode. The first doped region 111 may also have a plurality of discrete portions.
The electronics layer 120 may include an electronic system 121 adapted to process or interpret signals generated by radiation incident on the radiation absorbing layer 110. The electronic system 121 may include analog circuits such as a filter network, amplifiers, integrators, and comparators, or digital circuits such as a microprocessor and memory. The electronic system 121 may include one or more analog-to-digital converters. The electronic system 121 may include components shared by multiple pixels 150 or components dedicated to a single pixel 150. For example, the electronic system 121 may include an amplifier dedicated to each pixel 150 and a microprocessor shared among all pixels 150. The electronic system 121 may be electrically connected to the pixels 150 through vias 131. The space between the vias may be filled with a filler material 130, which may increase the mechanical stability of the connection between the electronics layer 120 and the radiation absorbing layer 110. Other bonding techniques may connect the electronic system 121 to the pixels 150 without the use of vias 131.
When radiation from a radiation source (not shown) impinges on the radiation absorbing layer 110, which includes diodes, the radiation particles may be absorbed and generate one or more charge carriers (e.g., electrons, holes) by a number of mechanisms. The charge carriers may drift under an electric field toward the electrodes of one of the diodes. The electric field may be an external electric field. The electrical contact 119B can include a plurality of discrete portions, each of which is in electrical contact with a discrete region 114. The term "electrical contact" may be used interchangeably with the word "electrode". In one embodiment, the charge carriers may drift in directions such that the charge carriers generated by a single radiation particle are not substantially shared by two different discrete regions 114 ("not substantially shared" here means that less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of the charge carriers flow to a discrete region 114 different from the one to which the rest of the charge carriers flow). Charge carriers generated by a radiation particle incident around the footprint of one of the discrete regions 114 are substantially not shared with another of the discrete regions 114. The pixel 150 associated with a discrete region 114 may be the region around the discrete region 114 in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99%) of the charge carriers generated by a radiation particle incident therein flow to that discrete region 114. That is, less than 2%, less than 1%, less than 0.1%, or less than 0.01% of these charge carriers flow beyond the pixel 150.
Fig. 4 schematically illustrates a detailed cross-sectional view of the radiation detector 100 of fig. 1 along line 2-2, in accordance with an alternative embodiment. More specifically, the radiation absorbing layer 110 may include a resistor of a semiconductor material such as silicon, germanium, GaAs, CdTe, CdZnTe, or a combination thereof, but does not include a diode. The semiconductor material may have a high mass attenuation coefficient for the radiation of interest. In one embodiment, the electronics layer 120 of fig. 4 is similar in structure and function to the electronics layer 120 of fig. 3.
When radiation strikes the radiation absorbing layer 110, which includes a resistor but does not include a diode, it may be absorbed and generate one or more charge carriers by a number of mechanisms. One radiation particle may generate 10 to 100000 charge carriers. The charge carriers may drift under an electric field toward the electrical contacts 119A and 119B. The electric field may be an external electric field. The electrical contact 119B may include a plurality of discrete portions. In one embodiment, the charge carriers may drift in directions such that the charge carriers generated by a single radiation particle are not substantially shared by two different discrete portions of the electrical contact 119B ("not substantially shared" here means that less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of the charge carriers flow to a discrete portion different from the one to which the rest of the charge carriers flow). Charge carriers generated by a radiation particle incident around the footprint of one of the discrete portions of the electrical contact 119B are substantially not shared with another of the discrete portions of the electrical contact 119B. The pixel 150 associated with one discrete portion of the electrical contact 119B may be the region around that discrete portion in which substantially all (more than 98%, more than 99.5%, more than 99.9%, or more than 99.99%) of the charge carriers generated by a radiation particle incident therein flow to that discrete portion of the electrical contact 119B. That is, less than 2%, less than 0.5%, less than 0.1%, or less than 0.01% of these charge carriers flow beyond the pixel associated with that discrete portion of the electrical contact 119B.
Radiation detector package
Fig. 5 schematically shows a top view of a radiation detector package 500 comprising a radiation detector 100 and a Printed Circuit Board (PCB) 510. The term "printed circuit board" as used herein is not limited to a particular material. For example, the printed circuit board may include a semiconductor. The radiation detector 100 may be mounted to a printed circuit board 510. For clarity, wiring between radiation detector 100 and printed circuit board 510 is not shown. Package 500 may have one or more radiation detectors 100. The printed circuit board 510 may include an input/output (I/O) area 512 that is not covered by the radiation detector 100 (e.g., to accommodate bond wires 514). The radiation detector 100 may have an active area 190, which is where the pixels 150 (fig. 1) are located. The radiation detector 100 may have a peripheral region 195 located near the edge of the radiation detector 100. The peripheral region 195 is devoid of pixels 150, and the radiation detector 100 does not detect radiation particles incident on the peripheral region 195.
Image sensor
Fig. 6 schematically shows a cross-sectional view of an image sensor 600 according to an embodiment. The image sensor 600 may include one or more of the radiation detector packages 500 of fig. 5 mounted to a system printed circuit board 650. The electrical connection between the printed circuit board 510 and the system printed circuit board 650 may be made by bond wires 514. To accommodate the bond wires 514 on the printed circuit board 510, the printed circuit board 510 may have input/output areas 512 that are not covered by the radiation detector 100. To accommodate the bond wires 514 on the system printed circuit board 650, there may be a gap between packages 500. The gap may be about 1 mm or greater. Radiation particles incident on the peripheral region 195, the input/output region 512, or the gap cannot be detected by the package 500 on the system printed circuit board 650. The dead zone of a radiation detector (e.g., radiation detector 100) is the area of the radiation receiving surface of the radiation detector on which radiation particles incident thereon cannot be detected by the radiation detector. The dead zone of a package (e.g., package 500) is the area of the radiation receiving surface of the package on which radiation particles incident on the package cannot be detected by one or more radiation detectors in the package. In the example shown in figs. 5 and 6, the dead zone of the package 500 includes the peripheral region 195 and the input/output region 512. Dead zones (e.g., 688) of an image sensor (e.g., image sensor 600) having a set of packages (e.g., multiple packages 500 mounted on the same printed circuit board and arranged in the same layer or different layers) include a combination of the dead zones of the packages and the gaps between packages in the set.
In one embodiment, the radiation detector 100 of FIG. 1 by itself may be considered an image sensor. In one embodiment, the package 500 of FIG. 5 by itself may be considered an image sensor.
The image sensor 600 including the radiation detector 100 may have dead zones 688 between the active areas 190 of the plurality of radiation detectors 100. However, the image sensor 600 may capture a plurality of partial images of an object or scene (not shown) one by one, and then may stitch the captured partial images to form a stitched image of the entire object or scene.
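As a rough sketch of such stitching, the example below pastes partial images into a larger canvas at known pixel offsets. The offsets, image sizes, and the use of NumPy are assumptions made here for illustration only; the disclosure does not prescribe a particular stitching algorithm.

```python
import numpy as np

# Minimal sketch: stitch partial images into one image of the whole object
# or scene. The (row, col) offsets at which each partial image was captured
# are assumed to be known.

def stitch(partials, offsets, full_shape):
    """Paste each partial image into a canvas at its (row, col) offset.
    Later partials overwrite earlier ones where they overlap."""
    canvas = np.zeros(full_shape, dtype=float)
    for img, (r, c) in zip(partials, offsets):
        h, w = img.shape
        canvas[r:r + h, c:c + w] = img
    return canvas

# Example with two 4x6 partial images covering an 8x6 scene.
a = np.ones((4, 6))
b = 2 * np.ones((4, 6))
print(stitch([a, b], [(0, 0), (4, 0)], (8, 6)))
```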
The term "image" in this specification is not limited to the spatial distribution of radiation characteristics (e.g. intensity). For example, the term "image" may also include a spatial distribution of the density of a substance or element.
Imaging system
Fig. 7A-8D schematically illustrate perspective views of an imaging system 700 in operation according to an embodiment. In one embodiment, referring to fig. 7A, an imaging system 700 may include a radiation source 710 and an image sensor 600. In one embodiment, the object 720 may be located between the radiation source 710 and the image sensor 600.
In one embodiment, the radiation source 710 may transmit the radiation beam 712 to the object 720 and to the image sensor 600. The image sensor 600 may take an image of the object 720 using radiation of the radiation beam 712 that has passed through the object 720.
In one embodiment, the radiation beam 712 may comprise X-rays. In one embodiment, the radiation beam 712 may include radiation pulses, wherein radiation in each of the radiation pulses that has passed through the object 720 may be used by the image sensor 600 to capture an image of the object 720.
Operation of an imaging system
Image sensor on first circular track
In one embodiment, referring to fig. 7A-7D, the operation of the imaging system 700 may be as follows. In one embodiment, the radiation source 710 and the image sensor 600 may rotate counterclockwise about the axis 730 while the radiation source 710 and the image sensor 600 remain stationary relative to each other. As a result, the image sensor 600 moves along a first circular track (not shown) from a first start position as shown in fig. 7A, then passes through a first image capturing position as shown in fig. 7B, then passes through a second image capturing position as shown in fig. 7C, and then returns to the first start position as shown in fig. 7D. The rotation of the radiation source 710 and the image sensor 600 need not complete a full revolution.
In one embodiment, the axis 730 may be stationary relative to the object 720. In one embodiment, as shown, the axis 730 may intersect the object 720. In general, the axis 730 may or may not intersect the object 720.
In one embodiment, referring to fig. 7B, as image sensor 600 moves along the first circular track through a first image capture location, image sensor 600 may capture a first partial image of object 720 by using radiation in radiation beam 712 from radiation source 710 that has passed through object 720.
Similarly, in one embodiment, referring to fig. 7C, as image sensor 600 moves along the first circular track through a second image capture location, image sensor 600 may capture a second partial image of object 720 by using radiation in radiation beam 712 from radiation source 710 that has passed through object 720.
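A minimal geometric sketch of this motion, assuming the axis 730 is the z-axis and using illustrative positions and angles (none of which are specified in this disclosure): rotating the source and the sensor by the same angle about the axis moves the sensor to successive image capture locations on a circular track while keeping the source and the sensor stationary relative to each other.

```python
import math

# Sketch: rotating the radiation source and the image sensor together about
# the axis (taken here as the z-axis, standing in for axis 730).
# Positions and capture angles are illustrative assumptions.

def rotate_about_z(point, angle_rad):
    x, y, z = point
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y, z)

source = (-300.0, 0.0, 0.0)   # assumed starting position of radiation source 710
sensor = (500.0, 0.0, 0.0)    # assumed starting position of image sensor 600

for angle_deg in (45.0, 135.0):            # first and second image capture locations
    a = math.radians(angle_deg)
    src_rot = rotate_about_z(source, a)
    sen_rot = rotate_about_z(sensor, a)
    # Source and sensor remain stationary relative to each other:
    assert math.isclose(math.dist(src_rot, sen_rot), math.dist(source, sensor))
    print(angle_deg, sen_rot)
```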
The image sensor is translated to another circular track
In one embodiment, after image sensor 600 returns to the first starting position shown in fig. 7D, image sensor 600 may be translated relative to object 720 in a direction parallel to axis 730 (e.g., into the page) from the first starting position shown in fig. 7D to the second starting position shown in fig. 8A. In fig. 8A, a broken line (except for a broken line on the axis 730) represents the image sensor 600 in the first start position.
In one embodiment, the radiation source 710 and the object 720 may be stationary relative to each other when the image sensor 600 translates from the first starting position to the second starting position as described above.
Image sensor on second circular track
In one embodiment, referring to fig. 8A-8D, after the image sensor 600 reaches the second starting position as shown in fig. 8A, the radiation source 710 and the image sensor 600 may be rotated counterclockwise about the axis 730 while the radiation source 710 and the image sensor 600 remain stationary relative to each other. As a result, the image sensor 600 moves along a second circular track (not shown) from a second start position as shown in fig. 8A, then passes through a third image capturing position as shown in fig. 8B, then passes through a fourth image capturing position as shown in fig. 8C, and then returns to the second start position as shown in fig. 8D. The rotation of the radiation source 710 and the image sensor 600 need not complete a full revolution.
For simplicity, in the above embodiment, the number of image capturing positions on the first circular track is the same as the number of image capturing positions on the second circular track (both numbers are 2). Typically, the number of image capturing positions on each circular track is greater than 1, and need not be the same as the number of image capturing positions on another circular track. For example, the number of image capturing positions on the first circular track may be 2 as described above, and the number of image capturing positions on the second circular track may be 3 (not 2 as described above).
Due to the rotation and translation of the image sensor 600 as described above, the centers of the first circular track and the second circular track are on the axis 730. In addition, the first circular track and the second circular track have the same radius and lie in two different planes perpendicular to the axis 730, respectively.
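This geometry can be written down directly. The sketch below parameterizes the two tracks, taking the axis 730 as the z-axis and using an assumed radius, assumed plane positions, and assumed capture angles (none of which are specified in this disclosure), and checks the stated properties.

```python
import math

# Sketch: two circular tracks with the same radius, centers on the same axis
# (taken as the z-axis here), lying in two different planes perpendicular to
# that axis. All numbers are illustrative assumptions. The capture angles on
# the two tracks need not be the same.

def track_points(radius, z, angles_deg):
    """Image capture positions on a circular track at axial position z."""
    return [(radius * math.cos(math.radians(a)),
             radius * math.sin(math.radians(a)), z) for a in angles_deg]

radius = 500.0                                                     # assumed track radius
track1 = track_points(radius, z=0.0, angles_deg=[45.0, 135.0])    # first track
track2 = track_points(radius, z=50.0, angles_deg=[90.0, 180.0])   # second track, translated along the axis

# Same radius (distance from the axis) for every capture position:
assert all(math.isclose(math.hypot(x, y), radius) for x, y, _ in track1 + track2)
# The two tracks lie in different planes perpendicular to the axis:
assert {p[2] for p in track1} != {p[2] for p in track2}
```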
In one embodiment, referring to fig. 8B, as image sensor 600 moves along the second circular track through the third image capture location, image sensor 600 may capture a third partial image of object 720 by using radiation in radiation beam 712 from radiation source 710 that has passed through object 720.
Similarly, in one embodiment, referring to fig. 8C, as image sensor 600 moves along the second circular track through the fourth image capture location, image sensor 600 may capture a fourth partial image of object 720 by using radiation in radiation beam 712 from radiation source 710 that has passed through object 720.
Flow chart summarizing the operation of an imaging system
Fig. 9 illustrates a flowchart 900 outlining the operation of the imaging system 700 according to an embodiment. At step 910, an image sensor of an imaging system captures a plurality of partial images of an object. For example, in the above-described embodiment, referring to fig. 7A to 8D, the image sensor 600 of the imaging system 700 captures a first partial image, a second partial image, a third partial image, and a fourth partial image of the object 720.
Further, also in step 910, the image sensor captures each of the plurality of partial images when the image sensor is on a circular track of the plurality of circular tracks (i), i = 1, …, M. For example, in the above-described embodiment, referring to fig. 7A to 8D, the image sensor 600 captures each of the first partial image, the second partial image, the third partial image, and the fourth partial image when the image sensor 600 is on one of the first circular track and the second circular track (here, M = 2).
Also in step 910, all centers of the plurality of circular tracks (i), i = 1, …, M, are on the same axis. For example, in the above-described embodiment, referring to fig. 7A to 8D, the centers of the first circular track and the second circular track are on the same axis 730.
Furthermore, also in step 910, all of the circular tracks (i), i = 1, …, M, have the same radius and lie respectively on M different planes perpendicular to the axis. For example, in the above-described embodiment, referring to fig. 7A to 8D, the first circular track and the second circular track have the same radius and are respectively located on 2 different planes perpendicular to the axis 730.
Further, also in step 910, for each value of i, the image sensor captures Ni partial images of the plurality of partial images while the image sensor is on the circular track (i). For example, in the above-described embodiment, referring to fig. 7A to 8D, for i = 1, when the image sensor 600 is on the first circular track, the image sensor 600 captures N1 = 2 partial images (i.e., the first partial image and the second partial image) of the 4 partial images. For i = 2, when the image sensor 600 is on the second circular track, the image sensor 600 captures N2 = 2 partial images (i.e., the third partial image and the fourth partial image) of the 4 partial images.
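Step 910 can be summarized procedurally as a loop over the M circular tracks with Ni image capture positions on each. The sketch below is only an outline; the motion and capture routines are hypothetical placeholders, not an interface defined in this disclosure.

```python
# Illustrative outline of step 910: capture Ni partial images on each of the
# M circular tracks. move_to_track, move_to_position and capture_partial_image
# are hypothetical placeholders standing in for the hardware actions.

def capture_all_partial_images(move_to_track, move_to_position,
                               capture_partial_image, capture_angles_per_track):
    """capture_angles_per_track[i-1] lists the Ni capture angles on track (i)."""
    partial_images = []
    for i, angles in enumerate(capture_angles_per_track, start=1):  # tracks 1..M
        move_to_track(i)                       # translate parallel to the axis
        for angle in angles:                   # Ni capture positions on track (i)
            move_to_position(i, angle)         # rotate about the axis
            partial_images.append(capture_partial_image())
    return partial_images

# Example corresponding to Figs. 7A-8D: M = 2 tracks, N1 = N2 = 2 images each.
images = capture_all_partial_images(
    move_to_track=lambda i: None,
    move_to_position=lambda i, a: None,
    capture_partial_image=lambda: "partial image",
    capture_angles_per_track=[[45.0, 135.0], [45.0, 135.0]],
)
print(len(images))  # -> 4
```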
Other embodiments
Three-dimensional (3D) image of an object
In one embodiment, a three-dimensional image of the object 720 may be reconstructed based on the first partial image, the second partial image, the third partial image, and the fourth partial image of the object 720. Specifically, in one embodiment, a first partial three-dimensional image of the object 720 may be reconstructed based on the first partial image and the second partial image of the object 720. Similarly, a second local three-dimensional image of the object 720 may be reconstructed based on the third partial image and the fourth partial image of the object 720. Then, a three-dimensional image of the object 720 may be created by combining the first partial three-dimensional image and the second partial three-dimensional image of the object 720. Note that the partial three-dimensional image of the object 720 is a three-dimensional image of a portion of the object 720.
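A minimal sketch of the combining step, assuming each partial three-dimensional image is already available as an array covering a slab of the object along the axis. The slab shapes, the axial ordering, and the reconstruction that produces each partial volume are assumptions made for illustration; the disclosure does not specify a particular reconstruction algorithm.

```python
import numpy as np

# Sketch: combine partial 3D images (each covering a slab of the object along
# the axis) into one 3D image by stacking them in axial order. Shapes are
# illustrative; the partial reconstructions are assumed to already exist.

def combine_partial_volumes(partial_volumes):
    """Concatenate partial 3D images along the axis (axis 0 here)."""
    return np.concatenate(partial_volumes, axis=0)

partial_3d_1 = np.zeros((16, 64, 64))   # reconstructed from the 1st and 2nd partial images
partial_3d_2 = np.ones((16, 64, 64))    # reconstructed from the 3rd and 4th partial images
full_3d = combine_partial_volumes([partial_3d_1, partial_3d_2])
print(full_3d.shape)  # -> (32, 64, 64)
```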
The object is imaged completely
In one embodiment, the object 720 may be fully imaged or scanned. In other words, referring to step 910 of FIG. 9, in general, each point of the object is in at least one of the plurality of partial images. In the above-described embodiment, each point of the object 720 is in at least one of the first partial image, the second partial image, the third partial image, and the fourth partial image.
In an alternative embodiment, each point of object 720 is located in at least two partial images captured by image sensor 600 while image sensor 600 is on one circular track. In other words, in the above-described embodiment, each point of the object 720 is at least in (A) the first partial image and the second partial image or (B) the third partial image and the fourth partial image.
Details of image sensor
Fig. 10 schematically illustrates a top view of the image sensor 600 of fig. 6, according to an embodiment. Note that fig. 6 schematically shows a cross-sectional view of the image sensor 600 of fig. 10 along line 6-6, according to an embodiment. However, in fig. 10, the peripheral region 195 is not shown for simplicity.
In one embodiment, referring to fig. 6 and 10, an image sensor 600 may have 2 radiation detector packages 500, each of which may have 3 active areas 190. In general, the image sensor 600 may have a plurality of radiation detector packages 500, each radiation detector package 500 may have a plurality of active areas 190. As shown in fig. 10, 6 active areas 190 of the image sensor 600 may be arranged in 2 active area rows, each active area row having 3 active areas 190.
In one embodiment, the 2 active area rows may be on 2 rows of printed circuit boards 510, respectively. In one embodiment, the 2 rows of printed circuit boards 510 may be on the system printed circuit board 650.
In one embodiment, direction 1091 may be parallel to the active area rows of image sensor 600. In other words, for each of the 2 active area rows of the image sensor 600, a straight line parallel to the direction 1091 intersects all 3 active areas 190 of said each active area row.
In one embodiment, axis 730 (fig. 7A-8D) may be selected such that direction 1091 is perpendicular to axis 730. As a result, the 2 active area rows of the image sensor 600 are perpendicular to the axis 730. In other words, for each of the 2 active area rows of the image sensor 600, a straight line perpendicular to the axis 730 intersects all 3 active areas 190 of said each active area row.
In one embodiment, referring to fig. 10, the image sensor 600 may include column gaps 192 between any 2 adjacent active areas 190 of any one of the 2 active area rows of the image sensor 600. In one embodiment, each column gap 192 of image sensor 600 may be along a direction 1092 that is perpendicular to direction 1091.
In one embodiment, referring to fig. 10, the image sensor 600 includes 2 rows of input/output regions 512 for 2 active area rows, respectively. In one embodiment, 2 rows of input/output regions 512 and 2 active region rows may be arranged in an alternating fashion as shown in fig. 10.
In the above-described embodiment, referring to fig. 10, there is a column gap 192 between any 2 adjacent active areas 190 of any one of 2 active area rows of the image sensor 600. In alternative embodiments, any 2 adjacent active areas 190 in any one of the 2 active area rows of the image sensor 600 may overlap each other with respect to a direction perpendicular to a best fit plane that intersects all of the sensing elements 150 of the image sensor 600 (the best fit plane is not shown, but should be parallel to the page of fig. 10). In other words, for any 2 adjacent active areas 190, there is a straight line perpendicular to the best fit plane and intersecting any 2 adjacent active areas 190.
For example, referring to FIG. 11 (which shows a cross-sectional view of the image sensor 600 of FIG. 10 along line 11-11 in the case of the alternative embodiment described above), the 2 left active areas 190 (which are adjacent) overlap one another with respect to a direction 1120 perpendicular to the best-fit plane.
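The overlap condition can be viewed as an interval test: projected onto the best fit plane, the footprints of the two adjacent active areas share some extent along the row direction (direction 1091). The sketch below encodes that test with assumed, illustrative coordinates.

```python
# Sketch: two adjacent active areas "overlap with respect to a direction
# perpendicular to the best fit plane" if their footprints projected onto
# that plane share some extent along the row direction (direction 1091).
# Coordinates are illustrative assumptions.

def footprints_overlap(area_a, area_b):
    """Each area is (start, end) of its projected extent along direction 1091."""
    a0, a1 = area_a
    b0, b1 = area_b
    return max(a0, b0) < min(a1, b1)

# Two adjacent active areas in different wafer layers (as in Fig. 11):
left_area = (0.0, 10.5)    # mm along direction 1091 (assumed)
right_area = (9.5, 20.0)   # starts before the left one ends, so they overlap
print(footprints_overlap(left_area, right_area))  # -> True
```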
In one embodiment, referring to fig. 11, the 2 left active areas 190 described above may be located in two different wafer layers 1101 and 1102, respectively. In one embodiment, the image sensor 600 may be manufactured as follows. The components of image sensor 600 may be formed on two separate wafer layers 1101 and 1102, and then the two wafer layers 1101 and 1102 may be bonded together, resulting in image sensor 600 of FIG. 11.
In one embodiment, referring back to fig. 10, the image sensor 600 may include a row gap 194 between 2 adjacent active area rows. In one embodiment, row gap 194 may be along a direction perpendicular to axis 730 (fig. 7A-8D).
No two image capturing positions on two temporally consecutive circular tracks are at the same angle
In one embodiment, referring to fig. 7A to 8D, no two image capturing positions, respectively on two temporally consecutive circular tracks, may be at the same angle. In other words, in the above-described embodiment, for each pair of (A) an image capturing position of the image sensor 600 on the first circular track and (B) an image capturing position of the image sensor 600 on the second circular track, the straight line passing through the two image capturing positions is not parallel to the axis 730.
Note that in the above-described embodiment, the first circular track and the second circular track are two temporally consecutive circular tracks because the image sensor 600 moves on the first circular track and then moves on the second circular track, and does not move on a third circular track after moving on the first circular track and before moving on the second circular track.
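Because the two tracks share the same axis and radius, two capture positions at the same angle would differ only along the axis, so the line through them would be parallel to the axis. The condition therefore amounts to requiring disjoint sets of capture angles on the two consecutive tracks; a small sketch of that check, with assumed angles:

```python
# Sketch: on two temporally consecutive circular tracks, no image capture
# position on one track shares its angle about the axis with a position on
# the other track. Angles are illustrative assumptions.

def no_shared_angles(angles_track_i, angles_track_next, tol_deg=1e-6):
    def angular_diff(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return all(angular_diff(a, b) > tol_deg
               for a in angles_track_i for b in angles_track_next)

print(no_shared_angles([45.0, 135.0], [90.0, 180.0]))  # -> True (condition met)
print(no_shared_angles([45.0, 135.0], [45.0, 180.0]))  # -> False (angle 45 shared)
```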
Alternative embodiment
The image sensor leaves a circular track before taking all partial images associated with the circular track
In the above-described embodiment, the image sensor 600 captures all partial images associated with a circular track without leaving that circular track. In other words, the image sensor 600 does not leave the circular track until the image sensor 600 has captured all partial images associated with that circular track. For example, the image sensor 600 does not leave the first circular track until the image sensor 600 has captured all partial images (i.e., the first partial image and the second partial image) associated with the first circular track.
In an alternative embodiment, the image sensor 600 may leave a circular track before the image sensor 600 has captured all partial images associated with that circular track. For example, the image sensor 600 may capture the first partial image as the image sensor 600 moves along the first circular track past the first image capture location. The image sensor 600 may then be translated from the first circular track to the second circular track. Then, the image sensor 600 may capture the third partial image as the image sensor 600 moves along the second circular track through the third image capturing position.
The image sensor 600 may then be translated from the second circular track back to the first circular track. Then, the image sensor 600 may capture the second partial image as the image sensor 600 moves along the first circular track through the second image capturing position. The image sensor 600 may then again be translated from the first circular track to the second circular track. Then, the image sensor 600 may capture the fourth partial image as the image sensor 600 moves along the second circular track through the fourth image capturing position.
The image sensor moves in different angular directions
In the above-described embodiment, the image sensor 600 moves in the same angular direction (i.e., counterclockwise) along the first circular track and the second circular track. In alternative embodiments, the image sensor 600 may move in different angular directions along the first circular track and the second circular track. For example, as described above, the image sensor 600 may move counterclockwise along the first circular track to capture the first partial image and the second partial image, but the image sensor 600 may move clockwise along the second circular track to capture the third partial image and the fourth partial image.
In yet another alternative embodiment, the image sensor 600 may reverse its angular direction on a circular track. For example, when the image sensor 600 captures the first partial image of the object 720, the image sensor 600 may move counterclockwise along the first circular track through the first image capturing position, but when the image sensor 600 captures the second partial image of the object 720, the image sensor 600 may reverse its angular direction and then move clockwise along the first circular track through the second image capturing position.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (22)

1. A method, comprising:
capturing a plurality of partial images of an object with an image sensor of an imaging system,
wherein when the image sensor is on a circular track of a plurality of circular tracks (i), i = 1, …, M, the image sensor captures each partial image of the plurality of partial images,
wherein all centers of the plurality of circular tracks (i), i = 1, …, M, are on the same axis,
wherein all of the plurality of circular tracks (i), i = 1, …, M, have the same radius and are respectively on M different planes perpendicular to the axis,
wherein for each value of i, when the image sensor is on the circular track (i), the image sensor captures Ni partial images of the plurality of partial images, and
wherein M and Ni, i = 1, …, M, are integers greater than 1.
2. The method according to claim 1,
wherein said capturing said plurality of partial images comprises moving said image sensor between said plurality of circular tracks (i), i = 1, …, M.
3. The method according to claim 1,
wherein for each value of i, the image sensor captures the Ni partial images from Ni different image capturing positions on the circular track (i).
4. The method of claim 1, further comprising reconstructing a three-dimensional (3D) image of the object based on the plurality of partial images.
5. The method of claim 4, wherein the reconstructing the three-dimensional image of the object comprises:
reconstructing a partial three-dimensional image of the object based on the Ni partial images for each value of i; and
combining the M partial three-dimensional images so obtained, thereby obtaining the three-dimensional image of the object.
6. The method according to claim 1,
wherein each point of the object is in at least one partial image of the plurality of partial images.
7. The method according to claim 1,
wherein each point of the object is in at least two partial images of the plurality of partial images, the at least two partial images being captured by the image sensor when the image sensor is on a circular track of the plurality of circular tracks (i), i = 1, …, M.
8. The method according to claim 1,
wherein the image sensor captures the plurality of partial images one by one.
9. The method according to claim 1,
wherein the image sensor captures each partial image of the plurality of partial images as the image sensor moves relative to the object along a circular track of the plurality of circular tracks (i), i = 1, …, M.
10. The method according to claim 1,
wherein for each value of i, the image sensor captures all of the Ni partial images without leaving the circular track (i).
11. The method according to claim 10,
wherein the image sensor has an angular direction, and
wherein for each value of i, the image sensor moves in the angular direction when the image sensor captures the Ni partial images.
12. The method according to claim 1,
wherein for at least one value of i, the image sensor captures at least one partial image of the Ni partial images without capturing all of the Ni partial images, and then moves to another circular track of the plurality of circular tracks (i), i = 1, …, M.
13. The method according to claim 1,
wherein said capturing said plurality of partial images with said image sensor comprises rotating said image sensor about said axis.
14. The method according to claim 13,
wherein said capturing said plurality of partial images with said image sensor further comprises translating said image sensor relative to said object in a direction parallel to said axis.
15. The method according to claim 1,
wherein the imaging system comprises a radiation source configured to transmit radiation to the object and to the image sensor,
wherein the image sensor uses radiation from the radiation source that has passed through the object when capturing the plurality of partial images of the object, and
wherein said capturing said plurality of partial images comprises rotating said radiation source and said image sensor about said axis while said radiation source and said image sensor remain stationary relative to each other.
16. The method according to claim 15,
wherein the capturing the plurality of partial images further comprises translating the image sensor relative to the object from one circular track of the plurality of circular tracks (i), i = 1, …, M, to another circular track of the plurality of circular tracks (i), i = 1, …, M, along a direction parallel to the axis, while the radiation source and the object are stationary relative to each other.
17. The method according to claim 15,
wherein the radiation transmitted by the radiation source comprises X-rays.
18. The method according to claim 15,
wherein the radiation transmitted by the radiation source comprises radiation pulses, and
wherein radiation in each of the radiation pulses that has passed through the object is used by the image sensor to capture a partial image of the plurality of partial images.
19. The method according to claim 1,
wherein the image sensor comprises P active areas,
wherein each of the P active areas includes a plurality of sensing elements,
wherein the P active areas are arranged in Q active area rows,
wherein each of the Q active area rows includes a plurality of active areas of the P active areas,
wherein, for each of the Q active area rows, a straight line perpendicular to the axis intersects all active areas of said each active area row, and
wherein P and Q are integers of 1 or more.
20. The method according to claim 19,
wherein any two adjacent active areas of any one of the Q active area rows overlap each other with respect to a direction perpendicular to a best fit plane intersecting all sensing elements of the image sensor.
21. The method according to claim 19,
wherein the image sensor further comprises a row gap between any two adjacent active area rows of the Q active area rows, and
wherein the row gap is along a direction perpendicular to the axis.
22. The method according to claim 1,
wherein when the image sensor captures the plurality of partial images, the image sensor moves on the circular track (1), then moves on the circular track (2), …, then moves on the circular track (M),
wherein for each value of i, i = 1, …, (M-1), for each pair of (A) a first image capture position of the image sensor on the circular track (i) and (B) a second image capture position of the image sensor on the circular track (i+1), a straight line passing through the first capture position and the second capture position is not parallel to the axis.
CN202180105044.9A (priority date 2021-12-24, filing date 2021-12-24): Imaging system and method of operation, pending; published as CN118489128A.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/141110 WO2023115516A1 (en) 2021-12-24 2021-12-24 Imaging systems and methods of operation

Publications (1)

Publication Number Publication Date
CN118489128A 2024-08-13

Family

ID=86901068

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180105044.9A Pending CN118489128A (en) 2021-12-24 2021-12-24 Imaging system and method of operation

Country Status (3)

Country Link
CN (1) CN118489128A (en)
TW (1) TW202326175A (en)
WO (1) WO2023115516A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9743893B2 (en) * 2011-12-21 2017-08-29 Carestream Health, Inc. Dental imaging with photon-counting detector
CN103873751A (en) * 2014-03-28 2014-06-18 陈维龙 Three-dimensional panoramic scanning device and three-dimensional module generating method
US9996944B2 (en) * 2016-07-06 2018-06-12 Qualcomm Incorporated Systems and methods for mapping an environment
CN114915827A (en) * 2018-05-08 2022-08-16 日本聚逸株式会社 Moving image distribution system, method thereof, and recording medium

Also Published As

Publication number Publication date
WO2023115516A1 (en) 2023-06-29
TW202326175A (en) 2023-07-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination