DK202370162A1 - Optical 3d scanner with improved accuracy - Google Patents
- Publication number: DK202370162A1 (application DKPA202370162A)
- Authority
- DK
- Denmark
Classifications
- G01B11/25 — Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
- G01B11/2513 — Pattern with several lines projected in more than one direction, e.g. grids, patterns
- G01B11/2518 — Projection by scanning of the object
- A61B1/00096 — Endoscope body, insertion part, distal tip: optical elements
- A61B1/24 — Instruments for the mouth, i.e. stomatoscopes
- A61B5/0088 — Measuring for diagnostic purposes using light, adapted for oral or dental tissue
- A61C9/0053 — Means for taking digitized impressions: optical means or methods, e.g. scanning the teeth by a laser or light beam
- A61C9/006 — Optical means projecting one or more stripes or patterns on the teeth
- A61C9/0066 — Depth determination through adaptive focusing
Abstract
The present disclosure relates to an optical system for an intraoral scanner, comprising: at least one projector unit comprising: a light source for generating light; a pattern generating element configured for generating a pattern of light to be projected on a surface of an object; and one or more projector focus lenses for focusing the pattern of light, wherein the projector focus lenses define a projector optical axis; and an aperture having a predetermined size. The optical system further comprises one or more camera units, each camera unit comprising: an image sensor for acquiring one or more image(s); and one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis; wherein the projector optical axis and the camera optical axis are non-parallel. The present disclosure further relates to an intraoral scanner comprising such an optical system.
Description
DK 2023 70162 A1
OPTICAL 3D SCANNER WITH IMPROVED ACCURACY
The present disclosure relates to an intraoral 3D scanner for generating a three-dimensional representation of a scanned object. The disclosure further relates to a scan unit for an intraoral 3D scanner, wherein the scan unit comprises multiple camera units. The disclosure further relates to a method for generating a three-dimensional representation.
For dental scanning applications it is generally desired to resolve fine details on the patient's teeth in order to generate an accurate 3D representation of said teeth. This in turn allows more accurate restorations, such as bridges and crowns, to be digitally designed so that they fit accurately in the patient's mouth. Generally, it is known that the resolution of the resulting 3D representation scales with the number of pattern features in the projected pattern, since said number determines the number of 3D points obtainable from a single exposure. For a given area, a higher number of pattern features results in a denser pattern, and consequently the size of the pattern features must be reduced in order to fit more features within the same area. An associated problem herewith is that smaller pattern features place greater demands on the optics of the scanner.
In particular, it is more difficult to project a pattern with many small pattern features, such that the pattern features are in focus in a wide focus range. It is generally the case for any optical system that a given point or feature projected through the optical system will have a certain spread (blurring) when imaged by the optical system. The degree of spreading of the point can be described by a point spread function (PSF). The resolution of the 3D representation is limited by the spreading of the features described by the PSF, since the features need to be sharp in the images in order to accurately determine the 3D points for generating the 3D representation. In other words, the minimum feature size in the pattern is limited by the imaging resolving power of the optics of the scanner. The imaging resolution is limited primarily by three effects: defocus, lens aberrations, and diffraction.
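As a rough, illustrative sketch (not part of the disclosure), the diffraction contribution to the PSF can be estimated from the Airy-disk radius. The wavelength and numerical-aperture values below are hypothetical, chosen only to show the order of magnitude involved:

```python
import math

def airy_disk_radius(wavelength_mm: float, numerical_aperture: float) -> float:
    """First-zero radius of the Airy pattern (diffraction-limited PSF width).

    Illustrative only: real PSFs also include defocus and aberration terms.
    """
    return 0.61 * wavelength_mm / numerical_aperture

# Example: green light (550 nm) at two small numerical apertures.
wavelength = 550e-6  # mm
for na in (0.0035, 0.015):
    r = airy_disk_radius(wavelength, na)
    print(f"NA={na}: diffraction spot radius = {r * 1000:.1f} um")
```

At such small apertures the diffraction spot is tens of micrometres wide, which is why diffraction cannot be ignored when shrinking the pattern features.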
The aperture of the projector unit or camera(s) influences the imaging resolution primarily via the aforementioned three effects. A small aperture is advantageous for minimizing the negative effects of defocus and lens aberrations, since a camera or projector having a small aperture is highly tolerant of defocus. The extreme example is a pinhole camera, in which case all objects are in focus almost regardless of their distance from the pinhole aperture. However, a small aperture causes more diffraction, which negatively affects the imaging resolution.
Thus, it is difficult to optimize the imaging resolution by changing the size of the aperture, since the three effects are affected differently by the size of the aperture.
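The trade-off can be illustrated with a crude model in which geometric defocus blur grows with the pupil diameter while diffraction blur shrinks with it, combined in quadrature. All numbers are hypothetical and the model is a sketch, not the disclosed design method:

```python
import math

def total_blur(d_mm, wavelength_mm=550e-6, wd_mm=30.0, defocus_mm=5.0):
    """Crude blur model (hypothetical numbers): geometric defocus blur
    grows with the pupil diameter d_mm, diffraction blur shrinks with it;
    the two contributions are combined in quadrature."""
    geometric = d_mm * defocus_mm / wd_mm              # defocus blur diameter (mm)
    diffraction = 2.44 * wavelength_mm * wd_mm / d_mm  # Airy-disk diameter (mm)
    return math.hypot(geometric, diffraction)

# Sweep pupil diameters from 0.1 mm to 2.0 mm; the minimum total blur
# occurs at an intermediate aperture, illustrating the trade-off.
best = min((total_blur(0.1 * k), 0.1 * k) for k in range(1, 21))
print(f"best pupil = {best[1]:.1f} mm, blur = {best[0] * 1000:.0f} um")
```

In this toy model the optimum falls at an intermediate pupil diameter of around half a millimetre, consistent with the intermediate pupil sizes discussed later in the disclosure.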
Existing 3D scanners typically utilize either dynamic patterns that change in time or structured light patterns of lower density, due to the difficulties associated with the projection of high-density patterns. In particular, it is challenging to project a high-density pattern with a large depth of focus (i.e. in a large focus range) for a short working distance of the optical system.
A short working distance is typically desired for dental scanning applications, since it is desired to be able to scan the patient's teeth in very close proximity to the teeth. A larger working distance either requires more space inside the scanner, or it requires the operator to hold the scanner at some predetermined distance above the teeth in order to have the projected pattern in focus; both of which are undesired. A large depth of focus is desired in order to capture all detail, e.g. of steep surfaces and cavities inside the patient's mouth. Similarly, a large depth of focus is more tolerant to changes in the height above the teeth at which the scanner is held.
Thus, it is desired to develop an improved 3D scanner system and intraoral scanner overcoming the abovementioned challenges. In particular, it is desired to improve the imaging resolution and accuracy of 3D scanner systems.
The present disclosure addresses the above-mentioned challenges by providing an intraoral scanner comprising:
- at least one projector unit comprising:
  - a light source for generating light;
  - a pattern generating element configured for generating a pattern of light to be projected on a surface of an object; and
  - one or more projector focus lenses for focusing the pattern of light, wherein the projector focus lenses define a projector optical axis; and
- one or more camera units, such as two or more camera units, each camera unit comprising:
  - an image sensor for acquiring one or more image(s); and
  - one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis.
In accordance with some embodiments of the present disclosure, the scanner is a triangulation-based intraoral scanner comprising:
- at least one projector unit comprising:
  - a light source for generating light;
  - a pattern generating element configured for generating a pattern of light to be projected on a surface of an object;
  - one or more projector focus lenses for focusing the pattern of light, wherein the projector focus lenses define a projector optical axis; and
  - an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm; and
- one or more camera units, such as two or more camera units, each camera unit comprising:
  - an image sensor for acquiring one or more image(s); and
  - one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis;

wherein the projector optical axis and the camera optical axis are non-parallel, wherein the working distance of the projector unit and/or a given camera unit is between 15 mm and 50 mm, and wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.0035 and 0.015.
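Under a thin-lens, small-angle approximation, pupil diameter D, working distance WD, and numerical aperture are related by NA ≈ (D/2)/WD. The following sketch (illustrative only, not part of the disclosure) checks that the stated NA range is geometrically consistent with the stated pupil-diameter and working-distance ranges:

```python
def numerical_aperture(pupil_diameter_mm: float, working_distance_mm: float) -> float:
    # Small-angle approximation: NA = sin(theta) ~ (D / 2) / WD
    return (pupil_diameter_mm / 2.0) / working_distance_mm

# Geometric corners of the stated ranges (pupil 0.2-0.7 mm, WD 15-50 mm):
na_min = numerical_aperture(0.2, 50)  # smallest pupil, longest working distance
na_max = numerical_aperture(0.7, 15)  # largest pupil, shortest working distance
print(na_min, na_max)
# The stated NA range (0.0035-0.015) lies inside these geometric extremes,
# so the three claimed ranges are mutually consistent under this approximation.
```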
The present disclosure further relates to an optical system for an intraoral scanner as disclosed herein, the optical system comprising:
- at least one projector unit comprising:
  - a light source for generating light;
  - a pattern generating element configured for generating a pattern of light to be projected on a surface of an object;
  - one or more projector focus lenses for focusing the pattern of light, wherein the projector focus lenses define a projector optical axis; and
  - an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm; and
- one or more camera units, such as two or more camera units, each camera unit comprising:
  - an image sensor for acquiring one or more image(s); and
  - one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis;
wherein the projector optical axis and the camera optical axis are non-parallel, wherein the working distance of the projector unit and/or a given camera unit is between 15 mm and 50 mm, and wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.0035 and 0.015.
The present disclosure further relates to a 3D scanner system for generating a three-dimensional representation of an object, the 3D scanner system comprising:
- an intraoral scanner according to any of the embodiments disclosed herein; and
- one or more processors configured for generating a three-dimensional representation of the object based on image(s) obtained by the camera unit(s).
According to some embodiments, the 3D scanner system comprises:
- an intraoral scanner comprising:
  - at least one projector unit configured for projecting a pattern on a surface of the object, wherein the pattern comprises a plurality of pattern features; and
  - two or more camera units configured for acquiring a set of images comprising at least one image from each camera unit, wherein each image includes at least a portion of the projected pattern, and wherein the images within the set of images are acquired simultaneously; and
- one or more processors configured for:
  - determining one or more image features within the set of images;
  - solving a correspondence problem within the set of images such that points in 3D space are determined based on the image features, wherein said points form a solution to the correspondence problem, and wherein the correspondence problem is solved for groups of pattern features, each group of pattern features forming a connected subset within the pattern; and
  - generating a three-dimensional representation of the object, wherein the solution to the correspondence problem is used to generate the three-dimensional representation of the object.
The 3D scanner system may comprise an optical system and/or intraoral scanner according to any of the embodiments disclosed herein. The 3D scanner system may further comprise a display configured for displaying the three-dimensional representation of the object.
The present disclosure further relates to a method for generating a three-dimensional representation of an object using the intraoral scanner disclosed herein, the method comprising the steps of:
- projecting a pattern of light, such as a static pattern of light, onto a surface of the object, wherein the pattern is projected by a projector unit of the intraoral scanner;
- acquiring a set of images of the object, wherein the set of images is acquired by multiple camera units of the intraoral scanner, wherein the number of images in the set of images corresponds to the number of camera units, wherein each camera unit contributes one image to the set of images, and wherein images within the set of images are acquired simultaneously by the camera units;
- determining image features in the set of images;
- solving a correspondence problem associated with the set of images such that points in 3D space are determined based on the image features, wherein said points form a solution to the correspondence problem, and wherein the correspondence problem is solved for groups of pattern features, each group of pattern features forming a connected subset within the pattern; and
- generating a three-dimensional representation of the object, wherein the solution to the correspondence problem is used to generate the three-dimensional representation of the object.
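The triangulation at the heart of such a method can be sketched as the midpoint of the shortest segment between two back-projected camera rays. This minimal example is illustrative only and does not reproduce the disclosed implementation; the camera positions and ray directions are hypothetical:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two camera rays.

    o1, o2: ray origins (camera centers); d1, d2: direction vectors.
    A minimal stand-in for the triangulation step; production systems
    solve this jointly for many correspondences at once.
    """
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Normal equations for ray parameters t1, t2 minimizing
    # |(o1 + t1*d1) - (o2 + t2*d2)|.
    b = o2 - o1
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    t1, t2 = np.linalg.solve(a, np.array([d1 @ b, d2 @ b]))
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Two cameras 20 mm apart, both seeing the same surface point at (0, 0, 30):
p = triangulate_midpoint(np.array([-10.0, 0, 0]), np.array([10.0, 0, 30]),
                         np.array([10.0, 0, 0]), np.array([-10.0, 0, 30]))
print(p)  # -> [ 0.  0. 30.]
```

The midpoint form degrades gracefully when the rays do not intersect exactly, which is the usual case with noisy image features.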
Fig. 1 shows a scan unit according to the present disclosure.
Fig. 2 shows a cross-section through the scan unit of figure 1.
Fig. 3 shows an exploded view of a scan unit according to the present disclosure.
Fig. 4 shows the embodiment according to figure 3, wherein the units, i.e. the projector unit and the four camera units, are inserted and mounted/fixed in the fixation unit.
Fig. 5 shows a lens mount configured for receiving and mounting the projector or camera lens stack.
Fig. 6 shows a fixation unit according to the present disclosure.
Fig. 7 shows a camera lens stack, a lens mount, an image sensor, and a flexible printed circuit board, according to the present disclosure.
Fig. 8 shows an embodiment of a scan unit, wherein one or more lens mounts are integrated in the fixation unit.
Fig. 9 shows an exploded view of a projector unit according to the present disclosure.
Fig. 10 shows a cross-sectional view of a scan unit according to the present disclosure.
Fig. 11 shows a cross-sectional view of a scan unit according to the present disclosure, wherein the focus lens or lens stack comprises an outer thread.
Fig. 12 shows a cross-sectional view of a scan unit according to the present disclosure.
Fig. 13 shows two scan units according to the present disclosure.
Fig. 14 shows a cross-section through an intraoral scanner according to the present disclosure.
Three-dimensional object
The three-dimensional (3D) object may be a dental object. Examples of dental objects include any one or more of: tooth/teeth, gingiva, implant(s), dental restoration(s), dental prostheses, edentulous ridge(s), and/or combinations thereof. Alternatively, the dental object may be a gypsum model or a plastic model representing a subject's teeth. As an example, the three-dimensional (3D) object may comprise teeth and/or gingiva of a subject. The dental object may only be a part of the subject's teeth and/or oral cavity, since the entire set of teeth of the subject is not necessarily scanned during a scanning session. A scanning session may be understood herein as a period of time during which data (such as images) of the 3D object is obtained.
Scanner
The scanner disclosed herein may be an intraoral scanner for acquiring images within an intraoral cavity of a subject. The scanner may be a handheld scanner, i.e. a device configured for being held with a human hand. The scanner may employ any suitable scanning principle such as triangulation-based scanning, stereo vision, structure from motion, confocal scanning, or other scanning principles.
In preferred embodiments, the scanner employs a triangulation-based scanning principle. As an example, the scanner may comprise a projector unit and one or more camera units for determining points in 3D space based on triangulation. As another example, the scanner comprises a projector unit and two or more camera units, wherein the camera units are configured to image the scanned object from separate views, i.e. from different directions. In particular, the camera units may be configured to acquire a set of images, wherein a correspondence problem is solved within said set of images based on triangulation. The images within the set of images may be acquired by separate camera units of the scanner.
The images within the set of images are preferably acquired simultaneously, i.e. at the same moment in time, wherein each camera unit contributes at least one image to the set of images.
The images within the set of images preferably capture substantially the same region of the dental object. The images may comprise a plurality of image features corresponding to pattern features in a pattern of light projected on the surface of the dental object. The correspondence problem may generally refer to the problem of ascertaining which parts, or image features, of one image correspond to which parts of another image within the set of images. Specifically, in this context, the correspondence problem may refer to the task of associating each image feature with a projector ray emanating from the projector unit. In other words, the problem can also be stated as the task of associating points in the images with points in the projector plane of the projector unit. A system and method for solving the correspondence problem is further described in PCT/EP2022/086763 and PA 2023 70115 by the same applicant, which are herein incorporated by reference in their entirety.
The projector unit may be configured to project a plurality of projector rays, which are projected onto a surface of the dental object. Solving the correspondence problem may include the steps of determining image features in the images within a set of images, and further associating said image features with specific projector rays. In preferred embodiments, the correspondence problem is solved jointly for groups of projector rays, as opposed to e.g. solving the correspondence problem projector ray by projector ray. The inventors have found that by solving the correspondence problem jointly for groups or collections of projector rays, a particularly reliable and robust solution can be obtained, consequently leading to a more accurate 3D representation. Subsequently, the depth of each projector ray may be computed, whereby a 3D representation of the scanned object may be generated.
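The benefit of solving the correspondence problem jointly for a group of rays, rather than ray by ray, can be illustrated with a toy example in which an ambiguous per-ray depth is disambiguated by agreement with its neighbours. The data, names, and scoring function below are entirely hypothetical and much simpler than any real solver:

```python
import itertools

# Candidate depths per projector ray (as produced by per-ray matching).
# "ray_a" and "ray_c" are ambiguous in isolation; jointly they are not.
candidates = {
    "ray_a": [10.0, 30.1],
    "ray_b": [30.0],
    "ray_c": [29.9, 50.0],
}

def group_score(assignment):
    # Reward assignments whose depths vary little across the group,
    # mimicking the smoothness of a connected patch of surface.
    depths = list(assignment.values())
    return min(depths) - max(depths)

# Exhaustively score every joint assignment and keep the best one.
best = max(
    (dict(zip(candidates, combo)) for combo in itertools.product(*candidates.values())),
    key=group_score,
)
print(best)  # the mutually consistent choice near depth 30
```

Picking each ray's best depth independently could select the spurious 10.0 or 50.0 candidates; the joint score rejects them because they disagree with the rest of the group.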
The scanner may comprise one or more scan units, wherein each scan unit comprises a projector unit and one or more camera units. As an example, the scanner may comprise one scan unit comprising one projector unit and at least two camera units. As another example, the scanner may comprise one scan unit comprising one projector unit and four camera units.
In yet another example, the scanner may comprise at least two scan units, wherein each scan unit comprises a projector unit and two or more camera units. In yet another example, the scanner may comprise at least two scan units, wherein each scan unit comprises a projector unit and four camera units. In some embodiments, the scanner has a field of view of at least 18 × 18 mm², such as at least 20 × 20 mm², at a given working distance, such as at a working distance between 15 mm and 50 mm.
The scanner may further comprise a reflecting element, such as a mirror, arranged in combination with a given scan unit. The reflecting element is preferably configured to reflect light from the projector unit of the scan unit and/or from the surface of the dental object and onto the image sensor(s) of each camera unit of the scan unit associated with the reflecting element. In preferred embodiments, the scanner comprises or constitutes an elongated probe,
which defines a longitudinal axis of the scanner. In some embodiments, the height of the mirror as seen along the projector optical axis is between about 13 mm and about 20 mm. An advantage hereof is that the tip height of the scanner is kept at a minimum, in particular for working distances above 15 mm. Thus, the mirror allows for part of the optical path to be folded or redirected inside the scanner, such that the scanner may accommodate an optical system, e.g. a scan unit, having a working distance longer than the intended height of the probe or tip of the scanner. Consequently, the tip or probe can be made smaller, in particular the height of the probe, such that it may easily enter e.g. an oral cavity of a patient. A smaller probe also more easily captures data in the back of the mouth of the patient.

Projector unit
A projector unit may be understood herein as a device configured for projecting light onto a surface, such as the surface of a three-dimensional object. In preferred embodiments, the projector unit is configured to project a pattern of light onto the surface of a dental object.
Preferably, the projector unit is configured to project a pattern of light such that the pattern of light is in focus at a predefined focus distance, or focus range, measured along a projector optical axis. In some embodiments, the projector unit is configured to project the pattern of light such that the pattern of light is defocused at the opening of the probe of the scanner and/or at a surface of an optical window in said probe. The projector unit may be configured to project unpolarized light.
The projector unit may be a Digital Light Processing (DLP) projector using a micro-mirror array for generating a time-varying pattern, a projector comprising a diffractive optical element (DOE), a front-lit reflective mask projector, a micro-LED projector, a liquid crystal on silicon (LCoS) projector, or a back-lit mask projector, wherein a light source is placed behind a mask having a spatial pattern, whereby the light projected on the surface of the dental object is patterned.
The pattern may be dynamic, i.e. such that the pattern changes over time, or the pattern may be static in time, i.e. such that the pattern remains the same over time. An advantage of projecting a static pattern is that it allows the capture of all the image data simultaneously, thus preventing warping due to movement between the scanner and the object.
The projector unit may comprise one or more collimation lenses for collimating the light from the light source. The collimation lens(es) may be placed between the light source and the mask. In some embodiments, the one or more collimation lenses are Fresnel lenses. The projector unit may further comprise one or more focus lenses, or lens elements, configured for focusing the light at a predefined working distance. In some embodiments, the projector unit comprises a projector lens stack comprising a plurality of lens elements. The projector lens stack may define the projector optical axis. In some embodiments, the lens elements of the projector lens stack are attached together to form a single unit.
In preferred embodiments, the projector unit of the scanner comprises at least one light source and a pattern generating element for defining a pattern of light. The pattern generating element is preferably configured for generating a light pattern to be projected on a surface of a dental object. As an example, the pattern generating element may be a mask having a spatial pattern.
Hence, the projector unit may comprise a mask configured to define a pattern of light. The mask may be placed between the light source of the projector unit and the one or more focus lenses, such that light transmitted through the mask is patterned into a light pattern. As an example, the mask may define a polygonal pattern comprising a plurality of polygons, such as a checkerboard pattern. The projector unit may further comprise one or more lenses such as collimation lenses or projection lenses. In other embodiments, the pattern generating element is based on diffraction and/or refraction to generate the light pattern, such as a pattern comprising an array of discrete unconnected dots.
Preferably, the projector unit is configured to generate a predefined static pattern, which may be projected onto a surface. Alternatively, the projector unit may be configured to generate a dynamic pattern, which changes in time. The projector unit may be associated with its own projector plane, which is determined by the projector optics. As an example, if the projector unit is a back-lit mask projector, the projector plane may be understood as the plane wherein the mask is contained. The projector plane comprises a plurality of pattern features of the projected pattern. Preferably, the camera units and projector unit are arranged such that the image sensors and the projector plane, e.g. defined by the mask, are in the same plane.
The projector unit may define a projector optical axis. An optical axis may be understood as a line along which there is some degree of rotational symmetry in an optical system, such as a camera lens or a projector unit. In some embodiments, the projector optical axis of the projector unit is substantially parallel with the longitudinal axis of the scanner. In other embodiments, the projector optical axis of the projector unit defines an angle, such as at least 45° or at least 75°, with the longitudinal axis of the scanner. In other embodiments, the projector optical axis of the projector unit is substantially orthogonal to the longitudinal axis of the scanner.

The projector unit may comprise an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm, such as between 0.3 mm and 0.6 mm. In experiments performed by the inventors, a pupil diameter of between 0.2 mm and 0.7 mm was
found to be particularly useful because it provided a projected pattern in particularly good focus over a large focus range, e.g. a focus range of between 16 mm and 22 mm. In particular, a pupil diameter from about 0.3 mm to about 0.5 mm was found to provide a good compromise between the imaging resolution, e.g. the resolution of the pattern, and the depth of focus, i.e. the focus range. Depth of focus may in some cases be understood as the maximum range where the object appears to be in acceptable focus, e.g. within a given predetermined tolerance.
For some applications, e.g. for dental scanning applications, it is preferred that the scanner has a working distance of between 10 mm and 100 mm. In experiments performed by the inventors, it has been found that a working distance of the projector unit of between 10 mm and 70 mm, such as between 15 mm and 50 mm, is particularly useful, since the optics, e.g. the scan unit(s), take up less space inside the scanner, and also since it is desired to be able to scan objects very close to the scanner. Since the optical system then takes up less space inside the scanner, it also allows for multiple scan units to be placed in succession inside the scanner. In preferred embodiments, the scanner is able to project a pattern in focus at the exit of the tip of the scanner, e.g. at the optical window of the scanner or at an opening in the scanner surface. The working distance may be understood as the object to lens distance where the image is at its sharpest focus. The working distance may also, or alternatively, be understood as the distance from the object to a front lens, e.g. a front lens of the projector unit. The front lens may be the one or more focus lenses of the projector unit.
In some embodiments, the choice of aperture and working distance results in a numerical aperture of the projector unit of between 0.0035 and 0.015, which was found to provide a good imaging resolution, i.e. a pattern with pattern features in good focus in a given focus range, and further a good compromise in terms of defocus, lens aberrations, and diffraction. In experiments performed by the inventors, a numerical aperture of the projector unit of between 0.005 and 0.009 was found to provide an ideal compromise between imaging resolution and depth of focus. Thus, a numerical aperture in this range was found to be the best balance between mitigating the negative effects on resolution caused by defocus, lens aberrations, and diffraction. The numerical aperture may be the same for the projector unit and the camera unit(s). The numerical aperture may be understood as the object-space numerical aperture.
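The relation between pupil diameter, working distance, and object-space numerical aperture implied by the figures above can be sketched with the small-angle relation NA ≈ sin(arctan((D/2)/WD)). The helper name and the 0.4 mm / 30 mm example values below are illustrative assumptions, not values taken from the disclosure.

```python
import math

def numerical_aperture(pupil_diameter_mm: float, working_distance_mm: float) -> float:
    """Object-space numerical aperture of a lens with the given entrance
    pupil diameter, focused at the given working distance in air (n = 1).
    For the small angles involved here, NA = sin(theta) with
    tan(theta) = (D/2) / WD."""
    half_angle = math.atan((pupil_diameter_mm / 2.0) / working_distance_mm)
    return math.sin(half_angle)

# Illustrative: a 0.4 mm pupil at a 30 mm working distance gives
# NA of roughly 0.0067, inside the preferred 0.005-0.009 range.
na = numerical_aperture(0.4, 30.0)
```

Under this relation, the stated pupil range of 0.2 mm to 0.7 mm maps into the stated NA range of 0.0035 to 0.015 for working distances on the order of a few tens of millimeters.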
Light source
The projector unit may comprise one or more light sources. The projector unit may be configured to project a pattern of light defined by a plurality of projector rays when the light source(s) are on/active. The projector unit may be configured for sequentially turning the light source on and off at a predetermined frequency, wherein the light source is on for a predetermined time period. The light source(s) may be configured to generate light of a single wavelength or a combination of wavelengths (mono- or polychromatic). The combination of wavelengths may be produced by a light source configured to produce light comprising different wavelengths, or a range of wavelengths (such as white light). The light source may be configured to generate unpolarized light, such as unpolarized white light.
In some embodiments, each projector unit comprises a light source for generating white light.
An advantage hereof is that white light enables the scanner to acquire data or information relating to the surface geometry and to the surface color simultaneously. Consequently, the same set of images can be used to provide both geometry of the object, e.g. in terms of 3D data / a 3D representation, and color of the object. Hence, there is no need for an alignment of data relating to the recorded surface geometry and data relating to the recorded surface color in order to generate a digital 3D representation of the object expressing both color and geometry of the object. Alternatively, the projector unit may comprise multiple light sources such as LEDs individually producing light of different wavelengths (such as red, green, and blue) that may be combined to form light comprising different wavelengths. Thus, the light produced by the light source(s) may be defined by a wavelength defining a specific color, or a range of different wavelengths defining a combination of colors such as white light. In some embodiments, the light source is a diode, such as a white light diode, or a laser diode. In some embodiments, the projector unit comprises a laser, such as a blue or green laser diode for generating blue or green light, respectively. An advantage hereof is that a more efficient projector unit can be realized, which enables a faster exposure compared to utilizing e.g. a white light diode.
In some embodiments, the scanner comprises a light source configured for exciting fluorescent material to obtain fluorescence data from the dental object such as from teeth.
Such a light source may be configured to produce a narrow range of wavelengths. In other embodiments, the scanner comprises an infrared light source, which is configured to generate wavelengths in the infrared range, such as between 700 nm and 1.5 µm. In some embodiments, the scanner comprises one or more light sources selected from the group of:
Infrared (IR) light source, near-infrared (NIR) light source, blue light source, violet light source, ultraviolet (UV) light source, and/or combinations thereof. In some embodiments, the scanner comprises a first light source forming part of the projector unit, and one or more second light sources, e.g. IR-LED(s) or NIR-LED(s) and/or blue or violet LED(s), located in a distal part of the scanner, such as in the tip of the scanner. Some of the light sources may be utilized for diagnostic purposes, such as for aiding in the detection of regions of caries. A scanner
configured for detecting fluorescence is further described in WO 2014/000745 A1 by the same applicant, and is herein incorporated by reference in its entirety.
In some embodiments, the projector unit is configured for sequentially turning the light source on and off at a predetermined frequency, wherein the light source is on for a predetermined time period. As an example, the time period may be between 3 milliseconds (ms) and 10 milliseconds (ms), such as between 4 ms and 8 ms. The predetermined frequency for turning the light source on and off may be between 25 Hz and 35 Hz, such as approximately 30 Hz.
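The on-time and frequency figures above imply a modest duty cycle for the light source. A minimal arithmetic sketch (the helper name and the 5 ms / 30 Hz example are illustrative assumptions):

```python
def duty_cycle(on_time_ms: float, frequency_hz: float) -> float:
    """Fraction of each strobe period during which the light source is on."""
    period_ms = 1000.0 / frequency_hz
    return on_time_ms / period_ms

# Illustrative: a 5 ms flash repeated at 30 Hz keeps the source on
# 15% of the time, which limits heat generation in the projector unit.
dc = duty_cycle(5.0, 30.0)
```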
Pattern of light

The projector unit may be configured to project a pattern of light defined by a plurality of projector rays when a light source of the projector unit is turned on. The terms ‘illumination pattern’, ‘pattern of light’, ‘spatial pattern’, and ‘pattern’ are used herein interchangeably. The pattern may be generated using a pattern generating element, e.g. located in the projector unit. The pattern generating element may be a mask, such as a transparency or transmission mask, having a spatial pattern. The mask may be a chrome photomask. In other embodiments, the pattern generating element is configured to utilize diffraction and/or refraction to generate a light pattern. The use of a pattern of light may lead to a correspondence problem, where a correspondence between points in the light pattern and points seen by the camera unit(s) viewing the pattern needs to be determined. In some embodiments, the correspondence problem is solved jointly for groups of projector rays emanating from the projector unit.
The spatial pattern may be a polygonal pattern comprising a plurality of polygons. The polygons may be selected from the group of: triangles, rectangles, squares, pentagons, hexagons, and/or combinations thereof. Other polygons can also be envisioned. In general, the polygons are composed of edges and corners. In preferred embodiments, the polygons are repeated in the pattern in a predefined manner. As an example, the pattern may comprise a plurality of repeating units, wherein each repeating unit comprises a predefined number of polygons, wherein the repeating units are repeated throughout the pattern. Alternatively, the pattern may comprise a predefined arrangement comprising any of stripes, squares, dots, triangles, rectangles, and/or combinations thereof. In some embodiments, the pattern is non-coded, such that no part of the pattern is unique.
In some embodiments, the generated pattern of light is a polygonal pattern, such as a checkerboard pattern comprising a plurality of checkers. Similar to a common checkerboard, the checkers in the pattern may have alternating dark and bright areas corresponding to areas of low light intensity (dark) and areas of high(er) light intensity (bright). In some embodiments
the pattern of light is a checkerboard pattern comprising alternating squares of dark and bright light. In some embodiments, each square in the checkerboard pattern has a length of between 100 µm and 200 µm. This may in some cases correspond to a size of between 4 and 16 pixels, such as 8 to 12 pixels, when the pattern is imaged on the image sensor(s). Through experiments the inventors have realized that at least 4 resolvable pixels per checker period is sufficient to obtain a reasonable contrast and well-defined edges of the projected checkerboard pattern. In some embodiments, the pattern comprises at least 100 x 100 squares arranged in a checkerboard pattern, e.g. of the size mentioned above. Such a pattern has a high number of pattern features, e.g. wherein the corners of the squares constitute features. Consequently, such a pattern places high demands on the optical system of the scanner. Thus, the pattern of light may resemble a checkerboard pattern with alternating squares of different intensity in light. In other embodiments, the light pattern comprises a distribution of discrete unconnected spots of light.
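How many sensor pixels one projected checker spans depends on the camera magnification and the sensor pixel pitch, neither of which is specified above; the sketch below treats both, and the helper name, as illustrative assumptions.

```python
def pixels_per_checker(checker_um: float, magnification: float,
                       pixel_pitch_um: float) -> float:
    """Number of sensor pixels spanned by one projected checker square.
    magnification is the optical (image/object) magnification of the
    camera; it and the pixel pitch are illustrative assumptions."""
    return checker_um * magnification / pixel_pitch_um

# Illustrative: a 150 um checker imaged at 0.2x magnification onto a
# sensor with 3 um pixels spans 10 pixels, within the 4-16 pixel
# range stated above.
n = pixels_per_checker(150.0, 0.2, 3.0)
```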
The pattern preferably comprises a plurality of pattern features. The pattern features may be arranged in a regular grid. In some embodiments of the presently disclosed scanner, the total number of pattern features in the pattern is at least 1000, preferably at least 3000, more preferably at least 10000, even more preferably at least 15000. When projecting a pattern comprising such pattern features onto a surface of the 3D object, the acquired images of the object will similarly comprise a plurality of image features corresponding to the pattern features. A pattern/image feature may be understood as an individual well-defined location in the pattern/image. Examples of image/pattern features include corners, edges, vertices, points, transitions, dots, stripes, etc. In preferred embodiments, the image/pattern features comprise the corners of checkers in a checkerboard pattern. In other embodiments, the image/pattern features comprise corners in a polygon pattern such as a triangular pattern.
In some embodiments, the projector unit is configured for projecting a high-density pattern. A high-density pattern may be understood as a pattern comprising more than 3000 pattern features. Typically, a dense light pattern leads to a more complex correspondence problem since there is a large number of features for which to solve the correspondence problem.
Furthermore, a high-density pattern is more difficult to resolve due to the small features, which consequently sets a high requirement on the optics of the scanner, as discussed below. The inventors have found that a pattern comprising more than 3000 pattern features provides a very good resolution of the corresponding 3D representation of the scanned object, since the high number of features provides for a high number of 3D points. A scanner for projecting a high-density pattern is further described in EP 22183907.9 by the same applicant, and is herein incorporated by reference in its entirety.
It is challenging to project a high-density pattern with a large depth of focus (i.e. in a large focus range) at a short working distance of the optical system, e.g. of the projector unit. In general, the projected features will not be imaged by the scanner as ideal points; rather, they will have a certain spread (blurring) when imaged by the scanner. The degree of spreading of a feature can be described by a point spread function (PSF). The resolution of the 3D representation is limited by the spreading of the features described by the PSF, since the features need to be sharp in the images in order to accurately determine the 3D points for generating the 3D representation. In some cases, the features are described by a PSF having an Airy disk radius equal to or less than 100 µm, such as equal to or less than 50 µm. The minimum feature size in the pattern is limited by the imaging resolving power of the optics of the scanner. As mentioned herein above, the imaging resolution is limited primarily by three effects: defocus, lens aberrations, and diffraction.

Through experiments performed by the inventors, it has been determined that an optical system, e.g. comprising the projector unit and/or the camera unit(s) as disclosed herein, having a numerical aperture of between 0.0035 and 0.015 makes it possible to resolve very fine details of between 50 µm and 200 µm in size in a focus range between 10 mm and 36 mm. In some applications, the optical system is configured to have a working distance of between 15 mm and 50 mm, e.g. the working distance of the projector unit and/or the camera unit(s). In some cases, the working distance can be longer than 50 mm, e.g. in case the scanner comprises a mirror arranged in the distal end of the scanner. Conversely, the working distance may in some cases be shorter than 15 mm if the scan unit is provided without a mirror in the scanner.
A numerical aperture of between 0.0035 and 0.015 may correspond to apertures providing a pupil diameter of between 0.2 mm and 0.7 mm. Accordingly, the technical effect of the choice of numerical aperture is that it provides the ability to project a high-density pattern, wherein the pattern is in focus in a relatively wide focus range in close proximity to the scanner, and wherein the blurring of the pattern features is below a given tolerance, e.g. given by the Airy disk mentioned previously. Consequently, a more accurate 3D representation may be generated, since the position of the 3D points can be determined more accurately, i.e. with less uncertainty, and since the smaller features allow for more features to be present in the pattern, thereby leading to a 3D representation comprising more 3D points.
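The diffraction contribution to the Airy disk tolerance mentioned above can be estimated from the numerical aperture via the standard relation r = 0.61·λ/NA. The wavelength and NA values below are illustrative assumptions; the sketch merely shows that a mid-range NA is consistent with the stated 50 µm Airy radius.

```python
def airy_radius_um(wavelength_um: float, numerical_aperture: float) -> float:
    """Radius of the first Airy minimum (diffraction-limited spot size),
    r = 0.61 * wavelength / NA, expressed in the same space as the NA."""
    return 0.61 * wavelength_um / numerical_aperture

# Illustrative: green light (0.55 um) at NA = 0.007, the middle of the
# preferred 0.005-0.009 range, gives roughly 48 um, consistent with
# the "equal to or less than 50 um" Airy disk radius mentioned above.
r = airy_radius_um(0.55, 0.007)
```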
Camera unit
A camera unit may be understood herein as a device for capturing an image of an object. Each camera unit may comprise an image sensor for generating an image based on incoming light e.g. received from an illuminated 3D object. As an example, the image sensor may be an
electronic image sensor such as a charge-coupled device (CCD) or an active-pixel sensor (CMOS sensor). In some embodiments, the image sensor is a global shutter sensor configured to expose the entire image area (all pixels) simultaneously and generate an image at a single point in time. The image sensor(s) may have a frame rate of at least 30 frames per second, such as at least 60 frames per second, or even at least 75 frames per second. Thus, in some applications, the camera units may capture images, or sets of images, at a frame rate of at least 30 frames per second, e.g. at a frame rate of at least 60 frames per second, e.g. at least 75 frames per second. The number of 3D frames generated by the scanner per second may correspond to, or be less than, the above-indicated frame rates of the image sensors.
In some embodiments, the image sensor is a rolling shutter sensor. An advantage of utilizing a rolling shutter sensor is that the sensor can be made smaller for a given pixel array size compared to e.g. a global shutter sensor, which typically comprises more electronics per pixel, leading to a sensor having a larger area or volume footprint, thus taking up more space. Thus, a rolling shutter sensor is advantageous for applications with restricted space, such as for intraoral scanners, in particular for realizing a compact intraoral scanner. The rolling shutter sensor may be configured to expose individual rows of pixels with a time lag and to output an image accordingly. In that case, the image frames may overlap in time because pixels within one frame have been exposed to light at different times. An associated problem hereof is that the imaged object may have moved during the exposure. In some embodiments, the light source of the projector unit is configured to flash during a time period, such that all pixels of the image sensor(s) are exposed simultaneously, effectively meaning that the pixels are exposed globally. However, unlike with a global shutter sensor, the global exposure is controlled by the light source of the projector unit as opposed to electronically on the image sensor(s).
In some embodiments, the exposure time of the image sensor is below 15 milliseconds (ms), such as below 10 ms, such as between 4 and 8 ms. These indicated exposure times preferably correspond to the time period of the flash of the light source of the projector unit as described above. Thus, an advantage of configuring the light source to flash during a time period as indicated above is that blurring due to relative movement between the scanner and the object being scanned is minimized. This kind of blurring is also referred to as motion blur.
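The link between flash duration and motion blur can be sketched as blur = speed × exposure time. The scan speed below is an illustrative assumption (the disclosure does not state one); the unit bookkeeping (mm/s × ms = µm) is the point of the sketch.

```python
def motion_blur_um(scan_speed_mm_s: float, exposure_ms: float) -> float:
    """Lateral motion blur accumulated during one exposure when the
    scanner moves relative to the object.
    mm/s * ms = mm * 1e-3 = um, so the product is already in micrometers."""
    return scan_speed_mm_s * exposure_ms

# Illustrative: at an assumed 20 mm/s relative speed, a 5 ms flash
# accumulates 100 um of blur, on the order of one checker square;
# shorter flashes reduce this proportionally.
blur = motion_blur_um(20.0, 5.0)
```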
The image sensor(s) may comprise an array of pixels, wherein each pixel is associated with a corresponding camera ray. The array of pixels may be a two-dimensional (2D) array. Each pixel may be covered by a micro lens. In some embodiments, the image area, i.e. the 2D array of pixels, is rectangular or square. The resolution of the image sensor may be between about 0.25 megapixels and about 2.5 megapixels. In some embodiments, the image sensor is a
CMOS sensor comprising an analog-to-digital converter (ADC) for each column of pixels, which makes the conversion significantly faster and allows each camera unit to operate at greater speed. Each image sensor may define an image plane, which is the plane that contains the object's projected image. Each image obtained by the image sensor(s) may comprise a plurality of image features, wherein each image feature originates from a pattern feature of the projected pattern. In some embodiments, one or more of the camera units comprise a light field camera. Preferably, each camera unit defines a camera optical axis. The camera units may further comprise one or more focus lenses for focusing light.
In some embodiments, the image sensor is a monochrome image sensor, wherein each pixel is associated with a single color channel, e.g. is a grayscale color channel, wherein the value of each pixel represents only an amount of light. In other embodiments, the image sensor is a color image sensor or an image sensor comprising a color filter array on the array of pixels.
As an example, the color filter array may be a Bayer filter employing an arrangement of four color filters: Red (R), Green (G), Green (G), and Blue (B). The Bayer filter may also be referred to as an RGGB filter. When utilizing the image sensor data, 2 x 2 blocks of color pixels may be combined into single monochrome pixels for 3D depth reconstruction. In this case, the resolution of the 3D depth reconstruction is only half the resolution of the image sensor in each direction. When obtaining texture (color) images, the full native resolution is preferably utilized (with color-filtered pixels).
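The 2 x 2 binning of Bayer cells into monochrome pixels described above can be sketched as a simple averaging step. This is a minimal illustration assuming NumPy, not the scanner's actual processing pipeline, and the function name is hypothetical.

```python
import numpy as np

def bayer_to_mono(raw: np.ndarray) -> np.ndarray:
    """Combine each 2x2 RGGB cell of a raw Bayer mosaic into one
    monochrome pixel by averaging, halving the resolution per axis,
    as in the depth-reconstruction binning described above."""
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "expects an even-sized mosaic"
    # Group the mosaic into (h/2, 2, w/2, 2) cells, then average each cell.
    cells = raw.reshape(h // 2, 2, w // 2, 2).astype(np.float64)
    return cells.mean(axis=(1, 3))

# A 4x4 mosaic becomes a 2x2 monochrome image.
mono = bayer_to_mono(np.arange(16).reshape(4, 4))
```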
In accordance with some embodiments, the projector optical axis and the camera optical axis, or axes, are non-parallel. As an example, the projector optical axis and the camera optical axis of at least one camera unit may define a camera-projector angle of approximately 5 to 15 degrees, preferably 5 to 10 degrees, even more preferably 8 to 10 degrees. All of the camera units may be angled similarly with respect to the projector unit, such that each camera optical axis defines approximately the same angle with the projector optical axis. In some embodiments, the camera units are defocused at the opening of the probe of the scanner and/or at the surface of an optical window in said probe. In preferred embodiments of the scanner, the camera units and projector unit of a given scan unit are focused at the same distance. In some embodiments, each camera unit has a field of view of 50-115 degrees, such as 65-100 degrees, such as 65-75 degrees. In other embodiments, each camera unit has a field of view of 80-90 degrees.
Each camera unit may comprise one or more focus lenses for focusing light onto the image sensor of the given camera unit. In some embodiments, each camera unit comprises two or more lenses, or lens elements, assembled in a camera lens stack. Thus, each camera unit
may comprise a camera lens stack comprising a plurality of lens elements. The purpose of the focus lens(es) or camera lens stack may be to define or ensure a predetermined focus distance, or working distance, of the camera unit. The camera lens stack may further define the camera optical axis. In some embodiments, the lens elements of the camera lens stack are attached together to form a single unit. In some embodiments, the projector lens stack and the camera lens stack(s) are similar, such that similar lens elements are used in the lens stacks of the scan unit. Utilizing a similar lens design of the camera units and projector unit has the benefit of lowering production costs compared to developing different lens designs.
The camera unit(s) may comprise an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm, such as between 0.3 mm and 0.6 mm.

In experiments performed by the inventors, a pupil diameter of between 0.2 mm and 0.7 mm was found to be particularly useful because it provided an imaged pattern in particularly good focus over a large focus range, e.g. a focus range of between 16 mm and 22 mm. In particular, a pupil diameter from about 0.3 mm to about 0.5 mm was found to provide a good compromise between the imaging resolution, e.g. the resolution of the pattern, and the depth of focus, i.e. the focus range. In experiments performed by the inventors, it has been found that a working distance of the camera unit(s) of between 10 mm and 70 mm, such as between 15 mm and 50 mm, is particularly useful for the same reasons as stated in relation to the projector unit. In some embodiments, the projector unit and the camera unit(s) have the same working distance. In some embodiments, the images formed by each camera unit are maintained in focus over all object distances located between 10 mm and 50 mm, e.g. between 12 mm and 40 mm, e.g. between 15 mm and 36 mm, from the lens that is farthest from the image sensor.
In some embodiments, the choice of aperture and working distance results in a numerical aperture of each camera unit of between 0.0035 and 0.015, which was found to provide a good imaging resolution, i.e. a pattern with pattern features in good focus in a given focus range, and further a good compromise in terms of defocus, lens aberrations, and diffraction.
In experiments performed by the inventors, a numerical aperture of the camera unit(s) of between 0.005 and 0.009 was found to provide an ideal compromise between imaging resolution and depth of focus. Thus, a numerical aperture in this range was found to be the best balance between mitigating the negative effects on resolution caused by defocus, lens aberrations, and diffraction. The numerical aperture may be the same for the projector unit and the camera unit(s).
Each camera unit may further comprise a lens mount configured for receiving and mounting the camera lens stack. The lens mount may comprise a cylindrically shaped section adapted to receive the camera lens stack, such that said lens stack can be fixedly mounted herein. The lens mount may further comprise a flange adapted to interface with a fixation unit to ensure correct placement of the lens mount within the fixation unit in at least one direction. In some embodiments, the flange comprises one or more flat surfaces for interfacing with the fixation unit to fix the position of a given lens mount in the fixation unit in at least two directions. The flat surfaces of the lens mount and fixation unit are shown in figures 5-6. In some embodiments, the camera units are symmetrically arranged around the projector unit, wherein the distance between the projector unit and a given camera unit is between 2 mm and 6 mm, such as between 3 mm and 4 mm.
In some embodiments, the scanner comprises two or more camera units configured for acquiring a set of images comprising at least one image from each camera unit, wherein each image includes at least a portion of the projected pattern. In preferred embodiments, the images within the set of images are acquired simultaneously. Furthermore, the number of images in the set of images may preferably correspond to the number of camera units, wherein each camera unit contributes one image to the set of images. An advantage hereof is that the light budget is improved; thus, less power is consumed by the light source and the projector unit. Consequently, less heat is generated by said components, which is desired, since it is oftentimes difficult to remove heat from intraoral scanners. Preferably, the 3D scanner system comprises one or more processors configured to generate a 3D representation based on the set of images, e.g. by identifying image features in the set of images and determining points in 3D space based on triangulation. The processors may be located on the scanner, or they may be located on an external computer. The 3D representation may be generated continuously during a scanning session, and/or it may be generated in real time. The scanner system may further comprise a display for displaying the 3D representation. The rendering of the 3D representation and the display of said representation may further occur in real time, or perceived real time to the user, e.g. with time lags below 50 ms.
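Determining a point in 3D space by triangulation of a matched projector ray and camera ray can be sketched as finding the midpoint of the shortest segment between the two rays (with noise they rarely intersect exactly). This is a generic triangulation sketch, not the disclosed method; the ray origins and directions below are purely illustrative.

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two rays p(t) = o + t*d
    (origins o, directions d, which need not be unit length): a minimal
    sketch of recovering a 3D point from two matched rays."""
    o1, d1, o2, d2 = (np.asarray(v, dtype=float) for v in (o1, d1, o2, d2))
    # Solve the 2x2 normal equations for the parameters t1, t2 that
    # minimize |(o1 + t1*d1) - (o2 + t2*d2)|.
    b = o2 - o1
    a11, a12, a22 = d1 @ d1, d1 @ d2, d2 @ d2
    det = a11 * a22 - a12 * a12          # zero only for parallel rays
    t1 = (a22 * (d1 @ b) - a12 * (d2 @ b)) / det
    t2 = (a12 * (d1 @ b) - a11 * (d2 @ b)) / det
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Illustrative: a projector ray from the origin and a camera ray from a
# unit 4 mm away, both aimed at the same surface point 30 mm ahead.
p = triangulate([0, 0, 0], [0, 0, 1], [4, 0, 0], [-4, 0, 30])
```

With a small baseline between projector and camera (a few millimeters, as stated above), the angle between the rays is small, which is why sharp, well-localized image features are needed for an accurate intersection.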
Scan unit
A scan unit may be understood herein as a unit comprising at least one projector unit and one or more camera units. In some embodiments, each scan unit comprises at least two camera units having at least partly overlapping fields of view along different camera optical axes.
Preferably, each scan unit comprises at least four camera units having at least partly overlapping fields of view along different camera optical axes. An advantage of having overlapping fields of view of the camera units is improved accuracy due to a reduced amount of image stitching errors.
The at least one projector unit and one or more camera units of the scan unit may be provided as modular units for being inserted into a fixation unit of the scan unit, as shown in figs. 3-4.
This has the benefit of providing an easier and more intuitive assembly method for the scan unit.
It further has the benefit of fixing the projector unit and camera unit(s) in a rigid structure, such that the geometric relationship between said units is fixed and maintained. Each unit, whether camera unit or projector unit, may be connected to its own flexible printed circuit board (PCB). The projector unit and camera units may be fixedly mounted inside the fixation unit, e.g. using an adhesive.
The scan unit itself may further be considered a modular unit in the sense that it provides a complete optical system with a projector unit and one or more camera units. In some embodiments, the scanner is adapted for receiving several of such scan units, such that the field of view of the scanner may be extended or enlarged. An example of such an embodiment is shown in figure 14, which shows a scanner comprising two such scan units placed in series in a tip or distal end of the scanner. In this embodiment, each scan unit is arranged in combination with a mirror to redirect the projected light, e.g. toward a dental object, such as the dentition or dental arch of a subject. A scanner having an extended field of view is further described in EP 23158001.0 by the same applicant, which is herein incorporated by reference in its entirety.
The scan unit may comprise a fixation unit configured for receiving and mounting the projector unit and the camera units in the scan unit. The fixation unit is preferably further configured such that each camera optical axis forms a predefined angle with the projector axis when mounted in the fixation unit. In preferred embodiments, the fixation unit is configured to accommodate at least one projector unit and two or more camera units, such as four camera units. In some embodiments, the projector lens stack and/or the camera lens stack(s) protrude from the fixation unit as seen in figs. 1 and 4. An advantage hereof is that it enables a more dense packaging of the camera units such that they are positioned closer to the projector unit at a given angle, which in turn ensures a smaller baseline. A smaller baseline between the projector unit and the camera units reduces occlusion and limits distortion of the projected light pattern, in particular on steep surfaces of the imaged object. The smaller amount of distortion makes it easier to identify image features in the acquired images.
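The baseline trade-off described above can be sketched with simple thin-ray geometry: a camera axis tilted toward the projector axis crosses it at a depth set by the baseline and the tilt angle. A minimal sketch (the 5 mm baseline and 10 degree tilt are assumed illustration values, not figures from this disclosure):

```python
import math

def axis_crossing_depth(baseline_mm: float, tilt_deg: float) -> float:
    """Depth at which a camera optical axis, tilted by `tilt_deg`
    toward the projector optical axis, crosses that axis when the two
    units are separated by `baseline_mm` (thin-ray sketch)."""
    return baseline_mm / math.tan(math.radians(tilt_deg))

# A smaller baseline reaches the same working depth with a smaller tilt
# angle, reducing occlusion and pattern distortion on steep surfaces:
depth = axis_crossing_depth(5.0, 10.0)  # about 28.4 mm
```

With these assumed values the crossing depth falls inside the 18-38 mm working-distance range mentioned elsewhere in this disclosure.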
The fixation unit may comprise one or more openings for receiving and mounting each lens mount. The openings of the fixation unit may be provided with one or more flat surfaces for interfacing with the flat surfaces of the lens mounts. The flat surfaces are useful for ensuring proper placement of the lens mounts inside the fixation unit, such that they can ideally only move in one dimension during insertion. The openings of the fixation unit may be shaped to fit the lens mounts, e.g. the openings may comprise a cylindrically shaped section to receive and mount a cylindrical section of the lens mounts. The lens mounts may be fixedly mounted inside the fixation unit, e.g. using an adhesive. In some embodiments, the lens mounts are physically integrated in the fixation unit as shown in fig. 8. The disclosure further relates to an optical system for an intraoral scanner, said optical system comprising at least one projector unit as described herein and one or more camera units as described herein. The optical system may comprise any of the optical components disclosed herein, and it may be embodied in several different ways as suggested herein.
Reflecting element
A reflecting element may be understood herein as an element configured to change the direction of light rays incident on the surface of said reflecting element or being transmitted through said reflecting element, e.g. in case of a prism. In particular, the reflecting element is preferably configured to change the direction of a center beam of the projected light from a projector unit from a direction substantially parallel to the longitudinal axis of the scanner to a direction substantially orthogonal to said longitudinal axis. In preferred embodiments, a surface normal of each reflecting element defines an angle with respect to the projector optical axis of approximately 40-50 degrees, preferably approximately 45 degrees.
As an example, the reflecting element may be selected from the group of: mirrors, prisms, and/or combinations thereof. In preferred embodiments, the reflecting element is configured to reflect light from the projector unit of the scan unit and/or reflect light from the surface of the object being scanned and onto the image sensors of the scan unit. In some embodiments, the scanner comprises a mirror as the reflecting element. In other embodiments, the scanner comprises a prism as the reflecting element. Some embodiments feature a combination of mirror(s) and prism(s). In case of a prism as the reflecting element, the prism is preferably configured to change the direction of a center beam of the projected pattern of light from substantially parallel to the longitudinal axis of the scanner to a direction having an angle of at least 45 degrees with respect to said longitudinal axis. Even more preferably, the prism is configured to change the direction of a center beam of the projected pattern of light from substantially parallel to the longitudinal axis of the scanner to a direction having an angle of approximately 90 degrees with respect to said longitudinal axis.
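The redirection described above follows the standard mirror-reflection law r = d - 2(d·n)n. A minimal sketch showing that a surface normal at 45 degrees to the projector optical axis turns an axial beam by 90 degrees (the coordinate axes are chosen for illustration only):

```python
import numpy as np

def reflect(d, n):
    """Reflect direction vector d off a surface with unit normal n."""
    d = np.asarray(d, dtype=float)
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# Beam along the scanner's longitudinal axis (+x); mirror normal tilted
# 45 degrees in the x-z plane, as in the preferred embodiments above.
beam = [1.0, 0.0, 0.0]
normal = [np.cos(np.radians(45)), 0.0, np.sin(np.radians(45))]
out = reflect(beam, normal)  # -> approximately [0, 0, -1]: turned 90 degrees
```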
In some embodiments, the scanner comprises a scan unit, wherein the scanner further comprises a reflecting element positioned on the projector optical axis of the projector unit of said scan unit. In other embodiments, the scanner comprises at least two scan units, wherein the scanner further comprises a reflecting element arranged in combination with each scan unit. The reflecting element is then configured to reflect light projected from the projector unit of said scan unit. The reflecting element is preferably further arranged to reflect light from the object being scanned and onto the image sensor(s) of each camera unit of the scan unit. Such an embodiment is exemplified in figure 14. In some embodiments, the reflecting element of each scan unit is positioned on the projector optical axis. The projector optical axis may in some embodiments coincide with the longitudinal axis of the scanner. The reflecting element(s) may be shaped in a variety of ways. As an example, each reflecting element may be substantially rectangular. In some embodiments, the reflecting element(s) comprise a plurality of corners, wherein at least some of the corners are rounded.
Flexible printed circuit boards
In some embodiments, the scanner comprises one or more flexible printed circuit boards (PCB). Each of the flexible printed circuit boards may connect one of the camera units to a main printed circuit board (PCB). Preferably, a flexible PCB is connected to each camera unit, wherein each PCB comprises a plurality of wires or circuits connected to the image sensor of a given camera unit. In some embodiments, each flexible PCB is bent at a first radius of curvature, wherein the first radius of curvature lies in a first plane. Each PCB may be further bent at a second radius of curvature, wherein the second radius of curvature lies in a second plane. Alternatively, each PCB is a flexible PCB which is manufactured in a predetermined shape, such that each flexible PCB contains a first section which is straight, and a second section which includes one or more turns. Preferably, the second section includes three turns, which may be right-angle turns; however, the turns may have rounded corners. The first plane may be substantially parallel to the longitudinal axis of the scanner, and the second plane may be perpendicular to the first plane. Thus, the manufactured flexible PCB may be bent only along the first radius of curvature, such that the entirety of the second section lies in the second plane. The first radius of curvature is selected from about 0.5 mm to about 5 mm. A first section of each PCB may be substantially parallel to the longitudinal axis of the scanner, and a second section of each PCB may lie in the second plane. Furthermore, the PCBs may overlap each other along a section, wherein the PCBs are bent at the first radius of curvature. An advantage of providing the flexible printed circuit boards in the arrangement described above is that it ensures a low stress in each PCB. In particular, this is the case if each flexible PCB is bent
along a first radius of curvature, wherein the radius is selected from about 0.5 mm to about 5 mm. Such a large radius of curvature ensures a minimum of stress in each PCB.
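The stress claim follows from the thin-beam bending model, in which the peak surface strain of a flex layer of thickness t bent to radius R is approximately t/(2R). A minimal sketch (the 0.1 mm board thickness is an assumed illustration value, not given in this disclosure):

```python
def peak_bending_strain(thickness_mm: float, radius_mm: float) -> float:
    """Approximate peak surface strain of a thin flexible PCB bent to
    radius `radius_mm`: strain ~ t / (2R) (thin-beam model)."""
    return thickness_mm / (2.0 * radius_mm)

# Over the disclosed 0.5 mm to 5 mm radius range, strain drops tenfold
# for an assumed 0.1 mm thick flex layer:
tight = peak_bending_strain(0.1, 0.5)   # 10 % strain at the tightest radius
gentle = peak_bending_strain(0.1, 5.0)  # 1 % strain at the gentlest radius
```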
Processor
In accordance with some embodiments, the scanner comprises one or more processors. The scanner may comprise a first processor configured for determining image features in the acquired images. The first processor may be selected from the group of: central processing units (CPU), accelerators (offload engines), general-purpose microprocessors, graphics processing units (GPU), neural processing units (NPU), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA), dedicated logic circuitry, dedicated artificial intelligence processor units, or combinations thereof. As an example, the first processor may be a field-programmable gate array (FPGA). As another example, the first processor may be a neural processing unit (NPU). The NPU may be configured to execute one or more machine learning algorithms. A neural processing unit may be understood herein as a circuit configured to implement control and arithmetic logic necessary to execute machine learning algorithms, such as a neural network.
The scanner may further comprise a second processor configured for carrying out a computer-implemented method for generating a digital representation of a three-dimensional (3D) object. As an example, the second processor may be configured for running a tracking algorithm configured for solving the correspondence problem, e.g. for determining corresponding image features in the obtained images. It may further be configured for determining 3D points based on the determined image features and triangulation.
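Once corresponding image features are identified, each 3D point can be triangulated from the back-projected rays of the projector and camera(s). A common closest-point ("midpoint") formulation is sketched below; this is a standard textbook method, not necessarily the exact algorithm run by the second processor:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint triangulation of two (possibly skew) rays o_i + t_i * d_i.

    Solves the normal equations of min |(o1 + t1 d1) - (o2 + t2 d2)|^2
    and returns the midpoint of the closest approach. Assumes the rays
    are not parallel (the 2x2 system would then be singular)."""
    d1 = np.asarray(d1, float); d1 = d1 / np.linalg.norm(d1)
    d2 = np.asarray(d2, float); d2 = d2 / np.linalg.norm(d2)
    o1 = np.asarray(o1, float); o2 = np.asarray(o2, float)
    b = o2 - o1
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    t1, t2 = np.linalg.solve(A, np.array([d1 @ b, d2 @ b]))
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))
```

For example, a projector ray along +z from the origin and a camera ray from a 5 mm baseline aimed at the same surface point recover that point exactly when the correspondence is correct.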
The scanner may further comprise computer memory for storing instructions, which when executed, cause the first processor to carry out the step of determining image features in the set(s) of images. The scanner may further comprise a second processor configured for carrying out the computer-implemented method for generating a digital representation of a three-dimensional (3D) object. The computer memory may further store instructions, which when executed, cause the second processor to carry out the method of generating a digital representation of a three-dimensional (3D) object. As an example, the second processor may be a central processing unit (CPU) such as an ARM processor or another suitable microprocessor. The second processor may comprise computer memory.
The processor(s), such as the first and second processor, may both be located on the scanner, and they may be operatively connected such that the first processor provides input to the second processor. Alternatively, the first processor may be located on the scanner, and the
second processor may be located on the computer system described herein. As an example, the first processor may be configured to determine image features in the images, and subsequently provide data related to the determined image features to the second processor.
The data may comprise image feature coordinates as well as other attributes such as a camera index or a predefined property, such as the phase, of the image feature(s).
The second processor may then be configured to generate the digital representation of the 3D object, e.g. in the form of a point cloud. The scanner may be further configured to provide the digital representation to a computer system for rendering the representation. The computer system may further process the digital representation, e.g. by stitching scan data, such as 3D representations or point clouds, received from the scanner and/or by fitting one or more surfaces to the stitched scan data / point clouds. This further processing by the computer system may also be referred to herein as reconstruction. The output of the reconstruction is a digital 3D model of the scanned object. The digital 3D model may be rendered and displayed on a display, e.g. connected to the computer system. The rendering and/or display of the 3D model may occur in real-time.
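The stitching step described above can be illustrated as applying a rigid registration to each incoming point cloud before merging it into the running model. In this sketch, estimating the rotation R and translation t (e.g. by an ICP-style registration) is assumed to happen elsewhere:

```python
import numpy as np

def stitch(model: np.ndarray, new_cloud: np.ndarray,
           R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform `new_cloud` (N x 3) into the model's reference frame
    using the rigid registration (R, t), then append it to `model`.
    Surface fitting / reconstruction would follow as a separate step."""
    aligned = new_cloud @ R.T + t  # apply rotation, then translation
    return np.vstack([model, aligned])
```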
Module for transmitting data
The scanner preferably comprises a module for transmitting data, such as images or point clouds, to one or more external devices, such as a computer system. The module may be a wireless module configured to wirelessly transfer data from the scanner to the computer system. The wireless module may be configured to perform various functions required for the scanner to wirelessly communicate with a computer network. The wireless module may utilize one or more of the IEEE 802.11 Wi-Fi protocols and an integrated TCP/IP protocol stack that allows the scanner to access the computer network. The wireless module may include a system-on-chip having different types of inbuilt network connectivity technologies. These may include commonly used wireless protocols such as Bluetooth, ZigBee, Wi-Fi, WiGig (also known as 60 GHz Wi-Fi), etc. The scanner may further (or alternatively) be configured to transmit data using a wired connection, such as an Ethernet cable or a USB cable. In some embodiments, the scanner comprises a wireless module configured to wirelessly transfer data from the scanner to the computer system. The scanner may be configured to continuously transfer the data, e.g. scan data or image data, during a scanning session. It may further be configured to transfer said data in real-time, such that a good scan experience is achieved.
Computer system
A computer system may be understood as an electronic processing device for carrying out sequences of arithmetic or logical operations. In the present context, a computer system refers
to one or more devices comprising at least one processor, such as a central processing unit (CPU), along with some type of computer memory. Examples of computer systems falling within this definition include desktop computers, laptop computers, computer clusters, servers, cloud computers, quantum computers, mobile devices such as smartphones and tablet computers, and/or combinations thereof.
The computer system may comprise hardware such as one or more central processing units (CPU), graphics processing units (GPU), and computer memory such as random-access memory (RAM) or read-only memory (ROM). The computer system may comprise a CPU, which is configured to read and execute instructions stored in the computer memory, e.g. in the form of random-access memory. The computer memory is configured to store instructions for execution by the CPU and data used by those instructions. As an example, the memory may store instructions, which when executed by the CPU, cause the computer system to perform, wholly or partly, any of the computer-implemented methods disclosed herein. The computer system may further comprise a graphics processing unit (GPU). The GPU may be configured to perform a variety of tasks such as video decoding and encoding, rendering of the digital representation, and other image processing tasks.
The computer system may further comprise non-volatile storage in the form of a hard disc drive. The computer system preferably further comprises an I/O interface configured to connect peripheral devices used in connection with the computer system. More particularly, a display may be connected and configured to display output from the computer system. The display may for example display a 2D rendering of the generated digital 3D representation.
Input devices may also be connected to the I/O interface. Examples of such input devices include a keyboard and a mouse, which allow user interaction with the computer system. A network interface may further be part of the computer system in order to allow it to be connected to an appropriate computer network so as to receive and transmit data (such as scan data and/or images) from and to other computing devices. The scan data may comprise or constitute 3D data, such as depth maps or point clouds, or it may comprise 2D data, such as images. The CPU, volatile memory, hard disc drive, I/O interface, and network interface, may be connected together by a bus.
The computer system is preferably configured for receiving data from the scanner, either directly from the scanner or via a computer network such as a wireless network. The data may comprise images, processed images, point clouds, sets of data points, or other types of data.
The data may be transmitted/received using a wireless connection, a wired connection, and/or combinations thereof. In some embodiments, the computer system is configured for generating a digital representation of a three-dimensional (3D) object as described herein. In some embodiments, the computer system is configured for receiving data, such as point clouds, from the scanner and subsequently performing the steps of reconstructing and rendering a digital representation of a three-dimensional (3D) object. Rendering may be understood as the process of generating one or more images from three-dimensional data.
The computer system may comprise computer memory for storing a computer program, said computer program comprising computer-executable instructions, which when executed, cause the computer system to carry out the method of generating a digital representation of a three-dimensional (3D) object.
Acquisition of images
In accordance with preferred embodiments, the scanner is configured to acquire images of a three-dimensional (3D) object. The images are preferably acquired using a scanner comprising one or more scan units, wherein each scan unit comprises a projector unit and one or more camera units. The scanner may be an intraoral scanner for acquiring images inside the oral cavity of a subject. The projector unit of the scanner is preferably configured for projecting a predefined pattern of light, such as a static pattern, onto a surface, e.g. onto the surface of the three-dimensional object. Once projected on the surface, some light will be reflected from the surface, which may then enter the camera unit(s) of the scanner, whereby images of the 3D object can be acquired.
The images are preferably acquired using one or more camera units per projector unit, such as at least two camera units or at least four camera units for each projector unit. In preferred embodiments, each scan unit of the scanner comprises a projector unit and four camera units.
The images may be processed by a processor located on the scanner, and then subsequently transmitted to the computer system. The images may also be transmitted, without any processing, to the computer system. In some embodiments, both raw images and processed images are transmitted by the scanner to a computer system. In some embodiments, a processor located on the scanner receives the images as input and provides one or more point clouds.
Fig. 1 shows a scan unit according to the present disclosure. In this embodiment, the scan unit comprises a projector unit and four camera units, wherein the projector unit is arranged in the center of the four camera units. The projector unit may comprise a projector lens stack defining a projector optical axis. The projector lens stack may comprise a plurality of lens elements, or focus lenses, attached together to form a single unit. The scan unit may further comprise a fixation unit configured for mounting the projector unit and the camera unit(s) in the scan unit. The projector lens stack and/or the camera lens stack(s) may protrude from the fixation unit.
Fig. 2 shows a cross-section through the scan unit of figure 1. In this embodiment, each camera unit comprises a camera lens stack and an image sensor for acquiring one or more image(s). The camera lens stack may comprise a plurality of lens elements, or focus lenses, attached together to form a single unit, wherein each camera lens stack defines a camera optical axis. The camera optical axis may define an angle with respect to the projector optical axis. In this embodiment, each camera unit is arranged such that it forms a predefined angle with respect to the projector optical axis. The camera units may be configured to have at least partly overlapping fields of view along the different camera optical axes. The projector unit may further comprise a light source for emitting light, such as a white light source, and one or more collimation lenses for collimating light emitted by the light source and transmitted through said collimation lenses. The projector unit may further comprise a mask for defining a spatial pattern of light projected through the mask. The projector unit and/or camera unit(s) may further comprise a lens mount configured for receiving and mounting the projector lens stack or a camera lens stack. The projector lens stack and the camera lens stack(s) may be similar, such that similar lens elements are used in the lens stacks. Some of the flexible printed circuit boards are not shown to better visualize the inside of the scan unit.
Fig. 3 shows an exploded view of a scan unit according to the present disclosure. In this embodiment, the scan unit comprises a projector unit, four camera units, and a fixation unit configured for receiving and mounting the projector unit and the camera unit(s) in the scan unit such that each camera optical axis forms a predefined angle with the projector axis when mounted in the fixation unit. In this embodiment, the projector unit and the four camera units each comprise a lens mount for accommodating the projector lens stack and the camera lens stack, respectively. Accordingly, the fixation unit may be configured to receive the units including lens mount, such that the units may be fixed in the fixation unit.
Fig. 4 shows the embodiment according to figure 3, wherein the units, i.e. the projector unit and the four camera units, are inserted and mounted/fixed in the fixation unit. The projector and camera units may be attached to the fixation unit, e.g. by adhesive bonding, such that the units are fixed in the fixation unit.
Fig. 5 shows a lens mount configured for receiving and mounting the projector or camera lens stack. Thus, each projector unit and/or camera unit may comprise a lens mount for mounting
a lens stack associated with the projector unit and/or camera unit. In this embodiment, the lens mount comprises a cylindrically shaped section adapted to receive the projector or camera lens stack. The lens mount may comprise a flange adapted to interface with the fixation unit to ensure correct placement of the lens mount within the fixation unit in at least one direction. The flange may comprise one or more flat surfaces for interfacing with the fixation unit to fix the position of a given lens mount in the fixation unit in at least two directions.
Fig. 6 shows a fixation unit according to the present disclosure. The fixation unit may be rigid and made in one piece. An advantage hereof is that the fixation unit may ensure a fixed geometric relationship between the projector unit and the camera unit(s). In this embodiment, the fixation unit is configured to accommodate a projector unit and four camera units. The projector unit and/or camera unit(s) may further comprise a lens mount configured for receiving and mounting the projector or camera lens stack. The fixation unit may comprise one or more openings for receiving and mounting each lens mount in the fixation unit. In some embodiments, the lens mount is integrated in the fixation unit.
Fig. 7 shows a camera lens stack, a lens mount, an image sensor, and a flexible printed circuit board, according to the present disclosure. The lens mount may be configured to be inserted on top of the flexible printed circuit board, such that the lens mount may at least partly accommodate the image sensor of a given camera unit. The lens stack may be inserted and fixed in the lens mount. Once assembled, each camera unit may be inserted into the fixation unit as shown in figure 4.
Fig. 8 shows an embodiment of a scan unit, wherein the lens mount is integrated in the fixation unit, such that there is a lens mount for each camera unit and/or projector unit. Thus, in case the scan unit comprises a projector unit (not shown in this figure) and four camera units, then the fixation unit comprises five openings, wherein a lens mount is physically integrated in each opening. Each lens mount is then configured for receiving and mounting a projector lens stack or a camera lens stack.
Fig. 9 shows an exploded view of a projector unit according to the present disclosure. In this embodiment, the projector unit comprises: a projector lens stack comprising a plurality of lens elements, or focus lenses, attached together to form a single unit; a lens mount configured for receiving and mounting the projector lens stack; a light source for emitting light; one or more collimation lenses for collimating light emitted from the light source; an illumination mount configured for accommodating the collimation lenses; and a mask configured for defining a spatial pattern of light projected through the mask.
Fig. 10 shows a cross-sectional view of a scan unit according to the present disclosure. In this embodiment, the camera units are positioned in parallel with the projector unit, i.e. such that the camera optical axes are parallel to the projector optical axis. Furthermore, the image sensor of each camera unit is surface-mounted on one printed circuit board (PCB), preferably a rigid PCB. Utilizing surface mount technology (SMT) has the advantage of easier and cheaper high-volume manufacturing of the scan units. The embodiment further comprises a pattern generating element, e.g. a mask, as well as a lens mount, which may be attached to the PCB. The scan unit may further comprise a lens stack, e.g. comprising one or more focus lenses, for each camera unit and projector unit. During assembly, the focus of each camera unit and projector unit may be adjusted or calibrated either by active alignment or by mechanical tolerances. The scan unit may further comprise a light source and one or more collimation lenses. In this embodiment, the scan unit comprises four camera units symmetrically arranged around the projector unit; however, only two of them are visible in the figure due to the cross-sectional view.
Fig. 11 shows a cross-sectional view of a scan unit according to the present disclosure. This embodiment is largely similar to the embodiment shown in fig. 10, except that in this embodiment the focus lenses or lens stacks comprise an outer thread configured for threaded engagement with the lens mount. The outer thread has the purpose of adjusting the focus of a given unit (projector and/or camera) by screwing the lens stack in or out, whereby the position of the lens stack is changed along a given camera/projector optical axis.
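With a threaded lens stack, the axial focus adjustment per revolution equals the thread pitch. A minimal sketch (the 0.35 mm pitch is an assumed illustration value; the disclosure does not specify one):

```python
def axial_shift_mm(turns: float, pitch_mm: float = 0.35) -> float:
    """Axial displacement of a threaded lens stack after `turns`
    revolutions in its lens mount (assumed pitch, for illustration).
    Positive turns screw the stack outward along the optical axis."""
    return turns * pitch_mm

shift = axial_shift_mm(0.5)  # half a turn moves the stack 0.175 mm
```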
Fig. 12 shows a cross-sectional view of a scan unit according to the present disclosure. Similar to the embodiments shown in figs. 10-11, this embodiment comprises surface-mounted image sensors on a PCB, such as a rigid PCB. The scan unit similarly comprises a light source and one or more collimation lenses. However, in this embodiment each camera unit comprises a housing in the shape of a rectangular cuboid (i.e. a hexahedron with 6 rectangles as faces, wherein adjacent faces meet at right angles). The camera unit(s) may comprise a backside-illuminated CMOS image sensor. The camera unit(s) may comprise wafer-level optical elements constituting a lens stack. Thus, in this embodiment, the camera units are provided as packaged modular units with both image sensor and lens stack inside a rectangular cuboid housing.
Fig. 13 shows two scan units according to the present disclosure. The scan units are configured to be part of an intraoral scanner. This figure further shows a frame configured to accommodate the scan units. The frame may further be configured such that one or more
reflecting elements can be inserted and mounted in the frame. The reflecting elements may protrude from the frame, when mounted. The frame may comprise one or more openings for inserting the scan units. The frame may be made in one piece, and is preferably manufactured to be rigid. The frame may be configured such that the scan units can be inserted in the frame from opposite directions, e.g. a first scan unit from above the frame, and a second scan unit from below the frame. Accordingly, the scanner may be assembled by inserting the first scan unit into the frame through a first opening located in a first surface of the frame; and inserting the second scan unit into the frame through a second opening located in a second surface of the frame.
Fig. 14 shows a cross-section through an intraoral scanner according to the present disclosure. In this embodiment, the scanner comprises two scan units arranged in series along a longitudinal axis of the scanner in order to increase the field of view of the scanner. Having a larger field of view enables large smooth features, such as the overall curve of a given tooth, to appear in each image, which improves the accuracy of a subsequent stitching of respective 3D scan data, such as 3D surfaces, obtained from different sets of images. Each scan unit is arranged in combination with a reflecting element, such as a mirror, wherein the reflecting element is configured to alter the direction of light projected by a given scan unit. The intraoral scanner may further comprise a housing for accommodating the scan units and the frame.
The housing may comprise an optical window arranged in a distal end of the intraoral scanner.
The optical window may be made of a polymer, such as poly(methyl methacrylate) (PMMA), or it may be made of a glass, such as sapphire glass.
Reference numerals
1. Scan unit
2. Projector unit
3. Projector lens stack
4. Projector optical axis
5. Mask
6. Collimation lenses
7. Camera unit
8. Camera lens stack
9. Camera optical axis
10. Image sensor
11. Fixation unit
12. Lens mount
13. Flange
14. Printed circuit board
15. Frame
16. Mirror
17. Housing
18. Optical window
Further details of the invention

1. An intraoral scanner comprising:
- at least one projector unit comprising:
  - a light source for generating light;
  - a pattern generating element configured for generating a pattern of light to be projected on a surface of an object; and
  - one or more projector focus lenses for focusing the pattern of light, wherein the projector focus lenses define a projector optical axis; and
- one or more camera units, each camera unit comprising:
  - an image sensor for acquiring one or more image(s); and
  - one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis.

2. The scanner according to item 1, wherein the projector unit comprises an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm.

3. The scanner according to item 2, wherein the pupil diameter is between 0.3 mm and 0.5 mm.

4. The scanner according to any of the preceding items, wherein the projector optical axis and the camera optical axis are non-parallel.

5. The scanner according to any of the preceding items, wherein the working distance of the projector unit and/or a given camera unit is between 10 mm and 100 mm.

6. The scanner according to any of the preceding items, wherein the working distance of the projector unit and/or a given camera unit is between 15 mm and 50 mm.
7. The scanner according to any of the preceding items, wherein the working distance of the projector unit and/or a given camera unit is between 18 mm and 38 mm, thereby providing a focus range of approximately 20 mm.

8. The scanner according to any of the preceding items, wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.0035 and 0.015.

9. The scanner according to any of the preceding items, wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.005 and 0.009.

10. The scanner according to any of the preceding items, wherein the projector unit is configured for sequentially turning the light source on and off at a predetermined frequency, wherein the light source is on for a predetermined time period.

11. The scanner according to item 10, wherein the time period is between 3 milliseconds (ms) and 10 milliseconds (ms).

12. The scanner according to item 10, wherein the time period is between 4 milliseconds (ms) and 8 milliseconds (ms).

13. The scanner according to any of the items 10-12, wherein the frequency is between
25 Hz and 35 Hz, such as approximately 30 Hz.

14. The scanner according to any of the preceding items, wherein the light source is configured for generating white light.

15. The scanner according to any of the preceding items, wherein the image sensor comprises an array of pixels in a two-dimensional (2D) array, wherein the array comprises at least 1200 pixels times 1200 pixels.

16. The scanner according to any of the preceding items, wherein the image sensor is a rolling shutter image sensor.

17. The scanner according to any of the preceding items, wherein the frame rate of the image sensor(s) is at least 60 frames per second.
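The strobe timing of items 10-13 fixes a duty cycle: a flash of a few milliseconds repeated at roughly 30 Hz means the light source is on for only a fraction of each frame period. A small sketch of that arithmetic, using illustrative mid-range values from items 12-13:

```python
def duty_cycle(frequency_hz: float, on_time_ms: float) -> float:
    """Fraction of each strobe period during which the light source is on."""
    period_ms = 1000.0 / frequency_hz
    if on_time_ms > period_ms:
        raise ValueError("on-time exceeds the strobe period")
    return on_time_ms / period_ms

# 30 Hz strobe (item 13) with a 6 ms flash (mid-range of item 12):
# period = 1000 / 30 ≈ 33.3 ms, so the source is on ~18% of the time.
dc = duty_cycle(30.0, 6.0)
```

A short on-time relative to the period limits motion blur during handheld scanning while keeping the average illumination power low.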
18. The scanner according to any of the preceding items, wherein the projector unit constitutes or comprises a back-lit mask projector.

19. The scanner according to any of the preceding items, wherein the projector unit is configured for projecting unpolarized light.

20. The scanner according to any of the preceding items, wherein the projector unit is configured for projecting polarized light.

21. The scanner according to any of the preceding items, wherein the pattern generating element is a mask, such as a chrome-on-glass mask, or a diffractive optical element (DOE).

22. The scanner according to any of the preceding items, wherein the pattern of light is a static pattern.

23. The scanner according to any of the preceding items, wherein the pattern of light comprises at least 3000 pattern features.

24. The scanner according to any of the preceding items, wherein the pattern of light comprises at least 10000 pattern features.

25. The scanner according to any of the preceding items, wherein the pattern of light resembles a distribution of discrete unconnected spots of light.
26. The scanner according to any of the preceding items, wherein the pattern of light resembles a checkerboard pattern with alternating squares of different light intensity.

27. The scanner according to item 26, wherein each square in the checkerboard pattern has a length of between 100 µm and 200 µm.

28. The scanner according to any of the items 24-27, wherein the pixel size of a given square is between 5-15 pixels, such as 8-12 pixels, when the square is imaged on the image sensor(s).
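Items 27-28 together imply an object-space sampling density: a checkerboard square of known physical size imaged over a known number of sensor pixels tells you how much of the scanned surface each pixel covers. A minimal sketch, using illustrative mid-range values:

```python
def object_space_sampling_um(square_size_um: float, pixels_per_square: float) -> float:
    """Micrometres of object surface covered by one sensor pixel,
    given a pattern square of known size imaged over a known pixel count."""
    return square_size_um / pixels_per_square

# A 150 µm square (mid-range of item 27) imaged over 10 pixels
# (mid-range of item 28) gives 15 µm of surface per pixel.
sampling = object_space_sampling_um(150.0, 10.0)
```

Sampling each square with roughly 8-12 pixels keeps individual pattern features well resolved on the sensor, which is what triangulation-based feature matching relies on.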
29. The scanner according to any of the preceding items, wherein the scanner comprises two or more camera units, wherein the camera units are configured to acquire a set of images comprising a plurality of images.
30. The scanner according to any of the preceding items, wherein the scanner comprises four or more camera units, wherein the camera units are configured to acquire a set of images comprising a plurality of images.
31. The scanner according to any of the items 29-30, wherein the number of images in the set of images corresponds to the number of camera units.
32. The scanner according to any of the items 18-31, wherein the camera units are synchronized such that images within the set of images are acquired simultaneously by the camera units.
33. The scanner according to any of the preceding items, wherein the field of view of each camera unit is between 65° and 75°.
34. The scanner according to any of the preceding items, wherein the fields of view of the camera units are at least partially overlapping.
35. The scanner according to any of the preceding items, wherein each camera optical axis defines an angle with the projector optical axis of between 5° and 10°.

36. The scanner according to any of the preceding items, wherein the camera units are symmetrically arranged around the projector unit, wherein the distance between the projector unit and a given camera unit is between 2 mm and 5 mm.

37. The scanner according to any of the preceding items, wherein the scanner further comprises one or more second light sources for emitting light at a second wavelength or a second range of wavelengths.
38. The scanner according to item 37, wherein the second range of wavelengths is between 390 nm and 435 nm.
39. The scanner according to item 37, wherein the second range of wavelengths is between 700 nm and 900 nm.
40. The scanner according to any of the preceding items, wherein the projector unit further comprises one or more collimation lenses for collimating light from the light source.
41. The scanner according to item 40, wherein the collimation lenses comprise one or more Fresnel lenses.

42. The scanner according to any of the preceding items, wherein the projector unit and the camera units are arranged and fixed in a fixation unit inside the scanner, said fixation unit providing a fixed geometrical relationship between the units.
43. The scanner according to item 42, wherein the fixation unit is made in one piece.
44. The scanner according to any of the items 42-43, wherein the fixation unit is made from stainless steel.
45. The scanner according to any of the preceding items, wherein the scanner further comprises a mirror arranged in a distal end of the scanner such that light projected from the projector unit is redirected, e.g. onto the surface of the object.
46. The scanner according to item 45, wherein the height of the mirror as seen along the projector optical axis is between about 13 mm to about 20 mm.
47. The scanner according to any of the preceding items, wherein the scanner comprises an optical window arranged in a distal end of the scanner, wherein the optical window is transparent to light projected by the projector unit.
48. The scanner according to item 47, wherein the optical window is made of a polymer, such as poly(methyl methacrylate) (PMMA).
49. The scanner according to item 47, wherein the optical window is made of glass, such as sapphire glass.
50. The scanner according to any of the preceding items, wherein the scanner is a handheld intraoral scanner.
51. The scanner according to any of the preceding items, wherein the scanner comprises a wireless module for wirelessly transmitting data, such as images or 3D data, to an external processing unit, such as to a computer.

52. A 3D scanner system comprising: — an intraoral scanner according to any of the preceding items; and — one or more processors configured for generating a three-dimensional representation of an object based on image(s) obtained by the camera unit(s).

53. The 3D scanner system according to item 52, wherein the one or more processors are configured for determining 3D points based on identified image features in the set of images, wherein the 3D points are determined based on triangulation.

54. An intraoral scanner comprising one or more scan units, each scan unit comprising: — a projector unit comprising: — a pattern generating element configured for generating a light pattern to be projected on a surface of a dental object; — a projector lens stack comprising a plurality of lens elements, the projector lens stack defining a projector optical axis; — one or more camera units, each camera unit comprising: — a camera lens stack comprising a plurality of lens elements, the camera lens stack defining a camera optical axis; — an image sensor for acquiring one or more image(s); and — a fixation unit configured for receiving and mounting the projector unit and the camera unit(s) in the scan unit such that each camera optical axis forms a predefined angle with the projector optical axis when mounted in the fixation unit.

55. The intraoral scanner according to any of the preceding items, wherein the lens elements of the camera lens stack and/or projector lens stack are attached together to form a single unit.

56. The intraoral scanner according to any of the preceding items, wherein the projector lens stack and the camera lens stack(s) protrude from the fixation unit.
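Item 53 determines 3D points by triangulating the same pattern feature seen in two or more calibrated camera units. A minimal sketch of one common approach — midpoint triangulation of two back-projected rays; the camera positions and ray directions below are hypothetical, not taken from the patent:

```python
def triangulate_midpoint(p1, d1, p2, d2):
    """Midpoint of closest approach between rays p1 + t*d1 and p2 + s*d2.

    p1, p2 are camera centres; d1, d2 the back-projected ray directions
    of the matched image feature (all as 3-element lists)."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; triangulation is ill-conditioned")
    t = (b * e - c * d) / denom  # parameter of closest point on ray 1
    s = (a * e - b * d) / denom  # parameter of closest point on ray 2
    q1 = [p + t * v for p, v in zip(p1, d1)]
    q2 = [p + s * v for p, v in zip(p2, d2)]
    return [(u + v) / 2.0 for u, v in zip(q1, q2)]

# Two rays from hypothetical camera centres that intersect at (1, 1, 0):
point = triangulate_midpoint([0, 0, 0], [1, 1, 0], [2, 0, 0], [-1, 1, 0])
```

This is why the fixation unit's fixed geometrical relationship between projector and cameras (item 42) matters: the ray directions entering the triangulation come directly from that calibrated geometry.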
57. The intraoral scanner according to any of the preceding items, wherein the projector lens stack and the camera lens stack(s) are similar, such that similar lens elements are used in the lens stacks.
58. The intraoral scanner according to any of the preceding items, wherein the projector unit and/or camera unit(s) further comprises a lens mount configured for receiving and mounting the projector or camera lens stack.
59. The intraoral scanner according to item 58, wherein each lens mount comprises a cylindrically shaped section adapted to receive the projector or camera lens stack.
60. The intraoral scanner according to any of the items 58-59, wherein each lens mount comprises a flange adapted to interface with the fixation unit to ensure correct placement of the lens mount within the fixation unit in at least one direction.
61. The intraoral scanner according to item 60, wherein the flange comprises one or more flat surfaces for interfacing with the fixation unit to fix the position of a given lens mount in the fixation unit in at least two directions.
62. The intraoral scanner according to any of the preceding items, wherein the fixation unit comprises one or more openings for receiving and mounting each lens mount.
63. The intraoral scanner according to any of the preceding items, wherein the projector unit further comprises an illumination mount configured for accommodating the one or more collimation lenses.
64. The intraoral scanner according to item 63, wherein the illumination mount is further configured to receive and mount a lens mount for accommodating the projector lens stack.
65. The intraoral scanner according to any of the preceding items, wherein the image sensor is a rolling shutter sensor comprising an array of pixels.
66. The intraoral scanner according to item 65, wherein the projector unit comprises a light source configured to flash during a predefined time period such that effectively all pixels on the image sensor are exposed globally.
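The constraint behind item 66 is that a rolling shutter exposes rows at staggered start times; if the flash fires only during the window in which every row is exposing simultaneously, the sensor behaves as if globally exposed. A sketch of that timing check, with hypothetical sensor numbers (the patent does not specify row delays):

```python
def global_flash_fits(exposure_ms: float, rows: int,
                      row_delay_us: float, flash_ms: float) -> bool:
    """True if the flash fits in the interval where all rows expose at once.

    The last row starts exposing rows * row_delay after the first, so the
    common window is the exposure time minus that stagger."""
    stagger_ms = rows * row_delay_us / 1000.0
    common_window_ms = exposure_ms - stagger_ms
    return common_window_ms >= flash_ms

# Hypothetical: 1200 rows (item 15) at 10 µs row delay → 12 ms stagger.
# A 20 ms exposure leaves an 8 ms common window — enough for a 6 ms
# flash (item 12), but not for a 9 ms one.
ok = global_flash_fits(exposure_ms=20.0, rows=1200, row_delay_us=10.0, flash_ms=6.0)
```

This lets a cheap rolling shutter sensor (item 65) serve where a global shutter would otherwise be needed to freeze the projected pattern.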
67. The intraoral scanner according to any of the preceding items, wherein each camera lens stack is mounted directly on the image sensor of a given camera unit.

68. The intraoral scanner according to any of the preceding items, wherein the fixation unit is configured to accommodate at least one projector unit and two or more camera units.

69. The intraoral scanner according to any of the preceding items, wherein each scan unit comprises at least one projector unit and four or more camera units.
70. The intraoral scanner according to any of the preceding items, wherein the scanner is configured to acquire a set of images, wherein the number of images within the set of images corresponds to the number of camera units.
71. The intraoral scanner according to item 70, wherein images within the set of images are acquired simultaneously.

72. The intraoral scanner according to any of the preceding items, wherein the object is a dental object, such as at least a part of the dentition or teeth of a subject, or at least a part of a dental arch.
73. The intraoral scanner according to any of the preceding items, wherein the scanner is a triangulation-based scanner.
Although some embodiments have been described and shown in detail, the disclosure is not restricted to such details, but may also be embodied in other ways within the scope of the subject matter defined in the following claims.
In particular, it is to be understood that other embodiments may be utilized, and structural and functional modifications may be made without departing from the scope of the present disclosure.
Furthermore, the skilled person would find it apparent that unless an embodiment is specifically presented only as an alternative, different disclosed embodiments may be combined to achieve a specific implementation and such specific implementation is within the scope of the disclosure.
Claims (10)
1. An optical system for a triangulation-based intraoral scanner, comprising: — at least one projector unit comprising: — a light source for generating light; — a pattern generating element configured for generating a pattern of light to be projected on a surface of an object; and — one or more projector focus lenses for focusing the pattern of light, wherein the projector focus lenses define a projector optical axis; and — an aperture having a predetermined size such that it provides a pupil diameter of between 0.2 mm and 0.7 mm; — two or more camera units, each camera unit comprising: — an image sensor for acquiring one or more image(s); and — one or more camera focus lenses for focusing light received from the surface of the object onto the image sensor, wherein the camera focus lenses define a camera optical axis; wherein the projector optical axis and the camera optical axis are non-parallel, and wherein the working distance of the projector unit and/or a given camera unit is between 15 mm and 50 mm, and wherein the numerical aperture of the projector unit and/or a given camera unit is between 0.0035 and 0.015.
2. The optical system according to claim 1, wherein the projector unit is configured for sequentially turning the light source on and off at a predetermined frequency, wherein the light source is on for a predetermined time period.
3. The optical system according to claim 2, wherein the time period is between 4 milliseconds (ms) and 8 milliseconds (ms), and wherein the frequency is between 25 Hz and 35 Hz, such as approximately 30 Hz.
4. The optical system according to any of the preceding claims, wherein the projector unit comprises a back-lit mask projector, and wherein the light source is configured for generating white light.
5. The optical system according to any of the preceding claims, wherein the image sensor is a rolling shutter image sensor.
6. The optical system according to any of the preceding claims, wherein the pattern of light is static and comprises at least 3000 pattern features.
7. The optical system according to any of the preceding claims, wherein the pattern of light resembles a checkerboard pattern with alternating squares of different light intensity, wherein each square in the checkerboard pattern has a length of between 100 µm and 200 µm.
8. The optical system according to any of the preceding claims, wherein the camera units are configured for acquiring a set of images, wherein the number of images in the set of images corresponds to the number of camera units, and wherein images within the set of images are acquired simultaneously by the camera units.
9. A 3D scanner system for generating a three-dimensional representation of an object, the 3D scanner system comprising: — a triangulation-based intraoral scanner comprising the optical system according to any of the claims 1-8; and — one or more processors configured for generating a three-dimensional representation of the object based on images obtained by the camera units.
10. The 3D scanner system according to claim 9, wherein the intraoral scanner comprises an optical window arranged in a distal end of the scanner, wherein the optical window is transparent to light projected by the projector unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA202370162A DK202370162A1 (en) | 2023-03-31 | 2023-03-31 | Optical 3d scanner with improved accuracy |
PCT/EP2024/058301 WO2024200543A1 (en) | 2023-03-31 | 2024-03-27 | Optical intraoral 3d scanner with improved accuracy |
Publications (1)
Publication Number | Publication Date |
---|---|
DK202370162A1 true DK202370162A1 (en) | 2024-10-25 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200060550A1 (en) * | 2015-01-18 | 2020-02-27 | Dentlytec G.P.L. Ltd. | Intraoral scanner |
US20200404243A1 (en) * | 2019-06-24 | 2020-12-24 | Align Technology, Inc. | Intraoral 3d scanner employing multiple miniature cameras and multiple miniature pattern projectors |
US20210137653A1 (en) * | 2019-11-12 | 2021-05-13 | Align Technology, Inc. | Digital 3d models of dental arches with accurate arch width |
WO2023187181A1 (en) * | 2022-03-31 | 2023-10-05 | 3Shape A/S | Intraoral 3d scanning device for projecting a high-density light pattern |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104780830B (en) | 2012-06-27 | 2018-09-28 | 3形状股份有限公司 | Measure scanner in the 3D mouths of fluorescence |
WO2020032572A1 (en) * | 2018-08-07 | 2020-02-13 | 주식회사 메디트 | Three-dimensional intraoral scanner |
WO2023194460A1 (en) * | 2022-04-08 | 2023-10-12 | 3Shape A/S | Intraoral scanning device with extended field of view |
Also Published As
Publication number | Publication date |
---|---|
WO2024200543A1 (en) | 2024-10-03 |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
20241001 | PAT | Application published | Effective date: 20241001 |