EP0424409A1 - Range finding device - Google Patents
Range finding device
Info
- Publication number
- EP0424409A1 (application EP89907048A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- scene
- light source
- image
- image data
- range finding
- Prior art date
- Legal status
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
Definitions
- This invention relates to optical radar systems, and more particularly to systems which acquire range data points in a parallel or simultaneous fashion, rather than point by point using a scanning mechanism.
- The invention has particular utility in robotics, where it is necessary for a robot to obtain a "picture" of its surroundings; consequently the invention is generally applicable over only a short range.
- U.S. Patent No. 3,899,250 discloses a system where the delay between the outgoing pulse and receiver activation is controlled on successive pulse cycles, to give a known correction for sensitivity with distance.
- French, in U.S. Patent No. 4,226,529, presents a viewing system whereby the contrast of an image of a target at a particular distance is enhanced, with the time gating adjustable to view different ranges.
- U.S. Patent No. 4,603,250 discloses a viewing system in which an image intensifier is used as the receiver, with the photocathode gated and the gain of the receiver adjusted by varying the microchannel plate voltage.
- Each diode is activated in turn and the time taken for light to travel from the source to the reflecting object and back to the photodiode is measured.
- the echo time gives the time of flight, and hence distance, to the object in the scene that the particular diode is focused upon.
- An optical radar range finding system comprising a high speed switchable light source for illuminating a scene, a high speed switchable imaging device for receiving a reflected image from said scene and providing an output to control means, a pulsing means connected to said light source and to said imaging device to provide pulses to trigger said light source and said imaging device, respectively, said control means including a store means for storing image information from said imaging device, control logic to sequence operation of said light source, and logic circuitry to process data and produce range information relevant to said scene.
- Said control logic operates a switching device, and the switching device is able to switch said light source to be continuously on, to be triggered by said pulsing means, or to be in an off condition.
- said light source is a laser diode array.
- the store means is a video frame store.
- Another broad form of the invention provides a method of obtaining range information relating to a scene comprising the steps of: storing first image data reflected from the scene while the scene is illuminated by a pulsed light source and the imaging device is gated in synchronism with the pulses; storing second image data reflected from the scene while the scene is continuously illuminated by the light source; and dividing the first image data by the second image data to produce range information relating to the scene.
- Said method includes the further step of storing third image data reflected from the scene in the absence of light from the light source and subtracting the third image data from the first and second image data, respectively, prior to said division, so as to remove background illumination effects.
- Figure 1 is a simplified block diagram of apparatus constituting the system.
- Figure 2 shows time profiles of reflected light pulses and sensitivity of the image sensor, of the system of Figure 1.
- Figures 3 and 4 show a plan view and a side view, respectively, of the experimental apparatus used to demonstrate the system.
- Figure 5 shows the range image acquired from the system when viewing the apparatus of Figures 3 and 4.
- Figure 6 shows a profile of distances for a horizontal section through Figure 5.
- Figure 7 is a more detailed block diagram of the apparatus of Figure 1.
- Figure 8 is a timing diagram showing timing details of the system's hardware.
- Figure 9 shows graphs of the approximate output of the laser diode array and of the image intensifier sensitivity, respectively.
- control hardware 1 selects switch 2 so that the high speed light source 3 is pulsed on and off by the pulse generator 4.
- the pulse generator 4 also drives circuitry (not shown in Figure 1) that controls the sensitivity of image sensor 5.
- The resulting image is stored in the control hardware 1.
- the reference 6 represents a video monitor.
- control hardware 1 selects switch 2 so that the light source is continuously on.
- the resulting image from the image sensor is again stored in the control hardware 1.
- Image I1 is taken when the light source is pulsed. The intensity of the image is dependent upon the distance to the objects, due to the inverse square law.
- The image is also dependent upon a gating effect caused by the overlap of the reflected light pulse from the scene with the 'on' time of the image sensor, which is again dependent on distance.
- Image I2 is taken when the light source is continuously on and is dependent upon the surface reflectivity of objects in the scene.
- The intensity of this image is also dependent upon the distance to the objects, due to the inverse square law.
- The two stored images are divided, pixel by pixel, to give R = K · I1 / I2, where R is a range image and K is a calibration factor which is constant.
- This processing removes the effect of surface reflectivity and inverse square light loss.
- The only effect that remains is that due to the overlap or convolution of the pulsed light with the gated image sensor.
- The convolution value depends on the delay between the onset of the reflected pulse at the receiver and the onset of the activation of the image sensor. This delay is due to the round trip time for light to travel from the pulsed light source, to the objects in the scene, and back to the image sensor 5. Thus, for any pixel in the range image, knowing the shape of the convolution function and the overall calibration factor K, the distance from the rangefinder to the corresponding point in the scene may be found directly from R.
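- The relationship between R and distance can be illustrated with a simple model. The sketch below is an illustration only, not taken from the patent: it assumes ideal square pulses, a receiver gate of equal width that opens at the instant each pulse is emitted, and K normalised to 1, with the function names invented. Under these assumptions the overlap falls linearly with round-trip delay, giving an unambiguous range of c·T/2, about 4.5 m for a 30 nSec pulse.

```python
# Illustrative model: ideal square light pulse and receiver gate of equal
# width, with the gate opening at the instant the pulse is emitted.
C = 3.0e8  # speed of light, m/s


def gate_overlap(distance_m, pulse_width_s=30e-9):
    """Fraction of the reflected pulse that falls inside the receiver gate."""
    round_trip_delay = 2.0 * distance_m / C
    return max(0.0, 1.0 - round_trip_delay / pulse_width_s)


def distance_from_ratio(r, k=1.0, pulse_width_s=30e-9):
    """Invert the ideal linear overlap to recover distance from R = K * overlap."""
    overlap = min(max(r / k, 0.0), 1.0)
    return (1.0 - overlap) * pulse_width_s * C / 2.0


if __name__ == "__main__":
    for d in (0.0, 1.15, 2.0, 4.5):  # metres; 1.15 m is the nearest test card
        r = gate_overlap(d)
        print(f"d = {d:.2f} m  R = {r:.3f}  recovered d = {distance_from_ratio(r):.2f} m")
```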
- the calibration factor, K is globally applied to all points in the image and may be found by obtaining a range image of a scene of known dimensions. This need be done only once.
- A refinement of the above mode of operation, to facilitate operation in non-ideal environments, is the extension of the control hardware to capture a third 'background' image to include in the processing. This image is taken with the light source turned off so that background illumination effects are recorded. The processing is modified so that the background image, I3, is subtracted from both I1 and I2 before the division is performed, giving R = K · (I1 − I3) / (I2 − I3).
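- A minimal software sketch of this three-image processing is given below, using numpy arrays in place of the frame store buffers and digital logic; the function name and the eps guard are illustrative additions, and K is assumed to have been found once from a scene of known dimensions as described above.

```python
# Illustrative only: i1 = pulsed image, i2 = continuous image,
# i3 = background image taken with the light source off.
import numpy as np


def range_image(i1, i2, i3=None, k=1.0, eps=1e-6):
    """Return R = K * (I1 - I3) / (I2 - I3); if no background image is
    supplied, R = K * I1 / I2."""
    i1 = np.asarray(i1, dtype=np.float64)
    i2 = np.asarray(i2, dtype=np.float64)
    if i3 is not None:
        i3 = np.asarray(i3, dtype=np.float64)
        i1 = i1 - i3   # remove background illumination from both images
        i2 = i2 - i3
    return k * i1 / np.maximum(i2, eps)   # guard against division by zero
```

- In the hardware described below, this arithmetic is carried out by dedicated digital logic on the three frame store buffers rather than in software.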
- the experiment used to demonstrate the system comprises six test cards A separated from each other by 15 cm.
- The receiver 5 is 115 cm from the closest test card A.
- The cards are also offset transversely to the direction of the light source.
- Figure 5 shows the range image acquired using this apparatus, and Figure 6 shows a profile of distances to the various cards on a horizontal section through the image of Figure 5.
- A pulse generator 11 provides timed trigger signals to the image intensifier driver 12, which drives the gate of the image intensifier 9.
- The pulse generator also triggers the laser diode array and associated drive circuitry 17.
- The laser diodes used are the SHARP LT015 F type. Light from the laser diode array is directed toward the scene (not shown).
- The system controller 15 selects the mode of the electronic switch 16, which allows the diode array 17 to be operated in pulsed mode (position a), continuous mode (position b), or turned off completely (position c). The controller 15 also selects one of three frame memory buffers in the frame store 14 corresponding to images obtained in the three operating modes of the laser diode array.
- The three outputs from the frame store 14 are processed by digital logic circuitry 18 to produce data representing range information about the scene. This is converted to an analogue signal by a digital to analogue converter 19, the output of which is combined by a circuit 20 with video sync information from the sync processor 13. The resulting signal is displayed on a video monitor 21.
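- The acquisition sequence just described can be summarised as follows; this is a hypothetical software rendering of the hardware behaviour, and the type and function names are invented for illustration.

```python
# Hypothetical rendering of the Figure 7 sequence: capture one frame per
# laser mode into the frame store, then combine the three buffers.
from enum import Enum


class LaserMode(Enum):
    PULSED = "a"       # electronic switch 16, position a
    CONTINUOUS = "b"   # electronic switch 16, position b
    OFF = "c"          # electronic switch 16, position c


def acquire_range_frame(set_laser_mode, grab_frame, process):
    """set_laser_mode, grab_frame and process stand in for the controller 15,
    the frame store 14 and the digital logic circuitry 18, respectively."""
    frames = {}
    for mode in (LaserMode.PULSED, LaserMode.CONTINUOUS, LaserMode.OFF):
        set_laser_mode(mode)         # select switch position and frame buffer
        frames[mode] = grab_frame()  # capture one video frame in this mode
    return process(frames[LaserMode.PULSED],
                   frames[LaserMode.CONTINUOUS],
                   frames[LaserMode.OFF])
```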
- The image intensifier 9 has a driving voltage which ranges from 0 V to -60 V and back to 0 V in 30 nSec.
- the microchannel plate is provided with -850 V (MCP out to MCP in) and the phosphor screen accelerating voltage is 5 V (Screen to MCP out).
- Receiving lens optics 8 consist of a manually adjusted focus lens and a galvanometer-controlled aperture.
- the filter is a Kodak Wratten filter #87.
- Figure 8 shows the relevant timing details of the system's hardware.
- Timing signals (23), (24) and (25) show the select signals for operating the mode switch and governing the frame buffer memory selection in the frame store 14. These are shown in relation to the odd and even fields of the interlaced video signal 22 from the video camera 10.
- The times when a control signal is asserted are the "active" times and are indicated by reference 26.
- signal 23 selects the pulsed mode of the laser array.
- When signal 24 is active, it selects the continuous mode of the laser array, while signal 25 being active selects the background mode, during which the laser array is idle.
- the laser array is also idle when none of the selections is active (time intervals 27).
- The active time 26 for a select signal is two video field times in length, or 20 mSec.
- the images are captured during the odd field of the interlaced video represented by the low state of signal 22.
- Reference 27 represents the recovery time between modes. This allows the phosphor of the image intensifier to return to a neutral state prior to the next selection.
- the recovery time is also two field times. These timing values are adjusted according to the persistence characteristics of the imaging system.
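- The field-synchronised sequencing of the select signals can be sketched as follows; the timing values are taken from the description above, while the constant and function names are invented for illustration.

```python
# Each select signal is active for two video fields (20 mSec), followed by a
# two-field recovery interval during which the laser array is idle and the
# intensifier phosphor settles.
FIELD_MS = 10.0        # one video field (two fields = 20 mSec per the description)
ACTIVE_FIELDS = 2      # active time, reference 26
RECOVERY_FIELDS = 2    # recovery time, reference 27


def select_schedule(modes=("pulsed", "continuous", "background")):
    """Yield (mode, start_ms, end_ms) for one complete acquisition cycle."""
    t = 0.0
    for mode in modes:
        start = t
        t += ACTIVE_FIELDS * FIELD_MS
        yield mode, start, t
        t += RECOVERY_FIELDS * FIELD_MS


if __name__ == "__main__":
    for mode, start, end in select_schedule():
        print(f"{mode:10s} active from {start:5.1f} to {end:5.1f} ms")
```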
- Figure 8 also shows the timing of the waveforms that drive the laser diode during pulsed mode, and drive the image intensifier.
- the length 28 of the driving pulse is of the order of 30 nSec.
- the off time 29 may be varied between 30 nSec and greater than 100 nSec.
- the image intensifier 9 has the same pulse duration and duty cycle as the laser array.
- A time delay 32 is adjustable over the range + or - 15 nSec. This is to compensate for circuit delays and the longitudinal displacement between the laser array and the image intensifier. This delay is set once, after the system is constructed, and is not thereafter adjusted unless recalibration is required.
- Figure 9 shows approximate graphs of the laser optical output 33 and the image intensifier sensitivity 34. These vary from ideal square waves because of the limited response times of the driver circuitry. The effects of these imperfections on the final range readings are readily calibrated out of the system.
- Images obtained from the three modes of operation of a pulsed-illuminator, gated-receiver imaging system are combined by the special hardware to produce a representation of range data for the viewed scene.
- The system captures a complete range image of the viewed scene using just three images, one from each mode.
- the step which measures the background image may be eliminated, particularly when background light levels are low.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
A pulse generator (11) sends timed trigger signals to an image intensifier driver (12) which drives the gate of the image intensifier (9). The pulse generator also triggers the laser diode array and associated drive circuitry (17), which emit light towards the scene to be ranged. Light reflected from the scene enters the imaging system through a filter (7) and lens (8), which focus the filtered light onto the intensifier (9), at which a video camera (10) is pointed. The output of the video camera (10) is connected to a frame store (14) and to a sync processor (13). The system controller (15) sets a switch (16) which allows the diode array to be operated in pulsed mode (a), continuous mode (b), or switched off. The three outputs from the frame store (14) are processed by logic circuitry (18) to produce data representing range information about the scene. This data is converted by a digital-to-analogue converter (19), combined with video sync information from the sync processor (13), and the resulting signal is displayed on a video monitor (21).
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AUPI887688 | 1988-06-20 | | |
AU8876/88 | 1988-06-20 | | |
Publications (2)
Publication Number | Publication Date |
---|---|
EP0424409A1 (en) | 1991-05-02 |
EP0424409A4 (en) | 1992-01-15 |
Family
ID=3773167
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19890907048 Withdrawn EP0424409A4 (en) | 1988-06-20 | 1989-06-20 | Range finding device |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP0424409A4 (en) |
JP (1) | JPH03505123A (en) |
WO (1) | WO1989012837A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3915631A1 (en) * | 1989-05-12 | 1990-11-15 | Dornier Luftfahrt | NAVIGATION PROCEDURE |
DE3915627A1 (en) * | 1989-05-12 | 1990-11-15 | Dornier Luftfahrt | OPTICAL RADAR |
DE69635891T2 (en) * | 1995-06-22 | 2006-12-14 | 3Dv Systems Ltd. | IMPROVED OPTICAL CAMERA FOR DISTANCE MEASUREMENT |
US6445884B1 (en) | 1995-06-22 | 2002-09-03 | 3Dv Systems, Ltd. | Camera with through-the-lens lighting |
ATE367587T1 (en) * | 2003-10-29 | 2007-08-15 | Fraunhofer Ges Forschung | DISTANCE SENSOR AND METHOD FOR DISTANCE DETECTION |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3897150A (en) * | 1972-04-03 | 1975-07-29 | Hughes Aircraft Co | Scanned laser imaging and ranging system |
FR2389906B1 (en) * | 1977-05-04 | 1981-02-27 | Telecommunications Sa | |
US4298280A (en) * | 1979-09-25 | 1981-11-03 | Massachusetts Institute Of Technology | Infrared radar system |
US4490037A (en) * | 1982-08-18 | 1984-12-25 | Eastman Kodak Company | Image sensor and rangefinder device having background subtraction with bridge network |
US4501961A (en) * | 1982-09-01 | 1985-02-26 | Honeywell Inc. | Vision illumination system for range finder |
JPS59193322A (en) * | 1983-04-18 | 1984-11-01 | Canon Inc | Photo-electric conversion element |
US4678323A (en) * | 1984-07-20 | 1987-07-07 | Canon Kabushiki Kaisha | Distance measuring devices and light integrators therefor |
US4812035A (en) * | 1986-11-03 | 1989-03-14 | Raytheon Company | AM-FM laser radar |
-
1989
- 1989-06-20 WO PCT/AU1989/000263 patent/WO1989012837A1/en not_active Application Discontinuation
- 1989-06-20 JP JP1506625A patent/JPH03505123A/en active Pending
- 1989-06-20 EP EP19890907048 patent/EP0424409A4/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO1989012837A1 (en) | 1989-12-28 |
EP0424409A4 (en) | 1992-01-15 |
JPH03505123A (en) | 1991-11-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA1332978C (en) | Imaging lidar system using non-visible light | |
US4708473A (en) | Acquisition of range images | |
US7834985B2 (en) | Surface profile measurement | |
US5034810A (en) | Two wavelength in-situ imaging of solitary internal waves | |
EP0396865B1 (en) | Optical radar | |
US20040233416A1 (en) | Method and device for recording a three-dimensional distance-measuring image | |
JP2004523769A (en) | Surface shape measurement | |
CN1163687A (en) | device and method for detection and demodulation of intensity modulated radiation field | |
KR20010033549A (en) | Method and device for recording three-dimensional distance-measuring images | |
JPH11508359A (en) | Improved optical ranging camera | |
US5434612A (en) | Duo-frame normalization technique | |
EP3543742B1 (en) | A 3d imaging system and method of 3d imaging | |
US4119379A (en) | Optical detection and ranging apparatus | |
GB2374743A (en) | Surface profile measurement | |
Bretthauer et al. | An electronic Cranz–Schardin camera | |
EP0424409A1 (en) | Range finding device | |
EP3143427A1 (en) | Imaging system and method for monitoring a field of view | |
CN211206789U (en) | Color laser radar imaging device | |
EP0777134A1 (en) | Device for observing objects | |
AU3831189A (en) | Range finding device | |
Christie et al. | Design and development of a multi-detecting two-dimensional ranging sensor | |
EP3358367B1 (en) | Event-triggered imaging pixels | |
RU2030839C1 (en) | Charge-coupled camera | |
Kotake et al. | Performance improvement of real-time 3D imaging ladar based on a modified array receiver | |
SU712662A1 (en) | Method of remote automatic measuring of demensions of similar objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 19901219 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE CH DE FR GB IT LI LU NL SE |
|
A4 | Supplementary search report drawn up and despatched |
Effective date: 19911122 |
|
AK | Designated contracting states |
Kind code of ref document: A4 Designated state(s): AT BE CH DE FR GB IT LI LU NL SE |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 19920215 |