WO2021028910A1 - A gimbal apparatus system and method for automated vehicles - Google Patents
- Publication number
- WO2021028910A1 (PCT/IL2020/050877)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gimbal
- camera
- based system
- image
- piezoelectric
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/64—Imaging systems using optical elements for stabilisation of the lateral and angular position of the image
- G02B27/646—Imaging systems using optical elements for stabilisation of the lateral and angular position of the image compensating for small deviations, e.g. due to vibration or shake
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
- G03B17/561—Support related camera accessories
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- said control unit is capable of determining, by triangulation, the distance to at least one object taking into account COG and higher mathematical moments of pixel values in said one object, or a segmented portion of said object.
- the stabilizer further comprising a piezoelectric element situated between said second stage and the at least one of the lens of the camera or the sensor array of the camera, for providing motion in a third direction, wherein said third direction is substantially at 90 degrees in relation to both the first and second directions.
- said motion in a third direction is used for focusing the camera.
- a distance to an observed object is determined by sharpness of an acquired image.
- FIG. 1 schematically depicts a possible sensor suite for AV, according to the prior art.
- FIG. 2A schematically illustrates a miniature piezoelectric-driven two-axis stabilized gimbal mount, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 2B schematically illustrates a small piezoelectric-driven two-axis stabilized gimbal mount, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 2C schematically illustrates a piezoelectric-driven two-axis narrow-motion-range stabilized gimbal mount, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 3A schematically illustrates an isometric view of a lens positioning mechanism for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 3C schematically illustrates an isometric view of a flexure-based lens positioning mechanism for an image stabilization and focus system, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 4A schematically illustrates optional locations for mounting gimbals on an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 4D schematically illustrates a system for automated vehicle applications, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 5B depicts a mosaic image, in accordance with some exemplary embodiments of the disclosed subject matter.
- Fig. 6A depicts a bicubic interpolation image of a plurality of pixelated images shifted by the gimbal angularly by a fraction of pixel size with respect to each other, in accordance with some exemplary embodiments of the disclosed subject matter.
- Fig. 6B depicts a super resolution reconstruction of the plurality of pixelated images shown in Figure 6A shifted by the gimbal angularly by a fraction of pixel size with respect to each other, in accordance with some exemplary embodiments of the disclosed subject matter.
- compositions, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
- AV autonomous vehicles
- CNC command and navigation controller
- Current sensors include visible light cameras fixed to the AV, ultrasonic proximity sensors, RADAR and LIDAR systems.
- Sensor suite 110 for AV 100 may comprise a plurality of sensors selected from a variety of technologies, wherein each sensor provides information for a specific zone and purpose, and wherein the zones and information types may overlap and augment each other.
- An intermediate-range LIDAR, capable of measuring distance to a target by illuminating the target with pulsed laser light (scanning or wide angle) and measuring the reflected pulses with a sensor, may be installed in the front of the AV to provide a 3D image of the intermediate front zone 114 for collision avoidance, by detecting emergency braking of a vehicle in front or detecting a pedestrian or other obstacle in front of the AV.
- A visible light camera mounted in front of the AV and having a wide field of view 116 may provide images for traffic sign recognition and road lane marking recognition.
- An example is MobilEye (13 Hartom St., Jerusalem, Israel), an Intel company in the business of vision-safety technology, which provides systems for detecting obstacles in front of the vehicle.
- Rear collision warning may be provided by a short or intermediate range RADAR mounted in the back of the AV and monitoring the back zone 122.
- Left and right surround views may be provided by visible light cameras mounted on two sides of the AV, monitoring the side zones 126R and 126L, respectively.
- FIGs. 2A and 2B schematically illustrate a miniature and a small piezoelectric-driven two-axis mechanically stabilized payload, respectively, in accordance with some exemplary embodiments of the disclosed subject matter.
- The specific miniature piezoelectric-driven two-axis mechanically stabilized gimbal mount payload 200m has a weight of 30 grams and a radius of 29 mm, carries either a 12MP day camera or a 120X160 pixel thermal camera, and can be stabilized to 1 milliradian (mRad) angular stabilization.
- The correct orientation under vibration is determined by using the velocity feedback from the gyro unit, to determine the absolute orientation relative to the inertial coordinate system (the ground), and the position feedback from an incremental high-resolution (21 µRad) encoder coupled to each rotary axis, to determine the absolute orientation relative to the platform the gimbal is attached to.
- The vibration suppression algorithm utilizes the developed control scheme, adapted to control L1B2 ultrasonic motors.
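The combined gyro-plus-encoder feedback described above can be sketched as a single control step. Everything here (the function name, state dictionary, proportional gain, and travel-limit handling) is an illustrative assumption; the patent states only that gyro velocity feedback and encoder position feedback are used together.

```python
def stabilize_step(gyro_rate_rad_s, encoder_angle_rad, dt, state, kp=1.0):
    """One control step holding the line of sight fixed in inertial space.

    Integrating the gyro rate estimates the payload's inertial orientation;
    the error from the commanded orientation becomes a motor command.  The
    encoder angle (payload relative to the vibrating platform) is used here
    only to respect the gimbal's mechanical travel limit.
    """
    state["inertial_angle"] += gyro_rate_rad_s * dt      # gyro integration
    error = state["target_angle"] - state["inertial_angle"]
    cmd = kp * error                                     # proportional correction
    if abs(encoder_angle_rad + cmd) > state["travel_limit"]:
        cmd = 0.0                                        # would exceed travel: hold
    return cmd
```

A real loop would add integral and derivative terms and the ultrasonic motor's drive electronics; this sketch shows only how the two feedback sources divide the work.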
- the architecture of gimbal 200x allows it to carry various optical components, such as cameras, lasers and more.
- FIG. 2C schematically illustrates a piezoelectric-driven two-axis limited-motion-range mechanically stabilized gimbal mount 200n, in accordance with some exemplary embodiments of the disclosed subject matter.
- Image stabilization devices that use piezoelectric motors for translational motion of at least a part of the optical train in a camera render controlled motion that can be applied to: a) better image stabilization, when used with gimbals; b) fixed cameras; and c) "super resolution" (possibly moving the lens and the projected image on the detector array by 1/2 pixel in 2D). It should be understood that the sub-pixel movement is also facilitated with the payload 201x.
- FIGs. 3A and 3B schematically illustrate an isometric view 330 and a block diagram 360, respectively, of a lens positioning mechanism for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
- Optical image stabilization (in the presence of vibrations) is performed by controlling the optical path to the image sensor. This is performed by determining the correct position of a dedicated OIS lens in a plane perpendicular to the optical axis.
- Vibration-detecting sensors, such as a gyro
- An exemplary embodiment of OIS lens positioning system 300 motion stage has an X-Y configuration with two independent motion axes, each powered by a Nanomotion Edge motor 332a and 332b for moving the X and Y stages 342a and 342b, respectively.
- the position of the lens is determined by a sensor such as X-Y Hall Effect position sensor 336.
- The motion is in a closed loop, controlled by a Nanomotion XCD2 controller-driver 338 comprising a controller 340 and a driver 342.
- the lens 370 is mounted on the X stage 342a.
- An equivalent configuration uses a piezo actuator, the benefit of which is a faster response for super resolution. The latter is shown in Fig. 3C.
- the imaging array of the camera is mounted on the X stage 342a and its position can be controlled in 2D for providing image stabilization. Additionally, small motions of 1/2 pixel in one or two dimensions may be used for increasing the effective resolution of the camera using algorithms of super resolution.
- FIG. 3C schematically illustrates an isometric view of a flexure-based lens positioning mechanism 380 for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
- An exemplary embodiment of the flexure-based OIS lens positioning system 380 has an X-Y flexure 352 with three independent motions: flexure-based axis X (351), flexure-based axis Y (381), and a longitudinal extension/contraction focusing axis Z (391).
- Frame 382 is mounted onto the camera, for example, using screws (not seen here) or using mounting holes 388x (only two are marked).
- Each piezoelectric block 354x is 5x5x2 mm in size. The compression force applied by preload element 356x to the corresponding piezoelectric block 394x may be adjusted by a corresponding preloading screw 357x.
- Each piezoelectric block 384x is 5x5x2 mm in size. The compression force applied by preload element 386x (only one is marked) to the corresponding piezoelectric block 384x can be adjusted by a corresponding preloading screw 357x.
- The tubular piezoelectric actuator 392 is 10 mm tall, with an outer diameter of 5.7 mm and an inner diameter of 4 mm.
- In this exemplary embodiment, four camera-equipped gimbals 200x are mounted on top of AV 100. It is clear that all the viewing zones (112, 114, 116, 119, 120x, 122, 124, 126x, and 128x) seen in Figure 1 can be provided by using a small number of gimbals, specifically if wide-scanning-range gimbals 200s or 200m are used.
- the four gimbals are positioned in or near the front left (FL), front right (FR), back left (BL) and back right (BR) corners of the top 410 of AV 100.
- the distance to an object can be calculated by triangulating from two images taken of the same object at the same time by two cameras separated by a known distance.
- the payloads need not be bore sighted and as long as the angle of each payload is known, the distance can be calculated. For long range, the gimbals would typically be bore sighted. For close range, the gimbals would typically be looking inwards.
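The triangulation above reduces to the law of sines once each payload's angle and the camera separation are known. A minimal sketch, where the angle convention and the function itself are illustrative assumptions, not the patent's prescribed computation:

```python
import math

def triangulate_range(baseline_m, alpha_rad, beta_rad):
    """Distance from the first camera to a target seen by both cameras.

    alpha_rad and beta_rad are the angles between the baseline joining the
    two cameras and each camera's line of sight to the target.  The third
    triangle angle sits at the target; the law of sines gives the range.
    """
    gamma = math.pi - alpha_rad - beta_rad   # angle subtended at the target
    return baseline_m * math.sin(beta_rad) / math.sin(gamma)
```

For an equilateral geometry (both angles 60 degrees) the range equals the baseline, which makes an easy sanity check; as the lines of sight approach parallel (gamma near zero), small angular errors dominate, which is why long-range triangulation benefits from finer angular accuracy.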
- FIG. 4B illustrates another optional location for mounting narrow-motion-range gimbals within the right front light bay and left front light bay of an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
- A lower viewpoint can provide information otherwise obscured by structures, such as when going up a ramp in an indoor parking garage.
- Gimbals 470x are equipped with an IR camera and are located behind the IR-transparent front lights window.
- An example material could be IR-transparent chalcogenide glass.
- FIG. 4C illustrates an enlarged view of the optional location for mounting a narrow-motion-range gimbal 200n carrying a sensor such as a camera 212n, within the left front light bay 470L of an AV 100, in accordance with some exemplary embodiments of the disclosed subject matter.
- FIG. 4D illustrates a system for automated vehicle applications, in accordance with some exemplary embodiments of the disclosed subject matter.
- A control unit 490 comprises three possible controllers or processors, as will be explained hereinafter. Motions of the sensor-carrying gimbals TBL, TBR, TFR, TFL, 470L and 470R are controlled by motion controller 498, which receives gimbal pointing information from sensors and encoders in the gimbals and sends motion commands to the gimbals. Optionally, each gimbal has a local motion controller that performs at least some of the motion-controlling functions.
- Image processor 497 receives image information from the gimbal-mounted cameras and sensors and optionally sends commands such as zooming, exposure parameters, and the like to the cameras.
- Each camera or sensor has a local image processor that performs at least some of the image processing functions, such as creating super-resolution images, mosaic or panoramic images, de-blurring the images, and the like.
- Central processing unit 495 provides automated vehicle functionality by exchanging data with motion controller 498 and image processor 497, and communicates with the car control sub-systems (for example, engine, brakes, and steering) and other sensors in the car (for example, speedometer, acceleration sensor) via communication channel 496.
- The control unit 490 is capable of: controlling a pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal at contiguous or partially overlapping cones; and using image data from the camera installed on the piezoelectric actuated gimbal to construct a composite image selected from the group consisting of a panoramic image and a two-dimensional mosaic image, wherein the number of pixels in the composite image is larger than the number of pixels in the camera installed on the piezoelectric actuated gimbal.
- Control unit 490 is capable of: controlling the pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal such that successive images provided by the camera are shifted by a fraction of a pixel; and combining the successive images, shifted by a fraction of a pixel, into a combined image having spatial resolution superior to that of a single image provided by the camera.
- control unit 490 is capable of: aiming at least two piezoelectric actuated gimbals such that two different cameras installed on the two piezoelectric actuated gimbals are viewing at least partially overlapping FOV; and using at least two images taken by the two different cameras to determine, by triangulation, the distance to at least one object within the overlapping FOV.
- The control unit is also capable, by using at least one combined image having superior spatial resolution, of improving the accuracy of the distance to at least one object within the overlapping FOV.
- FIGs. 5A and 5B illustrate a single image and a mosaic image, respectively, in accordance with some exemplary embodiments of the disclosed subject matter.
- the piezo motors-based gimbal payload has high stabilization performance with high dynamic properties enabling fast move and settle of a full FOV (field of view) at 10 or more FPS.
- This allows producing a panoramic mosaic image 520 comprising one or two dimensional arrays of (preferably slightly overlapping) FOV images 510a, 510b... 510x, enabling overview of a large area with angular resolution of an original single image 510a.
- Simple image stitching can be used to construct the panoramic mosaic image 520.
- The overlapping zones between frames 510x may be averaged or joined using other image processing algorithms known in the art. A refresh rate of 10 frames/sec or faster is possible.
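The averaged-overlap joining rule can be sketched for a single row of frames. Registration is assumed perfect here (each frame overlaps its neighbour by a fixed number of columns, as when the gimbal pans by a fixed angle between exposures); the function and its parameters are illustrative, not from the patent.

```python
import numpy as np

def mosaic_row(frames, overlap):
    """Stitch equally sized frames into one row; overlapping columns are averaged.

    Each frame's last `overlap` columns image the same scene columns as the
    next frame's first `overlap` columns.  Accumulate pixel sums and counts,
    then divide, so overlapped regions get the mean of the contributing frames.
    """
    h, w = frames[0].shape
    step = w - overlap
    out = np.zeros((h, step * (len(frames) - 1) + w))
    weight = np.zeros_like(out)
    for i, frame in enumerate(frames):
        out[:, i * step:i * step + w] += frame
        weight[:, i * step:i * step + w] += 1.0
    return out / weight          # overlapped columns become the average
```

A two-dimensional mosaic repeats the same accumulation over rows as well as columns; practical stitching would first register the frames using the gimbal encoder angles.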
- FIGs. 6A and 6B depict a bicubic interpolation image 610 and a super resolution reconstruction 620, respectively, of a plurality (in this example, four or more) of pixelated images shifted by the gimbal by a fraction of pixel size with respect to each other, in accordance with some exemplary embodiments of the disclosed subject matter.
- The high accuracy of the images received from the embodiments disclosed in figures 2 and 3 enables acquiring multiple images of the same object while shifting the images by a fraction of pixel size with respect to each other.
- For example, four images can be taken, such that: image No. 2 is shifted by 1/2 pixel size in the X direction with respect to image No. 1; image No. 3 is shifted by 1/2 pixel size in the Y direction with respect to image No. 1; and image No. 4 is shifted by 1/2 pixel size in both the X and Y directions with respect to image No. 1. This effectively increases the number of pixels at the expense of frame rate.
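For the four half-pixel-shifted frames just described, the simplest reconstruction interleaves them onto a grid with twice the pixel density. This sketch shows only the pixel-placement step of super resolution (registration and deconvolution are omitted), and the naming convention is an assumption:

```python
import numpy as np

def interleave_half_pixel(im00, im10, im01, im11):
    """Merge four half-pixel-shifted frames into a 2x-denser sampling grid.

    im00 is the reference (image No. 1); im10 is shifted 1/2 pixel in X
    (No. 2); im01 is shifted 1/2 pixel in Y (No. 3); im11 is shifted in
    both (No. 4).  Each low-resolution frame fills one of the four sample
    phases of the high-resolution grid.
    """
    h, w = im00.shape
    hi = np.empty((2 * h, 2 * w), dtype=im00.dtype)
    hi[0::2, 0::2] = im00
    hi[0::2, 1::2] = im10    # X-shifted frame fills the odd columns
    hi[1::2, 0::2] = im01    # Y-shifted frame fills the odd rows
    hi[1::2, 1::2] = im11
    return hi
```

The output has four times the pixel count of one input frame but needs four exposures, which is the frame-rate trade-off noted above.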
- COG center of gravity
- distance calculation can use center of gravity (COG) of features as well as higher moments in the image instead of using the edge of the features.
- COG can be calculated to a sub-pixel resolution, and may benefit from super resolution image reconstruction and partial-pixel image shifting. This way, the triangulation benefits from a finer angular accuracy and better depth accuracy for a given spacing between the cameras.
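The sub-pixel COG computation amounts to an intensity-weighted centroid over an image patch; a minimal sketch (the helper name and patch convention are assumptions):

```python
import numpy as np

def center_of_gravity(patch):
    """Intensity-weighted centroid (x, y) of an image patch, in pixel units.

    Because the result is a weighted average over many pixels, it resolves
    feature position to a fraction of a pixel, tightening the angles that
    feed the triangulation.
    """
    patch = patch.astype(float)
    ys, xs = np.indices(patch.shape)       # row/column coordinate grids
    total = patch.sum()
    return (xs * patch).sum() / total, (ys * patch).sum() / total
```

Higher moments (variance, skew of the intensity distribution) follow the same pattern with powers of the centered coordinates, and can help match the same feature between the two cameras' views.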
Abstract
A gimbal-based system for an automated vehicle comprises a plurality of piezoelectric actuated gimbals to be mounted on the vehicle, wherein at least two of the piezoelectric actuated gimbals each carry at least one camera, and a control unit for controlling the piezoelectric actuated gimbals. An image stabilizer is also provided that comprises a frame for connecting to a camera; a first stage piezoelectrically actuated to move in a first direction; and a second stage, connected to the first stage, piezoelectrically actuated to move in a second direction substantially at 90 degrees in relation to the first direction, wherein one of: a) the lens of the camera; or b) the sensor array of the camera moves with the two stages.
Description
A GIMBAL APPARATUS SYSTEM AND METHOD FOR AUTOMATED VEHICLES
TECHNICAL FIELD
[0001] The present disclosed subject matter relates to the use of plurality of gimbal devices for automated vehicle applications. More particularly, the present disclosed subject matter relates to the use of plurality of cameras mounted on piezoelectric activated gimbal devices, a control system and an image processing unit in automated vehicle applications.
BACKGROUND
[0002] A Level 4 Automated Vehicle (AV) is a vehicle that can drive itself almost all the time without any human input, but may be programmed not to drive in unmapped areas or during severe weather. Such level 4 automation requires tight sensor coverage of all relevant directions and ranges around the AV. A typical sensor suite utilized in a level 4 AV is shown in Figure 1. Level 4 autonomy requires a Global Navigation Satellite System (GNSS), such as GPS, GLONASS, Galileo, Beidou, or other regional systems.
[0003] LIDARs, RADARs, Inertial Measurement Units (IMUs), and Ultrasonic (US) systems provide data relative to the AV location, while GNSS provides absolute global coordinates.
[0004] Cameras mounted on the AV, combined with road maps, are used for navigation in mapped areas as in Level 3 automation defined as an AV that can take over all driving functions under certain circumstances.
[0005] Sensor fusion requires coherency amongst all systems. This coherency requires a single accurate source of time, acquired from the GNSS receivers. All sensors either come with their own GNSS or are synced externally. GNSS is required as the timing backbone for autonomous communication.
BRIEF SUMMARY
[0006] It is therefore an object of the present subject matter to provide, in accordance with a preferred embodiment, a gimbal-based system for an automated vehicle to be used with a vehicle comprising: a plurality of piezoelectric actuated gimbals to be mounted on the vehicle, wherein at least two of the piezoelectric actuated gimbals are each carrying at least one camera; and a control unit for controlling the piezoelectric actuated gimbals.
[0007] In accordance with another preferred embodiment disclosed herein, said control unit is capable of controlling a pointing direction of the piezoelectric actuated gimbals.
[0008] In accordance with another preferred embodiment disclosed herein, said control unit is capable of: receiving image data from the cameras; and providing automated vehicle information to the vehicle.
[0009] In accordance with another preferred embodiment disclosed herein, said control unit is capable of: controlling and stabilizing the pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal to contiguous or partially overlapping cones.
[0010] In accordance with another preferred embodiment disclosed herein, said control unit is capable of: using image data from the camera installed on the piezoelectric actuated gimbal to construct a composite image selected from the group consisting of a panoramic image and a one- or two-dimensional mosaic image, wherein a number of pixels in the composite image is larger than a number of pixels in the camera installed on the piezoelectric actuated gimbal.
[0011] In accordance with another preferred embodiment disclosed herein, said providing automated vehicle information to the vehicle comprises providing imaging information over field of view (FOV) larger than the FOV of the camera installed on the piezoelectric actuated gimbal.
[0012] In accordance with another preferred embodiment disclosed herein, said control unit is capable of controlling the pointing direction of at least one piezoelectric actuated gimbal to stabilize the image provided by a camera installed on the piezoelectric actuated gimbal to reduce image blurring while the car is in motion.
[0013] In accordance with another preferred embodiment disclosed herein, said control unit is capable of: controlling the pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal such that successive images provided by the camera are shifted by a fraction of a pixel in one or two dimensions; and processing the successive images, shifted by a fraction of a pixel, into a combined image having spatial resolution superior to that of a single image provided by the camera.
[0014] In accordance with another preferred embodiment disclosed herein, said control unit is capable of: aiming at least two piezoelectric actuated gimbals such that two different cameras installed on the two piezoelectric actuated gimbals are viewing at least partially overlapping FOV; and using at least two images taken by the two different cameras to determine, by triangulation, the distance to at least one object within the overlapping FOV.
[0015] In accordance with another preferred embodiment disclosed herein, said control unit is capable, by using at least one combined image having superior spatial resolution, of improving an accuracy of distance to at least one object within the overlapping FOV.
[0016] In accordance with another preferred embodiment disclosed herein, the piezoelectric actuated gimbals are mounted at or near the roof of the car.
[0017] In accordance with another preferred embodiment disclosed herein, at least four piezoelectric actuated gimbals are mounted at or near the four corners of the roof of the car.
[0018] In accordance with another preferred embodiment disclosed herein, the system further comprises at least one piezoelectric actuated gimbal mounted in each headlight bay of the car.
[0019] In accordance with another preferred embodiment disclosed herein, said control unit is capable of determining, by triangulation, the distance to at least one object taking into account the vertical separation between a roof-mounted camera and a headlight-bay-mounted camera.
[0020] In accordance with another preferred embodiment disclosed herein, said control unit is capable of determining, by triangulation, the distance to at least one object taking into account calculated Center Of Gravity (COG) of the said one object, or a segmented portion of said object to achieve sub-pixel resolution and improve depth measurement.
[0021] In accordance with another preferred embodiment disclosed herein, said control unit is capable of determining, by triangulation, the distance to at least one object taking into account COG and higher mathematical moments of pixel values in said one object, or a segmented portion of said object.
[0022] In accordance with another preferred embodiment disclosed herein, at least two cameras are operating in at least two different spectral ranges or using different exposure parameters.
[0023] In accordance with another preferred embodiment disclosed herein, the at least two cameras are selected from a group consisting of visible light camera, limited spectral band camera, multiple spectral range camera, high-speed camera, FLIR, IR camera, and night vision camera.
[0024] It is yet another object of the present subject matter to provide, in accordance with a preferred embodiment, an image stabilizer comprising: a frame for connecting to a camera; a first stage piezoelectrically actuated to move in a first direction; and a second stage piezoelectrically actuated to move in a second direction, wherein said second stage is connected to said first stage, wherein the second direction is substantially at 90 degrees in relation to the first direction, wherein one of: a) the lens of the camera; or b) the sensor array of the camera is moving with the two stages.
[0025] In accordance with another preferred embodiment disclosed herein, the stabilizer further comprises a piezoelectric element situated between said second stage and the at least one of the lens of the camera or the sensor array of the camera, for providing motion in a third direction, wherein said third direction is substantially at 90 degrees in relation to both the first and second directions.
[0026] In accordance with another preferred embodiment disclosed herein, said motion in a third direction is used for focusing the camera.
[0027] In accordance with another preferred embodiment disclosed herein, said piezoelectric element is in a form of a tube having a bore along the optical axis of the camera, and the lens of the camera is mounted on the distal end of the tube.
[0028] In accordance with another preferred embodiment disclosed herein, said frame, said first stage and said second stage are in the form of two-dimensional flexure device.
[0029] In accordance with another preferred embodiment disclosed herein, at least one of said first stage and said second stage are actuated by two oppositely activated piezoelectric actuators.
[0030] In accordance with another preferred embodiment disclosed herein, said two oppositely activated piezoelectric actuators are preloaded in compressional stress.
[0031] It is another object of the present subject matter to provide, in accordance with a preferred embodiment, a gimbal-based system for an automated vehicle to be used with a vehicle comprising: a single piezoelectric actuated gimbal to be mounted on the vehicle, carrying at least one camera or at least one LIDAR system; and a control unit for controlling the piezoelectric actuated gimbal, providing stabilization, panoramic view and super-resolution capabilities.
[0032] In accordance with another preferred embodiment disclosed herein, a distance to an observed object is determined by sharpness of an acquired image.
[0033] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosed subject matter belongs. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present disclosed subject matter, suitable
methods and materials are described below. In case of conflict, the specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and not intended to be limiting.
BRIEF DESCRIPTION OF THE DRAWINGS
[0034] Some embodiments of the disclosed subject matter are described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present disclosed subject matter only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the disclosed subject matter. In this regard, no attempt is made to show structural details of the disclosed subject matter in more detail than is necessary for a fundamental understanding of the disclosed subject matter, the description taken with the drawings making apparent to those skilled in the art how the several forms of the disclosed subject matter may be embodied in practice.
In the drawings:
[0035] Fig. 1 schematically depicts a possible sensor suite for AV, according to the prior art.
[0036] Fig. 2A schematically illustrates a miniature piezoelectric driven two axis stabilized gimbal mount, in accordance with some exemplary embodiments of the disclosed subject matter.
[0037] Fig. 2B schematically illustrates a small piezoelectric driven two axis stabilized gimbal mount, in accordance with some exemplary embodiments of the disclosed subject matter.
Fig. 2C schematically illustrates a piezoelectric driven two axis narrow motion range stabilized gimbal mount, in accordance with some exemplary embodiments of the disclosed subject matter.
[0038] Fig. 3A schematically illustrates an isometric view of a lens positioning mechanism for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
[0039] Fig. 3B illustrates a block diagram of a lens positioning mechanism for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
[0040] Fig. 3C schematically illustrates an isometric view of a flexure-based lens positioning mechanism for an image stabilization and focus system, in accordance with some exemplary embodiments of the disclosed subject matter.
[0041] Fig. 4A schematically illustrates optional locations for mounting gimbals on an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
[0042] Fig. 4B schematically illustrates another optional location for mounting narrow motion range gimbals within the right front light bay and left front light bay of an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
[0043] Fig. 4C schematically illustrates an enlarged view of the optional location for mounting a narrow motion range gimbal within the left front light bay of an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
[0044] Fig. 4D schematically illustrates a system for automated vehicle applications, in accordance with some exemplary embodiments of the disclosed subject matter.
[0045] Fig. 5A depicts a single stabilized image, in accordance with some exemplary embodiments of the disclosed subject matter.
[0046] Fig. 5B depicts a mosaic image, in accordance with some exemplary embodiments of the disclosed subject matter.
Fig. 6A depicts a bicubic interpolation image of a plurality of pixelated images angularly shifted by the gimbal by a fraction of a pixel with respect to each other, in accordance with some exemplary embodiments of the disclosed subject matter.
Fig. 6B depicts a super resolution reconstruction of the plurality of pixelated images shown in Figure 6A, angularly shifted by the gimbal by a fraction of a pixel with respect to each other, in accordance with some exemplary embodiments of the disclosed subject matter.
DETAILED DESCRIPTION
[0047] Before explaining at least one embodiment of the disclosed subject matter in detail, it is to be understood that the disclosed subject matter is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting. The drawings are generally not to scale. For clarity, non-essential elements were omitted from some of the drawings.
[0048] The terms "comprises", "comprising", "includes", "including", and "having" together with their conjugates mean "including but not limited to". The term "consisting of" has the same meaning as "including and limited to".
[0049] The term "consisting essentially of" means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
[0050] As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
[0051] Throughout this application, various embodiments of this disclosed subject matter may be presented in a range format. It should be understood that the description in range format is merely for convenience and brevity and should not be construed as an inflexible limitation on the scope of the disclosed subject matter. Accordingly, the description of a range should be
considered to have specifically disclosed all the possible sub-ranges as well as individual numerical values within that range.
[0052] It is appreciated that certain features of the disclosed subject matter, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the disclosed subject matter, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the disclosed subject matter. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
[0053] As autonomous vehicles (AV) become popular, there is a growing need for a set of sensors to provide a command and navigation controller (CNC) of the AV with information about the terrain as well as structures and obstacles around it. The information has to be obtained in real time, and must provide visual mapping as well as accurate three-dimensional (3D) details to determine the distance of obstacles.
[0054] Current sensors include visible light cameras fixed to the AV, ultrasonic proximity sensors, RADAR and LIDAR systems.
[0055] Fig. 1 schematically depicts a possible sensor suite for AV, according to the prior art.
[0056] Sensor suite 110 for AV 100 may comprise a plurality of sensors selected from a variety of technologies, wherein each sensor provides information for a specific zone and purpose, and wherein the zones and information type may overlap and augment each other.
[0057] For example, a long-range RADAR may be installed in the front of the AV to provide distance and relative speed to vehicles in the narrow far front zone 112 for adaptive cruise control, for example during high-speed cruising on a highway.
[0058] An intermediate range LIDAR capable of measuring distance to a target by illuminating the target with pulsed laser light, scanning or wide angle, and measuring the reflected pulses with a sensor, may be installed in the front of the AV to provide a 3D image of the intermediate front zone 114 for collision avoidance by detecting emergency braking of a vehicle in front, or detecting a pedestrian or other obstacle in front of the AV.
[0059] A visible light camera mounted in front of the AV and having a wide field of view 116 may provide images for traffic sign recognition and road lane marking recognition. An example is MobilEye (13 Hartom St., Jerusalem, Israel), an Intel company in the business of vision-safety technology, which provides systems for detection of obstacles in front of the vehicle.
[0060] Short or intermediate range RADAR, or few such radars, mounted in front of the AV may provide cross traffic alert from wide field front zone 118.
[0061] Front and back ultrasonic sensor arrays may provide park assist information from the near front zone 120f and the near back zone 120b, respectively.
[0062] Rear collision warning may be provided by a short or intermediate range RADAR mounted in the back of the AV and monitoring the back zone 122.
[0063] Park assistance and surround view may be provided by a visible light camera mounted in the back of the AV and monitoring the back zone 124.
[0064] Left and right surround views may be provided by visible light cameras mounted on two sides of the AV, monitoring the side zones 126R and 126L, respectively.
[0065] Left and right blind spot obstacle detection may be provided by short or intermediate range RADARs mounted on two sides of the AV, monitoring the side zones 128R and 128L, respectively.
[0066] It should be noticed that the use of a large number of sensors and sensor types is cumbersome and expensive, complicates the computation required by the CNC unit, and reduces the overall reliability of the system. Additionally, while LIDAR technology provides 3D information, its range, accuracy and refresh rate may be limited by power consumption and eye safety concerns. Moreover, in a crowded environment, mutual interference between LIDAR systems on different AVs may result in wrong readings, ghosts, or even "blinding" of nearby LIDAR and camera systems.
[0067] Referring now to Fig. 2A and 2B, respectively, which schematically illustrate a miniature and a small piezoelectric driven two axis mechanically stabilized payload, in accordance with some exemplary embodiments of the disclosed subject matter.
[0068] The specific miniature piezoelectric driven two axis mechanically stabilized gimbal mount payload 200m, taught herein as a non-limiting exemplary embodiment, has a weight of 30 grams and a radius of 29 mm, carries either a 12MP day camera or a 120x160 pixel thermal camera, and can be stabilized to 1 milliradian (mRad) angular stabilization.
[0069] The specific small piezoelectric driven two axis mechanically stabilized gimbal mount payload 200s, taught herein as a non-limiting exemplary embodiment, has a weight of 190 grams and a radius of 58 mm, comprises a 12MP day camera and a VGA thermal camera, and can be stabilized to 70 microradian (μRad) angular stabilization.
[0070] So as to not clutter the text, a numeral followed by the letter "x" will refer to any of the letters that follow that numeral in the drawing, for example 200x can stand for any of 200m and 200s, 120x can stand for any of 120f and 120b, etc.
[0071] Details of gimbals 200x can be found in the publication "Piezo-based, high dynamic range, wide bandwidth steering system for optical applications" to Nir Karasikov, et. al., published in the Proc. of SPIE Vol. 10190 101901C-4, which is incorporated herein by reference.
[0072] Gimbal 200x comprises a base 202x that is affixed to an AV (not seen in these figures) and houses a piezoelectric motor for the yaw rotation axis 201x. Turret 204x rotates in respect to base 202x about the yaw (panning) rotation axis 201x over a wide range of rotation angles, optionally a limitless range of any number of full rotations. Payload housing 206x rotates about a pitch axis 208x in a wide range of pitch angles, optionally covering +135 to -45 degrees or more. Payload housing 206x houses a visible light camera 210x. Optionally, additionally, or alternatively, other types of cameras can be used, such as Infra-Red (IR) cameras, SWIR, thermal cameras, and the like. The specific, non-limiting exemplary embodiment of small gimbal 200s is taught herein carrying both a visible light camera 210s and a thermal camera such as a forward looking IR camera (FLIR).
[0073] The Gimbaled payloads 200x provide steering and mechanical image stabilization capability, based on two perpendicular rotation axes, powered in this non-limiting exemplary embodiment by Nanomotion Edge (L1B2) motors. Each motor weighs 0.55 grams and provides up to 0.45N of force (at zero velocity) and a linear velocity up to 200 mm/s (at zero force).
[0074] In this non-limiting exemplary embodiment, four motors drive the pan axis and two motors drive the tilt axis in the small gimbal 200s. The motion of both axes is controlled in a closed servo loop, executed by the Nanomotion XCD2 controller-driver, receiving feedback from the gyro and the angular position sensors. Image stabilization is performed by driving the motors to continuously correct the angle of each rotary axis to keep the camera locked onto the desired original direction. The correct orientation under vibration is determined by using the velocity feedback from the gyro unit, to determine the absolute orientation relative to the inertial coordinate system (the ground), and the position feedback from an incremental high resolution (21 μRad) encoder coupled to each rotary axis, to determine the absolute orientation relative to the platform the gimbal is attached to. The vibration suppression algorithm utilizes the developed control scheme, adapted to control L1B2 ultrasonic motors. The architecture of gimbal 200x allows it to carry various optical components, such as cameras, lasers and more. Small gimbal 200s can carry two cameras; for example, a day camera (SONY CMOS IMX117CQT, 4024x3036, 1.55μm) and a thermal (IR) camera (Quark 640 LWIR, 640x480, 17μm, or equivalent). Image processing is performed via an Ambarella video processing card (A12A75). Vibration suppression of camera vibrations down to 43 micro-radians rms in the pan axis and to 70 micro-radians rms in the pitch axis was achieved, an improvement of 300:1 and 200:1, respectively, relative to the input vibration.
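The closed servo loop described above — integrating the gyro's velocity feedback into an inertial orientation estimate and commanding the axis motor to null the pointing error — can be sketched, for a single axis, roughly as follows. This is an illustrative simulation only; the PI gains, sample time, and ideal-sensor model are assumptions and not parameters of the embodiment.

```python
import math

def stabilization_loop(platform_angles, dt=0.001, kp=400.0, ki=2.0e4):
    """Single-axis gimbal stabilization sketch.  `platform_angles` is the
    vibrating platform angle [rad] sampled every `dt` seconds.  An ideal rate
    gyro on the payload is integrated into an inertial orientation estimate;
    a PI law commands the gimbal motor velocity so that the line of sight
    (platform angle + gimbal angle) stays near zero.  Returns the worst
    residual line-of-sight error over the run."""
    gimbal = 0.0          # gimbal angle relative to the platform [rad]
    inertial_est = 0.0    # orientation estimate from integrating the gyro
    integ = 0.0           # integral of the pointing error
    prev_inertial = platform_angles[0]
    worst = 0.0
    for plat in platform_angles:
        inertial = plat + gimbal                      # true line of sight
        gyro_rate = (inertial - prev_inertial) / dt   # ideal rate-gyro output
        prev_inertial = inertial
        inertial_est += gyro_rate * dt                # gyro integration
        err = -inertial_est                           # hold line of sight at 0
        integ += err * dt
        gimbal += (kp * err + ki * integ) * dt        # motor velocity command
        worst = max(worst, abs(plat + gimbal))
    return worst
```

For a 10 Hz, 10 mRad platform vibration this loop holds the residual pointing error to a small fraction of the input, illustrating (not reproducing) the suppression ratios quoted above.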
[0075] Referring now to Fig. 2C schematically illustrating a piezoelectric driven two axes limited motion range mechanically stabilized gimbal mounts 200n, in accordance with some exemplary embodiments of the disclosed subject matter.
[0076] Piezoelectric driven two axes limited motion range mechanically stabilized gimbal mount 200n comprises a base 202n that is affixed to an AV (not shown in the figure) and comprises a piezoelectric motor 219n for the yaw rotation axis 201n. Turret 204n rotates in respect to base 202n about the yaw (panning) rotation axis 201n over a narrow range of rotation angles, restricted by the size of the rotation arc 218n over which piezoelectric rotation motor(s) 219n can act.
[0077] Payload carrier 206n rotates about a pitch axis 208n in a narrow range of pitch angles, restricted by the size of the pitch arc 238n over which piezoelectric rotation motors 239na and 239nb can act. Payload carrier 206n can rotate in respect to turret 204n using a ball bearing (not seen herein), as an example. Payload mounting 236n can be used for mounting sensors such as a visible light camera, Infra-Red (IR) cameras, SWIR, FLIR, thermal cameras, and the like.
[0078] While the principle of operation is similar to the configurations seen in figures 2A-B, this configuration can save space and/or manage a larger payload sensor, directed to a preferred limited angular range.
[0079] Image stabilization devices that use piezoelectric motors for translational motion of at least a part of the optical train in a camera render controlled motion that can be applied to: a) better image stabilization, when used with gimbals; b) fixed cameras; and c) "super resolution" (possibly moving the lens and the projected image on the detector array by 1/2 pixel in 2D). It should be understood that the sub-pixel movement is also facilitated with the payloads 200x.
[0080] Referring now to Fig. 3A and Fig. 3B, respectively, schematically illustrating an isometric view 330 and a block diagram 360 of a lens positioning mechanism for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
[0081] Optical image stabilization (OIS) (in the presence of vibrations) is performed by controlling the optical path to the image sensor. This is performed by determining the correct position of a dedicated OIS lens in a plane perpendicular to the optical axis. Typically, a vibration-detecting sensor, such as a gyro, provides feedback to a microcomputer, which determines the lens position required to correct the effect of vibrations and commands the lens positioning system to position the lens accordingly. An exemplary embodiment of OIS lens positioning system 300 motion stage has an X-Y configuration with two independent motion axes, each powered by a Nanomotion Edge motor 332a and 332b for moving the X and Y stages 342a and 342b, respectively. The position of the lens is determined by a sensor such as X-Y Hall Effect position sensor 336. The motion is in a closed loop, controlled by a Nanomotion XCD2 controller-driver 338 comprising a controller 340 and driver 342. The lens 370 is mounted on the X stage 342a. An equivalent configuration uses a piezo actuator, the benefit of which is faster response for super resolution. The latter is shown in Fig. 3C.
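As a hedged illustration of the computation just described, the following sketch integrates gyro rates into camera tilt angles and converts them to the compensating X-Y lens offset: for a small tilt θ the image moves on the sensor by approximately f·tan θ, so the lens is driven to the opposite offset. The focal length, the axis naming, and the assumption of unity lens-shift sensitivity are illustrative, not taken from the embodiment.

```python
import math

def ois_lens_offset(pitch_rate, yaw_rate, dt, focal_length_mm, state):
    """One OIS update step (sketch).  Integrates the gyro rates [rad/s] into
    accumulated camera tilt angles, then returns the (dx, dy) lens offset in
    mm that cancels the resulting image shift of ~f*tan(theta) on the sensor."""
    state["pitch"] += pitch_rate * dt        # camera tilt about the pitch axis
    state["yaw"] += yaw_rate * dt            # camera tilt about the yaw axis
    dx_mm = -focal_length_mm * math.tan(state["yaw"])    # horizontal correction
    dy_mm = -focal_length_mm * math.tan(state["pitch"])  # vertical correction
    return dx_mm, dy_mm
```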
[0082] Both configurations can be applicable to stationary cameras, for example the types used in systems such as Mobileye.
[0083] In some embodiments, the imaging array of the camera is mounted on the X stage 342a and its position can be controlled in 2D for providing image stabilization. Additionally, small motions of 1/2 pixel in one or two dimensions may be used for increasing the effective resolution of the camera using algorithms of super resolution.
[0084] It should be noted that for low light applications and for thermal cameras, in which the integration time for a frame is comparable to or larger than the vibration timescale, mechanical image stabilization is superior to post-image software (SW) processing. The latter can stabilize between frames but cannot overcome the intra-frame blur.
[0085] Referring now to Fig. 3C, schematically illustrating an isometric view of a flexure-based lens positioning mechanism 380 for an optical image stabilization system, in accordance with some exemplary embodiments of the disclosed subject matter.
[0086] Some elements in this figure are not marked to reduce cluttering the figure or because they are not seen in this isometric view.
[0087] An exemplary embodiment of flexure-based OIS lens positioning system 380 has an X-Y flexure 352 with three independent motions: flexure-based axis X (351), flexure-based axis Y (381) and longitudinal extension / contraction focusing axis Z (391).
[0088] Frame 382 is mounted onto the camera, for example, using screws (not seen here) or using mounting holes 388x (only two are marked).
[0089] Motion along the X axis 351 is actuated by the two opposing X piezoelectric blocks 354a and 354b, which are oppositely powered, compressing or extending the four X flexure structure 355x. In the exemplary, non-limiting embodiment, each piezoelectric block 354x is 5x5x2 mm in size. Compression force applied by preload element 356x to the corresponding piezoelectric block 354x may be adjusted by a corresponding preloading screw 357x.
[0090] Similarly, motion along the Y axis 381 is actuated by the two opposing Y piezoelectric blocks 384x, which are oppositely powered, compressing or extending the four Y flexure structure 385x (only two are marked in the figure). In the exemplary, non-limiting embodiment, each piezoelectric block 384x is 5x5x2 mm in size. Compression force applied by preload element 386x (only one is marked) to the corresponding piezoelectric block 384x can be adjusted by a corresponding preloading screw 357x.
[0091] Additionally, tubular piezoelectric actuator 392, affixed to the inner flexure block 390 on one side, provides motion in the Z axis 391 for focusing a lens (not seen here) attached to the top side of the tubular piezoelectric actuator 392.
[0092] In the exemplary, non-limiting embodiment, tubular piezoelectric actuator 392 is 10 mm tall, has an outer diameter of 5.7 mm and an inner diameter of 4 mm.
[0093] Reference is now made to Fig. 4A illustrating optional locations for mounting gimbals on an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
[0094] In this exemplary embodiment, four camera-equipped gimbals 200x are mounted on top of AV 100. It is clear that all the viewing zones (112, 114, 116, 118, 120x, 122, 124, 126x, and 128x) seen in figure 1 can be provided by using a small number of gimbals, specifically if wide scanning range gimbals 200s or 200m are used.
[0095] Typically, the four gimbals are positioned in or near the front left (FL), front right (FR), back left (BL) and back right (BR) corners of the top 410 of AV 100.
[0096] The distance to an object can be calculated by triangulating from two images taken of the same object at the same time by two cameras separated by a known distance. The larger the distance between cameras, the larger the parallax differences between the images, and thus the better the accuracy of the calculation. Also, the finer the angular resolution of the gimbal and camera, the better the accuracy. Furthermore, given the high angular accuracy of the payloads, the payloads need not be bore sighted; as long as the angle of each payload is known, the distance can be calculated. For long range, the gimbals would typically be bore sighted. For close range, the gimbals would typically be looking inwards.
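The triangulation described above can be sketched as follows, for two cameras a known baseline apart, each gimbal reporting the bearing to the object relative to the direction perpendicular to the baseline. The geometry and sign conventions here are illustrative assumptions, not taken from the embodiment.

```python
import math

def triangulate_distance(baseline_m, angle_left_rad, angle_right_rad):
    """Perpendicular range to an object seen by two cameras separated by
    `baseline_m`.  Each angle is the bearing from that camera's boresight
    (perpendicular to the baseline), measured positive toward the other
    camera, so the two rays converge on the object."""
    # Left camera at x=0, right camera at x=baseline, object at (x, y), y>0:
    # tan(aL) = x/y and tan(aR) = (baseline - x)/y  =>  y = B/(tan aL + tan aR)
    denom = math.tan(angle_left_rad) + math.tan(angle_right_rad)
    if denom <= 0.0:
        raise ValueError("rays do not converge in front of the cameras")
    return baseline_m / denom
```

The same relation shows why a wide baseline and fine angular resolution help: the range error grows roughly as R²·δθ/B, so halving the angular error or doubling the baseline roughly halves the range error.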
[0097] Referring now to Fig. 4B illustrating another optional location for mounting narrow motion range gimbals within the right front light bay and left front light bay of an AV, in accordance with some exemplary embodiments of the disclosed subject matter.
[0098] In this exemplary embodiment, two camera-equipped gimbals, 470R and 470L, are mounted in the right and left front light bays 460R and 460L, respectively, of AV 100. This configuration can augment the car-top mounted gimbals seen in figure 4A. This configuration can provide one or more of the following advantages:
[0099] A) Continuous monitoring of the forward-looking zones (112, 114, 116, 118 in figure 1) while the FL and FR gimbals are rotated to monitor the side zones such as 126x in figure 1.
[00100] B) Accurate monitoring of cross traffic zone 118, parking assist zone 120f, and road lane markings 116, all shown in figure 1.
[00101] C) A lower viewpoint can provide information obscured by structures, such as when going up a ramp in an indoor parking garage.
[00102] D) The vertical distance between the top mounted gimbal and the front light bay gimbal can provide better calculated distance accuracy for objects having primarily horizontal features.
[00103] E) Distance calculation for the forward zone now has six redundant possible camera pairs. Distance estimation is thus more accurate and robust.
[00104] F) The larger distance between camera pairs TFR-470L and TFL-470R enables greater distance accuracy at longer range (for convenience, the top mounted gimbals are marked as Top-Front-Right (TFR), Top-Front-Left (TFL), and so on).
[00105] In some exemplary embodiments, gimbals 470x are equipped with an IR camera and are located behind the IR-transparent front light window. An example material could be IR-transparent chalcogenide glass.
[00106] Reference is now made to Fig. 4C illustrating an enlarged view of the optional location for mounting a narrow motion range gimbal 200n carrying a sensor such as a camera 212n, within the left front light bay 470L of an AV 100, in accordance with some exemplary embodiments of the disclosed subject matter.
[00107] Fig. 4D illustrates a system for automated vehicle applications, in accordance with some exemplary embodiments of the disclosed subject matter.
[00108] The system is shown as a non-limiting example and the actual number of gimbals and the topology of the system may vary. Additionally, and optionally, non-gimbaled sensors may be used as well. A control unit 490 comprises three possible controllers or processors as will be explained hereinafter. Motions of sensor carrying gimbals TBL, TBR, TFR, TFL, 470L and 470R are controlled by motion controller 498 that receives gimbal pointing information from sensors and encoders in the gimbals, and sends motion commands to the gimbals. Optionally, each gimbal has a local motion controller that performs at least some of the motion controlling function.
[00109] Image processor 497 receives image information from the gimbal-mounted cameras and sensors and optionally sends commands such as zooming, exposure parameters, and the like to the cameras. Optionally, each camera or sensor has a local image processor that performs at least some of the image processing functions such as creating super resolution images, mosaic or panoramic images, de-blurring the images, and the like.
[00110] Central processing unit 495 provides automated vehicle functionality by exchanging data with motion controller 498 and image processor 497, as well as communicating with the car control sub-systems (for example, engine, brakes, and steering) and other sensors in the car (for example, a speedometer or acceleration sensor) via communication channel 496.
[00111] It should be noted that the controllers / processors can be incorporated within the control unit or separated or combined in any architecture possible.
[00112] The control unit 490 is capable of: controlling a pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal to contiguous or partially overlapping cones; and using image data from the camera installed on the piezoelectric actuated gimbal to construct a composite image selected from the group consisting of a panoramic image and a two-dimensional mosaic image, wherein a number of pixels in the composite image is larger than a number of pixels in the camera installed on the piezoelectric actuated gimbal.
[00113] Moreover, the control unit 490 is capable of: controlling the pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal such that successive images provided by the camera are shifted by a fraction of a pixel; and combining the successive images that are shifted by a fraction of a pixel into a combined image having superior spatial resolution compared to a single image provided by the camera.
[00114] Moreover, the control unit 490 is capable of: aiming at least two piezoelectric actuated gimbals such that two different cameras installed on the two piezoelectric actuated gimbals are viewing at least partially overlapping FOV; and using at least two images taken by the two different cameras to determine, by triangulation, the distance to at least one object within the overlapping FOV.
[00115] The control unit is capable also, by using at least one combined image having superior spatial resolution, to improve the accuracy of distance to at least one object within the overlapping FOV.
[00116] Referring now to Fig. 5A and Fig. 5B, respectively illustrating a single image and a mosaic image, in accordance with some exemplary embodiments of the disclosed subject matter.
[00117] The piezo-motor based gimbal payload has high stabilization performance with high dynamic properties, enabling fast move and settle of a full FOV (field of view) at 10 or more
FPS. This allows producing a panoramic mosaic image 520 comprising one or two dimensional arrays of (preferably slightly overlapping) FOV images 510a, 510b... 510x, enabling overview of a large area with the angular resolution of an original single image 510a. Simple image stitching can be used to construct the panoramic mosaic image 520. Alternatively or additionally, the overlapping zones between frames 510x may be averaged or joined using other image processing algorithms known in the art. A refresh rate of 10 frames/sec or faster is possible.
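A minimal sketch of the mosaic assembly described above, assuming a pure horizontal pan with exact gimbal encoder readback so each frame's pixel offset is known; overlapping columns are averaged, as the text suggests. The tile geometry and step size are illustrative assumptions.

```python
import numpy as np

def build_mosaic(tiles, step_px, tile_shape):
    """Assemble a panoramic strip from frames taken at successive gimbal pan
    angles.  `tiles` are 2-D arrays of shape `tile_shape`; frame i is placed
    at a horizontal offset of i*step_px pixels (step_px < tile width gives
    overlap).  Overlapping pixels are averaged."""
    h, w = tile_shape
    width = step_px * (len(tiles) - 1) + w
    acc = np.zeros((h, width))    # accumulated intensities
    cnt = np.zeros((h, width))    # how many frames covered each pixel
    for i, tile in enumerate(tiles):
        x0 = i * step_px
        acc[:, x0:x0 + w] += tile
        cnt[:, x0:x0 + w] += 1.0
    return acc / cnt
```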
[00118] Referring now to Fig. 6A and Fig. 6B, respectively, depicting a bicubic interpolation image 610 and super resolution reconstruction 620 of a plurality (in this example, four or more images were used) of pixelated images shifted by the gimbal by a fraction of a pixel with respect to each other, in accordance with some exemplary embodiments of the disclosed subject matter.
[00119] The theoretical limit of resolution of a pixelated camera is, apart from the diffraction limit, set by the pixel size and the magnification of the optics. Increasing the number of pixels is costly, and provides no benefit once the resolution is limited by the optics. High magnification optics requires a large front lens focal length and diameter and provides a limited field of view (FOV). Acquiring mosaic panoramic images (as seen in figures 5A-B) is one way to provide a large FOV, high resolution image. It requires substantial panning of the gimbal, and the resolution remains the original resolution of the camera.
[00120] Furthermore, the high accuracy of the images received from the embodiments disclosed in figures 2 and 3 enables acquiring multiple images of the same object, while shifting the images by a fraction of a pixel with respect to each other. For example, four images can be taken, such that: image No. 2 is shifted by 1/2 pixel size in the X direction with respect to image No. 1; image No. 3 is shifted by 1/2 pixel size in the Y direction with respect to image No. 1; and image No. 4 is shifted by 1/2 pixel size in both the X and Y directions with respect to image No. 1. This effectively increases the number of pixels at the expense of frame rate.
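The four-image scheme just described amounts to sampling the scene on a grid twice as fine as the sensor's. A minimal shift-and-add sketch (assuming ideal, noise-free half-pixel shifts; practical reconstructions use more elaborate algorithms):

```python
import numpy as np

def super_resolve_2x(im00, im10, im01, im11):
    """Interleave four equally sized low-resolution frames, shifted by half a
    pixel relative to the reference, onto a 2x finer grid: im00 is image
    No. 1 (reference), im10 is shifted 1/2 pixel in X, im01 in Y, and im11
    in both X and Y.  Returns an image with twice the pixel count per axis."""
    h, w = im00.shape
    hi = np.empty((2 * h, 2 * w))
    hi[0::2, 0::2] = im00   # reference samples
    hi[0::2, 1::2] = im10   # +1/2 pixel in X
    hi[1::2, 0::2] = im01   # +1/2 pixel in Y
    hi[1::2, 1::2] = im11   # +1/2 pixel in X and Y
    return hi
```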
[00121] It can be noticed that blurred details 612 seen within the bicubic image 610 are resolved more clearly (622) in the super resolution reconstructed image 620.
[00122] Distance calculation can use the center of gravity (COG) of features, as well as higher moments in the image, instead of using the edges of the features. The COG can be calculated to sub-pixel resolution, and may benefit from super resolution image reconstruction and partial-pixel image shifting. This way, the triangulation gains finer angular accuracy, and hence better depth accuracy, for a given spacing between the cameras.
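The sub-pixel COG mentioned above is the intensity-weighted centroid of a feature patch. A minimal sketch (function name assumed for illustration):

```python
import numpy as np

def centroid(patch):
    """Intensity-weighted center of gravity (COG) of an image patch,
    returned as (row, col) at sub-pixel resolution."""
    patch = np.asarray(patch, dtype=float)
    total = patch.sum()
    rows, cols = np.indices(patch.shape)
    # First moments of the intensity distribution give the COG.
    return (rows * patch).sum() / total, (cols * patch).sum() / total
```

Higher moments of the same intensity distribution (e.g. second moments giving feature extent and orientation) can be computed analogously and used as additional matching cues between the two camera views.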
[00123] It should be noted that the better the image resolution, the better the accuracy of the distance calculation.
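The dependence of depth accuracy on image resolution follows from the standard pinhole stereo relation, depth = focal length x baseline / disparity: finer (sub-pixel) disparity estimates give finer depth. A minimal sketch with illustrative numbers (all parameter values are assumptions, not from the application):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Pinhole-model stereo triangulation: depth = f * B / d.
    Sub-pixel disparity (e.g. from COG or super-resolution images)
    directly improves depth accuracy. Parameters are illustrative."""
    return focal_px * baseline_m / disparity_px
```

For example, with a 1000 px focal length and a 0.5 m camera spacing, a disparity of 10 px corresponds to 50 m, so a half-pixel disparity error at that range shifts the estimate by over 2 m, which is why the sub-pixel techniques above matter.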
[00124] Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.
Claims
1. A gimbal-based system for an automated vehicle to be used with a vehicle comprising: a plurality of piezoelectric actuated gimbals to be mounted on the vehicle, wherein at least two of the piezoelectric actuated gimbals are each carrying at least one camera; and a control unit for controlling the piezoelectric actuated gimbals.
2. The gimbal-based system of Claim 1, wherein said control unit is capable of controlling a pointing direction of the piezoelectric actuated gimbals.
3. The gimbal-based system of Claim 1, wherein said control unit is capable of: receiving image data from the cameras; and providing automated vehicle information to the vehicle.
4. The gimbal-based system of Claims 2 and 3, wherein said control unit is capable of: controlling and stabilizing the pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal to contiguous or partially overlapping cones.
5. The gimbal-based system of Claims 1 to 3, wherein said control unit is capable of: using image data from the camera installed on the piezoelectric actuated gimbal to construct a composite image selected from the group consisting of a panoramic image and a one or two-dimensional mosaic image, wherein a number of pixels in the composite image is larger than a number of pixels in the camera installed on the piezoelectric actuated gimbal.
6. The gimbal-based system of Claims 3, 4 and 5, wherein said providing automated vehicle information to the vehicle comprises providing imaging information over field of view (FOV) larger than the FOV of the camera installed on the piezoelectric actuated gimbal.
7. The gimbal-based system of Claims 2 and 3, wherein said control unit is capable of controlling the pointing direction of at least one piezoelectric actuated gimbal to stabilize the image provided by a camera installed on the piezoelectric actuated gimbal, to reduce image blurring while the car is in motion.
8. The gimbal-based system of Claims 2 and 3, wherein said control unit is capable of: controlling the pointing direction of at least one piezoelectric actuated gimbal to aim a camera installed on the piezoelectric actuated gimbal such that successive images provided by the camera are shifted by a fraction of a pixel in one or two dimensions; and processing the successive images provided that are shifted by a fraction of a pixel to a combined image having superior spatial resolution than a single image provided by the camera.
9. The gimbal-based system of Claims 2 and 3, wherein said control unit is capable of: aiming at least two piezoelectric actuated gimbals such that two different cameras installed on the two piezoelectric actuated gimbals are viewing at least partially overlapping FOV; and using at least two images taken by the two different cameras to determine, by triangulation, the distance to at least one object within the overlapping FOV.
10. The gimbal-based system of Claim 9, wherein said control unit is capable, by using at least one combined image having superior spatial resolution, to improve an accuracy of distance to at least one object within the overlapping FOV.
11. The gimbal-based system of Claims 1 to 10, wherein the piezoelectric actuated gimbals are mounted at or near the roof of the car.
12. The gimbal-based system of Claims 1 to 11, wherein at least four piezoelectric actuated gimbals are mounted at or near the four corners of the roof of the car.
13. The gimbal-based system of Claims 1 to 10, further comprising at least one piezoelectric actuated gimbal mounted in each headlight bay of the car.
14. The gimbal-based system of Claims 12 and 13, wherein said control unit is capable of determining, by triangulation, the distance to at least one object taking into account the vertical separation between a roof-mounted camera and a headlight bay mounted camera.
15. The gimbal-based system of Claim 14, wherein said control unit is capable of determining, by triangulation, the distance to at least one object taking into account a calculated Center Of Gravity (COG) of said one object, or of a segmented portion of said object, to achieve sub-pixel resolution and improve depth measurement.
16. The gimbal-based system of Claim 15, wherein said control unit is capable of determining, by triangulation, the distance to at least one object taking into account COG and higher mathematical moments of pixel values in said one object, or a segmented portion of said object.
17. The gimbal-based system of Claims 1 to 16, wherein at least two cameras are operating in at least two different spectral ranges or using different exposure parameters.
18. The gimbal-based system of Claim 17, wherein the at least two cameras are selected from a group consisting of visible light camera, limited spectral band camera, multiple spectral range camera, high-speed camera, FLIR, IR camera, and night vision camera.
19. An image stabilizer comprising: a frame for connecting to a camera; a first stage piezoelectrically actuated to move in a first direction; and a second stage piezoelectrically actuated to move in a second direction, wherein said second stage is connected to said first stage, wherein the second direction is substantially at 90 degrees in relation to the first direction,
wherein one of: a) the lens of the camera; or b) the sensor array of the camera is moving with the two stages.
20. The image stabilizer of Claim 19, further comprising a piezoelectric element situated between said second stage and at least one of the lens of the camera or the sensor array of the camera, for providing motion in a third direction, wherein said third direction is substantially at 90 degrees in relation to both the first and second directions.
21. The image stabilizer of Claim 19, wherein said motion in a third direction is used for focusing the camera.
22. The image stabilizer of Claim 20, wherein said piezoelectric element is in a form of a tube having a bore along the optical axis of the camera, and the lens of the camera is mounted on the distal end of the tube.
23. The image stabilizer of Claim 19, wherein said frame, said first stage and said second stage are in the form of two-dimensional flexure device.
24. The image stabilizer of Claim 19, wherein at least one of said first stage and said second stage are actuated by two oppositely activated piezoelectric actuators.
25. The image stabilizer of Claim 24, wherein said two oppositely activated piezoelectric actuators are preloaded in compressive stress.
26. A gimbal-based system for an automated vehicle to be used with a vehicle comprising: a single piezoelectric actuated gimbal to be mounted on the vehicle, carrying at least one camera or at least one LIDAR system; and a control unit for controlling the piezoelectric actuated gimbal, providing stabilization, panoramic view and super-resolution capabilities.
27. The gimbal-based system of Claim 26, wherein a distance to an observed object is determined by the sharpness of an acquired image.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962885340P | 2019-08-12 | 2019-08-12 | |
US62/885,340 | 2019-08-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021028910A1 true WO2021028910A1 (en) | 2021-02-18 |
Family
ID=74570589
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IL2020/050877 WO2021028910A1 (en) | 2019-08-12 | 2020-08-11 | A gimbal apparatus system and method for automated vehicles |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021028910A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115356261A (en) * | 2022-07-29 | 2022-11-18 | 燕山大学 | Defect detection system and method for automobile ball cage dust cover |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7139412B2 (en) * | 2001-04-24 | 2006-11-21 | Matsushita Electric Industrial Co., Ltd. | Image synthesis display method and apparatus for vehicle camera |
US8212880B2 (en) * | 2007-12-20 | 2012-07-03 | Utah State University Research Foundation | Three-axis image stabilization system |
US20180048870A1 (en) * | 2010-11-17 | 2018-02-15 | Omron Scientific Technologies, Inc. | Method and Apparatus for Monitoring Zones |
Non-Patent Citations (1)
Title |
---|
KARASIKOV, NIR ET AL.: "Piezo-based, high dynamic range, wide bandwidth steering system for optical applications", GROUND/AIR MULTISENSOR INTEROPERABILITY, INTEGRATION, AND NETWORKING FOR PERSISTENT ISR VIII, vol. 10190, 31 December 2017 (2017-12-31), pages 1 - 15, XP060088903 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20852090 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 20852090 Country of ref document: EP Kind code of ref document: A1 |