EP2569756A1 - System and method for image registration

System and method for image registration

Info

Publication number
EP2569756A1
EP2569756A1
Authority
EP
European Patent Office
Prior art keywords
sensors
sensor
registration
control
target area
Prior art date
Legal status
Withdrawn
Application number
EP11724137A
Other languages
German (de)
French (fr)
Inventor
Laird Jason John Hay
Jason Lepley
Mark Edgar Bray
Current Assignee
Leonardo UK Ltd
Original Assignee
Selex Galileo Ltd
Priority date
Filing date
Publication date
Application filed by Selex Galileo Ltd filed Critical Selex Galileo Ltd
Publication of EP2569756A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30212 Military



Abstract

A system and method are described for improving image registration systems. A target area to be monitored is superimposed with a series of control data points by a sensor within the system. Image data output by all sensors within the system is then processed using a suitable transformation based on the positioning of the control data points and their visibility to all the sensors in the system.

Description

System and Method for Image Registration
The invention relates to a system and method for image registration. More specifically, but not exclusively, it relates to a system and method for image registration incorporating the introduction of artificial common registration control point data into a sensor field of view (FoV), which can then be shared by disparate sensors to provide a common field of reference for the entire system.
In current known systems of image registration solely passive image registration methods are used, including but not limited to:
1. Intensity Based Image Registration
Intensity-based methods compare intensity patterns in images via correlation metrics. They register entire images or sub-images by comparing their respective intensity profiles using a given metric (e.g. Sum of Absolute Differences). When the images are registered, the metric is at a minimum, and the centre of each corresponding sub-image is treated as a corresponding feature point.
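By way of illustration only (this sketch is not part of the patent), an exhaustive Sum of Absolute Differences (SAD) search for the best-matching sub-image position might look like the following; the image representation as nested lists and the function names are assumptions for the example:

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized patches."""
    return sum(abs(x - y)
               for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def register_by_sad(reference, template):
    """Slide `template` over `reference` and return the (row, col)
    offset whose patch minimises the SAD metric."""
    th, tw = len(template), len(template[0])
    best_score, best_offset = None, (0, 0)
    for r in range(len(reference) - th + 1):
        for c in range(len(reference[0]) - tw + 1):
            patch = [row[c:c + tw] for row in reference[r:r + th]]
            score = sad(patch, template)
            if best_score is None or score < best_score:
                best_score, best_offset = score, (r, c)
    return best_offset
```

In practice the search would be windowed and accelerated (e.g. coarse-to-fine), but the minimum of the metric plays the same role as described above.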
2. Feature Based Image Registration
Feature-based methods find correspondence between image features such as points, lines, and contours. They establish correspondence between a number of points in the images; knowing the correspondence between those points, a transform is then determined to map the target image to the reference image.
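As a minimal sketch (not taken from the patent), the "determine a transform from point correspondences" step can be shown for an affine transform fitted exactly from three matched point pairs; the solver and function names here are assumptions for illustration:

```python
def solve3(m, v):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    a = [row[:] + [rhs] for row, rhs in zip(m, v)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(a[r][i]))
        a[i], a[p] = a[p], a[i]
        for r in range(i + 1, 3):
            f = a[r][i] / a[i][i]
            for c in range(i, 4):
                a[r][c] -= f * a[i][c]
    x = [0.0] * 3
    for i in (2, 1, 0):
        x[i] = (a[i][3] - sum(a[i][c] * x[c] for c in range(i + 1, 3))) / a[i][i]
    return x

def affine_from_points(src, dst):
    """Affine transform (x' = ax + by + c, y' = dx + ey + f) mapping
    three source points onto three corresponding target points."""
    m = [[x, y, 1.0] for x, y in src]
    abc = solve3(m, [x for x, _ in dst])
    def_ = solve3(m, [y for _, y in dst])
    return abc, def_

def apply_affine(t, p):
    """Apply the fitted transform to a point."""
    (a, b, c), (d, e, f) = t
    x, y = p
    return a * x + b * y + c, d * x + e * y + f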
3. Frequency Based Image Registration
These methods use metrics in the frequency domain to compare respective images, and include methods such as phase correlation.
4. Interactive Image Registration
This is a manual method of image registration, which requires the user to identify a series of correlated points in each input image which are then used to calculate a transform to bring all of the input images onto the same coordinate set. This can also be done by placing image registration markers into the scene for later identification for use in Feature Based Methods (see 2).
5. Calibrated Field of View Image Registration
This uses co-boresighted sensors, often on the same platform (and sometimes sharing the same optics or even the same sensor array), with known offsets in parallax and field of view. These offsets are then accounted for in the calculation of the required image transform.
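To illustrate (a sketch that is not part of the patent), when the relative scale (field-of-view ratio) and parallax offset between two co-boresighted sensors are known from calibration, mapping a pixel from one sensor to the other reduces to applying those fixed parameters; the factory function and parameter names below are assumptions for the example:

```python
def make_calibrated_mapping(scale_x, scale_y, offset_x, offset_y):
    """Return a pixel mapping between two co-boresighted sensors whose
    relative scale and parallax offset are known from calibration."""
    def to_other(x, y):
        # Fixed, pre-calibrated transform: no image content is consulted.
        return x * scale_x + offset_x, y * scale_y + offset_y
    return to_other
```

Because the transform is fixed at calibration time, any uncontrolled movement (e.g. vibration) invalidates it, which is the weakness the description returns to below.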
Passive methods rely upon either tightly controlled interrelated physical aspects (calibrated fields of view for example) or shared features between disparate sensors (landmarks for example).
Both of these become increasingly difficult in real world applications due to uncontrolled movement (vibration affecting common physical calibration, for example), difficulty in common feature detection, and lack of shared features (especially between disparate band sensors - it would be very difficult to align a Terahertz imager with an Infrared imager using Feature Based methods, for example, even using multi-modal registration algorithms).
According to the invention there is provided a system for improving image data extraction from a target image comprising a plurality of sensors having a common field of view in which at least one sensor maps a series of control data points on to the target, the remaining sensor or sensors using the control data points to calculate a transform with which to map image data output by the first sensor to image data output by the remaining sensor or sensors to allow greater information to be gained about the target area.
According to a further aspect of the invention there is provided a method of improving image registration in a system having a plurality of sensors comprising the steps of superimposing a series of data control points on to a target area, extracting image data relating to the target using the sensors, and transforming the image data received from the sensors relating to the target area using the data control points superimposed on the target thereby improving the accuracy and detail of the image data received.
In this way, the introduction of artificial common registration control point data in to a sensor field of view which can then be shared by disparate sensors provides a common field of reference for the entire system. Phase sensitive arrays can also be used in order to allow orientation information to be passed by uniquely identifying each data point in each dataset. This provides a given number of registration markers, which can then be automatically identified and used to calculate the necessary transforms without the need for user intervention.
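As an illustrative sketch only (not part of the patent), the "automatically identified" step can be pictured as follows: each projected control point carries a unique identity, both sensors report the pixel coordinates at which they see each identity, shared identities yield correspondences, and a transform is fitted from them. Here the simplest transform (a least-squares translation) stands in for the full affine or projective fit; all names are assumptions for the example:

```python
def match_control_points(ids_a, ids_b):
    """Pair up pixel coordinates of control points seen by both sensors,
    keyed on the unique identity of each projected marker."""
    shared = sorted(set(ids_a) & set(ids_b))
    return [(ids_a[k], ids_b[k]) for k in shared]

def estimate_translation(pairs):
    """Least-squares translation mapping sensor A pixels onto sensor B
    pixels; a full affine or homography fit follows the same pattern
    with more parameters."""
    n = len(pairs)
    dx = sum(bx - ax for (ax, _ay), (bx, _by) in pairs) / n
    dy = sum(by - ay for (_ax, ay), (_bx, by) in pairs) / n
    return dx, dy
```

Because identities are unique, no user intervention or feature matching is needed, and markers visible to only one sensor are simply ignored.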
The invention will now be described with reference to the accompanying diagrammatic drawings in which:
Figure 1 shows a schematic drawing of the system of one form of the invention; and
Figure 2 shows a schematic drawing of another form of the invention showing
Figure 1 shows a first embodiment of the invention. In this embodiment an Unmanned Aerial Vehicle (UAV) with an integrated sensor and multiple laser designators gives a plan view of the target area and applies registration control points to the target area, which are then observed by the ground based vehicle. The ground based vehicle can then use the registration control points to calculate a transform with which to map the UAV image output to its own image output thereby allowing greater information to be gained about the target area, including visibility of occluded areas (e.g. directly behind a tower or walls).
It will be appreciated that both platforms described above could be static or on the move to allow greater flexibility. These techniques can also employ unambiguous registration control patterns to ensure target area orientation is also visible to all sensors - this is particularly useful for airborne sensor platforms.
In a second embodiment of the invention shown in Figure 2, an Unmanned Aerial Vehicle (UAV) with integrated laser designators applies a registration control pattern to the target area, which is then observed by the ground based multi-sensor platform. The system can then use the registration control points to calculate a transform with which to map all of the sensors to one another.
It will be appreciated that the laser designators could equally be co-located on the multi-sensor ground platform provided they are sufficiently visible by all of the relevant sensors.
Furthermore, the sensors can be co-located or on disparate platforms provided they have sufficient visibility of the registration control patterns.
This technique could be applied to provide extraction of 3 dimensional information about a target scene by allowing more accurate calculation of the system inter-sensor parallax, or the resolution of sub-pixel scene elements by the more accurate correlation of multiple sensors.
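The parallax point above can be illustrated with the classic two-sensor triangulation relation, depth = focal length × baseline / disparity (a standard result, not a formula stated in the patent); the function name and units are assumptions for the example:

```python
def depth_from_parallax(focal_px, baseline_m, disparity_px):
    """Depth of a scene point from two registered sensors:
    Z = f * B / d, with focal length f in pixels, baseline B in
    metres, and disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Since depth error grows with disparity error, the more accurate inter-sensor registration that the control points provide translates directly into more accurate 3D extraction.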
Furthermore, the technique could be applied to allow more accurate registration of multi-band sensors for image fusion applications (e.g. visual and thermal band cameras - the registration control patterns will be selected to be visible to all relevant sensors). It will also be appreciated that all of these techniques can be applied both statically and on the move. Moreover, these techniques can also employ unambiguous registration control patterns to ensure target area orientation is also visible to all sensors - this is particularly useful for airborne sensor platforms.
Previous registration methods are restricted by their reliance on information contained in the existing image space whereas the proposed method introduces additional information into that image space specifically designed to allow accurate registration of the image space from the perspective of all sensors in the system.
It will be appreciated that this invention can be applied to any computer vision system which uses image registration - be that temporal or spatial. This includes but is not limited to Medical, Topographical, Photographic and video image registration.

Claims

1. A system for improving image data extraction from a target image comprising a plurality of sensors having a common field of view in which at least one sensor maps a series of control data points on to the target, the remaining sensor or sensors using the control data points to calculate a transform with which to map image data output by the first sensor to image data output by the remaining sensor or sensors to allow greater information to be gained about the target area.
2. A system according to claim 1 in which the control data points comprise an unambiguous registration control pattern thereby ensuring the target area orientation is visible to all sensors in the system.
3. A system according to claim 1 or 2 in which at least one of the sensors is mounted on an aircraft and at least one of the sensors is ground based.
4. A system according to claim 3 in which the airborne sensor is mounted on a UAV.
5. A system according to any preceding claim in which the control data points comprise a registration control pattern applied to the target area by integrated laser designators, said control pattern being observable by a ground based multi-sensor platform.
6. A system according to claim 5 in which the registration control points are used to calculate a transform with which to map all of the sensors to one another.
7. A system according to claim 5 or 6 in which the laser designators are co-located on the multi-sensor ground platform such that the control pattern is visible to all of the relevant sensors.
8. A system according to any one of claims 5 to 7 in which the sensors can be co-located or on disparate platforms provided they have sufficient visibility of the registration control patterns.
9. A method of improving image registration in a system having a plurality of sensors comprising the steps of superimposing a series of data control points on to a target area, extracting image data relating to the target using the sensors, and transforming the image data received from the sensors relating to the target area using the data control points superimposed on the target thereby improving the accuracy and detail of the image data received.
10. A method of improving image registration according to claim 9 further comprising the step of utilising unambiguous registration control patterns to ensure target area orientation is also visible to all sensors.
11. A system or method as hereinbefore described with reference to the accompanying diagrammatic drawings.
EP11724137A 2010-05-14 2011-05-03 System and method for image registration Withdrawn EP2569756A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB1008104.0A GB201008104D0 (en) 2010-05-14 2010-05-14 System and method for image registration
PCT/EP2011/057014 WO2011141322A1 (en) 2010-05-14 2011-05-03 System and method for image registration

Publications (1)

Publication Number Publication Date
EP2569756A1 true EP2569756A1 (en) 2013-03-20

Family

ID=42334794

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11724137A Withdrawn EP2569756A1 (en) 2010-05-14 2011-05-03 System and method for image registration

Country Status (3)

Country Link
EP (1) EP2569756A1 (en)
GB (1) GB201008104D0 (en)
WO (1) WO2011141322A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108961150B (en) * 2018-04-11 2019-05-03 西安科技大学 Method of automatic deployment and control of image control points based on UAV images
CN109696675B (en) * 2018-12-27 2021-01-05 河海大学 InSAR time series image ensemble registration method based on Dijkstra algorithm

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020114509A1 (en) * 2000-09-12 2002-08-22 Hideto Takeuchi Three-dimensional data processing device, three-dimensional data processing method,and program providing medium
US20020122113A1 (en) * 1999-08-09 2002-09-05 Foote Jonathan T. Method and system for compensating for parallax in multiple camera systems
GB2390792A (en) * 2002-07-08 2004-01-14 Vision Rt Ltd Patient positioning system using images from two viewpoints
US20040151365A1 (en) * 2003-02-03 2004-08-05 An Chang Nelson Liang Multiframe correspondence estimation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020122113A1 (en) * 1999-08-09 2002-09-05 Foote Jonathan T. Method and system for compensating for parallax in multiple camera systems
US20020114509A1 (en) * 2000-09-12 2002-08-22 Hideto Takeuchi Three-dimensional data processing device, three-dimensional data processing method,and program providing medium
GB2390792A (en) * 2002-07-08 2004-01-14 Vision Rt Ltd Patient positioning system using images from two viewpoints
US20040151365A1 (en) * 2003-02-03 2004-08-05 An Chang Nelson Liang Multiframe correspondence estimation

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
FUMIKI HOSOI ET AL: "Estimation and Error Analysis of Woody Canopy Leaf Area Density Profiles Using 3-D Airborne and Ground-Based Scanning Lidar Remote-Sensing Techniques", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 48, no. 5, 1 May 2010 (2010-05-01), pages 2215 - 2223, XP011302705, ISSN: 0196-2892 *
LORI A MAGRUDER ET AL: "ICESat Geolocation Validation Using Airborne Photography", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 48, no. 6, 1 June 2010 (2010-06-01), pages 2758 - 2766, XP011306578, ISSN: 0196-2892 *
MADHAVAN R ET AL: "Temporal Range Registration for Unmanned Ground and Aerial Vehicles", JOURNAL OF INTELLIGENT AND ROBOTIC SYSTEMS ; THEORY AND APPLICATIONS - (INCORPORATING MECHATRONIC SYSTEMS ENGINEERING), KLUWER ACADEMIC PUBLISHERS, DO, vol. 44, no. 1, 1 September 2005 (2005-09-01), pages 47 - 69, XP019248525, ISSN: 1573-0409 *
MADHAVAN R ET AL: "Temporal range registration for unmanned ground and aerial vehicles", ROBOTICS AND AUTOMATION, 2004. PROCEEDINGS. ICRA '04. 2004 IEEE INTERN ATIONAL CONFERENCE ON NEW ORLEANS, LA, USA APRIL 26-MAY 1, 2004, PISCATAWAY, NJ, USA,IEEE, US, vol. 3, 26 April 2004 (2004-04-26), pages 3180 - 3188, XP010768594, ISBN: 978-0-7803-8232-9, DOI: 10.1109/ROBOT.2004.1307552 *
See also references of WO2011141322A1 *

Also Published As

Publication number Publication date
GB201008104D0 (en) 2010-06-30
WO2011141322A1 (en) 2011-11-17

Similar Documents

Publication Publication Date Title
Panahandeh et al. Vision-aided inertial navigation based on ground plane feature detection
US8059887B2 (en) System and method for providing mobile range sensing
CN113847930A (en) Multi-sensor calibration system
KR102001659B1 (en) Method and apparatus for providing camera calibration for vehicles
WO2018081348A1 (en) Vision-inertial navigation with variable contrast tracking residual
US20040179107A1 (en) Video augmented orientation sensor
EP2169589A2 (en) Image processing device for vehicle and corresponding method and computer program product
EP3929872A1 (en) Multi-sensor calibration system
Yahyanejad et al. Incremental mosaicking of images from autonomous, small-scale uavs
WO2013069012A1 (en) Method and system for determining position and/or orientation
KR20200038140A (en) Apparatus and method for updating high definition map
US20160169662A1 (en) Location-based facility management system using mobile device
JP2023004964A (en) Sensor calibration method and device
Stow et al. Evaluation of geometric elements of repeat station imaging and registration
Kharchenko et al. Correlation-extreme visual navigation of unmanned aircraft systems based on speed-up robust features
EP2569756A1 (en) System and method for image registration
EP2776786B1 (en) Method and system for determining a relation between a first scene and a second scene
Mazzei et al. A lasers and cameras calibration procedure for VIAC multi-sensorized vehicles
Araar et al. Towards low-cost indoor localisation using a multi-camera system
JP6448427B2 (en) Facility name superimposing device
Wiesmann et al. Joint Intrinsic and Extrinsic Calibration of Perception Systems Utilizing a Calibration Environment
Hrabar et al. PTZ camera pose estimation by tracking a 3D target
CN111638500A (en) Calibration method for a measuring device and measuring device
Toriya et al. A mobile camera localization method using aerial-view images
Pritt et al. Georegistration of multiple-camera wide area motion imagery

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20121114

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SELEX ES LTD

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20160412

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: LEONARDO MW LTD

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20190906