
CN113450414B - Camera calibration method, equipment, system and storage medium


Info

Publication number
CN113450414B
Authority
CN
China
Prior art keywords
robot
coordinate system
image
positioning
coordinates
Prior art date
Legal status
Active
Application number
CN202010214083.XA
Other languages
Chinese (zh)
Other versions
CN113450414A (en)
Inventor
朱凯
张友群
彭忠东
冯雪涛
井连杰
Current Assignee
Zhejiang Shenxiang Intelligent Technology Co ltd
Original Assignee
Zhejiang Shenxiang Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Shenxiang Intelligent Technology Co ltd
Priority to CN202010214083.XA
Publication of CN113450414A
Application granted
Publication of CN113450414B


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the application provide a camera calibration method, device, system, and storage medium. The system comprises a camera to be calibrated, a robot, and a server communicatively connected to both the camera and the robot. The robot performs autonomous positioning according to a robot coordinate system while moving, and provides the server with its positioning coordinates at a target position. The camera shoots the robot and provides the server with a first image of the robot at the target position. The server converts the positioning coordinates into scene coordinates based on the mapping relationship between the robot coordinate system and the scene coordinate system, determines the image coordinates of the target position in the first image according to the image coordinate system, and determines calibration parameters of the camera from the scene coordinates and image coordinates of the target position so as to calibrate the camera. The embodiments of the application thus enable automatic camera calibration and improve both the efficiency and the precision of camera calibration.

Description

Camera calibration method, equipment, system and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method, an apparatus, a system, and a storage medium for calibrating a camera.
Background
Camera calibration is a key basis for implementing machine vision applications. The accuracy of camera calibration affects the accuracy of machine vision applications.
At present, cameras are usually calibrated manually, which requires a large amount of manual measurement, debugging, and similar work by a calibration technician; as a result, calibration efficiency is low and calibration precision is not ideal.
Disclosure of Invention
Aspects of the application provide a camera calibration method, device, system and storage medium, which are used for realizing automatic calibration of a camera and improving the calibration efficiency and precision.
The embodiment of the application provides a camera calibration method, which comprises the following steps:
acquiring positioning coordinates when a robot moves to a target position, wherein the positioning coordinates are generated by the robot performing autonomous positioning according to a robot coordinate system;
converting the positioning coordinates into scene coordinates based on a mapping relation between the robot coordinate system and a scene coordinate system;
determining, according to an image coordinate system, image coordinates of the target position in a first image of the robot at the target position shot by a camera;
and determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
An embodiment of the application also provides a camera calibration method applicable to a robot, comprising the following steps:
receiving a navigation instruction sent by a control terminal;
moving within the scene where the camera is located according to the navigation instruction;
performing autonomous positioning according to the robot's own coordinate system during movement, so as to generate positioning data;
and providing the positioning data to a server, so that the server can determine calibration parameters of the camera according to the positioning data and calibrate the camera.
The application also provides a camera calibration system, comprising: the system comprises a camera to be calibrated, a robot and a server, wherein the server is respectively in communication connection with the camera and the robot;
the robot is used for carrying out autonomous positioning according to a robot coordinate system in the moving process and providing the positioning coordinates of the robot at the target position to the server;
the camera is used for shooting the robot and providing a first image of the robot at the target position to the server;
The server is used for converting the positioning coordinates into scene coordinates based on the mapping relation between the robot coordinate system and the scene coordinate system; determining image coordinates of the target position in the first image according to an image coordinate system; and determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position so as to calibrate the camera.
The embodiment of the application also provides a computing device which comprises a memory, a processor and a communication component;
the memory is used for storing one or more computer instructions;
The processor is coupled with the memory and the communication component for executing the one or more computer instructions for:
acquiring, through the communication component, positioning coordinates of a robot when it moves to a target position, wherein the positioning coordinates are generated by the robot performing autonomous positioning according to a robot coordinate system;
converting the positioning coordinates into scene coordinates based on a mapping relation between the robot coordinate system and a scene coordinate system;
determining, according to an image coordinate system, image coordinates of the target position in a first image of the robot at the target position shot by a camera;
and determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
The embodiment of the application also provides a robot, which comprises: a memory, a processor, and a communication component;
the memory is used for storing one or more computer instructions;
The processor is coupled with the memory and the communication component for executing the one or more computer instructions for:
receiving, through the communication component, a navigation instruction sent by a control terminal;
moving within the scene where the camera is located according to the navigation instruction;
performing autonomous positioning according to the robot's own coordinate system during movement, so as to generate positioning data;
and providing the positioning data to a server through the communication component, so that the server can calculate calibration parameters of the camera according to the positioning data and calibrate the camera.
Embodiments of the present application also provide a computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform the aforementioned camera calibration method.
In the embodiment of the application, an innovative camera calibration scheme is provided, a robot is used as a calibration object in a field, and the target position for camera calibration can be rapidly and conveniently determined by controlling the robot to move in the field; the robot can be utilized to perform autonomous positioning, so that scene coordinates of the target position in the field can be accurately determined; and the calibration parameters of the camera can be determined according to the image coordinates and scene coordinates of the robot in the image shot by the camera, so that the calibration of the camera is realized. Therefore, in the embodiment of the application, the automatic calibration of the camera can be realized, and the efficiency and the precision of the camera calibration are improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1a is a schematic diagram of a camera calibration system according to an exemplary embodiment of the present application;
FIG. 1b is a schematic diagram of a camera calibration system according to an exemplary embodiment of the present application;
fig. 2 is a schematic structural view of a robot according to an exemplary embodiment of the present application;
FIG. 3 is a schematic logic diagram for establishing an image coordinate mapping relationship between the top surface and the bottom surface of a robot at a target position according to an exemplary embodiment of the present application;
FIG. 4 is a flow chart of a camera calibration method according to another exemplary embodiment of the present application;
FIG. 5 is a flow chart of another camera calibration method according to yet another exemplary embodiment of the present application;
FIG. 6 is a schematic diagram of a computing device according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of a robot according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
At present, camera calibration is usually carried out manually, which results both in low calibration efficiency and in unsatisfactory calibration precision. To address these technical problems, some embodiments of the application provide an innovative camera calibration scheme: a robot is used as the calibration object in the field, and the target positions for camera calibration can be determined quickly and conveniently by controlling the robot to move in the field; the robot's autonomous positioning can be used to accurately determine the scene coordinates of each target position in the field; and the calibration parameters of the camera can then be determined from the image coordinates of the robot in images shot by the camera together with the scene coordinates, thereby calibrating the camera. In this way, embodiments of the application achieve automatic camera calibration and improve both the efficiency and the precision of the calibration.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
Fig. 1a is a schematic structural diagram of a camera calibration system according to an exemplary embodiment of the present application. Fig. 1b is a schematic diagram of a camera calibration system according to an exemplary embodiment of the present application. Referring to fig. 1a and 1b, the camera calibration system comprises a camera 10 to be calibrated, a robot 20 and a server 30, the server 30 being communicatively connected to the camera 10 and the robot 20, respectively.
The camera calibration system provided by the embodiment can be applied to various application scenes needing camera calibration, for example, camera calibration can be performed in various indoor places such as business, home or enterprises. The present embodiment is not limited to the application scenario.
As shown in fig. 1a, one or more cameras 10 may be included in the scene, and calibration of any camera 10 in the scene may be achieved in this embodiment. In practical applications, calibration operations for different cameras 10 may be independent of each other. For convenience of description, hereinafter, a description will be made of a technical solution with one camera 10 in a scene as a calibration object, but it should be understood that the camera calibration solution provided in this embodiment may be applied to any one camera 10 in a scene.
In this embodiment, the robot 20 is located in the scene where the camera 10 is located and is movable within that scene. Here, a robot refers to a device capable of automatically executing work: it can both accept human commands and run pre-programmed programs.
In this embodiment, the robot 20 may be a ground robot, such as a device that moves on the ground by means of wheels or mechanical legs. The robot 20 may also be a non-ground robot such as a drone, a device with a floating mobile unit, or other device capable of moving off the ground.
During movement, robot 20 may perform autonomous positioning to generate positioning coordinates of any of the track points in its movement track. The robot 20 may provide the server 30 with positioning data generated during the movement.
Wherein the robot 20 may establish its own coordinate system, which is described herein as the robot coordinate system. The robot coordinate system is a coordinate system dedicated to autonomous positioning of the robot 20, and is independent of a scene coordinate system and an image coordinate system mentioned later. In practical applications, the robot coordinate system may have the motion start point of the robot 20 as the origin, which is not limited in this embodiment.
Based on this, the robot 20 can perform autonomous positioning in accordance with the robot coordinate system during movement. Since the robot 20 performs autonomous positioning based on an actual moving process, the generated positioning data can accurately reflect the actual position of the robot 20.
In this embodiment, the robot 20 may perform autonomous positioning using SLAM (Simultaneous Localization and Mapping) technology. Of course, the present embodiment is not limited thereto, and the robot 20 may also perform autonomous positioning using other positioning techniques.
The camera 10 may photograph the robot 20 as it moves within the field. In general, the shooting field of view of the camera 10 is limited; in this embodiment, whether the robot 20 is within the shooting field of view of the camera 10 may be detected by techniques such as object detection, and the camera 10 may be controlled to shoot only when the robot 20 is detected within that field of view. Of course, this is not essential: the camera 10 may also shoot continuously, which is not limited in this embodiment.
The camera 10 may provide the photographed image to the server 30.
In this embodiment, part or all of the track points in the movement track of the robot 20 may be used as the target positions, where the target positions refer to track points participating in calculation of the calibration parameters, and the number of the target positions may be one or more.
For the robot 20, at least the positioning coordinates at the target position may be provided to the server 30.
For the camera 10, at least a first image taken when the robot 20 is located at the target position may be provided to the server 30.
Of course, the robot 20 may also provide the positioning coordinates of other track points to the server 30, and the camera 10 may also provide other images to the server 30, which is not limited in this embodiment.
Accordingly, for the target location, the server 30 may obtain at least two aspects of data: the positioning coordinates provided by the robot 20 and the image provided by the camera 10.
For the server 30, the positioning coordinates may be converted into scene coordinates based on a mapping relationship between the robot coordinate system and the scene coordinate system; determining image coordinates of the target position in the first image according to the image coordinate system; calibration parameters of the camera 10 are determined according to scene coordinates and image coordinates of the target position to calibrate the camera 10.
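By way of a non-limiting illustration of this server-side flow, the sketch below estimates a ground-plane homography between the scene coordinate system and the image coordinate system from paired target positions. The use of OpenCV, the `findHomography` call, and the 2x3 affine form of the robot-to-scene mapping are assumptions made for this example only; the disclosure does not prescribe any particular library or solver.

```python
# Illustrative sketch only: estimate the scene<->image mapping (here a
# ground-plane homography) from target positions known in both systems.
# OpenCV and the 2x3 affine robot->scene matrix are assumptions.
import numpy as np
import cv2


def calibrate_camera_plane(positioning_xy, image_xy, robot_to_scene):
    """positioning_xy: (N, 2) robot-coordinate positions of the target points.
    image_xy:        (N, 2) image coordinates of the same target points.
    robot_to_scene:  2x3 affine matrix mapping robot coords to scene coords.
    Returns the 3x3 homography H mapping scene coordinates to image coordinates.
    """
    pts = np.asarray(positioning_xy, dtype=np.float64)
    M = np.asarray(robot_to_scene, dtype=np.float64)
    # Step 1: convert positioning coordinates into scene coordinates.
    scene_xy = pts @ M[:, :2].T + M[:, 2]
    # Step 2: fit the scene->image homography from the correspondences;
    # at least 4 non-collinear target positions are needed.
    H, _mask = cv2.findHomography(scene_xy,
                                  np.asarray(image_xy, dtype=np.float64),
                                  method=cv2.RANSAC)
    return H
```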
In a physical implementation, the server 30 may be a conventional server, a cloud host, a virtual center, and other server devices, where the server devices mainly include a processor, a hard disk, a memory, a system bus, and the like, and are similar to a general computer architecture.
In this embodiment, the calibration parameters of the camera 10 may include a mapping relationship between a scene coordinate system and an image coordinate system.
Wherein the image coordinate system is the basis for locating in the image taken by the camera 10, the image coordinates being used to characterize the position in the image.
The scene coordinate system refers to a coordinate system corresponding to a scene in which the camera 10 is located. The scene coordinate system is the basis for locating the scene map, and the scene coordinates are used for representing the position in the scene map. The scene map may be a plan view or a three-dimensional view of the scene, or the like. In practice, scene maps are generally known, and accordingly, scene coordinate systems are also known.
Based on this, the server 30 can determine the image coordinates of the robot 20 in the first image as the image coordinates of the target position according to the image coordinate system.
The server 30 may also convert the positioning coordinates of the target position into scene coordinates according to the association relationship between the preset scene coordinate system and the robot coordinate system, thereby obtaining the scene coordinates of the target position.
In this way, the server 30 can determine the mapping relationship between the image coordinate system and the scene coordinate system according to the image coordinate and the scene coordinate of the target position.
In this process, the server 30 may determine a mapping relationship between the image coordinate system and the robot coordinate system based on the image coordinate and the positioning coordinate of the target position. On the basis, the robot coordinate system can be used as an intermediate coordinate system, and the mapping relation between the image coordinate system and the scene coordinate system can be determined according to the mapping relation between the scene coordinate system and the robot coordinate system and the mapping relation between the image coordinate system and the robot coordinate system.
Determining the mapping relationship between two coordinate systems is essentially the problem of computing that mapping given the coordinates of at least one position expressed in both systems. Several standard solutions to this problem exist and are not described in detail here; a sketch of one is given below.
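As one such scheme, hedged as an illustration only: if the two coordinate systems are planar and related by rotation, translation, and uniform scale, the mapping can be fit in least squares from a handful of shared points. The OpenCV call `estimateAffinePartial2D` is an assumed convenience, not part of the disclosure.

```python
# Illustrative sketch only: fit a planar similarity transform (rotation,
# translation, uniform scale) between two coordinate systems from points
# whose coordinates are known in both, e.g. image vs. robot coordinates.
import numpy as np
import cv2


def fit_coordinate_mapping(points_in_a, points_in_b):
    """Return a 2x3 matrix M with [x_b, y_b]^T ~= M @ [x_a, y_a, 1]^T,
    fit in least squares from at least two point pairs."""
    a = np.asarray(points_in_a, dtype=np.float64)
    b = np.asarray(points_in_b, dtype=np.float64)
    M, _inliers = cv2.estimateAffinePartial2D(a, b)
    return M
```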
It should be noted that in this embodiment, the calibration parameters may include, in addition to the mapping relationship between the image coordinate system and the scene coordinate system, other parameters such as the focal length of the camera 10, and the calibration manners of the other parameters are not described in detail herein.
In this embodiment, an innovative camera calibration scheme is provided, and the robot 20 is used as a calibration object in the field, so that the target position for camera calibration can be quickly and conveniently determined by controlling the robot 20 to move in the field; autonomous positioning can be performed by using the robot 20, which can accurately determine scene coordinates of the target position in the field; furthermore, the calibration parameters of the camera 10 can be determined according to the image coordinates and scene coordinates of the robot 20 in the image shot by the camera 10, so as to realize the calibration of the camera 10. Accordingly, in the embodiment of the application, the automatic calibration of the camera 10 can be realized, and the efficiency and the precision of the camera calibration are improved.
In the above or the following embodiments, before camera calibration is performed, the robot 20 may perform environmental scanning on the scene where the camera 10 is located, so as to obtain environmental data; and generating positioning data according to a robot coordinate system in the scanning process, and generating a first map according to the environment data and the positioning data.
Wherein the environment scanning process may be implemented based on radar or the like components mounted on the robot 20. Through the environmental scan, the locations of various environmental elements within the field, including but not limited to walls, columns, facilities, etc., within the field may be determined in the first map.
Accordingly, the first map can accurately describe the actual environment in the scene in which the camera 10 is located.
In practice, the layout of facilities within a scene, etc. may change from time to time, and the scene map may not be updated synchronously, which may cause the scene map to be inconsistent with the actual environment in the scene. While inaccurate scene maps may lead to erroneous results from machine vision applications.
In this embodiment, the scene map may be corrected by using the first map.
The server 30 may determine the scene coordinates of at least one environmental element based on the mapping relationship between the scene coordinate system and the robot coordinate system according to the positioning coordinates of the at least one environmental element in the first map; at the scene coordinates of at least one environmental element, the corresponding environmental elements are rendered separately to modify the scene map.
For example, several shelves are added to the scene, but the scene map is not marked, so the scene map is not accurate any more. In this embodiment, the first map provided by the robot 20 includes the positioning coordinates of the plurality of shelves, and the server 30 may determine the scene coordinates of the plurality of shelves according to the mapping relationship between the scene coordinate system and the robot coordinate system, and render the plurality of shelves in the scene map according to the determined scene coordinates, so as to implement the correction of the scene map.
Accordingly, the corrected scene map can more accurately reflect the actual environment in the scene where the camera 10 is located, thereby improving the accuracy of machine vision application.
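A minimal sketch of this correction step follows, assuming the scene map is rasterized so that scene coordinates can be used directly as pixel positions and that the robot-to-scene mapping takes a 2x3 affine form; the circle marker standing in for a rendered shelf is likewise illustrative.

```python
# Illustrative sketch only: map newly observed environment elements (e.g.
# shelves) from robot coordinates into scene coordinates and draw them on
# the scene map, assumed here to be a raster image in scene-pixel units.
import numpy as np
import cv2


def render_elements_on_map(scene_map, element_positions_robot, robot_to_scene):
    M = np.asarray(robot_to_scene, dtype=np.float64)  # assumed 2x3 affine
    pts = np.asarray(element_positions_robot, dtype=np.float64)
    scene_xy = pts @ M[:, :2].T + M[:, 2]
    for x, y in scene_xy:
        # A filled circle stands in for the rendered environment element.
        cv2.circle(scene_map, (int(round(x)), int(round(y))), 6, (0, 0, 255), -1)
    return scene_map
```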
In addition, in this embodiment, the server 30 may determine the mapping relationship between the scene coordinate system and the robot coordinate system at least by adopting the following implementation manner:
Determining the positioning coordinates of at least one environmental element in the scene in a robot coordinate system according to the positioning data and the environmental data generated by the robot 20;
Acquiring scene coordinates of at least one environmental element in a scene coordinate system;
and determining the mapping relation between the robot coordinate system and the scene coordinate system according to the positioning coordinate and the scene coordinate of at least one environment element.
Wherein the server 30 may select at least one fixed-location environmental element in the scene from among the environmental elements in the scene. Such as walls, columns, etc. in a scene. And a mapping relation between the robot coordinate system and the scene coordinate system can be established based on the positioning coordinates and the scene coordinates of the environment elements.
Of course, in the present embodiment, the implementation manner of establishing the mapping relationship between the robot coordinate system and the scene coordinate system is not limited thereto, and is not exhaustive herein.
In the above or the following embodiments, the camera calibration system may further include a control terminal 40, where the control terminal 40 is communicatively connected to the robot 20 and may provide navigation services for the robot 20 as it moves within the field.
The robot 20 may provide the first map to the control terminal 40, and the control terminal 40 may issue navigation instructions for the robot 20 based on the first map to control its movement in the scene.
In this embodiment, a plurality of implementations may be used to determine the target position during movement of the robot 20.
In one implementation, one or more target locations in the field of view of camera 10 may be specified by a technician.
In this implementation, based on the first map, the technician may select one or more target locations in the first map. The selected target position is a position capable of comprehensively representing the mapping relation between the image coordinate system and the scene coordinate system based on experience of a technician.
The control terminal 40 may generate navigation instructions to control the movement of the robot 20 to one or more target locations in response to the location selection operation.
In the case where there are a plurality of target positions, the control terminal 40 may control the robot 20 to sequentially move to the plurality of target positions.
For its part, the robot 20 performs autonomous positioning when moving to each target position and provides the positioning coordinates of the target position to the server 30.
Accordingly, in this implementation, the navigation instruction sent by the control terminal 40 includes the target positions, the robot 20 may move to one or more manually selected target positions under the control of the control terminal 40, and the positioning coordinates of the selected one or more target positions may be used as a data base for camera calibration.
In another implementation, the control terminal 40 may issue a cruise command based on the first map, or the technician may select a number of waypoints in the first map that are distributed within the field of view of the camera 10, and the control terminal 40 may issue a navigation command based on the number of waypoints. Under both schemes, the control terminal 40 will control the robot 20 to move densely in the field, thereby generating a large number of movement trajectories.
In this implementation, the control terminal 40 does not designate the target position any more, and the robot 20 can make intensive movements within the shooting field of view of the camera 10 and provide the positioning coordinates of at least one track point in the movement track to the server 30.
For the server 30, at least one track point meeting the marking requirement can be selected from the moving tracks of the robot 20 as the target position.
In practical applications, to ensure that the selected target positions are reasonable, the server 30 may determine whether the movement track of the robot 20 covers the shooting field of view of the camera 10 to a preset standard, and select at least one track point meeting the marking requirement as a target position only when that standard is met.
The preset standard of the coverage degree may be greater than 80%, which is not limited in this embodiment, and the preset standard may be adjusted according to actual requirements.
The marking requirement may be that the distance from the field boundary of the camera 10 meets a preset requirement. For example, the distance from the field boundary of the camera 10 is less than 5 (the unit is not limited here). The present embodiment is not limited thereto, and the preset requirement can be adjusted according to actual requirements.
Accordingly, in this implementation, the navigation instruction sent by the control terminal 40 will not include the target positions any more, the robot 20 will perform intensive movement, but the server 30 automatically selects one or more target positions from the movement track of the robot 20, and the positioning coordinates of the selected one or more target positions may be used as the data base for calibrating the camera.
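The selection logic of this implementation might look like the following sketch. The grid-based coverage measure, the 80% standard, and the boundary margin of 5 follow the examples above but are otherwise assumptions; a real system could measure coverage and the marking requirement differently.

```python
# Illustrative sketch only: check the track's coverage of the shooting field
# of view against a preset standard, then keep track points that satisfy the
# marking requirement (distance to the view boundary below a preset value).
import numpy as np


def select_target_positions(track_xy, fov_bounds, boundary_margin=5.0,
                            coverage_standard=0.8, grid=10):
    """track_xy: (N, 2) track-point coordinates; fov_bounds: (xmin, ymin,
    xmax, ymax) of the camera's shooting field of view (rectangle assumed)."""
    xmin, ymin, xmax, ymax = fov_bounds
    pts = np.asarray(track_xy, dtype=np.float64)
    # Coverage degree: fraction of grid cells that contain a track point.
    ix = np.clip(((pts[:, 0] - xmin) / (xmax - xmin) * grid).astype(int),
                 0, grid - 1)
    iy = np.clip(((pts[:, 1] - ymin) / (ymax - ymin) * grid).astype(int),
                 0, grid - 1)
    coverage = len(set(zip(ix.tolist(), iy.tolist()))) / float(grid * grid)
    if coverage < coverage_standard:
        return np.empty((0, 2))  # coverage does not meet the preset standard
    # Marking requirement: distance to the nearest view boundary below margin.
    d = np.minimum.reduce([pts[:, 0] - xmin, xmax - pts[:, 0],
                           pts[:, 1] - ymin, ymax - pts[:, 1]])
    return pts[d < boundary_margin]
```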
Of course, in addition to the above two implementations, other implementations may be used to determine the target position in this embodiment, and the embodiment is not limited thereto.
In addition, in the present embodiment, the control terminal 40 may be further connected to the camera 10 in a communication manner, and the control terminal 40 may display an image captured by the camera 10. In practice, the camera 10 may provide a video stream to the control terminal 40 to present the movement of the robot 20 within the shooting field of view of the camera 10.
Accordingly, a technician can observe the movement of the robot 20 within the photographing view of the camera 10 through the control terminal 40, and can adjust the movement scheme of the robot 20 in the control terminal 40 in case that the movement trajectory of the robot 20 is found to be inconsistent with the desired effect.
For example, if the robot 20 moves according to the target position selected by the technician, the technician may reselect the target position if the actual position of the robot 20 within the field of view of the camera 10 is found to be not in accordance with the expected effect when the robot moves to the target position.
For another example, if the robot 20 moves according to the cruise instruction or according to a plurality of route points specified by the technician, and it is found that the robot 20 exits the shooting field of view when the coverage of the movement trajectory to the shooting field of view of the camera 10 does not reach the preset standard, the technician may newly add a route point in the control terminal 40 or adjust the movement direction of the robot 20 in the cruise mode.
The control terminal 40 may generate a navigation instruction of the robot 20 in response to the adjustment operation to adjust the movement of the robot 20.
Accordingly, in this embodiment, the control of the moving process of the robot 20 may be implemented based on the control terminal 40, which makes the calibration process of the camera 10 more intelligent, and on this basis, the target positions participating in the calibration and calculation may be selected more reasonably and more conveniently.
In the above or below embodiments, the image coordinates of the target location may be determined in a variety of ways.
In one implementation, robot 20 may position its bottom center point when moving to the target location to obtain the positioning coordinates of its bottom center point as the positioning coordinates of the target location.
Based on this, for the server 30, the image coordinates corresponding to the top center point of the robot 20 may be determined in the first image according to the image coordinate system; according to the image coordinate mapping relationship between the top surface and the bottom surface of the robot 20 at the target position, the image coordinate corresponding to the top surface center point is converted into the image coordinate corresponding to the bottom surface center point, as the image coordinate of the target position.
In an image of the robot 20 shot by the camera 10, the bottom-surface center point of the robot 20 is occluded; the server 30 can therefore convert the visible image coordinates of the top-surface center point into the image coordinates of the bottom-surface center point, and use the latter as the image coordinates of the target position.
In this implementation, robot 20 may be configured to assist server 30 in determining the image coordinates of the center point of the floor.
Fig. 2 is a schematic structural diagram of a robot 20 according to an exemplary embodiment of the present application.
Referring to fig. 2, a plurality of pairs of symmetrical marker points may be provided on the top and bottom surfaces of the robot 20. For a pair of marker points, one form of symmetry may be: the marker point on the top surface and the marker point on the bottom surface lie on the same straight line, which is perpendicular to the ground plane of the scene.
Wherein the marker may be an LED lamp or an icon or the like. Further, different color, shape, etc. attributes may be configured for different pairs of mark points to distinguish between the different pairs of mark points. For example, different marking points may use different LED colors, different marking points may use different patterns, and the like, which is not limited in this embodiment.
In addition, in practical applications, the layout of the marker points may be determined so that at least two pairs of marker points can be shot by the camera 10 simultaneously. For example, in fig. 2, 4 LED lamps or similar markers are provided on each of the top surface and the bottom surface, uniformly distributed around the periphery of that surface.
Based on this, at least two pairs of marker points will be included in the image including the robot 20 captured by the camera 10.
For the server 30, the camera 10 may be used to capture a second image of the robot 20 at a reference position before moving to the target position; identifying at least two pairs of mark points from the top and bottom surfaces of the robot 20 in the second image, and determining first image coordinates of the at least two pairs of mark points; determining second image coordinates of at least two pairs of mark points in the first image; an image coordinate mapping relationship between the top and bottom surfaces of the robot 20 at the target position is established based on the first image coordinates and the second image coordinates of at least two pairs of mark points.
Based on this, the server 30 may determine the image coordinate mapping relationship between any two symmetrical points on the top and bottom surfaces of the robot 20 from the image coordinate mapping relationship between the top and bottom surfaces at the target position. Accordingly, the server 30 can determine the image coordinates of the bottom-surface center point of the robot 20 from the image coordinates of its top-surface center point.
Alternatively, the server 30 may construct the first polygon based on at least two marker points located on the top surface of at least two pairs of marker points; constructing a second polygon according to at least two marking points positioned on the bottom surface of at least two pairs of marking points; an image coordinate mapping relationship between the first polygon and the second polygon is established as an image coordinate mapping relationship between the top surface and the bottom surface of the robot 20 at the target position, based on the image coordinates of each of the mark points.
The process can also be regarded as a process of calculating the mapping relationship between the two coordinate systems given the coordinates of at least one point in the two coordinate systems, and the specific calculation process will not be described in detail.
Fig. 3 is a schematic diagram of a logic diagram for establishing an image coordinate mapping relationship between a top surface and a bottom surface of a robot 20 at a target position according to an exemplary embodiment of the present application.
As shown in fig. 3, two pairs of marker points [B, B′] and [C, C′] may appear in the first image as [B1, B1′] and [C1, C1′], and in the second image as [B2, B2′] and [C2, C2′].
The server 30 may construct a first quadrilateral from the marker points B1, C1, B2 and C2, and a second quadrilateral from the marker points B1′, C1′, B2′ and C2′.
The image coordinate mapping relationship between the two quadrilaterals can then be calculated from the image coordinates of the marker points and used as the image coordinate mapping relationship between the top surface and the bottom surface of the robot 20 at the target position.
Further, based on this mapping relationship between the two quadrilaterals, the image coordinates of the bottom-surface center point P′ of the robot 20 can be calculated from the image coordinates of its top-surface center point P in the first image.
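A sketch of this top-to-bottom conversion, assuming OpenCV: the four top-surface marker points (e.g. B1, C1, B2, C2) and their bottom-surface counterparts define an exact four-point homography, through which the top-surface center point P is mapped to the occluded bottom-surface center point P′.

```python
# Illustrative sketch only: the four-point homography between the top- and
# bottom-surface marker quadrilaterals, applied to the top-surface centre P.
import numpy as np
import cv2


def bottom_center_from_top(top_quad, bottom_quad, top_center):
    """top_quad / bottom_quad: four matching image points each, e.g.
    (B1, C1, B2, C2) and (B1', C1', B2', C2'); top_center: image coords of P.
    Returns the image coordinates of the occluded bottom-surface centre P'."""
    src = np.asarray(top_quad, dtype=np.float32)
    dst = np.asarray(bottom_quad, dtype=np.float32)
    H = cv2.getPerspectiveTransform(src, dst)  # exact 4-point homography
    p = np.asarray(top_center, dtype=np.float32).reshape(1, 1, 2)
    return cv2.perspectiveTransform(p, H).reshape(2)
```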
Of course, in this implementation, other structural designs may be performed on the robot 20 to assist the server 30 in determining the image coordinate mapping relationship between the top surface and the bottom surface of the robot 20 at the target position, which is not limited herein.
In another implementation, the robot 20 may be coupled to a marker, and the robot 20 may position the marker when moving to the target position, and take the obtained positioning coordinates of the marker as the positioning coordinates of the target position.
In this implementation, the connection between the robot 20 and the marker is not limited, and any hard connection between the robot 20 and the marker may be used, where hard connection means that the relative position between the two remains unchanged.
In practical application, the marker may be shot by the camera 10 as a target in the moving process of the robot 20, and parameters such as a distance between the marker and the robot 20 may be determined. In addition, the height difference between the marker and the bottom surface of the robot 20 is less than a preset threshold, i.e., the marker may be located in or near the plane of the bottom surface of the robot 20. Preferably, the marker may be located in a plane in which the bottom surface of the robot 20 lies.
Based on this, the server 30 can determine, in the first image, the image coordinates corresponding to the marker as the image coordinates of the target position in accordance with the image coordinate system.
Of course, other implementations may be used to determine the image coordinates of the target position in this embodiment, and this embodiment is not limited to the above two implementations.
Accordingly, in this embodiment, the image coordinates of the target position can be accurately determined, and the image coordinates are not affected by the three-dimensional structure of the robot 20, which can effectively improve the accuracy of camera calibration.
In the above or in the following embodiments, an information carrier for showing time information may be provided on the robot 20.
On this basis, the robot 20 can display time information when moving to the target position through the information carrier; and may correlate the time information with the location coordinates of the target location and provide it to the server 30.
That is, when the robot 20 moves to the target position, the positioning coordinates of the target position may be provided to the server 30, and time information when the robot 20 moves to the target position may be provided to the server 30 in synchronization.
The time information displayed by the information carrier changes dynamically, following natural time. Moreover, for any given track point, the time information that the robot 20 provides to the server 30 is consistent with the time information presented by the information carrier at that moment.
For the server 30, time information associated with the positioning coordinates of the target position may be obtained from the robot 20. Based on this, the server 30 can find, as the first image, a target image in which the time information presented by the information carrier matches the time information associated with the positioning coordinates, from the image containing the robot 20 captured by the camera 10.
The server 30 may recognize the displayed time information from images of the robot 20 shot by the camera 10 using OCR (Optical Character Recognition) or similar techniques.
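A hedged sketch of this time-dimension matching follows; the `read_timestamp` helper stands in for whatever OCR engine extracts the displayed time and is hypothetical, as is the numeric-timestamp comparison.

```python
# Illustrative sketch only: match the timestamp shown on the robot's display
# (read by OCR) against the timestamp associated with the positioning
# coordinates. `read_timestamp` is a hypothetical OCR helper.
def find_first_image(candidate_frames, target_time, read_timestamp,
                     tolerance=0.0):
    for frame in candidate_frames:
        shown = read_timestamp(frame)  # numeric timestamp or None
        if shown is not None and abs(shown - target_time) <= tolerance:
            return frame  # this frame serves as the first image
    return None
```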
In practical applications, the information carrier may employ a flexible display, and the number of flexible displays may be plural, and the plural flexible displays are wrapped around the side wall of the robot 20.
Referring to fig. 2, a plurality of flexible display screens are wrapped around the side wall of the robot 20, which ensures that the time information presented in at least one of the flexible display screens is completely captured by the camera 10, thereby ensuring that the first image contains complete time information.
Accordingly, in this embodiment, alignment of the data provided by the camera 10 and the robot 20 can be achieved from the time dimension. This effectively avoids a time difference between the positioning coordinates provided by the robot 20 and the image coordinates extracted from the image photographed by the camera 10 due to the network instability, thereby improving the accuracy of camera calibration.
Fig. 4 is a flowchart of a camera calibration method according to another exemplary embodiment of the present application. The camera calibration method provided in this embodiment may be performed by a camera calibration apparatus, which may be implemented as software or as a combination of software and hardware, and may be integrally provided in a computing device. As shown in fig. 4, the camera calibration method includes:
Step 400, obtaining positioning coordinates when the robot moves to a target position, wherein the positioning coordinates are generated by the robot performing autonomous positioning according to a robot coordinate system;
Step 401, converting the positioning coordinates into scene coordinates based on the mapping relation between the robot coordinate system and the scene coordinate system;
Step 402, determining, according to an image coordinate system, image coordinates of the target position in a first image of the robot at the target position shot by the camera;
Step 403, determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
In an alternative embodiment, if the positioning coordinates are coordinates corresponding to a center point of a bottom surface of the robot, the step of determining the image coordinates of the target position includes:
according to the image coordinate system, determining an image coordinate corresponding to a top surface center point of the robot in a first image;
And determining the image coordinate corresponding to the center point of the bottom surface as the image coordinate of the target position according to the image coordinate corresponding to the center point of the top surface based on the image coordinate mapping relation between the top surface and the bottom surface when the robot is at the target position.
In an alternative embodiment, the robot has a plurality of pairs of symmetrical marking points disposed on the top and bottom surfaces, the method further comprising:
shooting a second image of the robot when the robot moves to a reference position before the target position by using a camera;
Identifying at least two pairs of mark points from the top surface and the bottom surface of the robot in the second image, and determining first image coordinates of the at least two pairs of mark points;
Determining second image coordinates of at least two pairs of mark points in the first image;
and establishing an image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position according to the first image coordinates and the second image coordinates of at least two pairs of mark points.
In an alternative embodiment, the robot is provided with an information carrier for presenting time information, the positioning coordinates being associated with the time information, the method further comprising:
And searching a target image, which is taken by the camera and contains the robot, with the time information displayed by the information carrier matched with the time information associated with the positioning coordinates, and taking the target image as a first image.
In an alternative embodiment, the target positions are multiple, and the step of determining calibration parameters of the camera according to scene coordinates and image coordinates of the target positions includes:
Selecting a target position with a distance from the visual field boundary of the camera meeting a preset requirement from a plurality of target positions as a marking position;
and determining calibration parameters of the camera according to the scene coordinates and the image coordinates corresponding to the marking positions.
In an alternative embodiment, the calibration parameters comprise a mapping between the image coordinate system and the scene coordinate system.
In an alternative embodiment, the method further comprises:
the method comprises the steps of performing environment scanning on a scene where a camera is located by using a robot to obtain environment data;
Determining the positioning coordinates of at least one environmental element in the scene in the robot coordinate system according to the positioning data and the environmental data generated by the robot in the scanning process based on the robot coordinate system;
Acquiring scene coordinates of at least one environmental element in a scene coordinate system;
and determining the mapping relation between the robot coordinate system and the scene coordinate system according to the positioning coordinate and the scene coordinate of at least one environment element.
In an alternative embodiment, the robot performs autonomous positioning using SLAM (Simultaneous Localization and Mapping) technology.
It should be noted that, for the technical details of the embodiments of the camera calibration method, reference may be made to the description of the server in the related embodiments of the camera calibration system, which is omitted herein for brevity and should not cause any loss of the protection scope of the present application.
Fig. 5 is a flowchart of another camera calibration method according to still another exemplary embodiment of the present application. The camera calibration method provided in this embodiment may be performed by a camera calibration device, which may be implemented as software or as a combination of software and hardware, and which may be integrally provided in a robot. As shown in fig. 5, the camera calibration method includes:
Step 500, receiving a navigation instruction sent by a control terminal;
Step 501, moving in a scene where a camera is located according to a navigation instruction;
Step 502, in the moving process, performing autonomous positioning according to a self coordinate system to generate positioning data;
Step 503, providing the positioning data to the server, so that the server can determine calibration parameters of the camera according to the positioning data and calibrate the camera.
In an alternative embodiment, the step of autonomously positioning according to its own coordinate system to generate positioning data includes:
positioning the bottom surface center point of the self according to the self coordinate system to obtain positioning coordinates of the bottom surface center point;
And taking the positioning coordinates of the center point of the bottom surface as positioning data.
In an alternative embodiment, a plurality of pairs of symmetrical marking points are provided on the top and bottom surfaces of the robot.
In an alternative embodiment, the robot is connected with a marker, the height difference between the marker and the bottom surface of the robot is smaller than a preset threshold, and the step of performing autonomous positioning according to a coordinate system thereof to generate positioning data comprises:
Positioning the marker according to the self coordinate system to obtain positioning coordinates of the marker;
the positioning coordinates of the markers are used as positioning data.
In an alternative embodiment, the robot is provided with an information carrier for presenting time information, the method further comprising:
Displaying time information through the information carrier in the moving process;
the time information is associated with the positioning data and provided to the server.
In an alternative embodiment, the method further comprises, prior to receiving the navigation instruction:
performing environment scanning on the scene to obtain environment data;
generating positioning data according to a self coordinate system in the scanning process;
Generating a first map according to the environment data and the positioning data;
the first map is provided to the control terminal so that the control terminal can send out navigation instructions based on the first map.
In an alternative embodiment, the navigation instruction includes a target position; the steps of moving in the scene where the camera is according to the navigation instruction include:
Moving to a target position according to the navigation instruction;
In the moving process, performing autonomous positioning according to a self coordinate system to generate positioning data, including:
And performing autonomous positioning according to the self coordinate system to generate positioning coordinates of the target position as positioning data.
In an alternative embodiment, the navigation instruction is a cruise instruction or the navigation instruction includes a plurality of path points distributed in a shooting view of the camera, and the steps of autonomously positioning according to a self coordinate system to generate positioning data include:
And performing autonomous positioning according to the self coordinate system to generate positioning coordinates of at least one track point in the moving track as positioning data.
In an alternative embodiment, the step of autonomously positioning according to its own coordinate system comprises:
performing autonomous positioning using SLAM (Simultaneous Localization and Mapping) technology.
It should be noted that, for the technical details of the embodiments of the camera calibration method, reference may be made to the description of the robot in the related embodiments of the camera calibration system, which is omitted herein for brevity and should not cause any loss of the protection scope of the present application.
It should be noted that, the execution subjects of each step of the method provided in the above embodiment may be the same device, or the method may also be executed by different devices.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations appearing in a specific order are included, but it should be clearly understood that the operations may be performed out of the order in which they appear herein or performed in parallel, the sequence numbers of the operations such as 400, 401, etc. are merely used to distinguish between the various operations, and the sequence numbers themselves do not represent any order of execution. In addition, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first" and "second" herein are used to distinguish different messages, devices, modules, etc., and do not represent a sequence, and are not limited to the "first" and the "second" being different types.
Fig. 6 is a schematic structural diagram of a computing device according to another exemplary embodiment of the present application. As shown in fig. 6, the computing device includes a memory 60, a processor 61, and a communication component 62;
memory 60 is used to store one or more computer instructions;
Processor 61 is coupled to the memory 60 and the communication component 62, and is configured to execute the one or more computer instructions to:
acquiring, through the communication component 62, positioning coordinates of the robot when the robot moves to a target position, wherein the positioning coordinates are generated by the robot performing autonomous positioning according to a robot coordinate system;
converting the positioning coordinates into scene coordinates based on a mapping relation between the robot coordinate system and a scene coordinate system;
determining, according to an image coordinate system, image coordinates of the target position in a first image of the robot at the target position captured by the camera;
and determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
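As an illustration of the last step, when the calibration parameters take the form of a mapping between the image coordinate system and the scene coordinate system (one of the embodiments below), they can be estimated from the paired scene and image coordinates of several target positions, for example as a planar homography. The following sketch uses OpenCV and assumes, purely for illustration, that the target positions lie on a common ground plane; the coordinate values are invented.

```python
import cv2
import numpy as np

# Paired observations collected for several target positions: scene
# coordinates (e.g. meters in the scene map) and image coordinates (pixels).
scene_pts = np.array([[1.0, 2.0], [4.0, 2.0], [4.0, 6.0], [1.0, 6.0]], dtype=np.float32)
image_pts = np.array([[210, 480], [750, 470], [690, 120], [260, 130]], dtype=np.float32)

# Estimate the image -> scene homography from the point pairs.
H, _ = cv2.findHomography(image_pts, scene_pts)

# Apply the calibration: map a new image coordinate into scene coordinates.
pt = cv2.perspectiveTransform(np.array([[[480.0, 300.0]]], dtype=np.float32), H)
print("scene coordinates:", pt.ravel())
```

With more than the minimum of four positions, passing cv2.RANSAC as the method argument lets the estimator discard outlier pairs.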
In an alternative embodiment, if the positioning coordinates are the coordinates corresponding to the bottom surface center point of the robot, the processor 61, when determining the image coordinates of the target position, is configured to:
determine, according to the image coordinate system, the image coordinate corresponding to the top surface center point of the robot in the first image;
and determine, based on an image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position, the image coordinate corresponding to the bottom surface center point from the image coordinate corresponding to the top surface center point, as the image coordinate of the target position.
In an alternative embodiment, a plurality of pairs of symmetrical marking points are provided on the top surface and the bottom surface of the robot, and the processor 61 is further configured to:
capture, with the camera, a second image of the robot when the robot moves to a reference position before reaching the target position;
identify at least two pairs of marking points on the top surface and the bottom surface of the robot in the second image, and determine first image coordinates of the at least two pairs of marking points;
determine second image coordinates of the at least two pairs of marking points in the first image;
and establish, according to the first image coordinates and the second image coordinates of the at least two pairs of marking points, the image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position.
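A minimal sketch of one way this mapping could be realized: fit a 2D similarity transform from the top-surface marking points to their bottom-surface counterparts and apply it to the detected top surface center point to predict the bottom surface center point. The point coordinates are invented, and a similarity transform is only one plausible model for the top-to-bottom relation in the image.

```python
import cv2
import numpy as np

# Image coordinates (pixels) of paired marking points: each top-surface
# point is paired with its symmetrical bottom-surface counterpart.
top_pts = np.array([[300, 200], [340, 205], [320, 260]], dtype=np.float32)
bottom_pts = np.array([[302, 330], [342, 335], [322, 390]], dtype=np.float32)

# Fit a 2D similarity transform (rotation + uniform scale + translation)
# mapping top-surface image points onto bottom-surface image points.
M, _ = cv2.estimateAffinePartial2D(top_pts, bottom_pts)

# Predict the bottom surface center point from the detected top surface center.
bottom_center = M @ np.array([320.0, 220.0, 1.0])
print("image coordinate of the target position:", bottom_center)
```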
In an alternative embodiment, the robot is provided with an information carrier for presenting time information, and the positioning coordinates are associated with time information. The processor 61 is further configured to:
search the images captured by the camera for a target image containing the robot in which the time information displayed by the information carrier matches the time information associated with the positioning coordinates, and take the target image as the first image.
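For illustration, a minimal sketch of such a search over candidate frames, assuming a hypothetical read_displayed_time() helper that recognizes (for example via OCR) the time shown on the information carrier, and a matching tolerance chosen for this sketch:

```python
from datetime import datetime, timedelta

def read_displayed_time(frame):
    """Hypothetical stand-in for recognizing (e.g. via OCR) the time shown
    on the robot's information carrier; returns None if it is not visible."""
    return frame.get("displayed_time")

def find_first_image(frames, target_time, tolerance=timedelta(milliseconds=200)):
    """Return the frame whose displayed time best matches the time
    associated with the positioning coordinates, or None."""
    best, best_diff = None, tolerance
    for frame in frames:
        shown = read_displayed_time(frame)
        if shown is None:
            continue  # robot or carrier not visible in this frame
        diff = abs(shown - target_time)
        if diff <= best_diff:
            best, best_diff = frame, diff
    return best

t = datetime(2020, 3, 24, 10, 0, 0)
frames = [{"displayed_time": t + timedelta(milliseconds=90)}, {"displayed_time": None}]
print(find_first_image(frames, target_time=t))
```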
In an alternative embodiment, there are a plurality of target positions, and the processor 61, when determining the calibration parameters of the camera according to the scene coordinates and the image coordinates of the target positions, is configured to:
select, from the plurality of target positions, target positions whose distances from the field-of-view boundary of the camera meet a preset requirement, as marking positions;
and determine the calibration parameters of the camera according to the scene coordinates and the image coordinates corresponding to the marking positions.
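A minimal sketch of one possible preset requirement, assuming (hypothetically) that the field-of-view boundary coincides with the image border and that the requirement is a fixed pixel margin:

```python
def select_marking_positions(samples, width, height, margin=50):
    """Keep target positions whose image coordinates lie at least
    `margin` pixels away from every image border."""
    return [s for s in samples
            if margin <= s["image_xy"][0] <= width - margin
            and margin <= s["image_xy"][1] <= height - margin]

samples = [
    {"image_xy": (480, 300), "scene_xy": (2.1, 3.4)},
    {"image_xy": (10, 300), "scene_xy": (0.2, 3.5)},  # too close to the boundary
]
print(select_marking_positions(samples, width=1920, height=1080))
```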
In an alternative embodiment, the calibration parameters comprise a mapping between the image coordinate system and the scene coordinate system.
In an alternative embodiment, the processor 61 is further configured to:
perform, by using the robot, environment scanning on the scene where the camera is located to obtain environment data;
determine, according to the positioning data generated by the robot during scanning based on the robot coordinate system and the environment data, positioning coordinates of at least one environmental element in the scene in the robot coordinate system;
acquire scene coordinates of the at least one environmental element in the scene coordinate system;
and determine the mapping relation between the robot coordinate system and the scene coordinate system according to the positioning coordinates and the scene coordinates of the at least one environmental element.
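As a sketch of the last step: given two or more environmental elements whose coordinates are known in both systems, a 2D rigid transform between the robot coordinate system and the scene coordinate system can be fitted in closed form, for example with the Kabsch/Umeyama method. The coordinates below are invented for illustration.

```python
import numpy as np

def fit_rigid_2d(src, dst):
    """Least-squares rotation R and translation t with dst ~= R @ src + t
    (Kabsch/Umeyama method, without scale)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Positioning coordinates of environmental elements in the robot coordinate
# system, and the same elements' scene coordinates (e.g. from the scene map).
robot_xy = [[0.0, 0.0], [2.0, 0.0], [2.0, 3.0]]
scene_xy = [[5.0, 1.0], [5.0, 3.0], [2.0, 3.0]]

R, t = fit_rigid_2d(robot_xy, scene_xy)
print("scene coordinate of robot point (1, 1):", R @ np.array([1.0, 1.0]) + t)
```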
In an alternative embodiment, the robot performs autonomous positioning by using SLAM (simultaneous localization and mapping) technology.
It should be noted that, for the technical details of the embodiments of the computing device, reference may be made to the description of the server in the related embodiments of the camera calibration system; they are not repeated here for brevity, which should not cause any loss of the protection scope of the present application.
Further, as shown in fig. 6, the computing device further includes: a power supply component 63, and the like. Only some components are schematically shown in fig. 6, which does not mean that the computing device includes only the components shown in fig. 6.
Accordingly, embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed, can implement the steps that can be executed by the computing device in the above method embodiments.
Fig. 7 is a schematic structural view of a robot according to still another exemplary embodiment of the present application. As shown in fig. 7, the robot includes a memory 70, a processor 71, and a communication component 72;
memory 70 is used to store one or more computer instructions;
The processor 71 is coupled to the memory 70 and the communication component 72, and is configured to execute the one or more computer instructions for:
receiving, through the communication component 72, a navigation instruction sent by the control terminal;
moving in the scene where the camera is located according to the navigation instruction;
performing, during the movement, autonomous positioning according to its own coordinate system to generate positioning data;
and providing the positioning data to the server through the communication component 72, for the server to calculate calibration parameters of the camera according to the positioning data and to calibrate the camera.
In this embodiment, the robot may further include structures such as a robot body; the shape and size of the robot body are not limited.
In an alternative embodiment, the processor 71, when performing autonomous positioning according to its own coordinate system to generate the positioning data, is configured to:
position the robot's bottom surface center point according to its own coordinate system to obtain positioning coordinates of the bottom surface center point;
and take the positioning coordinates of the bottom surface center point as the positioning data.
In an alternative embodiment, a plurality of pairs of symmetrical marking points are provided on the top and bottom surfaces of the robot.
In an alternative embodiment, a marker is attached to the robot, and the height difference between the marker and the bottom surface of the robot is smaller than a preset threshold. The processor 71, when performing autonomous positioning according to its own coordinate system to generate the positioning data, is configured to:
position the marker according to its own coordinate system to obtain positioning coordinates of the marker;
and take the positioning coordinates of the marker as the positioning data.
In an alternative embodiment, the robot is provided with an information carrier 74 for presenting time information, and the processor 71 is further configured to:
display the time information by means of the information carrier 74 during the movement;
and associate the time information with the positioning data and provide them to the server.
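For illustration, a minimal sketch of one possible payload that associates the displayed time with the positioning data before sending it to the server; the field names and JSON encoding are assumptions of this sketch, not part of the described system.

```python
import json
from datetime import datetime, timezone

def build_positioning_payload(x: float, y: float) -> str:
    """Associate the current time (the time shown on the information carrier)
    with positioning coordinates generated in the robot coordinate system."""
    return json.dumps({
        "time": datetime.now(timezone.utc).isoformat(timespec="milliseconds"),
        "positioning": {"x": x, "y": y},
    })

print(build_positioning_payload(1.25, 3.80))
```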
In an alternative embodiment, the processor 71 is further configured to, before receiving the navigation instruction:
perform environment scanning on the scene to obtain environment data;
generate positioning data according to its own coordinate system during the scanning;
generate a first map according to the environment data and the positioning data;
and provide the first map to the control terminal, so that the control terminal can issue navigation instructions based on the first map.
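As a sketch of how environment data and positioning data might be combined into a first map, the following marks range measurements into a coarse occupancy grid; the data format and resolution are invented for illustration, and in practice the map would typically come from the SLAM module itself.

```python
import math
import numpy as np

def build_first_map(scans, size=200, resolution=0.05):
    """Mark range measurements into an occupancy grid.

    scans: iterable of (pose, ranges), where pose = (x, y, theta) is the
    positioning data in the robot's own coordinate system and ranges is a
    list of (bearing, distance) measurements from the environment scan.
    """
    grid = np.zeros((size, size), dtype=np.uint8)  # 0 = free/unknown, 1 = occupied
    origin = size // 2  # place the motion starting point at the grid center
    for (x, y, theta), ranges in scans:
        for bearing, dist in ranges:
            ox = x + dist * math.cos(theta + bearing)
            oy = y + dist * math.sin(theta + bearing)
            i, j = origin + int(oy / resolution), origin + int(ox / resolution)
            if 0 <= i < size and 0 <= j < size:
                grid[i, j] = 1
    return grid

grid = build_first_map([((0.0, 0.0, 0.0), [(0.0, 1.0), (math.pi / 2, 2.0)])])
print("occupied cells:", int(grid.sum()))
```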
In an alternative embodiment, the navigation instruction includes a target position. The processor 71, when moving in the scene where the camera is located according to the navigation instruction, is configured to:
move to the target position according to the navigation instruction;
and, when performing autonomous positioning according to its own coordinate system during the movement to generate the positioning data, is configured to:
perform autonomous positioning according to its own coordinate system to generate the positioning coordinates of the target position as the positioning data.
In an alternative embodiment, the navigation instruction is a cruise instruction, or the navigation instruction includes a plurality of path points distributed in the shooting field of view of the camera, and the processor 71 is configured to:
perform autonomous positioning according to its own coordinate system to generate the positioning coordinates of at least one track point in the moving track as the positioning data.
In an alternative embodiment, the processor 71, when performing autonomous positioning according to its own coordinate system, is configured to:
perform autonomous positioning by using SLAM (simultaneous localization and mapping) technology.
It should be noted that, for the technical details of the embodiments of the robot, reference may be made to the description of the robot in the related embodiments of the camera calibration system; they are omitted here for brevity, which should not cause any loss of the protection scope of the present application.
Further, as shown in fig. 7, the robot further includes: a power supply component 73, and the like. Only some components are schematically shown in fig. 7, which does not mean that the robot includes only the components shown in fig. 7.
Accordingly, embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed, can implement the steps that can be executed by the robot in the above method embodiments.
The memories of fig. 6 and fig. 7 are used to store computer programs, and may be configured to store various other data to support operations on the computing platform. Examples of such data include instructions for any application or method operating on the computing platform, contact data, phonebook data, messages, pictures, videos, and the like. The memory may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
The communication components of fig. 6 and fig. 7 are configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G/LTE, 5G, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
Wherein the power supply assembly of fig. 6 and 7 provides power to the various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the devices in which the power components are located.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random-access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely an embodiment of the present application and is not intended to limit the present application. Various modifications and variations of the present application will occur to those skilled in the art. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included in the scope of the claims of the present application.

Claims (34)

1. A camera calibration method, comprising:
acquiring positioning coordinates of a robot when the robot moves to a target position, wherein the positioning coordinates are generated by the robot performing autonomous positioning according to a robot coordinate system, the robot coordinate system is a coordinate system in which the robot automatically positions itself, and a motion starting point of the robot serves as the origin of the robot coordinate system;
converting the positioning coordinates into scene coordinates based on a mapping relation between the robot coordinate system and a scene coordinate system, wherein the scene coordinate system is a coordinate system corresponding to the scene where a camera is located, and the scene coordinates represent positions in a scene map;
determining, according to an image coordinate system, image coordinates of the target position in a first image of the robot at the target position captured by the camera;
and determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
2. The method of claim 1, wherein, if the positioning coordinates are the coordinates corresponding to the bottom surface center point of the robot, determining the image coordinates of the target position comprises:
determining, according to an image coordinate system, the image coordinate corresponding to the top surface center point of the robot in the first image;
and determining, based on an image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position, the image coordinate corresponding to the bottom surface center point from the image coordinate corresponding to the top surface center point, as the image coordinate of the target position.
3. The method of claim 2, wherein a plurality of pairs of symmetrical marking points are provided on the top surface and the bottom surface of the robot, the method further comprising:
capturing, with the camera, a second image of the robot when the robot moves to a reference position before reaching the target position;
identifying at least two pairs of marking points on the top surface and the bottom surface of the robot in the second image, and determining first image coordinates of the at least two pairs of marking points;
determining second image coordinates of the at least two pairs of marking points in the first image;
and establishing, according to the first image coordinates and the second image coordinates of the at least two pairs of marking points, an image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position.
4. The method according to claim 1, wherein the robot is provided with an information carrier for presenting time information, and the positioning coordinates are associated with time information, the method further comprising:
searching the images captured by the camera for a target image containing the robot in which the time information displayed by the information carrier matches the time information associated with the positioning coordinates, and taking the target image as the first image.
5. The method of claim 1, wherein there are a plurality of the target positions, and determining the calibration parameters of the camera according to the scene coordinates and the image coordinates of the target positions comprises:
selecting, from the plurality of target positions, a target position whose distance from the field-of-view boundary of the camera meets a preset requirement, as a marking position;
and determining the calibration parameters of the camera according to the scene coordinates and the image coordinates corresponding to the marking position.
6. The method of claim 1, wherein the calibration parameters comprise a mapping between the image coordinate system and the scene coordinate system.
7. The method according to claim 1, further comprising:
performing, by using the robot, environment scanning on the scene where the camera is located to obtain environment data;
determining, according to the positioning data generated by the robot during scanning based on the robot coordinate system and the environment data, positioning coordinates of at least one environmental element in the scene in the robot coordinate system;
acquiring scene coordinates of the at least one environmental element in the scene coordinate system;
and determining the mapping relation between the robot coordinate system and the scene coordinate system according to the positioning coordinates and the scene coordinates of the at least one environmental element.
8. The method of claim 1, wherein the robot performs autonomous positioning by using SLAM (simultaneous localization and mapping) technology.
9. A camera calibration method, suitable for use in a robot, comprising:
receiving a navigation instruction sent by a control terminal;
moving in the scene where the camera is located according to the navigation instruction;
performing, during the movement, autonomous positioning according to a self coordinate system to generate positioning data, wherein the self coordinate system is a robot coordinate system, the robot coordinate system is a coordinate system in which the robot automatically positions itself, and a motion starting point of the robot serves as the origin of the robot coordinate system;
and providing the positioning data to a server, for the server to determine calibration parameters of the camera according to the positioning data and calibrate the camera according to the method of claim 1.
10. The method of claim 9, wherein the performing autonomous positioning according to a self coordinate system to generate positioning data comprises:
positioning the bottom surface center point of the robot according to the self coordinate system to obtain positioning coordinates of the bottom surface center point;
and taking the positioning coordinates of the bottom surface center point as the positioning data.
11. The method of claim 10, wherein a plurality of pairs of symmetrical marking points are provided on the top surface and the bottom surface of the robot.
12. The method of claim 9, wherein a marker is attached to the robot, a height difference between the marker and the bottom surface of the robot is smaller than a preset threshold, and the performing autonomous positioning according to a self coordinate system to generate positioning data comprises:
positioning the marker according to the self coordinate system to obtain positioning coordinates of the marker;
and taking the positioning coordinates of the marker as the positioning data.
13. The method according to claim 9, wherein the robot is provided with an information carrier for presenting time information, the method further comprising:
displaying time information by means of the information carrier during the movement;
and associating the time information with the positioning data and providing them to the server.
14. The method of claim 9, further comprising, prior to receiving the navigation instruction:
performing environment scanning on the scene to obtain environment data;
generating positioning data according to the self coordinate system during the scanning;
generating a first map according to the environment data and the positioning data;
and providing the first map to the control terminal, so that the control terminal can issue the navigation instruction based on the first map.
15. The method of claim 9, wherein the navigation instruction includes a target position, and the moving in the scene where the camera is located according to the navigation instruction comprises:
moving to the target position according to the navigation instruction;
and the performing autonomous positioning according to a self coordinate system during the movement to generate positioning data comprises:
performing autonomous positioning according to the self coordinate system to generate positioning coordinates of the target position as the positioning data.
16. The method of claim 9, wherein the navigation instruction is a cruise instruction or the navigation instruction includes a plurality of path points distributed in the shooting field of view of the camera, and the performing autonomous positioning according to a self coordinate system to generate positioning data comprises:
performing autonomous positioning according to the self coordinate system to generate positioning coordinates of at least one track point in the moving track as the positioning data.
17. The method of claim 9, wherein the performing autonomous positioning according to a self coordinate system comprises:
performing autonomous positioning by using SLAM (simultaneous localization and mapping) technology.
18. A camera calibration system, comprising a camera to be calibrated, a robot, and a server, the server being in communication connection with the camera and the robot respectively;
the robot is configured to perform autonomous positioning according to a robot coordinate system during movement and provide the server with positioning coordinates of the robot at a target position, wherein the robot coordinate system is a coordinate system in which the robot automatically positions itself, and a motion starting point of the robot serves as the origin of the robot coordinate system;
the camera is configured to photograph the robot and provide the server with a first image of the robot at the target position;
the server is configured to convert the positioning coordinates into scene coordinates based on a mapping relation between the robot coordinate system and a scene coordinate system, wherein the scene coordinate system is a coordinate system corresponding to the scene where the camera is located, and the scene coordinates represent positions in a scene map; determine image coordinates of the target position in the first image according to an image coordinate system; and determine calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
19. The system of claim 18, further comprising a control terminal communicatively coupled to the robot;
the control terminal is configured to control the robot to move in the scene where the camera is located.
20. The system of claim 19, wherein the robot is further configured to: perform environment scanning on the scene where the camera is located to obtain environment data; generate positioning data according to the robot coordinate system during the scanning; and generate a first map according to the environment data and the positioning data;
the control terminal is specifically configured to: display the first map; and control the robot to move in the scene in response to a robot navigation instruction issued on the first map.
21. The system according to claim 20, wherein the navigation instruction includes the target position, and the control terminal is configured to control the robot to move to the target position;
wherein the number of target locations is one or more.
22. The system according to claim 20, wherein the navigation instruction is a cruise instruction or the navigation instruction includes a plurality of path points distributed in the shooting field of view of the camera, and the control terminal is configured to control the robot to move according to the navigation instruction;
the server is further configured to select, as the target position, at least one track point that meets a marking requirement from the moving track of the robot, when it is determined that the coverage of the camera's shooting field of view by the moving track of the robot reaches a preset standard.
23. The system of claim 22, wherein the server, when selecting the target position, is configured to:
select, from the moving track of the robot, at least one track point whose distance from the field-of-view boundary of the camera meets a preset requirement, as the target position.
24. The system of claim 20, wherein the server is further configured to:
determine, according to the positioning data generated by the robot and the environment data, positioning coordinates of at least one environmental element in the scene in the robot coordinate system;
acquire scene coordinates of the at least one environmental element in the scene coordinate system;
and determine the mapping relation between the robot coordinate system and the scene coordinate system according to the positioning coordinates and the scene coordinates of the at least one environmental element.
25. The system according to claim 18, wherein the robot is specifically configured to position its bottom surface center point when moving to the target position, and take the obtained positioning coordinates of the bottom surface center point as the positioning coordinates of the target position;
the server is specifically configured to: determine, according to an image coordinate system, the image coordinate corresponding to the top surface center point of the robot in the first image; and determine, based on an image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position, the image coordinate corresponding to the bottom surface center point from the image coordinate corresponding to the top surface center point, as the image coordinate of the target position.
26. The system of claim 25, wherein a plurality of pairs of symmetrical marking points are provided on the top surface and the bottom surface of the robot, and the server is further configured to:
capture, with the camera, a second image of the robot when the robot moves to a reference position before reaching the target position;
identify at least two pairs of marking points on the top surface and the bottom surface of the robot in the second image, and determine first image coordinates of the at least two pairs of marking points;
determine second image coordinates of the at least two pairs of marking points in the first image;
and establish, according to the first image coordinates and the second image coordinates of the at least two pairs of marking points, the image coordinate mapping relation between the top surface and the bottom surface of the robot at the target position.
27. The system of claim 18, wherein a marker is attached to the robot, the height difference between the marker and the bottom surface of the robot is smaller than a preset threshold, and the robot is specifically configured to: position the marker when moving to the target position, and take the obtained positioning coordinates of the marker as the positioning coordinates of the target position;
the server is specifically configured to: determine, according to an image coordinate system, the image coordinate corresponding to the marker in the first image as the image coordinate of the target position.
28. The system according to claim 18, wherein the robot is provided with an information carrier for presenting time information, and the robot is further configured to: display time information by means of the information carrier when moving to the target position; and associate the time information with the positioning coordinates of the target position and provide them to the server;
the server is specifically configured to: search the images captured by the camera for a target image containing the robot in which the time information displayed by the information carrier matches the time information associated with the positioning coordinates, and take the target image as the first image.
29. The system of claim 28, wherein the information carrier is a flexible display screen, there are a plurality of the flexible display screens, and the plurality of flexible display screens encircle a side wall of the robot.
30. The system of claim 18, wherein the calibration parameters comprise a mapping between the image coordinate system and the scene coordinate system.
31. The system of claim 18, wherein the robot performs autonomous positioning by using SLAM (simultaneous localization and mapping) technology.
32. A computing device comprising a memory, a processor, and a communication component;
the memory is used for storing one or more computer instructions;
The processor is coupled with the memory and the communication component for executing the one or more computer instructions for:
acquiring, through the communication component, positioning coordinates of a robot when the robot moves to a target position, wherein the positioning coordinates are generated by the robot performing autonomous positioning according to a robot coordinate system, the robot coordinate system is a coordinate system in which the robot automatically positions itself, and a motion starting point of the robot serves as the origin of the robot coordinate system;
converting the positioning coordinates into scene coordinates based on a mapping relation between the robot coordinate system and a scene coordinate system, wherein the scene coordinate system is a coordinate system corresponding to the scene where a camera is located, and the scene coordinates represent positions in a scene map;
determining, according to an image coordinate system, image coordinates of the target position in a first image of the robot at the target position captured by the camera;
and determining calibration parameters of the camera according to the scene coordinates and the image coordinates of the target position, so as to calibrate the camera.
33. A robot, characterized by comprising: a memory, a processor, and a communication component;
the memory is used for storing one or more computer instructions;
The processor is coupled with the memory and the communication component for executing the one or more computer instructions for:
receiving, through the communication component, a navigation instruction sent by a control terminal;
moving in the scene where the camera is located according to the navigation instruction;
performing, during the movement, autonomous positioning according to a self coordinate system to generate positioning data, wherein the self coordinate system is a robot coordinate system, the robot coordinate system is a coordinate system in which the robot automatically positions itself, and a motion starting point of the robot serves as the origin of the robot coordinate system;
and providing the positioning data to a server through the communication component, for the server to calculate calibration parameters of the camera according to the positioning data and calibrate the camera according to the method of claim 1.
34. A computer-readable storage medium storing computer instructions that, when executed by one or more processors, cause the one or more processors to perform the camera calibration method of any of claims 1-17.
CN202010214083.XA 2020-03-24 2020-03-24 Camera calibration method, equipment, system and storage medium Active CN113450414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010214083.XA CN113450414B (en) 2020-03-24 2020-03-24 Camera calibration method, equipment, system and storage medium


Publications (2)

Publication Number Publication Date
CN113450414A (en) 2021-09-28
CN113450414B (en) 2024-09-24

Family ID: 77807471


Country: CN

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023131883A (en) * 2022-03-10 2023-09-22 カイトウ建築設備工業株式会社 Position marking method and position marking system
US12282333B2 (en) * 2022-04-01 2025-04-22 Ford Global Technologies, Llc Systems and methods for calibrating a map of an autonomous robot
CN114782555B (en) * 2022-06-20 2022-09-16 深圳市海清视讯科技有限公司 Map mapping method, apparatus, and storage medium
CN115953485B (en) * 2023-03-15 2023-06-02 中国铁塔股份有限公司 Camera calibration method and device
CN116862994A (en) * 2023-06-21 2023-10-10 杭州海康威视系统技术有限公司 Camera calibration method, device, system, equipment and storage medium

Citations (1)

Publication number Priority date Publication date Assignee Title
CN105785989A (en) * 2016-02-24 2016-07-20 中国科学院自动化研究所 System for calibrating distributed network camera by use of travelling robot, and correlation methods

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN108965687B (en) * 2017-05-22 2021-01-29 阿里巴巴集团控股有限公司 Shooting direction identification method, server, monitoring method, monitoring system and camera equipment
CN107481284A (en) * 2017-08-25 2017-12-15 京东方科技集团股份有限公司 Method, apparatus, terminal and the system of target tracking path accuracy measurement
CN109668551B (en) * 2017-10-17 2021-03-26 杭州海康机器人技术有限公司 Robot positioning method, device and computer readable storage medium
CN107766855B (en) * 2017-10-25 2021-09-07 南京阿凡达机器人科技有限公司 Chessman positioning method and system based on machine vision, storage medium and robot
CN110243360B (en) * 2018-03-08 2022-02-22 深圳市优必选科技有限公司 Method for constructing and positioning map of robot in motion area
CN109540144A (en) * 2018-11-29 2019-03-29 北京久其软件股份有限公司 A kind of indoor orientation method and device
CN110888957B (en) * 2019-11-22 2023-03-10 腾讯科技(深圳)有限公司 Object positioning method and related device




Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20220701
Address after: Room 5034, building 3, 820 wenerxi Road, Xihu District, Hangzhou City, Zhejiang Province
Applicant after: ZHEJIANG LIANHE TECHNOLOGY Co.,Ltd.
Address before: Box 847, Fourth Floor, Grand Cayman capital, Cayman Islands, UK
Applicant before: ALIBABA GROUP HOLDING Ltd.
TA01 Transfer of patent application right
Effective date of registration: 20240621
Address after: Room 801-6, No. 528 Yan'an Road, Gongshu District, Hangzhou City, Zhejiang Province, 310005
Applicant after: Zhejiang Shenxiang Intelligent Technology Co.,Ltd.
Country or region after: China
Address before: Room 5034, Building 3, No. 820, Wener West Road, Xihu District, Hangzhou City, Zhejiang Province, 310050
Applicant before: ZHEJIANG LIANHE TECHNOLOGY Co.,Ltd.
Country or region before: China
GR01 Patent grant