CN108216624A - Method and apparatus for controlling UAV landing, and UAV - Google Patents
Method and apparatus for controlling UAV landing, and UAV
- Publication number
- CN108216624A (application number CN201711421045.6A)
- Authority
- CN
- China
- Prior art keywords
- landing
- UAV
- landing marker region
- marker
- current location
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/04—Control of altitude or depth
- G05D1/06—Rate of change of altitude or depth
- G05D1/0607—Rate of change of altitude or depth specially adapted for aircraft
- G05D1/0653—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
- G05D1/0676—Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a method and apparatus for controlling the landing of an unmanned aerial vehicle (UAV), and a UAV. The method includes: when the UAV reaches a designated position, acquiring an image containing a landing marker region, the landing marker region being a region that marks a preset landing position with a set color; determining the current location of the landing marker region in the image; adjusting the horizontal displacement of the UAV according to the current location; and controlling the UAV to descend onto the landing position. With the method of controlling UAV landing of the present invention, the UAV can land more accurately on the designated landing position.
Description
Technical field
The present invention relates to the technical field of UAV landing, and in particular to a method and apparatus for controlling UAV landing, and a UAV.
Background technology
An unmanned aerial vehicle (UAV), commonly called a drone, is an unmanned aircraft operated by radio remote-control equipment and an on-board program-control device, or operated fully or intermittently in an autonomous manner by an on-board computer. UAV technology is widely used in both military and civilian applications.
Autonomous take-off and landing are key elements of autonomous UAV control and a prerequisite for recovering and reusing the UAV. Take-off is comparatively simple: it mainly consists of an accelerating ground run and a lift-off climb, and once take-off conditions are met it can be achieved with fairly simple program control, placing little demand on the autonomy of the system. Landing, by contrast, is more complex and requires the UAV to have high-precision autonomous positioning and navigation and robust landing-trajectory tracking capability.
At present, UAVs capable of automatic landing mostly control the landing according to GPS position information. However, because GPS position information carries an error of several meters, the UAV often fails to land on the preset landing position. If differential GPS is used to improve landing accuracy, GPS reference stations must be deployed in advance, which increases the deployment cost of the system.
Summary of the invention
An object of the present invention is to provide a new technical solution for controlling UAV landing.
According to a first aspect of the present invention, a method for controlling UAV landing is provided, including:
when the UAV reaches a designated position, acquiring an image containing a landing marker region, the landing marker region being a region that marks a preset landing position with a set color;
determining the current location of the landing marker region in the image;
adjusting the horizontal displacement of the UAV according to the current location; and
controlling the UAV to descend onto the landing position.
Optionally, the designated position is:
the position of the UAV when the latitude and longitude collected by a positioning module arranged in the UAV are the same as the latitude and longitude of the landing position.
Optionally, the designated position is:
a position at which the current flight altitude of the UAV is greater than or equal to a preset minimum descent altitude.
Optionally, the method further includes:
determining the minimum descent altitude of the UAV according to the size of the landing marker region and the error range of the positioning module; and
if the current flight altitude of the UAV is lower than the minimum descent altitude, controlling the UAV to ascend.
Optionally, determining the current location of the landing marker region in the image includes:
identifying the landing marker region in the image according to the set color; and
determining the position of the center point of the landing marker region in the image, and taking that position as the current location.
Optionally, identifying the landing marker region in the image according to the set color includes:
calculating the HSV value of each pixel in the image;
identifying, according to the HSV values, the pixels whose color is the set color; and
determining the landing marker region from the pixels of the set color.
Optionally, adjusting the horizontal displacement of the UAV according to the current location includes:
calculating a first displacement from the current location to the center point of the image; and
adjusting a second, horizontal displacement of the UAV according to the first displacement.
According to a second aspect of the present invention, an apparatus for controlling UAV landing is provided, including:
an image acquisition module for acquiring, when the UAV reaches a designated position, an image containing a landing marker region, the landing marker region being a region that marks a preset landing position with a set color;
a current-location determining module for determining the current location of the landing marker region in the image;
an adjustment module for adjusting the horizontal displacement of the UAV according to the current location; and
a landing module for controlling the UAV to descend onto the landing position.
Optionally, the designated position is:
the position of the UAV when the latitude and longitude collected by a positioning module arranged in the UAV are the same as the latitude and longitude of the landing position.
Optionally, the designated position is:
a position at which the current flight altitude of the UAV is greater than or equal to a preset minimum descent altitude.
Optionally, the apparatus further includes:
a minimum-altitude determining module for determining the minimum descent altitude of the UAV according to the size of the landing marker region and the error range of the positioning module; and
an ascent control module for controlling the UAV to ascend when the current flight altitude of the UAV is lower than the minimum descent altitude.
Optionally, the current-location determining module further includes:
a region identification unit for identifying the landing marker region in the image according to the set color; and
a position determining unit for determining the position of the center point of the landing marker region in the image and taking that position as the current location.
Optionally, the region identification unit further includes:
a calculation subunit for calculating the HSV value of each pixel in the image;
a color identification subunit for identifying, according to the HSV values, the pixels whose color is the set color; and
a region recognition subunit for determining the landing marker region from the pixels of the set color.
Optionally, the adjustment module further includes:
a displacement calculation unit for calculating a first displacement from the current location to the center point of the image; and
an adjustment unit for adjusting a second, horizontal displacement of the UAV according to the first displacement.
According to a third aspect of the present invention, a UAV is provided, including the apparatus according to the second aspect of the present invention.
According to a fourth aspect of the present invention, a UAV is provided, including a processor and a memory, the memory being used to store instructions that control the processor to perform the method according to the first aspect of the present invention.
An advantageous effect of the present invention is that, with the method of controlling UAV landing of the present invention, the UAV can land more accurately on the designated landing position.
Other features and advantages of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the present invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of an embodiment of a method for controlling UAV landing according to the present invention;
Fig. 2 is a schematic diagram of imaging by the camera module;
Fig. 3 is a schematic diagram of the calculation of the minimum descent altitude in the longitude direction;
Fig. 4 is a block diagram of an implementation of an apparatus for controlling UAV landing according to the present invention;
Fig. 5 is a block diagram of another implementation of an apparatus for controlling UAV landing according to the present invention;
Fig. 6 is a block diagram of an implementation of a UAV according to the present invention;
Fig. 7 is a schematic diagram of a landing marker region.
Detailed description of embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention or its application or uses.
Techniques, methods and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate such techniques, methods and devices should be regarded as part of the specification.
In all of the examples shown and discussed here, any specific value should be interpreted as merely illustrative and not as a limitation. Other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item is defined in one drawing, it need not be discussed further in subsequent drawings.
To solve the problem in the prior art that, when UAV landing is controlled by GPS position information, the error in the GPS position information causes the UAV to land at a deviated position, a method for controlling UAV landing is provided.
Fig. 1 is a flowchart of an embodiment of a method for controlling UAV landing according to the present invention.

As shown in Fig. 1, the method includes the following steps:
Step S110: when the UAV reaches a designated position, acquire an image containing the landing marker region. The landing marker region is a region that marks a preset landing position with a set color. There is a specific marker relationship between the landing marker region and the landing position, so that after identifying the landing marker region the UAV can perform the corresponding calculation according to this relationship and thereby determine the landing position. In one specific embodiment of the present invention, as shown in Fig. 7, the landing position O is located at the center point of the landing marker region S; it is easy to understand that in this embodiment the marker relationship is simply that the landing position is the center point of the landing marker region. It is also easy to understand that, in other embodiments of the present invention, the landing position may be located at any point of the landing marker region other than the center point, or even outside the landing marker region, as long as a specific marker relationship with the landing marker region exists; the UAV can then, after identifying the landing marker region, perform the corresponding calculation according to this specific marker relationship and thereby determine the landing position. The marker relationship may be stored in the UAV in advance. The landing marker region S is marked in advance with a set color, which may for example be red. Further, the landing marker region S may be a regular figure, for example a rectangle, a square or a circle, so that the UAV can accurately determine the landing position from the landing marker region.
The designated position specifically refers to the position of the UAV when the position information collected by the positioning module arranged in the UAV meets a preset condition. The position information in this embodiment includes latitude/longitude information and altitude information; when the latitude and longitude collected by the positioning module are the same as the latitude and longitude of the landing position, the UAV is considered to have reached the designated position. The position information of the landing position is stored in the UAV in advance. The positioning module may, for example, be any one or any combination of a GPS positioning module, a BeiDou positioning module, a Galileo positioning module and a base-station positioning module.
It is easy to understand that, for the landing marker region to be imaged completely on the image sensor of the camera module, the UAV must be at a suitable altitude. Moreover, since there is a certain error between the position information collected by the positioning module and the actual position of the UAV, in order to ensure that the landing marker region still appears in the acquired image despite this error, the designated position further refers to the position of the UAV when its current flight altitude is greater than or equal to a preset minimum descent altitude. Specifically, the current flight altitude can be measured by a barometer or a distance sensor mounted on the UAV. When the current flight altitude of the UAV is greater than or equal to the minimum descent altitude, the landing marker region will lie entirely within the acquired image. Note that the flight altitude, current flight altitude, descent altitude and minimum descent altitude referred to here are all relative altitudes, namely heights above the landing position.

On this basis, the method further includes:
determining the minimum descent altitude of the UAV according to the size of the landing marker region and the error range of the positioning module.
Fig. 2 is a schematic diagram of imaging by the camera module.

As shown in Fig. 2, the scene 2 is imaged by the lens 3 of the camera module onto the image sensor 1 of the camera module, and the size of the scene 2 changes with the distance between the lens 3 and the scene 2.

The size of the scene 2 is calculated as W2 = W1·d/f (formula one) and L2 = L1·d/f (formula two), where W1 is the width of the image sensor 1; L1 is the length of the image sensor 1; W2 is the width of the scene 2; L2 is the length of the scene 2; f is the focal length of the camera module, i.e. the distance between the lens 3 and the image sensor 1; and d is the distance between the lens 3 and the scene 2, which corresponds to the current flight altitude of the UAV. The length here is the dimension along the longitude direction, and the width is the dimension along the latitude direction.
Fig. 3 is a schematic diagram of the calculation of the minimum descent altitude in the longitude direction.

A and B are the boundary points of the photographed scene in the length direction when the camera module images vertically downward, i.e. AB = L2; the position of the UAV is C; the landing position is O; and the length of the landing marker region is M. CD is the straight line perpendicular to AB and meets AB at point D, so AD = DB.

The height of point C corresponds to d in Fig. 2. The measurement error of the flight altitude of the UAV is d', which is determined by the sensor used by the UAV or obtained from actual test results and is a known value. The longitude error of the positioning module is J, which is determined by the positioning module or obtained from actual test results and is also a known value. Since the UAV needs to acquire an image containing the whole landing marker region even when it is horizontally offset from the landing position by the longitude error J, the scene length must satisfy L2 ≥ M + 2J.

Since the length of the scene photographed by the camera module is L2, i.e. L2 = AB, it follows from formula two that L1·d/f ≥ M + 2J, i.e. d ≥ f·(M + 2J)/L1. Because f, M, J and L1 are known, the value of d can be obtained.

Since the error of the flight altitude of the UAV is d', the requirement that the longitude error places on the flight altitude is d ≥ f·(M + 2J)/L1 + d'.

Similarly, assuming the width of the landing marker is N, the requirement that the latitude error K of the UAV places on the flight altitude follows from formula one: d ≥ f·(N + 2K)/W1 + d'.

The minimum descent altitude is then the larger of f·(M + 2J)/L1 + d' and f·(N + 2K)/W1 + d'.
If the current flight altitude of the UAV is lower than the minimum descent altitude, the UAV is controlled to ascend.

In this way, it is ensured that, when the latitude and longitude collected by the positioning module are the same as those of the landing position, the UAV can acquire an image that completely contains the landing marker region.
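The derivation above can be folded into a small helper that computes the minimum descent altitude from the marker size, the positioning error and the altimeter error. This is a sketch; the marker size, GPS error and sensor parameters in the example are assumed for illustration.

```python
def min_descent_altitude(marker_len_m, marker_wid_m,
                         lon_err_m, lat_err_m,
                         sensor_l_m, sensor_w_m,
                         focal_m, alt_err_m):
    """Smallest flight altitude at which the whole marker stays in view
    despite horizontal positioning error and altitude measurement error."""
    d_lon = focal_m * (marker_len_m + 2 * lon_err_m) / sensor_l_m + alt_err_m
    d_lat = focal_m * (marker_wid_m + 2 * lat_err_m) / sensor_w_m + alt_err_m
    return max(d_lon, d_lat)

# Example: 1 m x 1 m marker, 3 m GPS error, 0.5 m altimeter error
h_min = min_descent_altitude(1.0, 1.0, 3.0, 3.0, 6.2e-3, 4.6e-3, 4e-3, 0.5)
print(f"minimum descent altitude: {h_min:.1f} m")   # ~6.6 m
```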
Step S120: determine the current location of the landing marker region in the image.

Since the landing marker region is marked with a set color, a specific way to perform step S120 is: identify the landing marker region in the image according to the set color; determine the position of the center point of the landing marker region in the image, and take that position as the current location.
Further, the landing marker region in the image can be identified by a color identification method such as RGB, HSV or YUV.
This embodiment is illustrated using the HSV color identification method. H (hue) denotes the type of the color, for example red, orange, yellow, green, cyan, blue or purple; a larger S (saturation) value means a more saturated color; and a larger V (value) means a brighter color. The type of color of an object to be detected can therefore be judged from its H, S and V values. After the image information from the camera module is input to the color identification module, the color identification module calculates the H, S and V values of each pixel in the image and identifies the pixels whose H, S and V values match those of the set color, thereby identifying the landing marker region marked by the set color. For example, when the set color is red, the pixels to be detected are those whose H value is in the range 0-10 or 156-180, whose S value is in the range 43-255 and whose V value is in the range 46-255; the region formed by these pixels that has the same shape as the landing marker region is the landing marker region in the image.
When the landing marker region is marked with a red object, it is imaged as red by the camera module, and color identification technology can be used to detect the red area in the image and thereby identify the landing marker region. Alternatively, the landing marker region may be marked with an infrared light source; in that case an infrared filter element is arranged on the camera module, the landing marker region is imaged as white, and the white area in the image can be detected to identify the landing marker region.
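A minimal sketch of the red-marker detection described above, using HSV thresholding in OpenCV. The H/S/V ranges follow the ranges quoted in the text; the function and variable names are illustrative assumptions rather than part of the patent.

```python
import cv2

def detect_red_marker(image_bgr):
    """Return a binary mask of pixels whose HSV values fall in the red ranges."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis: H in 0-10 or 156-180, S in 43-255, V in 46-255
    mask_low = cv2.inRange(hsv, (0, 43, 46), (10, 255, 255))
    mask_high = cv2.inRange(hsv, (156, 43, 46), (180, 255, 255))
    return cv2.bitwise_or(mask_low, mask_high)
```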
On this basis, the position of the center point of the landing marker region in the image is calculated and taken as the current location. Specifically, the center point of the landing marker region can be calculated by a geometric method, and the relative position of that center point within the scene of the camera module can then be calculated. In one specific embodiment of the present invention, the landing position is located at the center point of the landing marker region, so the relative position of the landing position within the scene of the camera module is thereby obtained.
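A sketch of computing the marker's center point from the color mask, under the assumption that the largest detected blob is the landing marker region; the helper name is illustrative.

```python
import cv2

def marker_center(mask):
    """Centroid (cx, cy) in pixels of the largest connected marker region, or None."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```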
Step S130: adjust the horizontal displacement of the UAV according to the current location.

Specifically, the displacement from the current location to the center of the image can be calculated, and the horizontal displacement of the UAV is adjusted according to that displacement until the center point of the landing marker region coincides with the center point of the image.
In one specific embodiment of the present invention, the landing marker region can be set around the landing position in advance so that the landing position lies at the center point of the landing marker region, the minimum descent altitude is calculated, and the latitude/longitude and altitude of the landing position are stored in the UAV in advance. When controlling the UAV landing, the UAV is first controlled to reach the designated position according to the latitude and longitude collected by the GPS or other positioning module, and an image containing the landing marker region is acquired. Color identification technology is then used to calculate the image position of the landing position in the camera module, i.e. the relative position of the center point of the landing marker region in the image. According to the offset between this image position and the image center, the UAV is controlled to move until it is directly above the landing position, i.e. until the center point of the landing marker region coincides with the center point of the image, and the UAV is then controlled to descend vertically onto the landing position.
Here, the displacement from the current location to the image center is the first displacement, and the horizontal displacement of the UAV is the second displacement; the first displacement and the second displacement have the same direction. The magnitude of the second displacement can be calculated from the magnitude of the first displacement and the descent altitude of the UAV, and the UAV is controlled to move horizontally according to the second displacement so that it is located directly above the landing position.
In another specific embodiment of the present invention, the magnitude of the first displacement is taken as a first distance and the magnitude of the second displacement as a second distance. The numerical range into which the first distance may fall is divided into several ranges (a first numerical range, a second numerical range, and so on), and a second distance uniquely corresponding to each numerical range is set in advance. For example, when the first distance falls within the first numerical range, the UAV is controlled to move a second distance of 1 m in the direction of the first displacement; when the first distance falls within the second numerical range, the UAV is controlled to move a second distance of 0.5 m in the direction of the first displacement. Steps S120 and S130 are repeated until the UAV is located directly above the landing position.
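A sketch of the stepped adjustment described in this embodiment: the pixel offset of the marker center from the image center (the first displacement) is bucketed into ranges, each mapped to a fixed horizontal move (the second displacement). Only the 1 m and 0.5 m steps come from the text; the pixel thresholds and the small residual step are assumptions for illustration.

```python
import math

def horizontal_step(dx_px, dy_px):
    """Map the pixel offset of the marker center from the image center
    to a horizontal move: (distance in meters, unit direction vector)."""
    dist_px = math.hypot(dx_px, dy_px)
    if dist_px < 5:                       # close enough: directly above the landing position
        return 0.0, (0.0, 0.0)
    if dist_px > 200:                     # first numerical range -> move 1 m
        step_m = 1.0
    elif dist_px > 50:                    # second numerical range -> move 0.5 m
        step_m = 0.5
    else:                                 # small residual offset (assumed extra range)
        step_m = 0.1
    return step_m, (dx_px / dist_px, dy_px / dist_px)
```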
Step S140: control the UAV to descend onto the landing position.

In one specific embodiment of the present invention, when step S140 is performed the UAV is located directly above the landing position.
Optionally, the landing may be a vertical landing or may follow a preset specific path, which may be a curve, a straight line, or a combination of the two.
In another specific embodiment of the present invention, the landing position is not located at the center point of the landing marker region, i.e. the landing position is a specified point of the landing marker region other than the center point, or a specified point outside the landing marker region. A specific marker relationship between the landing position and the center point of the landing marker region is set and stored in the UAV in advance. After performing steps S110 to S130, the UAV performs the corresponding calculation according to this specific marker relationship, obtains the position of the landing position relative to the center point of the landing marker region, and generates a specific path so as to descend automatically onto the landing position.
In this way, with the method of controlling UAV landing of the present invention, the UAV can land more accurately on the designated landing position.
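Putting the steps together, the loop sketched below repeats image acquisition (S110), marker localization (S120) and horizontal adjustment (S130) until the marker center coincides with the image center, then commands the final descent (S140). It reuses the helpers from the sketches above; the flight-control calls (goto_waypoint, capture_image, move_horizontal, ascend, land) are hypothetical placeholders, not an API defined by the patent.

```python
def land_on_marker(uav, landing_lat, landing_lon, h_min):
    # S110: reach the designated position at or above the minimum descent altitude
    uav.goto_waypoint(landing_lat, landing_lon, max(uav.altitude(), h_min))
    while True:
        frame = uav.capture_image()                  # S110: acquire image containing the marker
        mask = detect_red_marker(frame)              # S120: color identification
        center = marker_center(mask)
        if center is None:
            uav.ascend(1.0)                          # marker not fully in view: climb and retry
            continue
        h, w = frame.shape[:2]
        dx, dy = center[0] - w / 2, center[1] - h / 2
        step_m, direction = horizontal_step(dx, dy)  # S130: adjust horizontal displacement
        if step_m == 0.0:
            break                                    # marker center coincides with image center
        uav.move_horizontal(step_m, direction)
    uav.land()                                       # S140: vertical descent onto the landing position
```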
Corresponding to the above method, the present invention also provides an apparatus for controlling UAV landing. Fig. 4 is a block diagram of an implementation of the apparatus for controlling UAV landing.
As shown in Fig. 4, the apparatus includes an image acquisition module 410, a current-location determining module 420, an adjustment module 430 and a landing module 440.

The image acquisition module 410 is used to acquire, when the UAV reaches the designated position, an image containing the landing marker region, the landing marker region being a region that marks the preset landing position with a set color. The current-location determining module 420 is used to determine the current location of the center point of the landing marker region in the image. The adjustment module 430 is used to adjust the horizontal displacement of the UAV according to the current location so that the UAV descends onto the landing position. The landing module 440 is used to control the UAV to descend onto the landing position.
Specifically, the designated position is the position of the UAV when the latitude and longitude collected by the positioning module arranged in the UAV are the same as the latitude and longitude of the landing position.

Further, the designated position is a position at which the current flight altitude of the UAV is greater than or equal to the preset minimum descent altitude.
On this basis, as shown in Fig. 5, the apparatus further includes a minimum-altitude determining module 510 and an ascent control module 520. The minimum-altitude determining module 510 is used to determine the minimum descent altitude of the UAV according to the size of the landing marker region and the error range of the positioning module; the ascent control module 520 is used to control the UAV to ascend when the current flight altitude of the UAV is lower than the minimum descent altitude.
As shown in Fig. 5, the current-location determining module 420 may further include a region identification unit 421 and a position determining unit 422. The region identification unit 421 is used to identify the landing marker region in the image according to the set color; the position determining unit 422 is used to determine the position of the center point of the landing marker region in the image and to take that position as the current location.
The region identification unit 421 further includes a calculation subunit, a color identification subunit and a region recognition subunit. The calculation subunit is used to calculate the HSV value of each pixel in the image; the color identification subunit is used to identify, according to the HSV values, the pixels whose color is the set color; and the region recognition subunit is used to determine the landing marker region from the pixels of the set color.
The adjustment module 430 may further include a displacement calculation unit 431 and an adjustment unit 432. The displacement calculation unit 431 is used to calculate the first displacement from the current location to the image center; the adjustment unit 432 is used to adjust the second, horizontal displacement of the UAV according to the first displacement so that the current location coincides with the image center.
The landing module 440 calculates, according to the specific marker relationship between the landing position and the landing marker region, the position of the landing position relative to the center point of the landing marker region, generates a specific path, and controls the UAV to descend onto the landing position.
The present invention also provides a UAV. In one aspect, the UAV includes the aforementioned apparatus for controlling UAV landing.
Fig. 6 is a block diagram of an implementation of a UAV according to another aspect of the present invention.

As shown in Fig. 6, the UAV includes a memory 601 and a processor 602. The memory 601 is used to store instructions, and the instructions are used to control the processor 602 to operate so as to perform the above method for controlling UAV landing.

The processor 602 may, for example, be a central processing unit (CPU) or a microcontroller unit (MCU). The memory 601 includes, for example, ROM (read-only memory), RAM (random access memory), and non-volatile memory such as a hard disk.

In addition, as shown in Fig. 6, the UAV further includes an interface device 603, an input device 604, a display device 605, a communication device 606, a camera device 607, a positioning device 608, and so on. Although multiple devices are shown in Fig. 6, the UAV of the present invention may involve only some of them, for example the processor 602, the memory 601, the camera device 607 and the positioning device 608.
The above communication device 606 can, for example, perform wired or wireless communication.

The above interface device 603 includes, for example, a headphone jack and a USB interface.

The above input device 604 can include, for example, a touch screen and buttons.

The above display device 605 is, for example, a liquid crystal display or a touch display screen.
Each embodiment in this specification is described in a progressive manner; identical or similar parts of the embodiments can be referred to each other, and each embodiment focuses on its differences from the other embodiments. Those skilled in the art should understand that the embodiments described above can be used individually or combined with one another as required. In addition, since the apparatus embodiments correspond to the method embodiments, their description is relatively simple; for related parts, reference can be made to the description of the corresponding parts of the method embodiments. The system embodiments described above are merely schematic, and the modules described as separate components may or may not be physically separate.
The present invention may be a device, a method and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement various aspects of the present invention.
The computer-readable storage medium may be a tangible device that can hold and store instructions for use by an instruction-executing device. The computer-readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used here, is not to be construed as being a transient signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described here can be downloaded from a computer-readable storage medium to respective computing/processing devices, or downloaded via a network, for example the Internet, a local area network, a wide area network and/or a wireless network, to an external computer or external storage device. The network may include copper transmission cables, optical-fiber transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards them for storage in the computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembly instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, an electronic circuit, for example a programmable logic circuit, a field-programmable gate array (FPGA) or a programmable logic array (PLA), is personalized by utilizing the state information of the computer-readable program instructions, and the electronic circuit can execute the computer-readable program instructions so as to implement various aspects of the present invention.
Aspects of the present invention are described here with reference to flowcharts and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer or another programmable data-processing apparatus, thereby producing a machine, such that the instructions, when executed by the processor of the computer or other programmable data-processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium; the instructions cause the computer, the programmable data-processing apparatus and/or other devices to work in a specific manner, so that the computer-readable medium storing the instructions comprises an article of manufacture that includes instructions implementing aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.

The computer-readable program instructions may also be loaded onto a computer, another programmable data-processing apparatus or another device, so that a series of operational steps are performed on the computer, the other programmable data-processing apparatus or the other device to produce a computer-implemented process, such that the instructions executed on the computer, the other programmable data-processing apparatus or the other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functionality and operation of possible implementations of systems, methods and computer program products according to multiple embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, a program segment or a portion of instructions, which contains one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings; for example, two consecutive blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or acts, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
The embodiments of the present invention have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used here were chosen to best explain the principles of the embodiments, their practical application or their technical improvement over the technology in the market, or to enable others of ordinary skill in the art to understand the embodiments disclosed here. The scope of the present invention is defined by the appended claims.
Claims (10)
- 1. A method for controlling UAV landing, characterized by including: when the UAV reaches a designated position, acquiring an image containing a landing marker region, the landing marker region being a region that marks a preset landing position with a set color; determining the current location of the landing marker region in the image; adjusting the horizontal displacement of the UAV according to the current location; and controlling the UAV to descend onto the landing position.
- 2. The method according to claim 1, characterized in that the designated position is: the position of the UAV when the latitude and longitude collected by a positioning module arranged in the UAV are the same as the latitude and longitude of the landing position.
- 3. The method according to claim 2, characterized in that the designated position is: a position at which the current flight altitude of the UAV is greater than or equal to a preset minimum descent altitude.
- 4. The method according to claim 3, characterized in that the method further includes: determining the minimum descent altitude of the UAV according to the size of the landing marker region and the error range of the positioning module; and, if the current flight altitude of the UAV is lower than the minimum descent altitude, controlling the UAV to ascend.
- 5. The method according to claim 1, characterized in that determining the current location of the landing marker region in the image includes: identifying the landing marker region in the image according to the set color; and determining the position of the center point of the landing marker region in the image, and taking the position of the center point of the landing marker region in the image as the current location.
- 6. The method according to claim 5, characterized in that identifying the landing marker region in the image according to the set color includes: calculating the HSV value of each pixel in the image; identifying, according to the HSV values, the pixels whose color is the set color; and determining the landing marker region from the pixels of the set color.
- 7. The method according to claim 1, characterized in that adjusting the horizontal displacement of the UAV according to the current location includes: calculating a first displacement from the current location to the center point of the image; and adjusting a second, horizontal displacement of the UAV according to the first displacement.
- 8. An apparatus for controlling UAV landing, characterized by including: an image acquisition module for acquiring, when the UAV reaches a designated position, an image containing a landing marker region, the landing marker region being a region that marks a preset landing position with a set color; a current-location determining module for determining the current location of the landing marker region in the image; an adjustment module for adjusting the horizontal displacement of the UAV according to the current location; and a landing module for controlling the UAV to descend onto the landing position.
- 9. A UAV, characterized by including the apparatus according to claim 8.
- 10. A UAV, characterized by including a processor and a memory, the memory being used to store instructions, the instructions being used to control the processor to perform the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711421045.6A CN108216624A (en) | 2017-12-25 | 2017-12-25 | A kind of method, apparatus and unmanned plane for controlling unmanned plane landing |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711421045.6A CN108216624A (en) | 2017-12-25 | 2017-12-25 | A kind of method, apparatus and unmanned plane for controlling unmanned plane landing |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108216624A true CN108216624A (en) | 2018-06-29 |
Family
ID=62647867
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711421045.6A Pending CN108216624A (en) | 2017-12-25 | 2017-12-25 | A kind of method, apparatus and unmanned plane for controlling unmanned plane landing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108216624A (en) |
-
2017
- 2017-12-25 CN CN201711421045.6A patent/CN108216624A/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2016119455A (en) * | 2016-05-20 | 2017-11-23 | Михаил Кириллович Нараленков | SYSTEM OF INDICATING TAKEOFF AND LANDING OF AIRCRAFT |
CN106444824A (en) * | 2016-05-23 | 2017-02-22 | 重庆零度智控智能科技有限公司 | UAV (unmanned aerial vehicle), and UAV landing control method and device |
CN106143931A (en) * | 2016-08-04 | 2016-11-23 | 北京京东尚科信息技术有限公司 | For the methods, devices and systems guiding unmanned plane to land |
CN106371447A (en) * | 2016-10-25 | 2017-02-01 | 南京奇蛙智能科技有限公司 | Controlling method for all-weather precision landing of unmanned aerial vehicle |
CN107403450A (en) * | 2017-02-25 | 2017-11-28 | 天机智汇科技(深圳)有限公司 | A kind of method and device of unmanned plane pinpoint landing |
KR101756380B1 (en) * | 2017-03-06 | 2017-07-10 | 한국지질자원연구원 | Detection methods of porphyry copper deposits using malachite hyperspectral imagery |
CN107450590A (en) * | 2017-08-07 | 2017-12-08 | 深圳市科卫泰实业发展有限公司 | A kind of unmanned plane auxiliary landing method |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109298387A (en) * | 2018-08-20 | 2019-02-01 | 京信通信系统(中国)有限公司 | Search and rescue localization method, device, computer storage medium and equipment |
CN110871893A (en) * | 2018-09-03 | 2020-03-10 | 中强光电股份有限公司 | UAV landing system and landing method thereof |
CN109299311A (en) * | 2018-09-18 | 2019-02-01 | 甘肃启远智能科技有限责任公司 | Photovoltaic module longitude and latitude data configuration method and device |
CN109521791A (en) * | 2018-09-28 | 2019-03-26 | 易瓦特科技股份公司 | Identification method and device based on earth station |
CN111061300A (en) * | 2018-09-28 | 2020-04-24 | 易瓦特科技股份公司 | Method and device for dynamically setting ground identification |
CN114706405A (en) * | 2018-12-20 | 2022-07-05 | 深圳市道通智能航空技术股份有限公司 | Unmanned aerial vehicle landing obstacle avoidance method and device and unmanned aerial vehicle |
CN110989687B (en) * | 2019-11-08 | 2021-08-10 | 上海交通大学 | Unmanned aerial vehicle landing method based on nested square visual information |
CN110989687A (en) * | 2019-11-08 | 2020-04-10 | 上海交通大学 | A UAV Landing Method Based on Nested Square Visual Information |
CN111457874A (en) * | 2020-04-29 | 2020-07-28 | 厦门大学 | Refuse landfill displacement change monitoring system and control method thereof |
CN111457874B (en) * | 2020-04-29 | 2021-08-31 | 厦门大学 | Displacement change monitoring system of waste landfill and its control method |
CN111752288A (en) * | 2020-06-19 | 2020-10-09 | 中国科学院地理科学与资源研究所 | A method and system for landing multiple drones through obstacles |
WO2022261901A1 (en) * | 2021-06-17 | 2022-12-22 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle landing control method and apparatus, unmanned aerial vehicle, system, and storage medium |
RU2792974C1 (en) * | 2022-04-01 | 2023-03-28 | Автономная некоммерческая организация высшего образования "Университет Иннополис" | Method and device for autonomous landing of unmanned aerial vehicle |
CN116466741A (en) * | 2023-03-29 | 2023-07-21 | 江苏科技大学 | Unmanned aerial vehicle parking apron automatic landing method based on monocular ranging |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108216624A (en) | Method and apparatus for controlling UAV landing, and UAV | |
US12236612B2 (en) | Methods and system for multi-target tracking | |
CN112258567B (en) | Visual positioning method and device for object grabbing point, storage medium and electronic equipment | |
US11454988B2 (en) | Systems and methods for automated landing of a drone | |
CN110825101B (en) | An autonomous landing method of unmanned aerial vehicle based on deep convolutional neural network | |
EP3586314B1 (en) | Improved forest surveying | |
CN107194399B (en) | A method, system and unmanned aerial vehicle for visual calibration | |
CN106774431B (en) | Method and device for planning air route of surveying and mapping unmanned aerial vehicle | |
KR102154950B1 (en) | Method and apparatus for matching image captured by unmanned air vehicle with map, cadaster, or satellite image | |
CN110826549A (en) | Inspection robot instrument image identification method and system based on computer vision | |
CN110991207A (en) | Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition | |
CN106371447A (en) | Controlling method for all-weather precision landing of unmanned aerial vehicle | |
CN109767637A (en) | Method and device for identifying and processing countdown signal lights | |
CN107544550A (en) | A kind of Autonomous Landing of UAV method of view-based access control model guiding | |
CN109357673A (en) | Vision navigation method and device based on image | |
CN105447868B (en) | A kind of Small and micro-satellite is taken photo by plane the automatic check methods of data | |
CN108446707B (en) | Remote sensing image aircraft detection method based on key point screening and DPM confirmation | |
KR102269792B1 (en) | Method and apparatus for determining altitude for flying unmanned air vehicle and controlling unmanned air vehicle | |
CN110914780A (en) | Action plan making system, method and program for unmanned aerial vehicle | |
CN106292126A (en) | A kind of intelligence aerial survey flight exposal control method, unmanned aerial vehicle (UAV) control method and terminal | |
KR102364615B1 (en) | Method and apparatus for determining route for flying unmanned air vehicle and controlling unmanned air vehicle | |
CN110030928A (en) | The method and system of space object positioning and measurement based on computer vision | |
WO2021056139A1 (en) | Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium | |
CN109145902B (en) | Method for recognizing and positioning geometric identification by using generalized characteristics | |
CN113365382B (en) | Light control method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20180629 |