CN120317019B - Scheduling method and device of simulation unmanned aerial vehicle, electronic equipment and storage medium - Google Patents
- Publication number: CN120317019B (application CN202510775033.1A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2111/00—Details relating to CAD techniques
- G06F2111/02—CAD in a network environment, e.g. collaborative CAD or distributed simulation
Abstract
The application provides a scheduling method and apparatus for a simulated unmanned aerial vehicle, an electronic device, and a storage medium. The method comprises: constructing a simulation virtual scene for a target area based on a simulation platform, wherein the simulation virtual scene comprises a simulated geographic environment and at least one simulated unmanned aerial vehicle, and each simulated unmanned aerial vehicle has simulated remaining power information; sending the simulated remaining power information of each simulated unmanned aerial vehicle to a server; receiving a target flight instruction sent by the server, wherein the target flight instruction carries identification information and flight target position information of a target simulated unmanned aerial vehicle, and the target simulated unmanned aerial vehicle is screened from the at least one simulated unmanned aerial vehicle by the server based on a user flight instruction and the simulated remaining power information of each simulated unmanned aerial vehicle; and controlling the target simulated unmanned aerial vehicle to execute a target flight task according to the flight target position information. Because each simulated unmanned aerial vehicle has simulated power information, the matching between the simulated scheduling and the real scheduling of the simulated unmanned aerial vehicle can be improved, thereby improving the verification effect.
Description
Technical Field
The application relates to the technical field of unmanned aerial vehicle control, in particular to a scheduling method, a scheduling device, electronic equipment and a storage medium of a simulation unmanned aerial vehicle.
Background
With the rapid adoption of unmanned aerial vehicle technology in fields such as power-line inspection, disaster relief, and geographic surveying, simulation verification has become a key step before actual deployment, and its technical limitations have become increasingly prominent. For example, when scheduling a cluster of simulated unmanned aerial vehicles, the drone that should execute a task is typically selected by a shortest-path-first or polling allocation method, whereas the scheduling of real unmanned aerial vehicles must take more factors into account. As a result, the scheduling algorithm behaves differently in the simulated scene than in the real scene, which degrades the verification effect.
Disclosure of Invention
In view of the above, the present application provides a scheduling method and apparatus for a simulated unmanned aerial vehicle, an electronic device, and a storage medium, which can improve the matching between the simulated scheduling and the real scheduling of the simulated unmanned aerial vehicle and thereby improve the verification effect.
According to a first aspect of the present application, there is provided a scheduling method of a simulated unmanned aerial vehicle, applied to a terminal device, the method comprising:
constructing a simulation virtual scene aiming at a target area based on a simulation platform, wherein the simulation virtual scene comprises a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each simulation unmanned aerial vehicle has simulation residual electric quantity information;
sending the simulation residual capacity information of each simulation unmanned aerial vehicle to a server;
Receiving a target flight instruction sent by the server, wherein the target flight instruction carries identification information and flight target position information of a target simulation unmanned aerial vehicle, and the target simulation unmanned aerial vehicle is obtained by the server by screening the at least one simulation unmanned aerial vehicle based on the user flight instruction and the simulated remaining power information of each simulation unmanned aerial vehicle;
And controlling the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information.
According to a second aspect of the present application, there is provided a scheduling method of a simulated unmanned aerial vehicle, applied to a server, the method comprising:
acquiring a user flight instruction, and receiving simulation residual electric quantity information of at least one simulation unmanned aerial vehicle and identification information of each simulation unmanned aerial vehicle, which are sent by a terminal device;
determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction;
Generating a target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction;
And the target flight instruction is used for indicating the target simulation unmanned aerial vehicle to execute a target flight task.
According to a third aspect of the present application, there is provided a scheduling apparatus for a simulated unmanned aerial vehicle, the apparatus comprising:
The scene construction module is used for constructing a simulation virtual scene aiming at the target area based on the simulation platform, wherein the simulation virtual scene comprises a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each simulation unmanned aerial vehicle has simulation residual electric quantity information;
The information sending module is used for sending the simulation residual capacity information of each simulation unmanned aerial vehicle to a server;
The instruction receiving module is used for receiving a target flight instruction sent by the server, wherein the target flight instruction carries identification information and flight target position information of a target simulation unmanned aerial vehicle, and the target simulation unmanned aerial vehicle is obtained by the server by screening the at least one simulation unmanned aerial vehicle based on the user flight instruction and the simulated remaining power information of each simulation unmanned aerial vehicle;
And the flight control module is used for controlling the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information.
According to a fourth aspect of the present application, there is provided a scheduling apparatus for a simulated unmanned aerial vehicle, the apparatus comprising:
The information receiving module is used for acquiring a user flight instruction and receiving the simulated remaining power information of at least one simulation unmanned aerial vehicle and the identification information of each simulation unmanned aerial vehicle sent by the terminal equipment;
The unmanned aerial vehicle screening module is used for determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction;
the instruction generation module is used for generating a target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction;
the instruction sending module is used for sending the target flight instruction to the terminal equipment, and the target flight instruction is used for indicating the target simulation unmanned aerial vehicle to execute a target flight task.
According to a fifth aspect of the present application, there is provided an electronic device comprising a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device is in operation, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the scheduling method of the simulated unmanned aerial vehicle according to the first or second aspect.
According to a sixth aspect of the present application, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the scheduling method of the simulated drone of the first or second aspect described above.
According to the scheduling method and apparatus, electronic device, and storage medium for a simulated unmanned aerial vehicle provided by the application, each simulation unmanned aerial vehicle has simulated remaining power information, and the simulated remaining power information of each simulation unmanned aerial vehicle is sent to the server, so that the server can make a task decision according to the user flight instruction and the simulated remaining power information and thereby screen, from the at least one simulation unmanned aerial vehicle, the optimal target simulation unmanned aerial vehicle capable of executing the target flight task. The screening of the target simulation unmanned aerial vehicle therefore depends not only on the user's flight instruction but also on the simulated remaining power information of each simulation unmanned aerial vehicle, so that the simulation result better matches the real scene and the simulation verification effect is improved.
The foregoing objects, features and advantages of the disclosure will be more readily apparent from the following detailed description of the preferred embodiments taken in conjunction with the accompanying drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings required for the embodiments are briefly described below. These drawings are incorporated in and constitute a part of the specification; they show embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be understood that the following drawings show only certain embodiments of the present disclosure and are therefore not to be considered limiting of its scope; a person of ordinary skill in the art may derive other related drawings from these drawings without creative effort.
FIG. 1 is a schematic architecture diagram of a scheduling system for a simulated drone according to an exemplary embodiment of the present application;
FIG. 2 is a flow chart illustrating a method of scheduling a simulated drone according to an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of constructing a simulated virtual scene according to an exemplary embodiment of the application;
FIG. 4 is a flow chart illustrating another method of simulated drone scheduling according to an exemplary embodiment of the present application;
FIG. 5 is a schematic diagram illustrating an interactive process of a simulated unmanned aerial vehicle scheduling method according to an exemplary embodiment of the present application;
FIG. 6 is a functional block diagram of an emulated drone scheduler according to an exemplary embodiment of the present application;
FIG. 7 is a functional block diagram of another simulated drone scheduler shown in an exemplary embodiment of the application;
Fig. 8 is a schematic structural view of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the application. The term "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" covers three cases: A alone, both A and B, and B alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the set consisting of A, B, and C.
Referring to fig. 1, a schematic architecture diagram of a scheduling system of a simulation unmanned aerial vehicle according to an embodiment of the present application is shown. As shown in fig. 1, the simulated drone scheduling system may include a terminal device 100 and a server 200 capable of communicating with the terminal device 100. The terminal device 100 may include a mobile device, a user terminal, an in-vehicle device, a computing device, a wearable device, and the like. For example, the terminal device 100 may include a tablet computer, a cell phone, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or the like.
The server 200 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud storage, big data, and artificial intelligence platforms, which are not limited herein.
For example, the terminal device 100 may be installed with a simulation platform that may simulate real-world behavior and rules using computer simulation techniques based on mathematical models and algorithms. The server 200 may be deployed with a large language model, where the large language model is an artificial intelligence system based on deep learning, and through learning on massive text data, natural language can be understood and generated, and multiple complex tasks can be completed.
Referring to fig. 2, a flowchart of a scheduling method of a simulation unmanned aerial vehicle according to an embodiment of the present application is shown, and the method is applied to a terminal device 100, and includes the following steps S101 to S104:
S101, constructing a simulation virtual scene aiming at a target area based on a simulation platform, wherein the simulation virtual scene comprises a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each simulation unmanned aerial vehicle has simulation residual electric quantity information.
The interface of the simulation platform is consistent with the interface of the real unmanned aerial vehicle, and an algorithm verified on the simulation unmanned aerial vehicle can be deployed on the real unmanned aerial vehicle so as to correspondingly control the real unmanned aerial vehicle.
In some embodiments, each of the simulated unmanned aerial vehicles is integrated with a battery simulation model, and the method further comprises determining simulated residual capacity information of the simulated unmanned aerial vehicle according to preset discharge interval parameters by using the battery simulation model for each of the simulated unmanned aerial vehicles.
The battery simulation model may be implemented by battery simulator code, for example. The preset discharge interval parameter may be obtained from multiple experiments and is not specifically limited here. Determining the simulated remaining power information of the simulated unmanned aerial vehicle according to the preset discharge interval parameter allows the simulated remaining power to decay over time, thereby improving the simulation accuracy of the remaining power.
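As a minimal sketch of such battery simulator code — with an assumed linear discharge rate standing in for the preset discharge interval parameter, which the patent does not specify — the simulated remaining power can be decayed with elapsed simulation time:

```python
import dataclasses

@dataclasses.dataclass
class BatterySimulator:
    """Hypothetical battery simulation model: the simulated remaining
    power decays over time according to a preset discharge parameter."""
    capacity_pct: float = 100.0        # simulated remaining power, percent
    discharge_pct_per_s: float = 0.05  # assumed discharge rate (would come from experiments)

    def step(self, dt_s: float) -> float:
        # Decay the simulated remaining power with elapsed time; clamp at zero.
        self.capacity_pct = max(0.0, self.capacity_pct - self.discharge_pct_per_s * dt_s)
        return self.capacity_pct

battery = BatterySimulator()
battery.step(600)  # after 600 simulated seconds, roughly 70% remains
```

The linear model is only illustrative; an implementation calibrated from discharge experiments could substitute any decay curve inside `step` without changing the interface.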
In some embodiments, the method may further comprise the following (a) - (b):
(a) Aiming at any simulation unmanned aerial vehicle, generating a battery fault simulation instruction by using the battery simulation model;
(b) And responding to the battery fault simulation instruction, and setting the simulation residual capacity information of the simulation unmanned aerial vehicle to zero.
For example, the voltage or current of the battery simulation model may be monitored, and when the simulated battery is determined to be abnormal, an alarm prompt message is generated and the simulated remaining power is set to zero (cleared). A simulated unmanned aerial vehicle whose simulated remaining power is zero will then no longer participate in scheduling. The form of the alarm prompt information is likewise not limited; it may be, for example, a buzzer alarm or a voice alarm.
In the embodiment, the battery simulation model is utilized to simulate not only the residual capacity but also the battery fault, so that the simulation environment is more matched with the real scene.
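A minimal sketch of this fault path, under assumed voltage thresholds (the patent does not state the abnormality criterion): an out-of-range voltage triggers the battery-fault simulation instruction, which zeroes the simulated remaining power and produces an alarm message, removing the drone from scheduling.

```python
def check_battery_fault(voltage_v: float, capacity_pct: float,
                        v_min: float = 3.0, v_max: float = 4.35):
    """Hypothetical fault check: if the simulated battery voltage leaves
    the assumed normal range, zero the simulated remaining power and
    return an alarm prompt message."""
    if voltage_v < v_min or voltage_v > v_max:
        return 0.0, "ALARM: simulated battery abnormal; drone excluded from scheduling"
    return capacity_pct, None

# Abnormal voltage: simulated remaining power is cleared to zero.
capacity, alarm = check_battery_fault(voltage_v=2.4, capacity_pct=55.0)
```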
In some embodiments, each simulated drone has a virtual port and an instance identifier, where different simulated drones have different virtual ports and different instance identifiers. By setting different ports, the simulation environment can avoid port conflicts. In addition, by configuring a different instance identifier for each simulated drone, the corresponding simulated drone can be started after each system start-up, and the pose of the restarted simulated drone matches its previous pose. For example, after each simulated drone's pose is configured, the pose is retained after the system restarts. For another example, if the simulated power of simulated drone A is 80% before shutdown, then after the system is started again, the simulated power of simulated drone A is still 80%, and no reconfiguration is needed.
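The per-instance scheme above can be sketched as follows — the base port value, the JSON persistence, and the field names are all assumptions for illustration; the patent only requires that ports and instance identifiers be distinct and that pose and simulated power survive a restart:

```python
import json, os, tempfile

BASE_PORT = 14550  # assumed base port (MAVLink-style convention)

def make_fleet_config(n_drones: int) -> dict:
    # Each simulated drone gets a distinct virtual port (avoids port
    # conflicts) and a distinct instance identifier (the dict key).
    return {f"drone-{i}": {"port": BASE_PORT + i,
                           "pose": [0.0, 0.0, 0.0],
                           "charge_pct": 100.0}
            for i in range(n_drones)}

path = os.path.join(tempfile.gettempdir(), "sim_fleet.json")
cfg = make_fleet_config(3)
cfg["drone-0"]["charge_pct"] = 80.0   # e.g. drone A at 80% before shutdown
with open(path, "w") as f:
    json.dump(cfg, f)                 # persist state at shutdown

with open(path) as f:                 # "restart": state restored unchanged
    restored = json.load(f)
```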
S102, sending the simulation residual capacity information of each simulation unmanned aerial vehicle to a server.
For example, the terminal device 100 may send the simulated residual capacity information of each simulated unmanned aerial vehicle to the server 200, and further carry the identification information of the simulated unmanned aerial vehicle when sending the simulated residual capacity information.
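A possible shape for that message — the JSON structure and field names are hypothetical, since the patent does not define a wire format — pairs each drone's identification information with its simulated remaining power:

```python
import json

def build_telemetry_payload(fleet: dict) -> str:
    """Hypothetical payload: one entry per simulated drone, carrying its
    identification information alongside its simulated remaining power."""
    return json.dumps([{"drone_id": drone_id, "charge_pct": state["charge_pct"]}
                       for drone_id, state in fleet.items()])

payload = build_telemetry_payload({"drone-0": {"charge_pct": 80.0},
                                   "drone-1": {"charge_pct": 45.5}})
```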
And S103, receiving a target flight instruction sent by the server, wherein the target flight instruction carries identification information and flight target position information of a target simulation unmanned aerial vehicle, and the target simulation unmanned aerial vehicle is obtained by the server by screening the at least one simulation unmanned aerial vehicle based on the user flight instruction and the simulated remaining power information of each simulation unmanned aerial vehicle.
It may be appreciated that, after receiving the simulated remaining power information of each simulated unmanned aerial vehicle, the server 200 may determine, from the at least one simulated unmanned aerial vehicle, a target simulated unmanned aerial vehicle for executing the target task according to a user flight instruction and the simulated remaining power information of each simulated unmanned aerial vehicle, and this specific determination process will be described in detail later. In addition, after determining the target simulation unmanned aerial vehicle, the server 200 also generates a target flight instruction for instructing the target simulation unmanned aerial vehicle to execute the target task, and transmits the target flight instruction to the terminal device 100.
S104, controlling the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information.
After receiving the target flight command sent by the server 200, the terminal device 100 may control the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information. For example, the target flight mission may include flying from a first target location (e.g., a current location) to a second target location indicated by target location information.
According to the scheduling method of the simulation unmanned aerial vehicle, each simulation unmanned aerial vehicle has the simulation residual capacity information, and the simulation residual capacity information of each simulation unmanned aerial vehicle is sent to the server 200. The server 200 may perform task decision according to the user flight command and the simulated residual power information, so as to screen out an optimal target simulated unmanned aerial vehicle capable of executing the target flight task from at least one simulated unmanned aerial vehicle. Therefore, the screening of the target simulation unmanned aerial vehicle not only depends on the flight instruction of the user, but also combines the simulation residual capacity information of each simulation unmanned aerial vehicle, so that the simulation result is more matched with the real scene, and the simulation verification effect is improved.
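One plausible screening rule the server could apply — the charge threshold and the nearest-first tie-break are assumptions, as the patent leaves the decision criterion to the server — is to drop drones whose simulated remaining power is insufficient and pick the closest of the rest:

```python
import math

def select_target_drone(drones, target_xy, min_charge_pct=20.0):
    """Hypothetical screening: among drones with sufficient simulated
    remaining power, choose the one closest to the flight target position."""
    candidates = [d for d in drones if d["charge_pct"] >= min_charge_pct]
    if not candidates:
        return None  # no drone can execute the target flight task
    return min(candidates,
               key=lambda d: math.dist(d["pos"], target_xy))["drone_id"]

chosen = select_target_drone(
    [{"drone_id": "A", "charge_pct": 15.0, "pos": (0.0, 0.0)},
     {"drone_id": "B", "charge_pct": 80.0, "pos": (5.0, 5.0)},
     {"drone_id": "C", "charge_pct": 60.0, "pos": (1.0, 1.0)}],
    target_xy=(0.0, 0.0))
# "A" is excluded on charge despite being nearest; "C" beats "B" on distance
```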
Referring to fig. 3, in some embodiments, for the step S101, when constructing a simulation virtual scene for a target area based on a simulation platform, the following steps S1011 to S1013 may be specifically included:
S1011, acquiring satellite image data, elevation data and a vector map for the target area, and fusing the satellite image data, the elevation data and the vector map to generate three-dimensional fused map data for the target area.
Here, the three-dimensional fused map data is obtained by fusing the satellite image data, the elevation data, and the two-dimensional vector map of the target area, so that the accuracy of the three-dimensional map data can be improved. The elevation data provides the height of points on the earth's surface relative to a vertical reference (typically sea level). Further, semantic recognition can be performed on key geographic features in the three-dimensional fused map data to obtain semantic tags of the geographic features. For example, key geographic features may be identified from the semantic tags of an open street map (OpenStreetMap, OSM). Illustratively, the key geographic features may include mountains, rivers, roads, and the like.
S1012, performing simulation processing on the three-dimensional fused map data by using the simulation platform to obtain the simulated ground environment, and performing simulation processing on at least one invoked unmanned aerial vehicle model by using the simulation platform to obtain the at least one simulated unmanned aerial vehicle.
Here, when the simulation unmanned aerial vehicle is generated, the designed unmanned aerial vehicle model can be called to perform simulation processing, so that the simulation efficiency can be improved.
In some embodiments, before the simulation platform performs simulation processing on the three-dimensional fused map data to obtain the simulated ground environment, the three-dimensional fused map data may be adjusted, wherein the adjustment comprises at least one of element path adjustment, scaling adjustment, and coordinate system adjustment.
For example, when three-dimensional fused map data is imported into the simulation platform, the element model in the three-dimensional fused map data may be adjusted so that the path length in the simulation platform coincides with the path length in the real map. For example, the flight path and position coordinates of the simulated unmanned aerial vehicle in the simulated geographic environment need to be consistent with those of the real unmanned aerial vehicle on the real map. For another example, if the position path between elements (models) in the three-dimensional fusion map data is changed, the path of the models needs to be adjusted so that the path between each element in the simulation environment coincides with the path between corresponding elements in the real map.
Scaling adjustment can restore the simulated geographic environment to a 1:1 correspondence with the real map. For example, if the captured three-dimensional fused map data is too small, it needs to be scaled up in the simulation platform so that the simulation environment map matches the real map.
In addition, the automatic conversion between the geographic coordinates and the local coordinates of the simulation platform can be realized by adopting preset software or algorithm.
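The patent does not name the conversion software or algorithm; as one common approach, a small-area equirectangular approximation converts geographic coordinates into local east/north offsets (meters) relative to the simulation-scene origin:

```python
import math

EARTH_R = 6_378_137.0  # WGS-84 equatorial radius, meters

def geo_to_local(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Small-area approximation: geographic coordinates -> local
    east/north offsets in meters from the simulation-scene origin."""
    dlat = math.radians(lat_deg - origin_lat_deg)
    dlon = math.radians(lon_deg - origin_lon_deg)
    north = dlat * EARTH_R
    east = dlon * EARTH_R * math.cos(math.radians(origin_lat_deg))
    return east, north

# 0.001 degrees of latitude is roughly 111 meters of northing.
east, north = geo_to_local(39.9010, 116.4000, 39.9000, 116.4000)
```

This approximation is adequate only over small target areas; a production pipeline would typically use a proper projection library instead.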
It should be noted that, the adjustment of the three-dimensional fusion map data may be implemented by manual adjustment or may be implemented by automatic adjustment, which is not limited specifically.
S1013, constructing the simulation virtual scene based on the simulation ground environment and the at least one simulation unmanned aerial vehicle.
The simulation virtual scene in the embodiment of the application can contain other factors such as simulation airflow, simulation illumination, sensor noise and the like besides the simulation ground environment and the at least one simulation unmanned aerial vehicle, so that the scene restoration degree is improved, and the simulation environment is more matched with the real environment.
In some embodiments, in order to further improve the accuracy of the three-dimensional fused map data, after the three-dimensional fused map data for the target area is generated, the method further comprises: performing parameterized modeling on a target element in the three-dimensional fused map data to generate a three-dimensional model corresponding to the target element, and embedding real physical attribute information into the three-dimensional model to obtain a target three-dimensional model, wherein the target three-dimensional model carries a semantic tag. The target element may include infrastructure that needs to be identified or avoided during cruising; for example, the infrastructure may include a utility pole, and may differ depending on the cruising scenario.
For example, key infrastructure (e.g., power-line poles) can be parametrically modeled to generate a three-dimensional model with millimeter-scale accuracy, real physical attributes can be embedded, and semantic tags can be embedded when the model is exported, so as to enhance the semantic recognition capability of the model. For example, the real physical attributes may include an additional 5 cm redundant buffer layer for the collision volume, material stiffness (steel elastic modulus 200 GPa), and friction coefficient (concrete surface 0.6).
In the embodiment of the application, the precision of the simulation result can be improved by carrying out parameterization modeling on the target element in the three-dimensional fusion map data and embedding the real physical attribute information. Taking electric power inspection as an example, in the related art, because a telegraph pole is generally simplified into a cylindrical model in simulation, collision detection parameters of the telegraph pole are inconsistent with real physical characteristics, so that the erroneous judgment rate of an unmanned aerial vehicle obstacle avoidance algorithm in a real scene is up to 25%, and by adopting the scheme, the erroneous judgment rate of the unmanned aerial vehicle obstacle avoidance algorithm in the real scene can be greatly reduced.
In some embodiments, in a case where the target area includes a plurality of sub-areas, when the satellite image data, the elevation data, and the vector map are fused, generating three-dimensional fused map data for the target area may include:
for any subarea, fusing satellite image data, elevation data and a vector map corresponding to the subarea to obtain sub-three-dimensional fused map data for the subarea;
and splicing the sub three-dimensional fusion map data corresponding to each sub region respectively by using a least square method to generate three-dimensional fusion map data aiming at the target region.
Therefore, when a plurality of sub-maps are spliced, combining least squares fitting can eliminate multi-source data splicing errors and improve the accuracy of the three-dimensional fused map data; for example, the root-mean-square (RMS) positioning error can be kept below 0.3 meters.
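To illustrate the least-squares splicing step in its simplest form — assuming, for illustration, that the misalignment between two overlapping sub-map tiles is a pure translation estimated from tie points, whereas a full implementation would fit a richer transform — the closed-form least-squares solution for a translation is just the mean residual:

```python
def lsq_translation(pts_a, pts_b):
    """Least-squares translation aligning tile A's tie points to tile B's.
    For a pure-translation model, the optimum is the mean point residual."""
    n = len(pts_a)
    dx = sum(b[0] - a[0] for a, b in zip(pts_a, pts_b)) / n
    dy = sum(b[1] - a[1] for a, b in zip(pts_a, pts_b)) / n
    return dx, dy

# The same three tie points observed in both tiles, with measurement noise.
tile_a = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
tile_b = [(0.3, -0.1), (10.2, 0.0), (0.1, 9.9)]
dx, dy = lsq_translation(tile_a, tile_b)
# Shifting tile A by (dx, dy) before splicing minimizes the squared residuals.
```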
In some embodiments, while the target simulation unmanned aerial vehicle executes the target flight task, an environment image of the environment in which it is flying may be acquired, target detection may be performed on the environment image to obtain a target detection result, and the target detection result may be sent to the server. For example, the environment image may be processed by a target detection model, and the target detection result may include vehicles, birds, and the like. In this way, the server may adjust the flight instruction according to the target detection result, for example, changing it to a hover or descend instruction.
In other embodiments, in order to provide more information to improve the decision accuracy of the server, a multi-modal large language model may be invoked to perform picture analysis on the target detection result, so as to generate picture description information for the target detection result. For example, if the target detection result is that a vehicle is detected, the corresponding picture description information "this is a white vehicle with a broken tire" may be generated based on the target detection result. Optionally, the target detection result and the picture description information corresponding to the target detection result are also sent to the server, so that the server can judge, according to the target detection result and the picture description information, whether the flight instruction for the target simulation unmanned aerial vehicle needs to be changed.
Thus, in some embodiments, the method further comprises invoking a multi-modal large language model to perform a visual analysis of the target detection results, generating visual descriptive information for the target detection results;
the sending the target detection result to the server includes:
And sending the target detection result and the picture description information corresponding to the target detection result to the server.
It should be noted that, in the process of executing the target flight task by the target simulation unmanned aerial vehicle, the simulation unmanned aerial vehicle may further perform three-dimensional trajectory re-planning according to the target detection result to avoid the obstacle, in addition to feeding back the target detection result and the picture description information for the target detection result to the server.
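The three-dimensional trajectory re-planning mentioned above can be sketched as follows. This is an illustrative placeholder, not the patented planner: the detour rule (lift any waypoint inside the obstacle's footprint above its height plus a safety margin) and all field names are assumptions.

```python
# Toy 3-D trajectory re-planning around a detected obstacle: waypoints
# whose horizontal position falls within the obstacle's radius are raised
# to clear its height by a safety margin.

def replan_around(waypoints, obstacle, margin=5.0):
    """Return waypoints adjusted to climb over a detected obstacle.

    waypoints: list of (x, y, z) tuples.
    obstacle: dict with 'pos' (x, y), 'radius', and 'height'.
    """
    ox, oy = obstacle["pos"]
    out = []
    for x, y, z in waypoints:
        if (x - ox) ** 2 + (y - oy) ** 2 <= obstacle["radius"] ** 2:
            # Inside the obstacle footprint: climb over it with a margin.
            z = max(z, obstacle["height"] + margin)
        out.append((x, y, z))
    return out
```

For a straight path at 10 m altitude crossing a 30 m obstacle, only the affected waypoint is lifted (here to 35 m); the rest of the trajectory is unchanged.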
Referring to fig. 4, a flowchart of a scheduling method of a simulation unmanned aerial vehicle according to another embodiment of the present application is shown, and the method is applied to a server 200, and includes the following steps S201 to S204:
S201, a user flight instruction is obtained, and the simulated residual capacity information of at least one simulation unmanned aerial vehicle and the identification information of each simulation unmanned aerial vehicle, which are sent by the terminal device, are received.
The identification information of each simulation unmanned aerial vehicle is the identity information of each simulation unmanned aerial vehicle and is used for distinguishing different simulation unmanned aerial vehicles.
Illustratively, a user flight instruction may be obtained through a user interaction interface, where the user flight instruction is a flight instruction issued by a user. Of course, the user flight instruction may also be obtained by voice input. For example, the obtained user flight command may be "dispatch an unmanned aerial vehicle with electricity greater than 60% near north latitude 31.2°", or "dispatch an unmanned aerial vehicle with electricity greater than 70% near north latitude 31.2° to cruise to the XX target point". In the embodiment of the application, the user flight instruction is a natural language instruction.
S202, determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight command.
Illustratively, a large language model is deployed on the server 200, and a target simulation unmanned aerial vehicle may be determined from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction by using the large language model.
In some embodiments, when determining the target simulated unmanned aerial vehicle from the at least one simulated unmanned aerial vehicle according to the simulated residual capacity information of each simulated unmanned aerial vehicle and the user flight instruction, the method may include the following steps (I) - (III):
(I) And determining a target task range corresponding to the user flight instruction based on the user flight instruction.
Specifically, after the user flight instruction is obtained, the target task range corresponding to the user flight instruction can be determined. For example, the task range may be from north latitude 31.2° to north latitude 33.2°, and may be set according to the actual situation.
(II) determining the maximum flight task range which can be executed by each simulation unmanned aerial vehicle based on simulation residual capacity information of the simulation unmanned aerial vehicle.
By way of example, equation modeling may be employed to determine a maximum task range for the simulated drone, which may be achieved, for example, by the following equation (1):
R = R0 × (1 - ΔSoC) (1)
Wherein, R represents the maximum task radius corresponding to the current simulated residual capacity of the simulation unmanned aerial vehicle, R0 represents the task radius when the simulation unmanned aerial vehicle is fully charged, and ΔSoC represents the difference between the current simulated residual capacity and full capacity; for example, if the current simulated residual capacity of the simulation unmanned aerial vehicle is 80%, ΔSoC is 20%. As can be seen from the above formula (1), the patrol distance is automatically reduced as the SOC decreases (for example, by 48% when the SOC is reduced from 80% to 70%).
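A direct transcription of formula (1) is shown below. Note that the published text elides part of the formula; the linear form R = R0 × (1 - ΔSoC) is an assumption reconstructed from the surrounding definitions, and the function name is illustrative.

```python
# Maximum task radius from the simulated state of charge, assuming the
# linear form R = R0 * (1 - dSoC) of formula (1), where dSoC is the gap
# between the current simulated residual capacity and a full battery.

def max_task_radius(r0, soc):
    """r0: task radius at full charge (km); soc: residual capacity in [0, 1]."""
    d_soc = 1.0 - soc          # e.g. soc = 0.8 gives dSoC = 0.2
    return r0 * (1.0 - d_soc)  # formula (1)
```

For example, with a full-charge radius of 10 km and 80% residual capacity, the maximum task radius is 8 km.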
(III) determining a simulated unmanned aerial vehicle having a maximum flight mission range greater than the target mission range as a candidate simulated unmanned aerial vehicle, and determining the target simulated unmanned aerial vehicle from at least one of the candidate simulated unmanned aerial vehicles.
For example, after determining the maximum task range in which each simulation unmanned aerial vehicle can fly, the simulation unmanned aerial vehicle with the maximum flight task range being greater than the target task range may be determined as a candidate simulation unmanned aerial vehicle, and then the target simulation unmanned aerial vehicle is determined from at least one candidate simulation unmanned aerial vehicle.
In some embodiments, when determining the target simulated drone from at least one of the candidate simulated drones, the following may be included:
Determining evaluation index information of each candidate simulation unmanned aerial vehicle based on the maximum flight task range, the electric quantity consumption information and the emergency degree information of the candidate simulation unmanned aerial vehicle, wherein the maximum flight task range corresponds to a first weight, the electric quantity consumption information corresponds to a second weight, and the emergency degree information corresponds to a third weight;
and determining the target simulation unmanned aerial vehicle from at least one candidate simulation unmanned aerial vehicle based on the evaluation index information of the at least one candidate simulation unmanned aerial vehicle.
The power consumption information refers to the power currently consumed by the candidate simulation unmanned aerial vehicle, for example, if the current remaining power of the candidate simulation unmanned aerial vehicle is 80%, the power consumption information is 20%. The urgency information is used to characterize the urgency of the flight mission.
Specifically, the candidate simulation unmanned aerial vehicle with the optimal evaluation index information can be determined as the target simulation unmanned aerial vehicle.
Here, the evaluation index information of each candidate simulation unmanned aerial vehicle may be determined by the following formula (2):
M = α × maximum flight task range + β × electric quantity consumption information + γ × emergency degree information (2)
Wherein M represents evaluation index information of the candidate simulation unmanned aerial vehicle, alpha represents a first weight, beta represents a second weight, and gamma represents a third weight. When the M value is minimum, the evaluation index information is optimal.
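The weighted-sum selection of formula (2) can be sketched as below. The weight values and candidate data are illustrative assumptions; the minimum-M convention follows the text.

```python
# Formula (2): M = alpha*max_range + beta*power_consumed + gamma*urgency.
# The candidate with the minimum M is treated as optimal, per the text.

def evaluation_index(drone, alpha, beta, gamma):
    """Evaluation index M for one candidate simulation drone."""
    return (alpha * drone["max_range"]
            + beta * drone["power_consumed"]
            + gamma * drone["urgency"])

def pick_target(candidates, alpha, beta, gamma):
    """Return the candidate drone with the minimum evaluation index M."""
    return min(candidates, key=lambda d: evaluation_index(d, alpha, beta, gamma))
```

With illustrative weights (α=1, β=10, γ=1), a candidate with a smaller range surplus and slightly higher consumed power can still score better than one with a large surplus, since the minimum-M rule trades the terms off against each other.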
Optionally, the simulation virtual scene has a real-time simulation wind speed, the first weight, the second weight and the third weight are determined based on the real-time simulation wind speed, and each weight can be dynamically adjusted according to a simulation environment, so that the determination accuracy of the target simulation unmanned aerial vehicle is improved.
In other embodiments, the user flight instruction includes a delay time. In this case, when determining, for each simulation unmanned aerial vehicle, the maximum flight task range that the simulation unmanned aerial vehicle can execute based on its simulated residual capacity information, the determination may be made based on the simulated residual capacity information that the simulation unmanned aerial vehicle will have after the delay time has elapsed.
For example, if the user flight command is "send the simulation unmanned aerial vehicle near north latitude 31.2° to the XX target point after 3 hours", the behavior of each simulation unmanned aerial vehicle within those 3 hours needs to be considered. A simulation unmanned aerial vehicle may currently have little residual power but start charging after 1 hour, so that it may be fully charged after 3 hours, and can therefore still be listed as a candidate for executing the target task. In this way, delayed takeoff is realized through the timing function, which can improve decision flexibility.
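The delayed-dispatch check above can be sketched as follows: project each drone's SOC forward over the delay window before computing its maximum task range. The linear charging model, the charge rate, and the reuse of the linear form R = R0 × (1 - ΔSoC) assumed for formula (1) are all illustrative assumptions.

```python
# Project the simulated SOC over the delay window, then keep only drones
# whose projected maximum task range covers the target task range.

def projected_soc(soc_now, delay_h, charging, charge_rate_per_h=0.5):
    """SOC after `delay_h` hours, capped at 100% when charging."""
    if charging:
        return min(1.0, soc_now + charge_rate_per_h * delay_h)
    return soc_now  # idle on the pad: assume negligible self-discharge

def eligible_after_delay(drones, delay_h, target_range, r0=10.0):
    """IDs of drones whose projected range (linear formula (1)) covers the task."""
    out = []
    for d in drones:
        soc = projected_soc(d["soc"], delay_h, d["charging"])
        if r0 * soc > target_range:  # R = R0 * (1 - dSoC) = R0 * soc
            out.append(d["id"])
    return out
```

A drone at 30% SOC that is charging can become fully charged within 3 hours and qualify, while an idle drone at 50% does not.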
S203, generating a target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction.
Illustratively, the server 200 is deployed with a large language model, and when generating the target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction, the method may include the following steps:
performing format conversion on the user flight instruction by using the large language model to obtain a flight instruction in a target format;
And generating the target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the flight target position information indicated by the flight instruction in the target format.
That is, the user flight instruction is format-converted by using the large language model to obtain a flight instruction in a target format (such as a JSON object format), so that the natural language instruction is parsed; the target flight instruction is then generated based on the identification information of the target simulation unmanned aerial vehicle and the flight target position information indicated by the flight instruction in the target format. In this way, the generated target flight instruction may carry the identification information of the target simulation unmanned aerial vehicle and the flight target position information (also referred to as the waypoint parameters).
For example, for a user flight command of "dispatch an unmanned aerial vehicle near north latitude 31.2° to patrol", after the user flight command is parsed by the large language model, structured constraint conditions (in JSON format) can be dynamically generated and the optimal unmanned aerial vehicle selected in real time, reducing the response time. Specifically, when the large language model analyzes a user flight instruction by using an attention mechanism, a 1 km radius expansion can be applied to a fuzzy geographic description, for example converting north latitude 31.2° into a rectangular geofence of 31.195°-31.205°, which can further improve the accuracy of determining the target task.
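A toy stand-in for this parsing step is sketched below. In the patent the constraint extraction is done by a large language model; the regex-based parser here only illustrates the shape of the structured output, including the ±0.005° latitude expansion into a rectangular geofence. The regex patterns and field names are assumptions.

```python
import re

def parse_flight_instruction(text):
    """Turn a natural-language command into a JSON-style constraint dict.

    Illustrative only: a real system would delegate this to an LLM and
    validate its structured output.
    """
    constraints = {}
    m = re.search(r"north latitude ([\d.]+)", text)
    if m:
        lat = float(m.group(1))
        # Expand the fuzzy point into a rectangular geofence (+/- 0.005 deg).
        constraints["geofence_lat"] = (round(lat - 0.005, 3), round(lat + 0.005, 3))
    m = re.search(r"greater than (\d+)%", text)
    if m:
        constraints["min_soc"] = int(m.group(1)) / 100.0
    return constraints
```

For "dispatch an unmanned aerial vehicle with electricity greater than 60% near north latitude 31.2", this yields a geofence of (31.195, 31.205) and a minimum SOC of 0.6.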
S204, the target flight instruction is sent to the terminal equipment, and the target flight instruction is used for indicating the target simulation unmanned aerial vehicle to execute a target flight task.
After the target flight command is generated, the target flight command can be sent to the terminal equipment so as to control the target simulation unmanned aerial vehicle to execute the target flight task.
In some embodiments, the method further comprises the following:
Receiving a target detection result sent by a terminal device, wherein the target detection result is obtained by performing target detection on the basis of the acquired environment image when the target simulation unmanned aerial vehicle executes the target flight task;
And generating a flight adjustment instruction based on the target detection result, and sending the flight adjustment instruction to the terminal equipment.
Here, when the simulation unmanned aerial vehicle performs a task, an environment image may be acquired and target detection performed on the environment image to obtain a target detection result; after the server 200 receives the target detection result, it may determine, according to the target detection result, whether the target point indicated by the flight target position information has been found. The target point indicated by the flight target position information may be a fixed location, or may be a non-fixed target (e.g., a white vehicle in motion).
Optionally, in order to improve the accuracy of judgment, the method further comprises receiving picture description information, wherein the picture description information is obtained based on the target detection result, and when generating a flight adjustment instruction based on the target detection result, the method can comprise generating the flight adjustment instruction based on the target detection result and the picture description information. For example, if the target flight command is to make the target simulation unmanned aerial vehicle fly above a white car with broken tires, the server 200 may determine whether the target point is found according to the detection result and the picture description information, and send an adjustment command after determining that the target point is found, for example, hover above the target point.
In addition, after the simulated residual power of the target simulation unmanned aerial vehicle is cleared to zero due to a battery fault, another simulation unmanned aerial vehicle may take over and continue to execute the target flight task. Specifically, the fault point of the simulation unmanned aerial vehicle currently executing the task is taken as the starting point, and a new target simulation unmanned aerial vehicle is re-screened according to the screening strategy.
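The fault-handover behavior can be sketched as below: exclude the failed drone, then re-screen among the rest for one whose maximum task range covers the remaining task measured from the failure point. The tie-break rule (largest range margin) and all names are illustrative assumptions.

```python
# Re-screen a replacement drone after a simulated battery fault, using
# the failure point as the new start of the remaining task.

def handover_on_fault(fleet, failed_id, remaining_range):
    """Pick a replacement drone able to cover the rest of the task.

    fleet: list of dicts with 'id' and 'max_range' (km). Returns the
    qualifying drone with the largest range, or None if none qualifies.
    """
    candidates = [d for d in fleet
                  if d["id"] != failed_id and d["max_range"] > remaining_range]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d["max_range"])
```

If drone A fails with 5 km of the task left, a fleet mate with 9 km of range is selected over one with 6 km, and None is returned when no drone can cover the remainder.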
The following describes an interactive process of the scheduling method of the simulation unmanned aerial vehicle.
Referring to fig. 5, the terminal device 100 first executes step S101 to construct a simulation virtual scene for the target area based on the simulation platform, where the simulation virtual scene includes a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each of the simulation unmanned aerial vehicles has simulation residual power information. And then, executing step S102, and sending the simulated residual capacity information of each simulated unmanned aerial vehicle to a server. Then, the server 200 sequentially executes step S201 to obtain a user flight command, and receives at least one piece of simulation residual capacity information of the simulation unmanned aerial vehicle and identification information of each simulation unmanned aerial vehicle sent by the terminal device, step S202 to determine a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight command, step S203 to generate a target flight command based on the identification information of the target simulation unmanned aerial vehicle and the user flight command, and step S204 to send the target flight command to the terminal device. Finally, the terminal device 100 executes step S103 to receive the target flight command sent by the server, and step S104 to control the target simulation unmanned aerial vehicle to execute the target flight task according to the flight target position information.
Experiments show that the following technical effects can be achieved by adopting the scheduling method of the application, and the technical effects are shown in the table 1.
TABLE 1
The instruction response delay refers to the response delay time of an interactive instruction between the terminal device 100 and the server 200, and the edge inference energy efficiency ratio refers to the energy consumed for processing a preset number of image frames.
According to the scheduling method of the simulation unmanned aerial vehicle, the server 200 can receive the simulation residual capacity information of each simulation unmanned aerial vehicle, and make a task decision according to the user flight instruction and the simulation residual capacity information so as to screen and obtain the optimal target simulation unmanned aerial vehicle capable of executing the target flight task from at least one simulation unmanned aerial vehicle. Therefore, the screening of the target simulation unmanned aerial vehicle not only depends on the flight instruction of the user, but also combines the simulation residual capacity information of each simulation unmanned aerial vehicle, so that the simulation result is more matched with the real scene, and the simulation verification effect is improved.
It will be appreciated by those skilled in the art that in the above-described method of the specific embodiments, the written order of steps is not meant to imply a strict order of execution but rather should be construed according to the function and possibly inherent logic of the steps.
Based on the same technical concept, the embodiment of the disclosure further provides a scheduling device of the simulation unmanned aerial vehicle, which corresponds to the scheduling method of the simulation unmanned aerial vehicle.
Referring to fig. 6, a schematic diagram of a scheduling apparatus of a simulation unmanned aerial vehicle according to an embodiment of the present disclosure is shown, where the scheduling apparatus 300 of a simulation unmanned aerial vehicle includes:
the scene construction module 301 is configured to construct a simulation virtual scene for the target area based on a simulation platform, where the simulation virtual scene includes a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each simulation unmanned aerial vehicle has simulation residual electric quantity information;
an information sending module 302, configured to send simulated residual capacity information of each simulated unmanned aerial vehicle to a server;
the instruction receiving module 303 is configured to receive a target flight instruction sent by the server, where the target flight instruction carries identification information and flight target position information of a target simulation unmanned aerial vehicle, and the target simulation unmanned aerial vehicle is obtained by screening from the at least one simulation unmanned aerial vehicle based on simulation residual capacity information and a user flight instruction of each simulation unmanned aerial vehicle by the server;
And the flight control module 304 is configured to control the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information.
In some embodiments, each of the simulated drones is integrated with a battery simulation model, and the scene construction module 301 is further configured to:
and aiming at each simulation unmanned aerial vehicle, determining simulation residual capacity information of the simulation unmanned aerial vehicle according to preset discharge interval parameters by using the battery simulation model.
In some embodiments, the scene construction module 301 is further configured to:
Aiming at any simulation unmanned aerial vehicle, generating a battery fault simulation instruction by using the battery simulation model;
And responding to the battery fault simulation instruction, and setting the simulation residual capacity information of the simulation unmanned aerial vehicle to zero.
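The battery simulation behavior described for the scene construction module can be sketched as follows: periodic discharge according to a preset discharge parameter, plus a fault instruction that clears the simulated residual power to zero. The class name, step-based discharge model, and rates are illustrative assumptions.

```python
# Toy battery simulation model: drains the simulated SOC by a preset
# amount per simulation step, and supports a fault-injection instruction
# that zeroes the residual power.

class BatterySimModel:
    def __init__(self, soc=1.0, discharge_per_step=0.01):
        self.soc = soc                              # simulated residual capacity in [0, 1]
        self.discharge_per_step = discharge_per_step  # preset discharge parameter

    def step(self):
        """Advance one simulation step, draining the battery (floored at 0)."""
        self.soc = max(0.0, self.soc - self.discharge_per_step)
        return self.soc

    def inject_fault(self):
        """Battery fault simulation instruction: clear residual power to zero."""
        self.soc = 0.0
        return self.soc
```

A drone at 50% SOC drops to 40% after one step with a 10% discharge parameter, and to exactly 0% when the fault instruction fires.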
In some embodiments, each simulated drone has a virtual port and an instance identification, the virtual ports of different simulated drones being different, the instance identifications being different.
In some embodiments, the scene construction module 301 is specifically configured to, when constructing a simulated virtual scene for a target area based on a simulation platform:
acquiring satellite image data, elevation data and a vector map aiming at the target area, and fusing the satellite image data, the elevation data and the vector map to generate three-dimensional fused map data aiming at the target area;
Performing simulation processing on the three-dimensional fusion map data by using the simulation platform to obtain the simulated ground environment, and performing simulation processing on at least one invoked unmanned aerial vehicle model by using the simulation platform to obtain the at least one simulation unmanned aerial vehicle;
And constructing the simulation virtual scene based on the simulation ground environment and the at least one simulation unmanned aerial vehicle.
In some embodiments, the scene construction module 301 is further specifically configured to:
Performing parameterization modeling on target elements in the three-dimensional fusion map data to generate a three-dimensional model corresponding to the target elements, and embedding real physical attribute information into the three-dimensional model to obtain a target three-dimensional model, wherein the target three-dimensional model is provided with semantic tags.
In some embodiments, the scene construction module 301 is further specifically configured to:
Adjusting the three-dimensional fusion map data to obtain target map data, wherein the adjustment comprises at least one of element path adjustment, scaling adjustment and coordinate system adjustment;
and carrying out simulation processing on the target map data by using the simulation platform to obtain the simulated ground environment.
In some embodiments, flight control module 304 is also configured to:
Acquiring an environment image of the environment where the target simulation unmanned aerial vehicle is located when executing the target flight task;
And carrying out target detection on the environment image to obtain a target detection result, and sending the target detection result to the server.
In some embodiments, flight control module 304 is also configured to:
invoking a multi-mode large language model to carry out picture analysis on the target detection result, and generating picture description information aiming at the target detection result;
And sending the target detection result and the picture description information corresponding to the target detection result to the server.
Referring to fig. 7, a schematic diagram of another scheduling apparatus for a simulated unmanned aerial vehicle according to an embodiment of the present disclosure is shown, where the scheduling apparatus 400 for a simulated unmanned aerial vehicle includes:
the information receiving module 401 is configured to obtain a user flight instruction, and receive the simulated residual power information of at least one simulation unmanned aerial vehicle and the identification information of each simulation unmanned aerial vehicle sent by the terminal device;
The unmanned aerial vehicle screening module 402 is configured to determine a target simulated unmanned aerial vehicle from the at least one simulated unmanned aerial vehicle according to the simulated residual capacity information of each simulated unmanned aerial vehicle and the user flight instruction;
the instruction generating module 403 is configured to generate a target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction;
the instruction sending module 404 is configured to send the target flight instruction to the terminal device, where the target flight instruction is used to instruct the target simulation unmanned aerial vehicle to execute a target flight task.
In some embodiments, the drone screening module 402 is specifically to:
Determining a target task range corresponding to the user flight instruction based on the user flight instruction;
Aiming at each simulation unmanned aerial vehicle, determining the maximum flight task range which can be executed by the simulation unmanned aerial vehicle based on simulation residual capacity information of the simulation unmanned aerial vehicle;
And determining the simulation unmanned aerial vehicle with the maximum flight task range larger than the target task range as a candidate simulation unmanned aerial vehicle, and determining the target simulation unmanned aerial vehicle from at least one candidate simulation unmanned aerial vehicle.
In some embodiments, the drone screening module 402 is specifically to:
Determining evaluation index information of each candidate simulation unmanned aerial vehicle based on the maximum flight task range, the electric quantity consumption information and the emergency degree information of the candidate simulation unmanned aerial vehicle, wherein the maximum flight task range corresponds to a first weight, the electric quantity consumption information corresponds to a second weight, and the emergency degree information corresponds to a third weight;
and determining the target simulation unmanned aerial vehicle from at least one candidate simulation unmanned aerial vehicle based on the evaluation index information of the at least one candidate simulation unmanned aerial vehicle.
In some embodiments, the simulated virtual scene has a real-time simulated wind speed therein, and the first weight, the second weight, and the third weight are determined based on the real-time simulated wind speed.
In some embodiments, the user flight instructions include a delay time, and the drone screening module 402 is specifically configured to:
and determining the maximum flight task range which can be executed by each simulation unmanned aerial vehicle based on the simulation residual capacity information of the simulation unmanned aerial vehicle after the delay time.
In some embodiments, the server is deployed with a large language model, and the drone screening module 402 is specifically configured to:
And determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction by using the large language model.
In some embodiments, the server is deployed with a large language model, and the drone screening module 402 is specifically configured to:
performing format conversion on the user flight instruction by using the large language model to obtain a flight instruction in a target format;
And generating the target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the flight target position information indicated by the flight instruction in the target format.
In some embodiments, the information receiving module 401 is further configured to:
Receiving a target detection result sent by a terminal device, wherein the target detection result is obtained by performing target detection on the basis of the acquired environment image when the target simulation unmanned aerial vehicle executes the target flight task;
The instruction generating module 403 is further configured to generate a flight adjustment instruction based on the target detection result;
The instruction sending module 404 is further configured to send the flight adjustment instruction to the terminal device.
In some embodiments, the information receiving module 401 is further configured to:
receiving picture description information, wherein the picture description information is obtained based on the target detection result;
the instruction generation module 403 is further configured to:
And generating the flight adjustment instruction based on the target detection result and the picture description information.
It should be noted that the above-described apparatus embodiments are merely illustrative, and the units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the present application. Those of ordinary skill in the art will understand and implement the present application without undue burden.
Based on the same technical concept, the embodiment of the disclosure also provides electronic equipment. Referring to fig. 8, a schematic structural diagram of an electronic device 500 according to an embodiment of the disclosure includes a processor 501, a memory 502, and a bus 503. The memory 502 is configured to store execution instructions, including a memory 5021 and an external memory 5022, where the memory 5021 is also referred to as an internal memory, and is configured to temporarily store operation data in the processor 501 and data exchanged with the external memory 5022, such as a hard disk, and the processor 501 exchanges data with the external memory 5022 through the memory 5021.
In the embodiment of the present application, the memory 502 is specifically configured to store application program codes for executing the solution of the present application, and the processor 501 controls the execution. That is, when the electronic device 500 is running, communication between the processor 501 and the memory 502 is via the bus 503, such that the processor 501 executes the application code stored in the memory 502, thereby performing the methods described in any of the foregoing embodiments.
The memory 502 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), etc.
The processor 501 may be an integrated circuit chip having signal processing capabilities. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), etc.; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), or another programmable logic device, discrete gate or transistor logic device, or discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 500. In other embodiments of the application, electronic device 500 may include more or fewer components than shown, or may combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The disclosed embodiments also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the scheduling method of the simulated unmanned aerial vehicle in the method embodiments described above. The storage medium may be a volatile or nonvolatile computer readable storage medium.
The embodiments of the present disclosure further provide a computer program product carrying program code, where the instructions included in the program code may be used to execute the steps of the scheduling method of the simulated unmanned aerial vehicle in the above method embodiments; for details, reference may be made to the above method embodiments, which are not described herein again.
The above computer program product may be implemented by hardware, software, or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium; in another alternative embodiment, it is embodied as a software product, such as a Software Development Kit (SDK) or the like.
Furthermore, embodiments of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, in tangibly embodied computer software or firmware, in computer hardware including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions encoded on a tangible, non-transitory program carrier for execution by, or to control the operation of, data processing apparatus. Alternatively or additionally, the program instructions may be encoded on a manually-generated propagated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode and transmit information to suitable receiver apparatus for execution by data processing apparatus. The computer storage medium may be a machine-readable storage device, a machine-readable storage substrate, a random or serial access memory device, or a combination of one or more of them.
The processes and logic flows described in this specification can be performed by one or more programmable computers executing one or more computer programs to perform corresponding functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Computers suitable for executing computer programs include, for example, general purpose and/or special purpose microprocessors, or any other type of central processing unit. Typically, the central processing unit will receive instructions and data from a read only memory and/or a random access memory. The essential elements of a computer include a central processing unit for carrying out or executing instructions and one or more memory devices for storing instructions and data. Typically, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks, etc. However, a computer does not have to have such a device. Furthermore, the computer may be embedded in another device, such as a mobile phone, a Personal Digital Assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device such as a Universal Serial Bus (USB) flash drive, to name a few.
Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices including, for example, semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices), magnetic disks (e.g., internal hard disk or removable disks), magneto-optical disks, and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features of specific embodiments of particular inventions. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Furthermore, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous. Moreover, the separation of various system modules and components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments of the subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. Furthermore, the processes depicted in the accompanying drawings are not necessarily required to be in the particular order shown, or sequential order, to achieve desirable results. In some implementations, multitasking and parallel processing may be advantageous.
The foregoing is merely a description of the preferred embodiments of the application and is not intended to limit it; any modification, equivalent replacement, improvement or the like made within the spirit and principles of the application shall fall within its scope of protection.
Claims (20)
1. A scheduling method for a simulated unmanned aerial vehicle, the method being applied to a terminal device, the method comprising:
constructing a simulation virtual scene aiming at a target area based on a simulation platform, wherein the simulation virtual scene comprises a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each simulation unmanned aerial vehicle has simulation residual electric quantity information;
sending the simulation residual capacity information of each simulation unmanned aerial vehicle to a server;
Receiving a target flight instruction sent by the server, wherein the target flight instruction carries identification information and flight target position information of a target simulation unmanned aerial vehicle, and the target simulation unmanned aerial vehicle is obtained by the server by screening from the at least one simulation unmanned aerial vehicle based on a user flight instruction and the simulation residual electric quantity information of each simulation unmanned aerial vehicle;
Controlling the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information;
the constructing a simulation virtual scene for a target area based on the simulation platform comprises the following steps:
acquiring satellite image data, elevation data and a vector map aiming at the target area, and fusing the satellite image data, the elevation data and the vector map to generate three-dimensional fused map data aiming at the target area;
Performing parameterization modeling on target elements in the three-dimensional fusion map data to generate a three-dimensional model corresponding to the target elements, and embedding real physical attribute information into the three-dimensional model to obtain a target three-dimensional model, wherein the target three-dimensional model is provided with semantic tags;
Performing simulation processing on the three-dimensional fusion map data by using the simulation platform to obtain the simulation geographic environment, and performing simulation processing on the at least one called unmanned aerial vehicle model by using the simulation platform to obtain the at least one simulation unmanned aerial vehicle;
And constructing the simulation virtual scene based on the simulation geographic environment and the at least one simulation unmanned aerial vehicle.
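The terminal-side flow recited in claim 1 can be sketched in code as follows. This is an illustrative outline only, not the patented implementation: the class name, the in-memory instruction dictionary standing in for the server message, and the battery values are all hypothetical.

```python
import json

class SimulatedDrone:
    """Hypothetical terminal-side record of one simulated UAV in the virtual scene."""
    def __init__(self, drone_id, remaining_power):
        self.drone_id = drone_id
        self.remaining_power = remaining_power  # simulated remaining battery, percent

def build_scene(drone_specs):
    # Step 1: construct the simulated virtual scene (geography omitted in this sketch)
    return [SimulatedDrone(d, p) for d, p in drone_specs]

def report_power(drones):
    # Step 2: serialize each drone's simulated remaining power for the server
    return json.dumps({d.drone_id: d.remaining_power for d in drones})

def execute_flight(drones, target_instruction):
    # Steps 3-4: match the identification information in the received target
    # flight instruction, then fly the selected drone to the target position
    target = next(d for d in drones if d.drone_id == target_instruction["drone_id"])
    return f"{target.drone_id} flying to {target_instruction['position']}"

drones = build_scene([("uav-1", 80), ("uav-2", 35)])
payload = report_power(drones)  # would be sent to the server
instruction = {"drone_id": "uav-1", "position": (120.1, 30.2, 50)}
print(execute_flight(drones, instruction))
```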
2. The method of claim 1, wherein each of the simulated drones is integrated with a battery simulation model, the method further comprising:
and aiming at each simulation unmanned aerial vehicle, determining simulation residual capacity information of the simulation unmanned aerial vehicle according to preset discharge interval parameters by using the battery simulation model.
3. The method according to claim 2, wherein the method further comprises:
Aiming at any simulation unmanned aerial vehicle, generating a battery fault simulation instruction by using the battery simulation model;
And responding to the battery fault simulation instruction, and setting the simulation residual capacity information of the simulation unmanned aerial vehicle to zero.
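The battery behavior of claims 2 and 3 can be sketched as a small model that discharges by a preset step per simulation tick and supports fault injection (power forced to zero). The discharge step and initial values here are illustrative assumptions, not parameters disclosed by the patent.

```python
class BatterySimulationModel:
    """Hypothetical battery model: discharges by a fixed amount per simulation
    tick (the preset discharge interval parameter of claim 2) and supports the
    battery fault simulation instruction of claim 3."""
    def __init__(self, initial_power=100.0, discharge_per_tick=0.5):
        self.power = initial_power                    # simulated remaining power, percent
        self.discharge_per_tick = discharge_per_tick  # preset discharge interval parameter

    def tick(self):
        # Normal discharge: subtract the preset step, clamped at zero
        self.power = max(0.0, self.power - self.discharge_per_tick)
        return self.power

    def inject_battery_fault(self):
        # Claim 3: in response to the fault instruction, set remaining power to zero
        self.power = 0.0
        return self.power

battery = BatterySimulationModel(initial_power=10.0, discharge_per_tick=2.5)
battery.tick()                  # 7.5
battery.tick()                  # 5.0
battery.inject_battery_fault()  # 0.0
```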
4. The method of claim 1, wherein each simulated drone has a virtual port and an instance identification, the virtual ports of different simulated drones being different and the instance identifications being different.
5. The method according to claim 1, wherein the performing, by using the simulation platform, the simulation processing on the three-dimensional fused map data to obtain the simulated geographic environment includes:
Adjusting the three-dimensional fusion map data to obtain target map data, wherein the adjustment comprises at least one of element path adjustment, scaling adjustment and coordinate system adjustment;
And carrying out simulation processing on the target map data by using the simulation platform to obtain the simulated geographic environment.
6. The method according to claim 1, wherein the method further comprises:
Acquiring an environment image of the environment where the target simulation unmanned aerial vehicle is located when executing the target flight task;
And carrying out target detection on the environment image to obtain a target detection result, and sending the target detection result to the server.
7. The method of claim 6, wherein the method further comprises:
invoking a multi-mode large language model to carry out picture analysis on the target detection result, and generating picture description information aiming at the target detection result;
the sending the target detection result to the server includes:
and sending the target detection result and picture description information corresponding to the target detection result to the server.
8. A scheduling method for a simulated unmanned aerial vehicle, the method being applied to a server, the method comprising:
acquiring a user flight instruction, and receiving simulation residual electric quantity information of at least one simulation unmanned aerial vehicle and identification information of each simulation unmanned aerial vehicle, which are sent by a terminal device;
determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction;
Generating a target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction;
The target flight instruction is sent to the terminal equipment and used for indicating the target simulation unmanned aerial vehicle to execute a target flight task in a simulation virtual scene, and the simulation virtual scene is constructed by the following steps:
acquiring satellite image data, elevation data and a vector map aiming at the target area, and fusing the satellite image data, the elevation data and the vector map to generate three-dimensional fused map data aiming at the target area;
Performing parameterization modeling on target elements in the three-dimensional fusion map data to generate a three-dimensional model corresponding to the target elements, and embedding real physical attribute information into the three-dimensional model to obtain a target three-dimensional model, wherein the target three-dimensional model is provided with semantic tags;
Performing simulation processing on the three-dimensional fusion map data by using a simulation platform to obtain a simulation geographic environment, and performing simulation processing on at least one called unmanned aerial vehicle model by using the simulation platform to obtain at least one simulation unmanned aerial vehicle;
And constructing the simulation virtual scene based on the simulation geographic environment and the at least one simulation unmanned aerial vehicle.
9. The method of claim 8, wherein said determining a target simulated drone from said at least one simulated drone based on the simulated residual charge information for each of said simulated drones and said user flight instructions, comprises:
Determining a target task range corresponding to the user flight instruction based on the user flight instruction;
Aiming at each simulation unmanned aerial vehicle, determining the maximum flight task range which can be executed by the simulation unmanned aerial vehicle based on simulation residual capacity information of the simulation unmanned aerial vehicle;
And determining the simulation unmanned aerial vehicle with the maximum flight task range larger than the target task range as a candidate simulation unmanned aerial vehicle, and determining the target simulation unmanned aerial vehicle from at least one candidate simulation unmanned aerial vehicle.
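The screening step of claim 9 can be sketched as follows. The linear conversion from remaining power to flight range (`km_per_percent`) is an illustrative assumption; the patent does not specify how the maximum flight task range is derived.

```python
def screen_candidates(drones, target_range_km, km_per_percent=0.2):
    """Hypothetical screening per claim 9: keep drones whose maximum flight
    task range (derived here from simulated remaining power by a linear,
    assumed conversion) exceeds the target task range."""
    candidates = []
    for drone_id, remaining_power in drones.items():
        max_range_km = remaining_power * km_per_percent
        if max_range_km > target_range_km:
            candidates.append(drone_id)
    return candidates

fleet = {"uav-1": 80, "uav-2": 35, "uav-3": 60}
print(screen_candidates(fleet, target_range_km=10.0))  # uav-1 (16 km) and uav-3 (12 km) qualify
```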
10. The method of claim 9, wherein said determining the target simulated drone from at least one of the candidate simulated drones comprises:
Determining evaluation index information of each candidate simulation unmanned aerial vehicle based on the maximum flight task range, electric quantity consumption information and emergency degree information of the candidate simulation unmanned aerial vehicle, wherein the maximum flight task range corresponds to a first weight, the electric quantity consumption information corresponds to a second weight and the emergency degree information corresponds to a third weight;
and determining the target simulation unmanned aerial vehicle from at least one candidate simulation unmanned aerial vehicle based on the evaluation index information of the at least one candidate simulation unmanned aerial vehicle.
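The weighted evaluation index of claim 10 can be sketched as a linear score. The weight values and the sign convention (larger range and urgency favorable, larger power consumption unfavorable) are illustrative assumptions; per claim 11, the weights may in practice be determined from the real-time simulated wind speed.

```python
def evaluation_index(max_range_km, power_cost, urgency,
                     w_range=0.5, w_cost=0.3, w_urgency=0.2):
    """Hypothetical weighted score per claim 10: first weight on maximum flight
    task range, second on electric quantity consumption, third on urgency."""
    return w_range * max_range_km - w_cost * power_cost + w_urgency * urgency

candidates = {
    "uav-1": evaluation_index(max_range_km=16.0, power_cost=40.0, urgency=5.0),
    "uav-3": evaluation_index(max_range_km=12.0, power_cost=20.0, urgency=5.0),
}
best = max(candidates, key=candidates.get)  # the target simulated drone
```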
11. The method of claim 10, wherein the simulated virtual scene has a real-time simulated wind speed therein, and wherein the first weight, the second weight, and the third weight are determined based on the real-time simulated wind speed.
12. The method of claim 10, wherein the user flight instructions include a delay time, wherein the determining, for each simulated drone, a maximum range of flight tasks that the simulated drone can perform based on simulated residual power information of the simulated drone includes:
and determining the maximum flight task range which can be executed by each simulation unmanned aerial vehicle based on the simulation residual capacity information of the simulation unmanned aerial vehicle after the delay time.
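Claim 12's delay-aware variant can be sketched by projecting the simulated remaining power forward over the delay time before computing the range. Both the idle drain rate and the power-to-range conversion are illustrative assumptions.

```python
def projected_power(current_power, delay_minutes, idle_drain_per_minute=0.1):
    """Hypothetical projection per claim 12: estimate the simulated remaining
    power after the user-specified delay time, assuming a small idle drain."""
    return max(0.0, current_power - idle_drain_per_minute * delay_minutes)

def max_range_after_delay(current_power, delay_minutes, km_per_percent=0.2):
    # Maximum flight task range computed from the post-delay power estimate
    return projected_power(current_power, delay_minutes) * km_per_percent

# A drone at 50% asked to fly 30 minutes from now: 47% projected, 9.4 km range
max_range_after_delay(50.0, 30)
```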
13. The method of claim 8, wherein the server is deployed with a large language model, and wherein the determining the target simulated drone from the at least one simulated drone based on the simulated residual power information for each of the simulated drones and the user flight instructions includes:
And determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction by using the large language model.
14. The method of claim 8, wherein the server is deployed with a large language model, the generating a target flight instruction based on the identification information of the target simulated drone and the user flight instruction, comprising:
performing format conversion on the user flight instruction by using the large language model to obtain a flight instruction in a target format;
And generating the target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the flight target position information indicated by the flight instruction in the target format.
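The format conversion of claim 14 can be sketched as prompting a large language model to emit a fixed structure, then combining the result with the drone's identification information. Everything concrete here is an assumption: `llm_complete` is a placeholder for whatever model call the deployment uses, and the JSON schema is illustrative, not the patented target format.

```python
import json

def convert_user_instruction(raw_text, llm_complete):
    """Hypothetical sketch of claim 14: an LLM converts a free-text user
    flight instruction into a fixed target format (an assumed JSON schema)."""
    prompt = ('Convert this flight instruction to JSON with keys '
              '"lat", "lon", "alt_m": ' + raw_text)
    return json.loads(llm_complete(prompt))

def build_target_instruction(drone_id, raw_text, llm_complete):
    # Combine the target drone's identification information with the
    # flight target position indicated by the converted instruction
    position = convert_user_instruction(raw_text, llm_complete)
    return {"drone_id": drone_id, "position": position}

# A canned stand-in for the model so the sketch runs without any LLM service:
fake_llm = lambda prompt: '{"lat": 30.25, "lon": 120.16, "alt_m": 60}'
instruction = build_target_instruction("uav-3", "fly to the west gate at 60 m", fake_llm)
```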
15. The method of claim 8, wherein the method further comprises:
Receiving a target detection result sent by a terminal device, wherein the target detection result is obtained by performing target detection on the basis of the acquired environment image when the target simulation unmanned aerial vehicle executes the target flight task;
And generating a flight adjustment instruction based on the target detection result, and sending the flight adjustment instruction to the terminal equipment.
16. The method of claim 15, wherein the method further comprises:
receiving picture description information, wherein the picture description information is obtained based on the target detection result;
based on the target detection result, generating a flight adjustment instruction, including:
And generating the flight adjustment instruction based on the target detection result and the picture description information.
17. A scheduling device for a simulated unmanned aerial vehicle, the device comprising:
The scene construction module is used for constructing a simulation virtual scene aiming at the target area based on the simulation platform, wherein the simulation virtual scene comprises a simulation geographic environment and at least one simulation unmanned aerial vehicle, and each simulation unmanned aerial vehicle has simulation residual electric quantity information;
The information sending module is used for sending the simulation residual capacity information of each simulation unmanned aerial vehicle to a server;
The instruction receiving module is used for receiving a target flight instruction sent by the server, wherein the target flight instruction carries identification information and flight target position information of a target simulation unmanned aerial vehicle, and the target simulation unmanned aerial vehicle is obtained by the server by screening from the at least one simulation unmanned aerial vehicle based on the simulation residual electric quantity information of each simulation unmanned aerial vehicle and a user flight instruction;
the flight control module is used for controlling the target simulation unmanned aerial vehicle to execute a target flight task according to the flight target position information;
the scene construction module is specifically configured to:
acquiring satellite image data, elevation data and a vector map aiming at the target area, and fusing the satellite image data, the elevation data and the vector map to generate three-dimensional fused map data aiming at the target area;
Performing parameterization modeling on target elements in the three-dimensional fusion map data to generate a three-dimensional model corresponding to the target elements, and embedding real physical attribute information into the three-dimensional model to obtain a target three-dimensional model, wherein the target three-dimensional model is provided with semantic tags;
Performing simulation processing on the three-dimensional fusion map data by using the simulation platform to obtain the simulation geographic environment, and performing simulation processing on the at least one called unmanned aerial vehicle model by using the simulation platform to obtain the at least one simulation unmanned aerial vehicle;
And constructing the simulation virtual scene based on the simulation geographic environment and the at least one simulation unmanned aerial vehicle.
18. A scheduling device for a simulated unmanned aerial vehicle, the device comprising:
The information receiving module is used for acquiring a user flight instruction and receiving simulation residual electric quantity information of at least one simulation unmanned aerial vehicle sent by the terminal equipment;
The unmanned aerial vehicle screening module is used for determining a target simulation unmanned aerial vehicle from the at least one simulation unmanned aerial vehicle according to the simulation residual capacity information of each simulation unmanned aerial vehicle and the user flight instruction;
the instruction generation module is used for generating a target flight instruction based on the identification information of the target simulation unmanned aerial vehicle and the user flight instruction;
The instruction sending module is used for sending the target flight instruction to the terminal equipment, the target flight instruction is used for indicating the target simulation unmanned aerial vehicle to execute a target flight task in a simulation virtual scene, and the simulation virtual scene is constructed by the following steps:
acquiring satellite image data, elevation data and a vector map aiming at the target area, and fusing the satellite image data, the elevation data and the vector map to generate three-dimensional fused map data aiming at the target area;
Performing parameterization modeling on target elements in the three-dimensional fusion map data to generate a three-dimensional model corresponding to the target elements, and embedding real physical attribute information into the three-dimensional model to obtain a target three-dimensional model, wherein the target three-dimensional model is provided with semantic tags;
Performing simulation processing on the three-dimensional fusion map data by using a simulation platform to obtain a simulation geographic environment, and performing simulation processing on at least one called unmanned aerial vehicle model by using the simulation platform to obtain at least one simulation unmanned aerial vehicle;
And constructing the simulation virtual scene based on the simulation geographic environment and the at least one simulation unmanned aerial vehicle.
19. An electronic device comprising a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor in communication with the memory via the bus when the electronic device is in operation, the machine-readable instructions when executed by the processor performing the scheduling method of the simulated unmanned aerial vehicle according to any one of claims 1-16.
20. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed by a processor, performs the scheduling method of the simulated drone of any of claims 1-16.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202510775033.1A CN120317019B (en) | 2025-06-11 | 2025-06-11 | Scheduling method and device of simulation unmanned aerial vehicle, electronic equipment and storage medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202510775033.1A CN120317019B (en) | 2025-06-11 | 2025-06-11 | Scheduling method and device of simulation unmanned aerial vehicle, electronic equipment and storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN120317019A CN120317019A (en) | 2025-07-15 |
| CN120317019B true CN120317019B (en) | 2025-09-19 |
Family
ID=96321885
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510775033.1A Active CN120317019B (en) | 2025-06-11 | 2025-06-11 | Scheduling method and device of simulation unmanned aerial vehicle, electronic equipment and storage medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN120317019B (en) |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| PL4273656T3 (en) * | 2021-02-19 | 2025-05-12 | Anarky Labs Oy | Apparatus, method and software for assisting an operator in flying a drone using a remote controller and ar glasses |
| CN116048273A (en) * | 2023-01-16 | 2023-05-02 | 北京字跳网络技术有限公司 | Simulation method of virtual object and related equipment |
| US12152563B1 (en) * | 2023-09-26 | 2024-11-26 | Nyocor Intelligent Maintenance (Ningxia) Technology Co., Ltd | Method, apparatus, and electronic device for detecting wind turbine blade based on drone aerial photography |
| CN119339001B (en) * | 2024-12-18 | 2025-05-09 | 东海实验室 | Three-dimensional visual scene countermeasure simulation deduction method and related device |
| CN119761136A (en) * | 2024-12-30 | 2025-04-04 | 中国人民解放军海军特色医学中心 | Multi-scenario simulation and data evaluation method for UAV flight performance |
- 2025-06-11: CN202510775033.1A granted as CN120317019B (en), status Active
Non-Patent Citations (3)
| Title |
|---|
| Joint trajectory design method for multi-UAV power transmission line inspection; Gao Yunfei et al.; Journal of Electronics & Information Technology; 2024-05-31; pp. 1958-1967 * |
| Visual simulation and implementation of UAV flight scenes and data; Yi Shushu; Wanfang Database; 2010-12-29; pp. 1-84 * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN120317019A (en) | 2025-07-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12333756B2 (en) | Systems and methods for 3D model based drone flight planning and control | |
| US11248930B2 (en) | Microclimate wind forecasting | |
| Jiang et al. | SunChase: Energy-efficient route planning for solar-powered EVs | |
| EP3582146A1 (en) | Method, apparatus, device and medium for classifying driving scenario data | |
| US11449080B2 (en) | UAV flight management planner | |
| US10421372B2 (en) | Electric power trade brokering system, non-transitory computer readable medium storing program and electric power trade brokering method | |
| CN110134143A (en) | A kind of electric inspection process method, system and electronic equipment and storage medium | |
| US20220050446A1 (en) | Operational testing of autonomous vehicles | |
| CN112347605A (en) | System and method for simulation of vehicle-based item delivery | |
| CN111373339A (en) | Flight mission generation method, control terminal, unmanned aerial vehicle and storage medium | |
| CN115657726B (en) | Control switching method of multiple unmanned aerial vehicles | |
| CN117213489A (en) | Unmanned aerial vehicle path planning method, device and system under influence of complex wind environment | |
| WO2022083487A1 (en) | Method and apparatus for generating high definition map and computer-readable storage medium | |
| CN112699765A (en) | Method and device for evaluating visual positioning algorithm, electronic equipment and storage medium | |
| CN113625770B (en) | Autonomous navigation planning method and device for inspecting photovoltaic power stations based on flying drones | |
| CN116518960A (en) | Road network updating method, device, electronic equipment and storage medium | |
| CN117192998A (en) | Unmanned aerial vehicle autonomous decision-making method and device based on state prediction of Transformer neural network | |
| CN106323272B (en) | A kind of method and electronic equipment obtaining track initiation track | |
| CN120317019B (en) | Scheduling method and device of simulation unmanned aerial vehicle, electronic equipment and storage medium | |
| US20210118086A1 (en) | Robot and method for correcting position of same | |
| CN116772846A (en) | Unmanned aerial vehicle track planning method, unmanned aerial vehicle track planning device, unmanned aerial vehicle track planning equipment and unmanned aerial vehicle track planning medium | |
| US20230169873A1 (en) | Flight management apparatus and flight management method | |
| CN117252296A (en) | Trajectory prediction methods, devices, systems, electronic equipment and autonomous vehicles | |
| CN119376414B (en) | A demand-oriented UAV parameter self-setting method and system | |
| Uçan et al. | Using genetic algorithms for navigation planning in dynamic environments |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||