
CN118942115B - Unmanned aerial vehicle-based auxiliary intelligent cultivation management method and system - Google Patents


Info

Publication number
CN118942115B
CN118942115B (application CN202410999525.4A)
Authority
CN
China
Prior art keywords
image information
information
animal
aerial vehicle
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410999525.4A
Other languages
Chinese (zh)
Other versions
CN118942115A (en)
Inventor
唐汉通
张盼锋
陈玉林
黄锦恩
杨富
韦炊之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taishan Saike Agriculture Technology Co ltd
Original Assignee
Taishan Saike Agriculture Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taishan Saike Agriculture Technology Co ltd filed Critical Taishan Saike Agriculture Technology Co ltd
Priority to CN202410999525.4A
Publication of CN118942115A
Application granted
Publication of CN118942115B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00Other apparatus for animal husbandry
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01KANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00Other apparatus for animal husbandry
    • A01K29/005Monitoring or measuring activity, e.g. detecting heat or mating
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40Control within particular dimensions
    • G05D1/46Control of position or course in three dimensions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/70Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in livestock or poultry

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Environmental Sciences (AREA)
  • Animal Husbandry (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Remote Sensing (AREA)
  • Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Agronomy & Crop Science (AREA)
  • Economics (AREA)
  • Biophysics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Automation & Control Theory (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Mining & Mineral Resources (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiment of the invention relates to the technical field of intelligent cultivation and discloses an unmanned aerial vehicle-based auxiliary intelligent cultivation management method. The method comprises: receiving first image information acquired while a first unmanned aerial vehicle is at a first flying height; inputting the preprocessed first image information into a pre-built animal identification model to identify each animal's image information within it; determining corresponding flight track information according to the clustering center points; and sending the flight track information to the first unmanned aerial vehicle to control it to acquire a second image information group and an infrared image information group along the set flight track at a second flight height. Because the system plans the unmanned aerial vehicle's flight track from the density of the animal distribution, the method not only improves the unmanned aerial vehicle's working efficiency but also ensures comprehensive and accurate data acquisition.

Description

Unmanned aerial vehicle-based auxiliary intelligent cultivation management method and system
Technical Field
The invention relates to the technical field of intelligent cultivation management, and in particular to an unmanned aerial vehicle-based auxiliary intelligent cultivation management method and system.
Background
At present, as the breeding industry continues to develop toward large scale and industrialization, many large breeding enterprises have introduced informationized and intelligent breeding management systems. However, most of these systems focus on monitoring and adjusting the cultivation environment to improve cultivation quality. Management of the farm's layout still depends mainly on human experience and professional judgment, and open cultivation environments have not been studied in depth. Smart cultivation therefore still has blind spots and cannot yet provide effective auxiliary breeding functions.
Disclosure of Invention
To address the above defects, the embodiment of the invention discloses an unmanned aerial vehicle-based auxiliary intelligent cultivation management method that enables efficient cultivation management.
The first aspect of the embodiment of the invention discloses an unmanned aerial vehicle-based auxiliary intelligent cultivation management method, which comprises the following steps:
receiving first image information acquired when a first unmanned aerial vehicle is located at a first flying height, and performing image preprocessing operation on the first image information;
Inputting the preprocessed first image information into a pre-built animal identification model to identify so as to determine each animal image information in the first image information, and determining the central point position of each animal in the first image information according to the animal image information;
Determining density distribution image information of each animal in the first image information according to the center point position of each animal in the first image information and the first image information, performing clustering operation on the density distribution image information to determine a plurality of clustering center points in the density distribution image information, and determining corresponding flight track information according to the clustering center points;
The flight track information is sent to the first unmanned aerial vehicle to control the first unmanned aerial vehicle to collect a second image information group and an infrared image information group along a set flight track according to a second flight height, and the obtained second image information group and the obtained infrared image information group are sent to a background server, wherein the second image information group comprises a plurality of second image information, the infrared image information group comprises a plurality of infrared image information, and the second image information corresponds to the infrared image information one by one.
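The density clustering and track planning described above can be sketched as follows. This is a minimal illustration, assuming animal center points are already available as 2-D coordinates; the patent clusters a density-distribution image rather than raw points, so the plain k-means below (with farthest-point initialisation) and the nearest-neighbour visiting order are illustrative stand-ins, and the function name `plan_waypoints` is hypothetical:

```python
import numpy as np

def plan_waypoints(centers: np.ndarray, k: int = 3, iters: int = 50) -> np.ndarray:
    """Cluster animal center points and return one waypoint per cluster,
    ordered for a short flight. A stand-in for the patent's clustering of
    the density-distribution image."""
    pts = np.asarray(centers, dtype=float)
    # farthest-point initialisation: deterministic and well spread out
    wps = [pts[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(pts - w, axis=1) for w in wps], axis=0)
        wps.append(pts[d.argmax()])
    wps = np.array(wps)
    # standard k-means refinement of the cluster centers
    for _ in range(iters):
        labels = np.linalg.norm(pts[:, None] - wps[None], axis=2).argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                wps[j] = pts[labels == j].mean(axis=0)
    # visit the cluster centers in nearest-neighbour order from the origin
    order, pos, left = [], np.zeros(pts.shape[1]), list(range(k))
    while left:
        nxt = min(left, key=lambda j: np.linalg.norm(wps[j] - pos))
        order.append(nxt)
        pos = wps[nxt]
        left.remove(nxt)
    return wps[order]
```

Each returned waypoint corresponds to one clustering center point; the flight track is the ordered sequence of waypoints sent to the first unmanned aerial vehicle.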
As an optional implementation manner, in the first aspect of the embodiment of the present invention, after the sending the acquired second image information set and the acquired infrared image information set to the background server, the method further includes:
Performing feature recognition on each piece of second image information in the second image information group according to a pre-constructed object recognition model so as to determine corresponding object information and object position information corresponding to the object information;
Matching the determined object information with the constructed object-animal numbering relation table to determine corresponding animal numbering information;
Temperature analysis is carried out on the infrared image information to determine the temperature information corresponding to each target object, and the animal number information and the temperature information are associated according to the target position information; if the temperature information corresponding to a target object falls outside the set threshold range, the animal number information with the abnormal temperature is sent to a cultivation manager.
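The temperature-threshold check above can be sketched as follows. The 37.5–39.5 °C band and the dictionary shapes are illustrative assumptions; the patent does not specify the threshold range:

```python
def flag_temperature_anomalies(readings: dict, lo: float = 37.5,
                               hi: float = 39.5) -> dict:
    """Return the animal-number -> temperature pairs whose temperature
    falls outside the [lo, hi] band.

    `readings` maps animal number to the temperature extracted from the
    infrared frame that corresponds to the visible-light frame; the band
    limits are illustrative, not taken from the patent.
    """
    return {num: t for num, t in readings.items() if not (lo <= t <= hi)}
```

The returned mapping is what would be forwarded to the cultivation manager.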
In a first aspect of the embodiment of the present invention, the receiving the first image information acquired when the first unmanned aerial vehicle is located at the first flight altitude, and performing an image preprocessing operation on the first image information includes:
Receiving a first image information group acquired when a first unmanned aerial vehicle is located at a first flying height, wherein the first image information group comprises a plurality of first image information, and the first image information group is acquired through the following steps:
acquiring first image information shot at a corresponding acquisition point through an onboard camera of the first unmanned aerial vehicle, and acquiring attitude information corresponding to the first image information through an attitude sensor;
determining, according to the attitude data, whether the shooting state of the onboard camera when shooting the first image information meets a preset jitter condition;
when the shooting state is determined to meet the preset jitter condition, marking the first image information, and performing a graying operation on the first image information meeting the jitter condition to obtain a first grayscale image;
Contour feature recognition is performed on the first grayscale image by using a background segmentation model to determine the contour image information of each animal in the first image information.
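A minimal sketch of the jitter check and graying step above. The patent does not define the preset jitter (dithering) condition, so the angular-rate magnitude test and its threshold are illustrative assumptions, as is the use of ITU-R BT.601 luminance weights for the graying operation:

```python
import numpy as np

def preprocess_frame(frame: np.ndarray, attitude_rates: np.ndarray,
                     jitter_thresh: float = 5.0):
    """Mark a frame as jittered when the attitude sensor's angular-rate
    magnitude at capture time exceeds a threshold, then convert the
    marked RGB frame to grayscale.

    Returns (jittered, grayscale) where grayscale is None for frames
    that did not meet the jitter condition.
    """
    jittered = bool(np.linalg.norm(attitude_rates) > jitter_thresh)
    gray = None
    if jittered:
        # BT.601 luma weights: a common choice for graying, assumed here
        gray = frame @ np.array([0.299, 0.587, 0.114])
    return jittered, gray
```

The grayscale output would then be passed to the background segmentation model for contour recognition.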
As an alternative implementation manner, in the first aspect of the embodiment of the present invention, the animal identification model is constructed by the following steps:
acquiring sample image information, wherein the sample image information is a sample image shot by an unmanned aerial vehicle at a first set height, and marking the appearing animal in a manual marking mode;
dividing the sample image information to obtain a model training set and a model testing set;
inputting the model training set into a pre-constructed animal identification model to identify, and continuously adjusting network parameters until the animal identification model meets set conditions;
And inputting the model test set into an animal identification model meeting the set conditions to carry out test verification so as to determine whether the set requirements are met, outputting a corresponding animal identification model if the set requirements are met, and returning to continue training if the set requirements are not met.
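The train-until-condition-then-gate loop above can be sketched as follows. The 1-D threshold "model", the accuracy target, and the fixed-step parameter update are toy stand-ins for the real animal identification network and its training procedure:

```python
def train_until_fit(train, test, target_acc: float = 0.9,
                    max_rounds: int = 100):
    """Skeleton of the patent's loop: keep adjusting a parameter until the
    model meets the set condition on the training set, then verify it on
    the held-out test set. Returns the fitted parameter, or None to signal
    that training must continue.

    `train`/`test` are lists of (value, bool_label) pairs; the threshold
    classifier is a toy stand-in for the real network.
    """
    def acc(thresh, data):
        return sum((x > thresh) == y for x, y in data) / len(data)

    thresh = 0.0
    for _ in range(max_rounds):        # network-parameter adjustment loop
        if acc(thresh, train) >= target_acc:
            break
        thresh += 0.05                 # crude stand-in for a weight update
    # test-set gate: output the model only if it meets the set requirement
    return thresh if acc(thresh, test) >= target_acc else None
```

Returning None corresponds to the patent's "return to continue training" branch.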
In a first aspect of the embodiment of the present invention, the inputting the preprocessed first image information into a pre-constructed animal identification model to identify to determine each animal image information in the first image information includes:
Dividing the preprocessed first image information into a first identification area, a second identification area and a third identification area, wherein the first identification area, the second identification area and the third identification area have the same center point, and the distances from the edges of the first identification area, the edges of the second identification area and the edges of the third identification area to the center point are sequentially increased;
inputting the first identification area, the second identification area and the third identification area into a pre-built first animal identification model, a pre-built second animal identification model and a pre-built third animal identification model respectively to identify so as to determine each animal image information in the first image information;
In the process that the first unmanned aerial vehicle collects the second image information group and the infrared image information group along the set flight track according to the second flight height, sensing parameters of all areas are obtained through a sensor assembly arranged at the first unmanned aerial vehicle, the sensor assembly comprises a gas sensor, a temperature sensor and a humidity sensor, and environmental parameter information of the corresponding areas is determined according to the sensing parameters of all the areas.
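The tri-region division described above (three zones sharing the image center, with edge-to-center distance increasing from the first to the third region) can be sketched as a per-pixel labelling. The 1/3 and 2/3 radius fractions are illustrative; the patent does not fix the zone sizes:

```python
import numpy as np

def split_concentric_regions(img: np.ndarray, radii=(0.33, 0.66)) -> np.ndarray:
    """Label each pixel of `img` as belonging to the first, second or
    third identification region (values 1, 2, 3), measured by normalised
    distance from the shared center point."""
    h, w = img.shape[:2]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    # normalise distance so the image corner sits at 1.0
    dist = np.hypot(yy - cy, xx - cx) / np.hypot(cy, cx)
    return np.digitize(dist, radii) + 1
```

Each labelled zone would then be routed to its own pre-built animal identification model.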
As an optional implementation manner, in the first aspect of the embodiment of the present invention, the cultivation management method further includes:
receiving motion sensing signals acquired by motion sensors arranged at all animals;
and determining the animal health state according to the animal motion state identified by the motion sensing signal.
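A minimal sketch of deriving a health state from the motion-sensing signals above. The event-count representation and the 30-events-per-day threshold are illustrative assumptions; the patent does not specify how the motion state maps to health:

```python
def health_from_motion(activity_counts: dict, min_active: int = 30) -> dict:
    """Map per-animal daily motion-sensor event counts to a coarse health
    label: animals moving less than `min_active` events per day are
    flagged for inspection."""
    return {num: ("normal" if c >= min_active else "check")
            for num, c in activity_counts.items()}
```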
As an optional implementation manner, in the first aspect of the embodiment of the present invention, after the determining corresponding flight trajectory information according to the cluster center, the method further includes:
performing signal disassembly on the flight track information to determine the position information of each acquisition point in the flight track information, and generating corresponding dispensing information according to the position information of each acquisition point;
sending the flight track information and the dispensing information to a second unmanned aerial vehicle so that the second unmanned aerial vehicle dispenses medicine or feed at the corresponding acquisition points;
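Disassembling the track into acquisition points and attaching the throwing (dispensing) payload can be sketched as follows. The fixed per-point dose and the dictionary layout are illustrative assumptions:

```python
def build_dispensing_plan(trajectory, dose_per_point: float = 1.0):
    """Turn a flight track (a list of (x, y) acquisition points) into the
    per-point dispensing information for the second UAV: one record per
    acquisition point, in visiting order."""
    return [{"seq": i, "point": p, "dose": dose_per_point}
            for i, p in enumerate(trajectory)]
```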
the intelligent cultivation management method further comprises the following steps:
Generating, at a client management module, a corresponding key pair according to the identity of the client, wherein the key pair comprises an encryption key and a decryption key; extracting the encryption key from the key pair and sending it to the client corresponding to the identity; extracting the decryption key from the key pair, binding the decryption key to the identity to generate decryption information, and sending the decryption information to a client management node in the blockchain system;
dividing, at a data management module, the collected cultivation management information into a plurality of public data packets and private data packets, and encrypting the private data packets with the encryption key, wherein the public data packets comprise the identification mark and timestamp information of the cultivation management information, and the private data packets comprise the cultivation quantity, animal health status and animal number information;
when the cultivation management information of the data management module is requested, the cultivation management node extracts the public data packet and the private data packet from each cultivation data block according to the corresponding timestamp information and the client's identity, queries the decryption information according to the identity, extracts the corresponding decryption key to decrypt the private data packet, and restores the cultivation management information with a preset data template based on the public data packet and the private data packet.
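The public/private packet split above can be sketched as follows. The patent uses a real asymmetric key pair; the SHA-256 counter-keystream XOR below is only a toy, reversible placeholder so the sketch is self-contained, and must not be used as actual cryptography. Field names are illustrative:

```python
import hashlib
import json

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy reversible cipher (SHA-256 counter keystream). A placeholder
    for the patent's asymmetric encryption; NOT secure."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out += bytes(a ^ b for a, b in zip(data[i:i + 32], block))
    return bytes(out)

def pack_management_info(info: dict, key: bytes):
    """Split cultivation-management info into a public packet (identifier
    and timestamp, kept in the clear) and a private packet (counts,
    health status, animal numbers, encrypted with the client's key)."""
    public = {"id": info["id"], "timestamp": info["timestamp"]}
    private_fields = {k: v for k, v in info.items()
                      if k not in ("id", "timestamp")}
    private = keystream_xor(json.dumps(private_fields).encode(), key)
    return public, private
```

Because the toy cipher is its own inverse, applying `keystream_xor` again with the same key restores the private fields, mirroring the decrypt-and-restore step at the cultivation management node.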
The second aspect of the embodiment of the invention discloses an unmanned aerial vehicle-based auxiliary intelligent cultivation management system, which comprises the following components:
the receiving module is used for receiving first image information acquired when the first unmanned aerial vehicle is located at a first flying height and carrying out image preprocessing operation on the first image information;
The center point determining module is used for inputting the preprocessed first image information into a pre-built animal identification model to identify so as to determine each animal image information in the first image information, and determining the center point position of each animal in the first image information according to the animal image information;
the flight track determining module is used for determining density distribution image information of each animal in the first image information according to the center point position of each animal in the first image information and the first image information, carrying out clustering operation on the density distribution image information to determine a plurality of clustering center points in the density distribution image information, and determining corresponding flight track information according to the clustering center points;
The image acquisition module is used for sending the flight track information to the first unmanned aerial vehicle to control the first unmanned aerial vehicle to acquire a second image information group and an infrared image information group along a set flight track according to a second flight height, and sending the acquired second image information group and the acquired infrared image information group to the background server, wherein the second image information group comprises a plurality of second image information, the infrared image information group comprises a plurality of infrared image information, and the second image information corresponds to the infrared image information one by one.
The third aspect of the embodiment of the invention discloses an electronic device, comprising a memory storing executable program code and a processor coupled with the memory, wherein the processor calls the executable program code stored in the memory to execute the unmanned aerial vehicle-based auxiliary intelligent cultivation management method disclosed in the first aspect of the embodiment of the invention.
A fourth aspect of the embodiment of the present invention discloses a computer-readable storage medium storing a computer program, where the computer program causes a computer to execute the unmanned aerial vehicle-based auxiliary intelligent cultivation management method disclosed in the first aspect of the embodiment of the present invention.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
According to the unmanned aerial vehicle-based auxiliary intelligent cultivation management method, the system can intelligently plan the flight track of the unmanned aerial vehicle according to the density of the animal distribution. This not only improves the working efficiency of the unmanned aerial vehicle but also ensures the comprehensiveness and accuracy of data acquisition.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of an unmanned aerial vehicle-based auxiliary intelligent cultivation management method disclosed by the embodiment of the invention;
FIG. 2 is a schematic diagram of a process for obtaining first image information according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a process for constructing an animal identification model according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of an unmanned aerial vehicle-based auxiliary intelligent cultivation management system according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that the terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present invention are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order. The terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
At present, as the breeding industry continues to develop toward large scale and industrialization, many large breeding enterprises have introduced informationized and intelligent breeding management systems. However, most of these systems focus on monitoring and adjusting the cultivation environment to improve cultivation quality. Management of the farm's layout still depends mainly on human experience and professional judgment, and open cultivation environments have not been studied in depth. Smart cultivation therefore still has blind spots and cannot yet provide effective auxiliary breeding functions. On this basis, the embodiment of the invention discloses an unmanned aerial vehicle-based auxiliary intelligent cultivation management method and system, an electronic device and a storage medium, which not only improve the working efficiency of the unmanned aerial vehicle but also ensure the comprehensiveness and accuracy of data acquisition.
Referring to fig. 1, fig. 1 is a schematic flow chart of an unmanned aerial vehicle-based auxiliary intelligent cultivation management method according to an embodiment of the invention. The execution body of the method described in the embodiment of the invention consists of software and/or hardware; it can receive related information in a wired and/or wireless manner and can send certain instructions. It may also have certain processing and storage functions. The execution body may control a plurality of devices, and may be a remote physical server or cloud server with related software, or a local host or server with related software that performs related operations on a device located somewhere, etc. In some scenarios, multiple storage devices may also be controlled; these may be located in the same location as the devices or in different locations. As shown in fig. 1, the unmanned aerial vehicle-based auxiliary intelligent cultivation management method comprises the following steps:
S101, receiving first image information acquired when a first unmanned aerial vehicle is located at a first flying height, and performing image preprocessing operation on the first image information;
s102, inputting the preprocessed first image information into a pre-built animal identification model to identify so as to determine each animal image information in the first image information, and determining the center point position of each animal in the first image information according to the animal image information;
S103, determining density distribution image information of each animal in the first image information according to the center point position of each animal in the first image information and the first image information, clustering the density distribution image information to determine a plurality of clustering center points in the density distribution image information, and determining corresponding flight track information according to the clustering center points;
And S104, sending the flight track information to the first unmanned aerial vehicle to control the first unmanned aerial vehicle to acquire a second image information group and an infrared image information group along the set flight track according to a second flight height, and sending the acquired second image information group and the acquired infrared image information group to a background server, wherein the second image information group comprises a plurality of second image information, the infrared image information group comprises a plurality of infrared image information, and the second image information corresponds to the infrared image information one by one.
The unmanned aerial vehicle-assisted intelligent cultivation management method can rapidly identify the animals in an image and determine their positions by receiving and processing the image information acquired by the first unmanned aerial vehicle. This markedly improves the monitoring efficiency of the cultivation environment and reduces the time and cost of manual inspection. By determining the position of each animal's center point in the first image information and further calculating the density distribution image information, the distribution of animals in the cultivation environment can be accurately grasped, which helps farmers or managers feed and manage the animals properly and ensures their healthy growth. According to the density of the animal distribution, the system can intelligently plan the flight track of the unmanned aerial vehicle, which not only improves the working efficiency of the unmanned aerial vehicle but also ensures the comprehensiveness and accuracy of data acquisition. By collecting the second image information group and the infrared image information group in one-to-one correspondence, the system acquires more comprehensive and abundant cultivation environment data; the infrared image information can reveal key information such as animal body temperature and provides important references for cultivation management. The whole management flow is automated and intelligent, which reduces human intervention and error and improves the efficiency and accuracy of cultivation management. Meanwhile, by acquiring and analyzing cultivation environment data in real time, problems can be found and addressed in time, reducing cultivation risk.
In practice, animals may gather in a certain area. If images were collected in a conventional fixed sequence, some empty images, i.e., images containing no animals, would be produced, reducing the overall image acquisition efficiency. The implementation therefore determines the animal concentrations through clustering and then derives the flight route from the concentration center points, which maximizes the overall monitoring efficiency and reduces the use cost of the overall system.
In implementation, the distance from the animals in each sub-region to the outermost edge of that sub-region needs to be determined, because at the second image acquisition the corresponding animals must still be within the corresponding region and must not have run into other regions, which would easily confuse the population statistics. The movement speed of the unmanned aerial vehicle is therefore kept greater than the movement speed of the animals, so that the unmanned aerial vehicle can reach a specific image acquisition point more quickly and acquire the most accurate image data.
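The timing constraint above (the drone must reach a sub-region before any animal can cross its nearest boundary) can be sketched as follows. The axis-aligned sub-region, the straight-line travel model, and the scalar speeds are illustrative assumptions:

```python
import numpy as np

def capture_feasible(animals, box, drone_pos, v_drone: float,
                     v_animal: float) -> bool:
    """Return True when the drone can arrive at the sub-region's center
    before the animal nearest to an edge could leave the sub-region.

    `animals` are (x, y) points inside the axis-aligned `box`
    (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = box
    pts = np.asarray(animals, dtype=float)
    # distance from each animal to the closest edge of the sub-region
    edge = np.minimum.reduce([pts[:, 0] - xmin, xmax - pts[:, 0],
                              pts[:, 1] - ymin, ymax - pts[:, 1]])
    t_escape = edge.min() / v_animal            # earliest possible exit
    center = np.array([(xmin + xmax) / 2, (ymin + ymax) / 2])
    t_arrive = np.linalg.norm(np.asarray(drone_pos, float) - center) / v_drone
    return bool(t_arrive < t_escape)
```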
According to the scheme provided by the embodiment of the invention, different unmanned aerial vehicle travel routes are determined for different heights. At the higher altitude, only image information of the corresponding points needs to be acquired for quantity statistics. Beyond counting, the current motion state of the animal group can also be analyzed: the images acquired by the unmanned aerial vehicle at the first height can be used for recognition, but video information is required, because only video can capture the density changes over time. Key frames are extracted from the video, each key frame is divided into 100 square regions, the number of animals in each region is counted to form a feature vector, and the feature change between adjacent frames is evaluated to determine the movement pattern of the animal group.
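The 100-region feature vector and adjacent-frame comparison can be sketched as follows. This is a minimal illustration: the 10×10 grid layout, row-major flattening, and L1 distance as the change measure are assumptions consistent with, but not specified by, the text.

```python
def grid_counts(positions, width, height, n=10):
    """Count animals per cell of an n x n grid over one key frame; returns
    a flat length n*n feature vector (row-major order)."""
    counts = [0] * (n * n)
    for x, y in positions:
        col = min(int(x / width * n), n - 1)   # clamp points on the far edge
        row = min(int(y / height * n), n - 1)
        counts[row * n + col] += 1
    return counts

def frame_change(prev_counts, cur_counts):
    """L1 distance between adjacent key-frame feature vectors: a rough
    measure of how much the herd's density layout has shifted."""
    return sum(abs(a - b) for a, b in zip(prev_counts, cur_counts))
```

A rising `frame_change` across consecutive key frames would indicate the herd is on the move; a near-zero value indicates a stable distribution.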
In implementation, the boundary area is determined based on the movement state information and movement point position information of the animals, and the boundary distance is measured from the nearest animal so as to determine the frame selection range.
More preferably, after the second image information set and the infrared image information set are sent to the background server, the method further includes:
S105, carrying out feature recognition on each piece of second image information in the second image information group according to a pre-constructed object recognition model so as to determine corresponding object information and object position information corresponding to the object information;
S106, matching the determined object information with the constructed object-animal number relation table to determine corresponding animal number information;
S107, performing temperature analysis on the infrared image information to determine the temperature information of the corresponding target object, associating the animal number information with the temperature information according to the target position information, and, if the temperature information of a target object falls outside the set threshold range, sending the number of the animal with abnormal temperature to the cultivation manager.
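Steps S105–S107 amount to a lookup-and-threshold pipeline, sketched below. The marker names, the animal-number format, and the 37.5–39.5 °C default range (a plausible cattle body-temperature band) are illustrative assumptions, not values from the patent.

```python
def temperature_alerts(detections, marker_to_animal_id, temp_range=(37.5, 39.5)):
    """detections: (marker, temperature_c) pairs from the matched visible
    and infrared frames. Returns (animal_number, temperature) for every
    recognized animal whose temperature falls outside the threshold range."""
    lo, hi = temp_range
    abnormal = []
    for marker, temp_c in detections:
        animal_id = marker_to_animal_id.get(marker)  # S106: marker -> number table
        if animal_id is not None and not (lo <= temp_c <= hi):
            abnormal.append((animal_id, temp_c))     # S107: out-of-range alert
    return abnormal
```

The returned list is what would be forwarded to the cultivation manager; unrecognized markers are simply skipped rather than alerted on.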
According to the embodiment, through the pre-constructed object identification model, the system can accurately identify the object in the second image information set and determine the corresponding object position information. This greatly enhances the ability to monitor specific targets in the management of farming. By matching the identified object information to the constructed object-animal number relationship table, the system is able to quickly determine the animal number information associated with the object. The step effectively correlates the image recognition result with the culture management database, and provides a basis for subsequent data analysis and processing.
By performing temperature analysis on the infrared image information, the temperature of the target object can be obtained in real time and associated with the animal number information. When the temperature of a target object (usually an animal) does not match the set threshold range, the system can quickly identify the animal with abnormal temperature and send the relevant information to the cultivation manager. This allows animal health problems to be discovered in time and provides timely, effective early warning for cultivation management. Introducing these steps makes cultivation management more refined: by acquiring and analyzing key data in the culture environment (such as animal positions and temperatures) in real time, the system can provide more comprehensive, accurate, and timely management advice for cultivation staff, helping them grasp the cultivation situation better and improving cultivation efficiency and quality.
The whole process realizes high automation and intellectualization, reduces manual intervention and errors, and improves the efficiency and the accuracy of cultivation management. Meanwhile, through real-time data analysis and early warning functions, the system can autonomously find and process problems in the cultivation process, and the cultivation risk is reduced.
In implementation, each animal needs to be marked on the back; the mark may use color, shape, or a combination of both as the identifying feature of the corresponding animal, so that information can be associated with individuals for convenient follow-up management.
More preferably, as shown in fig. 2, the receiving the first image information collected when the first unmanned aerial vehicle is located at the first flight altitude, and performing an image preprocessing operation on the first image information includes:
S1011, receiving a first image information group acquired when the first unmanned aerial vehicle is located at a first flying height, wherein the first image information group comprises a plurality of first image information, and the first image information group is acquired through the following steps:
S1012, acquiring first image information shot at a corresponding acquisition point through an onboard camera of the first unmanned aerial vehicle, and acquiring attitude information corresponding to the first image information through an attitude sensor;
S1013, determining, according to the attitude data, whether the shooting state of the onboard camera when shooting the first image information meets a preset jitter condition;
S1014, when the shooting state is determined to meet the preset jitter condition, marking the first image information, and performing a graying operation on the first image information meeting the jitter condition to obtain a first grayscale image;
And S1015, performing contour feature recognition on the first gray level image by adopting a background splitting model to determine contour image information of each animal in the first image information.
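The jitter check of step S1013 can be sketched as a simple angular-rate test over attitude samples recorded around the shutter instant. The sample format, the roll/pitch-only model, and the 5°/s threshold are all illustrative assumptions.

```python
def exceeds_jitter(attitude_samples, max_rate_deg_s=5.0):
    """attitude_samples: (timestamp_s, roll_deg, pitch_deg) tuples recorded
    around the shutter instant. Returns True (frame flagged as jittered)
    when the angular rate of roll or pitch exceeds the threshold."""
    for (t0, r0, p0), (t1, r1, p1) in zip(attitude_samples, attitude_samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # ignore out-of-order or duplicate timestamps
        if abs(r1 - r0) / dt > max_rate_deg_s or abs(p1 - p0) / dt > max_rate_deg_s:
            return True
    return False
```

Frames for which this returns True would be the ones marked in step S1014 before graying and contour extraction.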
According to the embodiment, the image information and the attitude information are acquired through the onboard camera and the attitude sensor of the first unmanned aerial vehicle, so that the environmental state during image acquisition, such as shaking of the unmanned aerial vehicle, can be considered in subsequent processing, improving the accuracy and integrity of data processing. In step S1013, whether the image is blurred due to unmanned aerial vehicle shake is determined based on the attitude data. In step S1014, the graying operation is performed on images satisfying the jitter condition, which reduces the data processing load, mitigates image blurring caused by shake to some extent, and improves image quality. In step S1015, contour features of the grayscale image are identified using the background splitting model, so that the contour information of the animals in the image can be accurately extracted. This provides important basic data for subsequent target identification, positioning, temperature analysis, and other work, and enhances the accuracy of cultivation management. The preprocessing flow is highly automated, reducing manual intervention and improving processing efficiency. Meanwhile, intelligent image processing and feature recognition provide a more intelligent and efficient means for cultivation management. The preprocessed image information is clearer and more accurate, providing more reliable data support for cultivation staff, helping them grasp the cultivation situation better and make management decisions in time, thereby improving cultivation management efficiency.
The more preferable image preprocessing flow effectively improves the accuracy and usability of the image information by introducing technical means such as gesture information, shake correction, graying processing, contour feature recognition and the like, and provides more solid data base and technical support for subsequent cultivation management.
The background splitting model mainly processes images by setting different gray-level thresholds, and sets a tolerance value to identify the grayed image and obtain the animal contour image. The tolerance value can be set according to actual requirements; when the color difference between foreground and background is small, the tolerance can be set relatively small so that extraction remains reliable. Because only contours are collected, and grassland and grazing animals (such as cattle and sheep) show strong color contrast, obtaining contour images with a tolerance value is fast to compute and does not require building an animal model, making it a more efficient way of obtaining contours.
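The graying plus tolerance-thresholding idea can be shown on a tiny synthetic image. This is a sketch only: the BT.601 luma weights are standard, but the nested-list image format, the estimated background gray level, and the tolerance value are illustrative assumptions (a real pipeline would use an image library).

```python
def to_gray(rgb_image):
    """ITU-R BT.601 luma conversion of an RGB image given as nested lists."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for r, g, b in row]
            for row in rgb_image]

def foreground_mask(gray_image, background_gray, tolerance):
    """1 where the pixel differs from the estimated background gray level
    by more than the tolerance (candidate animal pixels), else 0."""
    return [[1 if abs(px - background_gray) > tolerance else 0 for px in row]
            for row in gray_image]
```

With grass-green background pixels and light-colored animals, even a coarse background gray estimate separates the two, which is why no trained animal model is needed for contour extraction.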
More preferably, as shown in fig. 3, the animal identification model is constructed by the following steps:
S1021, acquiring sample image information, wherein the sample image information is a sample image shot by the unmanned aerial vehicle at a first set height, and marking the appearing animal in a manual marking mode;
S1022, dividing the sample image information to obtain a model training set and a model testing set;
S1023, inputting the model training set into a pre-constructed animal identification model to identify, and continuously adjusting network parameters until the animal identification model meets set conditions;
S1024, inputting the model test set into an animal identification model meeting the set conditions to carry out test verification so as to determine whether the set requirements are met, if yes, outputting a corresponding animal identification model, and if no, returning to continue training.
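Step S1022's split into training and test sets can be sketched as a reproducible shuffle-and-slice. The 20% test ratio and the fixed seed are illustrative defaults, not values from the patent.

```python
import random

def split_samples(samples, test_ratio=0.2, seed=42):
    """Shuffle labeled sample images reproducibly and split them into a
    training set and a held-out test set (as in step S1022)."""
    rng = random.Random(seed)
    shuffled = samples[:]          # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n_test = max(1, int(len(shuffled) * test_ratio))
    return shuffled[n_test:], shuffled[:n_test]
```

Keeping the seed fixed makes the train/test partition repeatable across the retrain iterations described in step S1024.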
According to the embodiment, the sample image shot by the unmanned aerial vehicle at the first set height is used as the basic data of model training, so that the model can be ensured to learn the image characteristics of the animal in the real environment. Meanwhile, the method adopts a manual labeling mode to label the appearing animals, so that the accuracy and the reliability of sample data are ensured. The sample image information is divided into a model training set and a model testing set, and a common training-testing flow in machine learning is followed. And then, testing and verifying the trained model by using a test set to ensure the accuracy and generalization capability of the model in practical application. In the training process, the automatic optimization of the model is realized by continuously adjusting network parameters until the animal identification model meets the set conditions. The method not only can improve the recognition accuracy of the model, but also can reduce manual intervention and improve the efficiency and stability of model training.
In the model test stage, the model is verified through a test set, so that the model can meet the set requirements in practical application. If the model fails to meet the requirements, returning to continue training until the performance of the model reaches the preset standard. This iterative optimization approach ensures a high degree of reliability and stability of the final output animal identification model.
By constructing a high-quality animal identification model, the intelligent monitoring and management of the culture environment can be realized. The model can automatically identify animals in the breeding environment, classify and count the animals, provide accurate data support for breeding management staff, help the breeding management staff to master the breeding situation better, and improve the breeding efficiency and management level. The application range of the animal model can be enlarged by constructing the animal model, and the animal model is not limited by places.
More preferably, the inputting the preprocessed first image information into a pre-constructed animal identification model for identification to determine each animal image information in the first image information includes:
Dividing the preprocessed first image information into a first identification area, a second identification area and a third identification area, wherein the first identification area, the second identification area and the third identification area have the same center point, and the distances from the edges of the first identification area, the edges of the second identification area and the edges of the third identification area to the center point are sequentially increased;
inputting the first identification area, the second identification area and the third identification area into a pre-built first animal identification model, a pre-built second animal identification model and a pre-built third animal identification model respectively to identify so as to determine each animal image information in the first image information;
In the process that the first unmanned aerial vehicle collects the second image information group and the infrared image information group along the set flight track according to the second flight height, sensing parameters of all areas are obtained through a sensor assembly arranged at the first unmanned aerial vehicle, the sensor assembly comprises a gas sensor, a temperature sensor and a humidity sensor, and environmental parameter information of the corresponding areas is determined according to the sensing parameters of all the areas.
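The three concentric recognition regions described above can be sketched as a per-pixel region assignment. Using Chebyshev distance keeps the regions rectangular, which matches image crops; that choice, the function name, and the radii are assumptions for illustration.

```python
def region_index(x, y, cx, cy, r1, r2):
    """Assign a pixel to one of three concentric recognition regions that
    share the image center point: 0 = innermost region (first model),
    1 = middle ring (second model), 2 = outer ring (third model)."""
    d = max(abs(x - cx), abs(y - cy))  # Chebyshev distance -> square rings
    if d <= r1:
        return 0
    if d <= r2:
        return 1
    return 2
```

Each detected animal crop would then be routed to the animal identification model matching its region index, so the outer models can specialize in the oblique, angle-deviated views.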
Because the unmanned aerial vehicle does not capture every animal from directly above, contour models need to be built for different areas of the image. For example, at places far from the image center, cattle may appear side-on, so an angle deviation exists. In implementation, recognition can be performed with different models assigned to different distance bands, which facilitates contour model construction and rapid subsequent recognition, greatly improving recognition accuracy and efficiency.
In addition to the above-mentioned angle deviation, two animals may stand close together in implementation; this is more likely at places farther from the unmanned aerial vehicle, where the angle deviation makes animals appear to overlap. A model therefore also needs to be built for the case where several cattle overlap in the image. In specific implementation, when a detected cattle contour is larger than a set value, the animal-overlap recognition model is activated, which improves recognition accuracy.
According to the scheme provided by the embodiment of the invention, the sensor assembly (comprising the gas sensor, the temperature sensor and the humidity sensor) arranged at the first unmanned aerial vehicle can acquire the environmental parameter information of each area in real time. These environmental parameters are of great importance for understanding the health of the farming environment, predicting animal behaviour, and developing scientific management strategies. Through real-time monitoring, the cultivation manager can timely find out the environment abnormality and take corresponding measures to intervene, so that the stability of the cultivation environment and the health of animals are ensured.
And by combining animal identification and environmental parameter monitoring results, intelligent decision support can be provided for culture management staff. For example, the relationship between animal distribution and environmental parameters can be analyzed to optimize the culture density and the feed delivery strategy, and by predicting animal behaviors, preventive measures can be taken in advance to avoid the problems of animal escape, disease transmission and the like.
More preferably, the cultivation management method further includes:
receiving motion sensing signals acquired by motion sensors arranged at all animals;
and determining the animal health state according to the animal motion state identified by the motion sensing signal.
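A minimal sketch of deriving a health flag from motion-sensing signals is shown below, assuming the sensor delivers accelerometer triples. The gravity-removal heuristic, the activity thresholds, and the flag labels are all illustrative assumptions, not the patent's method.

```python
def activity_level(accel_samples):
    """Mean magnitude of (ax, ay, az) accelerometer samples in m/s^2,
    with gravity (about 9.8 m/s^2) removed, as a crude liveness score."""
    g = 9.8
    mags = [abs((ax ** 2 + ay ** 2 + az ** 2) ** 0.5 - g)
            for ax, ay, az in accel_samples]
    return sum(mags) / len(mags)

def health_flag(accel_samples, low=0.05, high=6.0):
    """Flag animals whose activity is abnormally low (possible illness)
    or abnormally high (possible distress)."""
    a = activity_level(accel_samples)
    if a < low:
        return "low-activity"
    if a > high:
        return "agitated"
    return "normal"
```

In practice the thresholds would be calibrated per species and per time of day, since normal resting periods would otherwise trigger low-activity flags.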
In the more preferable cultivation management method, a function of providing a motion sensor on the animal body to collect motion sensing signals and identifying the motion state of the animal by the signals so as to determine the health state of the animal is introduced. By arranging the motion sensor on the animal body, the motion state of the animal can be monitored continuously in real time. The real-time data acquisition mode enables a breeding manager to timely discover possible health problems of animals, so that measures can be taken rapidly. The data collected by the motion sensor can be accurately processed and analyzed for identifying the motion state of the animal. These status information, such as liveness, movement patterns, etc., are important indicators for assessing the health of animals. Thus, the health status of the animal can be more accurately assessed by this method.
Traditional cultivation management methods often require that cultivation personnel regularly observe the behaviors and signs of animals, which is time-consuming and labor-consuming, and may cause inaccurate assessment results due to human factors. And the data are collected and analyzed through the motion sensor, so that the management efficiency can be greatly improved, and the labor input is reduced. The cultivation management method based on the motion sensor data embodies the idea of intelligent cultivation. The animals are monitored and managed by means of automation and data, so that the cultivation process is more scientific and standard, and meanwhile, the cultivation efficiency and benefit are improved. By monitoring and analyzing the animal motion state in real time, the possible health problems of the animal can be found in time, and early warning is carried out. This helps the cultivation manager to take measures in advance to prevent the problem from deteriorating.
In summary, the more preferable cultivation management method collects the motion sensing signals of the animals by introducing the motion sensor, so that real-time and accurate monitoring and evaluation of the health state of the animals are realized, the efficiency and the intelligent level of cultivation management are improved, and better guarantee is provided for the health of the animals.
Animal behavioral parameters refer to quantitative indicators used to describe and monitor animal behavioral characteristics during the course of cultivation. These parameters can help the breeder to know whether the animal's health, welfare level, and feeding environment are appropriate. The following are some common animal behavioral parameters that can be selected and monitored according to different farmed animals and specific needs:
Activity level, frequency, speed and distance of movement of the animal, reflects the activity level and energy expenditure of the animal. The animal activity data is recorded and analyzed by means of motion sensors, video monitoring systems and the like. The change in activity level may be related to the health condition, disease state or discomfort of the feeding environment of the animal.
Eating behavior, i.e., frequency of eating, food intake, and water intake, of an animal reflects the appetite and nutrient intake of the animal. The intelligent feeder and the drinking water equipment are used, so that the diet data of animals can be automatically recorded. Abnormal eating behavior may be a sign of illness or poor feeding conditions for animals.
Sleep and rest behavior-sleep duration, sleep quality, and rest posture of an animal reflect the comfort and physiological state of the animal. Sleep and rest behaviors of animals are observed and analyzed by video monitoring and animal activity sensors. Inadequate sleep and rest may affect the growth, immunity and reproductive ability of the animal.
Social behavior, interaction, communication and interaction between animals, reflects social needs and group structure of animals. A video surveillance system is used to observe and analyze social behavior of animals. Abnormal social behavior may be a sign of stress, fight, or discomfort between animals.
Self-care behavior, namely, the behavior of the animal to clean, comb and decorate the body surface of the animal, and reflects the self-care capability and health condition of the animal. Animals were assessed for self-care behavior by video monitoring and direct observation. A reduction in self-care behavior may mean that the animal has health problems or that the feeding environment is poor.
Abnormal behavior-unusual or abnormal behavior that is not compatible with normal behavior patterns, such as excessive anxiety, aggression enhancement, excessive ringing, etc. Abnormal behavior is identified and recorded by video surveillance, voice analysis, and direct observation. Abnormal behavior may be a direct manifestation of illness, injury, or feeding environmental discomfort to the animal, requiring timely intervention and treatment.
In monitoring and analyzing these animal behavioral parameters, the data are detected by motion sensors (e.g., gyroscopic sensors) to better understand the animal's behavioral patterns and health. These parameters can provide important decision support for breeders, helping them optimize the feeding environment, increase animal welfare levels, and reduce disease risk. In implementation, proximity sensors can also be arranged at the important points of the pasture mentioned above (such as water sources and grazing areas) to determine the actual behavioral intention of the corresponding free-range animal.
Animal behavioral parameters are quantitative indicators used to describe and monitor animal behavioral characteristics during the breeding process. The following are some common animal behavioral parameters that help the breeder to understand the health, welfare level and proper feeding environment of the animal.
And the activity level parameter is moving distance, namely the moving distance of the animal in a certain time is measured through equipment such as a motion sensor and the like, and the activity degree and the energy consumption of the animal are reflected. Behavioral frequency-the frequency of an animal performing a particular action (e.g., standing, lying, walking, etc.) is recorded, and its daily activity pattern is assessed.
Eating frequency, namely recording the eating times of the animal per day or meal, and knowing the appetite of the animal. Food intake by measuring the food intake of animals using intelligent feeders and other devices, and assessing their nutritional needs and health status. Water intake of monitoring animals, and judging water demand and health condition.
And (3) social behavior parameters, namely, interaction frequency, namely, detecting interaction behaviors (such as mutual combing, game and the like) among animals, recording the occurrence frequency and duration of the interaction behaviors, and evaluating social demands and group structures of the animals.
Self-care behavior parameters: combing behavior, i.e., the frequency and duration of grooming, is recorded to evaluate the animal's self-care ability and health condition. Since grooming produces characteristic fluctuations in the motion sensor signal, the behavioral parameters of the respective animal can be determined from the sensor's detection signal.
Cleaning behavior, namely detecting the cleaning behavior (such as face washing, sand-removing bath and the like) of animals and knowing the sanitary habit and health condition of the animals.
More preferably, after determining the corresponding flight trajectory information according to the cluster center point, the method further includes:
performing signal disassembly on the flight track information to determine the position information of each acquisition point in the flight track information, and generating corresponding throwing information according to the position information of each acquisition point;
the flight track information and the throwing information are sent to a second unmanned aerial vehicle to enable the second unmanned aerial vehicle to throw in medicines or feeds at corresponding acquisition points;
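The disassembly of flight track information into per-point delivery instructions can be sketched as follows. The waypoint format, the dictionary fields, and the fixed payload per point are illustrative assumptions.

```python
def build_delivery_plan(trajectory, feed_kg_per_point=2.0):
    """Disassemble a flight trajectory (an ordered list of (lat, lon)
    waypoints) into per-point delivery instructions for the second UAV."""
    return [
        {"seq": i, "lat": lat, "lon": lon, "payload_kg": feed_kg_per_point}
        for i, (lat, lon) in enumerate(trajectory)
    ]
```

The resulting plan, together with the trajectory itself, is what would be transmitted to the second unmanned aerial vehicle for drug or feed delivery at each acquisition point.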
The scheme can also be applied before clustering: by dispensing food, the animals are gathered within a certain area, which makes the subsequent image acquisition more stable and accurate and reduces the possibility of deviation. In implementation, the grassland state can also be identified, and the animals can be guided to a desired area by dispensing food or other items that attract them; for example, animals can be moved from area A, where the forage is poor, to area B, where the forage is lush. Such guidance makes the cultivation more scientific and facilitates maintenance of the pasture.
In the more preferable cultivation management method, the functions of signal disassembly of flight track information, generation of throwing information and throwing of medicines or feeds through the second unmanned aerial vehicle are introduced. The technical effects corresponding to the method are mainly as follows:
By signal disassembly of the flight path information, accurate position information of each acquisition point can be determined. Based on the delivery information generated by the position information, the medicine or the feed can be ensured to be accurately delivered to the target position, and the waste and the misdelivery of resources are avoided. Traditional drug or feed delivery often requires manual handling, which is not only inefficient, but may also present a safety risk. Automatic throwing is carried out through the second unmanned aerial vehicle, so that throwing efficiency and safety can be greatly improved, and the requirement of manual intervention is reduced. By combining the flight trajectory information and the throwing information, intelligent decision support can be provided for the cultivation manager. For example, the position and the amount of the delivery point can be dynamically adjusted according to the distribution condition of the animals and the environmental parameter information so as to achieve the optimal management effect. Through accurate throwing, the medicine or the feed can be ensured to be fully utilized, and the waste of resources is reduced. Meanwhile, as the selection of the delivery point is more reasonable, the influence on the environment can be reduced.
The intelligent cultivation management method further comprises the following steps:
Generating a corresponding key pair according to the identity of the client in a client management module, wherein the key pair comprises an encryption key and a decryption key, extracting the encryption key from the key pair and sending the encryption key to the client corresponding to the identity, extracting the decryption key from the key pair, binding the decryption key and the identity to generate decryption information, and sending the decryption information to a client management node at a blockchain system;
Dividing the collected culture management information into a plurality of public data packets and private data packets at a data management module, and encrypting the private data packets by using the encryption key, wherein the public data packets comprise identification marks and timestamp information of the culture management information, and the private data packets comprise culture quantity, animal health status and animal number information;
When the culture management information of the data management module is obtained, the culture management node extracts the public data packet and the private data packet from each culture data block according to the corresponding timestamp information and the identity of the client, inquires the decryption information according to the identity, extracts the corresponding decryption key to decrypt the private data packet, and restores the culture management information by using a preset data template based on the public data packet and the private data packet.
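The public/private packet split and private-packet encryption can be sketched as below. This is a toy illustration only: the field names are assumptions, and the XOR keystream stands in for the real asymmetric key pair the text describes — it is not cryptographically secure and must not be used in production.

```python
import hashlib
import json

def _keystream(key: bytes, n: int) -> bytes:
    """Deterministic keystream derived from a key. Placeholder for the
    asymmetric encryption/decryption key pair; NOT secure."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def split_and_encrypt(record, key: bytes):
    """Split a farming record into a plaintext public packet (identifier
    and timestamp) and an encrypted private packet (head count, health
    status, animal numbers), as in the data management module."""
    public = {"id": record["id"], "timestamp": record["timestamp"]}
    private_plain = json.dumps(
        {k: record[k] for k in ("head_count", "health", "animal_ids")}
    ).encode()
    cipher = bytes(b ^ s for b, s in
                   zip(private_plain, _keystream(key, len(private_plain))))
    return public, cipher

def decrypt_private(cipher: bytes, key: bytes):
    """Recover the private packet with the matching decryption key."""
    plain = bytes(b ^ s for b, s in zip(cipher, _keystream(key, len(cipher))))
    return json.loads(plain)
```

The public packet stays queryable on-chain by identifier and timestamp, while only holders of the bound decryption key can restore the sensitive fields and rebuild the full record from the preset data template.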
Since information of each pasture and status information of animals cultivated in the pasture are important confidential information in the smart cultivation process, it is necessary to encrypt the information to prevent others from obtaining the information. The intelligent culture management method realizes safe, reliable and efficient management of culture management information by introducing the blockchain technology, the client management module and the data management module. The following are the corresponding technical effects:
the client management module is used for generating a corresponding key pair (an encryption key and a decryption key) according to the identity of the client, so that the security in the data transmission process is ensured. The encryption key is sent to the client, and the decryption key is stored in the client management node of the blockchain system after being bound with the identity, so that the confidentiality of data is enhanced. The distributed storage of the block chain system ensures that the data is not easy to tamper, and improves the credibility of the data.
The flexibility and efficiency of data management are that the breeding management information is split into public data packets and private data packets, and the public data packets contain identity identification and time stamp information, so that quick positioning and searching are facilitated. The private data packet contains key information such as the number of the cultivated animals, the health status of the animals, the number of the animals and the like, and is encrypted by an encryption key, so that the safety of the data is ensured. The distributed storage is in each culture data block of the block chain system, so that the access and the acquisition of the data are more flexible and efficient.
And the intelligent management and decision support is that the cultivation management information is restored through a preset data template, so that a manager can conveniently acquire required data to monitor and decide cultivation activities. By combining the timestamp information and the identity, the related culture data can be quickly and accurately obtained, and powerful support is provided for management decision.
Data traceability and credibility, namely due to the characteristics of the blockchain technology, the data stored in the block chain technology has non-tamper property and traceability, and the authenticity and credibility of the culture management information are ensured. The source and authenticity of the data can be conveniently traced and verified when disputes occur or verification is required. In summary, the intelligent cultivation management method realizes safe, reliable and efficient management of cultivation management information by introducing the blockchain technology, the client management module and the data management module. The method not only improves the safety of the data, but also improves the flexibility and efficiency of data management, and provides powerful support for the development of the breeding industry.
According to the unmanned aerial vehicle-based auxiliary intelligent breeding management method, the system can intelligently plan the flight trajectory of the unmanned aerial vehicle according to the density information of the animal distribution. This not only improves the working efficiency of the unmanned aerial vehicle but also ensures the comprehensiveness and accuracy of data acquisition.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an unmanned aerial vehicle-based intelligent cultivation management system according to an embodiment of the invention. As shown in fig. 4, the unmanned aerial vehicle-based intelligent farming management system may include:
the receiving module 21 is used for receiving first image information acquired when the first unmanned aerial vehicle is located at a first flying height and performing image preprocessing operation on the first image information;
a center point determining module 22, configured to input the preprocessed first image information into a pre-built animal recognition model to perform recognition to determine each animal image information in the first image information, and determine a center point position of each animal in the first image information according to the animal image information;
The flight trajectory determining module 23 is configured to determine density distribution image information of each animal in the first image information according to the center point position of each animal in the first image information and the first image information, perform a clustering operation on the density distribution image information to determine a plurality of clustering center points in the density distribution image information, and determine corresponding flight trajectory information according to the clustering center points;
The image acquisition module 24 is configured to send the flight trajectory information to the first unmanned aerial vehicle to control the first unmanned aerial vehicle to acquire a second image information group and an infrared image information group along the set flight trajectory at a second flight altitude, and to send the acquired second image information group and infrared image information group to a background server, where the second image information group includes a plurality of second image information, the infrared image information group includes a plurality of infrared image information, and the second image information and the infrared image information are in one-to-one correspondence.
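The clustering and trajectory planning performed by modules 22 to 24 can be illustrated with a small stand-alone sketch: plain k-means over animal center points followed by a greedy nearest-neighbour ordering of the cluster centers. The patent does not name the clustering algorithm or the route heuristic, so both choices here are assumptions.

```python
import math
import random

def kmeans(points, k, iters=30, seed=0):
    """Plain k-means over animal center points (toy stand-in for the
    clustering of the density-distribution image)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centers[j]))
            groups[i].append(p)
        # Recompute each center as its group's mean; keep the old
        # center when a group happens to be empty.
        centers = [
            (sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
            if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

def flight_order(centers, start=(0.0, 0.0)):
    """Greedy nearest-neighbour ordering of cluster centers into a
    flight trajectory (one plausible planning heuristic)."""
    remaining, pos, route = list(centers), start, []
    while remaining:
        nxt = min(remaining, key=lambda c: math.dist(pos, c))
        remaining.remove(nxt)
        route.append(nxt)
        pos = nxt
    return route

# Two tight herds; the planner should produce one center per herd.
herd_a = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1)]
herd_b = [(9.0, 9.0), (9.1, 8.8), (8.9, 9.2)]
centers = kmeans(herd_a + herd_b, k=2)
route = flight_order(centers)
assert len(route) == 2
```

With two well-separated herds the planner first visits the center of the herd nearer the start point, then the other, which is the behaviour the flight trajectory module needs for full coverage.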
According to the unmanned aerial vehicle-based auxiliary intelligent breeding management system, the flight trajectory of the unmanned aerial vehicle can be intelligently planned according to the density information of the animal distribution. This not only improves the working efficiency of the unmanned aerial vehicle but also ensures the comprehensiveness and accuracy of data acquisition.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device may be a computer, a server, or the like; it may also be an intelligent device such as a mobile phone, a tablet computer, or a monitoring terminal, or an image acquisition device having a processing function. As shown in fig. 5, the electronic device may include:
A memory 510 storing executable program code;
A processor 520 coupled to the memory 510;
Wherein the processor 520 invokes the executable program code stored in the memory 510 to perform some or all of the steps in the unmanned aerial vehicle-based auxiliary intelligent cultivation management method of the first embodiment.
The embodiment of the invention discloses a computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute some or all of the steps in the unmanned aerial vehicle-based auxiliary intelligent cultivation management method of the first embodiment.
The embodiment of the invention also discloses a computer program product, which, when run on a computer, causes the computer to execute some or all of the steps in the unmanned aerial vehicle-based auxiliary intelligent cultivation management method of the first embodiment.
The embodiment of the invention also discloses an application release platform for releasing a computer program product, which, when run on a computer, causes the computer to execute some or all of the steps in the unmanned aerial vehicle-based auxiliary intelligent cultivation management method of the first embodiment.
In the various embodiments of the present invention, it should be understood that the sequence numbers of the processes do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present invention.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiments.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-accessible memory. Based on this understanding, the technical solution of the present invention, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, and in particular may be a processor in a computer device) to execute some or all of the steps of the methods according to the embodiments of the present invention.
In the embodiments provided herein, it should be understood that "B corresponding to A" means that B is associated with A and can be determined from A. It should also be understood that determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
Those of ordinary skill in the art will appreciate that some or all of the steps of the various methods of the described embodiments may be implemented by hardware associated with a program that may be stored in a computer-readable storage medium, including Read-Only Memory (ROM), Random Access Memory (RAM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), One-Time Programmable Read-Only Memory (OTPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM) or other optical disk memory, magnetic disk memory, tape memory, or any other medium capable of carrying or storing data.
The unmanned aerial vehicle-based intelligent cultivation management method, system, electronic device, and storage medium disclosed in the embodiments of the present invention are described in detail above. Specific examples are used herein to illustrate the principles and implementations of the present invention, and the description of the above embodiments is only intended to help understand the method and its core ideas. Meanwhile, those skilled in the art may, according to the ideas of the present invention, make changes to the specific implementations and application scope; therefore, this disclosure should not be construed as limiting the present invention.

Claims (9)

1. An unmanned aerial vehicle-based auxiliary intelligent cultivation management method is characterized by comprising the following steps of:
receiving first image information acquired when a first unmanned aerial vehicle is located at a first flying height, and performing image preprocessing operation on the first image information;
Inputting the preprocessed first image information into a pre-built animal identification model to identify so as to determine each animal image information in the first image information, and determining the central point position of each animal in the first image information according to the animal image information;
Determining density distribution image information of each animal in the first image information according to the center point position of each animal in the first image information and the first image information, performing clustering operation on the density distribution image information to determine a plurality of clustering center points in the density distribution image information, and determining corresponding flight track information according to the clustering center points;
performing signal disassembly on the flight track information to determine the position information of each acquisition point in the flight track information, and generating corresponding delivery information according to the position information of each acquisition point;
sending the flight track information and the delivery information to a second unmanned aerial vehicle so that the second unmanned aerial vehicle delivers medicine or feed at the corresponding acquisition points;
The flight track information is sent to a first unmanned aerial vehicle to control the first unmanned aerial vehicle to collect a second image information group and an infrared image information group along a set flight track according to a second flight height, and the acquired second image information group and the acquired infrared image information group are sent to a background server, wherein the second image information group comprises a plurality of second image information, the infrared image information group comprises a plurality of infrared image information, and the second image information corresponds to the infrared image information one by one;
Performing feature recognition on each piece of second image information in the second image information group according to a pre-constructed object recognition model so as to determine corresponding object information and object position information corresponding to the object information;
Matching the determined object information with the constructed object-animal numbering relation table to determine corresponding animal numbering information;
Temperature analysis is carried out on the infrared image information to determine temperature information corresponding to the corresponding target object, and animal number information and temperature information are subjected to data association according to the target position information; and if the temperature information corresponding to the target object is not matched with the set threshold range, sending the animal number information with abnormal temperature to a cultivation manager.
2. The unmanned aerial vehicle-based auxiliary intelligent cultivation management method of claim 1, wherein the receiving the first image information acquired when the first unmanned aerial vehicle is located at the first flying height and performing the image preprocessing operation on the first image information comprises:
Receiving a first image information group acquired when a first unmanned aerial vehicle is located at a first flying height, wherein the first image information group comprises a plurality of first image information, and the first image information group is acquired through the following steps:
acquiring first image information shot at a corresponding acquisition point through an onboard camera of the first unmanned aerial vehicle, and acquiring attitude information corresponding to the first image information through an attitude sensor;
determining, according to the attitude information, whether a shooting state of the onboard camera when shooting the first image information meets a preset jitter condition;
when the shooting state is determined to meet the preset jitter condition, marking the first image information, and performing a graying operation on the first image information meeting the jitter condition to obtain a first gray-scale image;
performing contour feature recognition on the first gray-scale image by using a background splitting model to determine contour image information of each animal in the first image information.
3. The unmanned aerial vehicle-based auxiliary intelligent cultivation management method according to claim 2, wherein the animal identification model is constructed by the following steps:
acquiring sample image information, wherein the sample image information is a sample image shot by an unmanned aerial vehicle at a first set height, and marking the appearing animal in a manual marking mode;
dividing the sample image information to obtain a model training set and a model testing set;
inputting the model training set into a pre-constructed animal identification model to identify, and continuously adjusting network parameters until the animal identification model meets set conditions;
And inputting the model test set into an animal identification model meeting the set conditions to carry out test verification so as to determine whether the set requirements are met, outputting a corresponding animal identification model if the set requirements are met, and returning to continue training if the set requirements are not met.
4. The unmanned aerial vehicle-based auxiliary intelligent cultivation management method of claim 1, wherein the inputting the preprocessed first image information into a pre-built animal identification model for identification to determine each animal image information in the first image information comprises:
Dividing the preprocessed first image information into a first identification area, a second identification area and a third identification area, wherein the first identification area, the second identification area and the third identification area have the same center point, and the distances from the edges of the first identification area, the edges of the second identification area and the edges of the third identification area to the center point are sequentially increased;
inputting the first identification area, the second identification area and the third identification area into a pre-built first animal identification model, a pre-built second animal identification model and a pre-built third animal identification model respectively to identify so as to determine each animal image information in the first image information;
In the process that the first unmanned aerial vehicle collects the second image information group and the infrared image information group along the set flight track according to the second flight height, sensing parameters of all areas are obtained through a sensor assembly arranged at the first unmanned aerial vehicle, the sensor assembly comprises a gas sensor, a temperature sensor and a humidity sensor, and environmental parameter information of the corresponding areas is determined according to the sensing parameters of all the areas.
5. The unmanned aerial vehicle-based auxiliary intelligent cultivation management method of claim 1, further comprising:
receiving motion sensing signals acquired by motion sensors arranged at all animals;
and determining the animal health state according to the animal motion state identified by the motion sensing signal.
6. The unmanned aerial vehicle-based auxiliary intelligent cultivation management method of claim 1, further comprising:
Generating a corresponding key pair according to the identity of the client in a client management module, wherein the key pair comprises an encryption key and a decryption key, extracting the encryption key from the key pair and sending the encryption key to the client corresponding to the identity, extracting the decryption key from the key pair, binding the decryption key and the identity to generate decryption information, and sending the decryption information to a client management node at a blockchain system;
Dividing the collected culture management information into a plurality of public data packets and private data packets at a data management module, and encrypting the private data packets by using the encryption key, wherein the public data packets comprise identification marks and timestamp information of the culture management information, and the private data packets comprise culture quantity, animal health status and animal number information;
When the culture management information of the data management module is obtained, the culture management node extracts the public data packet and the private data packet from each culture data block according to the corresponding timestamp information and the identity of the client, inquires the decryption information according to the identity, extracts the corresponding decryption key to decrypt the private data packet, and restores the culture management information by using a preset data template based on the public data packet and the private data packet.
7. An unmanned aerial vehicle-based auxiliary intelligent cultivation management system, characterized by comprising:
the receiving module is used for receiving first image information acquired when the first unmanned aerial vehicle is located at a first flying height and carrying out image preprocessing operation on the first image information;
The center point determining module is used for inputting the preprocessed first image information into a pre-built animal identification model to identify so as to determine each animal image information in the first image information, and determining the center point position of each animal in the first image information according to the animal image information;
the flight track determining module is used for determining density distribution image information of each animal in the first image information according to the center point position of each animal in the first image information and the first image information, carrying out clustering operation on the density distribution image information to determine a plurality of clustering center points in the density distribution image information, and determining corresponding flight track information according to the clustering center points;
performing signal disassembly on the flight track information to determine the position information of each acquisition point in the flight track information, and generating corresponding delivery information according to the position information of each acquisition point;
the flight track information and the delivery information are sent to a second unmanned aerial vehicle so that the second unmanned aerial vehicle delivers medicine or feed at the corresponding acquisition points;
The image acquisition module is used for transmitting the flight track information to the first unmanned aerial vehicle so as to control the first unmanned aerial vehicle to acquire a second image information group and an infrared image information group along a set flight track according to a second flight height, and transmitting the acquired second image information group and the acquired infrared image information group to a background server, wherein the second image information group comprises a plurality of second image information, the infrared image information group comprises a plurality of infrared image information, and the second image information corresponds to the infrared image information one by one;
Performing feature recognition on each piece of second image information in the second image information group according to a pre-constructed object recognition model so as to determine corresponding object information and object position information corresponding to the object information;
Matching the determined object information with the constructed object-animal numbering relation table to determine corresponding animal numbering information;
Temperature analysis is carried out on the infrared image information to determine temperature information corresponding to the corresponding target object, and animal number information and temperature information are subjected to data association according to the target position information; and if the temperature information corresponding to the target object is not matched with the set threshold range, sending the animal number information with abnormal temperature to a cultivation manager.
8. An electronic device comprising a memory storing executable program code and a processor coupled to the memory, the processor invoking the executable program code stored in the memory to perform the unmanned aerial vehicle-based auxiliary intelligent cultivation management method of any one of claims 1 to 6.
9. A computer-readable storage medium storing a computer program, wherein the computer program causes a computer to execute the unmanned aerial vehicle-based auxiliary intelligent cultivation management method according to any one of claims 1 to 6.
CN202410999525.4A 2024-07-24 2024-07-24 Unmanned aerial vehicle-based auxiliary intelligent cultivation management method and system Active CN118942115B (en)

Publications (2)

Publication Number Publication Date
CN118942115A CN118942115A (en) 2024-11-12
CN118942115B true CN118942115B (en) 2025-02-18





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant