CN114366029A - Wearable pet equipment and pet monitoring method - Google Patents
- Publication number
- CN114366029A (application number CN202111659310.0A)
- Authority
- CN
- China
- Prior art keywords
- pet
- data
- unit
- processing unit
- voice
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K13/00—Devices for grooming or caring of animals, e.g. curry-combs; Fetlock rings; Tail-holders; Devices for preventing crib-biting; Washing devices; Protection against weather conditions or insects
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
- A01K29/005—Monitoring or measuring activity, e.g. detecting heat or mating
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/02—Detecting, measuring or recording for evaluating the cardiovascular system, e.g. pulse, heart rate, blood pressure or blood flow
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1112—Global tracking of patients, e.g. by using GPS
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/7405—Details of notification to user or communication with user or patient ; user input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/40—Animals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M21/00—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis
- A61M2021/0005—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus
- A61M2021/0027—Other devices or methods to cause a change in the state of consciousness; Devices for producing or ending sleep by mechanical, optical, or acoustical means, e.g. for hypnosis by the use of a particular sense, or stimulus by the hearing sense
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61M—DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
- A61M2250/00—Specially adapted for animals
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Biophysics (AREA)
- General Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Environmental Sciences (AREA)
- Physiology (AREA)
- Cardiology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Biodiversity & Conservation Biology (AREA)
- Animal Husbandry (AREA)
- Dentistry (AREA)
- Psychology (AREA)
- Radar, Positioning & Navigation (AREA)
- Pulmonology (AREA)
- Acoustics & Sound (AREA)
- Anesthesiology (AREA)
- Hematology (AREA)
- Zoology (AREA)
- Alarm Systems (AREA)
Abstract
The application discloses a wearable pet device and a pet monitoring method. The wearable pet device is worn on a pet and includes an acquisition unit, a processing unit and a voice unit, the acquisition unit and the voice unit each being connected to the processing unit. The acquisition unit collects condition data of the pet and sends the condition data to the processing unit. The processing unit determines from the condition data whether the pet is in an abnormal condition and, when an abnormal condition occurs, controls the voice unit to play corresponding voice content to soothe the pet's emotions and/or direct the pet to escape from danger. The wearable pet device and the pet monitoring method can monitor the pet's condition in real time and take corresponding measures promptly when an abnormal condition arises.
Description
Technical Field
The application relates to the technical field of wearable devices, and in particular to a wearable pet device and a pet monitoring method.
Background
At present, many families keep pets. Because a pet owner cannot stay with a pet all day, once the pet leaves the owner's line of sight it may become suddenly agitated, be attacked, or encounter some other unexpected danger, and its resulting uncontrolled behavior may injure itself or other people. The owner, however, has no way of knowing the pet's condition in real time and therefore cannot take corresponding measures at the first opportunity.
Disclosure of Invention
The main objective of the present application is to provide a wearable pet device that can remotely monitor a pet's condition in real time, soothe the pet promptly when its emotions become abnormal, and promptly direct the pet to escape when it encounters unexpected danger.
An object of the present application is to provide a wearable pet device that is worn on a pet and includes an acquisition unit, a processing unit and a voice unit. The acquisition unit collects condition data of the pet and sends the condition data to the processing unit; the processing unit determines from the condition data whether the pet is in an abnormal condition and, when it is, controls the voice unit to play corresponding voice content. Thus, even when the pet owner is not nearby, the wearable pet device can automatically analyze whether an abnormal condition has occurred and, if so, automatically play corresponding voice content to soothe the pet's emotions and/or promptly direct the pet to escape from danger.
It is an object of the present application to provide a wearable pet device in which the acquisition unit comprises at least one of a motion sensor, a physiological sensor, a sound sensor and a force sensor. The condition data is at least one of motion data of the pet collected by the motion sensor, physiological data of the pet collected by the physiological sensor, sound data of the pet collected by the sound sensor, and impact force data borne by the pet collected by the force sensor. The processing unit can thus determine from the motion data, the physiological data and/or the sound data whether the pet's emotions are abnormal and, if so, control the voice unit to play corresponding voice content to soothe the pet. The processing unit can also determine from the impact force data whether the pet has been struck or impacted by an external force and, if so, control the voice unit to play corresponding voice content to direct the pet to escape from danger.
An object of the present application is to provide a wearable pet device in which the motion data collected by the motion sensor includes, but is not limited to, at least one of step count, motion duration, motion speed and motion acceleration, and the physiological data collected by the physiological sensor includes, but is not limited to, at least one of the pet's body temperature, body surface humidity, heart rate, respiratory rate and electromyographic signal. By combining multiple kinds of motion data and/or physiological data, the processing unit can more accurately identify emotional and physical abnormalities of the pet.
It is an object of the present application to provide a wearable pet device that further comprises a storage unit. Audio for soothing the pet's emotions and for directing the pet to escape from danger can be recorded in advance and stored in the storage unit so that the voice unit can play it.
An object of the present application is to provide a wearable pet device that further includes a communication unit. The processing unit can send the pet's condition data and the determination of whether an abnormal condition has occurred to the pet owner's terminal device through the communication unit, so that the owner can remotely monitor the pet's condition.
An object of the present application is to provide a wearable pet device in which, when the processing unit is connected to the pet owner's terminal device through the communication unit, the processing unit receives, through the communication unit, real-time voice content input by the owner at the terminal device and controls the voice unit to play it. In this way, the pet's emotions can be soothed more effectively and/or the pet can be guided more precisely.
It is an object of the present application to provide a wearable pet device that further comprises a positioning unit. The positioning unit collects the pet's geographic position data, the processing unit generates the pet's motion trail from the geographic position data, and both the geographic position data and the motion trail are sent to the terminal device through the communication unit. The pet owner can thus follow the pet's whereabouts in real time and reach the pet promptly when it suffers an emotional or physical abnormality or a sudden danger such as a blow or an impact.
An object of the present application is to provide a wearable pet device in which the processing unit determines whether the distance between the positioning unit and the terminal device exceeds a set visible distance, controls the acquisition unit and the voice unit to enter a sleep state when the distance does not exceed the visible distance, and controls them to enter a working state when the distance exceeds the visible distance. Power is thereby saved and the service lives of the acquisition unit and the voice unit are extended.
An object of the present application is to provide a wearable pet device that further comprises a camera. When the processing unit determines that the pet is in an abnormal condition, it controls the camera to capture data about the pet's surroundings and sends the captured data to the terminal device through the communication unit, so that the pet owner can see the environment in which the abnormal condition occurred.
An object of the present application is to provide a pet monitoring method based on a wearable pet device worn on a pet. The method collects condition data of the pet, determines from the condition data whether the pet is in an abnormal condition and, when it determines that an abnormal condition has occurred, automatically plays corresponding voice content to soothe the pet's emotions and/or direct the pet to escape from danger. The method can thus monitor the pet's condition and take corresponding measures promptly when an abnormal condition arises.
Compared with the prior art, the present application has the following advantages:
1. The wearable pet device of the present application can be worn on a pet. When the pet owner is not nearby, the device automatically analyzes whether the pet is in an abnormal condition; when the pet's emotions are abnormal it automatically plays corresponding voice content to soothe the pet, and when the pet is struck or impacted by an external force it automatically plays corresponding voice content to promptly direct the pet away from the danger.
2. The wearable pet device of the present application can communicate with a terminal device, so the pet owner can remotely monitor the pet's condition through the device, discover abnormal conditions in time, and go to the pet to deal with them.
Drawings
To explain the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic view of a wearable pet device provided in an embodiment of the present application.
Fig. 2 is a schematic view of the wearable pet device of fig. 1 worn on a pet.
Fig. 3 is another schematic view of the wearable pet device of fig. 1 worn on a pet.
Fig. 4 is a schematic connection diagram of the wearable pet device and the terminal device shown in fig. 1.
FIG. 5 is a flow chart of a pet monitoring method according to an embodiment of the present application.
Description of the main elements
Wearable pet device 100
Wearing member 1
Acquisition unit 2
Processing unit 3
Voice unit 4
Storage unit 5
Communication unit 6
Camera 7
Positioning unit 8
Terminal device 200
The following detailed description will further illustrate the present application in conjunction with the above-described figures.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "comprises" and "comprising," and any variations thereof, in the description of the present application, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to the listed steps or modules but may alternatively include other steps or modules not listed or inherent to such process, method, article, or apparatus.
Referring to fig. 1, the present application provides a wearable pet device 100. The wearable pet device 100 comprises a wearing member 1, an acquisition unit 2, a processing unit 3 and a voice unit 4. The acquisition unit 2, the processing unit 3 and the voice unit 4 are each mounted on the wearing member 1, and the wearable pet device 100 is worn on the pet via the wearing member 1.
It is to be understood that the present application does not limit the form of the wearing member 1. For example, referring to fig. 2, the wearing member 1 may be a collar worn on the pet's neck. As another example, referring to fig. 3, the wearing member 1 may be a vest worn on the pet's back and waist. Of course, in some cases the wearing member 1 may comprise more than one piece; for example, a collar and a vest may together serve as the wearing member 1. It can be understood that the wearing member 1 may be fixed to the pet by hook-and-loop fasteners, straps, buckles and the like.
The acquisition unit 2 collects condition data of the pet so that the pet's condition can be known. The condition of the pet may be an emotional condition. Taking a dog as an example, a dog behaves differently under different emotions: when frightened, it may pant excessively, whine, tremble, or even curl its whole body into a ball; when agitated or aggressive, its body stiffens, its limbs extend, it emits a threatening growl, it may even pounce and bite, and its body temperature rises; when excited, its activity and body temperature also increase, for example it may crouch, paw with its front legs, or swing its body and tail strongly from side to side. The condition of the pet may also be a physical condition. For example, when a dog is struck or hit by something, its body bears the impact force produced by the blow or collision. Therefore, in the embodiment of the present application, referring to fig. 4, the acquisition unit 2 includes at least one of a motion sensor 21, a physiological sensor 22, a sound sensor 23 and a force sensor 24. The motion sensor 21 collects motion data of the pet, including but not limited to at least one of motion duration, motion speed and motion acceleration. The physiological sensor 22 collects physiological data of the pet, including but not limited to at least one of body temperature, body surface humidity, heart rate, respiratory rate and electromyographic signals. The sound sensor 23 collects sound data of the pet and its surroundings. The force sensor 24 collects impact force data borne by the pet.
Correspondingly, the condition data of the pet is at least one of the motion data, the physiological data, the sound data and the impact force data borne by the pet. The motion data, physiological data and/or sound data can be used to identify the pet's emotional condition, while the impact force data can be used to determine whether the pet has been struck or impacted by an external force.
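As a concrete illustration of how such condition data might be grouped on the device, the following is a minimal Python sketch; the field names, types and units are assumptions chosen for illustration and are not prescribed by the application.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ConditionData:
    """One sample of pet condition data gathered by the acquisition unit.

    All field names and units are illustrative assumptions; any subset of the
    four sensor groups may be present (the rest stay None / empty).
    """
    timestamp_s: float                          # time of sampling, seconds since epoch
    # motion sensor 21
    motion_duration_s: Optional[float] = None
    motion_speed_mps: Optional[float] = None
    motion_accel_mps2: Optional[float] = None
    # physiological sensor 22
    body_temp_c: Optional[float] = None
    heart_rate_bpm: Optional[float] = None
    respiratory_rate_bpm: Optional[float] = None
    body_surface_humidity: Optional[float] = None
    emg_samples: List[float] = field(default_factory=list)
    # sound sensor 23: one raw audio frame (e.g. PCM samples)
    sound_frame: List[float] = field(default_factory=list)
    # force sensor 24
    impact_force_n: Optional[float] = None
```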
The processing unit 3 may be a central processing unit. The processing unit 3 is connected to each sensor in the acquisition unit 2 and receives the collected data from each sensor.
The processing unit 3 determines whether the pet is emotionally abnormal (for example fearful, agitated or aggressive) from at least one of the motion data collected by the motion sensor 21, the physiological data collected by the physiological sensor 22 and the sound data collected by the sound sensor 23. For example, the processing unit 3 may use a neural network model as an emotion recognition model, take the motion data, the physiological data and/or the sound data as the model's input, have the model analyze them, and output the emotion type of the pet corresponding to that input.
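A minimal sketch of such an emotion recognition model is shown below, assuming PyTorch and a small fixed-size feature vector assembled from the sensor readings; the feature layout, label set and network size are illustrative assumptions, not details fixed by the application, and the model would need to be trained on labelled pet data before use.

```python
from typing import List

import torch
import torch.nn as nn

EMOTIONS = ["calm", "fear", "excitement", "aggression"]  # assumed label set

class EmotionClassifier(nn.Module):
    """Tiny MLP that maps a sensor feature vector to an emotion class."""
    def __init__(self, n_features: int = 8, n_classes: int = len(EMOTIONS)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def classify_emotion(model: EmotionClassifier, features: List[float]) -> str:
    """Run one feature vector (motion + physiological + sound statistics)
    through the trained model and return the predicted emotion label."""
    with torch.no_grad():
        logits = model(torch.tensor(features, dtype=torch.float32).unsqueeze(0))
        return EMOTIONS[int(logits.argmax(dim=1))]
```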
The processing unit 3 is further configured to determine whether the impact force data collected by the force sensor 24 exceeds a set impact force threshold, and to conclude that the pet has been struck or impacted by an external force if it does.
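The impact check reduces to a simple threshold comparison, as in the short sketch below; the threshold value is an arbitrary placeholder, since the application does not specify a figure.

```python
IMPACT_FORCE_THRESHOLD_N = 50.0  # placeholder value; not specified by the application

def is_impact(impact_force_n: float, threshold_n: float = IMPACT_FORCE_THRESHOLD_N) -> bool:
    """Return True when the measured impact force suggests the pet has been
    struck or hit, i.e. the reading exceeds the configured threshold."""
    return impact_force_n > threshold_n
```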
It can be understood that, because a dog's physiological signs such as body temperature, respiratory rate and heart rate become abnormal when it is injured or ill, the processing unit 3 can also determine whether the pet's body is abnormal from the physiological data collected by the physiological sensor 22.
The voice unit 4 may be a speaker. The voice unit 4 is connected to the processing unit 3. When the pet is in an abnormal condition, the processing unit 3 controls the voice unit 4 to play corresponding voice content. For example, when the pet's emotions are abnormal, the processing unit 3 may control the voice unit 4 to play voice content that soothes the pet; when the pet has been struck or impacted by an external force, the processing unit 3 may control the voice unit 4 to play voice content that directs the pet to escape from the danger.
In some embodiments, the voice content may be pre-recorded audio. The audio may include the pet owner's voice comforting the pet, light music that soothes the pet's mood, verbal commands that direct the pet to escape from danger, and the like.
Referring to fig. 4, the wearable pet device 100 further includes a storage unit 5. The storage unit 5 is arranged on the wearing member 1 and can store the pre-recorded audio. The storage unit 5 is connected to the processing unit 3 and the voice unit 4. When the pet's emotions are abnormal, the processing unit 3 can control the voice unit 4 to play the owner's voice or the light music in the storage unit 5 so that the pet calms down. When the pet is struck or impacted by an external force, the processing unit 3 can control the voice unit 4 to play the verbal command in the storage unit 5 to direct the pet to escape from the danger immediately.
In some embodiments, referring again to fig. 4, the wearable pet device 100 further comprises a communication unit 6. The communication unit 6 is disposed on the wearing member 1 and connected to the processing unit 3. The wearable pet device 100 can be connected wirelessly to the terminal device 200 through the communication unit 6. The processing unit 3 can thus send the pet's condition data, such as the motion data, physiological data, sound data and impact force data, to the terminal device 200 through the communication unit 6, together with the determination of whether the pet is in an abnormal condition, so that the pet owner can remotely monitor whether an abnormal condition has occurred.
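One way the report sent to the terminal device might be serialized is sketched below; the JSON field names are assumptions chosen for illustration, not a format defined by the application, and the wireless transport itself (Bluetooth, Wi-Fi, cellular) is left unspecified.

```python
import json
import time
from typing import Optional

def build_status_report(condition: dict, abnormal: bool, emotion: Optional[str]) -> bytes:
    """Serialize one status update for transmission to the owner's terminal.

    `condition` holds the latest sensor readings, `abnormal` is the processing
    unit's verdict, and `emotion` is the recognized emotion label (if any).
    """
    report = {
        "timestamp": time.time(),
        "condition_data": condition,   # e.g. {"heart_rate_bpm": 95.0, "impact_force_n": 3.2}
        "abnormal": abnormal,
        "emotion": emotion,
    }
    return json.dumps(report).encode("utf-8")

# Example usage:
payload = build_status_report({"heart_rate_bpm": 95.0}, abnormal=False, emotion="calm")
```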
It is understood that the terminal device 200 may be the pet owner's mobile phone, computer, tablet or the like. The terminal device 200 can be used to input voice content. For example, when the pet owner sees remotely that the pet is in an abnormal condition, the owner may input corresponding voice content through the terminal device 200. The processing unit 3 receives the voice content in real time through the communication unit 6 and controls the voice unit 4 to play it, so that the owner can soothe the pet's emotions more effectively or guide the pet away from danger more precisely.
In some embodiments, referring again to fig. 4, the wearable pet device 100 further comprises a camera 7 for capturing data about the pet's surroundings. The camera 7 is arranged on the wearing member 1, connected to the processing unit 3 and controlled by it. For example, when the processing unit 3 determines that the pet's emotions are abnormal or that the pet has been struck or impacted by an external force, it can start the camera 7 to capture the pet's surroundings. The processing unit 3 may further transmit the captured content to the terminal device 200 through the communication unit 6 so that the pet owner can see the environment in which the abnormal condition occurred.
In some embodiments, referring again to fig. 4, the wearable pet device 100 further comprises a positioning unit 8. The positioning unit 8 is arranged on the wearing member 1 and connected to the processing unit 3. The positioning unit 8 may be, for example, a GPS locator or a BeiDou locator. The positioning unit 8 collects the pet's geographic position data and sends it to the processing unit 3, which generates the pet's motion trail from that data. The processing unit 3 can further send the geographic position data and the motion trail to the terminal device 200 through the communication unit 6, so that the owner knows where the pet is and can, for example, catch up with the pet when it suffers an emotional or physical abnormality or a danger such as a blow or an impact.
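A motion trail of the kind described can be built simply by accumulating timestamped position fixes; the short sketch below shows one possible in-memory form (the class and field names are illustrative assumptions).

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class MotionTrail:
    """Ordered list of (timestamp_s, latitude, longitude) fixes for the pet."""
    points: List[Tuple[float, float, float]] = field(default_factory=list)

    def add_fix(self, timestamp_s: float, lat: float, lon: float) -> None:
        """Append the latest position fix reported by the positioning unit."""
        self.points.append((timestamp_s, lat, lon))

    def latest(self) -> Tuple[float, float, float]:
        """Return the most recent fix (raises IndexError if the trail is empty)."""
        return self.points[-1]
```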
In some embodiments, the processing unit 3 is further configured to determine whether the distance between the positioning unit 8 and the terminal device 200 exceeds a set visible distance. The processing unit 3 can receive the positioning data of the terminal device 200 through the communication unit 6 and compute the distance between the positioning unit 8 and the terminal device 200 from that positioning data and the pet's geographic position data collected by the positioning unit 8. When the distance does not exceed the visible distance, the pet is still within the owner's sight and there is no need to run the acquisition unit 2 and the voice unit 4, so the processing unit 3 puts them into a sleep state, saving power and extending their service lives. When the distance exceeds the visible distance, the pet has left the owner's sight, so the processing unit 3 puts the acquisition unit 2 and the voice unit 4 into a working state to monitor the pet's condition.
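The distance test can be implemented with a standard great-circle (haversine) calculation on the two position fixes; the Python sketch below is a minimal illustration in which the visible-distance value is an arbitrary placeholder rather than a figure given in the application.

```python
import math

VISIBLE_DISTANCE_M = 30.0  # placeholder; the application leaves the value to be set

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS-84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def choose_power_state(pet_fix: tuple, owner_fix: tuple) -> str:
    """Return 'sleep' while the pet stays within the visible distance of the
    owner's terminal, otherwise 'working' so monitoring can run."""
    d = haversine_m(pet_fix[0], pet_fix[1], owner_fix[0], owner_fix[1])
    return "sleep" if d <= VISIBLE_DISTANCE_M else "working"
```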
Referring to fig. 5, the pet monitoring method provided by the present application can be used to monitor the condition of a pet so that abnormal conditions are discovered in time and corresponding measures are taken.
The pet monitoring method can be based on the wearable pet device 100, and specifically comprises the following steps:
S1, collecting the condition data of the pet.
In step S1, after the wearable pet device 100 is worn on the pet, the acquisition unit 2 in the wearable pet device 100 collects the pet's condition data and sends it to the processing unit 3.
The condition data may include at least one of motion data of the pet (e.g. motion duration, motion speed, motion acceleration), physiological data of the pet (e.g. body temperature, body surface humidity, heart rate, respiratory rate, electromyographic signals), sound data of the pet and its surroundings, and impact force data borne by the pet.
In some embodiments, the method may further comprise:
before step S1, the geographic location data of the pet is collected by the positioning unit 8 and sent to the processing unit 3.
The processing unit 3 determines, from the pet's geographic position data, whether the distance between the positioning unit 8 and the pet owner's terminal device 200 exceeds the set visible distance.
When the distance between the positioning unit 8 and the terminal device 200 does not exceed the visible distance, the processing unit 3 controls the acquisition unit 2 and the voice unit 4 to enter the sleep state.
When the distance between the positioning unit 8 and the terminal device 200 exceeds the visible distance, the processing unit 3 controls the acquisition unit 2 and the voice unit 4 to enter the working state.
S2, determining from the condition data whether the pet is in an abnormal condition.
In step S2, the processing unit 3 analyzes the pet's condition data. Specifically, the processing unit 3 may identify, through a machine learning algorithm such as a neural network, the pet's emotion type corresponding to the motion data, the physiological data and/or the sound data; when an emotion such as fear or agitation is recognized, the processing unit 3 determines that the pet's emotions are abnormal and need soothing. The processing unit 3 can also determine from the impact force data whether the pet has been struck or impacted by an external force: when the impact force data exceeds the set impact force threshold, the processing unit 3 determines that the pet has been struck or impacted.
S3, when it is determined that the pet is in an abnormal condition, playing corresponding voice content to soothe the pet's emotions and/or direct the pet to escape from danger.
When it is determined that no abnormal condition has occurred, the acquisition unit 2 continues to collect the pet's condition data.
In step S3, when the processing unit 3 determines, for example, that the pet's emotions are abnormal, such as fearful or aggressive, it controls the voice unit 4 to play the owner's voice pre-recorded in the storage unit 5, or the light music that soothes the pet, so that the pet calms down. As another example, when the processing unit 3 determines that the pet has been struck or impacted by an external force, it controls the voice unit 4 to play the pre-recorded verbal escape command in the storage unit 5 to direct the pet to flee.
In some embodiments, step S3 may further include:
the status data of the pet, such as the motion data, the physiological data, the sound data, the impact force data and the like, is sent to the terminal device 200 through the communication unit 6, and the judgment result of whether the abnormal status of the pet occurs is sent to the terminal device 200, so that the pet owner can check the abnormal status of the pet on the terminal device 200, and the remote monitoring of the pet status is realized.
In some embodiments, step S3 may further include the steps of:
When the processing unit 3 determines that the pet is in an abnormal condition, it starts the camera 7 to capture data about the pet's surroundings.
The captured content is transmitted to the terminal device 200 through the communication unit 6 so that the pet owner can view, on the terminal device 200, the surroundings in which the abnormal condition occurred.
In some embodiments, step S3 may further include the steps of:
real-time voice content from the terminal device 200 is received through the communication unit 6.
The processing unit 3 controls the voice unit 4 to play the real-time voice content, so as to soothe the pet's emotions more effectively or guide the pet away from danger more precisely.
In some embodiments, step S3 may further include the steps of:
and generating the motion trail of the pet by the processing unit 3 according to the geographic position data of the pet.
The pet's geographic position data and motion trail are sent to the terminal device 200 through the communication unit 6, so that the pet owner knows where the pet is and can catch up with it promptly when an abnormal condition occurs.
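Putting steps S1 to S3 together, the on-device monitoring loop might look like the following Python sketch. The helper callables (read_sensors, recognize_emotion, is_impact, play_audio, send_report) and the audio file names are placeholders assumed for illustration rather than APIs or assets defined by the application.

```python
import time

def monitoring_loop(read_sensors, recognize_emotion, is_impact, play_audio, send_report,
                    period_s: float = 1.0):
    """Minimal S1 -> S2 -> S3 loop: collect condition data, decide whether the
    pet is abnormal, and react by playing stored audio and notifying the terminal."""
    while True:
        data = read_sensors()                       # S1: condition data from the acquisition unit
        emotion = recognize_emotion(data)           # S2: emotion recognition
        struck = data.get("impact_force_n") is not None and is_impact(data["impact_force_n"])

        abnormal = struck or emotion in ("fear", "excitement", "aggression")
        if struck:
            play_audio("escape_command.wav")        # S3: direct the pet away from danger
        elif abnormal:
            play_audio("owner_voice_soothing.wav")  # S3: soothe the pet's emotions

        send_report(data, abnormal, emotion)        # forward data and verdict to the terminal
        time.sleep(period_s)
```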
The wearable pet device 100 and the pet monitoring method described above can be applied to pets. When the wearable pet device 100 is worn on a pet, the acquisition unit 2 can collect various data about the pet, such as motion data, physiological data, sound data and impact force data. When the pet owner is not nearby, the processing unit 3 analyzes the data collected by the acquisition unit 2 and determines whether the pet is in an abnormal condition: when the pet's emotions are abnormal, it automatically controls the voice unit 4 to play soothing voice content and calms the pet in time; when the pet is struck or impacted by an external force, it automatically controls the voice unit 4 to play voice content that directs the pet to escape, steering it away from the danger in time. The wearable pet device 100 and the pet monitoring method can therefore monitor the pet's condition remotely and take corresponding measures promptly when an abnormal condition occurs.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. A wearable pet device, characterized in that the wearable pet device is worn on a pet and comprises an acquisition unit, a processing unit and a voice unit, the acquisition unit and the voice unit each being connected to the processing unit;
the acquisition unit is used for collecting condition data of the pet and sending the condition data to the processing unit;
the processing unit is used for determining from the condition data whether the pet is in an abnormal condition and, when the abnormal condition occurs, controlling the voice unit to play corresponding voice content to soothe the pet's emotions and/or direct the pet to escape from danger.
2. The wearable pet device of claim 1, wherein the acquisition unit comprises at least one of a motion sensor, a physiological sensor, a sound sensor and a force sensor;
the pet monitoring system comprises a motion sensor, a physiological sensor, a sound sensor and a force sensor, wherein the motion sensor is used for collecting motion data of a pet, the physiological sensor is used for collecting physiological data of the pet, the sound sensor is used for collecting sound data of the pet and the surrounding environment of the pet, and the force sensor is used for collecting impact force data born by the pet;
at least one of the motion data, the physiological data, the sound data, and the impact force data constitutes the condition data.
3. The wearable pet device of claim 2, wherein the processing unit is configured to determine whether the pet is emotionally abnormal according to at least one of the motion data, the physiological data and the sound data, and to determine whether the pet has been struck or impacted by an external force according to the impact force data.
4. The wearable pet device of claim 1, further comprising a storage unit connected to the processing unit and the voice unit, the storage unit storing pre-recorded audio for soothing the pet's emotions and for directing the pet to escape from danger, and the processing unit being configured to control the voice unit to play the audio in the storage unit.
5. The wearable pet device of claim 1, further comprising a communication unit, wherein the processing unit is connected to a terminal device through the communication unit, and the processing unit is configured to send the condition data of the pet and the determination of whether the pet is in an abnormal condition to the terminal device through the communication unit, to receive voice content from the terminal device in real time through the communication unit, and to control the voice unit to play the voice content.
6. The wearable pet device of claim 5, further comprising a positioning unit connected to the processing unit, the positioning unit being configured to collect geographic position data of the pet and send the geographic position data to the processing unit, and the processing unit being configured to generate a motion trail of the pet according to the geographic position data.
7. The wearable pet device of claim 6, wherein the processing unit is further configured to determine whether the distance between the positioning unit and the terminal device is greater than a set visible distance; when the distance is not greater than the visible distance, the processing unit controls the acquisition unit and the voice unit to enter a sleep state, and when the distance is greater than the visible distance, the processing unit controls the acquisition unit and the voice unit to enter a working state.
8. The wearable pet device of claim 1, further comprising a camera connected to the processing unit, wherein the processing unit is configured to control the camera to capture data about the pet's surroundings when it determines that the pet is in the abnormal condition.
9. A pet monitoring method for monitoring the condition of a pet, the method being based on the wearable pet device according to any one of claims 1 to 8 and comprising:
S1, collecting the condition data of the pet;
S2, determining from the condition data whether the pet is in an abnormal condition;
and S3, when it is determined that the pet is in an abnormal condition, playing corresponding voice content to soothe the pet's emotions and/or direct the pet to escape from danger.
10. The pet monitoring method of claim 9, wherein the condition data comprises at least one of motion data of the pet, physiological data of the pet, sound data of the pet and its surroundings, and impact force data borne by the pet;
step S2 includes the following steps:
determining whether the pet is emotionally abnormal according to at least one of the motion data, the physiological data and the sound data;
and determining whether the pet has been struck or impacted by an external force according to the impact force data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111659310.0A CN114366029A (en) | 2021-12-30 | 2021-12-30 | Wearable pet equipment and pet monitoring method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114366029A true CN114366029A (en) | 2022-04-19 |
Family
ID=81142136
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111659310.0A (Withdrawn, published as CN114366029A) | Wearable pet equipment and pet monitoring method | 2021-12-30 | 2021-12-30 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114366029A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118680092A (en) * | 2024-08-06 | 2024-09-24 | 广东小狼星物联有限公司 | A positioning method and system for recording and analyzing pet activity trajectories |
- 2021-12-30: Application CN202111659310.0A filed in China, published as CN114366029A; status: not active (withdrawn).
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109407504B (en) | Personal safety detection system and method based on smart watch | |
US11696611B2 (en) | Helmet-based system for improved practice efficiency and athlete safety | |
US9610028B2 (en) | Method and apparatus for sensing a horse's moods | |
US7980998B2 (en) | Training and instructing support device | |
EP2399513B1 (en) | System for non-invasive automated monitoring, detection, analysis, characterisation, prediction or prevention of seizures and movement disorder symptoms | |
WO2019132803A2 (en) | Health monitoring and tracking system for animals | |
CN107532959A (en) | Individual hits monitoring system | |
KR101907598B1 (en) | Electronic Apparatus, Method and System for using Wearable Apparatus for Prescribing of Psychology associated with Emotion and Behavior of User through Artificial Intelligence Model | |
CN104966380A (en) | Alarm system and method capable of monitoring accidental tumble of human body | |
US11547088B2 (en) | System and method for selecting and executing training protocols for autonomously training an animal | |
US6782847B1 (en) | Automated surveillance monitor of non-humans in real time | |
CN105007808A (en) | Visit duration control system and method | |
CN113205661A (en) | Anti-cheating implementation method and system, intelligent wearable device and storage medium | |
KR102331335B1 (en) | Vulnerable person care robot and its control method | |
CN108039025A (en) | Drowning alarm method based on wearable device and wearable device | |
CN112188296A (en) | Interaction method, device, terminal and television | |
KR20180063625A (en) | Method for detecting the emotion of pet and the device for detecting the emotion of pet | |
CN114366029A (en) | Wearable pet equipment and pet monitoring method | |
CN110506661B (en) | Method for preventing pet fighting based on machine learning | |
CN111543351B (en) | Breeding monitoring system and monitoring method thereof | |
JP5669302B2 (en) | Behavior information collection system | |
JP2019058098A (en) | Pet and human friendship degree measuring device, and pet and human friendship degree measuring program | |
WO2018069774A1 (en) | A system and method for generating a status output based on sound emitted by an animal | |
KR20190005370A (en) | System for monitoring a fet dog | |
CN116233182A (en) | Pet house wisdom management and control system based on thing networking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20220419 |