CN117547803A - Remote interaction method, device, equipment and computer readable storage medium - Google Patents
Remote interaction method, device, equipment and computer readable storage medium
- Publication number
- CN117547803A CN117547803A CN202311516201.2A CN202311516201A CN117547803A CN 117547803 A CN117547803 A CN 117547803A CN 202311516201 A CN202311516201 A CN 202311516201A CN 117547803 A CN117547803 A CN 117547803A
- Authority
- CN
- China
- Prior art keywords
- projection
- data
- interaction
- mobile terminal
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
- A63F2009/2448—Output devices
- A63F2009/245—Output devices visual
- A63F2009/2461—Projection of a two-dimensional real image
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Toys (AREA)
Abstract
The invention relates to the technical field of the Internet and discloses a remote interaction method, device, equipment and computer readable storage medium, wherein the method comprises the following steps: acquiring behavior data of a target interactive object, the behavior data comprising at least image data acquired by the camera equipment; sending the behavior data to a mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the behavior data, the projection instruction comprising at least a projection pattern; and controlling the projection equipment to perform projection interaction according to the projection instruction. By combining the vehicle's projection equipment with its camera equipment under remote control from the mobile terminal, the invention combines the traditional toy interaction mode with modern technology, realizing closer, more convenient and more interesting interaction between people and target interactive objects. This avoids the poor companionship a conventional toy provides to a pet when the user is not at home while the pet is, and improves the real-time interactivity between user and pet.
Description
Technical Field
The present invention relates to the field of internet technologies, and in particular, to a remote interaction method, device, apparatus, and computer readable storage medium.
Background
In daily life, when the user is not at home but the pet is, a conventional pet toy provides the pet with poor companionship and poor interactivity.
Disclosure of Invention
In view of the above, the present invention provides a remote interaction method, device, apparatus and computer readable storage medium, which can solve the technical problems of the poor companionship and poor interactivity that conventional pet toys provide to pets.
According to an aspect of an embodiment of the present invention, there is provided a remote interaction method applied to a vehicle on which an image capturing apparatus and a projection apparatus are provided, the method including:
acquiring behavior data of a target interactive object, wherein the behavior data at least comprises image data acquired by the camera equipment;
the behavior data are sent to a mobile terminal corresponding to the vehicle, and a projection instruction fed back by the mobile terminal according to the behavior data is received, wherein the projection instruction at least comprises a projection pattern;
And controlling the projection equipment to carry out projection interaction according to the projection instruction.
In an optional manner, the sending the behavior data to the mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the behavior data, includes:
acquiring the behavior type of the target interactive object based on the behavior data;
the behavior type is sent to a mobile terminal corresponding to the vehicle, a projection instruction fed back by the mobile terminal according to the behavior type is received, and the projection instruction at least comprises the projection pattern and basic parameters of the projection pattern;
correspondingly, the controlling the projection device to perform projection interaction according to the projection instruction includes:
and controlling the projection equipment to adjust the projection pattern according to the basic parameters so as to carry out projection interaction.
In an optional manner, the controlling the projection device to adjust the projection pattern according to the basic parameter to perform projection interaction includes:
controlling the projection equipment to project the projection pattern to a preset range of the target interactive object, wherein the preset range at least comprises a visible range corresponding to the target interactive object;
And sending the response data of the target interactive object to the mobile terminal corresponding to the vehicle, and adjusting the projection pattern according to the basic parameters to carry out projection interaction.
In an optional manner, the sending the response data of the target interactive object to the mobile terminal corresponding to the vehicle, and adjusting the projection pattern according to the basic parameter to perform projection interaction includes:
collecting reaction data of the target interactive object in response to the projection pattern;
the response data are sent to the mobile terminal corresponding to the vehicle, and a projection instruction fed back by the mobile terminal according to the response data is received;
adjusting the light effect parameter of the projection pattern according to the projection instruction to obtain a changed light effect pattern, wherein the basic parameters at least comprise the light effect parameter or the speed parameter;
and controlling the projection equipment to adjust the speed parameter of the light effect pattern to perform projection interaction.
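The two-stage adjustment described above — light-effect parameters first, producing the "changed light effect pattern", then the speed parameter — can be sketched as follows. The patent does not specify an implementation; the class, parameter names and value ranges below are illustrative assumptions only.

```python
# Hypothetical sketch of the two-stage pattern adjustment described above:
# the light-effect parameters are changed first, then the speed parameter.
# All names and ranges are assumptions; the patent does not define an API.

class ProjectionPattern:
    def __init__(self, name, brightness=50, color="white", speed=1.0):
        self.name = name
        self.brightness = brightness  # light-effect parameter (assumed 0-100)
        self.color = color            # light-effect parameter
        self.speed = speed            # movement-speed multiplier

def apply_projection_instruction(pattern, instruction):
    """Adjust light-effect parameters per the instruction, then the speed."""
    # Stage 1: light-effect parameters -> "changed light effect pattern"
    for key in ("brightness", "color"):
        if key in instruction:
            setattr(pattern, key, instruction[key])
    # Stage 2: speed parameter for the projection interaction
    if "speed" in instruction:
        pattern.speed = instruction["speed"]
    return pattern

p = apply_projection_instruction(
    ProjectionPattern("butterfly"),
    {"brightness": 80, "color": "red", "speed": 1.5},
)
```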
In an optional manner, the collecting behavior data of the target interactive object includes:
image data, radar data and audio data of the target interactive object are collected, wherein the image data is obtained by shooting based on the camera equipment, the radar data is obtained by detecting based on a radar sensor carried by the vehicle, and the audio data is obtained by collecting based on a sound sensor carried by the vehicle.
In an optional manner, after collecting the behavior data of the target interactive object, the method further includes:
performing image analysis on the image data through an image processing algorithm to obtain position data and posture data corresponding to the target interactive object;
obtaining a distance parameter between the target interactive object and the projection equipment based on the radar data;
performing audio analysis on the audio data through a sound analysis algorithm to obtain volume and tone parameters corresponding to the target interactive object;
correspondingly, the sending the behavior data to the mobile terminal corresponding to the vehicle includes:
and sending the position data, the gesture data, the distance parameter and the volume and tone parameter to a mobile terminal corresponding to the vehicle.
In an optional manner, after the controlling the projection device to perform projection interaction according to the projection instruction, the method further includes:
performing abnormal detection on the target interactive object according to the behavior data to obtain abnormal behavior data;
judging whether the abnormal behavior data are preset abnormal data or not;
and when the abnormal behavior data is the preset abnormal data, carrying out abnormal alarm according to the abnormal behavior data.
According to another aspect of the embodiment of the present invention, the present invention also proposes a remote interaction device, the device comprising:
the data acquisition module is used for acquiring behavior data of the target interactive object, wherein the behavior data at least comprises image data acquired by the camera equipment;
the data feedback module is used for sending the behavior data to the mobile terminal corresponding to the vehicle and receiving a projection instruction fed back by the mobile terminal according to the behavior data, wherein the projection instruction at least comprises a projection pattern;
and the projection interaction module is used for controlling the projection equipment to carry out projection interaction according to the projection instruction.
According to another aspect of the embodiment of the present invention, the present invention further provides a remote interaction device, including: the device comprises a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface complete communication with each other through the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform operations of the remote interaction method as described above.
According to yet another aspect of embodiments of the present invention, there is provided a computer readable storage medium having stored therein at least one executable instruction that, when run on a remote interaction device/apparatus, causes the remote interaction device/apparatus to perform operations of the remote interaction method as described above.
According to the invention, behavior data of the target interactive object is acquired, the behavior data comprising at least image data acquired by the camera equipment; the behavior data is then sent to the mobile terminal corresponding to the vehicle, and a projection instruction fed back by the mobile terminal according to the behavior data is received, the projection instruction comprising at least a projection pattern; finally, the projection equipment is controlled to perform projection interaction according to the projection instruction. By combining the vehicle's projection equipment with its camera equipment under remote control from the mobile terminal, the invention combines the traditional pet toy interaction mode with modern technology, realizing closer, more convenient and more interesting interaction between people and pets. This avoids the poor companionship a conventional pet toy provides when the user is not at home while the pet is, and improves the real-time interactivity between user and pet.
The foregoing is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be more clearly understood and implemented according to the content of the specification, specific embodiments of the present invention are described below.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a schematic flow chart of a first embodiment of a remote interaction method according to the present invention;
FIG. 2 is a schematic flow chart of a second embodiment of the remote interaction method according to the present invention;
FIG. 3 is a schematic flow chart of a third embodiment of the remote interaction method according to the present invention;
FIG. 4 is a block diagram of a first embodiment of a remote interactive apparatus according to the present invention;
fig. 5 shows a schematic structural diagram of an embodiment of the remote interactive apparatus of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein.
The embodiment of the invention provides a remote interaction method, and referring to fig. 1, fig. 1 shows a schematic flow chart of a first embodiment of the remote interaction method provided by the invention. The method is performed by a remote interactive device. The remote interaction method is applied to a vehicle, and the vehicle is provided with an image pickup device and a projection device.
Step S10: and collecting behavior data of the target interactive object.
It should be noted that the execution body of the method of this embodiment may be an electronic device having data collection, data transmission and projection interaction functions, for example a personal computer or a vehicle-mounted terminal, or another electronic device capable of implementing the same or similar functions, for example the remote interaction device described above; this embodiment is not limited in this regard. This embodiment and the following embodiments are described taking the above remote interaction device as the example.
In one embodiment, the target interactive object comprises a pet, a child, etc.; a pet may be a cat, a dog, etc., and this example is not limited in this regard.
Illustratively, a cat is used as an example, but the present solution is not limited thereto.
In daily life, when the user is not at home but the pet is, cats are conventionally teased manually, for example with feather teaser wands or laser pointers.
However, these methods have certain limitations: they require manual operation, feather teasers wear out, and so on.
The embodiment of the invention provides a remote interaction method, which can solve the technical problems of poor accompany effect and poor interaction of the traditional pet toy to the pet.
In one implementation, the embodiments of the present invention are applied to a vehicle, and by way of example, a system including a mobile terminal, a mobile terminal application, an image capturing apparatus, and a projection apparatus may be constructed. Through the system, remote control and real-time transmission are realized by using the application program of the mobile terminal, so that a user can interact with the cat at any time and any place.
In one implementation, the embodiment of the invention can be applied to a vehicle with DLP and high definition cameras installed, and the device is configured by using a mobile phone application program. In the configuration process, the detailed information of the vehicle can be input and the special mobile phone application program can be downloaded and installed. After the configuration is completed, the user can remotely access the vehicle-mounted camera and the DLP projection lamp through the mobile phone application program.
In one embodiment, the behavior data is objective data about the behavior of the target interactive object and the environment when the behavior occurs, and by way of example, the above pet may include an activity time, an activity range, an activity mode, a sound, a limb language, and the like, which is not limited in this embodiment.
For example, the activity time may include the pet's daily activity time, sleep time, feeding time, etc. The activity range may include the range of the pet's indoor or outdoor activity, which areas the pet prefers to stay in, etc. The activity pattern may include the ways the pet interacts with the user or with other pets, such as playing or snuggling. Sound and body language may include the pet's vocalizations, posture and expression, which reflect its emotional state and needs. This embodiment is not limited thereto.
Illustratively, taking a cat as an example, behavioral data of the cat, such as voice, motion, body language, etc., may be collected.
In one embodiment, the behavior data includes at least image data acquired by the image capturing apparatus. By way of example, the image data may include pictures, videos, and the like.
By way of example, the image capturing apparatus may include an in-vehicle camera, a high definition camera, and the like, to which the present embodiment is not limited.
In one embodiment, behavior data may be acquired by a variety of devices, for example monitoring devices, smart collars, smart toys and sound sensors. The behavior data can help the user better understand the pet's behavior habits and needs, enabling more personalized care services.
Step S20: and sending the behavior data to a mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the behavior data.
By way of example, the mobile terminal may include a cell phone, a computer, a smart watch, etc., and the present embodiment is not limited thereto.
In one embodiment, the projection instruction is an instruction that the user performs operation according to the behavior data based on the mobile terminal and feeds back to the remote interaction device.
In one embodiment, the projection instruction may include: turning the projection device on or off, selecting the projection pattern, adjusting projection parameters, selecting the input signal source, adjusting the focal length, adjusting the projection device angle, performing screen calibration, adjusting brightness and contrast, switching the screen mode, and the like; this embodiment is not limited thereto.
Illustratively, turning the projection device on or off may include: after ensuring the power connection is normal, pressing the power button or using the power control on the mobile terminal.
Illustratively, selecting the input signal source may include choosing the correct source, for example a computer, a DVD player or another video device.
Illustratively, the focal length may be adjusted according to the distance between the image capturing apparatus and the target interactive object and the screen size, so as to obtain a clear image.
For example, the brightness and contrast of the mobile terminal image may be adjusted according to ambient light and personal preferences.
In one embodiment, the projecting instructions include at least a projected pattern.
By way of example, the projection device may include a DLP projection lamp, projector, or the like. The present embodiment is not limited thereto.
In one embodiment, the projection pattern may include a bird, a butterfly, or a similar pattern that attracts cats.
Step S30: and controlling the projection equipment to carry out projection interaction according to the projection instruction.
In one embodiment, the remote interaction device may launch the application and turn on the DLP projection lamp to project the projection pattern onto a surface visible to the cat. While projecting, the user can observe the cat's reaction through the mobile phone application and select the parameters of the projection pattern on the phone according to the cat's interest and reaction. Adjusting the parameters of the DLP projection lamp can attract the cat's attention and increase the fun of the interaction.
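The observe-and-adjust loop just described can be sketched as a small simulation. The reaction labels, adjustment rule and loop structure are illustrative assumptions; in the actual system the adjustment would come from the user through the mobile phone application rather than from a fixed rule.

```python
# Sketch of the observe -> adjust -> project loop described above.
# Device and app interfaces are simulated; names are assumptions.

def projection_interaction_loop(reactions, initial_params, rounds=3):
    """Each round: project the pattern, observe the cat's reaction via the
    camera, adjust parameters (here by a toy rule standing in for the
    user's choice in the app), and record the state."""
    params = dict(initial_params)
    history = []
    for i in range(rounds):
        reaction = reactions[i % len(reactions)]  # observed via the camera
        if reaction == "ignores":
            params["speed"] = params["speed"] * 1.5  # make the pattern livelier
        elif reaction == "chases":
            pass                                     # keep what is working
        history.append((dict(params), reaction))
    return history

log = projection_interaction_loop(
    ["ignores", "chases", "chases"], {"pattern": "bird", "speed": 1.0}
)
```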
Based on the foregoing embodiment, after step S30, the present embodiment further includes:
in one embodiment, abnormal behavior data is obtained by detecting the target interactive object according to the behavior data.
In one embodiment, the abnormal behavior data may include: abnormal call, abnormal behavior, abnormal diet, abnormal excretion, etc., which is not limited in this embodiment.
In one embodiment, health or emotional problems can be discovered in time from the abnormal behavior data.
In one embodiment, it is determined whether the abnormal behavior data is preset abnormal data.
In one embodiment, the preset abnormal data includes abnormal data such as abnormal eating behavior, abnormal excretion habits, behavioral problems and emotional problems; this embodiment is not limited thereto.
For example, the abnormal call may include: the pet emits unusual sounds that can be caused by pain, discomfort, anxiety, or fear.
For example, the abnormal behavior may include: pets exhibit abnormal behavior such as excessive licking, hyperexcitability, increased aggression, etc.
For example, dietary abnormalities may include loss of appetite or binge eating in pets, which can be caused by health or emotional problems.
Illustratively, excretion abnormalities may include diarrhea, constipation, frequent urination and the like in pets, which can be caused by health problems.
In one embodiment, the preset anomaly data may be constructed by manually creating a specific scene, simulating a pathological condition, or using an anomaly sample in existing pet health and behavioral data.
In one embodiment, pet health may be monitored by preset anomaly data.
And when the abnormal behavior data is the preset abnormal data, carrying out abnormal alarm according to the abnormal behavior data.
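The detect, compare-with-preset-data and alarm steps above can be sketched as follows. The abnormality labels, thresholds and rules are illustrative assumptions, not the patent's actual detection logic, which is left unspecified.

```python
# Minimal sketch of the abnormality-detection flow described above:
# derive abnormal-behavior labels, compare against preset abnormal data,
# and raise an alarm on a match. Thresholds and labels are assumptions.

PRESET_ABNORMAL = {"abnormal_call", "abnormal_diet", "abnormal_excretion"}

def detect_abnormal_behavior(behavior_data):
    """Derive abnormal-behavior labels from raw behavior data (toy rules)."""
    labels = set()
    if behavior_data.get("volume_db", 0) > 80:
        labels.add("abnormal_call")      # unusually loud vocalization
    if behavior_data.get("meals_per_day", 2) == 0:
        labels.add("abnormal_diet")      # loss of appetite
    return labels

def check_and_alarm(behavior_data):
    """Return alarm messages for labels matching the preset abnormal data."""
    detected = detect_abnormal_behavior(behavior_data)
    return [f"ALERT: {label}" for label in sorted(detected & PRESET_ABNORMAL)]

alarms = check_and_alarm({"volume_db": 85, "meals_per_day": 0})
```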
In one embodiment, taking a cat as an example, the cat's behavior may be analyzed while it is being teased in order to detect some common health problems. For example, a cat's vocalizations, movements and body language may show pain, discomfort or anxiety. The cat's behavior is monitored through the mobile phone application to detect these abnormal manifestations, and the remote interaction device can automatically raise an alarm to remind the user to handle the situation in time, ensuring the cat's health and safety and improving the user experience.
The behavior data of the target interactive object can be collected through a variety of devices, such as monitoring devices, smart collars, smart toys and sound sensors. Based on the behavior data, the system can help the user better understand the pet's behavior habits and needs, providing more personalized care services. After the remote interaction equipment sends the behavior data to the mobile terminal corresponding to the vehicle, the user can, according to the pet's real-time behavior data, operate the application program and feed a projection instruction back to the remote interaction equipment, for example turning on a DLP projection lamp so that the projection pattern is projected onto a plane visible to the cat. While projecting, the user can observe the cat's reaction through the mobile phone application and select the parameters of the projection pattern on the phone according to the cat's interest and reaction; adjusting the parameters of the DLP projection lamp can attract the cat's attention and increase the fun of the interaction. Because the vehicle's projection equipment is combined with its camera equipment under remote control from the mobile terminal, the traditional pet toy interaction mode is combined with modern technology, realizing closer, more convenient and more interesting interaction between people and pets. This avoids the poor companionship a conventional pet toy provides when the user is not at home while the pet is, and improves the real-time interactivity between user and pet.
Referring to fig. 2, fig. 2 is a schematic flow chart of a second embodiment of the remote interaction method provided by the present invention. The method is performed by a remote interactive device. As shown in fig. 2, based on the first embodiment described above, in the present embodiment, the step S20 includes:
step S21: and acquiring the behavior type of the target interactive object based on the behavior data.
In one embodiment, the behavior types may include a pet's foraging behavior, food-storage behavior, aggression, defensive behavior, territorial behavior, routine (rhythmic) behavior, and the like. Divided according to the pet's emotional state, they may include active behavior, static behavior, etc.; this embodiment is not limited thereto.
For example, a pet may establish its own territory in its living environment and exhibit the behavior of protecting the territory.
For example, some pets may exhibit regular behavior, such as eating and sleeping at fixed times.
In one embodiment, knowing the pet's behavior type helps the owner better understand and care for the pet, thereby improving the interaction effect.
Step S22: and sending the behavior type to a mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the behavior type.
Illustratively, the projection instructions include at least the projection pattern and a base parameter of the projection pattern.
In one embodiment, the basic parameters may include the light effect, speed, resolution, brightness and contrast, number of colors, keystone correction, etc. of the projection pattern; this example is not limited thereto.
Illustratively, the resolution determines how finely the details of the projection pattern are rendered; in general, the higher the resolution, the more pixels the image contains, the sharper it is, and the more likely it is to attract the cat. Likewise, brightness and contrast affect what the cat sees.
Correspondingly, the step S30 includes:
step S31: and controlling the projection equipment to adjust the projection pattern according to the basic parameters so as to carry out projection interaction.
In one embodiment, a dedicated application is installed on the mobile phone, through which the camera picture can be viewed in real time and parameters such as the light effect, pattern or speed of the DLP projection lamp can be controlled, so the user can interact with the cat through the mobile phone application at any time and from any place.
In one embodiment, the remote interaction device may acquire a behavior type of the target interaction object, such as foraging behavior, territorial behavior, active behavior, static behavior, and the like, based on the behavior data, then send the behavior type to a mobile terminal corresponding to the vehicle, and receive a projection instruction fed back by the mobile terminal according to the behavior type, where the projection instruction includes at least the projection pattern and a basic parameter of the projection pattern; finally, the projection device is controlled to adjust the projection pattern according to the basic parameter so as to carry out projection interaction. By adjusting the basic parameter according to changes in the cat's behavior, a pattern suited to the cat's current emotion can be provided, so that the user can interact with the cat at any time and in any place, improving the projection interaction effect.
Based on the foregoing embodiment, in this embodiment, step S31 includes:
and controlling the projection equipment to project the projection pattern to a preset range of the target interactive object, wherein the preset range at least comprises a visual range corresponding to the target interactive object.
In one embodiment, the preset range is the range in which the projection device projects the projection pattern; within this range, projection interaction with the cat can be performed.
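Checking whether the pattern falls inside such a range can be sketched as follows, assuming the visual range is approximated as a circle around the cat (an illustrative simplification, not part of this embodiment):

```python
# Hypothetical check: is the pattern within the cat's (circular) visual range?
def in_visual_range(pattern_pos, cat_pos, radius):
    """Return True if pattern_pos lies within `radius` of cat_pos on a 2-D plane."""
    dx = pattern_pos[0] - cat_pos[0]
    dy = pattern_pos[1] - cat_pos[1]
    return dx * dx + dy * dy <= radius * radius  # compare squared distances
```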
And sending the response data of the target interactive object to the mobile terminal corresponding to the vehicle, and adjusting the projection pattern according to the basic parameters to carry out projection interaction.
In one embodiment, the remote interaction device may control the projection device to project the projection pattern to a preset range of the target interaction object, then send the reaction data of the target interaction object to the mobile terminal corresponding to the vehicle, and adjust the projection pattern according to the basic parameter to carry out projection interaction. In this way, projecting within the cat's visual range attracts the cat's attention, and the safe range of the cat-teasing process is controlled to ensure that the process is safe and reliable.
Based on the foregoing embodiment, in this embodiment, sending the reaction data of the target interactive object to the mobile terminal corresponding to the vehicle, and adjusting the projection pattern according to the basic parameter to perform projection interaction includes:
And collecting reaction data of the target interactive object in response to the projection pattern.
In one embodiment, the reaction data reflects the cat's response to the projected pattern; the projection pattern can be adjusted accordingly based on this reaction data to improve the effect of the projection interaction.
In one embodiment, the cat's state may be determined from the collected information, and the user may select an appropriate animation or pattern to project through the phone. For example, if the cat is determined to be currently active, the user may select dynamic, fast-moving patterns to project, such as quickly moving birds or butterflies, to attract the cat's attention.
And sending the response data to the mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the response data.
And adjusting the light effect parameter of the projection pattern according to the projection instruction to obtain a changed light effect pattern, wherein the basic parameters at least comprise the light effect parameter or the speed parameter.
And controlling the projection equipment to adjust the speed parameter of the light effect pattern to perform projection interaction.
In one embodiment, the light effect parameter may include luminous efficacy, luminous flux, illuminance, color temperature, and the like, which this example does not limit.
In one embodiment, the attention of the cat can be quickly attracted by adjusting the light effect parameter.
In one embodiment, the speed parameters may include average speed, instantaneous speed, maximum speed, acceleration, and the like.
In one embodiment, the projected pattern can be moved based on different speeds by adjusting the speed parameter to increase the activity of the cat.
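Moving the pattern according to the speed parameter can be sketched as a simple position update per control tick (illustrative only; the function name and units are assumptions):

```python
# Hypothetical per-tick update: advance the pattern position by velocity * dt.
def step_pattern(pos, velocity, dt):
    """Return the new (x, y) position after dt seconds at the given velocity."""
    return (pos[0] + velocity[0] * dt, pos[1] + velocity[1] * dt)
```

A larger velocity makes the pattern dart across the projection surface, which may suit a cat in an active state.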
In one embodiment, sound may also be played through the vehicle's audio system to increase interactivity and enjoyment for the cat.
In one embodiment, the mobile phone can also be used to remotely switch the DLP projection lamp on and off, adjust parameters such as the size and color of the light effect or image, and control the duration of the cat-teasing session.
In one embodiment, the remote interaction device may collect reaction data of the target interaction object in response to the projected pattern; the reaction data is then sent to the mobile terminal corresponding to the vehicle, and a projection instruction fed back by the mobile terminal according to the reaction data is received. Next, light effect parameters of the projection pattern, such as luminous efficacy, luminous flux, illuminance, and color temperature, are adjusted according to the projection instruction to obtain a changed light effect pattern. Finally, the projection device is controlled to adjust the speed parameter of the light effect pattern to carry out projection interaction; by adjusting the speed parameter, the projection pattern can be moved at different speeds, increasing the cat's activity. In this way, patterns suited to the cat's current emotion are provided according to changes in its behavior, adapting to changes in the cat's interests and habits and improving the interaction effect.
According to this embodiment, the remote interaction device may acquire a behavior type of the target interaction object, such as foraging behavior, territorial behavior, active behavior, static behavior, and the like, based on the behavior data, then send the behavior type to a mobile terminal corresponding to the vehicle, and receive a projection instruction fed back by the mobile terminal according to the behavior type, where the projection instruction includes at least the projection pattern and basic parameters of the projection pattern; finally, the projection device is controlled to adjust the projection pattern according to the basic parameters so as to carry out projection interaction. By adjusting the basic parameters according to changes in the cat's behavior, a pattern suited to the cat's current emotion can be provided, so that the user can interact with the cat at any time and in any place, improving the projection interaction effect. Furthermore, the remote interaction device may control the projection device to project the projection pattern to a preset range of the target interaction object, then send the reaction data of the target interaction object to the mobile terminal corresponding to the vehicle, and adjust the projection pattern according to the basic parameters to carry out projection interaction. In this way, projecting within the cat's visual range attracts the cat's attention, and the safe range of the cat-teasing process is controlled to ensure that the process is safe and reliable. Still further, the remote interaction device may collect reaction data of the target interaction object in response to the projection pattern; the reaction data is then sent to the mobile terminal corresponding to the vehicle, and a projection instruction fed back by the mobile terminal according to the reaction data is received.
Next, light effect parameters of the projection pattern, such as luminous efficacy, luminous flux, illuminance, and color temperature, are adjusted according to the projection instruction to obtain a changed light effect pattern. Finally, the projection device is controlled to adjust the speed parameter of the light effect pattern to carry out projection interaction; by adjusting the speed parameter, the projection pattern can be moved at different speeds, increasing the cat's activity. In this way, patterns suited to the cat's current emotion are provided according to changes in its behavior, adapting to changes in the cat's interests and habits and improving the interaction effect.
Referring to fig. 3, fig. 3 is a schematic flow chart of a third embodiment of the remote interaction method provided by the present invention.
Based on the above embodiments, in this embodiment, after step S10, the method further includes:
step S101: and carrying out image analysis on the image data through an image processing algorithm to obtain position data and posture data corresponding to the target interactive object.
In one embodiment, the image processing algorithm is an algorithm for analyzing an image, and may include image transformation, image enhancement, image smoothing, and the like.
For example, the image transformation may include: fourier transforms, walsh transforms, discrete cosine transforms, and the like.
In one embodiment, the amount of computation can be reduced by image transformation, resulting in a more efficient processing effect.
In one embodiment, enhancing the high-frequency components of the image makes the outlines of objects in the image clear and their details distinct, while enhancing the low-frequency components can reduce the effect of noise in the image.
In one embodiment, by smoothing the image, image distortion caused by the imaging device and environment during the actual imaging process can be removed, and useful information can be extracted. By the image processing algorithm, the efficiency of image analysis can be improved.
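As a minimal illustration of such image smoothing, a 3x3 box filter averages each pixel with its neighbors (a sketch only; a production system would typically use an optimized library routine):

```python
import numpy as np

def smooth(image: np.ndarray) -> np.ndarray:
    """Apply a 3x3 box filter (zero-padded borders) to a 2-D grayscale image."""
    padded = np.pad(image.astype(float), 1)   # zero-pad one pixel on each side
    out = np.zeros(image.shape, dtype=float)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + 3, j:j + 3].mean()  # average the 3x3 window
    return out
```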
In one embodiment, the position data is data describing the cat's location in space, such as latitude and longitude or relative position.
In one embodiment, the posture data is information describing the orientation and angle of the cat in space.
For example, the posture data may include features such as the cat's outline, body posture, and movement trajectory.
Step S102: and obtaining the distance parameter between the target interactive object and the projection equipment based on the radar data.
In one embodiment, the radar data is information collected by a radar system. The position, distance, speed, direction and other information of the cat can be detected and positioned by utilizing electromagnetic waves to detect and measure.
For example, the radar data may include a distance of the cat, an angle of the cat, a speed of the cat, and the like.
Step S103: and carrying out audio analysis on the audio data through a sound analysis algorithm to obtain volume and tone parameters corresponding to the target interactive object.
In one embodiment, the sound analysis algorithm refers to an algorithm that processes and analyzes sound signals. Acoustic signals of the cat are collected by an acoustic sensor or microphone and then analyzed using an algorithm to extract useful information.
By way of example, the sound analysis algorithm may include: spectrum analysis, voice recognition, voice event detection, voice enhancement, etc.
Illustratively, the sound signal may be converted into a frequency-domain representation by spectrum analysis to obtain the spectral features of the cat's sound. Illustratively, different sound types, such as speech, music, and ambient noise, may be identified and classified by sound recognition. Illustratively, specific sound events, such as a cat's sudden cries or sharp sounds, may be detected and localized by sound event detection. Illustratively, sound enhancement can improve the quality of a sound signal disturbed by noise, reducing the interference of ambient noise with the cat's sound signal.
In one embodiment, the accuracy of the voice recognition of the cat can be improved through a voice analysis algorithm.
In one embodiment, the volume and pitch parameters are two important parameters in the sound analysis algorithm. The volume parameter reflects the amplitude of the sound signal, i.e., the magnitude or intensity of the sound. In the sound analysis algorithm, the volume parameter can be calculated by measuring the amplitude or energy of the sound signal, distinguishing the cat's different sound intensities and dynamic ranges.
In one embodiment, the pitch parameter describes the frequency of the sound signal: the higher the frequency, the higher the pitch. In the sound analysis algorithm, the pitch parameter can be calculated by spectrum analysis or pitch estimation to reflect the emotion in the cat's sound.
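The two parameters can be estimated from a digitized signal, for example by taking the RMS amplitude as the volume and the dominant spectral frequency as a crude pitch estimate (a simplified sketch; practical pitch estimators are considerably more robust):

```python
import numpy as np

def volume_and_pitch(signal: np.ndarray, sample_rate: int):
    """Return (RMS volume, dominant frequency in Hz) of a 1-D audio signal."""
    rms = float(np.sqrt(np.mean(signal ** 2)))          # amplitude -> volume
    spectrum = np.abs(np.fft.rfft(signal))              # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    pitch = float(freqs[np.argmax(spectrum[1:]) + 1])   # skip the DC bin
    return rms, pitch
```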
Correspondingly, the sending the behavior data to the mobile terminal corresponding to the vehicle includes:
step S201: and sending the position data, the posture data, the distance parameter and the volume and tone parameter to a mobile terminal corresponding to the vehicle.
In one embodiment, the cat's behavior may be analyzed using artificial intelligence techniques, including image processing algorithms, sound analysis algorithms, and the like. The cat's behavior data is recorded through the mobile phone application and analyzed with artificial intelligence algorithms, and this information is combined to improve the accuracy and reliability of the information analysis.
Based on the foregoing embodiment, step S10 in this embodiment includes:
and collecting image data, radar data and audio data of the target interactive object.
In one embodiment, the audio data is digital sound data of the cat, acquired by a microphone or another sound sensor and then converted into digital form through analog-to-digital conversion (ADC).
The image data is obtained based on photographing by the image pickup apparatus, for example. The radar data is obtained based on detection of a radar sensor mounted on the vehicle. The audio data is acquired based on the sound sensor carried by the vehicle.
In one embodiment, the motion and body language of the cat may be captured using a high definition camera to identify the position and motion of the cat. Then, analyzing the shot image through an image processing algorithm, and determining the position and the posture of the cat. For example, image recognition and localization can be performed by detecting features such as contours, body gestures, motion trajectories, and the like of the cat.
In one embodiment, a radar sensor may be used to detect the position and movement of the cat. The radar sensor can transmit signals and receive reflected signals, and the position of the cat and the distance from the vehicle can be determined by calculating the propagation time of the signals and the intensity of the reflected signals.
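The distance follows from the round-trip propagation time of the electromagnetic signal, which travels at the speed of light out to the target and back:

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0  # metres per second

def radar_distance(round_trip_time_s: float) -> float:
    """Target distance in metres: the signal covers the path twice (out and back)."""
    return SPEED_OF_LIGHT_MPS * round_trip_time_s / 2.0
```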
In one embodiment, a voice sensor or microphone may be used to collect the voice of the cat and identify the voice of the cat through a voice analysis algorithm.
In one embodiment, the accuracy and reliability of information acquisition can be improved through the mode.
In one embodiment, information such as the cat's behavior and interaction state can be recorded through the mobile phone application, and the cat's interests and habits can be better understood through data analysis. This data helps the user better understand the cat's behavior patterns and preferences; once these are known, the mobile phone proactively filters animated images for the user to select, enabling better interaction with the cat.
In one embodiment, the mobile phone application can also automatically record and store the cat-teasing process, so that the user can conveniently review it at any time and share it with friends or pet specialists. Such a recording function can help the user better understand the cat's reactions and interests in order to improve the way the cat is amused.
This embodiment can improve the accuracy and reliability of information collection through multiple acquisition methods. Furthermore, the cat's behavior can be analyzed using artificial intelligence techniques, including image processing algorithms, sound analysis algorithms, and the like. The cat's behavior data is recorded through the mobile phone application and analyzed with artificial intelligence algorithms, and this information is combined to improve the accuracy and reliability of the information analysis.
Referring to fig. 4, fig. 4 is a block diagram illustrating a first embodiment of a remote interactive apparatus according to the present invention. The remote interaction device is applied to a vehicle, and the vehicle is provided with image pickup equipment and projection equipment. As shown in fig. 4, the apparatus includes: the system comprises a data acquisition module 41, a data feedback module 42 and a projection interaction module 43.
A data acquisition module 41, configured to acquire behavior data of a target interactive object, where the behavior data at least includes image data acquired by the image capturing device;
The data feedback module 42 is configured to send the behavior data to a mobile terminal corresponding to the vehicle, and receive a projection instruction fed back by the mobile terminal according to the behavior data, where the projection instruction at least includes a projection pattern;
and the projection interaction module 43 is used for controlling the projection equipment to perform projection interaction according to the projection instruction.
In an alternative manner, the data feedback module 42 is further configured to obtain a behavior type of the target interactive object based on the behavior data; the behavior type is sent to a mobile terminal corresponding to the vehicle, a projection instruction fed back by the mobile terminal according to the behavior type is received, and the projection instruction at least comprises the projection pattern and basic parameters of the projection pattern;
correspondingly, the projection interaction module 43 is further configured to control the projection device to adjust the projection pattern to perform projection interaction according to the basic parameter.
In an optional manner, the projection interaction module 43 is further configured to control the projection device to project the projection pattern to a preset range of the target interaction object, where the preset range at least includes a visual range corresponding to the target interaction object; and sending the response data of the target interactive object to the mobile terminal corresponding to the vehicle, and adjusting the projection pattern according to the basic parameters to carry out projection interaction.
In an alternative manner, the projection interaction module 43 is further configured to collect reaction data of the target interaction object in response to the projection pattern; the response data are sent to the mobile terminal corresponding to the vehicle, and a projection instruction fed back by the mobile terminal according to the response data is received; adjusting the light effect parameter of the projection pattern according to the projection instruction to obtain a changed light effect pattern, wherein the basic parameters at least comprise the light effect parameter or the speed parameter; and controlling the projection equipment to adjust the speed parameter of the light effect pattern to perform projection interaction.
In an optional manner, the data acquisition module 41 is further configured to acquire image data, radar data and audio data of the target interactive object, where the image data is acquired based on the image capturing device, the radar data is acquired based on detection by the radar sensor carried by the vehicle, and the audio data is acquired by the sound sensor carried by the vehicle.
In an alternative manner, the remote interaction device further includes a data analysis module 44, configured to perform image analysis on the image data through an image processing algorithm to obtain position data and posture data corresponding to the target interaction object; obtain a distance parameter between the target interaction object and the projection device based on the radar data; and perform audio analysis on the audio data through a sound analysis algorithm to obtain volume and tone parameters corresponding to the target interaction object;
Correspondingly, the data feedback module 42 is further configured to send the position data, the posture data, the distance parameter, and the volume and tone parameters to a mobile terminal corresponding to the vehicle.
In an optional manner, the remote interaction device further includes an anomaly detection module 45, configured to perform anomaly detection on the target interaction object according to the behavior data, so as to obtain abnormal behavior data; judging whether the abnormal behavior data are preset abnormal data or not; and when the abnormal behavior data is the preset abnormal data, carrying out abnormal alarm according to the abnormal behavior data.
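The anomaly check described above can be sketched as comparing observed behavior data against a preset normal range (the feature and thresholds here are illustrative assumptions, not part of this embodiment):

```python
# Hypothetical anomaly detection: flag activity levels outside a preset range.
def is_abnormal(activity_level: float, low: float = 0.1, high: float = 0.9) -> bool:
    """Return True when the observed activity level leaves the preset normal band."""
    return activity_level < low or activity_level > high
```

When this returns True, the device would raise an abnormality alarm according to the abnormal behavior data.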
The behavior data of the target interaction object can be collected through various devices such as monitoring equipment, smart collars, smart toys, and sound sensors. Based on the behavior data, the system can help the user better understand the pet's behavior habits and needs and provide more personalized care services. After the remote interaction device sends the behavior data to the mobile terminal corresponding to the vehicle, the user can start an application according to the pet's real-time behavior data and feed a projection instruction back to the remote interaction device: for example, turning on a DLP projection lamp so that the projection pattern is projected onto a plane the cat can see. During the cat-teasing projection process, the user can observe the cat's reaction through the mobile phone application and select parameters of the projection pattern on the mobile phone according to the cat's interests and reactions. By adjusting the parameters of the DLP projection lamp, the cat's attention can be drawn and the pleasure of the interaction increased. Because the mobile terminal remotely controls the vehicle's camera equipment in combination with the vehicle's projection equipment, the traditional pet-toy interaction mode is combined with modern technology, realizing closer, more convenient, and more interesting interaction between people and pets. This avoids the poor companionship effect of conventional pet toys when the user is away and the pet is at home, and improves the real-time interactivity between the user and the pet.
FIG. 5 is a schematic structural diagram of an embodiment of the remote interaction device according to the present invention; the specific implementation of the remote interaction device is not limited by the embodiments of the present invention.
As shown in fig. 5, the remote interactive apparatus may include: a processor 502, a communication interface (Communications Interface) 504, a memory 506, and a communication bus 508.
Wherein: processor 502, communication interface 504, and memory 506 communicate with each other via communication bus 508. A communication interface 504 for communicating with network elements of other devices, such as clients or other servers. The processor 502 is configured to execute the program 510, and may specifically perform the relevant steps in the foregoing embodiments of the remote interaction method.
In particular, program 510 may include program code comprising computer-executable instructions.
The processor 502 may be a central processing unit CPU, or a specific integrated circuit ASIC (Application Specific Integrated Circuit), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors comprised by the remote interaction device may be the same type of processor, such as one or more CPUs; but may also be different types of processors such as one or more CPUs and one or more ASICs.
A memory 506 for storing a program 510. Memory 506 may comprise high-speed RAM memory or may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 510 may be specifically adapted to be invoked by the processor 502 to cause the remote interaction device to perform the relevant steps described above for the remote interaction method embodiment.
Embodiments of the present invention provide a computer readable storage medium storing at least one executable instruction that, when executed on a remote interaction device/apparatus, causes the remote interaction device/apparatus to perform the remote interaction method of any of the above-described method embodiments.
The executable instructions may be particularly useful for causing a remote interaction device/apparatus to perform the relevant steps described above for embodiments of the remote interaction method.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. In addition, embodiments of the present invention are not directed to any particular programming language.
In the description provided herein, numerous specific details are set forth. It will be appreciated, however, that embodiments of the invention may be practiced without such specific details. Similarly, in the above description of exemplary embodiments of the invention, various features of embodiments of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. Wherein the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and arranged in one or more apparatuses different from those of the embodiments. The modules, units, or components of the embodiments may be combined into one module, unit, or component, and may furthermore be divided into a plurality of sub-modules, sub-units, or sub-components, except where at least some of such features and/or processes or elements are mutually exclusive.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.
Claims (10)
1. A remote interaction method, applied to a vehicle provided with a camera device and a projection device, the method comprising:
acquiring behavior data of a target interactive object, wherein the behavior data comprises at least image data captured by the camera device;
sending the behavior data to a mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the behavior data, wherein the projection instruction comprises at least a projection pattern; and
controlling the projection device to perform projection interaction according to the projection instruction.
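The three steps of claim 1 form a simple capture-exchange-project loop. The following is a minimal illustrative sketch, not the patented implementation; every function and variable name here is an assumption, and the camera, terminal, and projector are simulated as plain callables.

```python
# Hypothetical sketch of claim 1: acquire behavior data, exchange it
# with the paired mobile terminal, and drive the projection device.
# All names are illustrative assumptions, not the patent's API.

def acquire_behavior_data(camera):
    """Step 1: collect behavior data (here, just one image frame)."""
    return {"image": camera()}

def exchange_with_mobile_terminal(behavior_data, terminal):
    """Step 2: send the data to the mobile terminal and receive the
    projection instruction it feeds back."""
    return terminal(behavior_data)

def project(projector, instruction):
    """Step 3: perform projection interaction per the instruction."""
    return projector(instruction["pattern"])

# Simulated stand-ins for the vehicle's camera, the user's mobile
# terminal, and the projection device.
camera = lambda: "frame_0"
terminal = lambda data: {"pattern": "laser_dot" if data["image"] else "off"}
projector = lambda pattern: f"projecting:{pattern}"

data = acquire_behavior_data(camera)
instruction = exchange_with_mobile_terminal(data, terminal)
result = project(projector, instruction)
print(result)  # projecting:laser_dot
```

The point of the split is that the decision logic (which pattern to show) lives on the mobile terminal, while the vehicle only captures data and executes instructions.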
2. The method of claim 1, wherein the sending the behavior data to the mobile terminal corresponding to the vehicle and receiving the projection instruction fed back by the mobile terminal according to the behavior data comprises:
acquiring a behavior type of the target interactive object based on the behavior data; and
sending the behavior type to the mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the behavior type, wherein the projection instruction comprises at least the projection pattern and basic parameters of the projection pattern;
and correspondingly, the controlling the projection device to perform projection interaction according to the projection instruction comprises:
controlling the projection device to adjust the projection pattern according to the basic parameters so as to perform projection interaction.
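Claim 2 refines the exchange: the vehicle first reduces raw behavior data to a behavior type, and the terminal's instruction then carries both the pattern and its basic parameters. A toy sketch under assumed names (the behavior labels, parameter keys, and mapping table are all illustrative):

```python
# Illustrative sketch of claim 2: classify behavior data into a
# behavior type, then look up the projection instruction (pattern +
# basic parameters) that the mobile terminal would feed back.

def classify_behavior(behavior_data):
    # Toy classifier: map a posture keyword to a behavior type.
    posture = behavior_data.get("posture", "")
    if posture == "crouched":
        return "stalking"
    if posture == "running":
        return "chasing"
    return "idle"

# Hypothetical terminal-side mapping from behavior type to projection
# instruction; "speed" and "brightness" stand in for basic parameters.
INSTRUCTIONS = {
    "stalking": {"pattern": "slow_dot", "params": {"speed": 0.2, "brightness": 0.8}},
    "chasing":  {"pattern": "fast_dot", "params": {"speed": 1.0, "brightness": 1.0}},
    "idle":     {"pattern": "pulse",    "params": {"speed": 0.1, "brightness": 0.5}},
}

behavior_type = classify_behavior({"posture": "crouched"})
instruction = INSTRUCTIONS[behavior_type]
print(behavior_type, instruction["pattern"])  # stalking slow_dot
```

Sending only the compact behavior type (rather than raw image data) keeps the vehicle-to-terminal link lightweight.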
3. The method of claim 2, wherein the controlling the projection device to adjust the projection pattern according to the basic parameters so as to perform projection interaction comprises:
controlling the projection device to project the projection pattern into a preset range of the target interactive object, wherein the preset range comprises at least the visible range corresponding to the target interactive object; and
sending reaction data of the target interactive object to the mobile terminal corresponding to the vehicle, and adjusting the projection pattern according to the basic parameters to perform projection interaction.
4. The method of claim 3, wherein the sending the reaction data of the target interactive object to the mobile terminal corresponding to the vehicle and adjusting the projection pattern according to the basic parameters to perform projection interaction comprises:
collecting reaction data of the target interactive object in response to the projection pattern;
sending the reaction data to the mobile terminal corresponding to the vehicle, and receiving a projection instruction fed back by the mobile terminal according to the reaction data;
adjusting a light-effect parameter of the projection pattern according to the projection instruction to obtain a changed light-effect pattern, wherein the basic parameters comprise at least the light-effect parameter or a speed parameter; and
controlling the projection device to adjust the speed parameter of the light-effect pattern to perform projection interaction.
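Claims 3–4 describe a feedback loop: project, observe the target's reaction, report it to the terminal, and adjust the light-effect and speed parameters in response. A minimal sketch, assuming a simple "engaged"/"bored" reaction label and a multiplicative speed update (both are my assumptions, not the patent's):

```python
# Illustrative feedback step for claim 4: apply the light-effect
# parameter fed back by the mobile terminal, then tune the speed
# parameter to the observed reaction. Names and the update rule
# are hypothetical.

def adjust_pattern(pattern, reaction, instruction):
    updated = dict(pattern)
    # Light-effect parameter comes from the terminal's instruction ...
    updated["brightness"] = instruction["brightness"]
    # ... speed parameter reacts to the target's engagement.
    updated["speed"] = pattern["speed"] * (1.5 if reaction == "engaged" else 0.5)
    return updated

pattern = {"brightness": 0.5, "speed": 1.0}
instruction = {"brightness": 0.9}  # fed back by the mobile terminal
adjusted = adjust_pattern(pattern, "engaged", instruction)
print(adjusted)  # {'brightness': 0.9, 'speed': 1.5}
```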
5. The method of claim 1, wherein the acquiring the behavior data of the target interactive object comprises:
collecting image data, radar data, and audio data of the target interactive object, wherein the image data is captured by the camera device, the radar data is detected by a radar sensor mounted on the vehicle, and the audio data is collected by a sound sensor mounted on the vehicle.
6. The method of claim 5, further comprising, after the acquiring the behavior data of the target interactive object:
performing image analysis on the image data through an image processing algorithm to obtain position data and posture data corresponding to the target interactive object;
obtaining a distance parameter between the target interactive object and the projection device based on the radar data; and
performing audio analysis on the audio data through a sound analysis algorithm to obtain volume and pitch parameters corresponding to the target interactive object;
and correspondingly, the sending the behavior data to the mobile terminal corresponding to the vehicle comprises:
sending the position data, the posture data, the distance parameter, and the volume and pitch parameters to the mobile terminal corresponding to the vehicle.
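Claim 6 fuses three sensor streams into one payload for the mobile terminal: image analysis yields position/posture, radar yields distance, and audio analysis yields volume and pitch. The sketch below uses trivial stand-ins for the image processing and sound analysis algorithms named in the claim; the field names and data shapes are assumptions.

```python
# Illustrative multi-sensor fusion for claim 6. Each analyze_* function
# is a placeholder for the real algorithm named in the claim.

def analyze_image(image_data):
    # Stand-in for an image processing algorithm.
    return {"position": image_data["centroid"], "posture": image_data["pose"]}

def analyze_radar(radar_data):
    # Stand-in for range extraction: nearest radar return as distance.
    return {"distance_m": min(radar_data["ranges"])}

def analyze_audio(audio_data):
    # Stand-in for a sound analysis algorithm (volume and pitch).
    return {"volume_db": audio_data["rms_db"], "pitch_hz": audio_data["f0"]}

def build_payload(image_data, radar_data, audio_data):
    """Merge all analysis results into one payload for the terminal."""
    payload = {}
    payload.update(analyze_image(image_data))
    payload.update(analyze_radar(radar_data))
    payload.update(analyze_audio(audio_data))
    return payload

payload = build_payload(
    {"centroid": (120, 80), "pose": "crouched"},
    {"ranges": [3.2, 2.8, 4.1]},
    {"rms_db": 42.0, "f0": 310.0},
)
print(payload["distance_m"])  # 2.8
```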
7. The method of claim 1, further comprising, after the controlling the projection device to perform projection interaction according to the projection instruction:
performing anomaly detection on the target interactive object according to the behavior data to obtain abnormal behavior data;
determining whether the abnormal behavior data matches preset anomaly data; and
when the abnormal behavior data matches the preset anomaly data, raising an anomaly alarm according to the abnormal behavior data.
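Claim 7 adds a safety layer: screen the behavior data for anomalies and alarm only when the detected anomaly is in a preset set. A minimal sketch; the anomaly labels, threshold, and detector logic below are illustrative assumptions.

```python
# Illustrative anomaly check for claim 7: detect, match against the
# preset anomaly set, and raise an alarm only on a match.

PRESET_ANOMALIES = {"collision_risk", "prolonged_immobility"}

def detect_anomaly(behavior_data):
    # Toy detector: flag the target as immobile if essentially no
    # movement was observed.
    if behavior_data.get("movement", 0.0) < 0.01:
        return "prolonged_immobility"
    return None

def maybe_alarm(behavior_data):
    anomaly = detect_anomaly(behavior_data)
    if anomaly in PRESET_ANOMALIES:
        return f"ALARM:{anomaly}"
    return None

print(maybe_alarm({"movement": 0.0}))  # ALARM:prolonged_immobility
print(maybe_alarm({"movement": 0.5}))  # None
```

Gating on a preset set keeps transient or benign oddities from triggering alarms.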
8. A remote interaction apparatus, applied to a vehicle provided with a camera device and a projection device, the apparatus comprising:
a data acquisition module, configured to acquire behavior data of a target interactive object, wherein the behavior data comprises at least image data captured by the camera device;
a data feedback module, configured to send the behavior data to a mobile terminal corresponding to the vehicle and receive a projection instruction fed back by the mobile terminal according to the behavior data, wherein the projection instruction comprises at least a projection pattern; and
a projection interaction module, configured to control the projection device to perform projection interaction according to the projection instruction.
9. A remote interaction device, comprising: a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with one another via the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations of the remote interaction method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored therein at least one executable instruction which, when run on a remote interaction device/apparatus, causes the remote interaction device/apparatus to perform the operations of the remote interaction method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311516201.2A CN117547803A (en) | 2023-11-13 | 2023-11-13 | Remote interaction method, device, equipment and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117547803A | 2024-02-13 |
Family
ID=89810450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311516201.2A Pending CN117547803A (en) | 2023-11-13 | 2023-11-13 | Remote interaction method, device, equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117547803A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118585070A (en) * | 2024-08-05 | 2024-09-03 | 高维度(深圳)生物信息智能应用有限公司 | Intelligent interaction method, system and device based on entertainment device |
CN118585070B (en) * | 2024-08-05 | 2024-10-29 | 高维度(深圳)生物信息智能应用有限公司 | Intelligent interaction method, system and device based on entertainment device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12149819B2 (en) | Autonomous media capturing | |
CN110826358B (en) | Animal emotion recognition method and device and storage medium | |
JP7154678B2 (en) | Target position acquisition method, device, computer equipment and computer program | |
CN112088379A (en) | Method and apparatus for determining weights for convolutional neural networks | |
EP3229459B1 (en) | Information processing device, information processing method and program | |
US11977981B2 (en) | Device for automatically capturing photo or video about specific moment, and operation method thereof | |
JP6589880B2 (en) | Information processing system, control method, and storage medium | |
CN106782544A (en) | Interactive voice equipment and its output intent | |
US20190294129A1 (en) | Work support system and information processing method | |
CN110163066A (en) | Multi-medium data recommended method, device and storage medium | |
US10743061B2 (en) | Display apparatus and control method thereof | |
CN117547803A (en) | Remote interaction method, device, equipment and computer readable storage medium | |
WO2017212958A1 (en) | Information processing device, information processing method, and program | |
US20130286244A1 (en) | System and Method for Image Selection and Capture Parameter Determination | |
CN110177258A (en) | Image processing method, image processing apparatus, server, and storage medium | |
US20150022329A1 (en) | Assisted Animal Communication | |
CN109257490A (en) | Audio-frequency processing method, device, wearable device and storage medium | |
CN116156048B (en) | Volume adjustment method, system, equipment and medium based on artificial intelligence | |
JPWO2020022371A1 (en) | Robots and their control methods and programs | |
CN107452381A (en) | A kind of multi-media voice identification device and method | |
CN113728941B (en) | Intelligent pet dog domestication method and system | |
JP7092110B2 (en) | Information processing equipment, information processing methods, and programs | |
CN104184943A (en) | Image shooting method and device | |
EP3776537A1 (en) | Intelligent assistant device communicating non-verbal cues | |
CN109361859A (en) | A kind of image pickup method, terminal and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||