CN113612969A - Method and device for transmitting video data for remote control of unmanned equipment - Google Patents
- Publication number
- CN113612969A (application number CN202110864531.5A)
- Authority
- CN
- China
- Prior art keywords
- video data
- camera
- state
- transmission
- unmanned
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Studio Devices (AREA)
Abstract
The specification discloses a method and a device for video data transmission for remote control of unmanned equipment. First, state data of the unmanned equipment at the current moment is acquired. Second, a video transmission strategy corresponding to each camera in the current state of the unmanned equipment is determined according to the state data, wherein the video transmission strategy comprises the transmission parameters adopted when the video data collected by each camera is transmitted. Then, the video data collected by each camera and to be transmitted is adjusted according to the video transmission strategy corresponding to that camera. Finally, the adjusted video data is transmitted to a remote control system, and a control instruction sent by the remote control system on the basis of the adjusted video data is received, so as to control the driving of the unmanned equipment. Because the method transmits the video data best suited to the current state, a remote driver can obtain, in time, the clearest video data available in that state, which improves the safety of the unmanned equipment during driving.
Description
Technical Field
The specification relates to the technical field of unmanned driving, and in particular to a method and a device for video data transmission for remote control of unmanned equipment.
Background
In the field of unmanned driving, real-time monitoring by a remote driver is the main way to guarantee the safe driving of unmanned equipment: when the unmanned equipment breaks down or encounters an emergency, the remote driver temporarily takes over, and parks the unmanned equipment nearby or drives it away from the fault location.
At present, the remote driver obtains information about the environment around the unmanned equipment through video shot by cameras mounted on it, and drives and controls the vehicle over a wireless communication network. During remote monitoring and driving, the remote driver may fail to obtain clear video information in time, which creates a potential safety hazard while the unmanned equipment is driving.
How to obtain clear video information in time is therefore a problem to be solved urgently.
Disclosure of Invention
The present specification provides a method, an apparatus, a storage medium, and an unmanned device for video data transmission for remote control of unmanned equipment, which partially solve the above problems in the prior art.
The technical scheme adopted by the specification is as follows:
the present specification provides a method for video data transmission for remote control of an unmanned device, wherein a plurality of cameras are mounted on the unmanned device and are used for shooting the surrounding environment of the unmanned device, and the method includes:
acquiring state data of the unmanned equipment at the current moment;
determining a video transmission strategy corresponding to each camera in the current state of the unmanned equipment according to the state data, wherein the video transmission strategy comprises transmission parameters adopted when video data collected by the cameras are transmitted, and the transmission parameters comprise at least one of a frame rate, a code rate and a resolution;
adjusting video data collected by the cameras to be transmitted according to a video transmission strategy corresponding to the cameras determined based on the current state of the unmanned equipment;
and transmitting the adjusted video data to a remote control system, and receiving a control instruction sent by the remote control system based on the adjusted video data so as to control the unmanned equipment to run.
Optionally, determining, according to the state data, a video transmission policy corresponding to each camera in a current state of the unmanned aerial vehicle, specifically including:
determining, according to the state data, the state in which the unmanned equipment is currently located as the current state;
and determining the video transmission strategies corresponding to the cameras in the current state through a preset transmission strategy table according to the current state, wherein the transmission strategy table comprises the video transmission strategies of the cameras corresponding to the states of the unmanned equipment.
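The table lookup described above can be sketched as a small mapping from vehicle state to per-camera parameters. All state names, camera names, and parameter values below are illustrative assumptions, not values taken from the patent.

```python
# Hypothetical "preset transmission strategy table": state -> {camera: parameters}.
# Values are illustrative assumptions only.
TRANSMISSION_STRATEGY_TABLE = {
    "standby": {
        "front": {"transmit": True, "frame_rate": 5,  "bitrate_kbps": 500,  "resolution": (640, 360)},
        "rear":  {"transmit": True, "frame_rate": 5,  "bitrate_kbps": 500,  "resolution": (640, 360)},
    },
    "driving": {
        "front": {"transmit": True, "frame_rate": 30, "bitrate_kbps": 4000, "resolution": (1920, 1080)},
        "rear":  {"transmit": True, "frame_rate": 15, "bitrate_kbps": 1500, "resolution": (1280, 720)},
    },
}

def lookup_policies(current_state: str) -> dict:
    """Return the per-camera video transmission policies for the current state."""
    return TRANSMISSION_STRATEGY_TABLE[current_state]
```

Keeping the table as plain data (rather than branching code) matches the described design: new states or cameras only require new table rows.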
Optionally, the cameras include a compensation camera, and the transmission parameters further include: information used for representing whether the video data collected by a camera is transmitted;
determining a video transmission strategy corresponding to each camera in the current state through a preset transmission strategy table according to the current state, wherein the video transmission strategy specifically comprises the following steps:
and if the unmanned equipment is determined to be in the steering state, determining that video data collected by the compensation cameras corresponding to the steering direction of the unmanned equipment is transmitted in the steering state through a preset transmission strategy table, and reducing transmission parameters adopted when video data collected by the cameras corresponding to the non-steering direction of the unmanned equipment is transmitted.
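The steering-state rule above can be sketched as follows: enable the compensation camera on the turning side and lower the transmission parameters of the cameras facing the non-steering directions. The camera names and the reduced/full parameter values are assumptions for illustration.

```python
def steering_policy(turn_direction: str) -> dict:
    """Per-camera transmission parameters while turning left or right.
    Parameter values (30 fps / 4 Mbps full, 10 fps / 1 Mbps reduced) are assumed."""
    cameras = ["front", "rear", "left", "right"]
    comp = {"left": "left_compensation", "right": "right_compensation"}[turn_direction]
    # The compensation camera on the turning side transmits at full parameters.
    policy = {comp: {"transmit": True, "frame_rate": 30, "bitrate_kbps": 4000}}
    for cam in cameras:
        reduced = cam != turn_direction  # non-steering directions get lower parameters
        policy[cam] = {
            "transmit": True,
            "frame_rate": 10 if reduced else 30,
            "bitrate_kbps": 1000 if reduced else 4000,
        }
    return policy
```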
Optionally, determining, according to the current state, a video transmission policy corresponding to each camera in the current state through a preset transmission policy table, specifically including:
for each peripheral obstacle, if the distance between the peripheral obstacle and the unmanned equipment is smaller than a set distance threshold value, determining that the unmanned equipment is in a meeting state;
and determining transmission parameters adopted when video data collected by cameras in the direction of surrounding obstacles are transmitted in the meeting state through a preset transmission strategy table.
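A minimal sketch of the meeting-state check: if any surrounding obstacle is closer than a set distance threshold, the camera facing that obstacle is flagged so its transmission parameters can be raised. The threshold, the coordinate convention (x forward, y left), and the direction bucketing are illustrative assumptions.

```python
import math

def meeting_cameras(obstacles, threshold_m=10.0):
    """obstacles: list of (dx, dy) offsets from the vehicle in metres,
    with dx pointing forward and dy pointing left (assumed convention).
    Returns the set of camera directions whose parameters should be raised."""
    raised = set()
    for dx, dy in obstacles:
        if math.hypot(dx, dy) < threshold_m:  # closer than the set threshold
            # Bucket the obstacle bearing into front/rear/left/right.
            if abs(dx) >= abs(dy):
                raised.add("front" if dx > 0 else "rear")
            else:
                raised.add("left" if dy > 0 else "right")
    return raised
```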
Optionally, the cameras comprise a plurality of around-view cameras with different orientations;
determining, according to the current state, the video transmission policy corresponding to each camera in the current state through a preset transmission policy table specifically includes:
if it is determined that the unmanned device is in an abnormal state, determining, through the preset transmission policy table, the transmission parameters adopted when transmitting, in the abnormal state, the panoramic around-view video data collected by the around-view cameras, wherein the panoramic around-view video data is obtained by fusing the video data collected by the around-view cameras in different orientations.
Optionally, determining, according to the current state, a video transmission policy corresponding to each camera in the current state through a preset transmission policy table, specifically including:
determining the signal intensity of the unmanned equipment at the current moment according to the positioning information of the unmanned equipment at the current moment and a preset signal intensity map, wherein the signal intensity map records the intensity of network signals in different geographic areas;
and determining the video transmission strategy corresponding to each camera under the signal intensity through a preset transmission strategy table.
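The signal-strength lookup above can be sketched by mapping the vehicle's position to a cell of a pre-recorded signal map and choosing a policy tier. The grid size, the dBm values, and the tier thresholds are all illustrative assumptions, not data from the patent.

```python
# Assumed survey data: (grid_x, grid_y) -> measured signal strength in dBm.
SIGNAL_MAP = {
    (0, 0): -70, (0, 1): -95, (1, 0): -80, (1, 1): -110,
}

def signal_tier(x_m: float, y_m: float, cell_size_m: float = 100.0) -> str:
    """Map a position to a transmission tier via the recorded signal map."""
    cell = (int(x_m // cell_size_m), int(y_m // cell_size_m))
    strength = SIGNAL_MAP.get(cell, -120)  # unknown areas treated as weak signal
    if strength >= -75:
        return "high"    # full frame rate / code rate / resolution
    if strength >= -95:
        return "medium"  # reduced code rate
    return "low"         # lowest transmission parameters
```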
Optionally, determining, according to the current state, a video transmission policy corresponding to each camera in the current state through a preset transmission policy table, specifically including:
if the fact that the network delay between the unmanned equipment and the remote control system is larger than a set delay threshold value is detected, the unmanned equipment is determined to be in a delay state;
and determining transmission parameters adopted when video data collected by other cameras except the video data collected by the camera in front of the unmanned equipment are transmitted in the delayed state through a preset transmission strategy table.
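The delay-state rule can be sketched as follows: when the measured delay to the remote control system exceeds the threshold, the front camera keeps full parameters and all other cameras are cut back. The 300 ms threshold and the parameter values are assumptions for illustration.

```python
def delay_policy(delay_ms: float, cameras, threshold_ms: float = 300.0) -> dict:
    """Per-camera parameters given the measured network delay (values assumed)."""
    if delay_ms <= threshold_ms:
        # Not in the delay state: all cameras transmit at full parameters.
        return {cam: {"frame_rate": 30, "bitrate_kbps": 4000} for cam in cameras}
    policy = {}
    for cam in cameras:
        if cam == "front":
            # The forward view is kept at full quality for safe remote driving.
            policy[cam] = {"frame_rate": 30, "bitrate_kbps": 4000}
        else:
            policy[cam] = {"frame_rate": 5, "bitrate_kbps": 500}
    return policy
```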
Optionally, the cameras further comprise a cargo compartment camera;
the method further comprises the following steps:
for each cargo position in the cargo compartment of the unmanned equipment, determining cargo state change information corresponding to the cargo position according to the video data acquired by the cargo compartment camera and the cargo distribution information corresponding to the cargo position, wherein the cargo state change information is used for representing whether the cargo at the cargo position has been taken away;
and transmitting the cargo state change information corresponding to each cargo position to the remote control system.
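The cargo-state step can be sketched by comparing the positions the compartment camera currently detects as occupied against the recorded cargo distribution. Detecting occupancy from the camera frames is out of scope here and is assumed to be given; the position IDs and labels are illustrative.

```python
def cargo_state_changes(recorded: dict, detected_occupied: set) -> dict:
    """recorded: {position_id: True if cargo was placed there}.
    detected_occupied: position IDs the camera currently sees as occupied.
    Returns {position_id: "taken" | "present" | "empty"}."""
    changes = {}
    for pos, had_cargo in recorded.items():
        if had_cargo and pos not in detected_occupied:
            changes[pos] = "taken"    # cargo was there and is now gone
        elif pos in detected_occupied:
            changes[pos] = "present"  # cargo still in place
        else:
            changes[pos] = "empty"    # position was never loaded
    return changes
```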
Optionally, the method further comprises:
receiving an automatic adjustment instruction sent by the remote control system, wherein the automatic adjustment instruction is generated when the remote control system detects that the network delay between the unmanned equipment and the remote control system is greater than a set delay threshold value or the packet loss rate in the video data transmission process is greater than a set packet loss rate threshold value;
and reducing transmission parameters adopted when the video data collected by each camera is transmitted according to the automatic adjustment instruction.
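A sketch of acting on the automatic-adjustment instruction: scale down every camera's transmission parameters when the remote side reports high delay or packet loss. The scale factor and the minimum floors are illustrative assumptions.

```python
def apply_auto_adjustment(policies: dict, factor: float = 0.5) -> dict:
    """Reduce each camera's frame rate and code rate by `factor` (assumed 0.5),
    with floors so the stream never drops to zero."""
    adjusted = {}
    for cam, p in policies.items():
        adjusted[cam] = {
            "frame_rate": max(1, int(p["frame_rate"] * factor)),
            "bitrate_kbps": max(100, int(p["bitrate_kbps"] * factor)),
        }
    return adjusted
```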
Optionally, the method further comprises:
transmitting the adjusted video data to a remote control system, so that, if the remote control system detects that the adjusted video data contains an image of a specified object, it amplifies the image area in which that image is located, wherein the specified object includes: a traffic light.
Optionally, the current state of the unmanned device comprises: a specific state, the specific state comprising: at least one of turning, turning around, hard braking and collision;
according to the state data, determining a video transmission strategy corresponding to each camera in the current state of the unmanned equipment, specifically comprising:
and if the current state of the unmanned equipment is determined to be the specific state, determining a video transmission strategy for increasing the number of cameras on which the current collected video data is based and/or increasing the numerical value of a transmission parameter adopted when the current collected video data is transmitted.
This specification provides a device for video data transmission for remote control of an unmanned device, wherein a plurality of cameras are mounted on the unmanned device and are used for shooting the surrounding environment of the unmanned device, and the device includes:
the acquisition module is used for acquiring the state data of the unmanned equipment at the current moment;
the determining module is used for determining a video transmission strategy corresponding to each camera in the current state of the unmanned equipment according to the state data, wherein the video transmission strategy comprises transmission parameters adopted when video data collected by the cameras are transmitted, and the transmission parameters comprise at least one of a frame rate, a code rate and a resolution;
the adjusting module is used for adjusting the video data acquired by each camera to be transmitted according to the video transmission strategy corresponding to each camera determined based on the current state of the unmanned equipment;
and the transmission module is used for transmitting the adjusted video data to a remote control system and receiving a control instruction sent by the remote control system based on the adjusted video data so as to control the unmanned equipment to run.
The present specification provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above-described method for video data transmission for remote control of an unmanned device.
The present specification provides an unmanned device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the above-described method for video data transmission for remote control of an unmanned device when executing the program.
The technical scheme adopted by the specification can achieve the following beneficial effects:
In the method for video data transmission for remote control of unmanned equipment provided by this specification, the state data of the unmanned equipment at the current moment is first acquired. Second, the video transmission strategy corresponding to each camera in the current state of the unmanned equipment is determined according to the state data, wherein the video transmission strategy comprises the transmission parameters adopted when the video data collected by each camera is transmitted, and the transmission parameters comprise at least one of a frame rate, a code rate, and a resolution. Then, the video data to be transmitted, collected by each camera, is adjusted according to the video transmission strategy corresponding to that camera as determined from the current state. Finally, the adjusted video data is transmitted to a remote control system, and a control instruction sent by the remote control system on the basis of the adjusted video data is received, so as to control the driving of the unmanned equipment.
As can be seen from the above, in the current state of the unmanned equipment, the method can determine whether to transmit the video data acquired by each camera to the remote control system and can adjust the video data to be transmitted. In the prior art, by contrast, the video data collected by the cameras on the unmanned equipment is not processed and is transmitted directly to the remote control system. Because this method transmits the video data best suited to the current state, a remote driver can obtain, in time, the clearest video data available in that state, thereby improving the safety of the unmanned equipment during driving.
Drawings
The accompanying drawings, which are included to provide a further understanding of the specification and are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and together with the description serve to explain the specification; they do not limit the specification. In the drawings:
fig. 1 is a schematic flowchart of a method for video data transmission for remote control of an unmanned device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a compensation camera deployment provided by an embodiment of the present description;
fig. 3 is a schematic diagram of an around-view camera deployment provided in an embodiment of the present description;
fig. 4 is a schematic diagram of a camera deployment on an unmanned device provided by an embodiment of the present description;
FIG. 5 is a schematic structural diagram of an apparatus for video data transmission for remote control of an unmanned device according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an unmanned device provided in an embodiment of the present specification.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more clear, the technical solutions of the present disclosure will be clearly and completely described below with reference to the specific embodiments of the present disclosure and the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present disclosure, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present specification without any creative effort belong to the protection scope of the present specification.
The technical solutions provided by the embodiments of the present description are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a method for video data transmission for remote control of an unmanned device, provided in an embodiment of the present specification, which specifically includes the following steps:
s100: and acquiring the state data of the unmanned equipment at the current moment.
In the embodiment of the specification, during the movement of the unmanned device, the state data of the unmanned device at the current moment can be acquired. The unmanned device may be equipped with various sensors, such as cameras, a navigation and positioning system, a lidar, and a millimeter-wave radar, which sense the environment around the unmanned device during travel to obtain the required state data. The state data mentioned here characterizes the data acquired by the unmanned device through its sensors at each moment, and may include: position data of the unmanned device, position data of obstacles around the unmanned device, speed data of the unmanned device, steering angle data of the unmanned device, and the like.
The unmanned device referred to in this specification may be an unmanned vehicle, a robot, an automated delivery device, or similar equipment capable of automatic driving. On this basis, the unmanned device to which the method for video data transmission for remote control provided by this specification applies can be used to execute delivery tasks in the delivery field, for example business scenarios in which unmanned devices handle express delivery, logistics, takeaway, and the like.
The execution subject of the method for video data transmission for remote control of unmanned equipment provided by this specification may be the unmanned device itself, or a terminal device such as a server or a desktop computer. For convenience of description, the method is described below with the unmanned device as the execution subject.
S102: and determining a video transmission strategy corresponding to each camera in the current state of the unmanned equipment according to the state data, wherein the video transmission strategy comprises transmission parameters adopted when video data collected by the cameras are transmitted, and the transmission parameters comprise at least one of a frame rate, a code rate and a resolution.
In this specification, a plurality of cameras are installed on the unmanned device and are used for shooting its surrounding environment; the video data is the data acquired by these cameras, which may be of any type, such as wide-angle cameras, high-definition cameras, and the like.
In this embodiment of the present specification, the unmanned device may determine, according to the state data, the video transmission policy corresponding to each camera in the state in which the unmanned device is currently located, where the video transmission policy includes the transmission parameters used when the video data collected by the camera is transmitted, and the transmission parameters include at least one of a frame rate, a code rate, and a resolution. For example, in different states of the unmanned device, one or more of the frame rate, code rate, and resolution used when transmitting the video data collected by a camera are adjusted.
Further, the unmanned device may determine, according to the state data, the state in which it is currently located as the current state, and then determine, through a preset transmission policy table, the video transmission policy corresponding to each camera in the current state, where the transmission policy table records the video transmission policies of each camera for each state of the unmanned device.
It should be noted that, in addition to modifying the transmission parameters used when transmitting the video data acquired by the cameras, the video transmission policy may also include changing the number of cameras on which the currently acquired video data is based to obtain video ranges with different fields of view, or changing the video acquisition frequency of the cameras on which the currently acquired video data is based to obtain video data with a higher frame rate.
For example, if the drone is in a standby state (i.e., stops driving), the drone may be considered to be safer at the current time, and thus the video capture frequency of the camera on which the video data is currently captured may be reduced. For another example, if the unmanned device is in a driving state, attention needs to be paid to the surrounding environment of the unmanned device, and in order to ensure safe driving of the unmanned device, the video acquisition frequency of the camera based on which the current video data is acquired can be increased, so as to obtain video data with a higher frame rate. That is, the change of the surrounding environment can be timely recognized through the video data with a high frame rate, so that the safe driving of the unmanned equipment is ensured.
Correspondingly, under the condition that other conditions are not changed, if the unmanned equipment reduces the video acquisition frequency of the camera on which the current acquired video data is based, the data volume of the transmitted video data is reduced. If the unmanned equipment improves the video acquisition frequency of the camera on which the current video data is acquired, the data volume of the transmitted video data is increased.
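The relationship above is simple arithmetic: the transmitted data volume scales with the encoded code rate, which in turn tracks the capture frame rate. A back-of-the-envelope illustration (numbers are illustrative only):

```python
def data_volume_mb(bitrate_kbps: float, seconds: float) -> float:
    """Approximate transmitted volume in megabytes for a constant code rate."""
    return bitrate_kbps * seconds / 8 / 1000  # kbit -> kB -> MB

# 60 s at 4 Mbit/s; halving the code rate (e.g. after halving the
# frame rate, other things being equal) halves the transmitted volume.
full = data_volume_mb(4000, 60)
half = data_volume_mb(2000, 60)
```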
In the embodiment of the specification, when the unmanned device is in an automatic driving state or a standby state, it can be considered safer at the current moment. To reduce the consumption of network resources, the unmanned device can therefore reduce the frame rate and code rate adopted when transmitting the video data collected by each camera in these states.
In practical applications, the unmanned device carries four cameras mounted outside the vehicle: front, rear, left, and right. During driving, however, the viewing-angle range covered by these four cameras is limited; that is, blind spots exist while the unmanned device is driving. A compensation camera can therefore be added to enlarge the viewing-angle range of the video data collected on the unmanned device and improve safety during driving.
In this embodiment, the unmanned device carries compensation cameras: a left compensation camera located at the left front of the unmanned device and a right compensation camera located at the right front, as shown in fig. 2. The transmission parameters further include information used to represent whether the video data collected by a camera is transmitted.
Fig. 2 is a schematic diagram of a compensated camera deployment provided in an embodiment of the present description.
In fig. 2, black filled circles represent cameras. The left compensation camera is located at the left front of the unmanned device and can collect video data within the viewing-angle range to the left front; the right compensation camera is located at the right front and can collect video data within the viewing-angle range to the right front. Blind spots in the collected video data are thereby avoided, improving the safety of the unmanned device during driving.
In practical applications, the network bandwidth on the unmanned device is limited. When the unmanned device transmits the video data collected by a compensation camera, the bandwidth may be insufficient, and the video received by the remote control system may stutter, freeze, or even break up. It is therefore necessary to reduce the transmission parameters used when transmitting the video data collected by the other cameras, to guarantee the fluency of the video data received by the remote control system.
In this embodiment, the unmanned device may determine its current state according to the acquired state data, and if the current state is determined to be a specific state such as a turn, a U-turn, a sudden brake, or a collision, a video transmission policy may be determined that increases the number of cameras on which the currently collected video data is based and/or increases the values of the transmission parameters adopted when transmitting the currently collected video data.
Specifically, if it is determined that the unmanned device is currently in a steering state, the unmanned device may determine, through the preset transmission policy table, that the video data collected by the compensation camera corresponding to the steering direction is transmitted in the steering state, and reduce the transmission parameters used when transmitting the video data collected by the cameras corresponding to the non-steering directions. For example, when the unmanned device turns left or makes a left U-turn, the video data collected by its left compensation camera is transmitted to the remote control system, and the frame rate and code rate adopted when transmitting the video data collected by its right, front, and rear cameras are reduced. It should be noted that, in the steering state, the transmission parameters (frame rate and code rate) of any one or more of the cameras corresponding to the non-steering directions may be reduced.
In this embodiment, for each peripheral obstacle, if it is detected that the distance between the peripheral obstacle and the unmanned equipment is smaller than a set distance threshold, the unmanned equipment may determine that the unmanned equipment is in a meeting state, and then determine, through a preset transmission policy table, a transmission parameter used when video data acquired by a camera in a direction where the peripheral obstacle is located is transmitted in the meeting state. For example, when the unmanned device meets another vehicle, if it is detected that the distance between the unmanned device and the vehicle in front is smaller than a set distance threshold, the frame rate and the code rate used when transmitting the video data collected by the camera in front of the unmanned device are increased.
In an embodiment of the present specification, the cameras on the unmanned device include a plurality of around-view cameras with different orientations, and the unmanned device can fuse the video data collected by these around-view cameras to obtain panoramic around-view video data. The panoramic around-view video data referred to here can provide a remote driver with full-range visual information around the unmanned device, as shown in fig. 3.
Fig. 3 is a schematic diagram of surround-view camera deployment provided in an embodiment of the present specification.
In fig. 3, black filled circles represent the surround-view cameras. A surround-view camera may be a wide-angle camera with a large field of view, so the unmanned device can satisfy the required viewing range with fewer surround-view cameras while still collecting omnidirectional video data.
It should be noted that the surround-view cameras may instead be high-definition cameras with a narrower field of view but higher definition. The unmanned device can then satisfy the required viewing range with more high-definition cameras, improving the definition of the collected video data.
Specifically, the unmanned device may place a plurality of calibration points in the video region captured by each surround-view camera, perform distortion correction on the video data collected by each camera, then stitch the corrected video data and fuse the overlapping regions of adjacent video streams, so as to achieve seamless stitching and obtain the panoramic surround-view video data.
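The fusion of the overlapping region between two adjacent corrected frames can be illustrated with a simple linear blend. This is a minimal stand-in for the stitching step described above: a real system would first perform the calibration-point alignment and distortion correction, which are omitted here, and color frames (H×W×3 arrays) are assumed.

```python
import numpy as np

def blend_overlap(left: np.ndarray, right: np.ndarray, overlap: int) -> np.ndarray:
    """Stitch two horizontally adjacent color frames, linearly blending the
    `overlap`-pixel-wide region they share. The blend weight ramps from 1
    (pure left frame) to 0 (pure right frame) across the overlap, so the
    seam between the two cameras' images disappears gradually."""
    alpha = np.linspace(1.0, 0.0, overlap)[None, :, None]  # shape (1, overlap, 1)
    fused = left[:, -overlap:] * alpha + right[:, :overlap] * (1.0 - alpha)
    return np.concatenate(
        [left[:, :-overlap], fused, right[:, overlap:]], axis=1
    ).astype(left.dtype)
```

Repeating this pairwise over all adjacent surround-view streams (and wrapping around) yields one seamless panoramic frame.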
In practice, because the panoramic surround-view video data is large, the unmanned device can send it at a low frame rate and low code rate during normal driving to reduce the consumption of network resources. If an emergency occurs while driving, the transmission parameters used when transmitting the panoramic surround-view video data can be increased, so that the remote driver obtains clearer video data in time and can better handle the emergency.
In the embodiment of the present specification, if the unmanned device determines that it is in an abnormal state, it determines, through the preset transmission policy table, that in that state the transmission parameters (frame rate, code rate) used when transmitting the panoramic surround-view video data collected by the surround-view cameras are to be increased. The abnormal state mentioned here may refer to sudden situations such as hard braking or a collision of the unmanned device.
In practical applications, the strength of the network signal varies with the area the unmanned device passes through while driving; that is, different areas have different signal strengths, and when the unmanned device is in an area with a weak network signal, the network bandwidth currently available to it is low. Because such areas also suffer high network delay, reducing the frame rate of the video data would further lengthen the interval between what the remote driver sees and the actual situation, lowering driving safety. Therefore, the unmanned device can instead reduce the code rate and resolution used when transmitting the video data collected by all cameras, reducing the required network bandwidth and improving the stability of video transmission.
In this embodiment, the unmanned device may determine its signal strength at the current moment from its current positioning information and a preset signal strength map. The signal strength map mentioned here records the strength of network signals in different geographic areas; it may be a high-precision map annotated with the network signal strength of each area, or a map dedicated to recording network signal strength. The unmanned device then determines, according to this signal strength, the video transmission policy corresponding to each camera in its current state: in a geographic area with a weak network signal it can reduce the code rate and resolution used when transmitting the video data collected by all cameras, and in an area with a strong signal it can increase them.
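A coarse sketch of the signal-strength-map lookup: the current position is reduced to a grid cell, the cell's recorded strength is read, and the code rate and resolution (but deliberately not the frame rate, per the reasoning above) are chosen from it. The map contents, thresholds, and parameter values are all hypothetical.

```python
# Hypothetical signal strength map: grid cell -> strength (arbitrary 0-100 scale).
SIGNAL_STRENGTH_MAP = {
    (120, 30): 85,   # strong-signal area
    (121, 30): 20,   # weak-signal area
}

def signal_strength(lon: float, lat: float, default: int = 50) -> int:
    """Return the recorded strength for the grid cell containing (lon, lat)."""
    return SIGNAL_STRENGTH_MAP.get((round(lon), round(lat)), default)

def params_for_strength(strength: int) -> dict:
    """Choose code rate (kbps) and resolution from the signal strength.
    Frame rate is intentionally left untouched so the remote driver's
    view does not lag further in weak-signal, high-delay areas."""
    if strength < 40:  # weak signal: shrink bandwidth use
        return {"kbps": 800, "resolution": (640, 360)}
    return {"kbps": 4000, "resolution": (1280, 720)}
```

As the vehicle drives, freshly measured strengths would be written back into the map to keep it accurate, as the following paragraph notes.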
It should be noted that the strength of the network signal in a given geographic area may change over time. To keep the signal strength map accurate, the unmanned device can therefore update the map with the signal strengths it measures in each geographic area as it drives.
In the embodiment of the present specification, if the unmanned device detects a large network delay, then to keep driving normally it may give priority to preserving the definition of the video collected by the important front camera. Accordingly, if the unmanned device detects that the network delay between itself and the remote control system is greater than a set delay threshold, it determines that it is in a delay state, and then determines, through the preset transmission policy table, that in the delay state the transmission parameters (frame rate, code rate, resolution) used when transmitting the video data collected by all cameras other than the front camera are to be reduced.
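The delay-state rule above reduces to: when the measured delay exceeds the threshold, every camera except the front camera is marked for reduced parameters. The threshold value and camera names below are assumptions for illustration only.

```python
DELAY_THRESHOLD_MS = 200  # hypothetical; the patent leaves the threshold unspecified

def adjust_for_delay(delay_ms: float, cameras: list) -> dict:
    """If the network delay exceeds the threshold, mark every camera except
    the front camera for reduced frame rate, code rate and resolution; the
    front camera is left untouched so its definition is preserved."""
    if delay_ms <= DELAY_THRESHOLD_MS:
        return {cam: "unchanged" for cam in cameras}
    return {cam: ("unchanged" if cam == "front" else "reduced") for cam in cameras}
```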
In the embodiment of the present specification, the unmanned device may be applied to an unmanned delivery service. A cargo bay in the unmanned device contains a plurality of cargo spaces and a cargo bay camera for monitoring changes in the state of the goods in the bay. Because the number of cargo spaces is large and the video picture displayed by the remote control system is small, the remote driver may not be able to accurately observe the state change of the goods in each cargo space. Therefore, those state changes can be identified through image recognition.
In an embodiment of the present specification, for each cargo space in its cargo bay, the unmanned device may determine the cargo state change information corresponding to that space according to the video data collected by the cargo bay camera and the cargo delivery information corresponding to the space, where the cargo state change information represents whether the goods in the space have been removed, and then transmit the cargo state change information for each space to the remote control system. The cargo delivery information mentioned here represents the delivery details of the goods, such as their cargo space, delivery time, and pick-up information. That is to say, based on the cargo delivery information, the unmanned device can judge from the cargo bay camera's video data whether the goods in the cargo space recorded in that information were taken away correctly, and then transmit the recognition result to the remote control system.
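One possible shape for the per-cargo-space decision, assuming the image-recognition step has already produced an "empty / not empty" result for the space. The field names (`pickup_due`, `correct`, etc.) are hypothetical; the patent only specifies that removal is checked against the delivery information.

```python
def cargo_state_change(space_id: str, detected_empty: bool, delivery_info: dict) -> dict:
    """Decide, for one cargo space, whether its goods were removed correctly.
    `detected_empty` stands in for the image-recognition result on the cargo
    bay camera's video; `delivery_info` maps a cargo space to its delivery
    details, including whether a pickup was due. Returns a small report to
    transmit to the remote control system."""
    scheduled = delivery_info.get(space_id, {}).get("pickup_due", False)
    removed = bool(detected_empty)
    return {
        "space": space_id,
        "removed": removed,
        "correct": removed == scheduled,  # removed exactly when a pickup was due
    }
```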
In the embodiments of the present specification, cameras with different functions, such as compensation cameras, surround-view cameras, and cargo bay cameras, may be deployed on the unmanned device at the same time, as shown in fig. 4.
Fig. 4 is a schematic diagram of camera deployment on an unmanned device provided in an embodiment of the present specification.
In fig. 4, black filled circles represent the cameras. The cargo bay camera is located inside the unmanned device, while all other cameras are located outside it. The surround-view cameras may be mounted at the same positions as the other cameras or at different ones; mounting them higher than the other cameras increases the viewing range of the video data they collect. The video data collected by the surround-view cameras must first be fused into panoramic surround-view video data before being transmitted to the remote control system, whereas every other camera can transmit its collected video data to the remote control system independently.
S104: and adjusting the video data acquired by each camera to be transmitted according to the video transmission strategy corresponding to each camera determined based on the current state of the unmanned equipment.
In this embodiment, the unmanned device may adjust the video data collected by each camera that needs to be transmitted according to the video transmission policy corresponding to each camera determined based on its current state.
In this embodiment of the present specification, the unmanned device receives an automatic adjustment instruction sent by the remote control system, where the automatic adjustment instruction is generated when the remote control system detects that the network delay between the unmanned device and the remote control system is greater than a set delay threshold, or that the packet loss rate during video data transmission is greater than a set packet loss rate threshold. The unmanned device then reduces, according to the automatic adjustment instruction, the transmission parameters used when transmitting the video data collected by each camera. That is to say, the remote control system can monitor the network delay and packet loss rate of the video transmission in real time in order to adjust the video data that each camera needs to transmit.
In this embodiment, the unmanned device may receive a remote control instruction sent by the remote control system, where the remote control instruction is generated from transmission parameters that the remote driver enters for the video data collected by each camera, and then adjust the video data to be transmitted according to that instruction. That is to say, the remote driver can, according to his or her own requirements on viewing angle and definition, enter in the remote control system the transmission parameters to use when transmitting each camera's video data and send the resulting remote control instruction to the unmanned device. For example, when the unmanned device turns left, the remote driver may send it, through the remote control system, an instruction to increase the frame rate and code rate used when transmitting the panoramic surround-view video data.
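A sketch of how driver-entered transmission parameters might be merged into the current per-camera settings on the vehicle; the field names are illustrative and the patent does not prescribe a message format.

```python
def apply_remote_instruction(current: dict, instruction: dict) -> dict:
    """Merge driver-entered transmission parameters (per camera) into the
    current per-camera settings. Cameras or parameters the instruction does
    not mention keep their current values; `current` itself is not mutated."""
    updated = {cam: dict(params) for cam, params in current.items()}
    for cam, params in instruction.items():
        updated.setdefault(cam, {}).update(params)
    return updated
```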
S106: and transmitting the adjusted video data to a remote control system, and receiving a control instruction sent by the remote control system based on the adjusted video data so as to control the unmanned equipment to run.
In this embodiment, the unmanned aerial vehicle may transmit the adjusted video data to the remote control system, and receive a control instruction sent by the remote control system to control the unmanned aerial vehicle to travel.
The unmanned device transmits the adjusted video data to the remote control system so that, if the remote control system detects that the adjusted video data contains an image of a specified object, it enlarges the image area in which that image is located; the specified object includes traffic lights. The image of the specified object mentioned here may appear in one frame of the adjusted video data, i.e., what is actually enlarged is the video region in which the specified object is located. The specified object may also be another object related to the travel of the unmanned device, such as various signboards, and may be set manually as required. For example, when the unmanned device drives up to a traffic light intersection, the remote control system detects the image of the traffic light in the adjusted video data and enlarges the image area in which it is located, avoiding misidentification of the light and improving the safety of the unmanned device while driving. Of course, the remote driver can also freely adjust or enlarge any region of interest in the video data.
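Enlarging the image area of a detected object can be as simple as cropping its bounding box and upscaling the crop. Detection itself is out of scope here; the bounding box is assumed given, and nearest-neighbour scaling stands in for whatever interpolation a real display system would use.

```python
import numpy as np

def enlarge_region(frame: np.ndarray, box: tuple, scale: int = 2) -> np.ndarray:
    """Crop the bounding box `box = (x, y, w, h)` of a detected object
    (e.g. a traffic light) out of a frame and enlarge it by integer
    nearest-neighbour scaling (each pixel repeated `scale` times per axis)."""
    x, y, w, h = box
    crop = frame[y:y + h, x:x + w]
    return np.repeat(np.repeat(crop, scale, axis=0), scale, axis=1)
```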
With this method, it can be determined, in the current state of the unmanned device, whether the video data collected by each camera is to be transmitted to the remote control system, and the frame rate, code rate, and resolution of the video data to be transmitted can be adjusted. The remote driver can also determine the video transmission policy for each camera according to his or her own requirements on viewing angle and definition, so as to adjust the frame rate, code rate, and resolution of each camera's video data. By transmitting the most appropriate video data for the current state, the method ensures that the remote driver obtains, in time, the clearest video data available in that state, improving the safety of the unmanned device while driving.
Based on the same idea, for the method of transmitting video data for remote control of an unmanned device provided in one or more embodiments of the present specification, the present specification also provides a corresponding apparatus for transmitting video data for remote control of an unmanned device, as shown in fig. 5.
Fig. 5 is a schematic structural diagram of an apparatus for transmitting video data for remote control of an unmanned device according to an embodiment of the present specification, where a plurality of cameras are installed on the unmanned device and are used for capturing the environment around it. The apparatus specifically includes:
an obtaining module 500, configured to obtain state data of the unmanned device at a current moment;
a determining module 502, configured to determine, according to the state data, a video transmission policy corresponding to each camera in the current state of the unmanned device, where the video transmission policy includes transmission parameters used when transmitting the video data collected by the camera, and the transmission parameters include at least one of a frame rate, a code rate, and a resolution;
an adjusting module 504, configured to adjust the video data collected by each camera that needs to be transmitted according to the video transmission policy corresponding to each camera determined based on the current state of the unmanned device;
a transmission module 506, configured to transmit the adjusted video data to a remote control system, and receive a control instruction sent by the remote control system based on the adjusted video data, so as to control the unmanned device to run.
Optionally, the determining module 502 is specifically configured to determine, according to the state data, the state of the unmanned device at the current moment as the current state, and to determine, according to the current state and through a preset transmission policy table, the video transmission policy corresponding to each camera in that state, where the transmission policy table contains the video transmission policies of each camera for each state of the unmanned device.
Optionally, the camera includes a compensation camera, and the transmission parameters further include: the information is used for representing whether the video data collected by the camera is transmitted or not;
the determining module 502 is specifically configured to, if it is determined that the unmanned device is in a steering state, determine, through a preset transmission policy table, that the video data collected by the compensation camera corresponding to the steering direction of the unmanned device is transmitted in the steering state, and to reduce the transmission parameters used when transmitting the video data collected by the cameras corresponding to the non-steering directions of the unmanned device.
Optionally, the determining module 502 is specifically configured to, for each peripheral obstacle, determine that the unmanned device is in a vehicle-meeting state if it is detected that a distance between the peripheral obstacle and the unmanned device is smaller than a set distance threshold, and determine, through a preset transmission policy table, a transmission parameter used when video data collected by cameras in a direction in which the peripheral obstacle is located is transmitted in the vehicle-meeting state.
Optionally, the cameras comprise a plurality of surround-view cameras with different orientations;
the determining module 502 is specifically configured to, if it is determined that the unmanned device is in an abnormal state, determine, through a preset transmission policy table, that the transmission parameters used when transmitting the panoramic surround-view video data collected by the surround-view cameras in the abnormal state are to be increased, where the panoramic surround-view video data is obtained by fusing the video data collected by the surround-view cameras in different orientations.
Optionally, the determining module 502 is specifically configured to determine, according to the positioning information of the unmanned device at the current moment and a preset signal strength map, the signal strength of the unmanned device at the current moment, where the signal strength map records the strength of network signals in different geographic areas, and to determine, through a preset transmission policy table, the video transmission policy corresponding to each camera under that signal strength.
Optionally, the determining module 502 is specifically configured to, if it is detected that the network delay between the unmanned device and the remote control system is greater than a set delay threshold, determine that the unmanned device is in a delay state, and to determine, through a preset transmission policy table, that in the delay state the transmission parameters used when transmitting the video data collected by all cameras other than the front camera of the unmanned device are to be reduced.
Optionally, the cameras further comprise a cargo bay camera;
the transmission module 506 is specifically configured to, for each cargo space in a cargo bay in the unmanned device, determine cargo state change information corresponding to the cargo space according to the video data acquired by the cargo bay camera and cargo delivery information corresponding to the cargo space, where the cargo state change information is used to represent whether cargo in the cargo space is removed, and transmit the cargo state change information corresponding to each cargo space to a remote control system.
Optionally, the transmission module 506 is specifically configured to receive an automatic adjustment instruction sent by the remote control system, where the automatic adjustment instruction is generated when the remote control system detects that a network delay between the unmanned equipment and the remote control system is greater than a set delay threshold, or a packet loss rate in a video data transmission process is greater than a set packet loss rate threshold, and according to the automatic adjustment instruction, reduce a transmission parameter used when transmitting the video data acquired by each camera.
Optionally, the transmission module 506 is specifically configured to transmit the adjusted video data to a remote control system, so that if the remote control system detects that the adjusted video data includes an image of a specified object, the remote control system enlarges an image area where the image of the specified object in the adjusted video data is located, where the specified object includes: traffic lights.
Optionally, the current state of the unmanned device comprises: a specific state, the specific state comprising: at least one of turning, turning around, hard braking and collision;
the determining module 502 is specifically configured to determine, if it is determined that the current state of the unmanned aerial vehicle is the specific state, a video transmission policy for increasing the number of cameras on which the currently acquired video data is based and/or increasing the value of a transmission parameter used when the currently acquired video data is transmitted.
The present specification also provides a computer-readable storage medium storing a computer program operable to execute the method for transmitting video data for remote control of an unmanned device provided in fig. 1 above.
The present specification also provides a schematic block diagram of an unmanned device corresponding to fig. 1, shown in fig. 6. As shown in fig. 6, at the hardware level the unmanned device includes a processor, an internal bus, a network interface, a memory, and a non-volatile storage, and of course may also include hardware needed for other services. The processor reads the corresponding computer program from the non-volatile storage into the memory and runs it to implement the method of transmitting video data for remote control of an unmanned device described above with reference to fig. 1. Of course, besides a software implementation, this specification does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the processing flow is not limited to logic units and may also be hardware or logic devices.
In the 1990s, it was still possible to clearly distinguish whether an improvement to a technology was an improvement in hardware (for example, an improvement to a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement to a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures: designers almost always obtain the corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore it cannot be said that an improvement to a method flow cannot be realized with a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it himself, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of hand-crafting integrated circuit chips, this programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compilers used in program development; the source code to be compiled must likewise be written in a particular programming language, called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that hardware circuitry that implements the logical method flows can be readily obtained by merely slightly programming the method flows into an integrated circuit using the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functionality can be implemented entirely by logically programming the method steps, so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may thus be regarded as a hardware component, and the means included in it for realizing various functions may also be regarded as structures within the hardware component. Indeed, means for realizing various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the various elements may be implemented in the same one or more software and/or hardware implementations of the present description.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
This description may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiment is described relatively briefly because it is substantially similar to the method embodiment; for relevant details, refer to the corresponding description of the method embodiment.
The above description is only an example of the present specification, and is not intended to limit the present specification. Various modifications and alterations to this description will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present specification should be included in the scope of the claims of the present specification.
Claims (14)
1. A method for video data transmission for remote control of unmanned equipment, characterized in that a plurality of cameras are installed on the unmanned equipment and are used to capture the surrounding environment of the unmanned equipment, the method comprising:
acquiring state data of the unmanned equipment at the current moment;
determining a video transmission strategy corresponding to each camera in the current state of the unmanned equipment according to the state data, wherein the video transmission strategy comprises transmission parameters adopted when video data collected by the cameras are transmitted, and the transmission parameters comprise at least one of a frame rate, a code rate and a resolution;
adjusting video data collected by the cameras to be transmitted according to a video transmission strategy corresponding to the cameras determined based on the current state of the unmanned equipment;
and transmitting the adjusted video data to a remote control system, and receiving a control instruction sent by the remote control system based on the adjusted video data so as to control the unmanned equipment to run.
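The four steps of the claimed method (acquire state data, determine a per-camera transmission strategy, adjust, transmit) can be sketched as follows. All function names, camera labels, thresholds, and numeric values here are hypothetical illustrations, not taken from the patent:

```python
# Minimal sketch of the claimed flow; names and values are illustrative only.

def acquire_state_data():
    # Stand-in for reading the unmanned equipment's sensors at the current moment.
    return {"speed_mps": 5.0, "steering_angle_deg": 12.0}

def determine_video_strategy(state_data):
    # Derive per-camera transmission parameters (frame rate, code rate,
    # resolution) from the state data, as recited in claim 1.
    turning = abs(state_data["steering_angle_deg"]) > 10.0  # assumed threshold
    return {
        "front": {"fps": 30, "kbps": 4000, "res": (1920, 1080)},
        "rear":  {"fps": 30 if turning else 10,
                  "kbps": 2000 if turning else 500,
                  "res": (1280, 720)},
    }

def adjust_for_transmission(strategy):
    # Re-encode each camera's stream per its parameters (encoding and the
    # actual transmission are stubbed out; we just report the target format).
    return {cam: f"{p['res'][0]}x{p['res'][1]}@{p['fps']}fps/{p['kbps']}kbps"
            for cam, p in strategy.items()}

adjusted = adjust_for_transmission(determine_video_strategy(acquire_state_data()))
```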
2. The method according to claim 1, wherein determining, according to the state data, the video transmission strategy corresponding to each camera in the current state of the unmanned equipment specifically comprises:
determining, according to the state data, the state that the unmanned equipment is currently in as the current state;
and determining, according to the current state and through a preset transmission strategy table, the video transmission strategy corresponding to each camera in the current state, wherein the transmission strategy table contains the video transmission strategy of each camera corresponding to each state of the unmanned equipment.
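One plausible data shape for the preset transmission strategy table of claim 2 is a nested mapping from state to camera to parameters; the states, camera names, and numbers below are assumed for illustration:

```python
# Hypothetical transmission strategy table: state -> camera -> parameters.
STRATEGY_TABLE = {
    "normal":   {"front": {"fps": 30, "kbps": 4000},
                 "rear":  {"fps": 10, "kbps": 500}},
    "steering": {"front": {"fps": 30, "kbps": 4000},
                 "rear":  {"fps": 30, "kbps": 2000}},
}

def lookup_strategy(current_state):
    # Fall back to the "normal" profile for states the table does not list.
    return STRATEGY_TABLE.get(current_state, STRATEGY_TABLE["normal"])
```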
3. The method of claim 2, wherein the cameras comprise compensation cameras, and the transmission parameters further comprise: information used to represent whether the video data collected by a camera is transmitted;
determining, according to the current state and through a preset transmission strategy table, the video transmission strategy corresponding to each camera in the current state specifically comprises:
and if it is determined that the unmanned equipment is in a steering state, determining, through the preset transmission strategy table, that the video data collected by the compensation camera corresponding to the steering direction of the unmanned equipment is transmitted in the steering state, and reducing the transmission parameters adopted when transmitting the video data collected by the cameras corresponding to the non-steering directions of the unmanned equipment.
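Claim 3's steering rule could look like the following sketch: transmit the compensation camera that faces the turn direction and cut parameters for cameras facing away. The camera naming scheme and the reduced frame rate are assumptions:

```python
def steering_strategy(turn_direction, cameras):
    # turn_direction: "left" or "right"; cameras: list of camera names.
    strategy = {}
    for cam in cameras:
        if cam == f"compensation_{turn_direction}":
            # Compensation camera on the turn side: transmit at full rate.
            strategy[cam] = {"transmit": True, "fps": 30}
        elif cam.startswith("compensation_"):
            # Compensation camera on the other side: not transmitted.
            strategy[cam] = {"transmit": False, "fps": 0}
        else:
            # Cameras not facing the steering direction: reduced parameters.
            strategy[cam] = {"transmit": True, "fps": 10}
    return strategy

s = steering_strategy("left", ["front", "compensation_left", "compensation_right"])
```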
4. The method according to claim 2, wherein determining, according to the current state and through a preset transmission strategy table, the video transmission strategy corresponding to each camera in the current state specifically comprises:
for each surrounding obstacle, if the distance between the surrounding obstacle and the unmanned equipment is smaller than a set distance threshold, determining that the unmanned equipment is in a meeting state;
and determining, through a preset transmission strategy table, the transmission parameters adopted when transmitting, in the meeting state, the video data collected by the cameras facing the directions of the surrounding obstacles.
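The trigger in claim 4 is a simple per-obstacle distance comparison; the 5-metre threshold below is an assumed example value, not one stated in the patent:

```python
DISTANCE_THRESHOLD_M = 5.0  # assumed "set distance threshold"

def is_meeting_state(obstacle_distances_m):
    # The equipment enters the meeting state as soon as any surrounding
    # obstacle is closer than the threshold.
    return any(d < DISTANCE_THRESHOLD_M for d in obstacle_distances_m)
```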
5. The method of claim 2, wherein the cameras comprise a number of look-around cameras with different orientations;
determining, according to the current state and through a preset transmission strategy table, the video transmission strategy corresponding to each camera in the current state specifically comprises:
and if it is determined that the unmanned equipment is in an abnormal state, determining, through a preset transmission strategy table, the transmission parameters adopted when transmitting, in the abnormal state, panoramic look-around video data collected by the look-around cameras, wherein the panoramic look-around video data is obtained by fusing the video data collected by the look-around cameras with different orientations.
6. The method according to claim 2, wherein determining, according to the current state and through a preset transmission strategy table, the video transmission strategy corresponding to each camera in the current state specifically comprises:
determining the signal strength of the unmanned equipment at the current moment according to the positioning information of the unmanned equipment at the current moment and a preset signal strength map, wherein the signal strength map records the strength of network signals in different geographic areas;
and determining, through a preset transmission strategy table, the video transmission strategy corresponding to each camera under that signal strength.
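Claim 6 keys the strategy on signal strength looked up from a preset geographic map. A grid-cell lookup is one way such a map could be realised; the cell size, strength labels, and code rates below are all assumptions:

```python
# Hypothetical signal strength map: grid cell -> recorded strength label.
SIGNAL_MAP = {(0, 0): "strong", (0, 1): "weak"}

def signal_strength_at(x_m, y_m, cell_size_m=100.0):
    # Map the positioning fix to a grid cell and read the recorded strength.
    cell = (int(x_m // cell_size_m), int(y_m // cell_size_m))
    return SIGNAL_MAP.get(cell, "unknown")

def strategy_for_strength(strength):
    # Weaker signal -> lower code rate for every camera.
    table = {"strong": {"kbps": 4000}, "weak": {"kbps": 800}}
    return table.get(strength, {"kbps": 1500})
```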
7. The method according to claim 2, wherein determining, according to the current state and through a preset transmission strategy table, the video transmission strategy corresponding to each camera in the current state specifically comprises:
if it is detected that the network delay between the unmanned equipment and the remote control system is greater than a set delay threshold, determining that the unmanned equipment is in a delay state;
and determining, through a preset transmission strategy table, the transmission parameters adopted when transmitting, in the delay state, the video data collected by the cameras other than the camera in front of the unmanned equipment.
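Claim 7's delay handling reduces to a threshold test plus a front-camera exemption; the 200 ms threshold and the frame rates are assumed example values:

```python
DELAY_THRESHOLD_MS = 200  # assumed "set delay threshold"

def delay_strategy(measured_delay_ms, cameras):
    # In the delay state, keep the forward view at full rate and reduce
    # the parameters of every other camera.
    in_delay_state = measured_delay_ms > DELAY_THRESHOLD_MS
    return {cam: {"fps": 30 if (cam == "front" or not in_delay_state) else 10}
            for cam in cameras}

s = delay_strategy(350, ["front", "left", "right"])
```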
8. The method of claim 1, wherein the cameras further comprise a cargo hold camera;
the method further comprises:
for each cargo slot in the cargo hold of the unmanned equipment, determining cargo state change information corresponding to the cargo slot according to the video data collected by the cargo hold camera and cargo distribution information corresponding to the cargo slot, wherein the cargo state change information is used to represent whether the cargo in the cargo slot has been taken away;
and transmitting the cargo state change information corresponding to each cargo slot to the remote control system.
9. The method of claim 1, wherein the method further comprises:
receiving an automatic adjustment instruction sent by the remote control system, wherein the automatic adjustment instruction is generated when the remote control system detects that the network delay between the unmanned equipment and the remote control system is greater than a set delay threshold value or the packet loss rate in the video data transmission process is greater than a set packet loss rate threshold value;
and reducing, according to the automatic adjustment instruction, the transmission parameters adopted when transmitting the video data collected by each camera.
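The device-side response to claim 9's automatic adjustment instruction can be as simple as scaling every camera's parameters down; the halving factor and the floor values are assumptions for illustration:

```python
def apply_auto_adjustment(strategy, factor=0.5):
    # Reduce frame rate and code rate for every camera, with floors so the
    # streams never drop to zero.
    return {cam: {"fps": max(1, int(p["fps"] * factor)),
                  "kbps": max(100, int(p["kbps"] * factor))}
            for cam, p in strategy.items()}

reduced = apply_auto_adjustment({"front": {"fps": 30, "kbps": 4000},
                                 "rear":  {"fps": 10, "kbps": 500}})
```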
10. The method of claim 1, wherein the method further comprises:
transmitting the adjusted video data to a remote control system, so that the remote control system, upon detecting that the adjusted video data contains an image of a specified object, enlarges the image area in which the image of the specified object is located, wherein the specified object comprises: a traffic light.
11. The method of any one of claims 1-10, wherein the current state of the unmanned equipment comprises a specific state, the specific state comprising at least one of: turning, making a U-turn, hard braking, and collision;
determining, according to the state data, the video transmission strategy corresponding to each camera in the current state of the unmanned equipment specifically comprises:
and if it is determined that the current state of the unmanned equipment is the specific state, determining a video transmission strategy that increases the number of cameras whose currently collected video data is transmitted and/or increases the values of the transmission parameters adopted when transmitting the currently collected video data.
12. A device for video data transmission for remote control of unmanned equipment, characterized in that a plurality of cameras are installed on the unmanned equipment and are used to capture the surrounding environment of the unmanned equipment, the device comprising:
an acquisition module, configured to acquire state data of the unmanned equipment at the current moment;
a determining module, configured to determine, according to the state data, a video transmission strategy corresponding to each camera in the current state of the unmanned equipment, wherein the video transmission strategy comprises transmission parameters adopted when transmitting the video data collected by the cameras, and the transmission parameters comprise at least one of a frame rate, a code rate, and a resolution;
an adjusting module, configured to adjust the video data collected by each camera to be transmitted according to the video transmission strategy corresponding to each camera determined based on the current state of the unmanned equipment;
and a transmission module, configured to transmit the adjusted video data to a remote control system and to receive a control instruction sent by the remote control system based on the adjusted video data, so as to control the unmanned equipment to run.
13. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 10.
14. Unmanned equipment, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the method of any one of claims 1 to 11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110864531.5A CN113612969A (en) | 2021-07-29 | 2021-07-29 | Method and device for transmitting video data for remote control of unmanned equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113612969A true CN113612969A (en) | 2021-11-05 |
Family
ID=78338561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110864531.5A Withdrawn CN113612969A (en) | 2021-07-29 | 2021-07-29 | Method and device for transmitting video data for remote control of unmanned equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113612969A (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130202025A1 (en) * | 2012-02-02 | 2013-08-08 | Canon Kabushiki Kaisha | Method and system for transmitting video frame data to reduce slice error rate |
CN106558121A (en) * | 2015-09-30 | 2017-04-05 | 中兴通讯股份有限公司 | The method and device of shooting |
CN107719356A (en) * | 2017-11-03 | 2018-02-23 | 李青松 | Pure electronic special-purpose vehicle Unmanned Systems and method |
CN108931971A (en) * | 2018-05-24 | 2018-12-04 | 奇瑞汽车股份有限公司 | For unpiloted mobile terminal, vehicle, server and Unmanned Systems |
CN111510735A (en) * | 2020-04-21 | 2020-08-07 | 新石器慧通(北京)科技有限公司 | Encoding transmission method and device for multi-channel video in weak network environment and unmanned vehicle |
CN111510681A (en) * | 2020-04-23 | 2020-08-07 | 新石器慧通(北京)科技有限公司 | Video processing method and device for unmanned vehicle, terminal equipment and storage medium |
CN111634234A (en) * | 2020-05-26 | 2020-09-08 | 东风汽车股份有限公司 | Remote driving vehicle end scene information acquisition and information display method based on combination of multiple cameras and radar and remote driving method |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114179823A (en) * | 2021-11-18 | 2022-03-15 | 鄂尔多斯市普渡科技有限公司 | Speed control method of unmanned vehicle |
CN114025401A (en) * | 2021-11-23 | 2022-02-08 | 广州小鹏自动驾驶科技有限公司 | Remote driving processing method and device and vehicle |
CN113852795A (en) * | 2021-11-29 | 2021-12-28 | 新石器慧通(北京)科技有限公司 | Video picture adjusting method and device, electronic equipment and storage medium |
CN113852795B (en) * | 2021-11-29 | 2022-08-30 | 新石器慧通(北京)科技有限公司 | Video picture adjusting method and device, electronic equipment and storage medium |
CN114928715A (en) * | 2022-04-07 | 2022-08-19 | 西安万像电子科技有限公司 | Data transmission method and system |
CN115065807A (en) * | 2022-05-30 | 2022-09-16 | 东风汽车集团股份有限公司 | Remote driving monitoring video picture adjusting method |
CN115442408A (en) * | 2022-09-13 | 2022-12-06 | 脉冲视觉(北京)科技有限公司 | Image data transmission processing method, device, medium and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113612969A (en) | Method and device for transmitting video data for remote control of unmanned equipment | |
CN111619560B (en) | Vehicle control method and device | |
JP7481790B2 (en) | Enhanced High Dynamic Range Imaging and Tone Mapping | |
US10394237B2 (en) | Perceiving roadway conditions from fused sensor data | |
CN107209856B (en) | Environmental scene condition detection | |
US9840253B1 (en) | Lane keeping system for autonomous vehicle during camera drop-outs | |
CN110874925A (en) | Intelligent road side unit and control method thereof | |
US20180137629A1 (en) | Processing apparatus, imaging apparatus and automatic control system | |
CN109218598B (en) | Camera switching method and device and unmanned aerial vehicle | |
CN111142528B (en) | Method, device and system for sensing dangerous scene for vehicle | |
CN111873989A (en) | Vehicle control method and device | |
EP4207103A1 (en) | Electronic device for detecting rear surface of target vehicle and operating method thereof | |
US20240056694A1 (en) | Imaging device, image processing method, and image processing program | |
KR20220142590A (en) | Electronic device, method, and computer readable storage medium for detection of vehicle appearance | |
CN111546986A (en) | Trailer panoramic looking-around method | |
US20230331235A1 (en) | Systems and methods of collaborative enhanced sensing | |
KR102670773B1 (en) | A camera system for ADAS and driving Assistance System by using the same | |
CN106945605B (en) | Vehicle blind area monitoring system and control method | |
US10354148B2 (en) | Object detection apparatus, vehicle provided with object detection apparatus, and non-transitory recording medium | |
CN114274978B (en) | Obstacle avoidance method for unmanned logistics vehicle | |
CN113614810A (en) | Image processing device, vehicle control device, method, and program | |
US11312393B2 (en) | Artificially falsifying sensor data to initiate a safety action for an autonomous vehicle | |
CN113096427B (en) | Information display method and device | |
WO2025058837A1 (en) | Vehicle control function | |
CN113643321A (en) | Sensor data acquisition method and device for unmanned equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | ||
Application publication date: 20211105 |