
CN112051586A - Interference preventing method for combined work of multiple TOF cameras, TOF camera and electronic equipment

Interference preventing method for combined work of multiple TOF cameras, TOF camera and electronic equipment

Info

Publication number
CN112051586A
CN112051586A
Authority
CN
China
Prior art keywords
random code
tof
random
image data
data acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010972620.7A
Other languages
Chinese (zh)
Other versions
CN112051586B (en)
Inventor
王立民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Weigan Zhitong Technology Co., Ltd.
Original Assignee
Qingdao Weigan Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Weigan Technology Co., Ltd.
Priority to CN202010972620.7A
Publication of CN112051586A
Application granted
Publication of CN112051586B
Active legal status (current)
Anticipated expiration

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 - Systems determining position data of a target
    • G01S 17/08 - Systems determining position data of a target for measuring distance only
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 - Reducing energy consumption in communication networks
    • Y02D 30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Studio Devices (AREA)

Abstract

The embodiments of the present disclosure disclose a method for preventing interference when multiple TOF cameras work together, a TOF camera, an electronic device, and a computer storage medium. The method comprises the following steps: S102: acquiring a random code, wherein the random code is the time delay before a single image data acquisition in the current frame is completed; S104: starting timing of a preset time according to the random code; S106: when the timing of the preset time ends, executing an image data acquisition operation, wherein the image data acquisition operation comprises a laser emission operation and an exposure operation; S108: repeating steps S102 to S106 a predetermined number of times until the image data acquisition of the current frame is completed. The method takes the TOF camera itself as the controlling entity and schedules the cameras at the level of the individual laser pulses within each image frame, so as to reduce the probability of mutual interference when multiple TOF cameras work simultaneously, avoid modifications to the camera hardware, and lower cost.

Description

Interference preventing method for combined work of multiple TOF cameras, TOF camera and electronic equipment
Technical Field
The embodiments of the present disclosure relate to the technical field of camera ranging, and in particular to a method for preventing multiple TOF cameras from interfering with one another when working together, a TOF camera, an electronic device, and a computer storage medium.
Background
TOF is the abbreviation of Time of Flight: the camera emits light that irradiates the surface of an object, the reflected light is received by a sensor, and the distance to the photographed object is calculated from the time difference between emission and reception, producing depth information. Compared with traditional binocular ranging or structured-light ranging, TOF ranging is less affected by ambient light and places no requirements on the surface texture of the object.
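As a brief illustration (a standard time-of-flight relation, not text from the patent): for direct time-of-flight ranging, the distance d to the photographed object follows from the measured delay Δt between emission and reception as

    d = c · Δt / 2

where c is the speed of light and the factor of 2 accounts for the round trip of the light.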
At present, TOF-camera ranging is widely used, and it is common for several TOF cameras to operate in the same working scene; however, when multiple TOF cameras work simultaneously, mutual interference can corrupt the depth images. In the prior art, simultaneous operation of multiple TOF cameras is achieved through network clock synchronization: a network device such as a switch or a router is installed in the system, the TOF cameras communicate with a remote server or a PC, every camera in the system performs clock synchronization, and a master camera typically controls the operation of the other TOF cameras by setting the initial working time point of each slave camera, so that the multiple TOF cameras work in a time-shared manner.
However, the prior art has two problems. First, network cables must be erected between the TOF cameras, or the cameras must be positioned to allow wireless networking, which adds bulk and power consumption; each TOF camera must also be given an Ethernet communication function, which increases cost and makes the technique inapplicable in many scenarios. Second, because the accuracy of network clock synchronization is far lower than the timing precision of a TOF camera's laser sequence, this approach cannot control the camera at the level of individual laser pulses and can only coordinate the timing of multiple TOF cameras at the image-frame level, so it cannot be used in the many scenarios that require image stitching and matching.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide a method for preventing interference when multiple TOF cameras work together, a TOF camera, an electronic device, and a computer-readable storage medium, in which the TOF camera itself acts as the controlling entity and the cameras are scheduled at the level of the individual laser pulses within each image frame, so that the probability of mutual interference when multiple TOF cameras work simultaneously is reduced, no modification of the camera hardware is required, and cost is lowered.
According to a first aspect of embodiments of the present disclosure, there is provided a method for preventing interference when a plurality of TOF cameras work in combination, including the following steps:
S1: acquiring a random code, wherein the random code is the time delay before a single image data acquisition in the current frame is completed;
S2: starting timing of a preset time according to the random code;
S3: when the timing of the preset time ends, executing an image data acquisition operation, wherein the image data acquisition operation comprises a laser emission operation and an exposure operation;
S4: repeating steps S1 to S3 a predetermined number of times until the image data acquisition of the current frame is completed.
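A minimal, self-contained sketch of this per-frame loop is shown below. It is an illustration only, not code from the patent: the function names, the use of nanosleep as a stand-in for the camera's hardware delay timer, the 5 to 40 ns code range, and the count of seven acquisitions per frame (taken from the worked example later in the description) are all assumptions.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define ACQUISITIONS_PER_FRAME 7   /* "predetermined number of times"; 7 as in the worked example below */

/* Stubbed hardware hooks -- the names and the nanosecond units are assumptions for illustration. */
static unsigned read_random_code_register(void) { return 5u + (unsigned)rand() % 36u; }           /* S1 */
static void delay_ns(unsigned ns) { struct timespec t = { 0, (long)ns }; nanosleep(&t, NULL); }    /* S2 */
static void fire_laser_and_expose(int i) { printf("image data acquisition %d\n", i); }             /* S3 */

int main(void)
{
    srand((unsigned)time(NULL));   /* stand-in seed; the patent derives the seed from the camera's unique ID */
    for (int i = 0; i < ACQUISITIONS_PER_FRAME; ++i) {
        unsigned random_code = read_random_code_register(); /* S1: obtain the random delay        */
        delay_ns(random_code);                              /* S2: time out the random code       */
        fire_laser_and_expose(i);                           /* S3: laser emission + exposure      */
    }                                                       /* S4: repeat until the frame is done */
    return 0;
}
```

Each camera runs this loop independently; because the delays come from per-camera random sequences, the acquisitions of different cameras drift apart in time, which is the interference-avoidance effect described below.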
According to a second aspect of embodiments of the present disclosure, there is provided a TOF camera comprising:
the acquisition module is used for acquiring a random code;
the time sequence control module is used for starting timing of preset time according to the random code;
the execution module is used for executing image data acquisition operation once after the timing of preset time is finished, wherein the image data acquisition operation comprises laser emission operation and exposure operation;
and the counting module is used for counting the operation of the execution module.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic apparatus including:
a TOF camera provided by a second aspect of embodiments of the present disclosure; or,
a processor and a memory, the memory configured to store executable instructions for controlling the processor to perform a method for preventing interference in a joint operation of a plurality of TOF cameras according to a first aspect of an embodiment of the disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the multiple TOF camera joint work interference prevention method provided according to the first aspect of embodiments of the present disclosure.
According to the embodiments of the present disclosure, a method for preventing interference when multiple TOF cameras work together is provided, in which each individual TOF camera acts as the controlling entity. When multiple TOF cameras need to work simultaneously, each camera enters a random mode, generates seed data from its own unique identification code, and then generates a set of random codes from that seed. Because the unique identification codes of the TOF cameras differ, the generated seeds differ, and the random code sequences produced from different seeds are therefore different. Before each laser emission operation and exposure operation, a TOF camera delays by its current random code, so that the cameras emit laser light and expose at different moments, which reduces the probability of mutual interference when multiple TOF cameras work simultaneously. Other features of the embodiments of the present disclosure and their advantages will become apparent from the following detailed description of exemplary embodiments, which is to be read in conjunction with the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the embodiments of the disclosure.
Fig. 1 is a block diagram of a hardware configuration of an electronic device that can be used to implement an embodiment of the present disclosure.
Fig. 2 is a flowchart illustrating steps of a method for preventing interference when a plurality of TOF cameras work in combination according to an embodiment of the disclosure.
Fig. 3 is a flowchart illustrating steps of a method for generating random codes according to an embodiment of the disclosure.
Fig. 4 is an application scenario diagram of an embodiment of the present disclosure.
Fig. 5 is a block diagram of a TOF camera according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the embodiments of the present disclosure unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the embodiments of the disclosure, their application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a configuration of hardware of an electronic apparatus 1000 that can implement an embodiment of the present invention.
The electronic device 1000 may be a TOF camera, a laptop, a desktop computer, a cell phone, a tablet computer, etc.
As shown in fig. 1, the configuration of electronic device 1000 includes, but is not limited to, a processor 1031, a memory 1032, an interface device 1033, a communication device 1034, a GPU (Graphics Processing Unit) 1035, a display device 1036, an input device 1037, a speaker 1038, a microphone 1039, and a camera 1030. The processor 1031 includes, but is not limited to, a central processing unit CPU, a microprocessor MCU, and the like. The memory 1032 includes, but is not limited to, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. Interface device 1033 includes, but is not limited to, a USB interface, a serial interface, a parallel interface, and the like. The communication device 1034 is capable of wired or wireless communication, for example, and specifically may include WiFi communication, bluetooth communication, 2G/3G/4G/5G communication, and the like. The GPU 1035 is used to process the image. The display device 1036 may include, but is not limited to, a liquid crystal screen, a touch screen, or the like. Input devices 1037 may include, but are not limited to, a keyboard, a mouse, a touch screen, and the like.
The electronic device shown in fig. 1 is merely illustrative and is in no way intended to limit the disclosure, its application, or uses. In an embodiment of the present disclosure, the memory 1032 of the electronic device 1000 is configured to store instructions for controlling the processor 1031 to perform any one of the methods for preventing interference in the combined operation of multiple TOF cameras provided by the embodiments of the present disclosure. Those skilled in the art will appreciate that although several components of the electronic device 1000 are shown in fig. 1, the disclosure may involve only some of them; for example, the electronic device 1000 may involve only the processor 1031 and the memory 1032. The skilled person can design the instructions according to the disclosed solution. How instructions control the operation of a processor is well known in the art and is not described in detail here.
< method examples >
A TOF camera calculates the distance to a photographed object from the time difference between the emission and reception of light and can generate depth information of the object, so TOF-camera ranging is widely used, and it is common for several TOF cameras to operate in the same working scene; however, when multiple TOF cameras work simultaneously, mutual interference can corrupt the depth images. In the prior art, simultaneous operation of multiple TOF cameras is achieved through network clock synchronization: a network device such as a switch or a router is installed in the system, the TOF cameras communicate with a remote server or a PC, every camera in the system performs clock synchronization, and a master camera typically controls the operation of the other TOF cameras by setting the initial working time point of each slave camera, so that the multiple TOF cameras work in a time-shared manner.
However, in the prior art, network cables must be erected between the TOF cameras, or the cameras must be positioned to allow wireless networking, which adds bulk and power consumption; each TOF camera must also have an Ethernet communication function, which increases cost. Moreover, because the accuracy of network clock synchronization is far lower than the timing precision of a TOF camera's laser sequence, this approach cannot control the camera at the level of individual laser pulses and can only coordinate the timing of multiple TOF cameras at the image-frame level, so it cannot be used in the many scenarios that require image stitching and matching. This creates the need for a new method of preventing interference when multiple TOF cameras work together.
The embodiments of the present disclosure provide a method for preventing interference when multiple TOF cameras work together: each TOF camera enters a random mode, generates seed data from its own unique identification code, and generates a set of random codes from that seed. Because the unique identification codes of the TOF cameras differ, the generated seeds differ, and the random code sequences produced from different seeds are therefore different. Before each laser emission operation and exposure operation, a TOF camera delays by its current random code, so that the cameras emit laser light and expose at different moments, which reduces the probability of mutual interference when multiple TOF cameras work simultaneously.
Fig. 2 is a schematic diagram of a method for preventing interference when multiple TOF cameras work together according to an embodiment of the present disclosure. The method provided by this embodiment is implemented by computer technology and may be carried out by the electronic device shown in fig. 1.
The interference prevention method for the combined work of the plurality of TOF cameras provided by the embodiment comprises the steps S102-S108.
S102: obtaining a random code, wherein the random code is the time delay before a single image data acquisition in the current frame is completed.
Specifically, the random code is the delay applied before each single image data acquisition within a frame, i.e. the interval between successive image data acquisitions within a frame; it is a value representing a time period and may be precise to the nanosecond level.
In a particular embodiment, the random code may be 5, 15, 25, and so on (expressed in nanoseconds in the example described with reference to fig. 4).
Illustratively, as shown in FIG. 3, to generate the random code, the generating step includes S202-S206.
S202, under the condition that the TOF camera is in a random mode, reading a unique identification code of the TOF camera, wherein the unique identification code is used for marking the TOF camera.
Specifically, the TOF camera has multiple working modes; because the method of the embodiments of the present disclosure requires the TOF camera to be in the random working mode, the working mode of the TOF camera may be switched by toggling a key or by reading a signal stored in memory, which is not limited here.
Each TOF camera has a unique identification code, which may be in the form of a string of characters, for marking the TOF camera to distinguish between different TOF cameras.
S204: generating seed data according to the unique identification code.
Specifically, the seed is generated from the unique identification code using the random seed function srand; because the unique identification code of each TOF camera differs from those of the other TOF cameras, each camera generates a different seed.
S206, generating the random codes through a random code generating function based on the seed data, wherein N random codes are provided, and N is more than or equal to 40.
Specifically, the seed obtained in the preceding step is used to generate random codes through the random code generation function rand; the number N of random codes is set according to actual requirements and the size of the device's storage area, and may be set to 40 or more.
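Purely as an illustration of steps S202 to S206, the sketch below uses the C standard library functions srand and rand named above, together with a hypothetical helper that folds the camera's unique identification string into an integer seed; the helper, the pool size of 64, and the example identification string are assumptions, not details from the patent.

```c
#include <stdio.h>
#include <stdlib.h>

#define POOL_SIZE 64   /* N >= 40 random codes; 64 is an illustrative choice, not a figure from the patent */

static unsigned random_code_pool[POOL_SIZE];   /* the pool stored at a memory address on the camera */

/* Hypothetical helper: fold the camera's unique identification string into an integer seed. */
static unsigned seed_from_unique_id(const char *unique_id)
{
    unsigned seed = 5381u;
    for (const char *p = unique_id; *p != '\0'; ++p)
        seed = seed * 33u + (unsigned char)*p;
    return seed;
}

/* S202-S206: read the unique ID, derive the seed with srand, and fill the pool with rand. */
static void build_random_code_pool(const char *unique_id)
{
    srand(seed_from_unique_id(unique_id));        /* different ID -> different seed -> different sequence */
    for (int i = 0; i < POOL_SIZE; ++i)
        random_code_pool[i] = (unsigned)rand();   /* raw codes; interval limiting is applied separately   */
}

int main(void)
{
    build_random_code_pool("TOF-CAM-0001");       /* hypothetical unique identification code */
    printf("first code in pool: %u\n", random_code_pool[0]);
    return 0;
}
```

Interval limiting of the raw codes, described further below, can be applied either when the pool is filled or when a code is read out of it.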
Optionally, the method further includes:
forming a random code pool by the N random codes and storing the random code pool in a memory address;
and reading a random code from the random code resource pool and storing the random code into a random code register so as to obtain the random code.
Specifically, a plurality of random codes form a random code pool, and each TOF camera stores its pool, containing multiple random codes, at a memory address of the camera. When the TOF camera starts the random mode, it reads one random code from the random code pool and temporarily stores it in a random code register; before each image acquisition operation, the camera obtains the random code from this register.
Optionally, after step S106, the method includes:
and re-reading a random code from the random code resource pool and storing the random code into the random code register so as to update the random code register.
Specifically, only one random code is stored in the random code register at a time, so after the TOF camera reads the random code from the register it applies the corresponding delay to the upcoming laser emission operation and exposure operation. Since the acquisition of each frame requires multiple laser emission operations and exposure operations, after each laser emission and exposure is completed the TOF camera reads a new random code from the random code pool and stores it in the random code register, in preparation for the next laser emission operation and exposure operation.
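Continuing the illustrative pool above, the register update might look like the following sketch; modeling the random code register as a single variable and letting the pool index wrap around are both assumptions made for illustration.

```c
static unsigned random_code_register;   /* holds exactly one random code at a time            */
static int      pool_index;             /* position of the next unread code in the pool above */

/* Load the next code from the pool into the register (wrapping around when the pool is exhausted). */
static void refill_random_code_register(void)
{
    random_code_register = random_code_pool[pool_index];
    pool_index = (pool_index + 1) % POOL_SIZE;
}

/* Before each laser emission + exposure: use the current code, then refill for the next one. */
static unsigned next_random_delay(void)
{
    unsigned code = random_code_register;  /* delay to apply before this acquisition      */
    refill_random_code_register();         /* prepare the register for the next acquisition */
    return code;
}
```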
Optionally, the method further includes a step of performing interval limitation on the random code.
Specifically, interval limitation means bounding the random code between a minimum and a maximum value. The minimum value is set so that the random code is longer than the time required to complete one laser emission operation and exposure operation, which prevents the next laser emission and exposure from starting before the previous one has finished. The maximum value is limited so that an excessively large random code does not disturb the acquisition of the current image frame.
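One simple way to realize such an interval limitation, again only as a sketch, is to clamp each raw code into a fixed range when it is generated; the bounds below are illustrative assumptions, not values from the patent.

```c
/* Illustrative bounds, in nanoseconds: both values are assumptions, not figures from the patent. */
#define MIN_RANDOM_CODE_NS  6u   /* longer than one laser emission + exposure operation            */
#define MAX_RANDOM_CODE_NS 40u   /* small enough not to disturb acquisition of the current frame   */

/* Map a raw rand() value into the allowed [MIN, MAX] interval. */
static unsigned limit_random_code(unsigned raw_code)
{
    return MIN_RANDOM_CODE_NS + raw_code % (MAX_RANDOM_CODE_NS - MIN_RANDOM_CODE_NS + 1u);
}
```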
S104: and starting timing of preset time according to the random code.
The random code is a time delay before a laser emission operation and an exposure operation are performed, so that after the TOF camera acquires the random code from the random code register, the TOF camera needs to start timing of time matched with the random code, and corresponding operations are performed after the timing is finished.
S106: and when the timing of the preset time is over, executing image data acquisition operation, wherein the image data acquisition operation comprises laser emission operation and exposure operation.
Specifically, after the time delay is over and the timing is over, the TOF camera can perform a laser emission operation and an exposure operation to perform an image acquisition operation.
S108: the steps S102 to S106 are repeatedly executed a predetermined number of times until the image data acquisition of the current frame is completed.
Specifically, the embodiments of the present disclosure control the operation of the TOF camera at the level of individual laser pulses: completing each frame of image requires multiple laser emission operations and exposure operations, and before each laser emission operation and exposure operation a random code is acquired from the random code register, until the image data acquisition of the current frame is completed. When multiple TOF cameras work simultaneously, each camera operates in the same way.
Optionally, after step S108 the method includes a step of processing and transmitting the image data. Specifically, when the TOF camera has completed the data acquisition of one frame, it outputs that frame of image data; before the image data are output they are processed, which is the ordinary, conventional image-data processing flow and is not described again here.
In an embodiment, a time delay is also set between successive image frames. When the TOF camera starts operating it starts a frame-interval timer; after the image acquisition operations of a frame are completed the camera outputs the image data, and if the frame-interval timer has not yet expired the camera idles until it does, after which it begins the image acquisition of the next frame, again cycling through steps S102 to S106.
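A self-contained sketch of this frame-interval idle wait is shown below; the 33 ms frame interval, the POSIX clock_gettime timing, and the stubbed acquisition and output functions are assumptions for illustration only.

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

#define FRAME_INTERVAL_NS 33000000ull   /* ~30 fps; an illustrative value, not from the patent */

/* Monotonic clock in nanoseconds (POSIX clock_gettime). */
static uint64_t now_ns(void)
{
    struct timespec t;
    clock_gettime(CLOCK_MONOTONIC, &t);
    return (uint64_t)t.tv_sec * 1000000000ull + (uint64_t)t.tv_nsec;
}

/* Stand-ins for the per-frame acquisition loop (steps S102 to S106) and conventional output. */
static void acquire_one_frame(void)        { puts("frame acquired"); }
static void process_and_output_frame(void) { puts("frame output");   }

int main(void)
{
    for (int frame = 0; frame < 3; ++frame) {           /* a few frames for demonstration   */
        uint64_t frame_start = now_ns();                 /* start the frame-interval timer   */
        acquire_one_frame();                             /* laser emissions + exposures      */
        process_and_output_frame();                      /* output the completed frame       */
        while (now_ns() - frame_start < FRAME_INTERVAL_NS)
            ;                                            /* idle wait until the interval ends */
    }
    return 0;
}
```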
Referring to fig. 4, in a specific embodiment three TOF cameras operate simultaneously, and the completion of one image frame by each camera is taken as an example. TOF camera 1, TOF camera 2, and TOF camera 3 each need to perform 7 laser emission operations and exposure operations for the acquisition of the current image frame. The random code acquired by TOF camera 1 from its random code register is 5 nanoseconds, so TOF camera 1 immediately starts a 5-nanosecond timing and completes its first image acquisition operation when the 5 nanoseconds have elapsed. The random code acquired by TOF camera 2 from its random code register is 10 nanoseconds, so TOF camera 2 immediately starts a 10-nanosecond timing and completes its first image acquisition operation when the 10 nanoseconds have elapsed. The random code acquired by TOF camera 3 from its random code register is 15 nanoseconds, so TOF camera 3 immediately starts a 15-nanosecond timing and completes its first image acquisition operation when the 15 nanoseconds have elapsed. By analogy, TOF camera 1 performs 7 delays of 5, 30, 8, 40, 10, 32, and 36 nanoseconds, performs 7 laser emission operations and exposure operations, and completes the acquisition of one frame of image. TOF camera 2 performs 7 delays of 10, 28, 10, 32, 38, 8, and 30 nanoseconds, performs 7 laser emission operations and exposure operations, and completes the acquisition of one frame of image. TOF camera 3 performs 7 delays of 15, 34, 8, 32, 30, 26, and 30 nanoseconds, performs 7 laser emission operations and exposure operations, and completes the acquisition of one frame of image.
In the method for preventing interference when multiple TOF cameras work together provided by the embodiments of the present disclosure, each individual TOF camera acts as the controlling entity. When multiple TOF cameras need to work simultaneously, each camera enters a random mode, generates seed data from its own unique identification code, and then generates a set of random codes from that seed. Because the unique identification codes of the TOF cameras differ, the generated seeds differ, and the random code sequences produced from different seeds are therefore different. Before each laser emission operation and exposure operation, a TOF camera delays by its current random code, so that the cameras emit laser light and expose at different moments, which reduces the probability of mutual interference when multiple TOF cameras work simultaneously.
< apparatus embodiment >
In another embodiment of the present disclosure, a TOF camera is provided, please refer to fig. 5, which is a block diagram of a structure of the TOF camera according to an embodiment of the present disclosure. As shown, TOF camera 300 includes: an acquisition module 301, a timing control module 302, an execution module 303, and a counting module 304.
The acquisition module 301 is configured to acquire a random code;
the timing control module 302 is configured to start timing of a predetermined time according to the random code acquired by the acquisition module 301;
the execution module 303 is configured to execute an image data acquisition operation after timing of a predetermined time is finished, where the image data acquisition operation includes a laser emission operation and an exposure operation;
the counting module 304 is configured to count the operations of the execution module 303.
Optionally, the TOF camera 300 further comprises: a generation module 305 and a storage module 306.
The generating module 305 is configured to generate the random code;
the storage module 306 is configured to store the random code.
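Purely as an illustration of how these modules might map onto firmware, the following sketch groups them into a single structure; the struct layout and the function-pointer style are assumptions, not taken from the patent.

```c
#include <stdint.h>

/* Function-pointer view of the modules of TOF camera 300 (illustrative only). */
typedef struct {
    uint32_t (*obtain_random_code)(void);      /* acquisition module 301                       */
    void     (*start_timing_ns)(uint32_t ns);  /* time sequence control module 302             */
    void     (*acquire_image_data)(void);      /* execution module 303: laser emission + exposure */
    uint32_t acquisition_count;                /* counting module 304                          */
    void     (*generate_random_codes)(void);   /* optional generating module 305               */
    uint32_t *random_code_pool;                /* optional storage module 306                  */
} tof_camera_t;
```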
< computer-readable storage Medium >
Finally, according to yet another embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of preventing interference in a joint operation of a plurality of TOF cameras according to any embodiment of the present disclosure.
Embodiments of the present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement aspects of embodiments of the disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the embodiments of the present disclosure may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA) can be personalized with state information of the computer-readable program instructions and can execute those instructions, thereby implementing aspects of the embodiments of the present disclosure.
Various aspects of embodiments of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are equivalent.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the embodiments of the present disclosure is defined by the appended claims.

Claims (10)

1. A method for preventing interference during combined work of a plurality of TOF cameras is characterized by comprising the following steps:
s102: acquiring a random code, wherein the random code is time delay before single image data acquisition in a current frame is completed;
s104: starting timing of preset time according to the random code;
s106: when the timing of the preset time is over, executing image data acquisition operation, wherein the image data acquisition operation comprises laser emission operation and exposure operation;
s108: the steps S102 to S106 are repeatedly executed a predetermined number of times until the image data acquisition of the current frame is completed.
2. The method of claim 1, wherein, prior to step S102, the method comprises the step of generating the random code:
reading a unique identification code of the TOF camera when the TOF camera is in a random mode, the unique identification code being used to mark the TOF camera;
generating seed data according to the unique identification code;
and generating the random codes through a random code generation function based on the seed data, wherein N random codes are provided, and N is more than or equal to 40.
3. The method of claim 2, wherein prior to step S102, the method further comprises:
forming a random code pool by the N random codes and storing the random code pool in a memory address;
and reading a random code from the random code resource pool and storing the random code into a random code register so as to obtain the random code.
4. The method of claim 3, wherein after step S106, the method comprises:
and re-reading a random code from the random code resource pool and storing the random code into the random code register so as to update the random code register.
5. The method of claim 1, further comprising the step of interval limiting the random code.
6. The method of claim 1, wherein after step S108, the method comprises the step of processing and transmitting the image data.
7. A TOF camera, comprising:
the acquisition module is used for acquiring a random code;
the time sequence control module is used for starting timing of preset time according to the random code;
the execution module is used for executing image data acquisition operation once after the timing of preset time is finished, wherein the image data acquisition operation comprises laser emission operation and exposure operation;
and the counting module is used for counting the operation of the execution module.
8. The TOF camera of claim 7, further comprising:
and the generating module is used for generating the random code.
9. An electronic device, comprising:
a TOF camera according to any one of claims 7-8; or,
a processor and a memory, the memory for storing executable instructions for controlling the processor to perform the method of preventing interference in the joint operation of multiple TOF cameras according to any one of claims 1 to 6.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, implements a method of preventing interference in the joint operation of multiple TOF cameras according to any one of claims 1 to 6.
CN202010972620.7A 2020-09-16 2020-09-16 Multi-TOF camera joint work anti-interference method, TOF camera and electronic equipment Active CN112051586B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010972620.7A CN112051586B (en) 2020-09-16 2020-09-16 Multi-TOF camera joint work anti-interference method, TOF camera and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010972620.7A CN112051586B (en) 2020-09-16 2020-09-16 Multi-TOF camera joint work anti-interference method, TOF camera and electronic equipment

Publications (2)

Publication Number | Publication Date
CN112051586A | 2020-12-08
CN112051586B | 2023-04-28

Family

ID=73604816

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010972620.7A Active CN112051586B (en) 2020-09-16 2020-09-16 Multi-TOF camera joint work anti-interference method, TOF camera and electronic equipment

Country Status (1)

Country Link
CN (1) CN112051586B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2594959A1 (en) * 2011-11-17 2013-05-22 MESA Imaging AG System and method for multi TOF camera operation using phase hopping
US20160003946A1 (en) * 2014-07-03 2016-01-07 Advanced Scientific Concepts, Inc. Ladar sensor for a dense environment
CN106982471A (en) * 2016-05-26 2017-07-25 朱晓庆 Wireless communications method, wireless communication terminal, service end and teaching are registered device
CN108668382A (en) * 2018-05-18 2018-10-16 深圳市华奥通通信技术有限公司 A kind of communication means and communication system of Random Communication time interval
CN109613517A (en) * 2018-12-12 2019-04-12 北醒(北京)光子科技有限公司 A kind of anti-interference working method of TOF Lidar multimachine
CN109863418A (en) * 2016-09-16 2019-06-07 美国亚德诺半导体公司 Interference Handling in Time-of-Flight Depth Sensing
CN109991581A (en) * 2017-11-30 2019-07-09 索尼半导体解决方案公司 Time-of-flight acquisition method and time-of-flight camera
CN110121658A (en) * 2016-12-30 2019-08-13 帕诺森斯有限公司 LIDAR system
CN110636183A (en) * 2019-09-27 2019-12-31 杭州光珀智能科技有限公司 Anti-interference method, system and storage medium
WO2020130855A1 (en) * 2018-12-21 2020-06-25 Chronoptics Limited Time of flight camera data processing system

Also Published As

Publication number Publication date
CN112051586B (en) 2023-04-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20240822
Address after: Room 803, 8th Floor, Building F, Innovation Park Phase II, No.1 Keyuan Weiyi Road, Laoshan District, Qingdao City, Shandong Province, China 266104
Patentee after: Qingdao Weigan Zhitong Technology Co., Ltd.
Country or region after: China
Address before: 266000 floor 18, building 2, Shandong Private Science and Technology Building (Minghui International), No. 39 Shiling Road, Laoshan District, Qingdao, Shandong Province (centralized office area)
Patentee before: Qingdao Weigan Technology Co., Ltd.
Country or region before: China