
CN116151714B - Three-dimensional visual safety production management and control system for harbor yard - Google Patents

Three-dimensional visual safety production management and control system for harbor yard

Info

Publication number
CN116151714B
CN116151714B (application CN202310316828.7A)
Authority
CN
China
Prior art keywords
data
thermal image
dimensional
point cloud
thermal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310316828.7A
Other languages
Chinese (zh)
Other versions
CN116151714A (en)
Inventor
贾璐
张帅
张岩松
董建伟
苏青
杜雨杭
刘志明
万劲松
郭鹏
李子鹤
张文伟
贾然然
李海滨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Binyuan Guoke Qinhuangdao Intelligent Technology Co ltd
Original Assignee
Qinhuangdao Yanda Binyuan Technology Development Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qinhuangdao Yanda Binyuan Technology Development Co ltd filed Critical Qinhuangdao Yanda Binyuan Technology Development Co ltd
Priority to CN202310316828.7A
Publication of CN116151714A
Application granted
Publication of CN116151714B
Legal status: Active (current)
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/083Shipping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/08Indexing scheme for image data processing or generation, in general involving all processing steps from image acquisition to 3D model generation

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Human Resources & Organizations (AREA)
  • General Business, Economics & Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a three-dimensional visual safety production management and control system for a harbor yard, comprising a thermal image acquisition and splicing module, a point cloud three-dimensional data processing module, a thermal image material-pile three-dimensional data processing module and a three-dimensional visual display module. Thermal imaging is used as the temperature-monitoring mode for coal-pile detection: the full-field, multi-angle thermal temperature maps are converted and spliced by the thermal image processing module. A laser radar scans the material piles to obtain pile point cloud data, which are sent to the three-dimensional pile generation module to build a three-dimensional pile model. The three-dimensional thermal pile processing module then fuses the spliced temperature data with the three-dimensional pile model, applies the layered thermal data to the three-dimensional pile, and finally sends the result to the visualization module of the safety control system for three-dimensional display.

Description

Three-dimensional visual safety production management and control system for harbor yard
Technical Field
The invention relates to the technical field of safety production at ports and wharves, and in particular to a three-dimensional visual safety production management and control system for a port yard.
Background
The open-air stock yards of ports and wharves cover a huge area, and spontaneous combustion of coal piles is the most frequent safety accident in a coal storage yard. Building fully enclosed coal bunkers would impose a heavy financial burden, so choosing a suitable coal-pile temperature monitoring method is vital for safe production. Current coal-pile temperature monitoring mostly relies on thermal resistors, temperature sensors and infrared thermal imagers. The thermal-resistor approach monitors the pile temperature by inserting thermal resistors or thermocouples into the coal pile; it involves a large amount of engineering work, requires many monitoring points, is labour-intensive, and the readings are not truly real-time. The temperature-sensor approach installs wireless sensors above the coal pile, which requires a large number of sensors and produces no image data. The infrared thermal imager must be operated by hand, exposing operators to the harmful gases released by coal oxidation. Moreover, these devices are generally aimed at single stockpiles or single strip yards, work independently of one another, cannot form a heat picture of all stockpiles in the whole yard, and mostly present their results as text reports, which makes it inconvenient to observe and grasp the overall pile temperature.
Accordingly, the prior art has drawbacks and needs improvement.
Disclosure of Invention
Aiming at the defects of the prior art, the technical problem to be solved by the invention is to provide a three-dimensional visual safety production management and control system for a harbor yard.
The technical scheme of the invention is as follows:
a three-dimensional visual safety production management and control system facing a port yard comprises the following parts: the system comprises a thermal image acquisition and splicing module, a point cloud three-dimensional data processing module, a thermal image material pile three-dimensional data processing module and a three-dimensional visual display module; the thermal image acquisition and splicing module is used for acquiring full-field thermal image data and carrying out splicing treatment on the thermal image; the point cloud three-dimensional data processing module is used for scanning and acquiring original point cloud data of the material pile and preprocessing the original point cloud data; the thermal image material pile three-dimensional data processing module is used for fusing the acquired and spliced thermal image data with the data acquired and processed by the three-dimensional laser scanning radar; the three-dimensional visual display module displays all-field material pile changes, stacker-reclaimer movements, belt conveyor work and work reports in real time in a live-action three-dimensional model display mode.
The thermal image acquisition and splicing module comprises a thermal imaging camera, a communication module and a thermal image splicing processing module. The real-time temperature data acquired by the thermal imaging cameras are sent through the communication module to the thermal image splicing processing module, which fuses and splices the thermal image data acquired at the same moment to generate thermal data for every angle of each material pile in the whole field.
In the three-dimensional visual safety production management and control system, the number of thermal imaging cameras to be installed and their installation spacing are calculated from the length of the storage yard, subject to the constraint that the irradiation ranges of adjacent cameras overlap. The calculation is as follows. According to the site conditions of the storage yard, the focal length of the thermal imaging cameras is determined mainly by the positions where they can be installed, and all cameras are fixed to the same focal length. The angle of view follows from tan(ω/2) = ac/s, where s is the object distance, ac is the radius of the lens's visual range and ω is the angle of view. The camera aspect ratio is set to 4:3 and the imaging size is taken from the thermal camera's CMOS sensor: the CMOS imaging size is w × h, where w is the width of the CMOS target area and h is its height. The horizontal and vertical angles of view are calculated as α = 2·arctan(w (or h) / (2 × focal length)); from the horizontal angle of view, the minimum distance at which the camera can see the complete scene in the horizontal direction is calculated as d_hmin = (w/2)·tan(α/2), and the minimum height visible in the vertical direction is calculated in the same way. The installation spacing of the thermal imaging cameras equals the minimum visible observation distance in the horizontal direction, and the number of cameras to be installed is obtained by dividing the width of the strip yard by this minimum distance.
In the three-dimensional visual safety production management and control system, the thermal image splicing process joins the thermal image data acquired by individual thermal imaging cameras into thermal image data for the whole field. Because the cameras are laid out at the minimum horizontal observation distance, an overlapping thermal image region is produced between every pair of adjacent cameras. This overlap is eliminated by cutting a fixed region out of each individual thermal image, the fixed region being the portion of the overlap between the two cameras within the minimum observation distance; the cut data are then spliced according to their position information to form the thermal image data of a single storage yard.
In the three-dimensional visual safety production management and control system, the collected thermal images are sorted by storage yard and position information: the thermal image data are first grouped by storage yard number and then ordered by position. Each pair of adjacent thermal images is segmented at the pixel level; as shown in fig. 2, Patch denotes a thermal image, Overlap denotes the overlapping region, and Cut denotes the segmentation line, with the pixels on the left of the segmentation line contributed by A and the pixels on the right contributed by B. The overlapping region is formed from the right-hand 1/5 of the width of thermal image A and the left-hand 1/5 of the width of thermal image B. Let s and t be two adjacent output pixels in the overlapping region; each output pixel may come from either A or B, where A(s) and B(s) denote the colour of point s in images A and B, and A(t) and B(t) denote the colour of point t in images A and B. The matching cost of the pair (s, t) is therefore defined as:
M(s, t, A, B) = ||A(s) − B(s)|| + ||A(t) − B(t)||
The seam is found by minimizing the sum Σ M(s, t, A, B) over all pixel pairs (s, t) lying on the cut; once the seam is found, pixels to its left are copied from A and pixels to its right from B. The patches are spliced pairwise in sequence until the thermal image of the whole field is obtained.
In the three-dimensional visual safety production management and control system, the point cloud three-dimensional data processing module obtains the original point cloud data of the material piles by three-dimensional laser radar scanning. The three-dimensional laser radar is mounted on the cantilever of a stacker-reclaimer or on the roof of the storage shed and scans the piles in real time; the data are then sent via the communication PLC to a point cloud processing program, which standardizes them.
In the three-dimensional visual safety production management and control system, the volume of the original point cloud is huge and the scan captures every object within the laser radar's range, so further processing is required: data that do not match the characteristics of a material-pile point cloud, as well as abnormal point cloud data, are screened out. An abnormal point is characterized by lying much farther from other points than a normal point does. The processing is as follows: points that lie too far from their neighbours are deleted. For a point p_i in the point cloud C, let p_ik be its k-th nearest point; the distance between the two points is expressed as
m_ik = ||p_i − p_ik||
The average of the distances between p_i and its K nearest points is then calculated and used to measure how far a point lies from its neighbours:
d_i = (1/K) Σ_{k=1}^{K} m_ik
For the whole point cloud C, the mean μ and variance σ² of all the d_i are calculated, μ = (1/N) Σ_{i=1}^{N} d_i and σ² = (1/N) Σ_{i=1}^{N} (d_i − μ)², where N is the number of points in C; points whose d_i deviates too far from μ, relative to σ, are removed as outliers.
If abnormal point cloud data appear in the coal pile, the point cloud is corrected by interpolation after the outliers are removed; the correction uses the scan data of the adjacent radars that observed the same region, registering the point clouds by means of the targets of the different scanning radars. The processed material-pile point cloud is still very large, and generating the three-dimensional coal-pile model does not require data at that scale, so the point cloud is thinned and simplified by a data processing program as follows: the average spacing between points of a normal scan of a single stockpile is calculated in advance, 1/10 of this average spacing is taken as a threshold, and while all the point cloud data are processed, any two points closer than the threshold are merged into one point whose spatial position is the midpoint of the two original points.
In the three-dimensional visual safety production management and control system, the thermal image material-pile three-dimensional data processing module extracts the pile-width points represented in the point cloud data; the ratio of the position coordinate of each new point obtained after the abnormal-point-cloud threshold processing to the total length of the yard gives the proportional position of that point within the whole yard. The width proportion of the segmented and fused thermal image data of the yard is then extracted; from the ratio of the pixel length of the current thermal image to the actual length of the yard, the number of pixels corresponding to one metre of yard is calculated; the number of points contained within one metre after threshold processing is extracted; the relationship between the number of points and the number of pixels is determined; and the pixel data are distributed uniformly over the point cloud, taking the number of points as the standard. Point cloud data carrying thermal image data are thus generated.
With the coal-pile detection method and system described above, thermal imaging is used as the temperature-monitoring mode: the full-field, multi-angle thermal temperature maps are converted and spliced by the thermal image processing module. A laser radar scans the material piles to obtain pile point cloud data, which are sent to the three-dimensional pile generation module to build a three-dimensional pile model. The three-dimensional thermal pile processing module then fuses the spliced temperature data with the three-dimensional pile model, applies the layered thermal data to the three-dimensional pile, and finally sends the result to the visualization module of the safety control system for three-dimensional display.
Drawings
FIG. 1 is a system frame diagram of the present invention;
FIG. 2 is a schematic diagram of a method for stitching thermal images;
Detailed Description
The present invention will be described in detail with reference to specific examples.
Referring to fig. 1, a three-dimensional visualized safety production management and control system for a harbor yard comprises the following parts: the system comprises a thermal image acquisition and splicing module, a point cloud three-dimensional data processing module, a thermal image material pile three-dimensional data processing module and a three-dimensional visual display module;
(1) Thermal image acquisition and splicing module
The thermal image acquisition and splicing module comprises a thermal imaging camera array, a communication module and a thermal image splicing processing module. The thermal imaging cameras are installed according to the yard/strip/stack/pile layout so that the material piles of the whole field are covered. The real-time temperature data acquired by the cameras are sent through the communication module to the thermal image splicing processing module, which fuses and splices the thermal image data acquired at the same moment to generate thermal data for every angle of each material pile in the whole field.
(1.1) thermal imaging camera acquiring full field thermal image data
Acquiring full-field thermal image data requires installing a thermal imaging camera array around the field area to form a closed imaging loop. The number of cameras to be installed and their spacing are calculated from the length of the storage yard, subject to the constraint that the irradiation ranges of adjacent cameras overlap. The calculation is as follows. According to the site conditions of the storage yard, the focal length of the thermal imaging cameras is determined mainly by the positions where they can be installed, and all cameras are fixed to the same focal length. The angle of view follows from
tan(ω/2) = ac/s
where s is the object distance, ac is the radius of the lens's visual range and ω is the angle of view. The camera aspect ratio is set to 4:3 and the imaging size is taken from the thermal camera's CMOS sensor: the CMOS imaging size is w × h, where w is the width of the CMOS target area and h is its height.
The horizontal and vertical angles of view are calculated as α = 2·arctan(w (or h) / (2 × focal length)); from the horizontal angle of view, the minimum distance at which the camera can see the complete scene in the horizontal direction is calculated as d_hmin = (w/2)·tan(α/2), and the minimum height visible in the vertical direction is calculated in the same way. The installation spacing of the thermal imaging cameras equals the minimum visible observation distance in the horizontal direction, and the number of cameras to be installed is obtained by dividing the width of the strip yard by this minimum distance. The cameras can be mounted on the roof of the storage shed or on tall pole brackets, and can share a mount with other equipment such as alarm devices. The communication module collects, at a fixed frequency, the thermal image data generated by all cameras at the same moment together with each camera's electronic tag, which contains a fixed number, position information, time and so on. The acquisition time is attached as a label and the data are sent to the thermal image splicing processing module.
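The layout calculation above can be expressed as a short numerical sketch. The Python below is only an illustration and is not part of the patent: the sensor size, focal length, working distance and yard length are invented example values, and spacing the cameras by one horizontal footprint is one plausible reading of the spacing rule.

```python
import math

def angle_of_view(sensor_dim_mm: float, focal_length_mm: float) -> float:
    """alpha = 2 * arctan(d / (2 * f)) for one sensor dimension, in radians."""
    return 2.0 * math.atan(sensor_dim_mm / (2.0 * focal_length_mm))

def footprint(distance_m: float, fov_rad: float) -> float:
    """Width of scene covered at a given object distance: 2 * d * tan(alpha / 2)."""
    return 2.0 * distance_m * math.tan(fov_rad / 2.0)

# Example 4:3 thermal CMOS (w x h, millimetres) and a fixed focal length.
w_mm, h_mm, f_mm = 10.88, 8.16, 25.0
alpha_h = angle_of_view(w_mm, f_mm)          # horizontal angle of view
alpha_v = angle_of_view(h_mm, f_mm)          # vertical angle of view

object_distance_m = 60.0                     # assumed mounting-to-pile distance
horizontal_cover_m = footprint(object_distance_m, alpha_h)

# Space the cameras so adjacent horizontal footprints just meet, then divide
# the yard length by that spacing to estimate how many cameras are required.
yard_length_m = 600.0
camera_count = math.ceil(yard_length_m / horizontal_cover_m)

print(f"horizontal FOV: {math.degrees(alpha_h):.1f} deg, "
      f"vertical FOV: {math.degrees(alpha_v):.1f} deg")
print(f"coverage at {object_distance_m:.0f} m: {horizontal_cover_m:.1f} m")
print(f"estimated cameras for a {yard_length_m:.0f} m yard: {camera_count}")
```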
(1.2) the thermal image splicing processing module splices the thermal image
The thermal image splicing process joins the thermal image data acquired by individual thermal imaging cameras into thermal image data for the whole field. Because the cameras are laid out at the minimum horizontal observation distance, an overlapping thermal image region is produced between every pair of adjacent cameras. This overlap is eliminated by cutting a fixed region out of each individual thermal image, the fixed region being the portion of the overlap between the two cameras within the minimum observation distance. The cut data are then spliced according to their position information to form the thermal image data of a single storage yard.
The specific scheme is as follows. The collected thermal images are sorted by storage yard and position information: the thermal image data are first grouped by storage yard number and then ordered by position. Each pair of adjacent thermal images is segmented at the pixel level; as shown in fig. 2, Patch denotes a thermal image, Overlap denotes the overlapping region, and Cut denotes the segmentation line, with the pixels on the left of the segmentation line contributed by A and the pixels on the right contributed by B. The overlapping region is formed from the right-hand 1/5 of the width of thermal image A and the left-hand 1/5 of the width of thermal image B. Let s and t be two adjacent output pixels in the overlapping region; each output pixel may come from either A or B, where A(s) and B(s) denote the colour of point s in images A and B, and A(t) and B(t) denote the colour of point t in images A and B. The matching cost of the pair (s, t) is therefore defined as:
M(s, t, A, B) = ||A(s) − B(s)|| + ||A(t) − B(t)||
The seam is found by minimizing the sum Σ M(s, t, A, B) over all pixel pairs (s, t) lying on the cut; once the seam is found, pixels to its left are copied from A and pixels to its right from B. The patches are spliced pairwise in sequence until the thermal image of the whole field is obtained. Conventional machine vision produces larger errors when processing objects such as material piles whose feature points are not obvious; processing the thermal images with this algorithm allows the edges of the material piles to be detected accurately, providing reliable data support for the safety production management and control system.
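As an illustration of the seam search, the sketch below finds a low-cost vertical cut through the overlap of two grayscale thermal patches using the pairwise cost defined above and a simple dynamic program. The function names, the one-column-per-row seam constraint and the use of NumPy are assumptions of this sketch, not details given in the patent.

```python
import numpy as np

def seam_costs(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Cost of cutting between columns j and j+1 in each row of the overlap:
    M(s, t) = |A(s) - B(s)| + |A(t) - B(t)| for the pixel pair straddling the cut."""
    diff = np.abs(A.astype(float) - B.astype(float))
    return diff[:, :-1] + diff[:, 1:]           # shape (rows, cols - 1)

def best_vertical_seam(cost: np.ndarray) -> np.ndarray:
    """Dynamic-programming seam whose column shifts by at most one per row."""
    rows, cols = cost.shape
    acc = cost.copy()
    for i in range(1, rows):
        left = np.r_[np.inf, acc[i - 1, :-1]]
        right = np.r_[acc[i - 1, 1:], np.inf]
        acc[i] += np.minimum(np.minimum(left, acc[i - 1]), right)
    seam = np.zeros(rows, dtype=int)
    seam[-1] = int(np.argmin(acc[-1]))
    for i in range(rows - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(0, j - 1), min(cols - 1, j + 1)
        seam[i] = lo + int(np.argmin(acc[i, lo:hi + 1]))
    return seam

def blend_overlap(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Copy pixels left of the seam from patch A and pixels right of it from patch B."""
    seam = best_vertical_seam(seam_costs(A, B))
    out = B.copy()
    for i, j in enumerate(seam):
        out[i, :j + 1] = A[i, :j + 1]
    return out
```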
(2) Point cloud three-dimensional data processing module
(2.1) scanning and acquiring original point cloud data of a material pile by utilizing three-dimensional laser radar
The three-dimensional laser radar is generally mounted on the cantilever of a stacker-reclaimer or on the roof of the storage shed, and the original point cloud data of the material piles are acquired in real time by three-dimensional laser radar scanning. The data are sent via the communication PLC to a point cloud processing program, which standardizes them.
(2.2) processing the original Point cloud data
The volume of the original point cloud is huge and the scan captures every object within the laser radar's range, so data that do not match the characteristics of a material-pile point cloud, as well as abnormal point cloud data, are screened out. An abnormal point is characterized by lying much farther from other points than a normal point does. The processing is as follows: points that lie too far from their neighbours are deleted. For a point p_i in the point cloud C, let p_ik be its k-th nearest point; the distance between the two points is expressed as
m_ik = ||p_i − p_ik||
The average of the distances between p_i and its K nearest points is then calculated and used to measure how far a point lies from its neighbours:
d_i = (1/K) Σ_{k=1}^{K} m_ik
For the whole point cloud C, the mean μ and variance σ² of all the d_i are calculated, μ = (1/N) Σ_{i=1}^{N} d_i and σ² = (1/N) Σ_{i=1}^{N} (d_i − μ)², where N is the number of points in C; points whose d_i deviates too far from μ, relative to σ, are removed as outliers.
If abnormal point cloud data appear in the coal pile, the point cloud is corrected by interpolation after the outliers are removed; the correction uses the scan data of the adjacent radars that observed the abnormal region, registering the point clouds by means of the targets of the different scanning radars. The processed material-pile point cloud is still very large, and generating the three-dimensional coal-pile model does not require data at that scale, so the point cloud is thinned and simplified by a data processing program as follows: the average spacing between points of a normal scan of a single stockpile is calculated in advance, 1/10 of this average spacing is taken as a threshold, and while all the point cloud data are processed, any two points closer than the threshold are merged into one point whose spatial position is the midpoint of the two original points. Finally, the data format is changed into the format required by the three-dimensional data processing module.
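The outlier screening and midpoint thinning described above can be sketched as follows. This is an illustrative implementation rather than the patent's program: the neighbourhood size k, the n_sigma cut-off and the use of SciPy's k-d tree are assumptions, while the 1/10-of-average-spacing threshold and the midpoint merging follow the text.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_statistical_outliers(points: np.ndarray, k: int = 8,
                                n_sigma: float = 2.0) -> np.ndarray:
    """Keep points whose mean distance to their k nearest neighbours stays
    within n_sigma standard deviations of the cloud-wide mean distance."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)     # column 0 is the point itself
    d = dists[:, 1:].mean(axis=1)              # d_i: mean distance to the k neighbours
    mu, sigma = d.mean(), d.std()
    return points[np.abs(d - mu) <= n_sigma * sigma]

def thin_by_midpoint(points: np.ndarray, threshold: float) -> np.ndarray:
    """Single pass of the merging rule: pairs closer than the threshold are
    replaced by their midpoint; all other points are kept unchanged."""
    tree = cKDTree(points)
    available = np.ones(len(points), dtype=bool)
    merged = []
    for i, j in sorted(tree.query_pairs(r=threshold)):
        if available[i] and available[j]:
            merged.append((points[i] + points[j]) / 2.0)
            available[i] = available[j] = False
    parts = [points[available]] + ([np.asarray(merged)] if merged else [])
    return np.vstack(parts)

# Example usage with a random cloud standing in for a real pile scan:
if __name__ == "__main__":
    cloud = np.random.rand(5000, 3) * [100.0, 30.0, 10.0]
    cloud = remove_statistical_outliers(cloud)
    mean_spacing = cKDTree(cloud).query(cloud, k=2)[0][:, 1].mean()
    cloud = thin_by_midpoint(cloud, threshold=mean_spacing / 10.0)
```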
The update frequency of this module must match the thermal image acquisition frequency, so that the three-dimensional point cloud data of the whole pile are acquired in real time and, after standardized screening, provide data support for the three-dimensional pile model.
(3) Thermal imaging material pile three-dimensional data processing module
The thermal image material-pile three-dimensional data processing fuses the acquired and spliced thermal image data with the data acquired and processed by the three-dimensional laser scanning radar.
The pile-width points represented in the point cloud data are extracted, and the ratio of the position coordinate of each new point obtained after the abnormal-point-cloud threshold processing to the total length of the yard gives the proportional position of that point within the whole yard. The width proportion of the segmented and fused thermal image data of the yard is then extracted; from the ratio of the pixel length of the current thermal image to the actual length of the yard, the number of pixels corresponding to one metre of yard is calculated; the number of points contained within one metre after threshold processing is extracted; the relationship between the number of points and the number of pixels is determined; and the pixel data are distributed uniformly over the point cloud, taking the number of points as the standard. Point cloud data carrying thermal image data are thus generated.
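A minimal sketch of the pixel-to-point assignment follows. It assumes the yard's long axis is the x coordinate of the point cloud and that the stitched thermal data have been reduced to a one-dimensional strip along that axis; both assumptions, and the function name, are illustrative rather than taken from the patent.

```python
import numpy as np

def attach_temperatures(points: np.ndarray, thermal_strip: np.ndarray,
                        yard_length_m: float) -> np.ndarray:
    """Attach a temperature value to every point of the pile point cloud.

    points        -- (N, 3) array; the yard's long axis is assumed to be x, starting at 0
    thermal_strip -- 1-D array of stitched thermal values spanning the whole yard length
    returns       -- (N, 4) array of x, y, z, temperature
    """
    # Pixels corresponding to one metre of yard, as described above.
    pixels_per_metre = len(thermal_strip) / yard_length_m
    # Proportional position of each point along the yard mapped to a pixel column.
    cols = np.clip((points[:, 0] * pixels_per_metre).astype(int),
                   0, len(thermal_strip) - 1)
    return np.column_stack([points, thermal_strip[cols]])
```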
(4) Three-dimensional visual display module
The three-dimensional visual display module displays the full-field material-pile changes, stacker-reclaimer movements, belt conveyor operation, work reports and so on in real time as a live-action three-dimensional model. To display the pile changes, the data must first be obtained by a communication program, which, after receiving the thermal three-dimensional point cloud data, converts them into a data format the visualization program can use. The visualization program reads the converted data, first reading information such as the position numbers to determine the actual location of each pile in the storage yard, then reading the point cloud portion and generating a three-dimensional pile model in the visualization window in real time, and finally reading the associated thermal data, converting them into a thermal pattern and draping it over the surface of the pile model, so that a three-dimensional pile model with thermal display is formed. The display can be switched freely between the original view, the thermal view and the height-layered view as required.
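To make the display step concrete, the sketch below renders the fused point cloud with a temperature colour ramp using the Open3D library; the library choice, the colour mapping and the function name are assumptions for illustration only, since the patent does not name a rendering tool.

```python
import numpy as np
import open3d as o3d

def show_thermal_cloud(points_xyzt: np.ndarray) -> None:
    """Display a pile point cloud coloured by temperature (columns: x, y, z, T)."""
    xyz = points_xyzt[:, :3]
    temp = points_xyzt[:, 3]
    # Normalise temperature to [0, 1] and map it onto a simple blue-to-red ramp.
    t = (temp - temp.min()) / max(temp.max() - temp.min(), 1e-6)
    colors = np.column_stack([t, np.zeros_like(t), 1.0 - t])
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(xyz)
    cloud.colors = o3d.utility.Vector3dVector(colors)
    o3d.visualization.draw_geometries([cloud])
```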
Three-dimensional models of the stacker-reclaimers, belt conveyors and the yard scene are built in advance from the real objects on site and displayed in the visualization program. During operation, the operating data in the yard operation database are read in real time, and the motion of the construction machinery in the real yard is simulated in real time in the visualization program. The operation of the whole yard can be observed from different viewing angles, an individual yard, pile or machine can be selected for closer observation, and real-time machinery operating data, fault information and the like are shown in the visualization interface together with data reports, achieving a comprehensive display of global operating information and operating data.
It will be understood that modifications and variations will be apparent to those skilled in the art from the foregoing description, and it is intended that all such modifications and variations be included within the scope of the following claims.

Claims (6)

1. A three-dimensional visual safety production management and control system for a harbor yard, characterized by comprising the following parts: a thermal image acquisition and splicing module, a point cloud three-dimensional data processing module, a thermal image material-pile three-dimensional data processing module and a three-dimensional visual display module; the thermal image acquisition and splicing module is used for acquiring full-field thermal image data and splicing the thermal images; the point cloud three-dimensional data processing module is used for scanning and acquiring original point cloud data of the material piles and preprocessing the original point cloud data; the thermal image material-pile three-dimensional data processing module is used for fusing the acquired and spliced thermal image data with the data acquired and processed by the three-dimensional laser scanning radar; the three-dimensional visual display module displays the full-field material-pile changes, stacker-reclaimer movements, belt conveyor operation and work reports in real time as a live-action three-dimensional model; the thermal image acquisition and splicing module comprises a thermal imaging camera, a communication module and a thermal image splicing processing module; the real-time temperature data acquired by the thermal imaging cameras are sent through the communication module to the thermal image splicing processing module, which fuses and splices the thermal image data acquired at the same moment to generate thermal image data for every angle of each material pile in the whole field; according to the length of the storage yard, the number of thermal imaging cameras to be installed and their installation spacing are calculated on the premise that the irradiation ranges of adjacent cameras overlap, the calculation being as follows: the focal length of the thermal imaging cameras is determined according to the positions where they can be installed in the storage yard, all cameras are fixed to the same focal length, and the angle of view satisfies tan(ω/2) = ac/s, where s is the object distance, ac is the radius of the lens's visual range and ω is the angle of view; the camera aspect ratio is set to 4:3 and the imaging size is taken from the thermal camera's CMOS sensor, the CMOS imaging size being w × h, where w is the width of the CMOS target area and h is its height; the horizontal and vertical angles of view are calculated as α = 2·arctan(w (or h) / (2 × focal length)); from the horizontal angle of view, the minimum distance at which the camera can see the complete scene in the horizontal direction is calculated as d_hmin = (w/2)·tan(α/2), and the minimum height visible in the vertical direction is calculated in the same way; the installation spacing of the thermal imaging cameras is the minimum visible observation distance in the horizontal direction, and the number of cameras to be installed is obtained by dividing the width of the storage yard by this minimum distance.
2. The three-dimensional visual safety production management and control system according to claim 1, wherein the thermal image splicing process joins the thermal image data acquired by individual thermal imaging cameras into thermal image data for the whole field; the thermal imaging cameras are laid out at the minimum horizontal observation distance, so an overlapping thermal image region is produced between every pair of adjacent cameras; the overlap is removed by cutting a fixed region out of each individual thermal image, the data of the fixed region being the portion of the overlap between the two cameras within the minimum observation distance; and the cut data are spliced according to their position information to form the thermal image data of a single storage yard.
3. The three-dimensional visual safety production management and control system according to claim 2, wherein the collected thermal images are sorted by storage yard and position information, the thermal image data being grouped by storage yard number and then ordered by position; each pair of adjacent thermal images is segmented at the pixel level, wherein Patch denotes a thermal image, Overlap denotes the overlapping region and Cut denotes the segmentation line, the pixels on the left of the segmentation line being contributed by A and the pixels on the right by B; the overlapping region is formed from the right-hand 1/5 of the width of thermal image A and the left-hand 1/5 of the width of thermal image B; assuming that s and t are two adjacent output pixels in the overlapping region, each output pixel may come from either A or B, where A(s) and B(s) denote the colour of point s in thermal images A and B, and A(t) and B(t) denote the colour of point t in thermal images A and B; the matching cost of the pair (s, t) is therefore defined as:
M(s, t, A, B) = ||A(s) − B(s)|| + ||A(t) − B(t)||
the seam is found by minimizing the sum Σ M(s, t, A, B) over all pixel pairs (s, t) lying on the cut; after the seam is found, pixels to its left are copied from A and pixels to its right from B; and the patches are spliced pairwise, in the order of the thermal images, into the thermal image of the whole field.
4. The three-dimensional visual safety production management and control system according to claim 1, wherein, in the point cloud three-dimensional data processing module, the original point cloud data of the material piles are obtained by three-dimensional laser radar scanning, the three-dimensional laser radar being mounted on the cantilever of a stacker-reclaimer or on the roof of the storage shed and scanning in real time; and the data are sent via the communication PLC to a point cloud processing program, which standardizes the data.
5. The three-dimensional visual safety production management and control system according to claim 4, wherein the volume of the original point cloud is huge and the scanned data cover every object within the laser radar's range, so data that do not match the characteristics of a material-pile point cloud, as well as abnormal point cloud data, are screened out, an abnormal point being characterized by lying much farther from other points than a normal point does, the processing being as follows: for a point p_i in the point cloud C, let p_ik be its k-th nearest point; the distance between the two points is expressed as
m_ik = ||p_i − p_ik||
the average of the distances between p_i and its K nearest points is then calculated and used to measure how far a point lies from its neighbours:
d_i = (1/K) Σ_{k=1}^{K} m_ik
for the whole point cloud C, the mean μ and variance σ² of all the d_i are calculated, where N represents the number of points in C;
if abnormal point cloud data appear in a material pile, the point cloud data are corrected by interpolation after removal, the correction using the scan data of radars adjacent to the abnormal point cloud data and registering the point clouds by means of the targets of the different scanning radars; the point cloud data are huge and generating the three-dimensional pile model does not require data at that scale, so the data are thinned and simplified by a data processing program as follows: the average spacing between points of a normal scan of a single stockpile is calculated in advance, 1/10 of this average spacing is taken as a threshold, and while all the point cloud data are processed, any two points closer than the threshold are merged into one point whose spatial position is the midpoint of the two original points.
6. The three-dimensional visual safety production management and control system according to claim 1, wherein the thermal image material-pile three-dimensional data processing module extracts the pile-width points represented in the point cloud data and calculates the proportional position of the current point within the whole yard from the ratio of the position coordinate of the new point obtained after the abnormal-point-cloud threshold processing to the total length of the yard; the width proportion of the segmented and fused thermal image data of the yard is then extracted, the number of pixels corresponding to one metre of yard is calculated from the ratio of the pixel length of the current thermal image to the actual length of the yard, the number of points contained within one metre after threshold processing is extracted, the relationship between the number of points and the number of pixels is determined, and the pixel data are distributed uniformly over the point cloud data, taking the number of points as the standard; point cloud data carrying thermal image data are thus generated.
CN202310316828.7A 2023-03-29 2023-03-29 Three-dimensional visual safety production management and control system for harbor yard Active CN116151714B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310316828.7A CN116151714B (en) 2023-03-29 2023-03-29 Three-dimensional visual safety production management and control system for harbor yard

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310316828.7A CN116151714B (en) 2023-03-29 2023-03-29 Three-dimensional visual safety production management and control system for harbor yard

Publications (2)

Publication Number Publication Date
CN116151714A CN116151714A (en) 2023-05-23
CN116151714B true CN116151714B (en) 2023-09-19

Family

ID=86352625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310316828.7A Active CN116151714B (en) 2023-03-29 2023-03-29 Three-dimensional visual safety production management and control system for harbor yard

Country Status (1)

Country Link
CN (1) CN116151714B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117011309B (en) * 2023-09-28 2023-12-26 济宁港航梁山港有限公司 Automatic coal-coiling system based on artificial intelligence and depth data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980517A (en) * 2012-11-15 2013-03-20 天津市亚安科技股份有限公司 Monitoring measurement method
CN104111035A (en) * 2014-07-03 2014-10-22 华北电力大学(保定) Digital coal stocktaking system and digital coal stocktaking method
CN108648272A (en) * 2018-04-28 2018-10-12 上海激点信息科技有限公司 Three-dimensional live acquires modeling method, readable storage medium storing program for executing and device
CN115564316A (en) * 2022-11-29 2023-01-03 武汉云卓环保工程有限公司 Digital coal yard system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10151839B2 (en) * 2012-06-01 2018-12-11 Agerpoint, Inc. Systems and methods for determining crop yields with high resolution geo-referenced sensors

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102980517A (en) * 2012-11-15 2013-03-20 天津市亚安科技股份有限公司 Monitoring measurement method
CN104111035A (en) * 2014-07-03 2014-10-22 华北电力大学(保定) Digital coal stocktaking system and digital coal stocktaking method
CN108648272A (en) * 2018-04-28 2018-10-12 上海激点信息科技有限公司 Three-dimensional live acquires modeling method, readable storage medium storing program for executing and device
CN115564316A (en) * 2022-11-29 2023-01-03 武汉云卓环保工程有限公司 Digital coal yard system

Also Published As

Publication number Publication date
CN116151714A (en) 2023-05-23

Similar Documents

Publication Publication Date Title
CN115597659B (en) Intelligent safety management and control method for transformer substation
EP3138754B1 (en) Rail track asset survey system
US9292922B2 (en) Point cloud assisted photogrammetric rendering method and apparatus
EP2033747A2 (en) Robot simulation apparatus
NO170368B (en) PROCEDURE FOR THREE-DIMENSIONAL MAPPING OF AN OBJECTS
CN116151714B (en) Three-dimensional visual safety production management and control system for harbor yard
Fang et al. A point cloud-vision hybrid approach for 3D location tracking of mobile construction assets
EP3161412B1 (en) Indexing method and system
KR102255978B1 (en) Apparatus and method for generating tunnel internal precise map based on tunnel internal object detection using 3D sensor
KR102298643B1 (en) 3D modeling method of underwater surfaces using infrared thermal imaging camera and drone
JP2018077837A (en) Position recognition method and system, and abnormality determination method and system
Wang et al. Preliminary research on vehicle speed detection using traffic cameras
CN118037956A (en) Method and system for generating three-dimensional virtual reality in fixed space
JP7480833B2 (en) Measuring equipment, measuring systems and vehicles
JP7241906B2 (en) Data processing device and data processing method
JP2009052907A (en) Foreign matter detecting system
CN112488022B (en) Method, device and system for monitoring panoramic view
JP7280852B2 (en) Person detection system, person detection program, trained model generation program and trained model
CN110288832A (en) It is merged based on microwave with the multiple-object information of video and visual presentation method
CN117636373A (en) Electronic price tag detection method and device
KR102362512B1 (en) System for correcting out by improving target detectin accuracy
KR102362509B1 (en) Image map making system auto-searching the misprounciations of digital map data for precision improvement of map imgae
KR102209025B1 (en) Markerless based AR implementation method and system for smart factory construction
JP2021174216A (en) Facility inspection system, facility inspection method
JP2000032601A (en) Orbit recognition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 066006 room 2317, 3rd floor, zone B, E-Valley creative space, 12 Yanghe Road, Qinhuangdao Economic and Technological Development Zone, Hebei Province

Patentee after: Binyuan Guoke (Qinhuangdao) Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 066006 room 2317, 3rd floor, zone B, E-Valley creative space, 12 Yanghe Road, Qinhuangdao Economic and Technological Development Zone, Hebei Province

Patentee before: QINHUANGDAO YANDA BINYUAN TECHNOLOGY DEVELOPMENT Co.,Ltd.

Country or region before: China

CP03 Change of name, title or address

Address after: 066006 4th Floor, Science and Technology Building, No. 69 Longhai Road, Economic and Technological Development Zone, Qinhuangdao City, Hebei Province

Patentee after: Binyuan Guoke (Qinhuangdao) Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 066006 room 2317, 3rd floor, zone B, E-Valley creative space, 12 Yanghe Road, Qinhuangdao Economic and Technological Development Zone, Hebei Province

Patentee before: Binyuan Guoke (Qinhuangdao) Intelligent Technology Co.,Ltd.

Country or region before: China