CN117593498B - Digital twin scene configuration method and system

Info
- Publication number: CN117593498B (application CN202410079151.4A)
- Authority: CN (China)
- Prior art keywords: scene, sensor, data, model, configuration
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications

- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts (G: Physics; G06: Computing, calculating or counting; G06T: Image data processing or generation, in general; G06T19/00: Manipulating 3D models or images for computer graphics)
- G06T2200/04—Indexing scheme for image data processing or generation, in general, involving 3D image data
- G06T2200/24—Indexing scheme for image data processing or generation, in general, involving graphical user interfaces [GUIs]
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS] (Y02P: Climate change mitigation technologies in the production or processing of goods)
Abstract
The application provides a digital twin scene configuration method and system comprising at least one processor configured to: present a configuration detail list for a first sensor on a sensor configuration interface, the list enumerating the facility ports and transmission channels of the scene site with which the first sensor can be associated; and, in response to a user selecting in the configuration detail list the facility port and transmission channel to associate with the first sensor, associate and bind the selected facility port and transmission channel with the first sensor, so that detection data collected by the selected facility port is transmitted to the first sensor through the selected transmission channel and serves as the detection data of the first sensor in the scene item. As a result, only one operator is required to complete the layout of the sensors in the three-dimensional model, the sensors can be matched with background data, and when a sensor's information is wrong, only an online change is required.
Description
Technical Field
The application relates to the technical field of digital twins, and in particular to a digital twin scene configuration method and system.
Background
In a typical monitoring system, three-dimensional sensor models must be displayed within a three-dimensional model. When the number of sensors is large, a large amount of manpower is needed to place the sensor models in the three-dimensional model according to drawings: during placement, an operator must adjust each sensor's position, angle, and size against the three views of the drawing and enter its name, model, coordinate position, and so on. Moreover, the sensor information cannot be matched with background data; after placement is completed, the result is published to a three-dimensional rendering engine, and the sensor information is entered manually through code that matches model attributes. If, after the program is released, a sensor's position or information is found to be wrong, the sensor model's position and information must be changed manually all over again, so the whole development process must be repeated: from model editing, through sending the changed sensor information to the three-dimensional engine, to a developer packaging and releasing the program. The process of displaying three-dimensional sensor models in a three-dimensional model is therefore complex, requires many operators, is prone to repeated placement of sensor models, and involves repeating the entire flow.
Disclosure of Invention
The present application has been made to address the above drawbacks of the prior art. What is needed is a digital twin scene configuration method and system in which only one operator is required to complete the layout of the sensors in the three-dimensional model, the sensors can be matched with background data, and, when a sensor's information is wrong, only an online change to the corresponding sensor is required.
According to a first aspect of the present application, there is provided a digital twin scene configuration method comprising performing the following steps with at least one processor. In response to a first interactive operation in which a user creates a scene item, the scene item is created. In response to a second interactive operation in which the user enters scene configuration, a scene configuration interface is presented. A scene is presented in the scene configuration interface; in response to a third interactive operation in which the user opens the model configuration, a model configuration interface is presented, and icons of the candidate models are presented in the model configuration interface. In response to a third interactive operation on the icon of a first model among the candidate models, the first model is dragged into the scene and presented, and a model attribute information configuration item is presented at an associated position of the first model. In response to a fourth interactive operation on the model attribute information configuration item, attribute information is configured, and the first model in the scene changes according to the configured attribute information, yielding a configured first model. A sensor configuration interface is presented in the scene configuration interface, and icons of the candidate sensors are presented in the sensor configuration interface. In response to a fifth interactive operation on the icon of a first sensor among the candidate sensors, the first sensor is dragged to a layout position in the configured first model and presented there. In response to a sixth interactive operation in which the user sets the model data driver, the set driving data of the first model is applied to the first model. A configuration detail list of the first sensor is presented on the sensor configuration interface, in which the facility ports and transmission channels of the scene site with which the first sensor can be associated are listed; in response to the user selecting in the configuration detail list the facility port and transmission channel to associate with the first sensor, the selected facility port and transmission channel are associated and bound with the first sensor, so that detection data collected by the selected facility port is transmitted to the first sensor through the selected transmission channel and serves as the detection data of the first sensor in the scene item.
According to a second aspect of the present application, there is provided a digital twin scene configuration system comprising an interface and at least one processor. The interface is configured to be selectively communicatively coupled to a facility port and a transmission channel of the scene site, so as to obtain the detection data collected by the selected facility port via the selected transmission channel. The at least one processor is configured to perform a digital twin scene configuration method according to any embodiment of the present application.
According to the digital twin scene configuration method and system provided by the embodiments of the application, the configured first model is obtained by dragging model icons and configuring attribute information on the model configuration interface of the scene; on this basis, a sensor can be placed at the corresponding layout position of the model by dragging the corresponding sensor icon, without entering the sensor's position, angle, size, or other details from a drawing; the first model is associated with data through the model data driver setting; and, through the configuration detail list on the sensor configuration interface, the sensor is associated and bound with the sensor data of the scene site, so that monitoring is based on the actual detection data of the scene.
Drawings
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. The same reference numerals with letter suffixes or different letter suffixes may represent different instances of similar components. The accompanying drawings illustrate various embodiments by way of example in general and not by way of limitation, and together with the description and claims serve to explain the claimed embodiments. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. Such embodiments are illustrative and not intended to be exhaustive or exclusive of the present apparatus or method.
FIG. 1 shows a process flow diagram of a processor according to an embodiment of the application;
FIG. 2 illustrates an interface schematic for creating a scene item according to an embodiment of the application;
FIG. 3 illustrates a scene configuration interface schematic according to an embodiment of the application;
FIG. 4 illustrates a model configuration interface schematic according to an embodiment of the present application;
FIG. 5 illustrates an interface diagram of presenting a model attribute information configuration item according to an embodiment of the present application;
FIG. 6 illustrates a sensor configuration interface schematic according to an embodiment of the application;
FIG. 7 illustrates an interface diagram presenting a list of configuration details of a first sensor, according to an embodiment of the application;
FIG. 8 illustrates an interface schematic for setting a model data driver according to an embodiment of the application;
FIG. 9 illustrates an interface schematic for setting up a scene data driver according to an embodiment of the application;
FIG. 10 illustrates an interface schematic for deploying a first sensor in accordance with an embodiment of the application;
FIG. 11 illustrates an interface diagram for presenting a configuration list in accordance with an embodiment of the present application;
FIG. 12 illustrates a project information configuration interface schematic according to an embodiment of the application; and
FIG. 13 shows a schematic structural diagram of a digital twin scene configuration system according to an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the drawings and detailed description to enable those skilled in the art to better understand the technical scheme of the present application. Embodiments of the present application will be described in further detail below with reference to the drawings and specific examples, but not by way of limitation.
The terms "first," "second," and the like, as used herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The word "comprising" or "comprises" and the like means that elements preceding the word encompass the elements recited after the word, and not exclude the possibility of also encompassing other elements.
According to an embodiment of the present application, there is provided a digital twin scene configuration method including performing the following steps with at least one processor. FIG. 1 shows a process flow diagram of a processor according to an embodiment of the application.
FIG. 2 illustrates an interface schematic for creating a scene item according to an embodiment of the application. In step 101, a scene item is created in response to a first interactive operation in which a user creates the scene item. As shown in FIG. 2, clicking the icon for a new wind farm pops up an input box, and the user may enter a scene item name in the input box to create the scene item for the new wind farm. It will be understood that icons for other types of scene items may be displayed for other types of projects.
FIG. 3 illustrates a scene configuration interface schematic according to an embodiment of the application. In step 102, a scene configuration interface is presented in response to a second interactive operation in which the user enters scene configuration. In the interface shown in FIG. 2, the second interactive operation may be clicking the icon corresponding to the scene item to enter the scene configuration interface, or it may be another type of interactive operation. The scene configuration interface is presented so that the project can be configured.
FIG. 4 illustrates a model configuration interface schematic according to an embodiment of the present application. In step 103, a scene is presented in the scene configuration interface; in response to a third interactive operation in which the user opens the model configuration, a model configuration interface is presented, and icons of the candidate models are presented in it. The user thus has an immersive experience during scene configuration, which aids model configuration and monitoring, and can select a suitable model via the candidate model icons to meet their requirements.
FIG. 5 illustrates an interface diagram of presenting a model attribute information configuration item according to an embodiment of the present application. In step 104, in response to a third interactive operation on the icon of the first model among the candidate models, the first model is dragged into the scene and presented, and the model attribute information configuration item is presented at an associated position of the first model. For example, as shown in FIG. 5, an input box is presented at the upper right corner of the first model. The attribute information of the first model includes its name, rotation angle, position, and so on; the rotation angle may take values such as 0°, 90°, 180°, and 270°, so that the presentation angle of the first model in the scene can be adjusted, making it convenient for the user to view the model and lay out sensors.
In step 105, in response to a fourth interactive operation on the model attribute information configuration item, attribute information is configured, and the first model in the scene changes according to the configured attribute information, yielding the configured first model. For example, an interactive operation that adjusts the rotation angle and/or position of the first model correspondingly changes the presented angle and/or position, directly altering the relevant attributes of the first model.
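As a concrete illustration of steps 104 and 105, the configuration item can be viewed as a small typed form whose values are applied directly to the placed 3D object. The TypeScript sketch below is a minimal, hypothetical example; names such as ModelAttributes and applyAttributes are invented for illustration and are not part of the disclosure.

```typescript
// Hypothetical sketch: applying configured attribute info to a placed model.
type RotationDeg = 0 | 90 | 180 | 270; // the discrete angles offered by the config item

interface ModelAttributes {
  name: string;
  rotation: RotationDeg;
  position: { x: number; y: number; z: number };
}

interface SceneObject {
  setName(name: string): void;
  setRotationY(radians: number): void;
  setPosition(x: number, y: number, z: number): void;
}

// Step 105 in miniature: the configured attributes change the model in the scene.
function applyAttributes(obj: SceneObject, attrs: ModelAttributes): void {
  obj.setName(attrs.name);
  obj.setRotationY((attrs.rotation * Math.PI) / 180); // degrees -> radians
  obj.setPosition(attrs.position.x, attrs.position.y, attrs.position.z);
}
```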
FIG. 6 illustrates a sensor configuration interface schematic according to an embodiment of the application. At step 106, a sensor configuration interface is presented in the scene configuration interface, and icons of each candidate sensor are presented in the sensor configuration interface. A plurality of selectable sensor types, such as inclinometers, pressure gauges, strain gauges, anemometers, etc., are displayed on the sensor configuration interface, from which the sensor associated with the first model can be selected to match the type and number of sensors in the actual scene, enabling accurate and complete acquisition of the detection data in the actual scene for better monitoring.
In step 107, in response to a fifth interactive operation on the icon of the first sensor among the candidate sensors, the first sensor is dragged to a layout position in the configured first model and presented there. The user only needs a simple drag operation, guided by the sensor's arrangement in the actual scene, to complete the layout of the first sensor, without manually entering the sensor's position or other details. Since the first model is arranged in the scene, even if a sensor's position is wrong, only its position on the first model needs to be changed online, and processes such as model editing need not be repeated. In some cases the arrangement of many sensors can be completed in a few minutes, significantly shortening layout time. The user can also conveniently add sensors according to the actual scene without changing any other link; an added sensor only needs to be associated and bound with the sensor detection data of the actual scene.
In step 108, in response to the sixth interactive operation in which the user sets the model data driver, the set driving data of the first model is applied to the first model. The first model can thus be associated with driving data and kept in a state close to the actual scene according to that data, making it convenient for the user to monitor the first model's condition.
FIG. 7 illustrates an interface diagram presenting a list of configuration details of a first sensor, according to an embodiment of the application. In step 109, a configuration detail list of the first sensor is presented on the sensor configuration interface, in which the facility ports and transmission channels of the scene site with which the first sensor can be associated are listed. In response to the user selecting, in the configuration detail list, the facility port and transmission channel to associate with the first sensor, the selected facility port and transmission channel are associated and bound with the first sensor, so that the detection data collected by the selected facility port is transmitted to the first sensor through the selected transmission channel and serves as the detection data of the first sensor in the scene item. Based on the layout position of the first sensor in the actual scene and the transmission channel over which it transmits data, the user can determine the facility port and transmission channel corresponding to the first sensor, so that the first sensor is associated and bound with the detection data that uniquely corresponds to it in the actual scene. For example, a cloud database may store data entries for the actual scene's sensors keyed by facility port and transmission channel; once the first sensor is bound to such a data entry, the detection data of the corresponding real sensor can be obtained. Through this association by facility port and transmission channel, multiple first sensors can each be independently matched with sensor detection data in the actual scene; the operation is simple, and the data association is accurate. The user need not check each sensor's data at the actual site; the data can be viewed in the digital twin system, which greatly reduces the user's workload.
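In effect, the binding of step 109 gives each virtual sensor a routing key into the field data stream. The following is a minimal sketch under the assumption of a store keyed by (facility port, channel number) pairs; all identifiers are hypothetical, not the disclosed implementation.

```typescript
// Hypothetical sketch of step 109: binding a facility port and channel to a sensor.
interface SensorBinding {
  facilityPort: string; // e.g. "pallan2-3F-AL-port94" (naming discussed below)
  channelNo: number;    // the transmission channel actually in use
}

type DetectionRecord = { timestamp: number; value: number };

// Stand-in for the cloud database: detection data keyed by port and channel.
const fieldData = new Map<string, DetectionRecord[]>();
const keyOf = (b: SensorBinding) => `${b.facilityPort}#${b.channelNo}`;

// Virtual sensor id -> its binding; set when the user selects in the detail list.
const bindings = new Map<string, SensorBinding>();

function bindSensor(sensorId: string, binding: SensorBinding): void {
  bindings.set(sensorId, binding);
}

// Detection data collected at the bound port becomes the virtual sensor's data.
function detectionDataFor(sensorId: string): DetectionRecord[] {
  const b = bindings.get(sensorId);
  return b ? fieldData.get(keyOf(b)) ?? [] : [];
}
```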
In some embodiments, the following steps are further performed with the at least one processor. In response to the user entering facility ports and transmission channels, a data matching table is obtained. In response to a seventh interactive operation in which the user uploads the data matching table, the data matching table is identified to obtain the facility ports and transmission channels of the scene site that can be associated with the first sensor, and these are taken as the selectable items for the facility port and transmission channel of the first sensor in the configuration detail list. For example, as shown in FIG. 7, the associated configuration detail list can be selected through the file upload option on the sensor configuration interface. The data matching table can be edited by the user in advance based on the actual scene, for example directly as an EXCEL table that is then uploaded; the table can include the facility ports and transmission channels of the scene site associated with different sensors, and the uploaded content appears in the corresponding options of the configuration detail list, so that the user can quickly and accurately associate and bind the first sensor with the detection data of the actual scene.
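One plausible shape for such a data matching table is one row per sensor. The sketch below uses a CSV stand-in for the EXCEL table, with invented column names, and shows how uploaded rows could be turned into the selectable options of the configuration detail list; it is an assumption, not the disclosed format.

```typescript
// Hypothetical parser for an uploaded data matching table (CSV stand-in for Excel).
// Assumed columns: sensorName,facilityPort,channelNo
interface MatchRow {
  sensorName: string;
  facilityPort: string;
  channelNo: number;
}

function parseMatchingTable(csv: string): MatchRow[] {
  return csv
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map((line) => {
      const [sensorName, facilityPort, channel] = line.split(",").map((s) => s.trim());
      return { sensorName, facilityPort, channelNo: Number(channel) };
    });
}

// Parsed rows become the selectable items in the configuration detail list.
const rows = parseMatchingTable(
  "sensorName,facilityPort,channelNo\ninclinometer-1,pallan2-3F-AL-port94,1"
);
console.log(rows[0].facilityPort); // "pallan2-3F-AL-port94"
```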
In some embodiments, the facility ports in the data matching table are named by project name, zone name, device model, and device serial number. Each first sensor thus corresponds to a unique data entry in the cloud database, so that even when the number of first sensors is large, each one can be accurately associated and bound with sensor detection data in the actual scene. For example, in a port name such as "pallan2-3F-AL-port94", "pallan2" represents the project name of a certain wind farm, "3F" represents the zone of the wind farm, "AL" represents the device model corresponding to the first model, and "port94" represents the device serial number within that zone. The transmission channels are named by serial number; each first sensor may have a plurality of transmission channels, and the user selects according to the channel actually used.
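Under this naming rule, a facility port identifier can be composed and parsed mechanically. A small sketch follows, assuming a hyphen-separated form of the four fields; the separator and type names are assumptions for illustration.

```typescript
// Hypothetical encoding of the facility-port naming rule:
// project name + zone name + device model + device serial number.
interface FacilityPortName {
  project: string; // e.g. "pallan2" - the wind farm project
  zone: string;    // e.g. "3F"     - zone within the farm
  model: string;   // e.g. "AL"     - device model of the first model
  serial: string;  // e.g. "port94" - device serial number in that zone
}

const formatPort = (p: FacilityPortName) =>
  `${p.project}-${p.zone}-${p.model}-${p.serial}`;

function parsePort(name: string): FacilityPortName {
  const [project, zone, model, serial] = name.split("-");
  return { project, zone, model, serial };
}

console.log(formatPort(parsePort("pallan2-3F-AL-port94"))); // round-trips
```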
In some embodiments, the transmission channels in the data matching table are named by channel number, such as channel 1, channel 2, channel 3, and so on. By selecting a channel number in the configuration detail list, the channel corresponding to the detection data actually transmitted by the first sensor can be chosen.
FIG. 8 illustrates an interface schematic for setting a model data driver according to an embodiment of the application. In response to the sixth interactive operation in which the user sets the model data driver, applying the set driving data of the first model to the first model specifically includes: presenting a data matching list and/or data driving parameters of the first model on a data driving configuration interface, wherein the data matching list lists the matching information of the driving data of the first model, and the data driving parameters list the data levels of the first model together with the data ranges and model state information corresponding to the different levels; in response to the user selecting matching information in the data matching list, matching the driving data of the first model with a sensor or a data model; and/or, in response to the user entering the data ranges of the different data levels, causing the model state of the first model to change dynamically with the data of the sensor or data model. For example, the first model is a fan (wind turbine), and the data matching list includes a selection of the inclinometer corresponding to the fan. The data driving parameters are divided into four levels: a tilt angle of 0° to 0.5° corresponds to level-0 data, and the fan is static; 0.5° to 1° corresponds to level-1 data, with a fan sway amplitude of 0.5° and a period of 3 seconds; 1° to 1.5° corresponds to level-2 data, with a sway amplitude of 1° and a period of 3 seconds; and above 1.5° corresponds to level-3 data, with a sway amplitude of 3° and a period of 3 seconds. The state of the first model therefore changes with the sensor data, so that the user perceives the actual scene in time and can monitor the first model better.
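The level mapping in this example is essentially a lookup from a measured value to a (level, amplitude, period) triple, and the same structure generalizes to the sea-surface scene driver described below. A minimal sketch using the fan's inclinometer figures quoted above; the structure and names are assumptions.

```typescript
// Hypothetical level table for the fan's data driver (figures from the text above).
interface DriveLevel {
  level: number;
  min: number;       // inclusive lower bound of the data range
  max: number;       // exclusive upper bound; Infinity for the top level
  amplitude: number; // sway amplitude in degrees (or metres for the sea surface)
  periodSec: number; // 0 means the model is static
}

const fanLevels: DriveLevel[] = [
  { level: 0, min: 0.0, max: 0.5, amplitude: 0, periodSec: 0 }, // static
  { level: 1, min: 0.5, max: 1.0, amplitude: 0.5, periodSec: 3 },
  { level: 2, min: 1.0, max: 1.5, amplitude: 1, periodSec: 3 },
  { level: 3, min: 1.5, max: Infinity, amplitude: 3, periodSec: 3 },
];

// Map a sensor reading (e.g. an inclinometer tilt in degrees) to its drive level.
function levelFor(value: number, table: DriveLevel[]): DriveLevel {
  return table.find((l) => value >= l.min && value < l.max) ?? table[table.length - 1];
}

console.log(levelFor(1.2, fanLevels)); // level 2: 1 degree sway, 3 s period
```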
In some embodiments, the at least one processor is configured to: in response to an eighth interactive operation in which the user sets a scene data driver, apply the set driving data of the scene to the scene. For example, where the scene of an offshore wind farm includes the sea surface and the driving data is applied to the wave motion of the sea surface, changes in the speed of the waves can be seen in the scene, which improves the user's sense of realism, conveys the situation of the actual scene in time, and facilitates monitoring of the project.
FIG. 9 illustrates an interface schematic for setting a scene data driver according to an embodiment of the application. In response to the eighth interactive operation in which the user sets the scene data driver, applying the set driving data of the scene to the scene specifically includes: presenting a data matching list and/or data driving parameters of the scene on a data driving configuration interface, wherein the data matching list lists the matching information of the driving data of the scene, and the data driving parameters list the data levels of the scene together with the data ranges and model state information corresponding to the different levels; in response to the user selecting matching information in the data matching list, matching the driving data of the scene with a sensor; and/or, in response to the user entering the data ranges of the different data levels, causing the scene state to change dynamically with changes at the scene site. For example, as shown in FIG. 9, the data matching list includes a selection of the anemometer corresponding to the sea surface. The data driving parameters are divided into four levels: a wind speed of 0 m/s to 5 m/s corresponds to level-0 data, and the sea surface is still; 5 m/s to 10 m/s corresponds to level-1 data, with a sea surface wave amplitude of 0.5 m and a period of 10 seconds; 10 m/s to 15 m/s corresponds to level-2 data, with a wave amplitude of 1 m and a period of 10 seconds; and above 15 m/s corresponds to level-3 data, with a wave amplitude of 5 m and a period of 10 seconds. The scene therefore changes with the sensor data, which helps the user monitor the scene associated with the project.
FIG. 10 illustrates an interface schematic for deploying a first sensor in accordance with an embodiment of the application. In response to the fifth interactive operation on the icon of the first sensor among the candidate sensors, dragging the first sensor to the layout position in the configured first model specifically includes the following steps. In response to the fifth interactive operation, the first sensor is dragged to a root area of the configured first model and stays there for a first time interval, causing the first model to enter a selected state. With the first model in the selected state, in response to a user operation at a layout position of the first model, the icon of the first sensor is presented at that layout position. The dwell over the first time interval, which may be about 3 s, lets the system determine that the first model is the one with which the first sensor layout is associated, so that the user can then drag the first sensor directly to the layout position of the first model. Further, when one first model is in the selected state, the other first models are in the unselected state, so that sensors are laid out on different first models separately and each first model corresponds to its own first sensors.
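The dwell-to-select behavior can be sketched as a timer that starts when the dragged sensor icon enters a model's root area and fires after the first time interval. All names below are invented; only the roughly 3-second interval follows the description.

```typescript
// Hypothetical dwell-to-select logic for dragging a sensor onto a model (step 107).
const FIRST_TIME_INTERVAL_MS = 3000; // "about 3 s" per the description

class DwellSelector {
  private timer: ReturnType<typeof setTimeout> | null = null;
  private selected: string | null = null;

  // Called when the dragged sensor icon enters a model's root area.
  onEnterRootArea(modelId: string, onSelected: (id: string) => void): void {
    this.onLeaveRootArea(); // only one model may be selected at a time
    this.timer = setTimeout(() => {
      this.selected = modelId; // the model enters the selected state
      onSelected(modelId);
    }, FIRST_TIME_INTERVAL_MS);
  }

  // Called when the icon leaves the root area before the interval elapses.
  onLeaveRootArea(): void {
    if (this.timer) clearTimeout(this.timer);
    this.timer = null;
  }

  isSelected(modelId: string): boolean {
    return this.selected === modelId;
  }
}
```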
As shown in FIG. 7, in some embodiments, the following steps are performed with the at least one processor: listing an initial value of the overrun threshold of the first sensor in the configuration detail list on the sensor configuration interface; and, in response to the user modifying that initial value in the configuration detail list, changing the overrun threshold corresponding to the first sensor. For example, in an offshore wind farm project, individual sensors must operate for long periods in humid, sun-exposed environments, so faults may occur. The user can set the overrun threshold from experience, so that when a detection value from the first sensor's actual scene exceeds the threshold, a prompt is shown on the interface or the color of the first sensor in the scene changes, letting the user judge whether the real sensor has failed and needs maintenance. For a maintained first sensor, the channel number in the configuration detail list can continue to be used after maintenance to obtain the corresponding detection data.
Further, since the sensors at the scene site are exposed to severe environments such as seawater erosion for long periods, when no detection data is received within a first time threshold, this indicates that the first sensor is no longer operating and needs to be replaced. A time threshold for detecting data disconnection may accordingly be set in the configuration detail list on the sensor configuration interface, so that the user learns of the fault in time. For a replaced first sensor, the user only needs to modify the transmission channel in the configuration detail list, that is, select another channel number from the channel number options and change the channel number of the transmission channel, in order to continue receiving the detection data transmitted by the replacement sensor. The user can thus quickly re-associate and re-bind the detection data when a first sensor is replaced.
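Both fault signals described here, a value exceeding the overrun threshold and no data arriving within the time threshold, reduce to simple checks against the sensor's configured values. A hedged sketch follows; the field and callback names are assumptions.

```typescript
// Hypothetical fault checks for a bound sensor (overrun and disconnection).
interface SensorHealthConfig {
  overrunThreshold: number;    // editable initial value in the detail list
  disconnectTimeoutMs: number; // time threshold for detecting data disconnection
}

function checkReading(
  value: number,
  cfg: SensorHealthConfig,
  warn: (msg: string) => void
): void {
  if (value > cfg.overrunThreshold) {
    warn(`overrun: ${value} exceeds threshold ${cfg.overrunThreshold}`);
    // e.g. prompt on the interface or change the sensor's color in the scene
  }
}

function checkLiveness(
  lastSeenMs: number,
  nowMs: number,
  cfg: SensorHealthConfig,
  warn: (msg: string) => void
): void {
  if (nowMs - lastSeenMs > cfg.disconnectTimeoutMs) {
    warn("no detection data within the time threshold: sensor may need replacing");
    // after replacement, only the channel number in the detail list is changed
  }
}
```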
FIG. 11 illustrates an interface diagram for presenting a configuration list in accordance with an embodiment of the present application. In some embodiments, the following steps are performed with the at least one processor: presenting a configuration list in the scene configuration interface, wherein the configuration list presents the first models, the first sensors belonging to each first model, and a refresh option for each first sensor; and displaying attribute information of the first sensor in an editable state in response to the user selecting the refresh option. For example, as shown in FIG. 11, clicking the refresh button of the anemometer displays an attribute input box for the anemometer, so that the first model and/or first sensor can still be modified after it has been laid out.
As shown in FIG. 11, the at least one processor is further configured to perform the following steps: presenting a configuration list in the scene configuration interface, wherein the configuration list presents the first models, the first sensors belonging to each first model, and a selection box and operation options for each first sensor; and deleting a first model and/or the first sensors belonging to it in response to the user's use of the selection box and operation options. For example, selecting the inclinometer 5 and then the delete button deletes the inclinometer 5, so that the layout of first models and/or first sensors can be pared back to match the actual scene after they have been laid out.
FIG. 12 illustrates a project information configuration interface schematic according to an embodiment of the application. The at least one processor is further configured to perform the following steps: presenting a project picture item and a project name item on a project information configuration interface; and obtaining the information of the scene project in response to the user's input to the project picture item and project name item. Project information can thus be modified, and different scene projects can be better distinguished.
The embodiments of the application also provide a digital twin scene configuration system. FIG. 13 shows a schematic structural diagram of a digital twin scene configuration system according to an embodiment of the present application. The digital twin scene configuration system 200 includes an interface 201 and at least one processor 202. The interface 201 is configured to be selectively communicatively coupled to a facility port and a transmission channel of the scene site, so as to obtain the detection data collected by the selected facility port via the selected transmission channel. The at least one processor 202 is configured to perform a digital twin scene configuration method according to any embodiment of the present application.
A processor in the present application may be a processing device including one or more general-purpose processing devices, such as a microprocessor, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or the like. More specifically, the processor may be a Complex Instruction Set Computing (CISC) microprocessor, a Reduced Instruction Set Computing (RISC) microprocessor, a Very Long Instruction Word (VLIW) microprocessor, a processor running other instruction sets, or a processor running a combination of instruction sets. The processor may also be one or more special-purpose processing devices such as an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), or a System on a Chip (SoC). The processor may be communicatively coupled to a memory and configured to execute the computer-executable instructions stored thereon.
Furthermore, although exemplary embodiments have been described herein, the scope of the application includes any and all embodiments with equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, or alterations. The elements of the claims are to be construed broadly based on the language employed in the claims and are not limited to the examples described in this specification or during prosecution of the application, which examples are to be construed as non-exclusive. The specification and examples are therefore intended to be regarded as illustrative only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other, and other embodiments will be apparent to those of ordinary skill in the art upon reading the above description. In addition, in the above detailed description, various features may be grouped together to streamline the application; this is not to be interpreted as an intention that an unclaimed disclosed feature is essential to any claim. Rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with one another in various combinations or permutations. The scope of the application should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above embodiments are only exemplary embodiments of the present application and are not intended to limit the present application, the scope of which is defined by the claims. Various modifications and equivalent arrangements of this application will occur to those skilled in the art, and are intended to be within the spirit and scope of the application.
Claims (9)
1. A method of digital twinning scene configuration, comprising, with at least one processor, performing the steps of:
creating a scene item in response to a first interactive operation of creating the scene item by a user;
Responding to a second interactive operation of entering scene configuration of a user, and presenting a scene configuration interface;
Presenting a scene in the scene configuration interface, presenting a model configuration interface in response to a third interactive operation of the user to open the model configuration, presenting icons of each candidate model in the model configuration interface;
Responding to third interactive operation of a user on an icon of a first model in candidate models, dragging the first model into the scene to be presented, and presenting a model attribute information configuration item at an associated position of the first model;
configuring attribute information in response to a fourth interactive operation of the user on the model attribute information configuration item, and enabling the first model in the scene to be changed according to the configured attribute information, so that a configured first model is obtained;
Presenting a sensor configuration interface in a scene configuration interface, and presenting icons of each candidate sensor in the sensor configuration interface;
Responding to fifth interactive operation of a user on the icon of the first sensor in the candidate sensors, dragging the first sensor to a layout position in the configured first model for presentation;
Applying driving data of the set first model to the first model in response to a sixth interactive operation driven by the user setting model data;
Responding to the input of the user facility port and the transmission channel to obtain a data matching table;
In response to a seventh interactive operation of the uploaded data matching table of the user, identifying the data matching table, obtaining a facility port and a transmission channel of a scene associated with the first sensor, and taking the facility port and the transmission channel associated with the first sensor as selectable items of a configuration detail list;
Presenting a configuration detail list of a first sensor on a sensor configuration interface, wherein a facility port and a transmission channel of a scene with which the first sensor can be associated are listed, and responding to the selection of the facility port and the transmission channel of the scene with which the first sensor is associated in the configuration detail list by a user, and binding the facility port and the transmission channel of the selected scene with the first sensor in an associated manner, so that detection data collected by the facility port of the selected scene is transmitted to the first sensor through the selected transmission channel to serve as detection data of the first sensor in the scene item;
Responding to the change of a transmission channel associated with a first sensor in a configuration detail list by a user, enabling the first sensor to be associated and bound with a channel number of the changed transmission channel, enabling detection data collected by a facility port of a scene to be transmitted to the first sensor through the changed transmission channel, and taking the detection data as detection data of the first sensor in the scene item.
2. The digital twin scenario configuration method of claim 1, wherein the facility ports in the data matching table are named by project name, zone name, equipment model number and equipment serial number.
3. The digital twin scene configuring method according to claim 1, wherein applying the set driving data of the first model to the first model in response to a sixth interactive operation driven by the user setting model data specifically comprises:
Presenting a data matching list and/or data driving parameters of the first model on a data driving configuration interface, wherein the data matching list lists matching information of driving data of the first model, the data driving parameters list data grades corresponding to the data driving parameters of the first model and data ranges and model state information corresponding to different data grades, responding to selection of the matching information of the data matching list by a user, enabling the driving data of the first model to be matched with a sensor or a data model, and/or responding to input of the data ranges of different data grades of the data driving parameters by the user, enabling the model state of the first model to be dynamically changed according to data change of the sensor or the data model.
4. The digital twinning scenario configuration method of claim 1, further comprising, with the at least one processor: in response to the eighth interactive operation in which the user sets the scene data drive, the drive data of the set scene is applied to the scene.
5. The digital twin scene configuration method according to claim 4, wherein in response to a user setting an eighth interactive operation of scene data driving, applying driving data of the set scene to the scene specifically comprises:
And presenting a data matching list and/or data driving parameters of the scene on a data driving configuration interface, wherein the data matching list lists matching information of driving data of the scene, the data driving parameters list data levels corresponding to the data driving parameters of the scene and data ranges and model state information corresponding to different data levels, and responding to the selection of the matching information of the data matching list by a user, so that the driving data of the scene are matched with a sensor, and/or responding to the input of the data ranges of different data levels of the data driving parameters by the user, so that the scene state of the scene presents dynamic change according to scene field change.
6. The digital twin scene configuration method according to claim 1, wherein in response to a fifth interactive operation of the icon of the first sensor in the candidate sensor by the user, dragging the first sensor to the deployed position in the configured first model is specifically presented comprising:
Responding to fifth interactive operation of a user on icons of first sensors in the candidate sensors, dragging the first sensors to a root area in a configured first model, and staying for a first time interval to enable the first model to present a selected state;
and under the condition that the first model is in a selected state, responding to the operation of a user at a layout position of the first model, and presenting an icon of the first sensor at the layout position.
7. The digital twinning scenario configuration method of claim 1, further comprising, with the at least one processor: listing an initial value of the overrun threshold of the first sensor in a configuration detail list on the sensor configuration interface; and responding to the modification operation of the initial value of the overrun threshold of the first sensor in the configuration detail list by a user, and changing the overrun threshold corresponding to the first sensor.
8. The digital twinning scenario configuration method of claim 1, further comprising, with the at least one processor: presenting a configuration list in a scene configuration interface, wherein the configuration list presents a first model, first sensors to which the first model belongs and refreshing options corresponding to each first sensor; attribute information of an editable state of the first sensor is displayed in response to a user selection of the refresh option.
9. A digital twinning scene configuration system, comprising:
An interface configured to: be selectively communicatively connected to a facility port and a transmission channel of a scene site, to acquire detection data collected by the selected facility port via the selected transmission channel; and
At least one processor configured to: perform a digital twin scene configuration method according to any of claims 1-8.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410079151.4A (CN117593498B) | 2024-01-19 | 2024-01-19 | Digital twin scene configuration method and system |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN117593498A | 2024-02-23 |
| CN117593498B | 2024-04-26 |
Family

- Family ID: 89920617

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410079151.4A (CN117593498B, Active) | Digital twin scene configuration method and system | 2024-01-19 | 2024-01-19 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN117593498B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111176224A (en) * | 2018-11-13 | 2020-05-19 | 罗克韦尔自动化技术公司 | Industrial safety monitoring arrangement using digital twinning |
WO2021031454A1 (en) * | 2019-08-21 | 2021-02-25 | 佳都新太科技股份有限公司 | Digital twinning system and method and computer device |
CN114332439A (en) * | 2021-12-31 | 2022-04-12 | 联通(广东)产业互联网有限公司 | Three-dimensional data editing and generating system |
CN114528613A (en) * | 2022-01-26 | 2022-05-24 | 中瑞恒(北京)科技有限公司 | Intelligent park digital twin system visual editing method |
CN115794934A (en) * | 2022-11-08 | 2023-03-14 | 上海建工四建集团有限公司 | Production facility monitoring data and digital twin model integration system and method |
CN116310148A (en) * | 2023-05-17 | 2023-06-23 | 山东捷瑞数字科技股份有限公司 | Digital twin three-dimensional scene construction method, device, equipment and medium |
CN117240713A (en) * | 2023-09-22 | 2023-12-15 | 深圳市海洋王照明工程有限公司 | Visual configuration method, device and equipment of Internet of things equipment and storage medium |
CN117331377A (en) * | 2023-12-01 | 2024-01-02 | 珠海格力电器股份有限公司 | Configuration method, configuration device, electronic equipment and storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7609691B2 (en) * | 2021-04-08 | 2025-01-07 | 株式会社日立製作所 | Digital twin management system and method |
Non-Patent Citations (2)

| Title |
|---|
| Haijun Zhang et al., "Dynamic resource allocation optimization for digital twin-driven smart shopfloor," 2018 IEEE 15th International Conference on Networking, Sensing and Control, 2018-05-21, pp. 1-5. * |
| Song Aijuan et al., "Research and innovation on a LabVIEW and sensor fusion experiment platform based on digital twins," Modern Electronics Technique, vol. 46, no. 7, 2023-04-01, pp. 149-154. * |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |
| | GR01 | Patent grant |