
CN117197212B - Graphics processing methods, systems, devices and media - Google Patents


Info

Publication number
CN117197212B
CN117197212B
Authority
CN
China
Prior art keywords
data
cad file
processing unit
scene template
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311050958.7A
Other languages
Chinese (zh)
Other versions
CN117197212A (en)
Inventor
黄原成
李锦业
刘志彬
Current Assignee
Wuzhou Online E Commerce Beijing Co ltd
Original Assignee
Wuzhou Online E Commerce Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Wuzhou Online E Commerce Beijing Co ltd
Priority to CN202311050958.7A
Publication of CN117197212A
Application granted
Publication of CN117197212B
Legal status: Active
Anticipated expiration

Landscapes

  • Processing Or Creating Images (AREA)

Abstract


This application provides a graphics processing method, system, device, and medium. The method specifically includes: receiving a CAD file; determining a scene template corresponding to the CAD file; processing the CAD file using the scene template to obtain object data corresponding to data objects contained in the CAD file; wherein the scene template includes a parsing unit and a processing unit; the parsing unit parses the CAD file to obtain elements contained in the CAD file; the processing unit processes the elements contained in the CAD file to obtain object data corresponding to data objects; and outputs the object data corresponding to the data objects. This application can save labor costs in graphics processing, improve the accuracy of graphics processing, and has the advantages of high flexibility and wide applicability to various scenes.

Description

Graphics processing method, system, device and medium
Technical Field
The embodiment of the application relates to the technical field of computer information processing, in particular to a graphic processing method, a graphic processing system, graphic processing equipment and graphic processing media.
Background
CAD (Computer-Aided Design) software is drawing software that assists users in design work by means of a computer and its graphic devices; it has been widely used in fields such as construction, electronics and electrical engineering, mechanical design, the clothing industry, computer art, and logistics. A graphic drawn with CAD software may be referred to as a CAD graphic, and it typically includes a number of primitives. Taking the building field as an example, the primitives may represent data objects such as walls, rooms, and shelves.
Extracting the object data corresponding to these data objects from a CAD graphic is of great significance for auditing the CAD graphic and for further processing of the data objects.
Currently, data extraction of data objects is typically performed manually. Specifically, the user may perform manual measurement in CAD software to obtain object data corresponding to a data object included in the CAD drawing. However, manual measurement not only consumes a lot of labor cost, but also inevitably involves measurement errors.
Disclosure of Invention
The embodiment of the application provides a graphic processing method, which can save the labor cost of graphic processing, can improve the accuracy of graphic processing, and has the advantages of high flexibility and wide scene application range.
Correspondingly, the embodiment of the application also provides a graphics processing system, electronic equipment and a storage medium, which are used for realizing the realization and application of the method.
In order to solve the above problems, an embodiment of the present application discloses a graphics processing method, which includes:
Receiving a CAD file;
determining a scene template corresponding to the CAD file;
The scene template is used for processing the CAD file to obtain object data corresponding to a data object contained in the CAD file, wherein the scene template comprises an analysis unit and a processing unit, the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file, and the processing unit is used for processing the elements contained in the CAD file to obtain the object data corresponding to the data object;
and outputting object data corresponding to the data object.
In order to solve the above problems, an embodiment of the present application discloses a graphics processing method, which includes:
determining object data corresponding to the CAD file, wherein the object data is structured data;
performing preset processing according to the object data, wherein the preset processing comprises generating a ledger or a visual model, or executing a warehouse operation;
The method comprises the steps of determining a scene template corresponding to a CAD file, processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file, wherein the scene template comprises an analysis unit and a processing unit, the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file, and the processing unit is used for processing the elements contained in the CAD file to obtain the object data corresponding to the data object.
In order to solve the problems, the embodiment of the application discloses a graphic processing system, which comprises a graphic processing device and a data object in a warehouse;
The graphics processing device is configured to execute the foregoing method, determine, for a CAD file containing the data object, object data corresponding to the data object, output the object data corresponding to the data object, and perform a preset process according to the object data.
In order to solve the problems, an embodiment of the application discloses an electronic device, which comprises a processing unit and a memory, wherein executable codes are stored on the memory, and when the executable codes are executed, the processing unit is caused to execute the method according to any one of the embodiments.
To address the above issues, embodiments of the present application disclose one or more machine-readable media having stored thereon executable code which, when executed, causes a processing unit to perform a method as in any of the above embodiments.
The embodiment of the application has the following advantages:
In the technical scheme of the embodiment of the application, the CAD file is processed by utilizing the scene template corresponding to the CAD file so as to obtain the object data corresponding to the data object. The scene template can process the elements contained in the CAD file by adopting a computer technology, so that the embodiment of the application can save the cost of manual measurement in CAD software and avoid measurement errors, in other words, the embodiment of the application can save the labor cost of graphic processing and can improve the accuracy of graphic processing.
In addition, the scene template of the embodiment of the application can comprise an analysis unit and a processing unit. Wherein, different processing results can be realized by different combinations of processing units or different combinations of processing units and analyzing units. For example, the same analyzing unit may be matched with different processing units to obtain different processing results, or the same processing unit may be matched with different analyzing units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and wide scene application range.
Furthermore, one parsing unit or one processing unit may be applied to a variety of scene templates, in other words, different scene template options may use the same parsing unit or processing unit. The analysis unit or the processing unit has reusability, so the embodiment of the application can further save the cost of graphic processing.
Drawings
FIG. 1 is a flow chart of the steps of a graphics processing method of one embodiment of the present application;
FIG. 2 is a schematic diagram of a CAD data structure according to one embodiment of the present application;
FIG. 3 is a schematic diagram of a scene template according to one embodiment of the application;
FIG. 4 (a) is a schematic diagram of a polygon corresponding to a line element;
FIG. 4 (b) is a schematic view of a hollow portion in a polygon;
FIG. 4 (c) is a schematic diagram of the merging result after merging neighboring graphics in the polygon;
FIG. 5 is a flow chart of steps of a graphics processing method of one embodiment of the present application;
FIG. 6 is a schematic diagram of a graphics processing system in accordance with one embodiment of the application;
fig. 7 is a schematic diagram of an exemplary apparatus provided in one embodiment of the application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the application will be rendered by reference to the appended drawings and appended detailed description.
The embodiment of the application can be used for processing the CAD file to obtain the object data corresponding to the data objects such as the wall, the room, the goods shelf and the like contained in the CAD file.
The embodiments of the present application may involve the use of user data. In practical applications, user-specific personal data may be used in the schemes described herein within the scope permitted by the applicable laws and regulations of the user's country, and only with the user's explicit consent (e.g., after practical notification to the user).
In the related art, data extraction of a data object is generally performed manually. Specifically, the user may perform manual measurement in CAD software to obtain object data corresponding to a data object included in the CAD drawing. However, manual measurement not only consumes a lot of labor cost, but also inevitably involves measurement errors.
To address the technical problems that manual measurement in the related art not only consumes considerable labor cost but also makes measurement errors difficult to avoid, an embodiment of the present application provides a graphics processing method. The method comprises: receiving a CAD file; determining a scene template corresponding to the CAD file; processing the CAD file using the scene template to obtain object data corresponding to a data object contained in the CAD file, wherein the scene template specifically comprises a parsing unit and a processing unit, the parsing unit being used for parsing the CAD file to obtain elements contained in the CAD file, and the processing unit being used for processing the elements contained in the CAD file to obtain the object data corresponding to the data object; and outputting the object data corresponding to the data object.
According to the embodiment of the application, the elements contained in the CAD file are processed by utilizing the scene template corresponding to the CAD file so as to obtain the object data corresponding to the data object. The scene template can process the CAD file by adopting a computer technology, so that the embodiment of the application can save the cost of manual measurement in CAD software and avoid measurement errors, in other words, the embodiment of the application can save the labor cost of graphic processing and can improve the accuracy of graphic processing.
Method embodiment one
Referring to fig. 1, a flowchart illustrating steps of a graphics processing method according to an embodiment of the present application may specifically include the following steps:
Step 101: receiving a CAD file;
Step 102: determining a scene template corresponding to the CAD file, wherein the scene template specifically comprises a processing unit;
Step 103: processing the CAD file using the scene template to obtain object data corresponding to a data object contained in the CAD file, wherein the scene template specifically comprises a parsing unit and a processing unit, the parsing unit being used for parsing the CAD file to obtain elements contained in the CAD file, and the processing unit being used for processing those elements to obtain the object data;
Step 104: outputting the object data corresponding to the data object.
The embodiment of the method shown in fig. 1 may be used to parse a CAD file to obtain object data corresponding to a data object included in the CAD file. At least one step included in the method embodiment shown in fig. 1 may be performed by a graphics processing device, which may be running on a client or server. It will be appreciated that embodiments of the present application are not limited to the specific implementation of the method shown in fig. 1.
In step 101, the client may receive a CAD file uploaded by the user. The CAD file may be a file output by the drawing software, and it is understood that the embodiment of the application does not limit the specific format of the CAD file.
The CAD files that can be processed by the embodiments of the present application may conform to preset specifications. Examples of preset specifications may include a drawing specification, a layer specification, a frame specification, and the like.
For example, the drawing specification may require that the CAD file contain elements such as blocks, lines, and polygons. The layer specification may constrain characteristics of a layer such as line color and linetype, e.g., requiring the line color of a fire-blocking partition to be blue. The frame specification may constrain the linetype of the frame, e.g., requiring it to be a thick solid line. It will be appreciated that, according to practical application requirements, a person skilled in the art may determine the preset specifications; the embodiment of the present application does not limit the specific preset specifications.
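As a minimal sketch of how such a preset specification could be checked programmatically — the rule table and layer fields below are illustrative assumptions, not definitions from the patent:

```python
# Hypothetical layer-specification check: map a layer-name fragment to the
# line color the preset specification requires, as in the blue
# fire-blocking-partition example above.

PRESET_LAYER_SPECS = {
    "fire-blocking partition": "blue",  # assumed rule from the example above
}

def check_layer_spec(layer_name: str, line_color: str) -> bool:
    """Return True if the layer's line color satisfies the preset specification."""
    for fragment, required_color in PRESET_LAYER_SPECS.items():
        if fragment in layer_name.lower():
            return line_color == required_color
    return True  # layers without a rule pass by default

print(check_layer_spec("Fire-Blocking Partition A", "blue"))  # True
print(check_layer_spec("fire-blocking partition B", "red"))   # False
```

A real checker would read the layer table from the CAD file itself; this only shows the rule-lookup shape.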
In step 102, the scene corresponding to the scene template may characterize a parsing scene for the CAD file. The parsing scene may be associated with the scene corresponding to the CAD file, so as to enable processing of the CAD file.
In the embodiment of the application, the scene corresponding to the CAD file can represent the environment or the object in the environment where the element in the CAD file is located, wherein the object in the environment can comprise a building, a road, a park object and the like, and the park object can comprise a garden, a green land, an instrument and the like.
The analysis scene of the CAD file can be preset by a person skilled in the art according to the actual application requirement. For example, in the field of logistics technology, examples of analytical scenarios may include logistics parks, warehouse floors, warehouse interiors, warehouse complexes, automated lines, industrial parks, and the like.
The embodiment of the application can provide a scene template corresponding to each parsing scene. Correspondingly, the scene template acquisition process specifically comprises: creating a scene template option; determining the parsing unit and the processing unit corresponding to the scene template option; and storing the mapping relations between the parsing unit, the processing unit, and the scene template option.
Wherein the scene template options may correspond to the parsed scene. The embodiment of the application can determine a scene template option corresponding to a scene aiming at an analysis scene.
The processing unit corresponding to the scene template option may be matched with the parsing scene corresponding to the scene template option, in other words, the processing unit may be capable of processing the CAD file corresponding to the parsing scene.
The processing unit corresponding to the scene template option can be an existing processing unit or a newly built processing unit. In the embodiment of the application, different scene template options can use the same processing unit.
The processing unit may correspond to processing logic or code to which the processing logic corresponds. In case of using an existing processing unit, the existing code may be multiplexed to save the writing cost of the code. In the case of using a new processing unit, writing of code may be performed for the new processing unit.
The processing unit corresponding to the scene template option may be one or more. Different processing units may be used in combination, and processing units may also be combined with other classes of units. Other classes of units may include parsing units and/or filtering units.
The parsing unit is used for parsing the CAD file to obtain elements such as lines, circles, polygons, and text contained in the CAD file. In one example, the parsing unit may convert the CAD file into DXF (Drawing Exchange Format) data, and then extract the data of the elements from the DXF data according to the CAD format specification.
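A minimal sketch of such a parsing unit follows. ASCII DXF data stores alternating group-code/value lines, where code 0 introduces an entity and codes 10/20 and 11/21 hold a LINE's start and end coordinates; the sketch handles only LINE entities and ignores the section structure of a full DXF file:

```python
# Toy DXF line extractor: pair up group-code/value lines and collect LINE
# entities with their start (10/20) and end (11/21) coordinates.

def parse_dxf_lines(dxf_text: str):
    tokens = [t.strip() for t in dxf_text.splitlines()]
    pairs = list(zip(tokens[::2], tokens[1::2]))  # (group code, value)
    lines, current = [], None
    for code, value in pairs:
        if code == "0":  # start of a new entity ends the previous one
            if current is not None:
                lines.append(current)
            current = {"type": "LINE"} if value == "LINE" else None
        elif current is not None and code in ("10", "20", "11", "21"):
            key = {"10": "x1", "20": "y1", "11": "x2", "21": "y2"}[code]
            current[key] = float(value)
    if current is not None:
        lines.append(current)
    return lines

sample = "0\nLINE\n10\n0.0\n20\n0.0\n11\n3.0\n21\n4.0\n0\nEOF\n"
print(parse_dxf_lines(sample))
# [{'type': 'LINE', 'x1': 0.0, 'y1': 0.0, 'x2': 3.0, 'y2': 4.0}]
```

A production parsing unit would use a full DXF library rather than this hand-rolled pairing.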
Referring to table 1, an example of attribute information of a parsing unit according to an embodiment of the present application is shown. The attribute information of the parsing unit may include, in particular, the name of the parsing unit, CAD shape, output element format and description, and the like.
TABLE 1
Referring to FIG. 2, a schematic diagram of a CAD data structure according to one embodiment of the present application is shown. The CAD file may include one or more layers, and one layer may include one or more object entities. A block is a named group of object entities that may include one or more object entities. "block nesting" may refer to one block also referring to another block. An object entity may refer to an object having a graphical representation and examples of object entities may include lines, circles, arcs, text, ellipses, and the like.
In practical applications, different CAD shapes may be handled by different parsing units to obtain different elements. The same CAD shape can also be parsed by different parsing units to obtain different elements. For example, parsing a polygon with a polygon parsing unit yields a polygon, while parsing the same polygon with a line parsing unit yields lines.
The filtering unit can be used for determining a preset range and/or target graphic data corresponding to a preset layer from the CAD file.
The filtering unit may comprise a range filtering unit and/or a layer filtering unit. The range filtering unit may determine, from the CAD file, target graphic data corresponding to a preset range. The preset range may be determined via a graphic name; for example, a range filtering unit may be used to determine the target graphic data corresponding to the preset range "library elevation view No. 1". The layer filtering unit may determine, from the CAD file, target graphic data corresponding to a preset layer. The preset layer may be determined via a layer name; for example, a layer filtering unit may be used to determine the target graphic data (elevation annotation text) corresponding to the preset layer "elevation mark". Of course, the range filtering unit and the layer filtering unit can also be used together, for example to determine the elevation annotation text corresponding to both "library elevation view No. 1" and "elevation mark".
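The two filter units can be sketched as simple predicates over parsed entities. Entities are plain dicts here, and the field names ("layer", "block") are assumptions for illustration:

```python
# Hedged sketch of the range and layer filtering units described above.

def layer_filter(entities, layer_name):
    """Layer filtering unit: keep entities on the preset layer."""
    return [e for e in entities if e.get("layer") == layer_name]

def range_filter(entities, block_name):
    """Range filtering unit: keep entities inside the named graphic range."""
    return [e for e in entities if e.get("block") == block_name]

entities = [
    {"layer": "elevation mark", "block": "library elevation No. 1", "text": "+4.500"},
    {"layer": "wall", "block": "library elevation No. 1"},
    {"layer": "elevation mark", "block": "library elevation No. 2", "text": "+9.000"},
]
# Combining both filters, as in the "library elevation view No. 1" example:
both = range_filter(layer_filter(entities, "elevation mark"), "library elevation No. 1")
print(both)
# [{'layer': 'elevation mark', 'block': 'library elevation No. 1', 'text': '+4.500'}]
```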
The processing unit in the embodiment of the application can be a unit for processing elements contained in the CAD file in the scene model. The type of processing unit may be various. For example, the processing units may be divided into graphic processing units, functional processing units, and data object processing units according to categories.
And the graphic processing unit is used for processing the graphic elements contained in the CAD file. The function processing unit is used for realizing a preset function. The data object processing unit is used for determining object data corresponding to the data object according to the data provided by the other processing units.
Referring to table 2, an example of attribute information of a processing unit of one embodiment of the present application is shown. The attribute information of the processing unit may specifically include a category of the processing unit, a name of the processing unit, a collocated parsing unit, a description, and the like. The categories of processing units may include graphics processing units, functional processing units, data object processing units, and the like. Wherein a class of processing units may further comprise the corresponding processing unit. For example, the graphics processing unit may include a line-to-polygon processing unit or the like. For example, in a scene template corresponding to a logistics park or an industrial park scene, the line-to-polygon processing unit may be used for processing data objects such as roads, ground or bicycle sheds.
TABLE 2
In summary, the scene template of the embodiment of the application can comprise a plurality of processing units. One processing unit may use the output data of another processing unit. In the case where the processing unit is a graphics processing unit, the graphics processing unit may use output data of the parsing unit and the filtering unit. Different processing results can be realized by different combinations of processing units, or different combinations of processing units and analyzing units, or different combinations of processing units, analyzing units and filtering units.
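The combination of units described here — one unit consuming another unit's output — can be sketched as function composition. The unit names below are hypothetical placeholders:

```python
# Illustrative composition of parsing, filtering, and processing units into
# one scene-template pipeline, where each unit consumes the previous output.

def compose(*units):
    """Chain units so each consumes the previous unit's output."""
    def pipeline(data):
        for unit in units:
            data = unit(data)
        return data
    return pipeline

parse_unit = lambda raw: raw.split(";")               # parsing unit: raw -> elements
filter_unit = lambda els: [e for e in els if e]       # filtering unit: drop empties
count_unit = lambda els: {"element_count": len(els)}  # processing unit: summarize

template = compose(parse_unit, filter_unit, count_unit)
print(template("line;circle;;text"))  # {'element_count': 3}
```

Swapping `count_unit` for a different processing unit changes the processing result without touching the parsing or filtering stages, which is the flexibility the text claims.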
The parsing unit may correspond to processing logic or code corresponding to the processing logic. In the case of using an existing parsing unit, existing codes can be multiplexed to save the writing cost of the codes. In the case of using a new parsing unit, writing of code may be performed for the new parsing unit.
Similarly, the filtering unit may correspond to processing logic or code corresponding to processing logic. In case of using an existing filtering unit, the existing code can be multiplexed to save the writing cost of the code. In the case of using a new filter unit, the writing of code may be performed for the new filter unit.
The embodiment of the application can combine different processing units, or processing units and analyzing units in a code combination mode. Or the embodiment of the application can use a visual mode to combine different processing units, or the processing unit and the analysis unit.
An implementation of the visual mode is provided here. The implementation process specifically comprises providing a first area and a second area in an interface, wherein the first area may be a scene template area and the second area may be a unit area. The selected units contained in the scene template may be presented in the scene template area, and the second area may present units to be selected, such as processing units, parsing units, and filtering units to be selected. The embodiment of the application can receive a selection operation for a unit to be selected in the second area and add the unit selected by the user to the first area.
The first area may include the selected units and the connection relationships between them. For example, the first area may include a selected processing unit region, a selected parsing unit region, and a selected filtering unit region. The embodiment of the application can support moving a unit to be selected into any one of these regions through a drag operation. The embodiment of the application can also support editing operations, such as deletion or movement, for any selected unit in these regions, as well as connection operations between different selected units.
Referring to fig. 3, a schematic diagram of a scene template according to an embodiment of the present application is shown, where the scene template may specifically include a processing unit 301, a parsing unit 302, and a filtering unit 303.
The processing unit 301 is configured to process a graphic element included in the CAD file. The processing unit 301 may include a graphics processing unit 311, a function processing unit 312, and a data object processing unit 313.
The graphic processing unit 311 is configured to obtain a graphic element from the parsing unit, and determine a graphic corresponding to the graphic element.
The function processing unit 312 is used for performing processing such as coordinate transformation on the graphics corresponding to the graphics element.
The data object processing unit 313 is configured to process the graphics corresponding to the graphic element to obtain object graphics data corresponding to the data object. The object graphic data may be graphic data corresponding to a data object such as a shelf.
The parsing unit 302 is configured to obtain elements such as lines, circles, polygons, text, etc. from a CAD file.
The filtering unit 303 may be used to filter the graphic data contained in the CAD file. The filtering unit 303 may include a range filtering unit and/or a layer filtering unit. The range filtering unit may determine target graphic data corresponding to a preset range from the CAD file; optionally, the preset range may be determined via a graphic name. The layer filtering unit may determine target graphic data corresponding to a preset layer from the CAD file; optionally, the preset layer may be determined via a layer name or a block name.
It will be appreciated that the scene template shown in fig. 3 is only an example of a scene template according to an embodiment of the present application, and is not intended to limit the scene template according to an embodiment of the present application. In practice, the scene template of the embodiment of the application may include a processing unit, or a processing unit and a parsing unit, or a processing unit, a parsing unit and a filtering unit.
The scene template acquisition process of the embodiment of the application can also comprise the steps of determining a filtering unit corresponding to the analysis unit and saving the mapping relation between the filtering unit and scene template options.
In practical application, the step 102 of determining the scene template corresponding to the CAD file may specifically include presenting at least one scene template option, receiving a target scene template option selected by a user, and determining the scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option. For example, scene template options such as a logistics park, a warehouse floor, a warehouse interior, a warehouse matching building, an automated assembly line, an industrial park and the like can be displayed in the interface for selection by a user. The user may select the target scene template option by clicking or the like. The embodiment of the application can take the scene template corresponding to the target scene template option as the scene template corresponding to the CAD file.
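The saved mapping between scene template options and their units, and the lookup performed when the user selects a target option, can be sketched as a small registry. The option names follow the examples above; the unit lists are placeholders:

```python
# Hypothetical registry storing the mapping relations between scene template
# options and their parsing/processing units (step 102's lookup).

SCENE_TEMPLATES = {
    "logistics park": {"parsing": ["line", "polygon"], "processing": ["line-to-polygon"]},
    "warehouse interior": {"parsing": ["line", "text"], "processing": ["shelf"]},
}

def template_for_option(option: str) -> dict:
    """Return the scene template mapped to the user-selected option."""
    try:
        return SCENE_TEMPLATES[option]
    except KeyError:
        raise ValueError(f"no scene template registered for option {option!r}")

print(template_for_option("logistics park")["processing"])  # ['line-to-polygon']
```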
In step 103, the data object may be a composite information representation understood by the software. The data objects may correspond to entities, which may be objectively existing and distinguishable from one another. For example, in the analytic scenario of warehouse floors, the data objects may include shelves and the like. It will be appreciated that the above-described shelves are merely examples of data objects, and embodiments of the present application are not limited to particular data objects.
The embodiment of the application can provide the following technical scheme for processing the CAD file:
technical solution 1
In the technical scheme 1, the elements can comprise graphic elements, and the processing unit can comprise a graphic processing unit and a data object processing unit;
The process of processing the CAD file may specifically include determining, by using the graphics processing unit, a graphic corresponding to the graphic element, and processing, by using the data object processing unit, the graphic corresponding to the graphic element to obtain object graphic data corresponding to the data object.
In practical applications, the graphic elements may include line elements, point elements, text elements, polygonal elements, etc.
In an example, in the case that the graphic element includes a line element, the graphic corresponding to the graphic element may include a polygon, and the above-mentioned process of processing the graphic corresponding to the graphic element may include identifying, by using a data object processing unit, a hollow portion in the polygon and merging adjacent graphics in the polygon to obtain the corresponding object graphic data.
Referring to fig. 4 (a) to 4 (c), there are illustrated schematic diagrams of a process of processing polygons according to an embodiment of the present application, wherein fig. 4 (a) illustrates polygons corresponding to line elements, fig. 4 (b) illustrates hollow portions in the polygons, and fig. 4 (c) illustrates merging results of neighboring graphics in the polygons, the merging results characterizing object graphics data corresponding to walls. The hollow portion in the polygon may be a hollow portion to which a plurality of polygons commonly correspond.
The embodiment of the application not only can obtain the object graph data corresponding to the first data object according to the graph corresponding to the first data object, but also can obtain the object graph data corresponding to the second data object according to the graph corresponding to the first data object.
The first data object may be a wall and the second data object may be a room. As shown in fig. 4 (b), data corresponding to the hollow portion in the polygon may be used as object graphics data corresponding to the room.
The first data object may be a wall and a window and the second data object may be a room. In this case, object graphic data corresponding to a room surrounded by walls and windows may be determined from polygons corresponding to the walls and windows.
The object graphic data may represent data corresponding to the graphic of the data object, which may specifically include coordinate data of points included in the graphic of the data object, and the like.
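The wall/room example above — a hollow part enclosed by wall polygons becoming the room's object graphic data — can be sketched with basic polygon geometry. The shapes and the containment heuristic below are illustrative, not the patent's actual merging algorithm:

```python
# Geometry sketch: a polygon fully contained inside another is treated as a
# hollow part (candidate room). Uses the shoelace formula for area and a
# ray-casting point-in-polygon test; polygons are lists of (x, y) points.

def area(poly):
    """Absolute polygon area via the shoelace formula."""
    s = sum(x1 * y2 - x2 * y1 for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))
    return abs(s) / 2.0

def contains(outer, point):
    """Ray-casting point-in-polygon test."""
    x, y = point
    inside = False
    for (x1, y1), (x2, y2) in zip(outer, outer[1:] + outer[:1]):
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

wall_outline = [(0, 0), (10, 0), (10, 10), (0, 10)]
inner = [(2, 2), (8, 2), (8, 8), (2, 8)]
# All inner vertices lie inside the wall outline -> treat it as a hollow part:
is_hollow = all(contains(wall_outline, p) for p in inner)
print(is_hollow, area(inner))  # True 36.0
```

The resulting vertex list of the hollow part is exactly the kind of coordinate data the text describes as object graphic data for the room.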
In an alternative implementation manner of the present application, the processing unit may include an attribute processing unit, and the process of processing the CAD file may further include determining object attribute data corresponding to the data object by using the attribute processing unit.
The object attribute data may characterize object attributes of the data object. The object attribute data may include, but is not limited to, name, space, code, type, model, remark, rotation angle, floor height, and the like.
The attribute processing unit can determine object attribute data such as names, codes, types, models, remarks and the like corresponding to the data objects according to the text corresponding to the data objects. The attribute processing unit may obtain object attribute data from other processing units, such as a text processing unit or a table processing unit.
The text corresponding to the data object may include text contained in the layer name, and/or text contained in a table of data objects, and/or text contained within a graphical scope of the data object, and the like. Taking the data object as a door or window as an example, the table of data objects may comprise a door window table. Taking the data object as a gate as an example, the text contained within the graphical scope of the data object may be text that appears on the edges of the gate. The text may include text strings, and the types of the text strings may include, but are not limited to, chinese characters, letters, numbers, etc.
Text contained within the scope of the data object may also include text contained in an elevation view. Taking the data object as an example of the door and window, the default height of the door and window and the default height of the floor can be obtained according to the text contained in the vertical view of the door and window.
In one example, the name of the data object may be determined first based on text contained in the layer name, and object attribute data such as type, size, description, etc. of the data object may be determined based on text contained in a table of data objects in the layer. The model of the data object may also be determined based on text contained within the graphical scope of the data object. The embodiment of the application can also generate the codes of the data objects according to the information such as the positions of the data objects in the graph.
The attribute processing unit or the text processing unit can perform semantic analysis on the text corresponding to the data object to obtain object attribute data. The semantic analysis can obtain standardized object attribute data.
The embodiment of the application can adopt a regular expression to carry out semantic analysis on the text corresponding to the data object. For example, when the text of a data object in the warehouse floor scene is semantically parsed, the regular expression can identify spatial data included in the object attribute data, such as the warehouse name, the warehouse number, the floor number and the matched building. Taking the text "No. 2 warehouse, floor 1" as an example, the semantic analysis result may include the warehouse name "No. 2 warehouse", the warehouse number "2" and the floor number "1". Taking the text "No. 19 matched building, floor 5" as an example, the semantic analysis result may include the matched building name "No. 19 matched building", the building number "19" and the floor number "5".
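The regular-expression parsing described above might be sketched as follows; the pattern and the English-normalized input strings are illustrative assumptions, since the actual patterns depend on the text conventions of the CAD files:

```python
import re

# Hypothetical pattern assuming the text has been normalized to the form
# "No.<number> <kind> floor <number>"; real patterns would be tuned to
# the naming conventions actually used in the drawings.
PATTERN = re.compile(
    r"No\.(?P<num>\d+)\s+(?P<kind>warehouse|matched building)\s+floor\s+(?P<floor>\d+)"
)

def parse_space_text(text):
    """Extract spatial attribute data (name, number, floor) from layer text."""
    m = PATTERN.search(text)
    if not m:
        return None
    return {
        "name": f"No.{m.group('num')} {m.group('kind')}",
        "number": int(m.group("num")),
        "floor": int(m.group("floor")),
    }

print(parse_space_text("No.2 warehouse floor 1"))
# {'name': 'No.2 warehouse', 'number': 2, 'floor': 1}
print(parse_space_text("No.19 matched building floor 5"))
# {'name': 'No.19 matched building', 'number': 19, 'floor': 5}
```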
The embodiment of the application can adopt a classification method to determine the object attribute data corresponding to the data object. The classification method may include a classification dictionary method or a machine classification method, etc. The classification dictionary method can comprise a double-array dictionary tree method and the like, wherein the double-array dictionary tree method can utilize the common prefix of the character strings to reduce the expenditure of the query time so as to achieve the aim of improving the efficiency.
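A plain dict-based trie is enough to illustrate the classification-dictionary idea (a double-array trie is a memory- and lookup-optimized encoding of the same structure that exploits common prefixes); the dictionary terms and labels below are hypothetical:

```python
def build_trie(terms):
    """Build a nested-dict trie; '$' at a node stores that term's class label."""
    root = {}
    for term, label in terms.items():
        node = root
        for ch in term:
            node = node.setdefault(ch, {})
        node["$"] = label
    return root

def classify(trie, text):
    """Return the label of the longest dictionary term that prefixes `text`."""
    node, label = trie, None
    for ch in text:
        if ch not in node:
            break
        node = node[ch]
        label = node.get("$", label)
    return label

trie = build_trie({
    "office": "work space",
    "meeting room": "work space",
    "storage": "storage space",
})
print(classify(trie, "storage area B"))  # -> storage space
```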
Referring to Table 3, an illustration of determining a room type from a room type description is shown in accordance with one embodiment of the present application. The room type description may be text extracted from the CAD file, and the room type may be a standard type obtained by mapping that description.
TABLE 3
The object attribute data of the embodiment of the application may also include the building area, the usable (inner) area, the floor-area efficiency and the like of a room. Specifically, the embodiment of the application can calculate the building area, the usable area, the floor-area efficiency and other data of the room according to the object graphic data of the room and of the walls corresponding to the room.
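The area figures mentioned above can be derived from object graphic data with the shoelace formula; the outlines used below are hypothetical:

```python
def polygon_area(points):
    """Shoelace formula: area of a simple polygon given its vertices in order."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical object graphic data: a room outline inside the wall outline.
wall_outline = [(0, 0), (12, 0), (12, 10), (0, 10)]     # outer face of the walls
room_outline = [(0.5, 0.5), (11.5, 0.5), (11.5, 9.5), (0.5, 9.5)]

gross_area = polygon_area(wall_outline)    # building area
usable_area = polygon_area(room_outline)   # usable (inner) area
efficiency = usable_area / gross_area      # floor-area efficiency
print(round(gross_area, 2), round(usable_area, 2), round(efficiency, 3))
# 120.0 99.0 0.825
```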
Technical solution 2
In the technical scheme 2, the elements may include line elements, the processing unit may include a table processing unit and a text processing unit, and the processing process of the CAD file may specifically include determining a table corresponding to the line elements by using the table processing unit, and analyzing a text included in the table by using the text processing unit to obtain object attribute data corresponding to the data object.
Taking a data object as a door and window as an example, the table processing unit can determine a door and window table corresponding to the line element, and the text processing unit can perform semantic analysis on texts contained in the table to obtain object attribute data such as width, height, model and the like of the door and window. The semantic analysis can obtain standardized object attribute data.
In the embodiment of the application, the analysis unit can be utilized to acquire the elements contained in the CAD file.
In the embodiment of the application, the scene template can further comprise a filtering unit, and before the CAD file is processed, the method can further comprise the step of determining target graph data corresponding to a preset range and/or a preset graph layer from the CAD file by utilizing the filtering unit, wherein the analyzing unit is further used for acquiring elements contained in the target graph data.
In summary, the embodiment of the application processes the CAD file by using the scene template, and can obtain the object data corresponding to the data object. The object data may include object graphic data and object attribute data.
The object graphic data may be graphic data corresponding to a data object such as a shelf. Object graphics data may be used for presentation of data objects. In other words, presentation of data objects may be achieved from object graphics data.
The object attribute data may characterize object attributes of the data object. The object attribute data may include, but is not limited to, name, code, type, model, remark, rotation angle, floor height, and the like. Object attribute data may be used for analysis of data objects; in other words, analysis of a data object may be achieved from its object attribute data. The floor height may be the height of the bottom surface of the data object relative to the ground, or the height of the floor where the data object is located.
Referring to Table 4, examples of object data of a data object of an embodiment of the present application are shown, wherein the data object may be a window, the object data may be structured data, and fields of the structured data may include a name, a code, a type, a model, a remark, object graphic data, a rotation angle, a floor height, an object height, and the like.
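Since the contents of Table 4 are not reproduced here, the following dataclass is only a hypothetical sketch of such a structured record: the field names mirror the fields listed above, and the sample values are invented.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class ObjectData:
    """Hypothetical structured record mirroring the fields of Table 4."""
    name: str
    code: str
    type: str
    model: str = ""
    remark: str = ""
    graphic_points: list = field(default_factory=list)  # object graphic data
    rotation_angle: float = 0.0
    floor_height: float = 0.0
    object_height: float = 0.0

window = ObjectData(
    name="window-01", code="W-001", type="window", model="C1215",  # invented values
    graphic_points=[(0, 0), (1.2, 0), (1.2, 1.5), (0, 1.5)],
    floor_height=0.9, object_height=1.5,
)
record = asdict(window)  # plain dict, ready for database storage or JSON export
print(record["code"], record["object_height"])  # W-001 1.5
```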
A process for processing a CAD file in the event that a door or window is contained in the CAD file is provided herein. Assuming that the analysis scene corresponding to the CAD file is a matched building of the warehouse, the CAD file can be processed by utilizing a scene template corresponding to the matched building of the warehouse.
In one example, a scene template corresponding to a warehouse building may include a parsing unit and a processing unit, where the processing unit may include a graphics processing unit, a data object processing unit, a table processing unit, an elevation view processing unit, an attribute processing unit, and so on.
Taking a data object as an example of a door and window, the parsing unit is used for obtaining elements such as line elements from the CAD file. The graphic processing unit is used for acquiring the line elements from the parsing unit and converting the line elements into polygons. The data object processing unit is used for acquiring object graphic data of the doors and windows according to the polygons. The table processing unit is used for determining a door and window table corresponding to the line elements, and carrying out semantic analysis on texts contained in the table by utilizing the text processing unit so as to obtain the width, the height, the model and the like of the door and window. The elevation view processing unit can be used for processing the elevation view of the door and window to obtain the default height of the door and window and the default height of the floor. The attribute processing unit is used for acquiring object graphic data of the doors and windows from the data object processing unit and acquiring object attribute data of the doors and windows such as width, height, model and the like from the table processing unit. In the case where the table processing unit does not provide the width, height, etc. data of the door and window, the attribute processing unit may acquire the default height of the door and window from the elevation view processing unit as the door and window height.
In step 104, object data corresponding to the data object may be output for use by other devices or other systems. The object data may be structured data, and a database may be used to store and output the structured data corresponding to the object data.
In practical applications, a plurality of data objects may be contained in a CAD file. According to the embodiment of the application, the object data corresponding to each of the plurality of data objects can be determined according to the steps 102 and 103.
In summary, according to the graphic processing method of the embodiment of the application, the scene template is utilized to process the elements contained in the CAD file so as to obtain the object data corresponding to the data object. The scene template can process the elements contained in the CAD file by adopting a computer technology, so that the embodiment of the application can save the cost of manual measurement in CAD software and avoid measurement errors, in other words, the embodiment of the application can save the labor cost of graphic processing and can improve the accuracy of graphic processing.
In addition, the scene template of the embodiment of the application can comprise an analysis unit and a processing unit. Wherein, different processing results can be realized by different combinations of processing units or different combinations of processing units and analyzing units. For example, the same processing unit can be matched with different analyzing units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and wide scene application range.
In addition, the scene template of the embodiment of the application can comprise a processing unit, an analyzing unit and a filtering unit. Wherein, different processing results can be realized by different combinations of processing units, or different combinations of processing units and analyzing units, or different combinations of processing units, analyzing units and filtering units. For example, the same processing unit may be matched with different parsing units or filtering units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and wide scene application range.
In addition, one parsing unit or one processing unit may be applied to a variety of scene templates; in other words, different scene template options may use the same parsing unit or processing unit. Because parsing units and processing units are reusable in this way, the embodiment of the application can further save the cost of graphics processing.
In addition, the embodiment of the application can provide the configuration items corresponding to the processing units for the user to configure. For example, the first configuration item is used to configure whether to generate a code corresponding to the data object. As another example, the second configuration item is used to configure whether to merge adjacent graphics, and so on. The configuration result of the configuration item can control the processing logic corresponding to the processing unit, so as to control the processing result corresponding to the processing unit.
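A minimal sketch of how configuration items might gate a processing unit's logic; the item names generate_code and merge_adjacent follow the examples above, but the mechanism shown is an assumption:

```python
def process(polygons, config):
    """Run a toy processing unit whose behavior is gated by config items."""
    result = {"polygons": list(polygons)}
    if config.get("merge_adjacent"):
        # Placeholder for the real adjacent-graphics merge step.
        result["merged"] = True
    if config.get("generate_code"):
        # Generate a code per data object, as the first configuration item allows.
        result["codes"] = [f"OBJ-{i:03d}" for i in range(len(polygons))]
    return result

print(process([[(0, 0)], [(1, 1)]], {"generate_code": True}))
# {'polygons': [[(0, 0)], [(1, 1)]], 'codes': ['OBJ-000', 'OBJ-001']}
```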
Method embodiment II
Referring to fig. 5, a flowchart illustrating steps of a graphics processing method according to an embodiment of the present application may specifically include the following steps:
Step 501, determining object data corresponding to a CAD file, wherein the object data can be structured data;
Step 502, performing preset processing according to the object data, wherein the preset processing specifically comprises generating a ledger or a visual model or executing warehouse operation;
The method comprises the steps of determining a scene template corresponding to a CAD file, processing the CAD file by using the scene template to obtain object data corresponding to a data object contained in the CAD file, wherein the scene template comprises an analysis unit and a processing unit, the analysis unit is used for analyzing the CAD file to obtain elements contained in the CAD file, and the processing unit is used for processing the elements contained in the CAD file to obtain the object data corresponding to the data object.
The embodiment of the method shown in fig. 5 may be used to illustrate a specific application of the object data corresponding to the CAD file. The object data can be structured data, so that the embodiment of the application can play a role in digitizing the data objects such as building units, equipment and the like contained in the CAD file. The object data can provide basic data for applications such as asset accounting, operation management, layout planning, data analysis and the like.
The embodiment of the application can perform preset processing according to the object data. The above-mentioned preset process can be determined by those skilled in the art according to the actual application requirements. For example, the pre-set process may specifically include generating a ledger or visualization model, or performing warehouse operations.
The ledger refers to an account book or a spreadsheet file for recording fixed assets; it records detailed information such as the purchase time, price, and responsible person or department of each fixed asset.
In a specific implementation, the embodiment of the application can generate the corresponding standing book aiming at the data objects such as the window, the lifting door, the vertical hinged door and the like. Fields of the ledger may originate from object data of the data object. In other words, a data record of the ledger may be generated from object data of a data object.
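Generating a ledger data record from object data might look like the following sketch; the field mapping is an assumption based on the ledger fields listed in this section:

```python
def to_ledger_record(obj):
    """Map an object-data dict to a ledger record (field names assumed)."""
    return {
        "space": obj.get("space", ""),
        "code": obj["code"],
        "name": obj["name"],
        "type": obj["type"],
        "remark": obj.get("remark", ""),
    }

# Invented sample object data for a window.
window = {"name": "window-01", "code": "W-001", "type": "window",
          "space": "No.2 warehouse floor 1"}
print(to_ledger_record(window))
```

One ledger row per data object keeps the ledger in sync with the CAD-derived object data.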
Referring to table 5, an example of a ledger for a window of one embodiment of the present application is shown, where fields of the ledger may include a space to which it belongs, a code, a name, a type, a remark, etc.
TABLE 5
The visualization model may be a two-dimensional model or a three-dimensional model. The generated two-dimensional model or three-dimensional model can be used for carrying out visual display on the building corresponding to the CAD file, and can be applied to application scenes such as visual renting of houses and presentation of Internet of things equipment.
In one example, the two-dimensional model or the three-dimensional model may be a map model. Map components corresponding to the data objects may be included in the map model. The map component may include a standard map component. The information of the standard map component may include information such as name, coding, rendering mode, drawing mode, spatial attribute, etc.
The map component can also include a custom map component. Custom map components add custom properties relative to standard map components. For example, a standard shelf corresponds to a standard map component, while a user-created shelf corresponds to a custom map component.
In a specific implementation, a three-dimensional model may be presented. For example, corresponding map elements may be generated from map components. A map element may be an instance of a map component; tree-shaped relations may be formed between map elements, and a user may configure, on a map element, attribute values for attribute information such as the name, code, rendering mode, spatial attributes, drawing mode and custom attributes. In this way, a map element can be rendered and displayed according to the rendering mode of its corresponding map component.
The warehouse operation may involve any operation link from the receiving of the warehouse object such as materials, commodities and the like to the delivering of the warehouse object. Accordingly, warehouse operations may include receiving, warehousing, restocking, picking, gathering, packaging, sorting, inventory, and the like.
The picking operation can be a process of conveying storage objects such as target commodities in a goods shelf to a picking workstation according to order requirements.
Examples of determination of picking information are provided herein, including in particular:
A1, placing an order to a transaction system by a user, and generating the order by the transaction system;
step A2, the transaction system sends the order to the order system;
The order may include the SKU (Stock Keeping Unit) of the commodity and the demand quantity for that SKU.
A3, establishing a wave order task aiming at an order by the order system;
The order system establishes a wave order task for the commodity orders: a plurality of orders are gathered so that picking can be performed as one job, and the batch corresponding to such a job is generally called a wave order task in the industry.
The embodiment of the application can combine and classify the received at least one order according to different dimensions to obtain at least one wave order task, wherein the dimensions can include at least one of commodity category, commodity name, consignee, warehouse area, outbound type, carrier, order cut-off time and order priority.
For example, multiple orders over a period of time may be aggregated into one wave order task. For another example, multiple orders with the same order structure may be aggregated into one wave order task, where the information of the order structure may include the commodity category, commodity name, consignee, warehouse area, outbound type, carrier, order cut-off time or order priority, etc. The summarizing process is equivalent to aggregating orders of a certain batch, so that subsequent inventory-occupation actions can be performed according to the dimension of the wave order task.
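The aggregation by dimensions can be sketched as a simple group-by, where the dimension names are illustrative:

```python
from collections import defaultdict

def build_wave_tasks(orders, dims):
    """Group orders that agree on the chosen dimensions into one wave task."""
    waves = defaultdict(list)
    for order in orders:
        key = tuple(order[d] for d in dims)  # one wave task per distinct key
        waves[key].append(order["order_id"])
    return dict(waves)

orders = [
    {"order_id": 1, "carrier": "X", "warehouse_area": "A"},
    {"order_id": 2, "carrier": "X", "warehouse_area": "A"},
    {"order_id": 3, "carrier": "Y", "warehouse_area": "A"},
]
print(build_wave_tasks(orders, ["carrier", "warehouse_area"]))
# {('X', 'A'): [1, 2], ('Y', 'A'): [3]}
```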
Step A4, the order system sends an inventory occupation command according to a preset picking strategy;
After the wave order task is established, the order system can determine from which stock locations in the warehouse system the SKUs in the order are to be picked, to obtain first stock location information; or, according to a preset inventory-clearing rule, the order system can call the warehouse system to occupy stock for the SKUs in the order and thereby determine which stock locations are occupied by the SKUs, to obtain second stock location information. This process is called inventory occupation. For example, if 10 units of SKU1 are required in the order, the order system will inform the warehouse system to occupy 10 units of SKU1 at the stock location corresponding to SKU1.
The picking strategy is a classification strategy defined according to the picking operation mode or the package structure of the warehouse system. For example, orders may be classified according to the item count and piece count of the goods, such as single-item single-piece, single-item multi-piece and multi-item multi-piece; the goods may also be classified according to package structure properties, such as fragile goods or goods that must be placed face up.
And step A5, the warehouse system performs the warehouse position occupation of the SKU.
The warehouse system can perform the inventory occupation corresponding to the stock keeping units according to the first stock location information or the second stock location information generated by the order system. After the warehouse system occupies the stock locations successfully, the first stock location information or the second stock location information is sent to the order system.
And A6, the order system generates a picking order or a picking task according to the storage position occupation information returned by the warehouse system.
The order or order may include information such as the name, quantity, and occupancy of the target commodity.
And A7, the order system sends the picking order or the picking task to the warehouse system.
And A8, the warehouse system picks the target commodity according to the picking order or the picking task.
According to the embodiment of the application, warehouse operations such as picking operations are executed according to the object data, and the execution efficiency of warehouse operations such as picking operations can be improved.
The process of executing the picking operation in the embodiment of the application may include: determining the shelf distance between any two shelves according to the object data corresponding to the two shelves, and determining the target picking path corresponding to the picking task according to the shelf distances. The target picking path may be a picking path between the task start point and the task end point that meets a predetermined condition. The task end point may be a picking workstation.
Assuming that the picking order requires handling of storage devices for the A, B, C, D, E, F items, the storage devices may be bins or the like. Assuming that the storage devices corresponding to A, B, C, D, E, F types of commodities are respectively located on different shelves, the embodiment of the application can conduct path planning according to the shelf distance between any two shelves so as to obtain the carrying sequence corresponding to A, B, C, D, E, F types of commodities, and therefore the path sequence corresponding to the target picking path can be obtained.
In practical applications, the task starting point may be a position point corresponding to a shelf where a commodity is located. For example, a first sorting may be performed in order of increasing distance between the shelf where a commodity is located and the task end point, and the position point corresponding to a shelf ranked in the top X of the first sorting result may be used as the task start point, where X may be a positive integer such as 1. Of course, the task starting point may be determined in other manners; it is understood that embodiments of the present application are not limited to a specific task starting point.
The predetermined condition may be a preset condition used to constrain the path length. In this way, the embodiment of the application can determine the carrying sequence among the plurality of commodities contained in the picking task according to path lengths that satisfy the predetermined condition.
In embodiments of the present application, the route points may correspond to shelf location points other than the task start point. For example, if the shelf location point corresponding to the product a is used as the task start point, the shelf location point corresponding to the product B, C, D, E, F or the like may be used as the route point.
The predetermined condition may be used to constrain the path length. For example, the predetermined condition may be that the path length is smaller than a length threshold; or, in the case of performing a second sorting of the path lengths of a plurality of candidate picking paths in order from small to large, that the path length of the target picking path is ranked in the top Y of the second sorting result, where Y may be a positive integer such as 1.
The embodiment of the application does not limit the specific path planning method. For example, the embodiments of the present application may use a graph search method, a rapidly-exploring random tree (RRT) method, or the like. The graph search method may include a visibility graph method or the Dijkstra method, etc. The Dijkstra method may perform multiple rounds of searching: in one round, the unvisited position point (a route point or the task end point) closest to the task start point is found among the position points, that point is taken as an intermediate position point, and the distances from the task start point to the other position points are updated through it, until all position points have served as intermediate position points.
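For reference, the graph-search idea can be sketched with a compact Dijkstra implementation over a weighted adjacency dict; the graph below is a toy example rather than warehouse data:

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from `start` to every reachable node."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

graph = {
    "start": {"A": 2, "B": 5},
    "A": {"B": 1, "end": 7},
    "B": {"end": 3},
}
print(dijkstra(graph, "start"))  # {'start': 0, 'A': 2, 'B': 3, 'end': 6}
```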
In the path planning process, the embodiment of the application can determine the path length of the picking path according to the shelf distance between any two shelves, so that the target picking path meeting the preset condition can be determined from a plurality of picking paths.
For example, candidate picking paths include A→B→C→D→E→F→task end point, A→C→B→D→F→E→task end point, and the like.
In practical application, the shelf distance between shelf A and shelf B may be determined according to the point coordinates included in the object data corresponding to the two shelves, such as the point coordinates corresponding to shelf A and shelf B respectively. Further, a shelf distance matrix can be generated according to the shelf distance between any two shelves, where each element in the shelf distance matrix represents the shelf distance between two shelves.
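Building the shelf distance matrix from point coordinates can be sketched as follows, assuming one representative coordinate per shelf:

```python
import math

def shelf_distance_matrix(shelf_points):
    """Pairwise Euclidean distances between shelf reference points."""
    names = list(shelf_points)
    return {
        (a, b): math.dist(shelf_points[a], shelf_points[b])
        for a in names for b in names
    }

# Invented shelf coordinates (e.g. one vertex from each shelf's object data).
shelves = {"A": (0.0, 0.0), "B": (3.0, 4.0), "C": (6.0, 8.0)}
m = shelf_distance_matrix(shelves)
print(m[("A", "B")], m[("B", "C")], m[("A", "C")])  # 5.0 5.0 10.0
```

The path length of a candidate picking path is then the sum of the matrix entries along its shelf sequence.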
In summary, the graphics processing method of the embodiment of the application can output corresponding object data aiming at the CAD file meeting the preset specification, so that one-key importing and one-key analyzing of the CAD file can be realized.
And the object data corresponding to the output data object can be used for a two-dimensional scene and/or a three-dimensional scene.
In addition, the scene template of the embodiment of the application can comprise a processing unit, an analyzing unit and a filtering unit. Wherein, different processing results can be realized by different combinations of processing units, or different combinations of processing units and analyzing units, or different combinations of processing units, analyzing units and filtering units. For example, the same processing unit may be matched with different parsing units or filtering units to obtain different processing results. Therefore, the structure of the scene template provided by the embodiment of the application has the advantages of high flexibility and wide scene application range.
Furthermore, the embodiment of the application can express the object data by using the structured data, so that the object data can be directly used, and secondary development is facilitated.
Furthermore, the embodiment of the application can generate the standing book aiming at the data objects such as rooms, equipment and the like, and the fields of the standing book can comprise codes, names, attributes and the like, so that the asset management is convenient.
In addition, the embodiment of the application can combine the processing unit, the analysis unit and the filtering unit to quickly obtain the scene template corresponding to the analysis scene.
Under the condition that the structure of the scene template of the embodiment of the application has the advantages of high flexibility and wide scene application range, the embodiment of the application can be suitable for a plurality of technical fields using CAD.
In addition, the embodiment of the application can realize the self-defining function by utilizing the self-defining processing module.
The embodiment of the application can reduce the cost of three-dimensional modeling. Specifically, the embodiment of the application can acquire the object height and the bottom surface height of the data objects such as doors and windows, floors, interlayers and the like by using the elevation processing unit, so that the three-dimensional data of the data objects can be determined according to the two-dimensional data (such as the point coordinates of the graph) and the height data (the object height and the bottom surface height).
The embodiment of the application does not limit the graphic elements in the CAD file. Specifically, the embodiment of the application can process graphic elements such as lines, points, polygons and the like, and can process graphic elements such as blocks, nested blocks, stretching blocks, visibility blocks, ellipses, circular arcs and the like.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the application.
System embodiment
Referring to FIG. 6, there is shown a block diagram of a graphics processing system in accordance with one embodiment of the present application, the system specifically comprising a graphics processing apparatus 601 and a data object 602 in a warehouse;
the graphics processing apparatus 601 is configured to perform the foregoing method, determine object data corresponding to a data object for a CAD file including the data object, output object data corresponding to the data object 602, and perform a preset process according to the object data.
In actual practice, the data objects 602 in the warehouse may include rooms, equipment, and the like. Examples of equipment may include shelves and the like.
In a specific implementation, the graphic processing device 601 may receive a CAD file, determine a scene template corresponding to the CAD file, process the CAD file with the scene template to obtain object data corresponding to a data object included in the CAD file, where the scene template includes an analysis unit and a processing unit, the analysis unit is configured to analyze the CAD file to obtain elements included in the CAD file, and the processing unit is configured to process the elements included in the CAD file to obtain object data corresponding to the data object.
In a specific implementation, the elements comprise graphic elements, and the processing unit comprises a graphic processing unit and a data object processing unit;
The processing of the CAD file comprises the steps of determining the graph corresponding to the graph element by utilizing the graph processing unit and processing the graph corresponding to the graph element by utilizing the data object processing unit so as to obtain object graph data corresponding to the data object.
In a specific implementation, the elements include line elements, and the processing unit includes a table processing unit and a text processing unit;
processing the CAD file includes: using the table processing unit to determine the table corresponding to the line elements, and using the text processing unit to parse the text contained in the table to obtain the object attribute data corresponding to the data object.
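As a hedged illustration of the text-processing step, the sketch below assumes the table processing unit has already grouped the line elements into (header, value) rows, as in a drawing's equipment schedule; the helper name and the key-normalization rules are invented for the example.

```python
# Hypothetical text-processing unit: rows of (header, value) text cells are
# turned into object attribute data with normalized keys.
def parse_attribute_table(rows: list) -> dict:
    """Turn table rows into an attribute dict, e.g. 'Load Limit' -> 'load_limit'."""
    attributes = {}
    for header, value in rows:
        key = header.strip().lower().replace(" ", "_")
        attributes[key] = value.strip()
    return attributes

attrs = parse_attribute_table([
    ("Code", " SHELF-001 "),
    ("Name", "Storage shelf"),
    ("Load Limit", "500 kg"),
])
```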
In a specific implementation, the scene template further includes a filtering unit. The filtering unit determines, from the CAD file, the target graphic data corresponding to a preset range and/or a preset layer, and the parsing unit then obtains the elements contained in that target graphic data.
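A minimal sketch of such a filtering unit might look as follows, assuming entities are plain dicts carrying a layer name and a representative point; the function name, the entity shape, and the rectangular-range convention are illustrative only.

```python
# Hypothetical filtering unit: keep only entities on preset layers and/or
# inside a preset rectangular range, before the parsing unit sees them.
def filter_entities(entities, layers=None, bbox=None):
    """entities: dicts with 'layer' and 'point' (x, y); bbox: (xmin, ymin, xmax, ymax)."""
    kept = []
    for e in entities:
        if layers is not None and e["layer"] not in layers:
            continue  # entity is not on a preset layer
        if bbox is not None:
            x, y = e["point"]
            xmin, ymin, xmax, ymax = bbox
            if not (xmin <= x <= xmax and ymin <= y <= ymax):
                continue  # entity falls outside the preset range
        kept.append(e)
    return kept

entities = [
    {"layer": "ROOMS", "point": (5, 5)},
    {"layer": "NOTES", "point": (5, 5)},
    {"layer": "ROOMS", "point": (50, 50)},
]
target = filter_entities(entities, layers={"ROOMS"}, bbox=(0, 0, 10, 10))
```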
In a specific implementation, the determining the scene template corresponding to the CAD file specifically includes presenting at least one scene template option, receiving a target scene template option selected by a user, and determining a scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option.
In a specific implementation, the process of obtaining a scene template includes: creating a scene template option, determining the parsing unit and processing unit corresponding to that option, and saving the mapping relationship between the parsing unit, the processing unit, and the scene template option.
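The save-the-mapping step can be pictured as a small registry keyed by scene template option. The registry layout and the unit names below are assumptions for illustration, not the application's actual data model.

```python
# Hypothetical registry: stores the mapping between a scene template option
# and its parsing/processing units, then resolves the user's selection.
TEMPLATE_REGISTRY = {}

def register_template(option, parse_unit, process_units):
    """Save the mapping for one scene template option."""
    TEMPLATE_REGISTRY[option] = {
        "parse_unit": parse_unit,
        "process_units": list(process_units),
    }

def resolve_template(option):
    """Look up the units for the option the user selected."""
    if option not in TEMPLATE_REGISTRY:
        raise KeyError(f"no scene template registered for option {option!r}")
    return TEMPLATE_REGISTRY[option]

register_template("warehouse", "dxf_parser", ["graphic", "table", "elevation"])
register_template("floorplan", "dxf_parser", ["graphic"])
chosen = resolve_template("warehouse")
```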
In a specific implementation, the object data may be structured data, and the graphics processing apparatus 601 may further perform a preset process according to the object data, where the preset process includes generating a ledger or a visual model, or executing a warehouse job.
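As an illustration of the "generate a ledger" preset process, the sketch below writes structured object data out as CSV ledger rows; the field names (`code`, `name`, `attrs`) are hypothetical, chosen only to match the ledger fields mentioned later in this description.

```python
import csv
import io

# Sketch: structured object data (one dict per data object) becomes ledger
# rows with code/name/attribute columns.
def generate_ledger(objects):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["code", "name", "attributes"])
    writer.writeheader()
    for obj in objects:
        writer.writerow({
            "code": obj.get("code", ""),
            "name": obj.get("name", ""),
            # flatten the attribute dict into "k=v" pairs for one cell
            "attributes": ";".join(f"{k}={v}" for k, v in obj.get("attrs", {}).items()),
        })
    return buf.getvalue()

ledger = generate_ledger([
    {"code": "SHELF-001", "name": "Storage shelf", "attrs": {"load": "500kg"}},
])
```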
In summary, the graphics processing system of the embodiment of the application can output corresponding object data for a CAD file that conforms to a preset specification, enabling one-click import and one-click parsing of the CAD file.
Moreover, the object data output for the data objects can be used in two-dimensional and/or three-dimensional scenes.
In addition, the scene template of the embodiment of the application can include a processing unit, a parsing unit, and a filtering unit. Different processing results can be achieved through different combinations of processing units; of processing units and parsing units; or of processing units, parsing units, and filtering units. For example, the same processing unit can be paired with different parsing units or filtering units to produce different processing results. The structure of the scene template provided by the embodiment of the application therefore offers high flexibility and a wide range of applicable scenarios.
Furthermore, the embodiment of the application can represent the object data as structured data, so that the object data can be used directly, which facilitates secondary development.
Furthermore, the embodiment of the application can generate a ledger for data objects such as rooms and equipment. The fields of the ledger can include codes, names, attributes, and the like, which facilitates asset management.
In addition, the embodiment of the application can combine the processing unit, parsing unit, and filtering unit to quickly obtain the scene template corresponding to a given parsing scenario.
Because the structure of the scene template of the embodiment of the application offers high flexibility and a wide range of applicable scenarios, the embodiment of the application can be applied in many technical fields that use CAD.
In addition, the embodiment of the application can provide customization through a custom processing module.
The embodiment of the application can reduce the cost of three-dimensional modeling. Specifically, an elevation processing unit can obtain the object height and base height of data objects such as doors, windows, floors, and mezzanines, so that the three-dimensional data of a data object can be determined from its two-dimensional data (such as the point coordinates of its graphic) and its height data (the object height and base height).
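The 2D-plus-height reconstruction described here amounts to extruding a footprint polygon between two elevations. The sketch below assumes that convention; the function and its argument names are illustrative, not taken from the patent.

```python
# Sketch of recovering 3D data from 2D data plus height data: a 2D footprint
# polygon plus a base elevation and an object height give the bottom and top
# vertex rings of a prism.
def extrude_footprint(points_2d, base_height, object_height):
    """points_2d: [(x, y), ...]; returns (bottom_ring, top_ring) of 3D points."""
    bottom = [(x, y, base_height) for x, y in points_2d]
    top = [(x, y, base_height + object_height) for x, y in points_2d]
    return bottom, top

# A window: rectangular footprint, sill (base) at 0.9 m, window 1.5 m tall.
bottom, top = extrude_footprint([(0, 0), (2, 0), (2, 0.1), (0, 0.1)], 0.9, 1.5)
```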
The embodiment of the application does not limit the graphic elements in the CAD file. Specifically, it can process graphic elements such as lines, points, and polygons, as well as blocks, nested blocks, stretch blocks, visibility blocks, ellipses, circular arcs, and the like.
The embodiment of the application also provides a non-volatile readable storage medium storing one or more modules (programs). When the one or more modules are applied to a device, they can cause the device to execute the instructions of each method step in the embodiments of the application.
Embodiments of the application provide one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an electronic device to perform a method as described in one or more of the above embodiments. In the embodiment of the application, the electronic equipment comprises a server, terminal equipment and other equipment.
Embodiments of the present disclosure may be implemented, using any suitable hardware, firmware, software, or any combination thereof, as an apparatus in a desired configuration; the apparatus may include a server (cluster), a terminal, or the like. Fig. 7 schematically illustrates an example apparatus 1700 that may be used to implement various embodiments described in the present disclosure.
For one embodiment, FIG. 7 illustrates an example apparatus 1700 having one or more processors 1702, a control module (chipset) 1704 coupled to at least one of the processor(s) 1702, a memory 1706 coupled to the control module 1704, a non-volatile memory (NVM)/storage device 1708 coupled to the control module 1704, one or more input/output devices 1710 coupled to the control module 1704, and a network interface 1712 coupled to the control module 1704.
The processor 1702 may include one or more single-core or multi-core processors, and the processor 1702 may include any combination of general-purpose or special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In some embodiments, the apparatus 1700 can be used as a server, a terminal, or the like in the embodiments of the present application.
In some embodiments, the apparatus 1700 may include one or more computer-readable media (e.g., memory 1706 or NVM/storage 1708) having instructions 1714 and one or more processors 1702 combined with the one or more computer-readable media configured to execute the instructions 1714 to implement the modules to perform the actions described in this disclosure.
For one embodiment, the control module 1704 may include any suitable interface controller to provide any suitable interface to at least one of the processor(s) 1702 and/or any suitable device or component in communication with the control module 1704.
The control module 1704 may include a memory controller module to provide an interface to the memory 1706. The memory controller modules may be hardware modules, software modules, and/or firmware modules.
Memory 1706 may be used to load and store data and/or instructions 1714 for device 1700, for example. For one embodiment, memory 1706 may include any suitable volatile memory, such as a suitable DRAM. In some embodiments, memory 1706 may comprise double data rate fourth-generation synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, the control module 1704 may include one or more input/output controllers to provide interfaces to the NVM/storage 1708 and the input/output device(s) 1710.
For example, NVM/storage 1708 may be used to store data and/or instructions 1714. NVM/storage 1708 may include any suitable nonvolatile memory (e.g., flash memory) and/or may include any suitable nonvolatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 1708 may include a storage resource as part of the device on which apparatus 1700 is installed or may be accessible by the device without necessarily being part of the device. For example, NVM/storage 1708 may be accessed over a network via input/output device(s) 1710.
The input/output device(s) 1710 may provide an interface for the apparatus 1700 to communicate with any other suitable device; the input/output device 1710 may include a communication component, an audio component, a sensor component, and the like. The network interface 1712 may provide the device 1700 with an interface to communicate over one or more networks. The device 1700 may communicate wirelessly with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols, for example accessing a wireless network based on communication standards such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof.
For one embodiment, at least one of the processor(s) 1702 may be packaged together with logic of one or more controllers (e.g., memory controller modules) of the control module 1704. For one embodiment, at least one of the processor(s) 1702 may be packaged together with logic of one or more controllers of the control module 1704 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 1702 may be integrated on the same die as logic of one or more controllers of the control module 1704. For one embodiment, at least one of the processor(s) 1702 may be integrated on the same die as logic of one or more controllers of the control module 1704 to form a system on a chip (SoC).
In various embodiments, the apparatus 1700 may be, but is not limited to being, a terminal device such as a server, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, the device 1700 may have more or fewer components and/or different architectures. For example, in some embodiments, the apparatus 1700 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and a speaker.
The device 1700 may employ a main control chip as the processor or control module; sensor data, location information, and the like may be stored in the memory or NVM/storage device; the sensor group may serve as the input/output device; and the communication interface may include the network interface.
The embodiment of the application also provides electronic equipment, which comprises a processor and a memory, wherein executable codes are stored on the memory, and when the executable codes are executed, the processor is caused to execute the method according to one or more of the embodiments of the application.
Embodiments of the application also provide one or more machine-readable media having stored thereon executable code that, when executed, causes a processor to perform a method as described in one or more of the embodiments of the application.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described by differences from other embodiments, and identical and similar parts between the embodiments are all enough to be referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable graphics processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable graphics processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable graphics processing terminal apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable graphics processing terminal apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the application.
Finally, it is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or terminal device comprising that element.
The graphics processing method, graphics processing system, electronic device, and storage medium provided by the present application have been described in detail above, and specific examples have been used herein to explain the principles and embodiments of the present application. The description of the above examples is intended only to aid understanding of the method and core concept of the present application; meanwhile, those of ordinary skill in the art may, in light of these teachings, make changes to the specific embodiments and application scope. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (9)

1. A graphics processing method, characterized in that the method comprises:
receiving a CAD file;
determining a scene template corresponding to the CAD file, wherein determining the scene template corresponding to the CAD file comprises: displaying at least one scene template option; receiving a target scene template option selected by a user; and determining the scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option;
processing the CAD file using the scene template to obtain object data corresponding to data objects contained in the CAD file, wherein the scene template comprises a parsing unit and a processing unit; the parsing unit is configured to parse the CAD file to obtain elements contained in the CAD file; the processing unit is configured to process the elements contained in the CAD file to obtain the object data corresponding to the data objects; and the parsing unit converts the CAD file into drawing exchange format data and, according to the CAD format specification, extracts element data from the drawing exchange format data; and
outputting the object data corresponding to the data objects.
2. The method according to claim 1, wherein the elements comprise graphic elements, and the processing unit comprises a graphics processing unit and a data object processing unit;
processing the CAD file comprises:
using the graphics processing unit to determine the graphic corresponding to a graphic element; and
using the data object processing unit to process the graphic corresponding to the graphic element to obtain object graphic data corresponding to the data object.
3. The method according to claim 1, wherein the elements comprise line elements, and the processing unit comprises a table processing unit and a text processing unit;
processing the CAD file comprises:
using the table processing unit to determine the table corresponding to the line elements; and
using the text processing unit to parse text contained in the table to obtain object attribute data corresponding to the data object.
4. The method according to claim 1, wherein the scene template further comprises a filtering unit; the filtering unit is configured to determine, from the CAD file, target graphic data corresponding to a preset range and/or a preset layer; and the parsing unit is further configured to obtain elements contained in the target graphic data.
5. The method according to any one of claims 1 to 4, wherein the process of obtaining the scene template comprises:
creating a scene template option;
determining the parsing unit and the processing unit corresponding to the scene template option; and
saving the mapping relationship between the parsing unit, the processing unit, and the scene template option.
6. A graphics processing method, characterized in that the method comprises:
determining object data corresponding to a CAD file, the object data being structured data; and
performing preset processing according to the object data, the preset processing comprising: generating a ledger or a visualization model, or executing a warehouse job;
wherein determining the object data corresponding to the CAD file comprises: determining a scene template corresponding to the CAD file; and processing the CAD file using the scene template to obtain the object data corresponding to data objects contained in the CAD file; the scene template comprises a parsing unit and a processing unit; the parsing unit is configured to parse the CAD file to obtain elements contained in the CAD file; the processing unit is configured to process the elements contained in the CAD file to obtain the object data corresponding to the data objects; the parsing unit converts the CAD file into drawing exchange format data and, according to the CAD format specification, extracts element data from the drawing exchange format data; and determining the scene template corresponding to the CAD file comprises: displaying at least one scene template option; receiving a target scene template option selected by a user; and determining the scene template corresponding to the CAD file according to the scene template corresponding to the target scene template option.
7. A graphics processing system, characterized in that the system comprises: a graphics processing apparatus and data objects in a warehouse;
wherein the graphics processing apparatus is configured to perform the method according to any one of claims 1 to 6: for a CAD file containing the data objects, determine the object data corresponding to the data objects, output the object data corresponding to the data objects, and perform preset processing according to the object data.
8. An electronic device, characterized in that it comprises: a processor; and
a memory having executable code stored thereon which, when executed, causes the processor to perform the method according to any one of claims 1 to 6.
9. One or more machine-readable media having executable code stored thereon which, when executed, causes a processor to perform the method according to any one of claims 1 to 6.
CN202311050958.7A 2023-08-18 2023-08-18 Graphics processing methods, systems, devices and media Active CN117197212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311050958.7A CN117197212B (en) 2023-08-18 2023-08-18 Graphics processing methods, systems, devices and media

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311050958.7A CN117197212B (en) 2023-08-18 2023-08-18 Graphics processing methods, systems, devices and media

Publications (2)

Publication Number Publication Date
CN117197212A CN117197212A (en) 2023-12-08
CN117197212B true CN117197212B (en) 2025-11-25

Family

ID=88995218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311050958.7A Active CN117197212B (en) 2023-08-18 2023-08-18 Graphics processing methods, systems, devices and media

Country Status (1)

Country Link
CN (1) CN117197212B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112270756A (en) * 2020-11-24 2021-01-26 山东汇颐信息技术有限公司 Data rendering method applied to BIM model file
CN113901550A (en) * 2021-09-30 2022-01-07 万翼科技有限公司 Assembly building BIM model generation method and related equipment

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2923333C (en) * 2008-02-13 2018-12-18 Ice Edge Business Solutions Ltd. Rendering and modifying cad design entities in object-oriented applications
US9235650B2 (en) * 2012-09-27 2016-01-12 Siemens Product Lifecycle Management Software Inc. Efficient conversion of XML data into a model using persistent stores and parallelism
CN103425825A (en) * 2013-08-02 2013-12-04 苏州两江科技有限公司 3D supermarket displaying method based on CAD graphic design drawing
US10083522B2 (en) * 2015-06-19 2018-09-25 Smart Picture Technologies, Inc. Image based measurement system
US12511840B2 (en) * 2016-01-07 2025-12-30 Northwest Instrument Inc. Intelligent interface based on augmented reality
CA3178580A1 (en) * 2020-05-14 2021-11-18 Eric FITERMAN Creating imagery for al model training in security screening
CN114048539B (en) * 2021-01-13 2025-01-28 深圳市万翼数字技术有限公司 CAD file analysis and rule judgment method and related device
CN114925416B (en) * 2022-04-25 2022-12-23 清华大学 Building structure generation method and device based on data conversion
CN114880861A (en) * 2022-05-19 2022-08-09 中国商用飞机有限责任公司北京民用飞机技术研究中心 Virtual reality visualization method and device, computer equipment and storage medium
CN115495416B (en) * 2022-06-30 2025-12-05 河南辉煌科技股份有限公司 Methods for parsing and displaying CAD drawings
CN116244810A (en) * 2023-03-13 2023-06-09 北京龙智数科科技服务有限公司 Selection and calculation method, device, equipment and medium based on online data processing

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112270756A (en) * 2020-11-24 2021-01-26 山东汇颐信息技术有限公司 Data rendering method applied to BIM model file
CN113901550A (en) * 2021-09-30 2022-01-07 万翼科技有限公司 Assembly building BIM model generation method and related equipment

Also Published As

Publication number Publication date
CN117197212A (en) 2023-12-08

Similar Documents

Publication Publication Date Title
US11036695B1 (en) Systems, methods, apparatuses, and/or interfaces for associative management of data and inference of electronic resources
JP5789525B2 (en) Document content ordering
US9304672B2 (en) Representation of an interactive document as a graph of entities
US7737966B2 (en) Method, apparatus, and system for processing geometric data of assembled parts
US8234264B2 (en) System and method for preferred services in nomadic environments
CN113283355A (en) Form image recognition method and device, computer equipment and storage medium
JP6062549B2 (en) Method and apparatus for retrieving information in an electronic commerce platform
CN104537098B (en) CAD diagram paper search method based on GIS technology
US10818082B2 (en) Method and system for parametrically creating an optimal three dimensional building structure
US20130325673A1 (en) Coordinate model for inventory visualization in conjunction with physical layout
CN116126809B (en) Building information model data storage conversion method based on national standard
CN101510218A (en) Method for implementing picture search and website server
US20230137639A1 (en) Data processing system and method for operating an enterprise application
JP7116744B2 (en) Method and apparatus for displaying textual information
US10296626B2 (en) Graph
CN113326314A (en) Data visualization method and device, electronic equipment and readable storage medium
CN114022702A (en) Intelligent warehouse management method and device, electronic equipment and storage medium
US20170300531A1 (en) Tag based searching in data analytics
Li et al. Intelligent extraction of multi-style and multi-template title block information based on fuzzy matching
CN101388018A (en) Management method of computer aided design file
CN117197212B (en) Graphics processing methods, systems, devices and media
WO2018208412A1 (en) Detection of caption elements in documents
CN110609927A (en) A visual family tree layout method, terminal equipment and storage medium
CN116403203B (en) Label generation method, system, electronic equipment and storage medium
KR20130083899A (en) Hyper-lattice model for optimized sequencing of online analytical processing (olap) operations on data warehouses

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant