CN114782611B - Image processing method, image processing apparatus, storage medium, and electronic device - Google Patents
- Publication number
- CN114782611B (application CN202210723158.6A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- preset
- target
- processed
- original texture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T15/00—3D [Three Dimensional] image rendering
        - G06T15/04—Texture mapping
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T19/00—Manipulating 3D models or images for computer graphics
        - G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Architecture (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Image Generation (AREA)
Abstract
The present disclosure relates to an image processing method, an image processing apparatus, a storage medium, and an electronic device, and relates to the field of computer technologies. The method includes: in response to a received pixel adjustment instruction, acquiring a target image according to each preset graphic on a preset three-dimensional model and a region to be processed on the preset three-dimensional model, where each preset graphic corresponds to an original texture image and the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value; determining a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image; and adjusting the pixel value of the target pixel to the target pixel value. By adjusting only the pixel values of the target pixels that require adjustment in the original texture images, the disclosure adjusts the pixels of the original texture images corresponding to the region to be processed while improving the efficiency of that adjustment.
Description
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a storage medium, and an electronic device.
Background
In recent years, with the rapid development of computer technology, three-dimensional models have been widely used. To make a three-dimensional model look more realistic and attractive, the model is usually optimized after creation. For a model that has already been built, adjusting the pixels of its texture images by going back to the original modeling software involves an enormous workload, and some models, such as three-dimensional live-action models reconstructed from photographs, cannot be adjusted in modeling software at all. A technique is therefore needed that can directly adjust the pixels of the texture images corresponding to a specified region on a model.
Disclosure of Invention
In order to solve the problems in the related art, the present disclosure provides an image processing method, apparatus, storage medium, and electronic device.
In order to achieve the above object, according to a first aspect of embodiments of the present disclosure, there is provided an image processing method including:
in response to a received pixel adjustment instruction, acquiring a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, wherein each preset graphic corresponds to an original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values;
determining a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image; and
adjusting the pixel value of the target pixel to the target pixel value.
Optionally, the acquiring a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model includes:
determining, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed; and
shooting each intersecting preset graphic together with the region to be processed at a preset angle to obtain a target image corresponding to each intersecting preset graphic.
Optionally, the determining, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed includes:
stretching the region to be processed along a first direction and a second direction respectively to obtain a target spatial region corresponding to the region to be processed, wherein the first direction is opposite to the second direction and the first direction is the normal direction of the plane in which the region to be processed lies; and
taking, as the intersecting preset graphic, a preset graphic among the plurality of preset graphics that has an intersection with the target spatial region.
Optionally, the determining a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image includes:
acquiring vertex position information corresponding to each intersecting preset graphic, wherein the vertex position information includes a first vertex position of each vertex of the intersecting preset graphic on the target image and a second vertex position of each vertex of the intersecting preset graphic on the original texture image; and
for each original texture image, determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information.
Optionally, the determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information includes:
for each pixel, determining, according to the pixel position of the pixel, whether one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, and, in a case where one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, determining the pixel value corresponding to the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graphic; and
for each pixel, determining the target pixel according to the pixel value corresponding to the pixel in the target image.
Optionally, the target image includes a first pixel having a first pixel value and a second pixel having a second pixel value, the first pixel being the pixel corresponding to the region to be processed in the target image; and the determining the target pixel according to the pixel value corresponding to the pixel in the target image includes:
counting a first number of times that the pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that the pixel value corresponding to the pixel in each target image is the second pixel value; and
taking the pixel as the target pixel if the first number of times is greater than or equal to the second number of times.
Optionally, the determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information further includes:
in a case where no target preset graphic corresponding to the pixel exists in the at least one intersecting preset graphic, not taking the pixel as the target pixel.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus, the apparatus including:
an acquisition module, configured to acquire, in response to a received pixel adjustment instruction, a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, wherein each preset graphic corresponds to an original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values;
a determining module, configured to determine a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image; and
an adjusting module, configured to adjust the pixel value of the target pixel to the target pixel value.
Optionally, the acquisition module includes:
a first determining submodule, configured to determine, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed; and
a shooting submodule, configured to shoot each intersecting preset graphic together with the region to be processed at a preset angle to obtain a target image corresponding to each intersecting preset graphic.
Optionally, the first determining submodule is configured to:
stretch the region to be processed along a first direction and a second direction respectively to obtain a target spatial region corresponding to the region to be processed, wherein the first direction is opposite to the second direction and the first direction is the normal direction of the plane in which the region to be processed lies; and
take, as the intersecting preset graphic, a preset graphic among the plurality of preset graphics that has an intersection with the target spatial region.
Optionally, the determining module includes:
an acquisition submodule, configured to acquire vertex position information corresponding to each intersecting preset graphic, wherein the vertex position information includes a first vertex position of each vertex of the intersecting preset graphic on the target image and a second vertex position of each vertex of the intersecting preset graphic on the original texture image; and
a second determining submodule, configured to determine, for each original texture image, a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information.
Optionally, the second determining submodule is configured to:
for each pixel, determine, according to the pixel position of the pixel, whether one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, and, in a case where one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, determine the pixel value corresponding to the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graphic; and
for each pixel, determine the target pixel according to the pixel value corresponding to the pixel in the target image.
Optionally, the target image includes a first pixel having a first pixel value and a second pixel having a second pixel value, the first pixel being the pixel corresponding to the region to be processed in the target image; and the second determining submodule is configured to:
count a first number of times that the pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that the pixel value corresponding to the pixel in each target image is the second pixel value; and
take the pixel as the target pixel if the first number of times is greater than or equal to the second number of times.
Optionally, the second determining submodule is further configured to:
in a case where no target preset graphic corresponding to the pixel exists in the at least one intersecting preset graphic, not take the pixel as the target pixel.
According to a third aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any implementation of the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic device, including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any of the above first aspects.
According to the above technical solution, in response to a received pixel adjustment instruction, a target image is first acquired according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, where each preset graphic corresponds to one original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values; a target pixel is then determined in each original texture image according to the plurality of preset graphics, the target image, and each original texture image, and the pixel value of the target pixel is adjusted to the target pixel value. By combining the preset graphics and the target image acquired from the region to be processed with the original texture images, the present disclosure can identify, in each original texture image, the target pixels that require adjustment and adjust their pixel values, thereby adjusting the pixels of the original texture images corresponding to the region to be processed while improving the efficiency of that adjustment.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
FIG. 1 is a flow diagram illustrating a method of image processing according to an exemplary embodiment;
FIG. 2 is a flowchart of step 101 in the embodiment shown in FIG. 1;
FIG. 3 is a schematic illustration of a target image shown in accordance with an exemplary embodiment;
FIG. 4 is a flowchart of step 102 in the embodiment shown in FIG. 1;
FIG. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment;
FIG. 6 is a block diagram of the acquisition module in the embodiment shown in FIG. 5;
FIG. 7 is a block diagram of the determining module in the embodiment shown in FIG. 5;
FIG. 8 is a block diagram illustrating an electronic device in accordance with an exemplary embodiment;
FIG. 9 is a block diagram illustrating another electronic device in accordance with an example embodiment.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
It should be noted that all actions of acquiring signals, information, or data in the present disclosure are performed in compliance with the applicable data protection laws and regulations of the relevant jurisdiction and with the authorization of the owner of the corresponding device.
FIG. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment. As shown in FIG. 1, the method may include the following steps.
In step 101, in response to a received pixel adjustment instruction, a target image is acquired according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model.
Each preset graphic corresponds to an original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values.
For example, the pixels of the texture image corresponding to a partial region of the three-dimensional model that is to be adjusted can be modified by identifying the corresponding pixels in the original texture images of the model and adjusting those pixels. In this way, the pixel adjustment of the texture image corresponding to the partial region is achieved without re-rendering the preset three-dimensional model, which improves the efficiency of adjusting the pixels of the texture images of the model. Specifically, when a user wants to adjust the pixels of the original texture image corresponding to some region on the preset three-dimensional model, the user marks on the model the region to be processed (which can be understood as a range surface marked on the preset three-dimensional model) and inputs the target pixel value to which the original texture image corresponding to that region should be adjusted, thereby triggering the pixel adjustment instruction. The preset three-dimensional model is assembled from a plurality of preset graphics (the preset graphics may be triangles or other polygons), and the original texture images are mapped onto the surface of the model in a specific manner. An original texture image can be understood as a two-dimensional image on the surface of the model and may also be called a texture map; each preset graphic corresponds to one original texture image, and the original texture images corresponding to all the preset graphics together form the surface texture of the preset three-dimensional model. In addition, the region to be processed may be marked manually by the user or calibrated automatically according to an input condition set by the user (for example, the position of the region to be processed on the preset three-dimensional model).
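For concreteness, a minimal sketch of how these inputs might be represented in code is given below; the names and fields are illustrative assumptions for exposition only and are not prescribed by the present disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]   # position in model space
UV = Tuple[float, float]            # texture / image coordinate

@dataclass
class PresetGraphic:
    """One face (here a triangle) of the preset three-dimensional model."""
    vertices: List[Vec3]   # three vertex positions in model space
    texture_id: int        # which original texture image this face uses
    uv_coords: List[UV]    # per-vertex position in that texture image

@dataclass
class RegionToProcess:
    """The range surface marked on the preset three-dimensional model."""
    boundary: List[Vec3]   # planar polygon outlining the region
    normal: Vec3           # normal of the plane the region lies in

@dataclass
class PixelAdjustInstruction:
    """Triggered when the user marks the region and enters a colour."""
    region: RegionToProcess
    target_pixel_value: Tuple[int, int, int]   # e.g. an RGB value

# Example: one triangular face mapped onto original texture image 0
face = PresetGraphic(
    vertices=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
    texture_id=0,
    uv_coords=[(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
)
```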
Because the number of preset graphics forming the preset three-dimensional model is huge and most of them have no intersection with the region to be processed, the preset graphics having no intersection with the region to be processed can be filtered out, and all preset graphics having an intersection with the region to be processed are taken as intersecting preset graphics; this further improves the efficiency of the pixel adjustment. Each intersecting preset graphic can then be shot together with the region to be processed to obtain a target image corresponding to that intersecting preset graphic. The target image may include first pixels having a first pixel value and second pixels having a second pixel value (the first pixels are the pixels corresponding to the region to be processed in the target image), and the first pixel value differs from the second pixel value; that is, the target image is an image composed of pixels of two different pixel values.
In step 102, a target pixel is determined in each original texture image according to the plurality of preset graphics, the target image, and each original texture image.
For example, after the target images are obtained, the first vertex positions of all vertices of each intersecting preset graphic on its target image and their second vertex positions on the original texture image may first be acquired. Then, for each original texture image, each pixel in the image is traversed in turn: the preset graphics corresponding to the pixel are determined from its pixel position (one pixel may correspond to one or more preset graphics), and it is determined whether any of those preset graphics is an intersecting preset graphic. If so (that is, one or more intersecting preset graphics correspond to the pixel), the target images corresponding to all of those intersecting preset graphics are determined (each intersecting preset graphic corresponds to one target image). Next, the target pixel position of the pixel on each corresponding target image can be determined by combining the first vertex positions of each intersecting preset graphic corresponding to the pixel with the positional relationship between the pixel position and the second vertex positions of that intersecting preset graphic. Then, for each target image corresponding to the pixel, the pixel value at the target pixel position on that target image is read: if it equals the first pixel value, the pixel is labeled "1"; otherwise, it is labeled "0". Finally, a first count of "1" labels and a second count of "0" labels are tallied for the pixel, and if the first count is greater than or equal to the second count, the pixel is taken as a target pixel. In this manner, the target pixels in each original texture image can be determined, and the pixel values of these target pixels can be adjusted to the target pixel value, thereby adjusting the pixels of the texture images corresponding to the region to be processed.
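The check of which preset graphics a texture pixel belongs to can be realised with a standard point-in-triangle test in UV space. The sketch below uses a sign-of-cross-products test, which is an assumption made here for illustration; the disclosure does not prescribe a particular test.

```python
def uv_point_in_triangle(p, a, b, c, eps=1e-9):
    """Return True if the UV point p lies inside or on the triangle abc.

    A texture pixel whose UV centre passes this test for a face is treated
    as belonging to that preset graphic.
    """
    def cross(o, u, v):
        # z-component of (u - o) x (v - o)
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])

    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = d1 < -eps or d2 < -eps or d3 < -eps
    has_pos = d1 > eps or d2 > eps or d3 > eps
    return not (has_neg and has_pos)   # inside if all edge signs agree

# The pixel at UV (0.25, 0.25) belongs to the face with UVs (0,0), (1,0), (0,1)
print(uv_point_in_triangle((0.25, 0.25), (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)))  # True
```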
In summary, in response to a received pixel adjustment instruction, the present disclosure first acquires a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, where each preset graphic corresponds to one original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values; it then determines a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image, and adjusts the pixel value of the target pixel to the target pixel value. By combining the preset graphics and the target image acquired from the region to be processed with the original texture images, the disclosure can identify the target pixels requiring adjustment in each original texture image and adjust their pixel values, thereby adjusting the pixels of the original texture images corresponding to the region to be processed while improving the efficiency of that adjustment.
FIG. 2 is a flowchart of step 101 in the embodiment shown in FIG. 1. As shown in FIG. 2, step 101 may include the following steps.
For example, to further improve the efficiency of the pixel adjustment, the region to be processed may be stretched along a first direction and a second direction respectively to obtain a target spatial region corresponding to the region to be processed, where the first direction is opposite to the second direction and the first direction is the normal direction of the plane in which the region to be processed lies. A preset graphic among the plurality of preset graphics that has an intersection with the target spatial region can then be taken as an intersecting preset graphic. For example, when the preset graphics are triangles, the region to be processed may be stretched by 1 meter along the normal direction of its plane and by 1 meter in the opposite direction to obtain a cuboid target spatial region. All triangles may then be traversed to determine whether each triangle lies completely outside the cuboid; triangles that lie completely outside the cuboid are removed, and the remaining triangles are taken as the intersecting triangles.
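A simplified, conservative version of this filter is sketched below; it stands in for the exact cuboid test with axis-aligned bounding boxes, which is an assumption made here for brevity rather than the disclosure's own construction.

```python
import numpy as np

def swept_region_aabb(boundary, normal, depth=1.0):
    """Axis-aligned bounding box of the region swept +/- depth along its normal.

    The planar boundary points are offset by one depth in each direction and
    the min/max corners over all offset points are returned.
    """
    pts = np.asarray(boundary, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    swept = np.vstack([pts + depth * n, pts - depth * n])
    return swept.min(axis=0), swept.max(axis=0)

def triangle_overlaps_box(tri, box_min, box_max):
    """Keep a face unless its own bounding box lies completely outside the target box."""
    tri = np.asarray(tri, dtype=float)
    return bool(np.all(tri.max(axis=0) >= box_min) and np.all(tri.min(axis=0) <= box_max))

# Filter the faces of the model down to the intersecting preset graphics
box_min, box_max = swept_region_aabb(
    boundary=[(0, 0, 0), (2, 0, 0), (2, 2, 0), (0, 2, 0)], normal=(0, 0, 1))
faces = [[(0.5, 0.5, 0.2), (1.0, 1.0, 0.2), (1.0, 0.5, 0.2)],   # near the region
         [(9.0, 9.0, 9.0), (9.5, 9.0, 9.0), (9.0, 9.5, 9.0)]]   # far away
intersecting = [f for f in faces if triangle_overlaps_box(f, box_min, box_max)]
print(len(intersecting))  # 1
```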
Specifically, after the intersecting preset graphics are determined, each intersecting preset graphic may be shot together with the region to be processed at a preset angle (or captured as a screenshot) to generate a target image corresponding to that intersecting preset graphic. For example, when the preset graphics are triangles, the target image may be a black-and-white image (the target image then contains only a white pixel value and a black pixel value); as shown in FIG. 3, the white area in the black-and-white image is the area corresponding to the region to be processed.
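As one way to obtain such a two-value target image, the sketch below assumes the shot is first available as a greyscale render in which the marked region appears brighter than its surroundings; this is an illustrative assumption, not the disclosure's prescribed rendering pipeline.

```python
import numpy as np

def to_two_value_image(screenshot_gray, threshold=128, first_value=255, second_value=0):
    """Collapse a greyscale screenshot into an image with exactly two pixel values."""
    img = np.asarray(screenshot_gray)
    # pixels at or above the threshold become the first (region) value, others the second
    return np.where(img >= threshold, first_value, second_value).astype(np.uint8)

# A tiny 4x4 fake render: the bright 2x2 corner becomes the white, first-value area
fake_render = np.zeros((4, 4), dtype=np.uint8)
fake_render[:2, :2] = 200
print(to_two_value_image(fake_render))
```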
FIG. 4 is a flowchart of step 102 in the embodiment shown in FIG. 1. As shown in FIG. 4, step 102 may include the following steps.
For example, after the target images are acquired, vertex position information may be acquired for each intersecting preset graphic, including the first vertex position of each of its vertices on the target image and the second vertex position of each of its vertices on the original texture image. That is, the following needs to be saved after the target images are generated: (1) the correspondence between each target image and its intersecting preset graphic; (2) the pixel position of each vertex of the intersecting preset graphic on the corresponding target image; and (3) the original texture image corresponding to the intersecting preset graphic and the pixel position of each of its vertices on that original texture image. The first vertex positions and second vertex positions may be expressed in UV coordinates.
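One way to keep this bookkeeping together is a small record per intersecting preset graphic; the field names below are assumptions chosen for readability, not terms defined by the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

UV = Tuple[float, float]

@dataclass
class VertexPositionInfo:
    """Saved per intersecting preset graphic once its target image is generated."""
    target_image_id: int        # (1) which target image the face corresponds to
    first_vertex_uv: List[UV]   # (2) each vertex's position on that target image
    texture_image_id: int       # (3) the original texture image the face uses
    second_vertex_uv: List[UV]  #     each vertex's position on that texture image

info = VertexPositionInfo(
    target_image_id=0,
    first_vertex_uv=[(0.1, 0.1), (0.9, 0.1), (0.5, 0.8)],
    texture_image_id=2,
    second_vertex_uv=[(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)],
)
```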
For example, after the vertex position information is obtained, it may first be determined, for each original texture image and for each pixel in it, whether one or more target preset graphics corresponding to the pixel exist among the at least one intersecting preset graphic, according to the pixel position of the pixel. For instance, the preset graphics to which the pixel belongs can be determined from its pixel position and taken as the preset graphics corresponding to the pixel, and it is then checked whether any of them is an intersecting preset graphic. If so, the preset graphics that are intersecting preset graphics among all the preset graphics corresponding to the pixel are taken as target preset graphics, and it is determined that one or more target preset graphics corresponding to the pixel exist. If none of the preset graphics corresponding to the pixel is an intersecting preset graphic, it is determined that no target preset graphic corresponding to the pixel exists.
In a case where one or more target preset graphics corresponding to the pixel exist among the at least one intersecting preset graphic, the pixel value corresponding to the pixel in the target image may be determined according to the pixel position of the pixel and the vertex position information corresponding to each target preset graphic. Specifically, the target pixel position of the pixel on each target image corresponding to the pixel is determined by combining the positional relationship between the pixel position and the second vertex positions of each target preset graphic corresponding to the pixel with the first vertex positions of that target preset graphic, and the pixel value at that target pixel position on each target image is then read. Determining the target pixel position of the pixel on a target image amounts to deducing, from the position of the pixel relative to the target preset graphic on the original texture image and the position of that target preset graphic on the target image, where the pixel falls on the target image.
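One concrete way to carry out this deduction is to compute the pixel's barycentric coordinates inside the face's triangle in texture space and reuse them to interpolate the same face's vertices in target-image space. Barycentric interpolation is an assumption made here for illustration, since the disclosure only requires that the positional relationship be preserved.

```python
def barycentric_2d(p, a, b, c):
    """Barycentric coordinates (u, v, w) of the 2-D point p in triangle abc."""
    def sub(x, y):
        return (x[0] - y[0], x[1] - y[1])
    def cross(x, y):
        return x[0] * y[1] - x[1] * y[0]
    v0, v1, v2 = sub(b, a), sub(c, a), sub(p, a)
    den = cross(v0, v1)          # twice the signed triangle area
    v = cross(v2, v1) / den
    w = cross(v0, v2) / den
    return 1.0 - v - w, v, w

def texture_to_target_position(pixel_uv, second_vertex_uv, first_vertex_uv):
    """Map a texture-space pixel position to its position on the target image."""
    u, v, w = barycentric_2d(pixel_uv, *second_vertex_uv)
    ta, tb, tc = first_vertex_uv
    return (u * ta[0] + v * tb[0] + w * tc[0],
            u * ta[1] + v * tb[1] + w * tc[1])

# The centroid of the texture-space triangle maps to the centroid on the target image
print(texture_to_target_position(
    (1 / 3, 1 / 3),
    [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],      # second vertex positions (texture)
    [(0.2, 0.2), (0.8, 0.2), (0.5, 0.9)]))     # first vertex positions (target image)
```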
Then, for each pixel, the target pixel may be determined from the pixel values corresponding to the pixel in the target images. For example, in a case where the target image includes first pixels having a first pixel value and second pixels having a second pixel value, with the first pixels being the pixels corresponding to the region to be processed in the target image, a first count of how many times the pixel value corresponding to the pixel in a target image is the first pixel value and a second count of how many times it is the second pixel value may be tallied. If the first count is greater than or equal to the second count, the pixel corresponds to the region to be processed and can be taken as a target pixel. Further, in a case where no target preset graphic corresponding to the pixel exists among the at least one intersecting preset graphic, the pixel is not taken as a target pixel.
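The counting step can then be expressed as a simple vote over the values read from the target images; the sketch below assumes the first (region) pixel value is 255, as in the black-and-white example above.

```python
def is_target_pixel(read_values, first_value=255):
    """Decide whether a texture pixel is a target pixel from the values read
    at its mapped positions, one per target image it fell into."""
    if not read_values:          # no corresponding target preset graphic
        return False
    first_count = sum(1 for v in read_values if v == first_value)
    second_count = len(read_values) - first_count
    return first_count >= second_count

print(is_target_pixel([255, 255, 0]))  # True  -> its value is set to the target pixel value
print(is_target_pixel([0, 0, 255]))    # False -> the pixel is left unchanged
```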
In the above manner, the target pixels in each original texture image can be determined. Finally, the pixel value of each target pixel in each original texture image can be adjusted to the target pixel value, so as to realize the pixel adjustment of the texture images corresponding to the region to be processed.
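The final adjustment step then reduces to writing the target pixel value at every target pixel. A minimal sketch follows, assuming (for illustration only) that the texture is held as a NumPy array and the target pixels as a boolean mask.

```python
import numpy as np

def apply_target_value(texture, target_mask, target_value):
    """Write the target pixel value into every target pixel of one texture image."""
    out = texture.copy()
    out[target_mask] = target_value   # broadcast the colour over the masked pixels
    return out

# A 2x2 RGB texture in which only the top-left pixel was judged a target pixel
tex = np.zeros((2, 2, 3), dtype=np.uint8)
mask = np.array([[True, False], [False, False]])
print(apply_target_value(tex, mask, (255, 0, 0))[0, 0])  # [255   0   0]
```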
In summary, in response to a received pixel adjustment instruction, the present disclosure first acquires a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, where each preset graphic corresponds to one original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values; it then determines a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image, and adjusts the pixel value of the target pixel to the target pixel value. By combining the preset graphics and the target image acquired from the region to be processed with the original texture images, the disclosure can identify the target pixels requiring adjustment in each original texture image and adjust their pixel values, thereby adjusting the pixels of the original texture images corresponding to the region to be processed while improving the efficiency of that adjustment.
FIG. 5 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. As shown in FIG. 5, the image processing apparatus 200 includes:
an acquisition module 201, configured to acquire, in response to a received pixel adjustment instruction, a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, where each preset graphic corresponds to an original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values;
a determining module 202, configured to determine a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image; and
an adjusting module 203, configured to adjust the pixel value of the target pixel to the target pixel value.
FIG. 6 is a block diagram of the acquisition module in the embodiment shown in FIG. 5. As shown in FIG. 6, the acquisition module 201 includes:
a first determining submodule 2011, configured to determine, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed; and
a shooting submodule 2012, configured to shoot each intersecting preset graphic together with the region to be processed at a preset angle to obtain a target image corresponding to each intersecting preset graphic.
Optionally, the first determining submodule 2011 is configured to:
stretch the region to be processed along a first direction and a second direction respectively to obtain a target spatial region corresponding to the region to be processed, where the first direction is opposite to the second direction and the first direction is the normal direction of the plane in which the region to be processed lies; and
take, as an intersecting preset graphic, a preset graphic among the plurality of preset graphics that has an intersection with the target spatial region.
FIG. 7 is a block diagram of the determining module in the embodiment shown in FIG. 5. As shown in FIG. 7, the determining module 202 includes:
an acquisition submodule 2021, configured to acquire vertex position information corresponding to each intersecting preset graphic, where the vertex position information includes a first vertex position of each vertex of the intersecting preset graphic on the target image and a second vertex position of each vertex of the intersecting preset graphic on the original texture image; and
a second determining submodule 2022, configured to determine, for each original texture image, a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information.
Optionally, the second determining submodule 2022 is configured to:
for each pixel, determine, according to the pixel position of the pixel, whether one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, and, in a case where one or more target preset graphics corresponding to the pixel exist, determine the pixel value corresponding to the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graphic; and
for each pixel, determine the target pixel according to the pixel value corresponding to the pixel in the target image.
Optionally, the target image includes first pixels having a first pixel value and second pixels having a second pixel value, the first pixels being the pixels corresponding to the region to be processed in the target image. The second determining submodule 2022 is configured to:
count a first number of times that the pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that the pixel value corresponding to the pixel in each target image is the second pixel value; and
take the pixel as the target pixel if the first number of times is greater than or equal to the second number of times.
Optionally, the second determining submodule 2022 is further configured to:
not take the pixel as a target pixel in a case where no target preset graphic corresponding to the pixel exists in the at least one intersecting preset graphic.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
In summary, in response to a received pixel adjustment instruction, the present disclosure first acquires a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, where each preset graphic corresponds to one original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values; it then determines a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image, and adjusts the pixel value of the target pixel to the target pixel value. By combining the preset graphics and the target image acquired from the region to be processed with the original texture images, the disclosure can identify the target pixels requiring adjustment in each original texture image and adjust their pixel values, thereby adjusting the pixels of the original texture images corresponding to the region to be processed while improving the efficiency of that adjustment.
FIG. 8 is a block diagram of an electronic device 700 according to an exemplary embodiment. As shown in FIG. 8, the electronic device 700 may be provided as a terminal and may include a first processor 701 and a first memory 702. The electronic device 700 may also include one or more of a multimedia component 703, a first input/output interface 704, and a first communication component 705.
The first processor 701 is configured to control the overall operation of the electronic device 700, so as to complete all or part of the steps of the image processing method. The first memory 702 is used to store various types of data to support operation on the electronic device 700, such as instructions for any application or method operating on the electronic device 700 and application-related data, such as contact data, sent and received messages, pictures, audio, and video. The first memory 702 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 703 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; the received audio signals may further be stored in the first memory 702 or transmitted through the first communication component 705. The audio component further includes at least one speaker for outputting audio signals. The first input/output interface 704 provides an interface between the first processor 701 and other interface modules, such as a keyboard, a mouse, or buttons, which may be virtual or physical. The first communication component 705 is used for wired or wireless communication between the electronic device 700 and other devices. Wireless communication includes, for example, Wi-Fi, Bluetooth, near field communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein. The corresponding first communication component 705 may thus include a Wi-Fi module, a Bluetooth module, an NFC module, and so on.
In an exemplary embodiment, the electronic Device 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the image Processing methods described above.
In another exemplary embodiment, there is also provided a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the image processing method described above. For example, the computer readable storage medium may be the first memory 702 comprising program instructions executable by the first processor 701 of the electronic device 700 to perform the image processing method described above.
In addition, the electronic device 700 may also be provided as a server, as shown in fig. 9, the electronic device 700 comprising a second processor 706, which may be one or more in number, and a second memory 707 for storing computer programs executable by the second processor 706. The computer program stored in the second memory 707 may include one or more modules that each correspond to a set of instructions. Further, the second processor 706 may be configured to execute the computer program to perform the image processing method described above.
Additionally, the electronic device 700 may also include a power component 708 and a second communication component 709. The power component 708 may be configured to perform power management of the electronic device 700, and the second communication component 709 may be configured to enable wired or wireless communication of the electronic device 700. The electronic device 700 may further include a second input/output interface 710. The electronic device 700 may operate based on an operating system stored in the second memory 707, such as Windows Server™, Mac OS X™, Unix™, Linux™, and so on.
In another exemplary embodiment, there is also provided a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the image processing method described above. For example, the non-transitory computer readable storage medium may be the second memory 707 described above that includes program instructions that are executable by the second processor 706 of the electronic device 700 to perform the image processing method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the image processing method described above when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that, in the foregoing embodiments, various features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various combinations that are possible in the present disclosure are not described again.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure as long as it does not depart from the gist of the present disclosure.
Claims (8)
1. An image processing method, characterized in that the method comprises:
in response to a received pixel adjustment instruction, acquiring a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, wherein each preset graphic corresponds to an original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values;
determining a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image; and
adjusting the pixel value of the target pixel to the target pixel value;
wherein the target image comprises a first pixel having a first pixel value and a second pixel having a second pixel value, the first pixel being the pixel corresponding to the region to be processed in the target image;
wherein the acquiring a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model comprises:
determining, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed; and
shooting each intersecting preset graphic together with the region to be processed at a preset angle to obtain a target image corresponding to each intersecting preset graphic;
and wherein the determining, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed comprises:
stretching the region to be processed along a first direction and a second direction respectively to obtain a target spatial region corresponding to the region to be processed, wherein the first direction is opposite to the second direction and the first direction is the normal direction of the plane in which the region to be processed lies; and
taking, as the intersecting preset graphic, a preset graphic among the plurality of preset graphics that has an intersection with the target spatial region.
2. The method according to claim 1, wherein the determining a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image comprises:
acquiring vertex position information corresponding to each intersecting preset graphic, wherein the vertex position information comprises a first vertex position of each vertex of the intersecting preset graphic on the target image and a second vertex position of each vertex of the intersecting preset graphic on the original texture image; and
for each original texture image, determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information.
3. The method according to claim 2, wherein the determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information comprises:
for each pixel, determining, according to the pixel position of the pixel, whether one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, and, in a case where one or more target preset graphics corresponding to the pixel exist in the at least one intersecting preset graphic, determining the pixel value corresponding to the pixel in the target image according to the pixel position of the pixel and the vertex position information corresponding to each target preset graphic; and
for each pixel, determining the target pixel according to the pixel value corresponding to the pixel in the target image.
4. The method according to claim 3, wherein the determining the target pixel according to the pixel value corresponding to the pixel in the target image comprises:
counting a first number of times that the pixel value corresponding to the pixel in the target image is the first pixel value and a second number of times that the pixel value corresponding to the pixel in each target image is the second pixel value; and
taking the pixel as the target pixel if the first number of times is greater than or equal to the second number of times.
5. The method according to claim 3, wherein the determining a target pixel in the original texture image according to the pixel position of each pixel in the original texture image and the vertex position information further comprises:
in a case where no target preset graphic corresponding to the pixel exists in the at least one intersecting preset graphic, not taking the pixel as the target pixel.
6. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module, configured to acquire, in response to a received pixel adjustment instruction, a target image according to each preset graphic on a preset three-dimensional model formed by a plurality of preset graphics and a region to be processed on the preset three-dimensional model, wherein each preset graphic corresponds to an original texture image, the pixel adjustment instruction instructs that the pixel values of the original texture image corresponding to the region to be processed be adjusted to a target pixel value, and the target image is composed of pixels of two different pixel values;
a determining module, configured to determine a target pixel in each original texture image according to the plurality of preset graphics, the target image, and each original texture image; and
an adjusting module, configured to adjust the pixel value of the target pixel to the target pixel value;
wherein the target image comprises a first pixel having a first pixel value and a second pixel having a second pixel value, the first pixel being the pixel corresponding to the region to be processed in the target image;
wherein the acquisition module comprises:
a first determining submodule, configured to determine, from the plurality of preset graphics according to the plurality of preset graphics and the region to be processed, at least one intersecting preset graphic having an intersection with the region to be processed; and
a shooting submodule, configured to shoot each intersecting preset graphic together with the region to be processed at a preset angle to obtain a target image corresponding to each intersecting preset graphic;
and wherein the first determining submodule is configured to:
stretch the region to be processed along a first direction and a second direction respectively to obtain a target spatial region corresponding to the region to be processed, wherein the first direction is opposite to the second direction and the first direction is the normal direction of the plane in which the region to be processed lies; and
take, as the intersecting preset graphic, a preset graphic among the plurality of preset graphics that has an intersection with the target spatial region.
7. A non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
8. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210723158.6A CN114782611B (en) | 2022-06-24 | 2022-06-24 | Image processing method, image processing apparatus, storage medium, and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210723158.6A CN114782611B (en) | 2022-06-24 | 2022-06-24 | Image processing method, image processing apparatus, storage medium, and electronic device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114782611A CN114782611A (en) | 2022-07-22 |
CN114782611B true CN114782611B (en) | 2022-09-20 |
Family
ID=82422309
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210723158.6A Active CN114782611B (en) | 2022-06-24 | 2022-06-24 | Image processing method, image processing apparatus, storage medium, and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114782611B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115830091B (en) * | 2023-02-20 | 2023-05-12 | 腾讯科技(深圳)有限公司 | Texture image generation method, device, equipment, storage medium and product |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101261681A (en) * | 2008-03-31 | 2008-09-10 | 北京中星微电子有限公司 | Road image extraction method and device in intelligent video monitoring |
CN105608239A (en) * | 2014-11-24 | 2016-05-25 | 富泰华工业(深圳)有限公司 | Coordinate measuring machine programming system and method |
CN107452046A (en) * | 2017-06-30 | 2017-12-08 | 百度在线网络技术(北京)有限公司 | The Texture Processing Methods and device of D Urban model, equipment and computer-readable recording medium |
JP2018032301A (en) * | 2016-08-26 | 2018-03-01 | 株式会社アクセル | Image data processing method and program in image processor |
CN112307553A (en) * | 2020-12-03 | 2021-02-02 | 之江实验室 | A Method of Extracting and Simplifying 3D Road Model |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111738923B (en) * | 2020-06-19 | 2024-05-10 | 京东方科技集团股份有限公司 | Image processing method, device and storage medium |
CN114140568B (en) * | 2021-10-28 | 2025-05-09 | 北京达佳互联信息技术有限公司 | Image processing method, device, electronic device and storage medium |
- 2022-06-24: CN application CN202210723158.6A filed in China, now granted as CN114782611B (status: Active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101261681A (en) * | 2008-03-31 | 2008-09-10 | 北京中星微电子有限公司 | Road image extraction method and device in intelligent video monitoring |
CN105608239A (en) * | 2014-11-24 | 2016-05-25 | 富泰华工业(深圳)有限公司 | Coordinate measuring machine programming system and method |
JP2018032301A (en) * | 2016-08-26 | 2018-03-01 | 株式会社アクセル | Image data processing method and program in image processor |
CN107452046A (en) * | 2017-06-30 | 2017-12-08 | 百度在线网络技术(北京)有限公司 | The Texture Processing Methods and device of D Urban model, equipment and computer-readable recording medium |
CN112307553A (en) * | 2020-12-03 | 2021-02-02 | 之江实验室 | A Method of Extracting and Simplifying 3D Road Model |
Non-Patent Citations (1)
Title |
---|
Moving object shadow suppression method combining local binary image features; Dai Luping et al.; Journal of Huazhong University of Science and Technology (Natural Science Edition); 2016-10-23 (Issue 10); full text * |
Also Published As
Publication number | Publication date |
---|---|
CN114782611A (en) | 2022-07-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11330172B2 (en) | Panoramic image generating method and apparatus | |
US20170186219A1 (en) | Method for 360-degree panoramic display, display module and mobile terminal | |
CN113436338A (en) | Three-dimensional reconstruction method and device for fire scene, server and readable storage medium | |
CN113643414A (en) | Three-dimensional image generation method and device, electronic equipment and storage medium | |
CN114782611B (en) | Image processing method, image processing apparatus, storage medium, and electronic device | |
CN114332246A (en) | Virtual simulation method and device for camera distortion | |
CN112017133B (en) | Image display method and device and electronic equipment | |
CN118115546A (en) | Registration method, registration device, registration system and storage medium | |
CN113012302B (en) | Three-dimensional panorama generation method, device, computer equipment and storage medium | |
WO2025077567A1 (en) | Three-dimensional model output method, apparatus and device, and computer readable storage medium | |
WO2025092175A1 (en) | Virtual object generation method and apparatus, computer device and storage medium | |
CN114782614B (en) | Model rendering method and device, storage medium and electronic equipment | |
CN114782616B (en) | Model processing method and device, storage medium and electronic equipment | |
CN111311491B (en) | Image processing method and device, storage medium and electronic equipment | |
CN111292414B (en) | Method and device for generating three-dimensional image of object, storage medium and electronic equipment | |
CN114882194A (en) | Method and device for processing room point cloud data, electronic equipment and storage medium | |
CN114419286A (en) | Panoramic roaming method and device, electronic equipment and storage medium | |
CN112184543B (en) | Data display method and device for fisheye camera | |
CN114998228B (en) | A 3D image generation method, device and storage medium | |
CN113157835B (en) | Image processing method, device and platform based on GIS platform and storage medium | |
CN112837375B (en) | Method and system for camera positioning inside real space | |
US11812153B2 (en) | Systems and methods for fisheye camera calibration and bird's-eye-view image generation in a simulation environment | |
CN114125304B (en) | Shooting method and device | |
US20250030822A1 (en) | Electronic device, projection system, and projection method thereof | |
CN113763530B (en) | Image processing method, device, computing equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CP03 | Change of name, title or address | ||
CP03 | Change of name, title or address |
Address after: Room Y579, 3rd Floor, Building 3, No. 9 Keyuan Road, Daxing District Economic Development Zone, Beijing 102600 Patentee after: Beijing Feidu Technology Co.,Ltd. Address before: 100162 608, floor 6, building 1, courtyard 15, Xinya street, Daxing District, Beijing Patentee before: Beijing Feidu Technology Co.,Ltd. |