
CN117765104A - Line rendering method and device, electronic equipment and storage medium

Info

Publication number: CN117765104A
Application number: CN202211163292.1A
Authority: CN (China)
Prior art keywords: image, texture, candidate, texture image, region
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventor: 王奇勋
Assignee (current and original): Shuhang Technology Beijing Co., Ltd. (the listed assignee may be inaccurate)
Application filed by: Shuhang Technology Beijing Co., Ltd.
Landscapes: Image Analysis (AREA)

Abstract

Embodiments of this application disclose a line rendering method and apparatus, an electronic device, and a storage medium. The method includes: acquiring a region of interest of an original image; matching the region of interest with a plurality of candidate texture images in a texture image library respectively, to obtain a matching degree of each candidate texture image; determining the candidate texture image with the highest matching degree as a target texture image; performing edge detection on the region of interest to obtain an edge line; acquiring, from the target texture image, a texture line corresponding to the edge line according to the position of the edge line in the original image; and replacing the edge line with the texture line to render the edge line and obtain a target image. This improves the matching precision between the target texture image and the region of interest of the original image, speeds up line rendering, and thereby improves the user experience.

Description

Line rendering method and device, electronic equipment and storage medium
Technical Field
This application relates to the field of image processing technologies, and in particular, to a line rendering method and apparatus, an electronic device, and a storage medium.
Background
As people's aesthetic expectations have risen, image editing has become more diversified; for example, a certain region in an image may be traced so that the edge lines of that region are rendered with a decorative effect. One current implementation randomly selects a texture pattern and then uses that pattern to render lines in the image. However, this random selection cannot guarantee the matching precision between the texture pattern and the image, which degrades the user experience.
Disclosure of Invention
The embodiments of this application provide a line rendering method and apparatus, an electronic device, and a storage medium. By calculating the matching degree between each candidate texture image and a region of interest in an original image, a target texture image is determined; this improves the matching precision between the original image and the target texture image, and thus improves the user experience.
In a first aspect, an embodiment of the present application provides a line rendering method, including:
acquiring a region of interest of an original image;
matching the region of interest with a plurality of candidate texture images in a texture image library respectively, to obtain a matching degree of each candidate texture image;
determining the candidate texture image with the highest matching degree as a target texture image;
performing edge detection on the region of interest to obtain an edge line;
acquiring, from the target texture image, a texture line corresponding to the edge line according to a position of the edge line in the original image;
and replacing the edge line with the texture line to render the edge line, so as to obtain a target image.
In a second aspect, an embodiment of the present application provides a line rendering apparatus, including: an acquisition unit and a processing unit;
an acquisition unit, configured to acquire a region of interest of an original image;
a processing unit, configured to match the region of interest with a plurality of candidate texture images in a texture image library respectively, to obtain a matching degree of each candidate texture image;
determine the candidate texture image with the highest matching degree as a target texture image;
perform edge detection on the region of interest to obtain an edge line;
acquire, from the target texture image, a texture line corresponding to the edge line according to a position of the edge line in the original image;
and replace the edge line with the texture line to render the edge line, so as to obtain a target image.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory, the processor being connected to the memory, the memory being for storing a computer program, the processor being for executing the computer program stored in the memory to cause the electronic device to perform the method as in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, the computer program causing a computer to perform the method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, including a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method as in the first aspect.
The implementation of the embodiments of this application has the following beneficial effects: a region of interest of an original image is acquired; the region of interest is matched with a plurality of candidate texture images in a texture image library respectively to obtain a matching degree of each candidate texture image; the candidate texture image with the highest matching degree is determined as a target texture image; edge detection is performed on the region of interest to obtain an edge line; a texture line corresponding to the edge line is acquired from the target texture image according to the position of the edge line in the original image; and the edge line is replaced with the texture line to render the edge line and obtain a target image. Because the texture image with the highest matching degree is recommended as the target texture image for the region of interest, the matching precision between the target texture image and the region of interest of the original image is improved, so a better line rendering effect can be presented. In addition, the texture line completely replaces the edge line in one pass, which speeds up line rendering and improves the user experience.
Drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Apparently, the drawings in the following description show only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a schematic view of a scene rendered by lines according to an embodiment of the present application;
fig. 2 is a flow chart of a line rendering method according to an embodiment of the present application;
FIG. 3a is a schematic diagram of a plurality of image pairs according to an embodiment of the present application;
FIG. 3b is a schematic diagram of constructing a positive sample image pair and a plurality of negative sample image pairs from a historical region of interest and a plurality of historical texture images of a user according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a texture preference matrix according to an embodiment of the present application;
fig. 5 is a schematic diagram of edge detection on a region of interest to obtain edge lines according to an embodiment of the present application;
fig. 6 is a schematic diagram of obtaining a texture line corresponding to an edge line from a target texture image according to the position of the edge line in an original image according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a first line obtained by bi-directionally expanding an edge line according to an embodiment of the present application;
FIG. 8 is a schematic diagram of acquiring a first texture line corresponding to a first line from a target texture image according to the position of the first line in an original image according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a second line obtained by unidirectional expansion of an edge line according to an embodiment of the present application;
fig. 10 is a schematic diagram of acquiring a second texture line corresponding to the second line from a target texture image according to the position of the second line in an original image according to an embodiment of the present application;
FIG. 11 is a schematic diagram of replacing edge lines in an original image with first texture lines according to an embodiment of the present application;
fig. 12 is a schematic diagram of a target image obtained by overlapping a region of interest in an original image to a corresponding position of the region of interest in a first image according to an embodiment of the present application;
fig. 13 is a block diagram of the functional units of a line rendering apparatus according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims and drawings are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of this application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are they separate or alternative embodiments mutually exclusive with other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic diagram of a line rendering scene according to an embodiment of this application. In this scene, a line rendering device performs line rendering on an image according to the line rendering method of this application.
The line rendering device is provided with image or video editing software, in which a target function option corresponding to the line rendering method may be set. When the user touches the target function option, line rendering can be performed on an image to obtain a target image. For example, as shown in fig. 1, when the user touches the "rendering function" option on the line rendering device, line rendering is performed on the region of interest in the original image by using the line rendering method of this application, so as to obtain the target image.
Specifically, a region of interest of an original image is acquired first; then the region of interest is matched with a plurality of candidate texture images in a texture image library respectively to obtain a matching degree of each candidate texture image, and the candidate texture image with the highest matching degree is determined as a target texture image; next, edge detection is performed on the region of interest to obtain an edge line; then a texture line corresponding to the edge line is acquired from the target texture image according to the position of the edge line in the original image; and finally, the edge line is replaced with the texture line to render the edge line, so as to obtain a target image.
It can be seen that, in the embodiment of this application, a region of interest of an original image is acquired; the region of interest is matched with a plurality of candidate texture images in a texture image library respectively to obtain a matching degree of each candidate texture image; the candidate texture image with the highest matching degree is determined as a target texture image; edge detection is performed on the region of interest to obtain an edge line; a texture line corresponding to the edge line is acquired from the target texture image according to the position of the edge line in the original image; and the edge line is replaced with the texture line to render it, so as to obtain a target image. Because the texture image with the highest matching degree is recommended as the target texture image for the region of interest, the matching precision between the target texture image and the region of interest of the original image is improved, and a better line rendering effect can be presented. Moreover, the texture line completely replaces the edge line in one pass, which speeds up line rendering and improves the user experience.
Referring to fig. 2, fig. 2 is a flow chart of a line rendering method according to an embodiment of the present application, where the method is applied to a line rendering device, and the method includes, but is not limited to, steps 201 to 206:
201: a region of interest of the original image is acquired.
In the embodiment of the present application, the original image may be a single frame image, or may be a certain frame image in the video, which is not limited herein. In addition, the original image may be a single frame image locally stored by the line rendering device or a certain frame image in a locally stored video, or may be a single frame image or a certain frame image in a video obtained from another device, which is not limited herein.
In addition, the line rendering device is provided with image or video editing software, and the image or video editing software can be provided with target function options which correspond to the line rendering method. When the user touches the target function option, line rendering can be performed on the image, and a target image is obtained.
For example, when a user wants to perform line rendering on an original image, the line rendering device first loads the original image from locally stored images, and then performs line rendering on the original image upon receiving the user's touch command for the target function option, so as to obtain a target image. In the process of line rendering, the region of interest of the original image can be obtained by performing target detection on the original image, for example, with a recurrent convolutional neural network. Alternatively, a connected region of the original image can be identified and determined as the region of interest. This application does not limit how the region of interest of the original image is acquired.
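As an illustrative sketch of the connected-region variant described above (assuming OpenCV and NumPy; the Otsu thresholding step, the function name, and the min_area parameter are assumptions for illustration, not specified in this application):

```python
import cv2
import numpy as np

def candidate_regions(original_bgr, min_area=500):
    """Return bounding boxes of connected foreground regions as candidate
    regions of interest."""
    gray = cv2.cvtColor(original_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding separates foreground from background (an assumed
    # segmentation choice; the application does not fix one).
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    boxes = []
    for i in range(1, n):                      # label 0 is the background
        x, y, w, h, area = stats[i]
        if area >= min_area:                   # ignore tiny speckle regions
            boxes.append((x, y, w, h, area))
    return boxes
```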
In an optional embodiment of this application, acquiring the region of interest of the original image may specifically include: acquiring one or more candidate regions of interest in the original image; when there are a plurality of candidate regions of interest, selecting one candidate region of interest as the region of interest according to the area of each candidate region of interest in the original image, or taking each candidate region of interest as a region of interest simultaneously; and when there is one candidate region of interest, taking that candidate region of interest as the region of interest.
The method for acquiring the one or more candidate regions of interest in the original image is similar to the method for acquiring the region of interest described above, and is not repeated here. When a plurality of candidate regions of interest are obtained, each candidate region of interest can be taken as a region of interest simultaneously; that is, there are a plurality of regions of interest, and their edge lines can be rendered in parallel, i.e., at the same time, which speeds up line rendering and further improves the user experience. Alternatively, when a plurality of candidate regions of interest are obtained, one candidate region of interest may be selected as the region of interest according to the area of each candidate region of interest in the original image. That is, when there are a plurality of regions of interest, their edge lines may be rendered serially according to the areas of the regions of interest (i.e., the areas of the corresponding candidate regions of interest), one after another. For example, the edge lines of the plurality of regions of interest may be rendered in descending order of area, or in ascending order of area.
When one candidate region of interest is obtained, that candidate region of interest is taken as the region of interest, and its edge line is then rendered to obtain the target image.
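A minimal sketch of the area-based handling of multiple candidate regions (the ordering options follow the passage above; the helper name is illustrative):

```python
def order_rois_by_area(boxes, descending=True):
    """Sort candidate regions of interest by area so that their edge lines
    can be rendered serially, largest first (or smallest first)."""
    return sorted(boxes, key=lambda box: box[4], reverse=descending)

# In the parallel mode, each candidate region is simply treated as a region
# of interest and all of them are rendered at the same time.
```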
202: matching the region of interest with a plurality of candidate texture images in the texture image library respectively, to obtain a matching degree of each candidate texture image.
In an alternative embodiment of the present application, step 202 may specifically include steps A1-A3:
A1: taking the region of interest and each candidate texture image as an image pair to obtain a plurality of image pairs.
In the embodiment of this application, the plurality of image pairs correspond one-to-one to the plurality of candidate texture images. Referring to fig. 3a, fig. 3a is a schematic diagram of a plurality of image pairs according to an embodiment of this application. If the number of candidate texture images in the texture image library is m, the region of interest is paired with each of the m candidate texture images shown in fig. 3a, yielding the m image pairs shown in fig. 3a.
A2: inputting each image pair into a matching model to obtain the matching degree of the candidate texture image in each image pair.
A3: obtaining the matching degree of each candidate texture image according to the matching degree of the candidate texture image in its image pair.
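A sketch of steps A1-A3, with the trained matching model abstracted as a callable that scores one (region of interest, texture) pair; `match_model` stands in for the model obtained in steps B1-B3 below and is an assumption of this sketch:

```python
def rank_textures(roi, candidate_textures, match_model):
    """A1-A3: pair the region of interest with every candidate texture image
    and score each pair with the matching model."""
    pairs = [(roi, tex) for tex in candidate_textures]   # one pair per candidate
    scores = [match_model(r, t) for r, t in pairs]       # matching degrees
    best = max(range(len(scores)), key=scores.__getitem__)
    return scores, candidate_textures[best]              # step 203 uses the best
```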
In an alternative embodiment of the present application, the above-mentioned matching model is obtained according to steps B1-B3 before each of the plurality of image pairs is input into the matching model:
B1: a historical region of interest in a historical sample image of each user is acquired.
The method for acquiring the historical region of interest in the historical sample image of each user is similar to the method for acquiring the region of interest of the original image in step 201, and will not be repeated here.
B2: a positive sample image pair and a plurality of negative sample image pairs are constructed from each user's historical region of interest and a plurality of historical texture images in a texture image library.
Wherein the sum of the number of positive sample image pairs and the number of negative sample image pairs is equal to the number of the plurality of historical texture images. For example, if the number of the plurality of historical texture images is n, the number of positive sample image pairs is 1, and the number of negative sample image pairs is p, then n = p + 1.
It should be noted that the line rendering device locally stores historical data of a plurality of users, where each user's historical data includes the historical region of interest in that user's historical sample image and a plurality of historical texture images. The plurality of historical texture images include: a first historical texture image that was used to render the edge line of the historical region of interest, and a plurality of second historical texture images other than the first historical texture image, i.e., images that were not used to render that edge line. Of course, the historical data of the plurality of users may also be acquired from other devices. Therefore, the line rendering device can construct a positive sample image pair and a plurality of negative sample image pairs for each user from locally stored historical data or historical data acquired from other devices.
Thus, in an alternative embodiment of the present application, step B2 may specifically include the steps of: taking a historical region of interest of each user and a first historical texture image as a positive sample image pair, wherein the first historical texture image is an image used for rendering edge lines of the historical region of interest in a plurality of historical texture images; and taking the historical interested region of each user and each second historical texture image in the plurality of second historical texture images as a negative sample image pair to obtain a plurality of negative sample image pairs, wherein the plurality of second historical texture images are a plurality of images except the first historical texture image in the plurality of historical texture images.
Referring to fig. 3b, fig. 3b is a schematic diagram of constructing a positive sample image pair and a plurality of negative sample image pairs from a user's historical region of interest and a plurality of historical texture images according to an embodiment of this application. As shown in fig. 3b, the number of historical texture images is K, the number of positive sample image pairs is 1, and the number of negative sample image pairs is (K-1). The historical region of interest of each user and the first historical texture image form the positive sample image pair; that is, if the first historical texture image is the Q-th of the K historical texture images, i.e., the Q-th historical texture image was used to render the edge line of the historical region of interest, then the historical region of interest and the Q-th historical texture image are constructed as the positive sample image pair, as shown in fig. 3b.
The historical region of interest of each user and each of the plurality of second historical texture images form a negative sample image pair, yielding a plurality of negative sample image pairs. In fig. 3b, the plurality of second historical texture images are the (K-1) historical texture images other than the Q-th, which were not used to render the edge line of the historical region of interest; therefore, (K-1) negative sample image pairs are constructed from the historical region of interest and those (K-1) historical texture images.
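A minimal sketch of step B2, assuming the index of the historical texture image actually used (the first historical texture image) is known:

```python
def build_sample_pairs(history_roi, history_textures, used_index):
    """B2: one positive pair (the texture that was used to render the edge
    line) and K-1 negative pairs (all other historical textures)."""
    positive = (history_roi, history_textures[used_index], 1.0)
    negatives = [(history_roi, tex, 0.0)
                 for j, tex in enumerate(history_textures) if j != used_index]
    return [positive] + negatives        # K pairs in total, matching n = p + 1
```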
B3: the initial model is trained based on a positive sample image pair and a plurality of negative sample image pairs for each user to obtain a matching model.
In the embodiment of this application, there are a plurality of users, and the initial model is trained on the positive sample image pair and the plurality of negative sample image pairs of each user to obtain the matching model. Moreover, each time a user completes a line rendering, that user's resulting positive sample image pair and negative sample image pairs are fed into the matching model for further training and learning, which enlarges the data volume and further improves the validity of the matching degrees produced by the model.
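A minimal training sketch for step B3, assuming PyTorch, fixed-size single-channel inputs, and a small fully connected scorer; the application does not specify the model architecture or loss, so these are illustrative choices:

```python
import torch
import torch.nn as nn

class PairScorer(nn.Module):
    """Scores an (ROI, texture) pair; the architecture is an assumption."""
    def __init__(self, side=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * side * side, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, roi, tex):
        # Flatten both images of the pair and score them jointly.
        x = torch.cat([roi.flatten(1), tex.flatten(1)], dim=1)
        return self.net(x).squeeze(1)        # logit of the matching degree

def train_step(model, optimizer, roi, tex, label):
    """One update on a batch of sample pairs (label 1 = positive, 0 = negative)."""
    optimizer.zero_grad()
    loss = nn.functional.binary_cross_entropy_with_logits(model(roi, tex), label)
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage: model = PairScorer(); optimizer = torch.optim.Adam(model.parameters()).
# New pairs produced after each completed rendering can be fed through
# train_step to keep updating the matching model, as described above.
```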
In an alternative embodiment of the present application, step 202 may further specifically include steps C1-C5:
C1: averaging the pixel values of all the first pixel points in the region of interest to obtain a first pixel mean.
In the embodiment of the present application, the pixel values of all the first pixel points in the region of interest in the original image are obtained. For example, in the three-primary-color (Red Green Blue, RGB) color mode, the pixel value of the i-th first pixel point is $(X_{1Ri}, Y_{1Gi}, Z_{1Bi})$. If the number of the first pixel points is $s$, the first pixel mean is $(\bar{X}_{1R}, \bar{Y}_{1G}, \bar{Z}_{1B})$, where $\bar{X}_{1R}$ can be obtained by formula (1), $\bar{Y}_{1G}$ by formula (2), and $\bar{Z}_{1B}$ by formula (3):

$$\bar{X}_{1R} = \frac{1}{s}\sum_{i=1}^{s} X_{1Ri} \qquad (1)$$

$$\bar{Y}_{1G} = \frac{1}{s}\sum_{i=1}^{s} Y_{1Gi} \qquad (2)$$

$$\bar{Z}_{1B} = \frac{1}{s}\sum_{i=1}^{s} Z_{1Bi} \qquad (3)$$
C2: averaging the pixel values of all the second pixel points in the first candidate texture image to obtain a second pixel mean.
The first candidate texture image is any one of the plurality of candidate texture images. The method for calculating the second pixel mean is similar to that for the first pixel mean and is not repeated here; for example, the second pixel mean may be expressed as $(\bar{X}_{2R}, \bar{Y}_{2G}, \bar{Z}_{2B})$.
C3: obtaining the similarity according to the first pixel mean and the second pixel mean.
The similarity is used to characterize the degree of color similarity between the region of interest and the first candidate texture image. For example, the similarity may be obtained by calculating the Euclidean distance between the first pixel mean and the second pixel mean; a smaller Euclidean distance indicates that the colors of the region of interest and the first candidate texture image are closer.
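Formulas (1)-(3) and the Euclidean distance of step C3 as a NumPy sketch (the function names are illustrative):

```python
import numpy as np

def pixel_mean(region_rgb):
    """Formulas (1)-(3): per-channel mean over all pixels of a region,
    i.e. the first pixel mean for the region of interest."""
    return region_rgb.reshape(-1, 3).mean(axis=0)

def color_distance(roi_rgb, texture_rgb):
    """Step C3: Euclidean distance between the first and second pixel means;
    a smaller distance means the colors are closer."""
    return float(np.linalg.norm(pixel_mean(roi_rgb) - pixel_mean(texture_rgb)))
```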
C4: acquiring the usage heat of each candidate texture image.
The usage heat represents the number of times each candidate texture image has been used; a larger usage count indicates a higher heat, i.e., that candidate texture image is more favored by users.
C5: obtaining the matching degree of each candidate texture image according to the first distance (i.e., the Euclidean distance obtained in step C3) and the usage heat corresponding to each candidate texture image.
It should be noted that, for the first distance, a plurality of distance intervals may be predefined, where each interval corresponds to a first value; a smaller first value indicates a smaller first distance, which in turn indicates that the colors of the region of interest and the candidate texture image are more similar. For the usage heat, a plurality of heat intervals may likewise be predefined, where each interval corresponds to a second value; a larger second value indicates a higher usage heat. In practical applications, to ensure the validity of the matching result, the first value and the second value should lie in the same range; for example, if the first value ranges from 1 to 10, the second value should also range from 1 to 10. The product of the first value and the second value corresponding to each candidate texture image is then taken as the matching degree of that candidate texture image, and the target texture image is determined based on these matching degrees, which improves the matching precision between the target texture image and the region of interest.
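A sketch of step C5's interval scheme; the application fixes only the structure (predefined intervals, values in a shared range, matching degree = first value x second value), so the interval boundaries below are illustrative assumptions:

```python
import bisect

DIST_EDGES = [10, 25, 50, 100]  # smaller distance -> earlier interval -> smaller first value
HEAT_EDGES = [5, 20, 50, 150]   # more uses -> later interval -> larger second value

def interval_value(x, edges):
    """Map a quantity to the value of the interval it falls in (1..len(edges)+1)."""
    return bisect.bisect_left(edges, x) + 1

def matching_degree(color_dist, use_count):
    first = interval_value(color_dist, DIST_EDGES)   # first value
    second = interval_value(use_count, HEAT_EDGES)   # second value
    return first * second   # the application takes the product of the two values
```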
In an optional embodiment of this application, after the matching degree of each candidate texture image is obtained according to the first distance and the usage heat, the difference between the size of each candidate texture image and the size of the region of interest may be calculated to obtain a size difference corresponding to each candidate texture image; a first matching degree of each candidate texture image is obtained according to its matching degree and size difference; and the first matching degree is then taken as the matching degree of that candidate texture image again.
It should be noted that the size difference may be negative, positive, or zero: a negative difference indicates that the candidate texture image is smaller than the region of interest; a positive difference indicates that it is larger; and a zero difference indicates that the sizes are equal. To ensure that the texture line corresponding to the edge line can be obtained from the target texture image, the size of the target texture image should be greater than or equal to that of the region of interest. Of course, to present better texture lines, the target texture image should not be much larger than the region of interest either, so the size difference should fall within a suitable threshold interval. A plurality of threshold intervals may therefore be predefined, each representing a value, where a non-negative and larger value indicates a better size match. For example, given threshold intervals A, B, C, D, and E representing the values -1, 1, 5, 2, and 1 respectively, a size difference falling in interval C indicates that the size of the candidate texture image best matches that of the region of interest. The product of the value of the interval in which each size difference falls and the matching degree of the corresponding candidate texture image is then taken as the first matching degree of that candidate texture image. In this way, while the target texture image has a high usage heat and its color matches the region of interest, its size is also guaranteed to match, which further improves the matching precision between the target texture image and the region of interest.
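A sketch of the size-difference refinement; the five values -1, 1, 5, 2, 1 follow the example above, while the interval boundaries are illustrative assumptions:

```python
# Threshold intervals A-E for the size difference (candidate size minus ROI size).
SIZE_INTERVALS = [(float("-inf"), 0, -1),   # candidate smaller than the ROI
                  (0, 100, 1),
                  (100, 400, 5),            # interval C: best size match
                  (400, 900, 2),
                  (900, float("inf"), 1)]

def first_matching_degree(degree, size_diff):
    """Multiply the matching degree by the value of the threshold interval
    in which the size difference falls."""
    for lo, hi, value in SIZE_INTERVALS:
        if lo <= size_diff < hi:
            return degree * value
    return degree
```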
In an alternative embodiment of this application, after the matching degree of each candidate texture image is obtained, a plurality of texture images historically used by the user may also be acquired; feature extraction is performed on each of these texture images to obtain a plurality of texture features; texture preference information of the user is obtained according to the plurality of texture features of each texture image; a second matching degree of each candidate texture image is obtained according to the texture preference information and the matching degree of each candidate texture image; and the second matching degree of each candidate texture image is taken as the matching degree of that candidate texture image again.
Illustratively, a plurality of texture images used by the user are acquired, and feature extraction is performed on each to obtain a plurality of texture features, such as texture uniformity, shape complexity, texture style, and texture color; a texture preference matrix of the user is then constructed based on the plurality of texture features of each texture image. The extracted texture uniformity, shape complexity, texture style, and texture color may each be represented by a value between 1 and 10. For texture uniformity, a larger value indicates a more uniform texture distribution. For shape complexity, a larger value indicates a more complex texture shape; for example, a leaf-shaped texture is more complex than a rectangular one. For texture style, each value or value range may be assigned a style such as minimalist, fresh, technological, Chinese, or business. For texture color, a larger value indicates a darker color and a smaller value a lighter color. Referring to fig. 4, fig. 4 is a schematic diagram of a texture preference matrix according to an embodiment of this application; the matrix is constructed by extracting the texture features of 4 texture images of user 1 (texture image 1, texture image 2, texture image 3, and texture image 4) to obtain the value of each texture feature of each image. Preference information of the user is then obtained from the matrix by summing and averaging the values of all texture images under each texture feature, yielding user 1's average under that feature: for example, the average texture uniformity is (8.5+9.5+9.6+9.2)/4, the average texture style is (2.2+2.5+2.9+3.0)/4, the average texture color is (3.2+3.5+3.7+3.8)/4, and the average shape complexity is (8.9+9.3+9.6+9.2)/4. From these averages, the texture preference information of user 1 can be obtained: uniformly distributed textures, a fresh texture style, light texture colors, and complex texture shapes.
Further, the value of each texture feature of each candidate texture image is obtained, the difference between that value and user 1's average for the same feature is calculated, and the product of the reciprocal of the difference corresponding to each candidate texture image and its matching degree is computed, yielding a second matching degree for each candidate texture image; the second matching degree is then taken as the matching degree of that candidate texture image again. By combining the user's texture preference information with the matching degree of each candidate texture image to compute the second matching degree, and recommending the target texture image accordingly, the matching precision between the target texture image and the region of interest of the original image is further improved, as is the user experience.
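A sketch of the texture-preference step, using the feature values of fig. 4; aggregating the per-feature differences into a single sum before taking the reciprocal is one reading of the passage above, and the epsilon guard is an added assumption:

```python
import numpy as np

# Rows: historically used texture images 1-4; columns: texture uniformity,
# texture style, texture color, shape complexity (values from fig. 4).
preference_matrix = np.array([[8.5, 2.2, 3.2, 8.9],
                              [9.5, 2.5, 3.5, 9.3],
                              [9.6, 2.9, 3.7, 9.6],
                              [9.2, 3.0, 3.8, 9.2]])
user_pref = preference_matrix.mean(axis=0)   # user 1's average per feature

def second_matching_degree(candidate_features, degree, eps=1e-6):
    """Scale the matching degree by the reciprocal of the difference between
    the candidate's feature values and the user's preference averages."""
    diff = float(np.abs(np.asarray(candidate_features) - user_pref).sum())
    return degree / (diff + eps)             # eps guards a zero difference
```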
203: determining the candidate texture image with the highest matching degree as the target texture image.
After the matching degree of each candidate texture image is obtained, the candidate texture image with the highest matching degree can be used as a target texture image, so that the matching precision of the target texture image and the original image is improved, and the experience of a user is improved.
In an alternative embodiment, after the target texture image is obtained, it may be scaled to obtain a first image whose size is the same as that of the original image, and the first image is then re-determined as the target texture image. This ensures that the texture line corresponding to the edge line can be obtained quickly in the target texture image, which speeds up line rendering and reduces its complexity.
Of course, in practical applications, the user may still be dissatisfied with the recommended target texture image. Therefore, in an alternative embodiment, when a rejection response of the user for the target texture image is received, a first texture image uploaded by the user is received, and the first texture image is determined as the target texture image, which meets the user's requirement and improves the user experience.
204: performing edge detection on the region of interest to obtain an edge line.
Referring to fig. 5, fig. 5 is a schematic diagram of performing edge detection on the region of interest to obtain an edge line according to an embodiment of this application. Edge detection is performed on the region of interest in fig. 5 to obtain the edge line of the region of interest.
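The application does not name an edge detector; a Canny-based sketch (the detector choice and thresholds are assumptions):

```python
import cv2

def edge_line_mask(roi_bgr):
    """Step 204: edge detection on the region of interest; returns a binary
    mask whose nonzero pixels form the edge line."""
    gray = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Canny(gray, 100, 200)         # thresholds are illustrative
```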
205: acquiring a texture line corresponding to the edge line from the target texture image according to the position of the edge line in the original image.
Specifically, a first image coordinate system is established in the original image and a second image coordinate system is established in the target texture image. Then, according to the position of the edge line in the original image, i.e., the first position of the edge line in the first image coordinate system, the corresponding second position is found in the second image coordinate system, and the texture line at the second position is taken as the texture line corresponding to the edge line.
For example, referring to fig. 6, fig. 6 is a schematic diagram of acquiring the texture line corresponding to the edge line from the target texture image according to the position of the edge line in the original image. In fig. 6, a first image coordinate system is established in the original image and a second image coordinate system is established in the target texture image. The coordinates of all pixels on the edge line are found in the original image; note that only three pixels A(u1, v1), B(u2, v2), and C(u3, v3) are shown in the first image coordinate system in fig. 6 for illustration. The corresponding coordinates of all pixels on the edge line are then found in the second image coordinate system in the target texture image; again, only A'(u1, v1), B'(u2, v2), and C'(u3, v3) are shown in fig. 6, where A(u1, v1) and A'(u1, v1), B(u2, v2) and B'(u2, v2), and C(u3, v3) and C'(u3, v3) are corresponding pixels.
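Because the target texture image has been scaled to the original image's size (the optional step after 203), corresponding pixels share the same (u, v) coordinates, so sampling the texture on the edge mask yields the texture line. A minimal sketch:

```python
import numpy as np

def texture_line_pixels(edge_mask, target_texture):
    """Step 205: the pixel at (u, v) on the edge line corresponds to the pixel
    at (u, v) in the target texture image, as with A/A', B/B', C/C' in fig. 6."""
    ys, xs = np.nonzero(edge_mask)
    return ys, xs, target_texture[ys, xs]    # coordinates and texture colors
```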
In an alternative embodiment, the edge line may be expanded in two directions to obtain a first line. For example, based on the edge line shown in fig. 5, referring to fig. 7, fig. 7 is a schematic diagram of the first line obtained by bidirectionally expanding the edge line: each pixel point on the edge line is expanded by multiple pixel points in both directions, i.e., multiple pixel points to the left and multiple pixel points to the right, to obtain the first line. Then, according to the position of the first line in the original image, a first texture line corresponding to the first line is acquired from the target texture image. For example, based on the first line shown in fig. 7, referring to fig. 8, fig. 8 is a schematic diagram of acquiring the first texture line corresponding to the first line from the target texture image according to the position of the first line in the original image: a first image coordinate system is established in the original image and a second image coordinate system in the target texture image; the position corresponding to the first line is found in the target texture image according to the position of the first line in the original image; and the texture line at that position is taken as the first texture line. Bidirectionally expanding the edge line makes the texture shape in the obtained first texture line clearer and more complete, enhancing the visual aesthetics. The method for acquiring the first texture line is similar to the method for acquiring the texture line corresponding to the edge line in step 205 and is not repeated here.
In an alternative embodiment, the edge line may instead be expanded in one direction to obtain a second line; for example, all pixel points on the edge line are expanded by multiple pixel points to the left. Based on the edge line shown in fig. 5, referring to fig. 9, fig. 9 is a schematic diagram of the second line obtained by unidirectionally expanding the edge line: the pixel points on the edge line are expanded to the left to obtain the second line. Then, according to the position of the second line in the original image, a second texture line corresponding to the second line is acquired from the target texture image. Based on the second line shown in fig. 9, referring to fig. 10, fig. 10 is a schematic diagram of acquiring the second texture line corresponding to the second line from the target texture image according to the position of the second line in the original image: a first image coordinate system is established in the original image and a second image coordinate system in the target texture image; the position corresponding to the second line is found in the target texture image according to the position of the second line in the original image; and the texture line at that position is taken as the second texture line. Unidirectionally expanding the edge line makes the texture shape in the obtained second texture line clearer and more complete, enhancing the visual aesthetics. The method for acquiring the second texture line is similar to the method in step 205 and is not repeated here.
In an alternative embodiment, the edge line may also be expanded bidirectionally to obtain a third line; then all first pixel points on the third line that correspond to the edge line, and all second pixel points on the third line to the left of those first pixel points, are retained to obtain a fourth line. That is, the third line can be understood as being cropped, which prevents the third texture line corresponding to the third line in the target texture image from occluding the region of interest. A third texture line corresponding to the fourth line is then acquired from the target texture image according to the position of the fourth line in the original image. The method for acquiring the third texture line is similar to the method in step 205 and is not repeated here.
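A sketch of the two expansion modes, treating "left"/"right" as the negative/positive x direction as in figs. 7 and 9; the kernel width and the use of morphological dilation are illustrative assumptions:

```python
import cv2
import numpy as np

def expand_both(edge_mask, width=3):
    """Bidirectional expansion (first line): each edge pixel also turns on
    `width` pixels to its left and to its right."""
    kernel = np.ones((1, 2 * width + 1), np.uint8)
    return cv2.dilate(edge_mask, kernel)

def expand_left(edge_mask, width=3):
    """Unidirectional expansion (second line): the mask grows leftwards only."""
    out = edge_mask.copy()
    for k in range(1, width + 1):
        out[:, :-k] |= edge_mask[:, k:]      # OR in the mask shifted left by k
    return out
```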
206: replacing the edge line with the texture line to render the edge line, so as to obtain a target image.
By using the texture line to completely replace the edge line in one pass, the target image is obtained, which speeds up line rendering and improves the user experience.
In an alternative embodiment, when a first line is obtained by bidirectionally expanding the edge line and a first texture line corresponding to the first line is acquired from the target texture image, the edge line is completely replaced in one pass with the first texture line to obtain a first image. Referring to fig. 11, based on the first texture line shown in fig. 8, fig. 11 is a schematic diagram of replacing the edge line in the original image with the first texture line: the edge line of the region of interest in the original image is completely replaced in one pass with the first texture line to obtain the first image. Then, the region of interest in the original image is superimposed at the position corresponding to the region of interest in the first image to obtain the target image. Referring to fig. 12, fig. 12 is a schematic diagram of the target image obtained by superimposing the region of interest in the original image at the corresponding position in the first image according to an embodiment of this application. Replacing the edge line completely in one pass with the first texture line speeds up line rendering, and superimposing the region of interest onto the first image prevents the first texture line from occluding the region of interest, presenting a better line rendering effect.
In an alternative embodiment, when the first line is obtained by bidirectionally expanding the edge line and the first texture line corresponding to the first line is acquired from the target texture image, all third pixel points on the first texture line that correspond to the edge line, and all fourth pixel points on the first texture line to the left of those third pixel points, are retained to obtain a fourth texture line. That is, the first texture line can be understood as being cropped, which prevents it from occluding the region of interest. The edge line is then replaced with the fourth texture line to render the edge line and obtain the target image. Replacing the edge line completely in one pass with the fourth texture line speeds up line rendering, reduces its complexity, and improves the user experience.
In an alternative embodiment, when the edge line is expanded bidirectionally to obtain a third line, a fourth line is obtained by retaining all first pixel points on the third line that correspond to the edge line and all second pixel points to their left, and a third texture line corresponding to the fourth line is acquired from the target texture image, the edge line is replaced with the third texture line to render the edge line and obtain the target image. Replacing the edge line completely in one pass with the third texture line speeds up line rendering, reduces its complexity, and improves the user experience.
In an alternative embodiment, when a second line is obtained by unidirectionally expanding the edge line and a second texture line corresponding to the second line is acquired from the target texture image, the edge line is replaced with the second texture line to render the edge line and obtain the target image. Replacing the edge line completely in one pass with the second texture line speeds up line rendering, reduces its complexity, and improves the user experience.
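A compositing sketch of step 206 and the fig. 11/12 variant, assuming the original image, the scaled target texture image, the (expanded) line mask, and a region-of-interest mask all share the original image's size:

```python
import numpy as np

def render_target_image(original, target_texture, line_mask, roi_mask):
    """Step 206: completely replace the edge line with the texture line in one
    pass, then superimpose the region of interest so the texture line cannot
    occlude it (figs. 11 and 12)."""
    first_image = original.copy()
    first_image[line_mask > 0] = target_texture[line_mask > 0]  # one-pass replace
    first_image[roi_mask > 0] = original[roi_mask > 0]          # overlay the ROI
    return first_image
```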
It can be seen that, in the embodiment of this application, a region of interest of an original image is acquired; the region of interest is matched with a plurality of candidate texture images in a texture image library respectively to obtain a matching degree of each candidate texture image; the candidate texture image with the highest matching degree is determined as the target texture image, which improves the matching precision between the target texture image and the region of interest of the original image and presents a better line rendering effect; edge detection is performed on the region of interest to obtain an edge line; a texture line corresponding to the edge line is acquired from the target texture image according to the position of the edge line in the original image; and the edge line is replaced with the texture line to render it and obtain a target image. The texture line completely replaces the edge line in one pass, which speeds up line rendering, reduces its complexity, and further improves the user experience.
Referring to fig. 13, fig. 13 is a functional unit block diagram of a line rendering device according to an embodiment of the present application. The line rendering apparatus 1300 includes: an acquisition unit 1301 and a processing unit 1302;
an acquisition unit 1301, configured to acquire a region of interest of an original image;
a processing unit 1302, configured to match the region of interest with a plurality of candidate texture images in the texture image library respectively, to obtain a matching degree of each candidate texture image;
determine the candidate texture image with the highest matching degree as a target texture image;
perform edge detection on the region of interest to obtain an edge line;
acquire, from the target texture image, a texture line corresponding to the edge line according to the position of the edge line in the original image;
and replace the edge line with the texture line to render the edge line, so as to obtain a target image.
In one embodiment of the present application, the processing unit 1302 is specifically configured to, in matching the region of interest with a plurality of candidate texture images in the texture image library, respectively, to obtain a matching degree of each candidate texture image:
taking the region of interest and each candidate texture image as an image pair to obtain a plurality of image pairs;
Inputting each image pair into a matching model to obtain the matching degree of the candidate texture image in each image pair;
and obtaining the matching degree of each candidate texture image according to the matching degree of the candidate texture image in each image pair.
In one embodiment of the present application, the processing unit 1302 is further specifically configured to:
acquiring a historical region of interest in a historical sample image of each user;
constructing a positive sample image pair and a plurality of negative sample image pairs according to the historical interested region of each user and a plurality of historical texture images in a texture image library;
the initial model is trained based on a positive sample image pair and a plurality of negative sample image pairs for each user to obtain a matching model.
In one embodiment of the present application, the processing unit 1302 is specifically configured to, in terms of constructing a positive sample image pair and a plurality of negative sample image pairs from the historical region of interest of each user and the plurality of historical texture images in the texture image library:
taking a historical region of interest of each user and a first historical texture image as a positive sample image pair, wherein the first historical texture image is an image used for rendering edge lines of the historical region of interest in a plurality of historical texture images;
And taking the historical interested region of each user and each second historical texture image in the plurality of second historical texture images as a negative sample image pair to obtain a plurality of negative sample image pairs, wherein the plurality of second historical texture images are a plurality of images except the first historical texture image in the plurality of historical texture images.
In one embodiment of the present application, the processing unit 1302 is specifically configured to, in matching the region of interest with a plurality of candidate texture images in the texture image library, respectively, to obtain a matching degree of each candidate texture image:
averaging the pixel values of all the first pixel points in the region of interest to obtain a first pixel average value;
averaging pixel values of all second pixel points in the first candidate texture image to obtain a second pixel mean value, wherein the first candidate texture image is any one of a plurality of candidate texture images;
obtaining similarity according to the first pixel mean value and the second pixel mean value, wherein the similarity is used for representing the color similarity degree between the region of interest and the first candidate texture image;
acquiring the usage heat of each candidate texture image;
and obtaining the matching degree of each candidate texture image according to the similarity and the usage heat corresponding to each candidate texture image.
In one embodiment of the present application, after the matching degree of each candidate texture image is obtained according to the first distance and the usage heat corresponding to each candidate texture image, the processing unit 1302 is specifically configured to:
calculating the difference between the size of each candidate texture image and the size of the region of interest to obtain a size difference value corresponding to each candidate texture image;
obtaining a first matching degree of each candidate texture image according to the matching degree and the size difference value of each candidate texture image;
and re-using the first matching degree of each candidate texture image as the matching degree of each candidate texture image.
In one embodiment of the present application, after obtaining the matching degree of each candidate texture image, the processing unit 1302 is specifically configured to:
acquiring a plurality of texture images historically used by the user;
extracting features from each of the plurality of texture images to obtain a plurality of texture features;
obtaining texture preference information of the user according to the texture features of each texture image;
obtaining a second matching degree of each candidate texture image according to the texture preference information and the matching degree of each candidate texture image;
and taking the second matching degree of each candidate texture image as the updated matching degree of each candidate texture image; a sketch of this preference re-weighting follows.
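The following sketch assumes feature vectors from some extractor are already available; treating the mean of the historical texture features as the texture preference information and boosting by cosine similarity are assumptions made for illustration:

```python
import numpy as np

def preference_adjusted_score(match_score, candidate_feature, history_features):
    # Texture preference information: mean of the user's historical texture features.
    preference = np.mean(history_features, axis=0)
    # Cosine similarity between the candidate's features and the preference.
    cos = float(np.dot(candidate_feature, preference) /
                (np.linalg.norm(candidate_feature) * np.linalg.norm(preference) + 1e-8))
    # Second matching degree: boost candidates that match the user's preference.
    return match_score * (1.0 + max(cos, 0.0))
```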
Referring to fig. 14, fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 14, the electronic device 1400 includes a transceiver 1401, a processor 1402, and a memory 1403, which are connected by a bus 1404. The memory 1403 is used for storing computer programs and data, and the data stored in the memory 1403 can be transferred to the processor 1402.
The processor 1402 is configured to read the computer program in the memory 1403 to perform the following operations:
controlling the transceiver 1401 to acquire a region of interest of an original image;
matching the region of interest with a plurality of candidate texture images in the texture image library respectively to obtain the matching degree of each candidate texture image;
determining the candidate texture image with the highest matching degree as the target texture image;
performing edge detection on the region of interest to obtain edge lines;
acquiring texture lines corresponding to the edge lines from the target texture image according to the positions of the edge lines in the original image;
and replacing the edge lines with the texture lines so as to render the edge lines and obtain a target image; an end-to-end sketch of these six steps follows.
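A hedged end-to-end sketch of the pipeline using OpenCV; get_roi and match_score are hypothetical callables standing in for the ROI extraction and matching logic described above, and the Canny thresholds are illustrative:

```python
import cv2
import numpy as np

def render_lines(original, texture_library, get_roi, match_score):
    roi, (y0, x0) = get_roi(original)                     # region of interest and its offset
    scores = [match_score(roi, tex) for tex in texture_library]
    target_tex = texture_library[int(np.argmax(scores))]  # highest matching degree
    # Edge detection on the region of interest to obtain edge lines.
    edges = cv2.Canny(cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY), 100, 200)
    # Resize the target texture so positions in the original image map onto it.
    tex = cv2.resize(target_tex, (original.shape[1], original.shape[0]))
    result = original.copy()
    ys, xs = np.nonzero(edges)                            # edge-line pixel positions
    # Replace each edge pixel with the texture pixel at the same position,
    # i.e. the texture lines replace the edge lines.
    result[ys + y0, xs + x0] = tex[ys + y0, xs + x0]
    return result
```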
In one embodiment of the present application, the processor 1402 is specifically configured to perform the following steps in terms of matching a region of interest with a plurality of candidate texture images in a texture image library to obtain a matching degree of each candidate texture image:
taking the region of interest and each candidate texture image as an image pair to obtain a plurality of image pairs;
inputting each image pair into the matching model to obtain the matching degree of the candidate texture image in each image pair;
and obtaining the matching degree of each candidate texture image according to the matching degree of the candidate texture image in each image pair, as sketched below.
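A minimal inference sketch, assuming a trained matching model that maps an (ROI, texture) image pair to a score; the tensor preprocessing and the model's call signature are assumptions, not interfaces defined by this application:

```python
import torch

@torch.no_grad()
def score_candidates(model, roi_tensor, candidate_tensors):
    scores = []
    for tex in candidate_tensors:            # one image pair per candidate
        pair_score = model(roi_tensor.unsqueeze(0), tex.unsqueeze(0))
        scores.append(float(pair_score))     # matching degree of this image pair
    return scores
```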
In one embodiment of the present application, the processor 1402 is further specifically configured to perform the steps of:
acquiring a historical region of interest in a historical sample image of each user;
constructing a positive sample image pair and a plurality of negative sample image pairs according to the historical region of interest of each user and a plurality of historical texture images in the texture image library;
and training an initial model based on the positive sample image pair and the plurality of negative sample image pairs of each user to obtain the matching model; a training sketch follows.
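A hedged training sketch: the siamese-style scorer interface, the binary cross-entropy loss, and the Adam optimizer are illustrative choices, since the application only specifies training on positive and negative sample image pairs:

```python
import torch
import torch.nn as nn

def train_matching_model(model, pair_loader, epochs=5, lr=1e-4):
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.BCELoss()
    for _ in range(epochs):
        for roi, tex, label in pair_loader:       # label: 1 = positive pair, 0 = negative pair
            optimizer.zero_grad()
            score = model(roi, tex).squeeze(-1)   # predicted matching degree in (0, 1)
            loss = criterion(score, label.float())
            loss.backward()
            optimizer.step()
    return model
```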
In one embodiment of the present application, the processor 1402 is specifically configured to perform the following steps in constructing a positive sample image pair and a plurality of negative sample image pairs from the historical region of interest of each user and a plurality of historical texture images in the texture image library:
taking the historical region of interest of each user and a first historical texture image as the positive sample image pair, wherein the first historical texture image is the image, among the plurality of historical texture images, used for rendering edge lines of the historical region of interest;
and taking the historical region of interest of each user and each of a plurality of second historical texture images as a negative sample image pair to obtain the plurality of negative sample image pairs, wherein the plurality of second historical texture images are the images in the plurality of historical texture images other than the first historical texture image.
In one embodiment of the present application, the processor 1402 is specifically configured to perform the following steps in terms of matching a region of interest with a plurality of candidate texture images in a texture image library to obtain a matching degree of each candidate texture image:
averaging the pixel values of all the first pixel points in the region of interest to obtain a first pixel mean value;
averaging the pixel values of all the second pixel points in a first candidate texture image to obtain a second pixel mean value, wherein the first candidate texture image is any one of the plurality of candidate texture images;
obtaining a similarity according to the first pixel mean value and the second pixel mean value, wherein the similarity is used for representing the degree of color similarity between the region of interest and the first candidate texture image;
acquiring the usage popularity of each candidate texture image;
and obtaining the matching degree of each candidate texture image according to the similarity and usage popularity corresponding to each candidate texture image.
In one embodiment of the present application, after obtaining the matching degree of each candidate texture image according to the similarity and usage popularity corresponding to each candidate texture image, the processor 1402 is specifically configured to perform the following steps:
calculating the difference between the size of each candidate texture image and the size of the region of interest to obtain a size difference value corresponding to each candidate texture image;
obtaining a first matching degree of each candidate texture image according to the matching degree and the size difference value of each candidate texture image;
and taking the first matching degree of each candidate texture image as the updated matching degree of each candidate texture image.
In one embodiment of the present application, after obtaining the matching degree of each candidate texture image, the processor 1402 is specifically configured to perform the following steps:
acquiring a plurality of texture images historically used by the user;
extracting features from each of the plurality of texture images to obtain a plurality of texture features;
obtaining texture preference information of the user according to the texture features of each texture image;
obtaining a second matching degree of each candidate texture image according to the texture preference information and the matching degree of each candidate texture image;
and taking the second matching degree of each candidate texture image as the updated matching degree of each candidate texture image.
Specifically, the transceiver 1401 may be the acquisition unit 1301 of the line rendering apparatus 1300 in the embodiment of fig. 13, and the processor 1402 may be the processing unit 1302 of the line rendering apparatus 1300 in the embodiment of fig. 13.
It should be understood that the electronic device in the present application may include a smartphone (such as an Android phone, an iOS phone, or a Windows Phone device), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID), or a wearable device. The above electronic devices are merely examples rather than an exhaustive list; the electronic device includes, but is not limited to, the devices described above. In practical applications, the electronic device may further include an intelligent vehicle-mounted terminal, a computer device, and the like.
The present application also provides a computer-readable storage medium storing a computer program that is executed by a processor to implement some or all of the steps of any one of the line rendering methods described in the above method embodiments.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the line rendering methods described in the method embodiments above.
It should be noted that, for simplicity of description, the foregoing method embodiments are each described as a series of action combinations. However, those skilled in the art should understand that the present application is not limited by the described order of actions, as some steps may be performed in another order or simultaneously. Further, those skilled in the art should also understand that the embodiments described in the specification are all alternative embodiments, and that the actions and modules involved are not necessarily required by the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division of units is merely a division of logical functions, and there may be other divisions in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection via some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units described above may be implemented either in hardware or in software program modules.
The integrated units, if implemented in the form of software program modules and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program instructing associated hardware, and the program may be stored in a computer-readable memory, which may include: a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiments of the present application have been described in detail above, and specific examples have been used herein to illustrate the principles and implementations of the present application; the above description of the embodiments is intended only to help understand the method of the present application and its core ideas. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope according to the ideas of the present application. In summary, the contents of this specification should not be construed as limiting the present application.

Claims (10)

1. A method of line rendering, the method comprising:
acquiring a region of interest of an original image;
matching the region of interest with a plurality of candidate texture images in a texture image library respectively to obtain the matching degree of each candidate texture image;
determining the candidate texture image with the highest matching degree as a target texture image;
performing edge detection on the region of interest to obtain edge lines;
acquiring texture lines corresponding to the edge lines from the target texture image according to the positions of the edge lines in the original image;
and replacing the edge lines with the texture lines so as to render the edge lines to obtain a target image.
2. The method according to claim 1, wherein the matching the region of interest with the plurality of candidate texture images in the texture image library to obtain a matching degree of each candidate texture image includes:
taking the region of interest and each candidate texture image as an image pair to obtain a plurality of image pairs;
inputting each image pair into a matching model to obtain the matching degree of the candidate texture image in each image pair;
and obtaining the matching degree of each candidate texture image according to the matching degree of the candidate texture image in each image pair.
3. The method according to claim 2, wherein the method further comprises:
acquiring a historical region of interest in a historical sample image of each user;
constructing a positive sample image pair and a plurality of negative sample image pairs according to the historical region of interest of each user and a plurality of historical texture images in the texture image library;
and training an initial model based on the positive sample image pair and the plurality of negative sample image pairs of each user to obtain the matching model.
4. The method according to claim 3, wherein the constructing a positive sample image pair and a plurality of negative sample image pairs according to the historical region of interest of each user and the plurality of historical texture images in the texture image library comprises:
taking the historical region of interest of each user and a first historical texture image as the positive sample image pair, wherein the first historical texture image is the image, among the plurality of historical texture images, used for rendering edge lines of the historical region of interest;
and taking the historical region of interest of each user and each of a plurality of second historical texture images as a negative sample image pair to obtain the plurality of negative sample image pairs, wherein the plurality of second historical texture images are the images in the plurality of historical texture images other than the first historical texture image.
5. The method according to claim 1, wherein the matching the region of interest with the plurality of candidate texture images in the texture image library to obtain a matching degree of each candidate texture image includes:
averaging the pixel values of all the first pixel points in the region of interest to obtain a first pixel mean value;
averaging pixel values of all second pixel points in a first candidate texture image to obtain a second pixel mean value, wherein the first candidate texture image is any one of the candidate texture images;
obtaining a similarity according to the first pixel mean value and the second pixel mean value, wherein the similarity is used for representing the degree of color similarity between the region of interest and the first candidate texture image;
acquiring the usage popularity of each candidate texture image;
and obtaining the matching degree of each candidate texture image according to the similarity corresponding to each candidate texture image and the usage popularity.
6. The method according to claim 5, wherein after obtaining the matching degree of each candidate texture image according to the similarity corresponding to each candidate texture image and the usage popularity, the method further comprises:
calculating the difference between the size of each candidate texture image and the size of the region of interest to obtain a size difference value corresponding to each candidate texture image;
obtaining a first matching degree of each candidate texture image according to the matching degree of each candidate texture image and the size difference value;
and taking the first matching degree of each candidate texture image as the updated matching degree of each candidate texture image.
7. The method of any one of claims 1-6, wherein after said obtaining the matching degree of each candidate texture image, the method further comprises:
acquiring a plurality of texture images historically used by a user;
extracting features from each of the plurality of texture images to obtain a plurality of texture features;
obtaining texture preference information of the user according to the texture features of each texture image;
obtaining a second matching degree of each candidate texture image according to the texture preference information and the matching degree of each candidate texture image;
and taking the second matching degree of each candidate texture image as the updated matching degree of each candidate texture image.
8. A line rendering apparatus, the apparatus comprising: an acquisition unit and a processing unit;
The acquisition unit is used for acquiring the region of interest of the original image;
the processing unit is used for respectively matching the region of interest with a plurality of candidate texture images in the texture image library to obtain the matching degree of each candidate texture image;
determining the candidate texture image with the highest matching degree as a target texture image;
performing edge detection on the region of interest to obtain edge lines;
acquiring texture lines corresponding to the edge lines from the target texture image according to the positions of the edge lines in the original image;
and replacing the edge lines with the texture lines so as to render the edge lines to obtain a target image.
9. An electronic device, comprising: a processor and a memory, the processor being connected to the memory, the memory being for storing a computer program, the processor being for executing the computer program stored in the memory to cause the electronic device to perform the method of any one of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which is executed by a processor to implement the method of any of claims 1-7.