
CN114693843B - Animation generation method and device of dyeing fabric singeing machine and readable storage medium - Google Patents


Info

Publication number: CN114693843B
Application number: CN202210230151.0A
Authority: CN (China)
Prior art keywords: image, printed, area, fabric, images
Legal status: Active (granted; the status is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN114693843A (application publication)
Inventors: 刘静, 张久雷
Current and original assignee: Guangdong Vocational and Technical College
Application filed by Guangdong Vocational and Technical College
Priority to CN202210230151.0A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/32 Indexing scheme for image data processing or generation, in general, involving image mosaicing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to the technical field of image processing, and in particular to an animation generation method and device for a singeing machine for printing and dyeing fabrics, and a readable storage medium. The method comprises the following steps: acquiring a station physical image containing a fabric to be printed and dyed; determining the region of the station physical image in which the fabric lies, taking that region as the fabric area to be printed and dyed, and taking the rest of the station physical image as the scene image; duplicating the station physical image into a plurality of copies and performing color adjustment on the fabric area in each copy to obtain the virtual images; stitching the virtual images one-to-one with a sequence of consecutive flame frames to obtain a plurality of composite images; and ordering the composite images along a time axis to generate an animation video file of the singeing machine. The invention improves the fidelity of the animation, requires no three-dimensional modeling, and involves relatively few steps.

Description

Animation generation method and device of dyeing fabric singeing machine and readable storage medium
Technical Field
The invention relates to the technical field of image processing, in particular to an animation generation method and device of a singeing machine for printing and dyeing fabrics and a readable storage medium.
Background
Because the dyeing and finishing process flow is long, and dyeing and finishing equipment comes in many types with complex and variable structures, the equipment is difficult to understand from the plan views and structure diagrams in teaching materials. Virtual simulation equipment makes the abstract visual: with carefully built 3D animation demonstrations, not only can the structure and section diagrams of dyeing equipment be displayed statically, but the whole process of pretreatment, dyeing, printing and finishing in a printing and dyeing factory can also be demonstrated dynamically in 3D, greatly improving students' understanding of the overall printing and dyeing process.
In the prior art, a three-dimensional drawing of the physical structure of the singeing machine is usually required; the three-dimensional model is imported into a designed static station scene diagram for placement adjustment, and finally a dynamic flame-burning video is produced and arranged in a specific area of the static scene for animation display. However, the animation produced this way is not lifelike, and the approach requires the cooperation of structural engineers, so it involves relatively many steps.
Disclosure of Invention
The invention aims to provide an animation generation method and device for a singeing machine for printing and dyeing fabrics, and a readable storage medium, so as to solve one or more technical problems in the prior art and at least provide a beneficial alternative.
In order to achieve the above object, the present invention provides the following technical solutions:
A method of animation generation for a singeing machine for printed and dyed fabrics, the method comprising the steps of:
acquiring a station physical image containing a fabric to be printed and dyed;
Determining the area of the fabric to be printed in the station physical image, taking the area as the area of the fabric to be printed, and taking the area outside the area of the fabric to be printed in the station physical image as a scene image;
Duplicating the station physical image to obtain a plurality of station physical images, and performing color adjustment on the fabric area to be printed and dyed in each station physical image to obtain the virtual images; wherein the fabric to be printed and dyed in the virtual images presents, in time sequence, the transitional color change caused by burning;
Splicing a plurality of virtual images and a plurality of continuous frame flame images in a one-to-one correspondence manner to obtain a plurality of synthetic images; the flame images of the continuous frames are sequentially extracted from flame burning videos according to time axes, and the vigorous degree of flame burning in the flame burning videos is firstly from small to large and then from large to small;
And sequencing the synthesized images according to a time axis to generate an animation video file of the singeing machine for the printing and dyeing fabrics.
Further, the determining the area of the to-be-printed fabric in the station physical image, taking the area as the area of the to-be-printed fabric, and taking the area outside the area of the to-be-printed fabric in the station physical image as the scene image includes:
performing smoothing and sharpening on the station physical image and then binarizing it to obtain a binarized image;
performing contour extraction on the binarized image and determining the region in which the fabric to be printed and dyed lies, taking that region as the fabric area to be printed and dyed;
and removing the fabric area to be printed and dyed from the binarized image to obtain the scene image.
Further, the splicing the virtual images and the flame images of the continuous frames in a one-to-one correspondence manner to obtain a plurality of synthesized images includes:
step S410, determining combustion areas of a plurality of continuous frame flame images;
step S420, setting a burning area at the top of the area of the fabric to be printed in each virtual image;
Step S430, selecting a virtual image and a frame of flame image corresponding to the virtual image according to a time axis;
Step S440, filling the combustion area in the virtual image by adopting the flame image to obtain a synthetic image;
Step S450, determining whether there are any virtual images that are not selected, if yes, executing step S430, and if not, obtaining a plurality of synthesized images.
Further, the determining the combustion area of the plurality of consecutive frame flame images includes:
and overlapping and synthesizing the combustion areas of a plurality of continuous frame flame images in the flame combustion video to obtain a combustion image, and taking the area where the combustion image is positioned as the combustion area.
Further, the filling the combustion area in the virtual image by using the flame image to obtain a composite image includes:
Step S441, calculating the outer edge EC of the burning area C with single pixel width according to the mask of the virtual image I;
Step S442, calculating the confidence coefficient C (p) and the data item D (p) of each point p on the outer edge EC;
step S443, determining the priority P (P) of the point P according to the confidence C (P) of each point P on the outer edge EC and the data item D (P); wherein the priority P (P) =c (P) ×d (P) of the point P;
Step S444, selecting a point P max having the greatest priority P (P) from the points P on the outer edge EC;
step S445, searching the scene image Φ for the pixel block ψ_qmin most similar to ψ_pmax; wherein ψ_pmax is the neighborhood of point P_max;
In step S446, the confidence of the points in ψ_pmax is updated, and it is determined whether all points of the combustion area C have been filled; if not, the process returns to step S441; otherwise, the composite image is output.
Further, the confidence C(p) of any point p is calculated as:

C(p) = f_p / |ψ_p|

where f_p is the number of pixels already filled in the neighborhood ψ_p of point p, and |ψ_p| is the total number of pixels in ψ_p.
Further, the data item D(p) of each point p on the outer edge EC of the combustion area is calculated as follows:

calculating the gradient ∇I_p of each point p on the outer edge EC, and the isophote direction ∇I_p^⊥, i.e. the gradient rotated by 90°;

calculating n_p, the unit normal to the outer edge EC at point p, within the neighborhood ψ_p of p;

calculating the data item according to D(p) = |∇I_p^⊥ · n_p| / α.
An animation producing device of a singeing machine for printing and dyeing fabrics, the device comprising:
At least one processor;
At least one memory for storing at least one program;
The at least one program, when executed by the at least one processor, causes the at least one processor to implement the animation generation method of the printed fabric singeing machine as described in any of the above.
A computer-readable storage medium having stored thereon an animation production program of a textile singeing machine for printing, which when executed by a processor, implements the steps of the animation production method of a textile singeing machine for printing as described in any one of the above.
The beneficial effects of the invention are as follows: the disclosed method, device and readable storage medium use image acquisition and processing to adjust and stitch, one by one, a number of physical images of the singeing machine with a number of flame-burning images, and finally display the stitched scene images in time order as an animation. In application, this completes the animation display of the printing and dyeing fabric singeing machine, improves the fidelity of the animation, requires no three-dimensional modeling, and involves relatively few steps.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of an animation generation method of a singeing machine for printed and dyed fabrics in an embodiment of the invention;
fig. 2 is a schematic structural view of an animation generating device of a singeing machine for printed and dyed fabrics according to an embodiment of the present invention.
Detailed Description
The conception, specific structure, and technical effects produced by the present application will be clearly and completely described below with reference to the embodiments and the drawings to fully understand the objects, aspects, and effects of the present application. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
Referring to fig. 1, fig. 1 is a schematic flow chart of an animation generation method of a singeing machine for printing and dyeing fabrics, which comprises the following steps:
step S100, acquiring a station physical image containing a fabric to be printed and dyed;
Step S200, determining the area of the fabric to be printed in the station physical image, taking the area as the area of the fabric to be printed, and taking the area outside the area of the fabric to be printed in the station physical image as a scene image;
Specifically, when the fabric to be printed and dyed is placed on the singeing machine, the station physical image is collected and subjected to smoothing and sharpening; contour extraction is then performed to determine the fabric area to be printed and dyed, this area is recorded, and the remaining part of the image is taken as the scene image.
Step S300, duplicating the station physical image to obtain a plurality of station physical images, and performing color adjustment on the fabric area to be printed and dyed in each station physical image to obtain the virtual images; wherein the fabric to be printed and dyed in the virtual images presents, in time sequence, the transitional color change caused by burning;
Specifically, the station physical image is duplicated according to the number of flame-image frames, and color adjustment is performed on the fabric area to be printed and dyed in each copy using MAYA software, so that the fabric areas, taken in sequence, show a transitional color change.
Step S400, splicing a plurality of virtual images and a plurality of continuous frame flame images in a one-to-one correspondence manner to obtain a plurality of synthetic images; the flame images of the continuous frames are sequentially extracted from flame burning videos according to time axes, and the vigorous degree of flame burning in the flame burning videos is firstly from small to large and then from large to small;
For example, a flame-burning video is first recorded in which the vigor of the flame goes from small to large and then from large to small; contour extraction is then performed on each flame frame of the video to remove extraneous background, and smoothing and sharpening are applied to the extracted flame images; finally, the virtual images are stitched one-to-one with the consecutive flame frames, so that the fabric to be printed and dyed in each composite image is matched with a flame image of the corresponding burning vigor. In some embodiments, the flame area is used as the measure of burning vigor: pixels whose normalized gray value exceeds 0.5 are taken as the flame region, a dilation-erosion algorithm removes hollows caused by jitter and smoke interference, and a moving-average filter is applied to the resulting sequence of flame areas.
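The flame-region measurement described above can be sketched in NumPy as follows; the 3x3 structuring element, the single dilation-erosion pass, and the window width are illustrative choices, not taken from the patent:

```python
import numpy as np

def dilate3(mask):
    """3x3 binary dilation via shifted copies (pure-NumPy stand-in
    for a morphology library call)."""
    p = np.pad(mask, 1)
    h, w = mask.shape
    out = np.zeros_like(mask)
    for dy in range(3):
        for dx in range(3):
            out |= p[dy:dy + h, dx:dx + w]
    return out

def erode3(mask):
    """3x3 binary erosion via shifted copies."""
    p = np.pad(mask, 1)
    h, w = mask.shape
    out = np.ones_like(mask)
    for dy in range(3):
        for dx in range(3):
            out &= p[dy:dy + h, dx:dx + w]
    return out

def flame_areas(gray_frames, win=3):
    """Per-frame flame area: threshold the normalized gray image at 0.5,
    close small hollows with one dilation-erosion pass, then smooth the
    area sequence with a moving average of width win."""
    areas = []
    for g in gray_frames:
        mask = g > 0.5                 # flame region by gray threshold
        mask = erode3(dilate3(mask))   # fill hollows from jitter/smoke
        areas.append(int(mask.sum()))
    kernel = np.ones(win) / win
    return np.convolve(areas, kernel, mode="same")
```

A single-pixel hollow inside a flame blob is filled by the dilation-erosion (closing) pass, so the area sequence reflects the flame itself rather than interference.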
And S500, sequencing the synthetic images according to time axes to generate an animation video file of the dyeing fabric singeing machine.
Therefore, the animation video file of the dyeing fabric singeing machine is obtained, and the animation effect of the dyeing fabric singeing machine during working can be displayed.
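The final ordering of composite frames along the time axis can be sketched as below; `assemble_timeline` and the (timestamp, frame) pairing are illustrative names, and the actual video encoder (e.g. a `cv2.VideoWriter` or an ffmpeg pipe) is omitted:

```python
def assemble_timeline(composites):
    """Sort the composite images by their timeline position before
    writing them out as the animation video file. `composites` is a
    list of (timestamp, frame) pairs; names are illustrative."""
    return [frame for _, frame in sorted(composites, key=lambda tf: tf[0])]
```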
As a further improvement of the foregoing embodiment, in step S200, determining the region of the station physical image in which the fabric to be printed and dyed lies, taking that region as the fabric area to be printed and dyed, and taking the rest of the station physical image as the scene image, includes:
Step S210, performing smoothing and sharpening on the station physical image and then binarizing it to obtain a binarized image;
Step S220, performing contour extraction on the binarized image and determining the region in which the fabric to be printed and dyed lies, taking that region as the fabric area to be printed and dyed;
Step S230, removing the fabric area to be printed and dyed from the binarized image to obtain the scene image.
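Steps S210-S230 can be sketched as follows; the fixed threshold and the bounding-box approximation of the extracted contour are simplifying assumptions, not the patent's exact procedure:

```python
import numpy as np

def segment_fabric(station_img, thresh=128):
    """Binarize the station image and take the bounding box of the
    bright region as the fabric area to be printed and dyed; removing
    that area from the image leaves the scene image. A toy sketch: a
    fixed threshold stands in for smoothing, sharpening and contour
    extraction."""
    binary = (station_img >= thresh).astype(np.uint8)   # binarized image
    ys, xs = np.nonzero(binary)                          # foreground pixels
    if len(ys) == 0:
        return binary, None, station_img.copy()
    y0, y1, x0, x1 = ys.min(), ys.max(), xs.min(), xs.max()
    fabric_box = (y0, y1, x0, x1)                        # fabric area
    scene = station_img.copy()
    scene[y0:y1 + 1, x0:x1 + 1] = 0                      # remove fabric area
    return binary, fabric_box, scene
```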
As a further improvement of the foregoing embodiment, in step S400, the stitching a plurality of virtual images with a plurality of continuous frame flame images in a one-to-one correspondence manner to obtain a plurality of composite images includes:
step S410, determining combustion areas of a plurality of continuous frame flame images;
step S420, setting a burning area at the top of the area of the fabric to be printed in each virtual image;
Step S430, selecting a virtual image and a frame of flame image corresponding to the virtual image according to a time axis;
Step S440, filling the combustion area in the virtual image by adopting the flame image to obtain a synthetic image;
Step S450, determining whether there are any virtual images that are not selected, if yes, executing step S430, and if not, obtaining a plurality of synthesized images.
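The loop of steps S430-S450 can be sketched as follows; `composite_frames`, `burn_mask` and `top_left` are illustrative names, not from the patent:

```python
import numpy as np

def composite_frames(virtual_imgs, flame_imgs, burn_mask, top_left):
    """For each virtual image, paste the matching flame frame into the
    combustion area placed at top_left above the fabric region
    (steps S430-S450 as a loop over the time axis)."""
    assert len(virtual_imgs) == len(flame_imgs)
    y0, x0 = top_left
    h, w = burn_mask.shape
    out = []
    for v, f in zip(virtual_imgs, flame_imgs):   # one flame frame per virtual image
        frame = v.copy()
        region = frame[y0:y0 + h, x0:x0 + w]     # view into the combustion area
        region[burn_mask] = f[burn_mask]         # fill combustion area (S440)
        out.append(frame)
    return out
```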
As a further improvement of the above embodiment, in step S410, the determining the combustion area of the flame images of the several consecutive frames includes:
and overlapping and synthesizing the combustion areas of a plurality of continuous frame flame images in the flame combustion video to obtain a combustion image, and taking the area where the combustion image is positioned as the combustion area.
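The overlap-and-synthesize step amounts to a logical OR over the per-frame flame masks; `combustion_area` is an illustrative name:

```python
import numpy as np

def combustion_area(flame_masks):
    """Overlap the per-frame flame masks to obtain one combustion area
    that covers the flame in every frame of the burning video."""
    area = np.zeros_like(flame_masks[0], dtype=bool)
    for m in flame_masks:
        area |= m   # union of the flame regions
    return area
```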
As a further improvement of the foregoing embodiment, in step S440, the filling the combustion area in the virtual image with the flame image to obtain a composite image includes:
Step S441, calculating the outer edge EC of the burning area C with single pixel width according to the mask of the virtual image I;
where Φ=i-C, I represents the virtual image, C represents the combustion area, EC represents the outer edge of the combustion area, Φ represents the scene image (which remains unchanged throughout the step).
Step S442, calculating the confidence coefficient C (p) and the data item D (p) of each point p on the outer edge EC;
step S443, determining the priority P (P) of the point P according to the confidence C (P) of each point P on the outer edge EC and the data item D (P); wherein the priority P (P) =c (P) ×d (P) of the point P;
Step S444, selecting a point P max having the greatest priority P (P) from the points P on the outer edge EC;
Step S445, searching the scene image Φ for the pixel block ψ_qmin most similar to ψ_pmax (ψ_pmax being the neighborhood of point P_max);

Specifically, starting from the upper-left corner of the virtual image I, for each candidate block ψ_q ⊆ Φ the pixel gap d(ψ_q, ψ_pmax) = Σ_{i,j} (a(i,j) - b(i,j))² is computed, where a(i,j) and b(i,j) are the corresponding pixel values of ψ_q and ψ_pmax; the most similar block ψ_qmin, satisfying d(ψ_qmin, ψ_pmax) = min{ d(ψ_q, ψ_pmax) : ψ_q ⊆ Φ }, is then used to fill the unknown points in ψ_pmax.
It should be noted that every point in a candidate block ψ_q must lie in the scene image Φ, and the block ψ_qmin closest to ψ_pmax is selected from Φ so that d(ψ_qmin, ψ_pmax) is minimal. Conventional block matching can only compare pixels whose values are already known. This embodiment, however, performs pixel-level restoration with pixel blocks and makes a basic estimate of the color information inside the unknown region, so that a closer matching block can be found.
In step S446, the confidence of the points in ψ_pmax is updated, and it is determined whether all points of the combustion area C have been filled; if not, the process returns to step S441; otherwise, the composite image is output.
It should be noted that, after the unknown region in the pixel block ψ_pmax has been filled, the confidence C(p) of the pixel points in ψ_pmax is updated with C(q) = C(P_max) for every newly filled point q ∈ ψ_pmax ∩ C. As the repair proceeds, the pixel confidence C(p) decays, showing that the closer a pixel is to the center of the unknown region C, the lower the confidence in its information.
This embodiment samples the contour within pixel blocks and extends pixels along the contour direction in the neighborhood ψ_p of each outer-edge point p. The confidence C(p) can be regarded as a measure of the amount of trusted information in the neighborhood ψ_p of point p; its purpose is to give priority to the pixel blocks with the most known pixels. Points filled earlier have higher confidence. For example, at a corner, a block ψ_p has a larger C(p) and is more likely to be filled first; by contrast, a block ψ_p on a gentle edge has a smaller C(p) and is filled later.
As a further improvement of the above embodiment, the confidence C(p) of any point p is calculated as:

C(p) = f_p / |ψ_p|

where f_p is the number of pixels already filled in the neighborhood ψ_p of point p, and |ψ_p| is the total number of pixels in ψ_p.
As a further improvement of the above embodiment, the data item D(p) of each point p on the outer edge EC of the combustion area is calculated by:

Step S4421, calculating the gradient ∇I_p of each point p on the outer edge EC and the isophote direction ∇I_p^⊥, i.e. the gradient rotated by 90°;

Step S4422, calculating n_p, the unit normal to the outer edge EC at point p, within the neighborhood ψ_p of p;

Step S4423, calculating the data item according to D(p) = |∇I_p^⊥ · n_p| / α (e.g. α = 255).
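The priority computation of steps S442-S443 (with the confidence and data terms above) can be sketched as follows; simple finite differences stand in for the image gradient and the front normal, and all names are illustrative, not the patent's exact implementation:

```python
import numpy as np

def priorities(img, filled, edge_pts, half=2, alpha=255.0):
    """Compute the priority P(p) = C(p) * D(p) for points on the fill
    front. filled is a boolean mask of known pixels; edge_pts are
    (y, x) points on the outer edge EC. A sketch only."""
    gy, gx = np.gradient(img.astype(float))      # image gradient
    fy, fx = np.gradient(filled.astype(float))   # mask gradient ~ front normal
    result = {}
    for (y, x) in edge_pts:
        patch = filled[y - half:y + half + 1, x - half:x + half + 1]
        conf = patch.sum() / patch.size              # C(p) = f_p / |psi_p|
        isophote = np.array([-gx[y, x], gy[y, x]])   # gradient rotated 90 deg
        normal = np.array([fy[y, x], fx[y, x]])
        norm = np.linalg.norm(normal)
        if norm > 0:
            normal = normal / norm                   # unit normal n_p
        data = abs(isophote @ normal) / alpha        # D(p)
        result[(y, x)] = conf * data                 # P(p) = C(p) * D(p)
    return result
```

On a constant image the isophote vanishes everywhere, so every priority is zero and no direction is preferred; structure (a nonzero gradient crossing the front) is what raises P(p).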
Corresponding to the method of fig. 1, referring to fig. 2, an embodiment of the application also provides an animation generating device 10 of a singeing machine for printed fabrics, said device 10 comprising a memory 11, a processor 12 and a computer program stored on the memory 11 and executable on the processor 12.
The processor 12 and the memory 11 may be connected by a bus or other means.
The non-transitory software programs and instructions required to implement the animation generation method of the printing and dyeing fabric singeing machine of the above-described embodiments are stored in the memory 11; when executed by the processor 12, they perform the animation generation method of the above-described embodiments.
Corresponding to the method of fig. 1, an embodiment of the present application further provides a computer readable storage medium, on which an animation generation program of a printing and dyeing fabric singeing machine is stored, which when executed by a processor implements the steps of the animation generation method of a printing and dyeing fabric singeing machine according to any of the embodiments described above.
The content in the method embodiment is applicable to the embodiment of the device, and the functions specifically realized by the embodiment of the device are the same as those of the method embodiment, and the obtained beneficial effects are the same as those of the method embodiment.
The processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or any conventional processor; it is the control center of the animation generation system of the printing and dyeing fabric singeing machine, connecting the various parts of the system using various interfaces and lines.
The memory may be used to store the computer program and/or modules; the processor implements the various functions of the animation generation system of the printing and dyeing fabric singeing machine by running or executing the computer program and/or modules stored in the memory and invoking data stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required for at least one function (such as a sound-playing function or an image-playing function); the data storage area may store data created according to use of the device (such as audio data or a phone book). In addition, the memory may include high-speed random-access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a flash card, at least one magnetic disk storage device, a flash memory device, or another solid-state storage device.
While the present application has been described in considerable detail with reference to certain embodiments, it is not intended to be limited to those details or embodiments; the appended claims are to be given a broad interpretation in light of the prior art so as to effectively cover the intended scope of the application. Furthermore, modifications of the application not presently foreseen may nonetheless represent equivalents thereof.

Claims (9)

1. An animation generation method of a singeing machine for printing and dyeing fabrics, which is characterized by comprising the following steps:
acquiring a station physical image containing a fabric to be printed and dyed;
Determining the area of the fabric to be printed in the station physical image, taking the area as the area of the fabric to be printed, and taking the area outside the area of the fabric to be printed in the station physical image as a scene image;
Duplicating the station physical image to obtain a plurality of station physical images, and performing color adjustment on the fabric area to be printed and dyed in each station physical image to obtain the virtual images; wherein the fabric to be printed and dyed in the virtual images presents, in time sequence, the transitional color change caused by burning;
Splicing a plurality of virtual images and a plurality of continuous frame flame images in a one-to-one correspondence manner to obtain a plurality of synthetic images; the flame images of the continuous frames are sequentially extracted from flame burning videos according to time axes, and the vigorous degree of flame burning in the flame burning videos is firstly from small to large and then from large to small;
And sequencing the synthesized images according to a time axis to generate an animation video file of the singeing machine for the printing and dyeing fabrics.
2. The animation generation method of a singeing machine for printed fabrics according to claim 1, wherein the determining the region of the station physical image where the fabric to be printed is located, taking the region as the fabric region to be printed, and taking the region outside the fabric region to be printed in the station physical image as the scene image comprises:
performing smoothing and sharpening on the station physical image and then binarizing it to obtain a binarized image;
performing contour extraction on the binarized image and determining the region in which the fabric to be printed and dyed lies, taking that region as the fabric area to be printed and dyed;
and removing the fabric area to be printed and dyed from the binarized image to obtain the scene image.
3. The method for generating an animation of a singeing machine for printed fabrics according to claim 2, wherein the splicing the virtual images and the flame images of the continuous frames one by one to obtain the synthetic images comprises:
step S410, determining combustion areas of a plurality of continuous frame flame images;
step S420, setting a burning area at the top of the area of the fabric to be printed in each virtual image;
Step S430, selecting a virtual image and a frame of flame image corresponding to the virtual image according to a time axis;
Step S440, filling the combustion area in the virtual image by adopting the flame image to obtain a synthetic image;
Step S450, determining whether there are any virtual images that are not selected, if yes, executing step S430, and if not, obtaining a plurality of synthesized images.
4. A method of animation production for a singeing machine for printed fabrics as claimed in claim 3 wherein said determining the burning zone of a number of successive frame flame images comprises:
and overlapping and synthesizing the combustion areas of a plurality of continuous frame flame images in the flame combustion video to obtain a combustion image, and taking the area where the combustion image is positioned as the combustion area.
5. A method of producing an animation of a singeing machine for printed fabrics according to claim 3, wherein said filling the burning area in the virtual image with the flame image to obtain a composite image comprises:
Step S441, calculating the outer edge EC of the burning area C with single pixel width according to the mask of the virtual image I;
Step S442, calculating the confidence coefficient C (p) and the data item D (p) of each point p on the outer edge EC;
step S443, determining the priority P (P) of the point P according to the confidence C (P) of each point P on the outer edge EC and the data item D (P); wherein the priority P (P) =c (P) ×d (P) of the point P;
Step S444, selecting a point P max having the greatest priority P (P) from the points P on the outer edge EC;
step S445, searching the field Jing Tuxiang Φ for a pixel block ψ qmin having a size most similar to ψ pmax; wherein ψ pmax is the neighborhood of point P max;
In step S446, the confidence of the points in ψ Pmax is updated, and it is determined that the points of the combustion area C are all filled, if yes, the process goes to step S441, otherwise, a composite image is output.
6. The animation production method of a singeing machine for printed fabrics according to claim 5, wherein the confidence coefficient C (p) of any point p is calculated by the following formula:
Wherein f p is the number of pixels that have been filled in the neighborhood ψ p of point p; the |ψ p | is the number of pixel points in the field ψ p.
7. The method for generating the animation of the singeing machine for dyed fabrics according to claim 5, wherein the data item D (p) of each point p on the outer edge EC of the burning zone is calculated by:
calculating the gradient of each point p on the outer edge EC And gradient ofVertical gradient of 90 ° vertical
Calculating n p,np is to calculate the normal of the point p on the outer edge EC in the neighborhood psi p of the point p;
according to the formula The data item D (p) is calculated.
8. An animation producing device of a singeing machine for printed and dyed fabrics, characterized by comprising:
At least one processor;
At least one memory for storing at least one program;
The at least one program, when executed by the at least one processor, causes the at least one processor to implement the animation generation method of a printed fabric singeing machine as claimed in any one of claims 1 to 7.
9. A computer-readable storage medium, wherein the computer-readable storage medium has stored thereon an animation production program of a textile singeing machine for printing, which, when executed by a processor, implements the steps of the animation production method of the textile singeing machine for printing of any one of claims 1 to 7.
CN202210230151.0A 2022-03-09 2022-03-09 Animation generation method and device of dyeing fabric singeing machine and readable storage medium Active CN114693843B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210230151.0A CN114693843B (en) 2022-03-09 2022-03-09 Animation generation method and device of dyeing fabric singeing machine and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210230151.0A CN114693843B (en) 2022-03-09 2022-03-09 Animation generation method and device of dyeing fabric singeing machine and readable storage medium

Publications (2)

Publication Number Publication Date
CN114693843A CN114693843A (en) 2022-07-01
CN114693843B true CN114693843B (en) 2024-07-26

Family

ID=82137960

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210230151.0A Active CN114693843B (en) 2022-03-09 2022-03-09 Animation generation method and device of dyeing fabric singeing machine and readable storage medium

Country Status (1)

Country Link
CN (1) CN114693843B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116437061A (en) * 2023-05-06 2023-07-14 华强方特(深圳)科技有限公司 Demonstration image laser projection method, device, computer equipment and storage medium
WO2023169297A1 (en) * 2022-03-10 2023-09-14 北京字跳网络技术有限公司 Animation special effect generation method and apparatus, device, and medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5409883A (en) * 1993-05-07 1995-04-25 Minnesota Mining And Manufacturing Company Process for the manufacture of multi-color donor elements for thermal transfer systems
US20200004225A1 (en) * 2018-06-29 2020-01-02 Velo3D, Inc. Manipulating one or more formation variables to form three-dimensional objects

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023169297A1 (en) * 2022-03-10 2023-09-14 北京字跳网络技术有限公司 Animation special effect generation method and apparatus, device, and medium
CN116437061A (en) * 2023-05-06 2023-07-14 华强方特(深圳)科技有限公司 Demonstration image laser projection method, device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN114693843A (en) 2022-07-01

Similar Documents

Publication Publication Date Title
CN107172474B (en) Method and device for drawing bullet screen by using canvas
Zhang et al. Style transfer via image component analysis
US20090028432A1 (en) Segmentation of Video Sequences
AU2017254848A1 (en) Image matting using deep learning
JP7242165B2 (en) Program, Information Processing Apparatus, and Method
CN109636890B (en) Texture fusion method and device, electronic equipment, storage medium and product
JP2021039424A5 (en)
US8525846B1 (en) Shader and material layers for rendering three-dimensional (3D) object data models
US5767857A (en) Method, apparatus, and software product for generating outlines for raster-based rendered images
CN111681198A (en) A morphological attribute filtering multimode fusion imaging method, system and medium
CN106055295A (en) Picture processing method and device, and picture drawing method and device
CN108830820B (en) Electronic device, image acquisition method, and computer-readable storage medium
JP2012506577A (en) Method, apparatus and software for determining motion vectors
CN115967823A (en) Video cover generation method and device, electronic equipment and readable medium
CN114693843B (en) Animation generation method and device of dyeing fabric singeing machine and readable storage medium
US20020175923A1 (en) Method and apparatus for displaying overlapped graphical objects using depth parameters
US11475544B2 (en) Automated braces removal from images
Cui et al. Image‐based embroidery modeling and rendering
US9299389B2 (en) Interpretation of free-form timelines into compositing instructions
CN112749713B (en) Big data image recognition system and method based on artificial intelligence
CN117011158A (en) Image processing method, device and computer readable storage medium
CN116740198A (en) Image processing method, apparatus, device, storage medium, and program product
CN109872277A (en) Information processing method and device
KR101264358B1 (en) Method and System for Automated Photomosaic Image Generation
JP2001184512A (en) Image processing apparatus and method, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant