CN112700527A - Method for calculating object surface roughness map - Google Patents
Method for calculating object surface roughness map
- Publication number
- CN112700527A (application number CN202011598060.XA)
- Authority
- CN
- China
- Prior art keywords
- axis
- light
- data
- gradient
- lamp group
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/506—Illumination models
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a method for calculating an object surface roughness map, comprising the following steps. Step S1: acquire a gradient polarized light image of the object. Step S2: perform specular reflection separation to obtain a specular reflection brightness image. Step S3: calculate a normal map of the object. Step S4: obtain a depth projection image of the object based on the normal map. Step S5: calculate the surface roughness map of the object based on the specular reflection brightness image and the depth projection image. The method computes the roughness of an object from its specular reflection image and surface normals to obtain the surface roughness map. It is mainly intended for physically based rendering in computer graphics, where it enhances the detail of three-dimensional object reconstruction and improves the realism of rendered three-dimensional models, with applications in the AR/VR, game, animation, and film industries.
Description
Technical Field
The invention belongs to the technical field of optics, computer graphics and digital image processing, and particularly relates to a method for calculating a surface roughness map of an object.
Background
With the development of science and technology, and in order to bring users a better experience, data presentation is rapidly evolving from two dimensions to three. Image-based three-dimensional modeling has become a popular technique, but the resulting models capture only the shape of an object and lack material information, making it difficult to faithfully reproduce how the object responds to illumination.
PBR (Physically Based Rendering) is a rendering technique grounded in the imaging laws of the real physical world. It uses shading/lighting models built on physical principles and microfacet theory, together with surface parameters measured from reality, to accurately represent real-world materials. The PBR workflow is being adopted by more and more companies, studios, and artists in the film, game, and CG industries; it integrates well with existing production pipelines and standardizes them to a certain extent. In the field of realistic art production, the PBR workflow is widely adopted and its results are well accepted.
In the film field, for example, as an important driver of PBR, Disney's animation studio systematically studied physically based rendering during the production of the movie Wreck-It Ralph and developed a new BRDF model usable for almost every surface in the film (except hair): the Disney Principled BRDF. In the game industry, nearly all top game studios have introduced PBR material pipelines, which are well applied and showcased in flagship titles (AAA games, next-generation works, and so on). The traditional CG industry and its application scenarios also increasingly adopt PBR workflows to achieve photorealistic results in realistic scenes, props, character material maps, and renderings. The inventions CN201710666531.8 ("Three-dimensional virtual real-time display method of physical products based on physical rendering technology") and CN201910194068.0 ("Rendering method, device, equipment and storage medium") fall within the application scope of PBR technology.
Surface roughness refers to the fine-scale unevenness of an object's surface: closely spaced microscopic peaks and valleys. The roughness map is an important map in PBR: the larger its values, the rougher the surface; conversely, smaller values mean a smoother surface. Although roughness is a commonly used physical quantity and many specialized devices exist to measure roughness itself, such measurements do not meet the application requirements of PBR. Current methods and pipelines for producing roughness maps rely mainly on manual work by artists, assisted by image processing software such as Quixel Suite (DDO), Substance Painter, Photoshop, and Mari; which software is used depends on the artist's own habits and the requirements of the team's pipeline. The final quality of a hand-made roughness map therefore depends largely on the artist's experience and preferences; results vary widely, production costs are high, and it is difficult to meet the demands of batch production, such as building material libraries.
Disclosure of Invention
The main aim of the invention is to provide a method for calculating an object surface roughness map. The method computes the roughness of an object from its specular reflection (Specular) image and surface normals (Normal) to obtain the surface roughness map, and is mainly used in Physically Based Rendering (PBR) in computer graphics to enhance the detail of three-dimensional object reconstruction and improve the realism of rendered three-dimensional models, with applications in the AR/VR, game, animation, and film industries. The method first acquires a gradient polarized light image, computes a normal map, and extracts a depth projection image from the normal map; it then obtains the specular reflection image from the polarization images and fuses the specular reflection image with the depth projection image to obtain the roughness map.
Another aim of the invention is to provide a method that automatically calculates the surface roughness map of a non-metallic object, measuring objectively and accurately and allowing the production process to be automated.
A further aim of the invention is to provide a method that generates the object surface roughness map from gradient polarized light images, with short acquisition time, high map accuracy, and no manual intervention, enabling automated batch measurement.
To achieve the above object, the present invention provides a method for calculating a surface roughness map of an object, for calculating the surface roughness map of the object, comprising the steps of:
step S1: acquiring a gradient polarized light image of an object;
step S2: performing specular reflection light separation to obtain a specular reflection brightness image;
step S3: calculating a normal map of the object;
step S4: obtaining a depth projection image of the object based on the normal map;
step S5: calculating the surface roughness map of the object based on the specular reflection brightness image and the depth projection image.
As a further preferable embodiment of the above technical means, step S1 is specifically implemented as the following steps:
step S1.1: the object to be photographed is placed at the central point of a lighting-sphere illumination device, with illumination lamps arranged around the sphere; a spatial coordinate system is established with the central point as the origin, its coordinate axes being the X, Y, and Z axes;
step S1.2: acquiring a first group of gradient polarized light data of the object to form first data;
step S1.3: acquiring a second group of gradient polarized light data of the object to form second data;
step S1.4: acquiring a third group of gradient polarized light data of the object to form third data;
step S1.5: the first, second, and third data are labeled, and full-on polarization data of the object under the fully lit lamp groups is acquired, to obtain the gradient polarized light image of the object.
As a further preferred embodiment of the above technical solution, step S1.1 is specifically implemented as the following steps:
step S1.1.1: in the spatial coordinate system, the direction directly in front of the object is set as the positive Z-axis direction;
step S1.1.2: the direction directly above the object is set as the positive Y-axis direction;
step S1.1.3: the direction directly to the right of the object is set as the positive X-axis direction.
As a further preferred embodiment of the above technical solution, step S1.2 is specifically implemented as the following steps:
step S1.2.1: turn on the illumination lamp group on the positive X-axis side (with no ambient light; the same applies below) and adjust its light source brightness so that it decreases from strong to weak along the positive X-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (positive X-axis side) and the camera lens so that the lamp light's polarization direction is parallel to the polarization direction of the light entering the camera lens, and take a picture to obtain +X-axis parallel-direction data;
turn on the illumination lamp group on the negative X-axis side and adjust its light source brightness so that it decreases from strong to weak along the negative X-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (negative X-axis side) and the camera lens so that the lamp light's polarization direction is parallel to that of the light entering the camera lens, and take a picture to obtain -X-axis parallel-direction data;
step S1.2.2: subtract the -X-axis parallel-direction data from the +X-axis parallel-direction data to obtain X-axis parallel-direction gradient polarized light image data;
step S1.2.3: turn on the illumination lamp group on the positive X-axis side and adjust its light source brightness so that it decreases from strong to weak along the positive X-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (positive X-axis side) and the camera lens so that the lamp light's polarization direction is perpendicular to that of the light entering the camera lens, and take a picture to obtain +X-axis perpendicular-direction data;
turn on the illumination lamp group on the negative X-axis side and adjust its light source brightness so that it decreases from strong to weak along the negative X-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (negative X-axis side) and the camera lens so that the lamp light's polarization direction is perpendicular to that of the light entering the camera lens, and take a picture to obtain -X-axis perpendicular-direction data;
step S1.2.4: subtract the -X-axis perpendicular-direction data from the +X-axis perpendicular-direction data to obtain X-axis perpendicular-direction gradient polarized light image data;
step S1.2.5: take the X-axis parallel-direction and X-axis perpendicular-direction gradient polarized light image data together as the first data.
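The per-axis subtraction in steps S1.2.2 and S1.2.4 can be sketched with NumPy as follows; the array contents and names (`x_pos_par`, `x_neg_par`) are illustrative assumptions, not values from the patent:

```python
import numpy as np

def gradient_polarized(pos_img: np.ndarray, neg_img: np.ndarray) -> np.ndarray:
    """Subtract the minus-axis capture from the plus-axis capture.

    Inputs are 8-bit images taken under opposing gradient illumination;
    the signed difference is kept in int16 so negatives are not clipped.
    """
    return pos_img.astype(np.int16) - neg_img.astype(np.int16)

# Hypothetical 2x2 captures along the X axis (parallel polarizers):
x_pos_par = np.array([[200, 120], [180, 100]], dtype=np.uint8)
x_neg_par = np.array([[60, 140], [80, 120]], dtype=np.uint8)

x_par_gradient = gradient_polarized(x_pos_par, x_neg_par)
print(x_par_gradient.tolist())  # [[140, -20], [100, -20]]
```

The same helper would apply unchanged to the Y- and Z-axis pairs of steps S1.3 and S1.4.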
As a further preferred embodiment of the above technical solution, step S1.3 is specifically implemented as the following steps:
step S1.3.1: turn on the illumination lamp group on the positive Y-axis side and adjust its light source brightness so that it decreases from strong to weak along the positive Y-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (positive Y-axis side) and the camera lens so that the lamp light's polarization direction is parallel to that of the light entering the camera lens, and take a picture to obtain +Y-axis parallel-direction data;
turn on the illumination lamp group on the negative Y-axis side and adjust its light source brightness so that it decreases from strong to weak along the negative Y-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (negative Y-axis side) and the camera lens so that the lamp light's polarization direction is parallel to that of the light entering the camera lens, and take a picture to obtain -Y-axis parallel-direction data;
step S1.3.2: subtract the -Y-axis parallel-direction data from the +Y-axis parallel-direction data to obtain Y-axis parallel-direction gradient polarized light image data;
step S1.3.3: turn on the illumination lamp group on the positive Y-axis side and adjust its light source brightness so that it decreases from strong to weak along the positive Y-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (positive Y-axis side) and the camera lens so that the lamp light's polarization direction is perpendicular to that of the light entering the camera lens, and take a picture to obtain +Y-axis perpendicular-direction data;
turn on the illumination lamp group on the negative Y-axis side and adjust its light source brightness so that it decreases from strong to weak along the negative Y-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (negative Y-axis side) and the camera lens so that the lamp light's polarization direction is perpendicular to that of the light entering the camera lens, and take a picture to obtain -Y-axis perpendicular-direction data;
step S1.3.4: subtract the -Y-axis perpendicular-direction data from the +Y-axis perpendicular-direction data to obtain Y-axis perpendicular-direction gradient polarized light image data;
step S1.3.5: take the Y-axis parallel-direction and Y-axis perpendicular-direction gradient polarized light image data together as the second data.
As a further preferred embodiment of the above technical solution, step S1.4 is specifically implemented as the following steps:
step S1.4.1: turn on the illumination lamp group on the positive Z-axis side and adjust its light source brightness so that it decreases from strong to weak along the positive Z-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (positive Z-axis side) and the camera lens so that the lamp light's polarization direction is parallel to that of the light entering the camera lens, and take a picture to obtain +Z-axis parallel-direction data;
turn on the illumination lamp group on the negative Z-axis side and adjust its light source brightness so that it decreases from strong to weak along the negative Z-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (negative Z-axis side) and the camera lens so that the lamp light's polarization direction is parallel to that of the light entering the camera lens, and take a picture to obtain -Z-axis parallel-direction data;
step S1.4.2: subtract the -Z-axis parallel-direction data from the +Z-axis parallel-direction data to obtain Z-axis parallel-direction gradient polarized light image data;
step S1.4.3: turn on the illumination lamp group on the positive Z-axis side and adjust its light source brightness so that it decreases from strong to weak along the positive Z-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (positive Z-axis side) and the camera lens so that the lamp light's polarization direction is perpendicular to that of the light entering the camera lens, and take a picture to obtain +Z-axis perpendicular-direction data;
turn on the illumination lamp group on the negative Z-axis side and adjust its light source brightness so that it decreases from strong to weak along the negative Z-axis direction, forming gradient light on the object surface; place polarizers in front of the lamp group (negative Z-axis side) and the camera lens so that the lamp light's polarization direction is perpendicular to that of the light entering the camera lens, and take a picture to obtain -Z-axis perpendicular-direction data;
step S1.4.4: subtract the -Z-axis perpendicular-direction data from the +Z-axis perpendicular-direction data to obtain Z-axis perpendicular-direction gradient polarized light image data;
step S1.4.5: take the Z-axis parallel-direction and Z-axis perpendicular-direction gradient polarized light image data together as the third data.
As a further preferable embodiment of the above technical means, step S2 is specifically implemented as the following steps:
step S2.1: when the lamp light's polarization direction is parallel to the polarization direction of the light entering the camera lens, the light entering the camera lens is I_1, which includes the transmitted diffuse reflected light I_D and the specular reflected light I_S;
step S2.2: when the lamp light's polarization direction is perpendicular to the polarization direction of the light entering the camera lens, the light entering the camera lens is I_2 (only the diffuse reflected light I_D passes the camera polarizer);
step S2.3: the specular reflected light I_S is separated according to the formula I_S = I_1 - I_2;
step S2.4: following steps S2.1 to S2.3, subtract the X-axis perpendicular-direction gradient polarized light image data from the X-axis parallel-direction gradient polarized light image data in the first data to obtain the specular reflection gradient image in the X-axis direction; subtract the Y-axis perpendicular-direction data from the Y-axis parallel-direction data in the second data to obtain the specular reflection gradient image in the Y-axis direction; subtract the Z-axis perpendicular-direction data from the Z-axis parallel-direction data in the third data to obtain the specular reflection gradient image in the Z-axis direction; and obtain the specular reflection image of the object under the fully lit lamp groups (from the data acquired in step S1.5, by the same principle: the data taken with the polarization directions perpendicular is subtracted from the data taken with them parallel);
step S2.5: convert the RGB three-channel specular reflection image of the object under the fully lit lamp groups obtained in step S2.4 into HSL space and extract the L (lightness) channel image, i.e. the specular reflection brightness image, denoted Map_specular.
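The separation of step S2.3 and the lightness extraction of step S2.5 can be sketched as below. The HSL lightness is taken as L = (max(R,G,B) + min(R,G,B)) / 2; the sample arrays are hypothetical, not patent data:

```python
import numpy as np

def separate_specular(i_parallel: np.ndarray, i_perpendicular: np.ndarray) -> np.ndarray:
    """Step S2.3: the parallel image holds I_D + I_S, the perpendicular image
    holds only I_D, so the specular component is their difference (clipped)."""
    diff = i_parallel.astype(np.int16) - i_perpendicular.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

def hsl_lightness(rgb: np.ndarray) -> np.ndarray:
    """Step S2.5: HSL lightness channel, L = (max(R,G,B) + min(R,G,B)) / 2."""
    hi = rgb.max(axis=-1).astype(np.uint16)
    lo = rgb.min(axis=-1).astype(np.uint16)
    return ((hi + lo) // 2).astype(np.uint8)

i_par = np.full((2, 2, 3), 200, dtype=np.uint8)   # hypothetical parallel capture
i_perp = np.full((2, 2, 3), 80, dtype=np.uint8)   # hypothetical perpendicular capture
map_specular = hsl_lightness(separate_specular(i_par, i_perp))
print(map_specular.tolist())  # [[120, 120], [120, 120]]
```

In practice a library conversion (e.g. OpenCV's RGB-to-HLS) could replace `hsl_lightness`; the hand-rolled version is shown only to make the L-channel definition explicit.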
As a further preferable embodiment of the above technical means, step S3 is specifically implemented as the following steps:
step S3.1: the normal map is computed from the specular reflection and diffuse reflection images of the object separated via the data of step S1;
step S3.2: for diffuse reflection, the object surface is treated as (approximately) Lambertian: the intensity of the reflected light at the camera lens is independent of the viewing direction and depends only on the angle between the incident light and the surface normal; with the incident light directions fixed, the diffuse normal direction is solved from the reflected light images to obtain the diffuse reflection normal map;
step S3.3: for specular reflection, the specular normal is solved using the law of reflection between incident and reflected rays to obtain the specular reflection normal map;
step S3.4: the specular reflection normal map and the diffuse reflection normal map are fused to obtain the normal map of the object.
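The Lambertian model of step S3.2 (intensity proportional to the cosine between light direction and normal) is the basis of classic photometric stereo. The patent does not give a solver, so the per-pixel least-squares sketch below is an assumption about how step S3.2 could be realized with three or more known light directions:

```python
import numpy as np

def lambertian_normals(intensities: np.ndarray, light_dirs: np.ndarray) -> np.ndarray:
    """Solve I = L @ (rho * n) per pixel by least squares.

    intensities: (k, h, w) stack of images, one per light direction
    light_dirs:  (k, 3) unit light direction vectors
    Returns unit surface normals of shape (h, w, 3).
    """
    k, h, w = intensities.shape
    obs = intensities.reshape(k, -1)                      # (k, h*w)
    g, *_ = np.linalg.lstsq(light_dirs, obs, rcond=None)  # g = rho * n, (3, h*w)
    norm = np.linalg.norm(g, axis=0, keepdims=True)
    n = g / np.where(norm == 0, 1.0, norm)                # normalize, avoid /0
    return n.T.reshape(h, w, 3)

# Hypothetical test: a flat surface facing +Z, lit from three directions.
L = np.array([[0.0, 0.0, 1.0], [0.6, 0.0, 0.8], [0.0, 0.6, 0.8]])
n_true = np.array([0.0, 0.0, 1.0])
imgs = (L @ n_true).reshape(3, 1, 1) * np.ones((3, 2, 2))
normals = lambertian_normals(imgs, L)
print(normals[0, 0])  # approximately [0, 0, 1]
```

This recovers only the diffuse normal map; step S3.3's specular normals and the fusion of step S3.4 are separate stages.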
As a further preferable embodiment of the above technical means, step S4 is specifically implemented as the following steps:
step S4.1: separate the RGB three channels of the normal map and extract the single-band image of the B (blue) channel, i.e. the projection of the normal onto the Z axis of the spatial coordinate system;
step S4.2: invert the B (blue) channel image by subtracting its gray values from the maximum gray value 255 to obtain the depth projection image: Nmap_z' = 255 - Nmap_z, where Nmap_z denotes the Z-axis normal projection image and Nmap_z' denotes the depth projection image.
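Steps S4.1 and S4.2 amount to a channel extraction and an inversion; a minimal sketch (assuming an 8-bit normal map stored in RGB channel order):

```python
import numpy as np

def depth_projection(normal_map_rgb: np.ndarray) -> np.ndarray:
    """Steps S4.1-S4.2: take the B channel (the normal's Z-axis projection)
    and invert it: Nmap_z' = 255 - Nmap_z."""
    nmap_z = normal_map_rgb[..., 2]                  # B channel (index 2 in RGB order)
    return (255 - nmap_z.astype(np.uint16)).astype(np.uint8)

normal_map = np.zeros((2, 2, 3), dtype=np.uint8)
normal_map[..., 2] = 200                             # hypothetical near-flat surface
depth_proj = depth_projection(normal_map)
print(depth_proj.tolist())  # [[55, 55], [55, 55]]
```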
As a further preferable embodiment of the above technical means, step S5 is specifically implemented as the following steps:
step S5.1: invert the specular reflection brightness image and fuse it proportionally with the depth projection image according to the formula Map_roughness = k1 × Nmap_z' + k2 × (255 - Map_specular), where k1 and k2 are the fusion coefficients.
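The fusion of step S5.1 can be sketched directly from the formula; the coefficient values k1 = k2 = 0.5 are an illustrative assumption, since the patent leaves them as free parameters:

```python
import numpy as np

def roughness_map(depth_proj: np.ndarray, map_specular: np.ndarray,
                  k1: float = 0.5, k2: float = 0.5) -> np.ndarray:
    """Step S5.1: Map_roughness = k1 * Nmap_z' + k2 * (255 - Map_specular)."""
    r = k1 * depth_proj.astype(np.float64) + k2 * (255 - map_specular.astype(np.float64))
    return np.clip(r, 0, 255).astype(np.uint8)

depth_proj = np.full((2, 2), 55, dtype=np.uint8)   # hypothetical depth projection
map_spec = np.full((2, 2), 120, dtype=np.uint8)    # hypothetical specular brightness
rough = roughness_map(depth_proj, map_spec)
print(rough.tolist())  # [[95, 95], [95, 95]]  (0.5*55 + 0.5*135)
```

Higher output values correspond to rougher surface regions, matching the roughness-map convention described in the background section.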
Drawings
FIG. 1 is a flow chart of a method of calculating a surface roughness map of an object according to the present invention.
FIG. 2 is a schematic diagram of obtaining gradient polarized light according to a first embodiment of the present invention for calculating a surface roughness map of an object.
FIG. 3 is a schematic diagram of obtaining gradient polarized light according to a second embodiment of the present invention for calculating a surface roughness map of an object.
Reference numerals: 10. an object; 20. a camera; 30. a set of lighting lamps; 40. a first polarizing plate; 50. a second polarizing plate; 60. light ball lighting device support.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
Referring to fig. 1 of the drawings, fig. 1 is a flow chart of a method for calculating an object surface roughness map of the present invention, fig. 2 is a schematic diagram of obtaining gradient polarized light of a first embodiment of calculating an object surface roughness map of the present invention, and fig. 3 is a schematic diagram of obtaining gradient polarized light of a second embodiment of calculating an object surface roughness map of the present invention.
In the preferred embodiments of the present invention, those skilled in the art should note that concepts involved in the invention, such as the RGB three-channel representation and the law of reflection, can be regarded as prior art.
A first embodiment.
The invention discloses a method for calculating a surface roughness map of an object, which is used for calculating the surface roughness map of the object and comprises the following steps:
step S1: acquiring a gradient polarized light image of an object;
step S2: performing specular reflection light separation to obtain a specular reflection brightness image;
step S3: calculating a normal map of the object;
step S4: obtaining a depth projection image of the object based on the normal map;
step S5: calculating the surface roughness map of the object based on the specular reflection brightness image and the depth projection image.
Specifically, step S1 is implemented as the following steps:
step S1.1: the object to be photographed is placed at the central point of a lighting-sphere illumination device, with illumination lamps arranged around the sphere; a spatial coordinate system is established with the central point as the origin, its coordinate axes being the X, Y, and Z axes;
step S1.2: acquiring a first group of gradient polarized light data of the object to form first data;
step S1.3: acquiring a second group of gradient polarized light data of the object to form second data;
step S1.4: acquiring a third group of gradient polarized light data of the object to form third data;
step S1.5: the first, second, and third data are labeled, and full-on polarization data of the object under the fully lit lamp groups is acquired, to obtain the gradient polarized light image of the object.
More specifically, step S1.1 is embodied as the following steps:
step S1.1.1: in the spatial coordinate system, the direction directly in front of the object is set as the positive Z-axis direction;
step S1.1.2: the direction directly above the object is set as the positive Y-axis direction;
step S1.1.3: the direction directly to the right of the object is set as the positive X-axis direction.
Further, step S1.2 is embodied as the following steps:
step S1.2.1: turning on the lighting lamp set on the positive direction side of the X axis (under the condition of no ambient light, the same operation is performed below), and adjusting the brightness of the light source of the lighting lamp set so that the brightness of the light source is weakened from strong in the positive direction of the X axis to form gradient light on the surface of the object, placing polarizing plates in front of the lighting lamp set (positive direction side of the X axis) and the camera lens, and making the polarization direction of the light and the polarization direction of the incident light of the camera lens parallel (specifically, the first polarizing plate 40 arranged in front of the lighting lamp set 30 is horizontally installed, the second polarizing plate 50 arranged in front of the camera 20 is also horizontally installed, so that the polarization direction of the light and the polarization direction of the incident light of the camera lens are parallel), and taking a picture to obtain + X axis parallel direction data;
turning on the lighting lamp set on the side of the negative direction of the X axis, and adjusting the brightness of the light sources of the lighting lamp set so that the brightness of the light sources is weakened from strong to weak along the negative direction of the X axis to form gradient light on the surface of the object, placing polarizing plates in front of the lighting lamp set (the side of the negative direction of the X axis) and the camera lens so that the polarization direction of the light is parallel to the polarization direction of the incident light of the camera lens (specifically, the first polarizing plate 40 arranged in front of the lighting lamp set 30 is horizontally installed, and the second polarizing plate 50 arranged in front of the camera 20 is also horizontally installed so that the polarization direction of the light is parallel to the polarization direction of the incident light of the camera lens), and taking a picture to obtain-X-axis parallel direction;
step S1.2.2: subtracting-X-axis parallel direction data from the acquired + X-axis parallel direction data to acquire X-axis parallel direction gradient polarized light image data;
step S1.2.3: turning on the lighting lamp group on one side of the positive direction of the X axis, adjusting the brightness of the light source of the lighting lamp group to ensure that the brightness of the light source is weakened from strong to weak along the positive direction of the X axis, forming gradient light on the surface of an object, placing polaroids in front of the lighting lamp group (one side of the positive direction of the X axis) and a camera lens, ensuring that the polarization direction of the light is vertical to the polarization direction of incident light of the camera lens (specifically, the first polaroid 40 arranged in front of the lighting lamp group 30 is horizontally arranged, and the second polaroid 50 arranged in front of the camera 20 is vertically arranged by rotating 90 degrees, so that the polarization direction of the light is vertical to the polarization direction of the incident light of the camera lens), and shooting a picture to obtain data;
turning on the lighting lamp set on the side of the negative direction of the X axis, and adjusting the brightness of the light sources of the lighting lamp set so that the brightness of the light sources is weakened from strong to weak along the negative direction of the X axis to form gradient light on the surface of an object, placing polarizing plates in front of the lighting lamp set (the side of the negative direction of the X axis) and the camera lens, and making the polarization direction of the light and the polarization direction of the incident light of the camera lens perpendicular (specifically, the first polarizing plate 40 arranged in front of the lighting lamp set 30 is horizontally installed, and the second polarizing plate 50 arranged in front of the camera 20 is vertically installed by rotating by 90 degrees, so that the polarization direction of the light and the polarization direction of the incident light of the camera lens are perpendicular), and taking a picture to obtain data of the;
step S1.2.4: subtracting the -X-axis vertical direction data from the acquired +X-axis vertical direction data to obtain the X-axis vertical direction gradient polarized light image data;
step S1.2.5: taking the obtained X-axis parallel direction gradient polarized light image data and X-axis vertical direction gradient polarized light image data together as the first data.
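Steps S1.2.1 to S1.2.5 reduce each axis to a signed difference of two opposing gradient-lit shots. The differencing can be sketched in NumPy as follows (the function name is illustrative, not from the patent):

```python
import numpy as np

def gradient_polarized_image(pos_img: np.ndarray, neg_img: np.ndarray) -> np.ndarray:
    """Difference of two opposing gradient-lit shots (e.g. +X minus -X).

    Cast to float first so negative gradient values are preserved
    instead of wrapping around in uint8 arithmetic.
    """
    return pos_img.astype(np.float32) - neg_img.astype(np.float32)
```

The same call serves the Y and Z axes in steps S1.3 and S1.4.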
Further, step S1.3 is embodied as the following steps:
step S1.3.1: turning on the lighting lamp group on the positive Y-axis side and adjusting the brightness of its light sources so that the brightness weakens along the positive Y-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (positive Y-axis side) and the camera lens so that the polarization direction of the lamp light is parallel to the polarization direction of the incident light at the camera lens (for the concrete arrangement, refer to the X-axis case, as shown in FIG. 2); and taking a picture to obtain +Y-axis parallel direction data;
then turning on the lighting lamp group on the negative Y-axis side and adjusting the brightness of its light sources so that the brightness weakens along the negative Y-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (negative Y-axis side) and the camera lens in the same parallel polarization arrangement (see the X-axis case, FIG. 2); and taking a picture to obtain -Y-axis parallel direction data;
step S1.3.2: subtracting the -Y-axis parallel direction data from the acquired +Y-axis parallel direction data to obtain the Y-axis parallel direction gradient polarized light image data;
step S1.3.3: turning on the lighting lamp group on the positive Y-axis side and adjusting the brightness of its light sources so that the brightness weakens along the positive Y-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (positive Y-axis side) and the camera lens so that the polarization direction of the lamp light is perpendicular to the polarization direction of the incident light at the camera lens (for the concrete arrangement, refer to the X-axis case, as shown in FIG. 2); and taking a picture to obtain +Y-axis vertical direction data;
then turning on the lighting lamp group on the negative Y-axis side and adjusting the brightness of its light sources so that the brightness weakens along the negative Y-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (negative Y-axis side) and the camera lens in the same perpendicular polarization arrangement (see the X-axis case, FIG. 2); and taking a picture to obtain -Y-axis vertical direction data;
step S1.3.4: subtracting the -Y-axis vertical direction data from the acquired +Y-axis vertical direction data to obtain the Y-axis vertical direction gradient polarized light image data;
step S1.3.5: taking the obtained Y-axis parallel direction gradient polarized light image data and Y-axis vertical direction gradient polarized light image data together as the second data.
Preferably, step S1.4 is embodied as the following steps:
step S1.4.1: turning on the lighting lamp group on the positive Z-axis side and adjusting the brightness of its light sources so that the brightness weakens along the positive Z-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (positive Z-axis side) and the camera lens so that the polarization direction of the lamp light is parallel to the polarization direction of the incident light at the camera lens (for the concrete arrangement, refer to the X-axis case, as shown in FIG. 2); and taking a picture to obtain +Z-axis parallel direction data;
then turning on the lighting lamp group on the negative Z-axis side and adjusting the brightness of its light sources so that the brightness weakens along the negative Z-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (negative Z-axis side) and the camera lens in the same parallel polarization arrangement (see the X-axis case, FIG. 2); and taking a picture to obtain -Z-axis parallel direction data;
step S1.4.2: subtracting the -Z-axis parallel direction data from the acquired +Z-axis parallel direction data to obtain the Z-axis parallel direction gradient polarized light image data;
step S1.4.3: turning on the lighting lamp group on the positive Z-axis side and adjusting the brightness of its light sources so that the brightness weakens along the positive Z-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (positive Z-axis side) and the camera lens so that the polarization direction of the lamp light is perpendicular to the polarization direction of the incident light at the camera lens (for the concrete arrangement, refer to the X-axis case, as shown in FIG. 2); and taking a picture to obtain +Z-axis vertical direction data;
then turning on the lighting lamp group on the negative Z-axis side and adjusting the brightness of its light sources so that the brightness weakens along the negative Z-axis direction, forming gradient light on the surface of the object; placing polarizers in front of the lighting lamp group (negative Z-axis side) and the camera lens in the same perpendicular polarization arrangement (see the X-axis case, FIG. 2); and taking a picture to obtain -Z-axis vertical direction data;
step S1.4.4: subtracting the -Z-axis vertical direction data from the acquired +Z-axis vertical direction data to obtain the Z-axis vertical direction gradient polarized light image data;
step S1.4.5: taking the obtained Z-axis parallel direction gradient polarized light image data and Z-axis vertical direction gradient polarized light image data together as the third data.
Preferably, step S2 is embodied as the following steps:
step S2.1: when the polarization direction of the lamp light is parallel to the polarization direction of the incident light at the camera lens, the light reaching the camera lens is I2, which contains both the transmitted diffuse reflected light I_D and the specular reflected light I_S (I2 = I_D + I_S);
step S2.2: when the polarization direction of the lamp light is perpendicular to the polarization direction of the incident light at the camera lens, the light reaching the camera lens is I1, consisting only of the diffuse reflected light I_D that passes the camera polarizer (I1 = I_D);
step S2.3: the specular reflected light I_S is separated out according to the formula I_S = I2 - I1;
Step S2.4: according to the step S2.1-step S2.3, subtracting the gradient polarized light image number in the X-axis vertical direction from the gradient polarized light image data in the X-axis parallel direction in the first data to obtain a specular reflection light gradient image in the X-axis direction; subtracting the gradient polarized light image number in the Y-axis vertical direction from the gradient polarized light image data in the Y-axis parallel direction in the second data to obtain a specular reflection light gradient image in the Y-axis direction; subtracting the gradient polarized light image number in the direction perpendicular to the Z axis from the gradient polarized light image data in the direction parallel to the Z axis in the third data to obtain a specular reflection light gradient image in the direction of the Z axis; acquiring a specular reflection light gradient image of an object under a full-bright lighting lamp group (acquired in step S1.5, the same principle is adopted, and data acquired when the light polarization direction is parallel to the polarization direction of incident light of a camera lens is subtracted from data acquired when the light polarization direction is perpendicular to the polarization direction of the incident light of the camera lens); (ii) a
Step S2.5: converting the RGB three-channel image in the specular reflection light gradient image of the object under the full-bright lighting lamp group acquired in the step S2.4 into an HSL space, extracting an L-channel image (brightness), namely a specular reflection brightness image, and recording the specular reflection brightness image as Mapspecular。
Preferably, step S3 is embodied as the following steps:
step S3.1: calculating a normal map by separating the specular reflection and diffuse reflection images of the object through step S1;
step S3.2: for diffuse reflection, the object surface is treated as approximately a Lambertian body: the intensity of the reflected light at the camera lens is independent of the reflection direction and depends only on the angle between the incident light and the normal; with the incident light direction fixed, the diffuse reflection normal direction is solved from the reflected light images to obtain the diffuse reflection normal map;
step S3.3: for specular reflection, the specular reflection normal is solved using the bidirectional reflectance distribution function (BRDF) to obtain the specular reflection normal map;
In specular reflection, the specular reflection normal map is solved from the specular BRDF (bidirectional reflectance distribution function). The specular reflection direction and the specular cone angle S enter this function, and, according to the gradient-light direction characteristics, the expression can be simplified through a coordinate transformation. The reflection vector in the viewing direction is normalized, and the normalized half-vector between it and the light direction, which is correlated with the direction of the specular normal, yields the specular reflection normal of the object surface. The normal map of the object is obtained by fusing the specular reflection normal map and the diffuse reflection normal map; in this embodiment the fusion ratio is 0.5 : 0.5. The three channels R, G, B of the normal map represent the projections of the normal onto the X, Y, and Z axes of the spatial coordinate system.
Step S3.4: and fusing the specular reflection map and the diffuse reflection normal map to obtain the normal map of the object.
Preferably, step S4 is embodied as the following steps:
step S4.1: separating the RGB three channels of the normal map and extracting the single-band image of the B (blue) channel, i.e. the projection of the normal onto the Z axis of the spatial coordinate system;
step S4.2: inverting the B (blue) channel image by subtracting each gray value from the maximum gray value 255 to obtain the depth projection image: Nmap_z' = 255 - Nmap_z, where Nmap_z denotes the normal projection image on the Z axis and Nmap_z' denotes the depth projection image.
Preferably, step S5 is embodied as the following steps:
step S5.1: and (3) performing inversion (inverse operation) on the specular reflection brightness image, and proportionally fusing the specular reflection brightness image with the depth projection image, wherein the formula is as follows: MAProughness=k1×NmapZ′+k2×(255-MAPspecular),k1And k2Representing the fusion coefficient.
Wherein MAProughnessRepresenting roughness image, k1、k2Denotes the fusion coefficient, k1+k2The fusion coefficient can be adjusted according to the material of the object, i.e. k in this embodiment, 11=k2=0.5。
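Step S5.1, with the coefficients of this embodiment, can be sketched as:

```python
import numpy as np

def roughness_map(depth_proj: np.ndarray, map_specular: np.ndarray,
                  k1: float = 0.5, k2: float = 0.5) -> np.ndarray:
    """MAP_roughness = k1 * Nmap_z' + k2 * (255 - MAP_specular), k1 + k2 = 1.

    depth_proj is the inverted Z projection (Nmap_z'); map_specular is the
    specular brightness image; both are expected in the 0..255 range.
    """
    return (k1 * depth_proj.astype(np.float32)
            + k2 * (255.0 - map_specular.astype(np.float32)))
```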
Preferably, the light-sphere fixture mount 60 is provided with a plurality of lighting lamp groups 30, the number of first polarizers 40 matching the number of lighting lamp groups.
A second embodiment.
Referring to fig. 3, the second embodiment is different from the first embodiment in that the implementation of "the polarization direction of the lamp light is parallel to the polarization direction of the incident light from the camera lens" and "the polarization direction of the lamp light is perpendicular to the polarization direction of the incident light from the camera lens" is different, specifically as follows:
The specific implementation is as follows: the first polarizers 40 arranged in front of the lighting lamp groups 30 and the second polarizer 50 arranged in front of the camera 20 are fixedly mounted and not rotatable (the second polarizer 50 has only one fixed mounting orientation, horizontal or vertical; in this embodiment it is mounted horizontally). The first polarizers 40 in front of the lighting lamp groups 30 are fixedly mounted in two ways: one kind horizontally with the lighting lamp group 30 (shown in FIG. 3 as the first polarizers without fill lines), the other kind vertically with the lighting lamp group 30 (shown in FIG. 3 as the first polarizers with fill lines); the two kinds are mounted closely side by side, so that the number of lighting lamp groups in the second embodiment is twice that in the first embodiment;
when the polarization direction of the lamplight is parallel to the polarization direction of incident light of the camera lens, only the second polaroid needs to be matched with the first polaroid without the filling line;
when the polarization direction of the lamp light is vertical to the polarization direction of the incident light of the camera lens, the second polaroid is matched with the first polaroid with the filling line.
It should be noted that technical features involved in this patent application, such as the RGB three-channel representation and the bidirectional reflection law, should be regarded as prior art; the specific structure, operating principle, control mode, and spatial arrangement of these features may follow conventional choices in the field, are not regarded as inventive points of this patent, and are therefore not described in further detail.
It will be apparent to those skilled in the art that modifications and equivalents may be made in the embodiments and/or portions thereof without departing from the spirit and scope of the present invention.
Claims (10)
1. A method of calculating a surface roughness map of an object, for calculating a surface roughness map of the object, comprising the steps of:
step S1: acquiring a gradient polarized light image of an object;
step S2: performing specular reflection light separation to obtain a specular reflection brightness image;
step S3: calculating a normal map of the object;
step S4: obtaining a depth projection image of the object based on the normal map;
step S5: and calculating the surface roughness map of the object based on the specular reflection brightness image and the depth projection image.
2. The method for calculating the object surface roughness map according to claim 1, wherein the step S1 is implemented as the following steps:
step S1.1: the photographed object is placed at the center point of the light-sphere illumination device, around which lighting lamps are arranged; a spatial coordinate system is established with the center point as the origin, the coordinate axes being the X, Y, and Z axes;
step S1.2: obtaining data for gradient polarized light for a first set of objects to form first data;
step S1.3: obtaining data for the gradient polarized light of a second set of objects to form second data;
step S1.4: obtaining data for gradient polarized light for a third set of objects to form third data;
step S1.5: the three different sets of first data, second data and third data are labeled and full-on polarization data of the object under the full-on set of lamps is acquired to obtain a gradient polarized light image of the object.
3. The method of claim 2, wherein step S1.1 is embodied as the steps of:
step S1.1.1: in a space coordinate system, setting the positive front of an object as the positive direction of a Z axis;
step S1.1.2: setting the positive direction of the Y axis right above the object;
step S1.1.3: the positive right direction of the object is set as the positive X-axis direction.
4. A method for calculating a surface roughness map of an object according to claim 3, wherein step S1.2 is embodied as the following steps:
step S1.2.1: turning on an illumination lamp group on one side of the positive direction of an X axis, adjusting the brightness of a light source of the illumination lamp group to enable the brightness of the light source to be weakened from strong to weak along the positive direction of the X axis, forming gradient light on the surface of an object, placing a polaroid in front of the illumination lamp group and a camera lens, enabling the polarization direction of lamp light to be parallel to the polarization direction of incident light of the camera lens, and taking a picture to obtain data of the + X axis parallel direction;
turning on the lighting lamp group on one side of the negative direction of the X axis, and adjusting the brightness of the light sources of the lighting lamp group, so that the brightness of the light sources is weakened from strong to weak along the negative direction of the X axis, gradient light is formed on the surface of an object, a polaroid is placed in front of the lighting lamp group and the camera lens, the polarization direction of the lamp light is parallel to the polarization direction of incident light of the camera lens, and a picture is taken, so that-X-axis parallel direction data are obtained;
step S1.2.2: subtracting-X-axis parallel direction data from the acquired + X-axis parallel direction data to acquire X-axis parallel direction gradient polarized light image data;
step S1.2.3: turning on an illumination lamp group on one side of the positive direction of an X axis, adjusting the brightness of a light source of the illumination lamp group to enable the brightness of the light source to be weakened from strong to weak along the positive direction of the X axis, forming gradient light on the surface of an object, placing a polaroid in front of the illumination lamp group and a camera lens, enabling the polarization direction of lamp light to be perpendicular to the polarization direction of incident light of the camera lens, and taking a picture to obtain data of the positive X axis vertical direction;
turning on the lighting lamp group on one side of the negative direction of the X axis, and adjusting the brightness of the light sources of the lighting lamp group, so that the brightness of the light sources is weakened from strong to weak along the negative direction of the X axis, gradient light is formed on the surface of an object, a polaroid is placed in front of the lighting lamp group and the camera lens, the polarization direction of the lamp light is perpendicular to the polarization direction of incident light of the camera lens, and a picture is taken, so that -X-axis vertical direction data are obtained;
step S1.2.4: subtracting-X-axis vertical direction data from the acquired + X-axis vertical direction data to acquire X-axis vertical direction gradient polarized light image data;
step S1.2.5: the obtained X-axis parallel direction gradient polarized light image data and X-axis perpendicular direction gradient polarized light image data are taken as first data.
5. The method of claim 4, wherein step S1.3 is embodied as the steps of:
step S1.3.1: turning on an illuminating lamp group on one side of the positive direction of a Y axis, adjusting the brightness of a light source of the illuminating lamp group to enable the brightness of the light source to be weakened from strong to weak along the positive direction of the Y axis, forming gradient light on the surface of an object, placing a polaroid in front of the illuminating lamp group and a camera lens, enabling the polarization direction of lamp light to be parallel to the polarization direction of incident light of the camera lens, and taking a picture to obtain + Y-axis parallel direction data;
turning on the lighting lamp group on one side of the Y-axis negative direction, and adjusting the brightness of the light sources of the lighting lamp group, so that the brightness of the light sources is weakened from strong along the Y-axis negative direction, gradient light is formed on the surface of an object, a polaroid is placed in front of the lighting lamp group and the camera lens, the polarization direction of the lamp light is parallel to the polarization direction of incident light of the camera lens, and a picture is taken, so that-Y-axis parallel direction data is obtained;
step S1.3.2: subtracting-Y-axis parallel direction data from the acquired + Y-axis parallel direction data to acquire Y-axis parallel direction gradient polarized light image data;
step S1.3.3: turning on an illuminating lamp group on one side of the positive direction of a Y axis, adjusting the brightness of a light source of the illuminating lamp group to enable the brightness of the light source to be weakened from strong to weak along the positive direction of the Y axis, forming gradient light on the surface of an object, placing a polaroid in front of the illuminating lamp group and a camera lens, enabling the polarization direction of lamp light to be perpendicular to the polarization direction of incident light of the camera lens, and shooting a picture to obtain data of the + Y-axis vertical direction;
turning on the lighting lamp group on one side of the Y-axis negative direction, and adjusting the brightness of the light sources of the lighting lamp group, so that the brightness of the light sources is weakened from strong in the Y-axis negative direction, gradient light is formed on the surface of an object, a polarizing plate is placed in front of the lighting lamp group and a camera lens, the polarization direction of the lamp light is perpendicular to the polarization direction of incident light of the camera lens, and a picture is taken, so that-Y-axis perpendicular direction data is obtained;
step S1.3.4: subtracting-Y-axis vertical direction data from the acquired + Y-axis vertical direction data to acquire Y-axis vertical direction gradient polarized light image data;
step S1.3.5: the obtained Y-axis parallel direction gradient polarized light image data and the Y-axis perpendicular direction gradient polarized light image data are taken as second data.
6. The method of claim 5, wherein step S1.4 is embodied as the steps of:
step S1.4.1: turning on an illuminating lamp group on one side of the positive direction of the Z axis, adjusting the brightness of a light source of the illuminating lamp group to enable the brightness of the light source to be weakened from strong to weak along the positive direction of the Z axis, forming gradient light on the surface of an object, placing a polaroid in front of the illuminating lamp group and a camera lens, enabling the polarization direction of lamp light to be parallel to the polarization direction of incident light of the camera lens, and taking a picture to obtain + Z-axis parallel direction data;
turning on the lighting lamp group on one side of the Z-axis negative direction, and adjusting the brightness of the light source of the lighting lamp group, so that the brightness of the light source is weakened from strong along the Z-axis negative direction, gradient light is formed on the surface of an object, a polaroid is placed in front of the lighting lamp group and the camera lens, the polarization direction of the lamp light is parallel to the polarization direction of incident light of the camera lens, and a picture is taken, so that-Z-axis parallel direction data is obtained;
step S1.4.2: subtracting-Z-axis parallel direction data from the acquired + Z-axis parallel direction data to acquire Z-axis parallel direction gradient polarized light image data;
step S1.4.3: turning on an illuminating lamp group on one side of the positive direction of the Z axis, adjusting the brightness of a light source of the illuminating lamp group to enable the brightness of the light source to be weakened from strong to weak along the positive direction of the Z axis, forming gradient light on the surface of an object, placing a polaroid in front of the illuminating lamp group and a camera lens, enabling the polarization direction of lamp light to be perpendicular to the polarization direction of incident light of the camera lens, and taking a picture to obtain data of the positive Z-axis vertical direction;
turning on the lighting lamp group on one side of the Z-axis negative direction, and adjusting the brightness of the light source of the lighting lamp group, so that the brightness of the light source is weakened from strong along the Z-axis negative direction, gradient light is formed on the surface of an object, a polaroid is placed in front of the lighting lamp group and the camera lens, the polarization direction of the lamp light is perpendicular to the polarization direction of incident light of the camera lens, and a picture is taken, so that-Z-axis vertical direction data is obtained;
step S1.4.4: subtracting-Z-axis vertical direction data from the acquired + Z-axis vertical direction data to acquire Z-axis vertical direction gradient polarized light image data;
step S1.4.5: and taking the obtained gradient polarized light image data in the Z-axis parallel direction and gradient polarized light image data in the Z-axis vertical direction as third data.
7. The method for calculating the object surface roughness map according to claim 6, wherein the step S2 is implemented as the following steps:
step S2.1: when the polarization direction of the lamp light is parallel to the polarization direction of the incident light at the camera lens, the light reaching the camera lens is I2, which contains both the transmitted diffuse reflected light I_D and the specular reflected light I_S;
step S2.2: when the polarization direction of the lamp light is perpendicular to the polarization direction of the incident light at the camera lens, the light reaching the camera lens is I1, consisting only of the diffuse reflected light I_D;
step S2.3: the specular reflected light I_S is separated out according to the formula I_S = I2 - I1;
Step S2.4: according to the step S2.1-step S2.3, subtracting the gradient polarized light image number in the X-axis vertical direction from the gradient polarized light image data in the X-axis parallel direction in the first data to obtain a specular reflection light gradient image in the X-axis direction; subtracting the gradient polarized light image number in the Y-axis vertical direction from the gradient polarized light image data in the Y-axis parallel direction in the second data to obtain a specular reflection light gradient image in the Y-axis direction; subtracting the gradient polarized light image number in the direction perpendicular to the Z axis from the gradient polarized light image data in the direction parallel to the Z axis in the third data to obtain a specular reflection light gradient image in the direction of the Z axis; acquiring a specular reflection light gradient image of an object under a full-bright lighting lamp group;
step S2.5: converting the RGB three-channel specular reflection light gradient image of the object under the fully lit lamp group, acquired in step S2.4, into the HSL color space and extracting the L channel image, i.e. the specular reflection brightness image, denoted Map_specular.
8. The method of claim 7, wherein the step S3 is implemented as the following steps:
step S3.1: calculating a normal map by separating the specular reflection and diffuse reflection images of the object through step S1;
step S3.2: for diffuse reflection, the object surface is a Lambertian body: the intensity of the reflected light at the camera lens is independent of the reflection direction and depends only on the angle between the incident light and the normal; with the incident light direction fixed, the diffuse reflection normal direction is solved from the reflected light images to obtain the diffuse reflection normal map;
step S3.3: for specular reflection, the specular reflection normal is solved using the bidirectional reflectance distribution function to obtain the specular reflection normal map;
step S3.4: fusing the specular reflection normal map and the diffuse reflection normal map to obtain the normal map of the object.
9. The method of claim 8, wherein the step S4 is embodied as the following steps:
step S4.1: separating RGB three channels of the normal map to extract a single-waveband image of a B channel, namely the projection of the normal on a Z axis in a space coordinate system;
step S4.2: and (4) negating the B channel image, and subtracting the image gray value by using the highest gray value 255 to obtain a depth projection image which is recorded as Nmapz′=255-Nmapz,NmapzNormal projection image representing the Z axis, Nmapz' represents a depth projection image.
10. The method of claim 9, wherein the step S5 is embodied as the steps of:
step S5.1: the mirror reflection brightness image is inverted and is fused with the depth projection image in proportion, and the formula is as follows: MAProughness=k1×NmapZ′+k2×(255-MAPspecular),k1And k2Representing the fusion coefficient.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011598060.XA CN112700527B (en) | 2020-12-29 | 2020-12-29 | Method for calculating object surface roughness map |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112700527A true CN112700527A (en) | 2021-04-23 |
CN112700527B CN112700527B (en) | 2023-09-26 |
Family
ID=75511962
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120313960A1 (en) * | 2009-12-24 | 2012-12-13 | Sony Computer Entertainment Inc. | Image processing device, image data generation device, image processing method, image data generation method, and data structure of image file |
KR101451792B1 (en) * | 2013-05-30 | 2014-10-16 | Korea Institute of Science and Technology | Image rendering apparatus and method thereof |
CN110021067A (en) * | 2019-03-22 | 2019-07-16 | Jiaxing Chaowei Information Technology Co., Ltd. | Method for constructing a three-dimensional face normal based on specular-reflection gradient polarized light |
CN110033509A (en) * | 2019-03-22 | 2019-07-19 | Jiaxing Chaowei Information Technology Co., Ltd. | Method for constructing a three-dimensional face normal based on diffuse-reflection gradient polarized light |
CN111768473A (en) * | 2020-06-28 | 2020-10-13 | Perfect World (Beijing) Software Technology Development Co., Ltd. | Image rendering method, apparatus and device |
US20200393689A1 (en) * | 2017-12-22 | 2020-12-17 | Sony Interactive Entertainment Inc. | Information processing apparatus and surface roughness acquisition method |
Non-Patent Citations (3)
Title |
---|
WU Qiang; BAI Ruijie; GAO Hong; WANG Mei: "Detection of Object Surface Roughness Based on Polarization Photography and MATLAB", College Physics, no. 08, pages 68 - 77 *
LI Hongsong; LI Fengxia; ZHAO Wei: "Image-Based Real-Time Rendering Algorithm for Directional Transmission", Journal of Computer-Aided Design & Computer Graphics, no. 05, pages 756 - 762 *
GUO Bian et al.: "Reconstruction of Machined Surface Topography and Roughness Detection Based on Shape from Shading", Tool Engineering, vol. 45, no. 6, pages 98 - 102 *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20230075858A (en) * | 2021-11-23 | 2023-05-31 | Yonsei University Industry-Academic Cooperation Foundation | Prediction system and method of perceived coarseness data |
KR102781187B1 (en) | 2025-03-14 | Yonsei University Industry-Academic Cooperation Foundation | Prediction system and method of perceived coarseness data |
CN115147896A (en) * | 2022-06-21 | 2022-10-04 | Beijing Institute of Technology | A face image acquisition system, correction method and acquisition method based on spherical linear polarization technology |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Ghosh et al. | BRDF acquisition with basis illumination | |
US6628298B1 (en) | Apparatus and method for rendering synthetic objects into real scenes using measurements of scene illumination | |
Rouiller et al. | 3D-printing spatially varying BRDFs | |
CN110033509B (en) | Method for constructing three-dimensional face normal based on diffuse reflection gradient polarized light | |
CN109214350B (en) | Method, device and equipment for determining illumination parameters and storage medium | |
Sajadi et al. | Autocalibrating tiled projectors on piecewise smooth vertically extruded surfaces | |
Xing et al. | Lighting simulation of augmented outdoor scene based on a legacy photograph | |
CN112700527A (en) | Method for calculating object surface roughness map | |
CN114155338B (en) | Image rendering method, device and electronic device | |
Alhakamy et al. | Real-time illumination and visual coherence for photorealistic augmented/mixed reality | |
Liu et al. | Openillumination: A multi-illumination dataset for inverse rendering evaluation on real objects | |
Walton et al. | Synthesis of environment maps for mixed reality | |
Morgand et al. | A geometric model for specularity prediction on planar surfaces with multiple light sources | |
Tuceryan | CubeMap360: interactive global illumination for augmented reality in dynamic environment | |
JP2004252603A (en) | Three-dimensional data processing method | |
Cox et al. | Imaging artwork in a studio environment for computer graphics rendering | |
Wang et al. | Capturing and rendering geometry details for BTF-mapped surfaces | |
Morgand et al. | An empirical model for specularity prediction with application to dynamic retexturing | |
Gigilashvili et al. | Appearance manipulation in spatial augmented reality using image differences | |
CN108876891A (en) | Face image data acquisition method and face image data acquisition device | |
JP2006031595A (en) | Image processing system | |
CN108776963B (en) | Reverse image authentication method and system | |
Martos et al. | Realistic virtual reproductions. Image-based modelling of geometry and appearance | |
Lee | Wand: 360° video projection mapping using a 360° camera |
Ahmed et al. | Projector primary-based optimization for superimposed projection mappings |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||