
CN113542708B - Projection surface parameter confirmation method and device, storage medium and projection equipment - Google Patents


Info

Publication number
CN113542708B
CN113542708B (application CN202110838212.7A)
Authority
CN
China
Prior art keywords
brightness
value
brightness value
detection area
calibration
Prior art date
Legal status
Active
Application number
CN202110838212.7A
Other languages
Chinese (zh)
Other versions
CN113542708A
Inventor
吕思成
张子祺
张聪
胡震宇
Current Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN202110838212.7A
Publication of CN113542708A
Application granted
Publication of CN113542708B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00: Details of colour television systems
    • H04N9/12: Picture reproducers
    • H04N9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179: Video signal processing therefor
    • H04N9/3182: Colour adjustment, e.g. white balance, shading or gamut
    • H04N9/3191: Testing thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The disclosure relates to a projection surface parameter confirmation method and device, a storage medium, and projection equipment. The method comprises the following steps: when a detection area within the projection area is not covered by the projection image, acquiring a first brightness value of the detection area; when the detection area is covered by the projection image, recording the first source image corresponding to the detection area and acquiring a second brightness value of the detection area; calculating the brightness value of the first source image in a calibration environment according to brightness reference parameters, the brightness reference parameters comprising a target brightness value measured when a calibration detection area displays a white image and the brightness stimulus values measured when the calibration detection area displays the three primary colors; and calculating a brightness gain parameter of the detection area relative to the calibration detection area from the first brightness value, the second brightness value, and the brightness value of the first source image in the calibration environment, the brightness gain parameter being used to compensate acquired brightness values of the detection area to obtain an ambient light brightness value.

Description

Projection surface parameter confirmation method and device, storage medium and projection equipment
Technical Field
The present disclosure relates to the field of projection technologies, and in particular, to a method and an apparatus for confirming parameters of a projection plane, a storage medium, and a projection device.
Background
Automatic brightness adjustment is a common function of display devices and has in recent years also become popular in projection devices. Ambient brightness, as a parameter characterizing how bright the environment is, is an important basis for automatic brightness adjustment.
Unlike the usage scenario of a device such as a mobile phone, in a projection scenario the difference between the brightness on the user's side and the brightness of the projected picture the user watches may be large. Moreover, the environment involved in projection (for example, the projection surface) is complex and variable, so ambient brightness is difficult to detect accurately in practice; this ultimately reduces the accuracy of the projection device's automatic brightness adjustment and degrades the user experience.
Disclosure of Invention
The present disclosure is directed to a method, an apparatus, a storage medium, and a projection device for determining parameters of a projection plane, so as to solve the above related technical problems.
To this end, according to a first aspect of the embodiments of the present disclosure, there is provided a projection plane parameter confirmation method, including:
when a detection area within the projection area is not covered by the projection image, acquiring the brightness value of the detection area to obtain a first brightness value;
when the detection area is covered by the projection image, recording a first source image corresponding to the detection area, and acquiring the brightness value of the detection area again to obtain a second brightness value;
calculating the brightness value of the first source image in the calibration environment according to brightness reference parameters, wherein the brightness reference parameters comprise a target brightness value when a calibration detection area displays a white picture and the brightness stimulus values when the calibration detection area displays the three optical primary colors, and the calibration detection area is a detection area in the calibration environment;
and calculating a brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment, wherein the brightness gain parameter is used for compensating the acquired brightness value of the detection area to obtain an environment light brightness value.
Optionally, calculating the brightness value of the first source image in the calibration environment according to the brightness reference parameter includes:
calculating the average value of R channels, the average value of G channels and the average value of B channels of all pixel points in the first source image;
calculating the product of the average value of each channel and the brightness stimulation value of the corresponding optical primary color;
and calculating the brightness value of the first source image in the calibration environment according to each product, the brightness stimulation value and the target brightness value.
Optionally, calculating a brightness value of the first source image in the calibration environment according to each product, the brightness stimulus value, and the target brightness value, includes:
summing each product to obtain a first sum;
calculating sum values of brightness stimulation values respectively corresponding to the three primary colors to obtain second sum values;
taking the product of the ratio of the first sum value to the second sum value and the target brightness value as the brightness value of the first source image in the calibration environment;
calculating a brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment, including:
calculating a brightness difference value between the second brightness value and the first brightness value;
and taking the ratio of the brightness value of the first source image in the calibration environment to the brightness difference value as a brightness gain parameter.
Optionally, the method further comprises:
acquiring a first area value of an uncorrected projection image;
acquiring a calibration area value;
calculating the ratio of the first area value to the calibration area value to obtain an area influence parameter;
and compensating the acquired brightness value of the detection area according to the brightness gain parameter and the area influence parameter to obtain an environment light brightness value.
Optionally, compensating the acquired brightness value of the detection region according to the brightness gain parameter and the area influence parameter includes:
and calculating the product of the brightness gain parameter, the area influence parameter and the second brightness value, and taking the difference value of the product and the brightness value of the first source image in the calibration environment as the brightness value of the environment light.
Optionally, compensating the acquired brightness value of the detection region according to the brightness gain parameter and the area influence parameter includes:
when the position of the projection device has not changed and a target projection image of the projection device covers the detection area, recording again a second source image corresponding to the detection area, and acquiring the brightness value of the detection area to obtain a third brightness value;
calculating the brightness value of the second source image in the calibration environment according to the brightness reference parameter;
obtaining the saved brightness gain parameter and area influence parameter;
and calculating the product of the brightness gain parameter, the area influence parameter and the third brightness value, and taking the difference value of the product and the brightness value of the second source image in the calibration environment as the brightness value of the environment light.
Optionally, the method further comprises:
when the position of the projection device changes, the brightness gain parameter is recalculated.
According to a second aspect of the embodiments of the present disclosure, there is provided a projection plane parameter confirmation apparatus, including:
the first acquisition module is used for acquiring the brightness value of the detection area to obtain a first brightness value when a detection area within the projection area is not covered by the projection image;
the first execution module is used for recording a first source image corresponding to the detection area under the condition that the projection image covers the detection area, and acquiring the brightness value of the detection area again to obtain a second brightness value;
the first calculation module is used for calculating the brightness value of the first source image in the calibration environment according to brightness reference parameters, wherein the brightness reference parameters comprise a target brightness value when a calibration detection area displays a white picture and a brightness stimulus value when the calibration detection area displays three primary optical colors, and the calibration detection area is a detection area in the calibration environment;
and the second calculation module is used for calculating a brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment, wherein the brightness gain parameter is used for compensating the acquired brightness value of the detection area to obtain an ambient light brightness value.
Optionally, the first computing module includes:
the first calculation submodule is used for calculating the average value of R channels, the average value of G channels and the average value of B channels of all pixel points in the first source image;
the second calculation submodule is used for calculating the product of the average value of each channel and the brightness stimulus value of the corresponding optical primary color;
and the third calculation submodule is used for calculating the brightness value of the first source image in the calibration environment according to each product, the brightness stimulation value and the target brightness value.
Optionally, the third computation submodule is configured to sum each product to obtain a first sum; calculating sum values of brightness stimulation values respectively corresponding to the three primary colors to obtain second sum values; taking the product of the ratio of the first sum value to the second sum value and the target brightness value as the brightness value of the first source image in the calibration environment;
the second calculation module is used for calculating the brightness difference value between the second brightness value and the first brightness value; and taking the ratio of the brightness value of the first source image in the calibration environment to the brightness difference value as a brightness gain parameter.
Optionally, the apparatus further comprises a second acquiring module, configured to acquire a first area value of the uncorrected projection image;
the third acquisition module is used for acquiring a calibration area value;
the third calculation module is used for calculating the ratio of the first area value to the calibration area value to obtain an area influence parameter;
and the ambient light brightness determining module is used for compensating the acquired brightness value of the detection area according to the brightness gain parameter and the area influence parameter to obtain an ambient light brightness value.
Optionally, the ambient light brightness determination module includes:
and the first ambient light calculation submodule is used for calculating the product of the brightness gain parameter, the area influence parameter and the second brightness value, and taking the difference value of the product and the brightness value of the first source image in the calibration environment as the ambient light brightness value.
Optionally, the ambient light brightness determination module includes:
the first execution submodule is used for recording again a second source image corresponding to the detection area when the position of the projection device has not changed and a target projection image of the projection device covers the detection area, and acquiring the brightness value of the detection area to obtain a third brightness value;
the fourth calculation sub-module is used for calculating the brightness value of the second source image in the calibration environment according to the brightness reference parameter;
and the fifth calculation submodule is used for calculating the product of the brightness gain parameter, the area influence parameter and the third brightness value, and taking the difference value of the product and the brightness value of the second source image in the calibration environment as the brightness value of the environment light.
Optionally, the apparatus further comprises:
and the fourth calculation module is used for recalculating the brightness gain parameter when the position of the projection equipment is changed.
According to a third aspect of embodiments of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any one of the methods of the first aspect described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a projection apparatus including:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to implement the steps of the method of any of the first aspects above.
According to the above technical scheme, when the projection image projected by the projection device covers the detection area, the corresponding first source image can be obtained and the second brightness value of the detection area at that moment recorded. The brightness value of the first source image in the calibration environment can then be determined based on the brightness reference parameters calibrated in that environment. Since the difference between the second brightness value and the first brightness value represents the brightness of the first source image in the detection area, the brightness gain parameter of the detection area relative to the calibration detection area can be calculated from the first brightness value, the second brightness value, and the brightness value of the first source image in the calibration environment. The brightness value measured in the detection area can then be compensated based on the brightness gain parameter to obtain the ambient light brightness value. In this way, the interference of different projection surfaces with the ambient light brightness detection process is reduced, and the accuracy of the determined ambient light brightness value is improved.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure without limiting the disclosure. In the drawings:
fig. 1 is a flowchart illustrating a method for confirming parameters of a projection surface according to an exemplary embodiment of the disclosure.
Fig. 2 is a schematic diagram of a projection area and a detection area according to an exemplary embodiment of the disclosure.
Fig. 3 is a flowchart illustrating a method for confirming parameters of a projection plane according to an exemplary embodiment of the disclosure.
FIG. 4 is a schematic diagram of a projection scene shown in an exemplary embodiment of the present disclosure.
Fig. 5 is a block diagram of a projection plane parameter confirmation apparatus according to an exemplary embodiment of the present disclosure.
FIG. 6 is a block diagram of an electronic device shown in an exemplary embodiment of the present disclosure.
Detailed Description
The following detailed description of specific embodiments of the present disclosure is provided in connection with the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present disclosure, are given by way of illustration and explanation only, not limitation.
Before describing the projection surface parameter confirmation method, device, storage medium, and projection apparatus of the present disclosure, an application scenario of the present disclosure is first described.
A projection device forms its image by diffuse reflection and is therefore easily disturbed by ambient light, which also makes placing a sensor for brightness detection difficult. If the sensor faces the user's side and the user turns on a light source only on the projection side, there may be no light on the user's side; the sensor then detects low brightness even though the actual projected picture may be fairly bright. If the sensor faces upward above the projection device, the user's light source may shine directly on it; the sensor then reports high brightness even when the projected picture is dark. If the sensor faces the wall, it is susceptible to the projection device's own light: when the projected image changes continuously, the reading fluctuates continuously as well.
In some scenarios, multiple groups of sensors can be arranged to detect the brightness of the projection device in different directions, improving the accuracy of the detection result; however, this increases sensor cost and algorithmic complexity. In other scenarios, accuracy can be improved by filtering out the projection light, but because different projection surfaces apply different gains to light, an accurate ambient brightness result is still difficult to obtain this way.
To this end, the present disclosure provides a projection surface parameter confirmation method, which may be applied to a projection device or to a computing and processing device associated with the projection device. Fig. 1 is a flowchart of the projection surface parameter confirmation method shown in the present disclosure, where the method includes:
in step 11, when the projection image is not covered by the detection area within the projection area, the luminance value of the detection area is acquired, and the first luminance value is obtained.
The detection area may be provided, for example, in the detection area range of the relevant brightness detection device. The detection area range may be an area surrounded by a cross-sectional line formed when the three-dimensional detection area of the brightness detection device intersects with a plane where the projection area is located. The luminance detection means may be, for example, a light ray sensor, a color temperature sensor (Y channel), or the like. Referring to fig. 2, a schematic diagram of a projection area and a detection area is shown, in a specific implementation, the detection area may be included in the projection area. Here, after the projection device projects the source image, a projection image may be formed in the projection area.
To improve detection accuracy, in some implementation scenarios the detection area may be set within the image use range of the projection device, so that the device's projection image can always cover the detection area. For example, when the projection device's image use range lies within plus or minus 5 degrees of the normal, the position of the detection area may be determined by calculation or by actual measurement. In addition, making the area of the detection region as large as possible (e.g., the largest) while it remains within the image use range also helps improve detection accuracy.
In this way, when detecting brightness, the first brightness value of the detection area may be acquired by the relevant brightness detection device before the projection apparatus lights up, or during a blank-screen idle period of the projection apparatus (e.g., an idle period when switching applications or switching the picture source).
In step 12, in the case that the detection area is covered with the projection image, the first source image corresponding to the detection area is recorded, and the brightness value of the detection area is acquired again, resulting in a second brightness value. Wherein the first source image satisfies: when the first source image is projected, the imaging area is the detection area.
In step 13, the brightness value of the first source image in the calibration environment is calculated according to the brightness reference parameter.
Here, the luminance reference parameter includes a target luminance value when the calibration detection region displays a white screen, and a luminance stimulus value when the calibration detection region displays three primary colors, and the calibration detection region is a detection region in a calibration environment.
For example, the luminance reference parameters may be calibrated in a darkroom environment. During calibration, the projection area may be, for example, a white wall or a white screen, and a calibration detection area is set within it. A 100% white picture corresponding to the calibration detection area is projected, and the brightness value at that moment is recorded as the target brightness value. In addition, a 100% red picture, a 100% green picture, and a 100% blue picture corresponding to the calibration detection area are projected, and the XYZ tristimulus values of each picture are detected. Taking the Y-channel value as the luminance stimulus value yields the luminance stimulus value YR of the red channel, YG of the green channel, and YB of the blue channel.
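The calibration procedure just described can be sketched in Python. This is a hypothetical illustration, not code from the disclosure: `project_picture`, `read_luminance`, and `read_xyz` are assumed stand-ins for the projector and brightness-sensor interfaces, which the disclosure does not specify.

```python
def calibrate(project_picture, read_luminance, read_xyz):
    """Return the brightness reference parameters (Lb, YR, YG, YB).

    Measured once in a darkroom against a white wall or screen, with the
    calibration detection area set inside the projection area.
    """
    project_picture("white")        # 100% white picture over the calibration detection area
    lb = read_luminance()           # target brightness value Lb
    ys = []
    for color in ("red", "green", "blue"):
        project_picture(color)      # 100% primary-color picture
        _x, y, _z = read_xyz()      # XYZ tristimulus values of the picture
        ys.append(y)                # the Y channel is kept as the luminance stimulus value
    return (lb, *ys)
```

The four returned values are the brightness reference parameters used by the subsequent steps of the method.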
In this way, the brightness value of the first source image in the calibration environment can be calculated according to the brightness reference parameter.
For example, in one possible embodiment, the luminance value of the first source image in the calibration environment may be calculated as follows:
and calculating the average value of the R channels, the average value of the G channels and the average value of the B channels of all the pixel points in the first source image. It should be appreciated that each pixel point in the first source image may have a value of R, G, B, and thus the R-channel average, G-channel average, and B-channel average may be calculated for all pixel points in the first source image.
For example, when the channel bit depth is 8 bits, the average value of each channel can be calculated as:

RAvg = Σ Ri / (255 × Num)

GAvg = Σ Gi / (255 × Num)

BAvg = Σ Bi / (255 × Num)

where Num is the number of pixel points and Ri, Gi, Bi are the R, G and B channel values of the i-th pixel. Of course, when the channel bit depth is 10 bits, 255 in the above formulas may be replaced by 1023, which the present disclosure does not limit.
After obtaining the average values for the individual channels, the product of each channel average value and the luminance stimulus value of the corresponding optical primary color can be calculated. In this way, the brightness value of the first source image in the calibration environment can be calculated according to each product, the brightness stimulation value and the target brightness value. For example, each product may be summed to obtain a first sum, and the sum of the luminance stimulus values corresponding to the three primary colors of light may be calculated to obtain a second sum. In this way, the product of the ratio of the first sum value to the second sum value and the target brightness value can be used as the brightness value of the first source image in the calibration environment.
Continuing with the above example, the first sum Sum1 can be expressed as:

Sum1 = RAvg × YR + GAvg × YG + BAvg × YB

and the second sum Sum2 as:

Sum2 = YR + YG + YB

The luminance value L of the first source image in the calibration environment is then:

L = (Sum1 / Sum2) × Lb

where RAvg is the red channel average, GAvg the green channel average, BAvg the blue channel average, and Lb the target brightness value.
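The channel averaging and the luminance computation can be sketched as follows. This is a minimal illustration; the function name, signature, and the normalization by the maximum channel level are assumptions made for the sketch, not specifics from the disclosure.

```python
def source_image_luminance(pixels, lb, y_r, y_g, y_b, max_level=255):
    """Luminance value L of a source image in the calibration environment.

    pixels: iterable of (R, G, B) tuples; max_level is 255 for 8-bit
    channels and 1023 for 10-bit channels. lb is the target brightness
    value; y_r, y_g, y_b are the per-primary luminance stimulus values.
    """
    num = 0
    r_sum = g_sum = b_sum = 0
    for r, g, b in pixels:
        r_sum += r
        g_sum += g
        b_sum += b
        num += 1
    r_avg = r_sum / (max_level * num)                # RAvg
    g_avg = g_sum / (max_level * num)                # GAvg
    b_avg = b_sum / (max_level * num)                # BAvg
    sum1 = r_avg * y_r + g_avg * y_g + b_avg * y_b   # first sum
    sum2 = y_r + y_g + y_b                           # second sum
    return sum1 / sum2 * lb                          # L = (Sum1 / Sum2) * Lb
```

For an all-white 8-bit image the channel averages are all 1, Sum1 equals Sum2, and L reduces to the target brightness value Lb, as expected.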
In step 14, a brightness gain parameter of the detection area relative to the calibration detection area is calculated according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment. The brightness gain parameter is used for compensating the acquired brightness value of the detection area to obtain an ambient light brightness value.
It should be noted that, when the detection area and the calibration detection area have the same brightness gain for light, the following relation holds:

(Sum1 / Sum2) × Lb = Lt - Ld

where Lb is the target luminance value, Lt the second luminance value, and Ld the first luminance value.
When the above relation does not hold, it can be determined that the detection area and the calibration detection area have different degrees of brightness gain for light, and the brightness value detected in the detection area needs to be compensated. For example, a brightness gain parameter β may be introduced to compensate the brightness value detected in the detection area and obtain the ambient light brightness value.
In some implementation scenarios, a brightness difference between the second brightness value and the first brightness value may be calculated, and a ratio of the brightness value of the first source image in the calibration environment to the brightness difference may be used as the brightness gain parameter.
Following the above example, the brightness gain parameter β may be:

β = L / (Lt - Ld)
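With the quantities defined above, β is a one-line computation. A minimal sketch, with an assumed function name:

```python
def brightness_gain(l_cal, lt, ld):
    """Brightness gain parameter beta of the detection area relative to
    the calibration detection area.

    l_cal: luminance value L of the first source image in the calibration
    environment; lt: second brightness value; ld: first brightness value.
    """
    return l_cal / (lt - ld)   # beta = L / (Lt - Ld)
```

When the detection surface amplifies light more than the calibration surface, Lt - Ld exceeds L and β falls below 1, so the compensated reading is scaled down accordingly.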
the projection surface parameter confirmation method of the present disclosure is exemplarily described below with reference to the above calculation formula. When the detection area is the gain curtain and the calibration detection area is a white wall, the light with the same intensity can obtain larger brightness gain in the gain curtain. In this case, the brightness detection of the detection area in the gain curtain can obtain a larger brightness value, so that the brightness value of the ambient light obtained by subsequent calculation is also larger. That is, the obtained ambient light brightness is greater than the actual ambient light brightness. Based on such ambient light brightness errors, the projection device may further increase the brightness of the projected image, eventually resulting in a continued increase in the brightness of the projected image.
By adopting the technical scheme of the application, when the gain of the detection area to the light is larger than the gain of the calibration detection area to the light, Lt-Ld (namely the brightness value when the source image is displayed in the detection area) is larger than the brightness value when the source image is displayed in the calibration environment, and at the moment, beta is less than 1. Therefore, the obtained brightness value is compensated and calculated through the brightness gain parameter, and the detection accuracy of the ambient light brightness can be improved.
According to the above technical scheme, when the projection image projected by the projection device covers the detection area, the corresponding first source image can be obtained and the second brightness value of the detection area at that moment recorded. The brightness value of the first source image in the calibration environment can then be determined based on the brightness reference parameters calibrated in that environment. Since the difference between the second brightness value and the first brightness value represents the brightness of the first source image in the detection area, the brightness gain parameter of the detection area relative to the calibration detection area can be calculated from the first brightness value, the second brightness value, and the brightness value of the first source image in the calibration environment. The brightness value measured in the detection area can then be compensated based on the brightness gain parameter to obtain the ambient light brightness value. In this way, the interference of different projection surfaces with the ambient light brightness detection process is reduced, and the accuracy of the determined ambient light brightness value is improved.
Fig. 3 is a flowchart of a method for confirming parameters of a projection plane according to the present disclosure, and as shown in fig. 3, the method includes:
S31, when the detection area in the projection area is not covered by the projection image, acquiring the brightness value of the detection area to obtain a first brightness value.
S32, when the detection area is covered by the projection image, recording the first source image corresponding to the detection area, and acquiring the brightness value of the detection area again to obtain a second brightness value.
S33, calculating the brightness value of the first source image in the calibration environment according to the brightness reference parameter. The brightness reference parameter includes a target brightness value when the calibration detection area displays a white picture and the brightness stimulus values when the calibration detection area displays the three primary colors, where the calibration detection area is the detection area of the projection device in the calibration environment.
S34, calculating a brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment.
For the implementation of steps S31 to S34, please refer to the above description of the embodiments of steps S11 to S14, and for brevity of the description, the disclosure is not repeated herein.
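Under the stated scheme, the computation in steps S33 and S34 can be sketched as follows. This is a minimal illustration only; the function and variable names are hypothetical, and the brightness stimulus values and target brightness value would come from the device's own calibration data, not from the values used here.

```python
def source_luminance_in_calibration(r_avg, g_avg, b_avg,
                                    stim_r, stim_g, stim_b,
                                    target_white):
    # Step S33: weighted sum of the channel averages over the sum of
    # the stimulus values, scaled by the target brightness value
    # calibrated on the white picture.
    first_sum = r_avg * stim_r + g_avg * stim_g + b_avg * stim_b
    second_sum = stim_r + stim_g + stim_b
    return first_sum / second_sum * target_white


def brightness_gain(first_luminance, second_luminance, source_luminance_cal):
    # Step S34: beta = Ls / (Lc - Ld). On a surface with a higher gain
    # than the calibration wall, Lc - Ld exceeds Ls and beta < 1.
    return source_luminance_cal / (second_luminance - first_luminance)
```

For example, with hypothetical stimulus values (2, 7, 1), a pure-white source image (all channel averages equal to 1) evaluates to the target brightness value itself, and a measured pair Ld = 50, Lc = 150 with Ls = 80 gives β = 0.8.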
S35, a first area value of the uncorrected projection image is acquired.
FIG. 4 is a schematic illustration of a projection scenario of the present disclosure. In general, the keystone-corrected projected image (shown as 402) may be only part of the actual light irradiation area (shown as 401) of the projection device. However, since the luminous flux of the projection device is constant, the brightness of the diffusely reflected light is correlated with the actual light irradiation area of the projection device. Therefore, in this embodiment the first area value is calculated from the uncorrected projection image.
As for the first area value, in a specific implementation, the keystone correction module of the projection device, the camera used for keystone correction, and sensors such as a TOF (Time of Flight) sensor may be used to calculate the deflection angle of the projection device and its distance from the projection area. Combined with the offset of the optical engine and the throw ratio, a spatial coordinate system can then be established to compute the real coordinates of the four vertices of the projected image, and the first area value of the uncorrected projected image can be calculated from these real coordinates. For the calculation of the vertices and the area of the projection image, please refer to the description of the related art, which is not repeated in this disclosure.
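Given the real coordinates of the four vertices, the area computation itself can be sketched with the shoelace formula, assuming the vertices are supplied in traversal order around the quadrilateral (names are illustrative, not from the disclosure):

```python
def quad_area(vertices):
    # Shoelace formula over the 4 vertices, taken in traversal order.
    area = 0.0
    for i in range(4):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % 4]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0
```

An axis-aligned 4 x 3 rectangle yields an area of 12, and the same code handles the non-rectangular quadrilateral produced by an obliquely placed projector.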
S36, acquiring a calibration area value. The calibration area value may also be one of the brightness reference parameters and may be determined during the calibration phase. For example, during calibration, the projection device may be controlled to project a standard rectangular frame without keystone correction or zooming; the size is not limited (generally between 60 and 100 inches), and the area of the projected image at this time is the calibration area value. It should be noted that, in this case, parameters such as the target brightness value and the brightness stimulus values need to be calibrated on the basis of the calibration area value.
S37, calculating the ratio of the first area value to the calibration area value to obtain an area influence parameter.
S38, compensating the acquired brightness value of the detection area according to the brightness gain parameter and the area influence parameter to obtain an ambient light brightness value.
For example, in one possible embodiment, the product of the brightness gain parameter, the area influence parameter and the second brightness value may be calculated, and the difference between this product and the brightness value of the first source image in the calibration environment may be used as the ambient light brightness value.
For example, the ambient light brightness value Lr may be calculated by the following formula:

Lr = α·β·Lc - Lw·(RAvg·XR + GAvg·XG + BAvg·XB) / (XR + XG + XB)

where α is the area influence parameter, β is the brightness gain parameter, Lc is the second brightness value, Lw is the target brightness value, XR, XG and XB are the brightness stimulus values of the three primary colors, and RAvg, GAvg and BAvg are the red, green and blue channel averages of the first source image. The subtracted term is exactly the brightness value of the first source image in the calibration environment.
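The calculation of Lr described above can be sketched in code as follows. This is a hedged illustration: the parameter names are hypothetical, and the stimulus values and target brightness value must come from the device's calibration data.

```python
def ambient_luminance(beta, alpha, second_luminance,
                      r_avg, g_avg, b_avg,
                      stim_r, stim_g, stim_b,
                      target_white):
    # Brightness of the first source image in the calibration
    # environment (the subtracted term of the formula).
    source_cal = (r_avg * stim_r + g_avg * stim_g + b_avg * stim_b) \
                 / (stim_r + stim_g + stim_b) * target_white
    # Lr = alpha * beta * Lc - Ls
    return alpha * beta * second_luminance - source_cal
```

With β = 0.8, α = 1.25, a second brightness value of 150 and a pure-white source image (hypothetical stimulus values 2, 7, 1, target brightness 100), this evaluates 1.25 x 0.8 x 150 - 100 = 50.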
It should be noted that steps S31 to S34 above exemplarily illustrate the manner of calculating the brightness gain parameter. In some implementations, however, this calculation need not be repeated every time. For example, the projection device may calculate and store the brightness gain parameter each time it is turned on, and directly apply the stored value in subsequent brightness adjustments. The brightness gain parameter may also be kept constant as long as the position of the projection device does not change.
For example, in one possible approach, step S38 may include:
And in the case that the position of the projection device has not changed and the detection area is covered by a target projection image of the projection device, recording the second source image corresponding to the detection area, and acquiring the brightness value of the detection area again to obtain a third brightness value. The second source image may be, for example, a further source image projected by the projection device after the first source image.
And calculating the brightness value of the second source image in the calibration environment according to the brightness reference parameter, and acquiring the saved brightness gain parameter and area influence parameter. For example, in some implementations, the projection device may calculate the brightness gain parameter and the area influence parameter each time it is turned on and store them in a storage medium of the projection device. In subsequent determinations of the ambient light brightness value, the two parameters can then be read from the storage medium.
In addition, the product of the brightness gain parameter, the area influence parameter and the third brightness value can be calculated, and the difference between this product and the brightness value of the second source image in the calibration environment used as the ambient light brightness value.
For example, the ambient light brightness value Lr may be calculated by the following formula:

Lr = α·β·L′ - Lw·(R′Avg·XR + G′Avg·XG + B′Avg·XB) / (XR + XG + XB)

where L′ is the third brightness value, α is the area influence parameter, β is the brightness gain parameter, Lw is the target brightness value, XR, XG and XB are the brightness stimulus values of the three primary colors, and R′Avg, G′Avg and B′Avg are the red, green and blue channel averages of the second source image. In this way, when the position of the projection device has not changed, the ambient light brightness value can be calculated from the saved brightness gain parameter and area influence parameter, which reduces the computation load of the device executing the projection surface parameter confirmation method and helps lower power consumption.
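One way to realize the saved-parameter path is to persist β and α once after they are computed and read them back for later measurements. A minimal sketch, in which the file path and JSON layout are assumptions not specified by the disclosure:

```python
import json

def save_params(beta, alpha, path):
    # Persist the brightness gain and area influence parameters once,
    # e.g. right after the projector computes them at power-on.
    with open(path, "w") as f:
        json.dump({"beta": beta, "alpha": alpha}, f)

def load_params(path):
    # Read the stored parameters back for later ambient-light updates,
    # valid as long as the projector has not been moved.
    with open(path) as f:
        p = json.load(f)
    return p["beta"], p["alpha"]
```

An embedded implementation would more likely use the device's own settings store, but the round-trip is the same: write once per power-on, read on every subsequent brightness update.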
According to the above technical solution, the detection of the ambient light brightness takes into account not only the gain of the projection plane on light, but also the light contributed by the projected image and the influence of the projection area on its brightness. The detected brightness value is compensated by the brightness gain parameter and the light contribution of the projected image is filtered out, so a more accurate ambient light brightness value is obtained. On this basis, the technical solution also helps improve the accuracy of automatic brightness adjustment of the projection device.
In addition, when the position of the projection device is changed, the brightness gain parameter may be recalculated.
For example, position information of the projection device may be obtained, and when the position information indicates that the position of the projection device has changed, it is determined that the detection area has changed. The position information may be acquired from detection elements such as a gyroscope or an acceleration sensor. When the data from these elements indicate that the position of the projection device has changed, the detection area of the projection device may also have changed, so the brightness gain parameter needs to be recalculated in this case.
In some embodiments, the change in position may also be determined in response to receiving an instruction from the user indicating that the position of the projection device has changed. For example, after moving the projection device, the user may send such an instruction to it. The instruction may be sent by the user from a mobile terminal, or generated by the projection device in response to the user's operation on the device, which is not limited in this disclosure.
By adopting this solution, the brightness gain parameter is recalculated once the detection area is determined to have changed, which guarantees the accuracy of the brightness gain parameter and improves the accuracy of brightness adjustment of the projection device. Of course, the area influence parameter may also be recalculated when the position of the projection device changes, which is not limited by this disclosure.
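The recalculate-on-movement behavior described above can be sketched as a small invalidate-and-recompute wrapper. This structure is illustrative only and not prescribed by the disclosure; the recalibration routine passed in is assumed to re-run the gain-parameter measurement.

```python
class GainParamTracker:
    def __init__(self, recompute):
        # recompute: callable that re-runs the gain-parameter
        # measurement and returns a fresh beta (assumed interface).
        self._recompute = recompute
        self._beta = None

    def on_position_changed(self):
        # Gyroscope/accelerometer event or user instruction: the
        # stored gain parameter is no longer trustworthy.
        self._beta = None

    @property
    def beta(self):
        if self._beta is None:  # lazily recompute after a move
            self._beta = self._recompute()
        return self._beta
```

The cached value is reused for every brightness update until a movement event invalidates it, so the measurement routine runs only when actually needed.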
Based on the same inventive concept, the present disclosure also provides a device for confirming the parameters of the projection plane. Fig. 5 is a block diagram of a projection plane parameter confirmation apparatus shown in the present disclosure, and the apparatus 500 includes:
a first obtaining module 501, configured to acquire, when the detection area in the projection area is not covered by the projection image, the brightness value of the detection area to obtain a first brightness value;
a first execution module 502, configured to record, when the detection area is covered by the projection image, the first source image corresponding to the detection area, and acquire the brightness value of the detection area again to obtain a second brightness value;
a first calculation module 503, configured to calculate the brightness value of the first source image in the calibration environment according to a brightness reference parameter, where the brightness reference parameter includes a target brightness value when the calibration detection area displays a white picture and the brightness stimulus values when the calibration detection area displays the three primary colors, the calibration detection area being the detection area in the calibration environment;
a second calculation module 504, configured to calculate a brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment, where the brightness gain parameter is used to compensate the acquired brightness value of the detection area to obtain an ambient light brightness value.
According to the above technical solution, when the projection image projected by the projection device covers the detection area, the corresponding first source image can be obtained and the second brightness value of the detection area at that moment recorded. In this way, the brightness value of the first source image in the calibration environment can be determined based on the brightness reference parameter calibrated in the calibration environment. Since the difference between the second brightness value and the first brightness value represents the brightness contributed by the first source image in the detection area, the brightness gain parameter of the detection area relative to the calibration detection area can be calculated based on the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment. The brightness value measured in the detection area can then be compensated based on the brightness gain parameter to obtain the ambient light brightness value. In this way, the interference of different projection planes with the ambient light brightness detection process is reduced, and the accuracy of the determined ambient light brightness value is improved.
Optionally, the first calculating module 503 includes:
the first calculation submodule is used for calculating the average value of R channels, the average value of G channels and the average value of B channels of all pixel points in the first source image;
the second calculation submodule is used for calculating the product of the average value of each channel and the brightness stimulus value of the corresponding optical primary color;
and the third calculation submodule is used for calculating the brightness value of the first source image in the calibration environment according to each product, the brightness stimulation value and the target brightness value.
Optionally, the third calculation submodule is configured to: sum the products to obtain a first sum; sum the brightness stimulus values respectively corresponding to the three primary colors to obtain a second sum; and take the product of the ratio of the first sum to the second sum and the target brightness value as the brightness value of the first source image in the calibration environment.
The second calculation module is configured to calculate the brightness difference between the second brightness value and the first brightness value, and to take the ratio of the brightness value of the first source image in the calibration environment to the brightness difference as the brightness gain parameter.
Optionally, the apparatus 500 further comprises:
the second acquisition module is used for acquiring a first area value of the uncorrected projection image;
the third acquisition module is used for acquiring a calibration area value;
the third calculation module is used for calculating the ratio of the first area value to the calibration area value to obtain an area influence parameter;
and the ambient light brightness determining module is used for compensating the acquired brightness value of the detection area according to the brightness gain parameter and the area influence parameter to obtain an ambient light brightness value.
Optionally, the ambient light brightness determining module includes:
and the first ambient light calculation submodule is used for calculating the product of the brightness gain parameter, the area influence parameter and the second brightness value, and taking the difference value of the product and the brightness value of the first source image in the calibration environment as the ambient light brightness value.
Optionally, the ambient light brightness determining module includes:
the first execution submodule is used for recording a second source image corresponding to the detection area again under the condition that the position of the projection equipment is not changed and the detection area covers a target projection image of the projection equipment, and acquiring the brightness value of the detection area to obtain a third brightness value;
the fourth calculation sub-module is used for calculating the brightness value of the second source image in the calibration environment according to the brightness reference parameter;
and the fifth calculation submodule is used for calculating the product of the brightness gain parameter, the area influence parameter and the third brightness value, and taking the difference value of the product and the brightness value of the second source image in the calibration environment as the brightness value of the environment light.
Optionally, the apparatus 500 further comprises:
and the fourth calculation module is used for recalculating the brightness gain parameter when the position of the projection equipment is changed.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The present disclosure also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the projection surface parameter validation method provided by the present disclosure.
The present disclosure also provides a projection device, comprising:
a memory having a computer program stored thereon;
and the processor is used for executing the computer program in the memory so as to realize the steps of the projection surface parameter confirmation method provided by the disclosure.
Fig. 6 is a block diagram illustrating an electronic device 600 according to an example embodiment, where electronic device 600 may be provided, for example, as a projection device. As shown in fig. 6, the electronic device 600 may include: a processor 601 and a memory 602. The electronic device 600 may also include one or more of a multimedia component 603, an input/output (I/O) interface 604, and a communications component 605.
The processor 601 is configured to control the overall operation of the electronic device 600 to complete all or part of the steps of the above projection surface parameter confirmation method. The memory 602 is used to store various types of data to support operation on the electronic device 600, such as instructions for any application or method operating on the electronic device 600 and application-related data such as messages, pictures, audio and video. The memory 602 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk. The multimedia component 603 may include a screen and an audio component. The screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; a received audio signal may further be stored in the memory 602 or transmitted through the communication component 605. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 604 provides an interface between the processor 601 and other interface modules, such as a keyboard, a mouse or buttons, which may be virtual or physical. The communication component 605 is used for wired or wireless communication between the electronic device 600 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, 5G, or a combination of one or more of them, which is not limited herein. The corresponding communication component 605 may therefore include a Wi-Fi module, a Bluetooth module, an NFC module, and so on.
In an exemplary embodiment, the electronic Device 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the above-mentioned projection plane parameter confirmation method.
In another exemplary embodiment, a computer readable storage medium comprising program instructions which, when executed by a processor, implement the steps of the projection surface parameter validation method described above is also provided. For example, the computer readable storage medium may be the memory 602 including program instructions executable by the processor 601 of the electronic device 600 to perform the projective plane parameter validation method described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned projection surface parameter validation method when executed by the programmable apparatus.
The preferred embodiments of the present disclosure are described in detail with reference to the accompanying drawings, however, the present disclosure is not limited to the specific details of the above embodiments, and various simple modifications may be made to the technical solution of the present disclosure within the technical idea of the present disclosure, and these simple modifications all belong to the protection scope of the present disclosure.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner; to avoid unnecessary repetition, the possible combinations are not separately described in this disclosure.
In addition, any combination of various embodiments of the present disclosure may be made, and the same should be considered as the disclosure of the present disclosure, as long as it does not depart from the spirit of the present disclosure.

Claims (9)

1. A projection plane parameter confirmation method is characterized by comprising the following steps:
under the condition that a detection area in a projection area does not cover a projection image, acquiring a brightness value of the detection area to obtain a first brightness value;
under the condition that the detection area is covered with the projection image, recording a first source image corresponding to the detection area, and obtaining the brightness value of the detection area again to obtain a second brightness value;
calculating the brightness value of the first source image in the calibration environment according to brightness reference parameters, wherein the brightness reference parameters comprise a target brightness value when a calibration detection area displays a white picture and a brightness stimulation value when the calibration detection area displays three primary colors, and the calibration detection area is a detection area in the calibration environment;
calculating a brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and a brightness value of the first source image in a calibration environment, wherein the brightness gain parameter is used for compensating the obtained brightness value of the detection area to obtain an environment light brightness value;
the calculating the brightness value of the first source image under the calibration environment according to the brightness reference parameter includes:
calculating the average value of R channels, the average value of G channels and the average value of B channels of all pixel points in the first source image;
calculating the product of the average value of each channel and the brightness stimulation value of the corresponding optical primary color;
and calculating the brightness value of the first source image in the calibration environment according to each product, the brightness stimulation value and the target brightness value.
2. The method of claim 1, wherein said calculating a brightness value of said first source image in a calibration environment from each of said product, said brightness stimulus value, and said target brightness value comprises:
summing each of said products to obtain a first sum;
calculating sum values of brightness stimulation values respectively corresponding to the three primary colors to obtain second sum values;
taking the product of the ratio of the first sum value to the second sum value and the target brightness value as the brightness value of the first source image in a calibration environment;
the calculating the brightness gain parameter of the detection area relative to the calibration detection area according to the first brightness value, the second brightness value and the brightness value of the first source image in the calibration environment includes:
calculating a brightness difference value between the second brightness value and the first brightness value;
and taking the ratio of the brightness value of the first source image in the calibration environment to the brightness difference value as the brightness gain parameter.
3. The method of claim 1 or 2, further comprising:
acquiring a first area value of an uncorrected projection image;
acquiring a calibration area value;
calculating the ratio of the first area value to the calibration area value to obtain an area influence parameter;
and compensating the acquired brightness value of the detection area according to the brightness gain parameter and the area influence parameter to obtain an environment light brightness value.
4. The method according to claim 3, wherein compensating the acquired luminance value of the detection region according to the luminance gain parameter and the area influence parameter comprises:
and calculating the product of the brightness gain parameter, the area influence parameter and the second brightness value, and taking the difference value of the product and the brightness value of the first source image in the calibration environment as the brightness value of the environment light.
5. The method according to claim 3, wherein the compensating the acquired luminance value of the detection region according to the luminance gain parameter and the area influence parameter comprises:
under the condition that the position of the projection equipment is not changed and the detection area covers the target projection image of the projection equipment, recording a second source image corresponding to the detection area again, and acquiring the brightness value of the detection area to obtain a third brightness value;
calculating the brightness value of the second source image in a calibration environment according to the brightness reference parameter;
obtaining the saved brightness gain parameter and the area influence parameter;
and calculating the product of the brightness gain parameter, the area influence parameter and the third brightness value, and taking the difference value of the product and the brightness value of the second source image in the calibration environment as the brightness value of the environment light.
6. The method according to claim 1 or 2, characterized in that the method further comprises:
and when the position of the projection equipment is changed, recalculating the brightness gain parameter.
7. A projective plane parameter determining apparatus, comprising:
the device comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module is used for acquiring the brightness value of a detection area in a projection area under the condition that the detection area does not cover a projection image to obtain a first brightness value;
the first execution module is used for recording a first source image corresponding to the detection area under the condition that the projection image covers the detection area, and acquiring the brightness value of the detection area again to obtain a second brightness value;
the first calculation module is used for calculating the brightness value of the first source image in the calibration environment according to brightness reference parameters, wherein the brightness reference parameters comprise a target brightness value when a calibration detection area displays a white picture and a brightness stimulation value when the calibration detection area displays three primary optical colors, and the calibration detection area is a detection area in the calibration environment;
the second calculation module is configured to calculate a brightness gain parameter of the detection region relative to the calibration detection region according to the first brightness value, the second brightness value, and a brightness value of the first source image in a calibration environment, where the brightness gain parameter is used to compensate the obtained brightness value of the detection region to obtain an ambient light brightness value;
the first computing module, comprising:
the first calculation submodule is used for calculating the average value of R channels, the average value of G channels and the average value of B channels of all pixel points in the first source image;
the second calculation submodule is used for calculating the product of the average value of each channel and the brightness stimulus value of the corresponding optical primary color;
and the third calculation submodule is used for calculating the brightness value of the first source image in the calibration environment according to each product, the brightness stimulation value and the target brightness value.
8. A non-transitory computer readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
9. A projection device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 6.
CN202110838212.7A 2021-07-23 2021-07-23 Projection surface parameter confirmation method and device, storage medium and projection equipment Active CN113542708B (en)

Publications (2)

Publication Number Publication Date
CN113542708A CN113542708A (en) 2021-10-22
CN113542708B true CN113542708B (en) 2022-06-21


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114143523B (en) * 2021-12-10 2024-07-23 深圳市火乐科技发展有限公司 Brightness adjusting method and device, projection equipment and storage medium
CN115278066A (en) * 2022-07-18 2022-11-01 Oppo广东移动通信有限公司 Point light source detection method, focusing method and device, storage medium and electronic equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN1460917A (en) * 2002-05-20 2003-12-10 精工爱普生株式会社 Image processing system, projector, information storage medium and image processing method
CN104318912A (en) * 2014-10-23 2015-01-28 赵辉 Method and device for detecting environmental light brightness
CN111083458A (en) * 2019-12-31 2020-04-28 成都极米科技股份有限公司 Brightness correction method, system, equipment and computer readable storage medium
CN112995630A (en) * 2021-03-11 2021-06-18 杭州当贝网络科技有限公司 Flexible brightness adjusting method for fixed-focus projector, projector and readable storage medium

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP3939141B2 (en) * 2001-12-05 2007-07-04 オリンパス株式会社 Projection type image display system and color correction method thereof
CN104796683B (en) * 2014-01-22 2018-08-14 南京中兴软件有限责任公司 A kind of method and system of calibration image color
JP2016133640A (en) * 2015-01-20 2016-07-25 キヤノン株式会社 Display device and method of controlling the same

Similar Documents

Publication Publication Date Title
CN113542709B (en) Projection image brightness adjusting method and device, storage medium and projection equipment
JP3497805B2 (en) Image projection display device
US9843781B1 (en) Projector
US10298894B2 (en) Projector
KR20160124737A (en) Method and device for adjusting colour temperature
CN113542708B (en) Projection surface parameter confirmation method and device, storage medium and projection equipment
US20040150795A1 (en) Multiprojection system and method of acquiring correction data in multiprojection system
CN112116888B (en) Screen calibration method, calibration device and storage medium
US11284052B2 (en) Method for automatically restoring a calibrated state of a projection system
US20140232879A1 (en) Color calibration chart acquisition
CN108737806A (en) A kind of projecting apparatus color correcting method and device, computer storage media
CN112668569B (en) Projection type image display device
JP2019149764A (en) Display device calibration device, display device calibration system, display device calibration method, and display device
KR101680446B1 (en) Creation device for color table, correction and control device for camera image and method thereof
CN112710383A (en) Light sensor calibration method and device and storage medium
JP2016525723A (en) Movie projection measurement
JP2015119344A (en) Device for measuring sensitivity distribution of imaging element and its control method, and calibration device of image display device and its control method
JP2021101204A (en) Operation method for control unit, control method for projector, and projector
CN111918047A (en) Photographing control method and device, storage medium and electronic equipment
KR20150054452A (en) compensation device for compensating a non uniformity of a display device and method thereof
CN115802173B (en) Image processing method and device, electronic equipment and storage medium
CN110896452A (en) Flash lamp correction method of mobile terminal, mobile terminal and device
US9761159B2 (en) Image processor, image projector, and image processing method
US20240094968A1 (en) Display method, display system, and storage medium storing program
JPWO2020008543A1 (en) Measurement method, measurement system, display device, computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant