CN117149038A - Image display method and image display device - Google Patents
- Publication number
- CN117149038A CN117149038A CN202311199840.0A CN202311199840A CN117149038A CN 117149038 A CN117149038 A CN 117149038A CN 202311199840 A CN202311199840 A CN 202311199840A CN 117149038 A CN117149038 A CN 117149038A
- Authority
- CN
- China
- Prior art keywords
- editing
- image
- node
- path
- image editing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an image display method and an image display device, belonging to the field of image technology. The method comprises the following steps: displaying an image editing interface, wherein the image editing interface comprises a first image; displaying a first editing path on the image editing interface while editing the first image, wherein the first editing path indicates the image editing process of the first image and comprises at least one editing node in that process; receiving a first input to a first editing node of the at least one editing node; and in response to the first input, displaying a first image editing effect diagram corresponding to the first editing node.
Description
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image display method and an image display device.
Background
Currently, electronic devices commonly have a large amount of image processing software installed, through which an image can be edited, for example by increasing its brightness, adding a filter, or adjusting its size, so as to obtain an edited image.
In the related art, after an image is edited, an electronic device generally displays only the final image effect diagram. If a user wants to view an intermediate effect diagram from the image editing process, the user must undo back to the corresponding image editing node to view the effect diagram for that node, so the overall viewing efficiency is low.
Disclosure of Invention
The embodiment of the application aims to provide an image display method and an image display device that can improve the efficiency of viewing image editing effect diagrams during an image editing process.
In a first aspect, an embodiment of the present application provides an image display method, including: displaying an image editing interface, wherein the image editing interface comprises a first image; displaying a first editing path on the image editing interface while editing the first image, wherein the first editing path indicates the image editing process of the first image and comprises at least one editing node in that process; receiving a first input to a first editing node of the at least one editing node; and in response to the first input, displaying a first image editing effect diagram corresponding to the first editing node.
In a second aspect, an embodiment of the present application provides an image display apparatus comprising a display module and a receiving module. The display module is configured to display an image editing interface comprising a first image, and to display a first editing path on the image editing interface while the first image is edited, wherein the first editing path indicates the image editing process of the first image and comprises at least one editing node in that process. The receiving module is configured to receive a first input to a first editing node of the at least one editing node. The display module is further configured to display, in response to the first input received by the receiving module, a first image editing effect diagram corresponding to the first editing node.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor perform the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, the electronic device displays an image editing interface comprising a first image; while editing the first image, it displays a first editing path on the image editing interface, where the first editing path indicates the image editing process of the first image and includes at least one editing node in that process; the electronic device receives a first input to a first editing node of the at least one editing node; and in response to the first input, it displays a first image editing effect diagram corresponding to the first editing node. In this scheme, because an editing path corresponding to the image editing process can be generated while the first image is edited, a user can intuitively understand the editing steps of the process directly from the editing path. At the same time, without undoing any image editing operations, the user can view the image editing effect diagram corresponding to each editing node of the entire process directly through the editing path, which improves the efficiency of viewing image editing effect diagrams during image editing.
Drawings
FIG. 1 is a schematic flow chart of an image display method according to an embodiment of the present application;
FIG. 2 (A) is a schematic diagram of an example of adjusting image parameters according to an embodiment of the present application;
FIG. 2 (B) is a schematic diagram showing an example of an editing node according to an embodiment of the present application;
FIG. 2 (C) is a diagram showing a second example of an editing node according to the embodiment of the present application;
FIG. 3 is one example schematic diagram showing an image editing effect diagram according to an embodiment of the present application;
FIG. 4 is a second flowchart of an image display method according to an embodiment of the present application;
FIG. 5 (A) is a schematic diagram of an example of input to an edit node according to an embodiment of the present application;
FIG. 5 (B) is a diagram showing a second example of an image editing effect diagram according to the embodiment of the present application;
FIG. 6 (A) is a schematic diagram of an example of input to a magnification control provided by an embodiment of the present application;
FIG. 6 (B) is an example schematic diagram showing a thumbnail of an image editing effect map provided by an embodiment of the present application;
FIG. 7 is a third flowchart of an image display method according to an embodiment of the present application;
FIG. 8 (A) is one example schematic diagram of an input to a parameter adjustment control provided by an embodiment of the present application;
FIG. 8 (B) is a third exemplary diagram showing an image editing effect diagram according to an embodiment of the present application;
FIG. 9 (A) is a second exemplary diagram of an input to a parameter adjustment control provided by an embodiment of the present application;
FIG. 9 (B) is a schematic diagram of an example of adding editing nodes according to an embodiment of the present application;
FIG. 10 (A) is a schematic diagram of an example of input of a plus sign control provided by an embodiment of the present application;
FIG. 10 (B) is a third exemplary diagram of an input to a parameter adjustment control provided by an embodiment of the present application;
fig. 11 is a schematic structural view of an image display device according to an embodiment of the present application;
fig. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
fig. 13 is a second schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and claims distinguish between similar objects and do not necessarily describe a particular order or sequence. It should be understood that terms so used are interchangeable where appropriate, so that the embodiments of the application can be practiced in orders other than those illustrated or described herein. Objects identified by "first", "second", etc. are generally of one type, and the number of such objects is not limited; for example, the first object may be one or more than one. Furthermore, "and/or" in the description and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The image display method provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
The image display method in the embodiment of the application can be applied to the scene of image editing.
Currently, electronic devices commonly have a large amount of image processing software installed, through which an image can be edited, for example by increasing its brightness, adding a filter, or adjusting its size, so as to obtain an edited image. In the related art, after editing an image, an electronic device generally displays only the final image effect diagram; to view an intermediate effect diagram, the user must undo back to the corresponding image editing node. For example, suppose the user performs the following editing operations on picture 1: "exposure +46", "brightness +20", "vignetting -10", and "exposure -50". If the user then wants to view the effect diagram after "brightness +20", the user clicks the undo button in the image editing interface once to trigger the electronic device to revert to the effect diagram after "vignetting -10", and clicks the undo button again to trigger the electronic device to revert to the effect diagram after "brightness +20". Only in this way can a given intermediate image editing effect diagram be viewed, so the overall viewing efficiency is low.
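The undo chain described above can be sketched as follows. This is a hypothetical illustration (not from the patent); the names `EditStep` and `state_after` are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class EditStep:
    parameter: str   # e.g. "exposure"
    delta: int       # e.g. +46

def state_after(steps, index):
    """Return the list of edits in effect after step `index` (0-based)."""
    return steps[:index + 1]

steps = [
    EditStep("exposure", +46),
    EditStep("brightness", +20),
    EditStep("vignetting", -10),
    EditStep("exposure", -50),
]

# Related art: to see the image after "brightness +20", the user presses
# undo twice (first removing "exposure -50", then "vignetting -10").
undone = steps[:]
undone.pop()  # undo "exposure -50"
undone.pop()  # undo "vignetting -10"
assert undone == state_after(steps, 1)
```

Each intermediate view costs as many undo presses as there are later steps, which is the inefficiency the editing path is meant to remove.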
In the image display method and image display device provided by the embodiment of the application, because an editing path corresponding to the image editing process can be generated while the first image is edited, a user can intuitively understand the editing steps of the process directly from the editing path. At the same time, without undoing any image editing operations, the user can view the image editing effect diagram corresponding to each editing node of the entire process directly through the editing path, which improves the efficiency of viewing image editing effect diagrams during image editing.
The execution subject of the image display method provided by the embodiment of the application can be an image display device, and the image display device can be an electronic device or a functional module or entity in the electronic device. The technical solution provided by the embodiment of the present application is described below by taking an electronic device as an example.
An embodiment of the present application provides an image display method, and fig. 1 shows a flowchart of the image display method provided by the embodiment of the present application, where the method may be applied to an electronic device. As shown in fig. 1, the image display method provided by the embodiment of the present application may include the following steps 201 to 204.
Step 201, the electronic device displays an image editing interface.
In an embodiment of the present application, the image editing interface includes a first image.
In some embodiments of the present application, the image editing interface is used to edit an image.
In some embodiments of the present application, the image editing interface may be an image editing interface of the first application. The first application is an application having an image editing function, and may be any of the following applications: a gallery, an image editing class application.
For example, the user may trigger the electronic device to display an image editing interface of a first application in the desktop by clicking on an application icon of the first application.
In some embodiments of the present application, the first image may be any one of the following: scenery images, food images, cartoon images, character images, etc. The embodiment of the present application is not limited thereto.
In some embodiments of the present application, the first image may be an image input by a user, or the first image may be an image currently captured by an electronic device, or the first image may be a pre-stored image.
Step 202, in the process of editing the first image, the electronic device displays a first editing path on an image editing interface.
In an embodiment of the present application, the first editing path is used to indicate an image editing process of the first image, and the first editing path includes at least one editing node in the image editing process of the first image.
In some embodiments of the present application, the image editing process of the first image includes at least one editing step, where each editing step corresponds to an editing operation.
Illustratively, the editing operation described above includes at least one of: adjusting brightness and contrast, adjusting the image size, adjusting the exposure, adding a filter, and removing a filter. The embodiment of the present application is not limited thereto.
In some embodiments of the present application, each editing node in the first editing path is used to indicate an editing step of the image editing process.
In some embodiments of the present application, each of the at least one editing node corresponds to an image editing effect diagram. It can be understood that the image editing effect diagram corresponding to any editing node in the first editing path is the image obtained through the editing operation corresponding to that editing node.
Taking the first image as image 1, suppose that the image editing process of image 1 includes the following editing steps: step 1) increase the exposure; step 2) increase the brightness. For example, when the user increases the value of the exposure parameter of image 1 by 46, the electronic device generates and displays a first editing path, where the first editing path includes an "exposure" editing node 1, the adjustment parameter corresponding to editing node 1 is "exposure +46", and the image editing effect diagram corresponding to editing node 1 is image 1 with the value of its exposure parameter increased by 46. The user may then continue editing the effect diagram corresponding to editing node 1 and increase the value of its brightness parameter by 20, whereupon the electronic device adds a "brightness" editing node 2 after editing node 1; the adjustment parameter corresponding to editing node 2 is "brightness +20", and the image editing effect diagram corresponding to editing node 2 is the effect diagram of editing node 1 with the value of its brightness parameter increased by 20.
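The node-by-node construction in the example above can be sketched as a small data structure: each edit appends a node recording the adjustment and caches the resulting effect map, so any intermediate result can be shown later without undoing. This is a hypothetical sketch; the class and method names are illustrative, not from the patent, and the "image" is modeled as a dict of parameter values rather than pixels.

```python
class EditingPath:
    def __init__(self, original):
        self.nodes = []          # (label, delta) per editing step
        self.effect_maps = []    # cached result after each step
        self._current = original

    def apply(self, label, delta):
        # A real editor would re-render the image here; we model the
        # image as a dict of parameter values for illustration.
        self._current = dict(self._current)
        self._current[label] = self._current.get(label, 0) + delta
        self.nodes.append((label, delta))
        self.effect_maps.append(self._current)

    def effect_map(self, node_index):
        """A first input on a node yields its cached effect map."""
        return self.effect_maps[node_index]

path = EditingPath({"exposure": 0, "brightness": 0})
path.apply("exposure", +46)    # editing node 1: "exposure +46"
path.apply("brightness", +20)  # editing node 2: "brightness +20"

assert path.effect_map(0) == {"exposure": 46, "brightness": 0}
assert path.effect_map(1) == {"exposure": 46, "brightness": 20}
```

The key design choice sketched here is that each node's result is snapshotted at apply time, trading memory for constant-time viewing of any node.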
In some embodiments of the present application, the at least one editing node may be all or part of editing nodes corresponding to an image editing process of the first image. For example, the electronic device may use an edit node corresponding to an emphasis edit step in the image editing process of the first image as an edit node of the first edit path.
In some embodiments of the present application, the at least one editing node in the first editing path is arranged in the order of the editing steps to which the nodes correspond.
In some embodiments of the present application, when displaying the first editing path, the electronic device displays at least one editing node identifier on the first editing path, where each editing node identifier corresponds to one editing node. In other words, the above editing node identification is an identification dedicated to marking editing nodes.
Illustratively, the edit node identifier may be a shape identifier, a text label, a symbol identifier, a color identifier, or the like, which is not limited by the present application. For example, the shape of the shape identifier may be any of the following: square, circular, rectangular, irregular, etc.
In some embodiments of the present application, the electronic device may generate the first editing path based on the entire image editing process of the first image after that process is complete. Alternatively, while editing the first image, the electronic device may generate the first editing path after completing the first editing step on the first image, where the first editing path includes the editing node corresponding to that first editing step, and then add corresponding editing nodes to the first editing path during the subsequent editing of the first image.
In some embodiments of the present application, the electronic device may directly display the first editing path in a blank area of the image editing interface; or, floating and displaying the first editing path on the image editing interface; alternatively, a first window is displayed in the image editing interface, the first window including the first editing path therein.
Illustratively, the first window has a zoom-out control displayed in the lower right corner and a zoom-in control displayed in the upper right corner. Illustratively, the zoom-out control is used to zoom out the first window; the zoom-in control is used for zooming in and displaying the first window, so that a user can clearly view the first editing path.
For example, as shown in fig. 2 (A), the electronic device displays an image editing interface 11, and the image editing interface 11 includes a plurality of image editing controls, for example an exposure adjustment control, a brightness adjustment control, a highlight adjustment control, a shadow adjustment control, a color temperature adjustment control, a sharpness adjustment control, and a vignetting adjustment control. When the user wants to perform image processing on the image 12, the electronic device may be triggered to display the image 12, i.e., the first image described above, in the image editing interface 11. Specifically, when the user wants to adjust the exposure parameter of the image 12, the user may adjust it by clicking the exposure adjustment control in the image editing interface 11, for example: "exposure +46".
As shown in fig. 2 (B), after the exposure parameter of the image 12 is adjusted, the image 12 displayed in the image editing interface 11 is updated to the 1st effect diagram, i.e., 13 in fig. 2 (B), and the electronic device displays a first window 14 in the image editing interface 11 containing a first editing path; the first editing path includes an "exposure" editing node, above which the corresponding parameter adjustment value "+46" is displayed. Subsequently, if the user wants to adjust the brightness of the 1st effect diagram, the user can adjust its brightness parameter by clicking the brightness adjustment control, for example "brightness +20"; after the brightness parameter is adjusted, the 1st effect diagram displayed in the image editing interface 11 is updated to the 2nd effect diagram, and a "brightness" editing node is added to the first editing path. The user may then adjust the vignetting parameter of the 2nd effect diagram, updating it to the 3rd effect diagram and adding a "vignetting" editing node to the first editing path, and then adjust the exposure parameter of the 3rd effect diagram, updating it to the 4th effect diagram and adding an "exposure" editing node to the first editing path.
As shown in fig. 2 (C), after the exposure, brightness, vignetting, and exposure parameters of the image 12 have been adjusted, the first editing path includes four editing nodes: an "exposure" editing node, a "brightness" editing node, a "vignetting" editing node, and an "exposure" editing node, together with the parameter adjustment value corresponding to each editing node, and the 4th effect diagram, i.e., 15 in fig. 2 (C), is displayed.
Step 203, the electronic device receives a first input to a first editing node of the at least one editing node.
In some embodiments of the present application, the first editing node may be one or more editing nodes of the at least one editing node.
In some embodiments of the present application, the first input is used to view a first image editing effect map corresponding to a first editing node.
In some embodiments of the application, the first input includes, but is not limited to: a touch input performed by the user on the first editing node through a finger, a stylus, or another touch device; a voice command input by the user; a specific gesture input by the user; a click input; or another feasible input. It may be determined according to actual use requirements, which the embodiment of the present application does not limit.
In some embodiments of the present application, the specific gesture may be any one of a single click gesture, a swipe gesture, a drag gesture, a pressure recognition gesture, a long press gesture, an area change gesture, a double press gesture, and a double click gesture.
In some embodiments of the present application, the click input may be a single click input, a double click input, or any number of click inputs, and may also be a long press input or a short press input.
Step 204, the electronic device responds to the first input to display a first image editing effect diagram corresponding to the first editing node.
In some embodiments of the present application, during the process of editing the first image, the electronic device stores an image editing effect map corresponding to each editing node in the first editing path. Thus, after the electronic device receives the first input, the first image editing effect diagram corresponding to the first editing node can be directly obtained.
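The storage behavior described above can be sketched as a cache keyed by editing node: the effect map is stored when the node is created, so handling the first input in step 204 is a direct lookup rather than a replay of editing operations. This is a hypothetical sketch; the function names and the use of strings as stand-ins for rendered images are illustrative assumptions.

```python
effect_map_cache = {}   # node id -> rendered effect map (string stand-in)

def on_node_edited(node_id, rendered):
    # Called during editing, when a node is appended to the editing path.
    effect_map_cache[node_id] = rendered

def on_first_input(node_id):
    # Called when the user selects a node: directly obtain the stored
    # image editing effect map, with no undo operations required.
    return effect_map_cache[node_id]

on_node_edited("exposure+46", "effect-map-1")
on_node_edited("brightness+20", "effect-map-2")
assert on_first_input("brightness+20") == "effect-map-2"
```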
In some embodiments of the present application, the electronic device may render the displayed first image into a first image editing effect map corresponding to the first editing node.
For example, referring to fig. 2 (C), as shown in fig. 3, if the user wants to view the first image editing effect diagram corresponding to the "brightness" editing node, that is, the first image editing effect diagram corresponding to the first editing node, the user may click on the "brightness" editing node in the first window 14 to trigger the electronic device to display the image editing effect diagram corresponding to the "brightness" editing node, that is, 21 in fig. 3.
In some embodiments of the present application, after displaying the first image editing effect map corresponding to the first editing node, the electronic device may receive an input from a user, and store the first image editing effect map.
In some embodiments of the present application, when the electronic device displays the first image editing effect diagram, a mark may be displayed on the first editing node, for example an inverted triangle graphic, to indicate to the user which editing node the currently displayed image editing effect diagram corresponds to.
In some embodiments of the present application, the electronic device may receive an input from the user to any editing node in the first editing path and display a first identifier on that editing node, for example a flag graphic, so that the user can quickly locate a given editing node through the first identifier.
Further, the electronic device may receive an input on the first identifier from the user and display an input window, in which the user may enter and save first information; when the electronic device subsequently receives an input on the first identifier, it can display the first information the user recorded earlier. For example, the first information may be: "Remember to send the image editing effect diagram corresponding to this editing node to contact A."
In some embodiments of the present application, the electronic device may simultaneously display the first image and the first image editing effect map corresponding to the first editing node in response to the first input. Thus, the user can intuitively see the editing effect of the first image.
In some embodiments of the present application, as shown in fig. 4 in conjunction with fig. 1, the above step 204 may be implemented specifically by the following step 204 a.
In step 204a, the electronic device displays, in response to the first input, a first image editing effect map corresponding to the first editing node and a second image editing effect map corresponding to the second editing node.
In the embodiment of the present application, the first editing node is an intermediate editing node of the first editing path, and the second editing node is a last editing node of the first editing path.
In some embodiments of the present application, when the electronic device displays the second image editing effect map corresponding to the last editing node in the first editing path, the electronic device may receive the first input of the user to the first editing node, and simultaneously display the first image editing effect map and the second image editing effect map corresponding to the first editing node, so as to facilitate the user to compare the difference between the first image editing effect map and the second image editing effect map while viewing the first image editing effect map.
For example, in conjunction with fig. 2 (C), as shown in fig. 5 (a), when the electronic device displays the 4th effect map, that is, 15 in fig. 2 (C), the user may drag the "brightness" editing node in the first window 14 to the display area of the 4th effect map; this is the first input made by the user to the first editing node. As shown in fig. 5 (B), this triggers the electronic device to simultaneously display the first image editing effect map corresponding to the "brightness" editing node and the 4th effect map, that is, to simultaneously display 21 in fig. 5 (B) and 15 in fig. 5 (B).
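Displaying the effect map for an arbitrary editing node amounts to replaying the path's operations up to that node. The following is a minimal sketch under the assumption that each node records a parameter adjustment and that image parameters stand in for pixel data; `apply_edits` is a hypothetical helper, not a function named in this application:

```python
def apply_edits(original: dict, path: list, upto: int) -> dict:
    """Re-derive the effect map for an editing node by replaying
    every operation in the path up to and including that node."""
    image = dict(original)  # image parameters stand in for pixel data
    for op, value in path[:upto + 1]:
        image[op] = image.get(op, 0) + value
    return image

original = {"brightness": 0, "exposure": 0}
path = [("exposure", 46), ("brightness", 10), ("contrast", 5), ("saturation", 8)]

# First input on the "brightness" node (an intermediate node): show its
# effect map alongside the effect map of the last node for comparison.
first_map = apply_edits(original, path, upto=1)
second_map = apply_edits(original, path, upto=len(path) - 1)
print(first_map)
print(second_map)
```

Because each effect map is derived from the same original plus a prefix of the path, no undo of editing operations is needed to view any intermediate result.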
In some embodiments of the present application, when the first image is edited, the electronic device may directly display two image display areas in the image editing interface, so as to facilitate the user to compare the two image editing effect graphs displayed in the two image display areas.
For example, one image display area may display an original image effect map of the first image or an image editing effect map that the user has edited in the previous step, and the other image display area may display an image editing effect map that the user has currently edited, i.e., a second image editing effect map.
Thus, when the electronic equipment simultaneously displays the original image effect diagram of the first image and the image editing effect diagram currently edited by the user, the user can compare the current editing effect with the original image effect in real time; when the electronic equipment simultaneously displays the image editing effect diagram edited by the user in the last step and the image editing effect diagram edited by the user currently, the user can compare the current editing effect with the editing effect in the last step in real time, and the user can conveniently edit the image continuously based on the comparison result.
In some embodiments of the present application, when the electronic device simultaneously displays the first image editing effect map and the second image editing effect map, the electronic device may receive an input that a user drags the fifth editing node to an image display area of the first image editing effect map or an image display area of the second image editing effect map, and switch and display the image editing effect map originally displayed in the image display area to the image editing effect map corresponding to the fifth editing node, so that the user may conveniently compare the image editing effect maps corresponding to any two editing nodes.
Illustratively, the fifth editing node is a certain editing node in the first editing path except the first editing node and the second editing node.
In some embodiments of the present application, when the electronic device simultaneously displays the first image editing effect map and the second image editing effect map, the electronic device may receive the input of the user to the zoom-out control in the first window, zoom out and display the first window, and cancel displaying the first image editing effect map, so that the user is convenient to view the large map of the second image editing effect map.
Therefore, when the electronic equipment displays the second image editing effect diagram corresponding to the last editing node in the first editing path, the electronic equipment is triggered to simultaneously display the first image editing effect diagram and the second image editing effect diagram corresponding to the first editing node by inputting the first editing node, so that a user can conveniently compare the two image editing effect diagrams, and the flexibility of the electronic equipment for comparing differences between the image effect diagrams is improved.
In the image display method provided by the embodiment of the application, since the editing path corresponding to the image editing process can be generated in the image editing process of the first image, a user can directly intuitively know the editing steps in the image editing process through the editing path, meanwhile, the user does not need to withdraw the image editing operation, the image editing effect graph corresponding to each editing node in the whole image editing process can be directly checked through the editing path, and the checking efficiency of the image editing effect graph in the image editing process is improved.
In some embodiments of the present application, after the step 202, the image display method provided in the embodiment of the present application further includes the following steps 301 and 302.
Step 301, the electronic device receives a second input.
In some embodiments of the present application, the second input is used to view a thumbnail of an image editing effect map corresponding to each editing node in the first editing path.
In some embodiments of the application, the second input includes, but is not limited to: a touch input made by the user through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, a click input, or other feasible inputs. The specific input may be determined according to actual use requirements, which is not limited in the embodiment of the present application.
In some embodiments of the present application, the second input may be an input to an image editing interface, or an input to a first window, or an input to a zoom-in control in the first window. The embodiment of the present application is not limited thereto.
Step 302, the electronic device responds to the second input, and displays a thumbnail of the image editing effect graph corresponding to each editing node in the first editing path on the image editing interface.
In some embodiments of the present application, the electronic device may display, in the first window of the image editing interface, a thumbnail of the image editing effect map corresponding to each editing node.
In some embodiments of the present application, the electronic device may display a thumbnail of the image editing effect map corresponding to each editing node in a nearby display area of the corresponding editing node, for example: to the right of the edit node, or above the edit node, or below the edit node.
For example, referring to fig. 2 (C), if the user wants to view the thumbnail of the image editing effect map corresponding to each editing node in the first editing path, the user may click on the zoom-in control 31 in the first window (that is, the second input), as shown in fig. 6 (B), triggering the electronic device to display the thumbnail of the image editing effect map corresponding to each editing node in the first window 14, with each thumbnail displayed to the right of its corresponding editing node.
In some embodiments of the present application, the electronic device may receive an input from a user to a certain editing node, and display an image editing effect map corresponding to the certain editing node in the first window, thereby improving flexibility of viewing the image editing effect map.
In this way, through the input of the user, the electronic equipment is triggered to display the thumbnail of the image editing effect graph corresponding to each editing node in the first editing path on the image editing interface, so that the user can compare the image editing effect graph corresponding to each editing node through the thumbnail, and the diversity and flexibility of the electronic equipment for comparing differences between the image effect graphs are improved.
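Generating one thumbnail per editing node can be sketched by walking the path once and snapshotting the accumulated state at each node. This is an assumed, simplified model (a real implementation would downscale actual pixels); `thumbnails` is a hypothetical helper name:

```python
def thumbnails(original: dict, path: list, scale: float = 0.25) -> dict:
    """Build a thumbnail for the effect map of every editing node
    in the path, keyed by the node's operation name."""
    thumbs = {}
    image = dict(original)
    for op, value in path:
        image[op] = image.get(op, 0) + value
        # a real implementation would downscale pixel data; here we
        # just snapshot the accumulated parameters plus a scale factor
        thumbs[op] = {"params": dict(image), "scale": scale}
    return thumbs

thumbs = thumbnails({"brightness": 0}, [("exposure", 46), ("brightness", 10)])
print(thumbs)
```

Each thumbnail is then rendered next to its node (e.g. to the right of it), so the user can compare all intermediate results at a glance.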
In some embodiments of the present application, after the step 302, the image display method provided in the embodiment of the present application further includes the following steps 401 and 402.
Step 401, the electronic device receives a third input to the first thumbnail.
In an embodiment of the present application, the first thumbnail is at least one thumbnail displayed in an image editing interface.
In some embodiments of the present application, the third input is used to adjust an image parameter of a third image editing effect map corresponding to the first thumbnail.
In some embodiments of the application, the third input includes, but is not limited to: a touch input made by the user on the first thumbnail through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, a click input, or other feasible inputs. The specific input may be determined according to actual use requirements, which is not limited in the embodiment of the present application.
In some embodiments of the present application, the third input may be an input to an editing node corresponding to the third image editing effect map.
Step 402, the electronic device responds to the third input, adjusts the image parameters of the third image editing effect map corresponding to the first thumbnail according to the input parameters of the third input, and adjusts the image parameters of the image editing effect map corresponding to the third editing node based on the adjusted third image editing effect map.
In the embodiment of the present application, the third editing node is an editing node located after the editing node corresponding to the third image editing effect map in the first editing path.
In some embodiments of the application, the input parameters include at least one of: brightness parameters, contrast parameters, size parameters, exposure parameters, filter parameters, etc. The embodiment of the present application is not limited thereto.
In some embodiments of the application, the image parameters include at least one of: luminance parameters, sharpness parameters, color temperature parameters, shading parameters, highlighting parameters, etc. The embodiment of the present application is not limited thereto.
It can be understood that, after adjusting the image parameters of the third image editing effect map, the electronic device may adaptively update the image parameters of other image editing effect maps after the third image editing effect map based on the adjusted third image editing effect map, that is, the image parameters of the image editing effect map corresponding to the third editing node.
For example, when the editing node corresponding to the third image editing effect map is editing node 1 and the input parameter of the third input is "brightness +20", the electronic device increases by 20 both the brightness parameter of the third image editing effect map and the brightness parameter of the image editing effect map corresponding to each third editing node after editing node 1.
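The downstream propagation described here can be sketched as applying the same parameter delta to the adjusted node's effect map and every later effect map, since those were derived from it. A minimal sketch, assuming effect maps are stored as parameter dictionaries in path order; `adjust_node` is a hypothetical name:

```python
def adjust_node(effect_maps: list, index: int, param: str, delta: float) -> None:
    """Apply a parameter change to the effect map at `index` and
    propagate the same change to every later effect map (the third
    editing nodes), since they were derived from the adjusted one."""
    for m in effect_maps[index:]:
        m[param] = m.get(param, 0) + delta

maps = [{"brightness": 0}, {"brightness": 0}, {"brightness": 0}]
adjust_node(maps, 1, "brightness", 20)   # third input: "brightness +20"
print(maps)  # node 0 unchanged; nodes 1 and 2 each increased by 20
```

Nodes before the adjusted one are deliberately left untouched, matching the behavior where only later effect maps are adaptively updated.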
In some embodiments of the present application, after receiving the third input of the user to the first thumbnail, the electronic device may display a third image editing effect map corresponding to the first thumbnail in the first window, thereby improving flexibility of viewing the image editing effect map.
Therefore, as the user inputs the first thumbnail, after adjusting the image parameters of the third image editing effect diagram corresponding to the first thumbnail, the electronic device can adaptively update the image parameters of other image editing effect diagrams after the third image editing effect diagram based on the adjusted third image editing effect diagram, so that the efficiency of the electronic device for editing the image effect diagram is improved.
In some embodiments of the present application, as shown in fig. 7 in conjunction with fig. 1, after the step 204, the image display method provided in the embodiment of the present application further includes the following steps 501 and 502.
Step 501, the electronic device displays, on an image editing interface, a fourth image editing effect diagram corresponding to a fourth editing node in the first editing path.
In an embodiment of the present application, the fourth editing node is any editing node in the first editing path.
In some embodiments of the present application, the user may input a fourth editing node in the first editing path to trigger the electronic device to display a fourth image editing effect map corresponding to the fourth editing node.
Step 502, in the process of editing the fourth image editing effect diagram, the electronic device adds editing nodes in the first editing path based on the image editing process of the fourth image editing effect diagram.
In some embodiments of the present application, when the fourth editing node is the last editing node in the first editing path, if the user continues to edit the fourth image editing effect map, the electronic device may add a corresponding editing node after the fourth editing node based on an image editing process of the user on the fourth image editing effect map.
For example, referring to fig. 2 (C), as shown in fig. 8 (a), the electronic device displays the 4 th effect diagram, that is, 15 in fig. 8 (a), that is, the fourth image editing effect diagram, and the user may click on the color temperature adjustment control in the image editing interface 11 to adjust the color temperature parameter of the 4 th effect diagram, for example: "color temperature-10"; as shown in fig. 8 (B), the electronic device updates the 4 th effect diagram to the 5 th effect diagram, i.e. 41 in fig. 8 (B), and triggers the electronic device to newly add a "color temperature" editing node after the last "exposure" editing node in the first editing path.
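When the node being edited is the last node of the path, continuing to edit simply appends a new node, as the "color temperature" example above shows. A minimal sketch of this append behavior, with hypothetical names and nodes modeled as `(operation, value)` tuples:

```python
def continue_editing(path: list, at: int, op: str, value: float) -> list:
    """Editing at the last node of the path appends a new editing node."""
    if at == len(path) - 1:
        path.append((op, value))
    return path

path = [("brightness", 10), ("exposure", 46)]
continue_editing(path, at=1, op="color_temperature", value=-10)
print(path[-1])  # ('color_temperature', -10)
```

The non-last-node case is handled differently (by branching a second path), as described below in the source text's step 502a.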
In some embodiments of the present application, if the fourth editing node is any editing node other than the last editing node in the first editing path, and the user continues to edit the fourth image editing effect map, the electronic device may add a second editing path to the first editing path and add, in the second editing path, an editing node corresponding to the image editing process of the fourth image editing effect map. The user can then continue editing the image on the basis of either the second editing path or the first editing path.
It should be noted that, for the detailed steps of adding the second editing path to the first editing path, the following embodiments will be described, and will not be repeated here.
Therefore, when the electronic equipment displays the image editing effect graph corresponding to any editing node on the image editing interface, if the user continues to edit the image editing effect graph, the electronic equipment can add the corresponding editing node in the first editing path based on the editing process, so that the subsequent user can select any editing node in the first editing path, trigger the electronic equipment to display the image editing effect graph corresponding to any editing node, and further improve the efficiency of the electronic equipment for comparing differences between the image effect graphs.
In some embodiments of the present application, the fourth editing node is other editing nodes except the last editing node in the first editing path; the above step 502 may be specifically implemented by the following step 502 a.
In step 502a, during the process of editing the fourth image editing effect map, the electronic device adds a second editing path to the first editing path based on the image editing process of the fourth image editing effect map.
In an embodiment of the present application, the second editing path includes at least one editing node in an image editing process of the fourth image editing effect map.
In some embodiments of the present application, in a process that a user continues to edit the fourth image editing effect map, if the electronic device detects that the image parameters adjusted by the user are different from the image parameters adjusted when the fourth image editing effect map is generated, that is, the image parameters corresponding to the fourth editing node are different, the electronic device automatically adds a second editing path in the first editing path based on the image editing process of the fourth image editing effect map.
For example, referring to fig. 3, as shown in fig. 9 (a), the electronic device displays an image editing effect diagram corresponding to the "brightness" editing node, that is, 21 in fig. 9 (a), that is, the fourth image editing effect diagram, and the user may click on the color temperature adjustment control in the image editing interface 11 to adjust the color temperature parameter of the fourth image editing effect diagram, for example: "color temperature-10"; as shown in fig. 9 (B), the fourth image editing effect map is updated and displayed as the 6 th effect map, that is, 51 in fig. 9 (B), and the electronic apparatus may add a second editing path including an "exposure" editing node, a "brightness" editing node, and a "color temperature" editing node in the first editing path based on the image editing process of the fourth image editing effect map.
In some embodiments of the present application, when the user continues to edit the fourth image editing effect map, if the electronic device detects that the image parameter adjusted by the user is the same image parameter that was adjusted when the fourth image editing effect map was generated, and only its parameter value is changed, the electronic device does not add an editing node; instead, it adjusts the image parameter of the fourth image editing effect map and, based on the adjusted fourth image editing effect map, adjusts the image parameters of the image editing effect maps corresponding to the editing nodes after the fourth editing node.
For example, if the adjustment parameter corresponding to the fourth editing node is "exposure +46" and the user wants to reduce the exposure parameter of the fourth image editing effect map, the electronic device only adjusts the exposure parameter of the fourth image editing effect map corresponding to the fourth editing node and adaptively updates the exposure parameters of the image editing effect maps corresponding to the editing nodes after the fourth editing node, without adding an editing node.
In some embodiments of the present application, when the electronic device displays the fourth image editing effect map, the electronic device may receive the input of the user to the fourth editing node, manually add the second editing path in the first editing path, and after receiving the editing process of the user to the fourth image editing effect map, add the corresponding editing node in the second editing path based on the image editing process of the fourth image editing effect map.
For example, referring to fig. 3, as shown in fig. 10 (a), the electronic device displays the image editing effect map corresponding to the "brightness" editing node, that is, 21 in fig. 10 (a), that is, the fourth image editing effect map. If the user wants to continue editing on the basis of the fourth image editing effect map without affecting the other image editing effect maps obtained by previous editing (that is, the image editing effect maps corresponding to the editing nodes in the first editing path), the user can click the "brightness" editing node to trigger the electronic device to display a plus control 61 after the "brightness" editing node, and then click the plus control 61. As shown in fig. 10 (B), this triggers the electronic device to add a second editing path to the first editing path, where the second editing path includes a blank editing node. The user then clicks a color temperature adjustment control in the image editing interface 11 to adjust the color temperature parameter of the fourth image editing effect map, for example: "color temperature -10". As shown in fig. 9 (B), the electronic device updates the fourth image editing effect map displayed on the image editing interface to the 6th effect map, that is, 51 in fig. 9 (B), and, based on the image editing process of the fourth image editing effect map, updates the blank editing node in the second editing path to a "color temperature" editing node, that is, newly adds a "color temperature" editing node in the second editing path.
In some embodiments of the present application, the first editing path and the second editing path do not affect each other.
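The branching behavior, where the two paths do not affect each other, can be sketched by copying the shared prefix of the first path and appending the new operation to the copy. This is an assumed model for illustration; `branch_path` is a hypothetical name:

```python
def branch_path(path: list, at: int, op: str, value: float) -> list:
    """Editing at a non-last node creates a second editing path: the
    steps up to and including that node, plus the new operation.
    The first path is left untouched, so the paths are independent."""
    second = path[:at + 1] + [(op, value)]
    return second

first = [("exposure", 46), ("brightness", 10), ("contrast", 5)]
second = branch_path(first, at=1, op="color_temperature", value=-10)
print(second)  # [('exposure', 46), ('brightness', 10), ('color_temperature', -10)]
print(first)   # unchanged
```

Because `path[:at + 1]` produces a copy, later edits appended to either path never modify the other, matching the statement that the first and second editing paths do not affect each other.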
Therefore, when the fourth image editing effect diagram is displayed, the fourth image editing effect diagram is continuously edited, the electronic equipment is triggered to increase a second editing path in the first editing path based on the image editing process of the fourth image editing effect diagram, and the second editing path comprises editing nodes corresponding to the fourth image editing effect diagram, so that the flexibility of displaying the editing nodes by the electronic equipment is improved, and the viewing efficiency of the image editing effect diagram in the image editing process is improved.
In some embodiments of the present application, after the step 502a, the image display method provided in the embodiment of the present application further includes the following steps 601 and 602.
Step 601, the electronic device receives a fourth input.
In some embodiments of the present application, the fourth input is used to store a final image editing effect map corresponding to the first editing path and a final image editing effect map corresponding to the second editing path.
In some embodiments of the application, the fourth input includes, but is not limited to: a touch input made by the user through a touch device such as a finger or a stylus, a voice command input by the user, a specific gesture input by the user, a click input, or other feasible inputs. The specific input may be determined according to actual use requirements, which is not limited in the embodiment of the present application.
In some embodiments of the present application, the fourth input may be an input to a save control.
In step 602, the electronic device associates and stores the final image editing effect map corresponding to the first editing path and the first editing path, and associates and stores the final image editing effect map corresponding to the second editing path and the second editing path in response to the fourth input.
In some embodiments of the present application, the electronic device may receive a selection input from a user to any one of the first editing path or the second editing path, and associate and store a final image editing effect map corresponding to the any one editing path and any one editing path.
In some embodiments of the present application, after the electronic device associates and stores the final image editing effect map and the first editing path corresponding to the first editing path, and associates and stores the final image editing effect map and the second editing path corresponding to the second editing path, when the electronic device displays any one of the final image editing effect maps in the image editing interface again, the electronic device may simultaneously display the editing path stored in association with the any one of the final image editing effect maps, so that the user may continuously edit any one of the final image editing effect maps, or the user may input a certain editing node in the displayed editing path, so as to trigger the electronic device to display the image editing effect map corresponding to the certain editing node.
Therefore, when the electronic device displays any one of the final image editing effect maps again, it can simultaneously display the editing path corresponding to that final image editing effect map, so that the user can conveniently continue editing it, improving the image editing efficiency of the electronic device. Alternatively, the user can conveniently view the image editing effect map corresponding to a particular editing node in the displayed editing path, improving the viewing efficiency of the image editing effect maps.
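Associating each final effect map with the path that produced it can be sketched as a simple keyed store, so that reopening an image can also restore its editing path. This is an illustrative sketch only; the `save_results` helper and the store layout are assumptions:

```python
import json

def save_results(store: dict, name: str, path: list, final_map: dict) -> None:
    """Associate and store a final effect map together with the editing
    path that produced it, so the path can be redisplayed when the
    image is opened again."""
    store[name] = {"path": path, "final": final_map}

store = {}
save_results(store, "first", [["exposure", 46]], {"exposure": 46})
save_results(store, "second",
             [["exposure", 46], ["color_temperature", -10]],
             {"exposure": 46, "color_temperature": -10})
print(json.dumps(store, indent=2))
```

On redisplay, looking up either entry yields both the final effect map and its path, from which any intermediate node's effect map can be re-derived.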
in some embodiments of the present application, the electronic device may display a video editing interface, and in the process of editing the first video, display an editing path on the video editing interface, where the editing path is used to indicate a video editing process of the first video, where the video editing process includes at least one editing step, and each editing step corresponds to one editing operation; the editing path includes at least one editing node in a video editing process of the first video, each editing node for indicating one editing step of the video editing process.
It should be noted that, in the image display method provided by the embodiment of the present application, the execution subject may be an image display device. In the embodiment of the present application, an image display device is described by taking an example in which the image display device performs an image display method.
Fig. 11 shows a schematic diagram of one possible configuration of an image display device involved in an embodiment of the present application. As shown in fig. 11, the image display device 70 may include: a display module 71 and a receiving module 72.
In an embodiment of the present application, the display module 71 is configured to display an image editing interface, where the image editing interface includes a first image; and displaying a first editing path on the image editing interface in the process of editing the first image, wherein the first editing path is used for indicating the image editing process of the first image, and the first editing path comprises at least one editing node in the image editing process of the first image.
A receiving module 72 for receiving a first input to a first editing node of the at least one editing node;
the display module 71 is further configured to display a first image editing effect map corresponding to the first editing node in response to the first input received by the receiving module 72.
The embodiment of the application provides an image display device, which can generate an editing path corresponding to an image editing process in the image editing process of a first image, so that a user can directly intuitively know the editing steps in the image editing process through the editing path, meanwhile, the user does not need to withdraw image editing operation, and can directly check the image editing effect graphs corresponding to all editing nodes in the whole image editing process through the editing path, thereby improving the checking efficiency of the image editing effect graphs in the image editing process.
In one possible implementation manner, the display module 71 is specifically configured to display, in response to a first input, a first image editing effect map corresponding to a first editing node, and a second image editing effect map corresponding to a second editing node, where the first editing node is an intermediate editing node of the first editing path, and the second editing node is a last editing node of the first editing path.
In a possible implementation manner, the receiving module 72 is further configured to receive, during the process of performing image editing on the first image by the display module 71, the second input after the first editing path is displayed on the image editing interface. The display module 71 is further configured to display, on the image editing interface, a thumbnail of the image editing effect map corresponding to each editing node in the first editing path in response to the second input received by the receiving module 72.
In one possible implementation manner, the image display device 70 provided in the embodiment of the present application further includes: and an adjustment module. The receiving module 72 is further configured to receive a third input to a first thumbnail after the display module 71 displays, on the image editing interface, a thumbnail of the image editing effect map corresponding to each editing node in the first editing path, where the first thumbnail is at least one thumbnail displayed in the image editing interface. An adjustment module, configured to adjust, in response to the third input received by the receiving module 72, an image parameter of a third image editing effect map corresponding to the first thumbnail according to an input parameter of the third input, and adjust, based on the adjusted third image editing effect map, an image parameter of an image editing effect map corresponding to the third editing node; in the embodiment of the application, the third editing node is other editing nodes after the corresponding editing node of the third image editing effect graph in the first editing path.
In one possible implementation manner, the image display device 70 provided in the embodiment of the present application further includes: modules are added. The display module 71 is further configured to display, on the image editing interface, a fourth image editing effect map corresponding to a fourth editing node in the first editing path after displaying the first image editing effect map corresponding to the first editing node in response to the first input, where the fourth editing node is any editing node in the first editing path. An adding module, configured to add editing nodes in the first editing path based on an image editing process of the fourth image editing effect map in the process of editing the fourth image editing effect map displayed by the display module 71.
In one possible implementation manner, the fourth editing node is other editing nodes except the last editing node in the first editing path. The adding module is specifically configured to add a second editing path in the first editing path based on an image editing process of the fourth image editing effect map in the process of editing the fourth image editing effect map; the second editing path includes at least one editing node in an image editing process of the fourth image editing effect map.
In one possible implementation manner, the image display device 70 provided in the embodiment of the present application further includes: and a processing module. The receiving module 72 is configured to receive the fourth input after adding the second editing path to the first editing path based on the image editing process of the fourth image editing effect map in the process of editing the fourth image editing effect map by the adding module. And a processing module, configured to associate and store the final image editing effect map corresponding to the first editing path with the first editing path, and associate and store the final image editing effect map corresponding to the second editing path with the second editing path in response to the fourth input received by the receiving module 72.
The image display device in the embodiment of the application can be an electronic device or a component in the electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet, notebook, palmtop, vehicle-mounted electronic device, mobile internet device (Mobile Internet Device, MID), augmented reality (Augmented Reality, AR)/virtual reality (Virtual Reality, VR) device, robot, wearable device, ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), netbook or personal digital assistant (Personal Digital Assistant, PDA), etc., but may also be a server, network attached storage (Network Attached Storage, NAS), personal computer (Personal Computer, PC), television (TV), teller machine or self-service machine, etc., and the embodiments of the present application are not limited in particular.
The image display device in the embodiment of the application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or other possible operating systems, and the embodiment of the present application is not limited specifically.
The image display device provided by the embodiment of the application can realize each process realized by the embodiment of the method, and in order to avoid repetition, the description is omitted here.
Optionally, as shown in fig. 12, the embodiment of the present application further provides an electronic device 900, which includes a processor 901 and a memory 902. The memory 902 stores a program or instructions executable on the processor 901, and the program or instructions, when executed by the processor 901, implement each step of the foregoing method embodiment and can achieve the same technical effect. To avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 13 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, and processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further include a power source (e.g., a battery) for powering the various components, and that the power source may be logically coupled to the processor 110 via a power management system, so that functions such as managing charging, discharging, and power consumption are performed via the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components, which are not described in detail herein.
In an embodiment of the present application, the display unit 106 is configured to display an image editing interface, where the image editing interface includes a first image; and displaying a first editing path on the image editing interface in the process of editing the first image, wherein the first editing path is used for indicating the image editing process of the first image, and the first editing path comprises at least one editing node in the image editing process of the first image.
A user input unit 107 for receiving a first input to a first editing node of the at least one editing node.
The display unit 106 is further configured to display a first image editing effect map corresponding to the first editing node in response to the first input received by the user input unit 107.
The embodiment of the application provides an electronic device. Because an editing path corresponding to the image editing process can be generated during the image editing of the first image, the user can directly and intuitively learn the editing steps in the image editing process through the editing path. Meanwhile, without withdrawing any image editing operation, the user can directly view, through the editing path, the image editing effect map corresponding to each editing node in the whole image editing process, which improves the viewing efficiency of image editing effect maps during image editing.
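The editing-path idea described above can be sketched as a simple node list in which each editing node caches the effect map produced by its editing step, so any intermediate result can be shown without undoing later operations. This is a minimal, illustrative Python sketch; the names `EditNode`, `EditPath`, and `effect_at`, and the nested-list image placeholders, are assumptions for illustration and are not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class EditNode:
    name: str            # e.g. "crop", "brightness+10"
    effect_image: list   # cached effect map after this editing step

@dataclass
class EditPath:
    nodes: list = field(default_factory=list)

    def add_node(self, name, effect_image):
        # Each editing operation appends one node to the editing path.
        self.nodes.append(EditNode(name, effect_image))

    def effect_at(self, index):
        # A "first input" on a node is a cache lookup -- no undo of
        # subsequent editing operations is required.
        return self.nodes[index].effect_image

path = EditPath()
path.add_node("crop", [[1]])
path.add_node("brighten", [[2]])
print(path.effect_at(0))  # -> [[1]]  (intermediate result shown directly)
```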
In some embodiments of the present application, the display unit 106 is specifically configured to display, in response to a first input, a first image editing effect map corresponding to a first editing node, and a second image editing effect map corresponding to a second editing node, where the first editing node is an intermediate editing node of the first editing path, and the second editing node is a last editing node of the first editing path.
In some embodiments of the present application, the user input unit 107 is further configured to receive a second input after the first editing path is displayed on the image editing interface in the process of editing the first image by the display unit 106.
The display unit 106 is further configured to display, on an image editing interface, a thumbnail of an image editing effect map corresponding to each editing node in the first editing path in response to the second input received by the user input unit 107.
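The thumbnail view triggered by the second input can be sketched as downscaling each node's cached effect map so the whole editing history fits on the image editing interface at once. The stride-based downscale below is a deliberately naive stand-in for real thumbnailing, and `make_thumbnails` is an illustrative name.

```python
def make_thumbnails(effect_maps, scale=2):
    """Keep every `scale`-th row and column of each effect map."""
    return [[row[::scale] for row in img[::scale]] for img in effect_maps]

maps = [
    [[1, 2, 3, 4], [5, 6, 7, 8]],    # effect map cached at node 1
    [[9, 9, 9, 9], [8, 8, 8, 8]],    # effect map cached at node 2
]
print(make_thumbnails(maps))  # -> [[[1, 3]], [[9, 9]]], one thumbnail per node
```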
In some embodiments of the present application, the user input unit 107 is further configured to receive, after the display unit 106 displays, on the image editing interface, a thumbnail of an image editing effect map corresponding to each editing node in the first editing path, a third input to the first thumbnail, where the first thumbnail is at least one thumbnail displayed in the image editing interface.
The processor 110 is configured to: in response to the third input received by the user input unit 107, adjust an image parameter of a third image editing effect map corresponding to the first thumbnail according to an input parameter of the third input, and adjust an image parameter of the image editing effect map corresponding to a third editing node based on the adjusted third image editing effect map. In the embodiment of the application, the third editing node is an editing node, in the first editing path, subsequent to the editing node corresponding to the third image editing effect map.
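This propagation behavior, adjusting one node's image parameter and carrying the same adjustment into every subsequent node's effect map (since later effect maps were derived from the earlier one), can be sketched with a single scalar parameter per node. The brightness model and the name `adjust_from` are illustrative assumptions, not from the patent.

```python
def adjust_from(node_params, index, delta):
    """Apply `delta` to the parameter at `index` and at all later nodes."""
    return [p + delta if i >= index else p for i, p in enumerate(node_params)]

brightness = [10, 12, 15, 15]          # cached per-node parameter values
print(adjust_from(brightness, 1, 5))   # -> [10, 17, 20, 20]
```

Nodes before the adjusted one keep their cached parameters; only the adjusted node and its successors are recomputed.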
In some embodiments of the present application, the display unit 106 is further configured to display, on the image editing interface, a fourth image editing effect map corresponding to a fourth editing node in the first editing path after displaying the first image editing effect map corresponding to the first editing node in response to the first input, where the fourth editing node is any editing node in the first editing path.
The processor 110 is further configured to add an editing node to the first editing path based on an image editing process of the fourth image editing effect map during an editing process of the fourth image editing effect map displayed on the display unit 106.
In some embodiments of the present application, the fourth editing node is any editing node in the first editing path other than the last editing node. The processor 110 is specifically configured to add a second editing path to the first editing path based on the image editing process of the fourth image editing effect map during the editing of the fourth image editing effect map; the second editing path includes at least one editing node in the image editing process of the fourth image editing effect map.
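Branching a second editing path off a non-final node, as described above, is naturally modeled as a tree: editing an intermediate node's effect map does not overwrite the nodes that already follow it, but attaches a new child at that node. A minimal sketch with illustrative names (`Node`, `branch_at`):

```python
class Node:
    def __init__(self, name):
        self.name = name
        self.children = []   # more than one child marks a branch point

def branch_at(node, edit_name):
    """Start a new editing step from `node` by attaching a new child node."""
    child = Node(edit_name)
    node.children.append(child)
    return child

root = Node("original")
crop = branch_at(root, "crop")
branch_at(crop, "brighten")   # first editing path: original -> crop -> brighten
branch_at(crop, "blur")       # second editing path branches at "crop"
print(len(crop.children))     # -> 2
```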
In some embodiments of the present application, the user input unit 107 is configured to receive a fourth input after the processor 110 adds the second editing path to the first editing path based on the image editing process of the fourth image editing effect map during the editing of the fourth image editing effect map.
The processor 110 is further configured to associate and store a final image editing effect map corresponding to the first editing path with the first editing path, and associate and store a final image editing effect map corresponding to the second editing path with the second editing path in response to the fourth input received by the user input unit 107.
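The "associate and store" step triggered by the fourth input can be sketched as mapping each editing path to the final (last) effect map on that path. The path names, image placeholders, and the function name `store_final_effects` below are hypothetical.

```python
def store_final_effects(paths):
    """paths: {path_name: [per-node effect maps]} -> {path_name: final map}.

    Associates each editing path with the effect map of its last node.
    """
    return {name: nodes[-1] for name, nodes in paths.items()}

paths = {
    "first_path":  ["crop_img", "brighten_img"],
    "second_path": ["crop_img", "blur_img", "sharpen_img"],
}
print(store_final_effects(paths))
# -> {'first_path': 'brighten_img', 'second_path': 'sharpen_img'}
```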
The electronic device provided by the embodiment of the application can realize each process realized by the embodiment of the method and can achieve the same technical effect, and in order to avoid repetition, the description is omitted here.
The beneficial effects of the various implementation manners in this embodiment may be specifically referred to the beneficial effects of the corresponding implementation manners in the foregoing method embodiment, and in order to avoid repetition, the description is omitted here.
It should be appreciated that, in embodiments of the present application, the input unit 104 may include a graphics processor (Graphics Processing Unit, GPU) 1041 and a microphone 1042, where the graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 107 includes at least one of a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a first storage area storing programs or instructions and a second storage area storing data; in an embodiment of the present application, the first storage area may store an operating system, and application programs or instructions required for at least one function (such as a sound playing function and an image playing function). Further, the memory 109 may include volatile memory or nonvolatile memory, or both. In embodiments of the present application, the nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable programmable ROM (Erasable PROM, EPROM), an electrically erasable programmable ROM (Electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). Memory 109 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 110 may include one or more processing units. Optionally, in embodiments of the application, the processor 110 integrates an application processor, which primarily handles operations involving the operating system, user interface, application programs, etc., and a modem processor (such as a baseband processor), which primarily handles wireless communication signals. It will be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The embodiment of the application also provides a readable storage medium, on which a program or an instruction is stored, which when executed by a processor, implements each process of the above method embodiment, and can achieve the same technical effects, and in order to avoid repetition, the description is omitted here.
In an embodiment of the present application, the processor is the processor in the electronic device described in the foregoing embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, which comprises a processor and a communication interface, wherein the communication interface is coupled with the processor, and the processor is used for running programs or instructions to realize the processes of the embodiment of the method, and can achieve the same technical effects, so that repetition is avoided, and the description is omitted here.
It should be understood that the chip referred to in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
Embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the above method embodiments, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprises a ..." does not, without more constraints, exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, functions may also be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or alternatively by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are to be protected by the present application.
Claims (14)
1. An image display method, the method comprising:
displaying an image editing interface, wherein the image editing interface comprises a first image;
displaying a first editing path on the image editing interface in the process of editing the first image, wherein the first editing path is used for indicating the image editing process of the first image, and comprises at least one editing node in the image editing process of the first image;
receiving a first input to a first editing node of the at least one editing node;
and responding to the first input, and displaying a first image editing effect graph corresponding to the first editing node.
2. The method of claim 1, wherein the displaying, in response to the first input, a first image editing effect map corresponding to the first editing node comprises:
and responding to the first input, displaying a first image editing effect diagram corresponding to the first editing node and a second image editing effect diagram corresponding to a second editing node, wherein the first editing node is an intermediate editing node of the first editing path, and the second editing node is the last editing node of the first editing path.
3. The method of claim 1, wherein, in the process of editing the first image, after the image editing interface displays the first editing path, the method further comprises:
receiving a second input;
and responding to the second input, and displaying a thumbnail of an image editing effect graph corresponding to each editing node in the first editing path on the image editing interface.
4. The method of claim 3, wherein after the image editing interface displays the thumbnail of the image editing effect map corresponding to each editing node in the first editing path, the method further comprises:
receiving a third input of a first thumbnail, the first thumbnail being at least one thumbnail displayed in the image editing interface;
responding to the third input, adjusting the image parameters of a third image editing effect diagram corresponding to the first thumbnail according to the input parameters of the third input, and adjusting the image parameters of the image editing effect diagram corresponding to a third editing node based on the adjusted third image editing effect diagram;
wherein the third editing node is an editing node, in the first editing path, subsequent to the editing node corresponding to the third image editing effect graph.
5. The method of claim 1, wherein after displaying the first image editing effect map corresponding to the first editing node in response to the first input, the method further comprises:
displaying a fourth image editing effect diagram corresponding to a fourth editing node in the first editing path on the image editing interface, wherein the fourth editing node is any editing node in the first editing path;
and adding editing nodes in the first editing path based on the image editing process of the fourth image editing effect graph in the editing process of the fourth image editing effect graph.
6. The method of claim 5, wherein the fourth editing node is an editing node in the first editing path other than the last editing node;
in the process of editing the fourth image editing effect diagram, adding editing nodes in the first editing path based on the image editing process of the fourth image editing effect diagram includes:
in the process of editing the fourth image editing effect diagram, adding a second editing path in the first editing path based on the image editing process of the fourth image editing effect diagram;
The second editing path includes at least one editing node in an image editing process of the fourth image editing effect map.
7. The method according to claim 6, wherein the method further comprises, in the process of editing the fourth image editing effect map, after adding a second editing path in the first editing path based on the image editing process of the fourth image editing effect map:
receiving a fourth input;
and in response to the fourth input, associating and storing a final image editing effect map corresponding to the first editing path with the first editing path, and associating and storing a final image editing effect map corresponding to the second editing path with the second editing path.
8. An image display device, the device comprising: a display module and a receiving module;
the display module is used for displaying an image editing interface, and the image editing interface comprises a first image; displaying a first editing path on the image editing interface in the process of editing the first image, wherein the first editing path is used for indicating the image editing process of the first image, and comprises at least one editing node in the image editing process of the first image;
The receiving module is used for receiving a first input to a first editing node in the at least one editing node;
the display module is further configured to display a first image editing effect map corresponding to the first editing node in response to the first input received by the receiving module.
9. The apparatus according to claim 8, wherein the display module is specifically configured to display, in response to the first input, a first image editing effect graph corresponding to the first editing node, and a second image editing effect graph corresponding to a second editing node, the first editing node being an intermediate editing node of the first editing path, and the second editing node being a last editing node of the first editing path.
10. The apparatus of claim 8, wherein the receiving module is further configured to receive a second input after the image editing interface displays a first editing path during the image editing of the first image by the display module;
the display module is further configured to display, on the image editing interface, a thumbnail of an image editing effect map corresponding to each editing node in the first editing path in response to the second input received by the receiving module.
11. The apparatus of claim 10, wherein the apparatus further comprises: an adjusting module; the receiving module is further configured to receive a third input to a first thumbnail after the display module displays a thumbnail of an image editing effect map corresponding to each editing node in the first editing path on the image editing interface, where the first thumbnail is at least one thumbnail displayed in the image editing interface;
the adjusting module is configured to respond to the third input received by the receiving module, adjust an image parameter of a third image editing effect diagram corresponding to the first thumbnail according to an input parameter of the third input, and adjust an image parameter of an image editing effect diagram corresponding to a third editing node based on the adjusted third image editing effect diagram;
wherein the third editing node is an editing node, in the first editing path, subsequent to the editing node corresponding to the third image editing effect graph.
12. The apparatus of claim 8, wherein the apparatus further comprises: an adding module;
the display module is further configured to display, on the image editing interface, a fourth image editing effect diagram corresponding to a fourth editing node in the first editing path after the first image editing effect diagram corresponding to the first editing node is displayed in response to the first input, where the fourth editing node is any editing node in the first editing path;
The adding module is configured to add an editing node in the first editing path based on an image editing process of the fourth image editing effect map in a process of editing the fourth image editing effect map displayed by the display module.
13. The apparatus of claim 12, wherein the fourth editing node is an editing node in the first editing path other than the last editing node;
the adding module is specifically configured to add a second editing path in the first editing path based on an image editing process of the fourth image editing effect map in the process of editing the fourth image editing effect map;
the second editing path includes at least one editing node in an image editing process of the fourth image editing effect map.
14. The apparatus of claim 13, wherein the apparatus further comprises: a processing module;
the receiving module is used for receiving a fourth input after a second editing path is added in the first editing path based on the image editing process of the fourth image editing effect graph in the process of editing the fourth image editing effect graph by the adding module;
The processing module is configured to associate and store a final image editing effect map corresponding to the first editing path with the first editing path, and associate and store a final image editing effect map corresponding to the second editing path with the second editing path in response to the fourth input received by the receiving module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311199840.0A CN117149038A (en) | 2023-09-15 | 2023-09-15 | Image display method and image display device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117149038A true CN117149038A (en) | 2023-12-01 |
Family
ID=88907962
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311199840.0A Pending CN117149038A (en) | 2023-09-15 | 2023-09-15 | Image display method and image display device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117149038A (en) |
- 2023-09-15: Application CN202311199840.0A filed; publication CN117149038A; status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113093968B (en) | Shooting interface display method and device, electronic equipment and medium | |
EP3822758B1 (en) | Method and apparatus for setting background of ui control | |
CN112954210A (en) | Photographing method and device, electronic equipment and medium | |
CN112911147B (en) | Display control method, display control device and electronic equipment | |
CN114302009A (en) | Video processing method, video processing device, electronic equipment and medium | |
CN111866379A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN116107531A (en) | Interface display method and device | |
CN110502290A (en) | Interface display method, device, display equipment and storage medium | |
CN115617225A (en) | Application interface display method and device, electronic equipment and storage medium | |
CN113794831B (en) | Video shooting method, device, electronic equipment and medium | |
CN115037874A (en) | Photographing method and device and electronic equipment | |
CN112929566B (en) | Display control method, display control device, electronic apparatus, and medium | |
CN112383708B (en) | Shooting method and device, electronic equipment and readable storage medium | |
CN111885298B (en) | Image processing method and device | |
CN114500852B (en) | Shooting method, shooting device, electronic equipment and readable storage medium | |
CN117311885A (en) | Picture viewing method and device | |
CN114390205B (en) | Shooting method and device and electronic equipment | |
CN116244028A (en) | Interface display method and device and electronic equipment | |
CN115562539A (en) | Control display method and device, electronic equipment and readable storage medium | |
CN112887607B (en) | Shooting prompting method and device | |
CN113157184B (en) | Content display method, device, electronic equipment and readable storage medium | |
CN117149038A (en) | Image display method and image display device | |
CN114245017A (en) | Shooting method and device and electronic equipment | |
CN114286010A (en) | Shooting method, shooting device, electronic equipment and medium | |
CN113873168A (en) | Shooting method, shooting device, electronic equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||