CN110995996A - Image display method and device, storage medium and terminal equipment - Google Patents
- Publication number
- CN110995996A (application CN201911265384.9A / CN201911265384A)
- Authority
- CN
- China
- Prior art keywords
- image
- scanning
- shot object
- mobile terminal
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
- H04N23/80—Camera processing pipelines; Components thereof
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Abstract
The application provides an image display method, an image display device, a storage medium and a terminal device. The image display method determines a photographed object from a scanned image of the photographing visual field area of a mobile terminal and extracts a local image of the photographed object from the scanned image. When a change in the position of the mobile terminal is detected, a composite image containing the photographed object is displayed on the current preview interface according to the local image, so that after its position changes the mobile terminal can show a clear and stable image of the photographed object on the current preview interface without re-locating the photographed object, which reduces the power consumption of the mobile terminal.
Description
Technical Field
The present application relates to the field of communications technologies, and in particular, to an image display method and apparatus, a storage medium, and a terminal device.
Background
When a picture is taken with a mobile terminal and the position of the mobile terminal changes, the position of the photographed object in the current preview interface of the mobile terminal also changes, and the image of the photographed object becomes blurred.
In prior-art image display methods for mobile terminals, when the position of the mobile terminal changes, the photographed object is re-located so that a clear and stable image of it can be shown on the current preview interface, which causes the mobile terminal to consume considerable power.
Disclosure of Invention
The application provides an image display method, an image display device, a storage medium and a terminal device, which solve the problem that, after its position changes, a mobile terminal must re-locate the photographed object in order to show a clear and stable image of it on the current preview interface, causing the mobile terminal to consume considerable power.
In order to solve the above problem, an embodiment of the present application provides an image display method, including:
carrying out image scanning on a photographing visual field area of the mobile terminal to obtain a scanned image;
determining at least one shot object from the scanning image, and extracting a partial image of each shot object from the scanning image;
and in the image scanning process, when the position of the mobile terminal is detected to be changed, displaying a composite image containing the shot object on a current preview interface according to the local image.
In the image display method provided by the present application, the step of determining at least one object to be photographed from the scanned image specifically includes:
determining all objects in the scanned image, and displaying the scanned image on a current preview interface;
acquiring selection operation information of a user on the preview interface;
and determining at least one shot object from all the objects according to the selection operation information.
In the image display method provided by the present application, during the image scanning, the method further includes:
acquiring two pieces of position information of the mobile terminal at any adjacent time;
when the difference value between the two pieces of position information is smaller than a preset difference value, determining that the position of the mobile terminal is not changed;
and when the difference value between the two pieces of position information is not less than a preset difference value, determining that the position of the mobile terminal is changed.
In the image display method provided by the present application, the step of displaying a composite image including the object to be photographed on a current preview interface according to the local image specifically includes:
acquiring a current scanning image and taking the current scanning image as a target scanning image;
when the shot object is contained in the target scanning image, determining the position of the shot object on the target scanning image as a target position;
and generating a composite image containing the shot object at the target position according to the local image, and displaying the composite image on a current preview interface.
In the image display method provided by the present application, the step of extracting the partial image of each of the objects from the scanned image specifically includes:
acquiring continuous frame scanning images within a preset time length;
determining the definition of each shot object in the continuous frame scanning images;
and extracting the corresponding image with the highest definition of each shot object from the continuous frame scanning images as a local image of the corresponding shot object.
In order to solve the above problem, an embodiment of the present application also provides an image display device, including:
the scanning module is used for scanning images of a photographing view field of the mobile terminal to obtain a scanned image;
the extraction module is used for determining at least one shot object from the scanning image and extracting a partial image of each shot object from the scanning image;
and the display module is used for displaying a composite image containing the shot object on a current preview interface according to the local image when detecting that the position of the mobile terminal is changed in the image scanning process.
In the image display device provided by the present application, the extraction module specifically includes:
the display unit is used for determining all objects in the scanned image and displaying the scanned image on a current preview interface;
the acquisition unit is used for acquiring the selection operation information of the user on the preview interface;
and the determining unit is used for determining at least one shot object from all the objects according to the selection operation information.
In the image display device provided by the present application, the display module specifically includes:
the mobile terminal comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring two pieces of position information of the mobile terminal at any adjacent time;
a first determining unit, configured to determine that the position of the mobile terminal has not changed when a difference between the two pieces of position information is smaller than a preset difference;
and the second determining unit is used for determining that the position of the mobile terminal changes when the difference value between the two pieces of position information is not less than a preset difference value.
In the image display apparatus provided by the present application, the image display apparatus further includes a generation unit configured to:
acquiring a current scanning image and taking the current scanning image as a target scanning image;
when the shot object is contained in the target scanning image, determining the position of the shot object on the target scanning image as a target position;
and generating a composite image containing the shot object at the target position according to the local image, and displaying the composite image on a current preview interface.
In the image display apparatus provided by the present application, the image display apparatus further includes an extraction unit configured to:
acquiring continuous frame scanning images within a preset time length;
determining the definition of each shot object in the continuous frame scanning images;
and extracting the corresponding image with the highest definition of each shot object from the continuous frame scanning images as a local image of the corresponding shot object.
In order to solve the above problem, an embodiment of the present application further provides a computer-readable storage medium, where a plurality of instructions are stored, and the instructions are adapted to be loaded by a processor to execute any one of the image display methods described above.
In order to solve the above problem, an embodiment of the present application further provides a terminal device, which includes a processor and a memory, where the processor is electrically connected to the memory, the memory is used to store instructions and data, and the processor is used to execute the steps in the image display method according to any one of the above descriptions.
The beneficial effects of this application are as follows: unlike the prior art, the application provides an image display method, an image display device, a storage medium and a terminal device which allow a mobile terminal whose position has changed to show a clear and stable image of the photographed object on the current preview interface without re-locating it, thereby reducing the power consumption of the mobile terminal.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flowchart of an image display method according to an embodiment of the present application.
Fig. 2 is another schematic flow chart of an image display method according to an embodiment of the present disclosure.
Fig. 3 is a schematic view of an application scenario of an image display method according to an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an image display device according to an embodiment of the present application.
Fig. 5 is another schematic structural diagram of an image display device according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Fig. 7 is another schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides an image display method, an image display device, a storage medium and terminal equipment.
Referring to fig. 1, fig. 1 is a flowchart illustrating an image display method according to an embodiment of the present disclosure, where the image display method is applied to a mobile terminal, and the mobile terminal may be any intelligent electronic device with a mobile communication function, such as a smart phone, a tablet computer, and a notebook computer. The specific flow of the image display method provided by this embodiment may be as follows:
and S101, scanning the image of the photographing visual field area of the mobile terminal to obtain a scanned image.
In this embodiment, this step is the basis of all subsequent operations, and is intended to store the picture (i.e. the scanned image) captured by the lens of the terminal device into the processor of the terminal device and display the picture in the preview interface of the terminal device. The picture captured by the lens of the terminal equipment comprises an object which a terminal user wants to shoot and other objects.
It is easy to understand that, when the preview interface of the terminal displays the picture captured by the terminal lens, the size of the preview interface can be set according to a preset display scale. The display scale is used for indicating the size relation between the preview interface and the terminal display screen. For example, if the display ratio is 70%, it indicates that the size of the preview interface is 70% of the size of the display screen.
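For illustration only, the sizing rule above can be sketched as follows; the screen dimensions and the 70% ratio in the snippet are example values, not values fixed by the application:

```python
# Illustrative sketch: preview size derived from a preset display ratio.
# Screen dimensions and the 0.7 ratio are example values.
def preview_size(screen_w: int, screen_h: int, display_ratio: float = 0.7) -> tuple[int, int]:
    """Return the preview-interface size as a fraction of the display screen."""
    return int(screen_w * display_ratio), int(screen_h * display_ratio)

print(preview_size(1080, 2340))  # (756, 1638): the preview is 70% of the screen size
```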
S102, determining at least one shot object from the scanned image, and extracting a partial image of each shot object from the scanned image.
S103, in the image scanning process, when the position of the mobile terminal is detected to be changed, displaying a composite image containing the shot object on the current preview interface according to the local image.
Further, the step S102 may specifically include:
determining all objects in the scanned image, and displaying the scanned image on the current preview interface;
acquiring selection operation information of a user on a preview interface;
determining at least one photographed object from all objects according to the selection operation information;
acquiring continuous frame scanning images within a preset time length;
determining the definition of each shot object in the continuous frame scanning images;
and extracting the corresponding image with the highest definition of each object from the continuous frame scanning images as a local image of the corresponding object.
In this embodiment, after the terminal determines all objects in the scanned image of the terminal through the sensing element inside the terminal, a corresponding mark may be added to each object on the scanned image for the user to select in the subsequent operation, and the scanned image including the mark is displayed on the current preview interface. It will be readily appreciated that the manner in which the respective indicia is added to each object in the scanned image includes, but is not limited to: a selection box is generated on each object and/or a marker point is generated on each object.
Specifically, the user may trigger the electronic device to generate the selection operation information by clicking a mark corresponding to the object displayed on the preview interface, or by specifying a gesture, voice, or the like. If the user triggers the electronic equipment to generate selection operation information by clicking a mark corresponding to the object displayed on the preview interface, the processor determines that the object corresponding to the mark selected by the user is taken as a shot object; if the user triggers the electronic equipment to generate selection operation information through the designated gesture, the processor determines an object contained in a range covered by the designated gesture as a shot object; if the user triggers the electronic equipment to generate the selection operation information through voice, the processor determines the shot object according to the information contained in the voice, wherein the information contained in the voice indicates the shot object selected by the user.
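The tap-based branch of this selection logic might look like the following sketch, in which the marks added to the objects are modelled as hypothetical bounding boxes; the gesture and voice branches described above are not shown:

```python
# Illustrative sketch of the tap-based selection branch: each mark added to an
# object is modelled as a hypothetical bounding box on the preview interface.
from dataclasses import dataclass

@dataclass
class ObjectMark:
    name: str
    x: int   # top-left corner of the selection box on the preview
    y: int
    w: int   # box width
    h: int   # box height

def select_objects(marks: list[ObjectMark], tap_x: int, tap_y: int) -> list[ObjectMark]:
    """Return the marked objects whose selection box contains the tap point."""
    return [m for m in marks
            if m.x <= tap_x <= m.x + m.w and m.y <= tap_y <= m.y + m.h]

marks = [ObjectMark("A", 100, 200, 150, 150), ObjectMark("B", 400, 500, 120, 180)]
print([m.name for m in select_objects(marks, tap_x=160, tap_y=260)])  # ['A']
```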
Specifically, the preset duration may be, for example, 1 second, and is usually set with both the power consumption and the imaging quality of the terminal in mind. It is easy to understand that the factors determining the power consumed during terminal imaging include the performance of the digital signal processing chip inside the terminal: the better the chip performs, the less power imaging consumes. The factors determining imaging quality include the performance of the image sensor inside the terminal: the better the sensor performs, the higher the imaging quality.
In this embodiment, the definition of each photographed object is related to factors such as the sharpness of the lens in the consecutive frames, the heat generated by the sensor and the light sensitivity of the sensor. The processor obtains an influence score for each of these factors in each frame and sums them, directly or with weights, to obtain a definition score that reflects the definition of each photographed object. After the processor has obtained the definition score of each photographed object in every frame, it selects the corresponding image with the highest definition score as the local image of that photographed object and deletes the remaining images from the buffer area in which the images are stored.
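A minimal sketch of this frame-selection rule is given below; the factor names, the weights and the frame data are assumptions made for illustration, not values specified by the application:

```python
# Illustrative sketch of picking the local image: each consecutive frame carries
# per-factor influence scores; the factor names and weights below are assumed.
WEIGHTS = {"lens_sharpness": 0.5, "sensor_heat": 0.2, "light_sensitivity": 0.3}

def definition_score(factors: dict[str, float]) -> float:
    """Weighted sum of the influence scores (a direct sum would use equal weights)."""
    return sum(WEIGHTS[name] * score for name, score in factors.items())

def pick_local_image(frames: list[dict]) -> dict:
    """Keep the frame with the highest definition score as the local-image source."""
    best = max(frames, key=lambda frame: definition_score(frame["factors"]))
    # In the described method the remaining frames would then be dropped from the cache.
    return best

frames = [
    {"image": "frame_1", "factors": {"lens_sharpness": 0.9, "sensor_heat": 0.7, "light_sensitivity": 0.8}},
    {"image": "frame_2", "factors": {"lens_sharpness": 0.6, "sensor_heat": 0.9, "light_sensitivity": 0.7}},
]
print(pick_local_image(frames)["image"])  # frame_1 scores 0.83 vs. 0.69 for frame_2
```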
Further, the step S103 may specifically include:
acquiring a current scanning image and taking the current scanning image as a target scanning image;
when the target scanning image contains the shot object, determining the position of the shot object on the target scanning image as a target position;
and generating a composite image containing the shot object at the target position according to the local image, and displaying the composite image on the current preview interface.
In this embodiment, when the position of the mobile terminal changes, the photographed object may no longer be in the current photographing visual field area at all, or only a part of it may remain in the current photographing visual field area. If the photographed object is not in the current photographing visual field area at all, the target scanning image is considered not to contain the photographed object; if only a part of it is in the current photographing visual field area, the target scanning image is considered to contain the photographed object. When only a part of the photographed object is in the current photographing visual field area, the processor first determines the part of the photographed object that lies in the current photographing visual field area as the part to be displayed, extracts the corresponding image from the local image according to that part, generates a composite image containing the part to be displayed at the target position, and displays the composite image on the current preview interface.
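The compositing step can be sketched roughly as follows, using NumPy arrays as stand-ins for the target scanning image and the stored local image; clipping the local image to the frame plays the role of the "part to be displayed" described above:

```python
# Illustrative sketch of the compositing step, with NumPy arrays standing in for
# the target scanning image and the stored local image of the photographed object.
import numpy as np

def composite(target_scan: np.ndarray, local_img: np.ndarray, top: int, left: int) -> np.ndarray:
    """Paste local_img onto target_scan with its top-left corner at the target position."""
    out = target_scan.copy()
    h, w = local_img.shape[:2]
    H, W = out.shape[:2]
    # Only the part of the photographed object inside the current field of view is shown.
    y0, y1 = max(top, 0), min(top + h, H)
    x0, x1 = max(left, 0), min(left + w, W)
    if y0 < y1 and x0 < x1:
        out[y0:y1, x0:x1] = local_img[y0 - top:y1 - top, x0 - left:x1 - left]
    return out

scan = np.zeros((480, 640, 3), dtype=np.uint8)         # current scanned frame
subject = np.full((100, 80, 3), 255, dtype=np.uint8)   # stored local image
preview = composite(scan, subject, top=400, left=600)  # subject partly outside the view
print(preview.shape)  # (480, 640, 3)
```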
Specifically, during the image scanning, the method further includes:
acquiring two pieces of position information of the mobile terminal at any adjacent time;
when the difference value between the two pieces of position information is smaller than a preset difference value, determining that the position of the mobile terminal is not changed;
and when the difference value between the two pieces of position information is not less than the preset difference value, determining that the position of the mobile terminal is changed.
In this embodiment, the position information may be obtained by a sensing element inside the terminal, and the information includes the distance from the terminal to the center of each object to be photographed and the exposure data of the terminal to each part of each object to be photographed.
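A simplified sketch of the position-change test is shown below; it reduces the two pieces of position information to assumed (x, y, z) coordinates and compares their Euclidean distance against the preset difference, while the exposure data mentioned above is omitted:

```python
# Illustrative sketch of the position-change test; the two position readings are
# reduced to assumed (x, y, z) coordinates and compared against a preset difference.
import math

def position_changed(prev: tuple[float, float, float],
                     curr: tuple[float, float, float],
                     preset_diff: float = 0.05) -> bool:
    """Changed when the difference is not less than the preset difference."""
    return math.dist(prev, curr) >= preset_diff

print(position_changed((0.0, 0.0, 1.2), (0.01, 0.0, 1.2)))  # False: below the threshold
print(position_changed((0.0, 0.0, 1.2), (0.20, 0.1, 1.1)))  # True: position has changed
```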
Referring to fig. 2, fig. 2 is another schematic flow chart of an image display method according to an embodiment of the present disclosure, where the image display method is applied to a mobile terminal, and the mobile terminal may be any intelligent electronic device with a mobile communication function, such as a smart phone, a tablet computer, a notebook computer, and the like. The specific flow of the image display method provided by this embodiment may be as follows:
S201, image scanning is carried out on the photographing visual field area of the mobile terminal to obtain a scanned image.
In this embodiment, this step is the basis of all subsequent operations, and is intended to store the picture (i.e. the scanned image) captured by the lens of the terminal device into the processor of the terminal device and display the picture in the preview interface of the terminal device. The picture captured by the lens of the terminal equipment comprises an object which a terminal user wants to shoot and other objects.
It is easy to understand that, when the preview interface of the terminal displays the picture captured by the terminal lens, the size of the preview interface can be set according to a preset display scale. The display scale is used for indicating the size relation between the preview interface and the terminal display screen. For example, if the display ratio is 70%, it indicates that the size of the preview interface is 70% of the size of the display screen.
S202, determining all objects in the scanned image, and displaying the scanned image on the current preview interface.
In this embodiment, after the terminal determines all objects in the scanned image of the terminal through the sensing element inside the terminal, a corresponding mark may be added to each object on the scanned image for the user to select in the subsequent operation, and the scanned image including the mark is displayed on the current preview interface.
It will be readily appreciated that the manner in which the respective indicia is added to each object in the scanned image includes, but is not limited to: a selection box is generated on each object and/or a marker point is generated on each object.
S203, acquiring the selection operation information of the user on the preview interface.
In this embodiment, the user may trigger the electronic device to generate the selection operation information by clicking a mark corresponding to the object displayed on the preview interface, or by specifying a gesture, voice, or the like.
S204, determining at least one shot object from all the objects according to the selection operation information.
In this embodiment, if the user triggers the electronic device to generate selection operation information by clicking a mark corresponding to an object displayed on the preview interface, the processor determines that the object corresponding to the mark selected by the user is a shot object; if the user triggers the electronic equipment to generate selection operation information through the designated gesture, the processor determines an object contained in a range covered by the designated gesture as a shot object; if the user triggers the electronic equipment to generate the selection operation information through voice, the processor determines the shot object according to the information contained in the voice, wherein the information contained in the voice indicates the shot object selected by the user.
S205, acquiring continuous frame scanning images within a preset time length.
In this embodiment, the preset duration may be, for example, 1 second, and is usually set with both the power consumption and the imaging quality of the terminal in mind. It is easy to understand that the factors determining the power consumed during terminal imaging include the performance of the digital signal processing chip inside the terminal: the better the chip performs, the less power imaging consumes. The factors determining imaging quality include the performance of the image sensor inside the terminal: the better the sensor performs, the higher the imaging quality.
S206, determining the definition of each shot object in the continuous frame scanning images.
In this embodiment, the definition of each photographed object is related to factors such as the sharpness of the lens in the consecutive frames, the heat generated by the sensor and the light sensitivity of the sensor. The processor obtains an influence score for each of these factors in each frame and sums them, directly or with weights, to obtain a definition score that reflects the definition of each photographed object.
S207, extracting the corresponding image with the highest definition of each shot object from the continuous frame scanning images to serve as a local image of the corresponding shot object.
In this embodiment, after the processor obtains the definition score of each photographed object in each frame, the corresponding image with the highest definition score is selected as the local image of that photographed object, and the remaining images are deleted from the buffer area storing the images.
S208, in the process of image scanning, when the position of the mobile terminal is detected to be changed, a current scanning image is obtained and is used as a target scanning image.
S209, when the object to be shot is contained in the target scanning image, determining the position of the object to be shot on the target scanning image and taking the position as a target position.
In this embodiment, when the position of the mobile terminal changes, the photographed object may no longer be in the current photographing visual field area at all, or only a part of it may remain in the current photographing visual field area. If the photographed object is not in the current photographing visual field area at all, the target scanning image is considered not to contain the photographed object; if only a part of it is in the current photographing visual field area, the target scanning image is considered to contain the photographed object.
S210, generating a composite image containing the shot object at the target position according to the local image, and displaying the composite image on the current preview interface.
In this embodiment, when only a part of the object is in the current photographing field of view, the processor determines the part of the object in the current photographing field of view as the part to be displayed, extracts the corresponding image from the local image according to the part to be displayed, generates a composite image including the part to be displayed at the target position, and displays the composite image on the current preview interface.
Further, the step S208 further includes:
acquiring two pieces of position information of the mobile terminal at any adjacent time;
when the difference value between the two pieces of position information is smaller than a preset difference value, determining that the position of the mobile terminal is not changed;
and when the difference value between the two pieces of position information is not less than the preset difference value, determining that the position of the mobile terminal is changed.
In this embodiment, the position information may be obtained by a sensing element inside the terminal, and the information includes the distance from the terminal to the center of each object to be photographed and the exposure data of the terminal to each part of each object to be photographed.
Further, a detailed description is given by taking as an example the case in which all the objects are object A, object B and object C, the photographed object selected by the user is object A, and the mark added by the terminal to each object is a mark frame on the scanned image.
For example, referring to fig. 3, fig. 3 is a schematic view of an application scenario of an image display method provided in the embodiment of the present application. The mobile terminal scans its photographing visual field area to obtain a scanned image, determines that the scanned image contains object A, object B and object C, and adds a selection-frame mark to each object for the user to select in subsequent operations. The user selects object A as the photographed object on the preview interface; the terminal obtains the consecutive-frame scanned image 1, scanned image 2 and scanned image 3 within the preset duration, in which the definition of object A is x, y and z respectively (x being the largest), and the terminal processor extracts the image of object A from scanned image 1 as the local image of object A. As the terminal continues to scan images, when a change in the position of the terminal is detected, the terminal acquires the current scanned image, determines the position of object A on the current scanned image as the target position, generates a composite image containing object A at the target position according to the local image of object A, and displays the composite image on the current preview interface.
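The fig. 3 scenario can be summarised in a short schematic walk-through, under the same illustrative assumptions as the earlier sketches (object names, definition values and the target position are made up for the example):

```python
# Schematic walk-through of the fig. 3 scenario; all names and values are illustrative.
def run_example() -> str:
    detected = ["A", "B", "C"]                    # objects found in the scanned image
    selected = "A"                                # the user taps the mark of object A
    assert selected in detected
    definition = {"scan_1": 0.9, "scan_2": 0.7, "scan_3": 0.6}  # x > y > z in the text
    local_image_source = max(definition, key=definition.get)   # -> "scan_1"
    position_changed = True                       # detected while scanning continues
    if not position_changed:
        return "preview keeps showing the live scanned image"
    target_position = (120, 80)                   # where object A sits on the current scan
    return (f"composite of object {selected} built from {local_image_source} "
            f"is shown at {target_position} on the current preview interface")

print(run_example())
```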
Therefore, different from the prior art, the application provides an image display method, an image display device, a storage medium and a terminal device. The image display method determines a photographed object from a scanned image of the photographing visual field area of a mobile terminal and extracts a local image of the photographed object from the scanned image; when a change in the position of the mobile terminal is detected, a composite image containing the photographed object is displayed on the current preview interface according to the local image. In this way, the mobile terminal whose position has changed can show a clear and stable image of the photographed object on the current preview interface without re-locating the photographed object, which reduces the power consumption of the mobile terminal.
Referring to fig. 4, fig. 4 is a schematic structural diagram of an image display device according to an embodiment of the present disclosure, which is applied to a mobile terminal, where the mobile terminal may be any intelligent electronic device with a mobile communication function, such as a smart phone, a tablet computer, a notebook computer, and the like. The image display device provided by the embodiment may include: scanning module 10, extraction module 20 and display module 30, wherein:
(1) scanning module 10
The scanning module 10 is configured to perform image scanning on a photographing view area of the mobile terminal to obtain a scanned image.
In this embodiment, this step is the basis of all subsequent operations, and is intended to store the picture (i.e. the scanned image) captured by the lens of the terminal device into the processor of the terminal device and display the picture in the preview interface of the terminal device. The picture captured by the lens of the terminal equipment comprises an object which a terminal user wants to shoot and other objects.
It is easy to understand that, when the preview interface of the terminal displays the picture captured by the terminal lens, the size of the preview interface can be set according to a preset display scale. The display scale is used for indicating the size relation between the preview interface and the terminal display screen. For example, if the display ratio is 70%, it indicates that the size of the preview interface is 70% of the size of the display screen.
(2) Extraction module 20
And an extracting module 20, configured to determine at least one object from the scanned image, and extract a partial image of each object from the scanned image.
(3) Display module 30
And the display module 30 is configured to display a composite image including the object to be photographed on the current preview interface according to the local image when detecting that the position of the mobile terminal changes during the image scanning process.
Further, referring to fig. 5, fig. 5 is another schematic structural diagram of the image display device according to the embodiment of the present application, and the extraction module 20 specifically includes:
the display unit 21 is used for determining all objects in the scanned image and displaying the scanned image on the current preview interface;
in this embodiment, after the terminal determines all objects in the scanned image of the terminal through the sensing element inside the terminal, a corresponding mark may be added to each object on the scanned image for the user to select in the subsequent operation, and the scanned image including the mark is displayed on the current preview interface.
It will be readily appreciated that the manner in which the respective indicia is added to each object in the scanned image includes, but is not limited to: a selection box is generated on each object and/or a marker point is generated on each object.
An acquisition unit 22, configured to acquire selection operation information of a user on a preview interface;
in this embodiment, the user may trigger the electronic device to generate the selection operation information by clicking a mark corresponding to the object displayed on the preview interface, or by specifying a gesture, voice, or the like.
A determination unit 23 for determining at least one object to be photographed from all the objects according to the selection operation information.
In this embodiment, if the user triggers the electronic device to generate selection operation information by clicking a mark corresponding to an object displayed on the preview interface, the processor determines that the object corresponding to the mark selected by the user is a shot object; if the user triggers the electronic equipment to generate selection operation information through the designated gesture, the processor determines an object contained in a range covered by the designated gesture as a shot object; if the user triggers the electronic equipment to generate the selection operation information through voice, the processor determines the shot object according to the information contained in the voice, wherein the information contained in the voice indicates the shot object selected by the user.
Further, referring to fig. 5, the display module 30 may specifically include:
an obtaining unit 31, configured to obtain two pieces of location information of the mobile terminal at any adjacent time;
a first determining unit 32 for determining that the position of the mobile terminal has not changed when a difference between the two position information is less than a preset difference;
a second determining unit 33, configured to determine that the position of the mobile terminal changes when the difference between the two pieces of position information is not less than a preset difference.
In this embodiment, the position information may be obtained by a sensing element inside the terminal, and the information includes the distance from the terminal to the center of each object to be photographed and the exposure data of the terminal to each part of each object to be photographed.
Further, the image display apparatus may further include a generation unit operable to:
acquiring a current scanning image and taking the current scanning image as a target scanning image;
when the target scanning image contains the shot object, determining the position of the shot object on the target scanning image as a target position;
and generating a composite image containing the shot object at the target position according to the local image, and displaying the composite image on the current preview interface.
In this embodiment, when the position of the mobile terminal changes, the subject may not be in the current photographing field of view at all, or only a part of the subject may be in the current photographing field of view. When the shot object is possibly not in the current shooting visual field area at all, the target scanning image is considered not to contain the shot object; when only a part of the object is in the current photographing field of view, the object is considered to be included in the target scanning image. When only one part of the shot object is in the current shooting visual field area, the processor firstly determines the part of the shot object in the current shooting visual field area as a part to be displayed, then extracts a corresponding image from the local image according to the part to be displayed, then generates a composite image containing the part to be displayed at a target position, and displays the composite image on a current preview interface.
Further, the image display apparatus may further include an extraction unit operable to:
acquiring continuous frame scanning images within a preset time length;
determining the definition of each shot object in the continuous frame scanning images;
and extracting the corresponding image with the highest definition of each object from the continuous frame scanning images as a local image of the corresponding object.
In this embodiment, the preset duration may be, for example, 1 second, and is usually set with both the power consumption and the imaging quality of the terminal in mind. It is easy to understand that the factors determining the power consumed during terminal imaging include the performance of the digital signal processing chip inside the terminal: the better the chip performs, the less power imaging consumes. The factors determining imaging quality include the performance of the image sensor inside the terminal: the better the sensor performs, the higher the imaging quality. The definition of each photographed object is related to factors such as the sharpness of the lens in the consecutive frames, the heat generated by the sensor and the light sensitivity of the sensor; the processor obtains an influence score for each of these factors in each frame and sums them, directly or with weights, to obtain a definition score that reflects the definition of each photographed object. After the processor has obtained the definition score of each photographed object in every frame, it selects the corresponding image with the highest definition score as the local image of that photographed object and deletes the remaining images from the buffer area in which the images are stored.
In a specific implementation, the above units may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and the specific implementation of the above units may refer to the foregoing method embodiments, which are not described herein again.
It can be seen from the foregoing that, unlike the prior art, the present application provides an image display method, an image display device, a storage medium and a terminal device. The scanning module 10 scans the photographing visual field area of the mobile terminal to obtain a scanned image, the extraction module 20 determines a photographed object from the scanned image and extracts a local image of the photographed object, and when a change in the position of the mobile terminal is detected, the display module 30 displays a composite image containing the photographed object on the current preview interface according to the local image. In this way, the mobile terminal whose position has changed can show a clear and stable image of the photographed object on the current preview interface without re-locating the photographed object, which reduces the power consumption of the mobile terminal.
In addition, the embodiment of the application further provides a terminal device, and the terminal device can be a smart phone, a tablet computer and other devices. As shown in fig. 6, the terminal device 200 includes a processor 201 and a memory 202. The processor 201 is electrically connected to the memory 202.
The processor 201 is a control center of the terminal device 200, connects various parts of the entire terminal device by using various interfaces and lines, and performs various functions of the terminal device and processes data by running or loading an application program stored in the memory 202 and calling data stored in the memory 202, thereby performing overall monitoring of the terminal device.
In this embodiment, the terminal device 200 is provided with a plurality of memory partitions, including a system partition and a target partition. The processor 201 in the terminal device 200 loads instructions corresponding to the processes of one or more application programs into the memory 202 according to the following steps, and runs the application programs stored in the memory 202 so as to implement various functions:
carrying out image scanning on a photographing visual field area of the mobile terminal to obtain a scanned image;
determining at least one shot object from the scanned image, and extracting a partial image of each shot object from the scanned image;
and in the process of image scanning, when the position of the mobile terminal is detected to be changed, displaying a composite image containing the shot object on the current preview interface according to the local image.
Fig. 7 is a block diagram showing a specific structure of a terminal device according to an embodiment of the present invention, where the terminal device may be used to implement the image display method provided in the above-described embodiment. The terminal device 300 may be a smart phone or a tablet computer.
The RF circuit 310 is used for receiving and transmitting electromagnetic waves and performing interconversion between electromagnetic waves and electrical signals, thereby communicating with a communication network or other devices. The RF circuitry 310 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The RF circuit 310 may communicate with various networks such as the internet, an intranet or a wireless network, or with other devices over a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network or a metropolitan area network. The wireless network may use various communication standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), Wideband Code Division Multiple Access (WCDMA), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), Voice over Internet Protocol (VoIP), Worldwide Interoperability for Microwave Access (WiMAX), other short-message protocols, as well as any other suitable communication protocol, even including protocols that have not yet been developed.
The memory 320 may be configured to store software programs and modules, such as the program instructions/modules corresponding to the image display method and apparatus in the foregoing embodiments, and the processor 380 executes various functional applications and data processing by running the software programs and modules stored in the memory 320, so as to implement the image display function described above. The memory 320 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 320 may further include memory located remotely from the processor 380, which may be connected to the terminal device 300 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input unit 330 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, the input unit 330 may include a touch-sensitive surface 331 as well as other input devices 332. The touch-sensitive surface 331, also referred to as a touch screen or touch pad, may collect touch operations by a user on or near the touch-sensitive surface 331 (e.g., operations by a user on or near the touch-sensitive surface 331 using a finger, a stylus, or any other suitable object or attachment), and drive the corresponding connection device according to a predetermined program. Optionally, the touch-sensitive surface 331 may comprise two parts, a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 380, and can receive and execute commands sent by the processor 380. In addition, the touch-sensitive surface 331 may be implemented using various types such as resistive, capacitive, infrared and surface acoustic wave types. The input unit 330 may comprise other input devices 332 in addition to the touch-sensitive surface 331. In particular, the other input devices 332 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
The display unit 340 may be used to display information input by or provided to the user and various graphic user interfaces of the terminal apparatus 300, which may be configured by graphics, text, icons, video, and any combination thereof. The Display unit 340 may include a Display panel 341, and optionally, the Display panel 341 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, touch-sensitive surface 331 may overlay display panel 341, and when touch-sensitive surface 331 detects a touch operation thereon or thereabout, communicate to processor 380 to determine the type of touch event, and processor 380 then provides a corresponding visual output on display panel 341 in accordance with the type of touch event. Although in FIG. 7, touch-sensitive surface 331 and display panel 341 are implemented as two separate components for input and output functions, in some embodiments, touch-sensitive surface 331 and display panel 341 may be integrated for input and output functions.
The terminal device 300 may also include at least one sensor 350, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel 341 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 341 and/or the backlight when the terminal device 300 is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when the mobile phone is stationary, and can be used for applications of recognizing the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the terminal device 300, detailed descriptions thereof are omitted.
The terminal device 300 may assist the user in e-mail, web browsing, streaming media access, etc. through the transmission module 370 (e.g., a Wi-Fi module), which provides the user with wireless broadband internet access. Although fig. 7 shows the transmission module 370, it is understood that it does not belong to the essential constitution of the terminal device 300, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 380 is a control center of the terminal device 300, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the terminal device 300 and processes data by running or executing software programs and/or modules stored in the memory 320 and calling data stored in the memory 320, thereby performing overall monitoring of the mobile phone. Optionally, processor 380 may include one or more processing cores; in some embodiments, processor 380 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 380.
Terminal device 300 also includes a power supply 390 (e.g., a battery) for powering the various components, which may be logically coupled to processor 380 via a power management system in some embodiments to manage charging, discharging, and power consumption management functions via the power management system. The power supply 390 may also include any component including one or more of a dc or ac power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown, the terminal device 300 may further include a camera (e.g., a front camera, a rear camera), a bluetooth module, and the like, which are not described in detail herein. Specifically, in this embodiment, the display unit of the terminal device is a touch screen display, the terminal device further includes a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for:
carrying out image scanning on a photographing visual field area of the mobile terminal to obtain a scanned image;
determining at least one shot object from the scanned image, and extracting a partial image of each shot object from the scanned image;
and in the process of image scanning, when the position of the mobile terminal is detected to be changed, displaying a composite image containing the shot object on the current preview interface according to the local image.
In specific implementation, the above modules may be implemented as independent entities, or may be combined arbitrarily to be implemented as the same or several entities, and specific implementation of the above modules may refer to the foregoing method embodiments, which are not described herein again.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor. To this end, embodiments of the present invention provide a storage medium, in which a plurality of instructions are stored, and the instructions can be loaded by a processor to execute steps in any one of the image display methods provided by the embodiments of the present invention.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any image display method provided in the embodiment of the present invention, the beneficial effects that can be achieved by any image display method provided in the embodiment of the present invention can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
In addition to the above embodiments, other embodiments are also possible. All technical solutions formed by using equivalents or equivalent substitutions fall within the protection scope of the claims of the present application.
In summary, although the present application has been described with reference to the preferred embodiments, the above-described preferred embodiments are not intended to limit the present application, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present application, so that the scope of the present application shall be determined by the appended claims.
Claims (10)
1. An image display method, characterized in that the image display method comprises:
carrying out image scanning on a photographing visual field area of the mobile terminal to obtain a scanned image;
determining at least one shot object from the scanning image, and extracting a partial image of each shot object from the scanning image;
and in the image scanning process, when the position of the mobile terminal is detected to be changed, displaying a composite image containing the shot object on a current preview interface according to the local image.
2. The image display method according to claim 1, wherein the step of determining at least one object to be photographed from the scanned image specifically comprises:
determining all objects in the scanned image, and displaying the scanned image on a current preview interface;
acquiring selection operation information of a user on the preview interface;
and determining at least one shot object from all the objects according to the selection operation information.
3. The image display method according to claim 1, wherein, in the process of the image scanning, the method further comprises:
acquiring two pieces of position information of the mobile terminal at any adjacent time;
when the difference value between the two pieces of position information is smaller than a preset difference value, determining that the position of the mobile terminal is not changed;
and when the difference value between the two pieces of position information is not less than a preset difference value, determining that the position of the mobile terminal is changed.
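A minimal sketch of the position-change test of claim 3, assuming the position information can be compared with a Euclidean distance and an arbitrary preset difference (both assumptions for illustration):

```python
# Sketch: decide whether the mobile terminal's position changed between two adjacent times.
import numpy as np


def position_changed(pos_prev, pos_curr, preset_difference: float = 5.0) -> bool:
    """pos_prev / pos_curr: position information at two adjacent times (e.g. x, y[, z])."""
    difference = np.linalg.norm(np.asarray(pos_curr, dtype=float) -
                                np.asarray(pos_prev, dtype=float))
    # Smaller than the preset difference -> unchanged; otherwise -> changed.
    return difference >= preset_difference
```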
4. The image display method according to claim 1, wherein the step of displaying the composite image containing the shot object on the current preview interface according to the partial image specifically includes:
acquiring a current scanned image and taking the current scanned image as a target scanned image;
when the shot object is contained in the target scanned image, determining the position of the shot object in the target scanned image as a target position;
and generating, according to the partial image, a composite image containing the shot object at the target position, and displaying the composite image on a current preview interface.
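An illustrative sketch of claim 4, assuming the target position is the top-left corner of the shot object's region and that the partial image is pasted directly (blending or warping would be refinements not shown here):

```python
# Sketch: paste the stored partial image of a shot object at its target position
# in the current (target) scanned image, producing the composite shown in the preview.
import numpy as np


def compose_at_target(target_scan: np.ndarray, partial: np.ndarray,
                      target_xy: tuple) -> np.ndarray:
    x, y = target_xy
    composite = target_scan.copy()
    h = min(partial.shape[0], composite.shape[0] - y)
    w = min(partial.shape[1], composite.shape[1] - x)
    if h > 0 and w > 0:
        composite[y:y + h, x:x + w] = partial[:h, :w]  # clip to stay inside the frame
    return composite
```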
5. The image display method according to claim 1, wherein the step of extracting the partial image of each shot object from the scanned image specifically includes:
acquiring consecutive frames of scanned images within a preset time length;
determining the sharpness of each shot object in the consecutive frames of scanned images;
and extracting, from the consecutive frames of scanned images, the image in which each shot object has the highest sharpness as the partial image of the corresponding shot object.
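For claim 5, the sketch below uses the variance of the Laplacian as one possible sharpness measure (an assumption, not the claimed metric) and keeps, for each shot object, the crop from the frame in which it appears sharpest.

```python
# Sketch: over consecutive frames, keep for each shot object the crop with the
# highest sharpness, estimated via the variance of the Laplacian.
from typing import Dict, List, Tuple

import cv2
import numpy as np


def sharpness(crop_bgr: np.ndarray) -> float:
    gray = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2GRAY)
    return float(cv2.Laplacian(gray, cv2.CV_64F).var())


def best_partial_images(frames: List[np.ndarray],
                        boxes: Dict[int, Tuple[int, int, int, int]]) -> Dict[int, np.ndarray]:
    best: Dict[int, np.ndarray] = {}
    best_score: Dict[int, float] = {}
    for frame in frames:                      # consecutive frames within the preset duration
        for obj_id, (x, y, w, h) in boxes.items():
            crop = frame[y:y + h, x:x + w]
            if crop.size == 0:
                continue
            score = sharpness(crop)
            if score > best_score.get(obj_id, -1.0):
                best_score[obj_id] = score
                best[obj_id] = crop.copy()
    return best
```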
6. An image display device characterized by comprising:
a scanning module, configured to perform image scanning on a photographing visual field area of the mobile terminal to obtain a scanned image;
an extraction module, configured to determine at least one shot object from the scanned image and extract a partial image of each shot object from the scanned image;
and a display module, configured to, during the image scanning, display a composite image containing the shot object on a current preview interface according to the partial image when a change in the position of the mobile terminal is detected.
7. The image display device according to claim 6, wherein the extraction module specifically comprises:
a display unit, configured to determine all objects in the scanned image and display the scanned image on a current preview interface;
an acquisition unit, configured to acquire selection operation information of a user on the preview interface;
and a determining unit, configured to determine at least one shot object from all the objects according to the selection operation information.
8. The image display device according to claim 6, wherein the display module specifically comprises:
the mobile terminal comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring two pieces of position information of the mobile terminal at any adjacent time;
a first determining unit, configured to determine that the position of the mobile terminal has not changed when a difference between the two pieces of position information is smaller than a preset difference;
and the second determining unit is used for determining that the position of the mobile terminal changes when the difference value between the two pieces of position information is not less than a preset difference value.
9. A computer-readable storage medium having stored thereon a plurality of instructions adapted to be loaded by a processor to perform the image display method of any of claims 1 to 5.
10. A terminal device, comprising a processor and a memory, wherein the processor is electrically connected to the memory, the memory is used for storing instructions and data, and the processor is used for executing the steps of the image display method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911265384.9A CN110995996A (en) | 2019-12-11 | 2019-12-11 | Image display method and device, storage medium and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110995996A true CN110995996A (en) | 2020-04-10 |
Family
ID=70092265
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911265384.9A Pending CN110995996A (en) | 2019-12-11 | 2019-12-11 | Image display method and device, storage medium and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110995996A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103595909A (en) * | 2012-08-16 | 2014-02-19 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
CN106101545A (en) * | 2016-06-30 | 2016-11-09 | Vivo Mobile Communication Co., Ltd. | Image processing method and mobile terminal |
CN107507159A (en) * | 2017-08-10 | 2017-12-22 | Zhuhai Meizu Technology Co., Ltd. | Image processing method and device, computer apparatus and readable storage medium |
CN109413334A (en) * | 2018-12-13 | 2019-03-01 | Zhejiang Sunny Optical Co., Ltd. | Image pickup method and shooting apparatus |
US20190158745A1 (en) * | 2017-11-20 | 2019-05-23 | Canon Kabushiki Kaisha | Imaging apparatus and control method thereof |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication |
 | SE01 | Entry into force of request for substantive examination |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20200410