CN105427247B - Mobile terminal and image processing method for image processing - Google Patents
Mobile terminal and image processing method for image processing
- Publication number: CN105427247B (application CN201510843545.3A)
- Authority: CN (China)
- Prior art keywords: image, pixel, row, original image, pixel value
- Legal status: Active (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
Abstract
The invention discloses a mobile terminal for image processing and an image processing method. The method includes: an input unit inputs an original image to be processed; a controller determines a first processed image corresponding to the original image; in a first direction, the original image and the corresponding first processed image are interpolated to obtain an enlarged image of the original image in the first direction; a second processed image corresponding to that enlarged image is determined; in a second direction, the enlarged image in the first direction and the corresponding second processed image are interpolated to obtain the final enlarged image of the original image; the first direction is perpendicular to the second direction. Because the disclosed method and device are based on the principle of the inverse wavelet transform, they improve the reproduction of the detail information of the original image.
Description
Technical Field
The present invention relates to the field of image processing, and in particular, to a mobile terminal and an image processing method for image processing.
Background
In image processing, scaling is a common operation and is required in many applications; image enlargement, for example, is achieved by interpolation. In brief, interpolation estimates new pixel values from the values of neighboring known pixels. The most common scaling algorithm today is bicubic interpolation: each pixel of the output image is the result of an operation on 16 pixels (a 4 × 4 region) of the original image. In other words, bicubic interpolation obtains each output pixel by weighting the 16 pixels in the surrounding 4 × 4 region. However, bicubic interpolation seriously degrades image edges, so the reproduction of the original image's detail information in the interpolated, enlarged image needs to be improved.
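For orientation, the bicubic baseline just described can be reproduced with an off-the-shelf library call. The sketch below is illustrative only; the placeholder image and the 2× factor are assumptions, not part of the invention:

```python
# Illustrative only: 2x bicubic enlargement with OpenCV, the conventional
# baseline discussed above. The placeholder image is an assumption.
import cv2
import numpy as np

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
# Each output pixel is a weighted combination of a 4x4 neighborhood of input pixels.
enlarged = cv2.resize(img, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
```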
Disclosure of Invention
In order to solve the above technical problem, the present invention provides a mobile terminal for image processing and an image processing method, which, based on the principle of the inverse wavelet transform, improve the reproduction of the detail information of the original image.
In order to achieve the object of the present invention, the present invention provides a mobile terminal for image processing, comprising an input unit and a controller. The input unit is used for inputting an original image to be processed. The controller is used for determining a first processed image corresponding to the original image; interpolating, in a first direction, the original image and the corresponding first processed image to obtain an enlarged image of the original image in the first direction; determining a second processed image corresponding to the enlarged image of the original image in the first direction; and interpolating, in a second direction, the enlarged image of the original image in the first direction and the corresponding second processed image to obtain the final enlarged image of the original image; wherein the first direction is perpendicular to the second direction.
Further, the first direction is a row direction of the pixel points, and the second direction is a column direction of the pixel points; or, the first direction is a column direction of the pixel points, and the second direction is a row direction of the pixel points.
Further, the controller determines the first processed image corresponding to the original image as follows:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the original image and the pixel values of its left and right neighbors, that the pixel value of the pixel point located in the h-th row and w-th column of the corresponding first processed image is:
P1_high(h,w)=(2×P1(h,w)-P1(h,w-1)-P1(h,w+1))/4,
wherein P1_high(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, P1(h, w) represents the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the original image, P1(h, w-1) represents the pixel value of that pixel point's left neighbor in the original image, P1(h, w+1) represents the pixel value of its right neighbor in the original image, and h and w are both integers greater than or equal to 0;
and the controller traverses each pixel point in the original image in a loop to obtain the pixel value of each corresponding pixel point in the first processed image, thereby determining the first processed image corresponding to the original image.
Further, the controller interpolates the original image and the corresponding first processed image in the first direction to obtain the enlarged image of the original image in the first direction as follows:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and (2×w-1)-th column of the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and that the pixel value of the pixel point located in the h-th row and (2×w)-th column of the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
the controller traverses each pixel point in the original image and the corresponding first processed image in a loop to obtain the pixel value of each pixel point in the enlarged image in the first direction, thereby determining the enlarged image of the original image in the first direction;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and w-th column of the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and that the pixel value of the pixel point located in the (2×h)-th row and w-th column of the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the original image and the corresponding first processed image in a loop to obtain the pixel value of each pixel point in the enlarged image in the first direction, thereby determining the enlarged image of the original image in the first direction.
Further, the controller determines the second processed image corresponding to the enlarged image of the original image in the first direction as follows:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the enlarged image of the original image in the first direction and the pixel values of its left and right neighbors, that the pixel value of the pixel point located in the h-th row and w-th column of the corresponding second processed image is:
P2_high(h,w)=(2×P1_out(h,w)-P1_out(h,w-1)-P1_out(h,w+1))/4,
wherein P2_high(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the second processed image corresponding to the enlarged image of the original image in the first direction, P1_out(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the enlarged image of the original image in the first direction, P1_out(h, w-1) represents the pixel value of that pixel point's left neighbor in the enlarged image, P1_out(h, w+1) represents the pixel value of its right neighbor in the enlarged image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the enlarged image of the original image in the first direction in a loop to obtain the pixel value of each corresponding pixel point in the second processed image, thereby determining the second processed image corresponding to the enlarged image of the original image in the first direction.
Further, the controller interpolates the enlarged image of the original image in the first direction and the corresponding second processed image in the second direction to obtain the final enlarged image of the original image as follows:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and w-th column of the final enlarged image is equal to the sum of a third pixel value and a fourth pixel value, and that the pixel value of the pixel point located in the (2×h)-th row and w-th column of the final enlarged image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the enlarged image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the second processed image corresponding to that enlarged image, and h and w are integers greater than or equal to 0;
the controller traverses each pixel point in the enlarged image in the first direction and the corresponding second processed image in a loop to obtain the pixel value of each pixel point in the final enlarged image, thereby determining the final enlarged image of the original image;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and (2×w-1)-th column of the final enlarged image is equal to the sum of a third pixel value and a fourth pixel value, and that the pixel value of the pixel point located in the h-th row and (2×w)-th column of the final enlarged image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the enlarged image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the second processed image corresponding to that enlarged image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the enlarged image in the first direction and the corresponding second processed image in a loop to obtain the pixel value of each pixel point in the final enlarged image, thereby determining the final enlarged image of the original image.
The invention also provides an image processing method, which comprises the following steps: an input unit inputs an original image to be processed; a controller determines a first processed image corresponding to the original image; in a first direction, the original image and the corresponding first processed image are interpolated to obtain an enlarged image of the original image in the first direction; the controller determines a second processed image corresponding to the enlarged image of the original image in the first direction; in a second direction, the enlarged image of the original image in the first direction and the corresponding second processed image are interpolated to obtain the final enlarged image of the original image; wherein the first direction is perpendicular to the second direction.
Further, the first direction is a row direction of the pixel points, and the second direction is a column direction of the pixel points; or, the first direction is a column direction of the pixel points, and the second direction is a row direction of the pixel points.
Further, the controller determines the first processed image corresponding to the original image, including:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the original image and the pixel values of its left and right neighbors, that the pixel value of the pixel point located in the h-th row and w-th column of the corresponding first processed image is:
P1_high(h,w)=(2×P1(h,w)-P1(h,w-1)-P1(h,w+1))/4,
wherein P1_high(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, P1(h, w) represents the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the original image, P1(h, w-1) represents the pixel value of that pixel point's left neighbor in the original image, P1(h, w+1) represents the pixel value of its right neighbor in the original image, and h and w are both integers greater than or equal to 0;
and the controller traverses each pixel point in the original image in a loop to obtain the pixel value of each corresponding pixel point in the first processed image, thereby determining the first processed image corresponding to the original image.
Further, the controller interpolates the original image and the corresponding first processed image in the first direction to obtain the enlarged image of the original image in the first direction, including:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and (2×w-1)-th column of the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and that the pixel value of the pixel point located in the h-th row and (2×w)-th column of the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
the controller traverses each pixel point in the original image and the corresponding first processed image in a loop to obtain the pixel value of each pixel point in the enlarged image in the first direction, thereby determining the enlarged image of the original image in the first direction;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and w-th column of the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and that the pixel value of the pixel point located in the (2×h)-th row and w-th column of the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the original image and the corresponding first processed image in a loop to obtain the pixel value of each pixel point in the enlarged image in the first direction, thereby determining the enlarged image of the original image in the first direction.
Compared with the prior art, in the present invention the input unit inputs the original image to be processed; the controller determines a first processed image corresponding to the original image; in a first direction, the original image and the corresponding first processed image are interpolated to obtain an enlarged image of the original image in the first direction; then a second processed image corresponding to that enlarged image is determined; in a second direction, the enlarged image in the first direction and the corresponding second processed image are interpolated to obtain the final enlarged image of the original image; wherein the first direction is perpendicular to the second direction. Because the interpolation and enlargement are performed according to the principle of the inverse wavelet transform, the enlarged image better preserves the detail information of the original image.
Drawings
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention without limiting it.
Fig. 1 is a schematic diagram of a hardware structure of a mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a one-level wavelet decomposition;
FIG. 5 is a flowchart of an image processing method according to a first embodiment of the present invention;
fig. 6 is a flowchart of an image processing method according to a second embodiment of the present invention;
fig. 7 is a flowchart of an image processing method according to a third embodiment of the present invention;
fig. 8 is a schematic diagram of a mobile terminal for image processing according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a mobile terminal for image processing according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a navigation device, and the like, as well as stationary terminals such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, those skilled in the art will understand that, except for elements used specifically for mobile purposes, the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals.
Fig. 1 is a schematic diagram of the hardware configuration of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signals broadcast by various types of broadcasting systems. In particular, it may receive digital broadcasts using digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the forward link media system (MediaFLO®), Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals, as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access for the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), WiBro (wireless broadband), WiMAX (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided depending on the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user of it. By providing such a tactile output, the user can recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, mobile terminals have been described in terms of their functionality. Hereinafter, for the sake of brevity, a slide-type mobile terminal will be described as an example among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals. However, the present invention is applicable to any type of mobile terminal and is not limited to slide-type mobile terminals.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector covered by an omni-directional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support a plurality of frequency assignments, with each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terminology. In such a case, the term "base station" may be used to refer collectively to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell". Alternatively, individual sectors of a particular BS 270 may be referred to as cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is generally configured to cooperate with satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Fig. 3 is a flowchart of an image processing method according to an embodiment of the present invention. As shown in fig. 3, the image processing method provided by the present embodiment includes the following steps:
step 301: the input unit inputs an original image to be processed.
Here, the original image is, for example, an RGB image. However, the present invention is not limited thereto. The present embodiment is equally applicable to other types of images (e.g., grayscale images, etc.).
Step 302: the controller determines a first processed image corresponding to the original image; and in the first direction, the original image and the corresponding first processed image are interpolated to obtain an enlarged image of the original image in the first direction.
Herein, the controller determines the first processed image corresponding to the original image as follows:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the original image and the pixel values of its left and right neighbors, that the pixel value of the pixel point located in the h-th row and w-th column of the corresponding first processed image is:
P1_high(h,w)=(2×P1(h,w)-P1(h,w-1)-P1(h,w+1))/4,
wherein P1_high(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, P1(h, w) represents the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the original image, P1(h, w-1) represents the pixel value of that pixel point's left neighbor in the original image, P1(h, w+1) represents the pixel value of its right neighbor in the original image, and h and w are both integers greater than or equal to 0;
and the controller traverses each pixel point in the original image in a loop to obtain the pixel value of each corresponding pixel point in the first processed image, thereby determining the first processed image corresponding to the original image.
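As a concrete illustration, the following Python/NumPy sketch implements the P1_high formula above. The edge handling (replicating the border columns where w-1 or w+1 falls outside the image) is an assumption, since the text leaves the borders unspecified:

```python
import numpy as np

def high_freq_estimate(p1: np.ndarray) -> np.ndarray:
    """First processed (estimated high-frequency) image along the row
    direction: P1_high(h,w) = (2*P1(h,w) - P1(h,w-1) - P1(h,w+1)) / 4."""
    p1 = p1.astype(np.float64)
    left = np.empty_like(p1)
    right = np.empty_like(p1)
    left[:, 1:] = p1[:, :-1]
    left[:, 0] = p1[:, 0]        # border handled by edge replication (assumption)
    right[:, :-1] = p1[:, 1:]
    right[:, -1] = p1[:, -1]     # border handled by edge replication (assumption)
    return (2.0 * p1 - left - right) / 4.0
```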
The controller interpolates the original image and the corresponding first processed image in the first direction to obtain the enlarged image of the original image in the first direction as follows:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and (2×w-1)-th column of the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and that the pixel value of the pixel point located in the h-th row and (2×w)-th column of the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
the controller traverses each pixel point in the original image and the corresponding first processed image in a loop to obtain the pixel value of each pixel point in the enlarged image in the first direction, thereby determining the enlarged image of the original image in the first direction;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and w-th column of the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and that the pixel value of the pixel point located in the (2×h)-th row and w-th column of the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the original image and the corresponding first processed image in a loop to obtain the pixel value of each pixel point in the enlarged image in the first direction, thereby determining the enlarged image of the original image in the first direction.
Step 303: The controller determines a second processed image corresponding to the enlarged image of the original image in the first direction; and, in the second direction, interpolates the enlarged image of the original image in the first direction and the corresponding second processed image to obtain the final enlarged image of the original image.
Herein, the controller determines the second processed image corresponding to the enlarged image of the original image in the first direction as follows:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and w-th column of the enlarged image of the original image in the first direction and the pixel values of its left and right neighbors, that the pixel value of the pixel point located in the h-th row and w-th column of the corresponding second processed image is:
P2_high(h,w)=(2×P1_out(h,w)-P1_out(h,w-1)-P1_out(h,w+1))/4,
wherein P2_high(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the second processed image corresponding to the enlarged image of the original image in the first direction, P1_out(h, w) represents the pixel value of the pixel point located in the h-th row and w-th column of the enlarged image of the original image in the first direction, P1_out(h, w-1) represents the pixel value of that pixel point's left neighbor in the enlarged image, P1_out(h, w+1) represents the pixel value of its right neighbor in the enlarged image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the enlarged image of the original image in the first direction in a loop to obtain the pixel value of each corresponding pixel point in the second processed image, thereby determining the second processed image corresponding to the enlarged image of the original image in the first direction.
Herein, the controller interpolates, in the second direction, the enlarged image of the original image in the first direction and the corresponding second processed image to obtain the final enlarged image of the original image as follows:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and w-th column of the final enlarged image is equal to the sum of a third pixel value and a fourth pixel value, and that the pixel value of the pixel point located in the (2×h)-th row and w-th column of the final enlarged image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the enlarged image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the second processed image corresponding to that enlarged image, and h and w are integers greater than or equal to 0;
the controller traverses each pixel point in the enlarged image in the first direction and the corresponding second processed image in a loop to obtain the pixel value of each pixel point in the final enlarged image, thereby determining the final enlarged image of the original image;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and (2×w-1)-th column of the final enlarged image is equal to the sum of a third pixel value and a fourth pixel value, and that the pixel value of the pixel point located in the h-th row and (2×w)-th column of the final enlarged image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the enlarged image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and w-th column of the second processed image corresponding to that enlarged image, and h and w are integers greater than or equal to 0;
and the controller traverses each pixel point in the enlarged image in the first direction and the corresponding second processed image in a loop to obtain the pixel value of each pixel point in the final enlarged image, thereby determining the final enlarged image of the original image.
Here, the first direction is perpendicular to the second direction. In one embodiment, the first direction is the row direction of the pixels and the second direction is the column direction of the pixels. In another embodiment, the first direction is the column direction of the pixels and the second direction is the row direction of the pixels.
Here, the image interpolation and enlargement provided by the present embodiment is based on the principle of the inverse wavelet transform. Specifically, a one-level wavelet decomposition splits an original image into four sub-images: an LL image, an LH image, an HL image, and an HH image. The LL image carries the low-frequency information of the original image and represents the main body of the image, while the remaining three sub-images carry the high-frequency detail information of the original image and represent its detail parts.
Fig. 4 is a schematic diagram of a one-level wavelet decomposition. As shown in fig. 4, in the one-level decomposition, the original image shown in fig. 4(a) is first decomposed into two sub-images of the same height and half the width (as shown in fig. 4(b)), and these are then decomposed again to obtain the four sub-images shown in fig. 4(c). In fig. 4(c), the image at the upper left corner is the LL image, the image at the lower left corner is the LH image, the image at the upper right corner is the HL image, and the image at the lower right corner is the HH image.
Conversely, the inverse wavelet transform is the process of reconstructing the original image from these sub-images.
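For intuition, the one-level decomposition of fig. 4 can be sketched in the unnormalized sum/difference (Haar-like) convention that the reconstruction in this embodiment implies; standard wavelet libraries normalize their filters differently, so this is illustrative rather than a reference implementation, and even image dimensions are assumed:

```python
def haar_like_decompose(img: np.ndarray):
    """Split an image into LL, LH, HL, HH sub-images of half size using
    pairwise averages (low band) and halved differences (high band). With
    low = (a+b)/2 and high = (a-b)/2, the sum/difference reconstruction of
    this embodiment recovers a = low + high and b = low - high exactly."""
    img = img.astype(np.float64)               # even height and width assumed
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0   # row-direction low band
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0   # row-direction high band
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0     # upper-left sub-image of fig. 4(c)
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0     # lower-left sub-image
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0     # upper-right sub-image
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0     # lower-right sub-image
    return ll, lh, hl, hh
```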
The image interpolation and enlargement process of this embodiment, based on the inverse wavelet transform principle, is as follows: the input image (e.g., the original image, or the enlarged image of the original image in the first direction) is treated as the low-frequency image produced by a one-level wavelet decomposition; the corresponding high-frequency processed image is estimated from that low-frequency image; and the low-frequency image and the estimated high-frequency processed image are then interpolated and enlarged in the corresponding direction to obtain the enlarged image of the input image in that direction.
In addition, in this embodiment, when the pixel value of each pixel of the original image includes at least two components, the controller performs interpolation on each component image of the original image and the first processed image corresponding to each component image in the first direction, respectively, to obtain an enlarged image of each component in the first direction; and the controller respectively interpolates the amplified image of each component in the first direction and the corresponding second processed image in the second direction to obtain a final amplified image of each component, and determines the final amplified image of the original image according to the final amplified image of each component.
In other words, when the pixel values of the pixel points of the original image include a plurality of components, the image on each component is interpolated in the first direction and the second direction in sequence to obtain a corresponding final enlarged image, and then the final enlarged images on the components are combined to determine the final enlarged image of the original image.
For example, taking the original image as a Red-Green-Blue (RGB) image as an example, each pixel value of the RGB image includes an R component, a G component, and a B component. The controller obtains a final enlarged image of the original image by determining the final enlarged image on the R component, the final enlarged image on the G component, and the final enlarged image on the B component, respectively.
Therefore, by the embodiment of the invention, the image interpolation amplification is carried out based on the inverse wavelet transform principle, and the obtained amplified image can better retain the detail information of the original image.
Based on the above mobile terminal hardware structure and communication system, the present invention provides various embodiments of the method.
Example one
Fig. 5 is a flowchart of an image processing method according to an embodiment of the present invention. As shown in fig. 5, the image processing method provided in this embodiment is applied to a mobile terminal (e.g., a mobile phone, a tablet computer, etc.), and includes the following steps:
step 501: the input unit inputs an original image to be processed.
Here, the original image will be described by taking an RGB image as an example. However, the present invention is not limited thereto. The present embodiment is equally applicable to other types of images (e.g., grayscale images, etc.).
Step 502: the controller interpolates and amplifies R, G, and B components of the RGB image in the row direction, respectively.
Here, the G component is taken as an example and described in detail.
First, the controller determines a corresponding first processed image for the image of the G component. Specifically, according to pixel values of pixel points (h, w) located in the h-th row and the w-th column in the image of the G component and pixel points on the left and right sides of the pixel points, it is determined that the pixel values of the pixel points located in the h-th row and the w-th column in the corresponding first processed image are:
G1_high(h,w)=(2×G1(h,w)-G1(h,w-1)-G1(h,w+1))/4,
wherein G1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the G component image, G1(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the G component image, G1(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the G component image, G1(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the G component image, and h and w are integers greater than or equal to 0; then, each pixel point in the G component image is cyclically traversed, thereby obtaining the first processed image corresponding to the G component image.
Then, the controller performs interpolation in the row direction using the image of the G component and the corresponding first processed image to obtain an enlarged image of the G component in the row direction. Specifically, the interpolation is performed according to the following equations:
G1_out(h,2×w-1)=G1(h,w)+G1_high(h,w)
G1_out(h,2×w)=G1(h,w)-G1_high(h,w),
wherein G1_out(h,2×w-1) represents the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the enlarged image of the G component in the row direction, G1_out(h,2×w) represents the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the enlarged image of the G component in the row direction, G1(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the G component image, and G1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the G component image; thereafter, each pixel point in the G component image and the corresponding first processed image is cyclically traversed to obtain the enlarged image of the G component in the row direction.
According to the interpolation formulas, an enlarged image is obtained whose height equals that of the G component image and whose width is twice that of the original G component image. Similarly, the R component image and the B component image are interpolated and enlarged as described in this step to obtain an enlarged image of the R component in the row direction and an enlarged image of the B component in the row direction.
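As a concrete illustration of step 502, the following is a minimal NumPy sketch of the row-direction pass for one component; the function name and the edge-replication handling of the w-1 and w+1 neighbors at the image border are assumptions, since the text does not specify boundary behavior.

```python
import numpy as np

def row_upscale(G1):
    """Row-direction 2x interpolation of one component (sketch of step 502)."""
    G1 = np.asarray(G1, dtype=np.float64)
    # Edge replication for the w-1 / w+1 neighbors (assumed boundary handling).
    padded = np.pad(G1, ((0, 0), (1, 1)), mode='edge')
    # G1_high(h,w) = (2*G1(h,w) - G1(h,w-1) - G1(h,w+1)) / 4
    G1_high = (2 * G1 - padded[:, :-2] - padded[:, 2:]) / 4.0
    h, w = G1.shape
    G1_out = np.empty((h, 2 * w), dtype=np.float64)
    G1_out[:, 0::2] = G1 + G1_high  # 1-based column 2*w-1: sum
    G1_out[:, 1::2] = G1 - G1_high  # 1-based column 2*w:   difference
    return G1_out
```

Each input column thus produces an adjacent sum/difference pair, doubling the width while leaving the height unchanged.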
Step 503: the controller interpolates and amplifies, in the column direction, an enlarged image of the R component in the row direction, an enlarged image of the G component in the row direction, and an enlarged image of the B component in the row direction, respectively.
Here, the G component is still used as an example for explanation.
First, the controller determines a corresponding second processed image for the enlarged image of the G component in the row direction. Specifically, according to pixel values of pixel points (h, w) located in the h-th row and the w-th column in the G-component line-direction enlarged image and pixel points on the left and right sides thereof, it is determined that the pixel values of the pixel points located in the h-th row and the w-th column in the corresponding second processed image are:
G2_high(h,w)=(2×G1_out(h,w)-G1_out(h,w-1)-G1_out(h,w+1))/4,
wherein G2_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the enlarged image of the G component in the row direction, G1_out(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the enlarged image of the G component in the row direction, G1_out(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the enlarged image of the G component in the row direction, G1_out(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the enlarged image of the G component in the row direction, and h and w are integers greater than or equal to 0; then, each pixel point of the enlarged image of the G component in the row direction is cyclically traversed, thereby obtaining the second processed image corresponding to the enlarged image of the G component in the row direction.
Then, the controller performs interpolation in the column direction using the enlarged image of the G component in the row direction and the corresponding second processed image to obtain the enlarged image of the G component in the column direction, namely the final enlarged image of the G component. Specifically, the interpolation is performed according to the following equations:
G2_out(2×h-1,w)=G1_out(h,w)+G2_high(h,w)
G2_out(2×h,w)=G1_out(h,w)-G2_high(h,w),
wherein G2_out(2×h-1,w) represents the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the final enlarged image of the G component, G2_out(2×h,w) represents the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the final enlarged image of the G component, G1_out(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the row-direction enlarged image of the G component, and G2_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the row-direction enlarged image of the G component; thereafter, each pixel point in the row-direction enlarged image of the G component and the corresponding second processed image is cyclically traversed to obtain the enlarged image of the G component in the column direction, i.e., the final enlarged image.
According to the interpolation formulas, an enlarged image is obtained whose width equals that of the row-direction enlarged image of the G component and whose height is twice it. Similarly, the row-direction enlarged images of the R component and the B component are interpolated and enlarged as described in this step to obtain the final enlarged image of the R component and the final enlarged image of the B component.
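A matching sketch for step 503, under the same assumptions as the step 502 sketch, with one additional hedge: the printed formula for G2_high indexes the horizontal neighbors (h, w-1) and (h, w+1) even though this pass runs along columns; by symmetry with step 502, the sketch below assumes the vertical neighbors (h-1, w) and (h+1, w) were intended, so the neighbor choice should be treated as an assumption rather than a confirmed reading.

```python
import numpy as np

def col_upscale(G1_out):
    """Column-direction 2x interpolation (sketch of step 503)."""
    G1_out = np.asarray(G1_out, dtype=np.float64)
    # Edge replication for the boundary neighbors (assumed, as in step 502).
    padded = np.pad(G1_out, ((1, 1), (0, 0)), mode='edge')
    # Assumed vertical form:
    # G2_high(h,w) = (2*G1_out(h,w) - G1_out(h-1,w) - G1_out(h+1,w)) / 4
    G2_high = (2 * G1_out - padded[:-2, :] - padded[2:, :]) / 4.0
    h, w = G1_out.shape
    G2_out = np.empty((2 * h, w), dtype=np.float64)
    G2_out[0::2, :] = G1_out + G2_high  # 1-based row 2*h-1: sum
    G2_out[1::2, :] = G1_out - G2_high  # 1-based row 2*h:   difference
    return G2_out
```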
The final enlarged image of the original RGB image may be obtained from the final enlarged image of the G component, the final enlarged image of the B component, and the final enlarged image of the R component.
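Putting the two passes together for an RGB input, reusing the hypothetical row_upscale and col_upscale helpers sketched above:

```python
import numpy as np  # requires the row_upscale / col_upscale sketches above

def upscale_rgb(rgb):
    """2x upscaling of an H x W x 3 RGB array, one component at a time."""
    channels = [col_upscale(row_upscale(rgb[:, :, c])) for c in range(3)]
    return np.stack(channels, axis=-1)  # recombine the R, G, B final images
```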
Example two
Fig. 6 is a flowchart of an image processing method according to a second embodiment of the present invention. As shown in fig. 6, the image processing method provided by the present embodiment includes the following steps:
step 601: the input unit inputs an original image to be processed.
Here, the original image will be described by taking an RGB image as an example. However, the present invention is not limited thereto. The present embodiment is equally applicable to other types of images (e.g., grayscale images, etc.).
Step 602: the controller interpolates and amplifies R, G, and B components of the RGB image in the column direction, respectively.
Here, the G component is taken as an example and described in detail.
First, the controller determines a corresponding first processed image for the image of the G component. Specifically, according to pixel values of pixel points (h, w) located in the h-th row and the w-th column in the image of the G component and pixel points on the left and right sides of the pixel points, it is determined that the pixel values of the pixel points located in the h-th row and the w-th column in the corresponding first processed image are:
G1_high(h,w)=(2×G1(h,w)-G1(h,w-1)-G1(h,w+1))/4,
wherein G1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the G component image, G1(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the G component image, G1(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the G component image, G1(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the G component image, and h and w are integers greater than or equal to 0; then, each pixel point in the G component image is cyclically traversed, thereby obtaining the first processed image corresponding to the G component image.
Then, the controller performs interpolation in the column direction using the image of the G component and the corresponding first processed image to obtain an enlarged image of the G component in the column direction. Specifically, the interpolation is performed according to the following equations:
G1_out(2×h-1,w)=G1(h,w)+G1_high(h,w)
G1_out(2×h,w)=G1(h,w)-G1_high(h,w),
wherein G1_out(2×h-1,w) represents the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the enlarged image of the G component in the column direction, G1_out(2×h,w) represents the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the enlarged image of the G component in the column direction, G1(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the G component image, and G1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the G component image; then, each pixel point in the G component image and the corresponding first processed image is cyclically traversed to obtain the enlarged image of the G component in the column direction.
According to the interpolation formulas, an enlarged image is obtained whose width equals that of the G component image and whose height is twice that of the original G component image. Similarly, the R component image and the B component image are interpolated and enlarged as described in this step (step 602) to obtain an enlarged image of the R component in the column direction and an enlarged image of the B component in the column direction.
Step 603: the controller interpolates and amplifies, in the row direction, the enlarged image of the R component in the column direction, the enlarged image of the G component in the column direction, and the enlarged image of the B component in the column direction, respectively.
Here, the G component is still used as an example for explanation.
First, the controller determines a corresponding second processed image for the enlarged image of the G component in the column direction. Specifically, according to pixel values of pixel points (h, w) located in the h-th row and the w-th column in the amplified image in the column direction of the G component and pixel points on the left and right sides thereof, it is determined that pixel values of pixel points located in the h-th row and the w-th column in the corresponding second processed image are:
G2_high(h,w)=(2×G1_out(h,w)-G1_out(h,w-1)-G1_out(h,w+1))/4,
wherein G2_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the column-direction enlarged image of the G component, G1_out(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the column-direction enlarged image of the G component, G1_out(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the column-direction enlarged image of the G component, G1_out(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the column-direction enlarged image of the G component, and h and w are integers greater than or equal to 0; then, each pixel point of the column-direction enlarged image of the G component is cyclically traversed, thereby obtaining the second processed image corresponding to the column-direction enlarged image of the G component.
Then, the controller performs interpolation in the row direction using the enlarged image of the G component in the column direction and the corresponding second processed image to obtain the enlarged image of the G component in the row direction, namely the final enlarged image of the G component. Specifically, the interpolation is performed according to the following equations:
G2_out(h,2×w-1)=G1_out(h,w)+G2_high(h,w)
G2_out(h,2×w)=G1_out(h,w)-G2_high(h,w),
wherein G2_out(h,2×w-1) represents the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the final enlarged image of the G component, G2_out(h,2×w) represents the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the final enlarged image of the G component, G1_out(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the column-direction enlarged image of the G component, and G2_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the column-direction enlarged image of the G component; thereafter, each pixel point in the column-direction enlarged image of the G component and the corresponding second processed image is cyclically traversed to obtain the enlarged image of the G component in the row direction, i.e., the final enlarged image.
According to the interpolation formulas, an enlarged image is obtained whose height equals that of the column-direction enlarged image of the G component and whose width is twice it. Similarly, the column-direction enlarged images of the R component and the B component are interpolated and enlarged as described in this step (step 603) to obtain the final enlarged image of the R component and the final enlarged image of the B component.
The final enlarged image of the original RGB image can be obtained from the final enlarged image of the G component, the final enlarged image of the B component, and the final enlarged image of the R component.
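Example two only swaps the order of the two passes. Reusing the hypothetical row_upscale sketch from example one, the column-first variant can be written with transposes, again assuming each pass estimates its high-frequency image from neighbors along its own interpolation direction (see the hedge after step 503 above):

```python
# Reuses row_upscale from the example-one sketch.
def col_first_upscale(component):
    """Column-direction pass (step 602) followed by row-direction pass (step 603)."""
    step1 = row_upscale(component.T).T  # column-direction 2x via transpose
    return row_upscale(step1)           # then row-direction 2x
```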
Example three
Fig. 7 is a flowchart of an image processing method according to a third embodiment of the present invention. As shown in fig. 7, the image processing method provided by the present embodiment includes the following steps:
step 701: the input unit inputs an original image to be processed.
Here, the original image will be described by taking an RGB image as an example. However, the present invention is not limited thereto. The present embodiment is equally applicable to other types of images (e.g., grayscale images, etc.).
Step 702: the controller performs interpolation amplification in the row direction on the G component of the original RGB image, and then performs interpolation amplification in the column direction on the enlarged image obtained by interpolating the G component in the row direction.
Here, the G component is taken as an example for explanation.
First, the controller determines a corresponding first processed image for the image of the G component, and interpolates the G component image and the corresponding first processed image in the row direction to obtain an enlarged image of the G component in the row direction; this process is the same as step 502 of the first embodiment and is therefore not described in detail here;
then, the controller determines a corresponding second processed image for the enlarged image of the G component in the row direction, and performs interpolation in the column direction using that enlarged image and the corresponding second processed image to obtain the enlarged image of the G component in the column direction, which is the final enlarged image of the G component; this process is the same as step 503 of the first embodiment and is therefore not repeated here.
In addition, in this step, the G component of the original RGB image may instead be interpolated and amplified in the column direction first, and the enlarged image obtained by interpolating the G component in the column direction may then be interpolated and amplified in the row direction. That is, this embodiment does not limit the order of the row-direction interpolation and the column-direction interpolation.
Step 703: the controller performs interpolation amplification in the row direction on the R component of the original RGB image, and then performs interpolation amplification in the column direction on the enlarged image obtained by interpolating the R component in the row direction.
Here, the processing manner of the R component is the same as that of the G component, and therefore, the description thereof is omitted.
Step 704: the controller performs interpolation amplification in the row direction on the B component of the original RGB image, and then performs interpolation amplification in the column direction on the enlarged image obtained by interpolating the B component in the row direction.
Here, the processing manner of the B component is the same as that of the G component, and therefore, the description thereof is omitted.
In the present invention, the order of processing the R component, G component, and B component is not limited. In other embodiments, the R component may be processed first, and then the B component and the G component may be processed sequentially.
Likewise, a final enlarged image of the original RGB image may be obtained from the final enlarged image of the G component, the final enlarged image of the B component, and the final enlarged image of the R component.
In addition, when the embodiment of the invention is applied to a grayscale image, the process of interpolating and amplifying the original grayscale image to obtain the final enlarged image is similar to the process for a single component (such as the G component) of an RGB image, and is therefore not repeated here.
Fig. 8 is a schematic diagram of a mobile terminal for image processing according to an embodiment of the present invention. Fig. 9 is a schematic structural diagram of a mobile terminal for image processing according to an embodiment of the present invention. Here, as shown in fig. 8, the mobile terminal is, for example, a mobile phone. However, the present invention is not limited thereto.
As shown in fig. 9, the image processing mobile terminal provided in this embodiment includes: an input unit and a controller. The input unit is used for inputting an original image to be processed; the controller is used for determining a first processed image corresponding to the original image; in a first direction, carrying out interpolation on the original image and the corresponding first processing image to obtain an amplified image of the original image in the first direction; determining a second processed image corresponding to the amplified image of the original image in the first direction; in a second direction, carrying out interpolation on the amplified image of the original image in the first direction and the corresponding second processed image to obtain a final amplified image of the original image; wherein the first direction is perpendicular to the second direction.
In one embodiment, the first direction is a row direction of the pixels, and the second direction is a column direction of the pixels.
In another embodiment, the first direction is a column direction of the pixels, and the second direction is a row direction of the pixels.
Further, the determining, by the controller, a first processed image corresponding to the original image is:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and the w-th column in the original image and the pixel values of the pixel points on its left and right sides, that the pixel value of the pixel point located in the h-th row and the w-th column in the corresponding first processed image is:
P1_high(h,w)=(2×P1(h,w)-P1(h,w-1)-P1(h,w+1))/4,
wherein P1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, P1(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the original image, P1(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the original image, P1(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the original image, and h and w are both integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the original image to obtain a pixel value corresponding to each pixel point in the corresponding first processed image so as to determine the first processed image corresponding to the original image.
Further, the controller interpolates the original image and the corresponding first processed image in a first direction to obtain an enlarged image of the original image in the first direction, which means that:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and determines that the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
the controller circularly traverses the original image and each pixel point in the corresponding first processed image to obtain the pixel value of each pixel point in the amplified image in the first direction so as to determine the amplified image of the original image in the first direction;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and determines that the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
and the controller circularly traverses the original image and each pixel point in the corresponding first processed image to obtain the pixel value of each pixel point in the amplified image in the first direction so as to determine the amplified image of the original image in the first direction.
Further, the determining, by the controller, a second processed image corresponding to the enlarged image of the original image in the first direction is:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and the w-th column in the enlarged image of the original image in the first direction and the pixel values of the pixel points on its left and right sides, that the pixel value of the pixel point located in the h-th row and the w-th column in the corresponding second processed image is:
P2_high(h,w)=(2×P1_out(h,w)-P1_out(h,w-1)-P1_out(h,w+1))/4,
wherein P2_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the enlarged image of the original image in the first direction, P1_out(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the enlarged image of the original image in the first direction, P1_out(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the enlarged image of the original image in the first direction, P1_out(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the enlarged image of the original image in the first direction, and h and w are integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the amplified image of the original image in the first direction to obtain a pixel value corresponding to each pixel point in the corresponding second processed image so as to determine the second processed image corresponding to the amplified image of the original image in the first direction.
Further, the controller interpolates the enlarged image of the original image in the first direction and the corresponding second processed image in the second direction to obtain a final enlarged image of the original image, which means that:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the final amplified image is equal to the sum of a third pixel value and a fourth pixel value, and determines that the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the final amplified image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the amplified image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the amplified image of the original image in the first direction, and h and w are integers greater than or equal to 0;
the controller circularly traverses each pixel point in the amplified image in the first direction and the corresponding second processed image to obtain the pixel value of each pixel point in the final amplified image so as to determine the final amplified image of the original image;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the final amplified image is equal to the sum of a third pixel value and a fourth pixel value, and determines that the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the final amplified image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the amplified image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the amplified image of the original image in the first direction, and h and w are integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the amplified image in the first direction and the corresponding second processed image to obtain the pixel value of each pixel point in the final amplified image so as to determine the final amplified image of the original image.
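The row-direction and column-direction passes described above differ only in the axis along which neighbors are taken and samples are interleaved, so a single generic helper can express both. The helper name below is an assumption, as are the neighbor direction (along the interpolation axis) and the edge-replication boundary handling:

```python
import numpy as np

def upscale_along(P, axis):
    """One 2x interpolation pass along `axis` (0 = column direction,
    1 = row direction), following the P_high / sum-difference formulas."""
    P = np.moveaxis(np.asarray(P, dtype=np.float64), axis, 0)
    padded = np.pad(P, ((1, 1), (0, 0)), mode='edge')  # assumed boundaries
    high = (2 * P - padded[:-2] - padded[2:]) / 4.0    # high-frequency estimate
    out = np.empty((2 * P.shape[0], P.shape[1]), dtype=np.float64)
    out[0::2] = P + high   # 1-based index 2*k-1: sum
    out[1::2] = P - high   # 1-based index 2*k:   difference
    return np.moveaxis(out, 0, axis)

# Final amplified image: first-direction pass, then second-direction pass, e.g.
# final = upscale_along(upscale_along(original, axis=1), axis=0)
```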
Further, when the pixel value of each pixel point of the original image comprises at least two components, the controller performs interpolation on each component image of the original image and the first processed image corresponding to each component image in the first direction respectively to obtain an amplified image of each component in the first direction; and the controller respectively interpolates the amplified image of each component in the first direction and the corresponding second processed image in the second direction to obtain a final amplified image of each component, and determines the final amplified image of the original image according to the final amplified image of each component.
In addition, the image processing mobile terminal provided by the embodiment further includes a display unit for displaying the obtained final enlarged image.
In addition, the specific processing flow of the mobile terminal is the same as that of the method, and thus is not described herein again.
It should be noted that the controller of the mobile terminal for image processing provided by the present invention is, for example, the controller in fig. 1; the input unit is, for example, the user input unit or the camera of the A/V input unit in fig. 1; and the display unit is, for example, the display unit in fig. 1.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (8)
1. A mobile terminal for image processing, comprising:
an input unit and a controller;
the input unit is used for inputting an original image to be processed;
the controller is used for determining a first processed image corresponding to the original image; in a first direction, carrying out interpolation on the original image and the corresponding first processing image to obtain an amplified image of the original image in the first direction; determining a second processed image corresponding to the amplified image of the original image in the first direction; in a second direction, carrying out interpolation on the amplified image of the original image in the first direction and a corresponding second processed image to obtain a final amplified image of the original image;
wherein the first direction is perpendicular to the second direction;
the controller determines a first processed image corresponding to the original image, and the determining is as follows:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and the w-th column in the original image and the pixel values of the pixel points on its left and right sides, that the pixel value of the pixel point located in the h-th row and the w-th column in the corresponding first processed image is:
P1_high(h,w)=(2×P1(h,w)-P1(h,w-1)-P1(h,w+1))/4,
wherein P1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, P1(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the original image, P1(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the original image, P1(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the original image, and h and w are both integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the original image to obtain a pixel value corresponding to each pixel point in the corresponding first processed image so as to determine the first processed image corresponding to the original image.
2. The mobile terminal of claim 1, wherein the first direction is a row direction of pixel points, and the second direction is a column direction of pixel points; or, the first direction is a column direction of the pixel points, and the second direction is a row direction of the pixel points.
3. The mobile terminal according to claim 1 or 2, wherein the controller interpolates the original image and the corresponding first processed image in a first direction to obtain an enlarged image of the original image in the first direction, and the interpolation is performed by:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and determines that the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
the controller circularly traverses the original image and each pixel point in the corresponding first processed image to obtain the pixel value of each pixel point in the amplified image in the first direction so as to determine the amplified image of the original image in the first direction;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and determines that the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
and the controller circularly traverses the original image and each pixel point in the corresponding first processed image to obtain the pixel value of each pixel point in the amplified image in the first direction so as to determine the amplified image of the original image in the first direction.
4. The mobile terminal according to claim 1 or 2, wherein the controller determines the second processed image corresponding to the enlarged image of the original image in the first direction by:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and the w-th column in the enlarged image of the original image in the first direction and the pixel values of the pixel points on its left and right sides, that the pixel value of the pixel point located in the h-th row and the w-th column in the corresponding second processed image is:
P2_high(h,w)=(2×P1_out(h,w)-P1_out(h,w-1)-P1_out(h,w+1))/4,
wherein P2_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the enlarged image of the original image in the first direction, P1_out(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the enlarged image of the original image in the first direction, P1_out(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the enlarged image of the original image in the first direction, P1_out(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the enlarged image of the original image in the first direction, and h and w are integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the amplified image of the original image in the first direction to obtain a pixel value corresponding to each pixel point in the corresponding second processed image so as to determine the second processed image corresponding to the amplified image of the original image in the first direction.
5. The mobile terminal according to claim 1 or 2, wherein the controller interpolates the enlarged image of the original image in the first direction and the corresponding second processed image in the second direction to obtain a final enlarged image of the original image, which means:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the final amplified image is equal to the sum of a third pixel value and a fourth pixel value, and determines that the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the final amplified image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the amplified image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the amplified image of the original image in the first direction, and h and w are integers greater than or equal to 0;
the controller circularly traverses each pixel point in the amplified image in the first direction and the corresponding second processed image to obtain the pixel value of each pixel point in the final amplified image so as to determine the final amplified image of the original image;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the final amplified image is equal to the sum of a third pixel value and a fourth pixel value, and determines that the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the final amplified image is equal to the difference between the third pixel value and the fourth pixel value, wherein the third pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the amplified image of the original image in the first direction, the fourth pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the second processed image corresponding to the amplified image of the original image in the first direction, and h and w are integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the amplified image in the first direction and the corresponding second processed image to obtain the pixel value of each pixel point in the final amplified image so as to determine the final amplified image of the original image.
6. An image processing method, comprising:
an input unit inputs an original image to be processed;
the controller determines a first processed image corresponding to the original image; in a first direction, carrying out interpolation on the original image and the corresponding first processing image to obtain an amplified image of the original image in the first direction;
the controller determines a second processed image corresponding to the amplified image of the original image in the first direction; in a second direction, carrying out interpolation on the amplified image of the original image in the first direction and a corresponding second processed image to obtain a final amplified image of the original image;
wherein the first direction is perpendicular to the second direction;
the controller determines a first processed image corresponding to the original image, and comprises the following steps:
the controller determines, according to the pixel value of the pixel point (h, w) located in the h-th row and the w-th column in the original image and the pixel values of the pixel points on its left and right sides, that the pixel value of the pixel point located in the h-th row and the w-th column in the corresponding first processed image is:
P1_high(h,w)=(2×P1(h,w)-P1(h,w-1)-P1(h,w+1))/4,
wherein P1_high(h,w) represents the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, P1(h,w) represents the pixel value of the pixel point (h,w) located in the h-th row and the w-th column in the original image, P1(h,w-1) represents the pixel value of the left-side pixel point of that pixel point in the original image, P1(h,w+1) represents the pixel value of the right-side pixel point of that pixel point in the original image, and h and w are both integers greater than or equal to 0;
and the controller circularly traverses each pixel point in the original image to obtain a pixel value corresponding to each pixel point in the corresponding first processed image so as to determine the first processed image corresponding to the original image.
7. The method of claim 6, wherein the first direction is a row direction of pixels and the second direction is a column direction of pixels; or, the first direction is a column direction of the pixel points, and the second direction is a row direction of the pixel points.
8. The method of claim 6 or 7, wherein the controller interpolates the original image and the corresponding first processed image in a first direction to obtain an enlarged image of the original image in the first direction, comprising:
when the first direction is the row direction of the pixel points, the controller determines that the pixel value of the pixel point located in the h-th row and the (2×w-1)-th column in the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and determines that the pixel value of the pixel point located in the h-th row and the (2×w)-th column in the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
the controller circularly traverses the original image and each pixel point in the corresponding first processed image to obtain the pixel value of each pixel point in the amplified image in the first direction so as to determine the amplified image of the original image in the first direction;
or,
when the first direction is the column direction of the pixel points, the controller determines that the pixel value of the pixel point located in the (2×h-1)-th row and the w-th column in the enlarged image in the first direction is equal to the sum of a first pixel value and a second pixel value, and determines that the pixel value of the pixel point located in the (2×h)-th row and the w-th column in the enlarged image in the first direction is equal to the difference between the first pixel value and the second pixel value, wherein the first pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the original image, the second pixel value is the pixel value of the pixel point located in the h-th row and the w-th column in the first processed image corresponding to the original image, and h and w are integers greater than or equal to 0;
and the controller circularly traverses the original image and each pixel point in the corresponding first processed image to obtain the pixel value of each pixel point in the amplified image in the first direction so as to determine the amplified image of the original image in the first direction.
Priority Applications

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN201510843545.3A (CN105427247B) | 2015-11-26 | 2015-11-26 | A kind of mobile terminal and image processing method of image procossing |
| PCT/CN2016/105707 (WO2017088679A1) | 2015-11-26 | 2016-11-14 | Image-processing mobile terminal and image processing method |
Publications

| Publication Number | Publication Date |
| --- | --- |
| CN105427247A | 2016-03-23 |
| CN105427247B | 2018-08-24 |
| WO2017088679A1 | 2017-06-01 |