CN107566721B - Information display method, terminal and computer readable storage medium
- Publication number: CN107566721B
- Application number: CN201710761768.4A
- Authority: CN (China)
- Prior art keywords: operation object, interface, shooting, shooting interface, terminal
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
An embodiment of the invention discloses an information display method comprising the following steps: detecting whether a first shooting interface of a terminal includes a first operation object, the first operation object being used to operate a second shooting interface; and, if the first shooting interface does not include the first operation object, displaying on the first shooting interface a second operation object used to operate the first shooting interface. Embodiments of the invention also disclose a terminal and a computer-readable storage medium. The disclosed scheme solves the prior-art problem that normal shooting becomes impossible because the virtual keys disappear after the shooting interface is moved, and improves the degree of intelligence of the terminal.
Description
Technical Field
The present invention relates to the field of communications, and in particular, to an information display method, a terminal, and a computer-readable storage medium.
Background
With the development of science and technology, more and more mobile terminals, such as smartphones, tablet computers and notebook computers, have entered people's daily lives and brought great convenience. However, as mobile terminal hardware has evolved, large screens satisfy users' demands for display size but make operation inconvenient. Taking a large-screen smartphone as an example, if a user holding the phone with one hand wants to touch certain areas of the screen, the other hand has to assist, so the trend toward large screens makes one-handed operation difficult. To address this, existing schemes provide a one-handed operation mode: when a large-screen phone enters this mode, the whole display interface is moved into an area reachable with one hand so that the user can conveniently operate the phone single-handedly.
However, in the prior art, after the shooting interface is moved, some virtual keys of the shooting interface disappear from the screen, so normal shooting cannot be performed.
Disclosure of Invention
In view of this, embodiments of the present invention provide an information display method, a terminal, and a computer-readable storage medium, so as to solve the prior-art technical problem that a virtual key disappears when the shooting interface is moved, making normal shooting impossible, and to improve the degree of intelligence of the terminal.
To achieve the above purpose, the technical solution of the invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an information display method, including:
detecting whether a first shooting interface of the terminal comprises a first operation object, wherein the first operation object is used for operating a second shooting interface;
and if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface, wherein the second operation object is used for operating the first shooting interface.
Further, if the first shooting interface does not include the first operation object, displaying a second operation object on the first shooting interface, including: if the first shooting interface does not comprise the first operation object, displaying the first shooting interface on a first part of a display area of the terminal;
and displaying the shooting preview interface in a second part of the display area, wherein the display area comprises the first part and the second part.
Further, if the first shooting interface does not include the first operation object, before displaying the second operation object on the first shooting interface, the method further includes:
generating a second operation object; or
determining a second operation object based on the first operation object.
Further, if the first shooting interface does not include the first operation object, displaying a second operation object on the first shooting interface, further comprising:
setting touch operation corresponding to a second operation object;
and establishing a mapping relation between the second operation object and the first operation object.
Further, if the first shooting interface does not include the first operation object, after the second operation object is displayed on the first shooting interface, the method further includes:
receiving a trigger instruction aiming at a second operation object;
and responding to the trigger instruction, and executing the operation corresponding to the first operation object based on the mapping relation between the second operation object and the first operation object.
Further, after the operation corresponding to the first operation object is executed based on the mapping relationship between the second operation object and the first operation object in response to the trigger instruction, the method further includes:
receiving a hiding instruction;
and hiding the second operation object and displaying a second shooting interface on the display area of the terminal in response to the hiding instruction.
Further, if the first shooting interface does not include the first operation object, displaying a second operation object on the first shooting interface, including:
if the first shooting interface does not comprise the first operation object, displaying a first interface comprising a first sub-operation object on the first shooting interface;
receiving a switching instruction, responding to the switching instruction, and switching the first interface into a second interface comprising a second sub-operation object; the second operation object comprises a first sub operation object and a second sub operation object.
In a second aspect, an embodiment of the present invention provides a terminal, where the terminal includes a processor, a memory, and a communication bus; the communication bus is used for realizing connection communication between the processor and the memory; the processor is used for executing the information display program stored in the memory so as to realize the following steps:
detecting whether a first shooting interface of the terminal comprises a first operation object, wherein the first operation object is used for operating a second shooting interface;
and if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface, wherein the second operation object is used for operating the first shooting interface.
Further, if the first shooting interface does not include the first operation object, the processor is further configured to execute the information display program when the second operation object is displayed on the first shooting interface, so as to implement the following steps:
if the first shooting interface does not comprise the first operation object, displaying the first shooting interface on a first part of a display area of the terminal;
and displaying the shooting preview interface in a second part of the display area, wherein the display area comprises the first part and the second part.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium storing an information display program, where the information display program, when executed by a processor, implements the steps of the information display method described above.
According to the information display method, the terminal, and the computer-readable storage medium provided by the embodiments of the present invention, the terminal detects whether a first shooting interface of the terminal includes a first operation object, the first operation object being used to operate a second shooting interface; and, if the first shooting interface does not include the first operation object, displays on the first shooting interface a second operation object used to operate the first shooting interface. That is, after the second shooting interface is moved to obtain the first shooting interface, even if the first operation object originally displayed in the second shooting interface for operating it has disappeared from the display area of the terminal, the moved first shooting interface can still be operated through the second operation object displayed on it. Therefore, the information display method solves the prior-art technical problem that the virtual keys disappear after the shooting interface is moved, making normal shooting impossible, and achieves the effect of improving the degree of intelligence of the terminal.
Drawings
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention;
Fig. 2 is a diagram of a communication network system architecture according to an embodiment of the present invention;
Fig. 3 is a flowchart of an information display method according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a touch operation interface according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of another touch operation interface according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of another touch operation interface according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of a touch operation interface according to another embodiment of the present invention;
Fig. 8 is a schematic diagram of another touch operation interface according to another embodiment of the present invention;
Fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the description of the present invention and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palm top computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description takes a mobile terminal as an example; those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed terminals.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: a Radio Frequency (RF) unit 101, a WiFi module 102, an audio output unit 103, an a/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access 2000 (CDMA2000, Code Division Multiple Access 2000), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Frequency Division duplex Long Term Evolution (FDD-LTE), and Time Division duplex Long Term Evolution (TDD-LTE), etc.
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mails, browse web pages, access streaming media, and so on, providing wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the invention.
When the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like, the audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound. The audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound or a message reception sound). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042; the graphics processing unit 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 1061 according to the ambient light, and a proximity sensor, which can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when stationary, and can be used in applications that recognize the posture of the mobile phone (such as switching between landscape and portrait, related games, and magnetometer posture calibration) and in vibration-recognition functions (such as a pedometer or tap detection). As for other sensors that may be configured on the mobile phone, such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, they are not described further here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by the user on or near it (e.g., operations performed on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory) and drive a corresponding connection device according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends these to the processor 110, and can receive and execute commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
Further, the touch panel 1071 may cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to that type. Although in fig. 1 the touch panel 1071 and the display panel 1061 are shown as two separate components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement these functions, which is not specifically limited here.
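As a rough illustration of the touch-event flow just described (touch detection device → touch controller → processor → visual output), the following Kotlin sketch uses purely hypothetical names; it is not code from the patent or from any particular platform API.

```kotlin
// Minimal sketch of the touch-event flow; all names are illustrative assumptions.
data class RawTouch(val xRaw: Int, val yRaw: Int)   // what the touch detection device reports
data class TouchPoint(val x: Int, val y: Int)       // coordinates produced by the touch controller

class TouchController(private val onEvent: (TouchPoint) -> Unit) {
    // Converts raw panel coordinates into touch-point coordinates and forwards them.
    fun report(raw: RawTouch) = onEvent(TouchPoint(raw.xRaw, raw.yRaw))
}

fun main() {
    // The "processor" here just prints the visual output it would provide.
    val processor = { p: TouchPoint -> println("visual output at (${p.x}, ${p.y})") }
    val controller = TouchController(processor)
    controller.report(RawTouch(120, 480))   // a tap detected by the touch panel
}
```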
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system, application programs required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the mobile phone (such as audio data and a phonebook). Further, the memory 109 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption through the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present invention, a communication network system on which the mobile terminal of the present invention is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication network system according to an embodiment of the present invention. The communication network system is an LTE system of universal mobile telecommunications technology, and includes a User Equipment (UE) 201, an Evolved UMTS Terrestrial Radio Access Network (E-UTRAN) 202, an Evolved Packet Core (EPC) 203, and an operator's IP services 204, which are communicatively connected in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022. The eNodeB 2021 may be connected with the other eNodeBs 2022 through a backhaul (e.g., an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 can provide the UE 201 with access to the EPC 203.
The EPC 203 may include a Mobility Management Entity (MME) 2031, a Home Subscriber Server (HSS) 2032, other MMEs 2033, a Serving Gateway (SGW) 2034, a Packet Data Network Gateway (PGW) 2035, a Policy and Charging Rules Function (PCRF) 2036, and the like. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203 and provides bearer and connection management. The HSS 2032 provides registers for managing functions such as the home location register (not shown) and holds subscriber-specific information about service characteristics, data rates, and so on. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address assignment and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IP Multimedia Subsystem (IMS), other IP services, and the like.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present invention is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, the present invention provides various embodiments of the method.
The embodiment of the present invention provides an information display method whose execution subject is a mobile terminal; at least one image collector is arranged in the mobile terminal and can be used to collect images.
In practical application, the image collector may be a camera. The mobile terminal can be a smart phone with a front camera or a rear camera or both the front camera and the rear camera. The mobile terminal may also be a terminal with a shooting function, such as a tablet computer. Of course, the mobile terminal may also be other terminals having a shooting function, and the embodiment of the present invention is not particularly limited.
Further, the functions implemented by the information display method can be implemented by calling program codes through a processor in the mobile terminal, and the program codes can be stored in a computer storage medium. In practical application, the information display method can be applied to various occasions needing shooting, such as self-shooting, shooting of scenery, people and the like.
Based on the foregoing embodiments, an embodiment of the present invention provides an information display method, as shown in fig. 3, the method including the following steps:
s301: whether a first operation object is included in a first shooting interface of the terminal is detected.
The first operation object is used for operating the second shooting interface.
Here, suppose that while shooting with the terminal the user sees a lovely puppy and wants to photograph it. The user opens the camera application from the system user interface of the terminal by completing a preset operation, for example tapping the camera application icon, so that the mobile terminal obtains an instruction to open the camera and, in response to the instruction, opens the camera application. After the camera application is started, an initial shooting interface, namely the second shooting interface, is displayed in the display area of the terminal. For example, referring to fig. 4, the second shooting interface includes a shooting area 41 and an operable area 42, where the operable area is a fixed display area at the bottom of the second shooting interface, and a first operation object used to operate the second shooting interface is displayed in this fixed display area. Here, the first operation object may be virtual operation keys such as the camera key 43, the photographing key 44, and the photographing mode key 45. When the user clicks the camera key 43, the terminal enters the camera (video) mode; when the user clicks the photographing key 44, the terminal enters the photographing mode; when the user clicks the photographing mode key 45, the terminal presents several photographing modes for the user to select. A photographic subject, such as the lovely puppy, is displayed in the shooting area 41.
Of course, the first operation object may also be disposed in other areas of the second shooting interface, and the first operation object may also be another interactive object, so as to implement the function of operating the second shooting interface, which is not specifically limited in the embodiment of the present invention.
In practical applications, referring to fig. 4, when the terminal used is a large-screen terminal, the user may want to enlarge the display at the position of the puppy's head with the right hand but find that this position cannot be reached by the right hand. In that case the user touches the menu key twice in succession (for example, the circled portion directly below the photographing key 44 in fig. 4), the terminal starts the one-handed operation mode and pulls down the second shooting interface, and the pulled-down shooting interface may be the first shooting interface. For example, as shown in fig. 4 and 5, after the terminal starts the one-handed operation mode, the upper boundary of the shooting area 41 in the second shooting interface moves down to the upper boundary of the first shooting interface 52, and a blank area 51 appears on the display interface of the terminal. Further, after the second shooting interface has been moved to obtain the first shooting interface, the terminal detects whether the first shooting interface includes the first operation object. Here, the first operation object may be some of the operation objects in the second shooting interface, or may be all of them. For example, the terminal detects whether the first shooting interface includes the photographing key 44.
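The detection in S301 can be pictured roughly as a bounds check after the interface is pulled down. The following Kotlin sketch is only an illustration with assumed names, coordinates, and offsets; the patent does not prescribe any particular implementation.

```kotlin
// Minimal sketch: after the second shooting interface is pulled down by offsetY,
// a first operation object counts as part of the first shooting interface only if
// its bounds are still inside the visible display area.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(other: Rect) =
        other.left >= left && other.top >= top && other.right <= right && other.bottom <= bottom
    fun shiftedDown(dy: Int) = Rect(left, top + dy, right, bottom + dy)
}

fun firstInterfaceContains(displayArea: Rect, firstObjectBounds: Rect, offsetY: Int): Boolean =
    // The first operation object moves down together with the pulled-down interface.
    displayArea.contains(firstObjectBounds.shiftedDown(offsetY))

fun main() {
    val display = Rect(0, 0, 1080, 1920)
    val photographingKey = Rect(440, 1750, 640, 1900)        // operable area at the bottom
    println(firstInterfaceContains(display, photographingKey, 0))    // true: second shooting interface
    println(firstInterfaceContains(display, photographingKey, 600))  // false: key moved off screen
}
```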
S302: and if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface.
The second operation object is used for operating the first shooting interface.
Here, still taking the photographing key 44 as the first operation object, as shown in fig. 4 and 5, after the mobile terminal pulls down the second shooting interface, the shooting area 41 in fig. 4 is correspondingly pulled down in the vertical direction; the pulled-down shooting area is the area indicated by 52 in fig. 5, i.e., the first shooting interface, and at this time nothing is displayed in the blank area 51 of the terminal display interface. Further, the terminal detects whether the first shooting interface includes the photographing key 44. Obviously, as shown in fig. 5, as the display interface is pulled down, the operable area originally located at the bottom of the display interface moves down and out of the display area, so the terminal detects that the first shooting interface does not include the first operation object, i.e., the photographing key 44; in other words, the first shooting interface does not include an operation object that can be used for shooting. To ensure that the user can still perform a normal shooting operation, the terminal therefore displays, on the first shooting interface, a second operation object for operating the first shooting interface. For example, referring to fig. 6, the terminal displays a camera key 63, a photographing key 64, and a photographing mode key 65 on the first shooting interface 62. In this way, the prior-art problem that the virtual keys disappear after the shooting interface is moved, making normal shooting impossible, is solved.
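The S302 branch described above can be sketched as follows. All types and labels are illustrative assumptions, not the patent's implementation.

```kotlin
// Minimal sketch: if no first operation object is visible on the first shooting
// interface, build second operation objects that drive the same shooting actions.
enum class CameraAction { VIDEO, PHOTO, MODE }

data class OperationObject(val label: String, val action: CameraAction)

fun buildSecondOperationObjects(
    visibleFirstObjects: List<OperationObject>,
    allFirstObjects: List<OperationObject>
): List<OperationObject> =
    if (visibleFirstObjects.isEmpty())
        // Mirror the original keys so the moved interface can still be operated.
        allFirstObjects.map { OperationObject("${it.label} (second)", it.action) }
    else
        emptyList()   // first operation objects are still visible, nothing to add

fun main() {
    val all = listOf(
        OperationObject("camera key 43", CameraAction.VIDEO),
        OperationObject("photographing key 44", CameraAction.PHOTO),
        OperationObject("photographing mode key 45", CameraAction.MODE)
    )
    println(buildSecondOperationObjects(visibleFirstObjects = emptyList(), allFirstObjects = all))
}
```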
In practical applications, after the second operation object is displayed on the first shooting interface 62, the relevant parameters of each operation object and adjustment keys for those parameters (not shown in fig. 6) may also be displayed in the area 61. In this way, after the user selects one of the second operation objects, the shooting parameters can be further customized and the shooting quality improved. For example, when the user selects and clicks the photographing key 64 displayed on the first shooting interface 62, a panorama key, a continuous-shooting preference key, an all-in-focus key, and the like are displayed in the area 61. The user may then select an adjustment key there, such as the continuous-shooting preference key, to take multiple pictures of a subject, such as the lovely puppy, in succession. This improves the user experience.
It should be noted that the second operation object may be a single operation object having a photographing function, for example only a photographing key (not shown in fig. 6); it may be a plurality of operation objects including an operation object having a photographing function, for example a photographing key plus a flash key (not shown in fig. 6); or it may be a plurality of operation objects with the same functions as all the operation objects in the second shooting interface, for example, as shown in fig. 6, a camera key 63 (corresponding to the camera key 43), a photographing key 64 (corresponding to the photographing key 44), and a photographing mode key 65 (corresponding to the photographing mode key 45).
As is apparent from the description of S301 above, since the first shooting interface is generated after the second shooting interface is pulled down, the operable area 42 arranged at the bottom of the second shooting interface disappears as the second shooting interface is pulled down.
Then, between S301 and S302, the information display method provided by the present invention further includes the steps of:
A1: generating a second operation object.
The terminal may generate a second operation object for operating the first shooting interface; the generated second operation object may include a camera key, a photographing key, and a photographing mode key.
Of course, a separate operation may also be established for the second operation object, without any association with the first operation object. Here, while generating the second operation object, the terminal may also generate a corresponding touch key, associate that touch key with the second operation object, and then display both on the first shooting interface. For example, referring to fig. 7, a blank area 71 (corresponding to the blank area 61) appears in the upper portion of the display interface of the terminal, the first shooting interface 72 occupies the remaining portion, and the first sub-operation objects, namely a camera key 73, a photographing key 74, and a photographing mode key 75, are displayed on the first shooting interface 72. Meanwhile, a sliding touch key 76 and a switching touch key 77 are displayed in a sector touch interface at the lower left corner of the first shooting interface 72; the first shooting interface 72 thus comprises a first interface and the sector touch interface. Specifically, the processor establishes a mapping relationship between the first sub-operation objects and the sliding touch key, so that as the user's thumb slides different distances along the extending direction of the annular region corresponding to the sliding touch key 76, the highlight indication switches among the first sub-operation objects (only one is highlighted at a time); for example, the photographing key 74 in fig. 7 is in the highlighted state. When the user finishes sliding the thumb, the currently highlighted photographing key 74 is triggered and the terminal enters the photographing state.
In practical applications, the central angle of the sector touch interface is 45-90 degrees. One straight edge of the sector touch interface may coincide with the longitudinal boundary line on one side while the other straight edge is parallel to the lower boundary line of the display interface (the edge facing downward in the current use orientation); or one straight edge may be parallel to that longitudinal boundary line while the other coincides with the lower boundary line; or the two straight edges may be respectively parallel to that longitudinal boundary line and the lower boundary line; or the two straight edges may respectively coincide with that longitudinal boundary line and the lower boundary line. The sector touch interface can be dragged; when it is dragged close to the longitudinal boundary line on the other side and away from the original side, its orientation is switched so that the arc edge bulges toward the original side's longitudinal boundary line.
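The mapping between the sliding touch key 76 and the first sub-operation objects described above can be sketched roughly as follows; the arc length, step size, and key names are assumptions made for illustration only.

```kotlin
// Sketch: slide distance along the sector arc selects (highlights) one sub-operation
// object at a time; lifting the thumb triggers the currently highlighted key.
class SectorSlideMapper(private val subObjects: List<String>, private val arcLength: Float) {
    private var highlighted = 0

    fun onSlide(distance: Float): String {
        val step = arcLength / subObjects.size
        highlighted = (distance / step).toInt().coerceIn(0, subObjects.size - 1)
        return subObjects[highlighted]           // the key highlighted right now
    }

    fun onSlideEnd(): String = subObjects[highlighted]   // this key gets triggered
}

fun main() {
    val mapper = SectorSlideMapper(
        listOf("camera key 73", "photographing key 74", "photographing mode key 75"),
        arcLength = 300f
    )
    println(mapper.onSlide(40f))    // camera key 73 highlighted
    println(mapper.onSlide(160f))   // photographing key 74 highlighted
    println(mapper.onSlideEnd())    // photographing key 74 triggered -> photographing state
}
```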
Alternatively, A2: determining the second operation object based on the first operation object.
In practical applications, when the terminal detects that the first shooting interface does not include the first operation object, the terminal may generate a new second operation object for operating the first shooting interface, or may determine the second operation object for operating the first shooting interface based on the first operation object. The terminal then displays the second operation object on the first shooting interface so that the user can operate the first shooting interface and accomplish the shooting.
In other embodiments of the present invention, the first operation object comprises N operation objects and the second operation object comprises M operation objects, where N and M are positive integers greater than 1 and M is less than or equal to N. The terminal can determine the M second operation objects from the N first operation objects according to preset rules. For example, according to the usage frequency of each first operation object, the terminal may take the M most frequently used of the N first operation objects as the second operation objects; or, according to the display positions of the first operation objects, the terminal may take as the second operation objects the M first operation objects located at intermediate positions of the fixed display area at the bottom of the display area. Of course, in practical applications the preset rule may also be some other rule, which is not specifically limited in this embodiment.
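One of the preset rules mentioned above, selecting the M most frequently used of the N first operation objects, could look roughly like this; the usage counts and labels are invented purely for illustration.

```kotlin
// Sketch: pick the M first operation objects with the highest usage frequency
// as the second operation objects.
data class FirstOperationObject(val label: String, val useCount: Int)

fun selectSecondObjects(firstObjects: List<FirstOperationObject>, m: Int): List<FirstOperationObject> =
    firstObjects.sortedByDescending { it.useCount }.take(m)

fun main() {
    val n = listOf(
        FirstOperationObject("photographing key", 120),
        FirstOperationObject("camera key", 35),
        FirstOperationObject("photographing mode key", 12),
        FirstOperationObject("flash key", 4)
    )
    println(selectSecondObjects(n, m = 2))   // photographing key and camera key
}
```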
Further, after the terminal determines the M second operation objects, the M second operation objects are displayed on the first shooting interface. The second operation objects may be displayed at arbitrary positions on the first shooting interface. However, to facilitate user operation, the processor of the terminal may also obtain the holding mode of the terminal and determine a concentrated display area for the second operation objects according to that holding mode. For example, the processor may drive detection units such as the gravity sensor and the gyroscope arranged in the terminal to detect the holding mode, or the holding mode may be obtained from the user's own selection; the processor then determines the corresponding concentrated display area according to the holding mode. For example, if the processor detects through the gravity sensor that a pair of opposite sides of the terminal, say the left and right sides, are not on the same horizontal plane and the left side is lower, the processor determines that the terminal is held in the left hand in one-handed mode and accordingly sets the concentrated display area to the lower-left area of the terminal display interface; the second operation objects are then displayed in that lower-left area. For example, referring to fig. 6, the camera key 63 (corresponding to the camera key 43), the photographing key 64 (corresponding to the photographing key 44), and the photographing mode key 65 (corresponding to the photographing mode key 45) are displayed from top to bottom on the left side of the first shooting interface. Further, the display manner of the M second operation objects on the first shooting interface is consistent with their display manner on the second shooting interface; for example, the display style, display size, and so on of the M second operation objects on the first shooting interface all match those used on the second shooting interface. Further, the terminal may set the touch operation corresponding to the second operation object and establish a mapping relationship between the second operation object and the first operation object, that is, associate the M second operation objects on the first shooting interface one by one with the M first operation objects on the second shooting interface. When the user operates one of the M second operation objects on the first shooting interface, say M1, the processor obtains a trigger instruction for M1 and, in response to the trigger instruction, executes the operation corresponding to M1 according to the mapping relationship between the M second operation objects on the first shooting interface and the M first operation objects on the second shooting interface. In this way, the terminal does not need to store another set of operation instructions for the second operation objects on the first shooting interface, which saves storage space of the terminal.
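The mapping relationship and trigger dispatch described above, under which a trigger on a second operation object reuses the operation already defined for the associated first operation object, can be sketched as follows with assumed identifiers; this is an illustration, not the patent's implementation.

```kotlin
// Sketch: second operation objects are mapped one by one to first operation objects,
// and a trigger on a second object is dispatched to the first object's existing operation,
// so no separate instruction set is stored for the moved interface.
class OperationMapping {
    private val secondToFirst = mutableMapOf<String, String>()
    private val firstObjectOperations = mutableMapOf<String, () -> Unit>()

    fun registerFirst(firstId: String, operation: () -> Unit) {
        firstObjectOperations[firstId] = operation
    }

    fun map(secondId: String, firstId: String) {
        secondToFirst[secondId] = firstId
    }

    fun onTrigger(secondId: String) {
        val firstId = secondToFirst[secondId] ?: return
        firstObjectOperations[firstId]?.invoke()   // reuse the first object's operation
    }
}

fun main() {
    val mapping = OperationMapping()
    mapping.registerFirst("photographing key 44") { println("enter photographing mode") }
    mapping.map("photographing key 64", "photographing key 44")
    mapping.onTrigger("photographing key 64")      // prints "enter photographing mode"
}
```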
When the user triggers the switching touch key 77, the touched switching touch key enters a highlighted state, as shown by 87 in fig. 8; at this time the terminal receives the switching instruction and, in response to it, switches the first interface to a second interface including second sub-operation objects. For example, referring to fig. 8, the second sub-operation objects include a voice control key 83, a High Dynamic Range (HDR) key 84, and a front/rear camera switching key 85. The second operation object comprises the first sub-operation objects and the second sub-operation objects.
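The switching between the first interface and the second interface of sub-operation objects can be pictured as a simple page toggle; the sketch below uses assumed names only.

```kotlin
// Sketch: the switching touch key toggles between the page of first sub-operation
// objects and the page of second sub-operation objects.
class SubObjectPager(
    private val firstInterface: List<String>,
    private val secondInterface: List<String>
) {
    private var showingFirst = true
    fun currentPage(): List<String> = if (showingFirst) firstInterface else secondInterface
    fun onSwitchInstruction(): List<String> {
        showingFirst = !showingFirst
        return currentPage()
    }
}

fun main() {
    val pager = SubObjectPager(
        firstInterface = listOf("camera key 73", "photographing key 74", "photographing mode key 75"),
        secondInterface = listOf("voice control key 83", "HDR key 84", "front/rear camera switching key 85")
    )
    println(pager.currentPage())          // first interface
    println(pager.onSwitchInstruction())  // second interface after the switching instruction
}
```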
In practical applications, the sector touch interface also supports dragging and zooming by the user: the user can enlarge or shrink the sector area to match the comfortable range of motion of the thumb, and a user who habitually holds the terminal at its lower-middle, middle, or upper-middle part can drag the sector touch interface to a position convenient for the thumb. The sector touch interface is displayed in a semi-transparent state, and its radius is approximately the length of a thumb.
Based on the foregoing embodiment, in another embodiment of the present invention, if the first shooting interface does not include the first operation object, the step S302 of displaying the second operation object on the first shooting interface may further include:
first, if the first shooting interface does not include the first operation object, the first shooting interface is displayed in a first part of a display area of the terminal.
And secondly, displaying the shooting preview interface in a second part of the display area, wherein the display area comprises the first part and the second part.
In practical applications, referring to fig. 8, if the terminal detects that the first shooting interface does not include the first operation object, the first shooting interface, showing for example the target puppy being photographed, is displayed in the first portion 82 of the display area of the terminal, and a shooting preview interface, showing for example a landscape picture stored by the terminal, is displayed in the second portion 81; the display area comprises the first portion and the second portion. The areas occupied by the first portion and the second portion may be equal or different, which is not specifically limited in this embodiment. The picture displayed in the shooting preview interface may be a picture stored in the terminal memory, and the user can change the displayed content by sliding left or right, or by other operations. In this way, the user can browse previously shot pictures in real time while taking pictures, without affecting the current shooting.
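The split of the display area into a part carrying the first shooting interface and a part carrying the shooting preview, with swiping through stored pictures, can be sketched as follows; the 50/50 split and file names are assumptions for illustration.

```kotlin
// Sketch: divide the display area into two parts and page through stored pictures
// in the preview part by swiping.
data class Area(val top: Int, val bottom: Int)

fun splitDisplay(height: Int, firstFraction: Double = 0.5): Pair<Area, Area> {
    val cut = (height * firstFraction).toInt()
    return Area(0, cut) to Area(cut, height)   // first part / second part
}

class PreviewPager(private val storedPictures: List<String>) {
    private var index = 0
    fun current(): String = storedPictures[index]
    fun swipe(delta: Int): String {
        index = (index + delta).coerceIn(0, storedPictures.size - 1)
        return current()
    }
}

fun main() {
    val (firstPart, secondPart) = splitDisplay(height = 1920)
    println("first part: $firstPart, second part: $secondPart")
    val pager = PreviewPager(listOf("landscape.jpg", "portrait.jpg", "puppy.jpg"))
    println(pager.swipe(+1))   // slide to the next stored picture
}
```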
In this embodiment, after the step S302, the information display method provided by the present invention may further include: receiving a hiding instruction; and hiding the second operation object and displaying a second shooting interface on the display area of the terminal in response to the hiding instruction.
In practical applications, after the user completes an operation on the second operation object, that is, after the terminal has responded to a trigger instruction and executed the operation corresponding to the first operation object based on the mapping relationship between the second operation object and the first operation object, the terminal may further receive a hiding instruction and, in response to it, hide the second operation object and display the second shooting interface in the display area of the terminal. The terminal can thus restore the original shooting interface automatically, without the user having to do so manually, which improves the degree of intelligence of the terminal.
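The hide step can be sketched as a small state change, again with assumed names: the second operation object is hidden and the original second shooting interface is shown again.

```kotlin
// Sketch: state transition performed in response to the hiding instruction.
class ShootingUiState {
    var secondObjectVisible: Boolean = true
        private set
    var shownInterface: String = "first shooting interface"
        private set

    fun onHideInstruction() {
        secondObjectVisible = false
        shownInterface = "second shooting interface"   // original, un-moved interface
    }
}

fun main() {
    val state = ShootingUiState()
    state.onHideInstruction()
    println("${state.secondObjectVisible} / ${state.shownInterface}")  // false / second shooting interface
}
```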
As can be seen from the above, in the information display method provided by the embodiment of the present invention, the terminal detects whether a first shooting interface of the terminal includes a first operation object, the first operation object being used to operate a second shooting interface; and, if the first shooting interface does not include the first operation object, displays on the first shooting interface a second operation object used to operate the first shooting interface. That is, after the second shooting interface is moved to obtain the first shooting interface, even if the first operation object originally displayed in the second shooting interface for operating it has disappeared from the display area of the terminal, the moved first shooting interface can still be operated through the second operation object displayed on it. Therefore, the information display method solves the prior-art technical problem that the virtual keys disappear after the shooting interface is moved, making normal shooting impossible, improves the degree of intelligence of the terminal, and improves the user experience.
Based on the foregoing embodiments, an embodiment of the present invention provides a terminal, which may be applied to the information display method provided in the embodiment corresponding to fig. 3, and as shown in fig. 9, the terminal 90 includes: a memory 901 (corresponding to the memory 109 in fig. 1), a processor 902 (corresponding to the processor 110 in fig. 1), and a computer program 903 stored on the memory 901 and operable on the processor 902, wherein the memory 901 and the processor 902 are connected by a communication bus, and the processor 902 executes the computer program 903 to implement the following steps:
detecting whether a first shooting interface of the terminal comprises a first operation object, wherein the first operation object is used for operating a second shooting interface;
and if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface, wherein the second operation object is used for operating the first shooting interface.
Further, the processor may further implement the following steps when executing the program: if the first shooting interface does not comprise the first operation object, displaying the first shooting interface on a first part of a display area of the terminal;
and displaying the shooting preview interface in a second part of the display area, wherein the display area comprises the first part and the second part.
Further, the processor may further implement the following steps when executing the program: generating a second operation object, or determining a second operation object based on the first operation object.
Further, the processor may further implement the following steps when executing the program: setting touch operation corresponding to a second operation object;
and establishing a mapping relation between the second operation object and the first operation object.
Further, the processor may further implement the following steps when executing the program: receiving a trigger instruction aiming at a second operation object;
and responding to the trigger instruction, and executing the operation corresponding to the first operation object based on the mapping relation between the second operation object and the first operation object.
Further, the processor may further implement the following steps when executing the program: receiving a hiding instruction;
and hiding the second operation object and displaying a second shooting interface on the display area of the terminal in response to the hiding instruction.
Further, the processor may further implement the following steps when executing the program: if the first shooting interface does not comprise the first operation object, displaying a first interface comprising a first sub-operation object on the first shooting interface;
receiving a switching instruction, responding to the switching instruction, and switching the first interface into a second interface comprising a second sub-operation object; the second operation object comprises a first sub operation object and a second sub operation object.
In practical applications, the processor may be implemented by a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), or the like.
Here, it should be noted that the description of the terminal embodiment is similar to that of the method embodiment and has the same beneficial effects, so it is not repeated. For technical details not disclosed in the terminal embodiment of the present invention, those skilled in the art should refer to the description of the method embodiment, which is therefore not detailed here. According to the terminal provided by the embodiment of the present invention, it is detected whether a first shooting interface of the terminal includes a first operation object, the first operation object being used to operate a second shooting interface; and, if the first shooting interface does not include the first operation object, a second operation object used to operate the first shooting interface is displayed on the first shooting interface. That is, after the second shooting interface is moved to obtain the first shooting interface, even if the first operation object originally displayed in the second shooting interface for operating it has disappeared from the display area of the terminal, the moved first shooting interface can still be operated through the second operation object displayed on it. Therefore, the technical problem in the prior art that the virtual keys disappear after the shooting interface is moved, making normal shooting impossible, is solved, the degree of intelligence of the terminal is improved, and the user experience is improved.
Based on the foregoing method embodiments, the present embodiment provides a computer-readable storage medium, which can be applied to a mobile terminal in one or more embodiments, where the computer-readable storage medium stores one or more programs, and the one or more programs are executable by one or more processors to implement the following steps: detecting whether a first shooting interface of the terminal comprises a first operation object, wherein the first operation object is used for operating a second shooting interface;
and if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface, wherein the second operation object is used for operating the first shooting interface.
Further, the processor may further implement the following steps when executing the program: if the first shooting interface does not comprise the first operation object, displaying the first shooting interface on a first part of a display area of the terminal;
and displaying the shooting preview interface in a second part of the display area, wherein the display area comprises the first part and the second part.
Further, the processor may further implement the following steps when executing the program: generating a second operation object; or
based on the first operation object, a second operation object is determined.
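The two alternatives above, generating a new second operation object or determining it based on the first operation object, can be pictured with the hypothetical sketch below; the identifiers and the copy-based derivation are assumptions made for illustration.

```kotlin
// Hypothetical illustration of the two alternatives: either create a brand-new
// second operation object, or derive it from the first operation object so it
// inherits that object's behaviour.
data class OperationObject(val id: String, val action: String)

// Alternative 1: generate an independent second operation object.
fun generateSecondObject(): OperationObject =
    OperationObject(id = "second_op", action = "capture")

// Alternative 2: determine the second object based on the first one,
// copying its action so that triggering it has the same effect.
fun deriveSecondObject(first: OperationObject): OperationObject =
    first.copy(id = "second_op")

fun main() {
    val first = OperationObject(id = "shutter", action = "capture")
    println(generateSecondObject())
    println(deriveSecondObject(first))
}
```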
Further, the processor may further implement the following steps when executing the program: setting touch operation corresponding to a second operation object;
and establishing a mapping relation between the second operation object and the first operation object.
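The configuration step above might look like the following hypothetical sketch, in which a tap gesture is assigned to the second operation object and a simple lookup table records the mapping relation; the data structures are illustrative assumptions rather than the disclosed implementation.

```kotlin
// Hypothetical sketch: assign a touch operation to the second operation object
// and record which first operation object it maps to.
enum class TouchOperation { TAP, LONG_PRESS, DOUBLE_TAP }

data class OperationObject(val id: String)

class OperationMapping {
    private val touchOps = mutableMapOf<String, TouchOperation>()
    private val mapping = mutableMapOf<String, String>()  // second id -> first id

    fun setTouchOperation(obj: OperationObject, op: TouchOperation) {
        touchOps[obj.id] = op
    }

    fun establishMapping(second: OperationObject, first: OperationObject) {
        mapping[second.id] = first.id
    }

    fun mappedFirstObjectId(second: OperationObject): String? = mapping[second.id]
}

fun main() {
    val first = OperationObject("shutter")
    val second = OperationObject("second_shutter")
    val table = OperationMapping()
    table.setTouchOperation(second, TouchOperation.TAP)
    table.establishMapping(second, first)
    println("Second object maps to: ${table.mappedFirstObjectId(second)}")
}
```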
Further, the processor may further implement the following steps when executing the program: receiving a trigger instruction aiming at a second operation object;
and responding to the trigger instruction, and executing the operation corresponding to the first operation object based on the mapping relation between the second operation object and the first operation object.
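A hypothetical sketch of the trigger step follows: a trigger instruction aimed at the second operation object is resolved through the mapping relation, and the operation associated with the first operation object is executed. The dispatcher structure and names are assumptions made for illustration.

```kotlin
// Hypothetical dispatcher: a trigger on the second operation object is routed,
// via the mapping relation, to the operation of the first operation object.
data class OperationObject(val id: String)

class TriggerDispatcher(
    private val mapping: Map<String, String>,          // second id -> first id
    private val operations: Map<String, () -> Unit>    // first id -> operation
) {
    fun onTrigger(target: OperationObject) {
        val firstId = mapping[target.id] ?: return      // no mapping relation, ignore
        operations[firstId]?.invoke()                   // run the first object's operation
    }
}

fun main() {
    val dispatcher = TriggerDispatcher(
        mapping = mapOf("second_shutter" to "shutter"),
        operations = mapOf("shutter" to { println("Picture captured") })
    )
    // The user triggers the second operation object on the first shooting interface.
    dispatcher.onTrigger(OperationObject("second_shutter"))
}
```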
Further, the processor may further implement the following steps when executing the program: receiving a hiding instruction;
and hiding the second operation object and displaying a second shooting interface on the display area of the terminal in response to the hiding instruction.
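The hiding step could be modelled as in the sketch below, where the hide instruction removes the second operation object and restores the second shooting interface across the display area; the state flags are illustrative assumptions.

```kotlin
// Hypothetical sketch of the hiding step.
class ShootingScreen {
    private var secondObjectVisible = true
    private var currentInterface = "first"   // the moved shooting interface

    fun onHideInstruction() {
        secondObjectVisible = false          // hide the substitute control
        currentInterface = "second"          // show the full second shooting interface again
        println("Second operation object hidden; second shooting interface displayed")
    }

    fun state() = "interface=$currentInterface, secondObjectVisible=$secondObjectVisible"
}

fun main() {
    val screen = ShootingScreen()
    screen.onHideInstruction()
    println(screen.state())
}
```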
Further, the processor may further implement the following steps when executing the program: if the first shooting interface does not comprise the first operation object, displaying a first interface comprising a first sub-operation object on the first shooting interface;
receiving a switching instruction, responding to the switching instruction, and switching the first interface into a second interface comprising a second sub-operation object; the second operation object comprises a first sub operation object and a second sub operation object.
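The sub-operation-object arrangement above can be pictured as two pages that a switching instruction toggles between, as in the hypothetical sketch below; the page contents and class names are assumptions made for illustration.

```kotlin
// Hypothetical paging of the second operation object into sub-operation objects:
// a switching instruction flips between the first and second interface.
data class SubOperationObject(val name: String)

class PagedOperationPanel(
    private val firstPage: List<SubOperationObject>,
    private val secondPage: List<SubOperationObject>
) {
    private var showingFirst = true

    fun visibleObjects(): List<SubOperationObject> =
        if (showingFirst) firstPage else secondPage

    // Called when a switching instruction is received.
    fun onSwitchInstruction() {
        showingFirst = !showingFirst
    }
}

fun main() {
    val panel = PagedOperationPanel(
        firstPage = listOf(SubOperationObject("shutter"), SubOperationObject("flash")),
        secondPage = listOf(SubOperationObject("filter"), SubOperationObject("timer"))
    )
    println("First interface:  ${panel.visibleObjects()}")
    panel.onSwitchInstruction()
    println("Second interface: ${panel.visibleObjects()}")
}
```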
The computer-readable storage medium may be a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); it may also be any of various electronic devices that include one or any combination of the above-mentioned memories, such as a mobile phone, a computer, a tablet device, or a personal digital assistant.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (8)
1. An information display method, characterized in that the method comprises:
detecting whether a first shooting interface of a terminal comprises a first operation object, wherein the first operation object is used for operating a second shooting interface;
if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface, wherein the second operation object is used for operating the first shooting interface;
the first shooting interface is a shooting interface generated after the second shooting interface is moved;
wherein, if the first shooting interface does not include the first operation object, displaying a second operation object on the first shooting interface includes:
if the first shooting interface does not comprise the first operation object, displaying the first shooting interface on a first part of a display area of the terminal; the display area comprises the first portion and a second portion;
displaying a shooting preview interface on the second part; and pictures stored in the terminal memory are displayed in the shooting preview interface, so that a user can browse the shot pictures in real time in the process of shooting the pictures.
2. The method according to claim 1, wherein if the first operation object is not included in the first shooting interface, before displaying a second operation object on the first shooting interface, the method further comprises:
generating the second operation object; or
and determining the second operation object based on the first operation object.
3. The method according to claim 1, wherein if the first shooting interface does not include the first operation object, displaying a second operation object on the first shooting interface, further comprising:
setting touch operation corresponding to the second operation object;
and establishing a mapping relation between the second operation object and the first operation object.
4. The method according to claim 3, wherein if the first operation object is not included in the first shooting interface, after displaying a second operation object on the first shooting interface, further comprising:
receiving a trigger instruction aiming at the second operation object;
and responding to the trigger instruction, and executing the operation corresponding to the first operation object based on the mapping relation between the second operation object and the first operation object.
5. The method according to claim 4, wherein after the performing, in response to the trigger instruction, the operation corresponding to the first operation object based on the mapping relationship between the second operation object and the first operation object, further comprises:
receiving a hiding instruction;
and responding to the hiding instruction, hiding the second operation object and displaying the second shooting interface on a display area of the terminal.
6. The method according to claim 1, wherein if the first operation object is not included in the first shooting interface, displaying a second operation object on the first shooting interface comprises:
if the first shooting interface does not comprise the first operation object, displaying a first interface comprising a first sub-operation object on the first shooting interface;
receiving a switching instruction, responding to the switching instruction, and switching the first interface into a second interface comprising a second sub-operation object; wherein the second operation object comprises the first sub operation object and the second sub operation object.
7. A terminal, characterized in that the terminal comprises a processor, a memory and a communication bus;
the communication bus is used for realizing connection and communication between the processor and the memory;
the processor is used for executing the information display program stored in the memory so as to realize the following steps:
detecting whether a first shooting interface of a terminal comprises a first operation object, wherein the first operation object is used for operating a second shooting interface;
if the first shooting interface does not comprise the first operation object, displaying a second operation object on the first shooting interface, wherein the second operation object is used for operating the first shooting interface;
the first shooting interface is a shooting interface generated after the second shooting interface is moved;
wherein, if the first shooting interface does not include the first operation object, displaying a second operation object on the first shooting interface includes:
if the first shooting interface does not comprise the first operation object, displaying the first shooting interface on a first part of a display area of the terminal; the display area comprises the first portion and a second portion;
displaying a shooting preview interface on the second part; and pictures stored in the terminal memory are displayed in the shooting preview interface, so that a user can browse the shot pictures in real time in the process of shooting the pictures.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores an information display program which, when executed by a processor, implements the steps of the information display method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710761768.4A CN107566721B (en) | 2017-08-30 | 2017-08-30 | Information display method, terminal and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710761768.4A CN107566721B (en) | 2017-08-30 | 2017-08-30 | Information display method, terminal and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107566721A CN107566721A (en) | 2018-01-09 |
CN107566721B true CN107566721B (en) | 2020-06-26 |
Family
ID=60977944
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710761768.4A Active CN107566721B (en) | 2017-08-30 | 2017-08-30 | Information display method, terminal and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107566721B (en) |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3958557B1 (en) | 2015-04-23 | 2024-09-25 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
US10009536B2 (en) | 2016-06-12 | 2018-06-26 | Apple Inc. | Applying a simulated optical effect based on data received from multiple camera sensors |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | User interfaces for simulated depth effects |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
CN111901477B (en) * | 2019-05-06 | 2021-05-25 | 苹果公司 | User interface for capturing and managing visual media |
US11054973B1 (en) | 2020-06-01 | 2021-07-06 | Apple Inc. | User interfaces for managing media |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US12112024B2 (en) | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103581561A (en) * | 2013-10-30 | 2014-02-12 | 广东欧珀移动通信有限公司 | Human and scene image synthesis method and system based on rotary camera lens photographing |
CN104657073A (en) * | 2015-01-22 | 2015-05-27 | 上海华豚科技有限公司 | Half-screen operating method of mobile phone interface |
CN106610821A (en) * | 2015-10-22 | 2017-05-03 | 青岛海信电器股份有限公司 | Method of displaying picture on terminal and terminal |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5306266B2 (en) * | 2010-03-15 | 2013-10-02 | キヤノン株式会社 | Imaging apparatus and control method thereof |
JP5997921B2 (en) * | 2012-04-13 | 2016-09-28 | 株式会社MetaMoJi | Character input method and character input device |
CN104049883A (en) * | 2013-03-15 | 2014-09-17 | 广州三星通信技术研究有限公司 | Method and system for displaying subscreen and operating in subscreen |
CN104007930B (en) * | 2014-06-09 | 2015-11-25 | 努比亚技术有限公司 | A kind of mobile terminal and realize the method and apparatus of one-handed performance |
CN104049846A (en) * | 2014-06-24 | 2014-09-17 | 联想(北京)有限公司 | Information processing method and electronic device |
US20160034131A1 (en) * | 2014-07-31 | 2016-02-04 | Sony Corporation | Methods and systems of a graphical user interface shift |
CN104375776A (en) * | 2014-11-10 | 2015-02-25 | 格科微电子(上海)有限公司 | Touch control equipment and touch control method thereof |
CN104536664A (en) * | 2014-12-25 | 2015-04-22 | 深圳市金立通信设备有限公司 | Shutter position determining method |
CN106210492A (en) * | 2015-04-29 | 2016-12-07 | 阿里巴巴集团控股有限公司 | A kind of method and apparatus realizing shoot function |
CN106547462B (en) * | 2015-09-22 | 2020-09-04 | 北京小米移动软件有限公司 | Photographing control method and device and mobile terminal |
CN105892854A (en) * | 2016-03-30 | 2016-08-24 | 乐视控股(北京)有限公司 | Photographing parameter menu loading method and device |
CN106406661A (en) * | 2016-09-09 | 2017-02-15 | 北京小米移动软件有限公司 | Displaying method and device for photographing interface, and terminal device |
2017-08-30: Application CN201710761768.4A filed in China (published as CN107566721B); current status: Active.
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103581561A (en) * | 2013-10-30 | 2014-02-12 | 广东欧珀移动通信有限公司 | Human and scene image synthesis method and system based on rotary camera lens photographing |
CN104657073A (en) * | 2015-01-22 | 2015-05-27 | 上海华豚科技有限公司 | Half-screen operating method of mobile phone interface |
CN106610821A (en) * | 2015-10-22 | 2017-05-03 | 青岛海信电器股份有限公司 | Method of displaying picture on terminal and terminal |
Also Published As
Publication number | Publication date |
---|---|
CN107566721A (en) | 2018-01-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107566721B (en) | Information display method, terminal and computer readable storage medium | |
CN108037893B (en) | Display control method and device of flexible screen and computer readable storage medium | |
CN107943400B (en) | Double-sided screen switching method, mobile terminal and readable storage medium | |
CN107943367B (en) | Interface display method of double-screen terminal, double-screen terminal and computer storage medium | |
CN107948360B (en) | Shooting method of flexible screen terminal, terminal and computer readable storage medium | |
CN107145293A (en) | A kind of screenshot method, mobile terminal and storage medium | |
CN108322647B (en) | Panoramic image shooting method, mobile terminal and computer readable storage medium | |
CN109710135A (en) | Split screen display available control method, terminal and computer readable storage medium | |
CN108037887B (en) | Method for constructing virtual key of terminal, terminal and computer readable storage medium | |
CN108415636A (en) | A kind of generation method, mobile terminal and the storage medium of suspension button | |
CN109542325B (en) | Double-sided screen touch method, double-sided screen terminal and readable storage medium | |
CN109085990A (en) | A kind of gestural control method, mobile terminal and computer readable storage medium | |
CN109471579A (en) | Terminal screen arrangement information method of adjustment, device, mobile terminal and storage medium | |
CN107515691A (en) | A kind of touch control display method and mobile terminal, storage medium | |
CN108184052A (en) | A kind of method of video record, mobile terminal and computer readable storage medium | |
CN108055463A (en) | Image processing method, terminal and storage medium | |
CN109062465A (en) | A kind of application program launching method, mobile terminal and storage medium | |
CN107979727A (en) | A kind of document image processing method, mobile terminal and computer-readable storage medium | |
CN108848298B (en) | Picture shooting method, flexible terminal and computer readable storage medium | |
CN108200332A (en) | A kind of pattern splicing method, mobile terminal and computer readable storage medium | |
CN108153477B (en) | Multi-touch operation method, mobile terminal and computer-readable storage medium | |
CN108200327B (en) | Shooting control method, flexible screen terminal and computer readable storage medium | |
CN107422956B (en) | Mobile terminal operation response method, mobile terminal and readable storage medium | |
CN107656678B (en) | Long screenshot realization method, terminal and computer readable storage medium | |
CN109739414B (en) | Picture processing method, mobile terminal and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
TA01 | Transfer of patent application right | Effective date of registration: 20200529; Address after: 100193 room 2340, building No. 2, Beijing Zhongguancun Software Park incubator, Beijing Zhongguancun, Haidian District, Northeast China; Applicant after: BEIJING GREATMAP TECHNOLOGY Co.,Ltd.; Address before: 518000 Guangdong Province, Shenzhen high tech Zone of Nanshan District City, No. 9018 North Central Avenue's innovation building A, 6-8 layer, 10-11 layer, B layer, C District 6-10 District 6 floor; Applicant before: NUBIA TECHNOLOGY Co.,Ltd. ||
GR01 | Patent grant | ||