
CN114489429B - Terminal equipment, long screen capturing method and storage medium - Google Patents


Info

Publication number
CN114489429B
CN114489429B (application CN202210113003.0A)
Authority
CN
China
Prior art keywords
window
image
interface
terminal device
rolling
Prior art date
Legal status
Active
Application number
CN202210113003.0A
Other languages
Chinese (zh)
Other versions
CN114489429A (en)
Inventor
袁高阳
何琦
Current Assignee
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd filed Critical Hisense Mobile Communications Technology Co Ltd
Priority to CN202210113003.0A priority Critical patent/CN114489429B/en
Priority to CN202410281794.7A priority patent/CN118113192A/en
Publication of CN114489429A publication Critical patent/CN114489429A/en
Application granted granted Critical
Publication of CN114489429B publication Critical patent/CN114489429B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application provides a terminal device, a long-screenshot method, and a storage medium, and relates to the field of computer technology. In response to a screenshot operation input by a user, the terminal device can detect whether its current display interface contains a scrolling window; if the current display interface contains a scrolling window, it displays a long-screenshot control for that window; and in response to a trigger operation on the long-screenshot control, the terminal device can generate a long screenshot picture based on the scrolling window. In this way, a long-screenshot operation can be performed on a scrolling window contained in the current display interface of the terminal device, which saves the user's time and the terminal device's storage space.

Description

Terminal equipment, long screen capturing method and storage medium
Technical Field
The present application relates to the field of computer technology, and in particular to a terminal device, a long-screenshot method, and a storage medium.
Background
With the popularization and development of intelligent terminal devices, terminal devices offer more and more functions, for example a screen-capture function with which the device can capture all of the content displayed on its screen.
At present, a terminal device can only capture the entire content of the current display interface; it cannot capture part of that content. In particular, how to save through a screenshot both the content of a floating window that is displayed on the screen and the content of that window that is not displayed remains an unresolved problem.
Disclosure of Invention
To solve the problems in the prior art, the embodiments of the present application provide a terminal device, a long-screenshot method, and a storage medium, which can capture the content of a floating window contained in the current display interface of the terminal device.
In a first aspect, an embodiment of the present application provides a terminal device, including: a display, a memory, and a processor;
The display is configured to display an interface of the terminal device when in operation;
The memory is configured to store a program or data used by the terminal device to operate;
The processor is configured to: in response to a screenshot operation input by a user, detect whether the current display interface contains a scrolling window; if the current display interface contains a scrolling window, display a long-screenshot control for the scrolling window; and, in response to a trigger operation on the long-screenshot control, generate a long screenshot picture based on the scrolling window.
In one possible implementation, the processor is specifically configured to:
in response to a screenshot operation input by a user, detect whether the current display interface contains a floating window, where a floating window is a window covering part of the display screen of the terminal device; and
if the current display interface contains a floating window, determine whether the floating window is a scrolling window.
In one possible implementation, the processor is specifically configured to:
in response to a screenshot operation input by a user, acquire the layer data of each layer contained in the current display interface;
determine the area of each layer based on its layer data;
determine the ratio of each layer's area to that of the display screen; and
if at least one area ratio falls within a preset range, determine that the current display interface contains a floating window.
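The area-ratio check above can be sketched as follows. This is an illustrative model, not code from the patent: the layer bounding boxes, the screen resolution, and the preset ratio range (0.2 to 0.9 here) are all hypothetical values chosen for demonstration.

```python
def layer_area(bounds):
    # bounds: (left, top, right, bottom) of one layer, in pixels
    left, top, right, bottom = bounds
    return max(0, right - left) * max(0, bottom - top)

def has_floating_window(layer_bounds, screen_w, screen_h, ratio_range=(0.2, 0.9)):
    # A layer whose area ratio to the screen falls inside the preset
    # range covers only part of the display, i.e. is a floating window.
    screen_area = screen_w * screen_h
    low, high = ratio_range
    return any(low <= layer_area(b) / screen_area <= high
               for b in layer_bounds)

# A full-screen layer (ratio 1.0, outside the range) plus a smaller overlay:
layers = [(0, 0, 1080, 2400), (100, 400, 980, 1800)]
print(has_floating_window(layers, 1080, 2400))  # → True
```

A full-screen layer has ratio 1.0 and is excluded by the upper bound, so only partial-coverage layers count as floating windows.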
In one possible implementation, the processor is specifically configured to:
acquire a first interface currently displayed by the floating window;
perform a scrolling operation on the floating window;
acquire a second interface displayed by the scrolled floating window, and compare the first interface with the second interface; and
if the first interface and the second interface differ, determine that the floating window is a scrolling window.
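The scroll-and-compare check can be sketched as follows. `FakeWindow` is a hypothetical stand-in for a floating window (a list of content rows viewed through a fixed-height viewport), not an Android API.

```python
class FakeWindow:
    # Hypothetical floating window: `rows` is the full content,
    # of which `visible` rows are shown at a time.
    def __init__(self, rows, visible):
        self.rows, self.visible, self.top = list(rows), visible, 0

    def capture(self):
        # Interface currently shown by the window.
        return tuple(self.rows[self.top:self.top + self.visible])

    def scroll(self):
        # Scroll down by one page, clamped to the end of the content.
        limit = max(0, len(self.rows) - self.visible)
        self.top = min(self.top + self.visible, limit)

def is_scrolling_window(win):
    # Capture the first interface, scroll once, capture the second;
    # if the two differ, the floating window is a scrolling window.
    first = win.capture()
    win.scroll()
    second = win.capture()
    return first != second

print(is_scrolling_window(FakeWindow(range(10), visible=3)))  # → True
print(is_scrolling_window(FakeWindow(range(2), visible=3)))   # → False
```

A window whose content fits entirely in the viewport does not change when scrolled, so it is correctly classified as non-scrolling.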
In one possible implementation, the processor is specifically configured to:
in response to a trigger operation on the long-screenshot control, perform at least one scrolling operation on the scrolling window, and acquire a first image and a second image for each scrolling operation, where the first image is the interface image of the scrolling window before the scrolling operation and the second image is the interface image after it; and
stitch the first and second images of each scrolling operation together in sequence to generate the long screenshot picture.
In one possible implementation, the processor is specifically configured to:
perform one scrolling operation on the scrolling window, and acquire the first image and the second image corresponding to that operation; and
if the acquired first image and second image differ, repeat the scrolling operation until the acquired first image and second image no longer differ.
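This stop condition can be sketched minimally by modeling each capture as an opaque frame; the frame values below are hypothetical. Scrolling stops changing the window once the end of the content is reached, so the last frame repeats.

```python
def collect_scroll_pairs(frames):
    # `frames` is the sequence of interface images produced by repeated
    # scrolling.  Collect the (first image, second image) pair of every
    # effective scroll; stop at the first scroll that changes nothing.
    pairs = []
    for first, second in zip(frames, frames[1:]):
        if first == second:      # no difference: end of content reached
            break
        pairs.append((first, second))
    return pairs

frames = ["page-1", "page-2", "page-3", "page-3"]  # last scroll changes nothing
print(collect_scroll_pairs(frames))
# → [('page-1', 'page-2'), ('page-2', 'page-3')]
```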
In one possible implementation, the processor is specifically configured to:
for the first image and the second image of each scrolling operation, respectively: determine the coordinate position of the difference region between the two images, and, based on that position, determine a stitching region in each image;
stitch the stitching regions of the images together in sequence to obtain a stitched long-screenshot picture; and
generate the long screenshot picture based on the stitched long-screenshot picture.
In one possible implementation, the processor is specifically configured to:
stitch the stitched long-screenshot picture together with the image of the non-stitching region to obtain the long screenshot picture, where the non-stitching region is the region of any one image outside its stitching region.
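A simplified sketch of the difference-region stitching, modeling each image as a list of pixel rows. It assumes each scroll reveals exactly the rows of the difference region (a fixed header plus a fully refreshed scrollable area), which is a deliberate simplification of the patent's procedure, and the row contents are hypothetical.

```python
def diff_region(first, second):
    # (top, bottom) coordinates of the differing region between two
    # equal-height images, each modeled as a list of pixel rows.
    differing = [i for i, (a, b) in enumerate(zip(first, second)) if a != b]
    return differing[0], differing[-1] + 1

def stitch_long_screenshot(pairs):
    # Keep the first image whole, then append only the differing
    # (newly revealed) region of each second image in sequence.
    long_img = list(pairs[0][0])
    for first, second in pairs:
        top, bottom = diff_region(first, second)
        long_img.extend(second[top:bottom])
    return long_img

pairs = [
    (["header", "A", "B"], ["header", "C", "D"]),  # first scroll
    (["header", "C", "D"], ["header", "E", "F"]),  # second scroll
]
print(stitch_long_screenshot(pairs))
# → ['header', 'A', 'B', 'C', 'D', 'E', 'F']
```

The unchanged "header" row falls outside every difference region, so it appears only once in the result, while all scrolled-in rows are appended in order.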
In a second aspect, an embodiment of the present application provides a long screen capturing method, where the method includes:
in response to a screenshot operation input by a user, detecting whether the current display interface contains a scrolling window;
if the current display interface contains a scrolling window, displaying a long-screenshot control for the scrolling window; and
in response to a trigger operation on the long-screenshot control, generating a long screenshot picture based on the scrolling window.
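The three steps above can be sketched end to end as follows; the `Device` class and its member names are hypothetical, used only to make the control flow concrete.

```python
class Device:
    # Hypothetical terminal device exposing the three claimed steps.
    def __init__(self, has_scrolling_window, user_triggers_control):
        self.has_scrolling_window = has_scrolling_window
        self.user_triggers_control = user_triggers_control
        self.shown_control = False

    def detect_scrolling_window(self):
        return self.has_scrolling_window

    def show_long_screenshot_control(self):
        self.shown_control = True

    def generate_long_screenshot(self):
        return "long-screenshot.png"

def on_screenshot_operation(device):
    # Step 1: detect a scrolling window in the current display interface.
    if not device.detect_scrolling_window():
        return None
    # Step 2: display the long-screenshot control for that window.
    device.show_long_screenshot_control()
    # Step 3: on a trigger operation, generate the long screenshot.
    if device.user_triggers_control:
        return device.generate_long_screenshot()
    return None

print(on_screenshot_operation(Device(True, True)))   # → long-screenshot.png
print(on_screenshot_operation(Device(False, True)))  # → None
```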
In a third aspect, an embodiment of the present application provides a long screen capturing device, including:
a detection unit, configured to detect, in response to a screenshot operation input by a user, whether the current display interface contains a scrolling window;
a display unit, configured to display a long-screenshot control for the scrolling window if the current display interface contains one; and
a generation unit, configured to generate, in response to a trigger operation on the long-screenshot control, a long screenshot picture based on the scrolling window.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, implements the method of the second aspect or any of the possible implementations of the second aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program or instructions which, when executed by a processor, implement the method of the second aspect or any of its possible implementations.
In the embodiments of the present application, the terminal device can, in response to a screenshot operation input by a user, detect whether its current display interface contains a scrolling window; if the current display interface contains a scrolling window, display a long-screenshot control for that window; and, in response to a trigger operation on the long-screenshot control, generate a long screenshot picture based on the scrolling window. In this way, the terminal device can save a scrollable floating window through a screenshot, that is, both the content of the window displayed on the screen and the content not displayed on it.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 2 is a software architecture block diagram of a terminal device according to an embodiment of the present application;
FIG. 3 is a flowchart of a long screen capturing method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a mobile phone display interface according to an embodiment of the present application;
fig. 5 is a schematic diagram of a pull-down menu of a mobile phone according to an embodiment of the present application;
Fig. 6 is a schematic diagram of each layer included in a mobile phone display interface according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a widget layer according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a widget screenshot control according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a coordinate system for a floating window according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a display interface of a scrolled floating window according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a long screen capture control for displaying a widget according to an embodiment of the present application;
FIG. 12 is a schematic illustration of a difference value provided by an embodiment of the present application;
FIG. 13 is a schematic diagram of determining a scrollable area included in a scrollable window according to an embodiment of the present application;
FIG. 14 is a schematic view of a first image and a second image according to an embodiment of the present application;
FIG. 15 is a schematic illustration of a difference value provided by an embodiment of the present application;
FIG. 16 is a schematic view of a scrolling window stitching region according to an embodiment of the present application;
FIG. 17 is a schematic view of another first image and a second image according to an embodiment of the present application;
FIG. 18 is a schematic representation of another difference provided by an embodiment of the present application;
FIG. 19 is a schematic view of another scrolling window stitching region according to an embodiment of the present application;
FIG. 20 is a schematic diagram of splicing regions according to an embodiment of the present application;
FIG. 21 is a schematic diagram of a generated long screenshot provided by an embodiment of the present application;
fig. 22 is a block diagram of a long screen capturing device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
It should be noted that the application scenarios described in the following embodiments are intended to describe the technical solutions of the embodiments more clearly and do not limit them; those skilled in the art will appreciate that, as new application scenarios emerge, the technical solutions provided by the embodiments of the present application remain applicable to similar technical problems.
To enable a terminal device to capture part of the content contained in its current display interface, the embodiments of the present application provide a terminal device, a long-screenshot method, and a storage medium. The terminal device can, in response to a screenshot operation input by a user, detect whether its current display interface contains a scrolling window; if so, display a long-screenshot control for the scrolling window; and, in response to a trigger operation on that control, generate a long screenshot picture based on the scrolling window. In this way, the terminal device can save a scrollable floating window through a screenshot, that is, both the content of the window displayed on the screen and the content not displayed on it.
In the embodiments of the present application, a long screenshot refers to capturing, on the screen of a mobile terminal, content that the current screen can display by scrolling up and down and that exceeds the display range of the screen, in part or in whole, and saving it as a picture.
The terminal device provided by the embodiments of the present application may be a portable device such as a mobile phone, a wearable device, or a tablet computer. Fig. 1 schematically shows a block diagram of the hardware configuration of a terminal device according to an embodiment of the present application. It should be understood that the terminal device 100 shown in Fig. 1 is only one example: the terminal device 100 may have more or fewer components than shown in Fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in the figure may be implemented in hardware, software, or a combination of the two, including one or more signal-processing and/or application-specific integrated circuits.
As shown in fig. 1, the terminal device 100 includes: communication component 110, processor 120, memory 130, display 140, input component 150, audio circuitry 160, SIM card interface 170, and sensor 180.
The communication component 110 is configured to receive or send call requests, receive and send signals during a call, and connect to a server to upload or download data. The communication component 110 may include an RF (radio frequency) circuit 111 and a Wi-Fi (Wireless Fidelity) module 112.
The RF circuit 111 may be used for receiving and transmitting signals during the process of receiving and transmitting information or communication, and may receive downlink data of the base station and then transmit the downlink data to the processor 120 for processing; uplink data may be sent to the base station. In general, RF circuitry 111 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. The RF circuit 111 may receive electromagnetic waves from an antenna, filter, amplify, and the like the received electromagnetic waves, and transmit the electromagnetic waves to a modem processor for demodulation. The RF circuit 111 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna to radiate. In some embodiments, at least some functional blocks of RF circuitry 111 may be disposed in processor 120. In some embodiments, at least some of the functional blocks of RF circuitry 111 may be disposed in the same device as at least some of the blocks of processor 120. The RF circuitry 111 and antenna of the terminal device 100 are coupled such that the terminal device 100 can communicate with a network and other devices through wireless communication techniques.
Wi-Fi belongs to a short-range wireless transmission technology, and the terminal device 100 can help a user to send and receive e-mail, browse web pages, access streaming media and the like through the Wi-Fi module 112, so that wireless broadband internet access is provided for the user. The Wi-Fi module 112 may connect to a router through which an external network is connected. The Wi-Fi module 112 may also connect to a server to upload or download data.
The memory 130 may be used to store data or program codes used by the terminal device when operating. The processor 120 performs various functions of the terminal device 100 and data processing by executing data or program codes stored in the memory 130. Memory 130 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device. The memory 130 stores an operating system that enables the terminal device 100 to operate.
The display 140 is used to display information input by a user or provided to the user, as well as the graphical user interface (GUI) of the various menus of the terminal device 100. In particular, the display 140 may include a display disposed on the front side of the terminal device 100, configured in the form of a liquid crystal display, light-emitting diodes, or the like. The display 140 may be used to display the interface of the terminal device while it is running.
The input component 150 may be used to receive numeric or character information entered by a user, various operations entered by the user, etc., and generate signal inputs related to user settings and function controls of the terminal device 100. In particular, the input component 150 may include keys and a touch screen, which may be disposed on the front side of the terminal device 100, may collect touch operations on or near the user, such as clicking buttons, dragging scroll boxes, and the like.
The touch screen may be covered on the display, and in some embodiments, the touch screen may be integrated with the display to implement the input and output functions of the terminal device 100, and after integration, the touch screen may be simply referred to as a touch display.
The terminal device 100 may further include a positioning module, such as a satellite positioning module or a mobile communication network positioning module, and may determine the geographic location of the terminal device 100 in real time.
Audio circuitry 160, speaker 161, microphone 162 may provide an audio interface between the user and terminal device 100. The audio circuit 160 may transmit the received electrical signal converted from audio data to the speaker 161, and the speaker 161 converts the electrical signal into a sound signal and outputs the sound signal. The terminal device 100 may also be configured with a volume button for adjusting the volume of the sound signal. On the other hand, the microphone 162 converts the collected sound signal into an electrical signal, receives it by the audio circuit 160, converts it into audio data, and outputs the audio data to the RF circuit 111 for transmission to, for example, another terminal, or outputs the audio data to the memory 130 for further processing.
The SIM card interface 170 is used to connect a SIM card. The SIM card may be connected to and separated from the terminal device 100 by being inserted into or withdrawn from the SIM card interface 170. The terminal device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 170 may support Nano SIM cards, Micro SIM cards, and the like. The same SIM card interface can be used to insert multiple cards simultaneously; the types of the cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards and with external memory cards. The terminal device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the terminal device 100 employs an eSIM, i.e., an embedded SIM card, which can be embedded in the terminal device 100 and cannot be separated from it. The SIM card is used to identify the user's mobile phone number.
The terminal device 100 may include a USB (universal serial bus) interface and the like in addition to the SIM card interface 170. The USB interface is used to connect a charging cable or other peripheral equipment; for example, the terminal device 100 may be connected to a charging cable through the USB interface. The components and modules in the terminal device 100 are connected by a bus.
The terminal device 100 may further comprise at least one sensor 180, such as an acceleration sensor 181, a distance sensor 182, a fingerprint sensor 183, a temperature sensor 184. The terminal device 100 may also be configured with other sensors such as gyroscopes, barometers, hygrometers, thermometers, infrared sensors, light sensors, motion sensors, and the like. For example, the fingerprint sensor 183 may be used to sense icons of the user's pointing interface with the terminal device 100.
The terminal device 100 may also include one or more cameras for capturing still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the processor 120 for conversion into a digital image signal.
The processor 120 is a control center of the terminal device 100, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing software programs stored in the memory 130, and calling data stored in the memory 130. In some embodiments, processor 120 may include one or more processing units. The processor 120 of the present application may run an operating system, an application, a user interface display, and a touch response, as well as the long screen capturing method of the present application. The specific process by which the processor 120 performs the long screen capture method will be described in detail below.
Fig. 2 is a software configuration block diagram of the terminal device 100 of the embodiment of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages. As shown in fig. 2, the application package may include applications such as cameras, gallery, calendar, talk, map, navigation, clock, bluetooth, music, video, short message, etc. The user may set an alarm clock in the clock application. The application layer may also include third party applications installed on the terminal device.
The application framework layer provides an application programming interface (Application Programming Interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. It may obtain the display size, determine whether there is a status bar, lock the screen, take screenshots, and so on.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include alarm clock data, video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is configured to provide communication functions for the terminal device, such as the management of call status (e.g., connected, hung up, etc.). The resource manager provides various resources for the application program, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message alerts, etc. The notification manager may also present notifications that appear in the system top status bar in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the terminal vibrates, or an indicator light blinks.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the function libraries that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL). The three-dimensional graphics processing library and the 2D graphics engine both belong to shared graphics resources.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
In order to further explain the technical solution provided by the embodiments of the present application, the following details are described with reference to the accompanying drawings and the detailed description. Although the embodiments of the present application provide the method operation steps shown in the following embodiments or figures, the method may include more or fewer operation steps on a routine or non-inventive basis. In steps where there is logically no necessary causal relationship, the execution order of the steps is not limited to the execution order provided by the embodiments of the present application. When the actual process or apparatus is executed, the method may be performed sequentially or in parallel in accordance with the method shown in the embodiments or drawings.
Fig. 3 shows a flowchart of a long screen capturing method according to an embodiment of the present application. The method may be applied to the terminal device shown in fig. 1, which may include, but is not limited to, a mobile phone. As shown in fig. 3, the method may include the steps of:
Step S301: in response to a screen capturing operation input by the user, detect whether the current display interface contains a rolling window.
In an alternative implementation manner, the mobile phone can, in response to the screen capturing operation input by the user, detect whether the current display interface contains a floating window; if the current display interface contains a floating window, determine whether the floating window is a rolling window. Here, a floating window refers to a window covering part of the display screen of the terminal device.
Specifically, when detecting whether a floating window is included in the current display interface, the mobile phone can respond to screen capturing operation input by a user to acquire layer data of each layer included in the current display interface; determining the area of each layer based on the layer data of each layer; determining the area ratio of each layer to the display screen based on the areas of each layer respectively; and if at least one area ratio is within the preset area ratio range, determining that the current display interface comprises the floating window.
Illustratively, in one embodiment, as shown in fig. 4, it is assumed that fig. 4 is the current display interface of the mobile phone, and a widget 401 is included in fig. 4. Alternatively, the user may take a screen capture as follows: the user can perform a pull-down operation from the top of the mobile phone to open the pull-down menu bar of the mobile phone; as shown in fig. 5, the pull-down menu bar of the mobile phone includes a screen capturing button 501, and the user can click the screen capturing button 501 in the pull-down menu bar to input the screen capturing operation. In response to the user clicking the screen capturing button 501, the mobile phone can acquire the layer data of each layer contained in its current display interface from the data buffer (Buffer). For example, each layer contained in the current display interface of the mobile phone may correspond to a set of Bitmap data, where the layers contained in the current display interface may include a status bar layer, a navigation bar layer, a background layer, and a widget layer. As shown in fig. 6, 601 in (a) is the status bar layer; 602 in (b) is the navigation bar layer; (c) is the background layer; 603 in (d) is the widget layer.
Because the widget layer needs to be overlaid on top of the background layer, as shown in fig. 4, a semi-transparent area is typically carried at the bottom of the widget layer, as shown in fig. 6 (d), where gray represents the semi-transparent area. Therefore, after obtaining the Bitmap data corresponding to each layer, the mobile phone needs to determine whether each layer contains a semi-transparent area; if so, the semi-transparent area is discarded, i.e., only the opaque area is retained. Specifically, after obtaining the Bitmap data corresponding to each layer, the mobile phone may discard a semi-transparent area contained in a layer according to the Alpha value used to identify transparency in the Bitmap data. For example, if the Alpha value is smaller than 255, which indicates that the area is semi-transparent, the area may be discarded. As shown in fig. 7, 701 is the widget layer after the semi-transparent area is discarded.
After the opaque areas in the layers are determined in the above manner, the mobile phone can determine the area of each layer according to the Bitmap data corresponding to the opaque area of that layer; determine the area ratio of each layer to the mobile phone display screen based on the determined areas of the layers and the area of the mobile phone display screen; if an area ratio is within the preset area ratio range, determine that the current display interface contains a floating window; and if no area ratio is within the preset area ratio range (thred1, thred2), determine that the current display interface does not contain a floating window, where thred1 and thred2 are both greater than zero.
For example, it is assumed that the preset area ratio range is (20:100, 90:100), where the minimum value of the area ratio range may be set according to the area ratio of the common layer such as the status bar layer or the navigation bar layer, which is generally included in the mobile phone display interface, to the mobile phone display screen, so that the area ratio is greater than the area ratio of the common layer to the mobile phone display screen; the maximum value of the area ratio range can be set according to the area ratio of the suspension window area which is usually the maximum to the display screen of the mobile phone.
Assume that the determined area ratio of the status bar layer to the mobile phone display screen is 5:100, the area ratio of the navigation bar layer is 10:100, the area ratio of the background layer is 100:100, and the area ratio of the widget layer is 50:100. The area ratios of the status bar layer, the navigation bar layer, and the background layer to the mobile phone display screen are therefore not within the preset area ratio range, while the area ratio of the widget layer to the mobile phone display screen is within the preset area ratio range, so the mobile phone can determine that the current display interface contains a floating window, i.e., the widget 401 is a floating window.
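The layer-based floating-window check described above can be sketched as follows. This is an illustrative simplification, not the claimed implementation: layers are represented as rows of Alpha values, the layer names and the ratio range (0.20, 0.90) taken from the example are assumptions, and semi-transparent pixels (Alpha below 255) are discarded before the area ratio is computed.

```python
def opaque_area(bitmap):
    """Count opaque pixels (Alpha == 255) in a layer given as rows of Alpha values."""
    return sum(1 for row in bitmap for alpha in row if alpha == 255)

def find_floating_layers(layers, screen_area, ratio_range=(0.20, 0.90)):
    """Return the names of layers whose opaque-area-to-screen ratio
    falls inside the preset range (thred1, thred2)."""
    thred1, thred2 = ratio_range
    floating = []
    for name, bitmap in layers.items():
        ratio = opaque_area(bitmap) / screen_area
        if thred1 < ratio < thred2:
            floating.append(name)
    return floating
```

For a 10x10 screen (area 100), a one-row status bar layer yields a ratio of 0.10 and is rejected, while a widget layer with 50 opaque pixels yields 0.50 and is flagged as a floating window, mirroring the 5:100 and 50:100 figures in the example.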
In an alternative embodiment, after the floating window is detected in the above manner, a screen capturing control for the floating window may be displayed. For example, as shown in fig. 8, a widget screen capturing button 801 may be displayed at the lower right of the mobile phone display screen, where the widget screen capturing button 801 is an optional screen capturing control for the floating window. The user may click the widget screen capturing button 801, and the mobile phone may perform a screen capturing operation on the interface currently displayed by the floating window 401 in response to the user clicking the widget screen capturing button 801.
In an optional implementation manner, after detecting the floating window in the above manner, the mobile phone may obtain a first interface currently displayed by the floating window; and performing rolling operation on the floating window; acquiring a second interface displayed by the scrolled floating window, and comparing the first interface with the second interface; and if the first interface and the second interface are different, determining that the floating window is a rolling window.
For example, in one embodiment, after detecting that the widget 401 is a floating window, the mobile phone may first acquire an image of the current display interface of the floating window, that is, the first interface, and establish a coordinate system based on the position of the floating window. As shown in fig. 9, the vertex of the upper left corner of the floating window may be used as the origin O, the X axis is established along the width of the floating window, the Y axis is established along the height of the floating window, and the coordinate center point P is determined, where Xmax represents the width of the floating window and Ymax represents the height of the floating window. After the coordinate system is established, the mobile phone may simulate a touch screen sliding operation, that is, a scrolling operation, taking the coordinate center point P as the starting point and moving upwards by a distance of Ymax/thred3, where thred3 is any value greater than or equal to 2; for example, thred3 may be 3. After the scrolling operation is completed, the mobile phone may acquire an image of the interface currently displayed by the scrolled floating window, that is, the second interface, as shown in fig. 10.
After the images shown in figs. 9 and 10 are obtained, the mobile phone can compare them. For example, the mobile phone can calculate the differences between corresponding pixels in the images shown in figs. 9 and 10 using the SAD (sum of absolute differences) algorithm, compute the sum of the absolute values of the differences, and judge whether the two images differ by comparing that sum with a preset threshold. The preset threshold may be a value close to 0, or may be set according to the size of the images; for example, when the images contain more pixels, the threshold may be set to 100, and when they contain fewer pixels, the threshold may be set to 20. If the sum of absolute values is less than the preset threshold, there is no difference between the images shown in figs. 9 and 10, indicating that the floating window cannot scroll; if the sum of absolute values is greater than the preset threshold, there is a difference between the images shown in figs. 9 and 10, indicating that the floating window is scrollable, i.e., the floating window is a rolling window.
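The SAD comparison above can be sketched as follows; this is an illustrative outline only, assuming both captures are equal-sized grayscale images given as lists of pixel rows, with the threshold value of 100 borrowed from the larger-image example in the text.

```python
def sad(img_a, img_b):
    """Sum of absolute per-pixel differences between two equal-sized images."""
    return sum(abs(a - b)
               for row_a, row_b in zip(img_a, img_b)
               for a, b in zip(row_a, row_b))

def is_scrolling_window(before, after, threshold=100):
    """Treat the floating window as a rolling window when the SAD between the
    pre-scroll and post-scroll captures exceeds the preset threshold."""
    return sad(before, after) > threshold
```

An unchanged window yields a SAD of 0 and is judged non-scrollable; any content shift large enough to push the SAD past the threshold marks the window as a rolling window.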
In another optional embodiment, the current display interface of the mobile phone may contain a plurality of floating windows. The user may select one floating window from the plurality of floating windows detected by the mobile phone as the target floating window, and the mobile phone, in response to the user's operation of selecting the target floating window, detects whether the target floating window is a rolling window. The specific implementation is similar to the above steps and is not repeated here.
Step S302, if the current display interface contains a rolling window, displaying a long screen capturing control for the rolling window.
Step S303, in response to a trigger operation for the long screen capturing control, a long screen capturing picture is generated based on the scroll window.
In an alternative embodiment, if it is determined in step S301 that the current display interface of the mobile phone includes a scrolling window, the mobile phone may display a long screen capturing control for the scrolling window, for example, as shown in fig. 11, a small window long screen capturing button 1101 is an optional long screen capturing control for the scrolling window, the user may click the small window long screen capturing button 1101, and in response to an operation of clicking the small window long screen capturing button 1101 by the user, the mobile phone may generate a long screen capturing picture based on the scrolling window.
In an alternative embodiment, after the rolling window is detected in step S301, when generating a long screen capture based on the rolling window, since not all areas within the rolling window are necessarily scrollable, it is necessary to detect the scrollable area contained in the rolling window. Specifically, the mobile phone may compare the images shown in figs. 9 and 10 to obtain a corresponding difference map. As shown in fig. 12, gray represents the non-difference region and black represents the difference region. Since the rolling window scrolls up and down, the scrollable area contained in the rolling window and the height H of the scrollable area may be initially determined according to the difference map shown in fig. 12.
Specifically, fig. 12 is a difference chart corresponding to the images shown in fig. 9 and fig. 10, where the mobile phone may determine the vertex coordinates of the difference area included in the difference chart, where the vertex coordinates of the difference area are the vertex coordinates of the scrollable area, and a coordinate system where the vertex coordinates of the scrollable area are located is consistent with a coordinate system where the vertex coordinates of the difference area are located.
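One possible way to derive the scrollable area's vertex coordinates and height H from a difference map is sketched below. This is an assumed representation, not the patented one: the map is given as a 0/1 mask (1 marks a differing pixel) rather than the gray/black shading of fig. 12, and the scrollable area is taken as the bounding box of the marked pixels.

```python
def scrollable_region(diff_mask):
    """Return (x_min, y_min, x_max, y_max, height) of the difference region,
    where diff_mask[y][x] is 1 for a differing pixel and 0 otherwise.
    Returns None when there is no difference (the window did not scroll)."""
    xs = [x for row in diff_mask for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(diff_mask) for v in row if v]
    if not xs:
        return None
    return min(xs), min(ys), max(xs), max(ys), max(ys) - min(ys) + 1
```

The four returned corner coordinates play the role of the difference-region vertices A1, B1, C1, D1 in the later example, and the height is the H used to size the scrolling step.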
In an optional implementation manner, after the position and the height of the scrollable area are preliminarily determined in the above manner, the mobile phone can execute at least one scrolling operation for the scrolling window, and a first image and a second image corresponding to each scrolling operation are respectively obtained; and splicing the first image and the second image corresponding to each scrolling operation in sequence to generate a long screen capturing picture. The first image is an interface image of a scrolling window before scrolling operation; the second image is an interface image of the scroll window after the scroll operation is performed.
Illustratively, in one embodiment, the user may click the widget long screen capturing button 1101 in fig. 11. In response to the user clicking the widget long screen capturing button 1101, the mobile phone obtains an image Pic0 of the rolling window and simulates a touch screen sliding operation, i.e., a scrolling operation, taking the center point of the determined scrollable area as the starting point and moving upwards by a distance of H/thred4, where thred4 may be any integer greater than 0, and then obtains an image Pic1 of the rolling window. For example, as shown in fig. 13, H is the height of the scrollable area and Q is the center point determined according to the vertex coordinates of the scrollable area. After the image Pic0 is obtained, the mobile phone may simulate a touch screen sliding operation with the point Q as the starting point, move up by a distance of H/thred4, and obtain the image Pic1 of the rolling window; the above steps are repeated to obtain an image Pic2 of the rolling window, and Pic1 and Pic2 are compared. If Pic1 and Pic2 have no difference, it indicates that scrolling has reached the bottommost end, and scrolling is stopped; if Pic1 and Pic2 differ, scrolling is continued until the bottommost end is reached or a feedback operation of the user is received, etc.
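The capture loop described above (scroll by H/thred4, capture, stop once two consecutive frames match) can be sketched as follows. The `capture` and `swipe` callables are hypothetical stand-ins for the device's screenshot and simulated-slide operations, and the `max_steps` guard is an added safety limit not mentioned in the text.

```python
def collect_frames(capture, swipe, max_steps=50):
    """Capture rolling-window frames between scrolls; stop when two
    consecutive frames are identical, i.e., the bottom was reached."""
    frames = [capture()]                 # Pic0, before any scrolling
    for _ in range(max_steps):
        swipe()                          # simulated upward swipe of H/thred4
        frame = capture()
        if frame == frames[-1]:          # no change: scrolled to the bottom
            break
        frames.append(frame)
    return frames
```

The returned list corresponds to Pic0, Pic1, ... in the text; the final duplicate frame (the one matching its predecessor) is discarded rather than stitched.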
In one embodiment, assuming that Pic1 and Pic2 are not different, the finally obtained images of the rolling window are Pic0 and Pic1, where Pic0 is the first image and Pic1 is the second image. The first image and the second image may be stitched together to generate a long screenshot. As shown in fig. 14, (a) is Pic0 and (b) is Pic1. After the images Pic0 and Pic1 of the rolling window are obtained, the mobile phone may take the difference of the images Pic0 and Pic1 to obtain a corresponding difference map Diff01, as shown in fig. 15, where gray represents the non-difference region and black represents the difference region. After the difference map Diff01 is obtained, the mobile phone can determine the coordinate position of the difference region contained in the difference map Diff01, where the coordinate position of the difference region may be represented by the vertex coordinates of its vertices. The mobile phone can determine the vertex coordinates of the splicing area according to the vertex coordinates of the difference region, extract the corresponding splicing areas Roi0 and Roi1 from the images Pic0 and Pic1 according to the vertex coordinates of the splicing area, and splice Roi0 and Roi1 to obtain a long screenshot spliced picture.
For example, assuming that the vertex coordinates of four vertices of the difference region included in the difference map Diff01 are A1 (1, 1), B1 (5, 1), C1 (1, 7), D1 (5, 7), respectively, the vertex coordinates of the stitching region are a (1, 1), B (5, 1), C (1, 7), D (5, 7). After the vertex coordinates of the splicing area are determined, the splicing area Roi contained in each image can be extracted from the acquired images Pic0 and Pic1 of the rolling window according to the vertex coordinates of the splicing area, as shown in fig. 16, (a) is the splicing area Roi0 in Pic0, and (b) is the splicing area Roi1 in Pic1, after the splicing areas Roi0 and Roi1 are determined, the Roi0 and Roi1 can be spliced, and a long screenshot spliced picture is obtained. The specific steps for splicing the Roi0 and the Roi1 will be described in detail below, and will not be repeated here. After the long screen shot spliced picture is obtained, the long screen shot spliced picture and a non-spliced area of any image can be fused to generate the long screen shot picture.
In one embodiment, the acquired first image and second image of the rolling window may be divided into two parts of a stitched area and a non-stitched area, where the vertex coordinates of the stitched area are consistent with those of the scrollable area included in the rolling window, and the non-stitched area is the rest of the first image and the second image except the stitched area.
In another embodiment, it is assumed that the finally obtained images of the rolling window are Pic0, Pic1 and Pic2. As shown in fig. 17, (a) is Pic0, (b) is Pic1, and (c) is Pic2. After the acquisition is completed, the mobile phone may take the difference of the images obtained in two adjacent captures to obtain corresponding difference maps. As shown in fig. 18, (a) is the difference map Diff01 corresponding to Pic0 and Pic1, and (b) is the difference map Diff12 corresponding to Pic1 and Pic2, where gray 1201 represents the non-difference region and black 1202 represents the difference region. After the difference maps Diff01 and Diff12 are obtained, the mobile phone can determine the vertex coordinates, i.e., the coordinate positions, of the difference regions contained in Diff01 and Diff12, and determine the vertex coordinates of the splicing area according to the vertex coordinates of the difference regions.
Illustratively, it is assumed that the vertex coordinates of the four vertices of the difference region included in the difference map Diff01 are A1 (1, 1), B1 (5, 1), C1 (1, 7), D1 (5, 7), respectively; the vertex coordinates of the four vertices of the difference region included in the difference map Diff12 are A2 (1, 1), B2 (6, 1), C2 (1, 7), D2 (6, 7), respectively, and therefore, the vertex coordinates of the stitching region are a (1, 1), B (6, 1), C (1, 7), D (6, 7); wherein A is the maximum value of each coordinate point in A1 and A2; b is the maximum value of each coordinate point in B1 and B2; c is the maximum value of each coordinate point in C1 and C2; d is the maximum value of each coordinate point in D1 and D2. After the vertex coordinates of the splicing area are determined, the splicing area Roi contained in each image may be extracted from the acquired images Pic0, pic1 and Pic2 of the rolling window according to the vertex coordinates of the splicing area, as shown in fig. 19, (a) is the splicing area Roi0 in Pic0, (b) is the splicing area Roi1 in Pic1, and (c) is the splicing area Roi2 in Pic2, and after the splicing areas Roi0, roi1 and Roi2 are extracted, the Roi0, roi1 and Roi2 may be spliced.
In one embodiment, the example of splicing the Roi0 and the Roi1 is taken as an example, since only the change in the Y direction needs to be considered, the offset can be calculated by using a template matching method, specifically, an area with the height of H/thred4 can be taken as a template from the top of the Roi1, where H is the height of the spliced area, and can be determined according to the vertex coordinates of the spliced area. thred4 is any value greater than 0; the search is performed in the Y-axis direction in the Roi0, i.e., the search is performed from top to bottom in the Roi0 until the region that best matches the template in the Roi1 is found in the Roi0, and the displacement amount in the Y-axis direction when the best matching region is found is defined as the offset mvY.
For example, assuming thred4 is 4, starting from the top of Roi1, a region with a height of H/4 is taken as the template, as shown in fig. 20, and a search is performed from top to bottom in Roi0 for the region that best matches this H/4-high template. Assuming that the displacement in the Y-axis direction is 5 when the best matching region is found, i.e., the offset mvY between the top of Roi0 and the top of the best matching region is 5, then after the offset mvY is determined, Roi0 may be divided into two parts according to the offset in the Y-axis direction, and the upper part of Roi0 may be spliced with Roi1 to obtain a spliced picture. Referring to the above steps, the obtained spliced picture and Roi2 may be spliced again to obtain a long screen capture spliced picture.
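The template-matching splice can be sketched as follows. This is an illustrative outline under assumptions: ROIs are lists of grayscale pixel rows, thred4 defaults to 4 as in the example, and SAD is used as the matching cost (the text names template matching but not the cost metric, so that choice is an assumption carried over from the earlier comparison step).

```python
def find_offset(roi_prev, roi_next, thred4=4):
    """Return mvY: the row in roi_prev whose window best matches the
    H/thred4-high template taken from the top of roi_next (SAD cost)."""
    template = roi_next[:max(1, len(roi_next) // thred4)]
    best_y, best_cost = 0, None
    for y in range(len(roi_prev) - len(template) + 1):
        cost = sum(abs(a - b)
                   for row_p, row_t in zip(roi_prev[y:], template)
                   for a, b in zip(row_p, row_t))
        if best_cost is None or cost < best_cost:
            best_y, best_cost = y, cost
    return best_y

def splice(roi_prev, roi_next, thred4=4):
    """Keep roi_prev above the matched offset mvY, then append roi_next."""
    mvY = find_offset(roi_prev, roi_next, thred4)
    return roi_prev[:mvY] + roi_next
```

Splicing Roi0 with Roi1 and then the result with Roi2 in the same way yields the long-screenshot spliced picture described in the text.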
After the long screen capturing spliced picture is obtained by the method, the long screen capturing spliced picture and the non-spliced area of any image can be fused to generate the long screen capturing picture, as shown in fig. 21.
Based on the same inventive concept, the embodiment of the present application further provides a long screen capturing device, as shown in fig. 22, where the long screen capturing device includes:
A detection unit 2201 that detects whether a scroll window is included in a current display interface in response to a screen capturing operation input by a user;
a display unit 2202, configured to display a long screen capture control for a scrolling window if the current display interface includes the scrolling window;
the generation unit 2203 generates a long screen capture picture based on the scroll window in response to a trigger operation for the long screen capture control.
In one possible embodiment, the detection unit 2201 is specifically configured to:
responding to screen capturing operation input by a user, and detecting whether a current display interface contains a floating window or not; the floating window refers to a window covering part of a display screen of the terminal equipment;
And if the current display interface comprises a floating window, determining whether the floating window is a rolling window.
In one possible embodiment, the detection unit 2201 is specifically configured to:
Responding to screen capturing operation input by a user, and acquiring layer data of each layer contained in the current display interface;
Determining the area of each layer based on the layer data of each layer;
Determining the area ratio of each layer to the display screen based on the area of each layer;
And if at least one area ratio is within the preset area ratio range, determining that the current display interface comprises a floating window.
In one possible embodiment, the detection unit 2201 is specifically configured to:
acquiring a first interface currently displayed by the floating window;
performing rolling operation on the floating window;
acquiring a second interface displayed by the scrolled floating window, and comparing the first interface with the second interface;
And if the first interface and the second interface are different, determining that the floating window is a rolling window.
In a possible implementation manner, the generating unit 2203 is specifically configured to:
Responding to the triggering operation of the long screen capturing control, executing at least one rolling operation for the rolling window, and respectively acquiring a first image and a second image corresponding to each rolling operation; the first image is an interface image of a scrolling window before scrolling operation; the second image is an interface image of a rolling window after the rolling operation;
and sequentially splicing the first image and the second image corresponding to each scrolling operation to generate the long screen capturing picture.
In a possible implementation manner, the generating unit 2203 is specifically configured to:
executing one-time scrolling operation on the scrolling window, and acquiring a first image and a second image corresponding to the one-time scrolling operation;
And if the acquired first image and the acquired second image have differences, repeating the step of executing one-time scrolling operation on the scrolling window until the acquired first image and the acquired second image have no differences.
In a possible implementation manner, the generating unit 2203 is specifically configured to:
For the first image and the second image corresponding to each scrolling operation, the following operations are respectively performed: determining a coordinate position of a difference region between the first image and the second image; determining a splicing area in the first image and the second image respectively based on the coordinate position of the difference area;
splicing the splicing areas in each image in sequence to obtain a long screen capturing spliced picture;
and generating the long screen capturing picture based on the long screen capturing spliced picture.
In a possible implementation manner, the generating unit 2203 is specifically configured to:
Splicing the long screen shot spliced picture and the image of the non-spliced area to obtain the long screen shot picture; the non-stitching region is a region of any one image other than the stitching region.
Based on the same inventive concept, there is also provided in an embodiment of the present application a computer program product comprising a computer program or instructions which, when executed by a processor, implement any of the long screen capturing methods described above.
Based on the same inventive concept, the embodiment of the application also provides a computer readable storage medium, wherein a computer program is stored in the computer readable storage medium, and when the computer program is executed by a processor, any long screen capturing method is realized.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from its spirit or scope. Thus, it is intended that the present application also cover such modifications and variations, provided they fall within the scope of the appended claims and their equivalents.

Claims (9)

1. A terminal device, comprising: a display, a memory, and a processor;
The display is configured to display an interface of the terminal device during operation;
The memory is configured to store programs and data used in the operation of the terminal device;
The processor is configured to: in response to a screen capture operation input by a user, detect whether a currently displayed interface contains a floating window, the floating window being a window that covers part of a display screen of the terminal device; if the currently displayed interface contains a floating window, determine whether the floating window is a scrolling window; if the floating window is a scrolling window, display a long screen capture control for the scrolling window; and in response to a triggering operation on the long screen capture control, generate a long screen capture picture based on the scrolling window.
2. The terminal device of claim 1, wherein the processor is specifically configured to:
in response to the screen capture operation input by the user, acquire layer data of each layer contained in the currently displayed interface;
determine the area of each layer based on its layer data;
determine the ratio of the area of each layer to the area of the display screen;
and if at least one of the area ratios falls within a preset area ratio range, determine that the currently displayed interface contains a floating window.
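The layer-area check of claim 2 can be sketched as follows. The rectangle representation of a layer, the helper name, and the concrete ratio bounds are illustrative assumptions; the claim itself only specifies "a preset area ratio range".

```python
def contains_floating_window(layer_rects, screen_w, screen_h,
                             ratio_range=(0.1, 0.8)):
    """Determine whether any layer's area, as a fraction of the display
    screen's area, falls within the preset area-ratio range, which would
    indicate a floating window covering only part of the screen.
    layer_rects: iterable of (width, height) tuples, one per layer."""
    screen_area = screen_w * screen_h
    for w, h in layer_rects:
        ratio = (w * h) / screen_area
        if ratio_range[0] <= ratio <= ratio_range[1]:
            return True
    return False
```

A full-screen layer (ratio 1.0) and a tiny status-bar layer both fall outside the range, so only mid-sized layers are flagged as floating windows.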
3. The terminal device of claim 1, wherein the processor is specifically configured to:
acquire a first interface currently displayed in the floating window;
perform a scrolling operation on the floating window;
acquire a second interface displayed in the scrolled floating window, and compare the first interface with the second interface;
and if the first interface and the second interface differ, determine that the floating window is a scrolling window.
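The scroll-and-compare test of claim 3 reduces to a short routine. The two callables stand in for platform-specific capture and scroll-injection primitives, which the claim does not name:

```python
def is_scrolling_window(capture, scroll):
    """Capture the floating window's interface, scroll it once, capture
    again, and compare: a content change after scrolling means the
    window is a scrolling window."""
    first = capture()   # first interface, before the scrolling operation
    scroll()            # perform one scrolling operation
    second = capture()  # second interface, after the scrolling operation
    return first != second
```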
4. The terminal device of claim 1, wherein the processor is specifically configured to:
in response to the triggering operation on the long screen capture control, perform at least one scrolling operation on the scrolling window, and acquire a first image and a second image corresponding to each scrolling operation, the first image being an interface image of the scrolling window before that scrolling operation, and the second image being an interface image of the scrolling window after that scrolling operation;
and sequentially stitch the first image and the second image corresponding to each scrolling operation to generate the long screen capture picture.
5. The terminal device of claim 4, wherein the processor is specifically configured to:
perform one scrolling operation on the scrolling window, and acquire the first image and the second image corresponding to that scrolling operation;
and if the acquired first image and second image differ, repeat the step of performing one scrolling operation on the scrolling window until the acquired first image and second image no longer differ.
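The capture loop of claims 4 and 5 can be sketched as below; the upper bound on iterations is a safety assumption not stated in the claims, which terminate only on the no-difference condition:

```python
def collect_scroll_captures(capture, scroll, max_scrolls=100):
    """Repeatedly perform one scrolling operation on the scrolling
    window, recording the (first, second) interface images taken before
    and after each scroll; stop once a scroll no longer changes the
    interface, i.e. the end of the scrollable content is reached."""
    pairs = []
    for _ in range(max_scrolls):   # safety bound (illustrative assumption)
        first = capture()
        scroll()
        second = capture()
        pairs.append((first, second))
        if first == second:        # no difference: content exhausted
            break
    return pairs
```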
6. The terminal device of claim 4, wherein the processor is specifically configured to:
for the first image and the second image corresponding to each scrolling operation, respectively: determine the coordinate position of a difference region between the first image and the second image, and determine a stitching region in each of the first image and the second image based on the coordinate position of the difference region;
stitch the stitching regions of the images in sequence to obtain a long screen capture stitched picture;
and generate the long screen capture picture based on the long screen capture stitched picture.
7. The terminal device of claim 6, wherein the processor is specifically configured to:
stitch the long screen capture stitched picture with an image of a non-stitching region to obtain the long screen capture picture, the non-stitching region being the region of any one of the images other than its stitching region.
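A simplified stand-in for the stitching step of claims 6 and 7: rather than computing the claimed difference-region coordinates directly, this sketch locates the overlap between the bottom of the first image and the top of the second, then appends only the newly revealed rows. It assumes pure vertical scrolling with pixel-aligned rows, which real implementations relax with tolerance-based matching.

```python
import numpy as np

def stitch_pair(first, second):
    """Stitch one (first, second) capture pair into a taller image.
    The overlap is found as the longest suffix of `first` that equals a
    prefix of `second`; only `second`'s rows beyond that overlap (its
    stitching region) are appended."""
    h = first.shape[0]
    for k in range(h, 0, -1):                 # try the longest overlap first
        if np.array_equal(first[h - k:], second[:k]):
            return np.vstack([first, second[k:]])
    return np.vstack([first, second])         # no overlap found: append all
```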
9. A long screen capture method, comprising:
in response to a screen capture operation input by a user, detecting whether a currently displayed interface contains a floating window, the floating window being a window that covers part of a display screen of a terminal device;
if the currently displayed interface contains a floating window, determining whether the floating window is a scrolling window;
if the floating window is a scrolling window, displaying a long screen capture control for the scrolling window;
and in response to a triggering operation on the long screen capture control, generating a long screen capture picture based on the scrolling window.
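The overall flow of the method claim can be summarized as a sketch; `ui` is a hypothetical adapter whose method names are illustrative assumptions standing in for the detection, display, and capture primitives the claim relies on:

```python
def long_screenshot_method(ui):
    """End-to-end flow of the claimed long screen capture method."""
    # Detect a floating window in response to the screen capture gesture.
    if not ui.current_interface_has_floating_window():
        return None
    # Only scrolling windows qualify for a long screenshot.
    if not ui.floating_window_is_scrolling():
        return None
    ui.show_long_capture_control()
    # Generate the long screenshot when the control is triggered.
    if ui.control_triggered():
        return ui.generate_long_screenshot()
    return None
```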
10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of claim 8.
CN202210113003.0A 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium Active CN114489429B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210113003.0A CN114489429B (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium
CN202410281794.7A CN118113192A (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210113003.0A CN114489429B (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202410281794.7A Division CN118113192A (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium

Publications (2)

Publication Number Publication Date
CN114489429A CN114489429A (en) 2022-05-13
CN114489429B true CN114489429B (en) 2024-04-19

Family

ID=81477840

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410281794.7A Pending CN118113192A (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium
CN202210113003.0A Active CN114489429B (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202410281794.7A Pending CN118113192A (en) 2022-01-29 2022-01-29 Terminal equipment, long screen capturing method and storage medium

Country Status (1)

Country Link
CN (2) CN118113192A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116048351B (en) * 2022-07-27 2024-08-27 荣耀终端有限公司 Screen capturing method and electronic equipment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN107229401A (en) * 2017-04-18 2017-10-03 硕诺科技(深圳)有限公司 A kind of long screenshotss method of mobile terminal
CN107957841A (en) * 2016-10-17 2018-04-24 腾讯科技(深圳)有限公司 Roll screenshotss method and device
WO2018072413A1 (en) * 2016-10-19 2018-04-26 中兴通讯股份有限公司 Terminal screenshot method and apparatus, and mobile terminal and storage medium
CN109189296A (en) * 2018-07-12 2019-01-11 Oppo(重庆)智能科技有限公司 Screenshotss method, apparatus, mobile terminal and storage medium
CN111399720A (en) * 2020-03-24 2020-07-10 北京小米移动软件有限公司 Method and device for displaying application interface and storage medium

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US20110122159A1 (en) * 2009-11-20 2011-05-26 Sony Ericsson Mobile Communications Ab Methods, devices, and computer program products for providing multi-region touch scrolling
CN104615343A (en) * 2013-11-04 2015-05-13 中兴通讯股份有限公司 Terminal printscreen method and device
CN106970754B (en) * 2017-03-28 2020-09-08 北京小米移动软件有限公司 Screen capture processing method and device

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN107957841A (en) * 2016-10-17 2018-04-24 腾讯科技(深圳)有限公司 Roll screenshotss method and device
WO2018072413A1 (en) * 2016-10-19 2018-04-26 中兴通讯股份有限公司 Terminal screenshot method and apparatus, and mobile terminal and storage medium
CN107229401A (en) * 2017-04-18 2017-10-03 硕诺科技(深圳)有限公司 A kind of long screenshotss method of mobile terminal
CN109189296A (en) * 2018-07-12 2019-01-11 Oppo(重庆)智能科技有限公司 Screenshotss method, apparatus, mobile terminal and storage medium
CN111399720A (en) * 2020-03-24 2020-07-10 北京小米移动软件有限公司 Method and device for displaying application interface and storage medium

Non-Patent Citations (2)

Title
Q来A去 (Q&A back and forth); Wen Xiangjun; Computer Knowledge and Technology (Experience and Skills); full text *
ScreenSeg: On-Device Screenshot Layout Analysis; Manoj Goyal et al.; 2021 International Joint Conference on Neural Networks (IJCNN); full text *

Also Published As

Publication number Publication date
CN118113192A (en) 2024-05-31
CN114489429A (en) 2022-05-13

Similar Documents

Publication Publication Date Title
CN111597000B (en) Small window management method and terminal
CN111225108A (en) Communication terminal and card display method of negative screen interface
CN112099892B (en) Communication terminal and method for rapidly scanning two-dimension code
CN113835569A (en) Terminal device, quick start method for internal function of application and storage medium
CN114035870A (en) Terminal device, application resource control method and storage medium
CN111726605B (en) Resolving power determining method and device, terminal equipment and storage medium
CN114020379B (en) Terminal equipment, information feedback method and storage medium
CN114489429B (en) Terminal equipment, long screen capturing method and storage medium
CN113038141B (en) Video frame processing method and electronic equipment
CN112163033B (en) Mobile terminal and travel list display method thereof
CN111479075B (en) Photographing terminal and image processing method thereof
CN118170302B (en) Man-machine interaction method, electronic equipment and storage medium
CN111913772A (en) Terminal and desktop display method
CN114067758A (en) Mobile terminal and image display method thereof
CN114594894B (en) Marking method of interface element, terminal equipment and storage medium
CN113835582B (en) Terminal equipment, information display method and storage medium
CN115499577B (en) Image processing method and terminal device
CN114546219B (en) Picture list processing method and related device
CN114020381B (en) Terminal equipment, plug-in deployment method and storage medium
CN113934340B (en) Terminal equipment and progress bar display method
CN115033199B (en) Mobile terminal and image display method thereof
CN114675762A (en) Terminal device, function searching method and storage medium
CN113642010B (en) Method for acquiring data of extended storage device and mobile terminal
CN113760164B (en) Display device and response method of its control operation
CN113407096A (en) Terminal device and picture processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: No. 11 Jiangxi Road, Qingdao, Shandong 266071

Applicant after: Qingdao Hisense Mobile Communication Technology Co.,Ltd.

Address before: No. 11 Jiangxi Road, Qingdao, Shandong 266071

Applicant before: HISENSE MOBILE COMMUNICATIONS TECHNOLOGY Co.,Ltd.

Country or region before: China

GR01 Patent grant