CN117971043A - Interface display method and device, storage medium and electronic equipment
- Publication number
- CN117971043A (Application number CN202410102968.9A)
- Authority
- CN
- China
- Prior art keywords
- target
- data
- gesture information
- distance
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
Abstract
The application discloses an interface display method and device, a storage medium, and an electronic device. The method comprises the following steps: acquiring pressure data corresponding to a target device, wherein the pressure data at least comprises pressure values received at different positions of the outer frame of the target device while a target user is in contact with the target device; determining, according to the pressure data, target gesture information corresponding to the target user operating the target device, wherein the target gesture information is used to characterize the handheld gesture with which the target user operates the target device; and generating, according to the target gesture information, a target interface suited to the target user operating the target device, and displaying the content of the target interface on the screen of the target device. The application thereby solves the technical problem in the prior art of high user operation difficulty caused by a fixed screen display scheme.
Description
Technical Field
The present application relates to the field of user interface design and related technical fields, and in particular to an interface display method and device, a storage medium, and an electronic device.
Background
In the prior art, to make interaction between intelligent handheld devices and users more convenient, technicians equip handheld devices such as smartphones with touch screens. As user demand has grown further, the screens of handheld devices have been designed larger and larger, and users complete their interaction with a device through the user interface displayed on its screen. However, in the prior art the user interface displayed on the screen of a handheld device is fixed: it does not adjust to changes in the user's handheld gesture, regardless of whether the user operates the device with the left hand or the right hand. Because the handheld gesture of the user is not taken into account in the design of the on-screen user interface, a fixed screen display scheme makes the device inconvenient to operate and raises the operation difficulty for the user, especially when the screen of the device is large.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The application provides an interface display method, an interface display device, a storage medium and electronic equipment, which at least solve the technical problem of high user operation difficulty caused by adopting a fixed screen display scheme in the prior art.
According to an aspect of the present application, there is provided an interface display method including: acquiring pressure data corresponding to a target device, wherein the pressure data at least comprises pressure values received at different positions of an outer frame of the target device while a target user is in contact with the target device; determining, according to the pressure data, target gesture information corresponding to the target user operating the target device, wherein the target gesture information is used to characterize the handheld gesture with which the target user operates the target device; and generating, according to the target gesture information, a target interface suited to the target user operating the target device, and displaying the content of the target interface on a screen of the target device.
Optionally, the interface display method further includes: m pressure values corresponding to the target equipment are obtained through M sensors uniformly distributed in the outer frame of the target equipment, wherein M is a positive integer, and each sensor in the M sensors is used for obtaining one pressure value in the M pressure values; and taking at least one pressure value which is larger than a preset threshold value in the M pressure values as pressure data.
Optionally, the interface display method further includes: judging whether the pressure data includes first data and second data, wherein the first data represents a pressure value corresponding to a sensor in the left outer frame of the target device, and the second data represents a pressure value corresponding to a sensor in the right outer frame of the target device; determining the target gesture information according to the first data and the second data when the pressure data includes both the first data and the second data; and, when the pressure data does not include the first data and/or the second data, judging whether the pressure data includes third data and fourth data, and determining the target gesture information according to the judgment result, wherein the third data is a pressure value corresponding to a sensor in the upper outer frame of the target device, and the fourth data is a pressure value corresponding to a sensor in the lower outer frame of the target device.
Optionally, the interface display method further includes: determining P first sensors corresponding to the first data and Q second sensors corresponding to the second data, wherein P and Q are positive integers, the first sensors are the sensors in the left outer frame of the target device used to acquire the first data, and the second sensors are the sensors in the right outer frame of the target device used to acquire the second data; acquiring a first distance corresponding to each of the P first sensors and a second distance corresponding to each of the Q second sensors, wherein the first distance corresponding to each first sensor characterizes the distance between that first sensor and the bottom of the target device, and the second distance corresponding to each second sensor characterizes the distance between that second sensor and the bottom of the target device; determining the maximum among the first distances corresponding to the P first sensors as a first target distance; determining the maximum among the second distances corresponding to the Q second sensors as a second target distance; and determining the target gesture information according to the first target distance and the second target distance.
Optionally, the interface display method further includes: judging whether the first target distance is smaller than the second target distance; under the condition that the first target distance is smaller than the second target distance, taking the first gesture information as target gesture information, wherein the first gesture information is used for representing that a target user uses a left hand to operate the target device; and taking the second gesture information as target gesture information when the first target distance is greater than or equal to the second target distance, wherein the second gesture information is used for representing that a target user uses a right hand to operate the target device.
Optionally, the interface display method further includes: taking third gesture information as target gesture information under the condition that the pressure data simultaneously comprise third data and fourth data, wherein the third gesture information is used for representing that a target user simultaneously uses two hands to operate target equipment; and taking fourth gesture information as target gesture information in the case that the pressure data does not comprise the third data and/or the fourth data, wherein the fourth gesture information is used for representing that the target user is not contacted with the target device.
Optionally, the interface display method further includes: under the condition that the target gesture information is the first gesture information, generating a preset interface which is suitable for a target user to operate the target equipment by using the left hand as a target interface; under the condition that the target gesture information is the second gesture information, generating a preset interface which is suitable for a target user to operate the target equipment by using the right hand as a target interface; generating a preset interface suitable for a target user to operate the target device by using both hands as a target interface under the condition that the target gesture information is third gesture information; and determining a screen interface corresponding to the target equipment at the current moment as a target interface under the condition that the target gesture information is fourth gesture information.
According to another aspect of the present application, there is also provided an interface display apparatus including: an acquisition unit configured to acquire pressure data corresponding to a target device, wherein the pressure data at least comprises pressure values received at different positions of an outer frame of the target device while a target user is in contact with the target device; a determining unit configured to determine, according to the pressure data, target gesture information corresponding to the target user operating the target device, wherein the target gesture information is used to characterize the handheld gesture with which the target user operates the target device; and a generating unit configured to generate, according to the target gesture information, a target interface suited to the target user operating the target device, and to display the content of the target interface on a screen of the target device.
According to another aspect of the present application, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to control a device in which the computer-readable storage medium is located to execute the interface display method of any one of the above items when the computer program is executed.
According to another aspect of the present application, there is also provided an electronic device, wherein the electronic device includes one or more processors and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the interface display method of any of the above.
According to the method, pressure data corresponding to the target device are first acquired, wherein the pressure data at least comprise pressure values received at different positions of the outer frame of the target device while the target user is in contact with the target device; target gesture information corresponding to the target user operating the target device is then determined according to the pressure data, the target gesture information characterizing the handheld gesture with which the target user operates the target device; finally, a target interface suited to the target user operating the target device is generated according to the target gesture information, and the content of the target interface is displayed on the screen of the target device.
As can be seen from the above, according to the technical scheme of the application, the gesture information (i.e., the target gesture information) of the target user for operating the target device is determined by the collected pressure values (i.e., the pressure data) received by different positions of the outer frame of the target device in the process that the target user contacts the target device, and then a customized target interface is generated according to the gesture information of the target user for operating the target device, so that the purpose of customizing the screen display content of the intelligent handheld device according to different handheld gestures of the user is achieved, and the user experience is improved.
Therefore, the technical scheme of the application adopts a mode of generating the customized target interface according to the gesture information of the target user for operating the target device, thereby realizing the purpose of customizing the screen display content of the target device, further realizing the technical effect of improving the user experience when the user operates the target device by adopting different handheld gestures, and solving the technical problem of high user operation difficulty caused by adopting a fixed screen display scheme in the prior art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a flow chart of an alternative interface display method according to an embodiment of the application;
FIG. 2 is a schematic diagram of an alternative interface display system according to an embodiment of the application;
FIG. 3 is a flow chart of another alternative interface display method according to an embodiment of the application;
FIG. 4 is a schematic diagram of an alternative interface display device according to an embodiment of the application;
Fig. 5 is a schematic diagram of an alternative electronic device according to an embodiment of the application.
Detailed Description
In order that those skilled in the art may better understand the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without inventive effort shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that, the related information related to the present application (including the target user information, the target device information, and the handheld gesture information when the target user operates the target device) and the data (including, but not limited to, the data for presentation and the data for analysis) are both information and data authorized by the user or sufficiently authorized by each party. For example, an interface is provided between the system and the relevant user or institution, before acquiring the relevant information, the system needs to send an acquisition request to the user or institution through the interface, and acquire the relevant information after receiving the consent information fed back by the user or institution.
According to an embodiment of the present application, there is provided an interface display method embodiment. It should be noted that the steps illustrated in the flowcharts of the drawings may be performed in a computer system, such as by a set of computer-executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order other than the one illustrated herein.
The application provides an interface display system (display system for short) for executing an interface display method in the application, fig. 1 is a flow chart of an alternative interface display method according to an embodiment of the application, as shown in fig. 1, the method comprises the following steps:
Step S101, acquiring pressure data corresponding to the target device.
In step S101, the pressure data includes at least pressure values to which different positions of the outer frame of the target device are subjected during contact of the target user with the target device.
Alternatively, the pressure data may be collected by pressure sensors mounted at different locations in the outer frame of the target device.
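The patent does not prescribe a data format for these bezel readings; a minimal sketch of one possible representation follows, in which all names (`PressureReading`, the field names, and the sample values) are hypothetical and chosen only to mirror the quantities the embodiments use later (edge of the frame, distance from the device bottom, pressure value):

```python
from dataclasses import dataclass

@dataclass
class PressureReading:
    sensor_id: int               # index of the sensor in the outer frame
    side: str                    # "left", "right", "top", or "bottom" edge
    distance_from_bottom: float  # distance (mm) from the device bottom
    value: float                 # measured pressure value

# Example frame snapshot: two left-edge sensors and one right-edge
# sensor report pressure while the user grips the device.
frame_snapshot = [
    PressureReading(0, "left", 30.0, 1.2),
    PressureReading(1, "left", 60.0, 0.9),
    PressureReading(7, "right", 45.0, 1.5),
]
```

The `distance_from_bottom` field is what the later embodiments compare when computing the first and second target distances.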
Optionally, the display system acquires gravity data corresponding to the target device through the gravity sensor, wherein the gravity data at least comprises an inclination angle of the target device when the target user operates the target device at the current moment.
Step S102, corresponding target posture information when the target user operates the target equipment is determined according to the pressure data.
In step S102, the target gesture information is used to characterize the handheld gesture of the target user when operating the target device.
Optionally, the target gesture information is one of first gesture information, second gesture information, third gesture information, and fourth gesture information, where the first gesture information is used to characterize that the target user uses the left hand to operate the target device, the second gesture information is used to characterize that the target user uses the right hand to operate the target device, the third gesture information is used to characterize that the target user uses both hands to operate the target device, and the fourth gesture information is used to characterize that the target user is temporarily not in contact with the target device.
Step S103, generating a target interface suitable for the target user to operate the target device according to the target gesture information, and displaying the content of the target interface on a screen of the target device.
Optionally, the content of the generated target interface differs according to the target gesture information. For example, when the target user plays a MOBA (Multiplayer Online Battle Arena) game on the target device, the target user operates the device with both hands at the same time; the target gesture information corresponding to the target user is then the third gesture information, and the display system generates, according to the third gesture information, an interface suited to two-handed operation of the target device as the target interface.
Optionally, after the target interface is generated, the display system determines a display direction of the target interface on a screen of the target device according to the gravity data collected in advance by the gravity sensor, and the display system displays the target interface on the screen of the target device according to the display direction.
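The embodiment only states that the display direction follows the gravity data, which include the device's tilt angle; one illustrative way such a rule could look is sketched below. The 45-degree cut-off and the function name are assumptions, not taken from the patent:

```python
def display_orientation(tilt_angle_deg: float) -> str:
    """Pick a display direction from the tilt angle reported by the
    gravity sensor. The 45-degree boundary is an illustrative choice;
    the patent only says orientation is derived from the gravity data."""
    return "landscape" if abs(tilt_angle_deg) > 45 else "portrait"
```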
As can be seen from the above, according to the technical scheme of the application, the gesture information (i.e., the target gesture information) of the target user for operating the target device is determined by the collected pressure values (i.e., the pressure data) received by different positions of the outer frame of the target device in the process that the target user contacts the target device, and then a customized target interface is generated according to the gesture information of the target user for operating the target device, so that the purpose of customizing the screen display content of the intelligent handheld device according to different handheld gestures of the user is achieved, and the user experience is improved.
Therefore, the technical scheme of the application adopts a mode of generating the customized target interface according to the gesture information of the target user for operating the target device, thereby realizing the purpose of customizing the screen display content of the target device, further realizing the technical effect of improving the user experience when the user operates the target device by adopting different handheld gestures, and solving the technical problem of high user operation difficulty caused by adopting a fixed screen display scheme in the prior art.
In an alternative embodiment, the display system firstly obtains M pressure values corresponding to the target device through M sensors uniformly distributed in an outer frame of the target device, where M is a positive integer, each of the M sensors is used to obtain one of the M pressure values, and then the display system uses at least one of the M pressure values that is greater than a preset threshold as the pressure data.
Optionally, when the target user operates the target device, the outer frame of the target device may be touched accidentally. The display system therefore takes the pressure values among the M pressure values that are greater than the preset threshold as valid values, and the pressure values less than or equal to the preset threshold as invalid values, so that the screen content displayed by the target device does not change because of an accidental touch by the target user. This improves the accuracy of the display system and the user experience of the target user.
Optionally, the display system first judges whether each of the M pressure values is greater than the preset threshold; it determines that the state of the sensor corresponding to a pressure value is a first state if that pressure value is greater than the preset threshold, and that the state is a second state if the pressure value is less than or equal to the preset threshold; finally, it takes the pressure values corresponding to the sensors of the M sensors that are in the first state as the pressure data.
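The threshold filtering described above can be sketched as follows; the function name, the sample readings, and the threshold value are illustrative and not taken from the patent:

```python
def filter_pressure_data(values, threshold):
    """Keep only readings strictly above the threshold.

    A sensor whose value exceeds the threshold is in the "first state"
    (its reading counts as pressure data); a sensor at or below the
    threshold is in the "second state" and its reading is discarded as
    an accidental touch."""
    return {i: v for i, v in enumerate(values) if v > threshold}

# M = 6 sensor readings; only sensors 1 and 4 exceed the threshold of 0.5,
# so only their values enter the pressure data.
pressure_data = filter_pressure_data([0.1, 0.8, 0.5, 0.0, 1.3, 0.2],
                                     threshold=0.5)
```

Note that a reading exactly equal to the threshold is discarded, matching the "greater than" test in the embodiment.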
Optionally, fig. 2 is a schematic diagram of an optional interface display system according to an embodiment of the present application, where, as shown in fig. 2, 1 is a target device, 2 is an outer frame of the target device, 3 is sensors uniformly distributed in the outer frame of the target device, and a distance between two adjacent sensors located on the same side of the outer frame of the target device is equal to a preset distance.
In an alternative embodiment, the display system first judges whether the pressure data includes first data and second data, where the first data represents a pressure value corresponding to a sensor in the left outer frame of the target device and the second data represents a pressure value corresponding to a sensor in the right outer frame of the target device. When the pressure data includes both the first data and the second data, the display system determines the target gesture information according to the first data and the second data. When the pressure data does not include the first data and/or the second data, the display system judges whether the pressure data includes third data and fourth data, and determines the target gesture information according to the judgment result, where the third data is a pressure value corresponding to a sensor in the upper outer frame of the target device and the fourth data is a pressure value corresponding to a sensor in the lower outer frame of the target device.
Optionally, the display system first judges whether the sensors in the left outer frame and the sensors in the right outer frame of the target device sense pressure. When sensors in both the left and right outer frames sense pressure at the same time, the target gesture information is determined according to a first operation logic, where the first operation logic determines the target gesture information from the comparison of the first target distance and the second target distance. When the sensors in the left outer frame and/or the right outer frame sense no pressure, the target gesture information is determined according to a second operation logic, where the second operation logic determines the target gesture information from whether the upper outer frame and the lower outer frame of the target device are pressed.
In an alternative embodiment, the display system first determines P first sensors corresponding to the first data and Q second sensors corresponding to the second data, where P and Q are positive integers, the first sensors are the sensors in the left outer frame of the target device used to acquire the first data, and the second sensors are the sensors in the right outer frame used to acquire the second data. The display system then acquires a first distance corresponding to each of the P first sensors and a second distance corresponding to each of the Q second sensors, where the first distance corresponding to each first sensor characterizes the distance between that sensor and the bottom of the target device, and the second distance corresponding to each second sensor characterizes the distance between that sensor and the bottom of the target device. The display system then determines the maximum among the first distances as a first target distance and the maximum among the second distances as a second target distance, and finally determines the target gesture information according to the first target distance and the second target distance.
Optionally, when the pressure data includes both the first data and the second data, the display system compares the distances between each pressed sensor in the left outer frame of the target device and the bottom of the device to obtain the distance h_Lmax between the highest-positioned pressed sensor in the left outer frame and the bottom of the device, and likewise compares the distances between each pressed sensor in the right outer frame and the bottom of the device to obtain the distance h_Rmax between the highest-positioned pressed sensor in the right outer frame and the bottom of the device.
In an alternative embodiment, the display system first determines whether the first target distance is less than the second target distance, then uses the first gesture information as target gesture information if the first target distance is less than the second target distance, where the first gesture information is used to characterize the target user using the left hand to operate the target device, and uses the second gesture information as target gesture information if the first target distance is greater than or equal to the second target distance, where the second gesture information is used to characterize the target user using the right hand to operate the target device.
Optionally, when the pressure data includes both the first data and the second data, the display system compares h_Lmax with h_Rmax. When h_Lmax is less than h_Rmax, it determines that the target user is operating the target device with the left hand at the current moment, and therefore that the target interface is an interface suited to left-handed operation. When h_Lmax is greater than or equal to h_Rmax, it determines that the target user is operating the target device with the right hand at the current moment, and therefore that the target interface is an interface suited to right-handed operation.
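The first operation logic above can be sketched as a short function. The argument format (lists of distances from the device bottom for the pressed sensors on each side) and the return labels are assumptions; the comparison itself follows the embodiment, which maps h_Lmax < h_Rmax to left-handed operation and every other case to right-handed operation:

```python
def first_operation_logic(left_heights, right_heights):
    """Classify the grip from the highest pressed sensor on each side edge.

    left_heights / right_heights: distances from the device bottom of the
    pressed sensors in the left / right outer frame. Both lists are
    non-empty here, i.e. the pressure data contains both first and
    second data."""
    h_lmax = max(left_heights)   # first target distance
    h_rmax = max(right_heights)  # second target distance
    # Per the embodiment: h_Lmax < h_Rmax indicates left-hand operation;
    # h_Lmax >= h_Rmax indicates right-hand operation.
    return "left_hand" if h_lmax < h_rmax else "right_hand"
```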
In an alternative embodiment, the display system takes third gesture information as target gesture information in case the pressure data comprises third data and fourth data at the same time, wherein the third gesture information is used for representing that the target user uses both hands to operate the target device at the same time, and takes fourth gesture information as target gesture information in case the pressure data does not comprise the third data and/or the fourth data, wherein the fourth gesture information is used for representing that the target user is not in contact with the target device.
Optionally, when the pressure data includes both the third data and the fourth data, the upper outer frame and the lower outer frame of the target device are determined to sense pressure at the same time, so it is determined that the target user is operating the target device with both hands at the current moment and that the target interface is an interface suited to two-handed operation. When the pressure data does not include the third data and/or the fourth data, the display content of the screen interface corresponding to the target device at the current moment is kept unchanged.
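The second operation logic reduces to a two-flag check; the function name and return labels below are illustrative:

```python
def second_operation_logic(top_pressed: bool, bottom_pressed: bool) -> str:
    """Second operation logic: used when the left and/or right outer frame
    senses no pressure. Both horizontal edges pressed means a two-handed
    grip (third gesture information); otherwise the user is treated as not
    in contact with the device (fourth gesture information)."""
    if top_pressed and bottom_pressed:
        return "both_hands"
    return "no_contact"
```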
In an alternative embodiment, when the target gesture information is the first gesture information, the display system generates, as the target interface, a preset interface suited to the target user operating the target device with the left hand; when the target gesture information is the second gesture information, it generates, as the target interface, a preset interface suited to right-handed operation; when the target gesture information is the third gesture information, it generates, as the target interface, a preset interface suited to two-handed operation; and when the target gesture information is the fourth gesture information, it determines the screen interface of the target device at the current moment as the target interface.
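The four-way mapping from gesture state to displayed interface can be sketched as follows. This is an illustrative sketch only; the gesture labels and interface names are hypothetical placeholders, not identifiers from the patent.

```python
# Map each recognized gesture state to a preset interface; the fourth
# state ("no contact") keeps the current screen unchanged.
def generate_target_interface(gesture, current_interface):
    presets = {
        "left_hand": "left_hand_interface",    # first gesture information
        "right_hand": "right_hand_interface",  # second gesture information
        "two_hands": "two_hand_interface",     # third gesture information
    }
    # Any unmapped state (e.g. "no_contact") falls through to the
    # interface already on screen.
    return presets.get(gesture, current_interface)

print(generate_target_interface("left_hand", "home"))
```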
Optionally, fig. 3 is a flowchart of another optional interface display method according to an embodiment of the present application. As shown in fig. 3, after the pressure data of the target device is acquired, the display system analyzes the pressure data through the information processing portion. It first determines whether the pressure sensors on the left and right sides of the rectangular outer frame are pressed simultaneously. If they are, the user operation interface (i.e., the target interface) of the target device is determined according to the first operation logic; if they are not, the user operation interface is determined according to the second operation logic. If none of the pressure sensors of the rectangular outer frame is pressed, neither the first nor the second operation logic is executed, and the user operation interface of the target device at the current moment is kept unchanged through the display portion.
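The three-way branch described for fig. 3 can be sketched as a small dispatch function. All names are illustrative assumptions; the patent specifies only the branching conditions, not any implementation.

```python
# Dispatch between the two operation logics based on which frame
# sensors are pressed, following the fig. 3 flow:
#   left AND right pressed      -> first operation logic
#   some sensor pressed         -> second operation logic
#   nothing pressed             -> keep the current interface
def choose_logic(left_pressed, right_pressed, any_pressed):
    if left_pressed and right_pressed:
        return "first_logic"
    if any_pressed:
        return "second_logic"
    return "keep_interface"

print(choose_logic(True, True, True))
```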
Optionally, the specific steps of the first operation logic include: obtaining the distances h_L between each pressed pressure sensor on the left side of the rectangular outer frame (i.e., the outer frame of the target device) and the bottom surface of the rectangular outer frame (i.e., the bottom of the target device), and the distances h_R between each pressed pressure sensor on the right side of the rectangular outer frame and the bottom surface. The display system then compares the distances h_L of the pressed sensors on the left side to obtain the distance h_Lmax between the highest pressed sensor on the left side and the bottom surface, and likewise compares the distances h_R of the pressed sensors on the right side to obtain the distance h_Rmax between the highest pressed sensor on the right side and the bottom surface.
The system then compares h_Lmax with h_Rmax. If h_Lmax is greater than h_Rmax, the target user is currently operating the target device with the right hand, and the display system displays, through the display portion, a user operation interface suited to right-handed operation; if h_Rmax is greater than h_Lmax, the target user is currently operating the target device with the left hand, and the display system displays, through the display portion, a user operation interface suited to left-handed operation.
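A minimal sketch of the first operation logic, under assumed names: take the heights (distance from the device bottom) of the pressed sensors on each side, keep the maxima h_Lmax and h_Rmax, and infer the operating hand. The tie-breaking rule (equal heights treated as right-handed) follows the embodiment text; everything else here is an illustrative assumption.

```python
# First operation logic: the side whose highest pressed sensor sits
# LOWER on the frame is taken as the gripping (operating) hand, per the
# comparison rule in the embodiment (h_Lmax < h_Rmax -> left hand).
def decide_hand(left_heights, right_heights):
    h_lmax = max(left_heights)   # highest pressed sensor, left frame
    h_rmax = max(right_heights)  # highest pressed sensor, right frame
    return "left" if h_lmax < h_rmax else "right"

print(decide_hand([3.0, 1.0], [5.0, 2.0]))
```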
Optionally, the specific steps of the second operation logic include: the display system detects whether the pressure sensors on the upper and lower sides of the rectangular outer frame are pressed simultaneously. If they are, it displays, through the display portion, a user operation interface suited to the target user's two-handed operation at the current moment. If only the upper-side or only the lower-side pressure sensors of the rectangular outer frame are pressed, it is determined that the user is not in a two-handed operation mode, so the display portion keeps the user operation interface at the current moment unchanged, and the display system finally re-executes the first operation logic in a judgment loop. The priority of the display system executing the first operation logic through the information processing portion is higher than that of executing the second operation logic.
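The second operation logic can be sketched as follows, with hypothetical names throughout; the patent describes only the conditions, so this is a sketch under those assumptions.

```python
# Second operation logic: when the left and right frames are not pressed
# together, check whether the top and bottom frames are pressed
# simultaneously (two-hand grip); otherwise keep the current interface.
def second_operation_logic(top_pressed, bottom_pressed, current_interface):
    if top_pressed and bottom_pressed:
        return "two_hand_interface"   # both short sides gripped
    return current_interface          # keep the display unchanged

print(second_operation_logic(True, True, "home"))
```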
Optionally, the display system further comprises a ranging module, which measures the distance between the target device and the target user through a distance sensor. When the distance between the target device and the target user is greater than the maximum preset man-machine distance, the power supply of the target device is turned off. Optionally, the monitoring range of the ranging module is 100 meters in diameter, and the device must remain outside the monitoring range for 3 minutes before the power-off takes effect.
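Reading the ranging-module behavior as "out of range for the full grace period, then power off", a sketch might look like this. The 50 m radius (half the stated 100 m diameter) and 3-minute hold come from the text; the names and the exact thresholding are assumptions.

```python
MAX_DISTANCE_M = 50.0   # assumed monitoring radius (100 m diameter per the text)
GRACE_SECONDS = 180     # 3-minute out-of-range hold before power-off

# Power-off policy: only trip when the user is beyond the preset
# man-machine distance AND has stayed there for the whole grace period.
def should_power_off(distance_m, seconds_out_of_range):
    return distance_m > MAX_DISTANCE_M and seconds_out_of_range >= GRACE_SECONDS

print(should_power_off(60.0, 200))
```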
Optionally, the display system further comprises a wireless communication module for wireless network construction, wireless data transmission, wireless network management, and wireless network security. Long-distance wireless communication is realized through wireless network construction; different types of equipment are connected through wireless data transmission, which also enables remote monitoring and control; nodes in the network are operated on and the network is managed through wireless network management; and wireless network security prevents the network of the target device from being attacked by applying a variety of security policies.
According to the method, pressure data corresponding to the target device are first acquired, where the pressure data at least include the pressure values received at different positions of the outer frame of the target device while the target user is in contact with the target device; target gesture information corresponding to the target user operating the target device is then determined from the pressure data, where the target gesture information characterizes the handheld gesture of the target user when operating the target device; finally, a target interface suited to the target user operating the target device is generated according to the target gesture information, and the content of the target interface is displayed on the screen of the target device.
As can be seen from the above, the technical scheme of the application determines the gesture information (i.e., the target gesture information) with which the target user operates the target device from the collected pressure values (i.e., the pressure data) received at different positions of the outer frame while the target user contacts the target device, and then generates a customized target interface according to that gesture information, thereby customizing the screen display content of the intelligent handheld device for the user's different handheld gestures and improving the user experience.
Therefore, by generating a customized target interface according to the gesture information with which the target user operates the target device, the technical scheme of the application customizes the screen display content of the target device, achieves the technical effect of improving the user experience when the user operates the target device with different handheld gestures, and solves the technical problem in the prior art of high user operation difficulty caused by a fixed screen display scheme.
According to another aspect of the embodiment of the application, an interface display device is also provided. FIG. 4 is a schematic diagram of an alternative interface display device according to an embodiment of the application. As shown in FIG. 4, the interface display device comprises: an acquisition unit 401, a determination unit 402, and a generation unit 403.
Optionally, the acquiring unit is configured to acquire pressure data corresponding to the target device, where the pressure data at least includes pressure values received by different positions of an outer frame of the target device during a process that the target user contacts the target device; the determining unit is used for determining corresponding target gesture information when the target user operates the target device according to the pressure data, wherein the target gesture information is used for representing the handheld gesture of the target user when the target user operates the target device; and the generating unit is used for generating a target interface suitable for the target user to operate the target equipment according to the target gesture information and displaying the content of the target interface on a screen of the target equipment.
In an alternative embodiment, the acquisition unit comprises: a first acquisition subunit and a first determination subunit.
Optionally, the first obtaining subunit is configured to obtain M pressure values corresponding to the target device through M sensors uniformly distributed in an outer frame of the target device, where M is a positive integer, and each sensor in the M sensors is configured to obtain one pressure value in the M pressure values; and the first determination subunit is used for taking at least one pressure value which is larger than a preset threshold value in the M pressure values as pressure data.
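The acquisition-and-filtering step above (keep only the readings above a preset threshold as "pressure data") can be sketched as follows. The threshold value and all names are illustrative assumptions, not taken from the patent.

```python
PRESSURE_THRESHOLD = 0.5  # assumed activation threshold, arbitrary units

# Of the M readings reported by the frame sensors, keep only the
# (sensor index, value) pairs whose value exceeds the threshold.
def collect_pressure_data(readings):
    return [(i, v) for i, v in enumerate(readings) if v > PRESSURE_THRESHOLD]

raw = [0.0, 0.9, 0.0, 1.2, 0.3]  # M = 5 sensor readings
print(collect_pressure_data(raw))  # sensors 1 and 3 count as "pressed"
```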
In an alternative embodiment, the determining unit comprises: the first judging subunit, the second determining subunit and the third determining subunit.
Optionally, the first judging subunit is configured to judge whether the pressure data includes first data and second data, where the first data characterizes a pressure value corresponding to a sensor in the left outer frame of the target device, and the second data characterizes a pressure value corresponding to a sensor in the right outer frame of the target device; the second determining subunit is configured to determine, when the pressure data includes both the first data and the second data, the target gesture information according to the first data and the second data; and the third determining subunit is configured to judge, when the pressure data does not include the first data and/or the second data, whether the pressure data includes third data and fourth data, and to determine the target gesture information according to the obtained judgment result, where the third data are pressure values corresponding to sensors in the upper outer frame of the target device, and the fourth data are pressure values corresponding to sensors in the lower outer frame of the target device.
In an alternative embodiment, the second determining subunit comprises: the device comprises a first determining module, an acquiring module, a second determining module, a third determining module and a fourth determining module.
Optionally, the first determining module is configured to determine P first sensors corresponding to the first data and Q second sensors corresponding to the second data, where P and Q are both positive integers, a first sensor is a sensor in the left outer frame of the target device that collects the first data, and a second sensor is a sensor in the right outer frame of the target device that collects the second data; the obtaining module is configured to obtain a first distance corresponding to each of the P first sensors and a second distance corresponding to each of the Q second sensors, where the first distance corresponding to each first sensor represents the distance between that first sensor and the bottom of the target device, and the second distance corresponding to each second sensor represents the distance between that second sensor and the bottom of the target device; the second determining module is configured to determine the maximum first distance among the first distances corresponding to the P first sensors as the first target distance; the third determining module is configured to determine the maximum second distance among the second distances corresponding to the Q second sensors as the second target distance; and the fourth determining module is configured to determine the target gesture information according to the first target distance and the second target distance.
In an alternative embodiment, the fourth determining module includes: the device comprises a judging sub-module, a first determining sub-module and a second determining sub-module.
Optionally, the judging submodule is used for judging whether the first target distance is smaller than the second target distance; the first determining submodule is used for taking the first gesture information as target gesture information under the condition that the first target distance is smaller than the second target distance, wherein the first gesture information is used for representing that a target user uses a left hand to operate the target device; and the second determining submodule is used for taking second gesture information as target gesture information when the first target distance is greater than or equal to a second target distance, wherein the second gesture information is used for representing that a target user uses a right hand to operate the target device.
In an alternative embodiment, the third determining subunit comprises: a fifth determination module and a sixth determination module.
Optionally, the fifth determining module is configured to take the third gesture information as the target gesture information when the pressure data includes both the third data and the fourth data, where the third gesture information characterizes the target user operating the target device with both hands simultaneously; and the sixth determining module is configured to take the fourth gesture information as the target gesture information when the pressure data does not include the third data and/or the fourth data, where the fourth gesture information characterizes the target user not being in contact with the target device.
In an alternative embodiment, the generating unit comprises: the device comprises a first generation subunit, a second generation subunit, a third generation subunit and a fourth generation subunit.
Optionally, the first generating subunit is configured to generate, when the target gesture information is the first gesture information, a preset interface suited to the target user operating the target device with the left hand as the target interface; the second generating subunit is configured to generate, when the target gesture information is the second gesture information, a preset interface suited to the target user operating the target device with the right hand as the target interface; the third generating subunit is configured to generate, when the target gesture information is the third gesture information, a preset interface suited to the target user operating the target device with both hands as the target interface; and the fourth generating subunit is configured to determine, when the target gesture information is the fourth gesture information, the screen interface of the target device at the current moment as the target interface.
According to the device, pressure data corresponding to the target device are first acquired, where the pressure data at least include the pressure values received at different positions of the outer frame of the target device while the target user is in contact with the target device; target gesture information corresponding to the target user operating the target device is then determined from the pressure data, where the target gesture information characterizes the handheld gesture of the target user when operating the target device; finally, a target interface suited to the target user operating the target device is generated according to the target gesture information, and the content of the target interface is displayed on the screen of the target device.
As can be seen from the above, the technical scheme of the application determines the gesture information (i.e., the target gesture information) with which the target user operates the target device from the collected pressure values (i.e., the pressure data) received at different positions of the outer frame while the target user contacts the target device, and then generates a customized target interface according to that gesture information, thereby customizing the screen display content of the intelligent handheld device for the user's different handheld gestures and improving the user experience.
Therefore, by generating a customized target interface according to the gesture information with which the target user operates the target device, the technical scheme of the application customizes the screen display content of the target device, achieves the technical effect of improving the user experience when the user operates the target device with different handheld gestures, and solves the technical problem in the prior art of high user operation difficulty caused by a fixed screen display scheme.
According to another aspect of the embodiment of the present application, there is also provided a computer readable storage medium, including a stored computer program, where the computer program when executed controls a device in which the computer readable storage medium is located to perform any one of the interface display methods described above.
According to another aspect of the embodiment of the present application, there is further provided an electronic device. Fig. 5 is a schematic diagram of an alternative electronic device according to an embodiment of the present application. As shown in fig. 5, the electronic device comprises a rectangular outer frame and a device body. An information acquisition portion is arranged in the rectangular outer frame and acquires data about how the user holds the electronic device. An information processing portion and a display portion are arranged in the device body: the information processing portion analyzes the acquired data based on the first operation logic and the second operation logic and determines, from the analysis result, the hand with which the user is currently operating the electronic device, and the display portion displays a user operation interface suited to that operating hand.
Optionally, the information acquisition portion further includes M pressure sensors arranged equidistantly on the four sides of the rectangular outer frame, each pressure sensor being configured to detect the pressure exerted by the user's operating hand on the rectangular outer frame, where the rectangular outer frame is mounted on the outer wall of the electronic device body.
Optionally, the first operation logic of the information processing portion includes: detecting the distances between the pressed pressure sensors on the left and right sides of the rectangular outer frame and the bottom surface of the rectangular outer frame, and executing the second operation logic if the pressure sensors on the left and right sides are not pressed simultaneously; comparing the distances between each pressed pressure sensor on the left side and the bottom surface to obtain the maximum distance h_Lmax; comparing the distances between each pressed pressure sensor on the right side and the bottom surface to obtain the maximum distance h_Rmax; and comparing h_Lmax with h_Rmax, determining from the comparison result the hand with which the user is currently operating the electronic device, the display portion then displaying a user operation interface suited to that operating hand.
Optionally, the second operation logic of the information processing portion includes: detecting whether the pressure sensors on the upper and lower sides of the rectangular outer frame are pressed simultaneously; if only the upper-side or only the lower-side pressure sensors are pressed, the display portion keeps the user operation interface unchanged, and if the pressure sensors on the upper and lower sides are pressed simultaneously, the display portion displays a user operation interface suited to the user's current two-handed operation.
Optionally, the information processing portion executes the first operation logic with a higher priority than the second operation logic.
Optionally, a gravity sensor is arranged in the device body, and the display portion automatically switches the display orientation of the user operation interface according to the measurement result of the gravity sensor.
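The gravity-based rotation step can be sketched as choosing an orientation from the dominant axis of the gravity vector. The thresholding scheme below is an assumption for illustration; the patent does not specify how the measurement is interpreted.

```python
# Pick portrait vs landscape from the two in-plane components of the
# gravity vector: a larger magnitude along y means the device is held
# upright.
def display_orientation(gx, gy):
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(display_orientation(0.1, 9.8))
```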
In the foregoing embodiments of the present application, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises that element.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.
Claims (10)
1. An interface display method, comprising:
Acquiring pressure data corresponding to target equipment, wherein the pressure data at least comprises pressure values received by different positions of an outer frame of the target equipment in the process that a target user contacts the target equipment;
Determining corresponding target gesture information when the target user operates the target device according to the pressure data, wherein the target gesture information is used for representing a handheld gesture when the target user operates the target device;
And generating a target interface suitable for the target user to operate the target equipment according to the target gesture information, and displaying the content of the target interface on a screen of the target equipment.
2. The interface display method according to claim 1, wherein acquiring pressure data corresponding to the target device includes:
M pressure values corresponding to the target equipment are obtained through M sensors uniformly distributed in the outer frame of the target equipment, wherein M is a positive integer, and each sensor in the M sensors is used for obtaining one pressure value in the M pressure values;
And taking at least one pressure value which is larger than a preset threshold value in the M pressure values as the pressure data.
3. The interface display method according to claim 1, wherein determining target gesture information corresponding to the target user operating the target device according to the pressure data includes:
judging whether the pressure data comprises first data and second data, wherein the first data is used for representing a pressure value corresponding to a sensor in a left outer frame of the target equipment, and the second data is used for representing a pressure value corresponding to a sensor in a right outer frame of the target equipment;
Determining the target gesture information according to the first data and the second data under the condition that the pressure data simultaneously comprises the first data and the second data;
And under the condition that the pressure data does not comprise the first data and/or the second data, judging whether the pressure data comprises third data and fourth data, and determining the target gesture information according to the obtained judging result, wherein the third data are pressure values corresponding to sensors in an upper outer frame of the target device, and the fourth data are pressure values corresponding to sensors in a lower outer frame of the target device.
4. The interface display method according to claim 3, wherein determining the target gesture information from the first data and the second data includes:
determining P first sensors corresponding to the first data and Q second sensors corresponding to the second data, wherein P and Q are positive integers, the first sensors are sensors used for acquiring the first data in a left outer frame of the target equipment, and the second sensors are sensors used for acquiring the second data in a right outer frame of the target equipment;
Acquiring a first distance corresponding to each first sensor in the P first sensors and a second distance corresponding to each second sensor in the Q second sensors, wherein the first distance corresponding to each first sensor is used for representing the distance between the first sensor and the bottom of the target equipment, and the second distance corresponding to each second sensor is used for representing the distance between the second sensor and the bottom of the target equipment;
determining the maximum first distance among the first distances corresponding to each of the P first sensors as a first target distance;
determining the maximum second distance among the second distances corresponding to each of the Q second sensors as a second target distance;
And determining the target gesture information according to the first target distance and the second target distance.
5. The interface display method according to claim 4, wherein determining the target gesture information according to the first target distance and the second target distance comprises:
judging whether the first target distance is smaller than the second target distance;
Taking first gesture information as the target gesture information when the first target distance is smaller than the second target distance, wherein the first gesture information is used for representing that the target user uses a left hand to operate the target device;
And taking second gesture information as the target gesture information when the first target distance is greater than or equal to the second target distance, wherein the second gesture information is used for representing that the target user uses the right hand to operate the target device.
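The comparison in claims 4 and 5 can be sketched in a few lines. This is a minimal illustration only, not the patented implementation: each pressed sensor is modeled purely by its distance from the bottom of the device, and all function and variable names are assumptions.

```python
# Sketch of the left/right-hand determination of claims 4-5.
# left_distances:  distances from the device bottom of the P pressed
#                  sensors in the left outer frame
# right_distances: the same for the Q sensors in the right outer frame
def determine_hand(left_distances, right_distances):
    first_target = max(left_distances)    # highest pressed point, left frame
    second_target = max(right_distances)  # highest pressed point, right frame
    # Claim 5: first target distance smaller than second -> left-hand use.
    if first_target < second_target:
        return "left"   # first gesture information
    return "right"      # second gesture information (covers >= case)

print(determine_hand([30, 55], [40, 80]))  # 55 < 80 -> left
print(determine_hand([90, 20], [60]))      # 90 >= 60 -> right
```

The `max` over each frame reflects the claim's choice of the sensor farthest from the device bottom as the representative grip point for that side.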
6. The interface display method according to claim 3, wherein determining whether the pressure data includes the third data and the fourth data, and determining the target gesture information according to the obtained determination result, comprises:
taking third gesture information as the target gesture information in the case that the pressure data includes both the third data and the fourth data, wherein the third gesture information indicates that the target user operates the target device with both hands;
and taking fourth gesture information as the target gesture information in the case that the pressure data does not include the third data and/or the fourth data, wherein the fourth gesture information indicates that the target user is not in contact with the target device.
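Claim 6's check reduces to a conjunction over the two remaining frame edges. A minimal sketch, with illustrative names (the patent only specifies the logic, not any API):

```python
# Sketch of claim 6: third data comes from the upper outer-frame
# sensors, fourth data from the lower outer-frame sensors.
def bezel_gesture(has_top_pressure: bool, has_bottom_pressure: bool) -> str:
    if has_top_pressure and has_bottom_pressure:
        return "both-hands"  # third gesture information
    return "no-contact"      # fourth gesture information

print(bezel_gesture(True, True))   # -> both-hands
print(bezel_gesture(False, True))  # -> no-contact
```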
7. The interface display method according to claim 1, wherein generating, according to the target gesture information, a target interface suitable for the target user to operate the target device comprises:
generating, as the target interface, a preset interface suitable for the target user to operate the target device with the left hand, in the case that the target gesture information is the first gesture information;
generating, as the target interface, a preset interface suitable for the target user to operate the target device with the right hand, in the case that the target gesture information is the second gesture information;
generating, as the target interface, a preset interface suitable for the target user to operate the target device with both hands, in the case that the target gesture information is the third gesture information;
and determining, as the target interface, the screen interface displayed by the target device at the current moment, in the case that the target gesture information is the fourth gesture information.
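Claim 7 is a four-way dispatch from gesture information to interface. The sketch below illustrates that mapping; the interface names and the dictionary-based dispatch are assumptions for illustration, not taken from the patent.

```python
# Illustrative dispatch for claim 7: first/second/third gesture
# information selects a preset interface; fourth keeps the screen
# interface shown at the current moment.
PRESETS = {
    "first":  "left-hand-preset",   # left-hand operation
    "second": "right-hand-preset",  # right-hand operation
    "third":  "two-hand-preset",    # two-hand operation
}

def generate_target_interface(gesture: str, current_screen: str = "current-screen") -> str:
    # Fourth gesture information (user not touching the device) falls
    # through to the current screen interface.
    return PRESETS.get(gesture, current_screen)

print(generate_target_interface("first"))   # -> left-hand-preset
print(generate_target_interface("fourth"))  # -> current-screen
```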
8. An interface display device, comprising:
an acquisition unit, configured to acquire pressure data corresponding to a target device, wherein the pressure data at least comprises pressure values received at different positions of an outer frame of the target device while a target user is in contact with the target device;
a determining unit, configured to determine, according to the pressure data, target gesture information corresponding to the target user operating the target device, wherein the target gesture information represents the hand-held gesture of the target user when operating the target device;
and a generating unit, configured to generate, according to the target gesture information, a target interface suitable for the target user to operate the target device, and to display the content of the target interface on a screen of the target device.
9. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and wherein the computer program, when executed, controls a device in which the computer-readable storage medium is located to perform the interface display method according to any one of claims 1 to 7.
10. An electronic device comprising one or more processors and a memory for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the interface display method of any of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410102968.9A CN117971043A (en) | 2024-01-24 | 2024-01-24 | Interface display method and device, storage medium and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410102968.9A CN117971043A (en) | 2024-01-24 | 2024-01-24 | Interface display method and device, storage medium and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117971043A true CN117971043A (en) | 2024-05-03 |
Family
ID=90847397
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410102968.9A Pending CN117971043A (en) | 2024-01-24 | 2024-01-24 | Interface display method and device, storage medium and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117971043A (en) |
Similar Documents
Publication | Title |
---|---|
EP2837992B1 (en) | User interface interaction method and apparatus applied in touchscreen device, and touchscreen device |
US12102904B2 (en) | Game control method based on a smart bracelet, smart bracelet and storage medium |
JP5738707B2 (en) | Touch panel |
CN105824610A (en) | A terminal-based method and device for arranging application program icons |
CN106325675B (en) | Icon placing method and terminal device |
KR20140008637A (en) | Method using pen input device and terminal thereof |
CN105824422B (en) | An information processing method and electronic device |
CN105302407A (en) | Application icon display method and apparatus |
CN108089691A (en) | A method and device for reducing terminal device power consumption |
CN107632760B (en) | Handwriting circle-selection method and device, touch device and readable storage medium |
CN108920055A (en) | Touch operation method, device, storage medium and electronic device |
US20140168066A1 (en) | Method and electronic device for controlling data transmission |
CN104182126A (en) | Method and device for dialing numbers via mobile terminals |
CN105242780A (en) | Interactive control method and apparatus |
CN104038832A (en) | Video playing method and device |
CN104182161A (en) | Method and device for opening a screen functional area |
CN104834655B (en) | A method and apparatus for displaying quality parameters of network resources |
CN107092410A (en) | Interface interaction method and device for a touch screen, and intelligent terminal |
CN108604142B (en) | Touch screen device operation method and touch screen device |
CN106909272B (en) | Display control method and mobile terminal |
CN117971043A (en) | Interface display method and device, storage medium and electronic equipment |
CN105487785A (en) | Method and device for locking the screen of terminal equipment, and terminal equipment |
CN114298403B (en) | Method and device for predicting the attention of works |
CN117387788A (en) | Ambient temperature detection method, device, equipment and medium based on capacitive touch screen |
CN113238708B (en) | Method and device for displaying touch operation information in head-mounted display equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||