CN115445186B - Method, device, equipment and medium for controlling virtual lens in game - Google Patents
- Publication number
- CN115445186B (application number CN202211160187.2A)
- Authority
- CN
- China
- Prior art keywords
- control
- lens
- game
- virtual lens
- pose
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a method, a device, equipment and a medium for controlling a virtual lens in a game, applied to the technical field of games, for solving the problem in the prior art that controlling the virtual lens through a lens control part on the game control device is inefficient, which affects the game experience. The method comprises: receiving a first lens control instruction sent by a game control device, wherein the first lens control instruction is sent by the game control device in response to a first trigger operation for a lens control component and a pose control operation for the game control device; determining pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction; and controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display in the graphical user interface a game scene picture captured after the pose adjustment of the virtual lens. Misoperation is thereby effectively prevented, and virtual lens control efficiency and the game experience are improved.
Description
Technical Field
The present application relates to the field of game technologies, and in particular, to a method, an apparatus, a device, and a medium for controlling a virtual lens in a game.
Background
In large-scale graphical games such as multiplayer online battle arena (MOBA) games, players often need to move the virtual lens frequently to observe the game scene from different perspectives and respond to changing in-match situations.
Currently, a player can perform game operations by means of a game control device; for example, the player can control the virtual lens by manipulating a lens control part on the game control device. However, because the rate at which the lens control part moves the virtual lens is relatively fixed, when the virtual lens must be controlled urgently and rapidly in a crisis situation, controlling it through the lens control part alone is inefficient and reduces the player's game experience.
Disclosure of Invention
The embodiment of the application provides a method, a device, equipment and a medium for controlling a virtual lens in a game, used for solving the problem in the prior art that controlling the virtual lens through a lens control component on the game control equipment is inefficient, which affects the game experience.
The technical scheme provided by the embodiment of the application is as follows:
In one aspect, an embodiment of the present application provides a method for controlling a virtual lens in a game, in which a graphical user interface is provided by a terminal device connected to a game control device, and at least a game scene picture captured by the virtual lens is displayed in the graphical user interface. The method for controlling the virtual lens in the game includes:
Receiving a first lens control instruction sent by the game control equipment, wherein the first lens control instruction is sent by the game control equipment in response to a first trigger operation aiming at a lens control part and a pose control operation aiming at the game control equipment;
Determining pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction, and controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display in the graphical user interface a game scene picture captured after the pose adjustment of the virtual lens.
In one possible implementation manner, before receiving the first lens control instruction sent by the game control device, the method further includes:
receiving a first mode control instruction sent by the game control device, wherein the first mode control instruction is sent by the game control device in response to a first trigger operation aiming at a mode control part;
based on the first mode control instruction, the current control mode of the virtual lens is set to be a somatosensory control mode.
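The mode-setting step above can be sketched as a small state holder on the terminal device. This is an illustrative assumption, not an implementation prescribed by the application; all class and method names are hypothetical.

```python
from enum import Enum

class LensControlMode(Enum):
    COMPONENT = "component"          # lens driven by the lens control part (e.g. right joystick)
    SOMATOSENSORY = "somatosensory"  # lens driven by motion-sensing (pose) data

class ModeState:
    """Tracks the current control mode of the virtual lens on the terminal device."""
    def __init__(self):
        # Assumed default before any mode control instruction arrives.
        self.mode = LensControlMode.COMPONENT

    def on_first_mode_instruction(self):
        # First mode control instruction -> somatosensory control mode.
        self.mode = LensControlMode.SOMATOSENSORY

    def on_second_mode_instruction(self):
        # Second mode control instruction -> back to component control mode.
        self.mode = LensControlMode.COMPONENT
```

The same state holder also covers the later-described second mode control instruction, which switches back to the component control mode.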
In one possible implementation manner, receiving a first lens control instruction sent by the game control device includes:
and receiving a simultaneous control instruction sent by the game control device as the first lens control instruction, wherein the simultaneous control instruction is sent by the game control device in response to a first trigger operation for the lens control part and a pose control operation for the game control device that are executed simultaneously.
In one possible implementation manner, receiving a first lens control instruction sent by the game control device includes:
And receiving a continuous control instruction sent by the game control device as the first lens control instruction, wherein the continuous control instruction is sent by the game control device in response to a first trigger operation for the lens control part and a pose control operation for the game control device that are executed in succession, the pose control operation being a continuation of the first trigger operation.
In one possible embodiment, determining pose control data of a virtual lens based on motion sensing control data corresponding to a pose control operation in a first lens control instruction includes:
Determining a position offset of the virtual lens based on the position offset of the game control device in the motion sensing control data, and determining a moving speed of the virtual lens based on the moving speed of the game control device in the motion sensing control data;
And/or;
The direction offset of the virtual lens is determined based on the rotational offset of the game control device in the motion-sensing control data, and the rotational speed of the virtual lens is determined based on the rotational speed of the game control device in the motion-sensing control data.
In a possible implementation manner, the method for controlling the virtual lens in the game provided by the embodiment of the application further includes:
receiving a second mode control instruction sent by the game control device, wherein the second mode control instruction is sent by the game control device in response to a second trigger operation aiming at the mode control part;
And switching the current control mode of the virtual lens from the somatosensory control mode to the component control mode based on the second mode control instruction.
In a possible implementation manner, the method for controlling the virtual lens in the game provided by the embodiment of the application further includes:
receiving a second lens control instruction sent by the game control device, wherein the second lens control instruction is sent by the game control device in response to a second trigger operation for the lens control component;
And determining pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction, and controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display in the graphical user interface a game scene picture captured after the pose adjustment of the virtual lens.
In one possible embodiment, determining pose control data of the virtual lens based on operation control data of the lens control part in the second lens control instruction includes:
Determining the direction offset of the virtual lens based on the control direction of the lens control part in the operation control data, and adjusting the rotation speed of the virtual lens according to the set multiple based on the control speed of the lens control part in the operation control data;
And/or;
and determining the displacement offset of the virtual lens based on the control time length of the lens control component in the operation control data, and adjusting the moving speed of the virtual lens according to the set multiple based on the control speed of the lens control component in the operation control data.
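The component-control-mode derivation above (direction offset from control direction, displacement offset from control duration, both speeds scaled by a set multiple) can be sketched as follows. The function name, the dictionary keys, and the concrete values of `speed_multiple` and `base_rate` are assumptions for illustration only.

```python
def pose_data_from_component(control_direction, control_speed, control_duration,
                             speed_multiple=2.0, base_rate=1.0):
    """Derive lens pose control data from lens-control-part operation data.

    control_direction: (dx, dy) direction in which the control part is pushed.
    control_speed: how fast the control part is operated.
    control_duration: how long the control part is held, in seconds.
    speed_multiple: the 'set multiple' mentioned in the text (value assumed).
    """
    return {
        "direction_offset": control_direction,                  # from control direction
        "rotation_speed": control_speed * speed_multiple,        # adjusted by set multiple
        "displacement_offset": base_rate * control_duration,     # from control duration
        "moving_speed": control_speed * speed_multiple,          # adjusted by set multiple
    }
```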
In one possible embodiment, controlling the virtual lens to adjust the pose based on pose control data of the virtual lens includes:
controlling the movement of the virtual lens based on the position offset of the virtual lens in the pose control data of the virtual lens according to the movement speed of the virtual lens in the pose control data of the virtual lens;
And/or;
And controlling the rotation of the virtual lens based on the direction offset of the virtual lens in the pose control data of the virtual lens according to the rotation speed of the virtual lens in the pose control data of the virtual lens.
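The movement-and-rotation control described above can be sketched as a per-frame update that moves the lens toward its position offset at the given moving speed and turns it toward its direction offset at the given rotation speed. A simple 2D top-down camera is assumed; all names and the clamping behavior are illustrative, not prescribed by the application.

```python
def apply_pose_control(lens_pos, lens_yaw, pose, dt):
    """Advance the virtual lens pose by one frame of duration dt (seconds)."""
    x, y = lens_pos
    ox, oy = pose.get("position_offset", (0.0, 0.0))
    move_speed = pose.get("moving_speed", 0.0)
    mag = (ox * ox + oy * oy) ** 0.5
    if mag > 0.0 and move_speed > 0.0:
        step = min(move_speed * dt, mag)   # never overshoot the target offset
        x += ox / mag * step
        y += oy / mag * step
    yaw = lens_yaw
    dyaw = pose.get("direction_offset", 0.0)   # signed yaw offset, degrees
    rot_speed = pose.get("rotation_speed", 0.0)
    if dyaw != 0.0 and rot_speed > 0.0:
        step = min(rot_speed * dt, abs(dyaw))
        yaw += step if dyaw > 0 else -step
    return (x, y), yaw
```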
In another aspect, an embodiment of the present application provides an in-game virtual lens control apparatus that provides a graphical user interface, in which at least a game scene picture captured by the virtual lens is displayed, and that is connected to a game control device. The in-game virtual lens control apparatus includes:
A first receiving unit configured to receive a lens control instruction sent by a game control apparatus, where the lens control instruction is sent by the game control apparatus in response to a first trigger operation for a lens control part and a pose control operation for the game control apparatus;
The first determining unit is used for determining pose control data of the virtual lens based on the motion sensing control data corresponding to the pose control operation in the lens control instruction;
And the lens control unit is used for controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display in the graphical user interface a game scene picture captured after the pose adjustment of the virtual lens.
In a possible implementation manner, the in-game virtual lens control device provided by the embodiment of the application further includes:
a second receiving unit configured to receive a first mode control instruction transmitted by the game control device, wherein the first mode control instruction is transmitted by the game control device in response to a first trigger operation for the mode control section;
And the mode control unit is used for setting the current control mode of the virtual lens to be a somatosensory control mode based on the first mode control instruction.
In one possible implementation manner, when receiving the first lens control instruction sent by the game control device, the first receiving unit is specifically configured to:
and receiving a simultaneous control instruction sent by the game control device as the first lens control instruction, wherein the simultaneous control instruction is sent by the game control device in response to a first trigger operation for the lens control part and a pose control operation for the game control device that are executed simultaneously.
In one possible implementation manner, when receiving the first lens control instruction sent by the game control device, the first receiving unit is specifically configured to:
And receiving a continuous control instruction sent by the game control device as the first lens control instruction, wherein the continuous control instruction is sent by the game control device in response to a first trigger operation for the lens control part and a pose control operation for the game control device that are executed in succession, the pose control operation being a continuation of the first trigger operation.
In one possible implementation manner, when determining the pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction, the first determining unit is specifically configured to:
Determining a position offset of the virtual lens based on the position offset of the game control device in the motion sensing control data, and determining a moving speed of the virtual lens based on the moving speed of the game control device in the motion sensing control data;
And/or;
The direction offset of the virtual lens is determined based on the rotational offset of the game control device in the motion-sensing control data, and the rotational speed of the virtual lens is determined based on the rotational speed of the game control device in the motion-sensing control data.
In a possible implementation manner, the in-game virtual lens control device provided by the embodiment of the application further includes:
A third receiving unit configured to receive a second mode control instruction transmitted by the game control device, wherein the second mode control instruction is transmitted by the game control device in response to a second trigger operation for the mode control section;
And the mode control unit is used for switching the current control mode of the virtual lens from the somatosensory control mode to the component control mode based on the second mode control instruction.
In a possible implementation manner, the in-game virtual lens control device provided by the embodiment of the application further includes:
a fourth receiving unit configured to receive a second lens control instruction transmitted by the game control apparatus, wherein the second lens control instruction is transmitted by the game control apparatus in response to a second trigger operation for the lens control section;
a second determining unit configured to determine pose control data of the virtual lens based on operation control data of the lens control section in the second lens control instruction;
And the lens control unit is used for controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display in the graphical user interface a game scene picture captured after the pose adjustment of the virtual lens.
In one possible implementation manner, when determining the pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction, the second determining unit is specifically configured to:
Determining the direction offset of the virtual lens based on the control direction of the lens control part in the operation control data, and adjusting the rotation speed of the virtual lens according to the set multiple based on the control speed of the lens control part in the operation control data;
And/or;
and determining the displacement offset of the virtual lens based on the control time length of the lens control component in the operation control data, and adjusting the moving speed of the virtual lens according to the set multiple based on the control speed of the lens control component in the operation control data.
In one possible implementation, when controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, the lens control unit is specifically configured to:
controlling the movement of the virtual lens based on the position offset of the virtual lens in the pose control data of the virtual lens according to the movement speed of the virtual lens in the pose control data of the virtual lens;
And/or;
And controlling the rotation of the virtual lens based on the direction offset of the virtual lens in the pose control data of the virtual lens according to the rotation speed of the virtual lens in the pose control data of the virtual lens.
In another aspect, an embodiment of the application provides an electronic device, which comprises a memory, a processor, and a computer program stored in the memory and runnable on the processor; when executing the computer program, the processor implements the method for controlling a virtual lens in a game provided by the embodiment of the application.
In another aspect, an embodiment of the application further provides a computer-readable storage medium storing computer instructions which, when executed by a processor, implement the above method for controlling a virtual lens in a game.
The embodiment of the application has the following beneficial effects:
In the embodiment of the application, by performing a pose control operation on the game control device, the virtual lens can be controlled based on the motion-sensing control data corresponding to that operation. This effectively solves the problem that the relatively fixed rate at which the lens control component moves the virtual lens makes lens control inefficient, and thereby improves both the control efficiency of the virtual lens and the player's game experience. In addition, because the pose control operation takes effect only together with the first trigger operation performed on the lens control component, misoperation of the virtual lens is effectively prevented, which further improves the player's game experience.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic diagram of a game control system framework according to an embodiment of the present application;
FIG. 2 is a schematic flow chart giving an overview of the method for controlling a virtual lens in a game according to an embodiment of the application;
FIG. 3a is a schematic diagram of an interaction flow of the method for controlling a virtual lens in a game according to an embodiment of the present application;
FIG. 3b is a schematic view of the right joystick of the gamepad being pressed in an embodiment of the present application;
FIG. 3c is a schematic view of the gamepad being rotated while its right joystick is pressed in an embodiment of the present application;
FIG. 3d is a schematic view of the right joystick of the gamepad being released in an embodiment of the present application;
FIG. 4 is a schematic functional structure diagram of a virtual lens control device in a game according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the application.
Detailed Description
In order to make the objects, technical solutions and advantageous effects of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In order to facilitate a better understanding of the present application, technical terms related to the embodiments of the present application will be briefly described below.
The terminal device is the front-end device that runs the game software and displays the game scene picture through a graphical user interface. In the embodiment of the application, the terminal device can be, but is not limited to, a notebook computer, a desktop computer, a smart TV and the like.
The game control device is an input device for controlling virtual characters, the virtual lens and the like in a game. In the embodiment of the application, the game control device can be, but is not limited to, a gamepad, a mouse, a keyboard and the like. A plurality of control components are arranged on the game control device, and the functions of these control components can be customized according to the player's operation habits; for example, taking a gamepad as an example, the player can set the right joystick as the lens control component and the left joystick as the character control component.
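The customizable component-to-function mapping described above can be sketched as a simple binding table. The binding names and default assignments here are assumptions for illustration, not part of the application.

```python
# Assumed default bindings for a gamepad; keys and values are illustrative.
DEFAULT_BINDINGS = {
    "right_stick": "lens_control",       # right joystick drives the virtual lens
    "left_stick": "character_control",   # left joystick drives the virtual character
}

def rebind(bindings, component, function):
    """Return a new binding table with `component` reassigned to `function`,
    leaving the original table untouched."""
    updated = dict(bindings)
    updated[component] = function
    return updated
```

For instance, a player who prefers the sticks swapped could call `rebind` twice, once per stick.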
It should be noted that references to "first," "second," "third," etc. in this disclosure are used to distinguish similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that such terms are interchangeable under appropriate circumstances, so that the embodiments described herein can be implemented in orders other than those illustrated or described herein. Furthermore, "and/or" in the present application describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, that A and B exist together, or that B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
After technical terms related to the application are introduced, application scenes and design ideas of the embodiment of the application are briefly introduced.
Referring to FIG. 1, a game control device 110 such as a gamepad, a mouse, or a keyboard may serve as a game peripheral and may be connected to a terminal device 120 such as a smart TV, a desktop computer, or a notebook computer in a wired or wireless manner. After the communication connection between the game control device 110 and the terminal device 120 is established, a player can perform game operations through the game control device 110. For example, the player can control the position of a virtual character by manipulating a character control part on the game control device 110, and can also control the pose of the virtual lens by manipulating a lens control part on the game control device 110. However, in the actual control process, because the rate at which the lens control part moves the virtual lens is relatively fixed, when the virtual lens must be controlled rapidly in an emergency, such control is inefficient and the player's game experience is reduced.
Therefore, in the embodiment of the application, when performing game operations with the game control device, the player can perform a pose control operation on the game control device whenever the virtual lens needs to be adjusted, so that the virtual lens is controlled based on the motion-sensing control data corresponding to the pose control operation. This effectively solves the problem that the fixed rate of lens control through the lens control component makes virtual lens control inefficient, and improves both the control efficiency of the virtual lens and the player's game experience. In addition, because the pose control operation takes effect only together with a first trigger operation performed on the lens control component, misoperation of the virtual lens is effectively prevented, which further improves the player's game experience.
After the application scenario and the design idea of the embodiment of the present application are introduced, the technical solution provided by the embodiment of the present application is described in detail below.
The embodiment of the application provides a method for controlling a virtual lens in a game, which can be applied to a terminal device. The terminal device provides a graphical user interface and is connected to a game control device, and at least a game scene picture captured through the virtual lens is displayed in the graphical user interface. Referring to FIG. 2, the overall flow of the method is as follows:
Step 201, receiving a first lens control instruction sent by the game control device, wherein the first lens control instruction is sent by the game control device in response to a first trigger operation for a lens control component and a pose control operation for the game control device.
In practical use, the player may select one of the control units on the game control device as the lens control unit according to his or her operation habits; for example, on a gamepad, the player may set the right joystick as the lens control unit. In addition, the player may select one control section from the control sections on the game control device as the mode control section, which may be the same as or different from the lens control section; for example, the player may set the right joystick as both the lens control section and the mode control section. Further, the player can set the control mode of the virtual lens through the mode control part: the player performs a first trigger operation on the mode control part, for example a pressing operation; the game control device, in response to the first trigger operation on the mode control part, sends a first mode control instruction to the terminal device; and when the terminal device receives the first mode control instruction, it sets the current control mode of the virtual lens to the somatosensory control mode based on that instruction.
Further, after the current control mode of the virtual lens has been set to the somatosensory control mode, and the player determines during the game that the virtual lens needs to be adjusted, two embodiments are possible. In one embodiment, the player performs a pose control operation on the game control device after performing the first trigger operation on the lens control component; that is, the pose control operation is a continuation of the first trigger operation. For example, after pressing the lens control component, the player rotates the game control device; the game control device, in response to the successively performed first trigger operation for the lens control component and pose control operation for the game control device, sends a first lens control instruction to the terminal device, and the terminal device receives it. In another embodiment, the player may perform the pose control operation on the game control device while performing the first trigger operation on the lens control part; that is, the pose control operation is performed simultaneously with the first trigger operation. For example, the player may rotate the game control device while pressing the lens control part; the game control device, in response to the simultaneously performed first trigger operation and pose control operation, sends a first lens control instruction to the terminal device, and the terminal device receives it.
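The gating behavior described above, where pose control operations produce a lens control instruction only while the first trigger operation is active, can be sketched as follows. This covers both the successive and the simultaneous case, since either way the trigger must be held when the pose operation arrives. All names are hypothetical.

```python
class LensInstructionGate:
    """Forwards pose operations as first lens control instructions only while
    the first trigger operation (e.g. pressing the right joystick) is held,
    which is the misoperation guard the text describes."""
    def __init__(self):
        self.trigger_active = False

    def on_trigger(self, pressed: bool):
        self.trigger_active = pressed

    def on_pose_operation(self, motion_data):
        if self.trigger_active:
            return {"type": "first_lens_control", "motion_data": motion_data}
        return None  # ignored -> prevents accidental lens movement
```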
Step 202, determining pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction, and controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display in the graphical user interface a game scene picture captured after the pose adjustment of the virtual lens.
In a specific implementation, when the terminal device determines the pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction, there may be, but are not limited to, the following three situations:
in the first case, the pose control operation includes a movement control operation.
In this case, the terminal device may determine the position offset of the virtual lens based on the position offset of the game control device in the motion-sensing control data, and determine the moving speed of the virtual lens based on the moving speed of the game control device in the motion-sensing control data.
In the second case, the pose control operation includes a rotation control operation.
In this case, the terminal device may determine the directional offset of the virtual lens based on the rotational offset of the game control device in the motion-sensing control data, and determine the rotational speed of the virtual lens based on the rotational speed of the game control device in the motion-sensing control data.
In the third case, the pose control operation includes a movement control operation and a rotation control operation.
In this case, the terminal device may determine the position offset of the virtual lens based on the position offset of the game control device in the motion-sensing control data and the moving speed of the virtual lens based on the moving speed of the game control device in the motion-sensing control data; it may likewise determine the direction offset of the virtual lens based on the rotational offset of the game control device and the rotational speed of the virtual lens based on the rotational speed of the game control device in the motion-sensing control data.
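The three cases above reduce to two independent mappings that may be applied singly or together. A minimal sketch, assuming hypothetical field names for the motion-sensing control data:

```python
def derive_pose_control(motion: dict, moved: bool, rotated: bool) -> dict:
    """Map the device's motion-sensing control data onto pose control data
    for the virtual lens. `moved`/`rotated` indicate which pose control
    operations the player performed (cases 1, 2, or both for case 3)."""
    pose = {}
    if moved:  # movement control operation
        pose["position_offset"] = motion["device_position_offset"]
        pose["moving_speed"] = motion["device_moving_speed"]
    if rotated:  # rotation control operation
        pose["direction_offset"] = motion["device_rotation_offset"]
        pose["rotation_speed"] = motion["device_rotation_speed"]
    return pose
```

With both flags set, the result carries all four fields, matching the third case.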
Further, after the terminal device determines the pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction, it can control the virtual lens to adjust its pose based on the pose control data of the virtual lens. Correspondingly, when the terminal device controls the pose adjustment of the virtual lens based on the pose control data of the virtual lens, the following three situations may exist, but are not limited to:
In the first case, the pose control data of the virtual lens includes a position offset and a moving speed.
In this case, the terminal device may control the movement of the virtual lens based on the position offset of the virtual lens in the pose control data of the virtual lens according to the movement speed of the virtual lens in the pose control data of the virtual lens.
In the second case, the pose control data of the virtual lens includes a direction offset and a rotation speed.
In this case, the terminal device may control the rotation of the virtual lens based on the directional offset of the virtual lens in the pose control data of the virtual lens according to the rotation speed of the virtual lens in the pose control data of the virtual lens.
In the third case, the pose control data of the virtual lens includes a position offset and a moving speed, as well as a direction offset and a rotation speed.
In this case, the terminal device may control the movement of the virtual lens according to the movement speed of the virtual lens in the pose control data of the virtual lens, based on the position offset of the virtual lens in the pose control data of the virtual lens, and control the rotation of the virtual lens according to the rotation speed of the virtual lens in the pose control data of the virtual lens and based on the direction offset of the virtual lens in the pose control data of the virtual lens.
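Applying the pose control data then amounts to integrating the offsets at the derived speeds each frame. A per-frame sketch under the same hypothetical naming (offsets are treated here as direction vectors scaled by speed and elapsed time):

```python
def step_lens(position: tuple, direction: tuple, pose: dict, dt: float):
    """Advance the virtual lens one frame: translate along the position
    offset at the moving speed, and rotate along the direction offset at
    the rotation speed."""
    if "position_offset" in pose:
        ox, oy, oz = pose["position_offset"]
        s = pose["moving_speed"] * dt
        position = (position[0] + ox * s, position[1] + oy * s, position[2] + oz * s)
    if "direction_offset" in pose:
        dx, dy, dz = pose["direction_offset"]
        s = pose["rotation_speed"] * dt
        direction = (direction[0] + dx * s, direction[1] + dy * s, direction[2] + dz * s)
    return position, direction
```

A real engine would clamp the accumulated offset and interpolate the rotation (e.g. with quaternions); the sketch only shows how the two data pairs drive movement and rotation independently, covering all three cases.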
Further, after the terminal device controls the pose of the virtual lens based on the pose control data of the virtual lens, the game scene picture captured after the pose adjustment of the virtual lens can be displayed in the graphical user interface.
In the embodiment of the application, by performing the first trigger operation on the lens control component together with the pose control operation on the game control device, the player can control the virtual lens based on motion-sensing control data while effectively preventing misoperation of the virtual lens. Of course, the player can also control the virtual lens based on operation control data by performing a second trigger operation on the lens control component. Specifically, if the current control mode of the virtual lens is the somatosensory control mode, the player may first perform a second trigger operation on the mode control component, for example a long-press operation or a double-click operation; the game control device sends a second mode control instruction to the terminal device in response to this operation, and when the terminal device receives the second mode control instruction, it switches the current control mode of the virtual lens from the somatosensory control mode to the component control mode based on the second mode control instruction.
The player can then perform a second trigger operation on the lens control component, for example a long-press operation or a pushing operation at any angle. The game control device, in response to the second trigger operation on the lens control component, sends a second lens control instruction to the terminal device. When the terminal device receives the second lens control instruction, it determines the pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction, and controls the virtual lens to adjust its pose based on the pose control data, so as to display, in the graphical user interface, the game scene picture captured after the pose adjustment of the virtual lens.
In practical application, when the terminal device determines pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction, the following three situations may exist, but are not limited to:
In the first case, the operation control data includes a manipulation direction and a manipulation speed for the lens control section.
In this case, the terminal device may determine the directional offset of the virtual lens based on the manipulation direction of the lens control part in the operation control data, and adjust the rotation speed of the virtual lens by a set multiple based on the manipulation speed of the lens control part in the operation control data.
In the second case, the operation control data includes a manipulation time length and a manipulation speed for the lens control section.
In this case, the terminal device may determine the displacement offset of the virtual lens based on the manipulation time length of the lens control part in the operation control data, and adjust the moving speed of the virtual lens by a set multiple based on the manipulation speed of the lens control part in the operation control data.
In the third case, the operation control data includes a manipulation direction, a manipulation time length, and a manipulation speed for the lens control section.
In this case, the terminal device may determine the directional offset of the virtual lens based on the manipulation direction of the lens control part in the operation control data, determine the displacement offset of the virtual lens based on the manipulation duration of the lens control part in the operation control data, and adjust the control speed of the virtual lens by a set multiple based on the manipulation speed of the lens control part in the operation control data.
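In component control mode, the analogous mapping from the lens control component's operation control data could be sketched as follows. The proportional use of the manipulation speed and the `multiple` parameter are assumptions for illustration, since the text only states that the lens speed is adjusted by a set multiple:

```python
def derive_pose_from_component(op: dict, base_move_speed: float,
                               base_rotation_speed: float, multiple: float) -> dict:
    """Map operation control data of the lens control component onto pose
    control data: manipulation direction -> direction offset, manipulation
    duration -> displacement offset, manipulation speed -> scaled lens speed."""
    pose = {}
    if "direction" in op:
        pose["direction_offset"] = op["direction"]
        # faster stick manipulation scales the rotation speed by the set multiple
        pose["rotation_speed"] = base_rotation_speed * multiple * op["speed"]
    if "duration" in op:
        pose["position_offset_magnitude"] = op["duration"]  # longer push, farther move
        pose["moving_speed"] = base_move_speed * multiple * op["speed"]
    return pose
```

As with the somatosensory mapping, supplying direction, duration, and speed together yields the third case.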
Further, after the terminal device determines the pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction, it can control the virtual lens to adjust its pose based on the pose control data of the virtual lens. The specific adjustment manner is the same as described above and is not repeated here.
In practical application, the gamepad is a common game control device provided with a plurality of control components, such as a cross key, a left stick, a right stick, and ABXY function keys, so a player can control a game by manipulating each control component on the gamepad; for example, the player can control the virtual lens to rotate in any direction by pushing the right stick through its full 360° range. However, in the actual control process, since the rate at which the lens control component drives the virtual lens is relatively fixed, when the virtual lens needs to be controlled urgently and quickly in a critical situation, control via the lens control component is relatively inefficient, which degrades the player's game experience.
Next, taking a gamepad as the game control device as a specific application scenario, the method for controlling a virtual lens in a game provided by the embodiment of the present application is described in further detail. Referring to fig. 3a, the interaction flow of the method is as follows:
Step 301, the gamepad, in response to a first mode trigger operation on the right stick, sends a first mode control instruction to the terminal device. The first mode trigger operation may be an operation of pressing the right stick, as shown in fig. 3b.
Step 302, when the terminal device receives the first mode control instruction sent by the gamepad, it sets the current control mode of the virtual lens to the somatosensory control mode based on the first mode control instruction.
Step 303, the gamepad, in response to a simultaneously executed first lens control operation on the right stick and pose control operation on the gamepad, sends a first lens control instruction to the terminal device. The simultaneously performed operations may be, as shown in fig. 3c, rotating the gamepad downward and to the right while long-pressing the right stick.
Step 304, when the terminal device receives the first lens control instruction sent by the gamepad, it determines the pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction. The specific determination manner is the same as described above and is not repeated here.
Step 305, the terminal device controls the pose of the virtual lens based on the pose control data of the virtual lens, so as to display, in the graphical user interface, the game scene picture captured after the pose adjustment. The specific control manner is the same as described above and is not repeated here.
Step 306, the gamepad, in response to a second mode trigger operation on the right stick, sends a second mode control instruction to the terminal device. The second mode trigger operation may be an operation of releasing the right stick, as shown in fig. 3d.
Step 307, when the terminal device receives the second mode control instruction sent by the gamepad, it switches the current control mode of the virtual lens from the somatosensory control mode to the component control mode based on the second mode control instruction.
Step 308, the gamepad, in response to a second lens trigger operation on the right stick, sends a second lens control instruction to the terminal device. The second lens trigger operation may be a pushing operation on the right stick at any angle.
Step 309, when the terminal device receives the second lens control instruction sent by the gamepad, it determines the pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction. The specific determination manner is the same as described above and is not repeated here.
Step 310, the terminal device controls the pose of the virtual lens based on the pose control data of the virtual lens, so as to display, in the graphical user interface, the game scene picture captured after the pose adjustment.
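The interaction flow of steps 301 to 310 can be condensed into a small mode state machine on the terminal side; the class and event names below are hypothetical:

```python
class VirtualLensModeController:
    """Tracks the current control mode of the virtual lens as driven by
    the gamepad's right stick, following steps 301-310 above."""

    def __init__(self):
        self.mode = "component"  # default: lens driven by stick deflection

    def on_event(self, event: str) -> str:
        if event == "press_right_stick":      # first mode trigger operation
            self.mode = "somatosensory"       # steps 301-302
        elif event == "release_right_stick":  # second mode trigger operation
            self.mode = "component"           # steps 306-307
        return self.mode
```

While `mode` is `"somatosensory"`, lens control instructions carry motion-sensing control data (steps 303-305); after release, they carry the stick's operation control data (steps 308-310).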
Based on the above embodiments, the embodiment of the present application provides an in-game virtual lens control device, where the in-game virtual lens control device provides a graphical user interface and is connected to a game control device, and at least a game scene image captured by a virtual lens is displayed in the graphical user interface, and referring to fig. 4, an in-game virtual lens control device 400 provided in the embodiment of the present application at least includes:
a first receiving unit 401 for receiving a lens control instruction sent by the game control device, wherein the lens control instruction is sent by the game control device in response to a first trigger operation for the lens control part and a pose control operation for the game control device;
A first determining unit 402, configured to determine pose control data of the virtual lens based on the motion sensing control data corresponding to the pose control operation in the lens control instruction;
The lens control unit 403 is configured to control the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display, in the graphical user interface, the game scene picture captured after the pose adjustment of the virtual lens.
In a possible implementation manner, the in-game virtual lens control apparatus 400 provided in the embodiment of the present application further includes:
A second receiving unit 404 for receiving a first mode control instruction transmitted by the game control device, wherein the first mode control instruction is transmitted by the game control device in response to a first trigger operation for the mode control section;
the mode control unit 405 is configured to set a current control mode of the virtual lens to a somatosensory control mode based on the first mode control instruction.
In one possible implementation manner, when receiving the first lens control instruction sent by the game control device, the first receiving unit 401 is specifically configured to:
and receiving a simultaneous control instruction sent by the game control device as a first lens control instruction, wherein the simultaneous control instruction is sent by the game control device in response to a first trigger operation for a lens control part and a pose control operation for the game control device which are simultaneously executed.
In one possible implementation manner, when receiving the first lens control instruction sent by the game control device, the first receiving unit 401 is specifically configured to:
And receiving a continuous control instruction sent by the game control device as a first lens control instruction, wherein the continuous control instruction is sent by the game control device in response to a continuously executed first trigger operation for the lens control part and a pose control operation for the game control device, and the pose control operation is a continuous operation after the first trigger operation.
In one possible implementation manner, when determining the pose control data of the virtual lens based on the motion sensing control data corresponding to the pose control operation in the first lens control instruction, the first determining unit 402 is specifically configured to:
Determining a position offset of the virtual lens based on the position offset of the game control device in the motion sensing control data, and determining a moving speed of the virtual lens based on the moving speed of the game control device in the motion sensing control data;
And/or;
The direction offset of the virtual lens is determined based on the rotational offset of the game control device in the motion-feel control data, and the rotational speed of the virtual lens is determined based on the rotational speed of the game control device in the motion-feel control data.
In a possible implementation manner, the in-game virtual lens control apparatus 400 provided in the embodiment of the present application further includes:
A third receiving unit 406 for receiving a second mode control instruction transmitted from the game control device, wherein the second mode control instruction is transmitted from the game control device in response to a second trigger operation for the mode control section;
the mode control unit 405 is configured to switch the current control mode of the virtual lens from the somatosensory control mode to the component control mode based on the second mode control instruction.
In a possible implementation manner, the in-game virtual lens control apparatus 400 provided in the embodiment of the present application further includes:
a fourth receiving unit 407 for receiving a second lens control instruction transmitted by the game control apparatus, wherein the second lens control instruction is transmitted by the game control apparatus in response to a second trigger operation for the lens control section;
A second determining unit 408 for determining pose control data of the virtual lens based on operation control data of the lens control section in the second lens control instruction;
The lens control unit 403 is configured to control the virtual lens to adjust its pose based on the pose control data of the virtual lens, so as to display, in the graphical user interface, the game scene picture captured after the pose adjustment of the virtual lens.
In one possible implementation manner, when determining the pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction, the second determining unit 408 is specifically configured to:
Determining the direction offset of the virtual lens based on the control direction of the lens control part in the operation control data, and adjusting the rotation speed of the virtual lens according to the set multiple based on the control speed of the lens control part in the operation control data;
And/or;
and determining the displacement offset of the virtual lens based on the control time length of the lens control component in the operation control data, and adjusting the moving speed of the virtual lens according to the set multiple based on the control speed of the lens control component in the operation control data.
In one possible implementation, when controlling the pose of the virtual lens based on the pose control data of the virtual lens, the lens control unit 403 is specifically configured to:
controlling the movement of the virtual lens based on the position offset of the virtual lens in the pose control data of the virtual lens according to the movement speed of the virtual lens in the pose control data of the virtual lens;
And/or;
And controlling the rotation of the virtual lens based on the direction offset of the virtual lens in the pose control data of the virtual lens according to the rotation speed of the virtual lens in the pose control data of the virtual lens.
It should be noted that the principle by which the in-game virtual lens control device 400 provided by the embodiment of the present application solves the technical problem is similar to that of the in-game virtual lens control method provided by the embodiment of the present application; therefore, for the implementation of the in-game virtual lens control device 400, reference may be made to the implementation of the in-game virtual lens control method, and repeated description is omitted.
Having introduced the method and device for controlling a virtual lens in a game provided by the embodiments of the present application, the electronic device provided by the embodiments of the present application is now briefly introduced.
Referring to fig. 5, an electronic device 500 according to an embodiment of the present application at least includes a processor 501, a memory 502, and a computer program stored in the memory 502 and capable of running on the processor 501, where the processor 501 implements the method for controlling virtual shots in a game according to the embodiment of the present application when executing the computer program.
The electronic device 500 provided by embodiments of the present application may also include a bus 503 that connects the different components, including the processor 501 and the memory 502. Where bus 503 represents one or more of several types of bus structures, including a memory bus, a peripheral bus, a local bus, and so forth.
The memory 502 may include readable storage media in the form of volatile memory, such as random access memory (RAM) 5021 and/or cache memory 5022, and may further include read-only memory (ROM) 5023. The memory 502 may also include a program tool 5025 having a set (at least one) of program modules 5024; the program modules 5024 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each of which, or some combination of which, may include an implementation of a network environment.
The processor 501 may be one processing element or a collective term for a plurality of processing elements; for example, the processor 501 may be a central processing unit (CPU) or one or more integrated circuits configured to implement the in-game virtual lens control method provided by the embodiments of the present application. In particular, the processor 501 may be a general-purpose processor, including but not limited to a CPU, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The electronic device 500 may communicate with one or more external devices 504 (e.g., a keyboard, a remote control, etc.), with one or more devices that enable a user to interact with the electronic device 500 (e.g., a cell phone, a computer, etc.), and/or with any device that enables the electronic device 500 to communicate with one or more other electronic devices 500 (e.g., a router, a modem, etc.). Such communication may take place through an input/output (I/O) interface 505. Also, the electronic device 500 may communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet, via a network adapter 506. As shown in fig. 5, the network adapter 506 communicates with the other modules of the electronic device 500 over the bus 503. It should be appreciated that although not shown in fig. 5, other hardware and/or software modules may be used in connection with the electronic device 500, including but not limited to microcode, device drivers, redundant processors, external disk drive arrays, redundant arrays of independent disks (RAID) subsystems, tape drives, and data backup storage subsystems, among others.
It should be noted that the electronic device 500 shown in fig. 5 is only an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present application.
The following describes a computer-readable storage medium provided by an embodiment of the present application. The computer readable storage medium provided by the embodiment of the application stores computer instructions, and when the computer instructions are executed by the processor, the method for controlling the virtual lens in the game provided by the embodiment of the application is realized. Specifically, the computer instructions may be built-in or installed in the processor, so that the processor may implement the in-game virtual lens control method provided by the embodiment of the present application by executing the built-in or installed computer instructions.
In addition, the method for controlling the in-game virtual lens provided by the embodiment of the application can be further implemented as a computer program product, and the computer program product comprises program codes which realize the method for controlling the in-game virtual lens provided by the embodiment of the application when being run on a processor.
The computer program product provided by the embodiments of the application may employ one or more computer-readable storage media. A computer-readable storage medium may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of computer-readable storage media include an electrical connection having one or more wires, a portable disk, a hard disk, RAM, ROM, erasable programmable read-only memory (EPROM), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer program product provided by the embodiment of the application may adopt a CD-ROM containing the program code, and may run on electronic devices such as mobile phones, tablet computers, notebook computers, palm computers, virtual reality (VR) devices, and augmented reality (AR) devices. However, the computer program product provided by the embodiments of the present application is not limited thereto; the computer-readable storage medium may be any tangible medium that can contain or store the program code for use by or in connection with an instruction execution system, apparatus, or device.
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the elements described above may be embodied in one element in accordance with embodiments of the present application. Conversely, the features and functions of one unit described above may be further divided into a plurality of units to be embodied.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made to the embodiments of the present application without departing from the spirit or scope of the embodiments of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims and the equivalents thereof, the present application is also intended to include such modifications and variations.
Claims (12)
1. An in-game virtual lens control method, characterized in that a graphical user interface is provided through a terminal device and the terminal device is connected with a game control device, at least a game scene picture captured through a virtual lens is displayed in the graphical user interface, the in-game virtual lens control method comprises:
Receiving a first lens control instruction sent by the game control device, wherein the first lens control instruction is sent by the game control device in response to a first trigger operation for a lens control part and a pose control operation for the game control device;
the method comprises the steps of determining pose control data of a virtual lens based on motion control data corresponding to pose control operation in a first lens control instruction, controlling the virtual lens to adjust the pose based on the pose control data of the virtual lens so as to display a game scene picture captured after the virtual lens is adjusted in a graphical user interface, wherein the first trigger operation is used for preventing misoperation on the virtual lens while controlling the virtual lens based on the motion control data.
2. The in-game virtual lens control method according to claim 1, further comprising, before receiving the first lens control instruction sent by the game control device:
receiving a first mode control instruction sent by the game control device, wherein the first mode control instruction is sent by the game control device in response to a first trigger operation aiming at a mode control part;
And setting the current control mode of the virtual lens as a somatosensory control mode based on the first mode control instruction.
3. The in-game virtual lens control method of claim 1, wherein receiving the first lens control instruction transmitted by the game control device comprises:
And receiving a simultaneous control instruction sent by the game control device as the first lens control instruction, wherein the simultaneous control instruction is sent by the game control device in response to a first trigger operation for the lens control component and a pose control operation for the game control device, which are simultaneously executed.
4. The in-game virtual lens control method of claim 1, wherein receiving the first lens control instruction transmitted by the game control device comprises:
and receiving a continuous control instruction sent by the game control device as the first lens control instruction, wherein the continuous control instruction is sent by the game control device in response to a continuously executed first trigger operation for the lens control component and a pose control operation for the game control device, and the pose control operation is a continuous operation after the first trigger operation.
5. The in-game virtual lens control method according to claim 1, wherein determining the pose control data of the virtual lens based on the motion-sensing control data corresponding to the pose control operation in the first lens control instruction includes:
determining a position offset of the virtual lens based on a position offset of the game control device in the motion-sensing control data, and determining a moving speed of the virtual lens based on a moving speed of the game control device in the motion-sensing control data;
And/or;
The direction offset of the virtual lens is determined based on the rotation offset of the game control device in the motion-sensing control data, and the rotation speed of the virtual lens is determined based on the rotation speed of the game control device in the motion-sensing control data.
6. The in-game virtual lens control method according to claim 1, further comprising:
receiving a second mode control instruction sent by the game control device, wherein the second mode control instruction is sent by the game control device in response to a second trigger operation on a mode control component;
and switching the current control mode of the virtual lens from the somatosensory control mode to a component control mode based on the second mode control instruction.
7. The in-game virtual lens control method according to claim 6, further comprising:
receiving a second lens control instruction sent by the game control device, wherein the second lens control instruction is sent by the game control device in response to a second trigger operation on the lens control component;
and determining pose control data of the virtual lens based on operation control data of the lens control component in the second lens control instruction, and controlling the virtual lens to adjust its pose based on the pose control data, so as to display, in the graphical user interface, a game scene picture captured after the virtual lens adjusts its pose.
8. The in-game virtual lens control method according to claim 7, wherein determining the pose control data of the virtual lens based on the operation control data of the lens control component in the second lens control instruction comprises:
determining a direction offset of the virtual lens based on the control direction of the lens control component in the operation control data, and adjusting the rotation speed of the virtual lens by a set multiple based on the control speed of the lens control component in the operation control data;
and/or,
determining a displacement offset of the virtual lens based on the control duration of the lens control component in the operation control data, and adjusting the moving speed of the virtual lens by a set multiple based on the control speed of the lens control component in the operation control data.
9. The in-game virtual lens control method according to any one of claims 1 to 8, wherein controlling the virtual lens to adjust its pose based on the pose control data of the virtual lens comprises:
controlling the virtual lens to move according to the moving speed of the virtual lens in the pose control data, based on the position offset of the virtual lens in the pose control data;
and/or,
controlling the virtual lens to rotate according to the rotation speed of the virtual lens in the pose control data, based on the direction offset of the virtual lens in the pose control data.
10. An in-game virtual lens control apparatus, connected to a game control device, which provides a graphical user interface displaying at least a game scene picture captured by a virtual lens, the in-game virtual lens control apparatus comprising:
an instruction receiving unit, configured to receive a lens control instruction sent by the game control device, wherein the lens control instruction is sent by the game control device in response to a first trigger operation on a lens control component and a pose control operation on the game control device;
a lens control unit, configured to determine pose control data of the virtual lens based on motion-sensing control data corresponding to the pose control operation in the lens control instruction, and to control the virtual lens to adjust its pose based on the pose control data, so as to display, in the graphical user interface, a game scene picture captured after the virtual lens adjusts its pose, wherein the first trigger operation prevents misoperation of the virtual lens while the virtual lens is controlled based on the motion-sensing control data.
11. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the in-game virtual lens control method according to any one of claims 1-9 when executing the computer program.
12. A computer readable storage medium storing computer instructions which when executed by a processor implement the in-game virtual lens control method of any one of claims 1-9.
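The somatosensory mapping of claims 5 and 9 — deriving lens pose control data from the device's motion-sensing data, then moving and rotating the lens toward the offsets at the derived speeds — can be sketched as follows. All names here (`MotionData`, `LensPose`, `map_motion_to_lens`, `apply_pose`) and the 1:1 scale factor are illustrative assumptions, not part of the claimed method.

```python
from dataclasses import dataclass

@dataclass
class MotionData:
    position_offset: tuple   # (dx, dy, dz) measured from the game control device
    move_speed: float        # device moving speed
    rotation_offset: tuple   # (pitch, yaw, roll) measured from the device
    rotation_speed: float    # device rotation speed

@dataclass
class LensPose:
    position: list     # virtual lens position in the game scene
    orientation: list  # virtual lens orientation (Euler angles)

def map_motion_to_lens(motion: MotionData, scale: float = 1.0) -> MotionData:
    """Claim 5: derive the lens pose control data from the device motion data."""
    return MotionData(
        position_offset=tuple(scale * d for d in motion.position_offset),
        move_speed=scale * motion.move_speed,
        rotation_offset=tuple(scale * r for r in motion.rotation_offset),
        rotation_speed=scale * motion.rotation_speed,
    )

def apply_pose(lens: LensPose, control: MotionData, dt: float) -> None:
    """Claim 9: move/rotate the lens toward the offsets, capped per frame
    by the moving speed and rotation speed in the pose control data."""
    for i in range(3):
        step = control.move_speed * dt
        lens.position[i] += max(-step, min(step, control.position_offset[i]))
        rstep = control.rotation_speed * dt
        lens.orientation[i] += max(-rstep, min(rstep, control.rotation_offset[i]))
```

A per-frame update would call `map_motion_to_lens` on the latest motion-sensing data and then `apply_pose` with the frame time, so the lens never moves faster than the derived speeds allow.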
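The component-control mapping of claim 8 — direction offset from the component's control direction, displacement offset from the control duration, and both lens speeds scaled from the control speed by a set multiple — might look like this. The multiple's value, the base step, and the function names are illustrative assumptions.

```python
SPEED_MULTIPLE = 2.0  # the "set multiple" applied to the lens speeds (assumed value)

def component_to_rotation(control_direction, control_speed):
    """Direction offset follows the component's control direction;
    the lens rotation speed is the control speed scaled by the set multiple."""
    direction_offset = control_direction            # e.g. a joystick unit vector
    rotation_speed = control_speed * SPEED_MULTIPLE
    return direction_offset, rotation_speed

def component_to_displacement(control_duration, base_step, control_speed):
    """Displacement offset grows with how long the component is held;
    the lens moving speed is the control speed scaled by the set multiple."""
    displacement_offset = control_duration * base_step
    move_speed = control_speed * SPEED_MULTIPLE
    return displacement_offset, move_speed
```

Scaling by a fixed multiple keeps the component-control mode responsive at coarse granularity, while the somatosensory mode retains fine-grained 1:1 device tracking.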
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202211160187.2A CN115445186B (en) | 2022-09-22 | 2022-09-22 | Method, device, equipment and medium for controlling virtual lens in game |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN115445186A (en) | 2022-12-09 |
| CN115445186B (en) | 2025-09-16 |
Family
ID=84306802
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202211160187.2A Active CN115445186B (en) | 2022-09-22 | 2022-09-22 | Method, device, equipment and medium for controlling virtual lens in game |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN115445186B (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108854054A (en) * | 2018-06-15 | 2018-11-23 | 苏州运智互动科技有限公司 | Role and lens control method with touch plate type somatosensory handle |
| CN109718548A (en) * | 2018-12-19 | 2019-05-07 | 网易(杭州)网络有限公司 | The method and device of virtual lens control in a kind of game |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113440846B (en) * | 2021-07-15 | 2024-05-10 | 网易(杭州)网络有限公司 | Game display control method and device, storage medium and electronic equipment |
| CN114849233A (en) * | 2022-05-25 | 2022-08-05 | 网易(杭州)网络有限公司 | Game control method, device, equipment and storage medium |
- 2022-09-22: application CN202211160187.2A (CN) granted as patent CN115445186B, status active
Also Published As
| Publication number | Publication date |
|---|---|
| CN115445186A (en) | 2022-12-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7331124B2 (en) | Virtual object control method, device, terminal and storage medium | |
| CA3014348C (en) | Systems and methods for cloud processing and overlaying of content on streaming video frames of remotely processed applications | |
| US9707485B2 (en) | Systems and methods for cloud processing and overlaying of content on streaming video frames of remotely processed applications | |
| JP2022527502A (en) | Virtual object control methods and devices, mobile terminals and computer programs | |
| EP4134142A1 (en) | Tablet-based client device and computer-implemented method for managing an online application | |
| JP2023542148A (en) | Method, apparatus, electronic device, and storage medium for controlling movement of virtual objects in a game | |
| CN107106909A (en) | Game State Saving, Transfer and Restoration for Cloud Gaming | |
| JP7746420B2 (en) | Virtual object control method, device, terminal, and program | |
| JP2022533919A (en) | Virtual character control method, its computer equipment, computer program, and virtual character control device | |
| JP7806221B2 (en) | Screen display method, device, terminal, and computer program | |
| CN117205549A (en) | Screen rendering methods, devices, equipment, storage media and program products | |
| CN113694514A (en) | Object control method and device | |
| US12121799B2 (en) | Contextual adjustment of input device resistance | |
| JP2024026661A (en) | Virtual object control method, device, terminal and computer program | |
| CN116440491A (en) | Interaction method, device, equipment and storage medium of handle and terminal equipment | |
| WO2026001523A1 (en) | Virtual scene interaction method and apparatus, and computer device and storage medium | |
| US20150126285A1 (en) | Server and method for providing game | |
| CN115445186B (en) | Method, device, equipment and medium for controlling virtual lens in game | |
| CN111973984A (en) | Coordinate control method and device for virtual scene, electronic equipment and storage medium | |
| US20130296049A1 (en) | System and Method for Computer Control | |
| EP4688189A1 (en) | Systems and methods for generating adaptive control layouts for second screen devices | |
| WO2024221693A1 (en) | Method and apparatus for adjusting virtual lens, and storage medium and electronic apparatus | |
| CN117479985A (en) | Computer program, game system and control method for the computer program | |
| CN116139481B (en) | A method, apparatus, device, and medium for information processing in games. | |
| CN114053704B (en) | Information display method, device, terminal and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||