
CN112691372B - Virtual item display method, device, equipment and readable storage medium - Google Patents


Info

Publication number
CN112691372B
CN112691372B (application CN202011615531.3A)
Authority
CN
China
Prior art keywords
virtual
display
bearing surface
prop
live
Prior art date
Legal status
Active
Application number
CN202011615531.3A
Other languages
Chinese (zh)
Other versions
CN112691372A (en)
Inventor
潘佳绮
寇敬
文晓晴
毛克
余伟祥
邓颖
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011615531.3A priority Critical patent/CN112691372B/en
Publication of CN112691372A publication Critical patent/CN112691372A/en
Application granted granted Critical
Publication of CN112691372B publication Critical patent/CN112691372B/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a method, an apparatus, a device, and a readable storage medium for displaying a virtual item, and relates to the field of virtual environments. The method includes: receiving an item display operation; displaying a live-action picture based on the item display operation, where the live-action picture is a picture acquired by an image acquisition module of the terminal; in response to the live-action picture including a display area that meets a display condition, displaying at least one virtual item based on a live-action object in the display area; and in response to a selection operation on a target virtual item among the at least one virtual item, displaying item introduction information that introduces the target virtual item. When the live-action picture contains a display area that meets the display condition, the target virtual item is displayed in that area, and details of the target virtual item can be shown in response to the selection operation, which improves the display efficiency of the target virtual item.

Description

Virtual item display method, device, equipment and readable storage medium
Technical Field
The present application relates to the field of virtual environments, and in particular, to a method, an apparatus, a device, and a readable storage medium for displaying a virtual item.
Background
On terminals such as smartphones and tablets, there are many applications based on virtual environments, such as virtual reality applications, three-dimensional map programs, third-person shooter (TPS) games, first-person shooter (FPS) games, multiplayer online battle arena (MOBA) games, sandbox games, and the like. In these applications, the virtual character controlled by the user can hold different kinds of virtual props.
In the related art, a user can control the virtual character to acquire a virtual prop by purchasing it. Corresponding to the purchasing process in the game, a trading interface is displayed in the application, where the trading interface includes a virtual prop list; the user can preview the appearance and attributes of a virtual prop by selecting it from the list and then purchase it to acquire the prop.
However, when virtual props are displayed in a virtual prop list, the features of the props often cannot be displayed completely, so the related-art method has low display efficiency for virtual prop information.
Disclosure of Invention
The application provides a method, an apparatus, a device, and a readable storage medium for displaying a virtual item, which can improve the efficiency of human-computer interaction. The technical solution is as follows:
in one aspect, a method for displaying a virtual item is provided, where the method includes:
receiving a prop display operation, where the prop display operation is used to instruct display of a virtual prop in a virtual environment as an augmented reality scene;
displaying a live-action picture based on the prop display operation, where the live-action picture is a picture acquired by an image acquisition module of the terminal;
in response to the live-action picture including a display area that meets a display condition, displaying at least one virtual prop based on a live-action object in the display area;
and in response to a selection operation on a target virtual prop among the at least one virtual prop, displaying prop introduction information, where the prop introduction information is used to introduce the target virtual prop.
In another aspect, a display device of a virtual item is provided, the device including:
a receiving module, configured to receive a prop display operation, where the prop display operation is used to instruct display of a virtual prop in a virtual environment as an augmented reality scene;
a display module, configured to display a live-action picture based on the prop display operation, where the live-action picture is a picture acquired by the image acquisition module of the terminal;
the display module being further configured to, in response to the live-action picture including a display area that meets a display condition, display at least one virtual prop based on a live-action object in the display area;
and the display module being further configured to, in response to a selection operation on a target virtual prop among the at least one virtual prop, display prop introduction information, where the prop introduction information is used to introduce the target virtual prop.
In another aspect, a computer device is provided, which includes a processor and a memory, where at least one instruction, at least one program, a set of codes, or a set of instructions is stored in the memory, and the at least one instruction, the at least one program, the set of codes, or the set of instructions is loaded and executed by the processor to implement the display method of a virtual prop as provided in the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement any one of the above-mentioned display methods of the virtual prop.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to make the computer device execute the display method of the virtual item in any of the above embodiments.
The technical solutions provided in this application bring at least the following beneficial effects:
With AR technology applied, after a live-action picture is displayed in response to the prop display operation, virtual props are displayed in the live-action picture, and introduction information corresponding to a virtual prop is displayed in response to a selection operation on that prop. Because the virtual props are displayed in the live-action picture based on AR technology, and the information of the selected prop is displayed in the same picture according to the selection operation, the display efficiency of virtual prop information is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic diagram illustrating a scene of acquiring a virtual item in the related art;
FIG. 2 is a schematic diagram showing a transaction interface in the related art;
FIG. 3 is a diagram illustrating a real scene provided by an exemplary embodiment of the present application;
FIG. 4 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 5 shows a flowchart of a method for displaying a virtual prop provided by an exemplary embodiment of the present application;
FIG. 6 illustrates a diagram of a virtual environment screen provided by an exemplary embodiment of the present application;
FIG. 7 is a diagram illustrating a live action scene provided by an exemplary embodiment of the present application;
FIG. 8 is a schematic diagram illustrating another real scene provided by an exemplary embodiment of the present application;
fig. 9 is a flowchart illustrating a method for displaying virtual items according to an exemplary embodiment of the present application;
FIG. 10 is a schematic diagram illustrating another live action scene provided by an exemplary embodiment of the present application;
FIG. 11 is a schematic diagram illustrating another live action scene provided by an exemplary embodiment of the present application;
FIG. 12 is a diagram illustrating a live action scene including two bearing surfaces provided by an exemplary embodiment of the present application;
FIG. 13 shows a flowchart of a method for displaying a virtual prop provided by an exemplary embodiment of the present application;
FIG. 14 is a diagram illustrating a scene picture when the prop levels of the virtual props are uniform, provided by an exemplary embodiment of the present application;
FIG. 15 is a diagram illustrating a scene picture when the prop levels of the virtual props are not uniform, provided by an exemplary embodiment of the present application;
FIG. 16 illustrates a schematic view of a prop close-up picture provided by an exemplary embodiment of the present application;
FIG. 17 is a schematic diagram illustrating the change of the prop close-up picture when an enlargement operation is received, according to an exemplary embodiment of the present application;
FIG. 18 is a schematic diagram illustrating the change of the prop close-up picture when a rotation operation is received, according to an exemplary embodiment of the present application;
FIG. 19 is a process diagram illustrating a method for displaying a virtual item provided in an exemplary embodiment of the present application;
FIG. 20 is a block diagram illustrating a display device of a virtual item provided in an exemplary embodiment of the present application;
FIG. 21 is a block diagram illustrating a display device of another virtual item provided in an exemplary embodiment of the present application;
fig. 22 shows a block diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, the following detailed description of the embodiments of the present application will be made with reference to the accompanying drawings.
First, the terms referred to in the embodiments of the present application will be briefly described:
augmented Reality (AR) technology is a technology for calculating the position and angle of a camera influence in real time and adding a corresponding image, and aims to add and display contents in a virtual environment on a screen on which real world contents are displayed so as to achieve the effect of displaying virtual elements in the real world.
The virtual environment is the environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application. The following embodiments are illustrated with a three-dimensional virtual environment. In some embodiments of the present application, the user's master virtual object moves within the virtual environment. The virtual environment can also provide functions that let the master virtual object simulate real-life activities. In one example, the virtual environment implements a virtual restaurant in which the master virtual object can dine; in another example, the virtual environment implements a virtual battlefield in which the master virtual object can fight; in another example, the virtual environment implements a virtual store in which the master virtual object can shop. The embodiments of the present application do not limit the functions provided by the virtual environment.
A game based on the virtual environment consists of one or more maps of the game world. The virtual environment in the game simulates real-world scenes, and the user can control a master virtual object to walk, run, jump, shoot, fight, drive, release skills, be attacked by other virtual objects, be injured by the virtual environment, attack other virtual objects, and perform other actions in the virtual environment. The interactivity is strong, and multiple users can form teams online to play competitive games.
A virtual object is a movable object in the virtual environment. The movable object may be a virtual character, a virtual animal, an anime character, and the like, such as a character, an animal, a plant, an oil drum, a wall, or a stone displayed in a three-dimensional virtual environment. Optionally, the virtual object is a three-dimensional volumetric model created based on skeletal animation techniques. Each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in that environment. In the embodiments of the present application, the description takes a virtual object as an example, and the master virtual object generally refers to one or more master virtual objects in the virtual environment.
In this application, a virtual object is located within a virtual environment and, as described above, may perform the actions corresponding to the functions of that environment.
A sandbox game is a type of virtual-environment application. A sandbox game has a game map that includes at least two map areas. In a sandbox game, the user's master virtual object can walk, run, jump, shoot, fight, drive, release skills, and perform other actions in the virtual environment, and through these actions the user can influence, change, and even create the virtual world in the game. In a sandbox game, the user's master virtual object typically takes survival as its first goal, and building and exploration as its second goal, in order to survive in the game's virtual world.
A virtual prop is a virtual object that can be equipped on a virtual object. In the above scenario, the virtual prop can interact with the virtual object. Optionally, the virtual object has attributes, and the virtual prop can affect those attributes through interaction with the virtual object. In one example, the attribute is the virtual object's life value, which has an initial upper limit of 500; the virtual object equips a virtual prop, the prop grants a gain effect, and the upper limit of the life value increases from 500 to 1000. In another example, the attribute is the virtual object's current life value, which is 100; the virtual object selects a target virtual prop from the props it holds and consumes it, the consumption grants a gain effect, and the life value is restored from 100 to 500. In another example, the virtual object obtains a new appearance after equipping the virtual prop. The embodiments of the present application do not limit the specific effects a virtual prop produces on a virtual object or the manner of interaction between them. In a sandbox game, a virtual object can obtain virtual props from the virtual environment through different approaches. In one example, the virtual environment is a virtual sea, and the virtual object can obtain a prop by salvaging; in another example, the virtual environment is a virtual kitchen, and the virtual object can obtain a prop by cooking; in another example, the virtual environment is a virtual store, and the virtual object can obtain a prop through a transaction. The way a virtual object obtains a virtual prop is not limited in this application.
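The two life-value examples above can be sketched in a few lines of code. This is a minimal illustration only: the class names, fields, and helper functions are assumptions, with the numbers (a cap raised from 500 to 1000 by equipping, a life value restored from 100 to 500 by consuming) taken from this paragraph.

```python
# Minimal sketch of the attribute interactions described above; all
# names here are illustrative assumptions, not the patent's design.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    hp: int = 500
    hp_cap: int = 500              # initial upper limit of the life value

@dataclass
class VirtualProp:
    name: str
    hp_cap_bonus: int = 0          # gain effect when equipped
    hp_restore: int = 0            # gain effect when consumed

def equip(obj: VirtualObject, prop: VirtualProp) -> None:
    """Equipping a prop can raise the life-value upper limit."""
    obj.hp_cap += prop.hp_cap_bonus

def consume(obj: VirtualObject, prop: VirtualProp) -> None:
    """Consuming a prop restores the current life value, capped."""
    obj.hp = min(obj.hp + prop.hp_restore, obj.hp_cap)

hero = VirtualObject()
equip(hero, VirtualProp("amulet", hp_cap_bonus=500))   # cap: 500 -> 1000
hero.hp = 100
consume(hero, VirtualProp("potion", hp_restore=400))   # hp: 100 -> 500
print(hero)    # VirtualObject(hp=500, hp_cap=1000)
```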
In the scenario corresponding to a virtual store, a virtual object acquires virtual props through transactions. Fig. 1 is a schematic diagram of a scene of acquiring a virtual prop in the related art. Referring to fig. 1, a virtual environment screen 100 contains a first virtual object 101 and a second virtual object 102, where the first virtual object 101 is the virtual object that needs to obtain a virtual prop and the second virtual object 102 is the virtual object that holds the prop. The virtual environment screen 100 further includes a transaction identifier 110; by operating on the transaction identifier, the first virtual object 101 and the second virtual object 102 can carry out a transaction in which the first virtual object 101 obtains the virtual prop and, correspondingly, the second virtual object 102 transfers the prop and obtains the corresponding currency. In the related art, in response to an operation on the transaction identifier, a transaction screen is displayed in the virtual environment screen 100.
Fig. 2 is a schematic diagram illustrating a transaction interface in the related art. The trading interface 200 includes a virtual item search area 210, a virtual item classification area 220, and a virtual item display area 230. The virtual item search area 210 provides an item search function for the user, the virtual item classification area 220 displays the classifications of the virtual items, and the virtual item display area 230 displays at least one virtual item 231 together with a virtual item identifier 232 corresponding to each virtual item 231. Optionally, the display area also includes the transaction volume 233 of a virtual item, for example, the number of that item sold.
In the embodiments of the present application, as shown in fig. 3, with AR technology applied, the virtual item 301 is displayed in the live-action picture 300, and the user can directly observe the form of the virtual item 301 in the live-action picture 300.
Fig. 4 shows a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 400 includes: a first terminal 420, a server 440, and a second terminal 460.
The first terminal 420 has an application supporting a virtual environment installed and running. The application may be any one of a virtual reality application, a three-dimensional map program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The first terminal 420 is the terminal used by a first user, who uses it to control a first master virtual object located in the virtual environment to perform activities including, but not limited to, at least one of: adjusting body posture, walking, running, jumping, releasing skills, picking up, attacking, and evading attacks by other virtual objects. Illustratively, the first master virtual object is a first virtual character, such as an animated character or an anime character. Illustratively, when the first master virtual object releases a regional skill in the virtual environment, the virtual environment screen moves from the position of the master virtual object to the target region selected with the regional skill indicator; the regional skill indicator is used to select the release region when the virtual object releases the skill.
The first terminal 420 is connected to the server 440 through a wireless network or a wired network.
The server 440 includes at least one of a single server, multiple servers, a cloud computing platform, and a virtualization center. Illustratively, the server 440 includes a processor 444 and a memory 442. The memory 442 includes a receiving module 4421, a control module 4422, and a sending module 4424: the receiving module 4421 receives requests sent by clients, such as a team-formation request; the control module 4422 controls the rendering of the virtual environment picture; the sending module 4424 sends message notifications to clients, such as a team-formation success notification. The server 440 provides background services for applications supporting a three-dimensional virtual environment. Optionally, the server 440 undertakes the primary computing work while the first terminal 420 and the second terminal 460 undertake the secondary computing work; or the server 440 undertakes the secondary computing work while the terminals undertake the primary computing work; or the server 440, the first terminal 420, and the second terminal 460 compute cooperatively using a distributed computing architecture.
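As a rough illustration of this module split, the sketch below mirrors the receiving, control, and sending roles described above; the method names and message shapes are assumptions for demonstration, not the patent's actual server API.

```python
# Illustrative sketch of the receiving module 4421, control module 4422,
# and sending module 4424; messages are simplified dictionaries.
class ReceivingModule:
    def receive(self, request: dict) -> dict:
        # e.g. a team-formation request sent by a client
        return request

class ControlModule:
    def render(self, state: dict) -> dict:
        # controls rendering of the virtual environment picture
        return {"frame": state}

class SendingModule:
    def notify(self, client_id: str, message: str) -> None:
        # e.g. a team-formation success notification
        print(f"to {client_id}: {message}")

recv, ctrl, send = ReceivingModule(), ControlModule(), SendingModule()
req = recv.receive({"type": "form_team", "client": "u1"})
ctrl.render({"team": [req["client"]]})
send.notify("u1", "team formed")
```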
The second terminal 460 is connected to the server 440 through a wireless network or a wired network.
The second terminal 460 has an application supporting a virtual environment installed and running. The application may be any one of a virtual reality application, a three-dimensional map program, an FPS game, a MOBA game, and a multiplayer gunfight survival game. The second terminal 460 is the terminal used by a second user, who uses it to control a second master virtual object located in the virtual environment to perform activities including, but not limited to, at least one of: adjusting body posture, walking, running, jumping, releasing skills, picking up, attacking, and evading attacks by other master virtual objects. Illustratively, the second master virtual object is a second virtual character, such as an animated character or a cartoon character.
Optionally, the first virtual character and the second virtual character are in the same virtual environment. Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights.
Optionally, the applications installed on the first terminal 420 and the second terminal 460 are the same, or are the same type of application on different operating-system platforms. The first terminal 420 may generally refer to one of multiple terminals, and the second terminal 460 may generally refer to another; this embodiment is illustrated with only the first terminal 420 and the second terminal 460. The device types of the first terminal 420 and the second terminal 460 are the same or different and include at least one of a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer. The following embodiments are illustrated with the terminal being a smartphone.
Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
With reference to the above introduction of terms and description of the implementation environment, the method for displaying virtual props provided in the embodiments of the present application is described below. Fig. 5 shows a flowchart of a method for displaying a virtual prop provided by an exemplary embodiment of the present application. The method is described using the example of being applied to the first terminal 420, the second terminal 460, or another terminal in the computer system 400 shown in fig. 4, and includes:
step 501, receiving a property display operation, where the property display operation is used to instruct to display a virtual property in a virtual environment in an AR scene.
In the embodiments of the present application, the virtual environment may be an environment displayed by a game application, as described above. Corresponding to the functions of the virtual environment in the game application, the prop display operation is implemented as an operation on one of different types of controls overlaid on the virtual environment picture. In one example, if the virtual environment corresponds to an exhibition scene, the prop display operation is an operation on an exhibit display control overlaid on the virtual environment; in another example, if the virtual environment corresponds to a restaurant, the prop display operation is an operation on an item display control overlaid on the virtual environment. In the embodiments of the present application, the virtual environment is an environment corresponding to a transaction scenario, and the prop display operation is an operation on a commodity display control overlaid on the virtual environment. Referring to fig. 6, a virtual environment screen 600 includes a first virtual object 601 and a second virtual object 602. The first virtual object 601 initiates the transaction, and the second virtual object 602 accepts it. A prop display control 620 is also overlaid on the virtual environment picture, and the prop display control 620 is used to instruct display of the virtual props in the virtual environment as an AR scene.
In the embodiments of the present application, the AR scene is generated in real time based on the image acquisition module of the terminal; in other embodiments, the AR scene is generated from live-action video prestored in the terminal. The actual generation mode of the AR scene is not limited in this application.
Step 502, displaying a live-action picture based on the prop display operation, where the live-action picture is a picture acquired in real time by an image acquisition module of the terminal.
The prop display operation is performed with respect to a virtual prop or the group of virtual props it belongs to. In one example, if the virtual prop is a commodity in a store, the prop display operation is a display operation for that commodity, or a display operation for all commodities in the store.
In the embodiment of the present application, the terminal needs to have a function of acquiring images in real time in addition to a display function of displaying an interface of a game application program. In one example, the terminal is implemented with a camera module. The camera module receives information in a real scene through the camera and generates a live-action picture.
In another embodiment of the present application, the terminal is connected to the image capturing device, and may receive data in real time through the image capturing device.
Step 503, in response to the live-action picture including a display area that meets the display condition, displaying at least one virtual prop based on a live-action object in the display area.
In the embodiments of the present application, the scene in the live-action picture has features matching the function of the virtual environment. In one example, the virtual environment is a marine environment, and the live-action picture contains a pool of water; in another example, the virtual environment is a restaurant environment, and the live-action picture contains tableware; in another example, the virtual environment is a shop environment, and the live-action picture contains a table on which commodities are placed.
The way the display area is defined in this embodiment is described below, taking the shop environment as the virtual environment:
In an actual shop, a merchant places goods on a tabletop for customers to choose from. Accordingly, in the live-action picture, the virtual prop is implemented as a virtual commodity; the virtual commodity needs to be placed in the live-action environment in a way that simulates a commodity placed on a tabletop, so the live-action picture must include a plane similar to a tabletop on which the virtual prop can be placed.
Referring to fig. 7, the live-action picture 700 includes a table 710 captured by the image acquisition module of the terminal, and the table 710 has a tabletop 711. Because the tabletop 711 is a display area that meets the display condition, virtual props 712 are displayed on the tabletop 711. In the embodiment of the present application, the number of virtual props 712 is 3, and the 3 virtual props 712 are displayed on the tabletop 711 in a tiled manner.
Optionally, after the virtual item is displayed based on a real-scene object, the positions of the virtual item and the real-scene object in the real-scene picture are relatively fixed, and when the virtual item and the real-scene object are observed at different viewing angles, the relative positions of the virtual item and the real-scene object are not changed.
Step 504, in response to a selection operation on a target virtual prop among the at least one virtual prop, displaying prop introduction information, where the prop introduction information is used to introduce the target virtual prop.
In the embodiments of the present application, corresponding to the prop transaction scenario, the selection operation determines the target virtual prop that the virtual object intends to purchase.
Optionally, the number of virtual props selected each time is 1, or at least two; the number of target virtual props selected by each selection operation is not limited in the embodiments of the present application.
In the embodiments of the present application, the selection operation on the target virtual prop is a pressing operation on the target virtual prop identifier corresponding to the target virtual prop, or an observation operation of aiming the viewing angle at the target virtual prop. The embodiments of the present application do not limit how the target virtual prop is selected.
In the embodiments of the present application, after the selection operation on the target virtual prop is received, prop introduction information is displayed. The prop introduction information corresponds to the selection operation and is displayed in the live-action picture together with the target virtual prop. Referring to fig. 8, in the live-action picture 800, the target virtual prop 811 is located within the display area. In response to the selection operation on the target virtual prop 811, prop introduction information 812 for the target virtual prop 811 is also displayed in the live-action picture. The prop introduction information 812 includes the prop identifier of the target virtual prop 811, the function of the target virtual prop 811, and the acquisition conditions of the target virtual prop 811.
Optionally, after the target virtual item is purchased, the target virtual item is not displayed in the live-action picture, so as to prompt the user that the target virtual item is sold.
To sum up, in the method provided in this embodiment of the present application, with AR technology applied, after the live-action picture is displayed in response to the prop display operation, the virtual props are displayed in the live-action picture, and the introduction information corresponding to a virtual prop is displayed in response to a selection operation on it. Because the virtual props are displayed in the live-action picture based on AR technology, and the information of the selected prop is displayed in the same picture according to the selection operation, the display efficiency of virtual prop information is improved.
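The four steps above amount to a short pipeline, sketched below; every function name and data shape is an illustrative assumption rather than the patent's implementation.

```python
# Minimal sketch of the step 501-504 flow; names are assumptions.
from typing import Optional

def find_display_area(frame: list) -> Optional[dict]:
    # Stand-in for the plane-area identification performed on the
    # live-action picture (detailed in the embodiments that follow).
    return {"surface": "tabletop"} if frame else None

def display_flow(frame: list, props: list, selected: int) -> Optional[str]:
    area = find_display_area(frame)        # step 503: display-condition check
    if area is None:
        return None                        # no display area meets the condition
    target = props[selected]               # step 504: selection operation
    return f"introduction of {target}"     # prop introduction information

print(display_flow(frame=[0], props=["AMR48", "shield"], selected=0))
```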
In some embodiments of the present application, before the target virtual prop is displayed, a bearing surface for bearing the virtual prop needs to be determined in order to ensure that the virtual prop is displayed completely. Fig. 9 shows a flowchart of a method for displaying a virtual prop provided by an exemplary embodiment of the present application, described using the example of the method being applied to an electronic device. The method includes:
Step 901, performing plane area identification on an object in the live-action picture, where the plane area identification is used to identify a bearing surface in the live-action picture.
In the embodiment of the present application, the virtual environment is a virtual environment with a commodity selling function. In order to embody the commodity buying and selling function in the live-action picture, a plane for placing commodities is required in the live-action picture. Therefore, in the embodiment of the present application, after switching to the live-action picture, the terminal performs plane area recognition on the live-action picture.
In the embodiment of the application, since the object in the real scene picture needs to have visual flatness and needs to meet the requirements of inclination and size, the identification of the plane area includes plane identification and bearing surface identification.
In the embodiments of the present application, in the process of performing plane recognition on the live-action picture, referring to fig. 10, the live-action picture 1000 shown in fig. 10 includes a table 1001, and the table 1001 has a first reference surface 1011, a second reference surface 1012, a third reference surface 1013, and a fourth reference surface 1014. During plane identification, the first reference surface 1011, the second reference surface 1012, the third reference surface 1013, and the fourth reference surface 1014 are all identified as planes.
It should be noted that, in the embodiments of the present application, a plane is a surface whose unevenness is below an unevenness threshold, where the unevenness refers to the unevenness of the surface as displayed in the live-action picture, not the unevenness of the corresponding surface in the real scene. In one example, the real table corresponding to the table in the live-action picture has a frosted surface, which is noticeably uneven in actual use; however, in the image acquired for the live-action picture, that surface is still identified as a plane.
In the embodiments of the present application, after plane identification is performed on the live-action picture and planes are found in it, whether each plane can serve as a bearing surface is further identified. Optionally, this identification includes identifying the inclination of the plane within the live-action picture and identifying the bearing surface size. In one example, referring to fig. 11, the live-action picture 1100 includes a table 1101 having a first reference surface 1111, a second reference surface 1112, a third reference surface 1113, and a fourth reference surface 1114. The first reference surface 1111 and the fourth reference surface 1114 are parallel to the bottom side of the live-action picture 1100, with an inclination of 0 degrees; the second reference surface 1112 is perpendicular to the bottom side, with an inclination of 90 degrees; and the third reference surface 1113 is at 45 degrees to the bottom side, with an inclination of 45 degrees. In the embodiment of the present application, if the inclination threshold is set to 30 degrees, a plane whose inclination is below the threshold can serve as a bearing surface and a plane whose inclination is above it cannot. In this example, the first reference surface 1111 and the fourth reference surface 1114 can serve as bearing surfaces, while the second reference surface 1112 and the third reference surface 1113 cannot.
Next, the bearing surface sizes of the reference surfaces 1111 and 1114 are detected. Optionally, the size detection selects as the bearing surface a plane whose size is greater than a size threshold preset in the terminal. The size of the reference surface 1111 is greater than the threshold and the size of the reference surface 1114 is smaller, so the reference surface 1111 can serve as the bearing surface and the reference surface 1114 cannot. That is, in the embodiment of the present application, the bearing surface is the reference surface 1111.
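The two checks described in this and the preceding paragraph (inclination below a 30-degree threshold, size above a size threshold) can be sketched as follows; the `Plane` fields, the size unit, and the threshold values are assumptions chosen to reproduce the FIG. 11 example.

```python
# Sketch of the bearing-surface checks; thresholds are assumed values.
from dataclasses import dataclass

TILT_THRESHOLD_DEG = 30.0      # inclination threshold from the example
SIZE_THRESHOLD = 0.25          # assumed size threshold, arbitrary unit

@dataclass
class Plane:
    tilt_deg: float            # angle to the bottom side of the picture
    size: float                # area of the candidate surface

def is_bearing_surface(plane: Plane) -> bool:
    return plane.tilt_deg < TILT_THRESHOLD_DEG and plane.size > SIZE_THRESHOLD

# The four reference surfaces from FIG. 11: only 1111 passes both checks.
candidates = {
    "1111": Plane(tilt_deg=0.0,  size=0.60),
    "1112": Plane(tilt_deg=90.0, size=0.40),
    "1113": Plane(tilt_deg=45.0, size=0.40),
    "1114": Plane(tilt_deg=0.0,  size=0.10),
}
print([name for name, p in candidates.items() if is_bearing_surface(p)])
# ['1111']
```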
Step 902, in response to the existence of the area meeting the bearing surface requirement in the live-action picture, determining that an object with a plane exists in the live-action picture.
In the embodiment of the application, after the area meeting the bearing surface requirement in the real scene picture is determined, the object for bearing the virtual prop in the real scene picture is determined.
Step 903, displaying at least one virtual prop on the plane of the object in a tiled manner.
After the terminal completes bearing surface identification, it determines the object that has the plane and displays the virtual props accordingly. The virtual props are automatically displayed on the bearing surface, and the bearing surface bearing the virtual props is the first bearing surface.
In one example, the virtual props are displayed on one side of the first bearing surface. Optionally, the virtual props displayed on the first bearing surface include a first virtual prop.
Step 904, in response to the second display ratio of the second bearing surface in the live-action picture being greater than the first display ratio of the first bearing surface in the live-action picture, displaying a bearing surface switching prompt.
In the embodiments of the present application, a live-action picture includes at least two bearing surfaces. Referring to fig. 12, the live-action picture 1200 includes a first bearing surface 1201 and a second bearing surface 1202, where the first bearing surface 1201 currently bears the target virtual prop and the second bearing surface 1202 is a candidate bearing surface. As shown in fig. 12, as the user moves the handheld terminal closer to the second bearing surface 1202, the first bearing surface 1201 and the target virtual prop 1210 borne on it are displayed incompletely in the live-action picture 1200. The terminal therefore displays a bearing surface switching prompt 1220, which is used to ask the user whether to display the target virtual prop 1210 on the second bearing surface 1202.
Step 905, in response to receiving the switching approval operation for the bearing surface switching prompt, clearing the target virtual prop on the first bearing surface, and displaying the target virtual prop on the second bearing surface.
In the embodiment of the present application, the bearing surface switching prompt 1220 carries a countdown; when the countdown completes, the switching approval operation is deemed received, the target virtual prop 1210 is cleared from the first bearing surface 1201, and the target virtual prop 1210 is displayed on the second bearing surface 1202.
Alternatively, the switching approval operation may be implemented as a pressing operation or a sliding operation performed by the user on the bearing surface switching prompt. The embodiments of the present application do not limit the specific form of the switching approval operation.
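A minimal sketch of this switching logic follows, assuming the display ratio of a bearing surface is already computed as the fraction of the live-action picture it occupies; all names and values are illustrative.

```python
# Sketch of the step 904-905 switch: prompt when the candidate surface's
# display ratio exceeds the current one, then move the props on approval
# (a press, a slide, or the countdown elapsing).
def should_prompt_switch(current_ratio: float, candidate_ratio: float) -> bool:
    return candidate_ratio > current_ratio

def on_switch_approved(props: list, surfaces: dict) -> dict:
    surfaces["first"] = []         # clear the target props on the first surface
    surfaces["second"] = props     # display them on the second surface
    return surfaces

if should_prompt_switch(current_ratio=0.15, candidate_ratio=0.40):
    print(on_switch_approved(["AMR48"], {"first": ["AMR48"], "second": []}))
```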
To sum up, in the method provided in this embodiment of the present application, with AR technology applied, after the live-action picture is displayed in response to the prop display operation, the virtual props are displayed in the live-action picture, and the introduction information corresponding to a virtual prop is displayed in response to a selection operation on it. Because the virtual props are displayed in the live-action picture based on AR technology, and the information of the selected prop is displayed in the same picture according to the selection operation, the display efficiency of virtual prop information is improved.
The method provided by this embodiment identifies whether the display state in the live-action picture meets the condition that the virtual props must be displayed on a plane, and displays the virtual props on the bearing surface of the object after the corresponding object is determined. This avoids the problems caused by displaying on a plane that does not meet the requirements, further improving the display efficiency of virtual prop information.
In the method provided by this embodiment, when the live-action picture changes and a better plane exists, the user is prompted to switch bearing surfaces, the switch is performed according to the user's feedback, and the virtual props are displayed again. In the display mode of the live-action picture, the virtual props can thus be rearranged to adapt to changes in the live-action picture, further improving the display efficiency of virtual prop information.
In some embodiments of the present application, a display manner of the target virtual item in the live-action picture is changed according to a setting in the application program. Fig. 13 is a flowchart illustrating a method for displaying a virtual item according to an exemplary embodiment of the present application, which is described by way of example when the method is applied to an electronic device, and the method includes:
Step 1301, receiving a prop display operation, where the prop display operation is used to instruct display of a virtual prop in the virtual environment as an augmented reality scene.
In the embodiments of the present application, the virtual environment may be an environment displayed by a game application, as described in step 501. Corresponding to the functions of the virtual environment in the game application, the prop display operation is implemented as an operation on one of different types of controls overlaid on the virtual environment picture.
In the embodiment of the application, when the terminal receives the prop display operation, the picture of the virtual environment is still displayed on the terminal, and after the prop display operation is received, the live-action picture is displayed on the terminal.
Step 1302, displaying a live-action picture based on the prop display operation.
In the embodiments of the present application, besides displaying the interface of the game application, the terminal needs the ability to acquire images in real time. In one example, the terminal implements this with a camera module, which receives information from the real scene through the camera and generates the live-action picture.
Optionally, the live-action picture includes both content acquired in real time by the camera module of the terminal and content added onto the live-action picture by the processor of the terminal.
Step 1303, performing plane area identification on the object in the live-action picture, where the plane area identification is used to identify a bearing surface in the live-action picture.
In the embodiment of the present application, before displaying the virtual item in the live-action picture, the display feasibility of the virtual item in the live-action picture needs to be verified. The display feasibility verification is also called plane area identification, and the plane area identification comprises plane identification and bearing surface identification.
In the process of plane detection, the terminal detects whether a plane exists in the live-action picture.
Optionally, when no plane is detected, a reset live-action picture identifier is displayed in the terminal, where the identifier is used to prompt the user to reselect a live-action picture for displaying the virtual props.
And when the real scene picture has a plane, further detecting the bearing surface of the real scene picture. Optionally, the bearing surface detection includes detection of inclination of the bearing surface and detection of size of the bearing surface. The detection of the inclination of the bearing surface is used for detecting whether an angle formed between the plane and the bottom surface of the live-action picture is smaller than an inclination threshold value or not, the detection of the size of the bearing surface is used for detecting whether the size of the bearing surface is larger than a size threshold value or not, and when the plane simultaneously passes through the detection of the size of the bearing surface and the detection of the inclination of the bearing surface, the plane can be determined to be the bearing surface.
Step 1304, in response to the existence of the area meeting the bearing surface requirement in the live-action picture, determining that an object with a plane exists in the live-action picture.
Step 1305, displaying at least one virtual prop in a tiled manner on the plane of the object.
In the embodiment of the application, the number of the bearing surfaces is at least one. The display area is an area with a bearing surface.
Optionally, in this embodiment of the present application, each virtual prop corresponds to a prop level, which indicates the attribute grade of the prop. In one example, the prop level indicates the priority of the virtual prop during purchase; in another example, the prop level indicates the quality grade of the virtual prop. Step 1305 describes how the virtual props are displayed when their prop levels are uniform. Referring to fig. 14, when the prop levels of the virtual props are uniform, at least one virtual prop 1411 is displayed in a tiled manner on the first bearing surface 1410 of the live-action picture 1400; in this embodiment, the at least one virtual prop 1411 is tiled on one side of the first bearing surface.
In this embodiment of the present application, the virtual props belong to a virtual object; that is, in the virtual environment, the virtual props are held by the virtual object. When the virtual props 1411 are displayed on the first bearing surface 1410 of the live-action picture 1400, the virtual object 1420 is still displayed in the live-action picture 1400 to indicate that the virtual props 1411 belong to the virtual object 1420.
Step 1306, displaying a virtual item display rack on the plane of the object, wherein the virtual item display rack comprises at least two layers of display platforms.
Step 1307, displaying the virtual props in a tiled manner on the corresponding display platforms based on their prop levels.
In the embodiments of the present application, when the virtual props in the display area correspond to different prop levels, a virtual prop display rack is displayed on the bearing surface, where the rack includes at least two layers of display platforms. Optionally, the virtual prop display rack may be embodied as a display shelf or as a showcase displayed in the live-action interface. Referring to fig. 15, in the live-action picture 1500, the first bearing surface 1501 bears a virtual prop display rack 1502 with three layers, corresponding to the three prop levels of the virtual props; each layer of the virtual prop display rack 1502 is parallel to the first bearing surface 1501.
Optionally, as shown in fig. 15, in the live-action picture 1500, a higher-level virtual prop 1511 is placed on a higher display platform of the virtual prop display rack 1502, and a lower-level virtual prop is placed on a lower display platform.
As shown in fig. 15, a virtual object 1520 that currently holds the virtual props 1511 is still displayed alongside them.
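The level-to-platform mapping described above can be sketched as follows; the three-tier rack and the level values are assumptions taken from the FIG. 15 example.

```python
# Sketch of placing props on the display rack by prop level: higher
# levels go on higher display platforms.
from collections import defaultdict

def arrange_on_rack(props: dict[str, int], tiers: int = 3) -> dict[int, list[str]]:
    """props maps prop name -> prop level (1 = lowest)."""
    rack: dict[int, list[str]] = defaultdict(list)
    for name, level in props.items():
        tier = min(level, tiers)     # clamp levels above the top platform
        rack[tier].append(name)      # tier N is the N-th platform from the bottom
    return dict(rack)

print(arrange_on_rack({"sword": 1, "shield": 2, "AMR48": 3}))
# {1: ['sword'], 2: ['shield'], 3: ['AMR48']}
```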
Step 1308, in response to the change of the live-action picture, displaying the virtual item at the observation angle corresponding to the changed live-action picture.
In the embodiments of the present application, because the live-action picture is acquired in real time by the image acquisition module of the terminal, the observation angle of the first bearing surface changes as the user moves the handheld terminal. When a virtual prop is displayed, it is bound to the first bearing surface, that is, the relative positional relationship between the virtual prop and the first bearing surface does not change. Therefore, in the embodiments of the present application, the observation angle of the virtual prop is adjusted accordingly.
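This binding can be sketched as a fixed prop-to-surface offset, so that moving the camera changes only the view while the relative position is preserved; the 2D vector arithmetic below is a deliberate simplification of the real 3D pose transforms.

```python
# Sketch of binding a prop to the bearing surface; 2D for brevity.
import numpy as np

def world_pose(surface_pose: np.ndarray, prop_offset: np.ndarray) -> np.ndarray:
    # prop_offset is constant: the relative position never changes
    return surface_pose + prop_offset

def view_pos(world: np.ndarray, camera_pos: np.ndarray) -> np.ndarray:
    # moving the terminal changes only the observation position/angle
    return world - camera_pos

surface = np.array([2.0, 0.0])
offset = np.array([0.1, 0.3])                    # fixed prop-to-surface offset
for cam in (np.array([0.0, 0.0]), np.array([1.0, 0.5])):
    print(view_pos(world_pose(surface, offset), cam))
```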
Step 1309, in response to the selection operation on the target virtual item in the at least one virtual item, displaying an item detail screen, where the item detail screen is used to observe the target virtual item and display item acquisition information.
In this embodiment of the present application, the selection operation may be implemented as a click operation on the target virtual prop, or as an observation operation in which the target virtual prop is displayed in a preset area of the live-action picture at a preset observation angle. When the selection operation is implemented as an observation operation on a preset area, the user can change the content of the live-action picture by moving the handheld terminal, thereby performing the observation operation on the preset area at the preset observation angle.
After the target virtual prop is selected, in one example, a prop close-up picture, that is, a close-up display picture of the virtual prop, is displayed. Referring to fig. 16, the prop close-up picture 1600 displays a virtual prop 1601 together with its prop acquisition information 1602. The prop acquisition information is part of the prop introduction information and displays the requirements a virtual object must meet to acquire the prop. The prop acquisition information 1602 indicates the prop name of the virtual prop 1601 (AMR48), its acquisition price (1800), its rarity grade, and its attack attribute (2500). In addition, a purchase control 1610 for purchasing the virtual prop 1601 is overlaid on the prop close-up picture 1600.
Step 1310, in response to receiving the zoom-in operation on the target virtual prop, displaying the target virtual prop in the prop close-up screen in a zoom-in state, wherein in the zoom-in state, the display size of the target virtual prop is larger than the original display size of the target virtual prop in the prop close-up screen.
In the embodiments of the present application, the method further includes an enlargement operation on the target virtual prop within the prop close-up picture. Referring to fig. 17, the target virtual prop 1701 is first displayed in its original state within the prop close-up picture 1700. In response to receiving the enlargement operation, the target virtual prop 1701 is displayed in the enlarged state within the prop close-up picture 1700; due to the limits of the terminal screen size, only a portion of the target virtual prop 1701 is displayed when it is in the enlarged state.
In this embodiment, the operation of enlarging the target virtual item may be implemented as a two-finger dragging operation on the target virtual item.
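One way to read the two-finger gesture is as a pinch-style scale update; the sketch below clamps the scale so the enlarged prop may exceed the screen, as described in step 1310 (the scale limits are illustrative assumptions):

```python
# Sketch: scale the prop by the change in spread between two touch points,
# clamped to an illustrative range. Not an implementation from the patent.
import math

def pinch_scale(prev_touches, cur_touches, cur_scale,
                min_scale=1.0, max_scale=4.0):
    """prev_touches / cur_touches: [(x0, y0), (x1, y1)] for two fingers."""
    def spread(touches):
        (x0, y0), (x1, y1) = touches
        return math.hypot(x1 - x0, y1 - y0)

    factor = spread(cur_touches) / max(spread(prev_touches), 1e-6)
    return min(max(cur_scale * factor, min_scale), max_scale)
```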
Step 1311, in response to receiving the rotating operation of the target virtual prop, displaying the target virtual prop in a prop close-up screen in a rotating state, wherein the rotating state indicates that the target virtual prop rotates at a preset rotating speed.
In the embodiment of the application, a rotation operation on the target virtual prop is also provided in the corresponding prop close-up picture. Referring to fig. 18, the target virtual prop 1801 is first displayed in an original state in the prop close-up picture 1800. In response to receiving a rotation operation, the target virtual prop 1801 rotates at a preset speed in the prop close-up picture 1800, so that the user can view the target virtual prop 1801 from various angles.
In this embodiment of the present application, the operation of rotating the target virtual prop may be implemented as a single-finger dragging operation on the target virtual prop.
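The preset-speed rotation triggered by the single-finger drag can be sketched as a simple per-frame yaw update (the angular speed is an illustrative assumption):

```python
# Sketch: once a lateral single-finger drag triggers rotation, the prop spins
# about its vertical axis at a preset angular speed. The value is illustrative.
PRESET_DEG_PER_SECOND = 45.0

def update_rotation(angle_deg: float, rotating: bool, dt_seconds: float) -> float:
    """Advance the prop's yaw angle while rotation is active."""
    if rotating:
        angle_deg = (angle_deg + PRESET_DEG_PER_SECOND * dt_seconds) % 360.0
    return angle_deg
```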
To sum up, in the method provided in this embodiment of the present application, when the AR technology is applied, after the live-action picture corresponding to the prop display operation is displayed, the virtual prop is displayed in the live-action picture, and the introduction information corresponding to the virtual prop is displayed in response to a selection operation on the virtual prop. Based on the AR technology, the virtual prop is displayed in the live-action picture, and the acquisition information of the corresponding virtual prop is displayed in the live-action picture according to the selection operation, so that the display efficiency of the virtual prop information is improved.
According to the method provided by the embodiment of the application, the virtual props are arranged in the live-action picture in layers corresponding to their prop levels, so that the display of the virtual props better matches the content of the virtual environment, further improving the display efficiency of the virtual prop information.
According to the method provided by the embodiment of the application, a prop close-up picture is provided for the virtual prop, and rotation and enlargement operations on the virtual prop are provided in the prop close-up picture, which improves the user's grasp of the virtual prop information and further improves the display efficiency of the virtual prop information.
Fig. 19 is a schematic process diagram illustrating a method for displaying a virtual item according to an exemplary embodiment of the present application, where the process includes:
Step 1901, click the "virtual display transaction" button.
In this process, the "virtual display transaction" button is the button that triggers the prop display operation. Optionally, before the "virtual display transaction" button is clicked, a virtual environment picture is displayed in the terminal; after the button is clicked, step 1902 is executed.
Step 1902, turning on a camera of the mobile phone.
In this process, the camera of the mobile phone is turned on to display the live-action picture. After the live-action picture is displayed, the terminal detects whether a bearing surface exists in the live-action picture.
Step 1903, detect whether there is a horizontal plane near the scene.
Optionally, the "horizontal plane" in step 1903 refers to the "bearing surface" in the live-action picture. This process determines, through plane identification and bearing surface identification, a plane that can serve as a bearing surface in the live-action picture.
When no horizontal plane exists near the scene, step 1904 is performed.
In step 1904, the player rotates the mobile phone camera.
This process is the user moving the terminal to change the content of the live-action picture.
When a bearing surface exists near the scene, or after the player adjusts the mobile phone camera so that a bearing surface appears near the scene, step 1905 is executed.
In step 1905, the trading booths are automatically distributed on the left and right sides of the real scene.
This process displays the virtual props in the display area of the bearing surface. In the present embodiment, the trading booths are located on one side of the bearing surface, or on both sides of the bearing surface.
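A sketch of this booth layout, placing booths along the bearing surface on one side or alternating left and right; the coordinate frame, spacing, and lateral offset are illustrative assumptions:

```python
# Sketch: distribute trading booths along the bearing surface, on one side or
# on both sides. Positions are in the surface's local frame; all values are
# illustrative.
def booth_positions(n_booths: int, surface_length: float,
                    side: str = "both", offset: float = 0.3):
    """Return (x, z) positions: x runs along the surface, z is the lateral offset."""
    spacing = surface_length / max(n_booths, 1)
    positions = []
    for i in range(n_booths):
        x = (i + 0.5) * spacing - surface_length / 2
        if side == "both":
            z = offset if i % 2 == 0 else -offset  # alternate left/right
        else:
            z = offset
        positions.append((x, z))
    return positions
```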
In step 1906, the user holds the terminal and moves.
This process is the user moving the handheld terminal so that the content of the live-action picture changes. In this process, the user's movement includes walking in the actual environment, or remaining in place in the actual environment while rotating or translating the terminal.
Step 1907, turn to the target virtual prop.
This process is a selection operation on the target virtual prop. Here, the selection operation is an observation operation in which the target virtual prop is displayed in a preset area of the live-action picture at a preset observation angle.
Step 1908, click on the target virtual item.
Like step 1907, this process is a selection operation on the target virtual prop. Here, the selection operation is a click operation on the target virtual prop.
Optionally, after the target virtual prop is clicked, a prop close-up picture is displayed.
Step 1909, enlarge and rotate the target virtual item.
Step 1910, the finger slides laterally to rotate the target virtual prop.
Steps 1909-1910 are the process of enlarging and rotating the target virtual prop in the prop close-up picture.
Step 1911, click to purchase the target virtual item.
This process is the process of acquiring the target virtual prop.
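The Fig. 19 flow as a whole can be summarized as a small state machine; the state names and event strings below are one reading of steps 1901-1911, not identifiers from the disclosure:

```python
# Sketch: the Fig. 19 flow (button press -> camera on -> plane detection ->
# booths shown -> close-up -> purchase) as a state machine. States and events
# are illustrative.
from enum import Enum, auto

class Stage(Enum):
    IDLE = auto()
    CAMERA_ON = auto()
    SEARCHING_PLANE = auto()
    BOOTHS_SHOWN = auto()
    CLOSE_UP = auto()
    PURCHASED = auto()

TRANSITIONS = {
    (Stage.IDLE, "tap_transaction_button"): Stage.CAMERA_ON,
    (Stage.CAMERA_ON, "camera_ready"): Stage.SEARCHING_PLANE,
    (Stage.SEARCHING_PLANE, "no_plane"): Stage.SEARCHING_PLANE,  # user keeps moving the phone
    (Stage.SEARCHING_PLANE, "plane_found"): Stage.BOOTHS_SHOWN,
    (Stage.BOOTHS_SHOWN, "select_prop"): Stage.CLOSE_UP,  # turn-to or tap selection
    (Stage.CLOSE_UP, "tap_purchase"): Stage.PURCHASED,
}

def step(stage: Stage, event: str) -> Stage:
    """Advance the flow; unknown events leave the stage unchanged."""
    return TRANSITIONS.get((stage, event), stage)
```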
To sum up, in the method provided in this embodiment of the present application, when the AR technology is applied, after the live-action picture corresponding to the prop display operation is displayed, the virtual prop is displayed in the live-action picture, and the introduction information corresponding to the virtual prop is displayed in response to a selection operation on the virtual prop. Based on the AR technology, the virtual props are displayed in the live-action picture, and the acquisition information of the corresponding virtual props is displayed in the live-action picture according to the selection operation, so that the display efficiency of the virtual prop information is improved.
Fig. 20 is a block diagram illustrating a structure of a display device of a virtual prop according to an exemplary embodiment of the present application, where the display device includes:
a receiving module 2001, configured to receive a prop displaying operation, where the prop displaying operation is used to instruct to display a virtual prop in a virtual environment in an augmented reality scene;
a display module 2002 for displaying a live-action picture based on the property display operation, the live-action picture being a picture collected by the image collection module of the terminal;
the display module 2002 is further configured to, in response to a display area meeting the display condition being included in the live-action picture, display at least one virtual item based on a live-action object in the display area;
display module 2002 is further configured to, in response to a selection operation on a target virtual item in the at least one virtual item, display item introduction information, where the item introduction information is used to introduce the target virtual item.
In an alternative embodiment, the real world object is an object having a flat surface;
display module 2002 is further configured to, in response to an object having a plane being included in the live-action picture, display at least one virtual prop in a tiled manner on the plane of the object.
In an alternative embodiment, the virtual props correspond to at least two prop levels;
the display module 2002 is further configured to display a virtual item display rack on the plane of the object, where the virtual item display rack includes at least two layers of display platforms;
the display module 2002 is further configured to display the virtual item on the corresponding display platform in a tiled manner based on the item level of the virtual item.
In an alternative embodiment, the number of display platforms is equal to the number of prop levels.
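With one platform per prop level, the layering can be sketched as a simple grouping of props by level; the field names are illustrative:

```python
# Sketch: tile each virtual prop onto the display platform that matches its
# prop level, one platform per distinct level. Names are illustrative.
from collections import defaultdict

def assign_to_platforms(props):
    """props: iterable of (prop_id, level); returns {level: [prop_id, ...]}."""
    platforms = defaultdict(list)
    for prop_id, level in props:
        platforms[level].append(prop_id)
    return dict(platforms)

# Example: two levels -> two platforms.
# assign_to_platforms([("AMR48", 3), ("P90", 1), ("M4", 3)])
# -> {3: ["AMR48", "M4"], 1: ["P90"]}
```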
In an alternative embodiment, please refer to fig. 21, the apparatus further includes an identification module 2003, configured to perform plane area identification on an object in the live-action picture, where the plane area identification is used to identify a bearing surface in the live-action picture;
a determining module 2004, configured to determine that an object having a plane exists in the live-action picture in response to a region that meets the requirement of the bearing surface existing in the live-action picture;
display module 2002 is further configured to display at least one virtual prop in a tiled manner on a plane of the object.
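The bearing-surface requirement spelled out in claim 1 (unevenness, tilt, and size thresholds) can be sketched as below; the claim requires at least one of the conditions, while this sketch checks all three, and every threshold value is an illustrative assumption:

```python
# Sketch: a detected plane qualifies as a bearing surface when its flatness,
# tilt, and size pass fixed thresholds. The claim allows checking any subset
# of these conditions; this sketch checks all three. Values are illustrative.
from dataclasses import dataclass

@dataclass
class DetectedPlane:
    unevenness: float    # visual flatness residual; lower is flatter
    tilt_degrees: float  # angle between the plane normal and vertical
    area_m2: float       # estimated extent of the plane

UNEVENNESS_MAX = 0.02
TILT_MAX_DEGREES = 10.0
AREA_MIN_M2 = 0.25

def is_bearing_surface(plane: DetectedPlane) -> bool:
    return (plane.unevenness < UNEVENNESS_MAX
            and plane.tilt_degrees < TILT_MAX_DEGREES
            and plane.area_m2 > AREA_MIN_M2)
```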
In an optional embodiment, the live-action picture includes a first bearing surface and a second bearing surface, the first bearing surface is a bearing surface for bearing the virtual prop currently, and the second bearing surface is a candidate bearing surface;
the display module 2002 is further configured to display a bearing surface switching prompt in response to that a second display ratio of the second bearing surface in the live-action picture is greater than a first display ratio of the first bearing surface in the live-action picture;
the apparatus further includes a clearing module 2005, configured to clear the virtual item on the first bearing surface and display the virtual item on the second bearing surface in response to receiving a switching approval operation for the bearing surface switching prompt.
In an optional embodiment, the display module 2002 is further configured to, in response to a change of the live-action picture, display the virtual prop at an observation angle corresponding to the changed live-action picture.
In an optional embodiment, the selection operation is a click operation on the target virtual item;
and/or,
the selection operation is an observation operation of displaying the target virtual prop in a preset area in the real scene picture at a preset observation visual angle.
In an optional embodiment, display module 2002 is further configured to, in response to receiving an enlarging operation on the target virtual item, display the target virtual item in the item close-up screen in an enlarged state, where a display size of the target virtual item is larger than an original display size of the target virtual item in the item close-up screen in the enlarged state;
and/or,
in response to receiving a rotation operation on the target virtual prop, displaying the target virtual prop in a prop close-up screen in a rotation state, the rotation state indicating that the target virtual prop rotates at a preset rotation speed.
In an alternative embodiment, the virtual item is an item held by the virtual object;
the display module 2002 is further configured to, in response to a display area meeting the display condition being included in the live-action picture, display a virtual object in the display area, where the virtual object is located on a peripheral side of the virtual item.
To sum up, when the AR technology is applied, the apparatus provided in this embodiment of the application displays the virtual prop in the live-action picture after displaying the live-action picture corresponding to the prop display operation, and displays the introduction information corresponding to the virtual prop in response to a selection operation on the virtual prop. Based on the AR technology, the virtual prop is displayed in the live-action picture, and the acquisition information of the corresponding virtual prop is displayed in the live-action picture according to the selection operation, so that the display efficiency of the virtual prop information is improved.
It should be noted that: the display device of the virtual item provided in the above embodiment is only illustrated by dividing each functional module, and in practical applications, the above function allocation may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the above described functions. In addition, the display device of the virtual prop and the display method embodiment of the virtual prop provided by the above embodiments belong to the same concept, and the specific implementation process thereof is described in detail in the method embodiment and is not described herein again.
Fig. 22 shows a block diagram of an electronic device 2200 provided in an exemplary embodiment of the present application. The electronic device 2200 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III, motion Picture Experts compression standard Audio Layer 3), an MP4 player (Moving Picture Experts Group Audio Layer IV, motion Picture Experts compression standard Audio Layer 4), a notebook computer, or a desktop computer. The electronic device 2200 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
Generally, the electronic device 2200 includes: a processor 2201 and a memory 2202.
The processor 2201 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 2201 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 2201 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 2201 may be integrated with a GPU (Graphics Processing Unit) for rendering and drawing content required to be displayed by the display screen. In some embodiments, the processor 2201 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 2202 may include one or more computer-readable storage media, which may be non-transitory. Memory 2202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 2202 is configured to store at least one instruction for execution by processor 2201 to implement a method of displaying a virtual prop provided by a method embodiment of the present application.
In some embodiments, the electronic device 2200 may further include: a peripheral interface 2203 and at least one peripheral. The processor 2201, memory 2202, and peripheral interface 2203 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 2203 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 2204, a display 2205, a camera assembly 2206, an audio circuit 2207, a positioning assembly 2208 and a power supply 2209.
The peripheral interface 2203 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 2201 and the memory 2202. In some embodiments, the processor 2201, memory 2202, and peripheral interface 2203 are integrated on the same chip or circuit board; in some other embodiments, any one or both of the processor 2201, the memory 2202, and the peripheral device interface 2203 may be implemented on separate chips or circuit boards, which are not limited in this embodiment.
The Radio Frequency circuit 2204 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 2204 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 2204 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 2204 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 2204 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 2204 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display 2205 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 2205 is a touch display screen, the display screen 2205 also has the ability to acquire touch signals on or above the surface of the display screen 2205. The touch signal may be input to the processor 2201 as a control signal for processing. At this point, the display 2205 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 2205 may be one, disposed on the front panel of the electronic device 2200; in other embodiments, the display 2205 can be at least two, respectively disposed on different surfaces of the electronic device 2200 or in a folded design; in other embodiments, the display 2205 may be a flexible display disposed on a curved surface or on a folded surface of the electronic device 2200. Even more, the display 2205 can be arranged in a non-rectangular irregular figure, i.e. a shaped screen. The Display screen 2205 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 2206 is used to capture images or video. Optionally, camera assembly 2206 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, the main camera and the wide-angle camera are fused to realize panoramic shooting and a VR (Virtual Reality) shooting function or other fusion shooting functions. In some embodiments, camera assembly 2206 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp and can be used for light compensation under different color temperatures.
The audio circuit 2207 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals into the processor 2201 for processing or inputting the electric signals into the radio frequency circuit 2204 for realizing voice communication. The microphones may be provided in a plurality, respectively, at different locations of the electronic device 2200 for stereo sound capture or noise reduction purposes. The microphone may also be an array microphone or an omni-directional acquisition microphone. The speaker is used to convert electrical signals from the processor 2201 or the radio frequency circuit 2204 into sound waves. The loudspeaker can be a traditional film loudspeaker and can also be a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 2207 may also include a headphone jack.
The power supply 2209 is used to supply power to various components in the electronic device 2200. The power source 2209 may be ac, dc, disposable or rechargeable. When the power source 2209 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the electronic device 2200 also includes one or more sensors 2210. The one or more sensors 2210 include, but are not limited to: acceleration sensor 2211, gyro sensor 2212, pressure sensor 2213, optical sensor 2215 and proximity sensor 2216.
The acceleration sensor 2211 can detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the electronic device 2200. For example, the acceleration sensor 2211 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 2201 may control the display 2205 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 2211. The acceleration sensor 2211 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 2212 may detect a body direction and a rotation angle of the electronic device 2200, and the gyro sensor 2212 may cooperate with the acceleration sensor 2211 to acquire a 3D motion of the user on the electronic device 2200. The processor 2201 may implement the following functions according to the data collected by the gyro sensor 2212: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization while shooting, game control, and inertial navigation.
The pressure sensor 2213 may be disposed on a side bezel of the electronic device 2200 and/or underlying the display 2205. When the pressure sensor 2213 is arranged on the side frame of the electronic device 2200, a holding signal of the user to the electronic device 2200 can be detected, and the processor 2201 performs left-right hand recognition or quick operation according to the holding signal acquired by the pressure sensor 2213. When the pressure sensor 2213 is arranged at the lower layer of the display screen 2205, the processor 2201 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 2205. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 2215 is used to collect the ambient light intensity. In one embodiment, the processor 2201 may control the display brightness of the display screen 2205 according to the ambient light intensity collected by the optical sensor 2215. Specifically, when the ambient light intensity is high, the display brightness of the display screen 2205 is increased; when the ambient light intensity is low, the display brightness of the display screen 2205 is adjusted to be low. In another embodiment, the processor 2201 may also dynamically adjust the shooting parameters of the camera assembly 2206 according to the intensity of the ambient light collected by the optical sensor 2215.
A proximity sensor 2216, also referred to as a distance sensor, is typically disposed on the front panel of the electronic device 2200. The proximity sensor 2216 is used to capture the distance between the user and the front of the electronic device 2200. In one embodiment, when the proximity sensor 2216 detects that the distance between the user and the front of the electronic device 2200 gradually decreases, the processor 2201 controls the display 2205 to switch from the bright screen state to the dark screen state; when the proximity sensor 2216 detects that the distance between the user and the front of the electronic device 2200 gradually increases, the processor 2201 controls the display 2205 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 22 is not limiting to the electronic device 2200, and may include more or fewer components than shown, or combine certain components, or employ a different arrangement of components.
The embodiment of the present application further provides a computer-readable storage medium, where at least one instruction, at least one program, a code set, or an instruction set are stored in the computer-readable storage medium, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the display method of the virtual prop.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the display method of the virtual prop in any one of the above embodiments.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, which may be a computer readable storage medium contained in a memory of the above embodiments; or it may be a separate computer-readable storage medium not incorporated in the terminal. The computer readable storage medium has at least one instruction, at least one program, code set, or instruction set stored therein, and the at least one instruction, at least one program, code set, or instruction set is loaded and executed by a processor to implement the method for displaying the virtual item.
Optionally, the computer-readable storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a Solid State Drive (SSD), or an optical disc. The Random Access Memory may include a resistive Random Access Memory (ReRAM) and a Dynamic Random Access Memory (DRAM). The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps of implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (12)

1. A method for displaying a virtual item, the method comprising:
receiving a prop display operation, wherein the prop display operation is used for indicating to display a virtual prop in a virtual environment by an augmented reality scene;
displaying a live-action picture based on the prop display operation, wherein the live-action picture is a picture acquired by an image acquisition module of the terminal;
performing plane area identification on an object in the real scene picture, wherein the plane area identification is used for identifying a bearing surface in the real scene picture, the plane area identification comprises plane identification and bearing surface identification, the plane identification is used for determining the visual flatness of the object plane in the real scene picture, and the bearing surface identification is used for determining the inclination and the bearing surface size of the object plane in the real scene picture;
responding to the existence of an object plane meeting the bearing surface requirement in the live-action picture, and determining a first bearing surface and a second bearing surface in the live-action picture, wherein the bearing surface requirement comprises at least one of the condition that the plane unevenness displayed by the object plane in the live-action picture is lower than an unevenness threshold value, the inclination of the object plane in the live-action picture is smaller than an inclination threshold value, and the bearing surface size of the object plane in the live-action picture is larger than a size threshold value, and the first bearing surface and the second bearing surface are used for bearing and displaying at least one virtual prop;
displaying the at least one virtual prop on the first bearing surface in a tiled manner;
responding to a second display proportion of the second bearing surface in the live-action picture, which is larger than a first display proportion of the first bearing surface in the live-action picture, and displaying a bearing surface switching prompt;
in response to receiving a switching approval operation for the bearing surface switching prompt, clearing the at least one virtual prop on the first bearing surface and displaying the at least one virtual prop on the second bearing surface;
and responding to the selection operation of a target virtual item in the at least one virtual item, and displaying item introduction information which is used for introducing the target virtual item.
2. The method of claim 1, wherein the virtual items correspond to at least two item levels;
the displaying the at least one virtual prop on the first bearing surface in a tiled manner, comprising:
displaying a virtual item display rack on the first bearing surface, wherein the virtual item display rack comprises at least two layers of display platforms;
displaying the virtual props on corresponding display platforms in a tiled form based on the prop hierarchies of the virtual props.
3. The method of claim 2,
the number of display platforms is equal to the number of prop levels.
4. The method of any of claims 1 to 3, further comprising:
and responding to the change of the live-action picture, and displaying the virtual prop at an observation visual angle corresponding to the changed live-action picture.
5. The method of claim 4,
the selection operation is a click operation on the target virtual item;
and/or,
the selection operation is an observation operation of displaying the target virtual prop in a preset area in the real scene picture at a preset observation visual angle.
6. The method of any of claims 1 to 3, further comprising:
in response to receiving a zoom-in operation on the target virtual item, displaying the target virtual item in an item close-up screen in a zoomed-in state in which a display size of the target virtual item is larger than an original display size of the target virtual item in the item close-up screen;
and/or,
in response to receiving a rotation operation on the target virtual prop, displaying the target virtual prop in the prop close-up screen in a rotation state, the rotation state indicating that the target virtual prop rotates at a preset rotation speed.
7. A method according to any one of claims 1 to 3, wherein the virtual item is an item held by a virtual object;
the method further comprises the following steps:
and responding to a display area which meets display conditions and is included in the live-action picture, and displaying the virtual object in the display area, wherein the virtual object is positioned on the peripheral side of the virtual prop.
8. A display device of a virtual prop, the device comprising:
the system comprises a receiving module, a display module and a display module, wherein the receiving module is used for receiving a prop display operation, and the prop display operation is used for indicating to display a virtual prop in a virtual environment by an augmented reality scene;
the display module is used for displaying a live-action picture based on the prop display operation, wherein the live-action picture is a picture acquired by an image acquisition module of the terminal;
the display module is further configured to perform plane area recognition on an object in the real scene image, where the plane area recognition is used to recognize a bearing surface in the real scene image, the plane area recognition includes plane recognition and bearing surface recognition, the plane recognition is used to determine a visual flatness of an object plane in the real scene image, and the bearing surface recognition is used to determine an inclination and a bearing surface size of the object plane in the real scene image;
the determining module is used for responding to the existence of an object plane meeting the bearing surface requirement in the live-action picture, and determining a first bearing surface and a second bearing surface in the live-action picture, wherein the bearing surface requirement comprises at least one of the condition that the plane unevenness displayed by the object plane in the live-action picture is lower than an unevenness threshold value, the inclination of the object plane in the live-action picture is smaller than an inclination threshold value, and the bearing surface size of the object plane in the live-action picture is larger than a size threshold value, and the first bearing surface and the second bearing surface are used for bearing and displaying at least one virtual prop;
the display module is further configured to display the at least one virtual prop on the first bearing surface in a tiled manner;
the display module is further configured to respond to a second display proportion of the second bearing surface in the live-action picture, which is greater than a first display proportion of the first bearing surface in the live-action picture, and display a bearing surface switching prompt;
the clearing module is used for clearing the at least one virtual prop on the first bearing surface in response to receiving switching approval operation for the bearing surface switching prompt and displaying the at least one virtual prop on the second bearing surface;
the display module is further configured to display item introduction information in response to a selection operation of a target virtual item in the at least one virtual item, where the item introduction information is used to introduce the target virtual item.
9. The apparatus of claim 8, wherein the virtual prop corresponds to at least two levels of props;
the display module is further used for displaying a virtual item display rack on the first bearing surface, wherein the virtual item display rack comprises at least two layers of display platforms;
the display module is further configured to display the virtual props on corresponding display platforms in a tiled manner based on the prop hierarchies of the virtual props.
10. A computer device, characterized in that it comprises a processor and a memory, in which at least one program is stored, which is loaded and executed by the processor to implement the method of displaying a virtual item as claimed in any one of claims 1 to 7.
11. A computer-readable storage medium, wherein at least one program is stored in the computer-readable storage medium, and the at least one program is loaded and executed by a processor to implement the method for displaying a virtual item according to any one of claims 1 to 7.
12. A computer program product comprising computer instructions which, when executed by a processor, implement a method of displaying a virtual item as claimed in any one of claims 1 to 7.
CN202011615531.3A 2020-12-30 2020-12-30 Virtual item display method, device, equipment and readable storage medium Active CN112691372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011615531.3A CN112691372B (en) 2020-12-30 2020-12-30 Virtual item display method, device, equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011615531.3A CN112691372B (en) 2020-12-30 2020-12-30 Virtual item display method, device, equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112691372A CN112691372A (en) 2021-04-23
CN112691372B true CN112691372B (en) 2023-01-10

Family

ID=75512798

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011615531.3A Active CN112691372B (en) 2020-12-30 2020-12-30 Virtual item display method, device, equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112691372B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113680067B (en) * 2021-08-19 2024-10-18 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling virtual prop in game
CN113786621A (en) * 2021-08-26 2021-12-14 网易(杭州)网络有限公司 Virtual transaction node browsing method and device, electronic equipment and storage medium
CN113641443B (en) * 2021-08-31 2023-10-24 腾讯科技(深圳)有限公司 Interface element display method, device, equipment and readable storage medium
CN114570030A (en) * 2022-04-06 2022-06-03 网易(杭州)网络有限公司 Method and device for processing virtual equipment in game, electronic equipment and storage medium
CN116943236A (en) * 2022-04-18 2023-10-27 腾讯科技(深圳)有限公司 Virtual carrier selection method, device, equipment, storage medium and program product
CN114911382A (en) * 2022-05-06 2022-08-16 深圳市商汤科技有限公司 Signature display method and device and related equipment and storage medium thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111494934A (en) * 2020-04-16 2020-08-07 网易(杭州)网络有限公司 Method, device, terminal and storage medium for displaying virtual props in game
CN111589148A (en) * 2020-05-15 2020-08-28 腾讯科技(深圳)有限公司 User interface display method, device, terminal and storage medium
CN111643899A (en) * 2020-05-22 2020-09-11 腾讯数码(天津)有限公司 Virtual article display method and device, electronic equipment and storage medium
CN112138383A (en) * 2020-10-15 2020-12-29 腾讯科技(深圳)有限公司 Virtual item display method, device, equipment and storage medium


Also Published As

Publication number Publication date
CN112691372A (en) 2021-04-23

Similar Documents

Publication Publication Date Title
CN112691372B (en) Virtual item display method, device, equipment and readable storage medium
CN109529319B (en) Display method and device of interface control and storage medium
CN111589128B (en) Operation control display method and device based on virtual scene
CN111035918B (en) Reconnaissance interface display method and device based on virtual environment and readable storage medium
CN111701238A (en) Virtual picture volume display method, device, equipment and storage medium
CN111325822B (en) Method, device and equipment for displaying hot spot diagram and readable storage medium
CN113398572B (en) Virtual item switching method, skill switching method and virtual object switching method
CN111273780B (en) Animation playing method, device and equipment based on virtual environment and storage medium
CN110448908B (en) Method, device and equipment for applying sighting telescope in virtual environment and storage medium
CN113274729B (en) Interactive observation method, device, equipment and medium based on virtual scene
CN113546419B (en) Game map display method, game map display device, terminal and storage medium
CN112704876B (en) Method, device and equipment for selecting virtual object interaction mode and storage medium
TWI817208B (en) Method and apparatus for determining selected target, computer device, non-transitory computer-readable storage medium, and computer program product
CN111035929B (en) Elimination information feedback method, device, equipment and medium based on virtual environment
CN111026318A (en) Animation playing method, device and equipment based on virtual environment and storage medium
CN112957732B (en) Searching method, device, terminal and storage medium
CN113577765A (en) User interface display method, device, equipment and storage medium
CN113194329B (en) Live interaction method, device, terminal and storage medium
CN112755517B (en) Virtual object control method, device, terminal and storage medium
CN112023403B (en) Battle process display method and device based on image-text information
TW202243713A (en) Method and apparatus for controlling virtual object, electronic device, non-transitory computer-readable storage medium, and computer program product
CN113599819A (en) Prompt message display method, device, equipment and storage medium
CN113641443B (en) Interface element display method, device, equipment and readable storage medium
CN113058266B (en) Method, device, equipment and medium for displaying scene fonts in virtual environment
CN111338487B (en) Feature switching method and device in virtual environment, terminal and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40042979

Country of ref document: HK

GR01 Patent grant