US11850514B2 - Physical games enhanced by augmented reality - Google Patents
- Publication number: US11850514B2
- Authority: US (United States)
- Prior art keywords: game, player, gameplay, physical, augmented reality
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
- A63F13/5375 — Video games: indicators graphically or textually suggesting an action (e.g. an arrow indicating a turn in a driving game)
- A63F13/20 — Input arrangements for video game devices
- A63F13/42 — Mapping input signals into game commands (e.g. mapping stylus displacement to the steering angle of a virtual vehicle)
- A63F13/497 — Partially or entirely replaying previous game actions
- A63F13/65 — Generating or modifying game content automatically from real-world data (e.g. measurement in live racing competition)
- A63F13/79 — Game security or management involving player-related data (identities, accounts, preferences, play histories)
- A63F3/00643 — Electric board games; electric features of board games
- G06F3/011 — Arrangements for interaction with the human body (e.g. user immersion in virtual reality)
- G06T11/00 — 2D [two-dimensional] image generation
- A63F2009/2435 — Input devices using a video camera
- A63F2009/2439 — Input being a code (e.g. an ID)
- A63F2009/246 — Visual output: computer-generated or synthesized image
- A63F2009/2463 — Projection of a two-dimensional real image on a screen (e.g. using a video projector)
- A63F2009/2486 — A general-purpose personal computer as an accessory to a board game
- A63F2300/305 — Displaying a graphical or textual hint to the player
- A63F2300/5546 — Player registration data (identification, account, preferences, game history)
- A63F2300/634 — Replaying game actions partially or entirely since the beginning of the game
- A63F2300/69 — Involving elements of the real world in the game world (e.g. measurement in live races, real video)
- A63F3/02 — Chess; similar board games
Description
- Computer games sometimes possess features or characteristics not found in physical games, such as those played with tangible game boards and game pieces.
- Computer games can excel at presenting the actions of a game from the viewpoint of an individual participant, can include any of a variety of animations or audio effects, and so forth.
- Also, computer games do not require their participants to possess specialized game boards or game pieces. More generally, the gameplay of computer games is not bound by the constraints of the physical world. Physical games, conversely, lack many of the features found in computer games.
- FIG. 1 illustrates an example gaming system, in accordance with an embodiment
- FIG. 2 illustrates an example of an embodiment of an augmented reality enhanced gameplay system, in accordance with an embodiment
- FIG. 3 illustrates an example of selective information sharing to enhance an adversarial game, in accordance with an embodiment
- FIG. 4 is a flowchart illustrating an example of a process for initializing an AR-enhanced physical game, in accordance with an embodiment
- FIG. 5 is a flowchart illustrating an example of a process for computing AR-assisted game states, in accordance with an embodiment
- FIG. 6 is a flowchart illustrating an example of a process for initializing an AR-enhanced physical game, in accordance with an embodiment
- FIG. 7 is a flowchart illustrating an example of a process for distributing information to game participants, in accordance with an embodiment
- FIG. 8 illustrates an example of an embodiment of an augmented reality enhanced gameplay system incorporating localization aspects, in accordance with an embodiment
- FIG. 9 illustrates an example of an embodiment of an augmented reality enhanced gameplay system incorporating player level customization, in accordance with an embodiment
- FIG. 10 illustrates further aspects of gameplay enhanced by player level customization, in accordance with an embodiment
- FIG. 11 illustrates an example process for enhancing physical gameplay by augmented reality localization and customization, in accordance with an embodiment
- FIG. 12 illustrates an environment in which various embodiments can be implemented.
- Systems, devices, and techniques described herein relate to the enhancement of physical games using augmented-reality (“AR”) systems, devices and techniques. Moreover, at least some embodiments of the systems, devices, and techniques described herein comprise a system of services that enhance the gameplay of physical games using augmented reality.
- The term “physical games” refers to games that comprise at least some physical components, such as a real-world surface on which the game is played, a real-world object, and so forth.
- Similarly, “gameplay” refers to the conduct of a game which has at least some physical aspects. Examples include, but are not necessarily limited to, games such as chess or checkers in which the players may not necessarily possess all of the required physical components. Another example is a role-playing game in which the players possess some, but not all, of the figurines needed to represent the game participants.
- a system of services, which may also be described as an augmented reality platform, interacts with an AR device to provide enhanced gameplay for physical games.
- a player places a physical object (such as a token) on a physical surface (such as a mat or tabletop).
- the AR device captures video data including the token and provides it to the system, which then recognizes the token.
- the system associates the token with a game, and generates an augmented reality display that, on the AR device, incorporates the physical surface and facilitates gameplay.
- a component of the system maintains a combined state of the virtual and physical components of the game.
- This component may be referred to as a game engine.
- the game engine obtains inputs and movements from the players, updates the state of the game, and presents an augmented reality view of the game that is consistent with the updated game state. For example, a user may manipulate “stand-in” objects on the physical surface to indicate moves in the game. These movements are observed by an AR device.
- the game engine is provided, through the system, with information indicative of the movements.
- the game engine then updates the game state accordingly. Inputs might also be provided from other sources, such as other players, or in response to various events.
- the inputs to the game engine are processed and used, by the game engine, to update the game state.
- the game engine can therefore receive updates from both physical and virtual sources and update the game state accordingly.
- the updated state can then be used to generate augmented reality displays for presentation to the players.
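The game-engine flow described above can be sketched as follows. This is a minimal illustration, assuming simple dictionary-based state; the class and field names (`GameEngine`, `GameState`, `apply_input`) are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class GameState:
    """Combined state of the physical and virtual game components."""
    turn: int = 0
    pieces: dict = field(default_factory=dict)  # piece id -> board position

class GameEngine:
    """Accepts inputs from both physical sources (tracked object movements)
    and virtual sources (other players, timed events) and folds them into
    one combined state, which can then drive the augmented reality display."""

    def __init__(self):
        self.state = GameState()

    def apply_input(self, source, piece_id, position):
        # Physical and virtual inputs are treated uniformly once normalized.
        self.state.pieces[piece_id] = position
        self.state.turn += 1
        return self.state

engine = GameEngine()
engine.apply_input("physical", "pawn-1", (4, 4))   # observed token movement
engine.apply_input("virtual", "wizard", (2, 7))    # purely virtual element
```

A real engine would also validate each input against the game's rule set before committing it to the state.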
- an AR device user may place a token on a playing surface.
- a system comprising or in communication with the AR device, recognizes the identity of the token, associates the token with a game, and causes a version of the game, using the AR device, to be displayed so that it incorporates the playing surface.
- a component of the system which may be referred to as a game engine, maintains a combined state of the virtual and physical components of the game. For example, some aspects of the game might be represented by physical tokens on the playing surface. Other aspects may be purely virtual.
- the game engine updates the combined state and presents it to the players. Each player's experience may be customized according to factors such as localization references, skill level, experience, and perspective.
- the customization involves localizing the gameplay to reflect the player's language or culture.
- the customization involves adjusting the gameplay to reflect variances in skill level. This can be applied, for example, to maintain an illusion that each player is of approximately equal skill, while also permitting each player to exert his or her best effort at playing the game.
- the personalization may also involve facilitating a player's participation in an activity as a student, and likewise facilitating another player's participation in the activity as a teacher.
- the localization and personalization techniques may also be applied to games, such as billiards, that are based on a relatively high degree of physical skill.
- an augmented reality enhanced gameplay system as described herein may be used to provide skill-level-based enhancements to a game such as billiards by providing varying degrees of insight to a player.
- the degree of insight that is provided may be based on skill, and might decline as that player's skill improves.
- a system might provide insights on the expected effect of a billiards shot to a junior player, such as the line each ball is expected to follow, and suggestions for improving the shot.
- the system might simply provide warnings when a shot is determined to have a high likelihood of failing.
- the system might continuously assess the player's skill level (using insights generated by a game engine service, based on statistics derived from observations made during turns of the game) to adjust the level of assistance provided. Similar techniques may be applied to other physical activities, such as exercising. For example, the system might provide suggestions for correcting form, or implement a scoring system that helps individuals of varying degrees of fitness to engage in a competition to determine who performed their personal best.
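The skill-based adjustment described above might be sketched as a simple mapping from an observed success rate to an assistance tier. The thresholds and tier names below are illustrative assumptions, not values from the patent:

```python
def assistance_level(success_rate):
    """Map a player's observed per-turn success rate (0.0-1.0) to a level
    of AR assistance; assistance declines as skill improves."""
    if success_rate < 0.3:
        return "full-guidance"   # e.g. projected ball paths plus suggestions
    if success_rate < 0.7:
        return "warnings-only"   # e.g. flag shots likely to fail
    return "none"                # experienced players get no assistance
```

In practice the success rate would be derived from statistics that the game engine service accumulates over observed turns.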
- the localization and personalization techniques may also be applied to non-game activities, including procedural activities or “goal seeking” activities.
- One example is cooking.
- an AR device and AR system, as described herein, might provide various levels of assistance to a user who is following a recipe.
- a novice chef might be provided with more detailed instructions, supplementing whatever information is provided in the novice chef's recipe book.
- a more advanced chef might be provided with less information, or be provided with suggestions for varying the recipe.
- Localization might also be performed. For example, cooking units expressed in kilograms might be converted to pounds, or the system might provide suggestions for substituting exotic ingredients in the recipe with more readily available items.
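The unit-conversion part of this localization could look like the following sketch, which handles only the kilogram/pound pair from the example (the function name and system labels are assumptions):

```python
def localize_quantity(value, unit, target_system):
    """Convert a recipe quantity between metric and US-customary units,
    rounding to two decimal places for display in the AR overlay."""
    KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound
    if unit == "kg" and target_system == "us":
        return round(value / KG_PER_LB, 2), "lb"
    if unit == "lb" and target_system == "metric":
        return round(value * KG_PER_LB, 2), "kg"
    return value, unit  # unknown pairings pass through unchanged
```

Ingredient substitution would require a lookup table keyed by region and ingredient availability rather than a pure numeric conversion.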
- Techniques described and suggested in the present disclosure improve the field of augmented reality, especially as it applies to the enhancement of games played at least partially in the physical world.
- a system provides the ability for a user to play a wide variety of games, while possessing a limited set of physical components. These physical components might be generalized or dedicated game pieces, or might even be everyday objects used in place of game pieces.
- the system might allow commonplace six-sided dice to be used as stand-ins for more complex types of dice, such as twenty-sided dice.
- the AR-enhanced dice might further include custom faces, such as symbols instead of the pips ordinarily found on six-sided dice.
- a coin might be used as a stand-in for a game piece such as a pawn or knight.
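The dice stand-in ideas above can be sketched in two parts: rendering custom symbol faces over an observed six-sided die, and combining two observed d6 rolls into one uniform d20 result via rejection. Both the symbol set and the rejection scheme are illustrative assumptions:

```python
# Hypothetical custom faces: the observed pip count on a physical d6
# selects a symbol rendered over the die in the AR display.
CUSTOM_FACES = {1: "shield", 2: "sword", 3: "bow", 4: "staff", 5: "skull", 6: "crown"}

def render_face(observed_pips):
    """Return the custom symbol to render for an observed d6 face."""
    return CUSTOM_FACES[observed_pips]

def d20_from_d6_pair(a, b):
    """Combine two observed d6 rolls into a d20 result. The pair maps
    uniformly onto 1..36; values 21-36 mean 'roll again' (None), leaving
    a uniform distribution over 1..20."""
    v = (a - 1) * 6 + b
    return v if v <= 20 else None
```

Rejection keeps the result unbiased; simply taking `v % 20 + 1` would make some outcomes more likely than others.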
- a system provides the ability to present information to players of a physical game in a selective fashion.
- the provided information may be based on game state such as character position, the condition of the character within the game, and so forth.
- each player may view the board differently based on their respective character's position on the game board, or based on where that character has explored.
- a system for enhancing physical games might, for example, selectively provide information to a player whose character is near one position on the board, but not provide the information to a player whose character is not at that position.
- a system facilitates the selective exchange of information between certain players, such as members of a team.
- the system identifies team members based on the game state, identifies AR devices capable of providing information to the corresponding users, and shares the information.
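The team-based sharing step above can be sketched as a routing function. The dictionary shapes for the game state and the device registry are assumptions for illustration:

```python
def route_team_message(game_state, devices, from_player, message):
    """Deliver a message only to AR devices belonging to the sender's
    teammates, as identified from the current game state."""
    team = game_state["teams"][game_state["player_team"][from_player]]
    return {p: message for p in team if p != from_player and p in devices}

state = {
    "teams": {"red": ["alice", "bob", "cara"]},
    "player_team": {"alice": "red", "bob": "red", "cara": "red"},
}
devices = {"alice": "hmd-1", "bob": "hmd-2"}  # cara has no AR device online
routed = route_team_message(state, devices, "alice", "flank left")
```

Players without a registered AR device are simply skipped, so the shared information never reaches opposing players.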
- a system provides enhanced gameplay by providing dynamic responses to changes in game state. For example, certain changes to the game state may trigger the presentation of a movie or music to selected players, based on events occurring in the physical game. This might occur, for example, upon a player victory or defeat, or to signal or enhance the mood of a new phase of gameplay.
- a system described herein provides the ability for game history to be viewed by current or former players, or by third parties.
- This can include a “replay” capability which enables a user of the system to view a previously played game or turn. This may be done by recording changes in the game state over time and regenerating the previously displayed augmented reality scenes. Elements of the scene that were real-world when initially played may be replaced, in the replay, with virtual elements.
- the replay may comprise video or audiovisual data representative of the gameplay, and may include or exclude a depiction of the physical elements of the augmented reality gameplay.
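Recording changes in game state over time, as described above, might be sketched as a snapshot recorder; replay then regenerates scenes from the stored snapshots. The class and method names are illustrative:

```python
import copy

class Recorder:
    """Record successive snapshots of the game state so that a previously
    played game or turn can be replayed later."""

    def __init__(self):
        self.snapshots = []

    def record(self, state):
        # Deep-copy so later mutations don't alter the recorded history.
        self.snapshots.append(copy.deepcopy(state))

    def replay(self, start=0, end=None):
        # Rewinding or fast-forwarding is just selecting a window.
        return self.snapshots[start:end]

rec = Recorder()
state = {"turn": 0}
rec.record(state)
state["turn"] = 1
rec.record(state)
```

A production system would likely store state deltas rather than full snapshots to keep recordings compact.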
- a system described herein provides the ability to incorporate expansion features into a physical game, without requiring the player to possess any additional physical components.
- the system may further help users acquire or manage licenses to expansion features.
- certain games utilize card-based game mechanics.
- the purchase of an expansion set introduces a new set of cards.
- the expansion may be purchased and implemented digitally, rather than requiring the physical acquisition of a new set of cards.
- Stand-in objects, such as blank cards, can be rendered in augmented reality as cards from an expansion set.
- a system described herein allows for players to create or provide house rules that are applicable only to their copy of the game, or to support evolution of a game's rules over time. For example, changes to a game board may be applied by being rendered in the augmented reality display, rather than being permanently imprinted on the physical surface.
- a system described herein allows for players to take actions in the game independently of their turn. For example, rather than wait for their turn in the physical game, the player may input actions to be taken subsequently. This may have the effect of speeding gameplay.
- a system described herein facilitates the coordination of activities between players. For example, players on a team may enter prospective moves, which can be displayed to other players on the same team. The player's actions may then be executed simultaneously, or in-turn.
- the system may also facilitate gameplay in which players are unaware of their teammates' intended plans, by facilitating concealed input of player intentions. Such interactions may be difficult or impractical when conventional physical games are played without incorporating aspects of the techniques described herein.
- a system described herein facilitates enforcing rules or educating players concerning the rules of a game, or providing insights into play.
- the system may present AR-assisted tutorials or gameplay hints. These may be performed based in part on an indicated or perceived skill level of the participants.
- a system described herein provides for the state of a physical game to be saved and restored, at a later time.
- the system may facilitate the setup or restoration of the state of a physical game.
- the system may, for example, identify a game state known to the system's game engine, and compare that state to an observed state of a physical surface and physical objects placed on the surface.
- the system may then present, in augmented reality, information that assists the players in manipulating the objects or surface to comply with the game state.
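The state-comparison step above might be sketched as a diff between the saved layout and the observed layout, emitting setup instructions for the players. The piece-to-cell dictionary shape is an assumption:

```python
def setup_instructions(saved_state, observed_state):
    """Compare a saved piece layout to the observed physical layout and
    emit the steps needed to restore the saved game state."""
    steps = []
    for piece, cell in saved_state.items():
        if observed_state.get(piece) != cell:
            steps.append(f"move {piece} to {cell}")  # misplaced or missing
    for piece in observed_state:
        if piece not in saved_state:
            steps.append(f"remove {piece}")          # extraneous object
    return steps
```

Each resulting step could be rendered in augmented reality, for example as an arrow from a piece's current position to its saved position.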
- a system described herein allows for physical components to represent game types, or to represent extensions to a game.
- a game vendor might provide a game surface, such as a mat or game board, which represents the “basic” version of a game. The system may observe the mat or board and facilitate gameplay using the basic rules.
- additional objects such as cards or tokens, might be used to present expansion features.
- the game vendor might provide expansion cards which, when recognized by the system, cause the system to facilitate gameplay using an expanded version of the rules.
- a system described herein uses household objects to represent components of a game.
- an AR-enhanced experience may permit the creation of a Rube Goldberg machine comprising ramps, trampolines, bumpers, and so forth, using everyday objects such as forks, spoons, cups, and so on.
- the system may provide consistent classification of these objects, so that a fork consistently represents a ramp.
- a process of discovery may thereby be enabled, where a user may discover that a spoon always represents a ramp, a cup always represents a funnel, and so on.
- this example is intended only as an illustration, and that embodiments disclosed herein may facilitate a variety of gameplay types using various objects to act as stand-ins for various components of a game. As described herein, these stand-in objects may be rendered in augmented reality as their corresponding in-game components.
- a system described herein provides for assisted gameplay.
- the system might provide facilities which allow a player (for example, a chess player) to explore alternate future states of the game.
- the system may also, in some embodiments, provide for the assisted or automatic computation of scores, or the tracking or computation of other game states.
- a system described herein allows for “fog-of-war” features. This refers to information that is selectively provided to a player based on various game factors. For example, in certain games a player might be presented with current information related to an area the player's character is able to observe, but not provide current information related to areas the character cannot presently observe. In an embodiment, available information is rendered in augmented reality, e.g. by superimposing the information on top of the game surface.
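One simple fog-of-war rule consistent with the description above is to show a player only the cells within a fixed range of their character. The Chebyshev-distance rule below is an illustrative choice, not the patent's method:

```python
def visible_cells(position, sight_range, board_size):
    """Return the set of board cells a player can currently observe:
    all cells within Chebyshev distance sight_range of the character,
    clipped to the board. Everything else stays hidden under the fog."""
    x, y = position
    return {
        (i, j)
        for i in range(max(0, x - sight_range), min(board_size, x + sight_range + 1))
        for j in range(max(0, y - sight_range), min(board_size, y + sight_range + 1))
    }
```

The rendering service would then superimpose current information only for cells in this set, leaving the rest of the surface un-augmented or dimmed.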
- a system described herein allows for the creation of “illusionary walls” or other game elements that are viewed differently by different players in the game. For example, a player who created an illusionary wall may bypass it, while a player not aware of the wall's origin sees and experiences the wall as if it were real. This may continue until, based on gameplay, the player becomes aware of the wall's origin.
- a system described herein allows for the sharing of physical gameplay experiences with others, including remote participants.
- the system may record changes to the game state as it progresses, and render an all-virtual version of the gameplay to remote participants.
- the rendered augmented reality video may be recorded and synchronized to the evolution of the game state. In either case, replay of the game may be rewound or fast-forwarded to specific points in the game, as indicated by a given game state.
- physical or virtual challenges are incorporated into the physical gameplay.
- the players might participate in a physically conducted game of “rock-paper-scissors” to adjudicate a gameplay result.
- Embodiments may also incorporate, into the physical gameplay, virtual challenges which require varying levels of skill, such as solving math problems, sliding puzzles, color or shape matching, and so forth.
- the challenge level can be set as a player handicap, configured as a player option, or adjusted automatically to balance difficulty for all players.
- the adjustment may, for example, be made by the game engine in accordance with the game rules, available expansions, and configuration information.
- Certain techniques described herein may also be applicable to scenarios other than physical gameplay. For example, certain of the techniques described herein may be employed to enhance the experience of a sporting event, or the reading of a physical book. In another example, everyday items (such as glasses and liquids) may be used as a virtual chemistry set. In another example, a user experiences a puzzle game using simple physical objects as a stand-in. In a further aspect, a physical object is experienced in augmented reality as an object that evolves or changes over time or based on an inspection of the object.
- FIG. 1 illustrates an example gaming system 100 , in accordance with an embodiment.
- the example system 100 may, in various aspects and embodiments, incorporate some or all of the features described above, in various combinations.
- the system 100 comprises AR devices 102 a,b which communicate, via a network 150 , with various services, including an object and player recognition service 120 , a game engine service 122 , and a rendering service 124 .
- a service refers to a distributed computing system which provides computing functions to various client devices, such as the depicted AR devices 102 a,b .
- Services may, for example, be implemented by a combination of web server(s) 1206 and application servers 1208 , as depicted in FIG. 12 .
- the depicted system renders augmented reality gameplay which incorporates a physical surface.
- Incorporation of the physical surface refers to the generation of an augmented reality display that involves the physical surface.
- virtual elements may be generated and displayed on or above the physical surface.
- surface should not be construed as being limited to two-dimensional surfaces.
- a surface may have a complex three-dimensional geometry, possibly including a three-dimensional area in physical space.
- a game surface might include a hill, boulder, or similar landscape feature, and might include the air above the landscape feature.
- the rendering of the augmented reality gameplay, incorporating the physical surface, is based on output of a game engine.
- a game engine service 122 may maintain and periodically update, or output, a game state that can be used, directly or indirectly, by a rendering service 124 .
- a rule set is used by a game engine.
- a rule set may include information describing various aspects of the gameplay, in a format usable by the game engine to implement the game. This refers generally to information which allows the game engine to process game input, apply it to the current state of the game, and output a new state for the game.
- the rule set may also comprise additional information describing the various game elements, indicating how those elements might be used, how they might be rendered, and so forth.
- a rule set in at least one embodiment, includes information indicating which aspects of gameplay can be localized, and may further include information indicating how such localization should be accomplished. For example, the rule set might contain data indicating which cards used in gameplay have localizable text on them, and may also provide translations of the text in various languages.
- a prospective change to the game state is identified based at least in part on an observation of a physical object, such as one of the depicted stand-in objects 106 , on a play surface, such as the depicted play surface 104 .
- the observation may be conducted based on video data of the play surface 104 collected by one or more of the AR devices 102 a,b .
- the movement of the objects can be tracked, and a prospective change to the game state identified based on the tracked movement.
- the movement can be indicative of a change to the game state, as might be the case when an object representing a game piece is moved from one location to another.
- the system may identify prospective changes to the game state based on the tracked movement.
- the prospective change can then be validated based on the rule set, and then applied to the game state.
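As a minimal sketch of this validate-then-apply flow, assuming a chess-like game (the function names, the coordinate representation, and the rook-movement rule are all illustrative, not taken from any particular embodiment):

```python
# Illustrative sketch: a tracked movement of a physical object becomes a
# prospective change, which is validated against a rule and only then
# committed to the game state. All names here are invented for the example.

def is_legal_rook_move(src, dst):
    # A rook moves along exactly one rank or file.
    return (src[0] == dst[0]) != (src[1] == dst[1])

def apply_prospective_change(state, piece, src, dst, validator):
    """Validate a tracked movement, then commit it to the game state."""
    if state.get(src) != piece:
        return False  # observed object does not match the recorded state
    if not validator(src, dst):
        return False  # movement violates the rule set
    del state[src]
    state[dst] = piece
    return True

state = {(0, 0): "rook"}
assert apply_prospective_change(state, "rook", (0, 0), (0, 5), is_legal_rook_move)
assert state == {(0, 5): "rook"}
# A diagonal rook move is rejected, so the state is left unchanged.
assert not apply_prospective_change(state, "rook", (0, 5), (3, 7), is_legal_rook_move)
```

In a fuller system the validator would be derived from the rule set retrieved for the identified game, rather than hard-coded.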
- the fiducial marker 108 refers to an object which includes a characteristic which can be identified by the system and used to identify a game corresponding to or otherwise associated with the characteristic.
- the characteristic may be the shape of the object. For example, a pawn or knight from a chess game can be associated with the game of chess.
- the characteristic might be an identifier, such as a bar code or other printed matter, which can be used by the system to look up the associated game.
- the AR device 102 a uses one or more of its sensors, such as its camera, to read a code imprinted on the fiducial marker 108 .
- the AR device 102 a provides this information to the system services, such as the object recognition service 120 or game engine service 122 .
- the system 100 uses this information to identify a game associated with the fiducial marker 108 .
- the fiducial marker 108 might be a card whose code corresponds to a game of chess.
- the AR device 102 a obtains the code from the card, transmits it to one of the services 120 , 122 , and the system thereby determines that a game of chess should be initiated.
- Other cards might be associated with other games, such as checkers, backgammon, poker, a particular board game, a role-playing game, and so on.
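The lookup from fiducial code to game might be sketched as follows; the codes and registry contents are invented for the example, and an embodiment might instead read QR codes, bar codes, or RFID data:

```python
# Hypothetical registry mapping an observed fiducial code to the game
# that should be initiated. Codes shown are illustrative only.
GAME_REGISTRY = {
    "QR-CHESS-01": "chess",
    "QR-CHECKERS-01": "checkers",
    "QR-BACKGAMMON-01": "backgammon",
}

def identify_game(marker_code):
    """Return the game associated with a fiducial code, or None if unknown."""
    return GAME_REGISTRY.get(marker_code)

assert identify_game("QR-CHESS-01") == "chess"
assert identify_game("UNRECOGNIZED") is None
```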
- the game engine service initializes a virtual game state associated with the physical game.
- the total game state is represented by a combination of physical state (e.g., the position of physical objects on the play surface 104 ) and virtual state.
- stand-in objects 106 are used to represent aspects of the physical state of the game.
- the stand-in objects 106 might represent player characters in a role-playing game, or pawns in a game of chess.
- the game engine service 122 uses sensor input from the AR devices 102 a,b as input for calculating a subsequent game state.
- input from the AR devices 102 a,b might be used to collect information indicating what actions are taken by the players for each turn in the game. This can include the manipulation of the stand-in objects 106 on the game surface.
- Various other devices and modes of input may also be used, such as the selection of a menu item, placement of tokens on the game surface, voice commands, physical gestures, and so forth.
- the game engine service 122 causes or assists the rendering service 124 in generating aspects of the augmented reality display produced by the AR devices 102 a,b .
- the rendering service 124 can render graphics for game objects such as chess pieces, figurines, dice, cards, and so on. These graphics can then be incorporated into what is displayed on the screens of the AR devices 102 a,b . More complex graphics, such as a complete view of a chess board, a fully-rendered battlefield, and so forth, may also be produced by the rendering service 124 .
- the graphics produced by the rendering service 124 can be based on the current game state.
- the game state might represent the position of characters on a battlefield, which can then be rendered, by the rendering service 124 , for each AR device. It will be appreciated that the scene required for each AR device may be different, due to differences in each player's perspective.
- the rendering of the gameplay is based in part on an object tracking mechanism.
- This may include, in some cases, continuous object tracking performed independently of game turns.
- one of the stand-in objects 106 might be a simple token in the physical world, but rendered in AR as a complex and detailed figurine.
- the system 100 may continuously track the position of this token as it is moved about the board, so that it remains depicted in AR as a figurine, rather than as a simple token. This may be accomplished by the depicted object and player recognition service 120 .
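One way such continuous tracking could work is a simple nearest-neighbor association between frames, sketched below with invented positions and an invented distance threshold; a real system would track against camera detections rather than hand-written coordinates:

```python
# Toy sketch of continuous token tracking between frames: each known
# token is associated with the nearest new detection, so its identity
# (and therefore its AR figurine rendering) persists as it moves.

def associate(tracked, detections, max_dist=5.0):
    """Map each tracked token to its nearest detection within max_dist."""
    updated = {}
    for token_id, (x, y) in tracked.items():
        best = min(detections, key=lambda d: (d[0] - x) ** 2 + (d[1] - y) ** 2)
        if (best[0] - x) ** 2 + (best[1] - y) ** 2 <= max_dist ** 2:
            updated[token_id] = best
    return updated

tracked = {"knight_token": (10.0, 10.0)}
# In the next frame, the token has shifted slightly; a distant detection
# belongs to some other object and is not associated with this token.
tracked = associate(tracked, [(12.0, 11.0), (40.0, 40.0)])
assert tracked["knight_token"] == (12.0, 11.0)
```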
- the system 100 may provide various other features to facilitate and enhance gameplay. These may include, for example, providing enforcement or guidance regarding rules, providing supplementary information to players, allowing information to be shared with players on the same team, but not other players, calculating game scores, rolling virtual dice, and so on.
- the play surface 104 is a mat, cloth, or other similar two-dimensional surface.
- the play surface 104 may in some instances be relatively plain or feature-less, in order to minimize interference with the rendering of augmented reality assets within the scene.
- the play surface 104 may in some instances have patterns, colors, or other features imprinted on it, in order to assist with sensor calibration and object rendering.
- the play surface 104 is a three-dimensional structure, such as a board or table on which additional elements have been placed. For example, blocks, bricks, columns, or other objects may be treated by the system as part of the play surface.
- various services such as an object and player recognition service 120 , may distinguish between an object intended to be a portion of the play surface 104 , an object intended to be a fiducial marker 108 , and an object intended to be a stand-in object 106 . Objects may, however, be utilized for more than one such purpose.
- the play surface 104 corresponds to some or all of the game that is to be played.
- a physical chessboard might be used to play an AR-enhanced game of chess.
- an AR-enhanced wargame might be played with a mat on which hexagonal markings have been imprinted.
- the experience of playing these games may be enhanced in AR by the integration of additional elements. For example, terrain features rendered in AR might be projected onto the mat to enhance the wargame, and the physical chessboard might be rendered in AR to seem to be much larger in AR than it actually is in physical terms.
- a fiducial marker 108 is an object, such as a card or token, on which information is imprinted.
- a fiducial marker 108 may be a card on which a quick response (“QR”) code is imprinted.
- the fiducial marker 108 may be an object containing a radio-frequency identifier (“RFID”) or other mechanism from which a device, such as an AR device 102 , may obtain information.
- the fiducial marker 108 may have various additional properties used by the system 100 to facilitate the provision of AR services to enhance gameplay.
- a fiducial marker 108 may have a unique code which can be mapped to an account or owner. In an embodiment, this information is used to identify both the game to be played as well as additional information, such as saved games, player preferences, and so forth.
- the fiducial marker may also be used to facilitate the generation of the augmented reality display.
- the fiducial marker 108 is of a known size, which facilitates the determination of scale information for other objects observable through an AR device.
- the system 100 can use the known size of an observed fiducial marker 108 to determine the size of the play surface 104 , when both are captured by an AR device or camera.
- a fiducial marker 108 may include visual patterns or colors that can be observed by the camera of an AR device 102 a,b and used to perform calibration of the camera. For example, a white balance correction might be performed based on a pattern observed on a fiducial marker 108 .
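The scale determination described above can be sketched as a simple proportion, assuming a fiducial card of known physical size observed at roughly the same depth as the object being measured (the sizes below are invented):

```python
# Illustrative scale estimation from a fiducial marker of known size.
# Real systems would also account for perspective and depth; this sketch
# assumes marker and object lie at comparable distance from the camera.

def scale_mm_per_pixel(marker_size_mm, marker_size_px):
    """Millimeters covered by one pixel at the marker's depth."""
    return marker_size_mm / marker_size_px

def estimate_size_mm(object_size_px, mm_per_px):
    return object_size_px * mm_per_px

# A 50 mm fiducial card spans 100 pixels in the captured frame,
# so each pixel covers 0.5 mm at that depth.
mm_per_px = scale_mm_per_pixel(50.0, 100.0)
table_width_mm = estimate_size_mm(1800.0, mm_per_px)  # table spans 1800 px
assert table_width_mm == 900.0
```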
- the stand-in objects 106 may be used in gameplay. These may correspond to game pieces, game hazards, walls, barriers, and so forth.
- the stand-in object may also have properties used by the system 100 to facilitate the provision of AR services to enhance gameplay.
- a stand-in object may have a unique identifier which allows it to be mapped to a specific game object.
- a pack of cards might be provided as stand-in objects, and each card might have a unique identifier on it.
- the identifiers might be globally unique (e.g., a randomly generated string of alphanumeric characters that is sufficiently long to practically eliminate the chance of collision) or locally unique.
- the cards in a simple pack of playing cards might each be considered to have a unique identifier.
- For example, a jack of clubs might correspond to a tank, a ten of diamonds to an infantry unit, and so on.
- a stand-in object may have visual characteristics that can be used to facilitate its use in AR-enhanced gameplay. For example, chess pieces might be used as stand-in objects. A pawn might represent an army private, a knight a lieutenant, and so on.
- a stand-in object might directly correspond to its role in the game.
- a rook may correspond to a rook in an AR-enhanced game, but can still be described as a stand-in object.
- the visual characteristics of the object allow the system to identify it as a rook.
- the system 100 may employ a federation of services to provide AR-enhanced gameplay. These may include an object and player recognition service 120 , a game engine service 122 , a rendering service 124 , and a variety of additional services 126 - 136 .
- An object and player recognition service 120 provides functionality related to classifying and identifying objects and people.
- Classification refers to determining what class of objects an observed object belongs to. For example, classification might include determining whether an observed object is a person or a thing, or is a fork, a knife, or a spoon, and so on.
- identification refers to determining the particular identity of an object. For example, identifying a person may involve determining the name of an observed person, and identifying a stand-in object may involve determining which particular stand-in object has been observed. Identification may be done on a global basis or a local basis. For example, a person might be identified globally by determining that person's name and address, or locally by determining that the person is identified as “Player One.”
- a game engine service 122 provides functionality related to managing and updating game state. As already indicated, game state typically involves a combination of physical and virtual game state. The game engine service 122 maintains a record of this combined state. The game engine service 122 also receives input for each turn of the game (although in AR-assisted gameplay, the concept of turn may sometimes be loosened more than is typically possible in conventional, physical gameplay), and calculates a subsequent game state. Where necessary, the game engine service 122 may also assist in updating the physical game state. For example, the game engine service 122 may send instructions, to be displayed via the AR devices 102 , to move certain pieces or to remove other pieces from the play surface 104 .
- a rendering service 124 provides graphical rendering capabilities, as needed to produce visual elements of the AR scene displayed by the AR devices 102 a,b . This may include rendering of individual objects, partial scenes, or full scenes. In various cases and embodiments, graphics rendered by the rendering service 124 are integrated into visual data captured by the AR device, in order to produce the final, integrated, AR scene.
- the rendering service 124 may perform the integration in some cases and embodiments, while in others the integration is performed by the AR device 102 . The amount of rendering performed by the service 124 may depend, in some cases and embodiments, on the computing capabilities of the AR device 102 .
- a game state service 126 may provide for storage and retrieval of game state information.
- the game state service 126 stores and retrieves game state information based on a composite index of the identity of the player, identity of the game, and a time or turn indicator. This allows retrieval of the current game state or storage of a subsequent game state, as well as the retrieval of a history of game states. The latter may be used, for example, to enable rewinding the game state to an earlier time, or to enable a replay of the game state.
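A minimal in-memory sketch of such a composite-index store follows; the dict stands in for whatever datastore an embodiment actually uses, and the key and state shapes are illustrative:

```python
# Illustrative game-state store keyed by (player, game, turn), supporting
# retrieval of the latest state as well as earlier states for rewind or replay.

class GameStateStore:
    def __init__(self):
        self._states = {}

    def save(self, player_id, game_id, turn, state):
        self._states[(player_id, game_id, turn)] = state

    def load(self, player_id, game_id, turn):
        return self._states.get((player_id, game_id, turn))

    def latest(self, player_id, game_id):
        turns = [t for (p, g, t) in self._states if p == player_id and g == game_id]
        return self.load(player_id, game_id, max(turns)) if turns else None

store = GameStateStore()
store.save("p1", "chess", 1, {"board": "opening"})
store.save("p1", "chess", 2, {"board": "midgame"})
assert store.latest("p1", "chess") == {"board": "midgame"}
assert store.load("p1", "chess", 1) == {"board": "opening"}  # rewind to turn 1
```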
- the game engine service 122 provides the game state information to the game state service 126 for storage. When a game is restored, the information is loaded by the game state service 126 and used to initialize the game state for use by the game engine service 122 .
- a rules library 128 provides for the storage and retrieval of rules for the various games.
- the term “rules” refers to the information needed by the game engine service to facilitate gameplay.
- the rules can comprise scripts that can be leveraged by the game engine to calculate a subsequent game state based on a current game state, parameters for describing permissible movement of game pieces, and so forth.
- An information management and sharing service 130 provides for distributing information to the game players (and potentially to others) in a controlled manner. For example, each player in the game may be assigned to a team.
- the information management and sharing service 130 can be used, for example, to provide information to the AR devices 102 a,b of the players on one team, but not to players on the other teams.
- the information management and sharing service may store and retrieve information based on composite indexes such as game, player, and team number. In some instances, further restrictions are applied, such as based on an “information zone” that the player is in. These techniques may be employed to enforce “fog of war” features, for example.
- the game engine service 122 interacts with the information sharing and management service 130 to provide these features.
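The zone-based filtering could be sketched as follows; the zone names and the item structure are invented for the example:

```python
# Illustrative filtering of game information by "zone of knowledge":
# each item is tagged with its audience, and each AR device receives
# only the items its player is entitled to see.

def visible_items(items, player, team):
    out = []
    for item in items:
        zone = item["zone"]
        if zone == "shared":
            out.append(item)
        elif zone == "player" and item["player"] == player:
            out.append(item)
        elif zone == "team" and item["team"] == team:
            out.append(item)
    return out

items = [
    {"zone": "shared", "text": "turn 4"},
    {"zone": "player", "player": "p1", "text": "your hidden card"},
    {"zone": "team", "team": "t1", "text": "ally status"},
]
p1_view = visible_items(items, "p1", "t1")
p2_view = visible_items(items, "p2", "t2")
assert len(p1_view) == 3
assert len(p2_view) == 1  # only the shared item
```

Further restrictions, such as the "information zone" a player is physically in, could be expressed as additional predicates of the same form.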
- a game asset library 132 provides the digital assets needed to render a game. These assets may include wireframes, textures, bitmaps, audio files, and so forth. These may be stored and retrieved, e.g., by the asset library 132 at the request of the rendering service, based on the game being played.
- the game asset library 132 may also provide storage and retrieval services for assets needed for expansions or customizations of the game. For example, data for player-specific “skins” might be stored by the service.
- the game asset library 132 may employ various optimizations in order to provide the rendering service 124 with the assets it needs to render graphics efficiently, e.g. by employing techniques designed to minimize load times, etc.
- a personalization service 134 provides data or other assets for personalizing gameplay.
- personalizing gameplay comprises adjusting gameplay to account for factors such as player skill level, player preferences, house rules, and adjustments to localize the game to another culture, geographic region, or language.
- the personalization service 134 acquires data and performs computations for determining player skill level. For example, the personalization service 134 may monitor a player's participation or performance in a particular game, and make an assessment of that player's skill. The assessment may be made using any of a variety of machine learning techniques, by statistical methods, or by other methods. This assessment can then be used by the system as a basis of providing assistance to the player. The assistance can be provided through a variety of means, such as through hints, through the addition of game elements (such as extra game units), and so forth.
- the personalization service 134 stores and retrieves rules data.
- the personalization service provides the game engine service 122 with rule data, or other assets, that are applied by the game engine service 122 to adjust gameplay according to the intended level of personalization.
- This can include personal rules or house rules. These allow for the game experience to be customized for some or all of the players.
- the game engine can track these variations and apply them to the actions taken by the players. For example, the game engine may determine not to enforce an advanced rule against a beginning player, or may subject certain players to game mechanics better suited to those with more advanced skills.
- Applying the rules in this context can comprise identifying a potential player action, examining the action with respect to a set of rules deemed applicable to the player, and then permitting, altering, or refusing the player action in accordance with those rules.
- the game engine can then update the game state as appropriate.
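The permit/alter/refuse examination might be sketched as follows; the rule names, action names, and player profiles are all invented for illustration:

```python
# Illustrative per-player rule application: an action is examined against
# the rules applicable to that player and is permitted, altered, or refused.

def apply_player_rules(action, profile):
    """Return 'permitted', 'altered', or 'refused' for a proposed action."""
    if action == "advanced_flanking_move":
        # The advanced rule is not enforced against beginning players.
        return "permitted" if profile["skill"] == "beginner" else "refused"
    if action == "overextended_move":
        # Rather than refusing outright, the move is altered to a legal range.
        return "altered"
    return "permitted"

assert apply_player_rules("advanced_flanking_move", {"skill": "beginner"}) == "permitted"
assert apply_player_rules("advanced_flanking_move", {"skill": "expert"}) == "refused"
assert apply_player_rules("overextended_move", {"skill": "expert"}) == "altered"
```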
- a localization service 136 provides data or other assets for localizing gameplay.
- localizing gameplay comprises adapting gameplay, including in some cases visual or audio aspects of gameplay, to account for an individual player's location, culture, or language.
- the localization service may provide capabilities for associating localization preferences with a particular user.
- the localization service 136 stores and retrieves assets related to localization, such as textual data in various languages, graphical elements, instructions for adjusting gameplay, and so forth. For example, physical cards containing textual portions printed in English might be rendered in augmented reality using French, Spanish, or some other language.
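The card-text substitution described above might be sketched like this; the card identifiers and translation table are invented for the example:

```python
# Illustrative localized rendering of card text: the asset data marks a
# card's text as localizable and supplies translations, and the renderer
# substitutes the player's language, falling back to English.

CARD_TEXT = {
    "card_17": {"en": "Draw two cards", "fr": "Piochez deux cartes"},
}

def localized_text(card_id, language):
    translations = CARD_TEXT.get(card_id, {})
    # Fall back to English when no translation exists for the language.
    return translations.get(language, translations.get("en"))

assert localized_text("card_17", "fr") == "Piochez deux cartes"
assert localized_text("card_17", "de") == "Draw two cards"  # fallback
```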
- FIG. 2 illustrates an example of an embodiment of an AR-enhanced gameplay system 200 .
- the system 200 can comprise a play surface 204 , stand-in objects 206 , and a fiducial marker 208 . These may correspond to the play surface 104 , stand-in objects 106 , and fiducial marker 108 depicted by FIG. 1 .
- the system 200 may further comprise one or more projector(s) 210 and camera(s) 212 .
- the camera(s) 212 are mounted or positioned within a room so that they are able to observe the play surface 204 and any objects placed thereon.
- the camera(s) 212 may be a device comprising optical sensors to capture images of the play surface.
- the camera sensors may operate in the visible wavelengths, infrared wavelengths, or other wavelengths. Examples of a camera include dedicated optical devices such as cameras or webcams, or multi-purpose devices such as smartphones, tablets, and so on.
- the projector(s) 210 are devices that convert input signals to visual output.
- the projection is two-dimensional.
- the projector(s) 210 project a two-dimensional projection onto the play surface 204 .
- the system 200 is able to align the projection onto the stand-in objects 206 and play surface 204 . This is due at least in part to the ability to calibrate the projection using predetermined characteristics of the fiducial marker 208 .
- the projected image may therefore transform blank surfaces on the stand-in objects 206 and play surface 204 .
- an image might be projected onto a blank token to transform it into a character in the game.
- terrain features might be projected onto the play surface 204 .
- a plurality of two-dimensional projections are used, based on a plurality of projector(s) 210 positioned at strategic angles. This allows projections onto three-dimensional surfaces.
- the projector(s) 210 are three-dimensional or holographic projectors.
- FIG. 3 illustrates an example 300 of selective information sharing to enhance an adversarial game.
- an AR-Enhanced Game Board 304 is generated using the systems depicted in FIG. 1 or FIG. 2 .
- the game board is observed by the AR devices 302 a,b .
- Each AR device 302 a,b generates an augmented reality scene comprising the AR-Enhanced Game Board 304 .
- each AR device 302 a,b displays additional information to its user. For example, consider that the user of a first AR device 302 a might be “Player One” in a game, and might be on “Team One” in the game. Similarly, the user of a second AR device 302 b might be “Player Two” in the game, and a member of “Team Two.”
- the displays generated by the AR devices 302 a,b might also be based in part on knowledge possessed by the player utilizing the device.
- This knowledge may include, for example, information indicating the status of an ally, information about the area in which a player is currently located, an estimate of an opponent's status (e.g., an estimate of the amount of “hit points” an opponent has remaining), and so forth.
- the augmented reality scene displayed to the user can be adjusted in accordance with that player's “zone of knowledge.”
- secret information is organized as zones of information.
- One zone might correspond, for example, to information available only to a particular player.
- Another zone might correspond to information available only to a team.
- Another zone might be available only to players in a particular region of the game.
- the augmented reality scene generated for each player is adjusted in accordance with these zones.
- This may be accomplished by the game engine determining what constitutes each player's zone of knowledge and providing, to the augmented reality rendering components, the information to be rendered for a particular player.
- the display generated by the first AR device 302 a might display shared information 310 , which is also generated by the second AR device 302 b .
- the first AR device 302 a might generate a display of secret information 306 a that is only presented to “Player One.”
- the first AR device 302 a might generate a display of secret information 308 a that is only presented to members of “Team One.”
- the second AR device 302 b may likewise display player-specific secret information 306 b and team-specific secret information 308 b to “Player Two.”
- FIG. 4 is a flowchart illustrating an example of a process 400 for initializing an AR-enhanced physical game, in accordance with various embodiments.
- Some or all of the process 400 may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
- the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
- process 400 may be performed by any suitable system, such as a server in a data center, by various components of the environment 1200 described in conjunction with FIG. 12 , such as the one or more web servers 1206 or the one or more application servers 1208 , by multiple computing devices in a distributed system of a computing resource service provider, or by any electronic client device such as the electronic client device 1202 .
- the services depicted in FIG. 1 operate on the web servers 1206 and application servers 1208 depicted in FIG. 12 .
- the process 400 includes a series of operations wherein an AR-enhanced game is initialized based on the observation of a fiducial marker.
- a fiducial marker is observed by an AR device. This may be done using the techniques described with respect to FIG. 1 .
- environmental sensors are calibrated based on the known properties of the fiducial marker. This may be done using the techniques described herein, including with respect to FIG. 1 .
- the game is identified based on the fiducial marker in accordance with the techniques described herein, including with respect to FIG. 1 .
- the play surface is identified. This may comprise identifying the area of physical space in or on which the game will be played. In some instances, a mat or game board is utilized. Other potential surfaces include a dining-room table, desk, or floor.
- the identification of the play surface may comprise identification of its dimensions, which may be informed by the sensors calibrated using the fiducial marker. For example, the size of the fiducial marker may be compared to the size of an observed table, in order to determine how big the table is.
- stand-in objects are identified. Identification of a stand-in object may include identifying a category or identity of the object in order to correlate this information to the role the stand-in object will serve in the game. In an embodiment, a visual hash is calculated to serve as a unique identifier of the object and to allow it to be tracked as it is moved around the game surface. This approach may be combined, in some embodiments, with other tracking techniques in order to improve continuity of identification.
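A deliberately simplified "visual hash" might look like the following; a real embodiment would likely use a perceptual hash over camera imagery, whereas this toy version thresholds a small grayscale grid against its mean intensity:

```python
# Toy visual hash: reduce an object's image patch to a bit string by
# comparing each cell to the mean intensity. Identical observations of
# the same token hash identically, giving it a persistent identifier.

def visual_hash(grid):
    flat = [v for row in grid for v in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if v > mean else "0" for v in flat)

patch = [
    [200, 30],
    [40, 210],
]
h1 = visual_hash(patch)
# The same token observed again in a later frame hashes identically,
# so its identity persists as it moves around the game surface.
assert h1 == visual_hash([[200, 30], [40, 210]])
assert h1 == "1001"
```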
- the game state is initialized. This step may comprise a number of sub-operations.
- the game state may be initialized by identifying the players, determining license rights associated with the players (e.g., expansions owned by the players), retrieving rules for the identified game, loading expansions, allocating a game engine, retrieving any saved state information, and initializing the game engine based on the rules and saved state.
- the initialization process may further comprise loading graphical and audio assets from a game asset library, for use by a rendering service. These assets may comprise bitmaps, audio files, and so forth.
- the game is rendered in its initial state. This may, for example, involve a rendering service working in conjunction with the AR devices to generate augmented-reality scenes in which the stand-in objects and play surface have been replaced by their augmented-reality counterparts. Virtual-only elements can also be included in the scene.
- video frames for the scene are rendered by a rendering service, such as the rendering service 124 depicted in FIG. 1 , and transmitted to an AR device for display to a user.
- FIG. 5 is a flowchart illustrating an example of a process 500 for computing AR-assisted game states, in accordance with various embodiments. Some or all of the process 500 (or any other processes described, or variations and/or combinations of those processes) may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
- the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
- process 500 may be performed by any suitable system, such as a server in a data center, by various components of the environment 1200 described in conjunction with FIG. 12 , such as the one or more web servers 1206 or the one or more application servers 1208 , by multiple computing devices in a distributed system of a computing resource service provider, or by any electronic client device such as the electronic client device 1202 .
- the services depicted in FIG. 1 operate on the web servers 1206 and application servers 1208 depicted in FIG. 12 .
- the process 500 includes a series of operations wherein a game engine calculates game states to facilitate AR-enhanced physical gameplay.
- a process for facilitating AR-enhanced physical gameplay may comprise operations for obtaining player input, updating a game state in accordance with the player input, game rules, and other factors, and rendering an augmented reality scene consistent with the updated game state.
- the game engine receives inputs based on physical changes to the game state.
- stand-in objects may have been moved or added to the playing surface. Examples of such actions include moving a stand-in object representing a chess piece, adding a figurine to represent a new foe, and so forth.
- the forms of input are not necessarily limited to the manipulation, addition, or removal of stand-in objects.
- other physical changes such as a change in lighting condition, music, physical gestures, and so forth can all be forms of input.
- the passage of time might also be treated as a physical change to the game state, and used as a form of input.
- the game engine identifies other forms of input, such as input received via player gestures detected by the AR device, vocal commands detected by the AR device, menu selections, and so forth. Inputs might also be based on messages received from any of a variety of sources, such as other games.
- changes to the virtual game state are computed based on the inputs, as are intended changes to the physical state.
- the virtual changes can include calculating game effects, calculating the positions of objects that exist only virtually, and so forth.
- the intended changes to the physical state include changes to the physical state that should be performed in order to make the overall state of the game consistent. For example, it may be determined that an enemy unit has been destroyed. If that unit is represented in the game by a physical token, e.g., a stand-in object, that token should be removed from the board.
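One engine cycle producing both a new virtual state and a list of intended physical changes might be sketched as follows; the event and state shapes are invented for illustration:

```python
# Illustrative engine cycle: inputs yield a new virtual state plus a list
# of intended physical changes for the players to carry out (e.g. removing
# the token of a destroyed unit from the play surface).

def compute_cycle(state, inputs):
    new_state = dict(state)
    physical_changes = []
    for event in inputs:
        if event["type"] == "attack":
            target = event["target"]
            new_state[target] = new_state.get(target, 10) - event["damage"]
            if new_state[target] <= 0:
                del new_state[target]
                physical_changes.append(f"remove token for {target}")
    return new_state, physical_changes

state = {"enemy_tank": 3}
state, changes = compute_cycle(
    state, [{"type": "attack", "target": "enemy_tank", "damage": 5}]
)
assert "enemy_tank" not in state
assert changes == ["remove token for enemy_tank"]
```

The returned list of physical changes corresponds to the instructions the engine would display to players via their AR devices.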
- the game is rendered based on the newly calculated state, and displayed on the AR devices.
- the rendering may be performed using the techniques described herein, including those described with respect to FIG. 1 .
- the game engine interfaces with other components of the system to assist with any needed changes to the physical state.
- the game engine might generate a set of instructions that are to be displayed to the users, via their respective AR devices.
- the game engine may also facilitate verification of the physical changes, to ensure that the physical environment has been modified to match whatever requirements are imposed on it by the changes to the virtual game state.
- the game engine proceeds to the next cycle.
- the game state is recorded during each cycle, or during selected cycles, in order to facilitate a subsequent replay of the game or a subsequent restoration of the game.
- the game state is transmitted, for each or selected cycles, to another device or computing system, to facilitate remote viewing of the gameplay.
- FIG. 6 is a flowchart illustrating an example of a process 600 for initializing an AR-enhanced physical game, in accordance with various embodiments.
- Some or all of the process 600 may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
- the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
- process 600 may be performed by any suitable system, such as a server in a data center, by various components of the environment 1200 described in conjunction with FIG. 12 , such as the one or more web servers 1206 or the one or more application servers 1208 , by multiple computing devices in a distributed system of a computing resource service provider, or by any electronic client device such as the electronic client device 1202 .
- the services depicted in FIG. 1 operate on the web servers 1206 and application servers 1208 depicted in FIG. 12 .
- the process 600 includes a series of operations wherein a game is set up or restored to a desired state, so that game play can commence or be continued.
- the system identifies the game based on a fiducial object or marker placed on a playing surface. This may be accomplished using the techniques described herein, including those techniques described in relation to FIG. 1 .
- the system identifies the players. This may be accomplished using the techniques described herein, including those techniques described in relation to FIG. 1 .
- the subsequent retrieval of a game state may be based on this identification, such that games that the player owns, or was involved in, can be made available and selected by the player for restoration.
- the system retrieves the rules of the game, as well as the rules of any associated expansions or extensions to the game.
- rules refers to data and instructions usable by a game engine to facilitate the enhanced gameplay, rather than the traditional text-based rules that might be provided to a player.
- Various licensing requirements may be considered to determine if an expansion or extension should be retrieved and applied to gameplay. For example, a game might require that at least one participating player have purchased a license to an extension. In another example, a player may have established various “house rules” that should be applied to the game, but only when that particular player is involved.
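The licensing determination above might be implemented along the following lines; the policy shown (an expansion applies if at least one participating player holds a license, and "house rules" apply only when their owner participates) is one possibility among many, and the record shapes are illustrative assumptions.

```python
def applicable_expansions(players, expansions):
    """Return the names of expansions that should be applied to this session.

    players: mapping of player name -> record with a "licenses" set.
    expansions: list of records; an optional "house_rules_owner" marks house rules."""
    selected = []
    for exp in expansions:
        owner = exp.get("house_rules_owner")
        if owner:
            # house rules apply only when that particular player is involved
            if owner in players:
                selected.append(exp["name"])
        elif any(exp["name"] in p.get("licenses", ()) for p in players.values()):
            # licensed expansion: at least one participant must have purchased it
            selected.append(exp["name"])
    return selected
```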
- the system retrieves state information associated with the game and the identified player(s). For example, the game might have been previously saved, and its players may wish to resume gameplay from the save point.
- a unique ability of AR-assisted gameplay is the facilitation of “save points” for a physical game.
- the system retrieves a previously stored game state, provides information and hints for restoring the physical game to that state, and facilitates subsequent gameplay. Virtual components of the game can be automatically restored.
- the virtual game state is either initialized or restored based on a save point. This may comprise retrieving the relevant game data associated with the save point, allocating a game engine, and initializing the game engine in accordance with the state.
- the system assists in the initialization or restoration of the physical game state. For example, if a game of chess is commencing, the system might help the players by ensuring that the various pieces are properly placed in their starting positions, or in the positions associated with the save point.
- the system can also assist in the restoration of a prior physical game state by providing information in augmented reality regarding where pieces should be placed.
- the system might display augmented reality markers at various points on the play surface, in order to indicate where a token should be placed, and what type of token is required.
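A restoration step of this kind might compare the saved token layout against what the AR device currently observes and emit per-position placement hints for display as augmented reality markers. The following is a non-limiting sketch; the layout representation and hint wording are assumptions.

```python
def placement_hints(saved_layout, observed_layout):
    """Compare a saved physical layout with the observed one and produce
    hints for restoring the save point: where to place missing tokens,
    which misplaced tokens to move, and which extra tokens to remove.

    Both layouts map token name -> (x, y) board position."""
    hints = []
    for token, pos in sorted(saved_layout.items()):
        seen = observed_layout.get(token)
        if seen is None:
            hints.append(f"place {token} at {pos}")
        elif seen != pos:
            hints.append(f"move {token} from {seen} to {pos}")
    for token in sorted(set(observed_layout) - set(saved_layout)):
        hints.append(f"remove {token} from the board")
    return hints
```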
- FIG. 7 is a flowchart illustrating an example of a process 700 for distributing information to game participants, in accordance with various embodiments. Some or all of the process 700 (or any other processes described, or variations and/or combinations of those processes) may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
- the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
- process 700 may be performed by any suitable system, such as a server in a data center, by various components of the environment 1200 described in conjunction with FIG. 12 , such as the one or more web servers 1206 or the one or more application servers 1208 , by multiple computing devices in a distributed system of a computing resource service provider, or by any electronic client device such as the electronic client device 1202 .
- the services depicted in FIG. 1 operate on the web servers 1206 and application servers 1208 depicted in FIG. 12 .
- the process 700 includes a series of operations wherein information is compartmentalized and distributed according to game criteria.
- the system identifies the game being played and retrieves rules for the game.
- the game rules may include various elements related to the compartmentalization of information to be distributed in the game. Examples include rules to indicate how fog-of-war is handled, how information is shared between team members, and so forth.
- the system identifies the players of the game, as well as any teams and the associations between the players and the teams.
- the system may further track correspondence between the players and objects on the physical game surface, so that the system may determine which tokens are controlled by which player.
- an information zone refers to a type or category of information, as defined by the gameplay rules and gameplay state.
- the information made available to a particular player might be described as the intersection of the rules and state.
- the area might be affected by various rules and game states, such as whether or not the character has a torch or flashlight.
- the position of a monster might be included in an information zone if it would be visible under these game conditions, but excluded otherwise.
- another example of an information zone is one based on team membership, so that all of the members of “Team One” are included in the same information zone, but players not on “Team One” are excluded.
- players are assigned to information zones based on the aforementioned criteria. Note that this may be an ongoing process, since membership in an information zone can change based on changes to the game state. For example, the game engine might determine that a character's position has changed (e.g., by observing movement of a corresponding stand-in object), and thereby determine to include or exclude that character from an information zone.
- game information is distributed based on membership in the information zones.
- the game engine causes the rendering component to include the information in the augmented reality display of the appropriate users, or trigger some other means of providing the information.
- the game engine might cause a first player's device to emit “mood” audio associated with a particular location, or cause another user's heads-up display to be updated to include messages from that player's teammates.
- the system facilitates game input based on membership in an information zone.
- the system facilitates input based on team membership alone, or in combination with information zone membership. For example, players whose characters are within the same zone and who are on the same team might be permitted to coordinate their actions for the next turn. Similarly, opponents who are in the same zone might receive hints as to the opposing team's actions, when characters on the opposing team are communicating within the opponent's range of hearing.
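The zone assignment and distribution operations above might be sketched as follows. The proximity rule (Manhattan distance as a stand-in for hearing range) and the zone-key format are assumptions for illustration only; an actual rule set would define its own criteria.

```python
def assign_zones(players, max_distance=2):
    """Assign players to information zones: teammates share a team zone, and
    a proximity zone groups characters within an assumed hearing range.

    players: mapping of name -> {"team": int, "pos": (x, y)}."""
    zones = {}
    for name, p in players.items():
        zones.setdefault(("team", p["team"]), set()).add(name)
    names = sorted(players)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            ax, ay = players[a]["pos"]
            bx, by = players[b]["pos"]
            if abs(ax - bx) + abs(ay - by) <= max_distance:
                zones.setdefault(("near", a, b), set()).update((a, b))
    return zones

def distribute(info, zones):
    """Deliver an information item only to the members of its zone."""
    inbox = {}
    for member in zones.get(info["zone"], ()):
        inbox.setdefault(member, []).append(info["message"])
    return inbox
```

Because positions change as stand-in objects move, `assign_zones` would be re-evaluated each cycle, consistent with the ongoing-membership note above.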
- FIG. 8 illustrates an example of an embodiment of an augmented reality enhanced gameplay system incorporating localization aspects, in accordance with an embodiment.
- a play surface 806 is observed by two players “A” 850 and “B” 852 , operating respective AR Devices “A” and “B” 802 , 804 .
- the AR Device “A” 802 may generate a view of the play surface 806 from a player A's perspective 812
- the AR Device “B” 804 may generate a view of the play surface 806 from player B's perspective 814 .
- player “A” may be assumed to be associated with a first set of localization settings 820
- player “B” may be assumed to be associated with a second set of localization settings 822
- the example 800 of FIG. 8 describes aspects of embodiments that localize game elements in accordance with each player's localization preferences. Examples of localizing game elements include adapting instructions to a player's language requirements, adjusting currency values to a denomination a player is familiar with, and so forth.
- localization settings 820 , 822 refer to data indicative of the language preferences of the respective player.
- the play surface 806 as rendered by AR Device A 802 might be rendered such that locations or objects 810 on the board are rendered with labels in the French language, whereas the same locations or objects 810 on play surface 806 might be rendered by AR Device B 804 using English language labels.
- localization settings 820 , 822 refer to localization-related gameplay adaptations of the respective player.
- the currency for player “A” might be rendered and treated in the game as French francs
- the currency for player “B” might be rendered and treated in the game as U.S. dollars.
- the gameplay might be further adapted based on these differences, such as by incorporating an “exchange rate” between the different currency types.
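The per-player currency rendering and the "exchange rate" adaptation above can be sketched as follows. The rates shown are arbitrary placeholders; an embodiment would obtain rates and formats from the localization service.

```python
# Assumed exchange rates relative to the game's internal currency unit;
# these values are illustrative, not part of the disclosure.
RATES = {"USD": 1.0, "FRF": 6.5}

def localize_amount(amount, currency):
    """Render an internal game amount in a player's preferred currency,
    so the same value appears as francs to one player and dollars to another."""
    return f"{amount * RATES[currency]:.2f} {currency}"
```

Under this scheme, a single internal value is stored once and rendered differently per AR device, which keeps gameplay consistent across localization settings.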
- the rendering of gameplay in accordance with a localization preference is based, in various embodiments, on the rule set.
- a rule set, in such embodiments, may identify gameplay elements—such as cards, spots on the board, instructions, and so forth—that may require localization.
- the rule set may identify cards used in a game, or spots on the game board, that include textual elements that can be localized to accommodate the language needs of a particular player.
- a game engine, in an embodiment, is made cognizant of the localizable elements of a game based on the rule set, and takes appropriate steps to cause those elements to be rendered, by a rendering service, using the appropriate localization preferences.
- Localization services may also include translation services 854 , facilitating communication between players speaking different languages.
- the translation services 854 includes spoken word translation.
- the translation services 854 include translation of gameplay elements that are expressed in a localized manner. For example, in a game which involves bartering or trading, an offer made by one player in terms of French francs might be expressed to another player in terms of U.S. dollars. Likewise, while one player might refer to a region of a board using a term expressed in one language, another player might refer to the same region using a term from a different language.
- the translation services 854 may facilitate gameplay adaptations based on these differences.
- Localization may also include, in some embodiments, localization of gameplay actions, such as bartering or trading, through visual interactions.
- a player can initiate a barter or trade in the gameplay by creating a virtual list or manifest of goods or money to exchange, and both parties can then participate in the process by modifying the virtual list.
- the game engine can cause the edits to be restricted to what is possible within the constraints of the game.
- this aspect is enabled when the system determines that visual interaction may be useful due to differences in the language requirements of the players.
- Embodiments may also provide means of allowing players, despite language differences, to perform gameplay actions such as forming pacts, alliances, or agreements, based on exchanges of funds or promises to perform actions in the future.
- the game engine may enforce the terms of the pact, alliance, or agreement within the gameplay. For example, a move that would be illegal in the terms of an agreement might be disallowed by the game engine.
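The virtual trade manifest described above, with edits restricted to what the game's constraints allow, might be sketched as follows; the inventory-based constraint is an assumed example of such a restriction.

```python
def apply_trade_edit(manifest, player_inventory, item, qty):
    """Apply one edit to a virtual trade manifest, restricted to what is
    possible within the constraints of the game: here, a player cannot
    offer more of an item than they actually hold, nor a negative amount.

    Returns True if the game engine accepts the edit."""
    offered = manifest.get(item, 0) + qty
    if offered < 0 or offered > player_inventory.get(item, 0):
        return False  # edit rejected by the game engine
    manifest[item] = offered
    return True
```

Both parties would call this through their AR devices, with the engine arbitrating, so the manifest stays valid regardless of which player edits it.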
- FIG. 9 illustrates an example of an embodiment of an augmented reality enhanced gameplay system incorporating player level customization, in accordance with an embodiment.
- player level customization refers to adaptations to the gameplay that are designed to accommodate differences in players' skill levels.
- player “A” might be of relatively low skill in the game being played, or might simply prefer a more relaxed playing experience.
- player “B” might be a more advanced player or might desire a more intense playing experience.
- Information about these preferences might be stored by the system, as depicted by the skill settings 920 , 922 elements of FIG. 9 .
- the system may therefore, in accordance with the respective skill settings 920 , 922 , adjust the gameplay to adapt to these respective differences.
- the AR Device A 902 of player “A” renders a display of gameplay having a relatively low complexity compared to the display 902 b rendered by the AR Device B 904 of player “B.” This is the case even though the players are observing, through their respective AR devices, the same play surface 906 .
- the adjustments to gameplay depicted by FIG. 9 may be implemented, at least in part, by the actions of the game engine service 122 and the personalization service 134 .
- the personalization service 134 may identify desired types and amounts of adaptations, and drive the game engine service 122 to apply those changes to the gameplay.
- the personalization service 134 instructs the game engine service 122 to incorporate additional gameplay elements that player “B” must contend with, compared to player “A.” This has the effect of levelling the amount of skill required for player “B” to win the game, relative to player “A.”
- the additional gameplay elements might, for example, include additional non-player characters opposing the actions of the more advanced player, additional terrain obstacles, additional game pieces attributed to player “A” but controlled by the system, and so forth.
- player “B” might have to contend with fog-of-war rules, advanced unit management rules, and so forth, which are not applied to player “A,” or are automated on behalf of player “A.”
- FIG. 10 illustrates further aspects of gameplay enhanced by player level customization, in accordance with an embodiment.
- player “A” is assumed, for the purposes of the example, to be a novice student of the game, whereas player “B” is assumed to be an advanced player or teacher of the game.
- Each player views the AR-Enhanced game board 1002 from their own perspective, as generated by the respective AR Devices 1004 , 1006 .
- the gameplay as produced by the system 100 depicted in FIG. 1 and rendered by the AR Device A 1004 of player “A,” is adjusted to facilitate the student's learning of the game.
- the gameplay for player “A” is adjusted, by the system 100 and AR Device A 1004 , to include rules assistance 1010 .
- the AR Device A 1004 may render graphics showing allowable moves or hints about rules which might be applicable to the player's current situation. If the player attempts to make a move which is against the rules, the system might provide an explanation of why the rule is being disallowed.
- the AR Device A 1004 can render these aspects based on data and processing of the game engine service 122 , rendering service 124 , and personalization service 134 .
- the gameplay for player “A” is adjusted, by the system 100 and AR Device A 1004 , to include strategy and tactics insight 1012 .
- This can include, for example, displaying suggested moves, highlighting the effects of a move proposed by the player, and so forth.
- the AR Device A 1004 can render these suggestions based on data and processing of the game engine service 122 , rendering service 124 , and personalization service 134 .
- the gameplay experience for player “A” is adjusted, by the system 100 and AR Device A 1004 , to include assistance from others 1014 .
- assistance from others refers to hints or other communications that originate from other players, or from other sources.
- assistance from others 1014 includes insight provided from player “B,” delivered through the system 100 to the AR Device A 1004 .
- assistance from others 1014 includes insight provided from a game streaming service.
- the assistance might, for example, include commentary from those observing the game in real time.
- the system 100 applies machine learning algorithms to identify comments made by players in similar game situations, or in response to similar moves.
- the AR Device A 1004 can render these aspects based on data and processing of the game engine service 122 , rendering service 124 , and personalization service 134 .
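The rules-assistance element described above — validating a proposed move and explaining a disallowed one for display on the AR device — might look like the following simplified sketch. Only a single, simplified movement rule is modeled (path blocking is ignored), and all names are illustrative.

```python
def check_move(piece, start, end, occupied):
    """Validate a proposed move; when it is illegal, return an explanation
    that the AR device can render to the student."""
    if piece != "rook":
        return False, f"no movement rule loaded for {piece}"
    if start[0] != end[0] and start[1] != end[1]:
        return False, "a rook may only move along a rank or file"
    if end in occupied:
        return False, "the destination square is occupied by your own piece"
    return True, "legal move"
```

A rendering service could show the explanation as an overlay when the player attempts the move, consistent with the rules-assistance behavior described above.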
- the gameplay as produced by the system 100 depicted in FIG. 1 and rendered by the AR Device B 1006 of player “B,” is adjusted to facilitate player “B's” role as a teacher of the game.
- the gameplay experience of “player B” is adjusted by the system 100 and AR Device B 1006 by the inclusion of elements related to advanced strategy insight 1020 .
- the more junior player might receive comparatively simple tips regarding strategy and tactics, while the more advanced player B might receive more advanced tips.
- These may include insights which may facilitate the teacher role.
- the AR Device B 1006 might render hints or suggestions which relate to the long-term implications of a proposed move by player A or player B.
- the system 100 (particularly the game engine) might identify moves which might be considered “instructive” because they would introduce situations in the game that the student would benefit from experiencing.
- the gameplay experience of “player B” is adjusted by the system 100 and AR Device B 1006 by the inclusion of elements related to providing teacher insight 1022 .
- the operations, choices, or decisions that are available to a player may depend on the player's absolute or relative skill level.
- This can comprise presenting fair substitutions of options, which may be put in play automatically by the game engine.
- a novice player might be provided with an army of chess pieces in which the more complex pieces are omitted, or their behavior simplified.
- the substitutions may be made in a manner that is determined by the game engine to be fair.
- the omitted pieces might be replaced with additional pieces, such as additional pawns or an additional queen.
- the complexity of the cockpit in a flight simulator is reduced for novice players, and increased for advanced players.
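One way a game engine might judge such a substitution "fair" is by preserving total material value. The conventional chess point values used below are an assumption — the disclosure does not specify any particular valuation scheme.

```python
# Conventional chess point values, used here only as an assumed fairness measure.
PIECE_VALUES = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}

def substitute_fairly(omitted, replacement_piece):
    """Replace omitted complex pieces with enough copies of a simpler piece
    to preserve (or slightly exceed) the total material value removed."""
    lost = sum(PIECE_VALUES[p] for p in omitted)
    unit = PIECE_VALUES[replacement_piece]
    count = -(-lost // unit)  # ceiling division
    return [replacement_piece] * count
```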
- the gameplay experience of “player B” is adjusted by the system 100 and AR Device B 1006 by the inclusion of elements and features related to assistance generation 1024 .
- This might include, for example, supporting gesture-based input to trigger the provision of a hint to the student, when it is apparent to the teacher or the system that the student is struggling to make a suitable game decision. It might also include support for gameplay tips generated by the system, which the teacher might approve or decline to send to the student.
- FIG. 11 illustrates an example process for enhancing physical gameplay by augmented reality localization and customization, in accordance with an embodiment.
- Some or all of the process 1100 may be performed under the control of one or more computer systems configured with executable instructions and/or other data, and may be implemented as executable instructions executing collectively on one or more processors.
- the executable instructions and/or other data may be stored on a non-transitory computer-readable storage medium (e.g., a computer program persistently stored on magnetic, optical, or flash media).
- process 1100 may be performed by any suitable system, including but not limited to augmented reality devices comprising a processor and a non-transitory memory on which instructions executable by the processor are stored.
- the augmented reality system may further comprise components such as those depicted in FIG. 1 .
- the process 1100 includes a series of operations wherein an augmented reality device provides localized and personalized gameplay to each participant in an augmented reality enhanced game.
- the system 100 identifies the players.
- Player identification may be performed by the object player recognition service 120 , based on one or more input factors such as identifying characteristics of the player's AR device, facial recognition of the player, voice identification, and so forth. These factors may be applied in combination.
- the system 100 obtains per-player localization data. This may be performed by retrieving information stored by the localization service 136 and/or personalization service 134 , indexed by the player's identity. In some embodiments, geographic information (such as the current location of the player, obtained via global-positioning features of an AR device) may also be used to retrieve an appropriate set of localization data.
- the system 100 obtains personalization data, such as per-player skill assessments, that can then be used by the system 100 to drive gameplay adjustments. These can be obtained by using the identity of a player, as obtained at 1102 , to retrieve an appropriate information set via the personalization service 134 .
- the system adjusts or loads various game assets to support localization in augmented reality.
- the system might determine the locale(s) to which the system needs to adjust, and load game assets (such as maps, textures, fonts, textual resources, translation services, and so forth) that might be needed.
- This operation may involve the rendering service 124 , as it relies on assets to perform graphical rendering, and the game engine service 122 , as it relies on rule sets and other assets to drive gameplay.
- the system adjusts or loads various game assets to support game personalization in augmented reality. For example, additional rulesets might be loaded, graphical elements corresponding to additional non-player characters or obstacles might be loaded, and so on.
- This operation may involve the rendering service 124 , as it relies on assets to perform graphical rendering, and the game engine service 122 , as it relies on rule sets and other assets to drive gameplay.
- the system uses the game engine to drive gameplay that incorporates adjustments for localization and personalization.
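The operations of process 1100 — identify players, obtain localization and personalization data, and load the supporting assets — can be sketched as a pipeline. The record shapes, defaults, and asset names below are assumptions made for illustration.

```python
def initialize_session(players, localization_db, personalization_db):
    """Sketch of process 1100: for each identified player, pull localization
    and personalization records and collect the assets to load before the
    game engine begins driving adjusted gameplay."""
    session = {"players": {}, "assets": set()}
    for player in players:
        locale = localization_db.get(player, {"language": "en", "currency": "USD"})
        skill = personalization_db.get(player, 5)  # assumed default skill
        session["players"][player] = {"locale": locale, "skill": skill}
        # localization assets: fonts and textual resources for the language
        session["assets"].add(f"strings-{locale['language']}")
        # personalization assets: extra rulesets for advanced players
        if skill >= 7:
            session["assets"].add("ruleset-advanced")
    return session
```

The returned session record would then be consumed by the rendering service 124 and game engine service 122 , which rely on these assets as described above.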
- FIG. 12 illustrates aspects of an example environment 1200 for implementing aspects in accordance with various embodiments.
- the environment includes an electronic client device 1202 , which can include any appropriate device operable to send and/or receive requests, messages, or information over an appropriate network 1204 and convey information back to a user of the device.
- client devices include personal computers, cell phones, handheld messaging devices, laptop computers, tablet computers, set-top boxes, personal data assistants, embedded computer systems, electronic book readers, and the like.
- the environment 1200 in one embodiment is a distributed and/or virtual computing environment utilizing several computer systems and components that are interconnected via communication links, using one or more computer networks or direct connections.
- FIG. 12 should be taken as being illustrative in nature and not limiting to the scope of the disclosure.
- the network 1204 can include any appropriate network, including an intranet, the Internet, a cellular network, a local area network, a satellite network or any other network, and/or combination thereof. Components used for such a system can depend at least in part upon the type of network and/or environment selected. Many protocols and components for communicating via such network 1204 are well known and will not be discussed in detail. Communication over the network 1204 can be enabled by wired or wireless connections and combinations thereof.
- the network 1204 includes the Internet and/or other publicly-addressable communications network, as the environment 1200 includes one or more web servers 1206 for receiving requests and serving content in response thereto, although for other networks an alternative device serving a similar purpose could be used as would be apparent to one of ordinary skill in the art.
- the illustrative environment 1200 includes one or more application servers 1208 and data storage 1210 . It should be understood that there can be several application servers, layers or other elements, processes or components, which may be chained or otherwise configured, which can interact to perform tasks such as obtaining data from an appropriate data store. Servers, as used, may be implemented in various ways, such as hardware devices or virtual computer systems. In some contexts, “servers” may refer to a programming module being executed on a computer system.
- data store or “data storage” refers to any device or combination of devices capable of storing, accessing, and retrieving data, which may include any combination and number of data servers, databases, data storage devices, and data storage media, in any standard, distributed, virtual, or clustered environment.
- the one or more application servers 1208 can include any appropriate hardware, software and firmware for integrating with the data storage 1210 as needed to execute aspects of one or more applications for the electronic client device 1202 , handling some or all of the data access and business logic for an application.
- the one or more application servers 1208 may provide access control services in cooperation with the data storage 1210 and is able to generate content including text, graphics, audio, video, and/or other content usable to be provided to the user, which may be served to the user by the one or more web servers 1206 in the form of HyperText Markup Language (HTML), Extensible Markup Language (XML), JavaScript, Cascading Style Sheets (CSS), JavaScript Object Notation (JSON), and/or another appropriate client-side structured language.
- Content transferred to the electronic client device 1202 may be processed by the electronic client device 1202 to provide the content in one or more forms including forms that are perceptible to the user audibly, visually, and/or through other senses.
- the handling of all requests and responses, as well as the delivery of content between the electronic client device 1202 and the one or more application servers 1208 can be handled by the one or more web servers 1206 using PHP: Hypertext Preprocessor (PHP), Python, Ruby, Perl, Java, HTML, XML, JSON, and/or another appropriate server-side structured language in this example.
- Each server typically will include an operating system that provides executable program instructions for the general administration and operation of that server and typically will include a computer-readable storage medium (e.g., a hard disk, random access memory, read only memory, etc.) storing instructions that, when executed (i.e., as a result of being executed) by a processor of the server, allow the server to perform its intended functions.
- the data storage 1210 can include several separate data tables, databases, data documents, dynamic data storage schemes, and/or other data storage mechanisms and media for storing data relating to a particular aspect of the present disclosure.
- the data storage 1210 may include mechanisms for storing various types of data and user information, which can be used to serve content to the electronic client device 1202 .
- the data storage 1210 also is shown to include a mechanism for storing log data, such as application logs, system logs, access logs, and/or various other event logs, which can be used for reporting, analysis, or other purposes.
- the data storage 1210 is operable, through logic associated therewith, to receive instructions from the one or more application servers 1208 and obtain, update, or otherwise process data in response thereto.
- the one or more application servers 1208 may provide static, dynamic, or a combination of static and dynamic data in response to the received instructions.
- Dynamic data such as data used in web logs (blogs), shopping applications, news services, and other applications may be generated by server-side structured languages as described or may be provided by a content management system (CMS) operating on, or under the control of, the one or more application servers 1208 .
- a user through a device operated by the user, can submit a search request for a match to a particular search term.
- the data storage 1210 might access the user information to verify the identity of the user and obtain information about items of that type. The information then can be returned to the user, such as in a results listing on a web page that the user is able to view via a browser on the electronic client device 1202 .
- Information related to the particular search term can be viewed in a dedicated page or window of the browser. It should be noted, however, that embodiments of the present disclosure are not necessarily limited to the context of web pages, but may be more generally applicable to processing requests in general, where the requests are not necessarily requests for content.
- the various embodiments further can be implemented in a wide variety of operating environments, which in some embodiments can include one or more user computers, computing devices, or processing devices that can be used to operate any of a number of applications.
- User or client devices can include any of a number of computers, such as desktop, laptop, or tablet computers running a standard operating system, as well as cellular, wireless, and handheld devices running mobile software and capable of supporting a number of networking and messaging protocols.
- Such a system also can include a number of workstations running any of a variety of commercially available operating systems and other known applications for purposes such as development and database management.
- These devices also can include other electronic devices, such as dummy terminals, thin-clients, gaming systems, and other devices capable of communicating via the network 1204 .
- These devices also can include virtual devices such as virtual machines, hypervisors, and other virtual devices capable of communicating via the network 1204 .
- Various embodiments of the present disclosure utilize the network 1204 that would be familiar to those skilled in the art for supporting communications using any of a variety of commercially available protocols, such as Transmission Control Protocol/Internet Protocol (TCP/IP), User Datagram Protocol (UDP), protocols operating in various layers of the Open System Interconnection (OSI) model, File Transfer Protocol (FTP), Universal Plug and Play (UPnP), Network File System (NFS), and Common Internet File System (CIFS).
- the network 1204 can be, for example, a local area network, a wide-area network, a virtual private network, the Internet, an intranet, an extranet, a public switched telephone network, an infrared network, a wireless network, a satellite network, and any combination thereof.
- connection-oriented protocols may be used to communicate between network endpoints.
- Connection-oriented protocols (sometimes called connection-based protocols) are capable of transmitting data in an ordered stream.
- Connection-oriented protocols can be reliable or unreliable.
- The TCP protocol is a reliable connection-oriented protocol.
- Asynchronous Transfer Mode (ATM) and Frame Relay are unreliable connection-oriented protocols.
- Connection-oriented protocols are in contrast to packet-oriented protocols such as UDP that transmit packets without a guaranteed ordering.
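The stream-versus-datagram distinction drawn above can be illustrated with a short, self-contained Python sketch (illustrative only, not part of the disclosure). Loopback UNIX sockets are reliable, so the sketch shows only the framing difference, a single ordered byte stream versus discrete message boundaries, rather than loss or reordering on a real network:

```python
import socket

# Connection-oriented (stream) socket: bytes arrive as one ordered stream;
# two separate sends may be coalesced, but order is always preserved.
a, b = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)
a.sendall(b"first ")
a.sendall(b"second")
a.close()
data = b""
while True:
    chunk = b.recv(1024)
    if not chunk:          # empty read signals end of stream
        break
    data += chunk
b.close()
print(data)                # b'first second' (no message boundaries)

# Datagram (packet-oriented) socket: each recv returns exactly one datagram,
# so the two messages keep their boundaries.
c, d = socket.socketpair(socket.AF_UNIX, socket.SOCK_DGRAM)
c.send(b"first")
c.send(b"second")
m1 = d.recv(1024)
m2 = d.recv(1024)
print(m1, m2)              # b'first' b'second'
c.close()
d.close()
```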
- the one or more web servers 1206 can run any of a variety of server or mid-tier applications, including Hypertext Transfer Protocol (HTTP) servers, FTP servers, Common Gateway Interface (CGI) servers, data servers, Java servers, Apache servers, and business application servers.
- the server(s) also may be capable of executing programs or scripts in response to requests from user devices, such as by executing one or more web applications that may be implemented as one or more scripts or programs written in any programming language, such as Java®, C, C# or C++, or any scripting language, such as Ruby, PHP, Perl, Python, or TCL, as well as combinations thereof.
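The request-driven execution of application code described above can be sketched with Python's standard http.server module; the handler name and the echo logic are ours, chosen only to make a minimal runnable illustration:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ScriptHandler(BaseHTTPRequestHandler):
    """Runs a small piece of application code per request, in the
    spirit of the CGI/servlet model described above."""

    def do_GET(self):
        # "Script" logic: echo the request path, upper-cased.
        body = self.path.strip("/").upper().encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), ScriptHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/hello") as resp:
    reply = resp.read()
server.shutdown()
print(reply)  # b'HELLO'
```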
- the server(s) may also include database servers, including those commercially available from Oracle®, Microsoft®, Sybase®, and IBM® as well as open-source servers such as MySQL, Postgres, SQLite, MongoDB, and any other server capable of storing, retrieving, and accessing structured or unstructured data.
- Database servers may include table-based servers, document-based servers, unstructured servers, relational servers, non-relational servers, or combinations of these and/or other database servers.
- the environment 1200 can include a variety of data stores and other memory and storage media as discussed above. These can reside in a variety of locations, such as on a storage medium local to (and/or resident in) one or more of the computers or remote from any or all of the computers across the network 1204 . In a particular set of embodiments, the information may reside in a storage-area network (SAN) familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the computers, servers or other network devices may be stored locally and/or remotely, as appropriate.
- each such device can include hardware elements that may be electrically coupled via a bus, the elements including, for example, a central processing unit (CPU or processor), an input device (e.g., a mouse, keyboard, controller, touch screen, or keypad), and an output device (e.g., a display device, printer, or speaker).
- Such a system may also include one or more storage devices, such as disk drives, optical storage devices, and solid-state storage devices such as random access memory (RAM) or read-only memory (ROM), as well as removable media devices, memory cards, flash cards, etc.
- Such devices can include a computer-readable storage media reader, a communications device (e.g., a modem, a network card (wireless or wired), an infrared communication device, etc.), and working memory as described above.
- the computer-readable storage media reader can be connected with, or configured to receive, a computer-readable storage medium, representing remote, local, fixed, and/or removable storage devices as well as storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information.
- the system and various devices also typically will include a number of software applications, modules, services, or other elements located within a working memory device, including an operating system and application programs, such as a client application or web browser.
- customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed.
- Storage media and computer readable media for containing code, or portions of code can include any appropriate media known or used in the art, including storage media and communication media, such as, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage and/or transmission of information such as computer readable instructions, data structures, program modules, or other data, including RAM, ROM, Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), digital versatile disk (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the system device.
- the conjunctive phrases "at least one of A, B, and C" and "at least one of A, B and C" refer to any of the following sets: {A}, {B}, {C}, {A, B}, {A, C}, {B, C}, {A, B, C}.
- such conjunctive language is not generally intended to imply that certain embodiments require at least one of A, at least one of B and at least one of C each to be present.
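The reading given above, every non-empty subset of {A, B, C}, can be enumerated mechanically; this short sketch (the helper name is ours, purely illustrative) produces exactly the seven sets listed:

```python
from itertools import combinations

def at_least_one_of(*items):
    """All non-empty subsets of items: the sets that satisfy
    'at least one of A, B, and C' under the reading above."""
    return [set(c)
            for r in range(1, len(items) + 1)
            for c in combinations(items, r)]

subsets = at_least_one_of("A", "B", "C")
print(subsets)
# [{'A'}, {'B'}, {'C'}, {'A', 'B'}, {'A', 'C'}, {'B', 'C'}, {'A', 'B', 'C'}]
```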
- the term “plurality” indicates a state of being plural (e.g., “a plurality of items” indicates multiple items). The number of items in a plurality is at least two, but can be more when so indicated either explicitly or by context.
- Processes described can be performed in any suitable order unless otherwise indicated or otherwise clearly contradicted by context.
- Processes described may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs or one or more applications) executing collectively on one or more processors, by hardware or combinations thereof.
- the code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising instructions executable by one or more processors.
- the computer-readable storage medium may be non-transitory.
- the code is stored on a set of one or more non-transitory computer-readable storage media having stored thereon executable instructions that, when executed (i.e., as a result of being executed) by one or more processors of a computer system, cause the computer system to perform operations described herein.
- the set of non-transitory computer-readable storage media may comprise multiple non-transitory computer-readable storage media and one or more of individual non-transitory storage media of the multiple non-transitory computer-readable storage media may lack all of the code while the multiple non-transitory computer-readable storage media collectively store all of the code.
- the executable instructions are executed such that different instructions are executed by different processors.
- a non-transitory computer-readable storage medium may store instructions.
- a main CPU may execute some of the instructions and a graphics processing unit (GPU) may execute others of the instructions.
- different components of a computer system may have separate processors and different processors may execute different subsets of the instructions.
- computer systems are configured to implement one or more services that singly or collectively perform operations of processes described herein.
- Such computer systems may, for instance, be configured with applicable hardware and/or software that enable the performance of the operations.
- computer systems that implement various embodiments of the present disclosure may, in some embodiments, be single devices and, in other embodiments, be distributed computer systems comprising multiple devices that operate differently such that the distributed computer system performs the operations described and such that a single device may not perform all operations.
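The division of labor described above, different processors collectively executing different subsets of the instructions, can be simulated in miniature with a thread pool; the work items `render` and `simulate` are hypothetical stand-ins (not from the disclosure) for work a graphics processor and a main CPU might each take:

```python
from concurrent.futures import ThreadPoolExecutor

def render(frame):
    """Stand-in for work a graphics processing unit might execute."""
    return f"rendered {frame}"

def simulate(step):
    """Stand-in for work a main CPU might execute."""
    return step * 2

# Each worker executes only the subset of instructions it is handed,
# yet the results are combined collectively, as in the description above.
with ThreadPoolExecutor(max_workers=2) as pool:
    gpu_like = pool.submit(render, 7)
    cpu_like = pool.submit(simulate, 21)
    results = (gpu_like.result(), cpu_like.result())

print(results)  # ('rendered 7', 42)
```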
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Optics & Photonics (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/565,337 US11850514B2 (en) | 2018-09-07 | 2019-09-09 | Physical games enhanced by augmented reality |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862728679P | 2018-09-07 | 2018-09-07 | |
US201862774035P | 2018-11-30 | 2018-11-30 | |
US16/565,337 US11850514B2 (en) | 2018-09-07 | 2019-09-09 | Physical games enhanced by augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200078680A1 US20200078680A1 (en) | 2020-03-12 |
US11850514B2 true US11850514B2 (en) | 2023-12-26 |
Family
ID=69720899
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/565,337 Active US11850514B2 (en) | 2018-09-07 | 2019-09-09 | Physical games enhanced by augmented reality |
Country Status (1)
Country | Link |
---|---|
US (1) | US11850514B2 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7061649B2 (en) * | 2020-08-27 | 2022-04-28 | 株式会社バンダイ | Game watching system, program, watching terminal and connection device |
US11273375B1 (en) | 2020-08-28 | 2022-03-15 | Justin Hanyan Wong | Methods and systems for rendering virtual three-dimensional field of play for AR-enhanced gameplay experience |
US12053247B1 (en) * | 2020-12-04 | 2024-08-06 | Onpoint Medical, Inc. | System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures |
WO2023010167A1 (en) * | 2021-08-04 | 2023-02-09 | Shellmont Pty. Ltd. | Placement guide for physical layout of objects and automated score tracking system and method |
CN113797554B (en) * | 2021-09-22 | 2023-12-26 | 北京有竹居网络技术有限公司 | Game engine resource processing method and device, storage medium and electronic equipment |
Citations (90)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6080063A (en) * | 1997-01-06 | 2000-06-27 | Khosla; Vinod | Simulated real time game play with live event |
US20030142587A1 (en) | 2002-01-25 | 2003-07-31 | Zeitzew Michael A. | System and method for navigation using two-way ultrasonic positioning |
US20060223635A1 (en) * | 2005-04-04 | 2006-10-05 | Outland Research | method and apparatus for an on-screen/off-screen first person gaming experience |
US20080170123A1 (en) | 2007-01-12 | 2008-07-17 | Jacob C Albertson | Tracking a range of body movement based on 3d captured image streams of a user |
US20080176583A1 (en) | 2005-10-28 | 2008-07-24 | Skyhook Wireless, Inc. | Method and system for selecting and providing a relevant subset of wi-fi location information to a mobile client device so the client device may estimate its position with efficient utilization of resources |
US20090005140A1 (en) * | 2007-06-26 | 2009-01-01 | Qualcomm Incorporated | Real world gaming framework |
US20100156660A1 (en) | 2008-12-23 | 2010-06-24 | Lee In Ock | Apparatus and method for estimating position of mobile unit |
US20110216060A1 (en) * | 2010-03-05 | 2011-09-08 | Sony Computer Entertainment America Llc | Maintaining Multiple Views on a Shared Stable Virtual Space |
US20110298827A1 (en) | 2010-06-02 | 2011-12-08 | Microsoft Corporation | Limiting avatar gesture display |
US20110301934A1 (en) | 2010-06-04 | 2011-12-08 | Microsoft Corporation | Machine based sign language interpreter |
US20120079990A1 (en) | 2009-06-17 | 2012-04-05 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Method and apparatus for conserving deep-sea organisms |
US20120083325A1 (en) * | 2010-09-30 | 2012-04-05 | Heatherly Christopher W | Systems and methods to provide augmented reality for a board game |
US20120243375A1 (en) | 2011-03-25 | 2012-09-27 | Teledyne Instruments, Inc. | Determining a position of a submersible vehicle within a body of water |
US20120281181A1 (en) | 2011-05-05 | 2012-11-08 | Sony Computer Entertainment Inc. | Interface using eye tracking contact lenses |
US8384542B1 (en) | 2010-04-16 | 2013-02-26 | Kontek Industries, Inc. | Autonomous and federated sensory subsystems and networks for security systems |
US20130077820A1 (en) | 2011-09-26 | 2013-03-28 | Microsoft Corporation | Machine learning gesture detection |
US20130261856A1 (en) | 2012-03-27 | 2013-10-03 | Ankit Sharma | Method and system for identifying a directional heading of a vehicle |
US20140032034A1 (en) | 2012-05-09 | 2014-01-30 | Singularity University | Transportation using network of unmanned aerial vehicles |
US20140153794A1 (en) | 2011-01-25 | 2014-06-05 | John Varaklis | Systems and methods for medical use of motion imaging and capture |
US20140253590A1 (en) | 2013-03-06 | 2014-09-11 | Bradford H. Needham | Methods and apparatus for using optical character recognition to provide augmented reality |
US20140267008A1 (en) | 2013-03-15 | 2014-09-18 | Lutron Electronics Co., Inc. | Gesture-based load control |
US20140310595A1 (en) | 2012-12-20 | 2014-10-16 | Sri International | Augmented reality virtual personal assistant for external representation |
US20150177842A1 (en) | 2013-12-23 | 2015-06-25 | Yuliya Rudenko | 3D Gesture Based User Authorization and Device Control Methods |
US20160078289A1 (en) | 2014-09-16 | 2016-03-17 | Foundation for Research and Technology - Hellas (FORTH) (acting through its Institute of Computer | Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction |
US20160086349A1 (en) | 2014-09-23 | 2016-03-24 | Microsoft Corporation | Tracking hand pose using forearm-hand model |
US9317916B1 (en) | 2013-04-12 | 2016-04-19 | Aic Innovations Group, Inc. | Apparatus and method for recognition of medication administration indicator |
US20160180468A1 (en) | 2014-12-23 | 2016-06-23 | The Travelers Indemnity Company | Systems, methods, and apparatus for object classification based on localized information |
US20160243434A1 (en) * | 2014-09-05 | 2016-08-25 | Trigger Global Inc. | Augmented reality game piece |
US20160328604A1 (en) | 2014-01-07 | 2016-11-10 | Arb Labs Inc. | Systems and methods of monitoring activities at a gaming venue |
US20170148339A1 (en) * | 2014-08-08 | 2017-05-25 | Greg Van Curen | Virtual reality system enabling compatibility of sense of immersion in virtual space and movement in real space, and battle training system using same |
US20170144756A1 (en) | 2015-11-25 | 2017-05-25 | Mohammad Rastgaar Aagaah | Drone having drone-catching feature |
JP2017093425A (en) | 2015-11-12 | 2017-06-01 | 太平洋セメント株式会社 | Member for fish reef or algal reef |
US20170168586A1 (en) | 2015-12-15 | 2017-06-15 | Purdue Research Foundation | Method and System for Hand Pose Detection |
US20170193708A1 (en) * | 2010-11-15 | 2017-07-06 | Bally Gaming, Inc. | System and method for augmented reality with complex augmented reality video image tags |
US20170190051A1 (en) | 2016-01-06 | 2017-07-06 | Disney Enterprises, Inc. | Trained human-intention classifier for safe and efficient robot navigation |
US20170208493A1 (en) | 2016-01-19 | 2017-07-20 | Qsense Inc. | Management of a distributed sensor system |
US20170212210A1 (en) | 2014-07-17 | 2017-07-27 | Origin Wireless, Inc. | Wireless positioning systems |
WO2017132563A1 (en) | 2016-01-29 | 2017-08-03 | Baylor Research Institute | Joint disorder diagnosis with 3d motion capture |
US20170227638A1 (en) | 2016-01-04 | 2017-08-10 | Raytheon Bbn Technologies Corp. | Bobber Field Acoustic Detection System |
US20170234966A1 (en) | 2016-02-17 | 2017-08-17 | Qualcomm Incorporated | Device for uav detection and identification |
US20170280678A1 (en) | 2016-03-31 | 2017-10-05 | Wal-Mart Stores, Inc. | Apparatus and method for providing aerial animal food delivery |
US9782668B1 (en) * | 2012-07-31 | 2017-10-10 | Niantic, Inc. | Placement of virtual elements in a virtual world associated with a location-based parallel reality game |
US20170293742A1 (en) | 2016-04-07 | 2017-10-12 | Javad Sadeghi | Interactive mobile technology for guidance and monitoring of physical therapy exercises |
US20170293824A1 (en) | 2014-12-30 | 2017-10-12 | Baidu Online Network Technology ( Beijing) Co., Ltd. | Method and device for recognizing subject area of image |
US20170313421A1 (en) | 2016-04-29 | 2017-11-02 | United Parcel Service Of America, Inc. | Unmanned aerial vehicle including a removable parcel carrier |
US20170344859A1 (en) | 2016-05-26 | 2017-11-30 | Audun Bjornerud MO | Method and system for providing gesture recognition services to user applications |
US20170358144A1 (en) | 2016-06-13 | 2017-12-14 | Julia Schwarz | Altering properties of rendered objects via control points |
KR20170139093A (en) | 2015-08-19 | 2017-12-18 | 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 | A method for a network access device to access a wireless network access point, a network access device, an application server, and a non-volatile computer readable storage medium |
US20180018861A1 (en) | 2016-07-12 | 2018-01-18 | Tyco Fire & Security Gmbh | Holographic Technology Implemented Security Solution |
US20180020329A1 (en) | 2016-07-18 | 2018-01-18 | Rivada Research, Llc | Method and System for Internet of Things (iOT) Enhanced Location Based Services Trilateration |
US20180024641A1 (en) | 2016-07-20 | 2018-01-25 | Usens, Inc. | Method and system for 3d hand skeleton tracking |
US20180093186A1 (en) * | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Methods for Providing Interactive Content in a Virtual Reality Scene to Guide an HMD User to Safety Within a Real World Space |
CN107897068A (en) | 2017-10-31 | 2018-04-13 | 中国科学院南海海洋研究所 | A kind of method for improving artificial breeding giant clam children's shellfish enhancement releasing survival rate |
US20180122043A1 (en) | 2016-10-27 | 2018-05-03 | Semih Energin | Virtual object movement |
USD817195S1 (en) | 2016-10-03 | 2018-05-08 | Elemental Machines, Inc. | Sensor device for measuring environmental conditions |
US20180213713A1 (en) | 2015-07-23 | 2018-08-02 | Arthur J. Zito, Jr. | Responsive dispersion from compartment in aqueous solution |
US20180263170A1 (en) | 2015-10-12 | 2018-09-20 | Droneseed Co. | Aerial deployment planting methods and systems |
US20180310532A1 (en) | 2017-04-27 | 2018-11-01 | International Business Machines Corporation | Automated aquaculture pen location |
US20180330810A1 (en) | 2017-05-09 | 2018-11-15 | Concorde Health, Inc. | Physical therapy monitoring algorithms |
US10143925B2 (en) * | 2007-12-07 | 2018-12-04 | Sony Mobile Communications Inc. | Dynamic gaming environment |
US20190000350A1 (en) | 2017-06-28 | 2019-01-03 | Incyphae Inc. | Diagnosis tailoring of health and disease |
US10192126B2 (en) | 2016-06-01 | 2019-01-29 | Toyota Jidosha Kabushiki Kaisha | Behavior recognition apparatus, learning apparatus, and method |
US20190038222A1 (en) | 2018-05-23 | 2019-02-07 | Yuri Krimon | Mitigating effects of neuro-muscular ailments |
US20190061890A1 (en) | 2017-08-29 | 2019-02-28 | Gooch's Beach Drone Company | Submersible drone devices and systems |
US20190091582A1 (en) * | 2017-09-27 | 2019-03-28 | Activision Publishing, Inc. | Methods and Systems for Improved Content Generation in Multiplayer Gaming Environments |
US20190124893A1 (en) | 2017-10-31 | 2019-05-02 | Aviantronics, Llc | Aquatic animal identification and passage control device |
US10279264B1 (en) * | 2016-03-22 | 2019-05-07 | Electronic Arts Inc. | Adaptive gaming tutorial system |
US20190217198A1 (en) * | 2018-01-12 | 2019-07-18 | International Business Machines Corporation | Physical obstacle avoidance in a virtual reality environment |
US20190221035A1 (en) * | 2018-01-12 | 2019-07-18 | International Business Machines Corporation | Physical obstacle avoidance in a virtual reality environment |
US20190294881A1 (en) | 2018-03-22 | 2019-09-26 | Viisights Solutions Ltd. | Behavior recognition |
US20190325605A1 (en) | 2016-12-29 | 2019-10-24 | Zhejiang Dahua Technology Co., Ltd. | Systems and methods for detecting objects in images |
US20190383903A1 (en) | 2018-06-13 | 2019-12-19 | KaiKuTek Inc. | Gesture recognition system having machine-learning accelerator |
US20200005028A1 (en) | 2018-06-28 | 2020-01-02 | Atlassian Pty Ltd | Automatic machine recognition of sign language gestures |
US20200050342A1 (en) | 2018-08-07 | 2020-02-13 | Wen-Chieh Geoffrey Lee | Pervasive 3D Graphical User Interface |
US20200057425A1 (en) | 2018-08-20 | 2020-02-20 | Dell Products, L.P. | Systems and methods for prototyping a virtual model |
US20200055570A1 (en) | 2017-04-28 | 2020-02-20 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Navigation system for underwater vehicles |
US10579869B1 (en) | 2017-07-18 | 2020-03-03 | Snap Inc. | Virtual object machine learning |
US20200160535A1 (en) | 2018-11-15 | 2020-05-21 | Qualcomm Incorporated | Predicting subject body poses and subject movement intent using probabilistic generative models |
US10675536B2 (en) * | 2018-10-03 | 2020-06-09 | Song Chen | Gaming system that alters target images produced by an LED array |
US20200234231A1 (en) | 2019-01-23 | 2020-07-23 | Ashored Inc. | Methods and systems for underwater gear tracking |
US20200238177A1 (en) * | 2016-09-30 | 2020-07-30 | Sony Interactive Entertainment Inc. | Methods for providing interactive content in a virtual reality scene to guide an hmd user to safety within a real world space |
US20200284903A1 (en) | 2017-09-12 | 2020-09-10 | Subsea Finder Llc | Method for tracking underwater objects |
US20200289922A1 (en) * | 2019-03-15 | 2020-09-17 | Sony Interactive Entertainment LLC | Near real-time augmented reality video gaming system |
US10839203B1 (en) | 2016-12-27 | 2020-11-17 | Amazon Technologies, Inc. | Recognizing and tracking poses using digital imagery captured from multiple fields of view |
US20200394393A1 (en) | 2017-09-11 | 2020-12-17 | Conti Temic Microelectronic Gmbh | Gesture Control for Communication with an Autonomous Vehicle on the Basis of a Simple 2D Camera |
US10989815B2 (en) | 2015-09-08 | 2021-04-27 | Underwater Communications & Navigation Laboratory (Limited Liability Company) | Method for positioning underwater objects and system for the implementation thereof |
US11036303B2 (en) | 2019-03-29 | 2021-06-15 | Tata Consultancy Services Llc | Systems and methods for three-dimensional (3D) reconstruction of human gestures from radar based measurements |
US11132606B2 (en) | 2019-03-15 | 2021-09-28 | Sony Interactive Entertainment Inc. | Reinforcement learning to train a character using disparate target animation data |
US11249179B2 (en) | 2019-08-01 | 2022-02-15 | Socionext Inc. | Motion detection system and motion detection device |
US11337358B2 (en) | 2014-09-23 | 2022-05-24 | Dendra Systems Ltd. | Techniques for automated planting |
- 2019-09-09: US application US16/565,337 filed; granted as US11850514B2; status Active.
US20180330810A1 (en) | 2017-05-09 | 2018-11-15 | Concorde Health, Inc. | Physical therapy monitoring algorithms |
US20190000350A1 (en) | 2017-06-28 | 2019-01-03 | Incyphae Inc. | Diagnosis tailoring of health and disease |
US10579869B1 (en) | 2017-07-18 | 2020-03-03 | Snap Inc. | Virtual object machine learning |
US20190061890A1 (en) | 2017-08-29 | 2019-02-28 | Gooch's Beach Drone Company | Submersible drone devices and systems |
US20200394393A1 (en) | 2017-09-11 | 2020-12-17 | Conti Temic Microelectronic Gmbh | Gesture Control for Communication with an Autonomous Vehicle on the Basis of a Simple 2D Camera |
US20200284903A1 (en) | 2017-09-12 | 2020-09-10 | Subsea Finder Llc | Method for tracking underwater objects |
US20190091582A1 (en) * | 2017-09-27 | 2019-03-28 | Activision Publishing, Inc. | Methods and Systems for Improved Content Generation in Multiplayer Gaming Environments |
CN107897068A (en) | 2017-10-31 | 2018-04-13 | 中国科学院南海海洋研究所 | A kind of method for improving artificial breeding giant clam children's shellfish enhancement releasing survival rate |
US20190124893A1 (en) | 2017-10-31 | 2019-05-02 | Aviantronics, Llc | Aquatic animal identification and passage control device |
US20190221035A1 (en) * | 2018-01-12 | 2019-07-18 | International Business Machines Corporation | Physical obstacle avoidance in a virtual reality environment |
US20190217198A1 (en) * | 2018-01-12 | 2019-07-18 | International Business Machines Corporation | Physical obstacle avoidance in a virtual reality environment |
US20190294881A1 (en) | 2018-03-22 | 2019-09-26 | Viisights Solutions Ltd. | Behavior recognition |
US20190038222A1 (en) | 2018-05-23 | 2019-02-07 | Yuri Krimon | Mitigating effects of neuro-muscular ailments |
US20190383903A1 (en) | 2018-06-13 | 2019-12-19 | KaiKuTek Inc. | Gesture recognition system having machine-learning accelerator |
US20200005028A1 (en) | 2018-06-28 | 2020-01-02 | Atlassian Pty Ltd | Automatic machine recognition of sign language gestures |
US20200050342A1 (en) | 2018-08-07 | 2020-02-13 | Wen-Chieh Geoffrey Lee | Pervasive 3D Graphical User Interface |
US20200057425A1 (en) | 2018-08-20 | 2020-02-20 | Dell Products, L.P. | Systems and methods for prototyping a virtual model |
US10675536B2 (en) * | 2018-10-03 | 2020-06-09 | Song Chen | Gaming system that alters target images produced by an LED array |
US20200160535A1 (en) | 2018-11-15 | 2020-05-21 | Qualcomm Incorporated | Predicting subject body poses and subject movement intent using probabilistic generative models |
US20200234231A1 (en) | 2019-01-23 | 2020-07-23 | Ashored Inc. | Methods and systems for underwater gear tracking |
US20200289922A1 (en) * | 2019-03-15 | 2020-09-17 | Sony Interactive Entertainment LLC | Near real-time augmented reality video gaming system |
US11132606B2 (en) | 2019-03-15 | 2021-09-28 | Sony Interactive Entertainment Inc. | Reinforcement learning to train a character using disparate target animation data |
US11036303B2 (en) | 2019-03-29 | 2021-06-15 | Tata Consultancy Services Llc | Systems and methods for three-dimensional (3D) reconstruction of human gestures from radar based measurements |
US11249179B2 (en) | 2019-08-01 | 2022-02-15 | Socionext Inc. | Motion detection system and motion detection device |
Non-Patent Citations (7)
Title |
---|
Chamberland et al., "New Seeding Approach Reduces Costs and Time to Outplant Sexually Propagated Corals for Reef Restoration," Scientific Reports, www.nature.com/scientificreports, Dec. 22, 2017, 12 pages. |
Charles, "GPS Goes Mainstream," NPR, Dec. 26, 2007, 7 pages. |
International Invitation to Pay Additional Fees dated Jun. 2, 2020, in International Patent Application No. PCT/US2020/016882, filed Feb. 5, 2020, 21 pages. |
International Search Report and Written Opinion dated Jul. 23, 2020, Patent Application No. PCT/US2020/016882, 19 pages. |
Kramar, V., et al., "Particularities of Visualisation of Medical and Wellness Data Through a Digital Patient Avatar," 14th Conference of Open Innovation Association FRUCT, 2013, 12 pages. |
Langley et al., "Approaches to Machine Learning," Department of Computer Science, Carnegie Mellon University, Feb. 16, 1984, 28 pages. |
Ocean News, "Meet RangerBot, Robot Reef Protector," Sep. 4, 2018, 7 pages. |
Also Published As
Publication number | Publication date |
---|---|
US20200078680A1 (en) | 2020-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11850514B2 (en) | Physical games enhanced by augmented reality | |
Dovey et al. | Game cultures: Computer games as new media | |
Wardyga | The Video Games Textbook: History • Business • Technology | |
Bossom et al. | Video games: an introduction to the industry | |
Erlank | Property in virtual worlds | |
Rato et al. | A taxonomy of social roles for agents in games | |
Cameron | Narrative and gameplay design in the story-driven videogame: A case study on the last of us | |
US9162143B2 (en) | System and method for presenting a view of a virtual lobby environment to a user | |
Kim et al. | Gamification framework | |
Payne | Connected viewing, connected capital: Fostering gameplay across screens | |
Maley | Video games and esports: The growing world of gamers | |
Jenny et al. | Key Terms Definitions | |
Stark | Ludic literature: Ready Player One as didactic fiction for the Neoliberal Subject | |
Arlt et al. | The Computer as Game, Toy, and Player | |
Stolee | An Object-Focused Approach to Analog Game Adaptation | |
Kontour | War, masculinity, and gaming in the military entertainment complex: A case study of “Call of Duty 4: Modern Warfare” | |
Featherstone | Optimising gamification using constructive competition and videogames | |
MacDonald | The Case for Virtual Property | |
Blocher | Gaming | |
Luzardo et al. | Video Games: More than Just a Game: The Unknown Successes of Latin American and Caribbean Studios | |
US12277836B2 (en) | Attendee directed donee donation defined by bookmaker donor to incent attendance of a virtual reality gambling metaverse competition | |
Guillory | Video Games | |
Rivas | "You Must Defeat [the Tutorial] To Stand A Chance": Learning To Play Competitive Fighting Video Games | |
US20250108288A1 (en) | System and method for treasure hunting game | |
Mazzo | Gamifying Education for Millennials: It’s More Than Just a Video Game |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: VULCAN INC., WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SIMPKINSON, RICHARD EARL;ROSENBAUM, OMER;GERARD, RUSTY A.;AND OTHERS;SIGNING DATES FROM 20191015 TO 20191018;REEL/FRAME:050996/0752 |
|
AS | Assignment |
Owner name: VULCAN INC., WASHINGTON Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECT THIRD INVENTOR NAME PREVIOUSLY RECORDED AT REEL: 050996 FRAME: 0752. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:SIMPKINSON, RICHARD EARL;ROSENBAUM, OMER;GERARD, RUSTY ALLEN;AND OTHERS;SIGNING DATES FROM 20191202 TO 20191219;REEL/FRAME:051424/0261 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: VULCAN LLC, WASHINGTON Free format text: CHANGE OF NAME;ASSIGNOR:VULCAN INC.;REEL/FRAME:070014/0049 Effective date: 20211020 Owner name: VALE GROUP LLC, WASHINGTON Free format text: CHANGE OF NAME;ASSIGNOR:VULCAN LLC;REEL/FRAME:070015/0137 Effective date: 20240112 |