
HK1114042B - Network game system, network game system control method, game machine and game machine control method - Google Patents


Info

Publication number
HK1114042B
HK1114042B (application HK08109666.6A)
Authority
HK
Hong Kong
Prior art keywords
base position
base
game
player
game device
Prior art date
Application number
HK08109666.6A
Other languages
Chinese (zh)
Other versions
HK1114042A1 (en)
Inventor
芦田裕行
大里新太郎
和久田肇
岛田岳彦
福岛圭悟
惠阪裕之
冲野强
下村贤
大富牧子
矢田健一
西村秀峰
文野智之
Original Assignee
科乐美数码娱乐株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2005371119A external-priority patent/JP4861699B2/en
Application filed by 科乐美数码娱乐株式会社 filed Critical 科乐美数码娱乐株式会社
Publication of HK1114042A1 publication Critical patent/HK1114042A1/en
Publication of HK1114042B publication Critical patent/HK1114042B/en

Description

Network game system, control method for network game system, game device, and control method for game device
Technical Field
The present invention relates to a network game system, a method of controlling a network game system, a game device, a method of controlling a game device, and an information storage medium, and more particularly to sharing of a position and a posture of an object in a network game, and a user interface of a game device.
Background
A network game is known in which a plurality of game devices are connected via a communication network and share a virtual space. In this type of network game, a plurality of objects, each associated with one of the game devices, are arranged in the virtual space; each game device updates the position and orientation of its associated object in accordance with the player's game operations and transmits that information to the other game devices.
Such a network game has a problem in that sharing the position and orientation of every object among a plurality of game devices significantly increases the amount of traffic on the communication network.
In addition, when moving an object in a virtual space, it is known to indicate the moving direction with a direction key provided on a controller, or to move the object forward in the virtual space by stepping on a foot pedal. The former approach has the problem that operating the direction key is troublesome; in a shooting game, where the player must operate a gun-type controller, operating a direction key is particularly difficult. The latter approach has the problem that the direction in which the object can move in the virtual space is limited.
Disclosure of Invention
The present invention has been made in view of the above problems, and it is an object of the present invention to provide a network game system, a control method for the network game system, a game device, a control method for the game device, and an information storage medium, which can share the position and orientation of an object associated with a plurality of game devices among the game devices while suppressing an increase in communication traffic of a communication network.
Another object of the present invention is to provide a game device, a game device control method, and an information storage medium, which can easily move an object in an arbitrary direction in a virtual space.
In order to solve the above problems, a network game system according to the present invention includes a plurality of game devices connected to a communication network, each of which is associated with one of a plurality of objects, the game devices sharing a virtual space in which the plurality of objects are arranged and in which a plurality of base positions are set. Each game device includes: first base position selecting means for selecting one base position from the plurality of base positions; first base position transmitting means for transmitting the base position selected by the first base position selecting means to the other game devices; first base position receiving means for receiving the base positions transmitted from the other game devices by their first base position transmitting means; second base position selecting means for selecting one base position from the base positions received by the first base position receiving means; self-object position determining means for determining the position of the object associated with the game device based on the base position selected by the first base position selecting means; and self-object posture determining means for determining the posture of the object associated with the game device based on the base position selected by the second base position selecting means.
Further, a control method of a network game system according to the present invention is a control method of a network game system including a plurality of game devices connected to a communication network, each of which is associated with one of a plurality of objects, the game devices sharing a virtual space in which the plurality of objects are arranged and in which a plurality of base positions are set. The method includes, in each game device: a first base position selecting step of selecting one base position from the plurality of base positions; a first base position transmitting step of transmitting the base position selected in the first base position selecting step to the other game devices; a first base position receiving step of receiving the base positions transmitted from the other game devices in their first base position transmitting steps; a second base position selecting step of selecting one base position from the base positions received in the first base position receiving step; a self-object position determining step of determining the position of the object associated with the game device based on the base position selected in the first base position selecting step; and a self-object posture determining step of determining the posture of the object associated with the game device based on the base position selected in the second base position selecting step.
In the present invention, a plurality of base positions are set in the virtual space. Each game device selects one of these base positions (referred to here as the first base position) and transmits the selected base position to the other game devices. Each game device also selects one of the base positions selected by the other game devices (referred to here as the second base position). The position and posture of the object are determined from these base positions. That is, the position of the object associated with the game device is determined based on the first base position; the position may be the first base position itself, or a position shifted from the first base position by a predetermined offset amount. The posture of the object associated with the game device is determined based on the second base position; for example, the posture is determined so that the object faces the second base position, or faces a direction shifted from it by a predetermined amount.
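The scheme above can be sketched in a few lines. The following Python fragment is illustrative only: the base positions, the two-dimensional coordinates, and the function names are assumptions for exposition, not part of the patent. It derives an object's position from its device's selected first base position and its yaw posture from a second base position chosen among those received from other devices:

```python
import math

# Hypothetical shared table of base positions set in the virtual space.
BASE_POSITIONS = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]

def object_position(first_base_index, offset=(0.0, 0.0)):
    """Position = the selected (first) base position, optionally shifted
    by a small offset, as described in the disclosure."""
    bx, bz = BASE_POSITIONS[first_base_index]
    return (bx + offset[0], bz + offset[1])

def object_yaw(position, second_base_index):
    """Posture = yaw angle facing the base position selected from those
    received from other devices (the second base position)."""
    tx, tz = BASE_POSITIONS[second_base_index]
    return math.atan2(tz - position[1], tx - position[0])
```

Under this reading, only the base position indices (plus a small offset, if any) need to cross the network; each device reconstructs both position and posture locally.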
According to the present invention, since the degrees of freedom of the posture of each object are limited by the plurality of base positions set in the virtual space, the positions and postures of the objects associated with the plurality of game devices can be shared among those game devices while suppressing an increase in the communication traffic of the communication network. In addition, according to the present invention, the posture of each object is determined based on a base position that serves as the basis for determining the position of one of the other objects. Thus, each object's posture can be made to face a position associated with another object.
Preferably, each game device further includes: second base position transmitting means for transmitting the base position selected by the second base position selecting means to the other game devices; other-object position determining means for determining the position of an object associated with another game device based on the base position received from that game device by the first base position receiving means; and other-object posture determining means for determining the posture of the object associated with another game device based on the base position transmitted from that game device by its second base position transmitting means. In this way, the positions and postures of the objects associated with the other game devices can be appropriately determined while suppressing the communication traffic of the communication network.
Preferably, each game device further includes: offset amount input means for inputting an offset amount of the object associated with the game device; and offset amount transmitting means for transmitting the offset amount input by the offset amount input means to the other game devices; the self-object position determining means determines the position of the object associated with the game device based on the base position selected by the first base position selecting means and the offset amount input by the offset amount input means. In this way, the object can be arranged at a position shifted from the base position. If the offset amount is a one-dimensional or two-dimensional quantity with an upper limit, the object can be placed at a position deviating from the base position with only a small amount of data.
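As a rough illustration of why a bounded offset keeps the payload small, the sketch below quantises a two-dimensional offset with an upper limit into two bytes. The limit, resolution, and packing format are arbitrary assumptions, not values from the patent:

```python
import struct

MAX_OFFSET = 2.0  # metres; assumed upper limit on each offset component

def encode_offset(dx, dz):
    """Clamp a 2-D offset to the allowed range and pack it into two
    bytes; sending (base index, packed offset) instead of full world
    coordinates keeps the per-update payload small."""
    def q(v):  # quantise [-MAX_OFFSET, MAX_OFFSET] to 0..255
        v = max(-MAX_OFFSET, min(MAX_OFFSET, v))
        return round((v + MAX_OFFSET) / (2 * MAX_OFFSET) * 255)
    return struct.pack("BB", q(dx), q(dz))

def decode_offset(payload):
    """Inverse of encode_offset, recovering the offset to within the
    quantisation step."""
    a, b = struct.unpack("BB", payload)
    def dq(n):
        return n / 255 * (2 * MAX_OFFSET) - MAX_OFFSET
    return dq(a), dq(b)
```

With this packing, an offset costs two bytes per update regardless of the size of the virtual space, at the price of roughly a centimetre of quantisation error.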
Preferably, the self-object posture determining means determines the posture of the object based on the base position selected by the second base position selecting means and the position of the object associated with the game device.
Preferably, the second base position selecting means newly selects the next base position from the base positions received by the first base position receiving means, according to the current position of the object associated with the game device. In this way, the base position that serves as the basis can be selected from the vicinity of the object. The base position may further be selected according to the current posture of the object.
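One plausible reading of "selecting according to the current position" is a nearest-neighbour rule over the received base positions. A minimal sketch, in which the tuple-based data layout is an assumption:

```python
def select_second_base(current_pos, received_bases):
    """Pick, from the base positions received from other devices, the
    one nearest the object's current position (squared distance avoids
    an unnecessary square root)."""
    return min(received_bases,
               key=lambda b: (b[0] - current_pos[0]) ** 2
                           + (b[1] - current_pos[1]) ** 2)
```

A weighting term for the object's current facing direction could be added to the key to realise selection "according to the current posture" as well.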
Preferably, each game device further includes direction input means by which the player inputs direction data. In this case, the first base position selecting means may select one base position from the plurality of base positions according to the direction data input by the direction input means. In this way, the player can select a base position located in a desired direction.
Preferably, the direction input means includes: player posture determining means for acquiring data indicating the posture of the player; and direction data calculating means for calculating direction data indicating a direction in the virtual space based on the data acquired by the player posture determining means. In this way, the player can input a direction in the virtual space by changing his or her posture.
In this case, the player posture determining means may acquire data indicating the position of a predetermined portion of the player as the data indicating the player's posture. In this way, the player can input a direction in the virtual space by changing the position of the predetermined portion (for example, the head).
In this case, the first base position selecting means may select one of the plurality of base positions according to the direction data calculated by the direction data calculating means when the player operates a predetermined operation member. In this way, while the player is not operating the operation member, the position of the predetermined portion of the player is not reflected in the direction data, which improves operability. The operation member may be disposed under the player's feet, so that the player can operate it with a foot.
Preferably, the player posture determining means includes: an ultrasonic transmitter that transmits ultrasonic waves toward the player; a plurality of ultrasonic receivers, located apart from one another, that receive the ultrasonic waves transmitted from the ultrasonic transmitter and reflected by the player; and time measuring means for measuring, for each ultrasonic receiver, the time from the transmission of the ultrasonic wave by the ultrasonic transmitter to its reception by that receiver; the data indicating the player's posture is acquired based on the times measured by the time measuring means. In this way, data indicating the player's posture can be acquired without touching the player.
Preferably, the game device further includes game image generating means for generating a game screen including a direction instruction image that displays the direction indicated by the direction data calculated by the direction data calculating means. In this way, the player can easily confirm which direction has been input.
In a preferred aspect, the player posture determining means calculates, as the data indicating the position of the predetermined portion of the player, data indicating the amount of deviation of the player's head position from a reference position, and the direction data calculating means calculates direction data indicating a direction corresponding to that deviation. In this way, the player can input directions by moving the head around the reference position, even in a confined space.
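The head-deviation input can be pictured as a small mapping from the measured offset to a direction vector. The sketch below adds a dead zone and saturation; both thresholds are invented for illustration and do not appear in the patent:

```python
import math

DEAD_ZONE = 0.05  # metres; ignore tiny head movements (assumed value)
MAX_LEAN = 0.30   # lean that maps to full deflection (assumed value)

def head_offset_to_direction(dx, dy):
    """Map the head's deviation (dx, dy) from the reference position to
    a direction vector of length at most 1, with a dead zone so that
    small involuntary movements produce no input."""
    r = math.hypot(dx, dy)
    if r < DEAD_ZONE:
        return (0.0, 0.0)  # no direction input
    scale = min(r, MAX_LEAN) / (MAX_LEAN * r)
    return (dx * scale, dy * scale)
```

The returned vector can drive both the direction instruction image on the game screen and the base position selection.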
Further, a game device according to the present invention is connected to a communication network and shares, with other game devices connected to the communication network, a virtual space in which an object is arranged, the game device including: position storing means for storing a plurality of base positions set in the virtual space; base position selecting means for selecting one base position from the plurality of base positions; base position receiving means for receiving the base positions selected by the other game devices; position determining means for determining the position of the object based on the base position selected by the base position selecting means; and posture determining means for determining the posture of the object based on a base position received by the base position receiving means.
Further, a method for controlling a game device according to the present invention is a method for controlling a game device connected to a communication network, the game device sharing, with other game devices connected to the communication network, a virtual space in which an object is arranged, the method including: a base position selecting step of selecting one base position from a plurality of base positions set in the virtual space; a base position receiving step of receiving the base positions selected by the other game devices; a position determining step of determining the position of the object based on the base position selected in the base position selecting step; and a posture determining step of determining the posture of the object based on a base position received in the base position receiving step.
The information storage medium of the present invention stores a program for causing a computer to function as: means for sharing, with other game devices via a communication network, a virtual space in which an object is arranged; position storing means for storing a plurality of base positions set in the virtual space; base position selecting means for selecting one base position from the plurality of base positions; base position receiving means for receiving the base positions selected by the other game devices; position determining means for determining the position of the object based on the base position selected by the base position selecting means; and posture determining means for determining the posture of the object based on a base position received by the base position receiving means. Examples of the computer include an arcade game machine, a home game machine, a portable game machine, a personal computer, various server computers, a portable information terminal, and a mobile phone. The program may be stored in a computer-readable information storage medium such as a CD-ROM or DVD-ROM.
According to the present invention, since the degrees of freedom of the posture of each object are limited by the plurality of base positions set in the virtual space, the positions and postures of the objects associated with the plurality of game devices can be shared among those game devices while suppressing an increase in the communication traffic of the communication network.
Further, a game device according to the present invention is a game device in which a player moves an object arranged in a virtual space, the game device including: player posture determining means for acquiring data indicating the posture of the player; direction data calculating means for calculating direction data indicating a direction in the virtual space based on the data acquired by the player posture determining means; and game image generating means for generating a game screen showing the object moving in the virtual space in accordance with the direction data calculated by the direction data calculating means.
Further, a method for controlling a game device according to the present invention is a method for controlling a game device in which a player moves an object arranged in a virtual space, the method including: a player posture determining step of acquiring data indicating the posture of the player; a direction data calculating step of calculating direction data indicating a direction in the virtual space based on the data acquired in the player posture determining step; and a game image generating step of generating a game screen showing the object moving in the virtual space in accordance with the direction data calculated in the direction data calculating step.
In addition, the information storage medium of the present invention stores a program for causing a computer to function as: player posture determining means for acquiring data indicating the posture of the player; direction data calculating means for calculating direction data indicating a direction in the virtual space based on the data acquired by the player posture determining means; and game image generating means for generating a game screen showing the object moving in the virtual space in accordance with the direction data calculated by the direction data calculating means. Examples of the computer include an arcade game machine, a home game machine, a portable game machine, a personal computer, various server computers, a portable information terminal, and a mobile phone. The program may be stored in a computer-readable information storage medium such as a CD-ROM or DVD-ROM.
According to the present invention, the player can input a direction in the virtual space by changing his or her posture and move the object in that direction, which makes it easy to move the object in an arbitrary direction in the virtual space.
In this case, the player posture determining means may acquire data indicating the position of a predetermined portion of the player as the data indicating the player's posture. In this way, the player can input a direction in the virtual space by changing the position of the predetermined portion (for example, the head).
Further, the game image generating means may generate the game screen showing the object moving in the virtual space in accordance with the direction data calculated by the direction data calculating means when the player operates a predetermined operation member. In this way, while the player is not operating the operation member, the position of the predetermined portion of the player is not reflected in the direction data, which improves operability. The operation member may be disposed under the player's feet, so that the player can operate it with a foot.
Preferably, a plurality of base positions are set in the virtual space, and the game image generating means selects one of the plurality of base positions according to the direction data calculated by the direction data calculating means and generates a game screen showing the object moving toward the selected base position in the virtual space.
Preferably, the player posture determining means includes: an ultrasonic transmitter that transmits ultrasonic waves toward the player; a plurality of ultrasonic receivers, located apart from one another, that receive the ultrasonic waves transmitted from the ultrasonic transmitter and reflected by the player; and time measuring means for measuring, for each ultrasonic receiver, the time from the transmission of the ultrasonic wave by the ultrasonic transmitter to its reception by that receiver; the data indicating the player's posture is acquired based on the times measured by the time measuring means. In this way, data indicating the player's posture can be acquired without touching the player.
Preferably, the game device further includes game image generating means for generating a game screen including a direction instruction image that displays the direction indicated by the direction data calculated by the direction data calculating means. In this way, the player can easily confirm which direction has been input.
In a preferred aspect, the player posture determining means calculates, as the data indicating the position of the predetermined portion of the player, data indicating the amount of deviation of the player's head position from a reference position, and the direction data calculating means calculates direction data indicating a direction corresponding to that deviation. In this way, the player can input directions by moving the head around the reference position, even in a confined space.
Drawings
Fig. 1 is an external perspective view of a game device according to an embodiment of the present invention.
Fig. 2 is an overall configuration diagram of a network shooting game system according to an embodiment of the present invention.
Fig. 3 is a diagram showing an example of a game screen.
Fig. 4 is a diagram showing a hardware configuration of the game device.
Fig. 5 is a functional block diagram of a game device.
Fig. 6 is a diagram showing a plurality of base positions set in a virtual three-dimensional space.
Fig. 7 is a diagram showing a method of determining the position and orientation of a player character object.
Fig. 8 is a diagram illustrating a method of determining the posture of the player.
Fig. 9 is a diagram showing a configuration of position data of a player character object.
Fig. 10 is a diagram showing a direction of change of the base position (arrangement reference position).
Fig. 11 is a diagram showing a configuration of trajectory data of a bullet object.
Fig. 12 is a diagram illustrating a method of correcting the trajectory data.
Fig. 13 is a diagram showing an example of a game screen including a movement direction instruction image.
Fig. 14 is a diagram showing a relationship between direction data relating to a moving direction of a player character and a posture of the player.
Fig. 15 is a diagram showing another procedure of changing the base position (arrangement reference position).
Fig. 16 is an external perspective view of a game device according to a modification.
Fig. 17 is a diagram showing a procedure of changing the base position (arrangement reference position) in the modification.
Fig. 18 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 19 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 20 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 21 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 22 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 23 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 24 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 25 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 26 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 27 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Fig. 28 is a diagram illustrating a method of displaying a pigment trace corresponding to a static object.
Detailed Description
An embodiment of the present invention will be described in detail below with reference to the drawings.
Fig. 1 is a perspective view showing the external appearance of a game device according to an embodiment of the present invention. The game device 10 shown in the figure is an arcade machine installed in game halls and the like. A base 18 extending forward is attached to the lower surface of the cabinet 12, and a game table 20 thinner than the base 18 is attached to the front of the base 18. Foot shapes 52 are drawn in the center of the game table 20; the player stands on the foot shapes 52 facing the cabinet 12.
The foremost part of the base 18, that is, the position where the game table 20 is attached, is formed into a slope, on which a foot controller 50 is provided. The foot controller 50 has a built-in pressure-sensitive sensor; when a player standing on the game table 20 slides one foot forward and steps on the foot controller 50, this is reported to the interior of the device.
The cabinet 12 is formed to be taller than a typical adult, and a substantially rectangular frame 14 is attached to its upper portion. The frame 14 is slightly inclined so that its front side is higher than its rear side. The rear side of the frame 14 is fixed to the top of the cabinet 12, and its left and right sides are fixed to the tops of a pair of support rods 16, which are in turn fixed to the left and right side surfaces of the cabinet 12. On the front side of the frame 14, an ultrasonic transmitter 17 and ultrasonic receivers 13 and 15 are provided: facing the cabinet 12, the ultrasonic receiver 15 is at the upper left, the ultrasonic receiver 13 is at the upper right, and the ultrasonic transmitter 17 is at the upper center. The ultrasonic transmitter 17 lies on the same straight line as the ultrasonic receivers 13 and 15, mounted midway between them. In the game device 10, an ultrasonic wave is emitted downward from the ultrasonic transmitter 17, and the time until the reflected wave is detected by each of the ultrasonic receivers 13 and 15 is measured. In this way, two distances are obtained: the distance from the ultrasonic transmitter 17 to the player's head plus the distance from the player's head to the ultrasonic receiver 13, and the distance from the ultrasonic transmitter 17 to the player's head plus the distance from the player's head to the ultrasonic receiver 15. The posture of the player standing on the game table 20 is then detected from these two distances.
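Geometrically, each measured time constrains the head to an ellipse whose foci are the transmitter and one receiver, so the head position is the intersection of the two ellipses. The following numerical sketch shows one way this could work; the coordinate frame, the receiver spacing, and the solver are assumptions for illustration, not the device's actual method:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

# Assumed geometry (metres), origin at the ultrasonic transmitter 17:
# receiver 13 at +A on the x axis, receiver 15 at -A, player below (z > 0).
A = 0.5  # half the receiver spacing; illustrative value

def echo_times(head):
    """Round-trip times transmitter -> head -> each receiver."""
    x, z = head
    d_t = math.hypot(x, z)  # transmitter to head
    t13 = (d_t + math.hypot(x - A, z)) / SPEED_OF_SOUND
    t15 = (d_t + math.hypot(x + A, z)) / SPEED_OF_SOUND
    return t13, t15

def locate_head(t13, t15):
    """Recover (x, z) from the two echo times by coarse-to-fine grid
    search: each time fixes a transmitter-receiver ellipse, and we find
    the intersection of the two ellipses numerically."""
    l13, l15 = t13 * SPEED_OF_SOUND, t15 * SPEED_OF_SOUND
    def residual(x, z):
        d_t = math.hypot(x, z)
        r13 = d_t + math.hypot(x - A, z) - l13
        r15 = d_t + math.hypot(x + A, z) - l15
        return r13 * r13 + r15 * r15
    best, step = (0.0, 1.0), 0.05
    for _ in range(4):  # shrink the search window each pass
        bx, bz = best
        cands = [(bx + i * step, bz + j * step)
                 for i in range(-10, 11) for j in range(-10, 11)]
        best = min(((x, z) for x, z in cands if z > 0),
                   key=lambda p: residual(*p))
        step /= 10
    return best
```

With both receivers on the x axis and the player below, the two ellipses meet in exactly one point on the player's side, so this simple search suffices; swaying the head left or right changes the sign of `x`.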
A monitor 24 for displaying the game screen is provided on the upper part of the cabinet 12, facing the player, and an advertisement board 22 is mounted above the monitor. A projection 26 projects forward from the lower side of the monitor 24, and a speaker 28 for outputting game sound effects and game music is attached to its front surface. Further, the upper end of a front panel 38, a curved elongated plate member narrower than the cabinet, is attached below the monitor 24. The lower end of the front panel 38 is mounted on the base 18 so that the panel stands on the base 18; it rises substantially vertically from the base 18, bends toward the cabinet 12, and its upper end is attached below the monitor 24 as described above.
The front of the front panel 38 is provided with selection buttons 34 and 36 and a decision button 32, which the player presses to perform various selection operations. A gun holder for hanging the gun controller 30 when it is not in use is formed below the selection buttons 34 and 36 and the decision button 32.
One end each of a signal cable 48 and a retaining cable 42 is attached to the stock of the gun controller 30. A pointing direction switching button 30a for switching the display contents of the monitor 24 is provided on the side of the body of the gun controller 30. The other end of the signal cable 48 is led into the cabinet 12; through it, the detection result of the optical sensor provided inside the body of the gun controller 30 (used for detecting the direction in which the gun is pointed), a trigger signal indicating that the trigger has been pulled, and a viewing direction switching signal indicating that the pointing direction switching button 30a has been pressed are reported to the interior of the cabinet 12. The other end of the retaining cable 42 is securely attached to the lower portion of the cabinet 12 so that the gun controller 30 cannot easily be removed.
The lower part of the cabinet 12 is provided with a coin insertion unit 40, a coin return unit 44, and a collection door 46 for collecting coins that were inserted through the coin insertion unit 40 and stored in a coin box (not shown).
In the game device 10 configured as described above, the player stands on the game table 20 with his or her feet aligned with the foot shapes 52, holds the gun controller 30, aims it at an enemy displayed on the monitor 24, and pulls the trigger. A bullet object is then fired in the virtual three-dimensional space and flies toward the enemy. Bullet objects are also shot from the enemy toward the player, who can dodge them by swinging the head left and right or by bending the body to lower the head.
As shown in fig. 2, the game device 10 is connected to a communication network and constitutes a network shooting game system together with other game devices 10. In the network shooting game system 60, as shown in the figure, a plurality of game devices 10 (here, n game devices 10-1 to 10-n) are connected to a communication network 62 such as the Internet, to which a lobby server 64 is also connected. Each game device 10 has a server function 10b in addition to a client function 10a. The lobby server 64 determines, from among the currently connected game devices 10, the plurality of game devices 10 that will participate in a network shooting game; for example, it acquires the game proficiency of the players of the respective game devices 10 and selects a plurality of game devices 10 operated by players of similar proficiency to participate in the same game. The lobby server 64 then selects one of the game devices 10 thus determined as the game server. The client functions 10a of the participating game devices 10 exchange data indicating the current state of the virtual three-dimensional space through the server function 10b of that game device 10, thereby sharing the virtual three-dimensional space that is the stage of the shooting game.
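The lobby server's proficiency-based matching can be pictured as a simple sort-and-chunk over the connected devices. This is a hypothetical sketch; the data shape and the grouping rule are assumptions, not the server's actual algorithm:

```python
def match_by_proficiency(devices, party_size):
    """Sort connected devices by player proficiency and cut the sorted
    list into parties of similar skill; devices left over at the end
    wait in the lobby for the next match."""
    ranked = sorted(devices, key=lambda d: d["proficiency"])
    return [ranked[i:i + party_size]
            for i in range(0, len(ranked) - party_size + 1, party_size)]
```

Each resulting party would then elect one member's server function 10b as the game server, as described above.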
Fig. 3 shows an example of the game screen displayed on the monitor 24 of each game device 10. In the network shooting game system 60, each game device 10 is associated with a player character object, and the player character objects associated with the game devices 10 participating in the game are all arranged in the virtual three-dimensional space. In each game device 10, the appearance of the virtual three-dimensional space viewed from a viewpoint set at the eye position of the player character object associated with that game device 10 is displayed on the monitor 24 as the game screen. Further, on the game screen, a self status image 70 showing the player's own status and other-player status images 66 showing the statuses of the other participants are displayed, as well as an elapsed time image 68 indicating the time elapsed from the start of the game.
As shown in the figure, the virtual three-dimensional space contains, in addition to static objects (objects whose positions and postures do not change with the passage of time) such as the automobile object 72, the building object 78, and the floor object 76, dynamic objects (objects whose positions and postures change with the passage of time in accordance with a program or game operation) such as the player character object (viewpoint setting object) 74 and the bullet objects (moving objects) 80 and 82. The game screen shown in the figure shows the virtual three-dimensional space viewed from a viewpoint set at the eye position of a certain player character object, and a player character object 74 associated with another game device 10 is displayed at a substantially central position. In front of it is displayed a bullet object 80 representing a paint ball (a resin ball with paint sealed inside) shot from the toy gun held by the player character object 74. Further, around the player character object 74 is displayed a bullet object 82 shot by the player character object (not shown) associated with the game device 10 displaying this game screen.
In the present embodiment, the bullet objects 80 and 82 simulate paint balls, and when a bullet object 80 or 82 hits another object, such as the automobile object 72, the building object 78, the floor object 76, or the player character object 74, the bullet object breaks, showing the paint inside adhering to that object. Here, pigment trace objects 84 are arranged on the floor object 76 and the automobile object 72, indicating that bullet objects have hit these objects. In particular, in the present embodiment, the direction in which the bullet object hits the object (the contact direction) is calculated, and a texture image associated with the angle formed between that direction and the direction of the contact surface (horizontal direction or normal direction) is used to display the pigment trace object 84. Specifically, as the angle formed by the contact direction and the horizontal direction of the contact surface approaches 90 degrees, a texture image of a pigment trace that approaches a circular shape is used for displaying the pigment trace object 84; as the angle approaches 0 degrees, a texture image of a pigment trace extending in the lateral direction is used. At this time, the extending direction of the pigment trace, that is, the arrangement direction of the pigment trace object 84, coincides with the contact direction of the bullet object. By viewing the pigment trace objects 84 displayed on the game screen, the player can thus immediately grasp from which direction the bullet objects flying in the virtual three-dimensional space were fired.
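The mapping from impact angle to trace texture described above can be sketched as a simple band lookup. This is a minimal illustration and not the patented implementation; the number of texture bands (`n_textures`) and the equal-width banding are assumptions made for the sketch.

```python
def choose_trace_texture(angle_deg: float, n_textures: int = 4) -> int:
    """Map the angle between the bullet's contact direction and the
    contact surface (0..90 degrees) to a texture index.
    Index 0 = most laterally elongated trace (grazing hit, angle near 0);
    index n_textures-1 = near-circular trace (perpendicular hit, angle near 90)."""
    angle_deg = max(0.0, min(90.0, angle_deg))   # clamp to the valid range
    # Divide 0..90 degrees into n_textures equal bands.
    return min(n_textures - 1, int(angle_deg / (90.0 / n_textures)))
```

With four textures, a 90-degree (perpendicular) hit selects the circular texture and a grazing hit selects the most elongated one.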
The internal processing of each game device 10 will be described in detail below.
Fig. 4 is a hardware configuration diagram of the game device 10. As shown in the drawing, the game device 10 is a computer game system configured around a control unit 98 that includes a CPU, a memory, and the like. The gun controller 30, the ultrasonic transmitter 17, the ultrasonic receivers 13 and 15, the foot controller 50, the storage unit 90, the disk reading device 94, the communication unit 92, the sound processing unit 102, and the display control unit 100 are connected to the control unit 98.
The gun controller 30 is a gun-type game controller, and inputs to the control unit 98 the time at which the trigger is pulled by the player, the pointing direction of the gun controller 30 at that time (specifically, which position on the monitor 24 the body of the gun controller 30 is directed toward), and a pointing direction switching signal indicating that the pointing direction switching button 30a has been pressed. The ultrasonic transmitter 17 emits ultrasonic waves in accordance with instructions from the control unit 98. The control unit 98 starts time measurement from the moment it instructs the ultrasonic transmitter 17 to transmit an ultrasonic wave. The ultrasonic receivers 13 and 15 receive the ultrasonic waves emitted from the ultrasonic transmitter 17 and send the received waveforms to the control unit 98. The control unit 98 then determines, from the received waveforms sent from the ultrasonic receivers 13 and 15, the timing at which the reflected wave reflected by the player's head entered the ultrasonic receivers 13 and 15. The foot controller 50 notifies the control unit 98 with a message indicating that the player has stepped on the foot controller 50.
The storage unit 90 is configured by various data storage means such as a hard disk storage device or a RAM, and stores programs for realizing the client function 10a and the server function 10 b.
The disk reading device 94 reads data from a disk 96 which is a computer-readable information storage medium such as a CD-ROM or a DVD-ROM, and supplies the data to a control unit 98. Here, various programs executed by the game device 10 are supplied from the disk 96 to the game device 10, and loaded in the storage unit 90.
The communication unit 92 is connected to the communication network 62, and receives data (position data and trajectory data, described later) indicating the status of the other game devices 10 participating in the network shooting game from the game device 10 functioning as the game server. It also transmits data (position data and trajectory data, described later) indicating the status of the own device to the game device 10 operating as the game server. Further, when the game device 10 itself operates as the game server, the communication unit 92 receives data from the client functions 10a of the other game devices 10 and transmits data to the client functions 10a of the other game devices 10.
The sound processing unit 102 is connected to the speakers 28, and outputs a game effect sound, game music, and other sounds in accordance with the control from the control unit 98. For example, when a bullet object is ejected, a bullet ejection sound is output. The display control unit 100 is connected to the monitor 24, and displays and outputs, for example, a game screen shown in fig. 3 in accordance with the control from the control unit 98.
Fig. 5 is a functional block diagram of the game device 10. The game device 10 is a computer game system having a known structure as described above, and executes a predetermined program to realize various functions. As shown in the drawing, the game device 10 functionally includes a communication control unit 200, another-character trajectory data storage unit 202, another-character position data storage unit 204, self-character trajectory data storage unit 206, self-character position data storage unit 208, left-right offset amount update unit 210, player posture determination unit 212, base position setting unit 216, self-character trajectory data generation unit 218, hit estimation unit 220, trajectory data correction unit 222, and game image generation unit 214. These functions are realized by the control unit 98 executing a program supplied from the disk 96 to the game device 10.
First, the self-character trajectory data generation unit 218 generates trajectory data based on input from the gun controller 30. That is, when data indicating the direction of the gun body is input from the gun controller 30, the position coordinates (absolute coordinates in the virtual three-dimensional space) of the player character object at that time are calculated based on the contents stored in the self-character position data storage unit 208. These position coordinates are stored as the shooting position of the bullet object, and the body direction (shooting direction) input from the gun controller 30, together with the shooting position, is stored as trajectory data in the self-character trajectory data storage unit 206. Fig. 11(b) shows the structure of the trajectory data stored in the self-character trajectory data storage unit 206. The trajectory data stored in the self-character trajectory data storage unit 206 is transmitted to the game device 10 that executes the server function 10b via the communication control unit 200. The game device 10 that executes the server function 10b distributes the received trajectory data to the other game devices 10 participating in the network shooting game. The data is then received by the communication control unit 200 of each game device 10 and stored in the other-character trajectory data storage unit 202. Fig. 11(a) shows the structure of the data stored in the other-character trajectory data storage unit 202.
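The trajectory data of fig. 11(b) is essentially a record pairing a shooting position with a shooting direction, serialized for transmission to the game server. The following is a minimal sketch of such a record and its round trip; the field names and the JSON encoding are hypothetical, as the embodiment does not specify a wire format.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TrajectoryData:
    """One bullet's trajectory record: shooting position plus shooting
    direction, as stored in the self-character trajectory data storage
    unit and sent to the game server.  Field names are illustrative."""
    shoot_x: float
    shoot_y: float
    shoot_z: float
    dir_x: float
    dir_y: float
    dir_z: float

def encode(t: TrajectoryData) -> str:
    """Serialize one trajectory record for transmission."""
    return json.dumps(asdict(t))

def decode(s: str) -> TrajectoryData:
    """Rebuild a trajectory record received from the game server."""
    return TrajectoryData(**json.loads(s))
```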
When trajectory data newly generated by another game device 10 is stored in the other-character trajectory data storage unit 202, the hit estimation unit 220 predicts whether or not the bullet object will hit the player character object, based on that trajectory data and on the position coordinates (absolute coordinates) of the player character object associated with the game device 10, which are calculated from the contents of the self-character position data storage unit 208. That is, it is determined whether or not the predicted trajectory of the bullet object indicated by the trajectory data enters a hit check area (not shown) set for the player character object. The trajectory of the bullet object may be linear or parabolic; various other trajectories may also be employed.
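For a linear trajectory, the hit estimation amounts to testing whether the ray from the shooting position intersects the hit check area. The following is a minimal sketch under the assumption of a spherical hit check area; the embodiment does not specify the area's shape.

```python
import math

def trajectory_hits_check_area(shoot_pos, shoot_dir, center, radius):
    """Return True if the straight-line trajectory starting at shoot_pos
    in direction shoot_dir passes through a spherical hit-check area."""
    to_center = [c - p for c, p in zip(center, shoot_pos)]
    d_len = math.sqrt(sum(x * x for x in shoot_dir))
    d = [x / d_len for x in shoot_dir]              # normalised direction
    t = sum(a * b for a, b in zip(to_center, d))    # closest-approach parameter
    if t < 0:                                       # sphere is behind the muzzle
        return False
    closest = [p + t * x for p, x in zip(shoot_pos, d)]
    dist2 = sum((c - q) ** 2 for c, q in zip(center, closest))
    return dist2 <= radius * radius
```

A parabolic trajectory would instead be stepped through time and each sample point tested against the area.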
When the bullet object is predicted to enter the hit check area, that is, when it is estimated that the bullet object 306 will hit a predetermined position of the player character object PC (see fig. 12(a)), the trajectory data correction unit 222 corrects the trajectory data stored in the other-character trajectory data storage unit 202 based on the coordinate position of the viewpoint VP set at the eye position of the player character object PC associated with the game device 10 (see fig. 12(b)). Specifically, the trajectory data correction unit 222 obtains the corrected ejection direction 304a by shifting the ejection direction 304 in the trajectory data toward the position of the viewpoint VP set at the eye position of the player character object PC. That is, the ejection direction 304 (vector data) constituting the trajectory data is corrected so that the angle formed by the vector connecting the ejection position of the bullet object 306 with the viewpoint VP and the vector of the ejection direction 304 of the bullet object 306 becomes smaller. The corrected data is then stored in the other-character trajectory data storage unit 202. As a result, the bullet object 306 moves within the visual field 302 of the player character object PC, so that when the appearance of the virtual three-dimensional space is projected onto the screen 300 to generate a game screen, the bullet object 306 appears on the game screen.
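The correction of the ejection direction can be sketched as blending the original direction toward the unit vector pointing from the ejection position to the viewpoint VP, which necessarily reduces the angle between the two. The blend factor is a hypothetical parameter for illustration; the embodiment does not specify how far the direction is shifted.

```python
import math

def correct_ejection_direction(shoot_pos, direction, viewpoint, blend=0.3):
    """Shift the ejection direction toward the viewpoint so that the bullet
    passes through the player's field of view.  blend=0 leaves the direction
    unchanged; blend=1 aims the bullet exactly at the viewpoint."""
    to_vp = [v - p for v, p in zip(viewpoint, shoot_pos)]
    n = math.sqrt(sum(x * x for x in to_vp))
    to_vp = [x / n for x in to_vp]                  # unit vector toward VP
    dn = math.sqrt(sum(x * x for x in direction))
    d = [x / dn for x in direction]                 # unit ejection direction
    mixed = [(1 - blend) * a + blend * b for a, b in zip(d, to_vp)]
    m = math.sqrt(sum(x * x for x in mixed))
    return [x / m for x in mixed]                   # renormalised result
```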
As shown in fig. 6, a plurality of base positions are provided in the virtual three-dimensional space serving as the stage of the game, and their position coordinates and IDs are stored in advance in the game device 10. Each character object then determines its actual placement with reference to one of these base positions (its arrangement reference position). The base position setting unit 216 selects the base position of the player character object associated with the game device 10 as the arrangement reference position. Specifically, at the start of the game, a preset base position is selected as the arrangement reference position. Then, when the player posture determination unit 212 determines that the player standing on the game table 20 has shifted his or her body largely to the left or right and that this state has continued for a predetermined time or more, the arrangement reference position is changed to a base position set in the direction of the virtual three-dimensional space corresponding to the shift direction. Further, when the foot controller 50 is stepped on, the arrangement reference position is changed to a base position located in front of the player character object.
Further, the base position setting unit 216 selects, from among the player character objects of the other game devices 10 arranged in the virtual three-dimensional space, the opponent toward which the player character object associated with the game device 10 directs its posture, and manages the base position of that opponent as the pointing position. Specifically, the other-character position data storage unit 204 stores the base positions (arrangement reference positions) selected by the other game devices 10, and one of them is selected as the pointing position. Each time the pointing direction switching button 30a of the gun controller 30 is pressed and a pointing direction switching signal is input, the base position selected as the pointing position is switched to another. In this case, switching is made preferentially to a base position that satisfies a condition such as being close to the current arrangement reference position, or being the arrangement reference position of a player character object that is attacking the player character object associated with the game device 10. In this way, the player character object can be directed toward an opponent that most needs to be attacked with a small number of button presses. The selection results of the base position setting unit 216 (the arrangement reference position and the pointing position of the player character object associated with the game device 10) are stored in the self-character position data storage unit 208.
Fig. 10 is a diagram illustrating a method of reselecting the arrangement reference position. As shown in the figure, when the base positions P1 to P6 are arranged in the virtual three-dimensional space, a trajectory 250 (direction data) is calculated that passes through the current position SP' of the player character object and is an arc centered on the position of the opposing player character object (opponent object) arranged at the pointing position. When the player's head is shifted leftward while facing the casing 12, the trajectory 250 extends counterclockwise in a plan view; when the shift is rightward, it extends clockwise in a plan view. The base position closest to the trajectory 250 (here, P3) is then selected as the new arrangement reference position.
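In plan view, the arc through the current position SP' centered on the opponent has a fixed radius, so the base position whose distance from the opponent is closest to that radius, lying on the side corresponding to the head shift, is the one closest to the trajectory 250. The following is a simplified 2D sketch; the side test via the cross product and the tie-breaking are assumptions.

```python
import math

def reselect_base(current, opponent, bases, shift_left):
    """Pick the base position closest to the arc that passes through the
    current position and is centred on the opponent, on the side toward
    which the player's head has shifted (left -> counterclockwise)."""
    radius = math.hypot(current[0] - opponent[0], current[1] - opponent[1])
    cx, cy = current[0] - opponent[0], current[1] - opponent[1]
    best, best_err = None, float("inf")
    for base in bases:
        bx, by = base[0] - opponent[0], base[1] - opponent[1]
        cross = cx * by - cy * bx      # > 0: counterclockwise of current
        if shift_left != (cross > 0):
            continue                   # wrong side of the current position
        err = abs(math.hypot(bx, by) - radius)  # distance to the arc's circle
        if err < best_err:
            best, best_err = base, err
    return best
```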
The player posture determination unit 212 determines the posture of a player standing on the game table 20 by using the ultrasonic transmitter 17 and the ultrasonic receivers 13 and 15. That is, as shown in fig. 8, the sum (l0+l1) of the distance l0 from the ultrasonic transmitter 17 to the head of the player M and the distance l1 from the head of the player M to the ultrasonic receiver 13, and the sum (l0+l2) of the distance l0 and the distance l2 from the head of the player M to the ultrasonic receiver 15, are obtained by measuring the respective times from when the ultrasonic wave is emitted from the ultrasonic transmitter 17 until it is reflected by the head of the player M and enters the ultrasonic receivers 13 and 15. Further, since the length L in the figure is known, data (x and y) specifying the head position of the player M is calculated from these pieces of information. When the absolute value of y (the amount of displacement of the player's head in the left-right direction from the position of the ultrasonic transmitter 17, which serves as the reference position) remains at or above a predetermined value for a predetermined time or more, the player posture determination unit 212 notifies the base position setting unit 216 of the direction of displacement of the player's head. Upon receiving the notification, the base position setting unit 216 reselects the arrangement reference position.
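Each measured time of flight fixes the sum of two distances (transmitter to head plus head to receiver), i.e. an ellipse with the transmitter and one receiver as foci; the head position (x, y) is the intersection of the two ellipses. The following is a brute-force numerical sketch, assuming the transmitter at the origin and the receivers at (0, +L/2) and (0, -L/2); the exact geometry of fig. 8, the speed of sound, and the search ranges are assumptions.

```python
import math

SPEED_OF_SOUND = 340.0  # m/s, room-temperature approximation

def head_position(t1, t2, L, x_range=(0.1, 1.0), y_range=(-0.6, 0.6), step=0.005):
    """Estimate the head position (x, y) from the round-trip times t1, t2.
    Assumed geometry: transmitter at the origin, receivers at (0, +L/2)
    and (0, -L/2); x is the distance in front of the transmitter, y the
    left-right offset.  Brute-force grid search over the two ellipse
    equations' residuals."""
    d1 = SPEED_OF_SOUND * t1   # = l0 + l1 (transmitter -> head -> receiver 13)
    d2 = SPEED_OF_SOUND * t2   # = l0 + l2 (transmitter -> head -> receiver 15)
    best, best_err = None, float("inf")
    nx = int((x_range[1] - x_range[0]) / step) + 1
    ny = int((y_range[1] - y_range[0]) / step) + 1
    for i in range(nx):
        x = x_range[0] + i * step
        for j in range(ny):
            y = y_range[0] + j * step
            l0 = math.hypot(x, y)
            e1 = l0 + math.hypot(x, y - L / 2) - d1   # ellipse-1 residual
            e2 = l0 + math.hypot(x, y + L / 2) - d2   # ellipse-2 residual
            err = e1 * e1 + e2 * e2
            if err < best_err:
                best, best_err = (x, y), err
    return best
```

A production implementation would solve the two equations in closed form or with Newton iteration rather than a grid search.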
On the other hand, when the state in which the absolute value of the y value is equal to or greater than the predetermined value does not continue for the predetermined time or more, the player posture determination unit 212 passes the y value to the left-right offset amount update unit 210. The left-right offset amount update unit 210 then calculates the offset amount of the player character object from the value of y, and stores the calculated offset amount in the self-character position data storage unit 208. The offset amount may be, for example, the y value itself passed from the player posture determination unit 212, or may be calculated by applying various kinds of processing, such as smoothing, to the sequentially generated y values.
Fig. 9(b) shows the structure of the data stored in the self-character position data storage unit 208. As shown, the position data includes: a base position ID identifying the base position selected as the arrangement reference position by the base position setting unit 216; the offset amount set by the left-right offset amount update unit 210; and a base position ID (lock base position ID) identifying the base position selected as the pointing position by the base position setting unit 216.
Fig. 7 shows the relationship between the offset amount, the arrangement reference position, and the current position of the player character object. In the figure, the thick-line arrow indicates the posture of the player character object, SP denotes the arrangement reference position, SP' denotes the position of the player character object after being displaced by the maximum distance, and EP denotes the base position selected as the pointing position. The player character object moves left and right by the offset amount set by the left-right offset amount update unit 210, centered on the arrangement reference position SP, while remaining directed toward the pointing position EP. Here, the direction of displacement of the player character object is perpendicular to the direction from the arrangement reference position SP to the pointing position EP, but it may be displaced in another direction. In addition, the offset amount is limited to a certain distance (here, L) on both the left and right.
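The placement described above can be sketched in plan view: the character sits at the arrangement reference position SP, shifted perpendicular to the facing direction toward EP, with the offset clamped to the maximum distance. This is a minimal 2D sketch; the choice of the left-hand perpendicular is an assumption.

```python
import math

def character_position(sp, ep, offset, max_offset):
    """Place the character at the arrangement reference position sp,
    shifted sideways (perpendicular to the sp->ep facing direction)
    by `offset`, clamped to +/- max_offset."""
    offset = max(-max_offset, min(max_offset, offset))
    fx, fy = ep[0] - sp[0], ep[1] - sp[1]
    n = math.hypot(fx, fy)
    fx, fy = fx / n, fy / n          # unit facing direction
    px, py = -fy, fx                 # left-hand perpendicular in plan view
    return (sp[0] + offset * px, sp[1] + offset * py)
```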
The data stored in the self-character position data storage unit 208 is transmitted from the communication control unit 200 to the game device 10 that executes the server function 10b. That game device 10 distributes the received data to the other game devices 10 participating in the network game. The communication control unit 200 of each game device 10 receives the distributed position data and stores it in the other-character position data storage unit 204. Fig. 9(a) shows the structure of the data stored in the other-character position data storage unit 204.
The game image generation unit 214 draws the game screen to be displayed on the monitor 24 based on the respective contents stored in the other-character trajectory data storage unit 202, the other-character position data storage unit 204, the self-character trajectory data storage unit 206, and the self-character position data storage unit 208. Specifically, trajectory data is read from the other-character trajectory data storage unit 202, and a bullet object is placed on the trajectory in the virtual three-dimensional space indicated by that trajectory data and moved with the passage of time. Similarly, trajectory data is read from the self-character trajectory data storage unit 206, and a bullet object is placed on the trajectory in the virtual three-dimensional space indicated by that trajectory data and moved with the passage of time.
Further, the position data is read from the other-character position data storage unit 204, and the player character object associated with the other game machine 10 is arranged at the position in the virtual three-dimensional space indicated by the position data. At this time, the posture of the player character object is determined to be directed from the current position to the pointed position based on the pointed position (lock base position ID). Further, the player character object is arranged at a position shifted from the arrangement reference position by a shift amount. Similarly, the position data is read from the character position data storage unit 208, and the player character object associated with the game device 10 is arranged at the position in the virtual three-dimensional space indicated by the position data. In this case, the posture of the player character object is determined from the current position toward the pointed position based on the pointed position (lock base position ID). Further, the player character object is arranged at a position shifted from the arrangement reference position by a shift amount. When the arrangement reference position of the player character object associated with the game device 10 is switched, or when the arrangement reference position of the player character object associated with another game device 10 is switched, the position of each player character object in the virtual three-dimensional space is moved toward the newly set arrangement reference position. Then, a game screen showing the movement of the player character object in the virtual three-dimensional space is generated and output to the monitor 24.
In the above description, the player character object is moved forward when the foot controller 50 is stepped on, and when the head position of the player deviates from the reference position by a predetermined distance or more in the left or right direction, the trajectory 250 extending in an arc toward the left or right of the player character object is calculated as direction data, the arrangement reference position is switched in accordance with this direction data, and the player character object is moved to the newly set arrangement reference position. Alternatively, however, direction data in a direction corresponding to the amount of left-right deviation of the player's head position from the reference position (the position of the ultrasonic transmitter 17) may first be calculated, and when the foot controller 50 is then stepped on, a base position arranged in the direction indicated by the direction data may be selected as the new arrangement reference position and the player character object moved toward that arrangement reference position.
Fig. 13 shows an example of the game screen in this case. The game screen example shown in the figure differs from that shown in fig. 3 in that a roughly semicircular movement direction instruction image 73 is arranged in the lower part of the screen with its arc portion facing upward. The movement direction instruction image 73 includes a movement direction recognition image 73a indicating the movement direction of the player character object associated with the game device 10 on which the movement direction instruction image 73 is displayed. The movement direction recognition image 73a is an elongated image (here, an elongated isosceles triangle), arranged on the movement direction instruction image 73 so that one end is fixed to the center of the lower side of the movement direction instruction image 73 and the image extends at an angle corresponding to the amount of displacement of the player's head position. This angle is used when calculating a new arrangement reference position for the player character object, and indicates the direction of the direction data.
Fig. 14 shows the relationship between the left-right offset amount y of the player's head position from the reference position and the direction θ indicated by the direction data. As shown, the direction θ monotonically increases as the offset amount y increases, and monotonically decreases as the offset amount y decreases. Further, if the offset amount y is equal to or greater than yth, the direction θ is set to +90°, and if the offset amount y is equal to or less than -yth, the direction θ is set to -90°. The base position setting unit 216 (see fig. 5) receives the offset amount y from the player posture determination unit 212, and calculates direction data indicating the direction θ from the offset amount in accordance with the relationship shown in the figure. The game image generation unit 214 acquires the direction data thus calculated, arranges the movement direction recognition image 73a on the movement direction instruction image 73 so as to extend in the direction θ indicated by the acquired direction data, and generates a game screen in which the movement direction instruction image 73 is arranged at the bottom, as shown in fig. 13. The game screen is displayed and output on the monitor 24.
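The relationship of fig. 14 can be sketched as a saturating monotonic function. A linear ramp between -yth and +yth is assumed here for illustration; the figure only guarantees monotonicity and the ±90° saturation, and the threshold value is hypothetical.

```python
def direction_from_offset(y, y_th=0.3, theta_max=90.0):
    """Monotonic mapping from head offset y to direction theta (degrees).
    Linear between -y_th and +y_th, saturating at +/- theta_max."""
    if y >= y_th:
        return theta_max
    if y <= -y_th:
        return -theta_max
    return theta_max * (y / y_th)
```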
The direction data calculated by the base position setting unit 216 is used by that unit to reselect the arrangement reference position when the foot controller 50 is stepped on. Fig. 15 is a diagram showing a method of reselecting the arrangement reference position of the player character object. In the figure, Pa to Pd indicate base positions, and SP indicates the base position selected as the current arrangement reference position of the player character object. SP' represents the current position of the player character object; the player character object is arranged facing the lock base position Pa at the current position SP', which is shifted from the arrangement reference position SP in accordance with the head position of the player. When the offset y of the player's head position from the reference position has been calculated, together with direction data indicating the corresponding direction θ, and the foot controller 50 is then stepped on, a new arrangement reference position is selected from among the base positions not currently selected as the arrangement reference position, with reference to the direction D obtained by rotating the front direction F of the player character object by the direction θ. At this time, priority is given to base positions close to the direction in which D extends, base positions close to the current position SP' of the player character object, and base positions located where an obstacle object is interposed between them and the lock base position Pa, so that attacks from the opposing player character object can be avoided. After the new arrangement reference position (here, the base position Pd) is selected in this manner, the player character object moves in the direction M.
When the foot controller 50 is stepped on, the base position setting unit 216 selects the base position to become the new arrangement reference position as described above, and then stores the ID of that base position in the self-character position data storage unit 208. The ID of the base position is also transmitted to the other game devices 10. The game image generation unit 214 of each game device 10 then moves the player character object whose arrangement reference position has been changed from its current position toward the newly selected arrangement reference position in the virtual three-dimensional space, and generates a game screen showing this movement.
In this way, the player can move the player character object in a desired direction by shaking his or her head left and right while checking the direction indicated by the movement direction recognition image 73a of the movement direction instruction image 73, and stepping on the foot controller 50 at the moment the movement direction recognition image 73a extends in the desired direction. Since the player's hands are not needed for this operation, he or she can concentrate on operating the gun controller 30.
Further, in the above-described embodiment, only a single foot controller 50 is provided as a means operated by the player's feet, but a plurality of foot controllers may be provided on the game table 20, and the player character object may be moved in the virtual space in the direction corresponding to the foot controller stepped on by the player. Fig. 16 is an external perspective view of a game device according to this modification. The game device 10a shown in the figure differs from the game device 10 shown in fig. 1 in that four foot plates, that is, a front foot plate 50f, a right foot plate 50r, a left foot plate 50l, and a rear foot plate 50b, are provided on the game table 20.
When a player stands at the center of the game table 20, the front foot plate 50f is located in front of the player (on the cabinet 12 side), the right foot plate 50r on the right, the left foot plate 50l on the left, and the rear foot plate 50b behind. When the front foot plate 50f is stepped on, the player character object moves forward in the virtual space; that is, the arrangement reference position is changed to a base position arranged forward in the virtual space. Similarly, when the right foot plate 50r is stepped on, the player character object moves rightward in the virtual space, the arrangement reference position being changed to a base position arranged to the right. When the left foot plate 50l is stepped on, the player character object moves leftward, the arrangement reference position being changed to a base position arranged to the left. When the rear foot plate 50b is stepped on, the player character object moves backward, the arrangement reference position being changed to a base position arranged to the rear.
Fig. 17 is a diagram showing a specific method of reselecting the arrangement reference position of the player character object. In the figure, Pa to Pf indicate base positions, and SP indicates the base position selected as the current arrangement reference position of the player character object. SP' represents the current position of the player character object; the player character object is arranged facing the lock base position Pa at the current position SP', which is shifted from the arrangement reference position SP in accordance with the head position of the player.
In this modification, when the front foot plate 50f is stepped on in this state, a new arrangement reference position is selected from among the base positions arranged in the direction DF from the current position SP' toward the lock base position Pa. For example, a new arrangement reference position is selected from among the base positions arranged within a sector of a predetermined angle (for example, 178 degrees) centered on the direction DF, taking into consideration the distance from the current position SP', whether an obstacle object can be interposed between that base position and the lock base position Pa of the opposing player character object, and the like.
Similarly, when the right foot plate 50r is stepped on, a new arrangement reference position is selected from among the base positions arranged in the direction DR, which is rotated 90 degrees to the right from the direction DF from the current position SP' toward the lock base position Pa. When the left foot plate 50l is stepped on, a new arrangement reference position is selected from among the base positions arranged in the direction DL, which is rotated 90 degrees to the left from the direction DF. Further, when the rear foot plate 50b is stepped on, a new arrangement reference position is selected from among the base positions arranged in the direction DB, opposite to the direction DF from the current position SP' toward the lock base position Pa. The criteria for selecting the new arrangement reference position are the same as when the front foot plate 50f is stepped on.
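The four-direction selection can be sketched as follows: compute the direction DF from the current position toward the lock base position, rotate it per foot plate, and pick the nearest unused base position whose bearing falls within a sector around that direction. The sector width, the nearest-first criterion, and the handedness of "right" are assumptions for illustration.

```python
import math

def select_base(current, lock_base, bases, used, plate, sector_deg=90.0):
    """Choose a new arrangement reference position for one of four foot
    plates: 'F' moves toward the lock base position, 'R'/'L' at right
    angles to that direction, 'B' opposite to it.  Candidates are unused
    base positions whose bearing from the current position lies within
    sector_deg of the plate's direction; the nearest one is returned."""
    df = math.atan2(lock_base[1] - current[1], lock_base[0] - current[0])
    target = {"F": df, "R": df - math.pi / 2,
              "L": df + math.pi / 2, "B": df + math.pi}[plate]
    best, best_dist = None, float("inf")
    for b in bases:
        if b in used:
            continue
        bearing = math.atan2(b[1] - current[1], b[0] - current[0])
        # Signed angular difference wrapped to (-pi, pi].
        diff = (bearing - target + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) > math.radians(sector_deg) / 2:
            continue
        dist = math.hypot(b[0] - current[0], b[1] - current[1])
        if dist < best_dist:
            best, best_dist = b, dist
    return best
```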
By stepping on the foot plates 50f, 50r, 50l, and 50b, the player character object can be moved immediately in the direction corresponding to the stepped plate, realizing a more realistic operation. In place of the above-described moving method, particularly when the front foot plate 50f is stepped on, direction data corresponding to the lateral displacement of the player's head position from the reference position may be calculated, a base position lying in the direction indicated by that direction data may be selected as the new arrangement reference position, and the player character object may be moved toward it. In this way, particularly when the player character object is moved forward, the moving direction can be finely adjusted according to the player's head position.
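The direction-based selection described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the 2D (x, z) coordinates, and the nearest-within-sector rule are assumptions; the actual selection also weighs obstacles between base positions, which is omitted here.

```python
import math

def select_base_position(current, direction, bases, half_angle_deg=90.0):
    """Pick the nearest base position lying within a sector centered on
    `direction` (a unit (x, z) vector) as seen from `current`.

    Illustrative sketch of the foot-plate selection; obstacle checks
    mentioned in the description are deliberately omitted.
    """
    best, best_dist = None, float("inf")
    cos_limit = math.cos(math.radians(half_angle_deg))
    for base in bases:
        dx, dz = base[0] - current[0], base[1] - current[1]
        dist = math.hypot(dx, dz)
        if dist == 0:
            continue  # the currently occupied base is not a candidate
        # cosine of the angle between the candidate direction and the
        # stepped direction (DF, DR, DL, or DB)
        cos_a = (dx * direction[0] + dz * direction[1]) / dist
        if cos_a >= cos_limit and dist < best_dist:
            best, best_dist = base, dist
    return best

# Stepping the front plate (direction DF = (0, 1)) selects a base ahead.
print(select_base_position((0, 0), (0, 1), [(0, 5), (0, -5), (4, 4)]))  # → (0, 5)
```

Stepping a different plate simply swaps in the corresponding direction vector (DR, DL, or DB) before calling the same routine.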
Next, the bullet mark display of the paint ball in the game device 10 will be described. First, the game device 10 determines whether or not a bullet object has come into contact with another object. If it has, the bullet trajectory data relating to that bullet object is deleted from the other-character trajectory data storage unit 202 or the self-character trajectory data storage unit 206, and an image representing a paint trace is displayed at the contact position. That is, a pigment trace object is arranged at the position in the virtual three-dimensional space where the bullet object contacted (hit) the other object.
The following two methods are employed for displaying the pigment trace object. First, as shown in fig. 18, when a bullet object 400 comes into contact with a static object 402 such as the building object 78, a pigment trace object 84 extending along the contact direction (collision direction) against the static object 402 is displayed in place of the bullet object 400, as shown in fig. 19. The pigment trace object 84 is arranged at the position (collision position) where the bullet object 400 contacted the static object 402. In this image processing, as shown in fig. 20, a portion centered on the contact position 404 is cut out from the static object 402. At this time, as shown in fig. 21, a contact direction 408 is calculated from the trajectory of the bullet object 400 and projected onto the static object 402 to obtain a vector 409, and the angle θ formed by the vector 409 and the contact direction 408 is calculated.
When the portion centered on the contact position 404 is cut out from the static object 402, the orientation of the cut-out portion is determined in accordance with the vector 409. The polygons constituting the cut-out pigment trace object 406 are then finely subdivided (see fig. 22), and a texture image representing a paint trace is mapped onto them. A plurality of texture images representing paint traces are prepared in advance, each stored in association with a range of the angle θ described above (see fig. 23). The texture image corresponding to the angle θ calculated as described above is then selectively read out and mapped onto the pigment trace object 406 (see fig. 24). Thereafter, the pigment trace object 406 is arranged at the contact position of the bullet object 400 on the original static object 402.
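The angle computation of fig. 21 and the angle-keyed texture lookup can be sketched as below. This is a hedged illustration: the texture file names and the angle bands are invented for the example, and the projection formula assumes a unit surface normal; the patent itself only states that textures are stored per range of θ.

```python
import math

def impact_angle(direction, normal):
    """Angle (degrees) between the bullet's contact direction (408) and its
    projection onto the surface (vector 409). `normal` is assumed unit-length.
    A grazing hit gives a small angle; a head-on hit gives 90 degrees."""
    dot = sum(a * b for a, b in zip(direction, normal))
    proj = tuple(a - dot * b for a, b in zip(direction, normal))  # vector 409
    proj_len = math.sqrt(sum(a * a for a in proj))
    d_len = math.sqrt(sum(a * a for a in direction))
    if proj_len == 0:  # head-on hit: the projection vanishes
        return 90.0
    cos_t = sum(a * b for a, b in zip(direction, proj)) / (d_len * proj_len)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def pick_texture(theta,
                 bands=((30, "streak.png"), (60, "oval.png"), (90, "round.png"))):
    # Hypothetical angle bands: grazing hits smear, head-on hits stay round.
    for limit, tex in bands:
        if theta <= limit:
            return tex
    return bands[-1][1]

theta = impact_angle((0, -1, 0), (0, 1, 0))  # straight-down hit on a floor
print(theta, pick_texture(theta))  # → 90.0 round.png
```

An oblique hit such as `impact_angle((1, -1, 0), (0, 1, 0))` yields 45 degrees and therefore a different texture, which is the effect the angle-keyed table in fig. 23 produces.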
Second, as shown in fig. 25, for a dynamic object 500 such as a player character object, an invisible polygon model (pigment trace object) 502 surrounding the dynamic object 500 is arranged around it in advance. As shown in fig. 26, the pigment trace object 502 is composed of polygons finer than those of the dynamic object 500 itself, so that an arbitrary texture can be mapped at an arbitrary position. When the bullet object 400 comes into contact with the dynamic object 500, the position on the polygon model 502 corresponding to the contact position is determined (see fig. 27), and a paint mark texture image prepared in advance as shown in fig. 23 is mapped there (see fig. 28). At this time, the texture image corresponding to the angle formed between the contact direction of the bullet object 400 against the dynamic object 500 and the direction of the contact surface is selectively read out and mapped to the contact position.
Then, the game image generating unit 214 images the appearance of the virtual three-dimensional space viewed from the viewpoint set at the eye position of the player character object associated with the game device 10, and displays the image on the monitor 24.
According to the network game system described above, when the bullet object is predicted to come into contact with the player character object, the trajectory of the bullet object is corrected so that the bullet object moves toward the viewpoint set on the player character object. Therefore, by displaying on the monitor 24 the appearance of the virtual three-dimensional space viewed from that viewpoint, the bullet object can be shown approaching the player.
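The trajectory correction mentioned above amounts to redirecting the bullet's velocity toward the viewpoint while preserving its speed. The following is a sketch under that assumption; the function name and the decision of *when* a hit is "predicted" (collision prediction itself) are outside this fragment.

```python
import math

def correct_trajectory(bullet_pos, velocity, viewpoint):
    """Redirect `velocity` so the bullet at `bullet_pos` heads toward
    `viewpoint`, keeping the original speed. Positions and velocity are
    3-tuples. Called only once a hit on the player character is predicted."""
    dx = tuple(v - p for v, p in zip(viewpoint, bullet_pos))
    dist = math.sqrt(sum(c * c for c in dx))
    speed = math.sqrt(sum(c * c for c in velocity))
    # scale the direction-to-viewpoint vector back up to the original speed
    return tuple(speed * c / dist for c in dx)

print(correct_trajectory((0, 0, 0), (0, 0, 3), (0, 4, 3)))  # → (0.0, 2.4, 1.8)
```

Because the corrected velocity points at the viewpoint itself, rendering the scene from that viewpoint makes the bullet appear to fly straight at the camera.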
Further, since the position of each player character object is limited to any one of the preset base positions and the posture thereof is calculated based on the base positions to which other player character objects are limited, an increase in the communication traffic of the communication network 62 is suppressed, and the positions and postures of the objects associated with the plurality of game apparatuses 10 can be shared among the plurality of game apparatuses 10.
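The traffic saving claimed here comes from transmitting a small base-position index instead of full position and posture values. A rough size comparison under an assumed wire layout (the patent does not specify one):

```python
import struct

# Hypothetical per-update payloads. Restricting positions to preset bases
# lets a game device send one index for its own base and one for the base
# it is locked onto, instead of six floats for position plus posture.
full = struct.pack("<6f", 1.0, 2.0, 3.0, 0.1, 0.2, 0.3)  # x, y, z + 3 angles
compact = struct.pack("<BB", 4, 2)  # own base index, locked base index

print(len(full), len(compact))  # → 24 2
```

Each receiving device reconstructs the sender's position from its shared base-position table and derives the posture from the locked base, so only the two indices (plus any offset amount, per claim 3) need to cross the network.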
Further, when the bullet object comes into contact with another object, a pigment trace object is displayed on which a texture image whose content corresponds to the contact direction and the direction of the contact surface is mapped, so that the player can grasp at a glance from the game screen from which direction the bullet object was shot.

Claims (16)

1. A network game system comprising a plurality of game devices connected to a communication network, a plurality of objects associated with any one of the plurality of game devices being arranged, and a virtual space in which a plurality of base positions are set being shared,
each of the game devices includes:
a 1 st base position selecting unit that selects 1 base position from the plurality of base positions;
a 1 st base position transmission unit configured to transmit the base position selected by the 1 st base position selection unit to another game device;
1 st base position receiving means for receiving the base position transmitted from the other game device by the 1 st base position transmitting means;
2 nd base position selecting means for selecting 1 base position from the base positions received by the 1 st base position receiving means;
a self-object position determining means for determining a position of an object associated with the game device based on the base position selected by the 1 st base position selecting means; and
a self-object posture determining means for determining the posture of the object associated with the game device based on the base position selected by the 2 nd base position selecting means.
2. The network game system according to claim 1, wherein:
each of the game devices further includes:
a 2 nd base position transmission means for transmitting the base position selected by the 2 nd base position selection means to the other game device;
another object position determining means for determining a position of an object associated with the other game device based on the base position received from the other game device by the 1 st base position receiving means; and
another object posture determining means for determining the posture of the object associated with the other game device based on the base position transmitted from the other game device by the 2 nd base position transmitting means.
3. The network game system according to claim 1 or 2, wherein:
each of the game devices further includes:
an offset amount input mechanism that inputs an offset amount of an object associated with the game device; and
offset amount transmitting means for transmitting the offset amount input by the offset amount input means to the other game device,
the self-object position determining means determines the position of the object associated with the game device based on the base position selected by the 1 st base position selecting means and the offset amount input by the offset amount input means.
4. The network game system according to any one of claims 1 to 3, wherein:
the self-object posture determining means determines the posture of the object based on the base position selected by the 2 nd base position selecting means and the position of the object associated with the game device.
5. The network game system according to any one of claims 1 to 4, wherein:
the 2 nd base position selecting means newly selects a next one of the base positions received by the 1 st base position receiving means, based on the current position of the object associated with the game device.
6. The network game system according to any one of claims 1 to 5, wherein:
in each of the above-described game apparatuses,
a direction input mechanism for allowing the player to input direction data,
the 1 st base position selecting means selects 1 of the plurality of base positions in accordance with the direction data input by the direction input means.
7. The network game system according to claim 6, wherein:
the direction input mechanism includes:
player posture determining means for acquiring data indicating a posture of the player; and
and a direction data calculation unit that calculates direction data indicating a direction in the virtual space based on the data acquired by the player posture determination unit.
8. The network game system according to claim 7, wherein:
the player posture determining means acquires data indicating a position of a predetermined part of the player as data indicating a posture of the player.
9. The network game system according to claim 7 or 8, wherein:
the 1 st base position selecting means selects 1 of the plurality of base positions in accordance with the direction data calculated by the direction data calculating means when the player operates a predetermined operation member.
10. The network game system according to claim 9, wherein:
the operation member is disposed under the player's foot.
11. The network game system according to any one of claims 7 to 10, wherein:
the player posture determining means includes:
an ultrasonic transmitter for transmitting ultrasonic waves to the player;
a plurality of ultrasonic receivers which receive ultrasonic waves emitted from the ultrasonic emitters and reflected by the player at positions away from each other; and
time measuring means for measuring each time from when the ultrasonic transmitter transmits an ultrasonic wave to when the ultrasonic receiver receives the ultrasonic wave,
data indicating the posture of the player is acquired based on the respective times measured by the time measuring means.
12. The network game system according to any one of claims 7 to 11, wherein:
the game device further includes game image generating means for generating a game screen including a direction instruction image indicating a direction indicated by the direction data calculated by the direction data calculating means.
13. The network game system of claim 8, wherein:
the player posture determining means calculates data indicating a deviation amount of the position of the head of the player from a reference position as data indicating the position of the predetermined portion of the player,
the direction data calculation means calculates direction data indicating a direction corresponding to a deviation from the reference position of the head of the player.
14. A method for controlling a network game system including a plurality of game devices connected to a communication network, a plurality of objects associated with any one of the plurality of game devices being arranged, and a virtual space in which a plurality of base positions are set being shared,
each of the game devices includes:
a 1 st base position selecting step of selecting 1 base position from the plurality of base positions;
a 1 st base position transmission step of transmitting the base position selected in the 1 st base position selection step to another game device;
a 1 st base position receiving step of receiving the base position transmitted from the other game device in the 1 st base position transmitting step;
a 2 nd base position selecting step of selecting 1 base position from the base positions received by the 1 st base position receiving step;
a self-object position determination step of determining a position of an object associated with the game device based on the base position selected in the 1 st base position selection step; and
and a self-object posture determining step of determining a posture of the object associated with the game device based on the base position selected in the 2 nd base position selecting step.
15. A game device connected to a communication network, wherein a virtual space in which an object is arranged is shared between other game devices connected to the communication network, the game device comprising:
a position storage unit that stores a plurality of base positions provided in the virtual space;
a base position selecting unit configured to select 1 base position from the plurality of base positions;
a base position receiving unit configured to receive the base position selected by the other game device;
a position determining unit configured to determine a position of the object based on the base position selected by the base position selecting unit; and
and an attitude determination means for determining the attitude of the object based on the base position received by the base position reception means.
16. A method for controlling a game device connected to a communication network, the game device sharing a virtual space in which an object is arranged between other game devices connected to the communication network, the method comprising:
a base position selecting step of selecting 1 from a plurality of base positions provided in the virtual space;
a base position receiving step of receiving the base position selected by the other game device;
a position determining step of determining a position of the object based on the base position selected in the base position selecting step; and
a posture determining step of determining a posture of the object based on the base position received by the base position receiving step.
HK08109666.6A 2005-06-29 2006-06-29 Network game system, network game system control method, game machine and game machine control method HK1114042B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP190328/2005 2005-06-29
JP2005190328 2005-06-29
JP371119/2005 2005-12-22
JP2005371119A JP4861699B2 (en) 2005-06-29 2005-12-22 NETWORK GAME SYSTEM, NETWORK GAME SYSTEM CONTROL METHOD, GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
PCT/JP2006/312993 WO2007001050A1 (en) 2005-06-29 2006-06-29 Network game system, network game system control method, game machine, game machine control method, and information storage medium

Publications (2)

Publication Number Publication Date
HK1114042A1 HK1114042A1 (en) 2008-10-24
HK1114042B true HK1114042B (en) 2012-03-02


Similar Documents

Publication Publication Date Title
US7059962B2 (en) Gun shooting game device, method of controlling computer and program
CN100542645C (en) Image generating device and image display method
JP4179162B2 (en) Information processing device, game device, image generation method, and game image generation method
CN101213003B (en) Network game system, network game system control method, game machine, and game machine control method
JP2002052240A (en) Pseudo camera viewpoint movement control method in 3d video game and 3d video game device
WO1998017361A1 (en) Game controller and information storage medium
US20020190981A1 (en) Image generation device and information storage medium
JP2002052243A (en) Competition type video game
US6206783B1 (en) Control input device and game system
JP4563266B2 (en) NETWORK GAME SYSTEM, GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
JP4412714B2 (en) Arrangement information detection system, game system, program, information storage medium, and arrangement information detection system control method
JP3818774B2 (en) Game device
JP4563267B2 (en) Network game system, network game control method, game device, game control method, and program
JP4861706B2 (en) NETWORK GAME SYSTEM, NETWORK GAME SYSTEM CONTROL METHOD, GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
HK1114042B (en) Network game system, network game system control method, game machine and game machine control method
JPH08323037A (en) Target hitting game machine
TW200934566A (en) Image processor, game device, and computer program
JP7022998B2 (en) Game system and computer programs used for it
JP2005331188A (en) Simulated gun operation input device and shooting game device
HK1081307B (en) Image generation device, image display method and program product