US20130316817A1 - Information processing apparatus, method for information processing, and game apparatus
- Publication number
- US20130316817A1 (application No. US 13/748,937)
- Authority
- US
- United States
- Prior art keywords
- coordinate
- intermediate point
- display
- area
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/822—Strategy games; Role-playing games
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04808—Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
Definitions
- the present invention relates to an information processing apparatus, a method for information processing, and a game apparatus which can specify an area on a display screen. More specifically, the present invention relates to, for example, an information processing apparatus which can perform processing of selecting one or a plurality of objects included in a specified area by specifying the area on the display screen.
- a technique of specifying a rectangular area by sliding an input position on a display screen using a pointing device such as a touch screen or a mouse is well known.
- with such a pointing device, a user generally performs a touch input (click input) of one point on a display screen, slides (swipes) the touch position while continuing the touch input on the screen, and then releases the touch input to specify a rectangular area a diagonal line of which is a line segment connecting the first touch position and the last touch position.
- Patent Literature 1 discloses, for example, a technique of selecting a plurality of objects on a screen using a touch panel display. According to the technique disclosed in Patent Literature 1, a touch state is detected when touch control is performed on the touch panel display arranged overlaid in front of a monitor which can display a plurality of objects, a size of a selection area which defines a range for selecting objects is determined according to the detected touch state, the selection area is set based on the touch position, and objects at least part of which overlap the set selection area are used as selection candidates.
- touch panel displays can be made larger, and usage thereof is diversifying.
- a game apparatus is also being developed which displays a game image on a touch panel display and enables a user to perform inputting operations related to game progress by touching the touch panel with a finger.
- by mounting a large touch panel display on a game apparatus and executing an action game, it is possible to provide a realistic game experience to users.
- methods of touch control of a touch panel display include, for example, an operation of touching one point of the screen, an operation of tracing the screen with a finger and an operation of tapping the screen with a finger.
- these operations are assigned different commands (instructions), and a user advances a game by selecting among these operating methods and repeatedly inputting various commands.
- in a game apparatus which advances a game by selecting objects displayed on a display screen, if a user desires to select one object, the user touches the point on the screen at which this object is displayed, and, if the user desires to select a plurality of objects, the user touches the points at which the respective objects are displayed.
- in Patent Literature 1, not only the operation of selecting one object but also the operation of selecting a plurality of objects is basically performed based on the touch state of one finger.
- the operation of selecting one object and the operation of selecting a plurality of objects are distinguished based only on whether or not the touch state of one finger continues, and therefore users are likely to make touch control mistakes. That is, if a continuing touch state on the touch panel display is detected by mistake even though the user intends to select only one object, another, unintended object is selected.
- there is therefore a demand for a technique which can prevent user error by effectively distinguishing an inputting operation of specifying an area on a display screen from a touch or click input specifying one point and from an operation of sliding while continuing to input one point.
- further, a technique is demanded which allows an inputting operation to easily specify a wide area on the display screen with one hand.
- as a result of devoted study of means for solving the above problem, the inventors of the present invention found that, by fixing a predetermined area on the display screen based on an intermediate coordinate calculated from the coordinates of the two points simultaneously inputted first to a coordinate input unit and an intermediate coordinate calculated from the coordinates of the two points simultaneously inputted last, it is possible to effectively distinguish an inputting operation of specifying an area on the display screen from a touch or click input specifying one point and from an operation of sliding while continuing to input one point. Moreover, the inventors found that this operating method makes it easy to specify a wide area on the display screen with one hand. Based on this knowledge, the inventors arrived at a solution to the problem of the conventional technique and made the present invention.
- the present invention employs the following configuration.
- a first aspect of the present invention relates to an information processing apparatus.
- the information processing apparatus has: a display unit 1 which can display an image; a coordinate input unit 2 which receives an input of a coordinate on a display screen of the display unit 1 ; and an area specifying unit 3 which specifies an area R on the display screen of the display unit 1 based on the coordinate inputted to the coordinate input unit 2 .
- the area specifying unit 3 first determines the coordinate of a first intermediate point M 1 calculated from the coordinates of the first two points simultaneously inputted to the coordinate input unit 2.
- next, the area specifying unit 3 determines the coordinate of a second intermediate point M 2 calculated from the coordinates of the last two points detected immediately before coordinates of two points stop being simultaneously inputted.
- further, the area specifying unit 3 fixes the area R on the display screen of the display unit 1 based on the coordinate of the first intermediate point M 1 and the coordinate of the second intermediate point M 2, as sketched below.
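- the behavior of the area specifying unit 3 can be summarized in code. The following is a minimal Python sketch, assuming the simple exact-midpoint rule and the axis-aligned rectangular area described later; the names Point, midpoint and fix_area are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Point:
    x: float
    y: float

def midpoint(p: Point, q: Point) -> Point:
    """Intermediate point of two simultaneously inputted coordinates."""
    return Point((p.x + q.x) / 2.0, (p.y + q.y) / 2.0)

def fix_area(m1: Point, m2: Point) -> tuple[Point, Point]:
    """Fix an axis-aligned rectangular area R whose diagonal is the
    segment M1-M2, returned as (top-left, bottom-right) corners."""
    return (Point(min(m1.x, m2.x), min(m1.y, m2.y)),
            Point(max(m1.x, m2.x), max(m1.y, m2.y)))

# First two simultaneous touches -> first intermediate point M1.
m1 = midpoint(Point(100, 200), Point(140, 260))
# Last two touches detected just before release -> second intermediate point M2.
m2 = midpoint(Point(420, 500), Point(460, 540))
area_r = fix_area(m1, m2)  # the fixed area R on the display screen
```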
- by calculating a first intermediate coordinate from the coordinates of the two points simultaneously inputted first and a second intermediate coordinate from the coordinates of the two points simultaneously inputted last, the information processing apparatus according to the present invention defines the area on the display screen based on the first intermediate coordinate and the second intermediate coordinate.
- according to this configuration, the inputting operation of specifying an area on the display screen is an operation of simultaneously inputting the coordinates of two points to the coordinate input unit, so that it can be effectively distinguished from a touch or click input specifying one point and from an operation of sliding while continuing to input one point. Consequently, the present invention can prevent user error.
- further, the present invention can obtain the first intermediate coordinate from the touch positions of two fingers, obtain the second intermediate coordinate from the touch positions at which the two fingers are released after sliding, and define the area on the display screen based on these two intermediate coordinates, so that the user can easily specify a wide area on the display screen with one hand.
- as described above, the present invention proposes a new operating method for an information processing apparatus which has a display unit and a coordinate input unit. According to the present invention, it is possible to improve the operability of the information processing apparatus.
- the display unit 1 may be able to display a plurality of objects O on the display screen.
- the information processing apparatus according to the present invention preferably further has an object selecting unit 4 which selects one or a plurality of objects O at least part of which is included in the area R specified by the area specifying unit 3.
- one usage of specifying an area on the display screen by means of the area specifying unit according to the present invention is to select one or a plurality of objects included in the specified area.
- conventionally, the portion at which a desired object is displayed needs to be touched or clicked with a pointing device.
- by contrast, according to the operating method of the present invention, a predetermined area is determined based on the first intermediate coordinate and the second intermediate coordinate so that a desired object is included in this predetermined area.
- the display unit 1 and the coordinate input unit 2 preferably form a touch panel display 10 with the coordinate input unit 2 overlaid in front of the display unit 1 .
- the touch panel display 10 is an apparatus which detects a point on the coordinate input unit 2 which the user's hand or finger contacts as a coordinate on the screen of the display unit 1.
- the user can specify a predetermined area on the display area by an intuitive operation.
- the area R may be, for example, a polygonal area a diagonal line of which is a line segment connecting the first intermediate point M 1 and the second intermediate point M 2, a circular area a diameter of which is the line segment connecting the first intermediate point M 1 and the second intermediate point M 2, or an elliptical area a major axis or a minor axis of which is the line segment connecting the first intermediate point M 1 and the second intermediate point M 2.
- a second aspect of the present invention relates to a method for information processing executed by the information processing apparatus according to the first aspect.
- the method for information processing according to the present invention specifies an area on a display screen of the display unit 1 based on a coordinate detected by the coordinate input unit 2 which receives an input of the coordinate on the display screen of the display unit 1 .
- the method for information processing according to the present invention first performs a step of determining the coordinate of the first intermediate point M 1 calculated from the coordinates of the first two points simultaneously inputted to the coordinate input unit 2.
- next, the method performs a step of determining the coordinate of the second intermediate point M 2 calculated from the coordinates of the last two points detected immediately before coordinates of two points stop being simultaneously inputted.
- the method for information processing performs a step of defining the area R on the display screen of the display unit 1 based on the coordinate of the first intermediate point M 1 and the coordinate of the second intermediate point M 2 .
- a third aspect of the present invention relates to a game apparatus.
- a game apparatus has: a touch panel display 100; a card reader 200; and a game body 300 which advances a game by displaying information read by the card reader 200 on the touch panel display 100.
- the touch panel display 100 has: a display 110 which can display an image; and a touch screen 120 which is overlaid in front of the display 110 and through which a coordinate on a display screen is inputted.
- the card reader 200 has: a panel 210 on which a card with a code having predetermined card information printed thereon is set; and an image sensor 230 which reads the code of the card set on the panel 210 and detects the card information.
- the game body 300 has: a game information memory unit 380 which stores information related to an object O in association with the card information; an image processing unit 330 which reads the information related to the object O from the game information memory unit 380 based on the card information detected by the image sensor 230 of the card reader 200 , and performs control of displaying an image of the read object O on the display 110 of the touch panel display 100 ; and a game processing unit 320 .
- the game processing unit 320 first determines the coordinate of the first intermediate point M 1 calculated from the coordinates of the first two points simultaneously inputted to the touch screen 120 of the touch panel display 100. Further, the game processing unit 320 determines the coordinate of the second intermediate point M 2 calculated from the coordinates of the last two points detected immediately before coordinates of two points stop being simultaneously inputted. Furthermore, the game processing unit 320 specifies the area R on the display screen of the display 110 based on the coordinate of the first intermediate point M 1 and the coordinate of the second intermediate point M 2. Still further, the game processing unit 320 selects one or a plurality of objects O at least part of whose displayed image is included in the specified area R, to advance the game.
- the game apparatus has the card reader and the touch panel display mounted thereon, and advances a game by combining operations on both.
- a user of the game operates with one hand a card set on the card reader to display one or a plurality of objects on the touch panel display, and performs touch control of the touch panel display with the other hand to select the displayed objects.
- a game apparatus which requires objects to be selected by operating the touch panel display with one hand can thus improve game operability by adopting the above object selecting operation.
- the area on the display screen is defined based on the first intermediate coordinate and the second intermediate coordinate.
- the inputting operation of specifying the area on the display screen is an operation of simultaneously inputting the coordinates of two points to the coordinate input unit. Consequently, the present invention can effectively distinguish an inputting operation of specifying the area on the display screen from a touch or click input specifying one point and from an operation of sliding while continuing to input one point.
- further, the present invention can obtain the first intermediate coordinate from the touch positions of two fingers, obtain the second intermediate coordinate from the touch positions at which the two fingers are released after sliding, and define the area on the display screen based on these intermediate coordinates. Consequently, according to the present invention, the user can easily specify a wide area on the display screen with one hand.
- FIG. 1 is a functional block diagram illustrating an information processing apparatus according to the present invention.
- FIG. 2 illustrates a flow of processing executed by the information processing apparatus according to the present invention.
- FIG. 3 is a schematic diagram illustrating an object selecting operation.
- FIG. 4 illustrates an example of a first intermediate point and a second intermediate point calculated from two coordinates.
- FIGS. 5(a), 5(b) and 5(c) illustrate examples of areas calculated from the coordinates of the first intermediate point and the second intermediate point.
- FIG. 6 is a block diagram illustrating a configuration example of a game apparatus according to the present invention.
- FIG. 7 is a perspective view illustrating an example of the external appearance of the game apparatus according to the present invention.
- FIG. 8 schematically illustrates a card reader on which a plurality of cards are set.
- FIG. 9 is a perspective view illustrating an example of the external appearance of the game apparatus according to the present invention.
- FIGS. 10(a) and 10(b) are views for describing an example of a game executed by the game apparatus according to the present invention.
- FIGS. 11(a) and 11(b) are views for describing an example of a game executed by the game apparatus according to the present invention.
- the information processing apparatus can specify a predetermined area on a display screen, and perform various information processing of the specified predetermined area.
- the information processing apparatus according to the present invention can select one or a plurality of objects included in the predetermined area of the specified display screen and move a position of the selected object, that is, give an arbitrary command to the selected object.
- the information processing apparatus according to the present invention can perform editing processing of, for example, specifying an image included in the predetermined area on the display screen and enlarging and displaying the image in the area or cutting the image in the area.
- use of the predetermined area specified by the present invention is by no means limited to these.
- FIG. 1 is a block diagram illustrating a basic functional configuration of the information processing apparatus according to the present invention. As illustrated in FIG. 1 , the information processing apparatus according to the present invention has a touch panel display 10 , a control unit 20 and a memory unit 30 .
- the touch panel display 10 is configured to display various items of image data as an image which a user can view, and detect a coordinate which the user touched on a display screen. More specifically, the touch panel display 10 is formed by disposing a coordinate input unit 2 formed using a transparent material, in front of a display unit 1 which can display images.
- the display unit 1 is a display apparatus such as an LCD (Liquid Crystal Display) or an OELD (Organic Electro Luminescence Display).
- the display unit 1 outputs and displays various pieces of information which the user requires to use the information processing apparatus, as a still image or a movie according to an input signal from the control unit 20 .
- the coordinate input unit 2 can detect contact of the user's hand or finger according to a known electrostatic capacitance method, electromagnetic induction method, infrared scan method, resistance film method or ultrasonic surface acoustic wave method, and obtain coordinate information.
- a positional relationship between the display unit 1 and the coordinate input unit 2 is mutually linked, and the coordinate input unit 2 can obtain coordinate information about a touch position on a display screen displayed on the display unit 1 .
- the coordinate input unit 2 can detect contact of the user's finger, and obtain the information about the coordinate on the screen of the display unit 1 which the user's finger contacted.
- the coordinate input unit 2 supports so-called multitouch: when the user touches a plurality of points, it acquires information about the coordinates of each of these points.
- the information processing apparatus can have a comparatively large touch panel display 10 mounted thereon.
- the touch panel display 10 preferably measures 10 to 75 inches, 16 to 40 inches or 20 to 38 inches.
- meanwhile, the display unit 1 and the coordinate input unit 2 are by no means limited to an integrally formed touch panel display.
- the display unit 1 and the coordinate input unit 2 may function as separate hardware.
- a normal display apparatus such as an LCD or an OELD may be used for the display unit 1 .
- the coordinate input unit 2 may be a pointing device such as a mouse or a touch tablet which is provided separately from the display unit 1 .
- the control unit 20 controls the entire operation of the information processing apparatus by reading and executing a control program stored in the memory unit 30 .
- the control unit 20 executes a function by means of, for example, a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit).
- the control unit 20 reads information including image data of an object from the memory unit 30 , generates an image of the object and has the touch panel display 10 display the image. Further, the control unit 20 stores coordinate information about the touch position detected by the touch panel display 10 , in the memory unit 30 .
- the control unit 20 can perform computation of specifying a predetermined area on the display screen based on the coordinate information inputted to the touch panel display 10 . Further, the control unit 20 can decide whether or not an object is selected by analyzing position information of the object displayed on the touch panel display 10 and the coordinate information inputted to the touch panel display 10 .
- the control unit 20 has an area specifying unit 3 and an object selecting unit 4 from a functional viewpoint.
- the area specifying unit 3 has a function of specifying a selection area on the display screen of the touch panel display 10 according to the control program stored in the memory unit 30 .
- the object selecting unit 4 has a function of, when the touch position detected by the touch panel display 10 overlaps an object or when the selection area specified by the area specifying unit 3 includes an object, selecting these objects. Details will be described below.
- the memory unit 30 stores various pieces of information including the control program required for processing in the information processing apparatus.
- the memory unit 30 is realized by a storage apparatus such as a ROM (Read Only Memory) or a RAM (Random Access Memory).
- the RAM is, for example, a VRAM (Video RAM), a DRAM (Dynamic RAM) or a SRAM (Static RAM).
- the memory unit 30 has an object memory unit 5 and a coordinate memory unit 6 from a functional viewpoint.
- the object memory unit 5 stores information including image data of the object displayed on the touch panel display 10 (for example, vertex coordinates, vertex texture coordinates or brightness data of the object).
- the coordinate memory unit 6 stores coordinate information acquired by the coordinate input unit 2 of the touch panel display 10 .
- the coordinate memory unit 6 stores coordinate information read and written by the control unit 20 , and is realized by, for example, a working area of the RAM.
- FIG. 2 specifically illustrates the processing by which the control unit 20 selects one or a plurality of objects displayed on the touch panel display 10.
- FIG. 3 is a schematic view illustrating processing of object selection executed by the information processing apparatus.
- Step S 1 is a touch input stand-by state in which the coordinate input unit 2 of the touch panel display 10 does not detect a touch input.
- the display unit 1 of the touch panel display 10 displays a plurality of objects O (O 1 to O 5 ) read from the object memory unit 5 .
- the plurality of objects O (O 1 to O 5 ) may be moving on the screen of the display unit 1 or staying still under control of the control unit 20.
- a background image may be displayed on the screen of the display unit 1 in addition to the objects O.
- in step S 2, the coordinate input unit 2 of the touch panel display 10 detects a touch input of a first point.
- a touch point P 1 of the first point is, for example, a point at which a user's forefinger contacts the coordinate input unit 2 .
- the control unit 20 acquires information about a coordinate of the touch point P 1 of the first point, and temporarily stores the acquired coordinate information in the coordinate memory unit 6 .
- in step S 3, the control unit 20 decides whether or not the touch input of the first point continues, based on the information detected by the coordinate input unit 2.
- when it is decided that the touch input of the first point continues, the flow proceeds to step S 4, and, when it is decided that the touch input of the first point does not continue, the flow proceeds to step S 17.
- in step S 4, the control unit 20 decides whether or not a touch input of a second point is performed while the touch input of the first point continues, based on the information detected by the coordinate input unit 2.
- a touch point P 2 of the second point is, for example, a point at which the user's thumb contacts the coordinate input unit 2 .
- the control unit 20 acquires information about a coordinate of the touch point P 2 of the second point, and temporarily stores the acquired coordinate information in the coordinate memory unit 6 .
- when it is decided that the touch input of the second point is performed while the touch input of the first point continues, the flow proceeds to step S 5, and, when it is decided that the touch input of the second point is not performed, the flow proceeds to step S 17.
- meanwhile, in the touch input stand-by state (step S 1), when the touch inputs of the first point and the second point are simultaneously detected, the processing in steps S 2 to S 4 is performed simultaneously, and the flow proceeds to step S 5.
- in step S 5, the control unit 20 reads the information about the coordinates of the touch point P 1 of the first point and the touch point P 2 of the second point from the coordinate memory unit 6, and calculates the coordinate of the first intermediate point M 1 based on these pieces of coordinate information.
- the first intermediate point M 1 is set to exactly an intermediate point between the touch point P 1 of the first point and the touch point P 2 of the second point. That is, the first intermediate point M 1 is set to a position at which distances to the touch point P 1 of the first point and the touch point P 2 of the second point are equal on a line segment connecting the touch point P 1 of the first point and the touch point P 2 of the second point.
- the position to which the first intermediate point M 1 is set is not limited to the above, and may be any position as long as the position can be set based on the information about the coordinate of the touch point P 1 of the first point and the touch point P 2 of the second point.
- for example, the first intermediate point M 1 may be set to a position on the line segment connecting the touch point P 1 of the first point and the touch point P 2 of the second point at which the ratio of the distances to the touch point P 1 and the touch point P 2 is 6:4.
- the ratio of distances to the touch point P 1 of the first point and the touch point P 2 of the second point can be arbitrarily set.
- the coordinate of the first intermediate point can be set based on coordinate information of the touch point P 1 of the first point and the touch point P 2 of the second point according to various conditions.
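- as a sketch, such conditions can be expressed as a weighted point on the line segment connecting the two touch points; the 6:4 split above corresponds to t = 0.6 (the helper below is illustrative, not from the patent):

```python
def intermediate_point(p1, p2, t=0.5):
    """Point M on the segment P1-P2 with |P1-M| : |M-P2| = t : (1 - t).
    t = 0.5 gives the exact midpoint; t = 0.6 gives the 6:4 split."""
    return (p1[0] + t * (p2[0] - p1[0]),
            p1[1] + t * (p2[1] - p1[1]))

m1 = intermediate_point((100, 200), (140, 260), t=0.6)
```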
- FIG. 4 illustrates an example of the first intermediate point M 1 set under another condition.
- in this example, the first intermediate point M 1 is provided at the apex of an isosceles triangle whose base angles are at the touch point P 1 of the first point and the touch point P 2 of the second point. That is, as illustrated in FIG. 4, the first intermediate point M 1 is set to a position spaced apart from the line segment connecting the touch point P 1 of the first point and the touch point P 2 of the second point by the height h of the isosceles triangle.
- by setting the first intermediate point M 1 at a position spaced a predetermined distance h apart from the line segment connecting the touch point P 1 of the first point and the touch point P 2 of the second point, there is an advantage that the user can easily view the position of the first intermediate point M 1 displayed on the touch panel display 10. That is, if the first intermediate point M 1 is set on the line segment connecting the two touch points, the first intermediate point M 1 may hide behind the user's hand depending on the viewing angle and become hard to see. In this regard, by setting the first intermediate point M 1 to the position illustrated in FIG. 4, it is possible to prevent the first intermediate point M 1 from being hidden behind the user's hand, so that the user can easily see its position.
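- a sketch of this variant: offset the exact midpoint of the two touch points by the height h along the unit normal of the segment, which places M 1 at the apex of the isosceles triangle (the sign of the offset, i.e. which side of the segment, is an implementation choice):

```python
import math

def apex_intermediate_point(p1, p2, h):
    """Apex of the isosceles triangle with base P1-P2 and height h:
    the midpoint offset by h along the unit normal of the segment."""
    mx, my = (p1[0] + p2[0]) / 2.0, (p1[1] + p2[1]) / 2.0
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return (mx, my)                   # coincident touches: no defined normal
    nx, ny = -dy / length, dx / length    # unit normal of the segment P1-P2
    return (mx + h * nx, my + h * ny)
```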
- the coordinate of the first intermediate point M 1 calculated in step S 5 is temporarily stored in the coordinate memory unit 6.
- the flow proceeds to step S 6 .
- in step S 6, the control unit 20 decides whether or not the touch input of the first point and the touch input of the second point continue, based on the information detected by the coordinate input unit 2.
- when it is decided that both touch inputs continue, the flow proceeds to step S 7, and, when one or both of the touch inputs of the first point and the second point do not continue, the flow proceeds to step S 17.
- in step S 7, the control unit 20 decides whether or not the touch inputs of the two points slide while the touch inputs of the first point and the second point continue. That the touch inputs slide means that the detected coordinates of the touch inputs are continuously displaced.
- in other words, step S 7 decides whether or not the user's two fingers (for example, the forefinger and the thumb) move tracing the screen while remaining in contact with the coordinate input unit 2 of the touch panel display 10.
- the control unit 20 can decide that the touch inputs of two points slide, based on the coordinate information continuously detected by the coordinate input unit 2 .
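- a minimal sketch of this decision, assuming touch coordinates are sampled continuously and a small tolerance separates sensor jitter from an intentional slide (the threshold value is an illustrative assumption):

```python
import math

SLIDE_THRESHOLD = 8.0  # pixels; illustrative jitter tolerance

def is_sliding(prev, curr, threshold=SLIDE_THRESHOLD):
    """A touch input slides when its detected coordinate is displaced
    by more than the tolerance between successive samples."""
    return math.hypot(curr[0] - prev[0], curr[1] - prev[1]) > threshold

def two_point_slide(prev_pair, curr_pair):
    """Decide a simultaneous slide of the two touch points (step S7):
    both points remain down, and either one is displaced."""
    return (is_sliding(prev_pair[0], curr_pair[0]) or
            is_sliding(prev_pair[1], curr_pair[1]))
```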
- the coordinate information which is continuously detected while the touch inputs of the two points slide is successively stored in the coordinate memory unit 6.
- in step S 8, the control unit 20 reads from the coordinate memory unit 6 the coordinate information continuously detected while the touch inputs of the two points slide, and successively calculates the coordinates of second intermediate point candidates based on the coordinates of the two touch points.
- the “second intermediate point” is a point calculated from coordinates of last two points based on the coordinates of the last two points detected immediately before the coordinates of the two points stop being simultaneously inputted.
- the “second intermediate point candidate” is a point which can be the second intermediate point.
- in FIG. 3, a reference numeral P 3 refers to a touch point (a release point of the forefinger) of a third point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P 1 of the first point is released.
- a reference numeral P 4 refers to a touch point (a release point of the thumb) of a fourth point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P 2 of the second point is released.
- the coordinate of the “second intermediate point M 2 ” can be calculated from the coordinate of the touch point of the third point and the coordinate of the touch point of the fourth point.
- Conditions for calculating the coordinate of the second intermediate point M 2 may be the same as the conditions for calculating the coordinate of the first intermediate point M 1 .
- FIG. 3 illustrates that a reference numeral P 3 ′ refers to a halfway point at which the slide input is continuing from the touch point P 1 of the first point, and a reference numeral P 4 ′ refers to a halfway point at which the slide input is continuing from the touch point P 2 of the second point.
- conditions for calculating the coordinate of the second intermediate point candidate M 2 ′ are the same as the conditions for calculating the coordinate of the second intermediate point M 2.
- the calculated coordinates of the second intermediate point candidates M 2 ′ are successively stored in the coordinate memory unit 6.
- the flow proceeds to step S 9 .
- in step S 9, the area specifying unit 3 of the control unit 20 calculates selection area candidates R′ based on the above-described coordinate of the first intermediate point M 1 and the coordinates of the second intermediate point candidates M 2 ′.
- a selection area candidate R′ is an area which can become the selection area R described below.
- a selection area candidate R′ is a rectangular area a diagonal line of which is, for example, a line segment connecting the first intermediate point M 1 and the second intermediate point candidate M 2 ′.
- a shape and an area of the selection area candidate R′ change when touch inputs of two points slide and the coordinate of the second intermediate point candidate M 2 ′ changes.
- the selection area candidates R′ are continuously calculated according to changes in the coordinates of the second intermediate point candidates M 2 ′.
- the flow proceeds to step S 10 .
- in step S 10, the control unit 20 displays the selection area candidates R′ calculated in step S 9 on the touch panel display 10.
- the selection area candidates R′ are continuously calculated, and displayed on the touch panel display 10 every time the selection area candidate R′ is calculated.
- the user can check the selection area candidates R′ from the display of the touch panel display 10 , and can adjust a touch position such that an object which the user desires to select is included in the selection area candidates R′.
- the flow proceeds to step S 11 .
- in step S 11, the control unit 20 decides whether or not the slide inputs continuing from the touch point P 1 of the first point and the touch point P 2 of the second point are released. That is, the control unit 20 may decide that the slide inputs are released when the coordinate input unit 2 no longer detects touch inputs continuing from the touch point P 1 of the first point and the touch point P 2 of the second point.
- when it is decided that the slide inputs are released, the flow proceeds to step S 12.
- otherwise, the processing in steps S 8 to S 10 is repeated until release of the slide inputs is detected.
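- the loop of steps S 8 to S 11 and the subsequent fixing of the area (steps S 12 and S 13, described below) can be sketched as follows, reusing the illustrative midpoint and fix_area helpers from the earlier sketch; draw_candidate stands in for whatever display call step S 10 uses:

```python
def track_selection(m1, samples, draw_candidate):
    """Repeat steps S8-S10 for each sampled pair of touch coordinates,
    then fix the selection area R on release (steps S11-S13).

    `samples` yields (p, q) coordinate pairs while both touch inputs
    continue, and is exhausted when the slide inputs are released.
    """
    last_pair = None
    for p, q in samples:
        last_pair = (p, q)
        m2_candidate = midpoint(p, q)               # step S8
        candidate_r = fix_area(m1, m2_candidate)    # step S9
        draw_candidate(candidate_r)                 # step S10
    if last_pair is None:
        return None                                 # no slide input occurred
    m2 = midpoint(*last_pair)                       # step S12
    return fix_area(m1, m2)                         # step S13: the area R
```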
- in step S 12, the control unit 20 determines the coordinate of the second intermediate point M 2. That is, as illustrated in FIG. 3, the control unit 20 takes the points detected by the coordinate input unit 2 immediately before the slide inputs are released in step S 11 as the touch point P 3 of the third point and the touch point P 4 of the fourth point.
- the touch point P 3 of the third point is the touch point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P 1 of the first point is released.
- the touch point P 4 of the fourth point is the touch point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P 2 of the second point is released.
- the control unit 20 then calculates the coordinate of the second intermediate point M 2 based on the coordinate of the touch point P 3 of the third point and the coordinate of the touch point P 4 of the fourth point.
- Conditions for calculating the coordinate of the second intermediate point M 2 may be the same as the conditions for calculating the coordinate of the first intermediate point M 1 .
- the coordinate of the second intermediate point M 2 is stored in the coordinate memory unit 6 . When the coordinate of the second intermediate point M 2 is calculated, the flow proceeds to step S 13 .
- in step S 13, the area specifying unit 3 of the control unit 20 defines the selection area R on the display screen of the touch panel display 10 based on the coordinate of the first intermediate point M 1 and the coordinate of the second intermediate point M 2.
- the selection area R is, for example, a rectangular area a diagonal line of which is the line segment D connecting the first intermediate point M 1 and the second intermediate point M 2, and a periphery of which is defined by two sides parallel to the Y axis of the display screen and two sides parallel to the X axis.
- the coordinate of each vertex of the selection area R is stored in the memory unit 30.
- the shape of the selection area R is not limited to the above, and may have a shape determined based on coordinates of two points of the first intermediate point M 1 and the second intermediate point M 2 .
- FIG. 5 illustrates an example of the shape of the selection area R which can be determined based on the coordinates of the two points of the first intermediate point M 1 and the second intermediate point M 2 .
- in the example in FIG. 5(a), the selection area R is a polygonal area a diagonal line of which is the line segment D connecting the first intermediate point M 1 and the second intermediate point M 2. More specifically, the selection area R has a hexagonal shape a diagonal line of which is the line segment D, in which a side extending from the first intermediate point M 1 in the X axis direction and a side extending from the second intermediate point M 2 in the X axis direction are parallel.
- the selection area R is a perfect circular area a diameter of which is the line segment D connecting the coordinate of the first intermediate point M 1 and the second intermediate point M 2 .
- the selection area R is an elliptical area a major axis of which is the line segment D connecting the coordinates of the first intermediate point M 1 and the second intermediate point M 2 .
- the length of a minor axis of the elliptical area may be a fixed value, or a value proportional to the major axis.
- the selection area R may be an elliptical area a minor axis of which is the line segment D connecting the coordinates of the first intermediate point M 1 and the second intermediate point M 2 .
- the shape of the selection area R can be adequately set according to use thereof.
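- as a minimal sketch, the point-in-area tests for the rectangular, circular and elliptical variants can be written as follows (coordinates are (x, y) tuples; the minor-axis ratio of the ellipse is an illustrative assumption):

```python
import math

def in_rect(m1, m2, p):
    """Axis-aligned rectangle whose diagonal is D = M1-M2 (the default shape)."""
    return (min(m1[0], m2[0]) <= p[0] <= max(m1[0], m2[0]) and
            min(m1[1], m2[1]) <= p[1] <= max(m1[1], m2[1]))

def in_circle(m1, m2, p):
    """Perfect circle whose diameter is the segment D = M1-M2."""
    cx, cy = (m1[0] + m2[0]) / 2.0, (m1[1] + m2[1]) / 2.0
    r = math.hypot(m2[0] - m1[0], m2[1] - m1[1]) / 2.0
    return math.hypot(p[0] - cx, p[1] - cy) <= r

def in_ellipse(m1, m2, p, ratio=0.5):
    """Ellipse whose major axis lies along D; the minor axis is taken as
    a fixed proportion of the major axis (`ratio` is illustrative)."""
    cx, cy = (m1[0] + m2[0]) / 2.0, (m1[1] + m2[1]) / 2.0
    a = math.hypot(m2[0] - m1[0], m2[1] - m1[1]) / 2.0   # semi-major axis
    if a == 0.0:
        return (p[0], p[1]) == (cx, cy)
    b = a * ratio                                        # semi-minor axis
    ang = math.atan2(m2[1] - m1[1], m2[0] - m1[0])
    dx, dy = p[0] - cx, p[1] - cy
    # Rotate p into the frame whose x axis lies along the segment D.
    u = dx * math.cos(ang) + dy * math.sin(ang)
    v = -dx * math.sin(ang) + dy * math.cos(ang)
    return (u / a) ** 2 + (v / b) ** 2 <= 1.0
```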
- in step S 14, the control unit 20 displays the defined selection area R on the touch panel display 10.
- the user can check the selection area R based on the display on the touch panel display 10 .
- in step S 15, the object selecting unit 4 of the control unit 20 decides whether or not there is an object in the selection area R on the display screen. The control unit 20 knows the positions at which the plurality of objects O are displayed on the screen. Consequently, by referring to the coordinate of each vertex of the selection area R on the display screen and the coordinates at which the objects O are located, it is possible to decide whether or not the objects O are included in the selection area R.
- objects O 1 to O 5 are displayed on the screen of the touch panel display 10 . Among these objects, the objects O 1 to O 3 are entirely or partially included in the selection area R. On the other hand, the objects O 4 and O 5 are entirely positioned outside the selection area R.
- the object selecting unit 4 of the control unit 20 decides that the objects O 1 to O 3 among a plurality of objects O 1 to O 5 are included in the selection area R.
- the flow proceeds to step S 16 .
- in step S 16, the object selecting unit 4 of the control unit 20 selects the one or more objects which are decided to be included in the selection area R.
- information related to the selected objects (for example, an identification number of each object) is temporarily stored in a working area of the memory unit 30.
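- steps S 15 and S 16 thus amount to a rectangle-overlap test between the selection area R and each object's bounds, followed by collecting the overlapping objects. A minimal sketch, assuming axis-aligned bounding rectangles and an object layout that loosely mirrors FIG. 3 (the coordinates are illustrative):

```python
def overlaps(rect_a, rect_b):
    """True when two axis-aligned rectangles, each given as
    (left, top, right, bottom), share at least part of their area."""
    al, at, ar, ab = rect_a
    bl, bt, br, bb = rect_b
    return al <= br and bl <= ar and at <= bb and bt <= ab

def select_objects(selection_r, objects):
    """Steps S15-S16: pick the objects at least part of which lie in R.
    `objects` maps an object identifier to its bounding rectangle."""
    return [oid for oid, bounds in objects.items()
            if overlaps(selection_r, bounds)]

objects = {"O1": (10, 10, 60, 60), "O2": (50, 40, 120, 90),
           "O3": (90, 90, 150, 150), "O4": (400, 10, 450, 60),
           "O5": (420, 300, 470, 350)}
selected = select_objects((0, 0, 160, 160), objects)  # -> ['O1', 'O2', 'O3']
```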
- meanwhile, when the touch input of the second point is not performed while the touch input of the first point continues or, even though the touch input of the second point is performed, a simultaneous slide input of the two points is not performed, the flow proceeds to step S 17.
- in step S 17, whether or not an object on the display screen is positioned at the touch point P 1 of the first point is decided, as conventionally performed. Specifically, the coordinate of the touch point P 1 of the first point and the coordinate at which the object is located on the display screen are compared and, when both coordinates match, the flow proceeds to step S 16.
- in this case, in step S 16, the object selecting unit 4 of the control unit 20 selects the one object the coordinate of which matches the touch point P 1 of the first point.
- the information related to the selected object is temporarily stored in the working area of the memory unit 30 .
- as described above, the control unit 20 performs processing of selecting one or a plurality of objects among the objects displayed on the touch panel display 10.
- after the control unit 20 selects an object, it is possible, for example by performing a drag operation in a state where the object is selected, to perform various known information processing such as moving the position of the object on the display screen.
- the game apparatus according to the present invention basically has a touch panel display. Further, the game apparatus can advance a game by selecting one or a plurality of objects displayed on the touch panel display according to the above method, and giving various commands to the selected objects.
- FIG. 6 is a block diagram illustrating a configuration example of the game apparatus. The embodiment illustrated by this block diagram can be suitably used for an arcade game apparatus in particular.
- FIG. 7 is a perspective view illustrating an example of the external appearance of the housing of the game apparatus.
- the game apparatus has the touch panel display 100 , a card reader 200 and a game body 300 .
- the touch panel display 100 is configured to display various items of image data as an image which the user can view, and detect the coordinates which the user touches on the display screen. Further, when a card with a predetermined identification code printed thereon is set on the card reader 200 , the card reader 200 is configured to read the identification code recorded in the card and acquire unique card information of the card.
- the game body 300 controls the entire function of the game apparatus. Particularly, the game body 300 can display an object on the touch panel display 100 based on the information read by the card reader 200 , and advance a game based on a card operation with respect to the card reader 200 and touch control with respect to the touch panel display 100 .
- the touch panel display 100 has the display 110 and the touch screen 120 .
- the touch panel display 100 is formed by disposing the touch screen 120, formed using a transparent material, in front of the display 110 which can display images.
- the display 110 is a display apparatus such as an LCD (Liquid Crystal Display) or an OELD (Organic Electro Luminescence Display).
- the display 110 outputs and displays various pieces of information which the user requires to use the game apparatus, as a still image or a movie according to an input signal from the game body 300.
- the touch screen 120 can detect contact of the user's hand or finger according to a known electrostatic capacitance method, electromagnetic induction method, infrared scan method, resistance film method or ultrasonic surface acoustic wave method, and obtain information about the coordinate of the touch position.
- the positional relationship between the display 110 and the touch screen 120 is mutually linked, and the touch screen 120 can acquire information about the coordinate of a touch position on the display screen displayed on the display 110 .
- the touch screen 120 can detect contact of the user's finger, and obtain the information about the coordinate on the screen of the display 110 which the user's finger contacted.
- the coordinate information acquired by the touch screen 120 is stored in a temporary memory unit 370 of the game body 300 .
- the touch screen 120 supports so-called multitouch: when the user touches a plurality of points, it acquires information about the coordinates of each of these points.
- the game apparatus according to the present invention preferably has a comparatively large touch panel display 100 mounted thereon.
- the touch panel display 100 preferably measures 10 to 75 inches, 16 to 40 inches or 20 to 38 inches.
- the card reader 200 is an apparatus which can capture an image of the identification code recorded on a card C, and has a panel 210, a light source 220 and an image sensor 230.
- an illustration of an object used in a game is printed on the surface of the card C, and an identification code for identifying the object printed on the surface is recorded on the back surface of the card C.
- the identification code is printed on the back surface of the card C using an ink which cannot be seen under visible light, and a black-and-white pattern appears when specific invisible light is radiated on the card.
- the identification code is printed using a special ink which absorbs invisible light such as infrared rays and, when infrared light is radiated on the back surface of the card C, the light radiated on portions other than the black portions of the identification code is reflected.
- the identification code of the card C has recorded therein at least an identification number of the object drawn on the card and information related to, for example, the orientation of the card.
- the panel 210 is provided on the upper surface of the card reader 200 , and a plurality of cards C can be set on the panel 210 . Further, inside the housing of the game apparatus, for example, a light source 220 which radiates infrared ray (invisible light) on the back surface of the card C set on the panel 210 , and an image sensor 230 which acquires the infrared ray reflected from the back surface of the card C set on the panel 210 and captures an image of a pattern of card data recorded in the card C are provided.
- the light source 220 is, for example, a light emitting diode (LED) which emits invisible light such as infrared ray or ultraviolet ray which is invisible to the eyes.
- the image sensor 230 is, for example, an image capturing element which captures an image of the identification code by means of infrared light which is reflected from the back surface of the card C and is incident on the card reader 200. Further, the card reader 200 can acquire the unique card information of the card C by analyzing this identification code. The card information acquired by the card reader 200 is transmitted to a processing unit 310 of the game body 300, and stored in the temporary memory unit 370.
- by analyzing the read identification code, the processing unit 310 of the game body 300 can learn the status, type, name and attribute of the object recorded on the card C and, moreover, the characteristics of the object matching the orientation or position of the card C.
- An example of an object is a game character.
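- in sketch form, this resolution is a table lookup keyed by the identification number decoded from the card; the table contents and field names below are illustrative assumptions, not taken from the patent:

```python
# Illustrative object table keyed by the identification number read from a card.
OBJECT_TABLE = {
    101: {"name": "Knight", "type": "character", "attribute": "defense"},
    102: {"name": "Archer", "type": "character", "attribute": "offense"},
}

def object_for_card(card_info):
    """Resolve the object drawn on a card from its decoded identification
    code; the card's orientation further modulates its characteristics."""
    entry = dict(OBJECT_TABLE[card_info["id"]])
    entry["orientation"] = card_info.get("orientation", "vertical")
    return entry

obj = object_for_card({"id": 101, "orientation": "horizontal"})
```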
- the image sensor 230 of the card reader 200 detects the position at which infrared light is reflected from the back surface of the card C, so that the processing unit 310 of the game body 300 can calculate the position at which the card C is set on the panel 210 as coordinate information. Furthermore, the image sensor 230 continuously detects the reflection positions of infrared light, so that it is possible to obtain information indicating that a card C set on the panel 210 has moved from one position to another.
- the panel 210 of the card reader 200 is preferably partitioned into a plurality of areas.
- the number of partitions of the panel 210 is, for example, 2 to 10.
- the panel 210 of the card reader 200 is divided into two areas: an offensive area A 1 (first area) and a defensive area A 2 (second area). These areas are partitioned according to the coordinates of the panel, and each card C can be slid between the offensive area A 1 and the defensive area A 2.
- the processing unit 310 of the game body 300 can decide which one of the offensive area A 1 and the defensive area A 2 the position of each card C belongs to.
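- a minimal sketch of this decision, assuming the panel is partitioned along one coordinate axis and that the boundary value is an illustrative constant:

```python
PANEL_BOUNDARY_Y = 300  # illustrative panel coordinate separating A1 and A2

def card_area(card_position):
    """Decide which partitioned area a detected card position belongs to."""
    x, y = card_position
    return "A1 (offensive)" if y < PANEL_BOUNDARY_Y else "A2 (defensive)"

card_area((120, 80))   # -> 'A1 (offensive)'
card_area((120, 450))  # -> 'A2 (defensive)'
```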
- the rectangular card C can be set vertically or horizontally on the panel 210 of the card reader 200 , and the processing unit 310 of the game body 300 can decide whether the card C is set vertically or horizontally, based on detection information from the card reader 200 .
- an identification code is printed on the back surface of the card C. This identification code includes information related to the orientation of the card. Consequently, the processing unit 310 of the game body 300 can decide whether the card C is set vertically or horizontally by reading the identification code by means of the card reader 200 and analyzing the orientation of the card C based on the read identification code.
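The area and orientation decisions above reduce to simple checks once the card reader has produced a card coordinate and the code has been decoded. The following is a minimal sketch only; the names (Card, classify_area, boundary_y) are illustrative assumptions and are not taken from the patent:

```python
# Hypothetical sketch: classifying a detected card by panel area and
# by orientation. A single coordinate boundary splits the panel into
# the offensive and defensive areas described above.
from dataclasses import dataclass

OFFENSIVE, DEFENSIVE = "A1", "A2"

@dataclass
class Card:
    x: float        # detected coordinate on the panel 210
    y: float
    width: float    # card dimensions recovered via the identification code
    height: float

def classify_area(card: Card, boundary_y: float) -> str:
    # The areas are partitioned according to the panel coordinate.
    return OFFENSIVE if card.y < boundary_y else DEFENSIVE

def is_vertical(card: Card) -> bool:
    # A rectangular card is "vertical" when its long side runs upward.
    return card.height >= card.width

card = Card(x=120.0, y=40.0, width=59.0, height=86.0)
print(classify_area(card, boundary_y=100.0), is_vertical(card))
```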
- the game body 300 has the processing unit 310, which reads and executes a game program and controls the entire operation of the game apparatus according to the game program. As illustrated in FIG. 6, the game body 300 has the following configuration.
- the processing unit 310 performs various processing such as control of the entire system, an instruction to give a command to each block in the system, game processing, image processing and audio processing.
- the function of the processing unit 310 can be realized by hardware such as various processors (for example, a CPU or a DSP) or an ASIC (for example, a gate array), or a given program (game program).
- the processing unit 310 may include a game processing unit 320, an image processing unit 330 and an audio processing unit 350. More specifically, the processing unit 310 includes a main processor, a coprocessor, a geometry processor, a drawing processor, a data processing processor, and an arithmetic circuit for the four basic operations or a general-purpose arithmetic circuit. These processors and circuits are appropriately coupled through a bus, and can transmit and receive signals. Further, the processing unit 310 may have a data decompression processor for decompressing compressed information.
- the game processing unit 320 performs various processing based on input data from the touch screen 120, the card reader 200 and the operating unit 360, and on personal data, stored data and a game program from a mobile information storage apparatus 392. Such processing includes: displaying an object on the display 110 based on card information acquired by the card reader 200; scrolling the position of a view point (the position of a virtual camera) or an angle of view (a rotation angle of the virtual camera) on the display 110; arranging an object such as a map object in object space; selecting an object; moving an object (motion processing); calculating the position or the rotation angle of an object (the rotation angle around the X, Y or Z axis); receiving coins (a price); setting various modes; advancing the game; setting a selection screen; hit check processing; computing a game result (achievement or score); allowing a plurality of players to play in common game space; and game-over processing.
- the image processing unit 330 performs various image processing according to, for example, an instruction from the game processing unit 320 .
- the game processing unit 320 reads image information of an object and game space from the game information memory unit 380 based on information about the position of a view point and an angle of view, and writes the read image information in the temporary memory unit 370 .
- the game processing unit 320 supplies scroll data for moving the view point to the image processing unit 330 .
- the image processing unit 330 reads image information per frame from the temporary memory unit 370 based on given scroll data, and has the display 110 display images of the object and the game space according to the read image information. By this means, the display 110 displays the object and the game space based on the view point.
- the image processing unit 330 moves the view point in the game space according to the coordinate inputted to the touch screen 120. Furthermore, the image processing unit 330 reads frames from the temporary memory unit 370 based on the information about the moving view point, and has the display 110 display the read images. Thus, by scrolling the view point in the game space, the display screen transitions.
- the image processing unit 330 reads from the temporary memory unit 370 the card information acquired by the card reader 200, and refers to the object table stored in the game information memory unit 380 based on this card information. Furthermore, the image processing unit 330 reads the image data of the object associated with the card information from the temporary memory unit 370 or the game information memory unit 380 based on link information stored in the object table. Still further, the image processing unit 330 generates the object in the game space according to the read image data of the object, and has the display 110 display the object.
- the game processing unit 320 controls the behavior of each object which appears in the game space, based on the information about the coordinate inputted to the touch screen 120, the orientation or the position of the card set on the card reader 200, and other operation information from the operating unit 360 (a lever, a button or a controller). For example, the game processing unit 320 refers to the coordinate information of the object displayed on the display 110 and the coordinate information inputted to the display 110, and decides whether or not the user touches the object. That is, the game processing unit 320 decides that the user has touched and selected the object when the position information inputted to the touch screen 120 and the position information of the object match. Further, when an operation or an instruction is given to the selected object, processing matching the game program is performed according to the operation or the instruction.
- the game processing unit 320 preferably performs selection processing unique to the present invention when the object displayed on the display 110 of the touch panel display 100 is selected. That is, the game processing unit 320 determines the coordinate of the first intermediate point calculated from coordinates of first two points based on the coordinates of the first two points simultaneously inputted to the touch screen 120 of the touch panel display 100 . Further, the game processing unit 320 determines the coordinate of the second intermediate point calculated from coordinates of last two points based on the coordinates of the last two points detected immediately before the two coordinates of the two points stop being simultaneously inputted.
- the game processing unit 320 specifies an area on the display screen of the display 110 based on the coordinate of the first intermediate point and the coordinate of the second intermediate point, and selects one or a plurality of objects whose images are displayed such that at least part of each object is included in the specified area. Still further, when an operation or an instruction is given to the selected objects, the game processing unit 320 performs processing matching the game program according to the operation or the instruction. When, for example, one or a plurality of objects are selected according to the input operation with respect to the touch screen 120 and different coordinate information is then inputted to the touch screen 120 again, the game processing unit 320 performs control of moving the one or the plurality of selected objects to the position of the newly inputted coordinate. Thus, the game processing unit 320 preferably advances the game by linking the card information acquired by the card reader 200 and the coordinate information inputted to the touch screen 120.
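As a rough illustration of the selection processing just described, the sketch below computes the two intermediate points as plain midpoints, spans a rectangle between them, and selects every object whose bounds overlap that rectangle at least partially. All names are hypothetical; the patent does not prescribe an implementation:

```python
# Illustrative sketch of two-point area selection, under the assumption
# that each intermediate point is the exact midpoint of a touch pair.
def midpoint(p, q):
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

def selection_rect(first_pair, last_pair):
    m1 = midpoint(*first_pair)   # first intermediate point
    m2 = midpoint(*last_pair)    # second intermediate point
    left, right = sorted((m1[0], m2[0]))
    top, bottom = sorted((m1[1], m2[1]))
    return left, top, right, bottom

def overlaps(rect, bounds):
    # Partial overlap suffices: "at least part of each object".
    l, t, r, b = rect
    bl, bt, br, bb = bounds
    return bl < r and br > l and bt < b and bb > t

objects = {"O1": (10, 10, 30, 30), "O2": (200, 200, 230, 230)}
rect = selection_rect(((0, 0), (20, 10)), ((150, 90), (170, 110)))
print([name for name, bounds in objects.items() if overlaps(rect, bounds)])
# -> ['O1']: only O1 lies partly inside the rectangle spanned by M1 and M2
```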
- the audio processing unit 350 emits various sounds according to, for example, an instruction from the game processing unit 320.
- Functions of the game processing unit 320 , the image processing unit 330 and the audio processing unit 350 may all be realized by hardware or may all be realized by programs. Alternatively, these functions may be realized by both of the hardware and the programs.
- the image processing unit 330 may have a geometry processing unit 332 (three-dimensional coordinate computing unit) and a drawing unit 340 (rendering unit).
- the geometry processing unit 332 performs various geometry computations (three-dimensional coordinate computations) such as coordinate transformation, clipping processing, perspective transformation and light source calculation. Further, the object data for which geometry processing (perspective transformation) has been performed (for example, vertex coordinates, vertex texture coordinates or brightness data of the object) is stored and kept in a main memory 372 of the temporary memory unit 370.
- the drawing unit 340 draws the object in a frame buffer 374 based on the object data for which geometry computation has been performed (perspective transformation has been performed) and a texture and the like stored in the texture memory unit 376 .
- the drawing unit 340 may include, for example, a texture mapping unit 342 and a shading processing unit 344 . More specifically, the drawing unit 340 can be implemented by a drawing processor.
- the drawing processor is connected to the texture memory unit, various tables, a frame buffer and a VRAM via a bus and the like, and is further connected with the display.
- the texture mapping unit 342 reads an environment texture from the texture memory unit 376, and maps the read environment texture on the object.
- the shading processing unit 344 performs shading processing with respect to the object.
- the geometry processing unit 332 performs the light source calculation, and calculates the brightness (RGB) of each vertex of the object based on information about the light source for shading processing, an illumination model and the normal vector of each vertex of the object.
- the shading processing unit 344 calculates the brightness of each dot of a primitive surface (a polygon or a curved surface) from the brightness of each vertex according to Phong shading or Gouraud shading.
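For illustration only, the following sketch computes a per-vertex brightness with a simple Lambert illumination model; the model choice and all names are assumptions, not the patent's specification. Gouraud shading would then interpolate these vertex values across the primitive surface:

```python
# Minimal sketch of a per-vertex light source calculation of the kind
# the geometry processing unit could perform (assumed Lambert model).
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / n for c in v)

def vertex_brightness(normal, light_dir, ambient=0.2):
    n, l = normalize(normal), normalize(light_dir)
    diffuse = max(0.0, sum(a * b for a, b in zip(n, l)))  # N . L
    return min(1.0, ambient + (1.0 - ambient) * diffuse)

print(vertex_brightness(normal=(0, 0, 1), light_dir=(0, 0.5, 1)))
```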
- the geometry processing unit 332 may include a normal vector processing unit 334 .
- the normal vector processing unit 334 may perform processing of rotating the normal vector of each vertex of the object (in a broad sense, a normal vector on a plane of the object) according to a rotation matrix from the local coordinate system to the world coordinate system.
- the operating unit 360 allows a player to input operation data.
- the function of the operating unit 360 is realized by hardware such as a lever and a button. Operation information from the operating unit 360 is transmitted to the main processor through a serial interface (I/F) or the bus.
- the game information memory unit 380 stores game programs, objects displayed on the display 110 and information related to image data in game space.
- the game information memory unit 380 is realized by, for example, a ROM or a non-volatile storage medium such as an optical disk (CD or DVD), a magneto-optical disk (MO), a magnetic disk, a hard disk or a magnetic tape.
- the processing unit 310 performs various processing based on information stored in this game information memory unit 380 .
- the game information memory unit 380 stores information (programs or the programs and data) for executing means (a block included in the processing unit 310 in particular) of the present invention (the present embodiment). Part or all of information stored in the game information memory unit 380 may be written to the temporary memory unit 370 when, for example, the system is turned on.
- the information stored in the game information memory unit 380 includes, for example, at least two of: a program code for performing predetermined processing; image data; audio data; shape data of a display object; table data; list data; information for instructing the processing of the present invention; and information for performing processing according to the instruction.
- the table data includes data of an object table which stores a status, a type, a name and an attribute of an object, and characteristics of the object matching the orientation or the position of the card, in association with an identification number of the object.
- the status of the object is information in which, for example, a moving speed, a hit point, offense power and defense power are stored as numerical values.
- the game processing unit 320 can decide superiority and inferiority of, for example, the moving speed, the hit point and the offense power of each object by referring to the status stored in the object table. Further, the game processing unit 320 can perform various computations for advancing the game based on these numerical values related to the status. For example, the numerical values of the moving speeds of objects can be compared, and, by referring to the object table, it is possible to learn which of a given object and another object has the faster moving speed. Further, by performing predetermined computation processing based on the numerical value of the moving speed of each object, it is possible to calculate the time required for the object to move from a given point to another point in the game space.
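For instance, under the stated assumption that the object table stores the moving speed as a plain numerical value, the travel time between two points reduces to distance divided by speed, as in this illustrative sketch (names are hypothetical):

```python
# Sketch: travel time from a stored per-object moving speed.
import math

object_table = {"O1": {"moving_speed": 40.0}, "O2": {"moving_speed": 25.0}}

def travel_time(object_id, start, goal):
    distance = math.dist(start, goal)
    return distance / object_table[object_id]["moving_speed"]

print(travel_time("O1", (0, 0), (120, 160)))  # 200 units at speed 40 -> 5.0
```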
- the characteristics of the object matching the orientation of the card are data which change according to the orientation of the card set on the panel 210 of the card reader 200 .
- the object table stores information which differs according to whether the card is set vertically or horizontally. For example, the status of the object may change depending on whether the card is set vertically or horizontally.
- the characteristics of the object matching the position of the card are data which change according to the position at which the card is set on the panel 210 of the card reader 200 .
- the object table stores information which differs according to whether the card is positioned in the offensive area A1 (first area) or in the defensive area A2 (second area). For example, the status of the object may change depending on which of the two areas the card is positioned in.
- the game information memory unit 380 stores data related to game space.
- the game space means the world of a game in the game apparatus according to the present invention, and is also referred to as a "world".
- the data related to the game space includes position information of a target object to be displayed, information related to the type of the target object to be displayed and image data of the target object to be displayed.
- the target object to be displayed is, for example, a background, a building, a landscape, a plant and an object appearing in a game.
- This image data is preferably stored as polygon data.
- the polygon data includes, for example, vertex coordinate data, color data, texture data and transparency data.
- the game information memory unit 380 classifies and stores a target object to be displayed according to the orientation of a view point, a position and an area of a player character.
- the audio output unit 390 outputs an audio.
- the function of the audio output unit 390 can be realized by hardware such as a speaker.
- an audio output is subjected to audio processing by a sound processor connected to, for example, the main processor through the bus, and is outputted from the audio output unit such as the speaker.
- the mobile information storage apparatus 392 stores, for example, personal data of a player and saved data.
- This mobile information storage apparatus 392 may be, for example, a memory card or a mobile game apparatus.
- a function of the mobile information storage apparatus 392 can be achieved by known storage means such as a memory card, a flash memory, a hard disk or a USB memory. Meanwhile, the mobile information storage apparatus 392 is not an essential component, and may be provided when a player needs to be identified.
- the communication unit 394 is an arbitrary unit which performs various controls for communicating with the outside (for example, a host server or another game apparatus). By connecting the game apparatus with a host server on a network or with another game apparatus through the communication unit 394, it is possible to play a versus match or a cooperative play of a game.
- the function of the communication unit 394 can be realized by various processors, hardware such as a communication ASIC or a program. Further, a program or data for executing a game apparatus may be distributed from an information storage medium of a host apparatus (server) to the game information memory unit 380 through the network and the communication unit 394 .
- the game apparatus can play a match game using a communication network such as the Internet.
- in this match game, each game user plays a match by having a plurality of objects (game characters) appear in one game space.
- the user performs instruction operations such as appearance, movement, offense and defense of each object through, for example, the touch panel display 100 and the card reader 200, to beat enemy objects (Enemy), conquer a tower and break a stone.
- FIG. 9 conceptually illustrates states of the touch panel display 100 and the card reader 200 when a game is actually played using the game apparatus according to the present invention.
- the user sets desired cards C 1 to C 7 on the card reader 200 .
- the identification code is printed on the back surface of each of the cards C 1 to C 7 .
- the card reader 200 analyzes card information based on the identification code, and transmits the card information to the processing unit 310 of the game apparatus. Further, the card reader 200 can learn the orientation and the position of each of the cards C1 to C7.
- In the example illustrated in FIG. 9, the cards C1, C2, C5 and C6 are vertically set, and the cards C3, C4 and C7 are horizontally set. Further, in the example illustrated in FIG. 9, on the card reader 200, the cards C1, C4 and C6 are positioned in the offensive area A1, and the cards C2, C3, C5 and C7 are positioned in the defensive area A2.
- Information detected by the card reader 200 is transmitted to the processing unit 310 , and the processing unit 310 refers to the object table stored in the game information memory unit 380 (or the temporary memory unit 370 ) based on the card information and information about, for example, the orientation of the card and the position of the card, and reads information (for example, image data and a status) about the object associated with the card information.
- the processing unit 310 has the touch panel display 100 display images based on the read image data. For example, the touch panel display 100 displays the images of the cards in a lower area of the touch panel display 100 .
- the images of the cards displayed on the touch panel display 100 match the arrangement order of the cards C1 to C7 set on the card reader 200 and the orientation of each of the cards C1 to C7.
- the touch panel display 100 can display information regarding the position of each of the cards C1 to C7 set on the card reader 200 (for example, whether the card is positioned in the offensive area A1 or the defensive area A2).
- the objects (game characters) O 1 to O 6 associated with the cards C 1 to C 6 are displayed on the display screen of the touch panel display 100 .
- Each object has a unique status, a type, a name, an attribute and object characteristics matching the orientation or the position of the card.
- the status of the object is information in which, for example, a moving speed, a hit point, offense power and defense power are stored as numerical values. These pieces of information are stored in the object table in association with the identification information of each object. For example, objects whose cards are vertically set can be made to carry out normal offenses, while objects whose cards are horizontally set can be made to carry out special offenses.
- the object (O 7 ) associated with the card C 7 is not displayed on the touch panel display 100 .
- when the image of the card C7 displayed on the touch panel display 100 is touched and dragged to the position at which a call gate is displayed, the object (O7) associated with the card C7 appears in the game space and is displayed on the touch panel display 100.
- the positional coordinates of the call gate in the game space are stored in the game information memory unit 380 or the temporary memory unit 370, and the game processing unit 320 can thereby keep track of the position of the call gate.
- FIG. 10 illustrates an example of an operation of moving objects displayed on the touch panel display 100 .
- when the user touches the display screen, the touch panel display 100 obtains the coordinate of the touch position.
- the processing unit 310 refers to the coordinate of the touch position and the coordinate at which the object is displayed, and decides whether or not the coordinate of the touch position and the coordinate at which the object is displayed match.
- when the coordinate of the touch position and the coordinate at which the object is displayed match, the processing unit 310 learns that the object is selected. In the example in FIG. 10(a), the object O4 is selected.
- the processing unit 310 stores the coordinate of the touch position in the temporary memory unit.
- the processing unit 310 stores the coordinates of the touch positions in the temporary memory unit together with information about the touch order. Further, the processing unit 310 performs processing of moving the object O4 touched and selected by the user to the point which the user touches next. Since the moving speed varies per object, the processing unit 310 reads the numerical value of the moving speed related to the object O4 from the object table, and the object O4 is moved from the first point to the moving destination point based on the read numerical value of the moving speed. Furthermore, when a plurality of points is touched, the selected object O4 is sequentially moved to each point according to the touch order. In addition, when the moving object O4 encounters an enemy object during movement or arrives at a tower, processing of playing a match with the enemy object or processing of conquering the tower need only be performed in the same manner as in a known game system.
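A minimal sketch of such movement processing, assuming the touched points are kept as an ordered queue and the object advances by its own speed each frame (all names are illustrative, not from the patent):

```python
# Sketch: step an object toward queued touch points, in touch order,
# at a per-object speed read from the object table.
import math

def advance(position, waypoints, speed, dt=1.0 / 60.0):
    """Move `position` along the queued `waypoints` by `speed * dt`."""
    x, y = position
    budget = speed * dt
    while waypoints and budget > 0.0:
        tx, ty = waypoints[0]
        d = math.hypot(tx - x, ty - y)
        if d <= budget:
            x, y = tx, ty            # reached this point; take the next one
            waypoints.pop(0)
            budget -= d
        else:
            x, y = x + (tx - x) * budget / d, y + (ty - y) * budget / d
            budget = 0.0
    return (x, y)

pos, queue = (0.0, 0.0), [(30.0, 40.0), (90.0, 40.0)]
for _ in range(3):                   # three frames of movement
    pos = advance(pos, queue, speed=300.0)
print(pos, queue)
```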
- As illustrated in FIGS. 11(a) and 11(b), the game system can collectively select a plurality of objects on the display screen, and can form a party of the selected objects.
- FIG. 11 ( a ) illustrates an example of an operation of collectively selecting a plurality of objects O 1 to O 7 .
- the operation illustrated in FIG. 11( a ) is basically the same as the operation illustrated in FIG. 3 .
- the processing unit 310 determines the coordinate of the first intermediate point M 1 calculated from coordinates of the first two points P 1 and P 2 based on the coordinates of the first two points P 1 and P 2 simultaneously inputted to the touch panel display 100 by the user.
- the processing unit 310 determines the coordinate M2 of the second intermediate point, calculated from the coordinates of the last two points P3 and P4 detected immediately before the coordinates of the two points, dragged by the user, stop being simultaneously inputted. Furthermore, the processing unit 310 specifies a rectangular area on the display screen of the touch panel display 100 based on the coordinate of the first intermediate point M1 and the coordinate M2 of the second intermediate point. Still further, the displayed objects O1 to O7 are selected such that at least part of each object is included in the specified rectangular area. The selected objects O1 to O7 are stored in the temporary memory unit as objects which form a party.
- FIG. 11( b ) illustrates an example where a plurality of selected objects O 1 to O 7 forms a party.
- a plurality of collectively selected objects O 1 to O 7 gather at one site and form a party.
- the processing unit 310 performs processing of moving the party formed by the user's operation to the point touched next. That is, the objects O1 to O7 form the party and move in a massed state. In this case, the moving speeds of the objects which form the party differ from one another.
- the processing unit 310 reads from the object table a numerical value of the moving speed of each object which forms a party, and calculates the moving speed of the party based on the numerical value of the read moving speed of each object.
- the processing unit 310 may simply use the slowest moving speed (the lowest numerical value) among the objects which form the party as the moving speed of the entire party. Further, the processing unit 310 may use the average value of the moving speeds of the plurality of objects which form the party as the moving speed of the entire party. Furthermore, the processing unit 310 can also use the fastest moving speed (the highest numerical value) among the objects which form the party as the moving speed of the entire party. The party of objects moves to the specified point at the moving speed matching the value calculated by the processing unit 310. When a plurality of points is specified, the processing unit 310 moves the party of objects sequentially to each point according to the specified order. When the moving party of objects encounters an enemy object during movement or arrives at a tower, processing of playing a match with the enemy object or processing of conquering the tower need only be performed in the same manner as in a known game system.
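The three party-speed policies described here can be sketched as follows (the table and function names are assumptions for illustration):

```python
# Sketch: deriving the party's speed from its members' speeds under
# the slowest / average / fastest policies named in the text.
speeds = {"O1": 40.0, "O2": 25.0, "O3": 55.0}

def party_speed(members, policy="slowest"):
    values = [speeds[m] for m in members]
    if policy == "slowest":
        return min(values)           # the party waits for its slowest member
    if policy == "average":
        return sum(values) / len(values)
    if policy == "fastest":
        return max(values)
    raise ValueError(policy)

party = ["O1", "O2", "O3"]
print(party_speed(party), party_speed(party, "average"), party_speed(party, "fastest"))
```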
- a characteristic system of the game executed by the game apparatus according to the present invention has been mainly described above.
- for other game processing, the techniques of a known game apparatus which has a card reader, or of a known game apparatus which has a touch panel display, can be adequately applied.
- the present invention can be suitably used in, for example, a computer industry and a game industry.
Abstract
Description
- The present invention relates to an information processing apparatus, a method for information processing, and a game apparatus which can specify an area on a display screen. More specifically, the present invention relates to, for example, an information processing apparatus which can perform processing of selecting one or a plurality of objects included in a specified area by specifying the area on the display screen.
- Conventionally, a technique of specifying a rectangular area by sliding an input position on a display screen using a pointing device such as a touch screen or a mouse is well known. By specifying a rectangular area on a display screen in this way, it is possible to select one or a plurality of objects included in the specified area, or to enlarge, cut or edit an image in the specified area. According to such a conventional technique, a pointing device generally performs a touch input (click input) of one point on a display screen, slides (swipes) the touch position while continuing the touch input on the screen, and then releases the touch input to specify a rectangular area whose diagonal is the line segment connecting the first touch position and the last touch position.
- Further, Patent Literature 1 discloses, for example, a technique of selecting a plurality of objects on a screen using a touch panel display. According to the technique disclosed in Patent Literature 1, a touch state is detected when touch control of the touch panel display arranged overlaid in front of a monitor which can display a plurality of objects is performed, the size of a selection area which defines a range to select objects is determined according to the detected touch state, the selection area is set based on the touch position, and objects at least part of which overlap the set selection area are used as selection candidates.
- Patent Literature 1: JP 2010-20608 A
- By the way, in recent years, touch panel displays can be made larger, and their usage is diversifying. For example, a game apparatus is also being developed which displays a game image on a touch panel display and enables a user to perform inputting operations related to game progress by performing touch control of the touch panel by the finger. Particularly, by mounting a large touch panel display on a game apparatus and executing an action game, it is possible to provide users with a realistic game experience.
- When an action game is provided using such a large touch panel display, users need to incessantly perform touch control. Methods of touch control of a touch panel display include, for example, an operation of touching one point of the screen, an operation of tracing the screen by the finger, and an operation of tapping the screen by the finger. These operations are assigned different commands (instructions), and a user advances a game by selecting among these operating methods and repeatedly inputting various commands. Particularly, in a game apparatus which advances a game by selecting objects displayed on a display screen, if the user desires to select one object, the user touches the point on the screen at which this object is displayed, and, if the user desires to select a plurality of objects, the user touches the screen at the position of each of the respective objects.
- Meanwhile, when the user desires to select multiple objects from the objects displayed on the screen, touching the desired objects one by one is troublesome and undermines the operability of the game. Hence, a technique has also been proposed which, when the touch state of one point on the touch panel display continues, determines a selection area based on the trace along which the user's finger slides on the screen and selects the objects included in this selection area.
- However, according to the technique disclosed in Patent Literature 1, not only the operation of selecting one object but also the operation of selecting a plurality of objects is basically performed based on the touch state of one finger. Thus, the operation of selecting one object and the operation of selecting a plurality of objects are distinguished based only on whether or not the touch state of one finger continues, and therefore users easily make touch control mistakes. That is, if it is detected by mistake that a touch state on the touch panel display continues although the user intends to select one object, another unintended object is selected. By contrast, if it is not detected that a touch state on the touch panel display continues although the user intends to select a plurality of objects, only one object is selected. Further, when the operation of tracing the screen of the touch panel display by one finger (the operation of sliding the finger) is assigned another command, the user is even more likely to make an erroneous operation.
- Hence, a technique is currently demanded which can prevent a user's erroneous operation by effectively distinguishing an inputting operation of specifying an area on a display screen from a touching or click inputting operation of specifying one point and from an operation of performing sliding while continuing inputting one point. Particularly, when a game which requires an operation of incessantly touching a large touch panel display is assumed, a technique is demanded which relates to an inputting operation of easily specifying a wide range of an area on the display screen by one hand.
- Hence, as a result of devoted study of means for solving the above problem, the inventors of the present invention found that, by fixing a predetermined area on the display screen based on an intermediate coordinate calculated from the coordinates of the two points simultaneously inputted first to a coordinate input unit and an intermediate coordinate calculated from the coordinates of the two points simultaneously inputted last to the coordinate input unit, it is possible to effectively distinguish an inputting operation of specifying an area on the display screen from a touching or click inputting operation of specifying one point and from an operation of performing sliding while continuing inputting one point. Moreover, the inventors found that the above operating method makes it easy to specify a wide range of an area on the display screen by one hand. Based on this knowledge, the inventors arrived at a solution to the problem of the conventional technique and made the present invention.
- More specifically, the present invention employs the following configuration.
- A first aspect of the present invention relates to an information processing apparatus.
- The information processing apparatus according to the present invention has: a display unit 1 which can display an image; a coordinate input unit 2 which receives an input of a coordinate on a display screen of the display unit 1; and an area specifying unit 3 which specifies an area R on the display screen of the display unit 1 based on the coordinate inputted to the coordinate input unit 2.
- The area specifying unit 3 first determines the coordinate of a first intermediate point M1, calculated from the coordinates of the first two points simultaneously inputted to the coordinate input unit.
- Further, the area specifying unit 3 determines the coordinate of a second intermediate point M2, calculated from the coordinates of the last two points detected immediately before the coordinates of the two points stop being simultaneously inputted.
- Furthermore, the area specifying unit 3 fixes the area R on the display screen of the display unit 1 based on the coordinate of the first intermediate point M1 and the coordinate of the second intermediate point M2.
- According to the above configuration, by calculating a first intermediate coordinate of one point from the coordinates of the two points simultaneously inputted first and a second intermediate coordinate from the coordinates of the two points simultaneously inputted last, the information processing apparatus according to the present invention defines the area on the display screen based on the first intermediate coordinate and the second intermediate coordinate. Thus, the present invention requires that an inputting operation of specifying an area on the display screen be an operation of simultaneously inputting the coordinates of two points to the coordinate input unit, so that it is possible to effectively distinguish an inputting operation of specifying the area on the display screen from a touching or click inputting operation of specifying, for example, one point, and from an operation of performing sliding while continuing inputting one point. Consequently, according to the present invention, it is possible to prevent a user's erroneous operation.
- Further, when, for example, a touch panel display is assumed, the present invention can obtain the first intermediate coordinate from touch positions of two fingers, obtain a second intermediate coordinate from touch positions released after the two fingers slide, and define the area on the display screen based on these first intermediate coordinate and second intermediate coordinate, so that the user can easily specify a wide range of an area on the display screen by one hand.
- Thus, the present invention proposes a new operating method of the information processing apparatus which has the display unit and a coordinate input unit. According to the present invention, it is possible to improve operability of the information processing apparatus.
- In the information processing apparatus according to the present invention, the display unit 1 may be able to display a plurality of objects O on the display screen. In this case, the information processing apparatus according to the present invention preferably further has an object selecting unit 4 which selects one or a plurality of objects O at least part of which is included in the area R specified by the area specifying unit 3.
- In the information processing apparatus according to the present invention, the
display unit 1 and thecoordinate input unit 2 preferably form atouch panel display 10 with thecoordinate input unit 2 overlaid in front of thedisplay unit 1. Thetouch panel display 10 is an apparatus which detects a point on thecoordinate input unit 2 which the user's hand and finger contacts as a coordinate on the screen of thedisplay unit 1. - In view of the above configuration, by forming the touch panel display using the display unit and the coordinate input unit, the user can specify a predetermined area on the display area by an intuitive operation.
- In the present invention, the area R may be, for example, a polygonal area a diagonal line of which is a line segment connecting the first intermediate point M1 and the second intermediate point M2, a circular area a diameter of which is the line segment connecting the first intermediate point M1 and the second intermediate point M2, and an elliptical area a major axis or a minor axis of which is the line segment connecting the first intermediate point M1 and the second intermediate point M2.
- A second aspect of the present invention relates to a method for information processing executed by the information processing apparatus according to the first aspect.
- That is, the method for information processing according to the present invention specifies an area on a display screen of the
display unit 1 based on a coordinate detected by the coordinate input unit 2 which receives an input of the coordinate on the display screen of the display unit 1.
- The method for information processing according to the present invention first performs a step of determining the coordinate of the first intermediate point M1, calculated from the coordinates of the first two points simultaneously inputted to the coordinate input unit 2.
- Next, the method for information processing according to the present invention performs a step of determining the coordinate of the second intermediate point M2, calculated from the coordinates of the last two points detected immediately before the coordinates of the two points stop being simultaneously inputted.
- Further, the method for information processing according to the present invention performs a step of defining the area R on the display screen of the display unit 1 based on the coordinate of the first intermediate point M1 and the coordinate of the second intermediate point M2.
- A third aspect of the present invention relates to a game apparatus.
- A game apparatus according to the present invention has: a touch panel display 100; a card reader 200; and a game body 300 which advances a game by reading information by means of the card reader 200 and displaying the information on the touch panel display 100.
- The touch panel display 100 has: a display 110 which can display an image; and a touch screen 120 which is overlaid in front of the display 110 and through which a coordinate on a display screen is inputted.
- The card reader 200 has: a panel 210 on which a card with a code having predetermined card information printed thereon is set; and an image sensor 230 which reads the code of the card set on the panel 210 and detects the card information.
- Further, the game body 300 has: a game information memory unit 380 which stores information related to an object O in association with the card information; an image processing unit 330 which reads the information related to the object O from the game information memory unit 380 based on the card information detected by the image sensor 230 of the card reader 200, and performs control of displaying an image of the read object O on the display 110 of the touch panel display 100; and a game processing unit 320.
- The game processing unit 320 first determines the coordinate of the first intermediate point M1, calculated from the coordinates of the first two points simultaneously inputted to the touch screen 120 of the touch panel display 100. Further, the game processing unit 320 determines the coordinate of the second intermediate point M2, calculated from the coordinates of the last two points detected immediately before the coordinates of the two points stop being simultaneously inputted. Furthermore, the game processing unit 320 specifies the area R on the display screen of the display 110 based on the coordinate of the first intermediate point M1 and the coordinate of the second intermediate point M2. Still further, the game processing unit 320 selects one or a plurality of objects (O) whose images are displayed such that at least part of each object is included in the specified area R, to advance the game.
- In view of the above configuration, the game apparatus according to the present invention has the card reader and the touch panel display mounted thereon, and advances a game by combining these operations. For example, a user of the game operates by one hand a card set on the card reader to display one or a plurality of objects on the touch panel display, and performs touch control of the touch panel display by the other hand to select the displayed objects. Thus, a game apparatus which requires selection of an object by operating the touch panel display by one hand can improve the operability of games by adopting the above object selecting operation.
- According to the above configuration, by calculating a first intermediate coordinate of one point from the coordinates of the two points simultaneously inputted first and a second intermediate coordinate from the coordinates of the two points simultaneously inputted last, the area on the display screen is defined based on the first intermediate coordinate and the second intermediate coordinate. Thus, the present invention requires that the inputting operation of specifying the area on the display screen is an operation of simultaneously inputting coordinates of two points to a coordinate input unit. Consequently, the present invention can effectively distinguish an inputting operation of specifying the area on the display screen from a touching or click inputting operation of specifying one point and an operation of performing sliding while continuing inputting one point.
- Further, when, for example, a touch panel display is assumed, the present invention can obtain the first intermediate coordinate from touch positions of two fingers, obtain a second intermediate coordinate from touch positions released after the two fingers slide, and define the area on the display screen based on these first intermediate coordinate and second intermediate coordinate. Consequently, according to the present invention, the user can easily specify a wide range of an area on the display screen by one hand.
- FIG. 1 is a functional block diagram illustrating an information processing apparatus according to the present invention.
- FIG. 2 illustrates a flow of processing executed by the information processing apparatus according to the present invention.
- FIG. 3 is a schematic diagram illustrating an object selecting operation.
- FIG. 4 illustrates an example of a first intermediate point and a second intermediate point calculated from the coordinates of two points.
- FIGS. 5(a), 5(b) and 5(c) illustrate examples of an area calculated from the coordinates of the first intermediate point and the second intermediate point.
- FIG. 6 is a block diagram illustrating a configuration example of a game apparatus according to the present invention.
- FIG. 7 is a perspective view illustrating an example of the outer appearance of the game apparatus according to the present invention.
- FIG. 8 schematically illustrates a card reader on which a plurality of cards are set.
- FIG. 9 is a perspective view illustrating an example of the outer appearance of the game apparatus according to the present invention.
- FIGS. 10(a) and 10(b) are views for describing an example of a game executed by the game apparatus according to the present invention.
- FIGS. 11(a) and 11(b) are views for describing an example of a game executed by the game apparatus according to the present invention.
- An embodiment for implementing the present invention will be described below with reference to the drawings. The present invention is by no means limited to the embodiment described below, and incorporates embodiments obtained by adequately modifying the following embodiment in a range obvious to one of ordinary skill in the art.
- First, a basic configuration of an information processing apparatus according to the present invention will be described. The information processing apparatus according to the present invention can specify a predetermined area on a display screen, and perform various information processing of the specified predetermined area. For example, the information processing apparatus according to the present invention can select one or a plurality of objects included in the predetermined area of the specified display screen and move a position of the selected object, that is, give an arbitrary command to the selected object. Further, for example, the information processing apparatus according to the present invention can perform editing processing of, for example, specifying an image included in the predetermined area on the display screen and enlarging and displaying the image in the area or cutting the image in the area. However, use of the predetermined area specified by the present invention is by no means limited to these.
- The information processing according to the present invention will be described below using an embodiment as an example where one or a plurality of objects included in the specified predetermined area on the display screen are selected.
FIG. 1 is a block diagram illustrating a basic functional configuration of the information processing apparatus according to the present invention. As illustrated inFIG. 1 , the information processing apparatus according to the present invention has atouch panel display 10, acontrol unit 20 and amemory unit 30. - The
touch panel display 10 is configured to display various items of image data as an image which a user can view, and detect a coordinate which the user touched on a display screen. More specifically, thetouch panel display 10 is formed by disposing a coordinateinput unit 2 formed using a transparent material, in front of adisplay unit 1 which can display images. Thedisplay unit 1 is a display apparatus such as a LCD (Liquid Crystal Display) or an OELD (Organic Electro Luminescence Display). Thedisplay unit 1 outputs and displays various pieces of information which the user requires to use the information processing apparatus, as a still image or a movie according to an input signal from thecontrol unit 20. Further, the coordinateinput unit 2 can detect that the user's hand or finger contact according to a known electrostatic capacitance method, electromagnetic induction method, infrared scan method, resistance film method or ultrasonic surface acoustic wave method, and obtain coordinate information. A positional relationship between thedisplay unit 1 and the coordinateinput unit 2 is mutually linked, and the coordinateinput unit 2 can obtain coordinate information about a touch position on a display screen displayed on thedisplay unit 1. By this means, the coordinateinput unit 2 can detect contact of the user's finger, and obtain the information about the coordinate on the screen of thedisplay unit 1 which the user's finger contacted. The coordinateinput unit 2 supports so-called multitouch of when, for example, the user touches a plurality of points, acquiring information about coordinates of a plurality of these points. - With the operating method according to the present invention, the user can easily specify a wide range of an area on the display screen by one hand. Consequently, the information processing apparatus according to the present invention can have the comparatively large
touch panel display 10 mounted thereon. For example, thetouch panel display 10 is preferably displays of 10 inches to 75 inches, 16 inches to 40 inches or 20 inches to 38 inches. - Meanwhile, in the present invention, the
display unit 1 and the coordinateinput unit 2 are by no means limited to a display which functions as a touch panel display integrally formed with thedisplay unit 1 and the coordinateinput unit 2. For example, thedisplay unit 1 and the coordinateinput unit 2 may function as separate hardware. In this case, a normal display apparatus such as an LCD or an OELD may be used for thedisplay unit 1. Further, the coordinateinput unit 2 may be a pointing device such as a mouse or a touch tablet which is provided separately from thedisplay unit 1. - The
control unit 20 controls the entire operation of the information processing apparatus by reading and executing a control program stored in thememory unit 30. Thecontrol unit 20 executes a function by means of, for example, a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). Thecontrol unit 20 reads information including image data of an object from thememory unit 30, generates an image of the object and has thetouch panel display 10 display the image. Further, thecontrol unit 20 stores coordinate information about the touch position detected by thetouch panel display 10, in thememory unit 30. Thecontrol unit 20 can perform computation of specifying a predetermined area on the display screen based on the coordinate information inputted to thetouch panel display 10. Further, thecontrol unit 20 can decide whether or not an object is selected by analyzing position information of the object displayed on thetouch panel display 10 and the coordinate information inputted to thetouch panel display 10. - As illustrated in
FIG. 1 , thecontrol unit 20 has anarea specifying unit 3 and anobject selecting unit 4 from the functional view point. Thearea specifying unit 3 has a function of specifying a selection area on the display screen of thetouch panel display 10 according to the control program stored in thememory unit 30. Further, theobject selecting unit 4 has a function of, when the touch position detected by thetouch panel display 10 overlaps an object or when the selection area specified by thearea specifying unit 3 includes an object, selecting these objects. Details will be described below. - The
memory unit 30 stores various pieces of information including the control program required for processing in the information processing apparatus. Thememory unit 30 is realized by a storage apparatus such as a ROM (Read Only Memory) or a RAM (Random Access Memory). The RAM is, for example, a VRAM (Video RAM), a DRAM (Dynamic RAM) or a SRAM (Static RAM). Thememory unit 30 has anobject memory unit 5 and a coordinatememory unit 6 from a functional view point. Theobject memory unit 5 stores information including image data (for example, a top coordinate, a top texture coordinate or brightness data of the object) of the object displayed on thetouch panel display 10. The coordinatememory unit 6 stores coordinate information acquired by the coordinateinput unit 2 of thetouch panel display 10. The coordinatememory unit 6 stores coordinate information read and written by thecontrol unit 20, and is realized by, for example, a working area of the RAM. - Subsequently, a flow of information processing executed by the information processing apparatus according to the present invention will be described with reference to
FIG. 2 . That is, thecontrol unit 20 of the information processing apparatus reads the control program stored in thememory unit 30, and executes processing illustrated inFIG. 2 according to the read control program.FIG. 2 specifically illustrates processing of thecontrol unit 20 of selecting one or a plurality of objects displayed on thetouch panel display 10. Further,FIG. 3 is a schematic view illustrating processing of object selection executed by the information processing apparatus. - Step S1 is a touch input stand-by state in which the coordinate
input unit 2 of thetouch panel display 10 does not detect a touch input. As illustrated inFIG. 3 , at this stage, thedisplay unit 1 of thetouch panel display 10 displays a plurality of objects O (O1 to O5) read from theobject memory unit 5. Naturally, the plurality of objects O (O1 to O5) may be moving on the screen of thedisplay unit 1 or stay still under control of thecontrol unit 20. Further, for example, a background image may be displayed on the screen of thedisplay unit 1 in addition to the objects O. - In step S2, the coordinate
input unit 2 of thetouch panel display 10 detects a touch input of a first point. As illustrated inFIG. 3 , a touch point P1 of the first point is, for example, a point at which a user's forefinger contacts the coordinateinput unit 2. When the coordinateinput unit 2 detects an input of the touch point P1 of the first point, thecontrol unit 20 acquires information about a coordinate of the touch point P1 of the first point, and temporarily stores the acquired coordinate information in the coordinatememory unit 6. - In step S3, the
control unit 20 decides whether or not the touch input of the first point continues, based on the information detected by the coordinateinput unit 2. When it is decided that the touch input of the first point continues, the flow proceeds to step S4, and, when it is decided that the touch input of the first point does not continue, the flow proceeds to step S17. - In step S4, the
control unit 20 decides whether or not a touch input of a second point is performed while the touch input of the first point continues, based on the information detected by the coordinateinput unit 2. As illustrated inFIG. 3 , a touch point P2 of the second point is, for example, a point at which the user's thumb contacts the coordinateinput unit 2. When the coordinateinput unit 2 detects an input of the touch point P2 of the second point, thecontrol unit 20 acquires information about a coordinate of the touch point P2 of the second point, and temporarily stores the acquired coordinate information in the coordinatememory unit 6. When it is decided that the touch input of the second point is performed while the touch input of the first point continues, the flow proceeds to step S5, and, when it is decided that the touch input of the second point is not performed, the flow proceeds to step S17. In addition, in the touch input stand-by state (step S1), when the touch input of the first point and the touch input of the second point are simultaneously detected, processing in steps S2 to S4 are simultaneously performed, and the flow proceeds to step S5. - In step S5, the
control unit 20 reads the information about the coordinate of the touch point P1 of the first point and the touch point P2 of the second point from the coordinatememory unit 6, and calculates the coordinate of a first intermediate point M1 based on these pieces of coordinate information. In an example illustrated inFIG. 3 , the first intermediate point M1 is set to exactly an intermediate point between the touch point P1 of the first point and the touch point P2 of the second point. That is, the first intermediate point M1 is set to a position at which distances to the touch point P1 of the first point and the touch point P2 of the second point are equal on a line segment connecting the touch point P1 of the first point and the touch point P2 of the second point. Meanwhile, the position to which the first intermediate point M1 is set is not limited to the above, and may be any position as long as the position can be set based on the information about the coordinate of the touch point P1 of the first point and the touch point P2 of the second point. For example, the first intermediate point M1 may be set to a position at which distances to the touch point P1 of the first point and the touch point P2 of the second point are 6:4 on the line segment connecting the touch point P1 of the first point and the touch point P2 of the second point. In addition, the ratio of distances to the touch point P1 of the first point and the touch point P2 of the second point can be arbitrarily set. In addition, the coordinate of the first intermediate point can be set based on coordinate information of the touch point P1 of the first point and the touch point P2 of the second point according to various conditions. - Further,
- Further, FIG. 4 illustrates an example of the first intermediate point M1 set under another condition. In the example in FIG. 4, the first intermediate point M1 is provided at the position of the apex of an isosceles triangle in which the touch point P1 of the first point and the touch point P2 of the second point are positioned at the base angles. That is, as illustrated in FIG. 4, the first intermediate point M1 is set to a position spaced apart, by the height h of the isosceles triangle, from the line segment connecting the touch point P1 of the first point and the touch point P2 of the second point. Setting the first intermediate point M1 at a position spaced a predetermined distance h apart from this line segment provides the advantage that the user can easily view the position of the first intermediate point M1 displayed on the touch panel display 10. That is, if the first intermediate point M1 is set on the line segment connecting the touch point P1 of the first point and the touch point P2 of the second point, the first intermediate point M1 may hide behind the user's hand depending on the viewing angle and be hard to view. In this regard, by setting the first intermediate point M1 to the position illustrated in FIG. 4, it is possible to prevent the first intermediate point M1 from being hidden behind the user's hand, so that the user can easily learn the position of the first intermediate point M1.
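- A sketch of the apex-point variant follows, again assuming simple 2-D screen coordinates (the function name and sample values are illustrative, not from the patent); the apex lies on the perpendicular bisector of the segment P1-P2, offset by the height h, and the sign of h selects the side of the segment:

```python
import math

def apex_point(p1, p2, h):
    """Apex of an isosceles triangle whose base is the segment P1-P2
    and whose height is h, measured from the segment's midpoint."""
    mx, my = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    base = math.hypot(dx, dy)
    if base == 0:                    # both touches at the same point
        return (mx, my)
    nx, ny = -dy / base, dx / base   # unit normal to the segment
    return (mx + nx * h, my + ny * h)

# With screen y growing downward, a negative h places M1 above the
# segment, i.e. away from the user's hand:
print(apex_point((100, 300), (300, 300), -40.0))  # (200.0, 260.0)
```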
- The coordinate of the first intermediate point M1 calculated in step S5 is temporarily stored in the coordinate memory unit 6. When calculation of the coordinate of the first intermediate point M1 is finished, the flow proceeds to step S6. - In step S6, the
control unit 20 decides whether or not the touch input of the first point and the touch input of the second point continue, based on the information detected by the coordinate input unit 2. When it is decided that the touch inputs of the first point and the second point continue, the flow proceeds to step S7, and, when it is decided that one or both of the touch inputs of the first point and the second point do not continue, the flow proceeds to step S17. - In step S7, the
control unit 20 decides whether or not the touch inputs of the two points slide in a state where the touch inputs of the first point and the second point continue. That the touch inputs slide means that the detected coordinates of the touch inputs are continuously displaced. In other words, in step S7, it is decided whether or not the user's two fingers (for example, the forefinger and the thumb) move while tracing the screen in a state where they are in contact with the coordinate input unit 2 of the touch panel display 10. The control unit 20 can decide that the touch inputs of the two points slide, based on the coordinate information continuously detected by the coordinate input unit 2. The coordinate information which is continuously detected while the touch inputs of the two points slide is stored in the coordinate information memory unit 6 as needed. When it is decided that the touch inputs of the two points slide, the flow proceeds to step S8, and, when it is decided that one or both of the touch inputs of the first point and the second point do not slide, the flow proceeds to step S17.
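- A minimal sketch of the slide decision for one finger is shown below; the jitter threshold is an assumption for illustration (the patent only states that the detected coordinates are continuously displaced), and a real implementation would run this per touch point on the stream of samples from the coordinate input unit:

```python
import math

def is_sliding(samples, jitter=2.0):
    """Decide whether a touch is sliding: successive coordinates
    reported for the finger are displaced by more than a small
    jitter threshold (in pixels)."""
    return any(math.hypot(x2 - x1, y2 - y1) > jitter
               for (x1, y1), (x2, y2) in zip(samples, samples[1:]))

print(is_sliding([(10, 10), (10, 10), (11, 10)]))  # False: within jitter
print(is_sliding([(10, 10), (18, 14), (30, 22)]))  # True: finger traces
```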
- In step S8, the control unit 20 reads from the coordinate information memory unit 6 the coordinate information continuously detected while the touch inputs of the two points slide, and calculates, as needed, the coordinates of second intermediate point candidates based on the information about the coordinates of the two touch points. - The “second intermediate point” is a point calculated from the coordinates of the last two points detected immediately before the coordinates of the two points stop being simultaneously inputted. The “second intermediate point candidate” is a point which can become the second intermediate point. - In
FIG. 3, a reference numeral P3 refers to a touch point (a release point of the forefinger) of a third point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P1 of the first point is released, and a reference numeral P4 refers to a touch point (a release point of the thumb) of a fourth point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P2 of the second point is released. The coordinate of the “second intermediate point M2” can be calculated from the coordinate of the touch point of the third point and the coordinate of the touch point of the fourth point. The conditions for calculating the coordinate of the second intermediate point M2 may be the same as the conditions for calculating the coordinate of the first intermediate point M1. - Meanwhile,
FIG. 3 illustrates that a reference numeral P3′ refers to a halfway point of the slide input continuing from the touch point P1 of the first point, and a reference numeral P4′ refers to a halfway point of the slide input continuing from the touch point P2 of the second point. There is a plurality of these halfway points (P3′ and P4′), and, while the slide inputs continue, the coordinates of the halfway points (P3′ and P4′) are continuously stored in the coordinate information memory unit 6. The coordinates of the “second intermediate point candidates M2′” are continuously calculated based on these halfway points (P3′ and P4′). The conditions for calculating the coordinate of a second intermediate point candidate M2′ are the same as the conditions for calculating the coordinate of the second intermediate point M2. The calculated coordinates of the second intermediate point candidates M2′ are stored in the coordinate memory unit 6 as needed. When the coordinates of the second intermediate point candidates M2′ are calculated, the flow proceeds to step S9. - In step S9, the
area specifying unit 3 of the control unit 20 calculates selection area candidates R′ based on the above-described coordinate of the first intermediate point M1 and the coordinates of the second intermediate point candidates M2′. A selection area candidate R′ is an area which can become the selection area R described below. The selection area candidate R′ is, for example, a rectangular area whose diagonal is the line segment connecting the first intermediate point M1 and the second intermediate point candidate M2′. The shape and size of the selection area candidate R′ change as the touch inputs of the two points slide and the coordinate of the second intermediate point candidate M2′ changes. Hence, the selection area candidates R′ are continuously calculated according to the changes in the coordinates of the second intermediate point candidates M2′. When the selection area candidates R′ are calculated, the flow proceeds to step S10. - In step S10, the
control unit 20 displays the selection area candidates R′ calculated in step S9 on the touch panel display 10. As described above, the selection area candidates R′ are continuously calculated, and each selection area candidate R′ is displayed on the touch panel display 10 every time it is calculated. By this means, the user can check the selection area candidates R′ on the touch panel display 10, and can adjust the touch positions such that an object which the user desires to select is included in the selection area candidate R′. When the selection area candidates R′ are displayed, the flow proceeds to step S11. - In step S11, the
control unit 20 decides whether or not the slide inputs continuing from the touch point P1 of the first point and the touch point P2 of the second point are released. That is, the control unit 20 may decide that the slide inputs are released when the coordinate input unit 2 no longer detects touch inputs continuing from the touch point P1 of the first point and the touch point P2 of the second point. When it is decided that the slide inputs are released, the flow proceeds to step S12. On the other hand, when it is decided that the slide inputs are continuing without being released, the processing in steps S8 to S10 is repeated until release of the slide inputs is detected. - In step S12, the
control unit 20 decides the coordinate of the second intermediate point M2. That is, as illustrated in FIG. 3, the control unit 20 recognizes the points detected by the coordinate input unit 2 immediately before the slide inputs are released in step S11 as the touch point P3 of the third point and the touch point P4 of the fourth point. Here, the touch point P3 of the third point is the touch point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P1 of the first point is released, and the touch point P4 of the fourth point is the touch point detected by the coordinate input unit 2 immediately before the slide input continuing from the touch point P2 of the second point is released. The control unit 20 then calculates the coordinate of the second intermediate point M2 based on the coordinate of the touch point P3 of the third point and the coordinate of the touch point P4 of the fourth point. The conditions for calculating the coordinate of the second intermediate point M2 may be the same as the conditions for calculating the coordinate of the first intermediate point M1. The coordinate of the second intermediate point M2 is stored in the coordinate memory unit 6. When the coordinate of the second intermediate point M2 is calculated, the flow proceeds to step S13. - In step S13, the
area specifying unit 3 of the control unit 20 defines a selection area R on the display screen of the touch panel display 10 based on the above coordinate of the first intermediate point M1 and coordinate of the second intermediate point M2. In the example illustrated in FIG. 3, the selection area R is a rectangular (square) area whose diagonal is the line segment D connecting the coordinate of the first intermediate point M1 and the coordinate of the second intermediate point M2, and whose periphery is defined by two sides parallel to the Y axis of the display screen and two sides parallel to the X axis. When the selection area R is specified, the coordinate of each vertex of the selection area R is stored in the memory unit 30.
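- A sketch of this axis-aligned rectangle construction is given below (the function name and return convention are assumptions for illustration, not the patent's own interface):

```python
def selection_rect(m1, m2):
    """Axis-aligned rectangle whose diagonal is the segment M1-M2,
    returned as (x_min, y_min, x_max, y_max) in screen coordinates.
    Its sides are parallel to the X and Y axes of the display screen."""
    return (min(m1[0], m2[0]), min(m1[1], m2[1]),
            max(m1[0], m2[0]), max(m1[1], m2[1]))

print(selection_rect((220, 320), (540, 120)))  # (220, 120, 540, 320)
```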
- Meanwhile, the shape of the selection area R is not limited to the above, and may be any shape that can be determined based on the coordinates of the two points, namely the first intermediate point M1 and the second intermediate point M2. FIG. 5 illustrates examples of shapes of the selection area R which can be determined based on the coordinates of these two points. - In an example illustrated in
FIG. 5(a), the selection area R is a polygonal area whose diagonal is the line segment D connecting the coordinate of the first intermediate point M1 and the coordinate of the second intermediate point M2. More specifically, in the example in FIG. 5(a), the selection area R has a hexagonal shape whose diagonal is the line segment D connecting the coordinates of the first intermediate point M1 and the second intermediate point M2, and in which a side extending from the first intermediate point M1 in the X axis direction and a side extending from the second intermediate point M2 in the X axis direction are parallel. - Further, in an example illustrated in
FIG. 5(b), the selection area R is a circular area whose diameter is the line segment D connecting the coordinate of the first intermediate point M1 and the coordinate of the second intermediate point M2. - Furthermore, in an example illustrated in
FIG. 5(c), the selection area R is an elliptical area whose major axis is the line segment D connecting the coordinates of the first intermediate point M1 and the second intermediate point M2. In this case, the length of the minor axis of the elliptical area may be a fixed value, or a value proportional to the length of the major axis. Further, although not illustrated, the selection area R may be an elliptical area whose minor axis is the line segment D connecting the coordinates of the first intermediate point M1 and the second intermediate point M2. - Thus, the shape of the selection area R can be set appropriately according to its use.
- In step S14, the control unit 20 displays the defined selection area R on the touch panel display 10. By this means, the user can check the selection area R on the display of the touch panel display 10. - In step S15, the
object selecting unit 4 of the control unit 20 decides whether or not there is an object in the selection area R on the display screen. The control unit 20 knows the positions of the plurality of objects O displayed on the screen. Consequently, by referring to the coordinates of the vertices of the selection area R on the display screen and the coordinates at which the objects O are located, it is possible to decide whether or not the objects O are included in the selection area R. In the example illustrated in FIG. 3, objects O1 to O5 are displayed on the screen of the touch panel display 10. Among these objects, the objects O1 to O3 are entirely or partially included in the selection area R. On the other hand, the objects O4 and O5 are positioned entirely outside the selection area R. Hence, the object selecting unit 4 of the control unit 20 decides that the objects O1 to O3 among the plurality of objects O1 to O5 are included in the selection area R. When there are objects in the selection area R on the display screen, the flow proceeds to step S16. - Meanwhile, when there is no object in the selection area R, the flow returns to step S1 and the touch input stand-by state starts again.
- In step S16, the object selecting unit 4 of the control unit 20 selects the one or the plurality of objects decided to be included in the selection area R. Information related to the selected objects (for example, an identification number of each object) is temporarily stored in a working area of the memory unit 30. - Further, as illustrated in
FIG. 2, when the touch input of the second point is not performed while the touch input of the first point continues or, even though the touch input of the second point is performed, a simultaneous slide input of the two points is not performed, the flow proceeds to step S17. In this case, in step S17, it is decided, as in the conventional method, whether or not an object on the display screen is positioned at the touch point P1 of the first point. That is, the coordinate of the touch point P1 of the first point and the coordinate at which the object is located on the display screen are compared and, when the two coordinates match, the flow proceeds to step S16. - In step S16, the
object selecting unit 4 of the control unit 20 selects the one object whose coordinate matches the touch point P1 of the first point. The information related to the selected object is temporarily stored in the working area of the memory unit 30. - As described above, the
control unit 20 performs processing of selecting one or a plurality of the objects displayed on the touch panel display 10. Once the control unit 20 has selected an object, various known information processing can be performed, for example, moving the object on the display screen by performing a drag operation while the object is selected.
- Next, a game apparatus according to the embodiment of the present invention will be described. The game apparatus according to the present invention basically has a touch panel display. Further, the game apparatus can advance a game by selecting one or a plurality of objects displayed on the touch panel display according to the above method, and giving various commands to the selected objects.
- [Configuration Example of Game Apparatus]
-
FIG. 6 is a block diagram illustrating a configuration example of the game apparatus. The embodiment illustrated by this block diagram can be suitably used for an arcade game apparatus in particular. FIG. 7 is a perspective view illustrating an example of the appearance of the housing of the game apparatus. - As illustrated in
FIG. 6, the game apparatus has the touch panel display 100, a card reader 200 and a game body 300. The touch panel display 100 is configured to display various items of image data as images which the user can view, and to detect the coordinates which the user touches on the display screen. Further, when a card with a predetermined identification code printed thereon is set on the card reader 200, the card reader 200 reads the identification code recorded on the card and acquires the unique card information of the card. Furthermore, the game body 300 controls the entire function of the game apparatus. In particular, the game body 300 can display objects on the touch panel display 100 based on the information read by the card reader 200, and advance a game based on card operations with respect to the card reader 200 and touch operations with respect to the touch panel display 100. - As illustrated in
FIG. 6, the touch panel display 100 has the display 110 and the touch screen 120. The touch panel display 100 is formed by disposing the touch screen 120, formed of a transparent material, in front of the display 110, which can display images. The display 110 is a display apparatus such as an LCD (Liquid Crystal Display) or an OELD (Organic Electro Luminescence Display). The display 110 outputs and displays various pieces of information which the user requires to use the information processing apparatus, as still images or movies, according to an input signal from the game body 300. Further, the touch screen 120 can detect contact of the user's hand or finger according to a known electrostatic capacitance method, electromagnetic induction method, infrared scan method, resistance film method or ultrasonic surface acoustic wave method, and obtain information about the coordinate of the touch position. The coordinates of the display 110 and the touch screen 120 are linked to each other, so the touch screen 120 can acquire information about the coordinate of a touch position on the display screen displayed on the display 110. By this means, the touch screen 120 can detect contact of the user's finger and obtain the information about the coordinate on the screen of the display 110 which the user's finger contacted. The coordinate information acquired by the touch screen 120 is stored in a temporary memory unit 370 of the game body 300. Further, the touch screen 120 supports so-called multitouch, that is, when the user touches a plurality of points, it acquires information about the coordinates of the plurality of points. Furthermore, the game apparatus according to the present invention preferably has a comparatively large touch panel display 100 mounted thereon. For example, the touch panel display 100 is preferably a display of 10 to 75 inches, 16 to 40 inches, or 20 to 38 inches. - As illustrated in
FIG. 6, the card reader 200 is an apparatus which can capture an image of the identification code recorded on a card C, and has a panel 210, a light source 220 and an image sensor 230. For example, an illustration of an object used in the game is printed on the surface of the card C, and an identification code for identifying that object is recorded on the back surface of the card C. The identification code is printed on the back surface of the card C using an ink which cannot be viewed by means of visible light, and a black-and-white pattern appears when specific invisible light is radiated on the card. That is, the identification code is printed using a special ink which absorbs invisible light such as infrared rays and, when infrared rays are radiated on the back surface of the card C, the invisible light radiated on the portions other than the black portions of the identification code is reflected. For example, the identification code of the card C has recorded therein at least an identification number of the object drawn on the card and information related to, for example, the orientation of the card. - The panel 210 is provided on the upper surface of the card reader 200, and a plurality of cards C can be set on the panel 210. Further, inside the housing of the game apparatus are provided, for example, a light source 220 which radiates infrared rays (invisible light) on the back surface of a card C set on the panel 210, and an image sensor 230 which receives the infrared rays reflected from the back surface of the card C set on the panel 210 and captures an image of the pattern of the card data recorded on the card C. The light source 220 is, for example, a light emitting diode (LED) which emits invisible light such as infrared rays or ultraviolet rays which are invisible to the eyes. The image sensor 230 is, for example, an image capturing element which captures an image of the identification code by means of the infrared rays which are reflected from the back surface of the card C and are incident on the card reader 200. Further, the card reader 200 can acquire the unique card information of the card C by analyzing this identification code. The card information acquired by the card reader 200 is transmitted to a processing unit 310 of the game body 300, and stored in the temporary memory unit 370. - The identification code of the card C has recorded therein at least an identification number of the object drawn on the card and information related to, for example, the orientation of the card. Hence, by referring to an object table stored in the game information memory unit 380 or the temporary memory unit 370 based on the card information acquired from the card reader 200, the processing unit 310 of the game body 300 can learn the status, type, name and attribute of the object recorded on the card C and, moreover, the characteristics of the object matching the orientation or the position of the card C. An example of an object is a game character. Further, the image sensor 230 of the card reader 200 detects the positions at which infrared light is reflected from the back surface of the card C, so that the processing unit 310 of the game body 300 can calculate the position at which the card C is set on the panel 210 as coordinate information. Furthermore, the image sensor 230 continuously detects the reflection positions of the infrared rays, so that it is possible to obtain information that a card C set on the panel 210 has moved from one position to another. - Still further, as illustrated in
FIG. 8, the panel 210 of the card reader 200 is preferably partitioned into a plurality of areas. The number of partitions of the panel 210 is, for example, 2 to 10. In the example illustrated in FIG. 8, the panel 210 of the card reader 200 is divided into two areas: an offensive area A1 (first area) and a defensive area A2 (second area). These areas are partitioned according to the coordinates of the panel, and each card C can be slid between the offensive area A1 and the defensive area A2. By acquiring the position of each card C on the panel 210 as coordinate information, the processing unit 310 of the game body 300 can decide which of the offensive area A1 and the defensive area A2 the position of each card C belongs to.
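- As a rough sketch of this area decision: the patent only states that the two areas are partitioned according to panel coordinates, so the single horizontal boundary and its value below are assumptions made purely for illustration:

```python
def card_area(card_pos, boundary_y=240):
    """Decide whether a card at the given panel coordinate belongs to
    the offensive area A1 (first area) or the defensive area A2
    (second area), assuming one horizontal boundary line."""
    _, y = card_pos
    return "A1 (offensive)" if y < boundary_y else "A2 (defensive)"

print(card_area((120, 100)))  # A1 (offensive)
print(card_area((120, 300)))  # A2 (defensive)
```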
- Further, as illustrated in FIG. 8, a rectangular card C can be set vertically or horizontally on the panel 210 of the card reader 200, and the processing unit 310 of the game body 300 can decide whether the card C is set vertically or horizontally, based on detection information from the card reader 200. For example, an identification code is printed on the back surface of the card C, and this identification code includes information related to the orientation of the card. Consequently, the processing unit 310 of the game body 300 can decide whether the card C is set vertically or horizontally by reading the identification code by means of the card reader 200 and analyzing the orientation of the card C based on the read identification code.
- The game body 300 has the processing unit 310, reads and executes a game program, and controls the entire operation of the game apparatus according to the game program. As illustrated in FIG. 6, the game body 300 has the following configuration. - The
processing unit 310 performs various processing such as control of the entire system, issuing of commands to each block in the system, game processing, image processing and audio processing. The function of the processing unit 310 can be realized by hardware such as various processors (for example, a CPU or a DSP) or an ASIC (for example, a gate array), or by a given program (game program). - The
processing unit 310 may include a game processing unit 320, an image processing unit 330 and an audio processing unit 350. More specifically, the processing unit 310 includes a main processor, a coprocessor, a geometry processor, a drawing processor, a data processing processor, and a circuit for the four basic arithmetic operations or a general-purpose arithmetic circuit. These processors and circuits are coupled through a bus as appropriate, and can transmit and receive signals. Further, the processing unit 310 may have a data decompression processor for decompressing compressed information. - The
game processing unit 320 performs various processing such as processing of displaying an object on the display 110 based on the card information acquired by the card reader 200, processing of scrolling the position of the view point (the position of a virtual camera) or the angle of view (the rotation angle of the virtual camera) on the display 110, processing of arranging an object such as a map object in object space, processing of selecting an object, processing of moving an object (motion processing), processing of calculating the position or the rotation angle of an object (the rotation angle around the X, Y or Z axis), processing of receiving coins (price), processing of setting various modes, processing of advancing a game, processing of setting a selection screen, hit check processing, processing of computing a game result (achievement or score), processing of allowing a plurality of players to play a game in common game space, and game-over processing, based on input data from the touch screen 120, the card reader 200 and the operating unit 360, and on personal data, stored data and a game program from a mobile information storage apparatus 392. - The
image processing unit 330 performs various image processing according to, for example, instructions from the game processing unit 320. The game processing unit 320 reads image information of objects and the game space from the game information memory unit 380 based on the information about the position of the view point and the angle of view, and writes the read image information in the temporary memory unit 370. The game processing unit 320 supplies scroll data for moving the view point to the image processing unit 330. The image processing unit 330 reads image information per frame from the temporary memory unit 370 based on the given scroll data, and has the display 110 display images of the objects and the game space according to the read image information. By this means, the display 110 displays the objects and the game space based on the view point. Further, the image processing unit 330 moves the view point in the game space according to the coordinates inputted to the touch screen 120. Furthermore, the image processing unit 330 reads frames from the temporary memory unit 370 based on the information about the moving view point, and has the display 110 display the read images. Thus, the display screen transitions by scrolling the view point in the game space. - Further, the
image processing unit 330 reads from the temporary memory unit 370 the card information acquired by the card reader 200, and refers to the object table stored in the game information memory unit 380 based on this card information. Furthermore, the image processing unit 330 reads the image data of the object associated with the card information from the temporary memory unit 370 or the game information memory unit 380 based on link information stored in the object table. Still further, the image processing unit 330 generates the object in the game space according to the read image data of the object, and has the display 110 display the object. - The
game processing unit 320 controls the behavior of the objects which appear in the game space, based on the information about the coordinates inputted to the touch screen 120, the orientation or the position of the cards set on the card reader 200, and other operation information from the operating unit 360 (a lever, a button or a controller). For example, the game processing unit 320 refers to the coordinate information of an object displayed on the display 110 and the coordinate information inputted to the display 110, and decides whether or not the user has touched the object. That is, the game processing unit 320 decides that the user has touched and selected the object when the position information inputted to the touch screen 120 and the position information of the object match. Further, when an operation or an instruction is given to the selected object, processing matching the game program is performed according to the operation or the instruction. - Furthermore, the
game processing unit 320 preferably performs the selection processing unique to the present invention when objects displayed on the display 110 of the touch panel display 100 are selected. That is, the game processing unit 320 determines the coordinate of the first intermediate point calculated from the coordinates of the first two points simultaneously inputted to the touch screen 120 of the touch panel display 100. Further, the game processing unit 320 determines the coordinate of the second intermediate point calculated from the coordinates of the last two points detected immediately before the coordinates of the two points stop being simultaneously inputted. Furthermore, the game processing unit 320 specifies an area on the display screen of the display 110 based on the coordinate of the first intermediate point and the coordinate of the second intermediate point, and selects one or a plurality of objects whose images are displayed such that at least part of each object is included in the specified area. Still further, when an operation or an instruction is given to the selected objects, the game processing unit 320 performs processing matching the game program according to the operation or the instruction. When, for example, one or a plurality of objects are selected according to an input operation with respect to the touch screen 120 and different coordinate information is then inputted to the touch screen 120, the game processing unit 320 performs control of moving the one or the plurality of selected objects toward the position of the newly inputted coordinate. Thus, the game processing unit 320 preferably advances the game by linking the card information acquired by the card reader 200 and the coordinate information inputted to the touch screen 120. - The audio processing unit 350 outputs various sounds according to, for example, instructions from the game processing unit 320. - Functions of the
game processing unit 320, the image processing unit 330 and the audio processing unit 350 may all be realized by hardware or all by programs. Alternatively, these functions may be realized by a combination of hardware and programs. - As illustrated in
FIG. 6, for example, the image processing unit 330 may have a geometry processing unit 332 (three-dimensional coordinate computing unit) and a drawing unit 340 (rendering unit). - The
geometry processing unit 332 performs various geometry computations (three-dimensional coordinate computations) such as coordinate transformation, clipping processing, perspective transformation and light source calculation. Further, the object data for which geometry processing (perspective transformation) has been performed (for example, vertex coordinates, vertex texture coordinates or brightness data of the object) is stored and kept in a main memory 372 of the temporary memory unit 370. - The
drawing unit 340 draws the object in a frame buffer 374 based on the object data for which geometry computation (perspective transformation) has been performed, and on textures and the like stored in the texture memory unit 376. The drawing unit 340 may include, for example, a texture mapping unit 342 and a shading processing unit 344. More specifically, the drawing unit 340 can be implemented by a drawing processor. The drawing processor is connected to the texture memory unit, various tables, the frame buffer and a VRAM via a bus and the like, and is further connected with the display. - The texture mapping unit 342 reads an environment texture from the texture memory unit 376, and maps the read environment texture onto the object. - The
shading processing unit 344 performs shading processing with respect to the object. For example, the geometry processing unit 332 performs light source calculation, and calculates the brightness (RGB) of each vertex of the object based on information about the light source for shading processing, an illumination model, and the normal vector of each vertex of the object. The shading processing unit 344 calculates the brightness of each dot of a primitive surface (a polygon or curved surface) based on the brightness of each vertex according to Phong shading or Gouraud shading. - The
geometry processing unit 332 may include a normal vector processing unit 334. The normal vector processing unit 334 may perform processing of rotating the normal vector of each vertex of the object (in a broad sense, a normal vector of a surface of the object) according to a rotation matrix from the local coordinate system to the world coordinate system. - The
operating unit 360 allows a player to input operation data. The function of the operating unit 360 is realized by hardware such as, for example, a lever and buttons. Operation information from the operating unit 360 is transmitted to the main processor through a serial interface (I/F) or the bus. - The game
information memory unit 380 stores the game programs, the objects displayed on the display 110, and information related to the image data of the game space. The game information memory unit 380 is realized by, for example, a ROM or a non-volatile storage medium such as an optical disk (CD or DVD), a magneto-optical disk (MO), a magnetic disk, a hard disk or a magnetic tape. The processing unit 310 performs various processing based on the information stored in this game information memory unit 380. The game information memory unit 380 stores information (programs, or programs and data) for executing the means (in particular, the blocks included in the processing unit 310) of the present invention (the present embodiment). Part or all of the information stored in the game information memory unit 380 may be written to the temporary memory unit 370 when, for example, the system is turned on. - The information stored in the game
information memory unit 380 includes, for example, at least two of: a program code for performing predetermined processing, image data, audio data, shape data of display objects, table data, list data, information for instructing the processing of the present invention, and information for performing processing according to such instructions. For example, the table data includes data of an object table which stores the status, type, name and attribute of an object, and the characteristics of the object matching the orientation or the position of the card, in association with the identification number of the object. - The status of an object is information in which, for example, a moving speed, hit points, offense power and defense power are stored as numerical values. The game processing unit 320 can decide the superiority and inferiority of, for example, the moving speed, hit points and offense power of each object by referring to the status stored in the object table. Further, the game processing unit 320 can perform various computations for advancing the game based on these numerical values related to the status. For example, the numerical values of the moving speeds of objects can be compared and, by referring to the object table, it is possible to learn which of two objects has the faster moving speed. Further, by performing predetermined computation processing based on the numerical value of the moving speed of an object, it is possible to calculate the time required for the object to move from a given point to another point in game space.
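- A minimal sketch of that travel-time computation follows (the function name and units are assumptions for illustration; the patent states only that a time can be calculated from the stored moving speed):

```python
import math

def travel_time(start, goal, moving_speed):
    """Time required for an object to move between two points in game
    space at the numerical moving speed stored in its status
    (distance units per time unit)."""
    return math.hypot(goal[0] - start[0], goal[1] - start[1]) / moving_speed

# An object with moving speed 5 covers a distance of 100 in 20 time units:
print(travel_time((0, 0), (60, 80), 5))  # 20.0
```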
- Furthermore, the characteristics of the object matching the orientation of the card are data which change according to the orientation of the card set on the panel 210 of the card reader 200. For example, for the objects related to a given card, the object table stores different information depending on whether the card is set vertically or horizontally. For example, the status of the object may change depending on whether the card is set vertically or horizontally. - Further, the characteristics of the object matching the position of the card are data which change according to the position at which the card is set on the
panel 210 of the card reader 200. For example, for the objects related to a given card, the object table stores different information depending on whether the card is positioned in the offensive area A1 (first area) or in the defensive area A2 (second area). For example, the status of the object may change depending on whether the card is positioned in the offensive area A1 or in the defensive area A2. - Further, the game
information memory unit 380 stores data related to the game space. The game space means the world of the game in the game apparatus according to the present invention, and is also referred to as a “world”. The data related to the game space includes position information of the target objects to be displayed, information related to the types of the target objects to be displayed, and image data of the target objects to be displayed. A target object to be displayed is, for example, a background, a building, a landscape, a plant, or an object appearing in the game. This image data is preferably stored as polygon data. The polygon data includes, for example, vertex coordinate data, color data, texture data and transparency data. The game information memory unit 380 classifies and stores the target objects to be displayed according to the orientation of the view point and the position and area of a player character.
- The mobile
information storage apparatus 392 stores, for example, personal data of a player and saved data. This mobile information storage apparatus 392 may be, for example, a memory card or a mobile game apparatus. The function of the mobile information storage apparatus 392 can be achieved by known storage means such as a memory card, a flash memory, a hard disk or a USB memory. Meanwhile, the mobile information storage apparatus 392 is not an essential component, and may be provided when a player needs to be identified. - The communication unit 394 is an optional unit which performs various controls for communicating with the outside (for example, a host server or another game apparatus). By connecting the game apparatus with a host server on a network or with another game apparatus through the communication unit 394, it is possible to play a match game or a cooperative game. The function of the communication unit 394 can be realized by various processors, hardware such as a communication ASIC, or a program. Further, a program or data for executing the game apparatus may be distributed from an information storage medium of a host apparatus (server) to the game
information memory unit 380 through the network and the communication unit 394. - [Operation Example of Game Apparatus]
- Next, an operation example of the game apparatus employing the above configuration will be described with reference to
FIGS. 9 to 11. Hereinafter, the system of the game executed by the game apparatus will be described using an example. For example, the game apparatus according to the present invention can play a match game via communication such as the Internet. In this match game, each game user plays a match by having a plurality of objects (game characters) appear in one game space. In the example of the game described below, the user performs instruction operations such as appearance, movement, offense and defense of each object through, for example, the touch panel display 100 and the card reader 200 to beat enemy objects (Enemies), conquer a tower and break a stone.
- FIG. 9 conceptually illustrates the states of the touch panel display 100 and the card reader 200 when a game is actually played using the game apparatus according to the present invention. The user sets desired cards C1 to C7 on the card reader 200. The identification code is printed on the back surface of each of the cards C1 to C7. When reading the identification code of each of the cards C1 to C7, the card reader 200 analyzes the card information based on the identification code, and transmits the card information to the processing unit 310 of the game apparatus. Further, the card reader 200 can learn the orientation and the position of each of the cards C1 to C7. In the example illustrated in FIG. 9, on the card reader 200, the cards C1, C2, C5 and C6 are set vertically, and the cards C3, C4 and C7 are set horizontally. Further, in the example illustrated in FIG. 9, on the card reader 200, the cards C1, C4 and C6 are positioned in the offensive area A1, and the cards C2, C3, C5 and C7 are positioned in the defensive area A2. The information detected by the card reader 200 is transmitted to the processing unit 310, and the processing unit 310 refers to the object table stored in the game information memory unit 380 (or the temporary memory unit 370) based on the card information and the information about, for example, the orientation and the position of each card, and reads the information (for example, image data and status) about the objects associated with the card information. The processing unit 310 has the touch panel display 100 display images based on the read image data. For example, the touch panel display 100 displays the images of the cards in a lower area of the touch panel display 100. The images of the cards displayed on the touch panel display 100 match the arrangement order and the orientation of each of the cards C1 to C7 set on the card reader 200. Thus, by displaying the image of each of the cards C1 to C7 set on the card reader 200 on part of the touch panel display 100, the user can learn the arrangement and the orientation of each of the cards C1 to C7 by viewing the touch panel display 100, without visually checking the card reader 200. In addition, the touch panel display 100 can display information regarding the position of each of the cards C1 to C7 set on the card reader 200 (for example, whether the card is positioned in the offensive area A1 or the defensive area A2). - In the example illustrated in
FIG. 9, the objects (game characters) O1 to O6 associated with the cards C1 to C6 are displayed on the display screen of the touch panel display 100. Each object has a unique status, type, name, attribute and object characteristics matching the orientation or the position of the card. The status of an object is information in which, for example, a moving speed, hit points, offense power and defense power are stored as numerical values. These pieces of information are stored in the object table in association with the identification information of each object. For example, objects whose cards are set vertically may be set to carry out normal attacks, while objects whose cards are set horizontally may be set to carry out special attacks. Further, a setting may be made to increase the numerical value of the offense power of objects whose cards are positioned in the offensive area A1, and to increase the numerical value of the defense power of objects whose cards are positioned in the defensive area A2. Furthermore, in the example illustrated in FIG. 9, the object (O7) associated with the card C7 is not displayed on the touch panel display 100. To allow the object (O7) to appear in the game space, the image of the card C7 displayed on the touch panel display 100 is touched, and the image of the card C7 is dragged to the position at which a call gate is displayed. When the image of the card C7 is dropped at the position at which the call gate is displayed, the object (O7) associated with the card C7 appears in the game space and is displayed on the touch panel display 100. In addition, the positional coordinates of the call gate in the game space are stored in the game information memory unit 380 or the temporary memory unit 370, and the game processing unit 320 keeps track of the position of the call gate.
- FIG. 10 illustrates an example of an operation of moving objects displayed on the touch panel display 100. When the user touches the touch panel display 100, the touch panel display 100 obtains the coordinate of the touch position. The processing unit 310 refers to the coordinate of the touch position and the coordinate at which an object is displayed, and decides whether or not the two coordinates match. When the touch position and the position of the object match, the processing unit 310 learns that the object has been selected. In the example in FIG. 10(a), the object O4 is selected. Further, when the user touches the display screen of the touch panel display 100 in a state where the object O4 is selected, the processing unit 310 stores the coordinate of the touch position in the temporary memory unit. In particular, when the user touches a plurality of points in a state where the object O4 is selected, the processing unit 310 stores the coordinates of the touch positions in the temporary memory unit together with information about the touch order. Further, the processing unit 310 performs processing of moving the object O4 touched and selected by the user to the point which the user touches next. The moving speed varies per object. Accordingly, the processing unit 310 reads the numerical value of the moving speed related to the object O4 from the object table, and the object O4 is moved from the first point to the destination point based on the read numerical value of the moving speed. Furthermore, when a plurality of points is touched, the selected object O4 is moved to each point sequentially, according to the touch order, as sketched below. In addition, when the moving object O4 encounters an enemy object during movement or arrives at a tower, the processing of playing a match with the enemy object or the processing of conquering the tower need only be performed in a manner similar to known game systems.
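- The sequential movement through the touched points can be sketched as follows; the function name and the cumulative-time return value are assumptions for illustration, with the speed taken from the object's status as described above:

```python
import math

def arrival_times(start, waypoints, moving_speed):
    """Cumulative times at which an object, moving at its stored speed,
    reaches each touched point in the order the points were touched."""
    times, pos, elapsed = [], start, 0.0
    for wp in waypoints:
        elapsed += math.hypot(wp[0] - pos[0], wp[1] - pos[1]) / moving_speed
        times.append(elapsed)
        pos = wp
    return times

# Object O4 starts at (0, 0); two destination points are then touched:
print(arrival_times((0, 0), [(30, 40), (30, 140)], 10))  # [5.0, 15.0]
```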
- Meanwhile, when the user wishes to move a plurality of objects simultaneously, the operation of touching, selecting and moving the objects one by one according to the method illustrated in FIGS. 10(a) and 10(b) requires that all objects be touched and selected individually, and becomes complicated. Further, as described above, the moving speed varies per object. Hence, even when the user wishes to move a plurality of objects to the same point, the timing at which each object arrives at the specified point varies according to the moving speed of each object. - Hence, as illustrated in
FIGS. 11(a) and 11(b), the game system can collectively select a plurality of objects on the display screen, and can form a party of the selected objects. FIG. 11(a) illustrates an example of an operation of collectively selecting a plurality of objects O1 to O7. The operation illustrated in FIG. 11(a) is basically the same as the operation illustrated in FIG. 3. The processing unit 310 determines the coordinate of the first intermediate point M1 calculated from the coordinates of the first two points P1 and P2 simultaneously inputted to the touch panel display 100 by the user. Further, the processing unit 310 determines the coordinate M2 of the second intermediate point calculated from the coordinates of the last two points P3 and P4 detected immediately before the coordinates of the two points, dragged by the user, stop being simultaneously inputted. Furthermore, the processing unit 310 specifies a rectangular area on the display screen of the touch panel display 100 based on the coordinate of the first intermediate point M1 and the coordinate M2 of the second intermediate point. Still further, the displayed objects O1 to O7 are selected such that at least part of each object is included in the specified rectangular area. The selected objects O1 to O7 are stored in the temporary memory unit as objects which form a party.
- FIG. 11(b) illustrates an example where the plurality of selected objects O1 to O7 forms a party. As illustrated in FIG. 11(b), the collectively selected objects O1 to O7 gather at one site and form a party. Further, the processing unit 310 performs processing of moving the party formed by the user's operation to the point touched next. That is, the objects O1 to O7 form the party and move as a group. In this case, the moving speeds of the objects which form the party differ from each other. Accordingly, the processing unit 310 reads from the object table the numerical value of the moving speed of each object which forms the party, and calculates the moving speed of the party based on these numerical values. For example, the processing unit 310 may simply use the slowest moving speed (the lowest numerical value) among the objects which form the party as the moving speed of the entire party. Alternatively, the processing unit 310 may use the average value of the moving speeds of the plurality of objects which form the party as the moving speed of the entire party. Furthermore, the processing unit 310 can also use the fastest moving speed (the highest numerical value) among the objects which form the party as the moving speed of the entire party. The party of objects moves to the specified point at the moving speed calculated by the processing unit 310. When a plurality of points is specified, the processing unit 310 moves the party of objects to each point sequentially, according to the specified order. When the moving party of objects encounters an enemy object during movement or arrives at a tower, the processing of playing a match with the enemy object or the processing of conquering the tower need only be performed in a manner similar to known game systems.
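- The three party-speed policies named above (slowest member, average, fastest member) can be sketched directly; the function name and policy strings are assumptions for illustration:

```python
def party_speed(member_speeds, policy="slowest"):
    """Moving speed of a party formed from several objects, under one
    of the three policies described above."""
    if policy == "slowest":
        return min(member_speeds)
    if policy == "average":
        return sum(member_speeds) / len(member_speeds)
    if policy == "fastest":
        return max(member_speeds)
    raise ValueError(f"unknown policy: {policy}")

speeds = [4, 7, 10]
print(party_speed(speeds))             # 4: party moves at the slowest pace
print(party_speed(speeds, "average"))  # 7.0
print(party_speed(speeds, "fastest"))  # 10
```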
- The present invention can be suitably used in, for example, a computer industry and a game industry.
-
- 1 Display unit
- 2 Coordinate input unit
- 3 Area specifying unit
- 4 Object selecting unit
- 5 Object memory unit
- 6 Coordinate memory unit
- 10 Touch panel display
- 20 Control unit
- 30 Memory unit
- M1 First intermediate point
- M2 Second intermediate point
- P1 Touch point of first point
- P2 Touch point of second point
- P3 Touch point of third point
- P4 Touch point of fourth point
- R Area
- O Object
US20130321340A1 (en) * | 2011-02-10 | 2013-12-05 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
US20150113477A1 (en) * | 2012-04-12 | 2015-04-23 | Supercell Oy | System and method for controlling technical processes |
Family Cites Families (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08211992A (en) * | 1995-02-03 | 1996-08-20 | Canon Inc | Graphic forming device and method therefor |
US7663607B2 (en) * | 2004-05-06 | 2010-02-16 | Apple Inc. | Multipoint touchscreen |
JP4686886B2 (en) * | 2001-04-06 | 2011-05-25 | ソニー株式会社 | Information processing device |
JP2003024639A (en) * | 2001-07-18 | 2003-01-28 | Konami Computer Entertainment Osaka:Kk | Game progress control program, device and method for controlling game progress and server device for game |
US7254775B2 (en) * | 2001-10-03 | 2007-08-07 | 3M Innovative Properties Company | Touch panel system and method for distinguishing multiple touch inputs |
US11275405B2 (en) * | 2005-03-04 | 2022-03-15 | Apple Inc. | Multi-functional hand-held device |
US20040248650A1 (en) * | 2003-03-25 | 2004-12-09 | Colbert Savalas O. | Programmable electronic game apparatus |
US7411575B2 (en) * | 2003-09-16 | 2008-08-12 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
US7309287B2 (en) * | 2003-12-10 | 2007-12-18 | Nintendo Co., Ltd. | Game machine having display screen with touch panel |
US20050227217A1 (en) * | 2004-03-31 | 2005-10-13 | Wilson Andrew D | Template matching on interactive surface |
KR101146750B1 (en) * | 2004-06-17 | 2012-05-17 | 아드레아 엘엘씨 | System and method for detecting two-finger input on a touch screen, system and method for detecting for three-dimensional touch sensing by at least two fingers on a touch screen |
US7925996B2 (en) * | 2004-11-18 | 2011-04-12 | Microsoft Corporation | Method and system for providing multiple input connecting user interface |
JP2006314349A (en) * | 2005-05-10 | 2006-11-24 | Nintendo Co Ltd | Game program and game device |
US8062115B2 (en) * | 2006-04-27 | 2011-11-22 | Wms Gaming Inc. | Wagering game with multi-point gesture sensing device |
US20070257891A1 (en) * | 2006-05-03 | 2007-11-08 | Esenther Alan W | Method and system for emulating a mouse on a multi-touch sensitive surface |
EP2088499A4 (en) * | 2006-11-30 | 2011-11-30 | Sega Corp | Position inputting apparatus |
US7956847B2 (en) * | 2007-01-05 | 2011-06-07 | Apple Inc. | Gestures for controlling, manipulating, and editing of media files using touch sensitive devices |
WO2008095139A2 (en) * | 2007-01-31 | 2008-08-07 | Perceptive Pixel, Inc. | Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques |
JP5210547B2 (en) * | 2007-05-29 | 2013-06-12 | 任天堂株式会社 | Movement control program and movement control apparatus |
US20090122018A1 (en) * | 2007-11-12 | 2009-05-14 | Leonid Vymenets | User Interface for Touchscreen Device |
US8519965B2 (en) * | 2008-04-23 | 2013-08-27 | Motorola Mobility Llc | Multi-touch detection panel with disambiguation of touch coordinates |
JP4533943B2 (en) * | 2008-04-28 | 2010-09-01 | 株式会社東芝 | Information processing apparatus, display control method, and program |
JP5448370B2 (en) * | 2008-05-20 | 2014-03-19 | キヤノン株式会社 | Information processing apparatus, control method therefor, program, and storage medium |
JP2010020608A (en) | 2008-07-11 | 2010-01-28 | Olympus Imaging Corp | Electronic apparatus, camera, object selection method and object selection program |
CN101661361A (en) * | 2008-08-27 | 2010-03-03 | 比亚迪股份有限公司 | Multipoint touch detection system |
US20100162181A1 (en) * | 2008-12-22 | 2010-06-24 | Palm, Inc. | Interpreting Gesture Input Including Introduction Or Removal Of A Point Of Contact While A Gesture Is In Progress |
US9354795B2 (en) * | 2009-04-29 | 2016-05-31 | Lenovo (Singapore) Pte. Ltd | Refining manual input interpretation on touch surfaces |
US20100285881A1 (en) * | 2009-05-07 | 2010-11-11 | Microsoft Corporation | Touch gesturing on multi-player game space |
US8739055B2 (en) * | 2009-05-07 | 2014-05-27 | Microsoft Corporation | Correction of typographical errors on touch displays |
EP2254032A1 (en) * | 2009-05-21 | 2010-11-24 | Research In Motion Limited | Portable electronic device and method of controlling same |
US8269736B2 (en) * | 2009-05-22 | 2012-09-18 | Microsoft Corporation | Drop target gestures |
US20100321319A1 (en) * | 2009-06-17 | 2010-12-23 | Hefti Thierry | Method for displaying and updating a view of a graphical scene in response to commands via a touch-sensitive device |
US20110014983A1 (en) * | 2009-07-14 | 2011-01-20 | Sony Computer Entertainment America Inc. | Method and apparatus for multi-touch game commands |
KR101117481B1 (en) * | 2009-10-12 | 2012-03-07 | 라오넥스(주) | Multi-touch type input controlling system |
US8222507B1 (en) * | 2009-11-04 | 2012-07-17 | Smule, Inc. | System and method for capture and rendering of performance on synthetic musical instrument |
JP4932010B2 (en) * | 2010-01-06 | 2012-05-16 | 株式会社スクウェア・エニックス | User interface processing device, user interface processing method, and user interface processing program |
US20110205169A1 (en) * | 2010-02-24 | 2011-08-25 | Primax Electronics Ltd. | Multi-touch input apparatus and its interface method using hybrid resolution based touch data |
JP5029767B2 (en) * | 2010-04-27 | 2012-09-19 | カシオ計算機株式会社 | Method for detecting contact state of resistive touch panel, touch panel device, and display device |
JP5508122B2 (en) * | 2010-04-30 | 2014-05-28 | 株式会社ソニー・コンピュータエンタテインメント | Program, information input device, and control method thereof |
KR101091335B1 (en) * | 2010-05-13 | 2011-12-07 | (주)네오위즈게임즈 | Methods, apparatus and recording media for playing games |
US20120026100A1 (en) * | 2010-07-30 | 2012-02-02 | Migos Charles J | Device, Method, and Graphical User Interface for Aligning and Distributing Objects |
US9950256B2 (en) * | 2010-08-05 | 2018-04-24 | Nri R&D Patent Licensing, Llc | High-dimensional touchpad game controller with multiple usage and networking modalities |
KR101685982B1 (en) * | 2010-09-01 | 2016-12-13 | 엘지전자 주식회사 | Mobile terminal and Method for controlling 3 dimension display thereof |
JP5478439B2 (en) * | 2010-09-14 | 2014-04-23 | 任天堂株式会社 | Display control program, display control system, display control apparatus, and display control method |
US9146674B2 (en) * | 2010-11-23 | 2015-09-29 | Sectra Ab | GUI controls with movable touch-control objects for alternate interactions |
US20120179963A1 (en) * | 2011-01-10 | 2012-07-12 | Chiang Wen-Hsiang | Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display |
US20120218203A1 (en) * | 2011-02-10 | 2012-08-30 | Kanki Noriyoshi | Touch drawing display apparatus and operation method thereof, image display apparatus allowing touch-input, and controller for the display apparatus |
JP5945100B2 (en) * | 2011-06-03 | 2016-07-05 | 任天堂株式会社 | INPUT PROCESSING PROGRAM, INPUT PROCESSING DEVICE, INPUT PROCESSING METHOD, AND INPUT PROCESSING SYSTEM |
JP5894380B2 (en) | 2011-06-15 | 2016-03-30 | 株式会社スクウェア・エニックス | Video game processing apparatus and video game processing program |
KR101864618B1 (en) * | 2011-09-06 | 2018-06-07 | 엘지전자 주식회사 | Mobile terminal and method for providing user interface thereof |
KR20130052797A (en) * | 2011-11-14 | 2013-05-23 | 삼성전자주식회사 | Method of controlling application using touchscreen and a terminal supporting the same |
KR101888457B1 (en) * | 2011-11-16 | 2018-08-16 | 삼성전자주식회사 | Apparatus having a touch screen processing plurality of apllications and method for controlling thereof |
US20130120258A1 (en) * | 2011-11-16 | 2013-05-16 | Daryl D. Maus | Multi-touch input device |
JP5460679B2 (en) * | 2011-11-28 | 2014-04-02 | ソニー株式会社 | Information processing apparatus, information processing method, and data structure of content file |
JP2013117885A (en) * | 2011-12-02 | 2013-06-13 | Nintendo Co Ltd | Information processing program, information processing equipment, information processing system and information processing method |
US20130154959A1 (en) * | 2011-12-20 | 2013-06-20 | Research In Motion Limited | System and method for controlling an electronic device |
TW201327334A (en) * | 2011-12-28 | 2013-07-01 | Fih Hong Kong Ltd | Touchable electronic device and finger touch input method |
KR20130095970A (en) * | 2012-02-21 | 2013-08-29 | 삼성전자주식회사 | Apparatus and method for controlling object in device with touch screen |
US9575652B2 (en) * | 2012-03-31 | 2017-02-21 | Microsoft Technology Licensing, Llc | Instantiable gesture objects |
US8814674B2 (en) * | 2012-05-24 | 2014-08-26 | Supercell Oy | Graphical user interface for a gaming system |
US20130285924A1 (en) * | 2012-04-26 | 2013-10-31 | Research In Motion Limited | Method and Apparatus Pertaining to the Interpretation of Touch-Based Actions |
US9916396B2 (en) * | 2012-05-11 | 2018-03-13 | Google Llc | Methods and systems for content-based search |
JP5450718B2 (en) * | 2012-06-11 | 2014-03-26 | 株式会社スクウェア・エニックス | GAME DEVICE AND GAME PROGRAM |
JP6643776B2 (en) * | 2015-06-11 | 2020-02-12 | 株式会社バンダイナムコエンターテインメント | Terminal device and program |
JP6418299B1 (en) * | 2017-09-15 | 2018-11-07 | 株式会社セガゲームス | Information processing apparatus and program |
2012
- 2012-05-23 JP JP2012117972A patent/JP5377709B2/en active Active

2013
- 2013-01-21 EP EP13152033.0A patent/EP2667294A3/en not_active Withdrawn
- 2013-01-24 US US13/748,937 patent/US20130316817A1/en not_active Abandoned

2017
- 2017-08-28 US US15/688,146 patent/US10831258B2/en active Active

2019
- 2019-07-19 US US16/517,143 patent/US11119564B2/en active Active
Patent Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080274780A1 (en) * | 2001-06-18 | 2008-11-06 | Canon Kabushiki Kaisha | Computer device for implementing a trading card game and control method therefor, program executed by computer device, controller, system, and game cards |
US20060010400A1 (en) * | 2004-06-28 | 2006-01-12 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications |
US20060025218A1 (en) * | 2004-07-29 | 2006-02-02 | Nintendo Co., Ltd. | Game apparatus utilizing touch panel and storage medium storing game program |
US20080132333A1 (en) * | 2006-07-11 | 2008-06-05 | Aruze Corp. | Gaming machine and image alteration control method of gaming machine |
US20110169762A1 (en) * | 2007-05-30 | 2011-07-14 | Microsoft Corporation | Recognizing selection regions from multiple simultaneous input |
US20090051114A1 (en) * | 2007-08-24 | 2009-02-26 | Tc Digital Games, Llc | Systems and Methods for Multi-Platform Trading Card Game |
US20090054124A1 (en) * | 2007-08-24 | 2009-02-26 | Rob Robbers | System and methods for multi-platform trading card game |
US20100141680A1 (en) * | 2008-09-12 | 2010-06-10 | Tatsushi Nashida | Information processing apparatus and information processing method |
US20100083111A1 (en) * | 2008-10-01 | 2010-04-01 | Microsoft Corporation | Manipulation of objects on multi-touch user interface |
US20110009195A1 (en) * | 2009-07-08 | 2011-01-13 | Gunjan Porwal | Configurable representation of a virtual button on a game controller touch screen |
US20110130182A1 (en) * | 2009-11-27 | 2011-06-02 | Konami Digital Entertainment Co., Ltd. | Game apparatus, computer-readable recording medium recorded with game control program, and game control method |
US20130321340A1 (en) * | 2011-02-10 | 2013-12-05 | Samsung Electronics Co., Ltd. | Portable device comprising a touch-screen display, and method for controlling same |
US20130167062A1 (en) * | 2011-12-22 | 2013-06-27 | International Business Machines Corporation | Touchscreen gestures for selecting a graphical object |
US20130205208A1 (en) * | 2012-02-06 | 2013-08-08 | Hans H. Kim | User Interface Control for Media Editing Application |
US20130275868A1 (en) * | 2012-04-12 | 2013-10-17 | Supercell Oy | System, method and graphical user interface for controlling a game |
US20150113477A1 (en) * | 2012-04-12 | 2015-04-23 | Supercell Oy | System and method for controlling technical processes |
Non-Patent Citations (2)
Title |
---|
Blizzard Entertainment, StarCraft Manual, 03/31/1998, Page 20 (Grouping Units) * |
Blizzard Entertainment, WarCraft II: Battle.Net Edition Manual, 12/09/1995, Page 14 (Grouping Units) * |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11776352B2 (en) * | 2012-05-24 | 2023-10-03 | Supercell Oy | Graphical user interface for a gaming system |
US20220245991A1 (en) * | 2012-05-24 | 2022-08-04 | Supercell Oy | Graphical user interface for a gaming system |
US9223485B2 (en) * | 2013-02-27 | 2015-12-29 | Kyocera Document Solutions Inc. | Image processing apparatus, image forming apparatus including same, and method for controlling image processing apparatus |
US20140245216A1 (en) * | 2013-02-27 | 2014-08-28 | Kyocera Document Solutions Inc. | Image processing apparatus, image forming apparatus including same, and method for controlling image processing apparatus |
US10631825B2 (en) * | 2013-03-13 | 2020-04-28 | Samsung Electronics Co., Ltd. | Method of providing copy image and ultrasound apparatus therefor |
US20150141823A1 (en) * | 2013-03-13 | 2015-05-21 | Samsung Electronics Co., Ltd. | Method of providing copy image and ultrasound apparatus therefor |
US11096668B2 (en) | 2013-03-13 | 2021-08-24 | Samsung Electronics Co., Ltd. | Method and ultrasound apparatus for displaying an object |
US10849597B2 (en) | 2013-03-13 | 2020-12-01 | Samsung Electronics Co., Ltd. | Method of providing copy image and ultrasound apparatus therefor |
US9690473B2 (en) * | 2014-06-13 | 2017-06-27 | Zheng Shi | System and method for changing the state of user interface element marked on physical objects |
US20160179333A1 (en) * | 2014-06-13 | 2016-06-23 | Zheng Shi | System and method for changing the state of user interface element marked on physical objects |
US20180121061A1 (en) * | 2014-10-28 | 2018-05-03 | Gree, Inc. | Game program, computer control method, and information processing apparatus |
US11093121B2 (en) * | 2014-10-28 | 2021-08-17 | Gree, Inc. | Game program, computer control method, and information processing apparatus |
US11914847B2 (en) | 2014-10-28 | 2024-02-27 | Gree, Inc. | Game program, computer control method, and information processing apparatus |
US20240061564A1 (en) * | 2014-10-28 | 2024-02-22 | Gree, Inc. | Game program, computer control method, and information processing apparatus |
US20160256777A1 (en) * | 2015-03-05 | 2016-09-08 | Bandai Namco Entertainment Inc. | Method for controlling display of game image and server system |
US10391399B2 (en) * | 2015-04-13 | 2019-08-27 | Cygames, Inc. | Program, electronic device, and method that improve ease of operation for user input |
EP3345664A4 (en) * | 2015-09-29 | 2019-07-17 | Tencent Technology (Shenzhen) Company Limited | Information processing method, terminal and computer storage medium |
US10786733B2 (en) | 2015-09-29 | 2020-09-29 | Tencent Technology (Shenzhen) Company Limited | Information processing method, terminal, and computer storage medium for releasing virtual skill object based on user gesture |
US20180264354A1 (en) * | 2017-03-17 | 2018-09-20 | Gree, Inc. | Program, control method, and information processing device |
US10758816B2 (en) * | 2017-03-17 | 2020-09-01 | Gree, Inc. | Computer-readable recording medium, method, and electronic device that modify a display position of a plurality of characters on an interface |
US11752432B2 (en) * | 2017-09-15 | 2023-09-12 | Sega Corporation | Information processing device and method of causing computer to perform game program |
US20230166187A1 (en) * | 2020-03-31 | 2023-06-01 | Bandai Co., Ltd. | Program, terminal, and game system |
US11577157B2 (en) * | 2020-07-08 | 2023-02-14 | Nintendo Co., Ltd. | Systems and method of controlling game operations based on touch input |
US11590413B2 (en) * | 2020-07-08 | 2023-02-28 | Nintendo Co., Ltd. | Storage medium storing information processing program with changeable operation modes, information processing apparatus, information processing system, and information processing method |
US20220008820A1 (en) * | 2020-07-08 | 2022-01-13 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing apparatus, information processing system, and information processing method |
CN114327109A (en) * | 2020-09-30 | 2022-04-12 | 明基智能科技(上海)有限公司 | Touch operation method and touch operation system |
US20220100365A1 (en) * | 2020-09-30 | 2022-03-31 | Benq Corporation | Touch control method and touch control system applying the same |
US11604578B2 (en) * | 2020-09-30 | 2023-03-14 | Benq Corporation | Touch control method and touch control system applying the same |
CN113918076A (en) * | 2021-12-15 | 2022-01-11 | 深圳佑驾创新科技有限公司 | Touch method, touch device and storage medium of touch screen |
US20230256336A1 (en) * | 2022-02-16 | 2023-08-17 | Nintendo Co., Ltd. | Computer-readable non-transitory storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method |
US12023588B2 (en) * | 2022-02-16 | 2024-07-02 | Nintendo Co., Ltd. | Computer-readable non-transitory storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method |
Also Published As
Publication number | Publication date |
---|---|
EP2667294A2 (en) | 2013-11-27 |
EP2667294A3 (en) | 2016-06-01 |
US20190339765A1 (en) | 2019-11-07 |
JP5377709B2 (en) | 2013-12-25 |
US20180011529A1 (en) | 2018-01-11 |
US10831258B2 (en) | 2020-11-10 |
US11119564B2 (en) | 2021-09-14 |
JP2013246521A (en) | 2013-12-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11119564B2 (en) | Information processing apparatus, method for information processing, and game apparatus for performing different operations based on a movement of inputs | |
US9149716B2 (en) | Game apparatus and game program | |
JP5813948B2 (en) | Program and terminal device | |
JP5449422B2 (en) | SCREEN SCROLL DEVICE, SCREEN SCROLL METHOD, AND GAME DEVICE | |
JP6643776B2 (en) | Terminal device and program | |
US8910075B2 (en) | Storage medium storing information processing program, information processing apparatus and information processing method for configuring multiple objects for proper display | |
JP6387299B2 (en) | Input processing apparatus and program | |
JP2017144158A (en) | Program and game device | |
JP2018166943A (en) | Game system and program | |
JP6715361B2 (en) | Information processing device, information processing method, and game device | |
JP6480537B2 (en) | Information processing apparatus, information processing method, and game apparatus | |
JP6230136B2 (en) | Information processing apparatus, information processing method, and game apparatus | |
JP6956209B2 (en) | Terminal devices and programs | |
JP6235544B2 (en) | Program, computer apparatus, screen control method, and system | |
US9962606B2 (en) | Game apparatus | |
JP5980752B2 (en) | Information processing apparatus, information processing method, and game apparatus | |
JP5826244B2 (en) | Game system | |
JP2006102239A (en) | Program, information storage medium, and image generation system | |
JP2017143991A (en) | Program and game device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KABUSHIKI KAISHA SQUARE ENIX (ALSO TRADING AS SQUARE ENIX CO., LTD.) Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TANAZAWA, YUUICHI;MIYATA, DAISUKE;ASAO, YOSHIMASA;AND OTHERS;REEL/FRAME:030084/0356 Effective date: 20121213 |
AS | Assignment |
Owner name: KABUSHIKI KAISHA SQUARE ENIX (ALSO TRADING AS SQUARE ENIX CO., LTD.) Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE NAME OF THE FIRST INVENTOR (YUUICHI TANAZAWA) PREVIOUSLY RECORDED ON REEL 030084 FRAME 0356. ASSIGNOR(S) HEREBY CONFIRMS THE NAME OF THE FIRST INVENTOR (YUUICHI TANZAWA);ASSIGNORS:TANZAWA, YUUICHI;MIYATA, DAISUKE;ASAO, YOSHIMASA;AND OTHERS;REEL/FRAME:030487/0446 Effective date: 20121213 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |