WO2005065798A1 - Information processing system, entertainment system, and input accepting method for information processing system - Google Patents
Information processing system, entertainment system, and input accepting method for information processing system
- Publication number
- WO2005065798A1 PCT/JP2005/000038 JP2005000038W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- touch
- image
- computer
- touch points
- video image
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 60
- 230000010365 information processing Effects 0.000 title claims abstract description 20
- 230000008569 process Effects 0.000 claims description 51
- 238000001514 detection method Methods 0.000 claims description 15
- 238000004590 computer program Methods 0.000 claims description 6
- 230000007704 transition Effects 0.000 description 12
- 230000006870 function Effects 0.000 description 8
- 238000010586 diagram Methods 0.000 description 7
- 239000004065 semiconductor Substances 0.000 description 7
- 239000000463 material Substances 0.000 description 2
- 230000004397 blinking Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 230000007123 defense Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 230000005236 sound signal Effects 0.000 description 1
Classifications
-
- A63F13/10—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- A63F13/655—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/1093—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
- A63F2300/695—Imported photos, e.g. of the player
Definitions
- the present invention relates to an interface that accepts input from a player in an information processing system, and more particularly to an interface that uses video images taken by a camera.
- Patent Document 1 discloses an image processing apparatus that uses an image taken by a camera as an input interface.
- The image processing apparatus disclosed in Patent Document 1 can be applied to an information processing system or to the input interface of an entertainment system. When so applied, the value of the entire system can be increased by enhancing its entertainment properties.
- Accordingly, an object of the present invention is to provide a technique related to an input interface with improved entertainment properties.
- An information processing system according to the present invention includes image generating means for generating a computer image that prompts the player to virtually touch a plurality of touch points, photographing means for shooting the player, means for receiving input of the video image, display control means for displaying the video image and the computer image on a display device, and detection means for analyzing the video image while the computer image is displayed and detecting virtual touches on the touch points.
- The detection means may detect a virtual touch when an object of a specific color worn by the player in the video image overlaps one of the plurality of touch points in the computer image.
- The image generating means may sequentially generate computer images each including a navigation that indicates the single touch point to be touched next.
- In that case, the means for executing the predetermined process may execute that process when the detection means detects a virtual touch for each of the sequentially generated computer images including the navigation.
- Alternatively, the image generating means may sequentially generate computer images each showing the two touch points to be touched next, and the means for executing the predetermined process may execute that process when the detection means detects simultaneous virtual touches on the two touch points for each of the sequentially generated computer images including the navigation.
- The image generating means may also generate a computer image including a navigation that indicates the order in which the touch points are to be touched.
- In that case, the means for executing the predetermined process may execute that process when the detection means detects that the virtual touches have been applied in accordance with the navigation.
- An information processing system according to the present invention may further include object display means that, when the detection means detects that virtual touches have been continuously applied to a plurality of touch points, displays an object connecting the touched points.
- An entertainment system according to the present invention includes means for generating a computer image including images that specify a plurality of areas, means for receiving input of a video image taken by photographing means, display control means for displaying the video image and the computer image on a display device in a superimposed manner, and means for analyzing the video image with reference to the computer image. The image generating means generates, in a predetermined order, a plurality of images for prompting input, in each of which the selected area is displayed in a manner different from the others, and the analysis means analyzes the video image for each of the images for prompting input.
- FIG. 1 is an overall configuration diagram of an entertainment system according to an embodiment of the present invention.
- FIG. 2 is a configuration diagram of an entertainment device.
- FIG. 3 is a functional configuration diagram of the entertainment device.
- FIG. 4 (a) is a diagram showing an example of a video image after mirror processing.
- FIG. 4 (b) is a diagram showing an example of an interface image.
- FIG. 4 (c) is a diagram showing an example of a superimposed image.
- FIG. 5 is a flowchart of the entire input reception process.
- FIG. 6 is a flowchart of touch acceptance processing in the case of single navigation.
- FIG. 7 is a flowchart of touch determination processing.
- FIG. 8 shows transition images of the object image in the case of single navigation.
- Figure 8 (a) shows the first touch point displayed in flash.
- Figure 8 (b) shows the first and second touch points displayed in flash.
- Figure 8 (c) shows the light line connecting the first and second touch points and the last touch point displayed in flash.
- Figure 8 (d) shows the touch points connected by the light line.
- FIG. 9 is a flowchart of touch acceptance processing in the case of double navigation.
- FIG. 10 shows transition images of the object image in the case of double navigation.
- Figure 10 (a) shows the first touch point displayed in flash.
- Figure 10 (b) shows the first touch points connected by the light line and the second touch points displayed in flash.
- Figure 10 (c) shows the first touch points and the second touch points each connected by light lines.
- FIG. 11 is a flowchart of a touch acceptance process in the case of lightning navigation.
- FIG. 12 shows transition images of the object image in the case of lightning navigation.
- Figure 12 (a) shows all touch points to be touched.
- Figure 12 (b) shows that the lightning line connecting the first and second touch points has been deleted.
- Figure 12 (c) shows that the lightning line connecting the second and third touch points has been deleted.
- Figure 12 (d) shows that the lightning line connecting the third touch point and the first touch point has been deleted.
- FIG. 13 is a flowchart of a touch acceptance process when there is no navigation.
- An example of the configuration of the entertainment system according to the present embodiment is shown in FIG. 1.
- an analog or digital video camera 1 shoots a player 4 in a position facing the display device 3, and the entertainment device 2 continuously captures the resulting moving images.
- the computer image (CG) generated by the entertainment device 2 and the mirror image of the video image captured from the video camera 1 are displayed on the display device 3 in real time.
- The mirror moving image can be generated by the entertainment device 2 mirroring (left-right reversing) the moving image captured by the video camera 1, or alternatively a mirror may be placed in front of the video camera 1 and a moving image of the mirror surface showing the operator may be captured with the video camera 1. In either case, a composite image whose display form changes in real time according to the movement of the target is displayed on the display device 3.
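The patent itself contains no code, but the mirror processing described above amounts to a horizontal flip of each captured frame. Below is a minimal Python/OpenCV sketch of that loop; the camera index 0 and the window name are assumptions standing in for video camera 1 and display device 3.

```python
# Hypothetical sketch of the mirror (left/right reversal) processing.
import cv2

cap = cv2.VideoCapture(0)            # assumed stand-in for video camera 1
while True:
    ok, frame = cap.read()           # capture one frame of the video image
    if not ok:
        break
    mirrored = cv2.flip(frame, 1)    # flipCode=1 flips around the vertical axis
    cv2.imshow("display device 3", mirrored)
    if cv2.waitKey(1) & 0xFF == 27:  # ESC stops the loop
        break
cap.release()
cv2.destroyAllWindows()
```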
- the entertainment apparatus 2 is realized by a computer that forms a required function by a computer program.
- As shown in the hardware configuration of FIG. 2, the computer according to this embodiment has two buses, a main bus B1 and a sub bus B2, to which a plurality of semiconductor devices each having a unique function are connected. These buses B1 and B2 are connected to or disconnected from each other via a bus interface INT.
- Connected to the main bus B1 are a main CPU 10, which is the main semiconductor device; a main memory 11 constituted by RAM; a main DMAC (Direct Memory Access Controller) 12; an MPEG (Moving Picture Experts Group) decoder (MDEC) 13; and a drawing processing unit (Graphics Processing Unit, hereinafter "GPU") 14 incorporating a frame memory 15 that serves as drawing memory. A CRTC (CRT Controller) 16, which generates a video output signal for displaying the contents of the frame memory 15 on the display device, is connected to the GPU 14.
- When the computer is started, the main CPU 10 reads a start program from the ROM 23 on the sub bus B2 via the bus interface INT, executes it, and runs the operating system. It also controls the media drive 27, reading application programs and data from the media 28 loaded in the media drive 27 and storing them in the main memory 11.
- Furthermore, for various data read from the media 28, such as three-dimensional object data composed of a plurality of basic figures (polygons), i.e., the coordinate values of polygon vertices (representative points) and the like, the main CPU 10 performs geometry processing (coordinate value calculation processing) to express the shape and movement of objects, and generates a display list containing polygon definition information obtained by that geometry processing (designation of the shape of each polygon to be used and its drawing position, and the type, color tone, texture, and so on of the material constituting the polygon).
- The GPU 14 is a semiconductor device that holds drawing contexts (drawing data including polygon materials), reads the necessary drawing context according to the display list notified from the main CPU 10, and performs rendering processing (drawing processing) to draw polygons into the frame memory 15.
- The frame memory 15 can also be used as texture memory, so a pixel image in the frame memory can be pasted as a texture onto a polygon to be drawn.
- The main DMAC 12 is a semiconductor device that controls DMA transfers for the circuits connected to the main bus B1 and, depending on the state of the bus interface INT, also controls DMA transfers for the circuits connected to the sub bus B2. The MDEC 13 is a semiconductor device that operates in parallel with the main CPU 10 and has the function of decompressing data compressed in the MPEG (Moving Picture Experts Group) or JPEG (Joint Photographic Experts Group) format.
- Connected to the sub bus B2 are a sub CPU 20 constituted by a microprocessor, a sub memory 21 constituted by RAM, a sub DMAC 22, a ROM 23 in which control programs such as the operating system are stored, a sound processing semiconductor device (SPU, Sound Processing Unit) 24 that reads sound data stored in a sound memory 25 and outputs it as an audio output, a communication control unit (ATM) 26 that sends and receives information to and from external devices via a network (not shown), a media drive 27 for loading media 28 such as CD-ROMs and DVD-ROMs, and an input unit 31.
- the sub CPU 20 performs various operations in accordance with the control program stored in the ROM 23.
- The sub DMAC 22 is a semiconductor device that controls DMA transfers and the like for the circuits connected to the sub bus B2 only while the bus interface INT disconnects the main bus B1 from the sub bus B2.
- The input unit 31 includes a connection terminal 32 to which an input signal from the operation device 35 is input, a connection terminal 33 to which an image signal from the video camera 1 is input, and a connection terminal 34 to which an audio signal from the video camera 1 is input. In this specification, only images are described; the description of sound is omitted for convenience.
- The computer configured as described above operates as the entertainment device 2 when the main CPU 10, the sub CPU 20, and the GPU 14 read and execute the necessary computer programs from recording media such as the ROM 23 and the media 28.
- The entertainment apparatus 2 comprises, as functional blocks, a video image input unit 101, an image inversion unit 102, a determination unit 103, a main control unit 104, a CG generation unit 105, a superimposed image generation unit 106, a display control unit 107, and a touch pattern storage unit 108.
- the video image input unit 101 captures a video image captured by the video camera 1.
- the video image is a moving image, and the video image input unit 101 continuously captures images sent from the video camera 1.
- the image reversing unit 102 performs mirror surface processing, that is, left / right reversing processing, on the video image captured by the video image input unit 101.
- An example of a mirror-processed video image 200 showing the player is shown in Fig. 4 (a). Subsequent processing is performed on the mirrored video image.
- the main control unit 104 controls the entire entertainment system. For example, when the entertainment device 2 is executing a game program, the main control unit 104 determines a game story or the like according to the program. Further, when the main control unit 104 determines a story, the determination result of the determination unit 103 may be referred to. Details of this will be described later.
- the CG generation unit 105 generates various computer images along the game story in accordance with instructions from the main control unit 104. For example, a computer image (interface image) 300 for an interface for accepting a request from a player as shown in FIG. 4 (b) is generated.
- the interface image 300 includes an object image 310.
- The superimposed image generation unit 106 generates a superimposed image in which the video image mirror-processed by the image reversing unit 102 and the computer image generated by the CG generation unit 105 are superimposed. For example, when the superimposed image generation unit 106 superimposes the video image 200 shown in Fig. 4 (a) and the interface image 300 shown in Fig. 4 (b), the superimposed image 400 shown in Fig. 4 (c) is generated.
- the display control unit 107 causes the display device 3 to display the superimposed image generated by the superimposed image generation unit 106.
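As a rough sketch of what the superimposed image generation unit 106 does, the mirrored video frame and the CG frame can be alpha-blended per pixel. The alpha mask convention below is an assumption; the patent states only that the two images are superimposed.

```python
# Hypothetical blend of video image 200 and interface image 300.
import numpy as np

def superimpose(video: np.ndarray, cg: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """video, cg: HxWx3 uint8 frames; alpha: HxW floats in [0, 1], 1.0 = opaque CG."""
    a = alpha[..., None]             # broadcast the mask over the color channels
    blended = cg.astype(np.float32) * a + video.astype(np.float32) * (1.0 - a)
    return blended.astype(np.uint8)  # the superimposed image 400
```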
- the touch pattern storage unit 108 stores a navigation pattern to be described later and a specific touch pattern when a touch is received without navigation.
- the navigation pattern and touch pattern may be registered by the player.
- The determination unit 103 analyzes the video image 200 captured from the image inversion unit 102 with reference to the interface image 300 captured from the CG generation unit 105, and determines whether the video image 200 is a predetermined image corresponding to the interface image 300. Since the video image 200 is a moving image, this determination is performed in units of frames. For example, the determination unit 103 determines the presence or absence of a predetermined motion corresponding to the interface image 300 using the difference between frames.
- the determination unit 103 includes a counter 103a used when calculating the interframe difference. The counter 103a can count a plurality of values.
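A minimal sketch of this inter-frame difference test for a single touch point region follows, assuming BGR frames and a region given as a pair of slices; the two thresholds are illustrative values, since the patent specifies only "a predetermined motion or more".

```python
# Hypothetical frame-difference test for one touch point 320.
import numpy as np

def motion_pixels(prev: np.ndarray, cur: np.ndarray, region, diff_thresh: int = 30) -> int:
    """Count pixels inside `region` (a (slice, slice) pair) that changed noticeably."""
    d = np.abs(cur[region].astype(np.int16) - prev[region].astype(np.int16))
    return int(np.count_nonzero(d.max(axis=-1) > diff_thresh))

def is_touched(prev: np.ndarray, cur: np.ndarray, region, pixel_thresh: int = 50) -> bool:
    # A virtual touch is reported when enough moving pixels overlap the touch point.
    return motion_pixels(prev, cur, region) >= pixel_thresh
```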
- the interface image 300 includes a substantially annular object image 310.
- twelve touch points 320 are arranged in an approximately circular shape and at approximately equal intervals. Therefore, in this case, the determination unit 103 determines whether or not there is a predetermined movement or more in an area corresponding to the touch point 320 of the video image 200.
- When the determination unit 103 detects a predetermined amount of motion or more in the area of the video image 200 corresponding to a touch point 320, it determines that the player has virtually touched that touch point 320 (hereinafter simply referred to as a touch).
- A typical case in which a touch is determined is when the player, watching the object image 310 and his or her own appearance shown on the display device 3, moves so as to touch the touch point 320.
- Alternatively, instead of motion, it may be determined whether or not a predetermined number of pixels are present in the region corresponding to the touch point 320.
- Color detection and motion detection may also be combined to determine motion for a predetermined color. For example, the player may wear a glove of a specific color (for example, red); a touch may be determined when the image of the glove overlaps the touch point 320 and the number of red pixels within the touch point 320 is greater than or equal to a predetermined number.
- Alternatively, a touch at the touch point 320 may be determined when the number of red pixels is equal to or greater than a predetermined number and a predetermined amount of motion or more is detected among the red pixels.
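A sketch of that combined color-and-motion test is shown below. The HSV bounds for "red" and all thresholds are assumptions; the patent only speaks of a predetermined number of red pixels and a predetermined amount of motion.

```python
# Hypothetical red-glove test combining color detection and motion detection.
import cv2
import numpy as np

def red_mask(bgr: np.ndarray) -> np.ndarray:
    """Binary mask of assumed 'red glove' pixels (red wraps around hue 0 in HSV)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    lo = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255))
    hi = cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))
    return cv2.bitwise_or(lo, hi)

def red_touch(prev, cur, region, count_thresh=40, diff_thresh=30, motion_thresh=20) -> bool:
    mask = red_mask(cur[region]) > 0
    if np.count_nonzero(mask) < count_thresh:   # not enough red pixels over the point
        return False
    d = np.abs(cur[region].astype(np.int16)
               - prev[region].astype(np.int16)).max(axis=-1)
    return np.count_nonzero(mask & (d > diff_thresh)) >= motion_thresh  # motion among red pixels
```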
- pattern recognition may be used to determine the presence or absence of touch by detecting the movement of the player's arms, hands, fingertips, or gloves.
- In addition, the location in the superimposed image 400 where a detection target such as the glove is detected may be emphasized, for example by increasing its brightness or by blinking. Such emphasis is preferably applied throughout the entire superimposed image 400, regardless of the touch point 320 areas.
- the main control unit 104 determines the timing for displaying the object image 310. Based on the instruction from the main control unit 104, the CG generation unit 105 uses the object image 310 to generate a navigation image as described below in order to prompt the player to touch.
- Navigation guides the player as to which touch points are to be touched and in what order.
- Specifically, an image in which the touch point to be touched is shown in a different manner (for example, a flash display that appears to shine, a different color, or a blinking display) is presented to prompt the player to touch.
- In the present embodiment, a case will be described in which touch points are flash-displayed to guide the player.
- A plurality of navigation patterns are prepared and stored in the touch pattern storage unit 108.
- For each navigation pattern, which touch points are to be touched, in what order, and how many at a time are predetermined.
- The first pattern is single navigation, which flashes the touch points to be touched one at a time.
- The second pattern is double navigation, which simultaneously flashes two touch points to be touched.
- The third pattern is lightning navigation, which simultaneously flashes all the points to be touched, together with numbers indicating the order in which they are to be touched.
- The fourth is a no-navigation pattern, in which the player's voluntary touches are accepted without navigation (a data sketch of these patterns follows this list).
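As that sketch, one way the touch pattern storage unit 108 might represent these patterns; the concrete point sequences are taken from the figure examples (Figs. 8, 10, and 12) and are illustrative only.

```python
# Hypothetical representation of the four navigation patterns.
NAV_PATTERNS = {
    "single":    [12, 4, 7],          # flash one touch point at a time, in this order
    "double":    [(7, 12), (2, 10)],  # flash each pair of touch points simultaneously
    "lightning": [4, 7, 12],          # show all points at once, numbered in this order
    "none":      None,                # no navigation; free touches are accepted
}
```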
- The player touches continuously using the object image 310, and when a predetermined input is completed, the entertainment device 2 accepts it and executes a predetermined function.
- As the predetermined function, for example, a character is called, or an attack or defense is performed. A different function may correspond to each touch pattern.
- For example, when touches have been completed in accordance with a navigation pattern, the entertainment device 2 recognizes that a predetermined input has been made.
- a touch point indicating completion of input may be separately displayed, and when a touch at that touch point is received, it may be recognized that an input has been made.
- an instruction to complete the input may be received from the operation device 35.
- FIG. 5 shows the entire processing procedure of the input receiving process.
- the CG generation unit 105 generates a computer image 300 including the object image 310 based on an instruction from the main control unit 104.
- The superimposed image generation unit 106 superimposes the mirror-processed video image 200 (Fig. 4 (a)) acquired from the image inversion unit 102 and the interface image 300 (Fig. 4 (b)), and the resulting superimposed image 400 including the object image 310 (Fig. 4 (c)) is displayed on the display device 3 (S101).
- the main control unit 104 selects a navigation pattern (S102).
- the process is divided between the case with navigation and the case without navigation.
- the main control unit 104 selects a navigation pattern stored in the touch pattern storage unit 108. Then, the entertainment device 2 displays a navigation according to the selected navigation pattern on the display device 3 and performs a process of accepting the player's touch (S103). Details of this processing will be described later. Then, the main control unit 104 determines whether or not a touch according to the navigation pattern has been performed by the player performing a touch according to the navigation (S104).
- When no navigation is selected in step S102, the main control unit 104 does not instruct a navigation display. Instead, while the object image 310 is displayed, the player may touch of his or her own will (S106). Details of this processing will be described later.
- the main control unit 104 refers to the touch pattern storage unit 108 and determines whether or not the accepted touch pattern matches a pre-registered touch pattern (S107).
- If the pattern matches (S107: Yes), the main control unit 104 performs a process for calling a specific character corresponding to the registered pattern (S108); otherwise, a process for calling a general character is performed (S109). As a result, an input interface using the player's own image is realized.
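A sketch of this S107-S109 dispatch follows; the registered sequences and function names are hypothetical placeholders.

```python
# Hypothetical lookup from a completed touch sequence to a game function.
REGISTERED_PATTERNS = {
    (12, 4, 7): "call_specific_character_A",
    (7, 12, 2, 10): "call_specific_character_B",
}

def dispatch(touch_sequence: list[int]) -> str:
    """Return the matching registered action (S108) or the general default (S109)."""
    return REGISTERED_PATTERNS.get(tuple(touch_sequence), "call_general_character")
```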
- a touch pattern can be registered using the same interface.
- That is, in a registration acceptance mode, the player touches touch points using the object image 310 displayed on the display device 3, and the touch pattern at that time is stored in the touch pattern storage unit 108.
- Next, the detailed processing for accepting the player's touch in step S103 will be described.
- the navigation patterns for single navigation, double navigation, and lightning navigation will be described.
- FIG. 6 shows a flowchart in the case of single navigation.
- First, the main control unit 104 identifies a touch point according to the navigation pattern and flash-displays it (S21) (see Fig. 8 (a)).
- Next, the determination unit 103 performs a determination process for deciding whether or not there has been a touch (S22). Details of the touch determination process will be described later.
- the main control unit 104 determines whether or not there is a touch at the touch point during flash display (S23). If there is no touch at the touch point displayed in flash (S23: No), the process ends. If a touch is found (S23: Yes), it is determined whether this touch point is the first touch point (S24).
- the determination unit 103 analyzes the video image, and identifies a video image area that overlaps the position of the touch point displayed in flash in the interface image (S51). Then, the counter 103a for counting the number of pixels is initialized (S52).
- the frame of the video image to be processed is updated (S53).
- The time for accepting a touch is limited; when the time limit is exceeded, a timeout occurs, which the determination unit 103 monitors (S54).
- the touch determination is performed by detecting the motion using the inter-frame difference.
- When the player wears a red glove as described above and the touch determination is made by color detection, the number of red pixels is counted in step S55, and it may be determined that there has been a touch if the count is greater than or equal to a predetermined number.
- Determination by color and determination by motion may also be combined.
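Putting the Fig. 7 loop together as a sketch (reusing motion_pixels from the earlier sketch; read_frame and the time limit are assumptions standing in for the video image input unit 101 and the actual timeout):

```python
# Hypothetical acceptance loop for one flash-displayed touch point (Fig. 7).
import time

def wait_for_touch(read_frame, region, time_limit=5.0,
                   diff_thresh=30, pixel_thresh=50) -> bool:
    """Return True when a touch is detected, False on timeout."""
    prev = read_frame()                    # S51/S52: region fixed, counter reset
    deadline = time.monotonic() + time_limit
    while time.monotonic() < deadline:     # S54: timeout monitoring
        cur = read_frame()                 # S53: advance to the next frame
        if motion_pixels(prev, cur, region, diff_thresh) >= pixel_thresh:
            return True                    # S55: enough moving pixels -> touch
        prev = cur
    return False
```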
- the interface image includes an object image 310.
- In Fig. 8 (a), the first touch point (here, "touch point 12") is flash-displayed 311.
- When touch point 12 is touched, the screen transitions to Fig. 8 (b), where the flash display 311 is also applied to the second touch point (here, "touch point 4").
- When touch point 4 is touched in Fig. 8 (b), a transition is made to Fig. 8 (c), and the first and second touch points are connected by the light line 312.
- the flash display 311 is also added to the third touch point (here, “touch point 7”).
- When touch point 7 is touched in Fig. 8 (c), a transition is made to Fig. 8 (d), where the second and third touch points are connected by a light line 312. Further, in the example of Fig. 8, since the third touch point 7 is the final touch point, a light line 312 also connects it back to the first touch point 12. In this state, input acceptance from the player for single navigation is completed.
- the detected touch point and the previous touch point are connected by a light line (S25, FIG. 8 (c)).
- The touched points may instead be connected by light lines all at once or sequentially when input at the last touch point is completed, rather than every time a touch is detected.
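A sketch of drawing those connecting light lines with OpenCV; the color and thickness are assumptions, since the patent only depicts a bright line 312.

```python
# Hypothetical rendering of light lines 312 between touched points.
import cv2

def draw_light_lines(cg, points, close_loop=False):
    """Connect successive (x, y) touch point centers; optionally close the ring."""
    for a, b in zip(points, points[1:]):
        cv2.line(cg, a, b, (255, 255, 255), thickness=6, lineType=cv2.LINE_AA)
    if close_loop and len(points) > 2:     # final point back to the first (Fig. 8 (d))
        cv2.line(cg, points[-1], points[0], (255, 255, 255),
                 thickness=6, lineType=cv2.LINE_AA)
    return cg
```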
- FIG. 9 shows a flowchart for the case of double navigation as the detailed processing in step S103.
- First, the main control unit 104 identifies two touch points according to the navigation pattern and flash-displays them simultaneously (S31) (see Fig. 10 (a)).
- the determination unit 103 performs a determination process as to whether or not there is a touch at each of the two touch points (S32).
- the touch determination process is the same as that shown in FIG. 7 for each of the two touch points.
- the main control unit 104 determines whether or not the two touch points in the flash display are touched almost simultaneously (S33). If there is no touch in either or both (S33: No), the process ends. If both touch points are touched (S33: Yes), the two touched points are connected by a light line (S34) (see Fig. 10 (a) and Fig. 10 (b)).
- Next, the main control unit 104 determines whether all the touch points of the navigation pattern have been touched and the navigation pattern has ended (S35). If the navigation pattern has not ended (S35: No), step S31 and subsequent steps are repeated. If the navigation pattern has ended (S35: Yes), the process ends.
- the navigation image includes an object image 310.
- the first touch points (here, "touch points 7 and 12") are shown with a flash display 311.
- When touch points 7 and 12 are touched at the same time in Fig. 10 (a), the transition is made to Fig. 10 (b). That is, the second touch points (here, "touch points 2 and 10") are flash-displayed 311, and the first touch points, touch points 7 and 12, are connected by the light line 312.
- When touch points 2 and 10 are touched simultaneously in Fig. 10 (b), the transition is made to Fig. 10 (c), and the second touch points, touch points 2 and 10, are connected by the light line 312. In this state, input acceptance from the player for double navigation is completed.
- the two points may be connected simultaneously or sequentially with a light line.
- A flowchart for the case of lightning navigation, as the detailed processing in step S103, is shown in Fig. 11.
- First, the main control unit 104 specifies, according to the navigation pattern, all the touch points included in the pattern and the order in which they are to be touched. Each touch point is flash-displayed together with a number indicating the touch order, and the touch points are connected by lightning lines (lines imitating lightning) (S41) (see Fig. 12 (a)).
- the determination unit 103 performs a determination process as to whether or not the touch has been made (S42).
- the touch determination process is the same as that shown in Fig. 7.
- the main control unit 104 determines whether or not there is a touch in the touch order (S43).
- Next, the main control unit 104 determines whether all the touch points of this navigation pattern have been touched and the navigation pattern has ended (S45). If the navigation pattern has not ended (S45: No), steps S41 and after are repeated. If the navigation pattern has ended (S45: Yes), the process ends.
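The order check of S43 can be expressed as a prefix test against the prescribed touch order; a minimal sketch:

```python
# Hypothetical S43-style check: touches so far must follow the numbered order.
def follows_order(expected: list[int], received: list[int]) -> bool:
    return received == expected[:len(received)]   # any out-of-order touch fails
```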
- the navigation image includes an object image 310.
- Fig. 12 (a) shows all the touch points to be touched by the player (here, "touch points 4, 7 and 12"), each with a number indicating the touch order and a flash display 311, and each connected by a lightning line 313.
- In the case of no navigation, it is detected whether a touch has been made at any touch point (S61). In other words, the touch determination process of Fig. 7 is performed for all the touch points, and if a touch is determined for any one of them, the process proceeds to step S62.
- the subsequent processing is similar to the processing in the case of single navigation (FIG. 6).
- the touched touch points are connected by a light line (S62-S65).
- In the case of no navigation, the total number (N) of touches to be accepted is predetermined. When the player has freely touched N touch points (S65: Yes), the Nth touch point and the first touch point are connected by the light line, and the process ends (S66).
- the touched touch points may be connected simultaneously or sequentially with a light line.
- The object image may be a polygon such as a triangle, or any of various other shapes. The touch points are arranged according to a certain rule, at equal or unequal intervals, along the outline of the object image. Alternatively, the object image may not be displayed and only the touch points may be displayed, or only the touch point that should be touched next may be displayed.
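For the annular layout described above, the twelve touch point centers can be computed at equal angular intervals; the center and radius values below are arbitrary assumptions.

```python
# Hypothetical clock-like placement of the twelve touch points 320.
import math

def touch_point_centers(cx=320, cy=240, r=180, n=12):
    """Centers of n touch points at equal intervals on a circle, point 12 on top."""
    pts = []
    for k in range(n):
        theta = 2 * math.pi * k / n - math.pi / 2
        pts.append((round(cx + r * math.cos(theta)),
                    round(cy + r * math.sin(theta))))
    return pts
```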
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005516871A JP4824409B2 (ja) | 2004-01-06 | 2005-01-05 | 情報処理システム、エンタテインメントシステム、および情報処理システムの入力受け付け方法 |
US10/585,465 US8345001B2 (en) | 2004-01-06 | 2005-01-05 | Information processing system, entertainment system, and information processing system input accepting method |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-000876 | 2004-01-06 | ||
JP2004000876 | 2004-01-06 | ||
JP2004122975 | 2004-04-19 | ||
JP2004-122975 | 2004-04-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005065798A1 true WO2005065798A1 (ja) | 2005-07-21 |
Family
ID=34752071
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/000038 WO2005065798A1 (ja) | 2004-01-06 | 2005-01-05 | 情報処理システム、エンタテインメントシステム、および情報処理システムの入力受け付け方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8345001B2 (ja) |
JP (1) | JP4824409B2 (ja) |
WO (1) | WO2005065798A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012001750A1 (ja) * | 2010-06-28 | 2012-01-05 | 株式会社ソニー・コンピュータエンタテインメント | ゲーム装置、ゲーム制御方法、及びゲーム制御プログラム |
JP2013056125A (ja) * | 2011-09-09 | 2013-03-28 | Sony Computer Entertainment Inc | ゲーム装置、ゲーム制御方法、及びゲーム制御プログラム |
JP2014149856A (ja) * | 2007-07-27 | 2014-08-21 | Qualcomm Inc | 高度なカメラをベースとした入力 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8373654B2 (en) * | 2010-04-29 | 2013-02-12 | Acer Incorporated | Image based motion gesture recognition method and system thereof |
JP5627973B2 (ja) * | 2010-09-24 | 2014-11-19 | 任天堂株式会社 | ゲーム処理をするためのプログラム、装置、システムおよび方法 |
US20130117698A1 (en) * | 2011-10-31 | 2013-05-09 | Samsung Electronics Co., Ltd. | Display apparatus and method thereof |
JP6518689B2 (ja) * | 2014-11-21 | 2019-05-22 | 株式会社ソニー・インタラクティブエンタテインメント | プログラムおよび情報処理装置 |
CN109643468B (zh) * | 2016-08-19 | 2023-10-20 | 索尼公司 | 图像处理装置和图像处理方法 |
CN113920547A (zh) * | 2021-12-14 | 2022-01-11 | 成都考拉悠然科技有限公司 | 一种基于神经网络的手套检测方法及其系统 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07281666A (ja) * | 1994-04-05 | 1995-10-27 | Casio Comput Co Ltd | 画像制御装置 |
JP2000010696A (ja) * | 1998-06-22 | 2000-01-14 | Sony Corp | 画像処理装置および方法、並びに提供媒体 |
JP2001321564A (ja) * | 1999-09-07 | 2001-11-20 | Sega Corp | ゲーム装置、これに使用する入力手段、及び記憶媒体 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR19990011180A (ko) * | 1997-07-22 | 1999-02-18 | 구자홍 | 화상인식을 이용한 메뉴 선택 방법 |
JP3725460B2 (ja) | 2000-10-06 | 2005-12-14 | 株式会社ソニー・コンピュータエンタテインメント | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス |
- 2005-01-05 WO PCT/JP2005/000038 patent/WO2005065798A1/ja active Application Filing
- 2005-01-05 US US10/585,465 patent/US8345001B2/en active Active
- 2005-01-05 JP JP2005516871A patent/JP4824409B2/ja not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07281666A (ja) * | 1994-04-05 | 1995-10-27 | Casio Comput Co Ltd | 画像制御装置 |
JP2000010696A (ja) * | 1998-06-22 | 2000-01-14 | Sony Corp | 画像処理装置および方法、並びに提供媒体 |
JP2001321564A (ja) * | 1999-09-07 | 2001-11-20 | Sega Corp | ゲーム装置、これに使用する入力手段、及び記憶媒体 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014149856A (ja) * | 2007-07-27 | 2014-08-21 | Qualcomm Inc | 高度なカメラをベースとした入力 |
US10268339B2 (en) | 2007-07-27 | 2019-04-23 | Qualcomm Incorporated | Enhanced camera-based input |
US10509536B2 (en) | 2007-07-27 | 2019-12-17 | Qualcomm Incorporated | Item selection using enhanced control |
US11500514B2 (en) | 2007-07-27 | 2022-11-15 | Qualcomm Incorporated | Item selection using enhanced control |
US11960706B2 (en) | 2007-07-27 | 2024-04-16 | Qualcomm Incorporated | Item selection using enhanced control |
WO2012001750A1 (ja) * | 2010-06-28 | 2012-01-05 | 株式会社ソニー・コンピュータエンタテインメント | ゲーム装置、ゲーム制御方法、及びゲーム制御プログラム |
JP2013056125A (ja) * | 2011-09-09 | 2013-03-28 | Sony Computer Entertainment Inc | ゲーム装置、ゲーム制御方法、及びゲーム制御プログラム |
US9072968B2 (en) | 2011-09-09 | 2015-07-07 | Sony Corporation | Game device, game control method, and game control program for controlling game on the basis of a position input received via touch panel |
Also Published As
Publication number | Publication date |
---|---|
US8345001B2 (en) | 2013-01-01 |
JPWO2005065798A1 (ja) | 2007-07-26 |
US20090174652A1 (en) | 2009-07-09 |
JP4824409B2 (ja) | 2011-11-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3725460B2 (ja) | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス | |
JP3847753B2 (ja) | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス | |
TWI469813B (zh) | 在動作擷取系統中追踪使用者群組 | |
US8081822B1 (en) | System and method for sensing a feature of an object in an interactive video display | |
JP2006518237A (ja) | データ処理の制御 | |
US20070126874A1 (en) | Image processing device, image processing method, and information storage medium | |
JP2006520213A (ja) | データ処理の制御 | |
JP4005060B2 (ja) | 情報処理システム、プログラムおよびゲームキャラクタ移動制御方法 | |
JP4824409B2 (ja) | 情報処理システム、エンタテインメントシステム、および情報処理システムの入力受け付け方法 | |
JP2010137097A (ja) | ゲーム装置および情報記憶媒体 | |
JP4689548B2 (ja) | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス | |
JP4809655B2 (ja) | 画像表示装置、画像表示装置の制御方法及びプログラム | |
JP3819911B2 (ja) | エンタテインメント装置 | |
KR200239844Y1 (ko) | 인공시각과 패턴인식을 이용한 체감형 게임 장치. | |
JP2004280856A (ja) | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス | |
WO2009144968A1 (ja) | 画像処理装置、画像処理方法及び情報記憶媒体 | |
JP3853796B2 (ja) | 情報処理装置およびエンタテインメント装置 | |
KR100694283B1 (ko) | 이미지 프로세싱을 이용한 pc 기반의 영상 인식 방법 | |
JP2024154183A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP4767331B2 (ja) | 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス | |
JP2005319193A (ja) | 画像処理システム、プログラム、情報記憶媒体および画像処理方法 | |
JP2005319192A (ja) | 画像処理システム、プログラム、情報記憶媒体および画像処理方法 | |
WO2014101219A1 (zh) | 一种动作识别方法及电视机 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005516871 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10585465 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |