WO2022190961A1 - Camera device and camera system - Google Patents
- Publication number
- WO2022190961A1 (PCT/JP2022/008572)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- unit
- information
- fingertip
- finger
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/12—Fingerprints or palmprints
- G06V40/13—Sensors therefor
- G06V40/1318—Sensors therefor using electro-optical elements or layers, e.g. electroluminescent sensing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00097—Sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00194—Optical arrangements adapted for three-dimensional imaging
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/389—Electromyography [EMG]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
- A61B5/6826—Finger
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/24—Details of cameras or camera bodies; Accessories therefor with means for separately producing marks on the film, e.g. title, time of exposure
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/02—Stereoscopic photography by sequential recording
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/17—Image acquisition using hand-held instruments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
- G06V40/117—Biometrics derived from hands
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/51—Housings
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/0005—Display arrangement combining images e.g. side-by-side, superimposed or tiled
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/005—Flexible endoscopes
- A61B1/009—Flexible endoscopes with bending or curvature detection of the insertion part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the present disclosure relates to a user interface for controlling a compact camera, and more particularly to technology for controlling a camera attached to a fingertip using palm biometric information.
- a camera-equipped terminal is known that includes a stereo camera unit having a pair of left and right lenses and image pickup devices, a stereo viewfinder/viewer unit having a pair of left and right eyepieces and display devices, and a mobile information terminal unit wirelessly connected to a public communication line (see, for example, Patent Document 1).
- with this type of camera-equipped terminal, it is possible to complete the shooting and viewing of stereo images with a single unit.
- the present disclosure has been made in view of the above, and aims to provide a camera device and a camera system that can easily detect the direction of a camera attached to a finger.
- the camera device includes a camera unit worn on a finger, a biometric information acquisition unit that acquires biometric information of the palm, a camera direction detection unit that detects, based on the acquired biometric information, the relative direction of the camera unit with respect to a predetermined reference direction, and an output unit that outputs the captured image information together with the detected direction information.
- the camera system includes the above-described camera device and a stereoscopic image generation processing unit that generates stereoscopic image information representing a three-dimensional structure, based on a plurality of captured images captured by the camera unit and direction information indicating the relative direction of the camera unit corresponding to each captured image.
- the orientation of the camera unit relative to the predetermined reference direction is detected based on the acquired biological information, so the orientation of the camera attached to the finger can be easily detected.
- FIG. 1 is a schematic diagram showing an example of a fingertip camera system including a fingertip camera device according to the first embodiment.
- FIG. 2 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on the hand.
- FIG. 3 is a flow chart showing the operation procedure of the fingertip camera system.
- FIG. 4 is a schematic diagram showing an example of a fingertip camera system including a fingertip camera device according to the second embodiment.
- FIG. 5 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on the hand.
- FIG. 6 is a flow chart showing the operation procedure of the fingertip camera system.
- FIG. 7 is a schematic diagram showing an example of a fingertip camera system having a fingertip camera device according to the third embodiment.
- FIG. 8 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on the hand.
- FIG. 9 is a flow chart showing the operation procedure of the fingertip camera system.
- FIG. 10 is a diagram showing a schematic configuration of a camera section according to a modification.
- FIG. 11 is a diagram showing a schematic configuration of a camera section according to a modification.
- FIG. 12 is a diagram showing a schematic configuration of a camera section according to a modification.
- FIG. 1 is a schematic diagram showing an example of a fingertip camera system including a fingertip camera device according to the first embodiment.
- FIG. 2 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on the hand.
- the fingertip camera system (camera system) 10 processes an image captured by the camera unit 21 attached to the fingertip of the user's hand and displays the image on the predetermined display unit 34 .
- the fingertip camera system 10 includes a fingertip camera device (camera device) 20 and an image processing unit 30 .
- the fingertip camera device 20 and the image processing unit 30 are connected via, for example, a LAN (Local Area Network).
- the LAN relays between the fingertip camera device 20 and the image processing unit 30, and uses, for example, a wireless LAN such as WI-FI (registered trademark) or a wireless communication path such as LTE or 5G.
- the fingertip camera device 20 includes a camera section 21 , a myoelectric sensor (biological information detection section) 22 and a control unit 23 .
- the camera unit 21 includes a camera body 210 attached to the fingertip of a user's hand (for example, the index finger (second finger)) and a lens 211 provided on the camera body 210 .
- the camera body 210 is formed in a bottomed cylindrical shape, like a finger cot, and is worn on the fingertip by inserting the fingertip from the opening side. An image capturing unit (not shown) is built into the camera body 210.
- the image captured by the image capturing unit is, for example, a moving image, but may be a still image.
- the captured image information is transmitted to the control unit 23 at a predetermined frequency.
- the lens 211 has a function of condensing light and is provided at the tip (bottomed portion) of the camera body 210. That is, in the example of FIG. 2, the imaging direction of the camera unit 21 extends in the direction of the fingertip of the finger on which it is worn.
- the myoelectric sensor 22 is formed in a ring shape and attached to the base of the finger to which the camera section 21 is attached.
- the myoelectric sensor 22 has a plurality of surface electrodes arranged on the inner peripheral surface of its ring-shaped main body, and these surface electrodes detect myoelectric information (biological information) generated according to the movement of the muscles of the user's finger.
- myoelectric information consists of bioelectric signals generated by the contraction of muscles (muscle fibers) when the finger is moved.
- the inner peripheral surface refers to the side that comes into contact with the outer peripheral surface of the user's fingers.
- the myoelectric sensor 22 detects myoelectric information generated by the extension of the left and right tendons of the finger and of the muscles at its base (for example, the lateral head of the adductor pollicis). The myoelectric sensor 22 is thereby configured to detect the movements of a plurality of muscles and tendons through the myoelectric information. The detected myoelectric information is transmitted to the control unit 23 as needed, at a predetermined frequency.
- in the present embodiment, the myoelectric sensor 22 is attached to the base of the finger to which the camera unit 21 is attached, but it may instead be worn elsewhere, for example collectively on the wrist. When a myoelectric sensor is attached to the wrist, it can be provided in the control unit 23.
- the control unit 23 is connected to the camera section 21 and the myoelectric sensor 22, respectively, and controls the operations of the camera section 21 and the myoelectric sensor 22.
- the control unit 23 is, for example, wrapped around the user's wrist and worn on the wrist, as shown in FIG.
- the control unit 23 is wired to the camera section 21 and the myoelectric sensor 22, supplies power to the camera section 21 and the myoelectric sensor 22, and receives various information from the camera section 21 and the myoelectric sensor 22. get.
- in the present embodiment a wired connection is used, but the configuration is not limited to this, and a wireless connection may be used. In that case, the camera section 21 and the myoelectric sensor 22 are each provided with their own power supply (battery).
- the control unit 23 includes an imaging information acquisition section 40, a myoelectric information acquisition section (biological information acquisition section) 42, a position sensor (arm direction detection section) 44, a camera direction detection section 46, a storage unit 48, a power supply unit 50, an output unit 52, an operation unit 54, and a control section 56.
- the imaging information acquisition unit 40 is an interface that acquires imaging information transmitted from the camera unit 21 .
- the myoelectric information acquisition unit 42 acquires myoelectric information transmitted from the myoelectric sensor 22 .
- the acquired imaging information and myoelectric information are stored in the storage unit 48 in association with the camera unit 21, for example.
- the position sensor 44 is a sensor that detects the position and orientation of the control unit 23, and includes, for example, a gyro sensor, a 3-axis acceleration sensor, and a geomagnetic sensor.
- the position sensor 44 can detect the position of the user's wrist (arm) to which the control unit 23 is attached and the direction in which the user's arm extends. A specific example of calculating the direction in which the arm extends will be described.
- for example, two or more position sensors 44 are installed within the control unit 23, arranged so that they are positioned on a line along the direction in which the wrist extends when the user wears the control unit 23.
- the number of position sensors 44 may be one, and the direction in which the arm extends can be calculated from the change in the position detected by the position sensor 44 .
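The arm-direction calculation described above can be sketched as follows; the use of raw 3-D sensor coordinates and the function names are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def arm_direction(p_proximal, p_distal):
    """Unit vector along the wrist, from two position sensors
    mounted on a line in the direction the wrist extends."""
    d = np.asarray(p_distal, dtype=float) - np.asarray(p_proximal, dtype=float)
    return d / np.linalg.norm(d)

def arm_direction_single(p_prev, p_curr):
    """With a single position sensor, the same vector can be
    estimated from the change in the detected position over time."""
    return arm_direction(p_prev, p_curr)

# Two sensors 3 cm apart along the forearm:
ds = arm_direction([0.0, 0.0, 0.0], [0.03, 0.0, 0.0])
print(ds)  # -> [1. 0. 0.]
```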
- the camera direction detection unit 46 detects the direction (imaging direction) of the camera unit 21 relative to a predetermined reference direction based on the acquired myoelectric information.
- the reference direction is the direction in which the user's arm extends detected by the position sensor 44, and the direction of the camera section 21 relative to the direction in which the arm extends is detected.
- the bending direction of the finger, i.e., the direction of the camera unit 21 with respect to the palm, is calculated from the myoelectric information as, for example, a three-dimensional direction vector.
- for example, data associating the myoelectric information of the muscles at the base of the finger (for example, the lateral head of the adductor pollicis, the oblique head of the adductor pollicis, the abductor pollicis brevis, and the flexor pollicis brevis) with the direction information (direction vectors) of the finger movements at that time is read from the storage unit 48. By comparing this data with the acquired myoelectric information, the direction vector of the finger movement, and hence the direction of the camera unit 21 with respect to the palm, can be calculated.
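A minimal sketch of this comparison, assuming the stored data is a table of (myoelectric feature vector, direction vector) pairs and the comparison is a nearest-neighbour match; the four channels and all values are hypothetical stand-ins for the muscles named above.

```python
import numpy as np

# Hypothetical stored data: rows of (EMG feature vector, finger direction vector).
stored_emg = np.array([[0.9, 0.1, 0.2, 0.1],   # finger extended
                       [0.2, 0.8, 0.7, 0.3]])  # finger bent toward palm
stored_dirs = np.array([[0.0, 0.0, 1.0],
                        [0.0, -0.7, 0.7]])

def camera_direction_from_emg(emg):
    """Compare acquired myoelectric information with the stored data
    (here: nearest neighbour) and return the associated direction vector."""
    i = np.argmin(np.linalg.norm(stored_emg - np.asarray(emg, dtype=float), axis=1))
    return stored_dirs[i]

print(camera_direction_from_emg([0.85, 0.15, 0.25, 0.1]))  # -> [0. 0. 1.]
```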
- alternatively, a learning model may be generated by machine learning, using as a training data set the myoelectric information measured for the muscles at the base of the finger (for example, the lateral head of the adductor pollicis, the oblique head of the adductor pollicis, the abductor pollicis brevis, and the flexor pollicis brevis) together with the corresponding finger direction information. By inputting the acquired myoelectric information into this learning model, the direction of the camera unit 21 with respect to the palm can be calculated.
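The disclosure does not specify the model type, so as one hedged sketch, a linear least-squares regressor fitted on synthetic data standing in for the measured myoelectric information:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 4-channel EMG feature vectors (one channel per
# muscle named above) paired with 3-D finger direction vectors ("teacher data").
X = rng.random((200, 4))
true_W = rng.standard_normal((4, 3))
Y = X @ true_W  # synthetic directions standing in for measured ones

# "Learning": fit a linear map EMG -> direction by least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def predict_direction(emg):
    """Input acquired myoelectric information, output a unit direction vector."""
    d = np.asarray(emg, dtype=float) @ W
    return d / np.linalg.norm(d)

print(np.allclose(X @ W, Y))  # the model reproduces the training directions
```

A real implementation would likely replace the linear map with a nonlinear model (e.g., a small neural network) trained on measured data.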
- the direction in which the arm extends is calculated as a three-dimensional direction vector based on detection by the position sensor. Therefore, by using the direction in which the arm extends as a reference direction, it is possible to detect (calculate) the relative direction of the camera section 21 with respect to this reference direction.
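The relative-direction calculation might look like the following sketch, which expresses the camera direction in an orthonormal frame whose z-axis is the arm's extension direction; the frame construction is an illustrative assumption.

```python
import numpy as np

def relative_direction(camera_dir, reference_dir):
    """Express the camera direction vector in a coordinate frame whose
    z-axis is the direction in which the arm extends (reference direction)."""
    z = np.asarray(reference_dir, dtype=float)
    z = z / np.linalg.norm(z)
    # Any vector not parallel to z seeds the rest of the frame.
    seed = np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(seed, z); x /= np.linalg.norm(x)
    y = np.cross(z, x)
    c = np.asarray(camera_dir, dtype=float)
    c = c / np.linalg.norm(c)
    return np.array([x @ c, y @ c, z @ c])

# Camera pointing along the arm -> relative direction is the frame's z-axis.
print(relative_direction([0.0, 0.0, 2.0], [0.0, 0.0, 1.0]))  # -> [0. 0. 1.]
```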
- the storage unit 48 is configured with, for example, RAM and flash memory, and stores the acquired imaging information and myoelectric information, as well as the direction (imaging direction) of the camera unit 21 relative to the predetermined reference direction. The learning model described above is also stored in the storage unit 48. This learning model is generated, for example, by machine learning using, as a training data set, the myoelectric information generated when a finger is moved together with the direction information (direction vector) of the finger movement at that time.
- the power supply unit 50 is a power supply for driving the fingertip camera device 20.
- the power supply unit 50 is, for example, a rechargeable battery, and supplies power to the camera unit 21 and the myoelectric sensor 22 .
- the output section 52 outputs information from the control unit 23 to the image processing unit 30 .
- the output unit 52 is an interface that adds relative direction information of the camera unit 21 detected by the camera direction detection unit 46 to image information captured by the camera unit 21 and outputs the information. In this case, the direction information and the imaging information may be associated (linked) and output.
- the operation unit 54 operates the fingertip camera device 20 .
- the operation section 54 includes a switch formed on the surface of the control unit 23 worn on the wrist, and by operating this switch, the imaging operation of the camera section 21 is started and stopped.
- the control section 56 controls the operation of each component of the control unit 23 .
- the myoelectric information acquisition unit 42, the camera direction detection unit 46, and the control unit 56 are each configured by a CPU, a ROM, a RAM, and the like.
- the image processing unit 30 includes a communication section 31, a storage section 32, a stereoscopic image generation processing section 33, a display section 34, and a control section 35, as shown in FIG.
- the image processing unit 30 is specifically an information processing device such as a computer device or a smart phone.
- the stereoscopic image generation processing section 33 and the control section 35 of the image processing unit 30 are configured by, for example, a CPU, a ROM, and a RAM. Specific examples of the communication unit 31, storage unit 32, and display unit 34 will be described later.
- the communication unit 31 is an interface that receives relative direction information and imaging information of the camera unit 21 output from the control unit 23 of the fingertip camera device 20 .
- the storage unit 32 stores various types of received information and control programs.
- the storage unit 32 may be, for example, a semiconductor memory device such as a flash memory, but may be a storage device such as an HDD.
- the stereoscopic image generation processing unit 33 generates stereoscopic image information representing a three-dimensional structure from a plurality of captured images.
- the stereoscopic image generation processing unit 33 can generate stereoscopic image information using a so-called photogrammetry technique.
- that is, the stereoscopic image generation processing unit 33 extracts the imaging target as feature points from the plurality of pieces of imaging information captured by the fingertip camera device 20, and generates stereoscopic image information by associating the feature points extracted from the plurality of pieces of imaging information with one another, based on the camera direction information corresponding to each piece of imaging information.
- the generated stereoscopic image information is stored in the storage unit 32 .
- the stereoscopic image may be, for example, a three-dimensional model composed of a three-dimensional point group in a predetermined coordinate system.
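The core geometric step of such photogrammetry, triangulating a matched feature point from two views whose poses are derived from the per-image direction information, can be sketched as follows; the projection matrices and the 3-D point are synthetic examples, not data from the disclosure.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched feature point from
    two views with known 3x4 projection matrices P1 and P2."""
    A = np.vstack([x1[0] * P1[2] - P1[0],
                   x1[1] * P1[2] - P1[1],
                   x2[0] * P2[2] - P2[0],
                   x2[1] * P2[2] - P2[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null vector = homogeneous 3-D point
    return X[:3] / X[3]

# Two camera poses: both look down +z, the second shifted 0.1 sideways
# (a baseline produced by moving the fingertip between shots).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

X_true = np.array([0.2, 0.1, 2.0])                     # point on the object
x1 = X_true[:2] / X_true[2]                            # feature point, image 1
x2 = (X_true - np.array([0.1, 0.0, 0.0]))[:2] / X_true[2]  # feature point, image 2
print(np.allclose(triangulate(P1, P2, x1, x2), X_true))  # -> True
```

Repeating this over many matched feature points yields the three-dimensional point group that constitutes the stereoscopic model.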
- the display unit 34 is, for example, a monitor unit configured with a liquid crystal display (LCD), etc., and displays the generated stereoscopic image information.
- in the present embodiment, the display section 34 is provided integrally with the image processing unit 30, but the display section 34 may be provided separately, for example as a glasses-type display such as a head-mounted display.
- the control section 35 controls the operation of the image processing unit 30 as a whole.
- FIG. 3 is a flow chart showing the operation procedure of the fingertip camera system.
- the fingertip camera device 20 described above is attached to the user's hand.
- a camera unit 21 is attached to the tip of the index finger
- a myoelectric sensor 22 is attached to the base of the index finger.
- the control unit 23 is worn on the wrist.
- in the basic posture, the index finger is extended straight in the direction of the arm.
- first, by operating the operation section 54 of the control unit 23, an image is captured by the camera section 21 attached to the fingertip (step S1).
- the captured imaging information is sent from the camera section 21 to the control unit 23 and acquired by the imaging information acquisition section 40 .
- the myoelectric sensor 22 detects myoelectric information that accompanies the movement of the finger during imaging and sends it to the control unit 23, and the myoelectric information acquisition unit 42 acquires the myoelectric information (step S2).
- These imaging information and myoelectric information are stored in the storage unit 48 in association with each other.
- the camera direction detection unit 46 first calculates the bending direction of the fingers, that is, the angle (direction) of the camera unit 21 with respect to the palm, based on the myoelectric information (step S3).
- specifically, a learning model is generated in advance by machine learning, using as a training data set the myoelectric information measured for the muscles at the base of the finger (for example, the lateral head of the adductor pollicis, the oblique head of the adductor pollicis, the abductor pollicis brevis, and the flexor pollicis brevis) together with the direction information (direction vectors) of the finger movements. The camera direction detection unit 46 inputs the detected myoelectric information into this learning model to calculate the angle (direction) of the camera unit 21 with respect to the palm.
- the calculated angle information of the camera unit 21 is, for example, a three-dimensional direction vector.
- the camera direction detection unit 46 detects the relative direction (imaging direction) of the camera unit 21 with the orientation of the arm as the reference direction (step S4).
- the camera direction detection unit 46 acquires the extension direction DS of the user's arm detected by the position sensor 44 .
- the direction DS in which the arm extends is calculated as a three-dimensional direction vector.
- the camera direction detection section 46 detects (calculates) a relative direction D1 of the camera section 21 with respect to this reference direction.
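The relative direction D1 can be sketched as re-expressing the camera's direction vector in an arm-centric frame in which the arm's extension direction DS becomes the +z axis. This is a minimal Rodrigues-rotation sketch; the frame convention (arm mapped to +z) is an assumption, since the text only says the direction is relative to the arm.

```python
import numpy as np

def rotation_to_axis(ds: np.ndarray, axis=np.array([0.0, 0.0, 1.0])) -> np.ndarray:
    """Rotation matrix taking unit vector ds onto the given axis (Rodrigues formula).
    Assumes ds is not antiparallel to the axis."""
    ds = ds / np.linalg.norm(ds)
    v = np.cross(ds, axis)
    c = float(ds @ axis)
    if np.isclose(c, 1.0):
        return np.eye(3)
    k = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + k + k @ k / (1.0 + c)

def relative_direction(camera_dir: np.ndarray, arm_dir: np.ndarray) -> np.ndarray:
    """Camera direction D1 expressed in the arm-centric frame (arm axis = +z)."""
    r = rotation_to_axis(arm_dir)
    return r @ (camera_dir / np.linalg.norm(camera_dir))

# A finger pointing exactly along the arm yields the frame's +z direction.
d1 = relative_direction(np.array([1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]))
assert np.allclose(d1, [0, 0, 1])
```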
- the output unit 52 adds the relative direction information of the camera unit 21 obtained from the camera direction detection unit 46 to the imaging information captured by the camera unit 21, and outputs the information to the image processing unit 30 (step S5).
- The direction information can be multiplexed with the imaging information, for example, as data of a moving image compression format such as MPEG.
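As one way to picture carrying per-frame direction information alongside encoded frame data, here is a minimal byte-prefix scheme. This is purely illustrative: a real implementation would use the metadata or private-data streams of the container format (e.g. MPEG) rather than this hand-rolled framing.

```python
import struct

def pack_frame(frame_bytes: bytes, direction: tuple) -> bytes:
    """Prefix an encoded frame with its 3-D direction vector (3 little-endian
    float32 values, 12 bytes). A hypothetical stand-in for proper multiplexing."""
    return struct.pack("<3f", *direction) + frame_bytes

def unpack_frame(packet: bytes):
    """Split a packet back into (direction, frame_bytes)."""
    direction = struct.unpack("<3f", packet[:12])
    return direction, packet[12:]

pkt = pack_frame(b"\x00\x01frame", (0.0, 0.5, 1.0))
d, payload = unpack_frame(pkt)
assert payload == b"\x00\x01frame" and abs(d[2] - 1.0) < 1e-6
```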
- the stereoscopic image generation processing unit 33 of the image processing unit 30 generates stereoscopic image information representing a three-dimensional structure, based on a plurality of captured images captured by the camera unit 21 and direction information indicating the relative direction of the camera unit corresponding to each captured image (step S6).
- the stereoscopic image generation processing unit 33 can generate stereoscopic image information using a so-called photogrammetry technique. That is, the stereoscopic image generation processing unit 33 extracts the imaging target as feature points from a plurality of pieces of imaging information captured by the fingertip camera device 20 at predetermined time intervals, and generates stereoscopic image information by associating the feature points extracted from the plural pieces of imaging information, based on the direction information of the camera corresponding to each piece of imaging information.
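One core step of such a photogrammetric reconstruction — locating a matched feature point seen from two views — can be sketched as a ray-intersection problem. Note the camera positions p1 and p2 below are hypothetical inputs added for the sketch; the text only specifies per-image direction information, and a full photogrammetry pipeline would recover positions as well.

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Midpoint of the shortest segment between two viewing rays.

    p1, p2: hypothetical camera positions; d1, d2: viewing directions toward
    the same matched feature point (the per-frame direction information)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b            # near zero only for parallel rays
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0

# Two rays that both pass through the point (0, 0, 5):
x = triangulate(np.array([-1.0, 0, 0]), np.array([1.0, 0, 5]),
                np.array([1.0, 0, 0]), np.array([-1.0, 0, 5]))
assert np.allclose(x, [0, 0, 5])
```

Repeating this over many matched feature points yields the point cloud from which the three-dimensional structure is expressed.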
- control section 35 of the image processing unit 30 displays the generated stereoscopic image information on the display section 34 (step S7) and ends the process.
- Since the camera unit 21 is attached to the fingertip, an image of a distant scene or a nearby object can be captured easily, simply by pointing the finger. Furthermore, since the myoelectric sensor 22 simultaneously acquires information about the direction of the finger on which the camera unit 21 is attached, the imaging direction of the camera unit 21 attached to the fingertip can be easily detected.
- FIG. 4 is a schematic diagram showing an example of a fingertip camera system including a fingertip camera device according to the second embodiment.
- FIG. 5 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on the hand.
- In the first embodiment, the fingertip camera device 20 includes one camera unit 21 and one myoelectric sensor 22, whereas the second embodiment differs in that it includes a plurality of camera units 21 and myoelectric sensors 22.
- The same reference numerals are assigned to the same configurations as in the above-described embodiment, and their description is omitted.
- the fingertip camera system 10A includes a fingertip camera device 20A and an image processing unit 30.
- the fingertip camera device 20A includes two camera units 21A and 21B, two myoelectric sensors (biological information detection units) 22A and 22B, and a control unit 23A.
- one camera unit 21A is attached to a fingertip (for example, the index finger (second finger)) of the user's hand,
- and the other camera unit 21B is attached to the adjacent fingertip (for example, the middle finger (third finger)).
- the two camera units 21A and 21B, attached to adjacent fingertips, work together to function as a stereo camera.
- Specifically, a stereo image can be captured from the images captured by the camera units 21A and 21B. The rest of the configuration is the same as in the above embodiment.
- the configuration of the control unit 23A differs from that of the control unit 23 of the above embodiment in that it includes an inter-adjacent-camera angle calculator 45 .
- the inter-adjacent-camera angle calculation unit 45 calculates the angle between the adjacent camera units 21A and 21B (mainly the angle in the horizontal direction with respect to the palm, that is, the angle at which the fingers are spread or narrowed) based on the myoelectric information of each finger detected by the two myoelectric sensors 22A and 22B.
- the directions (angles) of the camera units 21A and 21B with respect to the palm can be calculated by inputting the detected myoelectric information into a predetermined learning model.
- Therefore, the angle between the adjacent camera units 21A and 21B can be calculated from their respective directions (angles). According to this configuration, the parallax between the camera units 21A and 21B functioning as a stereo camera can be calculated, so stereo images can be captured with high accuracy.
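The angle between adjacent camera units follows directly from their two direction vectors. A minimal sketch (the direction vectors here are hypothetical examples):

```python
import numpy as np

def inter_camera_angle(dir_a: np.ndarray, dir_b: np.ndarray) -> float:
    """Angle in degrees between two adjacent camera direction vectors,
    e.g. the index-finger and middle-finger camera units."""
    a = dir_a / np.linalg.norm(dir_a)
    b = dir_b / np.linalg.norm(dir_b)
    # clip guards against arccos domain errors from floating-point rounding
    return float(np.degrees(np.arccos(np.clip(a @ b, -1.0, 1.0))))

angle = inter_camera_angle(np.array([1.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0]))
assert abs(angle - 45.0) < 1e-9
```

Together with the (known) distance between the two fingertips, such an angle determines the stereo baseline geometry and hence the parallax.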
- FIG. 6 is a flow chart showing the operation procedure of the fingertip camera system.
- the camera unit 21A is attached to the tip of the index finger, and the myoelectric sensor 22A is attached to the base of the index finger.
- the camera section 21B is attached to the fingertip of the middle finger, and the myoelectric sensor 22B is attached to the base of the middle finger.
- the control unit 23A is worn on the wrist.
- a stereo image is captured by the two camera sections 21A and 21B attached to the two fingertips (step S11).
- the captured image information is sent from the camera units 21A and 21B to the control unit 23A, and the image information acquisition unit 40 acquires it.
- the myoelectric sensors 22A and 22B detect myoelectric information associated with the movement of each finger during imaging and send it to the control unit 23A, and the myoelectric information acquisition unit 42 acquires the myoelectric information of each finger (step S12). These imaging information and myoelectric information are stored in the storage unit 48 in association with each other.
- the inter-adjacent-camera angle calculation unit 45 calculates the angle between the adjacent camera units 21A and 21B based on the myoelectric information of each finger detected by the two myoelectric sensors 22A and 22B (step S13).
- the camera direction detection unit 46 detects relative directions (imaging directions) of the respective camera units 21A and 21B with the direction of the arm as a reference direction (step S14).
- the camera direction detection unit 46 acquires the extension direction DS of the user's arm detected by the position sensor 44 .
- the direction DS in which the arm extends is calculated as a three-dimensional direction vector.
- the camera direction detection unit 46 detects (calculates) a relative direction D1 of the camera unit 21A and a relative direction D2 of the camera unit 21B with respect to the reference direction.
- the output unit 52 adds the relative direction information of the camera units 21A and 21B obtained from the camera direction detection unit 46 to the imaging information of the stereo images captured by the camera units 21A and 21B, respectively, and outputs the information to the image processing unit 30 (step S15).
- The direction information can be multiplexed with the imaging information, for example, as data of a moving image compression format such as MPEG.
- the stereoscopic image generation processing unit 33 of the image processing unit 30 generates stereoscopic image information representing a three-dimensional structure, based on the plurality of stereo images captured by the camera units 21A and 21B and direction information indicating the relative direction of each of the camera units 21A and 21B corresponding to the captured images (step S16).
- the stereoscopic image generation processing unit 33 can generate stereoscopic image information using a so-called photogrammetry technique. That is, the stereoscopic image generation processing unit 33 extracts the imaging target as feature points from the imaging information of the plurality of stereo images captured by the fingertip camera device 20A at predetermined time intervals, and generates stereoscopic image information by associating the feature points extracted from the plural pieces of imaging information, based on the direction information of the cameras corresponding to each piece of imaging information.
- control section 35 of the image processing unit 30 displays the generated stereoscopic image information on the display section 34 (step S17) and ends the process.
- Since the camera unit 21 is attached to the fingertip, an image of a distant scene or a nearby object can be captured easily, simply by pointing the finger. Furthermore, since the myoelectric sensor 22 simultaneously acquires information about the direction of the finger on which the camera unit 21 is attached, the imaging direction of the camera unit 21 attached to the fingertip can be easily detected. Further, in this embodiment, since the two camera units 21A and 21B are provided, stereo images (parallax images) can be obtained, and stereoscopic images using parallax can be generated. In addition, stereoscopic space information can be reconstructed using imaging information from the viewpoints of the two camera units 21A and 21B.
- FIG. 7 is a schematic diagram showing an example of a fingertip camera system having a fingertip camera device according to the third embodiment.
- FIG. 8 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on the hand.
- In the second embodiment, the fingertip camera device 20A has two camera units 21A and 21B and two myoelectric sensors 22A and 22B, whereas the third embodiment differs in that it has five camera units 21A to 21E and five myoelectric sensors 22A to 22E.
- the same reference numerals are assigned to the same configurations as those of the above-described embodiment, and the description thereof is omitted.
- the fingertip camera system 10B includes a fingertip camera device 20B and an image processing unit 30.
- the fingertip camera device 20B includes five camera units 21A to 21E, five myoelectric sensors (biological information detection units) 22A to 22E, and a control unit 23B.
- the first camera unit 21A is attached to a fingertip of the user's hand (for example, the index finger (second finger)), and the second camera unit 21B is attached to the adjacent fingertip (for example, the middle finger (third finger)).
- the third camera unit 21C is attached to the next fingertip (for example, the ring finger (fourth finger)), and the fourth camera unit 21D is attached to the next fingertip (for example, the little finger (fifth finger)).
- the fifth camera unit 21E is attached to the remaining fingertip (for example, the thumb (first finger)).
- the reference numerals of the camera units and the fingers on which they are attached are assigned for convenience of explanation, and can be changed as appropriate.
- the configuration of the control unit 23B differs from that of the control unit 23 of the above embodiment in that it includes an inter-adjacent-camera angle calculator 45 .
- the inter-adjacent-camera angle calculation unit 45 calculates the angle between each pair of adjacent camera units (mainly the angle in the horizontal direction with respect to the palm, that is, the angle at which the fingers are spread or narrowed) based on the myoelectric information of each finger detected by the five myoelectric sensors 22A to 22E.
- the directions (angles) of the camera units 21A to 21E with respect to the palm can be calculated by inputting the detected myoelectric information into a predetermined learning model. Therefore, the angle between two adjacent camera units can be calculated from the direction (angle) of each of these camera units 21A to 21E.
- FIG. 9 is a flow chart showing the operation procedure of the fingertip camera system.
- a camera unit 21A is attached to the tip of the index finger, and a myoelectric sensor 22A is attached to the base of the index finger.
- the camera section 21B is attached to the fingertip of the middle finger, and the myoelectric sensor 22B is attached to the base of the middle finger.
- the camera section 21C is attached to the fingertip of the ring finger, and the myoelectric sensor 22C is attached to the base of the ring finger.
- a camera unit 21D is attached to the tip of the little finger, and a myoelectric sensor 22D is attached to the base of the little finger.
- a camera section 21E is attached to the tip of the thumb, and a myoelectric sensor 22E is attached to the base of the thumb.
- the control unit 23B is worn on the wrist.
- images are captured by the five camera sections 21A to 21E attached to the five fingertips (step S21).
- the imaging information obtained by imaging is sent from each of the camera units 21A to 21E to the control unit 23B, and the imaging information acquisition unit 40 acquires the information.
- the myoelectric sensors 22A to 22E respectively detect myoelectric information associated with the movement of each finger during imaging and send it to the control unit 23B, and the myoelectric information acquisition unit 42 acquires the myoelectric information of each finger (step S22).
- These imaging information and myoelectric information are stored in the storage unit 48 in association with each other.
- the inter-adjacent camera angle calculation unit 45 calculates the angles of the two adjacent camera units based on the electromyographic information of each finger detected by the five electromyographic sensors 22A to 22E (step S23).
- the camera direction detection unit 46 detects the relative direction (imaging direction) of each of the camera units 21A to 21E with the orientation of the arm as the reference direction (step S24).
- the camera direction detection unit 46 acquires the extension direction DS of the user's arm detected by the position sensor 44 .
- the direction DS in which the arm extends is calculated as a three-dimensional direction vector.
- the camera direction detection unit 46 detects (calculates), with respect to the reference direction, a relative direction D1 of the camera unit 21A, a relative direction D2 of the camera unit 21B, a relative direction D3 of the camera unit 21C, a relative direction D4 of the camera unit 21D, and a relative direction D5 of the camera unit 21E.
- the output unit 52 adds the relative direction information of the camera units 21A to 21E obtained from the camera direction detection unit 46 to the image information captured by the camera units 21A to 21E, respectively, and outputs the information to the image processing unit 30 (step S25).
- The direction information can be multiplexed with the imaging information, for example, as data of a moving image compression format such as MPEG.
- the stereoscopic image generation processing unit 33 of the image processing unit 30 generates stereoscopic image information representing a three-dimensional structure, based on the plurality of captured images captured by the camera units 21A to 21E and direction information indicating the relative directions of the camera units 21A to 21E corresponding to the captured images (step S26).
- the stereoscopic image generation processing unit 33 can generate stereoscopic image information using a so-called photogrammetry technique. That is, the stereoscopic image generation processing unit 33 extracts the imaging target as feature points from a plurality of pieces of imaging information captured by the camera units 21A to 21E of the fingertip camera device 20B, and generates stereoscopic image information by associating the feature points extracted from the plural pieces of imaging information, based on the direction information of the cameras corresponding to each piece of imaging information.
- control section 35 of the image processing unit 30 displays the generated stereoscopic image information on the display section 34 (step S27) and ends the process.
- Since the camera unit 21 is attached to the fingertip, an image of a distant scene or a nearby object can be captured easily, simply by pointing the finger. Furthermore, since the myoelectric sensor 22 simultaneously acquires information about the direction of the finger on which the camera unit 21 is attached, the imaging direction of the camera unit 21 attached to the fingertip can be easily detected. Further, in this embodiment, since the five camera units 21A to 21E are provided, the five camera units can capture images with parallax in the space at the extension of each finger around the palm, for example as if gripping a ball.
- the camera unit 121 includes a camera body 122 placed on the fingertip, a lens 123 provided on the camera body 122, and a ring 124 for attaching the camera body 122 to the fingertip.
- this configuration leaves the pad side of the finger open, so there is no hindrance to detailed work with the fingertips or fingerprint authentication.
- the camera unit 221 includes a camera body 222 placed on the fingertip and a lens 223 provided on this camera body 222 .
- the camera body 222 is attached to a fingernail with an adhesive or a suction cup, for example. In this configuration, since the ring is not used, the pad side of the finger is more open, so that fine work and fingerprint authentication can be easily performed with the fingertip.
- the camera unit 321 is of the finger-sack type as in the above-described embodiment, and includes a bottomed cylindrical camera body 322 into which the fingertip is inserted, and a lens 323 provided on the camera body 322. This lens 323 is arranged, for example, on the pad side of the finger.
- each camera unit 321 is arranged on an arc formed by the fingertips, so that the object to be imaged can be captured while enclosed through 360 degrees, and a stereoscopic image can be easily formed.
- As described above, the fingertip camera device 20 includes the camera unit 21 attached to a fingertip of the hand, the myoelectric information acquisition unit 42 that acquires myoelectric information of the palm, the camera direction detection unit 46 that detects, based on the acquired myoelectric information, the relative direction of the camera unit 21 with respect to a predetermined reference direction, and the output unit 52 that adds this direction information to the imaging information captured by the camera unit 21 and outputs it; therefore, the direction of the camera unit 21 attached to the fingertip can be easily detected.
- the myoelectric information acquisition unit 42 acquires myoelectric information associated with the movement of the fingers to which the camera unit 21 is attached, it is possible to easily detect the direction of the camera unit 21 associated with the movement of the fingers.
- the camera units 21 are attached to each of a plurality of fingertips, it is possible to generate, for example, a stereoscopic image using images captured by the plurality of camera units 21 .
- a position sensor 44 attached to the wrist to detect the orientation of the arm is provided, and a camera direction detection section 46 detects the direction of the camera section 21 relative to the reference direction, using the detected orientation of the arm as a reference direction.
- the direction of the camera unit can be easily detected by following the direction of the arm moved by the user.
- Since the myoelectric sensor 22 for detecting myoelectric information is attached to the base of the finger on which the camera unit 21 is worn, or to the wrist, the myoelectric information associated with finger movements can be accurately detected, and the direction of the camera unit 21 can be accurately detected according to the movement of the fingers.
- The fingertip camera system 10 includes the fingertip camera device 20 and the stereoscopic image generation processing unit 33, which generates stereoscopic image information expressing a three-dimensional structure based on a plurality of captured images captured by the camera unit 21 and direction information indicating the relative direction of the camera unit 21 corresponding to the captured images; therefore, a stereoscopic image can be easily generated.
- the embodiment is not limited by the constituent elements of these embodiments.
- the components described above include those that can be easily assumed by those skilled in the art, those that are substantially the same, and those within the so-called equivalent range.
- the components described above can be combined as appropriate.
- various omissions, replacements, or modifications of components can be made without departing from the gist of the above-described embodiments.
- the myoelectric sensor 22 is provided at the base of the finger, but it may be provided integrally with or separately from the control unit 23 worn on the wrist, for example.
- the camera device and camera system of the present embodiment can be used, for example, as a fingertip camera device that is worn on a fingertip and shoots.
Abstract
Description
FIG. 1 is a schematic diagram showing an example of a fingertip camera system including the fingertip camera device according to the first embodiment. FIG. 2 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on a hand. In this embodiment, a fingertip camera system (camera system) 10 processes an image captured by a camera unit 21 worn on a fingertip of a user's hand and displays it on a predetermined display unit 34.
FIG. 4 is a schematic diagram showing an example of a fingertip camera system including the fingertip camera device according to the second embodiment. FIG. 5 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on a hand. In the first embodiment, the fingertip camera device 20 includes one camera unit 21 and one myoelectric sensor 22, whereas the second embodiment differs in that it includes a plurality of camera units 21 and myoelectric sensors 22. The same reference numerals are assigned to the same configurations as in the above embodiment, and their description is omitted.
FIG. 7 is a schematic diagram showing an example of a fingertip camera system including the fingertip camera device according to the third embodiment. FIG. 8 is a schematic diagram showing an example of a state in which the fingertip camera device is worn on a hand. In the second embodiment, the fingertip camera device 20A includes two camera units 21A and 21B and two myoelectric sensors 22A and 22B, whereas the third embodiment differs in that it includes five camera units 21A to 21E and myoelectric sensors 22A to 22E. The same reference numerals are assigned to the same configurations as in the above embodiment, and their description is omitted.
20, 20A, 20B Fingertip camera device (camera device)
21, 21A, 21B, 21C, 21D, 21E Camera unit
22, 22A, 22B, 22C, 22D, 22E Myoelectric sensor (biological information detection unit)
23, 23A, 23B Control unit
30 Image processing unit
33 Stereoscopic image generation processing unit
34 Display unit
40 Imaging information acquisition unit
42 Myoelectric information acquisition unit (biological information acquisition unit)
44 Position sensor (arm direction detection unit)
45 Inter-adjacent-camera angle calculation unit
46 Camera direction detection unit
52 Output unit
Claims (5)
- A camera device comprising:
a camera unit worn on a finger;
a biological information acquisition unit that acquires biological information of the palm;
a camera direction detection unit that detects, based on the acquired biological information, a relative direction of the camera unit with respect to a predetermined reference direction; and
an output unit that adds direction information of the camera unit obtained from the camera direction detection unit to imaging information captured by the camera unit and outputs it. - The camera device according to claim 1, wherein the biological information is myoelectric information associated with movement of the finger on which the camera unit is worn.
- The camera device according to claim 1 or 2, further comprising an arm direction detection unit worn on the wrist to detect the orientation of the arm,
wherein the camera direction detection unit detects the relative direction of the camera unit with respect to the reference direction, using the detected orientation of the arm as the reference direction. - The camera device according to any one of claims 1 to 3, wherein a biological information detection unit that detects the biological information is worn at the base of the finger on which the camera unit is worn, or on the wrist.
- A camera system comprising:
the camera device according to any one of claims 1 to 4; and
a stereoscopic image generation processing unit that generates stereoscopic image information expressing a three-dimensional structure based on a plurality of captured images captured by the camera unit and direction information indicating the relative direction of the camera unit corresponding to the captured images.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280019506.XA CN116998146A (zh) | 2021-03-08 | 2022-03-01 | 相机装置及相机系统 |
EP22766920.7A EP4290308A4 (en) | 2021-03-08 | 2022-03-01 | CAMERA DEVICE AND CAMERA SYSTEM |
US18/461,549 US20230419719A1 (en) | 2021-03-08 | 2023-09-06 | Camera device and camera system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-036476 | 2021-03-08 | ||
JP2021036476A JP7709011B2 (ja) | 2021-03-08 | 2021-03-08 | カメラ装置及びカメラシステム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/461,549 Continuation US20230419719A1 (en) | 2021-03-08 | 2023-09-06 | Camera device and camera system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190961A1 true WO2022190961A1 (ja) | 2022-09-15 |
Family
ID=83226612
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/008572 WO2022190961A1 (ja) | 2021-03-08 | 2022-03-01 | カメラ装置及びカメラシステム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230419719A1 (ja) |
EP (1) | EP4290308A4 (ja) |
JP (2) | JP7709011B2 (ja) |
CN (1) | CN116998146A (ja) |
WO (1) | WO2022190961A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20240361845A1 (en) * | 2023-04-27 | 2024-10-31 | Meta Platforms Technologies, Llc | Optical ring that enables thumb-to-index gestures |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004192167A (ja) * | 2002-12-09 | 2004-07-08 | Matsushita Electric Ind Co Ltd | 携帯端末及び情報入力装置 |
JP2005159771A (ja) | 2003-11-26 | 2005-06-16 | Sony Corp | 無線通信装置及び無線通信方法、無線通信システム、並びにコンピュータ・プログラム |
JP2006060584A (ja) * | 2004-08-20 | 2006-03-02 | Fuji Photo Film Co Ltd | デジタルカメラ |
JP2019016999A (ja) * | 2016-08-09 | 2019-01-31 | 株式会社アスタリスク | 読取システム及びカメラ |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0637036B2 (ja) * | 1984-04-13 | 1994-05-18 | 三菱重工業株式会社 | 多本指マニプレ−タ |
JP2001344053A (ja) * | 2000-06-01 | 2001-12-14 | Olympus Optical Co Ltd | 操作入力装置 |
US20130135223A1 (en) * | 2009-12-13 | 2013-05-30 | Ringbow Ltd. | Finger-worn input devices and methods of use |
WO2014144015A2 (en) * | 2013-03-15 | 2014-09-18 | Keller Eric Jeffrey | Computing interface system |
US20150035743A1 (en) * | 2013-07-31 | 2015-02-05 | Plantronics, Inc. | Wrist Worn Platform for Sensors |
US20200073483A1 (en) | 2018-08-31 | 2020-03-05 | Ctrl-Labs Corporation | Camera-guided interpretation of neuromuscular signals |
US20180096215A1 (en) | 2016-09-30 | 2018-04-05 | Thomas Alton Bartoshesky | Operator guided inspection system and method of use |
KR102036019B1 (ko) * | 2018-03-02 | 2019-10-24 | 한양대학교 산학협력단 | 수부 기능 장애 진단을 위한 분리형 데이터 글러브 |
-
2021
- 2021-03-08 JP JP2021036476A patent/JP7709011B2/ja active Active
-
2022
- 2022-03-01 EP EP22766920.7A patent/EP4290308A4/en active Pending
- 2022-03-01 WO PCT/JP2022/008572 patent/WO2022190961A1/ja active Application Filing
- 2022-03-01 CN CN202280019506.XA patent/CN116998146A/zh active Pending
- 2022-10-31 JP JP2022174297A patent/JP2023017867A/ja active Pending
-
2023
- 2023-09-06 US US18/461,549 patent/US20230419719A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004192167A (ja) * | 2002-12-09 | 2004-07-08 | Matsushita Electric Ind Co Ltd | 携帯端末及び情報入力装置 |
JP2005159771A (ja) | 2003-11-26 | 2005-06-16 | Sony Corp | 無線通信装置及び無線通信方法、無線通信システム、並びにコンピュータ・プログラム |
JP2006060584A (ja) * | 2004-08-20 | 2006-03-02 | Fuji Photo Film Co Ltd | デジタルカメラ |
JP2019016999A (ja) * | 2016-08-09 | 2019-01-31 | 株式会社アスタリスク | 読取システム及びカメラ |
Non-Patent Citations (1)
Title |
---|
See also references of EP4290308A4 |
Also Published As
Publication number | Publication date |
---|---|
JP2022136727A (ja) | 2022-09-21 |
JP2023017867A (ja) | 2023-02-07 |
EP4290308A1 (en) | 2023-12-13 |
JP7709011B2 (ja) | 2025-07-16 |
US20230419719A1 (en) | 2023-12-28 |
EP4290308A4 (en) | 2024-07-17 |
CN116998146A (zh) | 2023-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102458344B1 (ko) | 카메라의 초점을 변경하는 방법 및 장치 | |
JP2015039522A (ja) | リハビリ装置および幻肢痛治療支援装置 | |
US10198075B2 (en) | Operation apparatus | |
WO2016021252A1 (ja) | 情報処理装置及び情報処理方法、並びに画像表示システム | |
US9442571B2 (en) | Control method for generating control instruction based on motion parameter of hand and electronic device using the control method | |
US20150009103A1 (en) | Wearable Display, Computer-Readable Medium Storing Program and Method for Receiving Gesture Input | |
KR20170062439A (ko) | 제어 장치, 제어 방법 및 프로그램 | |
JP2020071587A (ja) | 表示装置、及び、表示装置の制御方法 | |
US20230141494A1 (en) | Markerless motion capture of hands with multiple pose estimation engines | |
CN106961546A (zh) | 信息处理装置和方法、摄像装置、显示装置、控制方法 | |
JP2023017867A (ja) | カメラ装置及びカメラシステム | |
KR101564967B1 (ko) | 아이 트래킹을 이용한 사용자 관심 대상 파악 시스템 및 방법 | |
KR101356015B1 (ko) | 센서를 이용한 3d 영상 보정 장치 및 이를 위한 방법 | |
JP2012146220A (ja) | ジェスチャ入力装置 | |
EP3971683A1 (en) | Human body portion tracking method and human body portion tracking system | |
JP2012194492A (ja) | ヘッドマウントディスプレイ及びヘッドマウントディスプレイのためのコンピュータプログラム | |
CN114078279B (zh) | 动作捕捉方法、装置、电子设备及存储介质 | |
WO2018076609A1 (zh) | 一种操作终端的方法和终端 | |
US11783492B2 (en) | Human body portion tracking method and human body portion tracking system | |
JP2017074398A (ja) | 情報処理装置、情報処理方法、プログラム、及び測定システム | |
CN210302240U (zh) | 一种增强现实ar腕关节康复评估和训练系统 | |
CN110520792B (zh) | 成像装置和隐形眼镜 | |
JP6149211B2 (ja) | 携帯端末装置、プログラムおよび手ぶれ補正方法 | |
KR102553830B1 (ko) | 카메라를 이용한 실시간 로봇 원격 제어 방법 및 장치 | |
JP2015163113A (ja) | 歩行支援装置、及び情報処理装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22766920 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280019506.X Country of ref document: CN Ref document number: 2022766920 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2022766920 Country of ref document: EP Effective date: 20230906 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |