CN115299036B - Guidance method - Google Patents
Guidance method
- Publication number
- CN115299036B (application CN202180022168.0A)
- Authority
- CN
- China
- Prior art keywords
- unit
- person
- information
- walking
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Studio Devices (AREA)
Abstract
A camera (400) for capturing a person's walking posture comprises a detection unit (421) for detecting the direction of the camera and a display unit (422) for displaying a guide line indicating the position of the person walking on a screen display unit. The display unit (422) displays different guide lines depending on the direction of the camera detected by the detection unit (421).
Description
Technical Field
The invention relates to a guiding method, an imaging device and a recording medium.
Background
Techniques are known for analyzing a person's walking from measured data.
Patent document 1 discloses a document describing walking analysis of a person, for example. Patent document 1 discloses a walking analysis device including a data acquisition unit that acquires two types of image data from a depth sensor, a bone information generation unit that generates bone information from the image data acquired by the data acquisition unit, a correction processing unit that corrects the bone information generated by the bone information generation unit, and an analysis processing unit that uses the corrected bone information to analyze the walking of a user.
Prior art literature
Patent literature
Patent document 1 International publication No. 2017/170832
Disclosure of Invention
Problems to be solved by the invention
The technique described in patent document 1 requires a depth sensor (a 3D sensor). However, there is a demand for walking analysis based on image data acquired by an ordinary camera, without using a 3D sensor such as a depth sensor.
To improve the accuracy of walking analysis performed on image data, it is preferable that the imaging conditions under which the image data are captured match as closely as possible. However, the image data to be analyzed are acquired at different times, which makes the imaging conditions difficult to match.
Thus, there is a problem in that it is difficult to match the imaging conditions when acquiring image data.
Accordingly, an object of the present invention is to provide a guidance method, an imaging device, and a recording medium, which solve the problem that it is difficult to match imaging conditions when acquiring image data.
Means for solving the problems
In order to achieve the above object, a guidance method according to one aspect of the present disclosure is configured as follows:
an imaging device that captures the walking posture of a person detects the direction of the imaging device,
the imaging device displays, on a screen display unit, a guide line indicating the position where the person is to walk, and
when displaying the guide line on the screen display unit, the imaging device displays a different guide line according to the detected direction of the imaging device.
An imaging device according to another aspect of the present disclosure is configured as follows:
the imaging device captures the walking posture of a person, and includes
a detection unit that detects the direction of the imaging device, and
a display unit that displays, on a screen display unit, a guide line indicating the position where the person is to walk,
wherein the display unit displays a different guide line according to the direction of the imaging device detected by the detection unit.
A recording medium according to another aspect of the present disclosure records a program that causes an imaging device capturing the walking posture of a person to realize:
a detection unit that detects the direction of the imaging device, and
a display unit that displays, on a screen display unit, a guide line indicating the position where the person is to walk,
wherein the display unit displays a different guide line according to the direction of the imaging device detected by the detection unit.
Effects of the invention
According to the above-described configurations, the imaging conditions when acquiring image data can be made uniform.
Drawings
Fig. 1 is a diagram showing a configuration example of a walking posture measurement system according to a first embodiment of the present disclosure.
Fig. 2 is a view showing an example of capturing a walking posture in the vertical direction.
Fig. 3 is a diagram illustrating an example of capturing a walking posture in the left-right direction.
Fig. 4 is a block diagram showing a configuration example of the smart phone shown in fig. 1.
Fig. 5 is a block diagram showing an example of the configuration of the imaging support unit shown in fig. 4.
Fig. 6 is a view showing an example of the imaging support display in the horizontal grip.
Fig. 7 is a view showing an example of the photographing auxiliary display at the time of vertical grip.
Fig. 8 is a diagram showing an example of the operation of the angle adjustment information output unit shown in fig. 5.
Fig. 9 is a block diagram showing a configuration example of the walking posture measuring device shown in fig. 1.
Fig. 10 is a diagram showing an example of the bone information shown in fig. 9.
Fig. 11 is a diagram showing an example of measurement result information shown in fig. 9.
Fig. 12 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 13 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 14 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 15 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 16 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 17 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 18 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 19 is a diagram for explaining the processing of the actual measurement value calculation unit.
Fig. 20 is a flowchart showing an example of the operation of the imaging support unit in the smartphone.
Fig. 21 is a flowchart showing an example of the operation of the walking posture measuring device.
Fig. 22 is a flowchart showing an example of processing for calculating actual measurement value information.
Fig. 23 is a diagram for explaining a tracking example of the walking posture measuring device in the second embodiment of the present disclosure.
Fig. 24 is a block diagram showing a configuration example of the walking posture measurement device in the second embodiment of the present disclosure.
Fig. 25 is a diagram showing a configuration example of the tracking unit shown in fig. 24.
Fig. 26 is a diagram showing an example of the graphics generated by the package graphics generating unit shown in fig. 25.
Fig. 27 is a diagram showing an example of the graphics generated by the package graphics generating unit shown in fig. 25.
Fig. 28 is a flowchart showing an example of the operation of the tracking unit.
Fig. 29 is a diagram showing an example of a hardware configuration of an imaging device according to a third embodiment of the present disclosure.
Fig. 30 is a block diagram showing a configuration example of an imaging device.
Fig. 31 is a block diagram showing a configuration example of an information processing apparatus according to a fourth embodiment of the present disclosure.
Fig. 32 is a block diagram showing a configuration example of an information processing apparatus in a fifth embodiment of the present disclosure.
Fig. 33 is a block diagram showing a configuration example of a tracking device according to a sixth embodiment of the present disclosure.
Detailed Description
First embodiment
With respect to the first embodiment of the present disclosure, description will be made with reference to fig. 1 to 22. Fig. 1 is a diagram showing a configuration example of a walking posture measurement system 100. Fig. 2 is a view showing an example of capturing a walking posture in the vertical direction. Fig. 3 is a diagram illustrating an example of capturing a walking posture in the left-right direction. Fig. 4 is a block diagram showing an example of the structure of the smartphone 200. Fig. 5 is a block diagram showing an example of the configuration of the imaging support unit 212. Fig. 6 is a view showing an example of the imaging support display in the horizontal grip. Fig. 7 is a view showing an example of the photographing auxiliary display at the time of vertical grip. Fig. 8 is a diagram showing an example of the operation of the angle adjustment information output unit 2124. Fig. 9 is a block diagram showing a configuration example of the walking posture measuring device 300. Fig. 10 is a diagram showing an example of the bone information 334. Fig. 11 is a diagram showing an example of measurement result information 336. Fig. 12 is a diagram for explaining the processing of the actual measurement value calculation unit 343. Fig. 13 to 19 are diagrams for explaining the processing of the actual measurement value calculation unit 343. Fig. 20 is a flowchart showing an example of the operation of the imaging support unit 212 in the smartphone 200. Fig. 21 is a flowchart showing an example of the operation of the walking posture measuring device 300. Fig. 22 is a flowchart showing an example of processing for calculating the measured value information 335.
In the first embodiment of the present disclosure, a walking posture measurement system 100 for measuring a walking posture of a person from a moving image obtained by using an imaging device such as a smartphone 200 will be described. In the walking posture measurement system 100, a moving image (i.e., a plurality of image data) indicating the posture of a person walking in the back-and-forth direction of an image and a moving image indicating the posture of a person walking in the left-and-right direction of an image are captured using the smartphone 200. Then, the walking posture measurement system 100 measures a walking posture such as a stride length, a walking speed, and straightness corresponding to a shake of a head or the like at the time of walking, based on a plurality of image data as moving images acquired by imaging. As will be described later, the walking posture measurement system 100 includes a configuration for matching imaging conditions as much as possible when image data is acquired using the smartphone 200, a configuration for measuring an actual measurement value from the image data, and the like.
Fig. 1 shows a configuration example of a walking posture measurement system 100. Referring to fig. 1, the walking posture measurement system 100 includes, for example, a smartphone 200 and a walking posture measurement device 300. As shown in fig. 1, the smartphone 200 and the walking posture measuring device 300 are connected to each other by wireless or wired communication, for example.
The smartphone 200 functions as an imaging device that captures the walking posture of the person. The smartphone 200 may be an ordinary smartphone having a camera function, a touch panel 201 for screen display, various sensors such as a GPS sensor and an acceleration sensor, and other general functions.
In the present embodiment, as shown in fig. 2, with the smartphone 200 held vertically (portrait orientation), the photographer captures the posture of a person walking in the inside-outside direction (the depth direction of the screen), toward the front. In other words, with the rectangular smartphone 200 held so that its long side is perpendicular to the ground, the photographer captures the posture of the person walking in the inside-outside direction. As shown in fig. 3, with the smartphone 200 held laterally (landscape orientation), the photographer captures the posture of the person walking in the left-right direction of the screen, from the left side to the right side. In other words, with the long side of the smartphone 200 held parallel to the ground, the photographer captures the posture of the person walking in the left-right direction. In this way, in the walking posture measurement system 100, the photographer captures the type of moving image (a plurality of image data) corresponding to the direction of the smartphone 200. The posture of the person walking in the inside-outside direction and the posture of the person walking in the left-right direction may be captured in two sessions using a single smartphone 200, or captured simultaneously using two smartphones 200.
Fig. 4 shows a configuration example of the smartphone 200 that is characteristic of the present embodiment. Referring to fig. 4, the smartphone 200 includes a measurement operation imaging unit 210 in addition to the general configuration of a smartphone, such as an acceleration sensor and a gyro sensor for detecting the direction (portrait or landscape) of the smartphone 200. As shown in fig. 4, the measurement operation imaging unit 210 includes an image data imaging unit 211 and an imaging support unit 212.
For example, the smartphone 200 includes an arithmetic device such as a CPU (Central Processing Unit) and a storage device. The smartphone 200 realizes the processing units described above by, for example, having the arithmetic device execute a program stored in the storage device.
The measurement operation imaging unit 210 captures the posture of the person walking, which is the measurement operation in the present embodiment, in accordance with the photographer's operation of the smartphone 200. The measurement operation imaging unit 210 includes functions for assisting imaging so that capture is performed under conditions as uniform as possible. For example, the measurement operation imaging unit 210 is an imaging application having a camera operation function for capturing the walking posture of a person and a guide function for making imaging conditions, such as the walking direction, the angle, and the size on the screen, as uniform as possible when the walking posture is captured. As described above, the measurement operation imaging unit 210 includes the image data imaging unit 211 and the imaging support unit 212.
The image data capturing unit 211 captures a person by using a camera included in the smartphone 200 to obtain a moving image (a plurality of image data). The image data capturing unit 211 can associate information indicating the date and time of acquisition of the image data, information acquired by the imaging support unit 212 described later, and the like with the moving image (image data) captured by the image data capturing unit 211.
The imaging support unit 212 provides assistance so that imaging conditions are as uniform as possible when the image data imaging unit 211 acquires image data. Fig. 5 shows an example of the configuration of the imaging support unit 212. Referring to fig. 5, the imaging support unit 212 includes, for example, a guide line display unit 2121, an angle information display unit 2122, a height information input unit 2123, and an angle adjustment information output unit 2124.
The guide line display unit 2121 displays, on the touch panel 201, a guide line 2011 that serves as a reference for the person's walking position, such as the position of the person's feet during capture. By having the person walk along the guide line 2011 displayed on the touch panel 201 as closely as possible, the walking direction, the angle, the size of the person on the screen, and the like can be made uniform when capturing the walking posture. A marker or the like corresponding to the position indicated by the guide line 2011 may also be placed in the real world, and the person may walk along it.
In the present embodiment, the guide line display unit 2121 displays a different guide line 2011 on the touch panel 201 depending on whether the smartphone 200 is held vertically or horizontally. For example, when it determines from information acquired from the acceleration sensor or the like that the smartphone 200 is held horizontally, the guide line display unit 2121 displays on the touch panel 201, as shown in fig. 6, a guide line 2011 for capturing a person walking in the left-right direction of the screen. In other words, the guide line display unit 2121 displays a guide line 2011 for guiding a person walking in the left-right direction. Referring to fig. 6, the touch panel 201 displays the region being captured by the camera of the smartphone 200, with the guide line 2011 superimposed on it. When it determines from information acquired from the acceleration sensor or the like that the smartphone 200 is held vertically, the guide line display unit 2121 displays on the touch panel 201, as shown in fig. 7, a guide line 2011 for capturing a person walking in the back-and-forth direction of the screen. In other words, the guide line display unit 2121 displays a guide line 2011 for guiding a person walking in the inside-outside direction.
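As a sketch of this orientation-dependent switching, the guide line to draw can be chosen from the gravity component reported by the acceleration sensor. The axis names, comparison rule, and return values below are assumptions for illustration; the patent states only that the orientation is determined from an acceleration sensor or the like.

```python
def select_guide_line(accel_x: float, accel_y: float) -> str:
    """Choose which guide line to draw from gravity readings.

    accel_x / accel_y are the accelerometer components along the
    device's short and long axes (hypothetical names). Gravity
    dominates whichever axis currently points down.
    """
    if abs(accel_y) >= abs(accel_x):
        # Long axis carries gravity: portrait hold, so guide the
        # person walking in the inside-outside (depth) direction.
        return "inside_outside_guide"
    # Short axis carries gravity: landscape hold, so guide the
    # person walking in the left-right direction.
    return "left_right_guide"
```

With a device held upright, almost all of gravity (about 9.8 m/s²) appears on the long axis, so the depth-direction guide is chosen; tipping the device on its side moves gravity to the short axis and switches the guide.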
The guide line display unit 2121 displays the guide line 2011 at a predetermined position. For example, when the smartphone 200 is held horizontally, the guide line display unit 2121 displays the guide line 2011 below the center of the imaging region, as shown in fig. 6 (for example, at the vertical center of the lower half of the region). When the smartphone 200 is held vertically, the guide line display unit 2121 displays the guide line 2011 in the center of the imaging region, as shown in fig. 7. The guide line display unit 2121 may display the guide line 2011 at positions other than those in these examples.
The angle information display unit 2122 obtains information indicating the angle of the smartphone 200, such as the inclination in the left-right direction and the inclination in the back-and-forth direction, from an acceleration sensor, a gyro sensor, or the like included in the smartphone 200. Then, the angle information display unit 2122 displays the acquired information on the touch panel 201 as angle information 2012 indicating the angle of the smartphone 200. When capturing the posture of the person walking, it is preferable to capture the image in a state in which the smartphone 200 is not tilted as much as possible. By displaying the angle information 2012 on the touch panel 201 by the angle information display unit 2122, the photographer can correct the angle of the smartphone 200 when photographing the posture of the person walking, and can photograph the posture of the person walking in a desired state in which the smartphone 200 is not inclined.
In the present embodiment, the angle information display unit 2122 displays information indicating the inclination in the left-right direction and information indicating the inclination in the back-and-forth direction on the touch panel 201. In other words, the angle information display unit 2122 displays information indicating the inclination of the smartphone 200 in both the horizontal and vertical directions. For example, as shown in figs. 6 and 7, the angle information display unit 2122 displays the angle information 2012 at a predetermined position on the touch panel 201. The angle information display unit 2122 may display the angle information 2012 in the same way (a display corresponding to the inclination) whether the smartphone 200 is held vertically or horizontally.
The height information input unit 2123 receives, from the person, input of height information indicating the height h of the smartphone 200 from the ground, and displays the input height h on the height display unit 2013 of the touch panel 201. The height h input via the height information input unit 2123 is used when the walking posture measuring device 300 calculates the actual measurement value W.
As shown in figs. 6 and 7, the height information input unit 2123 displays the height display unit 2013 at a predetermined position on the touch panel 201. The height information input unit 2123 receives input of information indicating the height h when, for example, the person operating the smartphone 200 touches the height display unit 2013. The height information input unit 2123 then displays the received height h on the height display unit 2013. The height information input unit 2123 may display the height display unit 2013 in the same way whether the smartphone 200 is held vertically or horizontally.
The angle adjustment information output unit 2124 outputs information for adjusting the inclination of the smartphone 200. For example, the angle adjustment information output unit 2124 outputs information corresponding to the inclination of the smartphone 200, which differs according to the inclination of the smartphone 200. Specifically, for example, the angle adjustment information output unit 2124 outputs, as information for adjusting the inclination of the smartphone 200, a sound adjusted according to the inclination of the smartphone 200.
Fig. 8 shows an example of the processing of the angle adjustment information output unit 2124. Referring to fig. 8, for example, the angle adjustment information output unit 2124 outputs a sound whose length is adjusted according to the inclination of the smartphone 200 in the left-right direction and whose pitch is adjusted according to the inclination of the smartphone 200 in the inside-outside direction. In this way, the angle adjustment information output unit 2124 performs a different adjustment for each manner of inclination of the smartphone 200. For example, referring to fig. 8, the angle adjustment information output unit 2124 shortens each tone as the smartphone 200 is tilted further to the left, and lengthens each tone as the smartphone 200 is tilted further to the right. The angle adjustment information output unit 2124 lowers the pitch as the smartphone 200 is tilted further in the forward direction (for example, toward the photographer), and raises the pitch as the smartphone 200 is tilted further in the depth direction (for example, away from the photographer).
According to the above configuration, when the smartphone 200 is tilted in some direction, the angle adjustment information output unit 2124 adjusts at least one of the length and the pitch of the tone according to the manner of tilt, and thus outputs an adjusted sound. When the smartphone 200 is not tilted, neither the length nor the pitch is adjusted, so the angle adjustment information output unit 2124 outputs a single unadjusted tone. By having the angle adjustment information output unit 2124 output such adjusted sounds, the photographer can easily adjust the angle of the smartphone 200 even when, for example, it is difficult to see the touch panel 201.
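The tone adjustment described above can be sketched as a mapping from the two tilt angles to a (length, frequency) pair. The sign conventions and scaling constants are assumptions; the patent specifies only that tone length tracks the left-right tilt, pitch tracks the inside-outside tilt, and that a level device yields an unadjusted tone.

```python
def tilt_feedback_tone(roll_deg: float, pitch_deg: float,
                       base_len_s: float = 0.5, base_hz: float = 440.0):
    """Map device tilt to one feedback tone: (length_s, frequency_hz).

    roll_deg  : left-right tilt (negative = left, positive = right)
    pitch_deg : inside-outside tilt (negative = toward photographer)
    The 45-degree scaling and the base tone are hypothetical.
    """
    # Tilted left -> shorter tone; tilted right -> longer tone.
    length = max(0.1, base_len_s * (1.0 + roll_deg / 45.0))
    # Tilted toward the photographer -> lower pitch; tilted away
    # -> higher pitch (here, one octave per 45 degrees).
    freq = base_hz * 2.0 ** (pitch_deg / 45.0)
    return length, freq
```

A level device returns the unmodified base tone, so the photographer can level the smartphone by ear, matching the behavior the patent describes for photographers who cannot easily see the touch panel 201.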
The information that the angle adjustment information output unit 2124 outputs for adjusting the inclination of the smartphone 200 is not limited to sound. For example, the angle adjustment information output unit 2124 may emit light or vibrate the smartphone 200 instead of outputting sound. For example, the angle adjustment information output unit 2124 may be configured to light a lamp or vibrate the smartphone 200 when the angle approaches the correct angle, that is, when the smartphone is nearly level. The angle adjustment information output unit 2124 may also combine these outputs in various ways, for example outputting a sound according to the inclination in the left-right direction while emitting light according to the inclination in the inside-outside direction.
The above is a configuration example of the characteristic smartphone 200 in the present embodiment.
The walking posture measuring device 300 is a server device that measures a walking posture such as a stride length, a walking speed, and straightness based on a moving image (i.e., a plurality of image data) captured by the smartphone 200. Fig. 9 shows a configuration example of the walking posture measuring device 300. Referring to fig. 9, the walking posture measuring device 300 includes, for example, a screen display unit 310, a communication I/F unit 320, a storage unit 330, and an arithmetic processing unit 340 as main components.
The screen display unit 310 is configured by a screen display device such as a touch panel or a liquid crystal display. In accordance with instructions from the arithmetic processing unit 340, the screen display unit 310 can display the image information 333, the bone information 334, the measurement result information 336, a view in which the positions indicated by the bone information 334 are superimposed on the image data included in the image information 333, and the like.
The communication I/F section 320 is constituted by a data communication circuit. The communication I/F unit 320 performs data communication with an external device, a smartphone 200, or the like connected via a communication line.
The storage unit 330 is a storage device such as a hard disk or a memory. The storage unit 330 stores processing information and programs 337 necessary for various kinds of processing in the arithmetic processing unit 340. The program 337 is read into the arithmetic processing unit 340 and executed to realize various processing units. The program 337 is read in advance from an external device or a recording medium by a data input/output function such as the communication I/F unit 320, and is stored in the storage unit 330. As the main information stored in the storage unit 330, for example, a learned model 331, camera setting information 332, image information 333, bone information 334, measured value information 335, measurement result information 336, and the like are given.
The learned model 331 is a model used when the bone recognition unit 342 performs bone recognition. The learned model 331 is generated in advance by machine learning, in an external device or the like, using training data such as image data to which skeletal coordinates have been assigned; it is acquired from the external device or the like through the communication I/F unit 320 or the like and stored in the storage unit 330.
The learned model 331 may be updated by relearning or the like using additional training data.
The camera setting information 332 includes information indicating parameters of a camera included in the smartphone 200 used when the smartphone 200 photographs walking of a person. The camera setting information 332 includes, for example, information indicating the vertical angle of view θ and the horizontal angle of view ψ of the camera.
The camera setting information 332 is acquired in advance from the smartphone 200 or the like through the communication I/F unit 320 or the like, for example, and is stored in the storage unit 330. The camera setting information 332 may be acquired from the smartphone 200 together with the image data and stored in the storage unit 330 when acquiring the image data from the smartphone 200.
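Although the patent leaves the actual-measurement computation to the measurement unit described later, the role of the angles of view in the camera setting information can be illustrated with basic pinhole geometry: given a horizontal angle of view, the real-world width covered by the image at a known distance follows directly, which lets pixel spans be converted to metres. The function names and the flat-ground, perpendicular-axis assumptions are illustrative only.

```python
import math

def visible_width(distance_m: float, horiz_fov_deg: float) -> float:
    """Real-world width covered by the full image at a given distance
    (pinhole model, camera axis perpendicular to the measured span)."""
    return 2.0 * distance_m * math.tan(math.radians(horiz_fov_deg) / 2.0)

def pixels_to_metres(span_px: float, image_width_px: int,
                     distance_m: float, horiz_fov_deg: float) -> float:
    """Convert a horizontal pixel span to metres at that distance."""
    return span_px / image_width_px * visible_width(distance_m, horiz_fov_deg)
```

For example, a 90-degree horizontal angle of view at 2 m covers about 4 m of the scene, so a span of half the image width corresponds to about 2 m.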
The image information 333 includes the image data (moving images) acquired by the camera of the smartphone 200. In the image information 333, for example, each moving image is associated with its image data, information indicating the date and time at which the smartphone 200 acquired the image data, information indicating the height input via the height information input unit 2123, and the like. The image information 333 also associates a moving image of the person walking in the left-right direction with a moving image of the person walking in the inside-outside direction. As will be described later, the measurement unit 344 performs the corresponding walking posture measurements based on the moving image of the person walking in the left-right direction and the moving image of the person walking in the inside-outside direction, respectively.
The bone information 334 includes information indicating coordinates of each part of the person recognized by the bone recognition unit 342. Fig. 10 shows an example of bone information 334. Referring to fig. 10, in the bone information 334, for example, for each person to be identified, time and position information of each part are associated. The time indicates an elapsed time from the start of moving image capturing, a time at which the moving image is captured, and the like. The positional information of each part includes information indicating coordinates of each part in the image data, such as the position of the pelvis.
The parts included in the positional information of each part correspond to the learned model 331. For example, fig. 10 illustrates the pelvis and the center of the spine. The positional information may cover around 30 parts, such as the right shoulder, right elbow, right knee, and right hand, and may include parts other than those illustrated in fig. 10.
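A minimal sketch of one bone-information record, assuming joints are stored as named 2-D pixel coordinates per timestamped frame (the patent specifies only that roughly 30 parts with positional information are kept per person and time; the field and joint names here are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class SkeletonFrame:
    """One bone-information record: joint coordinates at one timestamp."""
    person_id: str
    time_s: float                                # elapsed time from capture start
    joints: dict = field(default_factory=dict)   # part name -> (x, y) pixel coords

# Hypothetical record for one frame of a tracked person.
frame = SkeletonFrame("person-1", 0.033,
                      {"pelvis": (412, 580), "spine_center": (410, 470)})
```

A sequence of such records per person is what the measurement processing described later would consume.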
The actual measurement value information 335 includes information indicating the actual measurement value W calculated by the actual measurement value calculation unit 343. For example, in the actual measurement value information 335, the actual measurement value W is associated with the reference line, identification information of the image data, and the like. The actual measurement value information 335 may also include information indicating a stride length and the like. The details of the processing of the actual measurement value calculation unit 343 will be described later.
The measurement result information 336 indicates the results of the walking posture measurement by the measurement unit 344. Fig. 11 shows an example of the measurement result information 336. Referring to fig. 11, in the measurement result information 336, for example, the time, the walking speed, the stride length, and the straightness are associated for each person to be measured. Here, the time indicates, for example, the elapsed time from the start of moving image capturing or the time at which the moving image was captured. The walking speed represents the speed at which the person walks. The stride length represents the length between the tips (or the heels) of the right and left feet of the walking person. The straightness indicates the degree of sway of the head or body while the person walks. The measurement result information 336 may include other measured information, such as the time taken per step or the distance between the feet.
Here, among the pieces of information included in the measurement result information 336, the walking speed and the stride length are measured by the actual measurement value calculation unit 343 and the measurement unit 344 based on the moving image (image data) of the person walking in the left-right direction. The straightness, on the other hand, is measured by the measurement unit 344 based on the moving image (image data) of the person walking in the depth direction. In other words, the measurement result information 336 includes both information measured by the measurement unit 344 based on the moving image of the person walking in the left-right direction and information measured by the measurement unit 344 based on the moving image of the person walking in the depth direction.
The arithmetic processing unit 340 has a microprocessor such as an MPU and peripheral circuits. The arithmetic processing unit 340 reads and executes the program 337 from the storage unit 330, and realizes various processing units by making the hardware cooperate with the program 337. As main processing units implemented by the arithmetic processing unit 340, there are, for example, an image acquisition unit 341, a bone recognition unit 342, an actual measurement value calculation unit 343, a measurement unit 344, and an output unit 345.
The image acquisition unit 341 acquires the moving image (a plurality of pieces of image data) acquired by the smartphone 200 from the smartphone 200 via the communication I/F unit 320. Then, the image acquisition unit 341 associates the acquired image data with, for example, information indicating the date and time of acquisition, the input height, and the like, and stores them in the storage unit 330 as the image information 333.
In the case of the present embodiment, the image acquisition unit 341 acquires, from the smartphone 200, the moving image (image data) in which the person walks in the left-right direction and the moving image (image data) in which the person walks in the depth direction in an associated, or associable, state. Then, the image acquisition unit 341 associates the two acquired types of moving images with each other and stores them in the storage unit 330 as the image information 333.
The bone recognition unit 342 recognizes the bones of the person who is the object of the walking posture measurement from the image data using the learned model 331. For example, the bone recognition unit 342 recognizes parts such as the upper part of the spine, the right shoulder, the left shoulder, the right elbow, the left elbow, the right wrist, the left wrist, the right hand, and the left hand. The bone recognition unit 342 calculates the coordinates in the image data of each recognized part. Then, the bone recognition unit 342 associates the result of the recognition and calculation with identification information for identifying the person, and stores the result in the storage unit 330 as the bone information 334 for each person.
The portion identified by the bone identification unit 342 corresponds to the learned model 331 (training data used when learning the learned model 331). Therefore, the bone recognition unit 342 may recognize a portion other than the above-described example based on the learned model 331.
The actual measurement value calculation unit 343 calculates the actual measurement value W, the length from one end of the screen to the other along the reference line, based on the height h of the camera of the smartphone 200 from the ground, the vertical angle of view θ and the horizontal angle of view ψ of the camera of the smartphone 200, and the ratio α of the position of the reference line, set at an arbitrary height on the screen, to half the screen. The actual measurement value calculation unit 343 can calculate the stride of the person and the like using the calculated actual measurement value W. Then, the actual measurement value calculation unit 343 stores the calculated actual measurement value W and the like in the storage unit 330 as the actual measurement value information 335. The actual measurement value calculation unit 343 can calculate the actual measurement value W for each frame (that is, for each piece of image data) in the moving image.
In general, from image data for which no depth values are available, such as that captured by the camera included in the smartphone 200, only planar positions on the screen can be obtained, and actual lengths in the real world cannot be read off directly. The actual measurement value calculation unit 343 can nevertheless calculate an actual length in the real world, namely the actual measurement value W, by using the various values described above.
Fig. 12 shows the mathematical expression used when the actual measurement value calculation unit 343 calculates the actual measurement value W. As shown in fig. 12, the actual measurement value calculation unit 343 calculates the actual measurement value W by evaluating equation 1. That is, the actual measurement value calculation unit 343 calculates the actual measurement value W based on the height h of the camera of the smartphone 200 from the ground, the angle-of-view parameters of the camera of the smartphone 200, and the ratio α of the reference line to half the screen.
[Math 1]
W = 2h·tan(ψ/2) / tan(α·θ/2)    (Equation 1)
Here, the actual measurement value W represents, for example, the actual length of the reference line in the image data from one end of the screen to the other. h is the height of the camera (smartphone 200) from the ground; its value is input by the photographer via the height information input unit 2123 when the image data is acquired. θ is the vertical angle of view, and ψ is the horizontal angle of view. α is the ratio of the position of the reference line, at an arbitrary height on the screen, to half the screen.
Next, with reference to fig. 13 to 18, equation 1 will be described in detail.
For example, as shown in fig. 13, a person standing on the reference line, set at an arbitrary height on the screen, is photographed using the camera of the smartphone 200 located at the height h. In this case, as shown in fig. 14, the position of the smartphone 200 at the height h is set as the origin O. Further, the point G is the point of contact with the ground when the smartphone 200 is lowered vertically to the ground, and the point P is the position of the person. The angle formed by the line connecting the origin O and the point P and the ground (or a line horizontal to the ground) is set as the angle φ.
In the above case, when the distance from the point G to the point P is d, as shown in fig. 15, the relationship of equation 2 holds and can be rearranged into equation 3.
[Math 2]
tan φ = h / d    (Equation 2)
[Math 3]
d = h / tan φ    (Equation 3)
In addition, as shown in fig. 16, when the vertical angle of view is θ, the angle φ is related to the ratio α of the position of the point P on the screen by equation 4.
[Math 4]
φ = α·θ / 2    (Equation 4)
When the scene shown in fig. 14 is viewed from directly above, it appears as shown in fig. 17. In fig. 17, one end of the line passing through the screen height of the point P (that is, the reference line) is set as the point Q. In this case, as shown in fig. 18, if the width between the point P and the point Q is w, the relationship of equation 5 holds and can be rearranged into equation 6. Since equation 7 is obtained from equations 3 and 4, substituting equation 7 into equation 6 yields equation 8.
[Math 5]
tan(ψ/2) = w / d    (Equation 5)
[Math 6]
w = d·tan(ψ/2)    (Equation 6)
[Math 7]
d = h / tan(α·θ/2)    (Equation 7)
[Math 8]
w = h·tan(ψ/2) / tan(α·θ/2)    (Equation 8)
Accordingly, since the actual measurement value W, the entire width of the screen at the position at the distance d from the smartphone 200, is 2w, multiplying equation 8 by 2 yields equation 1. The actual measurement value W can therefore be calculated by evaluating equation 1.
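The derivation above can be checked numerically. The sketch below is one reading of equations 3, 4, 6, and 7 (ground distance d = h/tan(αθ/2), half-width w = d·tan(ψ/2)); the camera height and the angles of view used in the example are illustrative assumptions, not values from the embodiment.

```python
import math

def actual_width(h, theta, psi, alpha):
    """Actual measurement value W: real-world width of the screen along
    the reference line. Angles are in radians."""
    d = h / math.tan(alpha * theta / 2)   # equations 3 and 4 combined (equation 7)
    w = d * math.tan(psi / 2)             # equation 6: half-width at point P
    return 2 * w                          # equation 1: W = 2w

# Assumed example: camera 1.4 m above the ground, 47 deg vertical and
# 67 deg horizontal angles of view, reference line at 60% of the half screen.
W = actual_width(1.4, math.radians(47), math.radians(67), 0.6)
```

Note that with α = 1 (reference line at the bottom edge of the screen) and θ = ψ, the expression collapses to W = 2h, a quick sanity check on the geometry.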
The reference line, the arbitrary screen height at which the ratio α is calculated, may be determined, for example, from the position of the feet of the person in the image data. That is, the actual measurement value calculation unit 343 can determine the reference line from the position of the feet of the person in the image data and then calculate the ratio α. For example, the actual measurement value calculation unit 343 may use, as the reference line, a horizontal line (slope 0) at the average of the Y coordinates of the left and right feet of the person, or at the Y coordinate of either foot (a line other than these examples may also be used). The above-described processing by the actual measurement value calculation unit 343 may be performed, for example, for each frame (that is, for each piece of image data). Alternatively, the screen height of the reference line may be predetermined based on the position of the guide line 2011 displayed by the guide line display unit 2121.
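Under the assumption that α is read from pixel coordinates as the offset of the reference line below the screen center, measured relative to half the image height, determining the line from the feet and computing α might look like the following; the convention is inferred, not stated in the text.

```python
def reference_line_ratio(left_foot_y, right_foot_y, image_height):
    """Ratio alpha of the reference line to half the screen.

    The reference line is taken as a horizontal line at the average Y
    coordinate of the two feet; alpha is its offset below the screen
    center relative to half the image height (an assumed convention).
    """
    line_y = (left_foot_y + right_foot_y) / 2
    half = image_height / 2
    return (line_y - half) / half   # 0 at the center, 1 at the bottom edge

alpha = reference_line_ratio(920, 940, 1080)   # (930 - 540) / 540
```

Running this per frame mirrors the per-frame processing described above.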
The actual measurement value calculation unit 343 can calculate a stride length and the like from the calculated actual measurement value W. For example, the actual measurement value calculation unit 343 can calculate the stride from image data in which the feet are split front and back during walking, included in the moving image of the person walking in the left-right direction shown in fig. 19, together with the calculated actual measurement value W and the resolution of the image data. Specifically, for example, when the resolution of the image data is 1920×1080, the actual length k per pixel on the reference line is k = W/1920. Therefore, for example, when the difference between the x coordinates of the left and right feet (for example, of the toe parts recognized by the bone recognition unit 342) in the image data shown in fig. 19 is 100 pixels, the actual measurement value calculation unit 343 can calculate the actual stride as 100×k. In this way, the actual measurement value calculation unit 343 can calculate the stride from the calculated actual measurement value W, the resolution, and the number of pixels.
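The pixel-to-length conversion described here is straightforward to sketch; the toe x coordinates below are assumed values for illustration.

```python
def stride_from_pixels(W, image_width_px, left_toe_x, right_toe_x):
    """Stride length from the actual measurement value W, the horizontal
    resolution, and the toe x coordinates of the front and rear feet."""
    k = W / image_width_px                 # actual length per pixel on the line
    return abs(left_toe_x - right_toe_x) * k

# Example following the text: 1920-pixel-wide frame, feet 100 pixels apart.
stride = stride_from_pixels(7.4, 1920, 800, 900)
```

With W in meters, the result is the stride in meters; any consistent unit works, since only the ratio W / image_width_px matters.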
As described above, the actual measurement value calculation unit 343 can calculate the actual measurement value W, the length of the reference line from one end of the screen to the other, and can calculate the stride of the person from the actual measurement value W.
The measurement unit 344 measures the walking posture of the person using the calculation result of the measured value calculation unit 343, the recognition result of the bone recognition unit 342, and the like. Then, the measurement unit 344 stores the measurement result or the like as measurement result information 336 in the storage unit 330.
As described above, the measurement unit 344 can perform measurements based on the moving image of the person walking in the left-right direction and measurements based on the moving image of the person walking in the depth direction. For example, the measurement unit 344 can calculate the walking speed, the pitch, and the like based on the moving image (image data) in which the person walks in the left-right direction. For example, the measurement unit 344 may calculate the inter-frame movement distance of a part identified by the bone recognition unit 342, and calculate the walking speed based on the calculated movement distance and the time of each frame (image data). The measurement unit 344 may use the calculation results of the actual measurement value calculation unit 343, such as the stride length, when calculating the walking speed. The measurement unit 344 can also calculate the straightness and the like based on the moving image (image data) in which the person walks in the depth direction. For example, the measurement unit 344 can calculate the straightness based on the sway of the coordinates of the head recognized by the bone recognition unit 342.
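A minimal sketch of the two measurements, assuming the per-frame positions have already been converted to real-world lengths as described earlier; treating the standard deviation of the head's lateral position as the sway measure is an assumption, not the device's stated formula.

```python
from statistics import pstdev

def walking_speed(positions_m, frame_interval_s):
    """Average walking speed from per-frame real-world positions (meters)
    of one recognized part, e.g. the pelvis, in the left-right view."""
    moved = sum(abs(b - a) for a, b in zip(positions_m, positions_m[1:]))
    return moved / (frame_interval_s * (len(positions_m) - 1))

def head_sway(head_xs_px):
    """Lateral sway of the head across frames in the depth-direction view;
    smaller values are read as straighter walking."""
    return pstdev(head_xs_px)

speed = walking_speed([0.00, 0.04, 0.08, 0.12], 1 / 30)
```

The example walks 4 cm per frame at 30 fps, i.e. about 1.2 m/s.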
The measurement unit 344 may be configured to perform measurements other than those illustrated above.
The output unit 345 can output the bone information 334, the measured value information 335, the measurement result information 336, the information in which the bone information 334 is superimposed on the moving image included in the image information 333, and the like. For example, the output unit 345 outputs the information by displaying the information on the screen display unit 310 or transmitting the information to an external device connected via the communication I/F unit 320.
The above is a configuration example of the walking posture measuring device 300.
Next, an example of the operation of the walking posture measurement system 100 will be described with reference to fig. 20 to 22. First, an example of the operation of the angle adjustment information output unit 2124 will be described with reference to fig. 20.
Fig. 20 shows an example of the operation of the angle adjustment information output unit 2124. Referring to fig. 20, when the smartphone 200 is tilted in the vertical direction (yes in step S101), the angle adjustment information output unit 2124 corrects the pitch of the sound according to the tilt (step S102). That is, the more the smartphone 200 is tilted forward or backward, the greater the correction the angle adjustment information output unit 2124 applies to the pitch of the sound. Then, the angle adjustment information output unit 2124 outputs the corrected sound (step S103).
When the smartphone 200 is tilted in the left-right direction (yes in step S104), the angle adjustment information output unit 2124 corrects the duration of the sound according to the tilt (step S105). That is, the more the smartphone 200 is tilted to the left or right, the greater the correction the angle adjustment information output unit 2124 applies to the duration of the sound. Then, the angle adjustment information output unit 2124 outputs the corrected sound (step S106).
The angle adjustment information output unit 2124 terminates the output of the sound when a termination condition is satisfied (yes in step S107). The termination condition may be, for example, the start of capturing a moving image, the end of capturing a moving image, the elapse of a predetermined time after the tilt of the smartphone 200 is eliminated and the sound returns to the reference, or the execution of an output stop operation by a person. A condition other than these examples may also be used.
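The tilt-to-sound mapping can be sketched as below; the base frequency, base duration, and gain factors are invented for illustration, since the embodiment only states that larger tilts produce larger corrections.

```python
def feedback_tone(forward_tilt_deg, sideways_tilt_deg,
                  base_hz=440.0, base_s=0.5):
    """Feedback tone for angle adjustment: vertical (forward/backward)
    tilt raises the pitch of the sound, horizontal (left/right) tilt
    lengthens its duration. The gains are assumed values."""
    pitch_hz = base_hz + 4.0 * abs(forward_tilt_deg)
    duration_s = base_s + 0.05 * abs(sideways_tilt_deg)
    return pitch_hz, duration_s

tone = feedback_tone(10.0, 0.0)   # pitch corrected, duration unchanged
```

When both tilts are zero, the unit would emit the unmodified reference tone, matching the "sound returns to the reference" termination condition above.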
The above is an example of the operation of the angle adjustment information output unit 2124. Next, an example of the overall operation of the walking posture measuring device 300 will be described with reference to fig. 21. Fig. 21 shows an overall operation example of the walking posture measuring device 300. Referring to fig. 21, the image acquisition unit 341 acquires the moving image (a plurality of pieces of image data) acquired by the smartphone 200 from the smartphone 200 via the communication I/F unit 320 (step S201). In the case of the present embodiment, the image acquisition unit 341 acquires, from the smartphone 200, the moving image (image data) in which the person walks in the left-right direction and the moving image (image data) in which the person walks in the depth direction in an associated, or associable, state.
The bone recognition unit 342 recognizes the bone of the person to be the object of the walking posture measurement from the image data using the learned model 331 (step S202).
The measurement unit 344 obtains the actual measurement value W and the like indicated by the actual measurement value information 335 (step S203). The actual measurement value W indicated by the actual measurement value information 335 may be calculated in advance by the actual measurement value calculation unit 343, or may be calculated by the actual measurement value calculation unit 343 in parallel with, or after, the recognition processing by the bone recognition unit 342.
The measurement unit 344 measures the walking posture of the person using the calculation results and the like of the actual measurement value calculation unit 343 (step S204). For example, the measurement unit 344 performs measurements based on the moving image in which the person walks in the left-right direction and measurements based on the moving image in which the person walks in the depth direction.
The output unit 345 outputs the bone information 334, the measured value information 335, the measurement result information 336, the information in which the bone information 334 is superimposed on the moving image included in the image information 333, and the like (step S205).
The above is an example of the overall operation of the walking posture measuring device 300. Next, an example of the process of calculating the actual measurement value W by the actual measurement value calculating unit 343 will be described with reference to fig. 22.
Referring to fig. 22, the actual measurement value calculation unit 343 obtains the information indicating the height input via the height information input unit 2123, which is included in the image information 333. The actual measurement value calculation unit 343 also refers to the camera setting information 332 and acquires the information indicating the vertical angle of view θ and the horizontal angle of view ψ of the camera (step S301).
The actual measurement value calculation unit 343 determines the reference line for calculating the actual measurement value W, and calculates the ratio α (step S302). For example, the actual measurement value calculation unit 343 determines the reference line based on the position of the feet of the person in the image data, and then calculates the ratio α of the reference line to half the screen.
The actual measurement value calculation unit 343 calculates the actual measurement value W based on the height h, the vertical angle of view θ, the horizontal angle of view ψ, and the ratio α (step S303). For example, the actual measurement value calculation unit 343 calculates the actual measurement value W by evaluating equation 1 above. Then, the actual measurement value calculation unit 343 stores the calculated actual measurement value W in the storage unit 330 as the actual measurement value information 335.
The actual measurement value calculation unit 343 takes the difference between the x coordinates of the left and right feet (for example, of the toes) in the image data, and then calculates the stride based on that difference, the actual measurement value W, and the resolution (step S304).
The above is a processing example of the actual measurement value calculation unit 343.
As described above, the walking posture measuring device 300 includes the bone recognition unit 342 and the measuring unit 344. With this configuration, the measurement unit 344 can measure the walking posture based on the result of the bone recognition by the bone recognition unit 342. As a result, the measurement unit 344 can measure the walking posture based on the image data acquired by using the camera or the like of the smartphone 200, without using a depth sensor or the like.
The walking posture measuring device 300 is also configured to acquire a moving image showing the posture of a person walking in the depth direction of the image and a moving image showing the posture of a person walking in the left-right direction of the image. With this configuration, the measurement unit 344 can perform measurements based on the moving image showing the posture of the person walking in the depth direction of the image as well as measurements based on the moving image showing the posture of the person walking in the left-right direction of the image. As a result, the measurement unit 344 can measure a variety of walking posture characteristics that would be difficult to measure from a single moving image, based on image data obtained using a camera or the like included in the smartphone 200 and without using a depth sensor or the like.
The walking posture measuring device 300 further includes the actual measurement value calculation unit 343. With this configuration, the actual measurement value calculation unit 343 can calculate actual measurement values, and as a result, the stride and the like can be calculated from a moving image captured using the camera of the smartphone 200. The walking posture measuring device 300 can thus perform accurate walking posture measurement based on a moving image captured by the camera of the smartphone 200.
Further, the smartphone 200 includes the imaging support unit 212. With this configuration, the smartphone 200 can acquire the moving image showing the posture of the person walking in the depth direction of the image and the moving image showing the posture of the person walking in the left-right direction of the image with the assistance of the imaging support unit 212. As a result, the smartphone 200 can acquire the two moving images under imaging conditions that are as close as possible, which improves the accuracy of measurements made by the measurement unit 344 using the two moving images. In addition, with the above configuration, variation in the imaging conditions between moving images acquired at different timings can be suppressed. That is, the imaging conditions can be matched when image data in the same direction is captured at various timings. As a result, image data in the same direction can be compared with high accuracy, improving the accuracy of analysis.
The imaging support unit 212 includes an angle adjustment information output unit 2124. With this configuration, the angle adjustment information output unit 2124 can output information such as a sound for adjusting the inclination of the smartphone 200. As a result, even in a state where it is difficult to see the screen of the smartphone 200, the angle adjustment of the smartphone 200 can be easily performed. This facilitates matching of imaging conditions, and can easily improve the accuracy of measurement by the measurement unit 344.
In the present embodiment, a case where the smartphone 200 is used as the imaging device is illustrated. However, a camera other than the smartphone 200 may be used to acquire a moving image. That is, the imaging device included in the walking posture measurement system 100 is not limited to the case of the smartphone 200.
In the present embodiment, a case is illustrated in which 1 information processing device realizes the function as the walking posture measuring device 300. However, the function of the walking posture measuring device 300 may be realized by a plurality of information processing devices connected via a network, for example. In other words, the function of the walking posture measuring device 300 is not limited to the case of being realized by 1 information processing device, and may be realized on the cloud, for example.
The imaging support unit 212 may have all of the functions illustrated in fig. 5, or may have only some (at least one) of them. For example, the imaging support unit 212 may have only the function of the angle adjustment information output unit 2124, without displaying the angle information 2012 of the angle information display unit 2122.
The assist function of the imaging support unit 212 described in the present embodiment may be applied to systems other than the walking posture measurement system 100, in various situations in which imaging conditions must be matched as closely as possible when acquiring image data. Similarly, the calculation of the actual measurement value W by the actual measurement value calculation unit 343 may be applied to systems other than the walking posture measurement system 100, in various situations in which an actual measurement value is calculated based on image data.
Second embodiment
Next, a second embodiment of the present invention will be described with reference to fig. 23 to 28. Fig. 23 is a diagram for explaining an example of tracking by the walking posture measuring device 300. Fig. 24 is a block diagram showing a configuration example of the walking posture measurement device 300 according to the second embodiment. Fig. 25 is a diagram showing a configuration example of the tracking unit 346. Fig. 26 and fig. 27 are diagrams showing examples of the figures generated by the enclosing figure generation unit 3461. Fig. 28 is a flowchart showing an example of the operation of the tracking unit 346.
In the second embodiment of the present disclosure, a modification of the walking posture measuring device 300 described in the first embodiment will be described. For example, as shown in fig. 23, when a plurality of persons exist in the image data, the walking speed calculated from the inter-frame movement distance cannot be accurate unless the same person is accurately tracked between frames. Therefore, in the present embodiment, a tracking unit 346 is provided in addition to the configuration of the walking posture measuring device 300 described in the first embodiment. The tracking unit 346 is configured to track the results of the recognition by the bone recognition unit 342.
Fig. 24 shows a configuration example of a walking posture measurement device 300 according to the second embodiment. Referring to fig. 24, the walking posture measuring device 300 has a tracking unit 346 in addition to the configuration described in the first embodiment. The characteristic configuration of the present embodiment will be described below.
The tracking unit 346 tracks persons based on the results of the recognition by the bone recognition unit 342. For example, the tracking unit 346 tracks a person by assigning an identification number to each recognized person. That is, the tracking unit 346 tracks a person by assigning the same identification number to a person determined to be the same between the image data of the previous frame and the image data of the current frame. Fig. 25 shows a more detailed configuration example of the tracking unit 346. Referring to fig. 25, the tracking unit 346 includes, for example, an enclosing figure generation unit 3461, an average bone coordinate calculation unit 3462, and a comparison and tracking unit 3463.
The enclosing figure generation unit 3461 generates, for each person, a figure that encloses the coordinates of all the parts recognized by the bone recognition unit 342 (the coordinates included in the bone information 334). For example, the enclosing figure generation unit 3461 generates any one of a minimum convex hull, a rectangle, and a circle enclosing all the coordinates. The enclosing figure generation unit 3461 then calculates the area of the generated figure.
For example, the enclosing figure generation unit 3461 performs the figure generation and area calculation processing on the image data of each frame. When a plurality of persons are included in the image data of one frame, the enclosing figure generation unit 3461 generates a figure and calculates its area for each of the persons included in the image data.
Fig. 26 shows an example of figures enclosing the coordinates of a person walking in the depth direction on the screen. As shown in fig. 26, the enclosing figure generation unit 3461 can generate any one of a minimum convex hull, a rectangle, and a circle enclosing all the coordinates. Fig. 27 shows an example of figures enclosing the coordinates of a person walking in the left-right direction on the screen. In the case shown in fig. 27, as in the case shown in fig. 26, the enclosing figure generation unit 3461 can likewise generate any one of a minimum convex hull, a rectangle, and a circle enclosing all the coordinates.
Which of the figures, the minimum convex hull, the rectangle, or the circle, the enclosing figure generation unit 3461 generates is determined in advance. For example, the enclosing figure generation unit 3461 may be configured to generate a circle as the figure enclosing the coordinates.
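Two of the enclosing figures are easy to sketch. The circle version below simply grows a centroid-centered circle out to the farthest point, which over-approximates the true minimum enclosing circle; the part coordinates in the example are assumed values.

```python
import math

def bounding_rectangle_area(points):
    """Area of the axis-aligned rectangle enclosing all part coordinates."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def enclosing_circle_area(points):
    """Area of a circle enclosing all points: centered on the centroid,
    radius to the farthest point (not the true minimum enclosing circle)."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    r = max(math.hypot(x - cx, y - cy) for x, y in points)
    return math.pi * r * r

parts = [(0, 0), (4, 0), (4, 3), (1, 2)]      # assumed part coordinates
rect_area = bounding_rectangle_area(parts)    # 4 * 3 = 12
```

Any of the figures works for tracking, since only the frame-to-frame consistency of the area matters, not its absolute value.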
The average bone coordinate calculation unit 3462 calculates, for each person, the average of the coordinates of all the parts recognized by the bone recognition unit 342 (the coordinates included in the bone information 334). In this way, the average bone coordinate calculation unit 3462 calculates average bone coordinates based on the coordinates of each part of the person recognized by the bone recognition unit 342.
As with the enclosing figure generation unit 3461, the average bone coordinate calculation unit 3462 calculates the average bone coordinates for the image data of each frame. When a plurality of persons are included in the image data of one frame, the average bone coordinate calculation unit 3462 performs the calculation for each of the persons included in the image data.
The comparison and tracking unit 3463 tracks persons based on the areas of the figures calculated by the enclosing figure generation unit 3461 and the average bone coordinates calculated by the average bone coordinate calculation unit 3462. For example, the comparison and tracking unit 3463 compares the area corresponding to the person to be tracked calculated in the current frame with the areas calculated in the previous frame. Depending on the result of that comparison, the comparison and tracking unit 3463 further compares the average bone coordinates corresponding to the person to be tracked with the average bone coordinates calculated in the previous frame.
Specifically, for example, when exactly one area calculated in the previous frame differs from the area corresponding to the person to be tracked in the current frame by no more than a first allowable value, the comparison and tracking unit 3463 determines that the person in the previous frame whose area difference is within the first allowable value is the same person as the person in the current frame. The comparison and tracking unit 3463 then sets, for example, the identification number of that person in the previous frame as the identification number of the person in the current frame.
When a plurality of the areas calculated one frame before differ from the area of the tracking target by no more than the first allowable value, the comparison and tracking unit 3463 compares the average skeleton coordinates of the person to be tracked with the average skeleton coordinates of each person one frame before whose area difference is within the first allowable value. The comparison and tracking unit 3463 then determines that the person one frame before whose difference in average skeleton coordinates falls within the second allowable value is the same person as the person in the current frame. As a result, the comparison and tracking unit 3463, for example, sets the identification number corresponding to the person one frame before who was determined to be the same person as the identification number corresponding to the person in the current frame. When there are a plurality of persons whose differences in average skeleton coordinates fall within the second allowable value, the comparison and tracking unit 3463 can, for example, determine that the person whose average skeleton coordinates are closest is the same person. Alternatively, in that case, the comparison and tracking unit 3463 may be configured to perform processing other than the above, such as treating the situation as an error.
When none of the areas calculated one frame before differs from the area of the tracking target by no more than the first allowable value, or when no person one frame before has average skeleton coordinates whose difference is within the second allowable value, the comparison and tracking unit 3463 determines that the person to be tracked is a newly recognized person. In this case, the comparison and tracking unit 3463 assigns a new identification number to the newly recognized person.
As the first allowable value, the comparison and tracking unit 3463 may use, for example, an estimated area value based on the rate of increase or decrease in the area corresponding to the person determined to be the same person over the preceding frames. The first allowable value may also be a predetermined value.
As the second allowable value, the comparison and tracking unit 3463 may use, for example, an estimated coordinate value based on the moving speed of the average skeleton coordinates corresponding to the person determined to be the same person over the preceding frames. Like the first allowable value, the second allowable value may be a predetermined value.
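The two-stage matching performed by the comparison and tracking unit 3463 can be sketched as follows (an illustrative sketch only; the names, the data layout, and the rule of taking the closest average coordinates when several candidates remain are assumptions based on the description above):

```python
def match_person(area_now, coord_now, prev_people, tol_area, tol_coord):
    """prev_people: {person_id: (area, (x, y))} from the previous frame.
    Returns the matched id, or None when the person is treated as new.
    Mirrors the described behavior: first filter by enclosing-figure
    area difference (first allowable value), then break ties by
    average-coordinate distance (second allowable value)."""
    # Step 1: candidates whose area difference is within the first allowable value.
    candidates = [pid for pid, (a, _) in prev_people.items()
                  if abs(a - area_now) <= tol_area]
    if not candidates:
        return None                      # newly recognized person: new id
    if len(candidates) == 1:
        return candidates[0]             # unique match by area alone
    # Step 2: among the area candidates, compare average skeleton coordinates.
    def dist(pid):
        _, (px, py) = prev_people[pid]
        return ((px - coord_now[0]) ** 2 + (py - coord_now[1]) ** 2) ** 0.5
    within = [pid for pid in candidates if dist(pid) <= tol_coord]
    if not within:
        return None                      # no coordinate match: treat as new
    # When several remain, take the closest average coordinates (one option).
    return min(within, key=dist)
```

A caller would assign the returned identification number to the current-frame person, or issue a fresh number when `None` is returned.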
The tracking unit 346 is configured as described above, as one example. Next, an example of the operation of the tracking unit 346 will be described with reference to fig. 28.
Referring to fig. 28, the enclosing figure generation unit 3461 generates, for each person, a figure that encloses the coordinates of all the parts recognized by the skeleton recognition unit 342 (the coordinates included in the skeleton information 334). The enclosing figure generation unit 3461 then calculates the area of the generated figure (step S401).
The average skeleton coordinate calculation unit 3462 calculates the average skeleton coordinates by averaging, for each person, the coordinates (the coordinates included in the skeleton information 334) of all the parts recognized by the skeleton recognition unit 342 (step S402).
The comparison and tracking unit 3463 compares the area corresponding to the person to be tracked calculated in the current frame with the areas calculated one frame before (step S403).
When exactly one of the areas calculated one frame before differs from the area corresponding to the person to be tracked in the current frame by no more than the first allowable value (step S403, 1), the comparison and tracking unit 3463 determines that the person one frame before whose area difference is within the first allowable value is the same person as the person in the current frame (step S404). As a result, the comparison and tracking unit 3463, for example, sets the identification number corresponding to the person one frame before as the identification number for the current frame.
When a plurality of the areas calculated one frame before differ from the area of the tracking target by no more than the first allowable value (step S403, plural), the comparison and tracking unit 3463 compares the average skeleton coordinates of the person to be tracked with the average skeleton coordinates of each person one frame before whose area difference is within the first allowable value (step S405). When a person one frame before has average skeleton coordinates whose difference from those of the tracking target is within the second allowable value (yes in step S405), the comparison and tracking unit 3463 determines that that person is the same person as the person to be tracked (step S404). When a plurality of persons one frame before have average skeleton coordinate differences within the second allowable value, the comparison and tracking unit 3463 can, for example, determine that the person whose average skeleton coordinates are closest is the same person. When no person one frame before has an average skeleton coordinate difference within the second allowable value (no in step S405), the comparison and tracking unit 3463 determines that the person to be tracked is a newly recognized person (step S406). In this case, the comparison and tracking unit 3463 assigns a new identification number to the newly recognized person.
When none of the areas calculated one frame before differs from the area of the tracking target by no more than the first allowable value (step S403, 0), the comparison and tracking unit 3463 determines that the person to be tracked is a newly recognized person (step S406). In this case, the comparison and tracking unit 3463 assigns a new identification number to the newly recognized person.
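As one possible reading of step S401, the enclosing figure can be taken to be the convex hull of the recognized part coordinates, with its area computed by the shoelace formula (the choice of convex hull is an assumption; the embodiment only requires some figure enclosing the coordinates, and later notes a bounding height or diameter may be used instead):

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def enclosing_area(points):
    """Shoelace area of the convex hull of the part coordinates."""
    hull = convex_hull(points)
    if len(hull) < 3:
        return 0.0                      # degenerate figure has no area
    s = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0
```

For instance, the four corners of a unit square plus an interior part coordinate yield an enclosing area of 1.0, since interior points do not affect the hull.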
The tracking unit 346 operates as described above, as one example.
As described above, the walking posture measuring device 300 of the present embodiment includes the tracking unit 346. With this configuration, the tracking unit 346 can track the same person based on the result of recognition by the skeleton recognition unit 342. As a result, erroneous calculation of the walking speed and the like can be suppressed, and the accuracy of the walking posture measurement can be improved.
In the present embodiment, the area of the enclosing figure generated by the enclosing figure generation unit 3461 is calculated. However, the enclosing figure generation unit 3461 may be configured to calculate a value other than the area based on the generated enclosing figure. For example, the enclosing figure generation unit 3461 may calculate the height, diameter, or the like of the generated enclosing figure instead of the area. In this case, the comparison and tracking unit 3463 compares the values of the height or the like calculated from the figures generated by the enclosing figure generation unit 3461, instead of the areas.
In the description above, the enclosing figure generation unit 3461 generates a figure that encloses all of the coordinates. However, the enclosing figure generation unit 3461 may be configured to generate a figure that encloses only a part of the coordinates recognized by the skeleton recognition unit 342, such as a figure enclosing the coordinates corresponding to the upper body of the person. Similarly, the average skeleton coordinate calculation unit 3462 may calculate the average coordinates based on only a part of the coordinates recognized by the skeleton recognition unit 342, such as the average of the coordinates corresponding to the upper body of the person.
The comparison and tracking unit 3463 may also be configured to perform only one of the comparison of the values derived from the enclosing figures generated by the enclosing figure generation unit 3461 and the comparison of the average skeleton coordinates.
The tracking using the result of recognition by the skeleton recognition unit 342 described in the present embodiment can also be applied to tracking a person outside the scene where the walking posture is measured. That is, the function of the tracking unit 346 may be applied to devices other than the walking posture measuring device 300 that require person tracking. As described above, the method for tracking a person using skeleton information described in this embodiment is not limited to measuring the walking posture and can be used flexibly in various scenes.
In the case of the present embodiment, various modifications may be employed as in the first embodiment.
Third embodiment
Next, a third embodiment of the present invention will be described with reference to fig. 29 and 30. Fig. 29 and 30 show a configuration example of the imaging device 400.
The imaging device 400 photographs the walking posture of a person. Fig. 29 shows an example of the hardware configuration of the imaging device 400. Referring to fig. 29, the imaging device 400 has, as an example, the following hardware configuration in addition to a camera or the like for imaging.
CPU (Central Processing Unit) 401 (computing device)
ROM (Read Only Memory) 402 (storage device)
RAM (Random Access Memory) 403 (storage device)
Program group 404 loaded into the RAM 403
Storage device 405 for storing the program group 404
Drive device 406 for reading from and writing to a recording medium 410 outside the imaging device
Communication interface 407 connected to a communication network 411 outside the imaging device
Input/output interface 408 for inputting and outputting data
Bus 409 connecting the components
Further, the imaging device 400 realizes the functions of the detection unit 421 and the display unit 422 shown in fig. 30 by the CPU 401 acquiring the program group 404 and executing it. The program group 404 is stored in advance in, for example, the storage device 405 or the ROM 402, and is loaded into the RAM 403 or the like and executed by the CPU 401 as necessary. The program group 404 may be supplied to the CPU 401 via the communication network 411, or may be stored in advance in the recording medium 410, from which the drive device 406 reads out the program and supplies it to the CPU 401.
Fig. 29 shows an example of the hardware configuration of the imaging device 400. The hardware configuration of the imaging device 400 is not limited to the above case. For example, the imaging device 400 may be configured without a part of the above-described configuration, such as the drive device 406.
The detection unit 421 detects the direction of the imaging device 400. For example, the detection unit 421 detects whether the imaging device 400 is in the portrait orientation or the landscape orientation.
The display unit 422 displays a guide line indicating the position where the person walks on the screen display unit. For example, the display unit 422 displays different guide lines according to the direction of the imaging device 400 detected by the detection unit 421.
Thus, the imaging device 400 includes the detection unit 421 and the display unit 422. With this configuration, the display unit 422 can display different guide lines according to the direction of the imaging device 400 detected by the detection unit 421. As a result, appropriate assistance can be provided for each direction when acquiring image data corresponding to the direction of the imaging device 400, and the imaging conditions can be made as uniform as possible when a plurality of image data are acquired.
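The guide-line selection made by the display unit 422 can be sketched as follows (the orientation labels and return strings are illustrative assumptions; the mapping follows the behavior described for the portrait and landscape orientations):

```python
def guide_line_for(orientation):
    """orientation: 'portrait' or 'landscape', as reported by the
    detection unit. Returns which walking direction the displayed
    guide line should indicate."""
    if orientation == "portrait":
        # Guide the person to walk toward/away from the camera
        # (the front-back direction of the screen).
        return "front-back"
    if orientation == "landscape":
        # Guide the person to walk across the frame.
        return "left-right"
    raise ValueError("unknown orientation: " + orientation)
```

A display routine would then draw the line along the returned axis of the screen display unit.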
The imaging device 400 described above can be realized by incorporating a predetermined program into the imaging device 400. Specifically, a program according to another embodiment of the present invention is a program that causes the imaging device 400 to realize a detection unit 421 that detects the direction of the imaging device, and a display unit 422 that displays a guide line indicating the position of the person walking on a screen display unit, and the display unit 422 displays the guide line differently depending on the direction of the imaging device detected by the detection unit 421.
The guidance method performed by the imaging device 400 is a method in which the imaging device 400, which captures the walking posture of a person, detects the direction of the imaging device and displays, on the screen display unit, a guide line indicating the position where the person walks, the displayed guide line differing according to the detected direction.
Even in the invention of the program or the guidance method having the above-described configuration, the above-described object of the present invention can be achieved because the program or the guidance method has the same operation and effects as those of the imaging device 400.
Fourth embodiment
Next, a fourth embodiment of the present invention will be described with reference to fig. 31. Fig. 31 shows an example of the configuration of an information processing apparatus 500.
The information processing apparatus 500 has, for example, the same hardware configuration as the imaging device 400 described with reference to fig. 29. Further, the information processing apparatus 500 realizes the function of the calculation unit 521 shown in fig. 31 by the CPU acquiring a program group included in the information processing apparatus 500 and executing it.
The calculating unit 521 calculates the actual length at a predetermined position in the image data based on the parameters of the imaging device that acquired the image data and information indicating the height of the imaging device at the time of acquiring the image data.
Thus, the information processing apparatus 500 has the calculating unit 521. With such a configuration, the calculating unit 521 can calculate the actual length at the predetermined position in the image data based on various information. As a result, analysis using the actual length or the like can be performed based on the image data acquired by the imaging device such as the smart phone.
The information processing apparatus 500 described above can be realized by incorporating a predetermined program into the information processing apparatus 500. Specifically, a program according to another embodiment of the present invention is a program for causing the information processing apparatus 500 to realize a calculating unit 521, wherein the calculating unit 521 calculates an actual length at a predetermined position in image data based on parameters of an imaging device that acquires image data and information indicating a height of the imaging device at the time of acquiring the image data.
The calculation method performed by the information processing apparatus 500 is a method in which the information processing apparatus 500 acquires parameters of an imaging device for acquiring image data and information indicating the height of the imaging device at the time of acquiring the image data, and calculates the actual length at a predetermined position in the image data based on the acquired information.
Even in the invention of the program or the calculation method having the above-described configuration, the above-described object of the present invention can be achieved because the program or the calculation method has the same operation and effect as the information processing apparatus 500.
Fifth embodiment
Next, a fifth embodiment of the present invention will be described with reference to fig. 32. Fig. 32 shows an example of the structure of an information processing apparatus 600.
The information processing apparatus 600 has, for example, the same configuration as the hardware configuration of the imaging apparatus 400 described with reference to fig. 29. The information processing apparatus 600 can realize the functions of the detection unit 621 and the output unit 622 shown in fig. 32 by acquiring a program group included in the information processing apparatus 600 by a CPU and executing the program group by the CPU.
The detection unit 621 detects the inclination of the information processing apparatus.
The output unit 622 outputs information corresponding to the inclination of the information processing apparatus detected by the detection unit 621, the information differing according to the inclination of the information processing apparatus.
Thus, the information processing apparatus 600 has the detection section 621 and the output section 622. With this configuration, the output unit 622 can output information corresponding to the inclination of the information processing apparatus detected by the detection unit 621. As a result, the operator who operates the information processing apparatus 600 can correct the inclination based on the output information.
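One plausible sketch of the output unit's audible feedback, under the assumptions that the left-right tilt corresponds to roll and the front-back tilt to pitch, that each tilted axis contributes its own differently adjusted tone, and that the threshold and tone names are purely illustrative:

```python
def tilt_feedback(roll_deg, pitch_deg, threshold=2.0):
    """Returns the list of sounds to emit for the given tilt.
    Each tilted axis adds a tone adjusted by a different method
    (e.g. pitch vs. length); a single tone signals 'level'."""
    sounds = []
    if abs(roll_deg) > threshold:
        sounds.append("beep-high")   # pitch-adjusted tone for left-right tilt
    if abs(pitch_deg) > threshold:
        sounds.append("beep-long")   # length-adjusted tone for front-back tilt
    if not sounds:
        sounds.append("beep-ok")     # one sound when not tilted
    return sounds
```

The operator can thus tell from the sound alone which axis needs correction, which is the effect the embodiment describes.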
The information processing apparatus 600 described above can be realized by incorporating a predetermined program into the information processing apparatus 600. Specifically, a program according to another embodiment of the present invention is a program for causing the information processing apparatus 600 to realize the detection unit 621 for detecting the inclination of the information processing apparatus 600 and the output unit 622 for outputting information corresponding to the inclination of the information processing apparatus 600 detected by the detection unit 621, which is different depending on the inclination of the information processing apparatus 600.
The output method performed by the information processing apparatus 600 is a method in which the information processing apparatus 600 detects the inclination of the information processing apparatus 600 and outputs information corresponding to the detected inclination, the information differing according to the inclination of the information processing apparatus.
Even in the invention of the program or the output method having the above-described configuration, the above-described object of the present invention can be achieved because the program or the output method has the same operation and effect as the information processing apparatus 600.
Sixth embodiment
Next, a sixth embodiment of the present invention will be described with reference to fig. 33. Fig. 33 shows an example of the structure of the tracking device 700.
The tracking device 700 has, for example, the same hardware configuration as the imaging device 400 described with reference to fig. 29. The tracking device 700 can realize the functions of the acquisition unit 721 and the tracking unit 722 shown in fig. 33 by acquiring a program group included in the tracking device 700 by a CPU and executing the program group by the CPU.
The acquisition unit 721 acquires information indicating a plurality of parts of a person, the parts being identified by recognizing the skeleton of the person in image data.
The tracking unit 722 tracks the same person among the plurality of image data based on the information acquired by the acquisition unit 721.
Thus, the tracking device 700 includes the acquisition unit 721 and the tracking unit 722. With this configuration, the tracking unit 722 can perform tracking based on the information indicating the location acquired by the acquisition unit 721. Thus, easy tracking can be realized.
The tracking device 700 described above can be realized by incorporating a predetermined program into the tracking device 700. Specifically, a program according to another embodiment of the present invention is a program for causing the tracking device 700 to realize an acquisition unit 721 for acquiring information indicating a plurality of parts of a person identified by recognizing the skeleton of the person in image data, and a tracking unit 722 for tracking the same person among a plurality of image data based on the information acquired by the acquisition unit 721.
The tracking method performed by the tracking device 700 is a method in which the tracking device 700 acquires information indicating a plurality of parts of a person identified by recognizing the skeleton of the person in image data, and tracks the same person among a plurality of image data based on the acquired information.
Even in the invention of the program or the tracking method having the above-described configuration, the above-described object of the present invention can be achieved because the program or the tracking method has the same operation and effects as those of the tracking device 700.
< Additional notes >
Some or all of the above embodiments may be described as follows. The following will explain the outline of the guiding method and the like of the present invention. However, the present invention is not limited to the following configuration.
(Supplementary note 1)
A guidance method, wherein,
the imaging device that photographs the walking posture of a person detects the direction of the imaging device,
the imaging device displays, on a screen display unit, a guide line indicating the position where the person walks, and
when displaying the guide line on the screen display unit, the imaging device displays a different guide line according to the detected direction of the imaging device.
(Supplementary note 2)
In the guidance method described in supplementary note 1,
The guide lines are displayed differently depending on whether the imaging device is in the portrait orientation or the landscape orientation.
(Supplementary note 3)
In the guidance method described in supplementary note 1 or supplementary note 2,
When the imaging device is in the portrait orientation, a guide line for guiding the person to walk in the front-back direction of the screen is displayed.
(Supplementary note 4)
In the guidance method of any one of supplementary notes 1 to 3,
When the imaging device is in the landscape orientation, a guide line for guiding the person to walk in the left-right direction of the screen is displayed.
(Supplementary note 5)
In the guidance method of any one of supplementary notes 1 to 4,
The inclination of the imaging device is detected, and information indicating the detected inclination is displayed on the screen display unit.
(Supplementary note 6)
In the guidance method of any one of supplementary notes 1 to 5,
The inclination of the imaging device is detected, and information corresponding to the detected inclination is output.
(Supplementary note 7)
In the guidance method described in supplementary note 6,
The inclination in the left-right direction and the inclination in the front-back direction are detected, and different information is output when the inclination in the left-right direction is detected than when the inclination in the front-back direction is detected.
(Supplementary note 8)
In the guidance method described in supplementary note 6 or supplementary note 7,
When the inclination in the left-right direction is detected and when the inclination in the front-back direction is detected, sounds adjusted by different methods are output.
(Supplementary note 9)
An imaging device that photographs the walking posture of a person, wherein,
the imaging device includes:
a detection unit for detecting the direction of the imaging device, and
a display unit for displaying, on a screen display unit, a guide line indicating the position where the person walks,
and the display unit displays different guide lines according to the direction of the imaging device detected by the detection unit.
(Supplementary note 10)
A program for causing an imaging device that images the walking posture of a person to realize:
a detection unit for detecting the direction of the photographing device, and
A display unit for displaying a guide line indicating the walking position of the person on the screen display unit,
The display unit displays different guide lines according to the direction of the imaging device detected by the detection unit.
(Supplementary note 11)
A method of calculation, wherein,
The information processing device acquires parameters of an imaging device for acquiring image data and information indicating the height of the imaging device at the time of acquiring the image data, and calculates the actual length at a predetermined position in the image data based on the acquired information.
(Supplementary note 12)
In the calculation method described in supplementary note 11,
A reference line used for calculating the length is determined based on a predetermined criterion, and the actual length of the determined reference line from one end of the image data to the other end is calculated.
(Supplementary note 13)
In the computing method described in supplementary note 12,
The reference line is determined based on the position of the feet of the person in the image data.
(Supplementary note 14)
In the calculation method described in supplementary note 12 or supplementary note 13,
A ratio of the position of the reference line on the screen represented by the image data to half the screen is calculated, and the actual length of the reference line is calculated based on the calculated ratio, the parameters, and the height.
(Additional recording 15)
In the calculation method of any one of supplementary notes 11 to 14,
The parameters include information indicating a vertical field angle and a horizontal field angle of the photographing device.
(Supplementary note 16)
In the calculation method of any one of supplementary notes 11 to 15,
The actual length W is calculated from Mathematical Expression 1 using the vertical angle of view θ, the horizontal angle of view ψ, the ratio α of the position of the specified reference line on the screen to half the screen, and the height h of the imaging device.
(Supplementary note 17)
In the calculation method of any one of supplementary notes 11 to 16,
Based on the calculated length and the resolution of the image data, the stride of the person is calculated.
(Supplementary note 18)
In the calculation method described in supplementary note 17,
The number of pixels between the left and right feet of the person in the image data is acquired,
and the stride of the person is calculated based on the calculated length, the resolution of the image data, and the acquired number of pixels.
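The stride computation of supplementary notes 17 and 18 can be sketched as follows (a minimal sketch; deriving the scale as the actual length divided by the pixel span of the reference line is an assumption about how the calculated length and the resolution are combined, and the names are illustrative):

```python
def stride_length(actual_line_length, line_pixels, feet_gap_pixels):
    """actual_line_length: real-world length (e.g. metres) of the
    reference line computed by the calculation unit.
    line_pixels: how many pixels that line spans in the image
    (taken here from the image resolution).
    feet_gap_pixels: pixel distance between the left and right feet.
    Returns the estimated stride in the same unit as the line length."""
    metres_per_pixel = actual_line_length / line_pixels
    return feet_gap_pixels * metres_per_pixel
```

For example, if the reference line is 2.0 m long and spans 512 pixels, a 256-pixel gap between the feet corresponds to a stride of 1.0 m.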
(Supplementary note 19)
An information processing apparatus, wherein,
The information processing device includes a calculation unit that calculates an actual length at a predetermined position in image data based on parameters of an imaging device for acquiring the image data and information indicating a height of the imaging device at the time of acquiring the image data.
(Supplementary note 20)
A program for causing an information processing apparatus to realize a calculation section,
The calculation unit calculates an actual length at a predetermined position in the image data based on parameters of the imaging device for acquiring the image data and information indicating a height of the imaging device at the time of acquiring the image data.
(Supplementary note 21)
An output method, wherein,
the information processing apparatus detects the inclination of the information processing apparatus, and
outputs information corresponding to the detected inclination, the information differing according to the inclination of the information processing apparatus.
(Supplementary note 22)
In the output method described in supplementary note 21,
The inclination of the information processing apparatus in the left-right direction and the inclination in the front-back direction are detected,
and different information is output when the inclination in the left-right direction is detected than when the inclination in the front-back direction is detected.
(Supplementary note 23)
In the output method described in supplementary note 21 or supplementary note 22,
When the inclination in the left-right direction is detected and when the inclination in the front-back direction is detected, sounds adjusted by different methods are output.
(Supplementary note 24)
In the output method described in supplementary note 23,
When the inclination in the left-right direction is detected, a sound in which either the pitch or the length of the sound is adjusted is output; when the inclination in the front-back direction is detected, a sound in which the pitch or the length of the sound is adjusted by a method different from that used for the left-right direction is output.
(Supplementary note 25)
In the output method of any one of supplementary notes 21 to 24,
When the information processing apparatus is tilted, two kinds of sounds are output, and
when the information processing apparatus is not tilted, one sound is output.
(Supplementary note 26)
In the output method of any one of supplementary notes 21 to 25,
Information indicating the inclination of the information processing apparatus is displayed on a screen display unit.
(Supplementary note 27)
An information processing apparatus has:
A detection unit for detecting the inclination of the information processing device, and
And an output unit configured to output different pieces of information corresponding to the inclination of the information processing apparatus detected by the detection unit, according to the inclination of the information processing apparatus.
(Supplementary note 28)
In the information processing apparatus described in supplementary note 27,
The detection unit detects the inclination of the information processing apparatus in the left-right direction and the inclination in the front-back direction,
and the output unit outputs different information when the detection unit detects the inclination in the left-right direction than when it detects the inclination in the front-back direction.
(Supplementary note 29)
A program for causing an information processing apparatus to realize:
A detection unit for detecting the inclination of the information processing device, and
And an output unit configured to output different pieces of information corresponding to the inclination of the information processing apparatus detected by the detection unit, according to the inclination of the information processing apparatus.
(Supplementary note 30)
In the program described in supplementary note 29,
The detection unit detects the inclination of the information processing apparatus in the left-right direction and the inclination in the front-back direction,
and the output unit outputs different information when the detection unit detects the inclination in the left-right direction than when it detects the inclination in the front-back direction.
(Supplementary note 31)
A tracking method, wherein,
The information processing device acquires information indicating a plurality of parts of a person, the parts being identified by recognizing the skeleton of the person in image data,
The information processing device performs tracking of the same person among a plurality of image data based on the acquired information.
(Supplementary note 32)
In the tracking method described in supplementary note 31,
An enclosing figure including at least a part of the identified parts is generated,
and tracking of the same person is performed based on a value corresponding to the generated enclosing figure.
(Supplementary note 33)
In the tracking method described in supplementary note 32,
Tracking of the same person is performed based on the difference between the values corresponding to the enclosing figures among the image data.
(Supplementary note 34)
In the tracking method described in supplementary note 33,
When, among the values corresponding to the enclosing figures included in image data different from the image data to which the tracking target belongs, exactly one value differs from the value corresponding to the enclosing figure of the tracking target by no more than a predetermined value, the person corresponding to that value is determined to be the same person as the person to be tracked.
(Supplementary note 35)
In the tracking method described in supplementary note 34,
The predetermined value is determined based on the change in the value corresponding to the enclosing figure across the plurality of image data.
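As an illustration only, and not as part of the claimed subject matter, the matching rule of notes 33–35 could be sketched as follows: match only when exactly one candidate's figure value lies within the predetermined value of the target's, and derive that predetermined value from how much the value has changed across past frames. The function names and the scale factor are assumptions:

```python
# Illustrative sketch only -- not part of the patent. Matches the tracking
# target to the unique candidate within a threshold, and derives that
# threshold from frame-to-frame variation of the figure value.

def match_unique(target_value, candidate_values, threshold):
    """Index of the single candidate within `threshold` of the target, else None."""
    close = [i for i, v in enumerate(candidate_values)
             if abs(v - target_value) <= threshold]
    return close[0] if len(close) == 1 else None

def threshold_from_history(values, scale=2.0):
    """Predetermined value derived from the mean change of the figure value."""
    diffs = [abs(b - a) for a, b in zip(values, values[1:])]
    return scale * sum(diffs) / len(diffs) if diffs else 0.0
```

Returning `None` when zero or several candidates are close mirrors the note's requirement that the determination is made only when the number of such persons is one.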
(Supplementary note 36)
In the tracking method according to any one of supplementary notes 31 to 35,
An average of the coordinates of at least some of the identified parts is calculated, and
The same person is tracked based on the calculated result.
(Supplementary note 37)
In the tracking method described in supplementary note 36,
The same person is tracked based on the difference between the average values across the image data.
(Supplementary note 38)
In the tracking method described in supplementary note 36 or 37,
The same person is tracked based on the difference between the average value of the tracking target and the average value corresponding to a person included in image data different from the image data to which the tracking target belongs.
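As an illustration only, and not as part of the claimed subject matter, the average-coordinate tracking of notes 36–38 could be sketched as follows; the Euclidean distance used to compare averages between frames is an assumption:

```python
# Illustrative sketch only -- not part of the patent. Tracks by the average
# of the identified part coordinates and by the difference between the
# target's average and each candidate's average in another frame.
import math

def mean_point(keypoints):
    """Average (x, y) of the identified part coordinates."""
    n = len(keypoints)
    return (sum(x for x, _ in keypoints) / n,
            sum(y for _, y in keypoints) / n)

def closest_candidate(target_mean, candidate_means):
    """Index of the candidate whose average point is nearest the target's."""
    return min(range(len(candidate_means)),
               key=lambda i: math.dist(target_mean, candidate_means[i]))
```

Compared with a full enclosing figure, a single average point is cheaper to compare and is robust to a few missing keypoints, which may be why the notes offer it as an alternative basis for tracking.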
(Supplementary note 39)
A tracking device, comprising:
An acquisition unit that acquires information indicating a plurality of parts of a person identified by recognizing the skeleton of the person in image data; and
A tracking unit that tracks the same person across a plurality of image data based on the information acquired by the acquisition unit.
(Supplementary note 40)
A program for causing a tracking device to implement:
An acquisition unit that acquires information indicating a plurality of parts of a person identified by recognizing the skeleton of the person in image data; and
A tracking unit that tracks the same person across a plurality of image data based on the information acquired by the acquisition unit.
The programs described in the above embodiments and supplementary notes are stored in a storage device or recorded on a computer-readable recording medium. The recording medium is, for example, a portable medium such as a flexible disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
The present application has been described above with reference to the embodiments, but it is not limited to them. Various modifications that those skilled in the art can understand may be made to the configuration and details of the present application within its scope.
The present application claims the benefit of priority of Japanese patent application No. 2020-053974 filed on March 25, 2020, the disclosure of which is incorporated herein in its entirety.
Description of the reference numerals
100. Walking posture measuring system
200. Smart phone
210. Measurement motion imaging unit
211. Image data capturing unit
212. Imaging assistance unit
2121. Guide line display unit
2122. Angle information display unit
2123. Height information input unit
2124. Angle adjustment information output unit
201. Touch panel
2011. Guide line
2012. Angle information
2013. Height display unit
300. Walking posture measuring device
310. Picture display unit
320. Communication I/F unit
330. Storage unit
331. Model for learning
332. Camera setting information
333. Image information
334. Skeleton information
335. Information of actual measurement value
336. Measurement result information
337. Program
340. Arithmetic processing unit
341. Image acquisition unit
342. Skeleton recognition unit
343. Actual measurement value calculation unit
344. Measuring unit
345. Output unit
346. Tracking unit
3461. Enclosing figure generation unit
3462. Average skeleton coordinate calculation unit
3463. Comparison tracking unit
400. Image pickup apparatus
401 CPU
402 ROM
403 RAM
404. Program group
405. Storage device
406. Driving device
407. Communication interface
408. Input/output interface
409. Bus line
410. Recording medium
411. Communication network
421. Detection unit
422. Display unit
500. Information processing apparatus
521. Calculation unit
600. Information processing apparatus
621. Detection unit
622. Output unit
700. Tracking device
721. Acquisition unit
722. Tracking unit
Claims (6)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020053974 | 2020-03-25 | ||
| JP2020-053974 | 2020-03-25 | ||
| PCT/JP2021/008566 WO2021192905A1 (en) | 2020-03-25 | 2021-03-04 | Guide method |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN115299036A (en) | 2022-11-04 |
| CN115299036B (en) | 2025-09-02 |
Family
ID=77890074
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202180022168.0A Active CN115299036B (en) | Guide method | 2020-03-25 | 2021-03-04 |
Country Status (3)
| Country | Link |
|---|---|
| JP (1) | JP7323234B2 (en) |
| CN (1) | CN115299036B (en) |
| WO (1) | WO2021192905A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010017447A (en) * | 2008-07-14 | 2010-01-28 | Nippon Telegr & Teleph Corp <Ntt> | Walking movement analyzer, walking movement analyzing method, walking movement analyzing program and its recording medium |
| JP2012227578A (en) * | 2011-04-15 | 2012-11-15 | Olympus Imaging Corp | Camera |
| JP2018074439A (en) * | 2016-10-31 | 2018-05-10 | キヤノン株式会社 | Imaging apparatus and control method thereof |
| JP2019054378A (en) * | 2017-09-14 | 2019-04-04 | キヤノン株式会社 | Imaging apparatus and control method thereof, and program |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6107844B2 (en) * | 2015-01-28 | 2017-04-05 | カシオ計算機株式会社 | Detection device, detection control method, and program |
| JP6587489B2 (en) * | 2015-10-07 | 2019-10-09 | キヤノン株式会社 | Image processing apparatus, image processing method, and image processing system |
| WO2017130397A1 (en) * | 2016-01-29 | 2017-08-03 | 富士通株式会社 | Position estimation device, position estimation method, and position estimation program |
| CN108462829A (en) * | 2017-02-21 | 2018-08-28 | 卡西欧计算机株式会社 | Shoot processing unit, shooting processing method and recording medium |
| DE102018005612B4 (en) * | 2017-07-19 | 2024-05-16 | Fanuc Corporation | Reporting of violations |
| CN110517298B (en) * | 2019-08-27 | 2022-10-21 | 北京百度网讯科技有限公司 | Trajectory matching method and device |
2021
- 2021-03-04 JP JP2022509481A patent/JP7323234B2/en active Active
- 2021-03-04 WO PCT/JP2021/008566 patent/WO2021192905A1/en not_active Ceased
- 2021-03-04 CN CN202180022168.0A patent/CN115299036B/en active Active
Also Published As
| Publication number | Publication date |
|---|---|
| CN115299036A (en) | 2022-11-04 |
| WO2021192905A1 (en) | 2021-09-30 |
| JPWO2021192905A1 (en) | 2021-09-30 |
| JP7323234B2 (en) | 2023-08-08 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5898199B2 (en) | Posture estimation apparatus, posture estimation method, and posture estimation program | |
| JP5921271B2 (en) | Object measuring apparatus and object measuring method | |
| US8608595B2 (en) | Image-capturing apparatus for putting practice and training putter having image-capturing apparatus | |
| KR101480410B1 (en) | Apparatus and method for correcting sports posture in digital image processing device | |
| JP6981531B2 (en) | Object identification device, object identification system, object identification method and computer program | |
| JP2017129567A (en) | Information processing apparatus, information processing method, and program | |
| CN115244360A (en) | Calculation method | |
| CN107517344A (en) | Method and device for adjusting recognition range of camera device | |
| WO2019172363A1 (en) | Information processing device, object measurement system, object measurement method, and program storage medium | |
| JP2020052979A (en) | Information processing device and program | |
| US20130069939A1 (en) | Character image processing apparatus and method for footskate cleanup in real time animation | |
| KR102313801B1 (en) | Apparatus and method for guiding correct posture of medical image system | |
| US9492748B2 (en) | Video game apparatus, video game controlling program, and video game controlling method | |
| CN115299036B (en) | Guide method | |
| JP2013248089A (en) | Scoliosis screening system, scoliosis determination program used therefor, and terminal device | |
| JP7343237B2 (en) | Tracking method | |
| JP4102119B2 (en) | Stride measuring device and stride measuring method | |
| JP2011118767A (en) | Facial expression monitoring method and facial expression monitoring apparatus | |
| KR101837142B1 (en) | Apparatus for providing treadmill content using interaction with user and method thereof | |
| US20220084244A1 (en) | Information processing apparatus, information processing method, and program | |
| TWI736148B (en) | Posture detecting system and method thereof | |
| CN113287152A (en) | Information processing apparatus, information processing method, and program | |
| WO2021192907A1 (en) | Output method | |
| JP2024501161A (en) | 3D localization of objects in images or videos | |
| US10952648B2 (en) | Measurement device and measurement method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| GR01 | Patent grant |