
CN105664462A - Auxiliary training system based on human body posture estimation algorithm - Google Patents


Info

Publication number
CN105664462A
CN105664462A · CN201610008638.9A
Authority
CN
China
Prior art keywords
human body
auxiliary
attitude
joint
algorithm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610008638.9A
Other languages
Chinese (zh)
Inventor
贾庆轩
孙汉旭
陈钢
张东梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications filed Critical Beijing University of Posts and Telecommunications
Priority to CN201610008638.9A priority Critical patent/CN105664462A/en
Publication of CN105664462A publication Critical patent/CN105664462A/en
Pending legal-status Critical Current

Links

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • A63B69/36Training appliances or apparatus for special sports for golf
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0075Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2208/00Characteristics or parameters related to the user or player
    • A63B2208/02Characteristics or parameters related to the user or player posture

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an auxiliary training system based on a human pose estimation algorithm. First, a background modeling method based on the ViBe model detects and extracts a binary silhouette of the human body from monocular video. The Canny edge detection algorithm is then applied to obtain the contour edge map of the image, after which image processing steps such as horizontal-line scanning and human body length-ratio constraints are used to obtain the coordinates of the 15 main joint points of the body in the image. On this basis, an auxiliary training system is built that uses 5 joint angles formed from the 15 joint points as training indices, adopts Euclidean distance as the pose similarity measure, and outputs two auxiliary indicators: joint-angle trajectories and pose similarity. By quantitatively analysing motion features, the invention enables the analysis and comparison of an athlete's posture, scientifically improves the athlete's level and performance, and frees sports training from relying purely on experience.

Description

Auxiliary Training System Based on Human Pose Estimation Algorithm

Technical field:

The invention relates to an auxiliary training system based on a human pose estimation algorithm, and belongs to the application of computer vision technology to human motion analysis and sports training.

Background art:

Traditional sports training usually relies on methods based on naked-eye observation. With the development of computer vision, cameras are increasingly used to capture and analyse athletes' movements, and with the application of computer technology to sports training, a number of companies at home and abroad have dedicated themselves to golf auxiliary training systems. The methods currently used mainly include the video method, the chart-analysis method, and the portable-sensor method. Existing products include the motion software MotionCoach developed by the Canadian sports software company MediaVention, DartGoffer developed by the Swiss sports analysis software company DartFish, and the GolfSense system launched by Zepp Technology Co., Ltd. However, these systems are not sufficiently digitized, still require considerable manpower and material resources, and are of limited practicality.

Research on golf auxiliary training systems is moving in an increasingly intelligent and scientific direction, and more and more video and image processing techniques are being applied to golf swing analysis. Although related auxiliary training systems are becoming more common, systems that analyse the golf swing in a truly intelligent way are still few.

Summary of the invention:

In view of this, the present invention provides an auxiliary training system based on a human pose estimation algorithm. It can be applied to golf auxiliary training, offers a high degree of intelligence, infers the athlete's motion from an image sequence, and, by quantitatively analysing motion features, enables the analysis and comparison of the athlete's posture, scientifically improves the athlete's level and performance, and frees sports training from relying purely on experience.

To achieve the above objective, the present invention proposes an auxiliary training system based on a human pose estimation algorithm, realising the transition from experience-based training methods to computer-vision-based human motion analysis. The specific implementation comprises the following steps:

Step (1) Target detection: a background modeling method based on the ViBe model detects and extracts the binary silhouette of the human body from the video.

Step (2) Contour edge feature extraction: the Canny edge detection algorithm is applied to obtain the contour edge map of the image.

Step (3) Human pose estimation: image processing methods such as horizontal-line scanning and human body length-ratio constraints are applied to obtain the coordinates of the 15 joint points of the body model from the contour edge map.

Step (4) Building the auxiliary training system: the system outputs two auxiliary indicators, joint-angle trajectories and pose similarity.

Preferably, in the above auxiliary training system based on a human pose estimation algorithm, the pose estimation method based on edge contour features plus image processing first applies the Canny edge detection algorithm to obtain the contour edge map of the image; the horizontal-line scanning method then extracts the head and foot joint points; the knee, hip, neck and chest joint points are extracted based on human body length-ratio constraints; and the hand, elbow and shoulder joint points are obtained with a vertical scanning method.

Preferably, in the above auxiliary training system based on a human pose estimation algorithm, 5 joint angles formed from the 15 joint points are used as training indices, Euclidean distance is selected as the pose similarity measure, and joint-angle trajectories and pose similarity are output as the two auxiliary indicators of the system. By comparing the trainee's and the coach's joint-angle trajectories and the pose similarity, analysis and guidance can be carried out intuitively.

The auxiliary training system based on a human pose estimation algorithm of the present invention can be applied to golf auxiliary training and offers a high degree of intelligence. It infers the athlete's motion from an image sequence and, by quantitatively analysing motion features, enables the analysis and comparison of the athlete's posture, scientifically improves the athlete's level and performance, frees sports training from relying purely on experience, and realises the transition from experience-based training methods to computer-vision-based human motion analysis.

Description of the drawings:

Figure 1 is a schematic flow chart of the technical solution of the present invention;

Figure 2 is a flow chart of the background modeling method based on the ViBe model;

Figure 3 is a flow chart of the Canny edge detection algorithm;

Figure 4 is the two-dimensional skeleton model of the human body.

Detailed description:

The present invention is further described below in conjunction with the accompanying drawings.

As shown in Figure 1, the pose estimation method based on human contour edge features plus image processing yields the data of the 15 joint points of the human body, which serve as the input of the auxiliary training system. To avoid differences caused by a person's height and distance from the camera, the invention uses 5 joint angles formed from the 15 joint points as system indices for auxiliary analysis. The 5 joint angles are denoted Angle1 (head, neck, chest), Angle2 (left shoulder, left elbow, left wrist), Angle3 (right shoulder, right elbow, right wrist), Angle4 (left hip, left knee, left foot) and Angle5 (right hip, right knee, right foot). Euclidean distance is chosen as the pose similarity measure. Finally, the system outputs two auxiliary indicators: joint-angle trajectories and pose similarity.
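The description names the five angle triples and the Euclidean similarity measure but gives no formulas. The following is a minimal sketch, assuming each angle is measured at the middle joint of its triple and that pose similarity is the Euclidean distance between two 5-angle vectors; the joint names and function names are illustrative, not taken from the patent.

```python
import numpy as np

# The 5 joint-angle triples named in the description (middle point is the vertex).
ANGLE_TRIPLES = {
    "Angle1": ("head", "neck", "chest"),
    "Angle2": ("l_shoulder", "l_elbow", "l_wrist"),
    "Angle3": ("r_shoulder", "r_elbow", "r_wrist"),
    "Angle4": ("l_hip", "l_knee", "l_foot"),
    "Angle5": ("r_hip", "r_knee", "r_foot"),
}

def joint_angle(a, b, c):
    """Angle (degrees) at vertex b formed by points a-b-c, each an (x, y) pair."""
    a, b, c = map(np.asarray, (a, b, c))
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def pose_angles(joints):
    """joints: dict mapping joint name -> (x, y). Returns the 5-angle feature vector."""
    return np.array([joint_angle(*(joints[n] for n in triple))
                     for triple in ANGLE_TRIPLES.values()])

def pose_similarity(angles_trainee, angles_coach):
    """Euclidean distance between two 5-angle vectors (smaller = more similar)."""
    return float(np.linalg.norm(np.asarray(angles_trainee) - np.asarray(angles_coach)))
```

Computing these angle vectors frame by frame gives the joint-angle trajectories; comparing a trainee's vector with a coach's vector at corresponding frames gives the similarity indicator.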

Figure 2 shows the flow chart of the background modeling method based on the ViBe model. The ViBe algorithm creates a background model from neighbourhood pixels and detects foreground targets by comparing the pixel values of the current input image with the background model. The procedure comprises three steps: (1) initialise the background model for every pixel of a single frame; (2) perform foreground target segmentation on the subsequent image sequence; (3) dynamically update the background model in every subsequent frame using the eight-neighbourhood update method.

When t = k, the background model of pixel (x, y) is denoted BKM_{k-1} and the current pixel value is f_k(x, y). Whether the pixel belongs to the foreground is decided according to the following formula.

f_k(x, y) is classified as:
    foreground, if |f_k(x, y) − BKM_{k−1}(x_r, y_r)| > T
    background, if |f_k(x, y) − BKM_{k−1}(x_r, y_r)| ≤ T        (1)

where (x_r, y_r) is a neighbouring sample position in the background model and T is the decision threshold.
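For concreteness, a compact single-channel ViBe-style background subtractor following the three steps above is sketched below. The sample count, matching radius, match threshold and subsampling factor are illustrative defaults (the patent specifies none), and the class and parameter names are not from the source.

```python
import numpy as np

class ViBeLike:
    """Simplified ViBe-style background subtractor for grayscale frames."""

    def __init__(self, first_frame, n_samples=20, radius=20, min_matches=2, subsample=16):
        h, w = first_frame.shape
        self.n, self.r, self.k, self.phi = n_samples, radius, min_matches, subsample
        # (1) Initialise each pixel's sample set from the first frame plus small noise,
        #     mimicking initialisation from neighbourhood pixels.
        noise = np.random.randint(-10, 11, size=(n_samples, h, w))
        self.samples = np.clip(first_frame[None].astype(int) + noise, 0, 255)

    def apply(self, frame):
        frame = frame.astype(int)
        # (2) Foreground segmentation: a pixel is background if at least
        #     `min_matches` stored samples lie within `radius` of its value.
        matches = (np.abs(self.samples - frame[None]) < self.r).sum(axis=0)
        fg = (matches < self.k).astype(np.uint8) * 255

        # (3) Conservative random update of background pixels (full ViBe also
        #     propagates the update to one random 8-neighbour; omitted for brevity).
        bg = fg == 0
        update = bg & (np.random.randint(0, self.phi, size=frame.shape) == 0)
        idx = np.random.randint(0, self.n, size=frame.shape)
        ys, xs = np.nonzero(update)
        self.samples[idx[ys, xs], ys, xs] = frame[ys, xs]
        return fg
```

In use, the first grayscale frame of the monocular video would initialise the model and each subsequent frame would be passed to apply() to obtain the binary silhouette mask.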

The result of extracting the edges of the human contour with the Canny edge detection operator is shown in Figure 3. The Canny algorithm smooths first and then differentiates, and proceeds in the following steps: (1) smooth the image with a Gaussian filter; (2) compute the gradient magnitude and direction with first-order differences; (3) apply non-maximum suppression to the gradient magnitude; (4) apply double-threshold detection and edge linking.
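OpenCV bundles all four Canny steps into a single call; a minimal sketch of the contour-edge extraction, with illustrative blur and hysteresis parameters (none are given in the patent), could look as follows.

```python
import cv2

def contour_edges(silhouette_u8):
    """Contour-edge map of a binary (uint8) human silhouette."""
    smoothed = cv2.GaussianBlur(silhouette_u8, (5, 5), 1.4)   # step (1): Gaussian smoothing
    # cv2.Canny performs gradient computation, non-maximum suppression and
    # double-threshold hysteresis internally (steps (2)-(4)).
    return cv2.Canny(smoothed, 50, 150)
```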

Figure 4 shows the two-dimensional skeleton model of the human body; the coordinates of the 15 joint points of the body model are obtained by the human pose estimation algorithm. The specific procedure comprises the following steps:

(1) Extract the head and foot joint points with the horizontal-line scanning algorithm.

In the edge contour, draw a horizontal line at the bottom of the image and move it up row by row. Since the feet are at the lowest position and are symmetric, when the horizontal line first intersects the contour edge there should be two intersection points; from left to right they are marked as the left-foot and right-foot joint points, and the coordinate of the horizontal line at this moment is H1.

Draw a horizontal line at the top of the image and move it down row by row. When it first intersects the contour edge, mark that point as the head joint point; the coordinate of the horizontal line at this moment is H2, and the person's height can then be estimated as L = H2 − H1.
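A minimal sketch of this horizontal-line scan over a binary contour-edge image is given below. It assumes nonzero pixels mark edges and that image rows grow downward, so the body length L is taken as the absolute row difference between the head and foot scan lines; the function and variable names are illustrative.

```python
import numpy as np

def scan_feet_and_head(edges):
    """edges: 2-D array, nonzero where the contour edge lies."""
    rows_with_edges = np.flatnonzero(edges.any(axis=1))
    feet_row = rows_with_edges[-1]          # first hit when scanning up from the bottom (H1)
    head_row = rows_with_edges[0]           # first hit when scanning down from the top (H2)

    cols = np.flatnonzero(edges[feet_row])  # intersections on the foot scan line
    left_foot, right_foot = (cols[0], feet_row), (cols[-1], feet_row)
    head = (int(np.flatnonzero(edges[head_row]).mean()), head_row)

    # Rows grow downward, so the pixel body length is the row difference
    # (the description's L = H2 - H1, taken as an absolute length).
    body_len = feet_row - head_row
    return left_foot, right_foot, head, body_len
```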

(2) Extract the knee, hip, neck and chest joint points based on length-ratio constraints.

According to the length-ratio constraints of the human body, the foot-to-knee length is 0.286L and the knee-to-hip length is 0.271L. Draw a horizontal line at the feet and move it up by a distance of 0.286L; this gives four intersection points between the line and the contour-edge image of the body. Marking the midpoint of each pair of intersection points gives the left-knee and right-knee joint points. The distance between the foot joint point and the knee joint point is then checked against the range allowed by the body proportion coefficients, and the result is corrected if it falls outside that range. The positions of the hip joints are obtained in the same way.

Draw a vertical line through the head joint point; according to the length-ratio constraints between the head, the neck and the chest, the coordinates of the neck and chest joint points can be estimated.
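A sketch of the knee extraction under the 0.286L ratio constraint follows. The pairing of the four intersection points into left-leg and right-leg midpoints, and the function name, are illustrative assumptions; the 0.286 factor is from the description.

```python
import numpy as np

def knees_from_ratio(edges, feet_row, body_len):
    """Estimate left/right knee points on the row 0.286*L above the feet."""
    knee_row = int(round(feet_row - 0.286 * body_len))   # move the scan line up by 0.286 * L
    cols = np.flatnonzero(edges[knee_row])               # intersections with the contour edge
    if len(cols) < 4:
        return None                                       # contour too noisy at this row
    left_knee = (int((cols[0] + cols[1]) / 2), knee_row)  # midpoint of the left-leg edge pair
    right_knee = (int((cols[-2] + cols[-1]) / 2), knee_row)
    return left_knee, right_knee
```

The hip points would be found the same way with the 0.271L offset from the knee row, followed by the proportion-coefficient check described above.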

(3) Obtain the hand, elbow and shoulder joint points with the vertical scanning algorithm.

Draw a vertical line at the left side of the image and move it right column by column; when it first intersects the left contour edge, mark that point as the left-hand joint point. The hand-to-elbow length is 0.243L and the elbow-to-shoulder length is 0.2L. Draw a vertical line through the hand joint point, move it right by 0.243L and scan from top to bottom, marking the first intersection as the left elbow; move the vertical line right by a further 0.2L from the elbow and mark its first intersection as the left shoulder. The important joint points of the right side are obtained in the same way.
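A sketch of the vertical-line scan for the left arm follows. It assumes the arm extends towards the left image border so that the first column containing an edge pixel gives the hand, and it takes the topmost edge pixel in each scanned column as the intersection point; the column offsets use the 0.243L and 0.2L ratios from the text, and the function name is illustrative.

```python
import numpy as np

def left_arm_points(edges, body_len):
    """Estimate left hand, elbow and shoulder points from a contour-edge image."""
    cols_with_edges = np.flatnonzero(edges.any(axis=0))
    hand_col = cols_with_edges[0]                              # first column hit from the left
    hand_row = int(np.flatnonzero(edges[:, hand_col])[0])      # topmost edge pixel in that column

    elbow_col = int(round(hand_col + 0.243 * body_len))        # move right by 0.243 * L
    elbow_row = int(np.flatnonzero(edges[:, elbow_col])[0])    # first hit scanning top to bottom

    shoulder_col = int(round(elbow_col + 0.2 * body_len))      # a further 0.2 * L to the right
    shoulder_row = int(np.flatnonzero(edges[:, shoulder_col])[0])
    return (hand_col, hand_row), (elbow_col, elbow_row), (shoulder_col, shoulder_row)
```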

Claims (3)

1. An auxiliary training system based on a human pose estimation algorithm, characterised in that it comprises the following steps:
Step (1) Target detection: a background modeling method based on the ViBe model detects and extracts the binary silhouette of the human body from the video;
Step (2) Contour edge feature extraction: the Canny edge detection algorithm is used to obtain the contour edge map of the image;
Step (3) Human pose estimation: image processing methods such as horizontal-line scanning and human body length-ratio constraints are used to obtain the coordinates of the 15 joint points of the body model from the contour edge map;
Step (4) Building the auxiliary training system: the system outputs two auxiliary indicators, joint-angle trajectories and pose similarity.
2. The auxiliary training system based on a human pose estimation algorithm according to claim 1, characterised in that: in the pose estimation method based on edge contour features plus image processing, the Canny edge detection algorithm is first used to obtain the contour edge map of the image; the horizontal-line scanning method then extracts the head and foot joint points; the knee, hip, neck and chest joint points are extracted based on human body length-ratio constraints; and the hand, elbow and shoulder joint points are obtained with a vertical scanning method.
3. The auxiliary training system based on a human pose estimation algorithm according to claim 1, characterised in that: 5 joint angles formed from the 15 joint points are used as training indices, Euclidean distance is chosen as the pose similarity measure, and joint-angle trajectories and pose similarity are output as the two auxiliary indicators of the system; by comparing the trainee's and the coach's joint-angle trajectories and the pose similarity, analysis and guidance can be carried out intuitively.
CN201610008638.9A 2016-01-07 2016-01-07 Auxiliary training system based on human body posture estimation algorithm Pending CN105664462A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610008638.9A CN105664462A (en) 2016-01-07 2016-01-07 Auxiliary training system based on human body posture estimation algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610008638.9A CN105664462A (en) 2016-01-07 2016-01-07 Auxiliary training system based on human body posture estimation algorithm

Publications (1)

Publication Number Publication Date
CN105664462A true CN105664462A (en) 2016-06-15

Family

ID=56299235

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610008638.9A Pending CN105664462A (en) 2016-01-07 2016-01-07 Auxiliary training system based on human body posture estimation algorithm

Country Status (1)

Country Link
CN (1) CN105664462A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107688465A (en) * 2016-08-04 2018-02-13 惠州学院 A kind of motion analysis system that swings based on computer vision
CN108960211A (en) * 2018-08-10 2018-12-07 罗普特(厦门)科技集团有限公司 A kind of multiple target human body attitude detection method and system
CN109871800A (en) * 2019-02-13 2019-06-11 北京健康有益科技有限公司 A kind of estimation method of human posture, device and storage medium
CN110929596A (en) * 2019-11-07 2020-03-27 河海大学 A shooting training system and method based on smart phone and artificial intelligence
CN112023373A (en) * 2020-09-07 2020-12-04 东南大学 Tennis training method based on attitude sensor
CN112071006A (en) * 2020-09-11 2020-12-11 湖北德强电子科技有限公司 High-efficiency low-resolution image area intrusion recognition algorithm and device
CN112419388A (en) * 2020-11-24 2021-02-26 深圳市商汤科技有限公司 Depth detection method and device, electronic equipment and computer readable storage medium
CN112465890A (en) * 2020-11-24 2021-03-09 深圳市商汤科技有限公司 Depth detection method and device, electronic equipment and computer readable storage medium
CN113240695A (en) * 2021-06-02 2021-08-10 四川轻化工大学 Electric power operation personnel wearing identification method based on posture perception
CN113361333A (en) * 2021-05-17 2021-09-07 重庆邮电大学 Non-contact riding motion state monitoring method and system
CN113384861A (en) * 2021-05-20 2021-09-14 上海奥视达智能科技有限公司 Table tennis training device, table tennis training method, and computer-readable storage medium
CN114534224A (en) * 2022-01-13 2022-05-27 上海凯视力成科技有限公司 Intelligent mirror for golf swing
CN115136197A (en) * 2020-02-19 2022-09-30 本田技研工业株式会社 Information acquisition device, information acquisition method, and control program
CN117152797A (en) * 2023-10-30 2023-12-01 深圳慢云智能科技有限公司 Behavior gesture recognition method and system based on edge calculation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2023268A1 (en) * 2007-07-23 2009-02-11 Commissariat à l'Energie Atomique Method and device for recognising the position or movement of a device or living being
WO2009061283A2 (en) * 2007-11-09 2009-05-14 National University Of Singapore Human motion analysis system and method
CN103230664A (en) * 2013-04-17 2013-08-07 南通大学 Upper limb movement rehabilitation training system and method based on Kinect sensor
CN103390174A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Physical education assisting system and method based on human body posture recognition
CN103559491A (en) * 2013-10-11 2014-02-05 北京邮电大学 Human body motion capture and posture analysis system
CN104200491A (en) * 2014-08-15 2014-12-10 浙江省新华医院 Motion posture correcting system for human body

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2023268A1 (en) * 2007-07-23 2009-02-11 Commissariat à l'Energie Atomique Method and device for recognising the position or movement of a device or living being
WO2009061283A2 (en) * 2007-11-09 2009-05-14 National University Of Singapore Human motion analysis system and method
CN103390174A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Physical education assisting system and method based on human body posture recognition
CN103230664A (en) * 2013-04-17 2013-08-07 南通大学 Upper limb movement rehabilitation training system and method based on Kinect sensor
CN103559491A (en) * 2013-10-11 2014-02-05 北京邮电大学 Human body motion capture and posture analysis system
CN104200491A (en) * 2014-08-15 2014-12-10 浙江省新华医院 Motion posture correcting system for human body

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张杜, 陈元枝, 邱凭婷: "基于ViBe算法及Canny边缘检测的运动目标提取" [Moving-target extraction based on the ViBe algorithm and Canny edge detection], 《微机型与应用》 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107688465A (en) * 2016-08-04 2018-02-13 惠州学院 A kind of motion analysis system that swings based on computer vision
CN108960211A (en) * 2018-08-10 2018-12-07 罗普特(厦门)科技集团有限公司 A kind of multiple target human body attitude detection method and system
CN109871800A (en) * 2019-02-13 2019-06-11 北京健康有益科技有限公司 A kind of estimation method of human posture, device and storage medium
CN109871800B (en) * 2019-02-13 2022-02-18 北京健康有益科技有限公司 Human body posture estimation method and device and storage medium
CN110929596A (en) * 2019-11-07 2020-03-27 河海大学 A shooting training system and method based on smart phone and artificial intelligence
CN115136197A (en) * 2020-02-19 2022-09-30 本田技研工业株式会社 Information acquisition device, information acquisition method, and control program
CN112023373A (en) * 2020-09-07 2020-12-04 东南大学 Tennis training method based on attitude sensor
CN112071006A (en) * 2020-09-11 2020-12-11 湖北德强电子科技有限公司 High-efficiency low-resolution image area intrusion recognition algorithm and device
CN112419388A (en) * 2020-11-24 2021-02-26 深圳市商汤科技有限公司 Depth detection method and device, electronic equipment and computer readable storage medium
CN112465890A (en) * 2020-11-24 2021-03-09 深圳市商汤科技有限公司 Depth detection method and device, electronic equipment and computer readable storage medium
CN112419388B (en) * 2020-11-24 2024-11-05 深圳市商汤科技有限公司 Depth detection method, device, electronic device and computer readable storage medium
CN113361333A (en) * 2021-05-17 2021-09-07 重庆邮电大学 Non-contact riding motion state monitoring method and system
CN113384861A (en) * 2021-05-20 2021-09-14 上海奥视达智能科技有限公司 Table tennis training device, table tennis training method, and computer-readable storage medium
CN113240695A (en) * 2021-06-02 2021-08-10 四川轻化工大学 Electric power operation personnel wearing identification method based on posture perception
CN114534224A (en) * 2022-01-13 2022-05-27 上海凯视力成科技有限公司 Intelligent mirror for golf swing
CN117152797A (en) * 2023-10-30 2023-12-01 深圳慢云智能科技有限公司 Behavior gesture recognition method and system based on edge calculation

Similar Documents

Publication Publication Date Title
CN105664462A (en) Auxiliary training system based on human body posture estimation algorithm
CN111144217B (en) Motion evaluation method based on human body three-dimensional joint point detection
US10417775B2 (en) Method for implementing human skeleton tracking system based on depth data
CN110472554B (en) Table tennis action recognition method and system based on attitude segmentation and key point features
CN107392086B (en) Human body posture assessment device, system and storage device
CN102609683B (en) Automatic labeling method for human joint based on monocular video
WO2017133009A1 (en) Method for positioning human joint using depth image of convolutional neural network
CN109758756B (en) Gymnastics video analysis method and system based on 3D camera
CN104167016B (en) A kind of three-dimensional motion method for reconstructing based on RGB color and depth image
CN104200200B (en) Fusion depth information and half-tone information realize the system and method for Gait Recognition
CN110738154A (en) pedestrian falling detection method based on human body posture estimation
CN109344694A (en) A real-time recognition method of basic human actions based on 3D human skeleton
CN100541540C (en) 3D Human Motion Restoration Method Based on Silhouette and End Nodes
CN103729647B (en) The method that skeleton is extracted is realized based on depth image
CN114550027B (en) Vision-based motion video fine analysis method and device
CN101789125A (en) Method for tracking human skeleton motion in unmarked monocular video
CN102682452A (en) Human movement tracking method based on combination of production and discriminant
CN111680586B (en) Badminton player motion attitude estimation method and system
CN106815855A (en) Based on the human body motion tracking method that production and discriminate combine
CN102075686A (en) Robust real-time on-line camera tracking method
CN111709365A (en) An automatic detection method of human motion pose based on convolutional neural network
Yang et al. Multiple marker tracking in a single-camera system for gait analysis
CN102156994B (en) Joint positioning method for single-view unmarked human motion tracking
CN106964137B (en) A kind of ball service behavior rating method based on image procossing
CN113673327A (en) Penalty ball hit prediction method based on human body posture estimation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160615