CN108363490A - An intelligent robot system with good interaction effect - Google Patents
An intelligent robot system with good interaction effect
- Publication number
- CN108363490A (application CN201810172878.1A)
- Authority
- CN
- China
- Prior art keywords
- human body
- body behavior
- module
- behavior
- human
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0005—Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/08—Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2218/00—Aspects of pattern recognition specially adapted for signal processing
- G06F2218/12—Classification; Matching
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Multimedia (AREA)
- Social Psychology (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Manipulator (AREA)
Abstract
The present invention provides an intelligent robot system with good interaction effect, comprising a human body behavior recognition subsystem, a communication subsystem and a robot body. The human body behavior recognition subsystem is used to identify human body behavior and obtain a behavior recognition result; the communication subsystem is used to send the recognition result to the robot body; and the robot body interacts with the person according to the recognized behavior. The human body behavior recognition subsystem comprises a data acquisition module, a feature extraction module, a classification module and a decision fusion module. The beneficial effects of the present invention are: an intelligent robot system with good interaction effect is provided that does not depend on voice or keyboard input from the person but directly recognizes human body behavior, realizing good interaction between the person and the robot and greatly improving user experience.
Description
Technical field
The present invention relates to the field of robot technology, and in particular to an intelligent robot system with good interaction effect.
Background art
With the development of science and technology, more and more people are paying attention to intelligent robots and carrying out research and development on them. The application of intelligent robots is increasingly widespread, and as they rapidly enter people's work and life, people place ever higher demands on them. People hope that robots can interact with them, yet existing intelligent robots interact with people mainly through voice or keyboard input. This interaction mode often increases the user's workload and is inefficient.
Human body behavior recognition is an emerging research direction in the field of artificial intelligence. It has broad application prospects and considerable economic value; its main application fields include video surveillance, medical diagnosis and monitoring, motion analysis, intelligent human-computer interaction, virtual reality, and so on.
The basic workflow of human body behavior recognition is as follows: various sensors are selected to acquire human body behavior data, and a reasonable behavior model is established by combining the sensor characteristics with the behavioral traits of the person. On this basis, features with strong descriptive power for the behavior type are extracted from the raw acquired data, and these features are trained with a suitable method, thereby realizing pattern recognition of human body behavior. In general, behavior recognition systems based on cameras are relatively suited to controllable environments (for example, laboratory environments); when applied outdoors or in other complex scenes, recognition accuracy may be severely degraded by illumination changes and other disturbing factors.
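The workflow described above (acquire sensor data, extract descriptive features, train a model with a suitable method, recognize) can be sketched in a few lines. Everything in this sketch is an illustrative assumption rather than the patent's method: the signal values, the mean/standard-deviation features, and the nearest-centroid classifier are all stand-ins.

```python
import statistics

def extract_features(window):
    """Features with descriptive power for the behavior type: here,
    mean (activity level) and standard deviation (signal variability).
    These two statistics are illustrative choices, not the patent's."""
    return (statistics.fmean(window), statistics.pstdev(window))

def train_centroids(labelled_windows):
    """'Train with a suitable method': a minimal nearest-centroid model.
    labelled_windows maps a behavior label to a list of sample windows."""
    centroids = {}
    for label, windows in labelled_windows.items():
        feats = [extract_features(w) for w in windows]
        centroids[label] = tuple(statistics.fmean(f[i] for f in feats)
                                 for i in range(2))
    return centroids

def recognise(window, centroids):
    """Assign the behavior whose feature centroid is nearest."""
    f = extract_features(window)
    return min(centroids,
               key=lambda lb: sum((a - b) ** 2
                                  for a, b in zip(f, centroids[lb])))

# Hypothetical accelerometer magnitudes: 'still' is flat, 'walking' oscillates.
training = {
    "still":   [[1.0, 1.0, 1.1, 1.0], [1.0, 0.9, 1.0, 1.0]],
    "walking": [[0.2, 1.8, 0.3, 1.7], [0.1, 1.9, 0.2, 1.8]],
}
model = train_centroids(training)
print(recognise([0.3, 1.7, 0.2, 1.8], model))  # a walking-like window
```

Any real system would replace the toy classifier and hand-picked features, but the acquire/extract/train/recognize structure is the same.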
Summary of the invention
In view of the above problems, the present invention aims to provide an intelligent robot system with good interaction effect.
The purpose of the present invention is achieved by the following technical scheme:
An intelligent robot system with good interaction effect is provided, comprising a human body behavior recognition subsystem, a communication subsystem and a robot body. The human body behavior recognition subsystem is used to identify human body behavior and obtain a behavior recognition result; the communication subsystem is used to send the recognition result to the robot body; and the robot body interacts with the person according to the recognized behavior. The human body behavior recognition subsystem comprises a data acquisition module, a feature extraction module, a classification module and a decision fusion module. The data acquisition module acquires human body behavior data through sensors arranged on a wearable device, the sensors including a micro accelerometer and a micro gyroscope; the feature extraction module extracts human body behavior features from the acquired data; the classification module classifies the behavior according to these features; and the decision fusion module fuses the classification results of multiple sensor nodes to obtain the behavior recognition result.
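The dataflow between the three top-level parts (recognition subsystem to communication subsystem to robot body) can be sketched as below. The class names follow the description, but the toy classifier and the behavior-to-response mapping are invented for illustration.

```python
class HumanBehaviorRecognitionSubsystem:
    """Turns a window of wearable-sensor readings into a behavior label.
    The range-based rule here is a stand-in classifier."""
    def recognize(self, sensor_window):
        return "wave" if max(sensor_window) - min(sensor_window) > 1.0 else "still"

class CommunicationSubsystem:
    """Forwards the recognition result to the robot body."""
    def send(self, result, robot):
        return robot.interact(result)

class RobotBody:
    """Reacts to the recognized behavior; the responses are illustrative."""
    RESPONSES = {"wave": "waves back", "still": "waits"}
    def interact(self, behavior):
        return self.RESPONSES.get(behavior, "waits")

recognizer = HumanBehaviorRecognitionSubsystem()
comms = CommunicationSubsystem()
robot = RobotBody()
print(comms.send(recognizer.recognize([0.1, 1.9, 0.2]), robot))  # → waves back
```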
The beneficial effects of the present invention are: an intelligent robot system with good interaction effect is provided that does not depend on voice or keyboard input from the person but directly recognizes human body behavior, realizing good interaction between the person and the robot and greatly improving user experience.
Description of the drawings
The invention is further described below with reference to the accompanying drawing. The embodiment in the drawing does not constitute any limitation to the present invention; those of ordinary skill in the art can obtain other drawings from it without creative effort.
Fig. 1 is a structural schematic diagram of the present invention;
Reference numeral:
Human body behavior recognition subsystem 1, communication subsystem 2, robot body 3.
Detailed description of the embodiments
The invention is further described with reference to the following embodiment.
Referring to Fig. 1, the intelligent robot system with good interaction effect of this embodiment comprises a human body behavior recognition subsystem 1, a communication subsystem 2 and a robot body 3. The human body behavior recognition subsystem 1 identifies human body behavior and obtains a behavior recognition result; the communication subsystem 2 sends the recognition result to the robot body 3; and the robot body 3 interacts with the person according to the recognized behavior. The human body behavior recognition subsystem 1 comprises a data acquisition module, a feature extraction module, a classification module and a decision fusion module. The data acquisition module acquires human body behavior data through sensors arranged on a wearable device, the sensors including a micro accelerometer and a micro gyroscope; the feature extraction module extracts human body behavior features from the acquired data; the classification module classifies the behavior according to these features; and the decision fusion module fuses the classification results of multiple sensor nodes to obtain the behavior recognition result.
This embodiment provides an intelligent robot system with good interaction effect. The system does not depend on voice or keyboard input from the person but directly recognizes human body behavior, realizing good interaction between the person and the robot and greatly improving user experience. Although acquiring sequences of human body behavior images with a camera is still the main technical means of current human body behavior recognition, with the rapid development of electronics and wireless communication in recent years, wearable-sensor behavior recognition has also become an emerging research direction. This embodiment uses a wearable device to recognize human body behavior, which is free from the influence of factors such as shadow and occlusion and does not capture private personal information, so the observed behavior can appear more natural. In addition, since the sensors consist only of mechanical sensors (a micro accelerometer and a micro gyroscope), the acquired behavior data are time-domain signals; compared with high-dimensional two-dimensional image data, this reduces the requirements on data storage space and computing resources.
Preferably, the robot body 3 comprises a control device, a driving device and a motion device. The control device is used to generate a control instruction according to the human body behavior and send the control instruction to the driving device; the driving device is used to receive the control instruction and control the motion device to move according to it.
This preferred embodiment realizes effective control of the robot body and improves the interaction level of the robot.
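The control chain (behavior to control instruction, instruction to driving device, driving device to motion device) can be sketched as follows. The instruction fields and the behavior-to-instruction table are hypothetical; the patent does not specify an instruction format.

```python
from dataclasses import dataclass

@dataclass
class ControlInstruction:
    # Hypothetical instruction format: which actuator to drive, at what speed.
    actuator: str
    speed: float

def control_device(behavior):
    """Generate a control instruction from the recognized behavior.
    The behavior-to-instruction mapping is invented for illustration."""
    table = {
        "wave": ControlInstruction("arm", 0.5),
        "walk_toward": ControlInstruction("wheels", 1.0),
    }
    return table.get(behavior, ControlInstruction("none", 0.0))

def driving_device(instruction):
    """Receive the instruction and drive the motion device accordingly."""
    return f"driving {instruction.actuator} at speed {instruction.speed}"

print(driving_device(control_device("wave")))  # → driving arm at speed 0.5
```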
Preferably, the data acquisition module acquires human body behavior data through sensors arranged on a wearable device, specifically: the raw data are divided into small data segments with a window length of M, and the sensors acquire data according to this window length.
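The windowing step can be sketched as follows. Non-overlapping windows are an assumption here, since the embodiment specifies only a window length M.

```python
def split_into_windows(raw_data, window_length):
    """Divide the raw sensor stream into small segments of length M
    (non-overlapping; the patent does not specify any overlap)."""
    return [raw_data[i:i + window_length]
            for i in range(0, len(raw_data) - window_length + 1, window_length)]

stream = [0.9, 1.1, 1.0, 0.2, 1.8, 0.3]   # hypothetical accelerometer readings
windows = split_into_windows(stream, window_length=3)
print(windows)  # → [[0.9, 1.1, 1.0], [0.2, 1.8, 0.3]]
```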
The feature extraction module comprises a first feature extraction module, a second feature extraction module and a comprehensive feature determination module. The first feature extraction module is used to extract the first feature of the human body behavior, the second feature extraction module is used to extract the second feature, and the comprehensive feature determination module is used to determine the comprehensive feature of the behavior from the first and second features.
The first feature extraction module is used to extract the first feature of the human body behavior, specifically: according to the human body behavior data acquired by the sensor, the first feature of the human body behavior is determined using the following formula:
In the above formula, T1 denotes the first feature of the human body behavior, M denotes the window length of the data, and Dm denotes the m-th data point in the window.
The second feature extraction module is used to extract the second feature of the human body behavior, specifically: according to the human body behavior data acquired by the sensor, the second feature of the human body behavior is determined using the following formula:
In the above formula, T2 denotes the second feature of the human body behavior.
The comprehensive feature determination module is used to determine the comprehensive feature of the human body behavior from its first and second features, specifically: the first feature and the second feature are concatenated to form the comprehensive feature T = [T1, T2].
This preferred embodiment helps to obtain the highest recognition rate by adjusting the length of the data window. In the feature extraction process, the first feature fully reflects the average level of the human body behavior data and the second feature fully reflects its stability, laying a good foundation for the subsequent behavior classification.
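The embodiment's actual formulas are not reproduced in the text; the description only says that the first feature reflects the average level of the window and the second its stability. A common choice consistent with that description, assumed here purely for illustration, is the window mean for T1 and the window standard deviation for T2:

```python
import math

def first_feature(window):
    """T1: reflects the average level of the window. Assumed here to be the
    arithmetic mean of the M data points (the patent's formula is not
    reproduced in the text)."""
    return sum(window) / len(window)

def second_feature(window):
    """T2: reflects the stability of the window. Assumed here to be the
    population standard deviation about T1."""
    t1 = first_feature(window)
    return math.sqrt(sum((d - t1) ** 2 for d in window) / len(window))

def comprehensive_feature(window):
    """T = [T1, T2]: the two features concatenated, as in the patent."""
    return [first_feature(window), second_feature(window)]

print(comprehensive_feature([1.0, 3.0, 1.0, 3.0]))  # → [2.0, 1.0]
```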
Preferably, the classification module is used to classify the human body behavior according to the behavior features, specifically: the probability output of each behavior class is obtained from the features of the human body behavior.
The decision fusion module is used to fuse the classification results of multiple sensor nodes, specifically:
The binary classification results of the sensor nodes are put to a vote and fused using the following fusion rule:
In the above formula, j denotes the label of a sensor node, n denotes the total number of sensor nodes, N(i) denotes the number of votes received by the i-th behavior class, C denotes the total number of behavior classes, ω denotes the class label of the fusion result, and I denotes a transformation function used to convert the probability output of a sensor node into a binary output.
To improve recognition performance, this preferred embodiment fuses the classification results of the sensor nodes at the decision level to generate the final classification result. Through multi-node decision fusion, different sensor nodes provide complementary information about the human body behavior, greatly improving the accuracy of behavior recognition.
Human-computer interaction was carried out using the intelligent robot system with good interaction effect of the present invention. Five users (user 1 through user 5) were chosen for the experiment; human-computer interaction efficiency and user satisfaction were measured and compared with an existing intelligent robot system, yielding the beneficial effects shown in the table below:
| User | Human-computer interaction efficiency improvement | User satisfaction improvement |
|---|---|---|
| User 1 | 29% | 27% |
| User 2 | 27% | 26% |
| User 3 | 26% | 26% |
| User 4 | 25% | 24% |
| User 5 | 24% | 22% |
Finally, it should be noted that the above embodiment is merely illustrative of the technical solution of the present invention and not a limitation of its protection scope. Although the present invention has been explained in detail with reference to a preferred embodiment, those skilled in the art should understand that the technical solution of the present invention may be modified or equivalently replaced without departing from the essence and scope of the technical solution of the present invention.
Claims (8)
1. An intelligent robot system with good interaction effect, characterized by comprising a human body behavior recognition subsystem, a communication subsystem and a robot body, wherein the human body behavior recognition subsystem is used to identify human body behavior and obtain a behavior recognition result, the communication subsystem is used to send the behavior recognition result to the robot body, and the robot body interacts with the person according to the human body behavior; the human body behavior recognition subsystem comprises a data acquisition module, a feature extraction module, a classification module and a decision fusion module; the data acquisition module acquires human body behavior data through sensors arranged on a wearable device, the sensors including a micro accelerometer and a micro gyroscope; the feature extraction module is used to extract human body behavior features from the acquired human body behavior data; the classification module is used to classify the human body behavior according to the human body behavior features; and the decision fusion module is used to fuse the classification results of multiple sensor nodes to obtain the behavior recognition result.
2. The intelligent robot system with good interaction effect according to claim 1, characterized in that the robot body comprises a control device, a driving device and a motion device, wherein the control device is used to generate a control instruction according to the human body behavior and send the control instruction to the driving device, and the driving device is used to receive the control instruction and control the motion device to move according to the control instruction.
3. The intelligent robot system with good interaction effect according to claim 2, characterized in that the data acquisition module acquires human body behavior data through sensors arranged on a wearable device, specifically: the raw data are divided into small data segments with a window length of M, and the sensors acquire data according to the window length.
4. The intelligent robot system with good interaction effect according to claim 3, characterized in that the feature extraction module comprises a first feature extraction module, a second feature extraction module and a comprehensive feature determination module, wherein the first feature extraction module is used to extract the first feature of the human body behavior, the second feature extraction module is used to extract the second feature of the human body behavior, and the comprehensive feature determination module is used to determine the comprehensive feature of the human body behavior from the first feature and the second feature.
5. The intelligent robot system with good interaction effect according to claim 4, characterized in that the first feature extraction module is used to extract the first feature of the human body behavior, specifically: according to the human body behavior data acquired by the sensor, the first feature of the human body behavior is determined using the following formula:
In the above formula, T1 denotes the first feature of the human body behavior, M denotes the window length of the data, and Dm denotes the m-th data point in the window.
6. The intelligent robot system with good interaction effect according to claim 5, characterized in that the second feature extraction module is used to extract the second feature of the human body behavior, specifically: according to the human body behavior data acquired by the sensor, the second feature of the human body behavior is determined using the following formula:
In the above formula, T2 denotes the second feature of the human body behavior;
the comprehensive feature determination module is used to determine the comprehensive feature of the human body behavior from its first and second features, specifically: the first feature and the second feature are concatenated to form the comprehensive feature T = [T1, T2].
7. The intelligent robot system with good interaction effect according to claim 6, characterized in that the classification module is used to classify the human body behavior according to the human body behavior features, specifically: the probability output of each behavior class is obtained from the features of the human body behavior.
8. The intelligent robot system with good interaction effect according to claim 7, characterized in that the decision fusion module is used to fuse the classification results of multiple sensor nodes, specifically:
the binary classification results of the sensor nodes are put to a vote and fused using the following fusion rule:
In the above formula, j denotes the label of a sensor node, n denotes the total number of sensor nodes, N(i) denotes the number of votes received by the i-th behavior class, C denotes the total number of behavior classes, ω denotes the class label of the fusion result, and I denotes a transformation function used to convert the probability output of a sensor node into a binary output.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810172878.1A CN108363490A (en) | 2018-03-01 | 2018-03-01 | An intelligent robot system with good interaction effect
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810172878.1A CN108363490A (en) | 2018-03-01 | 2018-03-01 | An intelligent robot system with good interaction effect
Publications (1)
Publication Number | Publication Date |
---|---|
CN108363490A true CN108363490A (en) | 2018-08-03 |
Family
ID=63003371
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810172878.1A Pending CN108363490A (en) | 2018-03-01 | 2018-03-01 | An intelligent robot system with good interaction effect
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108363490A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113524194A (en) * | 2021-04-28 | 2021-10-22 | 重庆理工大学 | Target grabbing method of robot vision grabbing system based on multi-mode feature deep learning |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103886323A (en) * | 2013-09-24 | 2014-06-25 | 清华大学 | Behavior identification method based on mobile terminal and mobile terminal |
CN103984315A (en) * | 2014-05-15 | 2014-08-13 | 成都百威讯科技有限责任公司 | Domestic multifunctional intelligent robot |
CN104268577A (en) * | 2014-06-27 | 2015-01-07 | 大连理工大学 | A Human Behavior Recognition Method Based on Inertial Sensor |
CN105335696A (en) * | 2015-08-26 | 2016-02-17 | 湖南信息职业技术学院 | 3D abnormal gait behavior detection and identification based intelligent elderly assistance robot and realization method |
WO2016110804A1 (en) * | 2015-01-06 | 2016-07-14 | David Burton | Mobile wearable monitoring systems |
CN105868779A (en) * | 2016-03-28 | 2016-08-17 | 浙江工业大学 | Method for identifying behavior based on feature enhancement and decision fusion |
CN107708553A (en) * | 2015-09-03 | 2018-02-16 | 三菱电机株式会社 | Activity recognition device, air conditioner and robot controller |
-
2018
- 2018-03-01 CN CN201810172878.1A patent/CN108363490A/en active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113524194A (en) * | 2021-04-28 | 2021-10-22 | 重庆理工大学 | Target grabbing method of robot vision grabbing system based on multi-mode feature deep learning |
CN113524194B (en) * | 2021-04-28 | 2023-03-21 | 重庆理工大学 | Target grabbing method of robot vision grabbing system based on multi-mode feature deep learning |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Jain et al. | American sign language recognition using support vector machine and convolutional neural network | |
Nandi et al. | Indian sign language alphabet recognition system using CNN with diffGrad optimizer and stochastic pooling | |
Ahmed et al. | Real-time sign language framework based on wearable device: analysis of MSL, DataGlove, and gesture recognition | |
Benalcázar et al. | Real-time hand gesture recognition based on artificial feed-forward neural networks and EMG | |
Angona et al. | Automated Bangla sign language translation system for alphabets by means of MobileNet | |
Fang et al. | Dynamic gesture recognition using inertial sensors-based data gloves | |
CN108256631A (en) | A kind of user behavior commending system based on attention model | |
Kaluri et al. | A framework for sign gesture recognition using improved genetic algorithm and adaptive filter | |
Pezzuoli et al. | Recognition and classification of dynamic hand gestures by a wearable data-glove | |
Li et al. | Upper body motion recognition based on key frame and random forest regression | |
Sagayam et al. | Recognition of hand gesture image using deep convolutional neural network | |
Islam et al. | Applied human action recognition network based on SNSP features | |
Wang et al. | Cornerstone network with feature extractor: a metric-based few-shot model for chinese natural sign language | |
Li et al. | Chinese sign language recognition based on shs descriptor and encoder-decoder lstm model | |
Amin et al. | Sign gesture classification and recognition using machine learning | |
Sharma et al. | Real-time attention-based embedded LSTM for dynamic sign language recognition on edge devices | |
Mahesh et al. | Preeminent sign language system by employing mining techniques | |
CN115438691A (en) | Small sample gesture recognition method based on wireless signals | |
Abdul-Ameer et al. | Development smart eyeglasses for visually impaired people based on you only look once | |
Salim et al. | A review on hand gesture and sign language techniques for hearing impaired person | |
CN108363490A (en) | An intelligent robot system with good interaction effect | |
Qiao et al. | Group behavior recognition based on deep hierarchical network | |
Kammoun et al. | ArSign: Toward a mobile based Arabic sign language translator using LMC | |
Rodríguez-Moreno et al. | A hierarchical approach for spanish sign language recognition: From weak classification to robust recognition system | |
Li et al. | [Retracted] Human Motion Representation and Motion Pattern Recognition Based on Complex Fuzzy Theory |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180803 |
|
RJ01 | Rejection of invention patent application after publication |