
CN113128384B - Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning - Google Patents


Info

Publication number
CN113128384B
CN113128384B (application CN202110376347.6A)
Authority
CN
China
Prior art keywords
electroencephalogram signal
sample entropy
feature
convolution
electroencephalogram
Prior art date
Legal status
Active
Application number
CN202110376347.6A
Other languages
Chinese (zh)
Other versions
CN113128384A (en)
Inventor
王卓峥
宋霖涛
董雨萌
任博雯
丁熠辉
Current Assignee
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date
Filing date
Publication date
Application filed by Beijing University of Technology filed Critical Beijing University of Technology
Priority to CN202110376347.6A
Publication of CN113128384A
Application granted
Publication of CN113128384B
Legal status: Active (current)
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/08 Feature extraction
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/084 Backpropagation, e.g. using gradient descent
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/12 Classification; Matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Signal Processing (AREA)
  • Pathology (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Computing Systems (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Fuzzy Systems (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

A brain-computer interface software key technical method for a stroke rehabilitation system based on deep learning, belonging to the technical field of deep learning. In the feature extraction and classification stage, an autoregressive model and sample entropy are used to extract features of the electroencephalogram (EEG) signal, and a CNN is used to classify the signal. This improves the accuracy of EEG classification and, when applied in the brain-computer interface software of a stroke rehabilitation system, can support rehabilitation treatment of stroke patients.

Description

Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning
Technical Field
The invention belongs to the technical field of deep learning.
Background
At present, electroencephalogram (EEG) technology and brain-computer interface (Brain Computer Interface, BCI) technology are developing rapidly, and their application in stroke rehabilitation is becoming increasingly widespread. Related research shows that motor imagery EEG signals can help stroke patients carry out effective rehabilitation treatment: EEG activity is not abolished by damage to limbs or muscles, and the EEG signal still accurately reflects the electrophysiological activity of the human cerebral cortex. Therefore, BCI technology based on motor imagery has become an important focus of rehabilitation research for stroke patients.
Deep learning differs from traditional machine learning in that it realizes end-to-end learning: raw data are used as model input and the output is produced by direct mapping. Classical deep learning models include convolutional neural networks and recurrent neural networks. They also have drawbacks: deep neural networks require large amounts of data to train, and most small-scale data sets are insufficient to train a model adequately; labeled data are often too scarce to train a deep model effectively; and the computational demand is huge. Deep forest (multi-Grained Cascade Forest, gcForest), an alternative to deep neural networks such as convolutional and recurrent networks, is a feature fusion model that has achieved remarkable results in fields such as vision, text and speech. Its multi-granularity scanning strategy helps the model capture correlations between features, further improving its representation learning capability.
Traditional EEG processing methods mainly include temporal filtering, spatial filtering, principal component analysis and independent component analysis. The current mainstream feature extraction approaches are power spectral density and the common spatial pattern (CSP) algorithm. Traditional machine learning applies a suitable classification algorithm to the samples after data preprocessing and manual feature construction. The present method instead adopts deep learning to further optimize the algorithm, improving both processing speed and accuracy.
Related Chinese inventions include the following. A motor imagery EEG classification method based on data augmentation [1] classifies EEG signals with a convolutional neural network and reaches average accuracies of 87.32%, 76.26% and 64.72% on two-, three- and four-class tasks respectively. A motor imagery EEG feature recognition method based on an LFCNN-GRU algorithm model [2] extracts frequency-domain features of the EEG with a convolutional neural network using inter-layer feature fusion and then extracts time-domain features with a gated recurrent network. An EEG recognition method based on CWT and MLMSFFCNN [3] exploits the feature fusion and multi-resolution computing capability of MLMSFFCNN to fully extract time-, frequency- and spatial-domain feature information and improve classification accuracy.
[1] Liu Yue, Du, Yue Kang, Tian Geliang. Data-augmented convolutional neural network motor imagery EEG classification method [P]. Beijing: CN111950366A, 2020-11-17.
[2] Mao Xuefeng, Xie Zhirong, Zhang Yi, Luo Yuan. A motor imagery EEG feature recognition method based on an LFCNN-GRU algorithm model [P]. CN111950455A, 2020-11-17.
[3] Li Mingai, Han Jianfu, Yang Jinfu, Sun Yan. EEG recognition method based on CWT and MLMSFFCNN [P]. Beijing: CN111582041A, 2020-08-25.
Disclosure of Invention
In the invention, the feature extraction and classification stage uses an autoregressive model and sample entropy to extract features of the EEG signal, and a CNN algorithm to classify the signal. This improves the accuracy of EEG classification and, when applied in the brain-computer interface software of a stroke rehabilitation system, can support rehabilitation treatment of stroke patients.
The technical scheme of the invention comprises the following stages. In the EEG acquisition and preprocessing stage, a motor imagery EEG acquisition system collects the signals; the collected EEG is then preprocessed in EEGLAB to remove ocular artifacts and other noise, yielding a denoised EEG signal whose dimensionality is reduced by principal component analysis. In the feature extraction stage, an autoregressive model (Autoregressive Model) and sample entropy extract motor imagery EEG features, yielding linear and nonlinear signal features from the frequency and spatial domains respectively, with the AR model parameters used as feature vectors. In the feature classification stage, a CNN-based classifier classifies the EEG signal: the two types of features obtained in feature extraction are fed separately into the CNN, two classification results are obtained, and the results are combined with fixed weights to produce the final classification result.
By extracting frequency-domain, spatial-domain, linear and nonlinear features of the EEG signal, the invention improves the accuracy of motor imagery EEG classification.
Description of the drawings:
Fig. 1 is a flow chart of the present invention.
Detailed description of the embodiments:
1. electroencephalogram signal acquisition stage
The invention uses an EEG acquisition instrument to collect signals at a sampling frequency of 250 Hz, with electrodes placed according to the international 10-20 standard electrode placement method. Several healthy subjects are selected, and left-hand and right-hand motor imagery EEG signals are collected from each. A band-stop filter is applied and the 1 Hz-50 Hz EEG band is retained.
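As a minimal illustration of this acquisition-stage filtering, the Python sketch below band-limits one EEG channel with SciPy; the notch frequency, Q factor and filter order are assumptions, since the text only states that a band-stop filter is used and that the 1 Hz-50 Hz band is kept.

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 250.0  # sampling frequency stated in the text (250 Hz)

def preprocess_band(eeg, fs=FS, band=(1.0, 50.0), notch_hz=50.0, notch_q=30.0):
    """Band-limit a raw EEG channel to 1-50 Hz and suppress mains interference.

    `eeg` is a 1-D array of one channel. The notch frequency and Q factor are
    assumptions; only the band-stop filtering and the 1-50 Hz band are stated.
    """
    # Band-stop (notch) filter around the assumed mains frequency.
    b_n, a_n = iirnotch(notch_hz, notch_q, fs=fs)
    x = filtfilt(b_n, a_n, eeg)
    # 4th-order Butterworth band-pass to keep the 1-50 Hz EEG band.
    b_b, a_b = butter(4, band, btype="band", fs=fs)
    return filtfilt(b_b, a_b, x)

# Example with synthetic data: 10 s of one EEG channel.
raw = np.random.randn(int(10 * FS))
clean = preprocess_band(raw)
```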
2. Electroencephalogram signal preprocessing stage
The EEG signal of the human body is very weak and the background noise is large. Because of external factors such as the instrument and the subject's muscle movements, acquisition introduces various kinds of interference, such as baseline drift, power-frequency interference, electromyographic interference and electrode contact noise. These disturbances affect both time-domain and frequency-domain analysis of the EEG.
The main purpose of preprocessing is to improve the signal-to-noise ratio of the EEG, stabilize the baseline, reduce artifacts and remove outliers. Traditional preprocessing usually relies on the Fourier transform, but because the EEG is random and non-stationary, Fourier-based denoising is not ideal. In recent years, with deeper research into EEG signals, methods such as wavelet transform and independent component analysis have come to play an important role in signal preprocessing. The fast fixed-point algorithm (FastICA) for independent component analysis is parallel, distributed, memory-efficient and fast to converge.
The invention uses EEGLAB for preprocessing, denoises the experimental data with FastICA, and reduces the data dimensionality with principal component analysis.
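A minimal Python sketch of this preprocessing step, using scikit-learn's FastICA and PCA in place of the EEGLAB workflow, is given below; the number of retained principal components and the choice of artifact components to discard are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

def denoise_and_reduce(eeg, n_components=None, n_keep=8, drop_ics=()):
    """FastICA-based denoising followed by PCA dimensionality reduction.

    `eeg` has shape (n_samples, n_channels). `drop_ics` lists the indices of
    independent components judged to be artifacts (e.g. ocular components);
    in the text this cleaning is done in EEGLAB, so the criterion for
    selecting artifact components is left to the user here.
    """
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg)            # (n_samples, n_components)
    sources[:, list(drop_ics)] = 0.0            # zero out artifact components
    cleaned = ica.inverse_transform(sources)    # back to channel space

    pca = PCA(n_components=n_keep)
    reduced = pca.fit_transform(cleaned)        # (n_samples, n_keep)
    return cleaned, reduced

# Example: 10 s of 16-channel EEG sampled at 250 Hz.
eeg = np.random.randn(2500, 16)
cleaned, reduced = denoise_and_reduce(eeg, drop_ics=(0,))
```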
3. Feature extraction stage
After preprocessing, feature extraction is performed on the processed EEG signals with an autoregressive model (Autoregressive Model) and sample entropy, yielding two types of signal features.
3.1 autoregressive model (Autoregressive Model)
An autoregressive model (Autoregressive Model) describes the random variable at a later time as a linear regression on the random variables at a number of earlier times; it is a common form of time series model. Assume the EEG sequence is y_1, y_2, …, y_n. The P-order autoregressive model, abbreviated AR(P), expresses y_t in the sequence as a linear combination of the previous P values plus an error term; its general mathematical form is
y_t = c + φ_1 y_(t-1) + φ_2 y_(t-2) + … + φ_P y_(t-P) + e_t
where c is a constant term, φ_1, …, φ_P are the model parameters, and e_t is white noise with mean 0 and variance σ. The parameter vector (φ_1, …, φ_P) is used as the EEG feature vector.
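Assuming the AR parameters are fitted by ordinary least squares (the text does not state the estimation method or the model order P), a short sketch of extracting this linear feature vector could look as follows.

```python
import numpy as np

def ar_features(y, p=6):
    """Estimate AR(p) coefficients of one EEG channel by least squares.

    Returns the parameter vector (phi_1, ..., phi_p) used as the linear
    feature vector. The order p=6 is an assumption; the text does not
    state which order it uses.
    """
    y = np.asarray(y, dtype=float)
    # Regression y_t ~ [y_{t-1}, ..., y_{t-p}] for t = p .. n-1.
    X = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
    target = y[p:]
    # Include the constant term c, as in y_t = c + sum(phi_i * y_{t-i}) + e_t.
    X = np.column_stack([np.ones(len(target)), X])
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coef[1:]          # drop c, keep (phi_1, ..., phi_p)

phi = ar_features(np.random.randn(1000), p=6)
```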
3.2 sample entropy
Sample entropy is a nonlinear analysis method; by measuring the complexity of the EEG signal it reflects the signal's nonlinear characteristics.
Let the one-dimensional EEG time series be {y(i)}, i = 1, 2, …, n, where y(i) is the EEG value of the i-th second and n is the total time length. The sample entropy is obtained as follows:
(1) Form the sequence {y(i)} into m-dimensional vectors in order (m = 2 in the invention), i.e.
Y_m(i) = [y(i), y(i+1), …, y(i+m-1)], i = 1, 2, …, n-m+1
(2) For each time i, compute the distance between the vectors Y_m(i) and Y_m(j), i.e.
d[Y_m(i), Y_m(j)] = max_k |y(i+k) - y(j+k)|, k = 0, 1, …, m-1; i, j = 1, 2, …, n-m+1; i ≠ j
(3) Given a threshold r (r > 0), taken in the invention as 0.2 times the standard deviation of the original time series, count for each i the number of d[Y_m(i), Y_m(j)] smaller than r, denoted B_i, and record the ratio of this count to the total number of distances n-m as B_i^m(r), i.e.
B_i^m(r) = B_i / (n-m)
(4) Average B_i^m(r) over all i and denote the result B^m(r), i.e.
B^m(r) = (1/(n-m+1)) Σ_{i=1}^{n-m+1} B_i^m(r)
(5) Form the sequence {y(i)} into (m+1)-dimensional vectors and repeat steps (1) to (4) to obtain B_i^{m+1}(r) and B^{m+1}(r)
(6) The sample entropy of the sequence {y(i)} is
SampEn(m, r) = lim_{n→∞} { -ln[B^{m+1}(r)/B^m(r)] }
In practical calculation the sequence length is finite, so the sample entropy estimate for length n is finally used:
sampEn(m, r, n) = -ln[B^{m+1}(r)/B^m(r)]
The specific steps for extracting the nonlinear EEG features with sample entropy are as follows. A sliding time window of length 1 s is applied to the EEG signal and the sample entropy of the windowed signal is computed; the window is then advanced by one sampling point and the sample entropy of the next 1 s window is computed, and so on until the last 1 s window. This yields a time series of sample entropy values, which is then averaged to give the sample entropy of that group of EEG sample data.
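A compact Python sketch of the sample entropy computation and the 1 s sliding window described above, with m = 2 and r equal to 0.2 times the standard deviation of the series; the single-sample window step is kept literal here, although in practice a larger step may be chosen for speed.

```python
import numpy as np

def sample_entropy(y, m=2, r_factor=0.2):
    """Sample entropy sampEn(m, r, n) of a 1-D series, with m = 2 and
    r = 0.2 * std as stated in the text."""
    y = np.asarray(y, dtype=float)
    r = r_factor * np.std(y)
    n = len(y)

    def phi(mm):
        # Embed the series into mm-dimensional vectors Y_mm(i).
        emb = np.array([y[i : i + mm] for i in range(n - mm + 1)])
        # Chebyshev distances between all pairs of vectors (i != j).
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        np.fill_diagonal(d, np.inf)
        # Fraction of distances below r, averaged over i.
        return np.mean(np.sum(d < r, axis=1) / (n - mm))

    return -np.log(phi(m + 1) / phi(m))

def sliding_sample_entropy(y, fs=250, win_s=1.0, m=2):
    """Sample entropy over 1 s windows advanced one sample at a time,
    then averaged, as described above."""
    w = int(win_s * fs)
    vals = [sample_entropy(y[i : i + w], m=m) for i in range(0, len(y) - w + 1)]
    return float(np.mean(vals))
```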
4. Feature classification stage
The invention uses a convolutional neural network (CNN) as the classifier for each of the two extracted feature vectors; the two classification results are then weighted and combined to obtain the final classification result.
4.1 CNN structure and principle
The CNN used in the invention consists of six convolutional layers, two pooling layers and two fully connected layers, in the order: three convolutional layers, one pooling layer, three convolutional layers, one pooling layer, two fully connected layers; it is implemented in TensorFlow.
The data set is divided into a training set and a test set in a 7:3 ratio. The two types of feature vectors obtained in feature extraction, the AR feature vector AR_i (where i ∈ [1, n] and n is the total time length) and the sample entropy feature SE_i = {sampEn(m, r, i) | i ∈ [1, n]}, are used as inputs, and three prediction probabilities are output (T0, T1 and T2, representing the resting state, imagined left hand and imagined right hand respectively). An Adam optimizer is used with a learning rate of 1 x 10^-5.
(1) Convolutional layer: each convolution kernel of each convolutional layer in the network is 3 x 3, and the parameters of each convolution unit are optimized by a backpropagation algorithm so that different features of the input are extracted.
(2) Pooling layer: after the convolutional layers, a feature map of large dimension (size [28 x 20 x 64]) is obtained; it is partitioned into [2, 2] regions and the maximum of each region is taken, producing a smaller feature map (size [14 x 10 x 64]).
(3) Supervised learning
The model uses supervised learning and back propagation algorithms to calculate the parameters of each convolution element.
The main process is as follows:
calculation of a loss function using Euclidean distance
E=(x-pred) 2
Where x is the value of the actual feature vector and pred is the value of the predicted feature vector.
Back-propagating the loss function from the output layer to the hidden layer until propagating to the input layer; during the back propagation, adjusting the value of the parameter of each convolution element according to the loss function; the above process is iterated until convergence.
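A hedged TensorFlow/Keras sketch of a CNN matching this description is shown below; the input shape, number of filters per layer and width of the first fully connected layer are not given in the text and are illustrative assumptions, while the 3 x 3 kernels, the two 2 x 2 pooling layers, the layer order, the Adam optimizer with learning rate 1 x 10^-5 and the squared-error loss follow the description.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(28, 20, 1), n_classes=3):
    """Six 3x3 conv layers, two 2x2 max-pooling layers and two dense layers,
    in the order conv x3 -> pool -> conv x3 -> pool -> dense x2. The input
    shape, filter counts and dense width are assumptions; the text only fixes
    the kernel size, the pooling regions and the feature size [28 x 20 x 64]."""
    m = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),   # -> [28, 20, 64]
        layers.MaxPooling2D(pool_size=(2, 2)),                     # -> [14, 10, 64]
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(pool_size=(2, 2)),                     # -> [7, 5, 64]
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),             # T0, T1, T2
    ])
    m.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss="mse",                      # squared-error loss as in the text
              metrics=["accuracy"])
    return m

model = build_cnn()
model.summary()
```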
4.2 weighted integration
In the last step, the classification results of the two types of feature vectors are combined by weighting each result by 50% and summing, giving the final prediction result. As described above, AR_pred and SE_pred are the prediction result vectors of the two types of feature vectors, of the form
AR_pred = [pred_T0, pred_T1, pred_T2] and SE_pred = [pred_T0, pred_T1, pred_T2]
where pred_i (i = T0, T1, T2) is the predicted probability that the EEG sample belongs to class i under this model. The final prediction result vector is
final = 0.5 AR_pred + 0.5 SE_pred
and the final prediction is read off from final.
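A small illustration of this weighted integration; reading off the final class as the largest component of the fused vector is an assumption consistent with the description.

```python
import numpy as np

def fuse_predictions(ar_pred, se_pred, w_ar=0.5, w_se=0.5):
    """Combine the two class-probability vectors (T0, T1, T2) with 50/50
    weights and return the fused vector and the winning class index."""
    ar_pred = np.asarray(ar_pred, dtype=float)
    se_pred = np.asarray(se_pred, dtype=float)
    final = w_ar * ar_pred + w_se * se_pred
    return final, int(np.argmax(final))

# Example: predictions from the AR-feature CNN and the sample-entropy CNN.
final, label = fuse_predictions([0.2, 0.7, 0.1], [0.3, 0.5, 0.2])
# label == 1 -> T1 (imagined left hand)
```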

Claims (1)

1. A key technical method for brain-computer interface software of a stroke rehabilitation system based on deep learning, characterized by comprising electroencephalogram signal acquisition and preprocessing, and specifically comprising the following steps:
1) Electroencephalogram signal acquisition stage
Using an EEG acquisition instrument to collect signals at a sampling frequency of 250 Hz, placing the electrodes according to the international 10-20 standard electrode placement method, selecting several healthy subjects, collecting left-hand and right-hand motor imagery EEG signals from each, filtering with a band-stop filter, and retaining the 1 Hz-50 Hz EEG band;
2) Electroencephalogram signal preprocessing stage
Preprocessing by using EEGLAB, denoising experimental data by using FastICA, and reducing the dimension of the data by using principal component analysis;
3) Feature extraction stage
After preprocessing, performing feature extraction on the processed EEG signals by using an autoregressive model and sample entropy to obtain two types of signal features;
3.1 autoregressive model
The autoregressive model describes the random variable at a later time as a linear regression on the random variables at a number of earlier times; assume the EEG sequence is y_1, y_2, …, y_n; the P-order autoregressive model, denoted AR(P), expresses y_t in the sequence as a linear combination of the previous P values plus an error term, and its mathematical form is
y_t = c + φ_1 y_(t-1) + φ_2 y_(t-2) + … + φ_P y_(t-P) + e_t
where c is a constant term, φ_1, …, φ_P are the model parameters, and e_t is white noise with mean 0 and variance σ; the parameter vector (φ_1, …, φ_P) is used as the EEG feature vector;
3.2 sample entropy
Let the one-dimensional EEG time series be {y(i)}, i = 1, 2, …, n, where y(i) is the EEG value of the i-th second and n is the total time length; the sample entropy is obtained as follows:
(1) Form the sequence {y(i)} into m-dimensional vectors in order, with m = 2, i.e.
Y_m(i) = [y(i), y(i+1), …, y(i+m-1)], i = 1, 2, …, n-m+1
(2) For each time i, compute the distance between the vectors Y_m(i) and Y_m(j), i.e.
d[Y_m(i), Y_m(j)] = max_k |y(i+k) - y(j+k)|, k = 0, 1, …, m-1; i, j = 1, 2, …, n-m+1; i ≠ j
(3) Given a threshold r, r > 0, where r is 0.2 times the standard deviation of the original time series, count for each i the number of d[Y_m(i), Y_m(j)] smaller than r, denoted B_i, and record the ratio of this count to the total number of distances n-m as B_i^m(r), i.e.
B_i^m(r) = B_i / (n-m)
(4) Average B_i^m(r) over all i and denote the result B^m(r), i.e.
B^m(r) = (1/(n-m+1)) Σ_{i=1}^{n-m+1} B_i^m(r)
(5) Form the sequence {y(i)} into (m+1)-dimensional vectors and repeat steps (1) to (4) to obtain B_i^{m+1}(r) and B^{m+1}(r)
(6) The sample entropy of the sequence {y(i)} is
SampEn(m, r) = lim_{n→∞} { -ln[B^{m+1}(r)/B^m(r)] }
and the sample entropy estimate for sequence length n is
sampEn(m, r, n) = -ln[B^{m+1}(r)/B^m(r)]
The specific steps for extracting the nonlinear EEG features with sample entropy are as follows: a sliding time window of length 1 s is applied to the EEG signal and the sample entropy of the windowed signal is computed; the window is then advanced by one sampling point and the sample entropy of the next 1 s window is computed, and so on until the last 1 s window, yielding a time series of sample entropy values for the group of EEG signals; this series is then averaged to give the sample entropy of the group of EEG sample data;
4) Feature classification stage
A convolutional neural network CNN is used as the classifier to classify each of the two extracted types of feature vectors; the two classification results are then weighted and combined to obtain the final classification result;
4.1 CNN structure and principle
The CNN used consists of six convolutional layers, two pooling layers and two fully connected layers, in the order: three convolutional layers, one pooling layer, three convolutional layers, one pooling layer, two fully connected layers; all are implemented in TensorFlow;
the data set is divided into a training set and a testing set, and two types of feature vectors obtained by feature extraction are divided into a training set and a testing setWherein i is E [1, n]N represents the total time length; and SE i ={i∈[1,n]The input of the I sampEn (m, r, i) is three prediction possibilities, and the T0, the T1 and the T2 respectively represent a resting state, an imagined left hand and an imagined right hand; an Adam optimizer is used, and the learning rate is 1 x 10-5;
A. Convolutional layer: each convolution kernel of each convolutional layer in the network is 3 x 3, and the parameters of each convolution unit are optimized by a backpropagation algorithm so that different features of the input are extracted;
B. Pooling layer: after the convolutional layers, a feature of size [28 x 20 x 64] is obtained; it is partitioned into [2, 2] regions and the maximum of each region is taken, producing a new feature of size [14 x 10 x 64];
C. supervised learning
Calculating parameters of each convolution unit by using a back propagation algorithm;
calculation of a loss function using Euclidean distance
E = (x - pred)^2
Wherein x is the value of the actual feature vector, pred is the value of the predicted feature vector;
back-propagating the loss function from the output layer to the hidden layer until propagating to the input layer; during the back propagation, adjusting the value of the parameter of each convolution element according to the loss function; continuously iterating the process until convergence;
4.2 weighted integration
The results of classifying the two types of feature vectors in the previous step are combined by weighting each result by 50% and summing to obtain the final prediction result; as described above, AR_pred and SE_pred are the prediction result vectors of the two types of feature vectors, of the form
AR_pred = [pred_T0, pred_T1, pred_T2] and SE_pred = [pred_T0, pred_T1, pred_T2]
where pred_i (i = T0, T1, T2) is the predicted probability that the EEG sample belongs to class i under the model; the final prediction result vector is
final = 0.5 AR_pred + 0.5 SE_pred
and the final prediction result is obtained from final.
CN202110376347.6A 2021-04-01 2021-04-01 Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning Active CN113128384B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110376347.6A CN113128384B (en) 2021-04-01 2021-04-01 Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110376347.6A CN113128384B (en) 2021-04-01 2021-04-01 Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning

Publications (2)

Publication Number Publication Date
CN113128384A (en) 2021-07-16
CN113128384B (en) 2024-04-05

Family

ID=76775280

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110376347.6A Active CN113128384B (en) 2021-04-01 2021-04-01 Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning

Country Status (1)

Country Link
CN (1) CN113128384B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114082169B (en) * 2021-11-22 2023-03-28 江苏科技大学 Disabled hand soft body rehabilitation robot motor imagery identification method based on electroencephalogram signals
CN114587391A (en) * 2022-03-10 2022-06-07 山东中科先进技术研究院有限公司 A kind of rehabilitation training device and training method based on brain-computer interface
CN114664434A (en) * 2022-03-28 2022-06-24 上海韶脑传感技术有限公司 Cerebral apoplexy rehabilitation training system for different medical institutions and training method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107958213A (en) * 2017-11-20 2018-04-24 北京工业大学 A kind of cospace pattern based on the medical treatment of brain-computer interface recovering aid and deep learning method
CN109620223A (en) * 2018-12-07 2019-04-16 北京工业大学 A kind of rehabilitation of stroke patients system brain-computer interface key technology method
CN112370066A (en) * 2020-09-30 2021-02-19 北京工业大学 Brain-computer interface method of stroke rehabilitation system based on generation of countermeasure network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107958213A (en) * 2017-11-20 2018-04-24 北京工业大学 A kind of cospace pattern based on the medical treatment of brain-computer interface recovering aid and deep learning method
CN109620223A (en) * 2018-12-07 2019-04-16 北京工业大学 A kind of rehabilitation of stroke patients system brain-computer interface key technology method
CN112370066A (en) * 2020-09-30 2021-02-19 北京工业大学 Brain-computer interface method of stroke rehabilitation system based on generation of countermeasure network

Also Published As

Publication number Publication date
CN113128384A (en) 2021-07-16

Similar Documents

Publication Publication Date Title
CN107844755B (en) An EEG feature extraction and classification method combining DAE and CNN
CN111012336A (en) Parallel convolutional network motor imagery electroencephalogram classification method based on spatio-temporal feature fusion
CN105559777B (en) Electroencephalogramrecognition recognition method based on wavelet packet and LSTM type RNN neural networks
CN113158964B (en) Sleep stage method based on residual error learning and multi-granularity feature fusion
CN113128384B (en) Brain-computer interface software key technical method of cerebral apoplexy rehabilitation system based on deep learning
CN114533086B (en) Motor imagery brain electrolysis code method based on airspace characteristic time-frequency transformation
CN113128552B (en) Electroencephalogram emotion recognition method based on depth separable causal graph convolution network
Cheng et al. The optimal wavelet basis function selection in feature extraction of motor imagery electroencephalogram based on wavelet packet transformation
CN109325586B (en) A system for denoising EEG signals
CN108038429A (en) A kind of single brain electrical feature extraction sorting technique of Motor execution
CN115581467A (en) A recognition method of SSVEP based on time, frequency and time-frequency domain analysis and deep learning
CN113111831A (en) Gesture recognition technology based on multi-mode information fusion
CN113065526A (en) Electroencephalogram signal classification method based on improved depth residual error grouping convolution network
CN111387975B (en) Electroencephalogram signal identification method based on machine learning
CN116340824A (en) EMG signal action recognition method based on convolutional neural network
CN113476056A (en) Motor imagery electroencephalogram signal classification method based on frequency domain graph convolution neural network
CN114861706A (en) Electrocardio identity recognition method based on quality evaluation and deep transfer learning
CN113052099B (en) A SSVEP Classification Method Based on Convolutional Neural Networks
CN109657646B (en) Method and device for representing and extracting features of physiological time series and storage medium
CN109009098B (en) A method for feature recognition of EEG signals in motor imagery state
CN117235576A (en) A motor imagery EEG intention classification method based on Riemannian space
CN116849679A (en) Cross-mode hand action evaluation method
CN115017960B (en) An EEG signal classification method and application based on joint spatiotemporal MLP network
He et al. HMT: an EEG signal classification method based on CNN architecture
CN119157556A (en) Brain electrical signal denoising method based on dual-path convolution denoising network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant