
JPH04256087A - Pattern recognition device - Google Patents

Pattern recognition device

Info

Publication number
JPH04256087A
JPH04256087A (also written JP H04256087 A); application numbers JP3060784A (JP 3060784 A) and JP6078491A (JP 6078491 A)
Authority
JP
Japan
Prior art keywords
pattern
dictionary
discriminant
distance
class
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP3060784A
Other languages
Japanese (ja)
Inventor
Toshifumi Yamauchi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to JP3060784A priority Critical patent/JPH04256087A/en
Publication of JPH04256087A publication Critical patent/JPH04256087A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)
  • Character Discrimination (AREA)

Abstract

PURPOSE: To recognize patterns with few recognition errors and a high ability to discriminate between similar classes.
CONSTITUTION: Pattern data observed in a pattern input part 1 is converted into a feature vector in a feature extraction part 2. A principal component dictionary part 3, created by principal component analysis of learning data, and a discriminant dictionary part 4, created by discriminant analysis of learning data, are provided. For an unknown input pattern, distances are calculated against the principal component dictionary in a projection distance calculation part 7 and against the discriminant dictionary in a discriminant distance calculation part 11. A synthetic judgment part 13 rejects patterns whose projection distance value is large, and for patterns whose distance is close to that of another class it adopts the judgment based on the discriminant distance value. Misrecognition of patterns not considered at learning time is thus prevented by the principal component analysis method, which is effective for capturing the overall shape of a pattern, while the discriminant analysis method, which discriminates well between similar classes, is used in combination, achieving high recognition accuracy.

Description

[Detailed Description of the Invention]

[0001]

[Industrial Field of Application] The present invention relates to pattern recognition methods for automatically recognizing handwritten characters, printed characters, speech, and the like, and in particular to a pattern recognition method for automatically recognizing character shapes, sounds, etc. that belong to similar classes.

[0002]

[Prior Art] A general pattern recognition method exists in order to take an observed feature vector

[Math 1]

[0003]

[0004] and associate it efficiently and without error with a category set. However, to recognize characters, speech, etc. without error, the observed feature vector x must be high-dimensional; it then also contains redundancy irrelevant to recognition, and performing the recognition without information compression requires an enormous amount of computation. For efficient recognition it is therefore necessary to extract, on the basis of the statistical structure of the feature vector x, features that are effective for classification and discrimination, and to construct a more efficient feature space Fd of lower dimension. Methods using the statistical techniques of principal component analysis and discriminant analysis have been reported as ways of constructing the feature space Fd. [Reference: Otsu, "Mathematical Studies on Feature Extraction in Pattern Recognition," Researches of the Electrotechnical Laboratory, No. 818 (1981).]

[0005]

[Problems to Be Solved by the Invention] The conventional pattern recognition methods using principal component analysis and discriminant analysis described above each have strengths and weaknesses.

[0006] The principal component analysis method represents a high-dimensional set of patterns belonging to one class by a small number of mutually orthogonal principal component axes; the hyperplane obtained by principal component analysis approximates the pattern set optimally in the squared-error sense. It is an excellent method in that it describes the overall shape of a pattern with few features, but because it takes no account whatsoever of the distribution of patterns in other classes, it discriminates poorly when similar classes exist.

[0007] The discriminant analysis method extracts, from high-dimensional pattern sets of several classes, discriminant axes with a high ability to separate the classes and maps the patterns into a low-dimensional discriminant space. It therefore discriminates well between similar classes; on the other hand, the features emphasized for class discrimination are likely to be partial features, so when a pattern not considered at learning time is input, the class is often determined incorrectly.

[0008] The object of the present invention is to provide a pattern recognition device that achieves highly accurate recognition by noting that principal component analysis and discriminant analysis have different strengths, and by combining the two methods so as to exploit the advantages of each.

[0009]

[Means for Solving the Problems] The present invention is a pattern recognition device in which pattern data observed in a pattern input part is converted into a feature vector in a feature extraction part and judgment is performed on the basis of distance values between the feature vector and recognition dictionary vectors. The device comprises: a principal component dictionary part created by principal component analysis of learning patterns; a projection distance calculation part that calculates distances between the feature vector and the principal component dictionary; a discriminant dictionary part created by discriminant analysis of the learning patterns; a discriminant distance calculation part that calculates distances between the feature vector and the discriminant dictionary; and a synthetic judgment part that performs an overall judgment on the basis of the projection distance values obtained by the projection distance calculation part and the discriminant distance values obtained by the discriminant distance calculation part.

[0010] The present invention thus has both a principal component dictionary part obtained by principal component analysis and a discriminant dictionary part obtained by discriminant analysis. In the classification stage, which narrows the candidates down to a few classes, the principal component dictionary is used because of its stable, excellent description of overall features; in the discrimination stage, which decides to which one of the candidate classes the pattern belongs, the discriminant dictionary is used because of its high ability to distinguish similar classes.

[0011]

[Embodiments] An embodiment of the present invention will now be described with reference to the drawings.

[0012] FIG. 1 shows the configuration and processing flow of one embodiment. This pattern recognition device comprises: a pattern input part 1; a feature extraction part 2 that converts the pattern data observed by the pattern input part 1 into a feature vector; a principal component dictionary part 3 created by principal component analysis of learning patterns; a projection distance calculation part 7 that calculates distances between the feature vector and the principal component dictionary; a discriminant dictionary part 4 created by discriminant analysis of the learning patterns; a discriminant distance calculation part 11 that calculates distances between the feature vector and the discriminant dictionary; and a synthetic judgment part 13 that performs an overall judgment on the basis of the projection distance values obtained by the projection distance calculation part 7 and the discriminant distance values obtained by the discriminant distance calculation part 11.

[0013] The pattern input part 1 consists of a CCD sensor or a microphone; the character or speech signal input through it is A/D-converted, turning the observed data into discrete time-series data.

[0014] The feature extraction part 2 converts the discrete time-series data into a feature vector x. An example of its processing is explained with reference to FIGS. 2 to 4. FIG. 2 shows the observed data of a character obtained by the pattern input part 1: 15 is the black (stroke) region of the character and 16 is its background, and these correspond to the discrete time-series data. The feature extraction part 2 traces the contour of the character. Letting P4 be the 4-direction tangent point 17 and P8 the 8-direction tangent point 18 among the points on the character contour shown in FIG. 3, the contour is sampled at equal intervals from P4 to P8 to obtain the feature points f1, f2, ..., fn/2 shown at 19, 20 and 21. Similarly, sampling at equal intervals from P8 to P4 yields the feature points f(n/2)+1, ..., fn-1, fn shown at 22, 23 and 24.

[0015] For each of the obtained feature points f1, ..., fn, coordinate, direction and angle features are extracted. FIG. 4 shows the extracted features. For example, for the feature point fi shown at 25, one obtains the x coordinate pi(1) shown at 26, the y coordinate pi(2) shown at 27, the x component pi(3) (28) and y component pi(4) (29) of the tangent direction of fi shown at 30, and the angle pi(5) shown at 34 formed at fi (31) by its neighbours fi-1 and fi+1, giving the feature vector of the feature point fi

[Math 2]

[0016]

[0017] is thus generated. The feature vector x of one character as a whole is

[Math 3]

[0018]

[0019] as generated by Math 3.
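The contour features of [0014] and [0015] can be sketched in code. This is an illustrative reading, not the patent's implementation: `contour` is assumed to already hold the n equally spaced contour points f1, ..., fn obtained by tracing, and the helper name `point_features` is hypothetical (numpy is assumed).

```python
import numpy as np

def point_features(contour):
    """Per-point features p_i = (x, y, tangent-x, tangent-y, angle), as in [0015].

    `contour` is an (n, 2) array of equally spaced contour points f_1..f_n
    (hypothetical input; the patent obtains them by sampling the traced
    outline between the 4-direction and 8-direction tangent points).
    """
    n = len(contour)
    feats = []
    for i in range(n):
        prev_pt = contour[(i - 1) % n]          # f_{i-1}
        next_pt = contour[(i + 1) % n]          # f_{i+1}
        tangent = next_pt - prev_pt             # tangent direction at f_i
        tangent = tangent / (np.linalg.norm(tangent) + 1e-12)
        v1 = prev_pt - contour[i]
        v2 = next_pt - contour[i]
        cosang = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-12)
        angle = np.arccos(np.clip(cosang, -1.0, 1.0))   # angle p_i(5) at f_i
        feats.append([contour[i][0], contour[i][1],     # p_i(1), p_i(2)
                      tangent[0], tangent[1],           # p_i(3), p_i(4)
                      angle])                           # p_i(5)
    # the whole-character feature vector x concatenates all p_i ([0017]-[0019])
    return np.concatenate(feats)
```

For a square traced by four corner points, x has 4 × 5 = 20 components and each corner angle is a right angle.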

[0020] The principal component dictionary part 3 of FIG. 1 is obtained as follows. For the set of feature vectors of the learning patterns belonging to class ck

[Math 4]

[0021]

[0022] the quantity

[Math 5]

[0023]

[0024] is

[Math 6]

[0025]

[0026]

[Math 7]

[0027]

[0028] obtained from the expressions of Math 6 and Math 7. Principal component analysis solves the eigenvalue problem of Math 8,

[Math 8]

[0029]

[0030] rearranges the corresponding eigenvectors U so that the eigenvalues satisfy λ1 ≥ λ2 ≥ … ≥ λM,

[Math 9]

[0031]

[0032] and selects the leading eigenvectors as the principal component axes. For all classes c = {ck}, the principal component dictionary part 3 consists of the mean vector μk and the principal component axis vectors uk1, uk2, ..., ukN. The subspace spanned by the principal axes uk1, ..., ukN around the mean vector μk is taken as the principal component dictionary of class ck.
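A minimal sketch of how the principal component dictionary of [0020]–[0032] could be built — the class mean μk plus the eigenvectors of the class covariance matrix, ordered by decreasing eigenvalue. The function name and array layout are assumptions, not taken from the patent (numpy is assumed).

```python
import numpy as np

def pca_dictionary(X, n_axes):
    """Principal component dictionary of one class: (mu_k, [u_k1 .. u_kN]).

    X is an (n_samples, M) array of learning feature vectors for class c_k
    (assumed layout). Solves the eigenvalue problem of Math 8 on the class
    covariance matrix and keeps the n_axes leading eigenvectors (Math 9).
    """
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]            # lambda_1 >= lambda_2 >= ...
    U = eigvecs[:, order[:n_axes]]               # (M, n_axes) principal axes
    return mu, U
```

The affine subspace through mu spanned by the columns of U then plays the role of the class's dictionary.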

[0033] Next, the projection distance calculation part 7 of FIG. 1 performs the following processing. Writing the feature vector of Math 3 for an unknown pattern as the unknown-pattern feature vector

[Math 10]

[0034]

[0035] the projection distance dk(z) to the principal component dictionary of class ck is

[Math 11]

[0036]

[0037] as given by Math 11. The projection distance to the principal component dictionary of each class {ck} is calculated, and the unknown-pattern feature vector z is assigned to the class ck′ that minimizes the distance, as shown in Math 12.

[0038]

[Math 12]

[0039]

[0040] The projection distance dα(z) for k = α is illustrated in FIG. 5. For the set {xα} of learning patterns of class cα, the mean vector μα shown at 36 is obtained; principal component analysis about μα yields the first principal component axis uα1 shown at 37, along which the variance of the data is largest, and then, from the orthogonal complement of uα1, the second principal component axis uα2 shown at 38. The projection distance 41 of the unknown-pattern feature vector z to the principal component dictionary plane of class cα is

[Math 13]

[0041]

[0042] as given by Math 13. The first term on the right-hand side of Math 13 is the Euclidean distance between the unknown-pattern feature vector z shown at 39 and the mean vector μα of class cα; the second term is the distance between μα and the intersection 42 of the perpendicular dropped from z with the principal component dictionary plane 35 of class cα. Thus dα(z) is the length of the perpendicular 41 dropped from z to the dictionary plane 35.
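Under one plausible reading of Math 11–13 (the equations themselves are images and are not reproduced here), the projection distance is the length of the perpendicular from z to the class's principal component plane, computed by Pythagoras from the distance to the mean and the projection onto the axes. A hedged numpy sketch with hypothetical names:

```python
import numpy as np

def projection_distance(z, mu, U):
    """Projection distance d_k(z) of [0035]-[0042]: the length of the
    perpendicular from z to the affine subspace spanned by the columns of
    U (orthonormal principal axes) around the class mean mu."""
    d = z - mu
    proj = U.T @ d                       # coordinates of z - mu on the axes
    sq = d @ d - proj @ proj             # Pythagoras: ||d||^2 - ||projection||^2
    return float(np.sqrt(max(sq, 0.0)))

def classify_by_projection(z, dictionaries):
    """Assign z to the class whose dictionary minimizes d_k(z), as in Math 12.

    `dictionaries` maps class labels to (mu, U) pairs."""
    return min(dictionaries, key=lambda k: projection_distance(z, *dictionaries[k]))
```

For example, with a dictionary plane spanned by the first coordinate axis, a point offset by 1 off the plane has projection distance 1.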

[0043] Similarly, for class cβ, let the mean vector 44 be μβ, the first principal component axis 45 be uβ1, and the second principal component axis 46 be uβ2. The projection distance 47 of the unknown-pattern feature vector z to the principal component dictionary plane of class cβ is

[Math 14]

[0044]

[0045] as given by Math 14.

[0046] The discriminant dictionary part 4 of FIG. 1 is obtained as follows. For the set of feature vectors of learning patterns belonging to class cα

[Math 15]

[0047]

[0048] and the set of feature vectors of learning patterns belonging to class cβ

[Math 16]

[0049]

[0050] let μα be the mean vector of the feature vectors of class cα and μβ that of class cβ. The covariance matrices Σα and Σβ of the two classes are then

[Math 17]

[0051]

[0052]

[Math 18]

[0053]

[0054] and the within-class covariance matrix ΣW is

[Math 19]

[0055]

[0056] where ωα and ωβ are the occurrence probabilities of patterns of classes cα and cβ.

[0057] Next, the mean vector μt of the two classes taken together is

[Math 20]

[0058]

[0059] as given by Math 20, and the between-class covariance matrix ΣB is


[Math 21]

[0060]

[0061] as given by Math 21.

[0062] The transformation matrix Aαβ that has the highest ability to linearly separate the data sets of M-dimensional feature vectors of the two classes is obtained by solving the eigenvalue problem of Math 22.

[0063]

[Math 22]

[0064]

[0065] Hence, for a two-class problem such as that shown in FIG. 6, N = 1 and only the single discriminant axis aαβ1 is obtained.

[0066] In FIG. 6, the distribution of the feature vectors

[Math 23]

[0067]

[0068] belonging to class cα is shown at 48, and the distribution of the feature vectors

[Math 24]

[0069]

[0070] belonging to class cβ is shown at 49. The discriminant feature axis aαβ1 with the highest ability to linearly separate classes cα and cβ according to Math 22 is shown at 50.
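The two-class discriminant axis of [0050]–[0065] can be sketched as a Fisher-style generalized eigenproblem on the within- and between-class covariance matrices. The function name, the equal-prior defaults, and the use of numpy are assumptions:

```python
import numpy as np

def fisher_axis(Xa, Xb, wa=0.5, wb=0.5):
    """Discriminant axis a_ab1 from Sigma_B a = lambda Sigma_W a (Math 22).

    Xa, Xb are (n, M) arrays of learning vectors for classes c_alpha and
    c_beta; wa, wb play the role of the occurrence probabilities
    omega_alpha, omega_beta of [0056].
    """
    mua, mub = Xa.mean(axis=0), Xb.mean(axis=0)
    Sw = wa * np.cov(Xa, rowvar=False) + wb * np.cov(Xb, rowvar=False)  # Math 19
    mut = wa * mua + wb * mub                                           # Math 20
    Sb = (wa * np.outer(mua - mut, mua - mut)
          + wb * np.outer(mub - mut, mub - mut))                        # Math 21
    # for two classes Sigma_B has rank 1, so N = 1 and a single axis results
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    a = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    return a / np.linalg.norm(a)
```

With two isotropic clusters separated along the first coordinate, the recovered axis aligns with that coordinate.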

[0071] Next, when the set 48 of feature vectors belonging to class cα is projected onto the discriminant feature axis aαβ1, let Vαmin be the minimum projected value, shown at 51, and Vαmax the maximum, shown at 52; when the set 49 of feature vectors belonging to class cβ is projected onto aαβ1, let Vβmin be the minimum projected value, shown at 53, and Vβmax the maximum, shown at 54. The projected value 55 of the unknown input pattern vector is

[Math 25]

[0072]

[0073] as given by Math 25. The distance Dα to the discriminant dictionary of class cα and the distance Dβ to that of class cβ are given by the following expressions.

[0074]

[Math 26]

[0075]

[0076]

[Math 27]

[0077]

[0078] The class to which the unknown input pattern vector z belongs is determined as cl′ according to Math 28.

[0079]

[Math 28]

[0080]

[0081] In the example of FIG. 6, comparing Dα(Vz) shown at 56 with Dβ(Vz) shown at 57 gives

[Math 29]

[0082]

[0083] so the class is determined to be cβ.
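Math 26–28 are reproduced only as images, but [0071]–[0083] read naturally as a distance to each class's projection interval on the discriminant axis: zero inside [Vmin, Vmax], and the distance to the nearest end outside it. A sketch under that assumption, with hypothetical names:

```python
def range_distance(v, vmin, vmax):
    """Distance D(v) of a projected value v to one class's projection
    range [vmin, vmax] on the discriminant axis (assumed form of
    Math 26 / Math 27)."""
    if v < vmin:
        return vmin - v
    if v > vmax:
        return v - vmax
    return 0.0   # v falls inside the class's projection range

def discriminant_decide(vz, ranges):
    """Pick the class whose range distance D(v_z) is smallest, as in Math 28.

    `ranges` maps class labels to (Vmin, Vmax) pairs."""
    return min(ranges, key=lambda k: range_distance(vz, *ranges[k]))
```

A value falling inside one class's range (distance 0) beats any class whose range it lies outside.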

[0084] The synthetic judgment part 13 of FIG. 1 is controlled by the results of the projection distance calculation part 7. Let cα be the class with the smallest distance value in the projection distance calculation part 7 and cβ the class ranked second. Then, for the distances dα(z) and dβ(z) of Math 13 and Math 14, when

[Math 30]

[0085]

[0086] is satisfied, the pattern is judged to belong to class cα. When dβ(z) − dα(z) < δ1, the calculation results of the discriminant distance calculation part are consulted and the class cl′ is determined according to Math 28 (l′ = β in the example of FIG. 6). When all the distance values of the discriminant distance calculation part are equal, for example Dα(Vz) = Dβ(Vz), the pattern is rejected.

[0087] The pattern is also rejected when the minimum distance dα(z) found by the projection distance calculation part satisfies

[Math 31]

[0088]

[0089] the condition of Math 31.
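The combined judgment of [0084]–[0089] can be sketched as follows. Here delta1 and delta2 stand for the thresholds of Math 30 and Math 31; the function name and the dictionary-based interface are hypothetical, not the patent's.

```python
def total_decision(proj, disc, delta1, delta2):
    """Synthetic judgment sketch: `proj` maps classes to projection
    distances d_k(z), `disc` maps classes to discriminant distances
    D_k(Vz). Returns the chosen class label, or None for a reject."""
    ranked = sorted(proj, key=proj.get)
    ca, cb = ranked[0], ranked[1]         # best and second-best by projection
    if proj[ca] > delta2:                 # far from every dictionary: reject
        return None
    if proj[cb] - proj[ca] >= delta1:     # clear winner by projection distance
        return ca
    if disc[ca] == disc[cb]:              # tie on the discriminant axis: reject
        return None
    return min((ca, cb), key=disc.get)    # close call: defer to Math 28
```

A clear projection-distance margin decides immediately; a narrow margin hands the decision to the discriminant distances, and a pattern far from all dictionaries is rejected.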

[0090] FIG. 7 shows the judgment logic of the synthetic judgment part described above.

[0091] The effectiveness of the present invention is explained with reference to FIG. 8.

[0092] For an unknown pattern vector z1 such as that shown at 58, a method based on principal component analysis discriminates similar classes poorly because it takes no account of the other classes: with the projection distance method, dα(z1) shown at 59 and dβ(z1) shown at 60 take nearly equal values, making the pattern hard to classify. A method based on discriminant analysis, by contrast, discriminates similar classes well, so a difference arises between Dα(Vz1) shown at 61 and Dβ(Vz1) shown at 62, and the pattern can be identified as belonging to class cα.

[0093] Conversely, for a pattern such as z2 shown at 63, a method based on discriminant analysis gives Dα(Vz2) = 0 while Dβ(Vz2) > 0 as shown at 64, so even a pattern lying far from the distribution of cα could be judged to belong to class cα. In the present invention such a pattern can be rejected, because dα(z2) > δ2 as shown at 65.

[0094]

[Effects of the Invention] As explained above, in the pattern recognition method of the present invention, a pattern for which similar classes exist is judged by the discriminant analysis method, which distinguishes similar patterns well, while a pattern of a kind not considered at learning time can be rejected by checking its distance value with a method based on principal component analysis, which describes the overall shape of a pattern well. A pattern recognition method with few recognition errors and a high ability to discriminate similar classes is thereby realized. Although the explanation above uses character recognition as an example, the method is readily applicable to speech recognition and the like by using, for example, frequency components as the feature pattern.

[Brief Description of the Drawings]

[FIG. 1] A diagram showing the configuration and processing flow of the present invention.

[FIG. 2] A diagram showing sampled feature points.

[FIG. 3] A diagram showing the eight directions.

[FIG. 4] A diagram showing the features at a feature point fi.

[FIG. 5] A diagram showing the projection distance of an unknown-pattern feature vector z to the principal component dictionary.

[FIG. 6] A diagram showing the distance of an unknown pattern vector z to the discriminant dictionary.

[FIG. 7] A diagram showing the judgment logic of the synthetic judgment part.

[FIG. 8] An explanatory diagram of the effectiveness of the present invention.

[Explanation of Symbols]

1 pattern input part
2 feature extraction part
3 principal component dictionary part
4 discriminant dictionary part
7 projection distance calculation part
11 discriminant distance calculation part
13 synthetic judgment part

Claims (1)

[Claims]

[Claim 1] A pattern recognition device in which pattern data observed in a pattern input part is converted into a feature vector in a feature extraction part and judgment is performed on the basis of distance values between the feature vector and recognition dictionary vectors, the device comprising: a principal component dictionary part created by principal component analysis of learning patterns; a projection distance calculation part that calculates distances between the feature vector and the principal component dictionary; a discriminant dictionary part created by discriminant analysis of the learning patterns; a discriminant distance calculation part that calculates distances between the feature vector and the discriminant dictionary; and a synthetic judgment part that performs an overall judgment on the basis of the projection distance values obtained by the projection distance calculation part and the discriminant distance values obtained by the discriminant distance calculation part.
JP3060784A 1991-02-07 1991-02-07 Pattern recognition device Pending JPH04256087A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP3060784A JPH04256087A (en) 1991-02-07 1991-02-07 Pattern recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP3060784A JPH04256087A (en) 1991-02-07 1991-02-07 Pattern recognition device

Publications (1)

Publication Number Publication Date
JPH04256087A true JPH04256087A (en) 1992-09-10

Family

ID=13152267

Family Applications (1)

Application Number Title Priority Date Filing Date
JP3060784A Pending JPH04256087A (en) 1991-02-07 1991-02-07 Pattern recognition device

Country Status (1)

Country Link
JP (1) JPH04256087A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6778701B1 (en) 1999-10-04 2004-08-17 Nec Corporation Feature extracting device for pattern recognition
US7634140B2 (en) 2002-02-27 2009-12-15 Nec Corporation Pattern feature selection method, classification method, judgment method, program, and device
JP2010237729A (en) * 2009-03-30 2010-10-21 Nec Corp Subjective rating value detection apparatus, subjective rating value detection method and program


Similar Documents

Publication Publication Date Title
Chan et al. Multiscale local phase quantization for robust component-based face recognition using kernel fusion of multiple descriptors
Sarfraz et al. Head Pose Estimation in Face Recognition Across Pose Scenarios.
Bazen et al. Likelihood-ratio-based biometric verification
EP0436819B1 (en) Handwriting recognition employing pairwise discriminant measures
EP0355748A2 (en) A pattern recognition apparatus and method for doing the same
Yazdanpanah et al. Multimodal biometric system using face, ear and gait biometrics
CN112613480B (en) A face recognition method, system, electronic device and storage medium
Bourennane et al. Comparison of shape descriptors for hand posture recognition in video
Mwaura et al. Multimodal biometric system:-fusion of face and fingerprint biometrics at match score fusion level
Liu et al. A recognition system for partially occluded dorsal hand vein using improved biometric graph matching
CN112257585A (en) Breaker state monitoring method based on probability density
CN115690803A (en) Digital image recognition method, device, electronic device and readable storage medium
Gawali et al. 3d face recognition using geodesic facial curves to handle expression, occlusion and pose variations
JP2008140093A (en) Abnormal event extraction device, abnormal event extraction method, program for the method, and storage medium recording the program
JPH04256087A (en) Pattern recognition device
Mohankrishnan et al. On-line signature verification using a nonstationary autoregressive model representation
Mane et al. Novel multiple impression based multimodal fingerprint recognition system
JP2001014465A (en) Method and device for recognizing object
Lin et al. Spatial-temporal histograms of gradients and HOD-VLAD encoding for human action recognition
Baydoun et al. Hand pose recognition in first person vision through graph spectral analysis
Telgad et al. Development of an efficient secure biometric system by using iris, fingerprint, face
Song et al. Learning discriminative and invariant representation for fingerprint retrieval.
CN110580469B (en) Palm vein recognition system and method based on embedded equipment
Gambhir et al. Person recognition using multimodal biometrics
Mohammed Mean-discrete algorithm for individuality representation