EP1872189A2 - Generic classification system - Google Patents
Info
- Publication number
- EP1872189A2 (application EP06728271A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- classification
- training
- algorithm
- vectors
- classification system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q90/00—Systems or methods specially adapted for administrative, commercial, financial, managerial or supervisory purposes, not involving significant data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
Definitions
- the present invention relates to the classification of objects and patterns and, more particularly, to a classification system with a generic classification algorithm that is automatically optimized for classifying specific kinds of objects or patterns.
- FIG. 1 is a simple example of a two-dimensional feature space 10 partitioned into two classes.
- the objects are people.
- the features are height and weight, so the coordinate axes of feature space 10 are a HEIGHT axis and a WEIGHT axis.
- Feature space 10 is partitioned into two classes (“obese” and "non-obese") by a line 12.
- a person whose (weight-value, height-value) vector lies below and to the right of line 12 is classified as “obese”.
- a person whose (weight-value, height-value) vector lies above and to the left of line 12 is classified as "non-obese".
- Feature space 10 is a "binary" feature space with two classes.
- the number of classes into which a feature space is partitioned is application-specific. For example, it may be useful to partition a geographic feature space whose coordinate axis features are "latitude” and “altitude” into three climate classes: “hot”, “temperate” and “cold”.
- Feature space 10 is a simple two-dimensional feature space that is easy to visualize and so easy to partition. In practical applications, feature spaces may have tens of feature coordinates. Such high-dimensional spaces are difficult or impossible to visualize.
- classification algorithms have been developed for partitioning high-dimensional feature spaces into classes according to training sets of vectors. Examples of such algorithms include nearest neighbor algorithms, support vector machines and least squares algorithms. See Richard O. Duda et al., Pattern Classification (Wiley Interscience, 2000). For any specific application, a set of training vectors is chosen and is classified manually. The algorithm chosen for partitioning the feature space then selects the boundaries (e.g. hyperplanes) that partition the feature space into classes in accordance with the training vectors. Subsequently, given a new vector of feature values, the algorithm decides which class that vector belongs to.
- the X's in Figure 1 represent six training vectors that could be used to train a classification algorithm for partitioning feature space 10.
- Line 12 is a boundary between the two classes, "obese" and “non-obese”, that would be selected by a least squares classification algorithm.
- partitioning of the feature space need not be, and often is not, explicit.
- the algorithm chosen for partitioning the feature space actually operates by determining values of algorithm parameters in accordance with the manual classification of the training vectors. These parameter values define the partition boundaries implicitly, in the sense that, given a new vector of feature values, the algorithm decides on which side of the boundaries the new vector falls.
- the selection of the best classification algorithm to use for a specific application is a difficult task even for a specialist. So, for example, a manufacturer of digital cameras who desires to include in each camera a chip for advising the user of the camera about the quality (acceptable vs. non-acceptable, for example) of each photograph, would have to invest in an expensive research and development effort to select and optimize the appropriate classification algorithm. There is thus a widely recognized need for, and it would be highly advantageous to have, a system that could be trained by a non-specialist to perform near-optimum classifications for any particular application.
- a classification system including: (a) a training device for: (i) selecting which one of a plurality of training classification algorithms best classifies a set of training vectors, and (ii) finding a set of values, of parameters of a generic classification algorithm, that enable the generic classification algorithm to substantially emulate the selected training classification algorithm; and (b) at least one classification device for classifying at least one vector other than the training vectors, using the generic classification algorithm with the values.
- a classification system including: (a) a training device for selecting which one of a plurality of classification algorithms best classifies a set of training vectors; and (b)at least one classification device for classifying at least one vector other than the training vectors, using the selected classification algorithm.
- the basic system of a first embodiment of the present invention includes two kinds of devices: a training device and one or more (preferably more than one) classification devices.
- the training device selects which one of a set of two or more training classification algorithms best classifies a set of training vectors, and then finds a set of values, of parameters of a generic classification algorithm, that enable the generic classification algorithm to substantially emulate the training classification algorithm that best classifies the set of training vectors.
- the classification device(s) use(s) the generic classification algorithm, parametrized with the values that the training device found, to classify other vectors.
- the classification device(s) is/are reversibly operationally connectable to the training device to receive the generic classification algorithm parameter values that the training device finds.
- the training device finds the generic classification algorithm parameter values by steps including resampling the feature space of the training vectors, thereby obtaining a set of resampling vectors, and then classifying the resampling vectors using the training classification algorithm that best classifies the set of training vectors.
- the resampling by the training device resamples the feature space more densely than does the set of training vectors.
- the training device has the option of dimensionally reducing the set of training vectors before selecting the training classification algorithm that best classifies the set of training vectors, and the classification device(s) also has/have the option to similarly dimensionally reduce the other vectors that it/they classifies/classify.
- the system also includes, for each classification device, a respective memory for storing the generic classification algorithm parameter values.
- Each memory is reversibly operationally connectable to the training device and to the memory's classification device.
- each classification device includes a mechanism for executing the generic classification algorithm.
- the mechanism includes a general purpose processor, and/or a nonvolatile memory for storing the generic classification algorithm program code, or a field programmable gate array, or an application-specific integrated circuit.
- the generic classification algorithm is a k-nearest-neighbors algorithm.
- the training device includes a nonvolatile memory for storing program code for effecting the selection of the best training classification algorithm and the finding of the corresponding parameters of the generic classification algorithm. Most preferably, at least a portion of such code is included in a dynamically linked library.
- the basic system of a second embodiment of the present invention also includes a training device and one or more classification devices.
- the training device selects which one of a set of two or more classification algorithms best classifies a set of training vectors.
- the classification device(s) use the selected classification algorithm to classify other vectors.
- each classification device includes a mechanism for executing the selected classification algorithm and a memory for storing an indication of which one of the classification algorithms has been selected by the training device.
- alternatively, each classification device itself does not include such a memory.
- the system includes, for each classification device, a respective memory, for storing the indication of which one of the classification algorithms has been selected by the training device, that is reversibly operationally connectable to the training device and to the memory device's classification device.
- the memory also is for storing at least one parameter of the classification algorithm that has been selected by the training device.
- FIG. 1 is an example of a feature space with six training vectors
- FIG. 2 is a high-level block diagram of a system of the present invention
- FIG. 3 is the feature space of FIG. 1 with resampling vectors substituted for the training vectors;
- FIG. 4 illustrates an alternative construction of the classification device of FIG. 2.
- the present invention is of a system which can be trained by a non-specialist to classify objects for any specific application.
- FIG. 2 is a high-level block diagram of a system 20 of the present invention.
- System 20 includes two major components: a training device 30 and a classification device 40.
- Training device 30 is represented functionally in Figure 2, as a flow chart of the activities of training device 30.
- training device 30 preferably is a general purpose computer that is programmed to implement the illustrated flow chart.
- a nonvolatile memory (e.g. a hard disk) of training device 30 preferably stores a dynamically linked library for implementing the illustrated flow chart.
- the input to training device 30 is a set 22 of training vectors that have been classified manually.
- set 22 is used as input to several classification algorithms that are used independently, to determine which of these classification algorithms is the best algorithm to use in the application from which the training vectors of set 22 have been selected. These classification algorithms are called “training classification algorithms” herein.
- One way to determine the best training classification algorithm to use is the "leave one out" method. Given a set 22 of N training vectors, each training classification algorithm is run N times on set 22, each time leaving out one of the vectors. The training classification algorithm that duplicates the manual classification of the largest number of "left out" vectors is selected in block 36 as the "best" training classification algorithm for the application at hand.
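The "leave one out" selection of block 36 can be sketched as follows. This is a hypothetical toy illustration, not the patent's implementation: the two candidate classifiers (`nn_classify`, a 1-nearest-neighbor rule, and `centroid_classify`, a nearest-class-mean rule) and the height/weight training set stand in for whatever training classification algorithms and set 22 a real system would use.

```python
import math

def nn_classify(train, query):
    # 1-nearest-neighbor: return the label of the closest training vector
    return min(train, key=lambda v: math.dist(v[0], query))[1]

def centroid_classify(train, query):
    # nearest-centroid: return the label whose class mean is closest
    by_label = {}
    for feats, label in train:
        by_label.setdefault(label, []).append(feats)
    means = {lab: [sum(c) / len(c) for c in zip(*pts)]
             for lab, pts in by_label.items()}
    return min(means, key=lambda lab: math.dist(means[lab], query))

def leave_one_out_score(algorithm, training_set):
    # Run the algorithm N times, each time leaving one vector out, and
    # count how often the left-out vector's manual label is reproduced.
    hits = 0
    for i, (feats, label) in enumerate(training_set):
        rest = training_set[:i] + training_set[i + 1:]
        if algorithm(rest, feats) == label:
            hits += 1
    return hits

# toy (height, weight) training set, classified manually
training_set = [((150, 45), "non-obese"), ((160, 55), "non-obese"),
                ((170, 65), "non-obese"), ((155, 90), "obese"),
                ((165, 100), "obese"), ((175, 110), "obese")]

scores = {alg.__name__: leave_one_out_score(alg, training_set)
          for alg in (nn_classify, centroid_classify)}
best = max(scores, key=scores.get)  # the "best" training algorithm
```

The algorithm with the highest left-out hit count plays the role of the selected training classification algorithm in the steps that follow.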
- values of parameters of a generic classification algorithm are selected that enable that generic classification algorithm to emulate the best training classification algorithm.
- One way to do this is to resample the feature space that is sampled by training set 22, to use the best training classification algorithm to classify the resampling vectors, and to use the resampling vectors thus classified to train the generic classification algorithm.
- the resampling vectors are distributed randomly in the feature space.
- the resulting parameter values are output as a generic parameter set 24.
- Figure 3 illustrates schematically what is involved in this kind of resampling. Specifically, Figure 3 is Figure 1 with the substitution of resampling vectors, represented by "+"s, for the six training vectors.
- the least squares classification algorithm that produced line 12 in response to the six training vectors would classify the "+"s above and to the left of line 12 as “non-obese” and the "+”s below and to the right of line 12 as “obese”.
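The resampling step of block 38 might be sketched as below. The uniform sampling bounds and the `line_classifier` coefficients (standing in for the least-squares boundary analogous to line 12) are invented for illustration; a real system would use whichever training classification algorithm won the leave-one-out selection.

```python
import random

def resample_feature_space(bounds, n):
    # Draw n vectors uniformly at random within per-axis (low, high)
    # bounds, sampling the feature space more densely than set 22 does.
    return [tuple(random.uniform(lo, hi) for lo, hi in bounds)
            for _ in range(n)]

def line_classifier(vec):
    # Stand-in for the selected "best" algorithm: classify against a
    # hypothetical least-squares boundary (weight vs. height line).
    height, weight = vec
    return "obese" if weight > 0.9 * height - 60 else "non-obese"

random.seed(0)
# height in [140, 200] cm, weight in [40, 150] kg (illustrative bounds)
resampling_vectors = resample_feature_space([(140, 200), (40, 150)], 500)

# Label every resampling vector with the selected algorithm; the result
# becomes the training set for the generic classification algorithm.
generic_training_set = [(v, line_classifier(v)) for v in resampling_vectors]
```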
- Classification device 40 includes a non-volatile memory 42 for storing the generic parameter values of set 24 and a classifying mechanism 44 that uses the generic classification algorithm, as parameterized by the parameter values stored in memory 42, to classify any new feature vector that is presented to classifying mechanism 44.
- Classifying mechanism 44 may be implemented in hardware, firmware or software.
- a preferred software implementation of classifying mechanism 44 includes a non-volatile memory for storing program code of the generic classification algorithm, a general purpose processor for executing the code, and a random access memory into which the program code instructions are loaded in order to be executed.
- a preferred hardware implementation of classifying mechanism 44 includes a field programmable gate array or an application-specific integrated circuit that is hardwired to implement the generic classification algorithm.
- the preferred generic classification algorithm is a k-nearest-neighbors algorithm.
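A minimal k-nearest-neighbors classifier, sketching the preferred generic classification algorithm, could look like the following. Here the "generic parameter set" stored in memory 42 is simply k together with the labeled resampling vectors; the toy data are illustrative assumptions.

```python
import math
from collections import Counter

def knn_classify(labeled_vectors, query, k=3):
    # Find the k labeled vectors closest to the query and return the
    # majority label among them.
    nearest = sorted(labeled_vectors,
                     key=lambda v: math.dist(v[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# toy labeled (height, weight) vectors, e.g. classified resampling vectors
labeled = [((150, 45), "non-obese"), ((160, 55), "non-obese"),
           ((170, 65), "non-obese"), ((155, 90), "obese"),
           ((165, 100), "obese"), ((175, 110), "obese")]

result = knn_classify(labeled, (158, 95))  # "obese"
```

Because k-NN needs no retraining, loading a different labeled set into memory 42 repurposes the same classifying mechanism 44 for any application.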
- Some features of the vectors of training set 22 may be irrelevant to the classification of these vectors. For example, the color of a person's hair has no bearing on whether that person is obese. Including values of the feature "hair color" in the vectors of a training set for training a classification algorithm to partition feature space 10 would just introduce noise to the training process. It is obvious in this simple example not to include a "hair color" feature in an obesity training set; but in practical cases of higher dimensionality it is not obvious what features or combination of features to exclude from the training vectors. Therefore, optionally, before the training classification algorithms are executed in block 34, the dimensionality of the feature space described by training set 22 is reduced in block 32, using a procedure such as principal component analysis that culls irrelevant dimensions from the feature space.
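As a much simpler stand-in for the principal component analysis mentioned above, block 32 could cull features that carry no useful variation. The threshold and the constant "hair color" column below are illustrative assumptions; note that the same kept-feature indices must also be applied to any new vector before it reaches the classification device.

```python
def cull_low_variance_features(vectors, threshold=1e-9):
    # Keep only the feature dimensions whose variance across the
    # training set exceeds the threshold; return the kept indices and
    # the dimensionally reduced vectors.
    n = len(vectors)
    keep = []
    for d in range(len(vectors[0])):
        column = [v[d] for v in vectors]
        mean = sum(column) / n
        variance = sum((x - mean) ** 2 for x in column) / n
        if variance > threshold:
            keep.append(d)
    return keep, [tuple(v[d] for d in keep) for v in vectors]

# third coordinate ("hair color", coded as a constant) is irrelevant
vectors = [(150, 45, 1), (160, 55, 1), (170, 65, 1), (155, 90, 1)]
kept, reduced = cull_low_variance_features(vectors)  # kept == [0, 1]
```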
- Classification device 40 preferably is physically separate from training device 30 and is reversibly operationally connected to training device 30 only for the purpose of loading generic parameter set 24 into memory 42.
- the manufacturer of digital cameras mentioned above trains the generic classification algorithm using training device 30 and an appropriate set 22 of training vectors, and then equips each one of the cameras with its own classification device 40, implemented e.g. as a set of integrated circuits in a multi-chip package, with the parameter values of generic parameter set 24 loaded in its memory 42.
- FIG 4 illustrates an alternative construction of classification device 40.
- Classification device 40 of Figure 4 lacks a memory 42. Instead, memory 42 is included in a physically separate memory device 50.
- Classification device 40 and memory device 50 include respective interfaces 46 and 52 that enable memory 42 to be reversibly operationally connected to classification device 40.
- training device 30 includes a similar interface to enable memory 42 to be reversibly operationally connected to training device 30. After training device 30 has determined generic parameter set 24, memory device 50 is operationally connected to training device 30 and generic parameter set 24 is loaded into memory 42. Then memory device 50 is disconnected from training device 30 and is connected to classification device 40 as shown in Figure 4.
- Because training device 30 preferably is a general purpose computer, it is easy to replace the training classification algorithms and the generic classification algorithm with improved algorithms. This replacement is most conveniently done by downloading the new algorithms from an external source such as the Internet.
- Because system 20 is self-contained and allows a user to implement near-optimal classification without the assistance of specialists, it also allows the user to maintain confidentiality of training set 22.
- An alternative embodiment of the present invention lacks the generic classification algorithm. Instead, both training device 30 and classifying mechanism 44 share the same set of classification algorithms. Training device 30 selects the classification algorithm that best classifies training set 22, as above. Then, instead of selecting parameters for a generic classification algorithm, training device 30 prepares a bit string that indicates which of the classification algorithms is the best algorithm. This bit string is transferred to classification device 40, which therefore subsequently knows which of its classification algorithms to use to classify new feature vectors. Along with this bit string, training device 30 sends classification device 40 a set of parameters that defines for classification device 40 the feature space that the best algorithm determined. For example, if the best algorithm is a least squares algorithm then training device 30 sends classification device 40 the parameters of a hypersurface analogous to line 12 of Figure 1.
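The indicator transferred in this alternative embodiment might be packed as sketched below. The algorithm list, the JSON container, and the slope/intercept parameters (playing the role of the hypersurface analogous to line 12) are all hypothetical names chosen for illustration.

```python
import json

# Candidate algorithms shared by training device 30 and mechanism 44
ALGORITHMS = ["least_squares", "nearest_neighbor", "svm"]

def pack_selection(name, boundary_params):
    # Encode a bit string indicating the chosen algorithm, together
    # with the parameters defining the partition it determined.
    indicator = format(ALGORITHMS.index(name), "02b")
    return json.dumps({"indicator": indicator, "params": boundary_params})

def unpack_selection(blob):
    # Classification device 40 recovers which algorithm to run and
    # the boundary parameters to run it with.
    msg = json.loads(blob)
    return ALGORITHMS[int(msg["indicator"], 2)], msg["params"]

blob = pack_selection("least_squares", {"slope": 0.9, "intercept": -60})
name, params = unpack_selection(blob)
```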
- Figures 2 and 4 in addition to illustrating the first embodiment of the present invention, also serve to illustrate this alternative embodiment, with the understanding that block 38 is deleted and that "generic parameter set" 24 now includes the bit string that indicates to classification device 40 which classification algorithm to use and the parameters that define the feature space that was determined by the best algorithm.
- one of the strengths of the present invention is its ability to enable non-specialists to perform near-optimal classification of vectors in feature spaces of high dimension. There also are low-dimension cases that, because of their complexity, benefit from the present invention.
- the facility is equipped with three biometric authentication devices.
- the first biometric authentication device measures the iris patterns of people who seek access to the facility.
- the second biometric authentication device measures the facial features of people who seek access to the facility.
- the third biometric authentication device measures fingerprints of people who seek access to the facility.
- Each biometric authentication device also reads an identity card of a person seeking access to the facility (which identity card, of course, must be an identity card of a person who is authorized to have access to the facility), compares its biometric measurement to a corresponding measurement in a database of such measurements made on people with authorized access, and produces a number representative of the probability that the person seeking access is the person identified by the identity card.
- the facility manager wants to combine the three biometric measurements in order to minimize false positives and false negatives.
- the present invention allows the facility manager to do this without being or hiring a classification specialist.
- the probability produced by each biometric authentication device is a value of a corresponding feature in a three-dimensional feature space.
- the facility manager generates a training set 22 for system 20 by assembling a suitably large and varied population of people and by using the biometric authentication devices to make many measurements of respective biometric signatures of each member of the population. For each member of the population, one of these measurements is designated as a reference measurement, and the remaining measurements are transformed into corresponding training vectors by combining them with the reference measurement of that member of the population. These training vectors are classified as "access authorized”. Then the remaining measurements of that member of the population are transformed into another set of corresponding training vectors by combining them with the reference measurement of a different member of the population who is selected at random. These training vectors are classified as "access denied”. System 20 then is trained and implemented as described above.
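The training-set construction just described can be sketched as follows. The patent does not specify how a measurement is "combined" with a reference measurement, so the per-device absolute differences used by the hypothetical `combine` helper below, and the toy three-device scores, are assumptions.

```python
import random

def combine(measurement, reference):
    # Hypothetical combination rule: per-device absolute differences
    # between a fresh measurement and a reference measurement.
    return tuple(abs(a - b) for a, b in zip(measurement, reference))

def make_training_set(measurements_by_person, rng=random.Random(0)):
    people = list(measurements_by_person)
    training = []
    for person in people:
        # first measurement is designated the reference measurement
        reference, *rest = measurements_by_person[person]
        for m in rest:
            # combined with own reference -> "access authorized"
            training.append((combine(m, reference), "access authorized"))
            # combined with a random other person's reference -> denied
            other = rng.choice([p for p in people if p != person])
            other_ref = measurements_by_person[other][0]
            training.append((combine(m, other_ref), "access denied"))
    return training

# toy per-person lists of (iris, face, fingerprint) match scores
data = {"alice": [(0.9, 0.8, 0.95), (0.85, 0.82, 0.9)],
        "bob": [(0.4, 0.5, 0.45), (0.42, 0.48, 0.5)]}
training_set = make_training_set(data)
```

The resulting labeled three-dimensional vectors are then fed to system 20 as set 22, exactly as in the first embodiment.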
- the present invention in addition to being useful to users who lack the expertise to develop classification algorithms that are optimized for their own specific applications, also is useful to users who do have such expertise.
- the present invention by its generic nature, spares such a user the time and expense of developing and manufacturing a classification device 40 that is custom-tailored to that user's specific needs, even if the user is capable of doing so.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IL168091A IL168091A (en) | 2005-04-17 | 2005-04-17 | Generic classification system |
PCT/IL2006/000470 WO2006111963A2 (en) | 2005-04-17 | 2006-04-11 | Generic classification system |
Publications (2)
Publication Number | Publication Date |
---|---|
EP1872189A2 true EP1872189A2 (en) | 2008-01-02 |
EP1872189A4 EP1872189A4 (en) | 2010-03-03 |
Family
ID=37115560
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP06728271A Withdrawn EP1872189A4 (en) | 2005-04-17 | 2006-04-11 | Generic classification system |
Country Status (4)
Country | Link |
---|---|
US (1) | US20100049674A1 (en) |
EP (1) | EP1872189A4 (en) |
IL (1) | IL168091A (en) |
WO (1) | WO2006111963A2 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7088872B1 (en) * | 2002-02-14 | 2006-08-08 | Cogent Systems, Inc. | Method and apparatus for two dimensional image processing |
US8131477B2 (en) * | 2005-11-16 | 2012-03-06 | 3M Cogent, Inc. | Method and device for image-based biological data quantification |
US8275179B2 (en) * | 2007-05-01 | 2012-09-25 | 3M Cogent, Inc. | Apparatus for capturing a high quality image of a moist finger |
US8411916B2 (en) * | 2007-06-11 | 2013-04-02 | 3M Cogent, Inc. | Bio-reader device with ticket identification |
US20100014755A1 (en) * | 2008-07-21 | 2010-01-21 | Charles Lee Wilson | System and method for grid-based image segmentation and matching |
US10679749B2 (en) * | 2008-08-22 | 2020-06-09 | International Business Machines Corporation | System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet |
WO2012127577A1 (en) * | 2011-03-18 | 2012-09-27 | 富士通フロンテック株式会社 | Verification device, verification program, and verification method |
CN108737379A (en) * | 2018-04-19 | 2018-11-02 | 河海大学 | A kind of big data transmission process algorithm |
CN109145554A (en) * | 2018-07-12 | 2019-01-04 | 温州大学苍南研究院 | A kind of recognition methods of keystroke characteristic abnormal user and system based on support vector machines |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5142593A (en) * | 1986-06-16 | 1992-08-25 | Kabushiki Kaisha Toshiba | Apparatus and method for classifying feature data at a high speed |
US6678548B1 (en) * | 2000-10-20 | 2004-01-13 | The Trustees Of The University Of Pennsylvania | Unified probabilistic framework for predicting and detecting seizure onsets in the brain and multitherapeutic device |
US20020165839A1 (en) * | 2001-03-14 | 2002-11-07 | Taylor Kevin M. | Segmentation and construction of segmentation classifiers |
US6879709B2 (en) * | 2002-01-17 | 2005-04-12 | International Business Machines Corporation | System and method for automatically detecting neutral expressionless faces in digital images |
US6938049B2 (en) * | 2002-06-11 | 2005-08-30 | The Regents Of The University Of California | Creating ensembles of decision trees through sampling |
US7146050B2 (en) * | 2002-07-19 | 2006-12-05 | Intel Corporation | Facial classification of static images using support vector machines |
US7073013B2 (en) * | 2003-07-03 | 2006-07-04 | H-Systems Flash Disk Pioneers Ltd. | Mass storage device with boot code |
US7319779B1 (en) * | 2003-12-08 | 2008-01-15 | Videomining Corporation | Classification of humans into multiple age categories from digital images |
- 2005
  - 2005-04-17 IL IL168091A patent/IL168091A/en not_active IP Right Cessation
- 2006
  - 2006-04-11 US US11/911,722 patent/US20100049674A1/en not_active Abandoned
  - 2006-04-11 WO PCT/IL2006/000470 patent/WO2006111963A2/en active Application Filing
  - 2006-04-11 EP EP06728271A patent/EP1872189A4/en not_active Withdrawn
Non-Patent Citations (5)
Title |
---|
A. K. JAIN, R. P. W. DUIN, J. MAO: "Statistical pattern recognition: a review" IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 22, no. 1, January 2000 (2000-01), pages 4-37, XP000936788 * |
L. I. KUNCHEVA: "Classifier selection" COMBINING PATTERN CLASSIFIERS: METHODS AND ALGORITHMS, July 2004 (2004-07), pages 189-202, XP007911049 * |
L. I. KUNCHEVA: "Multiple classifier systems" COMBINING PATTERN CLASSIFIERS: METHODS AND ALGORITHMS, July 2004 (2004-07), pages 101-110, XP007911048 * |
S. DZEROSKI, B. ZENKO: "Is combining classifiers with stacking better than selecting the best one?" MACHINE LEARNING, vol. 54, no. 3, March 2004 (2004-03), pages 255-273, XP019213403 * |
See also references of WO2006111963A2 * |
Also Published As
Publication number | Publication date |
---|---|
EP1872189A4 (en) | 2010-03-03 |
IL168091A (en) | 2010-04-15 |
WO2006111963A3 (en) | 2007-05-31 |
US20100049674A1 (en) | 2010-02-25 |
WO2006111963A2 (en) | 2006-10-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
| 17P | Request for examination filed | Effective date: 20071112 |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
| AX | Request for extension of the european patent | Extension state: AL BA HR MK YU |
| RIC1 | Information provided on IPC code assigned before grant | Ipc: G06E 1/00 20060101AFI20080207BHEP |
| DAX | Request for extension of the european patent (deleted) | |
| A4 | Supplementary search report drawn up and despatched | Effective date: 20100202 |
| 17Q | First examination report despatched | Effective date: 20100609 |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20101020 |