US20210142436A1 - Event-driven authentication of physical objects - Google Patents
Event-driven authentication of physical objects
- Publication number
- US20210142436A1 (Application No. US17/125,437)
- Authority
- US
- United States
- Prior art keywords
- authentication
- event
- digital
- unusual
- authentication action
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
- G06F16/43—Querying
- G06F16/432—Query formulation
- G06F16/434—Query formulation using image data, e.g. images, photos, pictures taken by a user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G06K9/00201—
-
- G06K9/6215—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0021—Image watermarking
- G06T1/0028—Adaptive watermarking, e.g. Human Visual System [HVS]-based watermarking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/60—Memory management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
Definitions
- Centralized databases storing digital fingerprints of objects enabling enhanced security, rapid searching, and high reliability.
- Event-triggered authentication of objects utilizing digital fingerprints is particularly useful in this context.
- FIG. 1 is an example of an authentication region and fingerprint template definition for a U.S. passport.
- FIG. 2 is a simplified flow diagram of a process for authentication of a physical object based on digital fingerprinting.
- FIG. 3 is a simplified flow diagram of a process for authentication of a previously fingerprinted object.
- FIG. 4A shows an image of the numeral “3” representing the first digit in a serial number of an “original” or known U.S. dollar bill.
- FIG. 4B shows an image of the numeral “3” representing the first digit in a serial number of a U.S. dollar bill to be authenticated.
- FIG. 5A is an illustration of results of feature extraction showing selected locations of interest in the image of FIG. 4A .
- FIG. 5B is an illustration of results of feature extraction showing selected locations of interest in the image of FIG. 4B .
- FIG. 6A shows the same dollar bill image as in FIG. 4A , juxtaposed with FIG. 6B for comparison.
- FIG. 6B shows an image of the numeral “3” that has been damaged or degraded.
- FIG. 7A shows detail of two fingerprint feature locations on the numeral 3.
- FIG. 7B shows detail of the damaged bill with the corresponding fingerprint feature locations called out for comparison.
- FIG. 8 is a simplified illustration of a rotational transformation in the process of comparing digital fingerprints of two images.
- FIG. 9 is a simplified flow diagram of an induction-authentication process.
- FIG. 10 is a simplified flow diagram of an in-field induction process to enable tracing an object.
- FIG. 11 is a simplified hybrid system/communication diagram illustrating several different arrangements and applications of the present disclosure.
- FIG. 12 is a simplified flow diagram of one example of a process in accordance with the present disclosure for event-triggered authentication.
- Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first machine could be termed a second machine, and, similarly, a second machine could be termed a first machine, without departing from the scope of the inventive concept.
- Eliminating the need to add extrinsic identifiers or any physical modifications to an object offers a number of advantages to manufacturers, distributors, buyers, sellers, users, and owners of goods. Forgoing the addition of extrinsic identifiers reduces the cost of manufacturing and offers greater security than physical tagging. Moreover, physical identifiers can be damaged, lost, modified, stolen, duplicated, or counterfeited whereas digital fingerprints cannot.
- a system in accordance with the present disclosure utilizes the extraction of features to identify and authenticate objects.
- Feature extraction enables users to take a large amount of information and reduce it to a smaller set of data points that can be processed more efficiently. For example, a large digital image that contains tens of thousands of pixels may be reduced to a few locations of interest that can be used to identify an object. This reduced set of data is called a digital fingerprint.
- the digital fingerprint contains a set of fingerprint features or locations of interest which are typically stored as feature vectors.
- Feature vectors make image processing more efficient and reduce storage requirements as the entire image need not be stored in the database, only the feature vectors need to be stored.
- Examples of feature extraction algorithms include—but are not limited to—edge detection, corner detection, blob detection, wavelet features, Gabor, gradient and steerable output filter histograms, scale-invariant feature transformation, active contours, shape contexts, and parameterized shapes.
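- As a non-limiting illustration of this reduction, the sketch below uses OpenCV's ORB detector as a stand-in for any of the listed algorithms, turning an image into a small set of locations of interest and their feature vectors; the function name and feature budget are illustrative assumptions, not the implementation described here.

```python
# Minimal sketch: reduce an image to locations of interest plus feature
# vectors, so the full pixel data need not be stored. ORB stands in for
# any of the extraction algorithms named above.
import cv2
import numpy as np

def extract_digital_fingerprint(image_path: str, max_features: int = 200) -> dict:
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise FileNotFoundError(image_path)
    orb = cv2.ORB_create(nfeatures=max_features)
    keypoints, descriptors = orb.detectAndCompute(image, None)
    return {
        "locations": [kp.pt for kp in keypoints],    # (x, y) per location of interest
        "feature_vectors": np.asarray(descriptors),  # one vector per location
    }
```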
- While the most common applications of the system may be in the authentication of physical objects such as manufactured goods and documents, the system is designed to be applicable to any object that can be identified, characterized, quality tested, or authenticated with a digital fingerprint. These include but are not limited to mail pieces, parcels, art, coins, currency, precious metals, gems, jewelry, apparel, mechanical parts, consumer goods, integrated circuits, firearms, pharmaceuticals, and food and beverages.
- The term "system" is used in a broad sense, including the methods of the present disclosure as well as apparatus arranged to implement such methods.
- The term "scan" is used in the broadest sense, referring to any and all means for capturing an image or set of images, which may be in digital form or transformed into digital form. Images may, for example, be two dimensional, three dimensional, or in the form of a video. Thus a "scan" may refer to an image (or digital data that defines an image) captured by a scanner, a camera, a specially adapted sensor or sensor array (such as a CCD array), a microscope, a smartphone camera, a video camera, an x-ray machine, a sonar, an ultrasound machine, a microphone (or other instruments for converting sound waves into electrical energy variations), etc.
- Broadly, any device that can sense and capture either electromagnetic radiation or a mechanical wave that has traveled through an object or reflected off an object, or any other means to capture the surface or internal structure of an object, is a candidate to create a "scan" of an object.
- Various means to extract “fingerprints” or features from an object may be used; for example, through sound, physical structure, chemical composition, or many others. The remainder of this application will use terms like “image” but when doing so, the broader uses of this technology should be implied. In other words, alternative means to extract “fingerprints” or features from an object should be considered equivalents within the scope of this disclosure.
- scanner and “scanning equipment” herein may be used in a broad sense to refer to any equipment capable of carrying out “scans” as defined above, or to equipment that carries out “scans” as defined above as part of their function.
- The term "authentication" is not limited to specifically describing successful matching of inducted objects or generally describing the outcome of attempted authentications.
- a counterfeit object may be described as “authenticated” even if the “authentication” fails to return a matching result.
- An action described as "authentication" or "attempted authentication" may also, post facto, be properly described as an "induction".
- An authentication of an object may refer to the induction or authentication of an entire object or of a portion of an object.
- a chosen region may be the image of the entire object; in other embodiments chosen regions may be one or more sub-regions of the image of the object.
- a digital image of the entire photograph may be chosen for feature extraction.
- Each photograph is different and there may be unique feature information anywhere in a photograph.
- the authentication region may be the entire photograph.
- multiple regions may be used for fingerprinting.
- a template may be used (see FIG. 1 ) to define regions of interest, including elimination of regions of little interest.
- For an object such as a bank note, the object may be deemed authenticated if a few small arbitrary regions scattered across the surface are fingerprinted, possibly combined with one or more recognitions of, for example, the contents of a region signifying the value of the bank note or one containing the bank note serial number.
- the fingerprints of any region may be considered sufficient to establish the authenticity of the bill.
- multiple fingerprinted regions may be referenced in cases where one or more region may be absent from an object (through, for example, tearing) when, for example, a bank note is presented for authentication. In other embodiments, however, all regions of an object may need to be authenticated to ensure an object is both authentic and has not been altered.
- a passport may provide an example of feature extractions from multiple authentication regions; see FIG. 1 .
- features chosen for authentication may be extracted from regions containing specific identification information such as the passport number, the recipient name, the recipient photo, etc., as illustrated in FIG. 1 .
- A user may define a feature template specifying the regions whose alteration from the original would invalidate the passport, such as the photo, identifying personal data, or other regions considered important by the user. More details of feature templates are given in Ross, et al., U.S. Pat. No. 9,443,298.
- FIG. 1 illustrates one example of an authentication region and a fingerprint template definition for a U.S. passport.
- In FIG. 1, brace 101 refers to a simplified flow diagram of a process as follows: At process block 102, an object is scanned to generate an "original image", that is, a digital image file or a digital data file in any suitable format that is herein simply referred to as an "image". The original image is illustrated as the data page spread of a U.S. passport book, at block 150.
- the system processes the image data to determine an authentication region.
- the authentication region is the biographic data page of the U.S. Passport, located in the lower portion of image 150 , identified by dashed box 154 .
- the process generates an authentication image for feature extraction, block 106 .
- the authentication image is illustrated at reference 156 .
- the process defines one or more locations of interest for feature vector extraction.
- the locations of interest in this example are, as shown in image 158 by dashed boxes 160 , the surname, the given name, the passport number, and the passport photo.
- the process 100 comprises creating a fingerprint template 120 .
- template 120 identifies an object class (U.S. Passport), defines an authentication region (for example, by X-Y coordinates), and lists one or more locations of interest within that authentication region.
- the list comprises passport number, photo, first name, and last name.
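- A fingerprint template of this kind can be rendered as a small data structure; the sketch below is a hypothetical schema for the FIG. 1 example, where the field names and coordinate values are illustrative assumptions rather than the format actually used by the system.

```python
# Hypothetical rendering of the FIG. 1 fingerprint template as plain data.
from dataclasses import dataclass, field

@dataclass
class FingerprintTemplate:
    object_class: str                     # e.g. "US Passport"
    authentication_region: tuple          # X-Y bounds: (x_min, y_min, x_max, y_max)
    locations_of_interest: list = field(default_factory=list)

passport_template = FingerprintTemplate(
    object_class="US Passport",
    authentication_region=(0, 1200, 1654, 2339),   # illustrative coordinates
    locations_of_interest=["passport number", "photo", "first name", "last name"],
)
```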
- an ability to define and store optimal authentication regions for classes of objects may offer benefits to a user.
- a location box and crosshairs may automatically appear in the viewfinder of a smartphone camera application, to help the user center the camera on an authentication region, and automatically lock onto a region and complete a scan when the device is focused on an appropriate area.
- scanning may be of any kind, including 2-D, 3-D, stereoscopic, HD, etc. and is not limited to the use of visible light or to the use of light at all (as previously noted, sonar and ultrasound are, for example, appropriate scanning technologies).
- objects may have permanent labels or other identifying information attached to them.
- these attachments may also be referenced as features for digital fingerprinting, particularly where the label or other identifying information becomes a permanent part of the object.
- a permanent label may be used as an authentication region for the object to which it is affixed.
- a label may be used in conjunction with the object itself to create a fingerprint of multiple authentication regions referencing both a label and an object to which the label is affixed.
- Wine may be put into a glass bottle and a label affixed to the bottle. Since it is possible that a label may be removed and re-applied elsewhere, merely using the label itself as an authentication region may not be sufficient.
- the authentication region may be defined so as to include both a label and a substrate it is attached to—in this example some portion of a label and some portion of a glass bottle.
- This “label and substrate” approach may be useful in defining authentication regions for many types of objects, such as various types of goods and associated packaging.
- authentication may reveal changes in the relative positions of some authentication regions such as in cases where a label has been moved from its original position, which may be an indication of tampering or counterfeiting. If an object has “tamper-proof” packaging, this may also be included in the authentication region.
- multiple authentication regions may be chosen from which to extract unique features.
- multiple authentication regions may be selected to enable the separate authentication of one or more components or portions of an object.
- features may be extracted from two different parts of a firearm. Both features may match the original firearm but since it is possible that both parts may have been removed from the original firearm and affixed to a weapon of different quality, it may also be useful to determine whether the relative positions of the parts have changed. In other words, it may be helpful to determine that the distance (or other characteristics) between Part A's authentication region and Part B's authentication region remains consistent with the original feature extraction. If the positions of Parts A and B are found to be consistent to the relative locations of the original authentication regions, the firearm may be authenticated. Specifications of this type may be stored with or as part of a digital fingerprint of an object.
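- The relative-position check described above reduces to a simple distance comparison; the sketch below assumes each part's authentication region is summarized by a centroid and that the original inter-region distance was stored with the digital fingerprint. The tolerance value is an assumption for illustration.

```python
# Sketch: verify that the distance between Part A's and Part B's
# authentication regions remains consistent with the original extraction.
import math

def positions_consistent(centroid_a, centroid_b, original_distance, tolerance=0.02):
    current = math.dist(centroid_a, centroid_b)   # Euclidean distance between centroids
    return abs(current - original_distance) <= tolerance * original_distance
```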
- the system can create a fingerprint template (as shown in FIG. 1 ) that can be used to control subsequent authentication operations for that class of objects.
- This template may be created either automatically by the system or by a human-assisted process.
- a fingerprint template is not required for the system to authenticate an object, as the system can automatically extract features and create a digital fingerprint of an object without it.
- the presence of a template may optimize the authentication process and add additional functionality to the system.
- FEATURES: Feature 1 . . . Feature n
- METHODS: [Programs that can be run on features of an object] Feature 2 (Photo), Method 1: [checkphoto.exe] Check for uneven edges indicating photo substitution . . . Feature n, Method n
- ADDITIONAL DATA: [Additional data associated with the object]
- the uses of the fingerprint template include but are not limited to determining the regions of interest on an object, the methods of extracting fingerprinting and other information from those regions of interest, and methods for comparing such features at different points in time.
- the name “fingerprint template” is not important; other data with similar functionality (but a different name) should be considered equivalent.
- an object is fingerprinted preferably during the creation process (or at any time when its provenance may be sufficiently ascertained) or at a point where an expert has determined its authenticity. Subsequently, the object is later re-fingerprinted, and the two sets of fingerprints are compared to establish authenticity of the object.
- the fingerprints may be generated by extracting a single fingerprint from the entire object or by extracting multiple sets of features from multiple authentication regions. Fingerprinting may also involve reading or otherwise detecting a name, number, or other identifying characteristics of the object using optical character recognition or other means which may be used to expedite or facilitate a comparison with other fingerprints.
- serial numbers or other readable identifiers may be utilized to directly access the database record for the object and compare its digital fingerprint to the original that was previously stored, rather than searching an entire digital fingerprinting database for a match.
- In case (2), a fingerprinted object is compared, region by region, with a digital fingerprint of an original object to detect low or nonexistent matching of the fingerprint features from those regions. While case (1) is designed to determine whether the original object is now present, case (2) is designed to detect whether the original object has been altered and, if so, how it has been altered. In some embodiments, authentication regions having poor or no matching fingerprint features will be presumed to have been altered.
- an object may not have been fingerprinted while its provenance was sufficiently ascertainable.
- One example would be bills or passports created prior to initiating the use of a digital fingerprinting system.
- digital fingerprints of certain regions of interest on an object may be compared with digital fingerprints from known, or suspected, counterfeit objects or with both those and fingerprints of properly authenticated objects.
- a photograph may be spuriously added to a passport and, as an artifact of the counterfeiting, the edge of the added photo may tend to be sharper than an edge of an original, unaltered, photograph.
- Fingerprint characteristics of known authentic passports and those of passports that are known (or suspected) to have been altered by changing a photograph may be compared with the passport being inspected to estimate whether the passport exhibits indications of alteration.
- the digital image which will be used to create the unique digital fingerprint for the object, is generated.
- the digital image (or set of images) provides the source information for the feature extraction process.
- a digital fingerprinting feature is defined as a feature or a location of interest in an object, which feature is inherent to the object itself.
- features preferably are a result of a manufacturing process, other external processes, or of any random, pseudo-random, or deliberate process or force, such as use.
- Gemstones have a crystal pattern which provides an identifying feature set. Every gemstone is unique, and every gemstone has a series of random flaws in its crystal structure. This pattern of random flaws may be used for the extraction of feature vectors for identification and authentication.
- a “feature” is not necessarily concerned with reading or recognizing meaningful content, for example by using methods like optical character recognition.
- a digital fingerprint of an object may capture both features of the object and features of any identifiers that are affixed or attached to the object.
- Feature vectors extracted from authentication regions located on an affixed identifier are based on the substances of which the identifier is physically comprised rather than the information (preferably alphanumeric) that is intended to be communicated by the identifier. For instance, in the case of a wine bottle, features may be captured from the bottle and from a label affixed to the bottle.
- the paper of the label and the ink pattern of the bar code may be used to extract a feature vector without reading the alphanumeric information reflected by the bar code.
- An identifier, such as a UPC bar code print consisting of lines and numbers, has no greater significance in the generation and use of a feature vector than a set of randomly printed lines and numbers.
- identifier information such as a name, serial number, or a bar code
- the system may allow the user to capture such information and store it in the digital fingerprint.
- Identifier information may, for example, be read and stored by utilizing techniques such as optical character recognition, and may be used to facilitate digital fingerprint comparisons.
- Serial numbers may be used as the primary index into a database that may also contain digital fingerprints. There may be practical reasons for referencing serial numbers in relation to digital fingerprints. In one example, a user is seeking to determine whether a bank note is a match with a particular original.
- the user may be able to expedite the comparison by referencing the bank note serial number as an index into the digital fingerprinting database rather than iterating through a large quantity of fingerprints.
- the index recognition may speed up the comparison process but it is not essential to it.
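- The indexing idea can be sketched as follows; the dictionary-based store and the `match` callable are assumptions for illustration. The index gives direct access to one candidate record, with an exhaustive scan only as a fallback.

```python
# Sketch: serial number as primary index into a fingerprint database.
fingerprint_db = {}   # serial number -> stored digital fingerprint record

def find_match(serial_number, candidate_fingerprint, match):
    record = fingerprint_db.get(serial_number)
    if record is not None:
        return match(record, candidate_fingerprint)   # one targeted comparison
    # Index miss: fall back to iterating the whole database. The index
    # speeds up the comparison but is not essential to it.
    return any(match(r, candidate_fingerprint) for r in fingerprint_db.values())
```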
- the digital fingerprint may be stored or registered in a database.
- the digital fingerprint may comprise one or more fingerprint features which are stored as feature vectors.
- the database should preferably be secure.
- a unique identifier such as a serial number
- assigning a unique identifier is not essential as a digital fingerprint may itself serve as a key for searching a database independent of any addition of a unique identifier.
- Because a digital fingerprint of an object identifies the object by the unique features and characteristics of the object itself, the digital fingerprint renders unnecessary the use of arbitrary identifiers such as serial numbers or other labels and tags, etc.
- FIG. 2 represents an example of a simplified flow diagram of a process 200 for authenticating or identifying an object using digital fingerprinting using a U.S. passport for illustration for part of the process.
- the process begins with scanning the object, block 202 .
- An image 250 is acquired, in this illustration the front page of a U.S. passport is used.
- the next step is to determine a class of the object, block 204 .
- This step may be omitted where the class is known.
- a station may be in use that only checks U.S. passports.
- the system may be at a passport printing facility.
- the class of objects may be known a priori.
- a database query may be conducted to see if a template exists in the system for the object that was scanned at 202 .
- the initial image may be processed to extract a serial number or other identifying information.
- the database may then be interrogated; decision 206 , to see if a template exists for that serial number. If the answer is YES, path 208 , the system accesses the template 212 and uses it to select one or more authentication regions 210 .
- The template 212 lists the regions and their respective locations in the image (i.e., on the passport front page in this example). Physical locations may, as an example, be specified relative to a given location, and/or relative to each other.
- the template guides the authentication software in analyzing the image data. In that analysis, for each authentication region (called a “Feature” in 212 ), various features are extracted from the image data, block 222 .
- each feature may be described by a feature vector. Location and other data and metadata may be included in the fingerprint.
- the process for extracting features and describing them in feature vectors may be specified in the template.
- the template may also specify which regions must be matched to declare the passport a match. In the passport example, all specified regions must match a record in the database for the passport to be determined to be authentic and unaltered. In other cases, a few matches may be sufficient.
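- Such region-match policies are straightforward to express in code; the sketch below assumes each authentication region comparison has already produced a boolean result, and shows both the strict passport rule and a more lenient "k of n" rule. The function names and the minimum count are illustrative assumptions.

```python
# Sketch: region-match policies a fingerprint template might specify.
def strict_match(region_results):
    # Passport case: every specified region must match a database record.
    return all(region_results.values())

def lenient_match(region_results, minimum=2):
    # Other cases: a few matching regions may be sufficient.
    return sum(region_results.values()) >= minimum
```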
- the digital fingerprint generated at block 224 is then used to query a reference database 230 for a match.
- Returning to decision 206, there may not be an existing template in the system for the object under inspection; this is the NO branch, labeled "Non-Template Object Class."
- the process here may vary with the type of object under inspection and the purpose for the inspection.
- a scanned image of an object may be processed to find locations of interest, block 232 , for example, surface areas that are non-homogenous and thus have considerable image data content.
- finding locations of interest may be automated or semi-automated.
- the locations may be used to extract features, block 234 and/or recorded in a template for later use.
- locations should be recorded in, or otherwise associated with, the digital fingerprint of the object.
- user input may be used to select authentication regions, and then the process proceeds to 234 as before.
- an entire object may be scanned and all of the data processed to find and record digital fingerprint data.
- the process proceeds to create a digital fingerprint, block 236 , which can then be used to query the database 230 for a match.
- the match result may not be binary (yes/no); rather, in many cases, the result may indicate a confidence level of a match or may be a composite of binary results or confidence levels—such as when an object has been altered in part or in whole and/or has been assembled, or disassembled.
- an object is scanned and an image is generated.
- the steps that follow depend on the operation to be performed. Several illustrative example cases are discussed below.
- Case 1 For authentication of a previously fingerprinted object, the following steps may be followed (see FIG. 3 , discussed below):
- FIG. 3 illustrates such a process 300 in diagrammatic form.
- the process scans an object and creates an authentication image, block 304 .
- the image is represented at 350 , using a passport as an example.
- Features are extracted, block 306 , and optionally, other information, such as a serial number or similar ID number, preferably unique, may be extracted as well, block 310 .
- the extracted data is processed to generate a digital fingerprint, block 312 .
- a database 320 may be queried for a matching fingerprint, block 314 .
- a “match” may be defined by a binary, probability, or similarity metric or be a composite of metrics.
- Results of the database query may be reported to a user, block 322 .
- a new digital fingerprint may be added to the database 320 , shown at process block 330 .
- Case 2 For inspection of specific features of a previously fingerprinted object to determine whether they have been altered, the steps are similar to Case 1, but the process is aimed at detection of alterations rather than authentication of the object:
- the system is arranged to look up and match objects in the database when there is a “near miss.” For example, two feature vectors [0, 1, 5, 5, 6, 8] and [0, 1, 6, 5, 6, 8] are not identical but by applying an appropriate difference metric the system can determine that they are close enough to say with a degree of certainty that they are from the same object that has been seen before.
- One example would be to calculate Euclidean distance between the two vectors in multi-dimensional space, and compare the result to a threshold value. This is similar to the analysis of human fingerprints. Each fingerprint taken is slightly different, but the identification of key features allows a statistical match with a high degree of certainty.
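- The worked example below applies that metric to the two vectors above; the threshold value is a tuning assumption for illustration, not a value taken from this disclosure.

```python
# Near-miss matching by Euclidean distance in multi-dimensional space.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

v1 = [0, 1, 5, 5, 6, 8]
v2 = [0, 1, 6, 5, 6, 8]
THRESHOLD = 2.0                          # illustrative tolerance

print(euclidean(v1, v2))                 # 1.0: the vectors differ in one component
print(euclidean(v1, v2) <= THRESHOLD)    # True: close enough to call a match
```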
- FIG. 4A illustrates an image of the numeral “3” representing a number printed on an “original” or known U.S. dollar bill.
- the bill may have been fingerprinted, for example, at the time of manufacture or public release, as described herein, or otherwise sufficiently authenticated for use as a reference.
- fingerprint databases of currency and the like may be secured. Such databases preferably exclude raw image data. This image, on the order of about 40-fold magnification, shows a number of distinctive features visible to the naked eye.
- FIG. 4B illustrates an image of a number printed on a second or unknown U.S. dollar bill.
- the second bill may be fingerprinted using the same process, and then the resulting digital fingerprints, i.e., the respective fingerprint feature vectors, may be compared as further explained below, to determine whether or not the second bill is in fact the same one as the first bill. The comparison may take place even though the bill may have changed from wear and tear.
- FIG. 5A is a simplified illustration of the results of feature extraction applied to the numeral 3 of FIG. 4A .
- the locations of interest need not necessarily be circular, but circular areas are advantageous for many applications. Below is a discussion on how these areas may be selected in an image.
- Fingerprint feature extraction is applied to each of the circular locations of interest. The results for each location may be stored as fingerprint feature vectors.
- a “location of interest” (sometimes referred to as a “point” or “area” of interest), for example 1720 , may well be a physical feature on the object, but the “feature vector” that characterizes that location of interest is not just a variation on the image around that location; rather, the feature vector is derived from it by any of a number of possible means.
- a feature vector may be an array of numeric values.
- A collection of feature vectors, say for location 1750, may be stored as a feature vector array.
- FIG. 5B is a simplified illustration of the results of feature extraction applied to locations of interest on the numeral 3 of FIG. 4B .
- the same fingerprinting process may be applied to this image.
- the same locations of interest as in FIG. 5A are labeled 1720 and 1760 , respectively.
- The stored features are compared with the features extracted from the new object. If the locations of interest are not encountered in the second object, or if the feature vectors characterizing those locations of interest are too different, there is no match (or a low confidence level for a match) for that location of interest.
- Variables such as which locations must match and/or how many locations must match and/or the degree of matching required to conclude that an object matches the one previously fingerprinted, may in some embodiments be specified in a digital fingerprint record, further described below, or in some other associated record, to guide the decision process.
- This arrangement may be advantageous, for example, for exporting a database to a generic processor or system for remote authentication work.
- the matching logic may be embedded in the digital fingerprint record.
- the matching logic is implemented in software as part of an authentication system.
- FIG. 6A shows a numeral from the same dollar bill image as in FIG. 4A , juxtaposed with FIG. 6B for comparison.
- FIG. 6B shows the numeral on the same bill after the bill has been subjected to washing in a washing machine, perhaps as a result of being left in the pocket of a piece of clothing.
- In FIG. 6B, the image (or, rather, the dollar bill) has been degraded; there is significant loss of ink and destruction of the paper surface in multiple locations.
- a bitmapped approach to matching would likely fail to match these two figures due to the large number of pixels that are now different, as relatively few of the pixels remain the same as the original.
- FIG. 7A shows the detail of two fingerprint feature locations as before, 1610 and 1650 .
- FIG. 7B shows detail of the damaged bill with the corresponding locations called out as 1620 and 1660 , respectively.
- a comparison between the similarities of area 1610 to area 1620 and of area 1650 to area 1660 illustrates how a comparison of the corresponding fingerprint feature vectors would be adequate to result in a match. In practice, a much larger number of features would be used.
- the image of the damaged bill is analyzed by a processor.
- the processor accesses a database of previously stored fingerprint data. If the dollar bill serial number is legible (by eye or machine), the record for the corresponding bill may be accessed from the datastore using the serial number as an index. Similarly, if any portion of the serial number is legible, the search for a matching record can be narrowed on that basis. Either way, a candidate record, containing a set of stored regions of interest may be compared to the image of the damaged bill.
- the feature-based approach is able to address other external problems such as rotated images. This is especially important in a system where an unsophisticated user, such as a retail customer, may be scanning an object to be authenticated. In such cases, external factors like lighting and rotation may not be under the system operator's control.
- FIG. 8 shows the original image on the left side, with a small set of fingerprint features marked as small diamond shapes. The diamond is merely a callout symbol for illustration; in some embodiments, as noted, circular areas are preferred.
- a search is conducted of the suspect image on the right side of FIG. 8 (or a portion of it) for a matching feature.
- The position may not match exactly, due to "stretch" (an effective difference in magnification), due to rotation of the image, or due to other circumstances. Although the locations may not match literally, a mathematical transformation may be defined that maps one image to the other, thereby accounting for rotation and stretch as appropriate.
- a bounding rectangle A indicated by the box in the left side image may be mapped to a quadrilateral, indicated by the line B in the right-side image.
- mapping or transformation should be restricted depending on the type of object under inspection. For instance, some objects may be inflexible, which may restrict the possible deformations of the object.
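- One plausible way to recover such a mapping from matched feature locations is shown below, using OpenCV's RANSAC-based estimator; this is a sketch under assumed inputs, not the method claimed here. A partial affine model (rotation, translation, uniform scale) is a natural restriction for inflexible objects, whereas a full homography would permit more deformation.

```python
# Sketch: estimate the rotation/stretch mapping between original and
# suspect images from corresponding feature locations.
import cv2
import numpy as np

def estimate_mapping(original_points, suspect_points):
    src = np.asarray(original_points, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(suspect_points, dtype=np.float32).reshape(-1, 1, 2)
    matrix, inlier_mask = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return matrix   # 2x3 matrix mapping original locations onto the suspect image
```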
- the system preferably should provide sufficient imaging capability to show invariant features. Particulars will depend on the regions used for authentication. For many applications, 10-fold magnification may be adequate. For ink bleeds on passports, bills, and other high-value authentication, 40-fold magnification may likely be sufficient.
- the software should implement a flexible response to accommodate misalignment (rotation), misorientation, and scale changes. Color imaging and analysis is generally not required for using the processes described above, but may be used in some cases.
- FIG. 9 is a simplified diagram illustrating the concepts of induction and authentication.
- The term "induction" is used in a general manner to refer to entering an object or a set of objects into an electronic system for subsequently identifying, tracking, or authenticating the object, or for other operations.
- the object itself is not entered into the system in a physical sense; rather, induction refers to creating and entering information into a memory or datastore from which it can later be searched, interrogated, retrieved, or utilized in other kinds of database operations.
- induction 1802 thus may refer to a process that includes capturing an image of an object (or part of an object), processing the image to extract descriptive data, storing the extracted data, or any or all of these operations.
- the inducted object represented by a cube 1804 , then leaves the induction site, and proceeds in time and space along a path 1806 .
- Induction may be done at the point of creation or manufacture of the object, or at any subsequent point in time. In some cases, induction may be done clandestinely, such as without the knowledge of the person or entity currently having ownership and/or possession of an object.
- the term “possession” is used in the broadest sense to include, for example, actual physical possession, as well as control—for example, having they key to a secure physical storage where an object is kept.
- an object 1804 may encounter wear and tear, and otherwise may change, intentionally or not, in ways that may not be known a priori, represented by the question mark 1808 .
- the original object 1804 may even in fact be lost or stolen after induction and a counterfeit may be introduced.
- an object 1810 may be presented for authentication, represented by block 1820 .
- an agent may take a photograph of an object with a smartphone, without the knowledge or consent of the possessor of the object, and the resulting image may be utilized for induction and/or authentication as described herein.
- some part of the induction/ authentication process may be done remote from a facility intended for that purpose.
- some part of the induction/authentication process may be accomplished without the knowledge of the then-current possessor of an object.
- The induction and/or authentication may not be part of the current possessor's normal processes. These two criteria are not essential for the present disclosure, but are generally representative of some applications.
- FIG. 10 is a simplified flow diagram of one example of a process for creating a digital fingerprint that includes feature vectors based on a scanned image of an object.
- the process begins with initialization at block 2120 .
- This step may comprise initializing a datastore, calibrating an image capture system, or other preliminary operations.
- An object or set of objects is scanned, block 2122, forming digital image data.
- the scanning may be automated. In other cases, an operator may be involved in manual scanning.
- an authentication image is generated, block 2124 , which may comprise all or a selected subset of the scan data.
- a digital fingerprint record may be initialized, for example in a memory or datastore, block 2126 .
- At least one authentication region is selected, block 2130 , in the authentication image data. This selection preferably is carried out by the fingerprinting software.
- the authentication region(s) may be selected according to a predetermined template based on the class of objects. Locations of the authentication regions may be stored in the digital fingerprint record, block 2132 .
- a software process may automatically select a large number—typically hundreds or even thousands per square mm—of preferred locations of interest for purposes of the digital fingerprint.
- a location may be of interest because of a relatively high level of content. That “content” in a preferred embodiment may comprise a gradient or vector, including a change in value and a direction.
- the selected locations of interest may be added to the fingerprint record, block 2136 .
- such areas may be identified by a location or centroid, and a radius thus defining a circular region. Circular regions are preferred for some applications because they are not affected by rotation of the image.
- a fingerprint record may include first and second feature vectors (each describing a corresponding feature extracted from an area of interest) and a relative location of one to the other.
- the feature extraction may be repeated, block 2150 , using an adjusted area size or scale (such as magnification). Feature vectors created at the adjusted size may be added to the fingerprint, block 2152 . Additional features may be extracted at additional magnification values, until an adequate number are provided, decision 2154 . This additional data may be added to the fingerprint, block 2156 . This data may be helpful in finding a matching fingerprint where the authentication image magnification is not the same as the image at the time of induction of the object. Finally, and optionally, the scanned image itself (generated at 2122 ) may be added to the database, block 2158 . This process to build a digital fingerprint ends at 2160 .
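- The multi-scale loop of FIG. 10 might be sketched as follows; `rescale` and `extract_feature_vector` are hypothetical helpers, and the scale values are illustrative assumptions. Storing vectors at several magnifications helps match a later authentication image whose magnification differs from the induction image.

```python
# Sketch: build a digital fingerprint record across several scales.
def build_fingerprint(image, locations, scales=(1.0, 2.0, 4.0)):
    record = {"features": []}
    for scale in scales:
        scaled = rescale(image, scale)                   # hypothetical helper
        for (x, y, radius) in locations:                 # circular regions of interest
            vector = extract_feature_vector(scaled, x, y, radius)  # hypothetical helper
            record["features"].append({
                "location": (x, y),
                "radius": radius,
                "scale": scale,
                "vector": vector,
            })
    return record
```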
- Authentication may be conducted in response to a trigger; that is, authentication may be performed outside the normal steady functioning of a system (in contrast, for example, to inducting parts as they are manufactured and authenticating them as they are installed).
- Any form of event trigger (see the progression below) and any form of authentication using fingerprinting or similar technology are contemplated. The following are non-limiting examples of events that could serve as triggers; each of them could be utilized to trigger the kinds of authentication taught above in this document.
- Schedule-based triggering: In one example, this disclosure envisions a system where authentication is triggered on a schedule (e.g., as part of quarterly inventory, or two hours past closing time). Triggering on a schedule is close to being "part of the normal . . . functioning of the system" but is included for completeness in the spectrum of "event-driven authentication". This form would include normal calendaring but also following computer scripts, or even periodic, random, or occasional manual interrupts of normal processes.
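- A minimal scheduler for this form of triggering, using only the standard library, might look like the sketch below; the interval and the `run_authentication` callable are placeholders, not components of the disclosed system.

```python
# Sketch: schedule-based event trigger.
import time

def scheduled_trigger(run_authentication, interval_seconds=6 * 60 * 60):
    while True:
        run_authentication()          # e.g. fingerprint and verify inventory
        time.sleep(interval_seconds)  # wait until the next scheduled pass
```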
- FIG. 11 is a simplified hybrid system/communication diagram illustrating several different arrangements and applications of the present disclosure.
- a particular system may implement all of the features shown in FIG. 11 , or more typically, only a subset of them.
- an event-triggered system may be “local” in the sense of installation at one location, for example, at a parts manufacturer, or a shipping or warehouse facility. In a local installation, the remote sensors and internet connectivity may be unnecessary. In other applications, remote sensors, and remote authentication equipment may be used.
- an event trigger processor 2200 which may comprise any type of programmable digital processor, is arranged for various communications. Details of network interfaces, user interfaces, memory, etc. which are familiar in the industry, are omitted for clarity.
- one or more local sensors 2202 may be coupled to a network interface 2204 for communication with the event trigger processor 2200 via link 2206 , which may be wired or wireless. Output signals from the sensor(s) may be utilized by the ETP 2200 as triggers to initiate authentication actions, local or remote, as further explained below.
- the ETP may initiate various actions, responsive to a trigger input signal, for example, by sending a message to another entity or system, in particular an authentication system. Hence the title, “Event-Driven Authentication.”
- the ETP may command the actions, for example, using known network communication protocols.
- the ETP may send a message to a remote system to have it conduct an inventory of the warehouse, in part or in whole.
- the remote system may utilize appropriate scanning equipment to capture images for the inventory for fingerprinting.
- The processes illustrated by FIG. 11 include the use of sensors of all types, from RFID and thermal sensors to smart dust connected to the internet or computerized networks, to name but a few.
- one or more remote sensors 2210 may be coupled over a network, such as a LAN, WAN, or the internet 2212 , for connection to the ETP 2200 via a suitable network interface 2216 .
- output signals from the remote sensor(s) may be utilized by the ETP 2200 as triggers to initiate authentication actions, which again may be local or remote.
- other remote processes or systems 2230 may be similarly coupled over a network to communicate with the ETP 2200 .
- a piece of luggage is going down a conveyor (not shown) and is normally to be routed by reading the bag tag. It passes a bag tag reader, but this time the reader does not get a read.
- the bag tag reader may be a remote process or system 2230 coupled to the ETP 2200 .
- a tag reader failure message triggers a process or response in the ETP 2200 that initiates a full fingerprint-based authentication of the (previously inducted) luggage item.
- the authentication process may be performed in various ways, several of which are described in detail above.
- the ETP 2200 may direct a local field imaging system 2232 via a link 2234 .
- the ETP may be coupled directly to the local imaging system 2232 in some applications. In other cases, it may be communicatively coupled over a network.
- the local system 2232 may acquire image data of an object 2236 (for example, the aforementioned luggage item).
- the imaging system 2232 may interact via link 2236 with a fingerprint processing and storage system 2240 .
- the fingerprint system 2240 may include a digital fingerprint processor 2256 , a secure database server 2258 , and a fingerprint database 2260 described in more detail above.
- the fingerprinting system 2240 may be local or remote, for example, in the cloud. It may be coupled via link 2243 to the ETP 2200 .
- the triggered authentication process may be done remotely from the ETP 2200 .
- the ETP 2200 may communicate via interface 2216 and internet 2212 with a remote field image acquisition system 2242 .
- This system is configured for image capture for authentication (and optionally other purposes).
- the image system 2242 may be part of a larger manufacturing, assembly, or other operation.
- the image system 2242 may be integrated into other machinery, or it may stand alone.
- the image system 2242 may be operable by a robot 2250 to capture an image of an object 2248 for authentication.
- the robot 2250 may be mobile, for example, to move about a warehouse capturing images for inventory control.
- the robot may capture images, for example, following a door ajar or break-in trigger (detected by a sensor as described).
- the image system 2242 may work in concert with a fingerprint system such as 2240 , with which it may communicate over a network.
- authentication may be triggered by loading dock receipt of components missing an expected RFID tag or documentation.
- authentication may be triggered by sensors (as noted), or by rules or logic 2253 , which may be realized in the form of computer code or scripts, or by the physical presence of an unexpected item, or the absence of an expected one.
- the trigger processor may take an action based on a combination of inputs, processed according to the applicable rules and logic.
- This disclosure further includes authentication triggered by detection of another event—which event may or may not be directly related to the authentication process.
- Other events and processes 2222 may communicate with the ETP 2200 as illustrated or otherwise.
- One example is a conveyor that is carrying bags to their airplanes when a jam occurs. Currently this would mean that all those bags must, once the belt is restarted, be routed past a bag tag reader to reestablish each bag's identity. With a proposed embodiment, the system would immediately authenticate and locate each bag on the affected conveyor(s) so that when the jam is cleared, each bag can continue on its way without the need to reroute past a bag tag reader.
- an image system 2232 or 2242 may be configured to capture images of luggage items, responsive to direction from the ETP 2200 , which reacts to a jam sensor signal (from, for example, local sensors 2202 ) from a luggage conveyor (not shown).
- a particular machined part may be both expensive and critical to system functioning and its arrival at an aircraft manufacturer may trigger a full authentication process (e.g. reading the serial number and manufacturer, fingerprinting the item, comparing the fingerprints with those in the reference database, and confirming/denying the authenticity of the item.)
- Security cameras have in recent years become commonplace and widespread in both the public and private sector. Some security cameras are monitored by security personnel but others (such as at baggage or parcel handling facilities, along with most in-store security cameras) are intended for post facto forensics.
- the present disclosure teaches the triggering of authentication by real-time forensics, generally taken to mean using some form of predictive analytics or artificial intelligence to determine that an unusual event has taken place and what the correct response to that event is. Systems and methods such as those illustrated above may be used to provide these features.
- an AI program detects a person moving near a baggage conveyor in the airport where no persons are supposed to be present.
- a camera may be the input for local sensor 2202 that provides image data (still or motion) as its “output signals.”
- An AI program may be part of the ETP 2200 for analyzing the image data. In response to this recognition “trigger,” the ETP 2200 may enhance or escalate the level of tracking on the bags in the airport luggage handling system, such as looking to find bags that have been added or are now missing from the system or that are now out of place.
- the system may then acquire fingerprints of bags at a given location—say in the vicinity of the detected unauthorized person—using a system 2232 , and query the fingerprint system 2240 database (via link 2242 ) to confirm that no bags have been added or removed.
- This feature may be applied for parcels at a sortation house, manufactured items on a conveyor, and many other cases.
- the proposed system may also include predictive or AI modeling to monitor external data (e.g. on the web) such as related news and sentiment to weight the frequency of authentication as well as communicate awareness/status on any item or group of items related to the area of abnormal concern.
- FIG. 12 is a simplified flow diagram of one example 2300 of a process in accordance with the present disclosure for event-triggered authentication.
- a process begins with initializing or loading one or more rules, logic or scripts, block 2320 .
- these elements may be implemented in software.
- such software may be executed in a server, such as the ETP 2200 .
- the software monitors various inputs, block 2322 , communicated from one or more external processes, sensors, etc. as described with regard to FIG. 11 .
- Inputs (for example, sensor output signals) may be monitored by polling, interrupts, scheduled messaging, etc.
- When a particular input or condition is detected, block 2324, the process next selects a responsive action, 2326, based on the applicable rules, logic or scripts. Next, the process directs or initiates the selected action, 2340, such as acquiring and processing authentication data as mentioned above. Next, the process may acquire results of the authentication-related actions, block 2350, in some cases, and then take further action based on those results, block 2360, if indicated by the applicable rules, logic or scripts. Next, the process may loop via path 2370 to continue monitoring, block 2322. The steps described here are merely illustrative, and some of them may be executed in parallel rather than seriatim. Some types of sensor inputs may trigger immediate actions, while others may be cumulative or otherwise have lower priority.
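- The following minimal sketch walks blocks 2320 through 2370 of FIG. 12. The queue stands in for whichever monitoring mechanism applies (polling, interrupts, or scheduled messaging), and a real ETP would loop indefinitely instead of draining a finite queue; all names are illustrative.

```python
import queue
from typing import Callable, Dict, Tuple

def authenticate_bags(location: str) -> None:
    print(f"authenticating bags near {location}")   # illustrative action

RULES: Dict[str, Callable[[str], None]] = {"conveyor_jam": authenticate_bags}

def run_etp(inputs: "queue.Queue[Tuple[str, str]]") -> None:
    rules = RULES                          # block 2320: load rules/logic/scripts
    while not inputs.empty():              # block 2322: monitor inputs
        kind, location = inputs.get()      # block 2324: input/condition detected
        action = rules.get(kind)           # block 2326: select responsive action
        if action is not None:
            action(location)               # block 2340: direct/initiate action
        # blocks 2350/2360: results could be acquired here and further action
        # taken if the rules so indicate; path 2370 loops back to monitoring.

q: "queue.Queue[Tuple[str, str]]" = queue.Queue()
q.put(("conveyor_jam", "conveyor 7"))
run_etp(q)
```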
- the typical portable device is likely to include one or more processors and software executable on those processors to carry out the operations described.
- The term "software" is used herein in its commonly understood sense to refer to programs or routines (subroutines, objects, plug-ins, etc.), as well as data, usable by a machine or processor.
- computer programs generally comprise instructions that are stored in machine-readable or computer-readable storage media.
- Some embodiments of the present invention may include executable programs or instructions that are stored in machine-readable or computer-readable storage media, such as a digital memory.
- It should not be assumed that a "computer" in the conventional sense is required in any particular embodiment.
- various processors, embedded or otherwise, may be used in equipment such as the components described herein.
- memory associated with a given processor may be stored in the same physical device as the processor (“on-board” memory); for example, RAM or FLASH memory disposed within an integrated circuit microprocessor or the like.
- the memory comprises an independent device, such as an external disk drive, storage array, or portable FLASH key fob.
- the memory becomes “associated” with the digital processor when the two are operatively coupled together, or in communication with each other, for example by an I/O port, network connection, etc. such that the processor can read a file stored on the memory.
- Associated memory may be “read only” by design (ROM) or by virtue of permission settings, or not.
- a “software product” refers to a memory device in which a series of executable instructions are stored in a machine-readable form so that a suitable machine or processor, with appropriate access to the software product, can execute the instructions to carry out a process implemented by the instructions.
- Software products are sometimes used to distribute software. Any type of machine-readable memory, including without limitation those summarized above, may be used to make a software product. That said, it is also known that software can be distributed via electronic transmission (“download”), in which case there typically will be a corresponding software product at the transmitting end of the transmission, or the receiving end, or both.
- Embodiments of the invention may include a non-transitory machine-readable medium comprising instructions executable by one or more processors, the instructions comprising instructions to perform the elements of the embodiments as described herein.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Computing Systems (AREA)
- Medical Informatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Collating Specific Patterns (AREA)
- Editing Of Facsimile Originals (AREA)
Abstract
A system may include an event trigger processor (ETP) configured to receive signals from sensors or another system (FIG. 11). Output signals from the sensor(s), local or remotely located, may be utilized by the ETP as trigger inputs to initiate a process or response, namely authentication actions, which also may be local or remote from the ETP. Events from external systems also may serve as trigger inputs to the ETP. In some embodiments, as a triggered response, the ETP may direct a local field imaging system to acquire an image of an object, generate a digital fingerprint from the image, and query a database using the generated digital fingerprint to identify or authenticate the object. The ETP may initiate or direct various actions by sending a message to another entity or system, for example, using known network communication protocols.
Description
- This application is a non-provisional of, and claims priority pursuant to 35 U.S.C. § 119(e) (2012) to, U.S. provisional application no. 62/374,162 filed Aug. 12, 2016, hereby incorporated by reference as though fully set forth.
- COPYRIGHT© 2016-2017 Alitheon, Inc. A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. 37 C.F.R. § 1.71(d) (2017).
- Centralized databases storing digital fingerprints of objects enabling enhanced security, rapid searching, and high reliability. Methods and apparatus to identify, track, and authenticate any physical object utilizing a suitable database. In particular, event-triggered authentication of objects utilizing digital fingerprints.
- Many different approaches are known to uniquely identify and authenticate physical objects, including labeling and tagging strategies using serial numbers, barcodes, holographic labels, RFID tags, and hidden patterns using security inks or special fibers. All currently known methods rely on applied identifiers that are extrinsic to the object and, as such, may fail to detect the introduction of counterfeit or otherwise unknown objects. In addition, many applied identifiers add substantial costs to the production and handling of the objects sought to be identified or authenticated. Applied identifiers, such as labels and tags, are also themselves at risk of being damaged, lost, stolen, duplicated, or otherwise counterfeited.
- The following is a summary of the present disclosure in order to provide a basic understanding of some features and context. This summary is not intended to identify key or critical elements of the disclosure or to delineate the scope of the disclosure. Its sole purpose is to present some concepts of the present disclosure in simplified form as a prelude to a more detailed description that is presented later.
- There are many known approaches to establishing or reestablishing the authenticity of an object, including secure supply chains, expert assessment, and counterfeit detection. What is lacking, however, and is provided by the current disclosure, is the ability to perform event-triggered authentication utilizing digital fingerprints and fingerprint templates for both overt and covert authentication, counterfeiting, conformity, and non-conformity assessments.
- Additional aspects and advantages of this disclosure will be apparent from the following detailed description of preferred embodiments, which proceeds with reference to the accompanying drawings.
- In order to describe the manner in which the above-recited and other advantages and features of the present disclosure can be obtained, a more particular description follows by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the disclosure will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
- FIG. 1 is an example of an authentication region and fingerprint template definition for a U.S. passport.
- FIG. 2 is a simplified flow diagram of a process for authentication of a physical object based on digital fingerprinting.
- FIG. 3 is a simplified flow diagram of a process for authentication of a previously fingerprinted object.
- FIG. 4A shows an image of the numeral "3" representing the first digit in a serial number of an "original" or known U.S. dollar bill.
- FIG. 4B shows an image of the numeral "3" representing the first digit in a serial number of a U.S. dollar bill to be authenticated.
- FIG. 5A is an illustration of results of feature extraction showing selected locations of interest in the image of FIG. 4A.
- FIG. 5B is an illustration of results of feature extraction showing selected locations of interest in the image of FIG. 4B.
- FIG. 6A shows the same dollar bill image as in FIG. 4A, juxtaposed with FIG. 6B for comparison.
- FIG. 6B shows an image of the numeral "3" that has been damaged or degraded.
- FIG. 7A shows detail of two fingerprint feature locations on the numeral 3.
- FIG. 7B shows detail of the damaged bill with the corresponding fingerprint feature locations called out for comparison.
- FIG. 8 is a simplified illustration of a rotational transformation in the process of comparing digital fingerprints of two images.
- FIG. 9 is a simplified flow diagram of an induction-authentication process.
- FIG. 10 is a simplified flow diagram of an in-field induction process to enable tracing an object.
- FIG. 11 is a simplified hybrid system/communication diagram illustrating several different arrangements and applications of the present disclosure.
- FIG. 12 is a simplified flow diagram of one example of a process in accordance with the present disclosure for event-triggered authentication.
- Reference will now be made in detail to embodiments of the inventive concept, examples of which are illustrated in the accompanying drawings. The accompanying drawings are not necessarily drawn to scale. In the following detailed description, numerous specific details are set forth to enable a thorough understanding of the inventive concept. It should be understood, however, that persons having ordinary skill in the art may practice the inventive concept without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
- It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first machine could be termed a second machine, and, similarly, a second machine could be termed a first machine, without departing from the scope of the inventive concept.
- It will be understood that when an element or layer is referred to as being “on,” “coupled to,” or “connected to” another element or layer, it can be directly on, directly coupled to or directly connected to the other element or layer, or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly coupled to,” or “directly connected to” another element or layer, there are no intervening elements or layers present. Like numbers refer to like elements throughout. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
- The terminology used in the description of the inventive concept herein is for the purposes of describing particular embodiments only and is not intended to be limiting of the inventive concept. As used in the description of the inventive concept and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed objects. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
- The methods described in the present disclosure enable the identification of an object without the need for attaching, applying, or associating physical tags or other extrinsic identifying materials with the object. A system does this by creating a unique digital signature for the object, which is referred to as a digital fingerprint. Digital fingerprinting utilizes the structure of the object, including random and/or deliberate features created, for example, during manufacturing or use of the object, to generate a unique digital signature for that object—similar to the way in which a human fingerprint references the friction ridges on a finger. Also, like a human fingerprint, the digital fingerprint can be stored and retrieved to identify objects at a later time.
- Eliminating the need to add extrinsic identifiers or any physical modifications to an object offers a number of advantages to manufacturers, distributors, buyers, sellers, users, and owners of goods. Forgoing the addition of extrinsic identifiers reduces the cost of manufacturing and offers greater security than physical tagging. Moreover, physical identifiers can be damaged, lost, modified, stolen, duplicated, or counterfeited whereas digital fingerprints cannot.
- Unlike prior art approaches that simply utilize a comparison of pixels, a system in accordance with the present disclosure utilizes the extraction of features to identify and authenticate objects. Feature extraction enables users to take a large amount of information and reduce it to a smaller set of data points that can be processed more efficiently. For example, a large digital image that contains tens of thousands of pixels may be reduced to a few locations of interest that can be used to identify an object. This reduced set of data is called a digital fingerprint. The digital fingerprint contains a set of fingerprint features or locations of interest which are typically stored as feature vectors. Feature vectors make image processing more efficient and reduce storage requirements as the entire image need not be stored in the database, only the feature vectors need to be stored. Examples of feature extraction algorithms include—but are not limited to—edge detection, corner detection, blob detection, wavelet features, Gabor, gradient and steerable output filter histograms, scale-invariant feature transformation, active contours, shape contexts, and parameterized shapes.
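- For illustration only, the sketch below uses ORB, an off-the-shelf corner-detection and descriptor algorithm from the OpenCV library, to reduce an image to locations of interest and feature vectors. The disclosure does not prescribe ORB or OpenCV, and the input file name is hypothetical.

```python
import cv2

image = cv2.imread("object.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(image, None)

# The "digital fingerprint" here is the set of locations of interest plus
# their feature vectors -- a few kilobytes, rather than the full image.
fingerprint = {
    "locations": [kp.pt for kp in keypoints],  # (x, y) of each location
    "vectors": descriptors,                    # one feature vector per location
}
print(len(keypoints), "locations of interest extracted")
```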
- While the most common applications of the system may be in the authentication of physical objects such as manufactured goods and documents, the system is designed to be applicable to any object that can be identified, characterized, quality tested, or authenticated with a digital fingerprint. These include but are not limited to mail pieces, parcels, art, coins, currency, precious metals, gems, jewelry, apparel, mechanical parts, consumer goods, integrated circuits, firearms, pharmaceuticals, and food and beverages. Here the term “system” is used in a broad sense, including the methods of the present disclosure as well as apparatus arranged to implement such methods.
- In this application, the term "scan" is used in the broadest sense, referring to any and all means for capturing an image or set of images, which may be in digital form or transformed into digital form. Images may, for example, be two dimensional, three dimensional, or in the form of a video. Thus a "scan" may refer to an image (or digital data that defines an image) captured by a scanner, a camera, a specially adapted sensor or sensor array (such as a CCD array), a microscope, a smartphone camera, a video camera, an x-ray machine, a sonar device, an ultrasound machine, a microphone (or other instruments for converting sound waves into electrical energy variations), etc. Broadly, any device that can sense and capture either electromagnetic radiation or a mechanical wave that has traveled through an object or reflected off an object, or any other means to capture the surface or internal structure of an object, is a candidate to create a "scan" of an object. Various means to extract "fingerprints" or features from an object may be used; for example, through sound, physical structure, chemical composition, or many others. The remainder of this application will use terms like "image", but when doing so, the broader uses of this technology should be implied. In other words, alternative means to extract "fingerprints" or features from an object should be considered equivalents within the scope of this disclosure. Similarly, terms such as "scanner" and "scanning equipment" herein may be used in a broad sense to refer to any equipment capable of carrying out "scans" as defined above, or to equipment that carries out "scans" as defined above as part of their function.
- In this application, different forms of the words "authenticate" and "authentication" will be used broadly to describe both authentication and attempts to authenticate which comprise creating a digital fingerprint of the object. Therefore, "authentication" is not limited to specifically describing successful matching of inducted objects or generally describing the outcome of attempted authentications. As one example, a counterfeit object may be described as "authenticated" even if the "authentication" fails to return a matching result. In another example, in cases where unknown objects are "authenticated" without resulting in a match and the authentication attempt is entered into a database for subsequent reference, the action described as "authentication" or "attempted authentication" may, post facto, also be properly described as an "induction". An authentication of an object may refer to the induction or authentication of an entire object or of a portion of an object.
- Because digital fingerprinting works with many different types of objects, it may be useful to define what regions of digital images of objects are to be used for the extraction of features for authentication purposes. The chosen regions may vary for different classes of objects. In some embodiments, a chosen region may be the image of the entire object; in other embodiments chosen regions may be one or more sub-regions of the image of the object.
- For instance, in the case of a photograph, a digital image of the entire photograph may be chosen for feature extraction. Each photograph is different and there may be unique feature information anywhere in a photograph. In such a case, the authentication region may be the entire photograph.
- In some embodiments, multiple regions may be used for fingerprinting. In some examples, there may be several regions where significant variations take place among different similar objects that need to be distinguished while, in the same objects, there may be regions of little significance. In other examples, a template may be used (see FIG. 1) to define regions of interest, including elimination of regions of little interest.
- In one embodiment, an object, such as a bank note, may be deemed authenticated if a few small arbitrary regions scattered across the surface are fingerprinted, possibly combined with one or more recognitions of, for example, the contents of a region signifying the value of the bank note or one containing the bank note serial number. In such examples, the fingerprints of any region (along with sufficient additional information to determine the bank note value and its purported identity) may be considered sufficient to establish the authenticity of the bill. In some embodiments, multiple fingerprinted regions may be referenced in cases where one or more regions may be absent from an object (through, for example, tearing) when, for example, a bank note is presented for authentication. In other embodiments, however, all regions of an object may need to be authenticated to ensure an object is both authentic and has not been altered.
- In one embodiment, a passport may provide an example of feature extractions from multiple authentication regions; see FIG. 1. In the case of a passport, features chosen for authentication may be extracted from regions containing specific identification information such as the passport number, the recipient name, the recipient photo, etc., as illustrated in FIG. 1. In some examples, a user may define a feature template specifying the regions whose alteration from the original would invalidate the passport, such as the photo, identifying personal data, or other regions considered important by the user. More details of feature templates are given in Ross, et al., U.S. Pat. No. 9,443,298.
FIG. 1 illustrates one example of an authentication region and a fingerprint template definition for a U.S. passport. In this figure, brace 101 refers to a simplified flow diagram of a process as follows: At process block 102, an object is scanned to generate an "original image", that is, a digital image file or a digital data file in any suitable format that is herein simply referred to as an "image". The original image is illustrated as the data page spread of a U.S. passport book, at block 150.
- Next, the system processes the image data to determine an authentication region. In this example, the authentication region is the biographic data page of the U.S. Passport, located in the lower portion of image 150, identified by dashed box 154. Next, the process generates an authentication image for feature extraction, block 106. The authentication image is illustrated at reference 156. Next, at block 108, the process defines one or more locations of interest for feature vector extraction. The locations of interest in this example are, as shown in image 158 by dashed boxes 160, the surname, the given name, the passport number, and the passport photo.
- Finally, at block 110, the process 100 comprises creating a fingerprint template 120. In this example, template 120 identifies an object class (U.S. Passport), defines an authentication region (for example, by X-Y coordinates), and lists one or more locations of interest within that authentication region. In this instance, the list comprises passport number, photo, first name, and last name.
- In some embodiments, an ability to define and store optimal authentication regions for classes of objects may offer benefits to a user. In some embodiments, it may be preferable to scan limited regions of objects rather than to scan entire objects. For instance, in the case of an article of designer clothing, scanning a clothing label may be preferable to scanning an entire garment. (To be clear, the label or a portion of it is scanned for fingerprinting, not to recognize text on the label.) Further, defining such regions may enable detection of partial alteration of an object.
- Once an authentication region is defined, specific applications may be created for different markets or classes of objects that may assist users in locating and scanning an optimal authentication region. In some embodiments, for example when utilizing a mobile device, a location box and crosshairs may automatically appear in the viewfinder of a smartphone camera application, to help the user center the camera on an authentication region, and automatically lock onto a region and complete a scan when the device is focused on an appropriate area. It should be noted that, although some examples suggested above are two-dimensional objects (passport, bank note), the present disclosure is fully applicable to three-dimensional objects as well. As previously noted, scanning may be of any kind, including 2-D, 3-D, stereoscopic, HD, etc. and is not limited to the use of visible light or to the use of light at all (as previously noted, sonar and ultrasound are, for example, appropriate scanning technologies).
- In some embodiments, objects may have permanent labels or other identifying information attached to them. In addition to the objects themselves, these attachments may also be referenced as features for digital fingerprinting, particularly where the label or other identifying information becomes a permanent part of the object. In one example, a permanent label may be used as an authentication region for the object to which it is affixed. In another example, a label may be used in conjunction with the object itself to create a fingerprint of multiple authentication regions referencing both a label and an object to which the label is affixed.
- In one example, wine may be put into a glass bottle and a label affixed to the bottle. Since it is possible that a label may be removed and re-applied elsewhere merely using the label itself as an authentication region may not be sufficient. In this case, the authentication region may be defined so as to include both a label and a substrate it is attached to—in this example some portion of a label and some portion of a glass bottle. This “label and substrate” approach may be useful in defining authentication regions for many types of objects, such as various types of goods and associated packaging. In other instances, authentication may reveal changes in the relative positions of some authentication regions such as in cases where a label has been moved from its original position, which may be an indication of tampering or counterfeiting. If an object has “tamper-proof” packaging, this may also be included in the authentication region.
- In some embodiments, multiple authentication regions may be chosen from which to extract unique features. In a preferred embodiment, multiple authentication regions may be selected to enable the separate authentication of one or more components or portions of an object. For example, in one embodiment, features may be extracted from two different parts of a firearm. Both features may match the original firearm but since it is possible that both parts may have been removed from the original firearm and affixed to a weapon of different quality, it may also be useful to determine whether the relative positions of the parts have changed. In other words, it may be helpful to determine that the distance (or other characteristics) between Part A's authentication region and Part B's authentication region remains consistent with the original feature extraction. If the positions of Parts A and B are found to be consistent to the relative locations of the original authentication regions, the firearm may be authenticated. Specifications of this type may be stored with or as part of a digital fingerprint of an object.
- In an embodiment, when a new type or class of object is being scanned into a system for the first time, the system can create a fingerprint template (as shown in FIG. 1) that can be used to control subsequent authentication operations for that class of objects. This template may be created either automatically by the system or by a human-assisted process.
- A fingerprint template is not required for the system to authenticate an object, as the system can automatically extract features and create a digital fingerprint of an object without it. However, the presence of a template may optimize the authentication process and add additional functionality to the system.
-
TABLE 1. Example Fingerprint Template.

  CLASS: [Description of the object]
    United States Passport
  AUTHENTICATION REGION: [Description of the authentication regions for the object]
    Region 1: (x1, y1, z1), (x2, y2, z2)
    . . .
    Region n
  REGION MATCH LIST: [List of the regions that are required to match to identify an object]
    Region List: 1 . . . n
  FEATURES: [Key features of the object]
    Feature 1: Passport Number
    Feature 2: Photo
    Feature 3: First Name
    Feature 4: Last Name
    . . .
    Feature n
  METHODS: [Programs that can be run on features of an object]
    Feature 2: Photo
      Method 1: [checkphoto.exe] Check for uneven edges indicating photo substitution
      . . .
      Method n
    Feature n
      Method n
  ADDITIONAL DATA: [Additional data associated with the object]
    Data 1: example data
    . . .
    Data n

- The uses of the fingerprint template include but are not limited to determining the regions of interest on an object, the methods of extracting fingerprinting and other information from those regions of interest, and methods for comparing such features at different points in time. The name "fingerprint template" is not important; other data with similar functionality (but a different name) should be considered equivalent.
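- For illustration only, a template such as TABLE 1 might be carried as a simple data structure. The field names mirror the table, but the rendering itself is hypothetical, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point = Tuple[float, float, float]

@dataclass
class FingerprintTemplate:
    object_class: str
    authentication_regions: List[Tuple[Point, Point]]  # opposite region corners
    region_match_list: List[int]       # regions required to match
    features: List[str]                # key features of the object
    methods: Dict[str, List[str]] = field(default_factory=dict)
    additional_data: Dict[str, str] = field(default_factory=dict)

passport_template = FingerprintTemplate(
    object_class="United States Passport",
    authentication_regions=[((0.0, 0.0, 0.0), (1.0, 1.0, 0.0))],
    region_match_list=[1],
    features=["Passport Number", "Photo", "First Name", "Last Name"],
    methods={"Photo": ["checkphoto.exe: check for uneven edges"]},
)
print(passport_template.object_class, passport_template.features)
```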
- In an embodiment, four different but related uses for this technology are particularly in view in the present disclosure. These are illustrative but are not intended to be limiting of the scope of the disclosure. These applications may be classified broadly as (1) authentication of a previously scanned original, (2) detection of alteration of a previously scanned original, (3) detection of a counterfeit object without benefit of an original, and (4) assessing the degree to which an object conforms with a predetermined specification, such as a manufacturing specification or other applicable specification.
- In example (1), an object is fingerprinted preferably during the creation process (or at any time when its provenance may be sufficiently ascertained) or at a point where an expert has determined its authenticity. Subsequently, the object is later re-fingerprinted, and the two sets of fingerprints are compared to establish authenticity of the object. The fingerprints may be generated by extracting a single fingerprint from the entire object or by extracting multiple sets of features from multiple authentication regions. Fingerprinting may also involve reading or otherwise detecting a name, number, or other identifying characteristics of the object using optical character recognition or other means which may be used to expedite or facilitate a comparison with other fingerprints. For instance, in cases where manufacturing (or other object) databases use serial numbers or other readable identifiers, such identifiers may be utilized to directly access the database record for the object and compare its digital fingerprint to the original that was previously stored, rather than searching an entire digital fingerprinting database for a match.
- In case (2), a fingerprinted object is compared, region by region, with a digital fingerprint of an original object to detect low or nonexistent matching of the fingerprint features from those regions. While case (1) is designed to determine whether the original object is now present, case (2) is designed to detect whether the original object has been altered and, if so, how it has been altered. In some embodiments, authentication regions having poor or no matching fingerprint features will be presumed to have been altered.
- In case (3), an object may not have been fingerprinted while its provenance was sufficiently ascertainable. One example would be bills or passports created prior to initiating the use of a digital fingerprinting system. In such examples, digital fingerprints of certain regions of interest on an object may be compared with digital fingerprints from known, or suspected, counterfeit objects or with both those and fingerprints of properly authenticated objects. In one example, a photograph may be spuriously added to a passport and, as an artifact of the counterfeiting, the edge of the added photo may tend to be sharper than an edge of an original, unaltered, photograph. In such a case, fingerprint characteristics of known authentic passports and those of passports that are known (or suspected) to have been altered by changing a photograph may be compared with the passport being inspected to estimate whether the passport exhibits indications of alteration.
- In an embodiment, once an object has been scanned and at least one authentication region has been identified, the digital image, which will be used to create the unique digital fingerprint for the object, is generated. The digital image (or set of images) provides the source information for the feature extraction process.
- In the present disclosure, a digital fingerprinting feature is defined as a feature or a location of interest in an object, which feature is inherent to the object itself. In some embodiments, features preferably are a result of a manufacturing process, other external processes, or of any random, pseudo-random, or deliberate process or force, such as use. To give one example, gemstones have a crystal pattern which provides an identifying feature set. Every gemstone is unique, and every gemstone has a series of random flaws in its crystal structure. This pattern of random flaws may be used for the extraction of feature vectors for identification and authentication.
- In the present disclosure, a “feature” is not necessarily concerned with reading or recognizing meaningful content, for example by using methods like optical character recognition. A digital fingerprint of an object may capture both features of the object and features of any identifiers that are affixed or attached to the object. Feature vectors extracted from authentication regions located on an affixed identifier are based on the substances of which the identifier is physically comprised rather than the information (preferably alphanumeric) that is intended to be communicated by the identifier. For instance, in the case of a wine bottle, features may be captured from the bottle and from a label affixed to the bottle. If the label includes a standard UPC bar code, the paper of the label and the ink pattern of the bar code may be used to extract a feature vector without reading the alphanumeric information reflected by the bar code. An identifier, such as a UPC bar code print consisting of lines and numbers, has no greater significance in the generation and use of a feature vector than a set of randomly printed lines and numbers.
- Although reading identifier information is not necessary for digital fingerprinting, in some embodiments, where a user desires to capture or store identifier information (such as a name, serial number, or a bar code) in an association with an object, the system may allow the user to capture such information and store it in the digital fingerprint. Identifier information may, for example, be read and stored by utilizing techniques such as optical character recognition, and may be used to facilitate digital fingerprint comparisons. In some cases, serial numbers may be used as the primary index into a database that may also contain digital fingerprints. There may be practical reasons for referencing serial numbers in relation to digital fingerprints. In one example, a user is seeking to determine whether a bank note is a match with a particular original. In this case, the user may be able to expedite the comparison by referencing the bank note serial number as an index into the digital fingerprinting database rather than iterating through a large quantity of fingerprints. In these types of cases, the index recognition may speed up the comparison process, but it is not essential to it.
- Once a suitable digital fingerprint of an object is generated the digital fingerprint may be stored or registered in a database. For example, in some embodiments, the digital fingerprint may comprise one or more fingerprint features which are stored as feature vectors. The database should preferably be secure. In some embodiments, a unique identifier, such as a serial number, may also be assigned to an object to serve, for example, as a convenient index. However, assigning a unique identifier is not essential as a digital fingerprint may itself serve as a key for searching a database independent of any addition of a unique identifier. In other words, since a digital fingerprint of an object identifies the object by the unique features and characteristics of the object itself the digital fingerprint renders unnecessary the use of arbitrary identifiers such as serial numbers or other labels and tags, etc.
-
FIG. 2 represents an example of a simplified flow diagram of a process 200 for authenticating or identifying an object using digital fingerprinting, using a U.S. passport to illustrate part of the process. The process begins with scanning the object, block 202. An image 250 is acquired; in this illustration, the front page of a U.S. passport is used. The next step is to determine a class of the object, block 204. This step may be omitted where the class is known. For example, at a border, a station may be in use that only checks U.S. passports. In another example, the system may be at a passport printing facility. Thus, the class of objects may be known a priori.
- Next, at block 206, a database query may be conducted to see if a template exists in the system for the object that was scanned at 202. For example, in some cases, the initial image may be processed to extract a serial number or other identifying information. In an embodiment, the database may then be interrogated, decision 206, to see if a template exists for that serial number. If the answer is YES, path 208, the system accesses the template 212 and uses it to select one or more authentication regions 210. The template 212 lists the regions and their respective locations in the image (i.e. on the passport front page in this example). Physical locations may, as an example, be specified relative to a given location, and/or relative to each other. Location may be important because, for example, a replaced photograph may not be in exactly the same location as the removed original. In short, the template guides the authentication software in analyzing the image data. In that analysis, for each authentication region (called a "Feature" in 212), various features are extracted from the image data, block 222.
- The extracted features are used to form a digital fingerprint of the object, block 224. For example, each feature may be described by a feature vector. Location and other data and metadata may be included in the fingerprint. In general, the process for extracting features and describing them in feature vectors may be specified in the template. The template may also specify which regions must be matched to declare the passport a match. In the passport example, all specified regions must match a record in the database for the passport to be determined to be authentic and unaltered. In other cases, a few matches may be sufficient. The digital fingerprint generated at block 224 is then used to query a
reference database 230 for a match. - Returning to the decision block 206, there may not be an existing template in the system for the object under inspection—NO branch for “Non-Template Object Class.” The process here may vary with the type of object under inspection and the purpose for the inspection. In some cases, a scanned image of an object may be processed to find locations of interest, block 232, for example, surface areas that are non-homogenous and thus have considerable image data content. In other words, finding locations of interest may be automated or semi-automated. The locations may be used to extract features, block 234 and/or recorded in a template for later use. Preferably, locations should be recorded in, or otherwise associated with, the digital fingerprint of the object.
- In other examples, user input may be used to select authentication regions, and then the process proceeds to 234 as before. In some embodiments, an entire object may be scanned and all of the data processed to find and record digital fingerprint data. Whatever the case, the process proceeds to create a digital fingerprint, block 236, which can then be used to query the
database 230 for a match. The match result may not be binary (yes/no); rather, in many cases, the result may indicate a confidence level of a match or may be a composite of binary results or confidence levels—such as when an object has been altered in part or in whole and/or has been assembled, or disassembled. - In an embodiment, an object is scanned and an image is generated. The steps that follow depend on the operation to be performed. Several illustrative example cases are discussed below.
- Case 1: For authentication of a previously fingerprinted object, the following steps may be followed (see FIG. 3, discussed below):
- 1. One or more authentication regions are determined, such as automatically by a system, or by utilizing the authentication region definitions stored in a fingerprint template.
- 2. Relevant features are extracted from each authentication region and a digital fingerprint is generated. Feature extractions preferably will be in the form of feature vectors, but other data structures may be used, as appropriate.
- 3. Optionally, other information, for example a unique identifier such as a serial number may be extracted and stored to augment subsequent search and identification functions.
- 4. The digital fingerprint of the object to be authenticated is compared to digital fingerprints stored in a database.
- 5. The system reports whether (or to what extent) the object matches one or more of the digital fingerprints stored in the database.
- 6. The system may store the digital fingerprint of the object to be authenticated in the database along with the results of the authentication process. Preferably, only the extracted features will be stored in the database, but the authentication image and/or the original image and/or other data and metadata may be stored in the database, for example for archival or audit purposes.
-
FIG. 3 illustrates such a process 300 in diagrammatic form. Beginning at start block 302, the process scans an object and creates an authentication image, block 304. The image is represented at 350, using a passport as an example. Features are extracted, block 306, and optionally, other information, such as a serial number or similar ID number, preferably unique, may be extracted as well, block 310. -
database 320 may be queried for a matching fingerprint, block 314. A “match” may be defined by a binary, probability, or similarity metric or be a composite of metrics. Results of the database query may be reported to a user, block 322. Finally, a new digital fingerprint may be added to thedatabase 320, shown atprocess block 330. - Case 2: For inspection of specific features of a previously fingerprinted object to determine whether they have been altered, the steps are similar to
Case 1, but the process is aimed at detection of alterations rather than authentication of the object: - 1. One or more authentication regions are determined, such as automatically by the system, or by utilizing the authentication region definitions stored in a fingerprint template.
- 2. The features to be inspected are extracted from an authentication region and the digital fingerprint is generated. The features extracted may be in the form of feature vectors for the features to be inspected but other data structures may be used, as appropriate.
- 3. Optionally, other information, for example a unique identifier such as a serial number may be extracted and stored to be used to augment subsequent search and identification functions.
- 4. The digital fingerprint of features to be inspected for alteration is compared to the fingerprint of the corresponding features from the original object stored in the database.
- 5. The system reports whether the object has been altered; i.e. the extent to which the digital fingerprint of the features to be inspected match those previously stored in the database from the original object, in whole or in part.
- 6. The system may store the digital fingerprint of the features to be inspected in the database along with the results of the inspection process. Preferably, only the features will be stored in the database, but the authentication image and/or the original image and/or other data and metadata may be stored in the database for archival or audit purposes.
-
Cases 3 and 4 are elaborated in related patent applications.
- In all of the above cases, features may be extracted from images of objects scanned under variable conditions, such as different lighting conditions. Therefore, it is unlikely two different scans will produce completely identical digital fingerprints. In a preferred embodiment, the system is arranged to look up and match objects in the database when there is a "near miss." For example, two feature vectors [0, 1, 5, 5, 6, 8] and [0, 1, 6, 5, 6, 8] are not identical, but by applying an appropriate difference metric the system can determine that they are close enough to say with a degree of certainty that they are from the same object that has been seen before. One example would be to calculate the Euclidean distance between the two vectors in multi-dimensional space and compare the result to a threshold value. This is similar to the analysis of human fingerprints. Each fingerprint taken is slightly different, but the identification of key features allows a statistical match with a high degree of certainty.
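- The "near miss" comparison just described can be stated in a few lines. The vectors are those from the text; the threshold value is purely illustrative.

```python
import math

stored   = [0, 1, 5, 5, 6, 8]  # feature vector on record
observed = [0, 1, 6, 5, 6, 8]  # feature vector from a new scan

distance = math.dist(stored, observed)  # Euclidean distance = 1.0
THRESHOLD = 2.0                         # illustrative tolerance only

if distance <= THRESHOLD:
    print(f"near-miss match accepted (distance {distance:.1f})")
else:
    print("no match")
```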
-
FIG. 4A illustrates an image of the numeral "3" representing a number printed on an "original" or known U.S. dollar bill. The bill may have been fingerprinted, for example, at the time of manufacture or public release, as described herein, or otherwise sufficiently authenticated for use as a reference. As noted below, fingerprint databases of currency and the like may be secured. Such databases preferably exclude raw image data. This image, on the order of about 40-fold magnification, shows a number of distinctive features visible to the naked eye.
- FIG. 4B illustrates an image of a number printed on a second or unknown U.S. dollar bill. The second bill may be fingerprinted using the same process, and then the resulting digital fingerprints, i.e., the respective fingerprint feature vectors, may be compared as further explained below, to determine whether or not the second bill is in fact the same one as the first bill. The comparison may take place even though the bill may have changed from wear and tear.
- FIG. 5A is a simplified illustration of the results of feature extraction applied to the numeral 3 of FIG. 4A. In this figure, only the ends of the numeral are shown. Two locations of interest are called out by circles 1710 and 1750. The locations of interest need not necessarily be circular, but circular areas are advantageous for many applications. Below is a discussion on how these areas may be selected in an image. Fingerprint feature extraction is applied to each of the circular locations of interest. The results for each location may be stored as fingerprint feature vectors. To clarify, a "location of interest" (sometimes referred to as a "point" or "area" of interest), for example 1720, may well be a physical feature on the object, but the "feature vector" that characterizes that location of interest is not just a variation on the image around that location; rather, the feature vector is derived from it by any of a number of possible means. Preferably, a feature vector may be an array of numeric values. As such, feature vectors lend themselves to comparison and other analyses in a database system. A collection of feature vectors, say for location 1750, may be stored as a feature vector array.
- FIG. 5B is a simplified illustration of the results of feature extraction applied to locations of interest on the numeral 3 of FIG. 4B. The same fingerprinting process may be applied to this image. The same locations of interest as in FIG. 5A are labeled 1720 and 1760, respectively. The stored features (from the original object) are compared with the features extracted from the new object. As in this case, if the locations of interest are not encountered in the second object, or if the feature vectors characterizing those locations of interest are too different, there is no match (or a low confidence level for a match) for that location of interest. Variables, such as which locations must match and/or how many locations must match and/or the degree of matching required to conclude that an object matches the one previously fingerprinted, may in some embodiments be specified in a digital fingerprint record, further described below, or in some other associated record, to guide the decision process. This arrangement may be advantageous, for example, for exporting a database to a generic processor or system for remote authentication work. The matching logic may be embedded in the digital fingerprint record. Preferably, the matching logic is implemented in software as part of an authentication system.
FIG. 6A shows a numeral from the same dollar bill image as inFIG. 4A , juxtaposed withFIG. 6B for comparison.FIG. 6B shows the numeral on the same bill after the bill has been subjected to washing in a washing machine, perhaps as a result of being left in the pocket of a piece of clothing. InFIG. 15B , the image (or, rather, the dollar bill) has been degraded; there is significant loss of ink and destruction of the paper surface in multiple locations. A bitmapped approach to matching would likely fail to match these two figures due to the large number of pixels that are now different, as relatively few of the pixels remain the same as the original. -
- FIG. 7A shows the detail of two fingerprint feature locations as before, 1610 and 1650. FIG. 7B shows detail of the damaged bill with the corresponding locations called out as 1620 and 1660, respectively. A comparison between the similarities of area 1610 to area 1620 and of area 1650 to area 1660 illustrates how a comparison of the corresponding fingerprint feature vectors would be adequate to result in a match. In practice, a much larger number of features would be used.
- As explained above, in addition to being able to recognize a worn object, the feature-based approach is able to address other external problems such as rotated images. This is especially important in a system where an unsophisticated user, such as a retail customer, may be scanning an object to be authenticated. In such cases, external factors like lighting and rotation may not be under the system operator's control.
- Referring now to
FIG. 8 , which shows the original image on the left side, with a small set of fingerprint features marked as small diamond shapes. This is merely a callout symbol for illustration. In some embodiments, as noted, preferably circular areas are used. For each feature (preferably identified in the database record), a search is conducted of the suspect image on the right side ofFIG. 8 (or a portion of it) for a matching feature. The position may not match exactly, due to “stretch”, an effective difference in magnification, and/or due to rotation of the image, or due to other circumstances. Although it may not match locations literally; a mathematical transformation may be defined that maps one image to the other, thereby accounting for rotation and stretch as appropriate. Thus, a bounding rectangle A indicated by the box in the left side image may be mapped to a quadrilateral, indicated by the line B in the right-side image. - Once an appropriate transformation is found, further matching may be done to increase the level of confidence of the match, if desired. In some embodiments, a number of matches on the order of tens or hundreds of match points may be considered sufficient. The number of non-match points also should be taken into account. That number should preferably be relatively low, but it may be non-zero due to random dirt, system “noise”, and other circumstances. Preferably, the allowed mapping or transformation should be restricted depending on the type of object under inspection. For instance, some objects may be inflexible, which may restrict the possible deformations of the object.
- Summarizing the imaging requirements for a typical fingerprinting system, for example for inspecting documents, the system preferably should provide sufficient imaging capability to show invariant features. Particulars will depend on the regions used for authentication. For many applications, 10-fold magnification may be adequate. For ink bleeds on passports, bills, and other high-value authentication, 40-fold magnification may likely be sufficient. In preferred embodiments, the software should implement a flexible response to accommodate misalignment (rotation), misorientation, and scale changes. Color imaging and analysis is generally not required for using the processes described above, but may be used in some cases.
-
FIG. 9 is a simplified diagram illustrating the concepts of induction and authentication. The term “induction” is used in a general manner to refer to entering an object or a set of objects into an electronic system for subsequently identifying, tracking, or authenticating the object, or for other operations. The object itself is not entered into the system in a physical sense; rather, induction refers to creating and entering information into a memory or datastore from which it can later be searched, interrogated, retrieved, or utilized in other kinds of database operations. - In
FIG. 9 ,induction 1802 thus may refer to a process that includes capturing an image of an object (or part of an object), processing the image to extract descriptive data, storing the extracted data, or any or all of these operations. The inducted object, represented by acube 1804, then leaves the induction site, and proceeds in time and space along apath 1806. Induction may be done at the point of creation or manufacture of the object, or at any subsequent point in time. In some cases, induction may be done clandestinely, such as without the knowledge of the person or entity currently having ownership and/or possession of an object. The term “possession” is used in the broadest sense to include, for example, actual physical possession, as well as control—for example, having they key to a secure physical storage where an object is kept. - After induction, the
- After induction, the object 1804 may encounter wear and tear, and otherwise may change, intentionally or not, in ways that may not be known a priori, represented by the question mark 1808. The original object 1804 may even be lost or stolen after induction and a counterfeit may be introduced. Along path 1809, an object 1810 may be presented for authentication, represented by block 1820. Below are described some additional scenarios and use cases for the authentication technology described herein, and what may be done under the broad heading of “authentication”. Under many circumstances, induction, authentication, or both may be done remotely by use of technology such as drones or by other covert means. In one example, an agent may take a photograph of an object with a smartphone, without the knowledge or consent of the possessor of the object, and the resulting image may be utilized for induction and/or authentication as described herein. - More specifically, in some embodiments, some part of the induction/authentication process may be done remote from a facility intended for that purpose. In addition, some part of the induction/authentication process may be accomplished without the knowledge of the then-current possessor of an object. In particular, the induction and/or authentication are not part of the current possessor's normal processes. These two criteria are not essential for the present disclosure, but are generally representative of some applications.
- FIG. 10 is a simplified flow diagram of one example of a process for creating a digital fingerprint that includes feature vectors based on a scanned image of an object. The process begins with initialization at block 2120. This step may comprise initializing a datastore, calibrating an image capture system, or other preliminary operations. An object or objects are scanned, block 2122, forming digital image data. Preferably, depending on the context, the scanning may be automated. In other cases, an operator may be involved in manual scanning. From the image data, an authentication image is generated, block 2124, which may comprise all or a selected subset of the scan data. Next, a digital fingerprint record may be initialized, for example in a memory or datastore, block 2126.
- To begin forming a digital fingerprint of a scanned object, at least one authentication region is selected, block 2130, in the authentication image data. This selection preferably is carried out by the fingerprinting software. The authentication region(s) may be selected according to a predetermined template based on the class of objects. Locations of the authentication regions may be stored in the digital fingerprint record, block 2132.
- At block 2134, the process continues by selecting locations of interest within each authentication region. To select locations of interest (areas in an image from which to extract fingerprint features), a software process may automatically select a large number—typically hundreds or even thousands per square mm—of preferred locations of interest for purposes of the digital fingerprint. A location may be of interest because of a relatively high level of content. That “content” in a preferred embodiment may comprise a gradient or vector, including a change in value and a direction. The selected locations of interest may be added to the fingerprint record, block 2136. In one example, such areas may be identified by a location or centroid, and a radius, thus defining a circular region. Circular regions are preferred for some applications because they are not affected by rotation of the image.
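As a purely illustrative sketch of this selection step, the following picks circular regions centered on local maxima of gradient magnitude (one plausible measure of “content”); the radius, point budget, and use of SciPy are assumptions, not particulars of the disclosure.

```python
# Hypothetical sketch of block 2134: select circular locations of interest
# where the image gradient (change in value plus direction) is strongest.
import numpy as np
from scipy import ndimage

def select_locations_of_interest(gray, radius=8, max_points=500):
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)                  # local "content" strength

    # Keep only local maxima so the regions do not pile up on a single edge.
    local_max = ndimage.maximum_filter(magnitude, size=2 * radius + 1)
    candidates = np.argwhere((magnitude == local_max) & (magnitude > 0))

    # Rank by content and keep the strongest candidates.
    scores = magnitude[candidates[:, 0], candidates[:, 1]]
    order = np.argsort(scores)[::-1][:max_points]

    # Each location of interest: a centroid (row, col) plus a radius.
    return [(int(r), int(c), radius) for r, c in candidates[order]]
```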
- Next, block 2138, the process calls for extracting features from each location of interest, and forming feature vectors to describe those features in a compact form that facilitates later analysis, for example, calculation of vector distances as a metric of similarity in comparing fingerprints for authentication. Various techniques are known for extracting such features. The resulting feature vectors are added to the fingerprint, block 2140. At block 2142, additional information may be added to the digital fingerprint identifying other fingerprints and related information associated with the same object. In some embodiments, a relationship, such as the relative location of the other fingerprints to the current fingerprint, may be used. For example, in some objects, multiple regions may be authentic individually, but a change in their relative location may indicate that the object is not authentic. Thus, a fingerprint record may include first and second feature vectors (each describing a corresponding feature extracted from an area of interest) and a relative location of one to the other.
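To make the vector-distance idea concrete, here is a minimal, hypothetical comparison routine in which a reference feature vector counts as a match if some suspect vector lies within a distance threshold; the threshold value is an arbitrary stand-in.

```python
# Hypothetical sketch of comparing fingerprints by feature-vector distance.
import numpy as np

def compare_fingerprints(ref_vectors, suspect_vectors, threshold=0.25):
    """Count match points and non-match points between two fingerprints."""
    ref = np.asarray(ref_vectors, dtype=np.float64)
    sus = np.asarray(suspect_vectors, dtype=np.float64)
    matches = 0
    for v in ref:
        nearest = np.linalg.norm(sus - v, axis=1).min()  # distance to closest
        if nearest < threshold:
            matches += 1
    return matches, len(ref) - matches  # (match points, non-match points)
```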
- Above, with regard to FIG. 8, the transformation from one set of feature vectors to another was described, to accommodate stretch, rotation, or variations in magnification. In similar fashion, relative locations of features in a fingerprint can be stored in the record and used for comparison to a new fingerprint under consideration. The feature extraction may be repeated, block 2150, using an adjusted area size or scale (such as magnification). Feature vectors created at the adjusted size may be added to the fingerprint, block 2152. Additional features may be extracted at additional magnification values, until an adequate number are provided, decision 2154. This additional data may be added to the fingerprint, block 2156. This data may be helpful in finding a matching fingerprint where the authentication image magnification is not the same as the image at the time of induction of the object. Finally, and optionally, the scanned image itself (generated at 2122) may be added to the database, block 2158. This process to build a digital fingerprint ends at 2160. - Authentication may be conducted in response to a trigger; that is, authentication may be performed outside the normal steady functioning of a system (in contrast, for example, to inducting parts as they are manufactured and authenticating them as they are installed). Contemplated in this disclosure are any form of event trigger (see the progression below) and any form of authentication using fingerprinting or similar technology. Each of the following is a non-limiting example of an event that could serve as a trigger; each could be utilized to trigger the kinds of authentication taught above in this document.
- Schedule-based triggering. In one example, this disclosure envisions a system where authentication is triggered on a schedule (e.g., as part of quarterly inventory, or two hours past closing time). Triggering on a schedule is close to being “part of the normal . . . functioning of the system” but is included for completeness in the spectrum of “event-driven authentication”. This form would include normal calendaring, but also following computer scripts, or even periodic, random, or occasional manual interrupts of normal processes.
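A minimal sketch of schedule-based triggering, assuming the third-party Python schedule package and a hypothetical trigger_authentication callback; the specific times below are placeholders.

```python
# Hypothetical sketch of schedule-based triggering: after-hours and
# quarterly authentication runs fired from a simple scheduler loop.
import time
import schedule

def trigger_authentication(scope):
    print(f"authentication triggered: {scope}")  # stand-in for a real ETP call

schedule.every().day.at("22:00").do(trigger_authentication, scope="after-hours check")
schedule.every(90).days.do(trigger_authentication, scope="quarterly inventory")

while True:
    schedule.run_pending()
    time.sleep(60)
```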
- Event triggering. FIG. 11 is a simplified hybrid system/communication diagram illustrating several different arrangements and applications of the present disclosure. A particular system may implement all of the features shown in FIG. 11 or, more typically, only a subset of them. For example, an event-triggered system may be “local” in the sense of installation at one location, for example, at a parts manufacturer, or a shipping or warehouse facility. In a local installation, the remote sensors and internet connectivity may be unnecessary. In other applications, remote sensors and remote authentication equipment may be used.
- Referring to FIG. 11, in the center, an event trigger processor 2200 (“ETP”), which may comprise any type of programmable digital processor, is arranged for various communications. Details of network interfaces, user interfaces, memory, etc., which are familiar in the industry, are omitted for clarity. In some embodiments, one or more local sensors 2202 may be coupled to a network interface 2204 for communication with the event trigger processor 2200 via link 2206, which may be wired or wireless. Output signals from the sensor(s) may be utilized by the ETP 2200 as triggers to initiate authentication actions, local or remote, as further explained below. - The ETP may initiate various actions, responsive to a trigger input signal, for example, by sending a message to another entity or system, in particular an authentication system. Hence the title, “Event-Driven Authentication.” The ETP may command the actions, for example, using known network communication protocols. In one example, responsive to the back door of a warehouse being detected as opening (a sensor input), the ETP may send a message to a remote system to have it conduct an inventory of the warehouse, in part or in whole. The remote system may utilize appropriate scanning equipment to capture images for the inventory for fingerprinting.
The processes illustrated by FIG. 11 include the use of sensors of all types, from RFID and thermal sensors to smart dust connected to the internet or computerized networks, to name but a few.
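As one hypothetical realization of the warehouse example above, the ETP might react to a door sensor by messaging a remote inventory system over HTTP; the endpoint URL and payload shape below are invented for illustration.

```python
# Hypothetical sketch: a door-open sensor event causes the ETP to request
# a (partial or full) fingerprint inventory from a remote system.
import requests

def on_sensor_event(event):
    if event.get("sensor") == "back_door" and event.get("state") == "open":
        requests.post(
            "https://remote-auth.example.com/inventory",  # assumed endpoint
            json={"scope": "warehouse-3", "reason": "door-open trigger"},
            timeout=10,
        )
```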
- In some embodiments, one or more remote sensors 2210, i.e., sensors that are not at the same physical location as the ETP 2200, may be coupled over a network, such as a LAN, WAN, or the internet 2212, for connection to the ETP 2200 via a suitable network interface 2216. In operation, output signals from the remote sensor(s) may be utilized by the ETP 2200 as triggers to initiate authentication actions, which again may be local or remote.
- In some embodiments, other remote processes or systems 2230 may be similarly coupled over a network to communicate with the ETP 2200. As one illustration: a piece of luggage is going down a conveyor (not shown) and is normally to be routed by reading the bag tag. It passes a bag tag reader, but this time the reader does not get a read. The bag tag reader may be a remote process or system 2230 coupled to the ETP 2200. In this case, a tag reader failure message triggers a process or response in the ETP 2200 that initiates a full fingerprint-based authentication of the (previously inducted) luggage item. The authentication process may be performed in various ways, several of which are described in detail above.
- In some embodiments, the ETP 2200 may direct a local field imaging system 2232 via a link 2234. The ETP may be coupled directly to the local imaging system 2232 in some applications. In other cases, it may be communicatively coupled over a network. In an embodiment, the local system 2232 may acquire image data of an object 2236 (for example, the aforementioned luggage item). The imaging system 2232 may interact via link 2236 with a fingerprint processing and storage system 2240. In an embodiment, the fingerprint system 2240 may include a digital fingerprint processor 2256, a secure database server 2258, and a fingerprint database 2260, described in more detail above. The fingerprinting system 2240 may be local or remote, for example, in the cloud. It may be coupled via link 2243 to the ETP 2200.
- In some embodiments, the triggered authentication process may be done remotely from the ETP 2200. For example, the ETP 2200 may communicate via interface 2216 and internet 2212 with a remote field image acquisition system 2242. This system is configured for image capture for authentication (and optionally other purposes). The image system 2242 may be part of a larger manufacturing, assembly, or other operation. The image system 2242 may be integrated into other machinery, or it may stand alone. The image system 2242 may be operable by a robot 2250 to capture an image of an object 2248 for authentication. The robot 2250 may be mobile, for example, to move about a warehouse capturing images for inventory control. The robot may capture images, for example, following a door-ajar or break-in trigger (detected by a sensor as described). The image system 2242 may work in concert with a fingerprint system such as 2240, with which it may communicate over a network. In another example, authentication may be triggered by loading-dock receipt of components missing an expected RFID tag or documentation.
- Preferably, authentication may be triggered by sensors (as noted), or by rules or logic 2253, which may be realized in the form of computer code or scripts, or by the physical presence of an unexpected item, or the absence of an expected one. The trigger processor may take an action based on a combination of inputs, processed according to the applicable rules and logic.
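A minimal sketch of such rules or logic 2253, assuming simple Python predicates over a dictionary of current inputs; the rule contents are invented examples, not taken from the disclosure.

```python
# Hypothetical sketch: each rule maps a combination of inputs to an
# authentication action; the trigger processor fires every rule that matches.
RULES = [
    (lambda s: s.get("door_open") and s.get("after_hours"), "full_inventory"),
    (lambda s: s.get("expected_item_missing"), "authenticate_missing_item"),
    (lambda s: s.get("unexpected_item_present"), "fingerprint_unknown_item"),
]

def evaluate(inputs):
    """Return every action whose rule fires for this combination of inputs."""
    return [action for predicate, action in RULES if predicate(inputs)]
```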
- This disclosure further includes authentication triggered by detection of another event—which event may or may not be directly related to the authentication process. Other events and processes 2222 may communicate with the ETP 2200 as illustrated or otherwise. One example is a conveyor that is carrying bags to their airplanes when a jam occurs. Currently, this would mean that all those bags must, once the belt is restarted, be routed past a bag tag reader to reestablish each bag's identity. With a proposed embodiment, the system would immediately authenticate and locate each bag on the affected conveyor(s) so that when the jam is cleared, each bag can continue on its way without the need to reroute past a bag tag reader. Thus, in such a scenario, an image system (such as 2232 or 2242) may be directed by the ETP 2200, which reacts to a jam sensor signal (from, for example, local sensors 2202) from a luggage conveyor (not shown). - In another embodiment, a particular machined part may be both expensive and critical to system functioning, and its arrival at an aircraft manufacturer may trigger a full authentication process (e.g., reading the serial number and manufacturer, fingerprinting the item, comparing the fingerprints with those in the reference database, and confirming/denying the authenticity of the item).
- Security cameras have in recent years become commonplace and widespread in both the public and private sector. Some security cameras are monitored by security personnel but others (such as at baggage or parcel handling facilities, along with most in-store security cameras) are intended for post facto forensics. The present disclosure teaches the triggering of authentication by real-time forensics, generally taken to mean using some form of predictive analytics or artificial intelligence to determine that an unusual event has taken place and what the correct response to that event is. Systems and methods such as those illustrated above may be used to provide these features.
- As a further illustration, an AI program detects a person moving near a baggage conveyor in the airport where no persons are supposed to be present. In some embodiments, a camera may be the input for local sensor 2202, which provides image data (still or motion) as its “output signals.” An AI program may be part of the ETP 2200 for analyzing the image data. In response to this recognition “trigger,” the ETP 2200 may enhance or escalate the level of tracking on the bags in the airport luggage handling system, such as looking to find bags that have been added, are now missing from the system, or are now out of place. For example, the system may then acquire fingerprints of bags at a given location—say in the vicinity of the detected unauthorized person—using a system 2232, and query the fingerprint system 2240 database (via link 2242) to confirm that no bags have been added or removed. This feature may be applied to parcels at a sortation house, manufactured items on a conveyor, and many other cases. The proposed system may also include predictive or AI modeling to monitor external data (e.g., on the web), such as related news and sentiment, to weight the frequency of authentication as well as to communicate awareness/status on any item or group of items related to the area of abnormal concern.
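A hypothetical sketch of that escalation step follows: re-fingerprint the bags at the flagged location and compare them against the database record to confirm nothing was added or removed. The acquire_fingerprints callable and db interface are assumptions for illustration.

```python
# Hypothetical sketch: diff freshly acquired fingerprints at a location
# against the fingerprint database's inventory for that location.
def audit_location(location, acquire_fingerprints, db):
    observed = set(acquire_fingerprints(location))  # bags seen right now
    expected = set(db.fingerprints_at(location))    # bags on record
    return {
        "added": observed - expected,    # unexplained new items
        "missing": expected - observed,  # items that disappeared
    }
```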
- FIG. 12 is a simplified flow diagram of one example 2300 of a process in accordance with the present disclosure for event-triggered authentication. In the diagram 2300, a process begins with initializing or loading one or more rules, logic, or scripts, block 2320. In some embodiments, these elements may be implemented in software. In some embodiments, such software may be executed in a server, such as the ETP 2200. In operation, the software monitors various inputs, block 2322, communicated from one or more external processes, sensors, etc., as described with regard to FIG. 11. Inputs (for example, sensor output signals) may be monitored by polling, interrupts, scheduled messaging, etc. When a particular input or condition is detected, block 2324, the process next selects a responsive action, 2326, based on the applicable rules, logic, or scripts. Next, the process directs or initiates the selected action, 2340, such as acquiring and processing authentication data as mentioned above. Next, the process may acquire results of the authentication-related actions, block 2350, in some cases, and then take further action based on the results, block 2360, if indicated by the applicable rules, logic, or scripts. Next, the process may loop via path 2370 to continue monitoring, block 2322. The steps described here are merely illustrative, and some of them may be executed in parallel rather than seriatim. Some types of sensor inputs may trigger immediate actions, while others may be cumulative or otherwise have lower priority. - Most of the equipment discussed above comprises hardware and associated software. For example, the typical portable device is likely to include one or more processors and software executable on those processors to carry out the operations described. We use the term software herein in its commonly understood sense to refer to programs or routines (subroutines, objects, plug-ins, etc.), as well as data, usable by a machine or processor. As is well known, computer programs generally comprise instructions that are stored in machine-readable or computer-readable storage media. Some embodiments of the present invention may include executable programs or instructions that are stored in machine-readable or computer-readable storage media, such as a digital memory. We do not imply that a “computer” in the conventional sense is required in any particular embodiment. For example, various processors, embedded or otherwise, may be used in equipment such as the components described herein.
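As one concrete (and purely hypothetical) illustration of such software, the following is a minimal sketch of the FIG. 12 monitoring loop, with rules as simple condition/action records and a blocking queue standing in for the monitored inputs; all names are illustrative.

```python
# Hypothetical sketch of the FIG. 12 event-triggered authentication loop.
import queue

def run_event_trigger_loop(rules, inputs: queue.Queue, act, get_results):
    while True:
        event = inputs.get()                   # block 2322: monitor inputs
        for rule in rules:
            if rule["condition"](event):       # block 2324: condition detected
                action = rule["action"]        # block 2326: select an action
                act(action, event)             # block 2340: initiate the action
                results = get_results(action)  # block 2350: acquire results
                if rule.get("follow_up"):      # block 2360: further action
                    rule["follow_up"](results)
        # loop (path 2370) back to monitoring
```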
- Memory for storing software again is well known. In some embodiments, memory associated with a given processor may be stored in the same physical device as the processor (“on-board” memory); for example, RAM or FLASH memory disposed within an integrated circuit microprocessor or the like. In other examples, the memory comprises an independent device, such as an external disk drive, storage array, or portable FLASH key fob. In such cases, the memory becomes “associated” with the digital processor when the two are operatively coupled together, or in communication with each other, for example by an I/O port, network connection, etc. such that the processor can read a file stored on the memory. Associated memory may be “read only” by design (ROM) or by virtue of permission settings, or not. Other examples include but are not limited to WORM, EPROM, EEPROM, FLASH, etc. Those technologies often are implemented in solid state semiconductor devices. Other memories may comprise moving parts, such as a conventional rotating disk drive. All such memories are “machine readable” or “computer-readable” and may be used to store executable instructions for implementing the functions described herein.
- A “software product” refers to a memory device in which a series of executable instructions are stored in a machine-readable form so that a suitable machine or processor, with appropriate access to the software product, can execute the instructions to carry out a process implemented by the instructions. Software products are sometimes used to distribute software. Any type of machine-readable memory, including without limitation those summarized above, may be used to make a software product. That said, it is also known that software can be distributed via electronic transmission (“download”), in which case there typically will be a corresponding software product at the transmitting end of the transmission, or the receiving end, or both.
- Having described and illustrated the principles of the invention with reference to illustrated embodiments, it will be recognized that the illustrated embodiments can be modified in arrangement and detail without departing from such principles, and can be combined in any desired manner. And although the foregoing discussion has focused on particular embodiments, other configurations are contemplated. In particular, even though expressions such as “according to an embodiment of the invention” or the like are used herein, these phrases are meant to generally reference embodiment possibilities, and are not intended to limit the invention to particular embodiment configurations. As used herein, these terms can reference the same or different embodiments that are combinable into other embodiments.
- Embodiments of the invention may include a non-transitory machine-readable medium comprising instructions executable by one or more processors, the instructions comprising instructions to perform the elements of the embodiments as described herein.
- Consequently, in view of the wide variety of permutations to the embodiments described herein, this detailed description and accompanying material are intended to be illustrative only, and should not be taken as limiting the scope of the invention.
- Having described and illustrated the principles of the invention in a preferred embodiment thereof, it should be apparent that the invention may be modified in arrangement and detail without departing from such principles. We claim all modifications and variations coming within the spirit and scope of the following claims.
Claims (39)
1.-24. (canceled)
25. A system, comprising:
an event trigger processor, the event trigger processor communicatively coupled to one or more sensors which are located at a remote location remote from the event trigger processor, the event trigger processor operable to:
receive information representative of events or conditions;
determine whether an unusual event or unusual condition has occurred based at least in part on the received information;
determine, based at least in part on a determination that an unusual event or unusual condition has occurred, to perform an authentication action; and
trigger the authentication action to be performed.
26. The system of claim 25 wherein to determine whether an unusual event or unusual condition has occurred, the event trigger processor employs a predictive or an artificial intelligence model applied to the received information representative of events or conditions.
27. The system of claim 25 wherein to determine whether an unusual event or unusual condition has occurred, the event trigger processor employs a predictive or an artificial intelligence model applied to the received information representative of events or conditions as well as to one or more pieces of external data.
28. The system of claim 25 wherein the event trigger processor is communicatively coupled to one or more sensors which are at one or more of the remote locations to receive the information representative of events or conditions as sensed by the one or more sensors.
29. The system of claim 25 wherein to trigger the authentication action to be performed, the event trigger processor triggers acquisition of digital image data representing an image of at least a portion of at least one physical object at a one of the remote locations at which the unusual event or unusual condition has occurred.
30. The system of claim 25 wherein the event trigger processor determines a type of authentication action to be performed based on the received information representative of events or conditions.
31. The system of claim 30 wherein to trigger the authentication action to be performed the event trigger processor causes a digital fingerprinting system to perform the determined type of authentication action.
32. The system of claim 31 wherein the digital fingerprinting system is operable to perform authentication actions with respect to one or more physical objects using respective ones of a plurality of digital fingerprints, each digital fingerprint based on digital image data of at least a portion of a corresponding physical object, and wherein each digital fingerprint is based solely on native features of the corresponding physical object and not based on any identifier, label, or other proxy added to the physical object for identification or authentication, and the digital fingerprint contains a set of fingerprint features which are extracted from one or more authentication regions in the digital image data.
33. The system of claim 25 wherein to determine whether an unusual event or unusual condition has occurred based at least in part on the received information, the event trigger processor determines whether a person is detected in an area in which no persons are supposed to be present at a defined time.
34. The system of claim 33 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more physical objects in the area in which the person is detected but in which no persons are supposed to be present at a defined time.
35. The system of claim 25 wherein to determine whether an unusual event or unusual condition has occurred based at least in part on the received information, the event trigger processor determines whether an unauthorized person is detected in an area.
36. The system of claim 35 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more physical objects in a vicinity of a detected unauthorized person.
37. The system of claim 35 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more physical objects in the area in which the unauthorized person was detected.
38. The system of claim 25 wherein to determine whether an unusual event or unusual condition has occurred based at least in part on the received information, the event trigger processor determines whether a person is detected proximate to a baggage conveyor in an airport where no persons are supposed to be present during a defined time.
39. The system of claim 38 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more pieces of baggage on the baggage conveyor.
40. The system of claim 38 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more pieces of baggage on the baggage conveyor, and determination whether any pieces of baggage have been either removed or added to the conveyor.
41. The system of claim 25 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more pieces of luggage or parcels at a defined location.
42. The system of claim 25 wherein to trigger the authentication action to be performed, the event trigger processor causes an acquisition of digital fingerprints of one or more physical objects in at least one of a sortation facility or a manufacturing line.
43. The system of claim 25 wherein to receive information representative of events or conditions the event trigger processor receives image data from one or more cameras.
44. A method, comprising:
receiving, by an event trigger processor, information representative of events or conditions;
determining, by the event trigger processor, whether an unusual event or unusual condition has occurred based at least in part on the received information;
determining, by the event trigger processor, based at least in part on a determination that an unusual event or unusual condition has occurred, to perform an authentication action; and
triggering, by the event trigger processor, the authentication action to be performed.
45. The method of claim 44 wherein determining whether an unusual event or unusual condition has occurred includes applying a predictive or an artificial intelligence model to the received information representative of events or conditions.
46. The method of claim 44 wherein determining whether an unusual event or unusual condition has occurred includes applying a predictive or an artificial intelligence model applied to the received information representative of events or conditions as well as to one or more pieces of external data.
47. The method of claim 44 wherein the event trigger processor is communicatively coupled to one or more sensors which are at one or more of the remote locations, and receiving the information representative of events or conditions includes receiving the information as sensed by the one or more sensors.
48. The method of claim 44 wherein triggering the authentication action to be performed includes triggering an acquisition of digital image data representing an image of at least a portion of at least one physical object at a one of one or more remote locations at which the unusual event or unusual condition has occurred.
49. The method of claim 44, further comprising:
determining, by the event trigger processor, a type of authentication action to be performed based on the received information representative of events or conditions.
50. The method of claim 49 wherein triggering the authentication action to be performed includes causing a digital fingerprinting system to perform the determined type of authentication action.
51. The method of claim 50 wherein causing a digital fingerprinting system to perform the determined type of authentication action includes causing the digital fingerprinting system to perform authentication actions with respect to one or more physical objects using respective ones of a plurality of digital fingerprints, each digital fingerprint based on digital image data of at least a portion of a corresponding physical object, and wherein each digital fingerprint is based solely on native features of the corresponding physical object and not based on any identifier, label, or other proxy added to the physical object for identification or authentication, and the digital fingerprint contains a set of fingerprint features which are extracted from one or more authentication regions in the digital image data.
52. The method of claim 44 wherein determining whether an unusual event or unusual condition has occurred based at least in part on the received information includes determining whether a person is detected in an area in which no persons are supposed to be present at a defined time.
53. The method of claim 52 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more physical objects in the area in which the person is detected but in which no persons are supposed to be present at a defined time.
54. The method of claim 44 wherein determining whether an unusual event or unusual condition has occurred based at least in part on the received information includes determining whether an unauthorized person is detected in an area.
55. The method of claim 54 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more physical objects in a vicinity of a detected unauthorized person.
56. The method of claim 54 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more physical objects in the area in which the unauthorized person was detected.
57. The method of claim 44 wherein determining whether an unusual event or unusual condition has occurred based at least in part on the received information includes determining whether a person is detected proximate a baggage conveyor in an airport where no persons are supposed to be present during a defined time.
58. The method of claim 57 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more pieces of baggage on the baggage conveyor.
59. The method of claim 57 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more pieces of baggage on the baggage conveyor, and determining whether any pieces of baggage have been either removed or added to the conveyor.
60. The method of claim 44 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more pieces of luggage or parcels at a defined location.
61. The method of claim 44 wherein triggering the authentication action to be performed includes causing an acquisition of digital fingerprints of one or more physical objects in at least one of a sortation facility or a manufacturing line.
62. The method of claim 44 wherein receiving information representative of events or conditions includes receiving, by the event trigger processor, image data from one or more cameras.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/125,437 US20210142436A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662374162P | 2016-08-12 | 2016-08-12 | |
US15/672,182 US10902540B2 (en) | 2016-08-12 | 2017-08-08 | Event-driven authentication of physical objects |
US17/125,437 US20210142436A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/672,182 Continuation US10902540B2 (en) | 2016-08-12 | 2017-08-08 | Event-driven authentication of physical objects |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210142436A1 true US20210142436A1 (en) | 2021-05-13 |
Family
ID=59677018
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/672,182 Active 2037-11-15 US10902540B2 (en) | 2016-08-12 | 2017-08-08 | Event-driven authentication of physical objects |
US17/125,431 Abandoned US20210104008A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
US17/125,437 Abandoned US20210142436A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
US17/125,424 Abandoned US20210104007A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/672,182 Active 2037-11-15 US10902540B2 (en) | 2016-08-12 | 2017-08-08 | Event-driven authentication of physical objects |
US17/125,431 Abandoned US20210104008A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/125,424 Abandoned US20210104007A1 (en) | 2016-08-12 | 2020-12-17 | Event-driven authentication of physical objects |
Country Status (2)
Country | Link |
---|---|
US (4) | US10902540B2 (en) |
EP (1) | EP3282391A1 (en) |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8774455B2 (en) | 2011-03-02 | 2014-07-08 | Raf Technology, Inc. | Document fingerprinting |
US9443298B2 (en) | 2012-03-02 | 2016-09-13 | Authentect, Inc. | Digital fingerprinting object authentication and anti-counterfeiting system |
US10572883B2 (en) | 2016-02-19 | 2020-02-25 | Alitheon, Inc. | Preserving a level of confidence of authenticity of an object |
EP3236401A1 (en) | 2016-04-18 | 2017-10-25 | Alitheon, Inc. | Authentication-triggered processes |
US10740767B2 (en) | 2016-06-28 | 2020-08-11 | Alitheon, Inc. | Centralized databases storing digital fingerprints of objects for collaborative authentication |
US10915612B2 (en) | 2016-07-05 | 2021-02-09 | Alitheon, Inc. | Authenticated production |
US10839528B2 (en) | 2016-08-19 | 2020-11-17 | Alitheon, Inc. | Authentication-based tracking |
EP3435287A3 (en) | 2017-07-25 | 2019-05-01 | Alitheon, Inc. | Model-based digital fingerprinting |
US11087013B2 (en) | 2018-01-22 | 2021-08-10 | Alitheon, Inc. | Secure digital fingerprint key object database |
US11941114B1 (en) * | 2018-01-31 | 2024-03-26 | Vivint, Inc. | Deterrence techniques for security and automation systems |
JP6962858B2 (en) * | 2018-04-27 | 2021-11-05 | ファナック株式会社 | Image management device |
CN108872972B (en) * | 2018-05-30 | 2020-07-14 | 杭州电子科技大学 | Signal source positioning method based on event-triggered communication |
US10963670B2 (en) | 2019-02-06 | 2021-03-30 | Alitheon, Inc. | Object change detection and measurement using digital fingerprints |
US20220156349A1 (en) * | 2019-03-26 | 2022-05-19 | Nec Corporation | Authentication method, authentication device, program |
EP3734506A1 (en) | 2019-05-02 | 2020-11-04 | Alitheon, Inc. | Automated authentication region localization and capture |
EP3736717A1 (en) * | 2019-05-10 | 2020-11-11 | Alitheon, Inc. | Loop chain digital fingerprint method and system |
US11238146B2 (en) | 2019-10-17 | 2022-02-01 | Alitheon, Inc. | Securing composite objects using digital fingerprints |
EP3859603A1 (en) | 2020-01-28 | 2021-08-04 | Alitheon, Inc. | Depth-based digital fingerprinting |
US11568683B2 (en) | 2020-03-23 | 2023-01-31 | Alitheon, Inc. | Facial biometrics system and method using digital fingerprints |
US11341348B2 (en) | 2020-03-23 | 2022-05-24 | Alitheon, Inc. | Hand biometrics system and method using digital fingerprints |
EP3929806A3 (en) | 2020-04-06 | 2022-03-09 | Alitheon, Inc. | Local encoding of intrinsic authentication data |
US11663849B1 (en) | 2020-04-23 | 2023-05-30 | Alitheon, Inc. | Transform pyramiding for fingerprint matching system and method |
US11983957B2 (en) | 2020-05-28 | 2024-05-14 | Alitheon, Inc. | Irreversible digital fingerprints for preserving object security |
EP3926496A1 (en) | 2020-06-17 | 2021-12-22 | Alitheon, Inc. | Asset-backed digital security tokens |
WO2023205526A1 (en) * | 2022-04-22 | 2023-10-26 | Outlander Capital LLC | Blockchain powered art authentication |
US12111869B2 (en) | 2022-08-08 | 2024-10-08 | Bank Of America Corporation | Identifying an implementation of a user-desired interaction using machine learning |
US12299093B1 (en) | 2022-08-23 | 2025-05-13 | Wells Fargo Bank, N.A. | Machine-learning for real-time and secure analysis of digital metrics |
US12301558B1 (en) | 2022-08-23 | 2025-05-13 | Wells Fargo Bank, N.A. | Secure generation of authentication datasets from network activity |
US12200132B1 (en) | 2022-08-25 | 2025-01-14 | Wells Fargo Bank, N.A. | Secure multi-verification of biometric data in a distributed computing environment |
US12347095B1 (en) | 2022-08-26 | 2025-07-01 | Wells Fargo Bank, N.A. | Sensor data processing for monitoring device performance |
US12248545B1 (en) | 2022-09-01 | 2025-03-11 | Wells Fargo Bank, N.A. | Secure digital authorization via generated datasets |
US20240334029A1 (en) * | 2023-03-30 | 2024-10-03 | Dragonfruit Ai, Inc. | Energy management of cameras in an environment |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6107918A (en) * | 1997-11-25 | 2000-08-22 | Micron Electronics, Inc. | Method for personal computer-based home surveillance |
US20110064267A1 (en) * | 2009-09-17 | 2011-03-17 | Wesley Kenneth Cobb | Classifier anomalies for observed behaviors in a video surveillance system |
US20120051598A1 (en) * | 2009-04-28 | 2012-03-01 | Nec Corporation | Object position estimation device, object position estimation method and program |
US20120316676A1 (en) * | 2011-06-10 | 2012-12-13 | Microsoft Corporation | Interactive robot initialization |
US20140140570A1 (en) * | 2011-09-15 | 2014-05-22 | Raf Technology, Inc. | Object identification and inventory management |
US20140201094A1 (en) * | 2013-01-16 | 2014-07-17 | Amazon Technologies, Inc. | Unauthorized product detection techniques |
US20140380446A1 (en) * | 2013-05-23 | 2014-12-25 | Tencent Technology (Shenzhen) Co., Ltd. | Method and apparatus for protecting browser private information |
US20190095744A1 (en) * | 2016-03-02 | 2019-03-28 | Siemens Aktiengesellschaft | Method for making a description of a piece of luggage and luggage description system |
Family Cites Families (257)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4218674A (en) | 1975-09-09 | 1980-08-19 | Dasy Inter S.A. | Method and a system for verifying authenticity safe against forgery |
CA1160749A (en) | 1980-06-23 | 1984-01-17 | Robert N. Goldman | Non-counterfeitable document system |
LU83333A1 (en) | 1981-05-04 | 1983-03-24 | Euratom | USE OF SURFACE TEXTURES AS A SINGLE IDENTITY MARK |
FR2567947B1 (en) | 1984-07-23 | 1986-12-26 | Euratom | SURFACE TEXTURE READING ACCESS CONTROL SYSTEM |
JPS61234481A (en) | 1985-04-10 | 1986-10-18 | Toshiba Corp | Method for recognizing information |
US4700400A (en) | 1985-10-10 | 1987-10-13 | The Palantir Corporation | Feature extraction technique for use in a pattern recognition system |
US4742555A (en) * | 1986-09-30 | 1988-05-03 | Pattern Processing Technologies, Inc. | Pattern processor controlled illuminator |
US5133019A (en) | 1987-12-03 | 1992-07-21 | Identigrade | Systems and methods for illuminating and evaluating surfaces |
US4921107A (en) | 1988-07-01 | 1990-05-01 | Pitney Bowes Inc. | Mail sortation system |
US5079714A (en) | 1989-10-03 | 1992-01-07 | Pitney Bowes Inc. | Mail deliverability by mail and database processing |
US5031223A (en) | 1989-10-24 | 1991-07-09 | International Business Machines Corporation | System and method for deferred processing of OCR scanned mail |
NL9001368A (en) | 1990-06-15 | 1992-01-02 | Tel Developments B V | SECURITY OF OBJECTS OR DOCUMENTS. |
US5518122A (en) | 1991-08-09 | 1996-05-21 | Westinghouse Electric Corp. | Modular mail processing method and control system |
US5703783A (en) | 1992-04-06 | 1997-12-30 | Electrocom Automation, L.P. | Apparatus for intercepting and forwarding incorrectly addressed postal mail |
US5422821B1 (en) | 1992-04-06 | 1998-07-21 | Electrocom Automation Lp | Apparatus for intercepting and forwarding incorrectly addressed postal mail |
US5734568A (en) | 1992-08-21 | 1998-03-31 | International Business Machines Corporation | Data processing system for merger of sorting information and redundancy information to provide contextual predictive keying for postal addresses |
US5393939A (en) | 1992-11-20 | 1995-02-28 | Westinghouse Electric Corp. | Apparatus and method for weighing moving objects |
US5324927A (en) | 1993-01-08 | 1994-06-28 | Board Of Regents-Univ. Of Nebraska | Return mail piece and method of marking the same |
US5521984A (en) | 1993-06-10 | 1996-05-28 | Verification Technologies, Inc. | System for registration, identification and verification of items utilizing unique intrinsic features |
JPH07192112A (en) | 1993-12-27 | 1995-07-28 | Oki Electric Ind Co Ltd | Intruder recognition method |
US20090097695A9 (en) | 1995-05-08 | 2009-04-16 | Rhoads Geoffrey B | Personal document authentication system using watermarking |
CA2183608A1 (en) | 1995-08-23 | 1997-02-24 | Flavio M. Manduley | Apparatus and method for generating address change notice |
US6246794B1 (en) | 1995-12-13 | 2001-06-12 | Hitachi, Ltd. | Method of reading characters and method of reading postal addresses |
US6860375B2 (en) | 1996-05-29 | 2005-03-01 | Cummins-Allison Corporation | Multiple pocket currency bill processing device and method |
US5923848A (en) | 1996-05-31 | 1999-07-13 | Microsoft Corporation | System and method for resolving names in an electronic messaging environment |
US5745590A (en) | 1996-08-08 | 1998-04-28 | U S West, Inc. | Closed loop mail piece processing method |
US5883971A (en) | 1996-10-23 | 1999-03-16 | International Business Machines Corporation | System and method for determining if a fingerprint image contains an image portion representing a smudged fingerprint impression |
DE19644163A1 (en) | 1996-10-24 | 1998-05-07 | Siemens Ag | Method and device for online processing of mail items to be forwarded |
US5974150A (en) | 1997-09-30 | 1999-10-26 | Tracer Detection Technology Corp. | System and method for authentication of goods |
US6343327B2 (en) | 1997-11-12 | 2002-01-29 | Pitney Bowes Inc. | System and method for electronic and physical mass mailing |
US6205261B1 (en) | 1998-02-05 | 2001-03-20 | At&T Corp. | Confusion set based method and system for correcting misrecognized words appearing in documents generated by an optical character recognition technique |
JP3246432B2 (en) | 1998-02-10 | 2002-01-15 | 株式会社日立製作所 | Address reader and mail sorting machine |
JPH11226513A (en) | 1998-02-18 | 1999-08-24 | Toshiba Corp | Mail address reader and mail address classifier |
US7068808B1 (en) | 1998-06-10 | 2006-06-27 | Prokoski Francine J | Method and apparatus for alignment, comparison and identification of characteristic tool marks, including ballistic signatures |
US6400805B1 (en) | 1998-06-15 | 2002-06-04 | At&T Corp. | Statistical database correction of alphanumeric identifiers for speech recognition and touch-tone recognition |
GB2345264B (en) | 1998-12-29 | 2001-01-24 | Rue De Int Ltd | Improvement in security features |
US6597809B1 (en) | 1999-03-19 | 2003-07-22 | Raf Technology, Inc. | Rollup functions for efficient storage presentation and analysis of data |
US6434601B1 (en) | 1999-03-31 | 2002-08-13 | Micron Technology, Inc. | Pre test electronic mail process |
US6549892B1 (en) | 1999-05-21 | 2003-04-15 | Pitney Bowes Inc. | System for delivering mail |
WO2001001260A2 (en) | 1999-06-30 | 2001-01-04 | Raf Technology, Inc. | Secure, limited-access database system and method |
SG67584A1 (en) | 1999-07-08 | 2001-02-20 | Ct For Signal Proc Of The Nany | Two-stage local and global fingerprint matching technique for automated fingerprint verification/indentification |
US6977353B1 (en) | 1999-08-31 | 2005-12-20 | United States Postal Service | Apparatus and methods for identifying and processing mail using an identification code |
US6539098B1 (en) | 1999-09-24 | 2003-03-25 | Mailcode Inc. | Mail processing systems and methods |
US6833911B2 (en) | 1999-10-08 | 2004-12-21 | Identification Dynamics, Inc. | Method and apparatus for reading firearm microstamping |
US6424728B1 (en) | 1999-12-02 | 2002-07-23 | Maan Ammar | Method and apparatus for verification of signatures |
KR100336498B1 (en) | 2000-01-27 | 2002-05-15 | 오길록 | Delivery Information of Mail Items and Automatic Sorting Process System |
US6954729B2 (en) | 2000-01-27 | 2005-10-11 | Bowe Bell & Howell Postal Systems Company | Address learning system and method for using same |
US6370259B1 (en) | 2000-02-04 | 2002-04-09 | Engineered Support Systems, Inc. | Automatic address extractor |
US6741724B1 (en) | 2000-03-24 | 2004-05-25 | Siemens Dematic Postal Automation, L.P. | Method and system for form processing |
US6778703B1 (en) | 2000-04-19 | 2004-08-17 | International Business Machines Corporation | Form recognition using reference areas |
DE10021734C1 (en) | 2000-05-04 | 2001-05-23 | Siemens Ag | Method and device for determining areas with distribution information on shipments |
US6360001B1 (en) | 2000-05-10 | 2002-03-19 | International Business Machines Corporation | Automatic location of address information on parcels sent by mass mailers |
US7152047B1 (en) | 2000-05-24 | 2006-12-19 | Esecure.Biz, Inc. | System and method for production and authentication of original documents |
US7162035B1 (en) | 2000-05-24 | 2007-01-09 | Tracer Detection Technology Corp. | Authentication method and system |
DE10030404A1 (en) | 2000-06-21 | 2002-01-03 | Bosch Gmbh Robert | Fingerprint identification method for car access control system, involves comparing input fingerprint data with reference fingerprints having highest similarity being sorted based on identified degree of similarity |
ATE499160T1 (en) | 2000-06-26 | 2011-03-15 | Us Postal Service | METHOD AND SYSTEM FOR PROCESSING LETTERS AND FLAT OBJECTS IN A SINGLE TURN |
CA2417663C (en) | 2000-07-28 | 2008-09-30 | Raf Technology, Inc. | Orthogonal technology for multi-line character recognition |
US7120302B1 (en) | 2000-07-31 | 2006-10-10 | Raf Technology, Inc. | Method for improving the accuracy of character recognition processes |
US20040217159A1 (en) | 2000-09-29 | 2004-11-04 | Belanger Rene M | Method and system for identification of firearms |
US20020091945A1 (en) | 2000-10-30 | 2002-07-11 | Ross David Justin | Verification engine for user authentication |
EP1202214A3 (en) | 2000-10-31 | 2005-02-23 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for object recognition |
US7016532B2 (en) | 2000-11-06 | 2006-03-21 | Evryx Technologies | Image capture and identification system and process |
US7680324B2 (en) | 2000-11-06 | 2010-03-16 | Evryx Technologies, Inc. | Use of image-derived information as search criteria for internet and other search engines |
US6865559B2 (en) | 2000-12-07 | 2005-03-08 | International Business Machines Corporation | Method and system in electronic commerce for inspection-service-based release of escrowed payments |
US7518080B2 (en) | 2000-12-15 | 2009-04-14 | United States Postal Service | Just-in-time sort plan creation |
WO2002048953A1 (en) | 2000-12-15 | 2002-06-20 | United States Postal Service | Method and apparatus for alphanumeric recognition |
GB0031016D0 (en) | 2000-12-20 | 2001-01-31 | Alphafox Systems Ltd | Security systems |
JP2002222424A (en) | 2001-01-29 | 2002-08-09 | Nec Corp | Fingerprint matching system |
AU2002306576A1 (en) | 2001-02-23 | 2002-09-12 | Technical Graphics Security Products, Llc | Security label having security element and method of making same |
US6816602B2 (en) | 2001-03-01 | 2004-11-09 | Lockheed Martin Corporation | System and method of deferred postal address processing |
WO2002080520A2 (en) | 2001-03-30 | 2002-10-10 | Siemens Dematic Postal Automation, L.P. | Method and system for image processing |
US7996333B2 (en) | 2001-04-13 | 2011-08-09 | United States Postal Service | Manifest delivery system and method |
AU2002305641A1 (en) | 2001-05-16 | 2002-11-25 | United States Postal Service | Dynamic change of address notification |
US6829369B2 (en) | 2001-05-18 | 2004-12-07 | Lockheed Martin Corporation | Coding depth file and method of postal address processing using a coding depth file |
US6985926B1 (en) | 2001-08-29 | 2006-01-10 | I-Behavior, Inc. | Method and system for matching and consolidating addresses in a database |
US8437530B1 (en) | 2001-09-27 | 2013-05-07 | Cummins-Allison Corp. | Apparatus and system for imaging currency bills and financial documents and method for using the same |
US6922687B2 (en) | 2001-12-21 | 2005-07-26 | Barca, L.L.C. | Closed loop asset management process |
US7590544B2 (en) | 2002-03-06 | 2009-09-15 | United States Postal Service | Method for correcting a mailing address |
US6816784B1 (en) | 2002-03-08 | 2004-11-09 | Navteq North America, Llc | Method and system using delivery trucks to collect address location data |
US6697500B2 (en) | 2002-03-11 | 2004-02-24 | Bowe Bell + Howell Postal Systems Company | Method and system for mail detection and tracking of categorized mail pieces |
ATE354834T1 (en) | 2002-03-15 | 2007-03-15 | Computer Sciences Corp | METHOD AND DEVICE FOR ANALYZING WRITING IN DOCUMENTS |
TWI281126B (en) | 2002-03-19 | 2007-05-11 | Intervideo Digital Technology | Image detection method based on region |
WO2003087991A2 (en) | 2002-04-09 | 2003-10-23 | The Escher Group, Ltd. | System and method for authentication of a workpiece using three dimensional shape recovery |
EP1512245A4 (en) | 2002-05-29 | 2010-11-10 | Raf Technology Inc | Authentication query strategizer and results compiler |
US20040065598A1 (en) | 2002-06-17 | 2004-04-08 | Ross David Justin | Address disambiguation for mail-piece routing |
FR2841673B1 (en) | 2002-06-26 | 2004-12-03 | Solystic | TIMING OF POSTAL OBJECTS BY IMAGE SIGNATURE AND ASSOCIATED SORTING MACHINE |
JP4059047B2 (en) | 2002-09-24 | 2008-03-12 | セイコーエプソン株式会社 | Input device, information device, and control information generation method |
AU2003285891A1 (en) * | 2002-10-15 | 2004-05-04 | Digimarc Corporation | Identification document and related methods |
US7286634B2 (en) | 2002-12-23 | 2007-10-23 | Select Technologies, Llc | Method and apparatus for improving baggage screening examination |
JP2004252621A (en) | 2003-02-19 | 2004-09-09 | Chiyoda Maintenance Kk | Product authentication system preventing market distribution of fake |
US20050119786A1 (en) | 2003-04-22 | 2005-06-02 | United Parcel Service Of America, Inc. | System, method and computer program product for containerized shipping of mail pieces |
US7949613B2 (en) | 2003-05-02 | 2011-05-24 | Siemens Industry, Inc. | Apparatus for improved sortation and delivery point processing of military mail |
US7256398B2 (en) | 2003-06-26 | 2007-08-14 | Prime Technology Llc | Security markers for determining composition of a medium |
US20050007776A1 (en) * | 2003-07-07 | 2005-01-13 | Monk Bruce C. | Method and system for a processor controlled illumination system for reading and analyzing materials |
EP2003624A1 (en) | 2003-08-01 | 2008-12-17 | Cummins-Allison Corporation | Currency processing device and method |
WO2005086616A2 (en) | 2003-09-10 | 2005-09-22 | Sommer Jr Edward J | Method and apparatus for improving baggage screening examination |
GB0321090D0 (en) | 2003-09-10 | 2003-10-08 | Abs Systems Inc | Baggage screening system |
US7305404B2 (en) | 2003-10-21 | 2007-12-04 | United Parcel Service Of America, Inc. | Data structure and management system for a superset of relational databases |
ITTO20030859A1 (en) | 2003-10-31 | 2005-05-01 | Elsag Spa | SUPPORT SYSTEM FOR DELIVERY OF POSTAL ITEMS. |
US20050125360A1 (en) | 2003-12-09 | 2005-06-09 | Tidwell Lisa C. | Systems and methods for obtaining authentication marks at a point of sale |
US20050137882A1 (en) | 2003-12-17 | 2005-06-23 | Cameron Don T. | Method for authenticating goods |
WO2005069186A1 (en) | 2003-12-29 | 2005-07-28 | United States Postal Service | Methods and systems for providing secondary address information |
US7587064B2 (en) | 2004-02-03 | 2009-09-08 | Hrl Laboratories, Llc | Active learning system for object fingerprinting |
FR2866252B1 (en) | 2004-02-18 | 2006-04-21 | Solystic | METHOD FOR SORTING POSTAL SHIPMENTS IN MULTIPLE SORT PASSES |
US20050188213A1 (en) | 2004-02-23 | 2005-08-25 | Xiaoshu Xu | System for personal identity verification |
WO2005082056A2 (en) | 2004-02-25 | 2005-09-09 | Ecoenvelopes, Llc | Reusable envelope structures and methods |
US20050204144A1 (en) * | 2004-03-10 | 2005-09-15 | Kabushiki Kaisha Toshiba | Image processing apparatus and personal information management program |
AU2005220385B2 (en) | 2004-03-12 | 2010-07-15 | Ingenia Holdings Limited | Authenticity verification methods, products and apparatuses |
US8103716B2 (en) | 2004-05-05 | 2012-01-24 | United States Postal Service | Methods and systems for forwarding an item to an alternative address |
FR2870376B1 (en) | 2004-05-11 | 2006-09-22 | Yann Boutant | METHOD FOR RECOGNIZING FIBROUS MEDIA, AND APPLICATIONS OF SUCH A METHOD IN THE COMPUTER FIELD, IN PARTICULAR |
US20050289061A1 (en) | 2004-06-24 | 2005-12-29 | Michael Kulakowski | Secure authentication system for collectable and consumer items |
US7212949B2 (en) | 2004-08-31 | 2007-05-01 | Intelligent Automation, Inc. | Automated system and method for tool mark analysis |
EP1653395A3 (en) | 2004-10-01 | 2006-05-10 | Axalto SA | Protection method for a portable personal object |
EP1645992A1 (en) | 2004-10-08 | 2006-04-12 | Philip Morris Products S.A. | Methods and systems for marking, tracking and authentication of products |
US20060083414A1 (en) | 2004-10-14 | 2006-04-20 | The Secretary Of State For The Home Department | Identifier comparison |
US7564593B2 (en) * | 2004-11-19 | 2009-07-21 | Xerox Corporation | Method and apparatus for identifying document size |
EP1846881A4 (en) | 2005-01-28 | 2009-08-26 | United Parcel Service Inc | REGISTRATION AND MAINTENANCE OF ADDRESS DATA FOR EACH SERVICE POINT OF A TERRITORY |
US20080011841A1 (en) | 2005-02-03 | 2008-01-17 | Yottamark, Inc. | System and Method of Detecting Product Code Duplication and Product Diversion |
US8157171B2 (en) * | 2005-02-25 | 2012-04-17 | Nidec Sankyo Corporation | Information reading apparatus |
FR2883493B1 (en) | 2005-03-24 | 2007-04-20 | Solystic Sas | METHOD FOR PROCESSING SHIPMENTS INCLUDING DIGITAL IMPRINT MANAGEMENT OF SHIPMENTS |
US7676433B1 (en) | 2005-03-24 | 2010-03-09 | Raf Technology, Inc. | Secure, confidential authentication with private data |
WO2006124910A2 (en) | 2005-05-17 | 2006-11-23 | United States Postal Service | System and method for automated management of an address database |
US7096152B1 (en) | 2005-07-20 | 2006-08-22 | Pitney Bowes Inc. | Method and apparatus for determining weight of moving mailpieces |
DK2024899T3 (en) | 2005-09-05 | 2016-02-15 | Alpvision S.A. | Means for using the material surface microstructure as a unique identifier |
US9208394B2 (en) * | 2005-09-05 | 2015-12-08 | Alpvision S.A. | Authentication of an article of manufacture using an image of the microstructure of its surface |
WO2007031176A1 (en) | 2005-09-13 | 2007-03-22 | Siemens Aktiengesellschaft | Device for determining the mass of an object, in particular an item of mail |
JP2007094862A (en) | 2005-09-29 | 2007-04-12 | Sharp Corp | Information generation device, information generation method, information generation program and machine-readable recording medium |
US7603344B2 (en) | 2005-10-19 | 2009-10-13 | Advanced Digital Forensic Solutions, Inc. | Methods for searching forensic data |
KR100831601B1 (en) | 2005-10-26 | 2008-05-23 | 이항경 | Method and system for goods authentication on a communication network using a serial number and password |
DE102005058480A1 (en) | 2005-12-07 | 2007-06-14 | Siemens Ag | Object's e.g. internal organ, medical image data set position-correct assigning method, involves selecting two sub-areas with respect to objects in each set, and determining local measure for position deviation of both sets in each sub-area |
FR2895541B3 (en) | 2005-12-23 | 2008-04-18 | Signoptic Technologies Sarl | METHOD FOR EXTRACTING A RANDOM SIGNATURE FROM A MATERIAL ELEMENT |
US7822263B1 (en) | 2005-12-28 | 2010-10-26 | Prokoski Francine J | Method and apparatus for alignment, comparison and identification of characteristic tool marks, including ballistic signatures |
DE102006005927A1 (en) | 2006-02-06 | 2007-08-09 | Dietrich Heinicke | Copy protection in conjunction with protection against counterfeiting |
JP2007213148A (en) | 2006-02-07 | 2007-08-23 | Fuji Xerox Co Ltd | Device and method of determining genuineness of banknote, device and method of registering surface characteristics of banknote, system of determining genuineness of banknote, device and method of printing card, device and method of determining genuineness of card, system of determining genuineness of card, and card |
US7787711B2 (en) | 2006-03-09 | 2010-08-31 | Illinois Institute Of Technology | Image-based indexing and classification in image databases |
US7958019B2 (en) | 2006-03-13 | 2011-06-07 | Ebay Inc. | Peer-to-peer trading platform with roles-based transactions |
GB2440386A (en) | 2006-06-12 | 2008-01-30 | Ingenia Technology Ltd | Scanner authentication |
US20080008377A1 (en) | 2006-07-07 | 2008-01-10 | Lockheed Martin Corporation | Postal indicia categorization system |
GB2451392A (en) | 2006-11-27 | 2009-01-28 | Authix Technologies Ltd | System for product authentication and tracking |
US20080128496A1 (en) | 2006-12-01 | 2008-06-05 | Patrick Bertranou | Method and apparatus for verification of items |
US7926210B2 (en) * | 2007-02-06 | 2011-04-19 | Freddy Versteeg | Apparatus and method for baggage check and promotional advertisement |
US8022832B2 (en) | 2007-02-15 | 2011-09-20 | Eprovenance, Llc | Methods and systems for certifying provenance of alcoholic beverages |
US7840340B2 (en) | 2007-04-13 | 2010-11-23 | United Parcel Service Of America, Inc. | Systems, methods, and computer program products for generating reference geocodes for point addresses |
US7941442B2 (en) | 2007-04-18 | 2011-05-10 | Microsoft Corporation | Object similarity search in high-dimensional vector spaces |
CA2629930A1 (en) | 2007-04-26 | 2008-10-26 | Bowe Bell + Howell Company | Document processing system control using document feature analysis for identification |
US7816617B2 (en) | 2007-05-21 | 2010-10-19 | Lockheed Martin Corporation | Configurable intelligent conveyor system and method |
US8108309B2 (en) | 2007-05-29 | 2012-01-31 | Provalidate | Protecting a manufactured item from counterfeiting |
FR2918199B1 (en) | 2007-06-26 | 2009-08-21 | Solystic Sas | METHOD FOR PROCESSING POSTAL SHIPMENTS THAT EXPLOITS THE VIRTUAL IDENTIFICATION OF SHIPMENTS WITH READDRESSING |
US7834289B2 (en) | 2007-08-30 | 2010-11-16 | Bowe Bell & Howell Company | Mail processing system for address change service |
FR2920678B1 (en) | 2007-09-07 | 2009-10-16 | Solystic Sas | METHOD FOR PROCESSING POSTAL DELIVERY WITH CLIENT CODES ASSOCIATED WITH DIGITAL IMPRESSIONS. |
US7687727B2 (en) | 2007-09-13 | 2010-03-30 | Raf Technology, Inc. | Weigh on the fly |
US8650097B2 (en) | 2007-12-03 | 2014-02-11 | Yu Yung Choi | System and method for streamlined registration of products over a communication network and for verification and management of information related thereto |
US8194933B2 (en) | 2007-12-12 | 2012-06-05 | 3M Innovative Properties Company | Identification and verification of an unknown document according to an eigen image process |
KR100926565B1 (en) | 2007-12-13 | 2009-11-12 | 한국전자통신연구원 | Address database construction device and method |
DE112009000100T5 (en) | 2008-01-04 | 2010-11-11 | 3M Innovative Properties Co., St. Paul | Navigate between images of an object in 3D space |
GB2456216A (en) | 2008-01-11 | 2009-07-15 | Lockheed Martin Uk Ltd | Block analyser in mail sorting system |
US8150108B2 (en) | 2008-03-17 | 2012-04-03 | Ensign Holdings, Llc | Systems and methods of identification based on biometric parameters |
US8705873B2 (en) | 2008-03-20 | 2014-04-22 | Universite De Geneve | Secure item identification and authentication system and method based on unclonable features |
US8358852B2 (en) | 2008-03-31 | 2013-01-22 | Lexmark International, Inc. | Automatic forms identification systems and methods |
EP2107506A1 (en) | 2008-04-02 | 2009-10-07 | BCS machine vision GmbH | Method for identifying uniform objects in production lines without codes |
US8180667B1 (en) | 2008-06-03 | 2012-05-15 | Google Inc. | Rewarding creative use of product placements in user-contributed videos |
US8626672B2 (en) | 2008-07-23 | 2014-01-07 | I-Property Holding Corp. | Secure tracking of tablets |
WO2010018464A2 (en) | 2008-08-12 | 2010-02-18 | Medical Systems Design Pty Ltd. | Methods and systems for tracking objects in a validation process |
US8385971B2 (en) | 2008-08-19 | 2013-02-26 | Digimarc Corporation | Methods and systems for content processing |
EP2166493A1 (en) | 2008-09-12 | 2010-03-24 | BRITISH TELECOMMUNICATIONS public limited company | Control of supply networks and verification of items |
US20100070527A1 (en) | 2008-09-18 | 2010-03-18 | Tianlong Chen | System and method for managing video, image and activity data |
US8285734B2 (en) | 2008-10-29 | 2012-10-09 | International Business Machines Corporation | Comparison of documents based on similarity measures |
TWI405457B (en) | 2008-12-18 | 2013-08-11 | Ind Tech Res Inst | Multi-target tracking system, method and smart node using active camera handoff |
US20100166303A1 (en) | 2008-12-31 | 2010-07-01 | Ali Rahimi | Object recognition using global similarity-based classifier |
US8374920B2 (en) | 2009-01-21 | 2013-02-12 | Nike, Inc. | Anti-counterfeiting system and method |
CN102349091B (en) | 2009-03-13 | 2014-08-06 | 日本电气株式会社 | Image identifier matching device |
US8374399B1 (en) | 2009-03-29 | 2013-02-12 | Verichk Global Technology Inc. | Apparatus for authenticating standardized documents |
US8391583B1 (en) | 2009-04-15 | 2013-03-05 | Cummins-Allison Corp. | Apparatus and system for imaging currency bills and financial documents and method for using the same |
DE102009020664A1 (en) | 2009-05-11 | 2010-11-25 | Siemens Aktiengesellschaft | Method and device for sorting various objects |
JP2010277252A (en) | 2009-05-27 | 2010-12-09 | Toshiba Corp | Paper sheet handling apparatus |
US8194938B2 (en) | 2009-06-02 | 2012-06-05 | George Mason Intellectual Properties, Inc. | Face authentication using recognition-by-parts, boosting, and transduction |
GB2547363A (en) | 2009-06-08 | 2017-08-16 | Kezzler As | Method and system for storage and retrieval of track and trace information |
US8644622B2 (en) | 2009-07-30 | 2014-02-04 | Xerox Corporation | Compact signature for unordered vector sets with application to image retrieval |
US8116527B2 (en) | 2009-10-07 | 2012-02-14 | The United States Of America As Represented By The Secretary Of The Army | Using video-based imagery for automated detection, tracking, and counting of moving objects, in particular those objects having image characteristics similar to background |
US9558520B2 (en) | 2009-12-31 | 2017-01-31 | Hartford Fire Insurance Company | System and method for geocoded insurance processing using mobile devices |
US8520903B2 (en) | 2010-02-01 | 2013-08-27 | Daon Holdings Limited | Method and system of accounting for positional variability of biometric features |
US8041956B1 (en) | 2010-08-16 | 2011-10-18 | Daon Holdings Limited | Method and system for biometric authentication |
US20110267192A1 (en) | 2010-04-28 | 2011-11-03 | Alcatel-Lucent Usa Inc. | Individualized Baggage Tracking And Notification System |
WO2011163296A2 (en) | 2010-06-24 | 2011-12-29 | Yariv Glazer | Methods of marking products to permit tracking their later identification and tracking, applications of such methods, and products produced by such methods |
US8457354B1 (en) | 2010-07-09 | 2013-06-04 | Target Brands, Inc. | Movement timestamping and analytics |
GB2515926B (en) | 2010-07-19 | 2015-02-11 | Ipsotek Ltd | Apparatus, system and method |
KR101144016B1 (en) | 2010-07-20 | 2012-05-09 | 한국과학기술원 | Method for Establishing Wi-Fi Fingerprint Database |
US20120089639A1 (en) | 2010-10-12 | 2012-04-12 | Dapeng Wang | System and method for retrieving lost baggage in travel industry |
US8526743B1 (en) | 2010-11-01 | 2013-09-03 | Raf Technology, Inc. | Defined data patterns for object handling |
US8639016B2 (en) | 2011-01-11 | 2014-01-28 | Bank Of America Corporation | Mobile communication device-based check verification |
US10504073B2 (en) | 2011-01-19 | 2019-12-10 | Alon Atsmon | System and process for automatically analyzing currency objects |
US9443298B2 (en) | 2012-03-02 | 2016-09-13 | Authentect, Inc. | Digital fingerprinting object authentication and anti-counterfeiting system |
US8774455B2 (en) | 2011-03-02 | 2014-07-08 | Raf Technology, Inc. | Document fingerprinting |
US8717165B2 (en) | 2011-03-22 | 2014-05-06 | Tassilo Gernandt | Apparatus and method for locating, tracking, controlling and recognizing tagged objects using RFID technology |
TWI424377B (en) | 2011-04-01 | 2014-01-21 | Altek Corp | Method for analyzing object motion in multi frames |
JP6036805B2 (en) | 2011-04-29 | 2016-12-07 | Monnaie Royale Canadienne/Royal Canadian Mint | Method and apparatus for appraisal of coins or other manufactured goods |
US8488842B2 (en) | 2011-06-23 | 2013-07-16 | Covectra, Inc. | Systems and methods for tracking and authenticating goods |
US9234843B2 (en) | 2011-08-25 | 2016-01-12 | Alliance For Sustainable Energy, Llc | On-line, continuous monitoring in solar cell and fuel cell manufacturing using spectral reflectance imaging |
US9361596B2 (en) | 2011-10-04 | 2016-06-07 | Raf Technology, Inc. | In-field device for de-centralized workflow automation |
CN102564589B (en) | 2011-12-20 | 2013-07-24 | 华中科技大学 | Spectral characteristic detection identification method for multi-wave-band moving objects and device thereof |
US9727911B2 (en) | 2012-02-10 | 2017-08-08 | New York University | Systems, method and computer-accessible mediums for providing secure paper transactions using paper fiber identifiers |
EP3413222B1 (en) | 2012-02-24 | 2020-01-22 | Nant Holdings IP, LLC | Content activation via interaction-based authentication, systems and method |
US8714442B2 (en) | 2012-04-19 | 2014-05-06 | Zortag Inc | System for and method of securing articles along a supply chain |
US9031329B1 (en) | 2012-05-02 | 2015-05-12 | Fourandsix Technologies, Inc. | Photo forensics using image signatures |
EP2850557A4 (en) | 2012-05-18 | 2016-01-06 | Sri Internat Inc | System and method for authenticating a manufactured product with a mobile device |
WO2013191281A1 (en) | 2012-06-22 | 2013-12-27 | 日本電気株式会社 | Verification method, verification system, verification device, and program therefor |
CN104541290A (en) | 2012-07-23 | 2015-04-22 | Metaio有限公司 | Method of providing image feature descriptors |
US20140032322A1 (en) | 2012-07-26 | 2014-01-30 | Marc Schwieger | Methods and systems for on-site promotion of products |
JP6213843B2 (en) | 2012-09-13 | 2017-10-18 | 日本電気株式会社 | Image processing system, image processing method, and program |
US10225524B2 (en) | 2013-04-16 | 2019-03-05 | Nec Corporation | Information processing system, information processing method, and program |
US9286528B2 (en) | 2013-04-16 | 2016-03-15 | Imageware Systems, Inc. | Multi-modal biometric database searching methods |
GB2516037A (en) | 2013-07-08 | 2015-01-14 | Univ Surrey | Compact and robust signature for large scale visual search, retrieval and classification |
US20150058123A1 (en) | 2013-08-23 | 2015-02-26 | Michael George Lenahan | Contextually aware interactive advertisements |
US9390327B2 (en) | 2013-09-16 | 2016-07-12 | Eyeverify, Llc | Feature extraction and matching for biometric authentication |
JP2016535375A (en) | 2013-09-20 | 2016-11-10 | Mobile Search Security LLC | Certificate and document authentication system |
EP2869240A3 (en) | 2013-11-01 | 2015-06-03 | RAF Technology, Inc. | Digital fingerprinting object authentication and anti-counterfeiting system |
US20150127430A1 (en) | 2013-11-04 | 2015-05-07 | Statimate Systems Inc. | Method and system for identifying items and managing a collection of items |
US9350714B2 (en) | 2013-11-19 | 2016-05-24 | Globalfoundries Inc. | Data encryption at the client and server level |
WO2015089346A1 (en) | 2013-12-13 | 2015-06-18 | Battelle Memorial Institute | Electronic component classification |
US20160057138A1 (en) | 2014-03-07 | 2016-02-25 | Hoyos Labs Ip Ltd. | System and method for determining liveness |
US9224196B2 (en) | 2014-03-12 | 2015-12-29 | ClearMark Systems, LLC | System and method for authentication |
CN106462549B (en) | 2014-04-09 | 2020-02-21 | 尹度普有限公司 | Identifying Solid Objects Using Machine Learning from Microscopic Changes |
US20150309502A1 (en) | 2014-04-24 | 2015-10-29 | International Business Machines Corporation | Managing provenance and authenticity for digitally manufactured objects |
US9256944B2 (en) * | 2014-05-19 | 2016-02-09 | Rockwell Automation Technologies, Inc. | Integration of optical area monitoring with industrial machine control |
US20160034914A1 (en) | 2014-07-29 | 2016-02-04 | YPB Group Ltd | Process for crowd-sourced completion of investigative tasks |
US20160034329A1 (en) * | 2014-07-31 | 2016-02-04 | Western Integrated Technologies, Inc. | Correlation and prediction analysis of collected data |
TW201619917A (en) | 2014-09-09 | 2016-06-01 | 西克帕控股有限公司 | Banknotes with interrelated features |
US20160117631A1 (en) | 2014-10-22 | 2016-04-28 | Honeywell International Inc. | Orphaned item identification |
WO2016081831A1 (en) | 2014-11-21 | 2016-05-26 | Mutti Christopher M | Imaging system for object recognition and assessment |
US9940726B2 (en) | 2014-12-19 | 2018-04-10 | The Boeing Company | System and method to improve object tracking using tracking fingerprints |
CA2972721C (en) | 2014-12-30 | 2022-07-12 | Alarm.Com Incorporated | Digital fingerprint tracking |
WO2016123630A1 (en) | 2015-01-30 | 2016-08-04 | The United States Of America, As Represented By The Secretary, Department Of Health & Human Services | Devices and methods for detection of counterfeit or adulterated products and/or packaging |
BR112017016160A2 (en) * | 2015-01-30 | 2018-04-17 | Sicpa Holding Sa | device and method for authenticating a security article and using the device |
US9361507B1 (en) | 2015-02-06 | 2016-06-07 | Hoyos Labs Ip Ltd. | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices |
CA2978660C (en) * | 2015-04-10 | 2024-03-12 | Sicpa Holding Sa | Mobile, portable apparatus for authenticating a security article and method of operating the portable authentication apparatus |
US20170004444A1 (en) | 2015-07-01 | 2017-01-05 | Amadeus S.A.S. | Baggage tracking system |
DE102016115560A1 (en) | 2015-09-01 | 2017-03-02 | Johnson Electric S.A. | Single-phase brushless motor and power tool using it |
US9830506B2 (en) * | 2015-11-09 | 2017-11-28 | The United States Of America As Represented By The Secretary Of The Army | Method and apparatus for cross-modal face matching using polarimetric image data |
US10572883B2 (en) | 2016-02-19 | 2020-02-25 | Alitheon, Inc. | Preserving a level of confidence of authenticity of an object |
TWI601423B (en) | 2016-04-08 | 2017-10-01 | 晶睿通訊股份有限公司 | Image capture system and synchronization method thereof |
EP3236401A1 (en) | 2016-04-18 | 2017-10-25 | Alitheon, Inc. | Authentication-triggered processes |
CN110036390A (en) | 2016-05-24 | 2019-07-19 | 艾迪密身份与安全美国有限责任公司 | Document image quality assessment |
US10614302B2 (en) | 2016-05-26 | 2020-04-07 | Alitheon, Inc. | Controlled authentication of physical objects |
IL245932A (en) | 2016-05-30 | 2017-10-31 | Elbit Systems Land & C4I Ltd | System for object authenticity detection including a reference image acquisition module and a user module and methods therefor |
US10740767B2 (en) | 2016-06-28 | 2020-08-11 | Alitheon, Inc. | Centralized databases storing digital fingerprints of objects for collaborative authentication |
US10915612B2 (en) | 2016-07-05 | 2021-02-09 | Alitheon, Inc. | Authenticated production |
US20180018627A1 (en) | 2016-07-15 | 2018-01-18 | Alitheon, Inc. | Database records and processes to identify and track physical objects during transportation |
US10839528B2 (en) | 2016-08-19 | 2020-11-17 | Alitheon, Inc. | Authentication-based tracking |
EP3435287A3 (en) | 2017-07-25 | 2019-05-01 | Alitheon, Inc. | Model-based digital fingerprinting |
CN109583287B (en) | 2017-09-29 | 2024-04-12 | 浙江莲荷科技有限公司 | Object identification method and verification method |
US11087013B2 (en) | 2018-01-22 | 2021-08-10 | Alitheon, Inc. | Secure digital fingerprint key object database |
US20200153822A1 (en) | 2018-11-13 | 2020-05-14 | Alitheon, Inc. | Contact and non-contact image-based biometrics using physiological elements |
US20200233901A1 (en) | 2019-01-18 | 2020-07-23 | Alitheon, Inc. | Active identification and dimensioning |
US10963670B2 (en) | 2019-02-06 | 2021-03-30 | Alitheon, Inc. | Object change detection and measurement using digital fingerprints |
US20200257791A1 (en) | 2019-02-07 | 2020-08-13 | Alitheon, Inc. | Regionalized change detection using digital fingerprints |
EP3734506A1 (en) | 2019-05-02 | 2020-11-04 | Alitheon, Inc. | Automated authentication region localization and capture |
EP3736717A1 (en) | 2019-05-10 | 2020-11-11 | Alitheon, Inc. | Loop chain digital fingerprint method and system |
2017
- 2017-08-08: US application US15/672,182 published as US10902540B2 (en), legal status: Active
- 2017-08-10: EP application EP17185801.2A published as EP3282391A1 (en), legal status: Ceased
2020
- 2020-12-17: US application US17/125,431 published as US20210104008A1 (en), legal status: Abandoned
- 2020-12-17: US application US17/125,437 published as US20210142436A1 (en), legal status: Abandoned
- 2020-12-17: US application US17/125,424 published as US20210104007A1 (en), legal status: Abandoned
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6107918A (en) * | 1997-11-25 | 2000-08-22 | Micron Electronics, Inc. | Method for personal computer-based home surveillance |
US20120051598A1 (en) * | 2009-04-28 | 2012-03-01 | Nec Corporation | Object position estimation device, object position estimation method and program |
US20110064267A1 (en) * | 2009-09-17 | 2011-03-17 | Wesley Kenneth Cobb | Classifier anomalies for observed behaviors in a video surveillance system |
US20120316676A1 (en) * | 2011-06-10 | 2012-12-13 | Microsoft Corporation | Interactive robot initialization |
US20140140570A1 (en) * | 2011-09-15 | 2014-05-22 | Raf Technology, Inc. | Object identification and inventory management |
US20140201094A1 (en) * | 2013-01-16 | 2014-07-17 | Amazon Technologies, Inc. | Unauthorized product detection techniques |
US20140380446A1 (en) * | 2013-05-23 | 2014-12-25 | Tencent Technology (Shenzhen) Co., Ltd. | Method and apparatus for protecting browser private information |
US20190095744A1 (en) * | 2016-03-02 | 2019-03-28 | Siemens Aktiengesellschaft | Method for making a description of a piece of luggage and luggage description system |
Non-Patent Citations (1)
Title |
---|
Li, Yuan Yuan, and Lynne E. Parker. "Intruder detection using a wireless sensor network with an intelligent mobile robot response." IEEE SoutheastCon 2008. IEEE, 2008. (Year: 2008) * |
Also Published As
Publication number | Publication date |
---|---|
US20210104007A1 (en) | 2021-04-08 |
EP3282391A1 (en) | 2018-02-14 |
US20180047128A1 (en) | 2018-02-15 |
US10902540B2 (en) | 2021-01-26 |
US20210104008A1 (en) | 2021-04-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210142436A1 (en) | | Event-driven authentication of physical objects |
US20230409689A1 (en) | | Authentication-based tracking |
US10614302B2 (en) | | Controlled authentication of physical objects |
US11423641B2 (en) | | Database for detecting counterfeit items using digital fingerprint records |
US11321964B2 (en) | | Loop chain digital fingerprint method and system |
US11636191B2 (en) | | Authenticated production |
US20210312158A1 (en) | | Model-based digital fingerprinting |
US20210279462A1 (en) | | Authentication of a suspect object using extracted native features |
EP2869240A2 (en) | | Digital fingerprinting object authentication and anti-counterfeiting system |
US10740767B2 (en) | | Centralized databases storing digital fingerprints of objects for collaborative authentication |
EP3270342A1 (en) | | Database records and processes to identify and track physical objects during transportation |
US20210233203A1 (en) | | Depth-based digital fingerprinting |
US20240296695A1 (en) | | Irreversible digital fingerprints for preserving object security |
CN120388194A (en) | | Intelligent item identification system and identification method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |