JP2004045072A - Meat identification method and meat identification device - Google Patents

Meat identification method and meat identification device

Info

Publication number
JP2004045072A
JP2004045072A (application number JP2002199762A)
Authority
JP
Japan
Prior art keywords
detection light
meat
detection
chicken
internal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2002199762A
Other languages
Japanese (ja)
Inventor
Hideki Toida
戸井田 秀基
Sunao Kondo
近藤 直
Gentaro Kakemizu
掛水 源太郎
Juichi Yoshimaru
吉丸 寿一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ishii Corp
Original Assignee
Ishii Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ishii Corp
Priority: JP2002199762A
Publication: JP2004045072A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/02 Food
    • G01N 33/12 Meat; Fish

Landscapes

  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Medicinal Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Analysing Materials By The Use Of Radiation (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

[Problem] To provide a meat identification method that can identify accurately and reliably which part a piece of meat is, so that the work of sorting the meat by part can be performed properly.
[Solution] Detection light B projected from a detection light irradiation device 3 and radiation D emitted from a radiation irradiation device 5 are applied to chicken A conveyed by a conveyor 2; the reflected light C from the chicken A is captured by an imaging camera 4, and a transmission image of the radiation D transmitted through the interior of the chicken A is captured by an imaging camera 6. Based on the image information output from the imaging cameras 4 and 6, an algorithm of a part determination device 7 comprehensively judges, piece by piece, which part the chicken A is.
[Selected drawing] FIG. 4

Description

[0001]
[Technical Field of the Invention]
This invention relates to a meat identification method and a meat identification device used in the work of identifying which part of the animal a piece of meat, such as chicken, pork, or beef, is.
[0002]
[Prior Art]
Conventionally, as a method of sorting chicken of the kind described above by part, a typical chicken processing plant, for example, conveys chicken that has been cut and separated into its individual parts on a single conveyor, still intermixed, while workers visually identify and judge each part on the conveyor one piece at a time and sort it by part.
[0003]
[Problems to Be Solved by the Invention]
However, when each chicken part conveyed on the conveyor is identified and judged visually, the judgment criteria vary with the skill of the worker, so sorting mistakes can occur and it is difficult to sort the chicken parts accurately. Moreover, because the many parts on the conveyor are sorted by hand, working efficiency is poor and worker throughput is limited; it is difficult to sort a large number of parts continuously, and the work of sorting the chicken parts takes considerable labor and time.
[0004]
In view of the above problems, an object of this invention is to provide a meat identification method and meat identification device that can identify and judge accurately and reliably which part a piece of meat is, by having an algorithm of part determination means judge the part based on the detection information from internal detection light detection means and external detection light detection means.
[0005]
[Means for Solving the Problems]
This invention is a meat identification method, and its meat identification device, in which external detection light projected from external detection light irradiation means and internal detection light projected from internal detection light irradiation means are applied to meat; the external detection light reflected by the meat is detected by external detection light detection means, and the internal detection light transmitted through the interior of the meat is detected by internal detection light detection means; and an algorithm of part determination means judges which part the meat is, based on the detection information output from the external detection light detection means and the internal detection light detection means.
[0006]
The meat described above is, for example, chicken, pork, or beef. In the embodiment it consists of chicken parts such as the drum (Dram), wing (Wing), keel (Keel), rib (Limb), and thigh (Sai), but it may also consist of the various parts of a pig or cow.
[0007]
The external detection light irradiation means can be configured as a light source (or illumination device) that emits detection light, for example a halogen lamp, xenon lamp, ultraviolet lamp, fluorescent lamp, or incandescent lamp. The internal detection light irradiation means can be configured as a radiation source that emits or generates radiation, for example a gas tube, ion tube, Coolidge tube, high-vacuum tube, or high-voltage tube.
[0008]
The external detection light detection means and the internal detection light detection means can each be configured as, for example, a color or monochrome imaging camera (CCD camera), a digital camera, or an imaging element. The part determination means can be configured as a part determination device, for example a personal computer or a controller equipped with a CPU, ROM, and RAM. The conveyor can be configured as, for example, a belt conveyor, bucket conveyor, roller conveyor, chain conveyor, or free trays on which the meat is placed.
[0009]
That is, the external detection light projected from the external detection light irradiation means and the internal detection light (for example, radiation such as X-rays or soft X-rays) projected from the internal detection light irradiation means are applied to the meat; the external detection light reflected by the meat is detected by the external detection light detection means, and the internal detection light transmitted through the interior of the meat is detected by the internal detection light detection means. Based on the detection information output from the two detection means, an algorithm of the part determination means then identifies and judges, piece by piece, which part the meat is.
[0010]
As an embodiment, a meat identification method can be configured in which the external detection light irradiation means and external detection light detection means, together with the internal detection light irradiation means and internal detection light detection means, are provided along a conveyor that conveys the meat; the external detection light projected from the external detection light irradiation means and the internal detection light projected from the internal detection light irradiation means are applied to the meat conveyed on the conveyor; and the external detection light reflected by the meat and the internal detection light transmitted through its interior are detected by the external detection light detection means and the internal detection light detection means while the meat is being conveyed.
[0011]
The external detection light irradiation means and external detection light detection means, together with the internal detection light irradiation means and internal detection light detection means, can be provided along the conveyor that conveys the meat. The external detection light irradiation means can be configured as detection light irradiation means that applies detection light to the meat, and the external detection light detection means as reflected light detection means that detects the light reflected by the meat. The internal detection light irradiation means can be configured as radiation irradiation means that irradiates the meat with radiation, and the internal detection light detection means as radiation detection means that detects the radiation transmitted through the meat.
[0012]
[Operation and Effects]
According to this invention, the external detection light reflected by the meat and the internal detection light transmitted through it are detected by the internal detection light detection means and the external detection light detection means, and based on the detection information output from each detection means, the algorithm of the part determination means judges piece by piece which part the meat is. Each part of the meat can therefore be identified and judged accurately and reliably, the work of sorting various kinds of meat by part can be performed properly, sorting accuracy is improved, and the sorting work requires less labor.
[0013]
[Embodiment]
An embodiment of this invention is described in detail below with reference to the drawings.
The drawings show a meat identification method and meat identification device used in the work of sorting chicken, one example of meat, by part. In FIG. 1, the meat identification device 1 places chicken A, cut and separated into its individual parts, on a conveyor 2 and conveys it in line through a first detection section a and a second detection section b set on the conveying path. The chicken A conveyed through the first detection section a is irradiated with detection light B projected from a detection light irradiation device 3, and the reflected light C from the chicken A is captured by an imaging camera 4. Next, the chicken A conveyed through the second detection section b is irradiated with radiation D (for example, X-rays) emitted from a radiation irradiation device 5, and a transmission image of the radiation D that has passed through the interior of the chicken A is captured by an imaging camera 6. Based on the image information output from the imaging cameras 4 and 6, the device then judges which part the chicken A is by an algorithm of a part determination device 7.
[0014]
FIG. 2 is a block diagram of the control circuit of the meat identification device 1. The part determination device 7, configured for example as a personal computer, incorporates a CPU 8, a ROM 9, a RAM 10, and a comparison information storage device 11. Following a program stored in the ROM 9, the CPU 8 controls the starting and stopping of the conveyor 2, the detection light irradiation device 3, the color imaging camera 4, the radiation irradiation device 5, the monochrome imaging camera 6, an image processing device 12, and a detection sensor 13, and incorporates a timer that counts elapsed time. The RAM 10 stores the data needed for the detection and measurement processing.
[0015]
The conveyor 2 has a coloring and material suited to imaging the reflected light C from the chicken A, together with a material and structure that allow the radiation D emitted from the radiation irradiation device 5 to pass through, and it blocks outside light from below the conveying path. An address that moves in approximate correspondence with each chicken A is set on, or attached to, the placement section on which the chicken A rests.
[0016]
A plurality of the detection light irradiation devices 3 are arranged roughly diagonally above and facing the outer surface of the chicken A conveyed through the first measurement section, and they illuminate the upper and side outer surfaces of the chicken A roughly evenly with detection light B of a wavelength suited to detecting the required external features of the chicken A.
[0017]
The imaging camera 4 is arranged above the conveying path, facing the upper and side outer surfaces of the chicken A illuminated by the detection light B projected from the detection light irradiation device 3; it captures the reflected light C from the chicken A and outputs the image to the image processing device 12 described later. The imaging camera 4 may also be arranged roughly beside, roughly diagonally above, or roughly diagonally in front of or behind the chicken A conveyed through the first measurement section.
[0018]
The radiation irradiation device 5 is arranged below the conveying path (beneath the conveyor 2), facing the underside of the chicken A conveyed through the second measurement section. Through the conveyor 2, whose material and structure allow the radiation to pass, it irradiates the lower outer surface of the chicken A roughly evenly from below with radiation D of a wavelength suited to passing through the chicken A and detecting the required internal features.
[0019]
The imaging camera 6 is arranged above the conveying path, facing the upper and side outer surfaces of the chicken A through which the radiation D projected from the radiation irradiation device 5 passes. The radiation D transmitted through the interior of the chicken A is converted by a sheet- or plate-shaped phosphor 14 (for example, a scintillator), described later, into light of a wavelength that can be imaged optically; the camera captures the converted transmission image of the radiation D and outputs it to the image processing device 12 described later.
[0020]
The phosphor 14 is arranged between the chicken A conveyed through the second measurement section and the imaging camera 6 above it, and converts the radiation D transmitted through the interior of the chicken A into light of a wavelength that the imaging camera 6 can capture optically. That is, when the radiation D transmitted through the interior of the chicken A strikes the phosphor 14, it is converted, in proportion to the transmitted dose, into light of a wavelength the imaging camera 6 can capture, and a transmission image of all or part of the irradiated chicken A is projected on the upper surface side.
[0021]
The image processing device 12 converts the image information output from the imaging cameras 4 and 6 (for example, by binarization) into original images suited to judging each part, such as the chicken drum (Dram), wing (Wing), keel (Keel), rib (Limb), and thigh (Sai), and outputs them to the part determination device 7.
[0022]
Based on the original image of each part output from the imaging camera 4, the pixels are converted and classified into color information such as red 1 (R1), red 2 (R2), yellow 1 (Y1), yellow 2 (Y2), yellow-green 1 (YG1), yellow-green 2 (YG2), green (G), blue (BR1), blue (BR2), blue (BR3), blue (BR4), white (WH), and black (BL), as shown for example in FIG. 3. That is, blood is classified as R1 or R2; meat as R2, Y1, or Y2; skin and bone as Y2, YG1, or YG2; and fat as YG1 or YG2.
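The tissue classification in paragraph [0022] can be sketched as a lookup from color class to candidate tissue types. The class names follow FIG. 3; the mapping mirrors the paragraph above, while the helper functions, their names, and the sample data are illustrative assumptions, not part of the patent.

```python
# Candidate tissue types for each color class, per paragraph [0022].
# Class names (R1, R2, ...) follow FIG. 3; the helpers are illustrative.
TISSUE_BY_COLOR = {
    "R1": {"blood"},
    "R2": {"blood", "meat"},
    "Y1": {"meat"},
    "Y2": {"meat", "skin_bone"},
    "YG1": {"skin_bone", "fat"},
    "YG2": {"skin_bone", "fat"},
}

def color_histogram(pixel_classes):
    """Count how many pixels fall into each color class."""
    hist = {}
    for c in pixel_classes:
        hist[c] = hist.get(c, 0) + 1
    return hist

def tissue_candidates(pixel_classes):
    """Union of tissue types suggested by the observed color classes."""
    tissues = set()
    for c in pixel_classes:
        tissues |= TISSUE_BY_COLOR.get(c, set())
    return tissues

print(color_histogram(["R2", "R2", "Y1"]))        # {'R2': 2, 'Y1': 1}
print(sorted(tissue_candidates(["R1", "Y2"])))    # ['blood', 'meat', 'skin_bone']
```

Because several classes map to more than one tissue (R2 can be blood or meat), a per-class histogram of this kind is what the later ratio and count features (YG count, Y2 ratio, and so on) would be computed from.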
[0023]
A radiation control device 5a (for example, an X-ray controller) connected to the radiation irradiation device 5 variably adjusts the amount of radiation D generated by the radiation irradiation device 5 to a level suited to identifying the parts of the chicken A. The radiation control device 5a may also be controlled by the CPU 8 of the part determination device 7 described later. In addition, the irradiation angle, irradiation area, and number of radiation irradiation devices 5, and the imaging angle, imaging range, and number of imaging cameras 6, may be changed.
[0024]
As conversion methods in place of the phosphor 14 described above, the conversion may also be performed using, for example, gadolinium oxysulfide (Gd2O2S:Tb) or cesium iodide (CsI:Tl), a phosphor combined with an amorphous silicon sensor, or an amorphous selenium photoconductor.
[0025]
The detection sensor 13 is arranged at the side of the conveying path, facing the chicken A conveyed through the first and second measurement sections; it detects that a chicken A has entered the first or second measurement section and outputs a detection signal to the part determination device 7. A limit switch, proximity switch, or the like may be used instead of the detection sensor 13.
[0026]
The part determination device 7 temporarily stores the image information output from the image processing device 12 in a detection information storage area (RAM), and compares and processes the detection information on the chicken A stored there against the reference information needed to identify each part, which is stored in the comparison information storage device 11, to obtain information such as the color, shape, and dimensions of the chicken A. It then comprehensively judges, piece by piece, which part of the chicken each chicken A placed on the placement section (not shown) of the conveyor 2 is, by the algorithms of FIG. 4 and FIG. 5.
[0027]
That is, based on the image information output from the image processing device 12, shape features are extracted such as: area (the area of the whole chicken A, holes excluded); perimeter (the length around the chicken A, holes excluded, measured as the length of the line connecting the midpoints of adjacent boundary pixels of the region); circularity (4π × area / perimeter²); size (2 × area / perimeter); centroid distance (the average of the maximum and minimum distances from the centroid of the chicken A to its outer contour); absolute maximum length (the length of the longest part of the chicken A); diagonal width (the shortest distance between two straight lines that are parallel to the absolute maximum length and sandwich the chicken A); needle ratio (absolute maximum length / diagonal width); complexity; and variance.
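The closed-form shape features defined in the paragraph above can be written out directly. This is a minimal sketch assuming the area, perimeter, and extreme distances have already been measured from the binarized image; the function names are illustrative, not from the patent.

```python
import math

def circularity(area, perimeter):
    """Circularity = 4*pi*area / perimeter^2; equals 1.0 for a perfect circle."""
    return 4 * math.pi * area / perimeter ** 2

def size_feature(area, perimeter):
    """Size = 2*area / perimeter (for a circle this equals the radius)."""
    return 2 * area / perimeter

def needle_ratio(absolute_max_length, diagonal_width):
    """Needle ratio = absolute maximum length / diagonal width; higher means more elongated."""
    return absolute_max_length / diagonal_width

def centroid_distance(max_dist, min_dist):
    """Average of the maximum and minimum centroid-to-contour distances."""
    return (max_dist + min_dist) / 2

# Sanity check on a circle of radius 10: circularity 1, size 10.
r = 10.0
area = math.pi * r ** 2
perimeter = 2 * math.pi * r
print(round(circularity(area, perimeter), 6))   # 1.0
print(round(size_feature(area, perimeter), 6))  # 10.0
```

A drum, being long and narrow, would score a high needle ratio and low circularity, which is exactly what the first branches of the FIG. 4 algorithm rely on.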
[0028]
In addition, color features such as the ratio, count, complexity, and position of each color class are extracted from the image information and the original images. The measurement results for the chicken A are compared and processed against the reference information stored in the comparison information storage device, and the algorithms of FIG. 4 and FIG. 5 judge and identify, piece by piece, which part of the chicken the chicken A is (for example, the drum (Dram), wing (Wing), keel (Keel), rib (Limb), or thigh (Sai)). The judgment result for each chicken A is stored in association with the address information of the placement section (not shown) on which that chicken A rests.
[0029]
As another identification method in place of the addresses and timing described above, the unique information (address or carrier number) of a recording medium (for example, a barcode, ID card, or magnetic card) attached to the placement section of the chicken A itself, or moving in approximate correspondence with it, may be read, or the measurement information measured in the first and second measurement sections may be recorded on the recording medium.
[0030]
The illustrated embodiment is configured as described above. The method of identifying the parts of chicken A with the meat identification device 1 is described below.
[0031]
First, as shown in FIG. 1, chicken A cut and separated into its individual parts is placed on the conveyor 2 and conveyed into the first detection section a. The detection light B projected from the detection light irradiation device 3 is applied to the chicken A conveyed through the first measurement section, the reflected light C from the chicken A is captured by the imaging camera 4, and the captured image is output to the image processing device 12. The address information of the conveyor 2, set or moving in approximate correspondence with the chicken A, is output to the part determination device 7 in association with the image information captured by the imaging camera 4.
[0032]
Next, the radiation D projected from the radiation irradiation device 5 is applied from below the conveyor 2 to the chicken A conveyed into the second measurement section. The radiation D transmitted through the interior of the chicken A is converted by the phosphor 14 into light of a wavelength the imaging camera 6 can capture, a transmission image of all or part of the chicken A is projected on the upper surface side, and the transmission image converted by the phosphor 14 is captured by the imaging camera 6 and output to the image processing device 12. The address information set or moving in approximate correspondence with the chicken A is output to the part determination device 7 in association with the image information captured by the imaging camera 6.
[0033]
Meanwhile, the image processing device 12 processes the image information captured by the imaging cameras 4 and 6 into a form suited to identifying each part (for example, binarization, or the color information shown in FIG. 3) and outputs it to the part determination device 7. Based on the image information output from the imaging cameras 4 and 6, the part determination device 7 comprehensively judges, piece by piece, which part each chicken A conveyed on the conveyor 2 is, and stores the judged part information in association with the address information of the placement section (not shown) on which that chicken A rests.
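The bookkeeping in paragraphs [0031] to [0033], which pairs each conveyor address first with the images captured in the two measurement sections and finally with the judged part, could be kept as a simple table keyed by address. This sketch is purely illustrative; the record layout, function names, and sample values are not from the patent.

```python
# One record per conveyor address: images from both measurement
# sections plus the final part judgment. Purely illustrative.
records = {}

def store_reflection(address, image):
    """Attach the camera-4 (reflected light) image to a conveyor address."""
    records.setdefault(address, {})["reflection"] = image

def store_transmission(address, image):
    """Attach the camera-6 (transmission) image to the same address."""
    records.setdefault(address, {})["transmission"] = image

def store_judgment(address, part):
    """Attach the final part judgment to the address."""
    records.setdefault(address, {})["part"] = part

store_reflection(7, "camera4_frame")
store_transmission(7, "camera6_frame")
store_judgment(7, "Keel")
print(records[7]["part"])  # Keel
```

Keying everything by address is what lets a downstream sorting stage divert each tray to the right bin even though the two images are captured at different points along the conveyor.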
[0034]
Next, when the algorithm of the part determination device 7 identifies and judges from the image information captured by the imaging cameras 4 and 6 which part of the chicken the chicken A is, step 1, as shown in FIG. 4, tests whether the size of the chicken A satisfies size < 110. If the size is judged to be below this reference value, step 2 tests whether the needle ratio satisfies needle ratio > 2.5; if the needle ratio is judged to be above this reference value, the piece is identified as a drum (Dram).
[0035]
Next, if step 2 judges the needle ratio to be below the reference value, step 3 tests whether minimum centroid distance / area < 18; if this is judged to be below the reference value, step 4 tests whether area < 50000; if the area is judged to be below the reference value, the piece is identified as a left or right wing (Wing). In other words, drums and wings can be identified almost without problems from features centered on shape.
[0036]
Next, if step 3 judges minimum centroid distance / area to be above the reference value, or step 4 judges the area to be above the reference value, the process moves to step 5.
[0037]
Next, step 5 tests whether complexity < 1500. If the complexity is judged to be below the reference value, step 6 tests whether needle ratio > 1.45; if the needle ratio is judged to be above the reference value, the piece is identified as a keel (Keel). If step 5 judges the complexity to be above the reference value, the process moves to step 13 in FIG. 5.
[0038]
Next, if step 6 judges the needle ratio to be below the reference value, step 7 tests whether needle ratio > 1.25; if the needle ratio is judged to be above the reference value, step 8 tests whether the YG count < 10; if the YG count is judged to be below the reference value, step 9 tests whether the YG complexity < 100; if the YG complexity is judged to be below the reference value, the piece is identified as the back of a keel (Keel).
[0039]
If step 8 judges the YG count to be above the reference value, the process moves to step 10. If step 9 judges the YG complexity to be above the reference value, the process moves to step 11.
[0040]
Next, step 10 tests whether the YG count < 15; if the YG count is judged to be below the reference value, step 11 tests whether the Y2 ratio < 25%; if the ratio is judged to be below the reference value, step 12 tests whether the Y2+YG ratio > 70%; if the Y2+YG ratio is judged to be above the reference value, the piece is identified as the front of a keel (Keel).
[0041]
Next, if step 7 judges the needle ratio to be below the reference value, if step 10 judges the YG count to be above the reference value, if step 11 judges the Y2 ratio to be above the reference value, or if step 12 judges the Y2+YG ratio to be below the reference value, the process moves to step 13 in FIG. 5.
[0042]
Next, as shown in FIG. 5, step 13 tests whether the R variance < 200; if the R variance is judged to be below the reference value, the piece is identified as the front of a thigh (Sai).
[0043]
Next, if step 13 judges the R variance to be above the reference value, step 14 tests whether the R1+R2 area > 2000; if it is judged to be above the reference value, step 15 tests whether the R1+R2 distance > 20; if it is judged to be above the reference value, step 16 tests whether the R1+R2 count <= 10; if the R1+R2 count is judged to be at or below the reference value, the piece is identified as the back of a thigh (Sai). If step 15 judges the R1+R2 distance to be below the reference value, the piece is identified as the back of a rib (Limb).
[0044]
Next, if step 14 judges the R1+R2 area to be below the reference value, or step 16 judges the R1+R2 count to be above the reference value, the process moves to step 17.
[0045]
Next, step 17 tests whether the R1+R2 area < 1000; if it is judged to be below the reference value, step 18 tests whether the Y2+YG ratio < 60%; if the Y2+YG ratio is judged to be below the reference value, the piece is identified as the front of a rib (Limb). In other words, parts that are similar in both shape and color and were difficult to identify from the image information of the imaging camera 4 alone, namely part of the keel together with the ribs and thighs, can be identified almost exactly, with high probability, based on the transmission image captured by the imaging camera 6.
[0046]
Next, if step 17 judges the R1+R2 area to be above the reference value, or step 18 judges the Y2+YG ratio to be above the reference value, the process moves to step 19.
[0047]
Next, step 19 tests whether the R1+R2 count < 5; if the R1+R2 count is judged to be below the reference value, the piece is identified as the front of a rib (Limb). If the R1+R2 count is judged to be above the reference value, the part of the chicken A cannot be judged, so in step 20 the piece is turned over (for example front to back, left to right, or end to end) and steps 1 to 20 above are repeated several times for identification. A piece that still cannot be judged on the second and subsequent passes is collected or discarded.
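Read together, the FIG. 4 and FIG. 5 walk-through in paragraphs [0034] to [0047] amounts to a fixed-threshold decision tree. The sketch below transcribes that tree as literally as the prose allows; the feature names, the dict-based interface, and the fall-through behavior at branches the text leaves implicit are assumptions, while the thresholds are the example values given in the text.

```python
def classify_part(f):
    """Decision tree of FIG. 4 and FIG. 5 (steps 1 to 19), transcribed from the text.

    f is a dict of the features named in paragraphs [0027] and [0028].
    Returns a part label, or None when the piece must be turned over and
    re-measured (step 20). Feature names and the dict interface are
    illustrative; thresholds are the example values given in the text.
    """
    # FIG. 4: shape-centered branches (steps 1 to 12)
    if f["size"] < 110:                            # step 1
        if f["needle_ratio"] > 2.5:                # step 2
            return "Drum"
        if f["min_centroid_dist_per_area"] < 18:   # step 3
            if f["area"] < 50000:                  # step 4
                return "Wing"
    if f["complexity"] < 1500:                     # step 5
        if f["needle_ratio"] > 1.45:               # step 6
            return "Keel"
        if f["needle_ratio"] > 1.25:               # step 7
            if f["yg_count"] < 10:                 # step 8
                if f["yg_complexity"] < 100:       # step 9
                    return "Keel (back)"
            if f["yg_count"] < 15:                 # step 10
                if f["y2_ratio"] < 25:             # step 11
                    if f["y2_yg_ratio"] > 70:      # step 12
                        return "Keel (front)"
    # FIG. 5: color and transmission-image branches (steps 13 to 19)
    if f["r_variance"] < 200:                      # step 13
        return "Sai (front)"
    if f["r1_r2_area"] > 2000:                     # step 14
        if f["r1_r2_distance"] > 20:               # step 15
            if f["r1_r2_count"] <= 10:             # step 16
                return "Sai (back)"
        else:
            return "Limb (back)"
    if f["r1_r2_area"] < 1000:                     # step 17
        if f["y2_yg_ratio"] < 60:                  # step 18
            return "Limb (front)"
    if f["r1_r2_count"] < 5:                       # step 19
        return "Limb (front)"
    return None                                    # step 20: turn over and retry
```

When every branch fails, the function returns None, matching the step 20 path in which the piece is flipped and the whole sequence rerun, with collection or disposal after the second failed pass.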
[0048]
If the remaining chicken A can be narrowed down to thigh backs and rib fronts and backs, it is also possible to have this stage identify the front and back of ribs from shape. One condition in that case is that drums, wings, keels, and thigh fronts can be identified reliably.
[0049]
As described above, the detection light B projected from the detection light irradiation device 3 and the radiation D emitted from the radiation irradiation device 5 are applied to the chicken A conveyed by the conveyor 2; the reflected light C from the chicken A and a transmission image of the radiation D transmitted through the chicken A are captured by the imaging cameras 4 and 6; and, based on the image information output from the imaging cameras 4 and 6, the algorithm of the part determination device 7 comprehensively judges, piece by piece, which part the chicken A is. Each part of the chicken A can therefore be identified and judged accurately and reliably, the work of sorting the various chicken A by part can be performed properly, sorting accuracy is improved, and the sorting work requires less labor.
[0050]
Alternatively, the radiation D emitted from the radiation irradiation device 5 may be applied first to the chicken A conveyed by the conveyor 2 and the transmission image of the radiation D passing through its interior captured by the imaging camera 6, after which the detection light B projected from the detection light irradiation device 3 is applied to the chicken A and the reflected light C from the chicken A captured by the imaging camera 4; the invention is not limited to the configuration of the embodiment.
[0051]
The reference values (numbers) used in the algorithms of FIG. 4 and FIG. 5 may also be changed to values suited to other kinds of meat, such as other chicken, beef, or pork, and applied to identifying the parts of that meat.
[Brief Description of the Drawings]
[FIG. 1] A perspective view showing the chicken part identification method using the meat identification device.
[FIG. 2] A control circuit block diagram of the meat identification device.
[FIG. 3] An explanatory diagram showing the color processing of chicken.
[FIG. 4] An algorithm diagram showing the identification method for drums, wings, and keels.
[FIG. 5] An algorithm diagram, continued from FIG. 4, showing the identification method for thighs and ribs.
[Explanation of Reference Signs]
A … chicken
B … detection light
C … reflected light
D … radiation
1 … meat identification device
2 … conveyor
3 … detection light irradiation device
5 … radiation irradiation device
4, 6 … imaging cameras
7 … part determination device
11 … comparison information storage device
12 … image processing device
13 … detection sensor
[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a meat identification method and a meat identification device used for an operation of identifying a portion of meat such as chicken, pork, and beef.
[0002]
[Prior art]
Conventionally, as a method of sorting the chicken in the above-described example for each part, for example, in a general chicken processing plant, the chicken separated and separated into a plurality of parts is transported while being mixed by one conveyor. At the same time, there is a method in which each part of the chicken conveyed by the conveyer is individually identified and determined visually by an operator, and sorting processing is performed for each part.
[0003]
[Problems to be solved by the invention]
However, when each part of the chicken conveyed by the above-mentioned conveyor is visually identified and determined by an operator, the determination criteria differ depending on the skill of the operator, so that sorting errors may occur, and each chicken It is difficult to sort the parts accurately. Also, since a large number of parts conveyed by the conveyor are manually sorted, the working efficiency is poor and the processing capacity of the operator is limited, so it is difficult to sort a large number of parts continuously. In addition, there is a problem that it takes time and labor to sort each part of the chicken.
[0004]
In view of the above-described problems, the present invention determines where meat is based on the detection information of the internal detection light detection means and the external detection light detection means by using an algorithm of the part determination means, thereby determining where the meat is. It is an object of the present invention to provide a meat identification method and a meat identification device capable of accurately and reliably identifying and judging whether a part is a part.
[0005]
[Means for Solving the Problems]
The present invention irradiates meat with external detection light projected from external detection light irradiation means and internal detection light projected from internal detection light irradiation means, and detects external detection light reflected by the meat on external detection light detection. Means, the internal detection light transmitted through the meat is detected by the internal detection light detection means, and the meat is detected based on the detection information output from the external detection light detection means and the internal detection light detection means. It is a meat identification method and a meat identification device for determining where a part is by an algorithm of a part determination means.
[0006]
The above-mentioned meat is composed of meat such as chicken, pork, and cow, and in the embodiment, chicken drum (Dram), wing (Wing), keel (Keel), rib (Limb), rhino (Sai) and the like. Although it is composed of the parts described above, it may be composed of various parts such as pigs and cows.
[0007]
The external detection light irradiating means can be constituted by a light source (or an illuminating device) that irradiates detection light such as a halogen lamp, a xenon lamp, an ultraviolet lamp, a fluorescent lamp, an incandescent lamp, and the like. Further, the internal detection light irradiating means can be constituted by a radiation generating source which irradiates or generates radiation such as a gas tube, an ion tube, a coolidge tube, a high vacuum tube, and a high voltage tube.
[0008]
Further, the external detection light detection means and the internal detection light detection means can be constituted by, for example, a color or monochrome imaging camera (CCD camera), a digital camera, an image element, or the like. Further, the part determining means can be configured by a part determining device such as a personal computer or a control device including a CPU, a ROM, and a RAM. Further, the transport conveyor can be constituted by, for example, a belt conveyor, a bucket conveyor, a roller conveyor, a chain conveyor, a free tray on which meat is placed, and the like.
[0009]
That is, the meat is irradiated with external detection light projected from the external detection light irradiation means and internal detection light (for example, radiation such as X-rays and soft X-rays) projected from the internal detection light irradiation means, The external detection light reflected by the meat is detected by the external detection light detection means, and the internal detection light transmitted through the meat is detected by the internal detection light detection means. In addition, based on the detection information output from the external detection light detection means and the internal detection light detection means, the position of the meat is individually identified and determined by the algorithm of the part determination means.
[0010]
As an embodiment, the external detection light irradiation unit and the external detection light detection unit, and the internal detection light irradiation unit and the internal detection light detection unit are provided on a conveyor that conveys the meat, and the external detection light irradiation The external detection light emitted from the means and the internal detection light emitted from the internal detection light irradiating means are applied to the meat conveyed by the conveyor, and the external detection light reflected by the meat and the inside of the meat are reflected. A meat identification method can be configured in which the internal detection light transmitted through the meat is detected by the external detection light detection means and the internal detection light detection means while conveying the meat.
[0011]
In addition, the external detection light irradiation unit and the external detection light detection unit, and the internal detection light irradiation unit and the internal detection light detection unit can be provided on a conveyor that conveys the meat. Further, the external detection light irradiation means is constituted by detection light irradiation means for irradiating the meat with detection light, and the external detection light detection means is a reflected light detection means for detecting reflected light reflected by the meat. Can be configured. Further, the internal detection light irradiation means may be constituted by radiation irradiation means for irradiating the meat with radiation, and the internal detection light detection means may be constituted by radiation detection means for detecting radiation transmitted through the meat. Can be.
[0012]
[Action and effect]
According to the present invention, the external detection light reflected by the meat and the internal detection light transmitted through the meat are detected by the internal detection light detection unit and the external detection light detection unit, and the detection information output from each detection unit is detected. On the basis of which part of the meat is determined individually by the algorithm of the part determination means, so that each part of the meat can be identified and determined accurately and reliably, and various kinds of meat can be sorted for each part. The sorting operation can be performed accurately, and the sorting accuracy can be improved and the sorting operation can be labor-saving.
[0013]
【Example】
An embodiment of the present invention will be described below in detail with reference to the drawings.
The drawing shows a meat identification method and a meat identification device used for sorting chicken, which is an example of meat, for each part, and in FIG. 1, the meat identification apparatus 1 separates a plurality of parts into pieces. The sorted chicken A is placed on the conveyor 2 and aligned and transported to the first detection section a and the second detection section b set on the transport path, and the chicken A transported to the first detection section a The detection light B emitted from the detection light irradiation device 3 is irradiated, and the reflected light C reflected by the chicken A is imaged by the imaging camera 4. Subsequently, the chicken A conveyed to the second detection section b is irradiated with radiation D (for example, X-rays) emitted from the radiation irradiating device 5, and a transmission image of the radiation D transmitted through the chicken A is formed. An image is taken by the imaging camera 6. In addition, based on the image information output from the imaging cameras 4 and 6, it is an apparatus that determines an area of the chicken A by an algorithm of the area determination apparatus 7.
[0014]
FIG. 2 shows a block diagram of a control circuit of the meat identification device 1. For example, a site determination device 7 composed of a personal computer incorporates a CPU 8, a ROM 9, a RAM 10, and a comparison information storage device 11, and the CPU 8 According to the program stored in the ROM 9, the conveyor 2, the detection light irradiation device 3, the color imaging camera 4, the radiation irradiation device 5, the monochrome imaging camera 6, the image processing device 12, In addition to controlling driving and stopping with the detection sensor 13, a timer for counting elapsed time is built in. The RAM 22 stores data necessary for the detection processing and the measurement processing.
[0015]
The above-described transport conveyor 2 is configured to have a color scheme and a material suitable for imaging the reflected light C reflected by the chicken A, and a material and a structure that allow transmission of the radiation D emitted from the radiation irradiation device 5. Shields external light from the lower part of the transport path. In addition, an address to which the chicken A is moved is set or attached to the placing portion on which the chicken A is placed.
[0016]
A plurality of the above-mentioned detection light irradiation devices 3 are disposed substantially obliquely above the outer surface of the chicken A conveyed to the first measurement section, and have a detection light having a wavelength suitable for detecting a predetermined item outside the chicken A. B is irradiated substantially evenly on the upper outer surface and side outer surface of chicken A.
[0017]
The above-described imaging camera 4 is disposed in the upper part of the transport path facing the upper outer surface and the side outer surface of the chicken A to which the detection light B emitted from the detection light irradiation device 3 is irradiated, and the chicken A is reflected. The reflected light C is captured and output to an image processing device 12 described later. Further, the imaging camera 4 may be disposed substantially directly beside, substantially obliquely upward, substantially obliquely front and rear, etc., with the chicken A transported to the first measurement section as a center.
[0018]
The above-mentioned radiation irradiation device 5 is disposed at the lower part of the transport path (the lower part of the conveyor 2) facing the lower surface of the chicken A transported to the second measurement section, penetrates the inside of the chicken A, and a predetermined item inside the chicken A. Radiation D having a wavelength suitable for detecting the radiation D is emitted from the lower part of the chicken A substantially uniformly from below via the conveyor 2 having a material and structure that allows the transmission of the radiation D.
[0019]
The above-mentioned imaging camera 6 is disposed in the upper part of the conveyance path facing the upper outer surface and the side outer surface of the chicken A through which the radiation D emitted from the radiation irradiating device 5 passes, and the radiation transmitted through the inside of the chicken A D is converted into a light beam having a wavelength that can be optically imaged by a sheet-shaped or plate-shaped phosphor 14 (for example, a scintillator), and a transmitted image of the converted radiation D is imaged. Output to
[0020]
The above-mentioned phosphor 14 is provided between the chicken A conveyed to the second measurement section and the facing surface of the radiation irradiating device 5 arranged above the chicken A, and emits the radiation D transmitted through the inside of the chicken A. The light is converted into light having a wavelength that can be optically imaged by the imaging camera 6. That is, when the radiation D transmitted through the inside of the chicken A is applied to the phosphor 14, the radiation D is converted into a light beam having a wavelength that can be optically imaged by the imaging camera 6 in accordance with the amount of the transmitted radiation D. The whole or a part of the transmitted image of the irradiated chicken A is projected on the upper surface side.
[0021]
The image processing apparatus 12 described above converts image information output from the imaging cameras 4 and 6 into, for example, chicken drum (Dram), chicken wing (Wing), keel (Keel), rib (Limb), rhino (Sai), and the like. Is converted (for example, binarized) into an original image suitable for judging each part and output to the part judging device 7.
[0022]
Further, based on the original image of each part output from the imaging camera 4, for example, as shown in FIG. 3, red 1 (R1), red 2 (R2), yellow 1 (Y1), yellow 2 (Y2), Yellow-green 1 (YG1), yellow-green 2 (YG2), green (G), blue (BR1), blue (BR2), blue (BR3), blue (BR4), white (WH), black (BL), etc. Convert and classify into color information. That is, blood is classified as R1 or R2, meat is classified as R2 or Y1 or Y2, skin and bone are classified as Y2 or YG1 or YG2, and fat is converted and classified as YG1 or YG2.
[0023]
The radiation controller 5a (for example, an X-ray controller) connected to the radiation irradiator 5 changes the amount of radiation D generated by the radiation irradiator 5 to a value suitable for identifying the part of the chicken A. It is a device for adjusting. Further, the radiation control device 5a can be controlled by the CPU 8 of the region determination device 7 described later. In addition, the irradiation angle and irradiation area of the radiation irradiation apparatus 5 and the number of arrangements of the apparatuses, the imaging angle and imaging range of the imaging camera 6, and the number of cameras may be changed.
[0024]
As other conversion methods instead of the above-mentioned phosphor 14, for example, gadolinium oxysulfide = Gd2O2S: Tb or cesium iodide = CsI: TI, a combination of a phosphor and an amorphous silicon sensor, or the use of an amorphous selenium photoconductor is used. Can also be converted.
[0025]
The above-described detection sensor 13 is disposed on the side of the transport path opposite to the chicken A transported to the first measurement section and the second measurement section, and the chicken A is transported to the first measurement section and the second measurement section. Is detected, and the detection signal is output to the region determination device 7. Further, instead of the detection sensor 13, for example, a limit switch or a proximity switch may be used.
[0026]
The above-described region determination device 7 temporarily stores the image information output from the image processing device 12 in the detection information storage area (RAM), and stores the chicken A detection information stored in the detection information storage area and the comparison information storage. By comparing and calculating reference information necessary for identifying each part stored in the apparatus 11, information such as the color, shape, and size of the chicken A is obtained, and the mounting portion (not shown) of the conveyor 2 is obtained. 4) and 5), where the portion of the chicken A placed on the chicken is located.
[0027]
In other words, based on the image information output from the image processing apparatus 12, for example, the area (the entire area of the chicken A without the hole), the perimeter (the length of the periphery of the chicken A without the hole, the neighborhood of the region) Length of the line connecting the intermediate points of the boundary pixels to be formed), circularity (4π × area / (perimeter) squared = circularity), size (2 × area / perimeter = size), centroid distance (of chicken A Average the maximum and minimum distances from the center of gravity to the outer contour), the absolute maximum length (the longest part of chicken A = absolute maximum length), and the diagonal width (two straight lines parallel to the absolute maximum length) When sandwiched, shape feature quantities such as the shortest distance between the two straight lines, the needle ratio (absolute maximum length / diagonal width = needle ratio), complexity, and variance are extracted.
[0028]
In addition, based on the image information and the original image, for example, color information such as the ratio, the number, the complexity, and the position of each part is extracted. The measurement result of the chicken A and the reference information stored in the comparison information storage device are compared and calculated, and the chicken A is placed in any part of the chicken (for example, a drum (Dram), a wing (Wing), a keel (Keel). ), Ribs (Limb), rhinoceros (Sai), etc.) are individually determined and identified by the algorithm of FIGS. 4 and 5. In addition, the determination result of the chicken A and the address information of the placing portion (not shown) on which the chicken A is placed are stored in association with each other.
[0029]
As another identification method in place of the address or the time lag described above, for example, a recording medium (for example, a bar code, an ID card, a magnetic card, or the like) attached to or substantially corresponding to the placing portion of the chicken A itself. May be read, or the measurement information measured in the first and second measurement sections may be recorded on a recording medium.
[0030]
The illustrated embodiment is configured as described above, and a method of identifying a part of chicken A by the above-described meat identification device 1 will be described below.
[0031]
First, as shown in FIG. 1, the chicken A separated and separated into a plurality of parts is placed on the conveyor 2 and transported to the first detection section a, and the detection light emitted from the detection light irradiation device 3 is detected. The light B is applied to the chicken A transported to the first measurement section, the reflected light C reflected by the chicken A is captured by the imaging camera 4, and the captured information is output to the image processing device 12. The address information of the conveyor 2 set or moved substantially in correspondence with the chicken A and the image information captured by the imaging camera 4 are output to the site determination device 7 in association with each other.
[0032]
Next, the radiation D emitted from the radiation irradiation device 5 is applied from below the transport conveyor 2 to the chicken A transported to the second measurement section. The radiation D transmitted through the whole or a part of the chicken A is converted on the upper surface side into a visible transmission image by the fluorescent screen 14, this transmission image of the chicken A is captured by the imaging camera 6, and the image information is output to the image processing device 12. In addition, the address information set or updated in correspondence with the chicken A and the image information captured by the imaging camera 6 are output to the part determination device 7 in association with each other.
[0033]
On the other hand, the above-described image processing device 12 processes the image information captured by the imaging cameras 4 and 6 into an information form suited to identifying each part (for example, the binarization and color-information processing shown in FIG. 3) and outputs it to the part determination device 7. The part determination device 7 comprehensively and individually determines which part each chicken A transported by the transport conveyor 2 is, based on the image information output from the imaging cameras 4 and 6, and stores the determined part information of the chicken A and the address information of the placing portion (not shown) on which the chicken A is placed in association with each other.
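The binarization and color-information processing of FIG. 3 is not specified in detail. The following sketch shows one way the color-class ratios used later in the algorithm (Y2, YG, R1/R2) might be computed from a part image; the class names follow the embodiment, but every threshold below is an illustrative assumption.

```python
import numpy as np

def color_class_ratios(rgb: np.ndarray, mask: np.ndarray) -> dict:
    """Classify part pixels into coarse colour classes and return their
    ratios over the part area.

    `rgb` is an (H, W, 3) uint8 image; `mask` marks the part pixels.
    The class names (Y2, YG, R) follow the embodiment; the threshold
    values below are illustrative assumptions only.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    total = max(int(mask.sum()), 1)
    yellow = mask & (r > 150) & (g > 120) & (b < 100)   # skin-like (Y2)
    ygreen = mask & (g > r) & (g > b)                   # yellow-green (YG)
    red = mask & (r > 150) & (g < 100)                  # lean/red (R1, R2)
    return {
        "Y2_ratio": yellow.sum() / total,
        "YG_ratio": ygreen.sum() / total,
        "R_ratio": red.sum() / total,
        # Dispersion of the R channel ("R dispersion" in step 13).
        "R_variance": float(r[mask].var()) if mask.any() else 0.0,
    }
```

In this scheme a thigh front, dominated by uniform skin, would show a low R variance, while a cut surface with exposed lean meat raises both the R ratio and the R variance.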
[0034]
Next, based on the image information captured by the imaging cameras 4 and 6, the algorithm of the part determination device 7 identifies which part the chicken A is, as shown in FIG. 4. Whether Size < 110, that is, whether the size of the chicken A is smaller than the reference value, is determined in step 1. If so, whether the needle ratio exceeds its reference value is determined in step 2, and if the needle ratio is determined to be equal to or greater than the reference value, the part is identified as a drum.
[0035]
Next, when it is determined in step 2 that the needle ratio is equal to or less than the reference value, whether minimum center-of-gravity distance / area < 18 is determined in step 3. If that value is determined to be equal to or less than the reference value, whether area < 50000 is determined in step 4, and if the area is determined to be equal to or smaller than the reference value, the part is identified as a left or right wing. In other words, the drum and the wing can be reliably identified from feature amounts centered on shape.
[0036]
Next, when it is determined in step 3 that the minimum center-of-gravity distance / area is equal to or greater than the reference value, or in step 4 that the area is equal to or greater than the reference value, the process proceeds to step 5.
[0037]
Next, whether complexity < 1500 is determined in step 5. If the complexity is determined to be equal to or less than the reference value, whether needle ratio > 1.45 is determined in step 6, and if the needle ratio is determined to be equal to or greater than the reference value, the part is identified as a keel. If it is determined in step 5 that the complexity is equal to or greater than the reference value, the process proceeds to step 13 in FIG. 5.
[0038]
Next, when it is determined in step 6 that the needle ratio is equal to or less than the reference value, whether needle ratio > 1.25 is determined in step 7. If the needle ratio is determined to be equal to or greater than the reference value, whether the number of YG regions < 10 is determined in step 8; if that number is determined to be equal to or less than the reference value, whether the complexity of YG < 100 is determined in step 9, and if the YG complexity is determined to be equal to or less than the reference value, the part is identified as the back of a keel.
[0039]
If it is determined in step 8 that the number of YG regions is equal to or greater than the reference value, the process proceeds to step 10. If it is determined in step 9 that the YG complexity is equal to or greater than the reference value, the process proceeds to step 11.
[0040]
Next, whether the number of YG regions < 15 is determined in step 10. When that number is determined to be equal to or less than the reference value, whether the ratio of Y2 < 25% is determined in step 11; if the ratio is determined to be equal to or less than the reference value, whether the ratio of Y2 + YG > 70% is determined in step 12, and if the ratio of Y2 + YG is determined to be equal to or greater than the reference value, the part is identified as the front of a keel.
[0041]
Next, if it is determined in step 7 that the needle ratio is equal to or less than the reference value, in step 10 that the number of YG regions is equal to or greater than the reference value, in step 11 that the ratio of Y2 is equal to or greater than the reference value, or in step 12 that the ratio of Y2 + YG is equal to or less than the reference value, the process proceeds to step 13 in FIG. 5.
[0042]
Next, as shown in FIG. 5, whether R dispersion < 200 is determined in step 13, and when the R dispersion is determined to be equal to or less than the reference value, the part is identified as the front of a thigh (Sai).
[0043]
Next, when it is determined in step 13 that the R dispersion is equal to or greater than the reference value, whether the area of R1 + R2 > 2000 is determined in step 14. If that area is determined to be equal to or greater than the reference value, whether the distance of R1 + R2 > 20 is determined in step 15; if the distance is determined to be equal to or greater than the reference value, whether the number of R1 + R2 regions <= 10 is determined in step 16, and when that number is determined to be equal to or less than the reference value, the part is identified as the back of a thigh (Sai). If it is determined in step 15 that the distance of R1 + R2 is equal to or less than the reference value, the part is identified as the back of a rib.
[0044]
Next, when it is determined in step 14 that the area of R1 + R2 is equal to or less than the reference value, or in step 16 that the number of R1 + R2 regions is equal to or greater than the reference value, the process proceeds to step 17.
[0045]
Next, whether the area of R1 + R2 < 1000 is determined in step 17. If that area is determined to be equal to or less than the reference value, whether the ratio of Y2 + YG < 60% is determined in step 18, and if the ratio of Y2 + YG is determined to be equal to or less than the reference value, the part is identified as the front of a rib. That is, based on the transmission image captured by the imaging camera 6, the keel, rib, and thigh parts that are similar in shape and color, and that were difficult to identify from the image information captured by the imaging camera 4 alone, can be identified almost exactly with high probability.
[0046]
Next, when it is determined in step 17 that the area of R1 + R2 is equal to or greater than the reference value, or in step 18 that the ratio of Y2 + YG is equal to or greater than the reference value, the process proceeds to step 19.
[0047]
Next, whether the number of R1 + R2 regions < 5 is determined in step 19, and when that number is determined to be equal to or less than the reference value, the part is identified as the front of a rib. If the number of R1 + R2 regions is determined to be equal to or greater than the reference value, the part of the chicken A cannot be determined, and the measurement of step 20 is repeated several times for identification. A piece that still cannot be determined after repeated measurement is collected or discarded.
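Taken together, steps 1 to 20 form a decision tree. The sketch below codes that tree in Python using the reference values quoted in the text. Because the translated description leaves some branch directions ambiguous, and the step-2 threshold is not stated at all (it appears here as a hypothetical parameter), this is a best-effort reconstruction to be checked against the flowcharts of FIGS. 4 and 5, not a faithful implementation.

```python
# Reference values quoted for steps 1-19; ratios expressed as fractions.
REFS = dict(size=110, cg_per_area=18, wing_area=50000, complexity=1500,
            keel_ratio=1.45, keel_ratio2=1.25, yg_count=10,
            yg_complexity=100, yg_count2=15, y2_ratio=0.25, y2yg_hi=0.70,
            r_var=200, r_area_hi=2000, r_dist=20, r_count=10,
            r_area_lo=1000, y2yg_lo=0.60, r_count2=5)

def classify_part(f, needle_ref_step2=1.6):
    """Classify one chicken piece from its feature dict `f`.

    `needle_ref_step2` is hypothetical: the text does not state the
    step-2 reference value.  Returns a part name, or "undetermined"
    (step 20: re-measure, then collect or discard).
    """
    # FIG. 4, steps 1-2: small piece with a high needle ratio -> drum.
    if f["size"] < REFS["size"] and f["needle_ratio"] >= needle_ref_step2:
        return "drum"
    # Steps 3-4: compact centre of gravity and small area -> wing.
    if (f["cg_dist_min_per_area"] < REFS["cg_per_area"]
            and f["area"] < REFS["wing_area"]):
        return "wing"
    if f["complexity"] < REFS["complexity"]:                # step 5
        if f["needle_ratio"] > REFS["keel_ratio"]:          # step 6
            return "keel"
        if f["needle_ratio"] > REFS["keel_ratio2"]:         # step 7
            go_11 = False
            if f["yg_count"] < REFS["yg_count"]:            # step 8
                if f["yg_complexity"] < REFS["yg_complexity"]:  # step 9
                    return "keel back"
                go_11 = True                    # step 9 "no" -> step 11
            elif f["yg_count"] < REFS["yg_count2"]:         # step 10
                go_11 = True
            if go_11 and f["y2_ratio"] < REFS["y2_ratio"]:  # step 11
                if f["y2_yg_ratio"] > REFS["y2yg_hi"]:      # step 12
                    return "keel front"
    # FIG. 5: thigh (Sai) and rib, mainly from the transmission image.
    if f["r_variance"] < REFS["r_var"]:                     # step 13
        return "thigh front"
    if f["r_area"] > REFS["r_area_hi"]:                     # step 14
        if f["r_distance"] > REFS["r_dist"]:                # step 15
            if f["r_count"] <= REFS["r_count"]:             # step 16
                return "thigh back"
        else:
            return "rib back"
    if f["r_area"] < REFS["r_area_lo"] and f["y2_yg_ratio"] < REFS["y2yg_lo"]:
        return "rib front"                                  # steps 17-18
    if f["r_count"] < REFS["r_count2"]:                     # step 19
        return "rib front"
    return "undetermined"                                   # step 20
```

The early branches use only shape features from the reflected-light image, while the FIG. 5 branches lean on the R-region features of the transmission image, mirroring the two measurement sections of the device.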
[0048]
Also, if the remaining chicken A can be narrowed down to the back of the thigh and the front and back of the rib, it is possible to identify the front and back of the rib from the shape. In this case, one condition is that the drum, the wings, the keel, and the front of the thigh can be reliably identified.
[0049]
As described above, the detection light B emitted from the detection light irradiation device 3 and the radiation D emitted from the radiation irradiation device 5 are applied to the chicken A transported by the transport conveyor 2; the reflected light C reflected by the chicken A and the transmission image of the radiation D transmitted through the chicken A are captured by the imaging cameras 4 and 6; and, based on the image information output from the imaging cameras 4 and 6, which part the chicken A is is comprehensively and individually determined by the algorithm of the part determination device 7. Each part of the chicken A can therefore be identified and determined accurately and reliably, the operation of sorting various chickens A by part can be performed accurately, and the sorting accuracy can be improved while labor in the sorting operation is saved.
[0050]
Alternatively, the radiation D emitted from the radiation irradiation device 5 may first be applied to the chicken A transported by the transport conveyor 2 and the transmission image of the radiation D transmitted through the chicken A captured by the imaging camera 6, after which the detection light B emitted from the detection light irradiation device 3 is applied to the chicken A and the reflected light C reflected by the chicken A is captured by the imaging camera 4; the invention is not limited to the configuration of the embodiment.
[0051]
Further, the reference values (numerical values) described in the algorithm of FIGS. 4 and 5 may be changed to values corresponding to other types of chicken, beef, pork, and the like, so that the method can be applied to identifying each part of such meat.
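As a design note, one way to support this substitution of reference values is to keep the thresholds in a per-species table selected at run time. The sketch below illustrates this; the chicken values are those quoted in FIGS. 4 and 5, while the beef and pork values are hypothetical placeholders that would have to be calibrated for each meat type.

```python
# Per-species reference tables; only the chicken column comes from the
# text, the other values are hypothetical placeholders.
SPECIES_REFS = {
    "chicken": {"size": 110, "complexity": 1500, "r_variance": 200},
    "beef":    {"size": 300, "complexity": 2500, "r_variance": 400},
    "pork":    {"size": 250, "complexity": 2000, "r_variance": 350},
}

def refs_for(species: str) -> dict:
    """Return the reference-value set for the given meat type."""
    try:
        return SPECIES_REFS[species]
    except KeyError:
        raise ValueError(f"no reference table for species {species!r}")
```

Swapping the table leaves the decision logic untouched, which is the substitution paragraph [0051] describes.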
[Brief description of the drawings]
FIG. 1 is a perspective view showing a method of identifying a part of chicken by a meat identification device.
FIG. 2 is a control circuit block diagram of the meat identification device.
FIG. 3 is an explanatory diagram showing a color processing state of chicken.
FIG. 4 is an algorithm diagram showing a method for identifying a drum, a wing, and a keel.
FIG. 5 is an algorithm diagram, continuing from FIG. 4, showing a method of identifying a thigh (Sai) and a rib.
[Explanation of symbols]
A ... chicken, B ... detection light, C ... reflected light, D ... radiation, 1 ... meat identification device, 2 ... transport conveyor, 3 ... detection light irradiation device, 4, 6 ... imaging camera, 5 ... radiation irradiation device, 7 ... part determination device, 11 ... comparison information storage device, 12 ... image processing device, 13 ... detection sensor

Claims (6)

1. A meat identification method comprising: irradiating meat with external detection light projected from external detection light irradiation means and internal detection light projected from internal detection light irradiation means; detecting the external detection light reflected by the meat with external detection light detection means and detecting the internal detection light transmitted through the interior of the meat with internal detection light detection means; and determining, by an algorithm of part determination means, which part the meat is, based on the detection information output from the internal detection light detection means and the external detection light detection means.
2. The meat identification method according to claim 1, wherein the external detection light irradiation means and external detection light detection means, and the internal detection light irradiation means and internal detection light detection means, are provided on a transport conveyor that transports the meat; the external detection light projected from the external detection light irradiation means and the internal detection light projected from the internal detection light irradiation means are applied to the meat transported by the transport conveyor; and the external detection light reflected by the meat and the internal detection light transmitted through the interior of the meat are detected by the external detection light detection means and the internal detection light detection means while the meat is being transported.
3. A meat identification device comprising: external detection light irradiation means for irradiating meat with external detection light of a wavelength suited to detecting a predetermined item on the exterior of the meat; external detection light detection means for detecting the external detection light reflected by the meat; internal detection light irradiation means for irradiating the meat with internal detection light of a wavelength suited to detecting a predetermined item in the interior of the meat; internal detection light detection means for detecting the internal detection light transmitted through the interior of the meat; and part determination means for determining, by a predetermined algorithm, which part the meat is, based on the detection information detected by the external detection light detection means and the internal detection light detection means.
4. The meat identification device according to claim 3, wherein the external detection light irradiation means and external detection light detection means, and the internal detection light irradiation means and internal detection light detection means, are provided on a transport conveyor that transports the meat.
5. The meat identification method according to claim 1 or 2, or the meat identification device according to claim 3 or 4, wherein the external detection light irradiation means comprises detection light irradiation means for irradiating the meat with detection light, and the external detection light detection means comprises reflected light detection means for detecting the reflected light reflected by the meat.
6. The meat identification method according to claim 1 or 2, or the meat identification device according to claim 3 or 4, wherein the internal detection light irradiation means comprises radiation irradiation means for irradiating the meat with radiation, and the internal detection light detection means comprises radiation detection means for detecting the radiation transmitted through the meat.
JP2002199762A 2002-07-09 2002-07-09 Meat identification method and meat identification device Pending JP2004045072A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002199762A JP2004045072A (en) 2002-07-09 2002-07-09 Meat identification method and meat identification device

Publications (1)

Publication Number Publication Date
JP2004045072A true JP2004045072A (en) 2004-02-12

Family

ID=31706816

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002199762A Pending JP2004045072A (en) 2002-07-09 2002-07-09 Meat identification method and meat identification device

Country Status (1)

Country Link
JP (1) JP2004045072A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06242102A (en) * 1993-02-23 1994-09-02 Snow Brand Milk Prod Co Ltd Inspection method and apparatus for flesh tissue
JPH06318244A (en) * 1993-05-10 1994-11-15 Toshiba Eng Co Ltd Image processor and beef carcass judging system using the same
JPH06324006A (en) * 1993-04-30 1994-11-25 Snow Brand Milk Prod Co Ltd Inspection and machining method and device of flesh tissue
JPH0854350A (en) * 1991-07-26 1996-02-27 Nikka Densoku Kk Method and device for inspecting sheet of dried laver
JPH10132762A (en) * 1996-10-31 1998-05-22 Shimadzu Corp X-ray foreign matter inspection device

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006234422A (en) * 2005-02-22 2006-09-07 Anritsu Sanki System Co Ltd X-ray inspection apparatus and stored information extraction system
KR100784854B1 (en) 2007-02-14 2007-12-14 한국기초과학지원연구원 Excitation light source unit and luminescence measurement system having same
JP2011117817A (en) * 2009-12-03 2011-06-16 Si Seiko Co Ltd Article inspection device
WO2012056793A1 (en) * 2010-10-27 2012-05-03 株式会社前川製作所 Deboning method and deboning apparatus of bony chops with x-rays
US8376814B2 (en) 2010-10-27 2013-02-19 Mayekawa Mfg. Co., Ltd. Deboning method and apparatus for meat with bone using X-ray
JPWO2012056793A1 (en) * 2010-10-27 2014-03-20 株式会社前川製作所 Method and apparatus for deboning boned meat using X-ray
JP2013019689A (en) * 2011-07-07 2013-01-31 Anritsu Sanki System Co Ltd X-ray inspection device
JP2019526061A (en) * 2016-07-29 2019-09-12 ノルデイシェル・マシーネンバウ・ルド・バアデル・ゲーエムベーハー・ウント・コンパニ・カーゲーNordischer Maschinenbau Rud.Baader Gesellschaft Mit Beschrankter Haftung+Compagnie Kommanditgesellschaft Apparatus for acquiring and analyzing product-specific data of products in the food processing industry, system including the apparatus, and product processing method in the food processing industry
JP2021092582A (en) * 2016-07-29 2021-06-17 ノルデイシェル・マシーネンバウ・ルド・バアデル・ゲーエムベーハー・ウント・コンパニ・カーゲーNordischer Maschinenbau Rud.Baader Gesellschaft Mit Beschrankter Haftung+Compagnie Kommanditgesellschaft Device for acquiring and analyzing product specific data of product of food processing industry, system with the device, and product processing method of food processing industry
JP7265569B2 (en) 2016-07-29 2023-04-26 ノルデイシェル・マシーネンバウ・ルド・バアデル・ゲーエムベーハー・ウント・コンパニ・カーゲー A device for acquiring and analyzing product-specific data of products in the food processing industry, a system equipped with the device, and a product processing method in the food processing industry
EP3531126A1 (en) * 2018-02-27 2019-08-28 Milan Fish S.R.L. Method and apparatus for the inspection of packaged fish products
WO2019232113A1 (en) * 2018-06-01 2019-12-05 Cryovac, Llc Image-data-based classification of meat products

Legal Events

Date Code Title
2005-04-07 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2007-08-29 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2007-09-04 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2008-01-08 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)