
WO2015146972A1 - Endoscope system, processor device for endoscope system, and method for operating endoscope system - Google Patents


Info

Publication number
WO2015146972A1
WO2015146972A1 (PCT/JP2015/058897)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
signal
imaging
illumination light
filter
Prior art date
Application number
PCT/JP2015/058897
Other languages
English (en)
Japanese (ja)
Inventor
小柴 賢明
村山 任
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation (富士フイルム株式会社)
Priority to JP2016510379A (granted as JP6150364B2)
Publication of WO2015146972A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00004: Operational features of endoscopes characterised by electronic signal processing
    • A61B 1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B 1/000094: Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope, extracting biological structures
    • A61B 1/04: Combined with photographic or television appliances
    • A61B 1/043: Combined with photographic or television appliances, for fluorescence imaging
    • A61B 1/05: Combined with photographic or television appliances, characterised by the image sensor, e.g. camera, being in the distal end portion
    • A61B 1/06: With illuminating arrangements
    • A61B 1/0638: With illuminating arrangements providing two or more wavelengths

Definitions

  • The present invention relates to an endoscope system that obtains biological function information related to the oxygen saturation of blood hemoglobin from an imaging signal obtained by imaging an observation site in a living body, a processor device for the endoscope system, and a method of operating the endoscope system.
  • the endoscope system has a normal observation mode in which observation is performed by irradiating white normal light to an observation site in a living body, and a special observation mode in which observation is performed by irradiating special light to the observation site.
  • As this special observation mode, one capable of acquiring the oxygen saturation of blood hemoglobin at the observation site is known (see Patent Document 1).
  • the oxygen saturation is biological function information that enables discrimination between normal tissue and cancer tissue.
  • the first illumination light and the second illumination light are alternately supplied from the light source device to the endoscope, and irradiated to the observation site from the distal end portion of the endoscope.
  • the first illumination light is normal light.
  • The second illumination light is special light that has spectral characteristics different from those of the first illumination light and includes light at a wavelength where the absorption coefficients of oxyhemoglobin and reduced hemoglobin differ.
  • The processor device obtains the oxygen saturation based on the first and second imaging signals obtained during irradiation with the first and second illumination light, generates a normal observation image based on the first imaging signal, and generates an oxygen saturation image (special observation image) by performing image processing on the normal observation image based on the oxygen saturation.
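As a rough illustration of how such a processor step might map signals to saturation, the sketch below converts the ratio of an oxygen-sensitive signal to a reference signal into a saturation percentage. The linear calibration, the clamp range, and the function name are all hypothetical; an actual device would use a measured lookup table rather than this toy mapping.

```python
def oxygen_saturation(b2, g1):
    """Toy estimate of oxygen saturation (%) from the ratio of an
    oxygen-sensitive blue signal b2 (second illumination) to a
    reference green signal g1 (first illumination).

    The linear mapping below (ratio 0.5 -> 0 %, ratio 1.5 -> 100 %)
    is a hypothetical stand-in for the calibrated lookup table a
    real processor device would hold.
    """
    ratio = b2 / g1
    sat = (ratio - 0.5) * 100.0
    # Clamp to the physically meaningful range.
    return max(0.0, min(100.0, sat))
```

With this toy calibration, equal blue and green signals (ratio 1.0) map to 50 % saturation.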
  • CMOS: Complementary Metal-Oxide Semiconductor
  • ADC: Analog-to-Digital Converter
  • Patent Document 2 proposes that a light extinction period is provided when switching between normal light and special light, and signal readout is performed during the light extinction period.
  • An object of the present invention is to provide an endoscope system, a processor device for the endoscope system, and a method of operating the endoscope system that can improve the S/N ratio and the reliability of the oxygen saturation image.
  • the endoscope system of the present invention includes an illumination unit, an endoscope, a control unit, a displacement amount calculation unit, a positioning unit, and an image processing unit.
  • The illumination unit irradiates the specimen with first illumination light and with second illumination light that has spectral characteristics different from those of the first illumination light and includes light at a wavelength where the absorption coefficients of oxygenated hemoglobin and reduced hemoglobin differ.
  • The endoscope has a CMOS image sensor that images the specimen illuminated by the illumination unit with a plurality of pixels arranged two-dimensionally in the row direction and the column direction, via a plurality of types of color filters having different spectral transmission characteristics.
  • The control unit causes the image sensor to execute an imaging operation in which the first illumination light and the second illumination light are irradiated alternately with an extinction period in between and, during the extinction period, the pixels corresponding to the first filter among the plurality of types of color filters are read out without pixel addition, while the pixels corresponding to the second filters other than the first filter are read out with pixel addition in at least one of the row direction and the column direction.
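A sketch of this mixed readout, assuming the Bayer layout described later (B/G on even rows, G/R on odd rows): G pixels are read individually while B and R pixels are summed with the same-colour pixel two rows away. Pairing every B/R pixel with the pixel two rows below it is a simplification of the row-pair grouping an actual sensor would use, and the function name is invented.

```python
def read_bayer_binned(raw):
    """Mixed readout sketch for a Bayer frame `raw` (list of rows).

    G pixels sit on the checkerboard positions (y + x odd) and are
    read as-is; B and R pixels are summed with the same-colour pixel
    two rows below (column-direction 2-pixel addition), roughly
    doubling their signal level. Pixels in the last two rows have no
    partner below and are read unbinned in this simplified sketch.
    """
    h, w = len(raw), len(raw[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            if (y + x) % 2 == 1:        # G pixel: no pixel addition
                row.append(raw[y][x])
            elif y + 2 < h:             # B or R: add same-colour pixel 2 rows down
                row.append(raw[y][x] + raw[y + 2][x])
            else:                       # bottom rows: no partner available
                row.append(raw[y][x])
        out.append(row)
    return out
```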
  • The positional deviation amount calculation unit compares the first imaging signal, read from the image sensor during the extinction period after irradiation with the first illumination light, with the second imaging signal, read from the image sensor during the extinction period after irradiation with the second illumination light, using the signals of the pixels corresponding to the first filter, and thereby calculates the amount of positional deviation between the images based on the first and second imaging signals.
  • the alignment unit aligns images between the first and second imaging signals based on the amount of displacement.
  • the image processing unit generates an oxygen saturation image including oxygen saturation information based on the first and second imaging signals that have been aligned.
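The displacement-calculation and alignment steps above can be sketched with a toy integer block-matching search on the G-pixel signals. The exhaustive sum-of-squared-differences search, the zero padding, and both function names are illustrative stand-ins, not the patent's actual algorithm.

```python
def estimate_shift(g1, g2, search=2):
    """Estimate the integer displacement (dy, dx) between two G-pixel
    images by minimising the mean squared difference over the
    overlapping region, for every shift in a small search window."""
    h, w = len(g1), len(g1[0])
    best, best_err = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            err = n = 0
            for y in range(h):
                for x in range(w):
                    y2, x2 = y + dy, x + dx
                    if 0 <= y2 < h and 0 <= x2 < w:
                        err += (g1[y][x] - g2[y2][x2]) ** 2
                        n += 1
            if n and (best_err is None or err / n < best_err):
                best_err, best = err / n, (dy, dx)
    return best


def align(img, shift):
    """Shift `img` by (dy, dx) so it registers with the reference
    frame, padding uncovered pixels with 0."""
    dy, dx = shift
    h, w = len(img), len(img[0])
    return [[img[y + dy][x + dx] if 0 <= y + dy < h and 0 <= x + dx < w else 0
             for x in range(w)] for y in range(h)]
```

Applying `align` with the estimated shift registers the second image to the first before the oxygen saturation image is computed.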
  • the first filter is a green filter and the second filter is a red filter and a blue filter.
  • the color filters are arranged in a Bayer array.
  • Preferably, the pixels corresponding to the second filters are read out by adding two pixels of the same color corresponding to the second filter in the column direction.
  • Preferably, the image processing unit calculates the oxygen saturation based on the aligned first and second imaging signals, and generates the oxygen saturation image by performing image processing on the normal observation image based on the oxygen saturation.
  • The image processing unit may generate the oxygen saturation image after adding, for each of the first and second imaging signals, the readout signals of the pixels corresponding to the first filter.
  • Preferably, when the brightness is less than a certain value, the control unit reads out the pixels corresponding to the second filters with pixel addition, and when the brightness is equal to or greater than the certain value, the control unit reads out the pixels corresponding to the second filters without pixel addition.
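This brightness-adaptive behaviour amounts to a simple threshold test; the threshold value and the function name below are hypothetical, chosen only to illustrate the decision.

```python
def choose_readout(mean_brightness, threshold=64):
    """Sketch of the brightness-adaptive readout decision: in dark
    scenes B/R pixels are read with 2-pixel addition to lift their
    signal level; in bright scenes they are read individually to
    preserve resolution. `threshold` is an illustrative value, not a
    figure from the patent."""
    return "binned" if mean_brightness < threshold else "unbinned"
```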
  • the processor device of the endoscope system includes a control unit, a misregistration amount calculation unit, an alignment unit, and an image processing unit.
  • The control unit causes the image sensor to execute an imaging operation in which the first illumination light and the second illumination light are irradiated alternately with an extinction period in between and, during the extinction period, the pixels corresponding to the first filter among the plurality of types of color filters are read out without pixel addition, while the pixels corresponding to the second filters other than the first filter are read out with pixel addition in at least one of the row direction and the column direction.
  • The positional deviation amount calculation unit compares the first imaging signal, read from the image sensor during the extinction period after irradiation with the first illumination light, with the second imaging signal, read from the image sensor during the extinction period after irradiation with the second illumination light, using the signals of the pixels corresponding to the first filter, and thereby calculates the amount of positional deviation between the images based on the first and second imaging signals.
  • the alignment unit aligns images between the first and second imaging signals based on the amount of displacement.
  • the image processing unit generates an oxygen saturation image including oxygen saturation information based on the first and second imaging signals that have been aligned.
  • the operating method of the endoscope system of the present invention includes a first step, a second step, a third step, and a fourth step.
  • In the first step, the control unit causes the illumination unit and the image sensor to execute an operation in which the first illumination light and the second illumination light are irradiated alternately with an extinction period in between and, during the extinction period, the pixels corresponding to the first filter among the plurality of types of color filters are read out without pixel addition while the pixels corresponding to the second filters are read out with pixel addition.
  • In the second step, the misregistration amount calculation unit compares the first imaging signal, read from the image sensor during the extinction period after irradiation with the first illumination light, with the second imaging signal, read from the image sensor during the extinction period after irradiation with the second illumination light, using the signals of the pixels corresponding to the first filter, and calculates the amount of positional deviation between the images based on the first and second imaging signals.
  • In the third step, the alignment unit performs image alignment between the first and second imaging signals based on the amount of displacement.
  • In the fourth step, the image processing unit generates an oxygen saturation image including oxygen saturation information based on the aligned first and second imaging signals.
  • According to the present invention, the first illumination light and the second illumination light are irradiated alternately with an extinction period in between; during each extinction period, the pixels corresponding to the first filter among the plurality of types of color filters are read out without pixel addition while the pixels corresponding to the second filters other than the first filter are read out with pixel addition; and the amount of positional deviation between the images based on the first and second imaging signals, read out during the extinction periods after irradiation with the first and second illumination light, is calculated.
  • Since the oxygen saturation image is generated after the images based on the first and second imaging signals are aligned using this positional deviation amount, the S/N ratio and the reliability of the oxygen saturation image can be improved.
  • An endoscope system 10 includes an endoscope 11 that images an observation site (specimen) in a living body, a processor device 12 that generates a display image of the observation site based on the imaging signal obtained by the imaging, a light source device 13 that supplies the endoscope 11 with illumination light for irradiating the observation site, and a monitor 14 that displays the display image.
  • an input unit 15 such as a keyboard and a mouse is connected to the processor device 12.
  • The endoscope 11 includes an insertion portion 16 that is inserted into the digestive tract of a living body, an operation portion 17 provided at the proximal end of the insertion portion 16, and a universal cord 18 that connects the endoscope 11 to the processor device 12 and the light source device 13.
  • The insertion portion 16 includes a distal end portion 19, a bending portion 20, and a flexible tube portion 21, which are connected in this order from the distal end side.
  • the operation unit 17 is provided with an angle knob 22a, a mode switch 22b, and the like.
  • the angle knob 22a is used for the operation of bending the bending portion 20. By operating the angle knob 22a, the tip 19 can be directed in a desired direction.
  • The mode switch 22b is used to switch between the two observation modes, the normal observation mode and the special observation mode.
  • the normal observation mode is a mode in which a normal observation image obtained by capturing an observation target in full color with white light is displayed on the monitor 14.
  • the special observation mode is a mode in which the oxygen saturation of blood hemoglobin to be observed is obtained, and an oxygen saturation image obtained by performing image processing on the normal observation image based on the oxygen saturation is displayed on the monitor 14.
  • An imaging element 39 (see FIG. 3) is built in the back of the observation window 24.
  • the bending portion 20 is composed of a plurality of connected bending pieces, and bends in the vertical and horizontal directions according to the operation of the angle knob 22a of the operation portion 17. By curving the bending portion 20, the distal end portion 19 is directed in a desired direction.
  • the flexible tube portion 21 has flexibility and can be inserted into a tortuous tube passage such as an esophagus or an intestine.
  • Inserted through the insertion portion are a signal cable that transmits the control signal for driving the image sensor 39 and the imaging signal output from the image sensor 39, and a light guide 35 (see FIG. 3) that guides the illumination light supplied from the light source device 13 to the illumination window 23.
  • The operation unit 17 is provided with a forceps port 27 for inserting a treatment instrument, an air/water supply button 28 operated when air or water is supplied from the air/water supply nozzle 25, a freeze button (not shown) for capturing a still image, and the like.
  • A communication cable and the light guide 35 extending from the insertion portion 16 are inserted through the universal cord 18, and a connector 29 is attached to the end on the processor device 12 and light source device 13 side.
  • the connector 29 is a composite type connector composed of a communication connector 29a and a light source connector 29b.
  • the communication connector 29a and the light source connector 29b are detachably connected to the processor device 12 and the light source device 13, respectively.
  • One end of a communication cable is disposed on the communication connector 29a.
  • An incident end 35a (see FIG. 3) of the light guide 35 is disposed on the light source connector 29b.
  • the light source device 13 includes first and second laser diodes (LDs) 30a and 30b, a light source control unit 31, first and second optical fibers 32a and 32b, and an optical coupler 33.
  • the first LD 30a emits a first blue laser beam having a center wavelength of 445 nm.
  • the second LD 30b emits a second blue laser beam having a center wavelength of 473 nm.
  • the half widths of the first and second blue laser beams are about ⁇ 10 nm, respectively.
  • a broad area type InGaN laser diode, an InGaNAs laser diode, a GaNAs laser diode, or the like is used for the first and second LDs 30a and 30b.
  • the light source control unit 31 individually controls turning on and off of the first and second LDs 30a and 30b. In the normal observation mode, the light source control unit 31 turns on the first LD 30a. In the special observation mode, the first LD 30a and the second LD 30b are turned on in order.
  • the first blue laser light emitted from the first LD 30a is incident on the first optical fiber 32a.
  • the second blue laser light emitted from the second LD 30b is incident on the second optical fiber 32b.
  • The first and second optical fibers 32a and 32b are connected to the optical coupler 33.
  • the optical coupler 33 integrates the optical paths of the first and second optical fibers 32a and 32b, and causes the first and second blue laser beams to enter the incident end 35a of the light guide 35 of the endoscope 11, respectively.
  • the endoscope 11 includes a light guide 35, a phosphor 36, an illumination optical system 37, an imaging optical system 38, an imaging element 39, and a signal transmission unit 40.
  • One light guide 35 is provided for each illumination window 23.
  • As the light guide 35, a multimode fiber can be used, for example a thin fiber cable having a core diameter of 105 μm, a cladding diameter of 125 μm, and an overall diameter of 0.3 to 0.5 mm including a protective layer serving as an outer sheath.
  • Each light guide 35 disposed on the light source connector 29b faces the emission end of the optical coupler 33.
  • a phosphor 36 is arranged to face the emission end of each light guide 35 located at the tip 19. The first blue laser light or the second blue laser light is incident on the phosphor 36 via the light guide 35.
  • The phosphor 36 is formed by dispersing a plurality of types of phosphor materials (for example, a YAG phosphor or a phosphor such as BAM (BaMgAl10O17)) in a binder and shaping the mixture into a rectangular parallelepiped.
  • the phosphor 36 is excited by absorbing a part of the laser light (first blue laser light or second blue laser light) incident from the light guide 35 and emits fluorescence having a wavelength band from green to red. Further, part of the laser light incident on the phosphor 36 passes through the phosphor 36 as it is without being absorbed by the phosphor 36. Therefore, the fluorescent material 36 emits fluorescent light and a part of the laser light.
  • the first illumination light having the spectrum shown in FIG. 4 is emitted from the phosphor 36.
  • the first illumination light includes first blue laser light and first fluorescence excited and emitted from the phosphor 36 by the first blue laser light.
  • When the second LD 30b is turned on and the second blue laser light is incident on the phosphor 36, the phosphor 36 emits the second illumination light having a spectrum shown in FIG.
  • the second illumination light includes the second blue laser light and the second fluorescence excited and emitted from the phosphor 36 by the second blue laser light, and has different spectral characteristics from the first illumination light.
  • The spectral shapes of the first fluorescence and the second fluorescence are substantially the same. That is, the ratio between the intensity I1(λ) of the first fluorescence and the intensity I2(λ) of the second fluorescence at a wavelength λ is substantially constant.
  • the first and second illumination lights emitted from the phosphor 36 are collected by the illumination optical system 37 and irradiated onto the observation site in the living body through the illumination window 23.
  • the reflected light from the observation site enters the imaging optical system 38 through the observation window 24 and is imaged on the imaging surface 39 a of the imaging device 39 by the imaging optical system 38.
  • the light source device 13, the light guide 35, the phosphor 36, and the illumination optical system 37 correspond to the illumination unit described in the claims.
  • the imaging element 39 is a CMOS type, and images reflected light from the observation site based on the imaging control signal supplied from the processor device 12 and outputs an imaging signal.
  • The signal transmission unit 40 transmits the imaging signal obtained by the image sensor 39 to the processor device 12 by a well-known low-voltage differential signaling (LVDS) transmission method. Further, when the above-described mode switch 22b provided in the endoscope 11 is operated, a mode switching operation signal is transmitted from the mode switch 22b to the processor device 12.
  • the processor device 12 includes a control unit 41, a signal reception unit 42, a digital signal processing unit (DSP: Digital Signal Processor) 43, an image processing unit 44, and a display control unit 45.
  • the control unit 41 controls each unit in the processor device 12 and controls the image sensor 39 of the endoscope 11 and the light source control unit 31 of the light source device 13.
  • the signal receiving unit 42 receives an imaging signal transmitted from the signal transmitting unit 40 of the endoscope 11.
  • the DSP 43 performs well-known signal processing such as defect correction processing, gain correction processing, white balance processing, gamma conversion, and synchronization processing on the imaging signal received by the signal receiving unit 42.
  • In the normal observation mode, the image processing unit 44 generates a normal observation image by performing color conversion processing, color enhancement processing, structure enhancement processing, and the like on the imaging signal that was obtained by the image sensor 39 imaging the reflected light from the observation site irradiated with the first illumination light and that has been signal-processed by the DSP 43.
  • In the special observation mode, the image processing unit 44 calculates the oxygen saturation based on the imaging signals that were obtained by the image sensor 39 imaging the reflected light from the observation site irradiated with the first and second illumination light and that have been signal-processed by the DSP 43, generates a normal observation image, and generates an oxygen saturation image (special observation image) including oxygen saturation information by processing the normal observation image based on the oxygen saturation.
  • the display control unit 45 converts the image generated by the image processing unit 44 into a display format signal and displays it on the monitor 14.
  • The image sensor 39 includes a pixel array unit 50, a row scanning circuit 51, a column ADC circuit 52 in which a plurality of ADCs (Analog-to-Digital Converters) are arranged in the row direction, a line memory 53, a column scanning circuit 54, and a timing generator (TG) 55.
  • the TG 55 generates a timing signal based on the imaging control signal input from the control unit 41 of the processor device 12 and controls each unit.
  • the pixel array unit 50 includes a plurality of pixels 50a two-dimensionally arranged in a matrix in the row direction (X direction) and the column direction (Y direction), and is provided on the imaging surface 39a.
  • a first row selection line LS1, a second row selection line LS2, and a row reset line LR are wired along the row direction.
  • Along the column direction, a first column signal line LV1 and a second column signal line LV2 are wired.
  • the first row selection line LS1, the second row selection line LS2, and the row reset line LR are provided for each pixel row.
  • The first column signal line LV1 and the second column signal line LV2 are provided for each pixel column.
  • the pixel row refers to one row of pixels 50a arranged in the row direction.
  • the pixel column refers to one column of pixels 50a arranged in the column direction.
  • a color filter array 60 is provided on the light incident side of the pixel array unit 50 as shown in FIG.
  • the color filter array 60 includes a green (G) filter 60a, a blue (B) filter 60b, and a red (R) filter 60c. Any one of these filters is arranged on each pixel 50a.
  • the color arrangement of the color filter array 60 is a Bayer arrangement, in which G filters 60a are arranged every other pixel in a checkered pattern, and the B filter 60b and the R filter 60c are each in a square lattice pattern on the remaining pixels. Is arranged.
  • The G filter 60a corresponds to the first filter described in the claims, and the B filter 60b and the R filter 60c correspond to the second filters.
  • the pixel 50a in which the G filter 60a is disposed is referred to as a G pixel
  • the pixel 50a in which the B filter 60b is disposed is referred to as a B pixel
  • the pixel 50a in which the R filter 60c is disposed is referred to as an R pixel.
  • B pixels and G pixels are alternately arranged in each pixel row of even number (0, 2, 4,... N ⁇ 1).
  • G pixels and R pixels are alternately arranged in each odd (1, 3, 5,..., N) pixel row.
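The layout described in the two bullets above can be captured in a small helper. Placing B at the even-row/even-column positions is an assumption consistent with, but not explicitly fixed by, the description.

```python
def bayer_color(y, x):
    """Colour of the filter over pixel (y, x) in the Bayer layout
    described above: B and G alternate on even rows, G and R on odd
    rows, so G covers every other pixel in a checkerboard pattern."""
    if (y + x) % 2 == 1:
        return "G"                      # checkerboard positions
    return "B" if y % 2 == 0 else "R"   # remaining square lattices
```

In any 2x2 tile this yields two G pixels, one B pixel, and one R pixel, matching the standard Bayer ratio.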
  • Each pixel 50a in one pixel row is commonly connected to the row reset line LR.
  • the G pixel is commonly connected to the first row selection line LS1
  • the B pixel and the R pixel are commonly connected to the second row selection line LS2.
  • Each pixel 50a is connected to the first column signal line LV1 or the second column signal line LV2. Specifically, among all the pixel rows, each pixel 50a in the 0, 1, 4, 5, 8, 9, ..., N-3, N-2 pixel rows (hereinafter referred to as the first pixel row group) is connected to the first column signal line LV1, and each pixel 50a in the 2, 3, 6, 7, 10, 11, ..., N-1, N pixel rows (hereinafter referred to as the second pixel row group) is connected to the second column signal line LV2.
  • each pixel 50a has a photodiode D1, an amplifier transistor M1, a pixel selection transistor M2, and a reset transistor M3.
  • the photodiode D1 photoelectrically converts incident light to generate signal charges corresponding to the amount of incident light, and accumulates the signal charges.
  • the amplifier transistor M1 converts the signal charge accumulated in the photodiode D1 into a voltage value (pixel signal PS).
  • The pixel selection transistor M2 is controlled by the first row selection line LS1 or the second row selection line LS2, and causes the pixel signal PS generated by the amplifier transistor M1 to be output to the first column signal line LV1 or the second column signal line LV2.
  • The reset transistor M3 is controlled by the row reset line LR, and discards (resets) the signal charge accumulated in the photodiode D1 to the power supply line.
  • the row scanning circuit 51 generates a row selection signal S1 and a reset signal S2 based on the timing signal input from the TG 55.
  • During the signal readout operation, the row scanning circuit 51 applies the row selection signal S1 to the first row selection line LS1 or the second row selection line LS2, causing the pixels 50a connected to the line to which the row selection signal S1 is applied to output their pixel signals PS to the first column signal line LV1 or the second column signal line LV2.
  • During the reset operation, the row scanning circuit 51 applies the reset signal S2 to a row reset line LR, thereby resetting the pixels 50a connected to that row reset line.
  • the column ADC circuit 52 includes a comparator 52a, a counter 52b, a reference signal generator 52c, first to third capacitors C1 to C3, and first and second switches SW1 and SW2.
  • the first and second capacitors C1 and C2 are connected in parallel to the first input terminal of the comparator 52a, and the third capacitor C3 is connected to the second input terminal.
  • a counter 52b is connected to the output terminal of the comparator 52a.
  • The set of first and second column signal lines LV1 and LV2 provided for each pixel column is connected to the first and second capacitors C1 and C2 of one comparator 52a via the first and second switches SW1 and SW2, respectively.
  • The first and second switches SW1 and SW2 are on/off controlled based on a timing signal input from the TG 55, and one or both of them are turned on at the time of signal readout. Specifically, when the pixel signal PS is read from the first pixel row group, the first switch SW1 is turned on and the pixel signal PS is input to the first capacitor C1. When the pixel signal PS is read from the second pixel row group, the second switch SW2 is turned on and the pixel signal PS is input to the second capacitor C2. In the case of the pixel addition readout described later, the first and second switches SW1 and SW2 are both turned on, and the pixel signals PS read from the first and second pixel row groups are input to the first and second capacitors C1 and C2, respectively.
  • The reference signal generator 52c is commonly connected to the third capacitor C3 of each comparator 52a, and inputs the reference signal Vref to the third capacitor C3. Specifically, based on the clock signal input from the TG 55, the reference signal generator 52c generates a reference signal Vref that increases linearly with time, as shown in FIG., and inputs it to the third capacitor C3.
  • Each comparator 52a compares the pixel signal PS input to the first and second capacitors C1 and C2 with the reference signal Vref, and outputs a signal CS that represents the magnitude relationship between the two voltage values, as shown in FIG. 8B.
  • When the pixel signal PS is input to only one of the first and second capacitors C1 and C2, the comparator 52a compares that pixel signal PS with the reference signal Vref; when pixel signals PS are input to both capacitors, the sum of the two pixel signals PS is compared with the reference signal Vref.
  • the output signal CS is input to the counter 52b.
  • Based on the clock signal input from the TG 55, the counter 52b starts counting as the reference signal Vref starts increasing, as shown in FIG. 8C.
  • the counter 52b stops the counting operation when the voltage values of the pixel signal PS and the reference signal Vref match and the output signal CS changes from the low level to the high level.
  • the count value when the counter 52b stops the count operation corresponds to the pixel signal PS.
  • This count value is a digital signal, and is output from the column ADC circuit 52 to the line memory 53 as a digitized pixel signal PSD.
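The single-slope conversion described above (a linearly rising Vref, a comparator, and a counter that stops when the two voltages match) can be sketched as follows. This is an illustrative model rather than the circuit itself; the bit depth and full-scale voltage are assumed values.

```python
def single_slope_adc(pixel_voltage, v_max=1.0, n_bits=10):
    """Model of the column ADC: count clock cycles until the linearly
    rising reference signal Vref reaches the pixel signal PS."""
    n_steps = 1 << n_bits          # counter range (assumed 10-bit)
    step = v_max / n_steps         # Vref increase per clock cycle
    count, vref = 0, 0.0
    # the counter runs while the comparator output CS is low (Vref < PS)
    while vref < pixel_voltage and count < n_steps - 1:
        count += 1
        vref += step
    return count                   # count value = digitized pixel signal PSD

# pixel addition readout: with both switches on, the two capacitor
# charges sum before conversion, so the count roughly doubles
single = single_slope_adc(0.25)          # one pixel signal
added = single_slope_adc(0.25 + 0.25)    # sum of two pixel signals
```

Note that the conversion time of a single-slope ADC grows with the signal level, since the counter runs until Vref overtakes the input.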
  • the line memory 53 collectively holds the pixel signals PSD for one row digitized by the column ADC circuit 52.
  • the column scanning circuit 54 sequentially outputs the pixel signal PSD from the output terminal Vout by scanning the line memory 53 based on the timing signal input from the TG 55.
  • the pixel signal PSD for one frame output from the output terminal Vout is the above-described imaging signal.
  • FIG. 9 shows the spectral characteristics of the color filter array 60.
  • the G filter 60a, the B filter 60b, and the R filter 60c have different spectral transmission characteristics.
  • the G filter 60a has a high transmittance with respect to a wavelength region of about 450 to 630 nm.
  • the B filter 60b has a high transmittance with respect to a wavelength region of about 380 to 560 nm.
  • the R filter 60c has a high transmittance with respect to a wavelength region of about 580 to 760 nm.
  • Among the pixels, the B pixel has the highest sensitivity to the different-absorption-wavelength light described later.
  • Under the first illumination light, the first blue laser light and the short-wavelength-side component of the first fluorescence are incident on the B pixel, the main wavelength component of the first fluorescence is incident on the G pixel, and the long-wavelength-side component of the first fluorescence is incident on the R pixel.
  • Likewise, under the second illumination light, the second blue laser light and the short-wavelength-side component of the second fluorescence are incident on the B pixel, the main wavelength component of the second fluorescence is incident on the G pixel, and the long-wavelength-side component of the second fluorescence is incident on the R pixel. Since the emission intensity of the first and second blue laser lights is higher than that of the first and second fluorescence, respectively, most of the light incident on the B pixel is a component of the first blue laser light or the second blue laser light.
  • Since the image sensor 39 is a single-plate color image sensor, the imaging signal is divided into G, B, and R pixel signals.
  • As signal readout methods, a “sequential readout method”, a “partial readout method”, and a “pixel addition readout method” can be executed.
  • In the sequential readout method, the set of first and second row selection lines LS1 and LS2 of each pixel row is sequentially selected by the row scanning circuit 51, and a row selection signal S1 is applied to the selected lines.
  • Signal readout is thus performed sequentially for each pixel row, from the first pixel row “0” to the last pixel row “N”.
  • When reading the pixel rows of the first pixel row group, the first switch SW1 is turned on and the second switch SW2 is turned off; when reading the pixel rows of the second pixel row group, the second switch SW2 is turned on and the first switch SW1 is turned off.
  • In the partial readout method, the row scanning circuit 51 applies the row selection signal S1 to the selected first row selection line LS1 while sequentially selecting only the first row selection line LS1 of each pixel row.
  • As in the sequential readout method, the first switch SW1 is turned on and the second switch SW2 is turned off when reading the pixel rows of the first pixel row group, and the second switch SW2 is turned on and the first switch SW1 is turned off when reading the pixel rows of the second pixel row group.
  • The pixel signals read from the G pixels in the even pixel rows are stored in the odd columns of the line memory 53, and the pixel signals read from the G pixels in the odd pixel rows are stored in the even columns of the line memory 53. Therefore, the column scanning circuit 54 scans the line memory 53 every time a pixel signal (G pixel signal) for one row is accumulated in the line memory 53 by readout scanning over two pixel rows by the row scanning circuit 51.
  • In the pixel addition readout method, the row scanning circuit 51 simultaneously selects the second row selection lines LS2 of two pixel rows separated by one pixel row in the column direction, and applies the row selection signal S1 to the two selected second row selection lines LS2 at the same time.
  • Pixel addition readout is performed while pairs of pixel rows are sequentially selected in this combination. That is, the B pixels and the R pixels of a pixel row of the first pixel row group and a pixel row of the second pixel row group are added and read out.
  • At this time, both the first and second switches SW1 and SW2 are turned on, and the pixel signal PS input to the first capacitor C1 and the pixel signal PS input to the second capacitor C2 are added.
  • The column scanning circuit 54 scans the line memory 53 each time pixel signals (B pixel signals and R pixel signals) for one row are accumulated in the line memory 53 by the addition readout scanning over four pixel rows by the row scanning circuit 51.
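The three readout methods can be pictured with a toy row-scanning model. The row pairing in the pixel-addition case (rows r and r+2 selected together, advancing four pixel rows per output row) follows the description above; the array contents and sizes are arbitrary.

```python
import numpy as np

def sequential_readout(pixels):
    # sequential readout: every pixel row is read once, in row order
    return [pixels[r].copy() for r in range(pixels.shape[0])]

def partial_readout(pixels, selected_rows):
    # partial readout: only the rows selected via LS1 (e.g. the G-pixel rows)
    return [pixels[r].copy() for r in selected_rows]

def pixel_addition_readout(pixels):
    # pixel addition readout: rows r and r+2 (one from each pixel row
    # group, separated by one row in the column direction) are selected
    # together and their signals add on the capacitors C1/C2; the scan
    # advances four pixel rows per output row
    return [pixels[r] + pixels[r + 2] for r in range(0, pixels.shape[0] - 2, 4)]

frame = np.arange(16, dtype=float).reshape(8, 2)  # toy 8-row, 2-column sensor
rows_seq = sequential_readout(frame)              # 8 output rows
rows_add = pixel_addition_readout(frame)          # 2 summed output rows
```

On this 8-row toy sensor, sequential readout produces eight output rows while pixel addition readout produces two, each the sum of a row pair, which is why the latter trades resolution for signal level.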
  • As reset methods, a “sequential reset method” and a “batch reset method” can be executed.
  • In the sequential reset method, the row reset lines LR are sequentially selected by the row scanning circuit 51, and the reset signal S2 is applied to the selected row reset line LR; the reset is thus performed sequentially for each pixel row, from the first pixel row “0” to the last pixel row “N”.
  • In the batch reset method, all the row reset lines LR are selected by the row scanning circuit 51, and the reset signal S2 is applied to all the row reset lines LR at once, so that all the pixel rows of the pixel array unit 50 are reset simultaneously.
  • In the normal observation mode, the control unit 41 controls the light source control unit 31 to turn on the first LD 30a, thereby emitting the first illumination light from the illumination window 23 of the endoscope 11.
  • The control unit 41 also controls the image sensor 39 so that it is driven by a rolling shutter method.
  • In this rolling shutter drive, the reset is performed sequentially for each pixel row from the first pixel row “0” to the last pixel row “N” by the sequential reset method.
  • Signal readout is then performed sequentially for each pixel row from the first pixel row “0” to the last pixel row “N” by the sequential readout method.
  • As a result, an imaging signal for one frame is output from the image sensor 39.
  • This rolling shutter drive is repeatedly executed during the normal observation mode, and an imaging signal for one frame is obtained every frame time FT (for example, 1/60 seconds).
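A minimal timing sketch of the rolling shutter drive: each pixel row is reset and then read one row-time later than the previous row, so all rows share the same exposure duration but expose over staggered intervals. The row count and times below are illustrative, not values from the source.

```python
def rolling_shutter_schedule(n_rows, frame_time, exposure_time):
    """Return (reset_time, read_time) per pixel row for rolling shutter
    drive: sequential reset followed by sequential readout, row by row."""
    row_step = frame_time / n_rows          # delay between successive rows
    return [(r * row_step, r * row_step + exposure_time) for r in range(n_rows)]

# e.g. 4 rows, frame time 1/60 s, exposure 1/120 s (illustrative numbers)
schedule = rolling_shutter_schedule(4, frame_time=1 / 60, exposure_time=1 / 120)
```

The staggered intervals are the reason a rolling shutter cannot simply be combined with pulsed illumination; the special observation mode instead uses the batch reset described below so that all rows expose simultaneously.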
  • In the special observation mode, the control unit 41 controls the light source control unit 31 to alternately turn on the first and second LDs 30a and 30b with an extinguishing period interposed between them, so that the first and second illumination lights are alternately emitted from the illumination window 23 of the endoscope 11.
  • First, the first illumination light is emitted from the illumination window 23 of the endoscope 11, and all the pixel rows are simultaneously reset by the batch reset method.
  • When half of the exposure time ET (ET/2) has elapsed since the execution of the batch reset, the emission of the first illumination light is stopped and the light is turned off.
  • During the subsequent extinguishing period, the G pixels are sequentially read out by the partial readout method described above, and the B pixels and the R pixels are added and read out by the pixel addition readout method.
  • The G, B, and R pixel signals included in the imaging signal read during the extinguishing period after irradiation with the first illumination light (hereinafter referred to as the first imaging signal) are referred to as the G1 pixel signal, the B1 pixel signal, and the R1 pixel signal, respectively.
  • Similarly, the G, B, and R pixel signals included in the imaging signal read during the extinguishing period after irradiation with the second illumination light (hereinafter referred to as the second imaging signal) are referred to as the G2 pixel signal, the B2 pixel signal, and the R2 pixel signal, respectively.
  • Since signal readout after irradiation with each illumination light is performed by the partial readout method and the pixel addition readout method, the readout time after each irradiation is comparable to that of the conventional thinning readout, and the special observation mode of this embodiment can be executed without lowering the frame rate from that of the normal observation mode.
  • the first and second imaging signals are input to the DSP 43.
  • the DSP 43 performs synchronization processing and interpolation processing to generate one set of B1, G1, R1 pixel signals and one set of B2, G2, R2 pixel signals per pixel.
  • The image processing unit 44 of the processor device 12 includes a positional deviation amount calculation unit 70, an alignment unit 71, a signal ratio calculation unit 72, a correlation storage unit 73, an oxygen saturation calculation unit 74, a normal observation image generation unit 75, and a gain correction unit 76.
  • the G1 pixel signal and the G2 pixel signal are input to the positional deviation amount calculation unit 70.
  • the positional deviation amount calculation unit 70 compares the image based on the G1 pixel signal and the image based on the G2 pixel signal, and calculates the positional deviation amount between the two images.
  • The positional deviation amount calculation unit 70 calculates positional deviation amounts ΔX and ΔY in the X direction and the Y direction based on a cumulative histogram method described in JP2013-165576A or a known inter-image correlation calculation method.
  • the G1 pixel signal, the R1 pixel signal, and the B2 pixel signal are input to the alignment unit 71.
  • Based on the positional deviation amounts ΔX and ΔY calculated by the positional deviation amount calculation unit 70, the alignment unit 71 aligns the image based on the G1 and R1 pixel signals of the first imaging signal with the image based on the B2 pixel signal of the second imaging signal.
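A simple stand-in for the displacement estimation and alignment steps: the shift of one G image relative to the other is found by maximizing correlation over candidate integer shifts (one-dimensional here for brevity; the text names a cumulative-histogram method and inter-image correlation as the actual options), and the second frame is then shifted back by that amount.

```python
import numpy as np

def estimate_shift(ref, moved, max_shift=5):
    """Return the integer shift of `moved` relative to `ref` that
    maximizes their correlation (a toy version of the deltaX/deltaY search)."""
    best, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        # undo candidate shift s, then correlate over the interior samples
        score = np.sum(ref[max_shift:-max_shift] *
                       np.roll(moved, -s)[max_shift:-max_shift])
        if score > best_score:
            best, best_score = s, score
    return best

g1 = np.zeros(32)
g1[10:14] = 1.0                 # a feature in the G1 image
g2 = np.roll(g1, 3)             # the same feature, displaced in the G2 image
dx = estimate_shift(g1, g2)     # estimated positional deviation
b2_aligned = np.roll(g2, -dx)   # alignment: shift the second frame back
```

In the actual system the shift estimated from the full-resolution G1/G2 pair is applied to the B2 plane of the second imaging signal; here the same array stands in for both for compactness.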
  • the B2 pixel signal, G1 pixel signal, and R1 pixel signal after the alignment processing are input from the alignment unit 71 to the signal ratio calculation unit 72.
  • the signal ratio calculation unit 72 calculates a signal ratio B2 / G1 between the B2 pixel signal and the G1 pixel signal and a signal ratio R1 / G1 between the R1 pixel signal and the G1 pixel signal for each pixel.
  • These signal ratios B2 / G1, R1 / G1 are used for calculation of the oxygen saturation.
  • Of these, the signal ratio essential for calculating the oxygen saturation is B2/G1.
  • the correlation storage unit 73 stores the correlation between the signal ratio B2 / G1, R1 / G1 and the oxygen saturation. As shown in FIG. 13, the correlation is stored in a two-dimensional table in which isolines of oxygen saturation are defined in a two-dimensional space. The positions and shapes of the isolines for the signal ratios B2 / G1, R1 / G1 are obtained in advance by physical simulation of light scattering, and the interval between the isolines depends on the blood volume (signal ratio R1 / G1). Change. The correlation between the signal ratios B2 / G1, R1 / G1 and the oxygen saturation is stored on a log scale.
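The lookup step can be pictured with a toy correlation table. The real two-dimensional table comes from the light-scattering simulation mentioned above, so the axis values and saturation entries below are placeholders; only the structure (log-scale signal-ratio axes, nearest-entry lookup, clamping to 0–100%) reflects the text.

```python
import numpy as np

# placeholder correlation table: oxygen saturation (%) indexed by the
# log-scale signal ratios log10(B2/G1) and log10(R1/G1)
log_b2g1_axis = np.linspace(-1.0, 1.0, 5)
log_r1g1_axis = np.linspace(-1.0, 1.0, 5)
sat_table = np.tile(np.linspace(100.0, 0.0, 5)[:, None], (1, 5))

def oxygen_saturation(b2, g1, r1):
    """Look up the oxygen saturation for one pixel from its signal
    ratios B2/G1 and R1/G1 (nearest-neighbor lookup for simplicity)."""
    lb = np.log10(b2 / g1)                       # signal ratio B2/G1, log scale
    lr = np.log10(r1 / g1)                       # signal ratio R1/G1, log scale
    i = int(np.argmin(np.abs(log_b2g1_axis - lb)))   # nearest table row
    j = int(np.argmin(np.abs(log_r1g1_axis - lr)))   # nearest table column
    return float(np.clip(sat_table[i, j], 0.0, 100.0))  # clamp to 0-100 %
```

A production implementation would interpolate between isolines rather than taking the nearest entry, but the per-pixel flow — two ratios in, one saturation out — is the same.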
  • the above correlation is closely related to the light absorption characteristic of oxygenated hemoglobin (one-dot chain line 77) and the light absorption characteristic of reduced hemoglobin (solid line 78) shown in FIG.
  • The oxygen saturation can be calculated by using light of a wavelength at which the difference between the extinction coefficient of oxygenated hemoglobin and that of reduced hemoglobin is large (the different-absorption-wavelength light), such as the center wavelength of 473 nm of the second blue laser light.
  • However, the B2 pixel signal, which mainly depends on the second blue laser light, depends largely not only on the oxygen saturation but also on the blood volume.
  • On the other hand, the R1 pixel signal mainly depends on the blood volume.
  • The oxygen saturation calculation unit 74 refers to the correlation stored in the correlation storage unit 73 and calculates, for each pixel, the oxygen saturation corresponding to the signal ratios B2/G1 and R1/G1 calculated by the signal ratio calculation unit 72.
  • In rare cases, the calculated value of the oxygen saturation falls below 0% or exceeds 100%. If the calculated value is less than 0%, the oxygen saturation may be set to 0%, and if it exceeds 100%, the oxygen saturation may be set to 100%.
  • The first imaging signal is input to the normal observation image generation unit 75.
  • the normal observation image generation unit 75 generates a normal observation image using the B1, G1, R1 pixel signals included in the first imaging signal.
  • The gain correction unit 76 performs gain correction corresponding to the oxygen saturation on each of the B1, G1, and R1 pixel signals constituting each pixel of the normal observation image. For example, in a pixel whose calculated oxygen saturation is 60% or more, the gain is set to “1” for all of the B1, G1, and R1 pixel signals. On the other hand, in a pixel whose calculated oxygen saturation is less than 60%, the gain is set to less than “1” for the B1 pixel signal and to “1” or more for the G1 and R1 pixel signals. An image is then generated using the B1, G1, and R1 pixel signals after gain correction.
  • The normal observation image subjected to gain correction in this way is the oxygen saturation image. In this oxygen saturation image, the high-oxygen region (where the oxygen saturation is 60 to 100%) has the same color as the normal observation image, while the low-oxygen region (where the oxygen saturation is 0 to 60%) is changed to blue.
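The per-pixel gain correction can be sketched as below. The 60% threshold and the unit gains in the high-oxygen case follow the example in the text; the concrete low-oxygen gain values (0.7 for B1, 1.2 for G1/R1) are illustrative choices, not values from the source.

```python
import numpy as np

def oxygen_gain_correction(b1, g1, r1, saturation):
    """Apply oxygen-saturation-dependent gains to the B1/G1/R1 planes:
    gain 1 everywhere at >= 60 % saturation; below 60 %, B1 gain < 1
    and G1/R1 gain >= 1 (the 0.7 / 1.2 values are illustrative)."""
    low = saturation < 60.0
    b_gain = np.where(low, 0.7, 1.0)
    gr_gain = np.where(low, 1.2, 1.0)
    return b1 * b_gain, g1 * gr_gain, r1 * gr_gain

sat = np.array([80.0, 40.0])      # one high-oxygen and one low-oxygen pixel
b, g, r = oxygen_gain_correction(np.ones(2), np.ones(2), np.ones(2), sat)
```

Because the gains are applied to the normal observation image itself, high-oxygen tissue keeps its natural color while only the hypoxic regions are re-tinted.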
  • The positional deviation amounts ΔX and ΔY are calculated based on the high-resolution G1 and G2 pixel signals read from the image sensor 39 without pixel addition, so the calculation accuracy is high. Accordingly, the accuracy of the alignment between the images of the first and second imaging signals by the alignment unit 71 is improved, and the accuracy of the oxygen saturation calculation by the oxygen saturation calculation unit 74 is improved. Further, since the B2 pixel signal, which contributes most to the calculation of the oxygen saturation, is obtained by pixel addition readout, its signal level is high and the S/N ratio is improved. As a result, the S/N ratio and reliability of the oxygen saturation image generated through the gain correction of the gain correction unit 76 are improved.
  • the operator inserts the endoscope 11 into the living body, and the observation site is observed in the normal observation mode (step S10).
  • In the normal observation mode, the image sensor 39 is driven by the rolling shutter method while the first illumination light irradiates the observation site, and the imaging signal is read from the image sensor 39 every frame time.
  • a normal observation image is generated by the image processing unit 44 and displayed on the monitor 14 (step S11).
  • the display frame rate of the monitor 14 is the same as the frame rate of the image sensor 39, and the normal observation image displayed on the monitor 14 is updated every frame time.
  • When the operator finds a potential lesion by observation in the normal observation mode and operates the mode switching SW 22b to switch the observation mode (YES in step S12), the mode shifts to the special observation mode (step S13).
  • Next, irradiation with the second illumination light is started and the image sensor 39 is batch-reset; the irradiation with the second illumination light is then stopped and the light is turned off.
  • During this extinguishing period, the G pixels are sequentially read out by the partial readout method as shown in FIG. 16, and the B pixels and the R pixels are read out by the pixel addition readout method. A second imaging signal is thereby obtained.
  • the first and second imaging signals are acquired every frame time in the special observation mode.
  • the image processing unit 44 generates a normal observation image based on the first imaging signal and displays it on the monitor 14 (step S14).
  • Based on the first and second imaging signals, an oxygen saturation image is generated and displayed on the monitor 14 (step S15). For example, the normal observation image and the oxygen saturation image are displayed side by side on the screen of the monitor 14 at the same time.
  • the generation and display of the normal observation image and the oxygen saturation image are repeatedly performed until the operator operates the mode switching SW 22b again or performs an operation to end the diagnosis.
  • When the mode switching SW 22b is operated again (YES in step S16), the mode returns to the normal observation mode (step S10), and the same operation is executed.
  • When the operation for ending the diagnosis is performed without operating the mode switching SW 22b (YES in step S17), the operation of the endoscope system 10 is ended.
  • In the above embodiment, the first illumination light is light having the first blue laser light, and the second illumination light is light having the second blue laser light (the different-absorption-wavelength light).
  • Conversely, the first illumination light may be light having the second blue laser light (the different-absorption-wavelength light), and the second illumination light may be light having the first blue laser light.
  • In the above embodiment, signal readout is performed in the order of partial readout and then pixel addition readout during the extinguishing period after irradiation with each illumination light; conversely, signal readout may be performed in the order of pixel addition readout and then partial readout.
  • In the above embodiment, two pixels aligned in the column direction are added and read out at the time of pixel addition readout, but two pixels aligned in the row direction may be added and read out instead. Further, a total of four pixels, two in the column direction by two in the row direction, may be added and read out.
  • In the above embodiment, a capacitance addition method in which pixel signals are added by the capacitors in the column ADC is used, but a counter addition method in which the addition is performed by the counter in the column ADC, or an FD addition method in which the addition is performed in the floating diffusion unit, may be used instead.
  • In the above embodiment, the oxygen saturation is calculated based on the G1 and G2 pixel signals read without pixel addition and the B2 pixel signal obtained by pixel addition readout; alternatively, the oxygen saturation may be calculated after pixel addition (so-called digital addition) is performed on the G1 and G2 pixel signals so as to correspond to the B2 pixel signal.
  • In the above embodiment, the oxygen saturation image is generated by image-processing the normal observation image based on the oxygen saturation, but the oxygen saturation image may instead be generated by directly imaging the oxygen saturation information.
  • In the above embodiment, a batch reset is performed at the start of irradiation with each illumination light, but instead of this batch reset, the pixel rows may be sequentially reset by the sequential reset method within the extinguishing period before the start of irradiation with each illumination light.
  • In the above embodiment, the light source device 13 and the image sensor 39 are driven by the imaging method shown in FIG. 11 (hereinafter referred to as the first imaging method), but they may instead be driven by the conventional imaging method (hereinafter referred to as the second imaging method).
  • In the second imaging method, the first and second illumination lights are alternately irradiated with an extinguishing period interposed, and signal readout is performed by a thinning readout scheme without pixel addition in each extinguishing period.
  • All pixel rows are simultaneously reset by the batch reset method at the start of irradiation with each illumination light.
  • The pixel thinning is performed by reading out only the first pixel row group described above from the pixel array unit 50.
  • the frame rate of the second imaging method is the same as that of the first imaging method.
  • In the first imaging method, the S/N ratio is improved by pixel addition, but the resolution may be decreased by it. For this reason, the first imaging method and the second imaging method may be switched according to the brightness of the specimen, which relates to the S/N ratio.
  • the brightness of the specimen is detected after switching from the normal observation mode to the special observation mode.
  • the brightness of the specimen is calculated by the DSP 43 based on the imaging signal.
  • the brightness of the specimen is obtained by calculating an average luminance value from the imaging signal for one frame. That is, the DSP 43 corresponds to a brightness detection unit.
  • This brightness detection may use either an imaging signal based on the first imaging method or an imaging signal based on the second imaging method.
  • the brightness may be detected in a specific range in an image for one frame, and this specific range may be designated.
  • The second imaging method is selected when the brightness is equal to or greater than a certain value, and the first imaging method is selected when the brightness is less than that value.
  • the brightness of the specimen may be calculated in the normal observation mode, and when switching to the special observation mode, the imaging method may be selected based on the brightness calculated in the normal observation mode.
  • Alternatively, since a gain of a certain level or more is required when the S/N ratio of the imaging signal is low, whether the DSP 43 or the like requires such a gain may be determined; the first imaging method may be selected when a gain of a certain level or more is required, and the second imaging method may be selected when it is not.
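The brightness-based selection between the two imaging methods reduces to a threshold on the average luminance of one frame, per the DSP-as-brightness-detector scheme described above. The method names and the threshold value below are placeholders.

```python
import numpy as np

FIRST_IMAGING_METHOD = "pixel_addition"      # better S/N when the specimen is dark
SECOND_IMAGING_METHOD = "thinning_readout"   # full resolution when it is bright

def select_imaging_method(frame, threshold=64.0):
    """Choose the imaging method from the mean luminance of one frame
    (the DSP 43 acting as the brightness detection unit)."""
    brightness = float(np.mean(frame))        # average luminance of the frame
    return SECOND_IMAGING_METHOD if brightness >= threshold else FIRST_IMAGING_METHOD

bright = select_imaging_method(np.full((4, 4), 200.0))  # bright specimen
dark = select_imaging_method(np.full((4, 4), 10.0))     # dark specimen
```

Restricting the mean to a designated sub-region of the frame, as the text allows, would only change the slice passed in, not the decision logic.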
  • the reset may be sequentially performed within the light extinction period without performing a batch reset at the start of irradiation of each illumination light.
  • a primary color filter array is used, but a complementary color filter array may be used instead.
  • In the above embodiment, the first and second illumination lights are generated by irradiating the phosphor 36 with the first and second laser lights emitted from the first and second LDs 30a and 30b; instead, the first and second illumination lights may be generated by a white light source such as a xenon lamp and a wavelength separation filter, or by an LED (Light-Emitting Diode) and a wavelength separation filter.
  • In the above embodiment, an oxygen saturation image is generated as the special observation image, but an enhanced observation image may instead be generated using narrow-band light (for example, violet narrow-band light having a center wavelength of 405 nm).
  • In the above embodiment, the light source device and the processor device are configured separately, but they may be configured as a single integrated device.
  • the present invention is applicable to a capsule endoscope that captures an image while passing through the digestive tract and transfers the captured image to a recording device.
  • The capsule endoscope 80 includes an illumination unit 81, a lens 82, an image sensor 83, a signal processing unit 84, a memory 85, a transmission unit 86, a control unit 87, a power supply 88, and a capsule housing 89 that accommodates these components.
  • the illumination unit 81 includes an LED and a wavelength selection filter, and irradiates the specimen with the first and second illumination lights described above.
  • the image sensor 83 is of the CMOS type, images reflected light from the specimen illuminated with the first and second illumination lights via the lens 82, and outputs the first and second imaging signals described above.
  • the signal processing unit 84 performs signal processing performed by the DSP 43 and the image processing unit 44 of the above-described embodiment on the first and second imaging signals, and generates a normal observation image and an oxygen saturation image.
  • the memory 85 stores each image.
  • the transmission unit 86 wirelessly transmits each image stored in the memory 85 to an external recording device (not shown).
  • the control unit 87 controls each unit.
  • The first and second imaging signals may be transmitted from the transmission unit 86 to an external device (not shown), and the normal observation image and the oxygen saturation image may be generated by the external device.
  • The present invention is also applicable to an endoscope system using a fiberscope that guides the reflected light of the illumination light from the observation site with an image guide, and to an ultrasonic endoscope in which an image sensor and an ultrasonic transducer are built into the tip portion.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Signal Processing (AREA)
  • Endoscopes (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The present invention relates to an endoscope system, a processor device for an endoscope system, and a method of operating an endoscope system with which the S/N ratio and reliability of an oxygen saturation image can be improved. First illumination light and second illumination light having different spectral characteristics are alternately projected onto a subject with an extinguishing period interposed between them. During each extinguishing period, signal readout of the G pixels from an image sensor (39) is performed without pixel addition, and signal readout of the B pixels and R pixels is performed with pixel addition. By comparing, on the basis of the G pixel signals, first imaging signals read from the image sensor during the extinguishing periods after irradiation with the first illumination light and second imaging signals read from the image sensor during the extinguishing periods after irradiation with the second illumination light, a positional deviation is calculated between the images based on the first imaging signal and the second imaging signal. On the basis of the positional deviation, image alignment between the first imaging signal and the second imaging signal is performed, and an oxygen saturation image is generated on the basis of the aligned first imaging signal and second imaging signal.
PCT/JP2015/058897 2014-03-28 2015-03-24 Système d'endoscope, dispositif de processeur de système d'endoscope, et méthode d'utilisation d'un système d'endoscope WO2015146972A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016510379A JP6150364B2 (ja) 2014-03-28 2015-03-24 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-069805 2014-03-28
JP2014069805 2014-03-28

Publications (1)

Publication Number Publication Date
WO2015146972A1 true WO2015146972A1 (fr) 2015-10-01

Family

ID=54195484

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/058897 WO2015146972A1 (fr) 2014-03-28 2015-03-24 Système d'endoscope, dispositif de processeur de système d'endoscope, et méthode d'utilisation d'un système d'endoscope

Country Status (2)

Country Link
JP (2) JP6150364B2 (fr)
WO (1) WO2015146972A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109998456A (zh) * 2019-04-12 2019-07-12 安翰科技(武汉)股份有限公司 胶囊型内窥镜及其控制方法
CN112702942A (zh) * 2018-09-25 2021-04-23 奥林巴斯株式会社 控制装置

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114585292A (zh) * 2019-11-27 2022-06-03 富士胶片株式会社 内窥镜系统及其工作方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011250926A (ja) * 2010-06-01 2011-12-15 Fujifilm Corp 電子内視鏡システム
WO2012176561A1 (fr) * 2011-06-21 2012-12-27 オリンパスメディカルシステムズ株式会社 Dispositif médical
JP2013165776A (ja) * 2012-02-14 2013-08-29 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置、及び画像生成方法
JP2013188244A (ja) * 2012-03-12 2013-09-26 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡動画表示方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5435916B2 (ja) * 2008-09-18 2014-03-05 富士フイルム株式会社 電子内視鏡システム
JP5502812B2 (ja) * 2011-07-14 2014-05-28 富士フイルム株式会社 生体情報取得システムおよび生体情報取得システムの作動方法
JP5623469B2 (ja) * 2012-07-06 2014-11-12 富士フイルム株式会社 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡用制御プログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011250926A (ja) * 2010-06-01 2011-12-15 Fujifilm Corp 電子内視鏡システム
WO2012176561A1 (fr) * 2011-06-21 2012-12-27 オリンパスメディカルシステムズ株式会社 Dispositif médical
JP2013165776A (ja) * 2012-02-14 2013-08-29 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置、及び画像生成方法
JP2013188244A (ja) * 2012-03-12 2013-09-26 Fujifilm Corp 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡動画表示方法

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112702942A (zh) * 2018-09-25 2021-04-23 奥林巴斯株式会社 控制装置
CN112702942B (zh) * 2018-09-25 2024-03-26 奥林巴斯株式会社 控制装置和功能限制方法
CN109998456A (zh) * 2019-04-12 2019-07-12 安翰科技(武汉)股份有限公司 胶囊型内窥镜及其控制方法

Also Published As

Publication number Publication date
JP6444450B2 (ja) 2018-12-26
JPWO2015146972A1 (ja) 2017-04-13
JP2017170168A (ja) 2017-09-28
JP6150364B2 (ja) 2017-06-21

Similar Documents

Publication Publication Date Title
JP6224819B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法
JP6151850B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法
JP5371946B2 (ja) 内視鏡診断装置
JP5623469B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡用制御プログラム
US9629527B2 (en) Endoscope system, processor device of endoscope system, and image processing method
JP5623470B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡用制御プログラム
US10561350B2 (en) Endoscope system, processor device of endoscope system, and method of operating endoscope system
EP2301417A1 (fr) Endoscope électronique
JP6444450B2 (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法
JP2011250925A (ja) 電子内視鏡システム
JP6560968B2 (ja) 内視鏡システム及びその作動方法
JP7454417B2 (ja) 医療用制御装置及び医療用観察システム
JP5734060B2 (ja) 内視鏡システム及びその駆動方法
JP6277068B2 (ja) 内視鏡用光源装置及び内視鏡システム
JP2010184047A (ja) 内視鏡、内視鏡駆動方法、並びに内視鏡システム
JP2010184046A (ja) 内視鏡、内視鏡駆動方法、並びに内視鏡システム
JP2016067373A (ja) 内視鏡用光源装置及び内視鏡システム
JP6227077B2 (ja) 内視鏡システム及びその作動方法
JP6005794B2 (ja) 内視鏡システム及びその駆動方法
JP6572065B2 (ja) 内視鏡用光源装置
JP2019042275A (ja) 内視鏡システム、内視鏡システムのプロセッサ装置、及び内視鏡システムの作動方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15769587

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016510379

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase
122 Ep: pct application non-entry in european phase

Ref document number: 15769587

Country of ref document: EP

Kind code of ref document: A1