
WO2024241985A1 - Urination dynamics identification device and imaging device - Google Patents


Info

Publication number
WO2024241985A1
Authority
WO
WIPO (PCT)
Prior art keywords
urination
light
image
unit
dynamics
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/017895
Other languages
French (fr)
Japanese (ja)
Inventor
泰行 内藤
理 浮村
滋 村田
淳 安食
俊也 油谷
智博 末次
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyoto Institute of Technology NUC
Kyoto Prefectural PUC
Original Assignee
Kyoto Institute of Technology NUC
Kyoto Prefectural PUC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyoto Institute of Technology NUC, Kyoto Prefectural PUC filed Critical Kyoto Institute of Technology NUC
Publication of WO2024241985A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/20: Measuring for diagnostic purposes; Identification of persons for measuring urological functions restricted to the evaluation of the urinary system

Definitions

  • The present invention relates to a urination dynamics identification device and an imaging device.
  • The present invention aims to provide a urination dynamics identification device and an imaging device that can identify the dynamics of urination.
  • The urination dynamics identification device of the present invention is a device for identifying the dynamics over time of urine released from the human body, and includes an identification unit that identifies, as the dynamics of the urination over time, the split state of the urination on a plane that the released urine intersects when irradiated with planar light, based on an image of that plane, and an output unit that outputs information related to the split state identified by the identification unit.
  • The urination dynamics identification device of the present invention preferably comprises a light irradiation unit that irradiates planar light that can intersect the released urine, and an imaging unit that images the plane of the light over time from a position facing it; the identification unit preferably includes an extraction unit that extracts, as the split state of the urination, at least one of the number of divisions, cross-sectional area, flow rate, degree of separation, and cross-sectional shape of each divided urination based on the image of the plane.
  • Preferably, the light irradiation unit includes a first light irradiation unit that irradiates a planar first light having a first characteristic, and a second light irradiation unit that irradiates a planar second light having a second characteristic parallel to the plane of the first light at a predetermined interval, and the identification unit calculates the urination passing speed and cross-sectional area between the plane of the first light and the plane of the second light based on the split state of urination on each plane and the predetermined interval, thereby identifying the urination flow rate as the dynamics of the urination over time.
  • Preferably, the identification unit calculates the time it takes for urination to travel from the plane of the first light to the plane of the second light using a cross-correlation coefficient, based on the split state of urination on each plane, and thereby calculates the passing speed of the urination.
  • the light irradiation unit has a cylindrical lens that emits the incident laser light in a planar manner.
  • According to the present invention, it is possible to identify and evaluate the split state of urination as a dynamic state over the passage of time. As a result, it is possible to appropriately diagnose urinary disorders.
  • FIG. 1 is a diagram for explaining the configuration of the urination dynamics identification device.
  • FIG. 2 is a diagram for explaining the configuration of the horizontal section light illumination unit.
  • FIG. 3 is a diagram for explaining an example of the installation state when the horizontal section light illumination unit and the camera are installed on a toilet bowl.
  • FIG. 4 is a flowchart illustrating an example of the urination-related value calculation process for calculating urination-related values.
  • FIG. 5 shows an example of split urination and its labeling state.
  • FIG. 6 is a diagram showing an example of urination-related values stored in a predetermined area of a memory.
  • FIG. 7 is a diagram for explaining a specific example of step S04.
  • FIG. 8 is a diagram for explaining an example of a method for generating urination portion passing frequency distribution information.
  • FIG. 9 is a diagram showing an example of urination portion passing frequency distribution information when urination passes through a relatively concentrated portion.
  • FIGS. 10 to 12 are diagrams showing examples of urination portion passing frequency distribution information when a urine stream is bifurcated and excreted.
  • FIG. 13 is a diagram showing histograms based on the cross-sectional areas and numbers of urination individuals obtained from the urination motion images.
  • A further figure explains another example of a method for generating urination portion passing frequency distribution information.
  • The urination dynamics identification device of the present invention and a program executed in a terminal device capable of communicating with the device will be described below with reference to the drawings.
  • the present invention is not limited or restricted to the following examples.
  • FIG. 1 is a diagram for explaining the configuration of the urination dynamics specifying device 1.
  • the urination dynamics specifying device 1 of this embodiment includes a horizontal section light illumination unit 2 that irradiates planar horizontal section light that can intersect in a substantially horizontal direction onto urine that is released from the human body in a parabolic arc into a toilet bowl or the like, a camera 3 that captures an image of a predetermined area including the plane of the horizontal section light from a position facing the plane, and a urination-related value calculation unit 4 that calculates and outputs urination-related values related to urination dynamics including the state of urination fragmentation on the plane of the horizontal section light from the captured image.
  • FIG. 2 is a diagram for explaining the configuration of the horizontal section light illumination unit 2.
  • FIG. 2 shows the horizontal section light illumination unit 2 as viewed from above, with the left-right and up-down directions in FIG. 2 corresponding to the horizontal direction, and the depth direction corresponding to the vertical direction.
  • The horizontal section light illumination unit 2 is provided, on a plate-shaped base member 20, with a first laser unit 21 that emits a first laser beam in a predetermined direction, a second laser unit 22 that emits a second laser beam in a direction perpendicular to the predetermined direction, a reflecting prism 23 that reflects the second laser beam into the predetermined direction, and a cylindrical lens 24 that refracts (diffuses) the incident first and second laser beams and emits them so that they spread in a horizontal plane.
  • the laser beams are indicated by dotted lines.
  • the first laser light is, for example, green laser light with a wavelength of 490 to 550 nm
  • the second laser light is, for example, red laser light with a wavelength of 640 to 770 nm, but this is not limiting as long as the wavelengths are different.
  • Figure 3 is a diagram for explaining an example of the installation state when the horizontal section light illumination unit 2 and camera 3 are installed on the toilet bowl 6.
  • Figure 3(A) shows the installation state when the toilet bowl 6 is viewed horizontally from the left side
  • Figure 3(B) shows the installation state when the toilet bowl 6 is viewed from above.
  • the horizontal section light illumination unit 2 is installed at a predetermined position on the toilet 6 so that the first horizontal section P1 (first laser light) and the second horizontal section P2 (second laser light) can intersect in a substantially horizontal direction with the urine discharged from the human body into the toilet 6 (for example, the parabolic dashed-dotted line L in FIG. 3(A)).
  • the first laser light and the second laser light are incident on and emitted from the cylindrical lens 24 so as to be parallel to each other with a predetermined distance D in the vertical direction, so that the first horizontal section P1 and the second horizontal section P2 are parallel to each other with a predetermined distance D in the vertical direction, as shown by the dotted line in FIG. 3(A).
  • the camera 3 is installed at an upper position facing the horizontal cut plane so that the imaging range includes an area including the range where the first horizontal cut plane P1 and the second horizontal cut plane P2 may intersect with the urine discharged into the toilet bowl 6.
  • The camera 3 is installed so that its imaging direction is aligned with the normal to the first horizontal cut plane P1 and the second horizontal cut plane P2.
  • In FIG. 3(B), an example of the imaging range is shown by a two-dot chain line. This allows the camera 3 to capture the state of the urine (e.g., its position, size, and degree of division) when it passes through the first horizontal cut plane P1 and the second horizontal cut plane P2.
  • the camera 3 may have a function such as a color filter (or an RGB filter) and may be a color camera that can instantly obtain green and red color information separately.
  • the captured moving image captured by the camera 3 is an image that allows images corresponding to the green and red color information to be distinguished from each other.
  • the camera 3 is not limited to a color camera, and may be, for example, a spectroscopic camera.
  • the urination-related value calculation unit 4 is realized by a computer including a memory, an input unit, an output display unit, an arithmetic processing unit, and a communication unit. By executing a program stored in the memory, the urination-related value calculation unit 4 calculates and outputs urination-related values related to urination, such as the urination dynamics of urination passing through a horizontal cross section (e.g., the number of divisions of each divided urination, cross-sectional area, flow rate, degree of separation according to distance from a specified position, cross-sectional shape, etc.) and urination flow rate, based on the captured moving image from the camera 3.
  • the urination-related value calculation unit 4 has a color component extraction unit 41, a urination rate calculation unit 42, a urination cross-sectional area calculation unit 43, a urination flow rate calculation unit 44, a urination dynamics calculation unit 45, and a calculation result output display unit 46.
  • the color component extraction unit 41 has a function of extracting a first urination dynamic image passing through a first horizontal cut plane P1 and a second urination dynamic image passing through a second horizontal cut plane P2 based on the captured video images from the camera 3.
  • the urination rate calculation unit 42 has a function of calculating the urination rate by utilizing the fact that the first urination motion image and the second urination motion image are images shifted in the vertical direction by a predetermined distance D.
  • the urination cross-sectional area calculation unit 43 has a function of calculating the cross-sectional area of each divided urination and the total cross-sectional area of the urination.
  • the urination flow rate calculation unit 44 has a function of calculating the urination flow rate based on the urination rate and the total cross-sectional area of the urination.
  • the urination dynamics calculation unit 45 has a function of calculating the urination dynamics of urination passing through a horizontal cut plane, for example, from the first urination dynamic image.
  • The calculation result output display unit 46 has a function of outputting and displaying the calculation results calculated by each calculation unit.
  • FIGS. 4 to 13 are diagrams for explaining in detail the process by which the urination-related value calculation unit 4 calculates urination-related values, including urination dynamics, from the moving images captured by the camera 3 (hereinafter also referred to as acquired moving images).
  • FIG. 4 is a flowchart for explaining an example of the urination-related value calculation process for calculating the urination-related values.
  • the urination-related value calculation process is included in the process executed by the program stored in the memory.
  • In step S01, a first urination video image corresponding to the first laser light (for example, green laser light) and a second urination video image corresponding to the second laser light (for example, red laser light) are extracted from the acquired moving image by the color component extraction unit 41.
  • The camera 3 captures images at, for example, 800 fps, but the frame rate is not limited to this.
  • In step S02, image analysis is performed on the first urination video image: for each image constituting it, the individual urination pieces (lumps), i.e., the portions where the green laser light passed through (was transmitted by) the urination and that have been split into multiple pieces, are extracted, and each individual piece is labeled.
  • Figure 5 shows an example of split urination (an example in which urination is split into five chunks).
  • The image with serial number 250 is shown in FIG. 5(A); the parts where the green laser light passed through (was transmitted by) the urination are shown in white.
  • step S02 for each image constituting the first urination video image, the parts where the green laser light passed through (transmitted) the urination are extracted as individual pieces of split urination based on brightness, etc., and each individual piece is labeled as shown in Figure 5 (B) (labeled 1 to 5 in Figure 5 (B)).
  • Figure 5 shows an example in which the individual pieces of urination are approximately circular.
  • In step S03, the cross-sectional area (e.g., the area based on the number of pixels in the portion determined to be a mass) is calculated for each individual labeled in step S02, and the total cross-sectional area of urination in each image constituting the urination video image is calculated and stored in a specified area of memory.
  • Figure 6 shows an example of urination-related values stored in a specified area of memory
  • Figure 6(A) shows an example of storing the cross-sectional area of each labeled urinating individual and the total cross-sectional area for each image (per serial number) constituting the urination video image.
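The labeling and area-measurement steps (S02 and S03) can be sketched as follows. This is a minimal illustration, not the patent's exact method: it assumes each frame has already been binarized by a brightness threshold, and uses a 4-connected flood fill for labeling and a simple pixel count for the cross-sectional area.

```python
from collections import deque

def label_lumps(binary):
    """Label 4-connected lumps of nonzero pixels (step S02).

    binary: list of lists of 0/1 (brightness-thresholded frame).
    Returns (labels, count): labels has the same shape, with 0 for
    background and 1..count identifying each urination individual.
    """
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not labels[y][x]:
                count += 1
                labels[y][x] = count
                queue = deque([(y, x)])
                while queue:  # flood fill over 4-neighbours
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count

def lump_areas(labels, count):
    """Per-lump cross-sectional areas as pixel counts, plus the total (step S03)."""
    areas = [0] * count
    for row in labels:
        for v in row:
            if v:
                areas[v - 1] += 1
    return areas, sum(areas)
```

In practice a library routine such as `scipy.ndimage.label` would replace the hand-written flood fill; the pixel counts would then be scaled to physical units using the camera calibration.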
  • Steps S02 and S03 are not limited to the first urination motion image; image analysis is also performed on the second urination motion image. Instead of or in addition to this, image analysis may be performed on an image obtained by combining (or averaging) the first and second urination motion images.
  • In step S04, the urination motion images are divided into sets spanning, for example, 0.05 seconds each (one second of images (800 images) is divided into 20 sets of 40 images), and the time shift between the first urination motion image and the second urination motion image for each set is calculated and stored using the cross-correlation coefficient.
  • In this way, the time required for a given urination to travel from the first horizontal cut plane P1 to the second horizontal cut plane P2 is calculated and stored for each set. This time varies as the urination speed changes over time, but an approximate value can be calculated for each set.
  • Figure 7 is a diagram for explaining a specific example of step S04.
  • Figure 7(A) shows an example of a graph in which the number of high brightness value pixels (equivalent to total cross-sectional area) corresponding to an individual urinating is plotted on the vertical axis for images of Serial Numbers 0 to 800 shown on the horizontal axis among the first urination motion image and the second urination motion image
  • Figure 7(B) shows an example of an enlarged graph for Serial Numbers 200 to 400. Note that in Figures 7(A) and 7(B), the number of high brightness value pixels of the first urination motion image is indicated by a solid line, and the number of high brightness value pixels of the second urination motion image is indicated by a dotted line.
  • Figure 7 (C) shows an example in which, for one set of serial number images (e.g., serial numbers 201 to 240), when the serial numbers are shifted by 8 as shown on the horizontal axis (serial number difference), the cross correlation coefficient is at its highest value, resulting in a high degree of match.
  • the time shift between the first urination motion image and the second urination motion image is calculated to be 0.01 seconds, which corresponds to eight serial numbers. In this way, the time shift is calculated for each set in step S04.
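The lag search of step S04 can be sketched as below, assuming the per-frame high-brightness pixel counts for the two cut planes are available as equal-length sequences; the exhaustive search over candidate frame lags is an illustrative choice, not the patent's stated implementation.

```python
import numpy as np

def best_lag(sig1, sig2, max_lag=40):
    """Frame lag that maximizes the cross-correlation coefficient.

    sig1: per-frame high-brightness pixel count on the first cut plane P1
    sig2: same signal on the second cut plane P2 (delayed copy of sig1)
    Returns the lag (in frames) with the highest Pearson correlation.
    """
    s1 = np.asarray(sig1, dtype=float)
    s2 = np.asarray(sig2, dtype=float)
    best_lag_frames, best_r = 0, -np.inf
    for lag in range(max_lag + 1):
        n = len(s1) - lag
        a, b = s1[:n], s2[lag:lag + n]   # align P1 with P2 shifted by `lag`
        if a.std() == 0 or b.std() == 0:
            continue                      # flat segment: correlation undefined
        r = np.corrcoef(a, b)[0, 1]
        if r > best_r:
            best_lag_frames, best_r = lag, r
    return best_lag_frames

# At 800 fps, the time shift in seconds would be best_lag(s1, s2) / 800.0,
# e.g. a lag of 8 frames corresponds to 0.01 s.
```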
  • In step S05, the vertical velocity of urination for each set is calculated and stored based on the time shift for each set calculated in step S04 and the predetermined distance D (e.g., 25 mm) between the first horizontal cut plane P1 and the second horizontal cut plane P2.
  • In step S06, for example, based on an image in the set and the image shifted by the time shift calculated in step S04, the horizontal displacement of each urination individual between the two images is calculated and stored together with the time shift. That is, in step S06, the displacement between images separated in time and space is evaluated to calculate the speed.
  • The displacement may be the average of the displacements of all urination individuals in each image, the average of the displacements of a predetermined number of individuals with the largest cross-sectional areas in each image, or the displacement of the individual with the largest cross-sectional area in each image.
  • the displacement (or speed) of all images in the set shifted by the time shift calculated in step S04 may be calculated and the average value of these displacements (or speeds) may be stored.
  • In step S07, the calculated urination speed is multiplied by the total cross-sectional area of urination and integrated over time to calculate and store the urination flow rate for each set.
  • The total cross-sectional area used to calculate the urination flow rate may be, as with the urination speed, the average of the total cross-sectional areas for each set; the average of the total cross-sectional area of the image with the smallest serial number in each set and those of the multiple images (e.g., 10 images) following it; the average of the total cross-sectional area of the image with the largest serial number in each set and those of the multiple images preceding it; or the average of the total cross-sectional area of a predetermined image in each set and those of the multiple images following it.
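Steps S05 and S07 reduce to a small calculation per set. The sketch below assumes the time shift, the plane separation D, and the per-frame total cross-sectional areas (already converted to physical units) are known; it simply integrates speed times area over the set's frames.

```python
def set_flow(time_shift_s, plane_distance_mm, areas_mm2, frame_dt_s):
    """Vertical speed (mm/s) and flow volume (mm^3) for one set of frames.

    time_shift_s: lag between the two cut planes (s), from step S04
    plane_distance_mm: predetermined distance D between the planes
    areas_mm2: total urination cross-sectional area for each frame
    frame_dt_s: time between consecutive frames (e.g. 1/800 at 800 fps)
    """
    speed = plane_distance_mm / time_shift_s                  # step S05
    volume = sum(speed * a * frame_dt_s for a in areas_mm2)   # step S07
    return speed, volume

# With D = 25 mm and a lag of 0.01 s, the vertical speed is 2500 mm/s.
```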
  • The values calculated in steps S04 to S07 are stored in correspondence with each set, as shown in FIG. 6(B).
  • The time shift may be stored as a time, or as the serial number difference (number of frames).
  • In step S08, all images constituting, for example, the first urination moving image among the acquired moving images are superimposed to generate and store urination area passing frequency distribution information.
  • The urination area passing frequency distribution information is generated, for example, by adding 1 to the image density value of each pixel identified as belonging to a urination individual, so that the more often urine passes through a pixel (i.e., the higher the urination passing frequency), the higher that pixel's image density value.
  • FIG. 8 is a diagram for explaining an example of a method for generating urination area passing frequency distribution information.
  • FIG. 8(a) shows an image of Serial Number.X among the images constituting the first urination motion image
  • FIG. 8(b) shows an image of Serial Number.X+1 among the images constituting the first urination motion image
  • FIG. 8(c) shows an image of Serial Number.X+2 among the images constituting the first urination motion image.
  • the image density value increases as the number of pixels superimposed as urination individuals increases on a pixel-by-pixel basis. In the drawing, the lower the image density value, the closer to black the color becomes, and the higher the image density value, the closer to white the color becomes.
  • urination area passing frequency distribution information is generated in which the image density value is higher for the parts corresponding to the urination individuals shown in FIG. 8(a) and FIG. 8(b) than for the parts corresponding to the urination individuals shown in FIG. 8(c).
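The superimposition of step S08 can be sketched with NumPy, assuming each frame has already been reduced to a boolean urination mask; each pass of a urination individual over a pixel raises that pixel's image density value by one.

```python
import numpy as np

def passing_frequency(masks):
    """Superimpose per-frame urination masks into a frequency image.

    masks: iterable of 2-D boolean arrays (True where a urination
    individual was detected in that frame).
    Returns an integer image; higher values mean the urine passed
    through that pixel more often.
    """
    freq = None
    for mask in masks:
        m = np.asarray(mask, dtype=np.int64)
        freq = m if freq is None else freq + m
    return freq
```

For display, the accumulated counts would be rescaled to the image's density range, so that frequently-passed pixels appear brighter, as in FIGS. 8 and 9.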
  • FIG. 9 shows an example of urination area passing frequency distribution information when urination passes through a relatively concentrated area. Since the urination individuals are concentrated in the center, as shown in FIG. 9, the image density value at the center is high and the image density values of the surrounding areas gradually decrease.
  • FIGS. 10 to 12 are diagrams showing an example of urination section passing frequency distribution information when, for example, a urine stream is bifurcated and urination is performed.
  • the urination shown at the top of the diagram is called the first urination section
  • the urination shown at the bottom of the diagram is called the second urination section.
  • FIG. 10 shows a case where the flow rate and degree of dispersion (dispersion degree) of the first urination section and the second urination section are approximately the same.
  • the degree of darkness and spread of the image density value are shown to be approximately the same.
  • FIG. 11 shows a case where the first urination section and the second urination section have the same degree of dispersion, but the second urination section has more overlap and a higher flow rate than the first.
  • In this case, the spread of the image density values is about the same, but the degree of darkness differs.
  • FIG. 12 shows a case where the first urination section and the second urination section have about the same flow rate, but the first urination section has a higher degree of dispersion than the second.
  • In this case, the spread of the image density values is wider in the first urination section than in the second, so the degree of darkness differs although the flow rate is about the same.
  • In step S09, the urination split state is calculated and stored based on the generated urination area passing frequency distribution information.
  • the splitting state includes, for example, the number of splits of urination, flow rate, flow rate ratio, degree of separation, degree of dispersion, and cross-sectional shape. Note that the splitting state is not limited to this, and other parameters may be calculated and stored instead of or in addition to this, and at least one of the number of splits of urination, flow rate, flow rate ratio, degree of separation, degree of dispersion, and cross-sectional shape may be calculated and stored.
  • The number of urination splits is the number of masses, where a mass is a region containing a certain number (e.g., 10) or more of pixels with image density values equal to or greater than a certain threshold (e.g., 200 or 100), based on the urination area passing frequency distribution information.
  • the flow rate is the flow rate for each lump, and is, for example, the total value (sum) of the image density values for each lump, based on the urination section passing frequency distribution information.
  • the sum of the image density values for one lump is calculated, and in the examples of Figures 10 to 12, the sum of the image density values of the first urination section and the sum of the image density values of the second urination section are calculated.
  • the flow rate ratio is the ratio of the flow rate for each mass to the total flow rate. In the example of Figure 9, there is one mass, so the flow rate ratio is 1, and in the examples of Figures 10 to 12, the flow rate ratio for the first urination section and the flow rate ratio for the second urination section are calculated.
  • The degree of separation is the degree to which the lumps are separated from one another (the degree of dispersion between distributions). Based on the urination area passing frequency distribution information, it is calculated by, for example, computing the center of gravity of all image density values (hereinafter also referred to as the overall center of gravity) and the center of gravity of the image density values of each lump (hereinafter also referred to as the center of gravity of each lump); the degree of separation is the total value (sum) of the distances from the overall center of gravity to the center of gravity of each lump.
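Under one plausible reading of this definition, with density-weighted centers of gravity, the degree of separation could be computed as follows; the frequency image `freq` and a per-lump `labels` array (0 for background, 1..n per lump) are assumed inputs.

```python
import numpy as np

def separation_degree(freq, labels, n_lumps):
    """Sum of distances from the overall centroid to each lump's centroid.

    freq: 2-D array of image density values (passing frequency)
    labels: 2-D array, 0 for background, 1..n_lumps identifying lumps
    """
    ys, xs = np.indices(freq.shape)
    total = float(freq.sum())
    # Density-weighted center of gravity of the whole distribution.
    overall = np.array([(ys * freq).sum() / total, (xs * freq).sum() / total])
    sep = 0.0
    for k in range(1, n_lumps + 1):
        w = np.where(labels == k, freq, 0)       # densities of lump k only
        wk = float(w.sum())
        centroid = np.array([(ys * w).sum() / wk, (xs * w).sum() / wk])
        sep += float(np.linalg.norm(centroid - overall))
    return sep
```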
  • the degree of dispersion is the degree of scattering of each lump.
  • For each lump, based on the urination area passing frequency distribution information, the pixel position with the maximum image density value (hereinafter also referred to as the maximum pixel position) is identified, the distribution of image density values on a line passing through the maximum pixel position (such as a horizontal line in images such as FIGS. 9 to 12) is calculated, and a pixel position whose image density value is half the maximum (hereinafter also referred to as the half pixel position) is identified; the degree of dispersion is the distance between the maximum pixel position and the half pixel position (hereinafter also referred to as the half width) for each lump.
  • A lump with a large half width can be evaluated as having a large degree of dispersion (scattered), and a lump with a small half width can be evaluated as having a small degree of dispersion (concentrated).
  • The cross-sectional shape includes the circularity, which is determined from the relationship between the perimeter and area of each lump, and the aspect ratio, which is the ratio of the width to the height.
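The dispersion (half width) and the circularity component of the cross-sectional shape can be sketched as follows. The circularity formula 4*pi*A/P^2, which equals 1 for a perfect circle and decreases for elongated shapes, is a common definition consistent with "the relationship between the perimeter and area"; the patent does not give an explicit formula, so it is an assumption here.

```python
import math

def half_width(profile):
    """Dispersion of one lump: distance from the peak of a 1-D density
    profile (a line through the maximum pixel position) to the nearest
    position where the density falls to half the maximum."""
    peak = max(range(len(profile)), key=lambda i: profile[i])
    half = profile[peak] / 2.0
    for d in range(1, len(profile)):
        for i in (peak - d, peak + d):   # search both sides, nearest first
            if 0 <= i < len(profile) and profile[i] <= half:
                return d
    return len(profile)

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a circle, smaller for elongated lumps."""
    return 4.0 * math.pi * area / (perimeter ** 2)
```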
  • In step S10, the urination-related values, including the calculated and stored urination flow rate and division state, are output, and the urination-related value calculation process is terminated.
  • In step S10, for example, based on the cross-sectional area information of FIG. 6(A), a histogram is generated and output, with the horizontal axis representing the size of the cross-sectional area of the urination individuals and the vertical axis representing the number of urination individuals.
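The histogram of step S10 can be produced directly from the per-lump cross-sectional areas stored in step S03; the bin edges below are arbitrary illustrative values.

```python
import numpy as np

# Cross-sectional areas (pixel counts) of every labeled urination
# individual across all frames, as stored in step S03 (sample data).
areas = [12, 15, 40, 42, 44, 90]

# counts[i] = number of individuals whose area falls in [bins[i], bins[i+1])
counts, bins = np.histogram(areas, bins=[0, 25, 50, 75, 100])
```

Plotting `counts` against the bin centers gives the histogram of FIG. 13, with area on the horizontal axis and the number of individuals on the vertical axis.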
  • FIG. 13(A) shows a histogram based on the cross-sectional areas and numbers of urination individuals in the first urination motion image, and FIG. 13(B) shows the corresponding histogram for the second urination motion image.
  • FIG. 13(C) is a diagram showing a comparison between FIG. 13(A) and FIG. 13(B), and shows that there is no significant difference between the calculation results of the first urination motion image and the calculation results of the second urination motion image.
  • urination dynamics information regarding the state of division shown in FIG. 6(C) and images based on urination portion passing frequency distribution information are also output.
  • acquired moving images may also be output. Note that output in step S10 includes display, printing, sending an e-mail to a specified destination, etc.
  • The urination-related values, including the urination flow rate and split state, calculated and output by the urination dynamics specifying device 1 are used for diagnosis by doctors and others. Based on these values, calculated from the subject's urination motion images, doctors can diagnose the risk of urinary disorders such as prostatic hyperplasia, overactive bladder, and urinary incontinence, the possibility of contracting such a disease, and the need for hospitalization. This allows the urination dynamics, which are clinically important, to be evaluated based on objective facts related to the morphology, distribution, and flow rate of the split urination state and their changes over time, enabling appropriate examination for urinary disorders.
  • the diagnosis results may be sent to the subject via a communication line, etc., or a printed diagnosis result may be sent to the subject's address.
  • the urination dynamics specifying device 1 in this embodiment is a urination dynamics specifying device 1 for specifying the dynamics of urination released from the human body over time, and based on the acquired moving image obtained by irradiating planar light onto urine released from a subject into a toilet or the like and capturing an image of the intersecting plane, the urination-related value calculation unit 4 specifies the urination split state on that plane as the dynamics of urination over time, and outputs information regarding the urination split state. This makes it possible to specify and evaluate the urination split state as the dynamics of urination over time. As a result, it is possible to perform an appropriate diagnosis for urinary disorders.
  • the urination dynamics specification device 1 also includes a horizontal section light illumination unit 2 that irradiates a planar light that may intersect with the released urine, and a camera 3 that captures the plane of the light from a position opposite the plane, and the urination-related value calculation unit 4 extracts at least one of the number of divisions, cross-sectional area, flow rate, degree of separation, and cross-sectional shape of each divided urination as the division state of the urination based on the moving image of the plane acquired by the camera 3. This makes it possible to more specifically specify and evaluate the values of items that provide a more detailed analysis of the division state of the urination.
  • the horizontal section light illumination unit 2 includes a first laser unit 21 that irradiates a planar first laser light having a first wavelength, and a second laser unit 22 that irradiates a planar second laser light having a second wavelength, and generates a first horizontal section P1 and a second horizontal section P2 that are parallel to each other at a predetermined interval D, and the urination-related value calculation unit 4 calculates the urination passing speed and the cross-sectional area between the first horizontal section P1 and the second horizontal section P2 based on the split state of urination on the first horizontal section P1 and the split state of urination on the second horizontal section P2 and the predetermined interval D, and specifies the urination flow rate as a dynamic state according to the time course of urination.
  • the horizontal section light illumination unit 2 also has a cylindrical lens that emits the incident laser light in a horizontal plane. This simplifies the structure of the horizontal section light illumination unit 2.
  • the urination-related value calculation unit 4 also calculates the time it takes for urination to reach the second horizontal cut plane P2 from the first horizontal cut plane P1 using a cross-correlation coefficient based on the split state of urination on the first horizontal cut plane P1 and the split state of urination on the second horizontal cut plane P2, and calculates the urination passage velocity. This allows the urinary flow velocity of urination to be determined efficiently and accurately.
  • the urination dynamics identification device 1 is installed in a facility such as a hospital, and the urination dynamics, including the split state, are specified based on a moving image of urination from a subject who comes to the hospital.
  • the present invention is not limited to this. Only the horizontal section light illumination unit 2 and the camera 3 (imaging device) of the urination dynamics identification device 1 shown in Fig. 1 and Fig. 3 may be installed in a first facility such as a hospital, and only the urination-related value calculation unit 4 shown in Fig. 1 may be installed in a second facility.
  • the first facility may capture a moving image of urination from the subject and transmit it to the second facility, and the second facility may specify the urination dynamics, including the split state, based on the received moving image and return the result of the specification to the facility that sent the moving image.
  • a doctor or the like at the first facility may make a diagnosis based on the result of the specification, or a doctor at the second facility may make a diagnosis based on the result of the specification, and a result including the diagnosis result may be returned to the facility that sent the moving image.
  • the horizontal section light illumination unit 2 includes a first laser unit 21 that irradiates a planar first laser light having a first wavelength, and a second laser unit 22 that irradiates a planar second laser light having a second wavelength.
  • the present invention is not limited to this, as long as it irradiates two types of light having different optical characteristics (e.g., wavelength, polarization, etc.).
  • the camera is not limited to a color camera, etc., as long as it can distinguish images according to the characteristics of the irradiated light.
  • a polarized camera may be used when irradiating two types of light having different polarizations.
  • the horizontal section light illumination unit may include one laser unit, and the urination-related value calculation unit 4 may perform the processes of steps S08 to S10 of FIG. 4.
  • the horizontal section light illumination unit may include three or more laser units, and the split state may be identified in more detail from the information on urination in three types of planes.
  • the laser light is expanded into a surface by the cylindrical lens 24 to illuminate the object, but the present invention is not limited to this as long as the illumination can form surface light.
  • slit light that forms planar light through a narrow gap, or LEDs arranged in a straight line along the plane to be formed, may be used to produce the planar light.
  • in step S08 of FIG. 4 in the above-mentioned embodiment, an example was shown in which all images constituting the acquired moving image are superimposed to generate and store the urination portion passing frequency distribution information, as in method A of FIG. 14, but this is not limiting.
  • for example, the images may be divided into sets of a predetermined number (e.g., 100) of frames, and urination portion passing frequency distribution information may be generated for each set while shifting the window by one frame, or for each divided set (i.e., based on completely separate sets of 100 frames). This allows more detailed urination dynamics to be identified, such as a change in the number of divisions during urination.
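The frequency-distribution idea above can be sketched as follows (a minimal illustration in Python with NumPy; the array shapes, the window size, and the toy binary masks are assumptions for demonstration, not values from the embodiment):

```python
import numpy as np

def passing_frequency(masks):
    """Method A: superimpose all binary frames; each pixel counts how many
    frames urine passed through it (the passing frequency distribution)."""
    return np.sum(masks, axis=0)

def windowed_frequency(masks, window=100):
    """Variant: distributions over non-overlapping sets of `window` frames,
    revealing changes in the split pattern during urination."""
    return [passing_frequency(masks[i:i + window])
            for i in range(0, len(masks), window)]

# Toy stack of 4 binarized frames (3x3 pixels each).
masks = np.zeros((4, 3, 3), dtype=int)
masks[:, 1, 1] = 1   # urine crosses the center pixel in every frame
masks[2:, 0, 0] = 1  # a second stream appears halfway through
freq = passing_frequency(masks)
print(int(freq[1, 1]), int(freq[0, 0]))  # → 4 2
```

The windowed variant would show the second stream only in the later set, which is how a change in the number of divisions during urination becomes visible.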
  • the camera's imaging direction is set in a direction that coincides with the perpendicular line of the first and second horizontal cut planes.
  • This keeps the apparent distance between the first and second horizontal cut planes constant in the captured image and prevents the camera from capturing distorted images, thereby enabling accurate calculation of urination-related values, including the split state, without performing complex calculations.
  • however, this is not limiting, and the camera's imaging direction need not be set to coincide with the perpendicular of the first and second horizontal cut planes.
  • the first horizontal cut plane and the second horizontal cut plane are assumed to be planes whose perpendiculars are vertical, that is, horizontal planes; however, this is not limiting, and the two planes need not be exactly horizontal as long as they are parallel to each other, and may be planes slightly tilted from the horizontal.
  • 1: Urination dynamics identification device, 2: Horizontal section light illumination unit, 3: Camera, 4: Urination-related value calculation unit


Abstract

Provided are a urination dynamics identification device capable of identifying urination dynamics, and an imaging device. In order to identify the dynamics of urination released from a human body over time, the device comprises: an identification section (urination-related value calculation section 4) that, on the basis of an image of a surface made to intersect the released urine by irradiating it with planar light, identifies the split state of urination on said surface as the dynamics of the urination over time; and an output section (calculation result output display section 46) that outputs information relating to the split state of urination identified by the identification section.

Description

Urination dynamics identification device and imaging device

The present invention relates to a urination dynamics identification device and an imaging device.

Conventionally, techniques for measuring the flow rate of urine released from the human body have been known, used to accurately diagnose diseases involving urinary disorders such as prostatic hyperplasia, overactive bladder, and urinary incontinence (e.g., Patent Documents 1 and 2).

Patent Document 1: JP 2015-105948 A
Patent Document 2: JP 2017-207317 A

Incidentally, there are types of urinary disorders that require the evaluation of various morphologically different urination dynamics, such as the urinary stream (urine flow) from the external urethral meatus branching in two or splashing in all directions. Evaluating morphologically different urination dynamics is therefore an important clinical point. However, conventional technology is unable to detect and evaluate such varied, morphologically different urination dynamics, so the only way to evaluate urination dynamics was through patient interviews.

 本発明は、排尿動態を特定可能とする排尿動態特定装置、および、撮像装置を提供することを目的とする。 The present invention aims to provide a urination behavior identification device and an imaging device that can identify urination behavior.

In order to achieve the above object, the urination dynamics identification device of the present invention is a urination dynamics identification device for identifying the dynamics of urine released from the human body over time, and includes an identification unit that identifies the split state of urination on a surface, as the dynamics of the urination over time, based on an image of the surface made to intersect the released urine by irradiating it with planar light, and an output unit that outputs information related to the split state of urination identified by the identification unit.

The urination dynamics identification device of the present invention preferably comprises a light irradiation unit that irradiates planar light capable of intersecting the released urine, and an imaging unit that images the surface from a position facing the surface of the light, the image of the surface being an image captured by the imaging unit over time, and the identification unit preferably includes an extraction unit that extracts, based on the image of the surface, at least one of the number of divisions, cross-sectional area, flow rate, degree of separation, and cross-sectional shape of each split urine stream as the split state of the urination.

In the urination dynamics identification device of the present invention, the light irradiation unit preferably includes a first light irradiation unit that irradiates planar first light having a first characteristic, and a second light irradiation unit that irradiates planar second light having a second characteristic in parallel with the plane of the first light at a predetermined interval, and the identification unit preferably calculates the urination passing speed and cross-sectional area between the first light plane and the second light plane based on the split state of urination on the first light plane, the split state of urination on the second light plane, and the predetermined interval, and identifies the urination flow rate as the dynamics of the urination over time.

In the urination dynamics identification device of the present invention, the identification unit preferably calculates the time it takes for the urine to travel from the first light plane to the second light plane using a cross-correlation coefficient, based on the split state of urination on the first light plane and the split state of urination on the second light plane, thereby calculating the passing speed of the urination.

In the urination dynamics identification device of the present invention, the light irradiation unit preferably has a cylindrical lens that emits the incident laser light in a planar form.

In order to achieve the above object, an imaging device that captures an identification image for identifying the dynamics of urine released from the human body over time comprises a light irradiation unit that irradiates planar light capable of intersecting the released urine, and an imaging unit that captures, as the identification image, the surface of the light from a position facing that surface.

According to the present invention, it is possible to identify and evaluate the split state of urination as the dynamics of the urination over time. As a result, urinary disorders can be diagnosed appropriately.

A diagram for explaining the configuration of the urination dynamics identification device.
A diagram for explaining the configuration of the horizontal section light illumination unit.
A diagram for explaining an example of the installation of the horizontal section light illumination unit and the camera on a toilet bowl.
A flowchart for explaining an example of the urination-related value calculation process for calculating urination-related values.
A diagram showing an example of split urination.
A diagram showing an example of split urination in a labeled state.
A diagram showing an example of urination-related values stored in a predetermined area of the memory.
A diagram for explaining a specific example of step S04.
A diagram for explaining an example of a method for generating urination portion passing frequency distribution information.
A diagram showing an example of urination portion passing frequency distribution information when urination is concentrated in a relatively small area.
A diagram showing an example of urination portion passing frequency distribution information when the urine stream bifurcates during urination.
A diagram showing an example of urination portion passing frequency distribution information when the urine stream bifurcates during urination.
A diagram showing an example of urination portion passing frequency distribution information when the urine stream bifurcates during urination.
A diagram showing a histogram based on the cross-sectional areas and the number of individual urine streams obtained from the urination motion image.
A diagram for explaining another example of a method for generating urination portion passing frequency distribution information.

Below, embodiments of the urination dynamics identification device of the present invention, and of a program executed in a terminal device capable of communicating with the urination dynamics identification device, are described with reference to the drawings. However, the present invention is not limited or restricted to the following examples.

FIG. 1 is a diagram for explaining the configuration of the urination dynamics identification device 1. As shown in FIG. 1, the urination dynamics identification device 1 of this embodiment includes: a horizontal section light illumination unit 2 that irradiates planar horizontal section light capable of intersecting, in a substantially horizontal direction, the urine released in a parabolic arc from the human body into a toilet bowl or the like; a camera 3 that captures a predetermined area including the plane of the horizontal section light from a position facing that plane; and a urination-related value calculation unit 4 that calculates and outputs, from the captured images, urination-related values concerning the urination dynamics, including the split state of urination on the plane of the horizontal section light.

FIG. 2 is a diagram for explaining the configuration of the horizontal section light illumination unit 2. FIG. 2 shows the horizontal section light illumination unit 2 as viewed from above; the left-right and up-down directions in FIG. 2 correspond to the horizontal direction, and the front-back (depth) direction corresponds to the vertical direction. Mounted on a plate-shaped base member 20 are: a first laser unit 21 that emits a first laser beam in a predetermined direction; a second laser unit 22 that emits a second laser beam in a direction orthogonal to the predetermined direction; a reflecting prism 23 that reflects the second laser beam into the predetermined direction; and a cylindrical lens 24 that refracts (spreads) each incident laser beam so that it fans out in a horizontal plane before being emitted. In FIG. 2, the laser beams are indicated by dotted lines.

In the example described here, the first laser beam is, for example, green laser light with a wavelength of 490 to 550 nm, and the second laser beam is, for example, red laser light with a wavelength of 640 to 770 nm; however, any combination of different wavelengths may be used.

The first laser unit 21 and the second laser unit 22 are installed with their vertical positions shifted by a predetermined interval so that the first laser beam and the second laser beam reflected by the reflecting prism 23 are parallel to each other with a predetermined vertical spacing (e.g., 25 mm) between them (see the dotted arrows from the horizontal section light illumination unit 2 in FIG. 3(A), described later). This allows the second laser beam reflected by the reflecting prism 23 to travel parallel to the first laser beam, passing a predetermined distance vertically below it. In FIG. 2, the first laser beam and the reflected second laser beam are drawn horizontally offset (dotted lines) for clarity, but in reality they travel along the same vertical plane and enter the cylindrical lens 24.

The laser light incident on the cylindrical lens 24 is spread so as to fan out in a horizontal plane, as shown in FIG. 2. This forms, in the emission direction of the cylindrical lens 24, a first horizontal cut plane in which the first laser beam spreads horizontally and a second horizontal cut plane in which the second laser beam spreads horizontally.

FIG. 3 is a diagram for explaining an example of the installation of the horizontal section light illumination unit 2 and the camera 3 on a toilet bowl 6. FIG. 3(A) shows the installation as viewed horizontally from the left side of the toilet bowl 6, and FIG. 3(B) shows the installation as viewed from above.

As shown by the dotted lines in FIG. 3, the horizontal section light illumination unit 2 is installed at a predetermined position on the toilet bowl 6 so that the first horizontal cut plane P1 (first laser beam) and the second horizontal cut plane P2 (second laser beam) can intersect, in a substantially horizontal direction, the urine released from the human body into the toilet bowl 6 (e.g., the parabolic dashed-dotted line L in FIG. 3(A)). Since the first and second laser beams enter and exit the cylindrical lens 24 parallel to each other with the predetermined vertical spacing D, the first horizontal cut plane P1 and the second horizontal cut plane P2 are parallel planes separated vertically by the predetermined spacing D, as shown by the dotted lines in FIG. 3(A).

The camera 3 is installed at an upper position facing the horizontal cut planes so that its imaging range includes the area in which the first horizontal cut plane P1 and the second horizontal cut plane P2 can intersect the urine discharged into the toilet bowl 6. In this embodiment, the camera 3 is installed so that its imaging direction coincides with the perpendicular of the first horizontal cut plane P1 and the second horizontal cut plane P2. In FIG. 3(B), an example of the imaging range is shown by a two-dot chain line. This allows the camera 3 to capture the state of the urine as it passes through the first horizontal cut plane P1 and the second horizontal cut plane P2 (e.g., its position, size, and degree of division). The camera 3 may be a color camera having, for example, a color filter (or RGB filter) function that can instantly acquire green and red color information separately; the moving image captured by the camera 3 then allows the images corresponding to the green and red color information to be distinguished from each other. The camera 3 is not limited to a color camera and may be, for example, a spectroscopic camera.

Returning to FIG. 1, the urination-related value calculation unit 4 is realized by a computer including a memory, an input unit, an output display unit, an arithmetic processing unit, a communication unit, and the like. By executing a program stored in the memory, the urination-related value calculation unit 4 calculates and outputs, based on the moving image captured by the camera 3, urination-related values such as the urination dynamics of the urine passing through the horizontal cut planes (e.g., the number of divisions of the split urine streams, their cross-sectional areas, flow rates, degree of separation according to distance from a predetermined position, and cross-sectional shapes) and the urination flow rate.

As shown in FIG. 1, the urination-related value calculation unit 4 has a color component extraction unit 41, a urination velocity calculation unit 42, a urination cross-sectional area calculation unit 43, a urination flow rate calculation unit 44, a urination dynamics calculation unit 45, and a calculation result output display unit 46. The color component extraction unit 41 has a function of extracting, from the moving image captured by the camera 3, a first urination motion image at the first horizontal cut plane P1 and a second urination motion image at the second horizontal cut plane P2.

The urination velocity calculation unit 42 has a function of calculating the urination velocity by utilizing the fact that the first urination motion image and the second urination motion image are images of positions offset by the predetermined spacing D in the vertical direction. The urination cross-sectional area calculation unit 43 has a function of calculating the cross-sectional area of each split urine stream, the total cross-sectional area of the urination, and so on. The urination flow rate calculation unit 44 has a function of calculating the urination flow rate based on the urination velocity and the total cross-sectional area of the urination.
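The flow-rate computation performed by these units (velocity from the plane spacing and transit time, multiplied by the total cross-sectional area) can be sketched as follows; the pixel-to-area scale and the numeric inputs are illustrative assumptions, not values from the device:

```python
def urination_flow_rate(area_px, px_to_mm2, d_mm, dt_s):
    """Flow rate Q = v * A: velocity v from the plane spacing divided by the
    transit time, area A from the measured cross-section. Returns mL/s."""
    v_mm_s = d_mm / dt_s          # transit speed between the two laser planes
    a_mm2 = area_px * px_to_mm2   # total cross-sectional area in mm^2
    q_mm3_s = v_mm_s * a_mm2      # mm^3/s
    return q_mm3_s / 1000.0       # 1 mL = 1000 mm^3

# Illustrative numbers: 400 bright pixels, 0.04 mm^2 per pixel (assumed),
# 25 mm plane spacing, 0.01 s transit time.
q = urination_flow_rate(area_px=400, px_to_mm2=0.04, d_mm=25.0, dt_s=0.01)
print(round(q, 6))  # → 40.0
```

With a 25 mm spacing crossed in 0.01 s the velocity is 2.5 m/s, so a 16 mm² total cross-section yields about 40 mL/s.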

The urination dynamics calculation unit 45 has a function of calculating, for example from the first urination motion image, the urination dynamics of the urine passing through the horizontal cut plane. The calculation result output display unit 46 has a function of outputting and displaying the results calculated by each calculation unit.

FIGS. 4 to 13 are diagrams for explaining in detail how urination-related values, including the urination dynamics, are calculated from the moving image captured by the camera 3 and acquired by the urination-related value calculation unit 4 (hereinafter also referred to as the acquired moving image). FIG. 4 is a flowchart for explaining an example of the urination-related value calculation process. The urination-related value calculation process is included in the processing executed by the program stored in the memory.

In step S01, a first urination motion image corresponding to the green laser light and a second urination motion image corresponding to the red laser light are extracted (separated) from the moving image acquired by capturing a subject's urination with the camera 3. The camera 3 is assumed to capture images at, for example, 800 fps, but this is not limiting.
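The channel separation of step S01 can be sketched as follows (assuming Python with NumPy, RGB channel order, and an illustrative brightness threshold; a real implementation would work on the camera's own color output):

```python
import numpy as np

def split_by_color(rgb_frames, threshold=100):
    """Separate the green-laser (plane P1) and red-laser (plane P2) pixels
    from a stack of RGB frames. Channel order R, G, B is assumed."""
    r = rgb_frames[..., 0].astype(int)
    g = rgb_frames[..., 1].astype(int)
    green_masks = (g >= threshold) & (g > r)  # green dominates → plane P1
    red_masks = (r >= threshold) & (r > g)    # red dominates → plane P2
    return green_masks, red_masks

# Toy stack of one 2x2 RGB frame with one green-lit and one red-lit pixel.
frames = np.zeros((1, 2, 2, 3), dtype=np.uint8)
frames[0, 0, 0] = (20, 200, 10)  # green-lit pixel
frames[0, 1, 1] = (210, 30, 5)   # red-lit pixel
gm, rm = split_by_color(frames)
print(int(gm.sum()), int(rm.sum()))  # → 1 1
```

The two boolean stacks then serve as the first and second urination motion images for the later steps.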

In step S02, for example, the first urination motion image is image-analyzed so that, for each frame constituting the image, the portions through which the green laser light passed (was transmitted) through the urine are extracted as the individual streams (lumps) into which the urination has split, and each individual stream is labeled.

FIG. 5 shows an example of split urination (urination split into five lumps). Suppose the first urination motion image is, for example, a 15-second moving image consisting of images with Serial Numbers 1 to 12000 (= 15 seconds × 800). As one example from this moving image, FIG. 5(A) shows the image with Serial Number 250 (the 250th image), in which the portions where the green laser light passed through the urine are shown in white. In step S02, for each image constituting the first urination motion image, the portions where the green laser light passed through the urine are extracted, based on brightness and the like, as individual split urine streams, and each individual stream is labeled as shown in FIG. 5(B) (labeled 1 to 5 in FIG. 5(B)). Actual urine streams have various shapes, but for convenience of explanation FIG. 5 shows them as roughly circular.
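The blob extraction and labeling of step S02 can be sketched as follows (a minimal Python/NumPy illustration using an iterative flood fill; the toy binary frame stands in for a brightness-thresholded camera image):

```python
import numpy as np

def label_blobs(mask):
    """Label 4-connected regions of True pixels in a binary image.
    Returns (label image, number of blobs); 0 marks the background."""
    labels = np.zeros(mask.shape, dtype=int)
    n = 0
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue  # pixel already belongs to an earlier blob
        n += 1
        stack = [seed]
        while stack:  # iterative flood fill from the seed pixel
            r, c = stack.pop()
            if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
                continue
            if not mask[r, c] or labels[r, c]:
                continue
            labels[r, c] = n
            stack += [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    return labels, n

# Toy binarized frame with two separated "urine" blobs.
frame = np.array([[0, 1, 1, 0, 0],
                  [0, 1, 0, 0, 1],
                  [0, 0, 0, 1, 1]], dtype=bool)
labels, n = label_blobs(frame)
print(n)  # → 2
```

Each label corresponds to one split urine stream crossing the laser plane in that frame.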

In step S03, the cross-sectional area of each individual labeled in step S02 (e.g., the area based on the number of pixels in the portion determined to be a lump) is calculated, and the total cross-sectional area of the urination in each image constituting the urination motion image is calculated and stored in a predetermined area of the memory. FIG. 6 shows an example of the urination-related values stored in the predetermined area of the memory; FIG. 6(A) shows an example in which, for each image (each Serial Number) constituting the urination motion image, the cross-sectional area of each labeled urine stream and the total cross-sectional area are stored. For example, as shown in FIG. 5(B), when five individuals are extracted and labeled 1 to 5, the cross-sectional area of the individual labeled 1 is stored under labeling No. 1, the cross-sectional area of the individual labeled 2 under labeling No. 2, and so on, and the total cross-sectional area of individuals 1 to 5 is also stored. Note that steps S02 and S03 are not limited to the first urination motion image; the second urination motion image is also image-analyzed. Alternatively, or in addition, an image obtained by combining (or averaging) the first and second urination motion images may be analyzed.
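The per-label area computation of step S03 can be sketched as follows (Python/NumPy; the small label image is illustrative):

```python
import numpy as np

def blob_areas(labels):
    """Per-label pixel counts (cross-sectional areas in pixels) and the
    total area. `labels`: integer image, 0 = background, 1..N = blobs."""
    counts = np.bincount(labels.ravel())
    per_blob = counts[1:]  # drop the background bin 0
    return per_blob, int(per_blob.sum())

# Label image as produced by the step-S02 labeling (two blobs of 3 px each).
labels = np.array([[0, 1, 1, 0],
                   [0, 1, 0, 2],
                   [0, 0, 2, 2]])
areas, total = blob_areas(labels)
print([int(a) for a in areas], total)  # → [3, 3] 6
```

Multiplying the pixel counts by a pixel-to-mm² calibration factor would give the physical cross-sectional areas stored in memory.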

 図4に戻り、ステップS04では、排尿動画像について、例えば0.05秒間隔の画像となるように20セットに分けて(1秒間の画像(800枚の画像)を40枚1セットに分けて)、相互相関係数を用いてセット毎の第1の排尿動画像の画像と、第2の排尿動画像の画像との時間のずれ量を算出して記憶する。つまり、ある排尿が第1の水平切断面P1に達してから第2の水平切断面P2に達するまでに要した時間をセット毎に算出して記憶する。このため、時間の経過(速度)に応じて第1の水平切断面P1に達してから第2の水平切断面P2に達するまでに要した時間が変動するが、セット毎に近似する時間を算出できる。 Returning to FIG. 4, in step S04, the urination video images are divided into 20 sets so that each set spans, for example, 0.05 seconds (one second of images (800 images) is divided into sets of 40 images), and the time shift between the images of the first urination video image and the images of the second urination video image is calculated and stored for each set using the cross-correlation coefficient. In other words, the time required from when a given portion of urination reaches the first horizontal cut plane P1 until it reaches the second horizontal cut plane P2 is calculated and stored for each set. Although this required time varies over the course of urination (with the stream speed), an approximate time can be calculated for each set.

 図7は、ステップS04の具体例を説明するための図である。図7(A)には、第1の排尿動画像および第2の排尿動画像のうち、横軸に示されるSerial Number.0~800の画像について、排尿の個体に対応する高輝度ピクセル数(High brightness value pixel、総断面積に相当)を縦軸に示したグラフの一例を示し、図7(B)には、Serial Number.200~400についての拡大グラフの一例を示している。なお、図7(A)および図7(B)では、第1の排尿動画像の画像の高輝度ピクセル数を実線で示し、第2の排尿動画像の画像の高輝度ピクセル数を点線で示している。また、図7(C)は、1セット分のSerial Numberの画像(例えば、Serial Number.201~240)について、横軸(Serial number difference)に示されるようにSerial Numberを8ずらした場合に、相互相関係数(Cross correlation coefficient)が最も高い値となっており一致度が高くなっている例が示されている。つまり、図7(C)に示すセットでは、第1の排尿動画像の画像と第2の排尿動画像の画像との時間のずれ量が、Serial Number8個分に相当する0.01秒であると算出されることとなる。ステップS04では、このように1セット毎に時間のずれ量を算出する。 Figure 7 is a diagram for explaining a specific example of step S04. Figure 7(A) shows an example of a graph in which the number of high brightness value pixels (equivalent to the total cross-sectional area) corresponding to urination individuals is plotted on the vertical axis for images of Serial Numbers 0 to 800 shown on the horizontal axis, for both the first urination video image and the second urination video image, and Figure 7(B) shows an example of an enlarged graph for Serial Numbers 200 to 400. Note that in Figures 7(A) and 7(B), the number of high brightness value pixels of the first urination video image is indicated by a solid line, and that of the second urination video image is indicated by a dotted line. Figure 7(C) shows an example in which, for one set of images (e.g., Serial Numbers 201 to 240), the cross correlation coefficient reaches its highest value, indicating the best match, when the Serial Numbers are shifted by 8 as shown on the horizontal axis (Serial number difference). In other words, for the set shown in Figure 7(C), the time shift between the images of the first urination video image and the images of the second urination video image is calculated to be 0.01 seconds, which corresponds to eight Serial Numbers. In step S04, the time shift is calculated in this way for each set.
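The per-set lag search of step S04 can be illustrated with a normalized cross-correlation over the two per-frame high-brightness-pixel-count series. This is a minimal Python sketch under assumptions of my own (an exhaustive lag scan and a Pearson-style normalization); the document does not specify the exact correlation formula used.

```python
def best_lag(upper, lower, max_lag):
    """Scan frame lags 0..max_lag and return the one that maximizes the
    normalized cross-correlation between two per-frame pixel-count
    series (the per-set matching of step S04). `upper` comes from the
    first (upper) cut plane, `lower` from the second; the best lag is
    how many frames `lower` trails `upper`."""
    def ncc(a, b):
        n = min(len(a), len(b))
        if n == 0:
            return -1.0
        a, b = a[:n], b[:n]
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    scores = {lag: ncc(upper[:len(upper) - lag], lower[lag:])
              for lag in range(max_lag + 1)}
    return max(scores, key=scores.get)
```

At the document's 800 frames per second, a best lag of 8 frames corresponds to the 0.01-second shift of the Figure 7(C) example.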

 図4に戻り、ステップS05では、ステップS04で算出したセット毎の時間のずれ量と、第1の水平切断面P1と第2の水平切断面P2との所定間隔D(例えば、25mm)とから、セット毎の排尿の鉛直方向における速度を算出して記憶する。 Returning to FIG. 4, in step S05, the vertical velocity of urination for each set is calculated and stored based on the time shift for each set calculated in step S04 and a predetermined distance D (e.g., 25 mm) between the first horizontal cut surface P1 and the second horizontal cut surface P2.
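Step S05 is then a direct conversion of the per-set frame lag into a vertical speed. A sketch assuming the document's 800 fps frame rate and 25 mm plane spacing (the parameter names are illustrative):

```python
def vertical_speed(lag_frames, frame_rate_hz=800.0, plane_gap_mm=25.0):
    """Step S05: convert a per-set frame lag into the vertical stream
    speed between the two cut planes, assuming the document's 800 fps
    frame rate and 25 mm plane separation."""
    dt = lag_frames / frame_rate_hz           # seconds to cross the gap
    return (plane_gap_mm / 1000.0) / dt       # metres per second
```

With the 8-frame lag of the Figure 7(C) example this gives 25 mm / 0.01 s = 2.5 m/s.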

 ステップS06では、例えば、セット内の画像のうちある画像と、ステップS04で算出した時間のずれ量だけずれた画像とに基づき、ある画像に含まれる排尿の個体の位置から、時間のずれ量だけずれた画像に含まれる排尿の個体の位置までの水平方向における変位量と、時間のずれ量とから、セット毎の排尿の水平方向における速度を算出して記憶する。つまり、ステップS06では、時間的にも空間的にもずれた画像間で変位量を評価して速度を算出する。変位量としては、各画像における排尿の個体のすべての変位量を平均した値を採用してもよく、各画像における排尿の個体のうち断面積が大きな所定数の個体の変位量を平均した値であってもよく、各画像における排尿の個体のうち断面積が最も大きな個体の変位量を採用してもよい。また、ステップS06では、セット内の画像のうち、ステップS04で算出した時間のずれ量だけずれた画像が当該セット内の画像に含まれるすべての画像各々について変位量(あるいは速度)を算出して、これらの変位量(あるいは速度)を平均した値を記憶等するものであってもよい。 In step S06, for example, based on one image in a set and the image shifted from it by the time shift calculated in step S04, the horizontal speed of urination for each set is calculated and stored from the horizontal displacement from the position of a urination individual in the one image to the position of that individual in the shifted image, together with the time shift. That is, in step S06, the speed is calculated by evaluating the displacement between images shifted both temporally and spatially. The displacement may be the average of all the displacements of the urination individuals in each image, the average of the displacements of a predetermined number of individuals with large cross-sectional areas among the urination individuals in each image, or the displacement of the individual with the largest cross-sectional area among the urination individuals in each image. In addition, in step S06, the displacement (or speed) may be calculated for each of all images in the set whose counterpart image, shifted by the time shift calculated in step S04, is also included in the set, and the average of these displacements (or speeds) may be stored, for example.
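The horizontal-speed evaluation of step S06 compares an individual's position between a frame and its time-shifted counterpart. A sketch under an assumed pixel calibration (`mm_per_px` is hypothetical; the document does not state one):

```python
def horizontal_speed(centroid_a, centroid_b, mm_per_px, dt_s):
    """Step S06: horizontal speed of a urination individual, from the
    displacement of its centroid between one frame and the frame dt_s
    later. mm_per_px is an assumed calibration of the cut-plane image."""
    dx_mm = (centroid_b[0] - centroid_a[0]) * mm_per_px
    dy_mm = (centroid_b[1] - centroid_a[1]) * mm_per_px
    displacement_m = (dx_mm ** 2 + dy_mm ** 2) ** 0.5 / 1000.0
    return displacement_m / dt_s              # metres per second
```

As the text notes, in practice this would be averaged over all individuals, over the largest few, or taken from the single largest individual.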

 ステップS07では、算出された排尿の速度と、排尿の総断面積とを掛け合わせて、時間積分することにより、1セット毎の排尿の流量を算出して記憶する。排尿の流量の算出に用いる排尿の総断面積は、排尿の速度と同様にセット毎の総断面積を平均した値を採用してもよく、各セットのうち最もSerial Numberが小さい画像の総断面積および当該Serial Numberから連続する複数枚(例えば、10枚など)の画像各々の総断面積の平均値を採用してもよく、各セットのうち最もSerial Numberが大きい画像の総断面積および当該Serial Numberより前の複数枚の画像各々の総断面積の平均値を採用してもよく、各セットのうち所定番目の画像の総断面積と当該画像と連続する複数枚の画像各々の総断面積の平均値を採用してもよい。 In step S07, the calculated urination speed is multiplied by the total cross-sectional area of urination and integrated over time to calculate and store the urination flow rate for each set. The total cross-sectional area of urination used to calculate the urination flow rate may be the average value of the total cross-sectional areas for each set, as with the urination speed, or the average value of the total cross-sectional area of the image with the smallest serial number in each set and the total cross-sectional area of each of the multiple images (e.g., 10 images) following that serial number, or the average value of the total cross-sectional area of the image with the largest serial number in each set and the total cross-sectional area of each of the multiple images preceding that serial number, or the average value of the total cross-sectional area of a predetermined image in each set and the total cross-sectional area of each of the multiple images following that image.
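The flow-rate computation of step S07 (speed multiplied by cross-sectional area, integrated over time) can be sketched as follows. The mm²-per-pixel calibration and the unit choices are assumptions, since the document leaves them unspecified.

```python
def set_flow_rate_ml(speed_m_s, mean_area_px, mm2_per_px, set_duration_s):
    """Step S07 for one set: speed x cross-sectional area, integrated
    over the set duration. mm2_per_px (area of one pixel on the cut
    plane) is an assumed calibration."""
    area_m2 = mean_area_px * mm2_per_px * 1e-6    # mm^2 -> m^2
    volume_m3 = speed_m_s * area_m2 * set_duration_s
    return volume_m3 * 1e6                        # 1 m^3 = 1e6 ml

def total_volume_ml(per_set_ml):
    """Accumulate the per-set volumes over the whole voiding."""
    return sum(per_set_ml)
```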

 ステップS04~S07により算出された値は、図6(B)に示すように1セット毎に対応させて記憶される。なお、時間のずれ量としては、時間を記憶してもよく、Serial Numberのずれ量(枚数)を記憶してもよい。 The values calculated in steps S04 to S07 are stored in correspondence with each set, as shown in FIG. 6(B). Note that the time deviation may be stored as the time, or the deviation (number of sheets) of the serial number.

 ステップS08では、取得動画像のうち例えば第1の排尿動画像を構成するすべての画像を重ね合わせて排尿部通過頻度分布情報を生成して記憶する。排尿部通過頻度分布情報とは、例えば、排尿の個体として特定されたピクセルに対応する画像濃度値を1加算することにより、排尿の個体として重ね合わされた数が多いピクセル程(すなわち、排尿が通過した数が多く排尿通過頻度の高いピクセル程)、画像濃度値が高くなるように生成される情報である。 In step S08, all images constituting, for example, the first urination moving image from among the acquired moving images are superimposed to generate and store urination area passing frequency distribution information. The urination area passing frequency distribution information is information that is generated, for example, by adding 1 to the image density value corresponding to pixels identified as urination individuals, so that the more pixels that are superimposed as urination individuals (i.e., the more pixels that have passed through with urine and the higher the urination passing frequency), the higher the image density value.

 図8は、排尿部通過頻度分布情報の生成方法の一例を説明するための図である。図8(a)は、第1の排尿動画像を構成する画像のうちのSerial Number.Xの画像を示し、図8(b)は、第1の排尿動画像を構成する画像のうちのSerial Number.X+1の画像を示し、図8(c)は、第1の排尿動画像を構成する画像のうちのSerial Number.X+2の画像を示している。このような画像を重ね合わせた場合、ピクセル単位で、排尿の個体として重ね合わされた数が多いピクセル程、画像濃度値が高くなる。図面では、画像濃度値が低い程、黒に近くなり、画像濃度値が高い程、白に近い色合いとなるように図示する。このため、図8(d)に示すように、図8(a)および図8(b)で示した排尿の個体に対応する部分の方が、図8(c)で示した排尿の個体に対応する部分よりも画像濃度値が高い排尿部通過頻度分布情報が生成されることとなる。 FIG. 8 is a diagram for explaining an example of a method for generating urination area passing frequency distribution information. FIG. 8(a) shows an image of Serial Number.X among the images constituting the first urination motion image, FIG. 8(b) shows an image of Serial Number.X+1 among the images constituting the first urination motion image, and FIG. 8(c) shows an image of Serial Number.X+2 among the images constituting the first urination motion image. When such images are superimposed, the image density value increases as the number of pixels superimposed as urination individuals increases on a pixel-by-pixel basis. In the drawing, the lower the image density value, the closer to black the color becomes, and the higher the image density value, the closer to white the color becomes. Therefore, as shown in FIG. 8(d), urination area passing frequency distribution information is generated in which the image density value is higher for the parts corresponding to the urination individuals shown in FIG. 8(a) and FIG. 8(b) than for the parts corresponding to the urination individuals shown in FIG. 8(c).
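The superposition illustrated in Figure 8 amounts to summing binary per-frame masks pixel by pixel. A minimal sketch, with pure-Python lists standing in for image buffers:

```python
def passage_frequency_map(binary_frames):
    """Step S08: superimpose per-frame binary urine masks pixel by
    pixel. Each cell of the result counts how many frames classified
    that pixel as part of the stream (the 'image density value')."""
    h, w = len(binary_frames[0]), len(binary_frames[0][0])
    acc = [[0] * w for _ in range(h)]
    for frame in binary_frames:
        for y in range(h):
            for x in range(w):
                acc[y][x] += frame[y][x]
    return acc
```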

 図9は、比較的まとまった箇所に排尿されたときの排尿部通過頻度分布情報の一例を示す図である。排尿の個体が中央部分にまとまっているため、図9に示すように、中央部分の画像濃度値が高くなり、その周辺部分の画像濃度値が徐々に低くなる。 Figure 9 shows an example of urination area passing frequency distribution information when urination is concentrated in a relatively concentrated area. Since the individual urinations are concentrated in the center, as shown in Figure 9, the image density value of the center is high and the image density value of the surrounding areas gradually decreases.

 図10~図12は、例えば尿線が二又に分かれて排尿されたときの排尿部通過頻度分布情報の一例を示す図である。図10~図12では、図中の上方に示されている排尿を第1の排尿部といい、図中の下方に示されている排尿を第2の排尿部という。図10は、第1の排尿部と第2の排尿部との流量および散らばり度合い(分散度)が同程度である場合を示している。画像濃度値の濃さの程度や広がり具合が同程度に示されている。 FIGS. 10 to 12 are diagrams showing an example of urination section passing frequency distribution information when, for example, a urine stream is bifurcated and urination is performed. In FIG. 10 to FIG. 12, the urination shown at the top of the diagram is called the first urination section, and the urination shown at the bottom of the diagram is called the second urination section. FIG. 10 shows a case where the flow rate and degree of dispersion (dispersion degree) of the first urination section and the second urination section are approximately the same. The degree of darkness and spread of the image density value are shown to be approximately the same.

 これに対して、図11は、第1の排尿部と第2の排尿部との散らばり度合い(分散度)については同程度であるが、第2の排尿部の方が第1の排尿部よりも重なりが多く流量が多くなった場合を示している。図11に示されるように画像濃度値の広がり具合が同程度であるものの濃さの程度が異なるように示されている。図12は、第1の排尿部と第2の排尿部との流量については同程度であるが、第1の排尿部の方が第2の排尿部よりも散らばり度合い(分散度)が高くなった場合を示している。図12に示されるように画像濃度値の広がり具合が第1の排尿部の方が第2の排尿部よりも広くなったために流量が同程度であるものの濃さの程度が異なるように示されている。 In contrast, FIG. 11 shows a case where the first urination area and the second urination area have approximately the same degree of dispersion, but the second urination area has more overlap and a higher flow rate than the first urination area. As shown in FIG. 11, the spread of the image density values is about the same, but the degree of darkness differs. FIG. 12 shows a case where the first urination area and the second urination area have approximately the same flow rate, but the first urination area has a higher degree of dispersion than the second urination area. As shown in FIG. 12, the spread of the image density values is wider in the first urination area than in the second urination area, so although the flow rates are about the same, the degree of darkness differs.

 図4に戻り、ステップS09では、生成した排尿部通過頻度分布情報に基づいて、排尿の分裂状態を算出して記憶する。分裂状態には、例えば、排尿の分裂数、流量、流量比、分離度、分散度、および、断面形状などが含まれる。なお、分裂状態としては、これに限るものではなく、これに替えてあるいは加えて他のパラメータを算出して記憶するものであってもよく、また、排尿の分裂数、流量、流量比、分離度、分散度、および、断面形状のうちの少なくともいずれかを算出して記憶するものであってもよい。 Returning to FIG. 4, in step S09, the urination splitting state is calculated and stored based on the generated urination section passing frequency distribution information. The splitting state includes, for example, the number of splits of urination, flow rate, flow rate ratio, degree of separation, degree of dispersion, and cross-sectional shape. Note that the splitting state is not limited to this, and other parameters may be calculated and stored instead of or in addition to this, and at least one of the number of splits of urination, flow rate, flow rate ratio, degree of separation, degree of dispersion, and cross-sectional shape may be calculated and stored.

 排尿の分裂数は、排尿部通過頻度分布情報に基づいて、例えば画像濃度値が所定の閾値(例えば、200、あるいは100など)以上のピクセルが所定数(例えば、10)以上固まっているピクセルを含む塊の数である。図9の例では「1」と算出され、図10~図12の例では「2」と算出される。 The number of splits of urination is, based on the urination area passing frequency distribution information, the number of masses each containing a predetermined number (e.g., 10) or more of clustered pixels whose image density values are equal to or greater than a predetermined threshold (e.g., 200 or 100). In the example of Figure 9, it is calculated as "1," and in the examples of Figures 10 to 12, it is calculated as "2."

 流量は、塊毎の流量であって、排尿部通過頻度分布情報に基づいて、例えば塊毎の画像濃度値の合計値(和)である。図9の例では1つの塊の画像濃度値の和が算出され、図10~図12の例では第1の排尿部の画像濃度値の和と、第2の排尿部の画像濃度値の和とが算出される。 The flow rate is the flow rate for each lump, and is, for example, the total value (sum) of the image density values for each lump, based on the urination section passing frequency distribution information. In the example of Figure 9, the sum of the image density values for one lump is calculated, and in the examples of Figures 10 to 12, the sum of the image density values of the first urination section and the sum of the image density values of the second urination section are calculated.

 流量比は、全体の流量に対する塊毎の流量の比である。図9の例では塊が一つであるため流量比は1となり、図10~図12の例では第1の排尿部についての流量比と、第2の排尿部についての流量比とが算出される。 The flow rate ratio is the ratio of the flow rate for each mass to the total flow rate. In the example of Figure 9, there is one mass, so the flow rate ratio is 1, and in the examples of Figures 10 to 12, the flow rate ratio for the first urination section and the flow rate ratio for the second urination section are calculated.

 分離度は、塊の離れ具合(分布間の分散具合)であって、排尿部通過頻度分布情報に基づいて、例えばすべての画像濃度値から重心位置(以下、全体の重心位置ともいう)と、塊毎の画像濃度から重心位置(以下、塊毎の重心位置ともいう)とを算出し、全体の重心位置から、塊毎の重心位置までの距離の合計値(和)である。図9の例では全体の重心位置と塊の重心位置とは一致するためゼロが算出され、図10~図12の例では、各図において全体の重心位置や塊毎の重心位置が異なる位置となるため、分離度として異なる値が算出される。 The degree of separation is the degree of separation of the lumps (the degree of dispersion between distributions), and is calculated based on the urination tract passing frequency distribution information, for example by calculating the center of gravity from all image density values (hereinafter also referred to as the overall center of gravity) and the center of gravity from the image density of each lump (hereinafter also referred to as the center of gravity of each lump), and is the total value (sum) of the distance from the overall center of gravity to the center of gravity of each lump. In the example of Figure 9, the overall center of gravity and the center of gravity of each lump are the same, so zero is calculated, and in the examples of Figures 10 to 12, the overall center of gravity and the center of gravity of each lump are different in each figure, so different values are calculated as the degree of separation.
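The separation measure described here, the sum of distances from the overall density centroid to each mass's centroid, can be sketched as follows; the per-mass masks are assumed to come from a labeling step like step S02.

```python
def centroid(density, mask=None):
    """Density-weighted centroid (x, y); an optional mask restricts the
    computation to one mass's pixels."""
    sx = sy = s = 0.0
    for y, row in enumerate(density):
        for x, v in enumerate(row):
            if mask is not None and not mask[y][x]:
                continue
            sx += x * v
            sy += y * v
            s += v
    return sx / s, sy / s

def separation_degree(density, mass_masks):
    """Sum of distances from the overall centroid of the frequency map
    to each mass's centroid, as described in the text."""
    gx, gy = centroid(density)
    total = 0.0
    for m in mass_masks:
        mx, my = centroid(density, m)
        total += ((mx - gx) ** 2 + (my - gy) ** 2) ** 0.5
    return total
```

As in the text, a single mass yields zero because its centroid coincides with the overall centroid.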

 分散度は、塊毎の散らばり具合である。分散度は、排尿部通過頻度分布情報に基づいて、例えば、塊毎に、最大画像濃度値となる画素位置(以下、最大画素位置ともいう)を特定し、最大画素位置を通る直線上(図9~図12などの画像における水平線上など)の画像濃度値の分布を算出して最大画像濃度値の半分の画像濃度値となる画素位置(以下、半分画素位置ともいう)を特定し、塊毎に最大画素位置と半分画素位置との間隔(以下、半値幅ともいう)である。半値幅が大きい塊については、分散度が大きい(散らばっている)と評価でき、半値幅が小さい塊については、分散度が小さい(まとまっている)と評価できる。 The degree of dispersion is the degree of scattering of each lump. Based on the urination part passing frequency distribution information, for example, the pixel position with the maximum image density value (hereinafter also referred to as the maximum pixel position) is identified for each lump, the distribution of image density values on a line passing through the maximum pixel position (such as on a horizontal line in images such as Figures 9 to 12) is calculated, a pixel position with an image density value that is half the maximum image density value (hereinafter also referred to as the half pixel position) is identified, and the degree of dispersion is the distance between the maximum pixel position and the half pixel position (hereinafter also referred to as the half width) for each lump. A lump with a large half width can be evaluated as having a large degree of dispersion (scattered), and a lump with a small half width can be evaluated as having a small degree of dispersion (concentrated).
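The half-width measure can be sketched on a 1-D density profile taken through a mass's peak. Which side of an asymmetric profile to use is an assumption of this sketch; the document does not say.

```python
def half_width(profile):
    """Half-width of a 1-D density profile through a mass's peak: the
    pixel distance from the peak to the nearer point where the density
    first drops to half the peak value (the dispersion measure of the
    text). The choice of the nearer side is an assumption."""
    peak = max(profile)
    p = profile.index(peak)
    half = peak / 2.0
    right = next((i - p for i in range(p, len(profile))
                  if profile[i] <= half), len(profile) - 1 - p)
    left = next((p - i for i in range(p, -1, -1)
                 if profile[i] <= half), p)
    return min(left, right)
```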

 断面形状には、塊毎の、周囲長と面積との関係から特定される円形度や、縦横比となるアスペクト比などが含まれる。 The cross-sectional shape includes the circularity, which is determined from the relationship between the perimeter and area of each block, and the aspect ratio, which is the ratio of the width to the height.
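The shape descriptors named here have standard closed forms; a short sketch (the bounding-box convention for the aspect ratio is an assumption):

```python
import math

def circularity(area, perimeter):
    """4*pi*A / P^2: equals 1.0 for a perfect circle and decreases as
    the cross-section becomes elongated or ragged."""
    return 4.0 * math.pi * area / (perimeter ** 2)

def aspect_ratio(bbox_w, bbox_h):
    """Bounding-box aspect ratio, reported as longer side over shorter."""
    return max(bbox_w, bbox_h) / min(bbox_w, bbox_h)
```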

 排尿部通過頻度分布情報に基づいて算出された分裂状態の各種値は図6(C)に示されるように項目ごとに記憶される。 Various values of the division state calculated based on the urination passage frequency distribution information are stored for each item as shown in Figure 6 (C).

 図4に戻り、ステップS10では、算出・記憶した排尿流量や分裂状態を含む排尿関連値を出力して、排尿関連値算出処理を終了する。ステップS10では、例えば、図6(A)の断面積の情報に基づいて、横軸を排尿の個体の断面積の大きさとし、縦軸を排尿の個体数とするヒストグラムを生成して出力する。図13(A)は、第1の排尿動画像に基づく排尿の個体の断面積と個体数とに基づくヒストグラムを示し、図13(B)は、第2の排尿動画像に基づく排尿の個体と個体数とに基づくヒストグラムを示している。また、図13(C)は、図13(A)と図13(B)との対比を示す図であり、第1の排尿動画像の算出結果と、第2の排尿動画像の算出結果とに大差が生じていないことが示されている。また、ステップS10では、図6(C)で示した分裂状態や、排尿部通過頻度分布情報に基づく画像(図9~図12参照)に関する排尿動態情報も出力される。ステップS10では、これに加えて取得動画像も出力するようにしてもよい。なお、ステップS10における出力とは、ディスプレイや、プリントアウト、所定の宛先への電子メール送信などが含まれる。 Returning to FIG. 4, in step S10, the urination-related values including the calculated and stored urination flow rate and division state are output, and the urination-related value calculation process is terminated. In step S10, for example, based on the cross-sectional area information of FIG. 6(A), a histogram is generated and output, with the horizontal axis representing the size of the cross-sectional area of the urination individual and the vertical axis representing the number of urination individuals. FIG. 13(A) shows a histogram based on the cross-sectional area and number of urination individuals based on the first urination motion image, and FIG. 13(B) shows a histogram based on the urination individuals and number of individuals based on the second urination motion image. Also, FIG. 13(C) is a diagram showing a comparison between FIG. 13(A) and FIG. 13(B), and shows that there is no significant difference between the calculation results of the first urination motion image and the calculation results of the second urination motion image. Furthermore, in step S10, urination dynamics information regarding the state of division shown in FIG. 6(C) and images based on urination portion passing frequency distribution information (see FIGS. 9 to 12) are also output. In addition, in step S10, acquired moving images may also be output. Note that output in step S10 includes display, printing, sending an e-mail to a specified destination, etc.

 排尿動態特定装置1により算出・出力された排尿流量や分裂状態などを含む排尿関連値は、医師などによる診断に用いられる。医師は、被検者からの排尿動画像に基づき算出された排尿流量や分裂状態を含む排尿関連値に基づいて、前立腺肥大症や過活動膀胱・尿失禁症その他の排尿障害のリスクや、罹患の可能性、通院の必要性などについて診断することができる。これにより、臨床的には重要なポイントとなる排尿動態を排尿の分裂状態の形態・分布・流量やそれらの時間的変化に関連した客観的事実に基づき評価することができ、排尿障害について適切に診察を行うことができる。診断結果は、通信回線などを介して被検者に送信されるものであってもよく、印刷した診断結果書を被検者の住所に送付されるものなどであってもよい。 The urination-related values including the urination flow rate and split state calculated and output by the urination dynamics specification device 1 are used for diagnosis by doctors and others. Based on the urination-related values including the urination flow rate and split state calculated from the subject's urination video images, a doctor can diagnose the risk of benign prostatic hyperplasia, overactive bladder, urinary incontinence, and other urinary disorders, the possibility of having such a disorder, the need for hospital visits, and so on. This allows the urination dynamics, a clinically important point, to be evaluated based on objective facts relating to the morphology, distribution, and flow rate of the split state of urination and their changes over time, enabling appropriate examination of urinary disorders. The diagnosis results may be sent to the subject via a communication line or the like, or a printed diagnosis report may be mailed to the subject's address.

 本実施の形態における排尿動態特定装置1は、人体から放出される排尿の時間経過に応じた動態を特定するための排尿動態特定装置1であって、被検者から便器などへ放出される排尿に対して面状の光を照射させて交差させている面が撮像されて取得した取得動画像に基づいて、排尿関連値算出部4により排尿の時間経過に応じた動態として、当該面における排尿の分裂状態が特定されて、排尿の分裂状態に関する情報が出力される。これにより、排尿の時間経過に応じた動態として排尿の分裂状態を特定して評価することができる。その結果、排尿障害について適切に診察を行うことができる。 The urination dynamics specifying device 1 in this embodiment is a urination dynamics specifying device 1 for specifying the dynamics of urination released from the human body over time, and based on the acquired moving image obtained by irradiating planar light onto urine released from a subject into a toilet or the like and capturing an image of the intersecting plane, the urination-related value calculation unit 4 specifies the urination split state on that plane as the dynamics of urination over time, and outputs information regarding the urination split state. This makes it possible to specify and evaluate the urination split state as the dynamics of urination over time. As a result, it is possible to perform an appropriate diagnosis for urinary disorders.

 また、排尿動態特定装置1は、放出される排尿と交差し得る面状の光を照射する水平切断光照明部2と、当該光の面に対向する位置から当該面を撮像するカメラ3とを備え、排尿関連値算出部4は、カメラ3から取得した面の取得動画像に基づいて、分裂している排尿各々の分裂数、断面積、流量、分離度、断面形状のうちの少なくともいずれかを排尿の分裂状態として抽出する。これにより、排尿の分裂状態をより細かく解析した項目の値をより具体的に特定して評価することができる。 The urination dynamics specification device 1 also includes a horizontal section light illumination unit 2 that irradiates a planar light that may intersect with the released urine, and a camera 3 that captures the plane of the light from a position opposite the plane, and the urination-related value calculation unit 4 extracts at least one of the number of divisions, cross-sectional area, flow rate, degree of separation, and cross-sectional shape of each divided urination as the division state of the urination based on the moving image of the plane acquired by the camera 3. This makes it possible to more specifically specify and evaluate the values of items that provide a more detailed analysis of the division state of the urination.

 また、水平切断光照明部2は、第1の波長を有する面状の第1のレーザー光を照射する第1レーザー部21と、第2の波長を有する面状の第2のレーザー光を照射する第2レーザー部22とを備え、互いに所定間隔Dをあけて平行となる第1の水平切断面P1および第2の水平切断面P2を生成し、排尿関連値算出部4は、第1の水平切断面P1における排尿の分裂状態と、第2の水平切断面P2における排尿の分裂状態と所定間隔Dとに基づいて、第1の水平切断面P1と第2の水平切断面P2との間における排尿の通過速度と排尿の断面積とを算出し、排尿の時間経過に応じた動態として排尿の流量を特定する。これにより、分裂状態のみならず排尿の流量を特定して評価することができ、排尿障害についての診察の精度を向上させることができる。また、水平切断光照明部2は、入射したレーザー光を水平面状に出射するシリンドリカルレンズを有する。これにより、水平切断光照明部2の構造を簡素化できる。 The horizontal section light illumination unit 2 includes a first laser unit 21 that irradiates a planar first laser light having a first wavelength, and a second laser unit 22 that irradiates a planar second laser light having a second wavelength, and generates a first horizontal section P1 and a second horizontal section P2 that are parallel to each other at a predetermined interval D, and the urination-related value calculation unit 4 calculates the urination passing speed and the cross-sectional area between the first horizontal section P1 and the second horizontal section P2 based on the split state of urination on the first horizontal section P1 and the split state of urination on the second horizontal section P2 and the predetermined interval D, and specifies the urination flow rate as a dynamic state according to the time course of urination. This makes it possible to specify and evaluate not only the split state but also the urination flow rate, thereby improving the accuracy of diagnosis for urinary disorders. The horizontal section light illumination unit 2 also has a cylindrical lens that emits the incident laser light in a horizontal plane. This simplifies the structure of the horizontal section light illumination unit 2.

 また、排尿関連値算出部4は、第1の水平切断面P1における排尿の分裂状態と、第2の水平切断面P2における排尿の分裂状態とに基づき、相互相関係数を用いて排尿が第1の水平切断面P1から第2の水平切断面P2に達するまでの時間を算出して、排尿の通過速度を算出する。これにより、排尿の尿流速度を効率よく正確に特定できる。 The urination-related value calculation unit 4 also calculates the time it takes for urination to reach the second horizontal cut plane P2 from the first horizontal cut plane P1 using a cross-correlation coefficient based on the split state of urination on the first horizontal cut plane P1 and the split state of urination on the second horizontal cut plane P2, and calculates the urination passage velocity. This allows the urinary flow velocity of urination to be determined efficiently and accurately.

 (変形例)
 前述した実施の形態では、病院などの施設に排尿動態特定装置1を設置して、受診にきた被検者からの排尿を撮像した動画像に基づき分裂状態を含む排尿動態を特定する例について説明した。しかし、これに限らず、排尿動態特定装置1のうち、図1および図3に示す水平切断光照明部2およびカメラ3の構成のみ(撮像装置)を病院などの第1の施設に設置し、第1の施設とは異なる第2の施設に図1に示す排尿関連値算出部4のみを排尿動態特定装置として設置し、第1の施設において被検者からの排尿の動画像を撮像して第2の施設に送信し、第2の施設において受信した排尿の動画像に基づき分裂状態を含む排尿動態を特定し、その特定結果を動画像送信元の施設に返信するようにしてもよい。この場合、特定結果に基づき第1の施設の医師などにより診断が行われるようにしてもよく、第2の施設の医師により特定結果に基づく診断がされたうえで、その診断結果を含む結果を動画像送信元の施設に返信するようにしてもよい。
(Modification)
In the above-mentioned embodiment, an example was described in which the urination behavior specification device 1 is installed in a facility such as a hospital, and the urination behavior including the schizophrenia is specified based on a moving image of urination from a subject who comes to the hospital. However, the present invention is not limited to this. Only the horizontal section light illumination unit 2 and the camera 3 shown in Fig. 1 and Fig. 3 (imaging device) of the urination behavior specification device 1 are installed in a first facility such as a hospital, and only the urination-related value calculation unit 4 shown in Fig. 1 is installed as a urination behavior specification device in a second facility different from the first facility, and the first facility may capture a moving image of urination from the subject and transmit it to the second facility, and the second facility may specify the urination behavior including the schizophrenia based on the received moving image of urination, and return the result of the specification to the facility that sent the moving image. In this case, a doctor or the like at the first facility may make a diagnosis based on the result of the specification, or a doctor at the second facility may make a diagnosis based on the result of the specification, and a result including the diagnosis result may be returned to the facility that sent the moving image.

 前述した実施の形態では、水平切断光照明部2が、第1の波長を有する面状の第1のレーザー光を照射する第1レーザー部21と、第2の波長を有する面状の第2のレーザー光を照射する第2レーザー部22とを備える例について説明した。しかし、光学的に異なる特性(例えば、波長、偏光など)を有する2種類の光を照射するものであれば、これに限るものではない。なお、カメラについては、照射する光の特性に応じた画像を区別できるものであればカラーカメラなどに限るものではなく、例えば、偏光が異なる2種類の光を照射する場合には偏光カメラを採用するものであってもよい。また、図4のステップS09で示した排尿の分裂状態を特定して診断する場合、水平切断光照明部は、ひとつのレーザー部を備え、排尿関連値算出部4は、図4のステップS08~ステップS10の処理を行うようにしてもよい。また、水平切断光照明部は、3つ以上のレーザー部を備え、3種類の面における排尿の情報から分裂状態をより詳細に特定できるようにしてもよい。
 また、前述した実施の形態では、レーザー光をシリンドリカルレンズ24によって面状に広げて照明する例を示したが、面状の光を形成できる照明であれば、これに限るものではなく、例えば、細い隙間を通して面状の光を形成するスリット光や、LEDを直線上(形成する面)に沿って連ねて面状の光を形成するようなものであってもよい。
In the above-described embodiment, an example has been described in which the horizontal section light illumination unit 2 includes a first laser unit 21 that irradiates a planar first laser light having a first wavelength, and a second laser unit 22 that irradiates a planar second laser light having a second wavelength. However, the present invention is not limited to this, as long as it irradiates two types of light having different optical characteristics (e.g., wavelength, polarization, etc.). The camera is not limited to a color camera, etc., as long as it can distinguish images according to the characteristics of the irradiated light. For example, a polarized camera may be used when irradiating two types of light having different polarizations. In addition, when identifying and diagnosing the split state of urination shown in step S09 of FIG. 4, the horizontal section light illumination unit may include one laser unit, and the urination-related value calculation unit 4 may perform the processes of steps S08 to S10 of FIG. 4. In addition, the horizontal section light illumination unit may include three or more laser units, and the split state may be identified in more detail from the information on urination in three types of planes.
In addition, in the above-described embodiment, an example was shown in which the laser light is expanded into a surface by the cylindrical lens 24 to illuminate the object, but the present invention is not limited to this as long as the illumination can form surface light. For example, slit light that forms surface light through a narrow gap, or LEDs that are lined up in a straight line (the surface to be formed) to form surface light may be used.

 前述した実施の形態における図4のステップS08では、図14の方式Aのように取得動画像を構成するすべての画像を重ね合わせて排尿部通過頻度分布情報を生成して記憶する例を示したが、これに限るものではない。例えば、図14の方式Bに示すように、あるフレーム(Serial Number)について、当該フレームから所定数(例えば、100)の画像を重ね合わせて排尿部通過頻度分布情報を生成することにより、すべてのフレーム各々について排尿部通過頻度分布情報(1フレームずつずらした排尿部通過頻度分布情報)を生成するようにしてもよい。また、図14の方式Cに示すように、所定数(例えば、100)ずつに分割して、分割した所定数の画像を重ね合わせて分割した所定数毎に排尿部通過頻度分布情報(100フレームずつ完全に別々のセットに基づく排尿部通過頻度分布情報)を生成するようにしてもよい。これにより、例えば、排尿の途中で分裂数が変化したことなど、より詳細な排尿動態を特定できる。 In step S08 of FIG. 4 in the above-mentioned embodiment, an example was shown in which all images constituting the acquired moving image are superimposed to generate and store urination area passing frequency distribution information as in method A of FIG. 14, but this is not limited to this. For example, as shown in method B of FIG. 14, for a certain frame (Serial Number), a predetermined number (e.g., 100) of images from the frame are superimposed to generate urination area passing frequency distribution information, thereby generating urination area passing frequency distribution information (urination area passing frequency distribution information shifted by one frame) for each of all frames. Also, as shown in method C of FIG. 14, it is possible to divide the frame into a predetermined number (e.g., 100), superimpose the divided images into a predetermined number, and generate urination area passing frequency distribution information for each divided number (urination area passing frequency distribution information based on completely separate sets of 100 frames). This allows more detailed urination dynamics to be identified, such as, for example, a change in the number of divisions during urination.

 前述した実施の形態では、第1の水平切断面と、第2の水平切断面とが平行であり、かつ、カメラの撮像方向が第1の水平切断面および第2の水平切断面の垂線と一致する向きに設置されている例について説明した。これにより、第1の水平切断面と、第2の水平切断面との間隔が一定となり複雑な計算処理を行うことなく、かつ、カメラにより歪んで撮像されることを防止でき、その結果、精度良く分裂状態を含む排尿関連値を算出できる。しかし、これに限らず、カメラの撮像方向が第1の水平切断面および第2の水平切断面の垂線と一致する向きに設置されていないものであってもよい。また、第1の水平切断面と、第2の水平切断面とは、その垂線が鉛直方向となる面、つまり水平面となる例を想定しているが、これに限らず、第1の水平切断面と、第2の水平切断面とは、互いに平行であれば必ずしも水平面と一致させるものに限らず、水平面と若干のずれが生じる面であってもよい。 In the above-described embodiment, an example was described in which the first horizontal cut plane and the second horizontal cut plane are parallel, and the camera is installed so that its imaging direction coincides with the perpendicular to the first and second horizontal cut planes. This keeps the distance between the first and second horizontal cut planes constant, eliminating the need for complex calculation processing, and prevents the camera from capturing distorted images; as a result, the urination-related values including the split state can be calculated with high accuracy. However, this is not limiting, and the camera need not be installed so that its imaging direction coincides with the perpendicular to the first and second horizontal cut planes. In addition, the first and second horizontal cut planes are assumed to be planes whose perpendicular is vertical, that is, horizontal planes, but this is not limiting; as long as the first and second horizontal cut planes are parallel to each other, they do not necessarily have to coincide with a horizontal plane and may deviate slightly from it.

1: Urination dynamics identification device
2: Horizontal cutting-light illumination unit
3: Camera
4: Urination-related value calculation unit

Claims (6)

1. A urination dynamics identification device for identifying the dynamics over time of urine discharged from a human body, the device comprising:
 an identification unit that identifies, based on an image of a plane of planar light irradiated so as to intersect the discharged urine, a split state of the urine in that plane as the dynamics of the urine over time; and
 an output unit that outputs information on the split state of the urine identified by the identification unit.

2. The urination dynamics identification device according to claim 1, further comprising:
 a light irradiation unit that irradiates planar light capable of intersecting the discharged urine; and
 an imaging unit that images the plane of the light from a position facing the plane,
 wherein the image of the plane is an image captured by the imaging unit over time, and
 the identification unit includes an extraction unit that extracts, based on the image of the plane, at least one of the number of splits, the cross-sectional area, the flow rate, the degree of separation, and the cross-sectional shape of each split urine stream as the split state of the urine.

3. The urination dynamics identification device according to claim 2, wherein
 the light irradiation unit includes a first light irradiation unit that irradiates planar first light having a first characteristic, and a second light irradiation unit that irradiates planar second light having a second characteristic in parallel with the plane of the first light at a predetermined interval, and
 the identification unit calculates a passage speed and a cross-sectional area of the urine between the plane of the first light and the plane of the second light, based on the split state of the urine in the plane of the first light, the split state of the urine in the plane of the second light, and the predetermined interval, and identifies the flow rate of the urine as the dynamics of the urine over time.

4. The urination dynamics identification device according to claim 3, wherein the identification unit calculates the passage speed of the urine by using a cross-correlation coefficient to calculate the time taken for the urine to travel from the plane of the first light to the plane of the second light, based on the split state of the urine in the plane of the first light and the split state of the urine in the plane of the second light.

5. The urination dynamics identification device according to claim 2, wherein the light irradiation unit has a cylindrical lens that spreads incident laser light into a planar beam.

6. An imaging device for capturing an identification image used to identify the dynamics over time of urine discharged from a human body, the device comprising:
 a light irradiation unit that irradiates planar light capable of intersecting the discharged urine; and
 an imaging unit that captures, as the identification image, an image of the plane of the light from a position facing the plane.
PCT/JP2024/017895 2023-05-19 2024-05-15 Urination dynamics identification device and imaging device Pending WO2024241985A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-083008 2023-05-19
JP2023083008 2023-05-19

Publications (1)

Publication Number Publication Date
WO2024241985A1 true WO2024241985A1 (en) 2024-11-28

Family

ID=93589816

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020124497A (en) * 2019-01-31 2020-08-20 Necソリューションイノベータ株式会社 Urination management device, urination management method, program, and recording medium
US20210275073A1 (en) * 2020-03-05 2021-09-09 Emano Metrics, Inc. Systems And Methods For Uroflowmetry
US20210401341A1 (en) * 2020-06-30 2021-12-30 Zavdi Lichtman Imaged-Based Uroflowmetry Device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ISOMURA, A. ET AL.: "Estimating flow rate and total volume of simulated urine flow noninvasively from a monocular camera", 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 25 August 2015, pages 751-754, XP032810300, DOI: 10.1109/EMBC.2015.7318471 *

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24810992

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2025522343

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE