CN111144379B - Automatic identification method for visual dynamic response of mice based on image technology - Google Patents
- Publication number
- CN111144379B (application CN202010001628.9A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
Abstract
Description
Technical Field

The invention relates to an image recognition method.

Background Art

Mice (the commonly used laboratory mice, which in practice are not necessarily white) are a model animal frequently used in biological experiments. In life science and biological research it is often necessary to evaluate an animal's stress response, but since experimental animals cannot express their subjective feelings, accurate and effective indirect methods are needed to measure their response levels. Behavioral patterns differ considerably between individual animals. For example, mice show an optokinetic response to moving objects, i.e., a spontaneous cycle of following, resetting, and following again. A commonly used mouse optokinetic experiment places the mouse inside a circular grating and lets it watch the rotating stripes. Different individuals show different degrees of optokinetic response to the rotating grating, manifested in how often, and for how long at a time, the head follows the grating's motion. Figure 1 shows a typical simple optokinetic apparatus. During the experiment a camera generally records the mouse's movement, and the video is analyzed and counted manually afterwards. For optokinetic experiments, manual counting has the following problems:

1. A mouse's response to the grating is not continuous but intermittent, usually lasting a few seconds before turning into irregular movement. The experimenter therefore has to watch the entire video (typically several minutes long), and movements that are hard to classify must be replayed repeatedly before a judgment can be made.

2. Optokinetic experiments generally require a large number of samples, and each sample needs a video of a certain length, so the workload of manual viewing and counting is enormous.

3. Because of individual differences among mice, manual counting is subject to subjective interference that shifts the evaluation criteria; especially over long statistical sessions, fluctuations in the observer's psychological state affect the reliability and objectivity of the whole experiment.

Therefore, an automatic identification method for the mouse optokinetic response based on computer image technology can effectively reduce labor costs and improve the efficiency and accuracy of the experiment.
Summary of the Invention

The purpose of the present invention is to solve the problems that existing methods for identifying the mouse optokinetic response consume large amounts of time and manpower and suffer from low recognition accuracy and efficiency, and to propose an automatic identification method for the mouse optokinetic response based on image technology.

The specific process of the image-based automatic identification method for the mouse optokinetic response is as follows:

Step 1: extract the outlines of the mouse's body, ears, and tail;

Step 2: identify the orientation of the mouse's head based on these outlines;

Step 3: identify the mouse's optokinetic response based on the outlines obtained in Step 1 and the head orientation obtained in Step 2.

The beneficial effects of the present invention are:

1. Fully automatic real-time recognition. One only needs to set the experimental parameters and place the mouse into the apparatus; once the specified experiment time has elapsed, the optokinetic data are available immediately, without manual intervention or monitoring, which greatly saves time and labor. Suppose each mouse requires a 2-minute experiment and each group contains 20 mice: the manual method typically needs 2 × 20 = 40 minutes of experiments plus roughly three times as long for video review, 160 minutes in total. With the automatic recognition of the present invention, the work is completed in only 40 minutes.

2. Objective evaluation criteria. Having a computer identify and evaluate the optokinetic response completely avoids the influence of experimenter fatigue, inexperience, and similar factors on the results and yields objective, accurate data, which is of great significance for medical research.

3. High-throughput experiments. By running several devices in parallel, the experimental efficiency is multiplied: with 5 devices working simultaneously, for example, an experiment on 100 mice yields results in only 40 minutes, whereas at this sample size a manual experiment would require an enormous investment of labor. This improves both the efficiency and the accuracy of the experiment.

4. Consistency of experimental conditions. Since the behavioral state of mice changes over time, all mice should complete the test within the same time window to keep the experimental conditions consistent. Applying the present invention to high-throughput experiments meets this requirement with only a small investment of manpower.
Brief Description of the Drawings

Figure 1 is a schematic diagram of manual observation of the mouse optokinetic response;

Figure 2 is a diagram of the apparatus used in the present invention: 1 is the grating display module, composed of 4 liquid crystal screens that display scrolling light and dark stripes whose width, brightness, speed, color, and other parameters can be preset and modified by the computer; 2 is the mouse platform on which the mouse is placed; 3 and 4 are the top cover and bottom plate, made of mirror material to extend the longitudinal depth of the grating; 5 is the camera used for video capture during the experiment;

Figure 3 shows the extraction result of the mouse outlines in the present invention: 6 is the outline of the mouse's body, 7 the outline of its ears, and 8 the outline of its tail;

Figure 4 shows the result of the preliminary positioning of the mouse's nose tip: 9 is the computed center of gravity of the body outline, 10 is the selected region around the ears used to avoid the auricles during recognition, 11 is the bounding rectangle of the tail outline, and 12 is the obtained preselected nose-tip point;

Figure 5 shows the correction result for the nose tip: 13 is the center of gravity of the front half of the mouse, 14 is the region near the head used to select the nose tip, and 15 is the final nose tip Pn;

Figure 6 shows the result of identifying the head direction: 16 and 17 are circles centered on Pn with radii r1 and r2, 18 and 19 are the centers of gravity Gr1 and Gr2, and 20 is the resulting head orientation;

Figure 7a is a schematic diagram of the local integral of the first-order difference used to find candidate segments of the mouse optokinetic response;

Figure 7b is a schematic diagram of the local integral of the second-order difference used for the same purpose;

Figure 8 is a schematic diagram of the optokinetic response states identified with the method of the present invention: the dotted line is the random head-movement curve, and the solid-line segments mark optokinetic response states; all solid-line segments except the third are the identified optokinetic response states.
Detailed Description

Embodiment 1: the specific process of the image-based automatic identification method for the mouse optokinetic response in this embodiment is as follows:

The core of the present invention is a method for automatically identifying the mouse optokinetic response based on computer vision technology. It can be applied to automatic optokinetic-response experimental devices built on the rotating-grating principle: a camera captures video of the mouse inside the grating, and a computer analyzes the video in real time, identifying and recording the mouse's response to the grating, the number of responses, their duration, and other information.

The structure of the experimental apparatus used in the present invention is shown in Figure 2, where 1 is the grating display module, composed of 4 liquid crystal screens; the screens display scrolling light and dark stripes whose width, brightness, speed, color, and other parameters can be preset and modified by the computer; 2 is the mouse platform on which the mouse is placed; 3 and 4 are the top cover and bottom plate, made of mirror material to extend the longitudinal depth of the grating; 5 is the camera used for video capture during the experiment.

During the experiment the mouse is placed on the platform while the screens display continuously scrolling grating stripes. The camera captures video during the experiment and sends the images to the computer, where the recognition algorithm analyzes and judges the mouse's optokinetic response. The detection flow of the algorithm is:

Step 1: extract the outlines of the mouse's body, ears, and tail;

Step 2: identify the orientation of the mouse's head based on these outlines;

Step 3: identify the mouse's optokinetic response based on the outlines obtained in Step 1 and the head orientation obtained in Step 2. Building on the first two steps, the head orientation and its changes can be tracked continuously in the video, and analyzing the tracking data yields the mouse's optokinetic response data.
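The analysis of the tracking data is left abstract here; Figures 7a and 7b indicate that local integrals of the first- and second-order differences of the head-movement signal are used to find candidate response segments. The sketch below only illustrates that idea in outline: the window length and both thresholds are invented for the example and would need tuning against real recordings. A segment counts as a following episode when the head angle drifts steadily in the grating's direction (large local integral of the first-order difference) while staying smooth (small local integral of the second-order difference).

```python
def following_episodes(angles, win=10, drift_min=5.0, jerk_max=3.0):
    """Mark frames where the head angle appears to follow the grating.

    angles: head angle per frame, in degrees; positive drift is taken to be
            the grating's rotation direction.
    win: sliding-window length in frames (assumed value).
    drift_min: minimum summed first-order difference over a window (assumed).
    jerk_max: maximum summed |second-order difference| over a window (assumed).
    Returns a list of (start, end) frame-index pairs.
    """
    d1 = [angles[i + 1] - angles[i] for i in range(len(angles) - 1)]
    d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]
    flags = []
    for i in range(len(d2) - win + 1):
        drift = sum(d1[i:i + win])                 # local integral of 1st-order difference
        jerk = sum(abs(x) for x in d2[i:i + win])  # local integral of 2nd-order difference
        flags.append(drift >= drift_min and jerk <= jerk_max)
    # merge consecutive flagged windows into episodes
    episodes, start = [], None
    for i, f in enumerate(flags):
        if f and start is None:
            start = i
        elif not f and start is not None:
            episodes.append((start, i + win - 2))
            start = None
    if start is not None:
        episodes.append((start, len(flags) + win - 2))
    return episodes
```

With these example thresholds, a steady one-degree-per-frame rotation is flagged as a following episode, while alternating jitter of the same amplitude fails the drift test and is rejected.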
Embodiment 2: this embodiment differs from Embodiment 1 in the way Step 1 extracts the outlines of the mouse's body, ears, and tail. The specific process is:

Binarize the captured grayscale image and extract the mouse's edge contours.

The formula for binarization is:

bi_{k,l} = 255, if s1 ≤ i_{k,l} ≤ s2; bi_{k,l} = 0, otherwise (1)

where s1 is the minimum and s2 the maximum pixel value of the region of interest; i_{k,l} is the pixel value of the element in row k, column l of the captured image, and bi_{k,l} is the pixel value of the element in row k, column l of the binarized image. When identifying the mouse's body, ears, and tail, whose colors are not necessarily the same, different values of s1 and s2 should also be set for each contour extraction.

On this basis, the outlines of the body, ears, and tail can be extracted separately according to their different sizes and shapes. To make contour extraction more stable, the three RGB (red, green, blue) channels of a color image can also be binarized separately.

The extracted outlines are shown schematically in Figure 3, where 6 is the outline of the mouse's body, 7 the outline of its ears, and 8 the outline of its tail.
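The binarization step above is a simple range threshold. A minimal sketch in pure Python follows, with the image as a nested list of gray values; a real implementation would use an image library, and the threshold values s1 and s2 shown here are illustrative assumptions to be tuned per target.

```python
def binarize(img, s1, s2):
    """Range-threshold a grayscale image given as a list of rows.

    Pixels whose value lies in [s1, s2] (the region of interest) become 255,
    all others become 0, matching the binarization formula in the text.
    """
    return [[255 if s1 <= p <= s2 else 0 for p in row] for row in img]

# The body fur, ears, and tail generally need different (s1, s2) ranges
# because their gray levels differ; the range below is a made-up example.
img = [
    [10, 200, 210],
    [205, 40, 190],
]
body_mask = binarize(img, 180, 255)
```

Running each of the three targets through `binarize` with its own range yields three separate masks, from which the body, ear, and tail contours can then be traced.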
The other steps and parameters are the same as in Embodiment 1.
Embodiment 3: this embodiment differs from Embodiment 1 in the way Step 1 extracts the outlines of the mouse's body, ears, and tail. The specific process is:

When processing images captured by an ordinary camera, the extracted mouse outline is sometimes incomplete, which affects the subsequent positioning of the mouse's head and the recognition of the head direction. To extract the outline more reliably, a depth camera can be used to collect the data.

The depth camera contains two infrared cameras, an infrared dot projector, and an RGB camera; together they yield depth information for the objects perceived around the camera. In the optokinetic-response apparatus the distance between the mouse and the camera lies in a fixed range, so the set of pixels in the image whose depth falls within this range constitutes the mouse's body, and the outer edge of that region forms the body outline. A body outline separated in this way fits the mouse's true outline better than one extracted algorithmically from an ordinary camera image.

When collecting data on the mouse with the depth camera, the distance between the mouse and the camera satisfies:

fmin < f < fmax (2)

where f is the distance from the mouse to the depth camera (different parts of the mouse lie at different distances); fmin is the distance from the top of the mouse (the highest point of its posture) to the camera, and since mice vary in size and there is essentially nothing above the mouse, fmin may be chosen somewhat small; fmax is the distance from the bottom of the mouse (the lowest point of its posture) to the camera.

According to the depth information of the pixels in the image, the set of points within the distance range of formula (2) forms the mouse's body; the outer edge of this region is then extracted to obtain the body outline. Separated this way, the body outline is recovered from the image with better results.
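The depth-based segmentation of formula (2) amounts to keeping pixels whose depth lies strictly between fmin and fmax. A minimal sketch, with the depth map as nested lists and the bounds as assumed example values:

```python
def depth_mask(depth, f_min, f_max):
    """Return a binary mask of pixels whose depth f satisfies f_min < f < f_max.

    depth: depth map in the camera's distance units, as a list of rows.
    The masked region is the mouse's body; its outer edge gives the body outline.
    """
    return [[255 if f_min < f < f_max else 0 for f in row] for row in depth]

# Example: camera mounted about 60 cm above the platform, with the mouse
# assumed to occupy roughly the 45-58 cm band (illustrative values only).
depth = [
    [60.0, 50.2, 49.8],
    [51.0, 60.0, 60.0],
]
mask = depth_mask(depth, 45.0, 58.0)
```

The resulting mask can be fed to the same contour-tracing step as the threshold masks of Embodiment 2.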
Binarize the mouse body region and extract the mouse's edge contours.

The formula for binarization is:

bi_{k,l} = 255, if s1 ≤ i_{k,l} ≤ s2; bi_{k,l} = 0, otherwise (3)

where s1 is the minimum and s2 the maximum pixel value of the region of interest; i_{k,l} is the pixel value of the element in row k, column l of the captured image, and bi_{k,l} is the pixel value of the element in row k, column l of the binarized image.

The outlines of the body, ears, and tail are then extracted separately according to their different sizes and shapes.
The other steps and parameters are the same as in Embodiment 1.
Embodiment 4: this embodiment differs from Embodiment 2 or 3 in the way Step 2 identifies the orientation of the mouse's head based on the outlines of the body, ears, and tail. The specific process is:

Step 2.1: preliminarily locate the mouse's nose tip based on the outlines of the body, ears, and tail;

Step 2.2: correct the preliminary nose-tip position obtained in Step 2.1 to obtain the corrected nose tip Pn;

Step 2.3: identify the orientation of the mouse's head based on the nose tip Pn obtained in Step 2.2.

The other steps and parameters are the same as in Embodiment 2 or 3.
Embodiment 5: this embodiment differs from Embodiments 1 to 4 in the way Step 2.1 preliminarily locates the mouse's nose tip based on the outlines of the body, ears, and tail. The specific process is:

After obtaining the body outline, compute the center of gravity Gc of the body outline;

The calculation formula is:

Gc = (1/N) Σ_{(x,y)∈Ct} (x, y) (4)

where Ct is the set of coordinates of the points of the mouse's body outline and N is the number of points in Ct;

In general, the center of gravity of the body outline lies in the rear part of the mouse, and the point of the outline farthest from it is the nose tip. In some cases, however, the center of gravity falls in the front part of the mouse, and the nose tip is then easily misidentified near the base of the tail. In addition, when the body outline is identified, the area around the auricles is also picked up, since its fur color matches the rest of the body; small points readily form around the auricles, so the nose tip is also easily misidentified there.

Because of these two kinds of misidentification, this method uses the outlines of the tail and ears to correct the preliminary nose-tip position determined from the center of gravity. A region around the ears is chosen (e.g., a circle centered on the center of gravity of the ear outline, with a diameter slightly larger than the ear) so that the area around the auricles is avoided during recognition. Against misidentification near the tail base, the point of the body outline outside the ear regions farthest from Gc, and the point of the body outline outside the ear regions farthest from that first point, are selected as candidate points for the preliminary nose tip. The distance from each candidate to the tail and to the ears is then computed (the centers of gravity of the tail and ear outlines can be used for the distances), and the candidate that is far from the tail and close to the ears is chosen as the preselected nose-tip point Ppn, the result of the preliminary positioning.

After the tail and ear outlines have been identified, first compute the region around the ears:

With the center of gravity of each ear outline as the center, let lec be the greatest distance from the ear outline to its center of gravity; circles drawn with a radius slightly larger than lec (1.1 lec can be used in the experiments) give two circles that form the region around the ears.

Avoiding the region around the ears, take the point of the body outline outside that region farthest from the center of gravity Gc as the first candidate Ppn1 for the preselected nose-tip point, and compute the point of the body outline outside that region farthest from Ppn1 as the second candidate Ppn2;

The mouse's head is far from its tail and close to its ears, so the preselected nose point is judged by the candidates' distances to the tail and to the ears. Experiments show that the ears are identified more accurately than the tail, so the tail is used first for a soft judgment and the ears afterwards for a hard judgment; this guarantees that even if the tail is misidentified, the ear-based judgment still yields the correct result. Since the ears sometimes cannot be identified, the tail-based judgment cannot be omitted either.

Compute the distances from Ppn1 and Ppn2 to the tail and to the ears (the centers of gravity of the tail and ear outlines can be used), and select the candidate that is far from the tail and close to the ears as the preselected nose-tip point Ppn, the result of the preliminary positioning. This ensures that Ppn is not misidentified near the tail base or near the auricles.

The result of the preliminary positioning of the nose tip is shown in Figure 4, where 9 is the computed center of gravity of the body outline, 10 is the selected region around the ears used to avoid the auricles during recognition, 11 is the bounding rectangle of the tail outline, and 12 is the obtained preselected nose-tip point Ppn.
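The preliminary positioning above rests on two primitives: the centroid of a set of contour points (the formula for Gc) and the contour point farthest from a reference point, excluding points inside the ear circles. A minimal sketch with made-up coordinates follows; the ear-exclusion idea mirrors the 1.1 lec circles of the text, while the toy contour and radii are purely illustrative. Note how, on this symmetric blob, the farthest point from the centroid is ambiguous between the two ends, which is exactly why the text keeps two candidates and disambiguates them with the tail and ear distances.

```python
import math

def centroid(points):
    """Center of gravity of a set of contour points (the formula for Gc)."""
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

def farthest_point(points, ref, exclude_centers=(), exclude_radius=0.0):
    """Contour point farthest from ref, skipping points that lie within
    exclude_radius of any excluded center (the regions around the ears)."""
    best, best_d = None, -1.0
    for p in points:
        if any(math.dist(p, c) <= exclude_radius for c in exclude_centers):
            continue
        d = math.dist(p, ref)
        if d > best_d:
            best, best_d = p, d
    return best

# Toy contour: an elongated blob with the "nose" at (10, 0) and the
# tail end at (0, 0).
contour = [(0, 0), (2, 2), (4, 2.5), (6, 2), (8, 1), (10, 0),
           (8, -1), (6, -2), (4, -2.5), (2, -2)]
gc = centroid(contour)
ppn1 = farthest_point(contour, gc)    # first candidate (here the tail end)
ppn2 = farthest_point(contour, ppn1)  # second candidate (here the nose)
```

In the full method, ppn1 and ppn2 would next be compared by their distances to the tail and ear centroids, and the one far from the tail and close to the ears kept as Ppn.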
The other steps and parameters are the same as in Embodiments 1 to 4.
具体实施方式六:本实施方式与具体实施方式一至五之一不同的是,所述步骤二二中对步骤二一得到的小鼠鼻尖初步定位的结果进行位置矫正,得到矫正后小鼠的鼻尖Pn;具体过程为:Specific implementation example 6: This implementation example is different from specific implementation examples 1 to 5 in that in step 22, the preliminary positioning result of the mouse nose tip obtained in step 21 is corrected to obtain the corrected mouse nose tip P n ; the specific process is as follows:
在得到小鼠鼻尖的初步定位的结果和小鼠身体轮廓的重心Gc之后,计算小鼠鼻尖的初步定位的结果与小鼠身体轮廓的重心Gc之间的距离lpnc,选定小鼠身体轮廓中在以初步定位得到的小鼠的鼻尖为中心,半径稍大于lpnc(如1.2lpnc)长度的圆内的部分作为小鼠轮廓的前半部分,计算小鼠轮廓前半部分的重心;After obtaining the preliminary positioning result of the mouse nose tip and the center of gravity Gc of the mouse body contour, calculate the distance lpnc between the preliminary positioning result of the mouse nose tip and the center of gravity Gc of the mouse body contour, select the part of the mouse body contour within a circle with the preliminary positioning mouse nose tip as the center and a radius slightly larger than lpnc (such as 1.2lpnc ) as the front half of the mouse contour, and calculate the center of gravity of the front half of the mouse contour;
选定小鼠身体轮廓中避开小鼠耳朵周围区域,在以小鼠的鼻尖为中心,半径稍小于lpnc一半(如0.4lpnc)长度的圆内部分作为小鼠轮廓的头部附近区域,选择头部附近区域中距离小鼠轮廓前半部分的重心最远的点作为矫正后小鼠的鼻尖Pn。The area around the mouse ears is avoided in the mouse body contour. The part inside the circle with the mouse nose tip as the center and a radius slightly less than half of lpnc (such as 0.4lpnc ) is selected as the head vicinity of the mouse contour. The point farthest from the center of gravity of the front half of the mouse contour in the head vicinity is selected as the nose tip Pn of the mouse after correction.
小鼠鼻尖的预选点Ppn在一般情况下就可以选为鼻尖了,但当小鼠头部方向与身体方向之间的差异较大时,预选点的位置就会偏离鼻尖真正的位置。合适的矫正方法是在避开小鼠耳朵的同时选取小鼠头部附近区域距离小鼠前半部分重心最远的点作为新的小鼠鼻尖。将小鼠轮廓重心改成小鼠前半部分重心之后,重心距离Ppn变短的程度比距离真正鼻尖变短的程度更大,使得鼻尖成为距离前半部分重心最远的点。识别结果如图5所示,13是小鼠前半部分重心,14是用来筛选鼻尖的小鼠头部附近区域,15是最终得到的鼻尖Pn。The preselected point Ppn of the mouse nose tip can be selected as the nose tip in general, but when the difference between the mouse head direction and the body direction is large, the position of the preselected point will deviate from the true position of the nose tip. The appropriate correction method is to select the point farthest from the center of gravity of the front half of the mouse in the area near the mouse head while avoiding the mouse ears as the new mouse nose tip. After the center of gravity of the mouse contour is changed to the center of gravity of the front half of the mouse, the degree of shortening of the center of gravity from Ppn is greater than the degree of shortening from the true nose tip, making the nose tip the point farthest from the center of gravity of the front half. The recognition result is shown in Figure 5, 13 is the center of gravity of the front half of the mouse, 14 is the area near the mouse head used to screen the nose tip, and 15 is the final nose tip Pn .
The other steps and parameters are the same as in one of specific implementations one to five.
Specific implementation seven: this implementation differs from specific implementations one to six in that, in step 23, the head orientation of the mouse is identified based on the nose tip Pn obtained in step 22; the specific process is as follows:
Based on the symmetry of the mouse head, draw two circles centered at the nose tip Pn with radii r1 and r2, where r1 < r2. Let Gr1 and Gr2 be the centers of gravity of the parts of the outline inside the two circles. The head direction is then computed as:
d = Gr1 − Gr2
where d, the vector representing the head orientation, points from Gr2 to Gr1;
The result of head-direction recognition is shown in Figure 6, where 16 and 17 are the circles centered at Pn with radii r1 and r2, 18 and 19 are the centers of gravity Gr1 and Gr2, and 20 is the resulting head orientation.
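The two-circle construction can be sketched as follows, again assuming the outline is a list of (x, y) points; the names are illustrative, not the patented implementation itself:

```python
import math

def head_direction(contour, pn, r1, r2):
    """Estimate the head-direction vector from outline points near the nose tip.

    pn : nose tip (x, y); r1 < r2. Gr1 is the centroid of outline points within
    r1 of pn (head center), Gr2 the centroid of points within r2 (near the
    neck). The head points from Gr2 towards Gr1.
    """
    def centroid(pts):
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    gr1 = centroid([p for p in contour if dist(p, pn) <= r1])
    gr2 = centroid([p for p in contour if dist(p, pn) <= r2])
    return (gr1[0] - gr2[0], gr1[1] - gr2[1])  # vector Gr2 -> Gr1
```

With the body extending away from the nose tip, Gr2 sits farther back than Gr1, so the returned vector points from the neck toward the head, as in the text.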
The other steps and parameters are the same as in one of specific implementations one to six.
Specific implementation eight: this implementation differs from specific implementations one to seven in that the values of r1 and r2 should vary with the size of the mouse. As an initial choice, r1 is roughly the length of the mouse's head and r2 roughly twice the distance from the nose tip to the neck; r1 and r2 can then be adjusted until the resulting Gr1 lies at the center of the head and Gr2 at the neck connecting the head and body.
The other steps and parameters are the same as in one of specific implementations one to seven.
Specific implementation nine: this implementation differs from specific implementations one to eight in that, in step three, the optokinetic response of the mouse is identified based on the outlines of the body, ears, and tail obtained in step one and the head orientation obtained in step two. Building on the two preceding steps, the head orientation and its changes can be tracked continuously in the video, and analyzing the tracking data yields the optokinetic-response data. The specific process is as follows:
Step 31: construct the head-movement curve of the mouse;
Step 32: extract the optokinetic response; the specific process is as follows:
Step 321: preliminary extraction of the optokinetic response based on differences;
Step 322: determine the optokinetic response with the least-squares method based on step 321.
The other steps and parameters are the same as in one of specific implementations one to eight.
Specific implementation ten: this implementation differs from specific implementations one to nine in that, in step 31, the head-movement curve of the mouse is constructed; the specific process is as follows:
Based on step two, each frame yields a head orientation. Establish a coordinate system and convert the direction into an angle: with Gr2 as the origin, the line to Gr1 gives the head angle in each frame, expressed as:
θ = atan2(Gr1·y − Gr2·y, Gr1·x − Gr2·x)
where θ is the angle computed from the head orientation, Gr1·x and Gr1·y are the horizontal and vertical coordinates of Gr1 in the pixel coordinate system, and Gr2·x and Gr2·y are those of Gr2;
Note that when the angle is computed from the direction of the line connecting Gr1 and Gr2, it can only vary within −180° to 180°. When the data move back and forth near ±180°, the curve loses the continuity of small differential changes, which disturbs the later judgment of head turning. Therefore, whenever the data jump from near −180° to near 180° or vice versa, the current value is shifted down or up by 360° so that it keeps a small, continuous difference from the previous value.
Connecting the angles of consecutive video frames forms the head-movement curve of the mouse.
The other steps and parameters are the same as in one of specific implementations one to nine.
Specific implementation eleven: this implementation differs from specific implementations one to ten in the difference-based preliminary extraction of the optokinetic response in step 321; the specific process is as follows:
The optokinetic response of a mouse following the grating is smooth rather than abrupt; that is, its speed changes little. The speed change can be judged from the first- and second-derivative curves of the head-orientation angle; for a discrete signal, this means computing the first- and second-order differences.
The formula for the first-order difference is:
θ′i = θi − θi−1 (6)
where θi and θi−1 are the angle values of the i-th and (i−1)-th head orientations in the head-movement curve, and θ′i is the i-th first-order difference of the head-movement curve;
The formula for the second-order difference is:
θ″i = θ′i − θ′i−1 (7)
where θ′i and θ′i−1 are the i-th and (i−1)-th first-order differences of the head-movement curve, and θ″i is the i-th second-order difference;
When the mouse shows an optokinetic response following the grating, its first- and second-order differences are relatively small; in other situations they are relatively large. The first- and second-order differences can therefore be used to detect whether the head orientation is changing drastically.
To make the differences distinguish the severity of head-orientation changes more clearly, they can be integrated locally, so that places with larger differences yield larger detection values and drastic head movements become easier to detect. The local integral of the first-order difference is:
θ′isum = Σ_{j=i+1−z1}^{i} θ′j (8)
where θ′isum is the local integral of the z1 consecutive first-order differences up to and including θ′i, z1 is the length of the integration interval, and θ′j is the j-th first-order difference of the head-movement curve, with j running from i+1−z1 to i;
As z1 increases, the difference signal becomes more pronounced and detection more stable. But if z1 is too large, the integral no longer represents the current change of the head angle and detection errors result; empirically, z1 = 5 is a reasonable choice.
Similarly, the local integral of the second-order difference is:
θ″isum = Σ_{j=i+1−z2}^{i} θ″j (9)
where θ″j is the j-th second-order difference of the head-movement curve and θ″isum is the local integral of the z2 consecutive second-order differences up to and including θ″i; empirically, z2 = 2;
When the local integral of the first-order difference exceeds the first-order integral threshold, or the local integral of the second-order difference exceeds the second-order integral threshold, a non-smooth movement is recorded;
When the number of non-smooth movements exceeds the count threshold, the corresponding interval is recorded as a non-smooth interval. Removing the non-smooth intervals from the whole head-movement curve leaves the smooth intervals, which contain the optokinetic responses; this completes the preliminary extraction of the optokinetic-response regions;
For example, the first-order integral threshold is 14°, the second-order integral threshold is 7°, and the count threshold is 3.
The first-order difference excludes the steep parts of the curve, the second-order difference excludes the parts with particularly violent fluctuations, and the remaining curve is the part containing the optokinetic response.
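Step 321 can be sketched as follows. The exact form of the local integral and the interval bookkeeping are not fully specified in the text, so taking the absolute value of the local sum and the simple per-frame removal below are assumptions:

```python
def smooth_intervals(theta, z1=5, z2=2, t1=14.0, t2=7.0):
    """Drop frames whose locally integrated first/second differences exceed
    the thresholds t1/t2; return the indices of the remaining smooth frames.

    theta : unwrapped head-angle curve in degrees, one value per frame.
    """
    d1 = [theta[i] - theta[i - 1] for i in range(1, len(theta))]  # formula (6)
    d2 = [d1[i] - d1[i - 1] for i in range(1, len(d1))]           # formula (7)

    def local_integral(d, z, i):
        # Sum of the z differences up to and including index i; the absolute
        # value (an assumption) measures the magnitude of the local change.
        return abs(sum(d[max(0, i + 1 - z):i + 1]))

    rough = set()
    for i in range(len(d2)):
        if (local_integral(d1, z1, i + 1) > t1 or
                local_integral(d2, z2, i) > t2):
            rough.add(i + 2)  # frame the i-th second difference refers to
    return [i for i in range(len(theta)) if i not in rough]
```

A fuller version would additionally apply the count threshold (e.g. 3) before discarding an entire interval, as the text describes.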
Figures 7a and 7b show the first- and second-order differences of the head-angle curve. When the head moves violently, the difference data show obvious peaks; by segmenting these peaks, the intervals containing the optokinetic response can be extracted.
The other steps and parameters are the same as in one of specific implementations one to ten.
Specific implementation twelve: this implementation differs from specific implementations one to eleven in that, in step 322, the optokinetic response is determined with the least-squares method based on step 321; the specific process is as follows:
When a mouse follows the grating, the speed of its optokinetic response lies within a certain range, so analyzing the speed over the regions where the head orientation changes smoothly determines whether an optokinetic response occurred. The speed of head-orientation change can be represented by a slope.
The curve of the optokinetic-response region extracted in step 321 is fitted with the least-squares method; the slope and correlation coefficient of the fitted line evaluate the speed of head-orientation change. The specific process is as follows:
For n observations (x1, y1), (x2, y2), ..., (xn, yn), each a point on the curve of the optokinetic-response region extracted in step 321, the least-squares slope is computed as follows:
The mean of the independent variable is:
x̄ = (1/n) Σ_{k=1}^{n} xk (10)
The mean of the dependent variable is:
ȳ = (1/n) Σ_{k=1}^{n} yk (11)
The slope is computed as:
b = Σ_{k=1}^{n} (xk − x̄)(yk − ȳ) / Σ_{k=1}^{n} (xk − x̄)² (12)
where xk and yk are the horizontal and vertical coordinates of the k-th observation;
The correlation coefficient is computed as:
r = Σ_{k=1}^{n} (xk − x̄)(yk − ȳ) / √(Σ_{k=1}^{n} (xk − x̄)² · Σ_{k=1}^{n} (yk − ȳ)²) (13)
where xk and yk are the horizontal and vertical coordinates of the k-th observation;
The smooth intervals obtained are divided in turn into a series of small intervals of fixed length (e.g. 30 points), with partial overlap between adjacent small intervals (e.g. an overlap of 20 points);
The slope of each small interval is computed from formulas (10), (11), and (12), and the correlation coefficient from formula (13); from the slope and correlation coefficient it can be determined whether the speed of head-orientation change within the small interval lies within the optokinetic-response range;
If the slope and correlation coefficient lie within the optokinetic-response range, an optokinetic response is judged to have occurred in the small interval;
If the slope and correlation coefficient do not lie within the optokinetic-response range, no optokinetic response is judged to have occurred in the small interval;
Consecutive small intervals judged as optokinetic responses are merged into large intervals, each of which is counted as one optokinetic response. This determines the number of optokinetic responses of the mouse and their durations;
The optokinetic-response range is a slope between −0.65 and −0.1 (inclusive) with a correlation coefficient greater than −1 and at most −0.8, or a slope between 0.1 and 0.65 (inclusive) with a correlation coefficient of at least 0.8 and less than 1.
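Step 322 can be sketched as a sliding-window least-squares fit. Window length, overlap, and the slope/correlation ranges below follow the example values in the text; the function names are illustrative:

```python
import math

def fit_line(points):
    """Least-squares slope and correlation coefficient of (x, y) samples,
    following formulas (10)-(13)."""
    n = len(points)
    xm = sum(x for x, _ in points) / n  # mean of the independent variable
    ym = sum(y for _, y in points) / n  # mean of the dependent variable
    sxy = sum((x - xm) * (y - ym) for x, y in points)
    sxx = sum((x - xm) ** 2 for x, _ in points)
    syy = sum((y - ym) ** 2 for _, y in points)
    if sxx == 0 or syy == 0:  # degenerate (constant) segment
        return 0.0, 0.0
    return sxy / sxx, sxy / math.sqrt(sxx * syy)

def okr_windows(points, win=30, step=10):
    """Slide a fixed-length window (overlap = win - step) over a smooth segment
    and flag windows whose slope and correlation coefficient fall inside the
    optokinetic-response ranges given in the text."""
    flagged = []
    for start in range(0, len(points) - win + 1, step):
        slope, r = fit_line(points[start:start + win])
        if (0.1 <= slope <= 0.65 and 0.8 <= r < 1) or \
           (-0.65 <= slope <= -0.1 and -1 < r <= -0.8):
            flagged.append((start, start + win))
    return flagged
```

Merging consecutive flagged windows then gives the large intervals counted as individual optokinetic responses.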
The number of optokinetic responses judged in the video and their durations are recorded.
The curve in Figure 8 shows the identified optokinetic-response state of the mouse: seven responses were identified, lasting 13.87 s in total.
The automatic mouse optokinetic-response recognition method established by the present invention can provide recognition data in real time during the experiment, including the number of optokinetic responses, their durations, and their synchronization rate with the stripe speed.
The other steps and parameters are the same as in one of specific implementations one to eleven.
The present invention may have various other embodiments. Without departing from its spirit and essence, those skilled in the art can make corresponding changes and variations, all of which fall within the scope of protection of the appended claims.
Claims (4)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010001628.9A CN111144379B (en) | 2020-01-02 | 2020-01-02 | Automatic identification method for visual dynamic response of mice based on image technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111144379A CN111144379A (en) | 2020-05-12 |
CN111144379B true CN111144379B (en) | 2023-05-23 |