CN107977401B - Visualization method for eye tracking mode identification comparison based on matrix structure - Google Patents
- Publication number
- CN107977401B CN107977401B CN201710971090.2A CN201710971090A CN107977401B CN 107977401 B CN107977401 B CN 107977401B CN 201710971090 A CN201710971090 A CN 201710971090A CN 107977401 B CN107977401 B CN 107977401B
- Authority
- CN
- China
- Prior art keywords
- matrix
- data
- image
- intensity value
- eye tracking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/26—Visual data mining; Browsing structured data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Pure & Applied Mathematics (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Computational Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Computing Systems (AREA)
- Algebra (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a visualization method for identifying and comparing eye tracking patterns based on a matrix structure, comprising the following steps: (1) prepare a hotspot graph A of a first set of data; (2) prepare a hotspot graph B of a second set of data; (3) create a new matrix C in which each point is a linear-algebraic combination of the corresponding points of A and B; the interval of matrix C is [-1, 1], even though the values of an ordinary hotspot matrix are never negative; (4) create an image in which negative values in matrix C are shaded stepwise with one color and positive values with another, strongly contrasting color. The invention thus provides a novel visualization method that presents and compares two sets of data simultaneously through a "heat difference map", showing multilayer data more effectively and intuitively.
Description
Technical Field
The invention belongs to the field of pattern recognition, and particularly relates to a visualization method for recognizing and comparing eye tracking patterns based on a matrix structure.
Background
The hotspot graph (heat map) is a common method of presenting spatially anchored distribution data. An intensity ("heat") is applied at each data point in proportion to its value: the greater the intensity, the stronger the mark, and the intensity dissipates into nearby areas much as heat radiates from a source. The intensity values are rendered with an appropriate gradient of color shading.
Eye tracking is a method of collecting information about where the eyes focus. One or more cameras track the subject's visual behavior and provide a series of data, such as the subject's head position and the positions of the eyes and pupils; from these, the visual target is computed. Applications are numerous, including cognitive research, medical research, and a growing number of commercial uses.
Fig. 1 shows a prior-art hotspot graph built from eye movement data collected by Kootstra. A subject viewed a picture from the McGill color image gallery for 5 seconds, and the subject's visual activity over those 5 seconds was fully recorded. The red points mark the gaze points over the whole duration of the gaze activity; overlaid on the original image, they form a hotspot graph of that activity, with hotter (redder) colors indicating more gaze. When data are studied over a period of time, the hotspot graph can be augmented with a decay parameter that mimics the decay of visual tracking intensity, so that a hot spot slowly "cools" until it eventually disappears; this is especially important for video feedback. The hotspot graph is thus an effective visualization tool in drawing, user interface design, and other specialties. However, it can present only one subject's data at a time, so different subjects' visual behavior cannot be compared: overlaying several people's data simultaneously is confusing and defeats the purpose of the visualization. For comparing two or more sets of visual data, no visualization method has existed so far.
Eye movement technology has gradually moved from the laboratory into various applications. In brief, by tracking human eye movement as data, the technology can serve many kinds of inferential scientific research and control applications (eye control). However, eye movement behavior is rather complicated, the data are huge and scattered, and relatively little data-processing application software exists.
Disclosure of Invention
The purpose of the invention is as follows: in view of the above shortcomings of the prior art, the present invention provides a visualization method for eye tracking pattern recognition and comparison based on a matrix structure, which presents and compares two sets of data simultaneously through a "heat difference map".
The technical scheme is as follows: a visualization method for eye tracking pattern recognition and comparison based on a matrix structure comprises the following steps:
(1) extract a first set of data A of an image, where A is an intensity value matrix extracted with any pixel point of the image as the visual center, and the hotspot graph obtained from this intensity value matrix is the same size as the original image;
(2) extract a second set of data B of the image, where B is an intensity value matrix extracted with a pixel point not selected in step (1) as the visual center, and the hotspot graph obtained from B's intensity value matrix is the same size as the hotspot graph obtained in step (1);
(3) obtain a new matrix C from the data of step (1) and step (2) by the following equation:

Cij = Aij / maxA - Bij / maxB

where Aij and Bij are the elements of matrices A and B at point (i, j), and maxA and maxB are the maximum values in matrices A and B; the interval of matrix C is [-1, 1], namely:

-1 ≤ Cij ≤ 1
(4) assign colors to the matrix obtained in step (3) to produce a visualized image, in which negative values in matrix C are shaded stepwise with one color and positive values with another, strongly contrasting color.
Further, the extraction of image intensity value matrix data with image pixel points as visual centers includes two or more sets of data, and their intensity value matrices, extracted as in step (1) and step (2).
Further, step (3) includes data conversion, size comparison and classification.
Furthermore, the new matrix C obtained by the equation in step (3) may combine any two or more sets of data, and their intensity value matrices, extracted from any pixel points of the image.
Further, step (4) includes repeating the coloring for each frame of a motion picture or video, introducing decay parameters into the matrix to model regression to the mean.
Further, introducing the decay parameters into the matrix includes a normalization of the intensity values.
Further, step (4) includes expressing the result obtained in step (3) with cold and warm colors.
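The per-frame decay with normalization described above can be sketched in NumPy. This is only a minimal illustration under assumed conventions: the function name `update_frame`, the decay factor 0.9, and renormalizing only when values leave [-1, 1] are not specified by the patent.

```python
import numpy as np

def update_frame(C, frame_diff, decay=0.9):
    """Per-frame update of the difference matrix for video.

    Previous differences fade toward the 'ambient' zero value
    (regression to the mean) before the current frame's difference is
    added; the result is renormalized only if it leaves [-1, 1].
    """
    C = decay * C + frame_diff
    m = np.abs(C).max()
    return C / m if m > 1.0 else C
```

A hot spot that receives no new difference thus shrinks by the decay factor each frame until it effectively disappears.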
Advantageous effects: compared with the prior art, the invention has the following notable features. The invention provides a novel visualization method that presents and compares two sets of data simultaneously through a heat difference graph, improves the visualization effect, and makes the difference between the two sets of data apparent at a glance. Most importantly, the two sets of data are not both shown in full in the "heat difference map". Through the linear algebra of a difference matrix, the invention renders one set of data as the "hot group" (expressed in warm colors) and the other as the "cool group" (expressed in cool colors), and treats the parts of the image where the two agree as "ambient temperature". These properties then dissipate or interact like the physical concepts of heat and cold. This makes the comparison of the data visible.
Drawings
FIG. 1 is an example of a hotspot graph;
FIG. 2 is a flow chart of the first and second steps of the present invention;
FIG. 3 is a flow chart of the third and fourth steps of the present invention;
FIG. 4 is a flow chart of the present invention;
FIG. 5 is a first set of hotspot graphs;
FIG. 6 is a second set of hotspot graph comparisons;
FIG. 7 is a third set of hotspot graphs;
FIG. 8 is a comparison of a first set of hotspot graphs with other hotspot graphs.
In figs. 2, 3 and 4: A, B, C are matrices; D1 and D2 are the input data sets, given as fixation lists; D[i].x denotes the x coordinate of the ith observation in set D; i and j are integers.
Detailed Description
In order to explain the technical scheme disclosed by the invention in detail, it is further described below with reference to a specific embodiment and the drawings of the specification.
The method adopted by the technical scheme makes the comparison by analogy with physics: one set of data is selected as the "hot group" (i.e., the group expressed in warm colors), the other as the "cool group" (expressed in cool colors), and the parts of the image where the data agree are treated as "ambient temperature". Cold and warm colors are used so that the difference between the two groups is clearly distinguished. These properties then dissipate or interact like the physical concepts of heat and cold. This makes the comparison of the data visible.
The most common form of eye movement data is gaze time and position, i.e., the time and position of the eye's point of attention when viewing an image or video. The comparison most commonly needed is between the visual activity of different observers facing the same image. The figures of the specific embodiment therefore involve two variables: the observed image itself is fixed, while the gaze points of different observers vary. The gaze points are distributed over the observed image and show how each observer views it. The observed image needs no further data processing; the invention takes the gaze point data as the variable.
In general, in software shipped with an eye tracker, the usual representation of gaze points is a "hotspot map": the longer the gaze time, the larger the spot and the warmer the color. But it can only show "hot spots", i.e., the set of gazes of one observer, or of all observers, on some part of some figure. It cannot show differences. The present invention is directed precisely at the "differences".
The flow of the visualization method for eye tracking pattern recognition comparison based on a matrix structure is shown in fig. 4: two hotspot map matrices are read in, a new empty matrix is created, and the difference between the two matrices is calculated and expressed in it.
In each matrix, each cell represents a pixel of the viewed picture. Matrices A and B hold the eye movement data of two different groups of people observing the same picture, so A and B are equal in size, and matrix C has the same dimensions as A and B. However, A and B hold actual eye movement data (gaze duration, gaze location, etc.), while C represents the difference between A and B at each gaze location: 1 marks a pixel most gazed at by A's subjects and not gazed at at all by B's subjects, and -1 marks a pixel not gazed at by A's subjects but gazed at by B's.
The technical scheme provided by the invention specifically comprises the following steps:
(1) Extract a hotspot diagram of the first image; the hotspot diagram of this set of data is called A. The data of group A are represented by an intensity value matrix of the same size as the image, obtained as follows: first select the first set of gaze data, as shown in equation 1; then, for each value k between 1 and n and each pixel point (i, j) less than r away from the gaze center (equation 2), update the point according to equation 3. The equations are as follows:
Equation 1:
D = f1, f2, ..., fn
Equation 2:
|(i, j) - fk| < r
Equation 3:
Aij = Aij + r - |(i, j) - fk|
where r is the radius of a gaze point. A gaze point is a circular dot whose size reflects the length of the gaze time: longer gaze times yield larger gaze points. The gaze points themselves are standard output of any eye tracking software and are the input data for the present invention. In this embodiment, the default value of r is 30 pixels, adjustable according to the size of the observed image.
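The accumulation defined by equations 1 to 3 can be sketched in NumPy as follows. This is a minimal illustration, not the patented implementation; the function name `hotspot_matrix` and the (x, y) fixation tuple format are assumptions.

```python
import numpy as np

def hotspot_matrix(shape, fixations, r=30):
    """Build an intensity value matrix from a list of gaze fixations.

    `shape` is (rows, cols) of the observed image; `fixations` is the
    gaze data D = f1, ..., fn of equation 1, given as (x, y) centers.
    Every pixel (i, j) closer than r to a fixation center (equation 2)
    gains intensity r - distance (equation 3).
    """
    A = np.zeros(shape)
    ys, xs = np.indices(shape)          # row and column index grids
    for fx, fy in fixations:
        dist = np.hypot(xs - fx, ys - fy)
        near = dist < r                 # equation 2: |(i, j) - fk| < r
        A[near] += r - dist[near]       # equation 3: accumulate intensity
    return A
```

Note that intensity contributions from overlapping fixations simply add, so longer or repeated gazes at the same spot produce a hotter peak.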
(2) Extract the second hotspot diagram of the image; the hotspot diagram of this set of data is called B, and the hotspot data are likewise expressed as a matrix of the same size as the image. It is obtained in the same manner as A.
(3) Construct a new matrix; the data group holding the image hotspot matrix is called C, where each point of matrix C is:

Cij = Aij / maxA - Bij / maxB

where Aij denotes the element at point (i, j) in matrix A and maxA is the maximum value in matrix A; Bij is the element at point (i, j) in matrix B and maxB is the maximum value in matrix B. Dividing by maxA and maxB scales the two sets of data to the same unit. Note that, although the values of a conventional hotspot graph are never negative, the interval of matrix C is [-1, 1].
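The normalization-and-subtraction step above is a one-line matrix operation; the sketch below (function name `difference_matrix` is an assumption) shows it in NumPy.

```python
import numpy as np

def difference_matrix(A, B):
    """Compute Cij = Aij / maxA - Bij / maxB.

    A and B are same-size hotspot matrices; after dividing each by its
    maximum, every value lies in [0, 1], so the difference C lies in
    [-1, 1]: +1 where only A's subjects gaze, -1 where only B's do,
    and 0 where the normalized gaze intensities agree.
    """
    return A / A.max() - B / B.max()
```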
(4) Construct a new image from the hotspot matrix obtained in step (3); the image shows the two sets of gaze points expressed in cold and warm colors. One set of colors marks all the points where the first set of gazes most exceeds the second; the other set of colors marks all the points where the second set most exceeds the first. When the two sets of gaze points are dispersed with no intersection, each color set simply shows each group's own gaze points; when the two sets of gaze points interleave, the invention expresses the respective gaze centers of the two groups. Previous visualization methods could not express such interleaved differences well, because the difference is not a simple superposition, and prior visualizations used only superposition.
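One possible warm/cool coloring of the difference matrix is sketched below. The red-for-positive, blue-for-negative, white-for-ambient scheme is an assumption for illustration; the patent only requires two strongly contrasting colors.

```python
import numpy as np

def diff_to_rgb(C):
    """Render a difference matrix C in [-1, 1] as an RGB image.

    Positive values shade toward warm red, negative values toward cool
    blue, and zero stays white (the 'ambient temperature').
    """
    h, w = C.shape
    img = np.ones((h, w, 3))       # start at white
    pos = np.clip(C, 0.0, 1.0)     # first group dominates here
    neg = np.clip(-C, 0.0, 1.0)    # second group dominates here
    img[..., 1] -= pos + neg       # any difference removes green
    img[..., 2] -= pos             # positive: remove blue -> toward red
    img[..., 0] -= neg             # negative: remove red -> toward blue
    return np.clip(img, 0.0, 1.0)
```

Because only nonzero cells of C change color, regions that neither group gazes at, or that both gaze at equally, remain unmarked, exactly as the "heat difference map" requires.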
Compare fig. 1 and fig. 5: fig. 5 is the gaze hotspot graph of another subject on the same picture, but here the hot spots are represented by a cool blue and its gradient. Simply superimposing fig. 1 and fig. 5 is not a correct method of comparison, since the magnitude of the difference is not apparent in the superimposed plot. FIG. 6 shows the hotspot difference set of the same two subjects. Although a hotspot itself is opaque, the portion where the two subjects' hotspots intersect can have its transparency adjusted, which can be very useful in some applications.
In addition, the method can also compare one subject against the average of a group of subjects. FIG. 7 depicts the nth subject in the Kootstra paper; FIG. 8 reflects the differences between subject n and the pool of subjects 1 to n-1, so that the features of the nth subject within the population are shown at once.
As a practical application, consider driver training: we present a video of driving traffic. First we take a series of eye movement recordings of skilled drivers and aggregate the results into a group hotspot graph. A trainee then watches the same video while his eye movements are recorded. The attention difference hotspot graph between the experienced drivers and the trainee can then be presented immediately as feedback, letting the trainee's learning efficiency improve faster than with other methods. The hotspot difference map reveals exactly these differences: the parts of the screen that neither gazes at are not marked, the parts gazed at equally are not marked, and only the places with differences are color-coded, so the result reflects how the subject's gaze differs from others' at the same location. When rendering the difference set of two subjects' hot spots, the intersection of the two subjects' hot spots can have its transparency adjusted, even though the hot spots themselves are opaque, which is very useful in some applications. Although driving is used as the example, the method applies equally to other areas where visual attention matters, from machinery to sports competition.
Claims (5)
1. A visualization method for eye tracking pattern recognition and comparison based on a matrix structure, characterized by comprising the following steps:
(1) extract a first set of data A of an image, where A is an intensity value matrix extracted with any pixel point of the image as the visual center, and the hotspot graph obtained from this intensity value matrix is the same size as the original image;
(2) extract a second set of data B of the image, where B is an intensity value matrix extracted with a pixel point not selected in step (1) as the visual center, and the hotspot graph obtained from this intensity value matrix is the same size as the hotspot graph obtained in step (1);
(3) obtain a new matrix C from the data of step (1) and step (2) by the following equation:

Cij = Aij / maxA - Bij / maxB

where Aij and Bij are the elements of matrices A and B at point (i, j), and maxA and maxB are the maximum values in matrices A and B; the interval of matrix C is [-1, 1], namely:

-1 ≤ Cij ≤ 1
(4) assign colors to matrix C obtained in step (3) to construct a visualized image: positive and negative values in matrix C express different gaze points through cold and warm colors, negative values being shaded stepwise with one color and positive values with another, strongly contrasting color; the coloring based on matrix C is repeated for each frame of a dynamic picture or video, with decay parameters introduced into the matrix to simulate regression to the mean.
2. The visualization method for eye tracking pattern recognition and comparison based on a matrix structure according to claim 1, wherein: the extraction of image intensity value matrix data with image pixel points as visual centers includes two or more sets of data, and their intensity value matrices, extracted in step (1) and step (2).
3. The visualization method for eye tracking pattern recognition and comparison based on a matrix structure according to claim 1, wherein: step (3) includes converting, comparing and classifying the data.
4. The visualization method for eye tracking pattern recognition and comparison based on a matrix structure according to claim 1, wherein: the new matrix C obtained by the equation in step (3) may combine any two or more sets of data, and their intensity value matrices, extracted from any pixel points of the image.
5. The visualization method for eye tracking pattern recognition and comparison based on a matrix structure according to claim 1, wherein: the introduction of decay parameters into the matrix in step (4) includes a normalization of the intensity values.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710971090.2A CN107977401B (en) | 2017-10-18 | 2017-10-18 | Visualization method for eye tracking mode identification comparison based on matrix structure |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107977401A CN107977401A (en) | 2018-05-01 |
CN107977401B (en) | 2022-04-26
Family
ID=62012490
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710971090.2A Active CN107977401B (en) | 2017-10-18 | 2017-10-18 | Visualization method for eye tracking mode identification comparison based on matrix structure |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107977401B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101587542A (en) * | 2009-06-26 | 2009-11-25 | 上海大学 | Field depth blending strengthening display method and system based on eye movement tracking |
CN106062665A (en) * | 2013-09-11 | 2016-10-26 | 深圳市汇顶科技股份有限公司 | User interface based on optical sensing and tracking of user's eye movement and position |
US20170108923A1 (en) * | 2015-10-14 | 2017-04-20 | Ecole Nationale De L'aviation Civile | Historical representation in gaze tracking interface |
CN106959749A (en) * | 2017-02-20 | 2017-07-18 | 浙江工业大学 | Visual attention behavior collaborative visualization method and system based on eye movement tracking data |
CN107133584A (en) * | 2017-04-27 | 2017-09-05 | 贵州大学 | Implicit intention assessment sorting technique based on eye-tracking |
Non-Patent Citations (2)
Title |
---|
Model-based Real-time Visualization of Realistic Three-Dimensional Heat Maps for Mobile Eye Tracking and Eye Tracking in Virtual Reality; Thies Pfeiffer et al.; ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications; March 2016; pp. 95-102 *
A Survey on Eye Movement Data Visualization; Cheng Shiwei; Journal of Computer-Aided Design & Computer Graphics; May 2014; vol. 26, no. 5, pp. 698-707 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||