CN117058310A - Electronic sand table system - Google Patents
- Publication number
- CN117058310A (application number CN202310861397.2A)
- Authority
- CN
- China
- Prior art keywords
- sand table
- dimensional
- electronic sand
- rectangular coordinate
- coordinate system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention provides an electronic sand table system comprising a data acquisition unit, a dynamic projection unit, a dynamic interaction unit, a three-dimensional animation generation unit and a three-dimensional display unit. The data acquisition unit acquires a three-dimensional model of the object to be projected; the dynamic projection unit acquires a scene graph of the electronic sand table, projects the three-dimensional model of the object to be projected into the scene graph, and generates a dynamic background image; the dynamic interaction unit captures gesture motion signals of a user in the electronic sand table area and generates corresponding gesture motion tracks; the three-dimensional animation generation unit generates corresponding execution-action animations in the electronic sand table according to the gesture motion tracks; and the three-dimensional display unit displays the dynamic background image and the execution-action animations in the electronic sand table. The electronic sand table system can be widely applied to fields such as scene simulation, simulation experiments, exhibition and demonstration, and interactive teaching; it is visual and vivid and offers a better display effect.
Description
Technical Field
The invention relates to the technical field of sand tables, in particular to an electronic sand table system.
Background
The traditional sand table is built from a topographic map, an aerial photograph or the on-site terrain and, according to a certain scale, uses models piled from sand, chess pieces and other materials to visually display the terrain, object deployment and other conditions. It is commonly used for building demonstrations, urban planning and exhibition-hall guidance. However, the traditional sand table's display form is single and static and cannot give the audience a vivid impression.
The electronic sand table, also called a digital sand table, adds multimedia display and interaction functions to the traditional physical sand table by applying technologies such as multi-channel projection image splicing and intelligent media device control. Most existing electronic sand tables adopt non-contact interaction, but their recognition accuracy is coarse and fine-grained control is insufficient, so the user experience is poor.
Disclosure of Invention
The invention aims to provide an electronic sand table system which can be widely applied to the fields of scene simulation, simulation experiments, exhibition and demonstration, interactive teaching and the like.
The embodiment of the invention is realized by the following technical scheme:
an electronic sand table system comprises a data acquisition unit, a dynamic projection unit, a dynamic interaction unit, a three-dimensional animation generation unit and a three-dimensional display unit;
the data acquisition unit is used for acquiring a three-dimensional model of the object to be projected;
the dynamic projection unit is used for acquiring a scene graph of the electronic sand table, projecting a three-dimensional model of an object to be projected into the scene graph, and generating a dynamic background graph;
the dynamic interaction unit is used for capturing gesture motion signals of a user in the electronic sand table area and generating corresponding gesture motion tracks;
the three-dimensional animation generation unit is used for generating corresponding execution action animations in the electronic sand table according to the gesture motion trail;
the three-dimensional display unit is used for displaying the dynamic background image and executing the action animation in the electronic sand table.
The technical scheme of the embodiment of the invention has at least the following advantages and beneficial effects:
(1) The electronic sand table system generates the dynamic background image by projecting the object, which improves the stereoscopic impression of the electronic sand table and makes the display more comprehensive and vivid;
(2) The electronic sand table system analyzes the gesture motion track to generate and display the execution-action animation, which avoids the problem that the traditional electronic sand table cannot convey interaction actions in real time and enhances the user experience;
(3) The electronic sand table system can be widely applied to fields such as scene simulation, simulation experiments, exhibition and demonstration, and interactive teaching; it is visual and vivid and offers a better display effect.
Further, the specific method for generating the dynamic background image by the dynamic projection unit comprises the following steps:
generating corresponding two-dimensional coordinates according to the three-dimensional model of the object to be projected;
preprocessing the two-dimensional coordinates corresponding to the three-dimensional model;
calculating a projection factor corresponding to the preprocessed two-dimensional coordinates;
and projecting the three-dimensional model into the scene graph according to the projection factors corresponding to the preprocessed two-dimensional coordinates, and generating a dynamic background graph.
Further, the method for generating the two-dimensional coordinates corresponding to the three-dimensional model specifically comprises the following steps: establishing a rectangular coordinate system by taking the centroid of the scene graph as an origin; and projecting vertex coordinates corresponding to each triangular patch in the three-dimensional model into a rectangular coordinate system, and generating corresponding two-dimensional coordinates in the rectangular coordinate system.
Further, the specific method for preprocessing the two-dimensional coordinates corresponding to the three-dimensional model comprises the following steps: and carrying out normalization processing on the two-dimensional coordinates corresponding to the three-dimensional model.
Further, the calculation formula of the projection factor σ is:
wherein M represents the number of two-dimensional coordinates, x_m and y_m represent the abscissa and ordinate of the m-th two-dimensional coordinate in the rectangular coordinate system, and x_0 and y_0 represent the abscissa and ordinate of the centroid of the scene graph in the rectangular coordinate system.
Further, the method for generating the dynamic background map specifically comprises the following steps: and taking the product of the vertex coordinates corresponding to each triangular patch in the three-dimensional model and the projection factors as final projection coordinates of each triangular patch, and projecting the final projection coordinates into a rectangular coordinate system to generate a dynamic background image.
Further, the specific method for generating the gesture motion trail by the dynamic interaction unit comprises the following steps:
capturing gesture motion signals of a user at all moments in an electronic sand table area, and constructing a motion trail matrix;
constructing a rectangular coordinate system by taking the centroid of the dynamic background diagram as an origin;
and calculating the position coordinates of the gesture motion signals at all moments in a rectangular coordinate system according to the motion track matrix, and generating corresponding gesture motion tracks.
Further, the expression of the motion track matrix Z is: Z = [z_1, z_2, ..., z_N]^T, wherein N represents the total number of sampling moments, z_i represents the myoelectric signal at the i-th moment, and i = 1, 2, ..., N.
Further, the calculation formula of the position coordinates of the gesture motion signal at the ith moment in the rectangular coordinate system is as follows:
wherein a_i represents the abscissa and b_i the ordinate of the gesture motion signal at the i-th moment in the rectangular coordinate system, λ represents an eigenvalue of the motion track matrix Z, and I represents the identity matrix.
Further, the three-dimensional display unit adopts a pressure-sensitive touch screen.
Drawings
Fig. 1 is a schematic structural diagram of an electronic sand table system according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
As shown in fig. 1, the invention provides an electronic sand table system, which comprises a data acquisition unit, a dynamic projection unit, a dynamic interaction unit, a three-dimensional animation generation unit and a three-dimensional display unit;
the data acquisition unit is used for acquiring a three-dimensional model of the object to be projected;
the dynamic projection unit is used for acquiring a scene graph of the electronic sand table, projecting a three-dimensional model of an object to be projected into the scene graph, and generating a dynamic background graph;
the dynamic interaction unit is used for capturing gesture motion signals of a user in the electronic sand table area and generating corresponding gesture motion tracks;
the three-dimensional animation generation unit is used for generating corresponding execution action animations in the electronic sand table according to the gesture motion trail;
the three-dimensional display unit is used for displaying the dynamic background image and executing the action animation in the electronic sand table.
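The patent describes the five units above by role only, without specifying an interface. As a hypothetical illustration of the data flow between them (all function names here are assumptions, not from the source), one display cycle could be sketched as:

```python
# Hypothetical sketch of one display cycle of the electronic sand table.
# The five callables stand in for the five units described in the patent;
# their names and signatures are illustrative only.

def run_frame(acquire_model, render_background, capture_gesture,
              generate_animation, show):
    """Run one cycle: acquire, project, capture, animate, display."""
    model = acquire_model()                # data acquisition unit
    background = render_background(model)  # dynamic projection unit
    track = capture_gesture()              # dynamic interaction unit
    clip = generate_animation(track)       # 3D animation generation unit
    return show(background, clip)          # 3D display unit
```

Stub callables can exercise the pipeline end to end before any real hardware or renderer is attached.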
In the embodiment of the invention, the specific method for generating the dynamic background image by the dynamic projection unit comprises the following steps:
generating corresponding two-dimensional coordinates according to the three-dimensional model of the object to be projected;
preprocessing the two-dimensional coordinates corresponding to the three-dimensional model;
calculating a projection factor corresponding to the preprocessed two-dimensional coordinates;
and projecting the three-dimensional model into the scene graph according to the projection factors corresponding to the preprocessed two-dimensional coordinates, and generating a dynamic background graph.
In the embodiment of the invention, the method for generating the two-dimensional coordinates corresponding to the three-dimensional model specifically comprises the following steps: establishing a rectangular coordinate system by taking the centroid of the scene graph as an origin; and projecting vertex coordinates corresponding to each triangular patch in the three-dimensional model into a rectangular coordinate system, and generating corresponding two-dimensional coordinates in the rectangular coordinate system.
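The patent states that vertex coordinates of each triangular patch are projected into a rectangular coordinate system whose origin is the centroid of the scene graph, but does not name the projection. A simple orthographic reading (drop the depth coordinate, translate to the centroid origin) is assumed in this sketch:

```python
# Hedged sketch: orthographic projection of 3D patch vertices into the
# scene-centered 2D rectangular coordinate system. The patent does not
# specify the projection; dropping z is an assumption made for illustration.

def to_2d(vertices, centroid):
    """Map 3D vertices (x, y, z) to 2D coordinates relative to the centroid."""
    cx, cy = centroid
    return [(x - cx, y - cy) for (x, y, z) in vertices]
```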
In the embodiment of the invention, the specific method for preprocessing the two-dimensional coordinates corresponding to the three-dimensional model comprises the following steps: and carrying out normalization processing on the two-dimensional coordinates corresponding to the three-dimensional model.
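The patent only says the two-dimensional coordinates are normalized, without naming the scheme. Min-max scaling to the unit square is one common choice and is assumed here purely for illustration:

```python
# Hedged sketch of the normalization preprocessing step. Min-max scaling
# to [0, 1] is an assumption; the patent does not specify the scheme.

def normalize(coords):
    """Min-max normalize 2D coordinates into the unit square [0, 1] x [0, 1]."""
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]

    def scale(v, lo, hi):
        # Degenerate axis (all values equal) maps to 0 to avoid division by zero.
        return 0.0 if hi == lo else (v - lo) / (hi - lo)

    return [(scale(x, min(xs), max(xs)), scale(y, min(ys), max(ys)))
            for x, y in coords]
```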
In the embodiment of the present invention, the calculation formula of the projection factor σ is:
wherein M represents the number of two-dimensional coordinates, x_m and y_m represent the abscissa and ordinate of the m-th two-dimensional coordinate in the rectangular coordinate system, and x_0 and y_0 represent the abscissa and ordinate of the centroid of the scene graph in the rectangular coordinate system.
In the embodiment of the invention, the method for generating the dynamic background image specifically comprises the following steps: and taking the product of the vertex coordinates corresponding to each triangular patch in the three-dimensional model and the projection factors as final projection coordinates of each triangular patch, and projecting the final projection coordinates into a rectangular coordinate system to generate a dynamic background image.
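The formula image for the projection factor σ is not reproduced in this text; based on the variables it lists (M two-dimensional coordinates and the scene-graph centroid (x_0, y_0)), the mean distance to the centroid is used below strictly as a placeholder, and the actual σ may differ. The scaling of vertex coordinates by σ follows the step described above:

```python
import math

# Hedged sketch. projection_factor is a PLACEHOLDER for the patent's σ,
# whose exact formula is not reproduced in the source text; mean distance
# to the centroid merely uses the same variables the text describes.

def projection_factor(coords, centroid):
    """Placeholder σ: mean distance of the 2D coordinates to the centroid."""
    x0, y0 = centroid
    return sum(math.hypot(x - x0, y - y0) for x, y in coords) / len(coords)

def project_patch(vertices_2d, sigma):
    """Final projection coordinates: each vertex coordinate scaled by σ."""
    return [(sigma * x, sigma * y) for x, y in vertices_2d]
```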
In the embodiment of the invention, the specific method for generating the gesture motion trail by the dynamic interaction unit comprises the following steps:
capturing gesture motion signals of a user at all moments in an electronic sand table area, and constructing a motion trail matrix;
constructing a rectangular coordinate system by taking the centroid of the dynamic background diagram as an origin;
and calculating the position coordinates of the gesture motion signals at all moments in a rectangular coordinate system according to the motion track matrix, and generating corresponding gesture motion tracks.
In the embodiment of the invention, the expression of the motion track matrix Z is: Z = [z_1, z_2, ..., z_N]^T, wherein N represents the total number of sampling moments, z_i represents the myoelectric signal at the i-th moment, and i = 1, 2, ..., N.
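Constructing Z from N time-ordered myoelectric samples is straightforward; the sketch below assembles the N×1 column vector described above (signal acquisition itself is outside its scope):

```python
# Sketch of building the motion track matrix Z = [z_1, ..., z_N]^T from
# time-ordered myoelectric (EMG) samples. Represented as a plain list of
# 1-element rows, i.e. an N-by-1 column vector.

def build_track_matrix(samples):
    """Arrange N time-ordered signal samples as an N-by-1 column vector Z."""
    return [[float(z)] for z in samples]
```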
In the embodiment of the invention, the calculation formula of the position coordinates of the gesture motion signal at the ith moment in the rectangular coordinate system is as follows:
wherein a_i represents the abscissa and b_i the ordinate of the gesture motion signal at the i-th moment in the rectangular coordinate system, λ represents an eigenvalue of the motion track matrix Z, and I represents the identity matrix.
In the embodiment of the invention, the three-dimensional display unit adopts a pressure-sensitive touch screen.
The above is only a preferred embodiment of the present invention and is not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.
Claims (10)
1. The electronic sand table system is characterized by comprising a data acquisition unit, a dynamic projection unit, a dynamic interaction unit, a three-dimensional animation generation unit and a three-dimensional display unit;
the data acquisition unit is used for acquiring a three-dimensional model of an object to be projected;
the dynamic projection unit is used for acquiring a scene graph of the electronic sand table, projecting a three-dimensional model of an object to be projected into the scene graph and generating a dynamic background graph;
the dynamic interaction unit is used for capturing gesture motion signals of a user in the electronic sand table area and generating corresponding gesture motion tracks;
the three-dimensional animation generation unit is used for generating corresponding execution action animations in the electronic sand table according to the gesture motion trail;
the three-dimensional display unit is used for displaying the dynamic background image and executing the action animation in the electronic sand table.
2. The electronic sand table system of claim 1 wherein: the specific method for generating the dynamic background image by the dynamic projection unit comprises the following steps:
generating corresponding two-dimensional coordinates according to the three-dimensional model of the object to be projected;
preprocessing the two-dimensional coordinates corresponding to the three-dimensional model;
calculating a projection factor corresponding to the preprocessed two-dimensional coordinates;
and projecting the three-dimensional model into the scene graph according to the projection factors corresponding to the preprocessed two-dimensional coordinates, and generating a dynamic background graph.
3. The electronic sand table system of claim 2 wherein: the method for generating the two-dimensional coordinates corresponding to the three-dimensional model specifically comprises the following steps: establishing a rectangular coordinate system by taking the centroid of the scene graph as an origin; and projecting vertex coordinates corresponding to each triangular patch in the three-dimensional model into a rectangular coordinate system, and generating corresponding two-dimensional coordinates in the rectangular coordinate system.
4. The electronic sand table system of claim 2 wherein: the specific method for preprocessing the two-dimensional coordinates corresponding to the three-dimensional model comprises the following steps: and carrying out normalization processing on the two-dimensional coordinates corresponding to the three-dimensional model.
5. The electronic sand table system of claim 2 wherein: the calculation formula of the projection factor σ is as follows:
wherein M represents the number of two-dimensional coordinates, x_m and y_m represent the abscissa and ordinate of the m-th two-dimensional coordinate in the rectangular coordinate system, and x_0 and y_0 represent the abscissa and ordinate of the centroid of the scene graph in the rectangular coordinate system.
6. The electronic sand table system of claim 2 wherein: the method for generating the dynamic background image specifically comprises the following steps: and taking the product of the vertex coordinates corresponding to each triangular patch in the three-dimensional model and the projection factors as final projection coordinates of each triangular patch, and projecting the final projection coordinates into a rectangular coordinate system to generate a dynamic background image.
7. The electronic sand table system of claim 1 wherein: the specific method for generating the gesture motion trail by the dynamic interaction unit comprises the following steps:
capturing gesture motion signals of a user at all moments in an electronic sand table area, and constructing a motion trail matrix;
constructing a rectangular coordinate system by taking the centroid of the dynamic background diagram as an origin;
and calculating the position coordinates of the gesture motion signals at all moments in a rectangular coordinate system according to the motion track matrix, and generating corresponding gesture motion tracks.
8. The electronic sand table system of claim 7 wherein: the expression of the motion track matrix Z is: Z = [z_1, z_2, ..., z_N]^T, wherein N represents the total number of sampling moments, z_i represents the myoelectric signal at the i-th moment, and i = 1, 2, ..., N.
9. The electronic sand table system of claim 7 wherein: the calculation formula of the position coordinates of the gesture motion signal at the ith moment in the rectangular coordinate system is as follows:
wherein a_i represents the abscissa and b_i the ordinate of the gesture motion signal at the i-th moment in the rectangular coordinate system, λ represents an eigenvalue of the motion track matrix Z, and I represents the identity matrix.
10. The electronic sand table system of claim 1 wherein: the three-dimensional display unit adopts a pressure-sensitive touch screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310861397.2A CN117058310A (en) | 2023-07-13 | 2023-07-13 | Electronic sand table system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310861397.2A CN117058310A (en) | 2023-07-13 | 2023-07-13 | Electronic sand table system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117058310A (en) | 2023-11-14
Family
ID=88663487
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310861397.2A Pending CN117058310A (en) | 2023-07-13 | 2023-07-13 | Electronic sand table system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117058310A (en) |
- 2023-07-13: application CN202310861397.2A filed (CN); published as CN117058310A, status Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||