CN104197987A - Combined-type motion capturing system - Google Patents
- Publication number: CN104197987A (application CN201410440797.7)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
A combined-type motion capturing system comprises multiple inertial sensor units, at least one communication unit, and a terminal processor. The inertial sensor units are connected to the communication units, which in turn are connected to the terminal processor. The inertial sensor units are mounted, in different combination modes, at positions on one or more motion-capture objects; each unit measures the motion information of the position where it is mounted and sends that information to a communication unit. The communication units receive the motion information output by the inertial sensor units and forward it to the terminal processor. The terminal processor acquires information about the motion-capture objects and the mounting positions of the inertial sensor units, generates the combination mode of the units from that information, receives the motion information, and processes it according to the combination mode to obtain the complete posture and motion of the objects. Because one set of motion-capture equipment can be freely recombined to achieve different capture objectives, cost is lowered.
Description
Technical field
The invention relates to motion-capture technology, and in particular to a combined-type motion capture system.
Background technology
Motion-capture technology records the actions of an object in digital form. The two motion-capture approaches in common use today are optical motion capture and inertial-sensor-based motion capture, described in turn below:
An optical motion capture system typically comprises 4 to 32 cameras arranged around the object under test so that its range of movement lies within the cameras' overlapping field of view. Specially made reflective or luminous markers are attached to key positions on the object as targets for visual identification and processing. After the system is calibrated, the cameras continuously film the object's motion and store the image sequence for analysis; the spatial position of each marker at each instant is computed, from which its motion trajectory is accurately obtained. The advantages of optical motion capture are that it imposes no mechanical devices or cables, permits a relatively large range of movement, and offers a high sampling frequency, meeting the needs of most motion measurements. However, such systems are expensive, calibration is cumbersome, only motion within the cameras' overlapping region can be captured, and when the motion is complex the markers are easily confused or occluded, producing erroneous results.
Traditional mechanical inertial sensors have long been used for aircraft and ship navigation. With the rapid development of micro-electro-mechanical systems (MEMS) and the maturation of miniature inertial sensor technology, attempts at motion capture based on miniature inertial sensors have begun in recent years. The basic method is to attach an inertial measurement unit (IMU) to the object under test so that it moves together with the object. An IMU generally includes a micro-accelerometer (measuring the acceleration signal) and a gyroscope (measuring the angular-rate signal); double integration of the acceleration signal and single integration of the gyroscope signal yield the position and orientation of the object. Thanks to MEMS technology, the size and weight of an IMU can be made very small, so its effect on the object's motion is negligible; in addition, the requirements on the capture venue are low, the permitted range of movement is large, and the system cost is comparatively low.
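The integration step described above can be sketched as follows. This is a minimal illustration of dead reckoning from IMU samples, not the patent's implementation; the function name and sample data are invented, gravity is assumed already removed, and real systems must additionally correct for drift:

```python
import numpy as np

def dead_reckon(accel, gyro, dt):
    """Integrate IMU samples: orientation by single integration of the
    angular rate, position by double integration of the (gravity-free)
    acceleration. accel, gyro: arrays of shape (n, 3); dt in seconds."""
    angle = np.cumsum(gyro * dt, axis=0)         # single integral: orientation
    velocity = np.cumsum(accel * dt, axis=0)     # first integral: velocity
    position = np.cumsum(velocity * dt, axis=0)  # second integral: position
    return angle, velocity, position

# Constant 1 m/s^2 acceleration along x for 1 s, sampled at 100 Hz:
a = np.tile([1.0, 0.0, 0.0], (100, 1))
ang, vel, pos = dead_reckon(a, np.zeros((100, 3)), 0.01)
```

Because integration accumulates every sample's noise and bias, pure dead reckoning drifts quickly, which is why the system later corrects orientation with accelerometer and magnetometer readings.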
With the development of virtual-reality technology, inertial motion capture has begun to emerge as an important interactive means. However, current inertia-based motion capture systems are all fixed in configuration: a system for the upper body can capture only upper-body motion, and its sensors cannot be re-installed to capture other parts of the body (such as the lower body). If users wish to capture a different part of the body, they can only purchase an additional motion capture system, or upgrade to a system with more sensors, which raises cost.
Summary of the invention
The invention provides a combined-type motion capture system in which one set of motion-capture equipment can be freely recombined to achieve different capture objectives, thereby reducing cost.
The combined-type motion capture system comprises: a plurality of inertial sensor units, at least one communication unit, and a terminal processor. Each inertial sensor unit is connected to a communication unit, and the communication unit is connected to the terminal processor.
The inertial sensor units are mounted, in different combination modes, at positions on one or more motion-capture objects; each unit measures the motion information of the position where it is mounted and sends that motion information to the communication unit by wired or wireless means.
The communication unit receives the motion information output by the inertial sensor units and sends it to the terminal processor by wired or wireless communication.
The terminal processor obtains the information of the motion-capture object and the installation positions of the inertial sensor units, generates the combination mode of the units from the object information and the installation-position information, receives the motion information sent by the communication unit, and processes the received motion information according to the combination mode to obtain the complete posture and motion of the motion-capture object.
In one embodiment, the motion information comprises orientation information; in another embodiment it comprises orientation information and inertial information, such as acceleration information and angular-velocity information.
In one embodiment, the terminal processor is specifically configured to: obtain the information of the motion-capture object and the installation positions of the inertial sensor units; retrieve a pre-stored motion-capture object model, or create a new one, from the object information; generate the combination mode of the inertial sensor units from the object model and the installation-position information; receive the motion information sent by the communication unit; and process the received motion information according to the combination mode to obtain the complete posture and motion of the object.
In one embodiment, the terminal processor is specifically configured to: correct the orientation of the inertial sensor units according to the mechanical constraints of the motion-capture object, for example revising the object's orientation and displacement to avoid twisting of joints and penetration of the ground contact; and estimate the orientation and motion of positions where no inertial sensor unit is installed, for example by approximate interpolation from adjacent inertial sensor units according to the motion characteristics of the position.
In one embodiment, the inertial sensor unit comprises:
A sensor module, comprising a 3-axis MEMS accelerometer, a 3-axis MEMS gyroscope, and a 3-axis MEMS magnetometer, which measure respectively the acceleration, angular-velocity, and magnetic signals at the installation position of the inertial sensor unit;
A first microprocessor module, connected to the sensor module, which computes the orientation of the installation position from the acceleration, angular-velocity, and magnetic signals;
A first communication module, connected to the first microprocessor module, for transmitting the motion information, such as orientation information and inertial information.
In one embodiment, the communication unit comprises: a second microprocessor module, a second communication module, and a third communication module; the second and third communication modules are each connected to the second microprocessor module.
In one embodiment, the communication unit further comprises a battery and a DC/DC conversion module; the first communication module is connected to the second communication module by wired serial communication, and the third communication module is connected to the terminal processor by wireless communication.
In one embodiment, the inertial sensor unit further comprises a battery and a DC/DC conversion module; the first communication module is connected to the second communication module by wireless communication, and the third communication module is connected to the terminal processor by wired serial communication.
In one embodiment, the communication unit further comprises a first battery and a first DC/DC conversion module, and the inertial sensor unit further comprises a second battery and a second DC/DC conversion module; the first communication module is connected to the second communication module by wireless communication, and the third communication module is connected to the terminal processor by wireless communication.
In one embodiment, the first communication module is connected to the second communication module by wired serial communication, and the third communication module is connected to the terminal processor by wired serial communication; the communication unit further comprises a DC/DC conversion module.
In one embodiment, the first microprocessor module is specifically configured to: integrate the angular-velocity information to generate a dynamic spatial orientation; generate a static absolute spatial orientation from the acceleration information and the geomagnetic vector; and correct the dynamic spatial orientation with the static absolute spatial orientation to generate the orientation information.
In one embodiment, the positions of the motion-capture objects include positions on a human body, an animal, and/or a robot.
In one embodiment, the inertial sensor units are mounted on different motion-capture objects at different times.
In one embodiment, when a user first uses the combined-type motion capture system, or changes the combination mode or installation positions of the inertial sensor units, the terminal processor is also used to specify the current capture combination mode and the installation position of each inertial sensor unit.
In one embodiment, when the sensor units are moved from one kind of motion-capture object to another, the terminal processor is also used to change the motion-capture object model or to create a new one.
In one embodiment, after installation of the inertial sensor units is complete, the terminal processor is also used to perform calibration actions according to the combination mode and the motion-capture object, so as to correct the installation errors of the inertial sensor units.
The benefit of the embodiments of the invention is that the plurality of inertial sensor units can be installed in various combination modes on the same motion-capture object, or combined and installed on different kinds of motion-capture objects; by freely recombining one set of motion-capture equipment, different capture objectives can be achieved and cost is reduced.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the invention, or of the prior art, more clearly, the accompanying drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and those of ordinary skill in the art may obtain other drawings from them without creative effort.
Fig. 1 is a first structural diagram of the combined-type motion capture system of an embodiment of the invention;
Fig. 2 is a second structural diagram of the combined-type motion capture system of an embodiment of the invention;
Fig. 3 is a third structural diagram of the combined-type motion capture system of an embodiment of the invention;
Fig. 4 is a fourth structural diagram of the combined-type motion capture system of an embodiment of the invention;
Fig. 5 is a fifth structural diagram of the combined-type motion capture system of an embodiment of the invention;
Fig. 6 is the implementation flowchart of the combined-type motion capture system of an embodiment of the invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the invention without creative effort fall within the protection scope of the invention.
As shown in Fig. 1, the invention provides a combined-type motion capture system comprising: a plurality of inertial sensor units 101, at least one communication unit 102, and a terminal processor 103.
Each inertial sensor unit 101 is connected to a communication unit 102 by wired or wireless means, and the communication unit 102 is connected to the terminal processor 103 by wired or wireless means.
The inertial sensor units 101 are mounted, in different combination modes, at positions on one or more motion-capture objects. The motion-capture object may be of many kinds, for example a human body, a robot, or an animal, and there are likewise many installation methods: for a human body, the units may be attached to the hands or other parts of the body by gloves, bandages, a sensor suit, or similar means. Each inertial sensor unit 101 measures the motion information of the position where it is mounted, such as orientation, acceleration, and angular velocity, and sends this motion information to the communication unit 102 by wired or wireless means.
The communication unit 102 receives, by wired or wireless means, the motion information output by the inertial sensor units and sends it to the terminal processor 103 by wired or wireless communication.
The terminal processor 103 obtains the information of the motion-capture object and the installation positions of the inertial sensor units 101, generates the combination mode of the units from the object information and the installation-position information, receives the motion information sent by the communication unit, and processes the received motion information according to the combination mode to obtain the complete posture and motion of the motion-capture object.
In a specific implementation, the terminal processor 103 obtains the information of the motion-capture object and the installation positions of the inertial sensor units 101; retrieves a pre-stored motion-capture object model, or creates a new one, from the object information; generates the combination mode of the inertial sensor units 101 from this object model and the installation-position information (the installation positions specified by the user or detected by the system); receives the motion information sent by the communication unit 102; and processes the received motion information according to the combination mode to obtain the complete posture and motion of the object.
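One way to picture the "combination mode" the processor generates is as a mapping from installed sensor units to segments of the chosen object model. The sketch below is purely illustrative: the segment names, function name, and validation logic are hypothetical, not taken from the patent:

```python
# Segments of a hypothetical upper-body human model; a real model would
# also carry segment sizes, initial orientations, and joint constraints.
HUMAN_UPPER_BODY = ["chest", "head", "left_upper_arm", "left_forearm",
                    "right_upper_arm", "right_forearm"]

def build_combination_mode(model_segments, installations):
    """installations: {unit_id: segment_name}, as specified by the user
    or detected by the system. Returns a segment -> unit-id mapping,
    rejecting sensors assigned to segments the model does not contain."""
    for unit_id, segment in installations.items():
        if segment not in model_segments:
            raise ValueError(f"unit {unit_id}: unknown segment {segment!r}")
    return {seg: uid for uid, seg in installations.items()}

mode = build_combination_mode(HUMAN_UPPER_BODY, {1: "chest", 2: "left_forearm"})
```

The same sensor units could be re-registered against a lower-body or animal model simply by passing a different segment list and installation table, which is the recombination idea at the heart of the system.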
In one embodiment, when the terminal processor 103 processes the received motion information according to the combination mode to obtain the complete posture and motion of the object, it may proceed as follows. The motion of the inertial sensor units 101 is corrected according to the mechanical constraints of the motion-capture object, for example correcting orientation and displacement on the basis of joint constraints or ground-contact constraints. The orientation and motion of positions where no inertial sensor unit 101 is installed are then estimated. One estimation method is interpolation from the motion information of adjacent positions: for example, the orientation and motion of the spine can be interpolated from the motion of the hips and chest. Another is estimation from the position's own movement characteristics and the motion of its parent node: for example, when there is no external contact the toes follow the orientation and motion of the sole, whereas when the toes touch the ground their heading remains consistent with the sole but their inclination is parallel to the contact surface.
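The interpolation idea — estimating an uninstrumented segment such as the spine from its instrumented neighbours — could be sketched with spherical linear interpolation of orientation quaternions. This is a common technique chosen here for illustration; the patent does not specify the exact interpolation formula, and the example orientations are invented:

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:              # nearly parallel: fall back to lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

# Hypothetical spine estimate: hips at identity, chest rotated 90 degrees
# about z; the mid-spine orientation is taken halfway between the two.
hips = [1.0, 0.0, 0.0, 0.0]                         # (w, x, y, z)
chest = [np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)]
mid_spine = slerp(hips, chest, 0.5)
```

In a full system the parameter `t` would reflect where the uninstrumented segment sits between its neighbours along the kinematic chain.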
As shown in Figs. 2 to 5, in a specific implementation the inertial sensor unit 101 of the combined-type motion capture system comprises: a sensor module 201, a first microprocessor module 202, and a first communication module 203.
The sensor module 201 comprises a 3-axis MEMS accelerometer 2011, a 3-axis MEMS gyroscope 2012, and a 3-axis MEMS magnetometer 2013. The accelerometer 2011 measures the acceleration signal, the gyroscope 2012 the angular-velocity signal, and the magnetometer 2013 the magnetic signal at the installation position of the inertial sensor unit 101.
The first microprocessor module 202 is connected to the sensor module 201 of the same inertial sensor unit 101 and computes the orientation of the installation position from the acceleration, angular-velocity, and magnetic signals of the sensor module 201.
In one embodiment, the first microprocessor module 202 is specifically configured to: integrate the angular-velocity information to generate a dynamic spatial orientation; generate a static absolute spatial orientation from the acceleration information and the geomagnetic vector; and correct the dynamic spatial orientation with the static absolute spatial orientation to generate the orientation information.
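This gyro-plus-accelerometer fusion is essentially a complementary filter. A one-dimensional sketch follows, illustrative only: the patent does not specify the filter form, the magnetometer (heading) term is omitted, and the blend factor `alpha` is a typical value rather than one given in the source:

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One step of a 1-D pitch estimate: integrate the gyro rate for the
    dynamic orientation, then blend in the static pitch implied by the
    gravity vector measured by the accelerometer."""
    gyro_pitch = pitch + gyro_rate * dt           # dynamic orientation (drifts)
    accel_pitch = math.atan2(accel_x, accel_z)    # static absolute orientation
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# A stationary sensor whose gyro reports a small constant bias: the
# accelerometer term keeps the estimate from drifting without bound.
pitch = 0.0
for _ in range(1000):
    pitch = complementary_filter(pitch, gyro_rate=0.01,
                                 accel_x=0.0, accel_z=9.81, dt=0.01)
```

Pure integration of the biased gyro would have drifted to 0.1 rad over these 1000 steps; the blended estimate instead settles near a small bounded value, which is the point of correcting the dynamic orientation with the static one.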
The first communication module 203 is connected to the first microprocessor module 202 and transmits the measured motion information (such as orientation, acceleration, and angular-velocity information) to the communication unit 102.
As shown in Figs. 2 to 5, in a specific implementation the communication unit 102 of the combined-type motion capture system comprises: a second microprocessor module 2021, a second communication module 2022, and a third communication module 2023. The second communication module 2022 and the third communication module 2023 are each connected to the second microprocessor module 2021. The second microprocessor module 2021 controls the second communication module 2022 to receive the motion information measured by each inertial sensor unit 101, packs this information, and sends it to the terminal processor 103 through the third communication module 2023.
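The packing step performed by the second microprocessor module might look like the following. The frame layout (start byte, unit id, ten little-endian floats) is invented for illustration; the patent does not define a wire format:

```python
import struct

def pack_motion_frame(unit_id, quaternion, accel, gyro):
    """Pack one sensor sample into a binary frame for the serial/RF link:
    a start byte, the unit id, then 10 little-endian float32 values
    (orientation quaternion, acceleration, angular velocity)."""
    return struct.pack("<BB10f", 0xAA, unit_id, *quaternion, *accel, *gyro)

frame = pack_motion_frame(3, (1.0, 0.0, 0.0, 0.0),
                          (0.0, 0.0, 9.81), (0.0, 0.0, 0.0))
# header (1) + id (1) + 10 * float32 (4) = 42 bytes per sample
```

A fixed per-unit frame like this lets the second microprocessor concatenate samples from many sensor units into one transmission, and lets the terminal processor demultiplex them by the embedded unit id.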
The link between the first communication module 203 and the second communication module 2022, and the link between the third communication module 2023 and the terminal processor 103, may both be wireless; may both be wired serial; or may be mixed, with a wireless link between modules 203 and 2022 and a wired serial link between module 2023 and the terminal processor 103, or a wired serial link between modules 203 and 2022 and a wireless link between module 2023 and the terminal processor 103. These connection schemes are described in turn below.
In one embodiment, as shown in Fig. 2, the first communication module 203 is connected to the second communication module 2022 by wired serial communication, and the third communication module 2023 is likewise connected to the terminal processor 103 by wired serial communication; the first communication module 203, the second communication module 2022, and the third communication module 2023 are all serial communication modules. The communication unit 102 further comprises a DC/DC conversion module 2025, which draws power from the terminal processor 103 over the wired connection and, after DC/DC conversion, powers the communication unit and all inertial sensor units.
This combined-type motion capture system comprises a plurality of inertial sensor units 101, one communication unit 102, and a PC serving as the terminal processor 103.
Each inertial sensor unit 101 comprises a sensor module 201, a first microprocessor module 202, and a first communication module 203. The sensor module 201 comprises a 3-axis MEMS accelerometer 2011, a 3-axis MEMS gyroscope 2012, and a 3-axis MEMS magnetometer 2013, which measure respectively the acceleration, angular-velocity, and magnetic signals.
The first microprocessor module 202 receives the acceleration, magnetic, and angular-velocity information from the sensor module 201 and computes the attitude of the sensor module 201 from this information. The first communication module 203 sends the motion information to the communication unit 102 by wire. The communication unit 102 comprises a second microprocessor module 2021, a second communication module 2022, and a third communication module 2023; it receives the motion information of the inertial sensor units 101 through the second communication module 2022 and, after packing by the second microprocessor module 2021, sends it to the terminal processor 103 through the third communication module 2023.
Over the wired connection, the communication unit 102 draws power from the terminal processor 103 and, after DC/DC conversion, powers itself and all inertial sensor units 101 attached to it. After receiving the motion information of the inertial sensor units 101, the terminal processor 103 performs the corresponding processing and computation according to the object model specified in the software interface and the installation positions of the units 101, including correcting the motion information according to the object's mechanical constraints and estimating the motion of positions where no unit 101 is installed. The terminal processor 103 may play back the computed result as real-time computer animation, save it in a given data format, or send it over a network.
In one embodiment, as shown in Fig. 3, the first communication module 203 is connected to the second communication module 2022 by wired serial communication, and the third communication module 2023 is connected to the terminal processor 103 by wireless communication. The second microprocessor module 2021 receives, through the second communication module 2022, the motion information measured by each inertial sensor unit 101 and, after packing, sends it to the terminal processor 103 through the third communication module 2023 (an RF communication module). Compared with Fig. 2, the communication unit 102 of Fig. 3 further comprises a battery 2024 and a DC/DC conversion module 2025: the battery 2024, after DC/DC conversion by module 2025, powers the communication unit 102 and all inertial sensor units. The third communication module 2023 may be an RF communication module or any other module capable of wireless communication with the terminal processor 103; the first communication module 203 and the second communication module 2022 are serial communication modules. Through the wired connection between the communication unit 102 and the inertial sensor units 101, the battery 2024 can also power each part of the inertial sensor units 101.
In one embodiment, as shown in Fig. 4, the first communication module 203 is connected to the second communication module 2022 by wireless communication, and the third communication module 2023 is connected to the terminal processor 103 by wired serial communication. In this embodiment the inertial sensor unit 101 further comprises a battery 2024 and a first DC/DC conversion module 2026, which performs DC/DC conversion on the battery's power. The communication unit 102 further comprises a second DC/DC conversion module 2027: through the wired connection between the communication unit 102 and the terminal processor 103, the terminal processor 103 can supply power to the communication unit 102, with the second DC/DC conversion module 2027 converting that power before it is supplied. The first communication module 203 and the second communication module 2022 may be RF communication modules or any other modules capable of wireless communication; the third communication module 2023 is a serial communication module.
In one embodiment, as shown in Fig. 5, the communication unit 102 further comprises a first battery 2028 and a first DC/DC conversion module 2026, and the inertial sensor unit 101 further comprises a second battery 2029 and a second DC/DC conversion module 2027. The first communication module 203 is connected to the second communication module 2022 by wireless communication, and the third communication module 2023 is connected to the terminal processor 103 by wireless communication. The first, second, and third communication modules 203, 2022, and 2023 may be RF communication modules or any other modules capable of wireless communication.
Fig. 6 is a flow chart of an implementation of the combined-type motion capture system of the present invention. As shown in Figure 6, first, the inertial sensor units 101 are attached to the motion capture object by means such as straps, gloves or adhesive tape, and the physical connections between the parts are established. Then the system is powered on, the corresponding software on the terminal processor 103 is started, and the software connections between the parts are established. Next, according to the information of the motion capture object and the installation positions of the inertial sensor units 101 on the object, the model of the motion capture object is selected on the terminal software interface; if the software does not contain a model of the corresponding object, a model can be created or imported manually. The object model includes the connection relationships between the parts of the object, the sizes of the parts, their initial orientations, and so on. Constraints and restrictions between the parts of the object model, such as the permitted joint rotation ranges, can also be set or modified. After the object model is determined, the installation position of each sensor is specified on the software interface of the terminal processor according to the actual installation positions of the sensor units; the specified positions must be consistent with the physical positions. After the installation positions of the sensor units are determined, the alignment error of each sensor needs to be calibrated. Calibration can use the existing human-body calibration actions in the software, or calibration poses specified and designed by the user. During calibration, the measured object performs the corresponding calibration actions according to the poses shown on the software interface. The terminal processor determines the alignment error of each sensor unit from the known poses and the motion information measured by the sensor units. After the sensor units are calibrated, capture of the motion of the object to be measured can begin.
During motion capture, the inertial sensor units 101 send the motion information, such as the orientation of their installation positions, to the communication unit 102 in a wired or wireless manner; the communication unit 102 packs the information and forwards it to the terminal processor 103 in a wired or wireless manner. According to the predefined object model, the installation positions of the inertial sensor units 101 and the configured constraints, the terminal processor 103 corrects the measured motion information, for example revising orientations and displacements to satisfy joint constraints or external contact constraints, and estimates the motion of parts on which no inertial sensor unit 101 is installed, for example by interpolating from the motion information of adjacent parts. The motion information of every part of the complete object, such as its orientation, is then mapped onto the model, so that the object model follows the motion of the object. The terminal processor 103 can play back the motion data of the moving object in real time, share it over a network, or store it locally.
In one embodiment, the inertial sensor units 101 can be installed on different motion capture objects at different times. When the user uses the combined-type motion capture system for the first time, or changes the combination of the inertial sensor units 101 or their installation positions, the terminal processor 103 is also used to specify the current motion capture combination and the installation position of each inertial sensor unit 101. After the object model is determined, the installation position of each sensor is specified on the software interface of the terminal processor 103 according to the actual installation positions of the inertial sensor units 101; the specified positions must be consistent with the physical positions.
In one embodiment, when the inertial sensor units 101 are moved from one kind of motion capture object to another, the terminal processor 103 is also used to change the motion capture object model or to create a new one: the model of the motion capture object is selected on the terminal software interface, and if the software does not contain a model of the corresponding object, a model can be created or imported manually. The object model includes the connection relationships between the parts of the object, the sizes of the parts, their initial orientations, and so on.
A beneficial effect of the embodiments of the present invention is that the plurality of inertial sensor units can be installed on the same motion capture object in various combinations, and can also be installed in combination on different kinds of motion capture objects. By freely combining the same set of motion capture devices, different motion capture purposes can be achieved at reduced cost.
For a better explanation of the present invention, specific embodiments are described below.
(1) A combined-type virtual reality game interaction system based on inertial sensor units
In the present embodiment, the combined-type motion capture system comprises 10 inertial sensor units, one communication unit, one tablet computer (as the terminal processor; a PC may also be used) and one head-mounted virtual reality display. The 10 inertial sensor units are combined according to the requirements of the specific virtual reality game, so that different kinds of virtual reality games can be played with the same set of equipment.
In the present embodiment, suppose the user first plays a dart-throwing game with friends in a virtual environment. The user first installs the 10 sensor units on each finger (2 on the thumb, 1 on each of the remaining 4 fingers), the back of the hand, the upper arm, the forearm and the chest. The inertial sensor units on the hand can be installed with a flexible glove, and the sensor units on the remaining parts can be installed with straps. Each inertial sensor unit is connected to the communication unit mounted on the chest by a wired connection, and the communication unit is connected to the tablet computer by a wired connection. Each inertial sensor unit measures the orientation of its installation position and sends the measurement to the communication unit over wired serial communication. The communication unit forwards the received orientation information of each body part to the tablet computer through a USB interface, and draws power from the tablet computer through the same USB interface. The tablet computer is connected to the communication unit through the USB interface, to the head-mounted virtual reality display through an HDMI interface, and to a virtual reality scene on a network server. The network server sends the real-time scene and scene-change information to the tablet computer over the network, and the tablet computer sends the virtual scene information to the head-mounted display through the HDMI interface. The tablet computer receives the orientation information of the arm, hand and chest from the communication unit and processes it to obtain the posture of the whole arm, hand and chest. The tablet computer applies the motion information of the hand and chest to the role corresponding to the wearer in the virtual scene, so that the hand and upper-body motion of the role follows the motion of the wearer. The implementation process of the present embodiment is described in detail below.
When using the system, the inertial sensor units are first installed on the hand, arm and chest with the gloves and straps, and the parts are connected together. The system is then powered on, the motion capture software on the tablet computer is started, and the alignment error of each inertial sensor of the motion capture system is calibrated. The calibration method is that the wearer holds one or two known poses, such as a T-pose with the 5 fingers together; from the orientation recorded by each inertial sensor in the known pose, the alignment error of each inertial sensor unit can be determined.
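The alignment-error step above can be sketched with rotation matrices, assuming the common sensor-to-segment formulation. The function names and the matrix convention (R_segment = R_sensor @ R_offset, all in a shared world frame) are illustrative assumptions, since the patent does not specify the mathematics.

```python
import numpy as np

def calibrate_offset(R_sensor_at_pose, R_segment_known):
    """Constant sensor-to-segment offset from one known calibration pose
    (e.g. the T-pose): solve R_segment = R_sensor @ R_offset for R_offset."""
    return R_sensor_at_pose.T @ R_segment_known

def segment_orientation(R_sensor, R_offset):
    """During capture, correct each raw sensor reading with the stored offset."""
    return R_sensor @ R_offset
```

Using two known poses, as the text suggests, would give two such equations whose offsets can be averaged to reduce measurement noise.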
Next, the tablet computer connects over the network to the virtual reality server of the dart-throwing game. After a successful connection, the client software of the tablet computer generates a virtual role (the user can also customize a role of their own). Virtual darts lie beside the virtual role, and a virtual target stands opposite; the wearer can pick up a virtual dart with the hand and throw it at the target. The head-mounted display sends the orientation of the head to the tablet computer through the HDMI interface. After receiving the virtual scene information and the head orientation information from the head-mounted virtual reality display, the tablet computer generates the image for the corresponding viewing angle according to the head orientation and sends it to the head-mounted virtual reality display. Besides the dartboard beside the wearer, there can be multiple dartboards in the same scene, so that the wearer can enter the same scene and play with multiple friends, communicating by voice through the earphone and microphone of the tablet computer.
Each inertial sensor unit measures the local gravity vector with its 3-axis MEMS accelerometer and the local magnetic vector with its 3-axis MEMS magnetometer; from the gravity vector and the magnetic vector, the first microprocessor of the inertial sensor unit can compute the static absolute 3D attitude angle of the unit. From the angular velocity measured by the 3-axis MEMS gyroscope, the first microprocessor of the inertial sensor unit can compute the dynamic 3D attitude angle of the unit. According to the actual motion of the inertial sensor unit, the static absolute 3D attitude angle and the dynamic 3D attitude angle are fused to obtain the final orientation information of the inertial sensor unit.
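A minimal sketch of this two-path fusion, assuming a simple complementary blend (the actual filter in the unit's firmware is not specified by the text): the static path builds an absolute attitude from the gravity and magnetic vectors, and the dynamic path integrates the gyroscope.

```python
import numpy as np

def static_attitude(accel, mag):
    """Absolute attitude from the gravity and magnetic vectors. At rest the
    accelerometer measures the reaction to gravity, i.e. the 'up' direction."""
    up = accel / np.linalg.norm(accel)
    north = mag - np.dot(mag, up) * up        # horizontal component of the field
    north /= np.linalg.norm(north)
    east = np.cross(up, north)
    return np.column_stack((north, east, up))  # axes of a north-east-up frame

def fuse(R_prev, gyro, dt, R_static, alpha=0.98):
    """Complementary fusion: first-order gyro integration gives the dynamic
    estimate, which is blended toward the static estimate with weight (1 - alpha).
    (A production filter would re-orthonormalize, or work with quaternions.)"""
    omega = np.array([[0., -gyro[2], gyro[1]],
                      [gyro[2], 0., -gyro[0]],
                      [-gyro[1], gyro[0], 0.]])
    R_dyn = R_prev @ (np.eye(3) + omega * dt)  # R' = R (I + [w]x dt)
    return alpha * R_dyn + (1 - alpha) * R_static
```

The weight alpha reflects the trade-off the paragraph describes: the gyroscope path tracks fast motion, while the accelerometer/magnetometer path removes its slow drift.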
The communication unit is connected to the above 10 inertial sensor units by serial communication, obtains the orientation information measured by each inertial sensor unit by polling, packs it, and sends it to the tablet computer.
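The poll-and-pack cycle can be sketched as follows. The frame layout (a 2-byte count followed by an id and three Euler angles per unit) and the `poll_sensor` stub are illustrative assumptions, since the patent does not define a wire format.

```python
import struct

SENSOR_IDS = range(10)

def poll_sensor(sensor_id):
    """Placeholder for one serial request/response exchange with a sensor unit.
    Returns the unit's orientation as (roll, pitch, yaw) in radians."""
    return (0.0, 0.0, 0.0)   # a real implementation would read the serial port

def pack_frame(readings):
    """Pack one frame: a little-endian 2-byte sensor count, then one record of
    a 1-byte id plus three 4-byte floats per sensor."""
    frame = struct.pack('<H', len(readings))
    for sensor_id, (roll, pitch, yaw) in readings:
        frame += struct.pack('<B3f', sensor_id, roll, pitch, yaw)
    return frame

readings = [(i, poll_sensor(i)) for i in SENSOR_IDS]
frame = pack_frame(readings)   # one packed frame per polling cycle
```

The tablet side would reverse the process with `struct.unpack_from`, walking the frame record by record.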
After receiving the orientation information of each body part from the communication unit, the tablet computer processes the information to obtain the orientation and motion of the whole hand and chest. The processing includes correcting the hand orientations according to biomechanical constraints, for example avoiding impossible joint twists of the fingers, and estimating the orientation of parts on which no inertial sensor is installed. For example, the orientation of a fingertip without a sensor unit can be calculated by assuming that its attitude angle relative to the middle finger joint equals the attitude angle of the middle joint relative to the proximal joint.
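The fingertip heuristic described here, copying the proximal-to-middle relative rotation one joint further, can be sketched with rotation matrices; the function name and matrix convention are illustrative assumptions.

```python
import numpy as np

def estimate_distal(R_proximal, R_middle):
    """Estimate the orientation of an unsensed distal phalanx by assuming its
    rotation relative to the middle phalanx equals the middle phalanx's
    rotation relative to the proximal one."""
    R_rel = R_proximal.T @ R_middle   # middle phalanx relative to proximal
    return R_middle @ R_rel           # apply the same relative rotation again
```

This matches typical finger kinematics, where the distal and middle joints flex by comparable amounts.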
After obtaining the motion information of the whole hand and chest, the tablet computer maps this information onto the corresponding parts of the virtual role, so that the motion of the virtual role in the virtual scene follows the motion of the wearer. Through the head-mounted virtual reality display and the movement of the wearer's hand, the wearer can "pick up" a dart in the virtual scene and throw it.
After finishing with the darts, the wearer can play another virtual reality game, such as a virtual reality shooting game. The wearer exits the darts scene, and the inertial sensor units installed on the fingers are removed from the straps or sensor gloves and installed on other parts of the body. The user's 10 inertial sensor units can now be installed on the head, chest, hips, the upper arms, forearms and backs of both hands, and a toy gun. The user then specifies the installation position of each sensor unit on the motion capture software interface of the tablet computer. Next, the user holds the toy gun in the right (or left) hand and performs the specified calibration pose (such as a T-pose); after the calibration action is completed, the user can connect to the virtual reality shooting game scene and play in real time.
During the virtual reality shooting game, the inertial sensor units measure in real time the orientation and other information of the wearer's upper body and the toy gun, and the measured information is sent through the communication unit to the tablet computer. The tablet computer processes the orientation information to obtain the corresponding motion information of the body, and maps the motion information onto the virtual role in the game scene, so that the virtual role follows the motion of the wearer. The trigger signal of the toy gun is delivered to the computer by the gun's radio-frequency module, and the virtual gun in the game scene fires when the trigger is pulled, giving the player an immersive shooting experience.
The present embodiment combines sensors on the same motion capture object (such as a human body), and is a low-cost combined-type implementation of virtual reality games. By using fewer inertial sensors in different combinations, the user can experience multiple different virtual reality games at a lower cost.
(2) Example of a combined-type multi-object motion capture application
The combined-type motion capture system of this embodiment comprises 30 inertial sensor units, three RF communication units and one terminal processor. The inertial sensor units communicate with the RF communication units by wired serial communication, and the RF communication units communicate with the terminal processor by Wi-Fi. The combined-type multi-object motion capture of the present embodiment supports multiple application scenarios. It can form three independent upper-body motion capture systems of 10 sensor units each, where each system comprises one RF communication unit and 10 inertial sensor units; the three upper-body systems can connect to the same terminal processor to capture the motion of multiple people. The same set of equipment can also form a single full-body motion capture system covering the whole body, the fingers of both hands and a prop, for comprehensive capture of a single performer. It can also capture the motion of non-human objects, such as a cat. The specific implementation process of the present embodiment is described below:
One combination of the present embodiment is a three-person virtual reality game. The implementation process is as follows: the 30 inertial sensor units and the three communication units are installed on the upper bodies of three people. Each person has 10 inertial sensor units installed on the upper body and a prop, and a Wi-Fi communication unit installed on the back; each person's inertial sensor units are wired to that person's own Wi-Fi communication unit, and each Wi-Fi communication unit packs the motion measurement information received from its sensors and sends it to the terminal processor (a computer) by Wi-Fi. After the inertial sensor units and communication units are installed, the system is powered on, three human model objects are created on the motion capture interface of the computer, corresponding to the three wearers, and the installation positions of all sensor units are specified. Calibration is then started: the three wearers simultaneously perform the indicated calibration action (such as a T-pose) to calibrate the alignment errors, after which the motion of each wearer can be captured. Through head-mounted virtual reality displays and game props, the three wearers can connect to the same computer and play a virtual reality game together.
In another combination of the present embodiment, all 30 inertial sensor units are installed on the body of the same person, covering both hands and the whole body. The motion measurement signals collected by the inertial sensor units are sent by wired serial communication to a single Wi-Fi communication unit, which forwards them to the computer by Wi-Fi. After the inertial sensor units and the Wi-Fi communication unit are installed and connected, the motion capture software on the computer is started; only one human model object is used on the software interface, and the installation position of each inertial sensor unit is specified. The installation of the sensors can then be calibrated with poses such as a T-pose with both hands together and palms down, or a natural standing pose. After calibration, the full-body motion of the person can be captured.
In another combination of the present embodiment, 16 inertial sensor units of the system are installed on a cat to capture its motion. The specific installation positions include the cat's head, neck, shoulders, waist, hip, tail (3 units), and the upper and lower segments of the four legs. The inertial sensor units send the collected signals by wired serial communication to a Wi-Fi communication unit installed at the waist. Before installing the sensors, a model of the cat must be created in advance on the motion capture software interface of the computer, with the size of each body part and the initial attitude of the cat model entered. After the sensors are installed, the specific installation position of each sensor unit must be specified on the software interface. A calibration pose is then designed according to the characteristics of the cat (a pose that is common for cats and in which the orientation of each body part is known), and the cat is encouraged, for example by petting, to assume the set calibration pose (if the pose deviates from the set pose, the calibration needs to be repeated). After calibration, the motion of the cat can be captured.
Besides the above combinations, the present embodiment can also capture the motion of any articulated object.
The same motion capture system of the present embodiment can capture the motion of multiple objects simultaneously, and can also capture the motion of different kinds of objects.
Those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems) and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
The principles and embodiments of the present invention have been set forth herein through specific examples; the above description of the embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes may be made to the specific embodiments and the scope of application according to the idea of the present invention. In summary, the contents of this description should not be construed as limiting the present invention.
Claims (15)
1. A combined-type motion capture system, characterized in that the combined-type motion capture system comprises: a plurality of inertial sensor units, at least one communication unit and a terminal processor; the inertial sensor units are respectively connected to the communication unit, and the communication unit is connected to the terminal processor;
the inertial sensor units are respectively installed, in different combinations, on respective parts of one or more motion capture objects, measure the motion information of their installation positions, and send the motion information to the communication unit in a wired or wireless manner;
the communication unit receives the motion information output by the inertial sensor units, and sends it to the terminal processor by wired or wireless communication;
the terminal processor obtains the information of the motion capture object and the installation position information of the inertial sensor units, generates the combination of the inertial sensor units according to the information of the motion capture object and the installation position information, receives the motion information sent by the communication unit, and processes the received motion information according to the combination to obtain the complete posture and motion information of the motion capture object.
2. The combined-type motion capture system according to claim 1, characterized in that the terminal processor is specifically configured to: obtain the information of the motion capture object and the installation position information of the inertial sensor units, obtain a pre-stored motion capture object model or create a new motion capture object model according to the information of the motion capture object, generate the combination of the inertial sensor units according to the motion capture object model and the installation position information, receive the motion information sent by the communication unit, and process the received motion information according to the combination to obtain the complete posture and motion information of the object.
3. The combined-type motion capture system according to claim 1, characterized in that the terminal processor is specifically configured to: correct the motion information of the inertial sensor units according to the mechanical constraints of the motion capture object, and estimate the orientation and motion of the parts on which no inertial sensor unit is installed.
4. The combined-type motion capture system according to claim 1, characterized in that the inertial sensor unit comprises:
a sensor module, comprising a 3-axis MEMS accelerometer, a 3-axis MEMS gyroscope and a 3-axis MEMS magnetometer, which respectively measure the acceleration, angular velocity and magnetic signal of the installation position of the inertial sensor unit;
a first microprocessor module, connected to the sensor module, which calculates the orientation information of the installation position according to the acceleration, angular velocity and magnetic signals;
a first communication module, connected to the first microprocessor module, for transmitting the motion information.
5. The combined-type motion capture system according to claim 4, characterized in that the communication unit comprises: a second microprocessor module, a second communication module and a third communication module, the second communication module and the third communication module being respectively connected to the second microprocessor module.
6. The combined-type motion capture system according to claim 5, characterized in that the communication unit further comprises a battery and a DC/DC conversion module; the first communication module is connected to the second communication module by wired serial communication, and the third communication module is connected to the terminal processor by wireless communication.
7. The combined-type motion capture system according to claim 5, characterized in that the inertial sensor unit further comprises a battery and a DC/DC conversion module; the first communication module is connected to the second communication module by wireless communication, and the third communication module is connected to the terminal processor by wired serial communication.
8. The combined-type motion capture system according to claim 5, characterized in that the communication unit further comprises a first battery and a first DC/DC conversion module, and the inertial sensor unit further comprises a second battery and a second DC/DC conversion module; the first communication module is connected to the second communication module by wireless communication, and the third communication module is connected to the terminal processor by wireless communication.
9. The combined-type motion capture system according to claim 5, characterized in that the first communication module is connected to the second communication module by wired serial communication, and the third communication module is connected to the terminal processor by wired serial communication; the communication unit further comprises a DC/DC conversion module.
10. The combined-type motion capture system according to claim 4, characterized in that the first microprocessor module is specifically configured to: integrate the angular velocity information to generate a dynamic spatial orientation, generate a static absolute spatial orientation according to the acceleration information and the geomagnetic vector, and correct the dynamic spatial orientation with the static absolute spatial orientation to generate the orientation information.
11. The combined-type motion capture system according to any one of claims 1-10, characterized in that the parts of the one or more motion capture objects comprise: parts of a human body, an animal and/or a robot.
12. The combined-type motion capture system according to any one of claims 1-10, characterized in that the inertial sensor units are installed on different motion capture objects at different times.
13. The combined-type motion capture system according to any one of claims 1-10, characterized in that when a user uses the combined-type motion capture system for the first time, or changes the combination of the inertial sensor units or their installation positions, the terminal processor is also used to specify the current motion capture combination and the installation position of each inertial sensor unit.
14. The combined-type motion capture system according to claim 2, characterized in that when the sensor units are moved from one kind of motion capture object to another kind of motion capture object, the terminal processor is also used to change the motion capture object model or to create a new motion capture object model.
15. The combined-type motion capture system according to claim 2, characterized in that after the installation of the inertial sensor units is completed, the terminal processor is also used to perform calibration actions according to the combination and the motion capture object, to correct the alignment errors of the inertial sensor units.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201410440797.7A CN104197987A (en) | 2014-09-01 | 2014-09-01 | Combined-type motion capturing system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN104197987A true CN104197987A (en) | 2014-12-10 |
Family
ID=52083315
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201410440797.7A Pending CN104197987A (en) | 2014-09-01 | 2014-09-01 | Combined-type motion capturing system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN104197987A (en) |
Cited By (26)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104461015A (en) * | 2014-12-30 | 2015-03-25 | 北京智谷睿拓技术服务有限公司 | Shooting control method, shooting control device, shooting device and wearable device |
| CN104536579A (en) * | 2015-01-20 | 2015-04-22 | 刘宛平 | Interactive three-dimensional scenery and digital image high-speed fusing processing system and method |
| CN104834917A (en) * | 2015-05-20 | 2015-08-12 | 北京诺亦腾科技有限公司 | Mixed motion capturing system and mixed motion capturing method |
| WO2016033717A1 (en) * | 2014-09-01 | 2016-03-10 | 北京诺亦腾科技有限公司 | Combined motion capturing system |
| CN105787468A (en) * | 2016-03-24 | 2016-07-20 | 深圳市阿尔法通讯技术有限公司 | Method and system for recognizing bead twisting action |
| CN105824432A (en) * | 2016-06-14 | 2016-08-03 | 上海锡月科技有限公司 | Motion capturing system |
| CN105869107A (en) * | 2016-03-28 | 2016-08-17 | 陈新灏 | System and method for real-time capturing motion |
| CN105867631A (en) * | 2016-04-22 | 2016-08-17 | 江苏卡罗卡国际动漫城有限公司 | Motion capture device for AR (augmented reality) maintenance guiding system |
| CN105865509A (en) * | 2016-03-28 | 2016-08-17 | 联想(北京)有限公司 | Data processing method and electronic device |
| CN105997094A (en) * | 2016-05-09 | 2016-10-12 | 北京科技大学 | A posture identification device and method |
| CN106125908A (en) * | 2016-06-14 | 2016-11-16 | 上海锡月科技有限公司 | A kind of motion capture calibration system |
| CN106125909A (en) * | 2016-06-14 | 2016-11-16 | 上海锡月科技有限公司 | A kind of motion capture system for training |
| CN106647791A (en) * | 2016-12-27 | 2017-05-10 | 广州市中海达测绘仪器有限公司 | Monitoring device for three-dimensional posture, mechanical device and monitoring method for three-dimensional posture |
| CN106621320A (en) * | 2016-11-29 | 2017-05-10 | 维沃移动通信有限公司 | Data processing method of virtual reality terminal and virtual reality terminal |
| CN106886288A (en) * | 2017-03-24 | 2017-06-23 | 苏州创捷传媒展览股份有限公司 | A kind of attitude dynamic method for catching and device |
| CN107844191A (en) * | 2016-09-21 | 2018-03-27 | 北京诺亦腾科技有限公司 | Motion capture device for virtual reality |
| CN107943271A (en) * | 2016-10-10 | 2018-04-20 | 北京诺亦腾科技有限公司 | Exercise data detection method, apparatus and system |
| CN108133558A (en) * | 2018-01-31 | 2018-06-08 | 广西中星电子科技有限公司 | Low-power consumption displacement detection alarming device |
| CN109212256A (en) * | 2018-10-31 | 2019-01-15 | 中国矿业大学(北京) | A kind of device with video camera geographic direction detection function |
| CN109445582A (en) * | 2018-10-18 | 2019-03-08 | 看见故事(苏州)影视文化发展有限公司 | A kind of action inertia captures system and method for catching |
| CN109470263A (en) * | 2018-09-30 | 2019-03-15 | 北京诺亦腾科技有限公司 | Motion capture method, electronic equipment and computer storage medium |
| CN109787740A (en) * | 2018-12-24 | 2019-05-21 | 北京诺亦腾科技有限公司 | Synchronous method, device, terminal device and the storage medium of sensing data |
| CN109814714A (en) * | 2019-01-21 | 2019-05-28 | 北京诺亦腾科技有限公司 | The Installation posture of motion sensor determines method, apparatus and storage medium |
| CN109883260A (en) * | 2019-03-22 | 2019-06-14 | 天津亿量科技有限公司 | Rifle carries multidimensional sensory package, firearms state automatic recognition system and its method |
| CN110646014A (en) * | 2019-09-30 | 2020-01-03 | 南京邮电大学 | IMU installation error calibration method based on assistance of human body joint position capture equipment |
| CN113242527A (en) * | 2021-05-17 | 2021-08-10 | 张衡 | Communication system based on wireless somatosensory inertia measurement module |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1996205A (en) * | 2006-01-05 | 2007-07-11 | 财团法人工业技术研究院 | Method and system for dynamic motion capture and peripheral device interaction |
| DE102010012340A1 (en) * | 2010-02-27 | 2011-09-01 | Volkswagen Ag | Method for detecting motion of human during manufacturing process for motor vehicle utilized in traffic, involves forming output signal, and forming position of inertial sensors based on inertial sensor output signal of inertial sensors |
| CN102257830A (en) * | 2008-12-17 | 2011-11-23 | 索尼电脑娱乐公司 | Tracking system calibration with minimal user input |
| CN102341149A (en) * | 2008-12-05 | 2012-02-01 | 耐克国际有限公司 | Athletic performance monitoring systems and methods in a team sport environment |
| CN102435871A (en) * | 2011-09-05 | 2012-05-02 | 上海格蒂电力科技股份有限公司 | On-line monitoring system for data collection of electric arresters based on GPS (Global Positioning System) synchronization |
| CN103150016A (en) * | 2013-02-20 | 2013-06-12 | 兰州交通大学 | Multi-person motion capture system fusing ultra wide band positioning technology with inertia sensing technology |
| CN103759739A (en) * | 2014-01-21 | 2014-04-30 | 北京诺亦腾科技有限公司 | Multimode motion measurement and analysis system |
- 2014-09-01: CN application CN201410440797.7A filed, published as CN104197987A (legal status: Pending)
Cited By (30)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016033717A1 (en) * | 2014-09-01 | 2016-03-10 | 北京诺亦腾科技有限公司 | Combined motion capturing system |
| CN104461015A (en) * | 2014-12-30 | 2015-03-25 | 北京智谷睿拓技术服务有限公司 | Shooting control method, shooting control device, shooting device and wearable device |
| CN104536579A (en) * | 2015-01-20 | 2015-04-22 | 刘宛平 | High-speed fusion processing system and method for interactive three-dimensional scenes and digital images |
| CN104834917A (en) * | 2015-05-20 | 2015-08-12 | 北京诺亦腾科技有限公司 | Mixed motion capturing system and mixed motion capturing method |
| CN105787468A (en) * | 2016-03-24 | 2016-07-20 | 深圳市阿尔法通讯技术有限公司 | Method and system for recognizing bead twisting action |
| CN105865509A (en) * | 2016-03-28 | 2016-08-17 | 联想(北京)有限公司 | Data processing method and electronic device |
| CN105869107A (en) * | 2016-03-28 | 2016-08-17 | 陈新灏 | System and method for real-time motion capture |
| CN105867631A (en) * | 2016-04-22 | 2016-08-17 | 江苏卡罗卡国际动漫城有限公司 | Motion capture device for an AR (augmented reality) maintenance guidance system |
| CN105997094B (en) * | 2016-05-09 | 2019-03-29 | 北京科技大学 | Gesture recognition device and method |
| CN105997094A (en) * | 2016-05-09 | 2016-10-12 | 北京科技大学 | Posture recognition device and method |
| CN106125908A (en) * | 2016-06-14 | 2016-11-16 | 上海锡月科技有限公司 | Motion capture calibration system |
| CN106125909A (en) * | 2016-06-14 | 2016-11-16 | 上海锡月科技有限公司 | Motion capture system for training |
| CN105824432A (en) * | 2016-06-14 | 2016-08-03 | 上海锡月科技有限公司 | Motion capture system |
| CN107844191A (en) * | 2016-09-21 | 2018-03-27 | 北京诺亦腾科技有限公司 | Motion capture device for virtual reality |
| CN107943271A (en) * | 2016-10-10 | 2018-04-20 | 北京诺亦腾科技有限公司 | Exercise data detection method, apparatus and system |
| CN106621320A (en) * | 2016-11-29 | 2017-05-10 | 维沃移动通信有限公司 | Data processing method of virtual reality terminal and virtual reality terminal |
| CN106647791A (en) * | 2016-12-27 | 2017-05-10 | 广州市中海达测绘仪器有限公司 | Monitoring device for three-dimensional posture, mechanical device and monitoring method for three-dimensional posture |
| CN106886288A (en) * | 2017-03-24 | 2017-06-23 | 苏州创捷传媒展览股份有限公司 | Dynamic attitude capture method and device |
| CN108133558A (en) * | 2018-01-31 | 2018-06-08 | 广西中星电子科技有限公司 | Low-power-consumption displacement detection alarm device |
| CN109470263A (en) * | 2018-09-30 | 2019-03-15 | 北京诺亦腾科技有限公司 | Motion capture method, electronic equipment and computer storage medium |
| CN109445582A (en) * | 2018-10-18 | 2019-03-08 | 看见故事(苏州)影视文化发展有限公司 | Inertial motion capture system and method |
| CN109212256A (en) * | 2018-10-31 | 2019-01-15 | 中国矿业大学(北京) | Device with camera geographic-orientation detection function |
| CN109787740A (en) * | 2018-12-24 | 2019-05-21 | 北京诺亦腾科技有限公司 | Sensor data synchronization method and device, terminal equipment and storage medium |
| CN109787740B (en) * | 2018-12-24 | 2020-10-27 | 北京诺亦腾科技有限公司 | Sensor data synchronization method and device, terminal equipment and storage medium |
| CN109814714A (en) * | 2019-01-21 | 2019-05-28 | 北京诺亦腾科技有限公司 | Method, apparatus and storage medium for determining the installation posture of a motion sensor |
| CN109883260A (en) * | 2019-03-22 | 2019-06-14 | 天津亿量科技有限公司 | Gun-mounted multi-dimensional sensing assembly, automatic firearm state identification system and method |
| CN109883260B (en) * | 2019-03-22 | 2022-04-12 | 天津亿量科技有限公司 | Gun-mounted multi-dimensional sensing assembly, automatic firearm state identification system and method |
| CN110646014A (en) * | 2019-09-30 | 2020-01-03 | 南京邮电大学 | IMU installation error calibration method based on assistance of human body joint position capture equipment |
| CN110646014B (en) * | 2019-09-30 | 2023-04-25 | 南京邮电大学 | IMU installation error calibration method based on human joint position capture equipment |
| CN113242527A (en) * | 2021-05-17 | 2021-08-10 | 张衡 | Communication system based on wireless somatosensory inertia measurement module |
Similar Documents
| Publication | Title |
|---|---|
| CN104197987A (en) | Combined-type motion capturing system |
| JP6973388B2 (en) | Information processing equipment, information processing methods and programs |
| CN103759739B (en) | Multimode motion measurement and analysis system |
| US20160059120A1 (en) | Method of using motion states of a control device for control of a system |
| CN203763810U (en) | Club/racket swing-assisting training device |
| JP2021520978A (en) | Method, apparatus, and computer program for controlling the interaction between a virtual object and a thrown object |
| US10777006B2 (en) | VR body tracking without external sensors |
| CN203405772U (en) | Immersive virtual reality system based on motion capture |
| JP6852673B2 (en) | Sensor device, sensor system and information processing device |
| US20090046056A1 (en) | Human motion tracking device |
| CN103488291A (en) | Immersive virtual reality system based on motion capture |
| CN103370672A (en) | Method and apparatus for tracking orientation of a user |
| JP5597392B2 (en) | Game system with movable display |
| CN101579238A (en) | Three-dimensional playback system and method for captured human motion |
| US20180216959A1 (en) | A Combined Motion Capture System |
| US20200320719A1 (en) | Determining a kinematic sequence |
| KR101755126B1 (en) | Device, method and system for providing motion information for implementing motions |
| KR20120059824A (en) | Method and system for acquiring real-time motion information using a complex sensor |
| CN116021514B (en) | Remote operation control method and device for robot, robot and electronic equipment |
| Callejas-Cuervo et al. | Capture and analysis of biomechanical signals with inertial and magnetic sensors as support in physical rehabilitation processes |
| Poussard et al. | 3DLive: A multi-modal sensing platform allowing tele-immersive sports applications |
| CN107291265A (en) | Inertial motion capture hardware system |
| Kadam et al. | Development of Cost Effective Motion Capture System based on Arduino |
| Chen et al. | The body sensor suit with mixed reality interactive games |
| CN103763390A (en) | Motion capture data processing method, device and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 2014-12-10