CN114225368A - Gesture somatosensory doll machine and control method - Google Patents
- Publication number
- CN114225368A
- Authority
- CN
- China
- Prior art keywords
- user
- gesture
- motion
- somatosensory
- doll
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/30—Capturing games for grabbing or trapping objects, e.g. fishing games
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F17/00—Coin-freed apparatus for hiring articles; Coin-freed facilities or services
- G07F17/32—Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Toys (AREA)
Abstract
The invention discloses a gesture somatosensory doll machine and a control method. The doll machine comprises a payment starting device, an integrated circuit controller, a stepping motor, a motion arm assembly and a somatosensory sensor; the integrated circuit controller is connected with the payment starting device, the stepping motor and the somatosensory sensor respectively; and the stepping motor is connected with the motion arm assembly. If the payment starting device detects that the user has completed a payment operation, it transmits a payment success instruction to the integrated circuit controller; the integrated circuit controller then sends a starting instruction to the stepping motor and the somatosensory sensor; the somatosensory sensor acquires the user somatosensory features and transmits them back to the integrated circuit controller; and the stepping motor obtains the user somatosensory features returned by the somatosensory sensor through the integrated circuit controller and, based on these features, controls the motion arm assembly to perform the corresponding doll grabbing operation. The invention improves the interactivity between the doll machine and the user and improves the user experience.
Description
Technical Field
The invention relates to the field of entertainment game equipment, in particular to a gesture motion sensing doll machine and a control method.
Background
At present, when using a doll machine for entertainment, a user generally has to stand directly in front of the cabinet of the doll machine and operate a console to move the crown block and gripper inside the cabinet so as to grasp the prizes in the cabinet.
In practice, this grabbing mode relies entirely on the user's manual operation of the console and cannot establish direct interaction with the user, which leads to poor interactivity and a poor user experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a gesture somatosensory doll machine and a control method thereof, which are used for at least improving the interactivity between the doll machine and the user and improving the user experience.
According to an aspect of an embodiment of the present invention, there is provided a gesture somatosensory doll machine, including a payment starting device, an integrated circuit controller, a stepping motor, a motion arm assembly, and a somatosensory sensor; the integrated circuit controller is connected with the payment starting device, the stepping motor and the somatosensory sensor respectively; the stepping motor is connected with the motion arm assembly; the payment starting device is used for detecting whether a user completes a payment operation on the gesture somatosensory doll machine and, if it detects that the payment operation is completed, transmitting a payment success instruction to the integrated circuit controller; the integrated circuit controller is used for responding to the payment success instruction and sending a starting instruction to the stepping motor and the somatosensory sensor; the somatosensory sensor is used for responding to the starting instruction, acquiring a user somatosensory feature and transmitting the user somatosensory feature back to the integrated circuit controller; and the stepping motor is used for responding to the starting instruction, acquiring the user somatosensory feature returned by the somatosensory sensor through the integrated circuit controller, and controlling the motion arm assembly to perform a corresponding doll grabbing operation based on the user somatosensory feature.
As an optional implementation, the somatosensory sensor acquires the user somatosensory feature by: acquiring a user image, where the user image at least includes gesture feature information of the user; and performing digital signal conversion on the user image to obtain the user somatosensory feature.
As an optional implementation, the stepping motor controls the motion arm assembly to perform the corresponding doll grabbing operation based on the user somatosensory feature by: determining an angular displacement and a linear displacement corresponding to the user somatosensory feature; and controlling the motion arm assembly to perform the corresponding doll grabbing operation according to the angular displacement and the linear displacement.
As an optional implementation, the stepping motor determines the angular displacement and the linear displacement corresponding to the user somatosensory feature by: determining a gesture movement direction and a gesture movement distance of the user based on the user somatosensory feature; generating an estimated motion trajectory based on the gesture movement direction and the gesture movement distance; and determining the angular displacement and the linear displacement based on the estimated motion trajectory.
As an alternative embodiment, the motion arm assembly includes a robotic arm and a manipulator.
According to another aspect of the embodiments of the present invention, there is also provided a control method for a gesture somatosensory doll machine, including: controlling the gesture somatosensory doll machine to detect whether a user completes a payment operation on the gesture somatosensory doll machine; if it is detected that the payment operation is completed, controlling the gesture somatosensory doll machine to start; acquiring a user somatosensory feature; and controlling a motion arm assembly in the gesture somatosensory doll machine to perform a corresponding doll grabbing operation based on the user somatosensory feature.
As an optional implementation, the acquiring the user somatosensory feature includes: acquiring a user image, where the user image at least includes gesture feature information of the user; and performing digital signal conversion on the user image to obtain the user somatosensory feature.
As an alternative embodiment, the controlling a motion arm assembly in the gesture motion sensing doll machine to perform a corresponding doll grabbing operation based on the user motion sensing feature includes: determining angular displacement and linear displacement corresponding to the user somatosensory characteristics; and controlling the moving arm assembly to perform corresponding doll grabbing operation according to the angular displacement and the linear displacement.
As an alternative embodiment, the determining angular displacement and linear displacement corresponding to the user somatosensory feature includes: determining a gesture movement direction and a gesture movement distance of the user based on the user somatosensory characteristics; generating an estimated motion track based on the gesture motion direction and the gesture motion distance; and determining the angular displacement and the linear displacement based on the estimated motion trail.
As an alternative embodiment, the motion arm assembly includes a robotic arm and a manipulator.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium in which a computer program is stored, where the computer program, when executed, performs the above control method for the gesture somatosensory doll machine.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the above control method for the gesture somatosensory doll machine through the computer program.
In the embodiment of the invention, the somatosensory interaction mode improves the interactivity between the doll machine and the user and improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a schematic structural diagram of an alternative gesture somatosensory doll machine according to an embodiment of the invention;
fig. 2 is a flow diagram of an alternative control method for a gesture somatosensory doll machine according to an embodiment of the invention; and
fig. 3 is a schematic structural diagram of an alternative electronic device according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The embodiment of the invention provides an optional gesture somatosensory doll machine, which comprises a payment starting device 101, an integrated circuit controller 102, a stepping motor 103, a motion arm assembly 104 and a somatosensory sensor 105; the integrated circuit controller 102 is connected with the payment starting device 101, the stepping motor 103 and the somatosensory sensor 105 respectively; the stepping motor 103 is connected with the motion arm assembly 104; the payment starting device 101 is configured to detect whether a user completes a payment operation on the gesture somatosensory doll machine and, if the payment operation is completed, transmit a payment success instruction to the integrated circuit controller 102; the integrated circuit controller 102 is configured to send a starting instruction to the stepping motor 103 and the somatosensory sensor 105 in response to the payment success instruction; the somatosensory sensor 105 is configured to respond to the starting instruction, acquire a user somatosensory feature, and transmit the user somatosensory feature back to the integrated circuit controller 102; and the stepping motor 103 is configured to, in response to the starting instruction, acquire the user somatosensory feature returned by the somatosensory sensor 105 through the integrated circuit controller 102, and control the motion arm assembly 104 to perform the corresponding doll grabbing operation based on the user somatosensory feature.
The payment starting device 101 may consist of a coin detector and a mobile payment module: the coin detector detects whether a coin-insertion operation has been performed on the gesture somatosensory doll machine, and the mobile payment module detects whether a mobile payment operation has been performed, so the gesture somatosensory doll machine can be started through different payment methods. Optionally, the gesture somatosensory doll machine may also be started in other ways, which is not limited in this embodiment. The integrated circuit controller 102 may be fabricated by a process in which the components and wiring required by the circuit, such as the processor, memory, transistors, diodes, resistors, capacitors and inductors, are interconnected on a semiconductor die or dielectric substrate and then packaged to form a microstructure with the required circuit functions. Because all elements are structurally integrated, the electronics achieve miniaturization, low power consumption and high reliability, and respectively realize functions such as human-machine interaction, data processing, remote networking, payment verification, motor starting, mechanical-arm movement and manipulator doll grasping.
Optionally, a Software-as-a-Service (SaaS) secure payment system may be used to provide the network infrastructure and the software and hardware operating platforms required for informatization, and to take charge of services such as initial deployment and later maintenance, so as to implement the payment function. In addition, an offline device can be connected to the gesture somatosensory doll machine and feed results back to the user's payment terminal; an online payment system can be connected to a server back end to support online payment by scanning a code.
As an optional implementation, the somatosensory sensor 105 acquires the user somatosensory feature by: acquiring a user image, where the user image at least includes gesture feature information of the user; and performing digital signal conversion on the user image to obtain the user somatosensory feature.
In this implementation, the somatosensory sensor 105 may convert the optically collected user image into a digital signal. The original signal must first be acquired: if it is a continuous-time signal, it must be sampled into a discrete signal and then converted into a binary digital signal by analog-to-digital conversion; if the acquired original signal is already discrete, only analog-to-digital conversion is needed to obtain a binary digital signal. The function of the digital signal processing system is to process, according to given requirements such as transformation or filtering, the binary digital signal obtained by sampling and converting the original signal, so as to obtain the required output digital signal. The digital signal can then be converted back into a discrete signal by digital-to-analog conversion, and the discrete signal can be restored to a continuous-time signal by a holding circuit and output. The digital signal processing system may be implemented by a digital computer and software, by a general-purpose digital signal processor, or by a dedicated signal processor.
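For illustration only, and not as part of the disclosed embodiment, the following minimal Python sketch shows the sampling and analog-to-digital conversion chain described above for a generic continuous-time signal; the sampling rate, bit depth and full-scale value are assumed parameters.

```python
# Toy sketch of the described chain: sample a continuous-time signal,
# then quantize the samples to n-bit binary codes (A/D conversion).
# fs, bits and full_scale are illustrative assumptions, not values
# taken from this disclosure.
import numpy as np

def sample_and_quantize(signal, t_end=1.0, fs=100, bits=8, full_scale=1.0):
    """signal: callable t -> amplitude in [0, full_scale]; returns (t, codes)."""
    t = np.arange(0.0, t_end, 1.0 / fs)                  # sampling: continuous -> discrete time
    x = np.clip([signal(ti) for ti in t], 0.0, full_scale)
    levels = 2 ** bits - 1
    codes = np.round(np.asarray(x) / full_scale * levels).astype(int)  # quantization to binary codes
    return t, codes

# Example: a 5 Hz test tone scaled into [0, 1].
t, codes = sample_and_quantize(lambda ti: 0.5 + 0.5 * np.sin(2 * np.pi * 5 * ti))
```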
Specifically, the gesture feature information in a user image may be projected onto the somatosensory sensor 105 and converted into an electrical signal; the electrical signal is converted into a digital image signal through analog-to-digital conversion and then sent to a digital signal processing chip for processing.
The gesture feature information may include the user's gesture state, the user's position (front, back, left or right), whether the user triggers the gripper operation, and the like, which is not limited in this embodiment. By converting the user image into a digital signal, user somatosensory features in the form of a digital signal can be obtained, and these features may include the items of gesture feature information described above. In this manner, an accurate signal can be sent to the integrated circuit controller 102 as a basis for subsequent decisions.
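As a purely illustrative sketch (not the implementation of this embodiment), the following Python code converts a single camera frame into a digital feature of the kind described above, a hand centroid plus a rough grab flag, using OpenCV; the skin-colour range and the solidity threshold are assumptions.

```python
# Hypothetical feature extraction: locate the hand as the largest
# skin-coloured region, take its centroid as the gesture position, and
# use contour solidity as a crude open-hand / fist indicator.
import cv2
import numpy as np

def extract_user_feature(frame_bgr: np.ndarray) -> dict:
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))   # rough skin-colour range (assumed)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {"cx": None, "cy": None, "grab": False}
    hand = max(contours, key=cv2.contourArea)               # assume the largest blob is the hand
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return {"cx": None, "cy": None, "grab": False}
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]       # hand centroid in pixels
    hull_area = max(cv2.contourArea(cv2.convexHull(hand)), 1e-6)
    solidity = cv2.contourArea(hand) / hull_area             # a clenched fist is nearly convex, so solidity is high
    return {"cx": cx, "cy": cy, "grab": solidity > 0.9}
```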
As an optional embodiment, the stepping motor 103 controls the motion arm assembly 104 to perform the corresponding doll grabbing operation based on the user somatosensory feature by: determining an angular displacement and a linear displacement corresponding to the user somatosensory feature; and controlling the motion arm assembly 104 to perform the corresponding doll grabbing operation according to the angular displacement and the linear displacement.
In this embodiment, a stepping motor 103 is preferably used to control the doll grabbing operation. The biggest difference between a stepping motor and other motors used for control purposes is that it receives a digital electric pulse control signal and converts it into a corresponding angular displacement or linear displacement; it is therefore an actuator that performs digital-to-motion conversion. Furthermore, it can be position-controlled in open loop: a defined position increment is obtained for each input pulse, so the cost of such an incremental position control system is significantly lower than that of a traditional direct-current control system, and hardly any system tuning is required. The angular displacement of the stepping motor is strictly proportional to the number of input pulses and is synchronized in time with the pulses, so the required rotation angle, speed and direction can be obtained by controlling the number and frequency of the pulses and the energizing sequence of the motor windings. In use, the angle and position can therefore be adjusted accurately.
The angular displacement is the required displacement in the rotational direction, and the linear displacement is the required displacement along a straight line. Based on the angular displacement and the linear displacement, the motion arm assembly 104 can be controlled to move to the position where the doll is to be grabbed and then perform the corresponding doll grabbing operation.
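Purely as an illustrative sketch of the proportionality described above, the following Python snippet converts a desired angular or linear displacement into a pulse count; the step angle, microstepping factor and lead-screw pitch are assumed values and are not specified in this disclosure.

```python
# Assumed drive parameters for illustration only.
STEP_ANGLE_DEG = 1.8      # full-step angle of the stepping motor
MICROSTEPS = 16           # driver microstepping factor
LEAD_MM = 8.0             # linear travel per lead-screw revolution (mm)

def pulses_for_angle(angle_deg: float) -> int:
    """Pulses needed for an angular displacement of angle_deg."""
    return round(angle_deg / STEP_ANGLE_DEG * MICROSTEPS)

def pulses_for_linear(distance_mm: float) -> int:
    """Pulses needed for a linear displacement of distance_mm via the lead screw."""
    return round(distance_mm / LEAD_MM * 360.0 / STEP_ANGLE_DEG * MICROSTEPS)

print(pulses_for_angle(90.0))     # 800 pulses for a 90-degree rotation
print(pulses_for_linear(120.0))   # 48000 pulses for 120 mm of travel
```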
As an optional implementation, the stepping motor 103 determines the angular displacement and the linear displacement corresponding to the user somatosensory feature by: determining a gesture movement direction and a gesture movement distance of the user based on the user somatosensory feature; generating an estimated motion trajectory based on the gesture movement direction and the gesture movement distance; and determining the angular displacement and the linear displacement based on the estimated motion trajectory.
In this embodiment, the gesture movement direction may include a direction and a rotation angle, and the gesture movement distance may correspond to the angular displacement and the linear displacement. Based on the gesture movement direction and the gesture movement distance, moving the corresponding gesture movement distance along the gesture movement direction yields the estimated motion trajectory, which may be a linear trajectory, a rotational trajectory, or a combination of the two; this embodiment does not limit this. The corresponding angular displacement and linear displacement are then determined by decomposing the estimated motion trajectory.
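The decomposition can be sketched as follows; this is a simplified illustration under the assumption that the estimated trajectory is "rotate toward the gesture direction, then translate by the gesture distance", and the function and parameter names are hypothetical.

```python
# Minimal sketch: map a gesture direction vector and gesture distance to an
# angular displacement (rotation toward the gesture) and a linear
# displacement (translation by the gesture distance).
import math

def plan_motion(direction_xy: tuple, distance_mm: float) -> tuple:
    dx, dy = direction_xy
    angular_deg = math.degrees(math.atan2(dy, dx))   # rotate toward the gesture direction
    linear_mm = distance_mm                          # translate by the gesture distance
    return angular_deg, linear_mm

angle, linear = plan_motion((1.0, 1.0), 80.0)
print(angle, linear)   # 45.0 80.0 -> e.g. fed to pulses_for_angle / pulses_for_linear above
```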
As an alternative embodiment, the motion arm assembly 104 includes a robotic arm and a manipulator.
In this embodiment, the robotic arm may include a connecting arm, an upper moving arm, a lower moving arm, an upper connecting shaft, a lower connecting shaft, joint balls, and a connecting block. The connecting arm is connected to the shaft of the stepping motor 103 so as to rotate with it, and its other end is connected to the upper end of the upper moving arm. The upper connecting shaft is movably sleeved on the lower end of the upper moving arm, and the joints are respectively sleeved on the upper connecting shaft and the lower connecting shaft in the north-south direction, located on two sides of the upper moving arm and the lower moving arm. The upper end of the lower moving arm is connected to the upper connecting shaft through a joint ball, and the lower end of the lower moving arm is connected to the lower connecting shaft through another joint ball. One end of the connecting block is connected to the lower connecting shaft, and the other end of the connecting block is connected to the moving platform. The manipulator may include a mounting seat, a steering engine, a crown block and a mechanical claw arm, where the steering engine and the mechanical claw arm are both mounted on the mounting seat. The mechanical claw arm includes an upper driving arm, a first connecting rod, a second connecting rod, a cross rod, a first driving arm, a third connecting rod, a second lower driving arm and a clamping block; the middle of the upper driving arm is connected with the mounting seat, one end of the first connecting rod is connected with the middle of the upper driving arm and the other end with the end of the cross rod; one end of the second connecting rod is connected with the middle of the upper driving arm and the other end with the end of the cross rod; and one end of the third connecting rod is connected with the middle of the upper driving arm and the other end with the end of the cross rod. The controller controls the robotic arm and the manipulator.
In the embodiment of the invention, the somatosensory interaction mode improves the interactivity between the doll machine and the user and improves the user experience.
Further, an embodiment of the present invention provides an optional control method for a gesture somatosensory doll machine. As shown in fig. 2, the control method includes:
S201, controlling the gesture somatosensory doll machine to detect whether a user completes a payment operation on the gesture somatosensory doll machine.
S202, if it is detected that the payment operation is completed, controlling the gesture somatosensory doll machine to start.
S203, acquiring a user somatosensory feature.
S204, controlling a motion arm assembly in the gesture somatosensory doll machine to perform a corresponding doll grabbing operation based on the user somatosensory feature.
In this embodiment, the execution body may be the integrated circuit controller 102, or another electronic device that is connected to the gesture somatosensory doll machine and used to control it, which is not limited in this embodiment. If the user completes the payment operation, the execution body controls the gesture somatosensory doll machine to start; specifically, a starting instruction can be sent to the stepping motor 103 and the somatosensory sensor 105 to start the gesture somatosensory doll machine.
The user somatosensory feature is a feature describing the user's body motion, and may include, but is not limited to, gesture somatosensory features, body somatosensory features, and the like.
Specifically, controlling the motion arm assembly in the gesture somatosensory doll machine to perform the corresponding doll grabbing operation based on the user somatosensory feature may include: the execution body controls the motion arm assembly to move to a designated position according to the user somatosensory feature and, in response to detecting that the user's gesture somatosensory feature contains the designated grabbing gesture, performs the grabbing operation at that position, thereby completing the doll grabbing operation. The doll grabbing operation here includes both a moving operation and a grasping operation.
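For illustration only, the following Python sketch strings steps S201 to S204 together as a single control loop; the payment, sensor, stepper and arm objects and their methods are hypothetical placeholders rather than an interface defined by this disclosure, and plan_motion refers to the earlier sketch.

```python
import time

def run_once(payment, sensor, stepper, arm):
    # S201: poll the payment starting device until a payment is detected.
    while not payment.paid():
        time.sleep(0.1)
    # S202: start the machine (somatosensory sensor and stepping motor).
    sensor.start()
    stepper.start()
    # S203 / S204: track gestures and move the arm until a grab gesture appears.
    while True:
        feature = sensor.read_feature()              # digital user somatosensory feature
        if feature.get("grab"):
            arm.grab()                               # close the manipulator at the current position
            break
        angle, linear = plan_motion(feature["direction"], feature["distance"])
        stepper.move(angle, linear)                  # angular + linear displacement of the arm assembly
```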
As an optional implementation, the acquiring the user somatosensory feature includes: acquiring a user image, where the user image at least includes gesture feature information of the user; and performing digital signal conversion on the user image to obtain the user somatosensory feature.
As an alternative embodiment, the controlling a motion arm assembly in the gesture motion sensing doll machine to perform a corresponding doll grabbing operation based on the user motion sensing feature includes: determining angular displacement and linear displacement corresponding to the user somatosensory characteristics; and controlling the moving arm assembly to perform corresponding doll grabbing operation according to the angular displacement and the linear displacement.
As an alternative embodiment, the determining angular displacement and linear displacement corresponding to the user somatosensory feature includes: determining a gesture movement direction and a gesture movement distance of the user based on the user somatosensory characteristics; generating an estimated motion track based on the gesture motion direction and the gesture motion distance; and determining the angular displacement and the linear displacement based on the estimated motion trail.
As an alternative embodiment, the motion arm assembly includes a robotic arm and a manipulator.
In the embodiment of the invention, the somatosensory interaction mode improves the interactivity between the doll machine and the user and improves the user experience.
For detailed descriptions of the steps of the method, please refer to the detailed description of the gesture somatosensory doll machine above, which is not repeated here.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device for implementing the control method for a gesture-sensing doll machine, as shown in fig. 3, the electronic device includes a memory 302 and a processor 304, the memory 302 stores a computer program, and the processor 304 is configured to execute the steps in any one of the method embodiments through the computer program.
Optionally, in this embodiment, the electronic device may be located in at least one network device of a plurality of network devices of a computer network.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, controlling the gesture somatosensory doll machine to detect whether the user completes a payment operation on the gesture somatosensory doll machine;
S2, if it is detected that the payment operation is completed, controlling the gesture somatosensory doll machine to start;
S3, acquiring a user somatosensory feature;
and S4, controlling a motion arm assembly in the gesture somatosensory doll machine to perform a corresponding doll grabbing operation based on the user somatosensory feature.
Alternatively, it can be understood by those skilled in the art that the structure shown in fig. 3 is only illustrative, and the electronic device may also be a terminal device such as a smart phone (e.g., an Android phone or an iOS phone), a tablet computer, a palm computer, a Mobile Internet Device (MID), a PAD, or the like. Fig. 3 does not limit the structure of the electronic device; for example, the electronic device may include more or fewer components (e.g., a network interface) than shown in fig. 3, or have a configuration different from that shown in fig. 3.
The memory 302 may be configured to store software programs and modules, such as program instructions/modules corresponding to the control method and apparatus for the gesture somatosensory doll machine in the embodiments of the present invention; the processor 304 executes various functional applications and data processing by running the software programs and modules stored in the memory 302, that is, implements the above control method for the gesture somatosensory doll machine. The memory 302 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 302 may further include memory located remotely from the processor 304, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. The memory 302 may be specifically configured to, but is not limited to, store information such as operation instructions.
Optionally, the transmission device 306 is used to receive or send data via a network. Examples of the network may include wired and wireless networks. In one example, the transmission device 306 includes a network interface controller (NIC) that can be connected to a router and other network devices via a network cable so as to communicate with the internet or a local area network. In another example, the transmission device 306 is a radio frequency (RF) module, which is used to communicate with the internet wirelessly.
In addition, the electronic device further includes: a display 308 for displaying the display content; and a connection bus 310 for connecting the respective module components in the above-described electronic apparatus.
According to a further aspect of embodiments of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above-mentioned method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
s1, controlling the gesture body sensing doll machine to detect whether the user completes the payment operation on the gesture body sensing doll machine;
s2, if the payment operation is detected to be completed, controlling to start the gesture motion sensing doll;
s3, acquiring the somatosensory characteristics of the user;
and S4, controlling a motion arm assembly in the gesture motion sensing doll machine to carry out corresponding doll grabbing operation based on the user motion sensing characteristics.
Alternatively, in this embodiment, a person skilled in the art may understand that all or part of the steps in the methods of the foregoing embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, or the like) to execute all or part of the steps of the methods described in the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, a division of a unit is merely a division of a logic function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that it is obvious to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and these modifications and improvements should also be considered as the protection scope of the present invention.
Claims (10)
1. A gesture motion sensing doll machine comprises a payment starting device, an integrated circuit controller, a stepping motor and a motion arm assembly, and is characterized by further comprising a motion sensing sensor; the integrated circuit controller is respectively connected with the payment starting device, the stepping motor and the somatosensory sensor; the stepping motor is connected with the motion arm assembly; and
the payment starting device is used for detecting whether a user finishes payment operation on the gesture motion sensing doll machine or not, and if the user finishes the payment operation, transmitting a payment success instruction to the integrated circuit controller;
the integrated circuit controller is used for responding to the payment success instruction and sending a starting instruction to the stepping motor and the motion sensing sensor;
the motion sensing sensor is used for responding to the starting instruction, acquiring a user motion sensing characteristic and transmitting the user motion sensing characteristic back to the integrated circuit controller;
the stepping motor is used for responding to the starting instruction, acquiring the user somatosensory characteristics returned by the somatosensory sensor through the integrated circuit controller, and controlling the motion arm assembly to carry out corresponding doll grabbing operation based on the user somatosensory characteristics.
2. The gesture body-sensing doll machine according to claim 1, wherein the body-sensing sensor obtains the body-sensing characteristics of the user by:
acquiring a user image; the user image at least comprises gesture feature information of a user; and carrying out digital signal conversion on the user image to obtain the user somatosensory characteristics.
3. The gesture body sensing doll machine according to claim 1, wherein the stepping motor controls the motion arm assembly to perform corresponding doll grabbing operation based on the user body sensing feature by:
determining angular displacement and linear displacement corresponding to the user somatosensory characteristics;
and controlling the moving arm assembly to perform corresponding doll grabbing operation according to the angular displacement and the linear displacement.
4. The gesture somatosensory doll machine according to claim 3, wherein the step motor determines the angular displacement and the linear displacement corresponding to the user somatosensory feature in a manner that:
determining a gesture movement direction and a gesture movement distance of the user based on the user somatosensory characteristics;
generating an estimated motion track based on the gesture motion direction and the gesture motion distance;
and determining the angular displacement and the linear displacement based on the estimated motion trail.
5. The gesture body sensing doll machine of claim 1, wherein the motion arm assembly comprises a robotic arm and a manipulator.
6. A control method for a gesture motion sensing doll machine is characterized by comprising the following steps:
controlling the gesture body sensing doll machine to detect whether a user completes payment operation on the gesture body sensing doll machine or not;
if the payment operation is detected to be completed, controlling to start the gesture motion sensing doll machine;
acquiring the somatosensory characteristics of a user;
and controlling a motion arm assembly in the gesture body sensing doll machine to carry out corresponding doll grabbing operation based on the user body sensing characteristics.
7. The method of claim 6, wherein the obtaining the user somatosensory characteristics comprises:
acquiring a user image; the user image at least comprises gesture feature information of a user;
and carrying out digital signal conversion on the user image to obtain the user somatosensory characteristics.
8. The method of claim 6, wherein controlling a motion arm assembly in the gestural somatosensory doll machine to perform a corresponding doll grabbing operation based on the user somatosensory feature comprises:
determining angular displacement and linear displacement corresponding to the user somatosensory characteristics;
and controlling the moving arm assembly to perform corresponding doll grabbing operation according to the angular displacement and the linear displacement.
9. The method of claim 8, wherein the determining angular and linear displacements corresponding to the user somatosensory feature comprises:
determining a gesture movement direction and a gesture movement distance of the user based on the user somatosensory characteristics;
generating an estimated motion track based on the gesture motion direction and the gesture motion distance;
and determining the angular displacement and the linear displacement based on the estimated motion trail.
10. The method of claim 6, wherein the motion arm assembly comprises a robotic arm and a manipulator.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111428613.1A CN114225368A (en) | 2021-11-29 | 2021-11-29 | Gesture somatosensory doll machine and control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111428613.1A CN114225368A (en) | 2021-11-29 | 2021-11-29 | Gesture somatosensory doll machine and control method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114225368A true CN114225368A (en) | 2022-03-25 |
Family
ID=80751643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111428613.1A Pending CN114225368A (en) | 2021-11-29 | 2021-11-29 | Gesture somatosensory doll machine and control method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114225368A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1881778A (en) * | 2005-05-27 | 2006-12-20 | 阿鲁策株式会社 | Stepping motor controller and gaming machine |
CN102905765A (en) * | 2009-11-16 | 2013-01-30 | 百利游戏有限公司 | Gesture-enhanced input devices |
CN205055410U (en) * | 2015-07-16 | 2016-03-02 | 浙江工业大学 | Grab baby's machine based on gesture control |
CN206021449U (en) * | 2016-08-31 | 2017-03-15 | 鲍鸿浩 | One kind selects thing to peddle equipment |
CN111589098A (en) * | 2020-04-21 | 2020-08-28 | 哈尔滨拓博科技有限公司 | Follow-up gesture control method and system for doll with stepping crown block |
CN112070987A (en) * | 2020-08-28 | 2020-12-11 | 哈尔滨拓博科技有限公司 | Game gift device control method based on gesture recognition, storage medium and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107079103B (en) | Cloud platform control method, device and holder | |
US20150100461A1 (en) | Robotic System Controlled by Multi Participants | |
EP2465650B1 (en) | Haptic interface handle with force-indicating trigger mechanism | |
JP7044047B2 (en) | robot | |
CN112230649B (en) | Machine learning method and mobile robot | |
CN113183150B (en) | Bionic hand control optimization method and system and electronic equipment | |
CN107688791A (en) | display content control method and device, system, storage medium and electronic equipment | |
JP2020082315A (en) | Image generating device, robot training system, image generating method, and image generating program | |
JP2013197872A (en) | Control device and control method | |
CN109983424B (en) | Method and device for selecting object in virtual reality scene and virtual reality equipment | |
WO2021003994A1 (en) | Control method for virtual character, and related product | |
CN111599459A (en) | Control method and control device for remote surgery and surgery system | |
JP2011101915A (en) | Robot system | |
CN112121406A (en) | Object control method and device, storage medium, electronic device | |
JP2013066962A (en) | Robot control device, and robot system | |
CN114225368A (en) | Gesture somatosensory doll machine and control method | |
CN114474066B (en) | Intelligent humanoid robot control system and method | |
KR101052521B1 (en) | Integrated servo drive and motor control method | |
CN110488742B (en) | Method and system for monitoring motion platform servo | |
JP2017159429A (en) | Robot control device, information processing device, and robot system | |
CN109531578B (en) | Humanoid mechanical arm somatosensory control method and device | |
JP6455869B2 (en) | Robot, robot system, control device, and control method | |
CN109352678B (en) | Gravity compensation method and device for robot shaft and robot | |
CN110253568B (en) | Robot control method, system, device and storage medium | |
Inaba et al. | Vision-based multisensor integration in remote-brained robots |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||