
CN110061755B - Operation assisting method and system, wearable device and engineering vehicle

Info

Publication number
CN110061755B
Authority
CN
China
Prior art keywords
command
image
commander
wearable device
gesture
Prior art date
Legal status
Active
Application number
CN201910359830.6A
Other languages
Chinese (zh)
Other versions
CN110061755A (en)
Inventor
张程
叶平
柴君飞
张连第
杨艳
李�权
Current Assignee
Xuzhou Heavy Machinery Co Ltd
Original Assignee
Xuzhou Heavy Machinery Co Ltd
Priority date
Filing date
Publication date
Application filed by Xuzhou Heavy Machinery Co Ltd
Priority to CN201910359830.6A
Publication of CN110061755A
Application granted
Publication of CN110061755B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/26 Speech to text systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827 Portable transceivers
    • H04B1/385 Transceivers carried on the body, e.g. in helmets
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L67/025 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP] for remote control or remote monitoring of applications
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/14 Session management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223 Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • Psychiatry (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Social Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides an operation assisting method and system, a wearable device, and an engineering vehicle, relating to the technical field of engineering vehicles. The operation assisting method includes the following steps: the wearable device establishes a communication connection with a terminal device in the operating room of an engineering vehicle; the wearable device shares a real-time video picture of the job site with the terminal device; in response to an operation by a job-site commander, the wearable device determines at least one of a command image and a marker image of a job target; and the wearable device fuses at least one of the command image and the marker image into the real-time video picture of the job site to assist the operator in working.

Description

Operation assisting method and system, wearable device and engineering vehicle
Technical Field
The present disclosure relates to the technical field of engineering vehicles, and in particular to an operation assisting method and system, a wearable device, and an engineering vehicle.
Background
With the rapid development of infrastructure and civil engineering, engineering vehicles such as cranes play an important role in engineering construction, rail transit, emergency rescue, and other fields.
Hoisting with a crane is a hazardous operation. To prevent accidents, close coordination is required both among personnel and between personnel and the crane. Completing a task requires the commander, the rigging personnel, and the operator to cooperate closely, and the degree of coordination among them directly affects the efficiency and safety of the hoisting operation.
The traditional mode of personnel cooperation relies mainly on video monitoring, audio intercom, and gesture guidance.
Disclosure of Invention
The inventors have noticed that, owing to construction-cost and space constraints, cameras can be deployed only in a few key local areas, so video monitoring covers only those areas rather than the whole working environment. As a result, the operator cannot clearly see the hoisted load and its surroundings from the control room. An audio intercom can be carried on the person, but using it interferes with the commander's limb-command actions and thereby with the operator's work. In addition, when the operator is far from the commander, the commander's command gestures cannot be seen clearly, which impairs accurate judgment of the hoisted load and sound operating decisions. With the traditional mode of personnel cooperation, operating efficiency is therefore low.
To solve the above problems, the embodiments of the present disclosure propose the following solutions.
According to one aspect of the embodiments of the present disclosure, an operation assisting method is provided, including: establishing, by a wearable device, a communication connection with a terminal device in the operating room of an engineering vehicle; sharing, by the wearable device, a real-time video picture of the job site with the terminal device; in response to an operation by a job-site commander, determining, by the wearable device, at least one of a command image and a marker image of a job target; and fusing, by the wearable device, at least one of the command image and the marker image into the real-time video picture of the job site to assist the operator in working.
In some embodiments, determining, by the wearable device, at least one of the command image and the marker image of the job target in response to an operation by the job-site commander includes: in response to a first operation by the commander to enable a command function, determining, by the wearable device, the command image according to command operation information of the commander; and in response to a second operation by the commander to enable a marking function, determining, by the wearable device, the marker image according to a gesture action with which the commander marks the job target.
In some embodiments, the command operation information includes at least one of a command voice and a command gesture.
In some embodiments, determining, by the wearable device, the command image according to the command operation information of the commander includes: recognizing, by the wearable device, the command voice to obtain a keyword; matching a corresponding command instruction according to the keyword; and matching the command image from a command image library according to the matched command instruction.
In some embodiments, matching the command image from the command image library according to the matched command instruction includes: matching, by the wearable device, a first command image from the command image library according to the matched command instruction; matching, by the wearable device, a second command image from the command image library according to the command gesture; judging, by the wearable device, whether the first command image and the second command image are the same; and taking the first command image or the second command image as the command image when the two are the same.
In some embodiments, determining, by the wearable device, the marker image according to the gesture action with which the commander marks the job target includes: judging whether the commander's gesture is a designated gesture; if so, judging whether the commander's gesture action is a designated action; and if the gesture action is the designated action, generating the marker image at the fingertip position in an augmented-reality manner.
According to another aspect of the embodiments of the present disclosure, a wearable device is provided, including: a communication module configured to establish a communication connection with a terminal device in the operating room of an engineering vehicle; a sharing module configured to share a real-time video picture of the job site with the terminal device; a determining module configured to determine at least one of a command image and a marker image of a job target in response to an operation by a job-site commander; and a fusion module configured to fuse at least one of the command image and the marker image into the real-time video picture of the job site.
In some embodiments, the determining module is configured to: in response to a first operation by the commander to enable a command function, determine the command image according to command operation information of the commander; and in response to a second operation by the commander to enable a marking function, determine the marker image according to a gesture action with which the commander marks the job target.
In some embodiments, the command operation information includes at least one of a command voice and a command gesture.
In some embodiments, the determining module is configured to: recognize the command voice to obtain a keyword; match a corresponding command instruction according to the keyword; and match the command image from a command image library according to the matched command instruction.
In some embodiments, the determining module is configured to: match a first command image from the command image library according to the matched command instruction; match a second command image from the command image library according to the command gesture; judge whether the first command image and the second command image are the same; and take the first command image or the second command image as the command image when the two are the same.
In some embodiments, the determining module is configured to: judge whether the commander's gesture is a designated gesture; if so, judge whether the commander's gesture action is a designated action; and if the gesture action is the designated action, generate the marker image at the fingertip position in an augmented-reality manner.
According to yet another aspect of the embodiments of the present disclosure, a wearable device is provided, including: a memory; and a processor coupled to the memory, the processor being configured to perform the method of any of the above embodiments based on instructions stored in the memory.
According to still another aspect of the embodiments of the present disclosure, an operation assisting system is provided, including: the wearable device of any of the above embodiments; and a terminal device configured to establish a communication connection with the wearable device, and to receive and display the real-time video picture of the job site shared by the wearable device.
According to still another aspect of the embodiments of the present disclosure, an engineering vehicle is provided, including the operation assisting system of any of the above embodiments.
In some embodiments, the engineering vehicle includes a crane.
According to a further aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, having computer program instructions stored thereon which, when executed by a processor, implement the method of any of the above embodiments.
In the embodiments of the present disclosure, the wearable device fuses at least one of the marker image of the job target and the command image into the real-time video picture of the job site, which helps the operator make better operating decisions and improves the accuracy and efficiency of the operation. In addition, compared with deploying cameras at fixed positions, the wearable device is more portable and simpler to operate, which greatly improves the convenience of the operation.
The technical solutions of the present disclosure are described in further detail below with reference to the accompanying drawings and embodiments.
Drawings
To explain the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below obviously show only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic flow diagram of an operation assisting method according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram of the effect of picture sharing according to some embodiments of the present disclosure;
FIG. 3 is a schematic structural diagram of a wearable device according to some embodiments of the present disclosure;
FIG. 4 is a schematic structural diagram of a wearable device according to further embodiments of the present disclosure;
FIG. 5 is a schematic diagram of an operation assisting system according to some embodiments of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by those skilled in the art from the disclosed embodiments without creative effort fall within the protection scope of the present disclosure.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that, for ease of description, the parts shown in the drawings are not drawn to scale.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but, where appropriate, they are to be considered part of the specification.
In all examples shown and discussed herein, any specific value should be interpreted as merely illustrative rather than limiting; other examples of the exemplary embodiments may therefore use different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
FIG. 1 is a schematic flow diagram of an operation assisting method according to some embodiments of the present disclosure.
In step 102, the wearable device establishes a communication connection with a terminal device in the operating room of the engineering vehicle.
For example, the wearable device may establish a wireless communication connection with the terminal device through a WiFi module, a Bluetooth module, or a Miracast P2P module; alternatively, it may establish a wired communication connection with the terminal device.
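As a concrete illustration, the connection step might look like the following Python sketch. The transport (plain TCP over the shared network), the terminal address and port, and the handshake message are all assumptions made for illustration; the disclosure itself does not prescribe a protocol.

    import socket

    TERMINAL_HOST = "192.168.1.10"  # hypothetical address of the cab terminal
    TERMINAL_PORT = 5000            # hypothetical service port

    def connect_to_terminal(host: str = TERMINAL_HOST,
                            port: int = TERMINAL_PORT,
                            timeout_s: float = 5.0) -> socket.socket:
        """Open a connection from the wearable device to the cab terminal."""
        sock = socket.create_connection((host, port), timeout=timeout_s)
        sock.sendall(b"HELLO wearable\n")  # hypothetical handshake message
        return sock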
As an example, the wearable device may be smart glasses or the like, worn by the commander. The terminal device may be a smartphone, a smart tablet, smart glasses, or the like, and may also be integrated with the engineering vehicle as an on-board terminal. Here, the engineering vehicle may include, but is not limited to, a crane, for example a wheeled crane.
In step 104, the wearable device shares a real-time video picture of the job site with the terminal device.
For example, the wearable device may share the real-time video picture of the job site with the terminal device by screen casting. The wearable device may enter a screen-cast mode in response to a screen-cast request initiated by the terminal device. In response to the commander opening a preset application on the wearable device, the device then automatically starts its camera and captures a real-time video picture of the job site. As the commander moves, the real-time video picture of the job site changes accordingly.
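A minimal sketch of the sharing step is given below, assuming the connection from the previous sketch and using OpenCV (an assumption; the disclosure only requires that frames be shared, e.g. by screen casting). Frames are length-prefixed so the terminal can split the byte stream back into JPEG images.

    import struct
    import cv2  # OpenCV, assumed available (pip install opencv-python)

    def stream_job_site(sock, camera_index: int = 0) -> None:
        """Capture frames from the wearable camera and push them as JPEG."""
        cap = cv2.VideoCapture(camera_index)
        try:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                encoded, jpeg = cv2.imencode(".jpg", frame)
                if not encoded:
                    continue
                payload = jpeg.tobytes()
                # 4-byte big-endian length prefix, then the JPEG bytes
                sock.sendall(struct.pack(">I", len(payload)) + payload)
        finally:
            cap.release()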
In step 106, in response to an operation by the job-site commander, the wearable device determines at least one of a command image and a marker image of a job target.
The operator can learn the commander's command action from the command image, and the position of the job target from the marker image. As an example, the job target may be the load to be hoisted.
For example, in response to a first operation by the commander to enable the command function, the wearable device determines the command image according to the commander's command operation information. Here, the command operation information may include at least one of a command voice and a command gesture; the manner in which the command image is determined is described in detail below. The first operation may be any of a short key press, a long key press, or a double or multiple tap on a touchpad.
As another example, in response to a second operation by the commander to enable the marking function, the wearable device determines the marker image of the job target according to the gesture action with which the commander marks the target. The second operation may likewise be, for example, any of a short key press, a long key press, or a double or multiple tap on a touchpad, provided it differs from the first operation.
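Purely as an illustration, the mapping from commander input events to the two functions might be dispatched as below; the concrete event-to-operation assignment is an assumption, since the disclosure only requires that the operations be distinct.

    from enum import Enum, auto

    class InputEvent(Enum):
        SHORT_PRESS = auto()  # assumed to be the first operation
        LONG_PRESS = auto()   # assumed to be the second operation

    class WearableFunctions:
        """Tracks which assist functions the commander has enabled."""

        def __init__(self) -> None:
            self.command_enabled = False
            self.marking_enabled = False

        def handle(self, event: InputEvent) -> None:
            if event is InputEvent.SHORT_PRESS:
                self.command_enabled = True   # enable the command function
            elif event is InputEvent.LONG_PRESS:
                self.marking_enabled = True   # enable the marking function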
In step 108, the wearable device fuses at least one of the command image and the marker image into the real-time video picture of the job site.
For example, the command image may be fused at a preset position that does not obstruct observation, while the marker image is fused at a position that reflects the job target's actual position relative to the engineering vehicle on the job site.
In some embodiments, in response to a third operation by the commander to disable the command function, the command image disappears from the real-time video picture of the job site; likewise, in response to a fourth operation to disable the marking function, the marker image disappears. The third and fourth operations may be any operations different from the first and second operations.
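The fusion step could be sketched as follows, assuming NumPy image arrays with matching channel counts; the corner placement and the centre-anchored marker are choices made here for illustration, not mandated by the disclosure. Passing None for an overlay reproduces the "disappear" behaviour of the third and fourth operations.

    from typing import Optional, Tuple
    import numpy as np

    def fuse_overlays(frame: np.ndarray,
                      command_img: Optional[np.ndarray],
                      marker_img: Optional[np.ndarray],
                      marker_xy: Optional[Tuple[int, int]]) -> np.ndarray:
        """Paste the command image in a preset corner and the marker image
        at the marked target's pixel position; None overlays are skipped."""
        out = frame.copy()
        h, w = out.shape[:2]
        if command_img is not None:
            ch, cw = command_img.shape[:2]
            out[h - ch:h, w - cw:w] = command_img     # lower-right corner
        if marker_img is not None and marker_xy is not None:
            x, y = marker_xy
            mh, mw = marker_img.shape[:2]
            y0, x0 = max(0, y - mh // 2), max(0, x - mw // 2)
            region = out[y0:y0 + mh, x0:x0 + mw]      # clipped at the edges
            region[:] = marker_img[:region.shape[0], :region.shape[1]]
        return out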
FIG. 2 is a schematic diagram of the effect of picture sharing according to some embodiments of the present disclosure.
As shown in FIG. 2, the real-time video picture of the job site includes the engineering vehicle itself, the marker image of the job target, and the command image. In the real-time video picture, the relative position of the marker image and the engineering vehicle is the same as on the actual job site. The command image may be, for example, a preset gesture image, and may be placed, for example, in the lower-right corner of the picture, where it overlaps neither the engineering vehicle nor the marker image.
In this embodiment, the wearable device fuses at least one of the marker image of the job target and the command image into the real-time video picture of the job site, which helps the operator make better operating decisions and improves operating accuracy and efficiency. In addition, compared with deploying cameras at fixed positions, the wearable device is more portable and simpler to operate, which greatly improves the convenience of the operation.
The wearable device can determine the command image from the commander's command operation information in several ways, described in detail below.
In some implementations, the wearable device determines the command image from the commander's command voice.
First, the wearable device recognizes the command voice to obtain a keyword; next, it matches a corresponding command instruction according to the keyword; finally, it matches the command image from a command image library according to the matched instruction. The command images may be, for example, images conforming to the GB 5082-85 standard for hoisting hand signals.
For example, as shown in Table 1 below, the command voice may be "prepare", the recognized keyword "prepare", the matched command instruction "prepare", and the corresponding command image the associated hand-signal image. As another example, the command voice may be "move the hook upward", the recognized keyword "hook up", the matched command instruction "raise the hook", and the corresponding command image an image showing the raised-arm hook-up signal.
TABLE 1 (published as an image in the original document; it maps command voices and keywords to command instructions and command images, as in the two examples above)
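The voice flow just described (recognized text, then keyword, then instruction, then library image) can be summarized in a short sketch. The keyword table carries only the two examples given above, and the asset paths into the command image library are hypothetical.

    from typing import Optional

    COMMAND_KEYWORDS = {          # keyword -> command instruction
        "prepare": "prepare",
        "hook up": "raise the hook",
    }

    COMMAND_IMAGE_LIBRARY = {     # instruction -> hypothetical image path
        "prepare": "images/prepare.png",
        "raise the hook": "images/raise_hook.png",
    }

    def match_command_image(recognized_text: str) -> Optional[str]:
        """Map recognized speech to a command image via keyword matching."""
        text = recognized_text.lower()
        for keyword, instruction in COMMAND_KEYWORDS.items():
            if keyword in text:
                return COMMAND_IMAGE_LIBRARY[instruction]
        return None

    print(match_command_image("move the hook upward"))  # images/raise_hook.png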
In other implementations, the wearable device determines the command image from the commander's command gesture. For example, the device can match the corresponding command image directly from the command image library according to the gesture: the gesture is recognized and then converted into the corresponding command image.
In still other implementations, the wearable device determines the command image from both the command voice and the command gesture.
For example, the wearable device may match a first command image from the command image library according to the matched command instruction, and a second command image from the library according to the command gesture. The wearable device then judges whether the first and second command images are the same, and only when they are identical takes the first (equivalently, the second) as the command image. A command image determined in this way is more accurate.
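A sketch of this cross-check is given below; match_from_voice and match_from_gesture stand in for the two matchers described above and are placeholders, not APIs defined by the disclosure.

    from typing import Callable, Optional

    def resolve_command_image(voice_text: str,
                              gesture_label: str,
                              match_from_voice: Callable[[str], Optional[str]],
                              match_from_gesture: Callable[[str], Optional[str]]
                              ) -> Optional[str]:
        """Trust the command image only when both modalities agree."""
        first = match_from_voice(voice_text)
        second = match_from_gesture(gesture_label)
        if first is not None and first == second:
            return first   # identical, so either image can be used
        return None        # disagreement: no command image is fused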
The following describes how the wearable device determines the marker image from the gesture action with which the commander marks the job target.
First, it is judged whether the commander's gesture is a designated gesture. The designated gesture may be, for example, a designated hand contour or a designated finger contour; a gesture is considered valid, i.e., designated, once the specified hand contour is recognized.
If the commander's gesture is the designated gesture, it is then judged whether the gesture action is a designated action. The designated action may be, for example, a motion resembling pressing a button downward.
If the gesture action is the designated action, the marker image is generated at the fingertip position in an augmented-reality manner, the fingertip pointing at the job target. The relative position of the AR-generated marker image and the engineering vehicle reflects their relative position on the job site.
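This two-stage check can be sketched as follows. The gesture and action recognizers are placeholders (a real implementation might use a hand-tracking library), and the Marker type is introduced here only for illustration.

    from dataclasses import dataclass
    from typing import Callable, Optional, Tuple

    @dataclass
    class Marker:
        x: int
        y: int
        label: str = "job target"

    def try_place_marker(hand_contour,
                         motion_track,
                         is_designated_gesture: Callable[[object], bool],
                         is_designated_action: Callable[[object], bool],
                         fingertip_xy: Tuple[int, int]) -> Optional[Marker]:
        """Place an AR marker at the fingertip only if both checks pass."""
        if not is_designated_gesture(hand_contour):
            return None                    # not the designated gesture
        if not is_designated_action(motion_track):
            return None                    # right gesture, wrong action
        x, y = fingertip_xy
        return Marker(x=x, y=y)            # rendered in AR at the fingertip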
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments can be cross-referenced. Since the wearable-device embodiments substantially correspond to the method embodiments, they are described relatively briefly; for relevant details, refer to the description of the method embodiments.
Fig. 3 is a schematic structural diagram of a wearable device according to some embodiments of the present disclosure.
As shown in fig. 3, the wearable device includes a communication module 301, a sharing module 302, a determination module 303, and a fusion module 304.
The communication module 301 is configured to establish a communication connection with a terminal device in the operating room of the engineering vehicle. The sharing module 302 is configured to share a real-time video picture of the job site with the terminal device, for example by screen casting. The determining module 303 is configured to determine at least one of the command image and the marker image of the job target in response to an operation by the job-site commander. The fusion module 304 is configured to fuse at least one of the command image and the marker image into the real-time video picture of the job site.
In some embodiments, the determining module 303 is configured to: in response to a first operation by the commander to enable the command function, determine the command image according to the commander's command operation information; and in response to a second operation by the commander to enable the marking function, determine the marker image according to the gesture action with which the commander marks the job target. The command operation information includes, for example, at least one of a command voice and a command gesture.
In some embodiments, the determining module 303 is configured to determine the command image by: recognizing the command voice to obtain a keyword; matching a corresponding command instruction according to the keyword; and matching the command image from the command image library according to the matched instruction.
In some embodiments, the determining module 303 is configured to determine the command image by: matching a first command image from the command image library according to the matched command instruction; matching a second command image from the library according to the command gesture; judging whether the first and second command images are the same; and taking the first or second command image as the command image when they are identical.
In some embodiments, the determining module 303 is configured to determine the marker image by: judging whether the commander's gesture is the designated gesture; if so, judging whether the gesture action is the designated action; and if the gesture action is the designated action, generating the marker image at the fingertip position in an augmented-reality manner.
FIG. 4 is a schematic structural diagram of a wearable device according to further embodiments of the present disclosure.
As shown in FIG. 4, the wearable device 400 of this embodiment includes a memory 401 and a processor 402 coupled to the memory 401; the processor 402 is configured to execute the operation assisting method of any of the foregoing embodiments based on instructions stored in the memory 401.
The memory 401 may include, for example, system memory and a fixed non-volatile storage medium. The system memory may store, for example, an operating system, application programs, a boot loader (BootLoader), and other programs.
The wearable device 400 may also include an input/output interface 403, a network interface 404, a storage interface 405, and the like. These interfaces 403, 404, and 405, the memory 401, and the processor 402 may be connected, for example, by a bus 406. The input/output interface 403 provides a connection interface for input/output devices such as a display, a mouse, a keyboard, and a touchscreen; the network interface 404 provides a connection interface for various networking devices; and the storage interface 405 provides a connection interface for external storage devices such as an SD card or a USB drive.
FIG. 5 is a schematic diagram of an operation assisting system according to some embodiments of the present disclosure.
As shown in FIG. 5, the operation assisting system includes the wearable device 501 of any of the above embodiments and a terminal device 502. The terminal device 502 is configured to establish a communication connection with the wearable device 501, and to receive and display the real-time video picture of the job site shared by the wearable device 501. The wearable device 501 may be worn by the commander, while the terminal device 502 is located in the operating room.
The embodiments of the present disclosure also provide an engineering vehicle including the operation assisting system of any of the above embodiments. The engineering vehicle may be, for example, a crane.
The embodiments of the present disclosure also provide a computer-readable storage medium storing computer program instructions that, when executed by a processor, implement the operation assisting method of any of the above embodiments.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that the functions specified in one or more of the flows in the flowcharts and/or one or more of the blocks in the block diagrams can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is merely exemplary of the present disclosure and is not intended to limit it. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present disclosure shall fall within its protection scope.

Claims (17)

1. An operation assisting method, comprising:
establishing, by a wearable device, a communication connection with a terminal device in an operating room of an engineering vehicle;
sharing, by the wearable device, a real-time video picture of a job site with the terminal device, wherein the real-time video picture of the job site includes the engineering vehicle;
in response to an operation by a job-site commander, determining, by the wearable device, a command image and a marker image of a job target, the job target being a target to be hoisted; and
fusing, by the wearable device, the command image and the marker image into the real-time video picture of the job site to assist an operator in working, wherein, in the real-time video picture of the job site, the relative position of the marker image and the engineering vehicle reflects the relative position of the job target and the engineering vehicle on the job site.
2. The operation assisting method according to claim 1, wherein determining, by the wearable device, the command image and the marker image of the job target in response to an operation by the job-site commander comprises:
in response to a first operation by the commander to enable a command function, determining, by the wearable device, the command image according to command operation information of the commander; and
in response to a second operation by the commander to enable a marking function, determining, by the wearable device, the marker image according to a gesture action with which the commander marks the job target.
3. The operation assisting method according to claim 2, wherein the command operation information comprises at least one of a command voice and a command gesture.
4. The operation assisting method according to claim 3, wherein determining, by the wearable device, the command image according to the command operation information of the commander comprises:
recognizing, by the wearable device, the command voice to obtain a keyword;
matching a corresponding command instruction according to the keyword; and
matching the command image from a command image library according to the matched command instruction.
5. The operation assisting method according to claim 4, wherein matching the command image from the command image library according to the matched command instruction comprises:
matching, by the wearable device, a first command image from the command image library according to the matched command instruction;
matching, by the wearable device, a second command image from the command image library according to the command gesture;
judging, by the wearable device, whether the first command image and the second command image are the same; and
taking the first command image or the second command image as the command image when the first command image is the same as the second command image.
6. The operation assisting method according to claim 2, wherein determining, by the wearable device, the marker image according to the gesture action with which the commander marks the job target comprises:
judging whether the commander's gesture is a designated gesture;
if the commander's gesture is the designated gesture, judging whether the commander's gesture action is a designated action; and
if the commander's gesture action is the designated action, generating the marker image at the fingertip position in an augmented-reality manner.
7. A wearable device, comprising:
a communication module configured to establish a communication connection with a terminal device in an operating room of an engineering vehicle;
a sharing module configured to share a real-time video picture of a job site with the terminal device, wherein the real-time video picture of the job site includes the engineering vehicle;
a determining module configured to determine, in response to an operation by a job-site commander, a command image and a marker image of a job target, the job target being a target to be hoisted; and
a fusion module configured to fuse the command image and the marker image into the real-time video picture of the job site, wherein, in the real-time video picture of the job site, the relative position of the marker image and the engineering vehicle reflects the relative position of the job target and the engineering vehicle on the job site.
8. The wearable device according to claim 7, wherein the determining module is configured to:
in response to a first operation by the commander to enable a command function, determine the command image according to command operation information of the commander; and
in response to a second operation by the commander to enable a marking function, determine the marker image according to a gesture action with which the commander marks the job target.
9. The wearable device according to claim 8, wherein the command operation information comprises at least one of a command voice and a command gesture.
10. The wearable device according to claim 9, wherein the determining module is configured to:
recognize the command voice to obtain a keyword;
match a corresponding command instruction according to the keyword; and
match the command image from a command image library according to the matched command instruction.
11. The wearable device according to claim 10, wherein the determining module is configured to:
match a first command image from the command image library according to the matched command instruction;
match a second command image from the command image library according to the command gesture;
judge whether the first command image and the second command image are the same; and
take the first command image or the second command image as the command image when the first command image is the same as the second command image.
12. The wearable device according to claim 8, wherein the determining module is configured to:
judge whether the commander's gesture is a designated gesture;
if the commander's gesture is the designated gesture, judge whether the commander's gesture action is a designated action; and
if the commander's gesture action is the designated action, generate the marker image at the fingertip position in an augmented-reality manner.
13. A wearable device, comprising:
a memory; and
a processor coupled to the memory, the processor being configured to perform the method of any one of claims 1-6 based on instructions stored in the memory.
14. An operation assisting system, comprising:
the wearable device of any one of claims 8-13; and
a terminal device configured to establish a communication connection with the wearable device, and to receive and display the real-time video picture of the job site shared by the wearable device.
15. An engineering vehicle, comprising the operation assisting system of claim 14.
16. The engineering vehicle according to claim 15, wherein the engineering vehicle comprises a crane.
17. A computer-readable storage medium having computer program instructions stored thereon, wherein the instructions, when executed by a processor, implement the method of any one of claims 1-6.
CN201910359830.6A 2019-04-30 2019-04-30 Operation assisting method and system, wearable device and engineering vehicle Active CN110061755B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910359830.6A CN110061755B (en) 2019-04-30 2019-04-30 Operation assisting method and system, wearable device and engineering vehicle


Publications (2)

Publication Number Publication Date
CN110061755A CN110061755A (en) 2019-07-26
CN110061755B true CN110061755B (en) 2021-04-16

Family

ID=67321779

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910359830.6A Active CN110061755B (en) 2019-04-30 2019-04-30 Operation assisting method and system, wearable device and engineering vehicle

Country Status (1)

Country Link
CN (1) CN110061755B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112883754B (en) * 2019-11-29 2024-04-12 华晨宝马汽车有限公司 Auxiliary operation system of forklift
US11769134B2 (en) * 2021-03-22 2023-09-26 International Business Machines Corporation Multi-user interactive ad shopping using wearable device gestures
CN114415837B (en) * 2022-01-25 2024-10-29 中国农业银行股份有限公司 Operation auxiliary system and method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2813563Y (en) * 2005-04-20 2006-09-06 苗小平 Visual monitoring device of tower crane
CN104063039A (en) * 2013-03-18 2014-09-24 朱慧灵 Human-computer interaction method of wearable computer intelligent terminal
CN106339094A (en) * 2016-09-05 2017-01-18 山东万腾电子科技有限公司 Interactive remote expert cooperation maintenance system and method based on augmented reality technology
CN107203272A (en) * 2017-06-23 2017-09-26 山东万腾电子科技有限公司 Wearable augmented reality task instruction system and method based on myoelectricity cognition technology
CN206654664U (en) * 2017-03-20 2017-11-21 河南师范大学 A kind of control tower crane based on virtual reality technology
CN107886172A (en) * 2017-10-19 2018-04-06 中车青岛四方机车车辆股份有限公司 System is supported in a kind of remote visualization maintenance
CN108509025A (en) * 2018-01-26 2018-09-07 吉林大学 A kind of crane intelligent Lift-on/Lift-off System based on limb action identification
CN109405804A (en) * 2018-11-05 2019-03-01 徐州重型机械有限公司 Operation householder method and system
CN109446876A (en) * 2018-08-31 2019-03-08 百度在线网络技术(北京)有限公司 Sign language information processing method, device, electronic equipment and readable storage medium storing program for executing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108879440A (en) * 2018-06-20 2018-11-23 国网山东省电力公司济宁供电公司 Intelligent examination and repair system and method based on wearable augmented reality display terminal and cloud platform

Also Published As

Publication number Publication date
CN110061755A (en) 2019-07-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant