
CN107635710B - Armband-based system and method for controlling welding equipment using gestures and similar actions - Google Patents

Info

Publication number
CN107635710B
CN107635710B
Authority
CN
China
Prior art keywords
welding
command
operator
gesture
welding system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201680028213.2A
Other languages
Chinese (zh)
Other versions
CN107635710A (en)
Inventor
托德·杰拉尔德·巴茨勒
罗伯特·亚瑟·巴茨勒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Illinois Tool Works Inc
Original Assignee
Illinois Tool Works Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 14/659,853 (US10987762B2)
Application filed by Illinois Tool Works Inc filed Critical Illinois Tool Works Inc
Publication of CN107635710A
Application granted granted Critical
Publication of CN107635710B

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K9/00 Arc welding or cutting
    • B23K9/095 Monitoring or automatic control of welding parameters
    • B23K9/0956 Monitoring or automatic control of welding parameters using sensing means, e.g. optical
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K9/00 Arc welding or cutting
    • B23K9/10 Other electric circuits therefor; Protective circuits; Remote controls
    • B23K9/1087 Arc welding using remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Plasma & Fusion (AREA)
  • Mechanical Engineering (AREA)
  • Arc Welding Control (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Portable, wearable, or integratable devices may be used to detect gestures or actions by an operator of a welding system and perform actions based thereon. The detection may be performed using a sensor (e.g., a MEMS or muscle sensor) that produces sensory data. The detected gestures or actions may be processed and translated into corresponding actions or commands, which may be processed directly or communicated to other devices in the welding system. The actions or commands may include welding actions/commands configured to control a particular component used during a welding operation, and/or gesture-related actions/commands configured to control gesture-related components and/or operations. Feedback is provided to the operator to facilitate, for example, confirmation or verification of gesture detection, identification of a corresponding action or command, and completion of the identified action or command. The feedback may be non-visual feedback, such as tactile or audio feedback.

Description

Armband-based system and method for controlling welding equipment using gestures and similar actions
Priority claim
This application claims priority to, and is a continuation-in-part (CIP) of, U.S. Patent Application No. 14/659,853, filed March 17, 2015, and U.S. Patent Application No. 14/502,599, filed September 30, 2014. The entire contents of the above-mentioned applications are incorporated herein by reference.
Background
Welding is a process that is becoming more and more common across all industries. Welding can be performed in an automated or manual manner. In particular, although welding may be automated in some instances, manual welding operations (e.g., where a welding operator uses a welding gun or torch to weld) remain in widespread use. In either mode (automatic or manual), the success of the welding operation depends largely on the proper use of the welding equipment; for example, the success of manual welding depends on the proper use of the welding gun or torch by the welding operator. For example, incorrect torch angle, contact tip-to-work distance, travel speed, and aim are parameters that can determine weld quality. However, even experienced welding operators often have difficulty monitoring and maintaining these important parameters throughout the welding process.
Disclosure of Invention
Various embodiments of the present disclosure are directed to armband-based systems and methods for controlling welding equipment using gestures and similar actions, substantially as shown in or described in connection with at least one of the figures, as set forth more completely in the claims.
Drawings
Fig. 1 illustrates an exemplary arc welding system according to aspects of the present disclosure.
Fig. 2 illustrates an exemplary welding apparatus in accordance with aspects of the present disclosure.
Fig. 3 is a block diagram illustrating an exemplary use of a motion detection system operating within a welding system in accordance with aspects of the present disclosure.
FIG. 4 is a block diagram illustrating an exemplary motion detection system in accordance with aspects of the present disclosure.
Fig. 5 is a block diagram illustrating an example gesture accessory device that may be used in conjunction with and in wireless communication with a motion detection system in accordance with aspects of the present disclosure.
Fig. 6 is a flow diagram illustrating an exemplary method for transmitting a welding command from a motion detection system to a welding system in accordance with aspects of the present disclosure.
FIG. 7 is a flow diagram illustrating an exemplary method for associating a welding command with a particular gesture or action in accordance with aspects of the present disclosure.
Fig. 8 illustrates an example gesture-based armband apparatus for remotely controlling a welding operation in accordance with aspects of the present disclosure.
Fig. 9 illustrates example circuitry of a gesture-based armband apparatus for remotely controlling a welding operation in accordance with aspects of the present disclosure.
FIG. 10 is a flow diagram illustrating an exemplary method for providing feedback during gesture-based remote control of a welding operation in accordance with aspects of the present disclosure.
Detailed Description
Fig. 1 illustrates an exemplary arc welding system according to aspects of the present disclosure. Referring to fig. 1, an exemplary welding system 10 is shown in which an operator 18 wears a welding headset 20 and welds a workpiece 24 using a welding torch 504 to which power is delivered by a device 12 through a conduit 14, and a welding monitoring device 28 may be used to monitor the welding operation. The apparatus 12 may include a power source, an optional inert shielding gas source, and a wire feeder that automatically provides welding wire/filler material.
The welding system 10 of fig. 1 may be configured to form the weld joint 512 by any known technique, including electric welding techniques such as shielded metal arc welding (i.e., stick welding), metal inert gas (MIG) welding, tungsten inert gas (TIG) welding, and resistance welding.
Optionally, in any embodiment, the welding device 12 may be an arc welding device that provides Direct Current (DC) or Alternating Current (AC) to a consumable or non-consumable electrode 16 (e.g., better shown in fig. 5C) of the welding torch 504. The electrode 16 delivers current to a weld point on the workpiece 24. In the welding system 10, the operator 18 controls the position and operation of the electrode 16 by manipulating the welding torch 504 and triggering the start and stop of the current. When current flows, an arc 26 is formed between the electrode and the workpiece 24. Thus, the conduit 14 and the electrode 16 deliver a current and voltage sufficient to create an arc 26 between the electrode 16 and the workpiece. The arc 26 locally melts the workpiece 24 and the wire or rod supplied to the weld joint 512 (the electrode 16 in the case of a consumable electrode, or a separate wire or rod in the case of a non-consumable electrode) at the weld point between the electrode 16 and the workpiece 24, forming the weld joint 512 as the metal cools.
Alternatively, in any embodiment, the welding monitoring device 28 may be used to monitor the welding operation. The weld monitoring device 28 may be used to monitor various aspects of a welding operation, particularly in real time (i.e., while welding). For example, the weld monitoring device 28 may be operable to monitor arc characteristics such as length, current, voltage, frequency, variation, and instability. Data derived from the weld monitoring may be used (e.g., by operator 18 and/or by an automatic quality control system) to ensure a proper weld.
As shown and described more fully below, the device 12 and the headset 20 may communicate via a link 25, through which link 25 the headset 20 may control the settings of the device 12 and/or the device 12 may provide information to the headset 20 regarding its settings. Although a wireless link is shown, the link 25 may be wireless, wired, or optical.
In some cases, an operator (e.g., operator 18) may need to interact with equipment used in the welding operation and/or equipment used to monitor the welding operation. For example, the operator 18 may need to interact with the welding equipment 12 and/or the welding monitoring equipment 28 to, for example, control the equipment (e.g., adjust its settings), obtain real-time feedback information (e.g., real-time equipment status, welding monitoring related information, etc.), and so on. However, in some use cases, the welding environment may impose certain limitations on possible solutions for interacting with devices used in connection with welding operations. For example, the welding environment may be cluttered (e.g., with many devices, wires, connectors, etc.) and/or may have space limitations (e.g., a compact workspace, awkward location or placement of workpieces, etc.). Therefore, it may not be desirable to add more equipment to such an environment to enable interaction with the welding or welding monitoring equipment, particularly equipment that requires a wired connector. The use of such systems or devices, particularly systems or devices that occupy too much space, may result in additional undesirable clutter and/or may occupy valuable welding cell space. Furthermore, the use of wired connections or connectors (e.g., wires) may limit the distance over which these devices can be used (e.g., the distance from the power source or from the device with which the operator is attempting to interact) and may create safety issues (e.g., trip hazards).
Thus, in various embodiments according to the present disclosure, small control devices configured to utilize non-wire based solutions (e.g., wireless communication technologies; audio, video, and/or sensory input/output (I/O) solutions, etc.) may be used. For example, control devices implemented according to the present disclosure may be small enough so that they may be worn by an operator or integrated into equipment or clothing (e.g., welding helmets, welding gloves, etc.) that the operator uses or wears directly during a welding operation. For example, these devices may be small enough that they can be worn by an operator on or in a belt, arm, welding helmet, or welding glove. Further, such control devices may be configured to support and use wireless technology (e.g., WiFi or bluetooth) to perform the communications needed for interface operations.
In some embodiments, the control device may use or support motion detection and recognition to be able to detect and recognize gestures or motions of the operator. In particular, such control devices may be configured or programmed to recognize particular gestures that the operator may perform when attempting to remotely interact with (e.g., remotely control or obtain data from) a particular device in the welding environment. In some cases, these gestures may simulate operations performed by an operator when interacting directly with the device. For example, a gesture or action may include the operator simulating the turning of a volume control knob; when detected and interpreted as such, the gesture may be transmitted to the corresponding device to trigger a response of the device as if the knob were actually present. This removes the requirement for an actual physical interface, but still provides the required control capability. The control devices may be implemented such that they may be secured to specific parts of the operator's body or to clothing worn by the operator. An exemplary embodiment may include an elastic and/or form-fitting armband so that it may be secured to an arm of the operator.
The motion detection and identification may be performed using any solution suitable for use in connection with the welding arrangement according to the present disclosure. These solutions may include, for example, devices or components worn by the operator or integrated onto equipment or clothing that the operator directly uses or wears during the welding operation. For example, a detection component, which may be implemented as a standalone device or as a built-in component (e.g., a control device), may be configured to detect gestures or actions of an operator. Further, a motion recognition component, which may be implemented as a standalone device or as a built-in component (e.g., a control device), may be configured to receive a detected gesture or motion and determine when/whether the detected gesture or motion corresponds to a particular user input (e.g., a command). This may be accomplished by comparing the detected gesture or action to a predefined plurality of welding commands, each predefined welding command being associated with a particular gesture or action. Accordingly, based on a successful match of the detected gesture or action with a gesture or action associated with the welding command, the action recognition component recognizes the welding command from the plurality of welding commands and communicates the recognized welding command to a component of the welding system.
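By way of illustration only, the following Python sketch shows one way the matching described above could be organized: a detected gesture label is looked up in a table of predefined associations, and only a successful match is forwarded to the welding system. The gesture names, command names, and transport callback are hypothetical and are not part of the disclosed system.

```python
# Minimal sketch of gesture-to-command matching, assuming gestures arrive
# as simple string labels produced by the detection component. All names
# here (gestures, commands, the transport callback) are illustrative only.

# Predefined associations: each welding command is tied to one gesture.
GESTURE_TO_COMMAND = {
    "palm_up_sweep": "INCREASE_WIRE_SPEED",
    "palm_down_sweep": "DECREASE_WIRE_SPEED",
    "fist_hold": "STOP_WIRE_FEEDER",
}

def recognize(detected_gesture, send_to_welding_system):
    """Look up the detected gesture; forward the command only on a match."""
    command = GESTURE_TO_COMMAND.get(detected_gesture)
    if command is None:
        return False  # no association; ignore the gesture
    send_to_welding_system(command)
    return True

# Example: a successful match forwards the command to the welding system.
recognize("palm_up_sweep", lambda cmd: print("sending", cmd))
```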
The motion detection and recognition functionality may be configurable and/or programmable. For example, in addition to a normal operating mode, in which the motion recognition component is simply used to recognize and trigger a particular welding command based on a detected gesture or movement, the motion recognition component may also operate in a "configuration" or "programming" mode. When operating in this configuration (or programming) mode, the motion recognition component may receive one or more detected gestures or motions together with a welding-related command (e.g., provided by the operator using a suitable means), and may associate the welding-related command with at least one of the detected gestures or motions and store the association for later comparison.
Fig. 2 illustrates an exemplary welding apparatus in accordance with aspects of the present disclosure. The apparatus 12 of fig. 2 includes an antenna 202, a communication port 204, communication interface circuitry 206, a user interface module 208, control circuitry 210, power supply circuitry 212, a wire feeder module 214, and a gas supply module 216.
The antenna 202 may be any type of antenna suitable for the frequency, power level, etc. used by the communication link 25.
The communication ports 204 may include, for example, ethernet twisted pair ports, USB ports, HDMI ports, Passive Optical Network (PON) ports, and/or any other suitable ports for connecting with a wired or optical cable.
The communication interface circuitry 206 may be operable to connect the control circuitry 210 to the antenna 202 and/or the port 204 for transmit and receive operations. For transmission, the communication interface 206 may receive data from the control circuitry 210, packetize the data, and convert the data to physical layer signals according to the protocol used on the communication link 25. For reception, the communication interface may receive physical layer signals via the antenna 202 or the port 204, recover data from the received physical layer signals (demodulate, decode, etc.), and provide the data to the control circuitry 210.
The user interface module 208 may include electromechanical interface components (e.g., screen, speaker, microphone, buttons, touch screen, etc.) and associated drive circuitry. The user interface 208 may generate electrical signals in response to user input (e.g., screen touches, button presses, voice commands, etc.). The drive circuitry of the user interface module 208 may condition (e.g., amplify, digitize, etc.) and adapt the signal to the control circuitry 210. In response to signals from the control circuitry 210, the user interface 208 may produce audible, visual, and/or tactile outputs (e.g., via speakers, displays, and/or motors/actuators/servos, etc.).
Control circuitry 210 includes circuitry (e.g., a microcontroller and memory) operable to process data received from the communication interface 206, the user interface 208, the power supply 212, the wire feeder 214, and/or the gas supply 216; and is operable to output data and/or control signals to the communication interface 206, the user interface 208, the power supply 212, the wire feeder 214, and/or the gas supply 216.
The power supply circuitry 212 includes circuitry for generating power for delivery to the welding electrode through the conduit 14. The power supply circuitry 212 may include, for example, one or more voltage regulators, current regulators, inverters, and/or the like. The voltage and/or current output by the power supply circuitry 212 may be controlled by control signals from the control circuitry 210. The power supply circuitry 212 may also include circuitry for reporting the present current and/or voltage to the control circuitry 210. In an exemplary embodiment, the power supply circuitry 212 may include circuitry for measuring the voltage and/or current at the conduit 14 (at either or both ends of the conduit 14), such that the reported voltage and/or current is the actual value, and not merely an expected value based on calibration.
The wire feeder module 214 is configured to deliver the consumable wire electrode 16 to the weld joint 512. The wire feeder 214 may include, for example, a spool for holding welding wire, an actuator for drawing welding wire from the spool for delivery to the weld joint 512, and circuitry for controlling the rate at which the actuator delivers the welding wire. The actuator may be controlled based on a control signal from the control circuitry 210. Wire feeder module 214 may also include circuitry for reporting the current wire speed and/or the amount of wire remaining to control circuitry 210. In an exemplary embodiment, the wire feeder module 214 may include circuitry and/or mechanical components for measuring the wire speed such that the reported speed is an actual value, not merely an expected value based on calibration.
The gas supply module 216 may be configured to provide a shielding gas, through the conduit 14, for use in the welding process. The gas supply module 216 may include an electrically controlled valve for controlling the flow of gas. The valve may be controlled by a control signal from the control circuitry 210 (which may be sent through the wire feeder 214, or directly from the controller 210, as shown in phantom). The gas supply module 216 may also include circuitry for reporting the current gas flow to the control circuitry 210. In an exemplary embodiment, the gas supply module 216 may include circuitry and/or mechanical components for measuring the gas flow, such that the reported flow is the actual flow, and not just an expected value based on calibration.
Fig. 3 is a block diagram illustrating an exemplary use of a motion detection system operating within a welding system in accordance with aspects of the present disclosure. Illustrated in fig. 3 is a gesture-based welding arrangement 310 implemented in accordance with an exemplary embodiment that includes a welding system 312 and a motion detection system 314.
Motion detection system 314 may include detection circuitry 316, motion recognition system 318, and communication circuitry 320. In certain embodiments, the detection circuitry 316 may include an accessory device 322 (e.g., a sensor, accelerometer, computing device, tag, etc., which may be incorporated into a worn device or article of clothing) that may be remote from the motion detection system 314, such as being disposed on or near the welding operator 324, but may be in communication with the motion detection system 314 via a wired or wireless system. As described above, the motion detected by the motion detection system 314 is translated into one or more command signals that are utilized by the welding system 312 to change welding operating parameters.
The detection circuitry 316 (e.g., a sensor system) may include one or more cameras or sensor systems that may detect gestures and/or movements of the welding operator 324. It should be noted that in some cases, the detection circuitry 316 may include an accessory device 322. Further, the detection circuitry 316 may be configured to detect an action of the accessory device 322. For example, the detection circuitry 316 may capture movement of a sensor disposed within the accessory device 322. In other cases, the detection circuitry 316 directly detects the pose and/or movement of the welding operator 324 without the need for an intermediate accessory device 322. For example, the detection circuitry 316 may identify the welding operator and capture movements of the welding operator (e.g., movements of joints, limbs, etc. of the welding operator). Further, in some cases, the detection circuitry 316 receives motion information from the accessory device 322 for detecting gestures and/or movements of the welding operator 324. For example, accessory device 322 may detect movement of the welding operator, such as blinking of the eyes or pinching of a finger, and may process and communicate the detected movement to motion detection system 314.
Accordingly, the detection circuitry 316 may include various types of audio/video detection techniques to enable it to detect the position, movement, pose, and/or action of the welding operator 324. For example, the detection circuitry 316 may include a digital camera, a video camera, an infrared sensor, an optical sensor (e.g., video/camera), a radio frequency energy detector, a sound sensor, a vibration sensor, a thermal sensor, a pressure sensor, a magnetic sensor, etc., to detect a position and/or movement of the welding operator 324, and/or to detect an action of the accessory device 322. Likewise, any of these audio/video detection techniques may also be incorporated into the accessory device 322.
In certain embodiments, a camera (e.g., digital, video, etc.) may incorporate motion detection components triggered by motion, heat, or vibration, which may be used to detect motion of the welding operator 324 or the accessory device 322. In certain embodiments, infrared sensors may be used to measure infrared light radiated from the welding operator 324 or the accessory device 322 to determine or detect gestures or actions. Further, other types of sensors (e.g., thermal, vibration, pressure, sound, magnetic sensors, etc.) may be utilized to detect heat, vibration, pressure, sound, or combinations thereof to determine or detect a pose or action of the welding operator 324 or the accessory device 322. It should be noted that in certain embodiments, multiple sensors may be positioned at various locations (either on the motion detection system 314 or remote from the motion detection system 314) to determine these parameters and thereby more accurately determine the motion of the welding operator 324 or the accessory device 322. Further, it should be noted that one or more different types of sensors may be incorporated into the detection circuitry 316; for example, a thermal sensor may be configured to detect motion of the welding operator 324 or the accessory device 322. In certain embodiments, a radio frequency (RF) energy sensor may be used to detect motion of the welding operator 324 or the accessory device 322 through radar, microwave, or tomographic motion detection.
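As an illustrative sketch of combining several sensor types, the following Python snippet flags an operator action only when a minimum number of sensors agree. The sensor names, units, and thresholds are assumptions made for the example, not values from the disclosure.

```python
# Illustrative only: combine readings from several sensor types to decide
# whether a deliberate operator action occurred. The sensor names, units,
# and thresholds below are assumptions for the sketch, not values from the
# disclosure.

THRESHOLDS = {
    "accelerometer_g": 1.5,   # acceleration magnitude above rest
    "vibration_mm_s": 4.0,    # vibration velocity
    "ir_delta_c": 2.0,        # change in sensed infrared level
}

def action_detected(readings, required_votes=2):
    """Return True when enough sensors agree that motion occurred."""
    votes = sum(
        1 for name, value in readings.items()
        if value >= THRESHOLDS.get(name, float("inf"))
    )
    return votes >= required_votes

print(action_detected({"accelerometer_g": 2.1, "vibration_mm_s": 5.2, "ir_delta_c": 0.3}))
```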
The detected position, gesture, and/or motion received by the detection circuitry 316 may be input into the motion recognition system 318, and the motion recognition system 318 may translate the detected motion into various welding commands corresponding to the detected motion. After determining the welding command corresponding to the detected action, the action recognition system 318 may send the welding command to the welding system 312 via the communication circuitry 320. The welding system 312, or more specifically, one or more components of the welding system 312, may then implement the welding command. For example, the action recognition system 318 may receive the detected action from the detection circuitry 316 and may interpret the detected action as a command to stop the function of a component of the welding system 312. The communication circuitry 320 can then signal the welding system 312 to stop that component, as desired by the welding operator 324.
The welding system 312 may include various components that may receive control command signals. The systems and methods described herein may be used with Gas Metal Arc Welding (GMAW) systems, other arc welding processes (e.g., FCAW-G, GTAW (TIG), SAW, SMAW), and/or other welding processes (e.g., friction stir, laser, hybrid). For example, in the illustrated embodiment, the welding system 312 may include a welding power supply 326, a welding wire feeder 328, a welding torch 330, and a gas supply system 332. However, it should be noted that in other embodiments, various other welding components 334 may receive control command signals from the motion detection system 314.
The welding power supply unit 326 generally provides power to the welding system 312 and various other accessories, and may be coupled to the welding wire feeder 328 via a weld cable. The welding power supply unit 326 may also be coupled to a workpiece (not shown) using a cable with a clamp. In the illustrated embodiment, the welding wire feeder 328 is coupled to the welding torch 330 by a welding cable to provide welding wire and power to the welding torch 330 during operation of the welding system 312. In another embodiment, the welding power supply 326 may be coupled to the welding torch 330 and power the welding torch 330 directly. The welding power supply 326 may generally include power conversion circuitry that receives input power from an alternating current power source 454 (e.g., an AC power grid, an engine/generator set, or a combination thereof), conditions the input power, and provides DC or AC output power. Thus, the welding power supply 326 may power the welding wire feeder 328, which in turn powers the welding torch 330, depending on the requirements of the welding system 312. The illustrated welding system 312 may include a gas supply system 332 that supplies a shielding gas or shielding gas mixture to the welding torch 330.
During the welding process, various control devices are often provided to enable an operator to control one or more parameters of the welding operation. For example, in some welding systems 312, a control panel is provided with various knobs and buttons to enable a welding operator to change the amperage, voltage, or any other desired parameter of the welding process. In practice, the welding operator may control various welding parameters (e.g., voltage output, current output, wire feed speed, pulse parameters, etc.) on one or more components of the welding system 312. Accordingly, various welding parameters may be controlled by the detected position, posture, and/or motion received by the detection circuitry 316 and translated into various welding commands via the motion recognition system 318.
For example, a welding operator may wish to adjust the wire feed speed from the welding location. Thus, the welding operator may perform a preset gesture or action, which the motion detection system 314 will detect, recognize, and translate into a command for adjusting the wire feed speed. The welding system 312 then receives the command and implements it to adjust the wire feed speed as needed. In some cases, the operator may perform several consecutive gestures corresponding to a series of commands to operate the welding system 312 in a desired manner. For example, to adjust the voltage output of the welding system 312, an operator may first provide a gesture associated with the welding power source 326 that indicates a desire to control some characteristic of the welding power source 326. Next, the operator may make gestures to increase or decrease the voltage output of the welding system 312. In some cases, the motion detection system 314 may translate and store each welding command before passing the final welding command to the welding system 312. In other cases, the motion detection system 314 may communicate each welding command directly to the welding system 312. Still further, in some embodiments, the motion detection system 314 may receive only one welding command, but may interpret the welding command as one or more control signals. Thus, the welding system 312 may implement one or more consecutive control signals, where each control signal is a step of the received welding command.
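The two-step interaction described above (one gesture to select a component, a following gesture to adjust it) can be pictured as a small state machine; the Python sketch below is illustrative only, and the component, gesture, and command names are hypothetical.

```python
# Sketch of the two-step interaction described above: one gesture selects a
# component, a following gesture adjusts it. Component, gesture, and command
# names are hypothetical.

class GestureSequencer:
    SELECT = {"tap_twice": "POWER_SUPPLY", "tap_once": "WIRE_FEEDER"}
    ADJUST = {"palm_up": "+1", "palm_down": "-1"}

    def __init__(self, send):
        self.send = send          # callback that delivers commands
        self.selected = None      # component chosen by the first gesture

    def on_gesture(self, gesture):
        if gesture in self.SELECT:
            self.selected = self.SELECT[gesture]      # step 1: pick component
        elif gesture in self.ADJUST and self.selected:
            # step 2: issue the adjustment against the selected component
            self.send(f"{self.selected}:VOLTAGE{self.ADJUST[gesture]}")
            self.selected = None                      # require re-selection

seq = GestureSequencer(send=print)
seq.on_gesture("tap_twice")   # selects the power supply
seq.on_gesture("palm_up")     # prints "POWER_SUPPLY:VOLTAGE+1"
```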
As described above, in certain embodiments, the action detection system 314 is coupled to a cloud network 336 having storage 338 that may include a library 440 of gestures associated with particular welding commands and/or types of welding commands. In particular, the action recognition system 318 can utilize the cloud 336 to determine one or more welding commands based on the actions detected by the detection circuitry 316. The cloud 336 may refer to various evolving arrangements, infrastructures, networks, etc., typically based on the internet. The term may refer to any type of cloud, including a client cloud, an application cloud, a platform cloud, an infrastructure cloud, a server cloud, and so forth. As will be appreciated by those skilled in the art, such arrangements typically allow any number of entities to receive and store data related to a welding application, transmit data to welders and entities of the welding community for the welding application, provide software as a service (SaaS), provide aspects of a computing platform as a service (PaaS), provide various network infrastructures as a service (IaaS), and so forth. Further, the term encompasses various types and business arrangements of such products and services, including public, community, hybrid, and private clouds. In particular, the cloud 336 may be a shared resource accessible to any number of welding entities, and each welding entity (e.g., operator group, company, welding location, equipment, etc.) may provide a welding gesture associated with a welding command, which may be used by the action recognition system 318 at a later time.
In an example embodiment, multiple motion detection components or elements may be used to enhance motion detection and/or interaction based thereon. For example, referring to the welding arrangement 310 shown in fig. 3, rather than using only a single accessory device 322, multiple accessory devices 322 may be used. For example, when the accessory device 322 is an armband-based device, the operator may wear the accessory device 322 on each arm. Using multiple motion detection devices (or elements) may allow more complex gestures or motions (e.g., complex three-dimensional (3D) gestures or motions) to be detected (and thus recognized based thereon). The use of multiple motion detection devices (or elements) may also allow the operator to provide gesture-based input using any available means, allowing the operator greater freedom in how to perform the weld. For example, when wearing an armband-based detection device on each arm, the operator can switch the welding device (e.g., welding torch) between hands, but still have free arms/hands to continue interaction (e.g., control selected parameters) without moving the sensor band between wrists. Further, the motion detection device (or element) may be configured to accommodate such flexible operation, e.g., operatively identify on which arm the operator is wearing the device, such as based on performance of a particular gesture that indicates whether the device is being actively used (or not) to provide gesture-based input.
Fig. 4 is a block diagram illustrating an exemplary motion detection system in accordance with aspects of the present disclosure. Shown in fig. 4 is a motion detection system 314, which may include detection circuitry 316, motion recognition system 318, and communication circuitry 320.
As described above, the detection circuitry 316 may include various types of audio/video detection techniques to enable it to detect the position, movement, pose, and/or action of the welding operator 324 and/or the accessory device 322. Further, communication circuitry 320 may enable wired or wireless communication between motion detection system 314 and cloud 336, welding system 312, and/or accessory device 322. The motion detection system 314 may also include a memory 441, a processor 442, a storage medium 444, an input/output (I/O) port 446, and the like. Processor 442 may be any type of computer processor or microprocessor capable of executing computer-executable code. Memory 441 and storage 444 may be any suitable article of manufacture that may be used as a medium to store processor executable code, data, and the like. These articles of manufacture may represent a computer-readable medium (i.e., any suitable form of memory or storage) that may store processor-executable code for use by processor 442 in performing the techniques of this disclosure.
The motion recognition system 318 may receive motion and/or gesture data related to the welding operator 324 and/or the accessory device 322 via wired and/or wireless communication. In particular, the action recognition system 318 interprets the received data to determine a weld command (e.g., a weld control signal) for one or more components of the welding system 312. The memory 441 and the storage 444 may also be used to store data, corresponding interpretations of the data, and welding commands corresponding to the data within the library 440. The illustrated embodiment depicts the storage 444 of the action recognition system 318 storing information related to the data and welding commands corresponding to the data (as described further below), although it should be noted that in other embodiments, the memory 441 and/or the cloud 336 (as described with reference to fig. 3) may be used to store the same information.
The library 440 may include a particular type of action and/or a particular action (e.g., gesture) and welding commands associated with the action or type of action. In some cases, the operating engine mode 448 within the processor 442 of the motion recognition system 318 may be utilized to change the operating mode of the motion recognition system 318. The operating engine mode 448 may be set to, for example, an operating mode or a configuration mode. For example, in the configuration mode, the motion recognition system 318 is programmed to associate a particular motion or gesture with a particular welding command. Accordingly, the operator 324 may provide input to the motion recognition system 318 via the I/O port 446 that indicates a welding command directed to a particular component of the welding system 312. The welding operator 324 may then position himself or herself to allow the detection circuitry 316 to detect a particular action or gesture that the operator 324 intends to associate with the input welding command. In particular, the motion recognition system 318 may store the motion patterns and/or gestures collected by the detection circuitry 316 within the library 440 and may associate the motions with corresponding weld commands.
For example, the operator 324 may provide input to the motion recognition system 318 to enter a configuration mode and associate a particular motion or gesture with a particular welding command directed to a particular component of the welding system 312 (e.g., the welding power source 326). Upon receiving these inputs, the motion recognition system 318 may detect the pose of the operator 324 while the operator 324 is in the viewing window of the detection circuitry 316, e.g., keeping the arms straight out, palms facing outward, and body upright. In some embodiments, the operator 324 need not be within the field of view of the detection circuitry 316, but rather may wear the accessory device 322, which may include one or more sensors (e.g., accelerometers) that track the actions of the operator 324 and communicate the actions to the detection circuitry 316. In other embodiments, the detection circuitry 316 may be configured to track, from the motion recognition system 318, movement of the accessory device 322, and more particularly, movement of the accessory device 322 and/or one or more sensors disposed within the accessory device 322. Once the motion recognition system 318 detects the motion, the motion recognition system 318 may store the motion and/or gesture as data within the gesture library 440. Specifically, the data is associated with a welding command or task and may be tagged as such within the storage 444 and/or the memory 441. In this manner, for example, the operator 324 may configure an upward motion of the palm as a gesture associated with increasing the wire speed of the welding system 312. In some embodiments, the motion recognition system 318 may enter and exit the configuration mode by receiving some input from the operator 324 that does not include any detected motion or gesture. In this case, the configuration mode may be secure and may not be affected by any inadvertent actions or gestures.
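A minimal sketch of this configuration flow, assuming gestures arrive as simple labels, is shown below in Python: in configuration mode the next detected gesture is bound to the command the operator entered, and in operational mode detected gestures are looked up in the stored library. The mode names, commands, and storage format are assumptions for the example.

```python
# Sketch of the configuration flow described above: in configuration mode the
# next detected gesture is bound to the command the operator entered; in
# operational mode detected gestures are looked up in the stored library.
# Mode names, commands, and the persistence format are assumptions.

class GestureLibrary:
    def __init__(self):
        self.mode = "OPERATIONAL"
        self.pending_command = None
        self.associations = {}            # gesture pattern -> welding command

    def enter_configuration(self, command):
        self.mode = "CONFIGURATION"
        self.pending_command = command    # e.g. "INCREASE_WIRE_SPEED"

    def on_gesture(self, gesture):
        if self.mode == "CONFIGURATION":
            # store the new association, then return to normal operation
            self.associations[gesture] = self.pending_command
            self.mode, self.pending_command = "OPERATIONAL", None
            return None
        return self.associations.get(gesture)   # None if not recognized

lib = GestureLibrary()
lib.enter_configuration("INCREASE_WIRE_SPEED")
lib.on_gesture("palm_up")                 # learned
print(lib.on_gesture("palm_up"))          # -> INCREASE_WIRE_SPEED
```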
In certain embodiments, the operating engine mode of the processor 442 of the motion recognition system 318 is set to the operating mode. In the operational mode, the welding operator 324 may perform a welding task with the welding system 312 and may enable the motion detection system 314. During the welding process, the operator 324 may wish to adjust the welding parameters via one or more gestures or actions. Accordingly, the detection circuitry 316 receives the gestures and/or actions in one of the methods described above and retrieves the welding command from the library 440 based on the detected gestures and/or actions of the operator 324 (or the accessory device 322). For example, if the motion recognition system 318 detects that the operator 324 moves his palm in an upward motion, the motion recognition system 318 may compare the detected motion to the motion or motion pattern stored in the library 440 and determine that the motion corresponds to increasing the wire speed of the welding system 312.
In an exemplary embodiment, the motion detection system 314 may support gestures for disabling (and/or enabling) gesture-based functionality. For example, the library 440 may include associations between particular gestures (or movements) and commands for disabling the motion detection system 314, and the operating engine mode 448 may support a non-operating mode. Thus, when the operator 324 performs a particular gesture, the gesture is detected by the accessory device 322 and then recognized by the detection circuitry 316, a disable command may be issued and executed. Disabling the motion detection system 314 may be accomplished by shutting down the system. Further, before the action detection system 314 is powered down, a notification may be generated and communicated to other systems interacting with the action detection system 314 (e.g., the welding system 312) to ensure that the system takes the necessary steps to address the action detection system 314 power down.
Where disabling includes powering down the motion detection system 314, (re)enabling the motion detection system 314 may require manually or directly (re)powering the system. However, in other cases, disabling the motion detection system 314 may simply include shutting down various components and/or functions therein and transitioning to a minimum functional state in which only the functions and/or components necessary to re-enable the motion detection system 314 remain operational. Thus, when an enabling gesture (which may be the same as the disabling gesture) is detected by the accessory device 322, the gesture may trigger re-enabling or re-activating the motion detection system 314 to a full mode of operation. Transitioning to such a minimum functional state may improve power consumption (as many functions and/or components as possible are shut down or powered down) without affecting the ability to resume motion detection related operations when needed (and to do so quickly). In some implementations, the motion detection system 314 (and/or other components or devices used to support gesture-related operations) may be configured to support disabling of the system while still providing other functionality suitable for (re)enabling it. For example, where an optical component or element (e.g., a camera) is used for optical-based detection, and the optical component or element is disabled as part of disabling the detection operation as a whole, the re-enabling may be configured or implemented so that it can be triggered by the user by means other than a visually recognized gesture. Examples of such non-visual means may include buttons/controls, or sensors that can sense (rather than visually perceive) user movement. In other words, such vision-based motion detection systems may be configured to complement remote gesture detection with local (personal) gesture detection elements to detect re-enabling gestures.
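The disable/re-enable behavior described above can be sketched as a simple power-state manager that keeps only a minimal wake path alive; the Python example below is illustrative, and the component names and wake gesture are assumptions.

```python
# Sketch of the disable/re-enable behavior described above: disabling powers
# down non-essential components but keeps a minimal wake path alive so a
# local (non-visual) input can restore full operation. Component names and
# the wake gesture are illustrative assumptions.

class MotionDetectionPower:
    NON_ESSENTIAL = ["camera", "recognition_engine", "radio"]
    ESSENTIAL = ["wake_sensor"]            # stays powered to catch re-enable

    def __init__(self):
        self.powered = set(self.NON_ESSENTIAL + self.ESSENTIAL)

    def disable(self, notify):
        notify("motion detection entering minimum functional state")
        self.powered = set(self.ESSENTIAL)  # shut down everything else

    def on_wake_input(self, gesture):
        if gesture == "double_tap" and self.powered == set(self.ESSENTIAL):
            self.powered = set(self.NON_ESSENTIAL + self.ESSENTIAL)
            return "re-enabled"
        return "ignored"

pwr = MotionDetectionPower()
pwr.disable(notify=print)
print(pwr.on_wake_input("double_tap"))
```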
The library 440 may include a plurality of actions 452 and a weld command 454 corresponding to each action. The welding commands may include any commands to control the welding system 312, and/or components of the welding system 312, such as the welding power supply 326, the gas supply system 332, the welding wire feeder 328, the welding torch 330, or other welding components 334 (e.g., grinders, lights, etc.) of the welding system 312. Thus, the welding commands may include, but are not limited to, starting the device, stopping the device, increasing the speed or output of the device, decreasing the speed or output of the device, and the like. For example, a welding command associated with the gas supply system 332 may include adjusting a gas flow rate. Likewise, the welding commands associated with the welding wire feeder 328 may also include adjusting wire speed, changing between push/pull feed systems, and the like. Further, the welding command associated with the welding power supply 326 may include changing the voltage or power delivered to the welding torch 330. Further, the library 440 may include other commands associated with various actions, such as disabling the action recognition system 318, limiting the operator's control or ability to interface with the action recognition system 318, and so forth.
Fig. 5 is a block diagram illustrating an example gesture accessory device that may be used in conjunction with and in wireless communication with a motion detection system in accordance with aspects of the present disclosure. As shown in fig. 5, according to an exemplary embodiment, the motion detection system 314 is operably coupled to an accessory device 322. In this regard, in various embodiments, the accessory device 322 may be in wired or wireless communication with the motion detection system 314.
In some embodiments, the detection circuitry 316 may include an accessory device 322. Further, the detection circuitry 316 may be configured to track movement of the accessory device 322 and/or one or more sensors configured within the accessory device 322 directly from the motion detection system 314. In particular, the accessory device 322 may include a sensor 556 (e.g., infrared, optical, sound, magnetic, vibration, etc.), an accelerometer, a computing device, a smartphone, a tablet, a GPS device, a wireless sensor tag, one or more cameras, or similar devices configured to assist the detection circuitry 316 in detecting the actions and/or gestures of the operator 324. In some cases, accessory device 322 may be incorporated into an article of clothing (e.g., a bracelet, wristband, anklet, necklace, etc.) worn, placed, or carried by operator 324, or may be a device held by operator 324.
In some cases, the sensor system 556 is configured to collect posture and/or motion data from the operator 324, similar to the manner in which the detection circuitry 316 does. The collected motion and/or gesture data may be digitized via one or more processors within the processing circuitry 558, which may also be associated with the memory 560. The processing circuitry 558 may be any type of computer processor or microprocessor capable of executing computer-executable code. The memory 560 may be any suitable article of manufacture that may serve as a medium to store processor-executable code, data, and the like. The articles of manufacture may represent computer-readable media (i.e., any suitable form of memory or storage) that may store the processor-executable code used by the processing circuitry to perform the disclosed techniques. Further, the digitized data may be communicated to the motion detection system 314 via the wired and/or wireless communication circuitry 562. As described above, the motion recognition system 318 interprets the received data to determine a welding command (e.g., a welding control signal) for one or more components of the welding system 312 and transmits the welding command to the welding system 312 via the communication circuitry 320 of the motion detection system 314. It should be noted that the communication between the components of the gesture-based welding arrangement 310 may be over a secure channel.
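As an illustration of the accessory-side pipeline (sample, digitize, transmit), the following Python sketch packs accelerometer samples with a device identifier before handing them to a stand-in radio callback. The sampling format, scaling, and packet layout are assumptions, not the disclosed implementation.

```python
# Sketch of the accessory-side pipeline described above: sample the sensor,
# digitize the readings, and hand the packet to the wireless link toward the
# motion detection system. Sampling rate, scaling, and packet layout are
# assumptions made for the sketch.

import json, struct

def digitize(sample, scale=1000):
    """Convert a float accelerometer triple to 16-bit integers."""
    return struct.pack("<3h", *(int(axis * scale) for axis in sample))

def build_packet(device_id, samples):
    """Wrap digitized samples with the device identity for pairing checks."""
    payload = b"".join(digitize(s) for s in samples)
    header = json.dumps({"id": device_id, "n": len(samples)}).encode()
    return header + b"\x00" + payload

def transmit(packet, radio_send):
    radio_send(packet)      # e.g. a Bluetooth or WiFi send routine

# Example with a stand-in radio that just reports the packet size.
pkt = build_packet("armband-01", [(0.01, -0.98, 0.12), (0.02, -0.97, 0.15)])
transmit(pkt, radio_send=lambda p: print(len(p), "bytes sent"))
```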
In some embodiments, the communication circuitry 562 of the gesture accessory device 322 is also in communication with the welding system 312. For example, the gesture accessory device 322 may be paired with the welding system 312 prior to beginning the welding operation to ensure that the gestures provided by the operator 324 are safely applied to the paired equipment. In this manner, when multiple gesture accessory devices 322 are proximate to the welding system 312, only the paired device 322 is able to provide gesture commands to the welding system 312 via the motion detection system 314.
Further, in some embodiments, the accessory device 322 may include an I/O port 564 that may enable the operator 324 to provide input to the motion detection system 314. The input may include methods of pairing the accessory device 322 with the welding system 312 and/or the motion detection system 314, and the operator 324 may also use these methods to input identification information and/or welding-related information. In some embodiments, the accessory device 322 may include a display 566 that enables the operator 324 to visualize the welding commands sent by the action detection system 314 to the welding system 312. Further, the display 566 may be used to receive and display various welding-related information from the welding system 312, such as the status of current operating parameters, the status of welding commands (e.g., control signals) sent to the welding system 312, the status of wireless connections, whether a welding command is implemented, errors or alarms, or any information generally related to the gesture-based welding arrangement 310.
In some embodiments, non-visual feedback may be used during gesture-based interaction with (e.g., control of) a welding operation to provide feedback to the operator in a non-visual manner, e.g., without requiring the operator to use a display or similar visual interface, so that the operator does not need to take his or her eyes off the workpiece while welding. For example, the accessory device 322 may be configured to provide feedback to the operator regarding gesture-based interaction (e.g., control) of the welding operation using non-visual means, such as audio (e.g., beeps), tactile (e.g., vibration), or similar output. For example, the feedback may confirm receipt of a particular gesture. For example, two long vibrations may indicate recognition of an "increase amperage" gesture, while one short vibration may indicate recognition of a "decrease amperage" gesture. The system may then, for example, wait for a second confirmation gesture before validating the recognized gesture.
The use of non-visual feedback may be particularly desirable in situations where the use of a display or visual output is not practical or desirable (e.g., for safety reasons), or to avoid the need for any such display altogether. For example, in an armband-based embodiment, supporting non-visual feedback may allow the accessory device 322 to be worn in a manner that would not be feasible if feedback were provided visually; for example, it may be worn under a welding jacket, thereby protecting it from the welding environment. Furthermore, the use of non-visual feedback during the welding operation may be preferable because the operator may not be able to see visual feedback clearly, since the operator typically wears a special helmet/shield and would have to look at a display and attempt to read the visual feedback through special welding lenses or glasses.
With non-visual feedback, the characteristics of the non-visual output used may be adjusted to provide different feedback. For example, using vibration-based feedback, the vibration (e.g., in terms of one or more of duration, frequency, and intensity of the vibration) may be adjusted to indicate different feedback. In some cases, the characteristics of the non-visual output may be configured and/or adjusted by the operator to indicate particular feedback. For example, when implemented to provide non-visual feedback, the I/O port 564 of the accessory device 322 may be used to specify a particular type of non-visual feedback (e.g., audio or vibration) and/or to specify particular characteristics (e.g., duration, frequency, and vibration intensity) for each particular desired feedback.
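A minimal sketch of such configurable vibration feedback, assuming each feedback event maps to a pattern of pulses defined by count, duration, and intensity, is shown below in Python; the event names, pattern values, and motor-driver callback are illustrative assumptions.

```python
# Sketch of the configurable vibration feedback described above: each
# feedback event maps to a pattern of pulses (count, duration, intensity).
# The event names, pattern values, and the motor-driver callback are
# assumptions for illustration.

import time

# (pulse count, pulse duration in seconds, intensity 0.0 to 1.0)
FEEDBACK_PATTERNS = {
    "INCREASE_AMPERAGE_RECOGNIZED": (2, 0.6, 0.8),   # two long vibrations
    "DECREASE_AMPERAGE_RECOGNIZED": (1, 0.2, 0.8),   # one short vibration
    "COMMAND_COMPLETED":            (3, 0.1, 0.5),
}

def vibrate(event, drive_motor, sleep=time.sleep):
    count, duration, intensity = FEEDBACK_PATTERNS[event]
    for _ in range(count):
        drive_motor(intensity)   # turn the haptic motor on
        sleep(duration)
        drive_motor(0.0)         # off between pulses
        sleep(0.1)

# Example with stand-in motor and timer so the call returns immediately.
vibrate("INCREASE_AMPERAGE_RECOGNIZED", drive_motor=lambda level: None, sleep=lambda s: None)
```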
While the accessory device 322 and the motion detection system 314 are shown and described as two separate elements in fig. 3-5, the present disclosure is not necessarily so limited. Thus, in an exemplary embodiment, the accessory device 322 and the motion detection system 314 (and its functionality) may be combined into a single device, which may preferably be configured or designed for portable use in the same manner as described with respect to the accessory device 322. Thus, the single device will be operable to perform the functions of the accessory device 322 (e.g., gesture detection, sensing, input/output) as well as the functions of the motion detection system 314 (e.g., motion recognition, command determination, command association, etc.).
Fig. 6 is a flow diagram illustrating an exemplary method for transmitting a welding command from a motion detection system to a welding system in accordance with aspects of the present disclosure. The flow chart of the method 600 as shown in FIG. 6 includes a number of exemplary steps (represented as blocks 602-608) for transmitting a weld command from the action detection system 314 of FIG. 3 to the welding system 312 in accordance with an exemplary embodiment.
The method 600 may begin by enabling an operational mode of the motion detection system 314 via the operating engine mode 448, e.g., through the I/O port 446 (step 602). In this manner, the motion detection system 314 may be configured to detect motions and/or gestures and translate the detected motions and/or gestures into welding commands using the gesture library 440.
For example, the method 600 may include detecting gestures and/or actions (step 604). As described above, the detection circuitry 316 may include various types of audio/video detection techniques to enable it to detect the position, movement, pose, and/or action of the welding operator 324 and/or the accessory device 322. Further, the method 600 may include determining a welding command associated with the detected action and/or gesture (step 606). For example, the action recognition system 318 interprets the received data to determine a welding command (e.g., a welding control signal) for one or more components of the welding system 312. The welding command may be determined by comparing the received data to data within the gesture library 440.
Additionally, the method 600 may include communicating the welding command to the welding system 312 (step 608). The welding commands may include any commands to control the welding system 312, and/or components of the welding system 312, such as the welding power source 326, the gas supply system 332, the welding wire feeder 328, the welding torch 330, or other welding components 334 of the welding system 312. In this manner, the gestures and/or actions provided by the operator 324 and/or the accessory device 322 may be used to control one or more welding parameters of the welding system 312.
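For illustration, blocks 602-608 of method 600 can be expressed as a short loop; the detector, library, and link objects in the Python sketch below are minimal stand-ins whose interfaces are assumptions, not the patented implementation.

```python
# Blocks 602-608 of method 600 expressed as a short loop. The detector,
# library, and send callback below are minimal stand-ins whose interfaces
# are assumptions for illustration.

class StubDetector:
    def __init__(self, gestures): self.queue = list(gestures)
    def enable_operational_mode(self): print("operational mode enabled")
    def next_gesture(self): return self.queue.pop(0) if self.queue else None

def run_method_600(detector, library, send):
    detector.enable_operational_mode()              # step 602
    while (gesture := detector.next_gesture()):     # step 604: detect gesture/action
        command = library.get(gesture)              # step 606: determine command
        if command:
            send(command)                           # step 608: communicate command

run_method_600(
    StubDetector(["palm_up", "unknown", "fist_hold"]),
    {"palm_up": "INCREASE_WIRE_SPEED", "fist_hold": "STOP_WIRE_FEEDER"},
    send=print,
)
```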
FIG. 7 is a flow diagram illustrating an exemplary method for associating a welding command with a particular gesture or action in accordance with aspects of the present disclosure. The method 700, as shown in FIG. 7, includes a number of exemplary steps (represented as blocks 702-708) for associating a particular welding command with a particular gesture and/or action in accordance with an exemplary embodiment.
As described above, the motion detection system 314 may be configured in an operational mode to detect motions and/or gestures and translate the detected motions and/or gestures into welding commands using the gesture library 440. The illustrated method 700 includes enabling a configuration mode (e.g., learning, pairing, associating, etc.) of the motion detection system 314 through the operation mode engine 448, via the I/O port 446 (step 702). In this manner, the motion detection system 314 may be configured to associate a particular motion or gesture with a particular welding command and store the association within the memory 441 and/or the storage 444.
Further, the method 700 may include the motion detection system 314 receiving, via the I/O port 446, a welding command for which the operator 324 wishes to set a gesture and/or motion (step 704). As described above, the welding command may be for any component of the welding system 312. The welding operator 324 may then position himself or herself to allow the detection circuitry 316 of the motion detection system 314 to detect the particular motion or gesture that the operator 324 intends to associate with the input welding command (step 706). Further, the method 700 may include the motion recognition system 318 storing the motion and/or gesture collected by the detection circuitry 316 within the gesture library 440 and associating it with the corresponding welding command (step 708). It should be noted that such associations may also be made and stored within the gesture library 440 of the cloud network 336 and retrieved by the local system as needed. In some cases, pre-associated global welding commands may be overridden with local welding commands that are more personalized to the welding operator 324.
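A minimal sketch of the method-700 configuration mode is shown below, assuming the gesture library is a plain dictionary and that local (operator-personalized) entries override pre-associated global entries retrieved from the cloud; the function and parameter names are hypothetical.

```python
def configure_association(local_library: dict, global_library: dict,
                          receive_command, capture_gesture) -> None:
    """Method-700 style pairing: record a gesture and bind it to a command.

    receive_command -- returns the welding command entered by the operator (step 704)
    capture_gesture -- returns a label/template for the gesture the operator
                       performs in front of the detection circuitry (step 706)
    """
    command = receive_command()
    gesture = capture_gesture()
    # Step 708: store the association; a local entry overrides any
    # pre-associated "global" command retrieved from the cloud library.
    if gesture in global_library:
        print(f"Overriding global association for gesture '{gesture}'")
    local_library[gesture] = command

def lookup(gesture: str, local_library: dict, global_library: dict):
    """Local (personalized) associations take precedence over global ones."""
    return local_library.get(gesture, global_library.get(gesture))
```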
Fig. 8 illustrates an example gesture-based armband apparatus for remotely controlling a welding operation in accordance with aspects of the present disclosure. Shown in fig. 8 is a gesture-based control device 800 worn by an operator (e.g., operator 18) during a welding operation.
The gesture-based control device 800 may include suitable circuitry operable to support interaction with and control of equipment used for welding operations and/or welding operation monitoring. The gesture-based control device 800 may be configured such that it may be secured to an operator (e.g., operator 18) and/or to an article worn or directly manipulated by the operator (e.g., a welding helmet, a welding glove, a welding torch, etc.). In the particular embodiment shown in FIG. 8, the gesture-based control device 800 may be configured as an armband-based implementation. The gesture-based control device 800 may be operable to receive operator input provided specifically in the form of gestures and/or actions. In this regard, the gesture-based control device 800 may be operable to detect a gesture and/or action of the operator and then process the detected gesture and/or action. Processing may include determining whether or not a detected gesture and/or action corresponds to a particular command (or action). For example, as described in more detail above (e.g., with respect to FIGS. 3-7), a particular gesture and/or action may be associated with a particular welding command, or may be associated with an action that may be directly performed or processed by the device itself (e.g., generating a new association for a detected gesture and/or action, disabling or (re)enabling gesture/action-related components or operations, etc.). An exemplary detailed embodiment of the components and circuitry of the device 800 is depicted in FIG. 9.
The gesture-based control device 800 may be configured to communicate with other devices or systems, such as when a detected gesture or action is determined to be associated with a particular welding command. The gesture-based control device 800 may preferably be operable to perform such communication in a wireless manner. In this regard, the gesture-based control device 800 may be operatively connected to other devices or systems (e.g., welding and/or welding monitoring equipment) in a wireless manner, e.g., by establishing and using a connection based on a suitable wireless technology, such as WiFi, Bluetooth, etc.
The armband-based gesture-based control device 800 of FIG. 8 includes an elastic band 810 (e.g., a wrist band) that may allow the operator 18 to wear the gesture-based control device 800 on his/her arm (as shown at the top of FIG. 8). In some cases, a dedicated armband arrangement may be used instead of a band built into the gesture-based control device 800. Such an armband arrangement may include a band 810 and a holder 820 to which the gesture-based control device 800 may be attached. For example, the holder 820 may include a suitable securing device (e.g., a clip) configured to secure the device 800 to the armband arrangement. Nonetheless, the present disclosure is not so limited, and other methods (and corresponding arrangements) may be used to secure control devices to the operator, or to integrate them into clothing or equipment worn or used by the operator.
The gesture-based control device 800 may be configured to support non-vision based feedback. For example, the gesture-based control device 800 may operably support haptic feedback (e.g., vibration) and/or audio feedback. In this regard, the gesture-based control device 800 may include a vibration and/or audio component that may generate vibration and/or audio signals as a way of providing feedback. The vibration component may be operable to vary the frequency, amplitude, and duration of the vibration to provide different feedback, for example to indicate recognition of certain gestures or to verify that a corresponding action has occurred. This feedback can be detected by the operator due to the close contact of the device with the operator's arm. Similarly, an audio component (e.g., an audio transducer) may provide feedback, for example by providing a particular audio output (e.g., a particular one of a plurality of predefined tones, similar to the ring tones typically used in telephones), each of which uniquely indicates certain feedback.
The gesture-based control device 800 may be a specially designed, dedicated device for interacting with and controlling welding components (e.g., welding and/or welding monitoring equipment). However, in some example embodiments, devices that may not be specifically designed or manufactured as "control devices" may still be configured for such use. In this regard, devices having the functions and/or features required of a control device in the manner described in this disclosure may be used. For example, suitable devices include those having (1) appropriate communication capabilities (e.g., wireless technologies such as WiFi, Bluetooth, etc.), (2) appropriate resources (e.g., sensors) for detecting gestures and/or motion interactions, (3) appropriate processing capabilities for processing and analyzing detected motions or gestures, or (4) appropriate resources (e.g., keypads, buttons, text interfaces, or touch screens) for providing feedback, and which are also small and/or light enough for the operator to conveniently wear and/or integrate into articles worn or directly used by the operator. Further, devices such as smart phones, smart watches, and the like may be used as "control devices". In this regard, the interface functions may be implemented in software (e.g., an application program) that may be executed by existing hardware components of such devices.
Fig. 9 illustrates example circuitry of a gesture-based armband apparatus for remotely controlling a welding operation in accordance with aspects of the present disclosure. Illustrated in fig. 9 is circuitry of an exemplary gesture-based control device 900. Gesture-based control device 900 may correspond to device 800 of fig. 8.
As shown in FIG. 9, the gesture-based control device 900 may include main controller (e.g., Central Processing Unit (CPU)) circuitry 910, communication interface circuitry 920, audio controller circuitry 930, haptic controller circuitry 940, and sensor controller circuitry 950.
The main controller circuitry 910 is operable to process data, perform specific tasks or functions, and/or control the operation of other components in the device 900. For example, the main controller circuitry 910 may receive sensory data from the sensor controller circuitry 950, which may correspond to gestures or movements of an operator wearing the device 900. The main controller circuitry 910 may process such sensory data by, for example, applying a pre-programmed gesture recognition algorithm (and/or using pre-stored information, such as a gesture-based library) to discern whether the sensory data indicates a gesture or action belonging to a trained set of gestures. If the gesture belongs to the trained set of gestures, the main controller circuitry 910 may identify an associated control action. The main controller circuitry 910 may then control other components of the device in response to any identified action or command. For example, in the event a particular welding command is identified, the main controller circuitry 910 may send data and/or signals to the communication interface circuitry 920 to transmit the command (e.g., via a WiFi or Bluetooth signal 921 to the appropriate welding or welding monitoring device). The main controller circuitry 910 may also generate or determine appropriate feedback (e.g., confirmation of gesture/action detection, verification of a detected gesture/action, confirmation that a corresponding action has been taken, etc.), and may send data and/or signals to feedback-related components (e.g., the audio controller circuitry 930 or the haptic controller circuitry 940) to enable providing the feedback.
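The following Python sketch illustrates, under stated assumptions, the kind of dispatch loop the main controller circuitry 910 is described as performing: recognize a gesture from sensory data, confirm detection to the operator, and forward any associated welding command to the communication interface. The recognizer, mappings, and feedback labels are placeholders, not elements of the disclosure.

```python
def handle_sensor_frame(frame, recognize, gesture_to_command,
                        transmit_command, issue_feedback):
    """Sketch of a main-controller style processing step.

    recognize          -- gesture-recognition routine (e.g., matching against a
                          trained gesture set); returns a label or None
    gesture_to_command -- mapping of trained gestures to control actions
    transmit_command   -- hands a welding command to the communication interface
    issue_feedback     -- drives the audio/haptic controllers
    """
    label = recognize(frame)
    if label is None:
        return                                # not a trained gesture; ignore it
    issue_feedback("gesture_recognized")      # confirm detection to the operator
    action = gesture_to_command.get(label)
    if action and action.get("kind") == "welding_command":
        transmit_command(action)              # e.g., forward over WiFi/Bluetooth
        issue_feedback("command_sent")
```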
The communication interface circuitry 920 is operable to process communications in the gesture-based control device 900. The communication interface circuitry 920 may be configured to support various wired or wireless technologies. The communication interface circuitry 920 may be operable, for example, to configure, establish and/or use wired and/or wireless connections, such as through appropriate wired/wireless interfaces, and in accordance with wireless and/or wired protocols or standards supported in the device, to facilitate transmission and/or reception of signals (e.g., to carry data). Further, the communication interface circuitry 920 may be operable to process transmitted and/or received signals in accordance with applicable wired or wireless techniques. Examples of wireless technologies that may be supported and/or used by the communication interface circuitry 920 may include Wireless Personal Area Networks (WPANs), such as bluetooth (IEEE 802.15); near Field Communication (NFC); wireless Local Area Networks (WLANs), such as WiFi (IEEE 802.11); cellular technologies such as 2G/2G + (e.g., GSM/GPRS/EDGE and IS-95 or cdmaOne) and/or 3G/3G + (e.g., CDMA2000, UMTS and HSPA); 4G, such as WiMAX (IEEE 802.16) and LTE; ultra Wideband (UWB); and so on. Examples of wired technologies that may be supported and/or used by the communication interface circuitry 920 include ethernet (IEEE 802.3), Universal Serial Bus (USB) based interfaces, and so forth. Examples of signal processing operations that may be performed by apparatus 900 include, for example, filtering, amplification, analog-to-digital and/or digital-to-analog conversion, up/down conversion of baseband signals, encoding/decoding, encryption/decryption, modulation/demodulation, and so forth.
As shown in the exemplary embodiment of FIG. 9, the communication interface circuitry 920 may be configured to use an antenna 922 for wireless communication and a port 924 for wired communication. The antenna 922 may be any type of antenna suitable for the frequencies, power levels, etc. required by the wireless interfaces/protocols supported by the gesture-based control device 900. For example, the antenna 922 may specifically support WiFi and/or Bluetooth transmission/reception. The port 924 may be any type of connector suitable for communicating over a wired interface/protocol supported by the gesture-based control device 900. For example, the port 924 may include an Ethernet twisted-pair port, a USB port, an HDMI port, a Passive Optical Network (PON) port, and/or any other suitable port for connecting with a wire or fiber optic cable.
The audio controller circuitry 930 is operable to process audio input and/or output (I/O) functions associated with the device 900. For example, the audio controller circuitry 930 may be operable to drive one or more audio I/O elements 932 (e.g., audio transducers, each operable to convert an electrical signal into an audio signal, and vice versa). In this regard, the audio controller circuitry 930 may generate and/or condition (e.g., amplify or digitize) data corresponding to audio inputs or outputs (signals) in the device 900. For example, with respect to gesture-related operations in the device 900, the audio controller circuitry 930 may be operable to generate data or signals that cause the audio transducer to output an audio signal 931 that represents feedback provided to an operator using (e.g., wearing) the device 900 during a welding operation in response to detection of the operator's gesture or motion.
Haptic controller circuitry 940 is operable to process haptic input and/or output (I/O) functions associated with device 900. For example, haptic controller circuitry 940 may be operable to drive one or more haptic elements 942 (e.g., a buzzer or vibrating transducer). In this regard, haptic controller circuitry 940 may generate and/or adjust (e.g., amplify or digitize) data corresponding to haptic output (signals) in device 900. For example, with respect to gesture-related operations in device 900, haptic controller circuitry 940 may be operable to generate data or signals that cause haptic elements to output haptic signals 941 (e.g., vibrations). The tactile signals 941 may provide feedback to an operator using (e.g., wearing) the device 900 during a welding operation. The tactile signal 941 may be generated in response to detection of a gesture or motion of an operator.
The sensor controller circuitry 950 is operable to process sensing-related operations of the device 900. For example, sensor controller circuitry 950 may be operable to drive or control one or more sensors that may be used to obtain sensory data that is closely related to the operation of device 900. The sensor controller circuitry 950 may process and/or support various types of sensors.
For example, in some cases, muscle sensors 952 may be used. In this regard, each muscle sensor 952 may include a series of segmented electrodes arranged such that, when the muscle sensor 952 is in contact with the operator's body (e.g., forearm), the electrodes may detect differential electrical impulses of nearby muscles. The sensory data obtained in this manner (alone or in combination with information from other sensors) can be used to detect and decode gestures made by the operator using a body part in the vicinity of the sensor (e.g., the arm on which the sensor is located, or the wrist, hand, or fingers of that arm).
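As an illustrative, simplified treatment of one muscle-sensor channel (not the patent's algorithm), the sketch below rectifies and smooths a raw differential EMG signal and thresholds the resulting envelope to flag muscle activation; the threshold and window values are arbitrary.

```python
import numpy as np

def emg_activation(samples: np.ndarray, threshold: float = 0.2,
                   window: int = 50) -> np.ndarray:
    """Toy envelope detector for one muscle-sensor channel.

    samples   -- raw differential EMG samples (roughly centered around zero)
    threshold -- envelope level treated as "muscle active" (arbitrary units)
    window    -- moving-average length used to smooth the rectified signal
    """
    rectified = np.abs(samples - np.mean(samples))   # remove offset, rectify
    kernel = np.ones(window) / window
    envelope = np.convolve(rectified, kernel, mode="same")
    return envelope > threshold                      # boolean activation mask
```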
MEMS (micro-electro-mechanical system) sensors 954 may also be used. They may be embedded directly into the device 900 and/or into other items worn or used by the operator (e.g., gloves). These sensors may generate sensory data related to gestures or actions made by an operator (e.g., using a hand, finger, or wrist). The MEMS sensors 954 may include gyroscope-based, accelerometer-based, and/or magnetic-based sensors. The MEMS sensors 954 may generate sensory data that may be used (alone or in combination with sensory data from other sensors, such as the muscle sensors 952) to detect and decode gestures or actions made by an operator.
As an example, where the device 900 is armband-based, movement of the operator's arm may change the orientation of the device 900. Accordingly, the MEMS sensors 954 can generate sensory data that allows the orientation of the device, and changes in that orientation, to be determined. Gyroscope-based sensors, which can readily be used for sensing in an xyz orthogonal Cartesian coordinate system, can be used alone or in combination with other sensors to sense the overall movement, and the direction of movement, of the arm on which the armband is worn. Accelerometer-based sensors (particularly when configured for 3D sensing) can detect gravity or the gravity vector, so their outputs can be used alone or in combination to determine the orientation of the device on the arm relative to the commonly accepted "down" and "up" directions. They can also be used, directly or by mathematical integration, to determine the speed and direction of movement of the arm wearing the armband. Magnetic-based sensors, while possibly overwhelmed by the magnetic field near the welding arc, can determine the direction of magnetic north (when the arc is not operating), which can be used, for example, to direct the operator to move in a particular direction in order to find a device being controlled that is located a great distance from where the operator is working.
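For illustration, the following sketch shows the standard tilt-from-gravity calculation and a simple gyroscope integration step that sensory data from the MEMS sensors 954 could feed; it is a generic example, not the disclosed processing.

```python
import math

def pitch_roll_from_gravity(ax: float, ay: float, az: float) -> tuple:
    """Estimate device tilt from a 3-axis accelerometer at rest.

    With the arm still, the accelerometer mostly measures the gravity vector,
    so pitch and roll relative to "down" can be recovered directly.
    """
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return math.degrees(pitch), math.degrees(roll)

def integrate_gyro(angle_deg: float, rate_dps: float, dt_s: float) -> float:
    """Update an angle estimate from a gyroscope rate sample (simple Euler step)."""
    return angle_deg + rate_dps * dt_s
```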
Although the device 900 is described as including all of the components shown in fig. 9, the present disclosure is not so limited, and thus some of these components may represent separate dedicated "devices" that are merely coupled with other components of the device 900. Further, due to its intended portable use, the apparatus 900, or at least its primary components (e.g., including at least the components of the main controller circuitry 910), may be implemented as a rechargeable battery-powered platform, including minimal required resources (e.g., processing or storage) to provide or enable the required detection, analysis, communication, and/or feedback functions. In some cases, apparatus 900 may be operable to record and/or store gestures and/or gesture attempts (e.g., those that do not match existing predefined gestures), such as to improve gesture recognition through statistical training.
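A minimal sketch of such gesture-attempt logging is shown below, assuming attempts are summarized as feature vectors and appended to a local JSON-lines file for later offline (statistical) training; the file name and record fields are invented.

```python
import json
import time

def log_gesture_attempt(features, matched_label,
                        log_path="gesture_attempts.jsonl") -> None:
    """Append a gesture attempt (matched or not) for later offline training."""
    record = {
        "timestamp": time.time(),
        "features": list(features),   # e.g., a summarized sensor feature vector
        "matched": matched_label,     # None when no predefined gesture matched
    }
    with open(log_path, "a") as fp:
        fp.write(json.dumps(record) + "\n")
```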
FIG. 10 is a flow diagram illustrating an exemplary method for providing feedback during gesture-based remote control of a welding operation in accordance with aspects of the present disclosure. Shown in fig. 10 is a flow chart of a method 1000 including a number of exemplary steps (represented as blocks 1002-1010).
In step 1002, a gesture and/or action of an operator may be detected, for example based on sensory data obtained from sensors placed on or near the operator (or integrated into an article worn or used by the operator).
In step 1004, the detected gesture and/or action may be processed, for example to determine whether the gesture and/or action matches any predefined gesture or action, and whether there is any associated action (e.g., a welding command or the programming of a new association) corresponding thereto.
In step 1006, it may be determined what feedback, if any, needs to be provided to the operator. The feedback may be confirmation (e.g., the gesture/action is detected, the action is determined, or the action is performed), verification (e.g., operator confirmation is requested), and the like.
In step 1008, a corresponding output may be generated based on the feedback. In particular, the output may be configured based on the feedback itself and the type of output (e.g., characteristics of the output may be modified based on the particular feedback). For example, for a vibration-based output, characteristics such as intensity, frequency, and/or duration may be adjusted to reflect different feedback.
In step 1010, feedback may be output to an operator.
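Purely as a sketch of the feedback pipeline of steps 1004-1010, the following fragment assumes vibration-based output and a two-level feedback scheme (confirmation vs. verification); the mapping values and injected callables are hypothetical.

```python
VIBRATION_BY_FEEDBACK = {
    "confirmation": {"intensity": 0.5, "frequency_hz": 180, "duration_ms": 200},
    "verification": {"intensity": 1.0, "frequency_hz": 120, "duration_ms": 600},
}

def feedback_pipeline(detected_gesture, match_action, emit_vibration):
    """Steps 1004-1010: process the gesture, pick feedback, shape and emit it."""
    action = match_action(detected_gesture)                   # step 1004
    feedback = "confirmation" if action else "verification"   # step 1006 (simplified)
    output = VIBRATION_BY_FEEDBACK[feedback]                  # step 1008: shape output
    emit_vibration(**output)                                  # step 1010: output to operator
```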
The methods and systems of the present disclosure may be implemented in hardware, software, or a combination of hardware and software. The method and/or system of the present disclosure may be implemented in a centralized fashion in at least one computing system, or in a distributed fashion where different elements are spread across several interconnected computing systems. Any computing system or other apparatus adapted for carrying out the methods described herein is suited. A typical combination of hardware and software could include a general purpose computing system with program or other code that, when being loaded and executed, controls the computing system such that it carries out the methods described herein. Another exemplary implementation may include an application specific integrated circuit or chip. Some implementations may include a non-transitory machine-readable (e.g., computer-readable) medium (e.g., a flash drive, an optical disk, a magnetic storage disk, etc.) having one or more lines of code stored thereon that are executable by a machine to cause the machine to perform a method as described herein.
While the methods and/or systems of the present disclosure have been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the methods and/or systems of the present disclosure. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the disclosure without departing from the scope thereof. Therefore, it is intended that the present method and/or system not be limited to the particular embodiments disclosed, but that the present method and/or system will include all embodiments falling within the scope of the appended claims.
As used herein, the terms "circuit" and "circuitry" refer to physical electronic components (i.e., hardware) as well as any software and/or firmware ("code") that may configure, be executed by, or otherwise be associated with the hardware. As used herein, for example, a particular processor and memory may include a first "circuit" when executing a first set of one or more lines of code and a second "circuit" when executing a second set of one or more lines of code. As used herein, "and/or" refers to any one or more of the items in the list joined by "and/or". By way of example, "x and/or y" means any element of the three-element set { (x), (y), (x, y) }. In other words, "x and/or y" means "one or both of x and y". As another example, "x, y, and/or z" refers to any element of the seven-element set { (x), (y), (z), (x, y), (x, z), (y, z), (x, y, z) }. In other words, "x, y and/or z" means "one or more of x, y and z". As used herein, the term "example" means serving as a non-limiting example, instance, or illustration. As used herein, the terms "e.g." and "for example" introduce lists of one or more non-limiting examples, instances, or illustrations. As used herein, circuitry is "operable" to perform a function whenever the circuitry includes the necessary hardware and code (if needed) to perform the function, regardless of whether performance of the function is disabled or not enabled (e.g., by user-configurable settings, factory adjustments, etc.).

Claims (42)

1. A system for controlling a welding system, comprising:
a detection component operable to detect motion of a limb of an operator of a welding system and to send a command based on the detected motion, wherein the command is related to controlling a component or operation of the welding system, and wherein the detection component comprises one or more of: a camera, magnetic sensor, vibration sensor, muscle movement sensor, radio frequency energy detector, sound sensor, thermal sensor, or pressure sensor;
processing circuitry operable to receive the command and execute the command, wherein the execution of the command comprises: disabling at least some portion or operation of the detection component when the command is a disable command; and
a feedback assembly operable to provide an output to the operator related to the detected action.
2. The system of claim 1, wherein the output provided via the feedback component comprises a non-visual output.
3. The system of claim 2, wherein the non-visual output comprises an audio or haptic output.
4. The system of claim 2, wherein the feedback component is operable to modify one or more characteristics of the non-visual output based on input received from the operator of the welding system.
5. The system of claim 4, wherein the one or more characteristics include one or more of a frequency, duration, or intensity of the non-visual output.
6. The system of claim 1, wherein the disabling comprises transitioning to a minimum-functionality state configured to: in response to the identified command being a re-enable request, allowing re-enabling of the at least some portion or operation of the detection component.
7. The system of claim 4, further comprising an accessory device configured to be worn by or otherwise disposed on the operator of the welding system, wherein the accessory device comprises at least a portion of one or more of the detection component, the processing circuitry, or the feedback component, and wherein the input comprises a characteristic or parameter to specify the non-visual output via one or more input/output ports of the accessory device.
8. The system of claim 1, wherein the feedback component comprises one or more of: a display element, a vibrating element, or an audio transducer.
9. The system of claim 1, comprising an accessory device configured to be worn by or otherwise disposed on the operator of the welding system, wherein the accessory device comprises at least a portion of each of the detection component, the processing circuitry, and the feedback component.
10. The system of claim 1, wherein the detection component uses a gesture library within a local memory location or a global memory location to translate the detected action into the command, the gesture library comprising a plurality of stored commands, each stored command associated with a stored gesture.
11. A method for controlling a welding system, comprising:
detecting, by a detection component, a motion of a limb of an operator of the welding system based on sensory data acquired from one or more of a vibration sensor, a muscle movement sensor, a sound sensor, a thermal sensor, or a pressure sensor;
sending a command for controlling operation of the welding system based on the detected action; and
providing an output related to the detected action to the operator of the welding system through a feedback assembly.
12. The method of claim 11, wherein the provided output comprises a non-visual output.
13. The method of claim 12, wherein the non-visual output comprises an audio or haptic output.
14. The method of claim 12, comprising modifying one or more characteristics of the non-visual output based on input received from the operator of the welding system, the one or more characteristics comprising a frequency, duration, or intensity of the non-visual output.
15. The method of claim 11, comprising disabling at least some portions or operations of the detection component in response to the command being a disable request.
16. The method of claim 15, comprising transitioning to a minimum-functionality state in response to the disable request, wherein the minimum-functionality state is configured to allow re-enabling of the at least some disabled portions or operations of the detection component in response to the command being a re-enable request.
17. The method of claim 14, wherein:
an accessory device is used during the detection of the operator's action;
the accessory device is configured to be worn by or otherwise disposed on the operator of the welding system;
the accessory device includes at least a portion of one or more of the detection component, processing circuitry, or the feedback component; and
the input includes a characteristic or parameter to specify the non-visual output via one or more input/output ports of the accessory device.
18. The method of claim 11, comprising providing the output to the operator of the welding system by generating a signal for output via one or more of a display element, a vibrating element, and an audio transducer.
19. The method of claim 11, wherein detecting the action of the limb of the operator further comprises using an accessory device configured to be worn by or otherwise disposed on the operator of the welding system, wherein the accessory device comprises at least a portion of each of the detection component, processing circuitry, and the feedback component.
20. The method of claim 11, wherein the detection component uses a plurality of commands and associated gestures in a gesture library stored within a local memory location or a global memory location to translate the detected action into the command.
21. A system for controlling a welding system, comprising:
a detection assembly operable to detect a pose or action of an operator of a welding system;
processing circuitry, the processing circuitry operable to:
searching a plurality of commands based on the detected gesture or action, wherein each command of the plurality of commands is associated with a particular gesture or action and is a command for controlling operation of a particular component of the welding system; and
when a command is identified from the plurality of commands during the search, processing the identified command, wherein the processing of the identified command comprises:
when the identified command includes a gesture-related command, executing the identified command, or
Transmitting the identified command to a component of the welding system when the identified command is a welding command; and
a feedback component operable to provide an output to the operator related to the detected gesture or action.
22. The system of claim 21, wherein the output provided via the feedback component comprises a non-visual output.
23. The system of claim 22, wherein the non-visual output comprises an audio or haptic output.
24. The system of claim 22, wherein the feedback component is operable to modify one or more characteristics or parameters of the non-visual output based on feedback communicated to the operator of the welding system.
25. The system of claim 21, wherein the processing circuitry is operable to disable at least some gesture-related components or operations in response to the identified command being a disable request.
26. The system of claim 25, wherein the disabling includes transitioning to a minimum-functionality state configured to allow re-enabling of the at least some gesture-related components in response to the identified command being a re-enabling request.
27. The system of claim 21, wherein the detection component comprises one or more of: magnetic sensors, accelerometers, gyroscopes, vibration sensors, muscle movement sensors, cameras, optical sensors, infrared sensors, radio frequency energy detectors, sound sensors, thermal sensors and pressure sensors.
28. The system of claim 21, wherein the feedback component comprises one or more of: a display element, a vibrating element, and an audio transducer.
29. The system of claim 21, comprising an accessory device configured to be worn by or otherwise disposed on the operator of the welding system, wherein the accessory device comprises at least a portion of each of the detection component, the processing circuitry, and the feedback component.
30. The system of claim 21, wherein the plurality of commands and associated gestures or actions are stored in a gesture library within a local memory location or a global memory location.
31. The system of claim 21, wherein the system further comprises an armband-based device comprising at least a portion of each of the detection component, the processing circuitry, and the feedback component.
32. A method for controlling a welding system, comprising:
detecting a gesture or action of an operator of the welding system by the detection assembly;
identifying, by processing circuitry, a command corresponding to the detected gesture or action from a plurality of commands, wherein each command of the plurality of commands is associated with a particular gesture or action and is a command for controlling operation of a particular component of the welding system; and
Processing the identified command when a command is identified from the plurality of commands, wherein processing the identified command comprises:
when the identified command includes a gesture-related command, executing the identified command, or
Transmitting the identified command to a component of the welding system when the identified command is a welding command; and
providing an output related to the detected gesture or action to the operator of the welding system through a feedback component.
33. The method of claim 32, wherein the provided output comprises a non-visual output.
34. The method of claim 33, wherein the non-visual output comprises an audio or haptic output.
35. The method of claim 33, comprising modifying one or more characteristics or parameters of the non-visual output based on feedback communicated to the operator of the welding system.
36. The method of claim 32, comprising disabling at least some gesture-related components or operations in response to the identified command being a disable request.
37. The method of claim 36, comprising transitioning to a minimum-functionality state in response to the disable request, wherein the minimum-functionality state is configured to allow re-enabling of the at least some disabled gesture-related components or operations in response to the identified command being a re-enable request.
38. The method of claim 32, comprising detecting the pose or action of the operator of the welding system based on sensory data obtained from one or more of: magnetic sensors, accelerometers, gyroscopes, vibration sensors, muscle movement sensors, cameras, optical sensors, infrared sensors, radio frequency energy detectors, sound sensors, thermal sensors, and pressure sensors.
39. The method of claim 32, comprising providing the output to the operator of the welding system by generating a signal for output via one or more of a display element, a vibrating element, and an audio transducer.
40. The method of claim 32, wherein:
using an accessory device during the detection of the operator's gesture or action;
the accessory device is configured to be worn by or otherwise disposed on the operator of the welding system; and
the accessory device includes at least a portion of each of the detection component, the processing circuitry, and the feedback component.
41. The method of claim 32, wherein the plurality of commands and associated gestures or actions are stored in a gesture library within a local memory location or a global memory location.
42. The method of claim 32, wherein the method further comprises using an armband-based device comprising at least a portion of each of the detection component, the processing circuitry, and the feedback component.
CN201680028213.2A 2015-03-17 2016-01-19 Armband-based system and method for controlling welding equipment using gestures and similar actions Expired - Fee Related CN107635710B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US14/659,853 2015-03-17
US14/659,853 US10987762B2 (en) 2014-09-30 2015-03-17 Armband based systems and methods for controlling welding equipment using gestures and like motions
PCT/US2016/013867 WO2016148772A1 (en) 2015-03-17 2016-01-19 Armband based systems and methods for controlling welding equipment using gestures and like motions

Publications (2)

Publication Number Publication Date
CN107635710A CN107635710A (en) 2018-01-26
CN107635710B true CN107635710B (en) 2020-05-19

Family

ID=55404786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680028213.2A Expired - Fee Related CN107635710B (en) 2015-03-17 2016-01-19 Armband-based system and method for controlling welding equipment using gestures and similar actions

Country Status (3)

Country Link
EP (1) EP3271104A1 (en)
CN (1) CN107635710B (en)
WO (1) WO2016148772A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109891342B (en) * 2016-10-21 2022-12-06 通快机床两合公司 Manufacturing control based on indoor personnel location in the metal working industry
JP2019190733A (en) * 2018-04-25 2019-10-31 日本電産サンキョー株式会社 Ice maker and control method for ice maker
CN110413135A (en) * 2018-04-27 2019-11-05 开利公司 Postural entry control system and method of operation
GB201812080D0 (en) * 2018-07-24 2018-09-05 Kano Computing Ltd Motion sensing controller
US12042887B2 (en) 2019-05-22 2024-07-23 Illinois Tool Works Inc. Weld monitoring systems with unknown downtime disabling
CN111975171A (en) * 2019-05-22 2020-11-24 伊利诺斯工具制品有限公司 Welding monitoring system with unknown downtime disablement
CN113211390B (en) * 2020-06-03 2022-05-27 德丰电创科技股份有限公司 System for controlling electric equipment to operate
US12246399B2 (en) 2020-06-25 2025-03-11 Illinois Tool Works Inc. Systems and methods for part tracking using machine learning techniques
US12251773B2 (en) 2020-07-28 2025-03-18 Illinois Tool Works Inc. Systems and methods for identifying missing welds using machine learning techniques

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR950003258B1 (en) * 1990-03-31 1995-04-07 기아자동차 주식회사 Deposited metal detector of a spot welder
CN1266391A (en) * 1997-08-08 2000-09-13 株式会社安川电机 Arc welding monitoring device
CN1780712A (en) * 2004-06-16 2006-05-31 菅机械产业株式会社 Control system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009146359A1 (en) * 2008-05-28 2009-12-03 Illinois Tool Works Inc. Welding training system
AT508094B1 (en) * 2009-03-31 2015-05-15 Fronius Int Gmbh METHOD AND DEVICE FOR OPERATING A POWER SOURCE ASSOCIATED WITH A HAND-HELD WORK EQUIPMENT
US9993891B2 (en) * 2010-07-14 2018-06-12 Illinois Tool Works Inc. Welding parameter control via welder motion or position monitoring

Also Published As

Publication number Publication date
CN107635710A (en) 2018-01-26
WO2016148772A1 (en) 2016-09-22
EP3271104A1 (en) 2018-01-24

Similar Documents

Publication Publication Date Title
CN107635710B (en) Armband-based system and method for controlling welding equipment using gestures and similar actions
US10987762B2 (en) Armband based systems and methods for controlling welding equipment using gestures and like motions
US11654501B2 (en) Systems and methods for gesture control of a welding system
US20220161349A1 (en) Remote Power Supply Parameter Adjustment
US9922460B2 (en) Stereoscopic helmet display
US11103948B2 (en) Systems and methods for a personally allocated interface for use in a welding system
WO2009146359A1 (en) Welding training system
CN107735205B (en) Welding output control by a welding vision system
EP4044155A1 (en) Weld tracking systems
EP3789151B1 (en) Gas tungsten arc welding training system, and a method of operating a gas tungsten arc welding system
EP4238680B1 (en) Non transitory computer readable medium, method and welding system for calibration procedures for helmet based weld tracking systems
US20240046816A1 (en) Weld modules for weld training systems
CN107530839B (en) Wearable technology for interfacing with welding and monitoring devices using wireless technology
EP3484649B1 (en) Wearable technology for interfacing with welding equipment and monitoring equipment using wireless technologies
EP4064247A1 (en) Welding simulation systems with observation devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20200519
Termination date: 20220119