US11103994B2 - System and method for natural tasking of one or more robots - Google Patents
System and method for natural tasking of one or more robots
- Publication number
- US11103994B2 (application US16/025,544)
- Authority
- US
- United States
- Prior art keywords
- natural
- freedom
- degrees
- robotic
- robot task
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1653—Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Definitions
- the invention generally relates to robotics and more specifically to a system and method for under-constraining a robot or robotic system.
- a gymnast, for example, does not model the exact position of hand placements on bars, but rather models the bars as having one degree of freedom in placement.
- tasks like welding and painting have an extra degree of freedom in how the electrode or nozzle is rotated about its primary axis, and the operator tasking a robot benefits by working with these natural constraints.
- a part to be grasped by an industrial robot may have symmetry that allows grasping in multiple ways. Natural tasking specifies the constraint on the robot in its minimal and least restrictive way.
- a system may include a robotic system having a maximum number of degrees of freedom.
- the system may further include a graphical user interface configured to receive a natural robot task having at least one natural workpiece constraint associated with the natural robot task.
- the system may also include a processor configured to identify a minimum number of degrees of freedom required to perform the natural robot task, wherein the minimum number of degrees of freedom is based upon, at least in part, the at least one natural workpiece constraint associated with the natural robot task.
- the at least one natural workpiece constraint associated with the natural robot task may be selected from the group including directional constraints, assignment to a curve or surface, distance constraints, joint positions, linear and nonlinear functional combinations of joint positions, workpiece geometry, workpiece features, center of mass, and linear and angular momentum.
- the system may include a scanner configured to identify the at least one natural workpiece constraint associated with the natural robot task.
- the system may include a simulator configured to determine a feasible or an optimized implementation of the natural robot task.
- the natural robot task may be received in conjunction with a learning demonstration. The minimum number of degrees of freedom may be less than the maximum number of degrees of freedom.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be specified by a user.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be determined automatically, in response to the scanner identification.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be updated in real-time during an operation of the robotic system.
- the minimum number of degrees of freedom may be based upon, at least in part, sensor data.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be updated using a communication channel before, during, or after an operation of the robotic system.
- the graphical user interface may be at least one of a two-dimensional interface or a three-dimensional interface.
- the minimum number of degrees of freedom may be identified without referencing a robotic system frame used to control the robotic system.
- the minimum number of degrees of freedom may be identified by referencing only one or more features and a geometry of a workpiece.
- the processor may be configured to control the robotic system to perform the natural robot task by exploring an extra degree of freedom, wherein the extra degree of freedom is based upon, at least in part, a difference between the maximum number of degrees of freedom and the minimum number of degrees of freedom. Exploring the extra degree of freedom may include optimizing a criterion using the extra degrees of freedom to solve a gradient-based problem that seeks to minimize the criterion.
- the processor and scanner may be configured to define a workspace prior to performing a robotic task.
- the natural robot task may include at least one natural environmental constraint associated with the natural robot task.
- the at least one natural workpiece constraint may include one or more geometrical features associated with a particular workpiece.
- the at least one natural environmental constraint includes a peg-in-hole insertion, line following, camera pointing, grasping a symmetrical object, placing an object on a flat surface, contacting a flat object with a suction cup, steering a mobile base, or controlling the center of mass of a mobile robot to produce stability.
- a method may include providing a robotic system having a maximum number of degrees of freedom associated therewith.
- the method may further include receiving, at a graphical user interface, a natural robot task having at least one natural workpiece constraint associated with the natural robot task.
- the method may also include identifying, using at least one processor, a minimum number of degrees of freedom required to perform the natural robot task, wherein the minimum number of degrees of freedom is based upon, at least in part, the at least one natural workpiece constraint associated with the natural robot task.
- the at least one natural workpiece constraint associated with the natural robot task may be selected from the group including directional constraints, assignment to a curve or surface, distance constraints, joint positions, linear and nonlinear functional combinations of joint positions, workpiece geometry, workpiece features, center of mass, and linear and angular momentum.
- the method may include identifying, based upon, at least in part, a scanner, the at least one natural workpiece constraint associated with the natural robot task.
- the method may further include determining, based upon, at least in part, a simulation, a feasible or an optimized implementation of the natural robot task.
- the natural robot task may be received in conjunction with a learning demonstration.
- the minimum number of degrees of freedom may be less than the maximum number of degrees of freedom.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be specified by a user.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be determined automatically, in response to the scanner identification.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be updated in real-time during an operation of the robotic system.
- the minimum number of degrees of freedom may be based upon, at least in part, sensor data.
- the at least one natural workpiece constraint associated with the natural robot task or the identification of a minimum number of degrees of freedom may be updated using a communication channel before, during, or after an operation of the robotic system.
- the graphical user interface may be at least one of a two-dimensional interface or a three-dimensional interface.
- the minimum number of degrees of freedom may be identified without referencing a robotic system frame used to control the robotic system.
- the minimum number of degrees of freedom may be identified by referencing only one or more features and a geometry of a workpiece.
- the method may further include controlling, using the at least one processor, the robotic system to perform the natural robot task by exploring an extra degree of freedom, wherein the extra degree of freedom is based upon, at least in part, a difference between the maximum number of degrees of freedom and the minimum number of degrees of freedom.
- the method may also include defining a workspace prior to performing a robotic task wherein defining is based upon, at least in part, data from the processor and the scanner.
- the natural robot task may include at least one natural environmental constraint associated with the natural robot task.
- the at least one natural workpiece constraint may include geometrical features associated with a particular workpiece.
- the at least one natural environmental constraint may include a peg-in-hole insertion, line following, camera pointing, grasping a symmetrical object, placing an object on a flat surface, contacting a flat object with a suction cup, steering a mobile base, or controlling the center of mass of a mobile robot to produce stability.
- a manipulator Jacobian may be used, at least in part, to calculate motion associated with the robotic system.
- FIG. 1 is a block diagram of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 2 is a graphical user interface showing multiple degrees of freedom of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 3 is a block diagram of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 4 is a block diagram of a velocity control approach for use in a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 5 is a flowchart of a natural tasking robotic method, according to an embodiment of the present disclosure.
- FIG. 6 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 7 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 8 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 9 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 10 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 11 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 12 is a graphical user interface of a natural tasking robotic system, according to an embodiment of the present disclosure.
- FIG. 13 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 14 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 15 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 16 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 17 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 18 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 19 is a graphical user interface of a natural tasking robotic system in a welding application, according to an embodiment of the present disclosure.
- FIG. 20 is a graphical user interface of a natural tasking robotic system in a robotic assembly application, according to an embodiment of the present disclosure.
- FIG. 21 is a graphical user interface of a natural tasking robotic system in a robotic assembly application, according to an embodiment of the present disclosure.
- FIG. 22 is a graphical user interface of a natural tasking robotic system in a robotic assembly application, according to an embodiment of the present disclosure.
- FIG. 23 is a graphical user interface of a natural tasking robotic system in a robotic assembly application, according to an embodiment of the present disclosure.
- Embodiments of the subject application may include concepts from U.S. Pat. Nos. 6,757,587, 7,680,300, 8,301,421, 8,408,918, 8,428,781, and 9,357,708, U.S. Publication No. 2015/0199458, U.S. Publication No. 2016/0321381, and U.S. Publication No. 2018/0060459, the entire contents of each of which are incorporated herein by reference.
- System 100 may include a plurality of components, portions of which may be designed for a particular application and/or task.
- the first component of the system may include a software system 102 for adding new processes to a database 104 .
- Once built, database 104 may be reused by operators in the field or remotely. Operators may select an element from the database 104 using a graphical user interface 106 for execution by the control software 108, as is shown in FIG. 1.
- Procedures for the particular application and/or task (e.g., welding, robotic assembly, etc.) may be stored in database 104.
- This database 104 may be used with a graphical user interface 106 and tasking software online to develop each procedure for each task.
- the software modules may include, but are not limited to, training, tasking, and performance of the particular task. All of this may be used to control the manner of operation of robotic hardware 110.
- Robotic hardware 110 may respond to the controls received from control software 108; however, it should be noted that the robotic hardware itself may be limited by its own maximum number of degrees of freedom, as is discussed in further detail hereinbelow.
- degrees of freedom may refer to specific, defined modes in which a mechanical device or system can move.
- the number of degrees of freedom may be equal to the total number of independent displacements or aspects of motion.
- a six degrees of freedom (“6 DOF”) scenario may refer to the freedom of movement of a rigid body in three-dimensional space.
- the body may be free to change position as forward/backward (surge), up/down (heave), left/right (sway) translation in three perpendicular axes, combined with changes in orientation through rotation about three perpendicular axes, often termed yaw (normal axis), pitch (transverse axis), and roll (longitudinal axis).
- placing a point in space may correspond to three degrees of freedom, while specifying a distance between two points on different links corresponds to one degree of freedom, etc.
- the phrase “robotic system”, as used herein, may include a system of one, two, and/or any number of robots.
- an entire robotic system DOF may refer to the sum of the DOFs on each of the individual robots. This may include one DOF for each single-axis joint, and six DOF for a free-moving base. For example, for a robotic system that includes two robots, one having 6 DOF and the other having 5 DOF, the available entire robotic system degrees of freedom may be 11 DOF.
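The counting rule above can be made concrete with a minimal Python sketch (the function name and signature are illustrative, not from the patent):

```python
def system_dof(joint_counts, free_bases=0):
    """Entire-system DOF: one DOF for each single-axis joint summed
    across all robots, plus six DOF for each free-moving base."""
    return sum(joint_counts) + 6 * free_bases

assert system_dof([6, 5]) == 11  # the two-robot example above
```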
- Embodiments included herein may be configured to explore an extra number of degrees of freedom of one or many robots to perform a robotic task naturally.
- an entire robotic system may include a maximum number of degrees of freedom ("N") associated therewith, from which a minimum number of DOF ("M") required for a task may be subtracted.
- a 3D rendering from a graphical user interface 200 is provided, depicting a welding tool having five degrees of freedom contrasted with a welding tool having six degrees of freedom (full 3D position and orientation).
- the difference between the two is that the five degrees of freedom may be found by relaxing rotation about one axis.
- This particular model was tuned to optimize a scan and weld process. This involved configuring the bounding volumes around the scanner tool, as well as aligning the system and base primary frames to make the tool paths and point clouds appear in the correct locations.
- the tool offsets were also configured based on how the tool paths were created from the point clouds. The paths were created along the bottom of the seams, so an offset from the tip was configured so there were no collisions.
- as shown in FIG. 2, the arm may be constrained using two different constraint sets, with each one using a different tool offset, one for scanning the part and one for welding.
- the degrees of freedom set for scanning uses a six degree of freedom frame, while the welding set uses a five degree of freedom frame that allows the torch to freely rotate about its tip. This allows scanning of the part first, then relaxing of the degrees of freedom for the weld paths, which are more difficult for the robot to achieve given its workspace envelope.
- Natural tasking process 300 may include defining 302 robots, tools, parts, and a robotic environment. Process 300 may further include defining 304 one or more constraints based upon, tasks, tool geometry, part geometry, etc. Process 300 may also include defining 306 one or more task motions using the constraints. Process 300 may be configured to define a minimal number of constraints naturally through analysis of geometry (e.g., tool geometry), such as by finding axes of symmetry or near-symmetry. In this way, the naturalness of the constraint as well as an ability to exploit extra degrees of freedom in robot motion allow for natural control over the robot.
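One plausible way to detect such an axis of symmetry from tool geometry is to rotate the tool's point cloud about a candidate axis and check that it maps approximately onto itself. The sketch below assumes NumPy and SciPy, uses the z-axis as the candidate, and is not the patent's algorithm:

```python
import numpy as np
from scipy.spatial import cKDTree

def is_symmetric_about_z(points, n_angles=8, tol=1e-3):
    """Heuristic near-symmetry test: rotate the tool's point cloud
    about the z-axis and check it maps approximately onto itself."""
    tree = cKDTree(points)
    for angle in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)[1:]:
        c, s = np.cos(angle), np.sin(angle)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        dist, _ = tree.query(points @ R.T)  # distance to nearest original point
        if dist.max() > tol:
            return False  # no rotational near-symmetry about this axis
    return True  # rotation about z may be relaxed to a free DOF
```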
- Natural control may be based upon, at least in part, both natural constraints as well as natural optimization of those constraints, as will be discussed in further detail below.
- the task may be executed 308 using a Jacobian-based velocity control approach as is discussed below.
- a block diagram 400 consistent with embodiments of the natural tasking robotic system of the present disclosure is provided.
- Control of the end effectors of the robotic arms in a system may be accomplished using multiple components (e.g., using Energid's Actin® software available from the Assignee of the present disclosure). These components may include, but are not limited to, outer position controller 402 and inner velocity controller 404 .
- This combined control system 400 allows the program to “fly” the end effector throughout task space by specifying a desired placement, and joint velocities and positions may be calculated automatically. This frees the operator to focus on the task, rather than low-level control of the robot.
- the velocity control builds on the manipulator Jacobian equation:
V = J(q)q̇ (1)
- V is an m-length vector representation of the motion of the hand or hands (usually some combination of linear and angular velocity referenced to points rigidly attached to parts of the manipulator); q is the n-length vector of joint positions (with q̇ being its time derivative); and J is the m×n manipulator Jacobian, a function of q.
- V is often the frame velocity with three linear and three angular components. As used herein, it takes on a larger meaning that includes the concatenation of point, frame, or other motion of multiple end-effectors.
- the Jacobian J(q) is the matrix that makes (1) true for all possible values of q̇. Note V can represent a concatenation of values for multiple end effectors, enabling coordinated motion of multiple points on the manipulator.
- the velocity control question is the following: given a desired tool motion V, what are the joint rates q̇ that best achieve this motion?
- the framework is based, in part, on an approach that uses a scalar α, a matrix function W(q), and a scalar function f(q) to solve for q̇ given V by minimizing ½ q̇ᵀ W q̇ + α ∇f(q)ᵀ q̇ subject to (1).
- Both ∇f and W are generally functions of q.
- Embodiments of the present disclosure go beyond the formulation above to create a more general framework. Instead of insisting on the use of the gradient of a function, a general column vector F(q) is used. Not all vector functions are gradients. This minor, but important, modification yields the following criterion to be minimized subject to (1):
½ q̇ᵀ W q̇ + α Fᵀ q̇ (4)
- α, W, and F can be defined using XML to give many different types of velocity control.
- position control system 402 may build upon the velocity control system 404 to give robust position control to the end effectors.
- a position control module may be supplied as an interface to the position control system 402 to enable the addition of new position control algorithms.
- Position control system 402 may provide basic position control for the end effectors by building upon velocity control system 404 . Given the joint velocities from velocity control system 404 , position control system 402 may use Euler integration to calculate the joint positions. Once the joint positions are calculated, the end effector positions are known. The control system may check for joint limit exceedance and collisions as it is iterated forward in time. It may also zero the joint rate commands if a singularity is detected.
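A sketch of the Euler-integration step described above (the names and the limit-handling policy are assumptions, not the patent's):

```python
import numpy as np

def euler_step(q, qdot, dt, q_min, q_max):
    """One Euler-integration step of the position loop; zero the rate
    command if the step would exceed a joint limit (a full controller
    would also check collisions and singularities)."""
    q_next = q + qdot * dt
    if np.any(q_next < q_min) or np.any(q_next > q_max):
        return q, np.zeros_like(qdot)  # hold position, zero the rates
    return q_next, qdot
```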
- a key component of the robot control, which may involve multiple mechanisms, is to avoid collisions with the environment.
- a collision avoidance algorithm may be used based on defined bounding volumes for every shape of interest in the simulation.
- This algorithm employs a gradient-based method that seeks to minimize a function with the general form f(q) = Σ_{j=1..N} Σ_{i=1..B} [F(i,j)]^p, where:
- N is the number of links in the manipulator
- B is the number of obstacles in the environment
- p is a user-defined exponent
- F(i,j) is a measure of the proximity of the bounding volume of obstacle i to the bounding volume of link j.
- F(i,j) is zero when the distance is larger than a user-specified threshold, and within the threshold, it is just a scalar constant times the minimum distance needed to move one bounding volume to take it outside the threshold distance from the other bounding volume.
- this function may be used with finite differencing to calculate its gradient, which may then be used as the vector parameter to the core control system to drive the manipulators in a direction that avoids collisions.
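A sketch of the proximity cost and its finite-difference gradient, assuming a user-supplied callable that maps q to a scalar cost (all names are illustrative):

```python
import numpy as np

def collision_cost(distances, threshold, c, p):
    """f = sum over obstacle/link pairs of F(i,j)**p, where F is zero
    beyond the threshold and c*(threshold - distance) within it."""
    F = np.where(distances < threshold, c * (threshold - distances), 0.0)
    return np.sum(F ** p)

def finite_difference_gradient(cost_of_q, q, eps=1e-6):
    """Central-difference gradient of a scalar cost with respect to q."""
    grad = np.zeros_like(q)
    for k in range(q.size):
        dq = np.zeros_like(q)
        dq[k] = eps
        grad[k] = (cost_of_q(q + dq) - cost_of_q(q - dq)) / (2.0 * eps)
    return grad
```

The resulting gradient can then be passed as the vector F(q) to the velocity controller to bias motion away from obstacles.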
- the obstacles in the environment may be well defined, such as for other manipulators that broadcast their state, or they may be determined in real-time from vision or other sensing mechanisms.
- control methodologies discussed above may be used, at least in part, to allow the controller to relate the entire robotic system DOF (e.g., the available DOF of the system, "N") to the required minimum number of degrees of freedom identified based on the workpiece constraints alone ("M").
- the robotic system and method described herein may only need workpiece data in order to execute a natural robot task.
- the DOF may be based upon other factors or constraints as well; however, this information may not be necessary.
- Embodiments of the natural tasking robotic system and methods provided herein may serve to provide better control and/or increased performance of the robotic system, more feasible and/or optimized solutions for a robotic task, etc.
- the process 500 may include providing 502 a first robot of one or more robots, the first robot having a maximum number of degrees of freedom associated therewith.
- Process 500 may further include receiving (504), at a graphical user interface, a natural robot task having at least one natural workpiece constraint associated with the natural robot task.
- Process 500 may also include identifying (506), using at least one processor, a minimum number of degrees of freedom required to perform the natural robot task, wherein the minimum number of degrees of freedom is based upon, at least in part, the at least one natural workpiece constraint associated with the natural robot task.
- maximum number of degrees of freedom may refer to a maximum number of specific, defined modes in which a mechanical device, robot(s), and/or system may move.
- minimum number of degrees of freedom may refer to a minimum number of specific, defined modes in which a mechanical device, robot(s), and/or system may move to execute a task naturally.
- natural robot task may refer to a task that may be performed using a reduced and/or simplified amount of motion.
- specifying a natural robot task may be contrasted with a math-intensive way of specifying a task that may require a user to specify the task in the robotic system's coordinate frame.
- the natural robot task may be specified/demonstrated to the robotic system using one or more robots or tools and the minimum number of degrees of freedom may be determined based upon, at least in part, the natural robot task.
- the minimum number of degrees of freedom may be determined through analysis of tool, robot, and/or workpiece geometry. It may be specified using a demonstration, teach-and-learn robotic techniques, a GUI, etc.
- the naturalness of the task may refer to the degree of alignment between 1) the commanded robot task and 2) the actual desired thing to be done.
- natural workpiece constraint may relate to geometrical features, material properties, and/or any other aspect associated with a particular workpiece.
- natural environmental constraint may relate to geometrical features, obstacles, material properties, and/or any other aspect associated with an environment that may be associated with a natural robot task.
- workpiece constraints, environmental constraints, and manipulator constraints may form a continuum in this approach, and multiple natural constraints can be combined to define a comprehensive tasking movement. Some of these may include, but are not limited to, directional constraints, orientation constraints, assignment to a curve or surface, distance constraints, joint positions, linear and nonlinear functional combinations of joint positions, center of mass, and linear and angular momentum. These constraints may vary as a function of robot configuration. Tool offsets, static objects or obstacles, and dynamic objects or obstacles may be used to define and configure the constraints. In some embodiments, the identification of the minimum number of degrees of freedom may be based on the workpiece geometry and/or its features, some of which may include, but are not limited to, holes, shapes, materials, sizes, etc.
- one or more constraints may be combined and/or specified by the user using a visual editor, such as the GUIs shown in FIGS. 6-12, to create complex combinations of constraints that change over time to best achieve a real-world task. It should be noted that some or all of these constraints may be changed during execution of a robot task or operation, even time step by time step, and may be combined with optimizations such as collision and joint-limit avoidance. In this way, the number and type of constraints may be updated, dynamically, in real-time. For example, in some embodiments, some or all of the constraints may be automatically determined and/or identified using a scanner that may be configured to scan the robotic environment, allow for the necessary processing, and support the subsequent determinations.
- Various sensors and devices may be used to obtain this data, some of which may include, but are not limited to, data from lidar, stereo cameras, color cameras, multispectral cameras, etc. Additionally and/or alternatively, some or all of the constraints may be updated remotely over a communication channel (e.g., radio, laser, or network communication, etc.).
- the constraints are set using a 2D graphical user interface or a 3D interface (e.g., augmented reality, etc.).
- the constraints may be of the "less than or equal to" variety, not just "equal to". Accordingly, the natural tasking robotic system included herein may allow a user to constrain a point to be above the floor, not just on it, for example.
- embodiments of the natural tasking robotic system described herein may include one or more robots having a maximum number of degrees of freedom “N”.
- the system may allow a user to specify (e.g., teaching the robotic system by example, performing a robot “watch” and learn using one or more sensors, using a GUI such as a teach pendant to perform a demo, etc.) a task naturally by indicating only what needs to be done, such as insert a peg in a hole (using five degrees of freedom in constraint), follow a line with the tip of a welding tool (using three degrees of freedom in constraint), or point a camera at a location in space (using two degrees of freedom in constraint), etc.
- Specifying a task naturally may include specifying the task using a demonstration, for example, teaching or learning the robot how to perform the task. Additionally and/or alternatively, the user may not specify how the robotic system may perform the task and the system may determine how to perform the task automatically or using minimal inputs.
- the natural tasking robotic system may be configured to determine, via exploring the extra (N−M) degrees of freedom, the most natural or efficient manner in which to perform the task. This may involve optimizing secondary criteria using the extra degrees of freedom, such as the criterion in (4).
- a controller (e.g., Actin®) may use the extra number of degrees of freedom to determine more convenient, faster, energy-efficient, and/or feasible ways to perform the task.
- the way or manner of performing may be based upon, at least in part, the determination of collision free paths, coordination of robots (if more than one), etc.
- executing or performing the natural robot task may be at least partly employed using various software methodologies (e.g., using Energid's Actin® software available from the Assignee of the present disclosure). Determining the way or manner of performing the task naturally may also include using one or more simulators that may be configured to perform a simulation. The results of such a simulation may be used to determine how to perform the task.
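One standard way (not necessarily the patent's) to explore the extra N−M degrees of freedom is to project the gradient of a secondary criterion into the null space of the task Jacobian, so the primary task motion is untouched. A minimal NumPy sketch with illustrative names:

```python
import numpy as np

def redundant_rates(J, V, grad_secondary, k=1.0):
    """Achieve the primary task with the pseudoinverse while the extra
    degrees of freedom descend a secondary criterion in the null space
    of J, leaving the task motion V unchanged."""
    J_pinv = np.linalg.pinv(J)
    null_proj = np.eye(J.shape[1]) - J_pinv @ J   # null-space projector
    return J_pinv @ V - k * (null_proj @ grad_secondary)
```

Because J·(I − J⁺J) = 0, any descent of the secondary criterion happens entirely in motions invisible to the task.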
- exploring the extra degrees of freedom may include completely relaxing some DOF (e.g., the “Z” dimension). Additionally and/or alternatively, embodiments included herein may be configured to relax a DOF partially, for example, allow Z to free spin within + ⁇ 15 degrees. The same concept may be applied to any DOF to provide flexibility when specifying a natural task. Accordingly, this may allow the robot system to find a solution in a more efficient manner.
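A hypothetical data structure for the kind of per-axis relaxation described above (fully locked, free within a band, or fully free); the names are illustrative, not the patent's interface:

```python
import math
from dataclasses import dataclass

@dataclass
class AxisConstraint:
    """Per-axis constraint: locked, free within a band, or fully free."""
    locked: bool = True
    band_deg: float = 0.0  # half-width of allowed motion when unlocked;
                           # math.inf means the axis is fully relaxed

rz_scan = AxisConstraint(locked=True)                      # 6-DOF frame
rz_weld = AxisConstraint(locked=False, band_deg=math.inf)  # free spin
rz_band = AxisConstraint(locked=False, band_deg=15.0)      # +/-15 degrees
```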
- embodiments depicting graphical user interfaces consistent with a natural tasking robotic system are provided.
- embodiments of the present disclosure may be configured to define how the robot or robots are constrained based on the task. This may involve only constraining what is necessary.
- the user may define which links are constrained, and how, using the general spatial constraint. This may involve selecting a constraint, specifying velocities, and/or defining tool offsets based on the geometry.
- the user may also use GUI 700 of FIG. 7 to configure one or more tool or robot offsets as is shown in FIG. 8 .
- referring also to FIGS. 9-10, embodiments depicting graphical user interfaces 900, 1000 configured to define a tool path and a part-relative waypoint, respectively, are provided.
- the user may define the task relative to the objects to be manipulated (e.g., the task's natural reference frame). This may be achieved using one or more waypoint poses, paths, and/or other custom defined motions. Each motion may include its own individual GUI.
- the final step may involve scripting out the order of the operations to be performed.
- referring also to FIGS. 11-12, embodiments depicting graphical user interfaces 1100, 1200 configured to edit constraints and manage tool offsets, respectively, are provided.
- FIG. 11 includes a constraint editor that includes an edit tool offset option.
- the “Edit Tool Offsets” option may allow users to edit the tool offset in context, as shown in FIG. 12 .
- selecting the option may display a tool offset editor, pre-selected to the offset in use by the constraint.
- referring also to FIGS. 13-19, embodiments of a natural tasking robotic system as applied to a welding process are provided.
- minimal constraints may be specified in their natural reference frame, further simplifying the tasking process.
- An example is a task performed by one robot on a part held by another, such as the welding task shown in FIG. 13 .
- the welding task may be defined in the reference frame of the part being held by two robots. This constraint approach is natural in specifying exactly what needs to be done, and allowing motions of all robots to achieve it.
- the system may be configured to constrain the grippers relative to the parts they intend to grasp. No matter where in the workspace the parts are, the script commands them relative to their location.
- FIG. 14 shows an example where two parts may be mated together. This may involve setting the tool offset on one arm to the mating coordinate of the part it is holding, and setting the goal to be the mating coordinate of the other part, which may be held by the other arm. The arms may be constrained relative to each other, so there is no specific location in the system frame where these parts will be mated.
- the third arm may move the welder into position relative to the assembly, which may be held by the other two arms.
- the position of the assembly in the workspace may not be constrained; only the weld gun may be constrained relative to the assembly, and its roll and pitch may be constrained in the system frame.
- the welder may move along the path and the other two arms may automatically manipulate the part to achieve the path motion. It should be noted that the assembly appears to travel randomly around the workspace. After the path is complete, one gripper may retract relative to the assembly, again unconstrained in the system frame, and then move to a nominal system position.
- This example shows how minimal constraints are specified in their natural reference frame, and may be used to complete a very complex motion coordinated between three arms.
- positioning a part in an assembly task may not require a full 6 DOF constraint, depending on the geometry of the parts.
- This particular example shows positioning an axle through a wheel and a carriage.
- the arm holding the axle bolt may be free to rotate about the axle bolt axis, and the arm holding the wheel is also free to rotate about the wheel axis (e.g. Y-axis).
- the first step in the process of inserting the axle may include grasping the parts and then setting the tool offset to the relevant coordinates.
- for one arm, the offset is the center of the wheel, and for the other, the center of the axle.
- the wheel may then be positioned relative to the housing.
- the arm holding the wheel may be free to rotate about the wheel axis, and the arm holding the axle is free to rotate about the axle axis.
- the next step is insertion, which may involve commanding the hand to move the pin into position with its current constraints.
- the arm holding the wheel may be free to rotate about its axis and avoid collision with the other arm.
- the last step may include releasing and retracting relative to the parts being manipulated. This example shows how minimal constraints may be specified in their natural reference frame, and how they may be used to complete an assembly motion coordinated between two robot arms.
- aspects of the present disclosure may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave.
- Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
- a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
- Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may, but not always, represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
Description
- Desired task (1): Place a cup on a desk.
- Unnatural: Place a cup with handle pointing to the right at x=30, y=50 cm from the corner.
- Natural: Place the cup anywhere within the bounds of the table with the handle pointed in any direction.
- Desired task (2): Point a camera on the robot at 3D point x=1, y=2, z=2 in space.
- Unnatural: Place the camera at x=0, y=0.2, z=0.3 and point it at x=1, y=2, z=2.
- Unnatural: Point the camera at {1,2,2} from any location with up in the image aligned with up in the real world.
- Natural: With no constraint on camera position, point the camera at {1,2,2} with any orientation.
- Desired task (3): Spray paint a part using a nozzle 1 m away and pointed directly at the center of the part.
- Unnatural: Spray from 1 m away with the nozzle pointed at the part and specially oriented about its axis.
- Natural: Spray from 1 m away with no special orientation of the nozzle other than pointing it at the part.
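The contrast above could be captured declaratively, for example in the following hypothetical schema (not the patent's interface); each task lists only what it truly requires, and anything unspecified is left to the robot's extra degrees of freedom:

```python
place_cup = {
    "position": "anywhere_on(table_top)",  # constrains height only
    "orientation": "free",
}
point_camera = {
    "aim_at": (1.0, 2.0, 2.0),             # two constrained DOF
    "position": "free",
    "roll_about_optical_axis": "free",
}
spray_paint = {
    "aim_at": "part_center",
    "distance_to_part_m": 1.0,
    "spin_about_nozzle_axis": "free",
}
```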
Claims (42)
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/025,544 US11103994B2 (en) | 2018-07-02 | 2018-07-02 | System and method for natural tasking of one or more robots |
PCT/US2019/040289 WO2020010076A1 (en) | 2018-07-02 | 2019-07-02 | System and method for natural tasking of one or more robots |
JP2020569928A JP7487118B2 (en) | 2018-07-02 | 2019-07-02 | System and method for natural robot task of one or more robot cross-referencing paragraphs |
CN201980041397.XA CN112384335A (en) | 2018-07-02 | 2019-07-02 | System and method for natural task assignment for one or more robots |
CA3103283A CA3103283A1 (en) | 2018-07-02 | 2019-07-02 | System and method for natural tasking of one or more robots |
MX2020014190A MX2020014190A (en) | 2018-07-02 | 2019-07-02 | System and method for natural tasking of one or more robots. |
SG11202011973VA SG11202011973VA (en) | 2018-07-02 | 2019-07-02 | System and method for natural tasking of one or more robots |
EP19779222.9A EP3817899A1 (en) | 2018-07-02 | 2019-07-02 | System and method for natural tasking of one or more robots |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/025,544 US11103994B2 (en) | 2018-07-02 | 2018-07-02 | System and method for natural tasking of one or more robots |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200001457A1 US20200001457A1 (en) | 2020-01-02 |
US11103994B2 true US11103994B2 (en) | 2021-08-31 |
Family
ID=68073154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/025,544 Active 2039-04-05 US11103994B2 (en) | 2018-07-02 | 2018-07-02 | System and method for natural tasking of one or more robots |
Country Status (8)
Country | Link |
---|---|
US (1) | US11103994B2 (en) |
EP (1) | EP3817899A1 (en) |
JP (1) | JP7487118B2 (en) |
CN (1) | CN112384335A (en) |
CA (1) | CA3103283A1 (en) |
MX (1) | MX2020014190A (en) |
SG (1) | SG11202011973VA (en) |
WO (1) | WO2020010076A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11103994B2 (en) * | 2018-07-02 | 2021-08-31 | Teradyne, Inc. | System and method for natural tasking of one or more robots |
WO2022221136A1 (en) * | 2021-04-16 | 2022-10-20 | Dexterity, Inc. | Repositionable robot riser |
Citations (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6004016A (en) * | 1996-08-06 | 1999-12-21 | Trw Inc. | Motion planning and control for systems with multiple mobile objects |
US6157368A (en) * | 1994-09-28 | 2000-12-05 | Faeger; Jan G. | Control equipment with a movable control member |
US6274839B1 (en) * | 1998-12-04 | 2001-08-14 | Rolls-Royce Plc | Method and apparatus for building up a workpiece by deposit welding |
US6385508B1 (en) * | 2000-10-31 | 2002-05-07 | Fanuc Robotics North America, Inc. | Lead-through teach handle assembly and method of teaching a robot assembly |
US6757587B1 (en) | 2003-04-04 | 2004-06-29 | Nokia Corporation | Method and apparatus for dynamically reprogramming remote autonomous agents |
US20050166413A1 (en) * | 2003-04-28 | 2005-08-04 | Crampton Stephen J. | CMM arm with exoskeleton |
US7016539B1 (en) * | 1998-07-13 | 2006-03-21 | Cognex Corporation | Method for fast, robust, multi-dimensional pattern recognition |
US20070274812A1 (en) | 2006-05-29 | 2007-11-29 | Fanuc Ltd | Workpiece picking device and method |
WO2008119383A1 (en) | 2007-03-30 | 2008-10-09 | Abb Technology Ab | Method and apparatus for programming an industrial robot |
US7680300B2 (en) | 2004-06-01 | 2010-03-16 | Energid Technologies | Visual object recognition and tracking |
US20110172819A1 (en) * | 2010-01-14 | 2011-07-14 | Samsung Electronics Co., Ltd. | Manipulator and control method thereof |
US20120017507A1 (en) * | 2010-07-21 | 2012-01-26 | Cognisense Labs, Inc. | Automated organic polarized object organization |
US8301421B2 (en) | 2006-03-31 | 2012-10-30 | Energid Technologies | Automatic control system generation for robot design validation |
US8408918B2 (en) | 2007-06-27 | 2013-04-02 | Energid Technologies Corporation | Method and apparatus for haptic simulation |
US8428781B2 (en) | 2008-11-17 | 2013-04-23 | Energid Technologies, Inc. | Systems and methods of coordination control for robot manipulation |
US20130158709A1 (en) * | 2011-12-14 | 2013-06-20 | GM Global Technology Operations LLC | Robot control during an e-stop event |
US20130211593A1 (en) * | 2010-11-17 | 2013-08-15 | Mitsubishi Electric Corporation | Workpiece pick-up apparatus |
US20140163729A1 (en) * | 2012-12-07 | 2014-06-12 | GM Global Technology Operations LLC | Planning a Grasp, for Use by a Robotic grasper to Pick Up a Complex Object, Based on Object, Grasper, and Grasper Approach Data |
US20150199458A1 (en) | 2014-01-14 | 2015-07-16 | Energid Technologies Corporation | Digital proxy simulation of robotic hardware |
US20160008978A1 (en) * | 2014-07-09 | 2016-01-14 | Fanuc Corporation | Robot control device for preventing misjudgment by collision judging part |
US20160052043A1 (en) * | 2013-02-20 | 2016-02-25 | Newfrey Llc | Compensating device for a tool unit and fitting method by means of the tool unit |
US9321176B1 (en) * | 2014-04-01 | 2016-04-26 | University Of South Florida | Systems and methods for planning a robot grasp based upon a demonstrated grasp |
US9357708B2 (en) | 2008-05-05 | 2016-06-07 | Energid Technologies Corporation | Flexible robotic manipulation mechanism |
US9375839B2 (en) * | 2012-01-13 | 2016-06-28 | Carnegie Mellon University | Methods and computer-program products for evaluating grasp patterns, and robots incorporating the same |
US20160321381A1 (en) | 2015-04-29 | 2016-11-03 | Energid Technologies Corporation | System and method for evaluation of object autonomy |
US20170143442A1 (en) * | 2015-11-25 | 2017-05-25 | Camplex, Inc. | Surgical visualization systems and displays |
US20170320211A1 (en) | 2016-05-09 | 2017-11-09 | Opiflex Automation AB | system and a method for programming an industrial robot |
US20180060459A1 (en) | 2016-09-01 | 2018-03-01 | Energid Technologies Corporation | System and method for game theory-based design of robotic systems |
US20180354130A1 (en) * | 2015-10-30 | 2018-12-13 | Keba Ag | Method, control system and movement setting means for controlling the movements of articulated arms of an industrial robot |
US20190063907A1 (en) * | 2017-08-22 | 2019-02-28 | Faro Technologies, Inc. | Measurement system having a cooperative robot and three-dimensional imager |
US20190231458A1 (en) * | 2016-07-01 | 2019-08-01 | Intuitive Surgical Operations, Inc. | Computer-assisted medical systems and methods |
US20190258275A1 (en) * | 2018-02-22 | 2019-08-22 | Boston Dynamics, Inc. | Mobile Robot |
US20190321984A1 (en) * | 2018-04-24 | 2019-10-24 | Fanuc Corporation | Robot controller and system |
US20190358817A1 (en) * | 2016-11-10 | 2019-11-28 | Cognibotics Ab | System and method for instructing a robot |
US20190358813A1 (en) * | 2018-05-23 | 2019-11-28 | General Electric Company | System and Method for Controlling a Robotic Arm |
US20200001457A1 (en) * | 2018-07-02 | 2020-01-02 | Teradyne, Inc | System and method for natural tasking of one or more robots |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06187020A (en) * | 1992-12-18 | 1994-07-08 | Kobe Steel Ltd | Operational locus instructing method for robot |
EP1728600B1 (en) * | 2005-05-31 | 2008-03-12 | Honda Research Institute Europe GmbH | Controlling the trajectory of an effector |
US9327401B2 (en) * | 2012-09-10 | 2016-05-03 | Fanuc America Corporation | Method of controlling a redundant robot |
KR102188100B1 (en) * | 2013-03-15 | 2020-12-07 | 삼성전자주식회사 | Robot and control method thereof |
JP2015174184A (en) * | 2014-03-14 | 2015-10-05 | 三菱重工業株式会社 | Controller |
JP2018030210A (en) * | 2016-08-25 | 2018-03-01 | キヤノン株式会社 | Simulation device, control system, robot system, simulation method, program and recording medium |
DE102016221464A1 (en) * | 2016-11-02 | 2018-05-03 | Karlsruher Institut für Technologie | Method of making an optical system and optical system |
-
2018
- 2018-07-02 US US16/025,544 patent/US11103994B2/en active Active
-
2019
- 2019-07-02 MX MX2020014190A patent/MX2020014190A/en unknown
- 2019-07-02 WO PCT/US2019/040289 patent/WO2020010076A1/en unknown
- 2019-07-02 JP JP2020569928A patent/JP7487118B2/en active Active
- 2019-07-02 EP EP19779222.9A patent/EP3817899A1/en active Pending
- 2019-07-02 SG SG11202011973VA patent/SG11202011973VA/en unknown
- 2019-07-02 CA CA3103283A patent/CA3103283A1/en active Pending
- 2019-07-02 CN CN201980041397.XA patent/CN112384335A/en active Pending
Patent Citations (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6157368A (en) * | 1994-09-28 | 2000-12-05 | Faeger; Jan G. | Control equipment with a movable control member |
US6004016A (en) * | 1996-08-06 | 1999-12-21 | Trw Inc. | Motion planning and control for systems with multiple mobile objects |
US7016539B1 (en) * | 1998-07-13 | 2006-03-21 | Cognex Corporation | Method for fast, robust, multi-dimensional pattern recognition |
US6274839B1 (en) * | 1998-12-04 | 2001-08-14 | Rolls-Royce Plc | Method and apparatus for building up a workpiece by deposit welding |
US6385508B1 (en) * | 2000-10-31 | 2002-05-07 | Fanuc Robotics North America, Inc. | Lead-through teach handle assembly and method of teaching a robot assembly |
US6757587B1 (en) | 2003-04-04 | 2004-06-29 | Nokia Corporation | Method and apparatus for dynamically reprogramming remote autonomous agents |
US20050166413A1 (en) * | 2003-04-28 | 2005-08-04 | Crampton Stephen J. | CMM arm with exoskeleton |
US7680300B2 (en) | 2004-06-01 | 2010-03-16 | Energid Technologies | Visual object recognition and tracking |
US8301421B2 (en) | 2006-03-31 | 2012-10-30 | Energid Technologies | Automatic control system generation for robot design validation |
US20070274812A1 (en) | 2006-05-29 | 2007-11-29 | Fanuc Ltd | Workpiece picking device and method |
WO2008119383A1 (en) | 2007-03-30 | 2008-10-09 | Abb Technology Ab | Method and apparatus for programming an industrial robot |
US8408918B2 (en) | 2007-06-27 | 2013-04-02 | Energid Technologies Corporation | Method and apparatus for haptic simulation |
US9357708B2 (en) | 2008-05-05 | 2016-06-07 | Energid Technologies Corporation | Flexible robotic manipulation mechanism |
US8428781B2 (en) | 2008-11-17 | 2013-04-23 | Energid Technologies, Inc. | Systems and methods of coordination control for robot manipulation |
US20110172819A1 (en) * | 2010-01-14 | 2011-07-14 | Samsung Electronics Co., Ltd. | Manipulator and control method thereof |
US20120017507A1 (en) * | 2010-07-21 | 2012-01-26 | Cognisense Labs, Inc. | Automated organic polarized object organization |
US20130211593A1 (en) * | 2010-11-17 | 2013-08-15 | Mitsubishi Electric Corporation | Workpiece pick-up apparatus |
US20130158709A1 (en) * | 2011-12-14 | 2013-06-20 | GM Global Technology Operations LLC | Robot control during an e-stop event |
US9375839B2 (en) * | 2012-01-13 | 2016-06-28 | Carnegie Mellon University | Methods and computer-program products for evaluating grasp patterns, and robots incorporating the same |
US20140163729A1 (en) * | 2012-12-07 | 2014-06-12 | GM Global Technology Operations LLC | Planning a Grasp, for Use by a Robotic grasper to Pick Up a Complex Object, Based on Object, Grasper, and Grasper Approach Data |
US20160052043A1 (en) * | 2013-02-20 | 2016-02-25 | Newfrey Llc | Compensating device for a tool unit and fitting method by means of the tool unit |
US20150199458A1 (en) | 2014-01-14 | 2015-07-16 | Energid Technologies Corporation | Digital proxy simulation of robotic hardware |
US9321176B1 (en) * | 2014-04-01 | 2016-04-26 | University Of South Florida | Systems and methods for planning a robot grasp based upon a demonstrated grasp |
US20160008978A1 (en) * | 2014-07-09 | 2016-01-14 | Fanuc Corporation | Robot control device for preventing misjudgment by collision judging part |
US20160321381A1 (en) | 2015-04-29 | 2016-11-03 | Energid Technologies Corporation | System and method for evaluation of object autonomy |
US20180354130A1 (en) * | 2015-10-30 | 2018-12-13 | Keba Ag | Method, control system and movement setting means for controlling the movements of articulated arms of an industrial robot |
US20170143442A1 (en) * | 2015-11-25 | 2017-05-25 | Camplex, Inc. | Surgical visualization systems and displays |
US20170320211A1 (en) | 2016-05-09 | 2017-11-09 | Opiflex Automation AB | System and a method for programming an industrial robot |
US20190231458A1 (en) * | 2016-07-01 | 2019-08-01 | Intuitive Surgical Operations, Inc. | Computer-assisted medical systems and methods |
US20180060459A1 (en) | 2016-09-01 | 2018-03-01 | Energid Technologies Corporation | System and method for game theory-based design of robotic systems |
US20190358817A1 (en) * | 2016-11-10 | 2019-11-28 | Cognibotics Ab | System and method for instructing a robot |
US20190063907A1 (en) * | 2017-08-22 | 2019-02-28 | Faro Technologies, Inc. | Measurement system having a cooperative robot and three-dimensional imager |
US20190258275A1 (en) * | 2018-02-22 | 2019-08-22 | Boston Dynamics, Inc. | Mobile Robot |
US20190321984A1 (en) * | 2018-04-24 | 2019-10-24 | Fanuc Corporation | Robot controller and system |
US20190358813A1 (en) * | 2018-05-23 | 2019-11-28 | General Electric Company | System and Method for Controlling a Robotic Arm |
US20200001457A1 (en) * | 2018-07-02 | 2020-01-02 | Teradyne, Inc. | System and method for natural tasking of one or more robots |
Non-Patent Citations (1)
Title |
---|
International Search Report and Written Opinion in PCT Application No. PCT/US2019/040289 dated Dec. 20, 2019; 17 pages. |
Also Published As
Publication number | Publication date |
---|---|
WO2020010076A1 (en) | 2020-01-09 |
CA3103283A1 (en) | 2020-01-09 |
SG11202011973VA (en) | 2021-01-28 |
JP2021529674A (en) | 2021-11-04 |
US20200001457A1 (en) | 2020-01-02 |
JP7487118B2 (en) | 2024-05-20 |
MX2020014190A (en) | 2021-03-09 |
CN112384335A (en) | 2021-02-19 |
EP3817899A1 (en) | 2021-05-12 |
Similar Documents
Publication | Title |
---|---|
US11358282B2 (en) | System and method for constraint management of one or more robots |
CN110198813B (en) | Robot path generation device and robot system |
Klamt et al. | Supervised autonomous locomotion and manipulation for disaster response with a centaur-like robot |
US20220402127A9 (en) | Multi-angle end effector |
Kaldestad et al. | Collision avoidance with potential fields based on parallel processing of 3D-point cloud data on the GPU |
Valenzuela-Urrutia et al. | Virtual reality-based time-delayed haptic teleoperation using point cloud data |
US11103994B2 (en) | System and method for natural tasking of one or more robots |
Wallace et al. | Multimodal teleoperation of heterogeneous robots within a construction environment |
Pedemonte et al. | Visual-based shared control for remote telemanipulation with integral haptic feedback |
Jagersand et al. | Visual space task specification, planning and control |
Quesada et al. | Holo-SpoK: Affordance-aware augmented reality control of legged manipulators |
Makita et al. | Offline direct teaching for a robotic manipulator in the computational space |
James et al. | Prophetic goal-space planning for human-in-the-loop mobile manipulation |
Rastegarpanah et al. | Electric Vehicle Battery Disassembly Using Interfacing Toolbox for Robotic Arms |
Cho et al. | Development of VR visualization system including deep learning architecture for improving teleoperability |
Fryc et al. | Efficient pipeline for mobile brick picking |
Bouzouia et al. | Teleoperation system of the mobile Manipulator Robot ROBUTER_ULM: Implementation issues |
Jiménez et al. | Autonomous object manipulation and transportation using a mobile service robot equipped with an RGB-D and LiDAR sensor |
Lang et al. | Visual servoing with LQR control for mobile robots |
Yoon et al. | Algorithm to Automatically Generate Non-Collision Trajectory to Perform Pick-and-Place Operation with a Mobile Manipulator |
Giampà | Development of an Autonomous Mobile Manipulator for Industrial and Agricultural Environments |
EP4088882A1 (en) | Method of manipulating a construction object, construction robot system, and computer program product |
Ponde et al. | Integrating 3Ds Max with Robot Technology: A Case Study on the Robot Palletizer System |
Ainampudi | Box pushing with a mobile robot using visual servoing |
CN115229789A (en) | Teleoperation method of robot, robot and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
AS | Assignment |
Owner name: TERADYNE, INC., MASSACHUSETTS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIMONE, BRETT L;KEESLING, JUSTIN C;PHOLSIRI, CHALONGRATH;AND OTHERS;SIGNING DATES FROM 20180706 TO 20180827;REEL/FRAME:049656/0964
|
AS | Assignment |
Owner name: TRUIST BANK, GEORGIA
Free format text: SECURITY INTEREST;ASSIGNOR:TERADYNE, INC.;REEL/FRAME:052595/0632
Effective date: 20200501
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
AS | Assignment |
Owner name: THE GOVERNMENT OF THE UNITED STATES AS REPRESENTED BY THE SECRETARY OF THE AIR FORCE, OHIO
Free format text: CONFIRMATORY LICENSE;ASSIGNOR:ENERGID TECHNOLOGIES CORPORATION;REEL/FRAME:056832/0886
Effective date: 20190822
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: UNIVERSAL ROBOTS USA, INC., MICHIGAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TERADYNE, INC.;REEL/FRAME:062052/0981
Effective date: 20221206
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 4